The Bleeding Edge of Smart TVs

Circa Fall 2011... the title of this story alludes to the context of the project. Qualcomm was an existing client for SapientRazorfish (formerly Level Studios and Rosetta), and I had been tapped to lead delivery on a highly visible proof-of-concept project that demonstrated the capabilities of Qualcomm's Snapdragon chipset. This is the primary chip that powers a host of phones from multiple manufacturers, including Samsung, Sony, Motorola, and LG. Qualcomm's strategy was to power Smart TVs with the same technology. The project consisted of visual design, user experience, and development of an interactive experience that was not just smoke-and-mirrors. Our team had to develop an interactive, 3D TV interface with a fully featured programming grid that streamed live TV and worked on prototype hardware provided by Qualcomm. The entire feature set needed to be completed in time for the 2012 CES show in Las Vegas the following January. We had just 16 weeks to design and deliver the project.

[Image: Snapdragon chip with logo]

Since the demo was a 3D project, we decided to develop the interface on the Unity platform. The most significant technical challenge was wrapping the Unity interface in Java to enable interaction with the Snapdragon hardware and a vast list of capabilities, including: keyboard and mouse interaction, a live Rovi data feed for programming information that dynamically generated a program grid, live TV broadcast, Facebook integration, a screen saver mode, concurrent TV and gaming, an Android phone interface, and finally (thrown in towards the end of the project) facial recognition. Yikes!
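To make that concrete, the "wrapping" amounted to an Android activity hosting the Unity player and shuttling events and data across the Java/Unity boundary. Below is a minimal sketch of that pattern using Unity's standard Android integration (UnityPlayerActivity and UnitySendMessage); the activity name, GameObject names, and guide-fetching helper are hypothetical illustrations, not the project's actual code.

```java
// Minimal sketch (not the project's actual code) of wrapping a Unity scene in an
// Android activity so native Java code can drive it. It relies on Unity's standard
// Android integration classes (com.unity3d.player.UnityPlayerActivity and
// UnityPlayer.UnitySendMessage); the GameObject names and guide-fetching helper
// below are hypothetical.
import android.os.Bundle;
import android.view.KeyEvent;

import com.unity3d.player.UnityPlayer;
import com.unity3d.player.UnityPlayerActivity;

public class SmartTvDemoActivity extends UnityPlayerActivity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState); // boots the embedded Unity player

        // Hypothetical: pull programming data on a background thread and hand it
        // to a Unity-side script as a JSON string. "ProgramGrid" / "OnGuideData"
        // would be a GameObject and method defined inside the Unity project.
        new Thread(() -> {
            String guideJson = fetchProgramGuideJson();
            UnityPlayer.UnitySendMessage("ProgramGrid", "OnGuideData", guideJson);
        }).start();
    }

    @Override
    public boolean onKeyDown(int keyCode, KeyEvent event) {
        // Forward remote-control / keyboard input into the Unity interface.
        UnityPlayer.UnitySendMessage("InputRouter", "OnKeyDown", Integer.toString(keyCode));
        return super.onKeyDown(keyCode, event);
    }

    // Placeholder for the real data-feed call (e.g. a Rovi-style EPG service).
    private String fetchProgramGuideJson() {
        return "{\"channels\":[]}";
    }
}
```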

I remember the first all-hands meeting. It was held on-site at Qualcomm's main campus in one of the auditoriums. Qualcomm's main entrance is quite a sight. It is covered in plaques for hundreds, if not thousands, of patents the company has filed over the years. I remember seeing a fifteen-foot robot there as well. It looked like it could crush a car, but I do not think it was operational. It was impressive nonetheless.

There were roughly thirty stakeholders sitting in the room: a combination of the Level Studios team, Qualcomm product managers, hardware engineers, and the executive team. The overall objectives were reaffirmed, and we quickly turned the discussion to implementation guidelines and how the Qualcomm engineering team would communicate and interact with the Level Studios creative and software development teams. We agreed the best way to collaborate was to be on-site as much as possible.

Qualcomm wanted our team there 100% of the time, but I worked with our account manager to convince them that our team worked best at our home office in El Segundo. San Diego was just 80 miles from our office, so it wasn't too horrible a commute to make on a weekly basis. We confirmed our team would do most of the work remotely and travel to Qualcomm to work directly with the engineers to integrate weekly code drops. This was one of the keys to the success of the project.

Back at the office in El Segundo, we did not have the luxury of a long discovery and design timeline. We had to begin developing the user experience and design as quickly as possible. The design team was led by Dave McLain, and the director of engineering was Darryl Kanouse.

To be completely honest, there was nobody on our staff with the technical skills to get the Unity software working on the Snapdragon prototype hardware. We had a theory that it could work; we just did not have anyone with the necessary deep knowledge of Android and Java to pull it off. I'm surprised Darryl didn't take out a restraining order on me, as I was in his office multiple times a day for weeks checking on the hiring status. We were at least four weeks into the project when Andrew Marshall finally joined the crew. He came to us from USC and was the only reason the technology implementation succeeded.

Once Andrew was on board, we discovered the prototype system Qualcomm had provided was faulty. It was a clunky black box designed specifically to provide the necessary ports and connectivity for the chipset to function. These were highly guarded pieces of hardware, and only a few existed; we practically had to steal the original box from the engineering team. Waiting for a replacement unit set QA back about two weeks, adding even more risk to an already aggressive project timeline. The entire team kept its head down, working late hours to win back the critical time needed to meet the looming deadline.

The design was spot on. I unfortunately do not have any of the motion design work, but I was able to find some of the final comps that were part of Dave's final design presentation. The final design was an amalgamation of three independent concepts that were used to create the final look & feel of the interface. Dave and his team did a fantastic job under extreme pressure to get the designs completed. Not only did the designs need to be approved by the client, but Scott Noma also had to be confident he could build a working interface on the Unity platform. The design work included concepting, storyboarding, motion design, user experience, and development on the Unity platform. Development time alone could (and should) have taken the entire duration of the project. Again, the design team rocked this project.

Interface Gallery

I remember that, just weeks before the final build was due, we had three P1 critical bugs and over a dozen P2 issues. This was a challenging narrative to communicate to the product owners and executives. We were able to negotiate a reduction in scope, removing lower-priority items, to meet the deadline. We agreed to complete the remaining features after final delivery in order to support Qualcomm's ongoing marketing effort to sell the Snapdragon chips into Smart TVs overseas. Unfortunately, we were not able to get full Facebook integration functioning on the device prior to the CES show. We also created very specific scripts for the demo operators to follow as workarounds for the remaining defects that shipped with the final build.

Given all the stress, long hours, critical thinking, and continual negotiation with the client over those four months, I have to say all of the teams performed exceptionally well. The Qualcomm engineers assigned to the project were very professional and extremely helpful every step of the way. We miraculously completed the project in time for CES with a solid product that met roughly 90% of the target goals, including facial recognition. We were, however, disappointed it did not make the big stage for Paul Jacobs's keynote address.

I searched YouTube for whatever content I could find illustrating the working demo. This is all I could find. There may be more content out there, so please send it to me if you happen to come across it.