AFDS 2012 Day 3



Author: James Prior
Editor: Charles Oliver
Date: June 22nd, 2012

AFDS Day 3: Wednesday, June 13th

The third day of AFDS '12, Wednesday June 13th, started with breakfast talk buzzing from the overnight announcement of AMD's adoption of TrustZone technology, and brought us three keynotes from very different people. First up was a high-energy presentation from GaiKai CEO and co-founder David Perry.

Keynote #1 - David Perry, GaiKai

GaiKai is a cloud gaming service, aiming to bring PC gaming to the masses regardless of form factor, power limitations, or commitment to gaming. The premise of GaiKai is to host games in regional datacenters - big racks of servers with GPUs in them - and stream the gameplay to your Java-enabled, internet-connected device. This was most ably demonstrated with an iPad playing Call of Duty at 60fps without any trouble.

David Perry is a gaming industry veteran, having founded Shiny Entertainment and Acclaim.com before starting GaiKai. A former programmer himself, he knows the gaming industry from the inside and is looking to make finding games easier and the experience of playing them better. David took us through 100 PowerPoint slides in 45 minutes, including several videos, with a polished ease and aplomb that should be the envy of every marketing person on the planet. Inside the presentation were some real gems of truth; David gives every appearance of understanding how to get more people gaming and how to improve their experience. Imagine plugging your Xbox 360 controller into your smart TV and playing a wide range of PC titles - no messing with settings, just play. Then the same game plays on your MacBook, your notebook, your tablet, your phone; no transfer of settings or worrying about storage space. This is a reality with the GaiKai gaming client now inside Samsung and LG smart TVs.

Less obvious is how enthusiast gamers are served, as the flipside of not worrying about local processing and storage capacity is not having the flexibility to specify your quality settings. Eyefinity or Surround gaming should work, as will simple 3D, but advanced features that rely on local SDK APIs won't, as the processing isn't happening on the device in front of you. On the subject of latency, David was optimistic: in current smart TVs, the input delay from some sources is 100-200ms, while for internet applications it can be as low as 15ms, meaning a game streamed from the GaiKai cloud could be more responsive than one running on a local console; an interesting notion.

Another interesting notion from David was that games should be free to play, with in-game advertising and microtransactions becoming the method for generating revenue. This is a tough sell for companies like Activision, who make more money from the ~350-400m people playing the latest Call of Duty than Zynga does from 100M users on Facebook games like Farmville. For a lot of people, lowering the cost of entry - use existing equipment, no more need to buy new consoles, PCs, etc. - will be very attractive, and the $60 USD plus DLC/expansion pack cost can instead go toward customizing the in-game experience to exactly what they want, or just stay in their pocket.

Initially this might seem like a bad idea for companies like AMD as well, because they make the designs and sell the IP and chips that power the consoles, notebooks and PCs. For them, the upside is that people will still want local processing power - people like to play offline as well as online - and enthusiasts aren't being served by this kind of service anyway. They also get to sell many more, much more expensive, server and workstation products to fill the datacenters the service requires. While products like the AMD Radeon HD 7970 are expensive for home consumers (you can buy two consoles for that money, these days), that price point is very, very low for people spinning up datacenters. Additionally, there's at least one TV in pre-production using AMD's Brazos as a platform, something that Arun Iyengar, the newly appointed corporate vice president and general manager of the new Embedded Solutions Group, is likely to explore further.

Keynote #2 - Alan Kraemer, SRS

Next on the keynote agenda was Alan Kraemer, CTO and Executive VP of SRS Labs, Inc. SRS Labs should be familiar to everyone who's used consumer electronics in the last two decades, as a company that develops and licenses audio technologies to increase the immersion and enjoyment of audio. Alan's presentation, while a lot lower-energy than David Perry's, was very much in tune with the ethos and content the crowd at AFDS wanted. SRS are pushing forward a new standard for audio, Multi-Dimensional Audio (MDA).

The essence of MDA is to make the consumer experience better by allowing intelligent mixing of the soundstage to happen at the end-user level instead of only in the audio production booth. Currently consumers are offered 7.1, 5.1 and stereo options for their audio content, which doesn't take into account how the home audio setup is actually configured; and what if you want more subs, more speakers? MDA allows audio engineers to take more channels and position the audio however they want, with the end output device doing the final mix because it knows where the speakers are and how they're set up; everyone gets the same mastered audio, and the experience scales with your home theater investment. As part of his presentation, GaiKai's David Perry had listed things already achieved in gaming and included audio in that list as being 'done' - nothing further to do. Alan quite neatly demonstrated in two slides why that's not true.

The aim of MDA is to insert height and depth into the sound field, moving on from 'wow, a noise came from that speaker behind me!' to building a sound stage that moves in time and space. SRS had a live demo of their technology using a 22.3 setup in a convention room and special footage from AMD's favorite director, Robert Rodriguez, using a scene from Predators. The effect was quite noticeable, drawing more attention to the scene and making the experience feel more complete.

For those of you keeping up with such things, MDA-type standards are not new, but one key aspect of SRS's presentation is that MDA is both open and compatible with existing proprietary specifications. SRS want to create an open common format for expressing information about audio in space, combining the intent of the mixer with knowledge of the local sound field to let producers use a single, common format for audio. The objects defined inside the standard are extensible and can support force feedback - the audio feed could be used to drive motion control of seats, and vibration units, too.
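To make the idea a little more concrete, here is a minimal sketch (in C, purely illustrative and not based on the actual MDA specification) of what object-based rendering means in practice: the content carries a sound object's position as metadata, and the playback device computes per-speaker gains from whatever speaker layout it actually has. The 5.0 layout and constant-power panning used here are assumptions for the example, not anything SRS described.

```c
/*
 * Illustrative object-based audio rendering sketch - NOT the MDA spec.
 * A single sound "object" at a given azimuth is panned across whatever
 * horizontal speaker layout the playback device reports, using
 * constant-power panning between the two bracketing speakers.
 */
#include <math.h>
#include <stdio.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

#define NUM_SPEAKERS 5

int main(void) {
    /* Example layout: a 5.0 setup, azimuths in degrees (0 = straight ahead). */
    const double speaker_az[NUM_SPEAKERS] = { -110.0, -30.0, 0.0, 30.0, 110.0 };
    double gains[NUM_SPEAKERS] = { 0.0 };

    /* The object metadata an engineer might author: a position, 70 degrees right. */
    double object_az = 70.0;

    /* Clamp to the layout's range, then find the adjacent speakers that bracket it. */
    if (object_az < speaker_az[0]) object_az = speaker_az[0];
    if (object_az > speaker_az[NUM_SPEAKERS - 1]) object_az = speaker_az[NUM_SPEAKERS - 1];

    int left = 0, right = 1;
    for (int i = 0; i < NUM_SPEAKERS - 1; ++i) {
        if (object_az >= speaker_az[i] && object_az <= speaker_az[i + 1]) {
            left = i;
            right = i + 1;
            break;
        }
    }

    /* Constant-power pan between the two bracketing speakers. */
    double frac = (object_az - speaker_az[left]) / (speaker_az[right] - speaker_az[left]);
    gains[left]  = cos(frac * M_PI / 2.0);
    gains[right] = sin(frac * M_PI / 2.0);

    for (int i = 0; i < NUM_SPEAKERS; ++i)
        printf("speaker %d (%4.0f deg): gain %.3f\n", i, speaker_az[i], gains[i]);
    return 0;
}
```

Swap in a different speaker table - more speakers, different angles, height channels - and the same object metadata renders to a mix tailored to that room, which is the scaling behaviour Alan described.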

Keynote #3 - Steven Bathiche, Microsoft

The final keynote of the day was provided by Steven Bathiche, director of research for Microsoft's Applied Sciences Group. The theme of the presentation was creating technology 'so good it gets out of the way', and it was centered on human-machine interaction (HMI). Various videos and slides showed how Microsoft is researching and creating solutions to improve how information is presented and worked with, to enable creativity and productivity at the speed of thought.

To call Steve a mad scientist is an understatement; I don't think the guy operates in the same world as the rest of us. That's a crucial trait for someone looking to turn HMI on its head and make it better - you need new perspectives, new tools, new thought processes - and the different augmented reality concepts that Steve and his team explore show innovation in some of its purest forms. The focus of their research is really broad. One example combines prisms with adaptive lens panels and eye tracking to provide a stereoscopic view that the user can adjust with natural head movements - want to see more of the background behind the object currently in view? Just as you would in real life, peer around it and more of that view is exposed.

Gesture and touch also feature prominently in Microsoft's research, as they try to blend interaction modes to make manipulating virtual objects more natural and easier to do, expanding on pinch-to-zoom with pinch and pull away, twist, flip and so on to allow richer manipulation. The primary focus was on how HMI advances can create better business and personal workflows, and better person-to-person interaction. The grand, overarching vision of Microsoft's research isn't something attainable in the short term; this is very long-view, blue-sky stuff, which tends to produce the best short-term innovations as current technology limits are overcome on the way to the end goal.

Experience Zone

After lunch, and a secret meeting, there was time for a quick walk around the Experience Zone, where a new widget had turned up: a Brazos-powered wall wart. Well, actually it's a full PC that plugs directly into the wall and has wired Ethernet, HDMI and USB ports, perfect for turning dumb displays into any kind of smart display you want. There were a couple of other interesting form factors on display, too: the Cooler Master heatsink Brazos PC and a more traditional mini-PC system.

Afternoon Session - Richard Brunner, VMware

Then it was time for more technical sessions. I was able to attend a session from VMware's Chief Platform Architect, Richard Brunner. The session covered how new designs, like those from SeaMicro with their massive density, are great for use with hypervisors, and also how smaller cores - think ARM, Atom and Bobcat - can be leveraged for hyperscale deployments; as part of a hyperscale deployment, VMware can offer fault tolerance to a level not available on other platforms. This presentation dovetailed nicely with the previous one, which had shown SeaMicro's new Opteron 6200 series system-on-a-stick giving massive density and throughput, great for HPC and virtualization. VMware's support for GPUs is trailing the APU's arrival in the datacenter, but as application virtualization and desktops-as-a-service take off you can expect them to be right there with AMD - AMD's hypervisor of choice is VMware, after all.

Stepping out of the VMware session I bumped into AMD PR and an invitation to dinner with various members of the press and Mark Papermaster, the newly appointed CTO and Senior Vice President. Mark also has the job of overseeing AMD's developer branch, and is very focused on getting the talent and tools AMD needs to deliver better products in a timely manner. Currently his big projects involve working with AMD's large customers to help them execute better, and thus want to use more AMD products along the way. Internally, there is a reinforced commitment to quality and process, aimed at raising the user experience bar for AMD products. As AMD is currently using that as its marketing differentiator, this is an important investment, and one that ultimately benefits the end user - fundamentally, people don't want products that are better than Intel's, Apple's or NVIDIA's, just better products, period. We want to achieve our tasks and objectives more smoothly and quickly, and focusing on how to get there, rather than one-upping the competition and hoping that's what's needed, is the better strategy.

Part of the software development arm is AMD Israel, born out of AMD's acquisition of Graphic Remedy. This brought a lot of talent to AMD, with a new Director of Development Tools, Avi Shapira, managing optimization of software on the Heterogeneous System Architecture (HSA) platform. This includes creating and improving SDKs for developers and start-up investments for AMD ventures. The big news for Avi was the new CodeXL product, born out of the gDebugger and CodeAnalyst products that Graphic Remedy had previously developed and sold. CodeXL is free to use and is available standalone and as a Visual Studio extension, with the possibility of an Eclipse plugin on the horizon. CodeXL allows line-by-line debugging of OpenCL code on APUs and AMD GPUs, without the need for additional adapters or console debug cards. CodeXL will support both Windows and Linux (both the AMD proprietary and open-source drivers, although the proprietary driver is needed for some features), and will be available as a public beta soon.
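To give a flavour of the kind of code CodeXL is aimed at, here's a minimal OpenCL host program with a trivial kernel. This is not an AMD SDK or CodeXL sample, just an illustrative sketch with a made-up 'scale' kernel; it's the sort of OpenCL C you could set a breakpoint in and step through line by line on the GPU. Error checking is omitted for brevity.

```c
/* Minimal illustrative OpenCL example: scale an array on the GPU. */
#include <CL/cl.h>
#include <stdio.h>

static const char *kernel_src =
    "__kernel void scale(__global float *data, float factor) {\n"
    "    int i = get_global_id(0);\n"
    "    data[i] = data[i] * factor;   /* a natural spot for a breakpoint */\n"
    "}\n";

int main(void) {
    float data[256];
    for (int i = 0; i < 256; ++i)
        data[i] = (float)i;

    /* Grab the first platform and the first GPU device on it. */
    cl_platform_id platform;
    cl_device_id device;
    clGetPlatformIDs(1, &platform, NULL);
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL);

    cl_context ctx = clCreateContext(NULL, 1, &device, NULL, NULL, NULL);
    cl_command_queue queue = clCreateCommandQueue(ctx, device, 0, NULL);

    /* Copy the host array into a device buffer. */
    cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                sizeof(data), data, NULL);

    /* Build the kernel from source and set its arguments. */
    cl_program prog = clCreateProgramWithSource(ctx, 1, &kernel_src, NULL, NULL);
    clBuildProgram(prog, 1, &device, NULL, NULL, NULL);
    cl_kernel kernel = clCreateKernel(prog, "scale", NULL);

    float factor = 2.0f;
    clSetKernelArg(kernel, 0, sizeof(cl_mem), &buf);
    clSetKernelArg(kernel, 1, sizeof(float), &factor);

    /* Run one work-item per array element, then read the results back. */
    size_t global_size = 256;
    clEnqueueNDRangeKernel(queue, kernel, 1, NULL, &global_size, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(queue, buf, CL_TRUE, 0, sizeof(data), data, 0, NULL, NULL);

    printf("data[10] = %f (expected 20.0)\n", data[10]);

    clReleaseMemObject(buf);
    clReleaseKernel(kernel);
    clReleaseProgram(prog);
    clReleaseCommandQueue(queue);
    clReleaseContext(ctx);
    return 0;
}
```

With a tool like CodeXL attached, the breakpoint inside the kernel lets you inspect per-work-item state directly on the APU or GPU, which is the feature highlighted above.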

Dinner & Dancing

The evening rounded out with bowling, snacks and drinks at Lucky Strike Lanes, to which AMD provided complimentary, exclusive access for AFDS attendees for the evening, followed by the phenomenon that is the Distributed Dance Party. When AFDS attendees checked in, along with their shoulder bag (slightly bigger and better quality than last year's), they received a mysterious fanny pack. Inside the fanny pack was a self-powered stereo speaker and FM receiver, for use when the 'flash mob' broke out.

It started out in the lobby of the Hyatt Regency hotel, with clever (highly subjective - Ed) use of the fire alarm to get everybody out of their rooms to join in the fun. The DDP works by one guy with a portable transmitter playing music while lots of people in silly costumes, equipped with boom boxes tuned to the right frequency, show up. Oh, and of course, all the party-loving, wild-haired, crazy developers who came to AFDS, got a fanny pack, and ran around downtown - through malls, into parking lots, along the streets and parks of Bellevue - after 10pm. Your intrepid reporter tried in vain to talk with some of these funnily clothed, wide-eyed party people but got distracted when someone with a company credit card said 'Hey, a bar!'