Unless you’ve been locked in your mother’s basement, blissfully tapping away at your Atari 2600 paddle for the past 32 years, you’ve probably heard about the OnLive game service that officially launches today. While we’re reasonably sure that the majority of our readers have progressed beyond the latest game technology from the ’70s, we’ll elaborate anyway.
OnLive Inc. made waves in the industry back in March of 2009 when it showed a demo of its upcoming service at the Game Developers Conference (GDC) in San Francisco. The gist of their plan is to offer a console-like gaming service where the graphics and sound are rendered in “the cloud” then fed back to your PC, Mac, or set-top “MicroConsole” over a broadband Internet connection. In the words of Steve Perlman, Founder and CEO of OnLive, “It doesn’t matter if you have a high-end PC, you don’t need a GPU any more, it just works.” Their original five-minute intro video can be viewed here, providing a great overview of the company mixed with a healthy dose of hype. (If their service is anywhere near as good as their CEO’s oratory skills, they just might have a snowball’s chance).
The obvious caveat to their planned bid at console domination is latency. Your standard wireless keyboard, mouse, or controller can often generate enough lag to raise the blood pressure of even the most casual PC gamer or plastic guitarist, and in most cases you’re no more than 20 feet from the device accepting the input and rendering the graphics accordingly. Now add in hundreds or thousands of miles of separation, possibly dozens of hops through modems, routers, repeaters, and firewalls, and pray all your neighbors aren’t BitTorrent junkies on the same broadband cable loop.
According to a blog post by OnLive’s Steve Perlman, the goal of the company (and secret sauce to minimizing hair loss due to lag) is to have data centers positioned throughout the world so that users are never more than 1,000 miles from the nearest OnLive server farm. At launch, OnLive operates data centers in San Francisco, Dallas, and Washington, D.C., creating a blanket of coverage in the U.S. that excludes only those poor souls in northern Minnesota, North Dakota, and northeast Montana.
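To see why that 1,000-mile figure matters, here is a back-of-the-envelope check of the wire delay alone (our own illustration, not OnLive’s published math; the fiber speed and route-overhead factor are rule-of-thumb assumptions):

```python
# Rough round-trip propagation delay to a server 1,000 miles away.
# Signals in optical fiber travel at roughly two-thirds the speed of
# light, and real routes are longer than straight lines.
FIBER_SPEED_KM_S = 200_000   # ~2/3 of c, a common rule of thumb
ROUTE_OVERHEAD = 1.5         # assumed: fiber paths are rarely straight

def propagation_rtt_ms(distance_miles: float) -> float:
    """Round-trip propagation delay in milliseconds, ignoring
    routing, queuing, and encode/decode time entirely."""
    km = distance_miles * 1.609
    one_way_s = (km * ROUTE_OVERHEAD) / FIBER_SPEED_KM_S
    return 2 * one_way_s * 1000

print(round(propagation_rtt_ms(1000)))  # ~24 ms of raw wire delay
```

Roughly 24 ms of round-trip delay before a single router, encoder, or decoder gets involved, which is why the distance cap is non-negotiable: a perceptible input-to-photon lag budget is only on the order of 100 ms.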
Besides the overused mantra of location, location, location, OnLive claims to have addressed this issue in its proprietary software and hardware portfolio, and while specific details are in short supply, it is likely their custom hardware partnership with Dell has greatly contributed to solving this fundamental problem. Dell Data Center Solutions (DCS) was tapped to work alongside OnLive engineers to build a high-density, massively scalable GPU-powered server deployment.
The hardware has been designed in such a way that a new rack of servers can be wheeled into an OnLive data center, plugged in, and be churning out frames within a single day, providing rapid scaling as demand for the service dictates. Even with just the initial launch hardware, it is already being touted as “one of the largest GPU-powered server deployments to date.”
Details about the actual hardware are sparse. In a brief phone conversation with Jane Anderson, OnLive’s press liaison, she confirmed to us that the OnLive service utilizes hardware from both ATI and nVidia (no specific models mentioned, unfortunately), and that their custom software controls this cacophony of hardware to streamline the rendering and encoding processes. The currently accepted theory of how it all works revolves around real-time video compression of the rendered frames, which puts the ball primarily in nVidia’s court (at least for the compression portion, if GPUs are used for this task). The video is rendered, presumably at 24-30 frames per second, and instead of being output to a display, the video stream is intercepted by OnLive’s custom software stack, which encodes the image stream on a GPU or some sort of custom-designed FPGA and routes the resulting data back through the Internet to your computer or console, where it is decoded and displayed on your television or computer monitor at up to 720p HD resolution. [According to OnLive’s support page, users must have a 5 Mbps connection for 1280×720 HD video quality, and a minimum of 1.5 Mbps downstream for SD 720×480.]
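Those published bandwidth figures hint at how hard the encoder has to work. A quick calculation from the numbers above (the 30 fps frame rate and 24-bit color are our assumptions; OnLive has not published either):

```python
# How much compression does a 5 Mbps 720p stream imply?
width, height = 1280, 720    # OnLive's stated HD resolution
bits_per_pixel = 24          # assumed: standard 24-bit color
fps = 30                     # assumed: upper end of 24-30 fps

raw_bps = width * height * bits_per_pixel * fps
stream_bps = 5_000_000       # OnLive's stated HD bandwidth requirement

print(f"raw video: {raw_bps / 1e6:.0f} Mbps")
print(f"implied compression ratio: {raw_bps / stream_bps:.0f}:1")
```

Uncompressed 720p video at 30 fps works out to roughly 664 Mbps, so the encoder must squeeze it by a factor of about 133:1 in real time, every frame, with essentially no buffering, which goes a long way toward explaining the talk of dedicated GPUs or custom FPGAs for the job.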
It is conceivable (though we must stress this is all conjecture) that the graphics rendering could take place on any GPU regardless of manufacturer, and then be passed off to a specialized GPU, CPU, or custom-designed FPGA for encoding. This would give OnLive the ability to install hardware from any vendor offering a board with the right power level and feature set (e.g. DX10/DX11) needed to render a game without having to wait (and, in turn, force OnLive’s customers to wait) for a single manufacturer to implement the latest features.
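In software terms, that conjectured split amounts to putting rendering and encoding behind separate interfaces so either stage can be backed by whatever silicon is in the rack. A minimal sketch of the idea (every name here is hypothetical; OnLive has published no API, and the frame and encode stand-ins are placeholders):

```python
# Sketch of a vendor-agnostic render/encode split: the pipeline only
# cares about the interfaces, not whose hardware sits behind them.
from abc import ABC, abstractmethod

class Renderer(ABC):
    @abstractmethod
    def render_frame(self, game_state: dict) -> bytes: ...

class Encoder(ABC):
    @abstractmethod
    def encode(self, frame: bytes) -> bytes: ...

class AnyVendorGPU(Renderer):
    """Could be ATI or nVidia; the pipeline neither knows nor cares."""
    def render_frame(self, game_state: dict) -> bytes:
        return b"raw-frame"      # stand-in for a rendered framebuffer

class FPGAEncoder(Encoder):
    """Could equally be a GPU or CPU encoder behind the same interface."""
    def encode(self, frame: bytes) -> bytes:
        return frame[:4]         # stand-in for a compressed bitstream

def serve_frame(renderer: Renderer, encoder: Encoder, state: dict) -> bytes:
    """One tick of the conjectured pipeline: render on whatever GPU is
    installed, then hand off to whatever encoder hardware is present."""
    return encoder.encode(renderer.render_frame(state))
```

The payoff of this decoupling is exactly the one described above: swapping in a DX11 board from a new vendor means adding one `Renderer` implementation, with no changes to the encoding or streaming side.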
Dell has not responded to our inquiries about actual hardware specifications, but we will update this story if new details emerge. S|A
Dave Morgan