Forsyth: how ‘Larrabee’ will change the game

Tom Forsyth, a software and hardware architect on the Intel architecture code-named “Larrabee,” is joining game industry legend Michael Abrash to present the Larrabee New Instructions (LRBni)—the programming methods and hardware instructions for Larrabee—at the Game Developers Conference in San Francisco, Calif.

In the following Q&A—excerpted from an interview printed this week in Intel’s graphics magazine, Visual Adrenaline—Forsyth describes his work on the Larrabee project and how Larrabee will affect game developers (read the full interview). Prior to joining Intel, Forsyth worked on various games at Microprose, Mucky Foot, and SEGA; wrote Microsoft DirectX drivers at 3DLABS; and was one of the principals at RAD Game Tools in Seattle, Wash.
Q. Some have theorized that working on the Larrabee project must be “like being at the absolute center of the gaming universe.” Is that how you see it? And do you interact with other experts both inside Intel and in other companies?
I’d say it’s at the leading edge of the gaming graphics universe. We always have to bear in mind that while graphics are important, they’re not what a game is fundamentally about. Games are created and played because they’re fun, and graphics are only a part of that. But it’s the part that gets the most attention and the most research, and when you include all the art and animation staff, it’s where the majority of the effort and development costs go.

One of the most fascinating things I’ve faced being part of the Larrabee team has been reconciling the needs of hardware design, software drivers, and the game developers who use the whole rendering stack. When considering how to implement a graphics algorithm, we’ve had an almost unique chance to decide whether a feature is built into the hardware of the chip, handled entirely by the software, or whether it’s a hybrid of the two—creating a modified instruction to help accelerate a more general path.

We’ve worked very closely with a bunch of very smart people. In addition to our public ties with people such as Professor Pat Hanrahan at Stanford and Tim Sweeney at Epic Games, many developers across the games industry and academia have helped us shape the direction of the hardware and software.

It’s been really fun working with them, seeing their reaction to early designs, iterating on their ideas and wishes, and finally making a lump of smart rock that contains as many of those ideas as we could cram in. And we’ve barely started discussing the details in public—once the covers come off, I’m looking forward to seeing people do some awesome stuff with Larrabee.

Q. Given the huge amount of development with OpenGL and DirectX, do you see both [graphics programming] standards supported going forward, along with a new model? How do you balance support for existing programming standards with new capabilities that are specific to Larrabee?
On a basic level, there are two extremes of graphics programmers. One extreme wants to make the best graphics possible, and they are happy to pick a single platform and API [application programming interface] in pursuit of that goal. These programmers will aggressively use all the coolest new features and jump through any necessary hoops to get the very best graphics they can. They push the limits of graphics on the high-end PC cards and on the single-platform console games. If a graphics card exposes a feature, even through a custom one-off API, they will use it. We love these people, and they will make Larrabee do astonishing, outrageous things, but sadly there aren’t many of them.

The other extreme wants to make the best possible game for the greatest number of people. Their game must ship on as many different graphics cards and platforms as possible. These programmers need to pick a set of features that they can support on that wide range of platforms, and to do that they may need to ignore some of the unique features of each platform.

To me this skill is no less amazing: being able to target platforms as diverse as 20 different PC cards, the Microsoft Xbox 360, the Sony PlayStation 3, and the Nintendo Wii, each with its own little quirks, and still achieve consistency is truly impressive. Though I guess you have to have tried it and done it to realize just how tricky it is.

These people absolutely need every bit of help to make their life simpler, and the OpenGL and DirectX standards are crucial for this. If a feature is not exposed through those standards, they honestly don’t have time to use it. Larrabee must have the very best OpenGL and DirectX support possible, because that is what the majority of titles use.

Then there’s the middle ground: programmers who mainly use the existing APIs but have a bit of freedom to play with a few novel features here and there. Maybe they have a neat lens-flare effect on one card and not another. Maybe high dynamic range is implemented a different way on one card. Or maybe particles are rendered slightly better on one platform than on another.

Included in this category is middleware (both public and internal to studios)—when you’re making a graphics engine that will be used on multiple titles, adding a few bits of special code targeting specific cards is a reasonable investment of time and effort. So these people need the standard APIs, but they can also explore a bit around the edges and play with some of the new features as long as these features aren’t too crazy.

We’re supporting all three types of programmers: rock-solid Direct3D and OpenGL support for those who need it, extended features for those who want to dip their toes in the future, and “bare metal” programming in C++ and even assembly language for those who want to run headlong into the new-but-old world of software rendering.

Q. There was some confusion earlier about whether Larrabee would render graphics using rasterization or ray tracing. You announced on your blog (http://www.eelpi.gotdns.org/blog.wiki.html) last April that Larrabee was absolutely committed to supporting the conventional rasterization pipeline, yet ray tracing would still be enabled for what you described as “wacky tech” projects. What did you mean by that? Is ray tracing still too far out of the mainstream to be considered for cost-justified projects?
Ray tracing and rasterization are two very different rendering methods, each with its own strengths and weaknesses. Games programmers have been using rasterization almost exclusively for well over a decade, and the art techniques and pipelines are tuned to rasterization’s strengths and away from its weaknesses. This also shapes what sort of games people make—where they set them and what they allow you to do. If rasterization doesn’t do a certain thing well, people won’t tend to make games like that.

Ray tracing requires some significantly different styles of artwork and content to make it shine, and it will allow and encourage different styles of game. And indeed I expect some of those to be very wacky (in a good way!). But that’s going to take a while, and there needs to be broad hardware support before many teams can experiment with these new types of content.

I don’t believe ray tracing is inherently “better” than rasterization—it’s just different. But one of the real joys of Larrabee is being able to have this discussion at all! We finally have a bit of hardware that can run both algorithms on an even playing field. We soon will have the smartest people in the world writing renderers using their own techniques, and we’ll be able to do exact apples-to-apples comparisons and find out.

And better yet, you won’t actually have to declare a “winner” at all—you’ll be able to have both. Hybrid schemes are already being played with and discussed: Use rasterization for part of your scene and ray tracing for another. It’s peanut butter and chocolate.

Of course, there’s nothing new under the sun. Michael Abrash likes to remind me that the original Quake engine has two fairly different rendering schemes in it.

One is the span-buffer method of occlusion, which is used for the level geometry; over the top of that is a more “traditional” Z-buffer scheme, which renders the characters. Each method is more efficient for its particular workload. It’s nice to have that sort of flexibility back in rendering algorithms, and to give that power back to developers to see what they do with it.
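The two occlusion schemes Forsyth mentions can be sketched conceptually. The following is an illustrative editor's sketch in Python—all names and data layouts here are hypothetical, not Quake’s actual code. A Z-buffer tests visibility per pixel against a stored depth; a span buffer tracks, per scanline, which horizontal ranges are already covered, so with front-to-back traversal each pixel is written at most once.

```python
def zbuffer_write(zbuf, color_buf, x, y, depth, color):
    """Classic per-pixel Z-buffer test: write only if this fragment is
    closer than the depth already stored at (x, y)."""
    if depth < zbuf[y][x]:
        zbuf[y][x] = depth
        color_buf[y][x] = color
        return True
    return False


def insert_span(spans, start, end):
    """Span-buffer occlusion for one scanline.

    'spans' holds already-drawn [start, end) pixel ranges. Returns the
    sub-ranges of the new [start, end) span that are still visible, and
    records them as drawn. Drawing polygons front to back means each
    pixel on the scanline is written at most once."""
    visible = []
    cursor = start
    for s, e in sorted(spans):
        if e <= cursor:          # existing span entirely behind the cursor
            continue
        if s >= end:             # existing span past the new range
            break
        if s > cursor:           # gap before this span is visible
            visible.append((cursor, min(s, end)))
        cursor = max(cursor, e)  # skip over the occluded part
        if cursor >= end:
            break
    if cursor < end:             # tail after the last occluder
        visible.append((cursor, end))
    spans.extend(visible)        # mark the newly drawn ranges as covered
    return visible
```

For example, inserting a span covering pixels 3–12 on a scanline where 0–5 and 10–15 are already drawn yields only the uncovered middle range, 5–10.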

Q. You’ve been right in the middle between the hardware and software engineers bringing Larrabee to market. What has been the biggest challenge so far?
The biggest challenge is keeping people aware of that balance between hardware and software. People tend to be polarized; they’re either hardware or software, and they speak different languages. We always need to keep that dialog going between the two camps. That’s my main role through the design phase.

Software design can happen much later, so you don’t have to decide everything up front. Some things can change after the chip is designed. The amazing thing is that people still ask about changing stuff in Larrabee. Are you kidding me? The architecture has been locked down for a year.

Q. What is Larrabee’s biggest benefit to game developers?
The primary focus used to be making more realistic graphics. Now, as we get there, we see that realism is somewhat overrated. Films don’t have real lighting—it’s faked like crazy. Real physics aren’t that fun—if I fall 12 feet, I break my leg. Real AI will headshot you every time.

So you want the game to look intelligent and realistic while still having fun beating it. We need to enable as much realism as the developers want, then allow them the fine control to step sideways. An example of that in graphics is colorizing and brightening, such as in the TV show Pushing Daisies, where scenes are filled with pastels, or have huge contrast, or their hue changes to emphasize a mood. The colors are completely unrealistic and yet we not only accept them, they tell us things without conscious input. And it all relies on exquisitely fine control over the rendering.

That is what Larrabee will provide to developers: new techniques for enhancing the visceral feel of a game.

-----------------------------
BY Intel Stuff
Source: circuit

© 2009 Intel Corporation


Copyright 2008-2009 Daily IT News