EA, Take-Two adopt Nvidia PhysX
December 10, 2008, 9:40 AM
By Brooke Crothers

Rival publishers simultaneously reveal plans to adopt Ageia-powered physics engine in upcoming PC titles.


Electronic Arts and Take-Two Interactive Software are adopting Nvidia's PhysX technology, bringing more realistic physics to PC gaming. Nvidia, the largest graphics chip supplier, announced this week that both publishers have licensed PhysX as a development platform.

Nvidia got its physics technology when it acquired Ageia in February. PhysX runs on the graphics processing unit, or GPU. Intel and Advanced Micro Devices, on the other hand, have been promoting technology that is executed on the central processing unit, or CPU. Intel's approach uses technology from Havok, a developer of a physics engine that Intel bought in September of 2007.

Adhering to the laws of physics
The goal of Nvidia's technology--based on the laws of physics--is to make game objects respond realistically to physical events. More conventional technology uses a canned response, in which the same motion is repeated over and over: a window breaks, or a person falls, the same way every time. In a PhysX-enabled football game, however, the angle and velocity of each impact are calculated by the GPU to generate a real-time response that is different every time.
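To make that concrete, here is a minimal sketch (not PhysX source, and purely illustrative) of the kind of per-impact calculation described above: each colliding object's response is derived from its own impact velocity and contact normal, so no two reactions come out the same. All names below are made up for the example.

    // Minimal sketch of a per-impact response kernel. Not PhysX code.
    #include <cuda_runtime.h>

    struct Vec3 { float x, y, z; };

    __device__ Vec3  vsub(Vec3 a, Vec3 b)    { Vec3 r = {a.x - b.x, a.y - b.y, a.z - b.z}; return r; }
    __device__ Vec3  vscale(Vec3 a, float s) { Vec3 r = {a.x * s, a.y * s, a.z * s}; return r; }
    __device__ float vdot(Vec3 a, Vec3 b)    { return a.x * b.x + a.y * b.y + a.z * b.z; }

    // One thread per impact: reflect the incoming velocity about the contact
    // normal and damp it by a restitution factor, so the result depends on the
    // angle and speed of that particular collision.
    __global__ void resolveImpacts(const Vec3* velocity, const Vec3* normal,
                                   Vec3* response, float restitution, int n)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i >= n) return;

        Vec3 v   = velocity[i];
        Vec3 nrm = normal[i];
        float vn = vdot(v, nrm);   // speed into the surface
        // v' = v - (1 + e) * (v . n) * n  -- standard collision-response formula
        response[i] = vsub(v, vscale(nrm, (1.0f + restitution) * vn));
    }

A game would launch this once per simulation step, for example resolveImpacts<<<(n + 255) / 256, 256>>>(d_vel, d_normal, d_out, 0.4f, n);, with n being the number of contacts found that frame.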

The technology was meant to run on the GPU, according to Jon Peddie, whose firm tracks developments in the graphics chip industry. "It's a GPU thing, and the fact that EA and Take-Two are coming out [with support] gives you a clue why," Peddie said. "This really is a significant event," he said, "enabling the GPU to do physics."

Ageia's secret sauce is its physics libraries, which are supported on Microsoft's Xbox 360, Sony's PlayStation 3, and Nintendo's Wii, as well as on the CPU and Ageia's own PPU (physics processing unit), Ujesh Desai, vice president of product marketing at Nvidia, said in an interview last week. "It's a very open platform. Something game developers really liked, which is why a lot of game developers adopted it," he said.

The launch pad for Ageia on the PC is Nvidia's CUDA, or Compute Unified Device Architecture. CUDA already has a large installed base of GPUs that can run a C program, "which is what PhysX is," Desai said. "We bought Ageia, [and] they ported their PhysX API to our GPU, using our C compiler on top of CUDA. So now there are 100 million GeForce [chips] out there that can do PhysX processing."
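For readers unfamiliar with CUDA, here is a minimal, self-contained example (unrelated to the PhysX source) of what "a C program that runs on the GPU" looks like: the __global__ function is ordinary C code, compiled by Nvidia's nvcc compiler and executed by many GPU threads at once.

    // Plain C code compiled for the GPU: each thread adds one pair of elements.
    #include <cstdio>
    #include <cstdlib>
    #include <cuda_runtime.h>

    __global__ void addArrays(const float* a, const float* b, float* c, int n)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) c[i] = a[i] + b[i];
    }

    int main()
    {
        const int n = 1 << 20;
        const size_t bytes = n * sizeof(float);

        // Host-side buffers.
        float* ha = (float*)malloc(bytes);
        float* hb = (float*)malloc(bytes);
        float* hc = (float*)malloc(bytes);
        for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

        // Device-side buffers.
        float *da, *db, *dc;
        cudaMalloc((void**)&da, bytes);
        cudaMalloc((void**)&db, bytes);
        cudaMalloc((void**)&dc, bytes);
        cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
        cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

        // Launch enough 256-thread blocks to cover all n elements.
        addArrays<<<(n + 255) / 256, 256>>>(da, db, dc, n);
        cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);

        printf("c[0] = %f\n", hc[0]);   // expect 3.0
        cudaFree(da); cudaFree(db); cudaFree(dc);
        free(ha); free(hb); free(hc);
        return 0;
    }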

And PhysX-enabled games will offer much greater realism. "Today, the way they do sports games is motion capture. They capture the different animation--running, falling," Desai said. "What you realize is that for the first 5 to 10 minutes of the game (or movie), it looks believable, but after you play for a while, you realize, wait a minute, every time he falls, he falls the same way. Every time I make that tackle, it looks the same."

The game Backbreaker uses PhysX. "They're calculating those tackles in real time, based on how the body interacts and the body mechanics interact. So no two tackles are the same," according to Desai. Another game, Mirror's Edge, is coming out for the PC in January from EA's Swedish studio DICE.

"Ageia changed the rules on this," Peddie said. "It's much, much more realistic."

Ageia's physics was originally done on an Ageia Physics Processing Unit, Peddie said. "This was the only way to make it work. But now this capability [software] has been ported to Nvidia GPUs, and this can be done on Nvidia silicon," he said.

Physics can also be used to make things look more photo-realistic. "In today's games, cloth and hair look very fake because you don't have the right physical properties," Desai said. But with PhysX, "all these things can be physically simulated."
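As a rough illustration of what "physically simulated" means here, the toy kernel below moves each particle of a single hair strand under gravity plus spring forces toward its neighbours, one GPU thread per particle. This is only a sketch under simplified assumptions (explicit Euler integration, one strand, no collisions); it is not how PhysX implements cloth or hair internally.

    // Toy hair-strand step: not PhysX code, just the general idea.
    #include <cuda_runtime.h>

    struct Particle { float x, y, z; float vx, vy, vz; };

    // Reading from `in` and writing to `out` keeps threads from racing on
    // their neighbours' positions within a step.
    __global__ void stepStrand(const Particle* in, Particle* out,
                               float restLen, float stiffness, float dt, int n)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i >= n) return;

        Particle p = in[i];
        float fx = 0.0f, fy = -9.81f, fz = 0.0f;   // gravity

        // Spring force toward the previous and next particle on the strand.
        for (int j = i - 1; j <= i + 1; j += 2) {
            if (j < 0 || j >= n) continue;
            float dx = in[j].x - p.x, dy = in[j].y - p.y, dz = in[j].z - p.z;
            float len = sqrtf(dx * dx + dy * dy + dz * dz) + 1e-6f;
            float s = stiffness * (len - restLen) / len;
            fx += s * dx;  fy += s * dy;  fz += s * dz;
        }

        // Explicit Euler integration; the root particle (i == 0) stays pinned.
        if (i > 0) {
            p.vx += fx * dt;  p.vy += fy * dt;  p.vz += fz * dt;
            p.x  += p.vx * dt;  p.y += p.vy * dt;  p.z += p.vz * dt;
        }
        out[i] = p;
    }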

Havok--the company Intel acquired--was the first to introduce physics into games and bring out a physics library. Havok's physics has been run on the CPU in a time-scheduled way, Peddie said. "Because of that, there weren't many CPU resources to really do a great job on the physics," he said. "Nothing would really happen. What happened, at most, is that you would hit this thing (a window or a wall, for example), and it would apply a decal to indicate that there was some change in it. It's not very realistic."

AMD, for its part, will pursue a balanced platform. "The GPU is a great place to do processing. We'll do the offloading [to the GPU], where it makes sense," said Korhan Erenben, product marketing manager at AMD Graphics Products Group. "[But] we are aligned with Havok, in terms of working on a future direction of physics. Right now, it is on the CPU, and we think that serves the broad installed base. Taking it to the next step would be to have a capability on the GPU--where and when it makes sense."

Physics is better on GPUs
Peddie explained why physics is better suited to the GPU than the CPU. GPUs today typically have hundreds of processors that are good at doing many things in parallel. "If you have threads or processes that can be run simultaneously, [and] if you have processors available to deal with each one of those threads, then you can get your results a lot sooner," he said.

He described a technique called Single Instruction, Multiple Data (SIMD). "The same instruction is the physics equation. Things fall toward Earth all the time. And the multiple data will be what the things are. It might be a rock, might be a person, might be the wheel of a car. You have to be able to process this stuff and have it behave in a realistic fashion. To do that, you have to process it very quickly," Peddie said. "The advantage that GPUs bring is that they have this humongous number of processors. Certainly as good as the [Intel] 486 ever was. So they're really good processors, and you've got hundreds of them literally inside the GPU."
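Here is a minimal sketch of the idea Peddie describes, assuming nothing about PhysX internals: one "instruction" (the gravity update) is applied in parallel to many different pieces of data, whether the body is a rock, a person, or a car wheel, with one GPU thread per object.

    // Same equation for every body; only the data differs. Illustrative only.
    #include <cuda_runtime.h>

    struct Body { float y; float vy; };   // height and vertical speed (1-D for brevity)

    __global__ void applyGravity(Body* bodies, float dt, int n)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i >= n) return;

        bodies[i].vy += -9.81f * dt;          // v = v + g*dt
        bodies[i].y  += bodies[i].vy * dt;    // y = y + v*dt
        if (bodies[i].y < 0.0f) {             // crude ground collision
            bodies[i].y  = 0.0f;
            bodies[i].vy = -0.5f * bodies[i].vy;   // bounce with half the speed
        }
    }

Each frame, the game would run something like applyGravity<<<(n + 255) / 256, 256>>>(d_bodies, 1.0f / 60.0f, n); across every falling object in the scene.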

There will be challenges for users, however. "The tricky part is, why would I want to take one graphics card and spend $500 on it, and then not use it for graphics but rather use it for physics?" he said. "The answer is, of course, I wouldn't." Peddie suggested that a gamer might use the really good card for physics and employ the old card "that you got last year" for graphics, assuming that there are enough slots in the PC.
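In CUDA terms, steering work at one card rather than another comes down to picking a device. The sketch below is purely illustrative (it is not the configuration interface the PhysX driver exposes to gamers): it enumerates the installed GPUs and selects the one with the most multiprocessors for compute work, leaving the other card free to drive the display.

    // Pick the biggest GPU for compute (physics) work. Illustrative only.
    #include <cstdio>
    #include <cuda_runtime.h>

    int main()
    {
        int count = 0;
        cudaGetDeviceCount(&count);

        int best = 0, bestProcs = 0;
        for (int d = 0; d < count; ++d) {
            cudaDeviceProp prop;
            cudaGetDeviceProperties(&prop, d);
            printf("GPU %d: %s, %d multiprocessors\n",
                   d, prop.name, prop.multiProcessorCount);
            if (prop.multiProcessorCount > bestProcs) {
                bestProcs = prop.multiProcessorCount;
                best = d;
            }
        }

        // Route subsequent CUDA (physics) work to the chosen card.
        cudaSetDevice(best);
        printf("Physics will run on GPU %d\n", best);
        return 0;
    }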


Source: http://www.gamespot.com/
