
Physics Drivers Outrage: Nvidia Guilty?


  • Physics Drivers Outrage: Nvidia Guilty?

    Opinion - When we published an article detailing Nvidia's advantage in 3DMark Vantage, we had a good feeling that the data might spark some controversy. Using the GPU for physics calculations in a CPU benchmark is a highly suspicious thing any way you look at it. And indeed, it appeared that Nvidia had been caught with its hand in the cookie jar. Finger-pointing was the result, but as it turns out, there are always two sides to the story, and the benchmark maker has an entirely different opinion.

    It isn't like Nvidia and ATI have always played nice. And if you notice anything but the usual in this industry, you are bound to suspect cheating, just as in this recent physics outrage, which erupted when Nvidia started claiming a huge advantage in the 3DMark Vantage physics test. Earlier this year, Nvidia was involved in the highly controversial Creedgate, when Ubisoft found that its "The Way It's Meant To Be Played" Assassin's Creed title was faster on ATI cards. The company decided to remove DX10.1 support, since the game was not just slower, but also unstable on GeForce cards. The explanation that followed this controversy was received as rather doubtful.

    In the most recent case of accusations, AMD claims that Nvidia has been fiddling with the 3DMark Vantage benchmark. Just hours after that story was published, ATI partners contacted us with similar claims about Unreal Tournament 3. The whole affair revolves around Nvidia's driver version 177.39.

    It would all be fine and dandy if somebody had actually called representatives of the two software companies and asked them for an explanation of what actually happened. We were able to contact those companies and received surprising statements from Oliver Baltuch, president of Futuremark, and Mark Rein, vice president at Epic Games.

    Let's take a step back and look at how the physics case unfolded:

    November 2004: Nvidia introduces its first commercial chipset supporting multiple GPUs - the nForce 4 SLI for the AMD platform. AMD downplays the value of multi-GPU cards, claiming that the value lies within a single-die GPU.

    February 2005: Nvidia announces that the company has shipped three million SLI-capable chipsets.

    March 2005: ATI unveils the never-actually-launched X850 Crossfire platform, consisting of two Radeon X850 boards connected via an external cable. The solution never saw any real volume.

    March 2006: GPU physics begins its life as a marketing gimmick between ATI and Nvidia. Both companies announce GPU physics at GDC Spring in San Francisco using Havok FX, a subset of the Havok physics API that used the GPU to "animate" physics. It was never really true physics.

    May 2006: At E3 2006 in Los Angeles, key game developers criticized Havok FX and decided to go with either Havok or Ageia's PhysX API, since both APIs are CPU-agnostic and work on almost all platforms.

    June 2006: ATI was the first company to demonstrate the new technology at Computex in Taipei, using a system with three X1900 XTX graphics cards.

    September 2007: Intel buys Havok and Nvidia/AMD open negotiations with Ageia. AMD did not want to pay for Ageia and decided that the role of physics on a GPU should be buried.

    November 2007: At the AMD Phenom launch in Warsaw, AMD's developer relations manager says: "GPU physics is dead for now".

    February 2008: Nvidia announces the acquisition of Ageia.

    April 2008: During its Financial Analyst Day, Nvidia announces that a physics driver will be available to the general public by mid-summer. The first public demonstration did not go as planned, but the potential was clear.

    June 2008: Nvidia releases PhysX Application Software 8.06.12 first to the general press, then to the public. This version of PhysX enables GPU acceleration of the PhysX API. Controversy sparks around 3DMark Vantage and Unreal Tournament 3.

    Looking back at this history, we notice that ATI's first reaction to multi-GPU was negative, but the company followed suit with Crossfire and is now preaching the advantages of smaller GPUs over large monolithic dies. Later, AMD downplayed the value of GPU physics and then announced that it had reached an agreement with Intel/Havok. But this move was "too little, too late" for companies like Epic and Futuremark, who made their design calls years ago. AMD didn't work on GPU physics and even tried to bury it. As a result, PhysX has become the physics API of choice for more than 150 games, and Futuremark used PhysX in its benchmark.

    AMD's Official Statement: Nvidia fools 3DMark Vantage

    The issue of AMD attacking Nvidia over 3DMark was summed up in an interesting article by my ex-colleague Charlie Demerjian. We have received an official statement from Dave Baumann, former head of Beyond3D and now in a senior technical role inside AMD's graphics unit:

    "We believe physics simulation, whether performed on the CPU or the GPU, will be an increasingly important feature of upcoming games. The powerful parallel processing capabilities of modern GPUs have been proven to be very useful for accelerating some types of physics calculations, such as cloth simulations and rigid body collisions, used to enhance game visuals. However, using the GPU in this way only makes sense if it doesnít detract from graphics rendering performance. In other words, adding a few more moving objects into a scene isnít necessarily beneficial if it requires other 3D effects to be simplified, or sacrifices resolution and frame rate.

    3DMark Vantage attempts to address the growing importance of game physics by including support for GPU-accelerated physics in the GPU tests, implemented using DirectX 10 geometry shaders. The developers balanced the physics and rendering workloads in a way they felt was reflective of what we would see in next-generation games. Additionally, they included CPU tests that supported the use of Ageia PhysX PPUs to offload some physics calculations from the CPU. This decision was made prior to the acquisition of Ageia by Nvidia, and the subsequent discontinuation of discrete PPU products.

    Recently released drivers from Nvidia (ForceWare 177.39) fool the 3DMark Vantage benchmarks into thinking an Ageia PhysX PPU is installed, while actually doing the additional physics processing on the GPU. Since Vantage has separate GPU & CPU benchmarks which both include physics processing, this causes the performance benefits of GPU physics to be double-counted, resulting in an artificial inflation of the final score. Real games can be expected to limit the amount of GPU physics processing to avoid significantly impacting rendering performance. Also, we are confident that the vast majority of upcoming game titles will not include support for PhysX, but will instead rely on more popular physics middleware (such as Havok) or proprietary physics engines, which will not benefit in any way from Nvidia's PhysX drivers."
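    To make the double-counting argument concrete, here is a purely illustrative sketch. The weighting, the sub-scores and the composite() formula below are hypothetical and do not reflect Futuremark's actual scoring math; the sketch only shows the mechanism AMD describes: offloading the CPU physics test to the GPU raises the CPU sub-score, which lifts the composite result even though rendering throughput is unchanged.

        #include <cmath>
        #include <cstdio>

        // Hypothetical composite score: a weighted geometric mean of the GPU and
        // CPU sub-scores. The real 3DMark Vantage formula differs; the weights
        // here are assumptions chosen only to illustrate AMD's point.
        static double composite(double gpuScore, double cpuScore) {
            const double gpuWeight = 0.75, cpuWeight = 0.25; // assumed weights
            return std::pow(gpuScore, gpuWeight) * std::pow(cpuScore, cpuWeight);
        }

        int main() {
            // Baseline: the CPU physics test actually runs on the CPU.
            double baseline = composite(10000.0, 4000.0);

            // With GPU PhysX: the CPU physics test is offloaded to the GPU, so the
            // CPU sub-score jumps while the GPU (graphics) sub-score stays put.
            double withGpuPhysX = composite(10000.0, 12000.0);

            std::printf("baseline composite:       %.0f\n", baseline);
            std::printf("with GPU PhysX composite: %.0f\n", withGpuPhysX);
            return 0;
        }

    With these made-up numbers the composite result climbs by roughly a third without a single extra frame being rendered, which is exactly the kind of inflation AMD is objecting to.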

    Summing up, AMD claims that the ForceWare 177.39 driver "fools" the 3DMark Vantage benchmark. We're not so sure. Ageia has built a very solid library of titles that use the PhysX API. Being a standard library within the Unreal Engine alone got Ageia more than one hundred contracts. PhysX (which absorbed the NovodeX and Meqon APIs) is the most commonly used physics API for console games today, Sony has licensed the PhysX SDK as the official physics engine for the PlayStation 3 console, Microsoft licensed PhysX for its Robotics Studio, and the list goes on. So, why would we undermine PhysX's value as an API? Because of Intel's Havok?
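    For readers who have never touched the PhysX API that all of these titles build on, the following is a rough sketch of the classic Ageia-era (2.x) usage pattern. The same CPU-side code drives the simulation whether the work ends up in software, on a discrete PPU or, with PhysX Application Software 8.06.12, on a GeForce GPU; names follow the 2.x SDK as we remember it, so treat exact identifiers and signatures as approximate.

        #include <NxPhysics.h> // Ageia/Nvidia PhysX 2.x SDK main header

        int main() {
            // Create the SDK object; the runtime decides whether simulation runs
            // in software, on a PPU or on a supported GeForce GPU.
            NxPhysicsSDK* sdk = NxCreatePhysicsSDK(NX_PHYSICS_SDK_VERSION);
            if (!sdk) return 1;

            // A scene with standard gravity.
            NxSceneDesc sceneDesc;
            sceneDesc.gravity = NxVec3(0.0f, -9.81f, 0.0f);
            NxScene* scene = sdk->createScene(sceneDesc);

            // One dynamic box actor (a simple rigid body).
            NxBodyDesc bodyDesc;
            NxBoxShapeDesc boxDesc;
            boxDesc.dimensions = NxVec3(0.5f, 0.5f, 0.5f); // half-extents
            NxActorDesc actorDesc;
            actorDesc.shapes.pushBack(&boxDesc);
            actorDesc.body = &bodyDesc;
            actorDesc.density = 10.0f;
            scene->createActor(actorDesc);

            // Typical per-frame loop: kick off the step, then collect results.
            for (int frame = 0; frame < 60; ++frame) {
                scene->simulate(1.0f / 60.0f);
                scene->flushStream();
                scene->fetchResults(NX_RIGID_BODY_FINISHED, true);
            }

            sdk->releaseScene(*scene);
            NxReleasePhysicsSDK(sdk);
            return 0;
        }

    Because the dispatch decision sits below this API, a game that already ships PhysX content does not have to change its own code to benefit from GPU acceleration, which is precisely why the Unreal Engine integration paid off for Ageia.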

    Wasn't this a case of "fooling" a benchmark in the first place?

    Futuremark: No cheating here

    3DMark Vantage is in the hot seat as far as Nvidia's PhysX goes, because this is the first time that a GPU is influencing CPU scores. AMD claims that Nvidia violates the BDP Driver Rules. This is what Futuremark had to say:

    "The driver in question has not been submitted for authorization and is only for demo purposes only. Nvidia has followed the correct rules for driver authorization and the BDP by sending us the 177.35 published driver (the same as AMD has now sent us the 8.6 published driver), both of which are currently undergoing the Authorization process in our Quality Assurance area at this moment.

    Only drivers that have passed WHQL and our driver Authorization Process have comparable results that will be allowed for use in our ORB database and hall of fame. Other drivers which have not been submitted will not be commented on. Otherwise, we would have to inspect every Beta and press driver that is released.

    Our application is not changed in any way, thus any statement implying otherwise is incorrect."

    According to Futuremark, Nvidia did not violate the BDP Driver Rules. Then again, the company didn't state that the 177.39 driver was legit either. However, it is not the 177.39 display driver that enables physics on the GPU; that is done by the PhysX 8.06.12 Application Software. We spoke with Oliver and other members of the Futuremark team and learned that they have no issue with the PhysX Software 8.06.12, because it is WHQL certified, but the display driver has to be certified as well. Once that is done, both PhysX 8.06.12 and 177.35 will be certified for use on ORB.

    The only difference between the upcoming WHQL driver 177.35 and the 177.39 beta is the inclusion of support for the GeForce 9800 GTX+, a strangely named 55 nm die-shrink of the G92 chip.

  • #2
    Epic Games: It's customer service

    Following our conversation with Futuremark, we spoke with Mark Rein, VP of Epic Games. Mark is known to be quite knowledgeable when it comes to new technologies, and the company isn't shy about pointing fingers, even at the largest corporations.

    When it comes to the topic of physics on a GPU, Mark believes that using the GPU for physics is not a cheat. Rather, he considers this feature a bonus for users of Unreal Tournament 3 who own Nvidia graphics cards.

    "Those users can now play games that offer features that were designed for an Ageia PPU, simply by using their high-end GPU. The only thing that Nvidia did was to change the library that we shipped with the game, and ultimately made those levels run better."

    Mark called it ironic that nobody cried foul when Epic released the UT3 Bonus Pack, which contained PhysX-enabled levels and required an Ageia PPU to run. Nvidia's purchase of Ageia made this technology available to millions of gamers instead of several thousand (according to some sources, Ageia shipped only 120,000 boards).

    Mark chose to call Nvidia's PhysX driver "customer support". He mentioned that Nvidia had a long history of going the extra mile to improve its customers' PC gaming experience through driver features and optimizations created by working closely with developers.

    Personally, I have been advocating physics in games for ages now, because it is the only way to enable the creation of realistic games. For me, physics can be Havok, PhysX or even ray tracing, but games and other applications should have physics, because looks are nothing if the game creators cannot put their feelings about the world into motion.

    How many times have we complained about race cars not crashing when they touch a curb or cross the grass at very high speed? Physics is the answer, and Nvidia's PhysX is one of the roads that game developers can take.

    Nvidia: PhysX driver for the public has arrived, PhysX part is WHQL certified

    At the end of the day, we asked Nvidia when the PhysX driver would be available to the general public and, more importantly, when there would be a WHQL-certified driver that Futuremark can approve. We were given an answer by Bryan Del Rizzo, one of the members of Nvidia's PR team:

    "The PhysX system software is WHQL. [The] 177.39 display driver is BETA (so the co-installer will be under beta downloads)"

    This answer didn't exactly satisfy us; we wanted to hear when the driver would be WHQL'ed and when it would go through Futuremark's BDP Driver Certification process.

    Nvidia sources told us that the WHQL display driver is currently being certified at Microsoft. The release is expected by mid-summer. While a specific date was not given, common sense suggests that the WHQL driver will be available when 9800 GTX+ cards ship - which will be July 11.

    For now, you can find the official beta drivers on the following pages: 32-bit Windows XP, 32-bit Windows Vista and 64-bit Windows Vista.
    If you want the PhysX Application Software in stand-alone form, the download link is here. So far, supported boards include the GeForce 8, 9 and GTX 200 series.


    To us, there is a clear conclusion to this matter. AMD rained on its own parade and on the launch of its excellent 4850 and 4870 cards. We got the impression that the software vendors in question believe that their products benefit from GPU physics and that accelerating in-game physics with a GPU was a positive move. Forget corporate politics for one minute and figure this one out: Nvidia modified PhysX and brought it to millions of owners of GeForce cards. We would say around 10 million, since we do not think that the 8600 series and below are capable of serving PhysX demands alongside graphics.

    If you own a GeForce 8800, 9600/9800 or GTX 200 series card, we can only recommend downloading the latest drivers and PhysX software when they become available and playing those levels in Unreal Tournament 3, Ghost Recon and other PhysX games.

    Physics Drivers Outrage: Nvidia Guilty? - Tom's Hardware


    • #3
      Nvidia rules in some ways, but is trying to monopolize the graphics card biz IMO.
      Raving Masocore

      -Currently Playing - Electronic Super Joy


      • #4
        so much info lol..eyes hurt


        • #5
          Nvidia is trying its best to bury ATI. SLI is clearly a very successful adaptation of the Nvidia platform, and Crossfire just isn't as popular, nor will it ever be (my opinion only; if you disagree then PM me, don't slander my name on a public forum). I believe on-GPU physics is an excellent idea, apart from the fact that you can't have the best of both worlds (well, the majority of the time). So if Nvidia integrates some form of physics processing ability into their GPUs, I believe it would be in their best interest to design it in such a way that it does not impact the graphics processing ability of the card. If it did substantially affect the ability of the card, i.e. frame rate or texture detail, then I can predict a large swing of the market over to the ATI camp.