While it's only about one game, they're comparing how it runs with PhysX effects enabled in two particular setups:
- One using two Nvidia GPUs: a GTX 285 as the display adapter and a GTS 250 dedicated to PhysX acceleration, which is supported by Nvidia.
- One using an ATI card (HD 5870) as the display adapter and a dedicated Nvidia GPU (GTS 250) for PhysX acceleration, which was possible to do without hacks up until August, I think, when Nvidia disabled it on account of a couple of BS reasons.
They concluded that between these two scenarios, the game ran a little slower in the second one at the same resolution but offered more consistent performance. So Nvidia's move to drop support for a dedicated NV GPU running PhysX whenever the display adapter is from their rival was probably nothing but BS, even more so when they tout PhysX as an "open standard".
Is it that hard to understand?
boundle (thoughts on cracking AITD) wrote:
i guess thouth if without a legit key the installation was rolling back we are all fucking then
Did you read nvidia's official stance about dropping support on non-nvidia cards?
In order to make sure our customers have a great experience, we QA every release of our PhysX or Graphics drivers by testing approximately 14 NVIDIA GPUs for graphics processing with 8 GPUs for PhysX processing on 6 common platforms with 6 OS’s using 6 combinations of CPU and memory. This is over 24000 possible configurations. While we don’t test every possible combination, it should be clear that the work and cost to NVIDIA is substantial. AMD does not support PhysX for their customers. Adding AMD GPUs would significantly increase the necessary work and cost for NVIDIA. We prefer to invest in inventing new technologies that give our customers great new experiences.
nvidia is saying that they'd have to test all sorts of combinations with ati cards, and why would they want to do that when ati doesn't want to support physx? If someone has to do the QA on ati cards... well... that would be ati. Has ati ever done any testing on combining their graphics cards with nvidia's? No.
Now, how does the article you posted prove that nvidia is lying about the need to spend so much time and money on QA? Please enlighten me. All I'm reading is that nvidia's cards are perfect physx accelerators, which we already know for some time and guess what? nvidia is saying the same thing.
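For what it's worth, the combination count Nvidia cites roughly checks out. A quick sanity check, using only the figures from their own statement:

```python
# Figures taken straight from Nvidia's statement quoted above.
nvidia_gpus = 14        # GPUs tested for graphics processing
physx_gpus = 8          # GPUs tested for PhysX processing
platforms = 6           # common platforms
operating_systems = 6   # OS's
cpu_memory_combos = 6   # combinations of CPU and memory

total = nvidia_gpus * physx_gpus * platforms * operating_systems * cpu_memory_combos
print(total)  # 24192 -- the "over 24000 possible configurations" they mention
```

So the number itself isn't made up; whether it justifies the lockout is the actual argument.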
And then like you peoples were all like, "YOU IS TROLLIN!" and I was like "I AM NOT TROLLING!! I AM guess_who_kthxbai YOU SEE! Mm!"
Didn't know about Nvidia's official statement, and now that I've read it I see I was wrong.
So I guess the only thing the hardocp article shows is that, in this game, the ATI card performed pretty well when using a dedicated PhysX NV GPU, even better (to a certain point) than the officially supported solution. Of course Nvidia will "fix" this in upcoming driver releases, and it's not known whether the PhysX hack works as well with other games as it does with B:AA, so as of now (and for this game alone) the hack seems to be working pretty well.
They could have tested it with a few more ATI cards, though.
i had 6 riddler items left, and i was about to go to the party for the finale.
there wasn't even a saving icon on the screen when i quit ffs!
this is why i hate autosave-only games... such a pile of shit, no control over jack shit, can't even make your OWN backups, ur just reliant on one shittily programmed function to keep ur progress safe... nah fuck that... looked great, played great, but some fucking RETARDED design decisions in this thing... oh and physx bugs up the fucking ass. yeah im pissed, this is horse shit... i was right at the fucking end...
exactly, i mean common sense would state that the game would AT LEAST keep a previous save... i mean keeping 2 saves so you can at least 'reset to previous save' in case something happens... shitty design. shitty shitty shitty design.
but the joke's on them, because i didnt pay for the fucking thing.
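The "keep a previous save" scheme the posts above are asking for is genuinely cheap to implement. A minimal sketch (file names and the `save_game` helper are made up for illustration, not from any actual game):

```python
import os
import shutil

def save_game(state: bytes, path: str = "save.dat") -> None:
    """Write a new autosave while keeping the previous one as a .bak fallback."""
    if os.path.exists(path):
        shutil.copy2(path, path + ".bak")  # preserve the last good save
    tmp = path + ".tmp"
    with open(tmp, "wb") as f:
        f.write(state)                     # write to a temp file first...
    os.replace(tmp, path)                  # ...then swap atomically, so a crash
                                           # mid-write never corrupts the save
```

If the autosave ever goes bad, the game (or the player) can fall back to `save.dat.bak` instead of losing all progress.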
So Nvidia did everything they could, but AMD/ATI remained stubborn, as usual; quite frankly they deserve their fate and, for all i care, can go bankrupt while they're at it.
Uh no. Nvidia didn't want ATi to help them; NGOHQ, a third-party driver-modding site, did.
"There's no doubt that Nvidia is more than delighted to see its API working on cards made by its strongest competitor, not to mention the threat it may represent to Havok."
Yes, so delighted that Nvidia DISABLES cards when you mix them, at the driver level. Whose driver? Nvidia's!
Finished the main story line last night. I'm only at about 75% since I didn't do all that many challenges and such on the way. Overall, it's probably GOTY at this point. I hope they sold well and made good money, because this is an amazing game. I got mine free with the purchase of an EVGA GeForce GTX 275 FTW edition. I look forward to the sequel.
Sure, they didn't offer to help ATI themselves, but they publicly agreed to help NGOHQ do that.
Official quotes by nvidia:
"Eran and I have been talking via email and we have invited him to join NVIDIA’s registered developer program. We are delighted at his interests in CUDA and in GPU accelerated physics using PhysX. Eran joins a long line of developers who are now working on using the GPU to run physics and who are doing so with the world’s leading physics software — PhysX. "
"We’ll help any and all developers that are using CUDA. That includes tools... documentation... and hands on help. We’re delighted with the interest in CUDA and PhysX; and that includes the news on www.ngohq.com."
The guy from NGOHQ only needed newer HD 4xx0 cards for further testing and asked ATI for one, but ATI never responded. In interviews a while later, ATI said they're not interested in supporting PhysX (note: the question wasn't about the NGOHQ porting effort), that developers aren't interested in PhysX, and that they're pretty sure it will die pretty soon.
I just finished the game yesterday with a 50% do-over; I didn't have the proper version (got stuck at a wall where it wouldn't recognize the batclaw).
Played it @ 2560x1600 on a 30" monitor, 3.8 GHz CPU and an overclocked 295.
This game looked fantastic and played well too.
I'm glad I stuck it out.
Fired up Flashpoint 2 and it looks like shit in comparison.
Just loaded the new Need for Speed and it looks OK, but I was expecting more.
Just my 2 cents.
Almost finished it, I'm on the final fight, but yeah it's been a fantastic game.
The boss fights are a bit naff.
"Hey, this boss is too easy, what can we do to make it a little harder?'
"We could...erm...erm.."
"Fuck it, we'll spam 'em with countless goons at the same time, that'll do it".
And yet most of them remain easyish anyways. Last one's pissing me off at the mo but.
However that's highlighting the only real negative. The game looks, plays and sounds fantastic. You really are Batman!