A technical read, but very interesting indeed. Quite pathetic what nVidia has done, but then we already knew that, didn't we?
Summed up well by a user from another forum:
Quote:
The point of the article is:
1) When the PhysX thread runs on the CPU it relies on "legacy" x87 instructions that Intel has "asked" developers not to use since the introduction of the Pentium 4.
2) It could use SSE2 instructions with ease. That's not up to developers but rather a limitation imposed by Nvidia. SSE2 is orders of magnitude faster than x87.
3) The Nvidia argument that PhysX can only run properly on their GPUs b/c it's a billion times faster than "software" (CPU) is basically a lie.
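To make points 1) and 2) concrete, here's a minimal sketch (not PhysX source, just an assumed example of the per-body math a physics engine runs). The same scalar C compiles to either x87 or SSE2 floating-point code purely via compiler flags (GCC's -mfpmath=387 versus -msse2 -mfpmath=sse, or MSVC's /arch:SSE2), which is the whole point: no developer-side rewrite is needed to get off x87.
Code:
/* toy_integrate.c : illustrative only, not PhysX code.
 * Build with x87 FP:   gcc -O2 -mfpmath=387 toy_integrate.c -o toy87
 * Build with SSE2 FP:  gcc -O2 -msse2 -mfpmath=sse toy_integrate.c -o toysse
 * The C source is identical; only the generated FP instructions differ.
 */
#include <stdio.h>

/* One explicit Euler step for a single falling body. */
static void step(float *pos, float *vel, float accel, float dt)
{
    *vel += accel * dt;   /* v = v + a*dt */
    *pos += *vel * dt;    /* x = x + v*dt */
}

int main(void)
{
    float pos = 100.0f, vel = 0.0f;            /* drop from 100 m   */
    const float g = -9.81f, dt = 1.0f / 60.0f; /* 60 Hz timestep    */

    for (int i = 0; i < 120; i++)              /* simulate 2 seconds */
        step(&pos, &vel, g, dt);

    printf("position after 2 s: %.2f m\n", pos);
    return 0;
}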
Gotta lol at that dude that bought an nVidia card just for the PhysX.
I just wish a powerful physics engine using OpenCL would arrive and become widely used, pushing PhysX off the stage. As soon as a platform-independent solution has taken some of that space, PhysX will slowly (or quickly) die away, since proprietary technologies are doomed and work to the detriment of both developers and consumers.
The whole problem is that NVIDIA have built this massive cockblock around it. They've even had offers to have it converted to work with OpenCL, but they turned them down.
This just shows once again how much marketing, and how little engineering, is done at NVIDIA these days :E
Did you even bother reading the article, Slizza, or does it go beyond your comprehension?
No, I read the first page and decided I didn't particularly care.
Already knew PhysX was far from perfect.
There is nothing replacing PhysX right now, so I'll continue to use it and enjoy it.
It's a lot better than nothing.
I won't try to kid myself that I don't want these extras.
no "PhysX Deficiency" here
Corsair 750D :: 750W DPS-G:: Asus x370 PRO :: R7 1800X ::16gb DDR4 :: GTX 1070::525gb SSD::Coolermaster 240MM AIO::
I'm pissed that such a thing as PhysX isn't universal; hardware developers shouldn't be using PhysX as an extra. I hope this shit doesn't happen when AI routines get too complex.
Kids cry about everything.
Nvidia try to give their customers these little extras like PhysX and 3D Surround etc.
Neither of these things is perfect by any means, but they don't need to include them at all really.
Their competition does nothing for its customers, so in comparison it's 100% better.
Well, Eyefinity is there, but that's FAR from perfect and of no use to your average gamer.
Certainly of less use than PhysX.
Instead of whinging about nvidia not sharing it with ATI, you should be whinging about ATI not bothering to offer you anything, while the competition tries to give more to their customers.
Slizza you troll so hard, it's not even funny anymore. This has nothing to do with ATI or sharing with ATI. This has nothing to do with moving paper extras in Batman. This is about PhysX on the CPU, and this is the same on nVidia GPU as it is on ATI GPU.
I get it; what I'm saying is it's understandable why they are doing things the way they are.
If there was some effort from their competition then you would think things would play out a lot differently.
Until that happens I can't see it changing.
Slizza wrote:
Nvidia try to give their customers these little extras like PhysX and 3D Surround etc.
Neither of these things is perfect by any means, but they don't need to include them at all really.
Their competition does nothing for its customers, so in comparison it's 100% better.
Then you go:
Slizza wrote:
Well, Eyefinity is there, but that's FAR from perfect and of no use to your average gamer.
Certainly of less use than PhysX.
So 3D Surround is much more useful than Eyefinity, or did you only mention that because it took them almost a year after ATI did it to get the technology they claim they had been sitting on for years to work?
Or how the 3D bit requires 120Hz monitors, of which there are like 4 or 5 models from different brands altogether?
Not to mention the fact that Eyefinity doesn't require 2 cards to drive 3 monitors like Surround, let alone the 6 monitors a 5870 2GB can handle.
And PhysX has seen about as much use as DX11 so far, so what exactly is so great about it? PhysX has not been used as a physics engine yet; it's only been used as a framerate-hitting particle system. Batman AA only uses PhysX to show some bricks and paper flying around, Mirror's Edge uses it to make glass break into more pieces - in neither case does it have any effect on gameplay.
Slizza wrote:
Instead of whinging about nvidia not sharing it with ATI, you should be whinging about ATI not bothering to offer you anything, while the competition tries to give more to their customers.
Except ATI even offered to fucking license PhysX from NVIDIA if they were allowed to port it to OpenCL.
Instead, there's a coalition of companies including ATI developing an open physics platform, and unlike NVIDIA, they're not going to stop NVIDIA from using it (which they will), because they know that an open standard encourages developers to use it, because people will actually give a shit. Most gamers wouldn't see what PhysX does in a game if it slapped them in the face, and that's because most developers don't use it, and those that do don't have a clue what physics are, so they use it to create pretty effects.
This is what I meant with NVIDIA turning into Apple: they come out with a product that's big, noisy and hotter than hell itself, price it above performance, and market it with bullshit. The iPhone 4 is no different: it's not faster than a Nexus One, has a major engineering flaw, but costs twice as much because of the pretty Apple packaging. Combine that with blindly following people like you, and you've got very similar situations. On every forum, NVIDIA fanboys are becoming just as bad as Apple fanboys, only seeing the good parts and ignoring the bad parts.
NVIDIA fucked up their last videocard, and this article shows how much they care about their products. You do realise that this hurts developers, right? Because it's this bullshit that'll make developers look bad when their game eats up a CPU while there's not much going on in the game. I wouldn't want to pay for a license that undoubtedly is pretty damn expensive, only to discover that out of 3 possible platforms (NV, ATI and the CPU) there's only 1 that works, even though it's advertised as 2 and would've been 3 if they had been smart. If they had allowed someone like ATI to port this thing to OpenCL, which would have cost them nothing, they would've sold far more licenses for the damn thing, and the PhysX acquisition would've turned profitable by now.
ATI is far from perfect, but they've been working much harder and supporting their customers over the last 2 years, as any ATI user can tell you. The progress in their drivers is immense, and like I said, they're part of several open standard initiatives. Let alone them being the better engineers right now.
Edit: you say NVIDIA are doing more for their customers btw, then why the hell can't I use my 9800GTX in conjunction with my 5870 without having to hack the driver?
The bottom line here is that nvidia are under no obligation to optimise PhysX, which they want to keep running exclusively on the GPU (their GPU), for competitors' products.
Is it a dick move by nvidia? well, yes, yes it is.
As for your Eyefinity argument... not many people use multi-monitor, and from what I read Eyefinity is problematic in games anyway, and many people with multiple screens just aren't using it.
The 3D part of 3D Surround can be used on a single screen.
The reason for disabling your setup for PhysX without the hack is that nvidia don't want to be responsible for coding their drivers to work with ATI drivers too (fair enough, right?).
Nvidia already do a better job on the driver front.
Try using your card in anything but games in comparison... 2D performance is still poop due to no interest in developing the drivers, and Linux users... well.
In short, nvidia are being dicks by gimping their PhysX on the CPU, but thanks to the state of the competition they're in a position where they can do this freely.
Slizza wrote:
As for your Eyefinity argument... not many people use multi-monitor, and from what I read Eyefinity is problematic in games anyway, and many people with multiple screens just aren't using it.
That's typically because games are unable to deal with such massive resolutions or aspect ratios; to be quite honest, those are generally the games that aren't interesting for this anyway. A game like GRID for example, which was 15 months old when Cypress came out, supports it out of the box. It automatically adjusts FOV based on aspect ratio; for shooters you can usually adjust the FOV in the settings or config files.
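For the curious, that FOV adjustment is just trigonometry. A rough sketch (my own example, not GRID's actual code) of deriving the horizontal FOV from a fixed vertical FOV and the current aspect ratio, so a 3x1 Eyefinity resolution simply shows more of the world instead of stretching it:
Code:
/* fov.c : illustrative only. "Hor+" scaling keeps the vertical FOV fixed
 * and widens the horizontal FOV with the aspect ratio.
 */
#include <math.h>
#include <stdio.h>

static double horizontal_fov_deg(double vertical_fov_deg, double aspect)
{
    const double pi = 3.14159265358979323846;
    double v = vertical_fov_deg * pi / 180.0;              /* to radians */
    return 2.0 * atan(tan(v / 2.0) * aspect) * 180.0 / pi; /* back to degrees */
}

int main(void)
{
    printf("16:9 single screen: %.1f deg\n", horizontal_fov_deg(60.0, 16.0 / 9.0));
    printf("48:9 triple-wide:   %.1f deg\n", horizontal_fov_deg(60.0, 48.0 / 9.0));
    return 0;
}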
Slizza wrote:
The 3D part of 3D Surround can be used on a single screen.
It's called 3D Vision in that case, not 3D Surround, and there are only like 5 games that properly support 3D.
Slizza wrote:
The reason for disabling your setup for PhysX without the hack is that nvidia don't want to be responsible for coding their drivers to work with ATI drivers too (fair enough, right?).
See, this is the argument everyone always comes up with, including NVIDIA, and it's complete utter bullshit. It annoys the shit out of me.
If something related to the PhysX processing breaks, it's either due to the PhysX driver itself or due to the way the game uses that driver - the ATI driver has absolutely no effect, just like NVIDIA's display driver doesn't have any effect. It's a lame, weak excuse; if this really were the case, someone would've run into issues a long time ago, but people are still running an ATI card with a dedicated PhysX card just fine; it's just annoying that it takes a driver hack to work. There is just no way ATI's driver could break it, unless ATI were to make calls to the PhysX driver, which they won't do anyway.
Slizza wrote:
Nvidia already do a better job on the driver front.
Try using your card in anything but games in comparison... 2D performance is still poop due to no interest in developing the drivers, and Linux users... well.
Debatable; NVIDIA's latest drivers have been breaking stuff for older cards (8600GTs don't get PhysX anymore, for example). The NVCP is still tons better than CCC, yes, but on a performance level ATI has had the bigger jumps of the two.
Under Linux both are fucking terrible, with the weirdest issues everywhere.
Of all the answers we got for why PhysX still uses x87, the most convincing ones were the ones rooted in game developer apathy towards the PC as a platform. Rege ultimately summed it up by arguing that if they weren't giving developers what they wanted, then devs would quit using PhysX; so they do give them what they want, and what they want are console optimizations. What nobody seems to care about are PC optimizations like non-crufty floating-point and (even better) vectorization.
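For what it's worth, here's a rough sketch of what that last sentence means in practice; again an assumed example, not anything from the PhysX SDK. Scalar SSE already avoids the x87 cruft, and actual vectorization goes further by advancing four bodies per instruction, provided the data is laid out as structure-of-arrays:
Code:
/* simd_step.c : illustrative SSE2 sketch, not PhysX code.
 * Advances n particles (n a multiple of 4 here for brevity) by one Euler
 * step, four at a time. pos/vel are plain float arrays (SoA layout).
 * Build: gcc -O2 -msse2 simd_step.c -o simd_step
 */
#include <emmintrin.h>  /* SSE2 intrinsics */
#include <stdio.h>

static void step4(float *pos, float *vel, float accel, float dt, int n)
{
    const __m128 a_dt = _mm_set1_ps(accel * dt);  /* broadcast a*dt */
    const __m128 dt4  = _mm_set1_ps(dt);          /* broadcast dt   */

    for (int i = 0; i < n; i += 4) {
        __m128 v = _mm_loadu_ps(&vel[i]);
        __m128 p = _mm_loadu_ps(&pos[i]);
        v = _mm_add_ps(v, a_dt);                /* v += a*dt, 4 lanes at once */
        p = _mm_add_ps(p, _mm_mul_ps(v, dt4));  /* x += v*dt, 4 lanes at once */
        _mm_storeu_ps(&vel[i], v);
        _mm_storeu_ps(&pos[i], p);
    }
}

int main(void)
{
    float pos[8] = { 100, 100, 100, 100, 50, 50, 50, 50 };
    float vel[8] = { 0 };

    for (int t = 0; t < 120; t++)   /* 2 seconds at 60 Hz */
        step4(pos, vel, -9.81f, 1.0f / 60.0f, 8);

    printf("particle 0 after 2 s: %.2f\n", pos[0]);
    return 0;
}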