Let’s take a look at some of the key features and capabilities of the PhysX 3.0 SDK:
Larger Levels: Game levels are getting larger these days, which means they require more actors. In PhysX 3.0, developers can combine multiple actors into a single “aggregate”, which is managed as a single bounding-box entity in the broad-phase stage of the collision pipeline. This reduces the computing load required to predict collisions between actors, and helps improve the overall performance and memory efficiency of PhysX 3.0 relative to earlier versions (there’s a rough code sketch of this after the list).
Streaming: PhysX 3.0 enables efficient streaming of asset data into a simulation through a new feature, binary in-place serialization, which allows quick and memory-efficient insertion of actors into a scene. In addition, out-of-scene actor creation, which allows actors to be created outside the scene and stored rather than being created and destroyed on demand, provides developers with better asset management while minimizing troublesome compute load spikes.
More Effective Multithreading: The new Task Manager with a managed thread pool allows games to take advantage of multi-core processors on all platforms, resulting in greatly increased performance and a much improved gaming experience (the sketch after the list also shows hooking up a CPU dispatcher).
Flexible and Powerful Tools: In addition to a highly optimized physics runtime, NVIDIA is releasing improved tools for artists that have been tailored to work within the developer’s asset production pipeline. A new release of PhysX Visual Debugger allows superior performance profiling, detailed memory analysis and improved visualization of all PhysX content across all major platforms.
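For the curious, here's a rough C++ sketch of what the aggregate and task-manager items above look like against the public PhysX 3.x API. The worker-thread count, actor count, and variable names are my own choices, and exact signatures may differ slightly between the initial 3.0 release and later 3.x SDKs - treat it as a sketch, not gospel: a default CPU dispatcher feeds the scene's task manager, and a stack of boxes is packed into one aggregate so the broad phase tracks a single bounds entry for all of them.

```cpp
#include <PxPhysicsAPI.h>   // PhysX 3.x umbrella header

using namespace physx;

static PxDefaultAllocator     gAllocator;
static PxDefaultErrorCallback gErrorCallback;

int main()
{
    // Core SDK objects (variable names are just local choices).
    PxFoundation* foundation = PxCreateFoundation(PX_PHYSICS_VERSION, gAllocator, gErrorCallback);
    PxPhysics*    physics    = PxCreatePhysics(PX_PHYSICS_VERSION, *foundation, PxTolerancesScale());
    PxInitExtensions(*physics);  // extensions library (default dispatcher, PxCreateDynamic, etc.)

    // "More Effective Multithreading": give the scene a CPU dispatcher so the
    // task manager can spread simulation work across a small worker-thread pool.
    PxSceneDesc sceneDesc(physics->getTolerancesScale());
    sceneDesc.gravity       = PxVec3(0.0f, -9.81f, 0.0f);
    sceneDesc.cpuDispatcher = PxDefaultCpuDispatcherCreate(4);   // 4 worker threads (arbitrary)
    sceneDesc.filterShader  = PxDefaultSimulationFilterShader;
    PxScene* scene = physics->createScene(sceneDesc);

    // "Larger Levels": group related actors into one aggregate so the broad
    // phase tracks a single bounds entry instead of one per actor.
    const PxU32 maxActorsInAggregate = 64;                        // arbitrary example size
    PxAggregate* aggregate = physics->createAggregate(maxActorsInAggregate,
                                                      false /* no self-collision */);

    PxMaterial* material = physics->createMaterial(0.5f, 0.5f, 0.1f);
    for (PxU32 i = 0; i < maxActorsInAggregate; ++i)
    {
        PxRigidDynamic* box = PxCreateDynamic(*physics,
                                              PxTransform(PxVec3(0.0f, 1.0f + 2.0f * i, 0.0f)),
                                              PxBoxGeometry(0.5f, 0.5f, 0.5f),
                                              *material, 1.0f);
        aggregate->addActor(*box);
    }
    scene->addAggregate(*aggregate);   // one broad-phase entry for all 64 boxes

    // Step the simulation for a second's worth of frames.
    for (int frame = 0; frame < 60; ++frame)
    {
        scene->simulate(1.0f / 60.0f);
        scene->fetchResults(true);
    }

    physics->release();
    foundation->release();
    return 0;
}
```

The point of the aggregate is only the broad phase: the actors inside it still collide individually in the narrow phase, the SDK just stops testing each of their bounds against the rest of the world separately.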
Or even just make PhysX useful on anything but their high-end GPUs. A lot of the shit that's done with it isn't actually the kind of task that requires a massive amount of parallelism, so if they're not bullshitting, there are a lot of situations where a CPU will get very close - that is, assuming they actually do use all available cores and instruction sets.
PhysX hasn't done anything useful in gameplay since it came out. It's always eye candy... nothing else. Geo Mod, DICE's Frostbite engine, Havok, whatever physics system Crysis developed... every one of them did everything PhysX won't do or hasn't done yet.
Though none of those physics engines can do particle/water/smoke etc. effects that look like the PhysX counterparts. Personally I don't have anything against PhysX; some of the effects would be fucking badass in FPS games if they would just use them.
What, particles like in fucking Mafia 2? Come on Breezer, it's a fucking joke. That's not physics, that's making a game look ridiculous. Hydrophobia does water just as well as (if not better than) Cryostasis, which is the ONLY game where PhysX did add something to the game. Or do you mean something like in Batman: AA, where your little toe grazes a pile of paper and it magically shoots off in 1700 different directions - in a completely predefined pattern, I might add (I was not impressed by the "fantastic PhysX" in that game when I had my 460 - fantastic game, still shitty PhysX).
The problem is not that PhysX couldn't do something useful. The problem is that there is not a single developer who will invest time in applying it realistically, because they'd be shunning half their potential playerbase. Simple as that. Result: it's only used for gimmicky effects that look like shit.
If only NVIDIA stopped being such a bunch of fucking cockblockers
Now we just need some games to adopt it.
The use of PhysX so far hasn't been as good as it should have been, and the performance hit has been unreasonable.
PhysX 3.0 should fix the performance issue, if earlier articles about why PhysX is such a hog are to be believed.
Great news
Hardly useless. Just not achieving its full potential due to the PC being the least favored platform for releases and devs seeing no money in making games with the latest and greatest tech like PhysX/DX11.
Hence: useless. By definition lack of use. Developers find no use for it if they have to develop multiplatform titles.
But they clearly do find use for it. What I'm saying is it's not used to its full potential. Hence not useless. Underused. Like a whole host of potentially awesome PC tech.
I would give my left testicle for full dynamic PhysX/Havok/whatever destruction in games, not some static shit.
I hope things change with PhysX 3.0, so we would get this kind of "useless gimmick" in games.
As long as PhysX is proprietary and used as a marketing tool to sell more GeForce cards, PhysX won't be a viable option. Unless developers are treated like royalty through the TWIMTBP program, most will skip it until a standard based on OpenCL or DirectCompute has matured enough (Bullet physics, for instance).
That's the version we're using now, or rather 2.8.3 to be specific, with 2.8.4 in beta.
(Multithreading improvements and other stuff mainly in that build but of course you'd need PhysX titles using the 2.8.4 SDK to take advantage of it.)
3.0 should have a nice performance improvement, but that'll likely mean GT600 using even more particle effects (which is most of the PhysX effects, from smoke to debris to water sprays, but it's not all it does), so overall it likely has a minimal impact on current CPU-assisted performance.
(Not that PhysX extras can be used without a proper CUDA device most of the time anyway, but still, the common systems like ragdolls and other simpler calculations might benefit.)
@ Breezer: not saying I wouldn't, but it shouldn't be with technology that requires such specific hardware. RF:G went a long way and that's plain old Havok running on the CPU. PhysX' particle effects are still shit though and a lot of shit that's done with it has been done without it just as effectively. A lot of shit that's been displayed with PhysX is hardly noticeable, only in slowmotion and could just as easily be done by using all the CPU power we have at our disposal.
Batman didn't have anything that couldn't be done without it, Mafia 2 didn't, Mirror's Edge didn't, Sacred 2 didn't. Metro I won't comment on, as I never bothered to play that again when I was back on NVIDIA. Think I've covered the biggest titles in the list of games that use GPU and/or PPU accelerated PhysX - please also note that if a game uses PhysX, it doesn't necessarily do so on a GPU or PPU!
I can point out *exactly* why PhysX is nowhere near as good as you think it is.
Arguably the most important part of physics in general is not GPU accelerated. Ageia used to accelerate it on their PPU, but for unknown reasons NVIDIA doesn't. I'm talking about rigid body collision - that runs off the CPU. Now take a look at that name, what the fuck do you suppose that term applies to, huh? It's not even as if it's a hard thing to solve math-wise, as I'm sure any of the science boys on the hump can confirm - it is pure laziness on NVIDIA's part. Everything Havok does on the CPU, PhysX does on the CPU as well. There is not a single fucking thing which is GPU accelerated. To make that simpler: all the physics you saw in HL2 back in 2004 and in RF:G back in 2009? PhysX can accelerate absolutely jackshit of that.
PhysX isn't bad, it's just dead, and has been ever since NVIDIA took over in 2008. It has made absolutely no progress in terms of efficiency or features, and to make things worse, they completely cockblocked any attempt to have it run on anything else but their shit. It's also not as if AMD wouldn't be able to handle it, because their theoretical computational power is typically quite a lot higher for same-class GPUs. I won't even comment on the CPU side of PhysX, we know the story there.
So that destruction you linked to there Breezer? All CPU powered, until it gets to the minute little particles. Wanna know why the small particles are GPU accelerated? Because they don't have to (more accurately in this case: can't) interact with anything else, so they can do those calculations without giving two shits about having to apply any logic. And it's not as if we've never had particles like this before, nor does it add any realism whatsoever - it's the rigid body physics that we want to see, and that part is CPU powered, not fucking GPU powered.