16GB of VRAM in one of the top cards? Uh oh, this doesn't sound good for midrange, which is probably still stuck at 8GB. Another skippable generation; not like there are many good titles requiring it.
I don't get it: if we aren't given the hardware to run all this bloated VRAM hogware, what's the purpose here? Throwing money in a pit?
Let me guess: problematic technology that will suffer birth defects and be hard to implement. Also, AMD will make a ghetto version that gets implemented in games instead.
Might as well call GPUs AI accelerator cards soon.
More GPU power and ram for AI local installs? And better FPS in the games I play.
I don't play AAA games that do ray tracing, and I can't afford the 'AI frame gen' stuff, but some games I play are demanding... I guess Satisfactory and MSFS 2024 are the only ones I play that have that AI-optimized frame-gen crap, and I turn it off in both.
I think Nvidia is gonna make the biggest blunder ever. Launching the 5080 and such before the 5090 will have a ton of noobs holding out on buying, waiting for the 5090, causing their own low demand for the 5080, and also giving Intel and AMD room to undercut.
I hope Intel gets desperate for any wins after the disaster that 2024 was for them and goes on an undercutting spree, selling at a loss for market share.
The 5090 will have as much VRAM as I have RAM in my PC..
It is for sure not a gaming card. You can obviously use it for superior performance and quality, but the power and VRAM are such overkill for gaming. It's probably aimed at premium/pro customers who actually use it for AI, 3D design, etc.
...and people with more money than they know what to do with.
Meanwhile I'll look to get a second hand 3070 Ti+ or 4070 to replace my 2060.
The RTX 5070 Ti (16 GB) should be great for 1080p gaming.
Ultra settings / 60 fps / Ray Tracing MAX / Path Tracing MAX
Future proof and quality 1080p gaming.
1080p? More like 1440p at ultra settings, as long as you're okay with 60 fps. I doubt 1080p gaming requires any upgrade to the 5-series whatsoever, unless the RT is a lot faster than previous gens and can actually be used without dragging performance down.
But... if you're stuck on an older generation like 10x0/20x0/30x0, I guess it's very much worth upgrading to, depending on the pricing, which will be just as ridiculously expensive as the last decade's worth of NV GPUs.
I think Nvidia is gonna make the biggest blunder ever. Launching the 5080 and such before the 5090 will have a ton of noobs holding out on buying, waiting for the 5090, causing their own low demand for the 5080, and also giving Intel and AMD room to undercut.
That's usually how it works every gen. In what gen did the *90 come out first for them?
The *090 is made of binned *080s that exceed the *080 spec. They don't make separate chips for the *060, *070, *080, and *090.
They make one chip. Those that fail to pass get parts disabled down to *070 specs. Still fails? Down to *060 (and so on; it's more complicated than that, but that's the general idea).
Even a *080 isn't a flawless chip: it has its margin of failed transistors and such, and it's 'overbuilt' for those margins so enough pass even with flaws. The *090 is the one where, as the process is refined (fewer errors/defects), more chips come out above the minimum stable specs needed for the 5080. Once enough have been side-binned, and enough roll off the line regularly to fill *090 production, the *090 comes out.
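To make the binning idea above concrete, here's a toy simulation (not Nvidia's actual process; all unit counts and thresholds are made up for illustration): one die design, random defects knock out compute units, and the surviving-unit count decides which SKU the die becomes. As the defect rate drops with a maturing process, more dies land in the top bin.

```python
import random

FULL_UNITS = 170  # hypothetical unit count on a flawless die
# (SKU name, minimum working units needed) -- invented thresholds
BINS = [("*090", 170), ("*080", 150), ("*070", 120), ("*060", 90)]

def bin_die(defect_rate: float) -> str:
    """Bin one die: each unit independently survives with probability
    (1 - defect_rate); the die gets the best SKU it still qualifies for."""
    working = sum(random.random() > defect_rate for _ in range(FULL_UNITS))
    for name, needed in BINS:
        if working >= needed:
            return name
    return "scrap"

# Refining the process = lower defect rate = more top-bin dies.
for rate in (0.10, 0.03, 0.005):
    counts: dict[str, int] = {}
    for _ in range(10_000):
        sku = bin_die(rate)
        counts[sku] = counts.get(sku, 0) + 1
    print(f"defect rate {rate}: {counts}")
```

Run it and you'll see almost nothing bins as *090 at a 10% defect rate, while at 0.5% most dies do, which is roughly why the top part tends to ship once yields improve.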
Pricing themselves out of the market. A common practice nowadays. Blame this or that, when all parties just want to make a bigger buck and blame whatever the fuck as the reason for the increase.
Prices are not as bad as I thought they'd be. I'm eyeing the 5090. I'm just curious whether 2x 360mm rads will be enough to cool that and the CPU, or if I have to slap another rad somewhere.