Why are people blaming the GPU makers for trying to compensate for shit-running games, and not blaming the game makers for creating such a heavy need for framerate compensation on the cards in the first place?
It's not Nvidia's, AMD's, or anyone else's fault that even on the best hardware they can build, new games can't manage good framerates. They don't write the game code.
I wonder how good 5080 and 5090 are compared to my 4080. I might buy one of them and give my old card to my gf.
Rasterization seems to be like 20-30% faster (which mostly comes from increased power consumption), so pretty much nothing, but the new multi frame generation makes them look shitloads "faster" if you want fake frames.
Closer to the truth than their bullshit claims, assuming the benchmark is the same and running at the same settings on the left.
Not to say that the new tech is/will be bad, just that kewl leather coat Jensen is full of it as usual.
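To put rough numbers on the "fake frames" point: a minimal back-of-the-envelope sketch (the 60 FPS baseline and the 25% raster uplift are illustrative assumptions, and the generation overhead is ignored) of how multi frame generation inflates the FPS counter while the rendered frame rate, which is what input latency follows, barely moves.

# Illustrative only: displayed FPS vs. actually rendered FPS with frame generation.
def displayed_fps(rendered_fps: float, generated_per_rendered: int) -> float:
    # Every rendered frame is followed by N generated ones on screen.
    return rendered_fps * (1 + generated_per_rendered)

rendered_4080 = 60.0                  # assumed rendered FPS on the old card
rendered_5080 = rendered_4080 * 1.25  # assumed ~25% raster uplift on the new one

print(displayed_fps(rendered_4080, 1))  # 120 "FPS" with 2x frame gen
print(displayed_fps(rendered_5080, 3))  # 300 "FPS" with 4x frame gen
# The counter says 2.5x faster; the frames you actually interact with went up 25%.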
All I care about is whether it runs VR better, runs local AI models faster, and chokes less on complicated 3D CAD models. AutoCAD, and SolidWorks especially, make my GPU fans scream more than any game I have.
My 4080 is more than enough for gaming on a regular screen. It's those three where I need 'moar power!'
Which makes the keynotes and press releases useless to me. They talk about everything BUT the things I actually care about GPU horsepower for... lol
RTX 3xxx -> 4xxx was a huge leap because of two generational jumps in manufacturing process. Blackwell is made on the same node as the 4xxx series, so you can't expect much here other than the more advanced AI stuff.
The improved DLSS 4 model is coming to older GPUs as well. Impressive move. If the overall quality improvement is really as substantial as the preview by Digital Foundry suggests, this could breathe new life into aging Nvidia GPUs.
I'm thinking that too. Seeing reports of folks running a 7950X and 4090 with just two rads. The GPU generally doesn't hit max TDP in gaming.
friketje wrote:
Fans are not the problem indeed. But a 650 watt card is a risk in itself. You'll probably have to replace the GPU paste in 2 years: a shitty job, breaking the warranty seal, and lots of stuff that can break while doing it.
On the plus side the GPU is overkill and VRAM bottlenecked, so it's not really needed to push it to 100%; I doubt it is even designed to do so on a regular basis.
Worst case scenario, I can always set limits on the card or just throw in a 3rd 360 rad, would require reconfiguring the setup a bit.
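If it comes to limiting the card, for reference this is roughly what that looks like with Nvidia's own CLI; a minimal sketch, assuming an Nvidia card and admin rights, with 450 W as a purely made-up example value (check the supported range first):

nvidia-smi -q -d POWER     # shows current, default, min and max power limits
nvidia-smi -pl 450         # cap board power at 450 W (needs admin/root, must be within that range)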
In games as in hardware, patient bear always wins. Never be an early adopter, never prehurrdurr.
Throwing a ridiculous amount of watts at the problem of bad development practices is a solution for those who fart money.
fable2 wrote:
At that point another carrot on a stick will be invented for consumers to grasp at, and we will probably repeat this all over again.
There's no end game to this racket.
Indeed to both.
Not that it concerns me at the moment; I already gave them plenty of euros last year and plan to keep the 4070 until it begs to be put out of its misery (or commits seppuku like the 970 did).
dethy wrote:
Worst case scenario, I can always set limits on the card or just throw in a 3rd 360 rad, would require reconfiguring the setup a bit.
I see the future of game devs already: fuck optimizing, just turn on fake frames you whining gamers, we have no time to optimize, we have DLC to pump out and Nvidia pays us for supporting fake frames.
Reminds me of the time the CS devs added fake minus-lag to people's Counter-Strike patch and everyone was like omfg everything feels much smoother. All they did was deduct 20 ms from the lag counter, changing everyone's displayed ping to a lower number... here they just add to a number, look, 120 fps... sooo silky smooooth roflol
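If memory serves, that whole "optimization" fits in one line; a minimal sketch of the idea, with the 20 ms figure taken from the post above and every other number made up:

# Illustrative only: nothing about the connection changed, just the number on screen.
actual_ping_ms = 85
displayed_ping_ms = max(0, actual_ping_ms - 20)   # knock a flat 20 ms off before showing it
print(f"ping: {displayed_ping_ms} ms")            # prints 65 -- "feels much smoother"
# Frame generation plays the same game with the FPS counter, just in the other direction.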
The improved DLSS 4 model is coming to older GPUs as well. Impressive move. If the overall quality improvement is really as substantial as the preview by Digital Foundry suggests, this could breathe new life into aging Nvidia GPUs.
Maybe I'm stupid, but if the true improvement is just a few percent, shouldn't the difference be minimal then, given the 4090 supports DLSS 4.0 as well?
From what it looks like, the gains come from the extra RT and CUDA cores available to process the DLSS work (I just skimmed the data on the new cards).
So the 4x00 series can/will have access to it, but it can't process it as fast as the 5x00, so smaller gains. At least that's what I get out of the tech info from a Cliff's Notes skim of it.
FPS on older cards will be only marginally better than with version 3.5, as most of DLSS 4.0 on them is aimed at improvements in fidelity and less frame-gen weirdness.
RT is still mostly a gimmick in my eyes; it sometimes looks worse, sometimes better. Not something I'm willing to spend over 2000 dollars on. When games start becoming more demanding I might change my stance, but for now I find my 4090 more than sufficient.
Same. Until games are fully raytraced (the entire engine/scene) and not just water, reflections, and raytraced lighting in specific situations, all slapped over an otherwise rasterized scene, I don't have a use for it either.
It's not just that; sometimes proper lighting makes dark scenes brighter and less atmospheric in games where you appreciate the darkness. Horror and semi-horror games like Metro.
Crypto is mostly where my expendable fun money comes from.
Also, keep in mind I don't spend SHIT on anything else. I don't eat out, don't have cable TV, don't buy clothes or shoes over $20, and don't go out to paid events or sports games. I don't go out to movies. I don't have a car payment, and I work from home so there's not much petrol to buy for the car.
I'd say in any month I might spend $50 max on stuff that isn't pure necessity (lights, mortgage, phone, etc.), and that's usually one or two nights out at a cheap pub.
All my money that isn't necessity bills (and maybe one DoorDash a month) gets saved up each year for garage woodshop tools or computer parts.
The guy below counts the cores. Seems the 50 series has the same stagnation we saw with the AMD 9000 CPUs: just a minor increase in performance. The exception is the 5090. Moore's law is dead.
DLSS 4 could be a thing though, but it has to make better use of VRAM for it to be really worthwhile. On the 40 series, frame gen is useless when you really need it because it's bottlenecked by VRAM.