"Immersive sim" what the fuck is that supposed to mean? Can you like date and stuff? Take a stroll in the park, drive a car etc? Pisd out a window of a tall building
Yeah, it's bittersweet as well when we take into account that the beloved Arkane Austin was dismantled, System Shock 3 is essentially a chimera at this point, BioShock 4 is stuck in a vicious development hell, and all Deus Ex has spawned since 2016 is the upcoming low-budget remaster.
There are Judas and Clockwork Revolution, which could perhaps turn out decent, and at least we miraculously got a competent SS1R, but the genre (sadly) can't produce $ati$factory numbers, so we're destined to forever cherish the great titles from the past. And Prey certainly belongs to such a prestigious circle.
Cool story. If only there were proper HDR displays that are not TVs. Most "H"DR displays are "H"DR400 or "H"DR600 "certified", which is a joke. Apple's XDR displays have a sustained peak brightness of 1600 nits, but it's either their laptops (shit for gaming) or the Pro Display XDR, which costs an arm and a leg (and another leg for the display stand), and I don't even think there is a modern Windows driver for it. All the gaeming "H"DR displays are cheap shit nonsense.
Me neither, and I've been meaning to for quite some time. Unfortunately, this Remaster - Luna mod seems to be incompatible with the EGS version of the game, at least according to the mod's docs.
I've found some variant that "should" work, but doesn't for me - https://rpghq.org/forums/viewtopic.php?t=4089
I must be doing something wrong, apparently.
Has anyone managed to make it work with EGS?
OLED can never reach the brightness levels of an LCD with a backlight; it's just not technically possible without burning out the diodes. On the other hand, with LCD you are stuck with shitty local dimming solutions if you want true blacks. Both are stopgaps until microLED technology matures and gets cheap enough to put in consumer TVs/monitors.
Correction: Monitors can't, but TVs already can.
LG G5 RGB tandem OLED can do 2500 cd/m² peak and 470 cd/m² full field white.
Samsung S95F QD-OLED can do 2200 cd/m² peak and 400 cd/m² full field white.
More than enough. I don't know why OLED Monitors are still so shitty.
OLED TVs are great; it's just the monitors that are lagging four to six years behind, depending on which aspect of performance you look at.
Well, it's because gaemrzz are willing to buy piss and revel in it like it's colored rain. No demand for quality, as long as it's 377485 Hz and has RGB LEDs.
The television market is quite competitive all along the price spectrum, while PC monitors are a race to the bottom for the most part (some exceptions at the highest professional end).
The GP27Q is a 1440p monitor? Eh, sorry, that's not something that can be recommended in 2025. But I see there is a GP27U, which is 4K.
4K is a 30-60% drop in FPS, hardly worth it for a minimally better image. In fact, I think I'd sometimes rather have a 1080p screen; it still looks great IMO, plus it's a great boost to FPS. FPS is far more important than resolution for games, unless you only play strategy games.
I don't use my monitor for movies at all.
Even if you have a 4090, the loss from going all the way to 4K is something like 20-40%; if you have something less than a 3090 (it's what I have), lol, no.
As for using high resolution in Windows, 32" sucks. I had one; at 4K it's just a hassle, everything is too tiny, and scaling things up via Windows just breaks stuff because these UIs are often not built to be shown with huge fonts. You need something like a 42" for 4K. Two monitors at 1440p are way better for working in Windows.
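For anyone who wants to sanity-check the FPS-cost claim, here's a rough sketch of the raw pixel counts involved. My assumption: a GPU-bound game scales roughly with pixels rendered, so treat the ratios as a ballpark ceiling, not a benchmark.
[code]
# Rough pixel-count comparison behind the "30-60% FPS drop" claim.
# Assumption: a GPU-bound game scales roughly with pixels rendered,
# so these ratios are a ballpark ceiling, not a benchmark.

RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K": (3840, 2160),
}

base_pixels = RESOLUTIONS["1440p"][0] * RESOLUTIONS["1440p"][1]

for name, (w, h) in RESOLUTIONS.items():
    pixels = w * h
    print(f"{name}: {pixels:,} px ({pixels / base_pixels:.2f}x the pixels of 1440p)")

# 1080p: 2,073,600 px (0.56x the pixels of 1440p)
# 1440p: 3,686,400 px (1.00x the pixels of 1440p)
# 4K: 8,294,400 px (2.25x the pixels of 1440p)
[/code]
So 4K pushes 2.25x the pixels of 1440p; how much of that shows up as lost FPS depends on how GPU-bound the game is.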
Choosing a monitor is very simple nowadays. The criteria: OLED (everything else is just sad; never again am I dealing with VA ghosting, IPS glow/clouding, the shit uniformity of both, etc.), with PROPER pixel size (~96 dpi), 120+ Hz, 4K, with HDR support. And now the choice boils down to... just a SINGLE model: the LG 48" OLED TV.
It solves ALL of your complaints above:
- watching movies in HDR on your PC? No problem;
- pixel size is big enough that there's no need to scale the UI in Windows, which combined with 4K resolution gives unmatched productivity;
- your GPU can't handle 4K? No problem at all: the pixel size is big enough, so just use "No scaling" (or the analogous option) in your GPU drivers and suddenly your TV transforms into a 1080p, 1440p or any other custom-size monitor you want, including any ultrawide size (rough math in the sketch below)!
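A back-of-the-envelope sketch of the pixel-size argument; the numbers are my own, assuming a 16:9 48" panel, so the exact dpi may differ slightly from the ~96 quoted above.
[code]
import math

# Back-of-the-envelope numbers for a 48" 16:9 4K panel, and for the
# effective "monitor" you get when running 2560x1440 with the GPU driver
# set to "No scaling" (1:1 pixel mapping, black borders around the image).

def ppi(diag_inches: float, res_w: int, res_h: int) -> float:
    """Pixels per inch, from the panel diagonal and native resolution."""
    return math.hypot(res_w, res_h) / diag_inches

panel_ppi = ppi(48, 3840, 2160)
print(f'48" 4K panel: {panel_ppi:.0f} ppi')  # ~92 ppi, close to the classic 96 dpi

# Physical size of an unscaled 2560x1440 image on that panel
w_in = 2560 / panel_ppi
h_in = 1440 / panel_ppi
print(f'Unscaled 1440p image: {w_in:.1f}" x {h_in:.1f}" '
      f'(~{math.hypot(w_in, h_in):.0f}" diagonal)')
# ~27.9" x 15.7", i.e. roughly a 32" 1440p monitor centered on the TV
[/code]
In other words, the "transforms into another monitor" point is literally geometric: at ~92 ppi, a 1:1 1440p image on this TV is about the size of a 32" 1440p display.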
Unfortunately, it does have its own unfixable problems, but overall it is still the best.
VRR flicker. This is the real bummer for OLEDs, especially with shittily optimized UE games (i.e. most of them). Depending on the panel this can get really irritating (on my new QD-OLED S95F, the Silent Hill 2 remake or Chronos are flicker central, even with stable frames (frametimes/pacing/fps)).
I got lucky with the uniformity on my sample, but that is an issue with a particular sample, not the model in general (you can get an unlucky sample of absolutely any other monitor).
As for the VRR flicker: yes, this is an inherent OLED problem, one of the issues I mentioned this TV has. That flicker basically makes VRR unusable; luckily, the higher your lowest framerate, the less benefit VRR provides, and considering I always aim to play at a stable 120 fps, using V-sync is just fine.
LeoNatan wrote:
Such a big TV used as a monitor will destroy your neck. Not worth it for HDR in games.
Not sure what you mean about HDR, but personally I do not use HDR for anything other than watching HDR movies. I recall enabling it once in Assassin's Creed Odyssey, but as soon as I turned OLED light/brightness/contrast down to tolerable levels that do not burn your eyes to a crisp in 3 seconds, the game looked exactly as it did in SDR.
As for the neck, well, I guess you can pretty much destroy your neck and your spine and whatnot on any monitor, depending on your chair/table/position and the amount of time you spend with it. But yes, I'd say with a monitor of this size you would want to use a lower table than usual.
The problem is that a shitty software implementation can make you VRR-flicker even with rock-solid framerates.
Just check out some of those UE games with shitty optimization. Even with rock-solid fps and your GPU load far from max, you will still sometimes get ever-so-slight frametime dips. If that happens during a near-black scene on screen, you WILL flicker.
It depends on the display, though. My old C9 rarely had any flickering, while the new Samsung flickers considerably... kind of ironic.
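If you want to check whether a game is actually dipping even when the fps counter looks flat, one way is to log frametimes with a capture tool (PresentMon, CapFrameX, etc.) and flag the outliers. A rough sketch; the column name and the 1.5x spike threshold are my assumptions, so adjust them to whatever your tool actually writes out.
[code]
import csv

# Flag frametime spikes in a capture log (e.g. from PresentMon / CapFrameX).
# Assumptions: the CSV has a per-frame frametime column in milliseconds; the
# column name below matches older PresentMon output, so check your own log's
# header. Any frame taking more than 1.5x the median frametime is treated as
# a dip -- the kind of blip that can trigger VRR flicker in near-black scenes
# even though the average fps counter stays "rock solid".

def find_dips(path: str, column: str = "MsBetweenPresents", factor: float = 1.5):
    with open(path, newline="") as f:
        frametimes = [float(row[column]) for row in csv.DictReader(f)]
    median = sorted(frametimes)[len(frametimes) // 2]
    dips = [(i, ft) for i, ft in enumerate(frametimes) if ft > factor * median]
    avg_fps = 1000.0 / (sum(frametimes) / len(frametimes))
    print(f"avg {avg_fps:.1f} fps, median frametime {median:.2f} ms, {len(dips)} dips")
    for i, ft in dips[:10]:
        print(f"  frame {i}: {ft:.2f} ms (~{1000.0 / ft:.0f} fps momentarily)")

# find_dips("frametime_log.csv")  # hypothetical log file name
[/code]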