Asus RoG Swift PG278Q
Page 1 of 4 Goto page 1, 2, 3, 4  Next
Przepraszam
VIP Member



Posts: 14491
Location: Poland. New York.
PostPosted: Wed, 6th Aug 2014 04:24    Post subject: Asus RoG Swift PG278Q
Back to top
Hfric




Posts: 12017

PostPosted: Wed, 6th Aug 2014 05:32    Post subject:
So much BS... I couldn't hold it in.

That hotkey with the crosshair overlay...
That 144Hz test...

This ad does more harm than good


Back to top
scaramonga




Posts: 9800

PostPosted: Wed, 6th Aug 2014 06:58    Post subject:
Looks damn sweet though, hmm....I like ASUS stuff Very Happy
Back to top
Breezer_




Posts: 10799
Location: Finland
PostPosted: Wed, 6th Aug 2014 09:30    Post subject:
I would get this instantly, but the biggest problem with this screen is that it's a shitty TN panel.
Back to top
Werelds
Special Little Man



Posts: 15098
Location: 0100111001001100
PostPosted: Wed, 6th Aug 2014 09:53    Post subject:
And the utterly retarded price of $799.
Back to top
Breezer_




Posts: 10799
Location: Finland
PostPosted: Wed, 6th Aug 2014 10:41    Post subject:
Werelds wrote:
And the utterly retarded price of $799.


In Finland it's 849 euros Laughing That translates to about 1136 dollars Laughing Gotta love Finland.
Back to top
DeadByDawn




Posts: 271
Location: Australia
PostPosted: Wed, 6th Aug 2014 11:06    Post subject:
It might have a TN panel, but it's far from a 'shitty' TN panel.
I don't know much about the Korean overclocked IPS panels, but I'd take 120-144Hz + 1ms response time over an IPS panel any day.

I'm a massive FPS gamer and an avid gamer in general, though, and from what I've played on IPS panels, in particular the Mac ones, I much prefer my 120Hz BenQ panel.

I should also say, this would have been an insta-buy for me, but the cost here in Oz is $999 Sad
Back to top
Breezer_




Posts: 10799
Location: Finland
PostPosted: Wed, 6th Aug 2014 11:08    Post subject:
Do not even start IPS vs TN conversation, we all know how it ends.
Back to top
DeadByDawn




Posts: 271
Location: Australia
PostPosted: Wed, 6th Aug 2014 11:44    Post subject:
On another note, what kind of hardware do you need for 1440p vs 1080p?
I had a dual screen setup (24 inch), but after finishing study and no longer working much from home, I found the two screens became annoying when gaming. I figured I'd upgrade to a larger single screen instead.
Back to top
Werelds
Special Little Man



Posts: 15098
Location: 0100111001001100
PostPosted: Wed, 6th Aug 2014 11:49    Post subject:
It's ~78% more pixels, so do the math Smile

It's not quite as big a jump as 720p -> 1080p (125% more pixels), but it's not far off either.
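For reference, the pixel counts behind those percentages can be checked with a few lines of arithmetic (resolution names here are just the common shorthand for the modes being discussed):

```python
# Pixel counts for the resolutions discussed above.
resolutions = {"720p": (1280, 720), "1080p": (1920, 1080), "1440p": (2560, 1440)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

jump_1080_to_1440 = pixels["1440p"] / pixels["1080p"] - 1  # ~0.78 -> ~78% more
jump_720_to_1080 = pixels["1080p"] / pixels["720p"] - 1    # 1.25  -> 125% more
print(f"{jump_1080_to_1440:.0%} vs {jump_720_to_1080:.0%}")  # 78% vs 125%
```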
Back to top
matta666




Posts: 1061
Location: Manchester
PostPosted: Wed, 6th Aug 2014 12:30    Post subject:
Heh, 'the choice of champions'... spotty pasty teenage champions Laughing
Back to top
tonizito
VIP Member



Posts: 51402
Location: Portugal, the shithole of Europe.
PostPosted: Wed, 6th Aug 2014 12:35    Post subject:
Laughing Laughing Laughing


boundle (thoughts on cracking AITD) wrote:
i guess thouth if without a legit key the installation was rolling back we are all fucking then
Back to top
DeadByDawn




Posts: 271
Location: Australia
PostPosted: Wed, 6th Aug 2014 13:29    Post subject:
It might sound stupid, but is the conversion really that simple: 75% more pixels = 75% more power required?

I get that it's displaying more pixels, but the GPU is doing other calculations as well, so is it a direct correlation? Obviously it uses more power to display more, but take an explosion's physics as an example: even though I see more of the explosion, is the full explosion being calculated anyway? If so, seeing more of the image wouldn't take any more power for the physics, which would make it not a direct correlation.

I hope that makes sense; it's a bit off topic but a serious question.
Back to top
Breezer_




Posts: 10799
Location: Finland
PostPosted: Wed, 6th Aug 2014 13:37    Post subject:
On high-end GPUs it's not a flat 75% performance loss. The exact hit compared to 1080p is hard to pin down, but it's at least around 30-40%. And the higher the resolution, the more VRAM it requires.
Back to top
DeadByDawn




Posts: 271
Location: Australia
PostPosted: Wed, 6th Aug 2014 13:53    Post subject:
Thanks for the feedback. I only ask because this screen is a serious contender for my next purchase. The only problem is I'm running 2 x 670s, but they only have 2GB of VRAM each. I can't justify moving to a 700 series card, so I may have to wait for the 800 series to get more VRAM.
Back to top
Werelds
Special Little Man



Posts: 15098
Location: 0100111001001100
PostPosted: Wed, 6th Aug 2014 13:56    Post subject:
DeadByDawn wrote:
It might sound stupid, but is the conversion really that simple: 75% more pixels = 75% more power required?

I get that it's displaying more pixels, but the GPU is doing other calculations as well, so is it a direct correlation? Obviously it uses more power to display more, but take an explosion's physics as an example: even though I see more of the explosion, is the full explosion being calculated anyway? If so, seeing more of the image wouldn't take any more power for the physics, which would make it not a direct correlation.

I hope that makes sense; it's a bit off topic but a serious question.

As Breezer lays out, no, it's not a 1:1 thing.

VRAM is, though. The same frame takes up ~78% more space: a 1920x1080x32 frame takes a bit over 8 megabytes, while a 2560x1440x32 frame takes nearly 15 megabytes, and that's just the rendered frame.

As for the other stuff: not all rendering operations are directly affected by rendering resolution. There's a lot of stuff you never even realise happens. Culling, for example (not rendering things that aren't visible), won't take up any more time at a higher resolution, but rendering a shadow will. So how big the hit is depends heavily on the engine too: a pretty engine like CryEngine easily drops 50%, Frostbite and even Ubioptimised games sit around 40%, and UE3 and the like around 30%.
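The framebuffer arithmetic is easy to verify; a quick sketch of just the raw per-frame cost (real VRAM use, with render targets, textures, G-buffers and so on, is of course far higher):

```python
# Size of a single uncompressed framebuffer at 32 bits (4 bytes) per pixel.
def frame_mb(width, height, bytes_per_px=4):
    return width * height * bytes_per_px / 1e6  # megabytes (decimal)

print(f"1080p frame: {frame_mb(1920, 1080):.2f} MB")  # 8.29 MB
print(f"1440p frame: {frame_mb(2560, 1440):.2f} MB")  # 14.75 MB
```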
Back to top
Przepraszam
VIP Member



Posts: 14491
Location: Poland. New York.
PostPosted: Wed, 6th Aug 2014 14:58    Post subject:
I bet my 660ti will handle it just fine. Laughing Laughing


Back to top
Shoshomiga




Posts: 2378
Location: Bulgaria
PostPosted: Wed, 6th Aug 2014 19:23    Post subject: I have left.
I have left.
Back to top
LeoNatan
☢ NFOHump Despot ☢



Posts: 73196
Location: Ramat Gan, Israel 🇮🇱
PostPosted: Wed, 6th Aug 2014 19:32    Post subject:
Przepraszam wrote:
http://www.asus.com/Monitors/ROG_SWIFT_PG278Q/


Day 1. Awesome Awesome Awesome

TN poop at $800
Back to top
KillerCrocker




Posts: 20503

PostPosted: Wed, 6th Aug 2014 22:43    Post subject:
Breezer_ wrote:
Do not even start IPS vs TN conversation, we all know how it ends.

Sure we do. TN is shit


3080 | ps5 pro

Sin317-"im 31 years old and still surprised at how much shit comes out of my ass actually ..."
SteamDRM-"Call of Duty is the symbol of the true perfection in every aspect. Call of Duty games are like Mozart's/Beethoven's symphonies"
deadpoetic-"are you new to the cyberspace?"
Back to top
Axeleration




Posts: 814

PostPosted: Fri, 8th Aug 2014 11:57    Post subject:
I'd like to try the G-Sync feature though, and 144Hz seems glorious.
Back to top
Werelds
Special Little Man



Posts: 15098
Location: 0100111001001100
PostPosted: Fri, 8th Aug 2014 12:13    Post subject:
Better to wait until later this year or early next year, when we'll have monitors with Adaptive Sync on the market. Dell and Eizo are both working on it. For those I'll gladly pay more than normal.

But for this TN piece of shit with a wonderful vendor lock-in? Fuck off.
Back to top
Axeleration




Posts: 814

PostPosted: Fri, 8th Aug 2014 12:49    Post subject:
@Werelds
Can you comment about this.
Q: How is G-SYNC different than Adaptive V-SYNC?
A: Adaptive V-Sync is a solution to V-Sync’s issue of stuttering, but input lag issues still persist. NVIDIA G-SYNC solves both the stutter and input lag issue.
Back to top
Werelds
Special Little Man



Posts: 15098
Location: 0100111001001100
PostPosted: Fri, 8th Aug 2014 13:03    Post subject:
Axeleration wrote:
@Werelds
Can you comment about this.
Q: How is G-SYNC different than Adaptive V-SYNC?
A: Adaptive V-Sync is a solution to V-Sync’s issue of stuttering, but input lag issues still persist. NVIDIA G-SYNC solves both the stutter and input lag issue.

Adaptive Sync. Not Adaptive V-Sync.

These:
- http://www.vesa.org/news/vesa-adds-adaptive-sync-to-popular-displayport-video-standard/
- http://support.amd.com/en-us/kb-articles/Pages/freesync-faq.aspx

Not this:
- http://www.geforce.com/hardware/technology/adaptive-vsync

Adaptive Sync, and AMD's implementation of it in FreeSync, is exactly what Nvidia's G-Sync is, but without a $100 premium for a crappy display controller that locks you into a specific GPU vendor. Adaptive Sync will work on both Nvidia and AMD (assuming Nvidia doesn't cockblock it; they'll most likely just rebrand G-Sync to support Adaptive Sync). Monitor manufacturers can freely implement it in their controllers.

Adaptive V-Sync is a software thing in Nvidia's driver and something else entirely.
Back to top
Axeleration




Posts: 814

PostPosted: Fri, 8th Aug 2014 13:14    Post subject:
Oh OK, my bad, you meant FreeSync. But to be honest, a slice taken from the FAQ:
"Upon connecting a FreeSync-enabled monitor to a compatible AMD Radeon™ graphics card..."
has me a little bit concerned.

(embedded video at 1:04:51)

Nvidia will get screwed
Back to top
matta666




Posts: 1061
Location: Manchester
PostPosted: Fri, 8th Aug 2014 13:14    Post subject:
I hope it's better implemented than the Nvidia version in their control panel. There is still a lot of tearing with Adaptive V-Sync at the moment.
Back to top
Werelds
Special Little Man



Posts: 15098
Location: 0100111001001100
PostPosted: Fri, 8th Aug 2014 13:30    Post subject:
Axeleration wrote:
Oh OK, my bad, you meant FreeSync. But to be honest, a slice taken from the FAQ:
"Upon connecting a FreeSync-enabled monitor to a compatible AMD Radeon™ graphics card..."
has me a little bit concerned.

(embedded video at 1:04:51)

Nvidia will get screwed

Linus gets it wrong. Project FreeSync is NOT a different name for Adaptive Sync, nor will Project FreeSync end up being called AS. Project FreeSync builds *on top* of AS.

AS is the "push refresh" technology, part of the DisplayPort _spec_ (which is nothing but words). This spec is royalty free, so anyone is free to implement it on either side of the cable (monitor or GPU) and there are no vendor requirements on either side.

Project FreeSync is AMD's implementation of that technology. Nvidia can, and most likely will, implement it too. And they'll probably do so quietly under the G-Sync name, so that they can go "HAH WE DID IT FIRST ANYWAY".

Just like AMD and Nvidia have their own implementations of tessellation, anisotropic filtering and so on.

matta666 wrote:
I hope it's better implemented than the NVidia version in their control panel. There is still a lot of tearing with adaptive vsync at the moment.

Read above.

It's something different from Adaptive V-Sync. Forget about Adaptive V-Sync, it's crap.

Adaptive Sync, FreeSync and G-Sync aren't.
Back to top
matta666




Posts: 1061
Location: Manchester
PostPosted: Fri, 8th Aug 2014 14:12    Post subject:
Thanks
Back to top
AmpegV4




Posts: 6248

PostPosted: Mon, 18th Aug 2014 09:09    Post subject:
After using strobed backlight blur reduction I can never go back to 60Hz crap, no matter how great the color, resolution or contrast. I don't get the fuss about G-Sync either: no tearing, but it's still going to be blurry unless it's strobed, and you can't enable both at once.

http://www.testufo.com/#test=photo&photo=quebec.jpg&pps=960&pursuit=0&height=0
Running this test gives me a crystal clear photo while it scrolls across the screen; shooters are a night and day difference, amazing to play.
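The blur being described can be roughly estimated from pixel persistence: on a sample-and-hold display each frame stays lit for the whole refresh period, so an object moving at the TestUFO speed above smears across that many pixels per frame. A back-of-envelope sketch (the 1 ms strobe figure is an assumption, typical of LightBoost-style backlights):

```python
# Approximate motion blur, in pixels, for an object moving at 960 px/s
# (the speed used in the TestUFO link above).
SPEED_PX_PER_S = 960

def blur_px(persistence_s):
    # Smear width = motion speed * how long each frame stays visible.
    return SPEED_PX_PER_S * persistence_s

print(blur_px(1 / 60))   # 60 Hz sample-and-hold: 16 px of smear
print(blur_px(1 / 144))  # 144 Hz sample-and-hold: ~6.7 px
print(blur_px(0.001))    # ~1 ms strobed backlight: ~1 px
```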
Back to top
Shoshomiga




Posts: 2378
Location: Bulgaria
PostPosted: Mon, 18th Aug 2014 13:44    Post subject: I have left.
I have left.
Back to top