Acer XB280HK is a 28-inch 4K Ultra HD monitor with G-Sync

Wow, this and two 780 Ti 6GB cards would set me back £2000. I much prefer lower-scale gaming for now; perhaps when the GTX 990 is here.
Why would you need G-Sync anyway? I still believe screen tearing is a made-up belief that doesn't actually exist. If I ever had it, it was so minor that I never noticed; those screen tearing screenshots sure look ridiculous.
Why would you need G-Sync anyway? I still believe screen tearing is a made-up belief that doesn't actually exist. If I ever had it, it was so minor that I never noticed; those screen tearing screenshots sure look ridiculous.
You'd be surprised how wrong you are. Screen tearing is real and it is out to get you! It depends on what games you play, I guess; I myself don't mind it that much, but there are a lot of people who can't play because of it.
Wow, this and two 780 Ti 6GB cards would set me back £2000. I much prefer lower-scale gaming for now; perhaps when the GTX 990 is here.
I think the problem is deeper than that. For affordable 4K gaming (4,096 horizontal pixels) there needs to be a complete redesign of the basic PC architecture. AMD, Intel, IBM, Apple, Nvidia, Samsung, Microsoft, and a considerable contribution from the global community need to come up with a new, clean design from scratch. All this x86 crap needs to go: too expensive, too much power draw, and running out of steam fast. When I was a kid, I was told we'd be in flying cars by 2015, but no, we've still got x86 processors running binary... 64-bit? Pfft, I would have expected 1,024-bit tech running my kettle, ffs. This is what happens when incremental increases are more profitable. /gets off soapbox
4K G-Sync, do want 😀
Why would you need G-Sync anyway? I still believe screen tearing is a made-up belief that doesn't actually exist...
I'm guessing you also believe there is no difference between 30 and 60 FPS, because the human eye can't see past 30 FPS eh?
I'm guessing you also believe there is no difference between 30 and 60 FPS, because the human eye can't see past 30 FPS eh?
Don't forget narrow-FOV motion sickness :infinity: :D
Why would you need G-Sync anyway? I still believe screen tearing is a made-up belief that doesn't actually exist...
Is this like a joke comment?
Why would you need G-Sync anyway? I still believe screen tearing is a made-up belief that doesn't actually exist. If I ever had it, it was so minor that I never noticed; those screen tearing screenshots sure look ridiculous.
Nice troll bro 😀
Wow, this and two 780 Ti 6GB cards would set me back £2000. I much prefer lower-scale gaming for now; perhaps when the GTX 990 is here.
Well, G-Sync drastically reduces the hardware requirements, as you don't need your system to stay at 60+ fps 100% of the time. So if you accept frame rates around 40, a single 780 Ti or your dual 760s will do.
Why would you need G-Sync anyway? I still believe screen tearing is a made-up belief that doesn't actually exist. If I ever had it, it was so minor that I never noticed; those screen tearing screenshots sure look ridiculous.
Assassin's Creed Black Flag. Checkmate. I think G-Sync is great, but Nvidia really screwed up with it. What a blunder.
Well, G-Sync drastically reduces the hardware requirements, as you don't need your system to stay at 60+ fps 100% of the time. So if you accept frame rates around 40, a single 780 Ti or your dual 760s will do.
I like G-Sync, and it's nice to see it move beyond the 1080p 27" monitors. But I'm a little worried about 4K + G-Sync. We will need to wait for reviews, but G-Sync has a limit at 33.3 ms (30 Hz, or 30 fps); that is the maximum time it can hold a frame. So basically, when you are in the 30 fps range and under, the panel has to completely redraw the frame on its own (which can create more or less visible stutter). So with a 4K monitor, what I fear is the minimum frame rate: if it goes too far under 30 fps (i.e., more than 33.3 ms to render a frame), this could start to be a problem. You will need to be really careful with the minimum frame rates and set the graphics options accordingly to avoid hitting it.
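To put the 33.3 ms figure from the comment above into numbers, here is a minimal sketch in plain C that converts a few frame rates into frame times and compares them against the 30 Hz floor described there; the threshold and the sample frame rates are assumptions taken from that comment, not from Acer's or Nvidia's specifications.

```c
/* Sanity check of the frame-time math discussed above: the comment says a
 * G-Sync panel holds a frame for at most ~33.3 ms (30 Hz). Any frame that
 * takes longer than that to render falls outside the variable-refresh
 * window. The numbers are taken from the comment, not a spec sheet. */
#include <stdio.h>

int main(void) {
    const double gsync_max_hold_ms = 1000.0 / 30.0;   /* ~33.3 ms */
    const double fps_samples[] = { 60.0, 40.0, 30.0, 25.0, 20.0 };
    const int n = sizeof fps_samples / sizeof fps_samples[0];

    for (int i = 0; i < n; i++) {
        double frame_time_ms = 1000.0 / fps_samples[i];
        printf("%5.1f fps -> %5.1f ms per frame: %s\n",
               fps_samples[i], frame_time_ms,
               frame_time_ms <= gsync_max_hold_ms
                   ? "within the G-Sync window"
                   : "below 30 fps, panel must refresh on its own");
    }
    return 0;
}
```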
Well, G-Sync drastically reduces the hardware requirements, as you don't need your system to stay at 60+ fps 100% of the time. So if you accept frame rates around 40, a single 780 Ti or your dual 760s will do.
Yeah, but if you're packing a display like this, what's the point of medium/high textures instead of ultra? :P I'm guessing single 4K displays are hitting the 3GB VRAM limit in high-end games already (or they probably should be, if the textures are up to snuff), and that's only going to get worse over the next two years. So bundling two 2GB GPUs or two 3GB GPUs with a display like this almost seems like a waste :/
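As a rough illustration of the memory pressure being discussed, here is a small back-of-the-envelope sketch in plain C comparing raw render-target sizes at 1080p and 4K, assuming 32-bit color, double buffering, and one 32-bit depth buffer; textures and engine-specific buffers, which dominate real VRAM usage, are not modeled, and the 3GB figure in the comment is the card limit being debated, not something computed here.

```c
/* 4K has four times the pixels of 1080p, so render targets (and any
 * resolution-scaled buffers) grow by the same factor. Assumed layout:
 * 32-bit color, double buffered, plus one 32-bit depth/stencil buffer. */
#include <stdio.h>

static double target_mib(int w, int h, int bytes_per_pixel, int buffers) {
    return (double)w * h * bytes_per_pixel * buffers / (1024.0 * 1024.0);
}

int main(void) {
    double fhd = target_mib(1920, 1080, 4, 2) + target_mib(1920, 1080, 4, 1);
    double uhd = target_mib(3840, 2160, 4, 2) + target_mib(3840, 2160, 4, 1);

    printf("1080p render targets: ~%.0f MiB\n", fhd);
    printf("4K    render targets: ~%.0f MiB (%.1fx)\n", uhd, uhd / fhd);
    return 0;
}
```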
All this x86 crap needs to go: too expensive, too much power draw, and running out of steam fast.
I don't get it: how exactly is x86 a problem when it comes to high-res gaming? And what do you propose to improve it? x86 may be old, kind of messy (but it is CISC, after all...) and a bit power hungry, but there's a certain point where x86 becomes very efficient, if not more efficient than any other architecture. Try getting an ARM CPU to compete with an i7 in terms of performance-per-watt and the ARM will most certainly fail miserably. When you try to get an x86 CPU to compete against the power draw of an existing ARM CPU, the ARM will most likely perform better. This is my gripe with Intel: they want to dominate EVERYTHING, but x86 is not a one-size-fits-all architecture by any means.
When I was a kid, I was told we'd be in flying cars by 2015, but no, we've still got x86 processors running binary... 64-bit? Pfft, I would have expected 1,024-bit tech running my kettle, ffs.
I both agree and disagree. x86 should have been obsoleted a long time ago, but in the Windows world, software compatibility would be a nightmare if that were the case. But why go beyond 64-bit architectures? In the server world, where software compatibility in new systems often doesn't matter at all, they still stick with 32-bit and 64-bit architectures. Every year servers have the opportunity to increase the bus width, but they don't. GPUs are the only exception, but their operation isn't comparable to a CPU's. If money weren't in the equation, we'd likely all own a quantum computer at this point. But since that isn't the case, and since companies only do things in their own interest, your demands seem very naive.
x86 is an architecture; 64-bit or 32-bit has nothing to do with that. In the 70s x86 used a 16-bit instruction set, it then moved to a 32-bit instruction set, and it now uses the x86_64 instruction set, which is also the AMD64 instruction set (AMD64 is also used by Intel processors, with some small differences in implementation, by VIA processors, etc.). Note that AMD64 instructions are also backward compatible with both 16-bit and 32-bit. This is why you can run full 32-bit software on it, and not the other way around; x86_64 is not 32-bit extended to 64-bit, but fully 64-bit while staying backward compatible with 32-bit operands, etc.
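A concrete way to see the backward compatibility described above: the sketch below, in plain C, prints the pointer width it was compiled for, and on an x86-64 machine both a 32-bit build (for example `gcc -m32`, assuming the 32-bit runtime libraries are installed) and a 64-bit build (`gcc -m64`) run natively.

```c
/* The same source built as 32-bit or 64-bit runs on an x86-64 CPU under a
 * 64-bit OS (provided 32-bit runtime libraries are installed), which is the
 * backward compatibility the comment above describes. */
#include <stdio.h>

int main(void) {
    printf("pointer size: %zu bits\n", sizeof(void *) * 8);
    printf("long size:    %zu bits\n", sizeof(long) * 8);
#if defined(__x86_64__) || defined(_M_X64)
    printf("compiled for x86-64 (AMD64)\n");
#elif defined(__i386__) || defined(_M_IX86)
    printf("compiled for 32-bit x86\n");
#endif
    return 0;
}
```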
Note that AMD64 instructions are also backward compatible with both 16-bit and 32-bit. This is why you can run full 32-bit software on it, and not the other way around; x86_64 is not 32-bit extended to 64-bit, but fully 64-bit while staying backward compatible with it.
x86-64 is backward compatible on a HARDWARE level. In contrast, I believe IA64 is a 64-bit x86 architecture, but it isn't backward compatible on either a hardware or software level. This brings me back to my point about Windows being an issue: if you want newer, better architectures, the software has to be designed for it. 32-bit Windows is STILL prevalent. For whatever reason, MS didn't try pushing for wider buses 8 years ago, and because of this they're crippling the computer industry.
x86-64 is backward compatible on a HARDWARE level. In contrast, I believe IA64 is a 64-bit x86 architecture, but it isn't backward compatible on either a hardware or software level. This brings me back to my point about Windows being an issue: if you want newer, better architectures, the software has to be designed for it. 32-bit Windows is STILL prevalent. For whatever reason, MS didn't try pushing for wider buses 8 years ago, and because of this they're crippling the computer industry.
The Itanium one is different (64-bit from scratch), but it is not an x86 architecture anyway. For software and the OS, that's a different question, I agree.
I think the problem is deeper than that. For affordable 4K gaming (4,096 horizontal pixels) there needs to be a complete redesign of the basic PC architecture. AMD, Intel, IBM, Apple, Nvidia, Samsung, Microsoft, and a considerable contribution from the global community need to come up with a new, clean design from scratch. All this x86 crap needs to go: too expensive, too much power draw, and running out of steam fast. When I was a kid, I was told we'd be in flying cars by 2015, but no, we've still got x86 processors running binary... 64-bit? Pfft, I would have expected 1,024-bit tech running my kettle, ffs. This is what happens when incremental increases are more profitable. /gets off soapbox
I don't get it: how exactly is x86 a problem when it comes to high-res gaming? And what do you propose to improve it? x86 may be old, kind of messy (but it is CISC, after all...) and a bit power hungry, but there's a certain point where x86 becomes very efficient, if not more efficient than any other architecture. Try getting an ARM CPU to compete with an i7 in terms of performance-per-watt and the ARM will most certainly fail miserably. When you try to get an x86 CPU to compete against the power draw of an existing ARM CPU, the ARM will most likely perform better. This is my gripe with Intel: they want to dominate EVERYTHING, but x86 is not a one-size-fits-all architecture by any means.

I both agree and disagree. x86 should have been obsoleted a long time ago, but in the Windows world, software compatibility would be a nightmare if that were the case. But why go beyond 64-bit architectures? In the server world, where software compatibility in new systems often doesn't matter at all, they still stick with 32-bit and 64-bit architectures. Every year servers have the opportunity to increase the bus width, but they don't. GPUs are the only exception, but their operation isn't comparable to a CPU's. If money weren't in the equation, we'd likely all own a quantum computer at this point. But since that isn't the case, and since companies only do things in their own interest, your demands seem very naive.
There is no reason to retire x86. Modern x86 processors barely resemble "x86" processors anyway; they all have sophisticated frontends that decode CISC-based instructions into internal formats that resemble RISC where they need to. ARM's only advantage is that they focused on low power from the start, and Intel has slowly been modifying its instructions and internal architecture to better suit those needs. Moorefield is x86, is power/performance competitive with ARM, and should be seen in products later this year. Also, nothing is black and white; it isn't as if ARM is only capable of low power. They could easily scale their design and make internal changes to better suit high-performance needs. They just knew it would be more difficult to compete with Intel, so they went a completely different route to avoid competing with them.

And yeah, the compatibility thing is a problem. It would be like retiring all world languages in favor of a superior one that is more accurate with fewer words/phonics/whatever. Languages evolve naturally to fill the voids/gaps/concerns that populations have. x86/ARM is exactly the same way; neither one is set in stone, they are constantly evolving. There have also been tons of other instruction sets that have claimed all kinds of benefits but failed to gain traction, mainly because Intel can do whatever it wants internally on a chip and mimic those benefits.

As for G-Sync, I think I may personally wait to see what the deal is with FreeSync and see if my questions about G-Sync get answered. Nvidia keeps saying they are going to fix things in the future with G-Sync, but is that going to require a new module? Are they going to issue firmware updates for the current G-Sync module? Is that why they went with an FPGA instead of an ASIC? I really don't want to lock myself into Nvidia only with my monitor purchase.
Why would you need G-Sync anyway? I still believe screen tearing is a made-up belief that doesn't actually exist. If I ever had it, it was so minor that I never noticed; those screen tearing screenshots sure look ridiculous.
You cannot be serious... It has plagued gaming for years. I cannot wait for this bad boy, as I'm rocking quad Titans and have been drooling for this since the conception of G-Sync!!