Acer Predator X34 Will Get 100Hz Refresh Rate Support

On these FreeSync/G-sync monitors, why do they only spec the highest refresh and not also the lowest? What's FreeSync/G-sync for if there's no lowest refresh rate information?
that gsync bonus :P
One has to love the brown-nosing Acer did here. So, is the FreeSync version 30% cheaper? I do not think so. Why would I pay for a 100Hz screen that they artificially limited to 75Hz?
On these FreeSync/G-sync monitors, why do they only spec the highest refresh and not also the lowest? What's FreeSync/G-sync for if there's no lowest refresh rate information?
That's a FreeSync-only problem. With G-Sync, when the frame rate drops below the lowest refresh rate, the module doubles the refresh rate.
Moderator
That's a FreeSync-only problem. With G-Sync, when the frame rate drops below the lowest refresh rate, the module doubles the refresh rate.
I don't think so.
It just redraws the same frame; the outcome is subtly different but still jarring.
I have only seen this mentioned for FreeSync once it dips below the minimum refresh rate. Do you have a link?
Moderator
It is in the article posted by Teawithgrief. The transitions between true frames occur at the same times; throwing in filler frames makes it less jarring, but the outcome is still the same: not enough FPS.
Unfortunately I don't see a quote about "jarring" or anything like that related to the G-Sync functionality in the article. If you're referring to this comment from the comments section, it concerns behaviour above the minimum refresh rate, and as the tester mentioned, it was related to his "old oscilloscope having a hard time triggering":
No, but G-Sync did have an issue where it didn't handle quick oscillations around those frame doubling transition points very well, causing a kind of stutter when you were rendering at 36-39 FPS for a while.
To which the tester responded
That actually was more my old oscilloscope having a hard time triggering. It's an old scope. Cut it some slack 🙂
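For reference, the frame-doubling behaviour being debated above can be sketched in a few lines. This is an illustrative model of low-framerate compensation, not any vendor's actual firmware, and the panel limits used are assumptions:

```python
import math

# Illustrative sketch of low-framerate frame doubling: when the game's
# frame rate falls below the panel's minimum variable refresh rate, the
# same frame is redrawn 2x, 3x, ... so the panel keeps refreshing
# inside its supported range.

def redraw_multiplier(fps: float, min_hz: float) -> int:
    """Smallest integer multiplier lifting the redraw rate to >= min_hz."""
    if fps >= min_hz:
        return 1  # in range: one scan-out per rendered frame
    return math.ceil(min_hz / fps)

# On a panel with a 30Hz floor, 20 FPS is redrawn twice (40Hz effective)
# and 10 FPS three times (30Hz effective).
```

As the discussion above notes, this keeps the panel happy but cannot manufacture new frames: the content still updates at the rendered FPS.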
I wanted to wait and see how the FreeSync version of this monitor would turn out, and after this 75Hz slap in the face with a price tag of over a THOUSAND FREAKING EUROS, I believe it will be the ASUS MG279Q instead. Good job Acer, and by good job I actually mean f**k you, as you did me.
Compared to other 34", curved screens with a 21:9 aspect ratio and 3440 x 1440 resolution, the price is about right. If you are buying a monitor in this category (ultra-wide gaming screens), high prices are expected whether it's FreeSync or G-Sync. G-Sync models average about $100 more, and this version most likely has the newer G-Sync 2 module.
Expected? Yes, as in a "we're first on the market" early-adopter beta-tester tax. We'll see about the price when Asus, LG, and the others (God forbid the Koreans make some) come out with similar models; they'll be throwing them at us at their actual worth, which is $800 tops.
The Acer XR341CK is actually very good. I got mine last Friday and played a couple of games on it maxed out: Crysis 3, Tomb Raider, BioShock Infinite, and Counter-Strike: GO. I used the frame limiter in CCC to cap at 75 FPS, since the FreeSync range is 30Hz to 75Hz. Everything felt very BUTTERY SMOOTH with no screen tearing. I didn't notice any blur either. There is some backlight bleed around the upper edges, but most of it went away once it was calibrated (TFTCentral settings + ColorMunki). My previous monitor was an ROG Swift. The difference between 120Hz (couldn't get 144 to work) and 75Hz with FreeSync is very minimal. It's not as smooth as 120Hz, but FreeSync makes up for it.
I do not see any difference in the picture between 48 and 75Hz on my FreeSync monitor. Within the FreeSync range, the picture is just absolutely smooth. I suspect the same applies to the 76-100Hz range; the picture will be just as smooth. Only two parameters seem to matter: a frame limiter that keeps you from exceeding the upper synchronization limit, and the lower limit itself.
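The point above can be sketched in a few lines, assuming a hypothetical 30-75Hz FreeSync window (the numbers are illustrative, not any specific panel's spec):

```python
# Hypothetical FreeSync window; the limits are illustrative assumptions.
VRR_MIN_HZ = 30.0
VRR_MAX_HZ = 75.0

def in_vrr_window(fps: float) -> bool:
    """True when every frame can be displayed tear-free as it arrives."""
    return VRR_MIN_HZ <= fps <= VRR_MAX_HZ

def limited_fps(fps: float, cap: float = VRR_MAX_HZ - 1) -> float:
    """A frame limiter set just below the panel maximum keeps fast
    scenes inside the window instead of spilling past it into
    tearing or V-sync latency."""
    return min(fps, cap)

# A 120 FPS scene capped at 74 FPS stays in-window; uncapped it does not.
```

The same sketch applies unchanged to a 100Hz window: only the cap value moves.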
Now I will just wait for a Korean version of this screen which hits 100Hz... who gives a toss about FreeSync/G-Sync, 1000 euros is ridiculous!!
Getting there. I guess full 144 Hz spec will be available when I get my Skylake-E PC in 2017.
So this says to me, "Just wait for Gen 2 or 3." Cool product, but the price is a little overbearing.
And it will probably have ULMB @ 100Hz too.
Rumor has it that the 100Hz is only available in G-Sync mode, and with G-Sync on you can't use ULMB, so it would be one or the other. I'm really hoping it's "true" 100Hz, but I'm finding it a bit too good to be true.
ULMB is Nvidia's refined LightBoost hack. I'm assuming it's also done through their module.
Yeah, it's an Nvidia thing, but as of today it can't be activated while using G-Sync, so you have to pick one :P unless something has changed. Man, I really want more info on whether I should wait for this or grab the FreeSync one... I hate waiting :wanker:
Even if it doesn't have ULMB, you should still get the G-Sync one... because it's 100Hz. It's a no-brainer.
Oh, it has ULMB, you just can't use it at the same time as G-Sync! 🙂 Yes, that's what I'm waiting for: confirmation that this screen is real 100Hz with OR without G-Sync. There are some reports that the screen is only 100Hz when using G-Sync (Nvidia cards), so then I wouldn't be able to use 100Hz as an AMD user. But if they say it's 100Hz all the time, then I'll get it and perhaps get Nvidia cards next year!