Dell P4317Q Multi Client 4K Ultra HD Monitor
More info on Dell's new P4317Q Ultra HD monitor has surfaced. The display is so large that Dell calls it a 'Multi Client' monitor; it uses a 42.51” IPS panel, likely an LG Display model.
The panel boasts a 3840 x 2160 (‘4K’ UHD) resolution, with a pixel density of just under 104 PPI. Dell touts the ‘Multi-Client’ capability of the monitor, with PiP and PbP allowing 4 devices to output to the screen simultaneously.
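The quoted pixel density follows directly from the resolution and the diagonal size. A quick sanity check (assuming the spec's 42.51-inch diagonal):

```python
import math

def ppi(h_px, v_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal size in inches."""
    return math.hypot(h_px, v_px) / diagonal_in

# 3840 x 2160 on a 42.51" panel -> ~103.6 PPI, matching "just under 104 PPI"
print(f"{ppi(3840, 2160, 42.51):.2f} PPI")
```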
It has a 60Hz refresh rate, 1000:1 static contrast ratio, 10-bit colour output (likely 8-bit + FRC dithering) and 178° horizontal and vertical viewing angles. An enhanced phosphor WLED backlight is used to offer excellent sRGB colour space coverage but no support for wide colour gamuts.
An 8ms grey-to-grey response time is specified. The monitor includes 100 x 100mm and 200 x 200mm VESA holes for alternative mounting. The ports include: DP 1.2, mDP 1.2, two HDMI 1.4 ports (with MHL), VGA, four USB 3.0 ports (plus upstream), a 3.5mm audio input, a 3.5mm audio output and an RS232 port. Two 8W speakers are also included, which should provide fairly decent sound output.
The monitor is listed at $1,350 / €1,050 on Dell's product page. It appears to be intended as a B2B product and may not see the usual wider retail availability.
Member
Posts: 31
Joined: 2015-07-29
I would be happy with even just G-Sync. I'd be all over it while playing Witcher 3...

We need better GPUs, however; one 1080 is not nearly enough. Maybe two, but even then there is still a LOT of room for improvement in 4K performance.
Tbh I don't really see the problem; it's easier to use a big screen at a lower resolution than to use DSR on a lower-res screen. Also, in my opinion, it's way easier to change (unless you watercool) and resell the video card than the monitor. If a monitor is good for its usage it can last 3 to 5 years, while a video card two generations later gets really old and is almost unusable.
Member
Posts: 31
Joined: 2015-07-29
Speaking from my own experience, technology is only just getting to the point of running modern, beautiful games at 2160p on a single GPU; a 1080 can do it, but only barely, I would say. A Titan may have better chances here due to its larger VRAM. My 1080 in Rise of the Tomb Raider, with everything on maximum except AA (using FXAA instead of SMAA), eats 7.8 GB of VRAM at 1440p, which is quite a lot.
I can only assume that at 2160p it will eat even more VRAM.
Hence, the way I see it right now, we need a GPU which can run at around 2.5 GHz, has 12-16 GB of VRAM, and on top of that has better cooling, ideally water.
Will we see a single GPU like that any time soon? Maybe; the 1080 Ti might ship with something like 12 GB of VRAM, though I doubt we will see it at 2500 MHz, more like ~2200 MHz.
So the current generation leaves SLI as the only route to 2160p if you want 60+ fps in modern titles.
And SLI, speaking from my own experience, is not that good, especially recently. Fewer and fewer titles support it (this might change, but that has been my observation over the past couple of years, moving from 670 SLI to 980 SLI); you get much higher temperatures compared to a single GPU, and gameplay is less fluid due to micro-stutter and latency, which of course CAN be partially fixed by G-Sync, but still.
So the way I see it, 1440p is currently the best all-rounder for gaming, as the 1080 works really well with it, while 2160p is a bit too much of a challenge for one GPU.
What we need are 32-36 inch 1440p screens with 100+ Hz, IPS (or newer tech), low response times and G-Sync. Preferably curved (at least for me). That would be amazing.

I would pre-order it XD
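The pixel-count arithmetic behind the 1440p-vs-2160p argument above is simple enough to check:

```python
# Back-of-envelope pixel counts for the 1440p-vs-2160p comparison.
px_1440p = 2560 * 1440          # 3,686,400 pixels per frame
px_2160p = 3840 * 2160          # 8,294,400 pixels per frame
ratio = px_2160p / px_1440p     # 2.25x the pixels to render every frame
print(f"2160p pushes {ratio}x the pixels of 1440p")
```

A GPU that holds 60 fps at 1440p therefore has to shade 2.25x as many pixels per frame at 4K, which is roughly why a single 1080 that is comfortable at 1440p struggles at 2160p.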
Senior Member
Posts: 813
Joined: 2009-11-30
That "unusable" depends on the user... I still see people using GTX 5** series cards, even older ones.
Even if it's unusable for you, it still lasts around 3-4 years,
as each GPU generation is about 2-3 years apart:
Kepler: May 2012
Maxwell: Feb 2014
Pascal: April 2016
By the time you get the latest-generation card, your monitor is probably already failing.
Senior Member
Posts: 364
Joined: 2015-06-18
Another meh 4K monitor with only HDMI 1.4.
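The complaint holds up: HDMI 1.4's TMDS link tops out at a 340 MHz pixel clock, which caps UHD input at 30 Hz. A rough check below assumes ~20% blanking overhead (the actual CTA-861 UHD timing is ~594 MHz at 60 Hz, so the approximation is close):

```python
# Rough check of why HDMI 1.4 limits 3840x2160 input to 30 Hz.
# HDMI 1.4 single-link TMDS maximum: 340 MHz pixel clock.

def required_pixel_clock_mhz(h_active, v_active, refresh_hz, blanking_factor=1.20):
    """Approximate pixel clock in MHz, assuming ~20% blanking overhead."""
    return h_active * v_active * refresh_hz * blanking_factor / 1e6

HDMI_14_MAX_MHZ = 340
uhd_60 = required_pixel_clock_mhz(3840, 2160, 60)  # ~597 MHz -> does not fit
uhd_30 = required_pixel_clock_mhz(3840, 2160, 30)  # ~299 MHz -> fits

print(f"UHD@60 needs ~{uhd_60:.0f} MHz, fits HDMI 1.4: {uhd_60 <= HDMI_14_MAX_MHZ}")
print(f"UHD@30 needs ~{uhd_30:.0f} MHz, fits HDMI 1.4: {uhd_30 <= HDMI_14_MAX_MHZ}")
```

So on this monitor, 4K at 60 Hz is only possible over the DP 1.2 / mDP 1.2 inputs; the HDMI ports are effectively 30 Hz connections at the native resolution.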