ASUS ProArt Display OLED PA32DC has been unveiled.

Right size, wrong refresh rate. Right tech, but not QD-OLED. Right resolution, wrong HDMI connections. Wrong price, and 3 months of Creative Cloud is meaningless if you already have a subscription tied to a business account. Independent content creators will see little value here beyond the ROG Swift OLED PG42UQ for an additional 2,000 bones... especially when you consider that this monitor is for content creation, yet it does not even support HDCP 2.3, while the PG42 does. How in the world can an editor of HDCP 2.3 content, working remotely, work on content with this content-creation monitor? The PG42 is everything this monitor needed to be for '22/'23/'24/'25, while this one is stuck in 2017 for HDMI and is very late to the party. I think that sums up my comments.
Loobyluggs:

1) QD-OLED is not better, it's a workaround. All QD-OLEDs have that funky pixel layout, which is bad for productivity. 2) I agree re: the ROG PG42UQ, but this is not designed for working remotely; it's made for editing rooms, which are quite small. 3) Not all content creators are editing HDCP 2.3 content, period. You are not thinking of how vast the film and TV markets are, or of the tens of thousands of production companies. Also, if you're a graphic designer/creative director, HDCP means nada - and I'm talking all advertising in every industry. This monitor is perfect for the job, especially viewing dailies on set. My only question is whether they added that kick-ass heat sink the PG42UQ has - and by the way, that's also not QD-OLED.
DisplayHDR 400 only?? I don't understand this - when every consumer OLED TV can go up to about 800 nits, how are these so-called professional monitors only going up to half that? How could anyone properly monitor and evaluate HDR content if the monitor is not capable of displaying anything over 400 nits? Most HDR content has to be mastered to 1000 nits, so you can't see anything in the 400-1000 range... only some sort of compressed, tone-mapped version.
geogan:

I understand your confusion, but HDR 400 True Black is better than HDR 600, and in addition this has the HDR10 codecs as well. Brightness is never the problem in a modern display; BLACK is. And HDR *does not have to be mastered at 1000 nits* if you're using True Black. HDR 400 True Black gets far brighter than 400 nits, and by the way, most consumer OLEDs are HDR 400 True Black and also have the HDR10 codecs. There is zero compression or tone mapping on an HDR True Black screen. The fault with all of this is VESA selling certifications like popcorn at a theater.
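The "True Black 400 beats HDR 600" argument comes down to contrast, not peak brightness. A minimal sketch of the math, using approximate peak-luminance and maximum-black-level figures from the VESA DisplayHDR 1.1 tiers (quoted from memory, so treat the exact numbers as illustrative rather than authoritative):

```python
# Approximate VESA DisplayHDR 1.1 tier requirements, as
# (peak luminance, max black level) in cd/m^2 (nits).
# Illustrative figures only -- check the spec for exact values.
TIERS = {
    "DisplayHDR 400":            (400, 0.4),
    "DisplayHDR 600":            (600, 0.1),
    "DisplayHDR True Black 400": (400, 0.0005),
}

for name, (peak, black) in TIERS.items():
    # The True Black tier wins on contrast despite the lower peak,
    # because its required black level is orders of magnitude deeper.
    print(f"{name}: roughly {peak / black:,.0f}:1 contrast")
```

With these figures, True Black 400 works out to roughly 800,000:1 against 6,000:1 for DisplayHDR 600, which is the point being made above.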
tunejunky:

The demographic for this monitor is remote content creators and editors, working to a paid contract or running their own SOHO business - hence the Adobe Creative Cloud offer, which makes no sense, because either you already have a subscription or the company you work for has paid for it. HDCP 2.3 is very important for all of those streaming services, like Netflix, Disney, Amazon.
Loobyluggs:

You have that in reverse. I'm very familiar with HDCP 2.3, but my point is that the market for this product is far larger than you've assumed. HDCP is a layer that's added in post-production - and not for everything, as the content itself is created without it - thus the far larger market. I've been in editing rooms in Hollywood (Paramount, Sony, Lightstorm (prod. co.), and several others) for film and TV (CBS, NBC, Fox), as I used to work for a very well respected and highly utilized monitor manufacturer for almost 30 years.
@Loobyluggs QD is better? lol, ok. Content creation does not give a crap about refresh rates. While it's too small for me and probably for others who don't care about multi-monitor setups, I will never again get anything that doesn't support VRR. Same for ABL, which AFAIK Sony displays don't need - yet they still don't experience burn-in within the same time frame that the LG screens do (which use ABL).
tunejunky:

Yes, I know what the VESA "Black" tier is... added in to please the OLED manufacturers, who knew they couldn't compete with high-brightness LCD panels and so wanted a tier covering the dark end, where they have the advantage. If it can go far brighter, then why isn't it certified as such? Why isn't it HDR 600 or HDR 1000 certified? I presume you can still have HDR 1000 True Black? And how could there be no tone mapping when displaying 1000- or 2000-nit content? In that case it would just hard-clip the highlights and blow out all the bright areas... not what you want if you are supposed to be monitoring the final image accurately.
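The hard-clip versus tone-map distinction being argued here can be sketched in a few lines. This is a toy model, not any display's actual pipeline: the 300-nit knee and 1000-nit mastering peak are assumed parameters chosen for illustration.

```python
def hard_clip(nits, peak=400.0):
    """Naive approach: everything above the panel peak is simply lost."""
    return min(nits, peak)

def soft_rolloff(nits, peak=400.0, knee=300.0, source_max=1000.0):
    """Toy tone map: track the source linearly up to the knee, then
    compress the remaining source range (knee..source_max) into the
    remaining panel range (knee..peak), preserving highlight ordering."""
    if nits <= knee:
        return nits
    t = (min(nits, source_max) - knee) / (source_max - knee)
    return knee + t * (peak - knee)

# A 650-nit and a 1000-nit highlight are indistinguishable after a hard
# clip (both land at 400), but remain distinct after the roll-off.
print(hard_clip(650), hard_clip(1000))      # 400.0 400.0
print(soft_rolloff(650), soft_rolloff(1000))  # 350.0 400.0
```

This is why mastering above a panel's peak implies either clipped highlights or some form of compression: there is no third option that keeps every source level distinct.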
tunejunky:

If you have to deliver to Netflix, it has to be in multiple different formats - every language subtitled and every market catered for. It's a huge list, and HDCP is on that list, so you are going to be working with it (in remote teams) in order to deliver the content. If you are NOT working remotely, then this is irrelevant, for the reasons you've stated. These people (remote workers) are (potentially) part of a big system, hence why it doesn't make sense to purchase something that locks you out of the workspace you are supposed to be working in. It ticks a lot of boxes for image quality, but not many other boxes for multi-system usage, and therefore fails.