Projector + Apple TV 4K HDR / WCG ponderings

AVForums

Drifter

AVForums Grandmaster
*
Joined
Jan 25, 2012
Messages
4,533
Reaction score
1,261
Location
Durbanville
My projector, an Optoma UHD55, has dual HDMI inputs: HDMI 1, which is a standard input, and HDMI 2, which offers wide colour gamut (WCG).

When using HDMI 1, the source signal dictates the projector's display mode: if it is fed an HDR signal (e.g. Dolby Vision from Netflix), it automatically sets the resolution to 4K and the display mode to HDR. I set my Apple TV 4K's resolution to 1080p and enable dynamic range and frame rate matching, which changes the output signal to match the source content; the PJ then detects this signal and adjusts the resolution and display mode accordingly. When standard (i.e. non-HDR) content is displayed, you can switch the PJ's display mode between various presets, including "cinema", "reference", "bright", "game" etc., each with its own colour configuration. However, when an HDR source is detected, the PJ auto-selects the "HDR" preset (the other display modes cannot be selected), which uses the REC.2020 colour gamut.
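For anyone trying to reason about which signal actually leaves the box, the matching behaviour described above can be sketched as a small model. This is purely illustrative: the `apple_tv_output` function and its field names are my own invention to describe the behaviour, not Apple's actual API.

```python
# Illustrative model of "match dynamic range and frame rate":
# the box keeps a base output mode, but per title it switches the
# signal so the display can auto-select the right preset.

def apple_tv_output(content, base=("1080p", "SDR", 60),
                    match_range=True, match_rate=True):
    """Return the (resolution, dynamic_range, fps) tuple the box sends."""
    res, rng, fps = base
    if match_range and content["dynamic_range"] != "SDR":
        # HDR / Dolby Vision titles go out in their native range and,
        # per the behaviour described above, at native resolution,
        # so the projector flips into its HDR display mode.
        rng = content["dynamic_range"]
        res = content["resolution"]
    if match_rate:
        fps = content["fps"]
    return res, rng, fps

# A Dolby Vision title pushes the output up to 4K Dolby Vision at 24 fps,
# while plain SDR content stays at the configured 1080p base.
print(apple_tv_output(
    {"resolution": "4K", "dynamic_range": "Dolby Vision", "fps": 24}))
print(apple_tv_output(
    {"resolution": "1080p", "dynamic_range": "SDR", "fps": 30}))
```

The point of the sketch is that with matching enabled, the base setting only matters for SDR content; HDR titles override it, which is exactly what HDMI 1 relies on and what HDMI 2 (below) fails to track.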

HDMI 2 has an additional trick up its sleeve: wide colour gamut (WCG). When this input is fed an HDR signal it auto-selects the HDR display mode, just as HDMI 1 does, but an extra display mode called WCG_HDR appears in the menu (it is not available on HDMI 1). When this mode is selected, a faint electric noise can be heard from the PJ as it engages the shutter that enables WCG output. For those not familiar with WCG, here is an overview: "Wide color gamut, or WCG, is often lumped in with HDR. While they're often found together, they're not intrinsically linked. Where HDR is an increase in the dynamic range of the picture (with brighter highlights in particular), WCG is an increase in color: "Redder" reds, "greener" greens, "bluer" blues and more. Both provide more of an impact on-screen than the resolution increase from high-def to 4K, but in different ways." The result is a truly stunning image in terms of colour, dynamic range and detail.

There is a downside to using HDMI 2, however, and herewith my pondering/question. As mentioned above, I set my Apple TV 4K's resolution to 1080p SDR and enable dynamic range and frame rate matching. HDMI 2 does not appear to detect these Apple TV output changes (resolution, dynamic range and frame rate) and will project native HDR content as 1080p SDR. This can be rectified by changing the Apple TV 4K's output to 4K HDR 60, whereupon HDR content is shown in its native resolution, which is just dandy. The problem is that with the Apple TV 4K's output set to 4K HDR, standard HD and lower-resolution content (YouTube videos, HD Netflix movies, Amazon Prime content, etc.) is now all "upscaled" to 4K HDR by the Apple TV, regardless of whether dynamic range and frame rate matching is selected. As the content lacks HDR metadata, the result is a toned-down colour palette that looks a bit washed out.

I've been living with it like this, as my main priority for video quality is HDR/Dolby Vision content, but there must be a solution. Whilst HDR content on HDMI 1 looks great, the WCG option on HDMI 2 takes HDR to another level, so I will stick with HDMI 2.

For those of you with displays (PJ or TV) that offer HDR with WCG, who use an Apple TV 4K and watch a combination of YouTube and other streaming services (i.e. a mixture of content with different resolutions, frame rates and colour ranges): what have you set your Apple TV 4K's video output to?

PS: Come to think of it, I have not tried setting the Apple TV's output to 4K SDR with dynamic range and frame rate matching turned on. I will do so tonight.

Remember:
  • Not all 4K TVs can do wide colour gamut.
  • Some TVs claim to be HDR and WCG compatible but can't actually display HDR or WCG. These TVs will accept an HDR/WCG signal from a 4K BD player or media streamer, but aren't capable of displaying colors beyond what a standard HDTV can display.
  • All current-generation OLED TVs (from LG and Sony) are capable of HDR and WCG.
  • Most TVs with quantum dots (i.e. what Samsung uses in their QLED TVs), are capable of the same.
  • Some other TV technologies, like Sony's Triluminos or LG's Nano cell TVs, can also display WCG.
  • Most (not all) HDR content includes WCG data. Basically all 4K Blu-rays have WCG.
 