WCG is simple to understand. When people talk about the advantages of Blu-ray they only ever mention the resolution bump, but a wider colour gamut came with that format too. So in the same way that SD used Rec.601 and HD used Rec.709, WCG refers to the colour gamuts that UHD uses (in this case, two): DCI-P3 and Rec.2020. The big plus with DCI-P3 is that it's the colour gamut the majority of films are natively presented in (both new films and modern restorations of old ones), so in that sense nothing is lost between the DI and the UHD. Rec.2020 is an even wider gamut than DCI-P3, but it's not commonly used (and no TV can display it fully yet).

andyli wrote: ↑Tue Aug 17, 2021 8:30 pm
Hi Eddie, your explanation is crystal clear and massively helpful. I do have an additional question that has haunted me for a long time. When people talk about a film getting graded in HDR, what are the colorists doing differently from when the film is done in SDR? I gather that in SDR you are still able to assign brightness, contrast, and specific colors to each scene (or each frame, if desired) and fine-tune the image to your heart's content. So what exactly does HDR bring to the table at the grading stage? Does grading on an HDR-capable display make people see each scanned image differently and assign another set of brightness, contrast and color values?
I guess what confuses me is that HDR and WCG often come hand in hand, and it's hard for the technically less informed to discern which benefits come from WCG and which are the result of HDR replacing SDR.
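To put a rough number on the WCG half of that question first: you can compare the areas of the gamut triangles in CIE 1931 xy chromaticity space, using the published xy primaries for each standard. Area in xy is only a crude proxy for perceived gamut size (coverage figures you see quoted are often computed in CIE u′v′ instead), but it shows the ordering clearly:

```python
# Rough comparison of gamut sizes via the area of each gamut's
# triangle in CIE 1931 xy chromaticity space.

def triangle_area(p1, p2, p3):
    """Shoelace formula for the area of a triangle given three (x, y) points."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

# Published xy chromaticity coordinates of the R, G, B primaries:
GAMUTS = {
    "Rec.709":  [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":   [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "Rec.2020": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

ref = triangle_area(*GAMUTS["Rec.709"])
for name, primaries in GAMUTS.items():
    area = triangle_area(*primaries)
    print(f"{name:9s} area = {area:.4f}  ({area / ref:.2f}x Rec.709)")
```

By this crude measure DCI-P3 comes out roughly a third larger than Rec.709, and Rec.2020 nearly double it.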
As for HDR, I went into some detail here. But it's the same principle as WCG really: just as WCG offers a wider colour space, PQ offers a wider dynamic range. I don't know enough about film restoration to discuss details, but ultimately when a colourist is grading "in HDR" they are simply managing the raw information from the OCN, or the raw digital files from the camera, in a suite that offers them these wider ranges. Then when running off an SDR version they have to fit their results into the narrower ranges of gamma and Rec.709. So as an example, this in HDR becomes this in SDR (taken from GeoffD's review of Batman 1989). In that example, the extra detail visible in HDR has not been "added"; it's there on the film element, but it cannot be seen within the confines of SDR.
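A toy illustration of that "confines of SDR" point: treat SDR as topping out at a nominal 100-nit reference white (an assumption for this sketch; a real SDR pass is tone-mapped with much more care, not hard-clipped), and any highlight gradation sitting above that ceiling collapses to a single value:

```python
SDR_PEAK_NITS = 100.0  # assumed SDR ceiling for this sketch

def to_sdr(nits):
    """Naive hard clip into the SDR range (real grading tone-maps instead)."""
    return min(nits, SDR_PEAK_NITS)

# Four distinct highlight levels present on the film element / HDR grade, in nits:
hdr_highlights = [80.0, 150.0, 400.0, 1000.0]
sdr_highlights = [to_sdr(n) for n in hdr_highlights]
print(sdr_highlights)  # -> [80.0, 100.0, 100.0, 100.0]
```

Three different highlight levels become one flat value: the detail wasn't removed from the element, it just can't be represented in the narrower output range.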
I think one of the reasons people have the impression that HDR is "adding" something on top of normal images is that their TVs switch to HDR mode when they receive an HDR signal, in the same way that a 3D set switches to 3D mode. But the only reason this happens is that TVs now have to deal with two transfer functions, not just one. In normal use a 4K HDR TV is set to gamma, and it only switches over to PQ when dealing with HDR signals. There was no equivalent to this with Blu-ray over DVD (as both still used gamma).
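To see why the set has to know which transfer function it's dealing with, here's a sketch decoding the same half-scale code value two ways: with a plain 2.4 power-law gamma scaled to an assumed 100-nit SDR display (both values are illustrative assumptions), and with the SMPTE ST 2084 PQ EOTF, which maps code values to absolute luminance:

```python
def gamma_eotf(v, peak_nits=100.0, gamma=2.4):
    """Simple power-law (SDR-style) decode, scaled to an assumed display peak."""
    return peak_nits * v ** gamma

def pq_eotf(v):
    """SMPTE ST 2084 (PQ) EOTF: normalised code value -> absolute nits."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    p = v ** (1 / m2)
    return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

v = 0.5  # the exact same signal level...
print(f"gamma 2.4 -> {gamma_eotf(v):6.1f} nits")  # roughly 19 nits
print(f"PQ        -> {pq_eotf(v):6.1f} nits")     # roughly 92 nits
```

Same code value, wildly different light output, which is why the TV must switch EOTFs when an HDR signal arrives rather than HDR "adding" anything to the picture itself.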