What is the correct rendering of images with PQ curve? #186
Hello,

I am confused about the different renderings of hdr_cosmos12920_cicp9-16-0_lossless.avif. When I open https://github.com/AOMediaCodec/av1-avif/raw/master/testFiles/Netflix/avif/hdr_cosmos12920_cicp9-16-0_lossless.avif in Firefox and in Google Chrome, I get different results (screenshots omitted). Which one is correct?

BTW, the same file in the eog viewer displays this way (screenshot omitted).
Going by the source PNGs, the output in Chrome/krita is correct.
Hmm... Now that I look closer, it's actually a bit confusing. @cconcolato
The Chrome/krita renders are what is expected. The PNG source data is simply a means to ensure you're receiving the correct RGB values: if you compare the converted RGB output of the AVIF (normalized to float) against the RGB values in the PNG (also normalized to float), you will get the same values, as in the sketch below. This is more of a decode sanity check than a "does this render match" check, as there's no official (guaranteed to be accepted everywhere) way to signal BT.2020 PQ in a PNG yet. (The README passage quoted here did not survive this copy.)
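To make the sanity check concrete, here is a minimal sketch of the comparison; it assumes the pixels have already been decoded, and the sample values below are hypothetical stand-ins, not taken from the actual test files:

```python
import numpy as np

# Hypothetical stand-in pixel values; in practice these would come from
# an AVIF decoder (10-bit samples here) and a PNG decoder (16-bit samples).
avif_px = np.array([520, 312, 775], dtype=np.uint16)
png_px = np.array([33312, 19987, 49649], dtype=np.uint16)

avif_norm = avif_px / 1023.0   # normalize 10-bit samples to [0, 1]
png_norm = png_px / 65535.0    # normalize 16-bit samples to [0, 1]

# The two should agree to within the rounding error introduced by the
# differing bit depths.
print(np.allclose(avif_norm, png_norm, atol=1.0 / 1023.0))  # True
```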
In any typical image viewer, if you try to view a PNG with no embedded color profile, it will simply assume the RGB values in there are sRGB, which will give you the wrong display in this situation. As a simple example, let's take the upper-left pixel of the source PNG (the screenshots showing the sampled values are not preserved in this copy): the normalized value of that pixel is the same one we get once we normalize the RGB values decoded from the AVIF. If I force the profile to be BT.2020 PQ, however, the display is correct, and the normalized value is unchanged. Once the HDR PNG tech linked above (a cICP chunk in PNG) is blessed and actually implemented by some browsers/viewers, perhaps I can jam such a chunk into these PNGs and they'll start looking correct in the same places HDR AVIFs look correct. For now, though, these PNGs simply serve to let you know that you've arrived back at the right RGB triplet, and nothing else.
Please correct me if I've misunderstood the situation. The majority of the apps above (except eog) convert the decoded data to a linear TRC via the following EOTF (the formula image is not preserved in this copy; it is the PQ EOTF). However, a monitor cannot display the whole range (0-10000 nits). If darktable and/or Firefox attempt to display the maximal RGB value, the monitor will not emit 10000 nits but only 80 (as sRGB suggests)? That's probably why the image looks dark. The following code is from Krita (the snippet is not preserved in this copy; a sketch of the same computation follows below). They also use the well-known PQ EOTF, but they multiply the result by the constant 125. Where does the 125 come from? Is it 10000 / 80 (maxPQnits / maxSRGBnits), or is it pure coincidence?
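For reference, a minimal sketch of the SMPTE ST 2084 (PQ) EOTF together with the ×125 scaling being asked about. This is a reconstruction, not Krita's actual code, and reading 125 as 10000/80 (the scRGB convention where 1.0 means 80 nits) is the interpretation the question proposes, not a confirmed answer:

```python
import numpy as np

# SMPTE ST 2084 (PQ) EOTF constants.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(signal):
    """Map a non-linear PQ signal in [0, 1] to linear light in [0, 1],
    where 1.0 corresponds to 10000 cd/m^2 (nits)."""
    e = np.asarray(signal, dtype=np.float64)
    p = e ** (1 / M2)
    return (np.maximum(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

# The scaling in question: if 1.0 in the working space means 80 nits
# (the nominal sRGB reference white), then multiplying the
# 10000-nit-normalized EOTF output by 10000 / 80 = 125 re-expresses it
# in those units.
def pq_to_linear_80nit_units(signal):
    return pq_eotf(signal) * 125.0
```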
What is the recommended way to handle values too bright for a typical sRGB display? Is it better to clip each of the R, G, B channels individually, or to normalize the triplet to the maximal possible brightness? The two options are sketched below.
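To make the two options concrete, a small sketch (an illustration of the question as posed, not a recommendation):

```python
import numpy as np

def clip_channels(rgb):
    """Option 1: clip each channel independently. Simple, but bright
    out-of-range colors can shift hue as channels saturate separately."""
    return np.clip(rgb, 0.0, 1.0)

def normalize_triplet(rgb):
    """Option 2: scale the whole triplet by its largest channel. Hue is
    preserved, at the cost of darkening the pixel overall."""
    peak = np.max(rgb)
    return rgb / peak if peak > 1.0 else rgb

overbright = np.array([1.8, 0.9, 0.3])
print(clip_channels(overbright))      # [1.0, 0.9, 0.3]    -- hue shifts
print(normalize_triplet(overbright))  # [1.0, 0.5, 0.1667] -- hue kept
```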
Let me check if there's a document that has recommendations for how to handle this.
The HDR folks recommend the following (the link is not preserved in this copy): You might also want to check out HDRTools. From what I've been told, it doesn't fully implement the document above, since that came later, but it should have good starting points.
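Since the linked document is not reproduced here, purely as an illustration of this family of techniques (and not necessarily what that document specifies), below is a simplified sketch of the highlight roll-off (EETF) from Report ITU-R BT.2390. It tone-maps in PQ space, assumes a full-range 0-10000-nit source, and omits the black-level lift:

```python
import numpy as np

# SMPTE ST 2084 (PQ) constants.
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_inverse_eotf(nits):
    """Encode absolute luminance (cd/m^2) as a normalized PQ signal."""
    y = np.asarray(nits, dtype=np.float64) / 10000.0
    return ((C1 + C2 * y**M1) / (1.0 + C3 * y**M1)) ** M2

def bt2390_eetf(e, display_peak_nits):
    """Hermite-spline highlight roll-off from BT.2390, applied to a
    normalized PQ signal 'e'. Values below the knee pass through
    unchanged; values above it are eased toward the display's peak.
    Assumes the display peak is below 10000 nits."""
    e = np.asarray(e, dtype=np.float64)
    max_lum = pq_inverse_eotf(display_peak_nits)  # display peak in PQ space
    ks = 1.5 * max_lum - 0.5                      # knee start
    t = np.clip((e - ks) / (1.0 - ks), 0.0, 1.0)
    spline = ((2 * t**3 - 3 * t**2 + 1) * ks
              + (t**3 - 2 * t**2 + t) * (1 - ks)
              + (-2 * t**3 + 3 * t**2) * max_lum)
    return np.where(e < ks, e, spline)
```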
Thank you Leo, that's very interesting material to study.