This forum has been archived. All content is frozen. Please use KDE Discuss instead.

Draw HDR images without having an HDR screen?

kaori (Registered Member):
Hello everyone.
My display doesn't support HDR rendering.
And yet, I want to draw HDR images. Is it possible?

Will I just be unable to see the very light / very dark details properly on my screen
(I'm okay with that, I just want to know that THEY ARE THERE),
or will ALL THE COLORS be messed up, shifted, whatever?

This HDR thingy is extremely confusing, to be honest.
halla (KDE Developer):
It is possible, and has been possible since 2005, but you'll be guessing at the result. Use the exposure slider in the LUT docker to draw at different exposure levels.
kaori (Registered Member):
Thank you.

Do I need to make any adjustments in "Settings - Configure Krita..." ?
There are 2 tabs related to HDR: "Display" and "Color Management".

1. In "Display" tab there are "HDR Settings", I can choose "Preferred Output Format" there.
Should I keep "sRGB (8 bit)" since my screen doesn't support HDR rendering?

2. What about "Color Management" tab?
In "General" I can choose "Default color model for new images".
I guess I can pick any of these:
- "RGB/Alpha (16-bit integer /channel)"
- "RGB/Alpha (16-bit float /channel)"
- "RGB/Alpha (32-bit float /channel)"

3. There are also "Display" settings in "Color Management" tab.
"Rendering intent": "Perceptual"
"Screen 1": ICC profile can be either ACES, or Rec2020, is that correct?
halla (KDE Developer):
Since you don't have a hdr screen, all display options are irrelevant. You need to choose either a 32 or a 16 bits float colorspace to work in.
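A minimal sketch of why the float requirement matters (plain Python, nothing Krita-specific; the function names are invented for illustration). In integer color spaces the channel range is normalized so that the largest storable value is "paper white"; anything brighter is clipped, while float channels have no such ceiling:

```python
import struct

def store_u16(value):
    """Encode a scene value into a 16-bit integer channel (clips at white)."""
    clipped = min(max(value, 0.0), 1.0)   # everything above 1.0 is lost
    return round(clipped * 65535)

def load_u16(raw):
    """Decode a 16-bit integer channel back to a normalized value."""
    return raw / 65535

def store_f16(value):
    """Round-trip a value through a half-float channel; >1.0 survives."""
    return struct.unpack('<e', struct.pack('<e', value))[0]

sun_glint = 4.0                        # two stops brighter than paper white
print(load_u16(store_u16(sun_glint)))  # 1.0 -> highlight detail destroyed
print(store_f16(sun_glint))            # 4.0 -> detail preserved
```

That clipping is exactly the "I just want to know that THEY ARE THERE" problem: with integer channels the out-of-range detail is gone from the file, not merely invisible on the monitor.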
kaori (Registered Member):
boudewijn wrote:Since you don't have a hdr screen, all display options are irrelevant.


"Configure Krita - Display" is about my screen settings (meaning the physical device).
"Configure Krita - Color Management - Display" is about how colors encoded in my image (the color space of the image).
I can choose any color space I want for the image.

Isn't that correct?
Lynx3d (Registered Member):
No, all those display settings control how colors are converted for your display (hardly surprising?); choosing anything that does not match your display's color space will simply yield wrong colors.

How colors are encoded in your image is defined by the image color space that you choose when creating an image, or when using Image -> Convert Image Color Space...
kaori (Registered Member):
Lynx3d wrote:No, all those display settings control how colors are converted for your display (hardly surprising?)

I thought
"Configure Krita - Display" was the settings of my screen (the physical device), and
"Configure Krita - Color Management - Display" was the info encoded in images
(about how those images should be displayed on any device, not only mine).

Thank you very much for your clarification.

Lynx3d wrote:How colors are encoded in your image is defined by the image color space that you choose when creating an image,

When creating a new document in Krita, I set the "Dimensions" tab this way:
Model: RGB/Alpha
Depth: 16-bit integer /channel
Profile: Rec2020-elle-V4-g10.icc

However, I'm not able to set the "Background Color" whiter than 8-bit white in the "Content" tab.
Even if I manually type "65535", these fields keep the value "255" for some reason:
Red: 255
Green: 255
Blue: 255
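(I can't confirm what Krita's new-document dialog does internally, but a plausible reading is that its color fields are simply 8-bit, so "255" already means maximal white, whatever the image depth; the standard exact 8-to-16-bit scaling then maps it to 65535. A tiny sketch of that mapping:)

```python
def u8_to_u16(v8):
    """Exact 8-bit -> 16-bit integer channel scaling: multiply by 257,
    which replicates the byte (0xFF -> 0xFFFF), so 255 lands on 65535."""
    return v8 * 257

print(u8_to_u16(255))  # 65535
print(u8_to_u16(128))  # 32896
```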
halla (KDE Developer):
Like I said before ("You need to choose either a 32 or a 16 bits float colorspace to work in."), "Depth: 16-bit integer /channel" does not give you a HDR enabled colorspace.
kaori (Registered Member):
Thank you.

boudewijn wrote:Like I said before

My bad, I wasn't attentive enough; I'm sorry.

I've read this doc:
https://docs.krita.org/en/general_conce ... depth.html
I hope I've understood it correctly:

16-bit integer doesn't give brighter whites or deeper blacks.
It's used for smooth gradients, not for HDR.
Precision per channel: 16 bits

16-bit float on the other hand, does actually have HDR
at the cost of less smooth gradients (compared to 16-bit integer).
Precision per channel: 10-11 bits

32-bit float has the smoothest gradients and the widest dynamic range.
(However, according to https://docs.krita.org/en/general_conce ... kflow.html "Some graphics cards, such as those of the NVidia-brand actually have the best performance under 16bit float, because NVidia cards convert to floating point internally. When it does not need to do that, it speeds up!")
Precision per channel: 23-24 bits
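Those precision figures can be checked from Python's standard library, which can pack IEEE half- and single-precision floats (a small sketch, nothing Krita-specific). Near 1.0, half precision distinguishes steps of 2**-10 (about 11 bits of precision), single precision steps of 2**-23 (about 24 bits):

```python
import struct

def roundtrip(fmt, x):
    """Round-trip x through a packed IEEE float ('e' = half, 'f' = single)."""
    return struct.unpack('<' + fmt, struct.pack('<' + fmt, x))[0]

# Half precision near 1.0: a 2**-10 step survives, a 2**-11 step rounds away.
print(roundtrip('e', 1.0 + 2**-10) > 1.0)   # True
print(roundtrip('e', 1.0 + 2**-11) == 1.0)  # True

# Single precision near 1.0: a 2**-23 step survives, a 2**-24 step rounds away.
print(roundtrip('f', 1.0 + 2**-23) > 1.0)   # True
print(roundtrip('f', 1.0 + 2**-24) == 1.0)  # True
```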

