
Request: 10bpc and 12bpc options in the color picker.

JTF195 (Registered Member):
I was wondering if it might be possible to add 10bpc and 12bpc color support to the color picker - not 10 and 12 bit color-managed documents (unless you want to :P )

Specifically, options in the dropdown to allow picking color values ranging from 0 to 1023 and from 0 to 4095 respectively.
Thankfully Krita has already implemented the 16bpc values option in the color picker, so it may not be terribly difficult to add those ranges as well.
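
For reference, moving a value between those ranges is just a linear rescale. Here's a rough sketch of the math (Python purely for illustration; none of this is Krita code):

Code:
def rescale(value, from_bits, to_bits):
    # Map an integer color value from one bit depth to another,
    # e.g. 10-bit (0-1023) to 16-bit (0-65535).
    from_max = (1 << from_bits) - 1
    to_max = (1 << to_bits) - 1
    return round(value * to_max / from_max)

print(rescale(512, 10, 16))   # 10-bit mid grey -> 32800 in 16-bit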

The general idea behind this is that various broadcast standards and displays support 10- and 12-bit color, so it makes sense for the color picker to be able to limit the palette to those ranges rather than the more granular 16bpc range (0 to 65535).

I made a heavily edited proof of concept quite a long time ago, back when I was contemplating adding such an enhancement to Photoshop (which I never did).
I have it hosted on my personal website here:
https://www.rapturewerks.net/colorpicker

If my website happens to be offline, you can download the files here:
http://puu.sh/pEL6a/31a233fed1.zip

Edit: To further clarify, you would still create a 16-bit document, and would still be able to use the full 16-bit range.
I'm asking for the option to make the color picker round to 10 or 12 bit increments.
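
In code terms, the rounding I have in mind would be roughly this (a hypothetical sketch, just to pin down what I mean):

Code:
def snap_to_bits(value16, bits):
    # Round a 16-bit value (0-65535) to the nearest level that
    # survives a round trip through a 10- or 12-bit code.
    steps = (1 << bits) - 1
    code = round(value16 * steps / 65535)   # nearest 10/12-bit code
    return round(code * 65535 / steps)      # back on the 16-bit scale

print(snap_to_bits(40000, 10))   # -> 39974, the nearest 10-bit step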
TheraHedwig (KDE Developer):
I have mentioned this on IRC, but I'll mention it here too: this is more difficult to implement than you'd think, due to how the color space and the selectors tie together.

16-bit float has a precision of 10 bits, meaning that across the 0 to 1.0 range (that's screen black to screen white) you have ~1024 values. Coincidentally (or perhaps not?), 16-bit float is also the bit depth given to us by OpenEXR and OCIO, two components that, as you may know, are part of common VFX tools, which are designed to produce for those broadcasting standards. It might thus be better to work in 16-bit float and set the specific color selector to 16-bit float. If you aren't changing the range of visible values in the LUT docker, that should give 10-bit output. It also gives the advantage of exporting to OpenEXR to further process your work elsewhere for 10-bit output.
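
(If you want to see that 10-bit precision for yourself, a quick numpy check, just as an illustration:)

Code:
import numpy as np

# Half floats carry a 10-bit mantissa, so the step between
# adjacent values around 1.0 is 2**-10 = 1/1024: roughly 1024
# distinguishable levels at the top of the 0..1 range.
print(np.finfo(np.float16).nmant)      # 10 mantissa bits
print(np.spacing(np.float16(1.0)))     # ~0.000977, i.e. 1/1024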

For Krita, the next step in supporting this workflow better is to find our maintainer a card that can produce 10bit output, as we have already obtained a monitor capable of showing 10bit color. We can then make sure that Krita is able to output to a 10bit display (which it theoretically can already; we just haven't seen it with our own eyes).
JTF195 (Registered Member):
It isn't really necessary to work in 10-bit integer values, since displays check the LUT and use the nearest displayable color anyway, but it's a quality-of-life thing: integers are easier to process mentally.

The implementation I was thinking of was to simply "display" 10-bit integer values in the color picker's sliders and input boxes, and then internally translate them to 16-bit int or 16/32-bit float, or whatever. It would be totally hacky and half-assed, and upon further reflection has no place in a professional-quality project.

Instead, I guess I'll simply use my JavaScript color picker, which converts between the different bit depths when a new one is selected from the dropdown, and then enter the resulting values into Krita's.

Anyway, if I'm understanding correctly, the high-level goal is fully 10-bit documents?

I do also have a native 10-bit panel monitor and GPU (Nvidia Quadro K620s are pretty cheap, FYI), so I'd be willing to test any 10-bit stuff you happen to add.
TheraHedwig (KDE Developer):
Well, we're gonna have Python scripting in the coming year (it was funded by the Kickstarter), and doing at least the new iteration of the palette docker with the new Python API (we'll use PyQt to make the widgets accessible) is one of the things we'd like to do. So yes, making your own little docker using the same API should probably be trivial.
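
Your 10-bit picker could then just be a small PyQt widget. Purely as a hypothetical sketch (the scripting API isn't finished, so nothing here is the real Krita interface):

Code:
from PyQt5.QtWidgets import QWidget, QVBoxLayout, QSpinBox

class TenBitPickerWidget(QWidget):
    # Toy widget exposing a 0-1023 spinbox; a real docker would
    # convert the chosen code to the document's bit depth before
    # handing it over through whatever the final API provides.
    def __init__(self, parent=None):
        super().__init__(parent)
        layout = QVBoxLayout(self)
        self.code = QSpinBox(self)
        self.code.setRange(0, 1023)   # 10-bit integer range
        layout.addWidget(self.code)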

I'm surprised though; I'm better at parsing floats than at integers :p

You can already make a 16-bit float document, or a 32-bit float document. And open EXRs. And use OCIO with the LUT management docker. So I am not sure what you mean by "high level goal"?

Technically, our 10bit stuff is already supposed to work; the problem is that we ourselves can't test it. Ideally, what you'd do is make a 16-bit float file, draw a black-to-white gradient, and apply a Gaussian blur (because our gradient tool was designed 10+ years ago for 8-bit and only does 256 stops; we need to rewrite it for this type of stuff, but it's been difficult to find time). Then check if it looks smoother than a regular 8-bit gradient.
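
(If the gradient tool gets in the way, you could also generate the test image outside Krita; a hypothetical numpy/Pillow sketch:)

Code:
import numpy as np
from PIL import Image

# A 16-bit grayscale ramp with 4096 distinct stops across its
# width: far more than the 256 stops an 8-bit gradient holds.
w, h = 4096, 256
ramp = np.linspace(0, 65535, w, dtype=np.uint16)
img = np.tile(ramp, (h, 1))
Image.frombytes("I;16", (w, h), img.tobytes()).save("gradient16.png")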
JTF195 (Registered Member):
What I meant by 'high level goal' is: are there any plans for creating documents with internal bit depths of 10 or 12? I guess I should assume not; your wording confused me a little bit.

I'm not going to pretend I understand half of what you're talking about. I haven't actually even been working on graphics-related projects lately. But finding out about the existence of a really nice FOSS Photoshop replacement has piqued my interest a bit, so I may eventually take a stab at figuring some of that out.

Here are a few 16bit gradients you're welcome to use to test stuff: http://puu.sh/pKrub/c8370487f2.zip
I've tested them with my 10bit GPU+display, so I know they work.
TheraHedwig (KDE Developer):
JTF195 wrote: What I meant by 'high level goal' is: are there any plans for creating documents with internal bit depths of 10 or 12? I guess I should assume not; your wording confused me a little bit.


No, there aren't. But the 16-bit float stuff should be sufficient for making those documents: in the same manner as you would usually make a high-resolution image and scale it down for the internet, you can make an image with high colour resolution and then use some conversion tool later to optimise it for 10-bit, benefiting from dithering or something to produce smoother colour gradients :)
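
That quantise step could look something like this (a rough sketch, not anything that exists in Krita):

Code:
import numpy as np

def dither_to_10bit(img16):
    # Scale 16-bit samples (0-65535) down to 10-bit codes (0-1023),
    # adding ~1 LSB of random noise before rounding so that banding
    # turns into unobtrusive grain instead of visible steps.
    noise = np.random.uniform(-0.5, 0.5, img16.shape)
    codes = np.round(img16 / 65535 * 1023 + noise)
    return np.clip(codes, 0, 1023).astype(np.uint16)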


JTF195 wrote: I'm not going to pretend I understand half of what you're talking about. I haven't actually even been working on graphics-related projects lately. But finding out about the existence of a really nice FOSS Photoshop replacement has piqued my interest a bit, so I may eventually take a stab at figuring some of that out.


Ah, sorry, I assumed things because of the 10bit screen. Basically, they're a bunch of widely adopted libraries that you can use for authoring and saving high-bit-depth content.

JTF195 wrote: Here are a few 16bit gradients you're welcome to use to test stuff: http://puu.sh/pKrub/c8370487f2.zip
I've tested them with my 10bit GPU+display, so I know they work.


I was hoping you could tell me whether or not it looks banded (as if it were an 8-bit gradient) when you open those up in Krita and look at them on your 10bpc monitor :)
JTF195 (Registered Member):
Alright. I hooked my 10bit GPU and display back up and did some testing.

I verified that the images show no banding in Photoshop with the proper settings.
(Incidentally, the quickest way to test this is by using the eyedropper tool on the image, because it temporarily toggles to 8bpc while you're clicking it.)

Unfortunately, I've been unable to convince Krita to display the images without banding.

Let me know if I can be of any more assistance.
halla (KDE Developer):
I was afraid that might have gotten broken. What GPU do you have? I'm looking to buy a replacement for my existing Nvidia card so I can actually check myself and try to fix the issue. If you can build Krita yourself and know C++ and OpenGL, you could of course help by fixing the issue. It's likely something very simple, since it used to work... (At least on Linux, which is where Kai-Uwe tested it.)

Edit to add: http://www.oyranos.org/2014/05/image-ed ... -monitors/ -- it worked in 2014...
JTF195 (Registered Member):
I mentioned above that I have an Nvidia Quadro K620.

I'm also running Windows 10 Insider Preview build 14376 and Nvidia driver 368.39

I had to remove it from my computer again, because it doesn't play nicely with my motherboard's firmware and my GeForce card.

I do know a tiny bit of C++, but I doubt it's enough that I'd be able to help.
halla (KDE Developer):
Ah, so you had both cards in your computer at the same time?
JTF195 (Registered Member):
I tested with just the Quadro and with both GPUs. They're fine together as long as the GeForce and Quadro drivers are the same version.
You just have to have the 10bit display hooked up to the Quadro (obviously), and check in the drivers for the setting that changes which GPU is used for OpenGL. I set it globally to my GeForce card (because it's way more powerful) and then set it to the Quadro for specific 10bit programs.

The problem was something to do with my motherboard initializing them both with CSM disabled in the firmware.
It wouldn't POST if I had any displays connected to the GeForce card, so I just removed the Quadro card for now.
halla (KDE Developer):
Just checking: given that we've got a Lenovo ThinkStation, would this card work and let me test and fix the 10-bit display support?

https://www.centralpoint.nl/videokaarte ... m-4153641/
JTF195 (Registered Member):
Yeah, that's the card I have. You will need to connect the display with a DisplayPort cable, though.

