Author Topic: Time to ditch sRGB/Linear as default (?)  (Read 31247 times)

2019-02-13, 13:52:14
Reply #195

sprayer

  • Active Users
  • Posts: 542
But monitors have dynamic contrast, which should guarantee color quality at any brightness. So even at low brightness the monitor can show the full color spectrum; colors only get clipped if I change the brightness in the video card driver. For sRGB mode on my monitor model, the recommended brightness with a calibrator was 2.

But my point was that HDR content creators can't use HDR1000 for work; it's too bright and harmful to the eyes. And how is this content supposed to work on monitors with only 8 bits of real color? I don't count the fake extra 2 bits of FRC, since that's just like dithering in Photoshop.
Also, 8-bit video looks dull in HDR, but I can change the player settings so it looks very similar to 10-bit.
So I'm thinking it's more marketing than a real advantage in color. Maybe for video it gives more colors despite the compression, but in CG we render every frame in full color, so our frames should already have more colors than these fake HDR videos.
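
For what it's worth, 8-bit + FRC really is conceptually the same trick as dithering: the panel flickers each pixel between two neighbouring 8-bit levels so that, averaged over a few frames, the eye sees an in-between 10-bit level. A rough sketch of the idea (plain Python, values only for illustration, not how any particular panel actually implements it):

```python
import random

def frc_8bit(target_10bit, frames=64):
    """Show a 10-bit level on an 8-bit panel by temporal dithering (the
    idea behind "8-bit + FRC"): each frame displays one of the two nearest
    8-bit levels, chosen so the average over time matches the target."""
    low = target_10bit // 4              # nearest 8-bit level below (10-bit -> 8-bit is /4)
    high = min(low + 1, 255)             # next 8-bit level up
    frac = (target_10bit % 4) / 4.0      # how far the target sits between the two levels
    shown = [high if random.random() < frac else low for _ in range(frames)]
    return sum(shown) / frames           # what the eye integrates over a few frames

target = 513                             # a 10-bit level with no exact 8-bit equivalent
print("plain 8-bit:", target // 4)               # 128 -> visible banding step
print("8-bit + FRC average:", frc_8bit(target))  # ~128.25, i.e. 513 / 4
```

Whether that temporal averaging looks as good as a true 10-bit panel is exactly what is being argued here.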

2019-02-13, 14:04:45
Reply #196

Juraj Talcik

  • Active Users
  • Posts: 3425
  • Tinkering away
There is also the current issue of 10-bit output being restricted to professional cards, regardless of the physical input on the monitor. nVidia gonna nVidia. I wonder if HDR will force them to reconsider this stupid policy.
talcikdemovicova.com  Website and blog
be.net/jurajtalcik    My Behance portfolio
lysfaere.com   Something new

2019-02-14, 16:32:05
Reply #197

jpjapers

  • Active Users
  • Posts: 1068
Quote from: sprayer on 2019-02-13, 13:52:14
But monitors have dynamic contrast, which should guarantee color quality at any brightness. So even at low brightness the monitor can show the full color spectrum; colors only get clipped if I change the brightness in the video card driver. For sRGB mode on my monitor model, the recommended brightness with a calibrator was 2.

But my point was that HDR content creators can't use HDR1000 for work; it's too bright and harmful to the eyes. And how is this content supposed to work on monitors with only 8 bits of real color? I don't count the fake extra 2 bits of FRC, since that's just like dithering in Photoshop.
Also, 8-bit video looks dull in HDR, but I can change the player settings so it looks very similar to 10-bit.
So I'm thinking it's more marketing than a real advantage in color. Maybe for video it gives more colors despite the compression, but in CG we render every frame in full color, so our frames should already have more colors than these fake HDR videos.

When you talk about colour, though, you actually mean chromaticity, which doesn't take luminance into account. By reducing the monitor brightness you're restricting the luminance values it can represent, and therefore restricting the colours available on your shiny new HDR monitor.
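
To put numbers on that: in CIE xyY, chromaticity (x, y) is what remains after the luminance has been divided out, so dimming a colour keeps its chromaticity but still gives you a different colour. A quick sketch (plain Python, standard sRGB/D65 matrix; the values are just for illustration):

```python
def linear_srgb_to_xyY(r, g, b):
    """Convert linear sRGB (0..1) to CIE xyY: (x, y) is the chromaticity,
    Y is the luminance. Uses the standard sRGB / D65 matrix."""
    X = 0.4124 * r + 0.3576 * g + 0.1805 * b
    Y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    Z = 0.0193 * r + 0.1192 * g + 0.9505 * b
    s = X + Y + Z
    return (X / s, Y / s, Y)

bright_red = linear_srgb_to_xyY(1.0, 0.2, 0.2)
dim_red    = linear_srgb_to_xyY(0.25, 0.05, 0.05)  # same RGB ratios at 1/4 the level

print(bright_red)   # (x, y, Y)
print(dim_red)      # identical (x, y) chromaticity, but Y is 4x lower
```

Same (x, y), a quarter of the Y: identical chromaticity, different colour, which is exactly the distinction being made above.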

2019-02-21, 09:20:27
Reply #198

IsmaeL

  • Active Users
  • Posts: 44
Quote from: Juraj Talcik on 2019-02-13, 14:04:45
There is also the current issue of 10-bit output being restricted to professional cards, regardless of the physical input on the monitor. nVidia gonna nVidia. I wonder if HDR will force them to reconsider this stupid policy.

This has changed. As you mentioned, HDR forces them to output at a higher bit depth.
But they are not stupid XD: it seems you only get 10-bit when playing games, so basically it only works when launching DirectX.

So yes, you still need a Quadro.
Falling in love with Corona

2019-03-14, 13:44:51
Reply #199

Juraj Talcik

  • Active Users
  • Posts: 3425
  • Tinkering away
You're right! I think that is how the policy exception has worked for quite a while. I believe it requires both DirectX and fullscreen at the same time, or something like that, for GTX cards.
talcikdemovicova.com  Website and blog
be.net/jurajtalcik    My Behance portfolio
lysfaere.com   Something new