Author Topic: Time to ditch sRGB/Linear as default (?)  (Read 33228 times)

2019-02-11, 13:33:20
Reply #180

sprayer

  • Active Users
  • **
  • Posts: 564
    • View Profile
Juraj Talcik
You can still apply Camera Raw to 32-bit images via the Merge to HDR dialog, but it will convert them to 16-bit, and you need to enable this feature in the preferences.
They removed the old behavior because, as they said, the results differed too much between the Camera Raw window and the final image.

Affinity is good, I don't have any problems with its stability, but it works a bit differently when editing masks (applying levels), for example; it takes some time to learn. It also still lacks smart layers for every filter, and I need smart layers for making templates with a warp tool like in Photoshop.
But Photoshop has many problems too: how many years did they torture users with the Ctrl+Z command, only fixing it in the latest version? They still can't add saving 32-bit full float for the EXR and HDR formats, and I'm also curious why the same tools work differently in other Adobe software like Illustrator.
So I'm dreaming of leaving Photoshop for something like Affinity or Krita.
« Last Edit: 2019-02-11, 13:37:29 by sprayer »

2019-02-11, 15:43:13
Reply #181

Juraj Talcik

  • Active Users
  • **
  • Posts: 3492
  • Tinkering away
    • View Profile
    • studio website
Yeah, the whole thing (Camera Raw) is crafty, which is why I lament every time that Adobe just couldn't make it fully functional. It would be a blessing to have it work on linear files without any hassle, as that would make the post-pro workflow absolutely identical to photography: one smooth process without the unnecessary division between "big" changes in linear 32-bit and "small" changes after.

Now, you originally mentioned Affinity, and I have to contend that it isn't as good as it appears. We bought two licences but almost kind of regret it: while it is a lot more ambitious (much better 32-bit support, simultaneous layer adjustments (super good for textures), ...), it's just not stable enough. At the moment I am fully back in Photoshop for almost everything. It's not just habit... it's the same story as with 3dsMax: it might not be ideal, but it's still the best choice.


Cheers for the info on Affinity; stability is a concern, so I might wait it out. Another thing that matters to me is that I use Lightroom for my photography, so the Adobe suite suits my pipeline, as do the terrific categorizing/tagging features and the similarities between Adobe tools.

Regarding what you said about it not being ideal but still the best choice: I see you did a test of the ACES workflow using DaVinci with dubcat. Out of curiosity, did you adopt this workflow? Are you working in a higher-gamut space or just sRGB in post? I ask because I have read the tech papers and done some tests of my own. I also work in the Adobe RGB (1998) space for my photography on a wide-gamut monitor for print (output to sRGB for screen). However, for 3D archviz I just don't see any major advantage currently, mostly because Photoshop is the main post-production tool (with the limitations we spoke about), it's easy to swap PSD files with other studios, and if everything is kept in sRGB (even though it's a small gamut, it still looks great, even with photography) there's no confusion along the way about input, output, space conversion, color matching, etc. I won't go into print for now; I'm just curious about your current view on the topic.

 edit: Not to mention I won't be remastering any of my old work, which I doubt many people do in archviz.

The stability might be fine for most, but I tried it mainly for 32-bit work (HDR editing of large files), and that is where it wasn't that great.

No, I never even tried DaVinci Resolve; I simply can't force myself to adopt a node-based approach to post-pro.

I use an sRGB-clamped monitor because neither 3dsMax nor Corona is color managed, and I can't be bothered to set up correct colors in Photoshop alone; I spend 99% of my time in 3dsMax. I might switch to wide gamut when at least Corona becomes color managed; without that, it's insanity to do so, imho.
I also edit my photos in AdobeRGB via Camera Raw in either PS or Lightroom and only clamp when saving for web, but I do this while my display is still clamped to sRGB, so I only get the advantage of some mixing headroom without really being exposed to the wider spectrum. Switching to and managing a wide-gamut pipeline was stressing me, so I gave preference to the lowest common denominator, sRGB, because of my CGI work.

(I've been advocating for color management in 3dsMax and/or Corona for years here, but that has been rather slow to get traction. I see it was potentially moved to version 5.0, so maybe next year? Unless 3dsMax comes with it sooner. Honestly, they have to: soon everyone will be using DCI-P3, since AdobeRGB is dead and HDR will become a common space, so color management must come.)

Juraj Talcik
You can still apply Camera Raw to 32-bit images via the Merge to HDR dialog, but it will convert them to 16-bit, and you need to enable this feature in the preferences.
They removed the old behavior because, as they said, the results differed too much between the Camera Raw window and the final image.


Oh, I know there are a few workarounds, but... it's not ideal :- ). The one I mentioned is "Open As", which will preserve the dynamic range without clamping it. But both are shit solutions.
talcikdemovicova.com  Website and blog
be.net/jurajtalcik    My Behance portfolio
lysfaere.com   Something new

2019-02-11, 16:26:00
Reply #182

lupaz

  • Active Users
  • **
  • Posts: 275
    • View Profile
AdobeRGB is dead and HDR will become a common space; color management must come.

Wow. You're right! Once displays become HDR we'll need to change our workflow. Massive change. I didn't even think about it.
It may take...what?...10 years though?

2019-02-11, 18:32:23
Reply #183

Juraj Talcik

  • Active Users
  • **
  • Posts: 3492
  • Tinkering away
    • View Profile
    • studio website
I am still very ignorant of the whole HDR thing, especially since the standards (both software and hardware) are still ever-changing these days. But I believe it will take much less than 10 years; after all, all the TVs have it, even the cheapest ones. And can you even buy a classic SDR prosumer display anymore (the consumer/professional intersection like the Dell UltraSharp, etc., anything short of NEC/EIZO)? All the monitors coming to market right now are some kind of HDR 144 Hz PVA panels oriented toward gaming and media; they're like 99% of what you can buy new. From next year on, I presume the majority of cell phones will be enabled as well.


But oh boy, I believe it will be a massive revolution when it becomes widespread across all industries. Apparently, right now SDR content looks terrible in HDR mode, and vice versa; how HDR content looks on an SDR display we've known for years :- ).
So if someone wanted to jump on the hype train for archviz right now, what would it look like in practice? Double the post-production?
Does anyone already do it in some form for their clients? I can imagine some real-estate people showcasing such content on top-grade TVs in their showrooms to wow the clients. HDR content on a 100" 8K TV sounds more impressive to me than nausea-inducing VR.
« Last Edit: 2019-02-11, 18:35:59 by Juraj Talcik »
talcikdemovicova.com  Website and blog
be.net/jurajtalcik    My Behance portfolio
lysfaere.com   Something new

2019-02-11, 19:17:27
Reply #184

lupaz

  • Active Users
  • **
  • Posts: 275
    • View Profile
For sure.
I wonder how it'll affect our eyes too! Much more light going into our retinas :S

2019-02-12, 02:15:55
Reply #185

Fluss

  • Active Users
  • **
  • Posts: 408
    • View Profile
From next year on, I presume the majority of cell phones will be enabled as well.
Basically, every smartphone with an AMOLED display is already compatible these days. The Galaxy S8 has been able to display HDR content since 2017. So this is starting to be pretty common and widespread.

But oh boy, I believe it will be a massive revolution when it becomes widespread across all industries. Apparently, right now SDR content looks terrible in HDR mode, and vice versa; how HDR content looks on an SDR display we've known for years :- ).
So if someone wanted to jump on the hype train for archviz right now, what would it look like in practice? Double the post-production?
Does anyone already do it in some form for their clients? I can imagine some real-estate people showcasing such content on top-grade TVs in their showrooms to wow the clients. HDR content on a 100" 8K TV sounds more impressive to me than nausea-inducing VR.

That's actually the whole point of ACES! Keep scene-referred data all along the pipeline and work in a wide-gamut space so you can deliver to whatever display-referred space is needed. So basically, you select your display transform from a dropdown list and deliver to the intended platform. You can already (kind of) do that in any serious post-production package that supports OCIO.

The only issue is that most renderers use sRGB primaries, which can cause some discrepancies during the conversion from linear sRGB to the intended working space (mostly AP1 for us; this is the space that defines the ACEScg gamut and encompasses the Rec. 2020 one). So it would be better to render straight into ACEScg from scratch. As a renderer is almost colorspace-agnostic, you should already be able to do so, except for spectrum-related stuff (everything driven by Kelvin temperature), because those correspond to defined RGB triplets in the targeted colorspace (6500 K / D65 is the white point for sRGB, for example, but ACEScg has a D60 white point).
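For reference, the linear-sRGB to ACEScg conversion discussed here boils down to a 3x3 matrix multiply (the matrix folds in the Bradford adaptation for the D65 -> D60 white-point shift). A minimal sketch; the coefficients below are the commonly published ones and should be treated as approximate:

```python
# Approximate linear-sRGB (D65) -> ACEScg (AP1, D60) matrix,
# Bradford-adapted; coefficients as commonly published.
SRGB_TO_ACESCG = [
    [0.61319, 0.33951, 0.04737],
    [0.07021, 0.91634, 0.01345],
    [0.02062, 0.10957, 0.86961],
]

def srgb_linear_to_acescg(rgb):
    """Convert a linear-sRGB triplet to ACEScg via a 3x3 matrix multiply."""
    return [sum(m * c for m, c in zip(row, rgb)) for row in SRGB_TO_ACESCG]

# Reference white stays (approximately) white after the D65 -> D60 adaptation:
print(srgb_linear_to_acescg([1.0, 1.0, 1.0]))  # ~[1.0, 1.0, 1.0]
# A pure sRGB red lands inside the wider AP1 gamut,
# picking up small green/blue components:
print(srgb_linear_to_acescg([1.0, 0.0, 0.0]))  # [0.61319, 0.07021, 0.02062]
```

Note that gamut-mapping textures and spectral data (the Kelvin-driven stuff mentioned above) is a separate problem; this only covers the per-pixel primaries conversion.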

The real issue here is all that HDR shit, to be honest. Every manufacturer applies a whole load of post-effects to make the image "look better" without any respect for the initial vision of the content creator. The only thing we should take from that technology is the wider gamut; it should not have any impact on the dynamic range of the displayed medium (software-wise). All the stuff they add on top of that is a massive pile of shit, and a lot of film producers are starting to raise their voices against those marketing trends.
« Last Edit: 2019-02-12, 09:29:15 by Fluss »

2019-02-12, 11:48:47
Reply #186

jpjapers

  • Active Users
  • **
  • Posts: 1122
    • View Profile
From next year on, I presume the majority of cell phones will be enabled as well.
Basically, every smartphone with an AMOLED display is already compatible these days. The Galaxy S8 has been able to display HDR content since 2017. So this is starting to be pretty common and widespread.

But oh boy, I believe it will be a massive revolution when it becomes widespread across all industries. Apparently, right now SDR content looks terrible in HDR mode, and vice versa; how HDR content looks on an SDR display we've known for years :- ).
So if someone wanted to jump on the hype train for archviz right now, what would it look like in practice? Double the post-production?
Does anyone already do it in some form for their clients? I can imagine some real-estate people showcasing such content on top-grade TVs in their showrooms to wow the clients. HDR content on a 100" 8K TV sounds more impressive to me than nausea-inducing VR.

That's actually the whole point of ACES! Keep scene-referred data all along the pipeline and work in a wide-gamut space so you can deliver to whatever display-referred space is needed. So basically, you select your display transform from a dropdown list and deliver to the intended platform. You can already (kind of) do that in any serious post-production package that supports OCIO.

The only issue is that most renderers use sRGB primaries, which can cause some discrepancies during the conversion from linear sRGB to the intended working space (mostly AP1 for us; this is the space that defines the ACEScg gamut and encompasses the Rec. 2020 one). So it would be better to render straight into ACEScg from scratch. As a renderer is almost colorspace-agnostic, you should already be able to do so, except for spectrum-related stuff (everything driven by Kelvin temperature), because those correspond to defined RGB triplets in the targeted colorspace (6500 K / D65 is the white point for sRGB, for example, but ACEScg has a D60 white point).

The real issue here is all that HDR shit, to be honest. Every manufacturer applies a whole load of post-effects to make the image "look better" without any respect for the initial vision of the content creator. The only thing we should take from that technology is the wider gamut; it should not have any impact on the dynamic range of the displayed medium (software-wise). All the stuff they add on top of that is a massive pile of shit, and a lot of film producers are starting to raise their voices against those marketing trends.

So would the solution be implementing colour management in Corona and Max and rendering straight to ACEScg? I didn't realise that even certified HDR displays were still adding lots of processing on top, but I guess it's obvious when I think about it. I suppose it's to compensate for poor panels, or for discrepancies across a panel batch, so they all look about the same?

2019-02-12, 12:43:32
Reply #187

Juraj Talcik

  • Active Users
  • **
  • Posts: 3492
  • Tinkering away
    • View Profile
    • studio website
I think stuff like that will become more unified once the standards converge. And it isn't any different from people using "Vivid" mode on their display/TV right now.

The only people seeing the image as the creators (us) intended... are, well, us :- ). Just something to live with.

talcikdemovicova.com  Website and blog
be.net/jurajtalcik    My Behance portfolio
lysfaere.com   Something new

2019-02-12, 13:06:54
Reply #188

jpjapers

  • Active Users
  • **
  • Posts: 1122
    • View Profile
I think stuff like that will become more unified once the standards converge. And it isn't any different from people using "Vivid" mode on their display/TV right now.

The only people seeing the image as the creators (us) intended... are, well, us :- ). Just something to live with.

The thing I'm not looking forward to is the day the 30 fps standard disappears and we have to start rendering animations at much higher frame rates.

2019-02-12, 13:25:22
Reply #189

Juraj Talcik

  • Active Users
  • **
  • Posts: 3492
  • Tinkering away
    • View Profile
    • studio website
I think stuff like that will become more unified once the standards converge. And it isn't any different from people using "Vivid" mode on their display/TV right now.

The only people seeing the image as the creators (us) intended... are, well, us :- ). Just something to live with.

The thing I'm not looking forward to is the day the 30 fps standard disappears and we have to start rendering animations at much higher frame rates.

Yeah, 4K/60 would be a nightmare :- ). But the same happened to still-frame rendering: I used to do renders that took 2 hours at 4K resolution on a single quad-core with V-Ray, and now that same 4K resolution is easily 2 hours on 200 cores... Quality standards constantly grow.
Anyway, apparently half of Samsung's TVs in 2019 are 8K. And what was that ideal VR clarity? 8x4K per eye at 90 FPS?
talcikdemovicova.com  Website and blog
be.net/jurajtalcik    My Behance portfolio
lysfaere.com   Something new

2019-02-12, 14:11:14
Reply #190

Fluss

  • Active Users
  • **
  • Posts: 408
    • View Profile
So would the solution be implementing colour management into corona & to max and rendering straight to ACEScg? I didnt realise that even on certified HDR displays they were still adding lots of processing ontop but i guess its obvious when i think about it. I guess its to compensate for shit panels or discrepancy across a panel batch so they all look about the same?

What do you mean here? The solution to what? If you're talking about the color discrepancies that occur during the colorspace switch, then yes, we should render straight into ACEScg to get a properly color-managed workflow. It's not that simple, though, as it may introduce other caveats.

I think stuff like that will become more unified once the standards converge. And it isn't any different from people using "Vivid" mode on their display/TV right now.

The only people seeing the image as the creators (us) intended... are, well, us :- ). Just something to live with.

For sure! It's just that this HDR thing introduced a load more of those post-processes. What's more, HDR and HDR10 are stuck with a fixed curve set at the start of playback, which ends up losing detail in very bright or dark scenes. Things are going in a good direction, though: the HDR10+ and Dolby Vision specifications introduce dynamic metadata that allows the brightness boundaries to change on the fly (per scene) rather than remaining constant for the whole experience.
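For context, the fixed curve in question is the PQ transfer function (SMPTE ST 2084) that HDR10 pairs with its static metadata. As a small illustration, here is a sketch of the PQ EOTF using the constants from the spec, mapping a 0-1 code value to absolute luminance in nits:

```python
# SMPTE ST 2084 (PQ) EOTF: maps a nonlinear code value E in [0, 1]
# to absolute luminance in nits (cd/m^2), peaking at 10,000 nits.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(e: float) -> float:
    """Decode a PQ code value to absolute luminance in nits."""
    ep = e ** (1 / M2)
    num = max(ep - C1, 0.0)
    den = C2 - C3 * ep
    return 10000.0 * (num / den) ** (1 / M1)

print(pq_eotf(0.0))    # 0.0 nits
print(pq_eotf(1.0))    # 10000.0 nits (the PQ peak)
print(pq_eotf(0.508))  # ~100 nits, roughly SDR reference white
```

Because PQ encodes absolute luminance, a display that can't reach the mastering peak has to tone-map, which is exactly where static metadata falls short and per-scene dynamic metadata helps.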

I think stuff like that will become more unified once the standards converge. And it isn't any different from people using "Vivid" mode on their display/TV right now.

The only people seeing the image as the creators (us) intended... are, well, us :- ). Just something to live with.

The thing I'm not looking forward to is the day the 30 fps standard disappears and we have to start rendering animations at much higher frame rates.

Yeah, 4K/60 would be a nightmare :- ). But the same happened to still-frame rendering: I used to do renders that took 2 hours at 4K resolution on a single quad-core with V-Ray, and now that same 4K resolution is easily 2 hours on 200 cores... Quality standards constantly grow.
Anyway, apparently half of Samsung's TVs in 2019 are 8K. And what was that ideal VR clarity? 8x4K per eye at 90 FPS?

Yeah, sadly, displays are evolving faster than computer hardware. High transistor density is starting to be tedious for semiconductor manufacturers. As for VR, sales don't seem to be rising that much, and without mass adoption I guess we won't see high-density panels anytime soon. It's a shame, because we're starting to see some interesting technologies, like foveated rendering, that could make real-time rendering feasible on that hardware. The Varjo technology is even more enticing: a small ultra-high-density panel that follows your gaze, backed by a standard-resolution panel. They claim it's equivalent to a 70k display.

see here :
https://varjo.com/bionic-display/
« Last Edit: 2019-02-12, 14:25:13 by Fluss »

2019-02-12, 16:13:54
Reply #191

jpjapers

  • Active Users
  • **
  • Posts: 1122
    • View Profile
I'm interested in trying Varjo if I ever can. I'll be suitably impressed if the micro-OLED panel moves as quickly as the saccades of your eyes.

2019-02-12, 16:42:57
Reply #192

sprayer

  • Active Users
  • **
  • Posts: 564
    • View Profile
You're talking about colors, but can you tell me why HDR monitors all have LEDs so bright that you can't actually use them for content creation? I have an HDR monitor, and enabling HDR locks the brightness settings; I can't stare at it for long at monitor distance (it's 1000 cd/m² in HDR1000 mode). In sRGB I work at brightness level 2. And common monitors are still 8-bit + 2-bit FRC panels; how does that work with 10-bit BT.2020?
I'm sure most people here set a comfortable brightness of ~120 cd/m² for working. I doubt that anyone making HDR content actually works in HDR1000 mode.

2019-02-12, 16:56:21
Reply #193

jpjapers

  • Active Users
  • **
  • Posts: 1122
    • View Profile
You're talking about colors, but can you tell me why HDR monitors all have LEDs so bright that you can't actually use them for content creation? I have an HDR monitor, and enabling HDR locks the brightness settings; I can't stare at it for long at monitor distance (it's 1000 cd/m² in HDR1000 mode). In sRGB I work at brightness level 2. And common monitors are still 8-bit + 2-bit FRC panels; how does that work with 10-bit BT.2020?
I'm sure most people here set a comfortable brightness of ~120 cd/m² for working. I doubt that anyone making HDR content actually works in HDR1000 mode.

AFAIK this is because there isn't a brightness-control standard across monitors, which is annoying; the stupid 1-100 brightness slider should really run from 0 up to whatever your display's maximum nits is. You should be able to subtract nits (or cd/m²) from the brightness rather than use some arbitrary dimming setting, because luminance affects colour just as much as anything else, so in terms of colour calibration it's a bit difficult.
« Last Edit: 2019-02-12, 17:19:28 by jpjapers »
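The calibration point above can be made concrete with a little arithmetic. This is a hypothetical sketch: the linear slider-to-nits mapping and the 350-nit panel peak are assumptions for illustration, and real monitors rarely behave linearly, which is exactly the problem being described:

```python
# Hypothetical example: a 0-100 brightness slider mapped linearly to
# the panel's peak luminance. Real monitors rarely behave this way,
# which is exactly the calibration problem described above.
def slider_for_target(target_nits: float, panel_peak_nits: float) -> float:
    """Slider position (0-100) needed to hit a target luminance."""
    return 100.0 * target_nits / panel_peak_nits

# A common ~120 cd/m^2 working target on an assumed 350-nit panel:
print(round(slider_for_target(120, 350), 1))  # 34.3
```

In practice the slider-to-luminance curve is undocumented and nonlinear, so hitting a target like 120 cd/m² requires a hardware colorimeter rather than arithmetic.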

2019-02-12, 17:04:55
Reply #194

Fluss

  • Active Users
  • **
  • Posts: 408
    • View Profile
This is because of the core principle of dynamic range: the ratio between bright and dark areas. In your case, HDR1000 refers to the peak brightness of the monitor (1000 nits). Black areas have to be under 0.03 nits for the specification. So if you lower the peak brightness, you lower the dynamic range and fall outside the HDR1000 specification, hence the brightness lock.
« Last Edit: 2019-02-12, 17:29:28 by Fluss »
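To put numbers on that ratio: 1000 nits of peak white over a 0.03-nit black floor is a contrast of roughly 33,000:1, or about 15 photographic stops. A quick sketch:

```python
import math

def dynamic_range_stops(peak_nits: float, black_nits: float) -> float:
    """Contrast ratio expressed in photographic stops (doublings)."""
    return math.log2(peak_nits / black_nits)

contrast = 1000 / 0.03
print(round(contrast))                            # ~33333 : 1
print(round(dynamic_range_stops(1000, 0.03), 1))  # ~15.0 stops
# Halving peak brightness to 500 nits costs exactly one stop:
print(round(dynamic_range_stops(500, 0.03), 1))   # ~14.0 stops
```

This is why lowering peak brightness necessarily shrinks the dynamic range (with a fixed black floor) and drops the display out of the HDR1000 specification.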