Author Topic: dubcats secret little hideout  (Read 110186 times)

2018-09-26, 01:57:13
Reply #300

Fluss

  • Active Users
  • **
  • Posts: 437
    • View Profile
In your 3 previous tests, luminance seems pretty consistent, though this one looks a bit off luminance-wise. It would be great to see how they compare on that point, in CIE Lab, with a proper luminance metric (no RGB, HSL/HSV, etc.). If it's consistent across the dumps, I'd bet on tint shifting or white balance.
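
For reference, a minimal sketch of that comparison, assuming 8-bit sRGB patch samples taken from each dump: linearize, take relative luminance, then CIE L*.

Code:
def srgb_to_linear(c: float) -> float:
    # inverse of the piecewise sRGB encoding, c in [0, 1]
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def lstar_from_srgb(r8: int, g8: int, b8: int) -> float:
    r, g, b = (srgb_to_linear(c / 255) for c in (r8, g8, b8))
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b   # relative luminance (sRGB / Rec. 709 primaries)
    f = y ** (1 / 3) if y > (6 / 29) ** 3 else y / (3 * (6 / 29) ** 2) + 4 / 29
    return 116 * f - 16                        # CIE 1976 L*, Yn = 1

# compare the same patch sampled from two dumps
print(lstar_from_srgb(128, 128, 128), lstar_from_srgb(120, 124, 131))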

What command did you use for the dcraw dump (especially the color option)?

2018-09-28, 01:58:27
Reply #301

dubcat

  • Active Users
  • **
  • Posts: 460
  • ฅ^•ﻌ•^ฅ meow
    • View Profile
What command did you use for the dcraw dump (especially the color option)?

Hey!

-v -H 0 -o 1 -q 3 -4 -T

In this specific case, since I shoot all the lights at 6500 K (our whole CG life depends on this setting), the EXIF white balance is already 6500.
So these settings give me the same result:

-v -w -H 0 -o 1 -q 3 -4 -T
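
For anyone scripting this, a minimal sketch of the same dump driven from Python, assuming dcraw is on your PATH (the flag comments follow the dcraw man page):

Code:
import subprocess

def dump_raw(path: str, use_camera_wb: bool = False) -> None:
    cmd = [
        "dcraw",
        "-v",           # verbose
        "-H", "0",      # highlight mode 0: clip highlights
        "-o", "1",      # output colorspace 1: sRGB primaries / D65
        "-q", "3",      # AHD demosaicing
        "-4",           # linear 16-bit output (no gamma, no auto-brightening)
        "-T",           # write TIFF instead of PPM
    ]
    if use_camera_wb:
        cmd.append("-w")  # use the white balance recorded by the camera (EXIF)
    subprocess.run(cmd + [path], check=True)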

I haven't had any spare time since last time.
The next thing on my schedule is to 3D scan the analysis scene, and replace the rough scan geo with proper 3D models.
After that I will material scan the rough black fabric that my analysis scene is covered with.
When those two variables are finally taken care of, we can nail the raw image variables.
Everything will be open source as always :)
             ___
    _] [__|OO|
   (____|___|     https://www.twitch.tv/dubca7 / https://soundcloud.com/dubca7 ( ͡° ͜ʖ ͡°)

2018-09-28, 02:42:27
Reply #302

dubcat

  • Active Users
  • **
  • Posts: 460
  • ฅ^•ﻌ•^ฅ meow
    • View Profile
Bertrand Benoit aka BBB3viz released an awesome set of images lately; you can check them out at https://bertrand-benoit.com/blog/classical.
People have asked him what wall paint value he used, and he responded 180 RGB.
That answer then turned into another question: what does he mean by 180 RGB, is it sRGB or is it actually linear RGB?

180 sRGB is 46.5% reflectance and 180 RGB is 70.6%.
So he is talking about real linear RGB values. 70.6% is still dark for a white, but it is the common average value for white paint.
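
A quick check of the two readings of "180": treated as already-linear it is simply 180/255 = 70.6%, while decoding it with a plain 2.2 gamma gives the 46.5% figure (the exact piecewise sRGB curve lands around 45.6%).

Code:
v = 180 / 255              # "180 RGB" read as already-linear: ~0.706 -> 70.6%
lin = v ** 2.2             # "180 sRGB" decoded with a 2.2 gamma: ~0.465 -> 46.5%
print(f"180 RGB  (linear):  {v:.1%}")
print(f"180 sRGB (decoded): {lin:.1%}")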

In Scandinavia we have a standard white color for doors, windows, etc., and that color is called Eggwhite; you can find it here: https://www.e-paint.co.uk/Lab_values.asp?cRange=RAL%20Classic&cRef=RAL%209010&cDescription=Pure%20white. This paint is 84% reflectance. If we remove the 1.5 IOR from the scanned value, we get this albedo value.
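
One reading of "remove the 1.5 IOR" is to subtract the normal-incidence Fresnel reflectance (F0) of an IOR-1.5 dielectric from the measured total reflectance; a minimal sketch:

Code:
def f0_from_ior(ior: float) -> float:
    # normal-incidence Fresnel reflectance of a dielectric
    return ((ior - 1.0) / (ior + 1.0)) ** 2

measured = 0.84                        # RAL 9010, ~84% total reflectance
albedo = measured - f0_from_ior(1.5)   # F0(1.5) = 4% -> ~80% diffuse albedo
print(f"F0 = {f0_from_ior(1.5):.0%}, approx. albedo = {albedo:.0%}")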



edit:
Made the image 500px wide.
« Last Edit: 2018-09-28, 02:47:02 by dubcat »
             ___
    _] [__|OO|
   (____|___|     https://www.twitch.tv/dubca7 / https://soundcloud.com/dubca7 ( ͡° ͜ʖ ͡°)

2018-09-28, 15:15:55
Reply #303

Fluss

  • Active Users
  • **
  • Posts: 437
    • View Profile
People have asked him what wall paint value he used, and he responded 180 RGB.
That answer then turned into another question: what does he mean by 180 RGB, is it sRGB or is it actually linear RGB?

180 sRGB is 46.5% reflectance and 180 RGB is 70.6%.
So he is talking about real linear RGB values. 70.6% is still dark for a white, but it is the common average value for white paint.

Well, despite his amazing renders, BB is more of an eyeballing kind of guy (and he has an insane eye); he can pretty much nail any material just by looking at it. That leaves me speechless every time! But after reviewing some of his scenes (some older ones, I don't know if it's still the case), I noticed he is not using "strict PBR rules". So I guess that's just the expression of his artistic vision. That said, his approach is not devoid of foundation and we can clearly identify some technical background behind all of this. The overall reflectance values of the scene are indeed extremely important, as they greatly influence how light propagates through it and thus affect the dynamic range of the final render. It's just a pain in the ass to get right for every single material in the scene (for me at least).

I've been lurking around different forums and articles in that regard. Without a "state of the art" material scanner, it's hard to guess these values. Anyway, some studios are relying on this sensor: https://www.nixsensor.com/. For sure, it is not as smart as being able to split it into diffuse and spec or anything nice like that, but you get an accurate overall brightness and hue for your material, for cheap. What's more, it can give you ACEScg values (the pro one), so linear data. I'm considering buying one of them atm.

And here we come, talking ACES workflow. I had some time to kill, so I decided to give ACES a spin. I used Vray and an OCIO config to see what I'd get.

lin sRGB + 2.2 gamma curve


lin ACEScg + sRGB viewtransform


lin sRGB +5EV + 2.2 gamma curve


lin ACEScg +5EV + sRGB viewtransform


We can clearly see the advantages of the ACES tone mapping. The way blown highlights are handled is way closer to what a real camera would produce. We can feel the "Fstorm look" there, huh? The only drawback is that "ACES bug" with bright saturated values professor Dubcat was referring to earlier in this topic. At this point, I'm not so sure it is a bug rather than a "workflow issue".

That color "bug" literally freaked me out! Why the fuck is this happening? So I tried to figure out what the issue was, and here's what I found:

To understand it, we need to be pragmatic.

The issue: I'm increasing EVs and my primaries are not fading to white in sRGB. They do in ACES, but the colors seem shifted. Why?

According to the definition of EV (Wikipedia), exposure value is used to indicate an interval on the photographic exposure scale, with a difference of 1 EV corresponding to a standard power-of-2 exposure step, commonly referred to as a stop. So basically, it's a multiply operation. The colors used for the lights are pure sRGB primaries (1,0,0; 0,1,0; 0,0,1). I guess you understand what's happening at this point. Let's sample our blue: at the current exposure value we have (0, 0, 3.391) in 32-bit float. Multiply that by whatever you want, you'll always see a blue primary (the blue being clipped for display and the other two components being 0 -> REMINDER: 0*x=0 😉).
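
A minimal sketch of that multiply, with a plain clamp to [0,1] standing in for the display step (no tone mapping):

Code:
def expose(rgb, ev):
    # 1 EV = a factor of 2: exposure is just a multiply
    return tuple(c * 2 ** ev for c in rgb)

def to_display(rgb):
    # stand-in for the display step: clamp to [0, 1]
    return tuple(min(max(c, 0.0), 1.0) for c in rgb)

blue_light = (0.0, 0.0, 3.391)   # the sampled value from above
for ev in (0, 5, 10):
    print(ev, to_display(expose(blue_light, ev)))
# every line prints (0.0, 0.0, 1.0): 0 * 2^ev stays 0, so the primary never
# drifts toward white no matter how far exposure is pushed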

Now let's change that blue color to (0.1,0.1,1)

sRGB - starts to fade even at the current exposure


sRGB +5EV - totally faded to white while the other two primaries remain the same.


OK, that's just math, so shouldn't ACES behave the same way, since we were using the same pure primaries? In fact, it does...

What defines a color? Well, it's the balance between the R, G and B values (thanks, captain obvious). What I mean here is that your color won't get more blue by increasing the blue component (that's actually luminance); it gets more blue by removing more red and green.

The ACEScg gamut is wider than the sRGB gamut, which means colors can be more saturated.


So when we are using ACEScg to grade, we are converting from linear sRGB to linear ACEScg. Since the gamut is expanded, my sRGB (0,0,1) is not translated to ACEScg (0,0,1), because it is less saturated there. Less saturation brings a bit of red and green back into my color in the ACEScg colorspace, and that's why colors fade out! If you try to grade a pure blue primary defined in ACEScg, it behaves the same way as in sRGB: 100% saturation is reached, 0 red and 0 green, blue is clipped -> no fade out.
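
You can see this directly in the conversion itself. The 3x3 matrix below is the commonly published linear-sRGB (D65) to ACEScg (AP1, D60) transform, rounded to four decimals, so treat the values as approximate:

Code:
SRGB_TO_ACESCG = (
    (0.6131, 0.3395, 0.0474),
    (0.0702, 0.9164, 0.0135),
    (0.0206, 0.1096, 0.8698),
)

def srgb_lin_to_acescg(rgb):
    # plain 3x3 matrix multiply of a linear sRGB triplet
    return tuple(sum(m * c for m, c in zip(row, rgb)) for row in SRGB_TO_ACESCG)

print(srgb_lin_to_acescg((0.0, 0.0, 1.0)))
# roughly (0.047, 0.013, 0.870): the sRGB blue primary sits inside the AP1 gamut,
# so its ACEScg encoding has non-zero R and G, and those channels can now grow
# and clip to white as exposure increases.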

But here we are actually converting to ACEScg after rendering, which introduces some bias in the solution, and it is especially visible near extreme values. To avoid that, it's better to convert everything that contributes to the color information of the final render up front. So basically all lights and textures that are not supposed to be shades of gray.





« Last Edit: 2018-09-28, 15:21:27 by Fluss »

2018-09-28, 15:27:55
Reply #304

Juraj Talcik

  • Active Users
  • **
  • Posts: 3763
  • Tinkering away
    • View Profile
    • studio website
Never heard of the NIX scanner, that is a cool nifty toy :- ). I knew I could somehow use the X-Rite probe to do this but never bothered.

I wonder how far these are from the "diffuse albedo" to be used in our generic shader. Aren't these still perceptual colors?

Now only which one to buy..
talcikdemovicova.com  Website and blog
be.net/jurajtalcik   Our studio Behance portfolio
Instagram   Our studio Instagram, managed by Veronika

2018-09-28, 16:27:58
Reply #305

Fluss

  • Active Users
  • **
  • Posts: 437
    • View Profile
Never heard of the NIX scanner, that is a cool nifty toy :- ). I knew I could somehow use the X-Rite probe to do this but never bothered.

I wonder how far these are from the "diffuse albedo" to be used in our generic shader. Aren't these still perceptual colors?

Now only which one to buy..

I wonder too... That's why I'm considering buying one but haven't done so yet. What do you mean by perceptual color?

All I know is that it seems pretty accurate (lots of people use it for print matching). I also like the fact that you get control over the illuminant.

2018-09-28, 16:32:22
Reply #306

karnak

  • Primary Certified Instructor
  • Active Users
  • ***
  • Posts: 66
    • View Profile
I don't know much about these Nix products (it's the first time I've heard of them), but I have been using an X-Rite ColorMunki Photo to get diffuse values.

The ColorMunki Photo is a spectrophotometer that can sample the light spectrum from 380 to 730 nm every 10 nm (or every 3.33 nm in high-resolution mode).
In very simple terms, from the spectral data you can get RGB values in a color space of your choosing (with the appropriate illuminant, primaries and so on).

In my opinion, the data you get from this instrument is good enough, especially considering the price and the size of the device.
I say 'good enough' because the light spectrum is continuous, and with this tool we are perhaps sampling the bare minimum visible range (380-730 nm) at the bare minimum interval (every 10 nm). I have read a lot of literature on the subject, and this might lead to slightly inaccurate values in some cases. I feel I have already gone overboard with unrequested details, sorry. :)
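
For reference, a rough sketch of that spectral-data-to-RGB step, assuming the open-source colour-science Python package (dataset names vary a bit between versions) and a made-up flat reflectance curve standing in for a real measurement:

Code:
import numpy as np
import colour

# hypothetical ColorMunki-style measurement: 380-730 nm, one value per 10 nm band
wavelengths = np.arange(380, 740, 10)
reflectance = np.full(wavelengths.size, 0.84)        # flat ~84% reflector
sd = colour.SpectralDistribution(dict(zip(wavelengths.tolist(), reflectance.tolist())))

cmfs = colour.MSDS_CMFS["CIE 1931 2 Degree Standard Observer"]
illuminant = colour.SDS_ILLUMINANTS["D65"]

XYZ = colour.sd_to_XYZ(sd, cmfs, illuminant) / 100   # sd_to_XYZ returns values in 0-100
rgb = colour.XYZ_to_sRGB(XYZ)                        # gamma-encoded sRGB for display
print(XYZ, rgb)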
Corona Academy (May 2017)

2018-09-28, 16:41:00
Reply #307

Fluss

  • Active Users
  • **
  • Posts: 437
    • View Profile
Reading some Amazon reviews:

"Accuracy and correlation tested with macbeth chart and X-Rite spectroreflectometer.
Very accurate inside sRGB gamut, could be improved a bit in very blue colors.
But it's acceptable for this price and such small trichrometer sensor.
Lovely product. Looking for next product already!"

Good enough for me.

2018-09-28, 16:50:29
Reply #308

Juraj Talcik

  • Active Users
  • **
  • Posts: 3763
  • Tinkering away
    • View Profile
    • studio website
Yeah, sounds great, esp. since it comes with a simple mobile app.

By perceptual I mean that the results (and the same goes for all the RAL, NCS, Pantone, etc. libraries) are supposed to mimic how we perceive these materials by eye under regular daylight.
They were set up fully independently of the concept of diffuse albedo. I am not even sure they can be considered full albedo, such that we could just take away some brightness and add some saturation.

These colors always feel like they are too bright to be used in a CG shader.
talcikdemovicova.com  Website and blog
be.net/jurajtalcik   Our studio Behance portfolio
Instagram   Our studio Instagram, managed by Veronika

2018-09-28, 18:52:24
Reply #309

karnak

  • Primary Certified Instructor
  • Active Users
  • ***
  • Posts: 66
    • View Profile
I'm not sure about the libraries, but in my previous example, what I get from the instrument is the energy that the surface reflects, without distinction between diffuse reflection and specular reflection. Therefore, to get only the diffuse you need to subtract the specular reflection, which you can do easily if you know the IOR of the surface (and if you don't care that much about it, you can just remove 4%, which is the energy that the default IOR adds).

I think a good way to remove that 4% from RGB colors might be to use Lab mode and work only on the L component.
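
A rough sketch of that idea as I read it (work only on L*, push the luminance behind it down by F0 ≈ 4% and leave a*/b* untouched), not an exact recipe:

Code:
def lab_L_to_Y(L: float) -> float:
    # CIE L* -> relative luminance, white point Yn = 1
    f = (L + 16) / 116
    return f ** 3 if f > 6 / 29 else 3 * (6 / 29) ** 2 * (f - 4 / 29)

def Y_to_lab_L(Y: float) -> float:
    f = Y ** (1 / 3) if Y > (6 / 29) ** 3 else Y / (3 * (6 / 29) ** 2) + 4 / 29
    return 116 * f - 16

def remove_specular(L: float, f0: float = 0.04) -> float:
    # subtract the ~4% normal-incidence specular from the luminance behind L*
    return Y_to_lab_L(max(lab_L_to_Y(L) - f0, 0.0))

print(remove_specular(93.4))   # L* ~93.4 is the ~84% reflectance mentioned earlier; prints ~91.6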
Corona Academy (May 2017)

2018-09-28, 19:15:38
Reply #310

Juraj Talcik

  • Active Users
  • **
  • Posts: 3763
  • Tinkering away
    • View Profile
    • studio website
I know about the 0° specular removal; I've mentioned that idea multiple times over many years here. But I still have a hunch that it's more complex than just that, and my issue is with how the shader uses this data. I know that it is energy conserving and yada yada.. but it still doesn't end up looking fully right to me.

I've been overthinking this for way too long though.. but I would still like to have a physical sample of something that was also scanned by an actual BSDF scanner like the X-Rite TAC7 or Chaos Group's scanner.
talcikdemovicova.com  Website and blog
be.net/jurajtalcik   Our studio Behance portfolio
Instagram   Our studio Instagram, managed by Veronika

2018-09-28, 19:41:55
Reply #311

karnak

  • Primary Certified Instructor
  • Active Users
  • ***
  • Posts: 66
    • View Profile
That would be extremely cool to have.
Which X-Rite device do you have? I can help you on the software side if you want to read some surfaces.
I know it's not the same as BSDF scanners, but at least it's something.

edit.

I forgot to say that I get the same values with calibrated polarised photography and with the spectrophotometer with the specular manually removed.

What happens outside of F0 (away from normal incidence) is a totally different matter though, and depends on other factors: some of them are well approximated by the BRDF models used inside the shader, others are not.
« Last Edit: 2018-09-29, 08:53:49 by karnak »
Corona Academy (May 2017)

2018-10-01, 13:16:46
Reply #312

Fluss

  • Active Users
  • **
  • Posts: 437
    • View Profile
I forgot to say that I get the same values with calibrated polarised photography and with the spectrophotometer with the specular manually removed.

What happens outside of F0 (away from normal incidence) is a totally different matter though, and depends on other factors: some of them are well approximated by the BRDF models used inside the shader, others are not.

Absolutely. Following the energy-conservation law, the close relationship between the diffuse and the specular part is what defines an object's appearance. When the specularity increases, the diffuse component drops, and vice versa. And that has to be correlated to the micro-surface topology itself: these tiny details are what really define how a surface looks at a given angle. I guess the BSDFs provided by high-end material scanners sort of translate that surface-topology information into a Fresnel curve across the whole range of angles (sampled at x steps, interpolated in between) for both the diffuse and specular components. The microfacet models we are using are way too generic in that regard and have a hard time simulating materials with complex micro-surface details (wood, fabric, etc.), but anyway, they're good enough to achieve pretty decent results most of the time. That said, I'm always wondering why we're stuck with that good old Lambertian diffuse model in Corona while almost every other renderer on the market offers a more advanced one.
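
For what it's worth, a sketch of Oren-Nayar, the kind of rough-diffuse model other engines expose, in its standard qualitative approximation (sigma is the surface roughness in radians; sigma = 0 collapses back to plain Lambert). This is just an illustration, not what any particular renderer ships:

Code:
from math import cos, sin, tan, pi

def oren_nayar(albedo, sigma, theta_i, theta_r, phi_i, phi_r):
    # returns the diffuse BRDF times the cosine foreshortening term cos(theta_i)
    s2 = sigma * sigma
    A = 1.0 - 0.5 * s2 / (s2 + 0.33)
    B = 0.45 * s2 / (s2 + 0.09)
    alpha = max(theta_i, theta_r)
    beta = min(theta_i, theta_r)
    return (albedo / pi) * cos(theta_i) * (
        A + B * max(0.0, cos(phi_r - phi_i)) * sin(alpha) * tan(beta)
    )

# grazing view of a rough surface vs. the flat Lambertian answer
print(oren_nayar(0.8, 0.5, 0.3, 1.2, 0.0, 0.0), 0.8 / pi * cos(0.3))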

I miss a macro lens... I'd really like to see how far we can go by scanning small patches of geometry and then making them tileable, in order to use them as a source pattern in Fstorm Geopattern.
« Last Edit: 2018-10-01, 13:31:02 by Fluss »

2018-10-01, 13:50:19
Reply #313

Juraj Talcik

  • Active Users
  • **
  • Posts: 3763
  • Tinkering away
    • View Profile
    • studio website
That said, I'm always wondering why we're stuck with that good old Lambertian diffuse model in Corona while almost every other renderer on the market offers a more advanced one.

I keep asking the Corona devs about this every three months, and have for the past three years. If we could at least get an option for some existing alternative to soften up the look I would be super grateful, but the Corona shader is super limited and not evolving in any way.

It desperately needs better diffuse shading, Coating, Sheen, etc..

talcikdemovicova.com  Website and blog
be.net/jurajtalcik   Our studio Behance portfolio
Instagram   Our studio Instagram, managed by Veronika

2018-10-01, 16:55:09
Reply #314

Fluss

  • Active Users
  • **
  • Posts: 437
    • View Profile
Some samples from the Mura scanner are available at the bottom of the home page:

https://www.muravision.com/