Author Topic: HOWTO: Capture and Calibrate Textures (PBR Style)  (Read 34536 times)

2016-05-12, 14:45:17
Reply #15

JakubCech

  • Active Users
  • **
  • Posts: 119
  • jakubcech.net
    • View Profile
    • jakubcech
Hey dubcat, your posts are amazing, +1 from me to keep posting stuff like this :)
I'd love to ask: can you point me at some good sources on this PBR-related stuff? Or any PBR/Disney/game-related channels or videos you think are interesting?
One dumb question: why shoot RAW? What about someone using a smartphone with manual settings (not talking about resolution or better sensor capabilities)?
Thank you.
« Last Edit: 2016-05-12, 15:21:14 by JakubCech »

2016-05-12, 17:33:12
Reply #16

dubcat

  • Active Users
  • **
  • Posts: 452
  • ฅ^•ﻌ•^ฅ meow
    • View Profile
Can you point me at some good sources on this PBR-related stuff? Or any PBR/Disney/game-related channels or videos you think are interesting?
Hey man. I've linked to a couple of guides and materials that I know are proper PBR.

There are two kinds of PBR materials out there.
* Cheap-ass "Bitmap2Material 3" materials that most texture stores are selling. (They are still PBR, just not calibrated to anything in the real world.)
* Properly calibrated materials. (This can mean calibrated color values, 3D-scanned normals/displacement, and shadow cancellation by capturing an HDRI of the environment.)

Color Values
- You can calibrate the values like I did here in Photoshop/Lightroom, or you can use 3D LUT Creator. 3D LUT Creator will auto-generate the correct matrix, adjust the exposure and linearize the picture.
They have two official tutorials covering this topic.

Normals/Displacement
- Most people use PhotoScan for 3D scanning, but I've heard good things about this new bad boy, Capturing Reality.
This tutorial series about PhotoScan is great, check it out.

Shadow Cancellation
- You need a good amount of equipment to do this.
   * Lens and tripod mount to shoot an HDRI
   * Grey ball (a cheap solution would be to use a ColorChecker grey card)
   * Chrome ball (you use this ball to align the HDRI inside 3ds Max; it is not for capturing the HDRI like in the old days)

   * You re-create the grey/chrome ball in 3ds Max.
   * Rotate the HDRI until the HDRI reflections in the chrome ball match the real-life reference picture.
   * Change the exposure of the HDRI until the grey ball matches the real-life reference picture.

   * Bake a map that contains the shadows and divide it out of the diffuse texture in linear space (a minimal sketch of that divide follows just below).
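That divide step outside Photoshop, as a minimal numpy sketch. The pixel values and array names are made-up placeholders, not from any particular tool; both inputs are assumed to be in linear (non-sRGB) space.

import numpy as np

# Placeholder linear values, shape (H, W, 3); in practice these are the photographed
# diffuse texture and the shadow/lighting map baked in 3ds Max.
diffuse_with_shadows = np.array([[[0.10, 0.09, 0.08]]], dtype=np.float32)  # shadowed diffuse sample
baked_shadow_map     = np.array([[[0.45, 0.45, 0.45]]], dtype=np.float32)  # baked shadow/light value
eps = 1e-4                                                                 # guard against division by ~0
shadow_free_diffuse = diffuse_with_shadows / np.maximum(baked_shadow_map, eps)
print(shadow_free_diffuse)  # recovered unshadowed diffuse, roughly [0.22, 0.20, 0.18]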

I made a quick example here



Great presentations from Epic, check them out.

Proper info
Allegorithmic, Marmoset, Epic and Eat3D

Proper reference materials
Allegorithmic and Quixel

One dumb question: why shoot RAW? What about someone using a smartphone with manual settings (not talking about resolution or better sensor capabilities)?
Camera manufacturers want their pictures to look good out of the box, so they apply white balance, contrast, sharpening, de-noising etc. and save the result as a JPG.
When you take a picture in RAW, you get the raw sensor information. Nothing is baked in.

Another important thing is that JPG is 8-bit, while RAW is typically 12-14 bit per channel (usually stored in a 16-bit container).
With RAW you can make major adjustments without messing up the picture.
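To see why the extra bit depth matters, here's a toy numpy illustration of my own (it ignores the JPG's gamma encoding): quantize the same dark gradient to 8-bit and 14-bit precision, push it three stops, and count how many distinct levels survive.

import numpy as np

gradient = np.linspace(0.0, 0.05, 10000)           # dark tones, linear 0..1
as_8bit  = np.round(gradient * 255) / 255          # JPG-style 8-bit quantization
as_14bit = np.round(gradient * 16383) / 16383      # typical 14-bit RAW sensor depth
pushed_8bit  = np.clip(as_8bit * 8, 0.0, 1.0)      # +3 stops of exposure
pushed_14bit = np.clip(as_14bit * 8, 0.0, 1.0)
print(np.unique(pushed_8bit).size, "levels left from 8-bit")    # ~14  -> visible banding
print(np.unique(pushed_14bit).size, "levels left from 14-bit")  # ~820 -> still smooth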
« Last Edit: 2016-05-12, 19:43:05 by dubcat »
             ___
    _] [__|OO|
   (____|___|     https://www.twitch.tv/dubca7 / https://soundcloud.com/dubca7 / https://dubcatshideout.com  ( ͡° ͜ʖ ͡°)

2016-05-13, 11:08:04
Reply #17

-Ben-Battler-

  • Active Users
  • **
  • Posts: 171
    • View Profile
Thank you for the links dubcat, those are really interesting!
Visit Pangaroo

2016-05-13, 12:19:34
Reply #18

romullus

  • Global Moderator
  • Active Users
  • ****
  • Posts: 5454
  • Let's move this topic, shall we?
    • View Profile
    • My Models
Shadow Cancellation
- You need a good amount of equipment to do this.
   * Lens and tripod mount to shoot an HDRI
   * Grey ball (a cheap solution would be to use a ColorChecker grey card)
   * Chrome ball (you use this ball to align the HDRI inside 3ds Max; it is not for capturing the HDRI like in the old days)

   * You re-create the grey/chrome ball in 3ds Max.
   * Rotate the HDRI until the HDRI reflections in the chrome ball match the real-life reference picture.
   * Change the exposure of the HDRI until the grey ball matches the real-life reference picture.

   * Bake a map that contains the shadows and divide it out of the diffuse texture in linear space.

Tons of useful information, as always. Big thanks! I have one question about lighting information removal, though. If the HDRI panorama is needed only for lighting removal, wouldn't capturing it with the old-school method (chrome ball) be more than enough? That would save a lot of time when capturing and would be much less expensive.
I'm not a Corona Team member! Everything I say is my personal opinion only. Render Legion does not endorse my words or actions.

2016-05-13, 18:01:37
Reply #19

dubcat

  • Active Users
  • **
  • Posts: 452
  • ฅ^•ﻌ•^ฅ meow
    • View Profile
Wouldn't capturing it with the old-school method (chrome ball) be more than enough?
I've never tried a mirror ball before. Right now I use a scratched-up steel ball for HDRI alignment at home; it's not clear enough to be used for mirror-ball HDRI capture.
I'm planning on purchasing the Lighting Checker "Twins". (I guess that steel ball is clear enough to do a proper test.)

It would be cool if someone with a sexy chrome/steel ball could give it a try. It could be a great poor man's alternative if it works.

2016-05-13, 19:29:59
Reply #20

karnak

  • Primary Certified Instructor
  • Active Users
  • ***
  • Posts: 62
    • View Profile
I have a small chrome ball, but I have never tried to capture an hdri panorama with it.
There are no big scratches on it, but it is a dust magnet!
Corona Academy (May 2017)

2016-05-15, 22:44:49
Reply #21

dubcat

  • Active Users
  • **
  • Posts: 452
  • ฅ^•ﻌ•^ฅ meow
    • View Profile
I sampled some albedo values today.
When I shot the last picture, it was starting to get dark and bluish outside.
We can use this underexposed blue picture to compare RAW and JPG, for people who want to see the difference.

Open them in a new tab for better comparison.

It's the same shot, but the camera has applied Lens Correction to the JPG.



After 3D LUT Creator has Linearized them.



After 3D LUT Creator has generated a new matrix.


2016-06-14, 14:41:54
Reply #22

JakubCech

  • Active Users
  • **
  • Posts: 119
  • jakubcech.net
    • View Profile
    • jakubcech
Thank you for all the useful information, dubcat. I ran through more or less all the info and your posts, very useful. Comparing RAW to JPEG, which one's final look better resembled the real appearance you saw?
 + I saw you posting the link for the akromatic gadget. Do you happen to know any source where lots of different materials are shot together with this chart? That would make a great reference for material creation.

2016-06-14, 16:50:10
Reply #23

dubcat

  • Active Users
  • **
  • Posts: 452
  • ฅ^•ﻌ•^ฅ meow
    • View Profile
Comparing RAW to JPEG, which one's final look better resembled the real appearance you saw?
It was starting to get dark and blue outside, so my eyes saw something like "RAW as Shot".
That's why we have to calibrate the texture with a ColorChecker: we want the original albedo value.

I kinda messed up in this example. There had just been a major update to 3D LUT Creator, and they made some big changes to the Linear/New Matrix process.
You can see that the shadows in the RAW image are kinda purple; this is because I used the old method I was used to with the new 3D LUT Creator.
I can post new proper comparison shots.

Do you happen to know any source where lots of different materials are shot together with this chart? That would make a great reference for material creation.
Wish I did! That's why I run around and capture Albedo values every now and then.

2016-06-14, 18:39:34
Reply #24

dubcat

  • Active Users
  • **
  • Posts: 452
  • ฅ^•ﻌ•^ฅ meow
    • View Profile
I had some spare time to run out and do a new comparison. This time I included cross-polarization.
In the new 3D LUT Creator update you just press "Match" and everything is done for you.

If you have a ColorChecker Passport like me, don't include the top portion of the checker when you match; it will give you a worse result.

With Cross Polarization
You can see that it has a hard time with the blacks; this is because there is no specular.
I'm working on a new Match preset that will work with cross-polarized pictures.

You can see that the polarizing filter has a brown/green tint.




Without Cross Polarization




Here is a little test I did with Specular on the ColorChecker.



And this is how you create the specular and albedo textures.
I say 100%/50% here; this depends on your filter, only the ratio matters.
Here is my layer stack (linear 32-bit).

When you shoot cross-polarized, you get two pictures:
one with 50% albedo, and one with 100% specular plus 50% albedo.



Here is the 50% Albedo picture.



Duplicate this picture and change the Blend Mode to "Add"



This is the 100% Specular and 50% Albedo picture.



Duplicate one of the 50% albedo pictures, put it above the specular picture, and change its Blend Mode to "Subtract".
You will notice that everything that has SSS has colored specular; leaves/skin have blue specular.
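The same layer math in numpy, for anyone who wants to sanity-check it outside Photoshop. The pixel values below are placeholders, and the exact 2x factor depends on your polarizing filter, as noted above.

import numpy as np

# Linear 32-bit values; in practice these are the two photos loaded as float arrays.
cross_polarized = np.array([[[0.12, 0.10, 0.08]]], dtype=np.float32)  # ~50% albedo, no specular
parallel        = np.array([[[0.30, 0.27, 0.24]]], dtype=np.float32)  # 100% specular + ~50% albedo
albedo   = cross_polarized + cross_polarized   # the duplicated "Add" layer: back to ~100% albedo
specular = parallel - cross_polarized          # the "Subtract" layer: what remains is the specular
print(albedo)    # [[[0.24 0.20 0.16]]]
print(specular)  # [[[0.18 0.17 0.16]]]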

« Last Edit: 2016-06-17, 05:29:37 by dubcat »

2016-06-15, 12:04:02
Reply #25

maru

  • Corona Team
  • Active Users
  • ****
  • Posts: 8612
  • Marcin
    • View Profile
I worship your posts. They make me want to go out and shoot all the crap around. Got to do that some time!

2016-06-18, 05:48:39
Reply #26

philipb

  • Active Users
  • **
  • Posts: 16
    • View Profile
Oh wow....

This is fantastic information....Thanks Dubcat!

I just have one or two questions. I'm trying to lock all this down in my mind and have been reading everything I can get my hands on. I'm shooting my own textures, I have a Macbeth chart, and I've been experimenting with cross-polarization. I'm very happy with the results, but I still have many questions.

1) You use the term linearize. I have seen others use the same term in reference to textures/albedo, but I'm not sure we are all using it in the same way when it comes to texture processing (nothing to do with LWF or gamma). Some seem to mean A) basically that Lightroom/Camera Raw is at all its default settings, while others mean B) bringing the grayscale patches of the Macbeth chart into their proper luminance ranges with the targeted adjustment tool.

I'm mainly asking because I'm curious exactly what 3D LUT Creator does in this area. Is it like a raw image processor that will do all this adjustment work for you (in the sense of adjusting luminance ranges)?


2) This brings me to the part where I'm finding very little information: the inbuilt (Nikon, for me) tone curve of our cameras.

Now I know this may not be important, but I wish I could develop a DNG profile for my Nikon that was truly 'linear'. It seems that a lot of contrast is added to the raw images somewhere in the camera's image processing. I have begun to think of the process of adjusting the Macbeth grayscale patches as removing this inbuilt tone curve.

Does this make sense to anyone else, or am I just imagining things? If our cameras had a truly linear/neutral response, wouldn't our photos come out a lot closer to the 'linearized' (luminance-adjusted) versions of our textures? I find that when I'm adjusting my images to fit the proper Macbeth luminance range, I am always making very similar adjustments; somehow it's an inverse S-curve, always with a very steep dip between the first and second white patches. Am I wrong in thinking that I'm fighting the Nikon tone curve? If so, is there a way to remove it more automatically without buying 3D LUT Creator?


Also, some other thoughts... I use a very similar approach to yours for adjusting the luminance patches in a targeted manner. However, after setting my basic black and white patches in ACR, I go into Photoshop and use one curves adjustment set to Color blending mode, then a second set to Luminosity blending mode. In the first I white balance each color range with separate RGB curves; in the second I adjust the luminance. I found that doing it all with a single curve adjustment was messing with my colors, specifically because correcting for the Nikon tone curve (whether or not my thinking is correct) involves some pretty steep corrections for the second patch, and getting the luminance right meant my colors would start to go loopy.

This is of course time-consuming, so if 3D LUT Creator will take care of this work, that would be great. Is that in fact what it does?

Okay guys, also full disclosure: I haven't tried Corona, though I can see people are getting fantastic results. I just found myself reading the forum, as there are so many great discussions.

PS. The thread showing the curve response of all the 3ds Max adjustments was brilliant... ahh, the mysteries are unlocked... too bad it was mostly bad news, hehe.



« Last Edit: 2016-06-18, 05:53:10 by philipb »

2016-06-19, 10:17:17
Reply #27

dubcat

  • Active Users
  • **
  • Posts: 452
  • ฅ^•ﻌ•^ฅ meow
    • View Profile
1) You use the term linearize.
Hey man.
Yeah, this can be confusing; it's basically both of the things you said.
You don't have to think about this stuff when you are using 3D LUT Creator. I'll post some pictures further down.
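Just to make meaning B concrete, here's a minimal sketch of pulling the six ColorChecker neutral patches onto target values with a simple 1D curve. The "measured" numbers are made up, and the targets are the commonly quoted 8-bit sRGB values (white 243 down to black 52); this is only an illustration, not what 3D LUT Creator actually does internally.

import numpy as np

measured  = np.array([ 30.0,  60.0, 105.0, 150.0, 205.0, 250.0])  # hypothetical patch reads, black -> white
reference = np.array([ 52.0,  85.0, 122.0, 160.0, 200.0, 243.0])  # common sRGB targets, black -> white

def linearize(values_8bit):
    # piecewise-linear curve through the six patch pairs, applied per channel
    return np.interp(values_8bit, measured, reference)

print(linearize(np.array([30.0, 105.0, 250.0])))  # -> [ 52. 122. 243.]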

2) The inbuilt tone curve of our cameras.
I know exactly what you are talking about! I actually think this is the RAW file decoder's fault.
I noticed that Camera Raw / Lightroom gave me super contrasty pictures, even though I used calibrated camera profiles, a linear curve and everything zeroed out.
After some googling around I found a free program called "dcraw" that gave me better results.
Not long after that I began using 3D LUT Creator. 3D LUT Creator uses "libraw" as its decoder, and "libraw" is based on "dcraw".

This is how Camera Raw decodes the RAW file: super contrasty.



When you open a RAW file with 3D LUT Creator, it decodes the file with "libraw" and saves it as a 16-bit LogC TIFF. This way you keep the dynamic range.
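The thread doesn't say exactly which log curve 3D LUT Creator uses, but purely as an illustration of why a log encode fits a huge scene range into a 16-bit file, here is the published ARRI LogC (V3, EI 800) encode:

import numpy as np

cut, a, b = 0.010591, 5.555556, 0.052272
c, d      = 0.247190, 0.385537
e, f      = 5.367655, 0.092809

def logc_encode(x):
    # scene-linear -> LogC code values (0..1-ish)
    x = np.asarray(x, dtype=np.float64)
    return np.where(x > cut, c * np.log10(a * x + b) + d, e * x + f)

# Scene-linear values spanning almost ten stops all land comfortably inside 0..1:
print(logc_encode([0.0, 0.01, 0.18, 1.0, 8.0]))  # ~[0.09, 0.15, 0.39, 0.57, 0.79]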



You can customize how "libraw" decodes the RAW file here, but the default is pretty good.



The viewport will load with a LogC LUT, so you never actually see the flat LogC version.
This version has a more "linear" look, if you ask me.



This is what you get when you align the checker pattern and click Match.



3D LUT Creator does this by first adjusting Exposure/White Balance, and then it creates a new matrix.
No curves are used.
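For intuition, that matrix step can be thought of as fitting a 3x3 color matrix so the shot's ColorChecker patches land on their reference values. Here's a conceptual least-squares sketch with made-up patch values; this is not 3D LUT Creator's actual solver.

import numpy as np

shot_patches = np.array([[0.20, 0.12, 0.10],   # hypothetical linear RGB reads of four patches
                         [0.10, 0.30, 0.12],
                         [0.09, 0.11, 0.35],
                         [0.40, 0.38, 0.36]])
ref_patches  = np.array([[0.18, 0.10, 0.08],   # hypothetical reference values for the same patches
                         [0.08, 0.28, 0.10],
                         [0.07, 0.09, 0.33],
                         [0.36, 0.36, 0.35]])

M, *_ = np.linalg.lstsq(shot_patches, ref_patches, rcond=None)  # 3x3 matrix, least-squares fit
corrected = shot_patches @ M                                    # apply the same matrix to every pixel
print(np.round(corrected, 3))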



In some cases using the "Linearize with curves" tool after "Match" will give you a slightly better result, and in other cases it will ruin the whole picture.
It's a trial-and-error thing.





When you are done calibrating, you click "Send LUT to Photoshop" and apply it to the pictures without a ColorChecker.

too bad it was mostly bad news
hehe, yeah :P

EDIT:
I use "Clip" and not "Blend" (the default).
Blend will tonemap and mess up your whites. Look at the orange-tinted white!



I would recommend these settings



And when I use "Match", I toggle the "Offsets" button (below "Match"); this will give you better results.
« Last Edit: 2016-06-19, 18:36:52 by dubcat »

2016-06-25, 15:48:14
Reply #28

philipb

  • Active Users
  • **
  • Posts: 16
    • View Profile

Hey Dubcat, thanks for the detailed reply.

You are 100% right. It's the raw decoding that introduces the contrast I have been fighting against. I have been starting from the Adobe Standard camera calibration in ACR, which, I now realize, has not been a great starting point.

3D LUT Creator's LogC starting point got me thinking... I understand that log is used by the video guys to shoot 'flat' and preserve maximum dynamic range ('flat' = 'linear'). So really, what is needed is a flat starting point.

Being too cheap to purchase 3D LUT Creator for the moment (though it looks like the perfect tool for processing textures), I looked into a few other options. Here is what I found.

1) dcraw, your suggestion, definitely gave me a much flatter starting point. Great! In fact, after setting my white and black points (243/52), all the other patches were pretty much in the right places; first time I've seen that. Also, interestingly, using dcraw I got about 30 extra pixels of resolution out of my sensor. Hilarious, I guess Nikon feels those edge pixels are not fit for consumption.

I'll look into this further, but I must say the lack of a GUI is a bit of a bummer. Hey, do you remember what settings you were using, by chance?

2) The LOG vision camera profile for ACR. It's essentially a very flat conversion, perhaps too flat. Also, looking at the histogram, it's almost as though something has been clipped; if I try to stretch the black and white points to the edges of the histogram, it's like the data hits an invisible wall. I also found the colors to be a little off. I have to do more playing here. It may be an option, but it may also have problems.

3) Nikon Capture NX-D's 'Flat' profile. After stumbling across something about Nikon's Flat profile online, I decided to download their free raw developer. Paydirt! Though this software is a horrendous piece of crap, the initial conversion using the Flat profile is pretty damn good. From there I can export to TIF and keep going. This gives me the best colors and results, I think. This will likely be my new texture workflow.

I'll post a comparison of the three images when I get a chance; just wanted to follow up and share what I'd found. Of course 3D LUT Creator is clearly the way to go, but for now I'm definitely getting a much better initial conversion. I wonder if I can convert with Flat, then create a color matrix profile with the Adobe DNG Profile Editor to apply back to the TIFs in ACR. Haven't tried that yet; I'll post back.

PS. I don't know if you saw, but Episcura, the texture site, is giving a month of free pro access for every 8 textures you submit, or 3 if they are tiled. Not bad. Now if only they had a section of textures shot with Macbeth charts; that would be really amazing.

Okay, thanks again DC!


2016-06-25, 16:46:42
Reply #29

dubcat

  • Active Users
  • **
  • Posts: 452
  • ฅ^•ﻌ•^ฅ meow
    • View Profile
I'm glad you found alternatives that work for you.

I basically had a dcraw.bat file with these settings.

dcraw-9.27-ms-64-bit.exe -v -H 0 -o 1 -q 3 -4 -T raw.CR2

-v
Provides textual information about the RAW development process.

-H
This option will set the highlight behaviour: 0=clip (Same as I use in 3dLUT)

-o
Sets the output colour profile: 1=sRGB

-q
Sets the quality of the Bayer demosaicing algorithm employed: 3=AHD (Same as I use in 3dLUT)

-4
Generate a linear 16-bit file instead of an 8-bit gamma corrected file.

-T
Output a TIFF instead of PPM.
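If you want to run those exact flags over a whole folder, a tiny wrapper script could look like this (the loop and folder name are my own illustration; the exe name and flags are taken from the command above):

import subprocess
from pathlib import Path

raw_dir = Path("raw_shots")                       # placeholder folder of .CR2 files
for cr2 in sorted(raw_dir.glob("*.CR2")):
    subprocess.run(
        ["dcraw-9.27-ms-64-bit.exe",
         "-v",                                    # verbose progress output
         "-H", "0",                               # clip highlights
         "-o", "1",                               # sRGB output profile
         "-q", "3",                               # AHD demosaicing
         "-4",                                    # linear 16-bit output
         "-T",                                    # write TIFF instead of PPM
         str(cr2)],
        check=True)                               # dcraw writes <name>.tiff next to the source file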

This is what you get.
