Author Topic: HOWTO: Capture and Calibrate Textures (PBR Style)  (Read 34930 times)

2015-11-26, 15:56:05

dubcat

  • Active Users
  • **
  • Posts: 452
  • ฅ^•ﻌ•^ฅ meow
    • View Profile
This is how I capture and calibrate my textures for PBR rendering.

The basic PBR Albedo/Diffuse rule is that you stay within 53-243 sRGB.
Lucky for us, the Macbeth chart's black and white swatches sit right at 53 and 243, woohoo!
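If you want a quick sanity check of a finished albedo map against that rule, here is a small Python sketch that counts how many pixels fall outside 53-243 in the 8-bit sRGB file (the file name and the imageio loader are placeholders, any 8-bit image reader will do; it assumes an RGB/RGBA image):

Code: [Select]
import numpy as np
import imageio.v3 as iio  # assumed loader, any 8-bit image reader works

ALBEDO_MIN, ALBEDO_MAX = 53, 243          # sRGB bounds from the rule above

img = iio.imread("albedo.png")            # hypothetical 8-bit sRGB albedo texture
luma = img[..., :3].mean(axis=-1)         # rough per-pixel brightness

too_dark = (luma < ALBEDO_MIN).mean() * 100
too_bright = (luma > ALBEDO_MAX).mean() * 100
print(f"{too_dark:.1f}% of pixels below {ALBEDO_MIN}, {too_bright:.1f}% above {ALBEDO_MAX}")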

I'm most interested in the "Luminance" of the calibrated texture and not the colors. But it doesn't hurt to have correct colors too!

- Must Have Gear -
* DSLR Camera (or a camera that can shoot raw/manual)
* Macbeth chart ("X-Rite ColorChecker Passport" is a popular one)

- Good to Have Gear -
* Camera Tripod
* Camera Remote Shutter Trigger
* Polarizing Filter (2x if you want to cross polarize)
* Camera External Flash (If you want to cross polarize)

- Camera Tips -
* Capture textures when there is diffuse light, like an overcast day.
* Always shoot in manual.
* Always shoot in RAW.
* Set a custom White Balance. (Optional)
* Use "Faithful" profile on your camera. (Optional)
* Try to keep the Macbeth chart as parallel to the surface as possible. (Sometimes I put tape on the back)
* Don't overexpose, you do not want to clip the Macbeth chart.
* If you are cross polarizing, set the shutter speed to 1/200 or faster; this will cancel out the ambient light.
* When you use a polarizing filter you will lose around 1 stop of light and the white balance will shift.

- Creating the Camera Profile -
Take a picture of your Macbeth chart in the same lighting conditions as your textures.
If you are using a Macbeth chart from X-Rite, download the "ColorChecker" app.
Start the "ColorChecker" app and load the picture.
The "ColorChecker" app will do its thing.
That's all, you are done.

- Calibration -
I use Lightroom most of the time, but you can do the same stuff in Photoshop with CameraRAW. I'm using Photoshop in this guide.
Remember that these are RAW images.

If you open the RAW image in Photoshop without changing anything it will look like this. eeeeew nasty!



Before we can calibrate the image, we need to linearize it.

Open the "Camera Calibration" tab
Change the "Process" to 2010.
If you made a "Camera Profile", select it.



Open the "Lens Corrections" tab
Enable "Lens Profile Corrections" and "Remove Chromatic Aberration"



Open the "Detail" tab
Disable "Sharpening Amount"



Open the "Tone Curve" tab
Change the "Curve" to "Linear"



Open the "Basic" tab
Set "Blacks" / "Brightness" / "Contrast" to 0



The picture should now be linear and look like this.



Time to calibrate the image!

Select the "Color Sampler Tool"
Click on the "Luminance" swatches like this.



Let's fix the White Balance first.
Click on the "White Balance Tool"
Click on sample #2



I have made this little cheat chart to show you what the values should be.
The RGB is for Photoshop and the % is for Lightroom.
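If you only have one of the two readouts, the conversion for the neutral swatches is simply the 0-255 value divided by 255. A tiny Python sketch, using commonly cited approximate sRGB targets for the six neutral patches (treat the numbers as approximations, chart batches differ by a point or two):

Code: [Select]
# Approximate 8-bit sRGB targets for the ColorChecker neutral row, white -> black.
# Commonly cited values, not measurements of any specific chart (black is quoted as 52 or 53).
targets = {1: 243, 2: 200, 3: 160, 4: 122, 5: 85, 6: 52}

for sampler, rgb in targets.items():
    print(f"Sampler #{sampler}: Photoshop RGB {rgb:3d}  ~  Lightroom {rgb / 255 * 100:.1f}%")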



You want to get Sampler #1 and #6 correct first. That's 243 and 53.
Start by adjusting the "Exposure" slider until you get 243.
If #6 is too dark, use "Fill Light"; if it's too bright, use "Blacks".



#1 and #6 are correct, but the other swatches still need some tweaking.
This is where we use the "Targeted Adjustment Tool".
Select the tool and drag your mouse up and down over the swatches.
You will need to nudge each swatch several times before they all read correctly.
You can still adjust "Exposure", "Blacks", etc.



When you are done, click on the "Presets" tab and create a preset.



Open the picture without the Macbeth chart and click on the preset.

Here is the calibrated picture, niiice.




The picture we just calibrated was shot without any polarizing filter.
If you want to capture really good textures, you will need to invest in cross polarization.

Here is the albedo texture from cross polarization.
That's one dirty concrete wall ! What is that ? diarrhea splatter ? Guess we will never know.



and this is the specular. (These were only linearized, not calibrated.)

« Last Edit: 2015-11-28, 03:37:21 by dubcat »
             ___
    _] [__|OO|
   (____|___|     https://www.twitch.tv/dubca7 / https://soundcloud.com/dubca7 / https://dubcatshideout.com  ( ͡° ͜ʖ ͡°)

2015-11-27, 22:06:24
Reply #1

lacilaci

  • Active Users
  • **
  • Posts: 757
    • View Profile
Interesting, correct me if I'm wrong, but if you shoot in RAW, why choose the "Faithful" profile in camera? It's only good for the in-camera preview; you can even shoot in monochrome and it won't make the RAW monochrome anyway. Or does the setting change something for the RAW?

2015-11-27, 22:54:51
Reply #2

romullus

  • Global Moderator
  • Active Users
  • ****
  • Posts: 5511
  • Let's move this topic, shall we?
    • View Profile
    • My Models
It won't change anything in the RAW, but PS or Lightroom will recognise that profile and automatically apply the corresponding tone-mapping settings.
I'm not Corona Team member! Everything i say, is my personal opinion only.
R.I.P. Niki Lauda

2015-11-27, 23:13:16
Reply #3

lacilaci

  • Active Users
  • **
  • Posts: 757
    • View Profile
It won't change anything in the RAW, but PS or Lightroom will recognise that profile and automatically apply the corresponding tone-mapping settings.

Does it? I kinda remember it always resetting those settings... like monochrome opening right away as colored, etc. But I may remember this wrong, using DxO these days :)

2015-11-28, 02:00:27
Reply #4

dubcat

  • Active Users
  • **
  • Posts: 452
  • ฅ^•ﻌ•^ฅ meow
    • View Profile
The custom White Balance and onboard Camera Profile are optional. I just do it to keep everything consistent and non-auto, it makes me sleep better at night ;)
             ___
    _] [__|OO|
   (____|___|     https://www.twitch.tv/dubca7 / https://soundcloud.com/dubca7 / https://dubcatshideout.com  ( ͡° ͜ʖ ͡°)

2016-02-23, 02:41:57
Reply #5

SeBass

  • Users
  • *
  • Posts: 2
    • View Profile
Hi dubcat, just one quick question. If the final texture with the correct albedo is "brighter" than the original one, I suppose the final render will look washed out. So what would be the next step once the render is done? Apply a contrast curve or something in post? Apologies if I'm not getting things right. My mother language isn't English, so I'm doing my best to understand this topic.

2016-02-23, 08:19:24
Reply #6

dubcat

  • Active Users
  • **
  • Posts: 452
  • ฅ^•ﻌ•^ฅ meow
    • View Profile
Hey SeBass
You are correct, PBR-calibrated renders look bland.
Personally I use ArionFX in Photoshop. I love the default "RandomControl Camera" and "Canon DSC S315 #2" response curves.
I've emulated the "RandomControl Camera" curve and shared it here as a LUT.

On another note, I recommend that you underexpose your textures. I've captured a lot of bright kitchen textures, and when I processed them in Photoshop/Lightroom they came out really overexposed/wrong. Lately I've been calibrating my stuff with 3D Lut Creator, kickass tool.
             ___
    _] [__|OO|
   (____|___|     https://www.twitch.tv/dubca7 / https://soundcloud.com/dubca7 / https://dubcatshideout.com  ( ͡° ͜ʖ ͡°)

2016-04-11, 13:23:49
Reply #7

SairesArt

  • Active Users
  • **
  • Posts: 657
  • Pizza | The Cheesen One
    • View Profile
    • SairesArt Portfolio
Really awesome guide! Looked into cross polarisation; it never came to my mind to use a polarizer like that!
Found this resource from a game dev on that topic: http://filmicgames.com/archives/233

@Dubcat: Since a simple photo is technically diffuse + specular, wouldn't the correct path be to polarize the simple photo as well and calibrate the image through the polarizer with the chart? Or, since we aim to shoot when it's overcast, does the specularity not matter enough with textures to influence the resulting image?
Or do you already use the polarizer in the albedo image and I didn't read it correctly in your guide?

2016-04-11, 19:10:21
Reply #8

dubcat

  • Active Users
  • **
  • Posts: 452
  • ฅ^•ﻌ•^ฅ meow
    • View Profile
Or do you already use the polarizer in the albedo image and I didn't read it correctly in your guide?
I didn't use a polarizer in the example as I wanted the guide to be generic.

This is my current workflow.

01. Take a picture of the chart with cross polarization.
02. Take a picture of the texture you want with cross polarization.

03. Open the chart picture in 3D Lut Creator.
04. Run the ColorChecker tool.
05. Save the LUT.

06. Apply the LUT to the texture pictures in Photoshop.

You can test this for yourself with the demo version of 3D Lut Creator, but you can't save the LUT/Pictures.
They have an official ColorChecker tutorial (Two Parts)

             ___
    _] [__|OO|
   (____|___|     https://www.twitch.tv/dubca7 / https://soundcloud.com/dubca7 / https://dubcatshideout.com  ( ͡° ͜ʖ ͡°)

2016-04-11, 19:33:02
Reply #9

Juraj Talcik

  • Active Users
  • **
  • Posts: 3435
  • Tinkering away
    • View Profile
    • studio website
Dubcat, have you thought of building a personal scanner? It would give normal/height from the cross-polarization. Not sure how, but I've seen a few of those setups recently; they didn't seem that complicated at all.
talcikdemovicova.com  Website and blog
be.net/jurajtalcik    My Behance portfolio
lysfaere.com   Something new

2016-04-11, 20:11:49
Reply #10

dubcat

  • Active Users
  • **
  • Posts: 452
  • ฅ^•ﻌ•^ฅ meow
    • View Profile
I began building a scanner two summers ago, but I ran into a couple of problems.
This guy solved one of my problems by getting a free “Fast Polarization Modulator", lucky guy!


I kinda gave up after that.
             ___
    _] [__|OO|
   (____|___|     https://www.twitch.tv/dubca7 / https://soundcloud.com/dubca7 / https://dubcatshideout.com  ( ͡° ͜ʖ ͡°)

2016-04-11, 20:16:00
Reply #11

Juraj Talcik

  • Active Users
  • **
  • Posts: 3435
  • Tinkering away
    • View Profile
    • studio website
Hah, his blog post even mentions the inspiration from Ready-at-Dawn. That's the one I was impressed by :- ) Looked like a ghetto version of Megascans you could possibly build.

I have a giant white room in my apartment building, so I could utilize that..

Edit: Read through all that...ok...I guess not.

Edit: And I thought the mechanical part looked so easy... I was more puzzled about whether the software that did the normal map and specular extraction was fully bespoke.
« Last Edit: 2016-04-11, 20:24:05 by Juraj_Talcik »
talcikdemovicova.com  Website and blog
be.net/jurajtalcik    My Behance portfolio
lysfaere.com   Something new

2016-04-11, 20:37:45
Reply #12

dubcat

  • Active Users
  • **
  • Posts: 452
  • ฅ^•ﻌ•^ฅ meow
    • View Profile
I need to look into this normalmap method you mentioned.

My little prototype box had LED strips on each side. I took 4 pictures and combined them in Photoshop like this.
Image 1: Top light in Green Channel and Left light in Red Channel
Image 2: Bottom light in Green Channel and Right light in Red Channel
Then I combined the two images with Overlay blend and filled the Blue Channel with #8080FF (And adjusted it afterwards).
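For anyone who wants to play with the same idea outside Photoshop, here is a rough Python sketch. It is a simplified difference-based variant of the channel trick above, not a literal translation of the Overlay stack, and it assumes the four captures are grayscale floats in 0-1:

Code: [Select]
import numpy as np

def normal_from_lights(top, bottom, left, right, strength=1.0):
    # Opposing light directions are differenced to estimate the slope per axis.
    nx = (right - left) * strength                         # red channel
    ny = (top - bottom) * strength                         # green channel (flip for DirectX-style maps)
    nz = np.sqrt(np.clip(1.0 - nx**2 - ny**2, 0.0, 1.0))   # rebuilt blue channel
    normal = np.stack([nx, ny, nz], axis=-1)
    return normal * 0.5 + 0.5                              # pack -1..1 into 0..1, flat = #8080FF

# toy usage with four fake captures
rng = np.random.default_rng(0)
top, bottom, left, right = (rng.random((4, 4)) for _ in range(4))
print(normal_from_lights(top, bottom, left, right).shape)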
             ___
    _] [__|OO|
   (____|___|     https://www.twitch.tv/dubca7 / https://soundcloud.com/dubca7 / https://dubcatshideout.com  ( ͡° ͜ʖ ͡°)

2016-04-11, 20:51:15
Reply #13

Juraj Talcik

  • Active Users
  • **
  • Posts: 3435
  • Tinkering away
    • View Profile
    • studio website
I need to look into this normalmap method you mentioned.

My little prototype box had LED strips on each side. I took 4 pictures and combined them in Photoshop like this.
Image 1: Top light in Green Channel and Left light in Red Channel
Image 2: Bottom light in Green Channel and Right light in Red Channel
Then I combined the two images with Overlay blend and filled the Blue Channel with #8080FF (And adjusted it afterwards).

Like this ?

http://www.zarria.net/nrmphoto/nrmphoto.html
talcikdemovicova.com  Website and blog
be.net/jurajtalcik    My Behance portfolio
lysfaere.com   Something new

2016-04-11, 21:13:39
Reply #14

dubcat

  • Active Users
  • **
  • Posts: 452
  • ฅ^•ﻌ•^ฅ meow
    • View Profile
Yes. I used megascan normal maps as reference back then.
Maybe they are using the same process.

EDIT: I forgot to say that I used "High Pass" with a 256 radius on the pictures. That's the secret dDo value.

« Last Edit: 2016-04-11, 22:56:04 by dubcat »
             ___
    _] [__|OO|
   (____|___|     https://www.twitch.tv/dubca7 / https://soundcloud.com/dubca7 / https://dubcatshideout.com  ( ͡° ͜ʖ ͡°)

2016-05-12, 14:45:17
Reply #15

JakubCech

  • Active Users
  • **
  • Posts: 119
  • jakubcech.net
    • View Profile
    • jakubcech
Hey DubCat, your posts are amazing, +1 from me to keep posting such stuff :)
I would love to ask you: can you point me at some good sources of this PBR related stuff? Or some PBR/Disney/game/whatever related stuff / channels / videos you think are very interesting.
One dumb question: why shoot RAW? What if someone uses a smartphone with manual settings (not talking about resolution or just better sensor capabilities)?
Thank you.
« Last Edit: 2016-05-12, 15:21:14 by JakubCech »

2016-05-12, 17:33:12
Reply #16

dubcat

  • Active Users
  • **
  • Posts: 452
  • ฅ^•ﻌ•^ฅ meow
    • View Profile
Can you point me at some good sources of this PBR related stuff? Or some PBR/Disney/game/whatever related stuff / channels / videos you think are very interesting.
Hey man. I've linked to a couple of guides and materials that I know are proper PBR.

There are two types of PBR.
* Cheap-ass "Bitmap2Material 3" materials that most texture stores are selling. (They are still PBR, just not calibrated to anything in the real world.)
* Properly calibrated materials. (This can mean calibrated color values, 3D-scanned normals/displacement, or shadow cancellation by capturing an HDRI of the environment.)

Color Values
- You can calibrate the values like I did here in Photoshop/Lightroom, or you can use 3D Lut Creator. 3D Lut Creator will auto-generate the correct matrix, change the exposure and linearize the picture.
They have two official tutorials covering this topic.
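To take some of the magic out of the "auto-generate the correct matrix" step: it is essentially a least-squares fit of a 3x3 matrix that maps the linearized captured patch colors onto the known chart values. A minimal Python sketch of that idea, with placeholder data standing in for real measurements and reference values:

Code: [Select]
import numpy as np

# Linear RGB of the 24 patches as measured in your shot (placeholder data)
measured = np.random.default_rng(1).random((24, 3))
# Linear RGB reference values for the same patches (placeholder data)
reference = np.random.default_rng(2).random((24, 3))

# Solve measured @ M ~ reference in the least-squares sense
M, *_ = np.linalg.lstsq(measured, reference, rcond=None)

corrected = measured @ M   # apply the same matrix to the whole image afterwards
print("3x3 matrix:\n", M)
print("mean abs error:", np.abs(corrected - reference).mean())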

Normals/Displacement
- Most people use PhotoScan for 3D scanning, but I've heard good stuff about this new bad boy, Capturing Reality.
This tutorial series about PhotoScan is great, check it out.

Shadow Cancellation
- You need a good amount of equipment to do this.
   * Lens and tripod mount to shoot the HDRI
   * Grey ball (a cheap solution would be to use the ColorChecker grey card)
   * Chrome ball (you use this ball to align the HDRI inside 3ds Max; it is not for capturing the HDRI like in the old days)

   * You re-create the grey/chrome ball in 3ds Max.
   * Rotate the HDRI until the HDRI reflections in the chrome ball match the real-life reference picture.
   * Change the exposure of the HDRI until the grey ball matches the real-life reference picture.

   * Bake a map that contains the shadows and divide it out of the diffuse texture in linear space (see the sketch below).
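That last divide is a one-liner once both maps are linear floats. A minimal Python sketch, assuming the diffuse capture and the baked shadow/lighting map are already linearized and aligned:

Code: [Select]
import numpy as np

def cancel_shadows(diffuse_lin, shadow_lin, eps=1e-4):
    # Divide the baked shadow/lighting map out of the linear diffuse capture.
    albedo = diffuse_lin / np.maximum(shadow_lin, eps)   # eps avoids division by zero
    return np.clip(albedo, 0.0, 1.0)

# toy check with synthetic data: shaded diffuse / shading ~ original albedo
rng = np.random.default_rng(3)
albedo = rng.random((8, 8, 3)) * 0.5
shading = rng.random((8, 8, 1)) * 0.5 + 0.5
print(np.allclose(cancel_shadows(albedo * shading, shading), albedo, atol=1e-3))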

I made a quick example here



Great presentations from Epic, check them out.

Proper info
Allegorithmic, Marmoset, Epic and Eat3D

Proper reference materials
Allegorithmic and Quixel

One dumb question: why shoot RAW? What if someone uses a smartphone with manual settings (not talking about resolution or just better sensor capabilities)?
Camera manufacturers want their pictures to look good out of the box, so they apply white balance, contrast, sharpening, de-noising etc. and save the result as a JPG.
When you take a picture in RAW, you get the raw sensor information. Nothing is baked in.

Another important thing is that a JPG is 8-bit, while a RAW file typically holds 12-14 bits of sensor data.
With RAW you can make major adjustments without messing up the picture.
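A tiny Python illustration of why those extra bits matter when you push things in post: count the tonal steps available in the darkest two stops of the signal for an 8-bit file versus a typical 14-bit sensor readout (the gamma encoding of JPG is ignored here for simplicity, so this is just the rough idea):

Code: [Select]
import numpy as np

# Distinct tonal steps available in the darkest two stops (0..25% signal)
signal = np.linspace(0.0, 0.25, 100000)
jpg_steps = len(np.unique(np.round(signal * (2**8 - 1))))    # 8-bit JPG
raw_steps = len(np.unique(np.round(signal * (2**14 - 1))))   # 14-bit RAW (typical DSLR)
print(jpg_steps, "steps in JPG vs", raw_steps, "in RAW over the same shadow range")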
« Last Edit: 2016-05-12, 19:43:05 by dubcat »
             ___
    _] [__|OO|
   (____|___|     https://www.twitch.tv/dubca7 / https://soundcloud.com/dubca7 / https://dubcatshideout.com  ( ͡° ͜ʖ ͡°)

2016-05-13, 11:08:04
Reply #17

-Ben-Battler-

  • Active Users
  • **
  • Posts: 171
    • View Profile
Thank you for the links dubcat, those are really interesting!
Visit Pangaroo

2016-05-13, 12:19:34
Reply #18

romullus

  • Global Moderator
  • Active Users
  • ****
  • Posts: 5511
  • Let's move this topic, shall we?
    • View Profile
    • My Models
Shadow Cancellation
- You need a good amount of equipment to do this.
   * Lens and tripod mount to shoot the HDRI
   * Grey ball (a cheap solution would be to use the ColorChecker grey card)
   * Chrome ball (you use this ball to align the HDRI inside 3ds Max; it is not for capturing the HDRI like in the old days)

   * You re-create the grey/chrome ball in 3ds Max.
   * Rotate the HDRI until the HDRI reflections in the chrome ball match the real-life reference picture.
   * Change the exposure of the HDRI until the grey ball matches the real-life reference picture.

   * Bake a map that contains the shadows and divide it out of the diffuse texture in linear space.

Tons of useful information, as always. Big thanks! I have one question about lighting-information removal, though. If the HDRI panorama is needed only for lighting removal, wouldn't capturing it with the old-school method (chrome ball) be more than enough for that? That would save a lot of time when capturing and would be much less expensive.
I'm not Corona Team member! Everything i say, is my personal opinion only.
R.I.P. Niki Lauda

2016-05-13, 18:01:37
Reply #19

dubcat

  • Active Users
  • **
  • Posts: 452
  • ฅ^•ﻌ•^ฅ meow
    • View Profile
Wouldn't capturing it with the old-school method (chrome ball) be more than enough for that?
I've never tried with a mirror ball before. Right now I use a scratched-up steel ball for HDRI alignment at home; it's not clear enough to be used for mirror-ball HDRI capture.
I'm planning on purchasing the Lighting Checker "Twins". (I guess that steel ball is clear enough to do a proper test.)

Would be cool if someone with a sexy chrome/steel ball could give it a try. It could be a great poor man's alternative if it works.
             ___
    _] [__|OO|
   (____|___|     https://www.twitch.tv/dubca7 / https://soundcloud.com/dubca7 / https://dubcatshideout.com  ( ͡° ͜ʖ ͡°)

2016-05-13, 19:29:59
Reply #20

karnak

  • Primary Certified Instructor
  • Active Users
  • ***
  • Posts: 64
    • View Profile
I have a small chrome ball, but I have never tried to capture an hdri panorama with it.
There are no big scratches on it, but it is a dust magnet!
Corona Academy (May 2017)

2016-05-15, 22:44:49
Reply #21

dubcat

  • Active Users
  • **
  • Posts: 452
  • ฅ^•ﻌ•^ฅ meow
    • View Profile
I sampled some albedo values today.
When I shot the last picture it was starting to get dark and blueish outside.
We can use this underexposed blue picture to compare RAW and JPG for people who want to see the difference.

Open them in a new tab for better comparison.

It's the same shot, but the camera has applied Lens Correction to the JPG.



After 3D LUT Creator has Linearized them.



After 3D LUT Creator has generated a new matrix.

             ___
    _] [__|OO|
   (____|___|     https://www.twitch.tv/dubca7 / https://soundcloud.com/dubca7 / https://dubcatshideout.com  ( ͡° ͜ʖ ͡°)

2016-06-14, 14:41:54
Reply #22

JakubCech

  • Active Users
  • **
  • Posts: 119
  • jakubcech.net
    • View Profile
    • jakubcech
Thank you for all the useful information dubcat. I ran through more or less all the info and your posts, very useful. In the RAW-to-JPEG comparison, which one, in its final look, better resembled the real appearance you saw?
 + I saw you posting the link for the akromatic gadget. Do you happen to know of any source where different materials are shot together with this chart? That would make a great reference for material creation.

2016-06-14, 16:50:10
Reply #23

dubcat

  • Active Users
  • **
  • Posts: 452
  • ฅ^•ﻌ•^ฅ meow
    • View Profile
In the RAW-to-JPEG comparison, which one, in its final look, better resembled the real appearance you saw?
It was starting to get dark and blue outside, so my eyes saw something like "RAW as Shot".
That's why we have to calibrate the texture with the ColorChecker: we want the original albedo value.

I kinda messed up in this example. There had just been a major update to 3D LUT Creator, and they made some big changes to the Linear/New Matrix process.
You can see that the shadows in the RAW image are kinda purple; this is because I used the old method I was used to with the new 3D LUT Creator.
I can post new, proper comparison shots.

do you happen to know of any source where different materials are shot together with this chart? That would make a great reference for material creation.
Wish I did! That's why I run around and capture Albedo values every now and then.
             ___
    _] [__|OO|
   (____|___|     https://www.twitch.tv/dubca7 / https://soundcloud.com/dubca7 / https://dubcatshideout.com  ( ͡° ͜ʖ ͡°)

2016-06-14, 18:39:34
Reply #24

dubcat

  • Active Users
  • **
  • Posts: 452
  • ฅ^•ﻌ•^ฅ meow
    • View Profile
I had some spare time to run out and do a new comparison. This time I included cross polarization.
In the new 3DLUTCreator update you just press "Match" and everything is done for you.

If you have a ColorChecker Passport like me, don't include the top portion of the checker when you match; it will give you worse results.

With Cross Polarization
You can see that it has a hard time with the blacks; this is because there is no specular.
I'm working on a new Match preset that will work with cross polarized pictures.

You can see that the polarizing filter has a brown/green tint.




Without Cross Polarization




Here is a little test I did with Specular on the ColorChecker.



And this is how you create the Specular and Albedo textures.
I say 100%/50% here; this depends on your filter, only the ratio matters.
Here is my layer stack (linear 32-bit).

When you shoot cross polarization, you get two pictures:
one with 50% Albedo and one with 100% Specular and 50% Albedo.



Here is the 50% Albedo picture.



Duplicate this picture and change the Blend Mode to "Add"



This is the 100% Specular and 50% Albedo picture.



Duplicate one of the 50% Albedo pictures, put it above the Specular picture and change the Blend Mode to "Subtract".
You will notice that everything that has SSS has colored specular; leaves and skin have blue specular.
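In plain math, with the two captures as linear 32-bit floats, the whole layer stack boils down to two lines. A Python sketch, assuming the cross- and parallel-polarized shots are already linearized, aligned and equally exposed:

Code: [Select]
import numpy as np

def split_albedo_specular(cross_lin, parallel_lin):
    # cross_lin    = 50% albedo (specular cancelled by the crossed filters)
    # parallel_lin = 50% albedo + 100% specular
    albedo = cross_lin + cross_lin            # the "Add" layer: 50% + 50%
    specular = parallel_lin - cross_lin       # the "Subtract" layer
    return albedo, np.clip(specular, 0.0, None)

# toy check: build a fake albedo and specular and verify they come back out
rng = np.random.default_rng(4)
true_albedo, true_spec = rng.random((4, 4, 3)), rng.random((4, 4, 3)) * 0.2
a, s = split_albedo_specular(0.5 * true_albedo, 0.5 * true_albedo + true_spec)
print(np.allclose(a, true_albedo), np.allclose(s, true_spec))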

« Last Edit: 2016-06-17, 05:29:37 by dubcat »
             ___
    _] [__|OO|
   (____|___|     https://www.twitch.tv/dubca7 / https://soundcloud.com/dubca7 / https://dubcatshideout.com  ( ͡° ͜ʖ ͡°)

2016-06-15, 12:04:02
Reply #25

maru

  • Corona Team
  • Active Users
  • ****
  • Posts: 8665
  • Marcin
    • View Profile
I worship your posts. They make me want to go out and shoot all the crap around. Got to do that some time!

2016-06-18, 05:48:39
Reply #26

philipb

  • Active Users
  • **
  • Posts: 16
    • View Profile
Oh wow....

This is fantastic information....Thanks Dubcat!

I just have one or two questions. I'm trying to lock all this down in my mind, and have been reading everything I can get my hands on. I'm shooting my own textures, I have a Macbeth chart and have been experimenting with cross polarization. I'm very happy with the results, but I still have many questions.

1) You use the term linearize. I have seen others use the same term in reference to textures/albedo, but I'm not sure we are all using it the same way when it comes to texture processing (nothing to do with LWF or gamma). Some seem to mean A) basically that Lightroom/Camera Raw is at all its default settings, while others mean B) the idea of bringing the greyscale of the Macbeth chart into its proper luminance range with the Targeted Adjustment Tool.

I'm mainly asking because I'm curious what exactly 3D LUT does in this area. Is it like a raw image processor that will do all this adjustment work for you (in the sense of adjusting luminance ranges)?


2) This brings me to the part where I'm finding very little information: the inbuilt (Nikon, for me) tone curve of our cameras.

Now I know this may not be important, but I wish I could develop a DNG profile from/for my Nikon that was truly 'linear'. It seems that a lot of contrast is added to the raw images somewhere in the camera image processing. I have begun to think of the process of adjusting the Macbeth greyscale patches as removing this inbuilt tone curve.

Does this make sense to anyone else, or am I just imagining things? If our cameras had a truly linear/neutral response, wouldn't our photos come out a lot closer to the 'linearized' (luminance-adjusted) versions of our textures? When I'm adjusting my images to fit the proper Macbeth luminance range, I find I am always making very similar adjustments; somehow it's an inverse S-curve, always with a very steep dip between the first and second white patches. Am I wrong in thinking that I'm fighting the Nikon tone curve? If so, is there a way to remove it more automatically without buying 3D LUT?


Also, some other thoughts... I use a very similar approach to yours for adjusting the luminance patches in a targeted manner. However, after setting my basic black and white patches in ACR I go into Photoshop and use one Curves adjustment set to Color blending mode, then a second set to Luminosity blending mode. In the first I white balance each color range with separate RGB curves; in the second I adjust the luminance. I found that doing it all with a single curve adjustment was messing with my colors, specifically because correcting for the Nikon tone curve (whether or not my thinking is correct) involves some pretty steep corrections for the second patch, and getting the luminance right meant my colors would start to go loopy.

This is of course time consuming, so if 3d LUT will take care of this work that would be great. Is that in fact what it does?

Okay guys, also full disclosure: I haven't tried Corona, though I can see people are getting fantastic results. I just found myself reading the forum as there are so many great discussions.

PS. The thread showing the curve response of all the 3ds Max adjustments was brilliant... ahhh, the mysteries are unlocked... too bad it was mostly bad news, hehe.



« Last Edit: 2016-06-18, 05:53:10 by philipb »

2016-06-19, 10:17:17
Reply #27

dubcat

  • Active Users
  • **
  • Posts: 452
  • ฅ^•ﻌ•^ฅ meow
    • View Profile
1) You use the term linearize.
Hey man
Yeah this can be confusing, it's basically what you said, two things.
You don't have to think about this stuff when you are using 3D LUT Creator. I'll post some pictures further down.

2) The inbuilt tone curve of our cameras.
I know exactly what you are talking about! I actually think this is the RAW file decoder's fault.
I noticed that Camera Raw / Lightroom gave me super contrasty pictures, even though I used calibrated camera profiles, a linear curve and 0 on everything.
After some googling around I found a free program called "dcraw" that gave me better results.
Not long after that I began using 3D LUT Creator. 3D LUT Creator uses "libraw" as its decoder, and "libraw" is based on "dcraw".

This is how CameraRAW decodes the RAW file, super contrasty.



When you open a RAW file with 3D LUT Creator it decodes the file with "libraw" and saves it as a 16bit LogC tiff file. This way you keep the dynamic range.
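For the curious, a log encoding like LogC spends the 16-bit code values more evenly across the stops instead of linearly, which is why the shadows survive. The exact curve 3D LUT Creator uses isn't documented in this thread, so the Python sketch below uses the commonly published ARRI LogC (EI 800) constants purely as an illustration of the idea; treat the numbers as an assumption, not the tool's actual curve:

Code: [Select]
import numpy as np

def logc_encode(x):
    # ARRI-style LogC encode, published EI 800 constants (assumed, for illustration only)
    cut, a, b, c, d, e, f = 0.010591, 5.555556, 0.052272, 0.247190, 0.385537, 5.367655, 0.092809
    x = np.asarray(x, dtype=np.float64)
    return np.where(x > cut, c * np.log10(a * x + b) + d, e * x + f)

# linear mid grey (0.18) lands around 0.39, leaving plenty of code values for the shadows
print(logc_encode([0.0, 0.01, 0.18, 1.0]))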



You can customize how "libraw" decodes the RAW file here, but the default is pretty good.



The viewport will load with a LogC LUT, so you will never actually see the LogC version.
This version has a more "linear" look, if you ask me.



This is what you get when you align the checker pattern and click Match.



3D LUT Creator does this by first adjusting Exposure/White Balance, and then it creates a new matrix.
No curves are used.



In some cases, using the "Linearize with curves" tool after "Match" will give you a slightly better result; in other cases it will ruin the whole picture.
It's a trial-and-error thing.





When you are done calibrating, you click "Send LUT to Photoshop" and apply it to the pictures without a ColorChecker.

too bad it was mostly bad news
hehe, yeah :P

EDIT:
I use "Clip" and not "Blend" (The Default).
Blend will tonemap and mess up your whites. Look at the orange tinted white!



I would recommend these settings



And when I use "Match" I toggle the "Offsets" button (Below "Match"), this will give you better results.
« Last Edit: 2016-06-19, 18:36:52 by dubcat »
             ___
    _] [__|OO|
   (____|___|     https://www.twitch.tv/dubca7 / https://soundcloud.com/dubca7 / https://dubcatshideout.com  ( ͡° ͜ʖ ͡°)

2016-06-25, 15:48:14
Reply #28

philipb

  • Active Users
  • **
  • Posts: 16
    • View Profile

Hey Dubcat, thanks for the detailed reply.

You are 100% right. It's the RAW decoding that introduces the contrast I have been fighting against. I have been starting from the Adobe Standard camera calibration in ACR; this has not been a great starting point, I now realize.

3D LUT's LogC starting point got me thinking... I understand that log is used to shoot 'flat' for the video guys and preserve maximum dynamic range ('flat' = 'linear'). So really what is needed is a flat starting point.

Being too cheap to purchase 3D LUT for the moment (though it looks like the perfect tool for processing textures), I looked into a few other options. Here is what I found.

1) dcraw, your suggestion, definitely gave me a much flatter starting point. Great! In fact, after setting my white and black points (243/52) all the other patches were pretty much in the right places; first time I've seen that. Also, interestingly, using dcraw I got about 30 extra pixels of resolution out of my sensor, hilarious; I guess Nikon feels those edge pixels are not fit for consumption.

I'll look into this further, but I must say the lack of a GUI is a bit of a bummer. Hey, do you remember what settings you were using, by chance?

2) The LOG vision camera profile for ACR. It's essentially a very flat conversion, perhaps too flat. Also, looking at the histogram, it's almost as though something has been clipped; if I try to stretch the black and white points to the edges of the histogram, it's like the data hits an invisible wall. I also found the colors to be a little off. I have to do more playing here. It may be an option, but it may also have problems.

3) Nikon Capture NX 'Flat' profile. After stumbling across something about Nikon's Flat profile online I decided to download their free raw developer... paydirt! Though this software is a horrendous piece of crap, the initial conversion using the Flat profile is pretty damn good. From there I can export to TIFF and keep going. This gives me the best colors and results, I think. This will likely be my new texture workflow.

I'll post a comparison of the three images when I get a chance. Just wanted to follow up and share what I'd found. Of course 3D LUT is clearly the way to go, but for now I'm definitely getting a much better initial conversion. I wonder if I can convert with Flat and then create a color matrix profile with the Adobe DNG editor to apply back to the TIFFs in ACR... haven't tried that yet, will post back.

PS. I don't know if you saw, but Episcura, the texture site, is giving a month of free pro access for each 8 textures you submit, 3 if they are tiled. Not bad. Now if only they had a section of textures shot with Macbeth charts; that would be really amazing.

Okay, thanks again DC!


2016-06-25, 16:46:42
Reply #29

dubcat

  • Active Users
  • **
  • Posts: 452
  • ฅ^•ﻌ•^ฅ meow
    • View Profile
I'm glad you found alternatives that work for you.

I basically had a dcraw.bat file with these settings.

Code: [Select]
dcraw-9.27-ms-64-bit.exe -v -H 0 -o 1 -q 3 -4 -T raw.CR2

-v
Provides textual information about the RAW development process.

-H
This option will set the highlight behaviour: 0=clip (Same as I use in 3dLUT)

-o
Sets the output colour profile: 1=sRGB

-q
Sets the quality of the Bayer demosaicing algorithm employed: 3=AHD (Same as I use in 3dLUT)

-4
Generate a linear 16-bit file instead of an 8-bit gamma corrected file.

-T
Output a TIFF instead of PPM.

This is what you get.

             ___
    _] [__|OO|
   (____|___|     https://www.twitch.tv/dubca7 / https://soundcloud.com/dubca7 / https://dubcatshideout.com  ( ͡° ͜ʖ ͡°)

2016-11-28, 19:05:36
Reply #30

laurenth

  • Users
  • *
  • Posts: 3
    • View Profile
Hello here,

I have been using a similar workflow:
- shoot raw images with the chart (no polarization)
- get a linearized tiff from dcraw
- white balance and expose in Photoshop
-> all swatches match the chart pretty well

I recently started using cross-polarization, and I noticed I couldn't use the same workflow:
- when I white balance my images at the end of the process, the patches of the chart don't all fall into place as they do with non-polarized images.
-> if I expose according to the 3rd grey patch as I usually do, I get too bright a white and too dark a black.

It feels like cross-polarization, which I use to remove specular lighting, also removes the specular from the chart. It doesn't seem to remove the same amount of specular on all the patches, so I can no longer calibrate against the chart values.
Matching the cross-polarized patches with the reference values using something like 3D LUT Creator might try to compensate for that missing specular part and do some odd color transformations.

Did anyone come across the same situation?
Do you think the specular part on the chart is negligible, so we can consider there is no spec in the chart?

Any insights appreciated :)

2016-11-28, 19:49:39
Reply #31

dubcat

  • Active Users
  • **
  • Posts: 452
  • ฅ^•ﻌ•^ฅ meow
    • View Profile
I've made a custom calibration preset in 3d lut creator, it will only work with my camera/filter.
You can use my white to black values as a base to find the correct values for your own setup.

Albedo



Glossiness

« Last Edit: 2016-11-28, 23:54:36 by dubcat »
             ___
    _] [__|OO|
   (____|___|     https://www.twitch.tv/dubca7 / https://soundcloud.com/dubca7 / https://dubcatshideout.com  ( ͡° ͜ʖ ͡°)

2016-11-29, 10:49:26
Reply #32

laurenth

  • Users
  • *
  • Posts: 3
    • View Profile
Thanks for the answer, I feel less alone in the world ^^
May I ask how you came up with these values?

In my case, I tried to calibrate my polarized chart based on the brightest grey first, since I found it's the one that should have the least specular (because of its low glossiness), and even if there is a little specular in it, it should not be much compared to the value of the bright grey itself. I expect a bigger difference on the black.

2016-12-16, 22:43:54
Reply #33

dubcat

  • Active Users
  • **
  • Posts: 452
  • ฅ^•ﻌ•^ฅ meow
    • View Profile
My samples are still not perfect; I'm improving my camera/light setup every month.
The secret is to have everything calibrated and then recreate the scene in 3ds Max.
I have HDRIs of my light boxes and ring light (Diva Ring Light Nebula); these are calibrated to give me the same intensity/falloff as in real life. I've measured the distance/angle of these lights and recreated them in 3ds Max.
Then I shoot something with and without the polarizing filter. You can then create a glossiness map from the specular map you get from cross polarization (you need to light the object from all angles, or else the specular map will not be correct). I don't have any fancy app that generates the glossiness right now, so I adjust the map until the cross-polarized diffuse/glossiness match the non-polarized picture.

I use this dcraw script to make linear raws
Code: [Select]
dcraw-9.27-ms-64-bit.exe -v -w -H 0 -o 1 -q 3 -4 -T *.CR2
I haven't found anything online about this stuff, so I'm improving as I go ;)

Edit:
I want to mention that I adjust my white balance in camera. I shoot the grey card of my ColorChecker Passport, go into custom white balance on my Canon and select the picture. You might think "but I'm shooting in RAW, it doesn't matter". Well, it does matter when you run the raw through the script I just posted (the -w flag uses the camera white balance). The polarizing filter will tint your pictures; just white balance in camera and be done with it.
« Last Edit: 2016-12-17, 03:21:24 by dubcat »
             ___
    _] [__|OO|
   (____|___|     https://www.twitch.tv/dubca7 / https://soundcloud.com/dubca7 / https://dubcatshideout.com  ( ͡° ͜ʖ ͡°)

2017-12-06, 21:23:28
Reply #34

philipb

  • Active Users
  • **
  • Posts: 16
    • View Profile
Dubcat!!!

Just looking back over this thread and ran across this:

"And this is how you create the Specular and Albedo texture.
I say 100-50% here, this depends on your filter. Only the ratio matters.
Here is my Layer stack (Linear 32bit)

When you shoot cross polarization, you get two pictures.
One with 50% Albedo and one with 100% Specular and 50% Albedo"

Can you explain the 50% albedo issue? When I do X-pol I tend not to extract a spec pass and just end up making bump/spec from the albedo. But when I X-pol capture my albedo, I expose as close as possible to the ColorChecker values and then bring them fully in line with curves. Do you see a problem with this? Also, what is the significance of the 32-bit space here?

PS. Also, as we discussed before, in terms of getting a linear/flat RAW, I have found a great free open-source raw developer... it's called RawTherapee. It has a flat default which does not add any toning and so is perfect for developing textures from RAW. I would highly recommend it to anyone looking to develop their own textures. It accepts custom camera color calibration (a la Adobe DNG profiles) and lens correction profiles. It is really fantastic and is my go-to texture-from-raw tool.



2017-12-07, 20:10:55
Reply #35

dubcat

  • Active Users
  • **
  • Posts: 452
  • ฅ^•ﻌ•^ฅ meow
    • View Profile
Hey man!

Can you explain the 50% albedo issue?
When you cross polarize you will get either 50% albedo, or 50% albedo + 100% specular, depending on the angle.
If you add the 50% albedo to itself in 32-bit, you will get 100% albedo.

i expose as close as possible to the ColorChecker values

This is where the cross polarization headache comes in.
The polarizing filter will lower the exposure a little.
You will only get 50% albedo max.
You can try to calibrate the camera without the filter, since the ColorChecker values are 100% albedo + 100% specular, and then turn up the exposure to compensate for the filter.
Or the optimal solution would be to tether the camera and shoot both polarization pictures. Combine them in 32-bit with Add and see if the checker value is correct. If not, just remember to adjust both pictures with the same values.
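A small Python sketch of that last check: recombine via Add (doubling the cross-polarized shot, as in the layer stack earlier in the thread), read the white patch, and derive one correction factor to apply to both shots. The measured patch value is a hypothetical placeholder, and the target is the white swatch at sRGB 243 converted to linear:

Code: [Select]
import numpy as np

def srgb_to_linear(v8):
    v = v8 / 255.0
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

target_white = srgb_to_linear(243)      # linear value the white patch should reach

cross_white = 0.41                      # measured white-patch average, cross-polarized (placeholder)
recombined = cross_white + cross_white  # 50% + 50% albedo via "Add"

scale = target_white / recombined       # one exposure factor...
print(f"multiply BOTH shots by {scale:.3f}")  # ...applied to both pictures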

what is the significance of the 32bit space here?
Since we are doing math, we have to use 32-bit when combining the 50% and the 50/100% pictures.

Rawtherapee
Thanks! Will give it a try :)
             ___
    _] [__|OO|
   (____|___|     https://www.twitch.tv/dubca7 / https://soundcloud.com/dubca7 / https://dubcatshideout.com  ( ͡° ͜ʖ ͡°)

2018-06-17, 22:57:09
Reply #36

dj_buckley

  • Active Users
  • **
  • Posts: 234
    • View Profile
Hey Dubcat (or anyone else who might know the answer), sorry to reignite an old thread, but I'm keen to know why, in the original post from 2015, you change the Lightroom Process to 2010 initially for calibration.

I'm just looking into this properly now, and have been messing around in Lightroom with those Process options and find myself wondering what exactly they're doing and why you'd choose one over the other when it comes to texture creation.

I've just opened a random raw photo in Lightroom and 'linearized' it according to the steps in the first post. I then changed the process back to Version 4 (Current) and noticed it changes the tone curve and a few of the tone settings but maintains a similar look, so I found myself asking how critical this step is.

Thanks

2018-07-01, 03:15:10
Reply #37

dubcat

  • Active Users
  • **
  • Posts: 452
  • ฅ^•ﻌ•^ฅ meow
    • View Profile
Thanks

Hey man!

The reason I used 2010 is that Adobe moved away from "Recovery" after 2010. Before ACES, Camera Raw's Recovery was my go-to tone mapper; it's just too good. Adobe removed 32-bit support from Camera Raw in the 2018 update, but you can still manually download Camera Raw 9.9 from their site.

If you select 2010 and do adjustments and then select Version 4, Camera Raw will auto-convert the 2010 settings to Version 4.

Like many others, I've made the change from Photoshop to Affinity Photo, because Affinity Photo has 32-bit floating point support and its 360-degree HDRI mode is 1000 times faster than the 2017 Photoshop feature. Photoshop clamps floating point values above +16, and values below 0 are clamped to 0. If you try my ACES tone mapper Photoshop script, you might notice black pixels; this is because of the poor 32-bit support.

I have to edit the original post and update it to 2018 standards with homemade cheap Megascan setup.

Here are the differences

             ___
    _] [__|OO|
   (____|___|     https://www.twitch.tv/dubca7 / https://soundcloud.com/dubca7 / https://dubcatshideout.com  ( ͡° ͜ʖ ͡°)

2018-07-12, 21:02:23
Reply #38

dj_buckley

  • Active Users
  • **
  • Posts: 234
    • View Profile
Quote
If you select 2010 and do adjustments and then select Version 4, CameraRAW will auto convert the 2010 settings to version 4.

Makes sense.

I'm actually using Lightroom.  The camera I use has a 'Flat' Picture Mode built into it which looks very linear (RAW). I've found that if I load the RAW file into Lightroom and just switch the Profile to Camera Flat, it looks pretty much like an original RAW file to me, but I wouldn't really know otherwise.

If I'm correct, the workflow you were using with the 2010 process was purely to linearize the RAW, so you were actually seeing the original raw data, right? So I'm wondering, with the Camera Flat profile, whether any of that 2010 process bit is relevant to me anyway.

Also, if you can get the same results from the 2014 process, I'm assuming you chose 2010 because it's what you were comfortable with through experience, and you could in fact use either?

2018-07-12, 21:16:14
Reply #39

dj_buckley

  • Active Users
  • **
  • Posts: 234
    • View Profile
Just to add, I'm thinking of buying 3D LUT Creator and eliminating the need for Lightroom altogether, as I'm starting to shoot my own textures with the Macbeth chart more and more these days.

How does 3D LUT Creator deal with removing CA and lens distortion, though, or doesn't it? That seems like a critical step in the Lightroom workflow when creating textures.
« Last Edit: 2018-07-12, 21:20:24 by dj_buckley »

2018-11-28, 17:51:35
Reply #40

manuce

  • Users
  • *
  • Posts: 1
    • View Profile
I know it's an old post, but have you guys checked out this mmColorTarget tool in Nuke?

This is the best tool to calibrate all the textures and HDRIs in one go. All the cool kids are using this :)

https://www.marcomeyer-vfx.de/?p=88