Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.

Messages - Fluss

Pages: [1] 2 3 ... 25
Gallery / Re: Roman Grand Hall
« on: 2019-02-26, 11:07:22 »
Nice details here! Maybe a little bit too much glare for my taste.

General Discussion / Re: Frosted glass noise
« on: 2019-02-21, 23:26:59 »
Try Ranch Computing, it's a really good render farm.

I need help / Re: burn issue
« on: 2019-02-19, 10:52:18 »
Did you use the override reflection slot? It can produce a lot of fireflies when you have different HDRIs in global illumination and reflection slots.

This comes down to the fundamental definition of dynamic range: the ratio between the brightest and darkest areas. In your case, HDR1000 refers to the monitor's peak brightness (1000 nits). For that specification, black areas have to stay under 0.03 nits. So if you lower the peak brightness, you lower the dynamic range and fall out of the HDR1000 specification, hence the brightness lock.
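As a rough back-of-the-envelope check (the 1000 and 0.03 nit figures are the ones quoted above; converting the contrast ratio to photographic stops is just a base-2 log):

```python
import math

# DisplayHDR 1000 figures quoted above
peak_nits = 1000.0   # required peak brightness
black_nits = 0.03    # maximum black level

contrast = peak_nits / black_nits   # ~33333:1
stops = math.log2(contrast)         # ~15 stops of dynamic range
print(f"{contrast:.0f}:1 ~= {stops:.1f} stops")

# Halving the peak brightness halves the ratio and costs a full stop,
# which is why the panel locks its brightness to stay in spec.
print(f"{math.log2((peak_nits / 2) / black_nits):.1f} stops at half brightness")
```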

So would the solution be implementing colour management into Corona and Max and rendering straight to ACEScg? I didn't realise that even certified HDR displays were still adding lots of processing on top, but I guess it's obvious when I think about it. I guess it's to compensate for shoddy panels or discrepancies across a panel batch, so they all look about the same?

What do you mean here? The solution to what? If you're talking about the color discrepancies that occur during the colorspace switch, then yes, we should render straight into ACEScg to get a proper color-managed workflow. It's not that simple though, as it may introduce other caveats.

I think stuff like that will become more unified once the standards converge. And it isn't any different from people using "Vivid" mode on their displays/TVs right now.

The only people seeing the image as the creators (us) intended are... well, us :- ). Just something to live with.

For sure! It's just that this HDR thing introduced a load more of those post-processes. What's more, HDR10 is stuck with a fixed tone curve set at the start of playback, which ends up losing detail in very bright or very dark scenes. Things are going the right way though: the HDR10+ and Dolby Vision specifications introduce dynamic metadata that lets the brightness boundaries change on the fly (per scene) rather than remaining constant for the whole experience.

The thing I'm not looking forward to is the day the 30 fps standard disappears and we have to start rendering animations at much higher frame rates.

Yeah, 4k/60 would be a nightmare :- ) But the same happened to still-frame rendering: I used to do 4k renders that took 2 hours on a single quad-core with V-Ray; now that same 4k resolution easily takes 2 hours on 200 cores... Quality standards constantly grow.
Anyway, apparently half of Samsung's TVs in 2019 are 8k. And what was that ideal VR clarity? 8x4k per eye at 90 FPS?

Yeah, sadly displays are evolving faster than computer hardware. Increasing transistor density is getting harder and harder for semiconductor manufacturers. As for VR, sales don't seem to be rising that much, and without mass adoption, I guess we won't see high-density panels anytime soon. It's a shame, because we're starting to see some interesting technologies, like foveated rendering, that could make real-time ray tracing feasible on that hardware. The Varjo technology is even more enticing: a small ultra-high-density panel that follows your gaze, backed by a standard-resolution panel. They claim it to be equivalent to a 70k display.

See here:

From next year on, I presume the majority of cell phones will be enabled as well.
Basically, every smartphone with an AMOLED display is already compatible these days. The Galaxy S8 has been able to display HDR content since 2017. So this is already pretty common and widespread.

But oh boy, I believe it will be a massive revolution when it becomes widespread across all industries. Apparently right now SDR content looks terrible in HDR mode, and vice versa; how HDR content looks on SDR displays we've known for years :- ).
So if someone wanted to jump on the hype train for archviz right now, what would it look like in practice? Double the post-production?
Does anyone already do it in some form for their clients? I can imagine some real-estate people showcasing such content on top-grade TVs in their showrooms to wow the clients. HDR content on a 100" 8k TV sounds more impressive to me than nausea-inducing VR.

That's actually the whole point of ACES! Keeping scene-referred data all along the pipeline and working in a wide-gamut space so you can deliver to whatever display-referred space you need. So basically: select your display transform from a dropdown list and deliver to the intended platform. You can already (kind of) do that in any serious post-production package that supports OCIO.

The only issue is that most renderers use sRGB primaries, which can cause discrepancies during the conversion from linear sRGB to the intended working space (mostly AP1 for us; this is the space that defines the ACEScg gamut and encompasses the Rec. 2020 one). So it would be better to render straight into ACEScg from scratch. Since a renderer is almost colorspace agnostic, you should already be able to do so, except for spectra-related stuff (everything driven by Kelvin temperature), because those correspond to defined RGB triplets in the targeted colorspace (for example, 6500K/D65 is the white point for sRGB, but ACEScg has a D60 white point).
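To make the linear sRGB to ACEScg conversion concrete, here is a minimal sketch using the commonly published Bradford-adapted 3x3 matrix (values rounded to six decimals; a production pipeline would go through OCIO rather than hard-coding this):

```python
# Minimal sketch: converting a linear sRGB triplet to ACEScg (AP1).
# Matrix values are the commonly published Bradford-adapted
# sRGB (D65) -> ACEScg (D60) conversion, rounded to 6 decimals.
SRGB_TO_ACESCG = [
    [0.613097, 0.339523, 0.047379],
    [0.070194, 0.916354, 0.013452],
    [0.020616, 0.109570, 0.869815],
]

def srgb_lin_to_acescg(rgb):
    """Matrix-multiply a linear (not gamma-encoded!) sRGB triplet."""
    return tuple(sum(row[i] * rgb[i] for i in range(3)) for row in SRGB_TO_ACESCG)

# A pure sRGB red picks up non-zero green/blue in AP1: the primaries
# differ, which is exactly the source of the discrepancies mentioned above.
print(srgb_lin_to_acescg((1.0, 0.0, 0.0)))

# White stays (almost) exactly white; the D65 -> D60 chromatic
# adaptation is already baked into the matrix.
print(srgb_lin_to_acescg((1.0, 1.0, 1.0)))
```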

The real issue here is all that HDR shit, to be honest. Every manufacturer applies a whole load of post-effects to make the image "look better" without any respect for the content creator's initial vision. The only thing we should gain from that technology is the wider gamut; software-wise it should not have any impact on the dynamic range of the displayed medium. All the stuff they add on top of that is a massive pile of shit, and a lot of film producers are starting to raise their voices against those marketing trends.

Work in Progress/Tests / Re: Glass Render Test
« on: 2019-02-06, 17:42:31 »
Well, you now get material to test the upcoming Corona autocaustics feature :)

Feature requests / Re: The most wanted feature?
« on: 2019-02-06, 15:22:44 »
Please do not paint us as uncaring when day has just 24 hours and workday just 8 ;)

Nice reminder, Ondra. As one of those who is constantly pushing for new features/improvements or pointing out what's wrong, I'd like to take a moment to say that you guys are incredibly talented and that what you have managed to build since the first release of Corona is quite impressive. Numbers don't lie:

So keep up the good work!

Ondra, would you consider moving your office to Mars once humans establish a colony on the red planet? I know, 37 additional minutes per day isn't that much, but it would still be an edge over competitors :]

I laughed :)

I wonder if FStorm is only able to sample bump so well because the video used its own noise shader? It might be that this trick won't work with a regular noise or bitmap texture.

I wondered this too. At the same time though, someone told me they integrated this?

The benefits go way beyond anisotropy alone. It simplifies shader creation and gives super accurate results regardless of resolution. It would be super nice to have that.

Juraj, I finally took the time to dive a bit deeper into the paper you linked here, and damn, yeah! That's really cool!

Now we have a prototype in Corona too ;)

Well, that was fast! :)

Work in Progress/Tests / Re: dubcats secret little hideout
« on: 2019-01-31, 15:42:13 »
From the screenshots it looks like you didn't plug the high-passed linear roughness map into the mask slot? Hard to tell since the screenshot is clipped off.

It was plugged in, that's a gamma issue for sure, I'll prepare you an explanation of what's going on.

FStorm still does volume displacement, and it's fuc***g awesome. Ever wondered why FStorm users can use displacement on every single plant in their interiors, and why the displacement only takes like 400 MB instead of 200 gazillion terabytes? This is why.

Yep, FStorm displacement is great (the only drawback is that it does not handle procedurals)! We discussed that in this thread:

Gallery / Re: Pictures from Corona Land #1 - Winter scene
« on: 2019-01-30, 15:50:43 »
Nice work man, looking forward to seeing more too!

Work in Progress/Tests / Re: dubcats secret little hideout
« on: 2019-01-25, 02:02:26 »
Hey dubcat,

Looks like the exposure inconsistency is related to the gamma 2.2 part of the IOR mask generation. Why did you choose to perform those operations in gamma 2.2 and then convert back to gamma 1.0 afterward? To my knowledge, playing with gamma while performing arithmetic in a node tree that is supposed to be plugged into a linear input always ends up badly, especially with data as sensitive as IOR.
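To show why this bites, here is a generic illustration (not dubcat's exact node graph) of how averaging two values in gamma 2.2 and then linearizing gives a different result than averaging them in linear space, which is what a linear input expects:

```python
# Generic illustration: arithmetic done in gamma 2.2 then converted
# back to linear does NOT match the same arithmetic done in linear.
def to_gamma(x, g=2.2):
    return x ** (1.0 / g)

def to_linear(x, g=2.2):
    return x ** g

a, b = 0.2, 0.8  # two linear values, e.g. IOR-derived mask values

# 50/50 mix performed in linear space (what a linear input expects)
linear_mix = 0.5 * a + 0.5 * b  # 0.5

# Same 50/50 mix performed in gamma 2.2, then linearized afterward
gamma_mix = to_linear(0.5 * to_gamma(a) + 0.5 * to_gamma(b))  # ~0.445

print(linear_mix, gamma_mix)  # the two results disagree
```

The mismatch grows with the spread between the mixed values, so a map as sensitive as an IOR mask drifts visibly.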

Work in Progress/Tests / Re: dubcats secret little hideout
« on: 2019-01-24, 16:25:30 »
For people who are reading this later, feel free to simplify the material with pure float values as Fluss mentioned.

There is a really strange behavior I cannot explain. After further testing, it looks like it's the 1/IOR part that is not working here, at least in the render result, since the material preview and the rendered material do not behave the same. Here are some examples demonstrating the issue:

No IOR map plugged in (IOR 1.5 in the material):

1.5 IOR linear float Corona color plugged in, works as intended:

Mixing 1.1 to 1.8 linear float using your node graph: the material preview is fucked up but the render looks fine to me:

Mixing using the 1/IOR method: the material preview looks fine but the render does not:

I have the feeling there is some weird gamma-related stuff behind that.

Regarding the glossiness darkening and the plasticky bump feeling, I'm not sure we are talking about the same thing (see my previous posts).

Feature requests / Re: Cloth BRDF
« on: 2019-01-23, 18:37:51 »
Wouldn't it be better to put the effort into GeoPattern instead, which has way more uses?

If I had to choose between GeoPattern and this tech to be implemented first, I'd definitely go for GeoPattern because it is indeed way more versatile! It is not limited to fabrics, so the cloth shader is not the priority, TBH.

But GeoPattern will still require much more effort than this tech, just because you have to model the pattern by hand. What's more, GeoPattern is useful for small repetitive cloth patches, but as soon as the yarn pattern is not consistent across the full mesh, it's another story. The real benefit of this is its procedural nature: you can reproduce patterns in no time.

Here is a small example (patterns don't match between examples, but anyway, you get the idea):

Build the pattern with the pattern editor:

And you're done! This pattern will be used by the shader to build the yarn structure:

Then, each yarn is replaced by a fully simulated fiber model with both dense fibers and flyaway fibers (the latter create the fuzziness). And you get control over a few other parameters:

That would be the ultimate cloth solution, for a part of CGI that has been tedious for a long, long time.

Pages: [1] 2 3 ... 25