Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.


Messages - Fluss

1
Yeah, definitely not the same process. Here, it is as simple as putting a map in the shader, activating the feature, and hitting render. This would be insanely useful with maps such as Megascans, or almost any scanned texture: they look great but are highly repetitive when tiled.

2
Stumbled upon this article: https://blogs.unity3d.com/2019/02/14/procedural-stochastic-texturing-in-unity/

It allows tiling stochastic or near-stochastic textures with no visible repetition. It would be insanely useful to have such a tool in Corona, especially for large surface areas such as roads, concrete walls, etc.

Here is the full paper: https://drive.google.com/file/d/1QecekuuyWgw68HU9tg6ENfrCTCVIjm6l/view
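
For anyone curious how it works under the hood: the input texture is first "Gaussianized" (a histogram transfer), then three nearby tile samples are blended with a variance-preserving operator so the result doesn't look washed out. A rough NumPy sketch of that blending step (names and structure are mine, not the Unity code):

```python
# Variance-preserving blend at the heart of the Heitz & Neyret technique
# the Unity post builds on. Real implementations also Gaussianize the
# texture first and invert the transfer afterwards; this is only the blend.
import numpy as np

def variance_preserving_blend(s1, s2, s3, w):
    """Blend three Gaussianized texture samples so the output keeps the
    input's mean and variance (plain linear blending would flatten contrast).
    s1, s2, s3: arrays of Gaussianized texel values; w: 3 weights summing to 1."""
    mean = 0.5  # mean of the Gaussianized texture, by construction
    linear = w[0] * s1 + w[1] * s2 + w[2] * s3
    # Dividing by the L2 norm of the weights restores the original variance.
    return (linear - mean) / np.sqrt(w[0]**2 + w[1]**2 + w[2]**2) + mean
```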

3
Work in Progress/Tests / Re: dubcats secret little hideout
« on: 2019-03-26, 14:27:32 »
I think this is the best place for it. This is one of the best syntheses of what should characterize an uber shader I've seen so far. Enjoy!

https://autodesk.github.io/standard-surface/

4
Gallery / Re: Roman Grand Hall
« on: 2019-02-26, 11:07:22 »
Nice details here! Maybe a little bit too much glare for my taste.

5
General Discussion / Re: Frosted glass noise
« on: 2019-02-21, 23:26:59 »
Try Ranch Computing, a really good render farm.

6
I need help / Re: burn issue
« on: 2019-02-19, 10:52:18 »
Did you use the override reflection slot? It can produce a lot of fireflies when you have different HDRIs in global illumination and reflection slots.

7
This comes down to the very definition of dynamic range: the ratio between the brightest and darkest areas. In your case, HDR1000 refers to the peak brightness of the monitor (1000 nits), and the specification requires black areas to stay under 0.03 nits. So if you lower the peak brightness, you lower the dynamic range and fall out of the HDR1000 specification, hence the brightness lock.
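
To put numbers on it (a quick back-of-the-envelope sketch in Python, not anything from the spec itself):

```python
import math

peak_nits = 1000.0   # HDR1000 peak brightness
black_nits = 0.03    # maximum allowed black level
ratio = peak_nits / black_nits   # ~33,333:1 contrast
stops = math.log2(ratio)         # ~15 stops of dynamic range
print(f"{ratio:,.0f}:1 contrast, {stops:.1f} stops")
```

Halve the peak to 500 nits and the ratio drops to ~16,667:1, one full stop less, which is exactly why the spec pins the brightness.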

8
So would the solution be implementing colour management into Corona and Max and rendering straight to ACEScg? I didn't realise that even on certified HDR displays they were still adding lots of processing on top, but I guess it's obvious when I think about it. I guess it's to compensate for shit panels, or discrepancies across a panel batch, so they all look about the same?

What do you mean here? The solution to what? If you're talking about the color discrepancies that occur during the colorspace switch, then yes, we should render straight into ACEScg to get a proper color-managed workflow. But it is not that simple though, as it may introduce other caveats.

I think stuff like that will become more unified once the standards start converging with each other along the way. And it isn't any different from people using "Vivid" mode on their display/TV right now.

The only person seeing the image as the creators (us) intended...is well, us :- ). Just something to live with.

For sure! It is just that this HDR thingy introduced a load more of those post-processes. What's more, HDR10 is stuck with a fixed curve set at the beginning of media playback, which ends up losing detail in very bright or dark scenes. Things are going in a good direction though: the HDR10+ and Dolby Vision specifications introduce dynamic metadata, allowing the brightness boundaries to change on the fly (per scene) rather than remaining constant for the whole experience.
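
A toy illustration of why that per-scene metadata matters (my own simplified sketch, not the actual HDR10+/Dolby Vision math):

```python
import numpy as np

def tone_map(nits, peak):
    """Simple Reinhard-style curve scaled to a given peak (toy model only)."""
    x = nits / peak
    return x / (1.0 + x)

night_scene = np.array([0.05, 0.5, 5.0])  # nits in a dark scene
# Static metadata: one peak for the whole film crushes the dark scene...
crushed = tone_map(night_scene, peak=4000.0)
# ...while a per-scene peak (dynamic metadata) keeps the shadow detail.
adapted = tone_map(night_scene, peak=night_scene.max())
```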

The thing I'm not looking forward to is the day the 30 fps standard disappears and we have to start rendering animations at much higher frame rates.

Yeah, 4k/60 would be a nightmare :- ) But the same happened to still-frame rendering: I used to do renders that took 2 hours on a single quad-core with V-Ray at 4k resolution; now that same 4k resolution easily takes 2 hours on 200 cores... Quality standards constantly grow.
Anyway, apparently half of Samsung's TVs in 2019 are 8k. And what was that ideal VR clarity? 8x4k per eye at 90 FPS?

Yeah, sadly displays are evolving faster than computer hardware; high transistor density is becoming tedious for semiconductor manufacturers. As for VR, sales do not seem to be rising that much, and without mass adoption I guess we won't see high-density panels anytime soon. It's sad, because we are starting to see some interesting technologies, like foveated rendering, that would make real-time rendering feasible on that hardware. The Varjo technology is even more enticing: a small, ultra-high-density panel that follows your sight, backed by a standard-resolution panel. They claim it is equivalent to a 70k display.

See here: https://varjo.com/bionic-display/

9
From next year on, I presume the majority of cell phones will be enabled as well.
Basically, every smartphone with an AMOLED display is already compatible these days; the Galaxy S8 has been able to display HDR content since 2017. So this is already pretty common and widespread.

But oh boy, I believe it will be a massive revolution when it becomes widespread across all industries. Apparently right now SDR content looks terrible in HDR mode, and vice versa; how HDR content looks on an SDR display we've known for years :- ).
So if someone wanted to jump on the hype train for archviz right now, what would it look like in practice? Double the post-production?
Does anyone already do it in some form for their clients? I can imagine some real-estate people showcasing such content on top-grade TVs in their showrooms to wow the clients. HDR content on a 100" 8k TV sounds more impressive to me than nausea-inducing VR.

That's actually the whole point of ACES! Keeping scene-referred data all along the pipeline and working in a wide-gamut space so you can deliver to whatever display-referred space you need. So basically: select your display transform in a dropdown list and deliver to the intended platform. You can already (kind of) do that in any serious post-production package that supports OCIO.

The only issue is that most renderers use sRGB primaries, which can cause some discrepancies during the conversion from linear sRGB to the intended working space (mostly AP1 for us; this is the space that defines the ACEScg gamut and that encompasses the REC.2020 one). So it would be better to render straight into ACEScg from scratch. As a renderer is almost colorspace-agnostic, you should already be able to do so, except for spectra-related stuff (everything driven by Kelvin temperature), because those correspond to defined RGB triplets in the targeted colorspace (D65/6500 K is the white point for sRGB, for example, whereas ACEScg uses D60).
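
For reference, that linear sRGB to ACEScg conversion boils down to a 3x3 matrix. A minimal NumPy sketch, using the commonly published Bradford-adapted values (rounded here; in practice you'd let OCIO or your DCC handle this):

```python
import numpy as np

# Linear sRGB (D65) -> ACEScg (AP1, D60), Bradford chromatic adaptation.
SRGB_TO_ACESCG = np.array([
    [0.6131, 0.3395, 0.0474],
    [0.0702, 0.9164, 0.0134],
    [0.0206, 0.1096, 0.8698],
])

def srgb_linear_to_acescg(rgb):
    """rgb: linear (not gamma-encoded) sRGB triplet or (N, 3) array."""
    return np.asarray(rgb) @ SRGB_TO_ACESCG.T
```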

The real issue here is all that HDR shit, to be honest. Every manufacturer applies a whole load of post-effects to make the image "look better" without any respect for the initial vision of the content creator. The only thing we should take from that technology is the wider gamut; it should not have any impact on the dynamic range of the displayed medium (software-wise). All the stuff they add on top of that is a massive pile of shit, and a lot of film producers are starting to raise their voices against those marketing trends.

10
Work in Progress/Tests / Re: Glass Render Test
« on: 2019-02-06, 17:42:31 »
Well, you now have material to test the upcoming Corona autocaustics feature :)

11
Feature requests / Re: The most wanted feature?
« on: 2019-02-06, 15:22:44 »
Please do not paint us as uncaring when the day has just 24 hours and the workday just 8 ;)

Nice reminder, Ondra. As one of those who are constantly pushing for new features/improvements or pointing out what's wrong, I'd like to take some time to say that you guys are incredibly talented, and that what you have managed to build since the first release of Corona is quite impressive. Numbers don't lie: http://www.cgarchitect.com/2018/02/2018-architectural-visualization-rendering-engine-survey

So keep up the good work!

Ondra, would you consider moving your office to Mars once humans establish a colony on the red planet? I know, 37 additional minutes per day isn't that much, but it would still be an edge over competitors :]

I laughed :)

12
I wonder if FStorm is able to sample bump so well only because its own noise shader was used in the video? It might be that this trick won't work with a regular noise or bitmap texture.

I wondered this too. At the same time though, someone told me they integrated this: http://www.cs.ucsb.edu/~lingqi/publications/paper_glints3.pdf

The benefits of that go way beyond anisotropy alone. It simplifies shader creation and gives super accurate results regardless of resolution. Would be super nice to have that.

Juraj, I finally took the time to dig a bit deeper into the paper you linked here, and damn, yeah! That's really cool!

13
now we have a prototype in Corona too ;)

Well, that was fast! :)

14
Work in Progress/Tests / Re: dubcats secret little hideout
« on: 2019-01-31, 15:42:13 »
From the screenshots, it looks like you didn't plug the high-passed linear roughness map into the mask slot? Hard to tell since the screenshot is clipped off.

It was plugged in; that's a gamma issue for sure. I'll prepare an explanation of what's going on.

FStorm still does volume displacement, and it's fuc***g awesome: https://fstormrender.ru/manual/displacement. Ever wondered why FStorm users can use displacement on every single plant in their interiors, and why the displacement only takes like 400 MB instead of 200 gazillion terabytes? This is why.

Yep, FStorm displacement is great (the only drawback is that it does not handle procedurals)! We discussed that in this thread: https://corona-renderer.com/forum/index.php?topic=22625.0

15
Gallery / Re: Pictures from Corona Land #1 - Winter scene
« on: 2019-01-30, 15:50:43 »
Nice work, man, looking forward to seeing more too!
