Author Topic: Unreal Engine 4 for ArchViz - Thoughts?  (Read 160181 times)

2016-07-08, 23:14:11
Reply #285

nkilar

  • Active Users
  • **
  • Posts: 821
    • View Profile
    • My personal website
Yup, the 16GB Titan-P is going to be pretty revolutionary.

1 or 2 years, and a Titan-Volta (or whatever) with 32GB of HBM memory is going to absolutely kill CPU :- ). I am honestly surprised by this evolution. Mainstream 300 euro cards have 8GB of memory, glorious days.

You should buy the card nonetheless for Unreal :- ) I know I will.

If you are in the States, you can get that 300 euro mark for an 8GB gfx card down to $250 :P With CUDA reportedly getting ported to AMD's GCN (at least for Octane), that would make AMD's Vega super interesting too, especially since AMD isn't so stingy with VRAM.

What might also be interesting to people on a budget is a comparison of what a $1000 CPU and a $1000 GPU each give you in terms of rendering performance. Interesting stuff indeed!

2016-07-08, 23:17:20
Reply #286

Juraj Talcik

  • Active Users
  • **
  • Posts: 3660
  • Tinkering away
    • View Profile
    • studio website
Well I live in Europe, so dreaming about US prices without tax is quite pointless to me :- )

That comparison is quite interesting to me, something I want to find out with Redshift.
talcikdemovicova.com  Website and blog
be.net/jurajtalcik   Our studio Behance portfolio
Instagram   Our studio Instagram, managed by Veronika

2016-07-08, 23:32:30
Reply #287

nkilar

  • Active Users
  • **
  • Posts: 821
    • View Profile
    • My personal website
Let me know how it works out, please :) It's fun to think about these things...

It will also be interesting to see what happens when GPUs hit the process-shrink wall like CPUs did. Architecture optimization, maybe?

edit: I feel ya about the prices though, I'm from Europe too.

2016-07-09, 06:43:44
Reply #288

philippelamoureux

  • Active Users
  • **
  • Posts: 218
    • View Profile
I've seen Nvidia announce the GTX 1060 at $250 with the same performance as my $800 GTX 980 (1 year old). I want to cry. But yeah, it's a good thing for everybody.

2016-07-09, 10:07:54
Reply #289

Benny

  • Active Users
  • **
  • Posts: 176
    • View Profile
1 or 2 years, and a Titan-Volta (or whatever) with 32GB of HBM memory is going to absolutely kill CPU :- ).

So in 2 years, using Corona will potentially leave me at a serious disadvantage?

I love Corona, but I'm an impatient man and interactivity is critical to me. If the Corona team is saying that it will never happen, that's quite frankly worrying.

2016-07-09, 12:46:08
Reply #290

Juraj Talcik

  • Active Users
  • **
  • Posts: 3660
  • Tinkering away
    • View Profile
    • studio website
I am currently playing with 3 GPU renderers... and only one of them is fast.

Using a CPU renderer does not put you at a disadvantage; the Corona devs believe they can speed up the engine by methods other than simply switching to GPU. Arnold and Vray think the same. (Yes, Vray has VrayRtGPU; you can safely ignore it.)
I also don't think they will be as fundamental about it forever :- ) You can have ideals, and then there is reality.

2016-07-09, 21:04:29
Reply #291

Benny

  • Active Users
  • **
  • Posts: 176
    • View Profile
Thanks for the wisdom, albeit a bit cryptic.   :)

I realize this is a Corona forum, but you can't tease like that: which one is the fast one?

The only one I have actual experience with is Vray GPU, but my card's memory can't hold most of my scenes, so I never really bothered with it, although it may be more optimized nowadays. Why should it be ignored? Are the results still too different from the regular engine?

But back to our favorite renderer, wasn't there talk a while ago about some collaborative project together with AMD?

2016-07-09, 22:59:59
Reply #292

philippelamoureux

  • Active Users
  • **
  • Posts: 218
    • View Profile
If there were more online GPU render farms, that would help too. Right now, if you don't have your own and you have a movie to make, you are pretty much screwed!!!

2016-07-09, 23:30:31
Reply #293

Juraj Talcik

  • Active Users
  • **
  • Posts: 3660
  • Tinkering away
    • View Profile
    • studio website
Having a GPU render farm defeats the whole "GPUs are faster" concept, no? If I am still at the mercy of a render farm, I can just stay with CPUs forever :- ). The farms won't have more VRAM, and they won't be cheaper.


Quote
Thanks for the wisdom, albeit a bit cryptic.   :)

I realize this is a Corona forum, but you can't tease like that, which is the fast one?

Haha, I didn't mean to be cryptic :- ) I just didn't consider it important for the argument to name them. I already wrote here that I am testing Redshift (I don't plan on switching, but it's the first GPU engine that really interests me).

2016-07-11, 17:09:08
Reply #294

Benny

  • Active Users
  • **
  • Posts: 176
    • View Profile
Having a GPU render farm defeats the whole "GPUs are faster" concept, no? If I am still at the mercy of a render farm, I can just stay with CPUs forever :- ). The farms won't have more VRAM, and they won't be cheaper.

Well, I guess the core point is whether GPU will be exponentially faster long term. I don't know of any CPU-based render farm that can give me a close-to-real-time experience, but a future GPU-based one might, and since real-time rendering is the holy grail of the industry, such a render farm would be worth pursuing if it arrives here faster.

Having said that, I must confess I have no idea how CPU and GPU performance correlate. How many CUDA cores would be comparable to a 5960X? Is it even possible to make such a comparison (putting software aside for a moment)? Vray RT owners must have tried this. If the Titan P has 6,000 cores, next year's Titan 12,000, then 20,000 and so on, can CPUs make a dent even if they also double their core counts over the years?

2016-07-11, 17:22:28
Reply #295

Juraj Talcik

  • Active Users
  • **
  • Posts: 3660
  • Tinkering away
    • View Profile
    • studio website
It's not possible to make such a comparison in general terms. GPU "cores" are in no way functionally similar to CPU cores, so they can't be described as a fraction of one.

It can only be done at the software level with comparably similar sampling, so CPU vs GPU path-tracing. But even then it isn't that revealing about GPU performance on its own.

The scaling concept applies well, though: GPUs have quite an easy time multiplying their performance per generation because of their super-specialization. CPUs can't do that, even if Intel brought something revolutionary to the table.
As long as the GPU's limitations (you still can't port everything to it) keep falling, it might be the eventual winner. Especially when the 32GB models start hitting the mainstream.

Do note that the Titan-P will be 1400 euros +/- at minimum imho, if it's really so much more powerful than the 1080. 2-3 of them, which you will need for a really speedy GPU renderer, still makes for a very expensive computer for the average user. An average user with a single quad-core is not going to decrease his render times radically with a single GTX 1070. Absolutely not with the two GPU path-tracers out there :- )
But for a power user, buying 7 Titans will be far better value/performance than a Xeon farm (if we ignore the existence of "ES" models ;- ) ... )

But with Redshift I want to make that layman's comparison anyway :- ) I have pretty much every CPU imaginable at home, and quite a few interesting GPUs (Titan-X, and I will buy the Titan-P as soon as it's on the market). So I will come up with some kind of metric to do a comparison. I fear it will start a shitstorm, but it's a question everyone is asking themselves anyway :- )
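A metric like the one described could be as simple as timing each device to the same noise target on one test scene and normalising by price. A minimal sketch of that idea follows; every device name, price, and timing in it is an invented placeholder, not a measurement:

```python
# Back-of-the-envelope CPU vs GPU comparison: time each device to the
# same noise target on one test scene, then normalise by price.
# Every name, price, and timing below is an invented placeholder.

def renders_per_hour(seconds_to_target):
    """How many times per hour the device hits the noise target."""
    return 3600.0 / seconds_to_target

def renders_per_hour_per_keur(price_eur, seconds_to_target):
    """Throughput normalised to 1000 EUR of hardware."""
    return renders_per_hour(seconds_to_target) / (price_eur / 1000.0)

devices = [
    # (name, price in EUR, seconds to reach the noise target)
    ("hypothetical CPU", 1000, 600.0),
    ("hypothetical GPU", 1200, 420.0),
]

for name, price, secs in devices:
    print(f"{name}: {renders_per_hour(secs):.2f} renders/h, "
          f"{renders_per_hour_per_keur(price, secs):.2f} per 1000 EUR")
```

Time-to-equal-noise is about the fairest cross-engine number available, since "samples" mean different things in different renderers.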

2016-07-11, 18:18:51
Reply #296

Ondra

  • Administrator
  • Active Users
  • *****
  • Posts: 8900
  • Turning coffee to features since 2009
    • View Profile
Well, I guess the core point is whether GPU will be exponentially faster long term. I don't know of any CPU-based render farm that can give me a close-to-real-time experience, but a future GPU-based one might, and since real-time rendering is the holy grail of the industry, such a render farm would be worth pursuing if it arrives here faster.

I would like to have your optimism... I mean, 10 years ago I heard exactly the same story: GPUs are not yet all-powerful, but soon they will be and will allow you to render stuff you never even thought possible... just one or two generations... just one or two problems to fix :D


Generally it is impossible to compare a GPU core to a CPU core, but you can compare the overall practical performance, which is roughly similar per-$ and per-Watt.
Rendering is magic.
Private scene uploader | How to get minidumps for crashed/frozen 3ds Max | Sorry for short replies, brief responses = more time to develop Corona ;)

2016-07-12, 06:10:31
Reply #297

philippelamoureux

  • Active Users
  • **
  • Posts: 218
    • View Profile
The render time isn't a huge deal. 10 mins on CPU or 7 mins on GPU, or vice versa... we have all the power in the world one click away with render farms, for now.

What I want is a real-time viewport experience, as smooth as Unreal/Unity.
Corona interactive is good, but I don't really like 3ds Max; I find it sluggish. It's one of the few pieces of software that feels heavy and buggy.

2016-07-12, 07:05:33
Reply #298

Benny

  • Active Users
  • **
  • Posts: 176
    • View Profile
I would like to have your optimism... I mean, 10 years ago I heard exactly the same story: GPUs are not yet all-powerful, but soon they will be and will allow you to render stuff you never even thought possible... just one or two generations... just one or two problems to fix :D

That's all very true. I'm old enough to remember a company called Intergraph, which was trying to take on SGI at the time with Windows NT machines, boasting about their real-time rendering graphics system. They were showing impressive things back then, which of course would look hopelessly simple by today's standards.

Still, if different product strategies are viable but on different development trajectories, and one is potentially a better long-term solution, their performance curves will eventually cross; after that point it is no longer one or two generations away, it has happened. And I agree with Philippe: it is not so much about the final render as it is about the interactivity of getting there.

I'm not familiar with Redshift, but if a serious player comes out that is similar to Corona but focused on GPU instead of CPU, it could become a very successful product. I really love Corona and their desire to remove all clutter from the renderer, realizing that not everyone likes to tinker with render settings. (Having come to the point I'm at today together with, and because of, Vray, I actually feel disloyal saying that, as it is clearly a comment directed at them, but I just can't do it anymore. Vlado and his user base are either just too smart for me, or they are too interested in render technology for its own sake. Quick Settings aside, I long ago gave up on them cleaning up and automating the interface, and every suggestion to make it more elegant, like a material preview shader, is apparently a waste of time that no one would use anyway.)

Philippe, I actually find Max both fast and capable; what would be the alternative? If one comes along, I'm sure creative forces like Juraj, Guthrie, Bekerman, Bernard and the other usual suspects will jump ship and we'll all follow along, just like we did with Corona.  :D

2016-07-12, 17:49:40
Reply #299

Juraj Talcik

  • Active Users
  • **
  • Posts: 3660
  • Tinkering away
    • View Profile
    • studio website
A WYSIWYG workflow like real-time engines offer is pretty amazing, but it isn't a fair comparison. People use raytracers for their absolute visual quality, so there will always be some trade-off. I can live with many limitations as long as the final result is worth it.

Unreal will be Unreal, and raytracers will be raytracers for quite some time.

Redshift is pretty ingenious in many respects, but honestly it's not a competitor for Corona; it will compete primarily against Arnold (and Vray I guess, if it survives long enough :- ) ... ).
It has way more options than Vray 3.4SP :- ), which, btw, can already be used quite similarly to Corona; they have streamlined the workflow. The fact that the "hardcore" community users stick to their "verified" methods dating back to 1.5 isn't Vlado's fault.