Author Topic: Unreal Engine 4 for ArchViz - Thoughts?  (Read 162983 times)

2016-04-08, 10:01:11
Reply #240

rambambulli

  • Active Users
  • **
  • Posts: 160
    • View Profile

2016-04-08, 10:42:53
Reply #241

philippelamoureux

  • Active Users
  • **
  • Posts: 218
    • View Profile
Their presentation was weird. They said it was real-time, but all they showed was from a static POV. At this point it just seemed like a normal 360 stereo panoramic image to me.

''I got the chance to experience a demonstration of Iray VR Lite on an HTC Vive, and in this demo Nvidia had a handful of pre-defined head positions. I could look around, and in the distance there would be a glowing green orb. Using the controller, I could aim at it and trigger a teleport. With that, I was able to maneuver through the upcoming headquarters, Endeavor. It was quite an impressive demo, but lacking the ability to move my head around did feel, well, lacking.'' -Niels Broekhuijsen

''The demo was run on Nvidia’s VCA (Visual Computing Appliance) with 32 nodes in Santa Clara, with the output transmitted to the convention center through a very fast Internet connection''

http://www.tomshardware.com/news/nvidia-iray-vr-gtc-2016,31573.html

That all sounds very MEH to me. It requires overly expensive hardware.
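The "pre-defined head positions plus teleport" interaction described in the quote above could be sketched roughly like this. This is purely illustrative; the viewpoint names and coordinates are invented, not anything from Nvidia's demo:

```python
# Hypothetical sketch of the Iray VR Lite interaction: the scene has a
# handful of authored viewpoints, and aiming at an orb teleports you to
# the nearest one.  All coordinates here are made up for illustration.
import math

# Authored head positions (x, y, z) in metres, y = eye height
VIEWPOINTS = [(0.0, 1.6, 0.0), (4.0, 1.6, 0.0), (4.0, 1.6, 5.0)]

def nearest_viewpoint(target, viewpoints=VIEWPOINTS):
    """Return the authored viewpoint closest to the aimed-at target."""
    return min(viewpoints, key=lambda p: math.dist(p, target))

# Aiming near the second orb snaps the viewer to that position
print(nearest_viewpoint((3.8, 1.5, 0.2)))
```

Since rendering is only valid at the authored positions, head translation between them is impossible, which matches the "lacking the ability to move my head around" complaint.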

2016-04-08, 10:46:54
Reply #242

Juraj Talcik

  • Active Users
  • **
  • Posts: 3699
  • Tinkering away
    • View Profile
    • studio website
Quote
Given that a typical Iray render takes seconds or even minutes to calculate, even on a multi-GPU machine, whereas VR headsets have a refresh rate of 90fps, Iray VR isn’t a real-time raytracing solution.

Quote
Given that a typical Iray render takes SECONDS

:-D


Anyway, so it's rendering hundreds of 360s which somehow blend using a depth buffer. Well, that must truly be "impressive". Seems like another completely useless gimmick.
talcikdemovicova.com  Website and blog
be.net/jurajtalcik   Our studio Behance portfolio
Instagram   Our studio Instagram, managed by Veronika

2016-04-08, 11:00:31
Reply #243

philippelamoureux

  • Active Users
  • **
  • Posts: 218
    • View Profile

2016-04-08, 11:22:11
Reply #244

DeadClown

  • Global Moderator
  • Active Users
  • ****
  • Posts: 1439
    • View Profile
    • racoon-artworks
Quote
Anyway, so it's rendering hundreds of 360s which somehow blend using a depth buffer. Well, that must truly be "impressive". Seems like another completely useless gimmick.
Sigh... VR is currently one of those bullshit-bingo-loaded topics in the industry. Everyone is trying to present themselves as "world-leading" in producing "immersive" demos that show the "future" of VR, no matter how impractical or far it is from production reality. Most of the stuff has been around for years or decades and was as unusable then as it is now. Lightfields, for example: I'm still waiting for OTOY to release the incredible "changing-everything1!!11!" technology they were (not) showing for months. A calculator and some primary-school maths is enough to show that there are serious limitations in this approach (which is why this tech has existed for over a decade and no one has been using it for serious stuff).

It's a bit like deep compositing: everyone was thrilled by the possibilities, but even big production companies seem to be abandoning it in some cases because it's impractical.
In this case it's just nvidia marketing crap, nothing more.
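The "calculator and some primary school maths" point about lightfields can be made concrete with a naive storage estimate: one pre-rendered panorama per sampled head position. All the figures below (grid density, resolution) are illustrative assumptions, not anyone's actual numbers:

```python
# Naive light-field-style storage estimate: one uncompressed 360 panorama
# for every sampled head position in a small viewing volume.
def lightfield_bytes(grid: int, width: int, height: int, bpp: int = 3) -> int:
    """Uncompressed size of a grid x grid x grid set of panoramas."""
    return grid ** 3 * width * height * bpp

# 20 samples per axis inside a room, 4096x2048 RGB equirectangular images
size = lightfield_bytes(grid=20, width=4096, height=2048)
print(f"{size / 2**30:.1f} GiB uncompressed")   # 187.5 GiB
```

Even this coarse sampling of a single room lands in the hundreds of gigabytes before compression, which is the practical limitation being alluded to.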
Any sufficiently advanced bug is indistinguishable from a feature.

2016-04-08, 11:42:44
Reply #245

Juraj Talcik

  • Active Users
  • **
  • Posts: 3699
  • Tinkering away
    • View Profile
    • studio website
nVidia does it routinely though. They do have some nice public tech, especially GameWorks, but the other half is just jumping on any train for, as you write, marketing purposes. But I am surprised it gets so much traction nonetheless.
talcikdemovicova.com  Website and blog
be.net/jurajtalcik   Our studio Behance portfolio
Instagram   Our studio Instagram, managed by Veronika

2016-04-08, 13:12:54
Reply #246

Ondra

  • Administrator
  • Active Users
  • *****
  • Posts: 8904
  • Turning coffee to features since 2009
    • View Profile
Quote
Given that a typical Iray render takes SECONDS


Seriously, this is how you lie like a professional. NVidia marketing folks are experts at this. This is the same concept as GPU rendering. Do not try to persuade anyone about the lie. Just state it as a footnote, btw-remark in something completely unrelated, over and over again. Be confident, talk about it as a widely established fact. Do not try to convince anybody about the lie - just do not let them even doubt it! Bravo.
Rendering is magic.
Private scene uploader | How to get minidumps for crashed/frozen 3ds Max | Sorry for short replies, brief responses = more time to develop Corona ;)

2016-04-08, 13:26:56
Reply #247

rambambulli

  • Active Users
  • **
  • Posts: 160
    • View Profile
Quote
That all sounds very MEH to me

+1

Sorry for mentioning it.

2016-04-08, 13:29:12
Reply #248

philippelamoureux

  • Active Users
  • **
  • Posts: 218
    • View Profile
Does that mean that this was a lie too? ;-P

SECONDS!!!

« Last Edit: 2016-04-08, 13:35:41 by philippelamoureux »

2016-04-08, 15:28:16
Reply #249

maru

  • Corona Team
  • Active Users
  • ****
  • Posts: 9225
  • Marcin
    • View Profile

2016-04-08, 16:34:41
Reply #250

romullus

  • Global Moderator
  • Active Users
  • ****
  • Posts: 6015
  • Let's move this topic, shall we?
    • View Profile
    • My Models
Don't forget to order a bunch of these! :]

I'm not Corona Team member. Everything i say, is my personal opinion only.

2016-05-05, 10:34:31
Reply #251

melviso

  • Active Users
  • **
  • Posts: 390
    • View Profile
    • Portfolio
@philippelamoureux

Been really busy. How is the ue4 project coming along? Any updates?


Currently trying out Lumberyard in my free time. Just scratching the surface atm. I am really digging the realtime GI and how easily you can set up Time of Day. Unfortunately their material editor is quite limited: no nodes, so all texturing will have to be done externally. I am still trying to test the realtime reflections. The engine seems quite straightforward.

Will still be experimenting with ue4 as well (finally downloaded it).

2016-06-03, 12:11:50
Reply #252

melviso

  • Active Users
  • **
  • Posts: 390
    • View Profile
    • Portfolio
Someone on the Unreal Engine forums is doing some tests with UE4:


More info here: https://forums.unrealengine.com/showthread.php?107852-Environment-WIP-Forever

I have tried out Amazon Lumberyard in my spare time, and while it has realtime GI (though no indirect shadows), it isn't the best to use due to its limited material editor and the file, texture and animation requirement hurdles for import. The documentation is not really fleshed out either. Another thing I have noticed is that a lot of people are not using it, which might not be best, as a community is important for an engine to thrive. The devs have, however, been really active. I am not sure CryEngine is getting a lot of use since they changed their payment model.
Unity has a new payment model in place which is really upsetting its userbase.

UE4, on the other hand, is a beast, and Lightmass seems to be doing quite well. Its realtime GI (VXGI) is still in development. UE4 is the "it" engine right now, and the ease of importing 3D models, animations and textures is just dope. Let's all hope Epic doesn't change its payment model, though there is something about an Unreal Engine Enterprise.


2016-06-03, 15:06:00
Reply #253

cecofuli

  • Active Users
  • **
  • Posts: 1491
    • View Profile
the grass is amazing (on the UE4 forum, not in this image)

2016-06-03, 15:31:25
Reply #254

maru

  • Corona Team
  • Active Users
  • ****
  • Posts: 9225
  • Marcin
    • View Profile
I'd love to run this in real-time (even though it would probably be 5fps ;) ).

Do you guys know of any nice free UE demos? I only know of "Winter Chalet" (yes, I did find the snowmobile).