Author Topic: DR nodes constant updates but not rendering passes  (Read 142 times)

2019-02-02, 13:23:25

ant_march

  • Active Users
  • **
  • Posts: 7
    • View Profile
I'd like to understand what's happening with Corona network rendering and Corona RAM usage, as it seems my render nodes do not have enough RAM. I've set up network rendering using my workstation and 3 render nodes. Each of my nodes has 16 GB of RAM. It works: I have successfully sent jobs across DR, and the nodes join in and render passes.

Below is last night's 4K render, using DR. Usual type of scene: XRefs, proxies, Forest Pack, downloaded models, 4K textures, etc.



I'm not sure what happens on my larger scenes, and I'm not sure why it says I'm running out of RAM. As you can see, it says the nodes updated but did not render any passes. Does this mean they still contributed? I limit the render to 100 passes, 6 hours, and a noise level of 3.5. The log showed it reached something like 65/100, with no nodes rendering passes, and it took a little under 4 hours to finish. I've pasted in the log from one of the render nodes:
4.1 GB used by render elements
5.6 GB used by geometry
17.9 GB used by textures

Wow, what's actually happening? I can see the nodes are running out of RAM, but all my bitmaps for this scene come to no more than 2 GB at most.

So why is this happening, and how can I work around it? 
Upgrading the nodes with more RAM is out of the question. DR only starts to become useful on big, complex scenes that take a long time to render, so is it actually working if it doesn't contribute passes? If not, how can I make it work on the scenes that will be render intensive?



2019-02-03, 15:25:02
Reply #1

TomG

  • Corona Team
  • Active Users
  • ****
  • Posts: 2112
    • View Profile
Just a note that you mention "below is the 4K render" and that there should be a log from the render node too - neither of those seems to have shown up in the post though :(

2019-02-03, 16:10:02
Reply #2

ant_march

  • Active Users
  • **
  • Posts: 7
    • View Profile
I did not make a screen grab of the render stats. The DR panel of the render window lists the render nodes on the screen grab, and I also pasted the render log from one of the nodes into the screen grab - that's where it mentions the GB usage. If that's not what's on the image, what else are you referring to? Thanks.

2019-02-04, 11:04:45
Reply #3

Frood

  • Active Users
  • **
  • Posts: 1130
    • View Profile
    • Rakete GmbH
If not what's on the image

The problem is: there is no image to look at :) Only a (broken) Dropbox link, where we get a 401 error from the server, meaning unauthorized access.


Good Luck



Never underestimate the power of a well placed level one spell.

2019-02-04, 12:59:02
Reply #4

ant_march

  • Active Users
  • **
  • Posts: 7
    • View Profile
Is this showing now? If not, what do people use to insert images into forum topics?

  Dropbox is not what it used to be. 


2019-02-04, 13:05:52
Reply #5

Frood

  • Active Users
  • **
  • Posts: 1130
    • View Profile
    • Rakete GmbH
is this showing now?

Yes, now it's accessible.


Good Luck



Never underestimate the power of a well placed level one spell.

2019-02-04, 13:06:47
Reply #6

ant_march

  • Active Users
  • **
  • Posts: 7
    • View Profile

2019-02-04, 13:35:09
Reply #7

Frood

  • Active Users
  • **
  • Posts: 1130
    • View Profile
    • Rakete GmbH
I can see the nodes are running out of RAM, but all my bitmaps for this scene come to no more than 2 GB at most.

2 GB on disk, you mean? Usually these are compressed formats like JPG, or LZW-compressed images like TIFF. Textures get unpacked into memory, so the size grows massively compared to the file size. You could quickly test your scene without textures by using the material override in the "Scene" tab of Render Setup: use a simple Corona material and test-render your scene on all nodes, then check the memory usage and/or pass contribution of the slaves.
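To illustrate the unpacking point, here is a back-of-envelope sketch. It assumes the renderer stores each texture as raw uncompressed pixels (width × height × channels × bytes per channel); real engines may add mip-maps or tiled caches, so actual usage can be higher.

```python
def texture_ram_mb(width, height, channels=3, bytes_per_channel=1):
    """Uncompressed in-memory size of one texture, in megabytes."""
    return width * height * channels * bytes_per_channel / (1024 ** 2)

# A 4096x4096 RGB JPG might be ~2 MB on disk, but unpacked:
print(texture_ram_mb(4096, 4096))        # 8-bit RGB  -> 48.0 MB
print(texture_ram_mb(4096, 4096, 4, 2))  # 16-bit RGBA -> 128.0 MB

# Roughly 380 such 8-bit RGB maps already account for the 17.9 GB in the log:
print(380 * texture_ram_mb(4096, 4096) / 1024)  # ~17.8 GB
```

So a handful of gigabytes of compressed files on disk can easily unpack to ten times that in RAM.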

So why is this happening, and how can I work around it? 

Looks like the nodes have been paging memory permanently and weren't able to render a single pass. While rendering, look at their Task Manager and check disk/CPU performance. The slaves surely go up to 100% disk load with almost no CPU load. It's amazing that they did not crash.
I fear that with 16 GB of RAM and such an amount of textures, there is no other way than to reduce the resolution of all textures where possible. I wonder where they come from - the scene does not look like 18 GB of textures to me. And there may be other issues with the scene, such as enabled bitmap paging or similar. Hard to say from a picture.
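A rough sketch of why downscaling helps so much (same uncompressed-storage assumption as above): because memory scales with the pixel count, halving a texture's resolution cuts its RAM footprint to a quarter.

```python
def texture_mb(w, h, channels=3, bytes_per_channel=1):
    """Assumed uncompressed size in MB: w * h * channels * bytes."""
    return w * h * channels * bytes_per_channel / (1024 ** 2)

full = texture_mb(4096, 4096)  # 48.0 MB per 8-bit RGB map
half = texture_mb(2048, 2048)  # 12.0 MB
print(half / full)             # 0.25 - halving resolution quarters the RAM
```

Dropping every non-hero texture from 4096 to 2048 would bring the 17.9 GB in the log down toward something a 16 GB node can hold.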


Good Luck



Never underestimate the power of a well placed level one spell.

2019-02-04, 13:49:24
Reply #8

ant_march

  • Active Users
  • **
  • Posts: 7
    • View Profile
I wasn't aware the textures would unpack to such a larger size. Textures can soon add up - if JPGs are saved at maximum quality rather than medium quality, for example. Something I've started doing as well is saving 4096 textures for most things: large brick coverage requires it to help hide tiling, AXYZ characters come with pretty large Targa files, and textures downloaded from Poliigon with spec, bump, and normal maps at 4096 also add up, plus the plant libraries. I'll probably enable the proxy resolution option to downscale the textures to see if that helps, and maybe even page to disk. Yes, tiny-res maps will render way quicker, and you could argue that for many aspects of a scene 4K textures are not required - close-ups yes, but renders not necessarily. At least I know now that if a DR node isn't rendering passes, it isn't contributing, and that unpacking multiplies the size of textures hugely.

Thanks for your response Frood.

2019-02-15, 14:01:31
Reply #9

maru

  • Corona Team
  • Active Users
  • ****
  • Posts: 8538
  • Marcin
    • View Profile
So the conclusion here is that the scene was simply too heavy for the nodes to handle?
Here are some RAM optimization tips: https://coronarenderer.freshdesk.com/support/solutions/articles/12000023310
I would also say that 16 GB of RAM is not enough for most "standard sized" scenes. If it's something simple, it will probably be enough, but if it's an interior with furniture or an exterior with trees, then I would definitely recommend 32 GB as the minimum.