Author Topic: Creating a render farm  (Read 11854 times)

2016-06-02, 18:14:10
Reply #15

Juraj Talcik

  • Active Users
  • **
  • Posts: 3627
  • Tinkering away
My two Xeon E5-2670 v1 CPUs (70 euro each) for a dual-socket build should hopefully arrive tomorrow, and I already have the rest. Will post my build to see how it fares.

You can actually buy one of them instead of an i7-3930K and put it into a single-socket motherboard :- ) You will then pay 70 euro instead of 500+ euro for the same performance :- )

For anything else, read the thread Maru linked. This question has been asked 50 times since this forum was created and there are simply enough answers by now.
talcikdemovicova.com  Website and blog
be.net/jurajtalcik   Our studio Behance portfolio
Instagram   Our studio Instagram, managed by Veronika

2016-06-02, 18:34:54
Reply #16

jasond

  • Active Users
  • **
  • Posts: 20
Thanks very much for the tips, that thread is quite helpful as are the other posts in the Hardware forum!! I'm going through it closely with our IT guys as they'd be having all the fun installing :( They are mainly trying to make the business case for building a dedicated blade or a stack of PC's that we can use 24/7. Will post in the appropriate forum next time :)

2016-06-02, 18:58:39
Reply #17

Juraj Talcik

  • Active Users
  • **
  • Posts: 3627
  • Tinkering away
Blade servers are the thinnest units, meant to slot into a blade enclosure. They're extremely expensive, noisy, and require external cooling. They usually can't hold the most performance-oriented CPUs, which top out at 145W each. They're only meant for packing huge density into clusters; useless for anyone else.

Then there are plain rack-mount servers, commonly in 4U chassis, which are almost as roomy as regular ATX cases ("almost" being the key: you will be using 80-100mm fans inside, so they're going to be MUCH louder :- ) ... ). This is a good density-oriented solution for a medium-to-big company.

But you can also just buy the top-range v4 Xeons and, for example, 4 ATX-XL cases ('big towers'), which will yield huge performance (like 25 000 R15 Cinebench points, so roughly 25x 6-core i7s). A rack is not such a necessity, and it's loud... and hot. For smaller studios, a stack of normal cases should suffice 100 percent of the time.

Unless you are a single freelancer on a very low budget, don't buy multiple i7s... or some IKEA Helmer case nonsense :- ) It's not worth having a full room of boxes to manage just to save some 30 percent of the budget in value/performance. If you can have 40 cores/80 threads in a single silent tower, with a single licence for the software, that... is amazing.
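To put that value/performance trade-off into rough numbers: the Cinebench figures below are the ones from this post, while the prices, box counts, and licence counts are hypothetical placeholders, not quotes. A quick sketch:

```python
# Back-of-the-envelope value/performance comparison.
# Cinebench R15 totals are the rough figures from the post;
# costs, box counts, and licence counts are hypothetical placeholders.

builds = {
    # name: (total R15 points, assumed total cost in EUR, boxes, licences)
    "4x dual v4 Xeon towers": (25000, 26000, 4, 4),
    "stack of 6-core i7 boxes": (25000, 20000, 25, 25),  # ~1000 pts each
}

for name, (points, cost, boxes, licences) in builds.items():
    print(f"{name}: {points / cost:.2f} pts/EUR, "
          f"{boxes} boxes to manage, {licences} licences")
```

With these placeholder costs, the i7 stack wins roughly 30 percent on points per euro, but at the price of managing 25 boxes and 25 licences instead of 4.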
« Last Edit: 2016-06-02, 19:02:51 by Juraj_Talcik »

2016-06-02, 19:22:11
Reply #18

jasond

  • Active Users
  • **
  • Posts: 20
Fantastic information Juraj, thank you very much! We have a large server room here with several racks that could hold either the blade enclosure or 4U servers but like you say, a stack of Xeon towers seems to be the best bang for the buck.

Thanks again.

2017-05-26, 06:50:16
Reply #19

danielmn

  • Active Users
  • **
  • Posts: 70
Hey guys,
thanks for all the great info.  Just wanted to add my 2 cents and add a question.

As far as GHz speed between processors goes, I have seen very little difference. I went from a 6-core 3.4 GHz to a 6-core 4.2 GHz and only got about a 10% difference in the benchmarks... After I saw this I throttled down the overclock; there was no reason to run the risk of an OC and the extra power consumption.
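A quick sanity check of those numbers (using only the figures from the post): 3.4 to 4.2 GHz is a ~24% clock increase, so a ~10% benchmark gain means the workload wasn't purely clock-bound:

```python
# Compare the ideal clock-bound speedup against the observed benchmark gain,
# using the figures quoted in the post above.
base_ghz, oc_ghz = 3.4, 4.2
ideal_speedup = oc_ghz / base_ghz - 1   # if perfectly clock-bound
observed_speedup = 0.10                 # ~10% from the benchmarks

print(f"ideal: {ideal_speedup:.1%}, observed: {observed_speedup:.1%}")
# Less than half of the extra clock translated into benchmark score,
# which is why throttling the OC back made sense here.
```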


Question:
What is your take on RAM consumption if I were to rely only on DR rendering... several machines working on one still... Can the main machine have all the RAM and the rest just have the bare minimum? Just wondering what your thoughts were.

Background:
I am building a dual 8-core CPU machine, but I was thinking of buying some cheap $200 already-built HP quad-processor machines. The only downside with these machines is that they have very little RAM.
Daniel M. Najera
3D Environment Artist
danielmn81@gmail.com
https://www.facebook.com/daniel.m.najera1

2017-06-11, 12:13:41
Reply #20

cgifarm

  • Active Users
  • **
  • Posts: 55
  • Your Brand New RenderFarm
Hey guys, having a small render farm is cool for small and fast projects. The problem is that if you want to scale it up in the future, you will hit the following issues:

1. Provisioning your computers with OSes and installing applications + plugins takes a LOT of time; every major software install must be performed on each server unless you have a good provisioning solution that makes it easy to create an image of one machine and install it on the others.
2. Electricity upgrade: your computers will start drawing a lot of power when they are rendering, and you must have good wiring to avoid melted cables or blown fuses.
3. You will end up doing server administration instead of focusing on your projects, which I believe will cost you more money than the servers actually earn.
4. Heat issues: you will have to ventilate the room where you keep the servers properly; AC may also be needed.
5. Licensing: you will have to purchase or rent licenses for all your nodes.
6. Networking bottleneck: this happens with roughly 20+ servers, I guess, or with very big scenes; your 1 Gbit connection won't be enough and you will need to move to 10 Gbit fiber, which adds a lot to the cost.
7. Your project assets must have proper paths so all your nodes can find them, otherwise you will get missing textures or incorrectly set outputs; this can be solved by working only on your network storage and using network paths.
8. If rendering big-resolution images, all your nodes must have enough RAM, like 48-96 GB if you are doing 8-12K resolution. Having multiple low-spec nodes won't get you anywhere on these monster resolutions.
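As an illustration of point 7, a small script along these lines (the folder and server names are made up for the example) can flag local drive paths before a scene goes to the nodes:

```python
import re

# Flag asset paths that point at local drives instead of UNC network paths.
# Drive-letter paths like D:\... will break on render nodes that don't have
# that drive; \\server\share paths resolve the same way everywhere.
LOCAL_DRIVE = re.compile(r"^[A-Za-z]:\\")

def check_paths(asset_paths):
    """Return the asset paths that are not UNC network paths."""
    return [p for p in asset_paths if LOCAL_DRIVE.match(p)]

# Hypothetical example paths:
assets = [
    r"\\fileserver\textures\brick_diffuse.jpg",
    r"D:\projects\tex\wood.jpg",
]
print(check_paths(assets))  # only the D:\ path is flagged
```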

It can be a fun project, but I would advise giving our farm a test. We offer $20 worth of credits, enough for you to form an impression. Here's a quick video of how our process works.

https://www.cgifarm.com/renderfarm-quick-start-guide


Good luck with your project, guys! My own project also began as building something small for my studio, but I ended up focusing only on the render farm in the end; it had to be one or the other :).
Working on a Renderfarm Platform - checkout our website cgifarm.com and our cost calculator : https://www.cgifarm.com/renderfarm-cost-calculator

2019-01-22, 02:07:01
Reply #21

Tex3D

  • Users
  • *
  • Posts: 2
Hey everyone, I'm super new to using Corona (I bought it today!) and I've already started setting up a small 3-node render farm. Can someone please explain the texture situation to me?

So I start Max, load my scene, and send it to render via Corona. Do I have to have the texture maps on each computer (in a special folder), or does the main one send the textures and lighting info to each slave node, assuming DR is set up correctly? I keep seeing people talk about correct paths and such, so it sounds like all textures need to be on each machine? Is that right?

Thank you for any help!
Dave

2019-01-22, 08:55:19
Reply #22

Frood

  • Active Users
  • **
  • Posts: 1303
Hi and welcome :)

Just some short notes:

1. With 4 machines you will be using DR and Backburner (and maybe Deadline) for sure sooner or later (single-node processing generally gives you better quality in less time). So besides drserver, you may want to install/use the Backburner server on all computers as well.

2. Current Corona DR usually sends missing assets to the slaves, but you may end up with a lot of unneeded files on them, and I consider it more a fallback than something to rely on (with 3 slaves, all assets would need to be copied around 3 times).

3. With no dedicated server, I personally would create a share on your main workstation where all assets can be accessed from your network, and create scenes using UNC paths even when working locally. Meaning: if you are working on your main computer "mybox", when linking assets you would get "\\mybox\textures\shinywood_d.jpg" and not "D:\data\textures\shinywood_d.jpg", for example. "textures" would be shared on "mybox" in this case. This way you ensure that scenes work everywhere in your network.

4. If you will be running drserver or Backburner as a Windows service, you would have to use UNC paths anyway (this is because the operating system, acting as the "user", usually has no access to mapped drives).

5. I would run the Corona license server in your network (preferably also as a Windows service) for convenience.

6. Don't forget to check firewall settings in case something goes wrong; disable firewalls temporarily to check if they're the cause of the problem.
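Point 3 can also be automated when converting existing scenes. A minimal sketch, reusing the "mybox" example above (the share mapping is an assumption; adapt it to your actual shared folders):

```python
# Rewrite local asset paths to UNC paths, following the example in point 3:
# D:\data\textures\... on "mybox" becomes \\mybox\textures\...
# The mapping below is an assumption; list your actual shared folders.
SHARE_MAP = {
    r"D:\data\textures": r"\\mybox\textures",
}

def to_unc(path):
    """Return the UNC equivalent of a local path, or the path unchanged."""
    for local_prefix, unc_prefix in SHARE_MAP.items():
        if path.lower().startswith(local_prefix.lower()):
            return unc_prefix + path[len(local_prefix):]
    return path

print(to_unc(r"D:\data\textures\shinywood_d.jpg"))
# -> \\mybox\textures\shinywood_d.jpg
```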



Good Luck




Never underestimate the power of a well placed level one spell.

2019-01-29, 20:44:12
Reply #23

Tex3D

  • Users
  • *
  • Posts: 2
Awesome reply! Thank you so much for your help!!

-Dave

2019-03-08, 08:40:13
Reply #24

3dvizual

  • Active Users
  • **
  • Posts: 25
    • View Profile
If you were to build a small render farm today, what would the best setup be?

From what I can read, the rack solutions are loud, noisy, and produce a lot of heat.

How would you create a small render farm in your studio (maybe 4 PCs)? What would the specs be on each render slave PC?

Would you use an AMD Threadripper 2990WX (32 cores / 64 threads, 96 GHz in total) or an Intel setup?

What graphics card? How much RAM in each slave? And which motherboard?


« Last Edit: 2019-03-08, 08:47:29 by 3dvizual »