Network GPU rendering

Posted: Mon Jun 22, 2020 4:17 pm
by Matthew Hermans
Hi,

I'm coming back to Maxwell after being a user back in the beta days. I've been out playing with GPU rendering for a few years and want to harness the new 5.1 features, including caustics on the GPU. I noticed that Network Rendering doesn't seem to support GPU mode; or do I need to use a command-line parameter?

I've got a lot of GPUs now, so is there a supported way to run them all together via Network Rendering's Cooperative Mode, or is the GPU engine not supported in any network mode?

Thanks!
Matt Hermans
Electric Lens Co
Sydney, Australia

Re: Network GPU rendering

Posted: Wed Jun 24, 2020 12:17 pm
by Forum Moderator
Hello Matt,

Yes, local network render jobs can use the GPU engine; cloud jobs can't, for the moment. The mxi files rendered with the GPU can be merged with one another, so cooperative jobs are allowed. However, a cooperative network job has to use either the CPU on all nodes or the GPU on all nodes; mixing them is not allowed (the mxs file sets the engine that will be used). Manually merging CPU and GPU mxi files is neither recommended nor supported at the moment. It can work if the channels and features you use are supported by both render engines, but you can get strange results, for example if you use SSS materials, which are not supported on the GPU. If you use Multilight or channels the GPU doesn't support, the mxi files won't merge at all, because mxi files rendered with the GPU contain a different set of buffers.
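As a side note, if you ever do attempt a manual merge, the per-node mxi files first need to be gathered in one place. Here is a minimal sketch of that housekeeping step; the folder layout, file names, and the idea of prefixing each copy with its node's folder name are my own assumptions for illustration, not Maxwell conventions (the actual merge would still be done with Maxwell's own tools):

```python
# Sketch: gather per-node .mxi outputs into a single folder before a
# manual merge. Folder layout and naming are illustrative assumptions,
# not Maxwell conventions.
from pathlib import Path
import shutil

def collect_mxi(node_dirs, merge_dir):
    """Copy every .mxi found in each node's output dir into merge_dir."""
    merge = Path(merge_dir)
    merge.mkdir(parents=True, exist_ok=True)
    copied = []
    for node_dir in node_dirs:
        for mxi in sorted(Path(node_dir).glob("*.mxi")):
            # Prefix with the node folder name so files rendered by
            # different nodes with the same scene name can't collide.
            dest = merge / f"{Path(node_dir).name}_{mxi.name}"
            shutil.copy2(mxi, dest)
            copied.append(dest)
    return copied
```

This only stages the files; whether the merge then succeeds still depends on all of them having been rendered with the same engine and channel set, as described above.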

Also, you can plug several graphics cards in the same machine and use them for the same render.

I hope this helps.

Best regards,
Fernando

Re: Network GPU rendering

Posted: Thu Jun 25, 2020 1:54 am
by Matthew Hermans
Thanks for the reply Fernando!

I think I must not have investigated deeply enough. The Add Job wizard doesn't offer CPU/GPU mode controls, so it must be inherited from the .mxs, or perhaps I can use a command-line argument. I'll let you know if I come up against any barriers.

Thanks again,