- Sun Aug 14, 2016 10:59 pm
#391923
Obviously, the MacBook Pros for on-the-road work will have to go out of the window, to be replaced by Windows laptops (the Nvidia issue).
But then, regarding PCs, what I don't understand is what happens if one's render nodes are fine with 16 GB of RAM (industrial design studio work that isn't heavy on scenes, or whatever the reason) and one then adds a GPU with 8 GB of VRAM: will the rendering be accomplished jointly by CPU and GPU, memory included, or will one have to buy a GPU with 16 GB of VRAM or more (if such GPUs even exist)? Or is all the rendering done on the GPU, so one might just as well put dusty old prehistoric CPUs into one's boxes?
One could, just as an example, buy an EVGA 12G-P4-2990-KR Nvidia GeForce GTX TITAN X for €1.400,00 for each render node PC - but might then be limited by its 12 GB of VRAM. More technically knowledgeable people may know whether that would be a wasted investment.
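Nobody outside Next Limit knows yet whether a scene will have to fit entirely in VRAM, but if it does, a back-of-the-envelope check is easy to sketch. Everything below (the helper name, the 512 MB overhead allowance, the scene sizes) is an illustrative assumption, not anything confirmed about Maxwell:

```python
def fits_in_vram(geometry_mb, textures_mb, vram_gb, overhead_mb=512):
    """Rough check: does an estimated scene footprint fit on the card?

    overhead_mb is a guessed allowance for the framebuffer and driver;
    none of these numbers come from Next Limit.
    """
    return geometry_mb + textures_mb + overhead_mb <= vram_gb * 1024

# A hefty scene (6 GB geometry, 4 GB textures):
print(fits_in_vram(6144, 4096, 12))  # 12 GB Titan X -> True
print(fits_in_vram(6144, 4096, 8))   # 8 GB card     -> False
```

The point being: if the engine turns out to render out-of-core (spilling to system RAM), a check like this is moot; if it doesn't, the 12 GB card may well be the safer buy.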
I suppose Next Limit will soon release information on how a render node PC should be configured for GPU rendering going forward.
Right.

ptaszek wrote:
BUT!
We don't know how GPU will work in Maxwell yet! Maybe on OSX it will come later, or something like that.
Maybe the GPU will first be used only for preview mode, not for final rendering. Maybe you will be limited in some options (like SSS not working on GPU yet). There are still a lot of open points before making any changes in hardware.
If you want to stay with Macs like me, let's wait first for Maxwell v4. Then maybe the 10xx drivers will be released, and if we are lucky, everything will work well!
If not, and Maxwell GPU rocks, I'll go for WINDOWS.
Good luck