- Tue Feb 14, 2006 11:51 pm
#119816
I really don't know what the heck I'm talking about, but here's my stab at it.
Feature 1: (again speculation)
You know, the first thing I noticed about this video is that once the files were merged, the noise in the image was significantly reduced (smoothed). I can only speculate that this means that in a distributed rendering setup each node could be set to render at a reduced SL value, and the merged render is then in effect 'sampled up' to give the much smoother result. Rendering would be sped up accordingly: the more nodes you have, the lower the SL each node needs to hit before the merged image looks clean. A rough sketch of that idea follows.
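To be clear, this is just a toy illustration of the statistics behind the guess, not anything about Maxwell's actual MXI merging (which presumably also tracks each node's accumulated sample counts). The `merge_renders` helper and the synthetic data are purely hypothetical; it only shows that averaging several independent noisy renders of the same scene knocks the noise down by roughly the square root of the node count.

```python
# Hypothetical sketch: averaging N independent, equally noisy renders of the
# same scene reduces per-pixel noise (std. dev. drops roughly by sqrt(N)).
import numpy as np

def merge_renders(images):
    """Average a list of HxWx3 float arrays (linear radiance)."""
    stack = np.stack(images, axis=0)
    return stack.mean(axis=0)

# Synthetic example: one 'clean' image plus independent per-node noise.
rng = np.random.default_rng(0)
clean = rng.random((4, 4, 3))
noisy_nodes = [clean + rng.normal(0.0, 0.1, clean.shape) for _ in range(5)]

merged = merge_renders(noisy_nodes)
print(np.abs(noisy_nodes[0] - clean).mean())  # error of a single node's render
print(np.abs(merged - clean).mean())          # noticeably lower after merging
```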
Feature 2: (here we go again)
I noticed that there are four lights, yet five files were merged together. Maybe each MXI is actually a type of rendering pass: output0.mxi would be a material/geometry/raw GI pass, and output1-4.mxi would be passes containing the information relative to each light (i.e. light location, falloff, relative location to geometry, effect on materials). When the images are composited together, each light could then be adjusted/manipulated through its own parameters. In a way this sounds very similar to an HDRI with an infinite number of exposures per light source.
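If the guess is right, the recombination itself would be simple, because light transport is linear in each emitter's intensity: the final image is just the sum of the per-light passes, each scaled by however bright you want that light to be. Here is a minimal sketch of that recombination under my assumptions; the `relight` helper and the synthetic passes are made up for illustration, not anything confirmed about the MXI format.

```python
# Hypothetical per-light recombination: final image = sum of per-light passes,
# each multiplied by an adjustable intensity, with no re-rendering needed.
import numpy as np

def relight(light_passes, intensities):
    """Combine per-light HxWx3 render passes with per-light scalar multipliers."""
    out = np.zeros_like(light_passes[0])
    for img, scale in zip(light_passes, intensities):
        out += scale * img
    return out

# Made-up usage: four per-light passes; dim light 2 to half and switch light 4 off.
rng = np.random.default_rng(1)
passes = [rng.random((4, 4, 3)) for _ in range(4)]
recombined = relight(passes, [1.0, 0.5, 1.0, 0.0])
```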
Well, it might be worth a good laugh, but that's my guess.
