Any features you'd like to see implemented into Maxwell?
By Thomas An.
#16665
Hi all,

I have been turning this over in my head for a few days now:

You will notice that when a render starts, at sampling level 0 (the most grain), the output file is written for the first time and has a size of (say) 360 kb.

On every new sampling level (or any time the file is updated), some of the noise is removed and the file size also drops by a small amount:
Level 1 --> 346 kb
Level 2 --> 325 kb
Level 3 --> 315 kb
Level 4 --> 311 kb

... and so on. At level 15 or higher the change is only a few hundred bytes.

It is probably safe to assume that the rate of noise reduction correlates with the rate of file-size reduction (in other words, the more noise in the image, the higher its entropy and the more space it occupies on disk).

Do you think it would be reasonable for Maxwell to include an internal algorithm that monitors the file sizes each time the file is written, performs a regression on those data, and stops the render when the rate of file-size reduction drops below a user-defined value epsilon?

So basically it would have to run through the first 2 or 3 levels to gather enough data for its regression calculation and estimate the potential file-size asymptote.
We could call this new quantity the "entropy value": it ranges from 0 to 1 and represents the deviation from the estimated asymptote.

So now we could have a third option for when to stop a render.
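The idea above can be sketched in a few lines. This is only an illustration, not Maxwell's actual internals: it assumes the file size follows a geometric decay S(n) = A + B·rⁿ, estimates the asymptote A from the last three samples, and defines the "entropy value" as the remaining deviation from that asymptote, normalized to the 0-to-1 range described above. All names (`fit_asymptote`, `should_stop`) are made up for the example.

```python
def fit_asymptote(sizes):
    """Estimate the asymptote A of S(n) = A + B*r**n from the last
    three samples, using the ratio of successive differences."""
    s0, s1, s2 = sizes[-3], sizes[-2], sizes[-1]
    d0, d1 = s1 - s0, s2 - s1
    if d0 == 0 or d1 == d0:
        return s2  # curve is already flat; treat current size as the limit
    r = d1 / d0  # common ratio of successive drops
    if not (0 < r < 1):
        return s2  # not decaying geometrically; don't extrapolate
    # Sum of the remaining geometric tail: d1*r + d1*r**2 + ... = d1*r/(1-r)
    return s2 + d1 * r / (1 - r)

def should_stop(sizes, epsilon):
    """Entropy value = (current - asymptote) / (initial - asymptote),
    i.e. 1 at level 0 and approaching 0; stop when it falls below epsilon."""
    if len(sizes) < 3:
        return False  # need a few levels of data before regressing
    a = fit_asymptote(sizes)
    span = sizes[0] - a
    if span <= 0:
        return True  # no measurable decay left
    entropy_value = (sizes[-1] - a) / span
    return entropy_value < epsilon

# The file sizes quoted in the post, in kb:
sizes_kb = [360, 346, 325, 315, 311]
print(should_stop(sizes_kb, 0.1))  # True: within 10% of the estimated asymptote
```

With these five samples the fit puts the asymptote around 308 kb, giving an entropy value of roughly 0.05, so a threshold of 0.1 would stop the render while a stricter 0.01 would keep going.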

Just a thought... I might be crazy though ... you tell me :D

-
By tom
#16695
:D :D Yeah yeah yeah... now this rocks!
I believe Oscar can give this option a chance... it seems pretty easy to implement such a thing.
By Maya69
#17405
Yes, I think it's a very good idea.
