By dariolanza
#369238
Hello Jason,

I could not disagree more with your words.
There are certain "industry procedures" (call them whatever you want) that don't break Maxwell's philosophy or its physical correctness.

If you wanted to be more purist, you shouldn't use Bump mapping or Normal mapping; you should model every last bit of your object (down to its microgeometry).
You shouldn't use HDRI maps to light a scene either, as you could call that a "fake" as well. Instead, you should model the whole city all around you to get the reflections and illumination coming from every direction. But we work in an industry, and there are some usual procedures used to gain productivity, like Bump mapping or HDRI lighting.

And placing emitters with the right color to reinforce other emitters that are difficult for rays to reach is not a fake at all, but a usual procedure photographers use every day to push up the light sources and allow reducing the film ISO, thus reducing the noise.
This is the same concept applied in Maxwell to get less noise in the same render time.
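(To put rough numbers on that analogy -- this is just back-of-the-envelope shot-noise arithmetic, not a description of Maxwell's internal algorithm: the relative noise of a pixel goes roughly as 1/sqrt(N), where N is the number of samples that actually carry light to that pixel. If a reinforcing emitter lets four times as many samples contribute in the same render time, the relative noise drops from 1/sqrt(N) to 1/sqrt(4N), i.e. it roughly halves -- just as a photographer who quadruples the light on set can drop the ISO two stops and get a visibly cleaner frame.)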

Believe me, we are really, really picky about what we incorporate into Maxwell, and we preserve Maxwell's correctness as our most precious treasure.

Of course we don't consider any of these procedures a fake. When we talk about fakes in CG we all know what we are talking about: cutting bounces, tweaking the rays, faking global illumination... all those tricks created to make light not behave like real light, or the camera not work like a real camera, just to gain speed.

Putting emitters behind a window, like a photographer does, won't stop Maxwell from working like a real camera in any way, or from calculating the lights like real lights.

If you want to be purist you could avoid any secondary emitter and render only with the indirect light coming from the sky. Of course you will get the correct result, a replica of the real scene, but reaching inaccessible emitters is hard (in ray calculations), so the render will be noisy and slow, and people don't like noise. So, what do you want?
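As a toy illustration of what "hard to reach in ray calculations" means (plain Monte Carlo with made-up numbers, nothing to do with Maxwell's real code):

import random

def estimate_emitter(hit_probability, radiance, n_paths):
    # Toy estimate of the light a pixel receives from one emitter, assuming
    # each random path reaches it with a fixed probability. Unbiased, so it
    # always converges to the correct value -- eventually.
    total = 0.0
    for _ in range(n_paths):
        if random.random() < hit_probability:
            total += radiance / hit_probability
    return total / n_paths

random.seed(1)
# Easy emitter (hit by 30% of paths) vs. hard-to-reach emitter (0.3%),
# both contributing the same true amount of light (1.0).
for p in (0.3, 0.003):
    print(p, [round(estimate_emitter(p, 1.0, 1000), 2) for _ in range(5)])

Both estimators are correct on average, but the rarely hit emitter gives wildly varying values for the same number of paths -- that variance is the noise, and it only goes away by throwing much more render time at it.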

And of course, keep in mind that the first thing in our heads, since the beginning, is to find algorithms that solve the paths faster while maintaining the physical correctness of the result.
Every day we work along those lines. We got a huge improvement in Maxwell v2, and there will be more improvements in future versions. Just keep in mind that we work hard to solve the algorithm faster rather than implementing other features.

Greetings

Dario Lanza
By Half Life
#369240
I think maybe you misunderstood me, Dario. The "break" I am referring to is the need to set emitters to unreal values to get a scene to clear (faster).

I understand that "it will clear eventually" is a technically valid stance -- but certainly not a terribly helpful one to the end user.

First and foremost, Maxwell is a light simulator -- and users are encouraged in all the Maxwell literature to use "real-world" values for all emitters... However, this problem (unevenly powered emitters producing excessive noise at low sample levels) is not documented anywhere, and furthermore the solution goes directly contrary to the standard advice from Next Limit.

I realize this is likely what JD once referred to as a "corner case", meaning very specific and not representative of how the software works on the whole... but it is very clear to me that these "corner cases" are the biggest problem for Maxwell to solve going forward. The reason is simple: anything that forces the user to adopt a completely different solution from what they normally should do in Maxwell is highly damaging to the intended/designed workflow (essentially breaking it).

I think maybe you want to define Maxwell via the "correctness" of the internal engine -- which is valid. However, most users will instead define Maxwell based on the "correctness" of its UI conventions with itself... meaning that where the UI/workflow breaks down (like it did in this case) the program is false to itself (internal calculation "correctness" be damned).

If you prefer to label that a failure of the UI design (and/or the documentation) rather than a failure of the engine, that is fine... however, the result is exactly the same for the user -- it's broken, and the only solution is to break it more to get usable results.

For instance, you will never convince me that AGS is a suitable solution for realism or real-world in any way... it is an imaginary substance meant to mimic glass because Maxwell often cannot correctly render glass in sunlight (unless you render to SL 34-35). Under all normal circumstances a user is told to use a glass MXM (with "real-world" values) for glass objects... but in this instance they are forced to use AGS instead -- and in that way Maxwell is broken (it disagrees with itself).

Essentially I am talking about a logic failing rather than a math one.

Also, for what it is worth, there is no need to model down to the micro-geometry level -- that is what displacement is for. But yes, you are correct that both bump and normal mapping are falsehoods for the sake of expediency (like AGS)... however that is apples and this is oranges, because you are talking about texture mapping/geometry and I am talking about light calculations... which is supposed to be Maxwell's strongest suit.
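Just to spell out why I call bump/normal mapping a falsehood, here is the whole trick in a few lines (generic bump-mapping math with a made-up height function, not Maxwell's implementation):

import math

def bump_perturbed_normal(height, x, y, strength=1.0, eps=1e-3):
    # Tilt the shading normal of a perfectly flat surface using finite
    # differences of a height map -- no geometry is ever moved.
    dhdx = (height(x + eps, y) - height(x - eps, y)) / (2 * eps)
    dhdy = (height(x, y + eps) - height(x, y - eps)) / (2 * eps)
    n = (-strength * dhdx, -strength * dhdy, 1.0)  # flat normal is (0, 0, 1)
    length = math.sqrt(sum(c * c for c in n))
    return tuple(c / length for c in n)

ripple = lambda x, y: 0.05 * math.sin(20.0 * x)  # imaginary wavy detail
print(bump_perturbed_normal(ripple, 0.1, 0.0))

The light calculation sees a tilted normal, but the silhouette and the shadows of the still-flat surface never change -- detail that shades as if it exists but doesn't. Displacement, by contrast, actually moves the geometry.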

Best,
Jason.