All posts related to V3
#391992
eric nixon wrote:
It doesn't? Do you have inside information or what is your reasoning here?
My reasoning is that you were talking nonsense, as though there is no normalization when there blatantly is.

Your way of responding in general seems unreasonable. You raise an issue but won't check it with the latest build for the benefit of all who want to know. I can't test this right now. No equipment here.

I have no inside information, just my experience, which I am sharing with you. I genuinely thought this was fixed because I haven't seen any issues lately, and therefore the issue can't be that bad, can it?
Your "reasoning" really should consist of a bit more than just an argument from personal incredulity, before you call another’s response "unreasonable". If you see no value in discussing this issue, then why engage at all?

You might not be aware of it, but your posts tend to come off as rather rude and condescending. This makes it much more difficult to have a constructive discussion, which is what I'm interested in. Unfortunately, no one from NextLimit seems to want to participate. Tom or any of the others should be able to quickly clear up any misunderstanding on my part, should I actually be wrong.
#391996
gmenzel wrote: Your "reasoning" really should consist of a bit more than just an argument from personal incredulity, before you call another’s response "unreasonable". If you see no value in discussing this issue, then why engage at all?

You might not be aware of it, but your posts tend to come off as rather rude and condescending. This makes it much more difficult to have a constructive discussion, which is what I'm interested in. Unfortunately, no one from NextLimit seems to want to participate. Tom or any of the others should be able to quickly clear up any misunderstanding on my part, should I actually be wrong.
I said I would check it. I have prepared a scene where this error is very visible and reported it. Right now we are quite busy with the GPU engine, but we will look into it asap.

I can confirm there is normalization on additive blending when needed. But there seems to be a bug in certain situations. I don't want to say anything for sure before really going into that part of the code and checking it, but right now I think it's a problem that can be fixed with the current material system.
#391997
luis.hijarrubia wrote: But there seems to be a bug in certain situations.
That makes it seem like this doesn't happen very often tbh, maybe I'm looking at this wrong :o (this is with 3.2.1.5)

Image

That's just a standard plastic for me (or it was)... e.g. a rough bottom layer and an additive 1.5 IOR coating. Surely that's a bug if the edge of the ball is brighter than the light source surrounding it?
#392001
luis.hijarrubia wrote: I said I would check it. I have prepared a scene where this error is very visible and reported it. Right now we are quite busy with the GPU engine, but we will look into it asap.

I can confirm there is normalization on additive blending when needed.
You work for NextLimit? I wasn't aware of that. Your presence on the board gives no obvious indication to that effect. I'm glad you are looking into this.

I want to be precise here. When I say "normalization", I mean measures that ensure that the physically correct amount of light is re-emitted, i.e. a proper balancing of the layer weights in accordance with the material properties, as well as the incident light angle. I'm NOT talking about Maxwell simply cropping the total energy at 100% of the allowed range. I understand that this IS being done at the moment.
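To make the distinction concrete, here is a rough Python-style sketch of the two approaches. The function names and the Schlick Fresnel approximation are mine, purely for illustration; I'm not claiming this is how Maxwell is implemented internally:

Code:
def fresnel_schlick(cos_theta, ior=1.5):
    # Schlick approximation for a dielectric coating
    f0 = ((ior - 1.0) / (ior + 1.0)) ** 2
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

def cropped_blend(base, coat, cos_theta):
    # "Cropping": naively add the layers, then cap the total at 100%
    return min(base + coat * fresnel_schlick(cos_theta), 1.0)

def weighted_blend(base, coat, cos_theta):
    # "Normalization" in the sense I mean: the coating takes its Fresnel share,
    # and the base only receives the energy the coating lets through
    f = fresnel_schlick(cos_theta)
    return coat * f + base * (1.0 - f)

The weighted version can never return more than 1 for any incident angle (as long as base and coat are each at most 1), while the cropped version simply hits the cap near grazing angles and flattens out.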
luis.hijarrubia wrote: But there seems to be a bug in certain situations. I don't want to say anything for sure before really going into that part of the code and checking it, but right now I think it's a problem that can be fixed with the current material system.
I don't see how this could be possible without introducing an additional layer blending mode or improving the coating component, but I'm open to being convinced otherwise.
#392002
photomg1 wrote:Surely that's a bug if the edge of the ball is brighter than the light source surrounding it?
Again, this seems not to be a simple bug, but rather a general behavior consistent with the way additive layers work. I wrote this before: as long as the output of one BSDF is naively added to the output of another BSDF, the result will always be too much reflected energy.
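A toy numerical example of what I mean (the numbers are invented, just to illustrate the principle):

Code:
base_albedo = 0.8       # bright, rough base layer reflecting 80% of the incident light
coat_reflectance = 0.5  # additive coating reflectance near grazing, at the silhouette

naive_sum = base_albedo + coat_reflectance
print(naive_sum)        # 1.3 -> 130% of the incoming energy gets "reflected"

This is the kind of thing that shows up as the edge of the ball being brighter than its surroundings.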

Now, this is a subtle effect in most situations and will often only become obvious in direct comparison with some sort of ground truth image. I only became aware of the problem when I started trying to match my rendered results to reference photos (a white and very shiny leather, specifically) and just couldn't seem to get it right.
#392003
gmenzel wrote:
luis.hijarrubia wrote: I said I would check it. I have prepared a scene where this error is very visible and reported it. Right now we are quite busy with the GPU engine, but we will look into it asap.

I can confirm there is normalization on additive blending when needed.
You work for NextLimit? I wasn't aware of that. Your presence on the board gives no obvious indication to that effect. I'm glad you are looking into this.

I want to be precise here. When I say "normalization", I mean measures that ensure that the physically correct amount of light is re-emitted, i.e. a proper balancing of the layer weights in accordance with the material properties, as well as the incident light angle. I'm NOT talking about Maxwell simply cropping the total energy at 100% of the allowed range. I understand that this IS being done at the moment.
I know. I'm kind of a rookie; I've been working on the Maxwell team for almost as long as I've been registered here, but I was here to learn. I have worked on some issues related to the blending system, so I'm not the one who wrote it, and I'm not sure I understand it 100%. But that's how it's working, AFAIK.

When an additive layer is evaluated, the final value is calculated and, if the energy is over 1, it is normalized. Not cropped: all of the material's BSDFs are normalized together. And this is done per pixel, because you can have an object with some black parts (black absorbs all the energy, so you will never have the problem there) and other white parts which, added to the other BSDFs, can make the total amount of energy going out bigger than the energy coming in. The normalization process here is quite complicated; that's why I think there can be bugs in it. But maybe the day I really get into this I will find there is some kind of architectural issue that makes this impossible to do with the current model.
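If it helps, here is a very simplified sketch of the idea in Python. This is NOT the real code, just an illustration of the principle I described:

Code:
def normalize_additive(bsdf_values):
    # bsdf_values: the contribution of each BSDF in the stack for one pixel.
    # If the summed contribution exceeds 1, rescale every BSDF by the same
    # factor so the total never exceeds the incoming energy.
    total = sum(bsdf_values)
    if total > 1.0:
        return [v / total for v in bsdf_values]
    return bsdf_values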

I will let you know if there is any development on this.