By hatts
#373163
I just spotted this image on the summary page for Autodesk VRED renderer (a piece of software I had never heard of):

[Image: screenshot of the VRED feature summary]

Has anyone heard of this? Raytracing without internally converting to polygons? Is it real or a marketing trick?
I'd appreciate a link to an academic paper if anyone has one. All I could find were old papers from the 90s.
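For what it's worth, the approach in those older papers is usually Newton iteration directly on the parametric surface: solve S(u,v) = o + t·d for (u, v, t), no tessellation anywhere. A minimal sketch of that idea, using a toy bicubic Bézier patch rather than a full NURBS (no weights or knot vectors — the Newton step itself is the same) and entirely made-up names:

```python
import numpy as np
from math import comb

# 4x4 control net: x,y on a uniform grid, z forms a bump in the middle.
P = np.zeros((4, 4, 3))
for i in range(4):
    for j in range(4):
        P[i, j] = (i, j, 1.0 if i in (1, 2) and j in (1, 2) else 0.0)

def bern(n, i, u):
    """Bernstein basis polynomial B_i^n(u)."""
    return comb(n, i) * u**i * (1 - u)**(n - i)

def surf(u, v):
    """Point on the bicubic Bezier patch at parameters (u, v)."""
    return sum(bern(3, i, u) * bern(3, j, v) * P[i, j]
               for i in range(4) for j in range(4))

def surf_du(u, v):
    """Partial derivative of the patch with respect to u."""
    return 3 * sum(bern(2, i, u) * bern(3, j, v) * (P[i + 1, j] - P[i, j])
                   for i in range(3) for j in range(4))

def surf_dv(u, v):
    """Partial derivative of the patch with respect to v."""
    return 3 * sum(bern(3, i, u) * bern(2, j, v) * (P[i, j + 1] - P[i, j])
                   for i in range(4) for j in range(3))

def intersect(o, d, u=0.5, v=0.5, t=1.0, iters=20, tol=1e-10):
    """Newton iteration on F(u, v, t) = S(u, v) - (o + t*d) = 0."""
    o, d = np.asarray(o, float), np.asarray(d, float)
    for _ in range(iters):
        F = surf(u, v) - (o + t * d)
        if np.linalg.norm(F) < tol:
            return u, v, t
        # Jacobian columns: dF/du, dF/dv, dF/dt
        J = np.column_stack([surf_du(u, v), surf_dv(u, v), -d])
        du, dv, dt = np.linalg.solve(J, -F)
        u, v, t = u + du, v + dv, t + dt
    return None  # did not converge

# Shoot a ray straight down at the centre of the patch.
hit = intersect(o=(1.5, 1.5, 5.0), d=(0.0, 0.0, -1.0))
print(hit)  # (0.5, 0.5, 4.4375): the ray hits the top of the bump
```

The catch, and why the old papers spend most of their pages on it, is robustness: Newton needs a decent starting guess per ray, typically supplied by a bounding-volume hierarchy over the patch.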
#373177
There have been quite a few approaches to direct NURBS rendering over the years. Obscure applications like Realsoft could
already do this more than 10 years ago.
But the technology never became very popular. None of the major engines support it, as the digital entertainment industry
shifted away from NURBS (for good reasons) back in the nineties. Optimal support for meshes became the key requirement across
that industry.
It seems that recently some render products focused on the automotive industry (such as VRED) have established direct NURBS
rendering, which enables CAD customers to load native NURBS data. That's of course quite elegant, especially since a mesh is
always an inferior geometry representation compared to the original surface data.
I wonder, though, what happens when one wants to render anything other than uniform car paint. To my knowledge there's no solution
whatsoever for texture mapping, let alone displacement, in direct NURBS rendering - the methods I know (static projectors, UV unwrapping, and Ptex)
all work on the render mesh.
#373189
hatts wrote:Good info. Leads me to another question though; wouldn't procedural maps work on NURBS? Or some other form of vector (as in, not-raster) maps?
Hmm... not that I can think of, and I'm pretty sure there's nothing like this on the market.
NURBS polysurfaces, even in good models, simply don't come with a clean, evenly dense common structure - nothing that would even
roughly serve as an equivalent to UV islands on meshes. What makes up a solid may be highly precise, but the inherent structure
is extremely diverse: very dense and complex surfaces sit next to super simple ones, all with arbitrary, non-user-editable UV orientations.
Then there are trimmed surfaces, which under the hood still have their original size but are displayed with portions of their extents
suppressed. If one somehow managed to apply a procedural texture to the raw NURBS, it would show all sorts of densities and orientations,
since the isoparms run in the most diverse directions too. And without a secondary structure I cannot imagine editing flattened islands
independently of the actual 3D geometry - something that is a breeze with meshes and makes that workflow so powerful.
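The density problem is easy to demonstrate: even a purely procedural pattern, evaluated straight in (u, v), inherits whatever parameter distribution the surface happens to have. A 1-D sketch (all names and the two toy parameterizations are hypothetical):

```python
import numpy as np

def checker(u, v, n=8):
    """Procedural checker evaluated directly in NURBS parameter space."""
    return (int(np.floor(u * n)) + int(np.floor(v * n))) % 2

# Two surfaces covering the SAME world-space square [0,1]^2,
# but with different parameterizations:
uniform = lambda u, v: (u, v)      # arc-length-like: even density
skewed  = lambda u, v: (u**2, v)   # clustered: density varies across the patch

def stripe_widths(mapping, n=8, samples=10000):
    """World-space x-extent of each checker column under a parameterization."""
    widths = np.zeros(n)
    us = np.linspace(0, 1, samples, endpoint=False)
    xs = np.array([mapping(u, 0.0)[0] for u in us])
    cols = np.floor(us * n).astype(int)   # which checker column each sample is in
    for c in range(n):
        sel = xs[cols == c]
        widths[c] = sel.max() - sel.min()
    return widths

print(np.round(stripe_widths(uniform), 3))  # all columns equally wide
print(np.round(stripe_widths(skewed), 3))   # widths grow across the patch
```

Same checker, same world-space geometry - but on the skewed patch the last square ends up roughly 15x wider than the first. On a real trimmed polysurface every face would misbehave differently.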

If the desire is to work on textures with the same control as on meshes, I cannot see how one could do this without a (far simpler) secondary
structure - :idea: maybe a mesh :). To me the lack of texture support seems like THE conceptual weakness of the whole idea of NURBS rendering - but
I'm no NURBS developer, and I would certainly be extremely pleased to be proven wrong...
#373190
Yes, that sounds challenging.

However, mesh technology overcame the same problems of density and topological inconsistency by introducing retopologizing, by providing tools that let users build custom UV maps, and by enabling "dumb" UV projections that only consider the bounding box of a mesh.

So by that logic, it seems you could apply isoparm remappings to all surfaces in a model with a standardized "grid density." All of this must have gone into the original tools devised for mesh retopologizing and remapping, so why can't they be modified to use equation-driven boundaries instead of coordinates?
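The 1-D core of such a remapping - resampling so that every cell of the new "grid" covers the same world-space distance - is simple enough to sketch. This is toy code under that assumption, not taken from any real tool:

```python
import numpy as np

def curve(u):
    """Toy non-uniformly parameterized curve: x = u^2, so the parameter
    is dense near u = 0 and sparse near u = 1."""
    return np.array([u**2, 0.0])

def arclength_reparam(curve, n_dense=2001, n_out=9):
    """Resample a curve at (approximately) equal arc-length steps --
    the 1-D analogue of remapping isoparms to a uniform grid density."""
    u = np.linspace(0.0, 1.0, n_dense)
    pts = np.array([curve(t) for t in u])
    seg = np.linalg.norm(np.diff(pts, axis=0), axis=1)
    s = np.concatenate([[0.0], np.cumsum(seg)])   # cumulative arc length
    s /= s[-1]                                    # normalize to [0, 1]
    targets = np.linspace(0.0, 1.0, n_out)
    return np.interp(targets, s, u)               # u-values at uniform spacing

new_u = arclength_reparam(curve)
print(np.round(new_u, 3))  # 0, 0.354, 0.5, ... : sqrt-spaced in u, uniform in space
```

Extending this to surfaces (two coupled directions, trims cutting across the grid, and keeping the remap consistent across shared edges of a polysurface) is presumably where it stops being simple - which may be why nobody ships it.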

dynamesh for NURBS, is all I ask! :lol: