muahaha, the WTF moment .. hehe..
* * *
okay, my example.
check out our "pompeji" example. in the movie ( http://www.procedural.com/cityengine/pr ... mpeii.html ) you can see a close-up animation of the facades. they're built like this:
"color channel" x "dirt channel" (different pictures on same white clean plaster)
the openGL renderer gets two different uv sets per facade (the uvs are set up before the facade is split into smaller pieces by our cga grammar), one for the color texture and one for the dirt texture. the two channels can of course have different uv coordinates: mostly, the main plaster is a repeating (tileable) texture, while the dirt is repeated exactly once over the whole facade.
so it's important to know that the openGL display just multiplies those two values directly at display time, so there's no incredibly huge "baked" pool of combined textures lying on the harddrive.
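as a rough illustration of what that on-the-fly multiply amounts to (a hypothetical sketch, not the actual renderer code -- the texture names, sizes and uv values here are made up), think of it per pixel like this:

```python
import numpy as np

def sample(texture, uv):
    """nearest-neighbour lookup with repeat (tiling) wrap mode."""
    h, w = texture.shape[:2]
    u, v = uv[0] % 1.0, uv[1] % 1.0
    return texture[int(v * h) % h, int(u * w) % w]

# tiny stand-in textures: a "plaster" color map and a "dirt" multiplier map
plaster = np.full((4, 4, 3), 0.9)   # clean, nearly white plaster
dirt = np.ones((4, 4, 3))
dirt[2:, :, :] = 0.5                # darker grime on the lower half

# two independent uv sets for the same point on the facade:
uv_color = (2.3, 1.7)   # plaster uvs tile many times across the facade
uv_dirt = (0.4, 0.6)    # dirt uvs span the facade exactly once (in [0,1])

# the display multiplies the two samples directly -- nothing is baked to disk
shaded = sample(plaster, uv_color) * sample(dirt, uv_dirt)
```

the same idea maps onto two texture units in openGL with a modulate combine, one unit per uv set.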
we can write out the .fbx format, which carries the data correctly (as you can see in the movie, which was rendered in renderMan).
my computer's still importing a large fbx file, but i'll post a screengrab of the shading network that maya generates so we can discuss this later on..
* * *
maybe a MEL script could pick the network apart and repipe it ..