Any features you'd like to see implemented into Maxwell?
#379652
Hi, I’ve built cardboard VR goggles for my smartphone, using its gyro to look around in a VR scene I produced with the Unity3D game engine. Works great: the stereo split screen delivers the correct image for each eye, and you can turn around and watch the scene in 3D :-))

[Image]

The VR scene is rendered in real time, though, which limits its render quality for playback on a smartphone.
So I rendered a Maxwell scene with a multi-stereo-camera setup and stitched the pictures from the left and right cameras together separately. This is necessary because the spatial relation between the left and right eye has to be preserved in every viewing direction. Two plain spherical renders wouldn’t work, because the stereo separation between two fixed cameras gets gradually lost the further you turn to the side (see the little sketch after the image).

[Image]
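To put a number on that falloff, here is a minimal Python sketch (the baseline value is just an assumed inter-pupillary distance): the baseline the viewer effectively sees shrinks with the cosine of the viewing angle and is gone at 90°.

```python
import math

BASELINE = 0.065  # assumed inter-pupillary distance in metres

def effective_baseline(yaw_deg):
    """Stereo separation a viewer perceives when looking yaw_deg away
    from the axis the two fixed cameras were set up for."""
    return BASELINE * math.cos(math.radians(yaw_deg))

for yaw in (0, 30, 60, 90):
    print(f"{yaw:3d} deg -> {effective_baseline(yaw) * 1000:5.1f} mm")
# 0 deg -> 65.0 mm ... 90 deg -> 0.0 mm: no stereo separation left.
```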

OK, now for the wish list part: a Maxwell render lens where you define the number of stereo cams in a circle and their distance to the center, and it renders the multi-cam spherical movies for left and right.
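Roughly, such a lens would compute camera positions like this (a sketch in Python; the function name and the handedness convention are made up, and 12 cams with a 32.5 mm radius are just example values):

```python
import math

def stereo_ring(n_cams, radius, center=(0.0, 0.0, 0.0)):
    """Positions and yaws for n_cams stereo pairs on a circle.

    Each pair looks outward along its yaw; the left/right cameras sit
    on the circle, offset tangentially to the view direction, so the
    baseline always stays perpendicular to where the pair is looking.
    Axis convention (X right, Z forward) is one possible choice.
    """
    cx, cy, cz = center
    for i in range(n_cams):
        yaw = 2.0 * math.pi * i / n_cams
        # tangent of the circle at this yaw (points to the right eye)
        tan_x, tan_z = math.cos(yaw), -math.sin(yaw)
        left = (cx - radius * tan_x, cy, cz - radius * tan_z)
        right = (cx + radius * tan_x, cy, cz + radius * tan_z)
        yield math.degrees(yaw), left, right

for yaw, left, right in stereo_ring(n_cams=12, radius=0.0325):
    print(f"yaw {yaw:5.1f}: L={left}  R={right}")
```

The point is that the left/right baseline rotates along with the yaw, instead of staying fixed as with two ordinary spherical cameras.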

THAT’D BE SUPER AWESOME and would leave everything else far behind in terms of visual immersion.

The only thing better than this would be either actually being at the location, or receiving Matrix-movie-style spinal stimulation for the simulation ;-)

So yeah, and then there could be a pre-prepared Unity3D scene (I have one) where you just exchange the left/right movies with your production, hit build, and put it online for everybody to install and watch on their smartphones in seconds, with simple stereo-magnifier goggles (like mine, or as they will surely be available to buy for $2 soon), an Oculus Rift setup, etc. And BAM, you’re in the movie scene!

Regards,
Frank
#379677
If the spherical lens also carried values to shift the virtual cam away from the rotation center in X and Z, to rotate the cam to look parallel or toed in, and to render two cameras with different positions, then I think that’d produce the correct spherical render result for each eye.

Something like this, where the spherical render would travel around an imaginary circle of that diameter:
[Image]
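If the cams are toed in rather than parallel, the toe-in angle would follow from the offset and the intended convergence distance. A quick sketch, with both values assumed:

```python
import math

half_baseline = 0.0325  # assumed camera distance from the rotation centre, metres
convergence = 2.0       # assumed distance at which the two views should converge

toe_in_deg = math.degrees(math.atan2(half_baseline, convergence))
print(f"toe-in per camera: {toe_in_deg:.2f} deg")  # ~0.93 deg; 0 deg = parallel
```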

Right now I rotate the repositioned cameras around the center and render 12 images per eye for the full cycle, then stitch them with PTGui. But wherever the stitch blends between two images, the perspective deviates from the correct one, so the 3D effect gradually decreases in those portions of the image when watching it in VR on my smartphone.
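For a feeling of how large that stitch aberration is, here is a back-of-the-envelope estimate (camera count from my rig; radius and object distances assumed): at a seam, the blend mixes two camera positions that are one chord of the circle apart, and an object at distance D appears shifted by roughly chord/D radians between the two views.

```python
import math

n_cams = 12      # images per eye, as in the PTGui workflow above
radius = 0.0325  # assumed camera distance from the rotation centre, metres

# Adjacent cameras on the ring are one chord apart.
chord = 2.0 * radius * math.sin(math.pi / n_cams)

for distance in (0.5, 2.0, 10.0):  # assumed object distances, metres
    error_deg = math.degrees(chord / distance)  # small-angle parallax shift
    print(f"object at {distance:4.1f} m: ~{error_deg:.2f} deg seam parallax")
```

Close objects suffer the most, which matches what I see in the goggles.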

A homogeneous spherical cam-offset/cam-rotate render could deliver the correct perspective for every pixel of the resulting left and right spherical images.
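This is not Maxwell code, just the standard omni-directional-stereo construction as I understand it, sketched in Python (names and axis conventions are mine): every output column gets its own tangentially offset ray origin, so the perspective is exact for each pixel instead of only at the stitch centers.

```python
import math

def ods_ray(px, py, width, height, half_baseline, eye):
    """Ray origin and direction for one pixel of an equirectangular
    omni-directional-stereo render; eye is -1 (left) or +1 (right)."""
    yaw = (px + 0.5) / width * 2.0 * math.pi - math.pi     # longitude
    pitch = math.pi / 2.0 - (py + 0.5) / height * math.pi  # latitude
    # eye origin: offset from the centre, tangent to the viewing circle
    ox = eye * half_baseline * math.cos(yaw)
    oz = -eye * half_baseline * math.sin(yaw)
    # ray direction from yaw/pitch
    dx = math.cos(pitch) * math.sin(yaw)
    dy = math.sin(pitch)
    dz = math.cos(pitch) * math.cos(yaw)
    return (ox, 0.0, oz), (dx, dy, dz)
```

The renderer would trace one such ray per pixel for each eye, and the seams disappear because every column already has its own correct camera position.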

So, is this a known issue?

Thanks a lot for your response, I will update and […]

Did you try luxCore?