Everything related to Maxwell Render and general stuff that doesn't fit in other categories.
By Dover Studios
#389568

I have put together a new PyMaxwell-based Python script called "Automagic VR" that simplifies the process of customizing the LatLong Stereo render settings in Maxwell Studio.

The script works with Maxwell Render 3.2 and takes a new .mxs scene file that has a camera with the LatLong Stereo lens activated.

When the "automagicLatLongStereo.py" script is run in PyMaxwell, the mxs scene file is opened up and the camera's target is used to set up the LatLong Stereo settings. The "Parallax Distance" is set to the target distance value, and an automatic formula is used to come up with a comfortable stereo camera "separation" setting.

A screen space "Separation Map" texture of your choice is also applied to the LatLong Stereo lens shader.

Then the lens shader's camera view is adjusted so that a set of center, left, and right camera view scene files are generated with the names <scene>_C.mxs, <scene>_L.mxs and <scene>_R.mxs. These new files are saved to the same folder as the original .mxs scene file.

Then at the bottom of the script you have the option to automatically render either the center camera, or the left and right camera views, in Maxwell Render. (This automatic rendering feature only works with Maxwell Render on Windows right now.)

If you wanted to render the left and right views automatically you would set the code to:
Code: Select all
  # Camera Views to Render
  # ----------------------
  # Set each of the views to 1 to render, and 0 to skip rendering
  leftView = 1
  rightView = 1
  centerView = 0
If you wanted to render just the center camera view automatically you would set the code to:
Code: Select all
  # Camera Views to Render
  # ----------------------
  # Set each of the views to 1 to render, and 0 to skip rendering
  leftView = 0
  rightView = 0
  centerView = 1
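For reference, here is a rough sketch (not taken from the actual Automagic VR script) of how that Windows-only automatic rendering step could be driven from Python using these view flags. The maxwell.exe path and the "-mxs:" command line flag are assumptions that may need adjusting for your Maxwell Render installation.
Code: Select all
# Rough sketch of the Windows-only auto-render step (assumed, not the actual script code)
import subprocess

# Camera views to render (same flags as above)
leftView = 1
rightView = 1
centerView = 0

maxwellExe = r'C:\Program Files\Next Limit\Maxwell 3\maxwell.exe'  # hypothetical install path
sceneBase = r'C:\renders\myscene'  # original scene path without the .mxs extension

views = []
if leftView:
    views.append(sceneBase + '_L.mxs')
if rightView:
    views.append(sceneBase + '_R.mxs')
if centerView:
    views.append(sceneBase + '_C.mxs')

for mxsFile in views:
    # Launch a Maxwell Render session for each selected camera view
    subprocess.call([maxwellExe, '-mxs:' + mxsFile])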
Automagic VR Download:
http://www.andrewhazelden.com/projects/ ... studio.zip

Note: There is a new build of Maxwell Studio that I believe will come out either today or within the next day or two. The Automagic VR script is made to work with that build, as the updated LatLong Stereo shader's "Parallax Distance" attribute now operates in the cm unit scale instead of the previous mm unit scale.

Regards,
Andrew Hazelden
Domemaster3D Co-Developer
Last edited by Dover Studios on Tue Dec 01, 2015 7:14 pm, edited 1 time in total.
By Dover Studios
#389651
Hi Nova66.

You should try downloading the latest Maxwell Render 3.2.0.4 beta release and see if it works better on your system. You will want to use 360 cm as the parallax distance to match the 6.5 cm stereo camera separation value.

Also Mihai did a nice webinar on stereo rendering in Maxwell Render that is available on the MaxwellZone website:
http://www.maxwellzone.com/tutorials/ti ... tion-tips/

The HD video clip from Mihai's webinar is on Vimeo here:
https://vimeo.com/147934179

Cheers,
Andrew Hazelden
By Nova66
#389652
Hi Andrew Hazelden,
Dover Studios wrote:You should try downloading the latest Maxwell Render 3.2.0.4 beta release and see if it works better on your system.
Oh I have and it works great now, thanks for keeping me in mind :-)
v3.2.0.3 fixed the Separation Map issue and v3.2.0.4 fixed an issue with interior lighting.
Dover Studios wrote:You will want to use 360 cm as the parallax distance to match the 6.5 cm stereo camera separation value.
I touched on this before, but I don't quite see why 360 cm should be such a critical parallax distance for the 6.5 cm separation value. My models are always constructed at real world scales, my camera separation value is almost always going to match my IPD, which is somewhere between 6.5 cm and 6.8 cm, and my focal point or parallax distance can be anywhere from 1 m to 100 m away. Admittedly I haven't had a chance to do a super wide range of tests, but so far I haven't had an issue with this approach.
Dover Studios wrote:Also Mihai did a nice webinar on stereo rendering in Maxwell Render that is available on the MaxwellZone website:
Thanks for the tip, I hadn't come across that one yet.

All in all, I'm really happy with how the Lat-Long Stereo Lens has turned out. All I need now is a bigger render farm to render the super high resolution stereo panorama images :-)


Cheers,
Andrew.

P.S. I love the Eagle Transporters from Mihai's video even more than my Tintin Moon Rocket :-)
By Mihai
#389653
That model was courtesy of Curt Roth (Nico on the forums), the guy who makes the flight simulators: http://www.trcsimulators.com/

It's a terrific model, I wish there was time for me to do something more elaborate with it. Since a few years now I've had the thought of creating a collaborative project with Maxwell with the help of users from this forum...
By Dover Studios
#389654
Nova66 wrote:Hi Andrew Hazelden,
Dover Studios wrote:You will want to use 360 cm as the parallax distance to match the 6.5 cm stereo camera separation value.
I touched on this before but I don't quite see why 360cm should be such a sensitive parallax distance for the 6.5cm separation value. My models are always constructed at real world scales, my camera separation value is almost always going to match my IPD which is somewhere between 6.5cm & 6.8cm, and my focal point or parallax distance can be anywhere from 1m to 100m away. Admittedly I haven't had a chance to do a super wide range of tests but so far I haven't had an issue with this approach.
As usual, brevity in text communication leads to a lack of clarity. I meant to say: "If you have any existing Maxwell LatLong Stereo based scene that you rendered previously in older beta releases with a 10X larger parallax distance in the UI number field (eg 7200 mm), due to the old mm vs cm scale thing in the UI, you will want to update those .mxs scenes by sliding the decimal place over by one, to a value like 720 cm." As always, you can use any IPD and parallax distance setting you want if you are satisfied with the results. This was just to give new users a rough starting point they could then tweak their scenes from. :wink:

The key thing is to balance the strength of the 3D depth effect against the fact that there is a certain minimum "safe distance" that objects should be kept away from the camera. If you mix super close and super distant objects in the same scene, you can end up with too much parallax in the rendering, which is uncomfortable for people who are sensitive to strong depth effects in stereo imagery and prone to headaches from them.

(In my opinion this is partially an issue due to the lack of DOF in the current crop of stereo 360° renderings, which would otherwise tone down the details in the background a bit when super extreme differences in stereo depth are mixed in the same scene.)

For example, if you do an anaglyph style LatLong stereo composite and see the same object shifted apart by about 30% or more of the whole frame width (in the red vs blue/green channels) between the left and right eye views, then very few human brains will be able to comfortably merge the stereo views into a workable stereo image. That amount of disparity causes "brain shear" and induces headaches after a certain amount of viewing time (20 minutes or more).
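As a toy illustration of that rule of thumb (my own sketch, not part of any Maxwell tooling), you could measure the pixel position of the same feature in the left and right eye renders and check the disparity as a fraction of the frame width; the positions and the threshold below are placeholder values.
Code: Select all
# Toy check of the ~30% disparity rule of thumb described above
def disparity_fraction(left_x, right_x, frame_width):
    # Horizontal offset between the two eyes as a fraction of the frame width
    return abs(left_x - right_x) / float(frame_width)

# Example: a feature at x=1000 px in the left eye and x=3500 px in the right eye
# of an 8192 px wide LatLong frame
fraction = disparity_fraction(1000, 3500, 8192)
print('Disparity: %.1f%% of the frame width' % (fraction * 100.0))
if fraction > 0.30:
    print('Likely hard for most viewers to fuse comfortably')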

That said, if you are able to comfortably "free view" standard cross-eyed stereo imagery without any 3D glasses for an extended period of time, then you have something of an "iron stomach": you aren't likely to be someone with a sensitivity to stereo depth issues, and you could handle pretty much any stereo depth setting you want. Other viewers are more sensitive to strong depth effects, and they also complain heavily about shaky camera work in 360° video clips. :-)
By Dover Studios
#389686
Hi Nova66 and Mihai.

I've put together a new open source repository on GitHub called the "Maxwell Render Toolbox". My goal is for it to collect in one place the various little Maxwell scripts that I write to help automate pipeline tasks using the PyMaxwell program that comes with Maxwell Render. :) The latest versions of my PanoView and Automagic VR LatLong Stereo scripts are now stored in the Maxwell Render Toolbox repository.

I’ve also added two new scripts to the Maxwell Render Toolbox GitHub repository this weekend. The mxi2gearvrcube.py script converts a set of Maxwell .mxi based LatLong Stereo panoramas into a Gear VR stereo cubemap style horizontal strip panorama. The script uses PyMaxwell, Panotools, and ImageMagick to do the panoramic conversions. This Gear VR stereo cubic image has the same image projection as the Octane VR and Vray cubic panorama output.

The mxi2photosphere.py script will embed the Google Photosphere EXIF metadata into a LatLong panorama. This makes it easier to view the panoramic rendering on Google+ and on a Google Cardboard HMD display.
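For anyone curious, that metadata embedding step most likely boils down to writing the published Google Photo Sphere (GPano) XMP tags into the image. Here is a minimal sketch using Exiftool from Python; the file name, resolution, and the exact set of tags the real script writes are assumptions.
Code: Select all
# Sketch of writing GPano XMP tags with Exiftool (assumed; the real script may differ)
import subprocess

panorama = 'latlong_render.jpg'  # hypothetical rendered LatLong panorama
width, height = 8192, 4096       # full equirectangular resolution

subprocess.call([
    'exiftool',
    '-XMP-GPano:ProjectionType=equirectangular',
    '-XMP-GPano:UsePanoramaViewer=True',
    '-XMP-GPano:FullPanoWidthPixels=%d' % width,
    '-XMP-GPano:FullPanoHeightPixels=%d' % height,
    '-XMP-GPano:CroppedAreaImageWidthPixels=%d' % width,
    '-XMP-GPano:CroppedAreaImageHeightPixels=%d' % height,
    '-XMP-GPano:CroppedAreaLeftPixels=0',
    '-XMP-GPano:CroppedAreaTopPixels=0',
    panorama,
])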

The main Maxwell Render Toolbox wiki page is accessible here:
https://github.com/AndrewHazelden/Maxwe ... olbox/wiki

There is also a Windows based installer for the toolset included on the GitHub releases page:
https://github.com/AndrewHazelden/Maxwe ... x/releases

Open Source Tools

A collection of popular open source command line tools comes with the installation in the Maxwell Render Toolbox "tools" directory: FFmpeg, ImageMagick, Exiftool, the Google Spatial Media 360° metadata script, and the Panotools warping library. These make it possible to do some interesting tasks directly from PyMaxwell, such as Side by Side and Over/Under stereo image compositing, adding YouTube 360° and Google Photosphere panoramic metadata tags, and using FFmpeg to read a folder of .mxs/.mxi project files and turn the rendered image sequences into MP4/MOV/AVI/MKV movie files automatically, with the alpha channels preserved if an appropriate movie codec is chosen.
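As one concrete example of the kind of FFmpeg call that could be automated, here is a sketch with placeholder paths and frame rate; ProRes 4444 is one codec that carries an alpha channel.
Code: Select all
# Sketch: turn a numbered render sequence into a movie while keeping the alpha channel
import subprocess

subprocess.call([
    'ffmpeg',
    '-framerate', '24',
    '-i', 'render_%04d.png',      # hypothetical rendered image sequence
    '-c:v', 'prores_ks',
    '-profile:v', '4444',         # ProRes 4444 profile supports alpha
    '-pix_fmt', 'yuva444p10le',   # pixel format with an alpha plane
    'render_preview.mov',
])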
By Nova66
#389687
Hi Dover Studios,
Dover Studios wrote:I’ve also added two new scripts to the Maxwell Render Toolbox GitHub repository this weekend: There is also a Windows based installer for the toolset included on the GitHub releases page:
https://github.com/AndrewHazelden/Maxwe ... x/releases
This is great stuff :-)

A user (reaver) on the forums has already questioned the pros and cons of cube maps vs Lat-Long with regard to maximum image fidelity. Before Maxwell had the Spherical 360° Panorama lens, I used to manually render out the six square faces of a virtual cube and combine those individual images into one equirectangular panorama; the workflow was tedious but worked pretty well. I was wondering if you know whether Octane Render and Vray render these stereo panoramas natively in the cube map format, or whether they also use a spherical Lat-Long lens and simply convert to the cube map format at the end?


Thanks,
Andrew.
By Dover Studios
#389688
Hi Nova66.

Yes, it is mathematically possible to render directly to a stereo cubemap without first rendering a LatLong image and then doing a cubic re-projection. An example of this can be seen in the VR_Camera lens shader source code by Pedro Gomez on GitHub (scroll down to the code on lines 270-370):

https://github.com/pedrofe/vr_camera/bl ... Camera.cpp

At the end of the day, though, YouTube 360 / Vrideo / Kolor Eyes and most other panoramic viewing tools all have excellent LatLong playback support, but it is a bit of the wild west out there in cubemap land, as there isn't an across-the-board, standardized stereo cubemap format to rule them all.

The Gear VR cubemap format does have some issues: the extreme 12:1 aspect ratio makes poor use of the standard 0-1 UV space range, and unless everyone adopts the H.265 video codec fairly quickly, I can see an 18432x1536 px video file giving legacy video codecs a harder time than truly needed for playback and decoding of animations (compared to something like the cubemap 3x2 format in an over/under stereo layout).

Cubemap production wise, people are split between the Gear VR style "Carmack" horizontal strip format (which isn't even the same layout and orientation as the classical horizontal strip used by the old "mental ray cube1 horizontal strip" or the "Nvidia/DirectX" style horizontal strip mode) and the Cubemap 3x2 arrangement, which is yet another popular option. You can also find a pile of other cubemap layout variants, like the vertical strip, vertical cross, horizontal cross, vertical tee, and horizontal tee, that have been used extensively over the years for HDRI image based lighting and environment maps. Whew!

For converting between these different cubemap formats, my (free) Domemaster Photoshop Actions pack can come in handy:
http://www.andrewhazelden.com/blog/2012 ... ions-pack/
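A scripted route is also possible with ImageMagick's crop and montage commands, which ship with the toolbox. The rough sketch below slices a 6x1 horizontal strip into faces and reassembles them as a 3x2 layout; the face order and any per-face rotations are placeholders, since each cubemap format lays the faces out differently.
Code: Select all
# Rough sketch: reshuffle cube faces from a 6x1 strip into a 3x2 layout with ImageMagick
import subprocess

strip = 'cubemap_strip.png'  # hypothetical horizontal strip with six square faces

# Slice the strip into six equal tiles: face_0.png ... face_5.png
subprocess.call(['convert', strip, '-crop', '6x1@', '+repage', 'face_%d.png'])

# Reassemble the tiles into a 3x2 cubemap; reorder the face list as needed
subprocess.call(['montage',
                 'face_0.png', 'face_1.png', 'face_2.png',
                 'face_3.png', 'face_4.png', 'face_5.png',
                 '-tile', '3x2', '-geometry', '+0+0', 'cubemap_3x2.png'])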

On the desktop 360° media viewing front, the Whirligig player runs on Windows and can handle the Gear VR cubic format:
http://www.whirligig.xyz/player2/

You can select the different playback modes in the Whirligig GUI or use the following command line options to launch the player directly in the cubic mode:

Gear VR Cubic Stereo Side by Side imagery:
Code: Select all
"Whirligig64bit.exe" -feature "C:\image.jpg" -projection customformat -customformatsbs "Octane.obj" -eyeorder "rl"
Gear VR Cubic Mono imagery:
Code: Select all
"Whirligig64bit.exe" -feature "C:\image.jpg" -projection customformat -customformat "Octane.obj"
Cheers,
Andrew Hazelden
By BradT
#391468
I'm trying to write a PyMaxwell script to change the camera type to lat-long and adjust some parameters on a folder full of MXS files (an MXS sequence exported from 3dsMax). I've managed to change the camera type, but I'm having trouble figuring out how to access the lat-long parameters. For example, I'm trying to change the lat-long lens to "center" mode. The documentation isn't very helpful.

I've looked at Andrew's automagicLatLongStereo.py script, and can kind of see what's going on there, but I'm still missing some pieces. Also, his script isn't working for me either. It would be nice if the 3dsMax plugin would let you adjust the lat-long lens directly, but I guess that isn't implemented yet.

Has anyone already written such a script?
By BradT
#391505
I figured this out by copying the lens parameters from an MXS file that had the parameters set up as I wanted, then pasting those onto my target MXS files.

Here's the code, in case anyone else finds this useful. Apologies for how rough it is.
Code: Select all
################################################################
# Apply a Lat-Long lens model to a folder full of MXS files.
# Lat-Long lens parameters are copied from a prototype MXS file that you must provide.
# Prototype file only needs to contain a lat-long camera.
# **Please change file paths in Main Loop section to point to your files as described in comments below
################################################################

from pymaxwell import *
import os
from math import *
import sys

def read_prototype_camera(camPrototypePath):
	# Read the prototype scene and grab its lens params
	prototypescene = Cmaxwell(mwcallback)
	ok = prototypescene.readMXS(camPrototypePath)
	if ok == 0:
		print("Error reading camera prototype scene from file")
		return 0
	pcamera = prototypescene.getActiveCamera()
	pLensParams, ok = pcamera.getCameraLensExtensionParams()
	if ok == 0:
		print('There was an error reading the prototype lens params')
		return 0
	lensType = pLensParams.getByName('Type')
	print("type is: " + str(lensType))
	# Should print something like: type is: ([0], 0, 2, 1, 4, 1, True)

	return pLensParams


def read_edit_save_camera(file,inPath,outFolder,outMxs,pLensParams,xRes,yRes):


	# Read scene from disk
	scene = Cmaxwell(mwcallback)
	ok = scene.readMXS((inPath+'/'+file))
	
	if ok == 0:
		print("Error reading scene from file")
		return 0
	
	if not os.path.exists(outFolder):
		os.mkdir(outFolder)

	# Get active camera
	camera = scene.getActiveCamera()
	
	# Set lens type to lat-long stereo

	camera.setLensType(TYPE_EXTENSION_LENS)
	camera.setLensType(6)

	# Set lens params
	camera.applyCameraLensExtension(pLensParams)

	lensParams, ok = camera.getCameraLensExtensionParams()
	if ok == 0:
		print('There was an error changing the lens type')
		return 0
	lensType = lensParams.getByName('Type')
	print("type is: " + str(lensType))
	# Should print something like: type is: ([0], 0, 2, 1, 4, 1, True)

	#Set resolution
	ok = camera.setResolution(xRes, yRes)
	if ok == 0:
		print ("Error setting resolution")
		return 0

	# Save changes
	outPath = outFolder+'/'+file
	ok = scene.writeMXS(outPath)
	
	if ok == 0:
		print("Error saving "+outPath)
		return 0
	
	print(outPath)
	return 1

# This is the main loop. Change the path variables below to point to your files.
# camPrototypePath points to an MXS file that has the lat-long camera set up as you desire. The settings will be copied to all the target files.
# inPath is the folder containing the MXS files that you want to change.
# outFolder is the folder you want to dump the changed files to
# outMxs can be ignored. I was too lazy to remove this bit of test code.
# xRes and yRes are your target render dimensions in pixels.
if __name__ == "__main__":
	camPrototypePath = "D:/Projects/PythonProjects/PiMaxwell/Latlongstereo_center.mxs"
	inPath = "//CORE2/DFSProjects/FasterThanLight/Prod/Shots/FTL_ReachingExtremeSpeeds_SaturnFlyby/3D/3dsMax/renderoutput/ScriptTest/"
	outFolder = '//CORE2/DFSProjects/FasterThanLight/Prod/Shots/FTL_ReachingExtremeSpeeds_SaturnFlyby/3D/3dsMax/renderoutput/ScriptTest/ScriptOut'
	outMxs = 'camera_edited.mxs'
	xRes = 4096
	yRes = 2048

	pLensParams = read_prototype_camera(camPrototypePath)
	if pLensParams == 0:
		sys.exit("Error: could not read the prototype camera file; check camPrototypePath")

	mxslist = getFilesFromPath(inPath,'mxs')
	print ("files in: "+inPath+" "+str(mxslist))
	for file in mxslist:
		ok = read_edit_save_camera(file,inPath,outFolder,outMxs,pLensParams,xRes,yRes)
		if ok == 0:
			print("Error processing " + inPath + '/' + file)
		else:
			print("Processed " + inPath + '/' + file)

	print("Finished")
