Yup, thinking about upgrading my hardware. I use Maya + Maxwell, and it appears that the weakest part of my PC is the graphics card, with only 4GB of RAM. Is there a budget-friendly card with 8GB of RAM that's good for Maya and Maxwell? I'd like to hear from personal experience, as some of my scenes are quite heavy with trees, plants, etc. Thanks!
I have both a Quadro card and a GTX card on two different machines. After years of using Quadros, I decided to try a GTX for the greater number of CUDA cores (because I discovered that when it comes to Quadro, Nvidia's policy seems to be the more you spend, the more CUDA cores you get). In fact, I had vowed never to buy another Quadro card again after years of sitting around waiting for renders to finish, only to discover I could have waited less had I bought a GTX (more CUDA cores for my buck).

Generally, I'm pleased with the performance of the GTX vs. the Quadro (in all my 3D apps). However, there are subtle (and sometimes annoying) differences.

Re: Maxwell Studio: initially, I thought a Quadro wasn't necessary at all. Yet I was frequently experiencing weird glitches when navigating the viewport --things jumping around, for instance (i.e., a lack of precision). I thought it was me, until I realized it never happened on my Quadro machine.

So, that's my experience with the two cards. On the surface, mostly similar. But beneath the surface there are subtle, annoying issues to be on the lookout for.

My next graphics card upgrade will most likely be a Quadro. But now that I know from experience that a GTX will give me faster rendering performance, I'm going to stick to the lower-end Quadros, i.e., just enough Quadro to handle reasonable poly counts, modeling, and viewport navigation tasks. I'll leave the rendering, and the CUDA cores, to my GTX.

Hope that helps!
BTW, I'm not advocating purchasing both cards (you may as well go for a higher-end Quadro). I only have the two because I thought I was going to switch to GTX forever. What I realized is that fluid viewport navigation, modeling, etc. was more important to me. I can leave rendering performance to the CPUs (especially given the rapidly growing core counts in systems today) --thank you, Maxwell. :D

So, this card will likely be my only GTX card. But I'll stick to the lower-end Quadro cards (or their alternative :wink: ) from now on.

Typically, when it comes to geometry heavy scenes I prefer to "link" individual models into the final scene (MXS references) for the faster viewport performance. Basically, I open the individual model, say a tree, texture it, save it, and then link it into my scene, say a landscape (multiple trees, all referenced).
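For anyone new to that linking workflow, here's a minimal sketch of what it looks like through Maya's Python API. The path and namespace are made up for illustration, and this assumes you're running inside Maya, since `maya.cmds` only exists in Maya's embedded interpreter:

```python
def link_model(scene_path, namespace):
    """Reference (link) a saved, textured model into the current scene,
    rather than importing its geometry outright. The reference stays a
    lightweight pointer to the source file, so the viewport stays fast
    and every linked copy picks up edits made to the original model."""
    # Imported inside the function because maya.cmds is only available
    # in Maya's embedded Python interpreter.
    import maya.cmds as cmds
    return cmds.file(scene_path, reference=True, namespace=namespace)

# Example (hypothetical path): link one textured tree into a landscape scene.
# link_model("assets/trees/manna_gum.mb", namespace="tree01")
```

MXS references in Maxwell Studio work on the same principle: the scene stores a pointer to the saved model file instead of a copy of its geometry.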

Now, for what it's worth (and if this experience helps your decision process in any way) . . . I think geometry dense Maxwell scenes tend to respond better on systems with more CPU cores.

Here's why:
I use The Grove 3D trees a lot --'cause I like them :D . However, these are dense, geometry-heavy trees made for realism. I'm talking hundreds of leaves, hundreds of little berries on those leaves sometimes, and twigs upon twigs (all instances in Maxwell, of course).

One tree in particular, the Manna Gum tree is like molasses to load up in Maxwell on my GTX machine. Spinning wheel . . . Maxwell not responding . . . for minutes . . . every time . . . did I say every time . . . I so much as select a leaf on this tree.

On the other hand, I can open that same tree up on my Quadro machine and it handles it like kid's play. No lag. No Maxwell not responding spinning wheel. Nothing but "I got this".

Here's the thing: my Quadro card is a 2GB card. My GTX card is an 8GB card. The system memory on my Quadro machine is 16GB. The system memory on my GTX machine is 24GB.

So, what gives here? Well, the Quadro machine has dual Xeons --a 12 core 24 thread machine it is. The GTX machine is just a 4 core 8 thread i7 baby of a machine (operating at a faster clock speed than the Xeons --likely because it wants to grow up faster).

If I had to guess, it's those CPU cores doing all the heavy lifting with the Manna Gum tree.

Just a thought. :)
Hope it helps.
You can test this theory by keeping an eye on CPU usage in Task Manager while you move the model around.
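If you want something a bit more precise than eyeballing the Task Manager graph, here's a rough stdlib-only sampler. It reads Linux's `/proc/stat`, so it's a sketch for Linux boxes; on Windows, Task Manager itself or the third-party psutil package gives you the same number. Run it in a terminal while you orbit or select parts of the heavy model:

```python
import time

def cpu_times():
    # First line of /proc/stat: aggregate CPU jiffies per state
    # (user, nice, system, idle, iowait, irq, softirq, ...).
    with open("/proc/stat") as f:
        values = list(map(int, f.readline().split()[1:]))
    idle = values[3] + values[4]  # idle + iowait
    return idle, sum(values)

def cpu_percent(interval=0.5):
    """Percentage of non-idle CPU time over the sampling interval."""
    idle1, total1 = cpu_times()
    time.sleep(interval)
    idle2, total2 = cpu_times()
    busy = (total2 - total1) - (idle2 - idle1)
    return 100.0 * busy / max(total2 - total1, 1)

if __name__ == "__main__":
    for _ in range(5):
        print(f"CPU: {cpu_percent():5.1f}%")
```

If the readout pins near 100% the moment the viewport starts chugging, the CPU, not the graphics card, is the bottleneck for that model.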

My bet, however, is on the difference between Quadro and GeForce drivers. Depending on the specific model cards you have, the drivers used, and the Maya version, there can be a very big difference in viewport performance. This is less of a problem on the newest GTX series cards now that nVidia has their mid-tier Creator Ready driver series --so don't take this to mean that Quadro is always faster-- but in the past that could certainly be the case.
Yep. Well, that's just awesome (massive nVidia eyeroll on my part). :roll:

So, it appears that my old 2GB Quadro (2000 series) is tail whipping my 8GB GTX 1080 all around in the Maxwell viewport. That's some serious shady tampering under the hood of those drivers on nVidia's part. :twisted:

Updated note to self: go back to Quadro! And pick up some flowers on the way. Just don't spend too much money expecting to get a negligible render boost out of it! (GTX and/or more CPU cores will likely still outshine on that front.)

So what is the conclusion... get a GTX 1080 and a proper driver? Or settle for a new 4GB Quadro?
Viewport performance is the most important thing to me.
I was also thinking that using an 8GB card will allow using the GPU on hi-res denoise renders, but that's not as important to me as the viewport.
choo-chee wrote:
Thu Jun 13, 2019 1:40 pm
So what is the conclusion... get a GTX 1080 and a proper driver? Or settle for a new 4GB Quadro?
Viewport performance is the most important thing to me.
I was also thinking that using an 8GB card will allow using the GPU on hi-res denoise renders, but that's not as important to me as the viewport.
If your goal is good viewport performance in Maya and non-terrible GPU rendering performance in Maxwell, get the best Pascal series card you can afford. That's a GTX 1080, 1080 Ti, or Titan Xp, and use the Studio drivers.

The newer Turing series cards do add quite a bit of render performance, but they are not supported by Maxwell, and there's no ETA on when they will be.

To me, Quadro only makes sense these days in a few specific scenarios --if you are using applications that still greatly benefit from them, such as Solidworks or similar CAD programs. The more animation-focused programs like Maya, Max, and Cinema do not benefit from Quadros substantially enough to warrant the price premium.
