By Polyxo
#393073
Hi All,
I'm about to buy a new computer and would like to be ready for GPU rendering and editing with 10-bit capable UHD monitors.

Being new to GPU rendering I have some questions:
I read that it is common in GPU rendering to use a dedicated GPU for monitor display, while the other GPUs do nothing but render.
Or is there a way to reserve resources for viewport display on any GPU (similar to the way one can limit the number of CPU threads a
renderer uses)?

If the first statement is correct, one would likely be better off using two GTX 1070s instead of one Titan X.
Then again, Maxwell 4 users would equally likely want to use both of their GTX 1070s for rendering, as soon as Maxwell supports this...


Which leads me to the next question:
Is there any motherboard with an onboard graphics chip good enough to at least serve as a fallback solution (in a multi-monitor,
partly UHD setup)? Obviously one would not do heavy image editing while all dedicated GPUs are rendering...

Finally, I would like to use 10-bit displays for the first time.
That feature had been reserved for Quadro cards until recently – this seems to have changed with the Nvidia 10-series.
Can someone with 10-bit displays and a 10-series GeForce card confirm that this feature actually works?
By itsallgoode9
#393079
Yep, you'll want to use a separate GPU for your monitor, although V4 doesn't yet let you do that :roll:

I have a 1080 and 10-bit is showing as enabled in Photoshop. I haven't done any tests to make sure it actually IS working, but it is at least showing as working.
By Polyxo
#393080
itsallgoode9 wrote: Yep, you'll want to use a separate GPU for your monitor...
So there's no way to reserve a fraction of the GPU capacity for the display?
As I'm not hardcore rendering around the clock, I'm hesitant to spend hundreds of dollars on a card
which sits idle most of the time – hence my question.
itsallgoode9 wrote: I have a 1080 and 10-bit is showing as enabled in Photoshop. I haven't done any tests to make sure it actually IS working, but it is at least showing as working.
Thanks. I believe that checkbox inside Ps isn't a suitable success indicator, though.
I can also set this checkbox active with the 8-bit-per-channel display pipeline I currently have.

AFAIK one actually needs to turn this on in the video card settings, but the respective pulldown menu
is only shown if one has a 10-bit capable UHD monitor.
By itsallgoode9
#393081
Polyxo wrote: So there's no way to reserve a fraction of the GPU capacity for the display? ... AFAIK one actually needs to turn this on in the video card settings, but the respective pulldown menu is only shown if one has a 10-bit capable UHD monitor.
I'm not sure how Maxwell has handled it, but the other GPU renderers I've used don't leave anything in reserve unless you lower the priority level, which cuts speed by 25-50%, so even then it's not ideal. As far as I'm concerned, V4 is an alpha, considering it's missing very key features, so I haven't touched it. Either way, I DO know it doesn't have multiple-GPU support yet, so this isn't even an option right now.
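
For what it's worth, CUDA-based renderers in general can be confined to specific GPUs by hiding devices from the CUDA runtime before the process starts. A minimal sketch in Python; `render_cli` and `scene.mxs` are hypothetical placeholders for whatever launches your renderer, while CUDA_VISIBLE_DEVICES itself is a standard CUDA environment variable:

```python
import os
import subprocess

# CUDA_VISIBLE_DEVICES hides GPUs from any CUDA program launched with
# this environment. "1,2" exposes only the second and third GPUs,
# leaving device 0 (typically the display GPU) untouched.
env = os.environ.copy()
env["CUDA_VISIBLE_DEVICES"] = "1,2"

# "render_cli scene.mxs" is a hypothetical invocation -- substitute
# whatever command starts your GPU renderer.
subprocess.run(["render_cli", "scene.mxs"], env=env)
```

The renderer then only sees the devices listed, so the display GPU stays responsive.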

As for 10-bit on the GPU, I'll check that when I get home. When I was using a Quadro before, 10-bit was auto-enabled if you had a 10-bit monitor and there was no setting to manually turn it on or off, so I didn't even look in the GeForce settings. I'll check tonight.
By luis.hijarrubia
#393082
In all the time I've spent testing the product during development, I haven't noticed any freezing of the Windows interface while doing a GPU render. I can move the mouse, use other programs, surf the net and so on with no problem.
By itsallgoode9
#393083
luis.hijarrubia wrote: In all the time I've spent testing the product during development, I haven't noticed any freezing of the Windows interface while doing a GPU render. I can move the mouse, use other programs, surf the net and so on with no problem.
I stand corrected :)
By Mihai
#393089
This greyscale ramp should be a good way to test if PS is actually working in 10-bit. It should look pretty smooth; on my monitor I can definitely see the banding.

https://www.dropbox.com/s/hcj0ds5uswq40 ... p.zip?dl=0
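
In case the link goes stale: a comparable ramp is easy to generate yourself. A minimal sketch, assuming the numpy and Pillow packages; it writes a 16-bit greyscale PNG whose steps are finer than 8-bit can represent, so an 8-bit pipeline shows banding where a true 10-bit one stays much smoother:

```python
import numpy as np
from PIL import Image

# A horizontal ramp with 4096 distinct grey levels -- more than the
# 256 steps an 8-bit pipeline can display and more than the 1024 of
# a 10-bit one, so any quantization shows up as visible banding.
width, height = 4096, 512
ramp = np.linspace(0, 65535, width).astype(np.uint16)
gradient = np.tile(ramp, (height, 1))

# Write a 16-bit greyscale PNG; open it in Photoshop at 100% zoom.
Image.fromarray(gradient, mode="I;16").save("ramp16.png")
```

Viewed at 100% in Photoshop with 30-bit display enabled, the banding should largely disappear on a working 10-bit pipeline.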

What I'm curious about, though, is in what situations this 10-bit precision will be of use. You will be able to see a smoother gradient, especially in shadow areas, but others on their 8-bit setups won't. You might actually decide to spend time working on details in a shadow area because it looks interesting on your monitor, yet 99% of your viewers won't see it and it will be a waste of time. As for printing, as I understand it, printers only have 8-bit drivers anyway...?
By Polyxo
#393105
Mihai,
Mihai wrote: This greyscale ramp should be a good way to test if PS is actually working in 10-bit. It should look pretty smooth; on my monitor I can definitely see the banding.
You have a 10-bit display and have it plugged in via DisplayPort?
Mihai wrote:You might actually decide to spend time working on details in a shadow area because it looks interesting on your monitor, but for 99% of your viewers they won't see that and it will be a waste of time.
There's at least no danger of introducing something which looks great on 10-bit and poor on 8-bit.
In the past I have run into quite a few banding problems, and I found it difficult to trust that things
would look OK when printed. Having a bit of headroom sure doesn't hurt.

I figured that if one goes for a high-res screen, has a good GPU and uses 10-bit capable software
anyway, it would be silly not to go that route.
By itsallgoode9
#393118
Polyxo wrote: Thanks for the gradient. Nasty banding on my old monitor...
Strange, I have 10-bit turned on in the GeForce settings, 30-bit turned on in Photoshop, and I'm connected via DisplayPort to a 10-bit compatible monitor, but I'm still seeing bad banding. Is there somewhere else that could be overriding this and keeping the 10-bit option from being on?
By dk2079
#393119
itsallgoode9 wrote: Is there somewhere else that could be overriding this and keeping the 10-bit option from being on?
HDMI connected?

I think only the latest specification of HDMI allows 10-bit data, if at all...

You need DisplayPort.

*edit

Sorry, missed that you actually are connected via DisplayPort. Maybe check your monitor's hardware manual; I guess you have to enable the 10-bit mode in its menu somewhere.

And HDMI 2.0 can transport 10-bit.
By Mihai
#393129
Polyxo wrote: You have a 10-bit display and have it plugged in via DisplayPort?
Sorry, I meant to say that I expected it to show banding, since I'm not using a 10-bit monitor.
itsallgoode9 wrote: Strange, I have 10-bit turned on in the GeForce settings, 30-bit turned on in Photoshop, and I'm connected via DisplayPort to a 10-bit compatible monitor, but I'm still seeing bad banding. Is there somewhere else that could be overriding this and keeping the 10-bit option from being on?
I've read that maybe you need some special drivers from Nvidia? And apparently, if you're on OSX, 10-bit display is not available even in Photoshop.
By dmeyer
#393273
The Quadro+Tesla combination technically lets you choose which GPU to use for compute in the drivers, but Maxwell does not seem to respect this setting and uses the display GPU anyway.

But if in the future it does, a common setup is a lower-end Quadro (4000 series) for display and Teslas for compute.
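
As an aside, you can see which of your GPUs is actually driving a display (and is therefore the one a renderer should ideally leave alone) with NVIDIA's NVML bindings. A minimal sketch, assuming the `pynvml` package is installed:

```python
import pynvml  # NVIDIA Management Library bindings (nvidia-ml-py)

pynvml.nvmlInit()
try:
    for i in range(pynvml.nvmlDeviceGetCount()):
        handle = pynvml.nvmlDeviceGetHandleByIndex(i)
        name = pynvml.nvmlDeviceGetName(handle)
        if isinstance(name, bytes):  # older bindings return bytes
            name = name.decode()
        # Non-zero means a display is attached and active on this GPU,
        # i.e. the card you would want to leave out of compute work.
        display = pynvml.nvmlDeviceGetDisplayActive(handle)
        util = pynvml.nvmlDeviceGetUtilizationRates(handle).gpu
        print(f"GPU {i}: {name} | display active: {bool(display)} | "
              f"utilization: {util}%")
finally:
    pynvml.nvmlShutdown()
```

nvidia-smi prints much the same information from the command line.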
By Mihai
#393763
I've recently changed my monitor and video card (got a GTX 1070), and after reading some more about 10-bit output over DisplayPort 1.2, it seems Nvidia blocks this on the gaming cards: 10-bit display is only available in DirectX, in fullscreen, not in OpenGL. So even if you turn on 10-bit in the Nvidia settings you won't see a change. You need the Quadro line of cards for it to work in Photoshop, AE etc.

I don't have a true 10-bit monitor, it's a "fake" 10-bit one (8-bit + dithering), but I should still see a definite improvement in PS if the card actually did output 10 bits. Maybe there's an Nvidia hack somewhere...
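
One way to check what the driver actually grants OpenGL applications, independent of Photoshop, is to request a 10-bit default framebuffer and read back what you got. A minimal sketch, assuming the pyGLFW (`glfw`) and PyOpenGL packages; whether the driver honors the request is exactly what's in question here:

```python
import glfw
from OpenGL.GL import GL_RED_BITS, glGetIntegerv

glfw.init()
# Ask for 10 bits per color channel (plus 2 alpha) -- the classic
# "30-bit color" pixel format that 10-bit OpenGL apps need.
glfw.window_hint(glfw.RED_BITS, 10)
glfw.window_hint(glfw.GREEN_BITS, 10)
glfw.window_hint(glfw.BLUE_BITS, 10)
glfw.window_hint(glfw.ALPHA_BITS, 2)
glfw.window_hint(glfw.VISIBLE, glfw.FALSE)  # no window needs to show

window = glfw.create_window(64, 64, "10-bit probe", None, None)
glfw.make_context_current(window)

# GL_RED_BITS is a legacy query, fine in the default compatibility
# context GLFW creates. A driver that blocks 10-bit OpenGL reports 8.
print("red bits granted:", glGetIntegerv(GL_RED_BITS))
glfw.terminate()
```

On GeForce drivers of this era you would expect it to print 8 even with 10-bit enabled in the control panel, which matches what you're describing.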
By Polyxo
#393770
I can confirm your observations.
There has been some buzz about Nvidia lifting the 10-bit OpenGL restriction on recent GeForce cards, but that's all nonsense.

Having a 1070 too, plus 10-bit capable monitors, I am able to enable a 10-bit pipeline in the graphics card settings and could set 30-bit
active inside Photoshop as well. People with similar hardware probably concluded from this that they were actually running at 10-bit.
They should have loaded a test gradient into Photoshop before posting nonsense.

I finally decided to stick with 8-bit, as very few apps aside from Photoshop support a 10-bit display pipeline and I want to move away from Adobe anyway.
For my use case I find the expense of the better (5000/6000+) Quadros hard to justify – they alone cost a lot more than a very decent Windows desktop
machine plus good monitors.