GPU Context Switching in Vista. . .

Geo

. . .not so hot.

http://www.apcstart.com/site/dwarne/2006/06/193/windows-graphics-system-to-be-overhauled

Steve Pronovost of Microsoft’s DirectX team disclosed at WinHEC in Seattle last week that although the WDDM 1.0 introduces some rudimentary task scheduling for GPUs, a new generation of GPU hardware and a major architectural change to the way Windows deals with video cards [i.e. WDDM 2.0 and WDDM 2.1 --geo] will be required.

:sigh: I'm trying to decide if I'm surprised or not. On the one hand, I can see where this would be a new issue driven by having 3D available to the desktop apps.

On the other. . . well, Duh! Certainly the graphics IHVs have been slobbering for quite a long time now over what that would mean, and nobody thought to raise a hand on this issue early on and get it addressed? You see, OS 101. . it's called "Windows" for a reason.

So, having had my mini-rant, how serious is this issue on the hardware side? Is it "just" a driver issue, or is it going to require more hardware support that means backwards compatibility is going to be a significant issue? For extra credit, discuss whether the USC/scheduler concepts have any relevance at all (plus or minus) for this kind of context switching/resource sharing to multiple simultaneous 3D apps. And whether we might actually see some graphics IHV marketing/competition based on this factor. . .
 
Apart from WDDM 2.1 I'm not aware of there being anything new in that article...??

Or, if it's not been made much of, it's not actually that new... it might not be spelled out, but it's in the slide deck for PDC 2005 ;)

I'd imagine that full-screen games can still drive the full (or near full) potential of the GPU - be it a current or next-gen part. It's when you're back in windowed mode that it'll show if anywhere.

Given the relatively level playing field for features I'm totally expecting performance to become the new big thing to compare on. Thus the ability for native/hardware context switching becomes an important factor.

hth
Jack
 
geo said:
. . .not so hot.

http://www.apcstart.com/site/dwarne/2006/06/193/windows-graphics-system-to-be-overhauled



:sigh: I'm trying to decide if I'm surprised or not. On the one hand, I can see where this would be a new issue driven by having 3D available to the desktop apps.

On the other. . . well, Duh! Certainly the graphics IHVs have been slobbering for quite a long time now over what that would mean, and nobody thought to raise a hand on this issue early on and get it addressed? You see, OS 101. . it's called "Windows" for a reason.

So, having had my mini-rant, how serious is this issue on the hardware side? Is it "just" a driver issue, or is it going to require more hardware support that means backwards compatibility is going to be a significant issue? For extra credit, discuss whether the USC/scheduler concepts have any relevance at all (plus or minus) for this kind of context switching/resource sharing to multiple simultaneous 3D apps. And whether we might actually see some graphics IHV marketing/competition based on this factor. . .

WDDM 1.0 = D3D 9 GPU on Vista
WDDM 2.0 = D3D 10 GPU on Vista

Both are built into Windows Vista when you buy it.

WDDM 2.1 will just use D3D 10 on Vista and will probably ship as an update later on.

D3D 10 needs a brand new GPU and is Vista-only anyway, so I don't really see a problem here.

Most new machines will come with D3D 10 GPUs in 2007 (Intel, Nvidia, or ATI).
 
Well, one could wonder if G80/R600 are the generation that ATI/NV are targeting for WDDM 2.0/2.1. That would be something. But the article says there's no timeline for WDDM 2.0 yet anyway, so it won't be something we will be seeing for Vista launch anyway (which is implied by the name anyway). And if it's that far out, I have to wonder if the hardware compatibility is being targeted for the next next gen (or at least refresh) rather than G80/R600.
 
geo said:
Well, one could wonder if G80/R600 are the generation that ATI/NV are targeting for WDDM 2.0/2.1. That would be something. But the article says there's no timeline for WDDM 2.0 yet anyway, so it won't be something we will be seeing for Vista launch anyway (which is implied by the name anyway). And if it's that far out, I have to wonder if the hardware compatibility is being targeted for the next next gen (or at least refresh) rather than G80/R600.

Well, unless something huge changed, WDDM 2.0 was to ship with Vista, and since Vista is being delayed, well...

D3D 10 is already built right into Vista so it makes sense that WDDM 2.0 would as well.

R600 should be a D3D 10 GPU.
 
Docwiz said:
WDDM 1.0 = D3D 9 GPU on Vista
WDDM 2.0 = D3D 10 GPU on Vista

Both are built into Windows Vista when you buy it.

WDDM 2.1 will just use D3D 10 on Vista and will probably ship as an update later on.

D3D 10 needs a brand new GPU and is Vista-only anyway, so I don't really see a problem here.

Most new machines will come with D3D 10 GPUs in 2007 (Intel, Nvidia, or ATI).

Oh, right. Now that you mention it, I do recall two levels based on DX9 and DX10. . . tho not that context switching on the desktop would be crappy under the first. In fact, I thot the point of the first was to be the desktop-3D enabler. And if WDDM 2.0 is "built in to windows Vista when you buy it". . .then why is there no timeline for it?

Does this say something about DX10 avail then as well? Are WDDM 2.0 and D3D10 (that was for the purists, but don't expect me to keep it up!) linked inextricably? Or not?

Pronovost did not say when Microsoft expected WDDM 2.0 and 2.1 to be introduced into Windows, and Microsoft said it “did not have any information to share” in response to an APC enquiry.
 

geo said:
Oh, right. Now that you mention it, I do recall two levels based on DX9 and DX10. . . tho not that context switching on the desktop would be crappy under the first. In fact, I thot the point of the first was to be the desktop-3D enabler. And if WDDM 2.0 is "built in to windows Vista when you buy it". . .then why is there no timeline for it?

Does this say something about DX10 avail then as well? Are WDDM 2.0 and D3D10 (that was for the purists, but don't expect me to keep it up!) linked inextricably? Or not?

Well, as far as my knowledge goes (people can correct me if I am wrong), D3D 10 is tied to the new Windows Display Driver Model (WDDM), and that is 2.0 for D3D 10 and 1.0 for D3D 9 on Vista.

So if D3D 10 is built into Windows Vista, WDDM 2.0 should be built in as well.
 
Docwiz said:
Well, as far as my knowledge goes (people can correct me if I am wrong), D3D 10 is tied to the new Windows Display Driver Model (WDDM), and that is 2.0 for D3D 10 and 1.0 for D3D 9 on Vista.

So if D3D 10 is built into Windows Vista, WDDM 2.0 should be built in as well.

Have we had a commitment from MS on that point in the last year? That D3D10 would ship with Vista? I recall various speculation on that point over this "ridiculous is not just in sight, it's in the rear view mirror" odyssey on the way to Vista. . . but not recently.
 
geo said:
Have we had a commitment from MS on that point in the last year? That D3D10 would ship with Vista? I recall various speculation on that point over this "ridiculous is not just in sight, it's in the rear view mirror" odyssey on the way to Vista. . . but not recently.

Well it is one of the big features in Vista. Not just desktop composition but D3D 10 support as well. Some of the slides around the Internet from PDC 05 or maybe GDC 06 should have some information about this.
 
Docwiz said:
Well it is one of the big features in Vista. Not just desktop composition but D3D 10 support as well. Some of the slides around the Internet from PDC 05 or maybe GDC 06 should have some information about this.
Well, considering that several builds have already had DirectX 10 in them, I'm quite confident D3D 10 is built in too :D
 
Kaotik said:
Well, considering that several builds have already had DirectX 10 in them, I'm quite confident D3D 10 is built in too :D

Hey, "from your lips to God's ears". . .but if MS is refusing to confirm that now in public as the linked article suggests, then I've got to wonder if they are still keeping their options open.
 
"parallel engine" looked interesting. Made me wonder if it is a subset of the context switching fix for 2.0. . .
 
geo said:
So, having had my mini-rant, how serious is this issue on the hardware side? Is it "just" a driver issue, or is it going to require more hardware support that means backwards compatibility is going to be a significant issue? For extra credit, discuss whether the USC/scheduler concepts have any relevance at all (plus or minus) for this kind of context switching/resource sharing to multiple simultaneous 3D apps. And whether we might actually see some graphics IHV marketing/competition based on this factor. . .
Context switching will require hardware support, and it's likely that no existing chip has it. Current GPUs process entire draw packets sequentially; but what if the OS wants to switch apps? If the draw packet is long and there is no context-switching support in the GPU, the OS won't get what it wants in a timely manner.

Other ways to partially support context switching are dual-core GPUs and multi-GPU configurations. Of course, here the number of threads would be limited to the number of cores/GPUs. Maybe this will allow multi-GPU systems to actually improve Windows' responsiveness, so they would be good for more than just games.
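To make the granularity point concrete, here's a toy model (illustrative only, not real driver code; all names and numbers are made up) of the extra latency the OS eats when the GPU can only switch contexts at draw-packet boundaries:

```python
# Toy model: preemption latency when the GPU can only switch contexts
# at draw-packet boundaries vs. mid-packet (hardware context switching).

def switch_latency(packet_durations_us, request_time_us, mid_packet=False):
    """Return how long after request_time_us the context switch can happen."""
    t = 0.0
    for dur in packet_durations_us:
        if t + dur > request_time_us:        # request arrives during this packet
            if mid_packet:
                return 0.0                    # fine-grained hardware: switch now
            return (t + dur) - request_time_us  # must drain the packet first
        t += dur
    return 0.0                                # queue already drained

# One 50 ms (50,000 us) packet, switch requested 1 ms in:
print(switch_latency([50000.0], 1000.0))                    # → 49000.0
print(switch_latency([50000.0], 1000.0, mid_packet=True))   # → 0.0
```

With a single long packet, the OS waits almost the packet's whole duration; with mid-packet preemption it gets the GPU back immediately, which is the difference WDDM 2.x is after.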
 
geo said:
Is it "just" a driver issue, or is it going to require more hardware support that means backwards compatibility is going to be a significant issue?

With any luck, these fancy-schmancy PCI-Express mainboards we were all forced to upgrade to will suddenly be made obsolete for a new interface standard and additional hardware support. :D
 
geo said:
"parallel engine" looked interesting. Made me wonder if it is a subset of the context switching fix for 2.0. . .
This stands out:

The Windows Vista operating system will include native support for multiple graphics accelerators through an ATI sponsored technology called Linked Adapter. Linked Adapter will treat multiple graphics accelerators as a single resource (GPU and memory), and working together with parallel engine support, schedule the most efficient workload possible across the graphics processors and graphics memory pool to maximize performance.
Does that signal the end of the road for mobo-chipset-specific multi-card configurations?
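The "single resource" idea in that quote amounts to a scheduler placing work across a pool of linked GPUs. A hypothetical sketch (not ATI's actual algorithm; names and numbers invented) of a greedy least-loaded assignment:

```python
# Hypothetical sketch of scheduling work items across a pool of linked
# GPUs treated as one resource: each job goes to the least-loaded GPU.
import heapq

def schedule(workloads_ms, num_gpus):
    """Greedy least-loaded assignment; returns sorted per-GPU finish times."""
    pool = [(0.0, gpu) for gpu in range(num_gpus)]  # (busy_until, gpu_id)
    heapq.heapify(pool)
    for w in sorted(workloads_ms, reverse=True):    # biggest jobs first
        busy, gpu = heapq.heappop(pool)             # least-loaded GPU
        heapq.heappush(pool, (busy + w, gpu))
    return sorted(t for t, _ in pool)

print(schedule([8, 7, 6, 5, 4], 2))  # → [13.0, 17.0]
```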

Jawed
 
Jawed said:
This stands out:


Does that signal the end of the road for mobo-chipset-specific multi-card configurations?

Jawed

To me that would rather point to the IGP being used for GUI rendering and the PCI-E card being used for 3D rendering...
At least that would be nice.
 
Jawed said:
Is a context analogous to a render state?

Jawed
A context in this case is everything that is needed to suspend (and later resume) execution of the rendering commands submitted by a certain OS thread/application. Until now there was no limit on how long such a context switch may take, so the driver and GPU were allowed to complete all pending operations and thereby minimize the amount of state that has to be saved and restored.

If a context switch has to complete in a few microseconds, the GPU might need to interrupt execution of very long shaders and save the current registers and other GPU thread data in memory.
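A rough sketch of what "everything needed to suspend and resume" might look like (hypothetical structures and field names, not any real driver's API):

```python
# Illustrative model of a GPU context: the state that must be captured so
# in-flight work can be interrupted mid-shader and resumed later.
from dataclasses import dataclass, field

@dataclass
class GpuThreadState:
    program_counter: int    # where in the shader this thread stopped
    registers: list         # live register file contents

@dataclass
class GpuContext:
    render_state: dict      # pipeline state (blend mode, bound textures, ...)
    command_offset: int     # position reached in the submitted command buffer
    threads: list = field(default_factory=list)  # suspended shader threads

def suspend(ctx, pc, regs, offset):
    """Snapshot in-flight work instead of waiting for it to drain."""
    ctx.command_offset = offset
    ctx.threads.append(GpuThreadState(pc, list(regs)))
    return ctx

def resume(ctx):
    """Hand the saved threads back; each continues from its saved PC."""
    return [(t.program_counter, t.registers) for t in ctx.threads]
```

The point of the model: under the old rules only `render_state` and `command_offset` ever needed saving (everything else was drained first); fast preemption additionally forces the per-thread snapshots, which is where new hardware support comes in.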
 
Xmas said:
If a context switch has to complete in a few microseconds, the GPU might need to interrupt execution of very long shaders and save the current registers and other GPU thread data in memory.

Does the USC concept come in here at all? Say I have two business apps with 3D-y interfaces crunching away and making purty pictures. Does USC mean it's easier for the GPU to assign, say 2/3rds of the units to the active window while keeping 1/3 going on the other window?
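The kind of split described above can be sketched as a toy allocator (purely illustrative; the unit counts and weights are made up, and real hardware need not work this way):

```python
# Toy sketch: with a unified shader array, "sharing" between two windows
# could be spatial partitioning of the ALUs by weight, e.g. 2:1.

def partition_units(total_units, weights):
    """Split shader ALUs by integer weights (largest-remainder method)."""
    wsum = sum(weights)
    alloc = [total_units * w // wsum for w in weights]
    leftover = total_units - sum(alloc)
    # hand any leftover units to the contexts with the largest remainders
    order = sorted(range(len(weights)),
                   key=lambda i: (total_units * weights[i]) % wsum,
                   reverse=True)
    for i in order[:leftover]:
        alloc[i] += 1
    return alloc

print(partition_units(48, [2, 1]))  # → [32, 16]
```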
 