Nvidia GT300 core: Speculation

If history tends to repeat itself, that would mean that GT300 will be successful while the R8xx family will be plagued by driver problems, bad design decisions, other unanticipated problems not directly related to AMD/ATI, and, oh, did I mention terrible drivers?

I agree about the terrible drivers bit. I can't stand the ATi drivers. They are buggy, cause stability problems, AND I hate that god-awful CCC, which requires .NET. Or maybe they finally became sane and it no longer requires .NET? I don't know. All I know is that their hardware is awesome. I got my brother a 1 GB 4870 and he loves it.

I think Nvidia needs to get their shit together and cater to my demands of awesomeness, dammit.

PS: I am not an Nvidia fanboy by any means. It's just that I have an SLI motherboard, and when I bought it, Nvidia was rocking it. Therefore, unless I upgrade my whole machine, I have no choice but to buy Nvidia video cards.
 
It just appears that since G80, nVidia has been focusing on GPGPU far more than on graphics alone.
Currently, Cuda/PhysX/OpenCL/DXCS all seem to be coming along nicely, mostly because of the groundwork laid by the G80 a few years ago.
So I wonder if GT300 will put the focus back on graphics, or if nVidia will continue to focus on GPGPU.
ATi only recently caught up with nVidia in terms of graphics, and hasn't quite caught up in terms of GPGPU yet... I wonder whether that had any influence on nVidia's course with GT300 at all.
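For anyone wondering what that groundwork actually looks like from the programming side, here's a rough sketch of a minimal OpenCL vector add (my own illustration, not from any vendor SDK; error checking stripped for brevity, and it assumes an OpenCL 1.0 SDK and a GPU device):

```cpp
// Minimal OpenCL vector add: the kind of data-parallel kernel that
// Cuda, OpenCL and DXCS all express in roughly the same shape.
#include <CL/cl.h>
#include <cstdio>

static const char* kSrc =
    "__kernel void vadd(__global const float* a,\n"
    "                   __global const float* b,\n"
    "                   __global float* c) {\n"
    "    size_t i = get_global_id(0);\n"
    "    c[i] = a[i] + b[i];\n"
    "}\n";

int main() {
    const size_t n = 1024;
    static float a[1024], b[1024], c[1024];
    for (size_t i = 0; i < n; ++i) { a[i] = (float)i; b[i] = 2.0f * i; }

    // Grab the first platform and its first GPU device.
    cl_platform_id plat; clGetPlatformIDs(1, &plat, NULL);
    cl_device_id dev;    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);
    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

    // Upload inputs, build the kernel, run one work-item per element.
    cl_mem da = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR, sizeof(a), a, NULL);
    cl_mem db = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR, sizeof(b), b, NULL);
    cl_mem dc = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, sizeof(c), NULL, NULL);
    cl_program prog = clCreateProgramWithSource(ctx, 1, &kSrc, NULL, NULL);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "vadd", NULL);
    clSetKernelArg(k, 0, sizeof(da), &da);
    clSetKernelArg(k, 1, sizeof(db), &db);
    clSetKernelArg(k, 2, sizeof(dc), &dc);
    clEnqueueNDRangeKernel(q, k, 1, NULL, &n, NULL, 0, NULL, NULL);

    // Blocking read copies the result back; expect c[10] == 30.
    clEnqueueReadBuffer(q, dc, CL_TRUE, 0, sizeof(c), c, 0, NULL, NULL);
    printf("c[10] = %f\n", c[10]);
    return 0;
}
```

The kernel body would take much the same shape in Cuda or a DXCS compute shader; it's mostly the host-side API that differs, which is exactly why the open standards matter here.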
 
I agree about the terrible drivers bit. I can't stand the ATi drivers. They are buggy, cause stability problems, AND I hate that god-awful CCC, which requires .NET. Or maybe they finally became sane and it no longer requires .NET? I don't know. All I know is that their hardware is awesome. I got my brother a 1 GB 4870 and he loves it.

I think Nvidia needs to get their shit together and cater to my demands of awesomeness, dammit.

PS: I am not an Nvidia fanboy by any means. It's just that I have an SLI motherboard, and when I bought it, Nvidia was rocking it. Therefore, unless I upgrade my whole machine, I have no choice but to buy Nvidia video cards.

I don't see where the "buggy, stability problems, etc." comes from. Sure, some people have them, but then again, some have them in the nV camp too, some have them with both, and some with neither. It's a matter of luck, the position of the planets, the hardware working properly, and sometimes the user not making errors.

And yes, CCC still requires .NET, and so does nVidia's CP (the newer one, not the classic) as far as I know, but CCC is quite fast nowadays, something it wasn't in its early days.
 
If history tends to repeat itself, that would mean that GT300 will be successful while the R8xx family will be plagued by driver problems, bad design decisions, other unanticipated problems not directly related to AMD/ATI, and, oh, did I mention terrible drivers?

You could say that underestimating the competition is, in relative terms, a bad design decision, but the rest of your list doesn't necessarily have to apply at all. I didn't see any major driver problems with either R600, as one example, or GT200, as another; in each case the competition simply had a better solution.

It just appears that since G80, nVidia has been focusing on GPGPU far more than on graphics alone.
Currently, Cuda/PhysX/OpenCL/DXCS all seem to be coming along nicely, mostly because of the groundwork laid by the G80 a few years ago.
So I wonder if GT300 will put the focus back on graphics, or if nVidia will continue to focus on GPGPU.
ATi only recently caught up with nVidia in terms of graphics, and hasn't quite caught up in terms of GPGPU yet... I wonder whether that had any influence on nVidia's course with GT300 at all.

Any IHV, be it NVIDIA, AMD, Intel (or even a small vendor like IMG in the small-form-factor market), underestimating the importance of GPGPU sounds extremely shortsighted to me. The only thing GPGPU needs, IMHLO, is a strong concentration on open standards like OpenCL or DX11.

Having advanced GPGPU functionality and high efficiency shouldn't necessarily mean that the underlying hardware has to suck at gaming. Look at IMG's SGX: it's got MIMD units, doesn't suck at GPGPU, and retains the highest performance-per-mm² ratio in its league.

Personally, as a user, I wouldn't even want any coming architecture to cut corners on anything GPGPU. I'd want both, and I'm confident it is possible; it probably just comes down to each IHV's design decisions to find the best possible balance between the two "worlds".
 
Any IHV, be it NVIDIA, AMD, Intel (or even a small vendor like IMG in the small-form-factor market), underestimating the importance of GPGPU sounds extremely shortsighted to me. The only thing GPGPU needs, IMHLO, is a strong concentration on open standards like OpenCL or DX11.

Well, ATi seems to have had considerably less focus on GPGPU than nVidia in the past few years.

Having advanced GPGPU functionality and high efficiency shouldn't necessarily mean that the underlying hardware has to suck at gaming. Look at IMG's SGX: it's got MIMD units, doesn't suck at GPGPU, and retains the highest performance-per-mm² ratio in its league.

I certainly didn't mean to say that nVidia's GPUs suck at gaming, rather that graphics-wise, nVidia hasn't really done anything since the G80. They didn't even bother to add DX10.1 functionality.
Other than that, nVidia's main focus has been on software... most notably Cuda and PhysX.
 
And yes, CCC still requires .NET, and so does nVidia's CP (the newer one, not the classic) as far as I know, but CCC is quite fast nowadays, something it wasn't in its early days.

Wrong, Nvidia's Control Panel does not require .NET, not even with the latest 182.50 WHQL drivers released last week or the 185.68 beta drivers released today.

Meanwhile, ATI obsesses over obsolete DX10.1 and the three games that most recently added support for it: H.A.W.X, Stormrise and Battleforge. Three games? Yeah, whatever.

DX11 hardware will support DX10.1 features, so the point is moot.

http://www.youtube.com/profile?user=nvidia&view=videos

See the latest three DirectX Compute videos.
 
Wrong, Nvidia's Control Panel does not require .NET, not even with the latest 182.50 WHQL drivers released last week or the 185.68 beta drivers released today.

Meanwhile, ATI obsesses over obsolete DX10.1 and the three games that most recently added support for it: H.A.W.X, Stormrise and Battleforge. Three games? Yeah, whatever.

DX11 hardware will support DX10.1 features, so the point is moot.

http://www.youtube.com/profile?user=nvidia&view=videos

See the latest three DirectX Compute videos.

I wonder how much the TWIMTBP program has to do with it. The features are not really irrelevant; there is just no developer support.

I remember back when nVidia had the checkbox feature, e.g. PS3.0, it was implemented everywhere, even though it was nothing special either.
 
Wrong, Nvidia's Control Panel does not require .NET, not even with the latest 182.50 WHQL drivers released last week or the 185.68 beta drivers released today.
What I don't understand is why .NET is such an obstacle. Sure, you have one more thing to install, but maybe you'd need to do that anyway for some other app. (That goes for everyone here.)
Meanwhile, ATI obsesses over obsolete DX10.1 and the three games that most recently added support for it: H.A.W.X, Stormrise and Battleforge. Three games? Yeah, whatever.
Since when is DX10.1 obsolete?
DX11 hardware will support DX10.1 features, so the point is moot.
DX11 hardware will; DX10.1 hardware already does.
See the latest three DirectX Compute videos.
Been there, bought the T-shirt. It's a tech demo; that's all it is. Still, I don't see what you were really trying to tell us.
 
Wrong, Nvidia's Control Panel does not require .NET, not even with the latest 182.50 WHQL drivers released last week or the 185.68 beta drivers released today.
I'm quite sure nV's CP used .NET at some point; maybe they've since reverted from it or something.
Meanwhile, ATI obsesses over obsolete DX10.1 and the three games that most recently added support for it: H.A.W.X, Stormrise and Battleforge. Three games? Yeah, whatever.
There's nothing obsolete about it, and you forgot S.T.A.L.K.E.R.: Clear Sky, Far Cry 2 and the unpatched Assassin's Creed, at least.
 
I definitely don't see how 10.1 is obsolete. In fact, I think Nvidia should pull their thumb out of their rears and get to it, seeing that 10.1 DOES indeed provide quite a nice FPS boost!

As for .NET being an obstacle, it's quite annoying to have to install another app just to get an app running. Nvidia does not use .NET; that's why I brought it up. I bought my brother a 1 GB ATi 4870, as I probably mentioned earlier, and after installing the OS and trying to get drivers, he found out he had to go and install .NET, and that is just annoying. One just expects to go to the driver site for the video card, download, and get on with it.
 
One just expects to go to the driver site for the video card, download, and get on with it.
And that's how it does work. CCC != drivers, and unless you want to use AA modes other than the standard box filter, or use CrossFire, you don't really need it.
Still, how many times do you install Windows and everything, versus how many times do you just re-install the drivers?
 
.NET comes preinstalled on the only OSes that support DX10/DX10.1. Anyone who says installing .NET is an obstacle is a raving mad lunatic.
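For what it's worth, anyone who genuinely wants to know up front can check with a quick Win32 registry probe before touching the driver package. A sketch of my own, assuming CCC wants the .NET 2.0 runtime and probing its setup key under NDP (other runtime versions sit in sibling subkeys):

```cpp
// Sketch: detect whether the .NET 2.0 runtime is already installed,
// so a driver/CCC install doesn't surprise you halfway through.
#include <windows.h>
#include <cstdio>

static bool DotNet20Installed() {
    HKEY key;
    // Key written by the .NET 2.0 redistributable installer (assumption:
    // this is the version CCC needs; adjust the subkey for 3.0/3.5).
    if (RegOpenKeyExA(HKEY_LOCAL_MACHINE,
                      "SOFTWARE\\Microsoft\\NET Framework Setup\\NDP\\v2.0.50727",
                      0, KEY_READ, &key) != ERROR_SUCCESS)
        return false;
    DWORD installed = 0, size = sizeof(installed);
    LONG rc = RegQueryValueExA(key, "Install", NULL, NULL,
                               (BYTE*)&installed, &size);
    RegCloseKey(key);
    return rc == ERROR_SUCCESS && installed == 1;
}

int main() {
    printf(".NET 2.0 runtime: %s\n", DotNet20Installed() ? "present" : "missing");
    return 0;
}
```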
 
Meanwhile, ATI obsesses over obsolete DX10.1 and the three games that most recently added support for it: H.A.W.X, Stormrise and Battleforge. Three games? Yeah, whatever.

Two of them are in the UK top-20 chart:
http://www.bit-tech.net/news/gaming/2009/03/31/top-10-pc-games-charts/1

Games with physics accelerated on the GPU? None.

DSC said:
DX11 hardware will support DX10.1 features, so the point is moot.

Think again. ATI supports DX10.1 + tessellation, and both will be present in DX11, so:
1 + 1 = 2
ATI already supports some DX11 features, so it will probably see a performance boost come DX11 time.
 
Games with physics accelerated on the GPU? None.



Think again. ATI supports DX10.1 + tessellation, and both will be present in DX11, so:
1 + 1 = 2
ATI already supports some DX11 features, so it will probably see a performance boost come DX11 time.

Why are people so ridiculous? Physics on the GPU, and you can't think of any games, huh...
 
Two of them are in the UK top-20 chart:
ATI supports DX10.1 + tessellation, and both will be present in DX11

http://www.microsoft.com/downloads/...6A-6D37-478D-BA17-28B1CCA4865A&displaylang=en

"Tessellation

Direct3D 11 provides additional pipeline stages to support real-time tessellation of high order primitives. With extensively programmable capabilities, this feature allows many different methods for evaluating high-order surfaces, including subdivision surfaces using approximation techniques, Bezier patches, adaptive tessellation, and displacement mapping. This feature will only be available on Direct3D 11-class hardware, so in order to evaluate this feature you will need to use the Reference Rasterizer. For a demo of tessellation in action, check out the SubD11 sample available through the Sample Browser."

The tessellation unit is a hardware unit, and it is doubtful that the R600-class tessellation unit can fully meet the DX11 standard.
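To make the hardware-class distinction concrete, here's a rough sketch of my own (using the documented D3D11 names, which may differ slightly in the current tech-preview SDK) of how an application would ask for the best available feature level and fall back to the reference rasterizer when no Direct3D 11-class hardware exists:

```cpp
// Sketch: create a D3D11 device at the highest feature level the GPU
// supports; only FEATURE_LEVEL_11_0 exposes the new tessellation stages,
// so without 11-class hardware we fall back to the reference rasterizer.
#include <d3d11.h>

ID3D11Device* CreateBestDevice(ID3D11DeviceContext** ctx) {
    const D3D_FEATURE_LEVEL wanted[] = {
        D3D_FEATURE_LEVEL_11_0,   // full D3D11: hull/domain shaders, CS 5.0
        D3D_FEATURE_LEVEL_10_1,   // DX10.1-class parts (e.g. RV670/RV770)
        D3D_FEATURE_LEVEL_10_0,   // DX10-class parts (e.g. G80)
    };
    ID3D11Device* dev = NULL;
    D3D_FEATURE_LEVEL got;
    *ctx = NULL;

    // Try real hardware first; the runtime hands back the best level it can.
    HRESULT hr = D3D11CreateDevice(NULL, D3D_DRIVER_TYPE_HARDWARE, NULL, 0,
                                   wanted, 3, D3D11_SDK_VERSION,
                                   &dev, &got, ctx);
    if (SUCCEEDED(hr)) {
        if (got == D3D_FEATURE_LEVEL_11_0)
            return dev;           // 11-class hardware: tessellation available
        dev->Release();           // 10.x part: no tessellation pipeline stages
        (*ctx)->Release();
        *ctx = NULL;
    }

    // No 11_0 hardware: software reference device, 11_0 only (slow, but the
    // only way to evaluate D3D11 tessellation today, per the MS quote above).
    hr = D3D11CreateDevice(NULL, D3D_DRIVER_TYPE_REFERENCE, NULL, 0,
                           wanted, 1, D3D11_SDK_VERSION, &dev, &got, ctx);
    return SUCCEEDED(hr) ? dev : NULL;
}
```

A DX10.1 part like RV770 would come back as 10_1 here: the DX10.1 feature set carries over into DX11, but the tessellation stages quoted above do not.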
 
http://www.microsoft.com/downloads/...6A-6D37-478D-BA17-28B1CCA4865A&displaylang=en

"Tessellation

Direct3D 11 provides additional pipeline stages to support real-time tessellation of high order primitives. With extensively programmable capabilities, this feature allows many different methods for evaluating high-order surfaces, including subdivision surfaces using approximation techniques, Bezier patches, adaptive tessellation, and displacement mapping. This feature will only be available on Direct3D 11-class hardware, so in order to evaluate this feature you will need to use the Reference Rasterizer. For a demo of tessellation in action, check out the SubD11 sample available through the Sample Browser."

The tessellation unit is a hardware unit, and it is doubtful that the R600-class tessellation unit can fully meet the DX11 standard.

IIRC there were some changes to the tessellation unit in RV670 and/or RV770.
 
Really, from an end-user perspective, there is very little differentiating ATI's and NV's products at the moment. This is partly why we see NV pushing PhysX and CUDA so heavily... they are in dire need of setting their products apart from ATI's.

As for DX10.1, there's no doubt in my mind that if ATI spent what NV spends on devrel, we'd be seeing far more DX10.1-supporting games. We'd also be seeing far more Stream apps. Limited marketing/developer-relations resources are stifling ATI's growth in these areas. I really hope OpenCL and/or DX11 compute takes off so we can do away with all this proprietary nonsense.

It'll be interesting to see what happens with GT300. If they manage to reclaim the performance lead, I would expect NV to stop trying so hard to force PhysX and CUDA down people's throats.
 
http://images.bit-tech.net/content_images/2008/09/directx-11-a-look-at-what-s-coming/6.jpg

Full functionality (for example, tessellation) will require Direct3D 11 hardware

I guess most people choose to gloss over and ignore this. Again, enjoy your obsolete DX10.1 hardware. And cho even found another similar quote.

This feature will only be available on Direct3D 11-class hardware

Let's see how useful your DX10.1 tessellation unit is in the future. Ahh, but posting here is a waste of time anyway; the ATI fanboy brigade bias here is nauseating.

Also, do you think the small, measly performance boost given by DX10.1 is anything compared to what you'll get with DX11-class hardware?
 