NVIDIA GT200 Rumours & Speculation Thread

But that would require re-designing the chip. GT200 should be optimized to squeeze more performance out of fewer transistors, so it would make more sense to base a new chip on GT200 than on the somewhat antiquated G92.
 
...
Get pulled deeper into the gaming experience than ever before with NVIDIA® GeForce® GTX 280 and GTX 260.

High definition worlds crackle and hum with cinema-quality clarity in games including Unreal Tournament® 3, Assassin’s Creed™, Call of Duty® 4: Modern Warfare™ and S.T.A.L.K.E.R.: Clear Sky. Lifelike characters behave with such realistic intensity that your palms sweat and your heart races. Objects react to the forces of nature exactly as nature intended.

And beyond games, GeForce GTX 200 GPUs shift everyday processing tasks from the CPU to the GPU - Blu-ray movies, 3D Internet browsing with PicLens, Folding@home protein-folding, and video transcoding. With a 50% perf boost over the previous generations, both performance and immersion are absolute.

Get Visual. Get GeForce.
http://we.pcinlife.com/thread-940795-1-1.html

;)

But I still wonder what this 50% refers to. Maybe it's another conservatively set estimate, like the claimed 90% of the 9600 GT over the 8600 GTS, which turned out to be significantly more in real benchmarks.
 
Our friends at Fudzilla just reported that the GT200 class will be available immediately upon launch in June and that it "will definitely be faster than a 9800 GX2". However, the price tag, as we know, is going to be steep at $600 :(. With the GX2 sitting at a minimum of $500, though, we can see where "high-end" pricing is going.

Pricing aside, and even performance aside, what are the technological advantages and disadvantages between Nvidia's next gen and ATI's next gen?

Right now I can list these and their importance (to me, anyway):

1) DX 10.1 - Who cares... as someone stated earlier

2) CUDA - Could be VERY useful for encoders (see the sketch after this list for the kind of work it could offload)

3) GDDR5 - It doesn't really matter what memory technology the card is using, as long as it's fast (although some might scream that the card is too power hungry, if that's something you care a lot about; with the power this thing will need, I can see that justification!).

4) 65nm/55nm - Sort of the same thing as no. 3).

5) Enhanced UVD, accelerated encoding - Useful on a card this expensive?

6) Integrated DisplayPort - A competitor to HDMI and UDI. DisplayPort seems to be a positive thing, but again, how useful is it on a video card this expensive?

7) HDMI with 7.1-channel audio - ???

8) Integrated HDCP - ???

9) ???

http://www.fudzilla.com/index.php?option=com_content&task=view&id=7586&Itemid=1
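On point 2, to make the encoder claim concrete: below is a minimal sketch of the kind of work an encoder could hand to CUDA. It's my own illustration, not code from x264, RapiHD, or any NVIDIA tool. Motion estimation spends most of its time computing sums of absolute differences (SAD) between a 16x16 macroblock and candidate positions in a reference frame, and those comparisons are embarrassingly parallel:

```
// Hypothetical illustration only -- not from any real encoder.
// One thread block evaluates one candidate motion vector; each of the
// 256 threads handles one pixel of the 16x16 macroblock, then a shared-
// memory tree reduction collapses the per-pixel diffs into one SAD.
__global__ void sad16x16(const unsigned char* cur,  // current frame (luma)
                         const unsigned char* ref,  // reference frame (luma)
                         int width,                 // frame width in pixels
                         int mbX, int mbY,          // macroblock origin
                         const int* candX,          // candidate x positions
                         const int* candY,          // candidate y positions
                         int numCand,
                         int* sadOut)               // one SAD per candidate
{
    int c = blockIdx.x;
    if (c >= numCand) return;

    int px = threadIdx.x % 16;   // pixel column within the block
    int py = threadIdx.x / 16;   // pixel row within the block

    int d = abs((int)cur[(mbY + py) * width + (mbX + px)] -
                (int)ref[(candY[c] + py) * width + (candX[c] + px)]);

    __shared__ int partial[256];
    partial[threadIdx.x] = d;
    __syncthreads();
    for (int s = 128; s > 0; s >>= 1) {
        if (threadIdx.x < s)
            partial[threadIdx.x] += partial[threadIdx.x + s];
        __syncthreads();
    }
    if (threadIdx.x == 0) sadOut[c] = partial[0];
}
// Launch as: sad16x16<<<numCand, 256>>>(...);
```

A frame has thousands of macroblocks, each with many candidates, so the GPU gets millions of independent comparisons per frame. A real encoder obviously needs far more than this (mode decision, entropy coding staying on the CPU), which is why how useful item 2 turns out to be depends entirely on real application support.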
 
Well, it shouldn't come as any sort of surprise that NV will charge the most they can for this card. I'm waiting for the $800 version to replace that original 8800 Ultra price point.

IMO GT200 is going to be the last thing you'd want to run in an HTPC, so whether they accelerate HD video, have DisplayPort, and other video-related stuff means nothing to me. A game PC can play 1080p fine even with pure software decoding. For an HTPC, a little Radeon 3450 or 3650 (or 780G, for that matter) is totally the way to go, IMO. You don't really need game power at all, and if it's on all the time, the ultra-low power use of one of those is awesome. And they are uber cheap.

GT200 is for games and that's all it needs to do well. I'm sure it will be a CUDA monster, and that may interest some group out there, but it won't matter for gaming. All it needs is a pair of DVI ports and lots of game power.
 

I have a friend named Mike who is going to be spending hundreds of dollars on server-board, memory, and CPU components so he can encode HD at 20-30 fps on his rig. That speed IS impressive when right now you'll normally only get 3-6 fps on a typical end-user system.

I, or we since you know him as well, might actually want to tell him to get the GTX 260 or even an 8-series card; perhaps that would be his wisest choice, lol. Early reports suggest 2x real-time with CUDA support, which means 40 fps+... a good deal better than his 20-30 fps target.
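Quick sanity check on that math (plain host-side C++, nothing GPU-specific; I'm assuming a two-hour movie at 24 fps, and the encode rates are just the figures quoted in this thread, not measurements):

```
#include <cstdio>

int main() {
    // Assumed workload: a 2-hour movie at 24 fps = 172,800 frames.
    const double frames = 2.0 * 3600.0 * 24.0;

    // Encode rates as discussed above (thread figures, not benchmarks).
    const double fps[]    = { 5.0, 25.0, 48.0 };
    const char*  labels[] = { "typical end-user CPU (~3-6 fps)",
                              "Mike's server rig (~20-30 fps)",
                              "CUDA at ~2x real-time (~48 fps)" };

    for (int i = 0; i < 3; ++i)
        printf("%-34s -> %4.1f hours per movie\n",
               labels[i], frames / fps[i] / 3600.0);
    return 0;
}
```

That works out to roughly 9.6 hours per movie on a typical CPU, about 2 hours on the server rig, and about 1 hour at 2x real-time, so the jump really is meaningful if the quality holds up.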
 
The only time for him to even ponder that is when x264 (or some other H.264 encoder) has functioning, efficient CUDA support. Once it's proven, that might be fun indeed. It has to actually be usable within real video encoding apps, not just some cheesy NV utility.

If it actually works, it would be interesting to see how the soon-to-be-cheaper 8-series cards perform.
 
He might be able to use RapiHD, although Ateme (Nero) might be a better-quality solution as it offers more features. You are right, though, that CPU-based solutions can offer higher-quality output at the moment, since CUDA support isn't yet integrated well enough into these HD solutions, but going that route is going to cost you (unless of course you're fine with 3-6 fps).

Not sure on their costs or true effectiveness at this time, though... it might be interesting to see what Larrabee introduces for CPU/GPU combo solutions yet. I'm not an encoding buff myself, so my understanding of some of this is a bit sketchy, but I do know I'm not jumping on any bandwagons for it yet. Maybe next year...
 
http://www.nordichardware.com/news,7809.html

Both the GeForce GTX and Radeon HD 4800 series will arrive in about three weeks. Each series will bring two new cards to the market: GeForce GTX 280 and 260, and Radeon HD 4870 and 4850. There is a big difference between the cards, though, as the GeForce GTX series is enthusiast range, while the Radeon HD 4800 series is more mid-range. There has been talk of what the GeForce GTX 280 can do in Vantage, and the picture has now been completed with figures for the other cards.

These are of course in no way official and we can't say for certain where they come from. The only thing we know is that the numbers are not unreasonable, but some information about the rest of the system would be nice. ATI performance (with all cards) is still subpar due to poor drivers and should improve in Vantage with coming releases. The numbers circulating the web are something like this:


Graphics card         Vantage Xtreme profile*
GeForce GTX 280       41xx
GeForce GTX 260       38xx
GeForce 9800 GX2      36xx
GeForce 8800 Ultra    24xx
Radeon HD 4870 XT     26xx
Radeon HD 3870 X2     25xx
Radeon HD 4850 Pro    20xx
Radeon HD 3870        14xx
* 1920x1200, 4xAA/16xAF

;)
 
Well at least they got the GTX 280 and GTX 260 numbers right. But I leaked those some time ago on another forum. ;)
 
It just goes to show, if these numbers are even close to accurate, that they mean much less when it comes to real-world game performance. The GeForce 8800 Ultra, for example, will often come close to GX2 performance in many game scenarios.

What we'll need are real-world benchmarks and mature drivers to tell any sort of tale about what these cards will be worth performance-wise. Vantage scores combine CPU, memory, and graphics, so some games on the higher-end cards might do much better than the roughly 12 percent gap between the GTX 280 (41xx) and 9800 GX2 (36xx) scores suggests here. This is a great start, though, most definitely.
 
Here is a Crysis benchmark from Hardware.Info. If these numbers are accurate, what do you think the FPS would be for those two new cards?

[Crysis benchmark chart from Hardware.Info (netgsh.png)]
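For fun, here's the most naive possible projection from the leaked Vantage scores posted earlier (plain host-side C++; the 30 fps base for the GX2 is a made-up placeholder, not a figure from the Hardware.Info chart, and Vantage ratios almost never translate linearly into game fps):

```
#include <cstdio>

int main() {
    // Leaked Vantage Xtreme scores from earlier in the thread ("41xx" etc.).
    const double gx2Vantage = 3600.0;
    // Placeholder only: substitute whatever the GX2 actually gets in your scene.
    const double gx2CrysisFps = 30.0;

    const double vantage[] = { 4100.0, 3800.0 };
    const char*  card[]    = { "GeForce GTX 280", "GeForce GTX 260" };

    for (int i = 0; i < 2; ++i)
        printf("%s: ~%.1f fps, assuming (unrealistically) linear scaling\n",
               card[i], gx2CrysisFps * vantage[i] / gx2Vantage);
    return 0;
}
```

With a 30 fps base that gives roughly 34 fps for the GTX 280 and 32 fps for the GTX 260, which is exactly why synthetic scores alone don't tell you much.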
 
Why the hell is the 9800 GTX slower than the GTS 512? It should be faster (maybe not by much, but still)...

That's the second time I've seen a graph with the 9800 GTX slower than the GTS 512.
 
Also, I still can't believe my 8800Ultra has reigned for so long...

What's not to believe? ATI basically gave up after G80 launched.

As far as I'm concerned, I don't believe a single thing right now when it comes to any numbers related to their "unreleased stuff". I don't care if CJ were to post a screenshot of his own system running a new ATI GPU not yet seen by the world; I want to see it to believe it. And I don't believe the "30% faster than the 9800 GTX" claim either.
 
I wouldn't count them out of the high-end game so quickly. What about R700... a true bridged dual-GPU solution? Going by the numbers above, it could pull ahead of the GTX 280.
 