http://we.pcinlife.com/thread-940795-1-1.html
Get pulled deeper into the gaming experience than ever before with NVIDIA® GeForce® GTX 280 and GTX 260.
High definition worlds crackle and hum with cinema-quality clarity in games including Unreal Tournament® 3, Assassin’s Creed™, Call of Duty® 4: Modern Warfare™ and S.T.A.L.K.E.R.: Clear Sky. Lifelike characters behave with such realistic intensity that your palms sweat and your heart races. Objects react to the forces of nature exactly as nature intended.
And beyond games, GeForce GTX 200 GPUs shift everyday processing tasks from the CPU to the GPU - Blu-ray movies, 3D Internet browsing with PicLens, Folding@home protein folding, and video transcoding. With a 50% performance boost over the previous generation, both performance and immersion are absolute.
Get Visual. Get GeForce.
Well, it shouldn't come as any sort of surprise that NV will charge the most they can for this card. I'm waiting for the $800 version to replace that original 8800 Ultra price point.
IMO GT200 is going to be the last thing you'd want to run in an HTPC, so whether it accelerates HD video, has DisplayPort, or any other video-related stuff means nothing to me. A game PC can play 1080p fine even with pure software decoding. For an HTPC, a little Radeon 3450 or 3650 (or 780G for that matter) is totally the way to go IMO. You don't really need game power at all, and if it's on all the time, the ultra-low power use of one of those is awesome. And they are uber cheap.
GT200 is for games and that's all it needs to do well. I'm sure it will be a CUDA monster, and that may interest some group out there, but it won't matter for gaming. All it needs is a pair of DVI ports and lots of game power.
The only time for him to even ponder that is when x264 (or some other H.264 encoder) has functioning, efficient CUDA support. Once it's proven, that might be fun indeed. It has to actually be usable within real video encoding apps, not just some cheesy NV utility.
I have a friend named Mike who is going to spend hundreds of dollars on server board, memory, and CPU components so he can encode HD at 20-30 fps on his rig. That speed IS impressive when, right now, you'll normally only get 3-6 fps on a typical end-user system.
I, or we, since you know him as well, might want to actually tell him to get the GTX 260, or even an 8-series card; that is perhaps his wisest choice, lol. Early reports suggest 2x real-time with CUDA support, which means 40 fps+... a good deal better than his 20-30 fps target.
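To put those speeds in perspective, here's a quick back-of-the-envelope sketch in Python. The two-hour, 24 fps source is just an assumed example, and the fps figures are the ones quoted in this thread, not measurements:

# Rough encode-time comparison for a hypothetical two-hour, 24 fps source.
# The encode speeds below are the figures quoted in this thread, not benchmarks.
SOURCE_FRAMES = 2 * 60 * 60 * 24  # 2 hours at 24 fps = 172,800 frames

encode_speeds = {
    "typical end-user system": 4.5,    # midpoint of the 3-6 fps range
    "Mike's planned server rig": 25,   # midpoint of the 20-30 fps range
    "rumored CUDA, 2x real-time": 48,  # twice the 24 fps playback rate
}

for system, fps in encode_speeds.items():
    hours = SOURCE_FRAMES / fps / 3600
    print(f"{system}: {fps} fps -> {hours:.1f} hours per movie")

That works out to roughly 10.7 hours per movie on a typical system, about 1.9 hours on the server rig, and about 1 hour at the rumored CUDA speed, which is why the GTX 260 option looks tempting if the support actually materializes.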
http://www.nordichardware.com/news,7809.html
Both the GeForce GTX and Radeon HD 4800 series will arrive in about three weeks. Each series will bring two new cards to market: the GeForce GTX 280 and 260, and the Radeon HD 4870 and 4850. There is a big difference between them, though: the GeForce GTX series is enthusiast range, while the Radeon HD 4800 series is more mid-range. There has been talk of what the GeForce GTX 280 can do in Vantage, and the list has now been completed with figures for the other cards.
These are of course in no way official, and we can't say for certain where they come from. The only thing we know is that the numbers are not unreasonable, but some information about the rest of the system would be nice. ATI performance (with all cards) is still subpar due to poor drivers and should improve in Vantage with coming releases. The numbers circulating the web look something like this:
Graphics card       Vantage Xtreme profile*
GeForce GTX 280     41xx
GeForce GTX 260     38xx
GeForce 9800GX2     36xx
GeForce 8800 Ultra  24xx
Radeon HD 4870 XT   26xx
Radeon HD 3870X2    25xx
Radeon HD 4850 Pro  20xx
Radeon HD 3870      14xx
* 1920x1200, 4x AA / 16x AF
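For anyone who wants to turn those masked scores into rough standings, here's a quick Python sketch. The exact values like 4150 are my own guesses for the "xx" digits, and the scores themselves are unverified rumor, so treat the ratios as ballpark only:

# Rough relative standings from the leaked Vantage Xtreme numbers above.
# Masked digits ("41xx") are filled in with 50 as a neutral guess.
scores = {
    "GeForce GTX 280":    4150,
    "GeForce GTX 260":    3850,
    "GeForce 9800GX2":    3650,
    "GeForce 8800 Ultra": 2450,
    "Radeon HD 4870 XT":  2650,
    "Radeon HD 3870X2":   2550,
    "Radeon HD 4850 Pro": 2050,
    "Radeon HD 3870":     1450,
}

baseline = scores["GeForce 8800 Ultra"]
for card, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{card}: {score} ({score / baseline:.2f}x 8800 Ultra)")

On those guessed numbers, the GTX 280 would land at roughly 1.7x an 8800 Ultra and about 14% ahead of the 9800GX2, so take the exact placings with a grain of salt.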
Well, at least they got the GTX 280 and GTX 260 numbers right. But then, I leaked those some time ago on another forum.
So the HD 4870/4850 numbers are not right?
Here is a Crysis benchmark from Hardware.Info. If these numbers are accurate, what do you think the FPS would be for those two new cards?
Also, I still can't believe my 8800 Ultra has reigned for so long...
What's not to believe? ATI basically gave up after the G80 launched. As far as I'm concerned, I don't believe a single thing right now when it comes to any numbers related to their "unreleased stuff". I don't care if CJ were to post a screenshot of his own system running a new ATI GPU not yet seen by the world; I want to see it to believe it. And I don't believe the "30% faster than 9800GTX" claim either.