So, do we know anything about RV670 yet?

A while back I bought a GF8600GT simply because it was the best bang for your buck. After using it for a little bit, I tried playing 720p and 1080p HD video clips with the VLC player under WinXP. The card simply failed at this; I even tried updating the drivers, but it didn't solve my problem. Then I went with an ATI HD 2600PRO and my problems went away.
Uhm, XP? You do realize NVIDIA only activated its new video engine in XP several months after launch, right? Assuming that is your problem and you didn't know, I certainly can't blame you either, since that situation is quite screwed up! :(

EDIT: Uhm, wait, VLC-Player doesn't even support hardware acceleration as far as I can tell - is the problem that it couldn't even run at all? :|
 
Uhm, XP? You do realize NVIDIA only activated its new video engine in XP several months after launch, right? Assuming that is your problem and you didn't know, I certainly can't blame you either, since that situation is quite screwed up! :(

Are you sure that VP2 is really enabled on XP now?

According to the Tom's Hardware HQV test here:
http://www.tomshardware.com/2007/10/26/avivo_hd_vs_purevideo/page9.html
it seems that VP2 on the 8600 GTS is still not working on XP. It also got a slightly lower score in Vista compared to the HD 2600 XT, but without VC-1 hardware decoding. The article came out on 26 Oct 07, so the results should be recent ;)
 
Uhm, XP? You do realize NVIDIA only activated its new video engine in XP several months after launch, right? Assuming that is your problem and you didn't know, I certainly can't blame you either, since that situation is quite screwed up! :(

EDIT: Uhm, wait, VLC-Player doesn't even support hardware acceleration as far as I can tell - is the problem that it couldn't even run at all? :|

You could still play them, but the screen showed bleeding when playing back the video clips.

Edit: The Apple QuickTime player also gave me trouble with the GF8600GT.

So far the ATI HD 2600PRO does not have this issue.
 
One shouldn't use those .MOV container files from the net to judge any video decoding performance issues they might or might not have - those, even though 1080p or 720p, have bitrates as low as 9-10 Mbps, something even normal DVD specs allow (well, almost allow, anyway - DVD-Video video tops out at around 9.8 Mbps).
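For illustration, here's a minimal sketch of how you can estimate a clip's average bitrate from its size and length (the ~700 MB / 10 minute figures are made-up example values, just to show such clips sit in DVD territory):

    # Estimate a clip's average bitrate from file size and duration.
    # The 700 MB size and 10-minute duration are hypothetical example values.
    size_bytes = 700 * 1024 * 1024        # ~700 MB file
    duration_s = 10 * 60                  # 10 minutes
    bitrate_mbps = size_bytes * 8 / duration_s / 1_000_000
    print(round(bitrate_mbps, 1))         # ~9.8 Mbps - DVD-class despite the "1080p" label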
 
Are you sure that VP2 is really enabled on XP now?
http://www.tomshardware.com/2007/10/26/avivo_hd_vs_purevideo/page14.html
So yes, but IQ features aren't. Meh! :?
---
Back to RV670: Even assuming we don't see any major game developer using 10.1 in 2008 (which is a big assumption, mind you)... One very positive point I can see with this is that it might put ATI boards back in the machines of engine programmers for games expected to come out when 10.1 *does* become a standard.

And one thing ATI has been suffering from, IMO, is that they just aren't the primary development target. For DX10 games the reason is simple, with the 8800 series apparently outselling the HD 2900s by more than two orders of magnitude; but this was already the case nearly systematically in recent years, such as with NV40 vs R420 and G70 vs R520. I don't know for sure, but I'd suspect R580 improved things a bit, and then... well, we know what happened after that.

ATI now having a feature superiority, rather than just a 'performance for a given feature' superiority (such as with dynamic branching), is much more important from a developer POV imo. We'll see, but it probably won't be easy to judge...
 
might put ATI boards back in the machines of engine programmers for games expected to come out when 10.1 *does* become a standard. [...]
ATI now having a feature superiority [...] is much more important from a developer POV
First, let me say that I intend to buy RV670. And I'd like to agree with you, but I can't:
  • TWIMTBP lists about a hundred games; Get In The Game lists 3.
  • NVidia spent 5000 man-hours devrel'ing/supporting Crysis (that's about 5 people working full-time for 6 months; see the rough check after this list). The ATI and Crytek page still promotes the new ATI Radeon™ X800 XT. :rolleyes:
  • Supporting more and more platforms and configurations is getting harder for both. It's also hard to hire more people to provide more support if the company is in the red (and I'm not talking about the logo here).
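A quick sanity check on that man-hours figure (assuming roughly 21 working days per month and 8-hour days - my assumptions, not NVidia's numbers):

    # Rough check of "5000 man-hours ~ 5 people full-time for 6 months".
    # 21 working days/month and 8-hour days are assumed values.
    people = 5
    months = 6
    workdays_per_month = 21
    hours_per_day = 8
    man_hours = people * months * workdays_per_month * hours_per_day
    print(man_hours)  # 5040 - in the same ballpark as the quoted 5000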
I'm sorry if I offend somebody, but from the looks of it I get the impression that, compared to NVidia, AMD's devrel and/or marketing is working like they're in retirement or something. ;) Additionally, the Axe of Reorganization may have hit quite hard after the merger.

Overall I don't expect things to change: even if the products have a tessellator and DX10.1, these still have to be actively sold and supported.
 
First, let me say that I intend to buy RV670. And I'd like to agree with you, but I can't:
  • TWIMTBP lists about a hundred games; Get In The Game lists 3.
  • NVidia spent 5000 man-hours devrel'ing/supporting Crysis (that's about 5 people working full-time for 6 months). The ATI and Crytek page still promotes the new ATI Radeon™ X800 XT. :rolleyes:
  • Supporting more and more platforms and configurations is getting harder for both. It's also hard to hire more people to provide more support if the company is in the red (and I'm not talking about the logo here).


I tend to agree with you. Even if RV670 ends up being used for future development, and AMD actually sends a lot of them out to devs, it will still be very hard work to go against Nvidia's TWIMTBP and their aggressive dev-rel and marketing support.

AMD will have to actually put effort into improving their dev-rel and getting developers to understand the RV670.
 
AMD will have to actually put effort into improving their dev-rel and getting developers to understand the RV670.

AMD will first have to put more effort into development of their architecture, because that's where they are lacking. And without that, no use doing anything else.

Rewinding back to the R600 design, I actually can't believe management okayed that. Just the block diagram (shaders and texturing especially) told me enough at first glance, and I'm far from being a real expert.
 
AMD will first have to put more effort into development of their architecture, because that's where they are lacking. And without that, no use doing anything else.

That's also true, but in the context of getting devs on board and using your DX10.1 card, AMD still have to battle the superior Nvidia dev-rel. Devs will not jump to products with advanced features just because you hand them out, especially in the face of Nvidia bending over backwards and doing all the difficult programming for them.
 
Many will buy the GT simply because it's the faster card, even if it's only by 10%. People tend to go with the fastest available card within a certain price bracket, and what's a few bucks more if that gives you 10% more performance?
Price is, and always has been, king.
 
That's just basically a summary of the DX10.1 document that AMD leaked a day before the G92 launch. :)
 
Charlie's new piece. Only interesting thing in there is the RV670 X2, release date of 'winter' and price of $399. :oops:
 
BZB: I honestly think DX10.1 is totally unimportant at this point in time. And I also doubt we'll get any useful performance with features like global illumination and shader AA (as seen with R600). Kinda like NV40 and SM3 I suppose, new features but not enough performance to use them on a wide-scale. I may be wrong though, that's just what I ass-ume.
 
BZB: I honestly think DX10.1 is totally unimportant at this point in time. And I also doubt we'll get any useful performance with features like global illumination and shader AA (as seen with R600). Kinda like NV40 and SM3 I suppose, new features but not enough performance to use them on a wide-scale. I may be wrong though, that's just what I ass-ume.

The situation is not even close: NV had a huge PR weapon in their hands in NV40 times with Crytek's SM3.0+HDR patch for Far Cry. I'm sure Crytek won't release a DX10.1 patch for Crysis.
 
BZB: I honestly think DX10.1 is totally unimportant at this point in time. And I also doubt we'll get any useful performance with features like global illumination and shader AA (as seen with R600). Kinda like NV40 and SM3 I suppose, new features but not enough performance to use them on a wide-scale. I may be wrong though, that's just what I ass-ume.


It's not really about DX10.1, it's about using advanced "developer only" features to get your cards into devs' machines and get them developing on your hardware, rather than your competitor's hardware.
 