ATi unveil R420

JCLW said:
mavis said:
Great. I can't believe ATi's reference card doesn't have dual-DVI. What a major let down. :(
The nV40 doesn't support dual DVI natively either. The reference card uses a separate TMDS transmitter to drive the second DVI. I believe it's the SiI1172 on the nV40 cards.

I'm sure someone will make a dual DVI version. Asus (and Sapphire IIRC) make a dual DVI 9800 board.

The FireGL cards are the same way. The one I have has a SiI164CT64 driving the second DVI.
 
JCLW said:
The reference card uses a separate TMDS transmitter to drive the second DVI

AFAIK there are no TMDS transmitters integrated into NV40. There are actually two external chips, one on the underside of the board and one on the front, under the fan. The documentation makes note of "DVI ports for connection to decoders".
 
joe emo said:
Some Q&A from later in the article, translated of course:

Question one: Which Radeon X800 products are currently planned? When are they expected to go on sale, and in roughly what price range?

ANS: At present we have decided to launch two models, the Radeon X800 Pro and the X800 XT, and other models may follow later (on this point, material Janus has found suggests ATi may also release a 128MB edition of the Radeon X800 series). As for timing, we expect the X800 Pro to be officially announced on 5/4, with the X800 XT reaching the market around 5/21; retail availability should be at roughly the same time. At this stage we are not yet in a position to announce the selling prices of these two products.

...Moreover, 3Dc is a new specification, so some questions are inevitable, but the point to make here is that, compared with a closed architecture such as Cg, 3Dc is a standard that fits within Windows...

Question five: The new Radeon X800 apparently does not support Pixel Shader 3.0; may we ask why, and how does ATi view this specification?

ANS: In fact, to the naked eye there is no difference between Pixel Shader 3.0 and 2.0 effects. Simply put, the instructions most games need are handled perfectly well by Pixel Shader 2.0, and in the short term no software will urgently need Pixel Shader 3.0; for that matter, the capabilities of Pixel Shader 2.0 are by no means fully exploited yet. If Pixel Shader 3.0 were forced in, whether the graphics chip could run it smoothly is an open question. From ATi's point of view, increasing processing power is more meaningful than adding other features.


I guess from the ATI perspective, they want us to believe that PS 3.0 and PS 2.0 have no difference - but I believe in reality, since PS 3.0 has more flow control and other things, the difference will become apparent when game developers start using it.

What is 3Dc? More confusion in the graphics shader specifications...
 
hstewarth said:
I guess from the ATI perspective, they want us to believe that PS 3.0 and PS 2.0 have no difference - but I believe in reality, since PS 3.0 has more flow control and other things, the difference will become apparent when game developers start using it.
Uhm, what? ATi is claiming there are no visual differences between PS 2.0 and 3.0, not that there are no differences. :rolleyes:

What is 3Dc? More confusion in the graphics shader specifications...
A compression technique.
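(To be specific, 3Dc targets normal maps: only the X and Y components of each unit-length normal are stored and block-compressed, and the shader rebuilds Z at run time. A rough C++ sketch of the idea, as I understand it - an illustration of the two-channel scheme, not ATi's actual encoder:)

Code:
// Illustrative sketch of a 3Dc-style two-channel normal-map codec.
// Real 3Dc compresses each 4x4 block of a channel to two 8-bit endpoints
// plus 3-bit per-texel indices; the details here are simplified.
#include <algorithm>
#include <array>
#include <cmath>
#include <cstdint>

struct Block {                        // one compressed 4x4 block of one channel
    uint8_t lo, hi;                   // endpoints
    std::array<uint8_t, 16> idx;      // 3-bit index per texel (stored loosely here)
};

// Compress one 4x4 block of a single normal component (values mapped to 0..255).
Block encodeBlock(const std::array<uint8_t, 16>& texels) {
    Block b;
    b.lo = *std::min_element(texels.begin(), texels.end());
    b.hi = *std::max_element(texels.begin(), texels.end());
    for (int i = 0; i < 16; ++i) {
        float t = (b.hi == b.lo) ? 0.f
                                 : float(texels[i] - b.lo) / float(b.hi - b.lo);
        b.idx[i] = uint8_t(std::lround(t * 7.f));   // 8 steps -> 3 bits
    }
    return b;
}

// Decompress one texel of the block back to 0..255.
uint8_t decodeTexel(const Block& b, int i) {
    return uint8_t(b.lo + std::lround(b.idx[i] / 7.f * float(b.hi - b.lo)));
}

// What the pixel shader does after sampling the two stored channels:
// rebuild Z from X and Y, relying on the normal being unit length.
void reconstructNormal(uint8_t xs, uint8_t ys, float n[3]) {
    float x = xs / 255.f * 2.f - 1.f;               // back to [-1, 1]
    float y = ys / 255.f * 2.f - 1.f;
    n[0] = x;
    n[1] = y;
    n[2] = std::sqrt(std::max(0.f, 1.f - x * x - y * y));
}

Two such blocks per 4x4 tile (one for X, one for Y) is essentially the whole format, which is where the roughly 4:1 saving over an uncompressed normal map comes from.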
 
hstewarth said:
I guess from the ATI perspective, they want us to believe that PS 3.0 and PS 2.0 have no difference - but I believe in reality, since PS 3.0 has more flow control and other things, the difference will become apparent when game developers start using it.

What is 3Dc? More confusion in the graphics shader specifications...

I think what ATI is going to say is that there is no visual difference between 2.0 and 3.0, but technically some effects can be done more efficiently, or with less of a performance hit, through 3.0. ATI's claim is that since they are going to have much better shader performance in general, this 'advantage' is not an issue.

In other words, an effect rendered on a 6800U will run more efficiently and have better FPS through 3.0 vs 2.0, and visually not be any different. But that same effect run on an X800 Pro/XT via 2.x will again be visually the same, yet faster than the 6800U using 3.0.
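Roughly, the difference comes down to flow control. A CPU-side analogy of what I mean (hypothetical shading functions, nothing to do with either card's actual drivers): a 2.0-style shader generally evaluates both sides of a per-pixel condition and just selects one result, while a 3.0-style shader can branch and skip the expensive path entirely - same picture, different amount of work.

Code:
// CPU-side analogy only: expensiveLighting() and cheapAmbient() are
// hypothetical stand-ins for per-pixel shader work, not real API calls.
#include <cmath>

float expensiveLighting(float x) {        // pretend per-pixel specular etc.
    float acc = 0.f;
    for (int i = 0; i < 64; ++i)
        acc += std::sin(x + i) * std::cos(x - i);
    return acc;
}

float cheapAmbient(float) { return 0.1f; }

// PS 2.0 style: no real dynamic branching, so both paths get evaluated
// and the condition only picks one result (compare/lerp-style selection).
float shadePS20(float x, bool inLight) {
    float lit   = expensiveLighting(x);   // always paid for
    float unlit = cheapAmbient(x);
    return inLight ? lit : unlit;
}

// PS 3.0 style: dynamic branching lets shadowed pixels skip the work.
float shadePS30(float x, bool inLight) {
    if (inLight)
        return expensiveLighting(x);      // only paid where it matters
    return cheapAmbient(x);
}

The picture comes out identical either way, which is exactly the point both sides are making; the argument is only about where the cycles go.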

The issue may come down to TWIMTBP vendors being advised by their partner not to provide any (or any efficient) 2.0 code path for their shaders, or to create 3.0 shaders that either cannot be compiled into 2.0 at all or only at a great penalty. This is where the games will start to be played.

I do find it interesting that Far Cry, a TWIMTBP game, is going to support the new ATI compression that is coming out.
 
Stryyder said:
I do find it interesting that Far Cry, a TWIMTBP game, is going to support the new ATI compression that is coming out.
Few developers actually take TWIMTBP as "ignore ATI"...
 
Yes, but supporting a feature Nvidia cards don't? Giving non-Nvidia cards an advantage?

I guess if you use 3Dc, you're not playing the game "the way it's meant to be played".
 
Well, if I remember correctly, 3Dc is an open standard, meaning NVIDIA could support it. Obviously it wouldn't be supported in NV40, but the opportunity is there.
 
Nick Spolec said:
Yes, but supporting a feature Nvidia cards don't? Giving non-Nvidia cards an advantage?

I guess if you use 3Dc, you're not playing the game "the way it's meant to be played".

Would that not be the same as running nVidia HW in legacy mode (i.e. no PS 2.0), as was the case with the retail game?
 
Nick Spolec said:
Yes, but supporting a feature Nvidia cards don't? Giving non-Nvidia cards an advantage?

I guess if you use 3Dc, you're not playing the game "the way it's meant to be played".
Just like UT2003 and TruForm.
 
hstewarth said:
I guess from the ATI perspective, they want us to believe that PS 3.0 and PS 2.0 have no difference...
Anyone remember when 3DFX were saying that 16 bit colour was more than enough for games and, anyway, nobody could tell the difference between it and 32 bit colour... ? :)
 
Diplo said:
hstewarth said:
I guess from the ATI perspective, they want us to believe that PS 3.0 and PS 2.0 have no difference...
Anyone remember when 3DFX were saying that 16 bit colour was more than enough for games and, anyway, nobody could tell the difference between it and 32 bit colour... ? :)

No, but I do remember when 3dfx were saying that they believed 22 bit color was an optimal compromise between lower quality / faster 16 bit and higher quality / slower 32 bit.

And considering their 22 bit was just about as fast as 16 bit, and came pretty close to (but did not match) 32 bit quality... I'd say it was an excellent compromise.
 
Diplo said:
hstewarth said:
I guess from the ATI perspective, they want us to believe that PS 3.0 and PS 2.0 have no difference...
Anyone remember when 3DFX were saying that 16 bit colour was more than enough for games and, anyway, nobody could tell the difference between it and 32 bit colour... ? :)

Less than a handful of games actually make extensive use of PS 2.0, let alone ANY that make use of PS 3.0... (the Far Cry patch isn't out) So... How are we supposed to tell the difference? What? Wait for an Nvidia demo that "illustrates" the visual differences between the two standards?

How would they know the visual differences? They never even had a GPU that really supported PS 2.0 until the 6800.
 
Diplo said:
hstewarth said:
I guess from the ATI perspective, they want us to believe that PS 3.0 and PS 2.0 have no difference...
Anyone remember when 3DFX were saying that 16 bit colour was more than enough for games and, anyway, nobody could tell the difference between it and 32 bit colour... ? :)
Wow, a seemingly applicable analogy that, on more than a cursory glance, really isn't applicable at all. And from Diplo too! Who would have thunk it!
 
Althornin said:
Diplo said:
hstewarth said:
I guess from the ATI perspective, they want us to believe that PS 3.0 and PS 2.0 have no difference...
Anyone remember when 3DFX were saying that 16 bit colour was more than enough for games and, anyway, nobody could tell the difference between it and 32 bit colour... ? :)
Wow, a seemingly applicable analogy that, on more than a cursory glance, really isn't applicable at all. And from Diplo too! Who would have thunk it!

tsk tsk. Just take a cold shower, go outside, walk around a bit and you'll feel much better. Life will suddenly take on a whole new meaning. :LOL:
 
Nick Spolec said:
Less than a handful of games actually make extensive use of PS 2.0, let alone ANY that make use of PS 3.0... (the Far Cry patch isn't out) So... How are we supposed to tell the difference? What? Wait for an Nvidia demo that "illustrates" the visual differences between the two standards?
I agree. In fact, not many games even use many PS1.x features, apart from the obligatory water effect. I'd also say the difference in quality between PS1.x and PS2.0 isn't particularly great or noticeable in a lot of cases.

So, what should we do? Say things are fine as they are, let's freeze here? Quality can never get better, so let's stop striving for it anymore?

When Nvidia unveiled T&L with the GeForce 256, did any games support it? Nope. I remember 3DFX fans back then saying how it would never catch on and laughing that no games supported it (I bought my GeForce 256 2nd hand off someone who sold it to buy a Voodoo3 - hah!). Now how many games don't support T&L? How many recent games are designed just for 16-bit colour?

The point is, it may not be that relevant at the moment, but at some point in the future 32-bit precision will become the norm. However, this progress will only be made if manufacturers actually release hardware to support it. It's a chicken and egg situation! How can developers start working on features if there is no hardware they can use to test them? So instead of bemoaning the fact, just because your favourite video card manufacturer (who loves you too, I'm sure) doesn't support it, celebrate the fact that it's a step nearer.

So personally I'm very happy to see cards supporting new features, because this actually enables developers to make better looking games in the future. If people want to pay $500 to be an early adopter, be my guest. All I can say is, "Thanks for paying off the R&D costs for these cool new features I'll wait a while for and get for half the price".

Isn't this what we all want, better looking games? It's like some of you are more interested in arguing the toss for your favourite video card manufacturer than actually playing cool games. I can't wait for games to use the Unreal 3 Engine - it looks stunning. I couldn't care less what card it runs on, so long as one day I can play games like that. I couldn't care less if Tim Sweeney/Carmack is sucking Satan's cock, so long as they keep making stunning advances in 3D programming. :devilish:
 
Diplo said:
When Nvidia unveiled T&L with the GeForce 256, did any games support it? Nope. I remember 3DFX fans back then saying how it would never catch on and laughing that no games supported it (I bought my GeForce 256 2nd hand off someone who sold it to buy a Voodoo3 - hah!). Now how many games don't support T&L?
I think this was only true for Direct3D games, where the programmer explicitly has to choose whether or not to use HW T&L. In OpenGL the difference is entirely in the drivers. There's no difference in the coding.
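For what it's worth, the Direct3D side of that looks something like this: the application has to ask for hardware vertex processing when it creates the device, so a title that never sets the flag never touches the T&L unit. A minimal Direct3D 9-style sketch (assumes d3d, hwnd and d3dpp are set up elsewhere; error handling omitted):

Code:
// Direct3D: the app opts in to hardware T&L at device creation.
// Sketch only; assumes d3d, hwnd and d3dpp already exist.
#include <windows.h>
#include <d3d9.h>

IDirect3DDevice9* createDevice(IDirect3D9* d3d, HWND hwnd,
                               D3DPRESENT_PARAMETERS& d3dpp) {
    IDirect3DDevice9* dev = nullptr;

    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    // If the card reports hardware T&L, ask for it; otherwise fall back to
    // software vertex processing. A game that never requests the hardware
    // flag simply never uses the T&L unit, whatever the card can do.
    DWORD vp = (caps.DevCaps & D3DDEVCAPS_HWTRANSFORMANDLIGHT)
                   ? D3DCREATE_HARDWARE_VERTEXPROCESSING
                   : D3DCREATE_SOFTWARE_VERTEXPROCESSING;

    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd, vp,
                      &d3dpp, &dev);
    return dev;
}

In OpenGL there's nothing equivalent to choose: the same glVertexPointer/glDrawElements calls run either way, and the driver decides whether the T&L hardware does the transform.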
 
Diplo said:
I agree. In fact, not many games even use many PS1.x features, apart from the obligatory water effect.
...
Isn't this what we all want, better looking games?

If ya ask me, I want game developers to do more than use pretty water in games with pixel shaders!!!
 