Anand has the details about R520, RV530, RV515

Chalnoth said:
Hardware. The slave is just a normal video card. The master card has that extra compositing chip and inputs to take the output from the slave card, combine it with the master card's GPU output, and send the result to the monitor.

Hmm, if the R520 has the compositing chip built into the GPU silicon, wouldn't that be more like the way nV does SLI? Or was the R520 too far into development for ATI to add it to the GPU silicon?
 
_xxx_ said:
Is it confirmed that the mobile version of the GTX is on 90 nm?

I thought it was, but now... I don't know.

Check out the discussion in the 3D Graphics Boards & Drivers section for what we're talking about.
 
Razor1 said:
Hmm, if the R520 has the compositing chip built into the GPU silicon, wouldn't that be more like the way nV does SLI? Or was the R520 too far into development for ATI to add it to the GPU silicon?
There's more to it than just that, because you also need to have an external input for the pass-through cable.
 
_xxx_ said:
Is it confirmed that the mobile version of the GTX is on 90 nm?
No, it's definitely not. It's fabbed at TSMC on 110nm - I confirmed this with NVIDIA this morning.
 
Razor1 said:
Or was the R520 too far into development for ATI to add it to the GPU silicon?


I am sure it was feature-complete and locked before SLI first surfaced... at least to the point that adding that functionality to the R520 would have required major work, and it was too late in the pipeline.
 
jb said:
I am sure it was feature-complete and locked before SLI first surfaced... at least to the point that adding that functionality to the R520 would have required major work, and it was too late in the pipeline.

R580 is apparently working in-house; do you think its development was too far along as well?
 
Karma Police said:
R580 is apparently working in-house; do you think its development was too far along as well?

I have only dabbled in IC design, back in one of the old groups I used to work at. Usually most design changes were locked out about 4 to 6 months before we got first-pass silicon back. I would bet that most places follow a similar idea, meaning that to design something in as a core feature of your IC, it has to be there at the start of the project. Of course you can always wiggle a bit, and ATI does have some talented designers (as does NV)...
 
caboosemoose said:
Indeed. Surely it is nothing more than a regular G70 from the "very good" bins.

There's a review out at Bit-tech. Apparently, from most of the benchies, it's just a step behind the G70 GTX.

Quote:
The GeForce Go 7800: Frankly, we didn't believe NVIDIA when it told us that this mobile chip was going to be as fast as the desktop chip. However, leaving aside the fact that desktop enthusiast mainboards and chips are faster than laptops, the graphics chip itself appears to be almost exactly as fast as the 7800 GTX - that's an incredible feat.


http://www.bit-tech.net/hardware/2005/09/28/evesham_nvidia_7800/1.html
 
geo said:
Bit-tech has had a few goodies of late.

Well, they've had very good access, though from a devil's advocate sort of perspective, I'd rather they were a little more sceptical regarding Lost Coast. There wasn't much questioning regarding how Valve had delivered the impossible with HDR and AA on current cards (answer: they haven't, because it isn't possible). I've seen Lost Coast running now, and I can assure you it is less than wonderful. In fact, Valve don't really have much of a track record for building engines (one to date), and graphically speaking, Source is pretty ordinary.
 
caboosemoose said:
Well, they've had very good access, though from a devil's advocate sort of perspective, I'd rather they were a little more skeptical . . .

Heh, if you really wanted to be :devilish: then you'd have noted that the cynical might wonder if there was a connection between point 1 and point 2.

To which the inevitable rejoinder would include references to a green-eyed monster, and then downhill from there.

Me, I'll just let it go with nice stuff of late guys.
 
geo said:
Heh, if you really wanted to be :devilish: then you'd have noted that the cynical might wonder if there was a connection between point 1 and point 2.

To which the inevitable rejoinder would include references to a green-eyed monster, and then downhill from there.

Me, I'll just let it go with nice stuff of late guys.


I don't really mind it - I mean, it's pretty simple: if you get a special invitation from someone like Valve to preview something like Lost Coast, you're in a very tough position. If you think it's a bit pants, you're kind of stuck - if you say so, you will very likely be seen as very ungrateful and very definitely not be invited back. It's a bit of a fluff piece - but every site/mag on the planet would have done exactly the same thing, no question about that.
 
Exactly. And it's not like these are going to be "the last word" on these products. But it's nice to get a little something out there for folks to start chewing on. So it works out.
 
HDR is fine in theory, but the implementations seen today are horrible. If all HDR means is big blotchy washed-out white spots everywhere, then give me the option to turn that stuff off.
 
Junkstyle said:
HDR is fine in theory, but the implementations seen today are horrible. If all HDR means is big blotchy washed-out white spots everywhere, then give me the option to turn that stuff off.
That's not all that it does mean, though. Even in FarCry, which really has a spotty HDR implementation, there are scenes that you just can't get without HDR. For example, one of the more memorable sequences for me was when exiting one of the caves. The transition from the dark cave to the bright sunlight was done very well, I thought, as I just couldn't see much of anything until I got near to the mouth of the cave and my eyes adjusted.
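That cave-exit transition is usually done by having the renderer's exposure chase the current scene brightness over time, rather than snapping instantly. Here's a minimal sketch of the idea; the time constant and update rule are illustrative assumptions, not FarCry's actual code:

```python
import math

def adapt_exposure(current, target, dt, tau=0.8):
    """Ease exposure toward the target with time constant tau (seconds)."""
    return current + (target - current) * (1.0 - math.exp(-dt / tau))

# Inside the dark cave the adapted exposure is high.
exposure = 8.0
# Step into sunlight: the correct exposure is much lower.
target = 1.0
for frame in range(60):                       # ~1 second at 60 fps
    exposure = adapt_exposure(exposure, target, dt=1.0 / 60.0)
print(round(exposure, 2))                     # still well above 1.0 after a second
```

During that lag the image is overexposed (blinding white), which is exactly the "eyes adjusting" effect described above.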
 
Junkstyle said:
HDR is fine in theory but the implimentations seen today are horrible. If all HDR means is big blotchy washed out white spots everywhere then give me the option to turn that stuff off.

Levels have to be made with HDR in mind, otherwise things get really crazy. The overall environmental color and amount of light affect HDR greatly. Let's say we are using a bright white light and the environmental color is black (so the environment contributes almost nothing to the scene): you get a washed-out effect. But if you change the environmental color to, say, grey, then HDR will tone down to something more reasonable. And if you change the environmental color to white, then HDR won't affect the scene much at all.
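The effect described above falls out of how exposure-based tone mapping works: exposure is typically set from the average scene luminance, so the same bright light blows out against a dark environment but gets toned down against a grey one. A minimal sketch, assuming a simple Reinhard-style operator (an illustration of the general technique, not any particular engine's code):

```python
def average_luminance(pixels):
    """Mean luminance of the scene (HDR values can exceed 1.0)."""
    return sum(pixels) / len(pixels)

def tonemap(pixels, key=0.18):
    """Scale by the scene average, then compress with Reinhard's x/(1+x)."""
    avg = average_luminance(pixels)
    out = []
    for lum in pixels:
        scaled = key * lum / avg             # exposure driven by scene average
        out.append(scaled / (1.0 + scaled))  # compress into 0..1
    return out

# The same 10.0 light source against a dark vs. a grey environment:
dark_scene = [0.01] * 99 + [10.0]   # dark env -> high exposure -> washed out
grey_scene = [0.5] * 99 + [10.0]    # grey env -> lower exposure -> toned down
print(tonemap(dark_scene)[-1])      # close to 1.0 (blown out)
print(tonemap(grey_scene)[-1])      # noticeably lower
```

With a white environment the average luminance rises further still, so the light source barely stands out at all, matching the last case above.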
 