ATI Talks Lithography

jvd said:
I’d rather know or discuss real details about ATI. They rock
Do you have any details? All I've been hearing is that it will be an advanced design and fall in between their product cycles.

Sure. R420 is based on PCI Express.
Also, the 9800 Sapphire Dual GPU is coming out soon in response to XGI.
R420 may include an 8x AGP bracket adapter.

I also have other ideas. But ChairmanSteve of my home forum is the PC person.
I'm all console with an awareness for other things.

Now go hit the news. I'm too lazy to dish PC info. :p

BTW - We should all be beyond comparing the GF2 with anything at this time. Yes, please?
For that topic just start a new thread.
 
But I’ve never agreed beyond 90nm.

Everyone else has; and has moved on since.

Shin`ichi may have indeed been trying what y'all are thinking. (He'd have to for 1024-bit bussing.) Of course, he is no longer with Sony.

WTF is this? This is where I fail to see your thinking. How does Shin leaving mean there is a big hole in PS3/CELL development? Are you really suggesting that Shinichi was doing all the engineering himself? There were 300 people working on Cell, and its design was done a while ago. One man leaving doesn't mean anything.
 
David_South#1 said:
jvd said:
I’d rather know or discuss real details about ATI. They rock
Do you have any details? All I've been hearing is that it will be an advanced design and fall in between their product cycles.

Sure. R420 is based on PCI Express.
Also, the 9800 Sapphire Dual GPU is coming out soon in response to XGI.
R420 may include an 8x AGP bracket adapter.

I also have other ideas. But ChairmanSteve of my home forum is the PC person.
I'm all console with an awareness for other things.

Now go hit the news. I'm too lazy to dish PC info. :p
Ah, I meant about the Xbox contract. I thought ATI was not going to write drivers for the dual GPU from Sapphire.
 
Paul said:
But I’ve never agreed beyond 90nm.

Everyone else has; and has moved on since.

Shin`ichi may have indeed been trying what y'all are thinking. (He'd have to for 1024-bit bussing.) Of course, he is no longer with Sony.

WTF is this? This is where I fail to see your thinking. How does Shin leaving mean there is a big hole in PS3/CELL development? Are you really suggesting that Shinichi was doing all the engineering himself? There were 300 people working on Cell, and its design was done a while ago. One man leaving doesn't mean anything.

Hi Paul,
This is why I prefer to take Cell and Playstation as two different topics.
Sony/Toshiba are the people pushing for 65nm.
IBM and STI (Cell) are not dependent on it.
I'd honestly prefer not to debate the point so long as people view them as the same thing.

I'll make it simple. As the CTO & Head of R&D, it is Shin who is responsible for efforts like the 65nm fab.
It may be that he got Sony to get ahead of itself. He's done it before.
If you wish to discuss this point go to the thread I started about him.

Besides this is the ATI thread. ;)
 
And there are many things that the GeForce 2 is able to do that the PS2 can't. But at the end of the day, they both put out close to the same image when a game is coded for just that hardware. That is my point. I wasn't saying one was better than the other. I was just pointing out that the video game systems are on the same level as the PC hardware out at the time. The only reason console games look better at the start is that games are made for that specific hardware, whereas it may take years for a PC game to come out targeting that hardware, and by then we are talking about the newest hardware.

The past is not a good indicator of the future... the PS3 is being built on cutting-edge tech.

For example, I'm sure the difference between the EE and the PC CPUs of its time will be ridiculously dwarfed by the difference between Cell and the CPUs near its time... The same goes for GPUs: what if the GS had been several times as complex, and far faster? The PS3 chips are said to likely run at multi-GHz speeds, and both will likely be bigger than anything on the market at the time.

Don't go pulling the whole polygon-count bit, because they obviously aren't on the same level if the GPU is doing things the CONSOLE CAN'T.


Come on, I've been playing late-gen PSX RPGs; the models exceed those of games that required PCs that came years later. Quake III and Half-Life are surpassed. True, they ain't technically superior, but to me, seeing better graphics, even if it's within a few areas of a game, is quite amazing; I'd say incredible when you consider the RAM and the difference in years between hardware.

There are things that the GeForce 2 can do that improve image quality, make up for its lack of polygon-pushing power, and provide an image quality on par with the PS2.

If I'm not mistaken, the GeForce line currently lags behind others in the PC arena in terms of IQ, and I believe it's never been known for uber AA.

Now, to settle this as adults: why don't you post the full specs of the Emotion Engine, and I will post the full specs of the Radeon 64MB and the GeForce 2 as soon as I'm able to find them, since none of the sites seem to have them.

We did that last time. The specs of those GPUs are theoretical numbers that are never achieved; IIRC, the bandwidth wasn't that hot... heck, they'd be fuxxored dealing with highly detailed modern models.

The PS2 not only owns them on fillrate, poly count, and VRAM bandwidth, but it also owns 'em on flexibility; remember, the VUs are capable of more advanced T&L than the primitive stuff those had.

Hi Paul,
This is why I prefer to take Cell and Playstation as two different topics.
Sony/Toshiba are the people pushing for 65nm.
IBM and STI (Cell) are not dependent on it.
I'd honestly prefer not to debate the point so long as people view them as the same thing.


Well, if it's true that Sony has 90nm PlayStation products out right now, and they're launching another 90nm product next year... it seems unrealistic to expect them to put all development in stasis for two years (the PS3 is likely to launch either March or fall 2005)...

You know, new cutting-edge consoles are supposed to use state-of-the-art tech, not what a company had available years prior.

Besides this is the ATI thread.

Yes, we must hope they can finish off NVIDIA and move on to future fights with decent competitors.

 
David_South#1 said:
This is why I prefer to take Cell and Playstation as two different topics.
Sony/Toshiba are the people pushing for 65nm.
IBM and STI (Cell) are not dependent on it.
I'd honestly prefer not to debate the point so long as people view them as the same thing.

I'll make it simple. As the CTO & Head of R&D, it is Shin who is responsible for efforts like the 65nm fab.
It may be that he got Sony to get ahead of itself. He's done it before.
If you wish to discuss this point, go to the thread I started about him.

I'm going to sleep, but before I do, just a quick suggestion. Most of us here are well aware of the duality between STI's architecture (e.g. Cell) and what SCEI will implement. There has been many a discussion on this very topic, and such subset discussions as the long-term impact this will have on the consumer electronics industry via Sony and Toshiba's adoption. I know this because I've spoken on it many a time, and I happen to know that Paul understands this - Panajev already touched on adapting the architecture for PS3 usage in this very thread. So, you don't need to lecture us on it, my friend.

It's also been discussed many a times that Cell will enter production in 2004 @ the 90nm node. From what I've heard, this will be done at E. Fishkill. Subsequently, SCEI and Toshiba will begin risk production on a 65nm Cell superset for PS3 in 2H 2004/1H2005 at their presently under construction Nagasaki and OTSS expansions. This is all understood.

The problem is that you've routinely lumped comments about the Cell architecture in with Sony/PS2, and the result is that people think you're referring to the Cell PS3 and call you on it. For example:

David_South said:
Other comments around here suggest 65-45nm production? :cry:
Considering the only business their Fab will be doing is PS2 before the PS3 is ready,
Pushing TO START in 65nm is more than excessive

This is wrong and, in fact, it's your error. The EE+GS roadmap ended at 90nm AFAIK, as the IC has reached diminishing returns and its sub-100mm^2 size has already made it extremely viable economically. I question if it's reached the equilibrium point at 90nm where any further node shrink would lead to an effectively higher cost on a relative scale. Not to mention the move to an integrated IC has effectively doubled production at Nagasaki/OTSS. The 65nm investment, which happens to match their XDR investment, is overwhelming evidence that it's for the Cell PS3. If anything, I'd hazard to say it's you who are confusing Cell's 90nm entrance with PS3, instead of seeing its broader market picture.

Ok, now back to ATI for real.

EDIT: The very fact that you can buy a PS2 today using the process you think PS3 will use scares me :)
 
The past is not a good indicator of the future... the ps3 is being done in cutting edge tech.

For example, I'm sure the difference between the EE and the PC CPUs of its time will be ridiculously dwarfed by the difference between Cell and the CPUs near its time... The same goes for GPUs: what if the GS had been several times as complex, and far faster? The PS3 chips are said to likely run at multi-GHz speeds, and both will likely be bigger than anything on the market at the time.
Yes, so will the GPUs that come out in the same year.

Come on, I've been playing late-gen PSX RPGs; the models exceed those of games that required PCs that came years later. Quake III and Half-Life are surpassed. True, they ain't technically superior, but to me, seeing better graphics, even if it's within a few areas of a game, is quite amazing; I'd say incredible when you consider the RAM and the difference in years between hardware.
I'm sorry, are you comparing a PlayStation 1 video game to Quake 3? There is no comparison; Quake 3 blows it away. At the time of the PSX and Saturn, computers hadn't moved into the 3D arena.

If I'm not mistaken, the GeForce line currently lags behind others in the PC arena in terms of IQ, and I believe it's never been known for uber AA.
The GeForce 2 (we are not talking about the FX line, which is horrible) used supersampling for AA, which is very nice. It did not look as good as the Voodoo 5's AA, but then again, they say not even the R300s beat that in quality.
We did that last time. The specs of those GPUs are theoretical numbers that are never achieved; IIRC, the bandwidth wasn't that hot... heck, they'd be fuxxored dealing with highly detailed modern models.

The PS2 not only owns them on fillrate, poly count, and VRAM bandwidth, but it also owns 'em on flexibility; remember, the VUs are capable of more advanced T&L than the primitive stuff those had.
The PS2 has many bandwidth problems and bottlenecks which keep it from hitting its max numbers, and the numbers drop quickly as you add more things. There are many things in which the GeForce 2 owns the PS2.

As I have said, neither completely owns the other in terms of specs, and when you look at games that are targeted at the platform, you will see that the final image is equal.
 
Vince,
At the time it was performing 64 operations per cycle at 550MHz.
No silicon details were provided, but it can be interpreted a few ways (32-bit / 64-bit):
4 ops per cycle per unit = 16 units / 2 ops per unit = 32 units
An APU is composed of 4 or 8 total units = 2 IU + 2 FPU / 4 IU + 4 FPU
So a guess of 2-8 APUs

64 ops per cycle = 32 FP MADD or 32 Integer MADD per cycle.

This means 8 APUs as, if you looked at the APU patent posted several times here, you cannot do integer ops at the same time as FP ops on the same APU ( the APUs, given by the specific IBM patent, can only do one of the following: 1 FP Vector op, 1 Integer Vector op, 1 Scalar FP op or 1 Scalar Integer op [all per cycle], which would make sense [simpler register file, simpler APU logic, reduced power consumption] ).

This configuration and 550 MHz of clock speed in 2H 2003?

Pretty impressive, thanks for the leak ( if real [no offense David, really] ).

You should keep those numbers private, your sources will not like having too much data exposed ( well, in this case, it would be particularly good news, so they might have encouraged you to post it [if this is real] ).

Edit: the performance numbers seem too good to be real... I am really starting to doubt the leak. Also, I do not believe each APU to be doing 4 FP MADD or Integer MADD operations per cycle.

BTW, having 2 of "your" kind of APUs is basically the same as having 8 of the "IBM patent" kind of APUs [which is what I think is reflected in Suzuoki's CELL patent] if you really think about it ( performance wise ).

(4 * 2 ops/cycle ) * 8 = 64 ops/cycle

( (4* 2 ops/cycle) * 4 ) * 2 = 64 ops/cycle
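The equivalence is easy to sanity-check. A minimal Python sketch; all the unit counts below are the speculative figures from this post, not confirmed specs:

```python
# Sanity check: both proposed APU arrangements reach the same 64 ops/cycle.
# Unit counts are guesses from the discussion above, not confirmed CELL specs.

def ops_per_cycle(apus, subunits_per_apu, ops_per_subunit):
    """Peak ops/cycle = APUs x sub-units per APU x ops per sub-unit (MADD = 2)."""
    return apus * subunits_per_apu * ops_per_subunit

# "IBM patent" style: 8 APUs, each with 4 sub-units doing one 2-op MADD per cycle.
weak = ops_per_cycle(apus=8, subunits_per_apu=4, ops_per_subunit=2)

# "Strong" style: 2 APUs, each with 4 groups of 4 sub-units (16 total).
strong = ops_per_cycle(apus=2, subunits_per_apu=16, ops_per_subunit=2)

print(weak, strong)  # 64 64
```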

Edit 2: Still, your kind of APU might be supported by IBM's APU patent if you do not assume it would be doing FP operations along with integer operations ( either 1x4 FP Vector or 1x4 Integer Vector or 1x4 FP Scalar or 1x4 Integer Scalar [I say 1x4 because you have 4 groups of sub-units in the APUs] ).

Your APU would trade in logic ( quite a bit of it ) for performance and would not need 4 GHz to reach 1 TFLOPS ( which I believe will be what the CELL based Broadband Engine and the CELL based Visualizer would achieve if you combine their performance together ).

This would make the chip a bit bigger than I was thinking, and the effort to shrink it using 45 nm SOI ( with capacitor-less e-DRAM cells ) would be even more pressing for SCE ( they could still do it ).

With your kind of strong APUs ( 32 ops/cycle per APU ) you could do this:

1 BE with 2 PEs and 8 APUs per PE running at 1.5 GHz

+

1 VS with 2 PEs with 4 APUs per PE running at 1.25 GHz

= ~1 TFLOPS combined


With the weak APUs ( 8 ops/cycle per APU ) we would need to have 4 PEs in each BE and VS:

8 APUs/PE * 8 ops/cycle per APU * 4 PEs * 3 GHz = 768 GFLOPS

+

4 APUs/PE * 8 ops/cycle per APU * 4 PEs * 1.5 GHz = 192 GFLOPS

= 960 GFLOPS

In both cases we are pretty close to the 1 TFLOPS target.


Now, pushing the 65 nm process quite a bit more we could imagine 1 TFLOPS for the Broadband Engine alone.

This could come in two configurations:

a.) 4 PEs ( 32 APUs at 8 ops/cycle each ) at 4 GHz

b.) 4 PEs ( 32 APUs at 32 ops/cycle each ) at 1 GHz

Now, we have to think and decide which one of the two choices is the design that is closer to what SCE's plants could manufacture.
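Since the thread keeps juggling these peak numbers, here is a quick recomputation of the figures above. Everything in it (PE/APU counts, ops/cycle, clocks) is the speculative data from this post, not a confirmed CELL spec:

```python
# Recompute the speculative peak-FLOPS figures above.
# Peak GFLOPS = PEs * APUs/PE * ops/cycle per APU * clock in GHz.
# All parameters are the poster's guesses, not confirmed specs.

def gflops(pes, apus_per_pe, ops_per_cycle, ghz):
    return pes * apus_per_pe * ops_per_cycle * ghz

# "Strong" APUs (32 ops/cycle each): BE at 1.5 GHz + VS at 1.25 GHz.
strong = gflops(2, 8, 32, 1.5) + gflops(2, 4, 32, 1.25)

# "Weak" APUs (8 ops/cycle each), 4 PEs per chip: BE at 3 GHz + VS at 1.5 GHz.
weak = gflops(4, 8, 8, 3.0) + gflops(4, 4, 8, 1.5)

# Single-chip 1 TFLOPS configurations (a) and (b):
config_a = gflops(4, 8, 8, 4.0)    # 32 APUs at 8 ops/cycle, 4 GHz
config_b = gflops(4, 8, 32, 1.0)   # 32 APUs at 32 ops/cycle, 1 GHz

print(strong, weak, config_a, config_b)  # 1088.0 960.0 1024.0 1024.0
```

Both single-chip configurations land at exactly the same 1024 GFLOPS peak; the trade-off is purely clock speed versus per-APU logic.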
 
zidane1strife said:
For example I'm sure the difference between the EE and the pc cpus of the time will be ridiculously dwarfed by the difference between cell and cpus near its time...

This reminds me of the kind of things being said about PS2 leading up to its release. PS2 kind of failed to live up to its promise of "emotion synthesis" and all that crap, though. PS2 also used a cutting-edge process with very, very large chips (for its time). Just because it used large chips didn't mean it had great IQ, cutting-edge features, or even basic features like texture compression. I just wonder if PS3 will be another piece of quirky, overhyped hardware. I'm thinking ATI has the ability to make something competitive or superior with far fewer transistors and much lower cost. And I'm pretty convinced that, given the same transistor budget, ATI could design a better piece of hardware than Sony.
 
SH3 is great, but as someone else said graphically it's nothing a Geforce2 couldn't handle (I remember playing the PC version of SH2 on my $40 GF2 MX, with better IQ than the PS2 version).
 
I think that a GF2 + Pentium III 600 MHz ( Xbox 1 if it came out closer to PlayStation 2's launch ) would chug while running Silent Hill 3 ( the shadow system and lighting seems more advanced than Silent Hill 2 and so is animation and character detail ).
 
I think that if my GF2 MX could run SH2, a regular GF2 could run SH3. But regardless, my point is that big hardware does not always equal good hardware.
 
Panajev2001a said:
I think that a GF2 + Pentium III 600 MHz ( Xbox 1 if it came out closer to PlayStation 2's launch ) would chug while running Silent Hill 3 ( the shadow system and lighting seems more advanced than Silent Hill 2 and so is animation and character detail ).
It depends on how many lights it uses. One or two and the GeForce would be fine. The GeForce 2 can do a lot at 640x480.
 
Josiah said:
SH3 is great, but as someone else said graphically it's nothing a Geforce2 couldn't handle (I remember playing the PC version of SH2 on my $40 GF2 MX, with better IQ than the PS2 version).

I also played SH2 on my 1700+ and GF2 MX + 256 DDR, and it sucked. The frame rate was never above 20. The situation was worse in fog... and with the noise filter ON. It was simply unplayable. And also, I don't like PC "type" graphics... I love console "style"...
 
Clem,

Well, the Visualizer is widely considered to be the "GPU" of PS3; in reality, it is only a rasterizer.

It has APUs, meaning it can be programmed to run programs, most likely shaders.

My guess is that it will support your basic functions such as filtering, but the shaders would be programmed and run right on the VS's APUs.

As for NVIDIA, it really all blew over. Some Sony exec came out and basically bashed the idea, proving it wrong.


Visualizer (4-Visualizer PEs together) is a GPU, going by the Sony patent that's been well known since early this year.

The Visualizer is a GPU, unlike the Graphics Synthesizer, because the Visualizer has PUs+APUs which can provide floating-point computations for T&L and/or Vertex Shading. Therefore, the Visualizer could seemingly provide its own geometry & lighting calculations (for the Pixel Engines + Image Cache), unlike the GS in PS2: by itself, the GS cannot provide its own geometry & lighting calculations, because the VUs in the EE CPU provide T&L for the GS. Therefore the GS is just a rasterizer, but the 4 Visualizer PEs that form the "Visualizer" chip are a GPU.

If the Visualizer only consisted of Pixel Engines + Image Caches + CRTCs, then it could be considered a rasterizer, but since the Visualizer also has these PUs+APUs, it has a front-end and could therefore do 3D graphics processing (the entire 3D pipeline) completely independently of the BE/EE3/CPU if it had to. While the Visualizer most likely could not do the billions of polygons/sec that PS3 will do by itself (it will need assistance from the BE/EE3/CPU for that), I speculate that a Visualizer GPU made from 4 Visualizer PEs (as shown in the patent) could do several hundred million simple polygons with lighting on its own, without the CPU.

Whereas the GS in PS2 cannot draw a single polygon without the EE feeding it T&L calculations.
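The rasterizer-vs-GPU distinction being drawn here can be sketched in code. This is purely illustrative: the class and method names are invented, and the patent describes hardware, not software.

```python
# Illustrative sketch of the distinction above: a pure rasterizer (GS) must be
# fed already-transformed, already-lit vertices, while a GPU with its own
# front-end (the Visualizer's PUs+APUs) can run T&L itself. Names are invented.

class GraphicsSynthesizer:
    """A rasterizer: consumes vertices something else already transformed and lit."""
    def rasterize(self, transformed_vertices):
        return [f"pixels({v})" for v in transformed_vertices]

class Visualizer(GraphicsSynthesizer):
    """A GPU: its PUs+APUs give it a geometry front-end of its own."""
    def transform_and_light(self, model_vertices):
        return [f"lit({v})" for v in model_vertices]

    def draw(self, model_vertices):
        # No external CPU (EE/BE) required for the geometry stage.
        return self.rasterize(self.transform_and_light(model_vertices))
```

With the GS, something upstream (the EE's VUs) has to produce the input to `rasterize`; the Visualizer's `draw` covers the whole pipeline by itself.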
 
Deepak said:
Josiah said:
SH3 is great, but as someone else said graphically it's nothing a Geforce2 couldn't handle (I remember playing the PC version of SH2 on my $40 GF2 MX, with better IQ than the PS2 version).

I also played SH2 on my 1700+ and GF2 MX + 256 DDR, and it sucked. The frame rate was never above 20. The situation was worse in fog... and with the noise filter ON. It was simply unplayable. And also, I don't like PC "type" graphics... I love console "style"...
I'm sorry, but what res were you running it at? Run it at 640x480 and it will run fast enough.
 
jvd said:
Deepak said:
Josiah said:
SH3 is great, but as someone else said graphically it's nothing a Geforce2 couldn't handle (I remember playing the PC version of SH2 on my $40 GF2 MX, with better IQ than the PS2 version).

I also played SH2 on my 1700+ and GF2 MX + 256 DDR, and it sucked. The frame rate was never above 20. The situation was worse in fog... and with the noise filter ON. It was simply unplayable. And also, I don't like PC "type" graphics... I love console "style"...
I'm sorry, but what res were you running it at? Run it at 640x480 and it will run fast enough.


Whatever resolution: games like J&D2, ZOE2, and to a certain extent SH3 (remember how many polygons are on those beautiful models) would NOT run at a decent framerate on a GF2. Hell, ZOE2 doesn't run at a perfect framerate even on PS2... :LOL:
 
jvd said:
Deepak said:
Josiah said:
SH3 is great, but as someone else said graphically it's nothing a Geforce2 couldn't handle (I remember playing the PC version of SH2 on my $40 GF2 MX, with better IQ than the PS2 version).

I also played SH2 on my 1700+ and GF2 MX + 256 DDR, and it sucked. The frame rate was never above 20. The situation was worse in fog... and with the noise filter ON. It was simply unplayable. And also, I don't like PC "type" graphics... I love console "style"...
I'm sorry, but what res were you running it at? Run it at 640x480 and it will run fast enough.

And also bring the memory down to 32MB and tone down the processor speed...
 