Official: ATI in XBox Next

Status
Not open for further replies.
No offense zidane, but do you really know what you are saying? I mean, your posts try to convey some impressive tech knowledge, but they just don't sound convincing enough...

You kind of remind me of one TXB member nicked -> 1TFLOPS ( :LOL: )... some guy claiming to be a PS2 developer, sounding all tech smartie, yet many of his posts are lifted straight from here... :rolleyes:
 
So it shows finally....


ANYWAY, can I get the full Nvidia "miracle machine" quote? From just the two words "miracle machine", it could either REALLY mean a miracle machine, or a miracle just to get games running, or a miracle that it will be ready any time soon, or a miracle that it will actually work, or simply a bargaining chip against MS..?
 
Well, I might be wrong, but the words ain't mine

30-60% efficiency on the world's TOP supah computah... the rest going as low as 15% or below... That is reality.

4X perf increase from TTT, which already appears to more than double the perf of Fellowship... that is reality.

I quoted the Weta guy; these guys are moving 100s of TBs across their networks (true, a lot of that is art assets, film footage, etc...)

I showed the Toy Story 1 specs, a few hundred 75-100MHz Sun crapo stations... the real-world perf, just a fraction of that pathetic spec...

That the console textures will be of lower rez, and that they're meant to be displayed on screens with far lower rez, is a fact.

The physics, AI, etc... do not have to be as good; we're not dealing with a Hollywood film here... that is fact.

Hardware-accelerated gfx calcs can be orders of magnitude faster... well, many people have said this.

Let's take our nice little time machina... and see what happens when we travel years back... slowly... but surely...

The HDDs become significantly smaller and slower, the amount of RAM is significantly less and slower, the busses are narrower/slower, the caches are smaller, the connections between CPUs take significantly more time to move around significantly less data, etc... Why is this important? Because we're dealing with massive amounts of data, and if it takes ever longer to move it around, it will drag perf along with it...

I've only taken the words of others, quotes, and specs, blended them together, and as always, based on observations, given what appears to be arising out of all this data...

 
This 2.8GHz Intel Xeon processor with 533MHz front-side bus speed helps deliver processing power, throughput and headroom for peak performance to your application-intensive blade servers.

533MHz system bus transfers information from the processor to the rest of the system at a rate up to four times faster than the 133MHz system bus used on Pentium III processors.

Wow that’s amazing...
Blade servers come with up to two 40GB ATA-100 HDDs at 5400rpm (other higher-perf HDDs can be added)
Well with all that ridiculous amount of data moving around... it's good to know we have impressive specs...
The I/O expansion option adds dual-port FC connectivity at up to 1.2Gb/s…
Wow impressive...
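The quoted "four times faster" figure is easy to sanity-check: both buses are 64 bits wide, so peak bandwidth scales directly with the effective transfer rate. A rough sketch (the 64-bit bus width is the standard P6/NetBurst front-side bus width, assumed here):

```python
# Back-of-envelope check of the quoted "four times faster" front-side bus claim.
# Both the Pentium III and Xeon FSBs are 64 bits (8 bytes) wide, so peak
# bandwidth scales with the effective transfer rate.

BUS_WIDTH_BYTES = 8  # 64-bit front-side bus

def peak_bandwidth_gb_s(mega_transfers_per_s: float) -> float:
    """Peak FSB bandwidth in GB/s for a 64-bit bus."""
    return mega_transfers_per_s * BUS_WIDTH_BYTES / 1000.0

xeon_533 = peak_bandwidth_gb_s(533)  # ~4.26 GB/s
p3_133 = peak_bandwidth_gb_s(133)    # ~1.06 GB/s

print(f"533MHz bus: {xeon_533:.2f} GB/s")
print(f"133MHz bus: {p3_133:.2f} GB/s")
print(f"ratio: {xeon_533 / p3_133:.1f}x")  # ~4.0x, matching the spec sheet
```

So the marketing line holds up arithmetically; whether that bandwidth is actually sustained under load is a separate question.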

I'm not at all sure what you are quoting from there, but if you are talking about the SGI system then the CPU isn't too important WRT graphics processing - this is a system that uses multiple graphics processors, i.e. when it says "number of graphics pipes" it is talking about the number of R300 graphics chips. Of course, each of these will have their own memory subsystem etc., etc.

... sales are gonna come down once PS3 is unveiled...

Considering this is using scalable graphics solutions, other scalable graphics will be available by that time as well - it will probably be just as easy to create these types of systems from R500/NV50 (if NVIDIA get their scalable solutions into gear).
 
I'm not at all sure what you are quoting from there, but if you are talking about the SGI system then the CPU isn't too important WRT graphics processing - this is a system that uses multiple graphics processors, i.e. when it says "number of graphics pipes" it is talking about the number of R300 graphics chips. Of course, each of these will have their own memory subsystem etc., etc.

I'm quoting IBM 2.8GHz Xeon blade server specs; these are the things that have replaced the Sun machines etc. at Pixar, and these are the things behind LotR: RotK.

Considering this is using scalable graphics solutions, other scalable graphics will be available by that time as well - it will probably be just as easy to create these types of systems from R500/NV50 (if NVIDIA get their scalable solutions into gear).

Well, the Cell arch is said to be scalable, and it appears to give more flexibility too.

Price/perf... Go for a petaflop render farm, or bundle a couple hundred GPUs...
 
ANYWAY, can I get the full Nvidia "miracle machine" quote? From just the two words "miracle machine", it could either REALLY mean a miracle machine, or a miracle just to get games running, or a miracle that it will be ready any time soon, or a miracle that it will actually work, or simply a bargaining chip against MS..?

Well, this ain't the quote, but it shows in what context it was said...

"Nvidia's CEO talks about the uncertainty of building XB2 because of all the resources it will take, how only a few companies have the critical mass (in engineering) and expertise to build consoles on the level of what is coming next-gen. Nvidia, he says, is on good terms with all three console makers"
 
zidane1strife said:
No direct replies to my post?

I just brought up the fact that the WORLD's MOST POWERFUL supah computah's perf hovers around 30-60ish%, and that most other supah computahs go far below this, 10-40%ish (thanks to pcengine for that number)... and that many of these machines use the fastest, widest, and best interconnects/RAM/etc... some custom-designed for this very purpose...

Then I brought up the fact that the world's most powerful render farm is composed of some 2000-ish CPUs, each capable of a few GFLOPS, but in reality it will only reach a really, really small portion of that potential...

Furthermore, the ridiculously large amounts of data that are being moved around, and into the processing elements, take ridiculously long going through all of these bottlenecked busses, connections, etc...

You just answered why we won't see anything close to movie-quality graphics on the PS3.
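The arithmetic behind the quoted claim is simple to lay out. Using the thread's own rough figures (about 2000 CPUs at a few GFLOPS each, sustained efficiency in the 10-40% range; the specific per-CPU rate below is illustrative, not a measured value):

```python
# Illustrative peak-vs-sustained arithmetic using the thread's rough figures.
# The CPU count, per-CPU GFLOPS, and efficiency range are the numbers quoted
# in the discussion, not measured values.

def sustained_tflops(num_cpus: int, gflops_per_cpu: float, efficiency: float) -> float:
    """Sustained throughput in TFLOPS given per-CPU peak rate and efficiency."""
    return num_cpus * gflops_per_cpu * efficiency / 1000.0

peak = sustained_tflops(2000, 5.0, 1.0)   # theoretical peak: 10 TFLOPS
low = sustained_tflops(2000, 5.0, 0.10)   # 10% efficiency: ~1 TFLOPS
high = sustained_tflops(2000, 5.0, 0.40)  # 40% efficiency: ~4 TFLOPS

print(f"peak: {peak:.0f} TFLOPS, sustained: {low:.0f}-{high:.0f} TFLOPS")
```

The gap between the peak line and the sustained range is the whole argument: on paper the farm looks enormous, but the achievable fraction is what actually renders frames.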
 
Zidane,

What you seem to be forgetting is that the data isn't the issue here. As I've already stated, CELL will likely succeed in moving data. It's the algorithms for other parts of the game. Implementing the algorithms to take ADVANTAGE of the resources made available by CELL is what I'm talking about. The data is basically a non-issue; it comes in nice neat packages for the most part, and the data structures and the related algorithms are well explored.

The issue is that taking advantage of the computing environment that CELL presents isn't that well explored, relatively speaking.
 
People seem to forget that we'll never have "movie quality" CG from a system simply because game developers don't have $100+ million to spend on the games. :p Nor are they getting box office profits any time soon... Who cares about the hardware capabilities? Hehe...
 
zidane1strife said:
Considering this is using scalable graphics solutions, other scalable graphics will be available by that time as well - it will probably be just as easy to create these types of systems from R500/NV50 (if NVIDIA get their scalable solutions into gear).

Well, the Cell arch is said to be scalable, and it appears to give more flexibility too.

Price/perf... Go for a petaflop render farm, or bundle a couple hundred GPUs...

This exact same thing was said about PS2. As I understand it, they did develop a multi-processor system (GSCube), but who is using it? If it's such a great idea, why wasn't LotR rendered on a GSCube?

Something tells me ILM isn't going to abandon Renderman anytime soon...
 
cthellis42 said:
People seem to forget that we'll never have "movie quality" CG from a system simply because game developers don't have $100+ million to spend on the games. :p Nor are they getting box office profits any time soon... Who cares about the hardware capabilities? Hehe...

No, when you are dealing with hundreds of gigs of just textures, it would destroy even the PS3 with its on-die cache. That is the reason: it's just not feasible. And by the time it is feasible for a certain quality, say Toy Story, the movies will be so far beyond that.
 
Well, the point of my posts was to show that render farms' real-world perf is quite pathetic; the numbers of the top ones hover around the numbers that are possible through the patent.

That these modern ones are orders of magnitude more powerful than previous ones.

That thanks to hardware acceleration, the PS3 could perform certain tasks orders of magnitude faster than render farms that have orders of magnitude more perf.

And that since, yeah, we don't need to have models as geometrically complex, we don't have to be as precise in the physics calcs, etc... things would be easier, and comparable.

The data thing was to show that the b/w starvation that goes on and slows the render farms won't likely be a prob on the PS3.

That's all...
 
zidane1strife said:
Well, the point of my posts was to show that render farms' real-world perf is quite pathetic; the numbers of the top ones hover around the numbers that are possible through the patent.

That these modern ones are orders of magnitude more powerful than previous ones.

That thanks to hardware acceleration, the PS3 could perform certain tasks orders of magnitude faster than render farms that have orders of magnitude more perf.

And that since, yeah, we don't need to have models as geometrically complex, we don't have to be as precise in the physics calcs, etc... things would be easier, and comparable.

The data thing was to show that the b/w starvation that goes on and slows the render farms won't likely be a prob on the PS3.

That's all...

No, instead of having large amounts of RAM and little b/w, the PS3 will have almost no RAM compared to the render farms, and healthy bandwidth. What's your point? They fixed one problem and added another.

The Cell chip is no big deal. There is always something better 6 months later.
 
The Cell chip is no big deal. There is always something better 6 months later.

What do you mean no big deal? I don't understand. When you say big deal are you talking spec wise? Or big deal to things as a whole? It's a pretty big deal to Sony :)

Assuming 1TFLOPS, there would be nothing from Intel to match its floating-point performance for years. Not to mention, from what we see, FIG6 can do 1TOPS as well.

You can't compare a standard PC chip to Cell, as they do two totally different things, which have different goals in mind.
 
OH btw Zidane FYI, the Earth Simulator's memory subsystem is so fast that it doesn't require any cache whatsoever. It uses some kind of memory crossbar and only uses DDR memory. The processors aren't starved for data because the memory bandwidth is there. So the peak efficiency of 87% in the Linpack benchmark is possible only because of the high bandwidth.
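For reference, that 87% figure falls straight out of dividing the Earth Simulator's sustained Linpack result by its theoretical peak (5120 vector processors at 8 GFLOPS each; the Rmax value is the published 2002 TOP500 result):

```python
# Earth Simulator Linpack efficiency: sustained (Rmax) / theoretical peak (Rpeak).
NUM_PROCESSORS = 5120
GFLOPS_PER_PROCESSOR = 8.0

rpeak_tflops = NUM_PROCESSORS * GFLOPS_PER_PROCESSOR / 1000.0  # 40.96 TFLOPS
rmax_tflops = 35.86  # published Linpack result

efficiency = rmax_tflops / rpeak_tflops
print(f"Rpeak: {rpeak_tflops:.2f} TFLOPS, efficiency: {efficiency:.1%}")  # ~87.5%
```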
 
Assuming 1TFLOPS, there would be nothing from Intel to match its floating-point performance for years. Not to mention, from what we see, FIG6 can do 1TOPS as well.

There is a world outside of x86.
 
Paul said:
The Cell chip is no big deal. There is always something better 6 months later.

What do you mean no big deal? I don't understand. When you say big deal are you talking spec wise? Or big deal to things as a whole? It's a pretty big deal to Sony :)

Assuming 1TFLOPS, there would be nothing from Intel to match its floating-point performance for years. Not to mention, from what we see, FIG6 can do 1TOPS as well.

You can't compare a standard PC chip to Cell, as they do two totally different things, which have different goals in mind.

So Paul, how much stock in Sony do you have?

Once again, as with every other piece of tech that has ever come out, it does a few things right and a few things wrong. And soon something else will come out that does a few things better than this tech. Just watch. When the Cell chip finally does come out, there will be another chip out there just as capable or more capable than Cell, around the same time. Maybe 6 months to a year's difference, but it will come.

Comparing the Cell chip to an Intel chip is like comparing a dune buggy to a Porsche. They both do certain things well in their own element.
 
OH btw Zidane FYI, the Earth Simulator's memory subsystem is so fast that it doesn't require any cache whatsoever. It uses some kind of memory crossbar and only uses DDR memory. The processors aren't starved for data because the memory bandwidth is there. So the peak efficiency of 87% in the Linpack benchmark is possible only because of the high bandwidth.

Thanks, PC-engine

So in other words, render farms in comparison are probably weaker than what I've suggested...

The thing is, to be more impressive than Monsters, Inc., Toy Story 2, etc... you don't necessarily have to be technically superior...

We all saw those low-end CG photos a while back; they were quite impressive. To a casual viewer those landscapes would be even MORE impressive than Toy Story and the like, and as I've shown, with perf rivaling the top render farms in the world... the PS3 should have more than enough power to do them in real-time... and that, the opinion of the casual gamer, my friend, is what matters in the end...
 
zidane1strife said:
Thanks, PC-engine

So in other words, render farms in comparison are probably weaker than what I've suggested...

The thing is, to be more impressive than Monsters, Inc., Toy Story 2, etc... you don't necessarily have to be technically superior...

We all saw those low-end CG photos a while back; they were quite impressive. To a casual viewer those landscapes would be even MORE impressive than Toy Story and the like, and as I've shown, with perf rivaling the top render farms in the world... the PS3 should have more than enough power to do them in real-time... and that, the opinion of the casual gamer, my friend, is what matters in the end...



IF you're referring to those gorgeous procedurally generated landscape pix someone posted some time ago, then I'm pretty skeptical that that will be the quality we will get from PS3...
Surely enough, though, there are ways to make something simple look very pretty anyway; shortcuts are always used. There's no denying ANY next generation of console will produce VERY pretty graphics, but PERSONALLY I'm more looking forward to the advancements in animation and physics, AI and gameplay innovations... call me a geek...
 