Revolution specs from IGN

Vysez said:
A good technique used to give surfaces and models a good visual aspect was to use APS.

So, now I'm wondering whether the Hollywood GPU, based on the Flipper feature set, will support dot3 Normal Mapping or not.

APS being?
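as for the dot3 question: dot3 normal mapping just means fetching a normal per pixel from a texture and dotting it with the light vector, instead of interpolating one normal per vertex. a toy sketch (made-up encoding details, nothing to do with actual Flipper/Hollywood code):

```python
# Dot3 normal mapping in a nutshell: a normal is fetched per pixel from
# a texture and dotted with the (tangent-space) light vector.
# Hypothetical numbers for illustration only.

def dot3_lighting(normal_map_texel, light_dir):
    """normal_map_texel: (r, g, b) in 0..255 encoding a unit normal.
    light_dir: unit light vector in tangent space."""
    # Decode the 0..255 channels back into a -1..1 normal vector.
    n = [c / 127.5 - 1.0 for c in normal_map_texel]
    # The per-pixel diffuse term is just N.L, clamped to zero.
    d = sum(a * b for a, b in zip(n, light_dir))
    return max(d, 0.0)

# A texel encoding the "straight up" normal (128, 128, 255), lit head-on:
print(dot3_lighting((128, 128, 255), (0.0, 0.0, 1.0)))  # -> 1.0
```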
 
I was hoping, with this being the Console Technology section of Console Talk, that the discussion would be entirely about what could possibly be in this 249MHz GPU.

I mean, what the hell had the Bioware/Pandemic devs that saw Revolution presentations behind closed doors at GDC saying, "What we saw was mind-blowing!"?

Instead this is about ******s trying to prove that the Revmote is nothing but a gimmick because it's not what they wanted, or because it doesn't represent the norm. Nintendo isn't the console sales leader, so it must be a gimmick created to offset the fact that they couldn't compete technically. Yet none of us participating in this forum/thread have the slightest idea how much it cost to produce the Revmote, or what the R&D costs were.

I mean, why would they spend a considerable amount of money on a gimmick? In my experience, most gimmicks are pretty cheap.
 
Ooh-videogames said:
I was hoping, with this being the Console Technology section of Console Talk, that the discussion would be entirely about what could possibly be in this 249MHz GPU.

I mean, what the hell had the Bioware/Pandemic devs that saw Revolution presentations behind closed doors at GDC saying, "What we saw was mind-blowing!"?

Instead this is about ******s trying to prove that the Revmote is nothing but a gimmick because it's not what they wanted, or because it doesn't represent the norm. Nintendo isn't the console sales leader, so it must be a gimmick created to offset the fact that they couldn't compete technically. Yet none of us participating in this forum/thread have the slightest idea how much it cost to produce the Revmote, or what the R&D costs were.

I mean, why would they spend a considerable amount of money on a gimmick? In my experience, most gimmicks are pretty cheap.

Most people aren't considering the Rev a gimmick. I would also like to know what's possible on that kind of a GPU.
 
Remember that Nintendo has made a lot of money on games that don't push the graphical technology curve. They've been doing this for a while now, so these specs are not a surprise. Did you know that one of the hottest-selling games in Japan is a DS brain game that is graphically quite simple? It's in fact one of the reasons the DS is trouncing the PSP in sales in Japan. Who would have predicted that? And let's not forget the gobs of money that Nintendo made with Pokemon, another technologically simple game, in both its handheld and console incarnations. For the life of me, I can't understand the fanaticism around Pokemon, but you can't deny the fact that it made a ton of cash for them. There's something about these games that compels millions of people to go out and buy them, and whatever it is, it isn't the graphics, and they're only on Nintendo's systems. Many people buy Nintendo systems to play Nintendo games, not for the sake of just owning the system.

It's unfortunate that we won't see a Metroid game with graphics of the caliber of a 360 or PS3, but with Nintendo, you have to either buy into their philosophy or don't play their games at all.
 
I mean, what the hell had the Bioware/Pandemic devs that saw Revolution presentations behind closed doors at GDC saying, "What we saw was mind-blowing!"?

well, the IGN and others' impressions from the behind-closed-doors demos of the remote were positive towards the actual controls. they did hint that the games in the demos used Cube-level graphics. they said that maybe the graphics were kept low to make people focus on the controls instead.
 
Teasy said:
To say that DX9 pixel shaders would be useless on a 250Mhz chip without knowing anything about the architecture behind the chip (number of pipelines etc.) is really silly, to be honest. The Radeon 9700 was an excellent DX9 GPU and ran at 250Mhz (the Pro was 275Mhz).

Actually the Radeon 9700 Pro has a 325 MHz R300 in it. Mine was easily overclockable to 380 MHz without any glitches.
 
Today ATI has a GPU named RV516, sold commercially as the ATI X1400, with the following technical specs:

-ATI Avivo
-600MHz core clock
-2 Vertex Shader 3.0 units
-4 Pixel Shader 3.0 units
-4 TMUs
-4 ROPs

It costs just half of what Flipper did in 2001 (it's half the size), and it is better than the supposed Hollywood that IGN is talking about in the article. I can't believe ATI would put a GPU inferior to the X1400 in the Revolution when they designed the X1400 for laptops, which have the same heat constraints the Revolution has.
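A quick back-of-the-envelope comparison (theoretical peak pixel fillrate = core clock x ROPs; Flipper's public GameCube spec is 162MHz with 4 pixel pipelines; this ignores efficiency, bandwidth and features, so it's only a rough yardstick):

```python
# Theoretical peak pixel fillrate = core clock (MHz) * ROPs,
# giving megapixels per second. Rough numbers only.
def fillrate_mpix(clock_mhz, rops):
    return clock_mhz * rops

flipper = fillrate_mpix(162, 4)   # GameCube's Flipper: 648 Mpixels/s
rv516   = fillrate_mpix(600, 4)   # X1400/RV516: 2400 Mpixels/s
print(rv516 / flipper)            # roughly 3.7x the raw fillrate
```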
 
Urian said:
Today ATI has a GPU named RV516, sold commercially as the ATI X1400, with the following technical specs

But can this efficiently emulate old Gamecube games?

Sometimes it's not just about "power/efficiency/specs" but about what can do the job you need it for, and from the looks of it, the Revolution won't have terribly much spare CPU capacity when emulating Gamecube games. So they'll have to use a GPU that, if it isn't register compatible, is at least reasonably close to Flipper, or that has a compatibility mode...
 
[maven] said:
But can this efficiently emulate old Gamecube games?

Sometimes it's not just about "power/efficiency/specs" but about what can do the job you need it for, and from the looks of it, the Revolution won't have terribly much spare CPU capacity when emulating Gamecube games. So they'll have to use a GPU that, if it isn't register compatible, is at least reasonably close to Flipper, or that has a compatibility mode...
That really depends on the NGC library used to code the system; if it's "high-level" enough, you have a driver, and so you can just have a compatibility layer.
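to make "compatibility layer" concrete: old API-style calls get accepted unchanged and translated into whatever the new GPU's driver expects. all the names below are made up for illustration; this is not the actual GC SDK, just the shape of the idea:

```python
# Sketch of a compatibility layer: legacy-style graphics calls are
# accepted as-is and forwarded to a new driver. Names are hypothetical.

class NewGpuDriver:
    """Stands in for the new hardware's native driver."""
    def draw(self, verts, state):
        print(f"native draw: {len(verts)} verts, state={state}")

class GXCompatLayer:
    """Accepts legacy-style calls and re-expresses them for the driver."""
    def __init__(self, driver):
        self.driver = driver
        self.state = {}

    def gx_set_tev_stage(self, stage, mode):
        # Legacy TEV-style state call, stored as generic driver state.
        self.state[f"tev{stage}"] = mode

    def gx_draw_primitives(self, verts):
        # Legacy draw call, forwarded with the accumulated state.
        self.driver.draw(verts, dict(self.state))

layer = GXCompatLayer(NewGpuDriver())
layer.gx_set_tev_stage(0, "modulate")
layer.gx_draw_primitives([(0, 0, 0), (1, 0, 0), (0, 1, 0)])
```

the key point being: this only works if games went through the library rather than poking the hardware registers directly.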
 
Ingenu said:
That really depends on the NGC library used to code the system; if it's "high-level" enough, you have a driver, and so you can just have a compatibility layer.

XBox 1 was supposed to be DX-like and high-level, but its BC is still very flawed. I wonder, though, what the size, cost and thermal dissipation would be of including a 90nm version of Flipper in addition to a small but powerful GPU (like this RV516), like Sony did when the PS2 included the PS1 hardware...
 
Corwin_B said:
XBox 1 was supposed to be DX-like and high-level, but its BC is still very flawed. I wonder, though, what the size, cost and thermal dissipation would be of including a 90nm version of Flipper in addition to a small but powerful GPU (like this RV516), like Sony did when the PS2 included the PS1 hardware...

The problem with the Xbox is that the functions accessing the hardware directly (from the SDK) are statically linked into the game executable, so to the emulator it looks as if each game is written to the bare metal, without any nice abstractions such as API calls.
Things were much easier on the N64 in that regard, but I don't know how it is done on the Gamecube.
 
Corwin_B said:
XBox 1 was supposed to be DX-like and high-level, but its BC is still very flawed.
I think that's similar to the Amiga 500 >> 1200 compatibility issues. If devs stuck to the official APIs, compatibility was there. But when devs bypassed those and went straight to the metal, compatibility was lost. I presume it's the same on the GC. How many devs are content to stick to the official APIs?!
 
Corwin_B said:
XBox 1 was supposed to be DX-like and high-level, but its BC is still very flawed. I wonder, though, what the size, cost and thermal dissipation would be of including a 90nm version of Flipper in addition to a small but powerful GPU (like this RV516), like Sony did when the PS2 included the PS1 hardware...

Problem is, the XBox is x86-based and the X360 is PowerPC-based: different endianness, different system architecture... So you end up needing to emulate a whole system; you just can't let a driver do the work, like on a typical PC.

Whereas the Revolution is by all means (IMO) an evolution of the NGC hardware, so it is likely to be compatible (just like an AMD64 runs i386 apps) and can have a driver do the job (just like PC drivers translate API calls into hardware states/instructions), so nothing too hard.
(Again, only if NGC games weren't coded to the metal.)
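to make the endianness point concrete: the same four bytes in memory decode to different 32-bit values on a big-endian PowerPC and a little-endian x86, so a cross-endian emulator has to byte-swap on essentially every guest memory access (python just for illustration):

```python
# The same raw bytes read as different 32-bit integers depending on
# byte order, which is why x86 -> PowerPC (or the reverse) emulation
# can't just pass memory through untouched.
import struct

raw = bytes([0x12, 0x34, 0x56, 0x78])
as_big    = struct.unpack(">I", raw)[0]   # big-endian (PowerPC) view
as_little = struct.unpack("<I", raw)[0]   # little-endian (x86) view
print(hex(as_big), hex(as_little))        # 0x12345678 vs 0x78563412
```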
 
Tahir2 said:
Teasy are you happy with the speculated specifications for the new Nintendo console?

You mean 88MB of 1T-SRAM, a 243Mhz GPU based on Flipper with no extra shading capabilities, and a 729Mhz Gekko derivative? No, I absolutely wouldn't be happy with that specification, and if that really were the spec of the system, with no surprises to improve the picture, then I have to say I wouldn't buy the console at launch. It would take great games to get me to buy the system, whereas usually I buy a new console at launch each generation and then wait for the great games to come :)
 
[maven] said:
But can this efficiently emulate old Gamecube games?

I don't think it would need to. As Urian said, the RV516 is half the size/cost that Flipper was in 2001. With today's technology Flipper would also be half the size. They could put both GPUs on a single die for the same kind of money they were spending on Flipper alone when the GC launched. Both cores could have access to a single pool of embedded RAM (since neither would need it at the same time anyway). Flipper could be used for perfect backwards compatibility when playing GC games, and when playing Revolution games it could possibly be used to assist the main GPU.
 
The thing in this equation that really puzzles me is why they're so concerned with backwards compatibility. The Gamecube has only a handful of exclusives in its library, and of those, many are franchises that receive generational updates (Mario Kart, Mario Party, F-Zero, Day of Reckoning, etc.), so there wouldn't be much consumer interest in them. And who cares about the myriad of cross-platform titles, especially when some of the most compelling of them (e.g. Burnout 3) never came to the Cube? The problem this generation is that not that many people wanted Cube games to begin with, so I don't see how limited BC with the Gamecube would hurt Revolution sales in the slightest. Make it compatible with the Primes and the Zeldas, and you're golden.
 
swaaye said:
Well, that CPU is certainly faster than a Pentium 3. The P3 has all sorts of deficiencies cuz it's x86. A PPC at that clock should be a good bit faster. Not magically, though.

The GPU is probably faster than an X700, I would imagine.

well, not really. the Pentium 3 and the Gekko are comparable. x86 vs PPC means nothing; they are just instruction sets, and that says nothing about the architecture. the Pentium 3 has the complex and efficient architecture of the Pentium Pro, plus SSE, which is still seen in the Pentium M and Core Duo (with SSE2/SSE3), processors that stand up more than well against the Athlon 64 and PowerPC G5.
take the X360 and PS3 CPUs on the other hand: they have simple, inefficient PPC cores.

as for the GPU, the X700 has almost twice the frequency and twice the pipelines ;), so not in the same ballpark.
the Rev looks much like the Xbox, with 24MB more RAM but without the hard drive. (it has 512MB of flash though, presumably for savegames and downloaded games. you don't want to use the flash for caching because it would wear out.)


--------------------

what annoys me is the eDRAM stays at 3MB. so I fear that we'll not see any improved image quality. 2x or 4x AA and trilinear or aniso filtering would be a tremendous gain, even with the default RCA cables (I have my DVD player on RCA and it doesn't look like crap!). I think GC games do not even use mip-mapping.

also, moving from 24-bit to 32-bit RGBA would be / would have been nice. I played the latest Prince of Persia game on the GC, and there often were terrible artifacts (best seen in dark zones), making the game look like a terrorist propaganda .WMV or a cell-phone video.
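those dark-zone artifacts are classic banding from quantisation: fewer bits per channel means bigger steps between adjacent shades, and the steps are most visible in dark gradients. a toy demo (6 bits per channel chosen arbitrarily, just to show the effect):

```python
# Banding demo: quantising an 8-bit channel to fewer bits collapses a
# smooth gradient into a handful of visible steps. Illustration only.

def quantise(value_8bit, bits):
    step = 256 // (1 << bits)        # size of each quantisation step
    return (value_8bit // step) * step

# A smooth dark gradient 0..15 collapses to only a few distinct levels:
gradient = list(range(16))
banded = [quantise(v, 6) for v in gradient]
print(sorted(set(banded)))           # [0, 4, 8, 12] -- 4 levels, not 16
```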

run a four-year-old game on your PC at 640x480 in 32-bit colour with 4xAA and 8xAF and watch it from about a meter away.. it should look quite nice, no? this is what the Rev could have been.
we've had enough years of jagged polygons and terrible filtering.

(and in an ideal world, they would ship the RGB cable in Europe and maybe an S-video cable in the US, instead of shipping RCA and selling the good cables at a premium)
 
Blazkowicz_ said:
as for the GPU, the X700 has almost twice the frequency and twice the pipelines ;), so not in the same ballpark.

Twice the pipelines of what?

what annoys me is the eDRAM stays at 3MB. so I fear that we'll not see any improved image quality. 2x or 4x AA and trilinear or aniso filtering would be a tremendous gain. I think GC games do not even use mip-mapping.

The GameCube can already do trilinear and anisotropic filtering; the available framebuffer space doesn't limit filtering in the slightest. Your last comment is a very strange one, though: where did you get that idea?
 