Revolution Tech Details Emerge (Xbox1+ performance, 128 MB RAM)

It sounds true, but that doesn't mean it is.
Confirms my thoughts about the "NGC on steroids" rumor and the Revolution specs.
Overall I'm tempted to consider the blog legit.

(Maybe because it says things I already knew, and some other I want to believe...)
 
Teasy said:
You might as well tell us how you know its fake :)

That blog reads like some Nintendo f@nboy's wet dream fantasy drivel:
-Story and gameplay ideas driven by the fans! You'll get credited in the manual! What a grand experiment in game development!
-Rev remote is even better than the mouse! In fact, nothing beats it!
-The game we are making is a GTA combined with FPS! No, we haven't announced anything, and we're not gonna tell you who we are either. We're just a mysterious European dev! Isn't that fantastic?
-Rev will give you fantastic next gen visuals!
-We are willing to waste a fantastic amount of time updating this blog every day, but we won't actually tell you anything substantive, so don't even ask!


Come on peeps. Let's not even waste our time reading this crap, let alone discussing it. Seriously.
 
Dr Evil said:
Yes, 480p is 1/3 of the pixels of 720p, thus the image is not comparable!
pc999 said:
It should be in a normal/480p TV.
Nah, the one rendered at 720p will look nicely supersampled on SD and ED displays, while the other is obviously going to look like it was rendered at 480p.
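Just to put numbers on that (assuming the usual 1280x720 and 640x480 frame sizes), a quick sketch in Python:

# Rough pixel-count comparison; the resolutions are the standard 720p and 480p frame sizes
hd_pixels = 1280 * 720    # 921,600 pixels rendered at 720p
sd_pixels = 640 * 480     # 307,200 pixels shown at 480p
print(sd_pixels / hd_pixels)   # 0.333... -> 480p really is 1/3 of the pixels of 720p
print(hd_pixels / sd_pixels)   # 3.0 -> scaling a 720p render down to 480p averages about
                               # three rendered pixels per displayed pixel, which is
                               # why it looks supersampled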
 
Shogmaster, isn't that blog supposed to be written just by an employee of a developer, not the actual development house itself? None of what you said discredits the site considering that, especially the bit about next-gen visuals... Could still be totally fake of course :)
 
Nah, the one rendered at 720p will look nicely supersampled on SD and ED displays, while the other is obviously going to look like it was rendered at 480p.

So why hasn't that happened so far with 360 games?
 
Teasy said:
You might as well tell us how you know its fake :)

What Shog says. These blogs all follow a pattern: take what we already know, add every ******'s wet dreams to it, and voila! You're an insider! This is simply the 3rd generation of the fake insider blogs.

1st generation: Revo will have gyro-based controller plus heat sensors/pressure sensors/emotion sensors/holographic bowling ball/whatever else we can dream up. And here's a classified logo to boot!

2nd generation: Yes, that's the controller (which really was motion-sensitive all along), but what Nintendo hasn't told you that I'm telling you now is that, despite no HD, the thing is actually way more powerful than an Xbox 360! 768 MB of RAM! Cinco-core CPU! PPU! 600 MHZ GPU doing 1 kajillion shader ops per second!

3rd generation: Ok, so yes, that's the controller, and no, the specs don't beat X360 (which rational people all concluded from the "no HD" messages), but it's ok! We're making the most amazing game of teh evar!!111! Plus top secret graphics technique so that the specs produce images 100x the IQ of any similarly-specced machine!

E3 can't come fast enough. If it's on blogspot, it's a lie. Developer blogs are generally hosted at sites related to the company. If a Revolution blog appears on ubisoft.com or something, let me know.
 
Teasy said:
So why hasn't that happened so far with 360 games?
What do you mean? If games do render at 720p then they have to look supersampled at 480i/p, because they quite simply are. I have seen shots showing this wasn't the case for NBA 2K6 on the 360, which was clearly rendering at a lower resolution when outputting 480i/p, but I played some Kong on an SDTV and that was definitely supersampled, and I'm pretty sure the same goes for most 360 games.
 
fearsomepirate said:
What Shog says. These blogs all follow a pattern: take what we already know, add every ******'s wet dreams to it, and voila! You're an insider! This is simply the 3rd generation of the fake insider blogs.

1st generation: Revo will have gyro-based controller plus heat sensors/pressure sensors/emotion sensors/holographic bowling ball/whatever else we can dream up. And here's a classified logo to boot!

2nd generation: Yes, that's the controller (which really was motion-sensitive all along), but what Nintendo hasn't told you that I'm telling you now is that, despite no HD, the thing is actually way more powerful than an Xbox 360! 768 MB of RAM! Cinco-core CPU! PPU! 600 MHZ GPU doing 1 kajillion shader ops per second!

3rd generation: Ok, so yes, that's the controller, and no, the specs don't beat X360 (which rational people all concluded from the "no HD" messages), but it's ok! We're making the most amazing game of teh evar!!111! Plus top secret graphics technique so that the specs produce images 100x the IQ of any similarly-specced machine!

E3 can't come fast enough. If it's on blogspot, it's a lie. Developer blogs are generally hosted at sites related to the company. If a Revolution blog appears on ubisoft.com or something, let me know.

Ahh man those are some great points. I think I'm with you on this one.
 
mckmas8808 said:
Ahh man those are some great points. I think I'm with you on this one.

I've had one consistent rumor source the entire time, but she dropped off the face of the earth. Unfortunately, I can't leverage any "see I told you so," but I heard the spec (the exact spec--she said double-clocked Cube kits with around 128 MB and the controller) of the alpha dev kit months ago. I didn't swallow what she said uncritically because, well, there were too many rumors flying around. But hey, she clearly wasn't trying to spin some fantastic please-everyone lie, so I kept it in the back of my head. Turns out she was right. Of course, I could just be making this all up now that the news is out. I didn't want to spread it at the time because a) she made me tell her I wouldn't spread it (and I really do keep my word), and b) I didn't want to go down in infamy on one or more msg boards as the guy who spread some stupid rumor.

Since she was right about that, I'll tell you what else she said: the current X1K series of Radeon chips is a product of Hollywood development. She said that the financial deal between ATI and Nintendo went down right when the X1K development started, that the deal was way, way more than would be appropriate for an enhanced Flipper (somewhere in the hundreds of millions), that ATI ArtX has been responsible for all of ATI's genuinely new architectures since they joined, and that Nintendo would naturally want to work with the ArtX guys again (how true is all this? You tell me). So she said Hollywood and X1K are related architecturally, although Hollywood would have fewer shaders and a lower clockspeed. I'm more believing her because she didn't try to claim any incredible ground-breaking technology in the new chip that would do X360-beating graphics with 1/4 the computational performance.

So there. I'll spread that rumor. It at least adds up with what we know.
 
To be fair, I don't think that blog writer said anything about Xbox 360-beating graphics, did he/she? (Obviously I haven't read the whole thing.)

Interesting info though, fearsomepirate. That rumour matches the one from RevoGaming about Hollywood being X1600-based. It's worth noting that what your source says about alpha dev kits vs the final machine does more or less back up what the blog is saying as well: that IGN's anonymous dev is basing his/her opinion on alpha kits and that the final system will be much more powerful.

Incidentally, RevoGaming was also the site that claimed they were told (by the same source that told them about Hollywood) that the first prototypes of Broadway taped out around the beginning of November. That fits with the info from the blog about the first proper Rev development kits (with prototype Rev chips) going out to major developers a couple of weeks ago. Still, I suppose the blog writer may have read RevoGaming's site anyway... I just think it's fun to play connect-the-dots, that's all :LOL:
 
If you want my opinion, the thing that makes me think it may be true is when he says:
QUESTION: Are you using a In-house engine for the game?

ANSWER: We’re currently using a special for Revolution designed prototype engine, provided by the engineers of Nintendo, which is included in our current development kit.
If this is fake, it makes no sense: a faker would either say they're using middleware (which could be bad for him if the middleware vendor proved he was a fake) or claim an in-house engine that nobody could discredit. A Nintendo-provided prototype engine only makes sense a posteriori, because it would really help devs, or at least the small companies that couldn't survive if they had to build an engine from the ground up or license one (and this is the console where there's the best chance of surviving on good/innovative ideas). Nintendo needs to build their own engines for their games (Mario, MP...) anyway, so including a generic/basic one wouldn't just be out of the goodness of their hearts towards the small devs.

This makes even more sense if they are indeed using a rendering method other than immediate rendering (is that the right name?), as it would help inexperienced devs with such a technique by serving at least as a sample. It would help everyone, because even teams that build their own engine get the opportunity to test the controller and ideas for it without needing an engine first.

The rest, except the gfx part, I think we should expect anyway IMO (which part shouldn't we expect?).

So I think we should consider what kinds of rendering they could use if he's telling the truth. For example, a TBDR should save some power for equal results, right? What else could they use? (If true, this defeats my own theory from the Personal Rev Vision thread, but I must confess that if the long-term/mass-market savings are good enough from the start, I don't know why they wouldn't invest in it anyway; of course, they'd need to gamble harder.)

If he is a fake then he is a really good faker IMO.

Just my two cents.
 
fearsomepirate said:
Since she was right about that, I'll tell you what else she said: the current X1K series of Radeon chips is a product of Hollywood development.

I always said the X1600 architecture (apparently R580 too) should be great for Rev (low fill/texel rate but lots of shader power, very good for 3DMark05 and probably UE3-like gfx, I guess), and at 500 MHz it is very cool. But I don't know if that and the above (my post) could be sustained at the same time, unless I misunderstood something in this
I can tell you however Revolution will use a different technique to display next-generation graphics, which they haven’t officially announced yet. This technique is way more powerful compared to cube-mapping.
But I would think at first that it should be tied to the HW, unless it is an addition of their own and the two things are compatible with each other.
 
fearsomepirate said:
I've had one consistent rumor source the entire time, but she dropped off the face of the earth. Unfortunately, I can't leverage any "see I told you so," but I heard the spec (the exact spec--she said double-clocked Cube kits with around 128 MB and the controller) of the alpha dev kit months ago. I didn't swallow what she said uncritically because, well, there were too many rumors flying around. But hey, she clearly wasn't trying to spin some fantastic please-everyone lie, so I kept it in the back of my head. Turns out she was right. Of course, I could just be making this all up now that the news is out. I didn't want to spread it at the time because a) she made me tell her I wouldn't spread it (and I really do keep my word), and b) I didn't want to go down in infamy on one or more msg boards as the guy who spread some stupid rumor.

Since she was right about that, I'll tell you what else she said: the current X1K series of Radeon chips is a product of Hollywood development. She said that the financial deal between ATI and Nintendo went down right when the X1K development started, that the deal was way, way more than would be appropriate for an enhanced Flipper (somewhere in the hundreds of millions), that ATI ArtX has been responsible for all of ATI's genuinely new architectures since they joined, and that Nintendo would naturally want to work with the ArtX guys again (how true is all this? You tell me). So she said Hollywood and X1K are related architecturally, although Hollywood would have fewer shaders and a lower clockspeed. I'm more believing her because she didn't try to claim any incredible ground-breaking technology in the new chip that would do X360-beating graphics with 1/4 the computational performance.

So there. I'll spread that rumor. It at least adds up with what we know.

It just wouldn't make any sense for Nintendo to release early dev kits as a final representation of overall specs. Iwata mentioned nearly a year ago that the Revolution would use an API similar (or identical) to the GC's. Also, Miyamoto said many devs could and would start work on GC dev kits.

So Nintendo decides to request some Gekkos/Flippers with slightly increased performance. They add 128 MB of memory.

I've been wondering about throughput numbers (e.g. polygons, flops) for these souped-up Gekko/Flipper chips; is MHz the only thing that got an increase?
 
Well, let's go back in time and see how powerful hardware always wins (right!):

SNES vs. Genesis: SNES a lot prettier and sounded a ton better, easily a tie.
N64 vs PSX: N64 far more powerful than PSX. PSX gets tons more quality games and sells massively more. PSX has easiest and cheapest development (N takes note of this.)
Dreamcast vs. N64: Uhhh, lol. Dreamcast is like Voodoo2 vs. Virge here, but Dreamcast quickly dies off.
Gamecube and PS2 vs. Xbox: PS2 utterly demolishes both. Gamecube, while being a lot less powerful than Xbox, easily holds its own graphically. N makes Cube easiest console in history to develop for (learns from N64 mistakes).
Gameboy vs. Lynx/GameGear/etc/etc: Heh, not even close. GB absolutely dominates. Its chirping self and its little 8-bit Z-80 CPU absolutely hold the crown.

What do I see with Nintendo's new console? I see a company that has refined a plan to once again build the easiest console to develop for. I see N planning to be able to dominate a price war. I see how they've learned that graphics are not improving exponentially anymore and that powerful graphics/CPU hardware does not have the ROI it used to have. They saw a need to try to innovate in a new way (controller, and probably other things). They know PS3 and Xbox have built up a public image that they will again have a hard time competing with. They've asked themselves the question of how to defeat this image, and they've shown us their controversial answer.

Who brought us the analog stick for consoles? Who brought about the 4 player capability? Force feedback? Basically, which console company always takes risks to bring us new and potentially exciting ideas? Sure as shit not MS or Sony. How fast did PSX copy N64's analog stick and force feedback? Has MS ever innovated with their console? (HDD excluded, that WAS cool, now it costs big $).

There are examples every generation of how hardware prowess does not define how well a console does in the market, how fun it is, or how innovative the games are. There are more good games on SNES than there are on Xbox. A simple puzzle game can be more fun than Halo could ever dream of being (depending on the person's willingness to try new things, Halo carries a certain definition of manhood with it lol). What does this mean?
 
swaaye:

Of course you know it's not that simple.

Most of those systems were the most powerful at the time of release. This time Nintendo is coming in with the least powerful hardware and launching last (if the PS3 releases first).
 
pc999 said:
So I think we should consider what kinds of rendering they could use if he's telling the truth. For example, a TBDR should save some power for equal results, right?
From what I can remember a TBDR is limited in the number of polygons it can draw by the size of its on-chip scene buffer. There's a post around here somewhere that said Kyro's 6 MB buffer could handle 500,000 polygons in a scene:

http://www.beyond3d.com/forum/showpost.php?p=286496&postcount=208
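
Just as a rough sanity check on that figure (taking the 6 MB buffer and the 500,000-triangle capacity from that post at face value, and assuming the whole buffer holds per-triangle data):

# Back-of-the-envelope check of the quoted Kyro numbers
scene_buffer_bytes = 6 * 1024 * 1024   # 6 MB scene/parameter buffer (as quoted)
triangles_per_scene = 500000           # claimed capacity (as quoted)
print(scene_buffer_bytes / triangles_per_scene)   # ~12.6 bytes of binned data per triangle

About a dozen bytes per binned triangle, which at least sounds plausible, so the two numbers hang together.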
 
It's not limited at all. Even without any sneaky ways to circumvent the issue entirely, it's perfectly possible to fill the buffer to capacity, then draw what's in it, fill another buffer (possibly while drawing the first one), then draw that, etc.
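
A minimal sketch of that flush-when-full idea (the buffer capacity and names are made up purely for illustration, in Python):

# Hypothetical tiler front end that renders in multiple passes when its scene
# (binning) buffer fills up mid-frame. Numbers and names are invented for illustration.
SCENE_BUFFER_CAPACITY = 500000   # triangles the buffer can hold before it must be drawn

def render_frame(triangles, rasterize_bin):
    """Bin triangles until the buffer is full, draw that partial scene, then keep going."""
    scene_buffer = []
    passes = 0
    for tri in triangles:
        scene_buffer.append(tri)
        if len(scene_buffer) == SCENE_BUFFER_CAPACITY:
            rasterize_bin(scene_buffer)   # draw what's in the buffer...
            scene_buffer = []             # ...then start filling the next one
            passes += 1
    if scene_buffer:                      # draw whatever is left at the end of the frame
        rasterize_bin(scene_buffer)
        passes += 1
    return passes

# Example: a 1.2M-triangle frame needs 3 bin-and-draw passes with this buffer size
print(render_frame(range(1200000), lambda binned: None))   # -> 3

The catch, presumably, is that hidden-surface removal then only happens within each pass, so some of the deferred renderer's overdraw savings are given back whenever the buffer overflows, which is why the buffer size still matters in practice.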
 
Guden Oden said:
It's not limited at all. Even without any sneaky ways to circumvent the issue entirely, it's perfectly possible to fill the buffer to capacity, then draw what's in it, fill another buffer (possibly while drawing the first one), then draw that, etc.

Is the buffer separate from the VRAM? With such low speeds required it doesn't sound like it needs to be; why can't it just take as much VRAM as needed? Also, wasn't the buffer on the chip in the Dreamcast only like 512 KB or some other value under 1 MB? (Plus, I seem to remember the Dreamcast being benched at upwards of 10 million polys/sec, though maybe that was Naomi, which had more RAM.)
 
OtakingGX said:
From what I can remember a TBDR is limited in the number of polygons it can draw by the size of its on-chip scene buffer. There's a post around here somewhere that said Kyro's 6 MB buffer could handle 500,000 polygons in a scene:

http://www.beyond3d.com/forum/showpost.php?p=286496&postcount=208

I don't know much about that, or about 3D in general; what I know is primarily about IR (immediate rendering).

What else could they do? From what I hear there are voxels, and while they are better at some things (collision detection, content/map creation(?), destructible environments, more?), they use too much memory (unless they create/have created some brutal compression for it).
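
To put a rough number on the memory point (the grid resolution and bytes per voxel are just assumptions for illustration):

# A dense voxel grid at a resolution that is still fairly coarse for a whole level
voxels = 512 ** 3                 # 512x512x512 grid = 134,217,728 voxels
bytes_per_voxel = 1               # assumed: just an occupancy/material ID per voxel
print(voxels * bytes_per_voxel / (1024 ** 2))   # 128.0 MB, i.e. the machine's entire rumoured RAM

So without some brutal compression (sparse octrees or the like), a dense grid alone eats the whole 128 MB before anything else is stored.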

I would like to know which other rendering methods they could use (ones feasible in real time on a console).

I will try another Google search, since the first one was not very successful (to say the least).
 