Xenon, PS3, Revolution...

From a purely practical standpoint, maintaining code quality gets harder and harder as dev teams get big. Start adding external contractors into that mix and I'm going to claim it goes from difficult to impossible.

Oh well this is a given, no arguments there...

I doubt half the programmers on my team have ever written a significant amount of assembler; most of them don't understand what a compiler's optimiser can and can't do, never mind understanding a machine architecture in enough detail to make the "right" decision.

Then you're lucky... Outside the couple that may have taken some 68k assembly in school and/or played around on a 6502, I think I'm the *only* one... (There's folks around here who think my 867MHz G4 PowerBook is smoking fast just because it's "RISC")... -___-

The whole in-order/OOO thing I actually don't care too much about; outside of manual prefetches you're at the mercy of the compiler to solve the problem for you anyway.

Absolutely, I think it's being blown waaaaaaaaay outa proportion... I mean if you've had to do any performance-oriented work recently on anything outside of x86, an Alpha, or a Power4/5/970, then this whole notion of a lack of OOOe is pretty moot in my book...

So code ugly, code fast? Well that would explain a lot about your love for SoA

Since when is SoA "ugly"? I think you've been in VU/VFPU/VS land a little toooo long.. :p

Just like SoA is basically forcing the definition of a new atomic unit - a 4x4 matrix where we used to have a 4x1 vector... The obvious question arises whether it's worth coming up with replacements for algorithms that can't be described in that manner, or whether we just put up with inefficiencies in those situations.

This is assuming all you're going to be handling is vertices... What happens when you want to toss audio samples into the SIMD engine? How about simple batch compares? Or using sorting and logical instructions to mow through decision trees? From the hardware engineering standpoint it *does* get you away from shoe-horning your architecture into a specific data type, and makes implementing a more orthogonal instruction set easier.

It's really not all that bad, especially if you have a decent language interface (think PIM code), and it can really be abstracted away with a good solid template library (e.g. MacSTL).

Sure, know of one with minimal execution-speed overhead and a really tiny interpreter? :p

Well, I'm rather fond of Small...

Oh it was a game, shipped last year on PC, sold a lot of units ;)

I will agree that particular project was out of hand; a lot of it was a function of distributing the code in such a way that the 60+ engineers could work on it without stepping on each other's toes.

FWIW almost all of the code volume was in gameplay code.

Are you *SURE* you didn't manage to sneak a copy of XP in there somewhere? :p
 
psurge said:
If these new consoles all have 2-thread in-order general-purpose units, maybe compilers will eventually support "scouting threads" (not sure what this is officially called), by which I mean a thread more or less dedicated to prefetching memory for a compute thread...

well, although theoretically possible, it would hardly come from the compiler side*. what the compiler could successfully do, though, is put in prefetches for the more prominent accesses in the code.

* compilers hardly care about thread dispatching, at least not the C compilers.
 
darkblu said:
well, although theoretically possible, it would hardly come from the compiler side*. what the compiler could successfully do, though, is put in prefetches for the more prominent accesses in the code.

* compilers hardly care about thread dispatching, at least not the C compilers.

It would be a pointless exercise because of the limit on outstanding prefetches. It's just about enough to hide latency if you think about it and put them inline.
 
ERP said:
It would be a pointless exercise because of the limit on outstanding prefetches. It's just about enough to hide latency if you think about it and put them inline.

well, that's basically what i meant, just that it would be compiler-carried.
 
Do you really think that your heavily optimized game will look that much better than a game built exclusively with Renderware or Unreal Engine 3 (Xbox 2)? I think your game still might look better of course, but not by as much as it would have last generation, and much less than the generation prior to that.

I think it is a fact that the difference between technically excellent titles and not technically excellent titles (provided good art resources for both titles) is shrinking and will keep on shrinking.

I both agree & disagree with you here pana. The defining factor is always going to be the art assets; there will never be total parity, as ultimately Renderware-based titles will begin bearing some tell-tale visual similarities. Any Renderware or generically engineered middleware title is typically not making the most proficient use of the console's architecture, though it eases the developer's job. Yes, the UE3 engine looks phenomenal, but can it not be optimized or outdone? Will generation 1 PS3 titles differ noticeably from 3rd-gen. software? I agree that the margin between heavily optimized & middleware solutions is growing smaller, but not as small as you assert. A discernible difference will still be there; the question is only how pronounced.

Again this belief persists that the Revolution will be quite underpowered, & that Nintendo's skimping on the RAM, I see. :rolleyes: How will ports look going from those 512MB systems to the Revolution's 384MB of main memory? Your speculation is skewed methinks, version; how often has Nintendo been 112MB of RAM behind the competition? Even the XBX's UMA cannot allow for a full 64MB dedicated solely to games.
 
london-boy said:
Li Mu Bai said:
how often has Nintendo been 112mb of ram behind the competition?
:oops: :oops: HUH?! That would imply consoles in the past or present have had more than that... :devilish:

Relatively speaking london, when launching against a competitor's console within the same technological time period. Relax. ;)
 
Li Mu Bai said:
Relatively speaking london, when launching against a competitor's console within the same technological time period. Relax. ;)

Oh i'm all relaxed!
I was just like... "how often has Nintendo been 112mb of ram behind the competition? "... Errrmm NEVER..? ;)
 
Are we taking these figures as fact? Have they been qualified yet? Much as I'd love 512+ MB next gen, even 768, is it really going to happen? :oops:
 
512MB would be just purrrfect. Actually, 1GB would be just purrrrfect, 512MB would be nice.

Not sure how expensive it can be to have those amounts of RAM; I wouldn't have thought they cost a whole lot, but I guess it all depends on what type of RAM it is.
 
Archie said:
Since when is SoA "ugly"? I think you've been in VU/VFPU/VS land a little toooo long..
I guess I came off sounding harsher than intended - after school banging AoS-like notation into our heads for all those years of math classes, I guess we're all a little brainwashed in that manner.

And yeah, I AM spoiled by having an ISA where I no longer need to worry about data orientation during algorithm design, because I can arbitrarily mix horizontal and vertical views of my register set, without overhead.
Who wouldn't be? :p

This is assuming all you're going to be handling is vertices... What happens when you want to toss audio samples into the SIMD engine? How about simple batch compares? Or using sorting and logical instructions to mow through decision trees?
Well no, I was more thinking of running arbitrary vector algebra through it. I've given a quick glance at the vectorization I did for certain matrix solvers, and I can't say for sure if it can be converted to run optimally with SoA at all - and having to convert algorithms in the first place isn't a trivial issue.
I do agree that trading off logical and other SIMD operands for better FPU ISA wouldn't be too nice either.
 
Li Mu Bai said:
how often has Nintendo been 112MB of RAM behind the competition?

Gamecube. It's a nice system and all, but it's rather light on the ram. So was the N64. Oh, and GBA.

More memory is always nice, but even if all three next gen systems get 256 MBs (+ whatever extra for framebuffer, z etc) as I would guess they will, it'll all be okay.
 
function said:
Gamecube. It's a nice system and all, but it's rather light on the ram. So was the N64. Oh, and GBA.

More memory is always nice, but even if all three next gen systems get 256 MBs (+ whatever extra for framebuffer, z etc) as I would guess they will, it'll all be okay.

Has Revolution been confirmed to have 384MB of RAM yet? I haven't heard that figure anywhere; plus it's more than the 256MB of Xbox 2, and PS3 is only rumored to have 256MB or 512MB.

BTW, I'd say Nintendo systems have had poor allocation of RAM more often than low amounts of it.
Gamecube had a good amount, but 16MB of it was very slow.
N64 had more than any other system at the time, but a small texture cache.
And GBA has carts, and is a 2D system, so a lot of RAM isn't quite as needed. (ROM ~= RAM in many cases, right?)
 
Fox5 said:
BTW, I'd say Nintendo systems have had poor allocation of RAM more often than low amounts of it.
Gamecube had a good amount, but 16MB of it was very slow.

The GC RAM was not slow, it was actually very fast. Its real-world speed and theoretical speed were about the same, unlike the RAM used on Xbox and PS2.
 
ILUVMATH said:
The GC RAM was not slow, it was actually very fast. Its real-world speed and theoretical speed were about the same, unlike the RAM used on Xbox and PS2.

I thought the A-RAM was supposed to be very slow.
 
Fox5 said:
Has revolution been confirmed to have 384MB of ram yet? I haven't heard that figure anywhere, plus it's more than the 256MB of xbox 2, and ps3 is only rumored to have 256MB or 512MB.

Don't think any system has been publicly confirmed yet... but I can't see Revolution besting what's in the Xenon/PS3, even if it comes a few months later. It's just a guess of course, but I can see them all having roughly the same amount.

BTW, I'd say Nintendo systems have had poor allocation of RAM more often than low amounts of it.
Gamecube had a good amount, but 16MB of it was very slow.
N64 had more than any other system at the time, but a small texture cache.
And GBA has carts, and is a 2D system, so a lot of RAM isn't quite as needed. (ROM ~= RAM in many cases, right?)

Gamecube's auxiliary RAM is so slow that (aside from sound) I can't see it being used for much other than an optical disc cache, or maybe something like a page file. I've read developer comments that amount to the PS2 effectively having more memory they can run the game from. 26MB in the GC to 64MB in the Xbox is a pretty big difference considering they both launched at pretty much the same time!

As for the N64, well, it came 18 months (maybe a little more) after the Saturn and PSX, so you'd expect it to have more RAM. Except it doesn't. IIRC, the PS1 had 3.5MB, and the Saturn 4MB (4.5MB including its CD cache), compared to the N64's 4MB. And that's with an 18-month gap! If you look 17 months later than the N64 (less time than the Saturn/PS1 -> N64 gap), along comes the DC with 26MB of RAM - a huge 6.5x increase over the N64!

The N64 is the only console that I've had to buy a ram expansion pack for.

And the GBA, yeah it's cheap and cheerful, but the GP32, released around the same time as the GBA (and costing only a little more despite being a grey import manufactured in a tiny fraction of the GBA's numbers by a tiny upstart company that didn't make a huge cut on software sold for the machine), had 8MB of RAM and a CPU about 8 times as fast. A monopoly isn't a good thing, and I'm glad Sony and mobile phone manufacturers are now pushing into the handheld market.

I'll probably get a lot of shit for this, but I don't think hardware is Nintendo's strength. It never has been. IP and software are Nintendo's strengths, and their hardware simply exists at the most basic level that will allow them to carry the Nintendo machine forward into the next generation. Given the profits they continue to make, I can't see this strategy changing soon. Perhaps this does make hardware one of their strengths after all, just not in the way we'd normally think of as being good for us as consumers. :p
 
Gamecube's Auxiliary ram is so slow that (aside from sound) I can't see it being used for much other than an optical disk cache, or maybe something like a page file.

Hmm, this may be so, and the Gamecube has less RAM than the Xbox, but the main thing I noticed after getting an Xbox is that the big games on Xbox load much, much slower or have more load points than the big Gamecube games.
Both Ninja Gaiden and Halo 2 have huge load times at the start, and Fable and other games have long load times between every area/level, yet the top Gamecube games often have no or few load times at all.

The N64 is the only console that I've had to buy a ram expansion pack for.

I believe the Saturn got more use out of its RAM expansions though.
BTW, I thought the DC only had 16MB of RAM, not 26MB: 8MB video (which is quite a bit of dedicated video RAM for the time), and 8MB system (or is it 8/16, in which case it's 24 total). Anyhow, they didn't use the same RAM technology; the N64 didn't have the option of SDRAM.

I'll probably get a lot of shit for this, but I don't think hardware is Nintendo's strength.

Nintendo has never given the most powerful hardware possible, but they've given powerful systems. The NES was the most powerful at its time (1982 in Japan?), the SNES was the most powerful in its generation unless you count add-ons and really expensive machines, the N64 was the most powerful in its gen (and it had carts unlike the other two systems, so RAM wasn't quite as critical), and the Gamecube was pretty powerful.

On the handheld side, the Game Boy Color was underpowered, but the GBA was the most powerful by far when it launched, and I wouldn't say the GP32 was only a bit more expensive. I seem to recall the GP32 going for around $150 or $200 when the GBA was around $80 or $90, and I don't think the GP32 had a CPU 8x as fast. (If it does, then it probably lacks the graphics hardware of the GBA, but didn't the GP32 use basically the same ARM7 as the GBA, at twice the MHz and with different graphics capabilities?) Nintendo makes hardware that is powerful enough while meeting a cheap enough price point.
 
Fox5 said:
Hmm, this may be so, and the Gamecube has less RAM than the Xbox, but the main thing I noticed after getting an Xbox is that the big games on Xbox load much, much slower or have more load points than the big Gamecube games.

Yeah, it's not that I think having a disk cache is a bad idea (on the contrary - the low load times on the GC are very nice), it's just that it doesn't substitute for a larger pool of main memory. Developers could allocate a large chunk of Xbox RAM for preloading data from the DVD or HDD, but they don't seem to feel that it's a good use of resources.

I believe the saturn got more use out of its ram expansions though.

Yeah, never bothered to import one. Somehow a 4MB expansion for arcade-perfect ports of 'hardcore' animation-heavy arcade games doesn't seem bad (actually it seems good), whereas a 4MB expansion for a much newer system so it can play original titles designed for that platform seems like trying to close the stable door after the horse has... casually walked off. ;)

BTW, I thought DC only had 16MB of ram, not 26MB.

16MB main RAM, 8MB video RAM, 2MB audio RAM. All of it 100MHz 64-bit SDRAM. Aggregate bandwidth almost the same as the GC main RAM too, come to think of it (remember that the DC's video chip had a tile buffer too).

Nintendo has never given the most powerful possible to give, but they've given powerful systems. NES was the most powerful at its time(1982 in Japan?)

NES was bested by the Mark 3 / Master System hardware though, although I can't remember exact timeframes off the top of my head...

SNES was the most powerful in its generation unless you count add ons and really expensive machines,

SNES was a great machine, possibly Nintendo's best-balanced machine IMO, but despite coming 2 years after the Megadrive (and on the whole looking better) it had a much, much weaker CPU and often a lower resolution. Gaming lore has it that the SNES was originally planned with a 10 or 12MHz 68000 (like the MD CPU, only faster), but this was pulled in favour of a much cheaper CPU, as most of the kinds of games Nintendo thought would be popular weren't CPU-heavy. Or something like that...

the N64 was the most powerful in its gen (and it had carts unlike the other two systems, so RAM wasn't quite as critical), and the Gamecube was pretty powerful.

Being "the most powerful in a gen" is all dependent on where you want to draw the boundary lines for that generation. The N64 was, on balance, more powerful than the PS1 and Saturn, but it came 18 months later. If you look less than 18 months after the N64, you see a console appear (the Dreamcast) that absolutely beasts the N64 by a margin several times that by which the N64 was better than the Saturn and Playstation. And that's despite costing less than the N64 at launch, and including a modem and an optical drive.

On the handheld side, gameboy color was underpowered, but gba was the most powerful by far when it launched, and I wouldn't say GP32 was only a bit more expensive. I seem to recall GP32 going for around $150 or $200 when GBA was around $80 or $90, and I don't think gp32 had a cpu 8x as fast.(if it does then it probably lacks the graphics hardware of the gba, but didn't gp32 use basically the same arm7 as the gba, but at twice the mhz and with different graphics capabilities?) Nintendo makes hardware that is powerful enough while meeting a cheap enough price point.

Come on now, the GBA was a really weak piece of hardware, even when it launched. :eek:

The GP32 never got official distribution in the US or the UK, and so shipping and reseller costs bumped the price up massively. It was being produced in small numbers (something like 1% of the numbers of the GBA) and comprehensively outspecced the GBA. And Gamepark had to make their money on the hardware, rather than on game licensing fees from high-volume game sales. Despite this I could buy one for £130 when the GBA was £90 (and that included import duty and VAT), and likewise get frontlit and later backlit versions (with a much better screen than the GBA SP) for a similar ~50% increase on high-street GBA SP prices.

As for the specs, you're underestimating the GP32. It had a 133MHz ARM9 CPU compared to the GBA's 16MHz ARM7 (an 8-fold increase), and 8MB of RAM compared to the GBA's 384 (or so) KB. It doesn't have the GBA's sprite and tile hardware, but frankly, given how much extra power it has and how easy it is to do these things in software, you shouldn't need them to outperform the GBA at 2D! :p

...

... why did I just spend ages typing all that on a friday night? :oops:
 
Fox5 said:
I thought the A-RAM was supposed to be very slow.

I forgot all about the A-RAM, I was thinking it was all 1T-SRAM.
 
The SNES had several resolution modes, and at least one that was higher than the MD's. Regardless, even if it didn't, the larger colour palette, as well as the number of simultaneous colours, more than made up for it anyway.
 