Are developers ready to cope with multi-core consoles?

It seems that all the next-gen consoles are going in the multithreading (Xbox2, N5) and message-passing (PSX3) direction.

So the question is: are developers ready for the new era? Multithreading is not new for PC developers, but what about traditional console developers?
 
They're gonna have to... they got used to the Saturn, N64 and PS2; they'll get used to anything.
At the end of the day they're gonna have to if they want to make money...

What IMO is more of a concern is the amount of art assets needed in the next generation of videogames... THAT is going to start becoming a problem...
 
The M2 (two PPC 602s) would have done multi-processor in a console right, whereas the Saturn (two SH-2s) did it wrong.


The N64 was a pain to code for, but it only had 1 CPU core (a MIPS R4xxx) and its Reality Co-Processor GPU, which had two main parts:
Reality Signal Processor + Reality Display Processor.
I suppose you could say the RSP was the equivalent of 1 VU.
 
They don't "have to" and they didn't cope with Saturn, N64 didn't have multiple cores, and complained quite a bit about PS2. Multicore CPU's aren't a good thing when it comes to ease of use.
 
The RSP section of the N64's RCP was an R4400 core (maybe with some custom modifications) with a vector processor added on to it, so I would say it counts as a multiprocessor.

*G*
 
A lot of work has been done on tools and compilers for various kinds of parallel supercomputers over the last twenty or so years. We are probably going to see that start to filter down to consoles soon.
Anyway, going parallel/multithreaded is widely acknowledged to be the only way to push silicon-based technology significantly further.
 
london-boy said:
What IMO is more of a concern is the amount of art assets needed in the next generation of videogames... THAT is going to start becoming a problem...

Procedural generation of geometry, textures and animation will become a necessity, not an option.
It will make games look more consistent and less constructed, so why be sad about it?
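To make "procedural" concrete, here's a rough C++ sketch of hash-based value noise (just an illustration; every name and constant in it is made up, but this is the general shape of such a generator). A few dozen lines stand in for an arbitrarily large hand-painted texture:

```cpp
#include <cstdint>
#include <cstdio>
#include <cmath>

// Cheap integer hash -> pseudo-random float in [0, 1).
// (Illustrative only; any decent hash works here.)
static float hash2d(int x, int y, uint32_t seed) {
    uint32_t h = seed ^ (uint32_t(x) * 374761393u) ^ (uint32_t(y) * 668265263u);
    h = (h ^ (h >> 13)) * 1274126177u;
    return float(h ^ (h >> 16)) / 4294967296.0f;
}

// Bilinearly interpolated value noise at a point.
static float valueNoise(float x, float y, uint32_t seed) {
    int xi = int(std::floor(x)), yi = int(std::floor(y));
    float fx = x - xi, fy = y - yi;
    // Smoothstep the fractions for nicer interpolation.
    fx = fx * fx * (3.0f - 2.0f * fx);
    fy = fy * fy * (3.0f - 2.0f * fy);
    float a = hash2d(xi,     yi,     seed);
    float b = hash2d(xi + 1, yi,     seed);
    float c = hash2d(xi,     yi + 1, seed);
    float d = hash2d(xi + 1, yi + 1, seed);
    float top = a + (b - a) * fx;
    float bot = c + (d - c) * fx;
    return top + (bot - top) * fy;
}

int main() {
    // Fill a 256x256 greyscale texture from a single seed: the "asset"
    // is ~30 lines of code, not a stored bitmap.
    const int N = 256;
    static uint8_t tex[N][N];
    for (int y = 0; y < N; ++y)
        for (int x = 0; x < N; ++x)
            tex[y][x] = uint8_t(255.0f * valueNoise(x * 0.05f, y * 0.05f, 1234u));
    std::printf("sample texel: %u\n", unsigned(tex[128][128]));
    return 0;
}
```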
 
To me it seems "automatic" parallelization usually consists of structuring a serial implementation just right, because you know beforehand which kinds of parallelization the compiler can perform on it for you ;)
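For instance, a toy C++/OpenMP sketch (details are illustrative): the pragma only works because the first loop was deliberately written with no cross-iteration dependencies, while the second can't be parallelized no matter what the compiler does:

```cpp
#include <vector>
#include <cstdio>

// Safe to parallelize only because it was WRITTEN that way:
// no iteration reads anything another iteration writes.
void scaleAndBias(std::vector<float>& out, const std::vector<float>& in,
                  float scale, float bias) {
    const int n = int(in.size());
    #pragma omp parallel for   // the "automatic" part is one pragma...
    for (int i = 0; i < n; ++i)
        out[i] = in[i] * scale + bias;
}

// Contrast: a one-pole filter carries a value from iteration to
// iteration (out[i] needs out[i-1]); slapping the same pragma on it
// would give wrong answers. The structure, not the compiler, decides.
void onePoleFilter(std::vector<float>& out, const std::vector<float>& in,
                   float a) {
    out[0] = in[0];
    for (size_t i = 1; i < in.size(); ++i)
        out[i] = a * out[i - 1] + (1.0f - a) * in[i];
}

int main() {
    std::vector<float> in(1024, 1.0f), out(1024);
    scaleAndBias(out, in, 2.0f, 0.5f);   // compile with -fopenmp to thread it
    onePoleFilter(out, in, 0.9f);
    std::printf("%f\n", out.back());
    return 0;
}
```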
 
Meh, it's not much different from trying to vectorize an algorithm. In some ways it's actually easier (you can work asynchronously and/or don't need to work with uniform data), and in other ways it's more difficult (thread coherency).
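To make the contrast concrete, a small modern-C++ sketch (the thread API is incidental; the point is that the two workers run different, non-uniform jobs asynchronously, and the joins are where the coherency headache lives):

```cpp
#include <thread>
#include <vector>
#include <numeric>
#include <cstdio>

int main() {
    std::vector<int> physics(1000), audio(500);

    // Unlike SIMD lanes, the two threads need not execute the same
    // code or touch uniform data: one fills "audio", one fills "physics".
    std::thread t1([&] { std::iota(physics.begin(), physics.end(), 0); });
    std::thread t2([&] { std::iota(audio.begin(), audio.end(), 100); });

    // The hard part the post alludes to: the main thread may not read
    // either vector until the workers are done -- hence the joins.
    t1.join();
    t2.join();

    std::printf("%d %d\n", physics.back(), audio.back());
    return 0;
}
```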

With regards to the subject of the thread (no pun intended), I think it's a bit of a "sky is falling" attitude... Dividing work up between the CPU, GPU, and audio processors already constitutes dealing with multiple processors. While it is true that the application-specific nature of some of them (namely the GPU and audio/IO processors) tends to make work-allocation decisions somewhat easier, the power and flexibility of these components are starting to force developers to make more conscious decisions about work allocation (as if they haven't already been doing this for years on systems with an a$$-load of custom hardware)...
 
Grall said:
The RSP section of the N64's RCP was an R4400 core (maybe with some custom modifications) with a vector processor added on to it, so I would say it counts as a multiprocessor.

*G*

I heard from an N64 dev that the RCP was an R4600 plus a vector unit.
 
Hey, don't forget the "16-bit" PC Engine! It was really two 8-bit processors, and it was very successful in Japan.

And yes, devs most definitely did "put up with" the Saturn - in fact, devs took it upon themselves to support the Saturn in Japan after Sega gave up on the platform, just like with the Dreamcast today!

And finally, the one case where a multiprocessor console really did get wrecked because, frankly, it was TOO complex: the Jaguar. :D The entire system is five programmable processors!
 
To my knowledge, the PC Engine had ONE 8-bit processor based on the venerable 6502 (a high clock speed for an 8-bit chip, though), and two 16-bit graphics chips...

The Jaguar only really had three "real" processors: the 68K and the two RISC cores (as if that's not enough, heh heh). The blitter isn't actually a "programmable processor" since it doesn't run programs (it just has registers that other processors program :)), and the object processor is very limited in the nature of the programs it runs (it's akin to the Amiga's copper, one could say).
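To illustrate the distinction (with a completely made-up register map - these are NOT real Jaguar addresses), driving a blitter looks like this from the CPU side:

```cpp
#include <cstdint>

// Hypothetical memory-mapped blitter. Addresses and bit layout are
// invented purely for illustration.
volatile uint32_t* const BLIT_SRC   = reinterpret_cast<volatile uint32_t*>(0xF00000);
volatile uint32_t* const BLIT_DST   = reinterpret_cast<volatile uint32_t*>(0xF00004);
volatile uint32_t* const BLIT_COUNT = reinterpret_cast<volatile uint32_t*>(0xF00008);
volatile uint32_t* const BLIT_GO    = reinterpret_cast<volatile uint32_t*>(0xF0000C);

// The CPU "programs" the blitter only in the sense of filling in these
// registers and kicking it off; the blitter itself fetches no instructions.
void blitCopy(uint32_t src, uint32_t dst, uint32_t words) {
    *BLIT_SRC   = src;
    *BLIT_DST   = dst;
    *BLIT_COUNT = words;
    *BLIT_GO    = 1;                       // start the operation
    while (*BLIT_GO & 1) { /* spin until the hardware clears the bit */ }
}
```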

What really messed up the Jag was all the hardware bugs and architectural shortcomings, plus the general ineptness of Atari Corp, of course.


*G*
 
Squeak said:
Procedural generation of geometry, textures and animation will become a necessity, not an option.
It will make games look more consistent and less constructed, so why be sad about it?


Sad? No, not "sad", just "how are they going to cope with the load" worried...

Personally, I REALLY think that the amount, diversification, and production of art assets will be much more of a problem than the "coding"...

We could say it is already happening today... (the case of advanced graphics but lame art versus not-so-advanced tech but beautiful art)
 
The PC Engine had three 8-bit processors: the 8-bit CPU and two 8-bit graphics chips that ran in parallel, which is the equivalent of a single 16-bit graphics chip, hence the name TurboGrafx-16.

Custom 8-bit CPU: HuC6280 (7.16 MHz, customised 6502)
Video Processor: HuC6270
Color Processor: HuC6260
 
The N64 was a pain to code for, but it only had 1 CPU core (a MIPS R4xxx) and its Reality Co-Processor GPU, which had two main parts:
Reality Signal Processor + Reality Display Processor.
I suppose you could say the RSP was the equivalent of 1 VU.
The RSP was also used to do audio generation, right?


Anyway, referring to the thread's theme, I think it won't be a problem. And if it were, surely someone would make the necessary tools.

From what I understood, this is an absolutely transparent thing in the CELL case, where the compiler or JIT compiler will do everything necessary to distribute the work between the APUs.
I don't know about the GC2 and XBOX2 architectures, since I really don't know precise details.
 
london-boy, art is more scalable than coding, though... it's just a question of money.

If you get enough power, then occlusion culling and automatic simplification will take the worry about polygon counts and overdraw off the artists' shoulders, at least. Maybe you will start to see some alliances with the CGI industry to get access to proprietary modeling tools.
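As a toy illustration of the "automatic simplification" side (everything here is invented for the example), the runtime half is just picking a pre-simplified LOD by distance:

```cpp
#include <vector>
#include <cmath>
#include <cstdio>

// A mesh with pre-simplified versions (LODs), finest first. The artist
// models once; an offline tool generates the simplified levels.
struct Mesh {
    std::vector<int> lodTriangleCounts;   // e.g. {20000, 5000, 1200, 300}
};

// Pick a LOD from camera distance so screen-space detail stays roughly
// constant -- the polygon-count worry moves from the artist to this code.
// lodScale is a tuning knob, invented here for illustration.
int selectLod(const Mesh& m, float distance, float lodScale = 10.0f) {
    int lod = int(std::log2(1.0f + distance / lodScale));
    int maxLod = int(m.lodTriangleCounts.size()) - 1;
    return lod < maxLod ? lod : maxLod;
}

int main() {
    Mesh rock{{20000, 5000, 1200, 300}};
    const float distances[] = {5.0f, 40.0f, 200.0f, 1000.0f};
    for (float d : distances) {
        int lod = selectLod(rock, d);
        std::printf("distance %6.0f -> LOD %d (%d tris)\n",
                    d, lod, rock.lodTriangleCounts[lod]);
    }
    return 0;
}
```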
 
...

this is an absolutely transparent thing in the CELL case, where the compiler or JIT compiler will do everything necessary to distribute the work between the APUs.
If that were true, then I would shut up and not come back.

"I can't imagine how you will actually program it" - Tim Sweeney, Epic.
Why can't Tim Sweeney imagine how he could possibly program CELL?? You will figure it out.
 
Why can't Tim Sweeney imagine how he could possibly program CELL?? You will figure it out.

Because he knows nothing about Cell; he hears "multiple cores on a chip" and automatically makes assumptions. Anyone would.
 