New article on Cell (good!)

Nicked said:
Zaxxon.

Make it happen Sega.

Don't tease me! Zaxxon and Galaga, what more could anyone want?

As for Cell, the only real downside I see for it, outside of the PS3, is the markets it is targeting. Many of those are slow and bureaucratic; the defense industry has some really cool stuff that never makes it to fruition. Television, music and movies, we all know how slow they are to adopt things (HD, anyone?), and the academic world is sometimes (I hesitate to say mostly) no better. Envisioning is one thing, implementing is another. Regardless of whether some, any, or all of the applications mentioned for Cell become realized, I would have to say it is successful today, merely by its inclusion in the PS3.
 
overclocked said:
These are from the Master System and early arcade days, hmm...
I may be mixing them up since I don't remember exactly, but Strider, Space Harrier...?

I'm rooting for an Alex Kidd in Miracle World remake myself... That series never lived up to the potential of the first game.

Miracle World > Super Mario Bros.

:)
 
cam said:
Which is why this quote from IGN seems strange: "the PlayStation 3 also nestles some pretty expensive technology -- not the least of which are Blu-Ray support and a built-in hard drive", in their CNN "PS3 = $500" write-up. I thought a hard drive as standard was just a rumour?

Could just mean that they will have the HDD inside the case too.
Personally I wouldn't mind different SKUs (even three would be fine), but with the baddest SKU launching first.
 
Pozer said:
I'm rooting for an Alex Kidd in Miracle World remake myself... That series never lived up to the potential of the first game.

Miracle World > Super Mario Bros.

:)

Hehe, that was a good platformer; I still have great memories from my childhood :)

Btw, what was that awesome platformer called? Wonder Boy in Monster World or something? It was a series, I remember, but that one was for the Mega Drive, I think.
 
A little heavy on the "holy crap" factor (they make it sound just a little TOO amazing), but damn, it caught my attention nonetheless.
Good times ahead.
 
I had trouble reading that article, because the PR BS is so thick it gave me a stomach ache. IT'S JUST A CHIP, PEOPLE! Yes, it's a very interesting design, and extremely well suited to media processing (image, video, audio, etc.), but it's not going to change the world, despite what Sony would like you to believe.
 
DemoCoder said:
I like CELL, but CELL is not the future IMHO, it is 1/2 of the future. I think the future is the combination of throughput style designs like the Niagara chip, and "SPE farm" approach of CELL, that is, lots of additional TLP combined with a large pool of functional units. If you're going to go the route of dropping OoOE and ILP scalability, you need TLP to make up for stalls.


That will hold us over until we get RSFQ and nanorod-based designs. :)


I agree. Cell is just a step to that future.
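DemoCoder's point, that once you drop OoOE and ILP scaling you need TLP to cover stalls, can be put in toy-model terms. This is just a sketch with made-up cycle counts (the `core_utilization` helper and its numbers are hypothetical, not from any real chip):

```python
# Toy model of latency hiding via TLP (Niagara/SPE-farm style).
# A thread alternates C compute cycles with S stall cycles (e.g. a cache
# miss). With no OoOE to cover the stall, one thread keeps the core busy
# only C/(C+S) of the time; with N hardware threads to switch between,
# another ready thread fills the stall cycles and utilization approaches 1.

def core_utilization(compute_cycles, stall_cycles, n_threads):
    """Fraction of cycles the core does useful work, assuming any ready
    thread can be swapped in whenever the running thread stalls."""
    busy = n_threads * compute_cycles
    return min(1.0, busy / (compute_cycles + stall_cycles))

# One thread, 10-cycle bursts separated by 90-cycle memory stalls: 10% busy.
print(core_utilization(10, 90, 1))   # 0.1
# Ten such threads interleaved keep the core saturated.
print(core_utilization(10, 90, 10))  # 1.0
```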


I like Intel's 'Platform 2015' outline, where chips will reach many billions of transistors and have several full-fledged cores, many multi-purpose cores / functional units, and specialized units, some for 3D rendering. They were talking about tens or hundreds of cores on a single chip, capable of hundreds or thousands of threads. Of course they couldn't all be full-fledged CPU cores, but maybe units that are more like the SPEs in Cell or the unified shader ALUs in Xenos. It sounds like Intel is working on a family of true 'super chips' that are CPU, GPU and more in one. All of that is pure vaporware at the moment, though.


I am trying to find some choice quotes about Intel's Platform 2015 efforts.

http://www.intel.com/technology/techresearch/idf/platform-2015-keynote.htm
The Trend to Many-Core
Rattner introduced the "many-core" concept, explaining that Intel researchers and scientists are experimenting with "many tens of cores, potentially even hundreds of cores per die, per single processor die. And those cores will be supporting tens, hundreds, maybe even thousands of simultaneous execution threads."

http://www.itnews.com.au/newsstory.aspx?CIaNID=18036&r=hstory
Intel's 10-year technology vision includes the ability to put hundreds of cores on one processor, which would permit the creation of scores of "arrays" on a chip devoted to separate functions like video, graphics or VoIP, one of the chip giant's top technologists said this week.


Since the chips are designed to hold hundreds of cores, "we fully expect these [core] arrays to be partitioned," Rattner said to reporters and analysts after his speech. Processors then could allocate scores of cores and threads for improved performance in specific applications, including graphics and media. A hundred cores "could be used over a range of applications," he said.


http://www.intel.com/technology/magazine/computing/platform-2015-0305.htm
1. Chip-Level Multiprocessing (CMP)
Intel continues pioneering in one of the most important directions in microprocessor architecture—increasing parallelism for increased performance. As shown in Figure 1, we started with the superscalar architecture of the original Intel® Pentium® processor and multiprocessing, continued in the mid-90s by adding capabilities like "out of order execution," and most recently introduced Hyper-Threading Technology in the Pentium 4 processor. These paved the way for the next major step—the movement away from one, monolithic processing core to multiple cores on a single chip.

Intel is introducing multi-core processor-based platforms to the mainstream. These platforms will initially contain Intel processors with two cores, evolving to many more. We plan to deliver Intel processors over the next decade that will have dozens, and even hundreds of cores in some cases. We believe that Intel's chip-level multiprocessing (CMP) architectures represent the future of microprocessors because they deliver massive performance scaling while effectively managing power and heat.


Figure 1. Driving increasing degrees of parallelism on Intel® processor architectures.


In the past, performance scaling in conventional single-core processors has been accomplished largely through increases in clock frequency (accounting for roughly 80 percent of the performance gains to date). But frequency scaling is running into some fundamental physical barriers. First of all, as chip geometries shrink and clock frequencies rise, the transistor leakage current increases, leading to excess power consumption and heat (more on power consumption below).

Secondly, the advantages of higher clock speeds are in part negated by memory latency, since memory access times have not been able to keep pace with increasing clock frequencies. Third, for certain applications, traditional serial architectures are becoming less efficient as processors get faster (due to the so-called Von Neumann bottleneck), further undercutting any gains that frequency increases might otherwise buy. In addition, resistance-capacitance (RC) delays in signal transmission are growing as feature sizes shrink, imposing an additional bottleneck that frequency increases don't address.

Therefore, performance will have to come by other means than boosting the clock speed of large monolithic cores. Instead, the solution is to divide and conquer, breaking up functions into many concurrent operations and distributing these across many small processing units. Rather than carrying out a few operations serially at an extremely high frequency, Intel's CMP processors will achieve extreme performance at more practical clock rates, by executing many operations in parallel². Intel's CMP architectures will circumvent the problems posed by frequency scaling (increased leakage current, mismatches between core performance and memory speed and Von Neumann bottlenecks). Intel® architecture (IA) with many cores will also mitigate the impact of RC delays³.

Intel's CMP architectures provide a way to not only dramatically scale performance, but also to do so while minimizing power consumption and heat dissipation. Rather than relying on one big, power-hungry, heat-producing core, Intel's CMP chips need activate only those cores needed for a given function, while idle cores are powered down. This fine-grained control over processing resources enables the chip to use only as much power as is needed at any time.

Intel's CMP architectures will also provide the essential special-purpose performance and adaptability that future platforms will require. In addition to general-purpose cores, Intel's chips will include specialized cores for various classes of computation, such as graphics, speech recognition algorithms and communication-protocol processing. Moreover, Intel will design processors that allow dynamic reconfiguration of the cores, interconnects and caches to meet diverse and changing requirements.

Such reconfiguration might be performed by the chip manufacturer, to repurpose the same silicon for different markets; by the OEM, to tailor the processor to different kinds of systems; or in the field at runtime, to support changing workload requirements on the fly. Intel® IXP processors today provide such capability for special purpose network processing. As shown in Figure 2, the Intel IXP 2800 has 16 independent micro engines operating at 1.4 GHz along with an Intel XScale® core

[image: pl2015_g2.gif]

my note: an example of a current Intel chip with one main core and 16 'micro engines' - I guess used as an example to show where Intel is going with so called 'many core' single die processors.
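The "divide and conquer" idea in the CMP passage above boils down to splitting one serial job into chunks and farming them out to a pool of workers, trading clock speed for parallelism. A minimal sketch, with the worker count standing in for core count (the `chunked_sum` helper is my own illustration, not anything from Intel):

```python
# Divide and conquer: break one big operation into many concurrent
# operations and distribute them across a pool of workers, instead of
# running one monolithic core at an extreme frequency.

from concurrent.futures import ThreadPoolExecutor

def chunked_sum(data, n_workers):
    """Split data into n_workers slices and sum the slices concurrently,
    then combine the partial results."""
    size = (len(data) + n_workers - 1) // n_workers
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        return sum(pool.map(sum, chunks))

print(chunked_sum(list(range(1000)), 8))  # 499500
```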


2. Special Purpose Hardware
Over time, important functions once relegated to software and specialized chips are typically absorbed into the microprocessor itself. Intel has been at the forefront of this effort, which has been the driving force behind our business model for over 35 years. By moving functions on chip, such capabilities benefit from more-efficient execution and superior economies of scale and reduce the power consumption drastically. Low latency communication between special purpose hardware and general purpose cores will be especially critical to meet future processor architecture performance and functionality expectations.

Special-purpose hardware is an important ingredient of Intel's future processor and platform architectures. Past examples include floating point math, graphics processing and network packet processing. Over the next several years, Intel processors will incorporate dedicated hardware for a wide variety of tasks. Possible candidates include: critical function blocks of radios for wireless networking; 3D graphics rendering; digital signal processing; advanced image processing; speech and handwriting recognition; advanced security, reliability and management; XML and other Internet protocol processing; data mining; and natural language processing.




[image: platform2015.jpg]

See, told ya I wasn't making it up when I said many billions of transistors, hundreds of cores and threads.

Now that *sounds* a lot cooler than Cell, in my opinion. But Cell is present-day technology, Cell is now, while Platform 2015 is up to 9-10 years away, with some of its key technologies emerging in Intel chips between a few years from now and then. I think it will be a gradual thing with Intel.

If Apple gets into the console race sometime within the next 10 years, they'd need something KILLER such as what Intel is blowing all that vapor about: not an upgraded Mac Mini with a conventional single- or dual-core CPU and a better graphics card. Something with a real 'Platform 2015' processor would blow the future next-gen Cell-based PS4 out of the water.
 
Titanio said:
I haven't even finished reading it, but already there's some nice new info in there, including new customers for Cell (Raytheon, for missile systems, Stanford University for a supercomputer), and a comment from Pandemic Studios on it etc.

The Raytheon inclusion seems odd to me. I've worked with them extensively over the past few years, and all of my work with them was on PPC440s and Power4/5s. Their code uses 64-bit floats extensively; I don't see how Cell is of much use to them right now in its current state.
 
There was talk of a revision which pushed DPFP at half the SPFP rate, so it could be that they're planning on using that.
 
Asher said:
The Raytheon inclusion seems odd to me. I've worked with them extensively over the past few years, and all of my work with them was on PPC440s and Power4/5s. Their code uses 64-bit floats extensively; I don't see how Cell is of much use to them right now in its current state.
NDA? ;)
 
Hmmm, sounds like cloud-nine thinking here...

Masakazu Suzuoki, Sony's lead designer on Cell, says Sony aims to use this power to create movies that are interactive and changeable, with multiple story lines, so people will watch the same flick more than once.
There are very few films that can manage even one half-decent story line. Coming up with several for the same film will be very hard, and making them actually work well together nigh on impossible. It's also stupid: what's the point in trying to squeeze multiple stories into the same places and times with the same characters? Take, say, Spider-Man. What variations can be added to change the story without coming up with something that's totally different and may as well be a different film? And if they're thinking of stories more akin to those game books with the dice rolling and decision making, that often end in failure (10 minutes into the film you choose an option that sees the hero fall to his death and the movie ends...), that's what computer games are for! If you want a film people will watch more than once, get a good story with good production values. It has worked since time immemorial and doesn't need changing!

As for the future of super chips, it seems to me that whoever invents main RAM that can be accessed in one cycle direct to the CPU will pave the way forward. Memory accessing still remains the bottleneck more than anything else. Cell's performance benefit is attained by working round this, which isn't always possible.
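The main way Cell works round memory latency is software double buffering on the SPEs: while the core crunches one block, the next block is already in flight. A minimal sketch of the loop structure (the `fetch`/`compute` callbacks are stand-ins; real SPE code issues asynchronous DMA rather than the synchronous calls here):

```python
# Double-buffered streaming: prefetch the next block while computing the
# current one, so memory latency overlaps with compute instead of adding
# to it. On an SPE, fetch() would be an async DMA into local store.

def process_stream(blocks, fetch, compute):
    """Classic double-buffer loop over a stream of blocks."""
    results = []
    if not blocks:
        return results
    current = fetch(blocks[0])          # prime the first buffer
    for nxt in blocks[1:]:
        pending = fetch(nxt)            # "DMA" the next block in
        results.append(compute(current))  # work on the current block
        current = pending               # swap buffers
    results.append(compute(current))    # drain the last block
    return results

# Toy usage: "fetching" doubles the value, "computing" adds one.
print(process_stream([1, 2, 3], lambda b: b * 2, lambda b: b + 1))  # [3, 5, 7]
```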
 
Asher said:
The Raytheon inclusion seems odd to me. I've worked with them extensively over the past few years, and all of my work with them were on PPC440s and Power4/5s. Their code uses 64-bit floats extensively, I don't see how Cell is of much use to them right now in its current state.

I guess we'll find out soon, I'd presume an announcement will be coming before the end of the month if they were providing comment for an end-of-month article. They say they've been studying it for over 15 months now, so I guess they either found applications, or a DP-focussed revision is coming (as Xenus says).

Shifty Geezer said:
Hmmm, sounds like cloud-nine thinking here...

There are very few films that can manage even one half-decent story line. Coming up with several for the same film will be very hard, and making them actually work well together nigh on impossible. It's also stupid: what's the point in trying to squeeze multiple stories into the same places and times with the same characters? Take, say, Spider-Man. What variations can be added to change the story without coming up with something that's totally different and may as well be a different film? And if they're thinking of stories more akin to those game books with the dice rolling and decision making, that often end in failure (10 minutes into the film you choose an option that sees the hero fall to his death and the movie ends...), that's what computer games are for! If you want a film people will watch more than once, get a good story with good production values. It has worked since time immemorial and doesn't need changing!

I don't think it's impossible, but it would require a little change to how movies are made. But have you never wished in a movie that the director focussed more on a particular character or sub-plot? You could follow one particular character's story over another's, while still being consistent with the story as a whole. It's actually a technique I've seen used already in movies (replaying the same events multiple times from different characters' perspectives), and it can work really well. Although I don't see how simply switching to another arc or version of the story would require lots and lots of processing power, so perhaps he's thinking of something different :p
 
Any guesses what Pandemic might be working on? A new Battlefront game, perhaps? Do we know of any next-gen games they're working on at all?
 
Hehe, good stuff. Kind of surprised they have four projects on the boil!

On another note about Cell and new clients, there are a couple of reports out there saying that the US Department of Homeland Security is using Mercury's Cell systems now e.g.

http://www.technewsworld.com/story/48274.html

Although there are only a couple, so I'm not sure how confirmed that is...
 