Old 14-Nov-2011, 22:19   #1
Shifty Geezer
uber-Troll!
 
Join Date: Dec 2004
Location: Under my bridge
Posts: 30,355
Feasibility of Cell emulation

If Sony moves away from Cell next gen, will any processor be able to emulate Cell effectively for BC? What are the architectural pitfalls that could prevent that? From some highlighted insights by T.B. about the intricacies like the MFC, I'm thinking a conventional multicore PPC, ARM or x86 won't be able to, but I'd rather hear from people who know about these things what they think!
__________________
Shifty Geezer
...
Flashing Samsung mobile firmwares. Know anything about this? Then please advise me at -
http://forum.beyond3d.com/showthread.php?p=1862910
Old 14-Nov-2011, 23:58   #2
Cyan
Senior Member
 
Join Date: Apr 2007
Posts: 3,860

Quote:
Originally Posted by Shifty Geezer View Post
If Sony moves away from Cell next gen, will any processor be able to emulate Cell effectively for BC? What are the architectural pitfalls that could prevent that? From some highlighted insights by T.B. about the intricacies like the MFC, I'm thinking a conventional multicore PPC, ARM or x86 won't be able to, but I'd rather hear from people who know about these things what they think!
If you say a multicore PPC, ARM or x86 won't be able to do it, then there isn't much left that could run Cell legacy code. I think the reason the Cell is difficult to emulate is that it will be very hard to find a similar design in next-gen consoles, or in processors in general. In all honesty, after reading developers' posts, and unless they are having serious issues with their user base asking for backwards compatibility, if I were Sony I would drop BC for good. Or I would create a compatible processor, just adding Hyper-Threading, something that developers are asking for when next generation starts.
__________________
Powered by ATi. Sophie Ellis Bextor is sheer love! Laura Jackson is the most beautiful woman who has ever existed! http://i1.ytimg.com/vi/AXaeEdu0FbQ/maxresdefault.jpg http://www.youtube.com/watch?v=pSBTC1HHN-E Heroes of Might and Magic: They aren't only games, they are passions. And feelings.
Old 15-Nov-2011, 00:47   #3
patsu
Regular
 
Join Date: Jun 2005
Posts: 27,477

There are business feasibility and technical feasibility. I assume you're talking about the second one, Shifty?
__________________
My wife pays up to hundreds of dollars for paintings we just hang on the wall. They do nothing, just hang there. Journey is interactive, so it does more than our paintings. Art can be expensive! Get over it!
-- 3rdamention@GAF
Old 15-Nov-2011, 10:13   #4
archangelmorph
Senior Member
 
Join Date: Jun 2006
Location: London
Posts: 1,551

Nothing currently available would likely be able to emulate the CELL with any reasonable performance...

Without SPEs you're going to have a very hard time getting the kind of throughput they can achieve on highly optimized, vectorized code, even on some of the fastest desktop CPUs.
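For illustration (my own sketch, not taken from any real PS3 codebase), the kind of loop in question looks something like this in plain C: branch-free, four-wide work on data already staged in local store, which real SPU code would express with 128-bit quadword intrinsics.

```c
#include <stddef.h>

/* Hypothetical sketch: the kind of tight, branch-free, 4-wide loop
 * the SPEs sustain when operands are staged in local store.  Real
 * SPU code would use 128-bit intrinsics; plain arrays are used here
 * so the sketch stays portable. */
static void saxpy4(float *restrict y, const float *restrict x,
                   float a, size_t n)
{
    /* n is assumed to be a multiple of 4, mirroring the SPE's
     * 16-byte-aligned 128-bit registers. */
    for (size_t i = 0; i < n; i += 4) {
        y[i + 0] += a * x[i + 0];
        y[i + 1] += a * x[i + 1];
        y[i + 2] += a * x[i + 2];
        y[i + 3] += a * x[i + 3];
    }
}
```

An SPE runs this sort of thing flat out because there are no cache misses to eat: everything is already in the 256 KiB local store. The emulation question is whether a conventional core can keep the same loop fed from its cache hierarchy.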
__________________
blog
twitter
Old 15-Nov-2011, 10:15   #5
Shifty Geezer
uber-Troll!
 
Join Date: Dec 2004
Location: Under my bridge
Posts: 30,355

Quote:
Originally Posted by patsu View Post
There are business feasibility and technical feasibility. I assume you're talking about the second one, Shifty?
Yes. The design of Cell suggests to me that it cannot be emulated by the next few iterations of processors. PPC would get closest as it could at least share the same ISA, and perhaps could incorporate features specifically for the purposes of emulation, but that'd be an odd business to enter into for a single product requiring Cell BC.
Old 15-Nov-2011, 10:16   #6
Arwin
Now Officially a Top 10 Poster
 
Join Date: May 2006
Location: Maastricht, The Netherlands
Posts: 14,922

The Cell would have to be partly emulated by CPU, partly by GPU. Imho only a new processor designed to be compatible with Cell could emulate it.

But emulation is probably dead anyway. We'll just see re-releases instead.
Old 15-Nov-2011, 16:26   #7
patsu
Regular
 
Join Date: Jun 2005
Posts: 27,477

Quote:
Originally Posted by archangelmorph View Post
Nothing currently available would likely be able to emulate the CELL with any reasonable performance...

Without SPEs you're going to have a very hard time getting the kind of throughput they can achieve on highly optimized, vectorized code, even on some of the fastest desktop CPUs.
Why? What are the specific trouble spots and use cases?
Old 15-Nov-2011, 18:00   #8
patsu
Regular
 
Join Date: Jun 2005
Posts: 27,477

Quote:
Originally Posted by Arwin View Post
The Cell would have to be partly emulated by CPU, partly by GPU. Imho only a new processor designed to be compatible with Cell could emulate it.

But emulation is probably dead anyway. We'll just see re-releases instead.
Emulation or backward compatibility. I wouldn't say they are dead. The PS2 -> PS3 transition may be an exception because Sony wanted people to buy more PS3 games given the big jump in investment.

If the transition is seamless to begin with, then minimal effort to upgrade old games (to run on the new console) may very well be welcomed by some. They can focus on the new game modes and assets rather than the engine code.
Old 15-Nov-2011, 21:21   #9
TheChefO
Naughty Boy!
 
Join Date: Jul 2005
Location: Tampa, FL
Posts: 4,656

I'd think it would make more sense to build on the Cell rather than try to emulate it.

The design is paid for and is better able to scale than most other designs. Granted, SPU usage seems to be a pain, but if they design it right, SPU usage may not be necessary for most games.

A 4-8+ core PPE Cell with 8(+) SPEs would still have BC, enabling Sony not to trash the digital content that consumers have purchased, and not to trash the investment developers have made in Cell over the past 5 years.

The GPU/RAM will be the defining edge next gen. As long as it isn't a pain to get code working (devs should know Cell well by now), they can get software on the shelves asap and have a viable launch lineup instead of trying to figure out how to take advantage of a new design.

Cheaper to produce and grow with and similar results on screen vs anything MS is likely to come up with.
__________________
"...the first five million are going to buy it, whatever it is, even if it didn't have games."
"I don't think we're arrogant"

...it seems laughable, laughable I tell you, that early 2012 technology that is under the 2005 budgets for the consoles cannot fit into a next gen box.
- Acert93
Old 15-Nov-2011, 23:34   #10
Grall
Invisible Member
 
Join Date: Apr 2002
Location: La-la land
Posts: 6,691

Quote:
Originally Posted by Arwin View Post
But emulation is probably dead anyway. We'll just see re-releases instead.
Re-releasing PS2 games, re-tooled to run on PS3, is feasible because PS2 was a relatively simple piece of kit. Cell is extremely high-performing; it's high-performing even today if you just look at the SPEs running optimized code. I believe you'd need something like a Sandy Bridge-E to top Cell's floating-point performance...

It's going to be a huge undertaking to re-write today's triple-A games for another system that isn't Cell compatible. It might very well be simpler for Sony to stick the currently available SPEs (six, unless the seventh has been unlocked by later firmwares) on whatever new CPU they decide to use, and then just re-compile the PPE code for the new chip.
__________________
"Du bist Metall!"
-L.V.
Old 15-Nov-2011, 23:37   #11
Arwin
Now Officially a Top 10 Poster
 
Join Date: May 2006
Location: Maastricht, The Netherlands
Posts: 14,922

The PS2 wasn't that simple a piece of kit, and personally I have little trouble imagining a next-generation console that is a much closer match to the PS3 (or the PS2, or both) than the PS3 was to the PS2.

I think precisely those things that are hard to do with anything but Cell now, will be a much more common feature/strength of upcoming designs.
Old 16-Nov-2011, 00:15   #12
patsu
Regular
 
Join Date: Jun 2005
Posts: 27,477

Quote:
Originally Posted by TheChefO View Post
I'd think it would make more sense to build on the Cell rather than try to emulate it.

The design is paid for and is better able to scale than most other designs. Granted, SPU usage seems to be a pain, but if they design it right, SPU usage may not be necessary for most games.

A 4-8+ core PPE cell with 8(+) SPE's would still have BC enabling Sony to not trash the Digital content that consumers have purchased and not trash the investment developers have in Cell over the past 5 years.

The GPU/Ram will be the defining edge next gen. As long as it isn't a pain to get code working (devs should well know Cell by now) then they can get software on the shelves asap and have a viable launch lineup instead of trying to figure out how to take advantage of a new design.

Cheaper to produce and grow with and similar results on screen vs anything MS is likely to come up with.
The pain is not generally associated with the SPUs. So far, the complaints have been leveled more at the limited split memory and the GPU's vertex limits. We even hear some developers marvel at the SPUs, although others use the SPUs simply as a regular vector/math engine. ^_^

The PPU is noticeably weaker than the SPUs. I assume Sony will improve it further, and perhaps make the SPUs (or SPU-like entities) work with the GPU(s) more efficiently?

OTOH, if the GPU (or GPUs) is powerful enough, the developers can also achieve similar results by running more iterations to approximate the result.
Old 16-Nov-2011, 16:00   #13
Persistantthug
Member
 
Join Date: Jun 2010
Posts: 109

Quote:
Originally Posted by Arwin View Post
The Cell would have to be partly emulated by CPU, partly by GPU. Imho only a new processor designed to be compatible with Cell could emulate it.

But emulation is probably dead anyway. We'll just see re-releases instead.

HD remasters/re-releases were only made possible because there was no network that had to be carried over.

Obviously, that's not the case anymore.

Consumers have a vested interest in the network... developers have a vested interest in the network... and even more important, Sony has billions of dollars of vested interest in the network.

Backwards compatibility for PS3 games is pretty much mandatory for the PS4.
Persistantthug is offline   Reply With Quote
Old 16-Nov-2011, 18:36   #14
TheChefO
Naughty Boy!
 
Join Date: Jul 2005
Location: Tampa, FL
Posts: 4,656

Quote:
Originally Posted by Persistantthug View Post
HD remasters/re-releases were only made possible because there was no network that had to be carried over.

Obviously, that's not the case anymore.

Consumers have a vested interest in the network... developers have a vested interest in the network... and even more important, Sony has billions of dollars of vested interest in the network.

Backwards compatibility for PS3 games is pretty much mandatory for the PS4.
This.

I think however Sony gets it done, they need to maintain BC (along with MS), due to the investment in digital content, if they expect any semblance of loyalty into next gen.


@Patsu: As long as the cpu can run native cell code without hassle, I think everyone involved will be happy. How they choose to grow from that point will be interesting to see, but if I were in their shoes, I'd minimize the headache/time-sink for developers to get code on the shelf, maintain BC, and let the GPU/Ram be the edge to compete nextgen.

Let's not forget, even if they did absolutely nothing to improve Cell next gen, and all they did was improve the GPU, PS4 would still see improved CPU performance, because devs could (finally) use Cell for things other than helping the GPU.

If this gen has taught us anything I'd say it's that the CPU is of little importance in differentiating the console experience for consumers.

For Sony to invest heavily on the CPU is just money that isn't spent on the (programmable) GPU...
Old 16-Nov-2011, 18:45   #15
Elan Tedronai
Junior Member
 
Join Date: Apr 2010
Posts: 70

What kind of question is this? There's nothing out there that can emulate the Cell to the point of consumer satisfaction. Nothing at all.

If Sony plans to make the PS4 backwards compatible, then they will have to use the Cell one way or another.
Old 16-Nov-2011, 19:17   #16
Shifty Geezer
uber-Troll!
 
Join Date: Dec 2004
Location: Under my bridge
Posts: 30,355

Quote:
Originally Posted by Elan Tedronai View Post
What kind of question is this?
A futile and pointless one. As a technical question asked in the technical forum, I had hoped for someone with processor savvy to talk about the issues facing Cell emulation, and what would be needed to get it working. But contributors could only regurgitate the non-technical "Is BC important" discussion already had, to be topped off with brainless contributions like your own founded on no technical savvy whatsoever. So I give up.
Old 17-Nov-2011, 06:51   #17
AlexV
Heteroscedasticitate
 
Join Date: Mar 2005
Posts: 2,433

I think it holds merit, so with all due apologies to Shifty I'll reopen this one. Also, in order to rile things up, I suggest that people do a slightly better job of documenting themselves... moving beyond "omagad it's deep magic™" would be good. CELL is old, cute but not exactly cutting edge: it doesn't have an awesome memory pipe, it doesn't have hardware caching, the ISA is not exactly from another galaxy, and IPC is pretty gimpy. As long as the emulator writers are actually aware of prefetching to cater for the DMA work (albeit modern prefetchers should do a reasonably good job anyway), you'd probably end up doing the bulk of the work from L2.

A reasonably modern core like Nehalem or Sandy Bridge is considerably ahead of CELL in terms of execution prowess, and it's working from a cache that's quite fast. You also have HT to help fill in some bubbles. More importantly, people have done work behind the scenes on this and it's reasonable, to say the least. So please, less rehashing of memes like: "we tweaked CELL so hard it's over 9000 now; if we had an updated version that ran at Graham's Number GHz and had Moser's Number SPEs it would crack the very fabric of the universe".
__________________
Donald Knuth: Science is what we understand well enough to explain to a computer. Art is everything else we do.
Old 17-Nov-2011, 09:44   #18
Shifty Geezer
uber-Troll!
 
Join Date: Dec 2004
Location: Under my bridge
Posts: 30,355

It's more the intricacies that make it look hard to me, based on the talk between Vitaly, T.B., and Sebbbi in a couple of Cell threads. The MFC is more sophisticated than we thought. Well, I guess another core could be used to do its workload, prefetching data to the cache.

TBH I know diddly squat about low-level CPU emulation! If it can be 'proven' that another architecture in a configuration suitable for a console (not a 16 core server CPU!) can perform BC duties admirably, that'd be one less of only a few reasons to stick with Cell, and if one of the architectures is better suited (ARM vs. x86) then that'd be a high probability choice for PS4.
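To make the MFC question concrete, here's a minimal sketch of the naive approach, with every name hypothetical: emulate an mfc_get as a plain memcpy into a 256 KiB buffer standing in for local store, and complete its tag group immediately. A real emulator would queue these transfers (perhaps on another core, as suggested above) and overlap them with SPU execution, which is where the real difficulty lives.

```c
#include <string.h>
#include <stdint.h>

#define LS_SIZE (256 * 1024)   /* the SPE local store is 256 KiB */

/* Hypothetical emulated SPE state: local store plus one bit per
 * completed DMA tag group (the MFC supports 32 tag groups). */
struct emu_spe {
    uint8_t  ls[LS_SIZE];
    uint32_t tag_done;
};

/* mfc_get analogue: copy 'size' bytes from effective address 'ea'
 * into local store at 'ls_addr', completing the tag at once. */
static void emu_mfc_get(struct emu_spe *spe, uint32_t ls_addr,
                        const void *ea, uint32_t size, unsigned tag)
{
    memcpy(spe->ls + ls_addr, ea, size);  /* the "DMA" itself       */
    spe->tag_done |= 1u << tag;           /* tag completes instantly */
}

/* mfc_read_tag_status analogue: with synchronous copies, the
 * requested tags are always already complete. */
static uint32_t emu_mfc_wait(const struct emu_spe *spe, uint32_t tag_mask)
{
    return spe->tag_done & tag_mask;
}
```

The synchronous version is trivially correct but throws away the double-buffering that real SPU code depends on for performance; recovering that overlap (while keeping the MFC's ordering semantics) is exactly the sophisticated part being discussed.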
Old 17-Nov-2011, 10:55   #19
Love_In_Rio
Senior Member
 
Join Date: Apr 2004
Posts: 1,157

BlueGene/Q maybe? There are rumours of it being modified in the MareNostrum supercomputer in Barcelona to be used in PS4.
Old 17-Nov-2011, 11:16   #20
Dominik D
Member
 
Join Date: Mar 2007
Location: Wroclaw, Poland
Posts: 702

Quote:
Originally Posted by Shifty Geezer View Post
It's more the intricacies that make it look hard to me, based on the talk between Vitaly, T.B., and Sebbbi in a couple of Cell threads. The MFC is more sophisticated than we thought. Well, I guess another core could be used to do its workload, prefetching data to the cache.

TBH I know diddly squat about low-level CPU emulation! If it can be 'proven' that another architecture in a configuration suitable for a console (not a 16 core server CPU!) can perform BC duties admirably, that'd be one less of only a few reasons to stick with Cell, and if one of the architectures is better suited (ARM vs. x86) then that'd be a high probability choice for PS4.
I think one has to answer one question first: do you want emulation, or simulation? Without simulation you can't replicate behaviour including stalls, race conditions, etc. For example, if you want to emulate/simulate a Commodore 64, you need certain things to be timed in a specific way so you can outsmart the "hardware" and put sprites on the border.

But if you don't care about that and you only aim for the sunny-day scenarios, I think you could get away with emulation on PPC, where you patch your code for the new architecture (e.g. substitute missing ops with existing ones, rewrite jumps, ...) and you get PS3 (or some other Cell) binaries running if you emulate enough hardware.

The thing is: with heavily multithreaded code you're going to hit all sorts of conditions that weren't there on Cell. Performance will be different and unpredictable, but if you have enough horsepower, you should be OK. This is pretty much what MS did with 360 back-compat, and that's why some games did not work (corner cases with horrible performance characteristics, and code too performant and HW-dependent to run in emulation on the 360).
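The "patch your code for the new architecture" step above can be sketched in toy form as a pass over a fixed-width instruction stream that rewrites unsupported opcodes into traps an emulator services at run time. The opcode values and encoding here are invented purely for illustration, not taken from any real ISA.

```c
#include <stdint.h>
#include <stddef.h>

/* Invented 32-bit encoding: opcode in the top byte, operands below. */
enum { OP_ADD = 0x10, OP_DMA = 0x42, OP_TRAP = 0xFF };

/* Rewrite every instruction whose opcode has no native equivalent
 * (here, the made-up OP_DMA) into a trap, preserving its operand
 * bits so the trap handler can reconstruct the original request.
 * Returns the number of instructions patched. */
static size_t patch_stream(uint32_t *code, size_t n)
{
    size_t patched = 0;
    for (size_t i = 0; i < n; i++) {
        uint8_t op = (uint8_t)(code[i] >> 24);
        if (op == OP_DMA) {
            code[i] = ((uint32_t)OP_TRAP << 24) | (code[i] & 0x00FFFFFF);
            patched++;
        }
    }
    return patched;
}
```

This is the sunny-day version: it assumes instructions can be identified statically, which self-modifying or computed-jump code defeats, and timing is not preserved at all, which is exactly the emulation-versus-simulation distinction above.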
__________________
Shifty Geezer: I don't think the guy really understands the subject.
PARANOiA: To be honest, Shifty, what you've described is 95% of Beyond3D - armchair experts spouting fact based on the low-level knowledge of a few.

This posting is provided "AS IS" with no warranties, and confers no rights.
Old 17-Nov-2011, 13:57   #21
Weaste
Member
 
Join Date: Nov 2007
Location: Castellon de la Plana
Posts: 175

I don't think that you could emulate it right now without SERIOUS cost restrictions.

Why bother to emulate it in any case? Even if they don't want to use it as their main CPU in whatever design they have for PS4, why not use it as some sort of satellite processor, not only providing BC for PS3 games but also giving whatever other CPU they might use a big fat multimedia/maths co-processor hung off the side? Not harking back to the days of the 68k and 68882 here, but isn't this basically how Toshiba have used the SpursEngine?

How much would it cost them to stick say a 28nm Cell into a PS4 even if it had a different main CPU?

Also, could the fact that they have not bothered to integrate Cell with RSX on the same die/package, as Microsoft have done, suggest that Sony as a whole has plans to use the chip in other CE devices once it's on a small enough node? What I mean is that the only reason to integrate it with RSX would be if they were never going to use it for anything other than PS3 in the future. Why have they not done this?
Old 17-Nov-2011, 13:59   #22
Weaste
Member
 
Join Date: Nov 2007
Location: Castellon de la Plana
Posts: 175

Quote:
Originally Posted by Shifty Geezer View Post
PPC would get closest as it could at least share the same ISA,
Yet SPU and PPU don't share the same ISA do they?
Old 17-Nov-2011, 20:13   #23
patsu
Regular
 
Join Date: Jun 2005
Posts: 27,477

Quote:
Originally Posted by AlexV View Post
I think it holds merit, so with all due apologies to Shifty I'll reopen this one. Also, in order to rile things up, I suggest that people do a slightly better job of documenting themselves... moving beyond "omagad it's deep magic™" would be good. CELL is old, cute but not exactly cutting edge: it doesn't have an awesome memory pipe, it doesn't have hardware caching, the ISA is not exactly from another galaxy, and IPC is pretty gimpy. As long as the emulator writers are actually aware of prefetching to cater for the DMA work (albeit modern prefetchers should do a reasonably good job anyway), you'd probably end up doing the bulk of the work from L2.
Yep, Cell is an exercise in losing weight to gain extra speed. It couldn't rely on powerful, complicated tech because of the power, heat and memory restrictions of 5-7 years ago. Once everything is simplified and carries "no extra weight", it runs fast naturally, as long as the programmer knows what he (or she) is doing.

Now, with more room and newer tech, one should be able to hit a similar performance envelope and then some. This is especially true if the new hardware ties the compute elements closer to the GPU (i.e., less communication overhead for Cell rendering work).

Quote:
A reasonably modern core like NHLM or SB is considerably ahead of CELL in terms of execution prowess, and now it's working from a cache that's quite fast. You also have HT to help fill in some bubbles. More importantly, people have done work behind the scenes on this and it's reasonable, to say the least. So please, less rehashing of memes like: "we tweaked CELL so hard it's over 9000 now, if we had an updated version that ran at Graham's Number GHz and had Moser's Number SPEWs it would crack the very fabric of the universe".
Cell programs should be relatively well behaved and predictable, since the code and run-time are highly disciplined (otherwise performance would suck). That's part of its DNA. We should be able to move the software around as long as the hooks to the outside world "look" the same to the SPUs and PPU. The added complication may be the subtle dependencies between RSX and Cell, since they work rather closely together. However, if the new GPU is significantly faster, it may be okay.

The question is how this will compromise other design choices (e.g., a more GPGPU-like setup). And what about PS2 emulation?

EDIT: The security subsystem is another consideration. We will need a way to segregate the hardware just like how the secure SPU works. Otherwise, even if the secure SPU code runs, the system is wide open.
Old 17-Nov-2011, 20:25   #24
patsu
Regular
 
Join Date: Jun 2005
Posts: 27,477

Quote:
Originally Posted by Dominik D View Post
The thing is: with heavily multithreaded code running on, say, PS3 you're going to hit all sorts of conditions that weren't there on CELL. Performance will be different and unpredictable, but if you have enough horsepower, you should be OK. This is pretty much what MS did with 360 back compat and that's why some games did not work (corner cases with horrible performance characteristics and code too performant and HW-dependent to run in emu on 360).
The current Cell software cannot go over the existing main and video memory bandwidth. The DMA and MFC insulate the SPUs' data and code from the rest of the world. So if they have some way to make DMA and MFC work within the "specs" in the new environment, it _should_ be ok.

In the worst case, when PS3 code is running, they can limit the use/influence of other programs.

EDIT:
Btw, that's one of the reasons I like Cell software. They should be "movable" after the fact because they rely on so little (hardware), and are more predictable ("everything" is specified in the code by the programmer). We should be able to run them in a new environment, or, in more adventurous mode, spread them across a few nodes, as long as we can guarantee the DMA/MFC performance somehow.

Even if the DMA/MFC performance doesn't match up, the ported code should run efficiently on other hardware just because data locality (for the entire software library/base) is observed.
Old 17-Nov-2011, 21:15   #25
3dilettante
Regular
 
Join Date: Sep 2003
Location: Well within 3d
Posts: 5,422

The abstraction of the external system from the SPE POV does remove some difficulties.
The SPE doesn't need to know about the protection model or system-level things like how memory is structured since it relies on asynchronous units for handling the translation.

Another decent thing is that questions about memory model are less difficult with regards to consistency. Since the LS is non-coherent, there is none.
An SPE could be plopped down in a wide variety of surrounding architectures, so long as the interface was updated.

The downside is that the ISA is explicitly different, and that does pose a problem unless the future SPE keeps the same ISA or it is architected to have a superset of the original or have dual mode decoding.
If the core trying to emulate the SPE is a standard coherent processor, it would need to also support the SPE's interfaces and then also have the regular methods of communication. There may need to be an LS mode or advanced cache control to match SPE performance in tightly optimized games.

This may not be a bad thing, since the ability to efficiently and quickly message and coordinate other cores was very evolved and has an edge in some scenarios over the standard SMP model.

Sony could try binary translation on the fly, or a wholesale translation of the code if ISA compatibility is not present. The fastest would be to translate and store the results as code is encountered. This is a known performant solution, but it raised copyright concerns because it copied code. Since Sony owns the platform and sometimes the developer, this may not be a problem.

What would likely not work is a code-morphing solution or pure software emulation. Such measures require straight-line performance that is a multiple of the SPE's in order to hide the extra work needed to emulate it. Since the SPE clocks at 3.2 GHz, it does not seem likely we will see 6-9 GHz replacements.
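The "translate and store the results as code is encountered" scheme described above reduces, at its simplest, to a cache keyed on the guest program counter, so each block pays the translation cost only on first encounter. A toy direct-mapped sketch, with the translator stubbed out and all names invented for illustration:

```c
#include <stdint.h>
#include <stddef.h>

#define TCACHE_SLOTS 1024

/* One cached translation: the guest PC it belongs to and where the
 * (pretend) translated host code lives. */
struct tblock { uint32_t guest_pc; void *host_code; };
static struct tblock tcache[TCACHE_SLOTS];

static int translate_calls;  /* counts how often the slow path ran */

/* Stand-in for real binary translation of the block at guest_pc;
 * here it only records the call and returns a unique token. */
static void *translate_block(uint32_t guest_pc)
{
    translate_calls++;
    return (void *)(uintptr_t)(guest_pc | 1u);
}

/* Direct-mapped lookup: translate on first encounter (or after a
 * conflict eviction), reuse the stored result thereafter. */
static void *lookup_or_translate(uint32_t guest_pc)
{
    struct tblock *slot = &tcache[(guest_pc >> 2) % TCACHE_SLOTS];
    if (slot->host_code == NULL || slot->guest_pc != guest_pc) {
        slot->guest_pc  = guest_pc;                   /* slow path */
        slot->host_code = translate_block(guest_pc);
    }
    return slot->host_code;                           /* fast path */
}
```

The copyright concern mentioned above arises precisely because the `host_code` side of this table is a stored derivative of the original binary; a pure interpreter avoids that, at the straight-line-performance cost described in the last paragraph.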
__________________
Dreaming of a .065 micron etch-a-sketch.
