> I have no idea.

AFAIK Cell was co-developed by IBM to be produced on their fabs. Can TSMC even produce a chip with an embedded Cell without entering into IP infringement? And if so, could they do it without an incredible amount of man-months/years dedicated to significant re-engineering?
> For a bulk of the library, possibly not, but for 100% 'play from disk' emulation, I think it necessary.

An 8-core 3.2GHz Zen 2 wouldn't be able to emulate the Cell? With two 256-bit FMA pipes, each Zen 2 core has four times the theoretical floating-point throughput of an SPE at iso clocks (25.6 vs 102.4 GFLOPS at 3.2GHz), and that's before accounting for much better utilization thanks to modern schedulers and much larger caches. Would Sony even need a Cell to emulate the PS3 at this point?
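A quick back-of-the-envelope check of those peak numbers (assuming single-precision FMA throughput, counting a fused multiply-add as two floating-point ops per lane; real-world utilization is of course far lower):

```python
# Back-of-the-envelope peak single-precision GFLOPS.
# Assumes each FMA lane retires 2 floating-point ops per cycle.
def peak_gflops(clock_ghz, fma_units, lanes_per_unit):
    return clock_ghz * fma_units * lanes_per_unit * 2

# Cell SPE: one 128-bit (4-lane SP) FMA pipe.
spe = peak_gflops(3.2, fma_units=1, lanes_per_unit=4)

# Zen 2 core: two 256-bit (8-lane SP) FMA pipes.
zen2 = peak_gflops(3.2, fma_units=2, lanes_per_unit=8)

print(spe, zen2)  # 25.6 vs 102.4 GFLOPS per core
```

That puts a single Zen 2 core at roughly four SPEs on paper, before any of the scheduling and cache advantages.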
> You really wouldn't need to. It's 4 mm²! Power draw is going to be a handful of watts

Using modern ARM cores would let Sony run them in standby mode with very low power draw, whereas applying power-saving features to a Cell would again require massive engineering effort.
> It's not really about a substantial benefit, but killing two birds with one stone. They need a secondary processor for IO. They want PS3 BC. All the arguments against including a Cell CPU, which are all valid, can likewise be levied at PS1's inclusion in PS2. Do PS2 gamers really want to play old-gen PS1 titles? Is it worth the expense rather than just using the PS2 CPU? Admittedly, PS1's inclusion would have been a lot cheaper than a Cell inclusion in PS5.

And this is all assuming Sony would see a substantial market benefit in enabling PS3 BC on the PS5. What demand is there to play PS3 games that weren't already ported to the PS4?
> For a bulk of the library, possibly not, but for 100% 'play from disk' emulation, I think it necessary.

I wonder if it really is. Just look at how far the RPCS3 developers have gone, probably from just doing JIT emulation through trial and error.
> The alternative would be running expensive, powerful hardware to emulate PS3 where a few watts of tiny silicon would be more efficient.

Nah, I think the alternative would be to emulate the PS3 using the exact same hardware that already powers PS5 games, at zero mm² cost, probably with the GPU massively downclocked.
I have no idea.
For a bulk of the library, possibly not, but for 100% 'play from disk' emulation, I think it necessary.
You really wouldn't need to. It's 4 mm²! Power draw is going to be a handful of watts.
It's not really about a substantial benefit, but killing two birds with one stone. They need a secondary processor for IO. They want PS3 BC. All the arguments against including a Cell CPU, which are all valid, can likewise be levied at PS1's inclusion in PS2. Do PS2 gamers really want to play old-gen PS1 titles? Is it worth the expense rather than just using the PS2 CPU? Admittedly, PS1's inclusion would have been a lot cheaper than a Cell inclusion in PS5.
Incidentally, Sony shrunk Cell to fit 8 onto a server rack for PSNow. Shrinking it even further, in a combined PS5/PS3 box which also runs PS4 and PS1/2 emulated games, would cover all the bases for offering the entire 25-year PS library. The alternative would be running expensive, powerful hardware to emulate PS3 where a few watts of tiny silicon would be more efficient.
I don't know. Nintendo also tried to kill two birds at once with the WiiU (with hardware Wii BC)... They created an inefficient Frankenstein.
> The thing that bothers me about discless Xbox is BC; my entire 360 library is physical.

Think I should have added a $450 discless Anaconda.
That would make things pretty interesting.
I'm guessing there's a growing number of people who would rather save $50 than have an optical drive now.
https://www.google.com/amp/s/seekin...-plus-19-percent-samsung-settlement-sony-deal
They don't do much outside of exploiting their generic patents for rumble motors, but they do have patents for dual-axis force feedback.
Pretty sure that means the DS5 will use a pair of Alps Haptic Reactors (and probably the next Move controller iteration will have one).
> Is it possible that the SPEs of Cell could be used for ray-tracing?

I have no idea.
> I don't know. Nintendo also tried to kill two birds at once with the WiiU (with hardware Wii BC)... They created an inefficient Frankenstein.

They based their entire console architecture on something incredibly outdated, which was dumb. Had they done it the way being suggested, as Sony did with the PS2, Nintendo would have designed a new system on a new architecture and only included the necessary, useful hardware from the GC. The CPU was only 19 mm², so it could have been coupled with a non-PPC processor as an audio or other processor. Or keep that part the same with a compatible PPC CPU and just reuse the parts of Hollywood that couldn't be nicely emulated.
Potentially different in this scenario is that they wouldn't necessarily be melding different architectures (WiiU GPU) or extending a dead architecture (WiiU tri-core of Watdom) so much as just using duct tape to hang the Cell off of some interconnect.
Wii was an OC'ed GameCube - the last time Nintendo tried to compete on HW power.
> The CPU was only 19 mm² so could have been coupled with a non-PPC processor as an audio or other processor.

Aren't gen-N+1 PowerPC processors' ISAs a superset of gen N's?
> That said there's going to be some low-level RSX headaches to deal with... it's not just all on Cell.

From what I've seen of the RPCS3 emulator, any DX12-era GPU above Intel's GT2 can emulate the RSX perfectly. They just say it needs Vulkan support, so I'm guessing anything with shader model 5 and up is good enough.
This implements a subset of a general-purpose file system that serves as a sort of bypass of the coexisting standard file system. By limiting the data to the specific use case of read-only, large game packages, and by using their role as platform maker to make assumptions on the OS and SSD side that independent OS and SSD manufacturers can't, a lot of the steps intended for managing arbitrary accesses, arbitrary clients, and protections can be skipped.

https://www.resetera.com/threads/ps...nd-sonys-ssd-customisations-technical.118587/
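To illustrate the idea (purely hypothetical, not Sony's actual design): once the store is read-only and built at install time by the platform holder, "opening a file" can collapse to a single table lookup mapping an ID straight to a location on the raw device, with no path walking or permission checks.

```python
# Hypothetical sketch of the "bypass" idea: a read-only package store
# that maps a file ID directly to (offset, length) on the raw device,
# skipping the directory traversal, permission checks, and generality
# a standard file system must provide for arbitrary clients.
import io

class PackageStore:
    def __init__(self, device, table):
        self.device = device  # file-like object over the raw device
        self.table = table    # file_id -> (offset, length), built at install time

    def read(self, file_id):
        offset, length = self.table[file_id]  # one hash lookup, no path walk
        self.device.seek(offset)
        return self.device.read(length)

# Demo with an in-memory "device".
device = io.BytesIO(b"HEADERlevel1-geometry...textures...")
store = PackageStore(device, {"level1": (6, 15)})
print(store.read("level1"))  # b'level1-geometry'
```

The interesting part is what's absent: no inode lookup, no open/close lifecycle, no write path, which is exactly the kind of simplification a platform holder can get away with and a general-purpose OS can't.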
This SIE patent is crazy: it suggests it's possible to go much faster than PCIe 3 and PCIe 4, maybe 10 Gb/s for the PS5 SSD... anywhere from 1 Gb/s to 20 Gb/s.
They invented their own file system...
> "Additional CPU." That one really could be Cell. Should be a few mm² at 7 nm, wouldn't need to be programmed by anyone other than Sony, and could do PS3 BC.

The secondary CPU's workflow in this instance is coordinating between the host, accelerator, and SSD controller. Accessing hash tables, running system software, reading buffers/signals, and breaking accesses into 64KB chunks doesn't seem like it pairs well with the vector-oriented SPEs, and the PPE's value-add may be questionable. I'm curious how its characteristics as a high-clock, long-pipeline, narrow SMT core with vector units measure up to the ARM cores found in SSD controllers.
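The chunking step described above is plain bookkeeping, the kind of branchy scalar work SPEs are poor at. A minimal sketch of what it involves (my own illustration of splitting a request into aligned 64 KiB chunks, not taken from the patent):

```python
# Sketch: break an arbitrary (offset, length) access into fixed 64 KiB
# chunks aligned to chunk boundaries, as a sub-CPU feeding an
# accelerator/SSD controller might.
CHUNK = 64 * 1024

def split_request(offset, length, chunk=CHUNK):
    """Yield (chunk_offset, chunk_length) pairs covering the request."""
    end = offset + length
    pos = offset - (offset % chunk)  # align first chunk down
    while pos < end:
        lo = max(pos, offset)
        hi = min(pos + chunk, end)
        yield (lo, hi - lo)
        pos += chunk

# A 200 KiB read starting 10 KiB into a chunk spans four chunks.
print(list(split_request(10 * 1024, 200 * 1024)))
```

Per request this is a handful of integer ops and a loop, which is why a small scalar core (like the ARM cores in SSD controllers) fits the job better than a wide vector unit.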
Conceptually, the accelerator's functions are in line with the compression done by controllers like those from Sandforce, and with now-common SSD hardware encryption.

From that patent:
There it is. Custom hardware for faster decryption and decompression, as we've been saying would be needed if significantly lower loading times were ever to be achieved.
This is something that could take several years for gaming PCs to catch up on. Maybe it can be done in software if people have 16+ cores and massive amounts of RAM to use as a storage scratchpad, but that would also require devs to build a massively parallel method for decompression/decryption.
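A rough sketch of what that software fallback could look like: decompress independently compressed chunks across all cores. This assumes the data was packaged as separate zlib streams, one per chunk, which is what makes parallel decompression possible at all; it's an illustration, not anyone's shipping scheme.

```python
# Parallel software decompression sketch: the package is stored as
# independently compressed 64 KiB chunks, so each chunk can be
# decompressed on a different core.
import zlib
from concurrent.futures import ProcessPoolExecutor

def compress_chunks(data, chunk=64 * 1024):
    """Package data as a list of independent zlib streams."""
    return [zlib.compress(data[i:i + chunk]) for i in range(0, len(data), chunk)]

def parallel_decompress(chunks):
    """Decompress chunks across all cores and reassemble."""
    with ProcessPoolExecutor() as pool:
        return b"".join(pool.map(zlib.decompress, chunks))

if __name__ == "__main__":
    data = bytes(range(256)) * 1024  # 256 KiB of sample data
    chunks = compress_chunks(data)
    assert parallel_decompress(chunks) == data
```

The catch the post alludes to: throughput scales with core count only if the chunks are independent, so the packaging format has to be designed for it up front.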
> AFAIK Cell was co-developed by IBM to be produced on their fabs. Can TSMC even produce a chip with an embedded Cell without entering into IP infringement?

I think that comes down to the rights held by TSMC's client.
> The main CPU generally hands things off to the sub-CPU, whose job is similar to what the cores in SSDs do internally, just on the same die this time. The host processor would send off a request and wouldn't come back until the final output had been copied to a destination buffer and the sub-CPU signaled completion.

Wait. I've grown up believing that DMA was the best thing after chocolate and sex, and now we're all happy because the CPU will thrash its cache manually managing all the IO?
> Does it mean that the PS5 controller will have a touch pad with haptic feedback?

Not sure; they sell patents for dual-axis resonant rumble, which is actually used in the Switch Joy-Con. It's mildly interesting for lower power consumption. The rest of their development doesn't seem to be useful or clever in any way. Their screen haptic stuff looks like patent trolling to me, but I'm not a patent lawyer.
It will be interesting for games if the touch pad really has haptic feedback.
> So that's 5 seconds then instead of 1.

That could be a critical hit to the seamless cinematic experience, though.
> I don't see next gen about being the real 4k, whatever that may mean to people.

All I'm saying is that the 1X was marketed as 4K. It's the main marketing point used for it, right up there with 6 TF. I don't think potential customers who've been fed this 4K, 6 TF marketing are going to forget it all. Next gen is a step backwards in resolution? A step backwards in processing power? I'm curious how you think a GPU with such similar specs is going to produce games that look a generation ahead. The GPU isn't the part of the 1X that's holding it back; this has been my point all along. Sure, Ryzen is a big step up from Jaguar, and that extra CPU power can speed up your simulation, AI and other stuff, but it's hard to market framerate. If Microsoft tries to market a less-than-4K, less-than-6TF GPU that produces 1X-quality visuals at higher frame rates as a next-generation machine, they are going to have to sit in the corner holding one swollen eye while they watch Sony spend their lunch money on Twinkies with the other.
But what you will get is better graphics, regardless of whether you buy a Lockhart or an Anaconda, when games are coded for them rather than the 1X.
I also never said Lockhart is aiming for 4k.
> I'm not sure you're not overthinking it, to be honest?

I haven't put that much thought into it, but I'll consider that while I contemplate the topic more.
> Right, but they're incompetent.

To be fair to Nintendo, they repackaged one of their least successful systems, the GameCube, into their most popular system ever. They tried it again with the WiiU and it failed to gain steam, but they didn't do it for no reason; the same concept had already been successful in the previous generation. Actually, some hardware choices for the SNES were made to maintain hardware compatibility with the NES before that idea was scrapped. The hardware remained, though. So they had successfully done it twice; the WiiU is the outlier.