'Graphic North Bridge' PlayStation 4's Custom Chip?

That's not going to cut it for full game downloads initiated by the PlayStation App (or auto-predicted ones). In that case, I can't see additional flash storage (slow or not) being cost-justified when you might as well just send it all to the HDD.

EDIT

Unless you mean buffering as in storing up a few gigs of data first and then sending it to the drive once the flash fills. That way, large sequential blocks of data get written instead of potentially smaller chunks. I'm not sure what kind of power saving that would really offer, though.
You got it. If you download something at 1 MB/s (Cerny talked specifically about slower connections; I guess Sony doesn't take a 100+ Mbit/s connection for granted all over the world), you need more than an hour to fill a small 4 GB flash buffer, which uses a lot less power than a spinning hard drive. Once that buffer is full, the system writes the 4 GB to the hard drive and powers it down again.
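To put rough numbers on it, here's a quick back-of-the-envelope sketch. Every figure below is an assumption for illustration (the 4 GB buffer, the drive and flash power draws, the flush speed); none of them are actual PS4 specs:

```python
# Back-of-the-envelope comparison: buffer-then-flush vs. always-spinning HDD.
# Every number below is an assumption for illustration, not a PS4 spec.
download_rate_mb_s = 1.0      # the slow connection Cerny mentioned
buffer_mb = 4 * 1024          # hypothetical 4 GB flash buffer
hdd_active_w = 6.0            # typical 2.5" HDD while writing (assumed)
flash_write_w = 0.3           # flash power while buffering (assumed)
hdd_flush_mb_s = 60.0         # sequential write speed to the HDD (assumed)

fill_s = buffer_mb / download_rate_mb_s     # ~68 min to fill at 1 MB/s
flush_s = buffer_mb / hdd_flush_mb_s        # ~68 s burst to the spun-up drive

buffered_j = fill_s * flash_write_w + flush_s * hdd_active_w
always_on_j = fill_s * hdd_active_w         # drive spinning the whole time

print(f"fill: {fill_s/3600:.1f} h, flush: {flush_s:.0f} s")
print(f"buffered: {buffered_j/1000:.1f} kJ vs always-on: {always_on_j/1000:.1f} kJ")
```

With those made-up figures, the buffered scheme spends roughly 1.6 kJ per 4 GB chunk against roughly 25 kJ for keeping the drive spinning the whole download, which is the whole point of coalescing the writes.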
I think the thread and the interview mentioned some (upcoming) regulations about standby power consumption. It would be a pity if Sony (or MS) were forced to deactivate the standby connectivity/background download functionality with a firmware update in some countries just because it sucks 2W too much to pass the regulation, wouldn't it?
 
I think the thread and the interview mentioned some (upcoming) regulations about standby power consumption. It would be a pity if Sony (or MS) were forced to deactivate the standby connectivity/background download functionality with a firmware update in some countries just because it sucks 2W too much to pass the regulation, wouldn't it?

Indeed. MrFox, however, addressed this point in the PS4 tech thread. The EU stand-by mode power reqs for devices only seem to apply if the device isn't actually doing anything. Downloads and other activities don't have to adhere to the 2W (in standby mode; 1W in off mode) requirement.

I'm still not sure I understand it completely, but it looks like the directive can almost be ignored for both the PS4 and the 720. As long as it has a reason to be up, it's not considered standby. Even the GDDR5 refresh.
http://ec.europa.eu/energy/efficien...tion/guidelines_for_smes_1275_2008_okt_09.pdf

His post also includes some examples of the exceptions from the PDF.

EDIT

My watt numbers appear to be incorrect and were the old requirements. I think this is what the PS4 may be facing:

4 years after the regulation is in force (December 2012):

Power consumption in off mode must be 0.5 Watts or less;
Power consumption in standby mode which allows reactivation must be 0.5 Watts or less;
Power consumption in standby mode which allows reactivation and displays information (such as a clock) must be 1 Watt or less.

Same exceptions noted previously seem to apply, though.
 
Indeed. MrFox, however, addressed this point in the PS4 tech thread. The EU stand-by mode power reqs for devices only seem to apply if the device isn't actually doing anything. Downloads and other activities don't have to adhere to the 2W (in standby mode; 1W in off mode) requirement.
That's why I added the "upcoming" in brackets to my sentence. ;)
That regulation was decided on in 2005 and implemented at the end of 2008. Don't you think such things will get more stringent in the future (in fact, that European regulation halved the allowed consumption for products sold starting this year, and afaik there is some consideration going on about introducing additional limits for idle power consumption)? I mean generally, not just in the European Union. It can't be wrong to design the system in a way that reasonably minimizes power consumption in the connected standby mode.
 
This is what Mr. Rigby PM'ed me on GAF:

Mod edit: I'm uncomfortable allowing a banned member to talk by proxy. The guy was banned for being an impossible debater with a love of excesses of information, little of which had any bearing on the topic. There's nothing wrong with a theory like XMB being replaced with an HTML5 interface, but the spamming of reams of text in complete denial, and in technicolor to drive his unlikely points home, was just generating crazy amounts of noise. (If you disagree with me, I can keep reposting that every 5 minutes in different colours until you do....)

If anyone wants to sort through his stuff elsewhere and post salient info he might have found (and the guy spends a lot of time looking and might well find some nugget or two), sure. But allowing him to spread reams of random thinkings on this forum once again goes against the principles of a member being banned, and the principles of wanting to keep noise to a minimum.
[/mod]



Yo jeff, why don't you just post this stuff to your GAF thread so everyone else can read it? I'd hate to be the middleman here. :devilish:

To other readers, please apply the usual salt for Internet posts.
 
rigby's being banned here makes me doubt the utility of trying to debate him by proxy.

The base assumption that the PS4 (and probably Durango, too) cannot be always online while staying under 500 mW is disputable, and with it the first half of all those words. Standby allows transitioning to other modes based on a remote switch or timer. Unless we think the console needs to poll an update server every second, it can just transition to a network-update state periodically.
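As a sketch of what that periodic transition could look like (entirely hypothetical: the interval, the function names, and the flow are all mine, not anything from actual firmware):

```python
import time

POLL_INTERVAL_S = 4 * 3600  # wake every few hours; the interval is a guess

def standby_loop(check_for_updates, download_updates):
    """Sleep in true standby, wake on a timer, poll once, drop back down.
    While it's polling or downloading, the console isn't in 'standby' as
    the directive defines it, so the 0.5 W cap wouldn't apply then."""
    while True:
        time.sleep(POLL_INTERVAL_S)   # timer-based reactivation out of standby
        if check_for_updates():       # brief network-update state
            download_updates()
        # loop back around: re-enter standby until the next timer tick
```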

I'm not sure reposting a PM is okay unless he's fine with it, but even then, his being banned makes it seem kind of silly to have his posts come through via another means.
 
It kinda depends on the vendors' visions. If these devices are also acting as a listening server at the same time (e.g., IP phone, WAN RemotePlay, home security server, blah), then there may be interesting scenarios for next-gen consoles to handle. Today, a sleeping PS3 handles a LAN RemotePlay request via Wake-on-LAN, but I'm not sure if this alone is sufficient. We have to wait quite a while for the PS3 to boot up and load the RemotePlay server. We also always RemotePlay into the default user account first.
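(For reference, the wake-up itself is trivial on the network side; a standard WoL magic packet is all it takes. Generic Python below; whether the PS3/PS4 actually listens on the conventional UDP port 9 is not something I know, and the MAC is a placeholder.)

```python
import socket

def send_wol(mac: str, broadcast: str = "255.255.255.255", port: int = 9) -> None:
    """Broadcast a standard Wake-on-LAN magic packet:
    6 bytes of 0xFF followed by the target MAC repeated 16 times."""
    mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
    packet = b"\xff" * 6 + mac_bytes * 16
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.sendto(packet, (broadcast, port))

send_wol("00:11:22:33:44:55")  # placeholder MAC
```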

FWIW, the recent, very loud "always-on(line)" complaints may be missing some key info.

I do agree that if rigby has more to say, he should post his findings himself. ^_^
 
Always-on in the context of the other thread is a completely separate topic with a lack of precise naming. Technically, the issue being debated could happen even if the device were unable to perform any network activity when not fully active.
 
Yeah, I think the "always on" complaints may be missing some context and may be too narrow. It shouldn't be treated as just a client or gaming topic. These devices are flexible like a PC, albeit with a low-power mode. They are capable of executing the telcos'/cable companies' quad-play strategy.

I doubt Sony is banking on 4K to push the PS4 later on. They will have to find additional angles. Gaming is the current focus.
 
Could it be possible that the PS4 GPU is designed kinda like the upcoming AMD Volcanic Islands GPU but with Starsha in place of the 8 Serial Processing Modules?

[Image: AMD+Next+Generation+GPU+SPU.jpg]



[Image: Volcanic-Islands2-635x645.jpg]
 
Could it be possible that the PS4 GPU is designed kinda like the upcoming AMD Volcanic Islands GPU but with Starsha in place of the 8 Serial Processing Modules?

[Image: AMD+Next+Generation+GPU+SPU.jpg]



[Image: Volcanic-Islands2-635x645.jpg]

Yeah, except with Jaguar cores. It basically looks like an APU diagram. Is that AMD's plan? Calling APUs on discrete cards GPUs?
 
They may be able to tap into the console optimizations for games if they reuse the architecture and tools here as much as possible.

Sorry onQ, no Starsha sighting here. :)
 
OnQ, that's not an upcoming VI plan. It's a mockup made by a forumite a while ago of the architecture he thought would be ideal. Apparently someone turned it 90 degrees, changed the labels, and published it as a super-secret leak.

Seriously, 90% of everything posted on the internet about upcoming stuff is bullshit. Including on the big-name websites. You have got to stop taking things at face value when posted without any confirmations.
 
OnQ, that's not an upcoming VI plan. It's a mockup made by a forumite a while ago of the architecture he thought would be ideal. Apparently someone turned it 90 degrees, changed the labels, and published it as a super-secret leak.

Seriously, 90% of everything posted on the internet about upcoming stuff is bullshit. Including on the big-name websites. You have got to stop taking things at face value when posted without any confirmations.

I see, it's basically just taking the scalar ALUs & placing them in their own unit & connecting them back to the compute units using the UNB.

But I'm still asking the same question: could the PS4 GPU CUs be connected to the Starsha GNB the same way?
 
Setting aside that this seems to be a random person's mock-up, and that I might find some of the decisions debatable, it's still not showing what you say it's showing.

Those aren't scalar units moved from the CUs to the other side of the UNB, which would be very bad.
The diagram maker relabeled a hypothetical 16-core Bulldozer.
 
OK, I'll admit I barked up the wrong tree because the 320 ALUs seemed too weak to be the PS4 GPU, but now I think I know why it seemed too weak when it's really powerful.

"Scalar ALU’s 320"

[Image: MIEnKne.png]


Credit to mistercteam for the patent image.
 
OK, I'll admit I barked up the wrong tree because the 320 ALUs seemed too weak to be the PS4 GPU, but now I think I know why it seemed too weak when it's really powerful.
In what way?
What's the math?

"Scalar ALU’s 320"
Please state what is implied by this sentence fragment. In what way are the commonly used words to describe microarchitecture, "scalar" and "ALU", of special significance right now?

[Image: MIEnKne.png]


Credit to mistercteam for the patent image.
I give credit to the people who filed this at ATI in 2005.
Please clarify what other credit I am to give.
What, other than you seeing this image out of context from that individual, makes you think it is relevant?
 
In what way?
What's the math?


Please state what is implied by this sentence fragment. In what way are the commonly used words to describe microarchitecture, "scalar" and "ALU", of special significance right now?


I give credit to the people who filed this at ATI in 2005.
Please clarify what other credit I am to give.
What, other than you seeing this image out of context from that individual, makes you think it is relevant?

The ALUs are scalar ALUs & not the normal vector ALUs that are in the AMD GCN Compute Units.
 
The ALUs are scalar ALUs & not the normal vector ALUs that are in the AMD GCN Compute Units.

I don't think that sentence means as much as you think it does.

What do you think it means for an ALU to be scalar?
What do you mean by *pretty much every generic tech term in that sentence*?

How many AMD GCN compute units are equivalent to those 320 scalar ALUs you write words about?
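For reference, if one charitably reads "320 scalar ALUs" as 320 SIMD lanes, the arithmetic is short (the 64-lanes-per-CU figure is standard GCN; the comparison against the PS4's announced 18 CUs is my addition):

```python
lanes_per_cu = 4 * 16       # GCN compute unit: 4 SIMD units x 16 lanes each

print(320 / lanes_per_cu)   # 5.0  -> 320 lanes is only ~5 CUs' worth
print(18 * lanes_per_cu)    # 1152 -> lane count of the PS4's announced 18 CUs
```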
 