*spin* Devs Showing Games on PC/Console HW

1) Perhaps if MS had gone with 64 MB+ of eDRAM off-chip, they wouldn't need to be demoing on PCs at this point, and would likely have had more room on their APU to go with 18+ CUs. But hey, what do I know? Heck, what does Intel know? What does Sony know? You get my drift... ;)

Demoing some games on PCs only means that MS is at a certain point in their production cycle; nothing more about yields or schedule can be confirmed from it, since we don't know what their original plan was. Intel's Haswell design validates the XBO design, not the other way around as you're trying to suggest. As for "what does Sony know", let's not have such short memories about Sony's past silicon design decisions and act as if they have been a model of simplicity and/or efficiency.
 
Or it's further along in the process.

They ran on devkits, not actual retail units. So whether it's easier to make isn't known, and as a point it's irrelevant.

I doubt an actual retail unit exists yet or that production has started either.

Dev kits using final hardware, as opposed to PCs using Nvidia cards.
 
Lots more third parties are publicly showing live demos on PS4 kits in alpha/beta form. At least they claim it's PS4:

Watchdogs
Destiny
NFS
Thief
AC
 
Speaking of which - on GAF many are complaining that the game has received a massive downgrade, primarily because it had a sluggish framerate, with what I recognise as vsync enabled (whereas before I'd seen a tonne of tearing in the PC versions). I think this was one of the first times I've seen a PS4 version - at the PS4 reveal they admitted they were showing the PC version.

I don't see a really big difference - the same depth-of-field effects for the camera, and perhaps a slightly more boring section of the city. Vsync on vs. off seems like the biggest difference to me.

What do the experts think?
 
I don't think we'll ever be able to sift through all the info until the games actually come out on these consoles.

Too many rumors are flying around, and many are tainted by console preferences. One moment things are being run on X, the next it's a rumor. He said, she said. Etc.

I know my own views are tainted, so it takes a lot of effort to keep that in check.
 
...Intel's Haswell design validates the XBO design, not the other way around as you're trying to suggest...

No, actually Intel is going with an off-chip eDRAM design, packing 128 MB (as I suggested).

As for Sony, they've obviously learned from their mistakes, bringing themselves back to where they originally came from: efficient, developer-friendly architecture design (PS1). They strayed from that philosophy over the years, but are now firmly back to it.
 
...Intel is going with an off-chip eDRAM design, packing 128 MB (as I suggested)... As for Sony, they've obviously learned from their mistakes...

Off-chip or on-chip is a minor distinction compared to the overall concept of using a (relatively) large, very low-latency cache. Intel has clearly invested heavily in this architecture. As it's been stated before, the decision to use 1T, 6T, on-chip, or off-chip is more about licensing, integration, shrinking, and cost reduction over time.

With regards to Sony, I agree they have clearly learned from their mistakes, but historically they've been a firm that used its knowledge in this area to create some very esoteric designs. So my response to "what does Sony know" was more about their company DNA not being about simplicity (or efficiency). They have some talented folks there for sure, but I wouldn't put Sony, as a company, in the category of having expertise in simple, optimal chip design (especially given how heavily they've leaned on AMD for this one).
 
Off-chip or on-chip is a minor distinction compared to the overall concept of using a (relatively) large, very low-latency cache. Intel has clearly invested heavily in this architecture. As it's been stated before, the decision to use 1T, 6T, on-chip, or off-chip is more about licensing, integration, shrinking, and cost reduction over time...

I disagree here. Traditional eDRAM (as used in the X360 and in Intel's latest) takes roughly 1/6th the transistors of SRAM, which roughly means 1/6th the die area, which means MS would have had more budget for a larger GPU that could match (or even exceed) the PS4 in CUs. It would also mean a smaller APU, easier and cheaper to manufacture, which in turn would likely have meant not having to rely on PCs at this E3 to show their wares (see the PS4 demos).
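
For scale, here's a rough back-of-envelope sketch in Python (assuming the commonly cited 32 MB ESRAM capacity and idealised cell counts; real arrays add overhead for sense amps, decoders, and redundancy, so treat these as floors, not exact figures):

```python
# Back-of-envelope: transistor cost of 32 MB of 6T SRAM (the XBO ESRAM)
# versus the same capacity as 1T eDRAM. Cell transistors only.

CAPACITY_BITS = 32 * 1024 * 1024 * 8   # 32 MB expressed in bits

sram_transistors = CAPACITY_BITS * 6   # 6 transistors per SRAM cell
edram_transistors = CAPACITY_BITS * 1  # 1 transistor (plus a capacitor) per eDRAM cell

print(f"6T SRAM:  {sram_transistors / 1e9:.2f}B transistors")
print(f"1T eDRAM: {edram_transistors / 1e9:.2f}B transistors")
print(f"Saved:    {(sram_transistors - edram_transistors) / 1e9:.2f}B transistors")
```

That works out to roughly 1.61B vs. 0.27B transistors, a saving of about 1.34B - a sizeable slice of an APU reported at around 5 billion transistors total.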
 