Ed Fries and Krazy Ken versions of Xbox One and PS4

Proelite

According to this video, http://www.viddler.com/v/10738ac4?secret=81714898

Microsoft confessed that they didn't make the best graphically performing console that they could have.

If Ed Fries (who was disappointed in the 360's power!) and Krazy Ken were still around to head the development of the Xbox One and PS4 back in 2008/2009, in some evil, macabre alternate version of video game history, what kind of overheating, overpriced, and inefficient monsters would they have come up with?

3D wafer stacked memory?
Discrete GPUs?
Delay a year and wait for Hybrid Memory Cubes?

Note that this thread is similar to http://forum.beyond3d.com/showthread.php?t=63663, but logic and sanity do not apply here.
 
Delaying a year doesn't seem like a good option; they already delayed the next gen by two years due to the WFC, and tech is always improving, so you just have to make the best of the tools available to you.

There are a few nice things happening in 2014, but I just feel like the consoles HAVE to launch this year and get people excited for consoles again; this generation has gone on too long.
 
An Xbox One focused on the highest end of graphics: what would that have been?

More CUs? GDDR5?

Ken Kutaragi's PS4 might have been an APU + discrete GPU with lots of on-board stacked DRAM. Sony's CTO talked about such an approach back in 2010.
 
In my opinion, MS didn't go crazy with the Xbox One's performance because MS believes that once its streaming-based solution goes online, the amount of performance generated locally won't matter much.

Look at where MS poured its money: server infrastructure, and a big pool of low-latency SRAM paired to a fast, wide I/O bus, are two such areas. Furthermore, MS lowered the latency on its controllers and Kinect.

A streaming solution's biggest enemy is latency, and one of MS's main goals seems to have been to cut as much latency as possible within the console. And I'm guessing they did the same thing on the server end too.

Given that MS's design is the more exotic of the two, Ken would have been just fine with the Xb1, as its design seems to fall within all the crazy talk that surrounded Cell.
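
To make the latency point concrete, here's a back-of-the-envelope sketch of how a streamed frame's end-to-end budget adds up; every stage figure is my own illustrative assumption, not anything MS has published:

```python
# Back-of-the-envelope latency budget for one streamed frame, in milliseconds.
# Every figure below is an illustrative assumption, not a published number.
budget_ms = {
    "controller input -> console": 4,    # assumed low-latency controller path
    "console -> server round trip": 30,  # assumed nearby datacenter
    "server render + encode": 16,        # roughly one 60 fps frame, assumed
    "client decode + display": 10,       # assumed decoder and TV lag
}

for stage, ms in budget_ms.items():
    print(f"{stage}: {ms} ms")
print(f"total: {sum(budget_ms.values())} ms")
```

Even with optimistic numbers the total lands around 60 ms, a whole 60 fps frame of delay, which is why shaving a few milliseconds at every stage, controller included, matters so much for streaming.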
 
Delaying a year doesn't seem like a good option; they already delayed the next gen by two years due to the WFC, and tech is always improving, so you just have to make the best of the tools available to you.

There are a few nice things happening in 2014, but I just feel like the consoles HAVE to launch this year and get people excited for consoles again; this generation has gone on too long.
Yes, it lasted way too long, especially for those who bought an Xbox 360 from day one.

The video is great though, thanks for sharing, ProElite!!

The CPU seems okay. And the GPU can go longer than we might expect if you don't need the absolute bleeding edge a la Titan. I am very interested in how games actually perform on the platform, because the architecture is really, really interesting overall. After E3, I think the Xbox One can rival the PS4 and PCs and give them a run for their money.

Speaking in terms of AMD and their new technology, an 8000-series GPU would have to wait until next year to crystallize in a console.

With Sony launching its console a year earlier, there would be only one advantage to waiting: there would be new software really pushing the envelope compared to when both consoles were originally going to be released.

I still say that a little over a year later is too long, though. They could skip launching this year if they wanted to, but what's the point when they've already completed a great machine?

It would mean waiting for the next architecture base to be released and then touching things up here and there, so in the end they'd need two years.

We can be amazed and play incredible games with these specs. Of course, eight years from now it will become far more important to move to 4K resolution specifications.

That will put far more of a premium on the GPU and on computational horsepower.
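
Just to put a number on that premium (simple pixel counting, nothing more):

```python
# Simple pixel counting: the shading load 4K adds over 1080p.
pixels_1080p = 1920 * 1080  # 2,073,600 pixels
pixels_4k = 3840 * 2160     # 8,294,400 pixels

print(f"4K / 1080p pixel ratio: {pixels_4k / pixels_1080p:.0f}x")  # -> 4x
# At the same frame rate and per-pixel shading cost, that's roughly
# 4x the pixel work and fill-rate demand before anything else changes.
```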

Gaming machines are always works in progress: the initial up-front build cost is the worst thing about them, but of course you get far more functionality out of them over the long term.
 
An Xbox One focused on the highest end of graphics: what would that have been?

More CUs? GDDR5?

Start with at least 4 Piledriver-ish cores clocked at at least 3.2 GHz, imo.

If you're going for high performance, forgo an SoC, imo.

That's the area next gen lacks: 1.6 GHz Jaguar cores are pretty light on single-thread performance.
 
Start with at least 4 Piledriver-ish cores clocked at at least 3.2 GHz, imo.

If you're going for high performance, forgo an SoC, imo.

That's the area next gen lacks: 1.6 GHz Jaguar cores are pretty light on single-thread performance.
Jaguar cores aren't bad; they should do the job. Going with faster CPU cores would have eaten significantly into the power (if not silicon) budget, even with a beefier power budget.
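
A rough way to see that trade-off: dynamic power scales with frequency times voltage squared, so doubling the clock costs far more than double the watts once the voltage has to rise too. A minimal sketch, with the voltage figures being my own illustrative assumptions rather than vendor specs:

```python
# Why faster cores eat the power budget: dynamic power scales roughly as
# P ~ C * V^2 * f (capacitance times voltage squared times frequency).
# Voltage figures below are illustrative assumptions, not vendor specs.

def relative_dynamic_power(freq_ghz, volts, base_freq=1.6, base_volts=0.9):
    """Dynamic power relative to an assumed 1.6 GHz / 0.9 V Jaguar-class core."""
    return (freq_ghz / base_freq) * (volts / base_volts) ** 2

# A Piledriver-ish core at 3.2 GHz plausibly needs something like 1.2 V:
print(f"~{relative_dynamic_power(3.2, 1.2):.1f}x dynamic power per core")
# -> ~3.6x per core, before counting the bigger core's extra switched capacitance.
```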

When I look at MSFT's silicon budget (Sony's supposedly being lower), I'm not sure the SoC is the issue, imo.
North of 400 mm^2 allows for a lot of ALUs.

The real issue here is that even with an enormous R&D budget, I think the benefit of something custom would have been slight, and it would also have eaten into the (R&D) budget for the GPU. WRT the GPU, I would think that even with a humongous R&D budget one would not match Nvidia's or AMD's in-house efforts, which have gone through many iterations.

Anyway, I agree with Archie4oz: had KK still been there, I could have seen him sticking with Cell and what could have been its heir, going with a "Larrabee type" of setup / a homogeneous architecture with some specialized units (tex, for example).
I think KK was THE real geek; he would not have passed on a paradigm shift (back to mostly software rendering) even if it were "barely" doable or would have triggered performance penalties, pain on the software side of things, etc.
 
Ken Kutaragi was "the" real geek, yes.

From a pure hardware perspective he IS the idol, with so much imagination to spare (and so many hours of software engineers' time wasted trying to deal with his architectures too).

What a man... Sadly, R&D budgeting has a limit. It generated variety, but great financial trouble too.

Nevertheless, I really miss him. THE geek.
 
Cell Cubed: 64 SPEs and 8 Power7 cores replacing the PPCs, and 64 MB of super-fast eDRAM. No GPU.

I was thinking something very, very similar. ;) I do think there would be dedicated texture units though, taking a cue from the limitations that Larrabee hit early on.

... and inefficient

Ehm, you can say whatever you like, but they weren't inefficient.
 