Predict: The Next Generation Console Tech

If those documents are real they look a lot like early pitch documents to me. In which case the final product could be very different.
Nukezilla even describes them as early planning docs. We know the project codename has changed at least! I think they just show what some of the discussion was early on (that or they're complete fakes), but there's nothing there to let us know what the final console will actually be.
 
WTF are you talking about? IPC is not a measure that can be used across instruction sets. A single x86 instruction is a lot more powerful than a single ARM instruction. (x86 can do a load and an ALU op in a single instruction. ARM can do a shift and an ALU op, but that's generally less useful. Also, as I understand it, those get cracked into two ops on the Cortex-A15 and take two cycles.)

Also, you cannot even use IPC as a measure on a single instruction set without defining a specific program you are using. IPC depends massively on the code you run.

Also, no x86 CPU out there gets >3 IPC on generic code. Bulldozer is limited to <2 by the frontend when all threads are in use. This isn't that much of a limitation, because the very best CPUs out there (Sandy Bridge, Ivy Bridge) can barely break 2 on typical code.
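To make the point concrete, here's a minimal sketch (the counter values are made up for illustration, not real measurements; on Linux you'd read them from `perf stat`) showing that IPC is just retired instructions divided by cycles for one specific run, so the same CPU gives wildly different numbers on different code:

```python
def ipc(instructions_retired: int, cycles: int) -> float:
    """Average instructions per clock for one particular program run."""
    return instructions_retired / cycles

# Same hypothetical CPU, same instruction count, two different workloads:
cache_friendly = ipc(instructions_retired=8_000_000_000, cycles=4_000_000_000)
pointer_chasing = ipc(instructions_retired=8_000_000_000, cycles=20_000_000_000)

print(f"cache-friendly loop: {cache_friendly:.2f} IPC")   # 2.00
print(f"pointer chasing:     {pointer_chasing:.2f} IPC")  # 0.40
```

Same instruction count, a 5x IPC gap, which is why quoting "the IPC" of a CPU without naming the program is meaningless, and comparing across ISAs is worse still.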

I'm talking about Millions of Instructions Per Second in a Dhrystone benchmark, divided by the clock speed, giving you an average number of instructions executed per clock. Yeah, there's all kinds of special cases, all kinds of exclusions, and all kinds of nitpicks. I'm just talking about the general case.
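For reference, Dhrystone results are usually normalized as DMIPS against the VAX 11/780 baseline (1757 Dhrystones/second = 1 DMIPS). A quick sketch of the arithmetic being described, with made-up example numbers:

```python
# Dhrystone score -> DMIPS -> DMIPS per MHz ("per clock").
# 1 DMIPS = 1757 Dhrystones/second (the VAX 11/780 baseline).
VAX_DHRYSTONES_PER_SEC = 1757

def dmips(dhrystones_per_sec: float) -> float:
    return dhrystones_per_sec / VAX_DHRYSTONES_PER_SEC

def dmips_per_mhz(dhrystones_per_sec: float, clock_mhz: float) -> float:
    return dmips(dhrystones_per_sec) / clock_mhz

# Hypothetical CPU: 2.4 million Dhrystones/second at 1 GHz
score = dmips(2_400_000)                     # ~1366 DMIPS
per_clock = dmips_per_mhz(2_400_000, 1000)   # ~1.37 DMIPS/MHz
print(f"{score:.0f} DMIPS, {per_clock:.2f} DMIPS/MHz")
```

Note this counts normalized "Dhrystone work" per clock, not machine instructions retired per clock, so it isn't true IPC.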
 
From second hand information on Gaf (BGAssassin), PS4 is supposedly

4 core AMD CPU
2GB GDDR5
18 CU (1152 SPU) Sea Land GPU at 800 MHz
DSPs
 
From second hand information on Gaf (BGAssassin), PS4 is supposedly

4 core AMD CPU
2GB GDDR5
18 CU (1152 SPU) Sea Land GPU at 800 MHz
DSPs

Why DSPs? And why no frequency figure for the CPU? A 4-core AMD CPU at 2 GHz and one at 3.2 GHz don't have the same performance.
 
link for above

http://www.gameinformer.com/b/news/...0-million-into-next-gen-game-development.aspx

What would this tell us about any timeline? Late '13 still? I can't decide if 80m is a lot of money in game development terms or not. Sure it would be for one game, but how many games must EA spread it across?

I assume it refers to the April 2012-March 2013 fiscal year as well?

Overall yeah I think this points at late 13 again.

Er... EA investing in software, news at 11.

The more interesting part to me is "very impressed".
 
I can't decide if 80m is a lot of money in game development terms or not.
Not. Unless they have way better tools than now, that'll be a few AAA titles of around current standards.

Note the source: https://twitter.com/#!/jimreilly/status/199592143185129474
In fiscal 13, EA will invest $80 million in development of games for Gen4 console systems.
So 80 million in FY 2013 (ending Apr 2014, right?). Perhaps a few launch titles? Seems pretty cold, IMO. It does give a launch schedule, though. Sounds like very little investment, meaning not many customers are expected to be buying next-gen games up until the beginning of 2014, meaning sales only up to a year in advance at best. There won't be many XB3s or PS4s in houses by the end of that 80 million investment, unless EA are seriously lowballing it.

Edit: The other tweet...

EA on $80 million investment: "We are very impressed and have decided to invest in the next generation of consoles." Ok, cool!

Ummm, "we have decided to invest in next gen consoles." Really?!?! Like they were mulling it over and were thinking of maybe giving this next gen a miss?
 
Is/has memory always really been the limiting factor on consoles?

I know that lack of memory has been hurting PS3/360 for a while now but did PS2/Xbox suffer a similar problem?
 
If there's any truth to the always-on/always-connected/set-top-box functionality, that could be handled by a low-power ARM processor while the rest of the system is powered down.

This is consistent with older rumors that MS would offer a media-only, set-top-box computer; they could do this by omitting the big gaming GPU and x86 cores.

We do have another example of combining ARM with another architecture: both Wiis do it (an ARM core sits alongside the PowerPC CPU, handling I/O and security).


Why DSPs? And why no frequency figure for the CPU? A 4-core AMD CPU at 2 GHz and one at 3.2 GHz don't have the same performance.

Almost every system includes DSPs; at the very least you have an H.264 decoder. I have one in an 8400GS and one in a VIA chipset, though I never use them.
You can have something else as well; cell phone makers call them "image processing units" (and there must be one for the radio), and Intel calls theirs "Quick Sync".
 
I'm talking about Millions of Instructions Per Second in a Dhrystone benchmark, divided by the clock speed, giving you an average number of instructions executed per clock. Yeah, there's all kinds of special cases, all kinds of exclusions, and all kinds of nitpicks. I'm just talking about the general case.

Dhrystone is not IPC. If you mean DMIPS/clock, say DMIPS/clock. A Dhrystone op is not a CPU op -- the Dhrystone benchmark is run through a compiler, which can eliminate quite a bit of the work. A CPU that can never execute more than one op per clock (such as some old ARM11 CPUs) can still get DMIPS/MHz scores well past 1.

Not only that, Dhrystone is not even relevant to IPC at all any more. Its hot data and code set fits entirely in the L1 cache of nearly every 32-bit CPU on the market. This means that the memory subsystem, which today is the most important factor in determining IPC, is not tested at all. Also, its tendency to do an unrealistic number of irregular, unpredictable jumps means that CPUs with bad branch predictors but shorter pipelines get a huge boost.

Dhrystone has not been an accurate benchmark for some 20 years. The only people who use it anymore are the ones who have no clue at all, and the ones whose products it gives really high scores. These are not special cases, exclusions, or nitpicks. Dhrystone does not measure IPC. As it puts no strain on the memory subsystem, it does not give scores that are even remotely relevant to the modern world. It is simply not a usable benchmark.

If you want a single benchmark that scores CPUs on realistic workloads, use SPECint.
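A back-of-the-envelope model (the standard CPI-plus-stall-cycles formula, with illustrative numbers of my own, not measurements of any real CPU) of why the memory subsystem dominates IPC on real code but is invisible to an L1-resident benchmark like Dhrystone:

```python
# Effective CPI = base CPI + (cache misses per instruction * miss penalty).
# Effective IPC is its reciprocal. All numbers are illustrative assumptions.

def effective_ipc(base_cpi: float, misses_per_instr: float,
                  miss_penalty_cycles: float) -> float:
    cpi = base_cpi + misses_per_instr * miss_penalty_cycles
    return 1.0 / cpi

# Dhrystone-like code: everything lives in L1, essentially no misses.
l1_resident = effective_ipc(base_cpi=0.5, misses_per_instr=0.0,
                            miss_penalty_cycles=200)

# More realistic code: 1% of instructions miss to DRAM (~200-cycle penalty).
memory_bound = effective_ipc(base_cpi=0.5, misses_per_instr=0.01,
                             miss_penalty_cycles=200)

print(f"L1-resident:     {l1_resident:.2f} IPC")   # 2.00
print(f"1% DRAM misses:  {memory_bound:.2f} IPC")  # 0.40
```

Same core, same base pipeline, a 5x gap purely from memory behavior that Dhrystone never exercises.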
 
Why DSPs? And why no frequency figure for the CPU? A 4-core AMD CPU at 2 GHz and one at 3.2 GHz don't have the same performance.

Because it is most likely fake?

Modern CPUs with SIMD units are very effective at traditional DSP tasks. The 360 does all audio processing on the CPUs. On PCs, the dedicated sound card is a dying breed, because the CPU can do all the processing with just 2-5% of a single core, and all you need is a DAC (and you don't even need that for digital out).
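Rough arithmetic (my own back-of-the-envelope figures, not from the post) behind the "few percent of a core" claim for software audio mixing:

```python
# Cost of mixing N voices of 48 kHz stereo audio in software:
# one multiply-accumulate (MAC) per voice, per sample, per channel.
# All figures below are illustrative assumptions.

SAMPLE_RATE = 48_000
CHANNELS = 2
VOICES = 64

macs_per_second = SAMPLE_RATE * CHANNELS * VOICES   # 6.144 million MAC/s

# A hypothetical 3.2 GHz core with 4-wide SIMD, one MAC per lane per cycle:
core_macs_per_second = 3_200_000_000 * 4            # 12.8 billion MAC/s

fraction = macs_per_second / core_macs_per_second
print(f"mixing load: {fraction:.4%} of one core")   # ~0.05%
```

Plain mixing is nearly free; resampling, filtering, and effects per voice are where a realistic 2-5% of a core goes.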


Cheers
 
I did buy a sound card for the DAC (116 dB signal-to-noise ratio, woot!); it's real hi-fi. Another reason would be the software drivers (if you're on Windows): you want good software and hardware buffering so you get both low latency and no crackles, etc.
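The buffering trade-off mentioned above is simple arithmetic: latency is buffer size divided by sample rate (the example figures below are mine, not from the post):

```python
# Audio output latency contributed by one buffer of audio frames.
def buffer_latency_ms(frames: int, sample_rate: int) -> float:
    return frames / sample_rate * 1000

print(buffer_latency_ms(256, 48_000))   # ~5.33 ms: low latency
print(buffer_latency_ms(2048, 48_000))  # ~42.7 ms: audible lag, but safe
```

Smaller buffers cut latency but risk underruns (audible crackling) if the CPU can't refill them in time, which is exactly what good drivers have to balance.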

I'd say sound cards aren't dying at all, especially since they have gotten very high quality, so a cheap one beats an old $2000 CD audio player.

But I agree DSPs aren't really needed, except that you still get them for encoding/decoding tasks. If you want to compress framebuffer output to send to a Wii U tablet or a PS Vita, you may as well use a DSP.
 
Dhrystone has not been an accurate benchmark for some 20 years. The only people who use it anymore are the ones who have no clue at all, and the ones whose products it gives really high scores.

I'd go a step further and say: it is used by people with something to hide. Same goes for CoreMark.

Cell phones now have 1 GB of RAM, and CPU2006 has a maximum working set of 900 MB. There is no excuse for not doing proper benchmarking.

The gcc subtest in CPU2006 is a good indicator of integer performance, since it hasn't been broken by "magic" compilers.

Cheers
 
I'd say sound cards aren't dying at all, especially since they have gotten very high quality, so a cheap one beats an old $2000 CD audio player.

How many Sound Blasters did Creative sell last year? How many did they sell 12 years ago?

As a mass-market item, they are not dying; they are already dead.

Cheers
 
Creative are responsible for killing the DSP on sound cards :)
Because of their greed, EAX went proprietary after gaining support in games; then the competing standard died, and Creative was the only brand that enabled the cool effects in games (or Sound Blaster emulation in DOS, for that matter).

Then Microsoft pulled the plug with Vista, when they decided they would only support an API that works everywhere. I've never played a game with EAX, by the way.
Without Creative, we might have had DSPs on on-board sound cards :). (Well, we do sometimes have on-board "X-Fi", and they have licensed "X-Fi" for some time, but it's pointless now. Even more pointless is buying a 300-euro motherboard rather than a 100-euro motherboard plus a 70-euro sound card.)
 