Xbox One (Durango) Technical hardware investigation

GameInformer:

CPU:

x64 Architecture
8 CPU cores running at 1.6 gigahertz (GHz)
each CPU thread has its own 32 KB L1 instruction cache and 32 KB L1 data cache
each module of four CPU cores has a 2 MB L2 cache resulting in a total of 4 MB of L2 cache
each core has one fully independent hardware thread with no shared execution resources
each hardware thread can issue two instructions per clock
GPU:

custom D3D11.1 class 800-MHz graphics processor
12 shader cores providing a total of 768 threads
each thread can perform one scalar multiplication and addition operation (MADD) per clock cycle
at peak performance, the GPU can effectively issue 1.2 trillion floating-point operations per second
High-fidelity Natural User Interface (NUI) sensor is always present
Storage and Memory:

8 gigabytes (GB) of DDR3 RAM (68 GB/s)
32 MB of fast embedded SRAM (ESRAM) (102 GB/s)
from the GPU's perspective, the bandwidths of system memory and ESRAM are parallel, providing a combined peak bandwidth of 170 GB/s
Hard drive is always present
50 GB 6x Blu-ray Disc drive
Networking:

Gigabit Ethernet
Wi-Fi and Wi-Fi Direct
Hardware Accelerators:

Move engines
Image, video, and audio codecs
Kinect multichannel echo cancellation (MEC) hardware
Cryptography engines for encryption and decryption, and hashing

http://www.gameinformer.com/b/news/archive/2013/01/21/rumor-durango-hardware-specs-revealed.aspx
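For what it's worth, the headline numbers in that list are consistent with each other. A quick back-of-envelope check (plain Python; the only derived figure is the per-core thread count, 768/12 = 64, everything else is straight from the list):

Code:
# Sanity check of the leaked Durango figures quoted above.

GPU_CLOCK_HZ   = 800e6      # 800 MHz graphics processor
GPU_THREADS    = 768        # 12 shader cores x 64 threads each
FLOPS_PER_MADD = 2          # one multiply plus one add per clock

peak_flops = GPU_THREADS * FLOPS_PER_MADD * GPU_CLOCK_HZ
print(f"GPU peak: {peak_flops / 1e12:.2f} TFLOPS")  # ~1.23, i.e. "1.2 trillion"

DDR3_BW_GBS  = 68           # main memory, DDR3
ESRAM_BW_GBS = 102          # 32 MB embedded SRAM
print(f"Combined peak: {DDR3_BW_GBS + ESRAM_BW_GBS} GB/s")  # 170 GB/s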
 
What would you do with a raytracer in a console? Somehow fit raytraced parts into an otherwise rasterized game? Sounds ridiculous at best, IMO.
 


A site has released what it says is a look at the technical specifications behind Microsoft's Xbox 360 successor, which is going by the code name Durango. If they're to be believed, the next generation will provide more than just an incremental upgrade.

Site vgleaks.com has posted the specs, including the system diagram pictured above.

You quote GameInformer, and GameInformer quotes vgleaks. So that's vgleaks quoting vgleaks.

What's your point?
 
An HDD being standard isn't a surprise; downloads are such a big deal next gen that it was sure to have one (the Wii U not having one is the surprise for me). The big thing is that it will be a very big HDD compared to the 20 GB of last gen, which was too small for game installs. Hopefully it allows full installs on an as-needed basis.

It's not that an HDD as standard is a surprise; it's more that I've been unreasonably worried that next-gen consoles wouldn't have something to mitigate the low bandwidth and high latency of optical storage. I know I was keen on the idea of flash distribution (due to performance, base system cost, size, and basically nostalgia for carts!), but optical + HDD is good too from an end-user POV.

But this is only half the good news - we need the PS4 to have an HDD too, so that multiplats can all be made from the ground up for fast, high-bandwidth streaming. If both the PS4 and Xbox 3 deliver on this, then the worst hardware issue this gen could have faced would be avoided.

And lol @ the slow as hell (15 ~ 30 MB/s) 20GB 360 drives. :D

And oh boy, if both the PS4 and X3 come with HDDs, then the Wii U is even more boned (as if it wasn't already) on AAA multiplatform ports!
 
What would you do with a raytracer in a console? Somehow fit raytraced parts into an otherwise rasterized game? Sounds ridiculous at best, IMO.

Well, Mental Ray, for example, has been using a rasterizer and a raytracer together for a while, although it is an offline renderer and not a video game...

I think one possible approach is to run a 'ray server' that gets called by the rasterizer whenever there's reflection/refraction/shadowing going on. I'm not that familiar with actual renderer architectures though, sorry, so I can't get into any details.
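Purely as an illustration of that idea, here's a toy sketch. Every name in it is made up for the example (it's not any real renderer's API): rasterize primary visibility, then call out to the 'ray server' only for the pixels that need secondary rays.

Code:
# Toy sketch of a hybrid pipeline: rasterize primary visibility,
# then hand reflection work to a "ray server". All hypothetical.

from dataclasses import dataclass

@dataclass
class Hit:
    position: tuple
    normal: tuple
    material: str

def rasterize(scene):
    """Primary visibility pass: one Hit per pixel (stubbed out)."""
    return [Hit((0.0, 0.0, 0.0), (0.0, 1.0, 0.0), "mirror")]

def trace_ray(scene, origin, direction):
    """The 'ray server': trace one secondary ray, return a color (stubbed)."""
    return (0.2, 0.2, 0.2)

def shade(scene, hit):
    color = (0.5, 0.5, 0.5)          # base color from the raster pass
    if hit.material == "mirror":     # only these pixels pay for ray traversal
        reflected = trace_ray(scene, hit.position, hit.normal)
        color = tuple(0.5 * c + 0.5 * r for c, r in zip(color, reflected))
    return color

frame = [shade(None, hit) for hit in rasterize(None)]
print(frame)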
 
proelite on GAF seems to know about these DMA engines, and he's throwing around that the alpha kits had over a 2.5 TF GPU just to emulate the final system performance...

for whatever that's worth... likely not much.

Quote:
Once again, there was a big gulf on paper between the alpha kits and beta kits for Durango. The alpha kits were needed to emulate the final performance of the combined silicon.

Alpha kits were:

8-core, 16-thread Intel CPUs, probably running at 1.6 GHz
12 GB of RAM
High-end AMD 7000-series >2.5 teraflop GPU

starting to get a little unbelievable...

This isn't true; the alpha kits had 16 cores (the extra 8 were there to emulate some of the custom blocks) and a 1.2 TF GPU.
 
And oh boy, if both the PS4 and X3 come with HDDs, then the Wii U is even more boned (as if it wasn't already) on AAA multiplatform ports!

I don't think anything would be ported to the Wii U after 2013, so the lack of an HDD doesn't matter much...
 
Not sure where you're getting that from. He never said 1.2 TF wasn't true. I believe he said things along the lines of fearing it was true and wishing it wasn't. He also reiterated many times that a raw TF comparison was wrong and that it was more efficient. None of that has changed.

Umm, I thought aegies said the 1.2 TF GPU was 'impossible'.

Quite frankly, he's out of his depth - like an IGN journalist trying to write tech pieces.

He's also given us statements like the SoC is 450 mm^2 and the GPU is as powerful as a 680 in real world performance!
 
A cynic could say that all these "blocks" and "data engines" are nothing more than damage control from MS after the disappointment over the specs...
 
How can you be so sure? Do you work on one of those?

Well, for one bgassassin said the GPU in the kits hadn't changed FLOPs from alpha to beta.

This is borne out by Dec 11 pastebin also having 1.2TF for the kits.

The 16 cores in the devkits is where the 16 core CPU rumours came from.

But yes, I also have other sources :p
 
I agree; anyone trying to think of a console like a PC in terms of extracting performance from raw numbers is being silly. The Xbox 360 has a GPU GFLOPS rating worse than a GeForce GT 620, and I really, really doubt any PC game on that card would look anywhere near as good as a 360 game.

I have a GT 620M in my laptop. I have tested 8-10 of the most graphically intensive multiplatform games (including BF3 and Crysis 2). All of them ran better than they do on my X360 (Batman: Arkham City even did 1080p at 30 FPS), which makes me wonder whether this talk about PCs needing much faster GPUs to do the same stuff is real (this card delivers 240 GFLOPS, the same as Xenos), and it kinda proves the point that GPU FLOPS are not a perfect metric for gaming performance without taking other factors into account.
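For reference, both chips do come out at 240 GFLOPS under their commonly quoted specs. The unit counts and clocks below are my assumptions, not from the post:

Code:
# Peak-FLOPS derivations using commonly quoted (assumed) specs.

# Xenos (Xbox 360 GPU): 48 unified ALUs, each issuing a vec4+scalar
# MAD per clock (5 MADs = 10 FLOPs), at 500 MHz.
xenos_gflops = 48 * 5 * 2 * 500e6 / 1e9

# GeForce GT 620M: 96 CUDA cores at a ~1250 MHz shader clock,
# one FMA (2 FLOPs) per core per clock.
gt620m_gflops = 96 * 2 * 1250e6 / 1e9

print(f"Xenos:   {xenos_gflops:.0f} GFLOPS")   # 240
print(f"GT 620M: {gt620m_gflops:.0f} GFLOPS")  # 240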
 
I don't think anything would be ported to the Wii U after 2013, so the lack of an HDD doesn't matter much...

It would probably be more of a dream crusher than a real problem then. :eek:

A cynic could say that all these "blocks" and "data engines" are nothing more than damage control from MS after the disappointment over the specs...

Because engineering entirely new, completely custom functionality is easier than adding more compute units to an existing scalable design?
 
Well, for one bgassassin said the GPU in the kits hadn't changed FLOPs from alpha to beta.

This is borne out by Dec 11 pastebin also having 1.2TF for the kits.

The 16 cores in the devkits is where the 16 core CPU rumours came from.

But yes, I also have other sources :p

You're off on the 1.2 TF for the kits. The Dec 11 pastebin was for the target specs.
 
NB is almost certainly on the same chip as the CPU and GPU. This be a SoC!

Yeah, that's what I figured. So... what could the bandwidth between the ESRAM and the CPU possibly be? I mean, the ESRAM is connected to the north bridge, but it's also embedded in / directly connected to the GPU (I guess?), while the actual north bridge is integrated into the CPU. And the ESRAM has direct connections to both the GPU and the north bridge, according to the diagram.

I'm over-analyzing it; while some details might be correct, that diagram sure is shit.
 
Well, Sony for sure had plans A, B, and C. I am sure people at Sony knew about these data movers.

If the "data movers" are like DMA engines that move and consolidate data while the CPU or GPU is processing the previous block, so that main memory latency is hidden, then yeah, I think Sony already knows about them, because they designed exactly that into the Cell processor in 2005.
 