Predict: The Next Generation Console Tech

Status
Not open for further replies.
From my understanding, this may not be the greatest thing to brag about. >_> (That's all I'll say aboot that)

...

Precalculated is generally used in the context of offline computation e.g. baking lighting information into the map. This is opposed to real-time calculation.

That must have been the tech they used in the Zelda HD demo :rolleyes:;):LOL:

Because you need the extra processing power to process offline lighting for the RAM... wait, that doesn't make much sense...

He said the extra processing power would allow for precalculating data and increasing cache sizes.
 
Last edited by a moderator:
I was thinking more along the lines of procedural generation, like vegetation (SpeedTree or whatever Dunia used), terrain generation, etc.
 
It seems that DDR4 is a different way of dealing with memory access (correct me if I'm wrong):


" DDR4 also anticipates a change in topology. It discards dual and triple channel approaches (used since the original first generation DDR[28]) in favor of point-to-point where each channel in the memory controller is connected to a single module.[2][3] This mirrors the trend also seen in the earlier transition from PCI to PCI Express, where parallelism was moved from the interface to the controller,[3] and is likely to simplify timing in modern high-speed data buses.[3] Switched memory banks are also an anticipated option for servers.[2][3] "


Am I understanding right? They have 2 GByte (testing 16 Gb) DDR4 modules ( http://www.hynix.co.kr/gl/pr_room/news_data_readA.jsp?NEWS_DATE=2011-04-04:08:55:16 ) ?

" Three months later in April 2011, Hynix announced the production of 2 GB 2400 MHz clock speed DDR4 modules, also running at 1.2 V on a process between 30 and 39 nm (exact process unspecified),[8] adding that it anticipated commencing high volume production in the second half of 2012.[8] Semiconductor processes for DDR4 are expected to transition to sub-30 nm at some point between late 2012 and 2014.[2][23]
[edit] "

http://en.wikipedia.org/wiki/DDR4_SDRAM

What kind of raw bandwidth might a normally clocked DDR4 achieve on a 128-bit bus?

Am I correct in calculating that DDR4-2400 on a 128-bit bus would only be 38.4 GB/s?
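As a quick sanity check on that figure: peak bandwidth is just the transfer rate times the bus width in bytes. A minimal sketch (the function name is mine, not from any spec):

```python
def peak_bandwidth_gbps(transfers_mtps, bus_width_bits):
    """Peak bandwidth in GB/s: mega-transfers per second * bytes per transfer / 1000."""
    return transfers_mtps * (bus_width_bits / 8) / 1000

# DDR4-2400 moves 2400 mega-transfers/s; a 128-bit bus is 16 bytes wide.
print(peak_bandwidth_gbps(2400, 128))  # 38.4 GB/s
```

So yes, 38.4 GB/s is the correct peak figure for that configuration.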
 
Last edited by a moderator:
That must have been the tech they used in the Zelda HD demo :rolleyes:;):LOL:

The lighting was real-time. The whole point of the demo was to show the day/night lighting conditions as well as knocking over candles - dynamic lighting.
 
What kind of raw bandwidth might a normally clocked DDR4 achieve on a 128-bit bus?

Am I correct in calculating that DDR4-2400 on a 128-bit bus would only be 38.4 GB/s?



You're right. These links/sites show Hynix talking about 19.2 GB/s for 64-bit (dual-pumped?), but I'm praying here ( :oops: ) for Sony and MS to use a 256-bit* bus with 8 GB of unified system memory (and at least 32 ROPs on the GPU...): DDR4-3200 (the spec's maximum is an estimated 4.26 GHz) would give 102.4 GB/s, enough for 1080p/60Hz or 720p/3D.

* (since 2001, the Xbox has used unified memory: 200 MHz DDR, 128-bit, 6.4 GB/s)
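Applying the same peak-bandwidth arithmetic (transfer rate times bytes per transfer) to the configurations mentioned above, as a sanity check:

```python
def peak_gbps(mtps, bus_bits):
    """Peak bandwidth in GB/s = mega-transfers/s * bytes per transfer / 1000."""
    return mtps * (bus_bits / 8) / 1000

# Original Xbox (2001): 200 MHz DDR = 400 MT/s on a 128-bit bus
print(peak_gbps(400, 128))    # 6.4 GB/s
# Hynix's quoted DDR4-2400 figure on a single 64-bit module
print(peak_gbps(2400, 64))    # 19.2 GB/s
# Hoped-for DDR4-3200 on a 256-bit bus
print(peak_gbps(3200, 256))   # 102.4 GB/s
```

All three numbers in the posts above check out against this formula.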
 
Last edited by a moderator:
Isn't the specification of desktop memory (like DDR2/3) usually taking into account the signal loss due to cheap and varying motherboards, as well as due to using a modular system instead of a unified, soldered one?

GPU memory, on the other hand, sits on a board designed specifically for the GPU and the memory. There are specifications for GDDR, but even if the parts operate outside some specification it's OK, as the board either works and is sold, or doesn't work and isn't sold. There are no compatibility issues etc. after it goes into production.

From that point of view, I think the next-gen consoles will mostly have GDDR(5), and not more memory than is addressable by 32-bit CPUs (just like the X360 has GDDR3).

On the other hand, to compensate for this, I expect them to have some 8-16 GB SSD for caching, like the new Ivy Bridge motherboards will have, although not as a transparent disk cache but as something explicitly controlled by the game.
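An explicit, game-controlled SSD cache (as opposed to a transparent OS-level one) could look roughly like this. Everything here is a hypothetical sketch, the class and method names are invented for illustration:

```python
import os
import shutil

class GameSsdCache:
    """Hypothetical explicit cache: the game, not the OS, decides what lives on the SSD."""

    def __init__(self, ssd_dir, capacity_bytes):
        self.ssd_dir = ssd_dir
        self.capacity = capacity_bytes
        self.used = 0
        self.entries = {}  # asset name -> size, in insertion (eviction) order
        os.makedirs(ssd_dir, exist_ok=True)

    def pin(self, name, source_path):
        """Copy an asset from optical/HDD onto the SSD, evicting oldest assets if needed."""
        size = os.path.getsize(source_path)
        while self.used + size > self.capacity and self.entries:
            old_name, old_size = next(iter(self.entries.items()))
            os.remove(os.path.join(self.ssd_dir, old_name))
            del self.entries[old_name]
            self.used -= old_size
        shutil.copy(source_path, os.path.join(self.ssd_dir, name))
        self.entries[name] = size
        self.used += size

    def path_for(self, name):
        """Return the fast SSD path if cached, else None (caller falls back to optical)."""
        if name in self.entries:
            return os.path.join(self.ssd_dir, name)
        return None
```

The point of the sketch is just that the game engine owns the eviction policy, instead of an OS-level cache guessing what's hot.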

On a side note:
It's quite interesting to watch Nintendo's moves. They didn't do that badly with the N64 and GC, but realized that as long as they have to share a market, they will always have a limit on growth and will be fighting others (which reduces what you can win). So they went for the "small" casual-gamer market and grew it to 90 million, while Sony/MS took the big "core" market, selling 100+ million together.
Now MS and Sony are trying to fight for those 90 million by rolling out casual input devices, while Nintendo goes after the core market with a new console (although I expected that to happen at the same time Kinect/Move were released).
 
Isn't the specification of desktop memory (like DDR2/3) usually taking into account the signal loss due to cheap and varying motherboards, as well as due to using a modular system instead of a unified, soldered one?

Plenty of video cards used DDR2 or DDR3 memory due to cost.

On the other hand, to compensate for this, I expect them to have some 8-16 GB SSD for caching, like the new Ivy Bridge motherboards will have, although not as a transparent disk cache but as something explicitly controlled by the game.

I imagine it will be similar to how the original Xbox 360 20GB SKU operated. No installs, just title update partition, BC cache (well, who knows what they'll do for BC), game cache, and a partition for downloadable content & profiles.

I wonder if they (meaning MS/Sony) would skip out on including an HDD in the base SKU. At least compared to the 360 Core, they would have a fairly decent amount of solid-state memory to rely upon in every unit, and it might be something more akin to how PS3 operates with smaller installable caches as opposed to a full blown installation. Then they could still go with the profiteering HDD add-on or the higher end SKU, whilst cutting down on BOM.

From a loading perspective, it'd be better to stream from both the optical drive and the permanent storage anyway. That way the game isn't fighting over access to a single device, e.g. stream high-def video from disc while loading game data from the installed cache.
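The "don't fight over one device" idea boils down to a trivial request router; the device names and asset list below are made up purely for illustration:

```python
# Hypothetical streaming router: send each request to the device that holds
# the asset, so video streaming and game-data loads never share a seek head.
INSTALLED = {"textures.pak", "geometry.pak"}  # assets pinned to local storage

def route(asset):
    """Pick a source device: installed cache if present, else the optical disc."""
    if asset in INSTALLED:
        return "hdd_cache"
    return "optical"

requests = ["intro_video.bik", "textures.pak", "geometry.pak"]
print([(a, route(a)) for a in requests])
```

A real engine would of course queue and prioritize requests per device, but the win comes from the partition itself: the optical drive's laser never has to seek away from the video stream.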
 
It's been a while since your last post :)

On a different topic: now that the Larrabee project has stalled a bit (or at least is no longer targeted as a product in the near future), where would you put the odds of a many-core system? I guess Larrabee's failure is Intel's failure to enter an already consolidated market, so the reasons why you thought it was an option are still valid.
Nintendo passed (we at least know that for sure), and Sony's last declarations also make it clear that they won't follow that road; that leaves MS. While they are in the best position to deliver the matching software, their link to the PC market through DirectX makes me believe the odds are really low. I think it could only happen if IBM were interested in joining forces to develop a PowerPC A"x".
Intel seems out of the picture.
So are you still believing it can happen?


EDIT
About the HDD in MS's console: it looks like MS is negotiating deals with TV providers (possibly ISPs?) to sell the 360 as a TV set-top box. I can only see them getting more serious about that with their next system. Taking this into account, I can only see MS passing on an HDD if they sell HDDs through other means. Hearing about Live being integrated into Windows 8, MS's dream to "control your house", and how threatened they are by mobile devices for personal use, I wonder if they should start to consider the Xbox as the new PC in the personal realm. In your house the Xbox would be the server, with tablets, phones, and netbooks in a slave/master relationship with it. Honestly, my belief is that MS's only enemy is itself; no other actor could compete if MS acted properly. In the short term, my belief is that personal and enterprise products from MS (at least from a high-level PoV) should have nothing to do with each other. It means that MS has to sell not only OSes but physical products in the personal realm: an MS phone, an MS tablet, the MS Xbox... and assuming they manage to reach sound agreements with ISPs and TV providers, it means NAS and cloud storage too.
A home server is a great idea, but HTPCs won't get there any time soon. PCs are the wrong product for console-style usage, so a closed box is the right one.
That's my crazy take on a billion-dollar business, but if MS wants to push competitors out of the mobile market, and soon the netbook market, they have to push their own devices onto those markets, at first concurrently with their Windows products (which are going nowhere outside the enterprise world).
If MS makes a half-move, then I expect them to follow the same model as with the 360, but with cloud storage for paying users.
 
Last edited by a moderator:
I think in the interests of multiplatform (i.e. PC), they just won't need to go down the many-core route (unless you count the increasingly complex compute shading post-DX11). Even to this day, PC development hasn't really taken much advantage of 8 threads, which is entirely understandable if you look at the Steam hardware survey, for instance. I don't think the uptake will be that much higher by 2014 either.

That would be incentive enough to have similar numbers of more powerful cores instead of going for 16+ threads. I'd be interested in a developer survey of just what they do on Cell that isn't related to making up for RSX deficiencies and just how much processing time is taken. Cell can do everything, but ultimately, there are choices to be made regarding CPU/GPU budgets, and whether or not it'd be a waste to go with a strong CPU. Hypothetically, if RSX had been a more efficient design, what would Cell be doing now, and would it have been worth it to Sony to produce such a huge CPU over the years?

In the MS GPU job-posting, they do also mention physics, and I do believe there is work with AI on GPU already. Potentially they could just pour all the budget into the GPU compute resources (and by 2014, we should hear much more about DX12), which would give the hardware some legs 5+ years later.

At that point they could just have a relatively straight-forward CPU to handle the basics of the OS and render setup. I suppose there's audio processing as well, though even a dedicated CPU core should be more than enough I would think.. *shrug* :p
 
In one of the dev interviews about the Wii U, it was stated that devs familiar with the Wii API will be at home with the Wii U.
Just curious as to what this might say about the system's potential power.
1.negative impact not good
2. Neutral nothing nada zilch
3. Positive yahoo
4. Not sure yet.
 
Well, from what I have come to understand about the Wii's development suite, it's not a good thing to brag about. It otherwise doesn't say anything about hardware power.

Oh that's good. I wondered if the "custom" aspect of their API would have any impact on the choices they make when designing their "custom" chips.
 
It's been a while since your last post :)

On a different topic: now that the Larrabee project has stalled a bit (or at least is no longer targeted as a product in the near future), where would you put the odds of a many-core system? I guess Larrabee's failure is Intel's failure to enter an already consolidated market, so the reasons why you thought it was an option are still valid.

I think it may be a bit inappropriate to label Larrabee a failure. Intel has a yearly R&D budget of over 6 billion US dollars. It made sense for them to make a physical design and test the waters further than simulations alone would allow, and to be better prepared just in case a market materialized. They got some feedback, dipped their toes, no market materialized, the thing was put on ice, lessons learned duly noted.

You can look at present-day CPU/GPU architectures as a heterogeneous computing platform that aims to do reasonably well on single-threaded, lightly multithreaded, and heavily multithreaded workloads, with a bunch of hardware assist for graphics rendering. There is dedicated hardware helping each of these scenarios, rather than a single architecture/approach boosted sufficiently to tackle all problems. As long as this approach doesn't lead to very convoluted and error-prone software, it is efficient. And since it is the current approach, our current code base already supports the heterogeneous architecture, so it is very, very difficult (impossible) to design a unified processor that does better from an efficiency (performance per watt / performance per dollar) point of view. That isn't necessarily critical in all areas of computing, but if you're making $300 consumer devices to be sold in the many millions, efficiency matters. A lot.
 
I was also assuming that he was talking relative to the performance of existing implementations of Anvil on the other HD consoles, but his lines about the original Wii's API confuse the matter. So I have to agree with AlStrong for the moment that there is no clear indication yet that we're looking at an upgrade, even if it is incredibly hard to imagine there won't be one (but we should all know Nintendo by now, and wait for evidence).

Well, it appears to be a 4800-series GPU, DX10.1 even. So it will support HD beautifully, and the distinction between the Wii U and the Next-box and PS-whatever will be mightily blurred next generation. Hardware grunt is becoming less and less relevant; it's the whole ecosystem that can deliver the most entertainment value, anyplace and anytime, that will matter more. Looks like my prediction in 2008 of a portable Next-Wii came true, sort of. :LOL:


http://www.engadget.com/2011/06/14/wii-u-has-last-gen-radeon-inside-still-more-powerful-than-ps3-a/
 
Well, it appears to be a 4800-series GPU, DX10.1 even. So it will support HD beautifully, and the distinction between the Wii U and the Next-box and PS-whatever will be mightily blurred next generation. Hardware grunt is becoming less and less relevant; it's the whole ecosystem that can deliver the most entertainment value, anyplace and anytime, that will matter more. Looks like my prediction in 2008 of a portable Next-Wii came true, sort of. :LOL:


http://www.engadget.com/2011/06/14/wii-u-has-last-gen-radeon-inside-still-more-powerful-than-ps3-a/

You do realize that 10.1 and the 4000 series aren't exactly current hardware? Let alone 2012 and beyond hardware.
 
Another piece of rumour that might be interesting:

New Xbox will be at E3 2012

"The source expects Microsoft to unveil and subsequently launch its new machine ahead of Sony, .."

"Crytek is hoping that Timesplitters 4 will be seen as an early visual benchmark for the platform."
 