Predict: The Next Generation Console Tech

Somebody here heard rumors of MS working with IBM ;)

They worked with them on the Xbox 360, so why wouldn't they do it again?

I think at the time of the 360, the 3-core Waternoose was both more powerful and smaller than a similarly specced Athlon 64 X2 chip.
 
Carmack said it was about 50% of the single-thread performance of a PC processor (Pentium 4, Athlon 64). It might get better results if code uses the vector units more, but I think an Athlon X2 would be a good match and would often surpass it.

Now, what will IBM give? A POWER7 derivative?
 
I have a question and I apologize in advance for my lack of knowledge. It's my understanding that AMD will be using Z-RAM in the future. Could this replace the eDRAM that the Xbox 360 uses (in a future Xbox successor, of course) and offer more speed and larger amounts of RAM on the chip?


Z-RAM, I think, requires an SOI process. In theory, I think they could use Z-RAM in the GPU without any issue.


The good news is it seems engineers are coming up with ways to make eDRAM clock higher. I won't be surprised if the next Xbox console has enough eDRAM to handle 1080p with 4x AA.
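For a rough sense of what "enough eDRAM for 1080p with 4x AA" means, here's a back-of-envelope Python sketch. It assumes a 360-style layout of 4 bytes of colour plus 4 bytes of Z/stencil per sample and no compression, which is a simplification:

# Rough framebuffer footprint: colour + Z/stencil per sample, no compression.
def framebuffer_mb(width, height, msaa=1, bytes_per_sample=8):
    return width * height * msaa * bytes_per_sample / (1024 * 1024)

for w, h, aa in [(1280, 720, 1), (1280, 720, 4), (1920, 1080, 1), (1920, 1080, 4)]:
    print(f"{w}x{h} at {aa}x MSAA: {framebuffer_mb(w, h, aa):.1f} MB")

# 1280x720  at 1x MSAA:  7.0 MB  (fits the 360's 10 MB eDRAM)
# 1280x720  at 4x MSAA: 28.1 MB  (hence tiling on the 360)
# 1920x1080 at 1x MSAA: 15.8 MB
# 1920x1080 at 4x MSAA: 63.3 MB

By that crude measure, 1080p with 4x MSAA is over six times the 360's 10 MB, which is why the cost question keeps coming up in the posts below.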
 
That would be very expensive. One could wonder, shouldn't they concentrate on offering 720p with high IQ (4x AA, 16x AF and a nice framerate)? Most people who buy HDTVs get 720p displays, and even typical projectors are 720p. It could look good enough upscaled, or a simple technique could be used where rendering is done at 720p and upscaled, then fonts and HUD elements are added at 1080p resolution.
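Just to illustrate the compositing order being described (render the 3D scene at 720p, upscale it, then draw fonts and HUD at native 1080p), here is a throwaway Python/Pillow sketch; the HUD contents and colours are made up, and it obviously isn't how a console pipeline would do it, only the ordering matters:

from PIL import Image, ImageDraw

# Stand-in for a 720p scene render (a flat colour instead of a real 3D frame).
scene_720p = Image.new("RGB", (1280, 720), (30, 60, 90))

# Upscale the 3D frame to the display resolution first...
frame_1080p = scene_720p.resize((1920, 1080), Image.BILINEAR)

# ...then draw HUD/text on top at native 1080p so it stays sharp.
hud = ImageDraw.Draw(frame_1080p)
hud.rectangle((40, 980, 440, 1040), outline=(255, 255, 255), width=3)
hud.text((60, 1000), "AMMO 42 / 120", fill=(255, 255, 255))

frame_1080p.save("composited_frame.png")

The point is simply that the softness of the upscale only touches the scene, while the UI stays pixel-sharp at the panel's resolution.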
 
That would be very expensive. One could wonder, shouldn't they concentrate on offering 720p with high IQ (4x AA, 16x AF and a nice framerate)? Most people who buy HDTVs get 720p displays, and even typical projectors are 720p. It could look good enough upscaled, or a simple technique could be used where rendering is done at 720p and upscaled, then fonts and HUD elements are added at 1080p resolution.
I completely agree. Anyway, manufacturers are likely to come up with their marketing bullet points (1080p, AA xX, AF xX), but I hope they won't force publishers' hands too much.

In fact, 1366x768 (the most common real-world resolution) with 4x AA and 8x AF should be the absolute prerequisite.
(Obviously a dedicated, competitive scaler would be welcome, as sharpening effects yield better and better results even when starting from a 576p image.)
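Purely in raw pixel counts (ignoring everything else that scales with resolution), the gap between those targets looks roughly like this, as a quick Python tally:

# Raw pixel counts; shading, fill rate and framebuffer cost all scale roughly with these.
targets = {
    "576p (720x576)": 720 * 576,
    "720p (1280x720)": 1280 * 720,
    "768p (1366x768)": 1366 * 768,
    "1080p (1920x1080)": 1920 * 1080,
}
base = targets["720p (1280x720)"]
for name, pixels in targets.items():
    print(f"{name}: {pixels / 1e6:.2f} Mpixels ({pixels / base:.2f}x the 720p load)")

# 576p:  0.41 Mpixels (0.45x)
# 720p:  0.92 Mpixels (1.00x)
# 768p:  1.05 Mpixels (1.14x)
# 1080p: 2.07 Mpixels (2.25x)

So 1366x768 is only about 14% more pixels than 720p, while 1080p is 2.25x the load.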
 
That would be very expensive. One could wonder, shouldn't they concentrate on offering 720p with high IQ (4x AA, 16x AF and a nice framerate)? Most people who buy HDTVs get 720p displays, and even typical projectors are 720p. It could look good enough upscaled, or a simple technique could be used where rendering is done at 720p and upscaled, then fonts and HUD elements are added at 1080p resolution.

Is it really true most people buy 720p?

At least in the US, 1080p models are the ones advertised more heavily week after week at all the chain stores.

There are certainly a number of 720p models offered, and with lower prices they should be affordable to a larger potential installed base.

But even 1080p models are around $1000 now. Maybe in 2-3 years, even the lower price bands will have 1080p models.
 
There's a way, but it depends on the display. It may or may not accept a 1360x768 signal over VGA and/or DVI (six vertical lines discarded). Sometimes you have to tweak the VGA signal's parameters. That's reasonable to do on a PC, but a console vendor most likely wouldn't bother supporting it.
 
Do 768p panels actually accept 768p input? I don't think they do, so there's no way to produce and display a native-resolution image on these '720p' sets with 768p-native panels.
I never thought of that; you may be right. In that case I would go with 720p, like Blaskowicz.
It would save computing power, the frame buffer would have a smaller memory footprint, and it would save bandwidth.
 
Do 768p panels actually accept 768p input? I don't think they do, so there's no way to produce and display a native-resolution image on these '720p' sets with 768p-native panels.

Perhaps not entirely relevant, but the UK hi-fi firm Arcam used to (I guess they still do) make DVD players that would scale to 768p. I never read any tests on what happened when they tried them with 768p displays, though...
 
Will eDRAM really be so cost-prohibitive for 1080p in 2011-2012, when the 360 was able to include enough eDRAM to support 720p in 2005-2006?
 
Do 768p panels actually accept 768p input? I don't think they do, so there's no way to produce and display a native-resolution image on these '720p' sets with 768p-native panels.

It would have to be through DVI or VGA, or a special HDMI port, i.e. PC support. In the last case, I don't think it's a very common feature compared to having DVI or VGA on 'older' sets. My plasma panel can receive PC resolutions up to 1360x768 through the specially marked HDMI port (and even then, I lose 5 columns of pixels, as the panel is 1365 wide...). Other panels might be different - those that are actually 1360 or 1366 wide (man, I hate how the industry can't even decide on something like this).
 
During a presentation named “The Future of Gaming Graphics” at the GC developers conference, Cevat Yerli (Crytek) briefly mentioned the next generation of consoles.

CY: “The PlayStation 4 and Xbox ‘720′ will arrive in 2011 or 2012, we think; but this is just our estimate…we don’t know, and even if Microsoft and Sony told us, I couldn’t say because it would be under NDA. But we think in three to four years’ time, although there are good reasons why it should be 2010 already…but we’ll see.”
 
2011 or 2012 seems like a good estimate, the former for MS and the latter for Sony and Nintendo. But I don't necessarily agree with the notion that they should be out in 2010; 4-5 years isn't much of a lifespan for consoles, particularly the ones that have cost consumers the most.

We don't need another Xbox 1 "tragedy", although some people will swallow it.
 
http://arstechnica.com/articles/paedia/idf-gelsinger-interview.ars
There is an interesting part about Intel trying to force Sony's hand.
If Sony adopts Larrabee as the GPU for its next-gen console, they'd have to get rid of Cell; it would be embarrassing to have developers not using the CPU because the 'GPU' is way easier to program for (no local store, better ISA, better tools, etc.) ;)

Better to replace it with a low-cost OOOE x86 processor, or (crazy idea...) another Larrabee-based chip, though we don't know if the Larrabee architecture supports multiple chips.
 
nAo said:
If Sony adopts Larrabee as the GPU for its next-gen console, they'd have to get rid of Cell; it would be embarrassing to have developers not using the CPU because the 'GPU' is way easier to program for (no local store, better ISA, better tools, etc.) ;)
That's assuming Cell doesn't evolve either, though.
Thing is, with a GPU that programmable, do we really want another discrete CPU with a different architecture, even if it's easy to work with?
 