Predict: The Next Generation Console Tech

If one manufacturer is about to use Larrabee in its next system, I would not say that they will have to start from scratch. I don't think we can compare Larrabee to Cell in regard to tools/environment.
Intel won't come empty-handed in regard to software.
The next systems' release date is at least two years from now, and Intel is already actively working on various tools and libraries; there is no way Larrabee will arrive as naked in regard to software as Cell was at launch.
In fact I wouldn't be surprised if the tools and libraries available for Larrabee by 2012 are at least as good as (my bet would be better than) their Cell counterparts.
Intel can offer the best compiler and profiling tools in the business, Threading Building Blocks, optimized libraries, and Havok (likely in a very well optimized rendition).
It won't be the case that devs have to come up with their own solutions for spreading their work across so many cores: both Intel and Microsoft agree that the approach that offers the best scaling is task-based parallelism (finer-grained than threads) with a work-stealing scheduler (which I read is easier to implement efficiently on a system with caches).
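To make the distinction concrete, here is a minimal sketch of task-based decomposition using Intel's Threading Building Blocks; the scheduler's worker threads do the work stealing behind the scenes. The update_entity function and the entity list are placeholders for illustration, not anything from a real engine.

```cpp
#include <tbb/task_group.h>
#include <cstdio>
#include <vector>

// Placeholder for per-object work (animation, AI, physics, ...).
void update_entity(int id) { std::printf("updated entity %d\n", id); }

// Spawn one fine-grained task per entity instead of hand-partitioning the
// frame across threads; TBB's scheduler lets idle worker threads steal
// queued tasks from busy ones, which is where the scaling comes from.
void update_frame(const std::vector<int>& entities)
{
    tbb::task_group tasks;
    for (int id : entities)
        tasks.run([id] { update_entity(id); });
    tasks.wait();  // block until every spawned task has completed
}

int main()
{
    std::vector<int> entities(64);
    for (int i = 0; i < 64; ++i) entities[i] = i;
    update_frame(entities);
    return 0;
}
```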

And Larrabee will have a graphics ISA; it's not like everything will have to be programmed by hand. I can't see why Larrabee's ISA would be any more bothersome than the low-level ISAs used on the PS3/360, and since it's software you can make your own changes, but that isn't mandatory.

I am sure there will be tools, but my point was that Larrabee's strength will only shine when one takes advantage of it by using a different model than DirectX 11 implemented in software. If devs are just going to run DX11 on it, don't bother using Larrabee.
 
Does the next generation need to stretch? I would think that with the 1080p/60Hz wall, there just wouldn't be the need for as big a technical leap as the HD consoles made this generation.

This.

I don't see the need for home consoles to go beyond 1080p, and 60fps is adequate for nearly all types of games. TVs aren't going to exceed 1080p for a while yet, and it would be suicide for Sony/MS to chase that market.
 
Second part of the Watch Impress piece is up.

It allegedly claims Sony is faced with a choice between continuing with the Cell architecture/Nvidia GPU or going Larrabee. But supposedly if they use Larrabee, only Larrabee 2 is an option, as I guess the first iteration would be considered too early.
 
The tale of two consoles

Does it really matter whether the console manufacturer subsidises the hardware OR the software of the console, so long as the console itself is profitable?

There may be two possible competing models here:

1. Expensive hardware. (standard Xbox 360/PS3 SKUs)

2. Expensive media. (fast, flash-based media)

Does it really make much difference which way a manufacturer goes? Is the console manufacturer better off with the status quo, option 1, when option 2 may work just as well? It may not necessarily cost them any more, and may in fact cost them less with option 2 than with option 1, as people don't buy huge quantities of games all at once.

If a console manufacturer chooses to go with option 1, it's saddled with a permanent fixed expense for the two mechanical components, the optical drive and the hard-disk drive, as well as increased motherboard complexity for the interfaces and additional RAM.

If a console manufacturer chooses option 2, they have to subsidise the cost of the media, yes, but those costs will come down, the level of fixed costs within the machine will be far lower, and there will be fewer packaging constraints. A far simpler design, and a far more consumer-friendly and reliable one at that.

It may be time to pull back from the complexity of the current console design mindset. Consoles, as dedicated games machines, can make choices that other fixed-architecture designs such as personal computers cannot. Simplicity in itself is an incredible virtue, and what simpler way is there to feed the high bandwidth games require than with a system that casts off the bandwidth constraints of the two weak links in today's computing world?
 
Second part of the Watch Impress piece is up.

It allegedly claims Sony is faced with a choice between continuing with the Cell architecture/Nvidia GPU or going Larrabee. But supposedly if they use Larrabee, only Larrabee 2 is an option, as I guess the first iteration would be considered too early.
The Google translation makes my head hurt.

Anyway, after reading that I find it even more unlikely that Larrabee will find its way into the PS4; I was a sceptic before as well.
 
For Sony to use Larrabee in the PS4 would be really stupid. It would waste all of their extensive Cell R&D, developers would hate them forever for making them learn a new x86-based GPU architecture, and it would cause G1/2 games on the PS4 to look like crap, as evidenced by the PS3's G1/2 games.

No, sticking with the current Cell/Nvidia architecture makes a lot more sense; the PS3 needs only a modest "power upgrade" to reach native 1080p60, and anything beyond that is a waste unless Sony are seriously considering 3D as a viable product.
 
For Sony to use Larrabee in the PS4 would be really stupid. It would waste all of their extensive Cell R&D, developers would hate them forever for making them learn a new x86-based GPU architecture, and it would cause G1/2 games on the PS4 to look like crap, as evidenced by the PS3's G1/2 games.

No, sticking with the current Cell/Nvidia architecture makes a lot more sense; the PS3 needs only a modest "power upgrade" to reach native 1080p60, and anything beyond that is a waste unless Sony are seriously considering 3D as a viable product.

The current gen of consoles can do 1080p/60 easily any time they want. Simply reduce the graphics enough.

The "problem" is that they can make the games look better at 720p/30, and that will never change no matter how much power is at developers' disposal.
 
The current gen of consoles can do 1080p/60 easily any time they want. Simply reduce the graphics enough.

The "problem" is that they can make the games look better at 720p/30, and that will never change no matter how much power is at developers' disposal.

I should have added that part; of course there are 1080p60 games out there like RR7, but they are lacking in the graphical department despite the hefty resolution and framerate.

I suppose I meant to say 1080p60 with top end graphics, effects and post-processing!

Even then, only a modest bump over the PS3/360 would be needed to achieve this.
 
1080p/60 could be achieved with most current mid/high-end PC graphics cards (GTX 260/HD 4870). We need to go beyond that.
 
...

No, sticking with the current Cell/Nvidia architecture makes a lot more sense; the PS3 needs only a modest "power upgrade" to reach native 1080p60, and anything beyond that is a waste unless Sony are seriously considering 3D as a viable product.

That brings up an interesting point. How much extra work would 3D require?

On the CPU side I guess it's close to zero overhead. How much extra work would it require if you take the cleverest route on the GPU side? A lot of the vertex calculations should be shareable. Far-away objects would probably just need an offset placement in the two images, and some shader work could maybe be shared for close objects as well. I hardly expect it would require twice the GPU power unless you use a dumb brute-force implementation.
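As a rough illustration of that sharing, here is a minimal sketch (using GLM purely for convenience) in which the scene is culled, animated and sorted once against a single centre camera, and only the final view matrix differs per eye, so mainly rasterisation and pixel shading run twice. The 6.5 cm eye separation and the render_scene/draw_list names are assumptions for the example, not from any real engine.

```cpp
#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>

struct StereoViews { glm::mat4 left; glm::mat4 right; };

// Derive the two eye views from one shared centre camera. Everything built
// from the centre view (visibility lists, skinning, shadow maps) can be
// reused for both eyes; only the per-eye view matrix changes.
StereoViews make_stereo_views(const glm::mat4& centre_view, float eye_separation = 0.065f)
{
    const float half = eye_separation * 0.5f;
    // Shifting the world along view-space X by +/- half the separation is
    // equivalent to moving the camera left/right for the two eyes.
    const glm::mat4 left  = glm::translate(glm::mat4(1.0f), glm::vec3(+half, 0.0f, 0.0f)) * centre_view;
    const glm::mat4 right = glm::translate(glm::mat4(1.0f), glm::vec3(-half, 0.0f, 0.0f)) * centre_view;
    return { left, right };
}

// Hypothetical usage: build the frame's draw list once, then submit it twice.
//   StereoViews eyes = make_stereo_views(camera_view);
//   render_scene(draw_list, eyes.left,  projection);   // left-eye image
//   render_scene(draw_list, eyes.right, projection);   // right-eye image
```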

Sorry if this has been discussed elsewhere already; please link me to the thread if that's the case.
 
1080p/60 could be achieved with most current mid/high-end PC graphics cards (GTX 260/HD 4870). We need to go beyond that.

Why bother, if TVs are not going to exceed 1080p until much later? I think the electronics industry has settled on 1080p as the standard. Chasing the >1080p market would mean releasing a product that serves less than 1% of the market (a lucrative 1%, but 1% nonetheless), and I think we have all learned from ATi and Nintendo this gen that mainstream products are where the money's at.
 
You mean produce a console that renders in excess of 1080p? Um, why?

Well... downscaling. Why use other AA techniques if downscaling gives it for free? It frees the way for big "HD+ ready" stickers on the consoles' boxes too.
 
I wouldn't call it free if you have to render at a higher resolution than is displayed...

Using any other method would involve the use of logic or dedicated hardware; using available hardware cycles would be the cheapest option.
 
Well... downscaling. Why use other AA techniques if downscaling gives it for free?

AA is a much less expensive smoothing technique than a bicubic resize (bilinear and nearest-neighbour are pointless if you want a decent end product). It makes little sense to render at 1440p/2160p and resize to 1080p; the number of extra pixels is massive and that cost (monetary and computational) would be wasted.

I don't want to see $600 consoles ever again!
 
Well... downscaling. Why use other AA techniques if downscaling gives it for free? It frees the way for big "HD+ ready" stickers on the consoles' boxes too.


Supersampling is the least free solution... You're looking at a linear increase in memory, memory bandwidth consumption, and pixel shading...
 
Indeed, that's the reason MSAA and other technologies were developed. If you can afford to render at 4x 1080p resolution and downscale, you can afford to render at 1080p with 4xMSAA and far prettier visuals. The only way supersampling will become a worthwhile option for AA is if we have photorealistic rendering and buckets of processing cycles to spare!
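A back-of-the-envelope comparison, counting only the colour target (32-bit colour assumed, ignoring Z and the compression real GPUs apply): 4x supersampling shades and stores four times the pixels of a 1080p frame, while 4xMSAA still shades roughly one sample per pixel and only multiplies the coverage/colour samples that the hardware is built to resolve cheaply.

```cpp
#include <cstdio>

int main()
{
    const long long w = 1920, h = 1080;
    const long long bytes_per_pixel = 4;                   // assumed 32-bit colour format
    const long long samples = 4;

    const long long base_pixels = w * h;                   // 1080p: ~2.07 M shaded pixels
    const long long ssaa_pixels = base_pixels * samples;   // 4x SSAA (3840x2160): ~8.3 M shaded pixels
    const long long msaa_pixels = base_pixels;             // 4x MSAA: still ~1 shaded sample per pixel

    std::printf("shaded pixels   1080p: %lld   4xSSAA: %lld   4xMSAA: %lld\n",
                base_pixels, ssaa_pixels, msaa_pixels);
    std::printf("colour buffer   1080p: %lld MB   4xSSAA: %lld MB   4xMSAA: %lld MB\n",
                (base_pixels * bytes_per_pixel) >> 20,
                (ssaa_pixels * bytes_per_pixel) >> 20,
                (base_pixels * samples * bytes_per_pixel) >> 20);  // MSAA stores 4 colour samples/pixel
    return 0;
}
```

The uncompressed multisampled buffer is as large as the supersampled one; the real savings with MSAA are in pixel shading (roughly 4x fewer shaded samples here) plus the colour/Z compression hardware already applies to multisampled targets.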
 