Inquisitive_Idiot
Regular
Just been thinking about this recently, as there seems to have been a recent spate of speculation regarding the next generation of consoles. It's funny to see how predictions have shifted from the beginning of the generation to where we are now. It seems like the changing market and the passage of time have really tempered people's expectations.
I know there are a lot of facets to the design of a console, and each part choice can affect the design of the entire system, but as of late my thoughts have been focused primarily on which GPU they will choose. I have a 1 GB Radeon 5770 in my desktop computer, and I think it's a fantastic part for the price; basically anything outside of the most demanding games (Crysis, Crysis 2, The Witcher 2) is playable at 1920x1080 at max or very near max detail settings. With this in mind, a part at or around the level of this chip seems like a decent starting point for a next-gen GPU, especially if they limited the resolution to 720p, in which case the shading power of the 5770 should be sufficient to produce some very nice visuals. I was also thinking that by the time the Xbox 3 and PlayStation 4 are released, they should be able to utilize the 28nm manufacturing process for improved power efficiency and lower heat output. I am sure this subject has been covered over the course of this thread, but I figured I would throw my two cents into the discussion.
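To put a rough number on the 720p argument, here's a quick back-of-the-envelope sketch in Python. It only compares raw pixel counts, so it ignores everything that doesn't scale with resolution (geometry, CPU-side work, and so on), but it shows why dropping to 720p frees up so much shading headroom:

    # Per-frame pixel counts at each resolution
    pixels_1080p = 1920 * 1080   # 2,073,600 pixels
    pixels_720p  = 1280 * 720    #   921,600 pixels

    # 1080p shades 2.25x as many pixels per frame as 720p,
    # so a GPU that just manages 1080p has headroom to spare at 720p
    ratio = pixels_1080p / pixels_720p
    print(f"1080p is {ratio:.2f}x the pixel count of 720p")

In other words, a 5770-class part that can hold 1920x1080 in most games has well over double the per-pixel budget at 720p, which could go toward heavier shaders or effects instead.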
I know some people are still championing enthusiast-level cards with 4-8 GB of RAM, but I have to ask what is driving this logic. I just cannot see a way for it to be feasible for Microsoft or Sony to include that level of hardware at an affordable price.