The Next-gen Situation discussion *spawn

When the 360 came out, it was running games that the PC hardware of the time had trouble running. So while the 360 may not have run Oblivion perfectly, the PCs of the time weren't running it perfectly either.
Good PCs were running Oblivion with way better graphical settings - especially with mods and .ini editing - than any of the consoles did. Huge differences.

Also, when comparing how GTA IV etc. ran on computers and consoles, one really should look at how similar games performed on PC. GTA was just horribly ported for PC, so it's in no way an example of magical resource utilisation by consoles.
 
With regards to either console following an iOS type of hardware model.

It just doesn't make sense for a console. At least not while AAA games still remain a large focus....read the rest above.
If you are a developer, you build for the platform you know. If Steambox 2013 comes out and you start dev in 2013, you might be releasing in 2015, but you build for the 2013 model. Even if a 2015 model comes out, all other devs who started at the same time as you will be targeting the 2013 model, so you're not behind anyone else. If you are building a game in 2014 to go out in 2016 and you know the specs for the 2015 model, you can make that the ultra and the 2013 model the medium. For the consumer, we won't see the benefits of a new console right at launch. You might get some games that throw a bone like extra effects at launch, but it won't be a generational difference. If you bought a GeForce 680 when it launched, you didn't see any benefit. Same with the iPad 4. But people buy them anyway. And unless you are a tech junkie, you can skip a cycle, so you buy every 4 years.
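To make that targeting idea concrete, here's a minimal sketch of what the "old box is medium, new box is ultra" split could look like, assuming a hypothetical pair of boxes named steambox_2013 and steambox_2015 and made-up settings values:

```python
# Hypothetical quality presets for a rolling console cycle:
# the 2013 box gets the "medium" target, the 2015 refresh gets "ultra".
PRESETS = {
    "steambox_2013": {"resolution": (1280, 720),  "shadows": "medium", "msaa": 2},
    "steambox_2015": {"resolution": (1920, 1080), "shadows": "high",   "msaa": 4},
}

def settings_for(hardware_id):
    # Unknown or future boxes fall back to the oldest supported preset,
    # so a game built against the 2013 model still runs everywhere.
    return PRESETS.get(hardware_id, PRESETS["steambox_2013"])

print(settings_for("steambox_2015"))
print(settings_for("steambox_2017"))  # falls back to the 2013 preset
```

Whether any platform holder would actually expose hardware IDs this way is pure speculation on my part; the point is only that the ultra/medium split described above is the same mechanism PC games already use for quality presets.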

Oh, and while tablet games are small, they are getting bigger and will continue to do so as each new tablet comes with more storage. The Avengers game is over 2 GB.
 
If you are a developer, you build for the platform you know. If Steambox 2013 comes out and you start dev in 2013, you might be releasing in 2015, but you build for the 2013 model. Even if a 2015 model comes out, all other devs who started at the same time as you will be targeting the 2013 model, so you're not behind anyone else. If you are building a game in 2014 to go out in 2016 and you know the specs for the 2015 model, you can make that the ultra and the 2013 model the medium. For the consumer, we won't see the benefits of a new console right at launch. You might get some games that throw a bone like extra effects at launch, but it won't be a generational difference. If you bought a GeForce 680 when it launched, you didn't see any benefit. Same with the iPad 4. But people buy them anyway. And unless you are a tech junkie, you can skip a cycle, so you buy every 4 years.

It doesn't work that way for PCs. While the majority of PC games are console ports, many of the top remaining PC devs still develop by trying to guess what the state of PC graphics and computing power will be when their game hits. CD Projekt Red, for example. Crytek as another. They don't develop games for current hardware, they develop games for future hardware.

Hence it isn't uncommon that you end up with games that are unplayable at the highest settings even with the best hardware available at the time the game launches.

Something like that would be unacceptable on a console, however. And that's where the problem comes in. Without a stable and known hardware target, it becomes more and more likely that you'll either have games that never take advantage of the console (developed for the current console), games that cannot perform well on the current console (misjudged the state of hardware when your game finally launches), or games that perform well and maybe take advantage of the current console (somehow managed to guess exactly what the next console update will be).

Most console developers would like to do as you say: develop for the current console. Which means that with, say, a 2-year hardware upgrade cycle, the console that is currently in stores and purchasable will not have a game released for it during its entire life cycle. It won't be until after the console is replaced that the games targeting it will finally come out.

So, what does that potentially end up doing? It encourages people to wait until the end of the product life cycle before picking it up, and to ignore any new console that is launched. Not something that is going to promote a healthy console market.

That dynamic doesn't exist with smartphones at the moment, as the development cycles are so short that you can still get something that started development when you bought your device and use that app during the life of the device. Added to that, there isn't a culture on smartphones of scrutinizing how much games have improved from one generation to the next.

I mean, does anyone care if the latest Angry Birds is graphically a generation beyond what was possible with the previous Angry Birds? :p That's what I mean: if you are going after an audience dominated by casual gamers, then sure, a 2-year cycle will work.

For core gamers and the types of games and graphics that they demand out of the console that they just bought? Won't work.

Regards,
SB
 
But couldn't someone make that same argument with the launch games for Xbox 3, PS4 right now? They've been in development for 2 years no? They could overshoot their targets too. Same with Vita when it launched. Same with Wii U, 3DS. I mean devs have been working with target hardware for a long time. Sure, you won't squeeze the most out of a new system, but you can make a conservative estimate.
 
As long as there are multiplatform developers, it should be possible to implement an iOS-style h/w refresh. I think the key thing is to focus on gameplay first, like Nintendo. My wife and kid love iOS gaming. The situation has reversed in my family. Now they play games more often and longer than me.
 
But couldn't someone make that same argument with the launch games for Xbox 3, PS4 right now? They've been in development for 2 years no? They could overshoot their targets too. Same with Vita when it launched. Same with Wii U, 3DS. I mean devs have been working with target hardware for a long time. Sure, you won't squeeze the most out of a new system, but you can make a conservative estimate.

Absolutely, yes!

Look at what was said about X360 and PS3 games when the consoles launched. Look at the complaints about how it wasn't much better than the previous gen.

But here's the catch. The X360 and PS3 were around for longer than 2 years. People that have been console gamers for a while knew to expect game graphics to increase in quality as the generation went on. Look at GoW on PS2 as a prime example of something like that. Compare Halo 3 to Halo Reach.

Compare COD 3's graphics to COD: BO2's graphics. Especially the PS3 versions.

Now imagine if you just cut that all off. And every game that is ever launched from now on is basically like the first 1-2 years of the X360 and PS3.

In other words, no learning the best programming practices of each console generation, no optimizing development and programming practices as the generation goes on for more efficient graphics rendering.

Imagine that every console from now on is never efficiently targeted. Games will get better as time goes on, yes. But you end up with the following.

Buy a console with a 2-year lifespan. Games are rough and not totally optimized, similar to the PS3/X360's first 2 years.
New console, 2-year lifespan. Games are not totally optimized, similar to the PS3/X360's first 2 years. Games now target the new console, so they are never optimized for the previous console, which cannot run these games well.
New console, 2-year lifespan. The chain snowballs. And basically you're in PC gamer territory, where you need to update your hardware to be able to play the newest PC-optimized (not ports from console) games at high quality.

Or, as someone mentioned before, do developers only target that first console in the chain, as it offers the biggest chance to recoup their investment, so any new console sees absolutely no benefit? Or do we go with a PC model of trying to address all consoles, potentially introducing bugs? And then have people complain that too much time was spent optimizing for the low-end console, or too much time was spent trying to optimize for the higher-spec'd console. I see the console industry entering its death throes if it attempts to do this.

Compare that to right now.

Buy a console. First 2 years just like above.
Years 3-4, games are starting to be optimized; games look better and perform better, just like above. Difference? The customer doesn't have to buy a new console to play them well.
Years 5-6, games reach high efficiency in graphics rendering and programming. See increases as above. Except you can still play them on a console purchased 5-6 years ago.
Years 7-8, a game here or there may be able to exploit some more efficiencies, but in general games have plateaued. Time for a new console.

And just look at iOS and PC every time there's a new generation of hardware.

On iOS, Apple fixes bugs, introduces new features, etc. that break existing applications and then expects the developers to fix things. Users have to hope that if an app they use gets broken, the developer is still around and still interested in fixing it.

On PC, similar things. A new graphics card comes out with new drivers that break compatibility with older games. Or an updated OS breaks compatibility. Or some other component breaks compatibility. But on the PC, instead of the developer having to fix things and the user crossing their fingers, the hardware manufacturers or Microsoft end up fixing or making workarounds for the broken compatibility. Or they work with the software developer to try to fix any incompatibility, assuming the game isn't old. At which point the developer won't be interested anymore.

What's that going to be like on the consoles?

Regards,
SB
 
You can still optimize for h/w when it keeps improving. iOS is not a good example of consistent graphics improvement, mainly because Apple is not focused on core gaming. Nonetheless, Retina versions do look better than non-Retina ones. The iPhone 5 version has a different aspect ratio.

Multiplatform game developers do have different visual quality for gaming PCs and consoles based on DirectX and GL versions. And the games do improve over the last release as the devs polish their engines and workflow.
 
Gentlemen, let's make a deal not to use that term outside of its intended context (which on B3D means never outside of RPSC).

Thank you and I agree. Been meaning to pick up almighty on this. It's really grating whenever I see it used like that.
 
SB, a few things to note.
Early games on a new console are not optimized, but the best launch titles on a new console blow away the best late-gen titles from the previous generation. Resolution, poly count, all noticeably higher.
Devs get used to a platform over time, true, but remember in the quick upgrade cycle scenario we are talking hardware evolution, not revolution: x86 to x86 or ARM to ARM, not one esoteric arch to another. Ramp-up time is much less. Also, Xbox and PCs use the DirectX layer, not to-the-metal access. PlayStation is also going this route now.
As other posters said, PC games today do a good job of varying quality settings. I have PCs with Nvidia 4x to 6x cards and can play any game just by reducing graphics settings.
Remember also, unlike PC, we are talking 2-3 very defined and similar systems devs need to target.
Every gaming-capable platform except consoles does evolutionary upgrades: PC/Mac, tablets, phones. Consoles instead have this weird fixation on 10-year cycles, even if it is just marketing talk, completely at odds with the breakneck speed of the rest of the industry.
 
Resolution, poly count, all noticeably higher.

By whom?

Us geeks here on 3D related forums or within the industry? Yes, sure.

The actual audience buying these games? Very unlikely, and with each new generation even less so.

The average customer still believes that all PS3 games are running at full 1920*1080 resolution, even those that are actually running at 960*540. They won't even know what poly count means.
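To put rough numbers on that gap, a quick back-of-the-envelope check:

```python
# Pixel counts for full 1080p versus a quarter-resolution render target.
full_hd = 1920 * 1080   # 2,073,600 pixels
sub_hd = 960 * 540      # 518,400 pixels

print(sub_hd / full_hd)  # 0.25 - only a quarter of the pixels are actually rendered
```

A 960*540 framebuffer shades only a quarter of the pixels of true 1080p before being scaled up to the TV, and most buyers still never notice.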

Just FYI...
 
And you've hit one of the reasons I left. Almost all of the core gamers who created the Xbox and were in management have been forced out or left, and what's left over is MBAs with dollar signs in their eyes. I just found I could no longer believe in and agree with the direction the execs were taking the Xbox org.

Don't take my dissatisfaction with the management as a condemnation of the product. I don't think they will be pulling a Wii U. It's just that they're moving away from the model of "Core gamer first, casuals after". Also, both sides have a problem: their current designs are so ridiculously capable, and it's a non-starter to launch a box that can't do everything the previous box did. Remember what the launch 360 and PS3 could do? Pretty much play a game online. That's it. They've had 8 years of extra development and features that the companies have to either bring over or improve on for the new generation, and that's not easy.

:cry:

By whom?

Us geeks here on 3D related forums or within the industry? Yes, sure.

The actual audience buying these games? Very unlikely, and with each new generation even less so.

The average customer still believes that all PS3 games are running at full 1920*1080 resolution, even those that are actually running at 960*540. They won't even know what poly count means.

Just FYI...

Laa-Yosh, there is no one more qualified than you to do this.

Start a thread.

Post examples of diminishing returns: run some LOD on some models with various poly counts. Do the same for alpha, shadow resolution, AA, resolution, etc to demonstrate this principle.
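As a rough, purely hypothetical illustration of that diminishing-returns argument (the triangle counts and the 10% screen coverage below are made-up numbers, not measurements from any real model), consider how little screen area each extra triangle ends up covering:

```python
# Toy model: average on-screen pixels per triangle for a character that
# covers a fixed fraction of a 1080p frame, at increasing poly counts.
frame_pixels = 1920 * 1080
character_pixels = frame_pixels * 0.10   # character fills ~10% of the screen

for tri_count in (5_000, 20_000, 80_000, 320_000, 1_280_000):
    px_per_tri = character_pixels / tri_count
    print(f"{tri_count:>9} tris -> {px_per_tri:6.2f} pixels per triangle")
```

Each 4x jump in poly count quarters the pixels per triangle; once you're below roughly a pixel per triangle, the extra geometry simply can't be seen. That's the kind of plateau a proper LOD/shadow/AA comparison thread would show far better with real renders.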

Do this and you may snatch away ERP's #1 poster award.
 
SB, a few things to note.
Early games on a new console are not optimized, but the best launch titles on a new console blow away the best late-gen titles from the previous generation. Resolution, poly count, all noticeably higher.
Devs get used to a platform over time, true, but remember in the quick upgrade cycle scenario we are talking hardware evolution, not revolution: x86 to x86 or ARM to ARM, not one esoteric arch to another. Ramp-up time is much less. Also, Xbox and PCs use the DirectX layer, not to-the-metal access. PlayStation is also going this route now.
As other posters said, PC games today do a good job of varying quality settings. I have PCs with Nvidia 4x to 6x cards and can play any game just by reducing graphics settings.
Remember also, unlike PC, we are talking 2-3 very defined and similar systems devs need to target.
Every gaming-capable platform except consoles does evolutionary upgrades: PC/Mac, tablets, phones. Consoles instead have this weird fixation on 10-year cycles, even if it is just marketing talk, completely at odds with the breakneck speed of the rest of the industry.

I don't disagree with any of that.

First party devs at the start of a cycle will have had earlier and likely greater access to the console hardware as it was being developed. The parent company, Sony/MS, can afford to have their first party devs change development targets as the console hardware changes, as it's in their best interest to showcase a new console. 3rd party devs don't have that luxury. And even then, I can't think of a single first-year title for any console that is better than multiple titles released late in the console's life cycle.

As well, you don't seem to understand just how difficult it is to maintain compatibility with games whenever a new generation of hardware is introduced on PC, even when the hardware is similar to the previous one and made by the same company. Both Nvidia and AMD have to put in significant driver effort to make sure that new hardware works with recently released games. Hence, it isn't uncommon to see games released just a few months prior to a new generation of video cards underperform on those cards (and sometimes not even render correctly, or cause the game to crash), with performance only increasing as new drivers are released.

And god forbid if it's an older title released 2-3 years prior that isn't one of the more popular titles. It may take 1-2 years (if ever) for problems with the game to be addressed.

Add onto that, developers may have to work around quirks between different generations of hardware from the same manufacturer. What worked just fine on a video card from one IHV might have problems with a video card launched 1-2 years later by that same IHV. And so they have to spend some development time fixing the problems, or hope that the IHV releases a driver that puts in special code to deal with that particular game.

Backwards and forwards compatibility isn't as easy as most seem to believe it is. In general it works out fine. But it is definitely not uncommon for problems to arise, and then significant effort is put in behind the scenes by MS, the hardware manufacturers, and the software developers to try to make it as seamless as possible for the end consumer.

And that's one of the major reasons you saw a large exodus over the past decade of people leaving PC gaming for console gaming. Speaking of varying levels of graphics quality, that's another reason many core gamers abandoned the PC in favor of consoles: the constant upgrade cycle needed to play a game at its best quality.

So, how tolerant do you think console gamers would be if suddenly an "updated" console came out that had trouble running a game that worked on their launch console?

If it's cheap (say 0.99-1.99 USD like iOS and Android apps) then they probably wouldn't care so much. But what about the 20 USD game or 60 USD game they bought just 2 months ago?

Want to make sure the console market dies? Make it more like PC gaming.

Regards,
SB
 
Want to make sure the console market dies? Make it more like PC gaming.

The problem is you can't let console gaming continue like it has either. That's because phones & tablets are so pervasive that they are eating into the traditional console market. People are getting accustomed to yearly hardware upgrades. Becoming more like the PC market by supporting backward/forward compatibility & increased hardware refresh cycles is one way to stave off the mobile onslaught, but it would only be temporary. To survive, consoles need a more unique experience that you can't get in the mobile space: something that only a set-top box sitting next to the biggest screen in your house can provide. Will they be able to do it with Kinect, VR/AR glasses & IllumiRoom-type technologies that get introduced every year or two? I think so. You can't get that kind of immersion on a lower-power portable device. Eventually maybe, but not for a couple of years after the console.

Tommy McClain
 
If phones & tablets are eating into traditional console market, then they are devouring the PC market.

I agree that embracing future tech is the way forward and it doesn't need to be done at launch.
 
The problem is you can't let console gaming continue like it has either. That's because phones & tablets are so pervasive that they are eating into the traditional console market. People are getting accustomed to yearly hardware upgrades.
I'm not sure that's really a long term issue though. The smart-phone phenomenon is only a decade old. It started with old, clunky tech, which has been accelerating extremely fast. But there will come a point, in theory, when enough is enough and people won't care to upgrade every year. I see it like the whole computing space compressed into a few years. If you think about computing, many folk don't upgrade their PC because it's 'fast enough'. Touch devices are more convenient too. So people are buying touchies, and then a new model comes out that's fast with a better screen and people want to upgrade. It's highly unlikely that momentum will be maintained forever though. Retina screens are a full-stop. Once someone has a 300+dpi screen, there's no upgrade path for visuals (quality can come into it, but I'm using generalisations here). When the current machine can browse the net quickly and edit photos simply and quickly, what's the reason to upgrade? Some (Apple fanboys!) are of the opinion that people will buy the latest, greatest tech no matter the price, but I don't think that's true, and I don't think we'll see $500 a year spent on the new touch device.

Looking at my own purposes, I'd like an upgrade to my TF101 (a two-year-old machine) for a higher-res display and pen functionality. I may get a Note 10.1 for the pen, but it hasn't the improved screen. So if I do get that, I'll upgrade to a 1080p 10.1 Note. After that, the machine will do me fine and I don't see any need to buy a new tablet for a long time. Just as I've no need to upgrade my Core2Duo laptop except if I start doing heavy HD video editing.

There is a desire for new things that makes people shop-happy, but it wears off after a while once the tech is no longer new, and people settle down into more considered buying habits, I think. Just as a car is bought every few years rather than upgraded to the yearly model, and a TV, and every other device, the flash and pop of mobile computing will become normalised and people won't go on a relentless upgrade cycle indefinitely. There's even an argument that after a while it'll somewhat stagnate. Beyond 3D gaming, there's little a mobile needs to do that's processing intensive, same as a PC. So when everyone has a high-res display on a responsive device, reasons to buy a new machine end up few and far between. Give it another two years and the mobile space might reach saturation and slow down massively. I guess a lot depends on the contracts. When a person's contract runs out, they are free to switch to far cheaper mobile plans, but a lot of folk may not realise that and/or may be pretty conditioned to carry on taking out 2-year contracts with new handsets more powerful than they'll ever need!
 
I'm sympathetic to the driver compatibility situation but remember nVidia and AMD have to deal with a wide open PC market. In a cross-compatible console, we are talking supporting 2-3 machines at a time with evolutionary architectures. The iPad has great backward compatibility. I haven't seen any game that runs on an older iPad not work with the latest.

If you have a game that supports 3 machines simultaneously, you won't get the best performance out of them it is true. You might ask, why would anyone buy the new console if it only offers slightly better graphics and performance? It is because it is the current one that is being sold. They do it with cars, TVs, tablets, phones. Is the iPad 4 of any use right now? No, nothing really takes advantage of its extra processing power. But the iPad 4 replaces the iPad 3 so Apple can stay ahead of the curve. The iPad 4 is not for iPad 3 buyers, but for people who don't have iPads or have iPad 1. Does Intel really need to come out with a new chip every year? Technology moves much faster than appears practical. People buy a tech product every few years, but manufacturers create a new one every year.

Every new console generation, developers are hand-wringing over sales with a 2-3 million user base at most. If the Xbox 3/PS4 games could be playable with 360/PS3 at lower performance, all of a sudden they have a massive audience to sell to. That's of course impossible because the architecture changed so much. But now that they are going with x86/Radeon, the Xbox 4/PS5 can be cross compatible.

There are at least 2 big problems with the console industry. One, it resets every 5-8 years. It's no surprise you get a new leader almost every new generation. All your hard work to get people in your ecosystem mostly vanishes. Two, the cycle of 7-8 years is too long for many people who are antsy about tech advancement. Maybe a 2 year cycle is too short; maybe a 3 year cross-compatible cycle might be better.
 
I'm sympathetic to the driver compatibility situation but remember nVidia and AMD have to deal with a wide open PC market. In a cross-compatible console, we are talking supporting 2-3 machines at a time with evolutionary architectures. The iPad has great backward compatibility. I haven't seen any game that runs on an older iPad not work with the latest.

If you have a game that supports 3 machines simultaneously, you won't get the best performance out of them it is true. You might ask, why would anyone buy the new console if it only offers slightly better graphics and performance? It is because it is the current one that is being sold. They do it with cars, TVs, tablets, phones. Is the iPad 4 of any use right now? No, nothing really takes advantage of its extra processing power. But the iPad 4 replaces the iPad 3 so Apple can stay ahead of the curve. The iPad 4 is not for iPad 3 buyers, but for people who don't have iPads or have iPad 1. Does Intel really need to come out with a new chip every year? Technology moves much faster than appears practical. People buy a tech product every few years, but manufacturers create a new one every year.

Every new console generation, developers are hand-wringing over sales with a 2-3 million user base at most. If the Xbox 3/PS4 games could be playable with 360/PS3 at lower performance, all of a sudden they have a massive audience to sell to. That's of course impossible because the architecture changed so much. But now that they are going with x86/Radeon, the Xbox 4/PS5 can be cross compatible.

There are at least 2 big problems with the console industry. One, it resets every 5-8 years. It's no surprise you get a new leader almost every new generation. All your hard work to get people in your ecosystem mostly vanishes. Two, the cycle of 7-8 years is too long for many people who are antsy about tech advancement. Maybe a 2 year cycle is too short; maybe a 3 year cross-compatible cycle might be better.

People were talking about a 3-5 year cycle with the Xbox 360... indeed I'm sure Microsoft said something similar at the time... the recession, RROD and rapid advances in software technology have allowed/forced them to prolong it.
 
I'm not sure that's really a long term issue though. The smart-phone phenomenon is only a decade old. It started with old, clunky tech, which has been accelerating extremely fast. But there will come a point, in theory, when enough is enough and people won't care to upgrade every year. I see it like the whole computing space compressed into a few years. If you think about computing, many folk don't upgrade their PC because it's 'fast enough'. Touch devices are more convenient too. So people are buying touchies, and then a new model comes out that's fast with a better screen and people want to upgrade. It's highly unlikely that momentum will be maintained forever though. Retina screens are a full-stop. Once someone has a 300+dpi screen, there's no upgrade path for visuals (quality can come into it, but I'm using generalisations here). When the current machine can browse the net quickly and edit photos simply and quickly, what's the reason to upgrade? Some (Apple fanboys!) are of the opinion that people will buy the latest, greatest tech no matter the price, but I don't think that's true, and I don't think we'll see $500 a year spent on the new touch device.

Looking at my own purposes, I'd like an upgrade to my TF101 (a two-year-old machine) for a higher-res display and pen functionality. I may get a Note 10.1 for the pen, but it hasn't the improved screen. So if I do get that, I'll upgrade to a 1080p 10.1 Note. After that, the machine will do me fine and I don't see any need to buy a new tablet for a long time. Just as I've no need to upgrade my Core2Duo laptop except if I start doing heavy HD video editing.

There is a desire for new things that makes people shop-happy, but it wears off after a while once the tech is no longer new, and people settle down into more considered buying habits, I think. Just as a car is bought every few years rather than upgraded to the yearly model, and a TV, and every other device, the flash and pop of mobile computing will become normalised and people won't go on a relentless upgrade cycle indefinitely. There's even an argument that after a while it'll somewhat stagnate. Beyond 3D gaming, there's little a mobile needs to do that's processing intensive, same as a PC. So when everyone has a high-res display on a responsive device, reasons to buy a new machine end up few and far between. Give it another two years and the mobile space might reach saturation and slow down massively. I guess a lot depends on the contracts. When a person's contract runs out, they are free to switch to far cheaper mobile plans, but a lot of folk may not realise that and/or may be pretty conditioned to carry on taking out 2-year contracts with new handsets more powerful than they'll ever need!

Very well said Shifty.

I would also add that there is next to no proof to be found that the current surge in mobile computing devices is having any significant effect on the home console business. Dedicated gaming handhelds, sure, but that's really only because of two things:
1) Devs/pubs are moving more and more towards phones/tablets because of the software ecosystems, wherein they can a) continue to profitably sell the low dev-cost shovelware they did in past generations to sustain company profits, where on current dedicated portable and home consoles this practice has become unsustainable, and b) obtain a higher ROI because of the lower dev costs, distribution costs and platform royalties, which, together with unconscionably massive HW installed bases, make them an uncontested platform for casual gaming (and some core gaming) software.

On the other hand, the home console business is able to offer something far more unique and compelling that these portable gaming platforms cannot. Both casual and core gaming are still quite healthy and have in no significant way been affected by the mobile platforms. An example of this is the success of the Kinect platform, which shows that a home console experience can be sufficiently differentiated from mobile gaming, such that it is compelling enough to sell a truckload even to the most casual of gaming consumers (i.e. those most at risk of being lost to mobile games). Core games are a given for home consoles, as the core gaming demographic is in no danger of defecting to mobile devices to accept significantly lower quality games, with far less depth and scope than the core games on home consoles that are designed around the values the core gamer desires most.

If anything, the only thing that has hurt the console gaming industry in a significant way as of now is the prolonged generation, wherein the biggest releases have been mostly limited to tired sequels; pubs have become more risk averse and thus have dared not launch new IPs and test new genres. Therefore the biggest games have become somewhat homogenised, and play fatigue has set in to a degree. There is also an argument that core gamers are anticipating a new generation of HW being announced, and thus have limited their purchases to wait for the new boxes. All these factors would have led to poor upfront sales of all but the most mega-hyped games (and thus mega-publicised games, thus uber marketing budgets), and so cause the retail prices of most AAA games to crash quickly, which players in turn capitalise on and thus stop buying games at launch (as titles will drop to 40% of the price within a month), further exacerbating the issue. There's also a point to be made that players who buy many games, with so many sold at such low prices, will have huge backlogs at this point in the gen, so that further discourages buying newer games, particularly at launch.
 
He didn't say MS has a monopoly in console gaming. The monopoly word was attached only to Windows, and Xbox was the only "thing" they had success with other than Windows. The wording is a bit confusing though.

It was pretty much implied though. But it is disingenuous to consider that MS has anything like a monopoly in the gaming sphere. People seem to forget that it is only in North America that they have had overwhelming success. Everywhere else in the world they have performed fair to middling but still been overtaken by Sony.

And if it is MS's hubris that makes them drop the ball this coming gen, if in fact that is true, then it won't be the first time they have completely underestimated not only the competition but the marketplace's attitude to growth in a particular area.

But given their track record over the decades of releasing painfully rubbish software so their next release looks so much better (Millennium, anyone?), there is no reason to suspect they won't do that with their consoles too. They've already managed it with the abortive RT entry in the tablet space.
 
I'm sympathetic to the driver compatibility situation but remember nVidia and AMD have to deal with a wide open PC market. In a cross-compatible console, we are talking supporting 2-3 machines at a time with evolutionary architectures. The iPad has great backward compatibility. I haven't seen any game that runs on an older iPad not work with the latest.

If you have a game that supports 3 machines simultaneously, you won't get the best performance out of them it is true. You might ask, why would anyone buy the new console if it only offers slightly better graphics and performance? It is because it is the current one that is being sold. They do it with cars, TVs, tablets, phones. Is the iPad 4 of any use right now? No, nothing really takes advantage of its extra processing power. But the iPad 4 replaces the iPad 3 so Apple can stay ahead of the curve. The iPad 4 is not for iPad 3 buyers, but for people who don't have iPads or have iPad 1. Does Intel really need to come out with a new chip every year? Technology moves much faster than appears practical. People buy a tech product every few years, but manufacturers create a new one every year.

Every new console generation, developers are hand-wringing over sales with a 2-3 million user base at most. If the Xbox 3/PS4 games could be playable with 360/PS3 at lower performance, all of a sudden they have a massive audience to sell to. That's of course impossible because the architecture changed so much. But now that they are going with x86/Radeon, the Xbox 4/PS5 can be cross compatible.

There are at least 2 big problems with the console industry. One, it resets every 5-8 years. It's no surprise you get a new leader almost every new generation. All your hard work to get people in your ecosystem mostly vanishes. Two, the cycle of 7-8 years is too long for many people who are antsy about tech advancement. Maybe a 2 year cycle is too short; maybe a 3 year cross-compatible cycle might be better.

The bolded actually isn't true at all. Perhaps at the beginning of a generation it is. But look at games like Crysis 2, Battlefield 3, and pretty much most AAA multi-platform games this generation.

Devs don't build engines that basically act as a fat driver layer, like Windows on a PC, which would make it impossible to exploit the advanced features, nuances, and strengths of each platform. Unlike on PC, they only have maybe two or three platform configs to support with MP games, and so developers will indeed build engines and technologies that wring the most out of each platform in terms of performance. First party platform holders also support this process with tools and codebases that further assist developers.
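In caricature, the difference between the two approaches could be sketched like this (a deliberately toy example; the platform names and "fast paths" are just stand-ins, not anyone's actual engine code):

```python
# (a) PC-style: everything goes through one generic abstraction layer,
#     so no platform-specific strength is ever exploited.
def draw_generic(scene):
    return f"draw {scene} via generic API"

# (b) Console-style: a small, fixed set of targets, each with its own
#     hand-tuned path behind a thin dispatch table.
PLATFORM_PATHS = {
    "x360": lambda scene: f"draw {scene} with an EDRAM-friendly tiling path",
    "ps3":  lambda scene: f"draw {scene} with SPU-assisted culling",
    "pc":   draw_generic,  # the PC build keeps the generic path
}

def draw(platform, scene):
    return PLATFORM_PATHS[platform](scene)

print(draw("ps3", "city_block"))
```

The point is only that with two or three fixed targets, per-platform specialisation stays cheap; with a rolling hardware lineup it collapses back into the generic path.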

Of course first party devs will always get more out of each singular platform than an MP dev, but that's more about time and development budget constraints than anything else. MP devs won't have the time or manpower to optimise the hell out of a platform when they have two or three individual platforms to support, the way first party devs can.

This still doesn't mean that platform hw isn't being exploited to a significant degree however, as the low level HW features are much more exposed in consoles than almost any other platform.

Speeding up the time between HW refreshes will only serve to create an increasingly large abstraction layer between the exposed HW of each console generation and the developers, making things effectively like the PC market. However, unlike the PC market, where the single platform holder and IHVs are responsible for providing driver layers and programming APIs but don't have to invest time, money and manpower into certification processes and the like, turning consoles into PCs with more abstracted HW and frequent refreshes effectively creates a platform that has none of the benefits of a fixed platform yet still retains all of its flaws. It would make the consoles PCs, but even worse, because of the higher platform royalties, high devkit pricing, cert and TRC processes, and all the other patching and regulatory red tape. Consoles would die a quick death.
 