Feasibility of an upgradeable or forward-compatible console *spawn*

When do you think the PS2 launched? :???:
When did Digital Foundry and the pixel counters appear to tell everyone how undersampled PS2 games were? The internet back then was ugly HTML web-pages and GeoCities and simple fora. It wasn't telling people how sub-par their console was. Back then people used magazines to read about games and were generally oblivious to the shortcomings of their machine, and were all the happier for it!
 
I agree that it would be a bit of a risk......There are people that would be upset.
But the alternative is to just sit by and let Microsoft get embarrassed every single month for the rest of this generation.
I mean let's face it.......this resolution-gate stuff is a PR nightmare for both the Xbox brand and Microsoft itself, one that keeps sticking in them like a thorn over and over and over with no apparent end.

I'm not saying an enhanced XBOX One absolutely will happen........but I wouldn't really be surprised if it does.
But if it does......I'm taking all "prognosticator credit", you can be sure of that. ;)
I think it would be a PR disaster of greater amplitude than the pre-launch fiasco. Whether it is Sony or MSFT who does it, what are we speaking about? Most likely a machine able to push 4K resolution. The number of customers potentially interested is pretty low, not high enough to legitimize anything on the software side other than a resolution increase; you would want the same copy of any game to function on both systems with minimal changes and minimal impact on performance. I believe it is doable using widely available processes (GF or TSMC 28nm), though the complicated design of the X1 makes things harder, whereas Sony's straightforward choices make it relatively easy to consider.
It is disputable, but I do believe that at this point perceptions are set for the generation (see the laasssttttiiiiinnnnnnggggggg impact of the PS3 introduction last gen). What I would wish for MSFT is that all the effort they seem to have put into software would allow them to map the "durango software machine" to cheaper hardware and, along with their deep pockets, fight with a significant price advantage. As things are going, by the end of this gen Sony might have close to a monopoly outside of the USA, and the situation may not be that rosy in the USA either.
 
PS3 still got inferior versions of games. A more direct comparison might be PS2, inferior to XB all its life. It too suffered lower resolutions, but there wasn't an internet back then telling everyone how many pixels were actually being rendered! XB1's lower res isn't really a PR nightmare as it's not damaging to the public face of the brand. It's more a competitive disadvantage.
Resolution back then was mostly irrelevant considering the SD TV sets. Also, the PS2 had become such a huge success that everyone cared about the game exclusives it was getting. It was a phenomenon. Everyone knew that the PS2 was technically inferior. But would someone opt for an XBOX to buy DOA, or opt for a PS2 and buy Tekken? Would someone opt for an XBOX to buy Splinter Cell, or opt for a PS2 to get the next MGS? Would someone opt for an XBOX to play Sega GT, or opt for a PS2 to play Gran Turismo?
The PS2 was a beast in terms of games people wanted to play, and it had built its reputation as the most powerful console ever released a year or two before the XB and GC launched. The brand was already strong. By the time the competition launched, people didn't care if the XB was more powerful.
The XB's capabilities were obvious even without pixel counting.

It was a lot different from today's simultaneous release of consoles with comparable exclusives at their disposal. So performance in this case is a more obvious buying factor than it was back then.

Where the PS2 gained some unfair ground was against the DC. The PS2 was obviously more powerful in many areas, but so was the DC in others, judging from comments here. If people had had access to information back then as they have today, I suspect that the DC's fate would have been a little bit better.
 
When did Digital Foundry and the pixel counters appear to tell everyone how undersampled PS2 games were? The internet back then was ugly HTML web-pages and GeoCities and simple fora. It wasn't telling people how sub-par their console was. Back then people used magazines to read about games and were generally oblivious to the shortcomings of their machine, and were all the happier for it!
B3D was very much alive back then and we did have the longest, most tedious, fanboy infested discussions. Back then it was interlaced vs progressive, low res Ico etc etc etc...
 
But Joe Gamer didn't, so there was no 'PR backlash' for PS2 rendering NFS at 30% lower resolution than XB. These days, every man and his dog is pixel counting and telling the world what inferior resolution and framerate a game is running at, although it's still limited to core gamers. But I accept they'll tell their friends that console A is better than console B (in much the same way some said PS3 was better than XB360 because of the Cell supercomputer in it...) and there will be some negative marketing. But, it's still marketing and should be competed against with marketing rather than throwing in the towel calling it a PR nightmare. You don't have to be the best product on the market to have dominant market share. You just need to out-market your competitors. I reckon XB1 handled differently from the outset could be selling far, far better even having a performance deficit, and it shouldn't be considered that performance deficit == fail == new model ASAP.
 
When did Digital Foundry and the pixel counters appear to tell everyone how undersampled PS2 games were? The internet back then was ugly HTML web-pages and GeoCities and simple fora. It wasn't telling people how sub-par their console was. Back then people used magazines to read about games and were generally oblivious to the shortcomings of their machine, and were all the happier for it!

Believe it (or remember it) or not, but people were dissecting graphical techniques before Digital Foundry :yes: As for your recollection of the internet circa 2001, sure it looked different because HTML was still on version 4, but I recommend you have a look at the Wayback Machine and check out some of the websites still around today. Before judging, remember websites were optimised for a 1024x768 screen and had to support 800x600 as well. A fair few of the gaming websites around today started in 1995/96.
 
eGPU... Light Peak... Thunderbolt... DockPort... the names are plentiful, but the targeted use is connecting another GPU to the system, and eGPUs seem to be maturing and getting more useful this year with Razer, Asus, Dell, and MSI joining the fun. What was a dream for consolites should become a reality next gen, with easy upgradeability!

Imagine hooking up another GPU to a PS5 for extra power. Can one daisy-chain eGPUs?
Stock PS5 = 4K/30
PS5 + eGPU = 4K/60?
PS5 + eGPUx2 = 4K/120?

I expect Sony, if they are smart, to make the PS5 OS work with multiple GPUs from the start.

Will you buy an eGPU add-on for $199-249, two years down the road, for the extra boost?
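Purely as back-of-envelope arithmetic (the PS5/eGPU configurations above are hypothetical speculation, and real scaling is never this linear), the framerate targets map to per-frame time budgets like this:

```python
# Back-of-envelope sketch: doubling the target framerate halves the time
# available to render each frame, so each config step would need roughly
# double the GPU throughput at the same resolution.

def frame_budget_ms(fps: float) -> float:
    """Milliseconds available to render one frame at a given framerate."""
    return 1000.0 / fps

# Hypothetical configs from the speculation above.
configs = {
    "Stock (4K/30)": 30,
    "+1 eGPU (4K/60)": 60,
    "+2 eGPU (4K/120)": 120,
}

for name, fps in configs.items():
    print(f"{name}: {frame_budget_ms(fps):.2f} ms per frame")
```

At 120 fps the whole frame has to fit in about 8.3 ms, which is why "just add another GPU" glosses over a lot of scheduling and latency work.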
 
DockPort isn't even designed for transmitting arbitrary data to a GPU; it's more like a replacement for DisplayPort, but for docks.
Thunderbolt and Light Peak are the same thing: Light Peak was an early codename for Thunderbolt.

I doubt that external GPUs will become popular on consoles; the cost, heat, and power issues will just be too great. Also, fragmenting the market is bad enough with other accessories (such as Kinect / EyeToy / etc.); fragmenting it on graphical capabilities is suicide.
 
But is it time we stopped fearing fragmentation? Aren't console gamers more tech savvy now, and more welcoming of buying and plugging things into their consoles to keep them relevant?

I don't think the eGPU box will be bulky: an MXM GPU on a mini PCB and a blower fan, no bigger than some power banks... no louder than a stock PS5.

A stock PS5 won't run games as a slideshow, but adding an eGPU will increase the fluidity... I bet that if TB had been available back then, Sony would have made their VR headset TB-powered!
 
But is it time we stopped fearing fragmentation? Aren't console gamers more tech savvy
Informed as a purchaser. I don't really think more tech savvy. Lots of people use technology but not many people understand it.
 
 
Informed as a purchaser. I don't really think more tech savvy. Lots of people use technology but not many people understand it.

Yeah, I don't follow that line of reasoning. Sure, PC gaming forums may have more technical questions about problems, but that reflects the diversity of the platform, not that PC owners are less tech savvy or, inversely, that console owners are more tech savvy.
 

Times are a lot different now.

A properly supported eGPU add-on for the current crowd would work differently. Imho.

Why have so much resistance to such potential? That resistance will potentially scare Sony off from doing it.

I would figure that, after the bad experience of PS3 + 1080p back then, a dual-GPU PS5 would fluidly play back content for the 4K/60/HDR crowd and please them.
 
Times are a lot different now.

A properly supported eGPU add-on for the current crowd would work differently. Imho.

Why have so much resistance to such potential? That resistance will potentially scare Sony off from doing it.

I would figure that, after the bad experience of PS3 + 1080p back then, a dual-GPU PS5 would fluidly play back content for the 4K/60/HDR crowd and please them.

I have no resistance to the idea. I'm one of the idiots that bought a Genesis, Sega CD and 32x.
 
Two interrelated problems with upgrades are the extent to which the new parts are held back by the old, and the degree of redundancy and duplication that they introduce, pushing up the total cost of delivering the new level of capability (in turn, hurting sales).

For example, the M-CD was a really cool piece of kit, but the second 68000 was often idle, and the quality of the visuals was held back by the ageing graphics chip in the MD/Genesis, which really needed to be able to handle more colours to let the new system shine.

With the 32X - an underrated device IMO - you had a combined system with two video encoder chips, three CPUs (+Z80), something like six memory pools, a sidelined 2D graphics chip (now only used for overlays and backgrounds), and two external power supplies. The cost of the MD and 32X combined was far higher than that of a similarly capable device made from scratch. This is not a good position to be in if you wish to grow your userbase beyond the base unit.

Where upgrades do work is when they complement a system, aren't held back by the existing system, and are cheap enough to be justifiable for a few good games. The examples I can think of are the memory expansion packs for the N64 and Saturn, and the DualShock for the PS1. They all added something, didn't introduce redundancy or a large amount of complexity, and were relatively affordable.

I miss the upgrade days because they were exciting, but they weren't necessarily the best days for the industry.
 
Well, an eGPU in concept doesn't have any redundancy. In modern threaded/jobbed software that scales across processing options, adding another GPU (or even CPU/APU) should appear as just more resources for the software to use, and should be available transparently to games if implemented right (although maybe that 'right' isn't realistically possible?).
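To illustrate that "just more resources" idea (everything here is invented for the sketch: the device names, the enumeration call, and the round-robin scheduler are not any real console API), a job system could simply spread render jobs across however many GPUs it finds at boot:

```python
# Hypothetical sketch: a job queue that transparently scales across
# however many GPU devices the system reports. Plugging in an eGPU
# would just add one more entry to the device list.
from itertools import cycle

def enumerate_gpus():
    # Invented stand-in for an OS device query; with an eGPU attached,
    # the list would simply grow by one.
    return ["internal_gpu", "egpu_0"]

def schedule(jobs, devices):
    """Round-robin jobs across all available devices."""
    assignment = {d: [] for d in devices}
    for job, dev in zip(jobs, cycle(devices)):
        assignment[dev].append(job)
    return assignment

jobs = [f"tile_{i}" for i in range(6)]
plan = schedule(jobs, enumerate_gpus())
# Each of the two devices ends up with three of the six tile jobs.
```

The game code never names a specific GPU; it just submits jobs, which is the transparency being argued for above.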

Expose settings in game for users to customise (maybe just simple low/med/high quality - we have this now on mobile games) and they can choose what level of pretties and framerate they want, with the option of paying more for better quality.
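A minimal sketch of such user-facing presets (the scale factors and fps targets are made up for illustration, not real targets):

```python
# Hypothetical low/med/high presets a game could expose when extra GPU
# power is available; the numbers are illustrative only.
PRESETS = {
    "low":    {"render_scale": 0.5,  "target_fps": 30},
    "medium": {"render_scale": 0.75, "target_fps": 60},
    "high":   {"render_scale": 1.0,  "target_fps": 60},
}

def resolution_for(preset: str, width: int = 3840, height: int = 2160):
    """Output resolution after applying the preset's render scale to 4K."""
    s = PRESETS[preset]["render_scale"]
    return int(width * s), int(height * s)

print(resolution_for("medium"))  # resolution scaled down from 4K
```

A user with a stock console picks "low" or "medium"; one who bought the eGPU flips to "high", which is exactly the pay-more-for-better-quality trade described above.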

Of course, that's all very PC, but at the same time the PC doesn't yet offer a console experience. If that changes, then the upgradeable console is kinda redundant, but then it may be all Sony have left to compete with MS.
 