The challenges, rewards, and realities of a two-tier console launch

At its core this is about segmenting the users of the same console, which at the end of the day is no longer really one console.
This segmentation has two drivers we did not have before:
1. Growth: the audience now spans multiple generations of people.
2. Divided expectations: some people think 4K is not worth it, while others buy into it.

Targeting all of this with a single console that lasts more than 5 years is not optimal.
PCs address this much better, allowing the market to regulate itself.
If we get this freedom on consoles too, everyone can get more precisely what they want.

Tech enthusiasts like us are worried about lower specs holding things back. I feel the same. But it can be solved by allowing devs to target sub-tiers: a game could be available for X, Lockhart and Anaconda, but no longer for the Xbox One.
If we get this freedom, we get the same self-regulation and constant lift in specs that we have on PC, while keeping the instant-on convenience and comfort that consoles offer.
We lose nothing. In the worst case it takes a bit longer until cutting-edge stuff shows up, but in return we don't need to wait years for the next generation in the hope that games stop looking like shit after that abrupt change.
 
If that's the case, consoles will really have become PCs, just with no option to upgrade them yourself. Like buying a new PC every year, or every month, if they release new consoles monthly/yearly.
 
Premium console at launch vs. premium after 2-3 years: I don't think that 100-200€ extra would give the same performance boost as 2-3 years of tech development.

I'd rather take a 500€ box at launch and a truly more powerful one after 3 years.
This is exactly what makes the Xbox One X so great. Compare it to the original 2013 launch system in size and shape alone, not to mention the engineering and analysis that went into making the system hit its performance targets. Imagine if they had both launched at the same time: would it be larger than the original? Would it have a vapor chamber? Would it still have ESRAM and Kinect support? It certainly would have lacked functional 4K, VRR, and HDR support, because the HDMI standard lacked those things at the time, and 4K Blu-ray wasn't out yet either. I don't think there is a valid argument that a $700 premium Xbox One in 2013 would be better than the X's 2017 launch. And the same is true for the PS4 Pro, minus the size comparison and 4K Blu-ray (the released system doesn't support it), but plus an extra USB port for VR and HDMI ports that don't fall apart.
 
Let's say new consoles come out every three years, and you can skip every second one and still play every new game.
You would still buy a new console every 6 years, but overall progress moves on more continuously.
Consoles no longer hold back PC, and they are not more expensive either. All great.

But that may be wishful thinking. If they announce that all Anaconda games will and must have a Lockhart version too (as is the case now with the X and Pro), the situation is very different.
 
Sounds great, yes. But 3-4 consoles per generation from each manufacturer seems a bit messy. Or maybe no generations at all anymore? They'd become too much like PCs then; Sony and MS might as well stop selling hardware and just sell their OS or something that everyone can install on their own hardware. Total freedom then.
 
I'm still holding onto the idea that Anaconda may be configurable with 16 cores for the cloud, so it can host 2 Lockhart instances and several X1 instances. Gotta pack in that density!
 
4K TVs today are very good at scaling 1080p content, and I'd guess most people are gaming on 40-55 inch TVs while sitting 3-4 meters away.
If MS releases Lockhart and Anaconda with only a resolution difference, they will be pitched against each other under these conditions, and I believe the perceived difference will be negligible. How are they going to convince buyers to get a, let's say, $200 more expensive Anaconda?
(I will always buy the most expensive option of the console brand I want, but that's me, not the majority of the market, I think.)
 
People who sit 4 meters away from a 40" TV probably won't notice the difference between 720p and 1080p, let alone 1080p and above.
That might be a realistic scenario for people casually watching soap operas while doing something else, but it's not very realistic for gaming.

If I had a 40" TV I'd probably place myself no farther than 2m.
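To put rough numbers on this, here's a quick back-of-the-envelope sketch. It assumes the commonly cited ~60 pixels per degree for 20/20 vision; the function name and figures below are illustrative, not from any standard:

```python
import math

def max_useful_pixels(diagonal_inches, distance_m, aspect=(16, 9),
                      pixels_per_degree=60):
    """Rough count of horizontal pixels a viewer with ~20/20 acuity
    (about 60 pixels per degree) can still resolve at this distance."""
    w, h = aspect
    width_m = diagonal_inches * 0.0254 * w / math.hypot(w, h)
    # Horizontal angle the screen subtends, in degrees.
    angle_deg = 2 * math.degrees(math.atan(width_m / (2 * distance_m)))
    return angle_deg * pixels_per_degree

print(round(max_useful_pixels(40, 4.0)))  # ~760 -> even 720p vs 1080p blurs together
print(round(max_useful_pixels(40, 2.0)))  # ~1500 -> 1080p helps, 4K is still overkill
```

By that estimate, a 40" screen at 4 m only subtends enough angle for roughly 760 horizontal pixels, which lines up with the claim above.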
 
Whoa. I’ve got a 65” and that is my viewing distance. 40”? I’d say like 1.5m max
 
Are there even 40" TVs on the market nowadays?
I see the small and very cheap 32" FHD models that people put in the kitchen, and then there's 43" and up.
 
I believe so; it's the smallest(?) form factor for 4K. It's hard to find new 1080p screens, at least from any major TV brand. Monitor setups are still quite common though.
 
There are lots of kids / teens who play games in their bedroom. Hell, as a kid that was the only place I was allowed to.

Outside of the living room smaller displays are still quite common.
 
I don't know how old you are, but back when I was a kid the average age of console gamers was probably around 20 years old, whereas right now it should be closer to 35.

The point is, 20 years ago when I had a Dreamcast in my room as a teenager, it wasn't usual to have a gaming console in the living room.
Nowadays I'm guessing that most households with home consoles have their console connected to the bigger screen, because the parents are now the main users.

I have no doubts there are kids and teens who play console videogames in their bedroom, but I don't think it's the majority of households, or even the majority of kids/teens who play videogames.
 
You're thinking mobile gaming and tablets are a bigger draw for younger audiences, and consoles for 20+ adults?
 
Not 20+ but more like 13+ for consoles, but yes that's what I'm seeing nowadays.
 
People fear the unknown. The fear that somehow it’s going to prematurely bottleneck the whole generation. The fear that everyone will cater to the shitty low-spec box because everyone will buy that one instead.

I think that's a big part of it. And likely why I see far fewer long-time PC gamers being concerned than long-time console players.

Especially those that were more focused on PC gaming, versus Amiga, Atari, or Apple home computer gamers.

While the x86 PC gaming space started at a much lower bar than its Amiga, Atari, and Apple counterparts, it progressed significantly faster as the market for PC parts exploded in the 90's. That allowed for extremely rapid innovation WRT hardware, something you couldn't necessarily do with locked and proprietary systems, especially at the volumes home computing saw during the 80's and 90's.

Microsoft releasing Windows 95 and later DirectX (allowing hundreds of thousands of differing combinations of hardware to generally play nice together in games) was key in allowing gaming on PC to truly explode.

Combine that with the introduction of 3D hardware that was affordable to PC gamers, and developers had to rapidly develop the ability to scale their games across a wide range of hardware with vastly different performance characteristics. And not just that, but wildly different amounts of memory.
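For anyone who didn't live through it, that scaling usually boiled down to probing the hardware and choosing asset budgets to match. A minimal hypothetical sketch of the idea; every name and threshold below is invented for illustration, not taken from any real engine:

```python
from dataclasses import dataclass

@dataclass
class QualityTier:
    name: str
    texture_pool_mb: int
    geometry_lod_bias: int  # higher = coarser models

# (VRAM ceiling in MB, tier to use) -- thresholds are made up.
TIERS = [
    (512,  QualityTier("low",    128, 2)),
    (2048, QualityTier("medium", 512, 1)),
    (8192, QualityTier("high",  2048, 0)),
]

def pick_tier(vram_mb: int) -> QualityTier:
    """Pick the first tier whose VRAM ceiling covers the detected card."""
    for ceiling, tier in TIERS:
        if vram_mb <= ceiling:
            return tier
    return QualityTier("ultra", 4096, 0)

print(pick_tier(1024).name)  # medium
```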

So, for those of us that lived PC gaming from the 80's and 90's through to modern gaming, it's extremely odd to see people so concerned about developers' ability to push the envelope WRT the latest hardware while still enabling scaling to years-old hardware.

But thinking about it, the move to focusing on console development first has seen a relatively massive regression in developers pushing the envelope with the latest and greatest hardware. Until recently, console hardware featured only one hardware configuration per console manufacturer. Ignoring Nintendo, that meant just 2 hardware configurations to develop for, and then ports to PC, which are sometimes handled by a different development house.

This generation has seen developers grapple with how to approach 4 different hardware configurations (for multiplatform developers) and some have done well while others have struggled to scale well to all current console platforms (again ignoring Nintendo for the moment).

This is something that would have been unthinkable back around the time when Microsoft first launched the Xbox. Non-console developers' livelihoods depended on their ability to scale across hardware while simultaneously pushing it.

But something that has shone through is that those few developers that have embraced the challenge of scaling their games across hardware while pushing what is possible (iD, The Coalition, Turn 10 and Playground Games really stand out) have not only pushed what is possible with both the base consoles and their mid-gen refreshes, but also pushed even further on PC; more importantly, their engines scale back farther, with better performance, than most other developers'.

I guess, in that sense the fears might be justified to an extent. Not all developers are as capable as iD or The Coalition at pushing what is possible while simultaneously scaling back to very old and outdated hardware.

Regards,
SB
 
I was in my 20's when I had my Dreamcast. Damn, I'm an old bugger!

Most of the folks with kids I know have a console under the main telly, but a lot have a system in the kids' room so the adults can use the big telly. Often the newest and best console is the one under the main telly, whereas older and cheaper systems migrate upstairs or into spare rooms (where they're lucky enough to have them). So Xbox 360, X1 and PS4 standard.

That's one of the reasons I think a cheaper machine that can still share the same digital library might have value. If you want to get the kids / teens / live-at-home 20-somethings out of the way, or you can't justify a $500-600 box for only some of the users of the TV, go for that.

I assume that via HDMI, MS and Sony will have been able to get feedback on connected devices ...? If so, it would be interesting to know just what the spread of display devices is.
 
Right, so this idea's popped up in the next generation hardware thread and a couple of others, and it's a very straightforward one: launch the base and high end consoles at the start of the generation.

I'm of the opinion that this should happen, because I'm not convinced that 3 or 4 years brought much to the table for this generation's mid-gen consoles. At least, not much that couldn't have been solved by a better cooling solution. 3 or 4 years of developer experience is more valuable IMO.

There are arguments for and against, from all kinds of perspectives, and I'd like to read some.

Personally, I'd like to see a base console at $350 for quick mass market appeal, and a $700 beast. I think it targets the broadest possible range of players, with streaming taking care of the bottom end.

Everyone else?

Lockhart holds back next-gen development; multiplatform games on Scarlett won't look as good as PS5 exclusives. If Microsoft really goes through with Lockhart and demands that the scrap box has to be supported, then we get 720p, 30 FPS and mud textures, because otherwise you have no chance against PS5 exclusives. Whoever came up with that has gone mad. On the disaster scale it would rank above Always Online and the Xbox One Kinect bundling.

RAM is not easy to scale. The min-spec has a huge impact on level of detail and gameplay, and even 16 GB is laughably little for next-gen. If Microsoft really goes through with Lockhart, we get practically no progress for multiplatform games. The only things that can easily be scaled down are the pools for textures and geometry, and those are only maybe half of the memory. The headroom that lets me avoid cutting corners shrinks to almost zero when Lockhart has to be supported. That's why I say: then it's 720p and lowest detail for the scrap heap, and whoever buys it has only themselves to blame. As if the Xbox One wasn't lesson enough.

There is no easy RAM scalability. It does not exist. If Lockhart has to be supported, cuts have to be made for all the other consoles, and massive ones. Nobody gets special treatment in content production anymore; those times are long gone.
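To make the memory arithmetic in that post concrete, here's a back-of-the-envelope sketch; every figure is an assumption for illustration, not a leaked spec:

```python
# Illustrative numbers only: assume half of a 16 GB console is "fixed"
# cost (OS, code, game state, render targets) that can't shrink with
# detail settings, and the rest is scalable texture/geometry pools.
total_ram_gb = 16.0
fixed_gb     = 8.0
pool_gb      = total_ram_gb - fixed_gb  # 8.0 GB of scalable assets

for cheaper_sku_gb in (12.0, 8.0):
    # Every gigabyte the cheaper SKU removes comes straight out of the pools.
    remaining_pool = cheaper_sku_gb - fixed_gb
    print(f"{cheaper_sku_gb} GB total -> {remaining_pool} GB left for assets "
          f"(vs {pool_gb} GB)")
# 12.0 GB total -> 4.0 GB left for assets (vs 8.0 GB)
# 8.0 GB total -> 0.0 GB left for assets (vs 8.0 GB)
```

Under those assumptions a near-same-RAM cheaper box is fine, but a half-RAM one really does squeeze the asset pools toward zero, which is the scenario the post is worried about.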
 
I don't think MS would have a lower-end SKU if it somehow directly compromised the main machine's ability to deliver competitive performance against the PS5. Given that a jump from 1080p to 4K generally requires a 4x jump in GPU capability on all fronts to guarantee the same frame rates, I don't think Lockhart would be anything less than half as capable as Scarlett, in order to minimize any compromises going from targeted 4K down to targeted 1080p. Plus, it would be weird to have a "next-gen machine" with fewer GFLOPS than the XboneX, regardless of RDNA2 efficiencies vs GCN and different resolution targets. It would also be readily capable of running XboneX titles at their native res for BC.
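A quick check on the resolution arithmetic behind that estimate, since it's just pixel counting:

```python
pixels_4k    = 3840 * 2160  # 8,294,400
pixels_1080p = 1920 * 1080  # 2,073,600

# The "4x jump" needed to hold frame rates going from 1080p to 4K:
print(pixels_4k / pixels_1080p)  # 4.0

# A half-as-capable box targeting 1080p still has 2x the per-pixel
# budget of the full box targeting 4K:
print((0.5 / pixels_1080p) / (1.0 / pixels_4k))  # 2.0
```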
 
We have the two-die version:
- One binned SoC to rule them all with power
- One faulty SoC to find them with the right performance/price ratio

But the final purpose is One Pay Service to bring them all and in the Live bind them.

The binning strategy is nice, but it's pricey and can't react to the market response.

But what if I told you that you could have both the smaller cheap SoC and the huge SoC, and still recycle the faulty huge SoCs into the cheap series?

No waste + flexibility.
Oh my Gooooooooooooooooooooooooooood!
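Joking aside, the salvage economics are easy to sketch. A toy model, with every yield and per-wafer figure invented for illustration:

```python
big_dies_per_wafer = 100   # assumed, not a real foundry number
full_yield   = 0.60        # dies good enough for the premium SKU
salvage_rate = 0.25        # partly faulty dies, usable with units fused off

premium  = round(big_dies_per_wafer * full_yield)    # 60 premium chips
salvaged = round(big_dies_per_wafer * salvage_rate)  # 25 cheap-SKU chips
scrapped = big_dies_per_wafer - premium - salvaged   # 15 true waste

print(premium, salvaged, scrapped)  # 60 25 15
```

With binning alone, cheap-SKU supply is capped by how many big dies fail "just right"; adding a dedicated small die decouples that volume from big-die yield, while the salvaged big dies still top up supply for free. That's the no-waste-plus-flexibility point.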

 