Sony PS6, Microsoft neXt Series - 10th gen console speculation [2020]

Microsoft is losing something like 100 to 200 USD on each Xbox - I think people in this thread should also consider that factoid when discussing the possibility of PS6 and XSNext. The cost of console gaming is going up too.
Any chance you've heard anything WRT how much Sony loses on each PS5?
 
They don't lose money on the 500 dollar / 550 euro model, but they do lose money on the digital model. They make that back fast on the digital model through game sales and/or PSN subscriptions, though. And it took them about 10 months for the disc model to reach profitability.


I am sure we will have at least one more generation on Sony's side. Higher clocks and fewer CUs are better for the cost of the console. The Xbox Series X is probably very expensive to make, and the same goes for the Xbox Series S.
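
To put very rough numbers on the "make it back fast" part, here's a back-of-the-envelope sketch in Python. Every figure in it (per-unit hardware loss, platform cut per game, attach rate, subscription margin) is a made-up assumption for illustration only, not an actual Sony or Microsoft number.

# All figures below are hypothetical, purely for illustration.
hardware_loss_per_unit = 100.0      # assumed loss on each digital-edition console (USD)
platform_cut_per_game = 20.0        # assumed platform cut per full-price game sold (USD)
games_per_console_per_year = 5      # assumed software attach rate
psn_margin_per_month = 5.0          # assumed net margin from a PSN subscription (USD)

monthly_income = games_per_console_per_year * platform_cut_per_game / 12 + psn_margin_per_month
months_to_break_even = hardware_loss_per_unit / monthly_income
print(f"Months for one console to recoup its hardware loss: {months_to_break_even:.1f}")
# With these made-up numbers, roughly 7.5 months.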
 
I wonder why Sony isn't losing anything on the disc model and is maybe losing a hundred on the digital one, whereas MS is losing more. It doesn't make much sense at first glance.
 
The Xbox Series X has the bigger APU, and they probably lose 100 dollars on the Xbox Series S. They also have two very different console models, which is not ideal from a manufacturing point of view. When the rumoured new PS5 with an optional Blu-ray drive releases, things will look even better on Sony's side.
 
As we've seen throughout the years, graphics in games are just another form of marketing - from "shots" on the back of boxes to advertisements in magazines and now reviews and other factors. Graphics nowadays have minimal impact on gameplay, yet they carry significantly more clout WRT marketing.

I remember when PC screenshots were used to market console versions of games. That was fairly common up until the point where consoles became the primary dev. target and even then it's not uncommon to see PC footage used in advertisements for the console versions of games.

To put it another way, why do companies spend hundreds of millions of dollars on advertising and brand image? For the same reason that publishers prefer to sell their games based on screenshots rather than videos: screenshots don't generally show the rendering artifacts that are far more noticeable when a game is in motion (especially temporal artifacts). In both cases you're investing money to raise consumer awareness that your product is the best for the target market because of X thing. In the case of games, unfortunately (IMO), that tends to be dominated by discussions about graphics rather than the far more important gameplay (again, IMO).

For games, graphics have historically been a large selling point, regardless of whether that level of graphics exists only on platform X and not on platforms Y or Z - or, in the case of PC, on hardware combination A000001 but not on combinations A000002, A000003 ... AXXXXXX.

Graphics, thankfully, aren't the only reason people buy games, but WRT a AAA publisher or AAA developer marketing their game, it's likely by far the largest marketable point. And thus, unfortunately to the detriment of AAA games, publishers and developers with that level of budget will quite happily sacrifice gameplay in order to claim the best graphics, or at least competitive graphics if they can't have the best.

Basically, for AAA developers, graphics set the stage for both initial gamer interest and a game's potential pool of buyers. The worse the graphics, the fewer people will pre-order it and the fewer people will consider forking over 60-70 USD for the game.

So, if a AAA developer can distance themselves from other AAA developers by having noticeably better or more pleasing graphics, they'll in turn be rewarded with greater consumer interest, which leads to more pre-orders and, as long as the gameplay isn't complete dog shite (pardon my language), increased lifetime sales.

NOTE - I'm not saying graphics are the ONLY reason people buy games, but prior to a game coming out, other than developer/IP reputation and name recognition, graphics are by far the most important thing for a AAA game to have.

BTW - unlike those large investments in marketing, targeting the best hardware will generally mean a better-looking game all the way down the hardware chain, as long as the developers are at least relatively competent at making a scalable engine.

Or think of it another way. As a AAA developer you'll most likely need to sell well on PC in addition to selling well on PS and XB, and for some publishers you even need to sell well on NSW in addition to PC, PS and XB. You can either half-ass it on PC and get X level of sales, or you can treat it like a proper platform and get a greater level of sales. Basically, as a developer, do you want to be lazy and leave money on the table, or not?

Regards,
SB
Yeah, but obviously it works the way they are making games now: make a game that runs well on the lowest common denominator and let the more powerful hardware push higher. The difference is enough to notice, but not enough to make people feel they're losing too much if they don't own a super expensive rig.
How would the market respond if they made games designed to push, let's say, an RTX 4090 to the max? They advertise those fantastic immersive worlds, outstanding AI and physics for those who own a high-end PC, and then the game needs significant compromises in its settings to run on most PCs, to the point that the experience is barely there for the majority. Which is what happened to Crysis.
Consumers will trust games less when they play significantly worse than what was advertised.

The days you're referring to, I suppose, were the 32-bit and 128-bit eras. Our internet was slow, videos were highly compressed, images were small, and visuals were so imperfect that our brains filled in the gaps and our CRTs "corrected" visual impurities. Back then companies made a lot of outlandish claims about their games or their hardware, and the backs of the boxes often had CG cutscenes too. We didn't care - those things fed our excited gaming minds and we wanted to believe them. The industry was still young. Today, outlandish claims go through huge criticism, such ads would be ridiculed and torn apart, and gamers get a lot more information. We measure every pixel and frame. We didn't care as much back then; a lot of images were bullshots even for PCs.

Remember how people tore apart Watch Dogs or Cyberpunk 2077 because they didn't live up to the expectations set by the original marketing material.

If what you say were true, developers would be doing it now. Nothing really stops them - except how finances and the market work.
 
So, if a AAA developer can distance themselves from other AAA developers by having noticeably better or more pleasing graphics, they'll in turn be rewarded with greater consumer interest, which leads to more pre-orders and, as long as the gameplay isn't complete dog shite (pardon my language), increased lifetime sales.
That's the theory, the halo effect. However, if that's the case why aren't publishers doing this already? Are they all missing an obvious trick, or do their own numbers show $10 million in marketing and a CGI trailer is better marketing than creating a high-end graphics pipeline? ¯\_(ツ)_/¯
 
I speculate that consoles are dying, and will be replaced by generic PCs that anyone can either buy pre-built from Sony or MS, or build themselves.
Reminds me of the very rational 'PC is dying' arguments. Or the 'Nintendo should go 3rd party' arguments too. Common sense has a tendency to not play out how we'd expect.

I imagine when consoles do die, they'll be replaced with streaming rather than PC. At the moment consoles aren't dying; Sony won't stop making PS consoles until they can no longer make (good) money from it, which doesn't look to me to be next gen. So I guess, unless you can point to PS6 being canned and Sony putting out a PC instead, this line of discussion doesn't fit this particular thread. ;)
 
Already covered earlier this year: https://forum.beyond3d.com/threads/new-patent-by-mark-cerny-on-improving-ray-tracing.62768/

As for meaningfulness, let's just take a moment to remind ourselves of the Sony Patents that never came to pass, like Photon Mapping. Indeed, revisit some of the old discussions that presented such good arguments for Magic Hardware in PS5 that in the end were total bunk:

Photon Mapping:

Touch Screen Controller:

I'd say following precedent of 20 years of fancy, exciting patents that don't materialise, ignore all patents at this point other than for theoretical discussion, and only talk about what's presented in real hardware tear-downs or reveals.
 
Or better, wait for a hypothetical PS5 Pro and see if Sony is a little more talkative about the features of the console.
 
I rather agree with you that we should be careful, but interestingly those 2 patents were not filed by Mark Cerny.

Have we ever seen one of his patents that was not eventually used (at least supposedly) in a PlayStation (4, 5) or VR device?
 
Mark Cerny's patent for doing ray tracing... it looks like an Intel/Nvidia-style RT core. The PS5 Pro should be powerful.

We can hope, for Sony's sake, that they're not trying their own RT implementation. While they might be able to come up with something very potent, the risk is that it won't compete with whatever AMD (if they ever get there) or Intel/NV can provide to them. Then there's also compatibility and cross-platform development to consider.

Anyway, patents have rarely had any meaningful impact on future products to begin with, not forgetting that something engineered now might end up performing drastically differently in the future.
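
For anyone wondering what an "Intel/Nvidia-like RT core" versus AMD's current approach actually means in practice, here's a minimal BVH traversal sketch in Python. It's purely illustrative and not any vendor's actual implementation: on RDNA2/3 the ray-box and ray-triangle intersection tests run on fixed-function ray accelerators, while the traversal loop itself (stack handling, deciding which children to visit) runs as shader code, whereas dedicated RT cores move that loop into hardware as well.

from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Node:
    bbox_min: Tuple[float, float, float]   # axis-aligned bounding box of this node
    bbox_max: Tuple[float, float, float]
    children: List["Node"] = field(default_factory=list)   # inner node: child nodes
    triangles: List[int] = field(default_factory=list)     # leaf node: triangle indices

def ray_hits_box(origin, inv_dir, bb_min, bb_max):
    # Slab test: the kind of intersection test RDNA2/3 already runs in fixed-function hardware.
    tmin, tmax = 0.0, float("inf")
    for o, inv_d, lo, hi in zip(origin, inv_dir, bb_min, bb_max):
        t0, t1 = (lo - o) * inv_d, (hi - o) * inv_d
        tmin, tmax = max(tmin, min(t0, t1)), min(tmax, max(t0, t1))
    return tmin <= tmax

def traverse(root, origin, inv_dir):
    # Stack-based traversal: shader code on RDNA2/3, dedicated hardware in an RT core.
    candidates, stack = [], [root]
    while stack:
        node = stack.pop()
        if not ray_hits_box(origin, inv_dir, node.bbox_min, node.bbox_max):
            continue
        if node.children:
            stack.extend(node.children)        # keep descending the hierarchy
        else:
            candidates.extend(node.triangles)  # leaf: candidate triangles to test next
    return candidates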
 
I highly doubt this is very different from AMD's implementation; maybe they asked for a little customisation if they use it in a future console. The customisations they have asked for before, from volatile bits to the ID buffer, cache scrubbers or flexible scale rasterization, weren't huge.
 
Well, I sure hope it is different from AMD's current implementation, both regarding RT and ML.
That is, IF Sony decides to try engineering their own RT hardware. Actually, they might have to, seeing AMD's current developments.
 
Like I said, it's probably a minor modification of what AMD offers with RDNA 3, when you read the patent and look at the RDNA3 presentation.
 
RDNA3 still seems to have a large uplift in ray tracing performance over RDNA2. I am not sure AMD would want to integrate something into RDNA that they themselves don't control. It would seem counterproductive for them as a whole when they can instead continue to focus on increasing the performance of their solution or creating a new one.
 
Someday they'll have to implement something akin to Intel and Nvidia. They can't lag behind forever. The DF team suspects RDNA4 might be it; RDNA3 isn't really there yet. And perhaps that's a good thing - better late and good than quick and bad.
 
They may find another way. Who knows - we'll just have to wait.
 