Xbox Series S [XBSS] (Lockhart) General Rumors and Speculation *spawn*

Status
Not open for further replies.
I disagree, especially with no optical drive. I think this should cost reduce quite nicely over time.

I don't see it going much lower than 249 in the third year and maybe 199 in the fourth/fifth.

I'm still genuinely surprised at how many people saw $299 as a surprise.
Given the leaked info about the Series S I expected that price from a mile away, and even went on record saying it would be dead in the water if it sold for anything above that.
 
Come on. Prior to RT cores, CPUs were fairly competent at RT when compared
to GPUs, as I understand it.
I wouldn't be surprised if building and updating the BVH structure also ends up with a CPU component on consoles and DXR (RDNA2), since as far as we know none of them have specific hardware support for it.

Is there an efficient way to do it in shaders?
If it can be done on the Jaguar cores, Zen should plow through it.
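To make that concrete, here's a tiny toy sketch of the kind of top-down BVH build work a CPU would be doing (all names here are illustrative, nothing like a real DXR or console API). The point is that the cost tracks primitive count, not render resolution:

```python
# Toy top-down median-split BVH build. Illustrative only: real builders
# use SAH, parallel sorting, etc., but the shape of the work is the same.

def build_bvh(prims, depth=0):
    """prims: list of (centroid, payload) tuples; returns a nested node dict."""
    if len(prims) <= 2 or depth > 32:
        return {"leaf": True, "prims": prims}
    axis = depth % 3                                  # cycle split axis x/y/z
    prims = sorted(prims, key=lambda p: p[0][axis])   # median split
    mid = len(prims) // 2
    return {
        "leaf": False,
        "left": build_bvh(prims[:mid], depth + 1),
        "right": build_bvh(prims[mid:], depth + 1),
    }

def count_leaves(node):
    if node["leaf"]:
        return 1
    return count_leaves(node["left"]) + count_leaves(node["right"])
```

Work like the sort and recursion above scales with scene geometry, which is why a Zen CPU with spare threads is a plausible place for it.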
 
So it largely comes down to the type of ray tracing you are doing, as I understand it.
If the rays are coherent, the architecture of a GPU is ideal for the processing (all the rays will hit the same local area, so they can be processed very effectively by a CU/SM grouping).

If the rays are bouncing everywhere and extremely incoherent, those random hit patterns will penalize the GPU setup. So if you are doing secondary bounces and all the interesting things with ray tracing, CPUs were traditionally good at it vs GPUs until RT cores started being introduced.
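A toy way to see why incoherence hurts: model the BVH node fetches as addresses going through a tiny direct-mapped cache. Neighbouring (coherent) rays touch the same nodes back-to-back; incoherent bounces touch nodes scattered across memory. Sketch only, with made-up sizes:

```python
import random

def cache_hits(accesses, cache_lines=64, line_size=8):
    """Tiny direct-mapped cache model: counts hits for a list of node indices.
    Sizes are arbitrary; only the relative behaviour matters."""
    tags = [None] * cache_lines
    hits = 0
    for addr in accesses:
        tag = addr // line_size
        line = tag % cache_lines
        if tags[line] == tag:
            hits += 1
        else:
            tags[line] = tag   # evict whatever was there
    return hits

random.seed(0)
coherent = [i // 4 for i in range(4096)]               # neighbouring rays walk the same nodes
incoherent = [random.randrange(100_000) for _ in range(4096)]  # bounces go everywhere
# coherent traffic hits the cache almost every time; scattered traffic almost never
```

The same intuition applies to SIMD lane utilisation: coherent rays keep a wavefront doing the same work, incoherent ones make lanes diverge.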

There are others who can chime in, but I also understand that there were large memory limitations (hur hur, similar issues we face in ML). So particular cards (3000 series) would be insufficient for movie-level RT rendering: there's just not enough memory to perform the task when the scene's working set exceeds the card's limit. And the professional market is where RT has traditionally been done.

I think the game market is different and has very different requirements, which is why we're seeing a surge here.

I do believe that what UE5 did with PS5 is a very appealing setup that when extrapolated to a bigger configuration could be very effective in particular industries.
 

Look at what costs money in the design. The memory, flash and SoC will all get cheaper within the first few years. I don't actually think your timetable is that far off, though. Maybe too pessimistic by a year all around.
 
This idea that it first had to launch cheaper than the One S, and now also has to get cheaper quickly, is bizarre to me.

In the UK it's launching at £250.
Current pricing:
Xbox One S £250
Xbox One S All-Digital £200

I'd argue it's already hit the lower end price point at launch.

During sales next year it will be around and under £200

These are already mass-market prices, and during sales it's in impulse territory.
 
I kind of wonder if it would be a win if Microsoft were to put in big HUGE letters on the box.
Some folks will want the drive, and Microsoft probably don't want that cost gulf with only the drive as the differentiator. Whilst the difference on a 1080p TV may not be massive between X and S (and we don't know yet whether the RT solution may be scaled back), you want to feel like you're getting better visuals. Equally, you seed doubt in the minds of people who may upgrade to 4K in the near future.

I'm glad somebody finally had the balls to price and date their consoles, well done Microsoft! :yes:

I was expecting to at least be able to register interest in pre-ordering the Series X today on Amazon, but it's not even listed. PS5 has had a fully featured page along with all the announced peripherals since the reveal. I wonder if Microsoft are shunning Amazon in the UK.
 
Pretty much all implementations of RT in games so far have been completely tied to the frame buffer's resolution. Most effects spawn rays for every render-target pixel, or every other one, or 1/4, etc. That means those games' RT performance cost will be reduced "automatically" in proportion with resolution reductions*, in direct opposition to your assumption that, just because rays are traced in world space, they bear no correlation to render resolution. Your point was simply wrong. Not just wrong: you were claiming EXACTLY the OPPOSITE of what is actually a fact. And snarkily so.

*the exception being acceleration structure build. That will have the same cost irrespective of resolution, geometry density is what matters in this case. If most of that is baked and streamed from storage, the cost will be low. If it's dynamic, it could pose a problem.

**Ray tracing probes distributed in world space are resolution independent, but that is also not an approach used by any game so far.
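The screen-space-spawned part of that cost is easy to quantify. A rough sketch, assuming rays are spawned per render-target pixel as described above:

```python
def rays_per_frame(width, height, rays_per_pixel=1.0):
    # Rays spawned from the render target scale directly with resolution.
    return int(width * height * rays_per_pixel)

native_4k = rays_per_frame(3840, 2160)
p1080 = rays_per_frame(1920, 1080)
quarter_rate_4k = rays_per_frame(3840, 2160, 0.25)   # e.g. one ray per 4 pixels
# Dropping from 4K to 1080p cuts the ray budget to a quarter, while an
# acceleration-structure build (a function of scene geometry, not pixels)
# costs exactly the same at either resolution.
```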

Thanks for this.
 
It's not designed for the hardcore gamer. It's for the casual. The hardcore will want the Series X.

Tommy McClain

Sure. But what constitutes a hardcore gamer? Is it simply a higher income bracket than others? Can't there be cheap hardcore gamers?

FYI AzBat, I'm not being super serious with these questions. I just like hearing console gamers' opinions on such matters/topics. :)
 
Too late at night for that shit. ;) I'm going to bed, I'm going to be worthless in 4 hours. LOL

Tommy McClain
 
True, but you started this by being rude; you didn't even answer my questions, and instead belittled me for no good reason (and completely incorrectly!).

I'm sorry, but I don't feel like I was rude to you or belittled you. I answered your question by mentioning that rays happen in world space, which is still correct. I asked you to provide a reason why you thought it was resolution dependent (as you mentioned, this is a technical forum to discuss these things). Function provided an answer on your behalf that I initially didn't fully understand, as I interpreted it as just reducing the number of rays.

Since rays are, in current implementations, kicked off from the camera viewpoint rather than from actual light sources (effectively inverting the natural direction of light, unless we are Cyclops, haha), as that is the cheaper way of doing it, resolution does have an impact. Nevertheless, the geometry in the scene will arguably have as much if not more impact, since the rays will bounce on it: the more complex the geometry, the harder the ray tracing. As Milk mentioned, that part is a fixed cost you cannot avoid, and it does not scale with resolution because it lives in world space. Which still points to potential differences between XBSS and XBSX beyond simply resolution, which was my initial, slightly flawed argument!

Yes, I was wrong, because under current methods rays are initially cast from the camera, which does bring resolution into play. As per Milk's second footnote (**), ray tracing can be done independently of screen resolution; it's just unlikely at the performance current hardware has. That would be the "real" ray tracing I was thinking of, but we are not there yet.
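For what that resolution-independent flavour could look like: with a world-space probe grid (in the spirit of that second footnote; the grid dimensions and per-probe ray counts below are invented for illustration), the ray budget is set by probe density alone.

```python
def probe_ray_budget(grid_dims, rays_per_probe):
    # Probes live in world space, so this count is fixed per frame
    # whether the image is presented at 1080p or at 4K.
    nx, ny, nz = grid_dims
    return nx * ny * nz * rays_per_probe

# Hypothetical probe volume: 32x16x32 probes, 64 rays each.
budget = probe_ray_budget((32, 16, 32), 64)
```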
 
I want to see next-gen installation sizes. On Lockhart, with 1080p assets and less need for duplicated data, I hope 512GB will be usable.
 
Of course that's an option if you're a patient person and don't mind swapping huge triple-A titles that are getting bigger (100GB+), not smaller. More power to those folks...

You would be free to use an SSD and actually gain performance from it; even for Xbox One games, asset loading built around the Jaguar cores will be much less of an issue on the Zen CPU. It could make external USB games fly.

Copying 100GB should also be pretty speedy over full-speed USB 3 with an SSD.

It will be totally different to the current external game transfer experience.
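Back-of-envelope numbers for that copy (the throughputs below are assumed ballpark sustained rates, not measurements of any particular drive):

```python
def copy_time_minutes(size_gb, throughput_mb_s):
    # Treats 1 GB as 1000 MB to keep the arithmetic simple.
    return size_gb * 1000 / throughput_mb_s / 60

hdd_minutes = copy_time_minutes(100, 120)   # assumed external 2.5" HDD rate
ssd_minutes = copy_time_minutes(100, 450)   # assumed SATA SSD behind USB 3.0 (5Gbps)
# The SSD turns a roughly 14-minute shuffle into under 4 minutes.
```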
 

I want to see the cross gen install sizes as well. That's what will make up the bulk of the next 18 months. If developers don't modify their workflow/packing/streaming systems then sizes will get bigger for the shorter term.
 
Are the rumored specs confirmed, BTW? 4 TFLOPs GPU and 10GB of RAM?

All I saw so far was 512GB SSD and the generic RDNA2 + HDMI 2.1 features.




Ouch, that price-per-GB already starts at over twice what we'd pay for an add-in NVMe SSD with similar performance.
Does this mean the 512GB in the XSS isn't expandable?
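For the price-per-GB comparison, a trivial helper; the dollar figures below are purely illustrative placeholders, not confirmed pricing for either product:

```python
def price_per_gb(price_usd, capacity_gb):
    return price_usd / capacity_gb

# Placeholder numbers for illustration only, not real prices:
expansion_card = price_per_gb(220, 1000)   # hypothetical 1TB expansion card
retail_nvme    = price_per_gb(100, 1000)   # hypothetical 1TB retail NVMe drive
ratio = expansion_card / retail_nvme       # the "over twice the price" gap
```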

To be precise, I don't think this part of the comment is right, but the rest is true. Ray tracing is dependent on resolution and on geometry complexity.

EDIT:
This is how they do ray tracing in GT7: they trace at 768p and checkerboard the reflections to 1080p, and that lets them have ray-traced reflections at native 4K 60fps.

Fewer pixels means cheaper ray tracing.
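The saving is easy to put a number on. Assuming a 1366x768 trace target for "768p" (the exact width is my assumption):

```python
def pixels(w, h):
    return w * h

rt_target = pixels(1366, 768)   # assumed 768p internal trace resolution
output = pixels(3840, 2160)     # native 4K presentation
ratio = output / rt_target      # roughly 8x fewer rays than tracing at 4K
```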
 
Does the XSX SSD use PCIe 4.0?

The raw speed of the XSX SSD is 2.4GB/s. Won't the BOM be close to a Kingston A2000, which does 2.2GB/s?
IIRC the Series X expansion uses PCIe 4.0 x2, meaning it has the same bandwidth as PCIe 3.0 x4.
All you need is NAND and a controller that can keep up with the same 2400MB/s read speeds.
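The lane math, using approximate usable per-lane throughput after encoding overhead (~0.985 GB/s for Gen3's 8b/10b-free 128b/130b signalling, ~1.969 GB/s for Gen4):

```python
GB_S_PER_LANE = {3: 0.985, 4: 1.969}   # approximate usable bandwidth per lane

def link_bandwidth_gb_s(gen, lanes):
    return GB_S_PER_LANE[gen] * lanes

gen4_x2 = link_bandwidth_gb_s(4, 2)   # ~3.94 GB/s
gen3_x4 = link_bandwidth_gb_s(3, 4)   # ~3.94 GB/s
# Either link leaves comfortable headroom over a 2.4 GB/s drive.
```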
 