Baseless Next Generation Rumors with no Technical Merits [post E3 2019, pre GDC 2020] [XBSX, PS5]

Oooooh, time to baselessly speculate!

Some sort of system link/Crossfire type thing would be very cool. If it can do that, I'll let Jim Ryan bugger me. And I'm not into butt stuff, even with women, so he should consider himself lucky. I'll also shave my ring piece, so it's like a freshly buttered bagel.

Anyway, enough erotica. How feasible would any kind of system link be?

I mean if it’s that easy, I have a lot of gadgets at home.
 
I'm not sure, but from a quick search a 1.5-inch 128x128 OLED for Arduino already costs around £12-£15. Granted, Sony would get them in bulk so it would be cheaper, but it would surely be higher resolution than that and probably a bit larger than 2 inches, at 2.65 or something. Regardless, I don't think it's a negligible cost. Plus, given that most titles will be multiplatform and Xbox and PC don't have controllers with screens, it would probably only be used to its full potential in first-party titles. Nah, the more I think about it, the more I doubt it.
I don't think the chances are very high for the screen, but here's a 2-inch 320x240 screen. It's $2 in 1,000 qty.

https://www.alibaba.com/product-detail/240-X-320-2-0-Inch_62377221891.html

And that 1-inch OLED on the DS4 button adapter is below $1 in large quantities.
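For a sense of scale, the bulk-cost math can be sketched quickly. Only the part prices come from the listings above; the per-unit integration overhead and the unit volume are pure assumptions for illustration.

```python
# Back-of-the-envelope BOM impact of adding a screen to a controller.
# The $2 part price is the quoted bulk figure above; the $0.50 overhead
# (driver IC, flex cable, assembly) and 10M units are assumptions.

def added_cost(unit_price, units, integration_overhead=0.50):
    """Total extra spend: part price plus assumed per-unit overhead."""
    return units * (unit_price + integration_overhead)

# e.g. a $2 screen across an assumed 10 million controllers
total = added_cost(2.00, 10_000_000)
print(f"${total:,.0f}")  # $25,000,000 with the assumed $0.50 overhead
```

Even at $2 a panel, the money adds up fast at console volumes, which is the point being made about it not being a negligible cost.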
 

The saltiness with this is awesome.

For those who don't know: a company made a very black paint and Anish bought the exclusive rights to it. No other artist was allowed to use "his" ultra-black black. This product is a direct response to that, with added snark.

*Note: By adding this product to your cart you confirm that you are not Anish Kapoor, you are in no way affiliated to Anish Kapoor, and you are not purchasing this item on behalf of Anish Kapoor or an associate of Anish Kapoor. To the best of your knowledge, information and belief, this material will not make its way into the hands of Anish Kapoor.
 
I'm not necessarily sure that correlation of being behind means they weren't being communicated with from the start.
I don't exclude the possibility of early communication with AMD from the start; I merely conclude that, given the rapid deployment of NVIDIA's RT initiative across the board, its efforts most probably predated DXR. That means they were already doing hardware-accelerated RT in the form of a CUDA/OptiX extension, and later as a proprietary DX12/Vulkan extension. This is the only explanation for their seemingly immediate hardware support for DXR.

DXR was first announced in March 2018. It was immediately supported on Volta (released in May 2017), with announced support for later architectures, which turned out to be Turing (essentially an upgraded Volta plus RT cores); Turing released a few months later, in August 2018. Things don't happen this quickly unless NVIDIA had laid the groundwork for hardware RT before DXR, especially as AMD is completely absent from that picture and hasn't even come up with an official hardware RT solution to this day, meaning AMD probably started working on RT during DXR's formulation, not before it.

at the very least a telling of how intertwined Nvidia may be with DirectX at least from a feature set perspective
Yeah, they practically got four DX12 features early; however, these features also shipped ahead of their DX12 integration, meaning NVIDIA was probably already developing them independently.

They had AI hardware acceleration through Tensor cores on CUDA, which was later integrated into DirectML (still in its beta phase).
They had Mesh Shaders on Turing through a separate proprietary DX12/Vulkan extension, which is still being integrated into DX12.
They even had Variable Rate Shading (VRS) on Turing from day one, through a separate proprietary Vulkan extension; DX12 began integrating the feature some eight months after the Turing announcement.

The last one, and the big one, is Ray Tracing. Can you spot the pattern? I am postulating that NVIDIA had RT acceleration even before DXR, at least in the same way it had AI acceleration before DirectML.

If Sony is really developing a custom RT solution, then Sony's RT effort most probably predated DXR as well, same as NVIDIA's.
 

Anybody consider that RT was already planned for next gen consoles and Nvidia accelerated its RT plans to get a jump on the market? 12nm is not the ideal node for offering RT while trying to offer more compute/rasterization performance than previous gen.

RTX seemed extremely rushed, especially considering that if RT is hard to implement well and AMD had no RT on the horizon, then Nvidia could easily have waited until 7nm. A 20-series card at 7nm makes a lot more sense than at 12nm by almost every variable other than time.
 
Stadia was about 'predicting future player input' to compensate for lag. I think it works by transforming the frame to a future point in time, so your laggy input lands on time.
(Still unsure if I got this right)
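If that reading is right, the core trick is just extrapolation: estimate the input's rate of change from recent samples and project it forward by the measured latency. A minimal sketch, assuming linear motion and made-up sample data; none of the names here are Stadia's actual implementation.

```python
# Sketch of latency compensation by input extrapolation: fit a velocity
# to the last two input samples and project the position forward by the
# network round-trip time. Purely illustrative.

def predict_input(samples, latency):
    """samples: (timestamp, stick_position) pairs, oldest first.
    Returns the extrapolated position at now + latency."""
    (t0, x0), (t1, x1) = samples[-2], samples[-1]
    velocity = (x1 - x0) / (t1 - t0)   # simple linear model
    return x1 + velocity * latency     # project past the lag

# Player pushing the stick steadily right; 100 ms of network lag.
history = [(0.00, 0.10), (0.05, 0.20)]
predicted = predict_input(history, 0.100)  # extrapolates to ~0.4
```

A linear model is obviously the crudest option; the failure mode is exactly the one raised below, where the prediction moves the player somewhere they didn't intend.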

This sounds like a whole lot of invitations for problems to land on their doorstep.
- Of course he won, the A.I. was practically moving for him!
- Of course I lost, the stupid A.I. kept moving me to the wrong place!
 
Both are valid POVs.
I just don't agree with the idea that the regression tests help provide evidence that MS is using AMD's solution and Sony is not.

The tests on RT, in my opinion, show nothing except that a test could be run. The results are inconclusive.
 
Arden has an RT block on the GPU. GLX is GL0/1.

[Image: z8v3mLi.png]
Do you have a full unedited picture of this one?
 
If HW RT is only for ray intersection, how are they implementing ray marchers in real time?
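One answer: a ray marcher doesn't need intersection hardware at all. Sphere tracing a signed distance field advances the ray in plain arithmetic steps, which runs fine on ordinary compute shaders; only BVH ray/triangle queries benefit from a dedicated intersection block. A minimal CPU-side sketch (illustrative names, not any console's actual implementation):

```python
import math

# Sphere tracing: march along a ray through a signed distance field
# (SDF). Each step is plain arithmetic, so no ray-intersection
# hardware is involved anywhere.

def sphere_sdf(p, center=(0.0, 0.0, 5.0), radius=1.0):
    """Signed distance from point p to a sphere (the whole 'scene')."""
    return math.dist(p, center) - radius

def ray_march(origin, direction, max_steps=64, eps=1e-4):
    """Step by the SDF value each iteration; that distance is always a
    safe step because it cannot overshoot the nearest surface.
    Returns the hit distance along the ray, or None on a miss."""
    t = 0.0
    for _ in range(max_steps):
        p = tuple(o + t * d for o, d in zip(origin, direction))
        d = sphere_sdf(p)
        if d < eps:        # close enough to the surface: a hit
            return t
        t += d             # safe step toward the surface
        if t > 100.0:      # ray escaped the scene
            return None
    return None

# Ray from the origin straight down +z hits the unit sphere at t = 4.
print(ray_march((0.0, 0.0, 0.0), (0.0, 0.0, 1.0)))  # → 4.0
```

So an intersection-only RT block and real-time ray marchers can coexist: the marcher just runs as regular shader math.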
 
Anybody consider that RT was already planned for next gen consoles and Nvidia accelerated its RT plans to get a jump on the market? 12nm is not the ideal node for offering RT while trying to offer more compute/rasterization performance than previous gen.
It's hard to rush a complete integration of something planned 4 or 5 years in advance. NVIDIA's RT efforts must have started in 2014 or 2015 (when they were planning Volta/Turing), and they had been heading toward RT since their days of accelerating RT rendering through CUDA on Fermi (2010), then real-time RT demos on Kepler (2013), then partial RT effects like HFTS and VXAO on Maxwell and Pascal.
 
Why not just use your existing Google spy-gadget?

There will be different gadgets across the house to talk to your voice assistant; the console will just be one of them. I know two people who have bought two Google Home speakers each, one for the bedroom and the other for the living room (plus their phone).

In the future we'll have hundreds of mics in our houses so the NSA can choose to tap whichever they want.
 

Look, look...

13F9:OBR:A0

[Image: AMD_Flute_Tweet.jpg]


So it's now confirmed that the Flute benchmark used the Oberon chip.

Now, check out what Taiwanese leaker AquariusZi said back in October:

Arden
OBR/Oberon
Sparkman
The first one should be Xbox
The second should be PS5
The third case is very interesting; it has only recently begun to be evaluated
Its layout is very similar to Renoir, but a little fatter
And it's not a dedicated FPX pin for a laptop
It's the common BLX of a semi-custom SoC
... is there a shortage of video game manufacturers?
I'm not sure...

Now check the socket in that Flute benchmark...
 