Next Generation Hardware Speculation with a Technical Spin [pre E3 2019]

It's been years* since IMG Tech proposed its RT tech, and no one to my knowledge has used it. I doubt anyone will now....

"we first talked about our ray tracing IP back in 2012, and in 2014 this was followed by the launch of our ray tracing GPU family, a GPU featuring a block dedicated to accelerating ray tracing. This was intended for use in mobile hardware, but for demonstration and ease of development purposes, we had the chip integrated into a PCIe evaluation board, which was running by 2016."

https://www.imgtec.com/blog/imagination-technologies-the-ray-tracing-pioneers/
 
Sapphire also confirmed that AMD's Navi does not have specialized ray-tracing hardware on the silicon, but such technology will debut with "next year's new architecture".
https://www.techpowerup.com/255768/sapphire-reps-leak-juicy-details-on-amd-radeon-navi

This would make the PlayStation and Microsoft solutions either based on the next architecture, or each a custom implementation. It also means that at this point Microsoft may not have it at all, as we have had no official word on that.
I wonder if this actually means the new architecture after Navi is due next year. Things like this can easily get lost in translation.
I think I raised this before: with Navi delayed and their next gen possibly still on schedule, there may not be years between the two.

You can't deduce anything regarding either Sony or MS at this point, apart from the fact that the RT may not be Navi-based.
MS hasn't talked at all about Scarlett, so the fact that they've not given official word yet doesn't mean a thing.
 
It's been years* since IMG Tech proposed its RT tech, and no one to my knowledge has used it. I doubt anyone will now....
I'm not sure I would be as certain that no one will use it now.
It's about context and circumstances.
In a mobile chip, sure, it wasn't picked up, as software/games would have had to be written to take advantage of it.
But in a next generation console, that's a different matter if it can be integrated into the IP being used.
Especially if you know that PC GPUs (particularly prior to the console's release) will have RT hardware, but the architecture you're using does not.

Not saying it will be used; they may backport AMD's next-gen RT into Navi instead. But if that's too costly due to architectural changes, IMG may be a better solution.
 
It's been years* since IMG Tech proposed its RT tech, and no one to my knowledge has used it. I doubt anyone will now....

"we first talked about our ray tracing IP back in 2012, and in 2014 this was followed by the launch of our ray tracing GPU family, a GPU featuring a block dedicated to accelerating ray tracing. This was intended for use in mobile hardware, but for demonstration and ease of development purposes, we had the chip integrated into a PCIe evaluation board, which was running by 2016."

https://www.imgtec.com/blog/imagination-technologies-the-ray-tracing-pioneers/
Their RTRT GPU launched in 2014; the PS4 launched in 2013. The likelihood of anyone using the tech while it was unproven was always exceptionally low, and you can't blame people for not wanting it in this round of consoles. In mobile, it would be a platform-specific feature that never gets used, so not worth bothering with.

Perhaps now the console engineers have played with the PCIe evaluation board, released in 2016, and decided they want that tech in their consoles? We at least know Sony have worked with ImgTec before on Vita.
 
So you would have Navi communicate with an "RT block" from IMG? How much bandwidth would be needed? Over what protocol/bus? When a dev has a problem, will Sony have to talk to both AMD and IMG? Does IMG have the resources to support such requests from the console market leader?
It's a little OT, but I lost hope in IMG for the gaming market when they didn't land the Switch. It was perfect for them: at the time they had MIPS for the CPU and plenty of GPU IP...

Anyway, we'll see :)
 
It's been years* since IMG Tech proposed its RT tech, and no one to my knowledge has used it. I doubt anyone will now....

"we first talked about our ray tracing IP back in 2012, and in 2014 this was followed by the launch of our ray tracing GPU family, a GPU featuring a block dedicated to accelerating ray tracing. This was intended for use in mobile hardware, but for demonstration and ease of development purposes, we had the chip integrated into a PCIe evaluation board, which was running by 2016."

https://www.imgtec.com/blog/imagination-technologies-the-ray-tracing-pioneers/
Generally, companies announce previously closed IP as available for licensing for two reasons: 1. They're in cash trouble and need a revenue stream. 2. It's already happening and this is pre-empting the announcement.

This could be both, honestly.
 
From Sony investor meeting

[Image: slide from the Sony investor meeting]
"... a customized ultra-fast, broadband SSD."
http://lightnvm.io and https://openchannelssd.readthedocs.io/en/latest/

Summary: Remove the intelligent controller, expose the raw flash channels to the OS. "Software defined flash storage". Similar to removing an "intelligent" RAID controller and passing the disks to the OS. The OS/app knows better than the (RAID or flash) controller what it's doing with the data. A database might prefer to know about the disk/flash topology instead of being presented a flat flash/RAID device of unknown properties.

Cons: you're now responsible for wear leveling and all the other fun stuff the controller was doing (see the sketch below).
Pros: control. Possibly price, since your "controller" is now as dumb as possible.
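
As a minimal sketch of what that responsibility looks like (everything here is hypothetical illustration, not any real open-channel API): once the drive exposes raw channels, the host itself has to track per-block erase counts and steer writes toward the least-worn block, i.e. crude dynamic wear leveling:

Code:
/* Hypothetical host-side bookkeeping once the "intelligent" controller
 * is gone: pick the least-worn erase block across the exposed channels
 * so writes spread evenly (crude dynamic wear leveling). */
#include <stdint.h>
#include <stdio.h>

#define NUM_CHANNELS        8     /* raw flash channels exposed to the OS */
#define BLOCKS_PER_CHANNEL  1024  /* erase blocks per channel (made up) */

/* Per-block erase counters the drive firmware used to hide from us. */
static uint32_t erase_count[NUM_CHANNELS][BLOCKS_PER_CHANNEL];

/* Find the (channel, block) pair with the fewest erases so far. */
static void pick_least_worn(int *out_ch, int *out_blk)
{
    uint32_t best = UINT32_MAX;
    for (int ch = 0; ch < NUM_CHANNELS; ch++)
        for (int blk = 0; blk < BLOCKS_PER_CHANNEL; blk++)
            if (erase_count[ch][blk] < best) {
                best = erase_count[ch][blk];
                *out_ch = ch;
                *out_blk = blk;
            }
}

int main(void)
{
    int ch, blk;
    pick_least_worn(&ch, &blk);
    /* A real open-channel stack would now erase + program this block
     * through the raw channel interface and bump its counter. */
    erase_count[ch][blk]++;
    printf("next write goes to channel %d, block %d\n", ch, blk);
    return 0;
}

A real host-side FTL would also own garbage collection, bad-block maps, and ECC placement; the point is simply that all of that moves out of the drive firmware and into the console OS.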

A standard-physical-format PCIe 4.0 x4 M.2 drive with a "cheaper" controller might be the way to have your cake and eat it too: a standard physical and software interface for price, and a 100% custom interface between your flash channels and your app/game for outstanding performance.

Or frankly, just DGAF and use it in FreeBSD's normal NVMe mode. The pain of having a new API for storage access sounds stupid; FreeBSD has enough storage APIs as it is. And I'd venture the difference from NVMe to Open-Channel SSD is a small step compared to the generational change of going from a 5400 rpm lowest-bidder HDD to a PCIe 4.0 lowest-bidder NVMe drive.
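
To put rough numbers on that generational change (ballpark assumptions on my part, nothing official): a 5400 rpm laptop drive manages on the order of 100 MB/s sequential, while PCIe 4.0 signals at 16 GT/s per lane with 128b/130b encoding, so an x4 link tops out just under 8 GB/s:

Code:
/* Back-of-envelope link bandwidth comparison; the HDD figure is an
 * assumed ballpark, not a measured number. */
#include <stdio.h>

int main(void)
{
    /* PCIe 4.0: 16 GT/s per lane, 128b/130b encoding, 8 bits per byte */
    double lane_gbytes = 16.0 * (128.0 / 130.0) / 8.0;  /* ~1.97 GB/s */
    double x4_link     = 4.0 * lane_gbytes;             /* ~7.88 GB/s */
    double hdd_5400    = 0.1;                           /* ~100 MB/s sequential */

    printf("PCIe 4.0 x4: %.2f GB/s, roughly %.0fx a 5400 rpm HDD\n",
           x4_link, x4_link / hdd_5400);
    return 0;
}

So even before any custom storage API, the raw link is nearly two orders of magnitude ahead of the hard drives these consoles ship with today.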
 
"Average lifetime device spend exceeds $700•Average launch year lifetime device spend exceeds$1,600"

$1600 for launch consoles. I'm guessing these spends included hardware costs? Makes the notion of loss-leading hardware all the more viable if it means securing the lion's share of sales.

I'm probably close to the $1,600 when factoring in hardware, PSN for the last 6 years (actually still have 2 more years), and at least 1-2 games a year.

Taking a $100 loss early on seems like a no-brainer if you can get the costs down quickly (break even within a year).
 
"Average lifetime device spend exceeds $700•Average launch year lifetime device spend exceeds$1,600"

$1600 for launch consoles. I'm guessing these spends included hardware costs? Makes the notion of loss-leading hardware all the more viable if it means securing the lion's share of sales.

I'd say the average early adopter probably goes something like this:

$1600:
- $400 PS4 at launch
- 3 extra DualShock controllers at $50 each = $150
- 8x retail full games at release, averaging $60 = $480
- 6x yearly $50 PS Plus subscription = $300
- 10x digital full games, averaging $30 = $300
(That adds up to $1,630, right around the $1,600 figure.)


People who went on to upgrade to the PS4 Pro, purchased PSVR plus the camera and Move controllers, etc. probably go north of $2,000 easily.
 
I'm probably close to the $1,600 when factoring in hardware, PSN for the last 6 years (actually still have 2 more years), and at least 1-2 games a year.

Taking a $100 loss early on seems like a no-brainer if you can get the costs down quickly (break even within a year).
Indeed. By comparison, the PS4 was breaking even on launch day, and that was with a slightly upclocked 7850 GPU worth $249 at retail at the time. One can hope they devote a good chunk of a $100 loss to the custom Navi in the PS5. :)
 
I'd say the average early adopter probably goes something like this:

$1600:
- $400 PS4 at launch
- 3 extra DualShock controllers at $50 each = $150
- 8x retail full games at release, averaging $60 = $480
- 6x yearly $50 PS Plus subscription = $300
- 10x digital full games, averaging $30 = $300
(That adds up to $1,630, right around the $1,600 figure.)


People who went on to upgrade to the PS4 Pro, purchased PSVR plus the camera and Move controllers, etc. probably go north of $2,000 easily.
Easily. Which is why some of us are actually totally OK with not-so-slightly more expensive consoles at launch, if it means a discernible difference in tech and more future-proofing, because ultimately we know how much we spend on this cute little life-consuming hobby of ours. And if Sony/MS understand that and are willing to take an initial loss, great news for everyone!
 
It's been years* since IMG Tech proposed its RT tech, and no one to my knowledge has used it. I doubt anyone will now....

"we first talked about our ray tracing IP back in 2012, and in 2014 this was followed by the launch of our ray tracing GPU family, a GPU featuring a block dedicated to accelerating ray tracing. This was intended for use in mobile hardware, but for demonstration and ease of development purposes, we had the chip integrated into a PCIe evaluation board, which was running by 2016."

https://www.imgtec.com/blog/imagination-technologies-the-ray-tracing-pioneers/

I could be wrong, but IIRC, a few years ago (2016 or so) I read that Samsung was developing a microprocessor that could be paired with a GPU, on-chip or on the PCB, to assist with all forms of RT lighting, shadowing, and reflections. I can't find the article for the life of me, but it seemed promising from what was being stated: the microprocessor could be implemented in cellphones, tablets, consoles, and so on. Samsung wasn't looking to get into the GPU or console space in the traditional sense, but to offer their IP ($$$) to other GPU designs lacking in certain areas.
 
PS4 + 2x DS4 + camera + Killzone bundle = 550€

Traded PS4 + 2x DS4 + 180€ for a Pro

~80 physical games, many via GameStop trades, only 4-5 at full price. Let's say 20€ average = 1600€

~20 digital games at 10€ average = 200€

PS4 Anniversary Edition for the collection = 500€

PS+ since 2013: 50€ × 6 = 300€

= so 3,000€+ I guess (3,330€ by the list above).

Maybe 2,500€, since many of my games were trade deals and I sold some.
 
I could see Sony or MS justifying a modest loss at launch if a technology could become much cheaper from year two onwards. Say GDDR6 is used at launch but replaced with a cheaper HBM solution 18 months in: that could justify a nominal loss at launch if it set a bandwidth benchmark key to the overall design.

However, with streaming options available, I can't see a very good case for taking a loss for market share's sake. Simply direct those consumers to the streaming solution and sell them more games and services.
 
I could be wrong, but IIRC, a few years ago (2016 or so) I read that Samsung was developing a microprocessor that could be paired with a GPU, on-chip or on the PCB, to assist with all forms of RT lighting, shadowing, and reflections. I can't find the article for the life of me, but it seemed promising from what was being stated: the microprocessor could be implemented in cellphones, tablets, consoles, and so on. Samsung wasn't looking to get into the GPU or console space in the traditional sense, but to offer their IP ($$$) to other GPU designs lacking in certain areas.

Do you mean Samsung's SGRT?

http://web.yonsei.ac.kr/wjlee/document/sgrt.hpg13.pdf
 