DirectX 12: Its future in the console gaming space (specifically the XB1)

I am thinking about leaving console gaming (unless the next Xbox is portable and merges things; that process is underway now that all devices run the same OS), but I need the money for other stuff right now, and I am very interested in a decent, well-priced tablet that runs DirectX 12. :smile2: That means buying a tablet with Intel's Cherry Trail processor, which is fully DirectX 12 compatible.

http://liliputing.com/2015/08/teclast-x98-pro-9-7-inch-windows-10-tablet-with-cherry-trail.html

I have that one above, or a similar tablet, in mind. A Cherry Trail processor runs the Dolphin emulator pretty well, for instance.
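In case anyone wants to check whether a given tablet actually exposes DX12, here's a minimal C++ sketch (assuming a Windows 10 SDK and MSVC, linking d3d12.lib; I haven't tried it on Cherry Trail specifically). Passing a null output pointer asks D3D12CreateDevice to only report whether a device could be created, without actually creating one.

```cpp
// Probe the default adapter for DirectX 12 support without creating a device.
// Assumes the Windows 10 SDK; compile with MSVC and link d3d12.lib.
#include <d3d12.h>
#include <cstdio>

int main() {
    // With a null output pointer, D3D12CreateDevice only tests for support
    // and returns S_FALSE (a success code) if a device could be created.
    HRESULT hr = D3D12CreateDevice(nullptr,                 // default adapter
                                   D3D_FEATURE_LEVEL_11_0,  // minimum for D3D12
                                   __uuidof(ID3D12Device), nullptr);
    std::printf("DirectX 12: %s\n", SUCCEEDED(hr) ? "supported" : "not supported");
    return 0;
}
```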

The current Xbox is portable if you have Windows 10 on your tablet. I am using one right now since my TV bit the dust.

The only thing missing is the ability for MS to stream Xbox games outside your home, and that's very much in the plans given the leaked docs from years ago.

No point in trying to fit it into a tablet when ultimately you have to cull performance to accommodate such a small form factor.
 
So you're wanting... a portable Gamecube/Wii?
To an extent, yes. Would you have loved a device like that? I am thinking about a Surface with a few physical buttons to use like a gamepad, plus a keyboard and a mouse. Then you could connect a wireless gamepad to it, like a regular console, in order to play with friends on a TV.

A layer for typical PC uses and an overly protected layer for Xbox gaming.

The current Xbox is portable if you have Windows 10 on your tablet. I am using one right now since my TV bit the dust.

The only thing missing is the ability for MS to stream Xbox games outside your home, and that's very much in the plans given the leaked docs from years ago.

No point in trying to fit it into a tablet when ultimately you have to cull performance to accommodate such a small form factor.
I take from your words that your connection speed allows for high-quality streaming. A Surface-like console would fix the issue of not being able to stream games outside your home.

In regard to performance, it might be a problem right now (although I think a Surface Pro 4 is more powerful than the X1), but maybe in 4 years and with efficient APIs like DirectX 12 and Vulkan...
 
Yeah. The Iris Pro Graphics 580/GT4e of Skylake should be close to the Xbox One in GPU performance, but I don't think it has launched yet. It could end up in laptops, but I'm not sure.
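For a rough sense of how close: a back-of-the-envelope peak-FLOPS comparison. The clock and unit counts below are approximate public spec figures, not measurements, and theoretical throughput ignores bandwidth, drivers and thermals, so treat it as a sketch only.

```cpp
// Back-of-the-envelope peak FP32 throughput: GFLOPS = ALUs x 2 (FMA) x GHz.
// Unit counts and clocks are approximate public spec figures.
#include <cstdio>

int main() {
    double xb1  = 768 * 2 * 0.853;   // Xbox One: 768 shader cores @ 853 MHz -> ~1310 GFLOPS
    double gt4e = 72 * 8 * 2 * 1.0;  // Iris Pro 580: 72 EUs x 8 FP32 lanes @ ~1 GHz -> ~1152 GFLOPS
    std::printf("Xbox One     : ~%.0f GFLOPS\n", xb1);
    std::printf("Iris Pro 580 : ~%.0f GFLOPS\n", gt4e);
    return 0;
}
```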
 
Plus, Intel have anywhere between a one- and two-node advantage. Unless Intel are going to be designing and fabbing your processors...

The tablet/portable issue is complicated by sustained heat output. My chum just bought a new gaming laptop with a latest-gen Intel CPU and a 970M (with 6 GB of VRAM!). Thing is, even with both CPU and GPU fans at full tilt the thing will hit temperature thresholds and throttle (certainly on the GPU side), so to maintain full performance he has to activate the laptop's third "mega boost loudass" fan.

If DX12 allows for higher utilisation, then heat output should increase too. Probably not a detectable issue for the X1 with its overkill cooler (given that it's probably only going to get a little faster), but maybe mobile PC gamers will have to get used to higher fan settings.
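To make the throttling loop concrete, here's a toy model. Every constant is made up purely to illustrate the feedback (heat rises with power, and clocks get cut at a threshold), not to match any real laptop.

```cpp
// Toy thermal-throttling model. All constants are hypothetical.
#include <cstdio>

int main() {
    double temp  = 40.0;   // die temperature in deg C
    double clock = 1.0;    // normalised clock: 1.0 = full speed
    const double kThrottleAt = 90.0;  // throttle above this temperature
    const double kRecoverAt  = 80.0;  // restore clocks below this one

    for (int sec = 0; sec <= 120; ++sec) {
        double watts = 45.0 * clock;          // higher clocks -> more power
        temp += 0.05 * watts;                 // heating from dissipated power
        temp -= 0.03 * (temp - 25.0);         // cooling towards 25 C ambient
        if (temp > kThrottleAt)      clock = 0.6;  // threshold hit: throttle
        else if (temp < kRecoverAt)  clock = 1.0;  // cooled down: full speed
        if (sec % 30 == 0)
            std::printf("t=%3ds  temp=%5.1f C  clock=%.1f\n", sec, temp, clock);
    }
    return 0;
}
```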
 
Chips with Iris Pro also cost more than a PS4.

Well, they do when consumers buy them, but I don't know what Apple would pay for the chips.

I doubt the GT4e will be available with anything less than an i7, so it's not really something you'd run on a mobile device. You couldn't run on battery with that thing.
 
Plus, Intel have anywhere between a one- and two-node advantage. Unless Intel are going to be designing and fabbing your processors...

The tablet/portable issue is complicated by sustained heat output. My chum just bought a new gaming laptop with a latest-gen Intel CPU and a 970M (with 6 GB of VRAM!). Thing is, even with both CPU and GPU fans at full tilt the thing will hit temperature thresholds and throttle (certainly on the GPU side), so to maintain full performance he has to activate the laptop's third "mega boost loudass" fan.

If DX12 allows for higher utilisation, then heat output should increase too. Probably not a detectable issue for the X1 with its overkill cooler (given that it's probably only going to get a little faster), but maybe mobile PC gamers will have to get used to higher fan settings.
DX12 is more efficient and can actually reduce power consumption, or hold power steady for higher performance. Intel demonstrated this here: they were able to reduce CPU power and increase GPU power for no net increase in total power while increasing the frame rate by 70%. Alternatively, if they keep the same frame rate, they can reduce the total power by almost 50%. It's certainly still possible to have higher power dissipation if you want even more performance, but I think the first scenario is sufficient for mobile devices.

https://software.intel.com/en-us/blogs/2014/08/11/siggraph-2014-directx-12-on-intel
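To spell out the arithmetic: the 70% and ~50% figures below are from that Intel demo, but the individual CPU/GPU wattages are my own made-up illustration of a fixed, shared SoC power budget.

```cpp
// Worked example of the two DX12 power scenarios described above.
// The 70% fps gain and ~50% power saving come from the Intel demo;
// the CPU/GPU watt split is hypothetical, for illustration only.
#include <cstdio>

int main() {
    double cpu11 = 3.3, gpu11 = 2.2, fps = 30.0;  // hypothetical DX11 baseline

    // Scenario 1: same 5.5 W total budget; DX12's lower CPU overhead lets the
    // shared budget shift watts to the GPU -> ~70% higher frame rate.
    double cpu12 = 1.5, gpu12 = 4.0;
    std::printf("Scenario 1: total %.1f W -> %.1f W, fps %.0f -> %.0f\n",
                cpu11 + gpu11, cpu12 + gpu12, fps, fps * 1.7);

    // Scenario 2: cap the frame rate at the DX11 level instead and bank the
    // savings -> total package power drops by almost half.
    std::printf("Scenario 2: fps fixed at %.0f, power %.1f W -> %.1f W\n",
                fps, cpu11 + gpu11, (cpu11 + gpu11) * 0.5);
    return 0;
}
```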
 

Developers who are pushing the envelope will use whatever they can get. In the case of the Xbox One, freeing up CPU and utilising more GPU will likely mean they can find a way to use more CPU too.

On a laptop such as my chum's, even if the CPU were seeing less utilisation, the GPU is on its own heatsink and (primary) fan, so if GPU utilisation goes up it will just get hotter and require more fan (primary and/or secondary).

If you can do more, you do more. Long term, that's what's going to happen whether it's due to developers or hardware engineers shifting power to the now more utilised GPU.
 
It's not. Even if you disregard the inevitable sustained-performance issues for high-end gaming, the GPU just isn't quite there yet.
Say your theory is right. Are you talking about PC gaming? The scenario is very different if you use a GPU in a PC environment compared to using it on a console.

Plus, Intel have anywhere between a one- and two-node advantage. Unless Intel are going to be designing and fabbing your processors...

The tablet/portable issue is complicated by sustained heat output. My chum just bought a new gaming laptop with a latest-gen Intel CPU and a 970M (with 6 GB of VRAM!). Thing is, even with both CPU and GPU fans at full tilt the thing will hit temperature thresholds and throttle (certainly on the GPU side), so to maintain full performance he has to activate the laptop's third "mega boost loudass" fan.

If DX12 allows for higher utilisation, then heat output should increase too. Probably not a detectable issue for the X1 with its overkill cooler (given that it's probably only going to get a little faster), but maybe mobile PC gamers will have to get used to higher fan settings.
I have a laptop from 2011 whose processor is an APU, and I experience some overheating issues and fan noise if I don't disable Turbo mode in the BIOS; but it's a laptop from 2011, and it's dead silent once you change the BIOS settings.

You are factoring a CPU and an external GPU into the equation. What about an AMD APU built with the most advanced technology of 2020, though?

http://www.windowscentral.com/uks-p...let-controllers-play-streaming-xbox-one-games

[Image: Linx Windows 10 tablet]
 

Say your theory is right. Are you talking about PC gaming? The scenario is very different if you use a GPU in a PC environment compared to using it on a console.

Yes, even if it weren't running Windows and had a custom OS and a low-level API.

You are factoring a CPU and an external GPU into the equation. What about an AMD APU built with the most advanced technology of 2020, though?

Well, yes, by 2020 mobile APUs may well have caught up to console APUs from 2013 (in fact, I would expect them to), but that hardly changes the SP4's current standing.
 
By standing, you mean it can't compete? Let's say the SP4 isn't running Windows, or whatever OS, but a custom OS... I am wondering what makes you think it is going to have inevitable sustained-performance issues when programmers can make the most of it. Is it because Intel GPUs aren't quite there yet?
 
Now this is interesting:
[Image: AMD Polaris architecture slide]


So "Polaris" vs the GTX 950 on med preset in Battlefront (which heavily favors AMD GCN vs Maxwell currently).
 
Well, they want to compare their future cards to the current competition; I don't think the comparison is that bad. We should also expect Pascal to be more efficient than Maxwell.
 