nVidia Project Th-.. Shield (Tegra4)

Yes, that is true. But for the low-latency gaming that Nvidia is touting, you'll need a current Nvidia GPU. Although that, of course, could change in the future.

Regards,
SB

AFAIK, Quick Sync is still faster than Nvidia's hardware encoder.

Nonetheless, I'm convinced that, given an adequate CPU in the PC, most of the lag won't come from the PC's end but from the house's own network internals (piss-poor routers like mine, switches and such).

I'm actually building a PC that will let me stream content to my Android devices like this:
Discrete GPU renders the game -> Splashtop uses Quick Sync (it might be able to, since I'm using Virtu MVP) -> PCI-Express x1 -> Intel 2230 WiFi/Bluetooth adapter (using a virtual access point) -> Android phone/tablet.

I'm hoping this will give me minimal lag, making 1080p games playable on the tablet. Fingers crossed.
 
Silent_Buddha said:
That could pose problems for some users. In my area for example, most people are still limited to 7-12 Mbps. And the higher speed grades have "fair use" terminology built into the contracts. There's no hard caps, but they will send warnings and eventually terminate service for excessive bandwidth consumption.
Streaming is currently restricted to the LAN, so bandwidth shouldn't be an issue except for QoS.
 
So you can't tell the difference between down and up rates?

And Gaikai is not owned by Nvidia.

Also, where exactly did Nvidia announce support for Gaikai on shield? I mean it makes sense, but I haven't actually seen them say anything. Do you have a source for that statement?

Silent_Buddha said:
The streaming from PC aspect is also going to be a potentially limiting factor. As you'll need a decent PC with a current Nvidia GPU in order to stream. At least from what has been mentioned thus far. As long as you have the supported hardware, it is a pretty nice feature to be able to then game on any TV in the house.

I'm not sure how feasible streaming games through an online service will be. Nvidia mentioned that streaming at 720p/30 FPS will require ~5 Mbps of bandwidth while streaming at 1080p/60 FPS will require ~15-20 Mbps.
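For scale, those bitrates translate into nontrivial data volumes per hour of play, which is why the fair-use limits mentioned earlier could bite if streaming ever went over the WAN. A quick sketch of the arithmetic, using Nvidia's quoted bitrates (the conversion itself is just units, nothing Shield-specific):

```python
# Convert the streaming bitrates Nvidia quoted (~5 Mbps for 720p/30,
# ~15-20 Mbps for 1080p/60) into gigabytes consumed per hour of play.

def gb_per_hour(mbps):
    """Gigabytes per hour for a sustained bitrate given in megabits/s."""
    return mbps * 3600 / 8 / 1000  # Mb/s -> Mb/h -> MB/h -> GB/h

usage_720p = gb_per_hour(5)    # ~2.25 GB per hour at 720p/30
usage_1080p = gb_per_hour(20)  # ~9.0 GB per hour at 1080p/60
```

So an evening of hypothetical 1080p/60 WAN streaming would already be in the tens of gigabytes, well into "excessive bandwidth consumption" territory on a fair-use plan.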

Just to be clear here. There is no ISP required at all to stream from your PC to Shield over your LAN. And this is the only method currently supported by Nvidia for streaming from your PC. Period.

As for the rest, don't even bother...
 
So you can't tell the difference between down and up rates?

And Gaikai is not owned by Nvidia.

Also, where exactly did Nvidia announce support for Gaikai on shield? I mean it makes sense, but I haven't actually seen them say anything. Do you have a source for that statement?



Just to be clear here. There is no ISP required at all to stream from your PC to Shield over your LAN. And this is the only method currently supported by Nvidia for streaming from your PC. Period.

As for the rest, don't even bother...

So you are wanting to claim that Nvidia will not allow Project Shield to stream games from Nvidia's own Grid cloud gaming platform? I find that extremely hard to swallow.

As to Gaikai, it seems Tech Report may have misinterpreted what Nvidia told them. If that's the case then yes, I got that wrong. So that should read: Nvidia's own cloud gaming service, which is similar to Gaikai.

Regards,
SB
 
Here is a look at Samsung's controller: how it allows for devices with screens up to 6.3 inches (Note 3), the partnership with EA, and the 16 games optimised for the Galaxy S4.

Samsung is clearly (and cleverly) taking a piece of every competitor's pie, in this case namely Nvidia's. Although it's nothing new, this looks like a far better solution than Project Shield, using only one expensive device (phone) rather than a costly two (phone + Shield), likely with near-identical results.

What Samsung is doing with their controller is very similar to what Moga is already doing with their Moga Pro controller: http://www.talkandroid.com/147538-hands-on-with-the-powera-moga-pro-gaming-controller/ . Project Shield should be a much better gaming device because it can accommodate a heatsink (to keep CPU/GPU clock frequencies relatively high without the heavier thermal throttling seen in a smartphone chassis) and because it can accommodate a much larger battery (allowing many hours of continuous gameplay without the limited capacity and heavier battery drain of a smartphone chassis).
 
What Samsung is doing with their controller is very similar to what Moga is already doing with their Moga Pro controller: http://www.talkandroid.com/147538-hands-on-with-the-powera-moga-pro-gaming-controller/ . Project Shield should be a much better gaming device because it can accommodate a heatsink (to keep CPU/GPU clock frequencies relatively high without the heavier thermal throttling seen in a smartphone chassis) and because it can accommodate a much larger battery (allowing many hours of continuous gameplay without the limited capacity and heavier battery drain of a smartphone chassis).

Yeah, I mentioned it wasn't a new idea; I bought something similar.
I wouldn't be so sure about Shield having more battery life or performance than a Galaxy Note 3, for instance. With Tegra 4 pushing near 9 W, just how long were you hoping to be playing on one of them?

Cost is also a factor. Why spend close to a grand on two devices when you can get near the same experience for half that?
 
I wouldn't be so sure about Shield having more battery life or performance than a Galaxy Note 3, for instance

Due to differences in form factor, Shield has > 3.2x more battery capacity than the Galaxy Note 2, and has > 4.75x more battery capacity than the Galaxy S3.
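Those ratios can be sanity-checked from published figures. A rough check, assuming the ~38 Wh Shield pack that Heise reported and typical 3.7 V Li-ion cells for the phones (3100 mAh Note 2, 2100 mAh S3):

```python
# Sanity-check of the claimed capacity ratios. The 38 Wh Shield figure
# and the phone cell capacities/voltage are assumptions taken from
# published specs, not official Nvidia numbers.

def wh(mah, volts=3.7):
    """Battery energy in watt-hours from a capacity in mAh at a cell voltage."""
    return mah * volts / 1000

shield_wh = 38.0
note2_wh = wh(3100)   # ~11.5 Wh
s3_wh = wh(2100)      # ~7.8 Wh

ratio_note2 = shield_wh / note2_wh  # ~3.3x, consistent with "> 3.2x"
ratio_s3 = shield_wh / s3_wh        # ~4.9x, consistent with "> 4.75x"
```

Under those assumptions the quoted "> 3.2x" and "> 4.75x" multipliers line up with a ~38 Wh pack.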

Cost is also a factor. Why spend close to a grand on two devices when you can get near the same experience for half that?

An unsubsidized high end Galaxy smartphone (or Note) with cellular service will likely cost far more than a wifi-only 720p device with low cost Li-ion batteries like Shield. So the cost of an unsubsidized high end Samsung smartphone + Shield will probably only be about 30% higher than the cost of an unsubsidized high end Samsung smartphone + Samsung controller. As for "near the same experience", there are several reasons why that would not be the case: 1) the Galaxy smartphone (or Note) is very thin with no heatsink and no fan, so the CPU and GPU clock operating frequencies will need to be throttled during heavy gaming; 2) the Galaxy smartphone (or Note) has much more limited battery capacity, so the battery life would be much less during heavy gaming, which is not a good thing because a smartphone is typically one's primary communication device; 3) the Galaxy smartphone (or Note) cannot play games that are streamed from a [Kepler-equipped] PC.
 
Due to differences in form factor, Shield has > 3.2x more battery capacity than the Galaxy Note 2, and has > 4.75x more battery capacity than the Galaxy S3.



An unsubsidized high end Galaxy smartphone (or Note) with cellular service will likely cost far more than a wifi-only 720p device with low cost Li-ion batteries like Shield. So the cost of an unsubsidized high end Samsung smartphone + Shield will probably only be about 30% higher than the cost of an unsubsidized high end Samsung smartphone + Samsung controller. As for "near the same experience", there are several reasons why that would not be the case: 1) the Galaxy smartphone (or Note) is very thin with no heatsink and no fan, so the CPU and GPU clock operating frequencies will need to be throttled during heavy gaming; 2) the Galaxy smartphone (or Note) has much more limited battery capacity, so the battery life would be much less during heavy gaming, which is not a good thing because a smartphone is typically one's primary communication device; 3) the Galaxy smartphone (or Note) cannot play games that are streamed from a [Kepler-equipped] PC.

Have you seen the size of the battery in the Shield? If so, link please? Just because there is the space doesn't mean it will be filled ;).
Yes it has a heatsink, but it likely would need it due to the four A15s anyway. I don't expect them to be going full chat. Besides, the Note 3 will likely carry a Snapdragon 800 in some markets and a 3500-4000 mAh battery, to go along with a rumoured 6.3" full-HD AMOLED screen! With games being optimised for it in a similar way to Tegra games, the experience is going to be very similar IMO.

But of course we await actual products and games to compare. If the Note 3 were rooted it could play THD games anyhow ;)
 
I wouldn't be so sure about Shield having more battery life or performance than a Galaxy Note 3, for instance. With Tegra 4 pushing near 9 W, just how long were you hoping to be playing on one of them?

The verdict isn't out that Tegra 4 really will need > 9W running all four cores at 1.9GHz, that was a very crude estimation by Linley. But even if that's the case that doesn't mean it's going to be sucking 9+W while you're running a game, first and foremost because Android games aren't going to run four Cortex-A15s at 1.9GHz. They wouldn't know what to do with that.

This stupid FUD needs to die. People need to stop comparing mobile SoCs based on peak power draw. You can compare power at normalized performance points or you can compare performance at normalized power points to get useful information. What you can't do is make a comparison between two devices with different performance and power consumption. Worst of all is if you try to extrapolate results by assuming perf/W is linear, which it isn't. Even if it's not stated outright people basically imply it by saying stupid things like "oh, it's only 20% faster but uses 100% more power!"
 
The verdict isn't out that Tegra 4 really will need > 9W running all four cores at 1.9GHz, that was a very crude estimation by Linley. But even if that's the case that doesn't mean it's going to be sucking 9+W while you're running a game, first and foremost because Android games aren't going to run four Cortex-A15s at 1.9GHz. They wouldn't know what to do with that.

This stupid FUD needs to die. People need to stop comparing mobile SoCs based on peak power draw. You can compare power at normalized performance points or you can compare performance at normalized power points to get useful information. What you can't do is make a comparison between two devices with different performance and power consumption. Worst of all is if you try to extrapolate results by assuming perf/W is linear, which it isn't. Even if it's not stated outright people basically imply it by saying stupid things like "oh, it's only 20% faster but uses 100% more power!"

Well, if only Android games are going to be used then yes, I fully see your point. I am unaware of just what resources a Tegra 4 'THD'-optimised game would use over a standard one, so I won't comment on that. But my response was in reference to AMS using a heatsink as a reason why gaming will be better on Shield over, say, a Galaxy Note 3 + S-controller setup.

In my mind Shield would need dedicated games maxing out the silicon to warrant such a feature. If Android games are only being used with some optimisations, I fail to see how Shield would be a worthwhile investment over a Note 3 + controller.

Just my take.
 
The verdict isn't out that Tegra 4 really will need > 9W running all four cores at 1.9GHz, that was a very crude estimation by Linley. But even if that's the case that doesn't mean it's going to be sucking 9+W while you're running a game, first and foremost because Android games aren't going to run four Cortex-A15s at 1.9GHz. They wouldn't know what to do with that.

According to a Heise news article about Project Shield, the console needs between 4 and 8 watts running a game. So not so far from the 9 W.

The relevant quote (translated from the German):
"The 38 Wh battery is supposed to last 5 to 10 hours in gaming use and 24 hours in video playback."
So 38 Wh / 5 h = 7.6 W for the complete system

Link: http://www.heise.de/newsticker/meld...lkonsole-mit-Tegra-4-und-Android-1778002.html
 
It doesn't matter what frequency plane the cores are on if the cores are off. If mobile games don't know what to do with > 2 cores then those last two cores will be idle most of the time and can be power gated for a majority of that time. The fact that Cortex-A15 uses a shared inclusive L2 means that turning the cores on and off should be relatively low latency since they just have to save internal state and push L1 to L2. Those other cores aren't going to need high frequency on and off switching. They'll probably tend to go to sleep and wake up once per frame, so can handle latency even as high as a few ms.

And I don't agree at all that games having a "heavy" thread means they're going to run at 100% CPU time at whatever clock speed you set. CPU time requirements for games are mostly fixed and only scale somewhat with frame rate, which will also quickly hit a ceiling at 60 Hz or so (and a lot of games will cap it much lower). Consider that most games will want to run on iPads, so compare with that use case: the single-threaded CPU capability is much lower (1.4 GHz Swift vs 1.9 GHz Cortex-A15), and the GPU power is similar. So what good is it going to do for the game to peg a 1.9 GHz A15?



Yes, 7.6W worst case for the complete system.. that isn't close to 9W just for the SoC. Where french toast really got that 9W number was from Linley and it was just for the CPUs. I'm saying that number doesn't apply even if it's correct.

If games were designed specifically for Shield (honestly, I don't know whether that is the case), would that not max out the cores? Devs would try to use all available resources, no?
 
No, you don't try to peg four 1.9GHz Cortex-A15 cores just because they're there, regardless of how much nVidia is paying you. Most "Shield specific" optimizations will be raising the bar on graphics features, not driving up CPU utilization to the max. If they're really developing for Shield making a game that doesn't kill your battery is a consideration.

One thing to consider (which Linley did not) is that I'm pretty sure Tegra 4 won't even let you run all four cores at 1.9GHz. I don't recall if the actual limit has been mentioned but this is standard for nVidia.

Tegra 4 makes separate design decisions to handle separate use cases. There are Cortex-A15 cores that clock to 1.9GHz to address latency sensitive uses where you need something to complete ASAP. There are four cores to balance independent thread execution and leverage use cases that are inherently parallel but not as latency sensitive (like decompressing a large file). And there's a big GPU for games.

But just because all of these things are there doesn't mean that there was an intention to max out all of these resources simultaneously.
 
Have you seen the size of the battery in the Shield? If so, link please? Just because there is the space doesn't mean it will be filled ;).

You can do a Google or YouTube search and you will easily find the answer. Shield uses three low-cost cylindrical Li-ion batteries, which gives it far higher total battery capacity than any smartphone (or Note).

Yes it has a heatsink, but it likely would need it due to the four A15s anyway

The notion that quad A15 would need a heatsink is a misconception. After all, Samsung was able to fit quad A15 in the Galaxy S4 smartphone chassis with no heatsink and no fan. The benefit of being able to use a heatsink is that the designer can use higher CPU and/or GPU clock operating frequencies without being forced to quickly throttle these frequencies due to thermal constraints under heavy gaming.

besides, the Note 3 will likely carry a Snapdragon 800 in some markets and a 3500-4000 mAh battery

A Galaxy Note 3 with a 3500 mAh battery would have "only" 12.9% higher battery capacity than the Galaxy Note 2, which is still nowhere near the battery capacity that one would see in Shield or a larger tablet like the iPad 4.
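That 12.9% figure follows directly from the Note 2's published 3100 mAh capacity (assumed here) against the rumoured 3500 mAh cell:

```python
# Checking the "12.9% higher" claim: rumoured Note 3 cell vs the
# Note 2's published 3100 mAh capacity (an assumption in this sketch).
note2_mah = 3100
note3_mah = 3500
increase_pct = (note3_mah / note2_mah - 1) * 100  # ~12.9%
```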

...to go along with a rumoured 6.3" full-HD AMOLED screen!

A device with a 6.3" screen is not what I would consider to be a smartphone, but rather a relatively small tablet. Most consumers would rather not have a primary cellular device that has a screen size that is significantly above 5".
 
You can do a Google or YouTube search and you will easily find the answer. Shield uses three low-cost cylindrical Li-ion batteries, which gives it far higher total battery capacity than any smartphone (or Note).



The notion that quad A15 would need a heatsink is a misconception. After all, Samsung was able to fit quad A15 in the Galaxy S4 smartphone chassis with no heatsink and no fan. The benefit of being able to use a heatsink is that the designer can use higher CPU and/or GPU clock operating frequencies without being forced to quickly throttle these frequencies due to thermal constraints under heavy gaming.



A Galaxy Note 3 with a 3500 mAh battery would have "only" 12.9% higher battery capacity than the Galaxy Note 2, which is still nowhere near the battery capacity that one would see in Shield or a larger tablet like the iPad 4.



A device with a 6.3" screen is not what I would consider to be a smartphone, but rather a relatively small tablet. Most consumers would rather not have a cellular device that has a screen size that is significantly above 5".

Yeah, but I'm also willing to bet most consumers wouldn't buy a Shield anyway :)
 