NVIDIA Tegra Architecture

Wait, wasn't Google supposed to stop making Nexus products? Or was that just for phones? Or another silly rumor that got out of hand?
 
I have the feeling they just call them something else :p

Well what if the product doesn't necessarily have to be available a month from now? ;)

He said "in a product", not in any specific product; Denver SoCs will obviously take their time if the first A15 K1 devices appear in June. A (non-Denver) K1 will be available a month from now, and by striking Denver out I can prove him wrong, and he may call it noise in the system as much as he wants. :p
 
Congratulations, you can prove me wrong by completely changing what I said. It's totally possible for 32-bit K1 to be in a product in June. What exactly is your point? The so-called rumors being about Denver and not Cortex-A15 K1 is not some minor detail.

Seriously, I should be blaming nVidia for calling two pretty different products Tegra K1. They probably wanted this sort of thing to happen.
 
There is NO point. It's a joke :LOL:

Ironically that is what I just wrote elsewhere: Denver could have waited for Erista IMHO. Two SoCs in a year is not cheap for NV itself and it's only creating confusion.
 
Is it using a standard K1 or the Denver variant? The 4GB RAM and the vague 'later this year' release date makes me think it could be the latter. On the other hand they could simply be taking advantage of A15 LPAE.
 
No word on the screen in that tablet?
Is it 1080p? 720p?


I guess this really lends some weight to nVidia's claims about the computing capabilities of TK1.
 
Probably the standard Cortex-A15 K1, as they're talking about hopefully releasing them around Google I/O.

EDIT: I misread; they only hope to showcase it at Google I/O. They're not sure yet when they'll release it.
 
Since it's all the rage, I wonder if nVidia will be releasing their own close-to-metal interface for Kepler alongside Shield 2.
It would be cheaper to convince other big Android players to pressure Google into making their own; Nvidia could play an advisory role while Google bankrolls it.
This should be much easier now that Apple have announced Metal.
 
Google should just inject Khronos with a crapload of money for them to make an OpenGL version that would be universally supported and address the draw-call and multithreading stuff like DX12.

No more OpenGL ES. There's no point in having a separate ES version nowadays, IMO.
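
Just to illustrate what the draw-call and multithreading point amounts to, here's a rough, self-contained C++ sketch; the CommandList type and the submission step are made up for the example, not any real OpenGL or DX12 API. The idea is that each thread records its own command list and there is one cheap submission at the end, instead of every draw call funnelling through a single driver-owned context.

```cpp
#include <cstdio>
#include <functional>
#include <string>
#include <thread>
#include <vector>

// Hypothetical recorded command buffer -- a stand-in for a DX12-style command list.
struct CommandList {
    std::vector<std::string> commands;
    void draw(int meshId) { commands.push_back("draw mesh " + std::to_string(meshId)); }
};

// Each worker thread records the draws for its own slice of the scene.
void recordSlice(CommandList& list, int first, int last) {
    for (int mesh = first; mesh < last; ++mesh)
        list.draw(mesh);
}

int main() {
    const int numThreads = 4;
    const int meshesPerThread = 1000;

    std::vector<CommandList> lists(numThreads);
    std::vector<std::thread> workers;

    // Recording happens in parallel -- the part a classic bind-and-draw API
    // forces onto the one thread that owns the context.
    for (int t = 0; t < numThreads; ++t)
        workers.emplace_back(recordSlice, std::ref(lists[t]),
                             t * meshesPerThread, (t + 1) * meshesPerThread);
    for (auto& w : workers)
        w.join();

    // Single cheap submission point, analogous in spirit to ExecuteCommandLists.
    std::size_t total = 0;
    for (const auto& list : lists)
        total += list.commands.size();
    std::printf("submitted %zu draws recorded on %d threads\n", total, numThreads);
    return 0;
}
```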
 
Is it using a standard K1 or the Denver variant? The 4GB RAM and the vague 'later this year' release date makes me think it could be the latter. On the other hand they could simply be taking advantage of A15 LPAE.

You can do as on a PC and simply waste a small portion of your memory.
It could be reasonable, given that you don't suffer the performance loss of bank switching and you eliminate any system or driver problems.
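
For a rough sense of the numbers behind the 4GB question, here's a back-of-the-envelope C++ sketch of my own (the 512 MiB carve-out is a made-up figure, just to show the "waste a small portion" option): a 32-bit-only core tops out at 2^32 bytes of physical address space, while Cortex-A15 with LPAE uses 40-bit physical addresses, even though each process still sees a 32-bit virtual address space.

```cpp
#include <cstdint>
#include <cstdio>

int main() {
    const double GiB = 1024.0 * 1024.0 * 1024.0;

    std::uint64_t plain32   = 1ULL << 32;  // 32-bit physical addressing
    std::uint64_t lpae      = 1ULL << 40;  // LPAE: 40-bit physical addressing
    std::uint64_t tabletRam = 4ULL << 30;  // the rumoured 4 GiB of RAM

    std::printf("32-bit physical limit: %.0f GiB\n", plain32 / GiB);
    std::printf("LPAE physical limit:   %.0f GiB\n", lpae / GiB);

    // Without LPAE (or bank switching), whatever the SoC reserves in the 32-bit
    // map for MMIO, carveouts and so on comes straight out of the 4 GiB --
    // the "waste a small portion of your memory" option above.
    std::uint64_t reserved = 512ULL << 20;  // assumed 512 MiB reserved, illustrative only
    std::printf("usable without LPAE:   %.1f GiB of %.0f GiB\n",
                (tabletRam - reserved) / GiB, tabletRam / GiB);
    return 0;
}
```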
 
I wonder how having full OpenGL/DX12 on mobile devices like Android would affect power draw. I'm guessing it would increase power usage for any game developed with tessellation and other advanced rendering effects, but this is a bit of an uneducated guess. Would someone here have any enlightenment on the subject?
 
My guess is that it would still depend on how developers used a full OpenGL/DX12 implementation. Using tessellation only to make rendering more efficient, not to improve detail, could mean a decrease in power consumption. Just a guess. That said, the extra abilities, and maybe even the efficiency gains, would very likely be used to improve image quality, and the result would then probably be an increase in power draw.
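
To put the "tessellation for efficiency" idea into rough numbers, here's a purely illustrative C++ sketch of my own (the vertex size and mesh counts are made-up figures): expanding geometry on-chip from a coarse control mesh moves far fewer vertex bytes over the memory bus than fetching a pre-tessellated dense mesh, and external memory traffic is a large share of mobile GPU power.

```cpp
#include <cstdio>

int main() {
    const long long bytesPerVertex = 32;    // position + normal + UV, for example

    // Dense, pre-tessellated mesh stored in memory: every final vertex is fetched from DRAM.
    long long denseVertices = 1'000'000;
    long long denseBytes = denseVertices * bytesPerVertex;

    // Coarse control mesh + hardware tessellation: only the control points cross
    // the bus; the extra vertices are generated inside the GPU.
    long long coarseVertices = 40'000;      // assumes roughly 25x amplification
    long long coarseBytes = coarseVertices * bytesPerVertex;

    std::printf("dense mesh fetch:  %lld KiB per pass\n", denseBytes / 1024);
    std::printf("coarse mesh fetch: %lld KiB per pass\n", coarseBytes / 1024);
    std::printf("vertex traffic ratio: %.1fx less over the bus\n",
                (double)denseBytes / coarseBytes);
    return 0;
}
```

Whether that translates into lower power in practice still depends on what the saved bandwidth and ALU time get spent on, which is the point above about extra headroom usually going to image quality.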
 