"silentguy is right. The market is really hot for anyone with experience in graphics or low power. I'm sure other areas are hot as well."
It doesn't have to be that specific, if only because everybody should be low power oriented already by now.
"everybody should be low power oriented already by now."
Video, fabrics, CPU architecture, caches, signal theory, overall SoC expertise, etc.
Qualcomm in particular is in a good position to pick up AMD employees. They have a Markham office not a Tim Horton's away from AMD's, with plenty of open positions. Meanwhile, it was recently pointed out to me that they also have an Orlando office with open positions in graphics (AMD's Florida location houses the ArtX guys).
My group currently has no budget, but if there were someone available who was truly "elite", we'd hire them.
Elite people rarely change jobs, and even the best teams have too few of them.
What is the benefit/profit? Workstations with the performance of a phone or tablet?
This is ridiculous. First of all, we will lose the beauty of high-power beasts; second, you will need several generations to catch up to the old performance; and finally, there will always be people who need maximum performance at any cost.
Low power = less work performed in any given period of time, or pure waste.
He didn't mean that we should give up on 100W chips. He meant everybody should be focused on power efficiency.
If you don't use low power design techniques on a chip that has a 100W budget, your competitor will make the same chip with a 50W budget.
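(An aside from me, not from any poster in the thread: the usual first-order CMOS dynamic power relation, P ≈ α·C·V²·f, is one way to see why the same logic can often fit in roughly half the budget. The sketch below uses made-up numbers chosen only to land near the 100W example, and it ignores leakage entirely.)
[code]
# Rough first-order CMOS dynamic power model: P ~ alpha * C * Vdd^2 * f.
# Every number here is an illustrative placeholder (chosen to land near
# the 100 W example above); leakage power is ignored entirely.

def dynamic_power(alpha, c_eff, vdd, freq):
    """Switching power in watts: activity factor alpha, effective switched
    capacitance c_eff in farads, supply vdd in volts, clock freq in hertz."""
    return alpha * c_eff * vdd ** 2 * freq

baseline = dynamic_power(alpha=0.15, c_eff=4.63e-7, vdd=1.20, freq=1.0e9)

# Voltage/frequency scaling: drop supply and clock by ~20% each.
# Power falls roughly with the cube: 0.8**3 = 0.512 of the baseline.
scaled = dynamic_power(alpha=0.15, c_eff=4.63e-7, vdd=1.20 * 0.8, freq=1.0e9 * 0.8)

print(f"baseline ~{baseline:.0f} W, scaled ~{scaled:.0f} W "
      f"({scaled / baseline:.0%} of baseline)")
[/code]
Real low power design is much more than voltage/frequency scaling (clock gating, power gating, microarchitecture all matter), but the quadratic voltage term is why leaving those techniques on the table is so costly.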
Will GCN be the last great architecture for desktops from AMD?
No.
"You know with AMD doing so badly and everybody jumping ship, you really have to wonder what is so wrong with Nvidia that they can basically only get a tie on performance while being months late."
This is absurd.
The one time Nvidia doesn't make a massive monolithic die of doom, they get jumped on for "only tying" AMD. Let me also point out that most of the "abandon ship" has been happening well after the release of the 7000 series.
In reality, Nvidia's position has improved relative to AMD's. Power efficiency is much better, perf/mm^2 is better, their microstutter is drastically reduced...
But no, because their top chip is smaller than AMD's, there's "something wrong" with them.
In reality, the only thing that's wrong here is the argument you've just made.
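(For readers new to the shorthand: perf/W and perf/mm^2 are just ratios of a performance figure to board power and to die area. A toy calculation, with invented placeholder numbers rather than benchmark results:)
[code]
# Toy perf/W and perf/mm^2 comparison. The figures are invented
# placeholders purely to show how the two metrics are computed.

cards = {
    # name: (average fps, board power in watts, die area in mm^2)
    "hypothetical_card_a": (60.0, 200.0, 300.0),
    "hypothetical_card_b": (58.0, 250.0, 365.0),
}

for name, (fps, watts, mm2) in cards.items():
    print(f"{name}: {fps / watts:.2f} fps/W, {fps / mm2:.3f} fps/mm^2")
[/code]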
Where is power efficiency better? As usual "enthusiasts" are stuck looking at the top cards and ignoring the rest of them. Care to show me how Nvidia is winning the efficiency stakes at the low end or midrange?
I think GK107 is great, and GK106 isn't terrible either.
Of course AMD does well too. They both make efficient GPUs, and there's not much flame material unless you want to nitpick here and there.
I think we can order the low-to-midrange products this way, and the ordering holds for both power use and performance:
GTX 650 < 7770 < GTX 650 Ti < 7850
If one vendor sucked and the other didn't, we'd see a card that is both more power-hungry and slower than the competition, as has happened in the past.
(I happen to think the GTX 650 and 650 Ti are appealingly power efficient.)
So Kepler wins in perf/watt at the enthusiast end when AMD saddles their GPU with a crapload of compute and a memory bus + memory that is 50% bigger than it needs to be? Jeez, I'm so impressed!
AMD could barely be in a worse state, yet they are still ahead of Nvidia's "great" Kepler architecture where it matters. That should make you and rpg wonder just how godawfully bad Nvidia is.
Whatever helps you sleep at night. Meanwhile, in the real world, Nvidia's raking in the dough because AMD flopped terribly with OEM wins.
You aren't thinking rationally. Your post is saturated with confirmation bias, and you have no business partaking in rational discourse until you can demonstrate an ability to critically think.
Why are you unimpressed? Because you want to be. Because you want AMD to win, and because you want Nvidia to lose. So you come here and post a ridiculous argument with no factual backing in order to paint Nvidia as the loser.
This isn't the Reddit hardware hivemind. This is Beyond3D. Your drivel does not belong here, and the majority of us around here are just going to roll our eyes at you.