Yeah but it would be damn good.
And damn HOT.
I doubt any console maker will choose Larrabee. Console makers don't want to buy chips by IP. Remember the first Xbox and the NV GPU: it was a big failure for M$, and they even sued NV over high chip prices.
Eh... but Intel is already in the GPU market.
Developer familiarity? They are already familiar with it; wasn't that the point from the beginning? If they're really not, then Intel is in a tough situation. If Intel considers the console market at all, it's when they can use cast-off chips for consoles.
By the way, when we talk about a new generation, we have to talk not only about the vehicle but also its payload: the software.
New hardware will be optimally designed for specific workloads, and game consoles in particular are highly cost-sensitive. Some workloads may also be moved to remote servers in the future.
When Kutaragi insisted on dual HDMI or HDMI 1.3 bandwidth, I suspected stereo 3D. Things are certainly moving in that direction.
http://ps3.ign.com/articles/819/819821p1.html
http://www.eetimes.com/news/latest/showArticle.jhtml?articleID=210200055
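For reference, the rough bandwidth math behind that suspicion; the link-rate ceilings are the published HDMI figures, while treating stereo as a simple doubling of the 1080p60 pixel rate is my simplifying assumption:

```python
# Frame-packed stereo 1080p60 needs roughly twice the pixel clock of
# plain 1080p60, which only fits under HDMI 1.3's raised ceiling.

PIXEL_CLOCK_1080P60_MHZ = 2200 * 1125 * 60 / 1e6   # 148.5 MHz incl. blanking
stereo_mhz = 2 * PIXEL_CLOCK_1080P60_MHZ           # ~297 MHz (assumed 2x)

HDMI_1_2_MAX_MHZ = 165.0   # single-link TMDS clock limit through HDMI 1.2
HDMI_1_3_MAX_MHZ = 340.0   # limit raised in HDMI 1.3

print(f"stereo 1080p60 needs ~{stereo_mhz:.1f} MHz pixel clock")
print(f"fits HDMI 1.2 ({HDMI_1_2_MAX_MHZ:.0f} MHz)? {stereo_mhz <= HDMI_1_2_MAX_MHZ}")
print(f"fits HDMI 1.3 ({HDMI_1_3_MAX_MHZ:.0f} MHz)? {stereo_mhz <= HDMI_1_3_MAX_MHZ}")
```

Which is presumably why Kutaragi's other option was dual HDMI: two 1.2-class links also cover the doubled rate.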
The HDTV market is still growing in the US--and it took forever to get it moving. US consumers aren't going to be interested in ditching their new TVs, and such features are a poor selling point to consumers who have no need for them.
The PS3 should have taught us that much.
Just for the record: do you think HDDVD and BD have helped grow the HDTV market?
Considering the sales of HDDVD and BD vs HDTVs? No, they have had almost no effect on the HDTV market.
I think this point is overplayed.
MS wanted the IPs for the 360 so they would have more control over cost reduction: they could put the chips out to bid at various fabs, save money on process node shrinks, and even integrate chips for additional cost savings down the road. iirc MS doesn't own the PPC IP in general, only its specific use in the Xenon chip for the Xbox platform; ditto Xenos: MS owns a license for the Xenos chip and direct adaptations of it, but MS doesn't own AMD's patents or the ability to re-use them in new designs (i.e. a new GPU).
The real question for Larrabee is whether Intel is willing to play ball by offering competitive pricing. Intel has had little reason to do so, but with Larrabee and the GPU market in general Intel may have some incentive to use the next Xbox (or Wii or PlayStation) to get their foot in the door and head off the competition from making inroads into their marketshare and mindshare. And if Larrabee doesn't suck and Intel is willing to be cost competitive, Intel offers something the other partners would struggle to offer: a high degree of assurance of hitting 32nm at launch (for CPU and GPU) and market leadership in reaching the 22nm and 16nm nodes, and possibly beyond. Going with Intel, if a traditional cost reduction schedule is planned, is the safest bet.

Remember, we had a number of cheerleaders here in 2003-2005 talking up how Sony would reach the 65nm node Cell would launch on better and sooner than Intel, etc. Looking at the landscape of the current market, while we hear rumblings of companies trying to make moves on mass production of advanced process nodes, the sure money really is on Intel here. Just look at the 45nm node: Intel reached it in Q4 last year for production parts and AMD is getting there now... and the consoles are struggling to get to 65nm, with TSMC somewhere in-between for cutting-edge products. There is a good chance Intel will be releasing 32nm chips before a single 45nm console chip is released.
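To put the shrink economics in concrete terms, here is a rough back-of-envelope sketch; the ideal area scaling is textbook, but the die size and wafer cost are purely hypothetical numbers picked for illustration, not figures for any real chip:

```python
# Idealized die-shrink economics: die area scales with the square of the
# feature-size ratio, so more dies fit on a wafer and cost per die drops.
# The 180 mm^2 die and $5000 wafer cost are made-up illustrative numbers.
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Crude estimate ignoring edge loss and defect yield."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    return int(wafer_area / die_area_mm2)

die_at_65nm_mm2 = 180.0   # hypothetical console chip at 65nm
wafer_cost_usd = 5000.0   # hypothetical processed-wafer cost

for node_nm in (65, 45, 32):
    area = die_at_65nm_mm2 * (node_nm / 65) ** 2   # ideal scaling
    n = dies_per_wafer(area)
    print(f"{node_nm}nm: ~{area:.0f} mm^2, ~{n} dies/wafer, ~${wafer_cost_usd / n:.2f}/die")
```

Real shrinks do worse than this (I/O and analog portions don't scale linearly, and leading-edge wafers cost more), but it shows why being first to 45nm/32nm matters so much for console cost reduction.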
If Intel is willing to be cost competitive to get the Larrabee architecture to market, to garner developer experience and support, and to stave off NV/AMD GPUs, there are some big advantages to going with Larrabee if it is performant. Intel could offer some interesting options in regards to higher clocks and/or lower heat, more transistors yet in a smaller die area, the ability to move multiple chips to the next process node at the same time, or even migrate a multi-chip solution (e.g. 2 x LRB v.2) into a single chip at the necessary process node. If Intel & "Partner" decide to go with a LRB solution that can be future-looking/cross-market (e.g. 2 "standard" LRB v.2 chips at launch), Intel could use those chips for PCIe boards, and the single-chip shrink could again be used for a midrange product in addition to the console part (hence potentially some binning). The power consumption of LRB is probably too high for a laptop, but who knows what kind of arrangement can be made if they decide to tack on a couple of OOOe cores (single-die CPU/GPU solution for midrange PCs?).
All rambling speculation, but I wouldn't count Intel out. They have a lot to lose. Unlike Cell, LRB has an established market (GPUs) to get a foot in the door (hence one reason why Cell based cards never took off). Intel absolutely wants to prevent NV and AMD from "moving the ball" into their court, so LRB is an important investment to extend the x86 platform and prevent others from eroding their marketshare. Even better if Intel can move into the GPU market and capture more of the HPC market (basically Intel & x86 everywhere, from Atom to multi-array LRB super clusters). Getting into consoles would ensure developer familiarity and the advancement of tools that work with the platform.
Of course, LRB could end up being slow, too large for its performance at that, and cost an arm and a leg, while NV/AMD offer cheaper, faster solutions that are great for graphics and physics.
Strictly speaking, all you need for stereo 3D (on TVs) is a refresh rate high enough to be tolerable. I believe Samsung is using 120Hz for their '3D enabled' sets, and that's already a feature that is getting marketed for other reasons - hence I suspect in a few years the majority of new sets sold will be 120Hz (or higher).
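The arithmetic behind the 120Hz figure, assuming frame-sequential delivery with shutter glasses (my assumption; checkerboard and other formats divide the pixels differently):

```python
# Frame-sequential stereo alternates left/right images, halving the
# effective refresh per eye, so the panel rate must double to keep
# each eye at a comfortable rate. The 60Hz floor is an assumption.

MIN_COMFORTABLE_HZ = 60
EYES = 2

for panel_hz in (60, 120):
    per_eye = panel_hz / EYES
    verdict = "fine" if per_eye >= MIN_COMFORTABLE_HZ else "flickery"
    print(f"{panel_hz}Hz panel -> {per_eye:.0f}Hz per eye ({verdict})")
```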
It would be interesting if someone could make an extremely light VR HUD with head tracking, as that could take the interactivity, as well as ease of use in some designs, to a new level. But how do you make having goggles on look fun and cool?
Of course, you too. Even your favorite Intel shows it does actually care about applications, in the article I quoted...
Since 2007, studios have released or put on the drawing board as many as 80 stereo 3-D movie titles. At the Intel Developer Forum Wednesday (Aug. 20), Dreamworks co-founder Jeffrey Katzenberg said all his studio's animated movies starting next year will be created and available in stereo 3-D, a shift he said was as significant as the transitions to talkies and color.
The only VR that I've seen that was actually convincing worked as augmented reality (overlays on the real world). Possibly because having a real-world anchor distracted from the (rather ugly) rendering quality, and possibly because being able to physically move around the environment without wearing a suit adds something to the experience, akin to motion control schemes.
Either way, while the goggles themselves were very thin and light, IIRC the system used sensors/projectors or something around the room, which doesn't seem like something fitting for a consumer product anytime soon.
I'm aware of that, but this matters the most: cost reduction. M$ learnt its lesson. Of course, with the Xbox they just wanted to establish a presence in the console world. They are still trying to get into living rooms via consoles, but they are far, far behind Sony. Of course, this is another topic.