> 99.999% certainty just plain ole DRAM, likely on a quite wide bus.

So it seems pretty much confirmed that Haswell GT3e will have 128 MB eDRAM on-package.
> Understandable at a $50 premium.

A $50 premium for a premium notebook like a MacBook with retina display or a similar device isn't much of an issue; however, Apple didn't have any competitors in the ultra-high-res laptop screen market when they launched. Curiously, there are now some similar laptops out there, Google's Chromebook Pixel and whatnot, so it could have been nice had Intel finished developing the Ivy Bridge GT3e (or whatever it would have been called) anyway. They would have had a bigger market than just Apple a couple of months down the road.
> So it seems pretty much confirmed that Haswell GT3e will have 128 MB eDRAM on-package.

I don't think just integrating DRAM would have made any sense for Ivy Bridge. Unless they are really talking about a GT3 Ivy Bridge, and I don't know if Intel would have been ready for that.
Apple already wanted this to happen with Ivy Bridge for their MacBook Pro with retina display, but apparently they were the only customer, so it got canned. Understandable at a $50 premium.
> 99.999% certainty just plain ole DRAM, likely on a quite wide bus.

LOL, so much for that prediction: http://www.anandtech.com/show/6911/...usiness-haswell-gt3e-to-integrate-128mb-edram
> LOL, so much for that prediction

Egg on my face too.
This is a condensed summary of an article by David Kanter, who digs through the information Intel has released to try to figure out what kind of graphics setup the highest tier of Haswell CPUs will offer. The conclusion seems to be 128 MB of eDRAM, fabricated on Intel's own 22nm tri-gate process.
With the catch that this is all speculation, as there's no official word to either confirm or deny it. It looks to be anyone's best guess thus far, though.
> How much of a stack can you really need on a DRAM die though...? Maybe that helps with construction, I can't imagine you'd need anywhere near the 11+ layers used in modern microprocessors.

Apparently, Intel builds their cap trenches with and within the metal stacks.
> David Kanter is pretty much as reliable a news source as there is in tech.

Kanter's really great, yeah. He also writes excellent articles that are easy to read for basically anyone.
Intel had the i740, which did virtual texturing way back in the early AGP bus days, so there's a little bit of history with block-based schemes for them. Pretty sure the i740 did it all in hardware too, without any driver intervention. I believe the GameCube/Wii works similarly, with the hardware taking care of stuffing its cache with textures, without needing to be fiddled with via game code...
Something similar could have been implemented here.
The implementation in the i740 was surely very crude, considering the extreme age of the device (I remember playing the original Half-Life on one of these puppies!); I was merely referencing the principle.
Also, some sort of texture cache, even a minuscule one, could very well have been part of the i740, enough to hold a small chunk of texels for filtering and burst-transfer capability, as surely the card did not access textures one texel at a time across the bus; that would have been monstrously inefficient and maybe not even technically possible.
No, the i740 still had onboard (local) memory, but it only used it for the frame buffer. All textures are stored in main memory and have to be accessed across the AGP bus. Of course it had an on-chip texture cache, just like every 3D chip, but that's not the topic of this discussion.
My point is, if GT3e only uses the eDRAM for the frame buffer, that'd be pretty boring. However, just using the eDRAM as a "texture cache" (not the same as the traditional on-chip texture cache) is not ideal either, because the relatively small size of the eDRAM would cause a lot of cache thrashing. A better way is to treat the eDRAM and main memory as some sort of "unified" memory, where some parts are simply quicker. The system would have to determine which textures (or which parts of them) are used frequently enough to be stored in the eDRAM, while the others stay in main memory.
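A toy sketch of the kind of "hot set" placement described above, including the thrashing problem when the fast pool is too small. The class name, page granularity, LRU policy, and pool size are all invented for illustration; nothing here reflects how GT3e is actually managed:

```python
from collections import OrderedDict

# Toy model: frequently-used texture pages live in a small fast pool
# (standing in for the eDRAM) while everything else stays in main memory.
# The LRU policy and capacity are assumptions made for this example.

class TieredTexturePool:
    def __init__(self, fast_capacity_pages):
        self.fast_capacity = fast_capacity_pages
        self.fast = OrderedDict()   # page_id -> None, ordered by recency

    def access(self, page_id):
        """Return which tier served this access, promoting the page afterwards."""
        if page_id in self.fast:
            self.fast.move_to_end(page_id)   # refresh recency on a hit
            return "eDRAM"
        # Miss: promote the page, evicting the least-recently-used one if full.
        if len(self.fast) >= self.fast_capacity:
            self.fast.popitem(last=False)
        self.fast[page_id] = None
        return "main memory"

pool = TieredTexturePool(fast_capacity_pages=2)
print(pool.access("skybox"))    # main memory (cold)
print(pool.access("skybox"))    # eDRAM (now resident)
print(pool.access("terrain"))   # main memory
print(pool.access("ui"))        # main memory, evicts "skybox"
print(pool.access("skybox"))    # main memory again: thrashing, pool too small
```

The last access illustrates the thrashing concern: with a working set larger than the fast pool, pages bounce in and out and the eDRAM stops paying off, which is why treating it as unified-but-faster memory with smarter placement looks more attractive than a plain cache.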
> No, i740 still has on board memory (local memory), but it only uses these for frame buffer. All textures are stored in main memory, and have to be accessed across the AGP bus.

Thank you, I know that. I was around when these things actually sat on shelves in shops! (Actually, that's what they mostly did, unfortunately for Intel...)
> My point is, if GT3e only uses the eDRAM for frame buffer, that'd be pretty boring.

Yeah, and as the article I linked to alludes, it's not; instead it's speculated to be a last-level cache for the entire APU, both GPU and CPU, according to Intel papers. Exactly how it is managed may not even be publicly disclosed, although considering Intel may want devs to optimize for their hardware, they will probably reveal its inner workings closer to launch.
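As a rough sanity check on the "more than just a frame buffer" point: even a generous set of full-resolution render targets would leave most of a 128 MB pool unused. The resolution, pixel format, and surface count below are example assumptions, not known GT3e figures:

```python
# Back-of-the-envelope check: how much of a 128 MB pool would frame
# buffers actually occupy? Resolution, 32-bit colour format, and surface
# count are illustrative assumptions, not known GT3e figures.
width, height = 2560, 1600   # a retina-class panel
bytes_per_pixel = 4          # 32-bit RGBA
surfaces = 3                 # e.g. front buffer + back buffer + depth

framebuffer_mb = width * height * bytes_per_pixel * surfaces / (1024 * 1024)
leftover_mb = 128 - framebuffer_mb

print(framebuffer_mb)   # 46.875
print(leftover_mb)      # 81.125
```

With well over half the capacity left idle under even this pessimistic layout, a frame-buffer-only role would waste most of the eDRAM, which fits the speculation that it serves as a shared last-level cache instead.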
I think Dave's articles are great, but personally I find his postings to be very pro-Intel. Which is fair enough, and in a way it makes his articles even better: despite his (as I perceive it) bias, he can still put it aside when doing an actual published analysis. (Insert the thumbs-up emoticon that B3D lacks.)
> Exactly how it is managed may not even be publicly disclosed
Are these pro-Intel postings wrong or less accurate because of the bias you detect?
It may well be that he has better access to technical resources at Intel.
That makes me wonder if they'll use it in a similar way to the eSRAM in Microsoft's Durango. Granted, this is significantly larger than the pool of available fast memory on Durango.
Regards,
SB