Comparing the price of a standalone graphics card to a graphics card inside a console doesn't tell us much, because there are too many dissimilarities.
For a start, the equivalent GPU in a console would be integrated onto the motherboard, so it shares PCB expense with the rest of the system.
The only data of real value are: power consumption (an idea of heat output and the kind of cooling a console enclosure would require), die size (how many dies fit on a wafer, giving some indication of the manufacturing cost per die), and performance (to confirm it meets the prerequisites for being "next-gen" worthy).
Shrinks, optimizations, etc. can then be speculated on if said graphics card were proposed for a console. A good reference point is the power/performance scaling between the current and previous generations, plus the cost speculation as per the first post.
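To put a rough number on a shrink, here is a hypothetical sketch of idealized die-area scaling by node ratio. The `shrunk_area` helper and the 28nm target node are my assumptions, not anything from the original discussion, and real shrinks rarely hit the ideal factor (I/O pads, analog blocks and memory interfaces scale poorly), so treat the result as an optimistic bound.

```python
def shrunk_area(area_mm2: float, old_node_nm: float, new_node_nm: float) -> float:
    """Ideal area after a process shrink: linear dimensions scale with the
    node ratio, so area scales with its square. Ignores real-world scaling
    losses from pads, analog and SRAM, so this is an upper bound on savings."""
    return area_mm2 * (new_node_nm / old_node_nm) ** 2

# A ~170 mm^2 die at 40nm, shrunk to a hypothetical 28nm node:
print(round(shrunk_area(170, 40, 28), 1))  # -> 83.3 (ideal case only)
```

In practice a real 28nm port would land somewhere between that ideal figure and the original area, which is why the scaling factors of previous console generations are a more useful reference than the raw math.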
In any case, a 5770 with 800 stream processors and a 128-bit memory controller would be great. Or would it?
Current essential specs at 40nm:
Die size - 166 to 185 mm² (depending on which site you read)
Power - 100-120 watts at load, 30-40 watts at idle (card alone)
Average system consumption at load - approx. 170 to 220 watts (varies site to site)
Performance - mid-level, lacks tessellation performance, and has already been superseded by a newer, more efficient architecture (VLIW4).
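Taking the die-size figures above, a gross dies-per-wafer estimate can be sketched with a common approximation formula (wafer area over die area, minus an edge-loss term). The 300 mm wafer diameter is standard for 40nm production; the function name and the omission of scribe lines and defect yield are my simplifications, so these are candidate-die counts, not foundry numbers.

```python
import math

def gross_dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Approximate candidate dies per wafer: usable wafer area divided by die
    area, minus a term for partial dies lost at the wafer edge. Ignores
    scribe lines and yield, so the real good-die count is lower."""
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

# Using the 166-185 mm^2 range quoted above:
for area in (166, 185):
    print(area, gross_dies_per_wafer(area))  # roughly 330-375 candidates
```

The spread between the two quoted die sizes is worth tens of dies per wafer, which is why the exact figure matters for any cost-per-die speculation.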
Even then, is the 5770 powerful enough for a next-gen console in, say, 2012?