Their current free-to-play attitude is more of a temporary measure and a cover-up, due to the PS2 hardware limitation (i.e. no built-in online parts).
What does free-to-play have to do with an RJ-45/11 PHY not being built onto the hardware? Mind elaborating on that a little more (or perhaps sharing some of that glue with the rest of us)?
Isn't the GPU in the Xbox basically a slightly modified GF3, though?
Wanna show me a GeForce3 with two vertex shaders, an AGTL and a HyperTransport bus?
The BB Navigator software (which lets you download game demos and view new game information on the game channels) also needs the hard disk and the BB Unit.
I forgot about the BB Navigator... My VF4 is the US one so I haven't tried it with an HD.
Of all the consoles, my understanding is that only the Xbox gives you this basically for free.
Considering DTS Interactive is relatively 'free' (4-6% on the CPU), I'm not sure that's much of a big deal... As for the GCN, since it lacks any digital audio output (not including the Q), it's not something that you're going to worry about. At most you'd just mix down your positional audio to DPLII...
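(For anyone who hasn't written one of these: a DPLII mixdown is basically just a matrix encode. Here's a rough, hand-rolled C sketch, not any console SDK's API; the names and coefficients are mine for illustration, and a real encoder also phase-shifts the surrounds.)

```c
/* Illustrative sketch only -- plain C, not any console SDK's API.
 * Crude matrix downmix of 5-channel positional audio into a two-channel
 * Lt/Rt pair in the spirit of a Pro Logic II encode.  A real encoder also
 * applies a 90-degree phase shift (Hilbert filter) to the surround
 * channels, which is omitted here for brevity. */
typedef struct {
    float l, c, r, ls, rs;   /* front left/center/right, surround left/right */
} Frame5;

/* Coefficients (~0.707 on center, ~0.87/0.49 on the surrounds) are the
 * commonly quoted encode values; treat them as approximate. */
static void plii_downmix(const Frame5 *in, float *lt, float *rt, int n)
{
    for (int i = 0; i < n; i++) {
        lt[i] = in[i].l + 0.707f * in[i].c - 0.8718f * in[i].ls - 0.4899f * in[i].rs;
        rt[i] = in[i].r + 0.707f * in[i].c + 0.4899f * in[i].ls + 0.8718f * in[i].rs;
    }
}
```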
Anyway, my impression is that none of the other consoles have audio hardware that's quite at this level of sophistication, but I could be totally wrong.
Well, not in a single chip. You'll get no argument over the relative capabilities of the MCP (I enjoyed my brief time on an XDK and spent many hours on the one in my roommate's nForce system). Now, the most significant things about the APU IMO (since that's the portion of the MCP we're talking about) are the sheer number of channels, DICE, and the setup engine.

In comparison to, say, the PS2 you've got the choice of a couple of output formats (it really depends on how your game leverages the hardware and how wide an audience you wanna target, i.e. more people are going to benefit from Surround or DPLII than from DTS), of which none really put a significant load on the CPU. The IOP pretty much does what the APU setup engine does. As far as signal processing goes, they're fundamentally the same (i.e. setting up banks of audio channels and ping-ponging samples between them). It's just that in the case of SPU2 you've got 2 cores (CORE0, CORE1) with 24 channels each to set up (you can do more via software), whereas with the MCP you've got 8 sets with 32 channels each. So in that aspect the APU pretty much beats the pants off anything out there short of professional gear...
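(If the 'banks of channels' bit sounds abstract, here's a rough C sketch of the sort of bookkeeping I mean. None of these structs or constants are the actual SPU2 or APU programming model -- they're invented stand-ins -- but the work is the same shape on both chips; the APU just has far more voices and its setup engine does the submission for you.)

```c
/* A deliberately generic sketch of "setting up banks of voices" -- this is
 * NOT the real SPU2 or MCP/APU register interface; every name here is an
 * invented stand-in for the kind of bookkeeping described above. */
#include <stdint.h>

typedef struct {
    uint32_t sample_addr;   /* where the (ADPCM) sample data lives   */
    uint32_t loop_addr;     /* loop point for looping voices         */
    uint16_t pitch;         /* playback rate                         */
    uint16_t vol_l, vol_r;  /* per-voice volume                      */
    int      key_on;        /* start (or keep) this voice playing?   */
} Voice;

/* SPU2-ish numbers: 2 cores x 24 voices.  The APU would be more like
 * 8 groups x 32 voices -- same loop, just bigger. */
#define NUM_BANKS   2
#define NUM_VOICES  24

static Voice banks[NUM_BANKS][NUM_VOICES];

/* Walk the banks and "submit" every active voice; on real hardware this
 * is the part handled by the IOP (PS2) or the APU's setup engine (Xbox). */
static int submit_voices(void)
{
    int active = 0;
    for (int b = 0; b < NUM_BANKS; b++)
        for (int v = 0; v < NUM_VOICES; v++)
            if (banks[b][v].key_on) {
                /* write_voice_regs(b, v, &banks[b][v]);  -- hypothetical */
                active++;
            }
    return active;
}
```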
If we look at pain-free devices that compromise the functionality of a PC, they have already been tried and failed (WebTV likely the most notable; it delivered everything it promised and that wasn't close to enough).
Conversely, if you look at the functionality of the PC that compromises many of the pain-free devices we use today, for all its power and flexibility it has failed to replace them...
This has already been tried and backed by tens of millions in marketing, and it was built around an honest 'turn it on and go' product, WebTV. It was significantly cheaper than a PC and still failed to gain widespread acceptance. Why? It compromised functionality. Despite people's prime concern being access to the net, the inability to handle other operations killed any chance WTV ever had in terms of marketplace acceptance.
Well, I can counter your WebTV argument with DoCoMo... However, one fundamental problem with your argument here is that you're comparing a device to a service. So if we compare, say, device to device, I'll use game consoles as a counterargument. Game consoles have pretty much been 'functionally limited' devices used for the sole purpose of playing videogames. Yet the PC has been around doing the same thing for just as long, and some would argue it does a better job (not to mention it offers more options in terms of game expansion and end-user contribution), yet the game console market hasn't died, or failed, but rather flourished (with some arguing at the expense of PC gaming)...
As far as the whole grid computing thing goes, it's not like it can be utilized in every single aspect of all games. However, it does display an awful lot of potential for massively multiplayer persistent online worlds, whether it be an Everquest/Galaxies type, or a persistent online Sims or Starcraft...
When it comes to designing a 3D rasterizer? The i740, when first launched, was supposed to be a high-end part (complete with the price tag) and was supposed to set a new standard in 3D graphics. It failed miserably. How many companies in the world have proven that they can produce a feature-complete 'GPU'? I can only think of a small handful: 3DLabs, SGI, ATi and nVidia.
The i740 was never intended to be a high-end part. You're confusing it with the R3D-1000 and 100 (the i740 was derived from the 100). As for its success or failure, I'd say it did fairly well. It offered comparable performance to its primary competitors (the Riva128 and ZX) with better image quality (and was cheaper), and outclassed pretty much everything else (Rage II, Rage Pro, Rendition, M3D, Voodoo) till Matrox released the G200. Of course there was the Voodoo2, which was in a class of its own (and which cost a lot for one, let alone two for SLI, and didn't offer any 2D either).
As for a feature-complete 'GPU', mind pointing out the specifications for what a 'feature-complete GPU' entails? It would sure be interesting, since Nvidia coined the term 'GPU' with the NV10 (yet still produces 'GPUs' with more 'features'). I guess one could just say that Nvidia is the only one in that category. I guess if you want to be generous you could include Matrox and ATi. SGI has never really made any 'GPUs', and 3Dlabs doesn't either (they've got a lot of high-end solutions which comprise many chips; dunno if any one of them is a 'GPU', though of course they've got those new-fangled 'VPU' thingies)...
(BTW, I kinda forgot that the Wildcat lineup came from Intergraph, and the Oxygen line came from Dynamic Pictures...)
I guess if you want to get a little more broad-minded and generous, you could say Nvidia, ATi, Matrox, 3Dlabs, VideoLogic/PowerVR, Intel, SiS, Via, SGI, HP, IBM, Sun, TI, Fujitsu, Hitachi, Mitsubishi, E&S (are they still around?), and, to be really nice, I'll throw in Sony and Toshiba, have all made significant enough contributions to 3D (at least on the hardware side) to be considered 'experienced'...
Why did I state that an Intel platform would be more difficult to work with? Based on their history I don't see them coming up with a feature-complete part capable of exploiting the HLSL in up-to-date (let alone ahead-of-the-curve) DX revisions.
I'd think the off-and-on animosities between the two would be a bigger obstruction. As for "the HLSL", which would that be? Last time I checked there wasn't a single standard. Hell, I can think of several off the top of my head (Renderman, Pfman, RTSL, ISL, ESMTL, Cg, whatever's in OpenGL 2.0).
They have failed to release a DX7-level part to date; I don't see them pulling out the engineers to accomplish such a task when they specialize in processor-centric applications.
Perhaps because they do quite fine (downright dominant) with the capabilities of their core-logic designs? Again, if it weren't a significant market, why would Nvidia and ATi (Via and SiS, for that matter) be getting so involved with that sector? In case you haven't noticed, the high-end consumer 3D hardware market isn't all that large; it's costly, and not very profitable. At best it's good for pride, and if it's your core business and you're good at it, then it can give you some good 'trickle-down' hardware at the low end and, if the parts are good enough, some business at the high-margin pro level.
As for Intel 'pulling' engineers: perhaps from an existing project, per se. Of course they've got so much stuff going on, I doubt they lack the engineering resources for such a task. In case you haven't noticed, it's not a far reach to go from CPU design to 'GPU' design (one could argue that a CPU is probably more difficult at the logic level). Of course there's the analog side to GPU design as well, but considering Intel's efforts to catch up to IBM in mixed-signal processes, I don't think that aspect would be too difficult. And they've obviously got the software know-how... (And yes, I know 3DR was weak sauce.)
If the design were to fall to Intel, I see them almost assuredly following Sony's design theme (reliance on CPU power to fill in for GPU functionality). Even with their sole attempt to enter the high-end consumer 3D market, they relied on other platform technology to help them cover design issues (the i740 add-in AGP boards had no on-board texture memory... WTF were they thinking?).
Well, one can see Intel focusing on a 'CPU'-centric theme (after all, they've pushed it with their CPU extensions). Also, your ideas about Sony's design are somewhat flawed in comparison. Intel's extensions have relied on using the CPU's execution resources to accomplish all the computation, whereas Sony has gone the route of relying heavily on dedicated execution resources to avoid being CPU-bound (while providing functionality to allow the CPU to have control of and direct access to some of the dedicated functionality).
As for what Intel was thinking with the i740: how about actually utilizing the full spec of the bus? I mean, besides it, the Rage chips were the only other ones to really leverage AGP. And I'm afraid I'm going to have to call bullshit on the lack of texture memory on the i740 boards. I had a Starfighter and it had 8MB of memory. Now, if you specified DME in its DX settings, then it would allocate main memory for textures. You could still DMA textures to the board for execution, however. That had to be allowed for the PCI cards to get any texture data; of course those were weird, as they had 16-24MB of memory (8-16MB allocated for textures) in which half or more was made to believe it was behind the AGP bus... Anyway, at that time DME wasn't so bad, as the differential in memory performance between graphics cards and main memory wasn't nearly as bad as it is today...
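(For those who never owned one of these boards, here's a conceptual C sketch of the distinction. It's not the i740 driver or a real DirectX interface -- every name in it is made up -- it just illustrates the placement policy: DME means executing textures straight out of AGP/system memory, versus DMA'ing them into the card's local 8MB first.)

```c
/* Conceptual sketch of the two texture paths described above -- not the
 * i740 driver or a real DirectX call; all names are invented.  It only
 * illustrates the placement policy: with DME (Direct Memory Execute) the
 * chip textures straight out of AGP-mapped system memory, otherwise
 * textures get DMA'd into the card's local memory first. */
#include <stdbool.h>
#include <stddef.h>

typedef enum { MEM_LOCAL, MEM_AGP } TexPlacement;

typedef struct {
    void        *pixels;     /* where the texel data currently sits */
    size_t       bytes;
    TexPlacement placement;
} Texture;

static bool   dme_enabled;      /* the "DME" toggle in the DX settings      */
static size_t local_mem_free;   /* e.g. 8 MB on a Starfighter, less buffers */

static void place_texture(Texture *t)
{
    if (dme_enabled || t->bytes > local_mem_free) {
        /* Leave it in AGP-mapped system memory; the rasterizer reads the
         * texels over the AGP bus at execute time. */
        t->placement = MEM_AGP;
    } else {
        /* Classic path: DMA the texture into on-board memory. */
        t->placement = MEM_LOCAL;
        local_mem_free -= t->bytes;
        /* dma_upload_to_local(t);  -- hypothetical upload call */
    }
}
```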
I was interested in the game before; now I'm even more so.
I was rather interested in their approach to releasing the game in pieces across 4 discs...
And Archie, seeing that you made me curious about it in that other thread, I blame you if I miss any deadlines in the following weeks.
Sorry?