Equal parts Year in Review, psychiatric therapy for the deeply disturbed on the cheap, and love notes to the class cuties...yes, it's once again time for Beyond3D's "Bricks & Bouquets", 2007 edition!
That said, I have a lot more faith now that Matt Pharr and the crew from Neoptica are there to fix things up. Hell, if Matt Pharr tells me that pure raytracing is the future, I might actually consider it. For now, I'm sticking with the consensus reached at the end of that epic thread.
The 8800GT is an excellent *product*. But offering essentially the same SKU at a lower price point thanks to manufacturing and marketing prowess isn't the type of technological progress I'm looking for.

How does a budget part not demonstrate progress? Are we seriously of the mind that "if it's not faster, it's not better"? That's crazy; there is much to be said for significantly lowering the price of entry to high levels of performance.
Also, why would Intel coming into high end graphics be something bad?

It isn't bad at all! But... since Intel has been historically weak in this area, the most likely explanation for this outcome would be screwups by the current leaders. I don't care who wins, but screwups are thoroughly undesirable.
What would be bad is if Nvidia was the only company in high end graphics; that would be hell on consumers. At least Intel would put much pressure on Nvidia, just hopefully not to the point of destroying the company.

My point was that the progress in the graphics field (and its derivative children) has, historically, made Moore's Law type projections laughable. If the creative talents who established that trend can't carry the torch, Intel will take over by virtue of brute force if nothing else.
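To make the "Moore's Law type projections" comparison concrete, here is a minimal arithmetic sketch of compound doubling. The 24-month cadence is the classic Moore's Law figure; the 12-month cadence is purely a hypothetical stand-in for faster historical graphics performance growth, not a measured industry number.

# Illustrative arithmetic only: compound growth under two assumed doubling periods.
# 24 months is the classic Moore's Law cadence; 12 months is a hypothetical,
# faster cadence standing in for historical GPU performance growth.
def growth_factor(years, doubling_period_years):
    """Multiplicative growth after `years` given a doubling period."""
    return 2.0 ** (years / doubling_period_years)

horizon = 5  # years
print(f"24-month doubling over {horizon} years: {growth_factor(horizon, 2.0):.1f}x")  # ~5.7x
print(f"12-month doubling over {horizon} years: {growth_factor(horizon, 1.0):.1f}x")  # 32.0x

Even a modest difference in doubling cadence compounds into a large gap over a few years, which is the sense in which straight-line Moore's Law extrapolations undersell the historical pace of graphics hardware.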
Also, why would Intel coming into high end graphics be something bad? What would be bad is if Nvidia was the only company in high end graphics; that would be hell on consumers. At least Intel would put much pressure on Nvidia, just hopefully not to the point of destroying the company.
Also, Rys's take on Crysis makes me wonder what game he was playing that he mistook for Crysis. Derivative physics and graphics? No game out there comes close in either area. The physics in Crysis are outright amazing, and the graphics on medium match every other game, let alone the high settings.
How can they compete against a company that will offer unbeatable process and power reduction technology?

Are you SURE we are talking about the same company? I think people need to stop taking Intel's marketing statements so literally... (for example, in the Larrabee timeframe, Intel will be at a process density disadvantage).
Yep, I loved getting stuck on a rock in the opening swim to the beach. Great physics! It has some cool stuff... but it's also blatantly rough and buggy beyond my imagination. Watching other users' videos of the physics, and especially the AI, indicates this sort of issue isn't something only Scotty and I have hit. Games have bugs and limitations, but as great as some parts of Crysis' physics are (destructible foliage and buildings, component damage on vehicles, cinematic physics, etc.), I don't see enough tempering from the fans about the blatant show stoppers. Having to restart the game two minutes in due to a physics bug is pretty substantial in my book and worth noting. But then again, maybe others don't have any of these problems?
I really don't see how anyone can put much weight on Intel's past in the GPU market. Whatever they do next can't be the same approach or they'll simply flop, so that history is useless as a guide...
What I'm trying to understand is what's leading you to believe that Intel has any intention of going head-to-head with either nVidia or ATi in the discrete 3D graphics market. Intel has only ever been in the integrated GPU market, after all.