Xbox 2 coming in Nov-Dec 2005 - Revolution could be stronger

Ender said:
Can Nintendo really compete with MS in a price war?
The question is, does MS want to compete in a price war? MS is interested in not losing a lot of money again, and at least breaking even this time. And Nintendo does have a few billion in the bank as well.
 
First of all, you were initially incorrect to assume that Anandtech was comparing dev kits. This article was written in 12/01, clearly after the platforms had both launched & specifications for both systems were finalized.

I wasn't talking about the Anandtech article when I said that. You didn't post a link to that at the time, remember? I was talking about the EA comments in the article IGN posted back in 2000 regarding GameCube hardware performance. If I recall correctly, that was before they dropped the speed of Flipper, as early devkits were clocked higher than later devkits.

You seem to forget that the unmodified Celeron X-CPU received a downgrade in clock frequency from 800mhz to 733 iirc.

You completely missed my point on downgrading speeds. The article from IGN you posted talked about GC hardware performance a year before it came out, but I'm fairly certain that the performance it talked about was based on the higher-clocked hardware. That's why I mentioned it. Yes, the CPU speed was raised later on, but that wouldn't have much effect on performance when you're talking about hardware lights etc. If it was talking about skinning and animating characters then the CPU speed could have made a much bigger difference; however, that IGN/EA link you posted didn't talk about anything in that area.

You consistently refer to the EA benchmarks, I'm sorry but I do not put as much stock into them as you do. We don't know how efficiently coded that benchmark was to begin with, so why cling to its results as if they were gospel?

Ya know, it's funny to see you argue using some article quoting someone unknown from EA on GameCube hardware performance regarding hardware lights etc, when you then turn around and try to discount more information from EA when it doesn't agree with your opinion.

I'd put far more stock in comparative performance numbers from someone like EA, as they release the MOST multiplatform games compared to any other developer/publisher. Btw, yes, we do know how efficiently EA's benchmark was coded, as they explain it in the article. If I recall correctly it was a SIGGRAPH article.

You can discount this if you want, but you haven't done any multiplatform development, have you? You have no reason to believe they are wrong.

A general chameleon-type code easily transferable to all respective platforms is what EA does, basically.

Find and read the EA SIGGRAPH article; they explain what they did on each platform with regards to optimizing. EA's benchmark for all the platforms was the best one I've seen yet. They did these benchmark tests in an attempt to simulate an environment a game would be running in, complete with AI running in the background (doing what exactly, I don't know), but it's the closest thing we've seen to a real multiplatform comparison with benchmarks.

[edit] The article that I post later on actually has the CPU idle, since they assume AI and other program-oriented tasks would be running on it. However, the actual EA document I'm looking for gave stats on skinning and a few other areas that would involve the CPU on GameCube (due to the fixed-function T&L Flipper uses). [end edit]

Also AFAIR the EA benchmarks had the GC above, or just below the X-Box for whatever it's worth regarding in-game conditions. (correct me if I'm wrong)

The EA article tested poly throughput, skinning, texture throughput, and a range of other benchmarking areas. You're not correct about where it positioned the GC. In most tests the GameCube was hovering above or below the PS2 in performance when the same task was tested.
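For anyone who hasn't seen that kind of document, here's a minimal sketch (emphatically not EA's actual harness; Renderer and Mesh are made-up stand-ins, not any SDK's types) of how a poly-throughput microbenchmark like the ones described could be structured: draw the same mesh for a fixed number of frames, time it, and report triangles per second.

[code]
// Minimal sketch of a poly-throughput microbenchmark. Renderer/Mesh are
// hypothetical stand-ins for a real platform graphics API.
#include <chrono>
#include <cstdio>

struct Mesh { long triangle_count; };

struct Renderer {
    void draw(const Mesh&) { /* would submit the mesh to the platform's graphics API */ }
    void present()         { /* would flip the framebuffer / wait for vsync          */ }
};

double measure_polys_per_second(Renderer& gpu, const Mesh& mesh, int frames)
{
    using clock = std::chrono::steady_clock;
    auto start = clock::now();
    for (int f = 0; f < frames; ++f) {
        gpu.draw(mesh);   // identical workload every frame, so only raw throughput varies
        gpu.present();
    }
    std::chrono::duration<double> elapsed = clock::now() - start;
    return static_cast<double>(mesh.triangle_count) * frames / elapsed.count();
}

int main()
{
    Renderer gpu;
    Mesh mesh{50000};   // arbitrary 50k-triangle test mesh
    std::printf("%.0f polys/sec\n", measure_polys_per_second(gpu, mesh, 600));
}
[/code]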

Which devs. are you referring to exactly? Those with their roots in PC development? What about the plethora of Japanese ones? Put some actual thought behind your statements before making blanket erroneous ones.

Do you think game programmers start out coding on consoles before ever touching any other platform? The majority of developers have some sort of root in PC development. I haven't met a single game programmer in 10 years that wasn't familiar with coding on Intel CPUs at one point or another. After all, it's one of the most widely available CPUs to the public out there.

So, perhaps you should put some thought into your statement before trying to appear like you know what you are talking about.

By elegant I believe you mean efficient.

No, I meant it was an elegant design. Since when has that word meant efficient? Like I said before, there's no real way to tell if it's the most efficient gaming platform unless you've coded on all three. Which you obviously haven't.

How can the X-Box be faster in lighting when on the GC they are done in parallel or simultaneously with other functions? (infinite with specular, or local omni. Negating any possible speed differential) Or shadowing & self-shadowing when they can both be combined in one operation or pass?

Once again, look for the EA SIGGRAPH article. I've been looking for it but haven't had any luck finding it yet. You can see the performance numbers there.

From Gamasutra:

So where in this article does it say that performing the lighting in parallel will make it faster than what Xbox is doing? Or still provide it with less of a performance hit, or better yet, less of a performance hit in an actual game, which is all that really matters?

EMBM can be performed faster upon the GC, although bump mapping is faster when done on the X-Box.

Once again, WHO CARES?!? We've already heard from multiplatform programmers in this forum (such as ERP), and he's already stated how you'll always find specific cases where one hardware set can outperform another at a given feature. Like I said before, what difference does it make if people aren't doing these things in the majority of games? For instance, I've hardly seen EMBM in most console games. Nor have I seen 8 texture layers in most console games.

"Who cares about 8 texture passes?" I'm sure that Capcom, Factor 5, EAD, Retro, & the more proficient Gamecube coders do.

All that really matters (if you want to argue efficient terms) is which console is more efficient at the features most developers choose to use in today's games. Which is exactly why I talk about the EA benchmark article.

And exactly how many components on the X-Box is MS responsible for directly producing? Besides its DirectX API? Do you see how irrelevant that statement you made is?

It's relevant when you want to get into a "which CPU is more efficient" argument, as it looked like that was where you were heading. The point behind my saying that was to head off any sort of argument about single components in the console being more efficient, as the console as a whole is what matters.

What one developer is capable of extracting with 8 textures poly-wise is not representative of what all developers can accomplish. By your logic then how did F5 get 12mpps using 5-6 texture passes? (its listed maximum) It's all about developer proficiency when exploiting the said platform.

Please, now you're grasping at straws! I'm sure there are a few different ways to implement multiple texture layers on the GameCube, but there isn't an infinite number of ways to do it, as they never let you get that close to the hardware.

If you want to bring up Factor 5 and their claims then go ahead. I have trouble believing anything they said, as they seemed to frequently BS performance numbers IMO. Even still, those numbers were from the first Rogue Squadron on the GC, correct? It's not hard, when you have nothing but static, non-animated objects (no characters or skinning), to get the most performance out of a hardware T&L system like Flipper. Too bad he didn't mention the game was running in 16-bit color with FSAA enabled.
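For scale, here's the quick arithmetic on the 12mpps figure quoted above; nothing measured here, just restating the same number per frame at common frame rates:

[code]
// 12 million polys/sec (the quoted Factor 5 figure) restated as polys per frame.
#include <cstdio>

int main()
{
    const double polys_per_second = 12000000.0;
    std::printf("at 60 fps: %.0f polys/frame\n", polys_per_second / 60.0);  // 200000
    std::printf("at 30 fps: %.0f polys/frame\n", polys_per_second / 30.0);  // 400000
}
[/code]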

The DS is Yamauchi's last recommendation/contribution to Nintendo; he admitted this himself, btw. You overstate Yamauchi's importance to Nintendo under Iwata's guidance. (completely unfounded, as many of your statements have been thus far)

I base my opinion on hearing from other developers I've met out of Japan on how different the game companies are culturally. What I see from you is someone that doesn't know one way or the other, or is simply a real hardcore Nintendo fan and only hears what they want to hear. One way or the other, my opinion is based on SOMETHING. I've yet to see what yours is based on, other than articles written on the internet that support your opinion.
 
Zidane... your picture comparisons...

Well, for one, the PSX shots don't show the polygon popping that would exist on a real PSX.

And Half-Life was not a top-notch title at the time when Skies of Arcadia came out... if you're going to show Half-Life, at least use shots from the HD pack. Anyhow, Quake 3 was out by then, or you could just take a PC shot of Final Fantasy 8, or emulated Chrono Cross.

From the PC and DC shots you used, even the N64 looks better...

[screenshots: 01.jpg, sonic8-big.jpg, 3.jpg, perfect242.jpg, rj_fire1_l.jpg, perfect_dark_6.jpg, outtrigger.jpg, 06.jpg, 05.jpg, eternal.jpg, d2-3.gif, 6_g.jpg, external_image2.jpg, 4.jpg, dc_re303.jpg, screen01.jpg, perfect.jpg]

See, any game can have bad pictures. (I couldn't really do a best-example N64 case against a worst-example something else though, as I can't recall any N64 games that basically used all their power on a single object, so I just stuck to similar-looking games.)

Edit:
I guess drag the ones that don't show up to the link bar... particularly the M2 screenshot of D2.
 
thop said:
Ender said:
Can Nintendo really compete with MS in a price war?
The question is, does MS want to compete in a price war? MS is interested in not losing a lot of money again, and at least breaking even this time. And Nintendo does have a few billion in the bank as well.

Well, MS has made some changes regarding that. The whole agreement between IBM/ATI and MS is unknown... but it is said that it is waaaaay better than the MS/Nvidia deal.

I don't think either that MS would be less aggressive. MS wants to be a leader in the biz, and if you look at all the investments they have made, it's safe to say that they are here to stay. I spoke with Allard at X02; we spoke about MS losing all this money... he said that MS was in it for the long run and he thought that MS would get the money back in about 15 years...

Regarding Nintendo...
Sure, Nintendo has some billions in the bank, but they will have to distribute it amongst lots of machines and will have to make some investments soon in their online strategies. I believe that it will be very competitive from the get-go, now that all the machines will be released rather close to each other. Price will be a huge argument for why to buy machine X over Y... and if MS is really serious about establishing themselves... then there isn't anything Nintendo can do. They cannot match MS if they bet on stronger hardware... because they will not be able to take the loss that MS can... I don't see Nintendo doing it...
 
Here's one of the EA SIGGRAPH papers, and they explain what they are doing for the platforms. It seems perfectly reasonable to me that they are doing their best to be optimal on each one.

http://www.cs.brown.edu/~tor/sig2002/ea-shader.pdf

Perhaps your argument regarding EA not doing things in an optimal manner is out of line.

Also, there are a few performance numbers in this document from the other EA performance document I can't seem to find. I'm still looking for it, btw.
 
MS might be thinking ahead, but I think they will only burn so much money before they get out of the business.

And a good price alone won't automatically sell the Xbox 2. The GC is dirt cheap and still doing worse than expected (by Nintendo). In the end it will be about the games and whether they can get into the Japanese market. Even if they make #1 in the USA/Europe, they will never really succeed if Sony/Nintendo have a whole market to themselves. And the Japanese developers won't seriously consider MS either until their consoles are a success in Japan. Unless MS buys them or pays them, that is, of course.

I read that they've got a 10-year plan to get into the Japanese market; sounds like they are not rushing anything.

About Nintendo, I think Nintendo is gonna survive without problems. There are still millions of Mario, Zelda, and Pokemon fans out there. Their first-party titles and exclusives will save them. But I don't expect them to come back big in the near future, to be honest. I think MS is more focused on Sony anyway.
 
Well, for one, the PSX shots don't show the polygon popping that would exist on a real PSX.

Not all PSX games feature pop-up; many boss battles and summons are optimized and feature no pop-up. That is one of them.

And Half-Life was not a top-notch title at the time when Skies of Arcadia came out... if you're going to show Half-Life, at least use shots from the HD pack. Anyhow, Quake 3 was out by then, or you could just take a PC shot of Final Fantasy 8, or emulated Chrono Cross.

From the PC and DC shots you used, even the N64 looks better...

Well, I chose single-model pics to compare model complexity. The point of the comparison was to showcase the PSX's abilities at doing RPGs: that it was good enough that many of the models used in said games can compete with and even exceed those of next-gen RPGs (and in other games; not all next-gen games feature characters with individual fingers, much less individually animated ones, and with rounded body and facial features...).

Though I know the PC and DC are more than capable of running PSX-esque games, the comparison just serves as a testament to the PSX's power, showing that some models present in its games can compete with some on more capable h/w and next-gen consoles. That is, for my favorite genre, the PSX delivered, with visuals that in some camera angles had no IQ problems, and character models that compete with and sometimes exceed modern titles' in animation, polygonal, and texture detail.

As for the PC pics... true, the PC pics were bad. That was on purpose, but it was just for fun, no real comparison there.
 
Quote:
Well, for one, the PSX shots don't show the polygon popping that would exist on a real PSX.


Not all PSX games feature pop-up; many boss battles and summons are optimized and feature no pop-up. That is one of them.

Right... I own Chrono Cross, and on a TV, even in the boss battles, I see popping polys. It's less than normal, but it's still there, and it just destroys the illusion for me.

Oh, and if we want to focus on just single models, well, the Pokemon in Pokemon Stadium 1 and 2 look exactly the same as the ones in Super Smash Bros. Melee, and the ones in Pokemon Colosseum. (Actually, the Pokemon Stadium games look overall better than Colosseum, better effects and everything.)
 
zidane1strife said:
SOA and Grandia 2 weren't polygon-pushing monsters though; they barely bothered. It's nice that some models in Chrono Cross had more raw polygons dedicated to them, but I'd rather have the vastly superior IQ, texturing, etc. of a DC game, so they really don't compete in my book, but everyone is entitled to their opinions.

In certain camera angle movements, the IQ is comparable, albeit lower-rez. I'm also talking about texture detail and animation complexity when I talk about the models.

If you don't mind spoilers, or if you've beaten Chrono Cross already, click on this link. The rez is slightly less than PSOne output, but just save it, click on it (to activate the Win XP image viewer), and zoom a little bit. In any case, the image quality and detail achievable through some angles are showcased in this pic, which is quite faithful to what is seen on an actual TV.
http://img78.photobucket.com/albums/v297/zidane1strife/Destiny.jpg

Compare with....

High-quality DC RPGs:
http://img78.photobucket.com/albums/v297/zidane1strife/skies_of_arcadia_comparison.jpg

http://img78.photobucket.com/albums/v297/zidane1strife/grandia_comparison.jpg

Top-notch PC title of the time...
http://img78.photobucket.com/albums/v297/zidane1strife/half-life_comparison.jpg

and with some of the best-looking upcoming games in the PC arena (of course, the pics for these 2 PC FPSes are some of the worst, but this is a comparison with a PSX game after all... I needed to level the playing field a little :LOL: )

http://img78.photobucket.com/albums/v297/zidane1strife/doom_3_comparison.jpg

http://img78.photobucket.com/albums/v297/zidane1strife/half_life_2_comparison.jpg

Those screens are a bit small, and I haven't gotten that far in Chrono, but that model would certainly be far more detailed than what is typical for the game. But again, these games are pixelated messes with nothing much at all going on in the background; it's nice that they were able to throw around the polys with pixelated textures and next to no IQ. In that game modeling was a priority, especially in those very limited scenes. Now look at Shenmue: since it's on DC, they have certain areas where character models are bumped up massively. Or DOA, where IIRC the cut-scenes were real-time but at a slower fps and the models look just as good as DOA3 on Xbox. Or Berserk's cut-scenes, same situation, they looked amazing. Those games in those situations had marvelous modeling, but it wasn't interactive.

Now as for the PC shots, man, you picked the worst that you could. Doom 3 and Half-Life 2 are gorgeous, but if I were to run them at the lowest quality settings, they would look like trash.
 
Li Mu Bai said:
You seem to forget that the unmodified Celeron X-CPU received a downgrade in clock frequency from 800mhz to 733 iirc.

Actually, the Xbox CPU was upgraded from 600 MHz to 733 MHz. And it's not a Celeron, it's a P3 with half the cache. Just because a Celeron has 128K of cache too doesn't mean they're the same, because they have different designs. That would also make it modified too ;)
 
DopeyFish said:
Li Mu Bai said:
You seem to forget that the unmodified Celeron X-CPU received a downgrade in clock frequency from 800mhz to 733 iirc.

Actually, the Xbox CPU was upgraded from 600 MHz to 733 MHz. And it's not a Celeron, it's a P3 with half the cache. Just because a Celeron has 128K of cache too doesn't mean they're the same, because they have different designs. That would also make it modified too ;)

It's a mobile Pentium 3...
Anyhow, what else is the difference between a P3 and a Celeron besides the cache? I think the mobile P3s were just Celerons with lower power requirements. The only thing I can think of that the XCPU might have that Celerons may not is SSE.
 
Fox5 said:
It's a mobile Pentium 3...
Anyhow, what else is the difference between a P3 and a Celeron besides the cache? I think the mobile P3s were just Celerons with lower power requirements. The only thing I can think of that the XCPU might have that Celerons may not is SSE.

I remember reading it has 4-way something (P3) instead of 2-way something (Celeron). The word escapes me right now.
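I think the word being reached for is (set) associativity. A tiny illustration of the concept, with example sizes rather than the actual XCPU/Celeron figures: the cache is split into sets, each holding a number of "ways" (lines), and more ways per set means more conflicting addresses can live in the cache at once before one gets evicted.

[code]
// Illustrative only: which set an address maps to for a 128 KB cache with
// 32-byte lines, compared at 4 ways vs 8 ways. Sizes are examples, not the
// real XCPU/Celeron cache parameters.
#include <cstdio>

unsigned set_index(unsigned long address, unsigned total_bytes,
                   unsigned line_bytes, unsigned ways)
{
    unsigned num_sets = total_bytes / (line_bytes * ways);   // fewer sets when there are more ways
    return (address / line_bytes) % num_sets;
}

int main()
{
    const unsigned long addr = 0x12345680;
    std::printf("4-way set index: %u\n", set_index(addr, 128 * 1024, 32, 4));
    std::printf("8-way set index: %u\n", set_index(addr, 128 * 1024, 32, 8));
}
[/code]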
 
or DOA, IIRC the cut-scenes were real-time but at a slower fps and the models look just as good as DOA3 on Xbox
Actually, that was just a popular myth. On the DC DOA2 there's some change in the number of light sources, but the actual number of polys in the characters is the same as during gameplay.
 
Ender said:
Regarding the topic.. could Revolution be stronger...

I don't think so. The thing is that if Revolution were stronger, then it would also mean it would be more expensive.. right?
Can Nintendo really compete with MS in a price war? No.. and having stronger hardware would mean a bigger loss for Nintendo if they went to war with MS on price...
They're also talking launch times here. Come out 1+ years afterward and your tech and design can be better for the equivalent money spent, so... More powerful hardware is more affordable, ne? That would be what they're talking about here.

Lord knows Microsoft wouldn't be lowering the price of their console if the PS3 and N5 launch evenly, unless their sales were severely slumping and they needed to spruce things up. (At which point you could start getting echoes of GC/Xbox in reverse.)
 
I wasn't talking about the Anandtech article when I said that. You didn't post a link to that at the time, remember? I was talking about the EA comments in the article IGN posted back in 2000 regarding GameCube hardware performance. If I recall correctly, that was before they dropped the speed of Flipper, as early devkits were clocked higher than later devkits.

Understood. Flipper's clock speed was dropped by 40.5MHz. You made it sound like an "extreme" downgrade, & made no mention of Gekko's 80MHz upgrade initially, when both were in fact altered at the same time & the upgrade was not done "later."

but that wouldn't have much effect on performance when you're talking about hardware lights etc.

I wasn't referring to the GC's 8 standard hw lights, but "a custom" lighting method. Something akin to Light Scattering & Bumped Specularity comes to mind. (any technique that involves the global lights being fetched from a texture, or any other location instead of being computed by the lighting hardware)

If it was talking about skinning and animating characters then the CPU speed could have made a much bigger difference,

Basically any vertex shader operation can be done on the CPU *OR* the GPU, but if it's done on the CPU it obviously uses up processor time and adds the overhead of sending the resultant vertex data to the GPU. On the GC, this must be performed on the Gekko. So the increase in speed would prove beneficial here also.
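To make the cost concrete, here's a minimal sketch of CPU-side skinning of the kind being described, assuming a simple two-bone-per-vertex scheme; the types and weights are illustrative, not any console SDK's. Every vertex gets blended on the CPU each frame and the resulting positions are pushed to the GPU, which is where the extra processor time and bus traffic come from.

[code]
// Hypothetical sketch of CPU-side vertex skinning (two bones per vertex).
// Without a programmable vertex unit, this blending runs on the CPU every
// frame and the blended positions are then handed to the fixed-function T&L.
#include <cstdio>
#include <cstddef>
#include <vector>

struct Vec3   { float x, y, z; };
struct Mat3x4 { float m[3][4]; };            // rotation + translation per bone

struct SkinnedVertex {
    Vec3  position;                          // bind-pose position
    int   bone[2];                           // two bone indices
    float weight[2];                         // blend weights summing to 1
};

static Vec3 transform(const Mat3x4& b, const Vec3& p)
{
    return { b.m[0][0]*p.x + b.m[0][1]*p.y + b.m[0][2]*p.z + b.m[0][3],
             b.m[1][0]*p.x + b.m[1][1]*p.y + b.m[1][2]*p.z + b.m[1][3],
             b.m[2][0]*p.x + b.m[2][1]*p.y + b.m[2][2]*p.z + b.m[2][3] };
}

// Blend every vertex on the CPU, producing a plain position buffer that the
// GPU (or fixed-function T&L) can consume directly.
void skin_on_cpu(const std::vector<SkinnedVertex>& in,
                 const std::vector<Mat3x4>& bones,
                 std::vector<Vec3>& out)
{
    out.resize(in.size());
    for (std::size_t i = 0; i < in.size(); ++i) {
        const SkinnedVertex& v = in[i];
        Vec3 a = transform(bones[v.bone[0]], v.position);
        Vec3 b = transform(bones[v.bone[1]], v.position);
        out[i] = { a.x*v.weight[0] + b.x*v.weight[1],
                   a.y*v.weight[0] + b.y*v.weight[1],
                   a.z*v.weight[0] + b.z*v.weight[1] };
    }
}

int main()
{
    std::vector<Mat3x4> bones = { {{{1,0,0,0},{0,1,0,0},{0,0,1,0}}},     // identity bone
                                  {{{1,0,0,1},{0,1,0,0},{0,0,1,0}}} };   // bone shifted +1 in x
    std::vector<SkinnedVertex> verts = { { {0,0,0}, {0,1}, {0.5f,0.5f} } };
    std::vector<Vec3> out;
    skin_on_cpu(verts, bones, out);
    std::printf("blended vertex: %f %f %f\n", out[0].x, out[0].y, out[0].z);  // 0.5 0 0
}
[/code]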

You can discount this if you want, but you haven't done any multiplatform development, have you? You have no reason to believe they are wrong.

Neither have you, so you can place your unwavering faith in them if you choose. But then any game that exceeds these benchmarks numerically would mean what? That more proficient & efficient code was created & manipulated in a real-time gaming scenario, correct?

The EA article tested poly throughput, skinning, texture throughput, and a range of other benchmarking areas. You're not correct about where it positioned the GC. In most tests the GameCube was hovering above or below the PS2 in performance when the same task was tested.

Strange that their cross-platform games generally aren't a reflection of these tests then, isn't it? Or did you completely ignore my Madden example? I could name quite a number of other EA games if you like. (ROTK, The Sims, etc., etc.)

Do you think game programmers start out coding on consoles before ever touching any other platform? The majority of developers have some sort of root in PC development. I haven't met a single game programmer in 10 years that wasn't familiar with coding on Intel CPUs at one point or another. After all, it's one of the most widely available CPUs to the public out there.

You still fail to mention where the majority of their coding is now done. Or where they've gained most of their expertise. How large is the Japanese PC gaming market again? Within a university-level programming environment, yes. But I was referring strictly to one region, & you still made a blanket statement nonetheless.

So where in this article does it say that performing the lighting in parallel will make it faster than what Xbox is doing? Or still provide it with less of a performance hit, or better yet, less of a performance hit in an actual game, which is all that really matters?

First off, do you know what the words "computationally for free" mean? The fixed T&L the GC utilizes causes real-world performance scenarios to degrade at a much slower rate vs. the other consoles, iirc. You disregard sustainable main memory latency figures, the embedded 2MB DRAM on the Flipper which eliminates memory-bandwidth-intensive Z-buffer accesses, as well as separate ops being collapsible into one pass. I guess ERP or Faf could clarify this for us?
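On the Z-buffer point, here's a rough back-of-the-envelope sketch of the kind of traffic the embedded memory soaks up. The one-read-plus-one-write-per-pixel assumption and the 640x480/60fps figures are illustrative, not measured numbers from any of these consoles:

[code]
// Rough Z-buffer traffic estimate: 640x480, 24-bit depth, 60 fps, assuming an
// average of one Z read and one Z write per pixel per frame (overdraw ignored).
// Purely illustrative arithmetic, not a measurement.
#include <cstdio>

int main()
{
    const double width = 640, height = 480, fps = 60;
    const double bytes_per_z = 3;          // 24-bit depth value
    const double accesses_per_pixel = 2;   // one read + one write (assumption)

    double bytes_per_second = width * height * fps * bytes_per_z * accesses_per_pixel;
    std::printf("approx. Z traffic: %.1f MB/s\n", bytes_per_second / (1024 * 1024));
}
[/code]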

For instance, I've hardly seen EMBM in most console games. Nor have I seen 8 texture layers in most console games.

And somehow this makes them irrelevant features? I was also highlighting these design aspects for those who are doubting the Revolution's potential to be on par with, or more powerful than the Xenon. (a design philosophy which will follow into the Revolution)

It's relevant when you want to get into a "which CPU is more efficient" argument, as it looked like that was where you were heading. The point behind my saying that was to head off any sort of argument about single components in the console being more efficient, as the console as a whole is what matters.

That isn't where I was going; you already conceded which CPU you thought was more "efficient." I happen to agree. (PPC)

Please, now you're grasping at straws! I'm sure there are a few different ways to implement multiple texture layers on the GameCube, but there isn't an infinite number of ways to do it, as they never let you get that close to the hardware.

I never implied that there were (infinite methods). You helped verify my point indirectly; a few ways was all I was attempting to establish.

If you want to bring up Factor 5 and their claims then go ahead. I have trouble believing anything they said, as they seemed to frequently BS performance numbers IMO. Even still, those numbers were from the first Rogue Squadron on the GC, correct? It's not hard, when you have nothing but static, non-animated objects (no characters or skinning), to get the most performance out of a hardware T&L system like Flipper. Too bad he didn't mention the game was running in 16-bit color with FSAA enabled.

Just like I do not accept EA's benchmark claims. (I really don't think it's a question of who's technically more gifted here, as that's fairly obvious.) Although at least they are aware of what the X-Box console is capable of, seeing as how they designed a DivX SDK toolkit for the console. RS3, RL, Crimson Skies, Star Wars: Jedi Starfighter, & Star Wars: Starfighter can be grouped into the same genre, correct? (static, non-animated objects; primarily no characters or skinning) Which game is accomplishing the most from a technical standpoint, despite the fixed T&L?

I base my opinion on hearing from other developers I've met out of Japan on how different the game companies are culturally. What I see from you is someone that doesn't know one way or the other, or is simply a real hardcore Nintendo fan and only hears what they want to hear. One way or the other, my opinion is based on SOMETHING. I've yet to see what yours is based on, other than articles written on the internet that support your opinion.

Here you are assuming I know no programmers, nor have any friends within the gaming media. You're incorrect, but this is irrelevant. Are our arguments both now validated since we both have sources? Of course not, because I'm not calling that into play to prove my point. Performance extractions will always vary based upon the developer's skill at exploiting the hw. Yes, there are a number (not infinite) of ways to implement or emulate effects, dependent upon code manipulation & optimization, that would perhaps defy a conventional standardized benchmark. Shunting of resources, cutting a feature, clever reuse of the color registers, or streaming in order to circumvent an inadequate memory footprint, to name but a few ways. Look what has been accomplished on the PS2 when many had thought it was at its peak.
 
Actually, the Xbox CPU was upgraded from 600 MHz to 733 MHz. And it's not a Celeron, it's a P3 with half the cache. Just because a Celeron has 128K of cache too doesn't mean they're the same, because they have different designs. That would also make it modified too ;)

Modified, yes. For strictly gaming purposes? No. (Referring to the 40 gaming-specific instructions on the 750CXe, in comparison to SIMD.) Btw, was it the GPU that was downgraded then? I know some component was; I must search for the article.

Here's one of the EA SIGGRAPH papers, and they explain what they are doing for the platforms. It seems perfectly reasonable to me that they are doing their best to be optimal on each one.

Did it ever occur to you that their best platform-centric optimal code may not match another developer's best? Just a thought.
 
Fafalada said:
or DOA, IIRC the cut-scenes were real-time but at a slower fps and the models look just as good as DOA3 on Xbox
Actually, that was just a popular myth. On the DC DOA2 there's some change in the number of light sources, but the actual number of polys in the characters is the same as during gameplay.

Alright, if so, they looked much better than in game.
 
Right... I own Chrono Cross, and on a TV, even in the boss battles, I see popping polys. It's less than normal, but it's still there, and it just destroys the illusion for me.

Yeah, there's pop-up in PSX games, but in RPGs, at least, I don't recall seeing pop-up problems (at least with boss and summon characters), and if there were any, they're so subtle as to be virtually imperceptible. I'm quite sensitive to pop-up and I paid particular attention during that boss battle; I thought it was impressive. I also played Front Mission 3 and the pop-up in that game bothered me to no end, and I think it'd have bothered me if it'd been present in the boss battle. It's been like six months since I last played Chrono Cross, but I do recall being as impressed as ever by some of the character models, particularly that one I showed. Oh, as for pixelation, in some models it doesn't occur as long as the camera doesn't get too near to them.

In that game modeling was a priority, especially in those very limited scenes. Now look at Shenmue: since it's on DC, they have certain areas where character models are bumped up massively. Or DOA, where IIRC the cut-scenes were real-time but at a slower fps and the models look just as good as DOA3 on Xbox. Or Berserk's cut-scenes, same situation, they looked amazing. Those games in those situations had marvelous modeling, but it wasn't interactive.

Yes, obviously the DC has better models; after all, it is next-gen h/w, and is thus superior h/w. The point is that while it's indeed superior h/w, many of the models present in its games (and in those of PS2, GCN, Xbox) are surpassed by some of the PSX ones, thus a testament to its own capability. (Heck, even in most next-gen games' cut-scenes you don't see individual finger animation...)

Now as for the PC shots, man, you picked the worst that you could. Doom 3 and Half-Life 2 are gorgeous, but if I were to run them at the lowest quality settings, they would look like trash.

I know. ;)
 
Yes, obviously the DC has better models; after all, it is next-gen h/w, and is thus superior h/w. The point is that while it's indeed superior h/w, many of the models present in its games (and in those of PS2, GCN, Xbox) are surpassed by some of the PSX ones, thus a testament to its own capability. (Heck, even in most next-gen games' cut-scenes you don't see individual finger animation...)

Well, my point was that it's not impressive at all in relation to next-gen graphics. No matter how powerful the Xbox 2 is, if an artist dedicates the original Xbox's full power to 1 or 2 models, there's no way the in-game models of a typical Xbox 2 game will compete. I can further say that if I were to show those super-high-quality models to a discerning gamer, I could have them believe they're from a next-gen title; with the PSX, that model has nothing more to it than raw polygons, there's no IQ, it's very pixelated, it has very low-res texturing, and no one would be fooled into believing it were from the DC or another modern console. (The whole scenario assumes showing people these things in motion, not giving them tiny pics with little detail; detail in pics can be manipulated by a number of factors.)
 