SCEI & Toshiba unveil 65nm process with eDRAM

LOL, that's a good one

Seriously though, do you have any screenshots that actually show a current build of Primal, ones that look better than Tomb Raider 1 for the PSX? I understand that early development shots of the game look extraordinarily poor, but that is a given for any system.
Well, the demo of that 'extraordinarily poor'-looking game looks and runs (from a technical point of view; the art is rather sucky and generic) much better than anything of that kind I have on my PC right now, so I don't know what to say :\ There are very respectable amounts of geometry and lighting being thrown around, and the texturing is not half bad. Mind you, there are games that look way better than that (like the mentioned BG:DA), but they are of an entirely different kind and not so 'free roaming'.

Btw, as you said, Mafia needs 512MB to work well, and that has little to do with the actual graphics card. I already mentioned that as something where the PC will show an unavoidable advantage.

No, my PC is not OEM, but I guess I don't have that magic touch for building killer configs :)

The DC doesn't use a Kyro. AFAIK, the DC does not support Dot3.
I remember an engineer who worked on that chip posting here and explaining how it could do DOT3 in two passes.
 
Dreamcast's bumpmapping was done with a dot product, but it wasn't DOT3. They called it perturbed normals bump mapping.
 
SilentHill3_ps2_12.jpg


soul_screen006.jpg

^bad GS captures

We all knew the PS2 sucked at textures two years ago, so why are we still talking about it today?
 
Ben,
If you have VSync on then you are pretty much always landing framerates quite a bit lower than you should, particularly running nV hardware (in game with heavy action, too).
I am well aware of how VSync affects framerate, but it's mostly irrelevant if the game doesn't have a framerate that's all over the place to begin with.
A game skipping between 20-30fps is considerably more playable, imo, than something that runs arbitrarily between 60 and 5, VSync disabled or not, even though they might have the same average fps.
I did use to force VSync off when I still played shooters online, but that was a long time ago.

I'm not quite getting the part I bolded. Are you saying that using ani in conjunction with trilinear would work better than tri alone, or are you saying that ani w/bi would work better than tri alone?
I was just thinking that, performance wise, ani w/bi could be more consistent than trilinear, and potentially faster (rough tap counts sketched below). But this is just rampant speculation right now; I haven't seriously looked into it in a while.
Either way, GF1 aniso is limited to 2x, which certainly wouldn't be much of a big deal to use, but then it's not much of a visual improvement either.
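
Roughly, the per-pixel tap counts work out like this (textbook numbers I'm assuming here, not anything measured on a GF1):

# Rough per-pixel texture tap counts (assumed textbook figures,
# not measured GF1 behaviour):
taps = {
    "bilinear": 4,             # 2x2 footprint from one mip level
    "trilinear": 8,            # 2x2 from each of two adjacent mip levels
    "2x aniso w/ bilinear": 8, # two bilinear probes along the line of anisotropy
}
for mode, count in taps.items():
    print(f"{mode}: {count} taps/pixel")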

Btw, the DC supported "perturbed normal bump mapping", which is something of an alternative equivalent to DOT3 - it still calculates a dot product 'per pixel'.
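
For anyone curious, both approaches boil down to the same per-pixel math (a minimal sketch, obviously not actual DC or GS code):

# Minimal per-pixel diffuse term, the kind of thing a DOT3 combiner
# stage computes (illustration only, not DC/GS code):
def dot3_diffuse(normal, light_dir):
    n_dot_l = sum(n * l for n, l in zip(normal, light_dir))
    return max(0.0, n_dot_l)  # clamp away back-facing contributions

# A normal tilted 45 degrees from the light direction:
print(dot3_diffuse((0.707, 0.0, 0.707), (0.0, 0.0, 1.0)))  # ~0.707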
 
Why do you always compare supersampled, high-res output (probably using non-downsampled textures) from console development kits to PC screenshots? Nobody challenges that consoles beat PCs significantly in application-specific performance/cost. But especially with Mafia (and other PC games) WYSIWYG applies, while the Getaway demo looks nowhere near that on my PS2. This is the major gripe I have with that console: output from the PAL version looks plain bad on big screens (using a Panasonic 36-inch widescreen TV). On my brother's old 4:3 Sony the difference between PS2 and Xbox is much smaller, and most PS2 games tend to look better than on my TV (contrary to everything else: Dreamcast, GameCube, Xbox, DVD, ...). Also, PS2 <-> PC ports tend to have technical problems in either direction, which should not surprise anyone given the different architectural approaches and the limited amount of money spent on ports (just remember the FF PC ports... o_O ). I will only be with Sony when they address their analog signal quality issues with a PAL PS3.
 
Faf-

I am well aware of how VSync affects framerate, but it's mostly irrelevant if the game doesn't have a framerate that's all over the place to begin with.

What I'm saying is that it does affect framerate rather seriously (well, 15%-30% is common) on nV hardware whether or not there are serious fluctuations in framerate (it has a negative impact on all hardware, but it always has seemed to screw up the nV parts more). Cap Counter-Strike @100FPS (I believe it is by default anyway) and compare framerates with a smoke grenade going off (not average, instant) on an NV1X part, as a general example.
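
The mechanics, simplified (assuming plain double buffering at a 60Hz refresh): miss a vblank by a hair and you lose far more than a hair of framerate.

import math

def vsynced_fps(frame_ms, refresh_hz=60):
    # With VSync on, a frame that misses a vblank waits for the next
    # one, so the rate snaps to refresh/1, refresh/2, refresh/3...
    interval_ms = 1000.0 / refresh_hz
    return refresh_hz / math.ceil(frame_ms / interval_ms)

print(vsynced_fps(16))  # 60.0 -- just makes the vblank
print(vsynced_fps(17))  # 30.0 -- misses by ~1ms, loses half the framerate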

Marco-

Look at the screenies that Chap posted; do people still want to say Gamespot doesn't seriously screw up their screenies? ;) It may look incredible, but the screenshots that were linked to look like complete crap.

No, my PC is not OEM, but I guess I don't have that magic touch for building killer configs :)

What are you running for hardware? (CPU, vid card, mobo, RAM) If you are in the GHz range with an NV15 or better, there should be numerous games that you can play that will run and look very nice.

Chap-

The top screenshot you posted is a pre-render, not real time (it looks like 6x-8x AA is applied, though it's hard to tell with the compression artifacts), although it does display how badly Gamespot hoses their screenies quite nicely :)

Btw, as you said, Mafia needs 512MB to work well, and that has little to do with the actual graphics card. I already mentioned that as something where the PC will show an unavoidable advantage.

And as I stated, there is no way in hell the PS2 could handle Mafia without compromises ;) I also mentioned the XBox wouldn't either, but people didn't seem to get offended at that for some odd reason :)
 
pinky,
PS2 output sucks on composite but it is pretty good on component.
VF4 EVO is running progressive. More 480-progressive games are coming for PS2, IF we are to believe faf & archie's "simply flip a switch to turn on progressive for PS2" story.

ben,
That SH3 shot is from a realtime cutscene, if I am not wrong. There is a video somewhere on the Konami site.

Guys,
try to play Hitman 2 on PS2; I heard the PC <-> console graphics parity is quite good.
 
I was playing Hitman 2 for a few weeks on my PC before I rented the Xbox version which looked pretty crappy compared to the PC version.
 
Well I know I have some interesting sleeping issues, but apparently all of you do too!

I'll be so bored when I finally beat Golden Sun :-? (which I dislike the more I play it)

edit: in case you're wondering why, it's because the script was written and targeted for retarded monkeys (phrase of the night!).

"What do you think Isaac, should we fight the bad guy?!?!?!?"
Yes - "Alright, Isaac, lets go fight the bad guy!!!!!!"
No - "Forget you Isaac, lets go fight the bad guy!!!!!!!"

And those stupid Yes/No icons make me want to throw my GBA against the wall. It's a pretty hard FF bite-off written for retarded monkeys, but it's giving me my fix until Squenix validates my purchase with FF IVht (I have faith).

So.. out of curiosity.. wtf were you guys doing up from 1am - 6am EST?! :p
 
No, the theoretical limits of the GF1 aren't all that impressive. Neither are the PS2's.

The PS2's theoretical b/w limit for the GPU is significantly higher than what the GeForce FX is said to have....
The PS2's theoretical fillrate limit is about half that of the GeForce FX... even though the FX is coming nearly 3 YRS later...
The PS2's theoretical geometry limit is about a fifth that of the FX... even though it's 3 YRS later, and at a doubling every 6 months for PC GPUs (according to nvidia) it should already be at least an order of magnitude above the PS2.... (we can't say that about the GF1 now, can we?)
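
Quick sanity check on that pace claim, with rough numbers:

# One doubling every 6 months (nvidia's oft-quoted pace) compounded
# over the ~3 years between the PS2 and the GeForce FX:
months = 36
factor = 2 ** (months / 6)
print(factor)  # 64.0 -- well past an order of magnitude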

Well then I guess the GeForce FX is not impressive for its theoretical perf limits... only the DX9 features are impressive....


Sure it lacks the h/w features... and time does take its toll on h/w... but if I had come here a few yrs ago saying the PS2 would exceed the b/w of PC GPUs coming 3 yrs later, and that its peak fillrate wouldn't be that much lower than said GPUs... heck, that its geometry wouldn't be surpassed even by an order of magnitude in 3+ yrs... I would probably have been banned for apparent stupidity....

Let's see the GF1 handle 500,000 polys at 60fps....

PS: (if the PS2 had been 1000x the PSone instead of 100x... it would still eclipse even the GFX in most theoretical limit areas...)
 
The top screenshot you posted is a pre-render, not real time (it looks like 6x-8x AA is applied, though it's hard to tell with the compression artifacts), although it does display how badly Gamespot hoses their screenies quite nicely
That screen is actually realtime rendered (but obviously supersampled; the actual game will not be antialiased like that)
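
(For anyone wondering, "supersampled" here just means rendered at several times the target resolution and then averaged down. A minimal sketch, assuming a plain box filter; a dev-kit grab may well use a fancier one:)

# Box-filter downsample of an n-times supersampled grayscale buffer
# (plain box filter assumed; real tools may use a better filter):
def downsample(src, out_w, out_h, n):
    src_w = out_w * n
    out = []
    for y in range(out_h):
        for x in range(out_w):
            block = [src[(y * n + j) * src_w + (x * n + i)]
                     for j in range(n) for i in range(n)]
            out.append(sum(block) / (n * n))
    return out

# A 4x4 render averaged down to 2x2 (2x supersampling per axis):
print(downsample([1.0] * 16, 2, 2, 2))  # [1.0, 1.0, 1.0, 1.0]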

Look at the screenies that Chap posted; do people still want to say Gamespot doesn't seriously screw up their screenies? It may look incredible,
Well, it doesn't look *incredible*, but very solid for such a game (I've read it's kind of free-roaming; the demo was too short to really experience that, though)
 
I have no idea how this thread turned into a "texture debate", but as Marconelly! already pointed out, it should be clear that a PC has a great advantage when it comes to texturing. No surprise, Mafia does feature some very nice textures, but with all due respect, from what I can judge, that game doesn't look as good as those screens would lead you to believe. I'm not quite sure on what graphics settings I've seen the game, but it's probably neither the highest nor the lowest, realistically something in between. The outside levels range from impressive to fugly looking, IMO. The Getaway, while perhaps not as impressive if we're just looking at the texturing, certainly is quite impressive when taking everything into consideration. For instance, I was quite surprised by the number of cars you'll find on the street in the demo; also, cars seem to reflect quite a lot, while being quite realistic in terms of handling etc. In Mafia (correct me if I'm wrong), it was evident that the cars don't reflect anything real, just a fake image. This *may* be different on a higher setting, though, so if this is the case, please correct me.

About the argument of what a GeForce can handle in comparison to the PS2 - why are we comparing Mafia screens? I kinda missed that, and considering this game doesn't even run on the highest graphics setting on a GF2 with a decent CPU (my friend's PC runs at 800 MHz), I really don't see a GF1 handling this. Why are we comparing this game then?
 
About the argument of what a GeForce can handle in comparison to the PS2 - why are we comparing Mafia screens? I kinda missed that, and considering this game doesn't even run on the highest graphics setting on a GF2 with a decent CPU (my friend's PC runs at 800 MHz), I really don't see a GF1 handling this. Why are we comparing this game then?

I too question the comparison between the PS2 and GF1... how can the GF1 possibly be better? The PS2 is competing against what many call a watered-down GF4 (which should be at least 8x the GF1 according to nvidia), and although it is indeed outdone outside of pixel/texture effects, the PS2 doesn't lag too much behind.... and this is so even though the Xbox is a FIXED PLATFORM.... IOW it has the same benefits as the PS2....

If the GF1 miraculously surpassed the PS2? Wouldn't a FIXED platform with at least 8x its capability make the PS2 look significantly inferior...

I saw (glanced at) what many call one of the best-looking Xbox games the other day at EB... the game called Splinter Cell, and it didn't appear to look significantly better than the top PS2 titles in either IQ or geometry... it did have better lighting though, and appeared to have good textures (didn't appear mindblowing or anything, but good...)... Now am I supposed to believe that 1/8th of this game's perf/lighting/geom/textures, paired with no pixel effects, would surpass the PS2?
 
BenSkywalker said:
Randy-

Hey if you don't want to, could you at least stop talking about PC problems and instead refer to who built your PC? Instead of talking about Windows issues, or PC problems, state Dell or Compaq or whatever POS it is you have.

I think you know very well that my personal setup isn't a Wintel, at all, but I take it you wanted to "out" me here at B3D? Real classy. Should that make my comments any less valid for PCs? I've had extensive experience and exposure to PCs in the workplace (most from 1st tier vendors, but I guess they all suck the same if not built by you, right?), and I've seen how various PCs end up in the hands of people I know casually. I've experienced my fair share in Win95/NT/2k. Seriously, I don't feel I am a PC newbie who doesn't know a dll from a hole in the ground. I, personally, can get a PC singing like a diva and running virus-free (as I am sure all others here can, as well). Out of all of this, I can certainly say reaching that level doesn't happen by accident, and if you don't deliberately avoid pitfalls and keep up with general maintenance, you can easily end up with a belly-up PC. Are you really implying that all of this doesn't exist??? C'mon, now! Are you really satisfied with the explanation that if someone ever has a problem with a PC, then that PC must invariably be a POS, and if it isn't, it is invariably the fault of the nimrod user? Certainly, those situations are sometimes the source of difficulty, but invariably???

I think this sidetrack has gone far enough, but I felt the need to defend my position here this time. So please excuse my digression, and I hope we can continue on-topic.
 
zidane1strife said:
I too question the comparison between the PS2 and GF1... how can the GF1 possibly be better? The PS2 is competing against what many call a watered-down GF4 (which should be at least 8x the GF1 according to nvidia), and although it is indeed outdone outside of pixel/texture effects, the PS2 doesn't lag too much behind.... and this is so even though the Xbox is a FIXED PLATFORM.... IOW it has the same benefits as the PS2....

I think it comes down to people simply looking at the hardware-enabled feature set of the GF1 and utterly ignoring the other strengths that the PS2's GS has to offer. Citing that the GS doesn't have the GF1's buzzword features, bullet point by bullet point, they reason that the GS is therefore less than the GF1. However, they ignore its bandwidth, fillrate, and functionality assets and their implications for how certain graphics effects can be implemented (evidently, if there isn't a buzzword to mark it, then it doesn't exist as a feature :rolleyes: ). Despite that, we can see the GS actually holding its own (albeit not necessarily ahead of the game outright) against current hardware that is several generations ahead. This is about on par with people who buy a computer simply by the MHz number of the processor.
 
"If Sony relies on software rendering they will not be able to compete with dedicated hardware. For that matter, I think they are still iffy trying to push out 6.6TFLOPS based on .065u in a general purpose CPU."


I agree Ben.

If PS3 relies on software rendering, that would... suck, like VM Labs' Project X/NUON.

I'm convinced there'll be a GS3, a dedicated rasterizer, though.
 
"PPS: EE/CELL,etc... 500M+ trans GS3(more than 256Mbit embedded ram) Clearly the system will have at the very 1B trans. combined...."

If the EE3/PS3's version of CELL is over 500M transistors, then the GS3 could very well be over 1 billion itself, assuming a similar transistor ratio between EE3 and GS3 as there was between EE and GS (about 3-4x).
 
Before Ben sees my reply about Mafia, just thought I'd take a few things back. ;) I just got back from my friend's place and had another good look at the game. While the game boasts some VERY impressive character models and textures throughout the inside levels (on the highest settings, of course), it's the outside graphics that I find/found quite underwhelming. No doubt, inside it looks much better than The Getaway, but outside, well, I'm not too impressed. Not saying of course that The Getaway looks better (as Mafia has the better textures, evidently), but as I already said, looking at each game as a whole, I think both are very impressive indeed.

Now, while I do admit it's a very fine looking game [Mafia], at least inside the buildings, my friend's PC ran the game with a GeForce 2 Ultra along with a PIII 800 MHz. Now, on highest settings this game ran smoothly most of the time, with occasional hiccups on the inside levels. Outside, the framerate is worse. Given that he's running a GF2 Ultra, I really can't see this running smoothly on GF1 hardware.

Given that this game doesn't run smoothly on a GF1 at the quality of those screens you posted, it doesn't really support your argument of the GF1 being as good as PS2 hardware.

Also, The Getaway is hardly the game that's maxing out the PS2 - I think there are games that do the hardware better justice. Another thing is textures: as pointed out, it really is clear that the PS2 has trouble competing with the PC in textures - that's a given. What the PS2 does pretty nicely, though, are so many other things, and that's where it really shines. Restating an example, I really don't see a game like MGS2 running fluidly on GF1 hardware, even if the devs tried to max out the GF1's feature set. Have you seen Zone of the Enders 2? Seeing that in motion absolutely blew me away...
 
In an ideal world, the PS3 would have one CELL/EE3 as the CPU (or a group of CELLs, I don't know) that is balanced between FP and integer performance. This CELL (or CELLs) would not have to feed the GS3 with T&L calculations at all.

The graphics hardware is comprised of a dedicated rasterizer (GS3) that is over 1B transistors by itself, plus a second CELL optimized for FP calculations that has at least multi-TFLOP peak performance, with more than 1 TFLOPS sustained. The second CELL is bolted onto the GS3 or sits very nearby -- giving the PS3 a true "GPU" and avoiding the severe shortcomings of software-only rendering/rasterizing.

Provided the GS3 has lots of up-to-date features and is highly programmable, Sony would be able to compete with whatever Nvidia comes up with for XBox2, since Nvidia will basically be the heart & soul of any XBox.
 
Zidane-

The PS2's theoretical b/w limit for the GPU is significantly higher than what the GeForce FX is said to have....

It has eDRAM with a 2560-bit bus; I don't think many people would have argued with you that it would retain its b/w edge for a very lengthy time.
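
The arithmetic behind that edge, using the usual round numbers (~150MHz GS clock assumed):

# GS eDRAM bandwidth from the quoted bus width (round ~150MHz clock):
bus_bits = 2560
clock_hz = 150e6
print(bus_bits / 8 * clock_hz / 1e9)  # 48.0 GB/s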

The PS2's theoretical fillrate limit is about half that of the GeForce FX... even though the FX is coming nearly 3 YRS later... The PS2's theoretical geometry limit is about a fifth that of the FX... even though it's 3 YRS later, and at a doubling every 6 months for PC GPUs (according to nvidia) it should already be at least an order of magnitude above the PS2....

Can it run Doom3? ;)

The PS2 is competing against what many call a watered-down GF4 (which should be at least 8x the GF1 according to nvidia), and although it is indeed outdone outside of pixel/texture effects, the PS2 doesn't lag too much behind.... and this is so even though the Xbox is a FIXED PLATFORM.... IOW it has the same benefits as the PS2....

You really think the PS2 is competitive with the XBox in terms of visuals? I expect casual gamers not to be able to spot the rather large differences, but not people here.

Marco-

That screen is actually realtime rendered (but obviously supersampled; the actual game will not be antialiased like that)

Realtime on what, a GSCube (isn't that its name?)? :)

Randy-

I think you know very well that my personal setup isn't a Wintel, at all, but I take it you wanted to "out" me here at B3D? Real classy.

I know that we have had discussions before but I couldn't remember where; was it at AI? If so, no need to elaborate on your comments ;)

most from 1st tier vendors, but I guess they all suck the same if not built by you, right?

If by 1st tier you mean junk like Dells and Compaqs, yes, they suck. If you are talking about Alienware and the like, then those don't suck (they are actually very well made).

C'mon, now! Are you really satisfied with the explanation that if someone ever has a problem with a PC, then that PC must invariably be a POS, and if it isn't, it is invariably the fault of the nimrod user? Certainly, those situations are sometimes the source of difficulty, but invariably???

Pretty much; as long as you don't catch a virus (the digital or user kind ;) ) or have a hardware failure, a well-built PC running a non-Win9X OS shouldn't have much in the way of problems. Look around these boards and see how many people have issues with their rigs (on the PC forums), and how many of those who built their own have never really had any problems. You will certainly run into buggy software every now and then, but that is certainly no fault of the PC.

Citing that the GS doesn't have the GF1's buzzword features, bullet point by bullet point, they reason that the GS is therefore less than the GF1.

Trilinear filtering isn't exactly a current buzzword feature. That was something that should have been a given basic feature in 1998, and yet most PS2 games use bilinear filtering (if they use proper texture filtering at all). If it were 'free' or close to it on the GS, it would be utilized and would eliminate an ugly visual artifact. Anisotropic filtering is also nothing new, and aids in the reduction of texture aliasing, which is something the PS2 needs desperately; again, far from a current bullet point. Dot3 and EMCMs could be utilized quite a bit by a lot of PS2 titles to enhance their visuals, yet they aren't that practical to implement either. I'm not touching on pixel shaders and the like, which themselves have been around for close to two years now, nor anything comparable. I am looking at features from the PS2's launch timeframe which would have aided a good deal in enhancing the PS2's visuals. If the GS had the raster feature set of the GF1, there would be virtually no difference between it and the best-looking XBox titles currently available.
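
To be concrete about what trilinear buys you (a toy sketch, not how the GS or GF1 actually wires it up): it just blends bilinear samples from the two nearest mip levels, which is exactly what hides the mip-band seams that plain bilinear leaves visible.

# Toy trilinear filter (illustration only, not GS/GF1 hardware behaviour):
def trilinear(sample_bilinear, uv, lod):
    base = int(lod)
    frac = lod - base
    a = sample_bilinear(uv, base)      # nearer (more detailed) mip level
    b = sample_bilinear(uv, base + 1)  # next coarser mip level
    return a + (b - a) * frac          # the blend hides the mip seam

# Toy sampler where each mip level returns a constant shade:
shade = lambda uv, level: 1.0 / (1 + level)
print(trilinear(shade, (0.5, 0.5), 1.3))  # ~0.45, blends levels 1 and 2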

Despite that, we can see the GS actually holding its own (albeit not necessarily ahead of the game outright) against current hardware that is several generations ahead. This is about on par with people who buy a computer simply by the MHz number of the processor.

It really doesn't, though. The PS2 can't compete looking at its best titles vs the XBox's best. I'm using the GF1 as it's of like vintage.

Phil-

Now, while I do admit it's a very fine looking game [Mafia], at least inside the buildings, my friend's PC ran the game with a GeForce 2 Ultra along with a PIII 800 MHz. Now, on highest settings this game ran smoothly most of the time, with occasional hiccups on the inside levels. Outside, the framerate is worse. Given that he's running a GF2 Ultra, I really can't see this running smoothly on GF1 hardware.

The feature the GF2U has over the GF1 is speed; it helps you crank up the resolution. Run the game at 640x480 (which I have mentioned numerous times, using console res ;) ) and see how it plays on a GF1. Also - how much RAM is your friend's rig packing? The game needs 512MB to run smoothly (I upgraded to 512 because of Mafia, made a huge difference).

Given that this game doesn't run smoothly on a GF1 at the quality of those screens you posted, it doesn't really support your argument of the GF1 being as good as PS2 hardware.

GTA3 doesn't run smoothly on a PS2 either ;)

What the PS2 does pretty nicely, though, are so many other things, and that's where it really shines. Restating an example, I really don't see a game like MGS2 running fluidly on GF1 hardware, even if the devs tried to max out the GF1's feature set.

Throw Doom3 at the PS2 ;)

Megadrive-

In an ideal world, the PS3 would have one CELL/EE3 as the CPU (or a group of CELLs, I don't know) that is balanced between FP and integer performance. This CELL (or CELLs) would not have to feed the GS3 with T&L calculations at all.

That would be an interesting setup, although they would have to pull an MS and eat a sizeable loss to get it off the ground. Certainly a CELL dedicated to graphics paired with a GS3 would make for some nasty visuals.
 