SCEI & Toshiba unveil 65nm process with eDRAM

megadrive0088 said:
I hope PS3 does true 8x8 (not just 8x) AA natively, standard in all games.

Hey, why stop at 8x8? Wouldn't 64x be even better? :rolleyes: ;) Seriously, how much is enough (and how much is pointless overkill) when talking about regular TV presentation or even HDTV presentation. My hunch is that plain jane 2x supersampling would work just fine on a TV or HDTV. 8x may sound impressive, but does it really make a worthwhile difference considering the output format? This 8x8 or 8x or whatever strikes me as a bit of gratuitous spec chasing.
 
Old SGIs do 4x4 or 8x8, so I'd expect at least that much from PS3, XBox2 and GC2.

btw, 8x8 = "64x", 4x4 = "16x"

currently GeForce4Ti 4600 and XBox do 4x
Radeon 9700 does 6x (better quality than the 8500's 6x)
GeForce FX has 6x and 8x modes

Specs or no specs, realtime consumer 3D (GFFX not fully seen yet) is woefully lacking in fixing the jaggies, the stair-stepping effect. Perhaps HDTV res of 1920x1080 (or whatever the actual res is) combined with
4x4 or 8x8 AA will be enough to rid next-gen console games of these artifacts.
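To put those grid sizes in terms of raw work, here's a quick sanity check (a rough sketch; ordered-grid supersampling and peak numbers assumed):

[code]
def supersample_rate(width, height, grid_n, fps=60):
    # NxN ordered-grid supersampling takes N*N colour samples per pixel,
    # so 4x4 = "16x" and 8x8 = "64x"
    samples_per_pixel = grid_n * grid_n
    return width * height * samples_per_pixel * fps  # samples (fill) needed per second

print(supersample_rate(640, 480, 8) / 1e9)    # ~1.18 Gsamples/s for 640x480 at 8x8
print(supersample_rate(1920, 1080, 4) / 1e9)  # ~1.99 Gsamples/s for 1080-line HDTV at 4x4
[/code]

And those are colour samples only; add Z traffic and overdraw and the cost climbs fast.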
 
I can somewhat see the need for those higher AA modes on a computer monitor, but I still think it's really quite a waste of fillrate and processing to do that on something that will end up on a TV or HDTV screen. Just a first-level pass of AA ought to do the trick, IMO. The AA in Baldur's Gate is quite adequate to make the image look seamless on my 35" TV. I don't see this as a need to keep parity with whatever the PC graphics folks are doing, just a need to do what is necessary given the console audience context.
 
Think about it: in 2005, what screens will we be using?

This is scheduled for 2005 or later.

If they have an integrated VGA adapter I think it would be a good thing to include better AA modes.

Also, the lifespan of the product will be at least 3 years, out to 2008; by that time some of these viewing devices may be cheap enough and somewhat widespread among hardcore gamers.

Speng.
 
On some professional and high-end graphics rendering systems (e.g. Real3D and SGI InfiniteReality), isn't anti-aliasing virtually free in terms of fillrate? I thought dedicated ASICs took care of AA somehow (can't explain how). I wish there was a way for consumer hardware to anti-alias without eating into fillrate & bandwidth. Or am I completely mistaken?
 
"Professional/high-end" kind of implies a somewhat cost-no-object implementation, no? I'm sure one can come up with extra hardware to do any sort of AA you wish, but AA implementation on something cost-sensitive such as a consumer console is a somewhat different situation. Adding something of that nature would simply take away from other parts of the hardware that could be augmented in more effective ways which matter to the overall console experience. More importantly, it still comes down to this- looking at 2x AA and 8x AA on a TV set, is the typical user really going to discern the difference? On an HDTV? I'm not so sure. If not, what's the point, then? I know HDTV's are pretty sharp, but computer monitor "sharp"?
 
Magnum-

Even if I'm never impressed with their articles about graphics technology, I find the Inquirer an interesting site.

The Enquirer (supermarket tabloid), not the Inquirer, is what I was talking about :)

Zidane-

Thus the only differences between the pic and the final PS2 product (the AA, rez enhancement) are not that relevant...

By that token the GF4 Ti4600 has no real advantage over the GF3 Ti200 which is very clearly not the case. High res and AA are a major advantage in real time graphics.

Faf-

Must be a very CPU limited title then. Most stuff I run on my GF2 varies greatly in fps just by changing window size.

Running 640x480, I can't think of a single title I have that isn't CPU limited. Even Quake2 is (pushing over 200FPS, still CPU limited though).

Ok, on a bit more serious note, DOT3 emulation may not be particularly fast/efficient, but we're talking about competing with a chip with 2.5x less theoretical fill.
And that's before counting the heavy use of shadow volumes, where PS2 is several leagues beyond what GF1 can do, and then some.

So you think shadow volumes would slow the GF down enough for the Dot3 shortcoming to be overcome?

Well it's still realtime rendered, even if it was taken at 4x resolution and downsampled in photoshop.

I don't understand what you are saying here. The game is running at a resolution that the PS2 can't handle in real time, and was then filtered in Photoshop, and it is realtime?

Phil-

Don't ask me how many times AA is applied to those titles, though. What I can say is that the first 2 mentioned titles have absolutely no aliasing (at least not on 3 different TVs that I have played them on).

I haven't played BD on the PS2, TTT certainly has considerable aliasing. That isn't a slam on the PS2, I haven't seen a title this gen that doesn't suffer from aliasing.

Marco-

Maybe for testing, but the games on PC become tearing hell with VSYNC disabled :\

Never really had that problem. What refresh rate are you running? I try to keep mine around 100Hz when gaming.

Frontline has a very atmospheric and cinematic beginning. The stuff that is going on around you is quite impressive indeed. Heck, *I* was impressed by how well they pulled off that kind of atmosphere, and I've seen way more games than that guy. Not everyone looks at games through the texture resolutions and anisotropic filtering used.

The game is way too linear, too easy, has 'cheating' AI(the only thing that offers even a slight challenge) and overall is simply mediocre. I wasn't talking about graphics(which also are horribly bad- played it on the XBox and it looked as bad as a PS2 game which is the norm for an EA port), the whole game is sub par compared to the best shooters out today.

Vincce-

Wise ass ;) :)

Chap-

Argh! PS2 might be old crapware, but why the need to compare it with GF1?

They both came out in 1999.

What's with the Mafia love? It's a frickin' PC game where the developers have tons of VRAM to play around with. PC textures will always own PS2 textures.

I agree, obviously some people don't though :)

Randy-

Seriously, how much is enough (and how much is pointless overkill) when talking about regular TV presentation or even HDTV presentation. My hunch is that plain jane 2x supersampling would work just fine on a TV or HDTV.

2x SS isn't nearly enough even on a TV. Look at the edges in FF:TSW, Toy Story or Bug's Life as a few examples. The GC uses a three-line flicker filter when outputting and isn't close to eliminating aliasing. And that's if we only look at standard NTSC, not HDTV. At 'normal' TV resolution 8x AA would probably be decent, 4x AA when running at the highest end HDTV resolutions (and even then, if you are talking about large TV screens those settings wouldn't be enough).
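For reference, the sort of three-line flicker filter mentioned above is basically a vertical blend across adjacent lines; a minimal sketch (the 1/4-1/2-1/4 weights are an assumption, real hardware coefficients differ):

[code]
def flicker_filter_line(prev_line, cur_line, next_line):
    # each output line is a weighted blend of the line above, itself and the line below;
    # this trades some vertical sharpness for less interlace flicker and edge crawl
    return [(p + 2 * c + n) / 4 for p, c, n in zip(prev_line, cur_line, next_line)]
[/code]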

More importantly, it still comes down to this: looking at 2x AA and 8x AA on a TV set, is the typical user really going to discern the difference? On an HDTV? I'm not so sure. If not, what's the point, then? I know HDTVs are pretty sharp, but computer monitor "sharp"?

Fire up any relatively recent CGI movie or one with a lot of CGI effects and then compare it to any game in terms of aliasing. There is a long way to go on that front.
 
Fire up any relatively recent CGI movie or one with a lot of CGI effects and then compare it to any game in terms of aliasing. There is a long way to go on that front.

Agreed. That statement and the others.

Whatever types and amount of anti-aliasing those movies use (32x32, 64x64???), it shows that realtime games are not even close to being decent. I guess GeForce FX, with its 8x (not 8x8, totally different), is a small, small step in the right direction, but still nowhere near good enough.

8x8 samples of the best possible method of AA is my hope for the next generation of consoles @ HDTV resolutions.
 
This comes as no surprise from someone [speaking of Ben, that is] who seems to be utterly anal about the smallest imperfection. For the rest of the world, they would not have a clue what you are on about from one thing to the next. We are talking videogames here, remember? It's all fine and good to wish for videogames that look like CG movie productions to the nth degree, but geez, is the world going to end if it doesn't? BTW, I'm sure that fancy-pants AA was quite appropriate when presenting at some-odd 3500x2000 on a giant movie screen (just guessing on the resolution, but you get the point). When it gets scaled down and blended to lossy compressed digital video from a DVD player, it gets a bit dubious again. Personally, I'd be more impressed with a simple baseline AA and eye-popping artwork in a videogame than AA 6 ways from Sunday with uber-tropic texture blending but mediocre artwork.
 
2x SS isn't nearly enough even on a TV. Look at the edges in FF:TSW, Toy Story or Bug's Life as a few examples.

Don't really need it for those on a TV since they're usually rendered at least at a 4K resolution, along with various sampling and motion blurring methods per shot. Bicubically downsampling that to a D1 source for DVD mastering or broadcast is pretty much good enough on its own.
 
Don't really need it for those on a TV since they're usually rendered at least at a 4K resolution

I guess it depends on how you look at it. Running at 4K (and there was AA applied at that res also, ignoring that however) and downsampling, you are looking at ~50x AA anyway :)
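Just as rough arithmetic (the exact factor depends on which scan and output resolutions you assume; these are placeholder numbers):

[code]
# hypothetical 4096x3112 full-aperture film frame squeezed down to 720x486 D1
print((4096 * 3112) / (720 * 486))  # ~36 source pixels per output pixel, dozens of samples either way
[/code]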
 
I guess it depends on how you look at it. Running at 4K (and there was AA applied at that res also, ignoring that however) and downsampling, you are looking at ~50x AA anyway

True. However your display characteristics and signal drive quality can have a huge influence.

With current TV, your colour palette alone can play a huge factor in scene aliasing (NTSC is notorious for colour bleeding; PAL is somewhat better). The quality of your TV can be a factor even when using a standard composite connector (higher grade TVs can extract more information from a cruddy signal and filter it for better presentation, e.g. my set upconverts signals to 480p and 960i), and my home theatre receiver even performs scan conversion from composite to s-video, so that plays a role as well. People with fairly high-grade television sets who use higher performance connectors (e.g. s-video, YCbCr component, SCART, VGA, DVI, etc...) still represent a relative minority amongst consumers, and targeting your art assets at that sort of display can lead to rather blah, muddy looking imagery on an "ordinary" TV set.

HDTV adds more complexity to the problem. For one, the higher resolution negates the necessity for AA to some degree; at the same time, signal quality tolerances and colour fidelity knock that right back up. Then I also doubt that everybody will have HDTV sets by 2005-2006 (well, maybe in Japan), although they should be more common, and the discrepancy in image quality between a typical consumer grade NTSC set and a decent HDTV set is even wider...

Then again I doubt the average consumer will care... Just the tech enthusiasts...
 
To wonder if consoles of the next generation will actually be capable of rendering to movie theater type of resolutions in realtime is simply mind boggling. Is this truly what we are proposing here? Is it actually feasible given the timeline?
 
By that token the GF4 Ti4600 has no real advantage over the GF3 Ti200 which is very clearly not the case. High res and AA are a major advantage in real time graphics.

Look dude, that's nice and all, but according to my book Quick Access, your arguments have one flaw (a logical fallacy): you're deviating from the main subject of this discussion, you're picking at trivial info...

The fact is, the most impressive thing about the old guy image is not how clean the image is (although that too is impressive), but the fact that it looks so good, and the devs must've put a lot of work into it...
So you think it needed a GScube (16 PS2s!!!) for the rez and AA....rrrrrrrriiiiiiggghhhhttttt......

(Sorry for being repetitive.) Again, do you think a system with 1/8th the Xbox's T&L, with no pixel shaders or the flexibility of vertex shaders, could actually compete with the PS2.... dude, there's a reason MS didn't go with a souped-up GF2...

PS (again) I mean, why do you think the old guy was posted here? THE ARGUMENT is not about PS2 IQ vs. GF1 IQ... it's about what the PS2 can do, and what the GF1 can do.
 
Never really had that problem. What refresh rate are you running? I try to keep mine around 100Hz when gaming.
You never have tearing with VSYNC off???? How is that even possible? Every time the game framerate output desynchronizes with your monitor refresh output (and that's pretty much all the time) you get image tears. I see them all the time.

The game is way too linear, too easy, has 'cheating' AI(the only thing that offers even a slight challenge) and overall is simply mediocre. I wasn't talking about graphics(which also are horribly bad- played it on the XBox and it looked as bad as a PS2 game which is the norm for an EA port), the whole game is sub par compared to the best shooters out today.
Well, if you completely ignore the points that I've made and list only the negatives, sure, it sounds like the game is the worst thing that ever happened to the FPS genre :\ I just think it has what it takes to be impressive, at least on its starting level.

I agree, obviously some people don't though
Just out of curiosity, who doesn't agree? I was just responding to your statement that Mafia looks *significantly* better than Getaway, which according to those screenshots is not really the case. I would say it's *somewhat* better looking. As some people said already, why include Mafia in this discussion, to begin with? It's not like it's the game you will be running on a GF1 config and be satisfied with it.
 
Btw, regarding VSync, I see flickering of image at 60-70hz refresh (gives me headaches rather quickly too). And tearing is a heck of a lot more obvious than that. It's a matter of personal preference no doubt, but it's very bothersome to some.

So you think shadow volumes would slow the GF down enough for the Dot3 shortcoming to be overcome?
Yes. In theory, you're looking at a 5:1 difference in fillrate alone for volumes - which are fill limited pretty much always.
Going by practical tests with volumes on a GF2 in our own app, I'd give it an even more definite nod, since the real-world gap was considerably larger than that theoretical difference.
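For reference, the 5:1 falls straight out of the commonly quoted peak fill figures (assumed numbers, so back-of-the-envelope only):

[code]
gs_fill  = 2400e6  # PS2 GS untextured fill: 16 pipes @ 150MHz (pixels/s)
gf1_fill =  480e6  # GeForce 256: 4 pipes @ 120MHz (pixels/s)
print(gs_fill / gf1_fill)  # -> 5.0, and stencil/shadow volume fill is untextured fill
[/code]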

I don't understand what you are saying here. The game is running at a resolution that the PS2 can't handle in real time, and was then filtered in Photoshop, and it is realtime?
Ok, if you want to get technical, the bicubic filtered shots aren't realtime. The high-res ones are. They're rendered by the same app, using the same content. And PS2 "can" display up to 1080i, so I don't really see the problem (performance would suck, but you'd still see it in realtime :p ).

On the same note, the bicubic sampled shots are used in 1 out of every 2 XBox games, and people have been claiming them to be realtime from day one (and still are).
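For what it's worth, the offline "downsample the high-res grab" trick amounts to this; a minimal sketch using the Python Imaging Library, where the filenames and the 4x factor are just placeholders:

[code]
# take a high-res framegrab and filter it down 4x in each axis,
# mimicking the bicubic-sampled shots being discussed
from PIL import Image

hi = Image.open("highres_grab.png")                 # e.g. a 2560x1920 capture
lo = hi.resize((hi.width // 4, hi.height // 4),     # down to 640x480
               resample=Image.BICUBIC)
lo.save("downsampled_grab.png")
[/code]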
 
IMO, this GF1 vs GS argument is retarded.
I mean, how the heck can you compare them when they both work in totally different environments?
 
Say you cap the framerate at 60FPS synced and the system you are testing is capable of drawing every single frame in 1/59th of a second; your framerate ends up at 30. There really is a good reason that VSync is disabled when testing (and, for most people here at least, when playing).

I didn't say to enable VSync, I just said to cap fps to some level, like VSync does. But does the monitor actually display 59 fps? Or is 59 just an internal thing? Or do you get 30Hz on your monitor anyway in this case?

Anyway, by capping the fps, your average would be more in line with the gaming quality you get.
 
Archie-

Then again I doubt the average consumer will care... Just the tech enthusiasts...

We are pretty much the only people that care now, and I don't expect that to change. Coming into the next gen, those with 1080i sets would have to be pretty much blind not to notice the difference, I would assume (provided they realize they need a ~$20 connector for their setup).

Randy-

Is this truly what we are proposing here? Is it actually feasible given the timeline?

It's pretty much a given. The XBox already supports 1080i and AA methods that, stacked, could already come close if it had the fill and, more importantly, the bandwidth to handle it. For the timeframe that the next consoles hit, I think everyone would be surprised if they all didn't support 1920x1080i natively, likely with AA.
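Back-of-the-envelope on why fill (and especially bandwidth) is the wall, using commonly quoted NV2A peak figures and an assumed overdraw:

[code]
output_rate   = 1920 * 1080 * 60   # ~124M pixels/s, treating 1080i as full frames at 60 for simplicity
aa_samples    = output_rate * 4    # with 4x AA -> ~498M samples/s
with_overdraw = aa_samples * 3     # assume a typical ~3x overdraw -> ~1.5G samples/s
xbox_peak     = 4 * 233e6          # NV2A: 4 pipes @ 233MHz -> ~932M pixels/s peak
print(with_overdraw / xbox_peak)   # ~1.6x over peak fill, before even touching bandwidth
[/code]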

Zidane-

So you think it needed a GScube (16 PS2s!!!) for the rez and AA....rrrrrrrriiiiiiggghhhhttttt......

For actual usability? Yes, absolutely.

Marco-

You never have tearing with VSYNC off???? How is that even possible? Every time the game framerate output desynchronizes with your monitor refresh output (and that's pretty much all the time) you get image tears. I see them all the time.

Keep the refresh @100Hz and the noticeable tearing isn't going to be that bad the overwhelming majority of the time (spinning around in an FPS is about the worst-case scenario).

I just think it has what it takes to be impressive, at least on its starting level.

Perhaps I didn't explain exactly how highly they thought of the game. The guys I was talking to made it sound like it was one of the greatest ever made. I brought up the negatives more than anything due to their reaction to the game, which I found to be decidedly mediocre. Compared to Halo, JKII, Metroid and the like, MOH is a very poor title IMO.

I would say it's *somewhat* better looking. As some people said already, why include Mafia in this discussion, to begin with? It's not like it's the game you will be running on a GF1 config and be satisfied with it.

Mafia runs quite nicely on a GF1, in fact Mafia runs quite nicely on an integrated nForce 220 (the single channel one, slower than a GF1) at 640x480.

Faf-

Btw, regarding VSync, I see flickering of image at 60-70hz refresh (gives me headaches rather quickly too). And tearing is a heck of a lot more obvious than that. It's a matter of personal preference no doubt, but it's very bothersome to some.

Once in a while I notice it. Most people game with the refresh @60Hz, where it is a rather serious issue (pervasive). Up the refresh rate high enough and it is significantly reduced.

Yes. In theory, you're looking at a 5:1 difference in fillrate alone for volumes - which are fill limited pretty much always.
Going by practical tests with volumes on a GF2 in our own app, I'd give it an even more definite nod, since the real-world gap was considerably larger than that theoretical difference.

How much fill would you need? Running 640x480 with 4x overdraw you have enough fill for six passes on a GF1 at 60FPS, twelve @30. The GF1 is nowhere near as imbalanced as the GF2 is. The GF1 has 91% of the bandwidth and 30% of the MTexel fill compared to the GTS. The GF1 actually had a very good balance to it.
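Where those numbers come from, assuming GeForce 256 DDR peak figures (480 Mpixels/s fill, ~4.8 GB/s vs the GTS's ~5.3 GB/s and 1600 Mtexels/s):

[code]
pixels_per_frame = 640 * 480 * 4           # 640x480 with 4x overdraw
gf1_fill = 480e6                           # GeForce 256: 4 pipes @ 120MHz (pixels/s)
print(gf1_fill / (pixels_per_frame * 60))  # ~6.5 passes of fill available at 60FPS
print(gf1_fill / (pixels_per_frame * 30))  # ~13 passes at 30FPS
print(4.8 / 5.3, 480 / 1600)               # ~91% of the GTS's bandwidth, 30% of its texel fill
[/code]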

V3-

I didn't say to enable VSync, I just said to cap fps to some level, like VSync does. But does the monitor actually display 59 fps? Or is 59 just an internal thing? Or do you get 30Hz on your monitor anyway in this case?

If your gfx card was drawing a frame every 1/59th of a second and you had VSync on, you would output 30FPS (that is what your monitor would display). If you had VSync off you would output 59FPS (that is what your monitor would display, although some frames would be offset by a varying amount depending on how much movement had occurred; that is the tearing we are discussing).
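In other words, with plain double buffering the swap has to wait for a vblank, so the displayed rate snaps to integer divisors of the refresh. A rough sketch, assuming a fixed 60Hz refresh and constant frame times:

[code]
import math

def vsynced_fps(render_fps, refresh_hz=60):
    frame_time = 1.0 / render_fps           # time to render one frame
    refresh_interval = 1.0 / refresh_hz     # time between vblanks
    # the buffer swap waits for the first vblank after the frame finishes
    intervals = math.ceil(frame_time / refresh_interval)
    return 1.0 / (intervals * refresh_interval)

print(vsynced_fps(59))  # ~30: just misses every other vblank
print(vsynced_fps(61))  # ~60: finishes inside every refresh
[/code]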

Anyway, by capping the fps, your average would be more in line with the gaming quality you get.

Thing is, it depends on how the app is capped. Most rely on VSync, which has the problem I mentioned above. If you cap it per second, you run into problems if you can push 1000FPS: all your capped frames would be drawn in the first tenth of a second and then there would be no more updates for the rest of that second. The only good way to do it is to limit how fast a given frame can be drawn. Problem with this is, the singular time when you need the highest framerate in a PC game, while spinning around in an FPS, is going to be directly and negatively impacted in a sizeable fashion no matter how you implement the cap.
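A minimal sketch of the "limit how fast a given frame can be drawn" approach (hypothetical game loop, not any particular engine's API):

[code]
import time

def run_capped(render_frame, cap_fps=100):
    min_frame_time = 1.0 / cap_fps
    while True:
        start = time.perf_counter()
        render_frame()                        # draw one frame
        elapsed = time.perf_counter() - start
        if elapsed < min_frame_time:
            # spread frames out evenly instead of bursting them and then stalling
            time.sleep(min_frame_time - elapsed)
[/code]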

Anyway, by capping the fps, your average would be more in line with the gaming quality you get.

Depends. 'Crusher' was an excellent test to show you the actual framerates that you would see under real world gaming; enabling VSync there gave you numbers that were simply too low to be viable. Other benches that have a lot of 'slow time' in them aren't very good at giving you a reasonable score. Check out UT2K3's 'Flyby' scores vs 'Botmatch': one is completely unrealistic and would be more in line with enabling VSync (Flyby), while the other (Botmatch) is perfectly reasonable without VSync and a good indicator of what to expect in game.
 
Please stop this pointless GF1 vs GS comparison.
If there is Mafia on PC, then there is also BGDA on PS2.

Mafia runs quite nicely on a GF1, in fact Mafia runs quite nicely on an integrated nForce 220 (the single channel one, slower than a GF1) at 640x480.
Sorry, but I just don't believe this.
Again, I have a GF2MX (= GF1) and even at 640x480 + low details, current games either do not run at stable framerates and/or they certainly do not look better than many PS2 games.
I have tried running games/demos of NWN, RSC, Fifa, RtCW, AmericaArmy, Battlefield1942, AvP2, NOFL2, RF on my setup.

Of course, older or less graphics intensive games run great on my PC.
The Sims, Warcraft and Battle Realms come to mind.
 