X-Box hacker interview

It depends on what you call "pointless". It is pointless in the sense that 99% of the gamer population does not care. It is pointless in the sense that we are not even able to have a discussion on the merits of fairly simple dead platforms like the PS1 and N64. It is pointless in the sense that we will talk about this for three years and it will not change the fate of either platform (and then we'll jump to the discussion of the Xbox 2/PS3 platforms).

No, it doesn't depend on what I call pointless. If the difference is irrelevant due to the fact that a future system will outperform it, then the difference itself becomes irrelevant, making it pointless. Why are you giving me your opinion? We were discussing facts.

Looking at the past, we could say that the power difference between the N64 and the PS1 (quite comparable to the GC/PS2 gap?) did not matter at all during the 32-bit generation (and not only now, as you presume).

That is irrelevant. The N64 still had a power lead. How can you say it didn't make a difference? Can you play Zelda: Ocarina of Time or Majora's Mask and say this? Did the graphics make a difference? Yes, I would say they did. If the N64 had been just as powerful as the PSX and had turned out a product that was just as good, what would have been the incentive to buy it?

It is all about brand, marketing and hype, not specs. Nintendo learned it the hard way last gen; Microsoft has still to learn it (IMO).

I disagree. What about all those people who bought the PS2 to have a DVD player? What about some of those people who bought the N64? Can you say that 100% of the people who bought an N64 bought it because of the Nintendo label? I would say a pretty good number bought it because of its "superior" nature and its "cool graphics."

To me this is sophistry. Graphics may not have helped it outsell the PSX, but the truth is the PSX outsold the N64 for very valid reasons. This is not to say that its power didn't help it sell better than it would have without it.
 
He does; he's a developer, not an armchair coder.

First of all I said "who knows what the hell he is talking about" not "he doesn't know what the hell he is talking about."

Second what about him being a developer makes his prior statement correct?

I take it you've never seen audio consume several hundred MB/s then, have you?

I am sure it was nothing more than a mistake, or he is taking into account something that has more to do with encoding/decoding across the bus. It doesn't even seem possible these audio cards could do this with raw audio. They'd probably end up saturating their maximum number of open channels before they hit 100 MB/s. Mind you, he is stating SEVERAL hundred MB/s.
 
It's 60FPS with occasional slowdown. I just asked people who played it.

TVs (NTSC) display 60 half-frames per second. The total number of rendered frames in interlacing is 30 fps at 60 Hz. You CAN'T display more than 30 FPS on an NTSC-formatted screen. SO they COULDN'T have seen 60 fps on a TV screen. IF there were slowdowns, the game must have run considerably slower than 30 fps.

What I have heard is that in a number of cut scenes it runs at 60 fps, while in-game it runs at 30.
 
TVs (NTSC) display 60 half-frames per second. The total number of rendered frames in interlacing is 30 fps at 60 Hz. You CAN'T display more than 30 FPS on an NTSC-formatted screen. SO they COULDN'T have seen 60 fps on a TV screen. IF there were slowdowns, the game must have run considerably slower than 30 fps.

You can display 60 fields (even or odd lines) per second on a TV, which is not the same as 60 full frames, true, but RL supports progressive scan AFAIK.

On an interlaced TV 60FPS means 60 FieldsPerSecond, btw.

60FPS games look distinctively different (when in motion) than the 30FPS ones even on the interlaced TV. It's really easy to tell.
 
You can display 60 fields (even or odd lines) per second on a TV, which is not the same as 60 full frames, true, but RL supports progressive scan AFAIK.

BTW those even/odd fields are of one image. Lo and behold, when you do the math it becomes 30 fps. I guess I will take your friends' expert opinion on what 60 fps looks like :LOL:.

On an interlaced TV 60FPS means 60 FieldsPerSecond, btw.

Whoa, really? Fields, not frames. Those fields make one image. They are both 640x240 images.

60FPS games look distinctively different than the 30FPS ones even on the interlaced TV. It's really easy to tell.

And why is that? Are you going to give me the "tearing" argument?
 
BTW those even/odd fields are of one image. Lo and behold, when you do the math it becomes 30 fps. I guess I will take your friends' expert opinion on what 60 fps looks like

No, they are not of one image in 60FPS games. In the first 1/60 sec pass, the TV updates the even lines with one frame from the framebuffer that the console outputs. The second 1/60 sec pass updates the odd lines with the NEXT frame from the framebuffer that the console outputs. That way you have a full picture composed of two consecutive frames, which constantly updates one set of lines (even/odd).
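
Just to make the timing concrete, here's a minimal Python sketch (my own pseudocode, not anything from real video hardware or a console SDK): field i is taken from frame i, the newest frame rendered, so the even and odd lines on screen come from two consecutive frames.

```python
# Minimal sketch: at 60FPS each 1/60 sec field comes from a NEW frame,
# so consecutive fields are taken from consecutive frames.
def fields_at_60fps(rendered_frames):
    """rendered_frames: one freshly rendered frame per 1/60 sec tick."""
    for i, frame in enumerate(rendered_frames):
        parity = "even" if i % 2 == 0 else "odd"
        field = frame[parity]  # hypothetical: keep only the even/odd lines
        yield parity, i, field

frames = [{"even": f"frame {i} even lines", "odd": f"frame {i} odd lines"}
          for i in range(4)]
for parity, src, _ in fields_at_60fps(frames):
    print(f"{parity} lines come from frame {src}")
# even lines from frame 0, odd lines from frame 1, even lines from frame 2, ...
```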

It's really beside the point, btw, as RL supports progressive output.

And why is that? Are you going to give me the "tearing" argument?
It looks so much smoother. Just play a 60FPS game, and play a 30FPS one right after that. It's glaringly obvious.
 
No, they are not of one image in 60FPS games. In the first 1/60 sec pass, the TV updates the even lines with one frame from the framebuffer that the console outputs. The second 1/60 sec pass updates the odd lines with the NEXT frame from the framebuffer that the console outputs. That way you have a full picture composed of two consecutive frames, which constantly updates one set of lines (even/odd).

First of all, they wouldn't even be 640x480 fields. They'd be 640x240 fields. Wouldn't this method cause tearing, an increase in prominent aliasing, and flickering? Furthermore, why do you assume that it is automatically doing this?

What's humorous is that the framebuffer bandwidth is about the same as if it were running at 30 fps with interlacing. I wouldn't call this an amazing show of performance either, as its running bandwidth would be comparable to other games running at 30 fps.

It looks so much smoother. Just play a 60FPS game, and play a 30FPS one right after that. It's glaringly obvious.

If this method is used it might, but there'd probably be increased aliasing and flickering
 
Not necessarily. Since each unique frame is being updated twice as often (60 vs. 30), the disparity between fields for a given image motion should be halved. Perhaps it would be discernible as a slight blurring of edges that are in motion, unlike the distinct mismatch of successive fields from 2 distinct frames at a rendered 30 fps.

Flickering is really not that relevant on a decently designed TV set. The flavor of phosphor is selected with a sustain rate that is most suitable for the target refresh rate (60 Hz). Showing 60 Hz refresh on a computer monitor is another bowl of fruit altogether, however.
 
randycat99 said:
Not necessarily.

How so?

(640*480*32 bits) / 8,000,000 = 1.2288 MB (for one frame buffer)
1.2288 MB/frame * 30 frames/s = 36.864 MB/s

(640*240*32 bits) / 8,000,000 = 0.6144 MB
0.6144 MB/frame * 60 fields/s = 36.864 MB/s

The bandwidths required for these frame buffers over one second are identical.
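
For what it's worth, here's the same arithmetic as a quick Python check (decimal megabytes, 32-bit pixels), just to show the two figures really do come out identical:

```python
def framebuffer_mb(width, height, bits_per_pixel=32):
    """Size of one buffer in decimal megabytes."""
    return width * height * bits_per_pixel / 8 / 1_000_000

print(framebuffer_mb(640, 480) * 30)  # 36.864 MB/s at 30 full frames per second
print(framebuffer_mb(640, 240) * 60)  # 36.864 MB/s at 60 fields per second
```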
 
People don't "see" in "MB/s". They see successive images of varied content with the variance closely related to the rendering rate of unique frames. They also don't selectively see each even/odd field. They see a melding of each field with the field that follows it and so on.
 
randycat99 said:
People don't "see" in "MB/s". They see successive images of varied content with the variance closely related to the rendering rate of unique frames. They also don't selectively see each even/odd field. They see a melding of each field with the field that follows it and so on.

Randy, I think I was distinctly referring to his earlier comments about the GC's performance being directly on par with the Xbox, not what people think they see or the quality of the image, their preference, their opinions, your opinions, or any other opinion-based statement. I am WELL aware of what interlacing IS. Hence the reason I used the term.
 
Hey, you quoted me, so I responded. My earlier statement was in regards to this:

Legion said:
If this method is used it might, but there'd probably be increased aliasing and flickering

I suspected you were just trying to be coy, so I indulged your motion. ;)
 
(640*240*32 bits) / 8,000,000 = 0.6144 MB
0.6144 MB/frame * 60 fields/s = 36.864 MB/s

Most games that run at 60 fps render at 640x480 and blend pairs of horizontal lines together to reduce interlace flickering and aliasing. This is called flicker filtering, and it's the equivalent of 640x240 with 2x AA. If the lines aren't blended then, as you said, you would get flickering. VF4 on PS2 is an example that doesn't flicker filter, and Tekken 4 is an example that does.
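
For illustration, a rough Python sketch of that blend (my own code, not any console's actual API): average each pair of adjacent lines of the 640x480 frame to get the 640x240 field.

```python
import numpy as np

def flicker_filter(frame_480):
    """Blend adjacent line pairs: (480, 640) frame -> (240, 640) field,
    i.e. 2x vertical downsampling ('flicker filtering')."""
    even = frame_480[0::2].astype(np.uint16)
    odd = frame_480[1::2].astype(np.uint16)
    return ((even + odd) // 2).astype(frame_480.dtype)

frame = np.random.randint(0, 256, size=(480, 640), dtype=np.uint8)
print(flicker_filter(frame).shape)  # (240, 640)
```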
 
First of all, they wouldn't even be 640x480 fields. They'd be 640x240 fields. Wouldn't this method cause tearing, an increase in prominent aliasing, and flickering? Furthermore, why do you assume that it is automatically doing this?

No it wouldn't, and here's why:

Let's take an Xbox as an example.

Xbox games run at 60 frames/sec on progressive TVs. Therefore the Xbox has to render one full 640x480 frame every 1/60 sec. If you attach that same Xbox with that same game to a standard, interlaced TV, what happens? The game still renders to the 640x480 buffer, but instead of outputting the whole frame as on the progressive TV, it scales the frame down vertically (downsampling), making it 640x240 with 2x vertical antialiasing. That 640x240 image is displayed on the TV in one 1/60 sec field. Repeat the process for the next frame. Vertical downsampling helps greatly to reduce the flickering that you have mentioned.

The flickering problem with some PS2 games that output at 60FPS happens because developers, to preserve memory, decide not to use a 640x480 initial buffer, but render everything directly at 640x240, making the vertical downsampling impossible. They just display the newly rendered 640x240 image on the TV, and thus flicker occurs.
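
To put the two paths side by side, here's a hedged Python sketch (the renderer and scan-out functions are hypothetical stand-ins, not any SDK calls): the downsampled 640x480 path described above versus the render-straight-to-640x240 path that flickers.

```python
import numpy as np

def render_full(t):   # hypothetical renderer: one 640x480 frame at tick t
    return np.full((480, 640), t % 256, dtype=np.uint8)

def render_half(t):   # hypothetical renderer: one 640x240 frame at tick t
    return np.full((240, 640), t % 256, dtype=np.uint8)

def scan_out(field, parity):
    # stand-in for displaying a 640x240 field on the even/odd scanlines
    assert field.shape == (240, 640)

for t in range(4):                  # four 1/60 sec ticks
    parity = t % 2                  # which set of scanlines this field lands on

    # Path described above: full 640x480 frame, 2x vertical downsample per field
    full = render_full(t)
    filtered = ((full[0::2].astype(np.uint16) + full[1::2]) // 2).astype(np.uint8)
    scan_out(filtered, parity)

    # Memory-saving path: render directly at 640x240, no downsample -> flicker
    scan_out(render_half(t), parity)
```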
 
I suspected you were just trying to be coy, so I indulged your motion. ;)

Notice I said "might" and not for a fact. Coy, huh? Well, I guess one good turn deserves another. You might go back and read the topic of this conversation so that you might know what we were talking about. Meaning those could possibly be detracting problems caused by the effects of movement within fields of separate images.
 
Xbox games run at 60 frames/sec on progressive TVs. Therefore the Xbox has to render one full 640x480 frame every 1/60 sec. If you attach that same Xbox with that same game to a standard, interlaced TV, what happens? The game still renders to the 640x480 buffer, but instead of outputting the whole frame as on the progressive TV, it scales the frame down vertically (downsampling), making it 640x240 with 2x vertical antialiasing. That 640x240 image is displayed on the TV in one 1/60 sec field.

That is still the same bandwidth. The frame that will be sent to be rendered will still be 640x240x32. That's still 36.864 MB/s. It would send a full 640x480 frame across the bus and downsample it elsewhere? That would be a waste of bandwidth. So you are basically telling me that the system is processing the frame at least twice, first at 640x480 and then at 640x240. Like I said before, how is this indicative of the GC's performance?

You know what, guys? I thank you for contributing this information about rendering methods on TVs. BUT I am still trying to figure out what this has to do with the GC's performance, how we know that RL is running at 60 fps, and exactly what it's rendering in those frames.
 
What I was discussing has nothing to do with your discussion about bandwidth and performance. I was just pointing out how 60FPS updates work on a regular TV, why the games use it, and why it looks better / has no flicker. It started with Rogue Leader being 60FPS...
 