Look at this Google-cached (pulled down) PlayStation 3 page

I am honestly confused by your post here.

This is what was requested: how the 320x240 original and the 1280x960 interpolated image would look at IDENTICAL SCREEN SIZE. The 320x240 would look that pixelated when the monitor resolution is set to 320x240, while interpolating the same image to 1280x960 and showing it at 1280x960 resolution would make it look smooth and detailed, as in the example.

If this is what you are arguing about, fair enough. However, I'd like to point out that on CRTs this isn't even an issue, and the pixelation here is due more to dot pitch than to resolution.

And by screen size I hope you are referring to RESOLUTIONS and not physical screen dimensions. Smooth - yes. Detailed - you have got to be kidding me.


It is legit; this is what you would see if you reset your monitor to 320x240. If this is how far your thoughts will reach on any particular subject, I am very disappointed.

Can we please lay off the sh*t flinging, *everyone*? It creates a lot of noise and misunderstandings, as above.
 
Deadmeat, do you remember someone called Nobody's Perfect? He used to be around some of the gaming forums and on Usenet. He talked quite a bit about Sony, Sega, the PSX2, and how it wouldn't be able to match a Lockheed Real3D console from Sega. Just wondering if you happen to remember him?
 
You need better glasses.
Are you trying to deny that the upsampled picture you posted is significantly more blurred than a picture originally photographed at that resolution would be? The two pictures you posted are basically just two forms of upsampling. One of them provides a chunky, pixelated look; the other provides a blurred look (it's probably some fractal-based algorithm that attempts to reconstruct shapes, and in the process screws up edge colors). Neither is a good replacement for material that would be originally photographed at such a high resolution. The second one is more appealing, yes, but it's still far from being acceptably good looking.

That Xbox was not a game console, but some kind of TVPC. Microsoft was smart enough to realize this wouldn't work and redesigned the thing to outrun the PSX2.
Yes, and it forced them into a long delay, coming to the market after Sony had already set its dominance, and made them repeat in their every speech how such a mistake won't happen again...

Anyway, my point was that although they had more time to deliver their console, corners had to be cut and money had to be saved wherever possible. It was like that with their original 500MHz+GF2 concept, it was like that with the current Xbox, and I think it will be like that with Xbox 2. Their hardware won't be much better than PS3's, if at all, if they want to launch at roughly the same time. Or maybe they will decide to spend really insane amounts of money, to put in Intel's most powerful CPU and the most powerful NV chip available at the moment, only to again discover that the market really doesn't care that much about hardware specs, especially if it's not at all visible in games.

Do you mean to suggest that this Vince guy designs systems for a living?
I have no idea what he does for a living, but from what I see, he doesn't go around feverishly trying to correct people who made some of the most successful products in their field of work. He doesn't make outlandish claims about hardware design, so I don't ask him to prove to me (with his own, existing work) why I should listen to and believe those claims.

The same could be said about you too.
Well, at least I don't go around screaming that something terrible is wrong and that only my magnificent idea can remedy it... I also don't come up with illogical and far-fetched constructs only to support my mindset on the issue ("they must have added VU1 at the last minute, it was the only way that failed electrician could make something that outperforms the precious Dreamcast!") Be as you wish, but know that I was not taunting you; I'm concerned and dead serious... having witnessed with my own eyes what can happen if precautionary steps are not taken before it's too late. It really doesn't have anything to do with you being my friend or not; some people are just more prone to that kind of stuff due to their genetic heritage.
 
There was never a case of a new console unable to outperform ones predating it. Whatever comes out after PSX3 will outperform it, it is a fact.

Uhhh Gamecube?

And what makes you think Xbox 2 is going to come out after PS3? MS has said that they won't let that happen again.

If PSX3 comes out in 2005 in Japan, it will hit the US the next year to launch alongside Xbox 2, right? Is 6 GHz that far-fetched in the 2006 holiday season???

Are you pulling this out of your ass?

If PS3 comes out in 2005 in Japan it will be mid-year, then a US launch in fall of 2005. Where are you getting this fall 2006?
 
Paul said:
There was never a case of a new console unable to outperform ones predating it. Whatever comes out after PSX3 will outperform it, it is a fact.

Uhhh Gamecube?

And what makes you think Xbox 2 is going to come out after PS3? MS has said that they won't let that happen again.

If PSX3 comes out in 2005 in Japan, it will hit the US the next year to launch alongside Xbox 2, right? Is 6 GHz that far-fetched in the 2006 holiday season???

Are you pulling this out of your ass?

If PS3 comes out in 2005 in Japan it will be mid-year, then a US launch in fall of 2005. Where are you getting this fall 2006?



Both PS1 and PS2 launched in March in Japan and in September (I think) in the USA, and then in November in Europe.... I hope they would just release the damn thing everywhere at the same time; discrimination against European gamers is just so last generation.. so 60Hz-mode :LOL:
Still, we know NOTHING about the release date. We only know that Nintendo and Microsoft will just be waiting to see what Sony does. Sony dictates when the next generation starts. Nintendo and Microsoft know that they can't afford to pull another Dreamcast, and they certainly can't afford to pull another GC+Xbox. I think they will just release around the same time, in which case it will be interesting to see what hardware they come up with....
Well, it will be interesting just knowing what hardware Sony will actually come up with :LOL:
 
Well, people seem to think "oh, we don't need a fast CPU, Nvidia or whoever are gonna take care of everything with a super GPU..."

I don't get this....

Wherever you want to dump the burden, whether on the CPU or the GPU, those guys have a hell of a job to do to come up with something as powerful as Cell is supposed to be (DISCLAIMER: THIS IS ALL IN CASE PS3 REALLY PUSHES THIS FLIPPING 1TFLOPS AND ALL THAT PR BULLSHIT)
 
The thing about the 1 TFLOPS that gets me wondering is: if we assume that Sony wants to go through with full software rasterization using CELL, what kind of performance could we be speaking of? And if we assume that Microsoft takes the latest GPU and CPU on the market, how will they match up?

On one hand, we've got a 1 TFLOP processor doing everything in software - on the other, we have a weak CPU with a powerful GPU emphasizing its hardwired pixel effects.

I suppose the TFLOP one would have a huge advantage in geometry (perhaps doing micropolygons, as has been discussed) while the other would have significantly less, although boasting nice effects to overcome the shortage (example: Doom 3). The question is, to rephrase, how would they stack up against one another? On average, would they be in the same league, or would the TFLOP machine doing micropolygons have a serious advantage if used well?

Anyone any ideas, thoughts?
 
I am not saying they can slap a Pentium III 1.3 GHz instead of the current 733 MHz Celeron 2...

Do you realize that we haven't seen Prescott yet ? Do you realize Prescott is going to have improved Hyper-Threading ( SMT ), 16 KB of L1 D-Cache ( 2x as much L1 D-Cache as Northwood and Willamette ), a faster FSB and other micro-architectural improvements ?

Add all of that to a higher clock-rate ( imagine a 4.5 GHz Prescott )... now, since we might be talking about Tejas and not Prescott in Xbox 2, add some more performance at the same clock rate to account for some optimizations in the CPU core and better compilers...

You got quite a lot of CPU power there ( more than enough to do complex physics, complex A.I., etc... ) ... the NV/ATI chip should be able to do the rest...

Assuming Tejas does the same number of FP Ops/cycle as Prescott and Northwood ( and Willamette )...

4.5 GHz * 4 FP Ops/cycle = 18 GFLOPS

And I think that Tejas, counting all the improvements over the basic Willamette core and the better FSB, should come closer than you think to that figure...
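
The peak-throughput arithmetic above can be sketched directly; the clock rate and ops-per-cycle figures are the post's assumptions, not confirmed Tejas specs:

```python
# Peak FP throughput = clock rate x FP ops per cycle (assumed figures).
clock_hz = 4.5e9         # hypothetical 4.5 GHz Prescott/Tejas
fp_ops_per_cycle = 4     # assumed same as Northwood/Willamette

peak_gflops = clock_hz * fp_ops_per_cycle / 1e9
print(peak_gflops)       # 18.0
```

Note this is a theoretical peak; sustained throughput on real game code would be well below it.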
 
The visual difference would be quite apparent, as the two styles of doing graphics are quite different.

It would be like comparing DOOM 3 and Half-Life 2: both look great, but as you can see, both look radically different. D3 with its dynamic shadows, metal-like bump mapping and CG-ish visuals. On the other side you have HL2 with its very detailed, high-resolution textures that make it look more "real".

Next gen there really aren't going to be games that look "bad" compared to the games on the other systems, just that the visual approach will be different. So it will depend on opinion more than anything else.

Whether you prefer the CG-ish or the photorealistic look is what it's going to come down to.
 
Phil... the thing about software rasterization is that in this case we would not be doing texture filtering and similar operations in software... we will have support in silicon for that kind of basic 3D operation, which it would otherwise be just an exercise in futility to move into software...

This won't be exactly like making a pure CPU renderer: developers will enjoy more freedom in shaders, but that should not go to the point of having custom texture filtering...

I think that bi-linear, tri-linear and anisotropic filtering, plus support for mip-mapping and Z-buffering, should be hardware accelerated by the Pixel Engine portions of the Visualizer...

It should not even take a huge part of the die area either...

The rest, yes, should go software... the architecture is well suited for the kind of parallelism and performance that such an approach would require...

I personally like the idea of effects not being part of a check-list, but something that a creative developer can make on his/her own...

Ultimately even in the PC world they are going the same route...

Tim Sweeney was once advocating a return to software rendering when CPUs get powerful enough; Carmack said that soon enough GPUs will be marketed only on the performance of their shaders and not their hardwired features ( pretty much like current CPUs are )... Vertex and Pixel/Fragment shaders using the same kind of processing unit, with the differentiation between per-Vertex and per-Fragment programs blurring more and more ( in terms of what they can do, how they can access data, etc... ).
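
To put a rough number on why filtering belongs in silicon, here's a back-of-the-envelope count of multiply-adds for one bilinearly filtered texture layer. The resolution, frame rate, and per-sample cost below are illustrative assumptions, not Visualizer specs:

```python
# Each bilinear sample blends 4 texels: roughly 8 multiply-adds (assumed).
madds_per_sample = 8
width, height, fps = 640, 480, 60    # hypothetical render target

madds_per_second = width * height * fps * madds_per_sample
print(madds_per_second / 1e9)        # ~0.15 billion madds/s for ONE texture layer
```

Multiply that by several texture layers, trilinear or anisotropic taps, and overdraw, and a fixed-function unit starts looking much cheaper than burning general-purpose FLOPS on it.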
 
...

To marconelly!

Are you trying to deny that the upsampled picture you posted is significantly more blurred than picture originally photographed in that resolution would be?
You basically have two options for viewing that 320x240 image full screen on any higher-resolution monitor: reset the monitor resolution to 320x240 to fill the screen, or blow it up to 1280x960 and interpolate to produce a more detailed-looking version. You tell me which option you prefer.

second one is more appealing, yes, but it's still far from being acceptably good looking.
Like you said, you have to accept the interpolated image when the original is too low in resolution and is pixelated.
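
The two options being argued about amount to pixel replication versus interpolation. A toy one-scanline sketch (plain Python; the grayscale row and scale factor are illustrative):

```python
def upscale_nearest(row, factor):
    # Pixel replication: the chunky look of running the monitor at 320x240.
    return [px for px in row for _ in range(factor)]

def upscale_linear(row, factor):
    # Linear interpolation: the smooth-but-blurred look of scaling to 1280x960.
    out = []
    for i in range(len(row) * factor):
        src = i / factor                  # fractional source position
        x0 = int(src)
        x1 = min(x0 + 1, len(row) - 1)
        t = src - x0
        out.append(row[x0] * (1 - t) + row[x1] * t)
    return out

print(upscale_nearest([0, 100], 2))   # [0, 0, 100, 100]
print(upscale_linear([0, 100], 2))    # [0.0, 50.0, 100.0, 100.0]
```

Interpolation invents intermediate values (the 50.0 above) that were never in the source, which is exactly the blur being argued over; replication keeps only the original values, hence the chunkiness. Neither adds real detail.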

only to again discover that market really doesn't care that much about hardware specs, especially if it's not at all visible in games.
Well spoken. It is hard to show off the technical prowess of PSX3 when it is so hard to develop for.

Well, at least I don't go around screaming that something terrible is wrong and that only my magnificent idea can remedy that...
I have no solutions to offer to Sony. I am just pointing out the glaring errors of the PSX3 design.


("they must have added VU1 at the last minute, it was the only way that failed electrician could make something that outperforms precious Dreamcast!")
Not the only way, but a quick and easy way.

having witnessed with my own eyes what can happen if precautionary steps are not taken before it's too late.
I don't have any such experience to offer, and it doesn't affect me.

Uhhh Gamecube?
It was Nintendo's fault that GC was delayed.

To Phil

The thing about the 1 TFLOPs that gets me wondering is, if we asume that Sony wants to go through will a full software rasterization using CELL - what kind of performance could we be speaking of, and, if we assume that Microsoft takes the latest GPU and CPU on the market, how will they match up?
Very few developers have the experience and resources to benefit from full-scale software rasterization and the programmability offered by CELL; most are content with hardware acceleration and standard libraries. The analogy is that it is nice to have a manual transmission if you are a street racer, but the vast majority of commuters are happy with an automatic transmission.

You can look at the bandwidth numbers of CELL as the best indicator of its real-world performance; 25 GB/s of Yellowstone bandwidth and 30~40 GB/s peak throughput for the Redwood interface between CPU and GPU is rather restrictive. What's so shocking about PSX3 is that SCEI seems to have repeated all the major design errors of PSX2 that crippled it: the memory bandwidth is fairly low for a machine of this FLOPS rating, vertices still travel between CPU and GPU over a slow bus, and the system memory is on the CPU side, so textures still have to travel over the Redwood bus. A PSX2 deja vu.

(attached diagram: kaigai05l.gif)
 
Deadmeat, the fact that you actually believe that PS2 and PS3 are the job of one single man makes all your posts lose credibility. I stopped reading them altogether, to be honest... :rolleyes:

And by the way, I'm sorry to say that you must consider Kutaragi some kind of higher power to be able to do all that all by himself..... :LOL:

All you've been doing for the last 2 days is telling us how CELL and anything Sony does will undoubtedly fail, that a failed electrician did the PS2 and his vision is flawed, blah blah blah.... without acknowledging the fact that currently we know very little about the next generation of consoles.... apart from some patents on an ARCHITECTURE which will somehow be used in one of them....
 
You forget about the Local Storage in each APU, which adds up to a combined 2 MB of extra memory on the GPU and 4 MB on the CPU; plus, that diagram doesn't take into account the e-DRAM on the CPU side.

Also, the APUs have quite a few more registers than the VUs in the EE had... the VUs had 32x128-bit registers and 16x16-bit GPRs, while the APUs have 128x128-bit GPRs.

The Local Storages ( 128 KB per APU ) and GPRs ( 128x128 bits ), in addition to the e-DRAM, should make sure that less stress is put on the external RAM link.

In the Cell patent, external memory was attached to the I/O ASIC, and the DRAM was clearly e-DRAM, as it had a combined 1,024-bit data-path ( through a cross-bar switch ).

The GPU will do texture decompression ( no, not just 8-bit CLUT ), and with support for HOS/subdivision surfaces, the memory bandwidth needed to load vertex data will be lower... ( at least for the external RAM to CPU link )...

Also, the link between the EE and the GS was 1.2 GB/s, while the link between main RAM and the EE was 3.2 GB/s.

The link between the CPU and external RAM with Yellowstone should be around 25.6 GB/s, and the Redwood link between CPU and GPU could even get close to 50 GB/s, which is about 2x the link between external RAM and CPU.
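
To see what those link figures mean in practice, here is the per-frame budget each quoted bandwidth implies at 60 fps. The numbers are the ones quoted in this thread (the 50 GB/s Redwood figure is the post's upper estimate, not a confirmed spec):

```python
# Bandwidth figures quoted in the thread, in GB/s.
links_gb_s = {
    "EE -> GS (PS2)": 1.2,
    "RAM -> EE (PS2)": 3.2,
    "Yellowstone RAM -> CPU": 25.6,
    "Redwood CPU -> GPU": 50.0,   # upper estimate from the post
}

fps = 60
# MB available per 60 fps frame on each link (using 1 GB = 1024 MB).
per_frame_mb = {name: bw * 1024 / fps for name, bw in links_gb_s.items()}
for name, mb in per_frame_mb.items():
    print(f"{name}: {mb:.1f} MB/frame")
```

By this crude measure the quoted Yellowstone link offers roughly 20x the per-frame budget of the PS2's EE-to-GS path, which is the crux of the disagreement over whether it is "restrictive".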
 
Re: ...

DeadmeatGA said:
You basically have two options of viewing that 320x240 image on any higher resolution monitor in full screen, reset the monitor resolution to 320x240 to fill the screen or blow them up to 1280x960 and interpolate to produce a higher detailed version.

...if you really need full screen. Honestly, viewing 320x240 fullscreen on a 1280x960 display and expecting quality is pretty absurd. More likely you do what most computer users do - view the image in a 320x240-sized window sitting on a 1280x960 desktop. [strains to see DMGA wittingly switch screen resolution modes for every single odd-sized picture in his photo collection folder he wishes to see]

Ah well, it doesn't seem to matter anymore anyway. You appear to have retracted your "original" image example. The bottom line is, viewing something upscaled 4x by 4x will always be a makeshift solution no matter what filtering you employ, never a replacement for, or even a good simulation of, an actual native image at that higher resolution. ...and that's what you were shooting for in your claim - "film level" quality. What you delivered was something with the dimensions of film, but far short of any comparable quality.
 
Well spoken. It is hard to show off the technical prowess of PSX3 when it is so hard to develop for.
So what do you say about the rumor that Sony plans not only to deliver standard devkits, but also to deliver (to any interested 3rd party) a template engine for any given genre (FPS, platformer)? The templates would be made in advance by their top developers; say, Naughty Dog would write the engine for 3rd-person games, etc...

I have no solutions to offer to Sony.
"Auto Parallelism"?

25 GB/s for Yellowstone bandwidth and 30~40 GB/s peak throughput for Redwood interface between CPU and GPU is rather restrictive.
I thought other memory solutions had significantly smaller bandwidth, so why complain about that? Besides, 25 GB/s is not confirmed for PS3; it's just what they said will be the first step in their development, and they plan for it to go up to 100 GB/s. Also, you are forgetting the local APU embedded memories.

Btw, why do you keep referring to PS2 as PSX2 and PS3 as PSX3? Is that how their internal development teams refer to them?
 
marconelly! said:
Btw, why do you keep referring to PS2 as PSX2 and PS3 as PSX3? Is that how their internal development teams refer to them?

NO, IT JUST GOES TO SHOW HOW MUCH HE KNOWS ABOUT WHAT HE'S TALKING ABOUT :LOL: J/K
 
london-boy said:
marconelly! said:
Btw, why do you keep referring to PS2 as PSX2 and PS3 as PSX3? Is that how their internal development teams refer to them?

NO, IT JUST GOES TO SHOW HOW MUCH HE KNOWS ABOUT WHAT HE'S TALKING ABOUT :LOL: J/K

The only word that comes to my mind for the whole argument is "ridiculous". He is not listening/reading/comprehending; he just keeps on writing and writing when you post a reply. Forget it. :(
 
Here's something I noticed: when replying to a quoted comment, he replies with great confidence, but the content of the reply has almost nothing to do with the quote he was trying to address. So it almost looks like he "answered" something with great decisiveness, but in reality he has simply given a boilerplate answer that had nothing to do with the question. This is the sort of thing you see when the "liberals" and the "conservatives" go at it in some political discussion. In the end, nothing gets answered, and the discussion ends up wildly OT from where it started.
 