NGGP: NextGen Garbage Pile (aka: No one reads the topics or stays on topic) *spawn*

That leaked document from VGLeaks was a bit too detailed for me to believe it came from some kind of informer... or developer... a dev wouldn't have access to all the inner workings like that 9 months out from release, would they?

There are long answers with lots of bullet points, but basically 'yes the developer would certainly have access to the inner workings like that months before release'.

If you want a developer to build a game to showcase the true power of your console - then it is helpful to tell them what that "true power" is, exactly...
 
The PS4 comes with a customized Pitcairn. What more can you expect, unless you want to wear earplugs because of the annoying noise of the system fans? Tahiti would be overkill for a console, especially if you want to go with an APU. A GeForce derivative would make no sense either, since Nvidia has nothing like HSA. A Pitcairn packed into an HSA APU is a very good choice. Sony will deliver.
 
I find this hilarious, because when everyone was drooling over Cell, its theoretical TFLOPS and how it was a supercomputer, it was the 360 that had the most relevant technology, as it was the first piece of mainstream hardware released with a unified shader GPU, the approach that dominates GPU tech now.

Never mind that MS is continually at the forefront, designing an API to keep up with the ever-growing performance of PC GPUs. Sony is so good at designing and manufacturing hardware that it dropped it all to basically go off the shelf. It's the equivalent of the 720 running a Linux OS with OpenGL as its graphics API.

I love Sony, but Sony having a hardware spec advantage over MS has happened before, and AMD GPUs and x86 CPUs are MS's forte. AMD gave MS unified shaders before it put them in its own products, so it's practically impossible to tell whether MS and AMD are working together to realize AMD's vision of tomorrow, with MS basically financing the endeavor.

Come on..

The PS3 is the only console Sony has released that actually has an off-the-shelf GPU; the other two have GPUs built by Sony or Toshiba. MS has used off-the-shelf parts more often.

Cell was actually so good that it helped a console with a weak GPU stay close to, and in some cases (first party) even surpass, a console with a more advanced GPU, all thanks to a capable CPU.

MS's APIs are easy to use, but Sony's allow for working to the metal with the hardware. If Sony's developers overcame extremely difficult-to-code-for hardware and delivered incredible things, imagine what they will do with easier-to-code-for hardware.

And AMD didn't give unified shaders to MS; MS paid good money to use them. If Sony had gone there first and paid the money, they would have got it as well. It's called business. Xenos was just as advanced as Cell was.
 
A Radeon 7800 has 1 TFLOPS, the 7900 has 2 TFLOPS, so in brute force both Durango and Orbis sit between a 7800 and a 7900, but brute force is not the only factor in the equation.
 
After last gen, I'm personally no longer comfortable making any speculation on which system might be more powerful, even a few years after its release ... :D

But I'm getting the impression that we will be able to tell sooner and better this gen than last gen, simply because the hardware components at least seem relatively well known compared to last time. But we still don't have many systems in the PC space with EDRAM/ESRAM to compare with, for instance, so that bit will remain tricky at least. ;)
 
I don't think 6GB is likely - as the recently departed Rangers said some time ago, 4GB of GDDR5 is already like 2x the cost of 8GB of DDR3 http://forum.beyond3d.com/showpost.php?p=1695766&postcount=185

And as numerous people have said, changing to 6GB at this time is not possible, since it'd require a new bus and so an expensive and time-consuming system redesign.
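For what it's worth, here's a quick back-of-the-envelope sketch of why capacity on a fixed GDDR5 bus tends to jump 2GB → 4GB → 8GB rather than land on 6GB. The 256-bit bus and the 2Gbit/4Gbit chip densities below are assumptions taken from the rumors, not confirmed figures.

```python
# Rough GDDR5 capacity sketch (bus width and chip densities assumed, per the rumors).
# Each GDDR5 chip has a 32-bit interface; clamshell mode hangs two chips off one
# 32-bit channel, doubling the chip count without changing the bus.

def capacity_gb(bus_width_bits, chip_density_gbit, clamshell=False):
    channels = bus_width_bits // 32             # one 32-bit channel per chip
    chips = channels * (2 if clamshell else 1)  # clamshell doubles chips per channel
    return chips * chip_density_gbit / 8        # Gbit -> GB

for density in (2, 4):                          # 2Gbit and 4Gbit GDDR5 parts
    for clam in (False, True):
        print(f"{density}Gbit chips, clamshell={clam}: "
              f"{capacity_gb(256, density, clam):.0f} GB on a 256-bit bus")

# Prints 2, 4, 4 and 8 GB. Hitting 6GB would mean mixed densities or a
# different (192/384-bit) bus, i.e. exactly the redesign mentioned above.
```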


2 more GB would not be any problem; that is just Rangers' opinion.

If Sony wants those 6GB, that is what the PS4 will have. It will not break the bank, considering the PS4 uses a CPU and GPU that are not that expensive; compared to Cell, Jaguar is dirt cheap.

But I don't see what 6GB, hell even 1GB more, would do. Most current top-of-the-line GPUs have 3 or 4 GB of RAM, and the PS4's is not even close to being a high-end GPU, so I think it will be OK.
 
A Radeon 7800 has 1 TFLOPS, the 7900 has 2 TFLOPS, so in brute force both Durango and Orbis sit between a 7800 and a 7900

Cape Verde: HD7770 has 1.28 TFLOPS
Pitcairn: HD7850 has 1.76 TFLOPS, HD7870 has 2.56 TFLOPS
Tahiti: HD7950 has 2.86 TFLOPS, HD7970 has 3.79 TFLOPS, HD7970 GHz has 4.1 TFLOPS

According to the rumors, the Orbis GPU is an HD7850 on steroids with more CUs and more TMUs, but nowhere near Tahiti. Durango is a customized Cape Verde.
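If anyone wants to sanity-check those figures, they all fall out of CUs × 64 ALUs × 2 FLOPs (FMA) per clock × clock speed. A minimal sketch below; the Orbis/Durango entries use the rumored CU counts and 800MHz clock from the leaks, which are not confirmed.

```python
# Theoretical single-precision GCN throughput: CUs * 64 ALUs * 2 FLOPs (FMA) * clock.
# The two console entries use rumored CU counts and clock (unconfirmed).
def tflops(cus, clock_mhz):
    return cus * 64 * 2 * clock_mhz / 1e6

gpus = {
    "HD7770 (Cape Verde)": (10, 1000),  # ~1.28 TFLOPS
    "HD7850 (Pitcairn)":   (16, 860),   # ~1.76 TFLOPS
    "HD7870 (Pitcairn)":   (20, 1000),  # ~2.56 TFLOPS
    "HD7970 (Tahiti)":     (32, 925),   # ~3.79 TFLOPS
    "Orbis (rumored)":     (18, 800),   # ~1.84 TFLOPS
    "Durango (rumored)":   (12, 800),   # ~1.23 TFLOPS
}
for name, (cus, mhz) in gpus.items():
    print(f"{name}: {tflops(cus, mhz):.2f} TFLOPS")
```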
 
After last gen, I'm personally no longer comfortable making any speculation on which system might be more powerful, even a few years after its release ... :D

But I'm getting the impression that we will be able to tell sooner and better this gen than last gen, simply because the hardware components at least seem relatively well known compared to last time. But we still don't have many systems in the PC space with EDRAM/ESRAM to compare with, for instance, so that bit will remain tricky at least. ;)

Naa. I think it's quite easy to say so, if these are real and final specs. As for the ESRAM, it is there, along with the DMEs, to mitigate any bandwidth issues, similar to what the eDRAM did for the 360. The difference here is that it is SRAM instead of DRAM, it is larger, and although it doesn't have the bandwidth of the 360's eDRAM, it doesn't need to, because it can compress its data, which was something the 360 couldn't do. It is also much more flexible and useful than the setup in the 360: you can texture to it, read, write, modify it, etc., so it can act as a large cache, and it will be interesting to see what developers can do with it.

Anyway, all things being equal, as far as we know from these leaks, the PS4 is more powerful. How that translates on screen is something we can't say until we start seeing the games and getting proper developer insight, because all things might not be equal in real-world situations ;)
 
Cape Verde: HD7770 has 1.28 TFLOPS
Pitcairn: HD7850 has 1.76 TFLOPS, HD7870 has 2.56 TFLOPS
Tahiti: HD7950 has 2.86 TFLOPS, HD7970 has 3.79 TFLOPS, HD7970 GHz has 4.1 TFLOPS

According to the rumors, the Orbis GPU is an HD7850 on steroids with more CUs and more TMUs, but nowhere near Tahiti. Durango is a customized Cape Verde.


I was talking about 7800M and 7900M

Both consoles sit between those two, near the 7800M. Fact.


http://www.amd.com/US/PRODUCTS/NOTEBOOK/GRAPHICS/7000M/7800M/Pages/radeon-7800m-series.aspx#2

http://www.amd.com/us/products/notebook/graphics/7000m/7900m/Pages/radeon-7900m-series.aspx#2
 
Depends on your business strategy.
It outperforms the Wii U easily, it's close enough to Orbis to get proper third party support, and they have enough financial space to pack a whole lot of multimedia, peripheral and social stuff. It's perfect for a jack of all trades system. Most people won't recognize the graphical difference to Orbis anyway, and even more people will buy it because 8GB > 4GB. The system will be able to play Kinect dancing games that change the music according to your moves, older players will be able to control any game without a controller (finger gun) and nevertheless there will be Halo, Forza and Alan Wake ready to rock. Everyone in the family can have fun with it.
I definitely prefer Orbis over Durango, and you probably do too, but that doesn't mean that Durango will fail.

I'm not talking about financial success, sales, multimedia, peripherals and social stuff. It's just pure tech I'm talking about, and on that front it's beyond underwhelming.
Obviously I find Orbis an attractive option, but I'll keep an open mind about Durango since I got the previous Xboxes and had great fun with them - but I'm not willing to pay the asking price unless it's really low, and I'm talking about 250€ low; that's how I value its hardware.

And the PS4?

To be honest I expected more; it just plays middle to the other guy's low.
 
That VGLeaks document strongly hints at more, smaller tiles compared to the 360. It also implies that the reason the ROPs are not tied to the ESRAM this time around is that if the tiles are small enough to fit the cache, they will have enough bandwidth to achieve its theoretical fill rate...

If the GPU can indeed handle many small tiles (and nowhere near the tiling penalty Xenos had), 32MB shouldn't be a problem... nor should its seemingly low bandwidth compared to the 360's eDRAM...
Yep. For all we know, Durango is a TBDR device with APIs cutting up workloads into small pieces.
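To put some rough numbers on the "small tiles vs 32MB" point, here's a minimal fit check for 1080p render targets. The formats are purely illustrative assumptions, not anything from the leaked documents.

```python
# Rough check: how much 1080p render-target data fits in 32MB of ESRAM?
# Formats are illustrative assumptions, not taken from the leaks.
ESRAM_BYTES = 32 * 1024 * 1024
W, H = 1920, 1080

def target_bytes(bytes_per_pixel):
    return W * H * bytes_per_pixel

depth   = target_bytes(4)                  # 32-bit depth/stencil,  ~7.9 MB
ldr     = target_bytes(4) + depth          # RGBA8 + depth,         ~15.8 MB
hdr     = target_bytes(8) + depth          # RGBA16F + depth,       ~23.7 MB
gbuffer = 4 * target_bytes(4) + depth      # 4 RGBA8 MRTs + depth,  ~39.6 MB

for name, size in [("RGBA8 + depth", ldr), ("RGBA16F + depth", hdr),
                   ("fat G-buffer", gbuffer)]:
    tiles = -(-size // ESRAM_BYTES)        # ceiling division = tiles needed
    print(f"{name}: {size / 2**20:.1f} MB -> {tiles} tile(s)")

# A single 1080p target fits whole; a fat deferred G-buffer needs to be split
# into a couple of tiles, which is where cheap small-tile handling matters.
```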
 
I don't get what makes anyone think that MS will price Durango $50 to $100 less than the current Xbox 360.
Current XB has an unusually high price.

I see this new trend now on all the forums I visit, including this one: since Durango won't be ultra powerful, it must be dirt cheap, which will not be the case.
The traditional console space has had expensive devices because the hardware has been expensive. If the hardware is cheap, how do you justify the price to consumers? That's not to say Durango will be cheap as maybe Kinect and other gubbins is pushing up the price, but there's definitely logic to expecting a lower entry price as long as one can see beyond the current shelf-price of XB360.

Come on..

The PS3 is the only console Sony has released that actually has an off-the-shelf GPU; the other two have GPUs built by Sony or Toshiba. MS has used off-the-shelf parts more often.
MS have only built two consoles. The first was moderately off-the-shelf, although with a customised GPU straddling the GeForce 3 Ti and GeForce 4 Ti series. The second had a custom PPC CPU and a proprietary GPU. Ergo your statement is evidently false.

MS's APIs are easy to use, but Sony's allow for working to the metal with the hardware,
According to devs on this board, you're mistaken. Read this thread - Is programming "to the metal" on XBOX 360 forbidden?
 
I'm not talking about financial success, sales, multimedia, peripherals and social stuff. It's just pure tech I'm talking about, and on that front it's beyond underwhelming.
Obviously I find Orbis an attractive option, but I'll keep an open mind about Durango since I got the previous Xboxes and had great fun with them - but I'm not willing to pay the asking price unless it's really low, and I'm talking about 250€ low; that's how I value its hardware.



To be honest I expected more; it just plays middle to the other guy's low.

Well, if these specs are correct, and let's be honest there's a whole host of data we don't have, then taking FLOPS as the reference the PS4 looks like it's got around 30-odd % more legs. Nice, yes, but certainly no massive leap.
 
Well, if these specs are correct, and let's be honest there's a whole host of data we don't have, then taking FLOPS as the reference the PS4 looks like it's got around 30-odd % more legs. Nice, yes, but certainly no massive leap.

Both have a GPU worse than a 7900 mobile; 12 or 14 CUs is a trivial difference. I don't know how people can be disappointed by one and happy about the other. In my opinion they are both in the same boat: both disappoint, both are at best in the mid-range league from a PC standpoint, the difference is minimal, and both are outperformed by a mobile chip. What are we talking about? Two or three years from launch there will be smartphones with 4 GB of RAM, 16+ CPU cores @ 2+ GHz and probably a better GPU too.
Is this a joke?
 
Cape Verde: HD7770 has 1.28 TFLOPS
Pitcairn: HD7850 has 1.76 TFLOPS, HD7870 has 2.56 TFLOPS
Tahiti: HD7950 has 2.86 TFLOPS, HD7970 has 3.79 TFLOPS, HD7970 GHz has 4.1 TFLOPS

According to the rumors, the Orbis GPU is an HD7850 on steroids with more CUs and more TMUs, but nowhere near Tahiti. Durango is a customized Cape Verde.
So having 50 GFLOPS more is called "on steroids" now? Basically, the PS4 is an HD7850 and Durango is an HD7770 with more bandwidth.
 
Well, if these specs are correct, and let's be honest there's a whole host of data we don't have, then taking FLOPS as the reference the PS4 looks like it's got around 30-odd % more legs. Nice, yes, but certainly no massive leap.

It's not only more FLOPS, it also has more CUs, more TMUs and more ROPs

So having 50 GFLOPS more is called "on steroids" now? Basically, the PS4 is an HD7850 and Durango is an HD7770 with more bandwidth.

Same here: it has more TMUs, more CUs and more bandwidth than an HD7850.

12 or 14 CUs is a trivial difference, I don't know how people can be disappointed by one and happy about the other?

It's 12 CUs vs 18 CUs.
 
So having 50 GFLOPS more is called "on steroids" now? Basically, the PS4 is an HD7850 and Durango is an HD7770 with more bandwidth.

Hardly. The HD 7850 doesn't have 4 CUs taken out of the hardware balance for graphics.
People want to believe that there are still 18 CUs, and not 14 CUs plus 4 CUs for something other than graphics (to which they will make only a trivial graphical contribution, as stated by the rumor).
 
Come on..

The PS3 is the only console Sony has released that actually has an off-the-shelf GPU; the other two have GPUs built by Sony or Toshiba. MS has used off-the-shelf parts more often.

Cell was actually so good that it helped a console with a weak GPU stay close to, and in some cases (first party) even surpass, a console with a more advanced GPU, all thanks to a capable CPU.

MS's APIs are easy to use, but Sony's allow for working to the metal with the hardware. If Sony's developers overcame extremely difficult-to-code-for hardware and delivered incredible things, imagine what they will do with easier-to-code-for hardware.

And AMD didn't give unified shaders to MS; MS paid good money to use them. If Sony had gone there first and paid the money, they would have got it as well. It's called business. Xenos was just as advanced as Cell was.

Yes, Cell was good, but at a cost, and that cost is one reason why the PS3 had a weak GPU. And if all Sony could get was a G71 from Nvidia, what makes you think they would have got Xenos from AMD?
 
Hardly. The HD 7850 doesn't have 4 CUs taken out of the hardware balance for graphics.
People want to believe that there are still 18 CUs, and not 14 CUs plus 4 CUs for something other than graphics (to which they will make only a trivial graphical contribution, as stated by the rumor).

I've said it several times already: a DX11 level of graphics is full of effects that require GPU compute: you need it for lighting, ambient occlusion, depth of field, bokeh, motion blur, lens flare, fluids, cloth, particles, smoke, etc. A desktop PC GPU like the HD7850 has to set aside some computational resources for these graphics effects as well. Having 4 Orbis CUs ready for GPGPU is no downgrade in terms of graphical capabilities, unless you want a DX9 level of graphics without these fancy effects. Take a look at Uncharted: Naughty Dog used the Cell SPEs to support the RSX with graphical computations (lighting, ambient occlusion, etc.). If it's there, it'll be used.
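To put crude numbers on that split, here's a quick sketch using the rumored 18 CUs at the rumored 800MHz clock; the 14 + 4 division itself is only what the leak claims, not a confirmed design.

```python
# Crude FLOPS split for the rumored 14 + 4 CU balance at the rumored 800 MHz clock.
# On a desktop GPU the same pool is shared dynamically between graphics and compute,
# so spending ALU time on GPGPU-style effects isn't unique to Orbis.
def tflops(cus, clock_mhz=800):
    return cus * 64 * 2 * clock_mhz / 1e6

print(f"all 18 CUs:         {tflops(18):.2f} TFLOPS")
print(f"14 CUs (graphics):  {tflops(14):.2f} TFLOPS")
print(f" 4 CUs (compute):   {tflops(4):.2f} TFLOPS")
```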
 