Sony Gamescom 2013 conference

And the fact that Minecraft is superior on PC has certainly been brought up plenty of times on the internet, so I'm not sure why the same point made once would offend you in this case...

I am not offended; you seem to be offended by the PS4 launch line-up. My point is that Minecraft sold 8 million copies on the 360, so no matter how hard you try to make it a problem that F2P and indie games are also on the PC, they are still 100% valid games in the PS4 launch line-up. It's simply a different market.


We only play Minecraft on the iPad. ^_^
Minecraft on the PlayStation would work too. No Windows PC at home except for my wife's work laptop.

The 360 version is pretty much fantastic; it's like a lighter PC version but with co-op, way ahead of the portable edition IMHO. If I were you I would buy the PS4 version when it's released; I know I will. I expect it to run at full HD instead of the 360's 720p, and since my PS4 will be on the projector it's going to be fun co-op with plenty of screen space :)
 
http://www.heise.de/newsticker/meldung/Gamescom-Erste-Eindruecke-von-der-Playstation-4-1940079.html

Impressions of various PS4 stuff. Also of note, Guerrilla is targeting 60fps for KZ. Probably only for multiplayer, I'm guessing.

The developers at Guerrilla render the game in full HD and target a framerate of 60 fps. "We could have set our game to 30 fps and added more makeup, but how the game feels is important to us," the Dutchmen explained. Killzone is not supposed to be a pure graphics demo, but is meant to keep players engaged over the long term.

[image attachment]


Edit: Confirmed
 
I would have been pretty amused if he'd said:

"well, we noticed that after Call of Duty, now also Halo, Battlefield 4 and even Assassin's Creed 4 is going for 60fps in multi-player, so we felt we really couldn't stay behind ... "

Or

"well, after we basically completed all of Killzone Shadow Fall basically already end of last year because we want to be sure to make launch with a well-tested sku, we suddenly heard from Sony that we'd have 4GB more RAM. So then we had a very short deliberation whether we should upgrade all the art to take advantage of the memory, or just try to use it to improve the framerate. As the latter only costs one or two programmers instead of 30+ artists, we decided to go for the framerate ..."

;)

Anyway, so far, the future is looking bright for next-gen multi-player, whatever the cause.
 
I don't think Knack, DriveClub, and Killzone can compare to Killer Instinct, Forza 5, Dead Rising 3, and possibly Ryse.

Killzone is good, but it's hardly like CoD and BF4 aren't going to fill that FPS spot in the lineup anyway.

There isn't even a fighter in PS4's launch line-up.

It gets even worse when you go beyond the launch into the "launch window", where MS has Titanfall (won 60 E3 awards) and Project Spark.

But my main comment was just that MS seems to have more "playable" games right now.

Too bad Forza is kind of feature-incomplete and the full game isn't on the disc. KI isn't complete either, kind of half a game based on an IP no one remembers. Ryse is a 360 Kinect port stuck in dev hell. IMO all these launch titles are going to be rushed and poorly received. I'm glad to have some games pushed back so they can be given time to finish, like Infamous.

But we are glad every XB1 exclusive happens to be every game you ever wanted, what luck for you.
 
Can we please grow up and move beyond all these game versus discussions?
 
The 360 version is pretty much fantastic; it's like a lighter PC version but with co-op, way ahead of the portable edition IMHO. If I were you I would buy the PS4 version when it's released; I know I will. I expect it to run at full HD instead of the 360's 720p, and since my PS4 will be on the projector it's going to be fun co-op with plenty of screen space :)

Only my son (and his friends) play Minecraft in my household. :)
The iPad version is handy because he can play anywhere.

I'll give the PS4 version a closer look to see what it is all about.
 
Only my son (and his friends) play Minecraft in my household. :)
The iPad version is handy because he can play anywhere.

I'll give the PS4 version a closer look to see what it is all about.

Out of all the versions we bought, the 360 has been his favorite (he's 8 years old now). He likes the PC for the mods and such, but on the 360 he can play with his school friends rather easily. World size has never been an issue, and playing with the controller is his method of choice. Oddly, on the phones he likes Survivalcraft better for that screen size.

Looking forward to the PS4 version and of course Vita - and I can't believe how much money Notch has gotten off of our family so far.. :LOL: :cry: ;)
 
Ah, he plays Minecraft and other small educational games on the iPad when we are out of the house. At home, I usually forbid him from touching the PS3 these days. ^_^

If he gets to play on the PS3, I only allow him to play titles where he can create stuff (e.g., LBP, ModNation Racers, Sound Shapes) or express himself (e.g., SingStar, Beat Sketcher). I'm also guiding him through Python, Sketch (Arduino), and Scratch programming.

At this point, I haven't decided what to do with PS4 yet. Minecraft is on the radar. I am hoping we can work on some projects together.
 
"well, after we basically completed all of Killzone Shadow Fall basically already end of last year because we want to be sure to make launch with a well-tested sku, we suddenly heard from Sony that we'd have 4GB more RAM. So then we had a very short deliberation whether we should upgrade all the art to take advantage of the memory, or just try to use it to improve the framerate. As the latter only costs one or two programmers instead of 30+ artists, we decided to go for the framerate ..."

;)

KZ tech slides showed they are only using 4.5GB. Supposedly the prior 4GB arrangement was 3.5GB with .5GB for OS.

So they didn't gain THAT much by the move to 8GB and have already been using it for a while.
 
I'm not yet ready to take the Eurogamer article as gospel, i.e. that the move from 4GB to 8GB only netted developers a 1GB increase in RAM (half of which would be some kind of virtual memory).
 
I'm not yet ready to take the Eurogamer article as gospel, i.e. that the move from 4GB to 8GB only netted developers a 1GB increase in RAM (half of which would be some kind of virtual memory).

Actually it ended up being 4.5GB + .5GB of some sort of wonky memory (system-managed, or something).

Total of 5GB.

The Shadow Fall presentation in May showed them using 3,072 MB of video memory, 1,536 MB of system memory, and 128 MB of shared memory.

I dunno, it's kinda odd that it works out to ~4.5GB, which is what the PS4 should have minus the weird .5GB. And it doesn't work out for a 4GB dev kit either.
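
Just to sanity-check those slide figures (simple arithmetic on the three numbers above, nothing more):

3,072 MB (video) + 1,536 MB (system) + 128 MB (shared) = 4,736 MB ≈ 4.6 GB

which is roughly the ~4.5GB direct budget being discussed, before the extra system-managed .5GB.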
 
Based on what we know now, I get the impression that the video vs. system memory split refers to how memory is mapped to the Garlic or the Onion bus; we learnt from the Ubisoft developer that which bus is used depends on how you reserve the memory. Then perhaps the shared memory is a special case that we don't know about yet - maybe framebuffers that both the CPU and GPU have a special kind of combined access to? Or something else completely (memory for specific OS libraries perhaps, like friends lists), I don't know.
 
Based on what we know now, I get the impression that the video vs. system memory split refers to how memory is mapped to the Garlic or the Onion bus; we learnt from the Ubisoft developer that which bus is used depends on how you reserve the memory. Then perhaps the shared memory is a special case that we don't know about yet - maybe framebuffers that both the CPU and GPU have a special kind of combined access to? Or something else completely (memory for specific OS libraries perhaps, like friends lists), I don't know.

Garlic = video
Vegetable-less CPU bus = system
Onion = shared

From a "system/BSD" perspective those terms would mean something completely different.... but it's most likely the above.
 
Actually it ended up being 4.5GB + .5GB of some sort of wonky memory (system-managed, or something).

Total of 5GB.

The Shadow Fall presentation in May showed them using 3,072 MB of video memory, 1,536 MB of system memory, and 128 MB of shared memory.

I dunno, it's kinda odd that it works out to ~4.5GB, which is what the PS4 should have minus the weird .5GB. And it doesn't work out for a 4GB dev kit either.

I thought we knew all along that the dev kits from around the time of the reveal, but before the decision was made to switch to 8GB, were already 8GB?
 
Probably right, I don't really know all the dev kit details though.

The point was more that it's a memory amount that doesn't make sense if you were targeting a 4GB system.
 
Garlic = video
Vegetable-less CPU bus = system
Onion = shared

From a "system/BSD" perspective those terms would mean something completely different.... but it's most likely the above.

In the past gen that may have been true, but I think the traditional use of system and video RAM will change.

Garlic = incoherent GPU data
Vegetable-less CPU bus = system
Onion = coherent shared data ----> compute

Maybe as we move forward we will see less RAM reserved for Garlic and more RAM reserved over Onion and the veggie-less bus.
 
In the past gen that may have been true, but I think the traditional use of system and video RAM will change.

Garlic = incoherent GPU data
Vegetable-less CPU bus = system
Onion = coherent shared data ----> compute

Maybe as we move forward we will see less RAM reserved for Garlic and more RAM reserved over Onion and the veggie-less bus.

Compute data can also be in Garlic if the GPGPU processes it exclusively, especially if the results are used for rendering.

If the game performs more generic functions, like running WebKit, then the DOM tree can be allocated in the non-Garlic and non-Onion (shared system) heap.

Onion is only for "syncing/sharing" state.

Moving forward, we should still see more Garlic data if the GPGPU is well used.
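
As a toy illustration of that placement rule, here's a small hypothetical C helper that encodes it: compute output consumed exclusively by the GPU (e.g. feeding straight into rendering) stays in Garlic, while anything the CPU has to read back or sync on goes through the coherent Onion path. The type and function names are made up for illustration; only the decision rule comes from the discussion above.

    #include <stdbool.h>
    #include <stdio.h>

    /* Hypothetical bus hint, as in the earlier sketch -- not a real SDK type. */
    typedef enum { MEM_GARLIC, MEM_ONION } mem_bus_t;

    /* If only the GPU ever touches the compute results, keep them in Garlic;
       if the CPU needs to read or sync on them, use the coherent Onion path. */
    static mem_bus_t place_compute_buffer(bool cpu_reads_results)
    {
        return cpu_reads_results ? MEM_ONION : MEM_GARLIC;
    }

    int main(void)
    {
        /* Particle sim whose output is only ever rasterised by the GPU. */
        printf("particles -> %s\n",
               place_compute_buffer(false) == MEM_GARLIC ? "Garlic" : "Onion");

        /* Broad-phase collision results that game code inspects on the CPU. */
        printf("collision -> %s\n",
               place_compute_buffer(true) == MEM_GARLIC ? "Garlic" : "Onion");
        return 0;
    }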
 
Compute data can also be in Garlic if the GPGPU processes it exclusively, especially if the results are used for rendering.

Onion is only for "syncing/sharing" state.

Moving forward, we should still see more Garlic data if the GPGPU is well used.

True, but I can imagine that GPGPU will readily take advantage of the more copious cache on the CPU side, unless latency over Onion is as bad as calls to main memory. That shouldn't be the case, since it's not an off-chip request and it's helped by the fact that Onion+ avoids the GPU caches.
 
Did they publish the numbers? It may not be as bad as people think. They have a lot of queues to choose work from.

The SPUs only had 256K of local store, but they overloaded the SPUs with lots of work for similar reasons.

The fine-grained cache control is helpful when multitasking, so that other parallel work is not screwed.
 
Did they publish the numbers? It may not be as bad as people think. They have a lot of queues to choose work from.

The SPUs only had 256K of local store, but they overloaded the SPUs with lots of work for similar reasons.

The fine-grained cache control is helpful when multitasking, so that other parallel work is not screwed.

Well, we know calls to main memory are faster over Garlic, so cache misses over Onion should be slower. But I can't imagine cache hits over Onion being slower than calls over Garlic.
 