How should devs handle ports between consoles? *spawn

Sounds like you agree. A more powerful box deserves more.

Yes, of course... why would I bother to upgrade my PC or buy the more powerful console if I didn't want to see the results? Hell, I used to OC my PC to the edge just to get a couple of extra FPS!
 
And about Destiny: I'm not convinced the PS4 couldn't have done better. Just look at their PS360 games, which from what I have read didn't reach parity. Sad that DF seemingly refuse to compare them.
 
A tweet from Phil Spencer saying it's not true carries little more weight than Microsoft's already obvious stance on the issue. Was anyone expecting Phil Spencer to tweet "Yes, we're pressuring PC devs to lock the FPS in order to make our console look less bad"?
This is the real world, people, not an imaginary one.

I know, this 'blind faith' makes me laugh so much. Let's not forget all the 180s MS have done this gen, and how few details about the 'game sharing limitations' were released until MS had no option but to undo everything; then all of a sudden details came out about this utopian sharing system which all gamers would benefit from.

Either way, the original source carries about the same weight as the MS PR.
 
OK, why not lock the PC down while we're at it? This makes no sense and helps nothing but one's pocket.
Why should PC be locked down and what's in it for Ubisoft and their close partner, Nvidia?

Based on sales and revenue, PC is no threat to consoles and hasn't been for a long time. And sure, while "PC" is "better" than consoles, if you look at the Steam hardware survey you'll see that, certainly within Steam's ecosystem, most PC owners are using far more modest hardware.

Consoles are an easy option for people wanting to game on their TV, and for PS4, game remotely as well with Vita.
 
I took no position at all; I only re-posted the news because it belongs here. It's just a rumour that may or may not be discussed, that's all. AFAIK there's no forum rule stopping users from posting rumours about the subject at hand.
TBH, I don't think there's enough material to believe anything at all, at least for the moment. There's this picture, which kind of proves that this presentation actually happened at a French technical school, and that's it:

I apologize.
 
Why should PC be locked down and what's in it for Ubisoft and their close partner, Nvidia?

Based on sales and revenue, PC is no threat to consoles and hasn't been for a long time. And sure, while "PC" is "better" than consoles, if you look at the Steam hardware survey you'll see that, certainly within Steam's ecosystem, most PC owners are using far more modest hardware.

Consoles are an easy option for people wanting to game on their TV, and for PS4, game remotely as well with Vita.

What's in it for PS4 users?
 
So they are CPU-limited, and MS freeing up some resources meant there was only a 1-2 fps difference between the consoles.

Well, someone emailed Giant Bomb with that claim, and I think a few grains of salt should be employed. I am happy about the resources being thrown at the lighting, however :)
 
Why would PS4 users care what the experience is like on PC? :???:

Because most human beings enjoy the mundane pleasure of having a state-of-the-art experience. If playing the game on a high-end PC becomes a distant state of the art, then playing it on PS4 won't feel as good.
 
And this 25 GB of baked lighting doesn't make sense to a layman like me: if you've already baked your lighting, how is that going to impact the CPU? Maybe the real limiting factor is the HDD.


This is a good question and reminds me of a question I was going to ask but never did. Based on my pathetic understanding of these things, I would assume that while getting the assets off the hard drive and into memory is a bottleneck, it is the CPU that is in charge of fetching those assets and assigning them either to memory or to be sent to the GPU or whatever. If there are a lot of assets to be managed, there is a lot of CPU used. So you are talking about CPU time as well as bandwidth issues as well as hard drive issues. I assume a bunch of assets are loaded more or less automatically into some memory buffer for each level or stage, so the hard drive isn't as much of a problem.

I believe I read that with baked lighting the lighting data is "mixed" into the texture, and when the CPU sends the data to the GPU, the GPU deals with applying the texture to the polygon and with how it looks after that. My question is the difference between baked lighting generally and baked GI lighting. I again (foolishly?) assume that the GI part just means more data being mixed into the asset, or more work on the CPU's part in telling the GPU what to do. More data means more work for the CPU, or more bandwidth, or both.

I will, however, do a search on the subject on the forum, so take this as a rhetorical question until I cry for mercy :LOL:
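
For what it's worth, here's a minimal sketch (purely illustrative; the names, struct layout and asset organisation are my own assumptions, not anything from Ubisoft's engine) of the idea that a baked GI lightmap is just extra precomputed per-texel data that gets combined with the albedo at render time:

```cpp
#include <cstddef>
#include <vector>

// Hypothetical CPU-side view of a surface's texture data.
// In a real engine this lives in GPU memory and the combine happens
// in a pixel shader; the arithmetic is the same idea.
struct Texel { float r, g, b; };

struct Surface {
    std::vector<Texel> albedo;     // base colour, authored by artists
    std::vector<Texel> giLightmap; // baked GI: precomputed incoming light per texel
};

// Baked lighting means the expensive GI solve already happened offline.
// At runtime the per-texel cost is a multiply, not a lighting calculation,
// so the extra burden is mostly storage, streaming and upload bandwidth.
Texel shadeTexel(const Surface& s, std::size_t i) {
    return { s.albedo[i].r * s.giLightmap[i].r,
             s.albedo[i].g * s.giLightmap[i].g,
             s.albedo[i].b * s.giLightmap[i].b };
}
```

Looked at that way, baked GI is mostly a storage/streaming cost: the light transport was solved offline, so the extra gigabytes largely mean more data to pull off the disc/HDD, decompress and upload, rather than more lighting maths on the CPU.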
 
Because most human beings enjoy the mundane pleasure of having a state-of-the-art experience.
I'd debate "most", and the state of the games industry (or society) doesn't support this at all. No doubt there are pockets in any market who derive some hollow self-satisfaction from feeling they have the best possible experience money can buy, but this seems to be a fringe element.

I'd argue that the majority are just gamers who enjoy playing games.
 
Most human beings enjoy the feeling of "best possible" (usually linked to a feeling of exclusiveness) in some way, yes. It's indicative of personal success.

This doesn't mean that most people are driven by this feeling, of course. Even less if we're specifically talking about videogames. Different people have different priorities and we all know many people prefer the ease of use of a console instead of a PC even when cost isn't an issue.
It doesn't mean these people wouldn't somehow ache to get a state-of-the-art experience, though.
 
But wasn't the original AC generally considered a technical tour-de-force?

Sure there was some tearing, but for the time it was released the animation and lighting systems were just superb. It was showing up most 1st party large budget games. I still have the marks on my jaw where it hit the floor at my first sight of Jerusalem.

Then there are the times when I am Kenway, on the Jackdaw in the middle of the ocean with sunlight cutting through very convincing blue-green waves or being buffeted in a tropical storm and I still every now and then say a little 'god damn'.

Then I think that they create these incredibly atmospheric, beautiful moments in an open world game, on a multiplatform engine.

Over-familiarity with the series may have taken the sheen off those moments for many, but from a technical standpoint are these not class-leading developers?

AC1 was developed by one team and development probably started more than 10 years ago. I am not claiming that the AC games do not have special qualities, but I am claiming that UBI can't make a game as "tight" as Bayonetta or MGS5:GZ. They are much more interested in making bigger games than denser ones (from both an asset and a mechanical perspective).

And while some people like this, tuna does not!
 
Most human beings enjoy the feeling of "best possible" (usually linked to a feeling of exclusiveness) in some way, yes. It's indicative of personal success.
Apologies I misunderstood what you said. Yes, agreed.

However, I would also say that most people both expect and accept that, whatever their personal interests are, theirs is not the best possible experience and that many people have a better one - by whatever subjective measurement an individual may apply.

I have a beautiful 4K Sony X-series Bravia TV and a nice Samsung 5.1 surround sound system. I game mostly from a luxurious and comfy leather sofa. I have a decent PC (Mac) but choose to game mostly on a console. I don't for a second think my setup is anywhere close to the best experience possible, nor do I begrudge those folks who worked hard to have better equipment - I give it no thought at all. If there are people whose enjoyment is somehow diminished by the thought/knowledge that somebody else may be playing a game with a higher fps count, then that's fucked up. I would hope such people are in such a tiny minority that they can be largely ignored. Certainly I don't see large-scale evidence of such thinking on the gaming sites I visit.

Which brings me back to my question to Delta9. Why would PS4 users care what the experience is like on PC?
 
1 - I wouldn't be surprised if this is true. Sony and Microsoft have several billion dollars invested in their consoles, and PC gaming has been climbing in popularity and sales really fast these last couple of years. With so much money at stake, all kinds of aggressive tactics can take place in small meetings behind closed doors between publishers and developers. Even more so when the PC gaming side has no real representative that would/could take competitors to court over anti-competitive duopolistic or monopolistic measures.
This isn't about being evil. It's business. PC gaming is a sitting duck regarding exclusivity and (un)optimization deals.

Normally I'd never believe such rumours, but in this case it could be possible. In the past the console makers would never have cared, since most PC sales are laptops and laptops could never compete with consoles on visuals. But that's changed, and you can now buy laptops that offer what the new consoles can on visuals, just one year after the new consoles launched.

Given that these new consoles are a massive financial investment for these companies, and that they have to make them last at least another six years, I can see why they would be concerned when laptops are able to match them today and overpower them tomorrow once 22nm hits. So yeah, maybe it seems far-fetched that MS/Sony would try to pressure PC games into being locked down to look worse, but in this circumstance I'm not so sure. They really should be worried when devices that sell in far higher quantities (laptops) are able to match their dedicated gaming consoles today. And at the end of the day, there is no PC body that can push back to stop this from happening. It's not like Lenovo or Dell will come running and apply pressure the other way; there are no checks and balances here.
 
They really should be worried when devices that sell in far higher quantities (laptops) are able to match their dedicated gaming consoles today.

I think this statement only underlines your lack of comprehension as to why many people choose to game on a console rather than a PC.

If it were primarily about technical capabilities, then console sales would slow year on year as the technical disparity between the console and increasingly powerful, cheap PCs/phones/tablets increased.
 
I think this statement only underlines your lack of comprehension as to why many people choose to game on a console rather than a PC.

If it were primarily about technical capabilities, then console sales would slow year on year as the technical disparity between the console and increasingly powerful, cheap PCs/phones/tablets increased.

Tell that to the makers of MP3 players and sound cards. If you already have a device that does something acceptably, then why spend hundreds more on something else that does the same thing at higher cost, with more limitations and no portability? Not everyone is a blind gamer that buys whatever console these manufacturers excrete. Many people actually look at things like cost, value, and whether they already have a device that can do what this one does. Do you really not see the threat here? It's fascinating that people can easily see the threat things like tablets and laptops posed to desktop PCs, because they did mostly the same stuff with a better form factor, and hence many shifted to them. But try to apply the same logic to consoles and good luck getting people on a gaming forum to see the threat.
 
This is a good question and reminds me of a question I was going to ask but never did. Based on my pathetic understanding of these things, I would assume that while getting the assets off the hard drive and into memory is a bottleneck, it is the CPU that is in charge of fetching those assets and assigning them either to memory or to be sent to the GPU or whatever. If there are a lot of assets to be managed, there is a lot of CPU used. So you are talking about CPU time as well as bandwidth issues as well as hard drive issues. I assume a bunch of assets are loaded more or less automatically into some memory buffer for each level or stage, so the hard drive isn't as much of a problem.

I believe I read that with baked lighting the lighting data is "mixed" into the texture, and when the CPU sends the data to the GPU, the GPU deals with applying the texture to the polygon and with how it looks after that. My question is the difference between baked lighting generally and baked GI lighting. I again (foolishly?) assume that the GI part just means more data being mixed into the asset, or more work on the CPU's part in telling the GPU what to do. More data means more work for the CPU, or more bandwidth, or both.

I will, however, do a search on the subject on the forum, so take this as a rhetorical question until I cry for mercy :LOL:


I thought the PlayGo chip was in charge of moving data to and from the hard drive?
 
I thought the PlayGo chip was in charge of moving data to and from the hard drive?

According to the famous Gamasutra Cerny article:

Cerny talked about PlayGo, the system by which the console will download digital titles even as they're being played.

"The concept is you download just a portion of the overall data and start your play session, and you continue your play session as the rest downloads in the background," he explained to Gamasutra.

However, PlayGo "is two separate linked systems," Cerny said. The other is to do with the Blu-ray drive -- to help with the fact that it is, essentially, a bit slow for next-gen games.

"So, what we do as the game accesses the Blu-ray disc, is we take any data that was accessed and we put it on the hard drive. And if then if there is idle time, we go ahead and copy the remaining data to the hard drive. And what that means is after an hour or two, the game is on the hard drive, and you have access, you have dramatically quicker loading... And you have the ability to do some truly high-speed streaming."

So getting stuff into memory may involve the CPU more often than not. Maybe some GPU compute will take over some of this, but AFAIK the CPU chooses what goes where, essentially. Of course, batching things together and letting the DMA engines do their thing is an obvious way to reduce the CPU load.
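
To illustrate that last point, here is a minimal sketch (hypothetical paths and function names; not Sony's actual API, which does this below the game layer) of the read-through caching behaviour the Cerny quote describes: serve a file from the HDD if it's already there, otherwise read it from the slow optical source and keep a copy for next time.

```cpp
#include <filesystem>
#include <fstream>
#include <string>
#include <vector>

namespace fs = std::filesystem;

// Hypothetical mount points for the slow optical source and the fast local cache.
const fs::path kDiscRoot  = "/bluray";
const fs::path kCacheRoot = "/hdd/cache";

// Read-through cache: serve from the HDD if we've seen the file before,
// otherwise read it from the (slow) disc and keep a copy on the HDD so the
// next access is fast -- the same idea as the second PlayGo system Cerny describes.
std::vector<char> loadAsset(const std::string& relPath) {
    const fs::path cached = kCacheRoot / relPath;
    const fs::path source = fs::exists(cached) ? cached : kDiscRoot / relPath;

    std::ifstream in(source, std::ios::binary);
    std::vector<char> data((std::istreambuf_iterator<char>(in)),
                           std::istreambuf_iterator<char>());

    if (source != cached) {                         // first access: populate the cache
        fs::create_directories(cached.parent_path());
        std::ofstream out(cached, std::ios::binary);
        out.write(data.data(), static_cast<std::streamsize>(data.size()));
    }
    return data;
}
```

The idle-time background copy of the remaining data, and the batching/DMA hand-off discussed above, are left out to keep the sketch short.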
 