How should devs handle ports between consoles? *spawn

And this 25 GB of baked lighting doesn't make sense to a layman like me. If you've already baked your lighting, how is that going to impact the CPU? Maybe the real limiting factor is the HDD.

There are many ways to bake lighting, it's quite probably not just a simple extra lightmap texture layer. We'll have to wait and see if they're going to talk about it in further detail at some conferences. But it does look quite good IMHO.
 
Tell that to makers of mp3 players and sound cards. If you already have a device that does something acceptably, then why spend hundreds more on something else that does the same thing at higher cost, with more limitations and no portability?
You've completely lost me. I don't know that it's relevant, but the last sound card I bought was an AWE32, which must have been about 20 years ago. The last MP3 player I bought was an 80 GB 5th-gen click wheel iPod, which is closing in on 10 years ago.

I'm still lost, though :???:

Not everyone is a blind gamer that buys any console these manufacturers excrete. Many people actually look at things like cost, value, and whether they already have a device that can do what this one does.
Oh, so anybody who chooses a platform different to yours is a "blind gamer". It's not possible that other people have different priorities, requirements or preferences to you? :nope: Blind gamers buying excrement :yep2:

Do you really not see the threat here?

I don't feel threatened by PCs. I've owned personal computers since my parents bought me a C64. Do you feel threatened by consoles? Have you spoken to anybody about this? :oops:

It's fascinating that people can easily see the threat that things like tablets and laptops posed to desktop PCs, because they did mostly the same stuff in a better form factor, and hence many shifted to them. But try to apply the same logic to consoles and good luck getting people on a gaming forum to see the threat.

I use a PC (and have done for 30+ years), a smart phone (5+ years), tablets (3 years) and consoles (almost 20 years). I don't feel any are a threat to me, my girlfriend or our way of life. I enjoy games on all platforms and know that some technology will wane in popularity and disappear. If consoles fade with this generation, so be it. Gaming won't disappear. I'm not going to turn into some crazy person lamenting the decline of 8-Track.
 
Even if that's the case, the PS4 has 500 GFLOPS of GPGPU compute free that could have handled the lighting.

... or have the GPU do other CPU-ish things to make room for more lighting. In any case, since this title and the engine it's running on are pushing lighting as a major differentiator, it seems likely that the next iteration will use compute for something. Whether the next title sticks with baked lighting, some other method, or both, they're going to want more headroom for new game logic and the like, and that means CPU.

I am just spitballing here, but when starting this process I would assume Ubi had an idea that compute was eventually going to be added to the engine. Maybe they will just iterate on what they have for the next title and do a major overhaul of the engine to include compute next time. There may be other middleware compute solutions by then, who knows.

I am merely thinking out loud here, but are the engines for the new consoles leaving some "room" for basic compute solutions, or will that wait for a newish engine?
 
I believe I read that with baked-in lighting, the lighting data is "mixed" into the texture; the CPU sends the data to the GPU, and the GPU handles applying the texture to the polygons and how it looks after that. My question is about the difference between baked-in lighting in general and baked-in GI lighting. I again (foolishly?) assume that the GI part just means more data being mixed into the asset, or more work on the CPU's part in telling the GPU what to do. More data means more work for the CPU, or more bandwidth, or both.

In the days of Quake, baking lighting meant simply running the radiosity/GI sim on your level and rendering the results into a texture. This texture used a secondary UV set on all level geometry with no repetition, so its resolution was quite low - just look at any screenshot and you can see the stair-stepping in the shadow edges. AFAIK even Halo 3 used similar techniques.
Half-Life 2 added some trickery later on, I don't remember the exact details though.

This is, however, relatively simple stuff for rendering: you just need to combine your lightmap texture with the color map (roughly as in the sketch below), and maybe do some more complex math in the shader if you want specular highlights and such. Voodoo 2 cards got a second TMU practically just for this :)
But a significant issue is that your baked lighting cannot affect dynamic objects like characters or moving stuff. There were several simple tricks to overcome this, like ambient cube maps and such, but they weren't really in sync with the lighting of the static level geometry.
You also can't have time-of-day changes in the lighting, as you can't really render multiple lightmaps for multiple conditions; you simply can't cache that much data.
It's also not practical to render lightmaps for an open world game: first, the dataset would be HUGE, and second, it would take a LOT of time to calculate. Also, CPUs and GPUs have gotten a lot faster, and shaders can be far more complex, so many new opportunities have opened up.
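
Just to illustrate how simple that classic combine is, here's a minimal sketch - all names are invented, it's not any particular engine's code:

```cpp
// Minimal sketch of the Quake-era lightmap combine described above.
// The albedo comes from the repeating color map (primary UVs), the lightmap
// from the low-res, non-repeating secondary UV set; the two are simply
// multiplied together. This is roughly what the Voodoo 2's second TMU did
// in fixed function.
struct Color { float r, g, b; };

Color ShadeLightmappedTexel(const Color& albedo, const Color& lightmap)
{
    return { albedo.r * lightmap.r,
             albedo.g * lightmap.g,
             albedo.b * lightmap.b };
}
```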

So nowadays the approaches are quite different, light probes placed in the level and such stuff, especially if you want dynamic lighting on the level geometry and so on. Again, we don't yet know anything about how ACU does its lighting, but it definitely looks far more advanced and realistic than AC4.
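
To give a rough (and totally hypothetical) picture of the probe idea for anyone following along: a dynamic object samples the baked probes nearest to it and blends them, something like this:

```cpp
// Hypothetical illustration of light probes: a dynamic object (character,
// crate, etc.) gathers pre-baked ambient lighting from probes placed around
// the level, weighting the nearby ones more strongly. This is NOT how ACU
// actually does it - just the general concept.
#include <vector>

struct Vec3  { float x, y, z; };
struct Probe { Vec3 position; Vec3 ambient; };  // baked irradiance, simplified to one RGB value

Vec3 SampleProbes(const std::vector<Probe>& probes, const Vec3& p)
{
    if (probes.empty()) return {0.0f, 0.0f, 0.0f};
    Vec3 sum{0.0f, 0.0f, 0.0f};
    float totalWeight = 0.0f;
    for (const Probe& probe : probes) {
        float dx = p.x - probe.position.x;
        float dy = p.y - probe.position.y;
        float dz = p.z - probe.position.z;
        // Closer probes contribute more (inverse squared distance).
        float w = 1.0f / (1e-4f + dx * dx + dy * dy + dz * dz);
        sum.x += probe.ambient.x * w;
        sum.y += probe.ambient.y * w;
        sum.z += probe.ambient.z * w;
        totalWeight += w;
    }
    return { sum.x / totalWeight, sum.y / totalWeight, sum.z / totalWeight };
}
```

Real implementations use a grid or tetrahedral mesh of probes and store directional data (spherical harmonics and such) rather than a single color, but the blending idea is the same.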
 
Again, we don't yet know anything about how ACU does its lighting, but it definitely looks far more advanced and realistic than AC4.

Usual disclaimer about how I don't know anything like as much as folks like yourself, but ... I think the new AC lighting looks pretty nice. And they've done this across 3 platforms, one of which has wildly varying configurations.

If it's true about CPU being a limiting factor in scaling resolutions, then I find the implication that Ubi shouldn't have made this step up because PS4 owners deserve to have all their flops used rather annoying.
 
So nowadays the approaches are quite different, light probes placed in the level and such stuff, especially if you want dynamic lighting on the level geometry and so on. Again, we don't yet know anything about how ACU does its lighting, but it definitely looks far more advanced and realistic than AC4.

Bart Wronski discussed in some detail what he/they were doing for ACIV (not sure if you saw that already)
http://bartwronski.files.wordpress.com/2014/05/assassin_s-creed-4-digital-dragons-2014.pdf

I just skimmed over it the other... week, but I think they basically did just bake probes for different times of the day and blended?

Anyways, there were a couple of hints about where he wanted to go with it.
 
Truth is, new lighting techniques are becoming increasingly complex and harder for me to understand as well ;)

Also, this is a new generation of hardware, so developers are obviously still experimenting with the new possibilities. Just because some tech is newer doesn't mean it's better, and anything can turn out to be a dead end. In this case, it is likely that Ubi overestimated the CPU's capabilities, but by the time they had results it was too late to turn back. Or it could also be a case where they can optimize CPU usage or move more calculations to the GPU later on. The inevitable 2015 AC game will probably help us understand this a bit better.
 
I just skimmed over it the other... week, but I think they basically did just bake probes for different times of the day and blended?

I'll check it out later, but I don't think they've baked light maps. Storing lighting data in a different format could probably still allow for multiple bakes and blending.

Understand that with light maps, you're also baking shadows and indirect lighting, basically everything.
Edit: so, blending between different lighting setups would have required blending the shadows as well, and you would've needed LOTS of samples (i.e. separate light maps) to make that transition look smooth enough.
The main addition in HL2 was that they were able to have dynamic speculars but I'm not sure how far that went either. The game definitely did not have a time of day feature though.

So as long as AC4 is not using light maps, it's probably a lot easier to pre-process for multiple conditions.
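
If I had to guess at what "multiple bakes and blending" might look like in practice, it'd be something like the sketch below - purely my own illustration with invented names, not based on the actual ACIV code:

```cpp
// Speculative sketch of time-of-day blending between pre-baked lighting sets:
// bake the indirect lighting for a handful of key times (dawn, noon, dusk,
// night), then at runtime interpolate between the two nearest bakes.
#include <array>
#include <cstddef>

struct Irradiance { float r, g, b; };       // per-probe baked indirect lighting

constexpr std::size_t kKeyTimes = 4;        // number of baked time-of-day sets

// 'bakes' holds one array of per-probe irradiance per key time of day.
// 'timeOfDay01' runs from 0.0 (first key time) to 1.0 (last key time).
Irradiance BlendBakes(const std::array<const Irradiance*, kKeyTimes>& bakes,
                      std::size_t probeIndex, float timeOfDay01)
{
    float t = timeOfDay01 * static_cast<float>(kKeyTimes - 1);
    std::size_t i0 = static_cast<std::size_t>(t);
    std::size_t i1 = (i0 + 1 < kKeyTimes) ? i0 + 1 : i0;
    float f = t - static_cast<float>(i0);   // blend factor between the two bakes

    const Irradiance& a = bakes[i0][probeIndex];
    const Irradiance& b = bakes[i1][probeIndex];
    return { a.r + (b.r - a.r) * f,
             a.g + (b.g - a.g) * f,
             a.b + (b.b - a.b) * f };
}
```

With probe-style data that's just a per-value lerp; with full light maps you'd be blending the baked shadows too, which is exactly the problem described above.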

Also, my impression is that most game lighting tech operates with abstractions, ignoring real world physics. So they can optimize based just on how the implementation works*, throwing real-life considerations away completely. And even physics isn't entirely sure if light is a wave or a particle, anyway ;)

* For example, you can just filter light maps or shadow maps, and the results will be similar enough to real life, even though you're not working with physical sizes for the light emitters and so on. Or, in early CGI, before GI computations were affordable, it was common to just place a LOT of direct light sources manually to simulate bounce lights. Ambient (and reflection!) occlusion was a similar cheat as well, very little to do with real life but working well enough until computers got fast enough to just turn on GI.
 
Apologies I misunderstood what you said. Yes, agreed.

However, I would also say that most people both expect and accept that in whatever their personal interests are, it's not the best experience possible and many people have a better experience - by whatever subjective measurement an individual may apply.

I have a beautiful 4K Sony X-series Bravia TV and a nice Samsung 5.1 surround sound system. I game mostly from a luxurious and comfy leather sofa. I have a decent PC (Mac) but choose to game mostly on a console. I don't for a second think my setup is anywhere close to the best experience possible, nor do I begrudge those folks who worked hard to have better equipment - I give it no thought at all. If there are people whose enjoyment is somehow diminished by the thought/knowledge that somebody else may be playing a game with a higher fps count then that's fucked up. I would hope such people are in such a tiny minority that they can be largely ignored. Certainly I don't see large-scale evidence of such thinking from the gaming sites I visit.

Which brings me back to my question to Delta9. Why would PS4 users care what the experience is like on PC?

Wouldn't that also bring back your first quote?
Not sure if you're trolling. If your customers have an Xbox One then you want to sell them games so they will give you money.

Did that really have to be explained?
Why does this only work one way?
 
I'll check it out later, but I don't think they've baked light maps. Storing lighting data in a different format could probably still allow for multiple bakes and blending.

Ah right. Talking about different things. They've baked the sunlight bounce irradiance.
 
... or have the GPU do other CPU-ish things to make room for more lighting. In any case, since this title and the engine it's running on are pushing lighting as a major differentiator, it seems likely that the next iteration will use compute for something. Whether the next title sticks with baked lighting, some other method, or both, they're going to want more headroom for new game logic and the like, and that means CPU.

I am just spitballing here, but when starting this process I would assume Ubi had an idea that compute was eventually going to be added to the engine. Maybe they will just iterate on what they have for the next title and do a major overhaul of the engine to include compute next time. There may be other middleware compute solutions by then, who knows.

I am merely thinking out loud here, but are the engines for the new consoles leaving some "room" for basic compute solutions, or will that wait for a newish engine?

Ubi are already using compute for the clothing and maybe other things. Here's what I think happened.

1) Port the game from PC to consoles, and find out the CPU sucks and has no headroom to run their fancy cloth physics that they can comfortably run on the CPU alone on PC
2) Move the cloth routines to the GPU using compute (roughly the sort of per-particle work sketched at the end of this post)
3) Reduce the res to 900p and scale back other effects because GPU compute is now taking away from traditional rendering time

If they can get async compute running in the future maybe we'll see the res bumped up. Not very impressed by what they're showing in light of the 900p and especially due to the baked lighting.
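
For what it's worth, the sketch below is the sort of per-particle step I mean by moving cloth to compute - entirely speculative on my part, nothing to do with Ubi's actual code:

```cpp
// Speculative sketch of the per-particle work a cloth sim pushes to GPU
// compute. On the GPU, the loop body would be one compute-shader invocation
// per particle; here it's written as a plain CPU loop for clarity.
#include <vector>

struct Vec3 { float x, y, z; };

struct ClothParticle {
    Vec3 pos;
    Vec3 prevPos;
    bool pinned;    // e.g. where the cloth is attached to the character
};

void IntegrateCloth(std::vector<ClothParticle>& particles, float dt)
{
    const Vec3 gravity{0.0f, -9.81f, 0.0f};
    for (ClothParticle& p : particles) {        // one "GPU thread" per particle
        if (p.pinned) continue;
        // Verlet integration: velocity is implied by the previous position.
        Vec3 vel{ p.pos.x - p.prevPos.x,
                  p.pos.y - p.prevPos.y,
                  p.pos.z - p.prevPos.z };
        p.prevPos = p.pos;
        p.pos.x += vel.x + gravity.x * dt * dt;
        p.pos.y += vel.y + gravity.y * dt * dt;
        p.pos.z += vel.z + gravity.z * dt * dt;
    }
    // A real sim would follow this with constraint-relaxation and collision
    // passes, which is where most of the GPU time would actually go.
}
```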
 
You've completely lost me. I don't know that it's relevant, but the last sound card I bought was an AWE32, which must have been about 20 years ago. The last MP3 player I bought was an 80 GB 5th-gen click wheel iPod, which is closing in on 10 years ago.

I'm still lost, though :???:

Oh, so anybody who chooses a platform different to yours is a "blind gamer". It's not possible that other people have different priorities, requirements or preferences to you? :nope: Blind gamers buying excrement :yep2:

I don't feel threatened by PCs. I've owned personal computers since my parents bought me a C64. Do you feel threatened by consoles? Have you spoken to anybody about this? :oops:

I use a PC (and have done for 30+ years), a smart phone (5+ years), tablets (3 years) and consoles (almost 20 years). I don't feel any are a threat to me, my girlfriend or our way of life. I enjoy games on all platforms and know that some technology will wane in popularity and disappear. If consoles fade with this generation, so be it. Gaming won't disappear. I'm not going to turn into some crazy person lamenting the decline of 8-Track.

None of what I said is about any of that, you have to remove your opinions from the mix and think more broadly to understand what I'm getting at. But in any case this is the wrong thread for this, and discussing such things on a gamer forum is probably about as useful as me complaining on an Apple forum that iOS 8 has rendered my iPad Mini useless for weeks now by breaking its wifi. Net result would be the same, an endless circle jerk.
 
Which brings me back to my question to Delta9. Why would PS4 users care what the experience is like on PC?

Wouldn't that also bring back your first quote?

Not sure if you're trolling. If your customers have an Xbox One then you want to sell them games so they will give you money.

Why does this only work one way?

Why does what work one way? How are these issues related? You asked why developers bothered with the Xbox One, to which I answered: because people willing to buy games own them. The next issue is PS4 owners want PC versions of their games nerfed so they feel they have a better relative experience, which I say is rubbish.

None of what I said is about any of that, you have to remove your opinions from the mix and think more broadly to understand what I'm getting at.

If you can't come to the point and others have to try to 'get at' what you're trying to convey, then you need to communicate better in the first place.

But in any case this is the wrong thread for this, and discussing such things on a gamer forum is probably about as useful as me ...

Then start a new thread. You clearly would like to discuss it, I would, and I'm sure others would as well, as it's a recurring discussion that creeps into other threads.
 
If you can't come to the point and others have to try to 'get at' what you're trying to convey, then you need to communicate better in the first place.

The problem is you took it personally and started giving examples relating to yourself. Once it goes that way all is lost and there is no point in arguing anything further. You have to completely remove yourself from the mix, zoom out and evaluate the situation more broadly and objectively, without having your own personal preferences taint things. Otherwise these kinds of discussions go nowhere fast.

At the end of the day, any and all devices, regardless of who makes them, how loved they are and how many marketing dollars are behind them, have to justify their cost and existence, because everything is replaceable. For example, people are now saying iPhone 6+ phones will cannibalize iPad sales because some people that have an iPhone 6+ will no longer feel the need to also purchase an iPad, whereas before they would own both. That doesn't imply that an iPad is a bad product, but it does show that it is no longer a must-have if you have a large phone. It can be replaced.

If someone already owns a laptop for work, play or whatever that can, as of today, do whatever a console can, with the addition of being cheaper to game on, being portable, and so on, then for some people it will be able to replace a console. It doesn't mean a console is a bad product, it just means that a portion of the populace will no longer need a console, just like a portion of the populace that uses a large phone will no longer feel the need to also buy a tablet. Everything is replaceable. In the past the concept of a laptop replacing a console was laughable for a whole host of reasons. And yet today you can get the equivalent gaming experience of a new console on battery power, with full portability and cheaper games to boot. That right there should cause anyone with a multi-billion-dollar console investment to be worried.

But like I said, discussing that sort of thing on a gaming forum is probably as useful as discussing the value of meat on a vegan forum. So never mind.
 
It's fascinating that people can easily see the threat that things like tablets and laptops posed to desktop PCs, because they did mostly the same stuff in a better form factor, and hence many shifted to them. But try to apply the same logic to consoles and good luck getting people on a gaming forum to see the threat.

Most people used PCs for surfing, shopping and emailing. The problem with consoles is that they have exclusives and better controls, along with brand loyalty and ease of use.
 
How would you lock a PC which has significantly more power and get away with it? I'm not quite sure what you're trying to say...

See The Evil Within. You simply code your game in such a screwed-up way that a lowly 750 Ti can hit 30fps with ease but a GTX 980 with 3x the power can't hit 60fps.

No frame lock required.
 
All I'll say for now is:
1 - I wouldn't be surprised if this is true. Sony and Microsoft have several billion dollars invested in their consoles, and PC gaming has been climbing in popularity and sales really fast these last couple of years. With so much money at stake, all kinds of aggressive tactics can take place in small meetings behind closed doors between publishers and developers. Even more so when the PC gaming side has no real representative that would/could take competitors to court for illegal duo/monopolistic measures.
This isn't about being evil. It's business. PC gaming is a sitting duck regarding exclusivity and (un)optimization deals.

2 - A tweet from Phil Spencer saying it's not true means little more than Microsoft's more than obvious official stance on the issue. Was anyone expecting to see Phil Spencer sending a tweet saying "Yes, we're pressuring PC devs to lock the FPS in order to make our console look less bad"?
This is the real world, people, not an imaginary one.

While we clearly need to take this claim with a big pinch of salt until we get official confirmation one way or the other, I certainly wouldn't be surprised if this is true. Need for Speed: Rivals, The Evil Within, The Crew beta, and I'm sure there are others, all locked the PC version to 30fps. There should be no reason, ever, to lock a PC game at 30fps.

Given Ubi's recent ridiculous statements about 60fps no longer being an industry target (err, like it ever was outside of the PC) and 30fps being "better" than 60fps, they certainly seem to be laying the groundwork for pushing everyone down the 30fps route.

Incidentally though, I'd say PC gaming is represented by Nvidia and (to a lesser extent) AMD in these types of back-room deals. Their developer programmes would surely be very upset at these kinds of performance limitations being artificially imposed on their high-end GPUs.
 
While we clearly need to take this claim with a big pinch of salt until we get official confirmation one way or the other, I certainly wouldn't be surprised if this is true. Need for Speed: Rivals, The Evil Within, The Crew beta, and I'm sure there are others, all locked the PC version to 30fps. There should be no reason, ever, to lock a PC game at 30fps.

Given Ubi's recent ridiculous statements about 60fps no longer being an industry target (err, like it ever was outside of the PC) and 30fps being "better" than 60fps, they certainly seem to be laying the groundwork for pushing everyone down the 30fps route.

Incidentally though, I'd say PC gaming is represented by Nvidia and (to a lesser extent) AMD in these types of back-room deals. Their developer programmes would surely be very upset at these kinds of performance limitations being artificially imposed on their high-end GPUs.

I agree with this statement with the exception of "err, like it ever was outside of the PC". Consoles have been providing games at 60fps for a long, long time.

Even though I no longer game on a PC, I find it dumbfounding that games would be limited to 30fps on hardware that can run at a much higher rate.
 
I agree with this statement with the exception of "err, like it ever was outside of the PC". Consoles have been providing games at 60fps for a long, long time.

Yeah I probably should have restricted that comment to just the previous console generation. Although some games were 60fps during the last gen, the general industry target was 30fps (on consoles).
 