Epic Says This Is What Next-Gen Should Look Like

What Laa-Yosh is trying to explain is: why bother investing in better gameplay or bigger, wider environments when the current CoD model is selling as well as it is?

The people asking for sandbox environments in their shooters are the minority compared to the millions of people who buy CoD every year.
But it seems to me that the annual COD purchase is more about the multiplayer than the single player. Sure, the SP is okay, but just about everyone I know who buys COD does so for the multi, including interns at the company I work at now who don't have Live Gold (shoulda gotten a PS3 :cool:) and just play splitscreen with dorm buddies. (Splitscreen is a far more important feature than a lot of you dev types give it credit for.)

I agree that SP has kind of stagnated, and informal anecdotes seem to confirm that, but it's not a technology problem. There's not really any type of game that can't be done. Sure, it's not as pretty if you make a bigger, more open world, but it's not like that stopped Grand Theft Auto from succeeding. Heck, there's no reason GTA III couldn't have been first-person. And I remember Borderlands doing fairly well.

Whatever you do it has to be fun if you want it to sell.
 
UE3 is in good enough shape, with the recent updates shown in the Samaritan tech demo, to build some pretty cool-looking launch games. I think those updates added full deferred rendering to the engine, along with hair, tessellation, cloth simulation, etc. None of those features are in Gears 3, and that game quite clearly comes close to maxing out the Xbox 360.
So it's going to be a double win: features and possibilities can clearly go beyond X360/PS3 hardware levels, but devs also get the benefit of staying with well-known art pipelines and tools.
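For anyone who wants the gist of what "full deferred rendering" actually buys you, here's a minimal CPU-side sketch (my own illustration, not Epic's code): geometry attributes get written into a G-buffer first, and lighting then runs as a screen-space pass whose cost no longer depends on scene complexity.

```cpp
#include <vector>

// One G-buffer texel: what the geometry pass leaves behind per pixel.
struct GBufferTexel {
    float albedo[3];   // surface colour
    float normal[3];   // world-space normal
    float depth;       // enough to reconstruct position if needed
};

struct Light { float dir[3]; float colour[3]; };  // simple directional light

// Pass 1 (rasterization, not shown) fills the G-buffer.
// Pass 2: lighting reads only the G-buffer, so adding lights costs
// per-pixel work, not another geometry submission - the key win.
void lightingPass(const std::vector<GBufferTexel>& gbuf,
                  const std::vector<Light>& lights,
                  std::vector<float>& outRGB) {  // sized 3 * gbuf.size()
    for (size_t i = 0; i < gbuf.size(); ++i) {
        float rgb[3] = {0, 0, 0};
        for (const Light& L : lights) {
            // Basic N.L diffuse term per light.
            float ndl = gbuf[i].normal[0] * L.dir[0]
                      + gbuf[i].normal[1] * L.dir[1]
                      + gbuf[i].normal[2] * L.dir[2];
            if (ndl > 0)
                for (int c = 0; c < 3; ++c)
                    rgb[c] += ndl * L.colour[c] * gbuf[i].albedo[c];
        }
        for (int c = 0; c < 3; ++c) outRGB[i * 3 + c] = rgb[c];
    }
}
```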

I think we're only going to see an incremental upgrade in graphics when the nextgen systems finally launch. Expect stuff to be a fair level above BF3's PC version but not radically different from what we see now.
 
While id missed this gen, they're in a really good position right now for next gen.

I don't see how id has any more of a head start on next-generation engines than Epic. UE4 is going above and beyond, targeting many-core hardware in a way UE3 doesn't. That sort of hardware won't be appearing until 2013-2014 anyway, so everyone is just as much in the dark as they are. Are you somehow expecting id Software to have a showcase many-core title by 2013?

Targeting many-core isn't likely to be as important for the transitional period (2012-2014) anyway, in the same vein that the launch titles for 360/PS3 didn't make great use of multicore or more advanced shading algorithms. As Laa-Yosh already mentioned, UE3 is a well-established piece of middleware, so it's not like it'll have much of a problem on WiiU. Especially considering the WiiU is more or less playing catch-up in technology rather than pushing "true next-gen", UE3 has some legs left for the current state of multicore and DX10+ class hardware.

Also consider multicore penetration in the PC market... it's taken years for a significant chunk of gamers to get even four cores, and only now, with Frostbite 2, has it been shown that quad cores matter over dual cores. How many folks are even going to have more than four cores by then? PC dual cores are also much more powerful per core, which up to now has made them fairly sufficient against the consoles' 6+ hardware threads. My point being, it'll be years before "many-core" matters for game releases, when it mostly comes down to the art; it'll be relatively trivial to push graphics - higher-quality shadow maps, deferred shading with FP16, higher-quality shaders/lighting, more polys, etc.
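Just to illustrate one item on that list with quick math: deferred shading with FP16 is mostly a bandwidth/fill-rate cost rather than a CPU-core problem. The 4-target layout below is my own assumption, not any particular engine's.

```cpp
#include <cstdio>

int main() {
    long   pixels  = 1920L * 1080;  // 1080p
    int    targets = 4;             // e.g. albedo, normal, specular, emissive
    int    bytes   = 8;             // RGBA16F = 4 channels * 2 bytes each
    double mb = pixels * (double)targets * bytes / (1024.0 * 1024.0);
    std::printf("FP16 G-buffer: %.1f MB written per frame\n", mb);  // ~63.3 MB
    return 0;
}
```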
 
Actually, all signs point to Epic trying something radically different. Sweeney has been interested in software rendering approaches, micropolygons and such for a while now, and the long development schedule also suggests that they're breaking new ground...
 
I also recall Todd Hollenshead from id saying that software rendering was the future. But GPUs are getting so ridiculously powerful that I wonder what's left for poor CPUs when it comes to rendering.

I remember one of my favourite games ever, Need for Speed 3: Hot Pursuit for PC in 1998, let you select between a software renderer and a 3D-accelerated renderer in the video options.

The 3D renderer looked a lot better and allowed higher resolutions, of course, but the software renderer included some unique effects.
 
Who said you have to do software rendering on CPU? There have been a few GPU software renderers as well :)

I can't remember where I saw the presentation/research paper, but someone had implemented a software rasterizer running on a GPU, and in some scenarios it performed extremely well compared to the usual solutions.
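The basic idea is easy to sketch; a GPU software rasterizer parallelizes these same per-pixel edge tests across thousands of threads. A minimal CPU illustration (my own names, assuming CCW-wound triangles):

```cpp
#include <algorithm>
#include <vector>

struct Vec2 { float x, y; };

// Signed area test: >= 0 when p is on the inner side of edge a->b (CCW).
static float edge(const Vec2& a, const Vec2& b, const Vec2& p) {
    return (b.x - a.x) * (p.y - a.y) - (b.y - a.y) * (p.x - a.x);
}

void rasterize(const Vec2& v0, const Vec2& v1, const Vec2& v2,
               int width, int height, std::vector<unsigned char>& mask) {
    // Integer bounding box of the triangle, clamped to the framebuffer.
    int x0 = std::max(0, (int)std::min({v0.x, v1.x, v2.x}));
    int y0 = std::max(0, (int)std::min({v0.y, v1.y, v2.y}));
    int x1 = std::min(width  - 1, (int)std::max({v0.x, v1.x, v2.x}));
    int y1 = std::min(height - 1, (int)std::max({v0.y, v1.y, v2.y}));

    for (int y = y0; y <= y1; ++y)
        for (int x = x0; x <= x1; ++x) {
            Vec2 p{x + 0.5f, y + 0.5f};  // sample at the pixel centre
            // Inside when the sample passes all three edge tests.
            if (edge(v0, v1, p) >= 0 && edge(v1, v2, p) >= 0 &&
                edge(v2, v0, p) >= 0)
                mask[(size_t)y * width + x] = 255;  // mask sized width*height
        }
}
```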
 
Sweeney says UE4 isn't coming until 2014:

http://techreport.com/discussions.x...mpaign=Feed:+techreport/all+(The+Tech+Report)

Sounds like they're going to miss the next-gen boat. There's no way that MS and Sony are going to give Nintendo a two-plus-year head start.

While id missed this gen, they're in a really good position right now for next gen.

Well, I am sure Epic will be there when next gen arrives, or maybe a year after, like Gears 1 in 2006. Moreover, I am sure Epic will make a point of being there early next gen as a matter of strategy. If anything, Epic mentioning 2014 suggests a timeline for next-gen consoles then, not a knock on Epic's timeliness.

My point is that the next-gen console timeline is what Epic will build the UE4 timeline around.

Since Xb720/PS4 isn't likely to launch until at least 2013, it works out fine.
 
Who said you have to do software rendering on CPU? There have been a few GPU software renderers as well :)

I can't remember where I saw the presentation/research paper, but someone had implemented a software rasterizer running on a GPU, and in some scenarios it performed extremely well compared to the usual solutions.
HPG 2011?
 
Full software rendering is not the future, at least not for the X3/PS4 generation IMHO.

I can foresee something like Frostbite 2's SPU-based tiled deferred shading as a good future direction though, where laying out the initial passes utilizes the GPU's hardwired extra performance (interpolators, filtering, etc.) but the CPU is heavily used in the rendering process as well.
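Roughly, the tiled part works like this (a hedged sketch of the publicly described idea, with made-up names; the 16-pixel tile size and the screen-space circle as a light proxy are my simplifications):

```cpp
#include <algorithm>
#include <vector>

struct Light { float x, y, radius; };  // screen-space centre + extent
constexpr int TILE = 16;               // 16x16 pixel tiles

// Pass 1: per-tile light lists (on PS3, DICE ran this culling on the SPUs).
std::vector<std::vector<int>>
buildTileLightLists(int width, int height, const std::vector<Light>& lights) {
    int tilesX = (width + TILE - 1) / TILE;
    int tilesY = (height + TILE - 1) / TILE;
    std::vector<std::vector<int>> lists(tilesX * tilesY);
    for (int ty = 0; ty < tilesY; ++ty)
        for (int tx = 0; tx < tilesX; ++tx) {
            float minX = (float)tx * TILE, minY = (float)ty * TILE;
            float maxX = minX + TILE,      maxY = minY + TILE;
            for (int i = 0; i < (int)lights.size(); ++i) {
                // Light touches the tile if its circle reaches the tile rect.
                float cx = std::max(minX, std::min(lights[i].x, maxX));
                float cy = std::max(minY, std::min(lights[i].y, maxY));
                float dx = lights[i].x - cx, dy = lights[i].y - cy;
                if (dx * dx + dy * dy <= lights[i].radius * lights[i].radius)
                    lists[ty * tilesX + tx].push_back(i);
            }
        }
    return lists;
}
// Pass 2 (not shown) then shades each pixel against only its tile's short
// list instead of every light in the scene - that's where the savings are.
```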

As for the timing of UE4, it will only be released/completed well into the next hardware cycle. No need to release earlier, either - I'm sure UE3 has a few inefficiencies and such because they did not wait for final hardware. UE3.5 will be more than enough for launch titles.
 
While it's possible individual games could use software rendering, I agree with Laa-Yosh that it won't be practical anytime soon for a general-purpose engine like UE.

Also, that HPG paper, while interesting, is misleading if you just look at the performance numbers. The paper didn't measure power usage and didn't execute shaders. If it had measured power, you'd see that fixed-function hardware is much more efficient, because the shaders in a modern GPU take up a significant portion of the die and they were mostly idle in those tests.
 
I'm curious to know what kind of horsepower we're really looking at to run something like a UE4-based game, if the "UE3.xx"-based Samaritan demo wasn't demanding enough already. Is it safe to assume a UE4 version of Samaritan would blow the current one away?
 
Well Rein said Samaritan wasn't UE4 but could be called UE 3.9999 (or something like that). So I guess, no.
 
So it's pretty much UE4 minus a wet shader for that last 0.0001%, then. Why couldn't they just call it UE4? I mean, I'm pretty sure UE3 was more of a UE2.99 at the time of its inception.
 
Because the engine version number isn't defined purely by the features of the renderer... ;)

Let's not forget Unreal is an entire suite of tools for games development, and I'm quite sure that any radically new rendering tech will require an even larger degree of development effort in building the tool chain that allows clients to build content that leverages it...
 
That's not true IMHO - again, if Sweeney plans to keep developing until 2014, it means there has to be something radically different. Voxels, micropolygons, raytracing, whatever.
 
I think that "3.999" as almost UE4 might be misconception. Maybe Mark really just meant it's highly refined, "almost last version" of ue3 not almost "features/renderer parity" Samaritan demo was running at 2560/1600 res with high amount of MSAA under dx11api ( with its cons). performance gains for consoles easly seen;) On top of that i bet highly unoptimized brute force approach in effects and assets departments is true in this case. I think this quality should be easily achievable in some launch titles for next gen consoles. every technique in samaritan is available in current sdk...
As Laa-Yosh pointed out that kind of long R&D must guarantee next leaps in visuals and/or performance.
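To put rough numbers on the resolution/MSAA point (the 4x MSAA figure is my assumption; Epic hasn't said exactly what Samaritan used):

```cpp
#include <cstdio>

int main() {
    long demo    = 2560L * 1600 * 4;  // Samaritan: 2560x1600 with 4x MSAA (assumed)
    long console = 1280L * 720;       // a typical 720p console target, no MSAA
    std::printf("demo samples:    %ld\n", demo);     // 16,384,000
    std::printf("console samples: %ld\n", console);  //    921,600
    std::printf("ratio: ~%.1fx\n", (double)demo / console);  // ~17.8x
    return 0;
}
```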
 
Honestly, a single GTX 580 should be able to achieve at least Samaritan-level visuals at 1080p. That GPU is just unbelievably, wickedly powerful - an order of magnitude over the current consoles.
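For what it's worth, a quick sanity check on the commonly cited FP32 throughput specs (take the exact ratios loosely; shader FLOPS are a crude proxy for real performance):

```cpp
#include <cstdio>

int main() {
    // GTX 580: 512 CUDA cores * 2 FLOPs/clock * 1.544 GHz shader clock.
    double gtx580 = 512 * 2 * 1.544;  // ~1581 GFLOPS
    double xenos  = 240.0;            // Xbox 360 GPU, commonly cited figure
    double rsx    = 192.0;            // PS3 GPU programmable shading, cited figure
    std::printf("GTX 580: ~%.0f GFLOPS\n", gtx580);
    std::printf("vs Xenos: ~%.1fx   vs RSX: ~%.1fx\n",
                gtx580 / xenos, gtx580 / rsx);  // ~6.6x and ~8.2x
    return 0;
}
```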
 
But it wasn't run on a single GTX 580; it was run on three GTX 580s. Extrapolating that to a single card is like saying the bullshots we so often see should be doable in-game.
 
It was also very unoptimised, and I believe Epic did say they could get it running on a single 580 with some proper optimisation.
 