Unreal creator Tim Sweeney: “PCs are good for anything, just not games”

one

Part 1: Unreal creator Tim Sweeney: “PCs are good for anything, just not games”
http://www.tgdaily.com/content/view/36390/118/
TG Daily: You have to admit, the margin is obviously there.

Sweeney: Agreed. But it is very important not to leave the masses behind. This is unfortunate, because PCs are more popular than ever. Everyone has a PC. Even those who did not have a PC in the past can now afford one, and they use it for Facebook, MySpace, pirating music or whatever. Yesterday’s PCs were for people who were working and later playing games. Even if those games were lower-end ones, there will always be a market for casual games and online games like World of Warcraft. World of Warcraft has DirectX 7-class graphics and can run on any computer. But at the end of the day, consoles have definitely left PC games behind.

TG Daily: In other words: Too big?

Sweeney: Yes, that is a huge difference. If we go back 10 years, the difference between the high end and the lowest end may have been a factor of 10. We could scale games between those two. For example, with the first version of Unreal, a resolution of 320x200 was good for software rendering and we were able to scale that up to 1024x768 if you had the GPU power. There is no way we can scale a game down by a factor of 100; we would just have to design two completely different games, one for the low end and one for the high end.
That is actually happening on PCs: You have really low-end games with minimal hardware requirements, like MapleStory. That is a $100 million-a-year business. Kids are addicted to those games; they pay real money to buy [virtual] items within the game.


TG Daily: Broken down, that means today’s mainstream PCs aren’t suitable for gaming?

Sweeney: Exactly. PCs are good for anything, just not games.

TG Daily: Can that scenario change?

Sweeney: Yes, it actually might. If you look at the past few years, CPU makers have been learning more and more how to take advantage of GPU-like architectures. Internally, they accept larger data and they have wider vector units, and CPUs went from single-threaded products to multiple cores. And who knows, we might find a way to bring software rendering back into fashion.
Then every PC, even the lowest-performing one, will have an excellent CPU. If we could get software rendering going again, that might be just the solution we all need. Intel’s integrated graphics just don't work. I don't think they will ever work.

Part 2: “DirectX 10 is the last relevant graphics API”
http://www.tgdaily.com/content/view/36410/118/
TG Daily: In the first part of our interview you implied that software rendering might be coming back. Daniel Pohl, who rewrote Quake 3 and Quake 4 using ray-tracing [and now works as a research scientist at Intel], recently showed ray-tracing on a Sony UMPC, an ultraportable device equipped with a single-core processor. True, the resolution was much lower than on PCs of today, but it looked impressive. What are your thoughts on ray-tracing? How will 3D develop in the next months and years?

Sweeney: Ray-tracing is a cool direction for future rendering techniques. There is also the REYES-style approach of dividing the scene into micro-polygons, and there are voxels. There are around five to ten different techniques and they are all very interesting for the next generation of rendering.

Rendering can be done on the CPU. As soon as we have enough CPU cores and better vector support, these schemes might become more practical for games. And as GPUs become more general, you will have the possibility of writing a rendering engine that runs directly on the GPU and bypasses DirectX as well as the graphics pipeline. For example, you can write a renderer in CUDA and run it on Nvidia hardware, bypassing all of their rasterization and everything else.

All a software renderer really does is take some scene data as input - the positions of objects, texture maps and things like that - and the output is just a rectangular grid of pixels. You can use different techniques to generate this grid. You don’t have to use the GPU rasterizer to achieve this goal.
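
To make that concrete, here is a minimal sketch of the "scene data in, grid of pixels out" idea: a tiny CPU ray-caster in C++ that shades a single hard-coded sphere into a framebuffer, with no graphics API involved. The scene, names and output format are all invented for illustration; this is not Epic's code, just one of the many possible techniques Sweeney alludes to.

Code:
// Scene data in (one sphere), rectangular grid of pixels out - no graphics API.
#include <cmath>
#include <cstdio>
#include <vector>

struct Vec3 { float x, y, z; };
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

int main() {
    const int width = 320, height = 200;   // the resolution Sweeney mentions for Unreal's software path
    std::vector<unsigned char> framebuffer(width * height, 0);

    const Vec3 center{0.0f, 0.0f, -3.0f};  // one sphere in front of a camera at the origin
    const float radius = 1.0f;

    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            // Build a normalized ray direction through this pixel.
            Vec3 d{(x - width / 2.0f) / height, (y - height / 2.0f) / height, -1.0f};
            float len = std::sqrt(dot(d, d));
            d = Vec3{d.x / len, d.y / len, d.z / len};

            // Ray-sphere intersection: solve t^2 + b*t + c = 0 (ray origin is 0,0,0).
            Vec3 oc{-center.x, -center.y, -center.z};
            float b = 2.0f * dot(oc, d);
            float c = dot(oc, oc) - radius * radius;
            float disc = b * b - 4.0f * c;
            if (disc > 0.0f) {
                float t = (-b - std::sqrt(disc)) * 0.5f;                            // nearest hit distance
                framebuffer[y * width + x] = (unsigned char)(255.0f / (1.0f + t));  // shade by depth
            }
        }
    }

    // Emit the pixel grid as an ASCII PGM image; presentation is a separate concern.
    std::printf("P2\n%d %d\n255\n", width, height);
    for (unsigned char p : framebuffer) std::printf("%d\n", (int)p);
    return 0;
}

Swapping the per-pixel loop body for a rasterizer, a micro-polygon dicer or a voxel traversal leaves the overall shape of the program unchanged, which is exactly the flexibility he is arguing for.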


TG Daily: What kind of advantage can be gained from avoiding the API? Most developers just utilize DirectX or OpenGL and that's about it. How does the Unreal Engine differ from the conventional approach?

Sweeney: There are significant advantages in doing it yourself, avoiding all the graphics API calls and overhead. With a direct approach, we can use techniques that require a wider frame buffer, things that DirectX just doesn't support. At Epic, we're using the GPU for general computation with pixel shaders. There is a lot we can do there, just by bypassing the graphics pipeline completely.


TG Daily: What is the role of DirectX these days? DirectX 10 and the Vista-everything model promised things like more effects and more direct hardware access, claiming that lots of new built-in technologies would enable a console-like experience. DirectX 10.0 has been on the market for some time and the arrival of DirectX 10.1 is just ahead. What went right, and what went wrong?

Sweeney: I don't think anything unusual happened there. DirectX 10 is a fine API. When Vista first shipped, DirectX 10 applications tended to be slower than DirectX 9 ones, but that was to be expected: the hardware guys had had many years and hundreds of man-years to optimize their DirectX 9 drivers, while with DirectX 10 they had to start from scratch. In the past weeks and months, we have seen DX10 drivers catching up to DX9 in terms of performance, and they're starting to surpass them.

I think the roadmap was sound, but DirectX 10 was just a small incremental improvement over DX9. The big news with DirectX 9 was pixel and vertex shaders: you could write arbitrary code. DX10 just takes that to a new level, offering geometry shaders and numerous new features and modes. It doesn't fundamentally change graphics the way DX9 did - that was a giant step ahead of DirectX 7 and DirectX 8.


TG Daily: Since you are a member of Microsoft's advisory board for DirectX, you probably have a good idea what we will see next in DirectX. What can we expect and do you see a potential for a segmentation of APIs - all over again?

Sweeney: I think Microsoft is doing the right thing with the graphics API. There are many developers who will always want to program through an API - DirectX these days, or a software renderer in the past. That will always be the right solution for them; it makes it easier to get stuff rendered on-screen. If you know your resource allocation, you'll be just fine. But realistically, I think that DirectX 10 is the last DirectX graphics API that is truly relevant to developers. In the future, developers will tend to write their own renderers that use both the CPU and the GPU - using a graphics processor programming language rather than DirectX. I think we're going to get there pretty quickly.

I expect that by the time the next generation of consoles is released - around 2012, when Microsoft comes out with the successor of the Xbox 360 and Sony comes out with the successor of the PlayStation 3 - games will be running 100% on software-based pipelines. Yes, some developers will still use DirectX, but at some point, DirectX just becomes a software library on top of ... you know.


TG Daily: Hardware?

Sweeney: GPU hardware. And you can implement DirectX entirely in software, on the CPU. DirectX software rendering has always been there.
Microsoft writes the reference rasterizer, which is a factor of 100 slower than what you really need. But it is there, and it shows that you can run an entire graphics pipeline in software. I think we're only a few years away from that approach being faster than the conventional API approach - and we will be able to ship games that way. Just think about Pixomatic software rendering.
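
As a small illustration of that point: Direct3D 10 lets you ask for the reference rasterizer explicitly when creating a device, which runs the whole pipeline on the CPU. A bare-bones C++ sketch, with error handling and device cleanup omitted:

Code:
// Create a Direct3D 10 device that uses the CPU-only reference rasterizer
// instead of GPU hardware - far too slow to ship a game with, as Sweeney notes,
// but it demonstrates the full pipeline running in software.
#include <d3d10.h>
#pragma comment(lib, "d3d10.lib")

ID3D10Device* CreateReferenceDevice() {
    ID3D10Device* device = nullptr;
    HRESULT hr = D3D10CreateDevice(
        nullptr,                      // default adapter
        D3D10_DRIVER_TYPE_REFERENCE,  // software reference rasterizer, not D3D10_DRIVER_TYPE_HARDWARE
        nullptr,                      // no external software rasterizer DLL
        0,                            // no creation flags
        D3D10_SDK_VERSION,
        &device);
    return SUCCEEDED(hr) ? device : nullptr;
}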

Part 3: Unreal Engine 4.0 aims at next-gen console war
http://www.tgdaily.com/content/view/36436/118/
TG Daily: Throughout GDC, several companies have been presenting game controllers that are so-called brain-computer interfaces. A while ago, I used OCZ's NIA device in Unreal Tournament 2004 and was blown away by its usability. How do you see the interface between us humans and the computer evolving, given the huge success Nintendo has seen with its Wii Remote?

Sweeney: I think the key challenge here is to look at all the cool things engineers are developing and identify which ones are just gimmicks, which ones are cool ideas that might benefit one part of the market but aren't fundamental changes, and which ones really change the way we work with computing interfaces. I still think that motion controllers, such as the Wii controller, have a limited purpose - sort of a gimmicky thing. Standing there, holding a bunch of devices and moving them around wildly is great for party games, but I don't think that will fundamentally change the way people interact with computers. Humans are tactile beings, so things such as touchscreens fundamentally improve the user interface.


TG Daily: That brings us back to the iPhone, which we talked about earlier. Apple appears to have made a lot of progress in this area.

Sweeney: I agree. You are not just bringing up the map on the screen, you move it with your fingers, you zoom in and zoom out. It's incredible that nobody thought of that earlier. With 3D editing tools, the touchscreen approach would be an excellent thing: you could grab vertices and drag them in different directions. A touchscreen could really improve and change things. I think we might see that technology migrating to Maya. It is hard to tell exactly how that will pan out, but I see it as a very big improvement in computing versus the motion stuff, which is just neat toys.

The other big direction is head tracking - cameras built into consoles. They watch you and detect, for example, your arm movements. It is just more natural, because it is somewhat annoying to hold a bunch of wired plastic doodads, wireless things you have to pick up and recharge every once in a while. To me, it's more compelling to just use free-form movement and have computers recognize your gestures.

Sweeney: Five years into the development of personal computers, we had exhausted all the major ideas such as the keyboard, mouse, joystick and gamepad. But then you see something like Apple’s multi-touch, or you see that YouTube video of a big-screen interface where people walk around and just start manipulating objects that are projected there. That is entirely new stuff; no one has really done that before, and it is clear that there are still a lot of major ideas that haven't surfaced yet. As the technology improves, one thing is certain: as you increase the complexity of the user interface, you need more processing power.

TG Daily: Let’s talk about your game visions for the future and the next Unreal Engine. Where is Epic going with Unreal Engine 3.5 and 4.0?

Sweeney: The Unreal Engine is really tied to a console cycle. We will continue to improve Unreal Engine 3 and add significant new features through the end of this console cycle, so it is normal to expect that we will add new stuff in 2011 and 2012. We're shipping Gears of War now, and we're just showing the next bunch of major tech upgrades such as soft-body physics, destructible environments and crowds. There is a long life ahead for Unreal Engine 3. Version 4 will exclusively target the next console generation - Microsoft's successor to the Xbox 360, Sony's successor to the PlayStation 3, and, if Nintendo ships a machine with similar hardware specs, that as well. PCs will follow after that.

Also, we continuously work on transitions in which we go through large portions of the engine: we completely throw out some parts and build large subsystems from the ground up, while reusing the things that are still valid.


TG Daily: Like ...?

Sweeney: The Internet bandwidth. In five years, bandwidth isn't going to be more than 5-6 times higher than it is today, so the network code we have in the engine now will stay the same. Our tools are still valid too, but we will rewrite large sections of the engine around them as the new hardware develops.


TG Daily: What part of the engine will need a completely new development?

Sweeney: Our biggest challenge will be scaling to lots and lots of cores. UE3 uses functional subdivision across threads: we have the rendering thread that handles all in-game rendering, the gameplay thread that handles all gameplay and AI, and some helper threads for physics. We scale very well from dual-core to quad-core - you can actually see a significant performance increase when you run UT3 on a quad-core compared to a dual-core system.

Down the road, we will have tens of processing cores to deal with, and we need much, much finer-grained task parallelism in order to avoid being held back by single-threaded code. That, of course, requires us to rewrite very large portions of the engine. We are replacing our scripting system with something completely new, a highly threadable system. We're also replacing the rendering engine with something that can scale to much smaller rendering tasks, in- and out-of-order across threads. There is a lot of work to do.
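
A minimal sketch of the kind of shift Sweeney describes, using standard C++ threads rather than anything from Unreal Engine (the task granularity and workload are invented): instead of a few big dedicated functional threads, the frame's work is chopped into many small tasks that any available core can pull.

Code:
// Fine-grained task parallelism with plain C++: many small independent tasks
// are queued up and drained by however many hardware threads the machine has,
// instead of being pinned to a handful of dedicated functional threads.
#include <algorithm>
#include <atomic>
#include <cstdio>
#include <functional>
#include <thread>
#include <vector>

int main() {
    // Pretend each task is a small slice of frame work: a particle batch,
    // a visibility query, a chunk of animation, and so on.
    const int taskCount = 1000;
    std::vector<double> results(taskCount, 0.0);
    std::vector<std::function<void()>> tasks;
    for (int i = 0; i < taskCount; ++i) {
        tasks.push_back([i, &results] {
            double v = 0.0;
            for (int k = 1; k <= 10000; ++k) v += 1.0 / (k + i);  // stand-in workload
            results[i] = v;
        });
    }

    // A shared counter hands out the next task index; workers pull until empty.
    std::atomic<int> next{0};
    auto worker = [&] {
        for (int t; (t = next.fetch_add(1)) < taskCount; ) tasks[t]();
    };

    // Scale across however many cores exist - dual, quad, or "tens of cores".
    const unsigned threads = std::max(1u, std::thread::hardware_concurrency());
    std::vector<std::thread> pool;
    for (unsigned i = 0; i < threads; ++i) pool.emplace_back(worker);
    for (auto& th : pool) th.join();

    std::printf("ran %d tasks on %u threads\n", taskCount, threads);
    return 0;
}

The point of the shape: the same code keeps all cores busy whether there are two or twenty, whereas a fixed render/gameplay/physics split tops out once each dedicated thread has a core of its own.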

TG Daily: You have already started working on Unreal Engine 4.0?

Sweeney: We have a small research and development effort dedicated to Unreal Engine 4. Basically, it is just me, but that team will ramp up to three or four engineers by the end of this year - and even more a year after that. In some ways, we resemble a hardware company with our generational development of technology: we are going to have a team developing Unreal Engine 3 for years to come and a team ramping up on Unreal Engine 4. And then, as the next-gen transition begins, we will move everybody over to that. We actually do parallel development for multiple generations concurrently.


TG Daily: Stepping back, what do you see as the most significant technology trends these days?

Sweeney: When it comes to the PC, Intel will implement lots of extensions in the CPU and Nvidia will integrate many extensions in the GPU by the time next-gen consoles begin to surface. We are going to see some CPU cores that deal with gameplay logic, some GPU stuff that runs general computing... and two different compilers, one for the GPU and one for the CPU. The result will be a reduction of our dependence on bloated middleware that slows things down and shields the real functionality of the devices.

It would be great to be able to write code for one massively multi-core device that does both general and graphics computation in the system. One programming language, one set of tools, one development environment - just one paradigm for the whole thing: large-scale multi-core computing. If you extrapolate Moore's Law from the number of cores that Microsoft put in the Xbox 360, it is clear that around 2010 - at the beginning of the next decade - you can put tens of CPU cores on one processor chip and you will have a perfectly usable uniform computing environment. That time will be interesting for graphics as well.

At that time, we will have a physics engine that runs on that computing device, and we will have a software renderer that can do far more than you can do in DirectX, as a result of having general computation functionality. I think that will really change the world. It can happen as soon as the next console transition begins, and it brings a lot of economic benefits, especially if you look at the world of consoles or the world of handhelds. You have one non-commodity computing chip; it is hooked up directly to memory. We have an opportunity to economize the system and provide entirely new levels of computing performance and capabilities.
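
As a rough back-of-the-envelope reading of that Moore's Law extrapolation (my numbers, not Sweeney's): the Xbox 360's CPU shipped with 3 general-purpose cores in 2005, so if core counts double roughly every two years, you get 3 × 2^((2011−2005)/2) = 3 × 2^3 = 24 cores by around 2011 - which is indeed "tens of cores" at the start of the next decade.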
 
TG Daily: Broken down, that means today’s mainstream PCs aren’t suitable for gaming?

Sweeney: Exactly. PCs are good for anything, just not games.

Mainstream PCs are good for anything, just not games. Isn't it rather easy to see what he is saying?

This thread reminds me of tabloid newspapers. :smile:
 
Yes, it's pretty clear what he's saying, and it's true.

The problem is OEMs have to get off this fixation that "gaming PCs" are some high-priced, disparate product line. If an individual can build a great gaming PC - with an 8800GT, no less - for under $800, while paying far higher prices than an assembler would because they're not buying in bulk, OEMs should be able to as well.

They don't realize they're hurting their future market for short-term financial gain.

I don't think we'll see software rendering in the next 5 years. With Intel part of the new PC Gaming Alliance though, I'm hopeful that they'll start to take gaming on low-end machines more seriously as well - AMD/ATI's 790 platform has got to give them a kick in the nuts too.
 
Ah, mainstream PCs - and sure, it is something that needs to change. Many prebuilt PCs on offer contain a quad-core or fast dual-core and 2-4GB of RAM, yet come with a really weak GPU that would barely compete with a 7600GT.

But now that high-end GPUs are available cheaply, I should think more PCs will come with a stronger GPU to match the rest. Actually, I've seen it start over here, and at mainstream PC prices too. If this trend continues then he can wipe his tears off and glue a smile on! :p
 
So, let's see, Intel gimped PC gaming, Intel gimped Vista, and I'm pretty sure Prescott was responsible for global warming and Core for global cooling. Those bastards!
 
I can't see game studios starting to use software rendering again for non-casual games. Let's hope AMD's 780 and 790 chipsets will turn this around, back to the factor of 10.
 
But now that high-end GPUs are available cheaply, I should think more PCs will come with a stronger GPU to match the rest. Actually, I've seen it start over here, and at mainstream PC prices too. If this trend continues then he can wipe his tears off and glue a smile on! :p

The fastest option for Dell's mainstream Inspiron desktop line is an 8600GT. Contrast that with the highest CPU/memory options of a 6600 Quad and 4GB DDR2. Sweeney is right, it does look hopeless at the moment. However, in a few years when a $50 graphics card puts PS3/Xbox hardware to shame things will probably look a bit different. I just hope there are still PC developers around by then who aren't targeting console specs.
 
However, in a few years when a $50 graphics card puts PS3/Xbox hardware to shame things will probably look a bit different.

In a few years, around 2011, the next Xbox will be out, so the PC will technically have eclipsed nothing. Worse yet, devs will start targeting this new, more advanced Xbox at full spec around 2010 when they start developing for it, whereas on PCs they still have to worry about the low end. So the PC will be beaten once again.

Unless something dramatically changes, I think PCs, like arcades back in the day, may fall victim to the economies of scale that greatly favor consoles. Already I know many people who have either:

1) given up spending hundreds on new video cards and just play on consoles
2) only play WoW on their PC
3) stick to older PC games that run on older hardware
 
Things like this make me hate Intel... with a severe passion. I don't think PC gaming will go by the wayside, of course, but it might come down to internet Flash games, low-spec MMOs and whatnot. Even if Vista "accelerated" the need for GPUs in laptops, it didn't do a good job - they're just the usual integrated crap, with Vista made to work on them.

So what's the next thing to look at? I think it could be processors like Fusion and Larrabee - cores that are good at lots and lots of different tasks, whether highly specialized for simpler things like baseline orchestration or highly parallel stream processing. Basically, an "all-in-one" chip could be PC gaming's saving grace: a single processor to do anything, with multi-threading, high parallelization, stream processing - all the good stuff that is adaptable to tasks involving encoding/decoding, heavy floating-point for 3D and physics, yet still stable like a current x86 AMD or Intel processing core.

Now, honestly, it will be a long time before I could see this really happening, but it's a thought: people could buy one computer that can do anything, whether they want to game, do FLOPS-heavy entertainment like movies and music, or more practical tasks like schoolwork. The whole chip works toward those ends, individually or all together, efficiently.
 
That all-in-one chip would also cost about the same as a high-end CPU and GPU put together and would still get old just as fast as current CPUs and GPUs.
 
That all-in-one chip would also cost about the same as a high-end CPU and GPU put together and would still get old just as fast as current CPUs and GPUs.

That's possible, but if enough R&D went into it, a viable market such as today's could benefit. And like I said, it's not a CPU or a GPU or necessarily a stream processing unit, but a processor with all those capabilities. Sure, it would get old really quickly, but it's only one part, and you could possibly have mobos with multiple chips. The fact that it's a supremely flexible processor is the idea; plus, much of the extra cost of a graphics card is the extra components packed onto the card alongside the GPU, the VRAM and whatnot. A single processing chip with a single shared pool of RAM could make design easier and production simpler. Once R&D is paid for, there would be cost savings in board manufacturing because no separate graphics board would be needed, and possibly no sound board either.
 
That all-in-one chip would also cost about the same as a high-end CPU and GPU put together and would still get old just as fast as current CPUs and GPUs.

Not for long though..

The only reason console hardware can be sold cheaply is because of economies of scale..

What, you think that if *every* PC sold in 2010 had a 24-core TFLOPS Larrabee at its core, they'd still be expensive per unit..? Considering how many CPUs Intel churns out each year, I actually agree that this is by FAR the only way to save the mainstream PC market..
 
I simply can't resist quoting one of my favorite gaming bloggers, peterb of Tea Leaves:

peterb said:
Today I read an interesting interview with Tim Sweeney of Epic whose tag line is “PCs are good for anything, just not games”.

Summarizing the interview perhaps a bit unfairly, here’s what he says:

(1) People aren’t buying expensive enough PCs.
(2) Even the expensive PCs aren’t good enough to run his games.
(3) People who buy cheaper machines with Intel integrated graphics are giving their money to Blizzard instead of Epic.
(4) This aggression cannot stand. The solution is that everyone except us should change what they’re doing and buy machines with more expensive graphics hardware.

...

In summary: I wish Epic was a publicly traded company, so that I could short them.
 
The only reason console hardware can be sold cheaply is because of economies of scale..
The XB360 has sold around 18M units over the last 27 months. During the same time we've seen CPUs going from dual to quad core, with the latter dropping to cheaper levels than the duals were at in 2005. Intel has sold about as many quads in recent months as MS has sold XB360s (though they probably include Xeons too); I have no idea how many duals have been sold, but I imagine it is not far from 10x as many. So basically, even though CPUs have been advancing at a rapid pace, it still has a rather negligible effect on prices overall, assuming you don't target some fixed performance level (e.g. an X2 3800+ or equivalent).

Console HW is cheaper mostly because it doesn't change for years and you can keep optimizing it for price for a long time. In PC land things aren't that simple, and the relatively fast upgrade cycle is the main thing that makes it different from consoles in the first place.

What, you think that if *every* PC sold in 2010 had a 24-core TFLOPS Larrabee at its core, they'd still be expensive per unit..?
Of course they would be cheaper, no doubt about that, though I do think the price difference wouldn't be too big. Also worth considering is that Intel isn't producing all that many different CPUs; the main difference is in packaging, and that has a pretty much fixed cost.

Considering how many CPUs Intel churns out each year, I actually agree that this is by FAR the only way to save the mainstream PC market..
It is one way, but most certainly not the only one. If Intel is to be believed and they really do increase the performance of integrated graphics by an order of magnitude, then it'll surely have quite a good effect.
 
The XB360 has sold around 18M units over the last 27 months. During the same time we've seen CPUs going from dual to quad core, with the latter dropping to cheaper levels than the duals were at in 2005. Intel has sold about as many quads in recent months as MS has sold XB360s (though they probably include Xeons too); I have no idea how many duals have been sold, but I imagine it is not far from 10x as many. So basically, even though CPUs have been advancing at a rapid pace, it still has a rather negligible effect on prices overall, assuming you don't target some fixed performance level (e.g. an X2 3800+ or equivalent).

Console HW is cheaper mostly because it doesn't change for years and you can keep optimizing it for price for a long time. In PC land things aren't that simple, and the relatively fast upgrade cycle is the main thing that makes it different from consoles in the first place.

Of course they would be cheaper, no doubt about that, though I do think the price difference wouldn't be too big. Also worth considering is that Intel isn't producing all that many different CPUs; the main difference is in packaging, and that has a pretty much fixed cost.


It is one way, but most certainly not the only one. If Intel is to be believed and they really do increase the performance of integrated graphics by an order of magnitude, then it'll surely have quite a good effect.

I'm not talking about price change over time..

I was referring to the cost per unit of Intel manufacturing and distributing 100,000 Larrabees per month versus 10,000,000 of them, as a rough example..
 
They are already distributing 10M CPUs per month; how has that influenced their prices?

Basically what you are suggesting is that everyone should stop producing low-end HW and only sell midrange and higher stuff. That might help for a short while but you still have to consider the people who will not upgrade as fast as hardcore gamers.
 
They are already distributing 10M CPUs per month; how has that influenced their prices?
It doesn't with respect to the chips they are already producing.. But for those still in development, whose prices have yet to be specified, the factor of yield affects this greatly.. That's what I'm talking about..

Basically what you are suggesting is that everyone should stop producing low-end HW and only sell midrange and higher stuff. That might help for a short while but you still have to consider the people who will not upgrade as fast as hardcore gamers.

No that's not what I'm suggesting at all..

I'm suggesting that if Intel decides to position Larrabee as just a sort of GPU-like "add-in" component for desktops, then it's not going to help anything.. vendors will still need to buy the expensive add-in card and consumers will still be stuck with powerful CPUs and gimped graphics processing options at the low end..

However, if Intel can position Larrabee as the "all-in-one" chip, then they can distribute it as the natural progression of consumer CPUs but with the added benefit of powerful graphics, physics and decoding silicon built in. Vendors then need not worry about separate add-in cards, which will allow Larrabee to reach right down into the low end of the market, something no PCI GPU has ever been able to do from a cost perspective..

THAT's why I see Larrabee as a good bet to shake things up: the cost to the consumer is minimised because of the reduced amount of silicon per system sold..
 
It doesn't with respect to the chips they are already producing.. But for those still in development, whose prices have yet to be specified, the factor of yield affects this greatly.. That's what I'm talking about..
What would be the difference between Larrabee and currently sold 45nm CPUs? The latter are only a tiny part of overall CPU sales but aren't much pricier than their 65nm counterparts.

However, if Intel can position Larrabee as the "all-in-one" chip, then they can distribute it as the natural progression of consumer CPUs but with the added benefit of powerful graphics, physics and decoding silicon built in. Vendors then need not worry about separate add-in cards, which will allow Larrabee to reach right down into the low end of the market, something no PCI GPU has ever been able to do from a cost perspective..
The problem with that is that Larrabee is not a powerful "all-in-one" chip: it has pretty bad single-threaded performance and the majority of today's games wouldn't run that well on it.
 
The problem with that is that Larrabee is not a powerful "all-in-one" chip: it has pretty bad single-threaded performance and the majority of today's games wouldn't run that well on it.


Did reviews hit the internet already? :D

;)
 
I don't think an in-order, SIMD-optimized CPU that must use four parallel threads per core to achieve good enough efficiency can deliver all that good single-threaded performance with average spaghetti code :)
 