Dreams : create, share & play [PS4, PS5]

You could fake it easily with some well-placed additional lights though, I suppose?
Yes, but that is more difficult compared to the diffuse lighting of a cloudy day in nature... not that I'm saying the video isn't impressive, because it is (as are other dreams from the same creator).
 
Insane! :D
I remember how many people doubted Dreams could compete with regular AAA tech. But IMO, this even beats the next-gen Hellblade 2 cutscenes that were shown.
Close up detail is lacking, though.
How is the engine holding so much SDF data in a single scene? IIRC, watching the way they make wood, it's a repeating cone shape that is sliced. So after they sculpt their object, how is that object saved in SDF format yet still kept smaller than, say, vertex data?
 
How is the engine holding so much SDF data in a single scene? IIRC, watching the way they make wood, it's a repeating cone shape that is sliced. So after they sculpt their object, how is that object saved in SDF format yet still kept smaller than, say, vertex data?
Remembering the paper, their modeling tool generates a tree of CSG (constructive solid geometry, i.e. making unions or subtractions of objects) operations using a small set of primitives (cube, sphere, cone, helix...). Probably each primitive (or a set of them) can link to a procedural material (to add noise, or make each point a blade of grass, etc.).
Likely that's the data that's stored and shared.
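
A minimal sketch of what evaluating such an edit list as distance functions might look like. This is just my own illustration, not Media Molecule's code; the primitive set, the names and the example scene are invented:

```cpp
#include <algorithm>
#include <cmath>

struct Vec3 { float x, y, z; };

// Signed distance to a sphere of radius r centred at the origin.
float sdSphere(Vec3 p, float r) {
    return std::sqrt(p.x * p.x + p.y * p.y + p.z * p.z) - r;
}

// Signed distance to an axis-aligned box with half-extents b.
float sdBox(Vec3 p, Vec3 b) {
    Vec3 q{ std::fabs(p.x) - b.x, std::fabs(p.y) - b.y, std::fabs(p.z) - b.z };
    float ox = std::max(q.x, 0.0f), oy = std::max(q.y, 0.0f), oz = std::max(q.z, 0.0f);
    return std::sqrt(ox * ox + oy * oy + oz * oz) +
           std::min(std::max(q.x, std::max(q.y, q.z)), 0.0f);
}

// CSG combiners: a union keeps the closer surface; a subtraction carves one
// shape out of another by flipping the sign of the subtracted distance.
float opUnion(float a, float b)    { return std::min(a, b); }
float opSubtract(float a, float b) { return std::max(a, -b); }

// Example "edit list": a unit box with a sphere carved out of one corner.
float sceneSDF(Vec3 p) {
    float box    = sdBox(p, Vec3{1.0f, 1.0f, 1.0f});
    float sphere = sdSphere(Vec3{p.x - 1.0f, p.y - 1.0f, p.z - 1.0f}, 0.8f);
    return opSubtract(box, sphere);
}
```

Storing just the edit list (primitive type, transform, combine mode, material link) and re-evaluating it on load would explain why shared creations stay so small compared to raw voxel or vertex data: it is effectively a replayable edit history.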

To render it, they generate a somewhat regular distribution of points over the surface. The points are hierarchical, in 2 or 3 levels. A top-level point can become a 'brush stroke', e.g. to form a grass blade or a cloud; bottom-level points are just a single pixel. I don't know how they handle LOD for more distant stuff.
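
I don't know how they seed those points either, but one common way to get a roughly even surface distribution from an SDF is to scatter candidates in the bounding volume and project them onto the zero level set along the gradient. A hedged sketch, reusing the hypothetical `Vec3`/`sceneSDF` from the previous post:

```cpp
#include <cmath>
#include <random>
#include <vector>

// Central-difference gradient of the SDF; its direction is the surface normal.
Vec3 sdfGradient(Vec3 p) {
    const float e = 1e-3f;
    return Vec3{
        sceneSDF({p.x + e, p.y, p.z}) - sceneSDF({p.x - e, p.y, p.z}),
        sceneSDF({p.x, p.y + e, p.z}) - sceneSDF({p.x, p.y - e, p.z}),
        sceneSDF({p.x, p.y, p.z + e}) - sceneSDF({p.x, p.y, p.z - e}) };
}

// Scatter random candidates and pull each one onto the surface with a few
// Newton-style steps of length sdf(p) along the normalised gradient.
std::vector<Vec3> surfacePoints(int count) {
    std::mt19937 rng(42);
    std::uniform_real_distribution<float> u(-2.0f, 2.0f);
    std::vector<Vec3> pts;
    while (pts.size() < static_cast<size_t>(count)) {
        Vec3 p{ u(rng), u(rng), u(rng) };
        for (int i = 0; i < 8; ++i) {
            float d = sceneSDF(p);
            Vec3 g = sdfGradient(p);
            float len = std::sqrt(g.x * g.x + g.y * g.y + g.z * g.z) + 1e-8f;
            p = Vec3{ p.x - d * g.x / len, p.y - d * g.y / len, p.z - d * g.z / len };
        }
        if (std::fabs(sceneSDF(p)) < 1e-3f) pts.push_back(p);  // keep only converged samples
    }
    return pts;
}
```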

Finally, the points are splatted to the screen, which is similar to rasterizing triangles in random order. There was some talk that they also use occlusion queries; maybe they rasterize a coarse representation of cubes for thick/large objects.
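
That "rasterize in random order" step is often implemented with a single 64-bit atomic min per pixel: pack the depth into the high bits and a payload into the low bits, and the min automatically keeps the closest sample. A CPU-side sketch of the idea (I don't know the exact scheme Dreams uses; the 32/32 split, names and resolution are assumptions, and on the GPU this would be a 64-bit buffer/image atomic where supported):

```cpp
#include <atomic>
#include <cstdint>
#include <cstring>
#include <vector>

// One 64-bit word per pixel: depth in the high 32 bits, payload (a packed
// colour or point ID) in the low 32 bits. A smaller word means a closer sample.
std::vector<std::atomic<uint64_t>> framebuffer(1920 * 1080);

// Reset every pixel to "infinitely far" before splatting a frame's points.
void clearFrame() {
    for (auto& px : framebuffer) px.store(UINT64_MAX, std::memory_order_relaxed);
}

void splat(int pixel, float depth, uint32_t payload) {
    // For non-negative floats, reinterpreting the bits preserves ordering,
    // so an integer min on the packed word is also a depth min.
    uint32_t depthBits;
    std::memcpy(&depthBits, &depth, sizeof depthBits);
    uint64_t candidate = (uint64_t(depthBits) << 32) | payload;

    // Atomic min via compare-exchange: only the closest splat for this pixel
    // survives, no matter in which order the points arrive.
    uint64_t current = framebuffer[pixel].load(std::memory_order_relaxed);
    while (candidate < current &&
           !framebuffer[pixel].compare_exchange_weak(current, candidate,
                                                     std::memory_order_relaxed)) {
        // 'current' now holds the refreshed value; loop and try again.
    }
}
```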

In the paper they did not use SDF for rendering - they only used it for modeling, to make a volumetric representation of the initial primitives and make CSG possible.

I actually work on something similar, but I need to support full meshes as modeling primitives. I do this to create a surfel hierarchy for my GI stuff.
I am very impressed by how fast they convert the data into renderable points; they do it on the GPU. For me it's harder because of the mesh support and the need for a regular surfel distribution - too complicated for the GPU, and it's pretty slow.
This is really impressive tools work they did! ...while the world only pays attention to the renderer. :)
 
 
Martin Nebelong does some amazing things in Dreams (as do others). He posted two things he did in just one day, and I particularly love this one:



Looks amazing! That's the way anyone would want to make comic books now, I'm sure [emoji16]
 
So much good stuff happening in that Facebook group. Really inspiring. I haven't done much myself, but I am taking it all in, and the few times I do try something it gets easier and easier to get done what I want ...

I liked this one too:

https://indreams.me/scene/dWNaosfxEHT
 
The question is whether this can be done efficiently. Maybe that will decide whether we see point clouds in other games too, but for foliage it seems great in any case.

With the popularity of deferred rendering engines, and the general compute power available in this gen's GPUs (as proven by Dreams), I'm quite disappointed by how little experimentation we've seen with this kind of hybrid system.

I understand the power of legacy regarding polygons, but stuff like particles, fluids, magic effects etc. was ripe for some SDF/voxel-based primitives, and with a PBR deferred engine you just have to plug that stuff right on top of your G-buffer and let the same lighting code handle everything just the same.
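
As a concrete (purely hypothetical) example of what "plugging into the G-buffer" means: ray-march the SDF per pixel and, on a hit, write the same depth/normal/material attributes the triangle path writes, so the deferred lighting pass shades both identically. This reuses the `Vec3`/`sceneSDF`/`sdfGradient` helpers sketched earlier in the thread; the `GBufferSample` layout is invented for illustration:

```cpp
#include <cmath>

struct GBufferSample {
    float depth;      // view-space distance, same convention as the triangle path
    Vec3  normal;     // world-space normal
    Vec3  albedo;
    float roughness;
    float metalness;
};

// Sphere-trace one ray; on a hit, fill the G-buffer sample so the regular
// deferred lighting pass shades it like any rasterised surface.
bool traceIntoGBuffer(Vec3 origin, Vec3 dir, GBufferSample& out) {
    float t = 0.0f;
    for (int i = 0; i < 128 && t < 100.0f; ++i) {
        Vec3 p{ origin.x + dir.x * t, origin.y + dir.y * t, origin.z + dir.z * t };
        float d = sceneSDF(p);
        if (d < 1e-3f) {
            Vec3 g = sdfGradient(p);
            float len = std::sqrt(g.x * g.x + g.y * g.y + g.z * g.z) + 1e-8f;
            out.depth     = t;
            out.normal    = Vec3{ g.x / len, g.y / len, g.z / len };
            out.albedo    = Vec3{ 0.8f, 0.2f, 0.2f };  // a material lookup would go here
            out.roughness = 0.5f;
            out.metalness = 0.0f;
            return true;
        }
        t += d;  // safe step: the SDF guarantees no surface closer than d
    }
    return false;  // ray missed: leave the existing G-buffer contents alone
}
```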

I feel like this gen's evolution of rendering tech was overall less innovative than last gen's. Maybe it is just slower because of the longer dev cycles, though.
 
I understand the power of legacy regarding polygons, but stuff like particles, fluids, magic effects etc. was ripe for some SDF/voxel-based primitives, and with a PBR deferred engine you just have to plug that stuff right on top of your G-buffer and let the same lighting code handle everything just the same.
It may not be that easy because of the 64-bit atomics limitation. Needing 32 or 24 bits for z, is the rest enough for at least normal, albedo, and roughness + metalness? Not really.
I have not thought much about this and don't know what they do in Dreams. Forward would be easier because it needs only a single color, but I assume that's very bad because of shading overdraw, and it's not compatible with post-processing that requires normals or motion vectors.
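
For concreteness, here is one back-of-the-envelope split of the 64 bits with 24-bit depth (purely illustrative; the field sizes are my assumption, not what Dreams does):

```cpp
#include <cstdint>

// One possible split of a single 64-bit splat value, assuming 24 bits of depth:
//   depth     : 24 bits  (atomic min resolves visibility on the high bits)
//   normal    : 16 bits  (octahedral-encoded, 8+8)
//   albedo    : 16 bits  (R5G6B5)
//   roughness :  4 bits
//   metalness :  4 bits
//   total     : 64 bits -- it fits, but only at fairly coarse quality.
uint64_t packSplat(uint32_t depth24, uint16_t octNormal,
                   uint16_t albedo565, uint8_t rough4, uint8_t metal4) {
    return (uint64_t(depth24 & 0xFFFFFFu) << 40) |
           (uint64_t(octNormal)           << 24) |
           (uint64_t(albedo565)           <<  8) |
           (uint64_t(rough4 & 0xFu)       <<  4) |
            uint64_t(metal4 & 0xFu);
}
```

It technically fits, but 8-bit octahedral normals and 5:6:5 albedo are coarser than most deferred renderers would accept, which is the "not really"; with 32-bit depth it clearly doesn't fit at all.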

Splatting twice, for a depth prepass, could also make sense to avoid shading overdraw. The likely resulting flicker could be hidden with random displacement and TAA. Then deferred would work, with the second pass filling the G-buffers as usual?

A third option would be to include Z in each value and buffer; then it should work in a single pass at the cost of bandwidth.

Not sure what's best here, but just a matter of trying it out.

Who knows, maybe the PS5's secret sauce is an atomic write of a vec4 :D


EDIT: I need to add that I'm always assuming their renderer is still mostly about splatting points.
 
Needing 32 or 24 bits for z, is the rest enough for at least normal, albedo, and roughness + metalness?

That is a problem of deferred shading in general, and yet game engines use it all the time and manage to make their materials work.
And as far as the old Media Molecule presentations went, they were indeed drawing into a G-buffer back then; my assumption is they still are, and that's how they are integrating their marching-cube rasterized geo with the point splats.
 
I'm really struggling with this game. ;) I've said all along that I doubt it'll gain traction, and some months out, I think that's proving to be the case. There are no reports of millions of sales. There doesn't appear to be a huge community around it, with YouTube content getting lowish viewing figures. There are some incredible creations out there, and people just don't care to witness them. And this is with lockdown everywhere and people having loads of time on their hands; they aren't wanting to grab Dreams and create stuff. I put all that down to the creativity being so inaccessible versus LBP, which was so simple.

I've been trying to create something really simple, requiring me to go to forums to ask for support, only to discover the game is bugged and what I want is presently impossible. Along the way, I used the DS4 instead of the Move controllers for a spell, and the navigation is really awkward. There's a jittery imp thanks to motion controls, despite having a camera to help stabilise things, and MM have removed the Z-axis movement. I was fighting the interface a lot with that, and still do with the Move controllers where I've many more hours sunk in.

My idea is dirt simple, based on the MM realtime puppeteering showcase at one of the game shows - create a 'sock puppet' mapped to the Move controller where you can move it around in 3D space and open/close a mouth based on the trigger.

There's a video tutorial showing how to create an open/close mouth using keyframes. This was easily adapted to map to a button press by attaching a Controller Sensor to the bottom half of the head and running a button wire to the circuitry. I could possess the puppet and animate the head. Great. Simple and predictable. I then changed the Possession style to 'follow imp', and now I can move the head around, but the animation is distorted with the head collapsed in on itself.

That began my querying and R&D, trying all sorts of things, until finally I could prove it's bugged: joints fail on objects with Follow Imp behaviour. Slap a Controller Sensor on an object with Follow Imp behaviour. Add another object and link them with a String connector. Move the first object and the second swings around correctly. Add any amount of elasticity and it breaks, the second object collapsing onto the first. Similarly with keyframed joints, which is where my idea fails.

So I've spent some notable time creating a reproduction case and submitting a bug report. At some point it'll likely be fixed. However, my concern here is that this is a pretty fundamental fault and not an obscure bug, and I expect, due to the complexity of the engine, there are going to be plenty of these issues. Furthermore, I expect changes to the game to introduce bugs because of this complexity. There are a couple of upset creators on the official feedback forum mentioning their creations getting broken by updates. That's incredibly frustrating: when you work so hard on a project, only for a tools update to break it, you don't know whether you should spend time trying to find a workaround or wait for a fix, which might never come. If there's not even much of an audience for your creation, there are even fewer reasons to persist in creating.

It feels like a productivity tool, not a game, and that's going to hamper adoption no end. My friend bought it to tinker with, but ended up finding it too much like hard work at the end of the day and now spends his time playing Apex Legends to relax. There's a lot I want to make with this, but I can't even get started. :(
 
My son essentially just watches YouTube-esque creations - things like the 'Wario gets killed...' ones. He has tried the odd game, but that's it. He is quite creative and did make LBP levels, but he has no desire to try making anything, which is a shame.

Regarding your issues, I recall many people had (and still have?) problems making movable characters that move 'naturally' - maybe there is a workaround that some have figured out? Maybe those workarounds are what break in updates?

Talking of updates breaking levels, I tried my LBP levels on LBP3 a few months back and a couple had been broken by the updates... as you say, due to the complexity of Dreams it is likely a lot harder to update without breaking something; hell, even standard, predictable games that get updates sometimes break something else!
 
Updates are the worst! With Unity, generally speaking, you stick to one version for the entire life of a project. Updating to the latest version has in the past killed my projects, and I've had to waste considerable time fixing them. Errors from updates are new and unknown, so there's no solution out there to look up; you just have to work the problem until you find one. When it comes to a point where I have to update (say, an old bug has only been fixed in a later version), I dread it!
 
I'm getting closer to releasing my creation (not a game but an element), so I don't have time to go through other creations, but I'll do that when I have more time.

Regarding bugs, I'm not sure if something is a bug or intended design, so I just use what works for me and move on. I already wrote in this thread about wonky connectors, so I only use them for animating things with 0% weight, and I use movers to move things around. I even use a mover to bypass in-game gravity, and laser sensors to bypass in-game collisions for tyres. One thing is sure: their hour counter on indreams.me is broken; there's no way I've played Dreams for only 13 hours.
 