Dark matter as a function of quantized gravity

Frank

The fundamental problem in unifying gravity with quantum mechanics is that gravity changes space in an analog way, with no clear lower or upper bounds, while quantum theory requires clear, integer boundaries between states.

Or, in other words: quantum mechanics tells us what mass there is at any given location, and gravity tells us what effect that mass has on the volume of space around it. But while the mass distribution of quantum theory is strictly localized, with clearly defined integer bounds (see the Standard Model), gravity seems to have unlimited resolution without clear boundaries.

That huge discrepancy between the two most established models of physics causes a lot of casualties in between. The two most important ones are the definition of universal constants and the need for things like gravitational waves and dark matter to make the models fit each other and the observational data.

In short: things seem to be much heavier, as far as gravity is concerned, than quantum theory predicts.

The obvious solution would be to use the space curvature induced by mass as a multiplication factor for the actual mass, since the volume covered is stretched by gravity. But that doesn't solve the fundamental problem that every particle in the universe would have to be subject to a gravitational force from every other particle.



Just as there is a law stating that the total amount of energy contained in the universe can never change, there is a similar law stating that the information density can never change. But that doesn't imply that every particle in existence keeps track of every other one at all times, especially since particles are created and destroyed all the time.

While we might speculate that the universe is analog and all that bookkeeping happens automatically, through unknowable forces behind the scenes (very unscientific; that would be like stating that God did it), it is theorized that every other force or interaction is communicated by messenger particles. The information has to be transferred physically. Cause and effect.


But there are other possible solutions to this problem, and many of them rely on the information (energy) density as a function of the curvature of space. Or, in other words: where there are many other particles around (dense regions), there is a lot of information going around, while in open space far from everything, there is only a little.

To make that work, particles would have to be attracted to the location with the highest information/energy density. And that is exactly what gravity is all about.

But where is that information? Is it a value inherent in each location in space that can hold a particle, the volume of that location being defined by quantum mechanics? Is space itself doing the bookkeeping?

But in that case, we would expect space itself to have mass, since there is information content there. Which it might have: there seems to be more mass in locations with a large information density (holding many particles) than is accounted for by the mass of the particles themselves. But it would have to be another kind of mass than the regular one. Dark matter.

Another explanation would be that the exchange of messenger particles happens only between neighboring particles. This needs some more explaining.


According to quantum mechanics, all of space is divided into energy shells that can hold one or more particles. Inside an atom, those particle volumes are packed close together. If that atom is part of a dense mass, those volumes will be pretty small, relatively speaking.

So, in that respect, our universe looks like an enormously huge three-dimensional construction built of bricks. The bricks are very small and dense where many particles are close together, and very large and lightweight where there are few.

Coincidentally, that is exactly what gravity does as well. Or is it? Because a large mass increases the time it takes information to travel the same subjective distance, which can be explained by there being many more information exchanges along the way. So it does fit.

If all of space (our entire universe) is divided into particle volumes (probability zones within quantum shells, like the shells of atoms), and we equate the gravitational curvature with the size of those volumes, everything fits perfectly. Especially if we quantize the time it takes to exchange an information particle, and the number of them that can be exchanged at any one time. The amount of energy available is bounded.


The only thing still missing is what we started with: things seem to be much heavier than we would expect by counting the energy content of all the particles. And that is where the information exchange comes in. If all particles exchange information only with their neighbors, if there are more neighbors in a dense mass than in flat space, and if those information exchanges are carried by particles that have mass as well, that might offer a very good explanation for the observed increase in density.
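The neighbor-exchange idea can be turned into a toy numerical sketch. Everything here is illustrative: the function names, the coupling `eps`, and the interaction radius are invented for the example, not taken from any established theory. It only shows that an "effective mass" which counts neighbor exchanges comes out larger in dense regions than in sparse ones:

```python
import math
import random

def neighbor_count(points, i, radius):
    """Number of other particles within `radius` of particle i."""
    xi, yi = points[i]
    return sum(1 for j, (x, y) in enumerate(points)
               if j != i and math.hypot(x - xi, y - yi) <= radius)

def effective_mass(points, m=1.0, eps=0.1, radius=1.0):
    """Toy model: each particle's apparent mass is its rest mass m plus
    a contribution eps per neighbor it exchanges messenger particles
    with. The parameters are illustrative, not physical."""
    return sum(m + eps * neighbor_count(points, i, radius)
               for i in range(len(points)))

rng = random.Random(0)
# Dense cluster: 100 particles inside a unit square.
dense = [(rng.random(), rng.random()) for _ in range(100)]
# Sparse cloud: the same 100 particles spread over a 100x larger region.
sparse = [(100 * x, 100 * y) for (x, y) in dense]

bare = 100 * 1.0  # summed rest mass, identical for both configurations
print(effective_mass(dense) / bare)   # ratio well above 1
print(effective_mass(sparse) / bare)  # ratio close to 1
```

The same total rest mass thus "weighs" more when packed densely, which is the qualitative behavior the paragraph above asks for — though nothing in this toy says the effect has the right magnitude or distance dependence.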

Dark matter as a function of quantized gravity.



What do you think?
 
I think that, just like DCT coefficients carry part of their information in the coefficient position, the same must be true for the particles you speak of.
 
I don't really follow your discourse well, but I think you are proposing one of the classic ideas known as 'atoms of space'. In other words: since most everything in quantum mechanics is discrete, why not make spacetime discrete as well?

There are very strong no-go theorems about such an endeavour. For instance, we don't see any structure like this: by Lorentz invariance you can always boost your system so that the discreteness scale would reach observable levels, and no such discreteness is seen. So your theory would have to explicitly break Lorentz invariance, and that runs into the second problem.

'Nonrenormalizability of gravity'. Quantum gravity is known to be nonrenormalizable as a field theory, which is basically the statement that it takes infinitely many new parameters to specify the theory exactly. In other words, the theory is only predictive up to some cutoff, and worse, it depends crucially on that cutoff (something that is put in by hand). Now, if your theory breaks Lorentz invariance, there is no deformation possible (even in principle) to rectify this. You will always have this unpredictability inherent in your laws, and that's really bad. Only with the strong constraints of Lorentz invariance + *something else* do you even have a chance of getting out of this situation.

That *something else* is what people are looking for in quantum gravity. In string theory, for instance, that something else is a very fuzzy interaction that smooths the divergences of the theory and actually places an infinite tower of constraints on the system. So all those infinitely many parameters that are unspecified in generic gravity are now constrained infinitely many times, and out pops something that is finite. I have to point out how extraordinarily nontrivial this is.

In fact there is only *one* other way out of that conundrum, and that is to look for a pattern in the way the infinitely many unspecified parameters behave. Unfortunately, no such pattern (or fixed point, in technical jargon) has ever been found. It follows that discrete physics of space/time is doomed, in the most naive sense.
 
Actually, it is possible to quantize spacetime and still not break Lorentz invariance. You just have to avoid an ordered grid; a stochastic grid does just fine. In addition, we don't necessarily know that there isn't a preferred reference frame. There is, after all, a rest frame to the universe. Though I will grant that there may well be significant theoretical problems with a theory that isn't Lorentz invariant.

As for the fact that quantum gravity, in its most simplistic form, isn't renormalizable, consider that this comes out of the way that we do perturbation theory. It is possible that the perturbation theory that works so well for the strong, electromagnetic, and weak forces just isn't the right way to go about gravity.
 
Yes, quite true. There is an approach based on causal sets that presumably avoids Lorentz violation, where one "sprinkles" points with Poisson statistics instead of using a regular lattice. (Previous attempts at regaining Lorentz symmetry in Feynman checkerboard models in 3+1 dimensions tend to run into unitarity problems.) There are other, more modern variants as well (CDT etc.). I am also neglecting to point out that lattice gauge theory, even though it has a Lorentz-violating cutoff, has a residual symmetry group large enough to ensure that any such violation is left in marginal operators in the continuum limit.
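The sprinkling construction is easy to sketch numerically. Here is a minimal 1+1-dimensional version (the function names and parameters are my own, not from any causal set library): the number of points in a region is Poisson-distributed with mean density × volume, and since spacetime volume is a Lorentz invariant, the statistics of the sprinkling pick out no preferred frame. The only structure kept is the causal order between the sprinkled points:

```python
import math
import random

def poisson_sample(lam, rng):
    """Draw from a Poisson distribution via Knuth's method
    (fine for moderate lam)."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def sprinkle(density, t_max=1.0, x_max=1.0, seed=0):
    """Poisson-sprinkle points into a [0,t_max] x [0,x_max] region of
    1+1 Minkowski space: the point count is Poisson with mean
    density * volume, and positions are uniform."""
    rng = random.Random(seed)
    n = poisson_sample(density * t_max * x_max, rng)
    return [(rng.uniform(0, t_max), rng.uniform(0, x_max))
            for _ in range(n)]

def causal_pairs(points):
    """Causal relation on the sprinkled set: i precedes j iff point j
    lies in the future light cone of point i (dt > |dx|, with c = 1)."""
    return [(i, j)
            for i, (t1, x1) in enumerate(points)
            for j, (t2, x2) in enumerate(points)
            if t2 - t1 > abs(x2 - x1)]

pts = sprinkle(density=50, seed=42)
rel = causal_pairs(pts)
print(len(pts), "points,", len(rel), "causal pairs")
```

Because only volumes and light cones enter, a boosted observer sprinkling at the same density gets a statistically identical causal set, which is the loophole around the naive "boost the lattice" objection. Whether the dynamics on top of such a set can be made to work is, as noted below, another matter.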

But this causal set approach has an additional issue nowadays, because we now know there is a cosmological constant term. This forces a horrendous fine-tuning problem (in fact a fine-tuning problem that is twice as bad on a log scale as the usual CC problem) in order to get omega ~ 1/l^4, where l is the average spacing. And of course, this still runs into the nonrenormalizability issues regardless, absent a fixed point; in fact they are technically worse there, as there are many more lattice theories than continuum ones.

Also, on a technical side, many of these random lattice models tend to run into problems regaining suitable flat-space limits as well, as they tend to have modes that favor spacetime crumpling up. It's a numerical nightmare that has only recently seen an inkling of progress.
 