A Graph-based Explanation of Physics
I have been thinking about a model of physics that is fundamentally different from the ones I have been taught in school and university.
It is not a theory, because it does not make predictions. It is a different way of looking at things.
I have found that it makes a lot of things we normally consider weird much easier to understand. Both quantum physics and general relativity make intuitive sense in this framework, while other everyday aspects of life become more complex instead.
Almost every model of physics I have read about so far is based on the idea that reality consists of stuff inside a coordinate system, though the dimensionality of that coordinate system is unclear. Relativity talks about bending space, but it still treats the existence of space as a given.
I propose that there are no dimensions at all, and 'space' is only a derived property instead of a core component of the universe.
Rationale
If we assume that the universe is computable, then dimension-based physics, while humanly intuitive, is unnecessarily complicated. To simulate dimension-based physics, one first needs to define real numbers, which is complicated and requires that numbers be stored with practically infinite precision. Occam's Razor argues against this.
A graph model, in contrast, would be extremely simple from a computational point of view: a set of nodes, each with a fixed number of attributes, plus a set of connections between the nodes, suffices to express the state of the universe. Most importantly, it would suffice for the attributes of nodes to be simple booleans or natural numbers, which are much easier to compute with than real numbers.
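To make this concrete, here is a minimal sketch of such a state in Python. The names (`GraphState`, `neighbors`) and the example values are my own invention for illustration; the point is only that nodes with natural-number attributes plus a set of edges are all the state that is needed.

```python
from dataclasses import dataclass

# Hypothetical minimal representation of the universe's state:
# node attributes are plain natural numbers, and connections are
# an undirected set of node-id pairs. No coordinates anywhere.
@dataclass
class GraphState:
    attributes: dict  # node id -> int attribute
    edges: set        # frozenset({a, b}) pairs

state = GraphState(
    attributes={0: 1, 1: 0, 2: 1},
    edges={frozenset({0, 1}), frozenset({1, 2})},
)

def neighbors(state, node):
    """All nodes sharing an edge with `node`."""
    return {n for e in state.edges if node in e for n in e if n != node}
```

Note that nothing in this structure encodes position; any notion of "where" a node is would have to emerge from the edge set alone.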
Additionally, transition functions to advance in time would be easy to define as well, since they could just take the form of a set of if-then rules that are applied to each node in turn. (These transition functions roughly correspond to physical laws in more traditional physical theories.)
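A transition function of this kind might look like the following sketch. The specific rule (a node becomes 1 iff exactly one neighbour is 1) is an arbitrary Life-like example I chose for the demo, not a proposed physical law.

```python
# Illustrative transition step: an if-then rule applied to every node
# in turn. `attrs` maps node id -> 0/1 attribute, `adj` is a plain
# adjacency list; both are hypothetical stand-ins for the real state.
def step(attrs, adj):
    """A node becomes 1 iff exactly one of its neighbours is
    currently 1 (an arbitrary example rule)."""
    new = {}
    for node in attrs:
        live = sum(attrs[n] for n in adj[node])
        new[node] = 1 if live == 1 else 0
    return new

attrs = {0: 1, 1: 0, 2: 0}
adj = {0: [1], 1: [0, 2], 2: [1]}
print(step(attrs, adj))  # -> {0: 0, 1: 1, 2: 0}
```

As in Conway's Game of Life, the rule reads the old state everywhere and writes a fresh state, so the order in which nodes are visited does not matter.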
Therefore, a graph-based model of the universe, while not humanly intuitive, would be mathematically much simpler than a dimension-based one. Occam's Razor would argue in its favor.
Idea
Model reality as a graph structure.
Reality at a point of time is a set of nodes, a set of connections between those nodes, and a set of attributes for each node.
There are rules for evolving this graph over time, in discrete computational steps. These rules might be as simple as those in Conway's Game of Life, but they lead to very complex results due to the complicated structure of the graph.
Connections between nodes can be created or deleted over time according to transition functions.
What we call particles are actually patterns of attributes on clusters of nodes. These patterns evolve over time according to transition functions. Also, since particles are patterns instead of atomic entities, they can in principle be created and destroyed by other patterns.
Our view of reality as (almost) 3-dimensional is an illusion created by the way the nodes connect to each other. This can happen if a rule exists that meets the following criteria: it operates on an arbitrarily large graph (a set of vertices, a set of edges), and its repeated application causes the graph to more closely fulfill the following invariant:
 There exists a mapping f(v) of vertices to (x,y,z) coordinates such that for any pair of vertices m,n, the Euclidean distance between f(m) and f(n) is approximately equal to the length of the shortest path between m and n (inaccuracies are fine as long as the distance is small, but the approximation should be good at larger distances).
I have no idea what such a rule would look like, but for anyone who has seen the complexities you can achieve even in something as simple as Conway's Game of Life, it should be clear that it must be possible for such a rule to exist.
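While I cannot state the rule itself, the invariant it should enforce is easy to measure. The sketch below, with names of my own choosing, compares graph distance (breadth-first hop count) against Euclidean distance for a candidate coordinate mapping f; a rule satisfying the criteria above would drive this discrepancy down over time.

```python
import math
from collections import deque

def shortest_path_len(adj, a, b):
    """Hop count between a and b via breadth-first search."""
    seen, frontier = {a}, deque([(a, 0)])
    while frontier:
        node, dist = frontier.popleft()
        if node == b:
            return dist
        for n in adj[node]:
            if n not in seen:
                seen.add(n)
                frontier.append((n, dist + 1))
    return math.inf

def embedding_error(adj, coords, pairs):
    """Worst |Euclidean - graph distance| over the given node pairs."""
    err = 0.0
    for a, b in pairs:
        eu = math.dist(coords[a], coords[b])
        err = max(err, abs(eu - shortest_path_len(adj, a, b)))
    return err

# A 2x2 grid on lattice coordinates: adjacent pairs match exactly;
# the diagonal pair is off by 2 - sqrt(2), a small short-range error
# of exactly the kind the invariant tolerates.
adj = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}
coords = {0: (0, 0), 1: (1, 0), 2: (0, 1), 3: (1, 1)}
print(embedding_error(adj, coords, [(0, 1), (0, 3)]))
```

On larger grids the relative error of this kind of embedding shrinks with distance, which is exactly the behaviour the invariant asks for.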
Quantum Physics and General Relativity
A dimensionless graph model would have no contradiction between quantum physics and relativity:

Quantum effects happen when patterns (particles) spread across nodes that still have connections between them besides those connections that make up the primary 3D grid.
This also explains why quantum effects exist mostly on small scales: the rule enforcing 3D grid connections tends to wipe out the entanglements between particles.

Space dilation happens because the patterns caused by high-speed travel make the 3D grid pattern unstable, so the illusion that dimensions exist breaks down.
There is no contradiction between quantum physics and relativity if the very concept of distance is unreliable.
I don't really know a lot about time dilation experiments: Is there an 'objective time' relative to which our time frame is defined? Do different sources of time dilation 'stack'? I would need to know more about this subject to take a good guess at how it could be represented in a graph format, but here are two possible approaches:
 Each pattern spends some of its processing steps on interacting with other patterns and some on non-interactive internal processes. A shift in the proportion between the two would be perceived as a time dilation effect from the outside. This interpretation would imply that there exists an 'objective time'.
 The nodes of the graph don't actually advance through time in parallel. Instead, there is a single process that advances each node of the graph one after the other. Usually this process picks the next node to advance in a more-or-less uniform way from all nodes, but some patterns can reduce the probability of a node inside that pattern being picked. This interpretation would imply that there is no such thing as 'objective time'.
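The second interpretation can be sketched in a few lines. Everything here, including the weight values and the idea that one "tick" of a node equals one unit of its proper time, is an illustrative assumption, not a claim about the actual mechanism.

```python
import random

# Sketch of the second interpretation: a single sequential process
# picks one node per global step. Nodes inside a "dilated" pattern
# carry a reduced selection weight, so their local clocks advance
# more slowly relative to everything else.
def run(steps, weights, seed=0):
    rng = random.Random(seed)
    nodes = list(weights)
    ticks = {n: 0 for n in nodes}
    for _ in range(steps):
        node = rng.choices(nodes, weights=[weights[n] for n in nodes])[0]
        ticks[node] += 1  # advancing a node = one unit of its proper time
    return ticks

# Node 'b' sits in a pattern that halves its selection probability.
ticks = run(10000, {'a': 1.0, 'b': 0.5})
print(ticks['a'] / ticks['b'])  # roughly 2: 'b' experiences slower time
```

Note that from inside the graph no node can detect how often it is skipped, which is why this version needs no 'objective time' at all.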
Additional Observations
Here are some more short notes on how this could work:

Wave-particle duality makes intuitive sense:
Light is a particle in the sense that light is made up of discrete instances of patterns on the graph.
Light is a wave in the sense that these patterns propagate along the graph in a way that tends to ignore the rule that makes the graph appear to have three dimensions.
The collapse of a wave happens when the light pattern interacts with other patterns in such a way that the dimension-enforcing rule gets hold of it, thereby effectively assigning it a coordinate in space.

There is a maximum number of node connections in the graph along which a pattern can propagate in one computational step.
When translated into a traditional distance measure using the dimensionality-enforcing rule, this number works out to the speed of light.
It is possible for effects to appear faster than the speed of light because the 3D pattern only holds true on a macroscopic scale. There can be connections between nodes that are shorter than the Euclidean distance between those nodes in the 3D pattern, but the number of such connections goes down at larger scales, as the 3D-enforcing rule breaks them down.
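A tiny example makes the shortcut effect visible. The graph below is my own toy construction: a path standing in for a stretch of the 3D grid, plus one leftover edge that the grid-enforcing rule has not yet removed.

```python
from collections import deque

def hops(adj, a, b):
    """Breadth-first hop count from a to b."""
    seen, q = {a}, deque([(a, 0)])
    while q:
        n, d = q.popleft()
        if n == b:
            return d
        for m in adj[n]:
            if m not in seen:
                seen.add(m)
                q.append((m, d + 1))

# A path 0-1-2-3-4 representing a slice of the 3D grid, plus one
# leftover "shortcut" edge 0-4 the enforcing rule hasn't broken yet.
grid = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
with_shortcut = {n: list(ns) for n, ns in grid.items()}
with_shortcut[0].append(4)
with_shortcut[4].append(0)

print(hops(grid, 0, 4), hops(with_shortcut, 0, 4))  # 4 hops vs 1 hop
```

A pattern limited to one hop per step crosses the shortcut version in a single step, which from inside the 3D illusion looks like travelling four units of distance instantly.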

All movement is relative because there is no reference system.
Relativity of speeds is trivially true if you consider that an entire cluster of nodes can change its connections to the outside in a movement-like way while another cluster within that cluster does the same thing.
As a result, no single pattern can propagate faster than c, but because different parts of the graph can change in parallel, the combined relative movements still work out.

Energy is an abstract property of the graph that is interesting simply because it is an invariant of all transition rules. No matter what happens to a pattern, energy can never be created or destroyed simply because no application of a transition rule to any graph structure will alter this property.
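As a toy illustration of such an invariant, here is an invented transition rule that shifts "excitation" between neighbouring nodes but can never change the total; the sum of all attributes then plays the role of a conserved "energy". The rule is made up purely for this demo.

```python
# Hypothetical transition rule: every excited node passes one unit of
# excitation to its lowest-numbered neighbour. The rule redistributes
# attribute values but never creates or destroys them, so their sum
# is an invariant of every step.
def shift_step(attrs, adj):
    new = dict(attrs)
    for node in sorted(attrs):
        if new[node] > 0:
            target = min(adj[node])  # deterministic: pass to lowest neighbour
            new[node] -= 1
            new[target] += 1
    return new

attrs = {0: 2, 1: 0, 2: 1}
adj = {0: [1], 1: [0, 2], 2: [1]}
total = sum(attrs.values())
for _ in range(5):
    attrs = shift_step(attrs, adj)
    assert sum(attrs.values()) == total  # "energy" conserved every step
print(total)  # 3
```

Nothing singles this quantity out except that the rule happens to preserve it, which is exactly the sense in which energy is described above.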

Since particles are patterns, and patterns can interfere with each other and even destroy each other, the equivalence between mass (patterns forming particles) and energy (described above) makes intuitive sense.
I have a number of additional notes on this topic, but I find it very hard to phrase them in an understandable way. If you are interested in this topic, don't hesitate to contact me at floriandietz44@gmail.com.
Notes
This is not really a theory. I am not making predictions, I provide no concrete math, and this idea is not really falsifiable in its most generic forms.
Why do I still think it is useful?
Because it is a new way of looking at physics: it makes many things much easier and more intuitive to understand, and it makes the apparent contradictions go away.
I may not know the rules by which the graph needs to propagate in order for this to match up with experimental results, but I am pretty sure that someone more knowledgeable in math can figure them out.
This is not a theory, but a new perspective under which to create theories.
Also, I would like to note that there are alternative interpretations for explaining relativity and quantum physics under this perspective. The ones mentioned above are just the ones that seem most intuitive to me. I recognize that having multiple ways to explain something is a bad thing for a theory, but since this is not a theory but a refreshing new perspective, I consider this a good thing.
I think that this approach has a lot of potential, but it is difficult for humans to analyse because our brains evolved to deal with 3D structures very efficiently but are not at all optimised to handle arbitrary graph structures with any efficiency. For this reason, coming up with an actual, mathematically complete attempt at a graph-based model of physics would almost certainly require computer simulations even for simple problems.