Sets of emergent laws capture the (well-defined) complete set of relations between objects with respect to one or several variables of interest.

#### Solving classical prediction problems by “KnowledgeNet” based models

The first step in the emergent-law-based model-building process extracts “KnowledgeNets” from databases.

Because we define knowledge empirically as “patterns that have repeated without exception so far,” we can create directed graphs that show all relations between objects with respect to one or more variables of interest. Our approach ensures that every interesting emergent law of a given quality hidden in the data can be found. In this sense we can guarantee “complete knowledge.”
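As a minimal sketch of this idea (the function, group names, and toy data below are illustrative assumptions, not the actual system): an edge between two objects is kept only if the order relation on the variable of interest held in every observation so far.

```python
# Hypothetical sketch: extract "always repeated so far" order relations
# between objects (groups of observations) for one variable of interest.
from itertools import combinations

def extract_knowledge_net(groups):
    """Return directed edges (a, b) meaning: in every paired observation
    so far, object a's target value was greater than object b's."""
    edges = []
    for a, b in combinations(groups, 2):
        if all(x > y for x, y in zip(groups[a], groups[b])):
            edges.append((a, b))   # emergent law: a always greater than b
        elif all(y > x for x, y in zip(groups[a], groups[b])):
            edges.append((b, a))   # emergent law: b always greater than a
    return edges

# Toy data: hourly bicycle rentals observed under three weather "objects"
groups = {
    "sunny":  [120, 140, 135],
    "cloudy": [ 80,  95,  90],
    "rainy":  [ 30,  40,  35],
}
print(extract_knowledge_net(groups))
# [('sunny', 'cloudy'), ('sunny', 'rainy'), ('cloudy', 'rainy')]
```

Pairs with mixed outcomes simply get no edge: only relations that held in every observation become laws, which is what makes the extracted net “complete” at the chosen quality level.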

*Video 1: The Structure of a KnowledgeNet created for the variable of interest “number of rented bicycles in the next hour”.*

The nodes in the KnowledgeNet represent objects (groups of observations) that have always stood in a certain relation – a greater (or smaller) number of rented bicycles – to every node they are connected to by an edge.

*Video 2: The Structure of a KnowledgeNet created for the variable of interest “rate of signed bank contracts”.*

KnowledgeNets are our empirical basis for the construction of predictive models.

- The resulting models reach a prediction accuracy that is at least comparable to that of current state-of-the-art machine-learning algorithms.
- Feature creation and selection are performed automatically by the learning process.
- Models for several variables of interest can be constructed from a single KnowledgeNet.
- This also works for variables that were not used to create the net.
- The resulting models consist only of emergent laws – every relation used to predict the variable of interest has always held true. No assumptions are needed.
- The models are not black boxes: the laws they use can be fully understood by the user.
- The approach can handle the falsification of laws, because a KnowledgeNet contains enough objects to replace falsified laws with laws attached to other objects.
- Using only emergent laws is empirically justified by meta-laws (see Meta-Properties of Emergent Laws).
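The falsification point above can be sketched in a few lines (names, data, and the update routine are illustrative assumptions, not the authors' implementation): when a new observation breaks an “always so far” relation, that edge is removed, and the laws attached to other object pairs remain available for prediction.

```python
# Hypothetical sketch of law falsification in a KnowledgeNet:
# a new observation is added, and any edge whose order relation
# no longer holds in every observed pair is dropped.

def update_net(edges, groups, new_obs):
    """Append one new observation per object and keep only the laws
    (edges) that still hold in every observed pair."""
    for obj, value in new_obs.items():
        groups[obj].append(value)
    return [(a, b) for a, b in edges
            if all(x > y for x, y in zip(groups[a], groups[b]))]

groups = {"sunny": [120, 140], "cloudy": [80, 95], "rainy": [30, 40]}
edges = [("sunny", "cloudy"), ("sunny", "rainy"), ("cloudy", "rainy")]

# An unusually busy cloudy hour falsifies "sunny always > cloudy";
# the laws between the other object pairs survive.
edges = update_net(edges, groups, {"sunny": 140, "cloudy": 150, "rainy": 50})
print(edges)  # [('sunny', 'rainy'), ('cloudy', 'rainy')]
```

This illustrates why the density of a KnowledgeNet matters: a model that loses one law can fall back on laws attached to other objects instead of failing outright.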

| Prediction Task | Best Mean Absolute Prediction Error of Several State-of-the-Art Machine-Learning Algorithms | Mean Absolute Prediction Error of Emergent-Law-Based Model |
| --- | --- | --- |
| Number of rented bicycles | 33.77 | 27.9 |
| Temperature in Washington, DC | 0.01343 | 0.01278 |
| Humidity in Washington, DC | 0.03643 | 0.0359 |

*Table 1: Prediction performance for several variables of interest created from the above KnowledgeNet.*