For future research, I would like to point out that many of these boxes can be quantified, although different measures are possible and research is needed to decide which ones are useful. Uncertainty is typically quantified by Shannon entropy as defined in information theory and already used by, e.g., Hirsh et al. (2012) and Friston (2010). The complexity of the world could be the number of states in a category-based world model (or the entropy of the typical distribution over them). The amount of data available can again be quantified by information theory: it might be the Fisher information that the data carry about the parameters of an ideal world model, multiplied by the number of data points, i.e., how much information the data contain about the world; or the mutual information between the data and the world states. Computational resources can be quantified in floating-point operations per second (FLOPS) or a similar measure. To quantify uncontrollability, related probabilistic computations are possible (Huys and Dayan, 2009); we might also be able to use tools from control theory, e.g., Liu et al. (2011).
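As a rough sketch of how the information-theoretic measures mentioned above could be written out (the notation here is illustrative and not taken from the cited works; $W$ denotes the world state, $D$ the available data, $\theta$ the parameters of an ideal world model, and $n$ the number of data points):

```latex
% Illustrative definitions under the assumed notation W, D, theta, n.
\begin{align}
  H(W) &= -\sum_{w} p(w)\,\log p(w)
    && \text{uncertainty (or world complexity) as Shannon entropy} \\
  I(D; W) &= H(W) - H(W \mid D)
    && \text{data availability as mutual information with world states} \\
  \mathcal{I}_n(\theta) &= n\,\mathbb{E}\!\left[\left(\tfrac{\partial}{\partial \theta}\log p(D \mid \theta)\right)^{2}\right]
    && \text{data availability as Fisher information from } n \text{ i.i.d.\ observations}
\end{align}
```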