My continued investigation into the nature of modeling and analogy in the natural sciences has yielded quite the interesting perspective. In recent chats with my research advisor, Dr. Mark Criley (chair of the IWU philosophy department), we have been discussing the seemingly branching chain of causality in the modeling of physical systems.
By this I mean: because of the inherently non-deterministic character of the very, very small revealed by quantum mechanics, the dream of simulating a physical system exactly seems to have died quite the decisive death. Of course, there are still many ways that systems can be approximated with physical models, but the old notion that 'if we know all the relevant properties of every particle in a system, then we can predict exactly what it will do in the future' rings true no more. Rather, future models that want to take advantage of the quantum weirdness of the more fundamental particles will have to rely on a sort of parallel modeling, where each probabilistic outcome creates a branch in the modeling hierarchy.
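To make the idea of a "branch in the modeling hierarchy" a little more concrete, here is a minimal sketch of what such a parallel model might look like in code. This is purely illustrative and not anything from our conversations: the `Branch`, `expand`, and `leaves` names are hypothetical, the outcomes and probabilities are toy values, and a real model would track physical state rather than labeled outcomes. The point is only that each probabilistic outcome spawns its own child branch, so the model carries every possible future forward in parallel instead of a single predicted trajectory.

```python
from dataclasses import dataclass, field


@dataclass
class Branch:
    # The sequence of outcomes that led to this branch, and the
    # cumulative probability of that entire history.
    history: tuple[str, ...]
    probability: float
    children: list["Branch"] = field(default_factory=list)


def expand(branch: Branch, outcomes: dict[str, float]) -> None:
    """Attach one child branch per possible outcome, weighting each by its likelihood."""
    for outcome, p in outcomes.items():
        branch.children.append(Branch(branch.history + (outcome,), branch.probability * p))


def leaves(branch: Branch):
    """Walk the hierarchy and yield every terminal branch, i.e. each 'possible future'."""
    if not branch.children:
        yield branch
    else:
        for child in branch.children:
            yield from leaves(child)


if __name__ == "__main__":
    # Toy two-step system where each step can resolve 'up' or 'down' with equal likelihood.
    root = Branch(history=(), probability=1.0)
    expand(root, {"up": 0.5, "down": 0.5})
    for child in root.children:
        expand(child, {"up": 0.5, "down": 0.5})

    for leaf in leaves(root):
        print(" -> ".join(leaf.history), f"(probability {leaf.probability})")
```

Running the sketch prints four equally likely futures (up/up, up/down, down/up, down/down), which is exactly the branching structure I have in mind: the model's output is not a single prediction but a weighted tree of them.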
This realization all arose from an initial conversation we had about the miscibility of scale (and the reading of a brilliant excerpt by … to be continued because I forgot the reading at school 🙁