Why Michael Levin’s Bubble Sort “Revelation” Is Misunderstood: 1960s Algorithm Theory Meets a God-of-the-Gaps Argument
Bubble Sort Doesn’t Prove God: How Michael Levin’s “Free Lunch” and “Get More Out Than You Put In” “Discoveries” Are Just Compiler Optimizations Dressed in Platonic Robes
Or: What Happens When Biologists Discover That Algorithms Do What Algorithms Do

“The first principle is that you must not fool yourself and you are the easiest person to fool.”
― Richard P. Feynman
Christopher Hitchens would have eviscerated this bubble sort argument in three sentences, but I’ll need more space because the intellectual fraud here is layered like a particularly dishonest onion. Michael Levin, whose legitimate work on bioelectricity deserves respect, has apparently decided that bubble sort (a sorting algorithm taught to computer science undergraduates in week two) contains mysterious hidden behaviors that “no one has noticed in 60 years” because everyone has been foolishly assuming that deterministic algorithms only do what their steps prescribe.
This is a category error wrapped in breathless mysticism and sold as profundity.
What Levin Claims to Have Found
Watch carefully as Levin performs the intellectual sleight of hand. He claims bubble sort is doing “other things” that are “not directly related to what you’ve asked them to do,” behaviors that “any behavioral scientist would recognize as within their domain if you didn’t tell them that this came from a deterministic algorithm.” He calls this phenomenon “clustering” and suggests it’s evidence of “free compute” and “intrinsic motivation” operating “between the lines” of the algorithm.
And then he refuses to explain what “clustering” actually means.
This is the god-of-the-gaps argument in its purest, most naked form: “I don’t understand how this works, therefore something mysterious and profound must be happening.” The gaps in Levin’s understanding of computer science are now being populated with the same Platonic forms he invokes to explain xenobots, consciousness, and apparently every other phenomenon he encounters. It is trivial to claim that everything is doing something extra when you rename your own ignorance as a new ontology.
What’s Actually Happening: Bubble Sort Stability
Computer scientists have a term for what Levin thinks he’s discovered. It’s called algorithm stability, and it’s been documented in literally every undergraduate algorithms textbook since the 1960s.
A stable sorting algorithm preserves the relative order of elements with equal keys. When you sort [5, 3, 5, 1] and the two 5s end up adjacent to each other in their original relative order, that’s not mysterious “clustering” operating “between the lines.” That’s bubble sort doing exactly what bubble sort does, mechanically and deterministically, because the algorithm only swaps adjacent elements when the comparison (is the left element greater than the right?) returns true.
Donald Knuth’s The Art of Computer Programming, Vol. 3 (1973) devotes an entire section to stability. Cormen, Leiserson, and Rivest’s Introduction to Algorithms (1990) defines it in its chapter on sorting in linear time. This is not hidden knowledge. This is not a discovery. This is undergraduate computer science.
When equal elements remain in their original relative positions, they form “clusters” because the algorithm never had any reason to separate them. This is not agency. This is not intrinsic motivation. This is what happens when a comparison-based sorting algorithm encounters elements for which the comparison function returns false.
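The point is easy to verify directly. Here is a minimal Python sketch (the function and variable names are mine, not Levin’s code): a textbook bubble sort over (key, label) pairs, where the labels track each element’s original position so the “clusters” become visible.

```python
def bubble_sort(items):
    """Textbook bubble sort over (key, label) pairs; sorts by key only."""
    a = list(items)
    n = len(a)
    for i in range(n - 1):
        for j in range(n - 1 - i):
            # Swap ONLY on strict inequality: equal keys are never reordered.
            if a[j][0] > a[j + 1][0]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a

data = [(5, "a"), (3, "b"), (5, "c"), (1, "d"), (3, "e")]
print(bubble_sort(data))
# → [(1, 'd'), (3, 'b'), (3, 'e'), (5, 'a'), (5, 'c')]
# The two 5s ("a" before "c") and the two 3s ("b" before "e") keep their
# original relative order. That preserved order IS the "cluster".
```

No hidden channel is needed to predict this output: the `>` test never fires for equal keys, so equal elements simply never move past each other.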
The Compiler Optimization Massacre
But it gets worse. Much worse. Because Levin is making claims about what the algorithm “does” without apparently asking what hardware and compiler are executing his code.
Modern compilers perform aggressive optimizations. When you write six lines of bubble sort in a high-level language, the machine never executes those six lines as written. The compiler generates machine code that may:
- Unroll loops for performance
- Reorder operations to exploit instruction-level parallelism
- Use SIMD instructions to compare multiple elements simultaneously
- Employ branch prediction and speculative execution
- Leverage cache hierarchies in ways that create temporal patterns
None of this is “mysterious.” None of this is “between the lines.” All of it is documented in compiler design textbooks and processor architecture manuals.
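The gap between source code and what actually executes is visible even in an interpreted language. A small CPython sketch (offered as an analogy to machine-code optimization, not an instance of it): CPython’s compiler constant-folds `2 * 3` at compile time, so the bytecode that actually runs never performs the multiplication the source appears to describe.

```python
import dis

def f():
    # The source says "2 * 3", but CPython's compiler folds this to the
    # constant 6 before execution; no multiplication happens at runtime.
    return 2 * 3

print(f.__code__.co_consts)  # the folded constant 6 appears here
dis.dis(f)                   # the disassembly loads a constant; no BINARY op on 2 and 3
```

If patterns in the executed instructions surprise you, the first place to look is this transformation pipeline, not a metaphysical one.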
If Levin observed behaviors that seemed to deviate from his expectations, the first question any competent computer scientist asks is: “What optimizations is the compiler performing?” The second question is: “What are the hardware effects?” The third question is: “Did I understand what the algorithm actually does?”
Levin appears to have skipped all three questions and jumped directly to: “This must be evidence of consciousness emerging in deterministic systems.”
The Thermodynamic Reality Levin Ignores
Here’s what actually constrains and shapes algorithmic behavior, without requiring Platonic forms or mysterious agencies:
Energy dissipation: Every irreversible bit operation dissipates energy as heat, at least kT ln 2 per bit erased (Landauer’s principle). The thermodynamic cost of sorting is measurable, finite, and bounded by the number of comparisons and swaps performed.
Hardware constraints: The algorithm executes on physical substrates with real constraints: transistor switching times, wire delays, power limits, thermal budgets. These constraints shape execution in ways that have nothing to do with the abstract algorithm and everything to do with semiconductor physics.
Implementation details: The programming language, compiler optimization level, processor architecture, memory hierarchy, and operating system scheduler all affect observable behavior. None of these are “mysterious.” All are documented.
Comparison function semantics: What counts as “equal” depends entirely on how the comparison function is defined. Change the comparison function, change the “clustering” behavior. This is not the algorithm discovering Platonic forms; this is the algorithm mechanically executing the comparison function you gave it.
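That last point can be sketched in a few lines with a key-parameterized bubble sort (a hypothetical helper of mine, not Levin’s code): change the key function and the “clusters” change, because what counts as “equal” changes.

```python
def bubble_sort_by(items, key):
    """Stable bubble sort; two items are 'equal' iff their keys compare equal."""
    a = list(items)
    for i in range(len(a) - 1):
        for j in range(len(a) - 1 - i):
            if key(a[j]) > key(a[j + 1]):  # swap only on strict inequality
                a[j], a[j + 1] = a[j + 1], a[j]
    return a

words = ["bee", "ant", "fox", "ox", "cat"]
# By length: all three-letter words are "equal" and cluster in original order.
print(bubble_sort_by(words, key=len))  # → ['ox', 'bee', 'ant', 'fox', 'cat']
# Alphabetically: no two words are equal, so no clusters form at all.
print(bubble_sort_by(words, key=str))  # → ['ant', 'bee', 'cat', 'fox', 'ox']
```

Same algorithm, same input, different comparison function, different “clustering.” The pattern lives entirely in the equality relation you hand the algorithm.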
The “Degrees of Freedom” Fallacy
Levin makes much of his “degrees of freedom” argument, comparing algorithms to steganography, where information is hidden in bits that don’t perceptibly alter the image. This analogy is fatally flawed.
In steganography, you deliberately exploit redundancy to encode information. The “hidden” information exists because you put it there intentionally. In Levin’s bubble sort example, there is no hidden information being encoded. There are only the mechanistic consequences of comparing and swapping operations.
When Levin “allowed duplicate numbers in the sort” to “let off the pressure on the algorithm,” he didn’t create degrees of freedom for mysterious agencies to express themselves. He created ambiguity in the sort order for equal elements, which allowed the algorithm’s stability property (or lack thereof) to become observable.
This is like noticing that when you remove constraints from a physical system, it explores more of its phase space, then concluding that the system must have “intrinsic motivation.” No. Systems explore their thermodynamically accessible states under whatever constraints apply. Remove constraints, get more accessible states. This is not profound. This is elementary statistical mechanics.
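The “more accessible states” point can be made concrete by brute force (a toy enumeration, workable only for small inputs): count how many orderings of a list qualify as “sorted” once duplicate keys relax the ordering constraint.

```python
from itertools import permutations

def valid_sorted_orders(items):
    """Count position-permutations of `items` whose key sequence is nondecreasing."""
    n = len(items)
    return sum(
        1
        for p in permutations(range(n))
        if all(items[p[i]] <= items[p[i + 1]] for i in range(n - 1))
    )

print(valid_sorted_orders([4, 3, 2, 1]))  # → 1: distinct keys admit one sorted order
print(valid_sorted_orders([5, 3, 5, 1]))  # → 2: the two equal 5s can go either way
```

Introducing duplicates multiplied the number of admissible outcomes; nothing “chose” among them except the algorithm’s mechanical stability property.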
The Commentariat Sees Through It
The most damning evidence comes from the YouTube comments, where working computer scientists immediately identify the problem:
MattBuske writes: “Actual Computer Scientist here, I have some concerns and I think Levin has made a fundamental conceptual error that invalidates his entire argument. What he’s actually discovered: Deterministic algorithms produce patterns that weren’t explicitly written as high-level goals. What he thinks he’s discovered: Evidence of something operating ‘in spite of’ or ‘between’ the algorithm. The error: He’s confusing ’emergent from the algorithm’s steps’ with ‘not explained by the algorithm.’ The clustering pattern IS the algorithm.”
User golfkzn: “You take a 6 step algorithm to sort numbers, you look at the numbers during the sorting process, you notice patterns. From this you conclude something outside the algorithm is happening? That seems a massive stretch.”
User Tordvergar: “A search on bubble sorts with duplicate data elements leading to clustering gives an explanation that for duplicate values, bubble sort leaves them in their original order, which can be referred to as a ‘cluster.’ Well DUH.”
Even the audience knows this is nonsense. Levin has apparently walked into Computer Science 101, noticed that algorithms produce patterns when executed, and concluded he’s discovered evidence of consciousness emerging in silicon.
The Platonic Bait-and-Switch
Here’s the move Levin is really making, and it’s the same move he makes with xenobots:
- Observe a phenomenon you don’t fully understand
- Declare it mysterious and profound
- Suggest it provides evidence for Platonic forms or consciousness or agency
- When pressed for mechanism, gesture vaguely at “spaces between the algorithm”
- Never specify what would falsify your claim
- Repeat
This is not science. This is mysticism with a Ph.D.
The bubble sort “discovery” is particularly revealing because unlike biological systems (where complexity can hide mechanistic gaps), bubble sort is six lines of code executing on well-understood hardware. If Levin cannot accurately describe what’s happening in bubble sort without invoking mysterious agencies, why should we trust his metaphysical interpretations of vastly more complex biological phenomena?
What This Actually Demonstrates
The bubble sort episode demonstrates something genuinely important, but not what Levin thinks.
It demonstrates that even brilliant researchers can fall victim to the god-of-the-gaps fallacy when they venture outside their domain expertise. Levin is not a computer scientist. He does not appear to understand compiler optimizations, algorithm stability, or hardware effects. When confronted with behaviors he didn’t expect, he populated his ignorance with the same Platonic mysticism that pervades his biological theorizing.
This is exactly what Hitchens warned against: the tendency to see mystery and transcendence wherever understanding runs thin, rather than acknowledging “I don’t know yet, let me investigate further.”
The Correct Explanation
Here is what’s actually happening in Levin’s bubble sort experiments, stated precisely and falsifiably:
Bubble sort stability: When the comparison function returns false for equal elements, they maintain their relative order. This creates “clusters” of equal elements that preserve their original arrangement. This is documented, understood, and predictable.
Compiler effects: High-level code is transformed by the compiler into machine code that may exhibit behaviors not obvious from reading the source. These are fully deterministic and arise from well-understood optimization techniques.
Hardware effects: Modern processors reorder instructions, predict branches, and exploit parallelism in ways that create observable patterns. All of this is documented in processor manuals.
Thermodynamic constraints: Every operation dissipates energy and respects Landauer’s bound. The algorithm explores only thermodynamically accessible states.
No mystery required: Every behavior Levin observes flows deterministically from the algorithm, compiler, and hardware. Nothing operates “between the lines.” Nothing violates deterministic constraints. Nothing requires Platonic forms.
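The Landauer bound mentioned above is easy to put a number on. A one-line calculation of kT ln 2, assuming room temperature of 300 K:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact value in the 2019 SI)
T = 300.0           # assumed room temperature, K

# Landauer's principle: minimum heat dissipated per bit of information erased.
e_bit = k_B * T * math.log(2)
print(f"{e_bit:.3e} J per bit erased")  # roughly 2.87e-21 J
```

Tiny, but strictly positive and strictly finite: whatever a sorting run does, its energetics are bounded by ordinary physics, not by any extra-algorithmic agency.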
The Bottom Line: Mysticism Is Not Discovery
Michael Levin has mistaken his own ignorance of computer science for evidence of transcendent consciousness permeating deterministic systems. He has taken algorithm stability (a well-documented property of sorting algorithms), compiler optimizations (standard practice since the 1960s), and hardware effects (thoroughly analyzed in computer architecture literature), declared them mysterious, and populated the gaps with Platonic mysticism.
This is not how science works. This is how religion works.
Christopher Hitchens would have asked Levin one simple question: “What observation would prove you wrong?” And when Levin gestured vaguely at “spaces between the algorithm” and “intrinsic motivation” without specifying falsification criteria, Hitchens would have recognized this for what it is: unfalsifiable metaphysics masquerading as empirical discovery.
Bubble sort doesn’t prove we don’t understand machines. It proves we need to understand machines before claiming they’re doing something mysterious. And when a biologist wanders into computer science and mistakes algorithm stability for consciousness, the appropriate response isn’t wonder at the profundity. It’s a firm but polite suggestion to read the textbook before declaring a revolution.
The mystery isn’t in the algorithm. The mystery is why anyone takes this seriously.
References and Prior Art
The behaviors attributed to “hidden” or “extra-algorithmic” activity in bubble sort are not novel, mysterious, or poorly understood. They are explicitly documented properties of sorting algorithms, compiler optimizations, and physical computation, with a literature spanning more than six decades. The following references provide authoritative grounding.
Sorting Algorithms and Stability
Bubble sort is a canonical comparison-based sorting algorithm whose properties, including stability, time complexity, and behavior under duplicate keys, have been formally analyzed since the mid-20th century.
Zutshi, A., & Goswami, S. (2021). A systematic review of sorting algorithms. Journal of Information Engineering and Applications.
DOI: https://doi.org/10.1016/j.jjimei.2021.100042
This review situates bubble sort as a quadratic-time, stable sorting algorithm and summarizes its known mechanical behaviors in the presence of equal keys.
Ganapathi, A., & Chowdhury, R. A. (2022). Parallel divide-and-conquer algorithms for bubble sort, selection sort and insertion sort. The Computer Journal.
DOI: https://doi.org/10.1093/comjnl/bxab107
A modern peer-reviewed treatment of bubble sort that reinforces its deterministic nature and well-understood behavior.
Shabaz, M. (2019). SA Sorting: A novel sorting technique. Journal of Engineering.
DOI: https://doi.org/10.1155/2019/3027578
Includes explicit discussion of bubble sort’s stability and its predictable “clustering” of equal elements due to comparison semantics.
Knuth, D. E. The Art of Computer Programming, Vol. 3: Sorting and Searching (1973).
Cormen, T. H., Leiserson, C. E., Rivest, R. L., & Stein, C. Introduction to Algorithms (1990).
Compiler Optimization and Program Behavior
Claims about algorithms exhibiting unexpected or “between-the-lines” behavior ignore the extensive literature on compiler transformations and hardware-level execution effects, which explain such observations without invoking metaphysics.
Brown, M. et al. (2020). Not So Fast: Understanding and Mitigating Negative Impacts of Compiler Optimizations on Code Reuse Gadget Sets.
arXiv: https://arxiv.org/abs/2005.08363
Demonstrates how compiler optimizations materially alter observable program behavior while preserving semantics.
Flückiger, M. et al. (2017). Correctness of speculative optimizations with dynamic deoptimization.
arXiv: https://arxiv.org/abs/1711.03050
A formal treatment of how speculative execution and optimization reorderings arise from deterministic compilation strategies.
Georgiou, K. et al. (2018). Less is more: Exploiting standard compiler optimization levels for better performance and energy consumption.
arXiv: https://arxiv.org/abs/1802.09845
Shows that performance patterns emerge from optimization levels alone, without any algorithmic novelty.
Wang, X. et al. (2011). Undefined behavior and compiler optimizations.
https://people.csail.mit.edu/nickolai/papers/wang-stack.pdf
A seminal systems paper explaining how compilers exploit undefined behavior, often producing execution patterns that surprise non-specialists.
Thermodynamics of Computation
All computation occurs on physical substrates and is subject to thermodynamic limits. Observed algorithmic behavior cannot escape these constraints.
Landauer, R. (1961). Irreversibility and heat generation in the computing process. IBM Journal of Research and Development.
DOI: https://doi.org/10.1147/rd.53.0183
Establishes the minimum energetic cost of information erasure, bounding all computational processes.
Taken together, this literature shows:
- Bubble sort’s “clustering” behavior is a direct consequence of algorithm stability, documented since the 1960s.
- Compiler optimizations and hardware execution effects are deterministic, measurable, and extensively studied.
- No unexplained degrees of freedom, intrinsic motivation, or Platonic structures are required to account for the observed phenomena.
What is being presented as discovery is, in fact, prior art.