Computation as Core Language

#core-framework #meta-principle

What It Is

Computation provides a universal language for describing mechanisms across all domains where information is processed, decisions are made, and behavior emerges. This is not metaphor: computation is the fundamental substrate underlying any system that transforms inputs to outputs through rule-following processes. Physics equations are computations. Neural networks are computations. Genetic expression is computation. Market dynamics are computations. Computation is not one description among many; it is the deepest available language for how mechanisms work.

The Church-Turing thesis establishes that every effectively calculable function can be computed by a universal Turing machine. Stephen Wolfram's principle of computational equivalence extends this: systems that appear complex are often performing computations of equivalent sophistication. A simple cellular automaton can exhibit complexity matching the human brain's. The universe itself may be a computational substrate, transforming initial states into next states through local rules iterated across space and time.

This means: if a system processes information and produces outputs through rule-based transformations, it is computational. And if it is computational, we have access to the entire toolkit of computational thinking—algorithms, complexity theory, information theory, state machines, caching, resource allocation. These are not domain-specific tricks. They are universal principles emerging from the mathematical structure of computation itself.

Computability and Universal Computation

The Turing Machine Foundation

Alan Turing proved in 1936 that a simple abstract machine (Turing machine) can compute any effectively calculable function. The machine has:

  • Infinite tape (memory)
  • Read/write head (processor)
  • State table (program)
  • Finite set of states and rules

The profound implication: Any computation performable by any physical system can be performed by this simple abstract machine. Computation is not implementation-specific. It is mathematical structure that transcends substrate.

Deeper still: the lambda calculus demonstrates that computation reduces even further, to variable binding and substitution, that is, to recognizing a pattern and rewriting it. Pattern matching is the irreducible computational primitive. Turing machines, circuits, brains: all implement pattern matching at different scales.

| System | Substrate | Computational? | Turing-Equivalent? |
|---|---|---|---|
| Electronic CPU | Silicon transistors | Yes | Yes |
| Human brain | Biological neurons | Yes | Yes (practically bounded) |
| Genetic expression | DNA/RNA/proteins | Yes | Yes |
| Cellular automaton | Mathematical rules | Yes | Yes |
| Water flowing downhill | Physical laws | Borderline | No (no information storage) |

If a system transforms inputs to outputs through rules and maintains state, it is performing computation. If it can simulate a Turing machine given sufficient resources, it is Turing-complete.

Wolfram's Computational Equivalence

Stephen Wolfram observed that systems quickly reach computational universality. Even very simple rules (elementary cellular automata with two states and nearest-neighbor updates, such as Rule 110) can generate patterns of maximal complexity. This suggests:

Principle of Computational Equivalence: Almost all processes that are not obviously simple correspond to computations of equivalent sophistication.

Implication for behavior: Your brain running an addiction pattern is performing a computation of similar complexity to your brain doing PhD-level mathematics. The difference is not computational power (both run on the same Turing-complete hardware) but the program being executed. Both are sophisticated computations. One program produces a self-destructive cascade; the other produces creative output.

This reframes behavior change: not "develop better computational capacity" but "install better program." The hardware is universal. Change the software.

Algorithmic Complexity as Fundamental Metric

Kolmogorov complexity measures information content as the length of the shortest program that generates the output:

K(x) = min{|p| : U(p) = x}

Where:
  K(x) = Kolmogorov complexity of string x
  p = program
  U = universal Turing machine
  |p| = length of program p
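K(x) itself is uncomputable, but any compressor yields a computable upper bound on it: the compressed string plus a fixed decompressor is a program that regenerates x. A minimal sketch using Python's zlib; the point is only that structured strings admit short descriptions and patternless ones do not:

```python
import random
import zlib

def complexity_upper_bound(s: str) -> int:
    """Compressed length of s: a computable upper bound on K(s),
    up to the constant size of the decompressor."""
    return len(zlib.compress(s.encode("utf-8"), level=9))

# Highly regular string: a short program ("print 'ab' 500 times") suffices,
# and the compressor finds that structure.
regular = "ab" * 500

# Patternless string: no description much shorter than the string itself.
random.seed(0)
noisy = "".join(chr(random.randrange(33, 127)) for _ in range(1000))

complexity_upper_bound(regular)  # small: the structure compresses away
complexity_upper_bound(noisy)    # close to len(noisy): nearly incompressible
```

Both inputs are 1000 characters long, yet their shortest-description lengths differ by an order of magnitude, which is exactly the quantity K(x) formalizes.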

Applied to questions:

Complex question = requires long program (high computational cost):

  • "How can I be better?" requires exploring entire knowledge graph
  • Algorithmic complexity: O(∞) - unbounded search
  • Output: Random or incomplete

Simple question = requires short program (low computational cost):

  • "What's the next action on highest-priority task?"
  • Algorithmic complexity: O(log n) - prioritized lookup
  • Output: Specific actionable answer

The question theory framework uses algorithmic complexity (O-notation) to evaluate question quality. This is not arbitrary: it reflects the actual computational cost of executing the search program the question specifies.
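The contrast can be made concrete with a priority queue. Once priorities are maintained in a heap, "what's the next action on the highest-priority task?" compiles to a cheap peek, and inserting a new task costs O(log n). The task names here are hypothetical illustrations:

```python
import heapq

# Hypothetical prioritized task list: (priority, next_action), lower = more urgent.
tasks = [
    (2, "review pull request"),
    (1, "draft project proposal"),
    (3, "clear inbox"),
]
heapq.heapify(tasks)  # one-time O(n) setup

def next_action(heap):
    """The well-posed question is a constant-time peek at the heap root."""
    return heap[0][1]

heapq.heappush(tasks, (0, "fix production outage"))  # O(log n) insert
next_action(tasks)  # -> "fix production outage"
```

The vague question ("how can I be better?") has no such data structure behind it, which is precisely why it has no bounded execution cost.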

Computation as Mechanism Description

Why Computational Language Is Fundamental

Other languages describe behavior at higher abstraction levels. Computational language describes the mechanism that produces the behavior.

| Language Type | Example | Abstraction Level | Debuggability |
|---|---|---|---|
| Moral | "He's lazy" | Highest (character judgment) | Zero (no mechanism) |
| Psychological | "He has low motivation" | High (mental state) | Low (vague mechanism) |
| Neurobiological | "His dopamine system is dysregulated" | Medium (physical substrate) | Medium (not directly manipulable) |
| Computational | "Expected value calculator outputs low signal" | Low (mechanism) | High (manipulable inputs) |

Computational language wins because it describes manipulable mechanisms. You cannot directly manipulate "character" or "dopamine levels." You can manipulate expected value inputs: increase reward salience, reduce temporal distance, make progress visible. These are architectural variables.

Why is computation the "best basis" for mechanism description?

1. Universal applicability - Computation describes any rule-based transformation, regardless of substrate
2. Precise specification - Algorithms specify exact operations, enabling replication
3. Complexity analysis - O-notation reveals computational cost objectively
4. Compositionality - Complex mechanisms built from simple computational primitives
5. Debuggability - Computational systems have systematic debugging protocols
6. Transferability - Principles discovered in one domain transfer to all computational domains

When you describe procrastination as "work_launch_script failure, default_script runs instead," you are not using metaphor. You are describing an actual computational process: an attempted program load, load failure due to unmet preconditions, and fallback to a default program. This describes the mechanism in computational terms that immediately suggest interventions: identify the preconditions, satisfy them, or modify the default program.
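The load-failure-then-fallback mechanism can be written out directly. A minimal sketch; the precondition names are hypothetical illustrations, not claims about any particular person's architecture:

```python
WORK_PRECONDITIONS = ["task_defined", "workspace_clear", "phone_away"]

def work_launch_script(state: dict) -> str:
    """Attempt to load the work program; fall back to the default on load failure."""
    unmet = [p for p in WORK_PRECONDITIONS if not state.get(p)]
    if unmet:
        return default_script(unmet)  # load failure: the fallback program runs
    return "deep_work"

def default_script(unmet) -> str:
    """The low-cost program that runs when the intended one cannot load."""
    return "scrolling (blocked on: " + ", ".join(unmet) + ")"

work_launch_script({"task_defined": True})
# -> "scrolling (blocked on: workspace_clear, phone_away)"
work_launch_script({p: True for p in WORK_PRECONDITIONS})  # -> "deep_work"
```

The interventions fall straight out of the code: the returned string names the unmet preconditions, so "debugging procrastination" reduces to satisfying them one by one.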

More Than Metaphors

The claim: computational descriptions are not metaphorical mappings to underlying reality. They are the underlying reality at appropriate level of abstraction.

Three levels of description for "waking up at 5am":

| Level | Description | Usefulness |
|---|---|---|
| Physical | Neurons fire, muscles contract, eyes open | True but not actionable (cannot directly control neurons) |
| Computational | Circadian program executes wake_script at time_threshold | True and actionable (can modify program, threshold, inputs) |
| Moral | "He's disciplined enough to wake early" | False (describes output not mechanism, not actionable) |

The physical level is true but too low for intervention. The moral level is false (confuses output with input). The computational level is both true and actionable—it describes mechanism at the level where you can actually intervene.

Neurons firing IS a computation. Ion channels opening based on voltage thresholds, neurotransmitter binding triggering cascades, action potentials propagating—these are computational operations. Describing them computationally is not metaphor. It is accurate description of what is actually happening.
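The simplest caricature of that firing rule, a McCulloch-Pitts-style threshold unit, makes the point: accumulate weighted inputs, compare against a threshold, emit an output. This is a sketch of the computational skeleton, not a biophysical model:

```python
def threshold_neuron(inputs, weights, threshold=1.0):
    """Fire (1) iff the summed weighted input reaches the threshold:
    the computational core of 'channels open at a voltage threshold'."""
    potential = sum(x * w for x, w in zip(inputs, weights))
    return 1 if potential >= threshold else 0

threshold_neuron([1, 1], [0.6, 0.6])  # -> 1 (1.2 >= 1.0: fires)
threshold_neuron([1, 0], [0.6, 0.6])  # -> 0 (0.6 < 1.0: stays silent)
```

Real neurons add leak currents, refractory periods, and nonlinear dynamics, but those are refinements of this compare-and-emit operation, not departures from it.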

Emergence and Levels of Abstraction

You can describe web browsing at many levels:

  • Physics: Electrons flow through semiconductors
  • Logic gates: AND, OR, NOT operations on binary signals
  • Assembly: MOV, ADD, JMP instructions
  • Operating system: System calls, process scheduling
  • Browser: HTTP requests, DOM rendering
  • User experience: "I clicked a link"

All levels are true. All describe the same event. But computational level (algorithm being executed) provides the right abstraction for understanding and modifying behavior. Too low (physics) and you drown in irrelevant detail. Too high (user experience) and you lose mechanistic understanding.

Computation Gives Access to Tools

Adopting computational language imports entire toolkit from computer science:

Algorithmic Complexity Theory

O-notation for analyzing cost of operations:
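A minimal sketch of what the notation measures: counting probes for linear versus binary search over the same sorted data. The absolute numbers are incidental; the growth rates are the point:

```python
def linear_probes(data, target):
    """O(n): worst-case probes grow linearly with input size."""
    for count, x in enumerate(data, start=1):
        if x == target:
            return count
    return len(data)

def binary_probes(data, target):
    """O(log n): each probe halves the remaining interval."""
    lo, hi, count = 0, len(data) - 1, 0
    while lo <= hi:
        mid = (lo + hi) // 2
        count += 1
        if data[mid] == target:
            return count
        if data[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return count

data = list(range(1024))
linear_probes(data, 1000)  # 1001 probes
binary_probes(data, 1000)  # at most 11 probes
```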

State Machine Theory

Finite state automata for modeling behavior:
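A sketch of a behavioral automaton; the states and events are hypothetical illustrations. Behavior is the path the machine traces through its transition table:

```python
# (current_state, event) -> next_state; unknown pairs leave the state unchanged.
TRANSITIONS = {
    ("idle", "timer_start"): "focused",
    ("focused", "phone_notification"): "distracted",
    ("distracted", "phone_away"): "focused",
    ("focused", "timer_end"): "idle",
}

def step(state: str, event: str) -> str:
    """Advance the automaton one event."""
    return TRANSITIONS.get((state, event), state)

state = "idle"
for event in ("timer_start", "phone_notification", "phone_away", "timer_end"):
    state = step(state, event)
state  # -> "idle": a full work cycle, with the distraction handled and recovered
```

The table format makes interventions explicit: deleting the ("focused", "phone_notification") entry, by removing the phone, deletes the distracted branch entirely.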

Information Theory

Shannon entropy for measuring uncertainty:
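A direct implementation of the Shannon entropy formula, H = -Σ p·log₂(p), over a distribution of outcomes:

```python
from math import log2

def shannon_entropy(probs):
    """H = -sum(p * log2 p) in bits; zero-probability outcomes contribute nothing."""
    return -sum(p * log2(p) for p in probs if p > 0)

shannon_entropy([0.5, 0.5])    # fair coin: 1.0 bit of uncertainty
shannon_entropy([0.99, 0.01])  # near-certain outcome: ~0.08 bits
shannon_entropy([1.0])         # fully determined: 0.0 bits
```

In the behavioral reading, a reliably installed habit is a low-entropy process: knowing the context tells you almost everything about what happens next.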

Caching and Compilation

Program optimization for automation:

  • Neural pathway caching reduces activation cost
  • Compiled routines (automatic) vs interpreted execution (effortful)
  • Trade-off: compilation time vs execution efficiency
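The trade-off can be watched directly with Python's functools.lru_cache, standing in here for a compiled habit: the first execution pays full cost, repeats are near-free, and the cached routine runs without re-deriving anything:

```python
from functools import lru_cache

real_calls = {"count": 0}

@lru_cache(maxsize=None)
def routine(n):
    """Stands in for any effortful, 'interpreted' behavior."""
    real_calls["count"] += 1
    return sum(i * i for i in range(n))

routine(10_000)  # first run: full computation
routine(10_000)  # cache hit: result returned without re-execution
real_calls["count"]  # -> 1: the expensive body ran only once
```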

Resource Management

Computational budgets for allocation:
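A sketch of budgeted allocation, with hypothetical task names, costs, and values: spend a fixed "willpower" budget on the highest value-per-cost operations first, a greedy knapsack heuristic:

```python
def allocate(budget: float, tasks):
    """Greedy heuristic: schedule tasks by value-per-cost until the budget runs out."""
    scheduled = []
    for name, cost, value in sorted(tasks, key=lambda t: t[2] / t[1], reverse=True):
        if cost <= budget:
            scheduled.append(name)
            budget -= cost
    return scheduled

# Hypothetical daily budget of 8 units and (name, cost, value) tasks.
tasks = [("deep_work", 5.0, 10.0), ("email", 1.0, 1.0), ("gym", 3.0, 5.0)]
allocate(8.0, tasks)  # -> ["deep_work", "gym"]; email loses out under scarcity
```

Note what the model predicts: shrink the budget to 1 unit and only the cheapest task fits, regardless of how valuable the others are, which is the willpower-depletion pattern in one line.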

None of these are merely metaphorical. They describe actual mechanisms using precise mathematical language that enables quantitative analysis and systematic optimization.

Computational Thinking as Transferable Skill

The advantage of computational framing: if you understand algorithms, state machines, caching, and resource allocation from programming, you can transfer that understanding directly to behavior without learning a new framework.

Reading "your habits are cached subroutines" immediately makes sense if you understand caching in CPUs. You know:

  • Caching trades setup time for execution speed
  • Frequently-used operations should be cached
  • Cache needs warming period (30 days for neural pathways)
  • Cached operations run automatically without conscious control

You transfer entire body of knowledge from CS to behavior instantly. Non-programmers must build this understanding from scratch. Programmers get it immediately through transfer learning.

This is why the mechanistic mindset is particularly powerful for technical people—they already think algorithmically about software systems. This framework just says: apply the same thinking to yourself.

Why Not Other Frameworks?

Physics Language

Physics provides precise mathematical description but wrong abstraction level for behavior. Describing neural firing in terms of ion channel dynamics is accurate but useless for intervention. You cannot directly manipulate ion channels. You can manipulate inputs to computational processes.

Economic Language

Economics provides optimization frameworks (utility maximization, rational choice) but the wrong assumptions. Humans are not perfectly rational optimizers. They are boundedly rational computers with limited working memory, cognitive biases, and satisficing heuristics. Computational language captures these limitations naturally (finite memory, imperfect algorithms), where economic language treats them as deviations from an ideal.
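The difference is visible in code: a satisficer stops at the first option clearing an aspiration threshold and pays a bounded search cost, while certifying the true optimum requires examining everything. The options here are hypothetical:

```python
def satisfice(options, good_enough):
    """Bounded-rational search: take the first option meeting the aspiration level."""
    for examined, (name, value) in enumerate(options, start=1):
        if value >= good_enough:
            return name, examined
    return None, len(options)

def optimize(options):
    """Ideal-rational search: must scan every option to certify the maximum."""
    best = max(options, key=lambda o: o[1])
    return best[0], len(options)

options = [("a", 3), ("b", 7), ("c", 9), ("d", 6)]
satisfice(options, good_enough=6)  # -> ("b", 2): stops early, pays less
optimize(options)                  # -> ("c", 4): best answer, full search cost
```

Economic language treats the satisficer as defective; computational language treats it as a sensible algorithm for a machine with a finite search budget.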

Psychological Language

Psychology describes mental states (motivation, emotion, cognition) but at abstraction level that obscures mechanism. "Low motivation" describes output. Expected value calculator describes mechanism producing that output. Mechanism description enables intervention.

Biological Language

Biology describes physical substrate (neurotransmitters, brain regions) but causality is indirect. You cannot directly increase dopamine. You can modify inputs to computational processes that regulate dopamine through feedback loops. Computational level is where intervention is possible.

Comparison table:

| Framework | Abstraction | Mechanistic Depth | Actionability | Transfer Learning |
|---|---|---|---|---|
| Moral | Very High | None | Very Low | None |
| Psychological | High | Low | Low | Low |
| Economic | Medium | Medium | Medium | Medium |
| Biological | Low | High | Low | Low |
| Computational | Medium | High | High | Very High |

Computational hits sweet spot: deep enough to reveal mechanism, abstract enough to be actionable, universal enough to enable transfer learning.

Integration with Mechanistic Framework

Every major framework in this wiki uses computational language as foundation:

  • State Machines - Direct import from CS formal automata theory
  • Activation Energy - Computational cost measured in resource units
  • Willpower - Resource allocation and budget management
  • Working Memory - RAM analogy is not metaphor but accurate abstraction
  • Question Theory - Questions as Cypher queries with measurable complexity
  • Caching - Neural pathway compilation through repetition
  • Prevention - Removing expensive operations from execution path
  • Cybernetics - Control loops as computational feedback systems

These are not decorative computational metaphors applied to behavior. They are recognition that behavior is computation and should be described using computational language at appropriate abstraction level.

Resolution and Computational Thinking

Pedagogical magnification reveals why computational thinking transfers so effectively across domains. Computation exists at every resolution—from quantum mechanics to conscious decision-making to civilization-scale emergence. Intelligence is not operating at maximum resolution but matching resolution to available compute and desired intervention.

Most CS pedagogy actually gets this right: teaching Python or JavaScript (macroscopic) before assembly or memory management (microscopic). Students learn what programs accomplish before diving into how computers execute instructions. This natural sequence—macro before micro—builds intuition that makes lower-level details meaningful when students eventually encounter them.

With AI handling resolution translation, computational literacy becomes distinct from coding. Students can operate at intention-level resolution (describe what they want), with AI managing implementation-level details. They learn computational thinking through causality (does it work?) rather than syntax (did I write it correctly?). They zoom into machinery only when curiosity or debugging demands it, not because arbitrary prerequisites require it.

Key Principle

Computation is the universal language for mechanism description - Any system transforming inputs to outputs through rules is performing computation. Computational language provides precise mechanism description at an actionable abstraction level. Algorithmic complexity reveals computational cost objectively. State machines, caching, resource allocation, information theory: these are not metaphors but accurate descriptions of actual mechanisms in biological systems. The Church-Turing thesis establishes computational equivalence across substrates. Wolfram's principle suggests all non-trivial systems perform computations of equivalent sophistication. This makes computational thinking universally transferable. If you understand algorithms and resource management from programming, you understand them in behavior. The transfer is direct because the substrate is identical: rule-based transformations of information under resource constraints. Computational language gives access to the entire toolkit of computer science for debugging, optimizing, and engineering behavior. Not because behavior is "like" computation, but because behavior is computation at the appropriate abstraction level.


Computation is not metaphor we apply to behavior. Computation is what behavior is when described at the mechanistic level. Accept this and the entire toolkit of computational thinking becomes available for systematic behavioral engineering.