Brain Mechanism 4 of 19

Hebbian Learning

Neurons that fire together wire together.

What is it

In 1949, psychologist Donald Hebb proposed a simple rule: when neuron A repeatedly helps activate neuron B, the connection between them strengthens. The famous summary — "neurons that fire together wire together" — captures one of the most fundamental principles in neuroscience.

Hebbian learning is different from simple repetition (LTP). LTP says: use a connection more, it gets stronger. Hebb says: when two things are active at the same time, a connection forms between them. It's the difference between practising a single note and learning that a chord sounds right. Co-activation creates association.
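The co-activation rule can be written in a few lines. This is a minimal sketch, not ThetaOS code: the function name, learning rate, and 0-to-1 activations are illustrative assumptions. The point is that the weight only moves when both sides are active at once.

```python
# Minimal Hebbian update (illustrative sketch): the weight between two
# neurons grows in proportion to their simultaneous activity.
def hebbian_update(w, pre, post, lr=0.1):
    """Return the new weight after one time step.

    pre, post: activations (0.0 to 1.0) of the two neurons.
    lr: learning rate (hypothetical value).
    """
    return w + lr * pre * post

w = 0.0
# Repeated co-activation strengthens the connection...
for _ in range(5):
    w = hebbian_update(w, pre=1.0, post=1.0)

# ...but activity on only one side leaves it unchanged:
# pre * post is zero, so nothing is learned.
w_solo = hebbian_update(w, pre=1.0, post=0.0)
```

Because the update multiplies the two activations, a connection between things that never fire together stays at zero, which is exactly the "co-activation creates association" idea in rule form.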

This is how your brain builds categories, links memories to emotions, and connects a smell to a place you visited twenty years ago. Simultaneity is the teacher.

What it does in the brain

Hebbian learning is how associations form. You hear a song while falling in love, and forever after the song triggers the emotion. The auditory neurons and the emotional neurons fired together, so they wired together. Neither caused the other — they simply co-occurred, and the brain treated co-occurrence as meaning.

This mechanism is also how the brain forms clusters. You meet someone at work, at a conference, and at a dinner. Each time, neurons representing that person fire alongside neurons representing professional context. Eventually the person IS professional context in your neural map — even if you've never consciously categorised them.

The dark side of Hebbian learning is bias. If two things happen to co-occur by accident and the brain encodes them as linked, you get a false association. Stereotypes are Hebbian learning gone wrong: accidental co-occurrence encoded as meaningful association.

What it does in ThetaOS

Layer 7 (Hebbian Co-occurrence) and Layer 9 (Hebbian Clustering) implement exactly this principle. When two entities appear together in the same context — the same photo, the same meeting, the same transaction batch, the same week — the system creates or strengthens a link between them. Nobody told it these things are related. Co-occurrence told it.
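A sketch of what Layer 7's behaviour could look like, under stated assumptions: the `observe_context` function, the pairwise link table, and the strength increment are all hypothetical names and values, not the actual ThetaOS implementation. The mechanism is the described one: every shared context creates or strengthens a link between each pair of entities in it.

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical co-occurrence store: (entity_a, entity_b) -> link strength.
# Pairs are stored in sorted order so (a, b) and (b, a) are the same link.
links = defaultdict(float)

def observe_context(entities, boost=1.0):
    """Strengthen the link between every pair of entities that
    appear together in one context (photo, meeting, week, ...)."""
    for a, b in combinations(sorted(set(entities)), 2):
        links[(a, b)] += boost

# Nobody declares these relationships; the contexts do.
observe_context({"peter", "den_haag", "photo_0142"})
observe_context({"peter", "den_haag", "meeting_0317"})
observe_context({"anna", "meeting_0317"})
```

After these three contexts, the `peter`/`den_haag` link has been strengthened twice while `anna`/`den_haag` does not exist at all: co-occurrence, not labelling, built the graph.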

This is how ThetaOS discovers relationships that were never explicitly entered. A person and a location that appear together in 14 different weeks get a Hebbian connection, even if no one ever said "this person goes to this place." The data said it by co-firing.

Peter Ros and the city of The Hague: they co-occur in photos, in calendar entries, in text mentions. No one labelled Peter as "The Hague." But across 153 photo-days and 92 text mentions, the Hebbian layer discovered it. Ask Tom about Peter and The Hague surfaces automatically — not because it was filed there, but because the neurons fired together.

Layer 9 goes further: it clusters entities that share Hebbian connections. If A co-occurs with B, and B co-occurs with C, but A and C never directly co-occur, the system still recognises them as part of the same cluster. This is how ThetaOS builds its network graph — the same way your brain builds its social map.
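The transitive grouping can be sketched as connected components over the direct links, here with a small union-find; this is an illustrative reconstruction, not the Layer 9 code, and ignores link strengths and thresholds the real system presumably applies.

```python
# Hypothetical sketch of Hebbian clustering: entities joined by direct
# co-occurrence links are grouped transitively, so A-B plus B-C puts
# A and C in one cluster even though A and C never directly co-occur.
def clusters(links):
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    for a, b in links:          # union the two endpoints of each link
        parent[find(a)] = find(b)

    groups = {}
    for node in list(parent):   # collect members by their root
        groups.setdefault(find(node), set()).add(node)
    return list(groups.values())

# A co-occurs with B, B with C; A and C end up clustered anyway,
# while the unrelated X-Y pair forms its own cluster.
result = clusters([("A", "B"), ("B", "C"), ("X", "Y")])
```

Union-find is a natural fit here because links only ever merge groups, mirroring how co-occurrence only ever adds associations.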

Built — Layers 7 + 9