Before Gottfried Wilhelm Leibniz published his binary arithmetic in 1703. Before IBM built its first mainframe. Before the transistor, the microchip, or the byte — Yoruba priests in West Africa were already performing binary logic by hand, casting palm nuts across divining trays and reading the universe through a 256-symbol combinatorial system that modern computer scientists have spent decades trying to fully understand.
The Ifá oracle is not folklore dressed up in technical language. It is a formally structured information system — and the mathematics underneath it are identical to the mathematics that power every smartphone, data center, and AI model on the planet today.
Africa’s Original Binary System
The Ifá divination corpus is built around 256 sacred signs called Odù. Each Odù is generated through a binary casting process: a diviner — called a babalawo — holds 16 palm nuts in one hand and strikes at them rapidly with the other, recording after each pass whether one or two nuts remain. A single remaining nut produces a double mark; two remaining nuts produce a single mark. Eight rounds of casting produce eight marks, conventionally arranged as two columns of four, each mark taking one of two possible values.
Two values. Eight positions. The arithmetic is immediate: 2⁸ = 256 unique Odù combinations.
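The arithmetic of the cast can be sketched in a few lines of Python. This is an illustration of the bit-packing, not a claim about ritual practice: the function name `cast_odu` and the 0/1 encoding of single and double marks are assumptions made for the sketch.

```python
import random

def cast_odu(rng: random.Random) -> int:
    """Simulate eight binary casts, packing each mark into one bit.

    Each cast leaves either one or two palm nuts behind -- a binary
    outcome. Eight casts therefore select one of 2**8 = 256 Odù.
    """
    marks = [rng.randint(0, 1) for _ in range(8)]  # 0 = single mark, 1 = double mark (illustrative encoding)
    index = 0
    for mark in marks:
        index = (index << 1) | mark  # the same packing a byte uses
    return index

odu = cast_odu(random.Random(42))
assert 0 <= odu <= 255  # always lands inside the 256-sign space
```

Eight two-valued positions, however they are encoded, can never produce more or fewer than 256 distinct outcomes.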
That is not a coincidence. That is binary notation — the same logical foundation that Leibniz would later encode into mathematics, that Claude Shannon would translate into information theory in 1948, and that engineers at Bell Labs would eventually etch into silicon. Africa’s tech ecosystem has spent the last decade arguing that the continent is not catching up to the world — it is returning to a form of leadership it never fully surrendered. The Ifá system is one of the most compelling pieces of evidence for that argument.
What IBM Actually Found
The connection between Ifá and computing is not a speculative claim made by cultural nationalists. It has been documented by researchers operating within mainstream computer science and information theory.
Ron Eglash, an American mathematician and cultural theorist, conducted field research across Africa in the 1990s and documented binary and fractal mathematical structures embedded in indigenous systems — including Ifá. His 1999 book, African Fractals, demonstrated that Yoruba diviners were not simply reciting memorized verses. They were executing a formal algorithmic process: randomized binary input, structured lookup across 256 possible outputs, with each Odù mapping to a body of encoded knowledge covering medicine, social guidance, historical precedent, and cosmological interpretation.
Eglash’s finding — that Leibniz’s binary system and Ifá share not just structural similarity but a possible historical lineage running through Arabic and European geomancy, binary divination systems Leibniz himself encountered — reframed the question entirely. Binary logic did not emerge from one civilization. It was discovered, independently and in elaborate form, by at least two. One of them was Yoruba.
IBM’s interest followed the same trail. Researchers affiliated with the company studied Ifá’s combinatorial structure as part of broader investigations into non-Western knowledge systems and their relevance to computational design. The 256-Odù system — with its structured lookup table, randomized input mechanism, and encoded output corpus — maps cleanly onto the architecture of early expert systems and database design. A babalawo performing a consultation is, in information science terms, running a query against a 256-record knowledge base using a binary random-number generator as the input interface.
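That query-against-a-knowledge-base framing can be made concrete. The sketch below is purely illustrative: `ODU_CORPUS` and `consult` are hypothetical names, and the placeholder strings stand in for an oral corpus far richer than any lookup table.

```python
import random

# Hypothetical 256-record knowledge base: Odù index -> encoded corpus entry.
# The real corpus spans medicine, precedent, and cosmology; strings stand in here.
ODU_CORPUS = {i: f"verses and guidance for Odù {i:08b}" for i in range(256)}

def consult(rng: random.Random) -> tuple[int, str]:
    """One consultation, in information-science terms:

    a binary random-number generator supplies the input (eight casts,
    one byte), and a structured lookup retrieves one of 256 records.
    """
    key = rng.getrandbits(8)        # eight binary casts -> one 8-bit value
    return key, ODU_CORPUS[key]     # structured lookup over 256 outputs

key, record = consult(random.Random())
assert record == ODU_CORPUS[key]
```

The interpretive work the babalawo then performs on the retrieved record is precisely the part the lookup table cannot model — which is the article's point about inference below.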
The Architecture Is Exact
The precision of the parallel deserves closer examination. Modern computing uses the byte as its base unit — eight binary digits producing 256 possible values, from 0 to 255. Standard ASCII defines 128 characters; the 8-bit character sets that extend it fill out all 256 byte values, and IPv4 address octets run from 0 to 255. The number is not arbitrary. It emerges naturally from the mathematics of binary systems taken to eight positions.
Ifá arrives at the same number through the same mathematics, applied not to transistors but to palm nuts. The 16 principal Odù — the root signs from which all 256 are derived — function as a kind of base layer, analogous to the way hexadecimal (base-16) notation serves as shorthand for binary in programming contexts. The structural elegance is not superficial. It runs through the entire system.
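The numerical claims in the two paragraphs above can be checked directly; every line below is ordinary binary arithmetic, with the Odù analogy carried in the comments.

```python
# The byte arithmetic the article describes, verified directly.
assert 2 ** 8 == 256                # eight binary positions
assert max(range(256)) == 255       # values run 0..255, like IPv4 octets

# Base-16 as shorthand for binary: two hex digits name one byte,
# just as the 16 principal Odù, taken in pairs, name all 256 signs.
assert 16 * 16 == 256
assert int("ff", 16) == 0b11111111 == 255
assert f"{0b10110010:02x}" == "b2"  # eight bits collapse to two hex digits
```

The hexadecimal analogy is structural, not historical: both systems compress an 8-position binary space by naming its two 4-position halves from a 16-element alphabet.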
Each of the 256 Odù carries an associated corpus of oral literature, herbal knowledge, historical narrative, and prescriptive guidance. The divination process does not produce a single answer — it produces a structured output that the babalawo then interprets in context. That interpretive layer is, functionally, what machine learning researchers now call inference: the application of a trained model to a new input to produce a contextually appropriate output. Nigeria’s ambition to position itself as a continent-wide AI leader draws on a deep well of indigenous logical tradition that rarely gets named in the pitch decks.
The Credit Problem
None of this has been adequately absorbed into the mainstream narrative of computing history. Textbooks trace binary logic from Leibniz to Boole to Shannon to von Neumann. The story begins in Europe and ends in California. Africa does not appear.
This is not simply an oversight. It reflects a structural problem in how the history of knowledge is recorded and attributed. Ifá’s binary architecture was not written down in a form that European mathematicians encountered directly. It existed as oral tradition, embodied practice, and initiatory knowledge — forms that colonial epistemology categorized as religion or superstition rather than science or mathematics. The result is that a system which demonstrably preceded and parallels one of the foundational structures of computing is discussed, when it is discussed at all, as a curiosity rather than a contribution.
The question worth asking is not whether Ifá influenced Leibniz. It almost certainly did not, at least not directly. The question is what it means that an independent civilization developed binary logic to the same level of elaboration — 256 combinations, structured lookup, randomized input, contextual inference — centuries before computing existed as a concept. It means that binary logic is not a European invention. It is a human discovery. And Africa got there first, or simultaneously, depending on which thread of the historical record you follow.
What This Should Mean for African Tech
The African tech ecosystem has built its contemporary identity around the idea of leapfrogging — skipping fixed-line telephony for mobile, skipping branch banking for mobile money, skipping legacy infrastructure for cloud-native architecture. The framing is accurate but incomplete. Leapfrogging implies a race where Africa started late and is now catching up by skipping rungs. The Ifá evidence suggests a different frame entirely: Africa did not miss the early rounds of logical and computational thinking. It contributed to them, without receiving the attribution.
That reframing matters practically. African AI researchers building large language models for Yoruba, Igbo, Swahili, and Amharic are not simply applying Western tools to African languages. They are extending a tradition of information encoding that the Yoruba intellectual tradition pioneered in a different medium. The babalawo and the machine learning engineer are, at a structural level, doing the same thing: building systems that take uncertain inputs, map them against encoded knowledge, and produce contextually meaningful outputs.
This does not make the contemporary work easier. Funding gaps, infrastructure constraints, and the continued underrepresentation of African researchers in global AI institutions are real and urgent problems. But the historical record should at minimum complicate the narrative that positions Africa as a recipient of technological knowledge rather than a source of it.
The 256 Odù were here before the byte. That is worth saying clearly, and worth saying often.
The views expressed are the author’s own.