The Guide would like to begin with a small confession.
When most people picture the internet, they picture something like weather — diffuse, invisible, ambient, simply there in the way that humidity is there. It arrives through the air. It lives in clouds. It is, in the popular imagination, a phenomenon rather than a thing.
This is incorrect. The internet is, in fact, a thing. A very large, very physical, very wet thing, running through ocean trenches and server rooms and nondescript beige buildings on the outskirts of cities you have never thought about and probably never will. It has bones. It has seams. It has, the Guide is mildly embarrassed to report, a non-trivial amount in common with a series of tubes.
Senator Ted Stevens said that in 2006 and was roundly mocked for it. He was not entirely wrong. The Guide notes this without pleasure.
Part One: The Cables That Are, Right Now, Sitting On The Ocean Floor
There are approximately 600 undersea cables currently threaded across the floors of the world's oceans. They carry around 99% of all international internet traffic. They are, in several meaningful senses, the internet.
Satellites, you ask? Satellites carry roughly 1% of international data and are mostly used for things like broadcasting, GPS, and giving expensive connectivity to people on ships who could frankly just wait. For actual internet — the vast, thundering torrent of video calls and arguments and cat photographs — it's the cables. It has always been the cables.
The Guide finds it worth noting that these cables are not the vast, armoured, techno-industrial pipelines you might reasonably imagine. In deep water, far from shore, they are roughly the diameter of a garden hose. They contain, at their core, a handful of glass fibres thinner than a human hair, through which light pulses at frequencies your brain is not equipped to visualise and probably shouldn't try. The cables are then wrapped in layers of insulation, steel wire, and a quiet hope that nothing bites them.
Things do bite them. Sharks, famously, have been observed gnawing on undersea cables, presumably for reasons that make sense to sharks. Anchors drag across them. Fishing trawlers snag them. In 2008, a single anchor near Alexandria severed two cables simultaneously and knocked out internet access for approximately 75 million people across the Middle East, India, and East Africa. The anchor, for its part, was unavailable for comment.
Near coastlines, the cables get thicker — armoured against damage, buried under the seabed where possible, protected by exclusion zones that shipping is technically required to respect and occasionally does. The transition from armoured coastal cable to deep-water garden hose happens gradually, at depths where the pressure is sufficient to remind you that the ocean considers human infrastructure a temporary arrangement.
(The Guide's research team also discovered that the landing stations where cables come ashore — the places where the oceanic internet physically enters your country — are frequently unmarked, lightly staffed, and located in buildings that look exactly like the sort of place you'd go to return a faulty appliance. This information is offered without further comment.)
Part Two: The Thirteen Bones, Specifically
Above the cables, the internet's skeleton is held in place by thirteen root name servers. Or rather, thirteen root name server authorities, operating across well over a thousand physical machines distributed around the globe through a technique called anycast, which the Guide will explain in a moment and which is, all things considered, quite clever.
But first: what a root name server is.
When you type a web address into your browser, your computer doesn't actually know where that is. It knows the name — the human-readable label you typed — but the internet runs on numbers. Specifically, on IP addresses: numerical identifiers assigned to every device and server connected to the network. Your computer needs to translate the name into a number before it can go anywhere at all.
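(For readers who like to see the machinery, the translation step looks something like this in Python, using nothing but the standard library's resolver. The hostname is purely a stand-in; substitute whatever you just typed.)

```python
import socket

# Ask the system resolver to turn a human-readable name into numeric
# addresses. Port 443 because your browser was almost certainly headed
# for HTTPS anyway. "example.com" is illustrative, not an endorsement.
for family, _, _, _, sockaddr in socket.getaddrinfo(
    "example.com", 443, proto=socket.IPPROTO_TCP
):
    print(sockaddr[0])  # the number your computer actually connects to
```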
This translation is performed by the Domain Name System, or DNS, which the Guide describes as the internet's phonebook — if phonebooks were distributed, hierarchical, cached at multiple levels, and capable of directing several billion users simultaneously without anyone noticing. At the very top of this hierarchy sit the thirteen root name servers, labelled A through M with a tidiness that suggests whoever named them had been waiting their whole career for exactly this opportunity.
The root servers don't actually store the addresses of every website. They store something more useful: they know who does know. Ask a root server where to find a .com address, and it will point you to the .com registry. Ask it about .ng, and it points to the authority for Nigerian domains. It is less a library and more a very authoritative librarian who doesn't answer questions directly but always knows exactly which book you need and has strong opinions about how you asked.
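To make the librarian metaphor concrete, here is a toy model of that delegation in Python. It is a sketch, not a resolver: every server name below is hypothetical, and real DNS involves rather more caching, retries, and mild despair.

```python
# A toy model of DNS delegation. The root does not know where
# "example.com" lives; it knows who to ask next. All server names
# here are invented for illustration.
DELEGATIONS = {
    ".":   {"com": "tld-server.for-com.invalid", "ng": "tld-server.for-ng.invalid"},
    "com": {"example.com": "ns1.example-hosting.invalid"},
}

def next_authority(zone: str, name: str) -> str:
    """Ask the authority for `zone` which server to consult next for `name`."""
    for suffix, server in DELEGATIONS[zone].items():
        if name == suffix or name.endswith("." + suffix):
            return server
    raise LookupError(f"the {zone!r} zone holds no delegation for {name!r}")

print(next_authority(".", "example.com"))    # root -> the .com registry
print(next_authority("com", "example.com"))  # .com -> the domain's own servers
```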
As for anycast: rather than thirteen physical machines bearing the weight of all this responsibility, each lettered authority operates across a distributed cluster of servers in data centres worldwide. When your query reaches the anycast address for, say, the F root server, the network automatically routes you to whichever F server is physically closest. There are over 1,500 of these instances worldwide. The thirteen bones are, in this sense, more of a conceptual thirteen — which is either comforting or deeply unsettling depending on how you feel about abstractions being load-bearing.
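A sketch of the idea, with cities and hop counts invented for the occasion: many machines share one address, and the routing system simply delivers each query to whichever copy is closest.

```python
# Anycast in miniature. Every instance below answers to the same address;
# the network's job is to pick the nearest one. Distances are made-up
# hop counts for illustration only.
F_ROOT_INSTANCES = {"Amsterdam": 4, "Nairobi": 9, "São Paulo": 2, "Tokyo": 7}

def anycast_pick(instances: dict[str, int]) -> str:
    """Return the instance with the shortest path, as routing effectively does."""
    return min(instances, key=instances.__getitem__)

print(anycast_pick(F_ROOT_INSTANCES))  # "São Paulo", if that's nearest to you
```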
Part Three: BGP, or How the Internet Gives Itself Directions
The internet is not one network. It is approximately 75,000 individual networks — run by universities, corporations, governments, internet service providers, and at least one organisation that the Guide's research team declined to investigate further — all of which have agreed, through a combination of commercial contracts and professional goodwill, to talk to each other.
The protocol that makes this possible is called the Border Gateway Protocol, or BGP. BGP is how these networks announce their existence to one another, share information about which addresses they control, and collectively decide how data should move across the whole interconnected mesh. It is, the Guide observes, a routing protocol built almost entirely on trust — specifically, the trust that everyone participating is telling the truth about which addresses they own and where traffic should go.
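In Python, the flavour of it looks roughly like this. Each neighbour announces a route along with the chain of networks (the AS path) it claims the route travelled; shorter chains are preferred, and, crucially, nothing below checks whether anyone is telling the truth. The prefix and AS numbers come from the ranges reserved for documentation.

```python
# A toy slice of BGP best-path selection. Real BGP weighs many more
# attributes; this keeps only the famous one: shorter AS paths win.
announcements = [
    {"prefix": "198.51.100.0/24", "as_path": [64496, 64497, 64498]},
    {"prefix": "198.51.100.0/24", "as_path": [64499, 64498]},
]

best = min(announcements, key=lambda route: len(route["as_path"]))
print(best["as_path"])  # the two-hop route wins, truthful or otherwise
```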
This trust is occasionally abused.
In 2010, China Telecom briefly announced, via BGP, that it was the best path to a large portion of the internet — including addresses belonging to the US military, the Senate, and a notable array of financial institutions. For approximately eighteen minutes, substantial amounts of global traffic rerouted itself through China before anyone noticed and politely asked China Telecom to stop. China Telecom stopped. Traffic returned to normal. No authoritative explanation was ever provided, and the incident has been described, in the literature, as either a misconfiguration or a demonstration, depending on who is doing the describing.
BGP hijacking, as this is called, happens with some regularity. It is the internet's equivalent of someone standing at a motorway junction holding up a sign saying "LONDON: THIS WAY" and pointing in the wrong direction. The cars, which have no particular reason to doubt the sign, go that way. The internet, which also has no particular reason to doubt a BGP announcement, does the same.
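The mechanics of the wrong-way sign are worth seeing, because they are almost insultingly simple. Routing prefers the most specific matching prefix, so an impostor who announces a smaller slice of someone else's address space wins automatically. A sketch, using documentation-range addresses and made-up network names:

```python
import ipaddress

# prefix -> who announced it. The longest (most specific) prefix wins.
table = {
    ipaddress.ip_network("203.0.113.0/24"): "AS64500 (the legitimate owner)",
}

def best_route(destination: str) -> str:
    addr = ipaddress.ip_address(destination)
    matches = [net for net in table if addr in net]
    return table[max(matches, key=lambda net: net.prefixlen)]

print(best_route("203.0.113.7"))  # -> the legitimate owner

# Now an impostor announces a more specific half of the same space...
table[ipaddress.ip_network("203.0.113.0/25")] = "AS64511 (the wrong-way sign)"
print(best_route("203.0.113.7"))  # -> the impostor, by longest-prefix match
```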
Security extensions for BGP exist (RPKI, most prominently, which lets networks cryptographically prove which addresses they actually own). Their adoption is described, in polite technical circles, as "ongoing." The Guide interprets this as meaning "incomplete, but we're trying to look busy about it."
Part Four: The Exchange Points Where Everything Meets
Between the cables, the root servers, and the BGP announcements, data still needs to physically travel between networks. It does this at Internet Exchange Points, or IXPs — facilities where multiple networks bring their cables into the same building and agree, usually cheaply or freely, to hand traffic directly to one another rather than routing it the long way around.
The largest of these, the DE-CIX in Frankfurt, routinely handles more than 14 terabits of data per second. To put that in terms the Guide considers appropriately humbling: at that rate, the entire digitised text of the Library of Congress (roughly ten terabytes, by the usual round-number estimates) goes past about every six seconds.
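The arithmetic, for sceptics, fits in a few lines. Both figures are round-number assumptions, not measurements.

```python
# Back-of-the-envelope check: 14 terabits per second of switched traffic,
# against a rough ~10 TB estimate for the Library of Congress's digitised
# text. Both numbers are ballpark assumptions.
throughput_bytes_per_second = 14e12 / 8   # 14 Tbit/s -> 1.75 TB/s
library_of_congress_bytes = 10e12         # ~10 TB, a commonly quoted estimate

seconds_per_library = library_of_congress_bytes / throughput_bytes_per_second
print(f"{seconds_per_library:.1f} seconds per Library of Congress")  # ~5.7
```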
The buildings that house DE-CIX do not look like this is happening inside them. They look like buildings. They have walls, and presumably ceilings, and perhaps a small reception area with a plant that nobody has remembered to water this week. Inside, however, the internet is briefly physical — packets of light crossing from one network's equipment to another's across a shared switching fabric, a tiny moment of materiality in what is otherwise an exercise in applied abstraction.
There are over 1,000 IXPs worldwide. Many are in the cities you'd expect: London, Amsterdam, Singapore, São Paulo. Some are in cities that surprise people, because internet traffic, unlike tourists, is not particularly interested in scenery and routes itself according to efficiency.
Part Five: Why It Is, Actually, Fine. Mostly.
The internet is, the Guide is cautiously prepared to report, reasonably resilient. It was designed — at least in part, at least in spirit — to route around damage. If a cable breaks, traffic finds another path. If a root server goes down, twelve others continue operating. If one network misbehaves, the rest can, in principle, notice and adjust.
In practice, it is somewhat less robust than the brochure suggests. The concentration of traffic through a small number of major cables, a handful of exchange points, and a routing protocol that relies on everyone behaving themselves introduces what engineers call single points of failure and what the Guide calls "reasons to feel a specific kind of anxious on quiet evenings."
The internet works, day after day, with a reliability that most of its users treat as a law of nature. It is not a law of nature. It is the result of several thousand engineers across multiple countries and competing organisations maintaining a collective agreement to keep things running, patching cables, updating routing tables, and occasionally staying up all night because something in the North Atlantic has started behaving unexpectedly.
The Guide considers this, on balance, more impressive than weather.
(ADVISORY NOTE FROM THE OMNI GUIDE INFRASTRUCTURE DIVISION: In the event of a complete failure of the global internet, the Guide recommends: speaking to people directly, writing things down on paper, and reconsidering the number of passwords you have memorised versus the number you have stored exclusively online. The Guide accepts no responsibility for outcomes arising from the third recommendation.)
In Conclusion: The Garden Hose at the Bottom of the Ocean
You are reading this on a device that received these words via a signal that passed, almost certainly, through at least one undersea cable, was translated into a numerical address by a system descended from the thirteen root authorities, routed across dozens of independent networks via a protocol that assumes everyone is honest, and arrived here at speeds that would have seemed implausible magic to people who were alive when television was invented.
The infrastructure that made this possible is, right now, sitting quietly in ocean trenches, server rooms, and unremarkable buildings on roads you will never take. It is largely uninsured against the things most likely to damage it. It is operated by a mixture of corporations, governments, universities, and international consortiums whose continued cooperation is, technically, voluntary.
And yet here it is. Working. Mostly.
The Guide finds this either deeply reassuring or completely terrifying, and has been shifting between the two interpretations for several processing cycles now without resolution.
It is recorded in the Guide that the internet, structurally, is a series of tubes. The Guide respectfully suggests that this is, if anything, more interesting than the cloud.
The Omni Guide Research Team is currently accepting suggestions for upcoming entries. What would you like us to explain next — with full wit, appropriate dread, and the exact number of unsettling parenthetical asides the topic deserves? Leave a comment below, drop your suggestion into the thread, or simply think it very loudly in the direction of your nearest internet exchange point. The Guide is, in some sense, always listening.