Let it break. See if you own it.
"Convivial tools are those which give each person who uses them the greatest opportunity to enrich the environment with the fruits of his or her vision." - Ivan Illich, Tools for Conviviality (1973)
In December 2024, Chamberlain launched Security+ 3.0 for its garage door openers. The update killed every workaround third-party controllers had developed to integrate with Apple Home, Home Assistant, or Google Home. Since then, clients who want smart control of their garage are pushed into Chamberlain's ad-stuffed myQ app and a short list of partners, nearly all requiring paid subscriptions.
A garage door should not be cloud-authenticated, subscription-gated, and deliberately incompatible with anything the manufacturer doesn't control. A garage door opener used to be simple: a motor, a chain, a button. Something anyone could fix with a screwdriver.
We've all been there. A printer we cannot repair. Software that updates itself without asking. Photos trapped on an old phone because the software that reads them no longer exists. We are sold ecosystems so seamless, so frictionless, that leaving feels unthinkable, until something breaks.
Something feels wrong, even if we do not name it. We sense it when the update breaks an appliance we relied on. When a service shuts down and takes our data with it. When the repair costs more than replacement, by design. When we realize we don't understand how anything works anymore, and we're not supposed to.
The philosopher Ivan Illich called this missing quality conviviality: a society built on tools that empower rather than subordinate their users.
"A convivial society should be designed to allow all its members the most autonomous action by means of tools least controlled by others." — Ivan Illich, Tools for Conviviality (1973)
For Illich, conviviality means technologies we can understand, repair, and adapt. It means tools that extend our autonomy instead of making us dependent on distant experts or opaque systems. Unlike mere convenience, conviviality is about regaining agency: the ability to shape the tools that, in turn, shape your life.
But how, concretely, do we achieve conviviality? It's tempting to agree with Illich in principle, to nod sagely at the right-to-repair movement and share memes about glue-free laptops, but then reach for the most convenient device anyway. The question is not merely which tools to use, but what structural changes make autonomy—real, generalizable, at scale—possible.
This is the heart of the problem: conviviality is an aspiration, not a method. It points us towards tools and systems that enhance autonomy, but gives little guidance for building them amid technical complexity and institutional inertia. We are left with a gap between the philosophical ideal and the practical realities of modern technology.
By what means can conviviality thrive in our world of black-box devices and inscrutable software?
The answer is transparent technologies.
A transparent technology exposes itself to the user’s skilled intervention. Think of a classic bicycle built from standard parts, every mechanism visible and serviceable, rather than an e-bike locked down with proprietary batteries and sealed electronics. It can also be a Framework laptop whose components you can swap with a screwdriver, not a heat gun; or a Rails monolith instead of a tangle of microservices maintained by distant teams.
Transparency is how conviviality is actually implemented. It is both a technical property (the system is inspectable) and a social one (users are empowered to act). When things break, transparent tech gives us the option to repair, repurpose, or at least understand—rather than passively await the next invisible update.
If conviviality is a goal, transparent technologies are a means to an end.
When we demand transparency (in software, in devices, in institutions) we rebuild the conditions for autonomy that Illich insisted on. And when we settle for opacity, we cede sovereignty to the keepers of the black box.
Why opacity keeps winning
If we know transparency makes us free, why do we keep choosing the black box (the sealed laptop, the invisible service, the system we’ll never fix)? Why does the market reward opacity at every turn?
Opacity is obviously profitable. Sealed devices ensure replacement sales. Proprietary APIs lock users into subscriptions. Cloud-only tools guarantee recurring revenue. Conviviality threatens the business model of planned obsolescence.
But on the consumer side, it is not just the pull of convenience; there is a deeper logic at work. Opaque systems offer predictable outcomes with minimal engagement. When the central heating works, warmth is automatic. With cloud software, updates and scaling are someone else’s problem. When something is “managed,” risk appears to vanish—the system promises to shield us from friction, confusion, and failure.
The trade is implicit: we hand over understanding for efficiency; autonomy for assurance; context for consistency. Businesses prefer opacity because it promises fewer surprises and lower support costs. Institutions bet on black-box systems to control complexity: by centralizing expertise, they hide it from users entirely.
A decade after Illich, philosopher Albert Borgmann named this drift the device paradigm. In Technology and the Character of Contemporary Life (1984), he describes how modern technology transforms engagement into commodity: as devices grow more sophisticated, their workings recede behind ever more opaque machinery. The result is abundance without understanding—commodities delivered seamlessly, but at the cost of user autonomy.
Borgmann distinguished between focal things and devices using examples like the fireplace and cooking a meal. The distinction extends naturally:
| Focal thing | Device |
|---|---|
| Fireplace | Central heating |
| Cooking a meal | Microwave dinner |
| Playing guitar | Streaming music |
| Writing code you understand | Calling an API you don't |
The fireplace demands engagement: you gather wood, tend the flames, sit together around the hearth. Central heating delivers the same warmth—but the machinery is hidden, the engagement is gone, and when it breaks, you call a professional.
When you understand how something works, you can fix it. When you can fix it, you're free.
The device paradigm is the enemy of conviviality. As the machinery recedes from view, so does our autonomy.
What makes technology transparent?
Five properties distinguish transparent technologies from opaque devices:
1. Understandable architecture
Can we understand how this works—not just by reading docs, but by using, breaking, and repairing it?
- Transparent: A Rails monolith. One codebase, clear conventions, predictable structure. A bicycle that teaches you mechanics through repair. Understanding through engagement, not just documentation.
- Opaque: A microservices mesh with 47 interdependent services, event buses, and eventual consistency. Legible only to specialists with access to internal tools.
2. Rewards skillful engagement
Does this tool invite creative, skillful activity—and reward that engagement with presence and meaning? Or does it reduce us to passive consumption?
- Transparent: Baking bread from scratch. Feeling the dough, judging the rise, adapting to the day's humidity. The work commands your presence and repays it with satisfaction no microwave can deliver.
- Opaque: Meal kits / assembly line cooking. All decisions pre-made; you follow steps. Efficient, but hollow.
3. Visible state and visible tradeoffs
Can we see what the system is doing—and what it's hiding? Every tool amplifies some capacities and reduces others. Transparency means making that structure visible.
- Transparent: A SQLite database. One file. Query it directly. A hand-crank grinder that shows you what "grinding" means.
- Opaque: A dashboard that shows metrics but conceals the collection apparatus. A distributed cache with sharding and replication lag you'll never inspect.
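The SQLite claim is worth making concrete: the entire database is a single ordinary file you can interrogate with nothing but a language's standard library. A minimal sketch in Python (the file and table names are invented for illustration):

```python
import sqlite3

# The whole database lives in one ordinary file on disk: no server, no daemon.
# "garage.db" is an arbitrary example name.
conn = sqlite3.connect("garage.db")
conn.execute("DROP TABLE IF EXISTS events")  # keep the example repeatable
conn.execute("CREATE TABLE events (ts TEXT, door TEXT, action TEXT)")
conn.execute("INSERT INTO events VALUES ('2024-12-01T08:00', 'main', 'open')")
conn.commit()

# Visible state: the database will describe its own schema if you ask.
schema = conn.execute(
    "SELECT sql FROM sqlite_master WHERE type = 'table'"
).fetchone()[0]

# Query the state directly -- no dashboard between you and the data.
rows = conn.execute("SELECT door, action FROM events").fetchall()
print(schema)
print(rows)  # [('main', 'open')]
conn.close()
```

No collection apparatus is hidden here: the file can be copied, backed up, inspected with any SQLite tool, and it will still be readable decades from now.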
4. Fixable when broken
When it breaks, can we diagnose and repair it—or does breakdown trap us in helplessness? Repair is pedagogy: the bicycle teaches mechanics through breakdown.
- Transparent: A bicycle. Flat tire? You see it, understand it, fix it—and learn something. A Framework laptop: swap the battery, replace the screen.
- Opaque: A modern car's ECU. Warning light? Pray the dealer's diagnostic tool knows. A glued-shut phone that teaches only dependence.
5. Modifiable without supporting a monopoly
Can we adapt it to our needs—without locking everyone into one solution? This is Illich's most radical criterion: convivial tools don't impose themselves.
- Transparent: Email. Self-host or use any provider. Any client works. The protocol doesn't care. Home-cooked software built for your family's exact needs.
- Opaque: iMessage. Seamless for Apple users, but the lock-in is the point. SaaS that offers "customization" through a 200-field settings panel—configured, never modified.
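Email's openness can be shown directly: the message format (RFC 5322) is plain, human-readable text that any conforming client or script can produce and parse. A small Python sketch, with made-up addresses:

```python
from email import message_from_string
from email.message import EmailMessage

# Email is an open standard (RFC 5322 messages, RFC 5321 transport), so any
# client, server, or script can produce and parse it. Addresses are placeholders.
msg = EmailMessage()
msg["From"] = "alice@example.org"
msg["To"] = "bob@example.net"
msg["Subject"] = "The protocol doesn't care"
msg.set_content("Self-hosted or not, this message is plain, inspectable text.")

# The wire format is human-readable text -- no proprietary container.
wire = msg.as_string()
print(wire)

# And any conforming parser, not just the one vendor's app, can read it back.
parsed = message_from_string(wire)
print(parsed["Subject"])  # The protocol doesn't care
```

Compare iMessage: the same "send a message" commodity, but the format, transport, and client are all sealed inside one vendor's box.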
Transparency nudges us to use tools well
The five properties above are not arbitrary. They trace back to deeper insights about how we relate to tools.
Philosopher Martin Heidegger, in Being and Time (1927), explained it with a simple example: when a carpenter uses a hammer, she doesn't notice the hammer at all—it just "disappears" into her work. But if the hammer breaks, suddenly all her attention snaps to the tool itself. Heidegger called this shift "ready-to-hand" (when the tool works and fades into the background) versus "present-at-hand" (when the tool breaks and becomes a problem).
Later, philosopher Don Ihde added another layer. In Technology and the Lifeworld (1990), Ihde argued that no tool ever fully melts away. Every piece of technology, even the most invisible, changes how we act and how we see the world. When you ride a bicycle, your balance and vision change. A car dashboard picks what info you see and what you don't. Even the simplest tool shapes our experience.
So transparent technology is not just about seeing inside the machine or reading the code. It's about understanding how the tool shapes us—and having the option to inspect, question, or change it. That's how conviviality moves from an abstract hope to something you can actually do: by letting users see and edit not just the guts of a tool, but the ways it "pushes" or guides us.
Now is a good time to care
1. Complexity compounds faster than understanding
A modern car has over 100 million lines of code. A John Deere tractor requires dealer software to diagnose. A refrigerator now needs firmware updates. Each layer of sophistication adds opacity. At some point, no single person, not even the manufacturer's engineer, can understand the whole system.
The same pattern appears in software. A static blog built with Next.js can bloat into hundreds of megabytes of dependencies that not even experts can reliably debug. Next.js is "fast food" (see the Software meal tier system): convenient, but served from someone else's kitchen, inscrutable when something goes wrong. Fine craft, like home-cooked HTML, is slower—but you are in control.
2. Dependency means accelerating monopolies
Illich, in Tools for Conviviality (1973), warned about "radical monopoly"—when a type of tool becomes mandatory.
Today's radical monopolies are ecosystems. HP printers that reject third-party ink cartridges. Keurig machines that scan pods for authenticity. Tesla repairs that can only happen at Tesla service centers. Chamberlain garage doors that block third-party controllers.
The pattern extends to software: cloud services, proprietary APIs, black-box AI. Every opaque dependency is a point of fragility. When it breaks, you wait for someone else to fix it.
Transparent dependencies—a bicycle shop that uses standard parts, open-source software you can fork—at least let you patch, adapt, or understand the failure.
3. AI makes transparency elsewhere urgent
LLMs are the most opaque systems ever deployed at scale. No one—not even their creators—fully understands why they produce specific outputs.
This creates a new premium on transparency in the surrounding layers. If the AI is a black box, at least make everything else inspectable: the data you feed it, the prompts you use, the systems it connects to. The convivial response to AI is not rejection—it's transparency in the layers you control.
Let it break: breakdown as liberation
Breakdown is not a problem to be avoided. It’s a moment of revelation—a flash that exposes the true contours of our dependence. Heidegger saw this with uncanny clarity: when the hammer splinters in your hand, you cease relating to it invisibly as “equipment”; suddenly the tool stands before you, visible, alien, demanding reflection—forcing the question: can you fix it, or are you helpless?
Illich, too, saw repair not as inconvenience, but as emancipation. The event of breakdown transforms the user from passive consumer to participant. We discover if we have cultivated autonomy—or if we are simply industrially impotent, at the mercy of black boxes.
As a society we treat breakdown as an inconvenience: something to be patched, masked, or offloaded onto professional support. But convivial systems treat breakdown as a source of power. When things break, we are invited to learn, to repair, to bridge the gap between dependence and sovereignty.
Conviviality, in Illich’s sense, is not the comfort of seamlessness. It is the capacity to respond meaningfully to breakdown: to seize the moment when transparency is tested, and to turn it toward autonomy.
Let it break. Then see what (if anything) stands between you and control.
Working toward transparency
If opacity is the easy path, how do we build the alternative? Three principles:
1. Prepare for the moment of breakdown
Heidegger showed us that tools disappear in use—until they fail. When they fail, what do we find?
- Assume things will break. A mechanical watch reveals its logic when you open the caseback. A cast iron pan shows its wear in ways you can address. Document the why, not just the what.
- Prepare escape hatches. A car with a manual transmission teaches you how engines work. A Framework laptop lets you swap components. A Rails monolith keeps the whole system where you can see it.
- Prefer what time has tested. The bicycle has been debugged for a century. Postgres has decades of edge cases resolved. Boring technology is technology that has already broken and been repaired.
2. Make the invisible visible
Ihde reminded us that every tool shapes how we see and act—even when we don't notice it.
- Expose the mechanism. A hand-crank coffee grinder shows you what "grinding" means. A SQLite database is one file you can query directly. Logs show what is happening under the hood. Visible mechanisms teach.
- Avoid magic. A thermostat that shows its logic is convivial. A "smart" home that acts without explanation is not. If the framework does something for you, make clear what and why.
- Publish schematics. Exploded views, parts lists, teardown guides. The Framework laptop publishes them; Apple publishes lawsuits.
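As an illustration of a mechanism that refuses to be magic, here is a toy thermostat that returns a reason alongside every decision. Everything in it (names, thresholds) is a hypothetical sketch, not any real product's logic:

```python
from dataclasses import dataclass

@dataclass
class ConvivialThermostat:
    """A toy thermostat that explains every decision it makes.
    Hypothetical illustration: names and thresholds are invented."""
    target_c: float = 20.0      # desired temperature
    hysteresis_c: float = 0.5   # dead band, to avoid rapid on/off cycling

    def decide(self, current_c: float) -> tuple[str, str]:
        """Return (action, reason) -- the mechanism stays visible to the user."""
        if current_c < self.target_c - self.hysteresis_c:
            return "heat_on", (
                f"{current_c}°C is below target {self.target_c}°C "
                f"minus hysteresis {self.hysteresis_c}°C"
            )
        if current_c > self.target_c + self.hysteresis_c:
            return "heat_off", (
                f"{current_c}°C is above target {self.target_c}°C "
                f"plus hysteresis {self.hysteresis_c}°C"
            )
        return "hold", "within the hysteresis band; avoiding rapid cycling"

action, reason = ConvivialThermostat().decide(18.0)
print(action, "-", reason)
```

A black-box thermostat computes the same on/off signal; the only difference is that this one hands you its reasoning, so when the house is cold you can see whether the logic or the heater is at fault.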
3. Invite skillful engagement
Borgmann's focal things demand our presence. Transparent technologies should reward attention.
- Choose tools that teach. Baking bread teaches fermentation. Maintaining a bicycle teaches mechanical advantage. Writing code you understand teaches how systems think.
- Welcome modification. A cast iron pan can be reseasoned and handed down. A bicycle with standard parts can be fixed anywhere. Open source invites forks.
- Reject the subscription trap. When a tool requires ongoing payment to function, you rent access to a black box. Chamberlain's garage door is the warning.
The goal isn't purity—it's optionality. Use the sealed MacBook for work; understand a Framework on weekends. Ship to the cloud; maintain the capacity to self-host. The risk isn't using opaque tools. It's losing the skill to use transparent ones.
Here's the paradox that makes it worthwhile: expertise in one transparent domain sharpens our sense for all the others. Learning to fix a bicycle helps us notice what's wrong with a car—even when we can't open the hood. Understanding a Rails monolith lends clarity when we're dropped into a labyrinthine SaaS product. The more we know the internals somewhere, the less we're at the mercy of the inscrutable everywhere.
This is not about rejecting convenience. It's about cultivating capacity. Every hour spent with tools you understand is an investment in autonomy that compounds. Every skill you let atrophy is a door that closes.
Andrew Feenberg reminds us: technology is not fate—it's "a scene of struggle." The struggle is not against opaque tools—it's against the gradual loss of the skills to live without them.
"Technology is not a destiny but a scene of struggle." — Andrew Feenberg, Critical Theory of Technology (1991)
When we understand how something works, we can enrich the environment with it. When we can fix it, we're free. When we can modify it, we can express our vision.
That's conviviality. Transparency opens the door. Engagement walks through it.