Bureaucracy and AI¶
```mermaid
flowchart LR
    bur[bureaucracy load] --> mech["mechanical: forms · routing · summarization"]
    bur --> human["non-mechanical: judgement · trust · accountability"]
    mech --> ai([AI absorbs])
    human -.new bottleneck.-> rem[remains human]
```
- stigmergy in daily life — forms as adversarial traces
- scope — what AI is in-scope for
- risk — tier the new bottleneck
- commons — alternative to extracted bureaucracy
Investigation · rating: medium. L0/L1/L2 all filled (T-012, 2026-05-09).
Status: sapling | 2026-05-09 | rating: medium
L0 — TL;DR (≤5 lines)¶
A large fraction of human energy and time goes to bureaucracy: forms, approvals, audits, status meetings, compliance, repeated re-explanation of the same context. AI is on track to absorb most of the mechanical layer (drafting, lookups, routing, summarization). What that exposes is the part that was never mechanical — judgement, accountability, trust — which becomes the new bottleneck. The interesting question is what comes after AI saturates the mechanical layer.
L1 — Overview¶
Core question¶
Assuming AI absorbs the mechanical bureaucratic layer over the next decade, what fraction of current bureaucratic energy is freed, what stays (because it was always doing something non-mechanical), and what new bureaucracies emerge in response?
Why it matters¶
- Time spent on bureaucracy is arguably the largest hidden tax on creative output.
- AI's biggest near-term economic effect is plausibly here, not in code generation.
- Predicting the residue (what bureaucracy is for once the paperwork is free) shapes what to build next.
Mermaid map (L1)¶
```mermaid
flowchart LR
    pre["Pre-AI bureaucracy:<br/>forms, approvals, audits, meetings"] -->|absorption| ai[AI handles mechanical layer]
    ai --> freed[Freed time / energy]
    ai --> residue["Residue:<br/>judgement, trust, accountability"]
    residue --> new["New bureaucracies<br/>emerge around residue"]
    freed -.may flow to.-> new
    freed -.may flow to.-> create[Creative / scientific output]
```
The freed time has two sinks. Which one wins is a policy and culture question, not a technical one.
Skeleton sub-claims¶
- Most bureaucratic work is mechanical signal-routing. Filling the right form, sending it to the right person, checking the right box. AI handles this cheaply.
- Some bureaucratic work is deliberate friction. Filtering applicants, slowing high-risk action, generating an accountability trail. AI doesn't remove this — it makes the friction cheaper to operate, and possibly cheaper to defeat.
- The residue is judgement + trust. Who is allowed to do X. Whose recommendation counts. These cannot be auto-resolved without changing what they mean.
- New bureaucracies emerge around AI itself. Audit trails for AI decisions, model provenance, AI-output verification. Replacement is partial; net bureaucratic load may not fall.
- Post-AI scarcity will look different. Not "what can be produced" but "what can be trusted". Reputation, identity, and adversarial filtering become the load-bearing substrates.
- Possible developments after AI: trust-graph infrastructure becomes load-bearing; small high-trust groups outperform large low-trust ones; identity verification becomes a public utility; "bureaucracy as a service" becomes a commoditized API.
L2 — Deep dive¶
How much energy actually goes to bureaucracy¶
The honest answer is: a lot, but estimates differ widely because "bureaucracy" is not a clean category. Three pieces of evidence triangulate it:
- Time-use studies of knowledge workers consistently show 30–45% of a working week on emails, status meetings, and "coordination" tasks. Producing the actual deliverable is the smaller fraction.
- Compliance-heavy industries (banking, healthcare, public sector) push higher — 50–70% of professional time goes to documentation, regulatory filings, and audit-readiness work, not to the regulated activity itself.
- The personal-life equivalent — taxes, insurance, school forms, medical pre-authorizations — is small per-person but covers most adults and accumulates to weeks per year for a typical household.
The energy fraction is harder to pin down, but follows a similar pattern: switching costs (reloading context after each interruption) plus low-grade anxiety about pending forms eat cognitive bandwidth on top of the wall-clock time. The two together are why bureaucracy feels like more than the timesheet suggests.
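The "weeks per year" claim is just arithmetic; a back-of-envelope sketch with illustrative inputs (the hours-per-week figure is an assumption, not measured data):

```python
# Back-of-envelope check of the "weeks per year" household-admin claim.
# All inputs are illustrative assumptions.
hours_per_week_admin = 2.5   # taxes, insurance, school forms, pre-authorizations
weeks_per_year = 52
work_week_hours = 40         # express the total in 40-hour "work-weeks"

annual_admin_hours = hours_per_week_admin * weeks_per_year   # 130 hours
annual_admin_weeks = annual_admin_hours / work_week_hours    # 3.25 work-weeks

print(f"{annual_admin_hours:.0f} h/year ≈ {annual_admin_weeks:.2f} work-weeks")
```

Even a conservative 2.5 h/week lands at over three full work-weeks per household per year.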
Most bureaucratic work is mechanical signal-routing¶
A surprising amount of bureaucratic work is shape-only: a request arrives, a person reshapes it into a different form and forwards it. The decision is rarely made at the routing step — the router does not have the authority to decide; their job is to make sure the right authority sees a well-formed request.
This pattern is what AI absorbs first, because it is exactly what language models are good at: take loosely-structured input, classify it, extract the relevant fields, draft the corresponding output. A 2025-vintage model handles 80–95% of single-step routing tasks at the quality level of a junior staffer. The remaining 5–20% are the cases where the form's literal answer is wrong but the spirit is clear — and that is exactly where judgement (next section) takes over.
The freed time is real. The question is what it flows into.
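The routing step above can be sketched in a few lines. The keyword matcher is a toy stand-in for a language model, and all department names are hypothetical; the structural point is the escalation path, where anything the classifier cannot resolve unambiguously falls through to the judgement residue:

```python
# Sketch of shape-only routing: classify a loosely structured request,
# forward it when unambiguous, escalate to a human otherwise.
# The keyword table is a toy stand-in for an LLM classifier.
ROUTES = {
    "expense": "finance-desk",
    "access": "it-security",
    "leave": "hr-desk",
}

def route(request_text: str) -> str:
    text = request_text.lower()
    hits = sorted({dept for kw, dept in ROUTES.items() if kw in text})
    if len(hits) == 1:          # unambiguous: forward without human involvement
        return hits[0]
    return "human-review"       # ambiguous or novel: the judgement residue

print(route("Please reimburse my expense claim for May"))    # finance-desk
print(route("I need building access and expense approval"))  # human-review
```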
Some bureaucratic work is deliberate friction¶
Not all bureaucracy is mechanical. Some bureaucracies exist precisely because their friction is the point:
- Filtering: a 30-page application is partly a test of whether the applicant can read 30 pages.
- Slowing high-stakes decisions: required cooling-off periods, multi-approval flows, change-control boards. The point is not to make the decision better but to make the decision slower, so reflexive action has to pass through several humans.
- Generating an accountability trail: the form is signed not because the signature changes the decision but because the signature creates a record that someone takes responsibility.
AI does not remove these — it makes them cheaper to operate (drafting the 30-page application is now free) and, dangerously, cheaper to defeat (filing 100 plausible 30-page applications is now also free). The friction-as-purpose bureaucracies will need to redesign around the new attack cost. Some will shift from "long form" to "high-trust referral"; others will collapse into theatre.
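The "cheaper to defeat" asymmetry is worth putting in numbers. A hedged sketch with assumed costs — the specific hour figures are illustrative, but the shape (attacker cost collapses, reviewer cost stays flat) is the argument:

```python
# Illustrative cost asymmetry once drafting is automated (assumed numbers).
attacker_hours_manual = 20.0  # hand-writing one plausible 30-page application
attacker_hours_ai = 0.1       # prompting plus light editing
reviewer_hours = 2.0          # human screening cost per application (unchanged)

# The same attacker budget now buys this many more submissions...
submissions_per_budget = attacker_hours_manual / attacker_hours_ai   # ~200x
# ...each of which still costs the defender full review time.
review_hours_induced = submissions_per_budget * reviewer_hours       # ~400 h

print(f"~{submissions_per_budget:.0f}x submissions, "
      f"~{review_hours_induced:.0f} reviewer-hours per former single filing")
```

The filter's cost has to move somewhere other than the writing — which is exactly the redesign pressure described above.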
The residue is judgement + trust¶
Strip out the routing and most of the friction, and what is left is the part that was never mechanical:
- Judgement on edge cases. Whether this particular request meets the spirit of the rule, given context that is not in the form. Judgement is cheap to produce ("yes, fine") and expensive to delegate (because delegating it requires reproducing the context).
- Trust in the deciding entity. Whose recommendation counts. Whose signature is enough. This is a property of the decider, not of the request, and AI cannot manufacture it — at most, AI can verify claims about a decider's history, which is a much smaller move.
- Accountability for outcomes. If the decision goes wrong, who is on the hook. AI cannot be on the hook (it has no assets and no continuous identity), so the buck still stops with a human — the human's job becomes "stand behind this AI-routed decision," which is a different job from "make the decision."
The residue is small as a fraction of current bureaucratic load but large in value per minute. The shift is from a population of routers to a smaller population of judgers, with each judgement carrying more consequence per unit time.
New bureaucracies emerge around AI itself¶
Replacement is rarely net. Three new bureaucratic loads grow alongside AI absorption:
- AI-output verification. Every AI-drafted document needs a human sign-off if it has external consequences. The sign-off needs context to be meaningful, so the verifier ends up reading much of what the AI wrote — partly defeating the savings, especially while error rates stay above the human review threshold.
- Provenance and audit trails for AI decisions. Regulated industries already require logs of which model, which prompt, which data produced a given output. This is itself a bureaucracy, and a growing one.
- Defensive bureaucracy against AI-generated abuse. Spam, synthetic applications, fake reviews, low-effort regulatory filings to overwhelm reviewers. The defenders need new filters, which become bureaucracies in their own right (CAPTCHAs, identity proofs, rate-limiting policies).
Net bureaucratic load may not fall in the first decade. The composition changes — fewer typists, more reviewers and provenance-checkers — but the hour count can stay similar. The quality of what is produced under that hour count is the gain, not the hour count itself.
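The verification-overhead point can be made concrete with a toy breakeven model. All parameters are assumptions chosen for illustration; the structural claim is only that savings shrink as error rate and review depth grow:

```python
# Toy model: AI drafting saves writing time, but a meaningful sign-off
# costs review time plus expected rework. All parameters are illustrative.
def net_saving_hours(draft_hours: float, review_frac: float,
                     error_rate: float, rework_hours: float) -> float:
    """Hours saved per document after charging review and expected rework."""
    review_hours = review_frac * draft_hours     # verifier reads much of it
    expected_rework = error_rate * rework_hours  # catch-and-fix cost
    return draft_hours - review_hours - expected_rework

# 4 h human draft; reviewer reads ~60% equivalent; 10% error rate, 3 h rework.
print(net_saving_hours(4.0, 0.6, 0.10, 3.0))   # ≈ 1.3 h saved per document
# At a 45% error rate the saving nearly disappears:
print(net_saving_hours(4.0, 0.6, 0.45, 3.0))   # ≈ 0.25 h
```

This is why "fewer typists, more reviewers" can leave the hour count roughly flat while error rates stay above the review threshold.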
Post-AI scarcity will look different¶
In a pre-AI world the binding constraint was "what can be produced." Words, filings, designs, code, art — all rate-limited by skilled hands. In a post-saturation world that constraint relaxes by orders of magnitude.
What replaces it is "what can be trusted." Specifically:
- Trust that a piece of output came from the source it claims. Identity and provenance become utility-grade infrastructure, the way email signing (DKIM/SPF) became utility-grade after spam made unsigned email untrustworthy.
- Trust that a recommendation was made by a person whose track record we can read. Reputation graphs become load-bearing — not as social surface but as authentication primitive.
- Trust that a process was actually followed and not staged. Audit trails for human + AI hybrid workflows become the new compliance.
The economic shape: scarcity migrates from production to attestation. The salaries follow.
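A minimal sketch of the attestation primitive: the source signs what it emits, and anyone holding the key can check the claim "this came from that source". Real provenance infrastructure would use asymmetric signatures (e.g. Ed25519) so verification does not require the signing secret; HMAC keeps the sketch stdlib-only, and the key and document here are hypothetical:

```python
# Minimal provenance attestation sketch: sign output, verify the claim
# "this came from the source it says it did". HMAC is a stand-in for a
# proper asymmetric signature scheme.
import hashlib
import hmac

def attest(output: bytes, source_key: bytes) -> str:
    return hmac.new(source_key, output, hashlib.sha256).hexdigest()

def verify(output: bytes, tag: str, source_key: bytes) -> bool:
    return hmac.compare_digest(attest(output, source_key), tag)

key = b"agency-signing-key"   # hypothetical key material
doc = b"Decision: application 4411 approved"
tag = attest(doc, key)

assert verify(doc, tag, key)                                  # genuine
assert not verify(b"Decision: application 4411 denied", tag, key)  # tampered
```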
Possible developments after AI¶
A short list of shapes the residue might take. None of these is a prediction; they are the small set of futures that the L1 sub-claims most naturally extend to.
- Trust-graph infrastructure as a public utility. A handful of high-trust attestation services (banking, government IDs, professional licensure) become infrastructure the way DNS is — boring, ubiquitous, load-bearing, with a small standards body curating the edges. Most bureaucratic decisions become "look up in the trust graph; route to a human only on edge cases."
- Small high-trust groups outperform large low-trust ones. When routing is free, the only remaining advantage of large organizations is trust capital. Groups that can credibly say "we vouch for our members" capture the high-judgement work; the rest fragments into AI-augmented solos.
- "Bureaucracy as a service" becomes a commoditized API. Filing, compliance, and administrative drafting are sold by the call, the way email and SMS are. Most current paperwork-heavy roles disappear; a small number of platform-engineering roles take their place.
- The failure mode is volume, not quality. AI-generated bureaucratic output outpaces the AI-driven absorption capacity for a window — every agency drowns in plausible-looking applications, every reviewer is swamped. The fix is to push cost back into requests (paid filings, identity-proof gates), which itself becomes a new bureaucracy.
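The first bullet's decision rule — "look up in the trust graph; route to a human only on edge cases" — can be sketched directly. The graph, the names, and the hop bound are all illustrative assumptions:

```python
# Sketch of trust-graph routing: auto-approve when the requester is
# vouched for (transitively) by a root attester, escalate otherwise.
# Edges, names, and the hop bound are illustrative.
TRUST_EDGES = {
    "licensing-board": ["dr-ahmed", "dr-osei"],
    "dr-ahmed": ["clinic-a"],
}
ROOTS = {"licensing-board"}

def is_vouched(entity: str, max_hops: int = 2) -> bool:
    """Bounded breadth-first walk outward from the trust roots."""
    frontier, seen = set(ROOTS), set(ROOTS)
    for _ in range(max_hops):
        frontier = {v for u in frontier for v in TRUST_EDGES.get(u, [])} - seen
        if entity in frontier:
            return True
        seen |= frontier
    return entity in ROOTS

def route(requester: str) -> str:
    return "auto-approve" if is_vouched(requester) else "human-review"

print(route("clinic-a"))     # auto-approve (two hops from the root)
print(route("unknown-llc"))  # human-review (the edge case goes to a human)
```

The hop bound is itself a policy knob: tight bounds keep the graph high-trust but push more traffic to humans.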
The interesting policy question is whether the freed time (from the L1 "freed" arrow) flows toward judgement-and-creation, or toward consuming the new bureaucracy that is emerging around AI. The technology forces neither outcome.
Open questions¶
- What is the actual percentage of working hours on bureaucracy today, by sector? (Estimates exist; calibrate against several.)
- Which bureaucracies strengthen under AI absorption (because their friction was load-bearing) vs collapse (because they were pure mechanical routing)?
- How fast does the residue grow? If AI absorbs 80% of mechanical load in 5 years but trust infrastructure takes 20 years to build, the gap is the interesting policy window.
- What does the failure mode look like — runaway AI-generated bureaucracy outpacing AI's ability to absorb it?
References¶
- (Pending L2 fill — Graeber "Bullshit Jobs" on bureaucratic work; productivity-paradox literature; recent McKinsey/OECD bureaucracy time-use estimates. Verify before citing.)
Inspiration sources¶
- The user's framing: "do investigation on how much energy and time spent on bureaucracy, how we can utilize new technologies for this. what comes with AI, how it will be utilized, possible future developments after AI." Direct.
See also¶
- ENERGY-AND-ATTENTION — bureaucracy is the most expensive attention sink at the population level.
- STIGMERGY-IN-DAILY-LIFE — forms are stigmergy with adversarial intent.
- UNIVERSE-EVOLUTION-AS-COMPRESSION — bureaucracy is bad compression of trust into procedure.