A Hacker's Mind
by Bruce Schneier
This page contains highlights I saved while reading A Hacker's Mind by Bruce Schneier. These quotes were captured using Readwise and reflect the ideas or passages that stood out to me most.
Highlights
But, people being as they are, a great deal of medieval thought went into exactly what counted as "meat" and "not-meat," especially during the month-long fasting seasons of Lent and Advent. Fish didn't count as meat. Barnacle goose was also considered not meat, because it had scaly webbed feet and (supposedly) gave birth in the water. A similar argument applied to beavers: also not meat. (This isn't just a historical curiosity. Even today, Detroit Catholics are permitted to eat muskrat during fast days on the basis of a missionary's ruling from the 1700s.) Some French monasteries served rabbit fetuses, not deemed meat because they were swimming in amniotic fluid. (Really: I'm not making any of this up.) St. Thomas Aquinas declared chicken to be of aquatic origin—whatever that means—and not meat. Some bishops went further, declaring that, because they are not quadrupeds, all fowl is fair game (so to speak).
Uber and Lyft, for example, have created an unsustainable market for hired-car rides by charging artificially low prices that do not accurately reflect the value of drivers' labor.
It's basically central planning by elite investors, something that would be called communism if the government did it.
In 1902, the Hanoi government tried to eradicate the rat population by paying for rat tails. People quickly realized that the smart thing to do was to trap a rat, cut off its tail, and then release it back into the wild so it could breed more rats to catch. In 1989, Mexico City introduced a pollution-control scheme whereby cars with even and odd license plates could drive on alternate days. People responded by buying a second car, often an old and highly polluting one.
More recently, Uber drivers in Nairobi created a hack to cut the company out of its share of the ride fee. Passengers hail drivers through the Uber app, which also sets the ride charge. At pickup, the driver and passenger agree to "go karura," which means that the passenger cancels the ride and pays the driver the entire amount in cash.
VC funding and private equity are such a normal part of our economy that it might seem odd to call them hacks. But they are; they're hacks of pretty much everything the market is supposed to do. No one calls it a hack; everyone just calls it "disruptive" and "innovative." That it's both legal and accepted doesn't change the fact that money and power decide what behavior is accepted and who gets a seat at the gaming table.
As a whole, the VC system subverts market capitalism in many ways. It warps markets, allowing companies to charge prices that don't reflect the true cost or value of what they're selling. It permits unprofitable enterprises and unsustainable business models to thrive and proliferate.
We don't want some central planner to decide which businesses should remain operational and which should close down. But this is exactly what happens when venture capital firms become involved. The injection of VC money means that companies don't need to compete with each other in the traditional manner, or worry about the normal laws of supply and demand.
Today, I'm certain that companies view a "too big to fail" bailout as their ultimate insurance policy.
This isn't the first time the US government bailed out "too big to fail" companies. The Federal Deposit Insurance Corporation was created in the 1930s, following a torrent of bank failures, in order to monitor banks and protect consumer deposits. In 1979, the government bailed out Chrysler Corporation. It was a smaller bailout—only $1.5 billion—but the justifications were similar. National security was invoked; the company was building the M1 Abrams tank during the height of the Cold War. The economy was invoked; it was necessary to save 700,000 jobs in Detroit and beyond. And the US was in the middle of an automotive trade war with Japan.
Directors of an enterprise deemed too crucial to fail, on the other hand, know that the inevitable costs of any poor decisions they might make will be paid by taxpayers: that is, by society as a whole. This creates moral hazard and incentivizes risky decision-making. If they're successful, they'll win. And if they're unsuccessful, they're protected from loss. "Too big to fail" is an insurance policy against bad bets.
Norms only work if there are consequences for violations, and society couldn't keep pace with the onslaught.
It's hard to find numbers for Uber Eats, but Uber itself lost $6.8 billion in 2020—and that's better than its $8.5 billion loss in 2019. And it's unsustainable for individual investors, too; food delivery doesn't work for anybody. The drivers—gig workers with no benefits or job security—are poorly paid. The services hurt restaurants: delivery orders aren't profitable for a whole bunch of reasons, they don't bring in incremental sales, and a restaurant's reputation suffers when the delivery service screws up. Even the customers don't fare well: they're hit with service fees and higher prices and suffer all sorts of delivery problems.
Hacks like these exemplify three attributes that we'll revisit later on in the book. First, "too big to fail" is generalizable. As big banks, real estate, and other "essential" sectors of the economy recognize that they are able to employ the "too big to fail" hack, the market economy as a whole becomes vulnerable to enterprises that expand unsustainably. Second, hacks can be systematized and reshape decision-making: the bailouts in 2008 codified "too big to fail" into law. By demonstrating that the federal government was willing to bail out the banking, real estate, and automotive sectors, Congress normalized the hack as just another part of the high-finance game. And third, the very concept of "too big to fail" changes the incentives of those regulating the massive organizations, and consequently the organizations themselves.
(In economic recessions, government revenues decline because people earn less and pay less in taxes, and government spending increases on programs like unemployment insurance. In short, recessions become more costly as they become more severe.)
Adam Smith wrote about this in 1776, explaining that the economic interests of businessmen are often misaligned with public interests. The goal of businessmen—and, of course, business enterprises—is to maximize profits. The goal of the public is to (more or less) maximize product quantity, quality, variety, and innovation, and minimize prices. Lack of competition means that sellers no longer fear losing buyers, and thus have no incentive to provide any of those things the public wants.
Fail-Safe/Fail-Secure: All systems fail, whether due to accident, error, or attack. What we want is for them to fail as safely and securely as possible.
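The fail-secure idea translates directly into code. Here's a minimal sketch (my example, not the book's; all names are hypothetical) of a check that fails closed, so an error in the mechanism denies access instead of granting it:

```python
# Hypothetical sketch of fail-secure ("fail closed") design: if the
# credential check itself breaks, the safe answer is to deny access.

def credential_is_valid(badge_id: str) -> bool:
    """Placeholder for a real lookup that may raise during an outage."""
    raise ConnectionError("credential database unreachable")

def door_should_open(badge_id: str) -> bool:
    try:
        return credential_is_valid(badge_id)
    except Exception:
        # Fail secure: an error in the check means "locked", not "open".
        # (A fire door would invert this and fail *safe* by unlocking.)
        return False

print(door_should_open("badge-42"))  # False: the failure is contained
```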
Simplicity: The more complex a system, the more vulnerable it is.
Children are natural hackers. They don't understand intent and, as a result, don't see system limitations in the same way adults do. They look at problems holistically, and can stumble onto hacks without realizing what they're doing. They aren't as constrained by norms, and they certainly don't understand laws in the same way. Testing the rules is a sign of independence.
Def: System /ˈsis·tǝm/ (noun) - A complex process, constrained by a set of rules or norms, intended to produce one or more desired outcomes.
Hacking subverts the intent of a system by subverting its rules or norms. It's "gaming the system." It occupies a middle ground between cheating and innovation.
Def: Hack /hak/ (noun) -
- A clever, unintended exploitation of a system that (a) subverts the rules or norms of the system, (b) at the expense of someone else affected by the system.
- Something that a system allows but which is unintended and unanticipated by its designers.
No matter how locked-down a system is, vulnerabilities will always remain, and hacks will always be possible. In 1930, the Austro-Hungarian mathematician Kurt Gödel proved that all mathematical systems are either incomplete or inconsistent.
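The highlight compresses the result; stated more carefully (the theorem was published in 1931 and applies only to consistent, sufficiently expressive formal systems), the standard modern form is:

```latex
% Gödel's first incompleteness theorem, in its standard modern form.
% (The highlight above compresses this; the precise statement needs
% the hypotheses below.)
\textbf{Theorem (G\"odel, 1931).}
\emph{Let $F$ be a consistent, effectively axiomatizable formal system
that can express elementary arithmetic. Then there is a sentence $G_F$
in the language of $F$ such that neither $G_F$ nor $\lnot G_F$ is
provable in $F$.}
```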
Compartmentalization (isolation/separation of duties): Smart terrorist organizations divide themselves up into individual cells. Each cell has limited knowledge of the others, so if one cell is compromised, the others remain secure.
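As a toy illustration (the structure and names are mine, not the book's), compartmentalization means capturing any one component reveals only that component's secret and its single contact, never the whole network:

```python
# Toy sketch of compartmentalization: each "cell" knows only its own
# secret and its single upstream contact, never the whole roster.

class Cell:
    def __init__(self, name: str, secret: str, contact: "Cell | None" = None):
        self.name = name
        self._secret = secret      # known only to this cell
        self._contact = contact    # at most one other cell

    def compromise(self) -> dict:
        """Everything an attacker learns by capturing this one cell."""
        return {
            "secret": self._secret,
            "contact": self._contact.name if self._contact else None,
        }

alpha = Cell("alpha", "safehouse-A")
bravo = Cell("bravo", "safehouse-B", contact=alpha)
charlie = Cell("charlie", "safehouse-C", contact=bravo)

# Capturing charlie exposes one secret and one name, not the network:
print(charlie.compromise())  # {'secret': 'safehouse-C', 'contact': 'bravo'}
```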
Defense in Depth: The basic idea is that one vulnerability shouldn't destroy the whole system.
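In software terms, defense in depth is just independent checks in series. A hypothetical sketch, with made-up layer names, where defeating any single layer still leaves the others standing:

```python
# Hypothetical sketch of defense in depth: several independent layers
# must all pass; defeating one check alone doesn't defeat the system.

def has_valid_token(request: dict) -> bool:
    return request.get("token") == "expected-token"

def is_authorized(request: dict) -> bool:
    return request.get("role") == "admin"

def within_rate_limit(request: dict) -> bool:
    return request.get("requests_this_minute", 0) < 60

LAYERS = [has_valid_token, is_authorized, within_rate_limit]

def allow(request: dict) -> bool:
    # One compromised layer (say, a stolen token) is not enough:
    # the authorization and rate-limit layers still have to agree.
    return all(layer(request) for layer in LAYERS)

print(allow({"token": "expected-token", "role": "guest"}))  # False
```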
A system exists to further a set of goals, usually put forth by the system's designers. A hacker hijacks the same system for a different set of goals, one that may be contrary to the original ones.
This is obvious in the hacks of ATMs, casino games, consumer reward plans, and long-distance calling plans. The goal of whoever manages the ATM is to dispense cash to account holders and deduct the appropriate money from their accounts. The goal of the hacker is to receive cash without having money deducted from their account (or without even having to have an account at all). Similarly, the goal of a casino is to be fair (which means equal opportunity between players, not equal opportunity between players and the house). The goal of the hacker is to tilt that advantage in their favor.
All computer code contains bugs. These are mistakes: mistakes in specification, mistakes in programming, mistakes that occur somewhere in the process of creating the software, mistakes as pedestrian as a typographic error or misspelling.
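As a contrived illustration (mine, not the book's) of how pedestrian these mistakes can be, here is one specification slip and one plain misspelling:

```python
# Contrived examples of the mistake classes named above.

FREE_SHIPPING_THRESHOLD = 50

def free_shipping(total: float) -> bool:
    # Specification mistake: the rule was meant to be "orders of $50
    # or more", but the strict `>` quietly excludes exactly-$50 orders.
    return total > FREE_SHIPPING_THRESHOLD

def charge(account: dict, amount: float) -> None:
    # Typographic mistake: `balence` isn't a key in the account dict,
    # so this check raises KeyError at runtime instead of guarding
    # the charge.
    if account["balence"] >= amount:   # should be "balance"
        account["balance"] -= amount
```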
The hacker isn't just outsmarting her victim. She's found a flaw in the rules of the system. She's doing something she shouldn't be allowed to do, but is. She's outsmarting the system. And, by extension, she's outsmarting the system's designers.