Hey there, happy Monday!
Articles to Read.
Is there a loneliness epidemic?
The media seems to have agreed that rich countries are experiencing a ‘loneliness epidemic’. There are literally thousands of newspaper articles that use this exact expression.
What is the evidence for this? The word ‘epidemic’ suggests that things are getting much worse and loneliness is increasing rapidly. But does the data in fact show that societies are becoming lonelier?
Despite the popularity of the claim, there is surprisingly little empirical support for the idea that loneliness is increasing, let alone spreading at epidemic rates.
It is true that more people are living alone around the world. But loneliness and aloneness are not the same. As we explain in a companion post, spending time alone is not a good predictor of whether people feel lonely, or have weaker social support.
As we explain later, today’s adolescents in the US do not seem to be more likely to report feeling lonely than adolescents from a couple of decades ago; and similarly, today’s older adults in the US do not report higher loneliness than older adults in the past. Surveys covering older adults in other rich countries, including Finland, Germany, England and Sweden, point in the same direction – it’s not the case that loneliness is increasing across generations in these countries.
—
We live in interesting times. For instance, we are witnessing several extinction events all at once. One of them is the massive extinction of species. The other is the extinction of jobs. Both are caused by advances in technology. As programmers, we might consider ourselves immune to the latter–after all, somebody will have to program these self-driving trucks that eliminate the need for drivers, or the diagnostic tools that eliminate the need for doctors. Eventually, though, even programming jobs will be automated. I can imagine the last programmer putting finishing touches on the program that will make his or her job redundant.
But before we get there, let’s consider which programming tasks are the first to go, and which stand the best chance of persisting the longest. Experience tells us that it’s the boring menial jobs that get automated first. So any time you get bored with your work, take note: you are probably doing something that a computer could do better.
One such task is the implementation of user interfaces. All this code that’s behind various buttons, input fields, sliders, etc., is pretty much standard.
I’m often asked by programmers: How is learning category theory going to help me in my everyday programming? The implication being that it’s not worth learning math if it can’t be immediately applied to your current job. This makes sense if you are trying to locally optimize your life. You are close to the local minimum of your utility function and you want to get even closer to it. But the utility function is not constant–it evolves in time. Local minima disappear. Category theory is the insurance policy against the drying out of your current watering hole.
—
Do you have any pre-writing rituals or habits before you sit down to write?
Ideally not. Ideally I already have the next few sentences lined up in my head, and I just sit down and start writing. But unfortunately that only happens about 20% of the time. When it doesn’t, I’m in trouble, because I do things like check Twitter, which is not good for the brain. You’d think by 55 I’d be more organized, but apparently not.
How do you know if something you’re working on isn’t any good or isn’t worth publishing? Have you developed any criteria that let you evaluate your own work?
I have a trick for this. I think the goal of an essay is to surprise the reader. And if you write about a topic you understand fairly well and you’re able to discover things you didn’t consciously realize when you started writing, they’ll probably surprise most readers too. That’s the test: am I surprising myself?
—
There is a renaissance underway in online text as a medium.
I want to take a stab at lightly theorizing this renaissance. And also speculating, in light of this renaissance, about what might be the eighth and penultimate death of blogging. And the future of books. So it’s going to be a sprawling, messy hot take on the State of Textual Media. Or at least a simmering take, since I’ve been thinking about this stuff for a year on the backburner.
The text renaissance is an actual renaissance. It’s a story of history-inspired renewal in a very fundamental way: exciting recent developments are due in part to a new generation of young product visionaries circling back to the early history of digital text, rediscovering old, abandoned ideas, and reimagining the bleeding edge in terms of the unexplored adjacent possible of the 80s and 90s.
I imagine, to traditionalists already bemoaning the slow decline of print-based media like books, newspapers, and magazines, these technologies I want to talk about might seem like the four horsemen of the apocalypse. But whether they strike you as renaissance or apocalyptic technologies, they’re here, so let’s meet them.
—
How the Horrific 1918 Flu Spread Across America
Wherever it began, the pandemic lasted just 15 months but was the deadliest disease outbreak in human history, killing between 50 million and 100 million people worldwide, according to the most widely cited analysis. An exact global number is unlikely ever to be determined, given the lack of suitable records in much of the world at that time. But it’s clear the pandemic killed more people in a year than AIDS has killed in 40 years, more than the bubonic plague killed in a century.
The impact of the pandemic on the United States is sobering to contemplate: Some 670,000 Americans died.
In 1918, medicine had barely become modern; some scientists still believed “miasma” accounted for influenza’s spread. With medicine’s advances since then, laypeople have become rather complacent about influenza. Today we worry about Ebola or Zika or MERS or other exotic pathogens, not a disease often confused with the common cold. This is a mistake.
We are arguably as vulnerable—or more vulnerable—to another pandemic as we were in 1918. Today top public health experts routinely rank influenza as potentially the most dangerous “emerging” health threat we face. Earlier this year, upon leaving his post as head of the Centers for Disease Control and Prevention, Tom Frieden was asked what scared him the most, what kept him up at night. “The biggest concern is always for an influenza pandemic...[It] really is the worst-case scenario.” So the tragic events of 100 years ago have a surprising urgency—especially since the most crucial lessons to be learned from the disaster have yet to be absorbed.
—
The war on food waste is a waste of time
Food waste is frequently articulated as an environmental crisis, a claim that rests on two arguments. The first is clearly climate-oriented: When food waste ends up in a landfill, it rots and produces methane, a potent greenhouse gas that warms the planet. In this argument, households are largely to blame, and the solutions put forward to address household food waste mostly center on policing behavior, whether through more judicious domestic labor or patronizing public education campaigns aimed at addressing consumer confusion.
Much like paper straws or canvas totes, though, well-meaning small changes miss the forest of structural change for the trees of lifestyle tweaking. The thrown-away food itself gets the scrutiny, even though it is the way we dispose of food, mostly by dumping it in landfills, that generates methane emissions. Large-scale composting or biogas generation, which could actually put a dent in this methane problem, often require public investment and political will, something that consumer-focused finger-pointing does nothing to build.
This creative accounting suggests that wasting less food would somehow undo all of the harms of food production. But the nutrient cycle does not care whether or not you clean your plate. All the environmental impacts that brought that meal into being are done deals; in the parlance of introductory economics, they are “sunk costs.” In focusing so much on waste, we give a pass to the way things are further upstream. There is a rosy assumption that wasting less food would somehow propagate back up the supply chain, in the most impressive game of telephone ever, and signal to farmers to grow less food. But that seems unlikely in an agricultural paradigm propped up by subsidies that incentivize the overproduction of four or five commodity crops, where farmers answer to the demands of fewer and fewer agribusiness firms rather than to consumers.
—
What Happened When Tulsa Paid People to Work Remotely
The first class of hand-picked remote workers moved to Tulsa, Oklahoma, in exchange for $10,000 and a built-in community. The city might just be luring them to stay.
Funded by the George Kaiser Family Foundation, an influential Tulsa-based philanthropy, Tulsa Remote is designed to put the small city on the national map, and shock it with a jolt of new energy, pulled from the outside in.
A year after Tulsa Remote launched, the first participants — a mix of expats from expensive coastal cities, wanderlusty young adults, and those with roots in the region — say they’ve found many of the things they were looking for: a more comfortable and affordable quality of life, new neighbors they like, enough of an economic cushion to ease the stress of buying new furniture, and a fresh start. Many say they’ll stick around past the end of the one-year program. More than that: Some of them tell stories of positive personal transformation that are so dramatic, they might appear too perfect, almost canned. But after checking in with participants over the course of eight months, I found that many of them remained just as effusive. Maybe it’s something about Tulsa. Or maybe it’s something about Tulsa Remote.
—
More to Check Out:
- Man takes a photo of himself every day for 20 years
- How to brainstorm great business ideas
- Let’s Disrupt Dating Apps
- In Britain, Even Jails Have a Class System
- How to Raise Money – It’s a Journey Not An Event
—
My Update:
Working in SF.