To be a U.S. teenager in 2023 is both the same as it ever was and astoundingly different from even a generation ago. Along with all the classic challenges of growing up—grades, parents, first loves—looms a crop of newer ones: TikTok, gun violence, political division, the whipsaw of COVID-19, the not-so-slow creep of climate change.
“The main domains are the same: school, home, family, and peers,” says Dr. Asha Patton-Smith, a child and adolescent psychiatrist at Kaiser Permanente in Virginia. But the stressors that emerge within those domains have changed tremendously in a world where the internet and real life have largely blurred into one, with everything from school to social interaction now happening at least partially online and a fire hose of bad news always only a swipe away.
This new world has taken a toll on U.S. teenagers if the staggering data on adolescent mental health are any indication. In 2020, 16% of U.S. kids ages 12 to 17 had anxiety, depression, or both, a roughly 33% increase since 2016, according to an analysis by health-policy research group KFF. The following year, 42% of U.S. high school students said they felt persistently sad or hopeless, 29% reported experiencing poor mental health, 22% had seriously considered suicide, and 10% had attempted suicide, according to the U.S. Centers for Disease Control and Prevention (CDC).
These data are sometimes used to argue that kids aren’t as tough as they used to be. But kids see it differently. “Other generations are telling us that we’re a weak generation … and we haven’t lived through this and that,” says 16-year-old Jasmine. “But we’re in a new world experiencing new things … They haven’t experienced half of what we’ve experienced.”
It’s not only big, macro-level societal shifts that are having an effect. CDC data also show that personal traumas like sexual violence, bullying, and social isolation are concerningly common, particularly among teen girls and teens who do not identify as straight—two groups at particularly high risk for poor mental health.
“Even though they may not be as bad as what adults are dealing with, we still have problems too. Adults shouldn’t disregard our feelings because we go through things as well. [They don’t] get to minimize our problems.” —Malayah, 14, Georgia

Photographs and Interviews by Robin Hammond | Text by Jamie Ducharme
It was a typical night for research scientist Benjamin Baird, then a graduate student at the University of California, Santa Barbara. It was late, around 1 a.m., and he was reading everything he could find online about his dissertation topic, which was human consciousness. That’s when he came across some information about lucid dreaming.
“I’d never considered that something like that was possible,” Baird says. “I could be conscious and aware while I was asleep and in a dream? It blew my mind.”
The story suggested that people get in the habit of counting their fingers. If you don’t have five of them, you’re probably dreaming, the article said. Baird fell asleep thinking about what he’d read. Soon he was in a dream world. He looked at his hand. One. Two. Three. Wait? Three? “Oh,” he thought, “I’m asleep. I’m dreaming!” “Then I took off flying and woke up,” he says.
The experience left Baird hungry for more. Now, several years later, he’s a researcher at the University of Texas focusing on cognitive neuroscience. His work, along with that of others, has helped scientists to figure out ways to harness lucid dreaming for improved mental health and physical performance.
Lucid Dreaming Goes Mainstream
Way back in the 1970s, when Stanford psychophysiologist Stephen LaBerge began developing techniques that allowed him and others to control their dreams, the greater scientific world was skeptical, at least at first.
That’s because it was difficult to prove that LaBerge and others were truly having lucid dreams. For many researchers at the time, lucid dreaming seemed about as scientifically plausible as levitation. That is, until LaBerge and other researchers found a few lucid dreamers who were willing to sleep in a lab with all sorts of sensors attached to their bodies, including their eyelids.
The word for a TV remote is marote; for chicken, it’s chimpken; and for the Aperol Spritz cocktail, it’s app-a-ball spitz-ee. Shrimp is swimps, hair ties are hair gigglies, and Starbucks is Starbonks.
All of these are examples of so-called marriage language, the weird and oftentimes embarrassing dialects people in long-term relationships develop to communicate with their partners.
It’s typically a mishmash of inside jokes (giving friends and family members nicknames) and purposeful malapropisms (slipping up and mispronouncing bird as birb), plus faux abbreviations (a shower is a show show, spinach becomes spinch) and code words for cruder terms (every couple seems to have their own word for passing gas).
Most people give their partner affectionate nicknames, and as many as two-thirds of couples use romantic baby talk to signal closeness. Marriage language is the natural extension of these behaviors, a personalized lexicon built up between two people who have spent so much time together that they’ve started using their own dialect.
Lilianna Wilde and Sean Kolar, musicians and content creators from Los Angeles who have been married for almost five years, said that their marriage language started to develop when they first moved in together after a year of long-distance dating. First came “show show” — Sean’s nickname for a shower. Then there was chick rotiss for a rotisserie chicken, pantaloons for jeans, and an oopsie for a sidewalk curb.
Oh yes, darling, let’s have a delicious “app-a-ball spitz-ee.” Credit: Lilianna Wilde
Mocktails and sunscreen, masking and mindfulness — for those of us who strive to be upright, responsible citizens, the constant reminders of various ways we ought to be good are all around us. They’re almost enough to make you forget the pleasures of being a little bit bad. We asked 16 writers — most of them respectable adults — about the irresponsible, immoral, indulgent things they do. Transgression has the power to teach us something about how we ought to live. But it’s also just … fun?
I’m not a drunk. And I’m not a liar. But I am, unequivocally, a drunk liar. After a few tequila shots and an audience of strangers (usually men), I’ve become: a Harvard graduate (summa cum laude), an up-and-coming model, an athlete, a virgin, a Kennedy. It’s adult make-believe. It’s free entertainment. There is something irresistible about telling a big, wet, flapping, booze-induced lie to people (men) that you almost certainly will never see again. Almost. When I was 19, I convinced a man I was British and kept the lie going for three dates. Years later, we bumped into each other in public, in my pitiful American form, resulting in a humiliating exposition that left everyone questioning my sanity. But who, ultimately, is the real idiot here?
In recent years, the word “storytelling” has been thoroughly absorbed by the language of commerce, reshaped into self-aggrandizing doublespeak for “selling.” Which only increases the countervailing need for storytellers in the more artful, ancient, and magical mold of Neil Gaiman, who conjures fictions that stir up our primal fears and darkest desires, our subconscious yearnings and unspoken fantasies. Whether he’s turning death and dream into flesh-and-blood characters, as in his classic comic series, “The Sandman”; populating the modern world with figures from ancient myth in fiction best sellers like “American Gods” and “Anansi Boys”; or conjuring eerie new children’s fairy tales and ghost yarns (“Stardust,” “Coraline”), the British-born author’s genre-jumping writing is a constant reminder that the stories that linger longest, that move us most profoundly, are often the ones that can’t be turned into means but function as ends in themselves. “Myths and stories are how we have made sense of the world for as long as we’ve been wandering the planet,” says Gaiman, 61, who helped oversee Netflix’s long-awaited “The Sandman” series, which premieres on Aug. 5. “And right now, making sense of the world is somewhere between difficult and impossible.”
For the last five or six years, we’ve been living through what feels like almost unfathomable turmoil, and I think a lot of people see this period as an unprecedented chapter in the human story. But when it comes to stories, I basically believe in Ecclesiastes’ “There is nothing new under the sun.” So my question to you is whether you think we are living in a new story — or is it just new to us?

This reminds me of something that happened after the Sept. 11 attacks. When we could fly again, I flew to Trieste, Italy, for a conference. I remember going into a display of Robert Capa photographs taken in that area during World War II. Until that moment, I had regarded World War II as being unimaginably distant in time. It was this thing that had happened in history, that had happened to my family — basically all of them were killed; a couple of outliers made it to England — but that was history. That happened then. But there was something very strange about looking at those Robert Capa photos post-9/11 because they made me go, Those people are us. I feel the same way today. History is now. But I’m also getting more obsessive about human beings over huge swaths of time. Part of that came out of being on the Isle of Skye during the serious U.K. lockdown. On Skye, if there’s a rock somewhere, it’s probably because somebody put it there. I realized that the rock that I was using to keep the lid on my dustbin was a stone that had been dragged around. People have been in this place for thousands and thousands of years, and in this bay I’m living in, they’ve left behind rocks! Realizing that about the rocks makes you take the long view. Which is that the human race is mostly people just trying to live their lives, and that bad [expletive] is going to happen. That then moves you into other territory.
Can a poster save the burning planet? Amid the polemics, politics, and the deluge of climate change data, can a remarkable piece of visual art break through the noise and inspire action?
An illuminating exhibition at New York City’s Poster House offers a nuanced, if inconclusive, exploration about the utility (or futility) of printed propaganda in tackling the worsening environmental crisis. “Every poster in this exhibition is a failure—not in the sense that they failed in their graphic intent of communicating a message, but rather that they failed to successfully modify behavior,” writes curator Tim Medland to introduce We Tried to Warn You! Environmental Crisis Posters, 1970–2020 (on view until Feb. 25, 2024). “Nevertheless, these impactful images have shaped the bounds of public debate on environmental issues, drawing attention to distinct and particular concerns.”
A graphic history of activism
Curiously, the most influential “climate poster” in history isn’t a poster at all, but a photograph. Taken aboard Apollo 8 by NASA astronaut Bill Anders in 1968, the first color image of Earth showed the fragility of a beautiful blue planet in deep space. “Earthrise,” as the photo is known, sparked environmental movements and became a recurring motif in environmental posters, including two in the exhibition: Milton Glaser’s “Give Earth A Chance” (1970), created for the Environmental Action Coalition, and Gunter Rambow’s “All the Earth Speaks Up” (1983), created for Germany’s Green Party. (NASA has released several versions of “Earthrise” over the years.)
Before we glimpsed a snapshot of our profound interconnectedness, conservation efforts were localized, tactical rather than existential. Prior to 1968, posters about the natural world took the form of travel vignettes, such as M. Pallandre’s romanticized 1890s rendering of the thermal baths of the Pyrenees, or charming silk-screened preserve-our-forest appeals created during the U.S. Works Progress Administration in the 1930s. And in the 1940s and 1950s, the U.S. Forest Service disseminated many posters featuring Smokey Bear, America’s most enduring wildfire-prevention poster child.
We all know the line: For more than 150 million years, dinosaurs ruled the Earth. We imagine bloodthirsty tyrannosaurs ripping into screaming duckbills, gigantic sauropods shaking the ground with their thunderous footfalls, and spiky stegosaurs swinging their tails in a reign of reptiles so magnificent, it took the unexpected strike of a six-mile-wide asteroid to end it. The ensuing catastrophe handed the world to the mammals, our ancestors and relatives, so that 66 million years later we can claim to have taken over what the terrible lizards left behind. It’s a dramatic retelling of history that is fundamentally wrong on several counts. Let’s talk about some of the worst rumors and what really happened in the so-called “Age of Dinosaurs.”
The oldest dinosaurs we know about are around 235 million years old, from the middle part of the Triassic Period. Those reptiles didn’t rule anything. From recent finds in Africa, South America, and Europe, we know that they were no bigger than a medium-sized dog and were lanky, omnivorous creatures that munched on leaves and beetles. Ancient relatives of crocodiles, by contrast, were much more abundant and diverse. Among the Triassic crocodile cousins were sharp-toothed carnivores that chased after large prey on two legs, “armadillodiles” covered in bony scutes and spikes, and beaked, almost ostrich-like creatures that gobbled up ferns.
Even as early dinosaurs began to evolve into the main lineages that would thrive during the rest of the Mesozoic, most were small and rare compared to the crocodile cousins. The first big herbivorous dinosaurs, which reached about 27 feet in length, didn’t evolve until near the end of the Triassic, around 214 million years ago. But everything changed at the end of the Triassic. Intense volcanic eruptions in the middle of Pangaea altered the global climate; the gases released into the air caused the world to swing between hot and cold phases. By then, dinosaurs had evolved warm-blooded metabolisms and insulating coats of feathers, leaving them relatively unfazed through the crisis, while many other forms of reptiles perished. Had this mass extinction not transpired, we might have had more of an “Age of Crocodiles”—or at least a very different history with a much broader cast of reptilian characters. The only reason the so-called Age of Dinosaurs came to be is because they got lucky in the face of global extinction.
It’s strange to talk about dinosaurs “dominating” an ocean world. While sea levels have risen and fallen over time, the seas make up about 71 percent of Earth’s surface and contain more than 330 million cubic miles of water. The claim that dinosaurs, as diverse as they were, were the dominant form of life on Earth only makes sense if we ignore that three-quarters of our planet is ocean.
.
(Clockwise from top) A T. rex model, T. rex skull, and Triceratops skull on display at the Museum of Natural History in Vienna, Austria. DepositPhotos
It didn’t take long for Mario Zechner to prove the government wrong. In May, the independent software developer was listening to a radio interview with Austria’s labor minister, Martin Kocher, who said the government would build a new database that would help people find the cheapest milk, eggs, and other supermarket products to help fight soaring food prices. However, the planned system would take months to build and cover only a handful of food types. Zechner decided to take action.
Two hours after hearing the interview, Zechner had built the first prototype of a comparison system, pulling the cost of 22,000 items from the websites of Austria’s two biggest supermarket chains. “I decided to just sit down in the afternoon and see how hard it really can be,” Zechner says. The result was Heisse Preise (which translates as “Hot Prices”), with Zechner open-sourcing the project on GitHub. “From then on, it kind of escalated,” he says.
Months later, Heisse Preise has grown enormously, demonstrating the power of citizen-developed tools and what can be achieved when data is opened up to everyone. The comparison site now lists prices from 10 Austrian supermarket chains, plus four in neighboring Germany and Slovenia. Heisse Preise includes more than 177,000 items. Thanks to data provided by an anonymous contributor and local press, item pricing history goes back to 2017. Zechner’s creation of the tool came as Europe’s food retailers and governments have clashed over rising prices and the cost of living.
Perhaps most significantly, Zechner’s tool has shone a light on the opaque world of price changes by supermarkets, allowing price increases and decreases to be tracked. The transparency, Zechner and others say, shows there can be little difference in prices at some major supermarkets, and within days of an item changing price, competitors can mirror the change.
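The core of that kind of price-change tracking is simple: periodically snapshot each store's catalog, then diff successive snapshots to surface items whose price moved. A minimal sketch of the idea in Python follows; the product names and data shapes here are hypothetical illustrations, not Heisse Preise's actual schema or code.

```python
# Minimal sketch of price-change tracking: diff two catalog snapshots.
# Data shapes and product names are hypothetical, not Heisse Preise's schema.

def diff_prices(old: dict[str, float], new: dict[str, float]) -> dict[str, tuple[float, float]]:
    """Return items present in both snapshots whose price changed, mapped to (old, new)."""
    return {
        item: (old[item], new[item])
        for item in old.keys() & new.keys()  # only items present in both snapshots
        if old[item] != new[item]
    }

# Two hypothetical snapshots, e.g. crawled a few days apart
snapshot_monday = {"milk 1l": 1.19, "eggs 10pk": 3.49, "butter 250g": 2.59}
snapshot_friday = {"milk 1l": 1.29, "eggs 10pk": 3.49, "butter 250g": 2.39}

for item, (was, now) in sorted(diff_prices(snapshot_monday, snapshot_friday).items()):
    direction = "up" if now > was else "down"
    print(f"{item}: {was:.2f} -> {now:.2f} ({direction})")
```

With snapshots kept for every crawl, the same diff run pairwise across stores is what makes mirrored price moves between competitors visible within days.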
Data gathered by Heisse Preise and other newly emerged DIY comparison sites has fed into the investigations of the Bundeswettbewerbsbehörde, the Austrian Federal Competition Authority, which has been probing the food industry since October 2022. The authority, which is due to present its full findings later this month, has already suggested the government should introduce new laws to make shops publish their price data. The authority also says it “can be assumed” that supermarkets themselves crawl the websites of competitors and use that information to set their own prices.
“This data is enormously useful for anyone interested in serious competition policies,” says Leonhard Dobusch, the academic director at the Momentum Institute, an Austrian progressive think tank. “It really allows a peek into pricing strategies [and] price coordination tactics.”
Every year, the roughly 200 Renaissance fairs and festivals held across the United States and abroad attract several million visitors. United by their raucous entertainment, elaborate costumes, and setting in the distant past, these outdoor events boast a surprising backstory.
The country’s first Renaissance Pleasure Faire, staged in Los Angeles in May 1963, was inextricably linked to the Red Scare, a Cold War-era mass hysteria prompted by the specter of communism. It was the brainchild of Phyllis Patterson, a history, English, speech, and drama teacher who’d balked at having to sign a political loyalty oath to work in California public schools. Though Phyllis later told the press she’d left teaching in 1960 to become a stay-at-home mom, her son Kevin Patterson says this was only “part of the story.” In truth, he adds, “she felt strongly about the harms and unconstitutionality of the HUAC”—the House Un-American Activities Committee—and McCarthyism overall, “and was therefore uncomfortable taking a loyalty oath.”
Many of the volunteers involved in the first fair were residents of Laurel Canyon, a haven for left-leaning creatives in the Hollywood hills. Some had been blacklisted or “graylisted” as communists, leaving them unable to find work in the film industry. The fair presented an opportunity for these individuals to use their skills and participate in a project that celebrated free thinking.
After leaving her teaching position, Phyllis started working at the Wonderland Youth Center in Laurel Canyon, where she ran a theater program for children. She held classes in her backyard, pursuing “a vision of how she could open youngsters’ eyes to their own dramatic and artistic potential by using the great themes of the past,” wrote Kevin in the foreword to a 2013 book about the fair.
Through her work at the youth center, Phyllis met actors Robert and Doris Karnes, who served on its Board of Directors. The couple had also suffered the consequences of Joseph McCarthy-era suspicion and repression. In 1959, HUAC called Doris to testify about “an alleged communist infiltration of the youth center,” writes historian Rachel Lee Rubin in Well Met: Renaissance Faires and the American Counterculture. Two years later, the committee issued a report identifying Doris and 19 others as communists or communist sympathizers. The accusations sparked debate among locals: Some wanted to pull their children from drama classes, while others rejected the blackballing and a proposed amendment barring suspected communists from the center.