Meeaaaoow rises like a question mark before dawn. Anyone living with a cat knows their sounds: broken chirrups like greetings, low growls that warn, purrs stitched into sleepy conversation. Ethologists have organized feline sounds that share acoustic and contextual qualities into more than 20 groupings, including the meow, the hiss, the trill, the yowl, and the chatter. Any individual meow belongs, academically speaking, to a broad “meow” category, which itself contains many variations. The house cat’s vocal repertoire is far greater than that of its largely silent wild cousins. Researchers have even begun to study whether cats can drift into regional dialects, the way human accents bend along the Hudson or the Thames. And just as humans gesticulate, shrug, frown, and raise their eyebrows, cats’ fur and whiskers write subtitles: a twitching tail declares excitement, flattened ears signal fear, and a slow blink promises peace. Felis catus is a chatty species that, over thousands of years of domestication, has pivoted its voice toward the peculiar primate that opens the fridge.

Now imagine pointing your phone at that predawn howl and reading: “Refill bowl, please.” Last December, Baidu—a Chinese multinational company that
specializes in Internet services and artificial intelligence—filed a patent application for what it describes as a method for transforming animal vocalizations into human language. (A Baidu spokesperson told Reuters last month that the system is “still in the research phase.”) The proposed system would gather animal signals and process them: it would store kitten or puppy talk for “I’m hungry” as code, then pair it not only with motion-sensing data such as tail swishes but also with vital signs such as heart rate and core temperature. All of these data would get whisked through an AI system and blended before emerging as plain-language phrases in English, Mandarin, or any other tongue.
The dream of decoding cat speech is much older than deep learning. By the early 20th century, meows had been recorded on wax cylinders, and in the 1970s John Bradshaw, a British anthrozoologist, began more than four decades of mapping how domestic cats tell us—and each other—what they mean. By the 1990s, he and his then doctoral student Charlotte Cameron-Beaumont had established that the distinct domestic “meow,” largely absent between adults in feral colonies, is a bespoke tool for managing humans. Even domestic cats rarely use it with each other, though kittens do with their mothers. Yet for all that anecdotal richness, the formal literature remained thin: there were hundreds of papers on bird song and dozens on dolphin whistles, but only a scattering on feline phonology until machine learning revived the field in the past decade.
One of the first hints that computers might crack the cat code came in 2018, when AI scientist Yagya Raj Pandeya and his colleagues released CatSound, a library of roughly 3,000 clips covering 10 types of cat calls labeled by the scientists, from hiss and growl to purr and mother call. Each clip went through software trained on musical recordings to describe a sound’s “shape”—how its pitch rose or fell and how long it lasted—and a second program cataloged them accordingly. When the system was tested on clips it hadn’t seen during training, it identified the right call type around 91 percent of the time. The study showed that the 10 vocal signals had acoustic fingerprints a machine can spot, giving researchers a proof of concept for automated cat-sound classification and eventual translation.
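The core idea—turn each clip into a contour of rising or falling pitch, then match it against labeled examples—can be sketched in a few lines. The toy version below is not the CatSound pipeline: every “recording” is synthetic, and a simple nearest-centroid rule stands in for the trained model, just to show how a pitch track becomes an acoustic fingerprint.

```python
# Toy sketch of contour-based call classification (not CatSound itself):
# synthetic "calls" are reduced to a pitch contour, summarized as a
# feature vector, and matched to the nearest labeled centroid.
import numpy as np

SR = 8000  # sample rate, Hz

def tone(freqs, dur=0.5):
    """Synthesize a call whose pitch glides through `freqs`."""
    n = int(SR * dur)
    f = np.interp(np.arange(n), np.linspace(0, n, len(freqs)), freqs)
    return np.sin(2 * np.pi * np.cumsum(f) / SR)

def pitch_contour(x, win=256):
    """Dominant frequency per window: a crude pitch track via FFT peaks."""
    freqs = np.fft.rfftfreq(win, 1 / SR)
    return np.array([freqs[np.argmax(np.abs(np.fft.rfft(x[i:i + win])))]
                     for i in range(0, len(x) - win, win)])

def features(x):
    """Summarize a contour by its mean pitch and overall slope."""
    c = pitch_contour(x)
    return np.array([c.mean(), np.polyfit(np.arange(len(c)), c, 1)[0]])

# Tiny labeled "training set": rising chirps vs. flat low growls.
train = {
    "chirp": [tone([400, 900]), tone([450, 1000])],
    "growl": [tone([150, 150]), tone([180, 170])],
}
centroids = {call: np.mean([features(x) for x in clips], axis=0)
             for call, clips in train.items()}

def classify(x):
    f = features(x)
    return min(centroids, key=lambda call: np.linalg.norm(f - centroids[call]))

print(classify(tone([500, 950])))  # a rising call not seen in training: "chirp"
```

A real system extracts far richer features (the study leaned on music-analysis software), but the shape-then-match logic is the same.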
Momentum built quickly. In 2019 researchers at the University of Milan in Italy published a study focused on the one sound aimed squarely at Homo sapiens. The research sliced the meow into three situational flavors: “waiting for food,” “isolation in an unfamiliar environment,” and “brushing.” By turning each meow into a set of numbers, the researchers revealed that a “feed me” meow had a noticeably different shape from a “where are you?” meow or a “brush me” meow. After they trained a computer program to spot those shapes, the researchers tested the system much as Pandeya and colleagues had tested theirs: it was presented with meows not seen during training, all hand-labeled based on circumstances such as hunger or isolation. The system correctly identified the meows up to 96 percent of the time, and the research confirmed that cats really do tweak their meows to match what they’re trying to tell us.
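The train-on-some, test-on-the-rest procedure can be mimicked in miniature. In the sketch below the “meows” are invented two-number feature vectors (say, mean pitch and duration) drawn around three made-up context clusters, and a nearest-centroid classifier stands in for the Milan group’s model; only the evaluation recipe is faithful.

```python
# Minimal sketch of held-out evaluation with hand-labeled examples.
# All labels and numbers are invented; this is not the Milan pipeline.
import random
import numpy as np

random.seed(0)
rng = np.random.default_rng(0)

# Three invented "contexts", each a cluster in a 2-D feature space
# (mean pitch in Hz, duration in seconds).
means = {"food": (600, 0.8), "isolation": (900, 1.4), "brushing": (400, 0.5)}
data = [(rng.normal(m, (40, 0.1)), label)
        for label, m in means.items() for _ in range(30)]
random.shuffle(data)

train, test = data[:60], data[60:]  # fit on 60 meows, judge on 30 unseen ones

# Nearest-centroid classifier fit on the training split only.
centroids = {lab: np.mean([x for x, l in train if l == lab], axis=0)
             for lab in means}

def predict(x):
    return min(centroids, key=lambda lab: np.linalg.norm(x - centroids[lab]))

correct = sum(predict(x) == lab for x, lab in test)
print(f"held-out accuracy: {100 * correct / len(test):.0f}%")
```

With cleanly separated toy clusters the held-out accuracy lands near 100 percent; real meows overlap far more, which is part of why classifying them is genuinely hard.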
The research was then scaled to smartphones, turning kitchen-table curiosity into consumer AI. Developers at software engineering company Akvelon, including a former Alexa engineer, teamed up with one of the study’s researchers to create the MeowTalk app, which they claim can translate meows in real time. MeowTalk has used machine learning to categorize thousands of user-submitted meows by common intent, such as “I’m hungry,” “I’m thirsty,” “I’m in pain,” “I’m happy,” or “I’m going to attack.” A 2021 validation study by MeowTalk team members claimed success rates near 90 percent. But the app also lets skeptical owners flag translations as incorrect, a reminder that the cat may be asking for something else entirely. A probability score can simply reflect how closely a sound matches a learned pattern, not necessarily what the animal actually intends.
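That last caveat is easy to see with invented numbers (this is not MeowTalk’s code): suppose a classifier turns the distances between a new sound and each intent’s learned “template” into probability-like scores. Because the scores are normalized to sum to 1, a confident-looking top label emerges no matter how poorly every template fits.

```python
# Toy illustration: similarity scores always pick a winner, even when
# the sound matches nothing well. All distances here are made up.
import math

def softmax_scores(distances):
    # Smaller distance -> larger score; normalize so scores sum to 1.
    exps = {intent: math.exp(-d) for intent, d in distances.items()}
    total = sum(exps.values())
    return {intent: e / total for intent, e in exps.items()}

near = {"hungry": 4.0, "thirsty": 5.0, "happy": 6.0}  # plausible match
far = {k: 2 * d for k, d in near.items()}             # matches nothing well

print(softmax_scores(near))  # "hungry" tops the list
print(softmax_scores(far))   # still "hungry", with an even higher score
```

Doubling every distance makes the sound a worse match for every intent, yet the normalized “hungry” score rises from about 0.67 to about 0.87: the number measures relative pattern similarity, not the cat’s actual intent.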