This Jammer Wants to Block Always-Listening AI Wearables. It Probably Won’t Work

A new startup called Deveillance (pronounced dee-veil-ance) announced its first gadget earlier this week: a sleek, portable tabletop orb that aims to stop nearby devices from recording voices.

Called Spectre I, the microphone jammer is a combination of ultrasonic frequency emitters and AI smarts designed to not only block devices trying to capture someone’s speech, but also detect and log nearby microphones, all while being small enough to carry around. It’s still very much in development, but the company expects to sell the Spectre I in the second half of 2026 for $1,199.

The announcement caused quite a stir on social media. Some boosted it as cyberpunk-style resistance tech against the ever-growing category of always-listening AI wearables, but it also drew a firestorm of skepticism from blue-check critics on X who were eager to call it too good to be true.

“I didn’t expect it to go this viral,” says Aida Baradari, a recent Harvard graduate who founded Deveillance and developed the Spectre I. “I’m grateful that I’ve been given the opportunity to work on this. I’m also really grateful, honestly, that people care.”

Baradari was motivated to build the device as a counter to these always-listening devices that the AI boom has ushered in, like the bracelet from Amazon-owned Bee AI or the Friend pendant.

“People should have a choice over what they want to share, especially in conversations,” Baradari says. “If we can’t converse anymore without feeling scared of saying something that’s potentially taken out of context or wrong, then how are we going to build human connection in this new age?”

Private Time

It’s easy to see why that anxiety about privacy has heightened, as government surveillance is in vogue in the US. ICE is building out its own surveillance systems around everything from social media to everyone’s phones to its own employee roster. That tension runs deep within the private sector, too, as big tech fuels ICE while also collecting, buying, and using every scrap of your personal data.

Last month, when home security camera company Ring ran a Super Bowl commercial about using its cameras to find lost dogs, viewers were appalled at the privacy implications of a neighborhood panopticon and responded with immediate pushback. It caused Ring to backpedal. A week later, the company announced it would no longer pursue a planned partnership with the similarly controversial security company Flock Safety.

“People are kind of waking up to the idea that they may not have privacy at any given time,” says musician and YouTuber Benn Jordan, who makes videos about security and privacy issues like audio jammers and Flock security cameras.

Like the hobbyist developer who created an app to warn people if someone is wearing smart glasses nearby, Spectre I is another effort to give users a way to take back control of their privacy. But for a device that uses AI and speakers to block other AI and microphones, the technology has to be proven to work first. Skeptics say Deveillance’s claims appear far-fetched.

“These are some pretty big promises,” Jordan says. “Unfortunately, they’re kind of up against physics.”

Jamming Out

Ultrasonic microphone jammers date back to the Cold War, built and refined over the decades by intelligence agencies and DIY tinkerers alike. They have also become a cottage industry: you can buy jammers on sites like Alibaba or build your own using software from GitHub.

Audio jammers tend to be bulky, thick bricks, because frequency emitters and power sources take up space. Make the device powerful enough to work, and it’s likely too large to be discreet. Too small, and the jammer won’t have the juice to properly disrupt a microphone.
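For the curious, the physical trick traditional ultrasonic jammers rely on can be demonstrated in a few lines of code. This is not Deveillance’s method, just an illustrative sketch with made-up values: a microphone’s slight nonlinearity mixes two inaudible ultrasonic tones together, producing a difference tone down in the audible band that muddies the recording.

```python
import math

FS = 96_000               # sample rate in Hz (must exceed twice the tone frequencies)
N = 9_600                 # 0.1 s of audio, giving 10 Hz frequency resolution
F1, F2 = 25_000, 25_300   # two ultrasonic tones; their difference (300 Hz) is audible

def tone(f, n):
    return math.sin(2 * math.pi * f * n / FS)

# In the air: two ultrasonic tones, inaudible, with no energy at 300 Hz.
clean = [0.5 * tone(F1, n) + 0.5 * tone(F2, n) for n in range(N)]

# A microphone's slight nonlinearity (modeled here as a small quadratic term)
# multiplies the tones together, creating a tone at |F2 - F1| = 300 Hz.
recorded = [x + 0.2 * x * x for x in clean]

def dft_bin_mag(samples, freq):
    """Magnitude of a single DFT bin at `freq` Hz."""
    re = sum(s * math.cos(2 * math.pi * freq * n / FS) for n, s in enumerate(samples))
    im = sum(s * math.sin(2 * math.pi * freq * n / FS) for n, s in enumerate(samples))
    return math.hypot(re, im) / len(samples)

print(f"300 Hz energy in the air:      {dft_bin_mag(clean, 300):.4f}")
print(f"300 Hz energy in the recording: {dft_bin_mag(recorded, 300):.4f}")
```

The audible artifact exists only inside the recording chain, which is why a jammer can be silent to bystanders yet still swamp a nearby microphone, provided it has enough power and line of sight.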

“We’re aiming for a device that’s light and small, though this might end up being hard to do due to constraints in physics,” Baradari wrote in a text to WIRED.

To bridge that gap, Baradari says the Spectre I will use AI to garble speech rather than simply drown it in a wall of sound. The device sends out AI-generated cancellation signals meant to fool automatic speech recognition (ASR) systems. While the plan is to make the emitters effectively silent, the current working version of the Spectre I produces an audible hum.

The AI is being used to target a range of ultrasonic frequencies specifically tailored to average human voices. “The result is a muddled recording overall—so environmental noises are also muddled,” Baradari wrote. “In traditional jammers, voices can be reconstructed, or the jamming can sometimes be bypassed by ASR systems. With our method, we are making sure that doesn’t happen.”

But Melissa Baese-Berk, a linguistics professor at the University of Chicago, says, “There’s so much variation in people’s voices. It’s not the case that there’s a specific signal that’s like the ‘voice signal.’”

Baradari claims the Spectre doesn’t track any voices or what people are saying, noting that the algorithm was optimized during Deveillance’s internal development and training. After all, there is no microphone inside the Spectre I. “The AI is optimized to send out signals that can’t be reconstructed in post-processing,” she says.

Deveillance also claims the Spectre can find nearby microphones by detecting radio frequencies (RF), but critics say finding a microphone via RF emissions is not effective unless the sensor is immediately beside it.

“If you could detect and recognize components via RF the way Spectre claims to, it would literally be transformative to technology,” Jordan wrote in a text to WIRED after he built a device to test detecting RF signatures in microphones. “You’d be able to do radio astronomy in Manhattan.”

Deveillance is also looking at ways to integrate nonlinear junction detection (NLJD), a technique that uses high-frequency radio signals to find hidden mics and bugs. NLJD detectors are expensive and used primarily in professional contexts like military operations.

Even if a device could detect a microphone’s exact location, objects around a room can change how the frequencies spread and interact. The emitted frequencies could also be a problem. There haven’t been adequate studies to show what effects ultrasonic frequencies have on the human ear, but some people and many pets can still hear them and find them obnoxious or even painful. Baradari acknowledges her team needs to do more testing to see how pets are affected.

“They simply cannot do this,” engineer and YouTuber Dave Jones (who runs the channel EEVblog) wrote in an email to WIRED. “They are using the classic trick of using wording to imply that it will detect every type of microphone, when all they are probably doing is scanning for Bluetooth audio devices. It’s totally lame.” Baradari reiterates that the Spectre uses a combination of RF and Bluetooth Low Energy to detect microphones.

WIRED asked Baradari to share any evidence of the Spectre’s effectiveness at identifying and blocking microphones in a person’s vicinity. Baradari shared a few short video clips of people putting their phones to their ears to listen to audio clips—which were presumably jammed by the Spectre—but the videos do little to prove that the device works.

Future Imperfect

Baradari has taken the critiques in stride, acknowledging that the tech is still in development. “I actually appreciate those comments because they’re making me think and see more things as well,” Baradari says. “I do believe that with the ideas that we’re having and integrating into one device, these concerns can be addressed.”

People were quick to poke fun at the Spectre I online, calling the technology the cone of silence from Dune. Now, the Deveillance website reads, “Our goal is to make the cone of silence become reality.”

John Scott-Railton, a cybersecurity researcher at Citizen Lab, who is critical of the Spectre I, lauded the device’s virality as an indication of the real hunger for these kinds of gadgets to win back our privacy.

“The silver lining of this blowing up is that it is a Ring-like moment that highlights how quickly and intensely consumer attitudes have shifted around pervasive recording devices,” says Scott-Railton. “We need to be building products that do all the cool things that people want, but that don’t have the massive privacy and consent violation undertow. You need device-level controls, and you need regulations of the companies that are doing this.”

Cooper Quintin, a senior staff technologist at the Electronic Frontier Foundation, echoed those sentiments, even if critics believe Deveillance’s efforts to be flawed.

“If this technology works, it could be a boon for many,” Quintin wrote in an email to WIRED. “It is nice to see a company creating something to protect privacy instead of working on new and creative ways to extract data from us.”

Publisher: wired.com