FOR THE WEEK OF NOV. 11, 2019
Users beware: Alexa, Siri and Echo respond to laser pointers aimed from outside a home
Read about a consumer electronics or tech product and list two things you learn.
Find science, engineering or technology news of interest and tell why it's appealing.
Now try to spot coverage that has nothing to do with digital communication, automation or similar technology -- if you can!
If your family has an electronic "assistant" such as Siri or Alexa, don't keep it in direct line with a window to the outside. New research supports that tip, even though it sounds odd. These devices, it turns out, are vulnerable to intrusion via laser pointers or even a strong flashlight; an intruder can use that opening to unlock doors, open garages, visit websites and start vehicles if the assistant is programmed to handle those tasks. That's because shining a low-power laser into these systems, including Google Assistant and Amazon Echo, works the same as activating them by voice. Pranksters or criminals at a window can inject commands from as far as 360 feet.
University of Michigan researchers and collaborators in Japan last week issued a report, video and website showing how it works. They also contacted Amazon, Google and Apple, which are studying the research. Campus experts found that microphones in the smart devices respond to light as if it were sound. Each mic has a small plate called a diaphragm that moves when sound hits, creating a digital signal. It reacts the same way to incoming light. Researchers admit they don't yet know the physics of why a microphone interprets light as sound.
Using a $15 laser pointer or a more sophisticated hand-held laser or a focused flashlight, researchers accessed the digital devices, which have gained popularity as household conveniences. In a dramatic example, they took over a Google Home unit on the fourth floor of an office building from the top of a University of Michigan bell tower more than 200 feet away. All the equipment needed to hack each system is sold on Amazon. A telephoto lens and tripod can focus the laser for long-range attacks. The intrusions are silent, with a flashing blue spot on the receiving device as the only clue. While voice assistants typically give an audible response, a hacker could send an initial command that turns the volume to zero or triggers Amazon's "whisper mode."
Japanese scholar says: "Anything that acts on sound commands will act on light commands." -- Takeshi Sugawara, cyber security specialist at the University of Electro-Communications
Google responds: "Protecting our users is paramount, and we're always looking at ways to improve the security of our devices."
Michigan researcher says: "You can hijack voice commands. Now the question is just how powerful your voice is, and what you've linked it to." -- Sara Rampazzi, Department of Electrical Engineering and Computer Science