{"id":5610,"date":"2019-11-19T12:59:39","date_gmt":"2019-11-19T20:59:39","guid":{"rendered":"http:\/\/sites.law.berkeley.edu\/thenetwork\/?p=5610"},"modified":"2019-11-19T12:59:39","modified_gmt":"2019-11-19T20:59:39","slug":"siri-google-assistant-and-amazon-alexa-can-be-hijacked-with-light","status":"publish","type":"post","link":"https:\/\/sites.law.berkeley.edu\/thenetwork\/2019\/11\/19\/siri-google-assistant-and-amazon-alexa-can-be-hijacked-with-light\/","title":{"rendered":"Siri, Google Assistant, and Amazon Alexa can be Hijacked with Light"},"content":{"rendered":"<p>Researchers have recently found that voice assistant technology is <a href=\"https:\/\/www.forbes.com\/sites\/thomasbrewster\/2019\/11\/05\/amazon-alexa-google-home-hacked-with-a-laser\/#30743f5927d2\">vulnerable to hijacking<\/a> by cheap lasers.<\/p>\n<p>Researchers from Tokyo&#8217;s University of Electro-Communications and the University of Michigan have bypassed the built-in security mechanisms of voice-controlled devices, including popular smartphones. Simply <a href=\"https:\/\/lightcommands.com\/\">shining a bright laser<\/a> at a device\u2019s microphone is interpreted by its system as sound.<\/p>\n<p>The researchers concluded that by encoding a command in the light beam, which induces electrical signals in a device\u2019s microphone, hijackers can control the device because the system interprets the signal as a genuine voice command. The cheap laser pointers used for this test cost between $13.99 and $17.99. They were coupled with a $27.99 sound amplifier used to deliver a specific instruction to the speakers. A laser driver was also connected to control the laser\u2019s intensity; this was the most expensive tool of all, costing $339.<\/p>\n<p>The team tested voice-controlled speakers and smartphones from major tech firms, running Google&#8217;s Assistant, Amazon&#8217;s Alexa, and Apple&#8217;s Siri. 
The list is not exhaustive, but includes Google Home, various Amazon Echo models, the Apple HomePod, and Facebook&#8217;s Portal speaker, which runs Alexa. They also tested an iPhone XR, a Samsung Galaxy S9, and a Google Pixel 2.<\/p>\n<p>The reliability of these gadgets&#8217; built-in security layers is now in question. The researchers also discovered that tablets, phones, and speakers vary in their degree of vulnerability when the laser is shone from a distance, including through windows. Of all the devices tested, Google Home was hijacked from the farthest away: 110 meters.<\/p>\n<p>It may relieve the anxieties of consumers using iPhones, iPads, and a few Android smartphones that these devices require an extra layer of authentication, or a \u201cwake word,\u201d to activate the device before hijackers can trick the system. This additional safeguard against invasion requires an attacker to first issue a wake-up command such as &#8220;Hey Siri&#8221; or &#8220;OK Google.&#8221; Unfortunately, these additional security measures are missing from the smart speakers.<\/p>\n<p>The researchers went to great lengths in their <a href=\"https:\/\/lightcommands.com\/20191104-Light-Commands.pdf\">paper<\/a> to explain that lasers could also be used to unlock smartphones and the devices connected to them. This could expose consumers&#8217; credit card information and even make it possible to unlock tech-driven cars connected to a victim\u2019s Google account.<\/p>\n<p>Since the paper\u2019s publication, it is clear that tech giants such as Amazon need to update their gadgets&#8217; security software to protect against such intrusions. 
Unfortunately, the research has already shaken consumers&#8217; trust.<\/p>\n<p><a href=\"http:\/\/sites.law.berkeley.edu\/thenetwork\/wp-content\/uploads\/sites\/2\/2019\/11\/Siri-Google-Assistant-and-Amazon-Alexa-can-be-Hijacked-with-Light-.pdf\">Siri, Google Assistant, and Amazon Alexa can be Hijacked with Light<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Researchers have recently found that voice assistant technology is vulnerable to hijacking by cheap lasers. Researchers from Tokyo&#8217;s University of Electro-Communications and the University of Michigan have bypassed the built-in security mechanisms of voice-controlled devices, including popular smartphones. Simply shining a bright laser at a device\u2019s microphone is interpreted as sound by [&hellip;]<\/p>\n","protected":false},"author":36,"featured_media":0,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[1],"tags":[],"class_list":["post-5610","post","type-post","status-publish","format-standard","hentry","category-uncategorized"],"acf":[],"publishpress_future_action":{"enabled":false,"date":"2026-05-19 
13:29:10","action":"change-status","newStatus":"draft","terms":[],"taxonomy":"category"},"publishpress_future_workflow_manual_trigger":{"enabledWorkflows":[]},"_links":{"self":[{"href":"https:\/\/sites.law.berkeley.edu\/thenetwork\/wp-json\/wp\/v2\/posts\/5610","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/sites.law.berkeley.edu\/thenetwork\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/sites.law.berkeley.edu\/thenetwork\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/sites.law.berkeley.edu\/thenetwork\/wp-json\/wp\/v2\/users\/36"}],"replies":[{"embeddable":true,"href":"https:\/\/sites.law.berkeley.edu\/thenetwork\/wp-json\/wp\/v2\/comments?post=5610"}],"version-history":[{"count":0,"href":"https:\/\/sites.law.berkeley.edu\/thenetwork\/wp-json\/wp\/v2\/posts\/5610\/revisions"}],"wp:attachment":[{"href":"https:\/\/sites.law.berkeley.edu\/thenetwork\/wp-json\/wp\/v2\/media?parent=5610"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/sites.law.berkeley.edu\/thenetwork\/wp-json\/wp\/v2\/categories?post=5610"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/sites.law.berkeley.edu\/thenetwork\/wp-json\/wp\/v2\/tags?post=5610"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}