Is there Anybody Out There?

By Michael Fiorito, MDS

Do you use an always-on voice assistant? Enjoy the benefits, but be cautious.

Researchers have discovered a vulnerability in most of the major voice assistants. It affects every iPhone and MacBook running Siri, any Galaxy phone, any PC running Windows 10 (via Cortana), and even Amazon’s Alexa.

Using a technique called DolphinAttack, researchers translated voice commands into ultrasonic frequencies that are too high for the human ear to hear, but perfectly decipherable by the microphones and software powering our always-on voice assistants. This relatively simple translation lets them take control of gadgets with just a few words uttered at frequencies none of us can hear.
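
Conceptually, the translation is old-fashioned amplitude modulation: the voice command rides on an ultrasonic carrier above the roughly 20 kHz ceiling of human hearing. Here is a minimal Python sketch of that idea; the filenames, carrier frequency, and modulation depth are ours for illustration, not the researchers’ actual values, and the real attack relied on ultrasonic transducers that ordinary phone speakers can’t fully reproduce.

```python
# Minimal sketch of the DolphinAttack idea: amplitude-modulate a recorded
# voice command onto an ultrasonic carrier. Parameters and filenames are
# illustrative, not taken from the original research.
import numpy as np
from scipy.io import wavfile

FS = 192_000         # output sample rate high enough to represent ultrasound
CARRIER_HZ = 25_000  # above the ~20 kHz ceiling of human hearing

# Load a recorded command (assumed to be a mono WAV file)
rate, voice = wavfile.read("hey_siri.wav")
voice = voice.astype(np.float64)
voice /= np.max(np.abs(voice))  # normalize to [-1, 1]

# Resample the command up to the ultrasonic output rate
n_out = int(len(voice) * FS / rate)
voice = np.interp(np.linspace(0, len(voice), n_out, endpoint=False),
                  np.arange(len(voice)), voice)

# Classic AM: the carrier's amplitude follows the voice waveform
t = np.arange(n_out) / FS
modulated = (1 + 0.8 * voice) * np.cos(2 * np.pi * CARRIER_HZ * t)
modulated /= np.max(np.abs(modulated))  # avoid clipping on write

wavfile.write("ultrasonic_command.wav", FS,
              (modulated * 32767).astype(np.int16))
```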

The researchers didn’t just activate basic commands like “Hey Siri” or “Okay Google,” though. They could also tell an iPhone to “call 212-444-5000” or tell an iPad to FaceTime that number. They could even force a MacBook or a Nexus 7 to open a malicious website.


In some cases, these attacks could only be made from inches away, though gadgets like the Apple Watch were vulnerable from several feet. It might be hard to imagine an Amazon Echo being hacked with DolphinAttack; an attacker would have to get within range inside your home.

Hacking an iPhone, however, might involve nothing more than walking past you in a crowd. The intruder could have their phone out, emitting commands at frequencies you’d never hear, while your own phone sits clutched in your hand. You might not even notice as Safari or Chrome loads a site whose code installs malware, leaving the contents and communications of your phone open for the attacker to explore.

Voice assistants like Siri, Alexa, and Google Home can pick up inaudible frequencies: specifically, frequencies above the range of human hearing, which tops out around 20 kHz.
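
Why would a microphone register a signal no human can hear? Microphone hardware isn’t perfectly linear, and a small square-law term in its response demodulates the amplitude-modulated ultrasound back down into the audible band, where the speech recognizer treats it like any other voice. Here is a toy simulation of that effect, using a test tone as a stand-in for the voice; the nonlinearity coefficients are made up for illustration.

```python
# Toy model of why a microphone "hears" an ultrasonic command: a small
# square-law nonlinearity in the mic demodulates the AM signal to baseband.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 192_000
t = np.arange(int(0.5 * FS)) / FS

# Stand-in "voice": a 400 Hz tone, AM-modulated onto a 25 kHz carrier
voice = np.sin(2 * np.pi * 400 * t)
modulated = (1 + 0.8 * voice) * np.cos(2 * np.pi * 25_000 * t)

def mic_nonlinearity(x, a1=1.0, a2=0.1):
    # Mostly linear, but the small x**2 term shifts energy from the
    # ultrasonic carrier back into the audible band (illustrative values)
    return a1 * x + a2 * x ** 2

captured = mic_nonlinearity(modulated)

# Low-pass filter to the audible band, as the device's audio chain
# effectively does before speech recognition
b, a = butter(6, 8_000, btype="low", fs=FS)
recovered = filtfilt(b, a, captured)

# 'recovered' now contains energy at 400 Hz: the inaudible input has
# produced an audible signal inside the microphone.
```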

User-friendliness is increasingly at odds with security. Our web browsers easily and invisibly collect cookies, allowing marketers to follow us across the web. Our phones back up our photos and contacts to the cloud, tempting any determined hacker with a complete repository of our private lives. We’ve made a Faustian bargain: our easy-to-use technology comes with a hidden cost, our own personal vulnerability. This voice-command exploit is just the latest in a growing list of security holes caused by design.

[Image: https://www.mdsny.com/wp-content/uploads/2017/09/malware-hacking-alexa-e1484589574444-n4domue34k6isrk10f9s4yggiwjjyreq7ecjvrgo66.png]

For now, there’s a relatively easy fix for most DolphinAttack vulnerabilities: turn off the always-on settings for Siri or the Google Assistant on your phones and tablets, and a hacker won’t be able to talk to your device (except during the moments you’re talking to it, too). Meanwhile, the Amazon Echo and Google Home both have hardware mute buttons that should do the trick most of the time.

If you use always-on voice assistants, use them smartly. A door only protects you if you keep it closed and locked; treat any new technology the same way.

Pulling the plug doesn’t have to be your only security solution.

Don’t become part of a rising statistic: make sure your company is armed against a security breach.

Contact Us: https://www.mdsny.com/contact/