Friday, 15 January 2016

How Secure Is Your Smart Watch?

The smartwatch is rapidly growing in popularity, but do wearables constitute a serious security risk? Davey Winder investigates

How long does the battery last? Should I opt for Apple or Android? Can I pay for my lunch with it? The list of smartwatch FAQs has one glaring omission: how secure are the damned things?

While everyone focuses on the magnificent features and longevity of these supercharged wristwatches, it seems nobody is talking about smartwatch security. How hackable are these devices? Are they susceptible to malware? Will they divulge your personal data if they’re lost or stolen? Can they even be wiped remotely?

We routinely ask such questions about our smartphones, the devices that wearables are inextricably paired with. Yet early adopters and the media – who don’t normally shy away from potential scare stories – have been unusually quiet when it comes to smartwatch security. Both, it seems, have been complacent about the risks.

MORE THAN A DUMB DISPLAY?

One of the big hurdles that must be overcome when talking about smartwatch security is accepting that it’s a genuine concern in the first place. After all, isn’t the gadget on your wrist just a dumb mini-monitor for the smartphone in your pocket? That’s certainly a common perception, along with the notion that, as long as the data on your smartphone is secured, the watch is of very little consequence.

The reality is rather different, as researchers from Trend Micro recently discovered when they conducted penetration testing on some of the most popular smartwatches on the market. The Apple Watch, Motorola Moto 360 and Pebble watches were among those tested for their hardware protection, data connections and local data storage. “It’s clear that manufacturers have opted for convenience at the expense of security,” said Bharat Mistry, cybersecurity consultant at Trend Micro. “We discovered that all of them saved data down locally, which enables a hacker to access the data when the watch is taken out of range of the smartphone it’s paired with.”

Both Apple- and Android-powered watches store unread notifications, as well as fitness and calendar data. The Apple Watch adds images, contacts and Passbook information to the list of data stored locally. “As Passbook information can contain highly sensitive data such as plane tickets, smartwatch owners need to be as careful with these devices as they would with their smartphones,” Mistry warned.

The idea that synced local data can be read through the watch interface is concerning, but the fact that the Apple Watch stores so much of it is even more troubling. Trend Micro’s research exposes the misconception that smartwatches are only smart on the outside.

HP has also recently carried out research into smartwatch security, and the results don’t make for comfortable reading. Once again, all of the tested devices contained vulnerabilities, with HP branding smartwatches “a new and open frontier for cyber-attack”. HP Fortify raised particular concerns about insufficient user authentication, insecure web interfaces that potentially enable hackers to identify user accounts through password-reset mechanisms, and poor encryption of data in transit. The latter is of huge concern to those in the security business. While all the devices implemented SSL/TLS, HP found that 40% of the watches were either vulnerable to POODLE attacks (a well-known, and relatively easy to mitigate, man-in-the-middle exploit of transport encryption mechanisms) or still using old protocols such as SSL 2.0.
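
The fix at the client end is simply to refuse the legacy protocols outright. As a minimal sketch (not taken from any vendor's firmware), here is how a connecting client written in Python would pin its TLS floor so that a POODLE-style downgrade to SSL 3.0 or earlier is impossible:

```python
import ssl

def make_strict_client_context() -> ssl.SSLContext:
    """Build a client-side TLS context that refuses the legacy
    protocols (SSL 2.0/3.0, early TLS) that HP found some
    smartwatches still negotiating."""
    ctx = ssl.create_default_context()
    # Refuse anything older than TLS 1.2; a downgrade attack can't
    # push the handshake below this floor.
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2
    # Also insist on certificate validation, which blocks the
    # man-in-the-middle half of the attack.
    ctx.check_hostname = True
    ctx.verify_mode = ssl.CERT_REQUIRED
    return ctx

ctx = make_strict_client_context()
print(ctx.minimum_version)  # TLSVersion.TLSv1_2
```

A handful of lines, in other words, which is what makes HP's 40% figure so damning.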

Simeon Coney, chief strategy officer at AdaptiveMobile, believes that the potential to ping unauthorised messages to users is a major concern. “Smartwatches compound one of the primary security risks already prevalent on mobile phones, which is that they encourage users to ‘glance and respond’ to notifications,” he said. Attackers can exploit this habit to contact the user with interactions that aren’t scrutinised for legitimacy in the way that, say, an email arriving on a PC would be. “Any interaction where the user is either not given enough information to determine its legitimacy or where the device, or ecosystem, does not provide automated security scanning before delivery could be exploited,” he added.

Then there’s the small matter of firmware updates being transmitted without any encryption of the transport mechanism or the update files themselves. This isn’t quite so worrying, as most updates are signed to prevent malicious installs, but the lack of encryption does make for easier downloading and analysis by the crooks. It also suggests that manufacturers are guilty of concentrating on the design and features of smartwatches, at the expense of designing functional security into the devices from the get-go.
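
To see why signing limits the damage, consider a toy sketch of the check a watch performs before installing an update. Real firmware uses asymmetric signatures; an HMAC (and the key and payload below, which are invented for illustration) stands in here to show the principle:

```python
import hashlib
import hmac

def verify_update(blob: bytes, signature: bytes, key: bytes) -> bool:
    """Accept a firmware image only if its signature checks out.
    Because the blob itself travels in plaintext, anyone can read
    and analyse it -- the signature only stops a tampered image
    from being *installed*."""
    expected = hmac.new(key, blob, hashlib.sha256).digest()
    # Constant-time comparison, so the check itself doesn't leak.
    return hmac.compare_digest(expected, signature)

key = b"vendor-signing-key"                    # hypothetical key
firmware = b"\x7fELF...watch firmware image"   # hypothetical payload
sig = hmac.new(key, firmware, hashlib.sha256).digest()

print(verify_update(firmware, sig, key))             # True
print(verify_update(firmware + b"\x00", sig, key))   # False: tampered
```

The crooks can still download and pick apart the unencrypted image at leisure, looking for vulnerabilities, which is exactly the easier analysis the paragraph above describes.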

MALWARE O’CLOCK


Ken Munro, senior partner at Pen Test Partners, is a professional penetration tester and thinks part of the security problem with smartwatches is that they have evolved to run fully fledged operating systems that can connect to the internet independently. “We should expect more targeted attacks,” he told PC & Tech Authority. “The apps for smartwatches tend to request every permission under the sun. If malware can be uploaded, or if a malicious app is installed on the watch, it could potentially access a lot of data.”

Isn’t the malware threat really to the phone itself rather than the wearable? Is there any evidence that smartwatches are even being targeted by malware writers, either from the standalone perspective or as a conduit to get at that valuable data stored on a paired smartphone? “Currently we are not yet seeing many attacks on smartwatches,” said Jahmel Harris, information security consultant with MWR Labs, who expects attacks to become more common as users adopt mobile payment systems.

Harris added that both Google and Apple have gone to great lengths to protect against the type of attacks that would allow malware to spread from watch to phone, but warned that “applications could be installed covertly on some smartwatches that behave differently to the software running on phones, making it more challenging for security researchers to analyse malware”.

What smartwatches have going in their favour, at least for now, is that they are relatively niche. In other words, they haven’t exactly become the phenomenon that some had predicted and, as with all minority operating systems, the smaller the installed base, the less attractive the target.

Smartphones are much more common than smartwatches, hold more sensitive data, have greater processing power for launching attacks, and have direct connections to external networks. “It just doesn’t make sense for someone to go to the trouble to target a smartwatch, when it would likely be more difficult and have less payoff than targeting the phone itself,” said Chris Camejo, director at NTT Com Security. “The only scenario I could think of is a ridiculously easy-to-exploit vulnerability in a smartwatch, but any such vulnerability would likely be patched quickly.”

PRIVACY ON PARADE


If malware isn’t a threat to your data, maybe you are. Think about it for a moment: who else is reading those texts or emails that are being displayed on your wrist? What about two-factor authentication codes if they pop up as alerts on the watch? Is inadvertent data sharing, also known as “shoulder surfing”, likely to be a major threat to your privacy? Chris Camejo thinks that the always-on (as in physically always on your wrist) nature of the device carries a far greater risk than a smartphone, which is often kept in a bag or pocket. Much of this information, he surmises, would have greater potential for embarrassment than malicious compromise. “Two-factor codes are generally useless without another piece of information, such as a PIN, password or certificate,” he explained. “While we could dream up some scenarios where shoulder-surfed data could lead to a breach, these would likely be contrived.”
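
Camejo's point about two-factor codes is worth unpacking: the codes that flash up on a watch are typically time-based one-time passwords, which expire within seconds and are worthless without the accompanying password. A minimal RFC 6238 implementation makes the mechanics concrete (the secret below is the standard RFC test value, not a real credential):

```python
import hashlib
import hmac
import struct

def totp(secret: bytes, at: float, step: int = 30, digits: int = 6) -> str:
    """RFC 6238 time-based one-time password: the code is derived
    from the shared secret and the current 30-second window, so a
    shoulder-surfed code goes stale almost immediately."""
    counter = int(at // step)                       # which 30 s window
    msg = struct.pack(">Q", counter)
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: at T=59 s the 8-digit code is 94287082
print(totp(b"12345678901234567890", 59, digits=8))
```

Glancing at someone's wrist buys an attacker one window of one factor; without the PIN or password it protects, that is, as Camejo says, generally useless.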

Bharat Mistry disagrees from a purely data-privacy perspective: “The controls on these devices are relatively immature, so there is a big risk of inadvertent data sharing,” he warned. “If a user forgets to lock a device, there is the potential for other people to view notifications that pop up on their watch.”

Indeed, the lockscreen would appear to be the best defence when it comes to preserving privacy. Android Wear, for example, is now shipped with the same lockscreen used by Lollipop smartphones, in the form of Keyguard. “Locking a phone can stop an attacker from reading messages. However, unless a lock is set on the watch, a casual glance can reveal a lot of potentially sensitive information,” said MWR’s Jahmel Harris.

If not through shoulder surfing, then how might data privacy be vulnerable? The synchronisation of data via Bluetooth and Wi-Fi poses another risk. “This creates an interesting attack angle, but is likely to require physical proximity to exploit a specific user,” said Simeon Coney. Then again, it has become apparent that Apple’s AirDrop is being used to send unwanted content to users. If this behaviour makes its way to the watch, it could be used for malicious messages or phishing, “by pushing seemingly legitimate prompts for PIN codes, passwords or other credentials”.

Maybe the biggest privacy risk is theft. A smartwatch may be harder to lose than your phone, by virtue of being strapped to your wrist, but should a strap break or you forget to pick up your watch as you leave your hotel room, it’s vulnerable. Scott Lester, senior researcher at Context Information Security, reminds us that there is no “find my watch” feature for Apple users, “but there is at least a setting to wipe after ten failed passcode attempts”. When it comes to Android Wear, users with devices that can connect to Wi-Fi do have the ability to remotely revoke them, although this still won’t wipe the data from the device.
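
The wipe-after-ten-failures setting Lester mentions is simple enough to model. This is a toy sketch of the policy, not Apple's implementation; the class and threshold names are invented for illustration:

```python
WIPE_THRESHOLD = 10  # mirrors the Apple Watch setting described above

class PasscodeGuard:
    """Toy model of a wipe-after-N-failures lockscreen policy."""

    def __init__(self, passcode: str):
        self._passcode = passcode
        self._failures = 0
        self.wiped = False

    def try_unlock(self, attempt: str) -> bool:
        if self.wiped:
            return False          # nothing left to unlock
        if attempt == self._passcode:
            self._failures = 0    # a success resets the counter
            return True
        self._failures += 1
        if self._failures >= WIPE_THRESHOLD:
            self.wiped = True     # erase local data at this point
        return False
```

The point of the counter is that a thief gets at most ten guesses at the local data; without it, the notifications, fitness logs and Passbook entries cached on the watch are open to unlimited offline guessing.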

APP ATTACK


Finally, what about the apps and any safeguards that are being built into app design to prevent attacks? Mark James, security specialist at Eset, is convinced that “apps will almost certainly be the biggest single failure we will see in this market”.

According to MWR Labs, developers are often unaware of the changes that have been made to smartphones that allow them to communicate with their respective smartwatches. “Android Wear requires developers to create a service in their application, effectively creating an opening that is required to communicate data between Android and Android Wear,” Jahmel Harris from MWR told us. “In theory, this service can only be used by Android Wear, but our research has shown it’s possible to communicate with this service from a rooted wearable.”

As a weakness in one may put the other at risk, it is particularly important that security controls such as root detection, obfuscation and integrity checking are performed on both the applications written for Android Wear and Android itself. “In the case of Apple Watch,” Jahmel added, “MWR has seen developers weakening the security of the iOS application in order to allow sensitive information to be passed to the smartwatch.”

Making sure these apps are clean and originate from known sources will be a big responsibility for watchmakers. As Mark James concludes: “Making sure the app design, submission and distribution is meticulously monitored is the only way to protect the users.”