But these critical pieces weren't standard fare, easily dismissed as FUD (Fear, Uncertainty and Doubt). Rather, they were thoughtful takes that require those involved with connected devices to think hard about design, regulation, and fail-safe protocols.
Since I was recently accused of FUD for this piece on Windows 10 privacy, who better than me to analyze IoT warning calls? On these pages, Den Howlett published Beware the Internet of Things – it’s early, security sucks and the C-Suite doesn’t care. After weighing the issues - many of which come back to security - Den put the brakes on:
In the final analysis, I get that the IoT is invading every aspect of our life. But I view the current state of the art as crude and far from the world changing ‘thing’ its evangelists suggest is just around some imaginary corner. For example, the much marketed Nest claims to learn. But in over four months of use I see it as little better than an unteachable dog. Sure, I can control it from a distance, but learning from behaviors and settings? Not a chance.
Contributor Charlie Bess extended the security issue to wearables in Start planning for wearables - here's how. After reviewing the extravagant predictions for the wearables marketplace, Bess urges IT leaders to take proactive steps to address wearables, including security. Bess raises the possibility that a savvy IT department could actually use wearables to enhance security, by gathering useful data from the devices.
Contributor Martin Banks isn't so sure. In Wearables, security and why you’re suddenly the one to blame for bringing down the company, Martin argues that wearables raise the security stakes:
Security used to be just about keeping virii out of corporate systems so data stayed sacrosanct, but the arrival of wearables and IoT – and the potential for exploitation of both in tandem – is about to raise the security stakes alarmingly
Martin uses a hypothetical example of an employee who gets through a security clearance, including a retina and fingerprint check, only to be the conduit for a biochemical release via a wearables hack. Referring to the inevitable cheap Apple Watch knockoff devices, Martin says:
It does not take too much imagination to wonder if the more organised cybercriminals will seize this opportunity to move in on that business to get people to pay to wear devices that are pre-loaded with malicious code?
Consumers aren't in the clear either. Stories on connected car hacks also hit the press, such as the news that certain Chrysler models can be hacked over the Internet. Wired Magazine's Andy Greenberg offered a harrowing personal demonstration, describing the pre-planned hack of his Jeep Cherokee on the highway. On the outskirts of St. Louis, Greenberg was cruising at 70 miles per hour when:
Though I hadn’t touched the dashboard, the vents in the Jeep Cherokee started blasting cold air at the maximum setting, chilling the sweat on my back through the in-seat climate control system. Next the radio switched to the local hip-hop station and began blaring Skee-lo at full volume. I spun the control knob left and hit the power button, to no avail. Then the windshield wipers turned on, and wiper fluid blurred the glass.
Calling these white hat hackers' code an "automakers' nightmare," Greenberg explains that the hackers sent their commands through the Jeep’s entertainment system to its dashboard functions, steering, brakes, and transmission - all from a distant laptop:
The most disturbing maneuver came when they cut the Jeep’s brakes, leaving me frantically pumping the pedal as the 2-ton SUV slid uncontrollably into a ditch.
Yikes! Is that enough FUD for one IoT article? Just a bit more: the hackers believe they can carry out most of these attacks on any Jeep Cherokee - or, for that matter, any Chrysler vehicle - as long as it's using the vulnerable Uconnect head unit. Granted, it took these hackers/researchers - Charlie Miller and Chris Valasek - a year to figure out how to control these functions remotely. Not much comfort, especially when Miller says to CNN Money:
Right now I could do that to every [Chrysler] car in the United States on the Sprint network... I'm scared because you should not be able to attack cars remotely like this.
OK, so what to do about it?
IoT, automation, connected cars, wearables and drones are an inevitable part of where we're headed. We need a multi-pronged approach to protect companies - and ourselves - from worst case scenarios. Here's an incomplete list:
1. Companies should embrace white hat hackers and crowdsourced security. Many enterprises are still struggling with the role of external security experts, an issue I recently explored. Chrysler, for example, hasn't handled the vulnerability news all that well, falling back on a USB stick patch or a trip to the car dealer. They also issued some shrill language, implying they were more upset about the hack itself than focused on fixing the problem.
2. Industry regulations should play a role, though they won't solve the problem. U.S. Senator Ed Markey is one lawmaker who believes carmakers aren't doing enough to make connected cars safe. With the aid of security experts, Markey's staff compiled a report (PDF link) that concludes American drivers are "at risk" due to privacy and security gaps. Markey is calling for the National Highway Traffic Safety Administration and the Federal Trade Commission to issue regulations with clear-cut privacy and safety guidelines. Modernized regulations are always a good idea, and are a piece of the solution.
3. Designers should build human intervention and fail-safes into all IoT prototypes. Via Den, I learned about this neat IoT design manifesto, which is a good start. We've gone through a period where the automation fetish had undue design influence. The best designs combine automation of the mundane with sophisticated exception handling and multiple levels of human intervention. Star Wars would have ended badly if Luke Skywalker hadn't been able to take his ship on manual. I might never get into a vehicle that makes all driving decisions for me. But I welcome an alert as I'm backing up into blind spot objects. And I might happily trade my exact location for precise traffic/accident warnings. But that data swap should be my choice, easily controllable on my end. Thoughtful/ethical design incorporates those features.
4. Individuals should approach connected devices with thoughtful research. Just about every wifi-connected device we buy has device-specific implications for security and privacy. The Internet is an invaluable tool for finding resources, security apps, and ways to customize a device's settings to our liking. Too often, we learn about these resources after we fall victim to a hack, theft, or data breach. I've always liked the distinction between consumers and citizens. Consumers make me think of those who are on the spending treadmill, buying shiny objects. Citizens take it upon themselves to educate, share, test, and warn.
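The fail-safe principle in point 3 - automate the mundane, but let a human touch override the machine instantly - can be sketched in a few lines. This is a minimal illustration only; the class and method names are hypothetical, loosely modeled on a Nest-style thermostat:

```python
class ClimateController:
    """Hypothetical IoT climate controller: automation handles the
    routine, but any human input immediately takes precedence."""

    def __init__(self):
        self.mode = "auto"        # "auto" or "manual"
        self.target_temp = 21.0   # degrees C, maintained by automation

    def automation_tick(self, learned_setpoint):
        # Automation only acts while no human has intervened.
        if self.mode == "auto":
            self.target_temp = learned_setpoint

    def human_override(self, temp):
        # A human adjustment always wins, and disables automation
        # until the user explicitly hands control back.
        self.mode = "manual"
        self.target_temp = temp

    def resume_auto(self):
        # Returning to automation is a deliberate human choice.
        self.mode = "auto"
```

The key design choice is that `human_override` does not merely tweak a value the automation can later undo; it flips the controller into a state the automation must respect until the human opts back in. That is the "take the ship on manual" property, expressed in code.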
As I wrapped this piece, I found myself wondering if there might be a market for retro-devices that savvy companies or niche players could exploit. I'm still deliriously happy with my 1999 Rav4, for example. I don't mind that it's a dumb car by modern standards. The only upgrade is the iPod stereo; GPS I handle via my own devices.
You could imagine devices that leverage the brain of a security-approved device, powered by a smart phone that consumers (or companies) already trust. Though as this teenager recently found out, jamming your iPhone into a cassette deck won't turn your car smart. OK, I'm going to stop trying to be helpful - otherwise I might fail my attempt at FUD. Speaking of which, our biggest driving danger remains the idiots who are texting their way to the promised land, threatening to take us with them.
Image credit: reckless driver and scared female passenger on white background © Innovated Captures - Fotolia.com