IoT news of the week for March 30, 2018 | Stacey on IoT

Common sense security principles drafted in the UK: Most of these may sound familiar to you since we’ve suggested them time and again on the podcast, but it’s nice to see some formalization. The UK’s Department for Digital, Culture, Media & Sport has drafted five best practices for securing and using IoT devices. These include not allowing devices to use default passwords, better communication of how customer data is used, and a public point of contact for reporting issues. Perhaps best of all is the practice that requires device makers to keep software updated and, more importantly, to explain how long software updates will be made available. That sounds like our idea of an “expiration date” for IoT! The draft is open for public comment until April 25th. (National Law Review) — by Kevin Tofel

Nvidia’s deal with ARM is a big deal for AI at the edge: Nvidia announced that ARM will integrate its open-source Nvidia Deep Learning Accelerator technology, which helps run convolutional neural networks (CNNs), into ARM’s new Trillium architecture. ARM announced Trillium last month as a dedicated processor design for machine learning. The deal between the two is significant because it effectively embeds Nvidia’s CNN technology in almost every potential chip used in the internet of things. CNNs are used for image recognition, and as this analyst notes, the deal means Nvidia is giving this tech away for free because it believes the tech for running CNNs is basically commoditized. Nvidia would rather give it away to cement its market share. I also think it’s smart because the internet of things and edge devices will need a heck of a lot of image processing capability, as noted in my story up above. All that video data isn’t going to make it to the cloud. (Forbes)

WannaCry is still hitting manufacturing plants: Last year, the WannaCry ransomware and its NotPetya variant took down DHL and Merck in a new style of attack. WannaCry was halted fairly quickly, but NotPetya and other variants of the attack have continued to spread. Boeing was recently hit, and fears that the attack had affected the company’s production line and airplane deliveries caused panic among its staff and customers. Buried in the story are two scary facts. One is the realization of just how many old, unpatched Windows machines there are in the manufacturing world; the second is that WannaCry had recently hit at least two other manufacturing firms, taking their operations offline. (The Seattle Times)

A sensor made of gelatin? A startup named Mimica is making a food label out of gelatin that’s engineered to decay at the same rate as the particular food the label is attached to. As the food spoils, the label itself spoils, becoming bumpy to the touch, and in the process allowing someone to feel if their milk has soured or their two-year-old vial of sunflower oil has gone rancid. The original idea came out of a desire to build an expiration label for the visually impaired, but the idea resonated across a variety of food companies, and now the founder is building a company around the tech. I love it. (The Spoon)

Who owns your outdoor camera data? After my colleague Kevin used a Nest cam to prove that a neighbor hit his parked car a few days back, I’ve been curious about how much value that video would have for the police and the insurance companies, especially since Kevin’s neighbor denies she hit his car. However, there are a lot of other questions that arise when video doorbells and outdoor video cameras are positioned everywhere. One is how long a homeowner must keep the images; another is whether the police have a right to compel camera footage in the case of a crime. For more, check out the article. (CEPro)

IBM’s Watson could improve Siri: Apple and IBM announced yet another partnership, and although most of these deals focus on enterprise applications, the latest one could trickle down into Siri. At least that’s what my colleague Kevin thinks. His take is that using IBM Watson for pattern recognition and machine learning in the smart home might lead to a smarter Siri, one that not only handles your spoken commands better than it does today but can also anticipate your needs and actions. It sounds far-fetched, but when you follow his logic, you can see the potential. Semi-autonomous homes, anyone? (StaceyOnIoT) — by Kevin Tofel

Your next business idea? As someone with dozens of gadgets that require an outlet, I’m constantly tripping over wires, strategically placing circuit breakers everywhere, and bemoaning the invasion of tech in my well-designed spaces. This post offers some innovative ways to redesign power cords and outlets. Someone please implement them. (Medium)

From the biased algorithms department: This article and its radio version take a look at an MIT researcher’s efforts to show how certain machine-trained facial recognition models develop biases that result in the computer not being able to identify dark-skinned faces, especially those of women. What’s eye-opening is how bad they were at it. The accuracy rate for identifying light-skinned men’s faces was 99% across the board. However, when identifying darker-skinned women, IBM’s error rate was almost 35%, Facebook’s was 34.5%, and Microsoft’s was 20.8%. So when someone says we’re at 99% accuracy when it comes to facial recognition, it’s probably a good idea to ask what their sample population looked like. (WGBH News)
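The point about sample populations can be made concrete with a toy calculation. This sketch uses invented counts, not the study’s actual data, to show how a healthy-looking aggregate accuracy number can hide a large error rate for one subgroup:

```python
# Hypothetical illustration: aggregate accuracy can mask big
# error-rate gaps between subgroups. All counts are invented.
results = {
    # subgroup: (correct identifications, total faces tested)
    "lighter-skinned men":   (990, 1000),
    "lighter-skinned women": (970, 1000),
    "darker-skinned men":    (880, 1000),
    "darker-skinned women":  (660, 1000),
}

total_correct = sum(correct for correct, _ in results.values())
total_n = sum(n for _, n in results.values())

# The headline number looks respectable...
print(f"aggregate accuracy: {total_correct / total_n:.1%}")  # → 87.5%

# ...but the per-subgroup error rates tell a different story.
for group, (correct, n) in results.items():
    print(f"{group}: error rate {1 - correct / n:.1%}")
```

With these made-up numbers the model is 87.5% accurate overall, yet wrong about one in three darker-skinned women, which is why the breakdown matters more than the headline figure.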

Does your AI need a therapist? Computer science, law, and business are rapidly trying to come to grips with the idea that artificial intelligence based on neural networks, in which computers train themselves to understand the world, is incomprehensible to people. We’re also becoming aware that not only do we infuse our own human bias into these algorithms, but that these machines “think” in a way that is alien to us. And since we can’t understand them, they can behave in ways we can’t anticipate. This is terrifying if you’re trying to use neural networks to build a self-driving car or figure out the best business process to use. One idea proposed here is some sort of role for people who try to understand the world from the machine’s point of view, what the article calls a psychotherapist for algorithms. (FastCompany)
