Today’s story is about Philips Hue by Signify. They will soon start forcing all users to create accounts and will upload user data to their cloud. For now, Signify says you’ll still be able to control your Hue lights locally as you’re currently used to, but we don’t know if this may change in the future. The privacy policy allows them to store the data and share it with partners.
Location data, when you're home/not home, which room you're likely in/not in. Data that costs almost nothing to produce, but can be sold for millions.
Bulbs tell them when you're in the kitchen, bathroom, bedroom, etc. Relatively easy to combine it with smart tv, smart watch, security cam, and app/phone data to identify you exactly.
Combine it all and they could likely pin down who you are and what you're doing with a high degree of certainty, then micro-target you with ads or propaganda.
Honestly, there comes a point where you'd have more privacy shoving a camera up your ass. Less privacy than the DDR.
A lot of people don't seem to understand that each individual bit of data is often not valuable in itself, but it is valuable as part of a whole.
Basically, everything there is to know about you is a jigsaw puzzle. Many companies out there want that finished image, so they pay a premium for each individual piece of the jigsaw, and the companies you give your data to everyday are selling those pieces.
This might be a stupid question, and I don't know if anyone would even have the knowledge to answer... but is this data ever audited? Other than possible lawsuits, what prevents me from randomly generating data points for my customers and selling them to these companies? I assume they are cross referencing with other data sets and they could catch on quickly?
There would be so much data in understanding people’s light usage. For example, you could figure out how late or early people get up, the number of people living in a house, how crowded the house is, how many lights are used per room, etc. It would be a gold mine of information.
Let’s say you’re a home automation designer. You want to design devices to be used in the home, but in order to design such devices, you need enough of a stockpile of user data. This lightbulb data would be incredibly valuable.
You can probably even analyse the data and determine things like whether someone is watching tv late at night.
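To make that concrete, here's a minimal sketch of the kind of analysis being described. The event format is entirely invented for illustration (it is not Hue's actual data model or API): just bulb on/off timestamps with a room name, from which you can already estimate wake-up times and late-night activity.

```python
from datetime import datetime
from collections import defaultdict

# Hypothetical event log: (timestamp, room, state).
# Invented format -- not anything Signify actually exposes.
events = [
    ("2024-03-01 06:42", "Bedroom", "on"),
    ("2024-03-01 06:55", "Kitchen", "on"),
    ("2024-03-01 23:10", "Living Room", "on"),
    ("2024-03-02 01:05", "Living Room", "off"),
    ("2024-03-02 06:38", "Bedroom", "on"),
]

def first_on_per_day(events):
    """Earliest light-on per day ~ a rough wake-up time."""
    days = defaultdict(list)
    for ts, room, state in events:
        if state == "on":
            t = datetime.strptime(ts, "%Y-%m-%d %H:%M")
            days[t.date()].append(t)
    return {day: min(times).strftime("%H:%M") for day, times in days.items()}

def late_night_rooms(events, cutoff_hour=23):
    """Rooms with lights on at or after the cutoff ~ late-night activity."""
    rooms = set()
    for ts, room, state in events:
        t = datetime.strptime(ts, "%Y-%m-%d %H:%M")
        if state == "on" and t.hour >= cutoff_hour:
            rooms.add(room)
    return rooms

print(first_on_per_day(events))  # rough wake-up time per day
print(late_night_rooms(events))  # {'Living Room'} -> likely TV watching
```

A handful of timestamps is enough to guess a sleep schedule; scale that to millions of homes and the value is obvious.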
From a nefarious view, how valuable would this data be to robbers and thieves?
Also, room names. You can get a pretty good idea of a house's interior layout from the names and sequence of lights being activated. The ongoing attempts to map data to the physical world.
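As a rough sketch of the layout-mapping idea (with invented data): if you count how often lights in two rooms are switched on back-to-back, the frequent pairs are probably physically adjacent.

```python
from collections import Counter

# Hypothetical chronological sequence of rooms whose lights were
# switched on -- invented data for illustration.
activations = ["Hallway", "Kitchen", "Hallway", "Living Room",
               "Hallway", "Bedroom", "Bedroom", "Bathroom"]

def adjacency_guess(seq):
    """Count transitions between consecutive distinct rooms.
    Pairs that occur often are likely adjacent in the floor plan."""
    pairs = Counter()
    for a, b in zip(seq, seq[1:]):
        if a != b:
            pairs[frozenset((a, b))] += 1
    return pairs

for pair, n in adjacency_guess(activations).most_common():
    print(sorted(pair), n)  # e.g. ['Hallway', 'Kitchen'] comes out on top
```

With enough days of data this converges on a crude graph of the house, room names and all.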
Sonos did this a few years ago and there was a similar outcry. I have stopped using Sonos devices too.
Considering a lot of people are home all the time, probably not worth all that much.
I think people overestimate how much their behavior and data is actually worth.
Companies only care as far as targeting ads to people. But 95% of the time those ads don't actually do anything anyways.
How does a randomized system mess with that data? I only have two Hue lights, an under-cabinet strip. My Echo turns them on and off randomly when I set it in the away mode. Will Philips get both sets of data? Will Daddy Jeff share? Will he just buy Philips and cut out the middleman?
It builds a profile of you, and then they combine that with thousands of other profiles to build demographic profiles and then they sell this data to other firms or use it to further tune their own advertising services.
The same as pretty much every other company on the Internet. If it didn't work they wouldn't do it. Some people not understanding this, due to oversimplified examples, makes no difference to that.
That's not necessarily true, people do things that don't work all the time, sometimes for a long time. There have been millions if not billions of dollars dumped into shit that doesn't work. Using charts to predict the stock market doesn't work, yet you can find people still doing it to this day.
That's a bad assumption. Companies do dumb things all the time, i.e. thinking that what they're doing works is not the same thing as what they're doing actually benefiting them.