Update: Evernote has reversed course on its previous plan to make machine learning an opt-out system. Instead, the company will allow customers to opt in to having their notes used for machine learning. “[W]e will make machine learning technologies available to our users, but no employees will be reading note content as part of this process unless users opt in,” Evernote wrote in a new blog post. “We will invite Evernote customers to help us build a better product by joining the program.”
Original story below
The difference here lies in the vague phrase “improve the service.” Not only could this be stretched to cover just about anything, but Evernote’s simultaneous announcement that it would use customer data for machine learning also rubbed many people the wrong way. Since then, the company has backpedaled frantically. CEO Chris O’Neill has written a lengthy missive attempting to explain and qualify some of the changes to Evernote’s policy, telling people that the handful of engineers allowed to see user data are carefully vetted and hand-selected, and that the machine learning tests Evernote intends to conduct are something customers can opt out of.
An example of machine learning applied to language, from Evernote’s blog.
There’s a persistent willingness to treat information about our lives as just “data,” even though that data is increasingly used to make decisions about what kinds of products and services are marketed to you. Banks and financial institutions have actively explored using Facebook data to calculate what kind of borrower you are likely to be and whether or not you’ll pay them back. Over in the UK, one company openly advertises itself as using this information to spy on potential renters. Police departments have signed agreements with license plate reader companies in order to avoid data retention time limits.
The question isn’t whether these types of actions are legal, or even whether Evernote itself has some nefarious master plan (yes they are, and no it doesn’t). The question is what kind of society we are creating by training people to treat their personal data as a commodity to be readily handed over to half a hundred services. I don’t pretend to have the answers. But I fear we don’t spend half enough time, as a society, considering the questions.