Privacy Weekly Episode 6
Welcome to Episode Six
👋 Welcome. This is episode six of my weekly privacy newsletter. We have a number of interesting stories for you this week, including:
- Microsoft To Expand Californian Data Privacy Law (CCPA) Coverage To The Entire United States
- Google Is Secretly Collecting And Processing Millions Of Americans' Personal Health Data
- Google Almost Made 100,000 X-Rays Public
- iPhone Tracks Users For Ad Monitoring
- China's First Lawsuit Against Use Of Facial Recognition
- Microsoft To Investigate Israeli Facial Recognition Company AnyVision It Funded
📩 Please subscribe to the newsletter if you like what you see. I'll only use your email address for the purposes of the newsletter and will not sell it to a third party.
Microsoft To Expand Californian Data Privacy Law (CCPA) Coverage To The Entire United States
The California Consumer Privacy Act (CCPA), a data privacy law that goes into effect on January 1st, 2020, applies only to California and Californians. However, Microsoft has pledged to extend its coverage across the entire United States (specifically, 'to all Microsoft customers in the US').
Why this matters:
- CCPA is a statewide law that will go into effect on January 1st, 2020; a national US law is still a long way off. Microsoft is saying it wants to promote and protect privacy, and it is taking concrete steps to do so.
- Tech companies are taking privacy more seriously. Apple and Microsoft are marketing themselves as privacy advocates, while their competitors like Facebook, Google and Amazon lag behind.
- CCPA may become an example to the rest of the US, and its concepts may spread to other states, or even the national level.
California Consumer Privacy Act (CCPA) in short:
- Companies must disclose the personal data they collect on individuals (in scope of the law).
- Companies must disclose whether that data is sold, and to whom.
- Companies must allow individuals (in scope) to opt out of their data being sold.
- Individuals have rights, such as to delete personal data that a company has collected.
Google Is Secretly Collecting And Processing Millions Of Americans' Personal Health Data
According to the Wall Street Journal, Google, in an initiative called Project Nightingale, is secretly collecting and processing health care data from millions of American patients, without their consent and even without their knowledge. Not only that, but the patients' doctors and health care workers aren't aware of the project either.
Google gets the data from a company called Ascension, which is the second largest health system in the US.
- Millions of Americans' health-related data is involved.
- Data includes: lab results, diagnoses, hospital records, patient names, dates of birth.
- No consent or knowledge of the data transfer from Ascension to Google by patients or health care workers.
- At least 150 Google employees have access to the data.
How can Ascension and Google legally do this: Under relevant law (in the US, HIPAA), personal data transfer of this kind is allowed if it helps the health provider (here, Ascension) carry out health care functions.
What does Google want with this data: Google wants to offer AI and machine learning products and services to the health care industry, and it needs the data for analysis to build them.
Google Responded to The Verge: Google says that this kind of agreement (with a health provider) is common and lawful and that the data is only used for development of products and services (for that health care provider).
Google Almost Made 100,000 X-Rays Public
In 2017, Google came close to making 100,000 X-rays public, cancelling the project only after the National Institutes of Health (NIH) raised concerns, according to the Washington Post. The X-rays were initially believed to be anonymous, but NIH indicated that it was still possible to identify people from them, which would have been a privacy issue.
It was not reported on at the time.
Google did not obtain legal agreements to make the privacy-sensitive data public. This has similarities with the case described above involving the secret Google project Nightingale.
Why it matters:
- This case, and the Project Nightingale case, highlight the difficulties Google has with regard to privacy and the health care industry.
- It shows that NIH had to raise the concerns, which were apparently not identified by Google.
Links: The Washington Post
iPhone Tracks Users For Ad Monitoring
Apple says it takes privacy seriously. It markets itself and the iPhone as a better alternative to Android.
However, according to Mozilla, Apple tracks its iPhone users with a unique identification number (called IDFA, which stands for 'identifier for advertisers').
Now, this is not a secret feature; it's fully visible within the iPhone settings. Mozilla is saying that most users don't know about this 'feature', and that it shouldn't be enabled by default if Apple is serious about privacy.
China's First Lawsuit Against Use Of Facial Recognition
China has made widespread use of facial recognition technology to monitor its citizens. Now, in a first for the country, a Chinese law professor has filed a lawsuit against its use at a zoo in Hangzhou.
The Hangzhou zoo installed facial recognition to replace its existing fingerprint system.
The professor filed the lawsuit because, he claims, the zoo did not obtain consent to collect and process personal data through its use of facial recognition technology. He also objected to the zoo's unwillingness to provide a full refund as a result of the change.
Why it matters:
- Use of facial recognition is widespread in China. It's used at airports, subways, ATMs, classrooms, and now even at a zoo.
- This lawsuit could spark a counter movement within the country.
Links: The Telegraph
Microsoft To Investigate Israeli Facial Recognition Company AnyVision It Funded
Microsoft has hired a former US Attorney General to investigate the use of technology from AnyVision, a facial recognition company it funded. AnyVision apparently used its technology to spy on Palestinians who live in the West Bank.
Why it matters:
- This is yet another case about the use of facial recognition (see another one above in China).
- It highlights the differences in what governments and companies think is justified around the world.
- It shows that Microsoft is serious about ethical issues related to facial recognition usage.