Advances in artificial intelligence threaten private health data

Advances in artificial intelligence have created new threats to the privacy of people's health data, a new University of California, Berkeley, study shows.


The study was led by professor Anil Aswani of the Industrial Engineering & Operations Research Department (IEOR) in the College of Engineering, with co-authors Na L, Yang C, Lo C, Zhao F and Fukuoka Y.


The research was published Dec. 21 in the journal JAMA Network Open.


Identifying individuals by step data


Aswani shows that, using artificial intelligence, it is possible to identify individuals by learning their daily patterns in step data (such as that collected by activity trackers, smartwatches and smartphones) and correlating those patterns with demographic data.
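To make the idea concrete, here is a minimal, purely illustrative sketch of this kind of linkage attack — not the study's actual method, and using synthetic data: "anonymized" step records are matched against a second dataset that still carries names, simply by finding the closest step pattern.

```python
# Illustrative linkage-attack sketch (synthetic data; NOT the study's method).
# An "anonymized" dataset of daily step counts is matched against a second,
# name-bearing dataset of the same people via nearest-neighbor distance.
import numpy as np

rng = np.random.default_rng(0)

# "Anonymized" release: 100 people x 7 days of step counts, names stripped.
n_people, n_days = 100, 7
anon_steps = rng.normal(8000, 2000, size=(n_people, n_days))

# Named dataset: a noisy re-measurement of the same people
# (e.g., collected by a different app on the same phones).
named_steps = anon_steps + rng.normal(0, 300, size=(n_people, n_days))
names = [f"person_{i}" for i in range(n_people)]

def reidentify(anon, named):
    # Euclidean distance from each anonymous record to every named record,
    # then pick the closest named record for each anonymous one.
    dists = np.linalg.norm(anon[:, None, :] - named[None, :, :], axis=2)
    return dists.argmin(axis=1)

matches = reidentify(anon_steps, named_steps)
accuracy = (matches == np.arange(n_people)).mean()
print(f"re-identified {accuracy:.0%} of 'anonymous' records")
```

Because individual step patterns vary far more between people than the measurement noise within one person, even this naive matching recovers most identities — which is the core of the privacy concern.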


The mining of two years' worth of data from more than 15,000 Americans led the researchers to conclude that the privacy standards associated with the 1996 HIPAA (Health Insurance Portability and Accountability Act) legislation need to be revisited and reworked, Berkeley News reports.


"In principle, you could imagine Facebook gathering step data from the app on your smartphone, then buying health care data from another company and matching the two," he added. "Now they would have health care data that's matched to names, and they could either start selling advertising based on that or they could sell the data to others."


The problem isn't with the devices, but with how the information the devices capture can be misused and potentially sold on the open market, Aswani argues.


How we are using health data


"I'm not saying we should abandon these devices," Aswani said. "But we need to be very careful about how we are using this data. We need to protect the information. If we can do that, it's a net positive."

The results suggest a broader threat to the privacy of health data.


“HIPAA regulations make your health care private, but they don’t cover as much as you think,” he says.


Many groups, like tech companies, are not covered by HIPAA at all, and current HIPAA rules restrict the sharing of only very specific pieces of information.


“There are companies buying health data. It’s supposed to be anonymous data, but their whole business model is to find a way to attach names to this data and sell it.”


Unethical ways


As advances in AI make it easier for companies to gain access to health data, the temptation for companies to use it in illegal or unethical ways will increase. Employers, mortgage lenders, credit card companies and others could potentially use AI to discriminate based on pregnancy or disability status, for instance, Aswani warns.


This study suggests that current practices for deidentifying accelerometer-measured physical activity (PA) data may be insufficient to ensure privacy. This finding has important policy implications, as it points to the need for deidentification that aggregates the PA data of multiple individuals in order to protect the privacy of any single individual.
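A minimal sketch of what such aggregation could look like, under the assumption that only group-level averages are ever released (group size `k` is an illustrative parameter, not one taken from the study):

```python
# Illustrative sketch of aggregation-based deidentification (assumption,
# not the study's prescribed procedure): instead of per-person records,
# only the average step counts of groups of k individuals are released.
import numpy as np

rng = np.random.default_rng(1)
steps = rng.normal(8000, 2000, size=(100, 7))  # 100 people x 7 days

def aggregate(data, k):
    # Average consecutive groups of k individuals; drop any remainder.
    # Each released row summarizes k people, so no row maps to one person.
    n = (len(data) // k) * k
    return data[:n].reshape(-1, k, data.shape[1]).mean(axis=1)

released = aggregate(steps, k=10)
print(released.shape)  # 10 group rows x 7 days
```

Larger `k` blurs individual patterns more strongly, trading analytic detail for privacy — the kind of trade-off the authors argue policy should address.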