Uncovering Project Nightingale: Google accesses medical records

Google receiving medical files stirs controversy

Rachel Jiang and Emily Xia

In a partnership with Ascension Health Network, Google has received access to millions of patient records in order to conduct “Project Nightingale,” a program that aims to use artificial intelligence to improve medical diagnoses and treatment. However, the project has raised massive privacy concerns because it gives Google access to each patient’s entire medical history.

Senior Aadria Bagchi has gone to the hospital multiple times in the past few years because her doctor was unable to pinpoint the exact cause of her stomach pain; her vague symptoms pointed to several possible diagnoses. Doctors ran several tests, including urine tests, stool samples, four different blood tests, X-rays and MRI scans. Though she wasn’t personally affected by Google’s partnership, she admits that she empathizes with those who were involved.

“[The hospitals] basically know everything about me,” Bagchi said. “So I wouldn’t personally feel comfortable with some huge company like Google just having access to all that information, even though the purpose of it is to technically help and advance medical stuff.” 

According to math teacher Debbie Frazier, Google needs to ensure that it can track where the medical data goes and limit its use to research purposes only.

“What you can’t see necessarily as easily is who else has looked at your data and that’s where it gets scary,” Frazier said. “So I think the government needs to step in — they need to be able to track that and create more secure data.”

Google hardware design engineer Scott Johnson, who speaks for himself and not the company, agrees with Frazier and believes that the data needs to stay dissociated from the patients it comes from. According to him, larger companies need to be held accountable for how they handle that data.

“I would be OK with it if it’s all kept private and must be confined to the study, but I think there are concerns,” Johnson said. “It has to all be analyzed so there’s no connection between particular patients, and there’s an expectation for how Google treats the patients and their records.”

Although Bagchi and Frazier are both concerned about the protection of privacy when it comes to Google having access to medical records, they also agree that there may be exceptions where Google should have this access.

“Maybe for some scenarios, where a doctor can’t pinpoint, can’t connect the symptoms to a disease or an issue or disorder, something that Google could help if they have access to records and stuff like that,” Bagchi said. “But I just don’t think they should just have access to everyone’s. I think it’s a case-by-case basis thing.”

According to Frazier, there are also things that students and adults can do every day to avoid releasing private information online. Whether browsing the web or using an app, many people encounter a page where they are expected to agree to a set of terms and conditions governing the use of the app or site.

A Deloitte study found that 91% of individuals accept these terms and conditions without reading them. In Frazier’s opinion, apathy toward these terms is one reason why private information may be passed to other parties without users realizing it.

“I think the issue is about apathy,” Frazier said. “So many people are just apathetic. They’re so used to [the fact that] any time you download an app, you press the I agree button and you move on. You don’t even open it. You don’t read it.”

In addition, Frazier says this apathy leads to a lack of education in this area for both students and adults.

“I think the apathy is contagious particularly because young people grow up with devices even before they can read the whole ‘I agree’ statements,” Frazier said. “We educate ourselves on courteous emailing, right? And we educate ourselves on what to or not to do in terms of bullying on the internet, but we never talk really about what those ‘I agree’ statements mean; we just kind of get them done [and] check the boxes at the bottom.”

Though there are ways students can protect their privacy, Frazier believes that companies should also make an effort to write clearer fine print, a view Bagchi shares.

“There’s smaller things in [the fine print] I think should be highlighted,” Bagchi said. “It shouldn’t just be fine print. I think it should be in bigger font or part of the title, maybe even its own separate form because then you actually know where your information is going.”

Johnson supports companies changing the format of the fine print; however, he believes that real change over time must come from the legal experts who write these agreements on the companies’ behalf.

“It should be open for public review in terms of whatever the laws mirror and introduce, but it’s always the same people looking at it,” Johnson said. “It’s written in a way that is only for legal experts to understand. So it’s up to the legal world, over time, to improve that.”

Though Project Nightingale has its issues, Frazier believes that research about medical history may be useful to an extent. 

“In your generation, depression’s prevalent; stress [and] all this stuff,” Frazier said. “As teachers, we’re constantly talking about where does that come from and how can we change that. We don’t know if we don’t collect data, right?”

Johnson also acknowledges the benefits that large-scale data research can have in the health field and the role statistics can play in the lives of ordinary people.

“It’s not that it influences medical history, it’s that it hopefully can influence medical future,” Johnson said. “It can help make connections between patterns and data that indicate issues that can be solved through drug research or other technology research in the areas of medicine.” 

Frazier claims that with access to medical records, it will be easier for doctors to make connections, predict illnesses and find treatments. However, she also explains that this access may result in the generalization of groups, which she finds problematic; because every individual is different, such generalizations raise ethical concerns as well.

“I think it can totally go sideways like Google with their image categorization,” Frazier said. “They had this horrible span of many months where anything dark colored [was] interpreted as being a gorilla […] and they got in trouble for that.”

Since machine learning can easily go awry, Frazier believes a line should be drawn on the extent to which it is used for research purposes.

“I think there’s a lot of utility to it,” Frazier said. “It’s that whole, ‘There needs to be someone ethical watching and saying at what point we crossed the line and this is not OK to do.’ Somebody speaking about the people behind it, not just the money.”

In essence, some reduction in privacy is inevitable when new technology emerges. After all, Google started the research program in order to advance the medical industry, not to keep tabs on patients. Frazier thinks that citizens may simply be scared that large corporations with access to their data will pass it on to unknown parties.

“If Google wasn’t such a big name, this wouldn’t have ruffled so many feathers,” Frazier said.