Do privacy controls lead to more trust in Alexa? Not necessarily, research finds

April 25, 2020

UNIVERSITY PARK, Pa. - Giving users of smart assistants the option to adjust settings for privacy or content delivery, or both, doesn't necessarily increase their trust in the platform, according to a team of Penn State researchers. In fact, for some users, it could have an unfavorable effect.

Trust in Amazon Alexa went up for regular users who were given the option to adjust their privacy and content settings, the researchers found in a recent study. However, for power users - individuals with more advanced skills and expertise - trust went down when they were given the opportunity to make privacy setting adjustments.

"That's kind of counterintuitive," said S. Shyam Sundar, James P. Jimirro Professor of Media Effects and co-director of the Media Effects Research Laboratory (MERL) at Penn State. "The mere presence of privacy settings seems to trigger thoughts of potential privacy problems among those who are aware of such loopholes in communication technologies."

He added, "Once you give power users these options and they realize [that privacy settings are] actually controllable, they tend to panic and see the between-the-lines message rather than see customization for what it is, which is really a benevolent effort to provide more user control."

Another major finding of the study showed that users who were sensitive about their privacy found content less credible when given the option to customize their privacy settings. However, trust in the content increased when these users were also given the opportunity to customize that content.

"It is really interesting to see that content customization, which is unrelated to privacy, alleviated the negative priming effects of adjusting privacy settings," said Eugene Cho, doctoral student in mass communications and lead author on the team's paper. "The empowering effect of customization noticed in our other studies extends to smart speaker interactions and to the context of privacy."

But, the quality of content customization services could be impacted by privacy customization settings, said Saeed Abdullah, assistant professor in the College of Information Sciences and Technology and a collaborator on the project. This concept is similar to other artificial-intelligence algorithms that draw on user history to drive personalized content on well-known platforms, such as suggesting the next movie to watch on Netflix or products to buy on Amazon.

"For example, if you delete your user history or your audio recordings from Alexa, it might mean that the platform cannot personalize its offerings very well for you," Abdullah said. "Some people might like them, as some people like to have the best recommendations from the systems. And in that case, they might not take advantage of the privacy options."

He added, "So in other words, the differences between individuals and their perceived expectations of these systems mean that people will use privacy settings in a different way. That's why providing control is so important."

As smart speakers become more common, there's increased concern about the degree to which the devices could be infringing on users' privacy. The researchers hope their work will encourage designers and service providers to incorporate content customization options that lower mistrust in content and relieve privacy concerns.

"If users want the devices to function the way they're supposed to function, they are supposed to always be on," Sundar said. "I feel like we've reached a point in our cultural conversation about the acceptability of having these kinds of devices in our homes, and to what extent we are comfortable."

"Our findings can help us to better design smarter, more privacy-sensitive and more trustworthy smart speakers in the future," added Abdullah.

In the study, 90 participants were recruited to interact with Amazon Alexa through an Amazon Echo device by asking several health-related questions. In the first part of the study, half of the users were randomly given the opportunity to customize their privacy settings - such as deleting their voice recordings - while the others were not. Then, another random half of the sample was able to customize their content - such as adjusting speed or content length, or selecting the source of information - while the other half was not afforded the opportunity.

Nasim Motalebi, doctoral student in informatics, also contributed to the project. The study was accepted after blind peer review to the 2020 ACM Conference on Human Factors and Computing Systems (CHI), and earned an honorable mention. The conference has been canceled due to the global coronavirus outbreak. The work is being published in the conference proceedings, released today (April 25).

Penn State
