New research looks at how online information can be manipulated

11/5/2014 Katie Carr, CSL

Associate Professor Michael Bailey has received a $225,000 NSF grant to study whether personalized information services on the Internet may be a new feeding ground for cyberattackers.

With the recent tight gubernatorial race in Illinois, voters have been bombarded this fall with TV ads, yard signs, and robocalls.

But what if candidates could use their campaign dollars to subtly influence your vote on social media? It might be as simple as paying to promote the status of a friend who just voted or to downplay negative status posts on Facebook.

Those actions might seem innocuous, but they could selectively nudge turnout.

“You can envision the possibility,” said Michael Bailey, a new associate professor in ECE who is also affiliated with the Coordinated Science Lab, “in elections where turnouts are only a couple hundred votes, could one use the sciences of influence and persuasion and a knowledge of Facebook and Google customization and personalization to actually influence the outcome?”

Bailey, who focuses his work on security and also has an appointment with the Information Trust Institute, was recently awarded a $225,000 NSF grant titled “EPICA: Empowering People to Overcome Information Controls and Attacks,” which will examine situations like this, where personalized information services on the Internet may become a new feeding ground for attackers to compromise the integrity of input data and affect outputs.

“We are interested in exploring the information ecosystem as a critical resource that needs to be protected in the same way that utilities or the transportation sector needs to be protected,” Bailey said. “You can’t have democracy without free and open access to information.”

This interdisciplinary work pulls together political scientists, computer scientists, and psychologists from multiple universities. Bailey will team up with Georgia Institute of Technology computer scientists Wenke Lee, Nicholas Feamster, and Hongyuan Zha; Georgia Institute of Technology political scientist Hans Klein; and University of Maryland computer scientist Marshini Chetty. Bailey, as a computer scientist focused on security, will specifically look at the creation of systems from an adversarial point of view. The group will also examine social and economic behaviors related to how we think about and view information.

One of the challenges Bailey has seen emerge, as the majority of people now receive their news online rather than via print or television, is that each aspect of the information ecosystem is changing: how we create, locate, aggregate, and consume news and information.

“One of the things I worry about most is the idea of systemic influence or persuasion,” Bailey said. “What would happen if someone like Google or Microsoft decided to promote a political agenda? How would we know that our search results are unbiased?”

Bailey added that with so much customization and personalization of individual results, it’s hard to determine whether your results have been manipulated. It’s not enough to simply curtail these features; as a society, we want Google and other search engines to know some things about us, because we get value from personalization and customization.

“However, it’s sort of a magic black box," he said. "We don’t know why it works the way it does, so it’s hard to figure out if they’ve started doing something we don’t want them to do. This information manipulation is what we’re going to be looking at.”

By bringing together a variety of security and machine learning techniques, the researchers will look at the idea of influence and persuasion in a new way. The group will create algorithms, techniques, and theories that advance the science of networking and security.

Specifically, the research group members will study the security of personalized services such as Google Search and News and online-targeted advertising to identify vulnerabilities, as well as develop countermeasures to prevent various attacks, alert users, and incentivize the industry to provide more transparency and protection. They will also be developing an evaluation framework to help facilitate development and adoption of new technologies.

“We want to create tools that help users better understand the impact of customization and personalization and all the elements of the information ecosystem,” Bailey said. “We also hope to influence the areas where technology interacts with public policy, law and societal ideas.”


This story was published November 5, 2014.