As seen on The Tartan...
The revelation that Facebook may have passed the data of as many as 87 million users to Cambridge Analytica has brought the issue of online privacy to the forefront of the public consciousness. Consumers are asking who has their data, where their personal information may have been distributed, and how their interests fit into a business model in which the product is their own data.
Carnegie Mellon’s CyLab Usable Privacy and Security Laboratory, or CUPS, has been researching these issues since its founding.
According to its website, the lab focuses its research on three distinct aspects of security systems: “building systems that 'just work' without involving humans in security-critical functions; making secure systems intuitive and easy to use; and teaching humans how to perform security-critical tasks.”
This type of research has never been more relevant: as daily life moves online, the security systems protecting it shape how much of our personal information stays private. The complex nature of privacy in the internet era brings together Ph.D. candidates, master’s students, and other researchers across a variety of disciplines including, but not limited to, societal computing, engineering and public policy, human-computer interaction, computer science, electrical and computer engineering, and public policy and management.
The Tartan spoke to the founder and director of the laboratory, Dr. Lorrie Faith Cranor, who also serves as a professor of computer science and engineering and public policy, as well as associate department head of engineering and public policy. She started the lab in 2004 “shortly after [becoming] a faculty member at Carnegie Mellon” and has “been the director from the beginning.”
She explains that privacy and security applications need to be less "hands on" for humans, as “any time humans have to remember to do something or pay attention, there is a risk that they won’t.” She continues, “for example, when web browsers pop up security warnings, humans tend to just swat them away without paying attention.” A more secure and usable browser or system would handle the security threat accurately without human interaction — exactly the kind of system the lab researches and develops.
Published CUPS research includes a 2017 paper by Joshua Tan et al. titled “Can Unicorns Help Users Compare Crypto Key Fingerprints?”, a 2007 paper by Lorrie Cranor et al. titled “Teaching Usable Privacy and Security: A guide for instructors,” and a 2006 paper by Aleecia McDonald et al. titled “How Technology Drives Vehicular Privacy.”
These diverse publications, covering multiple facets of science, technology, and policy, allow the laboratory to have a broad impact in the world of privacy and security systems. Dr. Cranor elaborated on this by mentioning just a few instances of the laboratory’s impact.
“In 2008 we started a company based on our research into anti-phishing training," Dr. Cranor explained. "That company, Wombat Security, was purchased by Proofpoint for $225 million in March 2018. It is now a leading supplier of security awareness training services to companies and organizations around the world.”
Furthermore, Dr. Cranor stated that “Microsoft Internet Explorer phishing warnings were improved based on [our] research” and that “Facebook implemented privacy nudges to remind people to adjust their settings based on [our] research.” Dr. Cranor also “started the Symposium on Usable Privacy and Security (SOUPS),” which is now “in its 14th year and run by the USENIX Association.”
While no papers have been published this year, one currently in progress concerns a system in which many Carnegie Mellon students and faculty are stakeholders. Titled “‘It's not actually that horrible’: Exploring Adoption of Two-Factor Authentication at a University,” the paper examines Carnegie Mellon University's use of Duo Mobile login authentication. It will be presented at the 2018 Conference on Human Factors in Computing Systems later this month.
In addition, Dr. Cranor noted that a “paper on how people use private browsing modes in web browsers” and “a study of what people draw when asked to draw pictures of privacy” are currently in the works.