After refusing earlier this year to obey a federal judge's order to hack into the San Bernardino shooter's phone, citing its customers' security, Apple will voluntarily walk another line between protecting its users' privacy and collecting more information about people's habits with the fall release of its new operating system, iOS 10.
Accompanying the introduction of iOS 10 will be the implementation of differential privacy. Differential privacy is a set of statistical techniques that can produce aggregate information about a population without providing enough information for individuals within that population to be identified, said Aaron Roth, associate professor of computer science at Penn.
The advantages that differential privacy confers can be best understood with a simple example.
Suppose someone wanted to determine how many people in Philadelphia intend to vote for Donald Trump in the upcoming general election. One way of obtaining that information would be to call people on the telephone and ask them directly, but that method could easily give away the personal identities of the sampled population.
Instead, the Philadelphia population could be sampled in a sneakier way. Perhaps, before answering whether he or she intended to vote for Donald Trump in November, each survey respondent would flip a coin. If the flip came up tails, the respondent would answer truthfully; if it came up heads, the respondent would flip the coin a second time and answer "yes" on heads and "no" on tails, regardless of the truth.
Collecting data via this method from a large sample population offers the survey respondents the assurance of plausible deniability: any individual response is only weakly correlated with that respondent's true intention.
“It is possible to get a realistic picture of the predicted voting patterns using this randomized method of recording responses,” Roth said. “The law of large numbers enables one to subtract out mathematical noise from the data to obtain trustworthy, accurate results.”
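The coin-flip scheme Roth describes can be sketched in a short simulation. The snippet below is an illustration of the classic two-coin randomized-response mechanism, not Apple's actual implementation; the function names and the 30 percent "true" support figure are hypothetical, chosen only to show how the noise can be subtracted back out of a large sample.

```python
import random

def randomized_response(true_answer: bool) -> bool:
    """Report an answer using the two-coin randomized-response mechanism."""
    # First flip: tails -> answer truthfully.
    if random.random() < 0.5:
        return true_answer
    # Heads -> ignore the truth and answer according to a second flip.
    return random.random() < 0.5

def estimate_true_fraction(responses: list) -> float:
    """Subtract out the coin-flip noise from the aggregate responses."""
    # P(reported yes) = 0.5 * p + 0.25, so p = 2 * (observed_freq - 0.25).
    freq = sum(responses) / len(responses)
    return 2 * (freq - 0.25)

random.seed(0)
true_fraction = 0.30  # hypothetical true support in the population
population = [random.random() < true_fraction for _ in range(100_000)]
responses = [randomized_response(answer) for answer in population]

print(round(estimate_true_fraction(responses), 3))  # close to 0.30
```

No single reported answer reveals a respondent's true intention, yet by the law of large numbers the aggregate estimate converges on the true fraction as the sample grows.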
Apple will likely be employing differential privacy to acquire more complicated types of data from its various target populations, though Roth said he isn't sure exactly what the company is looking for.
Differential privacy, in theory, will make it difficult to ever trace information back to an individual user while still providing Apple's computer scientists with workable datasets. It is considered one of the most accurate privacy-preserving data-analysis techniques within the academic world, according to Engadget, a technology and consumer electronics magazine.
Google has also been public about its use of differential privacy to learn about the behavioral patterns of its users.