
People Want Data Privacy But Don’t Always Know What They’re Getting

  • root
  • November 9, 2020
  • 4 minute read

The Trump administration’s move to ban the popular video app TikTok has stoked fears about the Chinese government collecting personal information of people who use the app. These fears underscore growing concerns Americans have about digital privacy generally.

Debates around privacy might seem simple: Something is private or it’s not. However, the technology that provides digital privacy is anything but simple.



Our data privacy research shows that people’s hesitancy to share their data stems in part from not knowing who would have access to it and how organizations that collect data keep it private. We’ve also found that when people are aware of data privacy technologies, they might not get what they expect.


Differential privacy explained

While there are many ways to provide privacy for people who share their data, differential privacy has recently emerged as a leading technique and is being rapidly adopted.

Imagine your local tourism committee wanted to find out the most popular places in your area. A simple solution would be to collect lists of all the locations you have visited from your mobile device, combine it with similar lists for everyone else in your area, and count how often each location was visited. While efficient, collecting people’s sensitive data in this way can have dire consequences. Even if the data is stripped of names, it may still be possible for a data analyst or a hacker to identify and stalk individuals.

Differential privacy can be used to protect everyone’s personal data while gleaning useful information from it. Differential privacy disguises individuals’ information by randomly changing the lists of places they have visited, possibly by removing some locations and adding others. These introduced errors make it virtually impossible to compare people’s information and use the process of elimination to determine someone’s identity. Importantly, these random changes are small enough to ensure that the summary statistics – in this case, the most popular places – are accurate.
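The randomization described above can be sketched with a stdlib-only randomized-response scheme, a common building block of local differential privacy. This is a minimal illustration, not the article's or any vendor's actual system; the function names and the flip probability `p` are assumptions chosen for clarity.

```python
import random

def randomize_visits(visited, all_locations, p=0.25):
    """Randomized response: flip each location's membership bit with
    probability p, so any individual's reported list is plausibly deniable."""
    noisy = set()
    for loc in all_locations:
        bit = loc in visited
        if random.random() < p:
            bit = not bit  # the random flip disguises the true record
        if bit:
            noisy.add(loc)
    return noisy

def estimate_visit_count(reports, location, p=0.25):
    """Debias the aggregate: if c of n noisy reports include the location,
    an unbiased estimate of the true visit count is (c - n*p) / (1 - 2*p)."""
    n = len(reports)
    c = sum(location in r for r in reports)
    return (c - n * p) / (1 - 2 * p)
```

Each individual report is unreliable by design, yet the debiased aggregate recovers the popularity ranking the tourism committee wanted.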

The U.S. Census Bureau is using differential privacy to protect your data in the 2020 census.

In practice, differential privacy isn’t perfect. The randomization process must be calibrated carefully. Too much randomness will make the summary statistics inaccurate. Too little will leave people vulnerable to being identified. Also, if the randomization takes place after everyone’s unaltered data has been collected, as is common in some versions of differential privacy, hackers may still be able to get at the original data.
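The calibration trade-off has a concrete form in the classic Laplace mechanism, where a privacy parameter epsilon sets the noise scale: smaller epsilon means stronger privacy but noisier statistics. The sketch below is a stdlib-only illustration under that standard mechanism; the `private_count` name and the sampling details are my own, not from the article.

```python
import math
import random

def private_count(true_count, epsilon, sensitivity=1.0):
    """Laplace mechanism: one person joining or leaving changes a count by at
    most `sensitivity`, so Laplace noise with scale sensitivity/epsilon yields
    epsilon-differential privacy. Smaller epsilon => larger scale => more noise."""
    scale = sensitivity / epsilon
    # Inverse-CDF sampling of a Laplace(0, scale) draw, stdlib only.
    u = random.random() - 0.5
    noise = -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))
    return true_count + noise
```

Running this with epsilon = 0.1 versus epsilon = 10 makes the trade-off visible: the former hides individuals well but perturbs a count of 500 by around 10 on average, while the latter barely perturbs it and offers correspondingly weaker protection.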

When differential privacy was developed in 2006, it was mostly regarded as a theoretically interesting tool. In 2014, Google became the first company to start publicly using differential privacy for data collection.

Since then, new systems using differential privacy have been deployed by Microsoft, Google and the U.S. Census Bureau. Apple uses it to power machine learning algorithms without needing to see your data, and Uber turned to it to make sure their internal data analysts can’t abuse their power. Differential privacy is often hailed as the solution to the online advertising industry’s privacy issues by allowing advertisers to learn how people respond to their ads without tracking individuals.


Reasonable expectations?

But it’s not clear that people who are weighing whether to share their data have clear expectations about, or understand, differential privacy.

In July, we, as researchers at Boston University, the Georgia Institute of Technology, Microsoft Research, and the Max Planck Institute, surveyed 675 Americans to evaluate whether people are willing to trust differentially private systems with their data.


We created descriptions of differential privacy based on those used by companies, media outlets and academics. These definitions ranged from nuanced descriptions that focused on what differential privacy could allow a company to do or the risks it protects against, to descriptions that focused on trust in the many companies that are now using it, to descriptions that simply stated that differential privacy is “the new gold standard in data privacy protection,” as the Census Bureau has described it.

Americans we surveyed were about twice as likely to report that they would be willing to share their data if they were told, using one of these definitions, that their data would be protected with differential privacy. The specific way that differential privacy was described, however, did not affect people’s inclination to share. The mere guarantee of privacy seems to be sufficient to alter people’s expectations about who can access their data and whether it would be secure in the event of a hack. In turn, those expectations drive people’s willingness to share information.

Troublingly, people’s expectations of how protected their data will be with differential privacy are not always correct. For example, many differential privacy systems do nothing to protect user data from lawful law enforcement searches, but 20% of respondents expected this protection.

The confusion is likely due to the way that companies, media outlets and even academics describe differential privacy. Most explanations focus on what differential privacy does or what it can be used for, but do little to highlight what differential privacy can and can’t protect against. This leaves people to draw their own conclusions about what protections differential privacy provides.



Building trust

To help people make informed choices about their data, they need information that accurately sets their expectations about privacy. It’s not enough to tell people that a system meets a “gold standard” of some types of privacy without telling them what that means. Users shouldn’t need a degree in mathematics to make an informed choice.

Identifying the best ways to clearly explain the protections provided by differential privacy will require further research to identify which expectations are most important to people who are considering sharing their data. One possibility is using techniques like privacy nutrition labels.

Helping people align their expectations with reality will also require companies using differential privacy as part of their data collecting activities to fully and accurately explain what is and isn’t being kept private and from whom.


Gabriel Kaptchuk, Research Assistant Professor in Computer Science, Boston University; Elissa M. Redmiles, Faculty Member and Research Group Leader, Max Planck Institute; and Rachel Cummings, Assistant Professor of Industrial and Systems Engineering, Georgia Institute of Technology

This article is republished from The Conversation under a Creative Commons license.

