In The Secret Life of Data, journalist, sci-fi author, and American University communications professor Aram Sinnreich ’00JRN teams up with coauthor Jesse Gilbert to explore the unknown impacts of the information age.
What do you mean by the “secret life” of data?
It’s what happens to data after its original purpose is complete. Our world is saturated with devices, from laptops to voice assistants to smart refrigerators that know when you’re out of milk, but there’s no expiration date on the information produced by a single online event or interaction. When we upload a selfie, stream a video, look up directions, or track our sleep, these actions are recorded, analyzed, and combined with other data without our knowledge. And that can happen again and again in the future, using techniques that may not even exist yet.
Why do corporations keep so much data?
First, the cost of storage has become so cheap that it makes economic sense for companies to hold on to data in case it becomes valuable in the future. Second, there’s no legal reason not to do this — in the United States, we have zero federal laws against data collection.
As for how the information is currently used, tech companies have long sold data to advertisers, who use it to show us targeted ads. Now, with the artificial-intelligence revolution, those companies are using our data to train generative-AI models. For example, if you use Gmail, Google analyzes all your e-mails. It anonymizes your personal information and aggregates it with data from millions of other accounts to help make Google chatbots and other AI systems more humanlike. Other companies like Meta, Amazon, and Apple are also using the photos you upload on social media and recordings captured by Alexa or Siri for their own AI models.
What are some of the more troubling implications of data collection?
With smart devices, data is being collected about you even when you might think you’re offline. For instance, investigative journalists recently found that General Motors’ roadside-assistance service OnStar had been sharing data about drivers’ bad habits with insurers, who in some cases raised customer premiums.
Our data can be stolen by hackers, used by repressive governments to monitor citizens, and accessed by ex-partners who want to keep tabs on us. Intelligence agents, police officers, and employees of social-media companies with access to massive databases have been known to abuse those resources by stalking or harassing former intimate partners and others.
You have given examples of how data can be used for ill intent, but can’t those large and diverse data sets also be used in the public interest?
Used ethically and transparently, big data can give us more nuanced understandings of everything from our political world to natural ecology. It can help urban planners reduce traffic congestion in a city by observing transit patterns. Or it can help analyze the medical records of millions of people to suggest new treatments for disease. It can make the global supply chain more efficient by assessing product demand, so that fewer gallons of gasoline are used to transport goods. There are all kinds of ways in which big data can and does improve the human condition. But without guardrails, it is just as likely — or more likely — to have countervailing consequences.
As a science-fiction novelist, what do you think classic sci-fi got right about the future?
I think the most prescient observation is that quasi-intelligent computer systems come with heavy costs. There’s no way for us to delegate knowledge to machines without grappling with the definition of personhood. The notion that intelligent systems might run amok traces back to Frankenstein: Mary Shelley observed that human beings, once endowed with scientific knowledge and mastery over technology, are capable of building powerful machines that can wreak terrible, if unintended, social harm.
Is it possible to avoid data collection — and exploitation — altogether?
It’s very difficult. There’s almost nowhere you can go that’s not within range of the Internet. The latest versions of the iPhone have satellite chips. Our world is full of Wi-Fi hotspots and QR codes that merge physical and virtual spaces. Even on Mount Kilimanjaro, you can connect to a network and exchange data.
My most important tip is practicing “data kindness.” This isn’t just about protecting yourself; it’s about protecting one another. For example, you might want to ask for permission before posting pictures of your sibling’s family on social media.
The ethics of big data is a new area that society has yet to figure out. There are no simple answers to questions like whether it’s OK to submit your DNA — and effectively your relatives’ DNA — to a genetic-testing company. Only by being honest about technology can we begin to come up with an ethical framework. We don’t have to agree on the right policies, but we can all agree that we are producing far more data than we ever imagined, with far more consequences than we planned for.
This article appears in the Fall 2024 print edition of Columbia Magazine with the title "What Will Happen to Your Data?"