“Data Is the New Oil,” Says Futurist Amy Webb [TRANSCRIPT]
Can today’s fringe trends tell us what tomorrow’s tech normal will be?
Amy Webb likes to say that “data is the new oil,” and the parallels between the two valuable commodities were on display last week as Facebook founder Mark Zuckerberg tried to explain to Congress just what his social media company does with its users’ personal data. With so much data available about us — every time we use our smartphones, every time we sign up for customer loyalty perks — what do these issues tell us about the futures of the information and tech industries?
To get some answers, WDET’s Sandra Svoboda spoke with quantitative futurist Amy Webb, who looks at data in the present to make predictions about the future. She’s the founder of the Future Today Institute, teaches at New York University, and authored the book “The Signals Are Talking: Why Today’s Fringe Is Tomorrow’s Mainstream,” which shares some of her research and methods for making sense of data and trends to make our lives better.
Click on the audio link above to hear their full conversation.
Webb was in Detroit last week as part of the Future of Information series, a collection of free quarterly conversations sponsored by the Community Foundation for Southeast Michigan and the John S. and James L. Knight Foundation.
Click here to register for the next event, which is June 20 and features danah boyd, founder and president of Data & Society.
More information, research and data related to Webb’s work can be found here.
Click for related Detroit Today conversations:
Is Social Media Killing Democracy? with Jamie Bartlett, director of the Centre for the Analysis of Social Media at Demos, and Garlin Gilchrist, founding executive director at the Center for Social Media Responsibility at the University of Michigan.
What is Truth, and Does It Matter? with The Atlantic’s Jeffrey Goldberg and Time’s Charlotte Alter.
The Weaponizing of Fake News in Modern Politics, with Wayne State University’s Lee Wilkins.
Here’s a transcript of Sandra Svoboda’s conversation with Amy Webb, which aired on Detroit Today on WDET:
Sandra Svoboda: Everyone wants to know what the future is. How can we figure that out in a way that’s relevant in our everyday lives?
Amy Webb: Understanding the future is first and foremost understanding that there is no way to predict what’s coming. So with that in mind, the best that we can possibly do is to use data to listen for weak signals in the present and to look for patterns and from those patterns to spot emerging trends and then to use those trends to try to understand how the world may unfold. Really what that is about is thinking through risk and opportunity scenarios. And really, that’s intensive work, but it’s work that anybody can and quite frankly should be doing if you’re somebody who is concerned about the future.
Click here for the Future Today Institute’s 2018 Tech Trends Report.
Sandra Svoboda: Your writing and your research come largely from the tech industry, but for people who are involved in other sectors – news media, just for one example – how does what’s going on in the tech world apply to other industries now?
Amy Webb: In the year 2018, technology is interwoven into every aspect of everyday life. Regardless of whether you are the CEO of a mid-size firm, working in private equity, or working inside an auto manufacturing company, technology still intersects with your everyday life. If you’re trying to understand what’s on the horizon, technology has to be part of that conversation because it fuels what’s going on in the modern age.
Sandra Svoboda: Seeing what’s going on in the world with so many issues related to technology – and we’ll get into some of those specifics – how could we have better prepared for all of the issues we’re facing now?
Amy Webb: I guess the bad news is none of this should have come as a surprise to us. This goes back to allowing yourself to pay attention in a more meaningful way to what’s happening in the present. What are all of the weak signals surrounding us, and how can we make sense of that information? Twitter has now been around for more than a decade, and so has Facebook. But as early as 2010 there were signs that on both platforms, people and systems and organizations were using those systems in ways that would exploit our data. It should have been no surprise to the United States government that rogue actors would try to use bots on Twitter to spread misinformation. In fact, we saw that happening in 2011. We saw that happening in 2012. I was in Washington, D.C., advising senior leadership that this was the case in 2014 and in 2015. But it wasn’t until 2016 that we finally started talking about it. Now that we are talking about Cambridge Analytica and Facebook and what might or might not have happened to our data, again it feels like something that just happened.
Click here for NPR’s “How to Check if Your Facebook Data Was Used by Cambridge Analytica.”
However, the beginnings of this go back many years. The challenge is acknowledging that something seems a little off when you first notice it. Sometimes that can have negative business implications. The reality is that for many of our technology platforms, the business model is predicated on surveillance. The challenge is we must acknowledge that and either decide we’re OK with it, even though it may have negative implications for our democracy and our feeling of personal security, or we have to do something to change that. Or, I guess, there’s a third option, which is we don’t use the services. Everybody feels as though we’re caught constantly in this barrage of brand-new news, and it seems like things are constantly happening, when in reality very rarely do things change overnight. All of the threads of the current news stories that we feel we’re being bombarded with daily have been developing over a number of years, whether it’s technology companies using our data in ways we’re uncomfortable with, the reality that Russia may be slowly but surely infiltrating various parts of our democracy, whatever is happening this five minutes in Syria, or the slow decline of America’s news organizations into financial disarray. The key is to break that cycle of surprise and making decisions under duress. The only way to do that is to dedicate yourself to tracking those signals and trends that are data-driven and then making decisions as you go.
Sandra Svoboda: It’s interesting. You used the word “surveillance” when it came to data that are collected about us on Facebook, but I also kind of feel like this has been going on for years. We all like our frequent flyer programs. We like our credit card discounts, when you sign up and you get points. Everything that we’re part of in our economy, I feel like, has the potential for that data harvesting, data sharing, surveillance – to use your word. Are we just too far down this road to really make a difference now?
Amy Webb: So that’s a pretty big question. Here’s what I would say: If you have not yet heard this, you will soon, and that is a new aphorism that “data is the new oil.” The reason that people are likening data to oil as a natural resource is because, like oil, our data has to be located, mined, refined, productized and then turned into a commodity that people buy. Our data is very much the same. We are creating untold amounts of data just by being alive. Some of that is obvious: the tweets you tweet and the Facebook posts you post. But there’s a lot of ambient data that you create as well. If you’ve got a cell phone and you’re carrying it around with you, you’re shedding ambient data everywhere you go that’s accessible by third parties. The trick is, given how much data we’re shedding, those organizations capable of mining it, using artificial intelligence and other systems to make sense of it, and then turning that into products for others to use – those companies are making a lot of money. Here’s the thing: in some cases, it’s too late. In other cases, it’s not. Europe has decided they’ve had enough, and so on May 25 something called the GDPR (General Data Protection Regulation) is going into effect. It’s a sweeping privacy regulation that will affect all of the EU and governs how individuals’ data are used. That’s going to be hard to implement. If you think through the likely scenarios, not only is it difficult to implement, but it shows us that there’s a high probability that the internet will function differently in Europe than it will in the rest of the world. Which is to say that if you’re a news organization and you publish one story in the United States, it’s possible that people in France may have to see a different version of it. A lot of the personalization capabilities are going to be different in Europe than they are in the United States, which means different implementations.
Click here for the European Union’s GDPR page.
We have to stop and think through where we’re headed. While the EU has this blunt-force, sweeping regulatory change coming in a month and a half, in the United States it’s a free-for-all. We have no real regulation, we have no real oversight, and while there has recently been a circus in Washington, D.C., where our lawmakers have been on C-SPAN asking Mark Zuckerberg what probably feels to them like really challenging questions, the reality is we don’t have any real oversight here in the United States.
Sandra Svoboda: What’s motivating all of us when we know the dangers of technology? When we go on Facebook, we know our data, our personal information, is being shared, yet billions of people still use it. How do we reconcile that with what’s motivating us, and what does it tell us about American society now?
Amy Webb: What’s so interesting to me is that this is not a new use case. Throughout our history we’ve always had new technologies, and our lives have never moved as fast as those technologies are evolving. As a result, oftentimes that technology is not good for us in some way, but we keep using it. Larry Page, one of the founders of Google, talks about a toothbrush test. Inside of Google, a product will get a green light if it passes what he calls the toothbrush test: Is it something you’ll use once or twice a day, and does it make life a little bit better? A lot of our current technologies pass the toothbrush test, even if in the process they may be giving us cavities, right?
Take Uber. For a long time, Uber had surge pricing. For those of you not familiar with surge pricing, this is when, for any number of reasons, the cost of a ride would increase three, four, five times, sometimes even more than that. People would complain about it. They would take screenshots of their phones and post them on Facebook and Twitter and complain about how horrible it was that Uber was doing this to them, and yet the screenshot showed that the ride had been completed.
Sandra Svoboda: They still used it!
Amy Webb: That’s exactly right. So part of what’s happening here is a cognitive shift. When our lives are made very slightly easier or better in some way, we will use whatever that technology is, even if ultimately it does us some kind of harm. In the case of Uber, we’re spending a lot more money than we might have spent using a different mode of transportation because the experience is so easy and seamless and it works. There’s this national movement, #DeleteFacebook, and there are all these celebrities tweeting, of course, pictures of themselves.
Sandra Svoboda: The irony.
Amy Webb: I know, right? Tweeting themselves quitting Facebook. But we have to look at the data, and what do the data show? The data show that as of right now, not a whole lot of people have deleted Facebook. And by the way, for these five minutes while everybody’s suddenly got their attention focused on Facebook, we’ve all somehow forgotten that less than six months ago we were having the same conversation about Twitter and the revelation that there were Russian bots on the network.
Sandra Svoboda: And when are we going to have it about Google?
Amy Webb: Well, that’s the other thing. Think of all the different repositories for your data. Facebook is not the only company. And by the way, do you think that Cambridge Analytica is the only recipient? Let’s think about this in more realistic terms. But here’s what I would say: Facebook may have betrayed our trust, but Facebook didn’t do anything illegal, as far as we can currently tell. And if you’re a human using technology, and you’re using that technology or that platform for free, you have to understand that you are the product that’s being sold.
Sandra Svoboda: In the few minutes we have left, I’m going to ask you to think a little about the future. What happens now? Do we get policy and oversight? Does it come from the federal government? Are there things local governments are doing that relate to this issue of privacy and data?
Amy Webb: I’m a realist, so I’m just going to give it to you straight. Nothing happens. Nothing of any substantive change happens. And that is because, for all the ways in which these companies make us feel uncomfortable, our behaviors are not changing as a result, and these are companies that help make the American economy strong. I don’t see any serious change coming down the pipeline. If anything, these are the companies that are writing the legislation that will go into effect. Facebook has already done that for places like the state of Maryland.
Sandra Svoboda: Normally I ask policymakers or advocates what ordinary citizens can do to effect change, and I usually get “write your legislator” as an answer. But in this case, is our behavior just going to outweigh anything we can do in terms of the traditional citizen political engagement?
Amy Webb: The very best thing you can do, honestly, is develop a better set of digital street smarts.
Sandra Svoboda: Such as? Give us the toolkit.
Amy Webb: Here’s an easy one. You go into your grocery store. You get the little coupon card thing. They ask you to fill out your name and your address and all your personal information. But you don’t need to tell the grocery store where you live and what your email address is and all that other information in order to get that discount card. You’re getting a discount in exchange for your personal data. That’s something that most people don’t think of.
Here’s an easy thing that anybody can do this week. So it’s Monday. Here’s something that you can do for the next five days: Put yourself on a sort of audit. Every day, write down the number of times, and the circumstances in which, you are giving people data in ways that are obvious to you. Hopefully, as you do this throughout the week, the non-obvious ways in which you’re shedding data will start to become apparent. By the end of the week, by Friday, take a look at everything you’ve written down and consider: What are you comfortable with? What’s an even, fair exchange? And what are the behaviors that you have to change going forward?
Sandra Svoboda: I may just do that this week and see what it does. Will you audit me?
Amy Webb: You don’t want me to have your data.
Sandra Svoboda: No, I mean the times I’m giving it out and I’m thinking about it. Are there times it’s good to share your data?
Amy Webb: Again, I think we all just have to be a little smarter. Again, I would say that Facebook has betrayed our trust, and that’s really irritating. And Facebook definitely had some impact on our election, which, depending on where you fall on the political spectrum, is potentially reprehensible. On the other hand, Facebook helped enable the Arab Spring. So there are tradeoffs, as there always are. So the question you have to ask yourself is this: How comfortable are you giving away your personal information in exchange for services, and what are the implications of that going forward?
So another quick and easy thing anybody can do is audit the visual information they’ve shared. In my case: I have a daughter. We have never posted or shared anything about her online, ever, because when she was born a decade ago, this was something my husband and I were thinking a lot about. So go through all of the pictures you’ve ever shared of your kids. Stop and, given the knowledge you have now and what you’re seeing unfold, ask yourself how comfortable you are knowing not just that the photos exist but that they are visual data that can also be mined and refined and productized. And then anything that you’re not comfortable with, and it could be all of those photos, take them down.