How are power and control being exerted through emerging technologies? How is crime being digitalised, and how should we address it? What are the perils and possibilities of AI like ChatGPT and DALL·E? How do we reimagine freedom and agency in navigating this landscape?
NATHALIE NAHAI: Hello, folks, welcome to this conversation. I am Nathalie, host of The Hive podcast, and also host of the very exciting six-week online course with advaya that we've got coming up on the 10th of April, exploring the digital age, understanding and reclaiming systems of power. And with me today, I'm very excited to have in conversation tech researcher and brilliant author Carl Miller, who some of you might have seen at the Medicine Festival, at DEMOS, and elsewhere. He's one of our brilliant teachers coming up on the course, and I'm gonna ask him a bit today about what the digital revolution is, and what it means for how we live. So Carl, what are your thoughts on that? First question to open up the chat.
CARL MILLER: Well, what is the digital revolution? It's both something quite simple and quite mysterious all at once, isn't it? Because on the one hand, it really just literally means the adoption of digital technologies and the process of digitalisation, which companies and policymakers will talk about: us replacing the things that were previously analog with things which are digital. That might be everything from the invoices that you send through, to how you get your news. But, of course, it's far more than that, because all those influences and all those consequences have kind of rippled through those screens, into all of us.
So the digital revolution is not just about the adoption of technology; it's actually a kind of mangled intertwining of a vastly profound series of social, psychological and political consequences.
So it's changing all of us. It's a human revolution and a social revolution, probably, I'd say, first and foremost, and the way that those two bounce off each other—that is, human beings and societal consequences with tech—is what I've really spent most of my life trying to untangle.
NATHALIE NAHAI: And so if we're thinking about things like power and control—not only social, but also political and cultural, whichever way you want to slice and dice it—how are they being exerted through existing technologies today?
CARL MILLER: Well, in ways which are quite mysterious and quite overlooked and misunderstood. So power is a slippery fish to catch, first of all. It's a concept—massively important, I think, I mean, I wrote a whole book about it—but quite hard to define. It comes in all these different forms: sometimes economic, sometimes coercive, but more often than not, actually, ideational—to do with persuasion and the force of ideas. And it flows through those technologies in all those different ways, but in ways, I think, which are increasingly mysterious to us.
So that might be, for instance, the default settings that are embedded in the software that you use, and I use—and that I'm certainly often too lazy to change. That's a form of control. A form of control might be the way in which hackers can wrest control of and master systems in ways which the designers didn't intend. Control might be the way in which your ideas are shaped and formed by the information that you see and consume—through, for instance, YouTube videos which have been recommended to you. So in one way or another, whether it is coercive or economic, I think two things have really happened with power—and I'm condensing a book into a 30-second answer here. One is that it's happening in ways which we don't often understand and don't often see, so it's hidden. And it's also quite wild: it's happening in ways which we often can't control ourselves, and certainly haven't set guardrails around.
NATHALIE NAHAI: I have a feeling that that is going to be something that we dig into quite a lot more when you have your session with us, because that's a big one. It has big implications for agency, personal sovereignty, freedom.
But just to kind of dig into the headlines and leave it at that sort of level at the moment, what do you see as some of the biggest disruptions happening in the world of digital democracy?
CARL MILLER: Well, I mean, digital democracy is itself a disruption. It's essentially the idea that we can constantly change the way our democracies work, using technology in one way or another. It began largely as a kind of activist pursuit, I think, because people wanted to find ways of making governments more responsive and agile to their citizens. But if you look now to places like Taiwan, some of the Baltic countries, and some cities and local governments around Europe, it's beginning to be used by those governments themselves, so it's beginning to be attached to power. And really, when you start to create new processes—like consensus-finding ways of having discussions online and then turning those into laws, as is done in Taiwan—the sky's the limit.
Suddenly democracy is a living being again, it's not something which is kind of baked into vellum.
It's not simply a tradition we inherit; it's also something that we control and continue to work on ourselves. And that is itself, I think, a profound disruption, right? Because when you start to see democracy as something which lives and breathes and grows, rather than, say, just the single act of voting every five years, what democracy even is will change in our eyes, I think. And that is a huge disruption if it happens.
NATHALIE NAHAI: Yeah, I like the idea that it's something which can evolve and become more of a living, responsive organism based on how people use it. I've got a lot of thoughts about the level of democracy we have when you only get to vote on a predetermined set of people. Okay, so let's talk briefly about cybercrime. What's going on in the world of cybercrime? You've got some really exciting... well, [this is] probably too short for long stories, but exciting stories about what's going on at Black Hat conferences and all this kind of stuff...
CARL MILLER: Indeed, yeah. So cybercrime is one of these dynamics where power is constantly competed over—between law enforcement agencies, states and criminals—and one I've been kind of entranced and scared by for many years now.
The first thing to say is that crime has utterly transformed over the last 20 years. About half of crime now happens through the devices and technologies which we use, largely via the internet—we call it cyber-dependent crime. You're more likely to have your social media accounts burgled than your house; you're more likely to receive a virus than be the victim of all forms of violent crime put together. So crime has completely, completely transformed. We often don't see it—it's not necessarily visible out there in the street—but it's crime which impacts so many more of us than before, and it's continuing to transform.
The thing that hasn't really transformed, unfortunately, is the police. The police are set up with these extremely geographic-centred ways of doing policing: we have 43 different local constabularies across the UK, and most other European countries are organised on largely the same basis. So the problem—and there are lots of good news stories to do with the digital revolution, but this isn't one of them—is that we are living through, in my eyes, one of the most serious crises of law enforcement ever, maybe the most serious one ever. I think police forces, and therefore states, are really, really struggling to work out how to enforce laws online in the same way they can offline. And in the vacuum left by, I think, overly distant and overly geographically centred police forces, we're seeing vigilantism, and private citizens, in many ways, taking the law into their own hands, or at least trying to protect themselves in any way that they can.
NATHALIE NAHAI: Can you say just a little bit more about that? Because I'm so curious about what that looks like.
CARL MILLER: Sure. I mean, the most obvious kinds of groups that we've seen emerge are pedophile-catching vigilante groups—there's lots of them, and they're pretty active. And in some cases, they have started to work with the police in a strange way, because they're basically, in one way or another, trying to lure pedophiles into sting operations, and then expose them publicly or hand them over to the police. They work in different ways. Or, more problematically, they go and try to follow the people they presume, but haven't proven, are pedophiles. So it's actually a really good example; I think the people that do it feel like the police aren't active enough.
Child sexual exploitation online is one of the areas which has probably changed most—one of the gravest consequences of this transition in cybercrime. But when vigilante groups start doing it, rather than the police, they create all these problems as well. So it's just one of many. And I will have more stories soon about how I've been drawn into that world, perhaps more than I ever thought I would be—not this particular world, actually, but the one of non-police officers trying to do something about the crimes they see online.
NATHALIE NAHAI: Yeah, I mean, that's a whole other level of disruption and agency that we might not hear about—see, touch, feel.
So let's talk a little bit about the perils and possibilities of AI, and possibly automated decision-making, because everyone who's got a laptop or a smartphone, I would guess, will by now have heard of ChatGPT and DALL·E, and all of these natural language processing platforms that have been designed by various big tech companies to interact with us—and some of the problems that have historically shown up when that happens. So what do you think are the risks around AI, maybe in the short term, and maybe looking down the barrel a little bit, for the next 18 to 24 months?
CARL MILLER: Yeah, I mean, it's funny. So especially with ChatGPT. I actually founded an NLP company 10 years ago with colleagues—I'm not a foundational researcher in NLP, but most of them are—and it continues to this day. So it was strange, about six months ago, when these new, what are called auto-regressive or transformer-based, models suddenly started coming out, and all my colleagues were just losing their minds about them. They weren't writing essays for people, they were doing other kinds of tasks, but they were doing them with so much greater precision and power than the previous generation of models. Clearly something important was afoot.
The thing that's strange about the use of those kinds of approaches in something like ChatGPT—and, I'd say, a whole field of technologies like it—is that we're desperately trying to teach these things how to mimic human behaviour.
And there are very strong commercial reasons for that. But it does create a series of problems for us. So once you can create a robotic simulacrum that can hold a conversation with you...
I mean, the other thing that I work on, and use NLP to try and solve, is information warfare—the way that autocratic states will manipulate our information spaces. There, what we're beginning to look at is how these kinds of models and technologies will transform that. And that's quite worrying. Because suddenly, rather than just spamming apparent disinfo at you from Russia, operations could set up and run tens of thousands of direct conversations with people, and actually draw people into relationships, draw people into friendships.
And for those of us who work on influence and how behaviour change works—and I know you do as well, Nathalie—that kind of thing, I think, is much scarier than a distant voice spamming you with information, because we know that influence flows through those kinds of social linkages.
NATHALIE NAHAI: Yeah, that's curious, isn't it? Because one of the things that keeps coming up, time and again, is the fact that when there've been large breaches in businesses, even considering the tech advancements that we've seen, it's often the social engineering component—getting someone at reception to let someone in, to give them access to certain prohibited areas... it's often the social aspect which is the weakest link. And of course, we're now using technology to create what some people have called synthetic relationships—between individuals, en masse, at a massive scale—with what we think of as chatbots, but which are actually much more complex and nuanced than that. If you end up with all these people in synthetic relationships, then the influence that can flow through those channels is going to be very hard to rein back.
What do you even think are the possibilities for maybe identifying and/or curtailing those sorts of interactions?
CARL MILLER: I was gonna say, actually, just before that: the social engineering part of DEF CON, which is this giant melee of hackers that meet in Las Vegas every year, is always my favourite bit. They do elicitation exercises, where the best social engineers in the world will be up on a stage doing live calls to, say, a bank, to work out who works [unintelligible] the bank, and it is crazy how consistently successful they are at that kind of stuff, even with employees who have been trained to identify and be resilient to elicitation exercises. So I love that. To me, that form of hacking—hacking humans, in a way—is so much more thrilling and interesting to see than simply trying to, say, decode secrets buried on silicon chips, although that can be quite interesting too.
But yes, how are we going to defend against it? We don't know. The problem is that there's this constant duel right now between those of us building methods to try and protect information spaces, and those building methods to try and invade and manipulate them, and it's a constant back and forth. We can try and build models to detect their models. We can try and teach people to be more resilient. But probably what we really need to start doing—and I think where the conversation is beginning to go—is actually targeting the people that do this at scale in the most sophisticated ways. There are people, organisations or companies, that sell this, and sometimes they freely vend their services—I hope no one listening to this gets any ideas—and operate fairly openly, and they shouldn't and can't be allowed to.
So I think there's going to be some serious thinking in the months and years ahead about how we start to close those gaps. Those companies shouldn't exist. And they certainly shouldn't be using banks and payment systems, and being part of the polite digital economy.
NATHALIE NAHAI: The polite digital economy. That's a great rallying cry.
Yeah, okay. So lots to think about. I think what you're pointing towards in some ways, when I hear you talk about this—and also about democracy—is systemic change, because the systems that we have are too slow. They're not responsive enough, and they don't give enough people agency. I mean, that's debatable, depending on where you are in the pecking order of power, and whether you want to give up more of that turf or not. But it feels like a lot of change has to happen. And I guess one of the questions is: where does that change originate? Does it have to be from the ground up? Or is it getting a few people in key positions of power to listen to folks like you, who are doing the work at the vanguard of this sort of research, and saying: look, really, we need to be aware of and investigating these sorts of things?
CARL MILLER: Well, I think the problem is really that different kinds of systems change at different times and at different speeds. You've got systemic change that can happen through technology, which has always been typically quite fast. But then there's organisational change—say, of the police, which also needs root-and-branch systemic reform. In my view, there should be one police force responsible for all digital investigations, or cybercrime investigations, in the country; it makes no sense to split that across 43. That happens very slowly... we've been seeing a crisis building up over a decade or more, but really absolutely no capacity, from the police or the Home Office, to actually make that kind of root-and-branch reform. It takes a very long time.
The one that people worry about more often is actually drafting new law to respond to those systemic changes from tech. I think there, it's quite clear to legislators, and actually to everyone, that we're not drafting law quickly enough, and the kinds of laws we therefore have to draft typically have to be these super open-ended, future-proof things—which you're probably never going to have to re-draft in the next lifetime, but which are therefore so broad that when we look at them and read them, we have no idea how they're actually going to be used.
NATHALIE NAHAI: Wow. Okay. So then let's dig a little bit more into that before I go to the last question, because we've got a little time left. Where do you see the greatest progress being made in forms of legislation? Are there countries or organisations that are making propositions that are working, that treat tech advancements differently than before?
CARL MILLER: European Union.
NATHALIE NAHAI: Really?
CARL MILLER: Yeah, the European Union. I've come back from two different EU events very recently, one in Stockholm and one in Brussels. And they've now knitted together this actually quite impressive and specific series of legislation to govern digital platforms: the Digital Markets Act, the Digital Services Act, the voluntary code on disinformation. And there's more and more coming out. We don't really know if it's going to work or not yet, because it hasn't been pulled through into regulatory action and case law, and ultimately fines or other kinds of penalties. But there's a real determination, I think, across the European Union to make this happen and make this work.
So in a strange way, whilst most of these platforms came out of Silicon Valley, or China, it's the European Union that's actually leading this new wave of regulatory control, putting up all these legal guardrails to try and control them more.
I think it's very impressive. They've acted much faster than my government here in the UK, and certainly faster than the United States, which is now years behind.
NATHALIE NAHAI: Well, that's not going to sit comfortably with people who... well, I mean, it might do, for people who actually don't want more regulation. There's a kind of wild west approach of "we don't want to be regulated", and I know that's a popular place to stand. But I think when we're talking about the kinds of cybercrimes you've described, the impact on the mental health of young people... I remember reading, years ago, a study—it was someone I interviewed, a German professor who'd done several studies on the impact of too much screen time on the development of children's lenses, their eyes basically, and how it was causing myopia. This is not research you hear about in Europe, or in the West, unless you have someone who goes out of their way to do that kind of exposé. So yeah, there are some really varied and impactful ramifications to the way in which we currently use tech, and it's good to hear that Europe is leading the way on this.
So if we're thinking about—and this is the final question before I let you go—individual freedom and agency within the context of surveillance capitalism, so companies making lots of money out of our data: what does it mean to you when I say these terms, individual freedom and agency, and what could it mean if we change the way in which we conceive of and regulate technology?
CARL MILLER: Well, I mean, freedom, much like power, is an important yet very, very thorny idea. I often think that when people say freedom, they mean the capacity to shape their lives in ways that they want. Although I've always been pretty convinced that systemic and structural factors around us, whether we like it or not, do an enormous amount of the shaping of our lives. So I've always been slightly hesitant, actually, to use the word freedom in that sense. We are knitted out of the fabrics of our society and everyone that we know.
But I think in a very specific sense, in surveillance capitalism, freedom does mean, at least at the outset, knowing what is being done to you, and what the costs of that are. And I'll tell you a very brief 30-second story. I did an investigation with the BBC, actually, on surveillance capitalism, where I tried to recreate myself from all the data that was available about me. So I went to basically every company, exercising a European law—a subject access request, it's called—to ask for my data back. And all these companies sent my data back, and there was tons of it: printed out, probably from the floor up to about my waist, or my chest. There were three different types of it. There was some which was, like, emails I'd sent them—which is fine, I'm very much aware that they hold that, and that they need to. There was some which was generated by me using their services in ways which were less visible, so that might be Uber recording my GPS coordinates every couple of seconds, that kind of thing...
... I think there was some background recording—I'd have to go back and check, and I don't want Uber to sue me if I'm making the wrong claim there. But definitely tons, [which could make a] very, very rich GPS picture, which maybe, if I'd stopped to think about it, I would have thought might be recorded, but I hadn't really seen it all in one place. But then there was a third kind of data, which was data generated about the data about me. This was data that I could have had no idea existed. And this was, for instance, audience segmentations for ad tech—so very much directly the surveillance economy—and this was stuff like what kind of consumer I was. Now, one company had me down in a consumer bracket called "young and struggling". They made all these inferences. Another one was like "Mondeo Man" or something. Yeah, they came up with slightly offensive audience segmentations...
NATHALIE NAHAI: I think that says more about their lack of creativity than you.
CARL MILLER: I mean, yeah. And they were just garbage, incidentally—so whoever they were selling these to, it just wasn't working. But then there were also ones which were percentage chances of, for instance, me liking gambling, or me responding to alcohol sales. And actually, that's where I felt... if we look at gambling, for some people with problematic gambling behaviour, that's really their agency being taken away by compulsions which have formed. And to have organisations making probabilistic guesses as to whether I like gambling, presumably to sell me gambling, is not... I didn't think that was okay at all.
And that's just one very narrow example where, for some people, I think, the surveillance economy is all about actually taking agency away, rather than giving it to them. So, yes, there we go. And everyone listening to this can go and do this yourself—that's an instruction.
I think it's worth us all kind of thinking about how we live out there in data form.
NATHALIE NAHAI: It'd be amazing if there were a company that could basically take that data, parse it, and put it in a kind of dashboard format. That's something I would enjoy—to say, okay, this company has this much data on you, and this is what they've inferred. Maybe that's another...
CARL MILLER: That sounds like a business idea, Nathalie—why don't you go ahead and do that?
NATHALIE NAHAI: No, no, no. I don't want to be feeding the beast. Okay, so of all of these different topics, which are very interesting, which are the things that you're going to be talking with us about in your session? What are the things that ignite you right now? The kind of tidbits that people can expect?
CARL MILLER: Oh, well, I think I'm going to be talking about all of this, really. Power will be an idea that I'm going to be arguing to everyone is an idea that they need, and need to think about—in all these different areas of life: politics, of course, but we might talk about business as well, and maybe the law. I'll tell some stories about cybercrime and the police. And we're going to try and explore this interplay of liberation and control, with power sitting in the middle of that.
I have long seen this tremendous onrush of both liberation and control flowing through all these screens to us in the digital age. I think in many ways, the digital age is just characterised by both of those happening more than they did previously, so it's not going to be either a parade of horrors or of wonderfulnesses, but all of those things wrapped up together.
And I'll try, I think, to unravel it in a way which gives people a sense of what they can do about it too.
NATHALIE NAHAI: Brilliant. And if you've not had the chance to listen to some of Carl's stories, they are absolutely brilliant—gripping, edge-of-your-seat kinds of stories, sometimes of horror, sometimes of amusement.
Okay, so the course is happening on April the 10th, on Mondays, from 5–7pm GMT. I'm going to be hosting it, and Carl is going to be speaking on it. We've got some amazing, amazing speakers, activists, pioneers. So if you'd like to come, go to digitalage-course.com, check it out, and come and join us. It's gonna be a wild ride, and probably a lot more wild than you might even expect given the title, digital age. There's a lot of crazy stuff out there, but I think [it] would be very useful for us to know about and make different choices on.
Carl, thank you so much for your time today. It's been fun, as always.
CARL MILLER: Thanks Nathalie, as ever.