In conversation with Nick Adams, Founder & Chief Scientist of Goodly Labs.
Nick Adams describes himself as a parallel, rather than serial, entrepreneur. He started his work in public service as a campaign manager, hoping to go into politics himself. But Nick quickly realized he could do more good focusing on technology.
One of the big challenges governments have is listening to their citizens at scale. Major input into policy comes only once every four years as part of the electoral cycle. But a cohort of new founders and startups (Kialo, Pol.is, Society Library, and Nick’s initiative, Public Editor) aims to address creeping disinformation and the challenges of finding consensus.
Nick’s initial foray into learning at scale came from trying to analyze the Occupy movement. He needed a way to wade through thousands of documents from organizers, law enforcement, and the many organizations involved in occupations throughout the US. He incubated Goodly Labs at Berkeley, and their motto could scarcely be more aligned with our own: Building “Technology of, by, and for the people,” they “equip individuals with collaborative tools for building a better society.”
In this episode of FWDThinking, Nick and I discuss how leaders can better understand what societies are saying; how to tackle the asymmetric threat of fake news with the wisdom of crowds; and whether technology has rendered representation obsolete. Nick will also be speaking at FWD50 this November.
All opinions expressed in these episodes are personal and do not reflect the opinions of the organizations for which our guests work.
Alistair Croll: [00:00:07] Hi everyone. And welcome to another episode of FWDThinking. I am thrilled today to be joined by Nick Adams, who is the founder of Public Editor, TagWorks and a whole bunch of other things. One of the themes that came up last year in FWD50 was the concept of resilient democracy.
[00:00:22] And if you’re reading any kind of news today, you know that democracy itself is in a kind of epistemic crisis, because we don’t have a shared context. In the past there was a professional class of experts we trusted to give us facts, but today everybody is an expert, able to publish on a level playing field that’s gone from expensive one-to-many broadcast communications to cheap any-to-any communications. This obviously presents a tremendous number of challenges for anybody trying to get the facts straight and understand the nature of information, which is necessary for functioning societies. So please join [00:01:00] me in welcoming Nick Adams. Hi, Nick.
[00:01:03] Nick Adams: [00:01:03] Hey, thanks for having me, Alistair.
[00:01:05] Alistair Croll: [00:01:05] So I have a lot of questions for you. Digging a bit into your background, I noticed that you said you started in politics, running electoral campaigns, and then changed your mind and decided you could do better work in other ways. Was there a thing that made you go: okay, I’m done running elections and campaigns, and I want to focus on information?
[00:01:24] Nick Adams: [00:01:24] Yeah. So I started getting my PhD in Sociology at Berkeley, and what motivated me to go get further education on society and sociology was that I thought I might become a politician myself. But as I was learning more about how society works, I realized that technology was actually the biggest agent of change in society. It could be agricultural technology or industrial technology; in our information age, it’s all this incredible information and [00:02:00] social networking technology. A really good example of that is something like Facebook, which in about ten years collected as many users, spending about as much time, as the whole of China or all of Islam. Religions and massive civilizations took hundreds or thousands of years to build up a big following; Facebook did it in ten years because the technology is so accessible and so easily connects people. So yeah, I realized that the biggest impact I could possibly make would be in technology.
[00:02:36] Alistair Croll: [00:02:36] So, you did some work around Occupy. Can you tell me what it means to try and make sense of that much unstructured stuff? And what were the problems you saw when trying to understand what was going on with the Occupy movement?
[00:02:49] Nick Adams: [00:02:49] Sure. Yeah. So the Occupy movement, as some people probably know, featured 184 different little movements across [00:03:00] US cities and towns. And that actually creates a really good situation for what we scientists would call a natural experiment. There’s enough of a very similar phenomenon happening in a bunch of different places that however things play out differently probably reflects more on those different places, on the city governments and the police departments there, than it does on the movement itself. So I wanted to understand that, and understand it very well. If I could wave a magic wand, I would have shown up at every single event across all of those 184 different movements. But the next best thing is to have a bunch of journalists showing up at all of those events and reporting on them. So I was teaching social science methods at the time, and I kind of skimmed the top Berkeley students in social science methods, pulled them into a team, and we went out and collected all of the news articles, including radio and television [00:04:00] news, from all of these places that were reporting on their local Occupy movements. And this is just a massive amount of information: almost 10,000 news articles that included all the information about what the city was doing, what the police were doing, what the protesters were doing, in particular events and events in series, which we call campaigns, and also what was going on at those encampments. So we were completely flooded and inundated with all this information. And then the next step was to apply our analytical procedures, which is really just the theories of protest movements and police-protester interactions from the literature. We went to apply those theories to the documents themselves, find the information that was relevant for those theories, extract it, and put it into a database.
So that we could start doing these multilevel time-series models to understand these intricate webs of interaction, from the strategic level down to the tactical level on the ground. And [00:05:00] extracting that information from the text, organizing it, getting it into a database where we could do statistical analysis was a huge job. And there just weren’t tools out there to make that possible when we got started.
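To make the shape of that pipeline concrete, here is a minimal, hypothetical sketch of the workflow Nick describes: theory-relevant annotations pulled from news articles and flattened into database rows ready for time-series analysis. The field names and label values are illustrative assumptions, not the actual TagWorks schema.

```python
# Hypothetical sketch: store one row per (article, actor, action) annotation,
# then count actions per city per day as input to time-series models.
import sqlite3

def load_annotations(db, annotations):
    """Flatten extracted annotations into an 'events' table."""
    db.execute("""CREATE TABLE IF NOT EXISTS events
                  (article_id TEXT, city TEXT, date TEXT,
                   actor TEXT, action TEXT)""")
    db.executemany(
        "INSERT INTO events VALUES (?, ?, ?, ?, ?)",
        [(a["article_id"], a["city"], a["date"], a["actor"], a["action"])
         for a in annotations])

db = sqlite3.connect(":memory:")
load_annotations(db, [
    {"article_id": "news-001", "city": "Oakland", "date": "2011-10-25",
     "actor": "police", "action": "dispersal"},
    {"article_id": "news-001", "city": "Oakland", "date": "2011-10-25",
     "actor": "protesters", "action": "march"},
])
# Events per city per day: the raw material for the multilevel models.
rows = db.execute(
    "SELECT city, date, COUNT(*) FROM events GROUP BY city, date").fetchall()
```

The point of the flattening step is that once every theory-relevant span is a row, standard statistical tooling applies; the hard part, as Nick notes, was extracting those rows from unstructured text in the first place.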
[00:05:11] Alistair Croll: [00:05:11] It does seem like we aren’t very good, as democratic societies, at listening at scale. I was reading something recently on the Great Depression, and how the first couple of years of reaction to it didn’t go very well, because they didn’t really have the instrumentation, the sort of sensing apparatus, to understand what was working and what wasn’t. And as a result, the US rolled out everything from various economic analyses, like the Beige Book, to improvements to the census. And that was great in a sort of atomic world where the sources of information were relatively scarce, but it does feel like we are in a world now where that sense-making apparatus is kind of broken, because we have so many possible sources of information and so much mistrust of the [00:06:00] default ones. Do you think democracy itself is going to have to change to catch up with this shift from one-to-many to many-to-many broadcast and multicast dynamics of information?
[00:06:14] Nick Adams: [00:06:14] Yeah, absolutely. And you were kind of saying this at the top of the conversation: in the nineties and in the aughts, we all felt like, wow, this is great. We can all communicate to everyone in the world. If I want to be the maven of this or the expert of that, I can plausibly make it happen. I can get my ideas out there. And that is a good thing. There are a lot of voices that were probably unheard for a long time that are now in a position to be heard. So that’s positive. But our ability to edit this information, contextualize it, curate it, has not scaled. We’ve not scaled any of that yet, though we’re working on it now. [00:07:00] And so it means there’s this kind of morass of information. And once we pair that with the kind of inequality that is causing a lot of people to lose trust in established elites, we end up in a situation where we don’t even trust the old gatekeepers anymore. And not that those old gatekeepers were even able to scale up to the job of editing and curating and commenting on all of the stuff that’s now being put out.
[00:07:36] Alistair Croll: [00:07:36] So we’re seeing in the news today the battle between, for example, Facebook and Australia, and there’s an old line that the truth is behind a paywall but the lies are free. And now we also have this world where the community on Facebook chooses what to amplify across news and doesn’t necessarily pass that money back to the reporters. So [00:08:00] there’s an asymmetric threat here: if I want to attack information, I can attack it, especially with the advent of technologies like GPT-3, far more easily than information can defend itself. How do we fix that asymmetry?
[00:08:14] Nick Adams: [00:08:14] Yeah. Well, we’re working on that with our Public Editor project here at Goodly Labs. You’re reminding me of a cute little bit of marketing that we didn’t push out too much. There’s the quote commonly attributed to Mark Twain that says: “A lie can get halfway around the world while the truth is still lacing up its shoes.” And we did a little bit of marketing that says: “With Public Editor, the truth just got Velcro.” But sorry, that’s probably a little too self-entertaining. With Public Editor, we’ve now created a tool that’s kind of a social epistemology tool, if you will. It’s a way for people collectively to [00:09:00] evaluate, find, and label different kinds of misinformation that are appearing in news articles. Right now we’re working with news articles, but this is a system that we’re already showing can scale; we’re in the midst of a demonstration project right now showing how it scales up. And it could certainly work for easier content like Facebook posts and Twitter posts, which tend to be shorter and more focused. But what we’re doing right now, and maybe I can show you in a minute, is we make it possible for a newsreader to read through an article, and as they’re reading, there are labels over the particular words and phrases that are committing some kind of logical fallacy, or where some kind of inferential mistake or cognitive bias is showing up.
[00:09:43] So a label will show up saying “this is posing a false dilemma,” “this is confirmation bias,” et cetera. And what that does over time is it actually trains the newsreader to be a little bit more discerning as they’re reading. But it also is a [00:10:00] scalable solution, so that not every single person in the world has to become an expert in media literacy and critical thinking in order to use the internet and get the valuable information out of it, while avoiding the traps and the faulty information that can lead them astray.
[00:10:19] Alistair Croll: [00:10:19] So I’m a big fan of game theory and incentives as a way of extracting information or regulating behavior. How do you avoid this being preaching to the choir, where the people who are attracted to a system for critical thinking are themselves people who like critical thinking? Setting aside for a moment the fact that the people who are most likely to commit cognitive biases are the ones who think they’re smart: how do you get everybody to do this? Is there a model where you have to review five articles in order to read five more? How do you get society to realize the value of veracity?
[00:10:55] Nick Adams: [00:10:55] Yeah, I think this is a real concern of ours, and has been since the [00:11:00] beginning. And I think we might have some trouble reaching the very far-out extremes. There are some people who are just in the game to put out misinformation and to ingest it. I have to admit that at one point I derived some pleasure, some entertainment, from watching flat-earth videos on YouTube. I have never believed there was a flat earth, but the videos are quite entertaining. So there’s going to be some of that. There are going to be some people on the extremes that we won’t reach. But if we borrow from thinkers like Erich Fromm and his great work “Escape from Freedom,” what’s probably going on with these conspiracy theories is that people are looking for a place to belong. They’re looking for a place, you know, epistemically, ideologically, socially, to belong. And if they are disaffected by established elites and the mainstream media, then they can go to these [00:12:00] places and feel like they belong. And they have an understanding of the world, maybe a secret and superior understanding of the world. But it’s really about wanting to feel like they belong, that their individuality makes sense in society. So with Public Editor itself, you know, we are cultivating a community of people who do the little tasks. These tasks are distributed across an assembly line, so they’re pretty easy tasks that people can get up to speed doing.
[00:12:30] We’re doing our best to facilitate and foster community among these folks, so people start to take into their identity the idea: I’m going to help my democracy, my society, share reality again. So I think that’s one aspect of how we get people to care about it. We agree that we’re probably not going to get millions and millions of people to do it for the sake of nerdery or something like that. It needs to be a higher ideal, like [00:13:00] sharing reality.
[00:13:00] Alistair Croll: [00:13:00] Well, I think people will, if people start to see it as their civic duty to maintain information like that. But again, you get to the problems. I read about a hilarious, well, it’s a marketing prank, but it’s pretty interesting: do you know about the Birds Aren’t Real movement?
[00:13:13] Nick Adams: [00:13:13] I have seen that. Yeah.
[00:13:14] Alistair Croll: [00:13:14] So this is a guy who decided he would start a movement saying birds aren’t real. Forget the flat earth; you can go buy a chicken and take it apart at home and see it’s not a robot, but apparently chickens are exempt. He’s got this whole thing that pigeons are actually mechanical government drones. And people are buying his merch, and some people are kind of getting into it, and going: hmm. It seems like even for things we can objectively verify with a trip to the supermarket, there’s so much fun to talk about that we are willing to participate in this collective illusion or delusion. Dan Hahn and his brother have talked about, you know, QAnon being an ARG. That’s a lot more fun than proofreading articles.
[00:13:54] Nick Adams: [00:13:54] It is fun. I mean, I’ve seen those memes, and I think they’re really fun too. [00:14:00] But at the end of the day, if you’re an avid news reader, I don’t know if you’ve had the experience I’ve had, but a lot of people have had the experience of just getting the sense, over the last few years, as a result of the BuzzFeedification of the media, where everything has to grab attention in order to actually be written, even in the New York Times. I’ve gotten to a point where I barely want to read the news anymore, because I feel like I’m always being kind of duped. And so our project really is around the mission of getting people to share reality. It’s around the mission of gradually raising the bar of journalism, the bar of quality and standards, gradually raising that back up to where it was, you know, at least a few decades ago.
[00:14:57] Alistair Croll: [00:14:57] It seems to me like [00:15:00] this has to be a function of the platform. And we’ve seen cases where on Twitter you can report things, or Facebook can flag certain things and mention provenance. But it also seems like this is a thing where people paid to read the Financial Times knowing that they’re paying for truth, for example. And the brand of truth becomes a sustainable competitive advantage, or a competitive moat, for certain publications, but only if we as a society start to value that over, you know, a cool name or a listicle they publish or whatever.
[00:15:35] Nick Adams: [00:15:35] Exactly. And if you’re familiar with Section 230, which some people call the 26 words that created the Internet: there’s actually quite a bit more than 26 words, and some of the words talk about it being the policy of the US Government to encourage these third-party platforms, your Facebooks and Twitters of the world, to cooperate with [00:16:00] third-party filters that filter out egregious content. And we think that as we get going here, Facebook and Twitter should be making it really easy for their users to include Public Editor quality scores.
[00:16:19] Alistair Croll: [00:16:19] Oh, like a Chrome plugin, but for Facebook.
[00:16:22] Nick Adams: [00:16:22] Yeah, exactly. There should be a plugin so that a user can say: I don’t want to see any news in my Facebook feed unless it scores at least an 80 out of a hundred on Public Editor. Right? That would be pretty easy to do, actually.
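The filter Nick describes is simple enough to sketch. This is a hypothetical illustration in Python rather than an actual browser or Facebook plugin; the feed structure and the `public_editor_score` field name are assumptions for the example, not a real API.

```python
# Sketch of a score-threshold feed filter: hide any news item whose
# (assumed) Public Editor score falls below a user-chosen minimum.

def filter_feed(feed_items, min_score=80):
    """Keep items that meet the credibility threshold.

    Items without a score (e.g. personal posts rather than news)
    pass through untouched, since the threshold applies only to
    scored news articles.
    """
    visible = []
    for item in feed_items:
        score = item.get("public_editor_score")  # assumed field name
        if score is None or score >= min_score:
            visible.append(item)
    return visible

feed = [
    {"headline": "Well-sourced report", "public_editor_score": 91},
    {"headline": "Clickbait piece", "public_editor_score": 42},
    {"headline": "A friend's vacation photos"},  # not news, unscored
]
shown = filter_feed(feed, min_score=80)
```

The design choice worth noting is that the filter runs on the client side: the platform only needs to expose scores through an API, and the user keeps the agency to set (or remove) their own threshold.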
[00:16:35] Alistair Croll: [00:16:35] But then a lot of people would say: okay, at what point do you make it mandatory? Right? Like, I was on Clubhouse the other day, and on one of the people’s profiles, while everyone else’s profile was normal, there was a button that said “report for trolling.” It wasn’t on anyone else’s, and I’m like, I wonder if this person has been accused of trolling in the past. So there could be an option like that: if the article doesn’t get above a certain score, [00:17:00] you know, you can block this or report this for being fake, whereas if it has been well verified, you don’t get as easy or visible an option to do that.
[00:17:09] Nick Adams: [00:17:09] Yeah. I mean, I’m not looking for really strong regulation or censorship. If that happened, if the government came in and said everyone has to use Public Editor, I guess I’d be happy with it; I probably wouldn’t fight them on it. But I don’t think that’s quite the way, and it’s certainly not necessary. There’s a technological architecture here that would make it really easy for Facebook and Twitter to supply us with an API that would make it possible for their users to do this.
[00:17:39] Alistair Croll: [00:17:39] So far we’ve been talking in the abstract. Can you give me a sense of what it looks like? Can you maybe bring up a screen or something and show us?
[00:17:45] Nick Adams: [00:17:45] Sure. Yeah, let me share my screen here. And the other thing I didn’t quite mention, Alistair: I think it’s really right to be thinking about people’s motivations when [00:18:00] they’re doing these annotation tasks. We’ve actually done some work to gamify the system, and we’re improving our gamification, but even now people can earn badges. And eventually they can earn certifications, which could actually be quite valuable: if somebody becomes expert in the eight different tasks across our assembly line, they’re building up a great amount of critical thinking ability, which, as we know, is more and more valuable the more crap is out there on the internet.
[00:18:34] But I think everyone can see my screen now, is that right? Yeah, I can see it. Okay. So what we’re showing you is kind of a newsfeed of the future, where you’re seeing the headline and the first few sentences of the article, but you also see these credibility hallmarks out to the right. And we end up grading these articles on a zero-to-100 system, like most people are familiar with. And if I click into one [00:19:00] of these articles, now I’m looking at the article, and as I read it, if I’m a newsreader and I hover over something, I can see, you know, one point is deducted here because you’ve got an unhelpful metaphor. Two points are added to the article score because we have a good, qualified source being quoted. There are some problems here where the reasoning of this particular clause is begging the question, or appealing to ignorance. Let me scroll down here. You have a qualified source, but they have inappropriate confidence; they have some hindsight bias. There are lots of different problems in here. And these labels are generated not by someone going through and just taking notes on what they think they see because they believe they are the master critic. They’re generated through a process where people are walking through a protocol: they can be checking for [00:20:00] the different reasoning fallacies, or they could be walking through a different protocol that’s checking to make sure that the evidence is properly supporting the claim. And we go into pretty scientifically rigorous evaluations of evidence. We’re looking at whether things meet Hill’s criteria for causation, instead of just correlation; we’re looking at whether there’s systematic uncertainty or statistical uncertainty in a study, if it’s reporting statistics. We really dig in there. And we’re labeling over 40 different types of these mistakes.
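The grading Nick demonstrates (points deducted for fallacies, points added for good sourcing, rolled up into a zero-to-100 article score) might be sketched as follows. The label names and point values here are assumptions for illustration, not Public Editor’s actual rubric, which covers over 40 label types.

```python
# Illustrative sketch: roll per-phrase labels into a 0-100 article score.
# Negative values are deductions (fallacies, biases); positive values are
# credits (e.g. a well-qualified source being quoted).

LABEL_POINTS = {
    "unhelpful_metaphor": -1,
    "qualified_source": +2,
    "begging_the_question": -2,
    "appeal_to_ignorance": -2,
    "inappropriate_confidence": -1,
    "hindsight_bias": -1,
}

def article_score(labels, baseline=90):
    """Start from a baseline, apply each label's points, clamp to 0-100."""
    score = baseline + sum(LABEL_POINTS.get(label, 0) for label in labels)
    return max(0, min(100, score))
```

A scheme like this keeps each annotator’s task small (apply one label via one protocol) while the aggregation into a single number happens mechanically, which is what makes the assembly-line division of labor possible.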
[00:20:36] Alistair Croll: [00:20:36] So if we’re starting to see, as you said, Facebook able to acquire, in a matter of a decade, as many citizens, if you will, as the world’s biggest countries or most populous religions, do you think we’re going to get to a point where a social network becomes a sovereign system? I live in Canada. There’s [00:21:00] no Canadian government social network, and we seem to be fine with the government building highways. We’re less fine with the government building broadband. We haven’t really talked about the government building a social network, and it’s unlikely anyone would want it; I don’t really want to go hang out on the government-run version of Twitter. But at the same time, there could be a future in which that’s considered a public resource, where that’s where you go to hear what your politicians have been up to, or have conversations that are considered accountable, and so on. Do you think countries are going to develop their own national social networks?
[00:21:34] Nick Adams: [00:21:34] I think it’s plausible. There was actually a moment a few years back when someone from the State Department contacted me to discuss that sort of possibility. I do think there’s a role for these interactive service providers, as they’re called under Section 230, to be treated more and more like utilities. And I think that would actually be totally appropriate. [00:22:00] Let me just not name any particular corporation at this point, because I can’t stand up to their PR department. But I think we see situations where some of these corporations could very well make it easier for a small startup or a small business to work with them, and instead they end up engaging in practices to kind of steal whatever good idea was there, or incorporate it into the platform, which is precisely not what a utility should be doing.
[00:22:31] Alistair Croll: [00:22:31] I remember having a conversation with Jonathan Zittrain years ago, at Web Summit in San Francisco, I think, and we were talking about the common carrier laws. And I think people misunderstand the history of the communications acts. If I kidnapped someone in California and brought them to Colorado, and then I delivered the ransom call over AT&T’s phone lines, AT&T is not liable for what I communicated, even though I’ve committed this crime, because they’re not [00:23:00] discriminating traffic. Right? And this was the whole idea of bandwidth neutrality: if I’m neutral, then I don’t care what’s in the packet, so I can’t be liable. And there’s good precedent for this in the difference between Prodigy and CompuServe, where one claimed to be curating content and the other said it’s a free-for-all, and that affected their legal liability. So there’s this weird idea that if you curate and claim that your data is good, then you’re liable for it, because you’ve said it’s good. And that seems to impact the implementation of editorial stuff. But at the same time, nobody would argue that these platforms aren’t curating things, because there’s an algorithm choosing what’s in your feed. So doesn’t that sort of circumvent the intent of Section 230?
[00:23:45] Nick Adams: [00:23:45] Well, I think this is where the other part of the US policy that I mentioned should actually be getting a huge boost. To use an analogy: say Facebook is the water company. They should absolutely be able to run water [00:24:00] lines to our houses. But they shouldn’t be able to prevent me, as a user, from putting another filter on the tap at my sink. With Public Editor, we want to provide an additional filter.
[00:24:13] Alistair Croll: [00:24:13] I remember there was a time, in the early days or mid-life of AT&T, when Steve Wozniak was freaking out because he wanted to open up a joke line: you’d call this number and hear a joke and pay some money, and it was going to cost him like a thousand bucks a month to rent the answering machine from AT&T, because they prevented you from plugging anything into the phone line that wasn’t AT&T-certified; you’d be fined for doing so. And there’s that old article about the death of the web that was on Wired like 15 years ago. The fact is that the apps I use can control whether I can view source. If you go and right-click on a webpage, there’s a lot of obfuscation; it’s incredibly hard to find that image or video, and in some cases you [00:25:00] can’t do screen captures of it. So this does seem like it comes down to an argument about open source and, you know, the right to root and the right to get to the source code, which many of those companies would argue, at least when you’re running an app, is a trade secret or proprietary. There’s no way for me to install a plugin on top of Facebook; there is with Chrome and certain other open source projects. So do you think extensibility, like requiring extensibility by law, is a way to move us towards a world where we can have these sorts of epistemic correction systems?
[00:25:35] Nick Adams: [00:25:35] That’s exactly what I think. And it’s right there in Section 230: that sort of filtering is supposed to be encouraged by US policy.
[00:25:45] Alistair Croll: [00:25:45] What if Australia, instead of just saying, you know, “hey, bad stuff,” had said: Facebook, in order for you to be here, you have to build a plugin ecosystem in the next year or so that will allow people to create stuff that will [00:26:00] filter, like the water filter, if you will. We think your water is toxic, or whatever the Australians may think, or it’s bad for trade, or whatever. And as a result, we will make it so that people can implement their own filters on top of it and get some agency on the client side.
[00:26:16] Nick Adams: [00:26:16] I think so. I think this would be a win for Facebook if they went along with it, and it would be a win for the whole, you know, industry of what I call credibility service providers. We are an organization that provides credibility for the web.
[00:26:34] Alistair Croll: [00:26:34] I’ve never heard “credibility as a service,” but I like that. It’s also crass, which is kind of nice. I’d never heard that term before. I mean, just in the last year, between Yin Lu and the work that Jamie Joyce, who we’ve chatted with before, is doing, it seems like there’s a rise of many of these different organizations that are trying to provide some kind of credibility, [00:27:00] whether that’s understanding the ontology of an argument, or pruning it down, or giving people a user interface to annotate and vote and rank things, or the work that you’re doing, obviously.
[00:27:10] Nick Adams: [00:27:10] And I think what they’re doing is great. I’m going to borrow an analogy: you know, when Google, or I should say Waze, wanted to help people move around their cities, they didn’t try to train every single person to memorize the map of the city; they gave them an app that they can use in real time. And what Yin Lu and Jamie are doing is really excellent. In some ways it’s like: we’re going to build a community map, and this will be the map of the discursive reality of our society. I love what they’re doing, and we have some similar projects that I think we’re just going to roll into the stuff they’re doing. But with Public Editor, it’s more like that Google map or that Waze map: when you’re in the moment of reading [00:28:00] what could be garbage on the internet, you’re going to have what you need layered over that information to make sure you understand it.
[00:28:08] Alistair Croll: [00:28:08] So it seems like, you know, the other day I was driving somewhere and Google told me to go straight, and I couldn’t; it was a one-way street because it was under construction, right? There’s a signal there: if enough people ignore that direction and turn left, you can write a thing that says, hey, go check if that street has been closed. In the same way, you can use weak signals, if the system is pervasive, to identify that someone should go in and investigate this, more like a Wikipedia edit, where if enough people flag something, then a more senior Wikipedia person comes in and, you know, writes a Snopes article or whatever else.
[00:28:43] Nick Adams: [00:28:43] Yeah. And you know, I love any comparison to Wikipedia, because when Wikipedia started, everyone kind of felt like: oh, that’s not going to work, people aren’t going to do that, people aren’t that altruistic, or anything that’s on there is going to be super biased, because people are just gonna…
[00:28:57] Alistair Croll: [00:28:57] I remember the original plan was to have like a [00:29:00] thousand people that were going to pay to edit it.
[00:29:01] Nick Adams: [00:29:01] Right? Yeah, but there’s enough of us. Someone gave me a lot of hope. They said one of the fundamental motivations of human beings is to correct people when they’re wrong.
[00:29:14] Alistair Croll: [00:29:14] Well, I mean, that’s how we evolved to be, you know; we evolved to get the approval of our peers, and certainly that means being right. But as Jonathan Haidt says, don’t worry about being correct, because that’s not what we optimize for. We optimize for the approval of our tribe. And so we come back to the same problem here, where you have, you know, anti-vax versus facts, or flat earth versus, I can’t believe I have to say it, spherical earth. But maybe that’s because I’ve been on a lot of airplanes. It seems to me like you still have this problem of activating the normal middle. Society is not a bell curve. It’s a well curve, with incredibly loud voices and engagement at either side. And until the middle sort of goes, wait a minute, it’s my job to keep this democracy [00:30:00] functioning by making sure we’re all dealing with consistent information… It feels like we need to wake up the middle, or we need to make it a law, in the same way that Australia mandates voting, for example. We can make laws that say this stuff has to be in place, but then you get a lot of concern about the nanny state, and you get the rise of Parler and other platforms that say, we’re absolutely not going to do this. How do you see that playing out?
[00:30:23] Nick Adams: [00:30:23] Well, you know, I’m pretty hopeful that there are enough people out there to kind of keep this thing going. It actually doesn’t take a ton of people. So what we do, we take in articles, the most shared viral articles on the internet; there’s a third party that measures that. And if you think about the average news reader, even an avid news reader is probably only going to read, what, 30 articles a day, maybe. So we think if we could do the top 100 most shared articles across Facebook and Twitter, if we could do that every [00:31:00] day, we would be making a huge impact. And that doesn’t really require that many people; that requires 3,000 people spending 15 minutes a day. And you know, I think if we were to pay everyone $15 an hour, it comes out to somewhere around $6 million a year.
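As a rough sanity check on the staffing math Nick describes (the reviewer count, minutes per day, and hourly wage are his figures; the 365-day year and the arithmetic below are our own back-of-envelope sketch, not a Public Editor budget):

```python
# Back-of-envelope cost of the reviewer pool Nick describes:
# 3,000 people spending 15 minutes a day, paid $15/hour, year-round.
reviewers = 3000
hours_each_per_day = 15 / 60   # 15 minutes per reviewer per day
hourly_wage = 15               # dollars per hour

daily_cost = reviewers * hours_each_per_day * hourly_wage
annual_cost = daily_cost * 365
print(f"${annual_cost:,.0f} per year")
```

The raw wage bill comes out around $4.1 million a year, so the roughly $6 million figure Nick cites presumably also covers costs beyond wages.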
[00:31:18] Alistair Croll: [00:31:18] It also seems like you could tell people, look, you know, the following newspapers have agreed to give you access beyond the paywall as long as you review 30 articles a month. And then you have like an army of reviewers that are incentivized to do so, because that way the newspaper or publication gets to put a badge on saying this has been publicly edited.
[00:31:37] Nick Adams: [00:31:37] From your mouth to the news publishers’ ears.
[00:31:40] Alistair Croll: [00:31:40] Well, hopefully. So, you’ve obviously analyzed a tremendous number of interesting things. What was the most unexpected thing you learned from the analysis of Occupy?
[00:31:51] Nick Adams: [00:31:51] Ah, wow. That’s a great one. Let me dig back into that, because it’s a project that has been a little bit on the back burner. [00:32:00] Well, one thing we found that was really interesting is that there were cities, and Atlanta is a great example, that had violent crime waves prior to the Occupy movement, and they were just incredibly lax about the movement, whereas cities that didn’t have a bunch of crime put a lot more focus and attention on the movement. So that was pretty interesting. You can kind of see the chiefs and the lieutenants saying, like, yeah, it’s a bunch of kids in the park; we’ve got serious problems in these other places in the city. That was really interesting. Also, departments that had less budget per capita really seemed to try to nip the movement in the bud and show force early and get them evicted early, whereas departments that had a [00:33:00] little more cash on hand could kind of afford to see how it played out over the course of several weeks and just do some more gentle surveillance.
[00:33:08] Alistair Croll: [00:33:08] Have you done any analysis of, like, the BLM movement or other movements since then?
[00:33:13] Nick Adams: [00:33:13] What we’re doing right now is actually extending and deepening our analysis of those Occupy movement cases. We’ve analyzed kind of the top layer of the data, but once we get into the intricacies of the blow-by-blow tactical interactions, we’re expecting to find sequences of interaction that often lead to violence, and probably some decision points. So we’re using Hidden Markov models, actually, to see how these sequences play out and see if we can identify these decision points, where a very clear decision by the police or by the protestors’ strategists will lead to violence or something more like comity.
[00:33:59] Alistair Croll: [00:33:59] Can you [00:34:00] explain Hidden Markov models to the uninitiated?
[00:34:03] Nick Adams: [00:34:03] Yeah, sort of. Let me do it quickly; I don’t know if I can do it quickly enough and in depth enough to do it justice. But you’re looking at a sequence of things that happen, and you’re basically positing that there’s some unknown variable that’s causing them, that’s also moving in this parallel sequence. And as you move through the sequences, you’re testing whether that unknown variable prior to the event is important or not, and maybe how much of an effect it has on the next step. So then you can kind of isolate, you know, if there’s four things that happen in a sequence, were the first three stacked in a way that they were going to happen together no matter what, or the first two, or is there some decision point, right?
[00:34:51] Alistair Croll: [00:34:51] So it sounds like you’re trying to extract causality versus correlation.
[00:34:56] Nick Adams: [00:34:56] Yeah, it’s about finding the causal [00:35:00] moments where human decisions can actually make a big difference in the outcomes.
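Nick’s description can be sketched in miniature. In the toy model below, a hidden “posture” variable (the unknown variable running in a parallel sequence) drives observable protest events, and the standard Viterbi algorithm recovers the most likely hidden sequence behind what was observed. Every state name, event name, and probability here is invented for illustration; this is not Goodly Labs’ actual model.

```python
# Toy Hidden Markov model: observed protest events are assumed to be
# driven by a hidden "posture" state that evolves in parallel.
STATES = ["restrained", "escalating"]          # hidden variable
start = {"restrained": 0.7, "escalating": 0.3}
trans = {                                      # state-to-state transitions
    "restrained": {"restrained": 0.8, "escalating": 0.2},
    "escalating": {"restrained": 0.3, "escalating": 0.7},
}
emit = {                                       # event probabilities per state
    "restrained": {"dialogue": 0.6, "arrests": 0.3, "clash": 0.1},
    "escalating": {"dialogue": 0.1, "arrests": 0.4, "clash": 0.5},
}

def viterbi(observations):
    """Return the most likely hidden-state sequence for the observations."""
    # best[s] = (probability, path) of the best path ending in state s
    best = {s: (start[s] * emit[s][observations[0]], [s]) for s in STATES}
    for ob in observations[1:]:
        new_best = {}
        for s in STATES:
            prob, path = max(
                (best[prev][0] * trans[prev][s] * emit[s][ob], best[prev][1])
                for prev in STATES
            )
            new_best[s] = (prob, path + [s])
        best = new_best
    return max(best.values())[1]

# After a dialogue followed by arrests and repeated clashes, the model
# infers the posture most likely shifted from restrained to escalating.
print(viterbi(["dialogue", "arrests", "clash", "clash"]))
```

The “decision points” Nick mentions correspond to steps where the inferred hidden state flips: moments where the next transition, rather than the preceding stack of events, determines whether the sequence heads toward violence.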
[00:35:05] Alistair Croll: [00:35:05] So I’m a huge fan of Isaac Asimov, and the Foundation series is basically, you know, data science porn in some ways. It’s like, hey, we can’t get humans to behave well; we know there’s going to be a collapse of civilization. How do we weather the collapse? Huge spoiler, by the way: how do we weather the collapse so it’s a thousand years and society-ending, not 10,000 years and species-ending? It feels like we’re at one of those inflection points now, where we have to figure out how to organize ourselves as a society for the greatest good for the most people in an era of digital. And that could go anywhere. I mean, it could be that whoever’s in charge is the person with the most followers this week, which is a pretty scary future. But, you know, we used to be a monarchy a few hundred years ago, and that was the way it was.
[00:35:57] What do you think is going to be the [00:36:00] new, and this is a big bet to make, what do you think is going to be the new equilibrium for democracy based on shared information?
[00:36:08] Nick Adams: [00:36:08] Oh, okay. Wow. You could have given me this question ahead of time.
[00:36:17] No, I do. I do see our time as a big inflection point. And I guess I’m an optimist just by my nature. And, you know, with all the work that we’re doing, we’re trying to make sure that things end up well. So, what I see happening: I look at our current democratic electoral governance system, and I see a system that was built in the 18th century and optimized for the Pony Express. That was the communication system, and you needed to have the representatives all in one place so that they could deliberate.
[00:36:56] Alistair Croll: [00:36:56] Nothing, really, has been invested in [00:37:00] civics, in things like electoral slates and colleges and so on, right?
[00:37:04] Nick Adams: [00:37:04] Yeah. But all of that is very old technology. So my vision of the future is that we obviate all of this dysfunctional technology; we make it obsolete. We can do online deliberation and we can federate it, from the level of your neighborhood, or even smaller, some organization, to a neighborhood, to a town, to a city, to a region, to the nation, to the world. And we can use deliberative technology to make decisions together and get people to have some skin in the game of deciding what reality is going to be. I see a world with plenty of resources, plenty, especially as we shift to cleaner, more renewable energy and the cost of energy comes down. I think, you know, if we screw this [00:38:00] up, it’s a huge tragedy, not just in terms of human lives lost, but, if you’re able to take the big view, the cosmic view, the delta between where we could be and where we are right now is completely massive. And if we make our situation even worse, it’s because we’re just holding on to some old institutional routines that were built mostly in Europe, always with this idea of power going up to one person, usually one man, who’s fighting with other men of the same amount of power. That whole regime of power is completely obsolete and very destructive. And I see us right now in this kind of neo-feudalism, especially once our billionaires start deciding they should be popular. I like Elon Musk, but it’s a little scary to me that he has that much money, and he likes to tweet, [00:39:00] and he’s going to build a colony on Mars.
[00:39:02] Alistair Croll: [00:39:02] Oh, and also he’s separating himself from fiat currencies that might control him. So if he wants to live in space independently, that’s literally the plot of the first William Gibson book, right? There’s a space station run by an AI. And, in a more dystopian time, if I were an AI and I wanted to pay my computing bills and not tell anyone, I would make YouTube videos and get clicks and convert that to Bitcoin, right?
[00:39:27] Nick Adams: [00:39:27] And I just don’t see him, in particular, doing a lot to try to make the world better for a larger set of people.
[00:39:35] Alistair Croll: [00:39:35] That’d be a documentary of what happens when capitalism colonizes the solar system, right?
[00:39:42] Nick Adams: [00:39:42] Yeah. Yeah.
[00:39:44] Alistair Croll: [00:39:44] We’ve got to wrap up, but I have a couple of quick questions, and I think your point about the change is really interesting. If you look at evolutionary history, we see punctuated evolution, right? We see small, gradual adaptation, and then, under conditions of huge duress, very rapid change in the fossil record. Now, whether [00:40:00] you’re talking about the Burgess Shale, or you’re talking about, you know, Martin Luther nailing theses to a door with the printing press, historians look back and they see these moments of punctuated evolution. It feels like we’re in the middle of that punctuated evolution. But let’s assume that what you say is right, and that government is going to have to change itself. The governments, and therefore the digital part of the governments, are going to have to create these sort of sense-making platforms so we share the nature of truth. How do we make that non-partisan? Because the FWD50 audience is very much neither left nor right, but forward. And these are civil servants, public servants, who’ve devoted themselves to being non-partisan despite changes in which political party holds power, right? And, you know, I’ve never seen someone build code that doesn’t have opinions. How do we make sure that the platform is non-partisan when it’s designed by, if you look at the population of Washington, DC, very largely Democratic and progressive people? How do you make sure that you build this stuff in a non-partisan way, so [00:41:00] that you don’t build bias into government at the level of information?
[00:41:04] Nick Adams: [00:41:04] I’m not sure I would assume that it’s going to be the governments that build the new governance technology. The prototypes that we’re seeing coming together, and you’ve already mentioned a few of them, are not being built first by governments. And what I anticipate happening is that, you know, if I were running the show of what the next governance technology is going to be, it would start with stuff that would work in our smaller groups and in our organizations. And then we would show that it scales to a town, we would show that it scales to a city, and over time everyone would start using this governance technology for everything, from deciding where to eat, to deciding how rent is going to be paid by different people in a house, to, you know, making housing policy for a city. And when it shows itself to be effective repeatedly, then the [00:42:00] transition into using it at a national scale looks more like you have a political party, or maybe a political team, maybe you just get away from the word party: people who are dedicated to using this distributed governance technology to respond to the highest priorities of the community that’s using it.
[00:42:22] Alistair Croll: [00:42:22] Yeah, that’s a fascinating statement, because I’ve heard people say civic tech is for experimenting, and then when you hand it to governments, it gets to scale. But what you’re saying is civic tech is actually for subverting the status quo of the way that government works today. I don’t think I’ve talked to people who describe civic tech as subversive, but…
[00:42:40] Nick Adams: [00:42:40] I’m rather annoyed by a lot of the civic tech out there that’s like, this is so the government can present people with one or two options, and then they can fight, and whoever has 50% plus one wins the day. That sort of adversarial fight-your-way-to-policy, I think, is just completely wrongheaded. And [00:43:00] it’s a vestige of this European war of all against all that happened, you know, for the last several hundred years.
[00:43:08] Alistair Croll: [00:43:08] So it sounds like we’re back to where we started, and that’s why you decided not to go further in politics and instead try to fix the underlying systems.
[00:43:15] Nick Adams: [00:43:15] In the future that I want to live in, there actually isn’t politics; we go straight to the policymaking. Politics, in my definition, is the process whereby we decide who has control over decision-making. And I would like to say we don’t have that process anymore. We don’t need to spend time deciding who has control over decision-making. Anyone can get onto this platform that’s, you know, not ultimately hierarchical, and they can put in their ideas, and we can develop policy together that’s going to work for us.
[00:43:50] Alistair Croll: [00:43:50] Well, that sounds like a pretty aspirational goal, but a good one. I should mention also that you’re going to be joining us at FWD50 in November to get into this in a little more detail, and I’m sure by then [00:44:00] you’ll have some more stories for us about how it’s playing out.
[00:44:03] But what’s the best way for people to find out more about this, if they want to?
[00:44:08] Nick Adams: [00:44:08] If you’re interested in kind of the broader vision that we’ve been talking about, you should pressure me to write it all down. But if you’re interested in the projects that we’re working on right now, check out goodlylabs.org. The mission of Goodly Labs is to empower people with tools that allow them to find common ground and build a better society. And we’ve got multiple projects to help folks with everything from their personal relationships, to the media situation, to what’s going on with protest, to how you can surveil your own elected officials. And we’ve got stuff on deliberation sitting on the back burner. So goodlylabs.org is the place where this is all coming together.
[00:44:51] Alistair Croll: [00:44:51] Amazing. Thanks so much for spending some time with us today. I’m glad we got a chance to see what you’re up to. Given that our tagline is “Use technology to make [00:45:00] society better for all”, I think we’re pretty closely aligned in some of these aspirations. And I can’t wait to hear what you have to tell us in November at the event itself. Thank you so much for spending some time talking about this. I’m sure we’ll be in touch again soon. [00:45:13]
Nick Adams: [00:45:13] It was my pleasure, looking forward to November. Thanks Alistair.