FWDThinking Episode 16: Digital Trust for Places and Routines

Alistair Croll in conversation with Jackie Lu.

All opinions expressed in these episodes are personal and do not reflect the opinions of the organizations for which our guests work.

[00:00:00] Alistair Croll: Hi, and welcome to another episode of FWDThinking. I am really excited about our next guest, but before I introduce her, I want to give you a little bit of context. When you walk into an airport, you can see a baggage carousel or a washroom sign, and you know what they're for, because they're physical things that are there and they provide affordances. There's a standard language about what to expect and how they work. In a world that is increasingly digital, we don't have those affordances, or that accountability, or that understanding of how things should work and what they're doing: for the things around us that are surveilling us, the digital services that might be available, from Wi-Fi to cameras to recording devices, and the affordances that are all around us. And it turns out that a group of very talented designers, civic tech people, government experts, and startup founders have been working on a solution to this problem. It's called Digital Trust for Places and Routines, or DTPR. It is absolutely going to change how we interact with the intangible digital side of public spaces and our lives. And I'm absolutely thrilled to welcome a longtime friend of FWD50. She's been at one physical and one virtual event, and she's on our advisory board: Jackie Lu, for a conversation about what DTPR is and how we can make public spaces and their digital facets accountable and transparent, which in the short term will help us to better navigate the world, but in the long term will maybe even restore trust in the public sector. So please give a very warm FWDThinking welcome to Jackie Lu. Hi Jackie, how are you?

[00:01:52] Jacqueline Lu: Good. How are you doing? 

[00:01:54] Alistair Croll: I'm doing well. It's great to see you again. You're a frequent part of our stages, whether virtual or physical, and on our advisory board. You've been doing amazing work in the fields of smart cities and analytics, both in government and the private sector. And while these FWDThinking conversations are usually interviews, I think this one's going to be a little different, because you've kind of brought something for show and tell this time, which is pretty awesome. So before we get too into things, can you tell me what DTPR is and what you've been working on?

[00:02:27] Jacqueline Lu: Yeah. So, wow, what is DTPR? DTPR stands for Digital Trust for Places and Routines. It's an open source project that Patrick Keenan and I started when we were at Sidewalk Labs, which was, you know, just a small project in Toronto around smart cities that few people have heard about. And basically what we realized, what we were hearing, was that people needed to know what was going on around them when it came to data collection and digital systems. So DTPR was born out of an effort to try to make that better. And interestingly, even though this is something I dug into deeply when I was working at Sidewalk Labs, I'd been thinking about it for a while. When I was at the New York City Parks Department, we were starting to experiment with park benches that had Wi-Fi sniffers on them to be able to measure usage. And we ran pretty quickly into a problem: well, how do we tell people that this park bench is different, but that it's doing something to help improve public spaces? So I've actually been thinking about signs for IoT things in public spaces for a really long time. DTPR was a very specific way for us to think about how do we help people, how do we start to address the fact that digital is invisible in the built environment? Because that means people can't engage with it. And if so much of what we're hearing about smart cities is that people want to understand what's happening, so that they can hold organizations to account for the use of that technology, then if you don't know, you can't have that conversation.
And so DTPR was, you know, inspired by the universal symbol set that you see everywhere in the world in transportation. Imagine back in the day, you would go to the airport and there's a baggage claim symbol, and no matter what country you were in, you understood what that meant and you understood how to use the space. You don't have to speak the language to be able to navigate and understand what this place is doing for you. Or you think about Creative Commons logos, where so many complex, otherwise pretty arcane concepts are summed up in those CC BY and similar CC symbols.

[00:04:55] Alistair Croll: I think you just said something super important: it's what this place is doing. That's a very, very salient sentence. You can unpack that: places do things for people. In the old days that might be "here's a washroom" or "here's a baggage handler." Today, places do things, sometimes for people and sometimes for the businesses that own them, but we have no idea what's going on. So it seems like the internet of things needs an internet of agency to go with it and keep it honest.

[00:05:27] Jacqueline Lu: Totally. And when you think about agency: what we realized pretty quickly as we did our research is that there are privacy regulations in multiple jurisdictions, lots of regulations that, quote unquote, extend privacy protection out to the offline world. But the mechanisms for notification and accountability are pretty absent or inconsistent. The signage that is out there is either long paragraphs of text, or really, really small, or it doesn't actually give you a venue to follow up and ask questions. So we were really looking at "how do you come up with a better way?" The fundamental belief of the DTPR project is that people should be able to quickly understand how a system works, because that agency, as we found in our user research, is actually what leads to trust. But you can't have agency without accountability, and you can't hold organizations to account without transparency. We have kind of a conceptual triangle of how you start to foster trust in the way these digital systems work in places, and we're starting at that bottom layer, the transparency piece, because we believe, and this is what we heard from the people we did research with, that transparency enables accountability, and with that accountability you can then have the agency and the trust.

[00:06:59] Alistair Croll: I have a slide that shows that pretty well. Let me do some slides here... this one here. So it does seem really interesting: if you just go to people and say, "trust my public space," they won't do it. You've got to ease citizens, your denizens, into that by becoming familiar with it. I remember having a conversation with David McRaney years ago on his podcast about the difference between an algorithm and a human from a legal standpoint. And I think the conclusion here, too, is that a human should have recourse.

That idea of "I feel like I have recourse, maybe not agency, but recourse; I know what's going on; I have an escalation process if I feel this is wronging me somehow" is the basis for trust in most digital systems.

[00:07:46] Jacqueline Lu: Absolutely. And in most digital systems, we don't have recourse, right? Like, you know, the CCTV camera is there, but who is on the other end? You have no way of finding out. It could be recording to a hard drive under someone's desk, or it could be going to the cloud. There's no way to tell just by looking at the device. So we're starting with something that feels simple, like a sign and a visual language. There's also a big challenge around digital literacy and awareness: as we increasingly live inside the computer, we are the agents inside the computer providing inputs into all of these systems, so how do we know what the places we live and work in are doing, and how what we do influences those spaces? What was interesting, when we did some rounds of user research on the transparency piece, we worked with Code for Canada and their inclusive usability testing program, was that we got, you know, great feedback around the transparency signage, but people were pushing even further. They were like, okay, this is cool, I understand that this is the data that's being collected and how it's being processed and who's accountable. But I want to know how often this traffic light slowed down to let someone cross. They were looking for that really tangible measure. Providing that piece of information, "this is the decision that this system and this data collection enabled for your community," it was interesting to see that that was actually what people wanted.

[00:09:36] Alistair Croll: Yeah. I mean, once you've collected the data, it's going to be "show me some analytics." And I think then it's going to be "I'd like an opinion about how to change the analytics," which sort of shapes public policy. So if you have a view that your city's streets should favor bicycles over cars, for example, this eventually becomes that. Everyone talks about transparent government, but so much of government is based on data collection and policy change that if you aren't aware of that cycle, you've lost your ability to shape public policy or have your voice heard. So I can see how this is foundational for a much more meaningful shift in how societies govern themselves, once this part is in place.

[00:10:20] Jacqueline Lu: That is our hope; we have big dreams. We've spent the last year talking to a lot of different organizations and sharing DTPR, and we did a very small scale pilot with the City of Boston. We've definitely heard from all types of organizations, private sector IoT companies, academic institutions, as well as municipalities and governments, that a shared approach like this is important, because basically there are signs out in the world about IoT and data collection everywhere, but there's no consistency and there's no mechanism for shared learning. Everyone's either trying something or not getting started, because there isn't something you can pick up off the shelf and just try. That's why we're really thinking about running this cohort: we believe there's potential for something like this to be a solution both to those problems for people thinking about governance, and to an emerging regulatory landscape. Especially in the US, you have all these different surveillance ordinances being passed, and New York City has new biometrics legislation, but there isn't a toolkit or an accepted practice, and even privacy regulators in Europe have called for icons or layered approaches to thinking about privacy. Bryan Boyer wrote a small article about GDPR in his newsletter for the University of Michigan. He was like, you know, are you ready for a world where every piece of sidewalk, lamppost, and bench has a terms of service and a privacy policy? How do we make that accessible and approachable for regular people? How do we also help organizations do that, and how do we learn?

That's really what we're trying to do.

[00:12:28] Alistair Croll: So, I mean, I think this is something that has been a long time coming, and maybe it's more obvious to the average denizen today than it was. But you've put together a team, and I was struck by the mix: you've got design and civic tech and startup thinking. Why was that kind of trifecta necessary for building something like this?

[00:12:51] Jacqueline Lu: I would say it was partly by accident, not entirely by design, and I think that's something to just acknowledge. Design has been important from the beginning, because there have been any number of governance proposals or principles around technology: this is how we can better govern technology, this is what it should do better for people. But what Patrick Keenan, who started the DTPR project with me, and I felt was: well, who's talking to the people? The experts are saying a lot of different things. How are we taking that and asking folks out in the world, who are saying that this lack of transparency, this lack of knowledge, is a problem, and actually testing with them? So design thinking was always a big part of the DNA of this project, because it enables flexibility and learning. It's key to the evolution of the DTPR standard, but also to ensuring that this actually works for people in communities, that it doesn't just stay in the space of principles or declaratives. And that was reflected in the way we did this. We did initially convene the experts, and we talked to experts all around the world: if we want to improve agency around technology and make things more transparent, what should people want to know, or what are the most important things people want to know? They came up with about 200 things, so that was a lot of things. But when we started to turn them into prototypes of signage, into actual information assets that we could put in front of people, what we learned was that they wanted to know three things.

They want to know what this is doing, what it's for: that's the purpose. They want to know who is accountable, who's behind this, which gets to your agency question. And they want to know whether they can be seen, whether they're individually identifiable. And then they want an avenue to learn more at their own pace, and to be able to provide feedback. That actually is reflected in the design of the standard itself. If you bring up slide 12, this would be a good one, you can kind of see. Here is what the DTPR signage looks like for the implementation we did in Boston, and you can see those top-level concepts that people told us were important through the research sessions: purpose on the top, with the accountable organization, which in this case was the City of Boston. Here it's a computer vision camera that de-identifies on device, and that's shown in blue here, along with the QR code, which gives you that avenue to follow up and learn more. Alistair, if you could click on the QR code: in the real world, you come up to that light pole, you see these symbols and this QR code, and you want to understand what this place is doing for you and your community. You can actually scan that QR code, and it brings you to what we call the DTPR guide, the digital channel, and you can see those main concepts reflected at the top. Here we're starting to navigate through what we call the data chain. What we're doing is using symbols in a unified visual language to bring people through a consistent mental model, always presenting the concepts in the same order.
So we start out with accountability and the purpose of the technology, then the technology type, then we talk about the data: how it's being processed and accessed. And it's expandable, so you can drill down and learn more. I think one of the most important parts is the feedback at the bottom, where we're very explicitly asking people: how does this technology make you feel? Is this information helpful? So we're going a bit beyond the idea of a static sign and actually creating that feedback loop people have told us they're looking for. And how do you make that structured, and organize those learnings in a way that can benefit organizations everywhere that are trying to use IoT to improve things for their community?
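The fixed ordering Jackie describes (accountable organization, purpose, technology type, data chain, feedback) is essentially a structured record. Here is a minimal sketch of what one such record and its sign rendering might look like. This is an illustration only: the field names and values are assumptions, not the official DTPR taxonomy.

```python
# A hypothetical, simplified DTPR-style disclosure record. The ordering
# mirrors the consistent mental model described above: accountability,
# purpose, technology type, then the data chain, then a feedback channel.
disclosure = {
    "accountable_org": "City of Boston",
    "purpose": "Pedestrian counting to improve crossing times",
    "technology_type": "Computer vision camera",
    "data_chain": {
        "collected": "Video frames",
        "processing": "De-identified on device",
        "identifiable": False,
        "access": "Aggregate counts only",
    },
    "feedback_url": "https://example.org/feedback/elm-st-camera",
}

def sign_summary(d):
    """Render the top-level concepts a physical sign would show,
    always in the same fixed order."""
    identifiable = "yes" if d["data_chain"]["identifiable"] else "no"
    lines = [
        f"Purpose: {d['purpose']}",
        f"Accountable: {d['accountable_org']}",
        f"Technology: {d['technology_type']}",
        f"Individually identifiable: {identifiable}",
        "Scan the QR code to learn more and give feedback",
    ]
    return "\n".join(lines)

print(sign_summary(disclosure))
```

Because the record is plain structured data, the same source can drive a printed sign, the QR-linked digital channel, or a machine-readable feed.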

[00:17:42] Alistair Croll: So you mentioned design, and one of the things I love from the many contributions design thinking has given us is the idea of the double diamond process: first you solve the right problem, and then after that, you solve the problem right. Solving the right problem is "what are all the problems we could solve, and which one is the best one?", which usually comes from user feedback and interacting with the stakeholders. And then, once you've decided what problem you're going to solve: what are all the ways we might solve that problem, and what's the best one? Can you walk me through how that process rolled out over time, and what governments can do to replicate that process, since you've obviously succeeded in building at least a prototype this way? And what was hardest about those two diamonds?

[00:18:39] Jacqueline Lu: So, wow, what was hardest about those two diamonds? I think part of what we knew we needed to learn was, again, whether we're solving the right problem; I think we're still in the early stages of that diamond. We've gone through a pretty robust prototyping process, where we've talked to privacy experts and experts in IoT and smart cities, and we've also done some user testing as part of our design process. But we're still wanting to learn, and this is actually part of why we're running a cohort this fall, whether this is actually going to do anything for people. And the reason we created this as an open source standard is that it needs to continue to evolve, and that it would spark a community around it.

[00:19:37] Alistair Croll: It's like Creative Commons. You know, before then it was incredibly complicated to try and assign copyright, and Creative Commons was like: okay, I click on this license and I'm done. It dramatically simplified things, maybe making some compromises. Everybody has a special case. In the VC world there's something called a SAFE, a simple agreement for future equity, and people go, oh, I'm using a SAFE document, and then all the VCs stop arguing about terms and conditions: oh, you're using a SAFE, okay. And it's the same kind of thing with Creative Commons: are you using Creative Commons? Unsplash couldn't have existed without that. It seems like you're doing the same thing for the internet of things.

[00:20:14] Jacqueline Lu: Absolutely. Well, that's our hope. Through the cohort we're hoping to enable the evolution and the learning: is this actually the right solution? This is an idea we happen to care about a lot. We think it has a lot of legs, but until we get it out in the world, and until we get this out in more places, we don't know if we're going to be able to reach that sort of Creative Commons level of success. So you asked what governments and organizations can do: really, sign up for the cohort. What I've been hearing in so many conversations is "this is the sort of thing that we need, this is the sort of thing that we want in the world." So sign up. Let us know that you're actually interested; help us get some metrics around this so we can demonstrate that there is demand for this sort of thing to exist out in the world. And that's where a little bit of the startup mentality comes in: you need to be able to demonstrate that people want your thing.

[00:21:20] Alistair Croll: So many questions about the startup part of it. First of all, you keep mentioning the cohort, but I want to make sure everybody who's listening or watching understands what you're trying to do here. The goal is to make the invisible part of public digital infrastructure visible, transparent, and accountable, so people can engage, using a set of standards that private and public organizations can all adhere to. And then, in the same way that "Intel Inside" was on all the Intel computers, you're like "DTPR compliant," and it becomes something that if you're not doing it as a vendor, you're kind of ostracized, or you're not doing what's possible. And possibly even more: insurance is an amazing way to compel behavior, and the law often finds that you are not negligent if you used the best practices at the time. So it seems like the first time there's a class action lawsuit against surveillance, the judge goes, "well, you could have used DTPR, but you didn't, therefore you're negligent." And I'm not joking, this is how many big advances happen. The private sector and governments need to get ahead of this, but it feels like it's pretty easy for private citizens to compel their governments and private companies to do this by saying, look, this is best practice, your honor; if you're not doing it, you're exposing yourself to huge liability. And then the insurers say, if you're not DTPR compliant, I'm not renewing your insurance: corporate liability, director and officer liability. All of a sudden, everyone adopts the standards. So it seems like there's a pretty good path, a legislative path, to compelling that just through the courts and the law.

[00:23:01] Jacqueline Lu: Yeah, I mean, I agree. Though we would like to put a positive spin on it: that people don't do this because they're afraid they're going to get slapped on the wrist, but because it actually sparks trust between communities and technology. I am an optimist. We have a five-year vision that this gives you superpowers in a space, and doesn't just kind of let you know. But there is pull, right, as well as the legislative push. You see this in the local surveillance ordinances, where elected officials are like, hey, deployers of IoT, you need to do better. You see this from the privacy regulators: in Canada we have guidelines for meaningful consent, and that starts to get at what you should be notified about and told, so that you don't have to click a cookie pop-up every time you walk into a space, because that is, unfortunately, the world we're starting to live into, and we haven't solved it. So with a cohort model, we want to get feedback to make this better: does this actually solve problems for people? But we also want to learn: does this actually help deployers of IoT communicate more clearly to the folks in their communities about the way these IoT technologies work?

[00:24:34] Alistair Croll: Have you talked to VC startup cohorts, like Techstars, to see if they can introduce the idea to all of the startups in their cohorts? Because the basic idea is pretty simple, and presumably even the data about people wanting to know about the data is useful metadata for those startups.

[00:24:58] Jacqueline Lu: No, I agree; we have not talked to Techstars, and we should probably talk to Techstars. I come from a public sector background. I spent almost 20 years in the public sector, so that's the space that I know, just acknowledging my frame here. To solve this problem, we actually need all of these different constituencies: we need the regulators, we need the IoT companies to participate, but we also need the public sector and governments to participate as well. So we do need participation from all of those groups. What we're thinking about is who has the mandate at the moment to clearly communicate with communities, and to make sure that technology is truly acting in the public interest, more broadly. That is one audience we're thinking about, and the City of Boston was a great example: they have made it a priority to spark this conversation with their residents around digital technologies. But we actually do have a number of IoT companies that have reached out to us and been like, hey, this is really interesting, how can we use this? And we're figuring out how to meet them where they are and what their needs are. I think we all have this need to communicate better. We don't know if this is the solution, but that's why it's open source, and that's why we want to run it as a cohort: because only by promoting that shared learning around a consistent toolkit can we actually start to arrive at that other part of the double diamond. I think we're still in the expansion process of the double diamond.

[00:26:53] Alistair Croll: But it certainly sounds like you've identified the right problem, which is: how do we make public infrastructure accountable? So you've done the first diamond, and it seems like you're right at that point where you've gone, I'm going to put a stake in the ground, which is: here's a standard, and a cohort, and a clearly articulated vision. Now let's go see which thing it's for. Is it for scooters, like a Lime, or is it for cars, like Lyft, or is it for park benches, or free wifi, or coffee shops, or whatever?

[00:27:27] Jacqueline Lu: Or CCTV cameras, or digital wayfinding, or, you know, all the things. Yeah.

[00:27:33] Alistair Croll: So I'm going to put on my bad guy hat for a second. How do you stop people from hacking it? Like, what if I just feel like going and putting a bunch of fake DTPR stickers on things to mess with you?

[00:27:47] Jacqueline Lu: First of all, I think that would be a great sign of traction.

I did have a kind of brief dream of, you know, is there a guerrilla art project in here? Just this mass stickering activity to raise awareness, and maybe it's a symbol with a question mark: I don't know what this thing actually is, let's have a conversation about it. So I think, yeah, if somebody hacked it in that way, I'd be like, awesome, now we're talking. I think this speaks to the future state of what success might look like. Right now we're still in the test and learn stage, but my hope would be that once we can actually say, well, there's demand for this, this is solving problems for people, the governance of DTPR would likely move out of a startup incubation model. There may be some other body, maybe a nonprofit, maybe an existing standards body out in the world, that could adopt and steward the open source piece, and there could be a certification process, or a sort of membership model. We think about some of those things. But right now we still need to ask: well, does it work?

[00:29:14] Alistair Croll: Do you remember those old TRUSTe things that were like, "this site is TRUSTe certified"? It seems like once the public becomes aware that there's a digital cert that says this is a legitimate DTPR disclosure... but I'm thinking beyond that, you know, with technologies like near-field and Bluetooth and Wi-Fi. I remember living in a house where one of my neighbors was very noisy, and another one of my neighbors changed their Wi-Fi SSID to "We can hear you." And I was like, Wi-Fi SSIDs are the new, like, apartment area network: you've got LANs and WANs, and now you have the apartment area network, and it works. But there is an idea here: you could change the standards around Wi-Fi or near-field so that something pops up, and then you can do some form of digital verification, so that as you're walking around, your device is being told, these are the properly certified digital systems around you, and then you're peering into them.

[00:30:14] Jacqueline Lu: Yeah, we're definitely inspired by things like the LEED standard: when you go into a building, you know it was built to a specific standard around greenhouse gas emissions and climate change and sustainability. We imagine there's a future where a place might have a sign that says, hey, this is a DTPR-enabled place. And because it's an open, structured taxonomy that describes how these digital systems work, and tries to present everything in a structured way, that's the beginning of machine-readable. That is an API. So you can imagine a future where maybe DTPR is not just a bunch of signs and a website, but actually an open source API to the city. In one of our prototyping and research sprints, we started getting really interested in technological proofs for accountability: different privacy-preserving technologies, ways to protect data and protect processing. And we learned about some thinking around personal fiduciary AI, promoted by one of our emerging coalition supporters, Richard Whitt. What we realized was: oh, actually, if we're successful, five years from now DTPR could be the hook that the personal fiduciary AI on your device interacts with, to let you know when you're entering a space where data collection or the use of technology might be outside of your personal preferences. Because there's an interesting design problem here: IoT is everywhere. If we put a sign on everything, it's not going to be very pleasant to walk around. I mean, I'm just going to tell you, that's a lot of stuff in my house.
And I'm going through an experiment in my house right now where I'm like, okay, if I was going to label every sensor in my house, what would my house look like? It's a little overwhelming. So I think: how does technology then also help? If this turned into an API for a DTPR-enabled place, that actually gives you more opportunity for different types of modalities.
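The "structured taxonomy... that's the beginning of machine-readable, that is an API" idea can be sketched in a few lines. Everything below is a hypothetical illustration: the field names and values are invented for this sketch and are not the actual DTPR schema.

```python
# Hypothetical sketch of a machine-readable, DTPR-style disclosure.
# All field names and values are invented for illustration; this is
# not the actual DTPR taxonomy or schema.
disclosure = {
    "place": "Example Plaza",
    "systems": [
        {
            "technology": "camera",
            "purpose": "pedestrian counting",
            "identifiable": False,
            "retention_days": 30,
            "accountable_org": "City Transportation Dept",
        },
        {
            "technology": "air quality sensor",
            "purpose": "environmental monitoring",
            "identifiable": False,
            "retention_days": 365,
            "accountable_org": "Parks Department",
        },
    ],
}

def summarize(disclosure):
    """Render each disclosed system as a human-readable line, the way
    a sign, kiosk, or personal device might present it."""
    return [
        f"{s['technology']} for {s['purpose']}, run by {s['accountable_org']}, "
        f"data retained {s['retention_days']} days"
        for s in disclosure["systems"]
    ]
```

Because the disclosure is plain structured data rather than prose on a sign, the same record can feed a website, a kiosk, or an on-device agent: the different modalities Jackie mentions.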

[00:32:30] Alistair Croll: My background is in network management, and there's the idea that if you have a thousand devices in an enterprise, you no longer let people talk to all thousand devices. You have a network management system that collects data from those devices; it's the trusted collector and group owner, and it raises any alerts. It's the one that goes, "hey, that computer is sending more traffic than it should," or whatever; it's looking for anomalies. And I think one of the changes I'm seeing in the zeitgeist around machine learning and AI is that today, AI is used on us. It's used to compel some kind of behavior in a market. It would be great to see AI being used for us, or by us, where you're saying, "this agent is looking out for me, on my behalf." And I think we're starting to turn that corner. But it does seem like there's a digital divide issue there, where those that have access to the technology, the handset, whatever, can see, but the most vulnerable are unable to see the digital collection systems because they don't have the technology to do so. So how do you think we should reconcile marginalization with this kind of tech?

[00:33:53] Jacqueline Lu: I think that's a tough question, and I'm not sure that we have the answer. I think we would need to, in future rounds of this test-and-learn, engage very specifically with those communities and be like, "so this is what we have. We know it's not all the way there, but we need your feedback to think about how this might actually work." Right now the model we have for DTPR is a sign, a symbol, something you can scan with a QR code. I could also see a future where it's a standalone website you can investigate, or a kiosk in a place would let you know that this information is available, so you don't have to have your own device. It still comes down to, I think: can we put in place the sorts of systems and structures so that this information is systematically available? Because that's actually, I'd say, the root cause of the problem right now: the information is not systematically available, and there isn't an easy path for it to become accessible. Once we are able to put in place that underlying knowledge base, then we can have a conversation about how we make it available to different people with lots of different needs.

[00:35:10] Alistair Croll: I think you've done a great job just by saying, "hey, something's happening here," in a way that everybody can access. But yeah, I mean, there's definitely a whole challenge there, and this is so often the challenge with innovation like this: people say, well, if it doesn't solve everybody's problem immediately... But I think you can roundly say that this makes it better for everyone, even though some people might have to ask for help: "hey, what does that sign mean?" And someone with a phone can explain it, right?

[00:35:36] Jacqueline Lu: Well, you've got to start, right? So I think this is a starting point, and we're not precious that this is the answer, but let's get started together. That's really why we're running with this cohort model. So right now we have a website up, and we have applications open.

[00:35:55] Alistair Croll: Who's the perfect applicant for this cohort?

[00:36:01] Jacqueline Lu: I think the perfect applicant for this cohort would be someone who's saying, "I want to use a technology in this space, but I want to have a conversation with the community, and I need feedback on it." So maybe you're thinking about piloting a technology, and you're ideally, or specifically, looking for a way to get feedback from the residents and the people who live and work in that community. Because this is a way for us to actually think about how we can do feedback a bit more dynamically, on an ongoing basis, outside of "please show up at this public consultation between this hour and this hour at this place."

[00:36:48] Alistair Croll: Do you want CTOs from IoT startups attending? Do you want people in civic tech organizations? Do you want government? Who's going to be in the cohort? How long is it going to last? What is the work product they're going to be delivering?

[00:37:02] Jacqueline Lu: So I think we're looking for participation from all of those groups, because, as we talked about earlier, this is a very multi-sided problem. It's not solely the province of government. It's not solely the province of IoT companies. And if we had some community-based organizations and neighborhood groups that say, "hey, we want to think about how technology is being used in our community right now," that would be really great.

[00:37:30] So why a cohort model? We decided on a cohort because it's about shared learning, shared experiences, and also sharing the risk, recognizing that doing things differently is hard. And so this call to action, this cohort of pilots, is also about helping us measure the demand for the standard. Lots of folks tell us that doing something like this is important; is it important enough to commit to trying it out? So for this cohort of pilots, what we're really thinking about is: we will help provide the training, we will provide the supports for how to use the standard, we'll help you implement the signage, we'll help you implement the digital channel in a particular place, and we'll feed you engagement metrics, like how many people actually scan the QR code. That is an open question right now. And then, because you can only go so far with just apps, we've also developed an intercept user research and feedback toolkit so that people can have a conversation; think person with a clipboard at the intersection: "hey, did you see the sign? What do you think about it? What does it tell you?" And then we synthesize that and read it back out to folks that are interested in DTPR and want to participate and learn. Because I think this is the beginning of a new way of thinking about how we deploy IoT in spaces. And so when we think about doing this at scale: what are the very real bureaucratic hacks that you might need to get through? What are some of the internal barriers and structures that you would need to work through? That's what we're trying to deliver.
With the cohort, we're hoping to launch in October and have it run for a period of three to four months.

[00:39:30] Alistair Croll: I try not to take strong opinions on any of these things, because we remain very non-partisan, but this is amazing stuff and very necessary. Your work with Mozilla suggests... obviously Mozilla's organization has strong ties to the W3C and the IETF and other organizations. At what point does this become something that there start to be RFCs around, so that people know how to implement against the standard?

[00:39:59] Jacqueline Lu: Before I'm comfortable approaching these sorts of standards organizations and talking to them about what it might look like for this to become part of those official bodies, I'm looking for proof points that it actually solves problems. It's not there yet. My focus right now is: get this thing out in the world, get feedback. Once we know and see how it works, and we also understand, from organizations that are trying to implement this, what some of their barriers and blockers are, we'll have a much clearer sense of what it looks like to actually approach one of the standards bodies. Because this is a very different type of standard. This is not an actual technical standard; I call it a system-to-people communication standard, so people don't think that this is about a new TCP/IP.

[00:40:53] Alistair Croll: Right, but then once you start building in near-field and other elements, it starts to take on technical components: standards, verification, and so on. So I could see this quickly becoming something for which there is the equivalent of the W3C. And I love the name Helpful Places, because it implies that places should help us, which is a wonderful thought. What a great idea: places should be helpful. It's so obvious, right? So I have a few other questions, but I know time is short, and you've done a great job of saying what the world might look like in the future. So if I put up these "five years in the future" slides, can you talk me through them? We'll show them here, because I think you've got a great example of what that is.

[00:41:42] Jacqueline Lu: Yeah, for sure. So, five years in the future. Right now we're talking about signs and icons and scanning QR codes, but when you think about what success looks like, what's the future we want to get to? Actually, the whole journey we're about to go through came out of one of the co-design sessions that we had around the standard. One of the contributors who had been giving us feedback as we evolved this asked us: how does this work in five years? If you get this adopted, what happens? We had been conducting a lot of research looking at mechanisms of accountability and agency, and that led to this increasing excitement for the prospect of a personal AI assistant. This would be provided by an organization you trust, and importantly, it knows your preferences. So the AI works for you; it doesn't work for anyone else. And as a communication standard that actually describes sensors and systems in places, our vision is that DTPR serves as the foundation from which personal AIs can help provide deeper control over digital interactions in the real world. So imagine what it would be like to have your digital assistant with you everywhere you go. As you enter a place that uses DTPR, you are given a link, and you can see what services are available for you there to interact with. You can see if you want to do anything differently, given the particular context that you're in today. So, next slide. How it all works is fully described and available to you, because everything's been described using DTPR, right?
So in the next slide, you can imagine that you go to this future library that offers a grab-and-go checkout system.

Next slide. And because DTPR is there, your AI understands that there's a grab-and-go system there, and it can take your personal settings and preferences and apply them. And like any system in the place, it's completely inspectable and legible to you. And because it's the future, your AI helps reduce your cognitive load by essentially doing the work of reading the information for you, and letting you know if something is different, outside your boundaries of comfort, or if you need to take action. So, next slide. You can learn about all of the systems in a place. Here we see a cycling system; you can keep it high level, or you can start to drill down to get all the specifics. Next slide: here's sensor-level detail. So this was a day in the life. You can start to imagine being able to discover and learn what's around you. Next slide: maybe you support the artist that you discovered in the plaza as you went. Next slide: again, this tells you and makes transparent how a system works and who is behind it. You can also think about ordering ahead at a food court and having it help you find seating, because all of a sudden you're in a future where you're able to see and access these systems that are all around us. Think about unlocking a door; we already have the beginnings of a lot of these systems now. And in this future, at the end of the day (this is an important slide), the idea is that you can check in with your personal AI, review all of the transactions and interactions that you've had with the world today, and change anything that might be necessary.
So again, it's about really making everything inspectable: a vision of the future where this type of communication standard can actually become a new type of interface for the places where we live, work, and play. That is the Helpful Places future that we hope to get to.

[00:45:59] Alistair Croll: I think, after all the doomscrolling over the last two years, it is so refreshing to go, "ah, that looks like a good world." Now, accessibility and marginalization are tons of hurdles to overcome, which is why I like the fact that you started with simple physical QR codes. And I kind of want to coin something: instead of Maslow's hierarchy of needs, it's Lu's hierarchy of helpful places. That idea that first you've got to get transparency, and then eventually you work your way up to agency. And that seems to be the problem that many of the rather techno-utopian efforts to make smart cities have overlooked: you need to build trust, you need to explain this and ease the population into understanding how their city works. And I can see it going even down to the point of political accountability: "this system was created based on this law, which used this much taxpayer money, so now I know where my dollars are going." That's incredible. And it leads to one of the problems that we're seeing, and we talked about this last year at FWD50: the epistemic crisis of democracy. We've gone from a world where you had a few sources of information, and they may have been lying, they may have been racist as hell, right, but you had like five newspapers and three, maybe ten radio stations. It was relatively few, broadcast, one-to-many, and as a result we got a consensus among the population. Today, any person can broadcast to millions of people within seconds, for free, and it's leading to this kind of epistemic crisis and mistrust of public systems.
And it really feels like the way you reestablish that, and show people that collective action and paying your taxes and voting for principled investment in the infrastructure around us matter, needs this kind of accountability. So I think that, on a profound level, this does more than just give people recourse or help them understand the invisible digital world around them. It can reestablish faith in collective efforts, if it's done right.

[00:48:14] Jacqueline Lu: That is the hope. 

[00:48:16] Alistair Croll: Awesome. A very good hope. Jackie, it's always fascinating talking to you. I know we've gone a little long here, but I wanted to make sure we talked about this, because this is amazing stuff. Thank you for bringing it to us; we want to share it with everyone. Again, how do people get involved if they want to join?

[00:48:33] Jacqueline Lu: We have a website: https://dtprcohort.helpfulplaces.com/. That is where we have our call to action. It describes what the cohort is and what it looks like to participate, and there's an application form. We are taking applications right now, and we are going to be reviewing them on a rolling basis until September 1st. From there, we hope to select, or not even select, identify, five to ten places and organizations that want to take the first step of trying this out.

[00:49:11] Alistair Croll: And if some of the big tech companies want to help found the foundation, or get involved, or sign on and endorse it, they can just reach out to you and say, "hey, this is a good thing, I'd like to throw some support behind it."

[00:49:24] Jacqueline Lu: Yeah, definitely, they can reach out to me. We've got an email address: jackie@helpfulplaces.com. We do think it is important for all of the participants in the cohort to be at least partly self-funded in their participation, because innovation is hard, and part of showing that you're committed and ready to try something is actually going and getting the budget to do so, in government or whatever organization you're in. But I also recognize that municipalities in particular are in a very tight moment in their budget cycle, and these are things that maybe are a little bit on the edge of what they feel they could be doing. So we're absolutely looking for sponsors and additional support to help make it possible for these organizations to get out there into the world and try this out, because it really is about that multi-sided conversation, and everyone needs to participate.

[00:50:34] Alistair Croll: I mean, it looks like you've done a tremendous amount to get the ball to a place where other people can pick it up and try stuff out, right? You've put a stake in the ground, and this is amazing; the vision is clear. So thank you so much for doing the hard work to show people what can be, and hopefully others will jump on that bandwagon and actually take you up on that challenge.

But this is, this is amazing. Thank you so much. [00:51:00] 

[00:51:00] Jacqueline Lu: Thank you so much for having me.[00:51:10] 

DTPR is one of the most exciting ideas I’ve heard about in a long time.

The standard, Digital Trust for Places and Routines, makes the invisible layers of public spaces visible and accountable to the people who inhabit them.

Saying that “we live in a digital world” is almost a cliché. We spent a massive amount of our lives online even before the pandemic, and social distancing and remote work have dramatically increased this trend. Making sure we have rights in that digital realm is a priority, and not a new idea: I gave a talk at Strata six years ago arguing that AI should work for us, not on us, and that nobody should know more about you than you do.

But I was myopic, because there are two digital worlds. One world we travel to consensually, when we open a browser or unlock our phones. The other we walk in constantly with little recourse or awareness.

The world around us is filled with sensors, from cameras to microphones to those little strips that count cars on a road. This “physical digital” realm, packed with an Internet of Things, is both pervasive and unaccountable. When you walk past a security camera, or swipe a badge to unlock a door, you don’t know how that information is being used, or by whom.

And that’s a big problem.

If you knew who had installed a camera, what it was recording, and who had access to that data, it would make the invisible visible. It would create accountability. Better yet, you might be able to access that data if your tax dollars had paid to collect it, and answer questions—how many people crossed the intersection, on average? When is the store busiest?—in ways that could change public policy.
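If that count data were published openly, answering those questions would be trivial. A toy sketch, with invented numbers:

```python
# Toy sketch: hourly pedestrian counts a city sensor might publish openly.
# The numbers are invented for illustration.
hourly_counts = [12, 45, 88, 130, 95, 60]

average = sum(hourly_counts) / len(hourly_counts)
peak_hour = hourly_counts.index(max(hourly_counts))  # busiest hour in the series

print(f"average crossings per hour: {average:.1f}")
```

The point is not the arithmetic; it's that once the data behind a sensor is accountable and accessible, anyone can ask these questions, not just the sensor's owner.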

Bringing digital transparency to public spaces and real-world routines is the problem that Jackie Lu, Patrick Keenan, and Adrienne Schmoeker are trying to tackle. Combining their backgrounds in civic tech, design, and startups, they’ve defined—and tested—a way to identify digital infrastructure and let denizens learn about it. Bryan Boyer, Director of the Urban Technology degree at University of Michigan and former Helsinki Design Lab member, has written a great post on the challenges and opportunities of digitally understandable public spaces.

The team at Helpful Places—as the folks behind DTPR are called—hasn’t just designed a taxonomy for labeling the invisible digital world around us. In 2020, they piloted a test in Boston that identified digital infrastructure.

Denizens who load the page identified by these stickers are taken to information on the sensors with which they’re being monitored, showing what’s collected, who owns the data, and how it’s analyzed and stored. That’s a great short-term goal, but their longer-term vision is even more ambitious: They want to connect this information to personal agents who can show us the digital affordances available in a physical space. One of Jackie’s examples includes accessing a library, reserving a table at a restaurant, and reviewing a summary of your day to better understand how you interacted with the invisible digital half of public spaces.
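One way to picture that agent's role: it compares a place's machine-readable disclosure against preferences you've set once, and only interrupts you when something falls outside them. This is a speculative sketch; the field names and preference format below are invented for illustration, not part of DTPR.

```python
# Speculative sketch of a personal agent checking a place's DTPR-style
# disclosure against user preferences. Field names and the preference
# format are invented for illustration.

def flag_concerns(systems, prefs):
    """Return (technology, reason) pairs for any disclosed system that
    falls outside the user's stated comfort boundaries."""
    flagged = []
    for s in systems:
        if s.get("identifiable") and not prefs.get("allow_identifiable", False):
            flagged.append((s["technology"], "collects identifiable data"))
        elif s.get("retention_days", 0) > prefs.get("max_retention_days", float("inf")):
            flagged.append((s["technology"], "retains data longer than you allow"))
    return flagged

systems = [
    {"technology": "camera", "identifiable": True, "retention_days": 30},
    {"technology": "footfall sensor", "identifiable": False, "retention_days": 400},
]
prefs = {"allow_identifiable": False, "max_retention_days": 90}

concerns = flag_concerns(systems, prefs)
```

The design choice that makes this possible is the one Jackie describes in the interview: because disclosures are structured data rather than prose, an agent can read them so you don't have to.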

But I think there’s an even bigger aspect to this. Our trust in government is at an all-time low, with many questioning the value of collective action or shared infrastructure. As more of the services that government offers become digital, making the invisible not only visible, but accountable and useful is essential for the public trust. Imagine seeing a sensor, knowing what it’s for, how much of your taxpayer dollars paid for it, and an analysis of the information it’s collecting. That’s informational democracy, and DTPR lays out a clear idea for how to bridge digital and physical worlds in this way.

There are many hurdles to overcome, of course. For one thing, not everyone has the tools and devices to access digital information, so ensuring it doesn’t further widen the digital divide is paramount (but simply labeling that a digital layer exists is a good first step). For another, vendors who make notoriously insecure IoT devices may be loath to support standards that hold them accountable. But if public policy requires DTPR the way we require GDPR for data in Europe or LEED certification for green buildings, it could quickly become a mandatory requirement for government buildings and public sector data collection, and drag the private sector along with it.

DTPR is an exciting initiative that crosses the political aisle, and Jackie and team have done the hard work of thinking it through and prototyping it. They’re running a fall cohort for startups, tech firms, and governments if you want to get involved. But first, you should watch this interview with Jackie where we explore the need for this standard, and how the team designed a vocabulary for the digital layer of public space.

Apply to be a part of a DTPR implementation cohort from Fall’21 through Winter’22 guided by the Helpful Places team led by Jackie Lu. Applications are reviewed on a rolling basis through September 1st. Apply here and contact dtpr@helpfulplaces.com with any questions!