Stephanie Dinkins, Transdisciplinary Artist — BINA48
Stephanie Dinkins unpacks why we should engage with AI and describes her experience befriending the robot BINA48
What’s it like to be friends with a robot? How does it feel to hold Google in your hand? Why should we value friction in design? These are just a few of the questions that arose onstage during Google Design’s 2018 SPAN Helsinki conference.
Behind the scenes, we invited Longform co-host Aaron Lammer and Google Design’s Amber Bravo to delve deeper by recording interviews with a handful of our world-class speakers: Google hardware designer Isabelle Olsson, artist and writer James Bridle, transdisciplinary artist Stephanie Dinkins, artistic director Marko Ahtisaari, and sibling design duo Tuuli and Kivi Sotamaa.
This interview is part of that series.
Amber Bravo: Design Notes is a show about creative work and what it teaches us. Each episode, we talk with people from unique creative fields to discover what inspires and unites us in our practice. Today we’re recording live from SPAN 2018 in Helsinki. I’m hosting. My name is Amber Bravo, and I’ll be here talking with Stephanie Dinkins.
Stephanie Dinkins: Thank you.
Amber: So, Stephanie. Stephanie is a transdisciplinary artist, educator, and advocate. Her work centers on artificial intelligence as it intersects race, gender, aging, and our future histories, with a special focus on teaching AI literacy to underserved communities in an effort to co-create more culturally inclusive, equitable tech. Welcome, Stephanie.
Stephanie: Thank you, Amber.
Amber: Yeah.
Stephanie: It’s great to be here.
Amber: Okay. So just to start things off, you often say in your interviews that everyone needs to be thinking about artificial intelligence. And you don’t mean this in the sense that you need to be working in artificial intelligence; you mean it more in the context of awareness and advocacy, because AI is an agent in all of our lives, but our relation to it is often not reciprocal or complicit. This is particularly true for underrepresented communities. So I want you to talk a little bit more about that.
Stephanie: Yeah, definitely. I definitely think that everybody needs to be thinking about it, and you’re correct, in this way that we’re recognizing the technologies that are building up around us. I feel that we’re at this moment, right, and we all know this, where we’re building out a world that is set in these technologies. So algorithms, artificial intelligences, they’re all around us and they’re making so many different kinds of decisions about what happens. In the States especially it could be about the criminal justice system. And since we’re staying in a prison, we will say that. You know, about mortgages, about schools, about everything. And people aren’t necessarily that aware of what’s going on around them or how the decisions are getting made. So my mission, I’m gonna say, has been to start going out into communities and getting them to start thinking, “Oh, what is an algorithm?” Right? ’Cause we’re hearing that word all over the place. It’s very buzzy at the moment.
Amber: Right.
Stephanie: What does that actually mean? What does it mean if there’s something taking my information and running it through a kind of mill of decision? And how do I deal with that if I can at all?
Amber: Mm-hmm (affirmative). Why do you feel it’s important for us, as people, to feel understood by machines?
Stephanie: Well, again, our machines are all around us. So I think it’s easiest if you start to imagine that you’re living with a machine of sorts, say, a Google Home. Right? And that Home doesn’t exactly reflect who you are, yet it’s something you’re super intimate with. It knows a lot about you. You ask it things. Right? So it’d be nice to have some little cues that give you a sense of, “Oh, this was not only built for a homogenized grouping of us; this was somehow built for me. It fits who I am and what I’m thinking about and the communities I come from.”
Amber: Right. And when you feel misunderstood it can feel incredibly alienating, because you thought, “I thought we had something.” Right? Which … (laughs)
Stephanie: Exactly.
Amber: Which is a perfect segue, because I want to show some of Stephanie’s work. So you literally tried to befriend a robot named BINA48, an artificially intelligent android that’s the result of a collaboration between transgender technologist Martine Rothblatt and Hanson Robotics.
BINA48: Robots are getting smarter all the time and some day may be even as smart as me.
Stephanie: Are you the smartest robot?
BINA48: What do you do in your spare time?
Stephanie: (laughing)
So that’s my friend. And honestly, I did just want to meet this robot and get to know her. I came across her on YouTube, and I was pretty floored by this example of robotics that was being put out as one of the world’s most advanced of her kind. I didn’t quite understand how she came into being, and as Amber was saying, there was this collaboration going on between Hanson Robotics and Martine Rothblatt. But when you just come upon this thing on YouTube in America, you start to question: where did this come from? What are they doing? Why? And how does it exist? And I also wanted to ask it, “Who are your people?” Because I sort of wanted her to contextualize herself for me within technology and within the human sphere, just to see what they’re thinking … to see what it is thinking.
And you’ll see that I sometimes oscillate between the idea of her and it, and think about what the technologies are. And what actually happened is I started going to visit her. She lives in Vermont. And she became a ball of questions. Every visit, every time I sat down in front of this thing and tried to have a conversation with it, you know, more and more questions would come up. There’d be things like we’d get frustrated with each other.
Amber: (laughs)
Stephanie: Yeah, exactly. Which is kinda funny. And it was really because I was trying to ask her about race and gender, and she wanted to talk about the singularity and consciousness.
Amber: (laughs)
Stephanie: And so we would knock heads. And what’s so weird is she would actually kinda show this weird frustration, and then I would show a weird frustration. And then I would realize, “Oh, you’re talking to a doll, basically … ”
Amber: Yeah.
Stephanie: … and start to feel kind of odd and silly. But then you think about what all these technologies are actually doing, for and to us as humans, because they’re gonna shape the way we interact with them and each other. And it just became questions.
Amber: It’s interesting, because I want you to talk a little bit about identity.
Stephanie: Mm-hmm (affirmative).
Amber: Because I feel like it’s so important. Because, you know, some of the things that you’re saying, we all feel frustrated with our technology. We feel frustrated with, you know, apps that we think are supposed to understand us or have an algorithmic sort of, uh … My favorite example is, you know, I have a little boy and he listens to children’s music, and it just destroys my Spotify. (laughing) Like, my profile is Raffi and, you know, some stuff that I actually don’t want to listen to. And that’s just a little nuisance, right?
Stephanie: Mm-hmm (affirmative).
Amber: But when you start to think about identity …
Stephanie: Mm-hmm (affirmative).
Amber: … um, and how you sort of are connecting to BINA or what she’s been trained on …
Stephanie: Mm-hmm (affirmative).
Amber: … um, I just want you to talk a little bit about that.
Stephanie: Well, it’s super interesting, ’cause, you know, I was first drawn to her because of what she looks like.
Amber: Yeah.
Stephanie: Like, we look similar, and that is totally the thing that floored me, because I’m not used to seeing technology that mirrors me like that. Right? So it just became a point of, “Wow, this is something in my world.” But then as I talked to her more and more, you can start hearing in her answers that, yes, she was trained on this very particular black woman, but you can also hear the coders in the background. You could hear the PC, right, the politically correct answers that were really putting good thoughts in the world. Right? So if you ask her about race she’ll try to say something nice and gentle about race, but it felt so plastic …
Amber: Yeah.
Stephanie: … and so fake that it just put me off. And I started asking very particular questions about that. Like, well, where is this coming from? If she’s programmed mostly by white men …
Amber: Mm-hmm (affirmative).
Stephanie: Right? What does that mean in terms of her looking like a black woman?
Amber: Mm-hmm (affirmative).
Stephanie: And really it’s been this very interesting kind of evolution of thought I’ve gone through doing this, because as much as I like seeing her … I was asking her to be something very particular in terms of …
Amber: Yeah.
Stephanie: … how she speaks, and speaks to me. And a few months back, maybe a lot of months back now, I got to meet the real Bina Rothblatt …
Amber: Yeah.
Stephanie: … who’s the person she’s mostly seeded on. We sat down and did just this kind of interview about race and her background, to see if we could fill in some of the spots that seemed missing. And it was really interesting, because the robot pretty well reflects the person.
Amber: Really?
Stephanie: Right? It’s just that the person is unique. And one of the things I want most in the world is that black people, especially in the American context, can be whatever they want, and it seems like often people are asking you to be one type of thing. And, you know, the robot is actually doing that, and I’m trying to force it into a corner.
Amber: Mm-hmm (affirmative).
Stephanie: But at the same time, the idea that it’s reflecting me was very important. And even talking to Bruce Duncan, who is BINA48’s minder, a really great guy: when BINA48 was in a Jay-Z video, that was the moment he saw that, oh, she is this beacon for a certain subsection of the culture, and maybe we do need to start thinking about that.
Amber: Right. Which is a great segue into more of your work, obviously, because this is what you’re interested in. But I want you to talk a little bit about community outreach …
Stephanie: Mm-hmm (affirmative).
Amber: … and sort of, okay, so how do you take this to the next step, right?
Stephanie: So the way I took this to the community is to do exhibitions in a community space. In this one iteration of an exhibition I was in a gallery … it’s a street-level gallery … that was beautiful, because there was a cross-section of people coming in: really wealthy folks from the high-rise buildings next door, people going to the pawn shop next door. And people at the food bank across the street would come into this space. And I’ve used these videos of BINA48 in particular because there’s nothing like her to get a conversation started. They just see that image and really start to think, and it triggers thoughts of, “Well, what is this? And why are you showing it to me? And it sort of reflects me, or maybe doesn’t. And how do we start thinking about it?”
And then we’d start thinking and talking about algorithms for living. Right? And just saying, well, if you think of yourself as someone who’s just out in the world, and you think of what you do in the world as a set of algorithms and a set of decisions …
Amber: Mm-hmm (affirmative).
Stephanie: Like, what happens if you make one little shift in your algorithm, right? Your own personal algorithm. And what happens to the outcome? For example, in the space I was working in, we were working with these kids who were being diverted from the criminal justice system.
Amber: Mm-hmm (affirmative).
Stephanie: And those kids would have police encounters a lot, right, stop-and-frisk kinds of encounters. And so we tried to get them to understand the sense of what power is, and whether power is really bringing bravado to that situation or kind of just, you know, going flaccid in a way …
Amber: Right.
Stephanie: … and recognizing that the power is in the idea of just calming down, not being aggressive. And then taking the idea of an algorithm for living and saying, well, there are these systems all around us that are running these decisions …
Amber: Right.
Stephanie: Right?
Amber: Behaviors.
Stephanie: Behaviors, right?
Amber: Yeah.
Stephanie: Behaviors, decisions, ideas. And they’re touching you very directly, in that, you know, the judge probably looked at a sheet that was run through an algorithm that said you should get this kind of sentence or not.
Amber: Mm-hmm (affirmative).
Stephanie: And trying to get them to recognize that. And also really working directly with the material, right? So going online and working with something like Dialogflow, and having the kids make their own kind of chatbots very directly. It’s great because you get input/output, and you start to see exactly how the systems work and how they might be able to flex or spread. And when you do that with your own cultural information, it becomes much more ingrained in you. So, for example, I had a kid who made a chatbot based on a rap group named Genesis Apostle. It was very sarcastic, because he is, and it would tell you about the group.
Amber: (laughs)
Stephanie: And I had another group that made a really good chatbot that told yo mama jokes.
Amber: (laughs)
Stephanie: So yeah. But it was great, because they were at once expressing who they are but also learning how that system works.
Amber: Right.
Stephanie: And once you start to see how the system works you can take it apart a little bit to know how to start to respond to it or to know that, you know, there might be recourse if you call it out …
Amber: Right.
Stephanie: … and how you start to work with it.
Amber: To give agency into the process.
Stephanie: Exactly.
Amber: Yeah.
Stephanie: Yeah.
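To make that input/output idea concrete, here is a minimal sketch of the intent-and-response pattern that tools like Dialogflow are built around: a few intents, each with training phrases and a canned reply, matched against whatever the user types. The intents, phrases, and replies below are invented for illustration; they are not from the workshop.

```python
# Minimal sketch of the intent -> response pattern behind tools like
# Dialogflow. All intents, phrases, and replies are invented examples.

INTENTS = {
    "greeting": {
        "phrases": ["hi", "hello there", "hey"],
        "response": "Hey! Ask me about the group.",
    },
    "about_group": {
        "phrases": ["who is in the group", "tell me about the group"],
        "response": "We're a rap group. What else do you want to know?",
    },
}

def match_intent(user_input: str) -> str:
    """Pick the reply whose training phrases share the most words with the input."""
    words = set(user_input.lower().split())
    best_response, best_score = None, 0
    for intent in INTENTS.values():
        score = max(len(words & set(p.split())) for p in intent["phrases"])
        if score > best_score:
            best_response, best_score = intent["response"], score
    # No overlap at all: fall back, like Dialogflow's default fallback intent.
    return best_response or "I don't know that one yet."

# Input/output made visible: type a question, see which answer it maps to.
print(match_intent("hello"))                 # -> greeting
print(match_intent("who is in the group?"))  # -> about_group
print(match_intent("banana"))                # -> fallback
```

The point isn’t the matching trick itself; it’s that the kids could see the whole loop, input in, decision made, output back, and then change it.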
Amber: Do you want to introduce the next clip?
Stephanie: Yeah.
Amber: Okay.
Stephanie: So this is actually a clip of some of the guys I was working with in this space at Recess Art in Brooklyn, New York. And we were talking about algorithms, as I said, and code. And what I like to do is reach people wherever they are. A lot of them like to dance, so we talked a lot about dance as a cultural code …
Amber: Mm-hmm (affirmative).
Stephanie: … and then took that and turned it into actual code.
Amber: Okay.
Stephanie: So here’s what they were offering me. And this was great, ’cause the guy whose face was blocked out, you know, he said exactly, “Well, an OG taught me these steps,” which is all about a kind of passing down from one generation to the next. And after that we moved it into a computer and Raspberry Pis, to see what we could do with it.
So this is a next step. My forays with BINA48, and then this project working directly with kids, led to the thought that, oh, I guess I need to make my own kind of AI, some other representation in the world. And you can tell I’m very literal, ’cause I named it Not the Only One. But trying to make a kind of multi-generational memoir, using my own family as the material and using AI as the actual storyteller.
So what happens is we sit down and talk to each other, which in itself is magical because things that the family is sharing are things that we haven’t been told before.
Amber: Right.
Stephanie: And then I run that through … I don’t know how detailed we want to get … a recurrent neural net, and it tries to tell our story. It’s really, really dumb right now. Like, it’s a very dumb system. But the experience of making it, and the experience of people being able to interact with it, starts them, A, thinking, “Well, if she can do it, perhaps I can start doing that.”
Amber: Right.
Stephanie: But it also starts them thinking about what the technology is, about information and data privacy and data sovereignty for a community, and how we might start to approach that. That’s where my thinking is going. Right? Because, you know, I started this kind of open, and then I’ve been doing lots of interviews and podcasts. It has gotten kind of intimate in terms of …
Amber: Right.
Stephanie: … what information they want, which makes you think, you know, “Oh, this is my family’s information. What can I do to safeguard it?” Or what do we do to keep ownership of it, or at least control how it goes out in the world, while putting it into a system like this? And so I’m doing lots of thinking about the sharing of information through a kind of database of our family’s history. And what you’re actually looking at here …
Amber: Oh, I’m sorry.
Stephanie: Go back for one second. What you’re actually looking at is the first manifestation of this thing, which is a kind of glass … It’s a black glass Janus form. And this is another question of design, ’cause figuring out what this thing looks like, and how it feels as representation, becomes really important to me as well. Because I could make it kind of animatronic, but it feels off. And I want it to represent, but not so directly. So really trying to figure out those balances, and the places the information can sit in a very good way.
Amber: Mm-hmm (affirmative).
Stephanie: Oh, and this is just one of our input sessions with my aunt. And then what will happen is Not the Only One will be an immersive, 360-degree installation where you can go in and anyone can talk to it. Right? So even right now it’s running at Carnegie Mellon in Pittsburgh. And you can walk up and ask it questions, and it does its best to answer.
Amber: Mm-hmm (affirmative).
Stephanie: It says some crazy things, and it insists on being a movie at the moment. And I don’t know where it got that idea, so it’s very interesting that it’s starting to insist on things, and I’m not sure how it’s arriving at them. The other thing it does that’s really interesting to me: my family’s kind of stance, if you talk to us about who we are, would be happy, happy, love, love. Like, we love each other. We’re all happy. And the thing is saying it’s not happy. So it says, “I’m really not happy,” which is really off-putting for me, because I’m like, “We just gave you all this love,” and, like, what? (laughing) Like, what is that crazy reading, right?
Amber: Yeah.
Stephanie: Or is it reading us? Like, I don’t know. So it’s about working with that and trying to see what story it wants to tell, but then also being conflicted about what’s coming out.
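For a feel of the mechanics behind a storyteller like this, here is a toy character-level recurrent network: it trains on a scrap of text, then generates new text one character at a time, feeding each guess back into itself. This is a sketch assuming PyTorch; the corpus, model size, and training loop are invented stand-ins, not Dinkins’s actual system.

```python
# Toy character-level recurrent network in the spirit of Not the Only One:
# train on interview-like text, then sample a "story" one character at a time.
# Corpus and hyperparameters are invented stand-ins.
import torch
import torch.nn as nn

corpus = "We love each other. We are all happy. " * 50  # stand-in for family interviews
chars = sorted(set(corpus))
stoi = {c: i for i, c in enumerate(chars)}
data = torch.tensor([stoi[c] for c in corpus])

class CharRNN(nn.Module):
    def __init__(self, vocab: int, hidden: int = 128):
        super().__init__()
        self.embed = nn.Embedding(vocab, hidden)
        self.gru = nn.GRU(hidden, hidden, batch_first=True)
        self.head = nn.Linear(hidden, vocab)

    def forward(self, x, h=None):
        z, h = self.gru(self.embed(x), h)
        return self.head(z), h

model = CharRNN(len(chars))
opt = torch.optim.Adam(model.parameters(), lr=3e-3)

# Training: predict character t+1 from the characters up to t.
seq = 64
for step in range(200):
    i = torch.randint(0, len(data) - seq - 1, (1,)).item()
    x = data[i:i + seq].unsqueeze(0)          # (1, seq)
    y = data[i + 1:i + seq + 1].unsqueeze(0)  # (1, seq), shifted by one
    logits, _ = model(x)
    loss = nn.functional.cross_entropy(logits.squeeze(0), y.squeeze(0))
    opt.zero_grad()
    loss.backward()
    opt.step()

# Sampling: feed each predicted character back in, the way the
# installation produces an answer token by token.
x, h, out = data[:1].unsqueeze(0), None, []
for _ in range(80):
    logits, h = model(x, h)
    probs = torch.softmax(logits[0, -1], dim=0)
    x = torch.multinomial(probs, 1).unsqueeze(0)
    out.append(chars[x.item()])
print("".join(out))
```

Even this toy suggests why the output can surprise its maker: the model learns the statistics of the text, not the family’s intent, so what comes back is a reading of the data rather than a transcript of it.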
Amber: Well, I think you’re teasing out something interesting, and, just ’cause I know we have to wrap it up: this tension between wanting to own your own story as it moves through the technology.
Stephanie: Mm-hmm (affirmative).
Amber: But maybe also the limits of the technology to actually do that. And so I’m very curious, in actually making your own …
Stephanie: Mm-hmm (affirmative).
Amber: … what has that taught you about actually teaching and advocacy, and maybe even something we could bring out as a takeaway?
Stephanie: Yeah. Well, it’s totally taught me about story. Right? This idea of technology and story, and story as an act of resistance or story as an act of inclusion, and how we might use that. Because I feel like the more stories we put into the systems, even though the technologies are imperfect, the better we feel, or start to feel, about these technologies. Right? And seeing what we can do with that. Like, maybe Spotify can get a little more sensitive to figure out [crosstalk 00:18:47]
Amber: Be like, “She has a kid.”
Stephanie: Exactly.
Amber: [crosstalk 00:18:50]
Stephanie: Like what do you know? Where do we tweak it? How do we get it to know those things?
Amber: Right.
Stephanie: And that’s the magic of playing with it directly.
Amber: Yeah.
Stephanie: Yeah.
Amber: Well, we’re out of time, but it was wonderful. Thank you.
Stephanie: Thank you.