#32 - An Imperfect Match: Transcript

Thursday, July 31, 2014

Alex Goldman: So on Monday OkCupid posted a blog post titled "We Experiment On Human Beings!" And they were talking very proudly about how they, like almost every other website in the world, run various experiments to optimize people's usage of their site, which seems fairly uncontroversial until you dig down into precisely what they were experimenting on. 

PJ Vogt: Yeah, so what they were doing was OkCupid, when you sign on and you look at somebody's profile, it tells you how well you match with that person according to survey questions you both answered. So, "Do you shower twice a day? Do you believe in God? Do you believe that abortion is okay?"

AG: Everything from "Do you like horror movies?" to "Would you allow a gun in your home?" 

PV: Yeah, so, OkCupid intentionally misled people. If I were to look at Alex's OkCupid profile it would suggest that we are people who are likely to agree on everything and like each other a lot, when in fact we despise each other. And the thing they wanted to test was, does it even matter? Like if you tell two people that they are going to like each other and you send them on a date, are they as likely to get along as if they were actually well matched?

AG: In the blog post Christian Rudder, who is the president and one of the founders of OkCupid wrote "We noticed recently that people didn’t like it when Facebook ‘experimented’ with their news feed. Even the FTC is getting involved. But guess what, everybody: if you use the Internet, you’re the subject of hundreds of experiments at any given time, on every site. That’s how websites work."

PV: So we decided to get in touch with him.

AG: Do you understand the concerns of the people who were...

Christian Rudder: Sure.

AG: ...pissed off about this?

CR: Yeah yeah, I definitely, I do, and part of that is my own fault because, y'know, the blog post is sensationally written, for sure. On the other hand this particular experiment is just part of a continuum of experiments that are happening all the time. I mean just for match percentage, y'know, if we change the way a variable is weighted in the math, y'know, we have to test that version B against version A, the status quo, and y'know inevitably people are gonna see two different numbers for the same process, you know? And that's just part of the scientific method. For us, if you come up with an algorithm you can hope that it works, but you have to perform experiments to prove that it works.

AG: Was there any consideration given to an opt-in procedure where people could, beforehand, agree to be part of it, and then just having a control group?

CR: No, there wasn't. Once people know that they're being studied along a particular axis, inevitably they're gonna act differently. Just the same way that people on reality TV don't act like themselves. Like I was in some psych experiments when I was in college, just 'cause they give you twenty bucks to go to the department and you, y'know, you sign a form. But that is informed consent — which listeners can't see but I'm putting in quotes — and you uh, y'know, you sit down and you hit a button when some word blinks on the screen, or a dot appears and you like move a lever or whatever, and you have no idea what they're measuring you for. Y'know, they don't tell you anything. They could just be measuring whether you're obeying their instructions, or how you greeted the person of another race at the very beginning of the whole thing, and the experiment is just a sham. So like, you're not really informed.

AG: You are correct in that people are constantly experimenting and A/B testing all over the web. The difference in the case of Facebook, and I think here especially, is that if you are A/B testing the design of a website or the font of a website, it just affects my use of that website. If you're A/B testing the way that we receive information on OkCupid or Facebook, it can affect mood, it can affect relationships...

CR: Of course. 

AG: ...there's a much broader scope of ramifications.

CR: Of course, yeah, of course. And like, look, I've had people, I don't wanna put questions in your mouth, but I think a concern is like people have come to OkCupid and trust the site to give them good matches. And I totally agree with that and I understand why it might seem that we've somehow violated that trust. However, my answer there is like, doing experiments to make sure that what we're recommending to the four million people that come every month is the best job that we could possibly do is like upholding that trust, not violating it.

AG: Have you thought about bringing in, say, like an ethicist to, to vet your experiments?

CR: To wring his hands all day for a hundred thousand dollars a year?

AG: Well, y'know, you could pay him, y'know, on a case by case basis, maybe not a hundred thousand a year.

CR: Sure, yeah, I was making a joke. No we have not thought about that.

AG: There's one question in the volume of questions that you guys ask that really stuck out in terms of matching people up.

CR: Mhm.

AG: And that is "Is anyone ever obligated to have sex?" Which is one of the questions that is available on the...

CR: I take your word for it, I definitely have not memorized the list, but.

AG: But that question, if people aren't careful, if they're not reading the answers to everybody's questions, that, that question could lead to really big real world consequences if someone says "Yes, people are obligated to have sex in certain situations."

CR: Yeah, this, to get to that match question you would've had to have answered thousands of questions, and you're right to an extent that our experiment overrode that. Again, everyone was notified relatively quickly after the fact, and the match percentage was corrected. And, yeah, we did, we overrode it, I mean that, that's...But again, that's one drop in a puddle of thousands.

AG: I understand that that question presupposes a lot, but it, it is y'know a legitimate concern.

CR: Uh, I dunno if I would agree with the legitimate. I mean there's also a lot of stuff we don't ask, I mean, y'know, so like, we don't know any of our users. Y'know so like we make no claim to the safety for anyone, and obviously we do everything we can to encourage a safe environment. But like, I think it's disingenuous to suggest that we're setting up people in dangerous situations. I think for the five-hundred or so people who sent messages, and again who were all notified afterwards, I think the worst possible outcome was a stale conversation.

AG: This is Alex recording post interview, this is a little meta, just so you know. I just wanted to let you know that this is the point in the interview where PJ, who was in the control room, could no longer keep quiet and had to interject his two cents into the conversation.

PV: Ok, it's not like I'm like a crazy interrupter, I online date, so I have feelings about this.

PV: You're sort of like taking two things that aren't quite the same and putting them together. Like either you're a company that's trying to make the best possible product, or you're social scientists who are doing experiments about human behavior. And if you're social scientists there are guidelines, and there are ethics, and there are things that scientists have to abide by. And if you're a company that's just, y'know, trying to make the best product, that's a different thing. But I feel like in this conflation some of the safeguards that social scientists would have get lost.

CR: But I, I think that's an odd double standard for one thing. And I also think, well, I just don't know why it is okay with people that, y'know, Oil of Olay's advertising is there to make people insecure, basically, about their looks. To make women feel old. Feel ugly, if you wanna use an extreme word. Y'know? Why is that kind of emotional manipulation okay? Nobody bats an eye.

PV: I think people do get upset about Oil of Olay ads. I mean I think what specifically, at least with me makes me feel uneasy is, I don't like the idea that I would be part of someone's research without knowing that I was part of someone's research.

CR: I think part of what's confusing people about this experiment is the result. The algorithm does kind of work, y'know, and the power of suggestion is also there. But like, what if it had gone the other way? What if our algorithm was far worse than random? Then if we hadn't run that experiment, we'd basically be doing something terrible to all the users. Like this is the only way to find this stuff out. If you guys have an alternative to the scientific method, I'm all ears.

PV: Right I mean I don't think it's having a problem with the scientific method. It's like if you guys were a restaurant, and you were...

CR: But yeah we're not a restaurant.

PV: But, just, just in a metaphor.

CR: So, okay. I don't wanna argue by analogy but I'll listen to what you have to say.

PV: Okay so if you guys were a restaurant, and I went in one day and the meal was one way, and the next day you tweaked it, I wouldn't have a problem with that. But if you guys were a restaurant and one day you intentionally gave me food poisoning, and the idea was "Well we wanna know if our food's any good. Like we've like..."

CR: But that's not what we're doing, we're not intentionally giving people food poisoning. Keep in mind it's an experiment. There is the possibility that random is better. So maybe we're giving you the better food. We don't know. You can't know that until you run the experiment.

PV: But like for instan- you know Arthur Aron, the psychologist?

CR: I don't.

PV: He's a guy at SUNY, and he's tried to study what makes people fall in love, and what makes matches work. And he did an experiment where he'd bring two strangers into a lab, and he tried to see, like, if he changed this or changed that, could you increase the likelihood that two people would fall in love.

CR: In a lab?

PV: In a lab.

CR: Okay.

PV: And one of the things he found was that a huge predictor is if you've been told that the other person is attracted to you going in, whether or not it's true.

CR: Sure. In a lab.

PV: In a lab.

CR: Yeah.

PV: The way you're saying "in a lab," it's like...

CR: I just mean like, that's by definition a contrived environment. I mean we have to work in the milieu that we work in. I think findings about how people fall in love arrived at in a laboratory, that doesn't seem really pertinent to real life in any way to me. Not that I'm, I'm not taking any issue with this guy's science...

PV: Right.

CR: ...but you, you understand what I'm saying, right?

PV: But it's like, there was like a whole era of psychology where they cared much less about consent and they cared much more about getting real world results, and people look back at that and they're like "That was not ethically sound." Like Milgram....

CR: Sure. I mean I know Milgram's experiments. I guess people's view on them has changed over time, and I mean people might look back at this stuff differently. But again like I haven't really heard an answer from you guys about how do you change the algorithm in any way, whether it's this extreme way of basically making it random, or just changing the way a variable is weighted. How do you make a change and test it?

PV: If you're going to make a change and test it, there's a difference between "Hey, we didn't tell you that we changed the size of the picture" and "Hey, we didn't tell you that we're giving you different information than what you'd expect." I mean, I kinda think we all have a sense, generally, of where that line is.

CR: But, sorry but that's not an answer to my question. Like how would you run, like for real: you're running OkCupid, we wanna test our algorithm against a nonsense algorithm just to see if it's, if it's better than the placebo.

PV: I wouldn't do it.

AG: Well I would.

PV: You would?

AG: And the way I would do it is I would send out an email to five-hundred users that says "We are interested in running an experiment. We can't tell you what that experiment would be, or....y'know"

CR: So that's informed consent then?

AG: But yes, knowing that we're, we're, we're interested in running an experiment on the results you see. If you're willing to participate we will explain what we've done after the fact." But at least in that case they know that they're being experimented on. Also you can say to them "Some of you will be in the control group." So that gives people the option to willingly participate.

CR: But I guess my point is, I understand that that would be informed consent, but what does that get anybody? Where is the informed part of that?

PV: I mean I think you guys are shifting a norm. I think that there are gonna be some people who look at this and say "I don't wanna use a site whose idea of experimentation is this, like I don't wanna be part of that group."

CR: And that's completely fine, I mean I can't argue with that. And again, I understand that, the misgivings. I think some of it is just kind of inconsistent. And I also think like, look you gotta understand, this is not just OkCupid or Facebook, this is every site.

AG: PJ, you convinced?

PV: No. I'm not comfortable being a guinea pig to the extent that you guys are comfortable making me a guinea pig. But also, if I like break up with my girlfriend tomorrow, I'll probably be back on your site. And you could totally reasonably say "If you don't like the way our company is run go use Tinder or whomever."

CR: Well Tinder also tests, and so does Match and everywhere else. I'm really not trying to be flip. If you wanna think about these things, like, every person using any website bears a small cost. It's a kind of shared cost that everyone bears, each of us being part of these little groups at different times. The upside is that the whole thing works better. And it works better for you too, even having been part of the experiment. It works better for you as soon as the experiment is over.

PV: I do think that a lot of times a company and its users' interests overlap but don't align.

CR: Sure, that's one-hundred percent true.

PV: And so as a user of a lot of companies, I'd rather not be experimented on, because sometimes it's me versus the company.

CR: Y'know, here's a way to think about it. Let's talk about advertising for a second, 'cause I think that's a...Y'know, look, people have become a lot more savvy about advertising than in the 1950s, when you'd see a doctor smoking a cigarette in an ad and think "Oh man, I totally believe that these cigarettes are good for you." And I think we're at the beginning of that process right now with these kinds of experiments. In 20 years I think everyone will be like "Oh yeah, they're just running some experiment," and they'll take all of this stuff with a little bit more of a grain of salt, y'know? If you think that OkCupid has unlocked the mysteries of love and has an ironclad algorithm that can prophetically tell you exactly who is right for you, you're crazy. Y'know? So like, we're doing our best, for sure, and it's the same thing. I think people will realize that that's how these sites work, that's how they evolve, they're doing the best job that they can, and they also have their own interests as well. And maybe that's the process that we're looking at. That's the kind of conversation that I think Facebook by accident, and OkCupid on purpose, are trying to kickstart.

AG: Christian thanks so much for coming.

CR: Yeah, my pleasure.

AG: TLDR is produced and hosted by PJ Vogt and me, Alex Goldman. Our Executive Producer is Kat Rogers, our Engineer is Jen Munson. Our intern is Ethan Chiel. Our theme song is by the mysterious Breakmaster Cylinder. Christian Rudder has a book called Dataclysm coming out in September, but be careful, it could be an elaborate experiment to learn more about your dating habits. If you like the show please subscribe to the podcast. You can find a lot more TLDR at tldr.onthemedia.org. We tweet at @agoldmund, @pjvogt, and @tldr, and we are TLDR.
