Bringing the Human to Data (REPLAY) (Guest: Christian Madsbjerg)

This is a podcast episode titled, Bringing the Human to Data (REPLAY) (Guest: Christian Madsbjerg). The summary for this episode is: We thought we would replay one of the most important episodes of last year - the episode with Christian Madsbjerg, the author of "Sensemaking". Christian's point of view has become the basis of so much of what we do on this podcast.

We don't often read books that change how we think about the world. Our guest on this episode, Christian Madsbjerg, wrote a book, "Sensemaking," that did just that. Christian's consulting firm ReD Associates has, in their own words, led a quiet revolution in business thinking. The book is a treatise on Christian's underlying philosophical framework for ReD's goal of bringing the humanities and social sciences into today's businesses dominated by technology, data, and analytics. We hope you will find that Christian's perspective makes you rethink your assumptions about the critical importance of the humanities in today's fast-paced world.

Ben: Welcome to the Masters of Data podcast, a podcast where we talk about how data affects our businesses and our lives, and we talk to the people on the front lines of the data revolution. I'm your host, Ben Newton. I don't often read books that change how I think about the world. Our guest in this episode, Christian Madsbjerg, wrote a book, Sensemaking, that did just that. Christian's consulting firm, ReD Associates, has, in their own words, led a quiet revolution in business thinking. This book is a treatise on Christian's underlying philosophical framework for ReD's goal of bringing the humanities and social sciences into today's businesses dominated by technology, data, and analytics.
I hope that you will find, like I did, that Christian's perspective will make you rethink your assumptions about the critical importance of the humanities in today's fast-paced world. I sat down with Christian in his New York City office. So without any further ado, let's dig in.
All right. Welcome everybody to Masters of Data. I'm so excited to be here with Christian Madsbjerg. I appreciate you taking the time. We're up here at 26 Broadway, which Christian just told me was the headquarters of Standard Oil, Rockefeller's company. So it's cool to be in a historic building like this. Appreciate you taking the time, Christian.
Christian: Of course.
Ben: I'd love to start by talking a little bit about where you came from and what made you who you are, how you got to where you are today. So maybe just start a little bit on that. What got you started on this journey that ended with you at ReD Associates, doing what you're doing now?
Christian: Right. So I am from Denmark originally. And I grew up on a small island south of Sweden, which used to have 50,000 people; now it's got like 30,000. And I was in a way saved by the local library, the local public library, where I ended up spending most of my childhood-
Ben: Your access to the world.
Christian: Exactly. That was the only place you could really meet the world in that quite rural area. And I ended up studying philosophy in Copenhagen and London afterwards. And I thought I would be an academic, because I still have that attitude towards reading and studying and focusing on something for a long time. But when I met academics, they were so unhappy and so miserable that I realized it was probably not going to be a good life.
And then I had to figure out something, and I started a company. Because I thought that the kinds of things I learned in philosophy ought to be helpful to areas beyond academia itself. That it would be nice if some of the things I knew about people, our decisions, and how we find our way around the world could be used by companies that make medicine or technology. And it turned out that hypothesis was true.
And quite quickly, when I was I think 22 or 23, I got my first sort of major clients. I don't understand how anybody would choose to work with somebody that's 22 and looked like somebody that's like 14. So it's a miracle somehow that somebody trusted me.
But over the last 20 years, I've tried to figure out how to use some of the tools from the humanities, and some from the part of the social sciences I'm interested in, in how we make decisions in the world and how we choose what to make. What kind of car should we make? How do we deal with patients and education and all kinds of human questions? How do we inform that in a meaningful way?
That's been 20 years of work or something like that. And I've written I don't know how many books about it, trying to spread the idea that you can think about humans in a very sophisticated way and you can understand us. And that could maybe even be applied and be helpful, not just as academic papers but as products and services and things we make.
Ben: Yeah. One of the reasons why I wanted to talk to you was to talk about your book, Sensemaking. In particular, and you and I were talking about this a little before, it's actually amazed me what a chord it has struck, because like I was telling you earlier, I've seen the book come up in multiple different circumstances without me even bringing it up, without people knowing that I was necessarily going to come talk to you.
It started a conversation with some people at work, and it's come up in multiple conversations. I'd love to hear from you directly: why'd you write the book? Why did you feel like it was the right time to write this book? Because it's not like any other book in this area that I've ever read, and I think that's one of the reasons why I appreciate it. What made you write it?
Christian: Yeah, I guess ... I mean the word that people use about that book is that it's idiosyncratic. That it's sort of original in that sense-
Ben: Yeah, absolutely. I would agree.
Christian: People get very surprised by it. And I think it's because it's angry. It's kind of an angry book, right? It's trying to say, "No, you can't look at people like that." And it was I guess three or four years ago, so 2014 or 2015 or something like that. I was really worried about the way we talked about data and the way we talked about people, as if we were linear and rational, and as if the machines would somehow figure out everything about us through the intake of this new diet of data.
And I know some things about people from the studies I've done, and I think the existential psychology world and parts of anthropology have known for 100 years that that's not the case. So I could see things from a philosophical point of view in the world of technology and commerce, and because few people have that combination, I was able to sort of give it a voice, give things a name.
I was worried about the way we talked about machines and the way we talked about ourselves. I'm still worried. I think it's a little bit better right now, because I think some of the big technology companies have gotten some concerns about the-
Ben: More aware.
Christian: Yeah, much more aware and much more ... The pendulum is sort of coming back to a situation where the two groups of people talk together, people with a sociology background and people with an engineering background-
Ben: Why do you think so? What's special in that way?
Christian: I think sitting in front of a congressional panel is helpful.
Ben: That's true.
Christian: I think critique from advertisers is helpful. And I think the failure of a lot of AI is helpful. The machine just couldn't do those things. That doesn't mean it's not a good idea to research, that doesn't mean it's not a good idea to go down the path of trying to understand things, but there were some quite grand claims, about how all cars would be driving by themselves by last summer, and how data would know us better than we know ourselves, and that just turned out not to be true. The book was a technology critique from someone in technology, and I think that was maybe why people reacted to it.
Ben: That's interesting. And I think particularly in this book, what I found, even hearing you talk about it again now, is that you put more rigor and, in some ways, more heartfelt words around things that I could sense and feel. I wrote down a couple of quotes, and there's one that I really like: we often make really poor decisions just because it's so uncomfortable to do the hard work of thinking. When I was going back through the book, that just struck me. You really nailed some points about how it feels like we put too much confidence in algorithms, too much confidence in data by itself.
And the way you approached that was definitely timely. And to your point, there have been these recent spectacular failures of technology on a national, worldwide stage. I mean, hopefully that's putting the right pressure on these technology companies, right?
Christian: And I think they are responding with some force right now, at least the more sophisticated of them, because they're trying to figure out ... I don't think we have it down, but somehow people that are interested in people and people that are interested in code need to learn how to talk; otherwise we will have another Cambridge Analytica, another big advertising scandal, and big financial crises.
Because the entire financial system right now is propped up by very high valuations of technology firms, and the technology firms are propped up by very high expectations for advertising revenue. And if it turns out that advertising isn't more precise than a banner in the street, then we suddenly have a big financial problem. So I think we need to figure out how to work together, and that's my mission really: to try to figure out how someone with, I don't know, an anthropology background can work with somebody with an engineering background and not misunderstand each other all the time. And certainly not because they want to misunderstand each other, which is often the case.
So for me, the big epiphany ... there have been many, but lately ... I don't know if you know Professor Damasio? He's from UCLA. He did quite an interesting study a while back where he studied people that have had damage to their frontal cortex. And he found that ... So that's the part of the brain, apparently, I'm not a brain scientist, but that's the part of the brain that has to do with moral reasoning.
And he looked at these people that had that part of their brain damaged, so they had only the rational side of their brain left. They could do very basic, you could say animal-type reactions, and they could do rational reactions to things. But they couldn't do deeply human things like moral reasoning. And that meant that they didn't know what to do about anything.
So it turns out that people who do not have a moral way of thinking about the world and do not consider ethics at all, which would be a machine really, are paralyzed by not having the engine of human thinking, which is: what do I think about something? Do I like it? Do I not like it? Do I feel disgusted by something? Do I feel that something is harmful? It turns out that if you lose that part of the brain and you are more like a machine, you get paralyzed, which is interesting from a computer science perspective.
We're trying to build things based on a set of assumptions about human beings, as rational beings and so on, that are untrue, or that, if they were true, would not be very productive. Does that make sense?
Ben: Yeah, it does.
Christian: So the kind of moral reasoning you would have would be things like disgust. You'd be disgusted by eating dogs, or if somebody burnt the American flag or something like that. You would have reactions to that. And they would be: I don't like that. There wouldn't be a rational reason to not like eating dogs. It's protein, it's perfectly fine, but it's not part of your culture.
And another thing is harm. Would this do harm to me, or to someone? If an act you see is potentially doing harm to someone, you would find it morally problematic. And the last one is disrespect. So if somebody's disrespecting you, the elderly, your religion, your time, whatever kind of disrespect it is. If you don't have those three components, disrespect, disgust, and harm, as part of your vocabulary, your brain stops being able to act on anything.
Ben: That's fascinating.
Christian: Isn't it? So you would have all the cognitive capability of processing something, but you wouldn't know whether you liked it or not. And that means you're paralyzed. So what he showed was that there's a fundamental error in the strain of philosophy that goes from Plato to Descartes and on to computer science today, a lot of the AI thinking: that we are fundamentally thinking, rational beings that can weigh options against each other. Take economics, it's based on that. I mean, there's a little bit of new help now from behavioral economics, but in general, it's based on the idea-
Ben: [crosstalk 00:12:30]
Christian: That we are totally rational beings. And it turns out in this experiment-
Ben: We're really not.
Christian: We're not. We are also rational beings, but without the engine of moral and aesthetic decisions about things, and reactions to things, we are stuck. It's sort of a beautiful thing, in a way, that that's the case. The passions can't do without reason, and reason can't do without the passions. And that's what I think we should do in our companies as well. We should combine the two.
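(In engineering terms, the point is that pure calculation has nothing to maximize without evaluative reactions. Here is a toy Python sketch of that paralysis; the options and value functions are invented purely for illustration, and this is not anyone's real decision system.)

```python
# A toy illustration: an agent that can enumerate and compare options,
# but has no evaluative reactions (like/dislike, disgust, harm,
# disrespect), has no basis for ranking them and therefore cannot act.

def choose(options, value_of=None):
    """Return the highest-valued option; with no value function there
    is no 'best', and the agent is stuck."""
    if value_of is None:
        return None  # nothing to rank by: paralysis
    return max(options, key=value_of)

options = ["take the job", "decline the job", "ask for more time"]

print(choose(options))                # None -> every option weighs the same
print(choose(options, value_of=len))  # any valuation, even a silly one, unsticks it
```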
Ben: Yeah. So in some sense ... The way I hear you describe it is that we're making these products and these technologies, but we're forgetting what it is to be human. And we lose that connection to what it is to be human.
Christian: Yeah. Why?
Ben: Yeah.
Christian: I mean now your cat can be connected to your wrist, can be connected to your refrigerator, can be connected to Amazon, you can be connected to your car, but why? What's the point? Right?
Ben: Yeah.
Christian: For what purpose would that be helpful? And I think if you look at the Internet of Things, with the sensors we have, we can enable anything now quite cheaply. But why? Why would we do that? And I think there are many good reasons. You could do many things in health. You could do many things in safety. But we need to think about that, and it's not obvious what those are.
Ben: I wonder ... This is a little bit of a tangent, but you made me think of something I've thought before: I wonder if that's why a lot of these futurists sometimes get technology adoption so wrong. Is it because they don't think about how the human will react to it? I think back to my favorite movie, Back to the Future. When I was a kid, I thought we'd be in flying cars by now. But that aside, it just assumes that technology is going to go on a straight path. The reality is that when it comes into contact with humans, humans are going to use it in a way that makes sense for them, in a way that actually satisfies a real need. They're not going to use it just because it's-
Christian: Because you can.
Ben: Yeah, exactly.
Christian: I love that film.
Ben: Yeah, okay. It's one of those things that defined my childhood, that and The Goonies. That's like two films from ... I guess that also gets back to something else that particularly got me into the book in the beginning, which is your idea of thick versus thin data. And I think that's where, in particular, you talk about how you can make poor decisions if you don't have the context about the data, the cultural context. But help me talk a little bit more about that, like how you differentiate between this idea of thick data and thin data.
Christian: Right. So thick data, of course, comes from the anthropological tradition, where they talk about thick description. That's a description of a cultural context. We just called it thick data because it would relate to big data. That's sort of the ... I don't know. But the idea is that I subscribe to what you could call a rich-reality view. When you walk into a room, you can measure the room, you can see what temperature it is, what the humidity is, which is too high in this room right now. We can measure a room in quite some detail, but it would never give you the soul of the room and the history of the room-
Ben: Right.
Christian: Or the feel of the room. And all rooms have a mood and a feel that you can sense if you're a human being, which comes from its history and the other people in the room and so on. The same thing if you take driving a car in the mountains: if you take all the sensor data and all the GPS data and all the data you could possibly get from that situation, you still wouldn't describe the joy of driving in the mountains with very much precision.
Ben: No.
Christian: What that feels like, what that is experienced as. So a thick description is ... It's small sets of data that describe in depth what something is experienced as. What is it like to be a patient told that you have stage four cancer? I think very little data can describe that.
Ben: Yeah.
Christian: Or, what is it I'm interested in? What is it I'm looking for in my book searches on Amazon? It seems like they've had some time to gather data about that, like 20 years for instance [inaudible 00:16:37] significant amount of data, and still they're almost always wrong about what they suggest to me. Why? Because my search for things and trying to understand things is not linear. It's not that it comes after the last thing I looked at. So if I just went and looked at a book about the Gulag in Russia, that doesn't mean I'll read another historical novel about terror regimes in Europe in the 1940s and 50s, right?
Ben: Right.
Christian: So that is more a description of how we ... A thick description would be a description of how someone learns, and the thin description would be the acts and activities you would do in that situation. Or, the experience of driving in the mountains is thick data; that you are at exactly-
Ben: [crosstalk 00:17:26]
Christian: At one point, driving this fast, and so on. Those are all important things, but they're not describing the experience. And saying that you can go from thin data and extrapolate or correlate your way into thick data is wrong. A lot of algorithms are built on the idea that you can-
Ben: Without having a cultural context.
Christian: Exactly, without the context.
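(For readers who think in code, here is a minimal sketch of the thin/thick distinction, using the mountain-drive example from the conversation. The field names are invented for illustration and are not drawn from any real telemetry schema.)

```python
# Thin data: abundant, precise, context-free measurements.
# Thick data: sparse, contextual, interpretive description.
from dataclasses import dataclass

@dataclass
class ThinData:
    """What the car's sensors can log about the mountain drive."""
    speed_kmh: float
    latitude: float
    longitude: float
    altitude_m: float

@dataclass
class ThickData:
    """What an observer would have to describe about the same drive."""
    experienced_as: str    # e.g. "the joy of taking the hairpins at dusk"
    cultural_context: str  # why this drive matters to this driver

reading = ThinData(speed_kmh=62.0, latitude=46.02, longitude=7.75, altitude_m=1890.0)
print(reading)

# The claim in the conversation: no correlation or extrapolation over
# ThinData fields yields a ThickData instance. The gap is structural,
# not a matter of collecting more rows.
```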
Ben: In other words, the algorithms are going to be an important part of our society going forward, because they do provide benefits, but as you said, they can end up in some really wrong places because they don't have the cultural context. So if you're out there designing these algorithms, how do you get the cultural context in, and in the right ways? Is it by getting the right people on the team to talk about it?
Christian: Right. Yeah. So let's take a case, a banal case, I don't know how banal it is: advanced algorithms plus psychometrics, which is the Cambridge Analytica case. So Cambridge Analytica used the Big Five, the big five characteristics of a personality, which has just been normal psychology for 50 years; it's probably the most well-studied area in all of psychology. And they used that, which didn't have the purpose of tanking our elections, but they combined it with data sets that they apparently got from Facebook. And that led to the situation we've had with Cambridge Analytica. Neither coming up with the Big Five nor coming up with the algorithms was mean-spirited.
Ben: No.
Christian: They were science. But they were put together in a rather mean-spirited way. And that means that we should be quite careful about how we use these things. The algorithms could have biases in them that we need to think about, and they could be used for things that are dangerous or unpleasant.
Now, how do you do that? How would you look at that? Well, I think, and this is a discussion we could probably spend two hours on, but I think qualitative should come before quantitative. A lot of people disagree with that. But I think getting a broad understanding of a topic before you start designing precise algorithms, or precise questions, or precise categories about something is helpful.
So I like studying ... If you want to make algorithms about shopping, I want to understand how people shop today and which categories would be meaningful to people, so that we can then design algorithms around those, rather than starting with the algorithms and then, I don't know, doing some focus groups on top of that. So I think the trick is that you need to use context people. That would be social psychology, anthropology, some political science, some sociology.
People from that crowd would be able to say, "Here is how people are experiencing X." And then the data scientists could say, "Well, we actually have proxies for some of this already. We have a feed of data coming from here, and it could be used in this particular way. Let's test it." And then they could test it in all kinds of ways after that. So I think qualitative before quantitative is quite important.
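(As a rough sketch of that hand-off, and only a sketch, since Christian is describing a way of working rather than an algorithm: every category and proxy name below is invented for illustration.)

```python
# Qualitative before quantitative, as a three-step hand-off.

# Step 1: context researchers (anthropology, social psychology) propose
# categories that are meaningful to the people being studied.
qualitative_categories = [
    "stocking up for the week",
    "browsing for inspiration",
    "emergency replacement trip",
]

# Step 2: data scientists map each category to measurable proxies they
# already have a data feed for.
candidate_proxies = {
    "stocking up for the week": ["basket_size", "visit_regularity"],
    "browsing for inspiration": ["dwell_time", "category_breadth"],
    "emergency replacement trip": ["single_item_flag", "off_peak_visit"],
}

# Step 3: only then are the precise, testable hypotheses designed.
for category in qualitative_categories:
    proxies = candidate_proxies.get(category, [])
    print(f"Test whether {proxies} can distinguish '{category}'")
```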
Ben: If I'm understanding you right, and you talk a lot about creativity in the book, which I definitely want to talk more about, is that where the creativity comes in? Because you have a qualitative understanding, your human creativity is connecting the dots, so you actually know where to do the quantitative analysis. Is that-
Christian: I don't think people from my world are any more creative than anybody else. Engineers are some of the most creative people I've ever worked with. Scientists, some of the most creative people I've ever worked with. But it's how you connect a broad understanding of, say, how kids learn to how the algorithms that are put into the learning material are designed. That's really hard.
And that combines two types of knowledge. It combines cultural knowledge, knowledge about pedagogy, and kids, and humans, with technical knowledge about how to design algorithms. And that combination is the killer combination, I think. That is the way that great technology will be built. But it's really hard. And that means that creativity will have to happen between at least two people, right? You have to have-
Ben: Can't do that by yourself.
Christian: Very hard, and you have to work with others, which is horrible. And you have to work with somebody with a different language and a different background and a different ... That's why we end up in the trouble we're in, because that is really hard. I think that's the creative task we have at hand, because technology is so important now.
Ben: You know, that actually makes a lot of sense, because I've said to people before ... Not that I was perfectly doing this, but it's easy to be afraid of putting your ideas in front of someone because you're afraid they're going to reject them. And there's a very human desire to keep them to yourself until you're ready. But I feel like I can't come up with good ideas unless I'm working with other people. Because when you have to explain your ideas to someone else in a language they can understand, you usually end up in a better place.
Christian: Absolutely.
Ben: Yeah. And it's easy to forget. And maybe that's one of the things about teamwork that doesn't always get communicated. It's not just the fact that you're working with each other and trying to compromise on everything, it's that having to communicate across these human barriers actually makes your ideas better, right?
Christian: Yeah. I think there are two skills that we all need to do that. The first one is vulnerability: the ability to open up and show your ideas to others and have them shoot at them, and be comfortable with that. And the second one is humility: maybe, just maybe, somebody else knows something about this that you don't. And I see a lot of arrogance from both sides.
So vulnerability and humility towards each other, from the culture of engineering and science towards the culture of the human sciences, you could say, is crucial. And I think the human sciences are so arrogant, and I think the engineers have had a pocket of arrogance in the last 10 years but are normally not deeply arrogant. I don't find the engineering culture deeply arrogant. There's just been a time where they've made so much money and been so successful that they've been kind of arrogant towards the rest of the world. And that's why I wrote the book.
Ben: You know what's interesting? Where I come from, my background, like I told you before, is originally in physics and math, but I've always been a musician. I remember a joke in undergrad: depending on who it was, I'd let girls know I was in the music school instead of the physics school, because I thought they would like me better. But the thing is, in the background, I had these two competing natures. And when I was younger, I thought that those were competing, that they weren't complementary.
But you bring to mind my favorite physicists, Albert Einstein and Richard Feynman. They were both incredibly creative and were actually very in tune with a more ... I don't know the right word for it, but a more humanistic side. Albert Einstein was a musician, and Richard Feynman was a musician and a lot of other things, and they thought very creatively. And to me, it seems like part of what you're talking about with creativity is that ability to make connections-
Christian: Yes.
Ben: And not just in yourself, but to make connections with other people, because you make connections you didn't think of. But that's-
Christian: What I call sensemaking. That's the point.
Ben: Exactly. To that point, I think that's where ... When I was reading the book, I remember nodding my head and being like, "Okay, that makes sense." But then we got to your chapter on the stages of maturity, where people go from basic understanding to mastery of a subject, and how it becomes intuitive. And you talked about jazz bands, and about how Miles Davis said you play what's not there, not what's there, and that takes a certain expertise. I think that really connected with me.
There was actually somebody else on a previous podcast, Matt Valentine and he talked about data jazz. And I was like, "Yes, that's it."
Christian: Exactly.
Ben: It's amazing to think about that. But then, and you talked a lot about this too, it seemed like it wasn't just the cultural aspect, but the fact that there wasn't always an appreciation, particularly in technology circles, for the value of wisdom and experience and being able to make those connections.
Christian: There was a total dismissal of it, and arrogance towards it. The rest of the world will be disrupted, was the story. And we could put a robot to run this country much, much better than the democratic system. Just idiotic ideas that are so uneducated. My feeling when I came to Silicon Valley, which I do quite often, was: you guys are such smart idiots. You're so incredibly deep and talented in important areas, but have absolutely no clue in others.
And I feel the same. When I go to Silicon Valley, I know some things about humans, but I don't understand what on earth you're talking about when you start talking shop about math and whatever you talk about. And I just thought, wouldn't we ... We could work together. So that's the idea behind the company I have. That's the idea behind the book.
Ben: And to that point ... This is one of the things I took from both Einstein and Feynman when I was younger, and it's only become clearer to me as I've gotten older. If you actually go and read Feynman's lectures, they are not complicated and overloaded with technical language. One of his gifts was explaining very, very complicated subjects in a language that everybody can understand. And I think there's a tendency, probably in any subject, but I've definitely seen it in technical subjects, where if I use the lingo and the technical words, I'm using them as a shortcut, but I'm also using them to exclude people. And if you can take ideas and package them up in a way that is understandable to a broader group of people, that's actually a deeper level of understanding.
Christian: But it's because you understand it that you are able to simplify like that. I mean, the great test of that is: can people write? And your two heroes could write, and had a way with words that might not have been as elegant as the way they dealt with math, but certainly was up there. And we quote Einstein's words more than his math today.
Ben: I hadn't thought about that, you're right. And he said things that were of extraordinary power.
Christian: So when we get a new batch of kids from Yale and Princeton, the first thing we do is teach them how to write, because they don't know how to write when they come out of their classes. We teach them how to make shorter sentences, how to put every sentence on trial for its existence, to stop using jargon, and to be precise when they write.
We normally say, "Write like a heat-seeking missile." Write for the substance: what is it you want to say? And that teaches people how to think, and that's common across all areas, whether that's physics, or engineering, or sociology, or philosophy. That's where we can connect. We can connect on the ideas and the expression of those ideas.
And I think there's no reason why a physicist couldn't write better than an English major, because if they know how to think, they also know how to write.
Ben: It is interesting you say that, because I don't think there was much emphasis on writing when I was in school. Luckily, I took a couple of humanities classes, but I don't really feel like I learned how to write until I got out of school.
Christian: If you learn how to write, you also learn how to think. Writing is a mirror of thinking. And it's the hardest test of your ideas, because you can read them back and say, "What an idiot wrote these things!" If the sentences are clunky, if there's too much fat in the text, it's probably because you didn't think it through. But if you learn how to write straight, you are unstoppable.
You can present to people, you can think straight, you can slice up people's arguments when you hear them. It's a way to exercise your muscle of thinking. And writing is something we should teach much more than we do today, not just in school but also in companies-
Ben: Yeah. [crosstalk 00:29:57]
Christian: Because the amount of bad writing, the large decks that people produce and just circulate, is an endless hearsay of nobody really thinking about anything.
Ben: That's a really good point. I definitely find that I don't really feel like I've come to grips with my own ideas until I try to write them down. Talking through them helps a lot, but until I actually have to write it out and-
Christian: [crosstalk 00:30:21]-
Ben: Express it concisely ... Yeah, and I think it's definitely a lost art. Well, one thing I definitely wanted to touch on, because it was my favorite chapter heading in the whole book, is what you said about design thinking. You called design thinking the bullshit tornado. I found that pretty funny. And I think it's definitely connected to this creativity.
Design thinking was definitely a huge phenomenon in the design community. Talk a little bit about why you thought it didn't work. How do you think design thinking fails?
Christian: Right. Well, it's still around, unfortunately. It's not as strong as it was five years ago, but it's still around. It was the idea that we're all creative. We're not all creative, just like we're not all great basketball players or great at understanding the market. We are not all equally good at cracking hard problems. Einstein was better at that than other people.
So the idea in the design community that came out of the 80s and 90s was that if you just have a process with enough Post-it notes and enough beanbag chairs and enough primary colors on the walls, everybody, even somebody in finance or in HR, would be able to solve the biggest problems in the world. It's just a big lie. It's untrue. But they sold it as a way to almost manufacture creativity, as if creativity were something you could manufacture in a linear process. And that's just not my experience.
My experience is that it takes a lot of preparation and homework to be able to solve new mathematical or technical problems. If you don't know the basics, you're probably not going to be successful. Just like you probably shouldn't think you'll get to play at the Blue Note before you've learned to play the notes and read sheet music. So the claim about design thinking is that it's easy, it's fast, and it never fails. And I think those are untrue. It's really hard, it fails most of the time, and there is no linear process with which to do it.
And the design world then blew so much hot air into that balloon, probably that's our CO2 crisis right there, such volumes of hot air through magazines like Fast Company and Wired and others, and the blogosphere, that somehow people thought of it as a silver bullet to solve world hunger, the next SUV, and the next piece of software. And it's not the case. It's simply untrue. It was a big marketing stunt coming from the design community.
And in the middle of that, unfortunately for me, they imported some ideas from the world of anthropology, so ethnography, ethnographic data collection and so on, in very light, superficial formats. So now people say, "Well, we have done ethnography already. We've had some design firm come and do ethnography." And that's unfortunate, because the tool was kind of helpful and shouldn't be treated that way.
Ben: I see. One thing, when I was going back and looking through this again, and this is why I thought it was good to go back through the book, is when I was reading this part and reading your funny story about the guy Martin-
Christian: Because you met him, right? You've met a version of-
Ben: Yeah, I met a version of him, definitely. But when I read a little later on, I really connected with what you said about falling in love with the subject you're designing for, and actually giving a damn, as you said in the book. I think that often gets forgotten. If you actually want to create a great product, if you want to actually satisfy a need for a customer, you can't do that if you don't care deeply about it. You're going to create something mediocre; you're not going to really solve the problem. And I thought that was a pretty profound way to think about it, because-
Christian: That gets back to AI. Before we've created a machine or an algorithm that cares, that has social anxiety, that cares deeply about X over Y, we won't have something even close to what humans can do. At the heart of being human is caring about something, thinking that something is more important than something else, and that's not an algorithm. That's the hard problem about humans, and why we're such magical creatures and such sophisticated creatures.
And what I disliked five years ago about the AI community was, "Oh, we have that one down. Don't worry about it." No, you don't have that one down. As far as we know, the most advanced thing in the universe is the human brain. And if you take two of them and put them together, wild, exponential things can happen in terms of complexity. And then if you take six or seven billion of them and put them together, it's way more advanced than some search mechanism in Silicon Valley.
There's a lack of awe, and a lack of understanding of how much the human mind can do and how different that is from rational calculation. So they say, "Oh good, Deep Blue beat Kasparov, so now machines are better than humans at playing chess." You could say sure. You could also say the machine didn't play chess as in wanting to win, as in being defeated, as in thinking about it in its historic context, as in thinking about the moment of it. No-
Ben: It didn't have the experience-
Christian: It didn't have the experience of playing chess. So is it better than humans at playing chess? You can say that, and I think it's a major technological breakthrough. But saying that because machines are better at chess, they're better at everything, I'm not so sure about that.
Ben: Yeah. I think it's a profound insight to really think about that, because there is a tendency to draw a line, to make a connection, where there really isn't one. For whatever reason, it makes me think, I read a lot of Asimov books when I was-
Christian: [crosstalk 00:36:35]
Ben: Isaac Asimov. And it's interesting, he touched on a lot of this, what is it, like 60, 70 years ago now. But you can build AI as complex as you want; can you really replicate the human experience, what it is to be human? I haven't really seen that yet, right?
Christian: No. Some philosophers would say no. I mean, the soft claim would be, "Okay, show it to me," right? And the hard claim would be, no, it's never happening. Those are the two negative positions on AI. I don't know if it's one or the other, but it certainly is hard; we can say that much.
I think there's something to say about care. Heidegger, the German philosopher on whom I base a lot of my work, said that at the heart of the human experience is not thinking; it is caring about something. So what makes us human is not the ability to think linearly about things. That's not the only thing that I [inaudible 00:37:28] at least; it is that we lean into some things, we're more interested in some things, we're attracted to some things. And that's very different from an algorithm that weighs data.
If you think about it, what would be the situation if data had free will? It would do things like human beings do: nonlinear, crazy things. It would suddenly vote in a different way than expected. It would suddenly live, or eat, in a different way than expected. What would that be like to code? Difficult, much harder than if it is deterministic or probabilistic and sort of predictable. It would be completely unpredictable. I don't know how you would do that.
But that is what humans are: completely, wildly crazy in the things that we do sometimes. I mean, there are a lot of things that are very probabilistic, and you can make bell curves about how many do this and how many do that, but there are many things we would never, ever have been able to predict, including Brexit and Trump and all kinds of other things, right? Why did that happen? It's not something an algorithm could have predicted.
Ben: No, I mean you have to actually understand something about the humans behind the scenes. Well, I guess to wrap up, I mean what's next for you on this? I mean this is ... Like I said, I think this book is only going to have more of an impact particularly with some of the things like we mentioned that have happened recently. What are you thinking next? What's your next project like this?
Christian: I want to build things. I want to build patient programs, for diabetics who can't figure out how to live with their disease. I would like to build systems for depression, because maybe I understand some things about depression that we could do very helpful things with, with the technologies at hand. I would love to deal with education as well: how our kids learn, and using that understanding of how kids learn to build it into the algorithms that are going to run the platforms on which we're going to teach them.
So my task now is to try to get skilled, talented engineers with your background to work with skilled, talented people with my background, which is really hard. But that's quite a meaningful task for me. And that's what we're trying to do here.
Ben: I think that's a pretty lofty goal. And I'm excited to see what you guys do next.
Christian: Let's see.
Ben: Well, thanks again for your time Christian. I appreciate you taking the time to talk with me.
Christian: Sure.
Ben: All right everybody and thanks for listening. And check for our next episode of Masters of Data.
Speaker 3: Masters of Data is brought to you by Sumo Logic. Sumo Logic is a cloud native machine data analytics platform delivering real time continuous intelligence as a service to build, run, and secure modern applications. Sumo Logic powers the people who power modern business. For more information, go to sumologic.com. For more on Masters of Data, go to mastersofdata.com and subscribe. And spread the word by rating us on iTunes or your favorite podcast app.