Analytical Storytelling (Guest: Karthik Krishnamurthy)

Ben: Welcome to the Masters of Data podcast, the podcast that brings the human to data, and I'm your host Ben Newton. In the world of software consulting and data analytics the company Cognizant looms large. Our guest today, Karthik Krishnamurthy, is a senior vice president of Cognizant Digital Business - AI & Analytics, Interactive and Intelligent Products.
Ben: In his two-decade career at Cognizant he has worked across many strategic areas. One of his focus areas today is analytical storytelling and the human connection, where he has collaborated with Christian Madsbjerg, who was a guest on a previous episode.
Ben: So without any further ado, let's dig in.
Ben: Welcome everybody to the Masters of Data podcast, and I'm excited to have here with me Karthik Krishnamurthy, who is the head of global markets for Cognizant. Thank you so much for coming here and being with me.
Karthik: Happy to be here thanks for having me.
Ben: And we're here in your San Ramon office, it's beautiful outside. We were just talking about what great weather, how beautiful it is here. So thank you for letting us come to you in your office here.
Ben: So Karthik, we'd love to start with just- like we were saying, the humanized view and how you got the way you are. So just tell us a [inaudible 00:01:18] and how did you start in technology, how did you end up at Cognizant, end up doing what you're doing now. Just give us your story.
Karthik: Well thank you Ben for having me first of all. At Cognizant what I do now is head of global markets for our digital business, which effectively covers the work that we do in AI and analytics, product engineering in IoT, as well as what we do for [inaudible 00:01:42] Interactive, which is an agency service [inaudible 00:01:44] marketing and so on and so forth.
Karthik: But the journey to here has been quite the journey for me. I started off in India in a time- I grew up in India rather at a time when India was just about globalizing. The force of globalization had come in and I realized that I could do more with my remote control than just press the first two buttons on my remote control and get my TV up and running.
Karthik: The world actually had more than two TV channels, and that's where- part of that process was where I realized the power of presentation and storytelling, and frankly even before I got into the data space I was actually more of a storyteller than a data guy.
Karthik: I used to write scripts for films, I learned what it means to write a good narrative and how you can hold the audience's attention. And that sort of got into my DNA, so to speak.
Karthik: So when I got into the data space, which was more by force than design to be honest. In India parents play an extremely important role in what you end up doing in your life.
Karthik: So when I went to my mom and told her I want to go into hotel management she wasn't very happy. And every time that we walked into a restaurant and I have the greatest regard for people in this space, but she would look at the waiter that was serving us food and go "So that's who you're going to become."
Karthik: So that didn't necessarily get me started in that direction that way. But you know I went the engineering route got into data and my first ever piece of work was in the mainframe space right. Those glorious Y2K days.
Karthik: Working on assembler code, beautiful CICS screens. So for me, when I then made the transition into the world of data and business intelligence, just to look at a BI app and see what we could do with reporting in BI was just heaven, right, when you come from that world.
Karthik: But through that process I also realized that technologists are extremely good producers, but we're not necessarily great storytellers, and the magic of technology is in adoption.
Karthik: And I realized that if you can actually bring storytelling to this world from a narrative perspective you could really start to make a difference, and a lot of the work that I've done has been sort of at that end of helping clients understand what is the art of the possible.
Karthik: How do you actually drive outcomes with data, and business intelligence, and reporting, and AI, and big data, and all of those beautiful fancy words that we have in our space.
Karthik: And that's sort of how I got into it, and when you really think about it data and analytics is typically the last mile of technology.
Ben: Yeah that is true.
Karthik: Right. It's that part which effectively connects all of technology to outcomes. Because you're supporting or enhancing decision making, and that's all about the human.
Karthik: And to be able to dabble in that space and build a business around that and be able to make that connection back to the human is sort of what I've been doing for a really long time.
Ben: Yeah, I love the way you talk about it Karthik, because [inaudible 00:04:53] the analytical storytelling, because you're absolutely right, what is so often missing from this industry is the ability to take these pieces of technology we built and really think about the humans that are going to be using it rather than how you connected them, where they're at, and what they're doing.
Ben: It's so important, but it also seems like that's key to succeeding now in this [crosstalk 00:05:15]
Karthik: It really is because if you sort of step back and look at what's happening with the world. Everyday the world is getting increasingly more dependent on technology, and people in extremely powerful roles are making decisions on technology without really understanding what's the true ability, or the true impact of technology.
Karthik: So your ability to tell that story, your ability to create that narrative in a way where you're actually able to drive human emotions becomes extremely critical. And this one moment that happened in my life that I can never forget, it keeps coming back to me every time.
Karthik: I was sitting with the CIO of a fairly large bank, and we were talking about big data and all the promises in big data, and everything. He looked at me and said "You know what KK big data at its core scares me."
Karthik: And I asked him "Why does it scare you?" I've heard a lot of things, I've heard people say "Look I don't get enough value out of it." "I don't think we're using it right."
Karthik: And I [inaudible 00:06:13] had somebody come to me and say it scares me. And he said "Look, as a CIO my job is to bring process and structure to everything that we do. And this world feels like the wild wild west, and for me to completely adopt this and be a champion for this, it's difficult for me at my core."
Karthik: And that was a very important moment, right. Because at the end of the day in our space we are helping humans make decisions that at the end of the day impact human emotions and outcomes, right. You're helping a client, or helping somebody make a decision on how to model their sales force and what's the best way to reorient their sales force.
Karthik: But based on for example the output that you're creating in your decision systems. And that could effectively mean that this one sales guy that is sitting somewhere has now got a completely different job to do.
Karthik: So there is a certain human emotion associated with that. Which is why I think that if you start with that, and then you play it back into your technology and you make that connection.
Karthik: Adoption goes up, usage goes up, and you can start to deal with these human emotions of fear, and worry, and security, and so on and so forth much better.
Ben: That's a fascinating way of looking at it. And you know, we were talking about before, one of the ways you and I got connected is I had done the interview with Christian Madsbjerg, who wrote 'Sensemaking', and he recommended you as somebody to talk to.
Ben: Particularly his view on bringing the human to data. I can see now why you guys had a connection there because he's all about bringing human context to data, about how that's missing.
Karthik: Yeah, Christian's on this journey to make science sort of art. And I'm trying to create art in science, when you think about it.
Ben: You guys meet somewhere in the middle.
Karthik: So we always sort of meet somewhere in the middle. We have some fairly healthy debates on these conversations yeah.
Ben: So I mean, one of the things when you and I talked about, I guess, was your passion for using things like artificial intelligence and this technology to solve real-world problems.
Ben: So talk to me about where you're at with this now. What are the problems you're working on that are in this whole space of connecting the human to technology? Where's your head at?
Karthik: So, and for us in data, right, it's sort of been this journey that we've had to take all the way down [inaudible 00:08:46] AI, and when I say that I'm also conscious of the fact that this word gets bandied about so much right now.
Karthik: Every company that you talk to, they slap an AI term on it and then the valuation goes up.
Ben: Yeah like somebody said the other day, one of my other guests said "It's basically- half the time it's just statistics." It's not actually AI.
Karthik: Well somebody told me AI is anything you'd want it to be and I can sell it to you for a couple of million dollars.
Ben: That's right. For a 200x markup.
Karthik: So a lot of the time that I spend is in helping people understand what's the reality, and what's the most practical and applied way of getting to value in this space.
Karthik: And this is not a journey that just started now. Data's always been one of these things that as it came more and more into the spotlight we've had to deal with helping clients sort of work through the hype around it.
Karthik: And I always joke with my friends, four of the most abused words in technology belong in our space. Big data, analytics, artificial intelligence, and data science.
Ben: That's true.
Karthik: So almost all the time we help our clients sort of land the plane, so to speak. And the way that we go about working on it, at least from my perspective, is I start with helping our clients understand what the problem is and why the problem's occurring.
Karthik: And when you start to have the conversation the opportunity completely opens up. I was in a conversation with someone where we had the chance to work with them on what you would call a customer 360 problem. Really trying to understand 360 degree, getting a 360 degree view of the customer, and being able to do some very deep analytics on their customers so to speak.
Karthik: And they were trying to launch all of these multiple projects [inaudible 00:10:35] and my question back to them was "Do you understand your customer problem? What is your customer problem, and why is it that your customers are reacting the way that they are."
Karthik: Now we ended up doing a workshop, a sense making workshop around it. Christian and I actually went for that meeting. The CEO of this company was in the meeting. Right up until then the communication that had come to us was "Look, we know what's happening with the customer [inaudible 00:11:00], we've got a sense of that, let's just build these systems, let's just build these platforms."
Karthik: And we sit in the meeting with the CEO, and this is a CEO of a 25 billion dollar company. And he looks at me and goes "You know what I don't think we have any freaking clue what our customers want. Because the last five products that we've launched have all failed."
Ben: Oh that's sobering.
Karthik: Right. And there was just this moment of reality in the room. And I think that it takes a lot of courage to have that sort of a conversation, and to start there, and then to map all of the insights that you can garner around the human back to the technology at some level. And be able to do that effectively, and often bring that into the technology so that their technology becomes more viable and more aligned, right.
Karthik: And that's sort of what analytical storytelling is. That's where we start, we start with understanding the problem. And then the second piece is where you need to start to talk about storytelling. And then you need to create a narrative of decisions, a narrative that starts with the first decision that needs to be made, right up to the end decision that needs to be made for a particular process flow.
Karthik: So if you're trying to make a decision on, let's say, "Which products do you have to launch in the market?" And you have to build a decision making platform for that, [inaudible 00:12:20] platform for that.
Karthik: There is a set of decision points in that process flow for the decision maker. Understanding what all of that is becomes critical, and also what are the emotions that drive that decision making process becomes extremely critical.
Karthik: So we spend a lot of time trying to understand that, right. So that becomes part of the second step, and within that you've got to tell the story. So that was our biggest challenge, right, how do you get technologists to tell stories.
Karthik: And one way to do that is to find storytellers, and try to latch them onto technologists. In many cases we did exactly that. I came across this from Jeff Bezos, when I first read about it in The Everything Store.
Karthik: He talks about the problem of the narrative fallacy. Which is pretty fascinating. The narrative fallacy is one where you oversimplify an extremely complex thing.
Ben: Oh right.
Karthik: And in the process of trying to tell the story you lose the essence of what you are trying to say. So part of the storytelling process has to be that you avoid the narrative fallacy problem, and you're still able to stitch the full storytelling model.
Karthik: So we started off, for example, one of the things I told the team was look, "Have you guys ever gone through a screenplay writing workshop?"
Ben: Probably not many hands went up for that.
Karthik: No. So we had our guys go through screenplay writing workshops. Just to learn how to tell a story. What are those points at which you actually catch human attention. Every time that you catch human attention, that is a chance that you're influencing decision making.
Karthik: So that's sort of the third step to it, right. And then the fourth step becomes the actual technology itself, which is, there are tons of visualization tools out there, graphical interfaces that you could use, and then you sort of land it there.
Karthik: And that approach becomes extremely critical.
Ben: You know, the way you're talking about it, Karthik, feels like you're- because I think when I first heard you say storytelling I was thinking more like the technological storytelling, but it sounds like you're really telling a human story.
Karthik: Absolutely.
Ben: It's just that you are doing it within the context of technology, but it's still just a very human story.
Karthik: Absolutely. You have to understand what are the points of satisfaction, what are the points of dissatisfaction. When do you start to influence decision making, at what point of the human journey do you start to influence decision making, and then how can you then relay your set of insights to map back to that process.
Karthik: Because there is a really good chance that you could have these ten or fifteen really pretty dashboards, and the most important content that you are serving up is maybe the eighth or the ninth dashboard, but if you haven't understood the human that's making the decision, he's already made a decision at the third or the fourth dashboard.
Karthik: He's only looking for data to ratify the decision he has already made. So that's the fallacy that exists in our environment. And that's where I think the biggest gap exists between people that produce reports and intelligence, and people who actually consume it and translate it to action.
Ben: It's interesting because we actually had an interview with another guy, Alistair Croll, I don't know if you've ever heard of him. He did Lean Analytics.
Karthik: Yeah.
Ben: Yeah, and one of the things he said was at different stages of a company you have the one thing that you have to measure. And it seems like that's part of it too. Because you can be swimming in data, and part of the thing here is not to overcomplicate the narrative.
Karthik: It is.
Ben: What's the thread through this narrative you need to make sure you curate. Because if you just overload with data then people will end up focusing on the one thing they care about right.
Karthik: And still, how do you avoid the narrative fallacy problem, that's the key. I mean the truth is, let's face it, right. We hold the machines and technology to a higher standard than ourselves.
Ben: Yeah that's true.
Karthik: There are a ton of decisions we make that we get wrong, but then when the technology makes a misdiagnosis or gets something 98 percent right we still worry about the two percent problem.
Karthik: And at some point in time, I'm sure, if AI gets to where it's going, the technology's going to turn back and ask us "Why are you holding me to a higher standard than yourself?" But let's assume that doesn't happen, but that's the truth.
Karthik: And that's why we have to build our technologies closer to this aspect of human emotions and how you're actually driving decision making.
Ben: So when you're working through these kind of things and dealing with your clients. What are the biggest challenges you're facing? Because I guess one of those is actually being able to basic communication storytelling. What other kind of challenges are you running into in translating that data and analytics into decisions?
Karthik: Getting people to face their failures is the first biggest issue, and that's what I talked about with this example right. That to me I think is the first biggest problem. It's a human issue.
Karthik: We don't like to face our failures, and to know where those failures are I think is the most important thing, because those points are where you need to make your most important decisions, and that's where you need to target your biggest impact from a data and intelligence standpoint. That to me is number one.
Karthik: Number two is the ability to then tell that story. Which is having the right sort of people that can translate all of the information that flows through the platforms and the systems that we create into the valuable insights that can be tied back to those decisions, right.
Karthik: So how do these sort of information points that we create map back to the decision making process and the various points of insight right. So having that ability to be able to do that. That's sort of the second biggest challenge.
Karthik: The third challenge that I would say, especially in our space where the art of the possible is, and I'm over-exaggerating here to make a point, frankly as large as the universe itself in terms of what you could actually do. Being able to zone in on what makes the most sense from a speed and a time-to-market standpoint is extremely critical. So that's sort of the third big challenge, right.
Karthik: Very often we have this ongoing tussle between speed and craftsmanship. You could take a lot of time trying to get something right, but then there is a certain degree of time-to-market and speed that's more relevant in today's world than it is anywhere else.
Karthik: So as a lot of these ideas and these ideation points start to get translated to implementation and [inaudible 00:19:03] you go through that journey, you very often find yourself in the speed versus craftsmanship conversation, and being able to see through that and still get to where we need to I think is sort of a really important macro challenge, if you will.
Karthik: So these would be three that sort of come to my mind right. And then there's obviously technology. [crosstalk 00:19:25] the world of technology and what we do with that.
Ben: Well, one question on that too, I mean. I've heard it also described in more than one way that there's a tendency to want to get it perfect, thinking that [inaudible 00:19:36] could actually get it perfect.
Ben: Where as if you're taking a product to market, or you're trying to build a model, whatever it is. You actually have to see how it behaves in the real world, with real people, with real data.
Ben: And so in some sense if you take too long you're not actually getting the benefit of that feedback loop.
Karthik: Yep, you're not, that is very true. And I think the more important thing here is relevance and appropriateness. So there are certain types of users and certain types of consumers where getting it absolutely right is extremely critical. If you're working with the CFO's organization and what you're doing is work around helping them understand their financials so that they can report to the street, you want to be absolutely correct with what you do.
Karthik: On the other hand if you're helping the head of sales sort of get a broad indication of product success and start to think about product strategies in new markets you don't need to be absolutely correct. What you need to do is you need to be directionally correct, and in many cases that's where the speed versus craftsmanship conversation becomes extremely critical.
Ben: Right, that does make sense. Well you know, one thing through all of this, and particularly going back to the sense making connection. One part here is that one of the reasons why it's so important to have human context, right, is to avoid bias and avoid ignoring the human complexity, which is what you need to have success in a market.
Ben: But also to avoid making failures based on bias. I mean particularly where you're coming from and you're dealing with these clients, and you're dealing with this yourself.
Ben: How do you address and avoid bias in this whole process? Because those people that you're choosing to be storytellers are going to insert their own humanity into their storytelling. So how do you avoid and address the biases that you're introducing into that process, or does that make sense?
Karthik: It does, and I think this first question of why we should remove bias, I think that's an important question that we should talk about. We all sort of think about bias as a bad thing and in many cases it is.
Karthik: But also when you look at the world of data, and what we've done in data we've always looked for biases, we've always looked for patterns and patterns have exhibited themselves.
Karthik: So there is a certain type of pattern based biasing that at some level is unavoidable in my mind. That's inherent in the system.
Karthik: To the second point, I don't think anybody's found a real answer yet, but the way that we think about biases is to actually use technology to sort of check technology at some level. You could today use algorithms that self-detect and self-report.
Karthik: You could have algorithms that monitor those algorithms so you could have algorithms specifically designed to try and capture biases in the code, biases in the code around race, or ethnicity, or marital status, or gender, or so on and so forth.
Karthik: So as the bias conversation continues to move forward I think it's important for us to understand what part of this is pattern based biasing that is inherent in the system, and that is unavoidable, and that's just the power of patterns and repeatability more than anything else, and then what are those biases which are human introduced which can actually be detected through algorithms.
Karthik: And you can create algorithms to do that. You can have algorithms inspecting algorithms. There are people who have coined terms for it. There's [inaudible 00:23:08] responsible AI, explainable AI and so on and so forth.
Karthik: But all of that is about trying to detect biases with algorithms and decision making.
Ben: That seems, to that point exactly, one of the things that I've definitely heard pushback on with artificial intelligence is this [inaudible 00:23:26] can't explain its own decisions. Because you get a sense that there are technology companies in our space, generally in the software space, building algorithms where they can't actually explain how the algorithm came up with the conclusion.
Ben: And as those become- you said earlier on, technology is such an important part of our life now that there are so many aspects of our life where things are actually being influenced by machine learning, AI driven decision making.
Ben: If you can't actually see how the algorithm got to where it is, where do you even find a bias at that point right? Because you don't even know how it got to where it is.
Karthik: No it is a real problem right. Especially as you get deeper and deeper into artificial intelligence especially around evolutionary programming and deep learning and so on and so forth. You could sort of lose yourself in that quagmire of that process.
Karthik: But to me I think there is a certain balance that we will have to reach, and that balance I think will be achieved when we decide as a group where do we want to place machines in our world. What is the societal position we give the machines is a very important question to be debated.
Karthik: Do we view machines as a set of things that are responsible for making our lives better, they are gadgets, they are instruments, or do we start to elevate the human-machine conversation to one where we consider machines as extended versions of ourselves.
Karthik: And that conversation I think is an important conversation to have because that will then determine what sort of biases are we actually going to be okay with. There is a certain bias that the three of us sitting around this room will automatically have.
Karthik: And we would accept those biases. There is a certain type of food that you and I don't like, a certain food content that's sitting in our food. We think of that food as bad even before we consume it. That's a bias.
Ben: That's definitely the way my kids are. If I cut things up I've ruined the food clearly.
Karthik: So that's the thing, right. So if a machine does the same thing, or a machine goes down that path and gives you an output and tells you "Look, it looks like there's garlic in this food, you're not going to like this food." It may be okay for me to say it, but why is it not okay for a machine to say that?
Ben: Yeah it's true.
Karthik: So that's why in my mind I think the debate and the conversation around what's that societal position that we want to [inaudible 00:25:54] the machines I think is an important conversation. That to me will then provide a certain sense of tolerance around the biases that we are willing to allow machines to have.
Karthik: And then lets have the conversation about the biases that we worry about. And that to me I think is a more manageable problem, you're basically descoping the problem at some level.
Ben: Yeah that makes a lot of sense. Have you ever read Isaac Asimov?
Karthik: I used to yep.
Ben: I read a lot when I was a kid. Did you ever read the Foundation series?
Karthik: I haven't read the Foundation series.
Ben: The reason why I bring it up, and it's come up a couple of times, is when Asimov wrote the Foundation series, part of the thing you realize at the end is that there was actually a split in society, and the society that ended up spreading to the galaxy was the one where AI was only assistive, and it was meant to assist you in driving a ship, or something like that.
Ben: And there was another part of the society where the robots actually became more independent, but they also became slaves. And so you can kind of see clearly where Isaac Asimov ended up on this question, right, but he was basically making the point that the ones where they became more cognizant, they became slaves, and then these humans live all by themselves surrounded by slave robots.
Ben: So clearly not a new question.
Karthik: I mean, if AI doesn't work, this would be, what, effectively the third winter of AI, so we've already had two dating back to the fifties.
Ben: I didn't think about it that way, you're right. Yeah we always like to think our problems are all new right.
Ben: Well I guess going from here where are you going forward from here, what are the new big challenges [inaudible 00:27:36] that you're looking to dig your teeth into in these next few months, years?
Karthik: Well I think there's. Like I said I think the opportunity to actually do work that matters to this world is significant. Working with the United Nations on [inaudible 00:27:51] related problems and creating more insight into something like that.
Karthik: Working with states and governments on helping them figure out answers to human issues. The possibilities of technology are endless. While at the same time I also think- and that's sort of where I spend a lot of my time, and I hope to spend a lot of my time, working with our clients.
Karthik: At the same time I think there is a certain degree of regulation, and control that needs to be put on technology but done in a way where it actually assists the development of technology as opposed to curbing it.
Karthik: And I think there is work to be done there as well. And those are all really interesting conversations for me. To me at its core is the focus on the human, and if we don't realize that to be successful in data you need to be human first, all of these things will not happen.
Karthik: So as you think about human issues, working for example with the government to help understand and predict flood situations by deploying effective artificial intelligence, and predicting water level increases. That could save lives, that's the sort of work that sort of really gets me going.
Karthik: And to be able to have that sort of an outcome, and that's possible today with the volume of data that's present and with the technology that's in your hands. I had an opportunity just a couple of weeks back to go down to the finals of the SpaceX Hyperloop competition.
Ben: Oh really?
Karthik: Yeah because Cognizant was the prime sponsor for one of the universities, the University of Delft that made it to the finals, they were from about 400 universities globally. We came in number two. And just to go there and see the possibilities of what these kids are creating.
Karthik: Just this notion that someday you could go from San Francisco to Los Angeles in thirty minutes, and come back. That basically redefines the concept of interconnected communities and what you could do.
Ben: Yeah that's true.
Karthik: So that's why I think the possibilities are endless. If we can put the focus back on the human and solve the human problems around it I think we can advance the push of technology, and data, and intelligence which is all embedded in that.
Ben: Oh, that's fascinating it makes a lot of sense. Well I mean I think this has been a great conversation Karthik I really appreciate your time, and I wish you all the luck in solving those big problems. I think it's- we'll have to keep in touch to learn more about what you're working on.
Karthik: We should. And one of these days we'll [inaudible 00:30:20] figure out how, what's a better way to present rather than use projectors right.
Karthik: The projector problem is something that nobody seems to [crosstalk 00:30:28]
Ben: Okay I'll sign up for that one. I remember when we used to write with markers on those projector things, absolutely.
Ben: Well thank you so much I appreciate it.
Karthik: Thank you for having me.
Speaker 3: Masters of Data is brought to you by Sumo Logic. Sumo Logic is a cloud-native machine data analytics platform, delivering real-time continuous intelligence as a service to build, run, and secure modern applications.
Speaker 3: Sumo Logic empowers the people who power modern business. For more information go to sumologic.com. For more on Masters of Data go to mastersofdata.com and subscribe, and spread the word by rating us on iTunes or your favorite podcast app.

DESCRIPTION

Our guest today, Karthik Krishnamurthy, is the Senior Vice-President of Cognizant Digital Business - AI & Analytics, Interactive and Intelligent Products. In his two-decade career at Cognizant, he has worked across many strategic areas. One of his focus areas today is analytical storytelling and the human connection, where he has collaborated with Christian Madsbjerg, a guest in a previous episode. In the end, it's all about bringing the human to data.