Critical Thinking, Ethics, and Analytics (Guest: Alistair Croll)

This is a podcast episode titled, Critical Thinking, Ethics, and Analytics (Guest: Alistair Croll). The summary for this episode is: Our guest today has had a long and varied career in technology. Alistair Croll is a serial entrepreneur, a much sought-after speaker, and a prolific author, including the bestselling Lean Analytics, which is a must-read. He is an organizer of conferences like FWD50 and a consultant to companies large and small. Alistair is a big thinker in the area of data and analytics, and we got to cover a lot of ground in this episode.

Ben Newton: Welcome to the Masters of Data podcast, the podcast where we talk about how data affects our businesses and our lives. And we talk to the people on the front lines of the data revolution. And I'm your host, Ben Newton. Our guest today has had a long and varied career in technology. Alistair Croll is a serial entrepreneur, a much sought-after speaker, and a prolific author, including the bestselling 'Lean Analytics', which is a must-read. He is an organizer of conferences like FWD50 and a consultant to companies large and small. Alistair is a big thinker in the area of data and analytics, and we got to cover a lot of ground in this episode. So, without any further ado, let's dig in.
Welcome everybody to Masters of Data, and I am excited to have my guest here today, Alistair Croll. He has spent a lot of time in the technology industry doing a lot of different things. You’ve been a serial entrepreneur, a speaker. You’ve written four books including ‘Lean Analytics’, which I love. You work with startups and established companies on just making their business better and doing business model innovation. Alistair, thank you so much for coming on the show today. I really appreciate it.
Alistair Croll: No worries. Thanks for having me, Ben.
Ben: With that intro, Alistair, like I said, I think you've actually been described as having a little bit of career ADHD, which I can definitely identify with: doing a lot of different things, having a lot of different interests. Tell me a little bit about how you got to where you are. Where did you start? How did you get into the industry? And how did your career develop?
Alistair: So, I started out actually with a misspent youth with an Apple II computer and not enough time in the sun. So, that tells you how old I am. And I got a job in the telecommunications industry out of school, working for a company called Eicon Technology. Actually, it was kind of cool. Every time you dialed one of those AOL modems with that CD-ROM that Steve Case stuck to every magazine he could, you were dialing one of my modems. I was the project manager for the dialup access concentrators for AOL. And eventually, that got into… Because when you're dealing with networks, especially back in the '90s, everybody wants more performance. You have to have policies that say how you handle different kinds of traffic. So, voice, and video, and network management traffic have policies about them. And those policies come down to network performance. Then I got into network performance. And then I founded a company called Coradiant with some friends of mine that built probably the first ever real user monitoring product. If any of your listeners are familiar with BMC, which eventually acquired the company, their product line for management is called TruSight. So, we built TruSight and sold it to BMC. And because that was all about measuring real users, we wound up talking to a lot of people who cared about web analytics.
So, web analytics showed you what people did, and TruSight would show you if they could do it, which is a different but important question. And maybe the conversion rate you're trying to get is not what you're getting because people keep getting a 500 error. And maybe you don't know that because the JavaScript never loads on the page, so you don't see it in analytics, for example. And so we built tools that would show you that stuff. And that kind of got me into the web analytics world, and then I wrote a book called 'Complete Web Monitoring' about how to measure your online presence. Eventually that was a slippery slope towards both running conferences and writing about data and analytics.
Ben: And that's actually… even the Coradiant piece, I think we talked a little bit about this before, but that's where your and my worlds definitely touch, because I then ended up working some with the TruSight product at BMC. And I remember… you're absolutely right… that whole connection with the user experience. I think it's come full circle for me, because the idea that you have to drive a differentiated customer experience, and how important that is to succeed in the marketplace, is more important today than it's ever been. So, it's good work that you guys did.
Alistair: Yeah, when the customer sees… If you go to a store and you have a bad retail experience, the person on the other side of the counter knows you had that bad retail experience. This idea that you can go online, have a bad retail experience, and nobody knows about it is infuriating to customers. I was literally… I was on the phone with a bank today, in fact, trying to get insurance. And this is my bank. I was just trying to get home insurance, right? And I've used entirely online systems that just worked. And with this one, I had to go through four different people. And then they're like, "Oh, you're a university graduate? Well, you have to tell us the name of the director of the program for home insurance inspections." And I'm like, "What?" And they said, "Well, to prove you graduated from that university." I said, "I graduated from that university, but not with a degree in home insurance inspection." There's nobody that's aware of how some guy is getting transferred across three different voicemail operators, misclassified as a graduate of some home inspection program, and then getting accused of lying. The user experience is so critical. And getting back to the topic of data, the only way you can get there is through data. And so often, we ignore that data at our peril.
Ben: Yeah, absolutely. Wow, that's a crazy story.
Alistair: The funniest part is I went and Googled the guy, and I told her. And she’s like, “Oh, so you do know him.” I’m like, “No, I know how to Google things.”
Alistair: I probably shouldn’t troll service personnel so much.
Ben: Well, I guess you proved you know how to use the internet, so that should be enough, right?
Alistair: Exactly.
Ben: That's crazy. Well, we were talking about some of the things that we could discuss today, a lot of the things that you're working on, and I definitely want to touch on a few of them. One of the things that really caught my attention is some of the work you've been doing at Harvard, a class that you've been teaching over the last couple of years. And it really connected with a lot of other discussions that we've been having. So, I'd love for you, Alistair, to just dig into that and tell us why you started it, what you've been teaching, and what's come out of it for you.
Alistair: Sure. So, full credit, the professor who teaches the course is a guy named Srikant Datar, an amazing thinker, who teaches in the design school there, and he asked me to be a visiting executive to put together a course on data science and critical thinking for MBA students, which is refreshing. Because we need data not only in public policy fields but also in places like business schools. If you're managing with data, the number of ways you can make stupid mistakes is unbelievable. And really the course is sort of like a stupid human tricks class. It's full of… We try to give the students frameworks to understand data and how to analyze it.
And we do get into things like what's the difference between random forests and clustering, and so on, so they understand it and can articulate it. But a lot of this is: how do you know the data is real? Before you collect any data, what questions are you trying to answer? How will you know if you were successful? How should you store it? How should you retrieve it? How should you display it? What new experiments should you run? So, there's a whole workflow or life cycle for data. And we go through each of those stages and look at the places where things can go horribly wrong. One of my favorite examples: there's something called Benford's Law.
If I had nine buckets numbered one to nine, and I went through Ben Newton's credit card transactions, and I took the first digit of every transaction… So, I just go through your credit card transactions, and I take the first number, one, seven, four, whatever they are, and I put them in those buckets. Most people would expect the buckets to fill up roughly evenly. So, as many numbers beginning with a one as with a seven as with a four, and so on. The reality is that for almost every set of naturally occurring numbers, you get about 30% of them beginning with a one, and about 20% beginning with a two, and it goes down from there. It's called Benford's Law.
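The distribution Alistair describes has a closed form: the probability of a leading digit d is log10(1 + 1/d). Here is a minimal sketch in Python; the sample data (powers of two, a classic Benford-conforming sequence) is chosen for illustration and is not from the episode:

```python
import math
from collections import Counter

def first_digit(x):
    """Leading nonzero digit of a positive number."""
    s = str(abs(x)).lstrip("0.")
    return int(s[0])

def benford_expected(d):
    """Benford's Law: P(first digit = d) = log10(1 + 1/d)."""
    return math.log10(1 + 1 / d)

def digit_distribution(values):
    """Observed fraction of each leading digit, 1 through 9."""
    counts = Counter(first_digit(v) for v in values if v)
    total = sum(counts.values())
    return {d: counts.get(d, 0) / total for d in range(1, 10)}

# Powers of 2 follow Benford's Law closely: digit 1 leads ~30% of the time.
sample = [2 ** n for n in range(1, 200)]
observed = digit_distribution(sample)
for d in range(1, 10):
    print(d, round(observed[d], 3), round(benford_expected(d), 3))
```

A fraud check like the Greek-accounts example amounts to comparing such an observed distribution against the expected one and flagging large deviations.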
Now, that sounds like a neat magic trick. But when Greece wanted to enter the European Union, it filed a bunch of financial data. And one economist said, "Hey, wait a minute. This stuff doesn't match Benford's Law. I think it was randomly generated." And they ignored him. And you can argue that Brexit is a direct consequence of that kind of oversight. So, these little tiny things have huge, wide-ranging consequences. And they often feel like dumb little magic tricks or cognitive biases. But just getting people to ask questions about the data they're looking at, and what it means, and how it's being used is incredibly important.
Ben: So, I guess that's the critical thinking part of the class, then. You're trying to equip these MBA students.
Alistair: Yeah. I'll give you one more quick example. During World War II, bombers were coming back from the front riddled with bullets. And so the Allies got together and said, "Let's analyze these and figure out where the bullets are hitting and how we should better armor the planes." And they had a whole plan for where to put the armor. And one statistician, I think it was in the eastern US, said, "Wait a minute. We should put the armor where there are no bullet holes, because the planes hit there don't make it back." It's pretty obvious once you say it, right?
It's a problem called survivorship bias. We tend to analyze the survivors and overfit to the consequences. But if you're a business person, you're making decisions based on, "What are my customers like?" When you're trying to grow the number of customers you have, yeah, you should look at what your customers are like. But you should also look at what the people who didn't choose to buy from you have in common, because that probably tells you about your offering.
And it’s often more useful. It’s better to analyze the thousand handsome men who went to LA the same month Brad Pitt did and didn’t get an acting career and ask them what they wish they’d done than to talk to Brad Pitt because he’s going to have a whole bunch of different histories. So, I think just getting people to think in these critical ways or be aware of how they have to work with data and with information is hugely important today.
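The bomber story can be sketched as a toy simulation. Everything below is invented for illustration (the sections, hit counts, and the 40% chance that an engine hit downs a plane are assumptions, not historical data), but it reproduces the effect: among survivors, the most dangerous section looks the least hit.

```python
import random
from collections import Counter

random.seed(42)

SECTIONS = ["engine", "cockpit", "fuselage", "wings"]

def fly_mission(num_hits=8):
    """A plane takes random hits; only engine hits threaten to down it."""
    hits = [random.choice(SECTIONS) for _ in range(num_hits)]
    # Assumed for illustration: each engine hit downs the plane 40% of the time.
    survived = all(random.random() > 0.4 for h in hits if h == "engine")
    return hits, survived

def share(counter, section):
    """Fraction of all recorded hits that landed on the given section."""
    return counter[section] / sum(counter.values())

all_hits, surviving_hits = Counter(), Counter()
for _ in range(10_000):
    hits, survived = fly_mission()
    all_hits.update(hits)
    if survived:
        surviving_hits.update(hits)

# Engine hits are 25% of all hits, but look much rarer among survivors,
# because planes hit in the engine tend not to come back.
print("engine share, all planes:    ", round(share(all_hits, "engine"), 3))
print("engine share, survivors only:", round(share(surviving_hits, "engine"), 3))
```

Analyzing only the survivors' counter, you would armor everything except the engine, which is exactly the mistake the statistician caught.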
Ben: You talk to a lot of different companies, and you're running these conferences. What's the level of recognition in the marketplace that this is going on? Are people taking it seriously? Are they really trying to leverage data in a critical, thoughtful way? Has there been a lot of movement on this in the last couple of years?
Alistair: I think the biggest problem remains that people aren't asking good questions. If you ask a good question, you have so much data that you can probably find the answer. Knowing which questions to ask… I think we said at the end of 'Lean Analytics' that in the old days, a manager was someone who convinced people to act in the absence of information. And in the modern era, a manager or a leader is someone who convinces people to act by asking the right question.
And once you have that question, obviously you need data scientists and statisticians to tell you how to collect the data in an unbiased way, and whether you can act on the model ethically or not. A great example of that is in Boston, where there was an app the city built called Street Bump. And Street Bump measured when a car hit a pothole. Sounds great, right? And it worked very well. The problem was that it told them where all the rich white people's potholes were. Because if you look at what kind of people drive themselves to work in their own car, with a passenger seat they can put a phone on and an unlimited data plan, it's probably going to be rich white people.
Ben: Wow, that’s interesting.
Alistair: So, was there anything biased about the thing itself? No, it was great. But there was a systematic problem that needed to be addressed. And only by looking at this data model and going, "Huh, look at that map. That map looks awfully like socioeconomic studies," can a human step back and say, "I think maybe we have a problem here. Let's attach these to buses and garbage trucks." Which is what they did, and they found the rest of the potholes. So, the first step is asking the right question. The next step is recognizing that there's a cycle of experimentation, that you should go ahead and try something.
The Street Bump thing was great because they tried it, and then they looked at the model, and they adjusted it. And that's how it's supposed to be. So, the first thing is asking good questions, and the second thing is a recognition that experimentation is the norm. You can spin up computing resources for pennies. And then I think it's deciding how and why to act on it, and managing data projects. Often people go find a data scientist and say, "Spin me some straw into gold." Data scientists aren't Rumpelstiltskin. They need a specific task, and they need to work with people who have domain knowledge and so on.
Ben: Well, you know, the way you described that… I don't think I've heard it described exactly that way, and I find it really interesting. Because there's a lot of talk right now about cognitive bias, and bias in data, and how maybe people aren't seeing that beforehand. But part of what you're saying is that maybe you would never be able to see that until you've actually got it out there and tried it out. And then you expect that there's going to be some bias, and you look for it, and you correct it. Instead of spending all this time beforehand worrying too much about it, you actually have to get it out there and try it with real data. Does that sound right to you?
Alistair: What's the Mike Tyson quote? Everyone has a plan until they get punched in the face.
Alistair: I think there's a slightly more discreet version of that, which is that no plan survives contact with the enemy. The reality is that… I'm a big proponent of the lean startup model, and that's why we did 'Lean Analytics.' This idea of iteration doesn't just apply to the product you're building; it applies to the experiments you're running to figure out what product to build. You should design an experiment, try it out very quickly, and see what happens. I've worked with companies that have done huge surveys. And I show up and get five people to take them, and they can't figure out how to take the survey.
I'm like, "You can't get five people in your office to answer this correctly? Why on earth would you spend thousands of dollars sending it out? Building a survey costs nothing. Just go fire up Google Forms and put a survey together. It's so easy." And so I think there's… We talk about cloud computing as a technology stack, but there's also the cloud computing mentality… Randy Bias had a great quote. And apologies to animal lovers that are listening to this. He said, "Once upon a time, we had servers. We named them. And when they got sick, we nurtured them back to health just like pets. And now, we have servers, and we give them numbers. And when they break, we kill them just like cattle." People haven't internalized that. Many people still think of IT as this unique and precious snowflake instead of a resource that's disposable and cheap.
Ben: Yeah, it's funny you say that, Alistair, because I've always used that analogy off and on. And I've started using it in some training courses I've given. And it does seem like a particularly brutal version of the cattle analogy.
Alistair: Yeah.
Ben: When we talk about shooting them in the head, but that’s exactly right. I think both when you and I were starting out, people would name their servers after cartoon characters, and they spent all this time lovingly nurturing them. And it’s at the point now where you really can’t do that at scale. You have to really think differently.
Alistair: Well, we used to name them after the planets in the solar system. And that's because we had a Sun box that was our big server. And then once we had more than nine servers, we had to resort to something increasingly obscure, yeah, for sure.
Ben: One thing, since you've brought it up a couple of times: I've definitely enjoyed your book, 'Lean Analytics,' but I don't know if everybody listening to this is aware of it, and it's obviously key to a lot of the discussion you have and to your thinking. So, can you take a minute to describe the book, how you wrote it, how it's evolved since then, and how you're using it in your discussions with companies and things like that?
Alistair: Sure. So, in 2011, I think, I was part of a team of four people who launched an accelerator called Year One Labs. And it was based on the premise of lean startup, which is that you have an entire year to launch your startup rather than the usual 90 days, which turns into a pitch contest at a lot of accelerators. We actually told people they weren't allowed to write code for the first month, which made them incredibly nervous and forced them to go talk to their customers. And every one of them realized they were trying to build a product nobody needed, which… right there, that would increase the success of accelerators everywhere, right? It was a great experience.
And Ben Yoskovitz, my 'Lean Analytics' coauthor, and I would sit down with these companies and say, "How are you doing?" And they'd go, "We're doing great." "Okay. How do you know you're doing great?" "Well, conversions are at 14." And we're like, "Is that good?" "We don't know." So, Ben and I bought a lot of people beers, and they eventually told us their top secret internal numbers. Because back then, people weren't sharing what good was. So, we called up companies and convinced them… And once someone has told you, you can sort of trade data. So, you can go to one guy and say, "Look, I'll tell you my conversion rate if you tell me yours."
We interviewed about 135 people, some VPs, some founders, and found out what normal was for a lot of these metrics. So, the idea behind 'Lean Analytics' is that your company, if you're smart, will go through five specific stages of growth: empathy, which is figuring out what the market needs; stickiness, which is getting the initial users to keep using it, re-subscribing or whatever; virality, which means they tell others; revenue, which is where you're making money and pumping some of it back into customer acquisition; and scale, which is where you're growing customers and margins disproportionately to your costs.
There are different metrics for each of those stages, right? The metrics you want to track when you're looking at virality are things like how quickly people are sharing my message, and how convincingly they are doing so. That's different from the stickiness metrics, which are about loyalty, and whether I'm focused on acquiring customers or getting them to return. And so we have this idea in the book of the one metric that matters. Because if you know anything about math, trying to solve one equation for more than one variable is impossible. You've got to isolate one variable, right? And so we urge people to pick the one metric that matters and get it to a place where it's good enough. And if you don't do that, then you're probably doing something horribly wrong. So, the underlying idea is that you should pick a metric, run an experiment with a hypothesis, see if you move the needle enough, rinse, repeat. And then you're being much more disciplined about the process of growing your business of any size, or your product launch if you're a big company. What's happened since the book came out is that a lot of data scientists and analysts used it to explain to their bosses how to talk about analytics and data, because we provide some fairly good lexicons.
And so a lot of big companies have asked us to come in and help them simplify their dashboards. Ben and I were in front of a company in Spain a few years ago that showed us their dashboard, and it was in seven-point fonts, and we coined the term "the many metrics that might matter," or MMTMM. And I think they were okay with it, but you could tell they were like, "All right, all right, stop making fun of us." So, a lot of companies have an MMTMM problem. They need to focus on the one metric that matters for that project, or that department, or that startup.
Ben: Yeah, thank you for walking through that. Having read the book, I think it's so needed to encourage companies to use data to back up decisions. Because there's a tendency in human nature to say, "Well, this is my gut instinct, and I'm going to go after this." And there's a place for that. Some of the most innovative ideas have been built off of that. But if you don't use data and don't actually know what success is, then you never know if you've succeeded. And you never know what you actually have to improve and work on. And what you guys did in there is a huge contribution to that.
Alistair: Thank you.
Ben: So, moving on to another couple of things that I know you’ve been doing lately. So, one of the things you had brought up when we talked before was that you gave a keynote at Electronic Arts last week. So, I’d love to hear what you were talking about and what your kind of impression of that whole process was.
Alistair: Sure. So, I'm under NDA with them, so I can't get into too much of what I heard, but I can tell you what I was talking about. So, the challenge for any organization is that analysts traditionally speak when spoken to. We come from a world where data resources were expensive and compute was expensive. And so the product owner would show up and say to analysts, "Hey, I need to know widgets by country, by weather, by color." And the analysts would go off, and they'd build the schema. And they would pour the data into the schema, and they'd bring the reports back like tablets from the mount. And the product owner would go, "Ah, could you look at religious affiliation?" And they'd be like, "No, that would be another query." And so it was this very bad iteration. And today, we first collect the data, and then we find emergent schemas within it. Put another way… and I've had heated debates with members of the police world about this… once upon a time, we used to find a suspect and collect data on them. And today, we find data and go and look for suspects within it, which is an interesting take on society. We can talk more about that in a bit. So, the challenge was really to give them reasons why they are necessary for the company's survival. And what I have found, and I've been working on this project for a little while under the name Tilt the Windmill, which is a spin on the idea of tilting at windmills, is that companies that succeed have a portfolio of innovation. They have sustaining core innovation… what Clay Christensen would call core innovation… which is just to keep the lights on. If you're Volvo, that's cup holders in next year's car. There's adjacent innovation, which is where you change one part of the product, or the market, or the go-to-market method. So, that would be adding the electric car, but you're still selling it to drivers, and you're still selling it through dealerships.
There's disruptive innovation, which is something that actually destroys something. So, if you're Mercedes-Benz, and you own car2go, the more people use car2go, the fewer people are buying a Mercedes. That's disruptive. And then there's discontinuous innovation. Discontinuities are this philosophical idea that Foucault and others talked about, where the future becomes unknowable. So, the self-driving car is an example of that.
In 20 years' time, we're very likely to go, "People used to drive themselves to work? That's preposterous. Not only did they have accidents, but when did they shave?" We won't understand that world. And I think it's very important for analysts to think in those four mindsets. So, one of the things I did was have some beers with some friends a couple of weeks beforehand and ask, "What are some crazy ideas that might affect the video game industry?" So, I'm not sure if you're a gamer, but…
Ben: Oh, I am, yeah.
Alistair: So, there's a company called Riot Games. They have a very popular game called League of Legends. League of Legends came from the idea of a game called Defense of the Ancients, later picked up by Valve, which was a map mod for Warcraft III, itself based on a StarCraft map. So, at what point would an analyst see the adoption of StarCraft, and that map being played a lot in contests, and figure out that Riot Games was going to emerge?
At what point would you decide that the first generation of people who grew up on video games are now in old folks' homes, losing their faculties and cognition, and decide that retirement home video games are a burgeoning market? And so there's this question of what kind of data you would pick. And I've done this for lots of industries. I was talking to the Department of Transportation and some folks in the Pacific Northwest about resilience and risk.
If you're someone in the transportation world, you look at self-driving cars as this huge boon, because even in London, England during the day, the average car is parked 93% of the time. Which means that we have at least 10 times as many cars as we need. So, that's great. Self-driving cars spare us all those cars, except when there's an earthquake and everyone needs to flee Cascadia. Well, are you going to do an Uber share, or are you just going to get your family out of downtown Seattle? So, there are all these tradeoffs I like to think about, about what happens when technology and society collide and we don't really think the consequences through.
Ben: I like the way you described that. Because I do love to read about future technology, but there's also a tendency among futurists to envision very smooth sailing for all these technologies, when in reality, as soon as they hit the average person on the street, it's sometimes really hard to predict how they're going to adopt them. I always think back to the '80s… I grew up in the '80s, looking at Back to the Future and what they thought the world was going to be like now, with flying cars and whatever…
Alistair: Turns out it's Idiocracy, right?
Ben: Yeah, right. [Laughs]
Alistair: Oops, we missed that one. But I think that's a really important point. We teach home ec, and we teach basic economics to students. We're going to have to teach critical thinking in order to make our population more resilient. And I mean resilient to attack, resilient to disinformation. Whenever I'm talking to my daughter about marketing and media… she's seven now… I'm always asking her, "So, what is the objective function?"
So, in machine learning, an objective function is the goal you give an AI. "I want you to get the highest score at Go, or I want you to get the highest score at Pong, or whatever. I want you to tell the cats apart from the things that aren't cats." And I think this year is the year where society starts to ask itself, "What's the objective function of a political party, of a social network, of a marketing campaign?" And that, to me, is the first step in critical thinking: to ask yourself, "What is this thing trying to get me to do or say?"
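The objective-function idea can be made concrete with a toy optimizer. This is a minimal sketch in Python; the hidden target value and the naive random-search strategy are invented for illustration. The point is that the learner optimizes exactly the score it is handed, and nothing else:

```python
import random

random.seed(0)

TARGET = 7.3  # hidden "right answer"; invented for illustration

def objective(guess):
    """The only signal the machine gets: a score, higher is better.
    It will pursue exactly this score and nothing else."""
    return -abs(guess - TARGET)

# Naive random search: propose a nearby guess, keep it if the score improves.
best, best_score = 0.0, objective(0.0)
for _ in range(5_000):
    candidate = best + random.uniform(-1.0, 1.0)
    score = objective(candidate)
    if score > best_score:
        best, best_score = candidate, score

print(round(best, 2))  # converges toward 7.3
```

Asking "what is the objective function?" of a political party or a marketing campaign is asking what score they are maximizing, because, like the loop above, they will chase that score and nothing else.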
Ben: Yeah, absolutely. That was one of the things that made me start thinking about starting a podcast like this. Because I definitely noticed that there's an awareness about data, how it's being used, and the technologies being leveraged against it that maybe wasn't there even a couple of years ago. Even though this has all been going on for years and years, there have been several things that happened recently that I think have raised it to public consciousness in a way it wasn't before.
Alistair: Yes. And I think it's been thrust front and center. But we have… I'm a Canadian. So, there are two things about Canadian politics that are different. We have more than two parties, so there isn't this sense of which team you're rooting for. And we don't know when our elections will be. We call our elections roughly every four years, but there's no scheduling of the political year on television, which is what leads the media to want to populate it with convincing, interesting stories. And so we have a very different perspective, I think, from the US about elections. And it gives me a chance to see some of these changes with a bit of detachment.
Ben: That's an interesting point. I didn't think about that. Well, kind of making a pivot to another thing that we talked about earlier: among the many things that you're involved in, one is conferences. In particular, one that you were telling me about, a conference called FWD50. So, I'd love to hear more about what that is and what you guys are doing with it. Tell us more about it, who attends, and what the objectives are.
Alistair: Sure. So, we started the conference last year. There was kind of a void left by a much more trade-show-y kind of conference that used to happen in Canada, which had great participation. But the underlying idea was: how do we use technology to make society better? There are some pretty important questions there. For example, in North America, we spend 12 billion dollars a year on tax preparation. In Estonia, they spend zero dollars, because it's all done by the government.
And you can make a pretty good argument that when the government does not keep up with technology, it leaves the gate open for someone else to come in. And in some cases, there's a debate about what it should do. Should the government be making chocolate bars? Probably not. Should the government be collecting its taxes? Probably. And so there's a healthy debate to be had about where you draw the line on a government and its role in society. There's also this tension… A lot of people don't remember this from civics, but in most governments, you have the idea of a democracy, which is the rule of the masses, where majority rule wins, and the idea of a constitution, which is the defense of individual rights.
And that's very analogous to what we see in data science. I'm all in on Google. I love Google. I use all their stuff. They know tons about me. It bothers me that my Google agent can't take the Fifth the way that an attorney could, because it probably knows more about me. Nobody is more honest than in their search bar. But the constitution says these are my rights, including my right not to be attacked. Even if everybody votes to kill Alistair, I have a constitutional right not to be killed, right? And so there's this inherent tension in any democracy between the constitution, which protects the rights of the few, and the democratic model, which encourages the desires of the many. We see that in data.
When I give Google my data, I’m willing to give it stuff because I’m going to get tremendous advantage from it. I’m going to get good context, good recommendations, whatever. But at the same time, there are certain things I don’t want it to give up or give away. And so I think that there is no organization more built on data than government. And there is no organization more likely to be affected by machine learning. Laws are just code written in complicated words. When we amend a law or change the text, we’re just debugging, because we found an edge case, right? That’s Lawrence Lessig, that’s not me. Code is law. So, I think digital government is an amazing opportunity to rethink what the role of society is and how society works.
There’s a thing in policy called the Veil of Ignorance. It’s a thought experiment by a guy named Rawls. And he said that we should put a blindfold on, the veil of ignorance, and then design a society not knowing whether in that society we will be a prince or a pauper, or whether we’ll be a saint or a sinner. So, after you’re done, you press a big button saying “launch society.” And now Ben Newton is a known serial killer. Or Ben Newton is a single mom. Or Ben Newton is the president. Without knowing ahead of time, what kind of society would you build? And he called this the Veil of Ignorance.
And I argue that we’ve had so many fundamental changes because of digital technology, the fact that we can now create things that are abundant. We used to talk about the tragedy of the commons; now it’s the triumph of the commons that makes Wikipedia. Did you know that Wikipedia was made in the amount of time Americans spend watching commercials in one weekend? Clay Shirky actually did the math. If you took all Americans, and you had them stop watching commercials for a weekend, and they spent that time making Wikipedia, you could get Wikipedia up to about 2011.
That’s pretty crazy. So, this idea of the triumph of the commons, and what can be done, and the ability to create digitally abundant stuff where the copies are free and the marginal cost is zero means we have to revisit the Veil of Ignorance. We have to put the blindfold back on and ask, “What can be done? Should we be using direct democracy in an era where every citizen can vote? Or do we still need representation?”
And I think that’s one of the reasons why digital government is a fascinating field: there are so many societal questions. Why is unemployment a bad word, other than because we use GDP as a scorecard? Most people I know would love to work 30 hours a week, and we have the means for that to happen. So, there are a lot of really juicy fundamental policy questions that are wrapped up in technology. And that’s why I’m really excited about Forward50.
Ben: Sounds fascinating. I guess to kind of wrap up a little bit… We’ve covered a lot of ground. Like I said in the beginning, you… I love how you have your fingers in so many different things and such a broad set of interests. So, what’s next for you? What are some of the things that you’re looking at that are kind of just nascent ideas in your head that you think you might chase after in the next couple of years?
Alistair: Sure. So, one thing I’m spending a lot of time on is scale tech. Everyone talks about the scale-up problem. We have lots of startups, but we don’t have a lot of scale-ups. And those scale-ups are big enough to catch the attention of their competitors, but still weak enough and small enough that they can be taken out. And so they have to invest heavily in technology that gives them some kind of force multiplier. You know the old parable that the shoemaker’s children are the most poorly shod? Oftentimes these tech companies don’t employ the technology internally to scale efficiently.
So, I’m spending some time with a few very large growth-stage VC firms on that. And then the other thing is a pet project I’ve been working on for a little while. I wrote the first blog post on my Medium page, which is just medium.com/@acroll. And it is based on a talk I’ve given a few times that is continually my most controversial talk. It’s called ‘Just Evil Enough.’ The dirty secret of startups is that every successful startup did something evil, slightly evil, that allowed it to get a platform to behave in an unintended way.
A technical example of that would be Farmville. Like, Farmville started out, and they realized that their app could post to your friends’ Facebook feeds. They got 30 million users in a month. And then Facebook went, “Oh, that’s a vulnerability,” and they patched the vulnerability. So, script kiddies annoy the hell out of me. But this idea of a zero-day growth exploit is awesome. I’ll give you one more example: Tupperware. When I say Tupperware, you don’t think of the Tupperware box, you think of Tupperware parties.
The party was a way to take an existing platform, the dinner party, and subvert it to your advantage, which is how hackers think. So, I like the new definition of marketing, which I’m going to be writing about in this book, which is that the goal of marketing is to create attention you can turn into profitable demand. And to do that, you have to be slightly evil. And I make a pretty good case for it. I’ve been collecting lots of dirt on how companies got where they are that I’m going to put together into a book.
Ben: [Laughs] Considering what I do for a living, I’m definitely going to take a look at that book when you put it out, Alistair. It sounds awesome. I think you’re doing a service to us all to uncover the dirty little secrets. So, I like that. Well, Alistair, thank you so much for coming on the show. I think the stuff that you’ve got your hands into is absolutely fascinating. And I look forward to seeing how this book goes and the other things you’re working on. And thank you so much for taking the time.
Alistair: Absolutely, it was a pleasure.
Female: Masters of Data is brought to you by Sumo Logic. Sumo Logic is a cloud native machine data analytics platform delivering real time continuous intelligence as a service to build, run, and secure modern applications. Sumo Logic empowers the people who power modern business. For more information, go to