Matt Clifford

Entrepreneur First - Ambition and Power

Matt Clifford is the co-founder and chair of Entrepreneur First, a global talent investor that has helped create over 500 companies worth more than $10 billion. He’s also the founding chair of the Advanced Research and Invention Agency, the UK’s answer to DARPA. 

In this episode of World of DaaS, Matt and Auren discuss: 

  • Investing in talent

  • Traits of successful founders

  • The UK’s advanced research efforts

  • AI as a national asset

The Overrated and Underrated Traits of Successful Founders

Matt Clifford, co-founder and CEO of Entrepreneur First, believes that intelligence is overrated when it comes to evaluating founders. He explains, "I think intelligence is overrated for founders. So I think a lot of your listeners will be familiar with this idea of Berkson's paradox." Instead, Clifford emphasizes the importance of understanding power dynamics and group influence. "I think the thing that nearly all of these people are also good at is, and it's sort of a bit of a dirty word, small-p politics, like just really understanding group dynamics and power," Clifford notes.
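Berkson's paradox, which Clifford references, is easy to see in a small simulation. The sketch below is purely illustrative (the trait names, effect sizes, and selection threshold are invented): two traits that are positively correlated in the full population become negatively correlated once you condition on their sum clearing a bar, a crude stand-in for "became a successful founder."

```python
import random

random.seed(0)

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx ** 0.5 * vy ** 0.5)

# In the whole population, "intelligence" and "drive" are mildly
# positively correlated (both standardized, arbitrary units).
intelligence = [random.gauss(0, 1) for _ in range(100_000)]
drive = [0.3 * iq + random.gauss(0, 1) for iq in intelligence]
print(pearson(intelligence, drive))   # mildly positive

# "Founders" are selected on the sum of the two traits: you can clear
# the bar with enough of either one. Conditioning on this selection
# flips the sign of the correlation among those selected.
founders = [(iq, d) for iq, d in zip(intelligence, drive) if iq + d > 2.5]
f_iq = [iq for iq, _ in founders]
f_drive = [d for _, d in founders]
print(pearson(f_iq, f_drive))         # negative among the selected
```

Among the selected group, knowing someone has very high intelligence now predicts *less* drive than their peers, which is the shape of Clifford's claim that marginal IQ stops mattering once you are already looking at founders.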

The Unconventional Approach to Co-Founder Matching

Entrepreneur First takes a unique approach to matching co-founders. Rather than playing matchmaker, they create an environment where potential co-founders can experiment with partnerships. Clifford explains: "Today, what we tell founders is that, you know, in the outside world, outside the EF bubble, the way that co-founders come together is usually it's people who've known each other for a long time. And as a result, the bar to deciding to work together is actually pretty high." Instead, EF encourages a "low bar in, low bar out" approach, allowing founders to try working together and easily move on if it doesn't work out.

The Evolution of Ambition and Technology

Clifford presents an interesting perspective on the history of ambition and how technology has shaped its expression. He argues that ambitious people have always existed, but the tools available to them have changed over time. "My core thesis here is that ambitious people have always existed and what's changed through history is like the best tools at their disposal to fulfill that ambition," Clifford states. He draws parallels between literacy in the past and computer science today as "technologies of ambition" that provide leverage for smart and ambitious individuals.

NOTABLE QUOTES:

"Napoleon would be green with envy if he saw the reach that Mark Zuckerberg has."

"I think a really underrated thing in founders is just like, did they understand power? Do they have a knack or an instinct for seeing where it is and like how to acquire it?"

“Past a certain level, the marginal point of IQ doesn’t help… but understanding the human dynamics of getting stuff done does.”

The Unique Structures of Incubators vs. Venture Capital Firms

When discussing the structural differences between venture capital firms and incubators, Clifford offers insights into why incubators tend to have more diverse structures: "Most incubators, the pitch to their investors is not doing that. Like, yeah, of course there'll be a capital element of what we do, but the way we earn our equity is not just the capital. It's some other thing, like work. It's operational work in some way." This operational aspect leads to more varied structures among incubators compared to the relatively standardized venture capital model.

The full transcript of the podcast can be found below:

Auren Hoffman (00:00.814)

Hello, fellow data nerds. My guest today is Matt Clifford. Matt is the co-founder and CEO of Entrepreneur First, a global talent investor that has helped create over 500 companies worth over $10 billion. He's also the founding chair of the Advanced Research and Invention Agency, ARIA, which is the UK's answer to DARPA. Matt, welcome to World of DaaS. Very excited. Now, you're kind of investing in talent. What are some, like,

Matt (00:22.377)

Thanks so much for having me.

Auren Hoffman (00:30.03)

high-level traits that you think are underrated, and maybe some traits that are overrated?

Matt (00:34.751)

Yeah, so at Entrepreneur First, as you said at the start, we're a talent investor. We fund people before they have companies and sort of take a bet on them as individuals and sort of take them on a journey from pre-company to having a company. So this question is sort of the thing we obsess over. Maybe I start with an overrated one. I think actually, and I'm gonna qualify this, but I think intelligence is overrated for founders. So I think a lot of your listeners will be...

familiar with this idea of Berkson's paradox. Berkson's paradox is this idea that you can have two traits that are correlated in the general population, but when you zoom into the graph or take a selection of the population, they're actually correlated in the opposite direction. So they were positively correlated before and negatively correlated in the selection. I sort of think like this about intelligence and founders, as in, sure, if you like draw a chart of like the entire population and, you know,

plot, you know, entrepreneurial outcomes with intelligence, then of course they're correlated. I mean, who wants to fund, you know, like not-smart founders? But I suspect if you zoom in among successful founders, I suspect

Auren Hoffman (01:44.622)

Like, got it. So like, you're really talking about, let's say, the top 10%. Are you in the top 1% or the top 10% or something like that, right?

Matt (01:50.461)

Yeah, and my guess is if you zoom into the top 1%, I suspect that the marginal point of IQ, or however you want to measure it, doesn't help. And actually, I think it's one of these things. I know there are some investors that really love this. They want to find people that were IMO medalists or whatever. For sure, that's certainly not a negative signal. But I think above a certain threshold, I'm no longer convinced it really makes that difference, that much difference relative to

Auren Hoffman (02:20.472)

Cause if you think, in like the tech world, like you think of the famous founders, also, like, many of them, not all of them, but many of them are famously super smart. You know, Bill Gates, Mark Zuckerberg, Jeff Bezos, the guys that did Google, the Collison brothers, right? These are like famously super, super smart people. Now maybe in B2B, like, I don't know, like Marc Benioff is not as famously smart. He's a super great business person, but he's not

Matt (02:20.585)

Things.

Auren Hoffman (02:49.634)

famous because he's so intelligent. So maybe at some point it like breaks down.

Matt (02:53.949)

Yeah, I mean, to be clear, I'm not saying that any of those people aren't smart or aren't successful. They clearly are.

Auren Hoffman (02:57.23)

Yeah, yeah. And by the way, not only are they smart, they're like, you know, more than just the top 1% kind of smarts.

Matt (03:02.781)

Yeah. Although it's really interesting to try and break down like when and how does that effect occur? Like clearly these people are, you know, really exceptional, and some of them I've been lucky to spend quite a lot of time with, and others I don't know at all. But you know, I always remember there's this sort of story about Bezos and how, you know, like he started very smart, but he also knew how to use his time to compound that to become smarter and smarter. And so

I think there is also a selection effect here that like, if you become the CEO of a successful business, you sort of get to access, you know, I can't remember who it was who put it like this, but everyone's best 15 minutes, you know, like they do. And there is something about it, you know, like someone's fine-tuning an LLM. Yeah. So I guess to be clear, like at EF, we have a very high bar for smart. Like we don't take people we don't think are smart. I guess what I feel sometimes is overrated is like,

Auren Hoffman (03:43.852)

Yeah, yep. You get, there's a compounding, and you get smarter. Yeah.

Auren Hoffman (03:54.445)

Okay, yeah.

Matt (04:00.377)

you know, this person was like the number one person in the IIT exams. It's like, I don't know if that matters versus being number one thousand, given that like a million people take the exam. Yeah. In terms of like things that are underrated, well, maybe it's the sort of flip side of that. Like, I think the thing that nearly all of these people are also good at is sort of, and it's sort of a bit of a dirty word, but it's like small-p politics, like just really understanding

Auren Hoffman (04:04.31)

Yeah, yep, yep.

Yeah, that's a good point. Right. Yep.

Matt (04:29.001)

group dynamics and power. Like power is something that I feel, maybe it's a UK thing, maybe in the US you feel more comfortable talking about power, but I feel like power is a thing that people don't really like talking about. But like, I think a really underrated thing in founders is just like, do they understand power? Do they have a knack or an instinct for seeing where it is and like how to acquire it?

Auren Hoffman (04:47.928)

Sure, I'm not following this at all. Like, what do you mean by that? And like, what? Maybe it's a US thing too, yeah.

Matt (04:52.171)

Well, I think ultimately, when you start a company, you start with nothing, right? You're starting with one, two, three people, no assets. Ultimately, what have you got to do? You've got to get people to do things for you. How do you do that? It's ultimately about, there's lots of things, but one of the things it's about is influence. And I think that the best founders are really, really good at thinking about who do I actually need to say yes to get to what I want? And what is the coalition I need to do this?

Auren Hoffman (04:57.133)

Yeah.

Auren Hoffman (05:18.51)

Hmm. But how do you assess that in, like, I mean, how would you assess that in an interview, or, I don't know, or based on their resume or something?

Matt (05:30.841)

I think it's hard to do on that basis. One of the things that we do at Entrepreneur First that I think has made us successful is we spend three months with people once we've made an initial selection, helping them build companies. And I certainly think we learn a lot more observing them over three months than we do, you know, like in the interview. But I think in the interview, the question I always like to ask people is like, what's the act of persuasion that you're proudest of in your life? And like, you're not allowed to say like when you met your partner or whatever, like, you know, like,

Auren Hoffman (05:40.237)

Yeah.

Auren Hoffman (05:58.977)

Yeah, yeah, yeah.

Matt (05:59.795)

What's something that you shouldn't have been able to get done given the resources you had, and how did you do that? And I think that's very important. I think founders tend to have a really good sense of like the human dynamics of getting things done, persuasion.

Auren Hoffman (06:15.222)

Yeah, or there's like a hustle piece of it potentially when you're younger, right?

Matt (06:17.673)

Yeah. Yeah. Sometimes it's just like persistence, right? That's like one way to do it. But like, actually, a lot of the people that you mentioned when you were reeling off your list of very successful founders, I think, you know, all of them have done an incredible job of building like coalitions of supporters inside and outside their organizations. And I think that's like really important.

Auren Hoffman (06:21.965)

Yeah.

Auren Hoffman (06:36.994)

Yeah.

What about, like, is there something that could be like really negative, but you don't see as a negative? Like, I don't know, they went to jail when they were younger. Something where people would say, this is super negative, but you're like, well, actually it's not as negative as people think. Like that's an overrated negative or something.

Matt (07:00.637)

I mean, maybe a version of this is like, I think people do referencing really badly, in that I think a lot of the time, really great founders reference really badly. I feel like over the last 10 years, people in tech have started to see, it's become almost a truism, that like referencing is really important. And like, you know, if you had to choose between interviewing and referencing, you should reference, and I buy that. But I feel the bit that is missing from that is like, sometimes great founders

Auren Hoffman (07:20.684)

Yep.

Matt (07:28.189)

reference really badly, you know, like they really annoyed people. I guess, like, annoying people is something that a lot of people think is bad that I really don't mind about. Like, I think most of the best founders that I've worked with can be really annoying. Yeah, yeah, I'd be very surprised, yeah, I'd be very surprised if you were to do a Big Five personality test on successful founders, or founders in general, I'd be surprised if like disagreeableness wasn't correlated with success.

Auren Hoffman (07:29.955)

Yep.

Auren Hoffman (07:41.782)

Yeah, they're irascible too, right? In many ways they are, yeah. They kind of say what they think.

Auren Hoffman (07:58.028)

Yeah, it's probably anti-correlated if you're like becoming the CEO of Shell or something where you have to go up the rankings, but it's probably very highly correlated if you're going to like start a company where you're disrupting things.

Matt (07:58.367)

somehow and often.

Matt (08:05.278)

Right, exactly.

Matt (08:11.787)

Exactly. And so because in our model of how we invest, a lot of the time, nearly all the time, people are first-time founders. Often they've done something else, they went down another path that wasn't founding, maybe wasn't tech at all. And then sort of, we meet them at this point, they're making a career change. You know, often in that context, when you reference people, you hear what people they've worked with in a more structured, more hierarchical environment think. And often it's like, yeah, this person's really annoying, like they don't know how to play. It's like, yeah, that's fine.

Auren Hoffman (08:39.084)

Yeah.

Matt (08:41.373)

I kind of don't mind about that. As long as, and this comes back to the point about power and politics, as long as it's clear that in order to do the things they actually wanted to do, they knew how to get them done.

Auren Hoffman (08:53.922)

I did a reference on someone recently, and it kind of turned out that they got in a lawsuit with their last company. And then even earlier in their career, they sued somebody. With that, you know, generally you'd say, ooh, this is a big turnoff. Would that be as big of a turnoff for you? Or would you say, ah, it's, you know, it's fine. That's what founders do. They sue people.

Matt (09:19.019)

I think, well, so I think like a controversial thing that people don't like to talk about is that in VC, it can be rational. It may be immoral, but it can be rational to sort of absorb the prejudices of the people who invest after you. So let's say you're willing to be like,

you know what, this guy went to jail, you know, like he did something really bad, but like, you know, there's sort of alpha in not listening to that and just investing anyway, because, you know, like no one else will want to do it. The problem is, if the next people won't do it for that reason, then you could take on a lot more financing risk than you otherwise would. And that's a bit controversial, because you could say that about other things. Like, obviously on average, you know, women have a much harder time fundraising. This is a huge challenge, I think, particularly if you're in the talent industry where, clearly,

Auren Hoffman (09:59.297)

Yep.

Matt (10:13.823)

half the talent is female. Like, but you know, there is this sort of thing about how the next people's prejudices may affect you. I mean, like, yeah.

Auren Hoffman (10:20.601)

Interesting. Okay, I guess, so even if I'm not prejudiced, where other people are prejudiced, then, okay.

Matt (10:24.821)

And so it becomes, now, obviously with women, we don't let that affect us. And in fact, one of the things we're really proud of at EF is we say, well, part of the reason we can be a great partner to really ambitious female founders is that we can really move the needle on getting the next round done because we know all the investors and we've worked with them so many times. But I think in general, this thing of like financing risk is maybe a bit underrated. And so if someone had a criminal record, but they'd come clean,

Auren Hoffman (10:42.06)

Yeah. Yep.

Auren Hoffman (10:54.658)

Yeah.

Matt (10:55.187)

I think that's where I would start to be like, can I get the next round done, or is someone going to be like...

Auren Hoffman (10:58.038)

Yeah. Yeah. Yeah. Interesting. What are some like key indicators you look for when evaluating potential founders, like before they have an idea? Like, what else is there? Is there some sort of, say, qual, or is there something like in their resume or some other type of thing that you look for?

Matt (11:17.157)

One thing we specialize in at EF, as I sort of alluded to, is this sort of idea of like taking people out of these like tracked, quite conventional careers, like conventional in the sense of like, yeah, banking, finance, corporates, and sort of helping them start companies. And so one of the risks to us and one of the challenges is that people can be very, very, very successful in a structured environment where there's effectively some sort of, you're making your boss look good, sort of,

Auren Hoffman (11:27.404)

McKinsey or something, yeah.

Auren Hoffman (11:42.86)

Yep.

Yeah, and it's hard to fail in those. If you join McKinsey, it's very hard to fail. Like you have to do something really bad to fail.

Matt (11:50.345)

Right, right. Yeah, and so, you know, you can get a lot of people that have every conventional measure of success, but the minute you put them in an unstructured environment, they're not gonna succeed. And so the thing I like to ask at interviews is sort of like, tell me about the most successful thing you've ever achieved that no one told you to do. And what I find is like, even, or maybe especially, when you're interviewing people that have had zero exposure to startup

Auren Hoffman (12:10.7)

Yep.

Matt (12:19.381)

culture and ecosystem, the ones that subsequently become founders have some sort of analog to that. There's something they did, maybe some side project, you know, but there's always something where no one told them to do it, but they went and did it. And it was like, they figured out how to succeed in an unstructured environment.

Auren Hoffman (12:27.725)

Yeah.

Auren Hoffman (12:37.326)

Okay, yeah, that makes a lot of sense. Are there other tips? There's a lot of common identifiers people talk about, like speed of execution, and those are not controversial, but are there other signals when you're evaluating somebody that you really pay attention to?

Matt (12:58.527)

What I really look for, which is a bit more, I guess, at least in a European context, you know, I sound very British now, more controversial, is this sort of like megalomania, will to power. I do think that

Auren Hoffman (13:10.498)

And you think you can observe that early just like by watching somebody.

Matt (13:14.731)

I think you can. I think you can ask questions that sort of surface what people assume about their own lives. I think it's very hard to succeed if you don't think you're gonna succeed. It sounds really banal, but it's amazing how many people sort of embark on this sort of journey. In Zero to One, Peter Thiel has that great line: you are not a lottery ticket.

Auren Hoffman (13:24.216)

Yep.

Auren Hoffman (13:30.166)

Yeah, that's right.

Matt (13:42.079)

And I think this is like such a useful metaphor for thinking about founders. Like people, I mean, at the extreme, people who believe that their odds of success are, like, you know, best approximated by the data in the population, they're definitely not going to succeed. Like you have to believe that you have like better than average odds. And so I think you can find this sort of will to power by sort of asking people about, you know, like what they

Auren Hoffman (13:57.676)

Yes, yeah, yeah, you better, yeah.

Matt (14:08.159)

what they expect to happen, and like, if they fail, why would they fail? And it's really interesting, especially if you've done this thousands of times, which I have by now, you really notice a difference between the people that can't contain the ambition and people that almost tell you what they think you want to hear, like, they think you want to hear something well balanced. Actually, I don't want to hear something well balanced. And that to me is a real tell.

Auren Hoffman (14:31.041)

Yeah.

Auren Hoffman (14:36.558)

I remember when I was younger, you know, hanging out with all these, you know, internet entrepreneurs and stuff like that. And some of them were successful, some weren't successful, but even most of the successful ones, like they sold their company, made a ton of money. Like they just went and like did a winery afterwards or something like that. They lost their ambition after that, kind of.

They reached a plateau where they're very happy. They could take care of their family. And it's like, they're not Elon Musks. Like very few people are like that. And of course, like, you know, you want to invest in like the person who has just like this outsized ambition. How do you even know that? Like, I don't know that I would have been able to suss it out when I was 22.

Matt (15:24.255)

I think it's hard. I think you can, and it's definitely not flawless, but I think you can sort of get a sense of it by like dissatisfaction. So like, what I like to ask people is like, what's the greatest achievement of their life, and get them to like really talk about it. And it's really interesting just hearing like tone of voice, energy, et cetera, when people talk about it. Like some people are really satisfied with the greatest things they've achieved with their life so far.

Auren Hoffman (15:44.354)

Mmm.

Matt (15:50.377)

And some people are really frustrated and disappointed with the greatest things they've achieved with their lives so far. I think like there's a thing that I think great, actually not just great founders, but like really ultra-successful people in every domain have, which is sad for them, which is that the success lasts a very, very short amount of time for them psychologically. Like, yeah, like you get the big win, and the next day you're like right back, emotionally back to zero, and you've got to start again. And that's a great predictor of success, or at least it seems to be.

Auren Hoffman (15:54.155)

Yeah.

Auren Hoffman (16:19.576)

Cause there's some people, when you look at them, they seem to be like fairly well-adjusted people. They've got happy marriages. They've got kids. You know, if you look at like Mark Zuckerberg, he seems like a fairly well-adjusted type person. And then you look at other people, you know, more famously, say an Elon Musk, like, you know, he wouldn't say he's well adjusted, like, you know, maybe less well-adjusted kind of things. Like, do you think like either one or more?

Matt (16:31.871)

Yeah, he seems well adjusted, yeah.

Auren Hoffman (16:48.684)

You know, 'cause Elon Musk is more like, the people talk about, he's like, you know, crazy, has dad issues, you know, all those other types of stuff. Like Zuckerberg probably has a great relationship with his father, right? So like, how do you think about that?

Matt (17:03.967)

I think it's sort of orthogonal to, it's a really good point, I think it's sort of orthogonal to what I'm talking about. Which is, again, I don't know Zuckerberg, but like, when I listen to him on podcasts, I'm really struck by like the complete, like, endless competitiveness that he has. And this sort of sense that like, he's just like nowhere near done. And like, yeah, I'm sure he's proud of like what he's achieved.

Auren Hoffman (17:21.218)

Yes, he's very competitive for sure. Yeah. Yeah.

Matt (17:29.481)

there's no sense in which he's satisficing anything. Everything is like about maximization. And yeah, again, I've spent a little bit of time with Elon Musk, but I definitely don't know him. And I would say like, my sense of him is not like that, right? Like he's still maximizing, he's definitely not satisficing, but it's, as you say, it's a very different thing. But to me, this is really important, because it shows that like, even when we're talking about these traits that are like single dimensions, it's really important when you're trying to evaluate founders

Auren Hoffman (17:32.365)

Yep.

Auren Hoffman (17:45.239)

Yeah, yeah.

Matt (17:58.003)

not to confuse or conflate them with other traits that are like superficially similar but could actually be completely, you know, orthogonal.

Auren Hoffman (18:05.954)

Yeah, you have all these things. Famously, obviously, Elon Musk, Jeff Bezos, Steve Jobs, they all had either no relationship or very bad relationships with their fathers. And you can put Larry Ellison in there and stuff. But then you can go to the flip side: okay, Bill Gates had a great relationship with his father, Zuckerberg too. It's very hard to make a blanket statement about anything.

Matt (18:29.259)

Exactly. But you know, I think, you know, we sort of consider what we do at EF to be an apprenticeship business. You know, like we hire, you know, usually quite young people to join us as talent investors. And then we spend, you know, when it works, many years like working with them, apprenticing them. And, you know, we say like, there is a value in data in what we do, but you are also just

honing your intuition through hundreds and hundreds of interactions with people. And we're really big on, one thing we do at EF is stack ranking, you know, stack ranking of candidates, of individuals, of companies, and trying to like use that as a way to kind of calibrate our intuition.

Auren Hoffman (18:58.732)

Yeah.

Auren Hoffman (19:10.306)

Got it. Is Susie better than Jane, and is Jane better than Bob, or something like that? And it like gives you a sense. If you had to pick one, which one would you pick? Okay. Yeah.

Matt (19:16.223)

Yeah, like if you had to go all in on one of these people. Yeah, exactly. As an investment, you know, obviously we also treat them as people, that's ultimately what we do, but like, we also want to have this thing of like, we want to get good at trying to build intuition around these things. And then where possible, try and like back out, like, what does that intuition tell us? Like, what was it that we got really excited about with this person? Susie then went on to succeed big time. What was it? You know, like, can we say something useful?

That's a really important question.

Auren Hoffman (19:47.854)

It's really interesting, you know, just in hiring in my life, like I've hired a few people, you know, a small number of people, that truly were 10Xers, like truly changed the trajectories of the companies and stuff like that. And obviously I hired them because like, I thought they were good, but at the time, I don't think I could have told you which of the hires I made were going to be 10Xers. All of them, I was hoping, would be good. Some of them turned out to just be,

you know, there were a few that turned out to be very bad, but most of them turned out to be fine, very good. And then every once in a while you get this like extraordinary person that really moves the needle. I have no idea, even going back in time, I don't know how I could have said that this person would have been the extraordinary one.

Matt (20:32.317)

Yeah, I'm very skeptical about our ability, and certainly my ability, to do it at the level of the individual. And one of the reasons I like stack ranking is it allows you instead to talk about like quartiles. You know, like, I think I would not believe someone that said that they could perfectly stack rank. Not least, there's a lot of randomness after the selection, right? But certainly good talent investors at EF, their top quartile will

Auren Hoffman (20:43.959)

Yeah.

Matt (21:00.683)

outperform by orders of magnitude their bottom quartile. Now they might not know which person in the top quartile, but they usually almost always get the biggest outcome in that top quartile of their stack.

Auren Hoffman (21:11.896)

So it's always hard, too. Once you put someone in the top quartile, are you treating them differently, or? Okay.

Matt (21:15.805)

No, not really. It's not a tool we use to work with founders. It's a tool we use to train our people on how you evaluate founders. Yeah. Yeah.

Auren Hoffman (21:25.164)

Okay, interesting. Now, if you think of all the thousands of venture capital firms, almost all of them are structured the same way. They all have a management fee and carry. Just legally, they're structured the same way. And then if you think of all these incubators, maybe there's hundreds of incubators, almost all of them are structured completely uniquely. There's almost no two incubators that are even remotely the same. Why is this?

Matt (21:38.602)

Yeah.

Matt (21:48.799)

Yeah.

Yeah, I still think quite a lot about this question, because we are one of those, and we have an unusual structure, and one that took a...

Auren Hoffman (21:55.629)

Yeah.

And by the way, for full transparency to our viewers, I'm a small shareholder in Entrepreneur First, a very happy shareholder. Yeah. Yeah. Yeah.

Matt (22:04.34)

Yeah, we should have said that. Yeah, well, we're very happy to have you as a shareholder. Well, so here's the way I think about it. So I think just the first thing I would say is that in general, venture capital funds are trying to sell a commodity, capital, at a market price. You know, yeah, there's a bit of price discovery, but broadly, there's a fairly simple trade of cash for equity, and you determine the price by a

Auren Hoffman (22:31.16)

Yeah.

Matt (22:33.355)

And so in a way, it sort of makes sense that there would be a single optimal structure to perform that somewhat standardized transaction. Most incubators, the pitch to their investors is not doing that. Like, yeah, of course there'll be a capital element of what we do, but the way we earn our equity is not just the capital. It's some other thing, like work. It's operational work in some way. And that varies across incubators. But because of that,

I think you end up with a lot of different structures, because there's much more messiness, there's much more nuance in what that thing is that is not capital.

Auren Hoffman (23:11.49)

But if you think of like most, let's say most companies, they're different. Like, you know, Facebook is different than Salesforce, which is different, well, OpenAI may be the weird exception actually, but they're structured similar. They're Delaware C-corps, they've got like common and preferred. Like they're all kind of structured in a similar way, roughly. And then eventually they may go public. And then when you go public, they're all structured basically the same way. You know, there's a board and this, yeah. But then when you go to like the incubator, it's like,

Matt (23:19.753)

Yeah.

Matt (23:29.215)

Yeah.

Matt (23:36.402)

Yeah

Auren Hoffman (23:40.628)

we've got this, and then there's just this side fund over here, and it's weird, and everything gets so crazy. It's like you can't even put it on a slide usually. Yeah. Yeah.

Matt (23:43.73)

yeah, well, I do think it is.

Matt (23:50.827)

Yeah, you don't want to see our structure diagram. I mean, I think some of it is that it's weird. Like, you know, broadly, there's a reason that funds structure as these GP/LP structures and not as companies, and it's largely tax driven, and in a way that makes total sense. And I think because that model doesn't usually work for incubators, because the operating cost is much larger as a percentage of the overall outlay than normal,

Auren Hoffman (24:18.765)

Yeah.

Matt (24:20.553)

then if you structure as a company, usually that puts you in a weird tax position where you're overpaying tax relative to what you could achieve in a fund structure. And so you end up with these weird hybrids, because it's so weird and, frankly, niche. You're right, there's hundreds of them, but collectively, we're all pretty small. So no government is like, you know what? We need to come up with this really great way of creating a standardized structure that recognizes how incubators work.

Auren Hoffman (24:29.069)

Yes.

Auren Hoffman (24:39.832)

Correct, yeah.

Matt (24:48.811)

Although actually the UK is just launching something that looks quite promising. I'll come back to you on whether it works. But so I think it's partly just that it's a bit weird and it doesn't fit in either box. I think there's a second thing though, which is maybe a bit more controversial, but you wrote about this like six years ago and I wrote about it in my newsletter six years ago, which is, like, I think a lot of VC isn't that ambitious. I don't mean that the people aren't really good,

Auren Hoffman (25:16.972)

Yes, that's right. Yeah.

Matt (25:18.761)

or that they're not trying to succeed, like broadly, most VCs don't see their job as sort of like innovating on the process of venture capital. They see their job as like finding great companies and giving them capital. And interestingly, I would argue, and I think you argued the same thing when you first wrote about this, that private equity is actually more ambitious on average. Like people have innovated more on the model in private equity. And that's partly why you have a bunch of private equity management companies that are public companies in a way that you

Auren Hoffman (25:32.568)

Yeah.

Auren Hoffman (25:47.662)

That's right.

Matt (25:48.543)

don't really in VC. And I do...

Auren Hoffman (25:50.218)

With VCs, it's always like, they never take their own advice, right? Like, I don't know, if you're a VC, would you really want to fund a company that has more than two CEOs? Probably not, right? Would you want to fund a company that takes outside money and is willing to sell its equity? Well, VCs, at least the LLC, they're not selling the LLC usually to anybody, right? They're not taking dilution and stuff. And even with their own employees, they don't get, like...

Matt (25:53.898)

Right.

Matt (26:02.889)

Right. Right.

Auren Hoffman (26:19.448)

The employees don't get in it. They get, you know, carry in the fund, but not in the management company. And you know, there's so many things where, like, these VCs never go public, but Bill Gurley tells everyone to go public, and he's probably never even thought about taking Benchmark public, right? It's like they never take their own advice.

Matt (26:23.381)

Right? Right?

Matt (26:33.343)

Right. Right.

Yeah, exactly. And so I think there's just something about, like, I don't know, there's maybe something that attracts people to incubation models, which is a different impulse than the one that attracts people to VC. It is this sort of company-building one, where you sort of, maybe to a fault, want to experiment with the structure, because you're trying to... like, you know, I still believe that one of the things I love about what we do

Auren Hoffman (26:57.068)

Yeah.

Matt (27:03.741)

in talent investing is that it is weird. It's trying to create this new asset class in talent. And, you know, I'd like to think that's a really ambitious thing to try to do. And so, you know, you end up with a lot of structural innovation to try and make that work.

Auren Hoffman (27:18.03)

Are there other, quote unquote, incubators or other types of weird companies you look to? Because there's not a lot of super successful ones out there. So are there ones where you're like, wow, I studied these guys, I've learned something, I've taken these things away from them?

Matt (27:34.923)

I mean, I think there's clearly a ton to learn from the successes of things that look a little bit like this. I mean, obviously Y Combinator is not really an incubator, but it's just an extraordinary, extraordinary company and an extraordinary success.

Auren Hoffman (27:47.021)

Yes.

Auren Hoffman (27:52.554)

One of the great things about Y Combinator, if you really think about Y Combinator as a company as opposed to a venture capital firm, is most people who know Y Combinator can't name one person who works at Y Combinator. Right? Like, that is amazing. That just shows you the value of what they're doing. They're not actually just a collection of smart people. Of course they have super smart people there, but they've actually built a real company.

Matt (28:07.109)

Yeah.

Matt (28:21.193)

Yeah, well, and

Auren Hoffman (28:21.428)

AngelList would be the same, right? They built a real company. Like, unless you're in the industry, you can't name one person who works at AngelList. They don't know the name.

Matt (28:24.244)

Yeah.

Matt (28:29.803)

Well, this comes back to the structural question, right? Because a big reason to structure incubators as companies is the idea that they have some sort of enduring equity value that is distinct from the track record of the star investors. And the truth is, most of the time in VC, that's just not true. And, you know, we've seen that, right? I won't name names, but there are clearly top tier firms today that didn't exist 20 years ago, and top tier firms from 20 years ago that are no longer top tier, right? And...

Auren Hoffman (28:38.733)

Yes.

Auren Hoffman (28:46.966)

right.

Auren Hoffman (28:57.752)

Yep.

Matt (28:59.317)

But actually YC has created this enduring equity value precisely because it doesn't rely on like any one superstar. Obviously the people that work there are smart, exceptionally smart and like exceptionally capable, but you're right, that's not where the value comes from. And you know, if you go to a different asset class, I don't know.

Auren Hoffman (29:08.28)

Super smart, yeah. Yeah, yeah, super good, yeah.

Auren Hoffman (29:16.428)

And by the way, I believe right now, if you take the top five people out of YC, and obviously they're all amazing and it would be a negative, YC would still be amazing. Whereas if you take the top five people out of any venture capital firm right now, it would go under immediately. Yeah.

Matt (29:28.041)

Yeah.

Matt (29:33.535)

Right, absolutely. Yeah, and that's the dream in a way of these models, which is like you actually have something that is...

Auren Hoffman (29:38.541)

Correct. Yeah.

It's like a real company. And obviously if you take the top five people out of Google, it's the same thing, Google would still be incredible, right? Like Apple, you just go through the list.

Matt (29:46.664)

Right. Yeah, and you know, going back to your point about advice that VCs don't follow, that's almost the definition of a lifestyle business, right? Yeah. Yeah.

Auren Hoffman (29:54.552)

Correct. Correct. It's a lifestyle business. Exactly. Venture capital is a lifestyle business. What about, have you followed IAC at all? Because in a weird way, they're kind of an incubator. They spin things out. They start things, they acquire things on the side. They're like a really weird company. They're probably the weirdest tech company out there.

Matt (30:14.057)

That's true. Yeah. I mean, I know a little bit about them, but I haven't followed it closely. But there are others. I mean, maybe something a bit like that would be Naspers or something. And I think these models, like, you know, Vista or Constellation, where you really have a thesis about how you use your central services to drive value, are really interesting.

Auren Hoffman (30:24.961)

Naspers is a great example. Yeah. Yeah. Great example. Yeah.

Auren Hoffman (30:37.688)

Yeah.

Auren Hoffman (30:43.478)

Right, Constellation Software is a great example of just an incredibly successful company.

Matt (30:47.816)

Yeah. Which, even though, you know, it is reasonably well known, is still underrated. Definitely. It should be... yeah, exactly. Yeah. So, you know, I think there's a lot of these things, but again, it's funny how much in traditional VC, as you said, a lot of it's about creating brand around your superstars, and maybe that's just sort of a

Auren Hoffman (30:53.784)

Correct, yeah. Most people in tech haven't heard of them. So even in tech, they haven't heard of them.

Matt (31:17.481)

very different set of incentives.

Auren Hoffman (31:20.14)

When you think of the motivation of people starting companies nowadays, it's cool to be a founder, and it adds a lot of cachet to your resume. If you're a founder and stuff, again, it's not that different from joining Goldman Sachs or McKinsey or something like that. It's a resume addition. How do you suss that out when you're going through people? Like, oh, they're just doing this because it's cool.

Matt (31:47.881)

Yeah, this is a very interesting question, because it really varies by geography. So EF, we operate in London, which is where we started, in Paris in France, obviously in Bangalore in India, and then in New York and San Francisco. And I would say the answer to that question is really quite different across those five cities. You know, I would say in London, maybe there's a little bit of that, but I'd say it's still... it's no longer a...

Auren Hoffman (32:15.798)

still not cool.

Matt (32:17.029)

Yeah, it's no longer like a weird thing to do. We've been going for over a decade. You know, when we got started, it was weird. It's not weird anymore. But I don't know if it's cool. I think it's sort of, like, fine. You know, so it's okay. I would say it's behind that probably in Paris. But I agree that, you know, in San Francisco, it'd be pretty embarrassing to work for McKinsey, I guess, I don't know. And, you know, I think it really...

Auren Hoffman (32:19.469)

Yeah.

Auren Hoffman (32:39.904)

Yeah, yeah, yeah, it would be. Yeah.

Matt (32:44.317)

It really comes down to, like, what are the default career paths for people in each location? You know, one thing I believe is that it's kind of annoying, if you're already on the inside of an ecosystem, when the tourists show up. But it's actually a really good thing on net. Like, it's really great if people try being founders because they think it's cool. I think it's really tempting to be like, there's something inauthentic about that. But I'm always suspicious...

Auren Hoffman (33:08.109)

Yeah.

Matt (33:13.535)

That feels like a guild mentality, of like, there are true entrepreneurs and fake entrepreneurs. It's just, you know, the way we find the true entrepreneurs is we encourage a lot of people to try and see what...

Auren Hoffman (33:23.982)

Totally. Now, one of the things that you do, which I think is so interesting, is you're not only trying to find great talent, but you're trying to match them together as co-founders. So in some ways, you're like a matchmaker creating marriages, and you want these marriages to work. It's like you only get the incentive if the marriage lasts at least 20 years type of thing. So how do you think about, even if they're both

really talented people, they might not be a great pair with one another. How do you think about that?

Matt (34:00.176)

Yeah, so you're absolutely right. So one of the leading reasons that people come to Entrepreneur First is to find a co-founder. And, you know, that's a big part of how we work, which is people come as individuals, we bring them together into a community, and, you know, we fund them once they're in teams and companies. I think one thing we learned, which was an extremely humbling experience in the true sense, not in the modern sense of meaning the opposite, like, we learned we were pretty dumb, was when we started EF.

Auren Hoffman (34:21.39)

You

Matt (34:25.311)

We were like, right, so we're gonna know all these people really well, and so we're gonna figure out who should work together, and we're gonna put them together and say, hey, you know, Auren and Matt, you should work together, you should start a company. We tried that in the first year and it was a complete disaster. It was, like, awful. But we sort of got a little bit lucky with it, and so I don't think it quite got hammered home enough how bad we were at it. So we tried it again in the second year, and it was really a disaster. Then we got to like six weeks in and we were like, no one is still...

Auren Hoffman (34:37.488)

interesting. Okay.

Matt (34:54.549)

We were doing the opposite of what we do now, which is, people would come to us and say, we just don't think we should work together, and we were like, no, you should work together, stay together. I was marriage counseling people. It was terrible. And I think what we learned was that I don't believe there is any way to do this inorganically that produces really enduring relationships. And so we gradually, from that moment of humility, figured out what we do today, which is almost the opposite.

Auren Hoffman (35:01.943)

Yeah.

Matt (35:24.469)

Today, what we tell founders is that, you know, in the outside world, outside the EF bubble, the way that co-founders come together is usually it's people who've known each other for a long time. And as a result, the bar to deciding to work together is actually pretty high. Like, if you've been friends with someone for a while, you're not going to casually, like, Friday night have a beer, Monday morning start a company together. Usually it's a long discussion. You know, you sort of play with ideas in the evenings and weekends. But equally,

Auren Hoffman (35:51.66)

Yeah, yeah.

Matt (35:54.347)

with there being a very high bar to get in, there's a very high bar to get out. So if we've been working with each other for like three weeks, having gone through this process, and then I'm like, you know what, we're just not the right team, that's kind of embarrassing and awkward. So it's high bar in, high bar out. What we do at EF is we invert both of those. So we basically say, in this room are 30 other people. They all want to start companies. They're all incredibly smart. They're all incredibly driven. They've all come through our selection process.

If your co-founder is in this room, the only way you're going to find that out is by working with them. Try it out.

Auren Hoffman (36:24.982)

So just try it out, but it's okay to just stop it at any time?

Matt (36:29.067)

If it's been 24 hours and it isn't working, in this community there is no social stigma to just saying so. And we really enforce this. The way we enforce this is, if you start a team with someone in your cohort, you register that team in our tool, and it pops up on Slack and everyone knows. And then when you stop, you unregister it, and it pops up on Slack and everyone knows. And the norm in our community is you celebrate the breakups. You celebrate the breakups because you've both learned something about what you actually need.

Auren Hoffman (36:32.151)

Yeah.

Auren Hoffman (36:36.535)

Yeah, okay.

Auren Hoffman (36:54.221)

Okay.

Yeah.

Matt (36:58.333)

And you've created liquidity in the founder pool, and there's more opportunities for other people. So instead of it being high bar in, high bar out, it's low bar in, low bar out. And on average, people go through something like two and a half combinations, you know, before they find the one. And so one way you could describe that whole process is, it's not that we actually matchmake, it's that we create a ton of teams. The bad ones fail, and the good ones are the ones that are left.

Auren Hoffman (37:01.472)

Yeah, yeah,

Auren Hoffman (37:21.324)

Yeah. Yeah. Okay. So it's like a Darwinian thing a bit. Yeah.

Matt (37:25.564)

Yeah, it's the selection pressure.

Auren Hoffman (37:28.268)

Now you have an interesting idea around the history of ambition and the technologies of ambition. Can you unpack that a bit?

Matt (37:38.111)

Yeah, so my core thesis here is that ambitious people have always existed and what's changed through history is like the best tools at their disposal to fulfill that ambition.

Auren Hoffman (37:50.072)

So if you're Napoleon, it's like, I'm just going to conquer stuff, right? Yeah.

Matt (37:54.155)

I mean, literally, yeah. So, like, I once wrote this essay, now a long time ago, but I think it's held up pretty well, on trying to do a brief history of ambition. And, you know, the point was, if you were born 1,000 years ago and you were really ambitious, but you were born to peasant farmers, you didn't really have a lot of options. I don't care how smart you are. But right, exactly. You end up doing stuff that is probably, like...

Auren Hoffman (38:09.112)

Yeah.

Yeah, you'd probably have to lead a rebellion or something like that, right? Yeah. And you'd have a, you know, 1 % chance of making it happen and a 99 % chance of getting killed or something, right? Yeah.

Matt (38:23.345)

Exactly. And so, you know, something special happens around that time, about a thousand years ago, which is literacy becomes a relatively mass phenomenon. People learn to read and write. And it turns out literacy is an incredible technology of ambition. What I mean by that is it creates leverage for smart and ambitious people. Instead of only being able to influence the people in your immediate physical surroundings, if you can write things down, then you have a technology that allows you to scale. And when you...

Auren Hoffman (38:49.25)

yeah, interesting.

Matt (38:52.073)

couple that with social institutions, like, in Europe, the Catholic Church, that creates these early opportunities for scale. Like, if you could become a bishop or an archbishop... it's actually relatively, I don't want to say it's meritocratic in the modern sense, but there really are examples from the medieval period in Europe where people from very humble origins get to become these very powerful people through the church. And so my sort of joke that I use when I go speak to rooms of computer science students, which is what I do a lot, is like, you know,

Auren Hoffman (39:06.06)

Yeah.

Auren Hoffman (39:15.768)

Yep.

Matt (39:21.747)

It's really great you're all here learning the most important technology of ambition of your time. And a thousand years ago, we'd have been having the same conversation, but this would be a monastery, and you'd all be learning to be, you know... And I really believe that. I think, if you look at the history of ambition in Europe, so many ordinary people go through the church, which is one of the reasons I think the church was not a particularly moral place for much of the medieval period: it was the people there.

Auren Hoffman (39:31.122)

Alright, right.

Auren Hoffman (39:46.434)

But as an aside, there's all these different things written about birth order and stuff like that. Historically, the people that went to the church were not the oldest. And there's all these books like Born to Rebel, like maybe the one who's not the oldest is the one that's more likely to be the entrepreneur. How do you think about that?

Matt (40:07.049)

Yeah, I mean, on birth order itself, we've looked at this quite a bit in our own data, and we can't find anything. But there's obviously a bit of a selection effect. Again, it may be a Berkson's paradox thing, where, you know, in the population, it's different. But I think there is definitely something about institutions that are... because the other thing, at least in medieval England, is not only did you have relatively few opportunities, but if you were born into a family that had some...

Auren Hoffman (40:12.888)

Can't find anything, okay.

Auren Hoffman (40:19.309)

Yeah.

Matt (40:36.181)

Because in England, unlike in Europe, we had primogeniture, where the eldest son gets everything and the others basically get nothing. It means that for ambitious second and third sons, you really had to hustle, because you're not getting it. Yeah, and so, I have not looked at that, but I suspect there is a big effect.

Auren Hoffman (40:41.144)

Yeah.

Auren Hoffman (40:47.736)

You really gotta do some stuff. Yeah.

Auren Hoffman (40:53.878)

Okay, interesting. And today, you basically think, just in general, if you're an ambitious business person today, there's a high likelihood you're going to end up in tech, kind of, yeah.

Matt (41:09.461)

I think so. You know, the question is always, where's the leverage? And for most of the 20th century, I think finance probably was the right answer. Like, you write a check in New York and it reverberates around the world. Today, it feels like tech. We've already talked about people like Mark Zuckerberg. I always like to say, Napoleon would be green with envy if he saw the reach that Mark Zuckerberg has.

Auren Hoffman (41:26.264)

Yeah.

Auren Hoffman (41:31.554)

Totally. Now how do you think about politics? Because when I meet the average congressperson in the US or the average member of parliament in the UK, they don't strike me as particularly ambitious people. They usually seem fairly content. I mean, yes, they'd be happy to be prime minister or president if someone allowed them to be, but they don't actually seem that ambitious.

Matt (41:53.844)

Hmm.

Matt (41:57.727)

I think in the UK, at least, politics is a funny mix of megalomania and its opposite. Because of the way... I guess it's very similar actually to being a member of the House, because it is actually a very local game, you get a lot of people who are genuinely in it to try and serve their communities. I'm not saying that's not true of the ambitious as well. And then there are others that will do whatever it takes to get to the top. I do think that...

Auren Hoffman (42:03.928)

Mm

Auren Hoffman (42:10.615)

Yep.

Auren Hoffman (42:17.751)

Yeah.

Matt (42:25.711)

The US is a bit different, because the US remains so powerful on the world stage. But in countries outside the US, I think there probably has been a perceived reduction in the power that comes from getting to the top, even in those countries. And as a result, I think it probably does, at the margin, not attract people who are as ambitious.

Auren Hoffman (42:38.989)

Right.

Auren Hoffman (42:43.182)

You're the top person in some random country, and you might be equivalent to a US congressperson in terms of actual power.

Matt (42:53.299)

Right. You know, I don't know what the system of government actually is in Luxembourg, but imagine you end up being the finance minister of Luxembourg. It's like a powerful position, but I don't know, it's probably better to be, like, CEO minus one at an investment bank or something, if what you care about is power.

Auren Hoffman (43:09.846)

Yeah, yeah, totally. And what do you think is like the biggest leverage or the biggest kind of technology of the ambition today?

Matt (43:18.131)

One thing I've been thinking about is, you know, obviously a big part of my work now is AI, and I've done a lot of work for the UK government on that as well. And it's very clear that controlling important parts of the AI value chain is an incredible source of leverage and power today. I think what's really interesting is that, so far, the foundation model layer has attracted a lot of super ambitious people.

Clearly Sam Altman is an extraordinarily ambitious guy. And yeah, it's one of the reasons that Zuck has got into the foundation model game. It's clearly an important part. I think a thing that's not yet fully played out is that there are going to be important choke points in the AI value chain that are up for grabs. I mean, today, it sort of feels like the TSMC, ASML, Nvidia choke points are, like...

Auren Hoffman (43:47.758)

Correct. Clearly he's ambitious, yeah.

Auren Hoffman (43:54.178)

Yep.

Matt (44:14.741)

completely impregnable. But something I'm really excited about right now is the amount of innovation happening in the compute stack, and the number of people that are looking at that Nvidia market cap and those margins and being like, there's got to be alternative paradigms here. And I suspect that we're going to see some very, very ambitious people attracted to these choke points in the AI value chain, where the ability to turn those on and off gives you a seat at the table, anyway you...

Auren Hoffman (44:43.118)

How do you think of this kind of paradigm versus people who know a lot of other people, who are connected to a lot of other people, who are kind of nodes? And then there's people who just have a lot of knowledge. It's kind of the old paradigm of what you know versus who you know. How do you think about that?

Matt (45:05.671)

Yeah, I mean, it sort of depends a little bit on... this is where I think it depends a bit on comparative advantage. I think there are domains where each dominates. Certainly in politics, I do think the ability to build coalitions remains probably the most important thing, and that feels like a who-you-know, not a what-you-know thing. But if I go to the point I was making about choke points in, you know, the AI supply chain or whatever,

I suspect that the people that win in that are just gonna be people who have a really, really clear idea about exactly where to play, and that's a what-you-know thing. I mean, one thing I think in general is it's interesting to look at how things are playing out with the EU versus the big tech companies. And this feels to me like one of the defining struggles of our time, less because it actually matters so much at the object level and more because

I do think this politics-versus-technology thing is going to be a tension that is increasingly important as technological capabilities grow. And it's happening in the US as well.

Auren Hoffman (46:12.534)

Yeah, I mean, it's happening in the US as well. And anti-tech is a fairly bipartisan thing in the US.

Matt (46:21.215)

Yeah, and I would say, to caricature, and clearly there are tons of exceptions to this, tech is a little bit more what you know, at the margin, and politics is a bit more who you know. And I think that's one of the ways in which... one of the things that's been really interesting to see play out over the last year is, I think, certainly Silicon Valley, but tech more broadly, starting to take politics more seriously. And in a way that I think

Auren Hoffman (46:30.474)

Mm hmm. Yep. For sure. Yeah.

Matt (46:47.967)

that was quite slow. I think it took a long time for that reaction to be really catalyzed. And I think a lot of it's because it's just a different mode of operating. It's a who you know, not a what you know thing.

Auren Hoffman (46:58.796)

Now, speaking of government-related stuff, you're the founding chair of ARIA, which is the UK's Advanced Research and Invention Agency. And it's kind of modeled on DARPA, right? I know you're a big student of DARPA and a big fan of DARPA, but how do you think about this next evolution with ARIA?

Matt (47:07.923)

Yeah, it is.

Matt (47:19.913)

Yeah, well, I think ARIA is designed to learn from DARPA, but bring, I guess, a British take on it that plays to our strengths. I was going to say the single biggest difference is that it doesn't have the D. Honestly, we obviously had a choice there. We could have been DARIA, I guess; maybe we would have picked a different name.

Auren Hoffman (47:36.032)

Obviously it doesn't have the D in it, right? Which is...

Matt (47:49.173)

But the reason we didn't, I think, is at least partly about scale. One thing that DoD is really good at is scale, right? For better or worse. And we just don't have that scale. Right, maybe we would. And so I think, if you look at what are some of the things that have really made DARPA successful, it is this pull-through from DoD, this sense that there is a customer for this future capability.

Auren Hoffman (47:52.941)

Yeah.

Yeah.

Auren Hoffman (48:01.326)

Yeah, maybe in 1890 you would be DARIA.

Matt (48:18.879)

And as a result, you know, the sort of scientific and technological work that is funded by DARPA has always sort of got this pull towards something the military actually want and will use. We've obviously forgone that with ARIA, and so we've had to think about what are the other mechanisms you can create to have that pull. And, you know, the CEO, Ilan Gur, who's a really wonderful guy and has done a phenomenal job setting it...

Auren Hoffman (48:47.916)

Yeah, and who we've both known for a very long time. Yeah. He's great. Great, great guy. Yeah.

Matt (48:49.835)

Exactly. He has this idea that the way to do this in the UK, building on UK strengths, and also recognizing that we don't have that, you know, whatever it is, $800, $900 billion a year customer, is to really put the emphasis on scientific entrepreneurship. In other words, our scaling mode is to partner with entrepreneurs that take the ARIA ideas and make them big. And so, you know, that's a bet.

It's a bet that we've made very deliberately and consciously. It's partly a bet because we see that it's an area of emerging strength for the UK, but also one where we think ARIA can help build that ecosystem as well.

Auren Hoffman (49:34.594)

You know, the UK is also, to me, a weird place, where it obviously has all this national pride and stuff like that. But every once in a while, when there's a key government role, it often grabs people who maybe have never even lived in England or something like that. You know, in the United States,

Matt (49:41.867)

to all of his art.

Auren Hoffman (49:59.912)

you may not have been born in the United States, but you usually have to be a citizen to get a key role in government, right? You'd never have a head of the Fed who wasn't a citizen, or a head of DARPA who wasn't a citizen. But Ilan, he's from Pittsburgh; he's an American, right? And you had Mark Carney, who's a Canadian. What is it about the UK that makes it so open? A company will take someone from anywhere, and the UK is kind of like that in a way.

Matt (50:05.14)

Yeah.

Matt (50:19.455)

Yeah?

Matt (50:29.491)

Yeah, I think there are a few things happening there. I think one is that it actually is just a historical strength of the UK to be extremely open. And, you know, I do wonder. Right.

Auren Hoffman (50:44.91)

In the US you'd never even get a security clearance. If you weren't a citizen, how do you get a security clearance?

Matt (50:50.919)

Yeah, I mean, there are various rules in the UK about how you can get security clearances, and obviously there are some things you can't do without one, but there are always exceptions. I do think a lot of it is that we're still really good in the UK at thinking about elite talent, and thinking about that as being...

Auren Hoffman (50:59.544)

Some exception somewhere, yeah.

Auren Hoffman (51:11.309)

Yeah.

That's the core advantage, and it has to be, right? Because it's a mid-sized country and...

Matt (51:15.987)

Yeah. It's a mid-sized country with a small number of institutions that are still globally magnetic for top talent: Oxford University, Cambridge University, DeepMind, I hope ARIA, and, you know, I do think EF. Yeah. Actually, I'm starting a new program at EF focused on AI, in London, and one thing I am proud of is...

Auren Hoffman (51:29.271)

Yeah.

Yeah, hopefully. Yeah.

Matt (51:45.119)

There are several people who've moved from the US for it, which I think is great. So I think there is just this thing that, although the politics of immigration in the UK are as messy as they are anywhere, in general that is not applied at the highest level of jobs. And I hope that stays for a long time.

Auren Hoffman (51:48.375)

That's awesome.

Auren Hoffman (52:08.524)

Yeah. Because in the US, if you're a great athlete, they'd love you to come in and be on a team. If you want to play for the New York Yankees and you're not American, great, I don't care. And if you want to join a hedge fund, great, we'll let you in. That's fine. But they just don't do it for government. And it's just an odd thing.

Matt (52:12.917)

Right.

Matt (52:26.133)

I think our government is quite good at this sort of porousness, or maybe permeability is a better word. A lot of the work I've done in government has been about these things at the cutting edge of technology, really trying to create an interface between the public and the private sector. I was very involved in

a lot of the UK's work on state capacity and AI, and I helped create the institution that became the AI Safety Institute, which has attracted really high-level people out of DeepMind and OpenAI. And again, it required a lot of internal navigation, but these people are doing what's effectively a national security type role in government, wherever they're from. And I think there's still a sort of pragmatism at the heart of our approach to this, which...

I don't know, my own experience has been like that. I work at EF, that's my full-time job, it's where most of my time goes, but I've been able to do some reasonably significant government roles, either on the side or in the background.

Auren Hoffman (53:31.532)

Or you're working a day a week, right, for the UK government. I don't really know anyone in the US who does that. I know there are a few here and there, but it's incredibly difficult to have a part-time job working for the government in the US, whereas it seems like the UK is more open to those types of things.

Matt (53:50.795)

I think that's right. I guess one thing I would emphasize, though, is that ARIA is very weird and unusual. I mean that in a hugely positive way, and I can say this because...

Auren Hoffman (53:57.388)

Yeah, because you're not on the government pay scale, right? You have a whole bunch of things that are different.

Matt (54:02.793)

Right. Yeah. And, you know, I wasn't there when this bit happened, so I can praise it to high heavens without taking any credit: it required an act of parliament to set ARIA up, and the civil service team that was there before Ilan and I arrived did that. And if you read the ARIA Act, the remarkable thing about it is, one, it's very short. It's the shortest act of parliament I've ever seen, about four pages long, but it really communicates an intent

Auren Hoffman (54:28.386)

Yeah, wow.

Matt (54:31.669)

to do things differently, that ARIA will be exempt, as you say, from a number of rules that normally apply to public sector institutions. But the most important thing in the ARIA Act, in my view, is that it deliberately and explicitly talks about what risk appetite it should have. Because I think a lot of what you're talking about, and a lot of the challenges to even bringing in external people, are ultimately about a perception of risk.

Auren Hoffman (54:55.864)

Yeah.

Matt (54:58.643)

And, you know, one thing that ARIA has been very good at is saying: we have a mandate from parliament to take calculated risks, and we're going to do that in all appropriate areas. Obviously we're not taking risks with security or with compliance. Exactly. Which is very hard to do in the public sector in general.

Auren Hoffman (55:12.332)

Yeah. But you're willing to make investments that don't pay off, essentially. Yeah.

And if you think of DARPA, historically it's been so successful. You have major computing breakthroughs, obviously the internet, GPS; you have so many other things that have come out of DARPA, at least historically. I'm not as familiar with what it's doing today, but it does seem like it's not what it once was. So where do you look at DARPA and say, wow, they did something, and where...

Do these things just have to get calcified over time?

Matt (55:55.883)

I don't know enough about DARPA today. I've read a lot about the history of DARPA. I would say, though, that I'm very bullish on DARPA. One thing I think they've done really well is holding the bar on being willing to say things take 10 years, and sometimes longer, to know. And I do think a really...

Auren Hoffman (56:18.754)

Yeah.

Matt (56:22.409)

It's such an amazing story, reading the story of mRNA vaccines and the crucial role that DARPA played in them. Until the pandemic, we didn't really appreciate what an incredible thing had happened. And yet without DARPA, I think it's pretty plausible that we would not have had mRNA vaccines at the time we needed them. And this is the thing: sometimes these projects take so long to come through because the world's not ready. And, you know, that's kind of

the point of it in a lot of ways. Less as a comment on DARPA, where I don't feel I have the context to comment, I do think that one amazing thing about building things from scratch, in the public or the private sector, is that the energy of a new thing and the chance to shape it gives you an unfair advantage in attracting talent. And...

Auren Hoffman (57:11.383)

Yeah.

Yeah, the best, super-ambitious people are like, I'm going to join this because I can actually do things. The last thing you want to do is deal with bureaucracy. If I'm going to spend 10 hours a week on bureaucracy and it goes down to four, that's huge. And in reality it's probably more like 30 versus five, right? So then it's like, well, I might as well go to the place where I can actually do stuff.

Matt (57:29.034)

Yeah.

Matt (57:37.355)

Exactly. That said, I would say that I think DARPA still has a lot of pull from a talent perspective. Yeah. And I think one thing they've done very well there, which we have at ARIA as well, is explicitly time-limiting it. You can't spend 20 years there in a stint on a program; you have a fixed period of time to have an impact. I think that makes a big difference to how people think about it.

Auren Hoffman (57:43.884)

Yeah, that's true. A lot of super smart people go there.

Auren Hoffman (58:05.789)

Yeah, interesting. Now there's been...

Matt (58:07.019)

This idea of tours of duty is quite a powerful one, I think.

Auren Hoffman (58:10.722)

There have been these successful labs over the years. Obviously, you have Skunk Works, which is part of Lockheed Martin, which is legendary. You've got Bell Labs, which is maybe the most famous, and Xerox PARC. Now we've got Microsoft Research, Google has X. You have all these different things out there doing all these really amazing things.

Is it possible that we're just going to see... They're oftentimes thinking decades and decades into the future, which you wouldn't think a public company could do, but these guys are doing it. So are we going to see more innovation happening in these kinds of more for-profit labs?

Matt (58:54.333)

Yeah, I mean, I think the thing that's hard is that in order to do that, you need a business that is so secure. Basically, I mean, Bell literally had... yeah, right. And I think in general, the hard thing about genuinely risk-taking, genuinely long-term innovation, which might take a decade or more to pay out, is that the economic model for it is really hard to underwrite on a rational basis. And so you do sort of need these...

Auren Hoffman (59:00.29)

A monopoly. Yeah, like AT&T. Yeah.

Auren Hoffman (59:19.256)

Yep.

Matt (59:23.399)

unreasonable cash flows to pay for it. And it's not just that you need the financial ability to pay for it; you need enough buffer and insulation from the pressure of the market to be able to get away with it. And so one thing you might say is that if some of the AI companies we're seeing now become really great businesses, then that creates another of these buffers that allows people to really expand.

Auren Hoffman (59:35.02)

Yeah, to keep going. Okay, interesting. Now a couple

Auren Hoffman (59:52.278)

Yeah, a couple of personal questions. One of the things you do on the side is create these immersive murder mystery games. How did you get into that?

Matt (01:00:00.341)

So, as you may be able to tell from the middle part of this conversation, I have a real fascination with history, especially medieval history. And I've always been interested in simulation, in the sense of: if you reran this historical moment again, what would happen? I really believe that history is super contingent, and...

Auren Hoffman (01:00:17.383)

What would happen? Yeah, that's cool.

Auren Hoffman (01:00:23.714)

The butterfly effect of random little things.

Matt (01:00:27.263)

I guess a big part of my work at EF, and actually at ARIA, is about a huge belief in the power of individual agency. And so I got very obsessed...

Auren Hoffman (01:00:35.278)

Like, on the World War One kind of thing: do you think it was kind of destined to happen? Or do you think if Archduke Franz Ferdinand wasn't assassinated, it may have been... or maybe it would have been worse 10 years later? Like it was just going to happen at some point. Yeah, maybe not in 1914, but maybe by 1919 or something. Okay.

Matt (01:00:47.945)

Yeah, I think a version of it would have happened, probably. And, you know, it's certainly true that if it had happened later and in different ways, you might have ended up with a very different outcome in Germany in the middle of the 20th century, for example. So it might not be that you could avoid World War I, but maybe you avoid World War II, or maybe World War II is much more limited in important ways.

Auren Hoffman (01:01:04.428)

Different outcomes. Yeah, for sure.

Auren Hoffman (01:01:12.782)

Correct, yeah. A lot of people make the same point: World War I and World War II are really the same kind of war together, right?

Matt (01:01:20.507)

Absolutely. And I do think, clearly, Hitler is this totally bizarre and obviously horrific but extraordinary figure. If you rerun the timeline, you probably get something like that, but not him, and it not being him is actually extremely consequential. So there's lots in that. Anyway, I love these questions. A lot of professional historians hate counterfactual history. They see it as this...

Auren Hoffman (01:01:40.248)

Correct.

Auren Hoffman (01:01:44.013)

Yeah.

Matt (01:01:48.511)

tedious parlor game, but I always loved it. And so my murder mysteries are always historical, and they're always very thinly veiled opportunities to rerun important moments in history. I have one set in Reformation France at a crucial moment: who's going to win? Is the king going to be a Protestant or a Catholic? I have one set, to your question, at the brink of the First World War, in Central Europe.

I have one in interwar France. I basically do these because I hate scripted things. So I give people characters and goals and incentives and information and then let it run. It's a bit like, yeah, D&D, but for historical moments. It's great fun. And one thing...

Auren Hoffman (01:02:32.366)

So you're kind of like the dungeon master, in a way. Yeah.

Auren Hoffman (01:02:42.03)

All right, next time I'm in London, I'm inviting myself to one of these things. Yeah, yeah.

Matt (01:02:44.467)

You must, you must. One thing that's been great is that my wife is a professor of ancient Greek, and she's a fellow at one of the colleges here in Oxford. There's a professor at her college who's also obsessed with murder mystery games, and he's let me host these things at Christ Church in Oxford, which is a beautiful setting for it. So it's been really cool. So next time you're in town, Auren, we will do it.

Auren Hoffman (01:02:59.77)

amazing.

Whoa, that's so cool.

Auren Hoffman (01:03:07.382)

Or, what is a conspiracy theory you believe?

Matt (01:03:10.664)

Well, I think in general, I believe that very large organizations like governments are much easier to influence than a lot of people think, if you have a really concerted conspiracy from the right people. I'm trying to think of a specific one that I believe. I would say...

Matt (01:03:43.979)

I'm trying to think what I can say in public.

Auren Hoffman (01:03:45.325)

Ha ha ha

Matt (01:03:55.883)

There's certainly a lot of indicative evidence that... I probably shouldn't say that. I don't know.

Auren Hoffman (01:04:05.516)

It's interesting, because I've always thought that certain governments are very good at influencing other governments. And I've always thought that as the UK declined in power, it got very good at influencing America. If you look at the 1950s, it seems like a lot of things the US did, like trying to overthrow the government in Iran, seemed

Matt (01:04:15.039)

Yeah.

Auren Hoffman (01:04:32.45)

very influenced by a small number of British people who planted the seeds, who planted the seeds with the CIA. So I feel like they were very good at manipulating America, not in a bad way, because they were doing it in what they saw as the best interest, into doing what they couldn't do on their own.

Matt (01:04:49.835)

Yeah, and it's interesting how many things that would have seemed like conspiracy theories turn out to be true, right? Like some of the stuff the CIA did with funding art. So I'll tell you what I will say: I think 10 years from now, we'll look back and say that some of the weird stuff that happened, and will happen, in AI over the next

Auren Hoffman (01:05:01.794)

yeah, totally. Yeah.

Matt (01:05:19.723)

10 years will turn out to be much more the result of conspiratorial activity by state actors than it appears today.

Auren Hoffman (01:05:28.992)

Okay, that's awesome. Next time we're together, I'm going to talk to you about this. Last question, which we ask all of our guests: what conventional wisdom or advice do you think is generally bad advice?

Matt (01:05:39.231)

I'll give you a European answer, which is: I think there's a very annoying thing that happens in Europe with entrepreneurs, which is that a lot of entrepreneurs get the advice, that's a good idea, but do something just a little bit less ambitious, because then you're more likely to succeed. Well, yeah, I mean, it's implicit. Unfortunately, I think it's still the implicit advice. And actually, I think a lot of people

Auren Hoffman (01:05:40.983)

Okay.

Auren Hoffman (01:05:55.876)

Really? This is still the advice people get? I thought that was the 90s. I didn't realize that was still going on in Europe. Okay, interesting.

Matt (01:06:06.313)

have an intuition that there should be an inverse correlation between odds of success and ambition: if you're going to do something really hard, you're less likely to succeed. So I get that there's an intuition there, but I think even that version is wrong. I think it's actually much easier, because in the end it comes down to: can you attract talent and can you attract capital? And guess what? They're attracted to ambition. And so...

Auren Hoffman (01:06:16.258)

Yeah. right, right.

Auren Hoffman (01:06:23.906)

God, it might be easier to succeed by being more ambitious. Yeah.

Yeah.

Matt (01:06:33.993)

I think that intuition is just wrong and it gets amplified into very bad advice.

Auren Hoffman (01:06:38.987)

I love that. That's really interesting. Okay. Thank you, Matt Clifford, for joining us on World of DaaS. By the way, I follow you at @matthewclifford on X or Twitter. I definitely encourage our listeners to engage with you there. This has been a ton of fun.

Matt (01:06:51.763)

It's been great. Thank you so much for having me.
