Simon Brown (00:04.844) Hello and welcome to this episode of the Curious Advantage podcast. My name is Simon Brown. I'm one of the co-authors of the book, The Curious Advantage. And today I'm here with my co-author, Paul Ashcroft. Unfortunately, Garrick can't be with us today, but we are delighted to be joined by Ari Popper. Ari, welcome.
Paul (00:10.723) Hello.
Ari Popper (00:26.026) Simon, great to be here guys.
Simon Brown (00:28.88) That's great to have you with us. Looking forward to a fascinating conversation on what the future might look like. So, Ari, maybe to kick us off, you've had a fascinating journey combining innovation, storytelling, science fiction. What first took you down the path and what inspired the creation of your organisation, SciFutures?
Ari Popper (00:53.23) Yeah, thanks. Well, it's great to be here, guys. Really excited to talk to you and share some insights. But yeah, how did I get started on this crazy journey? Well, a midlife crisis, actually. Exactly. I was working for a great company. I was the president of a research company, doing quite nicely, and just
Simon Brown (01:06.917) It's a good way to get started.
Ari Popper (01:22.092) had no idea what I wanted to do next with my career, like clueless. But I knew I loved science fiction. I loved watching it and it always stimulated me. I loved innovation. I loved working with clients, helping them solve problems. And somehow, someway, I found myself into a science fiction writing class at UCLA. And one day, while writing science fiction stories, for fun,
I had an epiphany to start SciFutures. And you know what it's like when you start something: it's an itch that you can't stop scratching. And it was one of those itches. That's how I got on my journey, about 14 years ago now. Yeah. Yeah.
Paul (02:09.122) What does SciFutures do, in a nutshell? Tell us all.
Ari Popper (02:16.225) Yeah, we help leaders, we help our clients understand where the world's going with a combination of research and strategy, but our secret source is we bring the future to life in a way that creates epiphanies with how leaders need to how R &D needs to change, and that creates meaningful action in the present.
So over the last 13 years, working with over 50 or 60 of the Fortune 100, we figured out that in order to get meaningful action, meaningful change, you've got to appeal to the head and the heart. And we figured out how to do that. So that, in a nutshell, is basically what we do.
Simon Brown (03:10.377) Which means you need to have an understanding of, or at least an idea of, where the world is going in order to do that. So where is the world going?
Ari Popper (03:21.939) Gosh, yeah. You know, we're not trying to predict the future. What we're trying to do is inspire meaningful change in the present to create preferred futures. That's what we're trying to do. And we're trying to create more meaningful action, whether it's leadership,
or whether it's innovation transformation, we're trying to get our clients to really leverage the opportunities that are out there. And the way we do that is through a future-back methodology. So, looking at the emerging signals, the trends, the technologies, which change hourly now; they used to change monthly, or every six months.
We create opportunity areas, usually three to five years into the future: not too far, because it breaks down; not too close, or it's useless. We create those opportunities, inspire visions, narrative visions of what those opportunities are, and then work backwards from there to create meaningful change in the present. Yeah, so that's how you get to meaningful transformation today, or meaningful
leadership epiphanies today.
Paul (04:54.322) I'm fascinated by this. As a part-time science fiction writer myself, I'd love to pick your brains. Whenever, in my experience, you ask a group to imagine their future, you often get deer-in-the-headlights looks. It's very difficult to step outside of your norm, even though, and I always go back to this quote, the future's already here, it's just...
Ari Popper (04:55.947) Yeah. Yeah.
Nice. Yeah. Yeah.
Ari Popper (05:09.601) Yeah. Yeah.
Paul (05:18.424) and not evenly distributed, right? How do you get people to break out of that thinking and actually step into the future? And is that part of, as you say, where the fiction comes in and the creative side comes in?
Ari Popper (05:32.578) Yeah, it's a great insight, Paul. And I'd love to read your stories, by the way. But it's a great insight. Yeah, you're right. Most people aren't very good at telling stories about the future. Some of us just have a natural knack for it, but it's quite rare. But there are two functions that telling stories about the future serves.
Paul (05:37.216) Thank you.
Ari Popper (06:01.363) One is to integrate understanding of the opportunity areas and the technologies. So by going through these writing exercises, and we have lots of them with our clients in a workshop, whether it's our leaders or our innovation clients, that it actually helps them process in a different way the information about the future. And we can connect, we can create context, because now you have to pull in the different areas, we can create connection with the future.
by making it personal, having characters that they can relate to. And it's also creative; you can occasionally come up with interesting ideas by writing about it. So just the act of doing it has a lot of benefit. But we don't rely on our clients, or the average person, and I mean that with respect, to come up with the visions. That's where we go to the experts. And we had to learn that painfully through the life
of SciFutures. We started out and we were like, everyone's going to be a science fiction writer and come up with these wonderful ideas. And no, most of them are terrible. But when you have talented science fiction writers like the ones who work for us, then you can rely on them. You brief them properly, and then you can rely on them to come up with wonderful visions. And we try and keep it interesting. We try to diversify the talent that we work with. We mix it up.
Ari Popper (07:28.621) Yeah, when we started out, we had 350, 400 science fiction writers as part of our database. And you can go to scifutures.com slash writers and sign up. I mean, we'll look at your work and give you a go. But we ultimately... Perfect. Perfect. You're welcome. Yeah, but anyway, I'll stop talking.
Simon Brown (07:46.005) Paul's found his new side gig now. He's going to be writing sci-fi for SciFutures.
Ari Popper (07:58.412) said a lot about that topic.
Simon Brown (08:00.428) And I know from when we were together, Ari, that you've been very effective at predicting the future for certain clients as well. So there may be one or two examples you can share, where you looked at five or more years into the future, some you shared with me went even further, and it's actually manifested.
Ari Popper (08:20.544) Yeah.
Yeah, you know, I think we've had some great successes. I think there are also misses that we've had. But like I said, the name of the game isn't really to get it completely right; it's to change the present. But we do occasionally get it right. So, digital agents, AI agents. For the life of me, I just couldn't understand why people, this is six, seven years ago, why our clients and just people generally
couldn't understand that this was just going to happen, that it was inevitable. And you can go back and you can see posts of mine. I actually did.
Paul (08:57.578) I've got to say, just to interrupt, but when you say it was inevitable, you must be in a very small population of people who thought so. It's taken everybody else by complete surprise in the past 12, 24 months, right?
Ari Popper (09:04.823) Yeah.
Ari Popper (09:13.261) Yeah, and not for me really. Like I was just like, yeah, we're going to have an A2A, an algorithmic economy. AI agents are going to talk with other AI agents. It's just so bleedingly obvious to me. I didn't know the timing exactly, but I felt, frankly, I think it came quicker than I thought. But it just seemed obvious. There's an old video of me on YouTube talking about, we'll have AI agents that will go on dating apps for us. And actually,
find matches for us. And those matches will be other AIs, and they'll kind of date each other. There's a Black Mirror episode like that, which is fantastic. And all of a sudden, your AI has done all the work for you. And then the CEO of Bumble came out and said they were working on that, and I was like, wow. So, you know, that's one example. I think where we got it wrong as well is, when I started,
Simon Brown (10:05.355) Yeah.
Ari Popper (10:13.757) I was quite optimistic about the impact of technology, particularly social media on connection and bringing us together, breaking down barriers and understanding each other. I was quite excited about the potential of technology to unite us. And I think it's done the exact opposite, which is a tragedy, frankly, and getting worse. So I think...
We don't always get it right, and certainly in that instance I got it wrong. I wish my naivety had come true, I guess. But we can learn from these experiences. I know we'll talk more about how we get organizations ready for the future, but we can learn from these experiences as well, particularly as things are changing so
damn fast and in powerful ways.
Paul (11:15.184) I wonder if it's a bias of ours to be optimists about the future, and so we naturally bias towards an optimistic outcome.
Paul (11:25.821) Maybe we'll come back to that. I wonder... it's fascinating. You said you look three to five years out, and Simon and I were reading a paper about AI 2027, and already some of the things talked about in that seem very far away, yet scarily close. What are some of the things that you're tracking, paying attention to now, that you think might be influencing our lives in two, three, maybe five years' time?
Paul (11:54.557) What are some of the key ones?
Ari Popper (11:57.184) Yeah, we do a lot of work in... well, AI is essentially consuming all the oxygen in the room, right? And frankly, it should, because we've got to keep our eye on it. So obviously, AI is massive. But it's the second-order and third-order effects that we're trying to anticipate. And that you can do. That's a superpower that organizations
can deploy to their competitive advantage, and for leadership development as well. And you can do that, as you know, being a science fiction writer: when you start to write little narratives, little sketches, grounded in the technology and what's happening, you start to bump up against these different types of futures. And you start to, I wouldn't call them scenarios, but you start to kind of uncover,
yeah, that could happen. So take Cambridge Analytica, which was an absolute disaster, and frankly, anyway, don't get me ranting about that. But that was something that we could anticipate, and in fact, we did. It's like: what if you knew everything about a person, and you knew everything they liked? And what if you could use an algorithm to reverse-engineer their personality, so you know what their hot buttons are and how to manipulate them?
Ari Popper (13:24.879) and what if that fell into the of somebody who could use that information to persuade them to vote a certain way? Now all of that was possible before it happened. it's thought of that type of thought exercises and scenarios and kind of second order, third order, you know, effects that you can, that we do that help our clients get ready for where the world is in three to five years.
So what if AI agents are talking with AI agents? How do you market your products to your customers? You know, yesterday I posted on LinkedIn; I'd read an article in the Wall Street Journal that Google searches were down, I think it was 5%, recently. And again, I've been banging on about that for years. People aren't going to use Google; they're going to ask their agent, and their agent is going to go to the agentic marketplace. So how do you get your brands there? How do you make sure
you're relevant in that commercial environment.
Simon Brown (14:26.611) We're seeing the start of that with shopping features being added into ChatGPT now. That instead of going to Google, you go to your AI and you get your shopping done, and it can do product comparisons. It certainly seems like we'll be there before very long. Yeah.
Ari Popper (14:32.16) Exactly.
Ari Popper (14:39.232) Exactly.
Exactly. And this is the superpower that you can deploy. Let's take emerging technologies, let's take creative people, let's run scenarios, and let's figure out the second-order and third-order effects. And you can use it for an advantage, for innovation, inventing new things. If I don't need to advertise as much on Google now, I'm going to need to...
have some kind of way of getting into the agentic marketplace, so let's start a business or let's build a platform. So you could do it that way. But you could also anticipate the negative effects: the Cambridge Analyticas, the kind of loneliness epidemic that comes from social media hacking our attention, like social AI, something we spoke about recently, that basically maps the human operating system in order to exploit it.
You know, you could start to get organizations ready for that, for those ethical and kind of big-picture challenges. And just the last point on this: when I started SciFutures, a lot of the work that I was doing with senior teams, we would have conversations about innovation, about product, about business. But these days, it's about what it means to be a human. What is it?
Ari Popper (16:09.889) How do I protect my children's future? How do I educate my kids? I had this great client I got three, four times a year to New York, and then they take me out for dinner and I meet their leaders that we work with. And kids are not, every single time without fail over dinner, at least two or three people will ask me about their children's future. How do I educate my child in this environment? So the conversation is shifted from, you know,
brass tacks innovation to what does it mean to be human? How do we be ethical? What's our humanity? Yeah.
Paul (16:46.61) Do you have an answer to that, Ari, when they ask you over dinner? What are some of the strengths, some of the things that we should focus on? And, I have kids, Simon has kids, how do we educate them to be prepared?
Simon Brown (16:49.513) You know what I'm talking about.
Ari Popper (16:54.284) Yeah.
Yeah, you know, it's a very difficult question to answer properly because of the pace of change. But what I usually say, and it's what we said watching my own child choose what to study, is: don't go too deep in any technical area that can be automated. Go broad.
And to your guys' point, teach adaptability and curiosity. That's what's most important at this point. My knee-jerk reaction a few years ago used to be, well, humans have these strong social skills, and human bonds and human connections are never going to be replaced. But even that one is a bit... I saw a study, I don't know if you guys saw it,
Simon Brown (17:34.08) Yep.
Ari Popper (17:58.294) where on Reddit they created AI redditors. And they were six times more likely to convince people to change their beliefs than humans. Did you see that one, Simon? Yeah. Yeah.
Paul (18:13.325) Wow.
Simon Brown (18:14.919) I, yes, I think I did. I'm just trying to recall that. Yes, because they spent the time listening and debating and managed to move their, yeah, move their opinions over. Yeah.
Ari Popper (18:27.819) Yeah. And they did that without asking people for permission, which is obviously unethical. But nevertheless, my point being is that social connection, human skills, that's called skills, abilities, that can be done by AI too. So roundabout way back to answering your question is adaptability, flexibility, curiosity.
Simon Brown (18:36.414) Yeah.
Ari Popper (18:56.703) knowing enough, but not like I wouldn't want to be a language translator right now, you know, or, I'm also semi convinced that I had an epiphany about a year ago, just that you're either working with AI completely and just embrace it or do something that doesn't involve AI. that would be like massage therapy. My daughter's an opera singer.
You know, those sorts of jobs, you're pretty safe.
Simon Brown (19:29.161) Yeah, like handmade pots I saw the other day. It's something where the human touch is actually the thing that's the differentiator.
Ari Popper (19:32.843) Yeah. Yeah.
Ari Popper (19:38.922) Exactly. Yeah. And there's an analogy that I'm starting to form in my mind. I don't know if it's complete, but you guys challenge it for me. I feel like with AI, we're going to get factory-produced ideas and content, human plus AI. And it's kind of like mass-market factory food: good enough, pretty good, actually getting very good. And then we'll have artisanal, which is human,
handcrafted, purely human. And I feel like there'll still be a demand for artisanal; it'll just be way smaller, like it is today. Most people are eating factory. I don't know, you tell me if that's a good analogy, but it just feels like where we're heading.
Paul (20:27.692) Like a mainstream lager and craft lager, huh?
Ari Popper (20:30.765) Exactly.
Simon Brown (20:32.875) I remember we were having this discussion maybe a year or two ago, where we were asking ChatGPT to give us ideas on strategy, and it came out with something for a workshop we were having on strategy. And we just sort of looked at it, like, do we really need this workshop? But then it was, well, if everyone is getting that same strategy, it's, at a point, the vanilla strategy, and not everyone will be
Ari Popper (20:51.863) Yeah.
Simon Brown (20:59.517) successful if everyone is using the same vanilla strategy. then it's how do you provoke that to get something that is actually differentiating and that probably needs the human direction to say, okay, there's the vanilla, but we need to go make a conscious decision to go to a different direction than that because we believe that that's gonna be the successful path.
Ari Popper (21:21.931) Yeah, definitely. But also you can follow that to its logical extension. if everyone's getting roughly the same strategy, the human is the differentiator. But then it also becomes the battle of the algorithms. So a lot of organizations are developing their own. They're using these large language models outside, but then they're plugging on their own specially trained models, right?
And then it becomes a battle of the algorithms: how well you train it, how much you get out of it, et cetera, et cetera. Fascinating. Yeah.
Simon Brown (21:58.656) I was talking to one of the AI organizations the other day, and they had a similar piece around what skills for your kids, and ethics and philosophy were ones that came out there as well. So similar: it's the softer skills, it's adaptability, not holding too tightly onto the truth. Yeah, very much so.
Ari Popper (22:21.985) Yeah, it's... yeah, please.
Paul (22:24.439) Ari, the other... go on, sorry, finish.
Ari Popper (22:29.269) No, I was just going to say, if we step back and become curious about the time we live in, it's like we're literally questioning what it means to be human. You go into L &D session, Simon, must have it all the time. You go in and you basically think you're going to talk about these sort of skills and what you can, and it inevitably veers off to, wow, what does it mean to be human? And that's where the philosophy and ethics come in.
Paul (22:59.223) Yeah, I mean, of course we agree. And I wanted to ask you about the other end of the spectrum: leaders. You work a lot with leaders in businesses. We were talking about how you educate people coming into work; what about the people that are leading work? What does this mean for them? The conversations that I've certainly been having are about huge change in terms of what it means to lead and what I need to learn as a leader. What are you seeing
Ari Popper (23:09.698) Yeah.
Paul (23:29.472) in that space.
Ari Popper (23:31.691) Yeah, it's
Ari Popper (23:36.61) We have to, one of the things that we do well at Sofutures is we're able to give them epiphanies about what the world's going to be like two to three years from now, or even four years from now. So it's through, so our leadership development programs basically go like this. Here are the signals, here are the emerging technologies, here's all the building blocks of the future. Okay. They create these opportunity areas.
about how you need to lead in a different era. So it could be: what is my organization going to look like? How are my customers going to behave? What is my career going to be like, and how do I need to manage it? What are my social and ethical responsibilities? So it opens up those areas. And then it's like, okay, now, time machine: travel into the future in each of those opportunity areas, and we give them an artifact from the future. This could be a podcast.
It could be a giant poster that's a mock-up of a dashboard interface three to five years from now, with all the different data sets coming in, et cetera. They react to that. And then we're like, okay, here are five or six leadership mindsets and skill sets that you need to use to be effective in this future. Some of them you've mentioned, like: the future's already
here, it's just not evenly distributed. We like to say the future is at the edge as well. So what does that mean? You've got to be constantly scanning and scouting. The next one is: innovation is a team sport. Okay, who is on your team? It's typically not who you need. What's the makeup of the team? Another one is: rehearse the future. As a leader, how do you practice for things that are coming? We spoke a little earlier about how you can anticipate. So we have these skill sets and mindsets.
We help them with that. And then we give them scenarios where they practice, and actually go through leading using these different skill sets and mindsets. So that's basically how we do it. But I think the pivotal part there is the epiphany. It's the: oh, now I get how agents are going to mediate commerce. Now I get how my biometric data
Ari Popper (26:02.189) can be used in a really positive way in a working environment to help me get the best out of myself, and what the ethical issues are and how I need to manage them. Yeah, so that's basically what we do.
Paul (26:15.168) I think epiphany is a perfect word for it. Just thinking about some of the conversations we have where people really are struggling, worried, in a fog basically about what could happen, and just reacting to the now. As you say, if you shine a light on that in some way, it would be an epiphany: now my strategy is clear, now I know what I need to do. Is this a bit about, I know you've got a lovely video where you talk about how to time travel on a budget,
Ari Popper (26:29.335) Yeah.
Paul (26:42.53) Is this a little bit about what you mean in terms of how to time travel on a budget?
Ari Popper (26:48.299) Yeah, it's basically.
Ari Popper (26:53.845) This is one of the bizarre ironies of the work that we do, is that the more powerful the epiphany, the less likely people will attribute it to something outside themselves. So it's a subtle but powerful transformation that happens internally, where suddenly they've realized and understood and had a conversion moment.
but they attribute it to themselves. What they haven't realized is that it came from going through the process: being immersed in these artifacts, time traveling into the future, seeing a video. Just like when I was a kid watching sci-fi, and as you know, Paul, as a sci-fi writer and a sci-fi fan, you have these moments where you suspend disbelief completely. All of a sudden that world is so immersive, you're in it, you're transformed. That's the epiphany: you're transformed. And then
you're kind of different once you come out of it. So from a business development and sales point of view, it's hard for me to sell that. Like, how do you sell that? But I think it's consistent with human psychology: the best way to get somebody to do something is for them to think the idea they had is their own idea, right? And it's a bit like that. That's kind of the mechanism that we're working with. Yeah, exactly.
Paul (28:04.83) That is...
Paul (28:18.118) The inception method, yes.
Ari Popper (28:23.019) Yes.
Simon Brown (28:24.619) So we're talking with Ari Popper. Ari is the founder and CEO of SciFutures, which is a foresight and innovation firm that uses science fiction prototyping to help organizations create meaningful change and epiphanies. With over 20 years' experience in marketing and innovation, Ari works with Fortune 500 clients like Visa, Ford and VMware. He's a lifelong sci-fi fan and writer, and he believes science fiction is a powerful way to make emerging technologies human, relatable and actionable, giving companies a competitive edge.
I'd like to come back to the mindsets from earlier, because I remember you took me through these previously and I only captured a few of them. So this time: the future is at the edge, innovation is a team sport, and rehearse the future. There are more, aren't there? Yep.
Ari Popper (28:57.538) Yeah.
Ari Popper (29:04.929) Yeah. Yeah. Mind the Gap. Mind the Gap is one of my favorites. need to open, actually I have them in front of me because I was taking a client through it the other day. And Distill the Challenge. Yeah. And Distill the Challenge was the other one. Yeah. What is it? What is it? Exactly.
Simon Brown (29:22.539) I remember these, very powerful ways to set up a framework to think about the future. Distill the Challenge, yep.
Ari Popper (29:34.061) that we're solving for. Einstein said if he had an hour to solve a problem, he'd spend, I can't remember exactly, 55 minutes thinking about the problem. Yeah. Yeah. Yeah.
Paul (29:40.591) Let me ask you about that, because that's fascinating. How do you get people to distill the challenge, to really identify what the nub of the problem is?
Ari Popper (29:50.988) Yeah.
Ari Popper (29:54.753) Yeah, that's a very good question.
Basically, they need to spend a lot more time thinking about what they're trying to solve rather than jumping immediately into the solution. And that's what we're seeing with AI today. It's like: AI, shiny object, boom. Everyone runs off. But it's like, wait, wait, wait. What is our vision? What is our strategy? Where do we want to be? How do we compete? How do we add value? That's...
Ari Popper (30:25.559) distill the challenge, what is it exactly that we want this AI to do? Is it a purely automation play? Is it augmenting? So distilling the challenge is really taking time to ask the right questions early upfront. And a good leader needs to do that. Otherwise people are just scattered. And I have clients come and say to me, we've got so many initiatives going on, it's a mess. So that's a mindset and skill set that...
is important, particularly with transformative technologies and shiny new objects.
Simon Brown (30:58.271) and then mine the gap.
Ari Popper (30:58.893) Yeah. Mind the gap is, use this analogy like when you're trying to create the future, it's like stretching elastic band. If you don't pull enough, you're not going to move the organization. It's just going to be too close. You're too safe. If you pull too much, you're going to snap and it hurts. And so you've got to have the right skill set to have the right tension between
the emerging future and where you are today. If you pull too much, you're going to lose people. If you don't pull enough, everyone's going to stay where they are. And that Mind the Gap is a real leadership challenge, because, you know, there are so many analogies, but you're basically building the car while you're driving it. You need to be able to continue day-to-day operations, but you've still got to ruffle feathers and get people to change
the way they're doing things, in a safe way, so that you can create the future as well. And that's minding the gap. Or... yeah, please, yeah.
Simon Brown (32:02.987) So coincidentally, I was perhaps two hours ago having a very similar conversation on this, where the question was, why are people not
Ari Popper (32:10.207) Yeah.
Simon Brown (32:16.299) necessarily seeing the full potential of AI. I say in the AI dilemma, I think we talked about it when we were together, the social dilemma. they made this point around this sort of elasticity of you hear this concept, it will be so seemingly far away.
Ari Popper (32:23.361) Yeah.
Simon Brown (32:34.377) that you sort of comprehend it at the time, but so quickly it just snaps back and you sort of, it's too far to comprehend from the here and now. And I think a lot of the things we're seeing from an AI perspective are like that, that it's real, we see examples of it, but then our mind just brings us back to it. That's too far from the reality I see day to day. And therefore we sort of discount it somehow. Yeah.
Ari Popper (32:59.893) Yeah, totally. We see that all the time and it's easier to fall into that trap, right? Because the familiarity of the now and then all of a sudden you're kind of catching up in a huge way. Yeah, the other element of mind the gap is
being able to communicate the future in a way that, as a leader, you need to... you know, there are certain leaders who do this very well, but you need to be able to inspire people and use storytelling to kind of bridge their disbelief, Simon, exactly like you're saying. You need to make it feel tangible and possible and worthwhile investing in,
knowing it's going to be a huge pain in the ass, because they've got to...
Paul (33:55.904) Ari, I want to ask about your personal curiosity, if that's okay. You clearly strike us as a curious person, right? But I'm fascinated to know: what's your process? What are you paying attention to? Where do you go for the information? Because you must be paying attention to such a broad
Simon Brown (33:57.322) rethink everything.
Ari Popper (33:58.638) Exactly. Yeah. Yeah.
Paul (34:24.388) range of things, but then how do you not get lost in the fog? How do you then, for yourself, synthesize and decide where to focus?
Ari Popper (34:33.729) Yeah, I mean, I've always been a bit of a curious nerd. like when I was literally a teenager, mother used to buy me the Guinness Book of Records and I would just lie in my bed and read them like from end to end. Like I was just fascinated with like useless facts. I've always kind of had that brain. And so there isn't really a process, but it's a team effort. So, you know, we have, we have a nice team. We're all kind of get on extremely well. We have a Slack channel.
And we're all reading all the time. So, you know, there's some mornings I'll be up at 5am and I'll just be sharing articles and we do that. And that's the way we kind of cross-fertilize our ideas and then we'll get inspired about something and we'll talk about it on Slack or have a meeting and we'll kind of go back to it. So it comes from everywhere. You know, you can't, there isn't one source, but there's great like news aggregators.
Paul (35:16.273) So
Ari Popper (35:33.902) One of my friends wrote, had a friend who wrote an algorithm design, he basically created a bot designed specifically to pull articles and then he used notebook LLM to basically give him a quick download on it. So like, know, creative. for me, it's more about just like pulling it in from wherever it comes. You know, personally, I'm interested in, have
I'm interested in the future broadly, but particularly... my background is psychology; I studied psychology and religious studies. I'm interested in human beings, in our society, in how we interact with each other, how we become better people, how we help each other. So my personal interests tend more toward those sides of things. What does it mean to be human?
How can AI enhance and augment our humanity? What are the traps and pitfalls that we're heading into? Those sorts of issues I'm naturally very interested in as well. But yeah, it's extremely difficult. You definitely get caught in the fog. And there are times where I can get pretty depressed as well. I can just feel overwhelmed and more just, shit, this is not good.
Ari Popper (37:00.843) I don't know, Simon, you're nodding. It sounds like you might feel the same, but yeah. Yeah. Yeah.
Simon Brown (37:02.891) Yeah, yeah. It's sort of, I mean, the AI 2027 paper that Paul referenced, if anyone hasn't seen that, is worth a read. And yeah, we could end up in a great place. We could end up in a not-so-great place. And the not-so-great place feels like a non-zero chance of happening. So
Ari Popper (37:26.551) Yeah.
Simon Brown (37:31.441) It's, yeah, I think the curiosity to sort of try and understand and then figure out how can we influence, but yeah. Yeah, there's some very, very different views by very bright people on the positive and the negative of where this could all go.
Paul (37:47.915) I think it's also very interesting how you described your, well, I would say it's a process. We talk about how you can improve your own success at being curious by, A, working with others. So it's not a solo act, it's about community. And secondly, about curating. Curating all the range of stuff around you into, well, here's what I'm gonna focus on. So it's kind of nice to hear that
Ari Popper (37:52.023) Yeah.
Ari Popper (38:06.637) Mm.
Paul (38:17.039) you're finding similar things work in your stuff too. I wanna ask, are you still an optimist? Do you think, if you cast your mind forward five years time, are we gonna have adapted to AI or will have AI just adapted us, do you think?
Ari Popper (38:32.567) Damn. Why do I always get asked this? Really, it's a great question. I have to be an optimist, whether I like it or not. I'll put it to you that way. And I'll continue to fight the good fight. But I'm not going to lie, the unbridled enthusiasm that I had 13, 14 years ago has been dampened a bit. But, you know, we have so many superpowers as human beings, curiosity being one,
empathy, our imagination, you know, the ability to be adaptable, the ability to connect with each other as humans. So I still have a lot of faith in that. But I am concerned. You know, I'm not Pollyanna-optimistic. I do have concerns.
you know, in the end, we have to believe that things are going to work out well, at least I feel that way. But we have to work towards that too. And that's a great mission for us. You know, it's a good mission for our company.
Simon Brown (39:46.027) Absolutely, and you're in a powerful position to have influence over that if you're working with the leadership teams of these organizations that have a part to play in all of that.
Ari Popper (39:46.733) Yeah
Ari Popper (39:56.696) Thank you. We do. And it's something we really see as a privilege and we don't take it for granted. We're kind of humbled by it. But yeah, that is very true.
Simon Brown (40:13.695) go next. That's a deep conversation.
Ari Popper (40:14.765) It is. That was a great question, Paul. I mean, are you guys optimistic?
Simon Brown (40:23.869) I think I share the we have to be optimistic that.
Ari Popper (40:25.537) Yeah.
Simon Brown (40:28.011) Yeah, the alternative is not somewhere we want to go. it's sort how do we make sure we making the right decisions along the way? And how do we, and from one perspective, one can feel very helpless around elements of it, but I think there are things that we can all do to understand it. yeah, ultimately that will bring influence and perspective. So yeah, I have to be optimistic.
Ari Popper (40:33.175) Yeah.
Simon Brown (40:58.025) we went through the Harvard Business Review article, predictions from the future from, if I remember, 500 futurists or whatever, again, worth a look up if people haven't seen that, but that sort of consolidated the views of where they are, some of the positive elements, but also some of the somewhat scarier elements of where things might go. so, yeah, I think we have to be curious to find a way that is a positive way through those choices.
Yeah, cool.
Paul (41:29.198) I mean, look, I guess I must also be an optimist as well. I think, well, look, humans have always invented and discovered technology that has absolutely changed our civilization. And we've done it for thousands of years, right? Maybe even longer. We're just creating a new one. And there will be others. I mean, we haven't talked... do you look at any of these others? Do you look at space tech, nanotech?
Ari Popper (41:30.221) Well said. Yeah.
Paul (41:59.533) Do you look at any of these other new and emerging technologies, Ari, that's also changing our lives? Maybe positively, like drug research and drug discovery, for example. Do you look at those sorts of topics?
Ari Popper (42:03.703) Yeah.
Ari Popper (42:07.116) Mm.
Ari Popper (42:11.457) Yeah, we do. We've done a number of projects in space for clients, which are fascinating. We keep a cursory glance at quantum, keep an eye on that. And we look at the other transformative technologies. I mean, syn-bio, synthetic biology, is very interesting. If you think AI is scary...
Paul (42:40.031) Exactly.
Ari Popper (42:41.357) Look at Senbaya. That stuff is absolutely wild. But yeah, we keep an eye on that. Yeah, but we try to, try, you know, our clients want to know what to do today. So if we're too, you know, elastic band too far out there, like quantum, quantum is such a, you know, when quantum computers become viable and online.
It just has that huge transformative effect where things change radically. Same thing with AGI. It's like all bets are off. So we try not to really work in those spaces, just because they're impractical. They're, you know, singularity events. It's more practical at three to five years: how do we help there, around there?
Simon Brown (43:36.977) Five years feels a very long way away at the moment. I don't know how you get to the sort of five years out. Even predicting three years out feels really hard, but five years out just seems so fast with the levels of change in many of these areas.
Ari Popper (43:55.342) Yeah, you know, when we use our time ranges, they're kind of like a metaphor. It's like, five years is really far and three years, you know... but you're right. I mean, it never used to be that way. You know, Kurzweil was right. He predicted all of that. He showed us the exponential curves, and he was right.
Simon Brown (44:00.886) Yeah.
Simon Brown (44:19.059) Yeah. But we don't understand, or we can't comprehend exponentials. I think that's a big part of the challenge is that, yeah, we just really struggle with exponentials. And when you layer exponentials on exponentials and the interdependency of those exponentials, I think that's part of that elasticity piece, is it becomes too hard for us to get our heads around. And so we revert back to what we recognize, which isn't exponential.
Paul (44:46.847) Is that a thinking skill? Is that a thinking skill in your experience, Ari? So there will be writers, I'm sure, in your group that are more near-future, the Black Mirror type, but also people that can think out, and do think out, maybe five, 10, I don't know, 300 years.
Ari Popper (44:47.789) Yeah.
Paul (45:05.835) Is there a thinking skill? Is it how you go from the seeds of today into the oak trees of several hundred years time or even 10 years time?
Ari Popper (45:15.981) Yeah, I think that's our process, you know. I do think some people naturally have that ability. Very occasionally we see it with our clients in the writing exercise. All of a sudden, we tell them they have the Arthur C. Clarke Award. And we go, you got the Arthur C. Clarke because you were really visionary. Have you seen that BBC interview from 1966? The interview with Arthur C. Clarke. It's fabulous. You should put it on the podcast. But he basically says,
Simon Brown (45:41.643) Thanks.
Paul (45:44.469) We will, we'll post it, yeah.
Ari Popper (45:45.294) Trying to predict the future is a hazardous activity because the profit invariably falls between two stools. It says on the one hand, if you want any chance of predicting the future, you have to be so bold and so outrageous that people would love you to scorn. And he says, you know, it's that one. then he hits 1966 and then he goes on and he basically describes
in the future, doctors will be able to do surgery no matter where they are, so robotic surgery. He makes these fabulous predictions about our world. So yeah, to answer your question, some people have it, some don't. But sci-fi writers like Arthur C. Clarke, they're naturally pretty good at that.
Simon Brown (46:42.933) So we're coming to the end of our time. In a moment, I'll ask you, Ari, for sort of one thing to leave our listeners with. But maybe by way of summary first: you've taken us through your journey, from that switch of career following a midlife crisis to that sci-fi writing class at UCLA.
We're glad you did, because it means we get to have fascinating conversations like this. How you understand where the world is going, bring the future to life through epiphanies, and help people to lead through those epiphanies. How you create those preferred futures, how you use a future-backed methodology to do that.
Ari Popper (47:05.613) Thank you.
Simon Brown (47:24.043) how things are changing on an hourly basis now, but how you try to look three to five years out into the future of where that's going. We talked about digital agents, how you saw that that was inevitable, the agent to agents, the A2A algorithmic economy may come, how you have your talented sci-fi writers that help really bring the future to life.
some of the tragedy of social media, and that it maybe didn't live up to the positive potential that was there. How things are changing in such fast and powerful ways, and how AI is now consuming all of the oxygen in the room. The key thing is to look beyond the first-order effects into those second- and third-order effects. We had a great conversation around kids: what should we tell our kids to be learning? That they should go broad, that they should look at adaptability,
curiosity, maybe even philosophy and that those may be some of the skills for the longer term future and then...
Maybe go binary: either work completely with AI or not at all, and choose a career that AI won't touch. How we end up with this sort of factory-produced AI view versus the artisanal view. And then we went into leaders and how we can help them with those epiphanies: look for the signals, the emerging technology, or look at the building blocks and what opportunity areas those might create. And then into the five or six leadership mindsets to be effective. So how the future begins at the edge.
The future as a team sport, how we should rehearse the future, and then how we mind the gap, that elasticity where it sort of bounces back, and then how we really distill the change, really focus on what the problem is that we're trying to solve, and then going into some of the process of how you
Simon Brown (49:19.071) keep up to date with things around it being a team effort that you do a lot of reading across you and the team, how you share things and you distill that down. And then finishing, I think, on the nice point that we have to be optimists around all of this. So if there's one thing from all of that, what would you leave our listeners with,
Ari Popper (49:37.729) Yeah, it's amazing how much we covered. That's a great recap. Thank you for having me, it's been really great. I think I liked what we spoke about, this idea that, you know, let's remain optimists. Let's remain curious. Let's try and maximize our humanness, the ability to connect with each other, empathy, compassion,
Ari Popper (50:07.873) suspending ego, being aware of our ego and suspending it, you know, creating connection, I think that's, that's precious. Let's, let's focus on that. And then we'll transition, I'm sure with, with a steadier hand. But, you know, we can, I think it starts at the individual, right? We got to do that within ourselves and then just bring that to work. So yeah, that's, that's what I would say. It's very, very fluffy. And I apologize, but
Simon Brown (50:37.547) Not at all. Yep, indeed. So, look at ways we can create those connections and really focus on what it means to be human. That's a good way to leave things. Ari, thank you for joining us. It's been a fascinating conversation. I really enjoyed it. Thank you.
Ari Popper (50:37.774) I think it's warranted.
Ari Popper (50:47.499) Yeah.
Paul (50:55.049) Yeah, thank you, Ari. Thank you.
Ari Popper (50:56.127) Likewise. Pleasure. Thanks for having me.
Simon Brown (51:00.491) You've been listening to the Curious Advantage podcast. As always, we're curious to hear from you, so if you think there was something useful or valuable in our conversation today, then please do leave a review for the podcast on your preferred channel, saying why this was so and what you've learned from it. We always appreciate hearing our listeners' thoughts and having a curious conversation, so join in today using hashtag curious advantage. The Curious Advantage book is available on Amazon worldwide, so please do order your physical, digital or audiobook copy now to further explore the 7 C's model for how you can be more curious.
subscribe today and keep exploring curiously. See you next time!