Andy Polaine: Hello and welcome to Power of 10, a show about design operating
at many levels of zoom, from thoughtful detail through to transformation in
organizations, society, and the world.
My name is Andy Polaine.
I'm a design leadership coach, designer, educator, and writer.
My guest today is Carissa Carter, a designer, geoscientist, and the
academic director at the Stanford d.school.
She's the author of The Secret Language of Maps: How to Tell Visual Stories with
Data and co-author of Assembling Tomorrow: A Guide to Designing a Thriving Future.
Carissa teaches design courses on emerging technology, climate change, and
data visualization, and her work on designing with machine learning and blockchain
has earned multiple design awards.
Carissa, welcome to Power of 10.
Carissa Carter: It's great to be here Andy.
Thanks for having me.
Andy Polaine: So, I've just given a bit of your background, but
what was your pathway to the d.school?
Carissa Carter: It sounds kinda drastic to think that I, I had
a first career as a geoscientist before making my way to design.
It felt really natural in practice.
I was working as a scientist.
I have always loved geology.
That's what I studied and did my first degrees in, and
I really do, um, enjoy understanding the processes that shape our planet.
I've also always had a thread of how do we build, how do we make, what is art?
How do we present information?
I never really knew how to pull those things together when I was
younger, and honestly, I didn't know that design really was a discipline.
Until I was, until I was maybe in my mid twenties.
Right.
Um, it just, it just wasn't on my radar.
So I made my way.
Into the design field and one thing led to another because I really do
believe everything's interconnected.
Yeah.
And you know, there's the, there's the path.
We all are many things.
Andy Polaine: So when was the moment where you did realize
the design was a discipline?
Carissa Carter: I think I, I remember exactly where I was sitting.
Uh, it was in a cubicle.
I worked at the US Geological Survey at the time, and.
I had an excellent mentor, Dave Rubin, and we were looking, we
were making models of ripples.
So in sand, you know, little ripples form as the flow moves the sand: grains move in the direction
of the flow and the sand grains bounce.
And the way that they are preserved in the rock
record can show you a lot of complex things about how the water
flows, how the wind flows, et cetera.
And we were making these models, and it was in that cube that I had to do a lot of, not just understanding
the science, but a lot of presenting information and making it accessible to
a wide range of audiences and thinking about, well, why does this even matter?
You know, this is like this tiny slice of truth, but like,
what about the big picture?
Where does that go?
And it was a combination of that and, like, what
was accessible on the web at the time, and, like, really great encouragement from him.
He was a builder, a tinkerer on his own.
He did his own inventions.
He invented a self-locking bike.
And all those things came together, right? The right people, the right circumstances, the right things
I was realizing about myself.
And then I found out that design was something you could study.
There are people that, that look at how we make things and it, and it
came together for me in that cubicle.
Andy Polaine: Yeah.
It's design's perennial problem that, uh, it is ubiquitous and that people
just feel like stuff pops into the world, um, or don't even feel that.
And it just seems to happen and it's everywhere.
And I think it's, that's why it gets ignored quite often.
We'll, we'll come back to that right at the end.
Carissa Carter: I mean, it's wild to me.
I mean, still today, I'm, I'm blown away that everything is designed.
Yeah.
Whether intentionally or not, like, absolutely
everything was a design decision.
Andy Polaine: When I was teaching in Australia.
I had a, a, a parent come up to me actually about, uh, on graduation day.
You know, I'm very proud that their daughter had just graduated and, and
sort of quietly came up to me and said, but, but what does a designer do?
You know?
And I said, well, you know, everything that you are wearing, the car that
you came here in, the, you know, books you read, the magazines
you read, whatever you go and use in your kitchen.
Everything has been designed by someone, you know, and it's, um, it sort of
blows people's mind a little bit.
Exactly.
So, um, we talked a little bit, um, about this before.
You know, the, the show is called Power of 10, um, because it's named
after the Eames film, powers of 10.
It's all about the relative size of things in the universe
and how it's all connected.
And so for me, this idea of kind of thinking in layers and
these different layers of zoom.
You know, my main thing that I teach these days is service design, and
that's what people know me for, though
I've got a background in, in earlier stuff, in digital
and UX and all those things before they were called those things.
But it's the easiest way to try and explain to people those different
layers, although it's sort of artificial in the sense of, you know, where
you actually decide to draw a layer.
It's a really kind of fundamental thread that flows throughout the book.
So, can you tell me, 'cause you, you've got the onion diagram, but before
maybe we get to the onion diagram,
can you tell me what the sort of impetus for the book is?
I always feel most people write a book either, you know, you think, well,
there's a, there's a missing thing out there in the world, but it's often to
scratch your own itch a little bit too.
So what was the impetus of the book?
And in fact, we should probably name your, your co-collaborators as well, um,
Carissa Carter: Yeah, so, so the book is Assembling Tomorrow, and my co-author is
Scott Doorley, who's the creative director at the d.school, and we had an incredible
illustrator, Armando Veve. And the book is really, you know, inspired by the
fact that Scott and I are both educators in the field of making things and
teaching other people how to make things.
But even to us, right now is a very overwhelming moment in the world.
Our climate is in dire straits.
Our emerging technologies like AI feel like there is a new algorithm released
every day, and these algorithms can learn on their own and it can really
feel like the world is happening to us.
But at the same time, we know that we all have a lot of agency, right?
And we can teach that, and we know that there are ways to center ourselves and really
build a future that we wanna have.
So we wrote the book to call attention to the urgency and
unsettledness of this moment.
Right.
And to pave a path forward that anybody can take.
Andy Polaine: Yeah.
And you kind of talk about the layers of design and
ecosystems and flows in particular.
And earlier on you, you use the onion as an example.
Mm-hmm.
Or you kind of cut through it.
Can you explain that?
I, maybe I'd put some video of it up or an image of it up, uh, on the video.
But for those listening, uh, what would we be looking at?
Carissa Carter: Well, by the onion, um, we mean the layers of
design that we like to talk about.
And, and one way to think about it is to take any sort of object.
And I have this, you know, iPhone here that I'm holding up as an object.
Okay?
This is when somebody says, well, what is design?
Okay.
This is a product.
This phone is a product that was designed, somebody decided on the
radius of the corners, the material that it's made of, how it feels in my hand.
I can open it up and there's a ton of, of digital products on here too.
All those apps are digital, so it's both physical and digital
product design here.
Each one of these products enables any number of experiences to happen.
So here we are on a new layer.
We went from product to experience.
We can have a video call with somebody on the other side of the world, right?
That experience was designed.
Each one of those products and experiences is embedded in any number of systems.
Okay?
New layer of that onion.
As a Stanford employee, I'm a cog in that system, and this phone has the
security software that they require on it.
I'm on an AT&T phone plan, which is a system that decides where I get service
around this country and this world.
If I want a new app, I get one from the app store.
Another system that was designed right?
So products, experiences, and systems are all layers, but every single one of those
products, if we go down into the onion now, you know, is enabled by
any number of technologies, both digital and physical technologies.
Yeah.
Um, and then each one of those technologies is powered by data.
So for example, if you were going to, to text me and you're typing
'Carissa', my name's not always in the Western canon of names.
It will autocorrect to 'carrot' or 'carcass', right?
And, and then, like, there are just some very interesting things happening there.
Like, one, somebody designed the data set of what names get
autocorrected or not, right?
And so that was a design decision.
And then two, like how that algorithm works.
Like, you, you text me a few times and you've corrected it, and it no longer, um, prompts you
the next time, you know, to change my name.
So right now we have data, technology, product, experience, systems.
And every one of those things has implications, large and
small, near term and long term.
And this is what we mean by the onion, right?
Because there are apps on this phone that have allowed people to band together and
change governments in their city, in their countries to, to really powerful effect.
And those same, those same technologies have enabled things
like intense school bullying.
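To make the autocorrect example concrete, here is a minimal sketch, assuming a hypothetical keyboard that ships a curated word list and learns from repeated user corrections; the word list, threshold, and matching rule are illustrative assumptions, not any real keyboard's behaviour.

```python
# Hypothetical sketch of the two design decisions described above:
# (1) someone curates the dictionary of words that don't get "corrected", and
# (2) the keyboard learns from your repeated corrections and stops overriding you.

KNOWN_WORDS = {"carrot", "carcass"}   # the shipped dictionary (a design decision)
learned_words: dict[str, int] = {}    # words the user has restored after autocorrect
LEARN_THRESHOLD = 2                   # assumed: stop correcting after two undos

def autocorrect(word: str) -> str:
    w = word.lower()
    if w in KNOWN_WORDS or learned_words.get(w, 0) >= LEARN_THRESHOLD:
        return word  # leave it alone
    # naive "closest known word" stand-in for a real correction model
    return min(KNOWN_WORDS, key=lambda k: abs(len(k) - len(w)))

def user_undoes_correction(original: str) -> None:
    """Called when the user types the word back after it was autocorrected."""
    w = original.lower()
    learned_words[w] = learned_words.get(w, 0) + 1

print(autocorrect("Carissa"))       # mangled to "carcass" on first use
user_undoes_correction("Carissa")
user_undoes_correction("Carissa")
print(autocorrect("Carissa"))       # after two undos, "Carissa" is left alone
```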
Andy Polaine: Yeah.
Carissa Carter: Right.
There's positives and negatives to everything.
Absolutely.
You could do this with anything around you.
You could trace it down from the data that was used to make it or power it all
the way up to the systems that govern it and the implications that it has.
Andy Polaine: There's an interesting thing that I think that happens when,
I mean, John Qui took talks about it in a book at this about the semantic zoom.
If you take a, particularly if you take a physical product and you
kind of zoom in and you end up down in the kinda material level of it.
Um, and actually the iPhone's a good example because of the rare earth,
uh, uh, metals in there, right?
Mm-hmm.
Where, but then you zoom out again, you go, okay, this thing exists
in an ecosystem of my, sort of my Apple ecosystem, for example.
And that exists in the kind of marketplace that exists in the world.
And as we're kinda seeing at the moment that, uh, the, the company
Apple is kind of rubbing up against the, the legal frameworks of, of the
EU and the, those kinds of things.
And you kinda zoom right out and you get geopolitics
again, which of course, you know, involves
materials that are coming from problematic countries.
Yeah.
You know, that, that all kind of bounce together so it kind of
wraps around the other side again.
Carissa Carter: Absolutely.
Right.
And it raises really interesting questions for our own value systems:
what are our ethics, and what are our boundaries, our constraints?
Andy Polaine: So you divide the book into intangibles and actionables.
Um, and so I am interested in the...
The book is very, very beautiful, by the way.
Thank you.
When we spoke before, I made a note about the kind of information
architecture of the book.
'cause one of the problems with this, as I'm sure you know, is the kind
of perennial service design problem: when everything's connected to everything else,
where do you start and how do you tell the story of this stuff?
In a medium that is sequential, really.
So how did you go about that and why did you make that division?
Carissa Carter: Yeah, so the, the, the front half of the book, the intangibles
are about these forces that are making the world feel somewhat unsettled right now.
In the back half are the actionables, which is what can we do
about it and what can we do about it from a very personal agency perspective.
And then interspersed throughout the entire book are these short,
speculative fiction stories that paint pictures of moments in the future,
not super far in the future, maybe,
you know, 20, 50-ish years out, that really allow us to try on some of the
phenomena that we're talking about at that moment in the nonfiction.
So we, we very much believe in fiction as a way of testing
how we're gonna show up in the future, and do we like what we look like in it.
It's a great prototyping tool and it also allows people to
consume content differently.
I think people are reading very differently than they ever used to.
Hmm.
And hooking people through story is often a very engaging way to get
them involved in the topics that we're, that we're dealing with.
Andy Polaine: I was interested that you call them histories of the future
in the book, and they are, mm-hmm,
quite compelling.
There's some...
I was quite surprised and amazed that you used Apple, and Amazon, not the company
but the Amazon forest, as examples in there.
Only 'cause I, I was thinking, oh, well, they actually used Apple's name in this.
Why?
You know, obviously, uh, speculative design is a thing and uh, often used
to kind of critique, uh, the future of, you know, where things are going.
I'm interested why you chose to use stories, um, rather than
designing artifacts or things and, and using those things as the examples.
Carissa Carter: Well, stories are one way that we can make sense of information.
They're really memorable, right?
Like the, the human mind is wired to create story.
We are, you know, there's a whole section in the book on make believe.
Yeah.
Um, we are, we are wired to find patterns, um, and to see faces like
this is how our brains are set up.
And so by using story,
it, it, it creates a narrative, you know, you can fill in the details,
like if I just talked about an object that might exist in the future.
Okay.
But like, put that object in context and have somebody using it and have
people fighting over it, and then, you know, reacting to it and you
pull in the emotions and it really brings in the whole human experience
when you're, when you use a story.
Andy Polaine: Yeah.
Yeah.
No, I agree.
I, there's a quote you have in there, I think, which is actually
something I've always said, or I was interested to find the quote around:
no one was ever persuaded by the numbers, or something.
I think that's in there.
Or maybe I read it today, from somewhere else.
Carissa Carter: I'm not exactly sure which one you're referring to, but it could be in there.
Andy Polaine: This idea that, you know, we, we tend to talk about, well, we,
businesses and leadership often talk about kind of numbers as these very concrete
things, but obviously any projections in the future are just, just as made
up as a, as a speculative fiction.
Right.
But it's actually, I've always found that the, the numbers give people
permission to believe in the story.
Uh, but it's actually the stories you say it's the memorable thing
that people kind of take with them.
Carissa Carter: I mean, and, and, and when you talk about numbers too, right?
You're, you're talking about what you've chosen to measure, yeah,
and what you've chosen to have value.
And that can be really problematic on its own.
Right.
Because we tend to think that the things that we measure are
the things that matter most.
And the minute you decide like, oh, this is our metric.
Andy Polaine: Yeah.
Carissa Carter: Right.
You're locked into that.
Andy Polaine: Yeah, absolutely.
And I, you've, you've got quite a lot, I mean, you've got a lot about this,
but you also, I think you talk about the kind of crime figures somewhere
as well, crime prediction as a sort of problematic thing because, uh mm-hmm.
Because it's, it's based actually on arrest data rather than, um,
you know, actual crimes.
'cause if I speed somewhere or break the law and nobody catches me, then
it, it never appears in the data.
Yeah.
Carissa Carter: Right.
And so then where, where are we looking?
Who are we arresting?
It's a self, you know, it's a self building phenomenon.
Yeah.
Andy Polaine: We are obviously in the midst, well, perhaps
we're not in the midst.
Maybe we're coming over the, the peak of an AI hype cycle
at the moment as we speak.
It features a lot in the book.
I would say you are fairly balanced actually, or kind of neutral I'd say
about AI in the book that you're not particularly, I don't think you do
this with any of it actually in the book where you, it's not particularly
that you're screaming, this is kind of terrible, terrible thing, nor are you
kind of saying it's fantastic either.
There's this sort of caution throughout the book.
You have a whole thing around the kind of relationships with AI in the book.
Do we want to talk about that a little bit?
Carissa Carter: Yeah, I mean, well, you know, just to unpack that lead
up there, what was fascinating is we really wrote this book.
The main bulk of the writing was between,
you know, 2021, 2022, a little bit of 2023, right?
So this AI wave that you're talking about, that we're in today, right?
Like it, it hadn't happened yet.
And what was happening as we were writing, right, is like
one thing after another, like, just started to come true.
Right?
So like we have, we have double the amount of stories that we threw out because
things were just happening so fast.
Mm-hmm.
That, that acceleration was really, really felt as we were, as we were
writing this, in particular with AI.
Right.
Especially with generative AI's,
you know, advancements in the last 18 months. And we really did try to
toggle between the dystopian, utopian, uh, versions of what might happen
because, you know, we don't believe that either is going to be the case, right?
There's, there is no perfect utopia.
There's a, there's a figure in the book that,
you know, really says, like, even utopias have a sewage system, which I just love.
Right?
Because it is a, a beautiful Armando drawing of what a sewage system
for a utopia looks like in there.
You should check that out.
Um.
But the reality, right, is like, it's not all going to be doom and gloom either.
And we have to be able to, to work and acknowledge the things that
are going wrong as we are pushing forward to try to do the right thing.
And I think that's a lot of what is, is troublesome about tech and AI
development right now, is that we are really in love with the possibilities
of what might be, and we forget to look at
what could go wrong along the way.
Andy Polaine: Yeah.
Carissa Carter: And this culture of shepherding our mistakes into the
future doesn't really exist yet.
And it, it's wonderful that we love to build, I, you know, we
love, we teach people how to build.
Right.
And I don't, I don't want that to go away, but I also want us to, you know,
coming back to that onion, like, how do we take a whole holistic
view of the effects of what we're building along, along this journey?
Andy Polaine: Talking about mistakes you have in one of the histories of the
future, this idea of the World Creation Council, the UN World Creation Council.
Can you explain what that would be and what its job is?
It's a very, very, it's very, very sad and touching story actually in the book.
I dunno how much you want to tell the whole story.
Yeah.
Carissa Carter: No, that, that comes in, um,
a story called Hello Mamas.
Hmm.
And in this story, you, if I can distill it really quickly,
there's a son who loses his mother.
And this is taking place in the future.
Um, we don't exactly know how she's, she's passed away, but he, the
son, comes to realize that he's the seventh great-grandson of
somebody who invented the separate condenser part of the steam engine.
Right.
So, like, really helped the industrial revolution happen.
Andy Polaine: Yeah.
Carissa Carter: And you know, if you think about it, they, the things that we
built during that industrial revolution have had outsize impact on the pollution
that we have in our world today.
Right?
Like the smoke that was put into the sky, like, like what if we had
done it differently way back then?
And so then that like begs the question of like, how do we allow ourselves to
continue and like get all that incredible progress, but also like take back,
could we ever take back things that we did that, that were troublesome?
And so the, the idea of the World Creation Council that comes up within
that story is that maybe there is a place, right, where you could come
to them and say, hey, I made a mistake
with what I built, it's had, it's had problems, can you
help me pull it from the world?
Right?
And how that would happen, like, you know, would take some sort
of, like, systemic, technological,
big breakthroughs to happen, but, like, what if that was possible?
And so in this story, the main character Dunn is trying to, to
pull his seventh great-grandfather's separate condenser from existence.
Then I won't give away exactly what happens, but
Andy Polaine: Yeah, so the council's job is to
sort of kick into gear, and I, I would have to spoil it
in order to say actually how that happens, what happens at the end,
but to really try and remove, you know, this problematic technology
from existence. But you have a whole thing about the sort of...
the creator does not suffer any
harm from this.
This is like a, a global or UN decision to say, you know, we recognize that
people, we don't want to stymie people making things, but we also need to have
a mechanism to do a, a kind of take back.
Yeah.
Carissa Carter: Right?
Yeah.
Like, so, like, there's gotta be a balance between really, you know,
encouraging people to want to innovate, but also sometimes we don't know the
negative effects when we launch something and it can really, it can get beyond us.
I mean, there's been so many examples of, of this in the world.
Like from everything from, from social media, right?
Like in the beginning it's to say hi to your friends.
And now it can be really emotionally, you know, societally
troubling.
We're addicted.
We're, you know, yeah.
There's that type of example all the way down to like the coffee
pods that now contribute so much waste or toothbrushes, right?
Like, there's, like, products, physical and digital.
How do we allow people to keep that experimentation?
But then is there a way to come together and say like, okay, we
gotta pull this from our society,
Andy Polaine: but isn't there a kind of, you know, all things in moderation
problem there that, that, you know, almost anything in the world has, you know,
including humans, has a positive side and a kind of negative effect, particularly
once it has gone beyond a certain scale.
Uh, and that the kind of fundamental issue underneath this is our obsession with
using growth as the measure of success.
And that if you actually move away from that, then that kind of
ameliorates a lot of the problem, rather than the objects themselves being the issue.
Carissa Carter: Yeah.
I mean, I do think we're obsessed with speed and efficiency.
Yeah.
As a society, right.
Make everything faster, get that done more quickly, and that has become a new metric.
Is it the only one?
I'm not sure.
I mean, like I, I am thinking about the, the Montreal protocol and our
ability to come together to remove CFCs, chlorofluorocarbons, mm-hmm,
from the atmosphere.
Right.
And like their, their existence was to help, you know, aid in
refrigeration more than anything else.
Right.
And so if you, if you grew up, were, were around
in the, in the nineties, you know, like, you remember that campaign.
Yeah.
To get the hole outta the ozone layer.
Right.
That was like a global way that we actually banded together.
I think it is, to this date, still the only time that, that we've
had global consensus on anything.
Andy Polaine: Yeah.
Especially to do with the environment.
Carissa Carter: With the environment.
Right.
And, like, was that, what was that one about?
Speed and efficiency?
I mean, I do, I, I'm with you on speed and efficiency, but it does feel like we have
the ability to value other things too.
We've seen it, we've seen evidence of it as a society.
Andy Polaine: Well, it's kind of, I mean, it's kind of a growth thing, right?
In the sense that this is the most efficient, meaning, uh, in this case,
probably cheapest, way of cooling.
Mm-hmm.
And so therefore, that's, that's the thing we're gonna use.
And obviously then it, as it scales, it becomes a, a massive problem.
And, but partly it's also about.
This idea of not thinking in terms of ecosystems. There's, there's a guy
called Joe Macleod, who, who writes and talks all about ends and endings,
you know, the opposite of the onboarding process.
He's, you know, we talk about the iPhone again: the whole onboarding
process is beautiful, but the offboarding process is really rubbish, right?
Mm-hmm.
And so we don't really think, or, or, uh, and manufacturers don't take into account
very often the end of life cycle of things as a thing they should be designing.
And he has a whole kind of framework for this, and so, so
part of it's that as well, right?
Which is if you are focused on growth as the metric and you
know it's mostly money, right?
It's mostly sort of growth of the business, then the kind of rubbish
end of it, the disposal end of it.
Well, that's not a growth thing, right?
Really, unless
people go properly circular. So therefore you kind of end up with, um, that's the
bit that gets ignored 'cause no one's measuring that as a metric of success.
Whereas instead, if you said, well, how much, if the measure of success of a
company was not how much it had grown, but how much it had reduced its waste or
its, you know, output in, in some way.
I mean, these are the kind of degrowth arguments, right?
Then that measurement and the way we kind of think about success
would radically change the behavior.
I mean, I guess your book is kind of full of little things like this of like
how do you change behavior through, through changing what we measure and
what we think about and focus on, right?
Carissa Carter: Yeah, absolutely.
I agree with that entirely.
I think if you look at
productivity, you know, and then coming back to technology and AI too, right?
If you look at, like, data from, at least in the United States, the US Department of
Labor, if you look at their data, American productivity really has stayed fairly
flat since about 2007, 2008-ish maybe.
Goes up slightly.
Andy Polaine: Yeah.
Yeah.
Carissa Carter: And in a parallel timeframe, you see
our microprocessor speed, like, continue to increase exponentially.
So we're making faster and faster computing devices, but we're
not getting more productive.
And if you think about like what else is happening around that time, like
that's really about the same time that you start to see the iPhone, right?
And you start to share video, you know, use video as a medium of, of
sharing information more and more.
And so then this, this begs the question of, like, to what end?
Like,
do you feel like you have more space in your day to do things now, or do you
feel more harried and frazzled than ever?
Mm-hmm.
Yeah, and, and I'm not sure how many of us think that we are actually,
you know, that this technology has freed us up to be more,
you know, productive elsewhere.
Andy Polaine: No, I mean, that's the whole kind of, I can't remember
who originally wrote this about sort of AI, you know, my dream was
always that AI was gonna do the drudge work, and it was gonna
free me up to kind of make music, and I love that,
and images,
I love that.
And it's kind of ended up where the opposite is true.
Mm-hmm.
That we'd just become slaves to it.
The example I give quite often: I've got a robot vacuum cleaner,
and it's got water tanks, a fresh water tank and a dirty water tank,
and I regularly get these poorly worded notifications on my phone,
uh, which are like, empty the water tank ASAP, and, oh my God.
Okay.
And I feel like I've just become this kind of, this slave to
my, my robot vacuum cleaner.
That kind of demands to be emptied and to be filled and to be, you
know, you need to do this bit of maintenance, the brush needs replacing.
Oh god.
Okay.
And part of it's actually just the tone of voice is really off.
But part of it is, like, you know, as we all know, the little red dots
and the notifications, or, you know, you see people's email and they,
they haven't turned off that badge.
Mm-hmm.
And so it's got, sort of, you know, 50,000 unread emails, and we've become these
kind of real slaves to those devices.
Carissa Carter: Is it a, is it a, um, a smart vacuum cleaner?
Like, is it, is it connected to the cloud in some way?
Andy Polaine: It, it, it is. Supposedly it's not sharing my data.
Mm-hmm.
But it could well be sharing the, the map of my house.
Mm-hmm.
Yeah.
So, you know...
Carissa Carter: You know where I was going with that, right?
Yeah, yeah, yeah.
Which is another fascinating thing.
Like we, we are like, it brings me into the topic of privacy, right.
And what is still sacred.
Right.
Because like, let's just say hypothetically that, that, that
is sharing your data under the
assumption that by knowing the footprint of your home, right, and knowing the
type of cleaning products you need, it could reorder its special soap at
the right frequency, and you wouldn't have to do it right, like all under
the guise of making your life easier.
But is it okay that that company knows exactly where your furniture is?
Andy Polaine: No, it's not.
The good thing is the robot sometimes gets lost and I have to rebuild the map again.
So, uh, I don't think it's that smart just yet.
But it could be.
You have, so you have a whole, you have a whole section
about data, actually, in the book.
Mm-hmm.
Uh, maybe you could kind of talk about that a little
actually, given the, the topic.
Carissa Carter: Yeah.
I mean, data's, everything is data, right?
And I think we're in this data-grab phase of
society right now too, because
Every technology needs data to power itself.
Yet the thing that we're gonna see more and more, and we already see it, right,
is that where data pools, lies power.
Hmm.
And so right now, like, as organizations collect more and more user data, data
on the world, whoever has the ability to store that data, even if we
don't know what we're going to use it for yet, that will have value.
Andy Polaine: Yeah.
Carissa Carter: Right.
We are all familiar with the, the phenomenon that totally exists right
now where your, your data is sold
to third-party organizations, sometimes, many times, without your consent, at least
in the United States, it's better, better in Europe, um, and elsewhere in the world.
Right.
But yeah.
Should you own it right?
Should you own it yourself as an individual?
I think most of us might feel that way.
At least I do.
Andy Polaine: Yeah.
Yeah.
But
Carissa Carter: how, how this is gonna play out, I think, will have profound
implications for decisions that are made.
Who gets to, who gets to decide what to do with all of that data?
Andy Polaine: I mean, you, you talk about it a bit in the book, but
there's that, I think a lot of people underestimate just how much, you know,
everyone goes, well, you know, my mom will kind of go, well, you know, who
cares that I've been to, kind of, the, the supermarket, uh, today?
I don't care.
But I think people underestimate how much different sets
of data can be pulled together to know an awful lot about someone.
I remember at a company I worked at, there was a data analyst who said,
well, you know, I'd look at, you might have anonymized purchase records.
You might only see the last
two digits, or, you know, some tiny bit of metadata around the actual
kind of purchase of this thing.
But, you know, in the shopping center, when you've logged onto their wifi,
because you want free wifi while you're in the, uh, shopping
mall, all of that has tracked you.
And if we pull those two, two together, I know exactly who's bought what,
when, because I can just kind of pull all these data, uh, points together.
And I think people kind of do
not really understand how much, uh, can be found out about you in that way.
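To make that de-anonymisation point concrete, here is a minimal sketch, assuming two hypothetical datasets, "anonymised" purchase metadata and identified mall-wifi logins, joined on store and timestamp; the column names, data, and matching rule are all illustrative assumptions, not any real retailer's system.

```python
import pandas as pd

# "Anonymised" purchase records: no name, just a card-digit fragment and a timestamp.
purchases = pd.DataFrame({
    "card_last2": ["42", "17"],
    "store": ["Shop A", "Shop B"],
    "time": pd.to_datetime(["2024-05-01 14:03", "2024-05-01 14:40"]),
})

# Mall wifi logins: identified (you signed in for free wifi), with location pings.
wifi = pd.DataFrame({
    "account": ["andy@example.com", "someone@example.com"],
    "store": ["Shop A", "Shop B"],
    "time": pd.to_datetime(["2024-05-01 14:02", "2024-05-01 14:39"]),
})

# Join on store and nearest timestamp within a couple of minutes:
# neither dataset alone names the buyer, but the join does.
linked = pd.merge_asof(
    purchases.sort_values("time"),
    wifi.sort_values("time"),
    on="time", by="store",
    direction="nearest",
    tolerance=pd.Timedelta("2min"),
)
print(linked[["account", "store", "card_last2", "time"]])
```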
And, you know, privacy is one of those things that's sort of gone
through cycles over history.
Right.
And a few hundred years ago, people had no sense
of a kind of right to privacy,
'cause they lived very close to each other, certainly in cities,
often in large extended families in a small house.
Uh, and then kind of privacy became a thing,
um,
uh, as a sort of concept, and, and it's sort of going away again. I'm sure, you
know, there's, there's quite a lot of apathy amongst kind of Gen Zs about that.
Like, it's just, you know, that, that ship has sailed.
Carissa Carter: I think the thing that worries me right now is that, maybe, that apathy exists, right?
And people are giving away their location, their, their purchase history.
Mm-hmm.
That data's then like repackaged and given to you, right?
And it sways your next decision based on the media that's presented back to you.
Mm. Based on the directions that you're given to get from here to there based
on so many different factors, right?
Because an organization wants you to engage in their products.
And it starts to get to a moment where I worry that my thoughts and
behaviors aren't my own anymore.
Andy Polaine: Or that you're being nudged sort of unconsciously.
Carissa Carter: That I am being manipulated.
Andy Polaine: Yeah.
Carissa Carter: And right now it, I think all of our
behaviors are being manipulated.
I mean, this is the polarization of the internet.
I just, you know, one video:
you watch this one video and it says, oh, you might also like, right?
Like that is a well-documented phenomenon in things like YouTube that they've
worked really hard to combat, right?
But we know that, that our behaviors can be nudged.
But is there a moment when
my thoughts aren't even my own thoughts? Like, and I'm getting a little bit
philosophical here, but I do worry that our technologies are morphing our
own psyches in ways that we may not yet totally understand or appreciate.
Andy Polaine: I think, you know, I had Oliver Reichenstein on, who, uh, makes the
app iA Writer, and they have sort of quite a stance on AI, and his
one is, why should I read something that you haven't bothered to write?
And one of the things they kind of put in, built into the latest
version was this authorship mode.
So that if you do paste some content in, uh, you can label it as ai.
Mm-hmm.
And it kind of pastes it in as kind of, sort of grayed out
slightly, so that as you start to
rewrite it, you know, in a, in a sense, it's kinda like a game, like, how can
I kind of turn all that gray back to black again with my own words?
But there is this really compelling problem with that, where you read
it and you go, wow, I can't really think of a better way to put that.
And so I might as well leave it how it is.
I mean, you're at the d.school, I have students too, and, uh,
this is, this is definitely an issue, uh, um, where I think, um,
Carissa Carter: This was,
you know, we, we have a graduate program where you read essays for
everybody that's coming into Stanford.
And I think, you know, a lot of, a lot of people are talking, you know, around
the globe about, you know, people that have this year used AI to write essays.
And I think one thing, while I didn't see anyone blatantly
pasting in 'ChatGPT says blah, blah, blah, blah, blah', you
know, in their own personal essay.
Yeah.
There was a phenomenon I noticed this year that was different, which is that I
felt like there was a lot of pandering.
And, you know, like, in a way that it seemed like maybe the entire text of, of
the program website had been plugged in.
Right.
And then repackaged in a way to be like, let me tell you
how wonderful your program is.
And like, I, it just felt like, oh, there's something
going on with that phenomenon.
Andy Polaine: Yeah, yeah, yeah.
Yeah.
There there's definitely kind of little signatures that you start to read.
Mm-hmm.
I mean, I find it fascinating that you can tell,
you can kind of tell the difference between a Midjourney
image and a DALL·E image, right?
Mm-hmm.
And I'm kind of fascinated that in a pretty short amount of
time you can kind of spot the tells, you know, of those things.
And I, I kind of feel there is a bit of that, but, you know, it
also makes me then start to wonder how much... but I know exactly
what you mean by the pandering thing.
It is like, it's like kind of SEO, right?
It's like someone's pulled in all the keywords, and then you kind of read it
back and, again, it's kind of saying all the right things, but it's a bit soulless.
There's a kind of uncanny valley aspect to those texts I
think that, um, that show up.
Carissa Carter: And, and, and then let me just give you the flip too, right.
The amount of ways that I'm, um, you know, like, welcomed into...
I just came back from vacation, right?
And I used Suno to make a song for my out of office message, right?
And it was so fun to do that.
I could never have just composed a song in 15 minutes.
But here, all of a sudden, like that, AI has helped me
do something fun and creative that I wouldn't have been able to do otherwise.
In no way does that make me a songwriter or a musician.
Mm-hmm.
Which is just like, let's park that as a, as a thing.
But, like, it, it's
allowed me to participate in ways that I wouldn't before.
And like, I think that's incredible, right?
Because a lot of times some of these, some of these fields are
really inaccessible to people that wouldn't have access otherwise.
Andy Polaine: Yeah.
I'm totally split on that.
I have to admit the same because I, I, I remember the rise of blogging
mm-hmm.
And the blogging platforms.
I had a blog before, you know, when I had to just individually upload
the HTML files and, and sort of add them to my website, um, in like 1996 or
something. When things like Blogger and, and WordPress and a couple of other
early tools came out, it democratized publishing stuff onto the web, right?
Mm-hmm.
And so you suddenly got, because you didn't, you know, instead of having
blogs that were mostly tech-based, from tech people,
Mm-hmm.
You suddenly got, you know, the whole growth of diverse voices across
the, the web and you know, so in that sense it was kind of amazing.
Of course, we've sort of seen the, we're seeing the opposite happen now
of all these kind of walled gardens, which sort of reduced them down
and created these echo chambers.
But, well, I guess it goes back to my point before sort of anything that we
put out in the world has a, has its positive side, but, but also kind of
in excess, uh, has its negative side.
At the end of the book, you have a call to action and you talk about
design for healing, and you have sort of some principles in there.
Could you tell us a little bit about what those are?
Carissa Carter: Yeah.
Uh, the headline is that anything that you put into the world, whether
that be a thing, a system, a routine, absolutely everything is going to
either break or break something else.
And
instead of just trying to mitigate breakage all the time, how do we
design for healing, and healing the mistakes in the, in the effects
of what we put into the world?
So how do you continue to shepherd what we make into the future, instead
of putting it out there and just,
and, and saying it's one and done?
And so our call to action is to really understand that that's going
to happen, and that that should be a piece of your design work:
bringing it fully into the future as well.
Andy Polaine: Yeah.
I think one of the things to learn from those histories of the future stories
is that often we tend to sort of start from the beginning and kind of then
create a thing and then, and that's it.
But actually starting from the end is a, is a better place to start
sometimes rather than the beginning.
Carissa Carter: I love that.
Andy Polaine: Yeah.
So, uh, we're coming up to time.
As you know, the show is named after the, the Eames film,
which we talked about at the beginning. What one small thing is either overlooked
or could be redesigned that would have an outsized effect on the world?
Carissa Carter: I thought a lot about this.
I'm gonna say sunscreen.
Andy Polaine: Oh,
okay.
Carissa Carter: There are so many things I could, there are so many things I
could say to answer this question, and I'm saying sunscreen because it has
been in the back of my mind for years.
There are so many assumptions,
one, about how we protect ourselves from the sun's radiation.
Andy Polaine: Yeah.
Carissa Carter: One, that it, like, has to be topical, and how it works.
And it's more and more becoming
important as our climate shifts and those UV rays are hitting us more and
more throughout the day, and our rates of skin cancer are increasing rapidly.
And you haven't really seen too much technological innovation within it.
Okay.
If we could protect our skin, our bodies, in different ways, and this
doesn't need to be like a screen, like it could be other ways that
we shield ourselves, it could, you know, be within our behaviors,
I just, I think that, I think that we could see incredible shifts.
Andy Polaine: Very good.
As a bald man, I can tell you I wear a hat and stay in the shade as much as possible.
As a bald Englishman, I should probably say, say that.
Carissa Carter: and if anybody's listening that wants to work on this, could you
please make a sunscreen that we can drink?
Because I have always wondered why, why can't it be something that we, like,
secrete, right?
Like, you drink it and it comes out in the sweat.
Andy Polaine: Oh, I see, that you
sort of sweat out, and then... Yeah.
Carissa Carter: Like, versus, versus put it on, on the outside.
So.
Andy Polaine: All right.
That's a good idea.
But don't drink sunscreen right now.
No.
Where can people find you and, and Scott,
indeed,
and Armando, online?
Carissa Carter: Yeah.
Well, both of us work at the Stanford d.school, which, um,
has all of the, the d.school tags around the internet.
Um, you can find both of us on, on LinkedIn as well.
Um, I'm Carissa Carter,
he's Scott Doorley, and you can find, uh, the book anywhere books are sold.
Andy Polaine: Correct?
I'll put all the links in the show notes.
Carissa Carter: Thank you.
Andy Polaine: Thank you so much for being my guest on Power of 10.
Carissa Carter: Thanks for having me, Andy.
It was great to chat with you this morning.
Andy Polaine: You've been watching and listening to Power of 10.
You can find more about the show on polaine.com, where you can also check
out my leadership coaching practice and my courses, as well as sign
up for my irregular newsletter, Doctor's Note.
If you have any thoughts, please put them in
the comments or get in touch.
You'll find me as apolaine on pkm.social, on LinkedIn, on my website, and
all the links are in the show notes too.
Thanks for watching and listening.
I'll see you next time.