INTRO: Welcome to the NSPCC Learning Podcast, where we
INTRO: share learning and expertise in child protection
INTRO: from inside and outside of the organisation.
INTRO: We aim to create debate, encourage reflection and
INTRO: share good practice on how we can all work together
INTRO: to keep babies, children and young people safe.
PRODUCER: Welcome to the NSPCC Learning Podcast.
PRODUCER: The online world is constantly changing and young
PRODUCER: people are often at the cutting edge of these
PRODUCER: changes.
PRODUCER: They're often more informed about online trends
PRODUCER: than adults, and are equipped with unique
PRODUCER: knowledge and understanding of what they need to
PRODUCER: know and do to stay safe online.
PRODUCER: It is important to listen to them and try and
PRODUCER: incorporate their voice into your online safety
PRODUCER: work. In this podcast episode, recorded in
PRODUCER: January 2025, we'll be doing just that.
PRODUCER: You'll hear from two members of the Voice of
PRODUCER: Online Youth, a group of young people aged 13
PRODUCER: to 17 who help advise the NSPCC
PRODUCER: and the wider online safety sector on how to help
PRODUCER: children have safe and happy experiences online.
PRODUCER: Will and Zara will provide their own insights and
PRODUCER: experiences on what online life is like from
PRODUCER: their perspective, including what they get up to
PRODUCER: online; what steps they take to stay safe;
PRODUCER: what worries them about being online; and what
PRODUCER: they'd like to learn more about when it comes to
PRODUCER: online safety.
PRODUCER: I'll hand over now to Will and Zara to introduce
PRODUCER: themselves.
WILL: Hi, my name is Will and I guess I joined the Voice of
WILL: Online Youth because I really think it's
WILL: great to give young people a voice when often in this
WILL: space they're overlooked, even though online safety
WILL: is centred around young people because they're the
WILL: ones often online.
WILL: I really enjoy public speaking, so I thought this
WILL: would be a great opportunity to give a voice to young
WILL: people when it comes to online safety.
ZARA: Hi, my name is Zara and I joined the Voice of Online
ZARA: Youth because I want to make the internet a safer and
ZARA: more positive place for people my age.
ZARA: A lot of the decisions about online platforms are
ZARA: made by adults who don't fully understand what we go
ZARA: through or what matters to us, and I want to make
ZARA: sure our voices are heard.
PRODUCER: Thank you both so much for joining us on the
PRODUCER: podcast. In this podcast, we want to try and
PRODUCER: provide an understanding of what online life is
PRODUCER: like for young people.
PRODUCER: So my first question: what sorts of activities
PRODUCER: do you get up to online?
WILL: So online nowadays is almost everything
WILL: that I do because it's not
WILL: just playing video games and talking with friends
WILL: and things online. It's also that school's involved
WILL: online. I look at all my homework online and I do
WILL: quizzes online.
WILL: There's quite a lot online that I feel like people
WILL: wouldn't really notice is there.
WILL: It's just sort of passively online and it's there.
ZARA: I spend a lot of time online doing a mix of things.
ZARA: Social media is probably the main one.
ZARA: I keep in touch with friends or just see what's
ZARA: trending. I also stream music or watch shows,
ZARA: but I feel like when adults think of online...
ZARA: Well, uses for the internet, they don't have
ZARA: a nuanced view. They think it's just for
ZARA: entertainment. But, similar to what Will said,
ZARA: we have a lot of our school work on there or anything
ZARA: educational. It's not just one singular thing.
WILL: It's not like being online is a necessity.
WILL: It's just really, really useful.
WILL: And it's a tool that a lot of people can use and get
WILL: a lot of use out of.
WILL: Sure, I could just dial up my friends on the home
WILL: phone, but it's a lot easier to just Facetime them
WILL: and then be able to see them as well.
WILL: And it's more convenient.
PRODUCER: What's the one thing that you wouldn't be able to
PRODUCER: do without when it comes to being online?
ZARA: I feel like for me, schoolwork is
ZARA: one of the big ones because, frankly, I don't
ZARA: see anyone going to the library
ZARA: and getting textbooks; most of them are online for
ZARA: me. And if I couldn't do that, I'd have to stay
ZARA: after school and then it would make my journey longer
ZARA: and it would affect other things in my life.
ZARA: And also, just talking to friends.
ZARA: Some of us live really far away, so we can't really
ZARA: meet person to person, and it's just,
ZARA: all in all, more convenient.
WILL: I totally agree with Zara that schoolwork is really
WILL: useful to have online, but I'd say if there's one
WILL: thing that would have to, you know, take the top, it
WILL: would probably just be as simple as talking to your
WILL: friends online.
WILL: I think it's really overlooked that you can just send
WILL: people messages and they'll get back to you really
WILL: quickly. Whereas, you know, before the internet
WILL: was really a thing, that just couldn't happen.
PRODUCER: So it's clear that being able to go online is
PRODUCER: important to both of you.
PRODUCER: My next question is more to do with online
PRODUCER: safety. What makes you feel safe and comfortable
PRODUCER: online?
ZARA: I feel comfortable online when I know the platform
ZARA: I'm using really well.
ZARA: Because if I don't, I feel like, if I see
ZARA: something bad, how am I going to report it?
ZARA: Or how am I going to tell someone?
ZARA: Familiarity. It makes it easier to navigate and I
ZARA: know how to handle settings like privacy controls or
ZARA: reporting tools. It's also reassuring when I'm
ZARA: part of communities that are really well moderated
ZARA: and there's no toxic behaviour or, well, it's not
ZARA: known for toxic behaviour.
ZARA: I generally feel safest when I'm in control
ZARA: of what I see or what I share or what
ZARA: people say to me.
WILL: Zara's pretty much hit the nail on the pin there— on
WILL: the pin? On the head. But yeah, I feel like there's
WILL: not too many times where I'm feeling uncomfortable
WILL: online now that I'm just familiar with everything
WILL: that I see. Yeah, it's just so normalised
WILL: that, because I'm in control of pretty much
WILL: everything that I see, there's nothing terrible that
WILL: makes me feel really uncomfortable.
PRODUCER: I wanted to pick up on Zara's point about
PRODUCER: moderation and privacy settings.
PRODUCER: Do you both take control of your own privacy
PRODUCER: settings, or is this something that maybe your
PRODUCER: parents were involved in?
PRODUCER: Or maybe a bit of both?
ZARA: I feel like when I was younger, my mum would always
ZARA: make an account for me and she just put me on child's
ZARA: settings and then she'd think I'd be safe.
ZARA: Now I just... More than that I just alter
ZARA: my 'For You' page more on things I don't want to see,
ZARA: rather than just outright block everything
ZARA: that's not appropriate, because then I can just
ZARA: feel safe and I can still find stuff that I like
ZARA: and be entertained by.
PRODUCER: And just quickly, when you say the 'For You'
PRODUCER: page, what app are you talking about here?
PRODUCER: Is it Instagram? TikTok?
ZARA: I still use TikTok and it's got
ZARA: a button where you can say, "I'm not interested in
ZARA: this". And whenever I get a new app like
ZARA: Instagram — I got Instagram a few days ago and the
ZARA: whole day I just spent sorting it out, going through
ZARA: a bunch of videos, saying what I like and don't like.
ZARA: So then the next day, it's kind of used to content
ZARA: that I enjoy, and I think that's really important
ZARA: because otherwise, you know, algorithms won't work or
ZARA: anything.
PRODUCER: I'm really struck by what you say Zara about,
PRODUCER: kind of, manipulating the algorithm so that
PRODUCER: you're getting the content that you want to see.
PRODUCER: When it comes to things like moderation,
PRODUCER: are you approaching it from a "I know what I want
PRODUCER: to see" kind of way? Or is it more of a "I
PRODUCER: know I don't want to see" kind of way?
PRODUCER: And are there settings within the apps that you
PRODUCER: will set up to block any content that you don't
PRODUCER: really want to see?
ZARA: I feel like it's a mix of both.
ZARA: For example, TikTok: if you just say "I'm not
ZARA: interested" in stuff, the stuff that you are
ZARA: interested in will just be there.
ZARA: But other apps work a bit differently.
ZARA: For example, Instagram: I like the videos
ZARA: that I like and enjoy, so then those mainly come
ZARA: up. But for TikTok it's the other way around.
ZARA: I just say "I'm not interested in this".
ZARA: So it just varies across apps.
WILL: Well, I mean, it was really interesting seeing what
WILL: Zara had to say about manipulating your algorithm,
WILL: because I kind of do that too.
WILL: You know, you have to like the videos that you like,
WILL: straight away just scroll past videos that you
WILL: wouldn't find interesting, even if— Say you stumble
WILL: upon a video that has, you know, completely opposing
WILL: views to you, and you want to view that video just
WILL: out of curiosity because what have I got to say?
WILL: I don't like this kind of content, but still, I'm
WILL: interested in the opposing views to mine.
WILL: And then you just suddenly get loads of videos like
WILL: that, hundreds of videos that are just
WILL: not your views. And then you're like, "oh no, what's
WILL: happened here?" So you have to be really careful with
WILL: what you watch and what you like and what you
WILL: dislike.
ZARA: I think yesterday, I was watching something out of
ZARA: curiosity, and then the next day my
ZARA: whole 'For You' page filled with that.
ZARA: And then it didn't let me go back to my original
ZARA: 'For You' page for a couple of days. I think that's
ZARA: definitely a really good point.
PRODUCER: And do you worry about algorithms and how they're
PRODUCER: affecting what you see?
ZARA: Maybe not for me, but I feel like for other people,
ZARA: because if they see that repeatedly...
ZARA: I don't know about you, but if you see a story and
ZARA: you hear it from one person, I feel like you're
ZARA: more likely to believe it from the first person you
ZARA: heard for some reason.
ZARA: And I feel like if other people saw
ZARA: algorithms just, you know, only conveying
ZARA: one view, they'll be more likely to believe that.
ZARA: And then it just creates a whole cycle.
WILL: Yeah. I was just going to say it is kind of crazy how
WILL: fickle algorithms are.
WILL: The other day, I was just trying to make a point to
WILL: my TikTok that I want to watch lots of stand up
WILL: comedians. So I was following all these great stand
WILL: up comedians, I was like, "oh, this is great".
WILL: And then the next day, because of Donald Trump's
WILL: inauguration, my whole algorithm was filled with
WILL: content about that.
WILL: And then suddenly, now that it isn't as relevant
WILL: anymore, I'm now getting stand up comedians again.
WILL: So I just wish it was more permanent.
WILL: But it still needs to be easy to change.
PRODUCER: Just really fascinating to hear you both talk
PRODUCER: about the algorithms and how you can manipulate
PRODUCER: them or what changes you'd like to see.
PRODUCER: My next question — we sort of touched on this
PRODUCER: because it's a flip of the question I've just
PRODUCER: asked — what makes you feel less comfortable
PRODUCER: online?
WILL: Yeah, I guess fairly similarly to what we've already
WILL: said, just videos coming up that we don't feel
WILL: comfortable with that are hard to push
WILL: away. Especially when if you watch the entire video,
WILL: even if it's just by accident, like you leave it
WILL: sitting on the counter and it runs twice, then you're
WILL: going to get lots of videos like that one.
WILL: So yeah, I guess better control would be very
WILL: helpful.
ZARA: Something that makes me feel less comfortable is when
ZARA: there's misinformation, but when
ZARA: it's a large amount.
ZARA: For example, for my 'For You' page, it's a mixture
ZARA: of what I like and what's trending.
ZARA: And sometimes people like stuff that
ZARA: isn't true or AI deepfakes or something.
ZARA: For example, my dad sent me a really
ZARA: popular video and I was like, "oh, this is going to
ZARA: be funny." It was an AI image.
ZARA: And my dad's like, "oh my God, that's so cool.
ZARA: Have you seen this before?" And I was like, "no, dad,
ZARA: it's AI".
PRODUCER: Is it something that you'd like to learn more
PRODUCER: about in school — the impacts of misinformation
PRODUCER: or how these things work, how these technologies
PRODUCER: work?
ZARA: Yeah, definitely. Today I had PD day — which is
ZARA: personal development, but I think some schools have
ZARA: PSHE — and the theme was online safety.
ZARA: I don't think we did anything on AI or
ZARA: misinformation. It was just cyberbullying,
ZARA: don't post certain stuff, digital footprint; just
ZARA: like the normal things that we learn in school.
ZARA: And if you do want to learn about it, I feel like you
ZARA: have to actually research it.
ZARA: And then sometimes the research that you do, it's
ZARA: biased. So I feel like it'd be really good if you had
ZARA: a reliable source of information of how to tell
ZARA: what's AI or not.
WILL: Like Zara, I also think it's interesting that schools
WILL: are still, in today's day, repeating the
WILL: same information that lots of us have heard again and
WILL: again about cyberbullying and stuff that,
WILL: you know, we all know is important, but we've heard
WILL: it a million times. Whereas I've never heard a
WILL: teacher mention AI to me in a way that wasn't just
WILL: conversational.
WILL: So it would be really interesting to see the
WILL: curriculum updated to mention current topics
WILL: and affairs. You know, artificial intelligence is
WILL: sort of a sci-fi movie concept, but it's real today.
WILL: It's reality, and we need to be taught about it.
ZARA: I 100% agree with Will, because the internet
ZARA: and the online world is evolving, but our
ZARA: curriculum isn't.
ZARA: For example, my teacher — she was giving an example
ZARA: of an app and she said 'MySpace'.
ZARA: And then she said 'Vine'.
ZARA: And mind you, most of us are Gen Alpha, Gen Z,
ZARA: so none of us knew what it was unless we'd seen
ZARA: skits. And we were so confused, because she was
ZARA: giving us an example of how to report
ZARA: something, and we were so confused the whole time.
PRODUCER: Yeah. And with young people using so many
PRODUCER: different apps, it can be tricky to have
PRODUCER: knowledge of all of them.
PRODUCER: When you make those decisions to say, "oh,
PRODUCER: I'm going to use this new app", do you talk to
PRODUCER: anyone about it? Do you talk to your teachers?
WILL: I mean, I guess when I first got Instagram, not
WILL: a lot of people that I knew were on it.
WILL: So there's barely anyone to talk to about it, to be
WILL: fair. But, you know, you'll mention it to your
WILL: friends like, "oh, I've got Instagram and there's
WILL: this feature that I didn't know is there, and there's
WILL: this and that."
ZARA: I don't think anyone thinks about talking
ZARA: to a teacher.
ZARA: If I was getting Instagram, I didn't go to my form
ZARA: tutor and ask "Miss, should I get Insta?" I just
ZARA: thought of getting it.
ZARA: So I just went on my app and downloaded it.
ZARA: But I feel like even if I did, I feel like they'd
ZARA: tell me not to or say something, you
ZARA: know, negative. Or if I said I'm going to get
ZARA: something, they're just going to be like, "make sure
ZARA: to set, you know, blocks and stuff on there." I
ZARA: feel like we only hear teachers talk about how to
ZARA: stay safe online, like it's so bad that
ZARA: we need to be taught how to stay safe, not to enjoy
ZARA: it. It's normalised that it's bad
ZARA: and we should know how to deal with it, rather than
ZARA: what things it could be useful for.
WILL: Yeah, definitely there needs to be more positivity
WILL: about being online. But again, you can't just
WILL: completely ignore the negatives, because there are
WILL: negatives and they are important.
WILL: It's sort of important to have a non-biased
WILL: view of things and a balanced view of things.
WILL: Otherwise it's just not going to work.
PRODUCER: Yeah, my next question was going to be more about
PRODUCER: those worries and concerns and
PRODUCER: possibly negatives rather than positives.
PRODUCER: What are the things about being online that
PRODUCER: concern you the most right now?
WILL: I guess the thing that is most concerning is people's
WILL: lack of awareness.
WILL: If you look online and you see something,
WILL: I can usually tell if something's AI generated, but
WILL: it's getting better and better.
WILL: But there's so many people in the comments section,
WILL: they're like, "wow, is this real?
WILL: I never knew that, that's crazy!" Because why
WILL: wouldn't you believe it? Because it's on your phone,
WILL: and BBC News is on your phone.
WILL: It's only one click away from your TikTok feed.
ZARA: That's a really good point, because if you think
ZARA: about concerns, people are just going to say
ZARA: generated images or deepfakes.
ZARA: But the actual aspect of people not being aware of
ZARA: how bad it is, is much more scary to think about.
ZARA: This morning we had an assembly on how AI
ZARA: was, you know, manipulating people, which is actually
ZARA: quite good for my school; quite progressive to talk
ZARA: about AI. They gave an example that
ZARA: someone cloned David Attenborough's voice
ZARA: and the person next to me said, "oh, that's so funny.
ZARA: Imagine people are listening to that and getting
ZARA: pranked." I think she gave an example of just David
ZARA: Attenborough saying, "oh no, the house cat has gone
ZARA: extinct." And then people, you know, believing that.
ZARA: And you feel like it's normal to make jokes but
ZARA: I don't think a lot of people understand that it
ZARA: could be used for much more malicious purposes.
PRODUCER: And speaking of malicious purposes,
PRODUCER: are either of you worried about the potential use
PRODUCER: of generative AI in relation to cyberbullying?
WILL: I mean, it's easy to worry about it because it's
WILL: already happening. You know, people are getting
WILL: images made of them that are just— obviously they
WILL: aren't them. But to a lot of people, they see that
WILL: image straight away and think, "that's my friend.
WILL: Why are they doing that?" And it's concerning the
WILL: effects that AI can have and is already having.
WILL: But, as well, you have to look to the future.
WILL: Because if AI is like this now, and you compare that
WILL: to AI from last year, there's already been
WILL: so much progression that it's only going to get more
WILL: and more realistic. So change needs to happen now to
WILL: stop more concern in the future.
PRODUCER: How would you advise schools to approach the
PRODUCER: problem of generative AI?
WILL: I mean, it would benefit a lot of people if they just
WILL: had one lesson in their entire life that said, you
WILL: know, how to spot AI images, look for extra fingers
WILL: and that kind of thing.
WILL: Because a lot of people...
WILL: Even though a lot of people would assume everyone
WILL: from the younger generation is super-prepared
WILL: and knows everything about being online and
WILL: they're constantly online, even people who are
WILL: constantly online don't have to be 100%
WILL: informed about the risks or about AI.
ZARA: Everything that Will said I agree with, especially
ZARA: about how it has to be done now.
ZARA: For example, how you have to write that something is
ZARA: AI generated, or if something, you
ZARA: know, is made by AI.
ZARA: Even companies now are getting away with it.
ZARA: For example, ChatGPT.
ZARA: It says ChatGPT can make mistakes and blah blah blah.
ZARA: But it's in really tiny
ZARA: font and it's in a separate tab thing, you have
ZARA: to click an arrow for it to appear, and you have to
ZARA: actively search for it to say that it's
ZARA: AI generated.
ZARA: Especially in images where you can make AI generated
ZARA: images, it's not even a watermark anymore; it's
ZARA: a tiny, like, writing bit on the edge of
ZARA: it.
WILL: Yeah. On Zara's point, what she was saying about how
WILL: ChatGPT has a very small marker saying "this could
WILL: not be true." AIs like ChatGPT and Google
WILL: Gemini, they'll present pretty much everything as
WILL: solid fact. I know it went viral a short while
WILL: ago: if you ask ChatGPT how many R's are there in the
WILL: word 'strawberry', it literally doesn't know.
WILL: But it'll say to you, "oh, there are five R's in
WILL: 'strawberry'", and it's not going to say
WILL: "sorry, I just don't know how to answer that" because
WILL: it can't. It's not programmed to say "no" to
WILL: anything that isn't illegal, basically.
ZARA: There was someone saying "one plus one equals three"
ZARA: to ChatGPT so many times that it just said, "yeah, it
ZARA: is three". So many other people did that.
ZARA: And now if you ask, there's a
ZARA: chance that it might say "three" because ChatGPT is
ZARA: not based just on AI, it's kind of
ZARA: general knowledge and it just picks the most looked
ZARA: at. For example, if you just Google on Google Gemini,
ZARA: it just picks the top answers and then puts it there.
ZARA: So it's not definite fact but people take it as
ZARA: definite fact. And if you're googling something, you
ZARA: don't want to go into an article, you want it in a tiny
ZARA: little square at the top of your page, because
ZARA: I don't think people really take the initiative to
ZARA: check if the information is true or not.
ZARA: They just, you know, go with it.
PRODUCER: Moving on from generative AI a little bit, I
PRODUCER: wonder if there are any other things about being
PRODUCER: online that concern you at the moment?
WILL: I guess if there was one other thing other than
WILL: generative AI that's concerning, it's
WILL: the amount of control that you don't
WILL: have over what you see.
WILL: I know we were mentioning earlier about trying to
WILL: manipulate your algorithm into giving you the videos
WILL: that you want.
WILL: But nowadays people can buy views.
WILL: Like literally, if you upload a TikTok video, it'll
WILL: say, "oh, this video is doing great.
WILL: Want to make it do greater? Pay us £5 and then we'll
WILL: give it 100,000 views." That's
WILL: scary that you can just manipulate what other people
WILL: see. And that means that you're not in control of
WILL: what you see either.
ZARA: I know how we were talking about, you know,
ZARA: manipulating your 'For You' page, but I don't think
ZARA: anyone really understands how algorithms actually
ZARA: work or why it's tailored to me.
ZARA: I know it's just because, you know, if you like
ZARA: something, it will come up to you, but also the
ZARA: impact on following people.
ZARA: For example, when I got Instagram, I followed my
ZARA: friends and the videos that they watched came to me
ZARA: even before they sent it to me.
ZARA: So I'd really love to learn how that works as well.
PRODUCER: How does that make you feel when you're getting
PRODUCER: served the same content that your friend is
PRODUCER: before they even send it to you?
PRODUCER: Does that feel strange?
ZARA: I think it does feel strange, because if you think
ZARA: about the 'For You' page, it literally has the word
ZARA: "you", not "for your friend" page.
ZARA: But, sometimes... I used to think it was kind of
ZARA: funny because, before my friend would send me a meme,
ZARA: I could say "I watched that already".
ZARA: But now it's kind of...
ZARA: I don't really like how it impacts stuff, because
ZARA: even though they are my friends, some stuff I don't
ZARA: agree with or I don't find as funny, but it'll
ZARA: just be on my 'For You' page. So I think it's
ZARA: something that needs to be looked at.
WILL: Yeah, I totally agree with you Zara on that point
WILL: because I've been having the same problem recently.
WILL: My friend sends me a video and 80% of the time I've
WILL: already seen it. Is this really a 'for me' page?
WILL: Because it seems like everyone in my friendship
WILL: group's getting the exact same posts, and
WILL: that's just strange to me.
PRODUCER: And do either of you worry about
PRODUCER: the data that these apps and these online
PRODUCER: companies are gathering to power their
PRODUCER: algorithms? Is that a concern for either of you?
WILL: I feel like you're kind of aware that
WILL: they're taking your data. And because it, you
WILL: know, says it in the name — it's 'for you'.
WILL: It feels like I don't mind
WILL: too much that they're taking it because at least
WILL: they're giving me videos.
WILL: But it is concerning when you don't know where that
WILL: data might end up outside of the platform that you're
WILL: on.
ZARA: I feel like you obviously consent to stuff.
ZARA: To be on the app, you have to click 'consent',
ZARA: otherwise you can't use the app
ZARA: itself. And I think people just complain for the
ZARA: point of complaining, but you can actively look for
ZARA: it. But I remember we were working on our manifesto
ZARA: last year, and we were talking about how a lot of
ZARA: young people, or just people in general, don't really
ZARA: look at it; they just click 'accept' because, you
ZARA: know, I want to see the videos.
ZARA: And I think— I think it was Will's group, they made
ZARA: the idea of having a little pop-up tab of
ZARA: the vendors and stuff, so then people can see where
ZARA: the data gets sent to. But it's only a summary of it,
ZARA: so then people are more likely to read it.
PRODUCER: I'm conscious we're nearly at the end of our
PRODUCER: time, but I have one concluding question for you
PRODUCER: both. What are the top three things
PRODUCER: you think adults should know about what online
PRODUCER: life is like for young people?
WILL: I've got three. They're not really in any kind of
WILL: particular order.
WILL: So, my first one being that generative AI
WILL: isn't just homework answers.
WILL: It is a lot more than that and it can be
WILL: genuinely quite dangerous.
WILL: My second one is that algorithms are
WILL: a big part of young people's lives and they
WILL: aren't super easy to control or get out of your life.
WILL: It's not like you can just put the phone down a lot
WILL: of the time.
WILL: That doesn't mean it's super hard to either.
WILL: It's like... It's a controllable yet
WILL: uncontrollable part of your life.
WILL: So, I guess my last point is it's not as easy
WILL: as just hitting block or turning off your phone
WILL: to stop an issue that's happening
WILL: because you can't just switch off your phone
WILL: for the rest of your life. I feel like even adults
WILL: would know that. You know, I know so many adults that
WILL: are scrolling on Instagram Reels all day and then
WILL: belittling young people for scrolling on TikTok all
WILL: day — it's the same thing.
WILL: But you can't just hit block either, because, you
WILL: know, people constantly make new accounts, they find
WILL: new ways, and they adapt around changes.
PRODUCER: Thanks, Will. Zara, what are your thoughts on
PRODUCER: this question?
ZARA: I have one key point that goes into three.
ZARA: Online safety isn't just cyberbullying, because I
ZARA: feel like that's the main focus of every single PD
ZARA: lesson. And also the complexity of cyberbullying.
ZARA: It's not just someone blatantly saying, "I don't like
ZARA: you." It kind of includes microaggressions and
ZARA: even stuff that teachers don't realise, like if
ZARA: someone didn't tag you in a post, that would
ZARA: hurt, or something like that.
ZARA: And it's not just someone commenting on your post or
ZARA: DMing you privately, but it's other stuff,
ZARA: more detailed stuff. And I feel like schools should
ZARA: be more aware of that. And also addictive behaviour:
ZARA: I feel like schools think that we're
ZARA: the ones at fault for it, like we're actively
ZARA: thinking, "oh I'm going to go on my phone.
ZARA: I'm not going to do my homework.
ZARA: I'm going to spend six hours on TikTok in my bed."
ZARA: But I don't think they understand the role of
ZARA: algorithms. It's designed to keep
ZARA: us scrolling for hours. That just means they're doing
ZARA: their job well. So it's not just us at fault.
ZARA: And also the emotional impact of misinformation.
ZARA: I feel like a lot of schools just teach
ZARA: us not to spread information because, you know, it
ZARA: could be used maliciously.
ZARA: But even hearing it,
ZARA: a lot of people feel emotionally connected to
ZARA: something. And if they hear something against that
ZARA: that isn't real; or they know it isn't real and then
ZARA: loads of people are agreeing with that; or your
ZARA: friends have seen it and they're like, "yeah, that's
ZARA: definitely true, I saw it ten times", it
ZARA: could be overwhelming and it's really confusing.
PRODUCER: So that brings us to the end of our discussion
PRODUCER: today. Zara,
PRODUCER: Will, thank you so much for joining us on the
PRODUCER: podcast and providing a really interesting
PRODUCER: insight into your thoughts and opinions on the
PRODUCER: online world and online safety.
WILL: Thank you so much for having us.
ZARA: Thank you.
PRODUCER: If you've enjoyed this episode and would like to
PRODUCER: hear more from the NSPCC's Voice of Online Youth,
PRODUCER: you can find a link to their manifesto for change
PRODUCER: in the podcast shownotes.
PRODUCER: You'll also find links to lots of other resources
PRODUCER: and training to help you keep children safe
PRODUCER: online.
PRODUCER: Thanks for listening.
OUTRO: Thanks for listening to this NSPCC Learning Podcast.
OUTRO: At the time of recording, this episode's content was
OUTRO: up to date, but the world of safeguarding and child
OUTRO: protection is ever-changing.
OUTRO: So, if you're looking for the most current
OUTRO: safeguarding and child protection training,
OUTRO: information or resources, please visit
OUTRO: our website for professionals at
OUTRO: nspcc.org.uk/learning.