Insider: Dr Kaitlyn Regehr

The algorithm will see you now 

Dr Kaitlyn Regehr (UCL Information Studies) unpacks her agenda-setting research on the cultural impacts of social media, and why we all need to take greater control over the algorithms that shape our daily lives.

Two years ago, algorithms and digital literacy were not part of public discourse. Now, by way of Oprah Winfrey, CNN and the BBC – to name a few – they are. These conversations crystallise what I've witnessed over the past year: a growing chorus of people wanting greater control over their time, attention and digital device use.

Dr Kaitlyn Regehr on The Oprah Podcast, September 2025

I have two young children and I want them to grow up with a healthy relationship to technology. My wish? For them and their generation to look back on ours the way we view previous generations that thought smoking in delivery rooms was okay. In other words, as wildly unhealthy and outdated.

Beyond screen time

My research looks at how algorithms function and the ways hate, harm and disinformation flourish through the algorithmic economy. And perhaps more importantly, I work to develop tools for people to take greater control over these processes.

There's been enormous emphasis recently on banning smartphones in schools. But 88% of UK children aged between 3 and 17 are watching YouTube, not only on smartphones but also tablets and TVs [Ofcom 2025]. YouTube is an algorithmically driven platform, often featuring short-form, poorly produced content. The saddest thing is that decades of good advocacy around children's programming – rules about what was appropriate, what you could advertise – have been thrown out the window.

The guidance most of us rely on – limiting "screen-time" to one or two hours daily – emerged from childhood obesity research. But that guidance is about physical wellbeing only, and doesn't address mental health. It also assumes all screen-time is equal, which is clearly not the case.

A child watching CBeebies with their family is qualitatively different from a child scrolling through algorithmically driven content. One is community-building; the other is isolating. And in my book Smartphone Nation, I developed a digital diet guide – like a food pyramid – to show how some screen activities are healthy while others aren't.

The "You Loop"

The harms of the digital space look different for different people, and that is its terrible brilliance: it's so targeted. We all exist in hyper-personalised feeds, what Eli Pariser calls the "You Loop". And these hyper-personalised feeds have created a wealth of societal issues. Some worry about echo chambers and polarisation. Others focus on body dysmorphia – we were never meant to look at our own faces this much. Still others are concerned about how pornography is being used as a resource by young people, with a report from the Children’s Commissioner for England suggesting parallels between increasingly violent content and higher rates of domestic abuse in teenage relationships.

What is key to note is that all these issues link back to an attention economy, a model which relies on content becoming ever-more extreme and, therefore, attention grabbing. That's the business model. What I try to do is help people understand this unethical financial structure so they can recognise it and discuss it with their children from an early age.

Policy follows public will

I work extensively with the UK government and have fed into parts of the Online Safety Act (although I think it's far from adequate). But we need to be realistic: policy follows public will. If enough people begin to say that we're worth more than our eyeballs glued to screens at any cost, we will see change. Jonathan Haidt's The Anxious Generation only came out 18 months ago, but it's completely blown open the conversation. And it's a conversation that I hope we can continue.

I'm not involved in debates about the right to publication – if someone wants to post their suicide journey, that's a free speech issue. What I'm interested in is the right to dissemination. Should Meta have the right to algorithmically recommend that suicide journey to an anxious 14-year-old? Clearly not. Working with suicide prevention charity the Molly Rose Foundation, my collaborators and I aren’t looking for content to be banned, but for rules about how content is algorithmically offered to vulnerable young people.

Building knowledge and understanding

My colleagues Dr Katharine Smales and Dr Photini Vrikki and I are in the process of establishing a UCL Centre for Digital and AI Literacy – a concept that has already led to positive conversations with a large technology company. At the moment, we're swamped by requests for our expertise – from government departments, teaching unions, parent organisations, local authorities, the Metropolitan Police and many others. Having a dedicated centre will enable us to continue our work in bringing together policy, industry and key stakeholders and building public awareness that we're now living in an algorithmic economy.

Crucially, it will also allow us to research AI literacy – for example, understanding what skills young people are no longer developing because they're outsourcing them to AI. For too long, we've taught kids how to use technology, without teaching them how it works and how it affects us.

Reasons for hope

I remain optimistic. At the BBC's Teen Summit in Bradford, I had enlightened conversations with teenagers who said things like "I'm getting a brick phone." Teenagers will pave the way for much of this cultural change. We're seeing alternative phones for kids coming to market and more home phones being installed in kitchens again. There's value in the social awkwardness of calling your crush at home and speaking to their mother first. It builds social skills that we're in danger of losing.

This is more complex than a smoking analogy, because we can function without cigarettes but not without technology. But as I discussed in conversation with Heather Reisman, a great champion of literacy and of pushing back against digital harm, at the book launch for Smartphone Nation in Toronto: the fear factor with smoking was cancer. What we're facing is equally dire: we're outsourcing our brains and our children's brains. And when people understand that, they'll want to take control. And I believe they will.

Dr Kaitlyn Regehr is Associate Professor of Digital Humanities at UCL's Department of Information Studies. She’s also academic lead for the proposed UCL Centre for Digital and AI Literacy.

Her book Smartphone Nation is now available to buy.

Donate to support the UCL Centre for Digital and AI Literacy

If you are interested in learning more about supporting the work of Dr Kaitlyn Regehr and her team at UCL, please contact Kate Birch (Head of Development - Arts, Humanities, and Social Sciences):
kate.birch@ucl.ac.uk

Office of the Vice-President (Advancement)
University College London,
Gower Street,
London,
WC1E 6BT

ucl.ac.uk

Portico magazine features stories for and from the UCL community. If you have a story to tell or feedback to share, contact advancement@ucl.ac.uk

Editor: Lauren Cain

Editorial team: Ray Antwi, Rachel Henkels, Harry Latter, Bryony Merritt, Lucy Morrish, Alex Norton

Shorthand presentation: Harpoon Productions
Additional design support: Boyle&Perks
