Struggling readers spotted by eye-tracking software trial
There’s a telling moment I experienced recently whilst visiting a primary school in Staffordshire. The headteacher, a veteran of twenty-odd years in education, gestured towards a classroom where Year 5 pupils were using a combination of Lexplore’s eye-tracking technology integrated with askKira’s analysis platform to identify potential reading challenges.
“Five years ago, I’d have called this science fiction,” she confided. “Now I’m lying awake wondering if we’re handling their data properly.”
Her concern mirrors conversations I’m having with trust leaders and heads across the country. We find ourselves at a fascinating intersection – eager to embrace innovations that demonstrably improve outcomes, yet acutely aware of our profound responsibility as guardians of children’s data and digital experiences.
The potential of education technology to transform learning outcomes has never been greater. When Lexplore’s advanced eye-tracking technology is paired with askKira’s intelligent analytics platform, we can identify reading difficulties in minutes rather than months, allowing for targeted early intervention. Our combined solution doesn’t just gather data – it transforms it into actionable insights that teachers can use immediately to support struggling readers.
Yet every innovation brings with it questions that simply didn’t exist when many of us began our careers. Each eye-tracking session recording a pupil’s reading behaviours, each recommendation our systems make – all represent both opportunity and responsibility.
Recent concerns raised by privacy campaigners about data handling in schools aren’t alarmist – they reflect legitimate questions about how we balance innovation with protection. The DfE’s guidance provides a foundation, but the landscape evolves faster than regulation can keep pace.
Based on our work with trusts nationwide, I believe there are five key principles leaders should embed in their approach:
Theory is one thing; practical implementation is another. In one trust we work with, the introduction of the askKira-Lexplore combined solution was preceded by parent information evenings where the technology was demonstrated and questions answered openly.
Another MAT using our integrated platform has established what they call “digital ethics ambassadors” – pupils in Years 5 and above who participate in discussions about new technology and help explain privacy considerations to their peers in age-appropriate ways.
Several schools have integrated discussions about data privacy directly into the computing curriculum, helping pupils understand how technologies like eye-tracking work and why data protection matters.
The education sector hasn’t always been renowned for its technological agility, but the pandemic changed that narrative dramatically. Now, as we look towards a future where integrated AI and data-driven tools like askKira’s analytics platform combined with Lexplore’s eye-tracking become increasingly embedded in teaching and learning, we must ensure our ethical frameworks keep pace.
I believe the trusts that will thrive are those that view data protection not as a compliance burden but as a core element of their duty of care – as fundamental as safeguarding or curriculum design.
The concerns of privacy campaigners should be welcomed, not feared. Their scrutiny helps us build better systems and more thoughtful approaches. When they raise flags about data handling – particularly around sensitive technologies like our combined eye-tracking and analysis solution – our response shouldn’t be defensive but collaborative: a chance to demonstrate our commitment to getting this right.
As one trust CEO recently put it to me: “By using askKira’s platform with Lexplore, we’re understanding how children learn to read at a level of detail previously impossible. With that power comes the responsibility to be utterly transparent about how that data improves outcomes.”
I couldn’t agree more.
Lorna Stockwood is COO of askKira, working with MATs and schools across the UK to implement ethical, organisation-wide AI and effective education technology solutions that identify and address educational challenges. For more information, visit askkira.com