King’s College London and ICS.AI collaborate to deliver safe, personalised AI in higher education
King’s College London piloted AI to give students and staff the benefits of generative AI without compromising academic integrity

King’s College London has piloted ICS.AI’s SMART: Learn (known as the KEATS AI Pilot during the collaboration) to explore how a university-approved teaching and learning AI copilot could help students understand complex topics faster, practise with relevant questions, and build confidence ahead of exams.
Although students were already experimenting with public AI tools, faculty and digital learning leaders wanted a safe, transparent copilot: one grounded in curated academic content that cited official materials, respected assessment rules, and encouraged learning rather than shortcut-seeking.
KEATS AI was designed around two complementary student experiences:
- Ask & Understand - a conversational way to query authorised module content, with inline citations and guardrails.
- Test Me - formative quizzes generated from module materials to support self-assessment and revision.
The pilot covered around 1,000 eligible students across two MSc Neuroscience modules. Among the students who engaged, KEATS AI handled over four thousand chat sessions and nearly thirty thousand messages. During revision week, Test Me became a go-to feature, with students using it to generate practice questions from their own module materials.
Behind the numbers, students also described a shift in study experience: faster access to relevant explanations, clearer links between topics, and less stress.
From minimum viable product to meaningful usage
The collaboration rolled out in phases. The first phase focused on the foundations: deploying a working student copilot, a beta educator view, and a prototype integration path with KEATS (Moodle). Once the core capabilities were in place, the team shifted to an adoption and evaluation phase with live cohorts. That pivot proved decisive - moving the conversation from ‘what can it do?’ to ‘how does it help in practice?’
Students could then switch between general GenAI access and module-aware experiences, bring content into a chat with a simple @reference, and use an initial version of SMART: Notes to work with transcripts and documents.
Teaching and integrity, by design
KEATS AI was deliberately tuned to coach learning. When a request drifted towards producing assessed work, the copilot redirected the student to plan their approach or practise with questions instead. Inline citations and scoping to authorised content made it easier for students to see, and trust, what the copilot was drawing on. For educators, the pilot also laid groundwork: a beta teaching view, evolving analytics, and a growing library of adoption materials.
A Senior Neuroscience Lecturer at King’s College London commented during the pilot:
“Having an approved King’s College London tool ensures [the use of AI] is aligned with our pedagogy, our curriculum and our institutional values as well. It allows us to create prompts that reflect specific module outcomes, the reading for the modules and the assessment requirement. We can ensure accessibility and data privacy. That’s really important.”
Martin Neale, CEO and founder at ICS.AI, added:
“By providing a sanctioned, transparent copilot with citations and scope limits, King’s College London gave students a trusted route to more effective AI usage. Channelling students towards practice, planning, and understanding has built educator confidence without diminishing the benefits of AI.”

KEATS AI was built on the ICS.AI SMART: AI Platform, hosted in the King’s College London Azure environment for security, compliance, and scalability. A jointly agreed technical implementation ensured that all data remained within the King’s Azure environment under King’s control, in line with the university’s data-protection policies.
Explore Safe, Personalised AI for Your Institution
Discover how King’s College London piloted a university-approved, content-grounded AI copilot to support learning, protect academic integrity, and build student confidence. Read the full case study to see how it worked, what was learned, and what it takes to deploy AI responsibly at scale in higher education.
If you’re exploring AI opportunities across teaching, student services, or back-office operations, we’d be happy to share practical insights from the pilot and discuss what a safe, institution-wide approach could look like for you. Speak to our team about AI in your institution.