
King’s College London
King’s College London and ICS.AI light the path for safe, personalised AI in higher education

King’s College London wanted to give students and staff the benefits of generative AI without compromising academic integrity. Together with ICS.AI, the university piloted SMART: Learn (known as the KEATS AI Pilot during the collaboration) to explore how a university-approved teaching and learning AI copilot could help students understand complex topics faster, practise with relevant questions and build confidence ahead of exams, while also laying foundations for educator workflows and future VLE integration.
What began as a minimum viable product matured - through careful adoption planning, a redesigned interface and in-class onboarding - into a live evaluation with Neuroscience cohorts. The result: sustained student engagement, strong advocacy for a trusted, content-grounded copilot, and intelligence on the effort needed to bring in-module, in-VLE capability into production-scale use.

Students were already experimenting with public AI tools. Faculty and digital learning leaders wanted something different: a safe, transparent copilot that cited official materials, was grounded in curated academic content, respected assessment rules, and encouraged learning rather than shortcut-seeking. The remit for ICS.AI was to deliver a copilot that could thrive in the realities of a large university - technically, pedagogically, and culturally.
KEATS AI was designed around two complementary student experiences:
- Ask & Understand - a conversational way to query authorised module content with inline citations and guardrails.
- Test Me - formative quizzes generated from module materials to support self-assessment and revision.
From the outset, the project paired product development with a structured adoption plan: onboarding during lectures, weekly training and Q&A, and clear communication that set expectations, showcased responsible use, and invited feedback.
From MVP to meaningful usage
The first phase focused on the foundations: deploying a working student copilot, a beta educator view, and a prototype integration path with KEATS (Moodle). Once the core capabilities were in place, the team shifted gears to an adoption and evaluation phase with live cohorts. That pivot proved decisive - moving the conversation from ‘what can it do?’ to ‘how does it help in practice?’
A redesigned multi-panel user interface (UI) arrived mid-trial. Students could switch between general GenAI access and module-aware experiences, bring content into a chat with a simple @reference, and use an initial version of SMART: Notes to work with transcripts and documents. The modern UI, tuned for study flows with small, thoughtful details, helped turn curiosity into a daily habit.
Weekly training sessions demystified prompting, showed how citations keep answers grounded, and encouraged students to try “Test Me” repeatedly in the run-up to exams. Educators saw how the copilot avoided writing assessed work while still supporting learning - an explicit design choice to uphold academic integrity.

Usage data and feedback told a consistent story. The pilot covered around 1,000 eligible students across two MSc Neuroscience modules; among the subset of students who engaged, KEATS AI handled over four thousand chat sessions and nearly thirty thousand messages. The average of around seven messages per session suggested students were conducting deeper exchanges rather than one-shot queries. During revision weeks, Test Me became the go-to feature as students generated practice questions from the very materials they were studying.
At the same time, feedback emphasised trust and transparency. Students liked seeing where answers came from and valued that KEATS AI operated within a King’s College London approved environment. Many reported feeling more confident with complex concepts and better prepared for multiple-choice exams.
Teaching and integrity, by design

KEATS AI is deliberately tuned to coach learning. Prompts and responses encourage understanding, rehearsal, and reflection - not ghost-writing. When a request drifts towards producing assessed work, the copilot redirects the student to plan their approach, review course concepts, or practise with questions instead. Inline citations and scoping to authorised content make it easier for students to see, and trust, what the copilot is drawing on.
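To illustrate the kind of redirect behaviour described above, the sketch below shows one simple way a request for assessed work might be screened and steered back towards planning, review and practice before any answer is generated. The trigger phrases, messages and function names are illustrative assumptions for the example, not the actual KEATS AI implementation.

```python
# Illustrative sketch only: a pre-check that redirects requests for assessed work
# towards learning-oriented support, in the spirit of the guardrails described above.
# The trigger phrases and wording are assumptions, not the KEATS AI implementation.

ASSESSED_WORK_TRIGGERS = (
    "write my essay",
    "write my assignment",
    "answer the exam question for me",
    "do my coursework",
)

REDIRECT_MESSAGE = (
    "I can't write assessed work for you, but I can help you plan your approach, "
    "review the relevant module concepts, or generate practice questions. "
    "Which would you like to try?"
)

def screen_request(message: str) -> str | None:
    """Return a redirect message if the request looks like assessed work, else None."""
    lowered = message.lower()
    if any(trigger in lowered for trigger in ASSESSED_WORK_TRIGGERS):
        return REDIRECT_MESSAGE
    return None  # allow the request through to the content-grounded answer flow

if __name__ == "__main__":
    print(screen_request("Please write my essay on synaptic plasticity."))
    print(screen_request("Can you explain long-term potentiation?"))  # prints None
```

In practice such checks are usually combined with model-level instructions rather than keyword lists alone; the sketch simply shows the deflect-and-redirect pattern the pilot describes.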
For educators, the pilot also laid groundwork: a beta teaching view, evolving analytics, and a growing library of adoption materials. The intent was simple - help lecturers do more of what impacts learning while reducing manual overheads associated with formative assessment and content navigation.
Inside the evaluation: how King’s College London and ICS.AI ran the pilot

Timeline and setup: Following the initial build (July–December 2024), the team engaged Neuroscience academics to ‘AI-enable’ upcoming modules and plan an evaluation phase that mirrored production.
Students were introduced to KEATS AI in the first module session, then supported by weekly training from mid-March to mid-April 2025. The evaluation concluded after the final exams in April, followed by surveys and interviews in May and a results review in early June.
Designing for adoption: The heart of the evaluation was real-world usage. Students accessed the copilot where they studied; materials were uploaded through a Moodle plugin; and everything - from prompts to UI copy - was refined to make learning with AI feel natural, safe, and effective. Crucially, content scoping and citations turned KEATS AI into a study companion that amplified the course, rather than a general-purpose chatbot.
What changed during the trial: The redesigned interface introduced Quick Start prompts, SMART: Notes, and multi-LLM access, while the @reference feature let students bring different “skills” (e.g., general reasoning or module grounding) into a single conversation. These changes, combined with weekly training, corresponded with sustained usage growth as students approached assessments.

Behind the numbers, students also described a shift in study experience: faster access to relevant explanations, clearer links between topics, and less stress. For many, KEATS AI became the place to sense-check ideas against their module’s own materials and build confidence before sitting multiple-choice questions (MCQs).
What made the difference
A university-approved space for AI: By providing a sanctioned, transparent copilot with citations and scope limits, King’s College London gave students a route to use AI responsibly. That increased trust compared with public tools and reduced concerns about hallucinations or overreach.
Adoption as a first-class workstream: Live onboarding, weekly training, and consistent communications ensured students knew where KEATS AI added value and how to get started. The uplift in usage during revision weeks underscores how contextual enablement drives meaningful results.
Iterating with students: The redesigned interface, SMART: Notes, and @referencing emerged from real classroom needs. They made the copilot more usable and more obviously tied to the act of studying.
Integrity safeguards: Guardrails that deflect requests to write assessed work - and instead channel students towards practice, planning, and understanding - built educator confidence without diminishing the benefits of AI.
Pilot evaluation and future context
Student feedback and usage data gave positive indications of the core value of KEATS AI - that students learn better with a trusted, content-grounded copilot - while also defining clear requirements for what a full production experience should add.
Following the pilot, and in the context of a rapidly evolving AI landscape, it was determined that the specific business case for a full-scale production rollout could not be sustained. In the interim, other AI capabilities have emerged within the core KEATS platform itself. Reflecting this new context, King’s is now undertaking a broader evaluation of all relevant options to meet its long-term needs.
The pilot also provided key institutional learnings: in a large and complex organisation like King’s, the adoption of such a platform is a multi-year process. It requires foundational support from bodies like King’s Academy and the faculties, a significant consideration as the technology landscape continues to shift.
Architecture and governance

KEATS AI runs on the ICS.AI SMART: AI Platform, hosted in the King’s College London Azure environment for security, compliance, and scalability. A jointly agreed technical implementation ensured that all data remained within the King’s Azure environment under King’s control, in line with the university’s data-protection policies. Retrieval-augmented generation (RAG) constrains answers to authorised content; inline citations make sources explicit. Central policy controls and analytics help the university steer usage, manage data access, and evaluate impact.
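As a rough illustration of this retrieval pattern, the sketch below shows a minimal retrieval-augmented flow that limits answers to authorised module content and asks for inline citations. The data structures and function names (ModuleChunk, search_module_content, build_grounded_prompt) are assumptions for the example, not the SMART: AI Platform’s actual API, and the keyword overlap stands in for the platform’s real retrieval and model calls.

```python
# Minimal sketch of the retrieval-augmented generation (RAG) pattern described above:
# answers are constrained to authorised module content and returned with inline citations.
# All names here are illustrative assumptions, not the SMART: AI Platform API.

from dataclasses import dataclass

@dataclass
class ModuleChunk:
    source: str   # e.g. "Week 3 lecture notes, p. 12"
    text: str     # an extract from authorised module material

def search_module_content(query: str, chunks: list[ModuleChunk], top_k: int = 3) -> list[ModuleChunk]:
    """Rank authorised content by naive keyword overlap (a stand-in for vector search)."""
    query_terms = set(query.lower().split())
    scored = [(len(query_terms & set(c.text.lower().split())), c) for c in chunks]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [c for score, c in scored[:top_k] if score > 0]

def build_grounded_prompt(query: str, retrieved: list[ModuleChunk]) -> str:
    """Assemble a prompt that limits the model to retrieved sources and asks for citations."""
    sources = "\n".join(f"[{i + 1}] ({c.source}) {c.text}" for i, c in enumerate(retrieved))
    return (
        "Answer using ONLY the numbered sources below and cite them inline like [1].\n"
        "If the sources do not cover the question, say so rather than guessing.\n\n"
        f"Sources:\n{sources}\n\nQuestion: {query}"
    )

if __name__ == "__main__":
    chunks = [
        ModuleChunk("Week 2 notes", "Long-term potentiation strengthens synaptic connections."),
        ModuleChunk("Week 5 notes", "The hippocampus is central to memory consolidation."),
    ]
    question = "How does the hippocampus support memory?"
    retrieved = search_module_content(question, chunks)
    print(build_grounded_prompt(question, retrieved))
    # The resulting grounded prompt would then be sent to the chosen LLM.
```

Scoping the prompt to retrieved, citable sources is what allows the copilot to show students exactly what an answer is drawing on, and to decline when the module materials do not cover a question.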
For KEATS, a prototype integration path was successfully proven during the pilot. The evaluation confirmed that the same platform architecture could underpin future educator experiences, with the potential to reduce repetitive tasks while preserving academic standards.
Lessons for the sector
- Make adoption a programme, not a footnote - treat training, communications, and success measurement as core to the work.
- Design for integrity - embed guardrails and citations so students learn how to use AI well, not just fast.
- Meet students in their flow - put the copilot where learning happens - the VLE and the module - then keep friction low.
- Iterate with cohorts - use real usage and feedback to guide interface, prompts, and features across the term.
What participants said
“I felt supported and grateful to have this as a tool.”
- Student, MSc Neuroscience
“It helped me gain confidence in the topic and prepare for the MCQ exam.”
- Student, MSc Neuroscience
“Having an approved King’s College London tool ensures [the use of AI] is aligned with our pedagogy, our curriculum and our institutional values as well. It allows us to create prompts that reflect: specific module outcomes; the reading for the modules and the assessment requirement. We can ensure accessibility and the data privacy. That’s really important.”
- Senior Neuroscience Lecturer
About SMART: Learn
SMART: Learn (KEATS AI) is a university-approved copilot for students and staff, built for higher education. It combines content-grounded answers, formative practice, and features that support study, teaching, and collaboration.
SMART: Notes provides real-time transcription of conversations into structured, compliant documentation through voice recording. In addition to the web application on laptops, it is also available as a smartphone app on iOS and Android devices, achieving 98% transcription accuracy and supporting multilingual input.
SMART: Learn and SMART: Notes are part of the ICS.AI SMART: AI Platform - a unified ‘AI for all’ solution providing a strategic, flexible and affordable choice for AI adoption in education. It powers a number of specialised copilots across student services, research, and operations, delivering measurable ROI and efficiency.
