14 October 2024

AI-Powered LMS: The Future of Personalised Learning


AI has often been described as predictive text on steroids, but it is much, much more than that.

AI can transform education on various fronts, such as intelligent tutoring systems, automated content creation, content personalised to the individual, student support and accessibility, and of course, grading and assessment.

Students can see it as a personal tutor, and educators might use it as a teaching assistant. For instance, AI-powered systems can deliver one-to-one tutoring, with explanations, examples and practice exercises based on individual needs.

Generative AI in education

Generative AI can create new content such as quizzes, assignments and lesson plans, and it can translate content into multiple languages, making it accessible to a wider audience. It can also analyse learner data to create personalised learning paths, ensuring the content is relevant to the individual needs and interests of each learner, and AI-powered assessments can adjust difficulty levels based on learner performance, providing a more engaging and effective learning experience.

Generative AI tools can sit outside or inside the LMS. A tool can be integrated through an API that allows data to be shared and processed seamlessly, or it can run as a standalone application that is brought into the LMS through methods such as embedding or linking.

It is exciting to see how AI can break down language barriers, making education more accessible to learners with different native languages. For students with a disability, AI can provide speech-to-text and text-to-speech, as well as image recognition, further breaking down learning barriers for those students.
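As a rough sketch of the API-style integration described above, the example below asks a generative AI service to draft a short quiz and then posts the result into an LMS course over a REST endpoint. The endpoint URLs, payload fields and authentication scheme are placeholders for illustration only, not any particular LMS or AI vendor's API.

```python
import requests

# Placeholder endpoints and credentials - substitute your own services here
GENAI_URL = "https://genai.example.com/v1/generate"   # hypothetical generative AI endpoint
LMS_URL = "https://lms.example.com/api/v1/quizzes"     # hypothetical LMS quiz endpoint
API_KEY = "REPLACE_ME"

def generate_quiz(topic: str, level: str) -> dict:
    """Ask the generative AI service for a short quiz on a topic (assumed response schema)."""
    response = requests.post(
        GENAI_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"prompt": f"Create a five-question quiz on {topic} for {level} learners."},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()  # assumed to contain the quiz title and questions

def publish_to_lms(course_id: str, quiz: dict) -> None:
    """Push the generated quiz into the LMS so it appears inside the course (assumed schema)."""
    response = requests.post(
        f"{LMS_URL}?course_id={course_id}",
        headers={"Authorization": f"Bearer {API_KEY}"},
        json=quiz,
        timeout=30,
    )
    response.raise_for_status()

if __name__ == "__main__":
    quiz = generate_quiz("photosynthesis", "beginner")
    publish_to_lms("BIO101", quiz)  # an educator should still review the quiz before release
```

The same flow could equally be triggered from inside the LMS by an embedded or linked tool, and in either case the generated material should be reviewed by an educator before it reaches students.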

Is there bias in AI?

Does AI give us the answers we want? If you do not put in the right information, or do not prompt correctly, you will not receive suitable information back. There are questions around accuracy and reliability, bias, privacy and security, as well as ethical implications. The terms "bias" and "hallucination" are used in AI to describe responses that are skewed or simply untrue. Large language models can generate incorrect or misleading information, and that is where the human comes in: it is crucial to verify that the output is correct. These models can also perpetuate bias, so you need monitoring and a mitigation strategy for when you encounter it.

AI and student privacy and security

It goes without saying that security measures must be put in place to protect student data. Organisations are developing action plans not only for data protection but also for the acceptable use of AI, and for how to monitor it.

AI for learning materials

Anyone can go to an AI teaching resource and ask it to produce different learning materials, but to use those materials effectively you must understand the paradigms of pedagogy as well as course creation. We cannot rely on AI to develop PowerPoints for us; we need the educator's understanding to inform the generative AI output.

Risks of using AI in education

The risks associated with AI relate to academic integrity and to educator capability. Not every educator has AI and instructional design skills, and not every educator will know their LMS thoroughly and use those tools effectively.

I don’t see widespread use of AI-assisted tools at the moment, because educators know we have to get our framework in place.

The Australian Framework for Generative Artificial Intelligence in Schools, which seeks to guide the responsible and ethical use of generative AI tools, helps us understand some of the issues.

It has six underlying principles about the ethical use of generative AI:

  • Privacy, security and safety – we have already seen some large breaches, with databases being hacked and profiles used on other websites
  • Teaching and learning – how can we uplift teaching and learning, and how are we going to educate our educators about these tools?
  • Human and social wellbeing – using generative AI for good
  • Transparency – ensuring we are transparent in how we use generative AI
  • Fairness – providing equal opportunity, not only for our students but also for our educators
  • Accountability – generative AI being accountable for what is produced.

It is too simplistic to say that AI is just predictive text on steroids. You can see this when you look at the complexity of information that is available and the complexity of information that can be delivered.

We are in the early days of generative AI. It will continue to evolve, offering new possibilities for personalised learning and opening up education to a wider group of people. As educators, we need to stay informed and understand the opportunities – and the challenges.


This blog was created from the eWorks webinar 'AI-powered LMS – the future of personalised learning', hosted by Mark Robertson.

You can view the webinar here and sign up on this page to be informed of our webinars and updates.
