JOURNAL ARTICLE

Enhancing Student Engagement Through AI-Generated Demonstrations: A Practice-Based Reflection 

Abstract

This practice-based reflection examines the use of large language models to rapidly produce a classroom demonstration in an undergraduate JavaScript course. It outlines a lightweight workflow that begins with clarifying outcomes, proceeds through prompting, instructor review, and edits, and concludes with deployment. Reported benefits include faster preparation and clearer alignment with learning objectives. The paper also documents risks and mitigations, including cognitive load from "seductive details," potential code inaccuracies, and shifts in perceived instructor credibility. It provides practical guardrails: bias audits of names, scenarios, and datasets; accessibility by default (semantic structure, keyboard operability, captions and transcripts, sufficient contrast, alt text); equitable access (low-bandwidth and artificial intelligence (AI)-free alternatives, avoidance of paywalled tools); and strict avoidance of student or sensitive data in third-party tools. Limitations include a single-course context and a reflective, non-experimental method. The goal is to offer actionable guidance for instructors who want to use AI for speed and flexibility while maintaining rigor, transparency, and student trust.

Keywords:


© 2026 ScienceGate Book Chapters — All rights reserved.