Intended Learning Outcomes and Prompt Engineering

Intended learning outcomes define what a student will have learned upon successfully completing a course or module. They play a central role in modern education because they describe clear, measurable learning goals that guide teaching and assessment. Intended learning outcomes exist at different levels of granularity: at the program level, they describe the overall learning goals of a study program; at the course or module level, they define the learning goals of that course or module; within the learning materials of a course or module, they may be broken down further. Writing intended learning outcomes has developed into an art of its own, where specific formats often have to be followed, certain sets of verbs have to be used (see Bloom's Taxonomy), and so on.

I have gotten used to intended learning outcomes and their lingo, even though I still suspect that some subtle details about the choice of verbs are lost on the readers of intended learning outcomes. I now open each chapter of my lecture notes with targeted intended learning outcomes. I believed that sharing my expectations about what students should learn from the chapter ahead could benefit some of them; perhaps it raises interest, or at least awareness, or it is simply a good style to adopt.

Today, I finally learned the true value of my chapter-specific intended learning outcomes for modern students: they use them as prompts for generative AI systems! This took me by surprise, but it soon started to make sense to me, from a student's perspective. On second thought, however, this also means that we (instructors) no longer write lecture notes to teach our students; we write them for AI systems that then teach our students. We are effectively mutating into prompt engineers, and learning becomes largely indirect, mediated by generative AI systems.

And this naturally extends to assignments: a "good" assignment (from the student's perspective) is written in such a way that it works well as a prompt for generative AI systems to produce a meaningful, or at least plausible, answer. An assignment that does not work this way is simply a badly written assignment (from the student's perspective).

I do not know where this leads us. It is possible that humans will ultimately need AI supremacy because we are mutating into animals with reduced brain mass, where certain areas of our brains have been replaced by a cut-and-paste spinal cord.