Krishnamurthi, Shriram; Thießen, Thore; Vahrenhold, Jan
Research article in edited proceedings (conference) | Peer reviewed

Software developers have long emphasized the need for clear textual descriptions of programs, through documentation and comments. Similarly, curricula often expect students to write purpose statements that articulate in prose what program components are expected to do. Unfortunately, it is difficult to motivate students to do this and to evaluate student work at scale. We leverage a large language model for this purpose. Specifically, we describe a tool, Porpoise, that presents students with problem descriptions, passes their textual descriptions to a large language model to generate code, evaluates the result against tests, and gives students feedback. Essentially, it gives students practice writing quality purpose statements while also familiarizing them with zero-shot prompting in a controlled manner. We present the tool’s design as well as the experience of deploying it at two universities. This includes asking students to reflect on trade-offs between programming and zero-shot prompting, and examining what difference it makes to give students different formats of problem descriptions. We also examine affective and load aspects of using the tool. Our findings are somewhat positive but mixed.
Thießen, Thore | Professur für Praktische Informatik (Prof. Vahrenhold)
Vahrenhold, Jan | Professur für Praktische Informatik (Prof. Vahrenhold)