Expert prompt mode
On this page, a slug means a keyword wrapped in double square brackets which is treated as an instruction during the processing of the prompt text.
Currently every prompt in an AI Text question gets some manipulation and "wrapping" before it is delivered to the external LLM.
For example, where the prompt is
"Explain if there is anything wrong with the grammar and spelling in the text."
This will be wrapped as
"in [Yesterday I went to the park](/marcusgreen/moodle-qtype_aitext/wiki/Yesterday-I-went-to-the-park)
analyse the part between [ and](/marcusgreen/moodle-qtype_aitext/wiki/-and)
as follows: Explain if there is anything wrong with the grammar and spelling in the text. Set marks to null in the json object. Return only a JSON object which enumerates a set of 2 elements.The JSON object should be in this format: {feedback":"string","marks":"number"} where marks is a single number summing all marks. Also show the marks as part of the feedback. translate the feedback to the language en"
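As a rough illustration, the wrapping can be thought of as string concatenation around the teacher's prompt and the student's response. The sketch below is hypothetical (the function name, parameters and exact ordering are assumptions, not the plugin's actual code):

```php
<?php
// Hypothetical sketch of how the current "wrapped" prompt could be assembled.
// $prompt is the teacher's prompt, $response is the student's answer,
// $language is the user's language code (e.g. 'en').
function build_wrapped_prompt(string $prompt, string $response, string $language): string {
    $wrapped  = 'in [[' . $response . ']] analyse the part between [[ and ]] as follows: ';
    $wrapped .= $prompt . ' ';
    $wrapped .= 'Set marks to null in the json object. ';
    $wrapped .= 'Return only a JSON object which enumerates a set of 2 elements. ';
    $wrapped .= 'The JSON object should be in this format: {"feedback":"string","marks":"number"} ';
    $wrapped .= 'where marks is a single number summing all marks. ';
    $wrapped .= 'Also show the marks as part of the feedback. ';
    $wrapped .= 'translate the feedback to the language ' . $language;
    return $wrapped;
}
```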
This wrapping is helpful for people who are new to prompting, but it limits what more experienced prompt writers can do and restricts the scope for innovation.
This proposal is for a new slug in the form [[expert]] which will bypass the wrapping and send the prompt to the external LLM as written. It will require additional slugs to be processed, including but not limited to [[question]], [[response]] and [[userlang]].
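A minimal sketch of how the expert slug might be handled: if the prompt contains [[expert]], skip the wrapping and simply substitute the remaining slugs with their values. The slug names follow the proposal above; the function name and the exact substitution rules are assumptions, not a final design:

```php
<?php
// Hypothetical sketch: detect [[expert]] and substitute the other slugs directly.
function build_expert_prompt(string $prompt, string $question, string $response, string $userlang): ?string {
    if (strpos($prompt, '[[expert]]') === false) {
        return null; // Not in expert mode; fall back to the normal wrapping.
    }
    $replacements = [
        '[[expert]]'   => '',
        '[[question]]' => $question,
        '[[response]]' => $response,
        '[[userlang]]' => $userlang,
    ];
    // The prompt is sent to the external LLM exactly as written,
    // with only the slugs replaced by their values.
    return trim(strtr($prompt, $replacements));
}
```

With this approach, an expert prompt might look something like: "[[expert]] The student was asked [[question]] and answered [[response]]. Give feedback in [[userlang]] and return only a JSON object with feedback and marks fields." Apart from the slug substitution, it would reach the LLM verbatim.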
For this to work, additional errors may need to be trapped and reported back through the question editing interface, but the main changes should be quite straightforward.
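One place such error trapping could live is the validation of the question editing form, flagging an expert prompt that omits the slugs the grading depends on. The helper below is only a sketch; the required-slug rule and the message wording are assumptions:

```php
<?php
// Hypothetical sketch: check an expert prompt for the slugs the grading needs
// and return an error message suitable for the question editing form, or null.
function validate_expert_prompt(string $prompt): ?string {
    if (strpos($prompt, '[[expert]]') === false) {
        return null; // Normal wrapped mode; nothing extra to check.
    }
    // Assumed rule: an expert prompt must at least include the student's response.
    if (strpos($prompt, '[[response]]') === false) {
        return 'Expert prompts must include the [[response]] slug.';
    }
    return null;
}
```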