Can’t you just instruct your LLM of choice to transform your prompts like this for you? Basically, feed it a bunch of heuristics that help it better understand what you tell it.
Maybe the various chat interfaces already do this behind the scenes?
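Something like this rough sketch, maybe, as a rewriting pass before the real request (this assumes the OpenAI Python client; the model name and the heuristics list are just placeholders, not a recommendation):

```python
# Minimal sketch: an LLM "prompt rewriter" pass before the real request.
from openai import OpenAI

client = OpenAI()

HEURISTICS = """\
Rewrite the user's prompt so it is easier for an LLM to follow:
- State the goal explicitly and put it first.
- Spell out the desired output format.
- Replace vague references ("this", "it") with concrete nouns.
- Keep all technical details from the original.
Return only the rewritten prompt.
"""

def rewrite_prompt(raw_prompt: str) -> str:
    """Ask the model to transform a rough prompt using the heuristics above."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system", "content": HEURISTICS},
            {"role": "user", "content": raw_prompt},
        ],
    )
    return response.choices[0].message.content

# The rewritten prompt is what you'd actually send as your request.
improved = rewrite_prompt("make this code faster, it's the one I pasted earlier")
print(improved)
```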