> That's not really how the prompts work. Or how chat gpt works... There's not a clear seperation between the instructions or input
In examples like "Translate the following text between braces to French: {<some text>}", ChatGPT can clearly seperate between the two. For the present use case, you could simply strip any braces from the LinkedIn profile, so no need to worry about escaping.
> Really they could/should change the sql spec to just require ? for all values
Nope, because SQL query plans are generated and cached for the non-? (non-variable) parts, hence you need/want the ability to specify constants, since some query plan optimizations are based on that. You want to distinguish between which values are variable vs. constant in your application.
In examples like "Translate the following text between braces to French: {<some text>}", ChatGPT can clearly seperate between the two. For the present use case, you could simply strip any braces from the LinkedIn profile, so no need to worry about escaping.
> Really they could/should change the sql spec to just require ? for all values
Nope, because SQL query plans are generated and cached for the non-? (non-variable) parts, hence you need/want the ability to specify constants, since some query plan optimizations are based on that. You want to distinguish between which values are variable vs. constant in your application.