
Steps 1 and 2 rely more on human-to-human communication if you are developing an application to be used by some end user, rather than something that just ingests and processes data. It takes skill to translate non-technical speak into a clear goal and the specifications required to achieve it, and to probe the third party with specific questions to tease out their expectations of the application. LLMs can't do that yet, though I have no doubt they'll get better at it in the near future.



Ah, requirements capture... yeah, not straight ChatGPT, but "having a directed conversation" doesn't seem like a stretch.

EDIT: see Character.ai for an example of character prompting. You can prime an LLM with a prompt so that it asks you certain types of questions.
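
As a rough illustration of that kind of priming, here is a minimal sketch of a requirements-capture loop using the OpenAI Python client as one possible backend. The system prompt wording and the model name are assumptions for the example, not anything Character.ai-specific.

    # Minimal sketch: prime an LLM so it interviews the user about requirements.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    SYSTEM_PROMPT = (
        "You are a requirements analyst. The user will describe an application "
        "in non-technical terms. Ask one clarifying question at a time about "
        "goals, end users, inputs/outputs, and edge cases until you can state "
        "a clear specification. Do not propose solutions yet."
    )

    messages = [{"role": "system", "content": SYSTEM_PROMPT}]

    while True:
        user_input = input("> ")
        if user_input.strip().lower() in {"quit", "exit"}:
            break
        messages.append({"role": "user", "content": user_input})
        reply = client.chat.completions.create(
            model="gpt-4o-mini",  # assumed model; any chat model works here
            messages=messages,
        )
        answer = reply.choices[0].message.content
        messages.append({"role": "assistant", "content": answer})
        print(answer)

The point is just that the system prompt steers the model toward asking directed questions instead of answering; the rest is an ordinary chat loop.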



