In writing the code that is supposed to implement my idea, I find that the idea has many flaws.

Sending that idea to an LLM (in the absence of AGI) seems like a great way to find out about those flaws too late.

Otherwise, specifying an application in enough detail to achieve the same effect is essentially coding, just in natural language, which is less precise.
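
To make that precision gap concrete, here is a hypothetical sketch (the data and field names are invented for illustration): the one-line spec "sort users by name" leaves open several decisions that code is forced to make explicitly.

    # Spec: "sort users by name". The code must decide what the prose left open.
    users = [{"name": "Ärna"}, {"name": "alice"}, {"name": "Bob"}]

    # Decision 1: case sensitivity -- casefold() treats "alice" and "Alice" alike.
    # Decision 2: missing names -- default to "" instead of raising KeyError.
    # Decision 3: non-ASCII ordering -- plain codepoint order still puts "Ärna" last;
    #             locale-aware collation would be yet another explicit choice.
    users.sort(key=lambda u: u.get("name", "").casefold())

    print([u["name"] for u in users])  # ['alice', 'Bob', 'Ärna']

Each comment marks a decision the English sentence never made; whoever implements the spec has to guess, and you learn which guesses were wrong only later.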
