
Lots of applied NLP tasks used to require paying annotators to compile a golden dataset and then training an efficient model on that dataset.

Now, if cost is of little concern, you can use zero-shot prompting on a large, inefficient model. If cost is a concern, you can use GPT4 to create your golden dataset far faster and more cheaply than human annotation, and then train your more efficient model on it.

Some example NLP tasks are classification, sentiment analysis, and extracting structured data from documents. But I’d be curious which areas of NLP __weren’t__ disrupted by LLMs.
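
As a concrete illustration, here is a minimal sketch of the labeling step, assuming the openai v1 Python client; the model name, prompt, label set, and sample texts are all placeholders to swap for your own task:

    # Use GPT-4 as a zero-shot annotator to build a "golden" dataset.
    # Assumes OPENAI_API_KEY is set; `texts` stands in for your unlabeled data.
    import csv
    from openai import OpenAI

    client = OpenAI()
    texts = ["The product arrived broken.", "Great service, will buy again!"]

    def label(text: str) -> str:
        resp = client.chat.completions.create(
            model="gpt-4",
            temperature=0,
            messages=[
                {"role": "system", "content":
                    "Classify the sentiment of the user's text. Answer with "
                    "exactly one word: positive, negative, or neutral."},
                {"role": "user", "content": text},
            ],
        )
        return resp.choices[0].message.content.strip().lower()

    # Write out (text, label) pairs as the training set for a smaller model.
    with open("golden.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["text", "label"])
        for t in texts:
            writer.writerow([t, label(t)])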




> But I’d be curious which areas of NLP __weren’t__ disrupted by LLMs

Essentially: build a potent generic model (an LLM such as GPT4) using human feedback, labels, and annotation, then use it to generate golden datasets for other new models without a human in the loop. Very innovative indeed.
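
A minimal sketch of that distillation step, assuming a golden.csv of (text, label) pairs like the one produced above, with a TF-IDF plus logistic regression baseline from scikit-learn standing in for the new, efficient model:

    # Distill the LLM's labels into a small, cheap-to-serve classifier.
    import csv
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    with open("golden.csv", newline="") as f:
        rows = list(csv.DictReader(f))

    clf = make_pipeline(
        TfidfVectorizer(ngram_range=(1, 2)),
        LogisticRegression(max_iter=1000),
    )
    clf.fit([r["text"] for r in rows], [r["label"] for r in rows])

    # Inference is now a local call; no LLM in the loop.
    print(clf.predict(["Shipping was slow but support was helpful."]))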


I’m intrigued by your comment that you can “use GPT4 to create your golden dataset”.

Would you be willing to expand a little and give a brief example, please? It would really help me understand this a little better!



