Hacker News

Perhaps first try it out at https://huggingface.co/spaces/togethercomputer/GPT-JT to see what kind of things you can do with it.



I'm flabbergasted. I translated the tweets to Hebrew and reran the example, and it returned the correct results. I then changed the input to a negative example, and it again returned the correct results. So it works in more than just English, and I'm sure its Hebrew training data was much smaller. Perhaps it's translating behind the scenes.


Thanks! Perhaps I'm not good at prompt engineering, but I could barely get anything useful out of it.


It's mainly for text classification, which explains why it doesn't give outputs comparable to GPT-3.
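For anyone who wants to try the classification use case outside the Space, here is a minimal sketch of a few-shot sentiment prompt run through the Hugging Face transformers pipeline. The checkpoint name "togethercomputer/GPT-JT-6B-v1", the prompt wording, and the example tweets are my own assumptions, not taken from the demo itself.

```python
# Minimal sketch (not the Space's exact code): few-shot sentiment classification
# with GPT-JT via the Hugging Face transformers pipeline.
# Assumption: the checkpoint is "togethercomputer/GPT-JT-6B-v1"; it's a
# 6B-parameter model, so running it locally needs a large GPU or plenty of RAM.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="togethercomputer/GPT-JT-6B-v1",  # assumed model ID
)

# Few-shot prompt: a couple of labeled tweets, then the tweet to classify.
prompt = """Classify the sentiment of each tweet as Positive or Negative.

Tweet: I love the new update, everything is so fast now!
Sentiment: Positive

Tweet: The app keeps crashing and support never answers.
Sentiment: Negative

Tweet: This is the best purchase I've made all year.
Sentiment:"""

# Greedy decoding with only a few new tokens: we just want the label.
result = generator(prompt, max_new_tokens=3, do_sample=False, return_full_text=False)
print(result[0]["generated_text"].strip())  # expected: "Positive"
```

Swapping the example tweets for Hebrew translations, as the commenter above describes, keeps the same prompt structure; only the tweet text changes.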




