
I'm interested in learning about true generation algorithms, i.e. ones that generate novel text rather than extracting existing sentences. Can you point me in the right direction?



Google “gpt-2”.


Thanks, but that is not what the OP is asking about. That generates text from a seed; the OP is talking about an algorithm that generates a summary of an article, but without reusing existing sentences.


Did you read the paper? https://d4mucfpksywv.cloudfront.net/better-language-models/l...

You don’t need any seed, and it can generate summaries (section 3.6).

GPT-2 is the model to learn about if you’re interested in NLP.
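
If you want to play with it: a minimal sketch of the paper's zero-shot summarization trick using the Hugging Face transformers port of GPT-2 (the model size and sampling parameters here are my own choices, not the paper's exact configuration). You append "TL;DR:" to the article and sample a continuation:

    from transformers import GPT2LMHeadModel, GPT2Tokenizer

    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")

    article = "..."  # text to summarize (placeholder)
    # The paper induces summarization by appending "TL;DR:" to the article.
    inputs = tokenizer(article + "\nTL;DR:", return_tensors="pt")

    # Sample a continuation with top-k sampling, as in the paper
    # (the paper uses k=2; k=50 here is just a common default).
    outputs = model.generate(
        **inputs,
        max_new_tokens=100,
        do_sample=True,
        top_k=50,
        pad_token_id=tokenizer.eos_token_id,
    )

    # Decode only the tokens generated after the prompt.
    summary = tokenizer.decode(
        outputs[0][inputs["input_ids"].shape[1]:],
        skip_special_tokens=True,
    )
    print(summary)
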


Are there any papers benchmarking a transformer architecture against something like a pointer-generator network? I'm doing a bit of work in this area (i.e. reimplementing papers), and I'm curious whether GPT-2-like models can capture more semantic meaning.


Both GPT-2 and the pointer-generator network are open source, and pretrained models are available for each, so it should be straightforward to compare them yourself.
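
If you do compare them, ROUGE is the usual metric in these summarization papers. Here's a minimal sketch using Google's rouge-score package, assuming you've already generated a summary from each model (the strings below are placeholders):

    from rouge_score import rouge_scorer

    reference = "..."            # gold summary (placeholder)
    gpt2_summary = "..."         # output from GPT-2 (placeholder)
    pointer_gen_summary = "..."  # output from the pointer-generator (placeholder)

    scorer = rouge_scorer.RougeScorer(
        ["rouge1", "rouge2", "rougeL"], use_stemmer=True)

    for name, summary in [("gpt-2", gpt2_summary),
                          ("pointer-gen", pointer_gen_summary)]:
        # score(target, prediction) returns precision/recall/F1 per ROUGE type.
        scores = scorer.score(reference, summary)
        print(name, {k: round(v.fmeasure, 3) for k, v in scores.items()})
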



