
The complaint about research papers is that almost all of them omit the ELI5 and provide only the formalism.

You can have both and weave them together into a digestible narrative. I sometimes see physics textbooks written this way.

Papers are mostly read by other researchers, for whom the added background is actively bad: it obscures the real meat of the paper from its main audience.

If you just want a digestible intro, you would usually buy a textbook.

I find the argument that every research paper ought to be a mashup of a textbook + the actual research to be a bit silly, from a “people should specialize at what they’re good at” standpoint.

Put in another context, I also don’t want every recipe to reintroduce what it means to “fry” or “braise” or “marinate”. We have Google for that.


I've long wanted an informational slider for bits of text: something where you can zoom in and out to the desired level of complexity. LLMs might be able to fill in some of those gaps. You could turn any paper into an introduction to the subject it's part of.


This sounds like a good use case for local LLMs. Browser plugin, precooked prompts for different levels of detail, maybe a LoRA to give the model some idea of the expected output. I bet some of the 13B models could do a useful job on this even if they were imperfect.
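A minimal sketch of the precooked-prompt part, assuming a local model served by Ollama at its default endpoint (http://localhost:11434/api/generate); the level names, prompt wording, and model tag here are placeholders, not a real plugin:

    import json
    import urllib.request

    # Precooked prompts keyed by the slider level (assumed three levels).
    LEVELS = {
        "eli5":   "Explain this passage to a curious layperson, no jargon:\n\n",
        "intro":  "Rewrite this passage for an undergrad new to the field:\n\n",
        "expert": "Summarize this passage tersely for a domain expert:\n\n",
    }

    def rewrite(passage: str, level: str, model: str = "llama2:13b") -> str:
        """Ask a local Ollama server to re-render the passage at a detail level."""
        body = json.dumps({
            "model": model,
            "prompt": LEVELS[level] + passage,
            "stream": False,
        }).encode()
        req = urllib.request.Request(
            "http://localhost:11434/api/generate",
            data=body,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())["response"]

    print(rewrite("We estimate the posterior via MCMC sampling...", "eli5"))

The slider would just swap which prompt key is used; a LoRA fine-tune would replace the prompt prefixes with learned behavior.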


Look up "StretchText".
