
You can generate "text embeddings" (i.e., vectors) from your text with an embedding model. Once you have your text represented as vectors, "closest vector" means "most semantically similar", as defined by the embedding model you use. This lets you build semantic search engines, recommenders, and classifiers on top of your text and embeddings. It's kind of like a fancier and fuzzier keyword search.
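A minimal sketch of the "closest vector" idea, using made-up 4-dimensional toy vectors in place of a real embedding model (which would produce vectors with hundreds of dimensions):

```python
import numpy as np

# Toy "embeddings" -- a real embedding model would produce these
# from the text itself; here they are hand-picked for illustration.
embeddings = {
    "the cat sat on the mat": np.array([0.9, 0.1, 0.0, 0.2]),
    "a kitten rested on a rug": np.array([0.8, 0.2, 0.1, 0.3]),
    "quarterly revenue grew 5%": np.array([0.0, 0.9, 0.8, 0.1]),
}

def cosine_similarity(a, b):
    # 1.0 means the vectors point the same way; near 0 means unrelated.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

query = embeddings["the cat sat on the mat"]
ranked = sorted(embeddings,
                key=lambda t: cosine_similarity(query, embeddings[t]),
                reverse=True)
print(ranked[1])  # "a kitten rested on a rug" -- semantically closest,
                  # despite sharing no keywords with the query
```

The point is that "a kitten rested on a rug" ranks above "quarterly revenue grew 5%" even though it shares zero words with the query; that's what a keyword search can't do.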

I wouldn't completely replace keyword search with vector search, but it's a great addition that essentially lets you perform calculations on text.
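One common way to combine the two (a sketch, not a standard recipe -- the token-overlap keyword score and the `alpha` blend weight are illustrative assumptions):

```python
def keyword_score(query, doc):
    # Fraction of query tokens present in the document
    # (a crude stand-in for a real ranking function like BM25).
    q = set(query.lower().split())
    d = set(doc.lower().split())
    return len(q & d) / len(q) if q else 0.0

def hybrid_score(kw, vec, alpha=0.5):
    # Blend the two signals; alpha is a tuning knob, not a standard value.
    return alpha * kw + (1 - alpha) * vec

doc = "the cat sat on the mat"
kw = keyword_score("cat mat", doc)  # 1.0: both query tokens appear
vec = 0.98  # pretend this came from cosine similarity of embeddings
print(hybrid_score(kw, vec))        # 0.99
```

Exact keyword matches still win when they exist, while the vector score rescues documents that say the same thing in different words.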




Thank you! This was an extremely helpful explanation for me, as someone not familiar with the topic. =)


Nice explanation. One use case where keywords haven't worked well for me, and where (at least at first glance) vectors are doing better, is longer passages -- finding sentences that are similar rather than just words.



