
In case anyone else was confused about why they don't just project onto q first, to keep <q,x'> as close as possible to the original <q,x>: q is a random query vector, and the paper shows how to define an alternative quantization loss that weights the quantization error more heavily for directions where |<q,x>| is high [1].

It's worth noting that without this additional weighting, and under the assumption that q is distributed symmetrically, minimizing the expected loss reduces to minimizing the ordinary squared Euclidean distance.
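A minimal sketch of what such a weighted loss can look like, assuming the residual is split into components parallel and orthogonal to the datapoint x, with the parallel component penalized by a hypothetical weight `eta` (the function name and `eta` are illustrative, not the paper's exact API):

```python
import numpy as np

def anisotropic_loss(x, x_quant, eta=4.0):
    """Illustrative score-aware quantization loss (in the spirit of [1]).

    The quantization residual r = x - x_quant is decomposed into a
    component parallel to x and a component orthogonal to x. The parallel
    component is penalized more heavily (by the assumed weight `eta`),
    since for queries q correlated with x it dominates the error in <q, x>.
    """
    r = x - x_quant                     # quantization residual
    r_par = (r @ x) / (x @ x) * x       # component of r parallel to x
    r_orth = r - r_par                  # component of r orthogonal to x
    return eta * (r_par @ r_par) + (r_orth @ r_orth)

x = np.array([1.0, 0.0])
x_quant = np.array([0.5, 0.5])
# With eta = 1 there is no extra weighting, and the loss is just the
# squared Euclidean distance ||x - x_quant||^2 = 0.5.
print(anisotropic_loss(x, x_quant, eta=1.0))  # 0.5
```

Setting `eta > 1` makes the codebook trade off orthogonal accuracy for parallel accuracy, which is exactly the "weigh quantization loss more heavily when |<q,x>| is high" idea in expectation over q.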

[1]: https://arxiv.org/abs/1908.10396
