Maybe. Transformers model associative memory in a way made precise by their connection to Hopfield networks. Individually, the attention layers act like look-up tables, but the queries can be ambiguous, even based on subtle higher-order patterns (which the network identifies on its own), and the returned values can be a mixture of stored information, weighted by statistically meaningful confidences.
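A minimal NumPy sketch of that lookup picture (the toy keys/values and the beta sharpness parameter are my own illustration, not something the parent comment specifies): a query sitting between two stored keys comes back as a softmax-weighted mixture of the corresponding values, with the weights playing the role of confidences.

    # Attention step as soft associative lookup (illustrative toy example).
    import numpy as np

    def softmax(x, axis=-1):
        e = np.exp(x - x.max(axis=axis, keepdims=True))
        return e / e.sum(axis=axis, keepdims=True)

    rng = np.random.default_rng(0)
    d = 8                               # key/query dimension
    keys   = rng.normal(size=(5, d))    # 5 stored "memories" (keys)
    values = rng.normal(size=(5, d))    # their associated contents

    # An ambiguous query: halfway between two stored keys.
    query = 0.5 * (keys[1] + keys[3])

    beta = 4.0                                 # sharpness of retrieval
    weights = softmax(beta * (keys @ query))   # confidence over stored memories
    retrieved = weights @ values               # weighted mixture of stored values

    print(np.round(weights, 3))  # mass should concentrate on memories 1 and 3

In the modern-Hopfield reading, the rows of keys and values are the stored patterns, and beta controls whether retrieval snaps to a single memory or blends several.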


