An improvement to the Hacker News website would be to compute and compare hashes of submitted links so that the same link isn't posted multiple times. Anyone posting a link again could then be redirected to the original submission.
It wouldn't work - it's trivial to add meaningless query parameters or anchors that change the hash but still lead to the same content. And stripping them wouldn't work either, because some sites use query parameters and fragments to route to content.
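To make that concrete, here's roughly what both approaches look like (Python, purely a sketch; the SHA-256 choice and the list of "meaningless" params are my guesses, not anything HN actually does):

    import hashlib
    from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

    # Params commonly bolted on by sharers/trackers; an assumption, not an exhaustive list.
    TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "utm_term", "utm_content", "fbclid"}

    def url_hash(url: str) -> str:
        """Hash the raw URL string; any extra query param or #anchor changes the digest."""
        return hashlib.sha256(url.encode("utf-8")).hexdigest()

    def normalized_hash(url: str) -> str:
        """Strip the fragment and known tracking params before hashing.
        Breaks on sites that genuinely route content via query strings or fragments."""
        scheme, netloc, path, query, _fragment = urlsplit(url)
        kept = [(k, v) for k, v in parse_qsl(query) if k not in TRACKING_PARAMS]
        return hashlib.sha256(urlunsplit((scheme, netloc, path, urlencode(kept), "")).encode("utf-8")).hexdigest()

    a = "https://example.com/story?id=42"
    b = "https://example.com/story?id=42&utm_source=twitter#comments"
    print(url_hash(a) == url_hash(b))                # False: naive hashing sees two different links
    print(normalized_hash(a) == normalized_hash(b))  # True, but only because id=42 survived the stripping

The second comparison only works because id=42 happened to survive the stripping; a site that routes on a param you decided to strip is exactly the failure case above.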
What might work is hashing the text and outbound links of submitted pages, and building something like a similarity index over text, metadata and a graph of links, but that would probably still be fragile, and it's definitely too much effort for a site with as little traffic as this.
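The text half of that similarity index could be as dumb as word shingles plus Jaccard overlap - sketch below; the shingle size and the 0.8 cutoff are numbers I made up, and a real index would want MinHash/LSH rather than pairwise comparisons:

    def shingles(text: str, k: int = 5) -> set:
        """k-word shingles of the page text, lowercased."""
        words = text.lower().split()
        return {" ".join(words[i:i + k]) for i in range(max(len(words) - k + 1, 1))}

    def jaccard(a: set, b: set) -> float:
        return len(a & b) / len(a | b) if a | b else 0.0

    def probably_same_story(text_a: str, text_b: str, threshold: float = 0.8) -> bool:
        """Flag two submitted pages as likely duplicates if their shingle sets mostly overlap."""
        return jaccard(shingles(text_a), shingles(text_b)) >= threshold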
Assuming a site has a canonical URL tag, although most news sites probably do. Facebook Open Graph and other social media tags are worth looking for as well. Unfortunately, they're not always trustworthy.
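Pulling those out of the page head is about this much work (stdlib-only sketch; a real crawler also needs charset handling, redirects, and the distrust mentioned above):

    from html.parser import HTMLParser

    class CanonicalFinder(HTMLParser):
        """Collects <link rel="canonical"> and <meta property="og:url"> values."""
        def __init__(self):
            super().__init__()
            self.canonical = None
            self.og_url = None

        def handle_starttag(self, tag, attrs):
            a = dict(attrs)
            if tag == "link" and a.get("rel") == "canonical":
                self.canonical = a.get("href")
            elif tag == "meta" and a.get("property") == "og:url":
                self.og_url = a.get("content")

    finder = CanonicalFinder()
    finder.feed('<head><link rel="canonical" href="https://example.com/story"></head>')
    print(finder.canonical or finder.og_url)  # https://example.com/story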
It does that, but only if the other submission was made very recently and/or(?) has points above a threshold - otherwise duplicate submissions are explicitly allowed.
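If you squint, the rule is presumably something like the sketch below - the 8-hour window and the point cutoff are pure guesses on my part, not documented behavior:

    from datetime import datetime, timedelta

    def should_redirect_to_existing(existing_submitted_at: datetime,
                                    existing_points: int,
                                    now: datetime,
                                    window: timedelta = timedelta(hours=8),
                                    point_threshold: int = 10) -> bool:
        """Redirect the reposter only if the earlier submission is still fresh or already did well."""
        recent = now - existing_submitted_at <= window
        did_well = existing_points >= point_threshold
        return recent or did_well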
Timing is one of the most important parts of getting "hits" on social media, and while it can be optimized for, it's not always controllable. A post may succeed because someone made an attention-getting comment, for example, and that's just the luck of the commenter running across the link when he or she had the time and inclination to leave a note.
Social is fickle that way, and since most social algorithms strongly consider post recency and other time-sensitive factors in their ranking, duplicates should be allowed within a reasonable time frame, because you never know when the critical path will get hit.
https://news.ycombinator.com/item?id=16159589