There's a larger problem with writing papers that underlies the issue I discuss here[0], and it sounds like you're getting at it too. What I'd say is that authors (being one myself) are writing papers to reviewers, not to other researchers in my niche (sometimes it feels like not even to researchers several levels of abstraction above it).
I can confirm having to significantly tweak papers in ways I would not have if I were writing to other researchers. I have papers with hundreds of citations, and with top benchmark scores, that could not pass review; the most common complaints were "not novel" and "I don't know who would find this useful." This has been one of the most challenging aspects of my PhD, and certainly one of the most frustrating.
But the larger problem I see is that everyone is hyper-hacking every metric they can. This goes beyond academia; I'm sure you see it in your own work, or in politics, too. I think we need a serious discussion about the fact that metrics are proxies and are not always aligned with our goals. Nor do they stay aligned, because whenever someone can gain an advantage by optimizing toward the metric rather than toward the abstract goal we actually want, they will.
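
To make the proxy-vs-goal point concrete, here's a toy simulation (entirely my own illustration, with made-up quantities, not from any actual review data): a metric rewards both true quality and effort spent gaming the metric itself, and the more the gaming component pays off, the less the winners selected by the metric actually reflect quality.

    import numpy as np

    # Toy Goodhart's-law sketch. All quantities are invented for illustration:
    # "quality" is the goal we actually care about, "gaming" is effort spent
    # hacking the metric itself, and the metric rewards both.
    rng = np.random.default_rng(0)
    n = 100_000
    quality = rng.normal(size=n)   # true value of the work
    gaming = rng.normal(size=n)    # metric-hacking effort, independent of quality

    for w in (0.0, 0.5, 1.0, 2.0):
        metric = quality + w * gaming       # what the reviewer/benchmark sees
        top = np.argsort(metric)[-1000:]    # accept the top 1% by metric
        print(f"gaming weight {w}: mean true quality of accepted = "
              f"{quality[top].mean():.2f}")

With gaming weight 0 the accepted papers average around 2.7 standard deviations of true quality; at weight 2 that drops to roughly 1.2, even though the winners' metric scores look just as impressive. The metric didn't change; the population optimizing it did.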
It's not AI turning the world into paperclips that we should be afraid of; it's humans doing that.
[0] https://news.ycombinator.com/item?id=38200598