> "Author's note: This article was written by the ChatGPT chatbot, in response to prompts from MK. That human-chatbot conversation is presented here, without editing."
Could you substantiate this? There is no connection to biology/bioscience/etc. in the text. That is why it's embarrassing that they published it - it has nothing to do with the journal. There are plenty of other journals where publishing such a thing might make sense (AI/education/science studies/etc.). But here it just seems like they couldn't manage to get a proper editorial for the issue.
I wouldn't exactly call it "powerful" with respect to LLMs applied to biology. It reads more like an opinion piece with some background context (which is not bad) and fake references.
Even though it was published in Cellular and Molecular Bioengineering, it really doesn't have anything to do with Cellular or Molecular Bioengineering.
This is one paper with 190 citations and one paper with no citations.
The paper is only 2 pages, in a prompt-response format, unedited. The prompts are specific and ask for paragraphs about chatbots, AI, and plagiarism. The author's note at the end, after ChatGPT was asked for references and gave several, is:
> Author’s note: These are not real references, unfortunately.
Edit: One more point -- Even though it was published in Cellular and Molecular Bioengineering, it really doesn't have anything to do with either. It's a quick read.
- software tools are cited (you will often cite a paper describing the tool). If a tool generates a value and a bug is later discovered in that tool, it matters that the tool was cited.
- one purpose of a citation is to make a clear distinction between which work/words are your own and which are not. Are you making a statement because it's based on past work, your own inference, or the hallucination of an LLM? With ChatGPT it's easy to get confused.
- there is also the more general issue of academic integrity. You shouldn't submit other people's/machines' words as your own.
Maybe more correctly ChatGPT should be on the authors' list ;)
190 citations sounds impressive, but considering they mostly come from possibly the two fields that gather citations the quickest - biomedical sciences and artificial intelligence - this shouldn't be too surprising?
How would it know? If anything this is one of the only places where it’s not an ouroboros because of the citations.
But yeah, I have to imagine LLM-produced material is already on its way back into the next models. I wonder how this problem is perceived among the people building these systems. Given that they're not yet near ASI or AGI, it's gotta be close to an existential issue, no?
Maybe not “your company will collapse,” but “you will not be able to get further improvements.”
Would love to hear thoughts from someone who is actually addressing this (or knows why we don’t need to).
There is a demand for low-background steel, steel produced before the first nuclear bombs were detonated, for use in Geiger counters and the like. It's usually salvaged from the wrecks of sunken WW1 ships, as that is one of the few ways to guarantee the steel is free of fallout contamination.
The same will happen to data sets from pre-2022, everything beyond that will be rapidly polluted with AI nonsense.
https://link.springer.com/article/10.1007/s12195-022-00754-8