One thing people seem to be ignoring is that this research is useful because it creates a controlled environment in which to explore the variables affecting crop yields, even if the stated application has very limited practical value. Also keep in mind that many technologies take a considerable amount of time to develop before they are of any value at scale.
Take something that I presume most people on this forum are intimately familiar with: computers. It is not too difficult to find people mocking historical figures for claiming the world only needed a handful of computers, yet those claims made sense in their historical context: early computation devices were mechanical, electromechanical, or tube based. They were huge, slow, and seen as having little value outside of performing calculations in very specific domains. Even science fiction authors of the era had a difficult time imagining them as anything but hulking machines that might control everything, yet were accessible only to a few. Attitudes may have shifted when transistors entered the picture, and shifted even further when integrated circuits were developed, yet it wasn't until the mid-70s that people started imagining what we have today. Even so, computers were a novelty to most people until the mid-90s. We are talking about half a century to go from something we would recognize as a computer to adoption by society as a whole. And that is ignoring the fact that people had been trying to develop calculating machines for centuries.
Research is good, even if the vast majority of it ends up leading to dead-ends, simply because we don't have sufficient imagination to determine what will be useful in the future.
This is true - and I think what’s also being missed here is the potential to further this research for producing grain off-world. Growing crops in space or on other planets will be a heavily optimized and controlled process, so it’s good to explore these possibilities here on Earth even if it doesn’t yet make commercial sense.