As a word of warning: when HN discovered my search engine, I was hit hard by a botnet within a few days. I saw about 30-40k queries/hour from some 10k IP addresses. I'm self-hosted, so the worst that happened was my search engine got a bit slow, but if I were cloud-hosted I'd have had a very sizable bill to pay.
If you do not already have a global rate limit, implement one ASAP. Better to have one and not need it, than to need it and not have it.
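If you're already fronting the service with Nginx, its `limit_req` module is a quick way to get a per-IP rate limit. A minimal sketch; the zone name, rate, domain, and backend address are placeholders, not anything from a real setup:

```nginx
# Track clients by IP; a 10 MB shared-memory zone holds roughly 160k addresses.
limit_req_zone $binary_remote_addr zone=perip:10m rate=10r/s;

server {
    listen 80;
    server_name search.example.com;   # placeholder domain

    location / {
        # Allow short bursts of up to 20 requests, then reject (503 by default).
        limit_req zone=perip burst=20 nodelay;
        proxy_pass http://127.0.0.1:8080;   # placeholder backend
    }
}
```

With `nodelay`, bursts are served immediately and anything beyond the burst is rejected instead of queued, which keeps a flood from tying up worker connections.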
Do you have a reverse proxy like HAProxy or Nginx in front of your API? Most bots will hit you by IP alone, so filtering out and rejecting requests that don't carry your domain in the Host header will eliminate most of them.
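In Nginx this is a small addition: make the default server (the one that catches requests whose Host header matches no `server_name`) drop the connection. A sketch, assuming your real site is defined elsewhere with its own `server_name`:

```nginx
# Catch-all for requests made by bare IP or with an unknown Host header.
server {
    listen 80 default_server;
    listen 443 ssl default_server;
    ssl_reject_handshake on;   # refuse the TLS handshake instead of serving a cert
    server_name _;
    return 444;                # Nginx-specific: close the connection with no response
}
```

`ssl_reject_handshake` requires Nginx 1.19.4 or newer; on older versions you'd need a dummy certificate for the catch-all server.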