> Let them eat /dev/urandom to their heart's content.
No! You can't just give them purely random data. No, sir. That would be easy enough to detect.
What you need is plausible randomness. Shift the value of every transaction by a small percentage. Trend everything downward over time, but keep it plausible: the occasional random upward spike makes it far more entertaining. Best buy now before it gets too expensive! Oh, I'm sorry, that wasn't the actual price? Well, you'd best use a reputable source!
If you're going to poison the well, you don't want to be caught. You want them to wonder at what point their data set diverged and for how long they've been serving incorrect data. Sinister points for interspersing legitimate data with munged data.
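A minimal sketch of the idea, purely illustrative: the function name, parameters, and magic numbers below are all hypothetical, but it shows the combination of a small downward drift, jitter, a rare upward spike, and a slice of untouched legitimate values mixed in.

```python
import random

def munge(price, drift=-0.002, jitter=0.01, spike_p=0.05, legit_p=0.3):
    """Perturb a price plausibly.

    drift   -- slow downward bias applied on each observation
    jitter  -- gaussian noise so values never repeat exactly
    spike_p -- chance of an enticing upward spike ("best buy now!")
    legit_p -- fraction of values passed through untouched, so the
               scraper can't tell where the data set diverged
    """
    if random.random() < legit_p:
        return price                      # intersperse legitimate data
    delta = drift + random.gauss(0, jitter)
    if random.random() < spike_p:
        delta += random.uniform(0.02, 0.08)  # plausible upward trend
    return round(price * (1 + delta), 2)
```

Run every outbound value destined for the scraper through something like this and the aggregate trend quietly rots while any single number still looks believable.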
The trick with being evil in this case is to be subtle about it. They want to scrape all your metrics? Let them. You just can't guarantee the accuracy of the data they're scraping, right? [wink, wink]
I am going to look into this later today with nginx. I plan on having every request from their scraping IP return a static page linking to the proper site.
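Something along these lines might do it, assuming a single known scraping address (the IP and paths below are placeholders, not anything from a real config):

```nginx
http {
    # Flag requests from the observed scraping IP (hypothetical address).
    geo $is_scraper {
        default      0;
        203.0.113.42 1;
    }

    server {
        listen 80;
        server_name example.com;
        root /var/www/html;

        # Scrapers get a static page linking back to the proper site.
        if ($is_scraper) {
            rewrite ^ /not-for-you.html last;
        }

        location = /not-for-you.html {
            internal;          # only reachable via the rewrite above
            root /var/www/decoy;
        }
    }
}
```

The `geo` block is cheaper than checking `$remote_addr` with `if` on every request, and it scales to a whole list of scraper addresses if they start rotating IPs.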