I've used S3 to store JSON objects. Two pain points I've noticed:
- If your tests change the time (e.g. with Delorean for Ruby), S3 requests will fail, because request signing depends on your client's clock being approximately in sync with the server's.
- If you ever want to load several S3 objects at once, e.g. to show a list of 30 Foos, you'll need to make 30 requests, so it's a bit like an n+1 problem (see the sketch after this list). There might be a way around this, but I haven't investigated it, and it will probably require you to sidestep the abstractions you've built. I'd say with 99.9% confidence you will want to do this someday.
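To make that fan-out concrete, here's a minimal sketch of what "show a list of 30 Foos" looks like against S3, assuming the AWS SDK v3 for Node and a made-up bucket and key layout; Promise.all hides the latency, but it's still one GetObject per object:

```typescript
// Rough sketch of the fan-out. The bucket name and "foos/<id>.json" key
// layout are invented for illustration.
import { S3Client, GetObjectCommand } from "@aws-sdk/client-s3";

const s3 = new S3Client({ region: "us-east-1" });

async function loadFoos(ids: string[]): Promise<unknown[]> {
  // One request per object; parallelizing doesn't reduce the request count
  // (or the per-request cost), it only overlaps the round trips.
  return Promise.all(
    ids.map(async (id) => {
      const res = await s3.send(
        new GetObjectCommand({ Bucket: "my-app-data", Key: `foos/${id}.json` })
      );
      const body = await res.Body?.transformToString();
      return body ? JSON.parse(body) : null;
    })
  );
}
```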
These days I mostly use Postgres instead of MySQL, but I can't help but think that querying MySQL from Node has got to be easier than building your own database on top of S3.
I'm also thinking of using Postgres; I think the support is better now that it has native JSON support. You still need to build the SQL queries in Node though, which is a pain. Another thing I remember: with a new database you have to get the tables set up and initialized, so there's more boilerplate code there :)
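For what it's worth, here's roughly the boilerplate I mean, sketched with node-postgres (pg) and a jsonb column; the table name, column names, and DATABASE_URL env var are just assumptions for illustration:

```typescript
// Sketch of the setup/init boilerplate when storing JSON in Postgres.
import { Pool } from "pg";

const pool = new Pool({ connectionString: process.env.DATABASE_URL });

async function init(): Promise<void> {
  // The "set up and initialize the tables" step that S3 never asked for.
  await pool.query(`
    CREATE TABLE IF NOT EXISTS foos (
      id   serial PRIMARY KEY,
      data jsonb NOT NULL
    )
  `);
}

async function saveFoo(foo: Record<string, unknown>): Promise<void> {
  // Serialize explicitly; Postgres coerces the text parameter to jsonb.
  await pool.query("INSERT INTO foos (data) VALUES ($1)", [JSON.stringify(foo)]);
}

async function listFoos(limit = 30): Promise<unknown[]> {
  // One query returns all 30 rows -- no per-object fan-out like with S3.
  const res = await pool.query("SELECT data FROM foos LIMIT $1", [limit]);
  return res.rows.map((r) => r.data);
}
```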