We're using it in production, but then, the author is on our team. We've been using Datomic in production since Jan 2013. It's as good as it says on the tin.
I love living dangerously, if danger means using full-stack Clojure and immutability -grin-
We're also using it in production, and it works really well when paired with Datomic. Thanks to transaction queues on both ends, we can set up a Meteor-style sync that's pretty handy.
It's also well-suited to rapid prototyping. I'm working on a small project that will eventually need a backend, but for now I just store everything in a DataScript db and serialize that to localStorage. When I'm ready to add a backend, I'll just subscribe to DataScript's transaction queue and forward the transactions to be saved on the server.
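That pattern can be sketched roughly in Python (DataScript itself is ClojureScript; `InMemoryDB`, its listener hook, and the `local_storage` dict here are all invented stand-ins, not DataScript's API):

```python
import json

class InMemoryDB:
    """Toy stand-in for a client-side database with a transaction queue."""
    def __init__(self, facts=None):
        self.facts = list(facts or [])
        self.listeners = []

    def transact(self, tx):
        self.facts.extend(tx)
        for listener in self.listeners:   # notify the transaction queue
            listener(tx)

    def listen(self, callback):
        self.listeners.append(callback)

local_storage = {}  # stand-in for the browser's localStorage

db = InMemoryDB()
# For now: snapshot the whole db to "localStorage" on every transaction.
db.listen(lambda tx: local_storage.update(db=json.dumps(db.facts)))
# Later: swap in a listener that forwards each tx to the server instead,
# e.g. db.listen(lambda tx: post("/tx", json.dumps(tx)))

db.transact([["user-1", "title", "Hello"]])
```

The point is that the persistence strategy lives entirely in the listener, so switching from localStorage to a server backend doesn't touch the rest of the app.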
This is a project I've been keeping a close eye on, as I'm thoroughly fascinated by Datomic, and the concept of using a similar system alongside functional approaches like React seems like a good match.
Would love to hear comments from anyone who has given DataScript a test drive - I'm guessing from the alpha state that no one is using it in production but hey, who knows... some people like to live dangerously.
Wrong guess.
The project's repository has a list of things people built with DataScript.
One of those is https://precursorapp.com
It's a highly usable, stable, and fun app!
FYI, there are new developments in this area... there should be a video up on YouTube about "Om.Next" in a week or so that details using Datomic & Om with a Relay-style mechanism.
I've been thinking a lot about using a DataScript database in place of a large, mutable, deeply nested hash map, or other irregular chunk of data, to keep track of state using the amazing ClojureScript library re-frame [1].
There are times when having a novel way to query could lead to simpler data.
As a datapoint, I'm using DataScript in a project and I'm quite happy with it. It wasn't quite the promised land, but it solved many hard problems for me.
Immutable database? That doesn't make sense. If you can't mutate it, how do you write to it? How do you update records in it? If it's really immutable then you can't. Such a database is a compile-time constant.
These days everybody seems to throw around terms like immutable, isomorphic, pure, etc. without actually knowing what they mean, or uses them in contexts where they don't make any sense.
> Immutable database? That doesn't make sense. If you can't mutate it, how do you write to it? How do you update records in it?
You don't write to the database; you perform an operation that takes a database and a write request and returns a new database. Same with updates. Essentially, the kinds of operations you think of as "mutating" have a signature of, in Haskell-ish notation, something like `transact :: Database -> Transaction -> Database`.
With this approach, the database is an immutable value, so you can hold on to old versions of it. Here's an incomplete example of how you could implement optimistic updates:
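A minimal sketch of that idea in Python (none of these names are DataScript's or Datomic's API; `transact` and `AppState` are invented for illustration):

```python
from dataclasses import dataclass, replace

def transact(db: dict, tx: dict) -> dict:
    """Pure write: returns a NEW db with tx applied; the old db is untouched."""
    new_db = dict(db)
    new_db.update(tx)
    return new_db

@dataclass(frozen=True)
class AppState:
    confirmed: dict        # last database value acknowledged by the server
    pending: tuple = ()    # local transactions not yet acknowledged

    def optimistic_view(self) -> dict:
        """What the UI renders: confirmed state plus pending local writes."""
        db = self.confirmed
        for tx in self.pending:
            db = transact(db, tx)
        return db

    def local_write(self, tx: dict) -> "AppState":
        return replace(self, pending=self.pending + (tx,))

    def server_ack(self, confirmed_db: dict) -> "AppState":
        """Server confirmed the oldest pending tx: drop it, keep the rest."""
        return AppState(confirmed=confirmed_db, pending=self.pending[1:])

state = AppState(confirmed={"user/name": "Ada"})
state = state.local_write({"user/email": "ada@example.com"})
# The UI immediately renders the optimistic value, while the old, confirmed
# database version is still around to roll back to if the server rejects it.
```

Because `transact` never mutates its input, holding on to old versions is free: the confirmed state and the optimistic view are just two values.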
Normally, when you update a row in a database you lose the ability to access the previous values, e.g. if you update your address, it overwrites your previous address.
With Datomic you are just adding a new fact: my address is now X. This has a lot of nice properties. I can now easily write a query to tell me your previous 5 addresses, rolling back is easy, you can get amazing and scalable read performance, etc. etc.
In the past we have either had to hand-roll our own features (e.g. have a table of past addresses, or add date columns to specific tables) or rely on error-prone, brittle techniques like audit tables, source control, or transaction logs.
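A toy illustration of accumulate-only facts in Python (this is not Datomic's actual API; the four-tuple just mimics its entity/attribute/value/tx shape):

```python
from itertools import count

facts = []            # accumulate-only log of (entity, attribute, value, tx)
_tx_ids = count(1)    # monotonically increasing transaction ids

def assert_fact(entity, attribute, value):
    """Writes only append; nothing is ever overwritten."""
    facts.append((entity, attribute, value, next(_tx_ids)))

def current_value(entity, attribute):
    """The latest assertion wins."""
    matches = [f for f in facts if f[0] == entity and f[1] == attribute]
    return matches[-1][2] if matches else None

def history(entity, attribute, n=5):
    """The last n values, newest first -- no audit table needed."""
    matches = [f for f in facts if f[0] == entity and f[1] == attribute]
    return [value for (_, _, value, _) in reversed(matches)][:n]

assert_fact("user-1", "address", "12 Oak St")
assert_fact("user-1", "address", "99 Elm Ave")   # the old address survives
```

History queries and rollback fall out for free, because the old facts are never destroyed in the first place.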
The data is immutable, not the database itself, meaning that you create entries but do not modify them, the same way you cannot modify a constant after you create it. "Immutable-data database" would sound a bit redundant.
I did not understand "persistent" until someone pointed me to the Wikipedia article [1]:
> In computing, a persistent data structure is a data structure that always preserves the previous version of itself when it is modified. Such data structures are effectively immutable, as their operations do not (visibly) update the structure in-place, but instead always yield a new updated structure.
The way I understand it in Python terms: it's like a namedtuple or dict, where each update returns a new namedtuple/dict with the new values for the modified attributes but the old values for the attributes that did not change. It doesn't need to deep-copy the whole data structure/database.
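For instance, with a namedtuple (a minimal illustration of the sharing, not DataScript itself):

```python
from collections import namedtuple

Person = namedtuple("Person", ["name", "address", "friends"])
v1 = Person(name="Ada", address="12 Oak St", friends=("Grace", "Alan"))

# _replace returns a NEW tuple; v1 is untouched ("effectively immutable").
v2 = v1._replace(address="99 Elm Ave")

assert v1.address == "12 Oak St"    # the old version is still readable
assert v2.address == "99 Elm Ave"
assert v1.friends is v2.friends     # unchanged fields are shared, not copied
```

The last line is the key point: the unchanged `friends` tuple is the very same object in both versions, so "copying" the structure on every update is cheap.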
This is not persistent as in "stored on disk". Being append-only is one way to achieve this kind of persistence, and it applies just as well to data structures that are not stored on disk.