
The Events here give up on the naming, granularity, and semantics problem: they're extremely low-level, fine-grained changes to fields in a database.

Events themselves are no longer interesting or semantically meaningful, because they're a single atomic change to a database field. A change in the way state is represented means a different set of events is produced. Subscribing to meaningful occurrences in this model is difficult, and probably will eventually result in the creation of a "meta-event" for each action that contains the semantic intent of the outcome.

Events are IMO most useful for analytics and processing when they correspond to meaningful business outcomes: steps in a workflow, consequences of user actions, and the like - even though this makes extraordinary and rare business outcomes more difficult to accommodate.




> The Events here give up on the naming, granularity, and semantics problem: they're extremely low-level, fine-grained changes to fields in a database.

You say "fields", I say "facts".

Whether a set of nominal events or a graph of facts is the higher-level representation will ultimately depend on the domain, but using Datomic for several years now has led me to think that it's much more often the latter than conventional Event Sourcing has trained us to believe.

In fact, it may well be that Event Sourcing has traditionally been relegated to those (few) use cases where the "set of nominal Event Types" approach is better - those being the only situations where conventional Event Sourcing is practical.

> Events themselves are no longer interesting or semantically meaningful, because they're a single atomic change to a database field. A change in the way state is represented means a different set of events is produced. Subscribing to meaningful occurrences in this model is difficult,

Datoms are not 'single' or 'atomic' - they're packed coherently together in a Transaction, and are immediately relatable to entire database values. Subscribing is just pattern matching, and is not hard, as the following section of the article tried to show: https://vvvvalvalval.github.io/posts/2018-11-12-datomic-even....
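
To make that concrete, here's a rough sketch (Clojure, Peer API) of 'subscribing by pattern matching' on the transaction report queue. The :order/status attribute and the println are placeholder names of mine, not anything from the article:

  (require '[datomic.api :as d])

  (defn watch-status-changes!
    "Blocks on conn's transaction report queue and reacts to every assertion
     of the (hypothetical) :order/status attribute."
    [conn]
    (let [queue (d/tx-report-queue conn)]
      (loop []
        (let [{:keys [db-after tx-data]} (.take queue)
              status-attr (d/entid db-after :order/status)]
          ;; each datom is an [e a v tx added] tuple; 'subscribing' is just
          ;; matching on the attribute (and anything else) you care about
          (doseq [[e a v tx added] tx-data
                  :when (and added (= a status-attr))]
            (println "entity" e "changed :order/status to" v "in tx" tx))
          (recur)))))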

> [...] and probably will eventually result in the creation of a "meta-event" for each action that contains the semantic intent of the outcome.

Which would only bring you back to the 'set of nominal Events' case, so it's not a regression compared to conventional Event Sourcing anyway. That's what the article meant by 'annotating Reified Transactions', and it's something you can even do after the fact (i.e. in a later transaction, when the requirement for it becomes apparent), which means that you don't have to get these aspects right upfront, nor commit to them.
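
As a rough illustration (Clojure, in-memory Peer API; :event/type, :order/status and the :order/cancelled keyword are names I made up, not from the article):

  (require '[datomic.api :as d])

  (def uri "datomic:mem://example")
  (d/create-database uri)
  (def conn (d/connect uri))

  ;; install the annotation attribute, plus a toy attribute to transact against
  @(d/transact conn [{:db/ident       :event/type
                      :db/valueType   :db.type/keyword
                      :db/cardinality :db.cardinality/one}
                     {:db/ident       :order/status
                      :db/valueType   :db.type/keyword
                      :db/cardinality :db.cardinality/one}])

  ;; an ordinary transaction, committed without any annotation...
  (def tx-result  @(d/transact conn [{:order/status :cancelled}]))
  (def past-tx-id (:tx (first (:tx-data tx-result))))

  ;; ...annotated after the fact, from a later transaction, once this kind
  ;; of change turns out to deserve a business-level name
  @(d/transact conn [[:db/add past-tx-id :event/type :order/cancelled]])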

For a more in-depth discussion of Datomic's Reified Transactions, I suggest this talk by Tim Ewald: https://docs.datomic.com/on-prem/videos.html#reified-transac...


"Events themselves are no longer interesting or semantically meaningful, because they're a single atomic change to a database field."

This isn't true; datoms are not considered individually, but are grouped together into transactions. Additionally, you can add arbitrary attributes to annotate the transaction currently being committed.

See: https://docs.datomic.com/cloud/transactions/transaction-proc...
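
Roughly like this (a sketch of mine, not copied from that page; it assumes an existing connection conn and user-defined attributes :order/status and :event/type):

  (require '[datomic.api :as d])

  ;; "datomic.tx" is the reserved tempid that resolves to the transaction
  ;; entity being committed, so the second map annotates the transaction itself
  @(d/transact conn [{:order/status :shipped}
                     {:db/id      "datomic.tx"
                      :event/type :order/shipped
                      :db/doc     "Carrier picked up the parcel."}])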


That is true, but isn’t it trivially solvable with the Datomic model? You could just keep a "significant event" list as datoms and add entries to it?

You still get all the benefits of event sourcing, but then you can query the state much more easily.

At least that’s what it looks like to me from a cursory look at Datomic and daily struggles with normal event sourcing (Kafka).
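
From that cursory look, the "significant event" query I have in mind would be something like this (assuming transactions are tagged with a hypothetical :event/type attribute, as a sketch only):

  (require '[datomic.api :as d])

  (d/q '[:find ?tx ?type ?when
         :where
         [?tx :event/type   ?type]
         [?tx :db/txInstant ?when]]
       (d/db conn))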


Have you written about your daily struggles somewhere? I, for one, would be very interested in reading about it, as I think not nearly enough people talk about the cons/war stories when it comes to ES (or I just don't find those blogs).


We didn’t have much success with it, but there are tools like Debezium that read the Postgres WAL and send row-level changes like this to Kafka.
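
For reference, wiring that up means registering a connector config with Kafka Connect, roughly like the sketch below; hostnames, credentials and table names are placeholders, and some property names differ between Debezium versions (this is the 2.x style):

  {
    "name": "inventory-connector",
    "config": {
      "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
      "plugin.name": "pgoutput",
      "database.hostname": "postgres.example.internal",
      "database.port": "5432",
      "database.user": "debezium",
      "database.password": "********",
      "database.dbname": "inventory",
      "topic.prefix": "inventory",
      "table.include.list": "public.orders"
    }
  }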



