The Events here give up on the naming, granularity, and semantics problem: they're extremely low-level, fine-grained changes to fields in a database.
Events themselves are no longer interesting or semantically meaningful, because they're a single atomic change to a database field. A change in the way state is represented means a different set of events is produced. Subscribing to meaningful occurrences in this model is difficult, and probably will eventually result in the creation of a "meta-event" for each action that contains the semantic intent of the outcome.
Events are IMO most useful for analytics and processing when they correspond to meaningful business outcomes: steps in a workflow, consequences of user actions, and the like - despite this having the problem of making extraordinary and rare business outcomes more difficult to accommodate.
> The Events here give up on the naming, granularity, and semantics problem: they're extremely low-level, fine-grained changes to fields in a database.
You say "fields", I say "facts".
Which of a set of nominal events or a graph of facts is higher-level will ultimately depend on the domain, but using Datomic over several years now has led me to think that it's much more often the latter than conventional Event Sourcing has trained us to believe.
In fact, it may well be that Event Sourcing has traditionally been relegated to those (few) use cases where the "set of nominal Event Types" approach is better, being the only situations where conventional Event Sourcing is practical.
> Events themselves are no longer interesting or semantically meaningful, because they're a single atomic change to a database field. A change in the way state is represented means a different set of events is produced. Subscribing to meaningful occurrences in this model is difficult,
Datoms are not 'single' or 'atomic' - they're packed coherently together in a Transaction, and are immediately relatable to entire database values. Subscribing is just pattern matching, and is not hard, as the following section of the article tried to show: https://vvvvalvalval.github.io/posts/2018-11-12-datomic-even....
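To make the "subscribing is just pattern matching" point concrete, here's a toy Python model - purely illustrative, not Datomic's actual API - where a datom is an (entity, attribute, value, tx, added) tuple and a transaction delivers a coherent group of them at once:

```python
# Toy model of Datomic-style datoms (illustrative, not Datomic's API):
# a datom is an (entity, attribute, value, tx, added) tuple, and a
# transaction delivers a coherent group of them together.
def matching_datoms(tx_data, attribute):
    """'Subscribing' as pattern matching: pick out the assertions of
    the attribute we care about from a transaction's datoms."""
    return [d for d in tx_data if d[1] == attribute and d[4]]

# One transaction: several related datoms, not isolated field updates.
tx_data = [
    (42, ":order/status",  "shipped", 1001, True),   # assertion
    (42, ":order/status",  "paid",    1001, False),  # retraction of old value
    (42, ":order/carrier", "UPS",     1001, True),
]

hits = matching_datoms(tx_data, ":order/status")
```

In real Datomic the same idea applies to the datoms delivered by the transaction report queue; the matching logic is just a filter over [e a v tx added?] tuples.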
> [...] and probably will eventually result in the creation of a "meta-event" for each action that contains the semantic intent of the outcome.
Which would only bring you back to the 'set of nominal Events' case, so it's not a regression compared to conventional Event Sourcing anyway. That's what the article meant by 'annotating Reified Transactions', and that's something you can even do after the fact (i.e. in a later transaction, when the requirement for it becomes apparent), which means that you don't have to get these aspects right upfront, nor commit to them.
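A toy sketch of annotating a Reified Transaction after the fact (names hypothetical, not Datomic's API): since a transaction is itself an entity, a later transaction can assert a semantic event type about an earlier one.

```python
# Illustrative sketch: annotating a transaction *after the fact*.
# A transaction is itself an entity, so a later transaction can
# assert a semantic event type about it. Names are hypothetical.
log = []  # append-only log of (entity, attribute, value, tx) datoms

def transact(tx_id, assertions):
    log.extend((e, a, v, tx_id) for (e, a, v) in assertions)

# tx 1001: just the raw facts; no event taxonomy committed upfront.
transact(1001, [(42, ":order/status", "shipped")])

# tx 1002, later, once the requirement appears: annotate tx 1001 itself.
transact(1002, [(1001, ":event/type", ":order/shipped")])

def event_type(tx_id):
    """Recover the semantic intent attached to a transaction, if any."""
    return next((v for (e, a, v, _) in log
                 if e == tx_id and a == ":event/type"), None)
```

The point is that the semantic layer is just more facts, so it can be added whenever it becomes useful rather than designed upfront.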
"Events themselves are no longer interesting or semantically meaningful, because they're a single atomic change to a database field."
This isn't true; changes to datoms are not considered individually, but are grouped together into transactions. Additionally, you can add arbitrary keys to annotate the transaction currently being committed.
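A toy sketch of that annotation (illustrative Python, not Datomic's API): include datoms whose entity is the transaction id itself, alongside the field-level changes, in the same transaction.

```python
# Illustrative sketch: annotating the transaction *being committed*
# by transacting datoms about the transaction entity itself,
# in the same transaction as the field-level changes.
log = []  # append-only log of (entity, attribute, value, tx) datoms

def transact(tx_id, assertions):
    log.extend((e, a, v, tx_id) for (e, a, v) in assertions)

transact(1001, [
    (42,   ":order/status", "shipped"),         # the low-level change
    (1001, ":event/type",   ":order/shipped"),  # arbitrary key on the tx
    (1001, ":event/user",   "alice"),           # another annotation
])

# The semantic event is queryable right alongside the raw datoms.
tx_annotations = [(a, v) for (e, a, v, _) in log if e == 1001]
```

In actual Datomic, as far as I recall, this is done by including a map with the `"datomic.tx"` tempid in the transaction data, so the annotations land on the transaction entity.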
Have you written about your daily struggles somewhere? I, for one, would be very interested reading about it as I think not nearly enough people talk about the cons/war stories when it comes to ES (or I just don't find those blogs).