Yeah, as long as you're reading from and writing to the database in the same language, and that language has good type safety, having your database schema effectively defined by the same types as the rest of your code is pretty nice for a lot of use cases.
You just have to be vigilant about correctly migrating existing data to the current shape if you ever make breaking changes to types.
This. It would be nice if there were a framework (in Go, or Python with Pydantic) that would help me migrate data written with old structs to new structs, and also handle the transaction (something like the sketch below).
For now I use SQLite to handle transactions and only make backward-compatible updates to structs. Brittle, but it's a toy app anyway.
(I normally use Django for models and migrations, but wanted to do something different.)
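A minimal sketch of what I have in mind, assuming Pydantic v2 and rows stored as JSON blobs in SQLite; the table, models, and function names are all hypothetical:

    # Hypothetical versioned-struct migration; not a real library.
    import sqlite3
    from pydantic import BaseModel

    class UserV1(BaseModel):
        name: str  # old shape: one combined name field

    class UserV2(BaseModel):
        first_name: str
        last_name: str

    def migrate_v1_to_v2(old: UserV1) -> UserV2:
        # The breaking change: split `name` into first/last.
        first, _, last = old.name.partition(" ")
        return UserV2(first_name=first, last_name=last)

    def migrate_all(conn: sqlite3.Connection) -> None:
        # `with conn` wraps the whole migration in one transaction:
        # commits on success, rolls back if anything raises.
        with conn:
            rows = conn.execute(
                "SELECT id, data FROM users WHERE version = 1"
            ).fetchall()
            for row_id, blob in rows:
                new = migrate_v1_to_v2(UserV1.model_validate_json(blob))
                conn.execute(
                    "UPDATE users SET data = ?, version = 2 WHERE id = ?",
                    (new.model_dump_json(), row_id),
                )

The part a framework could actually own is the registry of per-version upgrade functions and chaining them (v1 to v2 to v3) automatically, instead of me hand-writing the plumbing every time.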
Yeah, migrations are the biggest issue for me. I really don't like not knowing what the actual shape of a document will be. Missing transactions and poor relationship performance make modelling some systems more hassle than it's worth.
I gave Mongo and Firestore a good go on a few projects, but after a year or two of experimenting I'll be sticking to SQL-based DBs unless there are super clear and obvious benefits to a document-based model.
There's a gradual approach there, where you start out with a JSONB column, and then as each piece of the data structure stabilizes* you move it out of the JSON blob and into its own columns/tables (sketch below).
* meaning, when there's enough code that depends on it that changing it would require some planning
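Concretely, that can look something like this; a sketch using SQLAlchemy with Postgres JSONB, where the table and field names are made up:

    # Gradual promotion sketch: a JSONB catch-all column, with one
    # stabilized field promoted to a real, indexed column.
    from sqlalchemy import Column, Integer, String, text
    from sqlalchemy.dialects.postgresql import JSONB
    from sqlalchemy.orm import declarative_base

    Base = declarative_base()

    class Event(Base):
        __tablename__ = "events"
        id = Column(Integer, primary_key=True)
        # Everything starts life in here while the shape is still in flux.
        payload = Column(JSONB, nullable=False, default=dict)
        # Once payload->>'kind' stabilized, it was promoted to a real column.
        kind = Column(String, index=True)

    # One-off backfill after promoting the field (would normally live in a
    # migration): copy the value out of the blob, then delete it from the blob.
    PROMOTE_KIND = text(
        """
        UPDATE events
        SET kind = payload->>'kind',
            payload = payload - 'kind'
        WHERE kind IS NULL
        """
    )

The nice property is that queries and indexes only ever target the promoted columns, so the churn stays confined to the blob.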
Most commonly I see people use Alembic to handle schema migrations for their SQLAlchemy models, with Pydantic alongside for validation (rough sketch below).
But I tend to just use Django. Every time I try piecing together the parts (e.g. FastAPI, Pydantic, Alembic) I reach a point where I realize I'm recreating a half-baked Django, and kick myself for not starting with Django in the first place.
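For anyone who hasn't seen it, the Alembic half of that stack is migration scripts that look roughly like this sketch; the revision IDs, table, and column names are placeholders:

    # Sketch of a hand-written Alembic migration script.
    from alembic import op
    import sqlalchemy as sa

    revision = "abc123"       # placeholder revision id
    down_revision = "def456"  # placeholder parent revision

    def upgrade() -> None:
        # A schema change plus a data backfill in the same migration.
        op.add_column("users", sa.Column("display_name", sa.String(), nullable=True))
        op.execute("UPDATE users SET display_name = first_name || ' ' || last_name")

    def downgrade() -> None:
        op.drop_column("users", "display_name")

Alembic can autogenerate the schema diff from your SQLAlchemy models, but data backfills like the UPDATE above stay manual, which is part of that half-baked-Django feeling.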