I haven't ruled out building AI-powered queries into the app, but I am firm about doing it in a way that respects data security (i.e. no shipping database schemas or data to ChatGPT without explicit user consent).
From my understanding, usable on-device LLMs tend to be several gigabytes in size, which makes them difficult to roll out to everyone. Apple Intelligence might make this feasible; I'll need to do more research on that when I do the iOS port.
Add an AI query tool - you could do it on-device with something like functionary (in GGML format) running on llama.cpp, exposing a few functions:
getSchemaForTables(...)
getTableStats()
runQuery('...')
Then you could do a query like:
"show me all customers who regularly post between midnight and 1am"
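To make the idea concrete, here is a minimal sketch of the tool side of that setup: the three functions backed by SQLite, plus a dispatcher that maps a functionary-style tool call (a `{"name": ..., "arguments": ...}` payload) onto them. Everything here is illustrative - the table names, helper names, and the read-only guard are assumptions, and the actual model inference via llama.cpp is omitted:

```python
import json
import sqlite3

# Toy database standing in for the app's real data (illustrative only).
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE posts (id INTEGER PRIMARY KEY, customer_id INTEGER,
                    created_at TEXT);
INSERT INTO customers VALUES (1, 'Ada'), (2, 'Grace');
INSERT INTO posts VALUES (1, 1, '2024-05-01 00:30:00'),
                         (2, 1, '2024-05-02 00:45:00'),
                         (3, 2, '2024-05-02 14:00:00');
""")

def get_schema_for_tables(tables):
    """Return CREATE TABLE statements for the named tables."""
    placeholders = ",".join("?" * len(tables))
    rows = con.execute(
        "SELECT name, sql FROM sqlite_master"
        f" WHERE type='table' AND name IN ({placeholders})", tables)
    return {name: sql for name, sql in rows}

def get_table_stats():
    """Row counts per table, so the model can pick sensible queries."""
    tables = [r[0] for r in con.execute(
        "SELECT name FROM sqlite_master WHERE type='table'")]
    return {t: con.execute(f"SELECT COUNT(*) FROM {t}").fetchone()[0]
            for t in tables}

def run_query(sql):
    """Execute a model-written query, restricted to read-only SELECTs."""
    if not sql.lstrip().lower().startswith("select"):
        raise ValueError("read-only: only SELECT is allowed")
    return con.execute(sql).fetchall()

TOOLS = {
    "getSchemaForTables": lambda a: get_schema_for_tables(a["tables"]),
    "getTableStats":      lambda a: get_table_stats(),
    "runQuery":           lambda a: run_query(a["sql"]),
}

def dispatch(call):
    """Route one tool call emitted by the model to its implementation."""
    return TOOLS[call["name"]](json.loads(call["arguments"]))

# The kind of call the model might emit for the "midnight posters" question:
result = dispatch({
    "name": "runQuery",
    "arguments": json.dumps({
        "sql": "SELECT c.name FROM customers c"
               " JOIN posts p ON p.customer_id = c.id"
               " WHERE strftime('%H', p.created_at) = '00'"
               " GROUP BY c.id HAVING COUNT(*) >= 2"
    }),
})
print(result)  # → [('Ada',)]
```

Keeping `runQuery` read-only matters here: the model's output is untrusted, so the tool layer, not the prompt, is what enforces that it can't mutate data.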