This is awesome, and it seems like a good use case for IPFS decentralized storage, either for copies of the big SQLite db dump file or, as an alternative, for snapshots of the individual wiki articles.
I think the second option would make it easier to keep in sync with the original wiki: each article in static.wiki is synced with its original Wikipedia page and updated individually, vs. regenerating the huge SQLite dump on every change in the originals.
The SQLite db would then hold only metadata and links to the static (and possibly version-controlled) snapshots of individual articles, so it would be much smaller and easier to distribute on IPFS.
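To make the idea concrete, here's a minimal sketch of what that metadata-only index could look like, assuming a local IPFS daemon and the py-ipfs-http-client package; the table and column names are just made up for illustration, not anything from static.wiki itself:

```python
import sqlite3
import ipfshttpclient  # assumed dependency: pip install ipfshttpclient

db = sqlite3.connect("wiki_index.db")
db.execute("""
    CREATE TABLE IF NOT EXISTS articles (
        title      TEXT PRIMARY KEY,  -- article title
        revision   INTEGER,           -- source Wikipedia revision id
        cid        TEXT,              -- IPFS content identifier of the snapshot
        updated_at TEXT               -- when this snapshot was pinned
    )
""")

def pin_snapshot(title: str, revision: int, html: str) -> str:
    """Add one rendered article snapshot to IPFS and record its CID."""
    with ipfshttpclient.connect() as ipfs:  # connects to the local daemon API
        cid = ipfs.add_str(html)            # returns the content hash (CID)
    db.execute(
        "INSERT OR REPLACE INTO articles VALUES (?, ?, ?, datetime('now'))",
        (title, revision, cid),
    )
    db.commit()
    return cid
```

When an article changes upstream, only that one snapshot gets re-added to IPFS and its row updated, while the small index db is the only thing everyone needs to re-fetch.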
Edit: looks like there is a similar effort in the works to put Wikipedia on IPFS: https://blog.ipfs.io/24-uncensorable-wikipedia/