I have a passion project, 4x4anarchy.com, that runs on a Python + MariaDB stack: it queries map data by latitude and longitude and transforms the results into GeoJSON for map display. The site deals with sizable tables, roughly 1 GB, and I've already optimized extensively with well-structured indexes, caching, and query tuning.
Given that, how might incorporating Julia and a geospatial database such as PostGIS further optimize geospatial data retrieval and presentation, especially for large datasets and intricate geospatial operations?
It would depend on where most of the processing is happening.
PostGIS gives you spatial indexes (R-trees implemented on top of GiST), which are extremely performant for bounding-box and proximity queries.
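For a sense of what that looks like, here's a minimal sketch using psycopg2 -- the `roads` table, its `geom` column, and the connection string are all placeholders, not anything from your schema:

```python
import psycopg2

# Placeholder connection parameters -- adjust for your setup.
conn = psycopg2.connect("dbname=gis user=gis")

with conn, conn.cursor() as cur:
    # A GiST index is what makes PostGIS bounding-box operators fast.
    cur.execute("""
        CREATE INDEX IF NOT EXISTS roads_geom_idx
        ON roads USING GIST (geom);
    """)

    # The && operator tests bounding-box intersection and is served
    # automatically by the GiST index above -- no hinting required.
    cur.execute("""
        SELECT id, ST_AsGeoJSON(geom)
        FROM roads
        WHERE geom && ST_MakeEnvelope(%s, %s, %s, %s, 4326);
    """, (-105.3, 39.9, -105.1, 40.1))
    rows = cur.fetchall()
```

ST_DWithin and most of the other PostGIS predicates are index-aware in the same way, which is the "planner does it for you" point below.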
I've seen Python geospatial workloads that took hours finish in a few minutes once the processing was moved into PostGIS.
If you're also doing a lot of processing in Python itself, exploring other languages could help too; with Julia you get a typed language that's JIT-compiled.
I think the challenge for most people is that the PostGIS query planner applies the spatial index for you in most queries, while a naive all-pairs comparison in geopandas/shapely won't warn you that you should be using the .sindex spatial index instead.
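To illustrate, a rough sketch assuming a reasonably recent geopandas (one whose .sindex exposes a query method) and two hypothetical input files -- the naive version tests every pair, while .sindex prunes candidates with an R-tree first:

```python
import geopandas as gpd

points = gpd.read_file("points.geojson")  # hypothetical inputs
zones = gpd.read_file("zones.geojson")

# Naive all-pairs: every point tested against every zone -- O(n*m).
naive_hits = [
    (i, j)
    for i, pt in points.geometry.items()
    for j, zone in zones.geometry.items()
    if pt.within(zone)
]

# Index-assisted: the R-tree (.sindex) narrows each point down to the
# few zones whose bounding boxes contain it before the exact test.
hits = []
for i, pt in points.geometry.items():
    for pos in zones.sindex.query(pt, predicate="within"):
        hits.append((i, zones.index[pos]))
```

In fairness, geopandas' sjoin uses the spatial index under the hood, so reaching for it instead of hand-rolled loops gets you the same win -- but nothing stops you from writing the slow version, which is exactly the trap.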
I don't know Julia well, but I'd definitely suggest exploring whether PostGIS can improve the speed of your DB queries.
I'd also consider how you deliver your geospatial data to clients -- GeoJSON is verbose, uncompressed text, so I'm not sure it's your best bet. Protobuf-encoded vector tiles (e.g. the Mapbox Vector Tiles spec) might suit your use case better.
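And if PostGIS is in the mix anyway, it can encode the tiles for you server-side. A sketch under the same assumptions as above (the `roads` table with a 4326 geometry column is hypothetical; ST_TileEnvelope needs PostGIS 3.0+):

```python
import psycopg2

# Build one z/x/y tile: filter by the tile envelope (index-friendly,
# since the comparison stays in the stored SRID), then encode as MVT.
TILE_SQL = """
    WITH bounds AS (
        SELECT ST_TileEnvelope(%(z)s, %(x)s, %(y)s) AS geom
    ),
    mvtgeom AS (
        SELECT ST_AsMVTGeom(ST_Transform(t.geom, 3857), bounds.geom) AS geom,
               t.id
        FROM roads t, bounds
        WHERE t.geom && ST_Transform(bounds.geom, 4326)
    )
    SELECT ST_AsMVT(mvtgeom.*, 'roads') FROM mvtgeom;
"""

conn = psycopg2.connect("dbname=gis user=gis")  # placeholder credentials
with conn, conn.cursor() as cur:
    cur.execute(TILE_SQL, {"z": 12, "x": 850, "y": 1550})
    tile = cur.fetchone()[0]
    # `tile` is a binary MVT blob, ready to serve with content type
    # application/vnd.mapbox-vector-tile
```

Tiles also cache beautifully (each z/x/y is immutable until the data changes), which plays well with the caching layer you already have.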