
I tried this with a 1.3GB file [1], and got Gemini to convert the jq query into a Go program. It takes 6.6s the second time it's run (once the OS has the geojson file cached). My laptop isn't particularly fast (i7-1250P), and Go's encoding/json isn't particularly fast either, so jq's time on a Ryzen 9 desktop processor with a 500MB file doesn't impress me.

It's surprising how quick this kind of processing can be in Go.

[1] https://data.acgov.org/datasets/2b026350b5dd40b18ed7a321fdcd...

The program:

  package main

  import (
        "encoding/json"
        "fmt"
        "os"
  )

  type FeatureCollection struct {
        Features []Feature `json:"features"`
  }

  type Feature struct {
        Properties Properties `json:"properties"`
  }

  type Properties struct {
        TotalNetValue int    `json:"TotalNetValue"`
        SitusCity     string `json:"SitusCity"`
  }

  func main() {
        filename := "Parcels.geojson"

        data, err := os.ReadFile(filename)
        if err != nil {
                fmt.Println("Error reading file:", err)
                return
        }

        var featureCollection FeatureCollection
        err = json.Unmarshal(data, &featureCollection)
        if err != nil {
                fmt.Println("Error unmarshaling JSON:", err)
                return
        }

        for _, feature := range featureCollection.Features {
                if feature.Properties.TotalNetValue < 193000 {
                        fmt.Println(feature.Properties.SitusCity)
                }
        }
  }


