
Even though JSONB is an internal format, I see immediate external uses for it in applications. For example, batch inserts in Python: a per-row insert call has noticeable overhead, and JSONB could bring the performance back with a CTE:

    CREATE TABLE data(id, name, age);

    WITH ins AS (
        SELECT c1.value, c2.value, c3.value
        FROM json_each('["some", "uuid", "key"]') c1
        INNER JOIN json_each('["joe", "sam", "phil"]') c2 USING (key)
        INNER JOIN json_each('[10, 20, 30]') c3 USING (key)
    )
    INSERT INTO data (id, name, age)
    SELECT * FROM ins
Each json_each could accept a bind parameter carrying a JSONB blob from the app.
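A minimal sketch of that idea from Python's sqlite3 module, assuming an in-memory database and a `data(id, name, age)` table; it binds three plain-JSON text arrays (not JSONB blobs, which sqlite3 has no helper for) and zips them together on json_each's `key` column, the array index:

```python
import json
import sqlite3

ids = ["some", "uuid", "key"]
names = ["joe", "sam", "phil"]
ages = [10, 20, 30]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE data(id, name, age)")

# Each json_each(?) receives one JSON array; joining USING (key)
# pairs up elements that share the same array index.
conn.execute(
    """
    WITH ins AS (
        SELECT c1.value, c2.value, c3.value
        FROM json_each(?) c1
        INNER JOIN json_each(?) c2 USING (key)
        INNER JOIN json_each(?) c3 USING (key)
    )
    INSERT INTO data (id, name, age)
    SELECT * FROM ins
    """,
    (json.dumps(ids), json.dumps(names), json.dumps(ages)),
)
```

All three rows land in a single statement, so the per-call overhead is paid once instead of once per row.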


You don't need JSONB for this: doing it with plain JSON is simpler and, in my experience, already faster than individual inserts for most bindings.

I typically do bulk inserts using a single JSON argument like this:

    WITH ins AS (SELECT e.value ->> 'id', e.value ->> 'name', e.value ->> 'age' FROM json_each(?) e)
    INSERT INTO data (id, name, age)
    SELECT * FROM ins
The same approach can be used for bulk updates and deletes as well.


I have quite wide records (over 50 fields), and ->> does not perform well with text keys. I haven't tried it with array indexing, though:

    WITH ins AS (
        SELECT value ->> 0, value ->> 1, value ->> 2
        FROM json_each('[["some", "joe", 10], ["uuid", "sam", 20], ["key", "phil", 30]]')
    )
    INSERT INTO data (id, name, age)
    SELECT * FROM ins



