Hacker News
Jo – a shell command to create JSON (2016) (jpmens.net)
325 points by robfig on Feb 5, 2022 | 94 comments



I remember that when I worked at Google about a decade ago, there was this common saying:

"If the first version of your shell script is more than five lines long, you should have written it in Python."

I think there's a lot of truth in that. None of the examples presented in the article look better than they would have in an existing scripting/programming language. In fact, had they been written in Python or JavaScript, it would have been far more obvious what the resulting output would be, considering that those languages already use {} for objects and [] for lists.

Take this example:

    jo -p name=JP object=$(jo fruit=Orange point=$(jo x=10 y=20) number=17) sunday=false
In Python you would write it like this:

    json.dumps({"name": "JP", "object": {"fruit": "Orange", "point": {"x": 10, "y": 20}, "number": 17}, "sunday": False})
Only a bit more code, but at least it won't suffer from the Norway problem. Even though Python isn't the fastest language out there, it's likely still faster than the shell command above. There is no need to construct a fork bomb just to generate some JSON.


> Even though Python isn't the fastest language out there, it's likely still faster than the shell command above.

Taking these two command lines:

   jo -p name=JP object=$(jo fruit=Orange point=$(jo x=10 y=20) number=17) sunday=false >/dev/null

   python -c 'import json;print(json.dumps({"name": "JP", "object": {"fruit": "Orange", "point": {"x": 10, "y": 20}, "number": 17}, "sunday": False}))' >/dev/null
For jo (x86_64, Rosetta2), python2 (x86_64, Rosetta2), jo (arm64), and python3 (arm64), running 1000 iterations, with `tai64n` doing the timing.

    2022-02-05 21:25:38.357228500 start-jo-x86
    2022-02-05 21:25:45.319337500 stop-jo
    2022-02-05 21:25:45.319338500 start-python2-x86
    2022-02-05 21:26:18.876235500 stop-python2-x86
    2022-02-05 21:26:18.876235500 start-jo-arm
    2022-02-05 21:26:22.316063500 stop-jo-arm
    2022-02-05 21:26:22.316064500 start-python3-arm
    2022-02-05 21:26:40.379063500 stop-python3-arm
I make it: 7s for jo-x86, 33.5s for python2-x86, 3.5s for jo-arm, 18s for python3-arm.

Test script is at https://pastebin.com/4tTVrDia


python3 is (relatively) slow to start up, and this is something that got significantly worse with the 2->3 migration:

  $ time python3 -c ''
  real    0m0.029s

  $ time python2 -c ''
  real    0m0.010s

  $ time bash -c ''
  real    0m0.001s
Which means you probably don't want python scripts on a busy webserver being called from classic cgi-bin (do people still use those?), or run as the -exec argument to a "find" iterating over many thousands of files. Maybe a couple more such examples. For most use-cases though, that's still fast enough.


I get 14.823s for python3 and 4.667s for jo on my system.

I also wrote my own tool, xidel [1]:

    time for i in $(seq 1 $count); do xidel -se '{"name": "JP", "object": {"fruit": "Orange", "point": {"x": 10, "y": 20}, "number": 17}, "sunday": false()}' > /dev/null; done     

    
which gives me 1.575s

But if you actually want to repeat something a thousand times, you would use a loop in the query, which takes 0.017s:

    time xidel -se 'for $i in 1 to 1000 return {"name": "JP", "object": {"fruit": "Orange", "point": {"x": 10, "y": 20}, "number": 17}, "sunday": false()}'  > /dev/null
  
  
(a python3 loop gives me 0.029s)

[1] https://videlibri.de/xidel.html


what about how long for a human to read it and debug it when it gets beyond trivial?


Fair question. I think jo does tend to get more crufty if you're doing anything reasonably complex with multilevel structures, especially with arrays.

But jo does come into its own when you want to use shell variables.

    > jo mypid=$$ set_or_not=$WEASEL

    > python -c 'import json,os;print(json.dumps({"set_or_not":os.getenv("WEASEL"), "mypid":os.getpid()}))'


I'm not disagreeing that python is slow, but why would you choose to do either in a shell script?

    $ time cat<<EOL
    {"name": "JP", "object": {"fruit": "Orange", "point": {"x": 10, "y": 20}, "number": 17}, 
    "sunday": false}  
    EOL
    {"name": "JP", "object": {"fruit": "Orange", "point": {"x": 10, "y": 20}, "number": 17}, 
    "sunday": false}

    real 0m0.002s
    user 0m0.000s
    sys  0m0.002s


> why would you choose to do either in a shell script?

In the normal case, you'd have variables interpolated in there, not static JSON. And then you run into the quoting problems that jo was created to work around...


Now put a thousand of those JSON objects in a list, invoking jo for every element.


That wasn't the claim made in the original post though, was it? The claim was that the Python snippet would be quicker than the jo snippet.

"Even though Python isn't the fastest language out there, it's likely still faster than the shell command above."

Which it most definitely is not - it's 5x slower.

(Probably not a huge issue in the real world if you're writing a shell script, mind, given that bash itself isn't a performance demon. But claims have to be tested.)


That’s because you’re making a false assumption about the environment prior to executing the statement.

If you are in a shell session and have to choose between executing python -c or calling jo, the latter is faster as you’ve demonstrated. But that’s not a realistic assumption.

Statements like these are almost certainly part of some combined work. The data you’re feeding to jo comes from somewhere. Its output is written somewhere.

You can’t convince me that if you’re already inside some Python script, that invoking json.dumps() is slower than calling jo from within a shell script.

At no point did I claim that launching Python AND running that json.dumps() is faster than running that shell command. I only stated that the json.dumps() is.


> if you’re already inside some Python script [...]

You're not going to shell out to `jo` and that's fine - it's not what `jo` was created for; it's explicitly a shell command to help you work around the annoyance of getting quoting right when constructing JSON from the command line (which I've had to do a lot and I'm pretty sure many people have to.)

> If you are in a shell session [and want to create JSON] ... that’s not a realistic assumption.

Of course it is. People create JSON in shell scripts all the time! That's why things like `jq` exist - because this is what people do!


I actually did that for a more realistic comparison.

Example for jo:

  docker run --rm -it debian bash
  apt update && apt install -y jo nano
  nano bash-loop.sh && chmod +x bash-loop.sh
  
  #!/bin/bash
  for ((i=0;i<1000;i++)); 
  do 
     jo -p name=JP object=$(jo fruit=Orange point=$(jo x=10 y=20) number=17) sunday=false
  done
  
  time ./bash-loop.sh >/dev/null
Example for Python 3:

  docker run --rm -it debian bash
  apt update && apt install -y python3 nano
  nano python-loop.py
  
  import json
  for i in range(1000):
    print(json.dumps({"name": "JP", "object": {"fruit": "Orange", "point": {"x": 10, "y": 20}, "number": 17}, "sunday": False}))
  
  time python3 python-loop.py >/dev/null
Versions:

  Debian GNU/Linux 11 (bullseye)
  jo 1.3
  Python 3.9.2
Results for jo:

  real    0m2.230s
  user    0m1.106s
  sys     0m1.076s
Results for Python 3:

  real    0m0.027s
  user    0m0.021s
  sys     0m0.005s
So it seems like you're probably right about how individual invocations add up over large numbers of iterations in non-trivial cases!

Note: jo pretty-prints because of the "-p" parameter, which Python doesn't, so this might not be a 1:1 comparison; it would be better to remove it. Though when I did that, the performance improvement was maybe 1%, not significant.

Admittedly, it would be nice to test with actually random data to make sure that nothing gets optimized away, such as just replacing one of the numbers in JSON with a random value, say, the UNIX timestamp. But then you'd have to prepare all of the data beforehand (to avoid differences due to using Python to get those timestamps, or one of the GNU tools), or time the execution separately however you wish.

Edit to explain my rationale: Why bother doing this? Because I disagree with the sibling comment:

> The claim was that the Python snippet would be quicker than the jo snippet.

In my eyes that's almost meaningless, since in practice you'll actually care about runtimes when working with larger amounts of data, or alternatively really large files. Therefore this should be tested, not just the startup times, which become irrelevant in most real-world programs, except for cases where you'd make a separate invocation per request, which you sometimes shouldn't do.

Edit #2: here's a lazy edit that uses the UNIX time and makes the data more dynamic, ignoring the overhead to retrieve this value, to get a ballpark figure.

Use time value for jo:

  jo -p name=JP object=$(jo fruit=Orange point=$(jo x=10 y=20) number=$(date +%s)) sunday=false
Use time value for Python 3:

  import time
  ...
  print(json.dumps({"name": "JP", "object": {"fruit": "Orange", "point": {"x": 10, "y": 20}, "number": int(time.time())}, "sunday": False}))
Results for jo:

  real    0m2.794s
  user    0m1.422s
  sys     0m1.313s
Results for Python 3:

  real    0m0.027s
  user    0m0.020s
  sys     0m0.006s
Seems like nothing changed much.

Edit #3: probably should have started with a test to verify whether the initially observed performance differences (Python being slower due to startup time) were also present.

Single iteration results for jo:

  real    0m0.003s
  user    0m0.000s
  sys     0m0.002s
Single iteration results for Python 3:

  real    0m0.022s
  user    0m0.017s
  sys     0m0.004s
Seems to also more or less match those results.


Often when writing scripts, I'm chaining tools together, e.g. using git to find a thing in a specific commit, using curl to grab something from the web, decoding some json, maybe unzipping a file.

I've never really found any language that feels good for that kind of thing. There's definitely a middle ground where it's getting too much for bash, but jumping to a language loses too much in the initial conversion to make it feel worth it until you are well past the point where your future self will think you should have made the switch.

Some languages have things like backticks in PHP to inter-operate, but it's still not a great experience to mix between them. For my own little things I'm currently looking at fish, but bash is omnipresent.

This tool seems to delay that point even further, as currently dealing with generating json is definitely a pain point (whereas manipulating it in jq is often really good).

But if anyone can point to good examples of this transition in python then I'd be very interested.

edit: jq is more powerful than I thought for creating json, see https://spin.atomicobject.com/2021/06/08/jq-creating-updatin...


Typically if you have bash you have a bunch of other utilities installed too.

The problem with python here is that while python-sh might be nice, you have to install any extra libraries you need with it, and that's not a trivial problem for installing scripts into prod.

Xonsh is better since you kind of get both the benefits of python and a shell like language, but frankly it's broken in a number of ways still. I use it daily, but hopefully you don't need to ctrl-c out of something since signal handling is iffy at best. It is kind of nice to be able to import python directly on the command line and use it as python though...


Thank you for pointing out that jq can create JSON! I use jq all the time for working with the AWS CLI and a big pain point has always been sending JSON. If you don't know, the AWS CLI depends on JSON arguments for quite a few common tasks, and the JSON needed can be quite lengthy.

Up until now I've been creating temp JSON files to feed the commands and I thought jo would be a great tool to make this easier. Now that I know jq can also create JSON, I'll just use that instead.


> there was this common saying:

> "If the first version of your shell script is more than five lines long, you should have written it in Python."

Seriously, these kinds of "common knowledge", "universal truth in a sentence" sayings are often the mark of wannabe-guru mid-career engineers who have no clue what they are talking about.


This is a toxic comment that adds nothing to the conversation.


I don't know, the jo command seems a lot more readable to me than compressing json to a one-line string. How is it toxic to call out a silly truism?


I agree with Galanwe, but

> [...] are often the mark of wannabe guru mid career engineers that have no clue what they are talking about.

is the "toxic" part.

At least he says "are often," so as not to accuse the anonymous mid-career Google engineer.


"Even though Python isn't the fastest language out there, it's likely still faster than the shell command above."

That is going a bit far. By all means use Python. Go ahead and attack people who use the shell. But let's be honest. The shell is faster, assuming one knows how to use it. A similar claim is often made by Python advocates, something along the lines of Python is not slow if one knows how to use it.

The startup time of a Python interpreter is enormous for someone who is used to a Bourne shell. This is what always stops me from using Python as a shell replacement for relatively simple jobs; I have never written a large shell script and doubt I ever will. I write small scripts.

If anyone knows how to mitigate the Python startup delay, feel free to share. I might become more interested in Python.

Anyway, this "jo" thing seems a bit silly. Someone at Google spent their 20% time writing a language called jsonnet to emit JSON. It has been discussed on HN before. People have suggested dhall is a perhaps better alternative.

https://jsonnet.org

https://dhall-lang.org


> it's likely still faster than the shell command above.

That's not a shell command any more than running "python" is. `jo` is its own executable.

And the sub-commands are not necessary; `jo` supports nested data natively.


I just assume at this point that a new python script from a coworker won’t run without an hour of tinkering and yelling obscenities at my screen. Or resorting to running in docker, which seems asinine. Python’s everywhere and does everything though, so I don’t have a good alternative. Shell scripts definitely aren’t it, but they generally hold up better when sharing in my experience.


Shell scripts can't declare dependencies though, in the same way a pip package can. A shell script using this tool requires one to manually apt install it first, or run in a common docker image - asinine. If you don't, your script will fail halfway through during runtime (actually, likely it will not fail, just produce corrupted output, since shell by default ignores errors), a python IDE or mypy will tell you about missing packages during analysis before you try to build and run it.

Besides that, looking at json only, it's part of the standard library so is more likely to already exist on any given machine rather than this.


That depends on how well your bash script is constructed. If you carefully handle failure cases such as missing commands, non-root permissions, etc., it can be easy to use and reasonably portable. Of course Python scripts have better error traces, so if the script doesn't work others can debug it relatively easily.


Sure, but using Python gives you 100 more problems. Would you like to set pip and your package manager on a fight to the death? Or is today the day you learn all about venv? Might as well use Docker. Oh nice, my 600MB script is ready!


That's not really an issue for small scripts, no.


Since I have no idea why this is downvoted: it really isn't an issue for small scripts. If you just ignore pip/packages, Python is going to give you way more functionality out of the box than shell would, with a lot fewer sharp corners, which translates into a less buggy/more correct script at the end of the day.

If you do take the time to deal with pip (which, yes, is a problem) you get access to even more batteries that would have been a pain or just flat out impossible with shell.

(& on many distros, you can use your system package manager. I'm not seeing a material difference between a shell script that requires "apt-get install jo" and a Python script that requires "apt-get install python3-requests" or something.)

But either way, for circumstances where shell is the wrong tool for the job, "invoke python in the middle of this shell script because it's a better tool for this particular part of the job" is a strategy I've used before & will keep using, b/c it produces code that isn't riddled with bugs.


Its behavior is really well defined, and it will stop on a syntax error, type error, or some other error that wasn't handled.

Bash on error will just move on to the next line as if nothing ever happened unless you:

    set -eu
    set -o pipefail


The longest shell script I ever wrote was at Google, because at the time, my co-worker and I didn't know anyone with Python readability.


Although good advice, this is also an area where Nim could shine.


If you're going to use string literal, then

  echo '{"name": "JP", "object": {"fruit": "Orange", "point": {"x": 10, "y": 20}, "number": 17}, "sunday": false}'
is fine as a script


All fun and games until you need quoting. Quoting in shell scripts is already hellish enough, but layering JSON quoting on top of that is a road to madness.


Combining a here-string with `jq` (if you need variables) is plenty good enough. You'd need a single jq invocation.


Yes, `jq` would work fine, but so would `jo`. The point is, if there is anything more than the simplest dynamic values, constructing valid JSON just with POSIX shell is a huge pain.


I feel this whenever anyone advocates using jq. It boggles my mind that anyone would want to learn a whole new DSL for something that JS makes trivial, especially considering JS is a much more expressive scripting language anyway.


JS is not that easy to embed in scripts for trivial situations, though. After seeing a couple of jq examples I can run `jq '.foo[] | {bar, baz}'` without really "learning" its DSL. But doing the same with node? That would be much larger.


And now everyone who needs to read your script also needs to find those examples to figure out what your code does. Even your not-real-world example's intent isn't obvious to anyone unfamiliar with jq.


That's about the only example you need explained to understand ~99% of real world jq usage. $dayjob has quite a bit of it around in various repos, and this actually is as real-world as it gets in my experience. Comparing that to having to learn enough JS to do the same thing in a verbose way, I'm still on the side of jq having an advantage in that case.


For programs this trivial, startup time so dominates runtime, and Python's startup time is so incredibly awful, that you can often fork 10-20 low-overhead processes before Python even starts executing user code.


Hard to write oneliners in python, though.

And this lets you write json without (or with less) braces, commas and quotes. That alone is already a big win.


That's probably because you're at Google; however, there are more embedded devices on earth than whatever Google has, times a billion. On those devices Python is too heavy, and a POSIX shell along with jo fits perfectly.


Some time ago I wrote a zsh helper [1] that uses jo to quickly construct complex JSON requests, mainly for testing and querying services from the console.

Paired with httpie [2] aliases [3] it produces concise APL-like syntax:

  POST https://httpbin.org/post test:=j`a=b c=`e=3` l=`*1 2 3``
Which translates to:

  http POST https://httpbin.org/post test:="$(jo a=b c="$(jo e=3)" l="$(jo -a 1 2 3)")"
Or, in other words, sending POST with the following body:

  {"a":"b","c":{"e":3},"l":[1,2,3]}
[1]: https://github.com/seletskiy/dotfiles/blob/78ac45c01bdf019ae... [2]: https://httpie.io/ [3]: https://github.com/seletskiy/dotfiles/blob/78ac45c01bdf019ae...


HTTPie creator here. We’ve recently[0] added support for nested JSON[1] to the HTTPie request language, so you can now craft complex JSON requests directly:

  $ http pie.dev/post test[a]=b test[c][e]:=3 test[l][]:=1
[0] https://httpie.io/blog/httpie-3.0.0

[1] https://httpie.io/docs/cli/nested-json


jo immediately reminded me of httpie’s CLI syntax. Is it just a coincidence, or does httpie use jo under the hood?

Btw loving your new desktop app so far!


We did look at `jo`, and also `jarg`[0], the W3C HTML JSON form syntax[1], and pretty much every other approach we could find. We had quite a few requirements that the new nested syntax had to meet: be simple/flexible, easy to read/write, backward/forward compatible, and play well with the rest of the HTTPie request language.

The final syntax is heavily inspired by the HTML JSON forms one. But we made it stricter, added type safety, and some other features. It also supports all the existing functionality like embedding raw JSON strings and JSON/text files by paths.

The final implementation[2] is completely custom. We have plans to ship it as a standalone tool as well, publish the test suite in a language-independent format and write a formal spec so that other tools can easily adopt it. This spec will eventually be a subset of one for the overall HTTPie request language, which is currently tied to our CLI implementation but we want to decouple it.

Happy to hear you like the desktop app!

[0] https://github.com/jdp/jarg

[1] https://www.w3.org/TR/html-json-forms/

[2] https://github.com/httpie/httpie/blob/master/httpie/cli/nest... — this is the path parser


Thanks for the detailed reply! Hadn’t come across the HTML JSON form syntax before. Good to know memorising the httpie request language will have some side benefits in that it’s closely related to a standard in use elsewhere.


I like it, very concise but retains all the features you need.

`jq` can construct JSON "safely" from shell constructs, but is rather more verbose - e.g. with the same examples:

        $ jq -n --arg name Jane '{"name":$name}'
        
        $ jq -n \
        --argjson time $(date +%s) \
        --arg dir $HOME \
        '{"time":$time,"dir":$dir}'
        
        $ jq -n '$ARGS.positional' --args spring summer winter
        
        $ jq -n \
        --arg name JP \
        --argjson object "$(
          jq -n \
          --arg fruit Orange \
          --argjson point "$(
            jq -n \
            --argjson x 10 \
            --argjson y 20 \
            '{"x":$x,"y":$y}' \
          )" \
          --argjson number 17 \
          '{"fruit":$fruit,"point":$point,"number":$number}' \
        )" \
        --argjson sunday false \
        '{"name":$name,"object":$object,"sunday":$sunday}'


>Bam! Jo tries to be clever about types and knows null, booleans, strings and numbers.

I'm very skeptical of this. If I put x=001979 in as a value, I don't think I want you trying to guess whether that's supposed to be an integer or a string.

This sounds like the Norway Problem waiting to happen.


This kind of thing has its place and can be useful, but I think there should be options to enable/disable such magic. Personally I'd lean towards it being opt-in, but with a CLI it's a lot harder not to make it opt-out. Everything is a string in the CLI, but not so much when it comes to JSON. That's why I think it makes sense, providing you can disable the magic.


It is reminiscent of a feature in YAML that has bitten people.

But, clearly there is a use-case for producing json with integer, null, and boolean values.


Of course there is, but it should be done explicitly - i.e. treat as a string by default but use -n if it's a number rather than having a guess.


Here is how HTTPie does it:

  x=005  // {"x": "005"}
  X:=005 // {"x": 5}


then use -s :

        jo -- -s opaque_id=001979
        {"opaque_id":"001979"}


jo opaque_id=$(command used in prod that returns digits + letters for three months and everything works just fine and then suddenly at 3am on a Sunday it returns 1979 and breaks everything)

Avoid using this command in prod.


    jo -- -s a=123 -s b=00123 -s c="$(date)"
    {"a":"123","b":"00123","c":"Sat  5 Feb 21:29:08 GMT 2022"}
What's the issue?


The problem is avoiding your surprise ruined Sunday night three months from now is predicated on you not forgetting to put the -s in.

Implicit typing sucks: https://www.destroyallsoftware.com/talks/wat


Agreed, luckily there is a solution ;)


It’s not a solution, it’s a work-around. A solution would be fixing this issue. This allows working around the issue if you remember that it exists.

It’s like saying that it’s fine if an appliance might randomly burn your house, because there’s a button you can press to not burn your house.


I'm sure it'd be a patch of a few lines to make the type specification mandatory on the command line (I would certainly prefer that also), but it comes down to the opinion of the maintainer if that is wanted or not.


It looks like you can specify the value type per property, for those cases where it matters.


Opt-in safety has such a great track record after all.


    > jo normally treats value as a literal string value, unless it begins with one of the following characters:  
    > value  action
    > @file  substitute the contents of file as-is
    > %file  substitute the contents of file in base64-encoded form
    > :file  interpret the contents of file as JSON, and substitute the result
This is convenient but also very dangerous. This feature will cause the contents of an arbitrary file to be embedded in the JSON if any value starts with a `@`, `%`, or `:`.

This will be a source of bugs or security issues for any script generating json with dynamic values.


Five minute job:

  $ ./jo foo=1 bar=2 obj=$(./jo -a 1 2 3 "</script" '"')
  {"foo":1,"obj":[1,2,3,"<\/script","\""],"bar":2}

  $ ./jo foo='abc
  > def
  > ghi'
  {"foo":"abc\ndef\nghi"}

  $ cat jo
  #!/usr/local/bin/txr --lisp
  
  (define-option-struct jo-opts nil
    (a   array   :bool
         "Produce array instead of object")
    (nil help    :bool
         "Print this help"))
  
  (defvarl jo-name *load-path*)
  
  (defun json-val (str)
    (match-case str
      ("true" t)
      ("false" nil)
      ("null" 'null)
      (`{@nil` (get-json str))
      (`[@nil` (get-json str))
      (@else (iflet ((num (tofloat else)))
               num
               else))))
  
  (let ((o (new jo-opts)))
    o.(getopts *args*)
    (when o.help
      (put-line "Usage:\n")
      (put-line `  @{jo-name} [options] arg*`)
      o.(opthelp)
      (exit 0))
  
    (if o.array
      (let ((items [mapcar json-val o.out-args]))
        (put-jsonl (vec-list items)))
      (let ((pairs [mapcar (lambda (:match)
                             ((`@this=@that`) (list (json-val this) (json-val that)))
                             ((@else) (error "~a: arguments must be name=obj pairs" jo-name)))
                           o.out-args]))
        (put-jsonl ^#H(() ,*pairs)))))


Never heard of txr. Thanks for this.

For anyone else wondering: https://www.nongnu.org/txr/


This is cool.

In the spirit of "do one thing well", I'd so rather use this to construct JSON payloads to curl requests than the curl project's own "json part" proposal[1] under consideration.

[1]: https://github.com/curl/curl/wiki/JSON#--jp-part


Agree, I was surprised that the cURL feature was considered as it seems to go against the "Do One Thing" and composability points of the UNIX philosophy.


Curl does like 100 "things" already by that standard. The Unix philosophy doesn't have to be reductionist.

Curl does do one thing: make network requests. This feature is making it easier to make network requests, i.e. it makes it better at doing the one thing that it does.


curl? Do one thing? In the same sentence?

curl looks like it has hundreds of flags.

https://danluu.com/cli-complexity/


The more I work with JSON, the more I crave some kind of dedicated json editor to easily visualize and manipulate json objects, and to serialize / deserialize strings. This is especially the case with truly massive JSON objects with multiple layers of nesting. Anyway, cool tool that makes one part of the process of whipping up JSON a little less painful


Add a json LSP to your editor, or use VS code which includes it natively: https://www.npmjs.com/package/vscode-json-languageserver Configure a json schema for the document you're editing and suddenly you get suggestions, validation, etc. as you type. It's pretty magical.

There's one for yaml too that works well in my experience: https://github.com/redhat-developer/yaml-language-server


I wrote something like this for emacs: a couple functions “fwoar/dive” and “fwoar/return” that let you navigate JSON documents in a two-pane sort of paradigm.

https://youtu.be/qbRNmk-malw


I built a JSON editor for Android a while back as part of a tool for kicking off AWS Lambda functions. I was planning on pulling the JSON editor into its own reusable package, but lost momentum. I imagine it could be useful in many sorts of apps that use JSON.

https://play.google.com/store/apps/details?id=com.alexsci.an...

https://github.com/ralexander-phi/android-aws-lambda-runner


Looks neat, but the documentation was hard to follow. A lot of typos, and some things didn't make sense. For example, I think two paragraphs were swapped, because the example code below the paragraph talking about using square brackets has no square brackets (but nested objects), while the paragraph right after talks about nested objects but its example doesn't show them (it does, however, seemingly demonstrate the square-bracket feature mentioned previously).


And I just use Nushell. You have built-ins to create (and parse) not only json but also url, xml and more... https://github.com/skelly37/Reject-POSUCKS-embrace-Nushell#b...


This looks like an informally specified shell-friendly alternative json syntax.

I wonder if a formal syntax would help? Perhaps including relevant shell syntax (interpolation, subshell). It could clarify issues, and this different perspective might suggest improvements or different approaches.


There's also jshon which is a simple stack-based DSL for constructing JSON from shell scripts.

http://kmkeen.com/jshon/

It's written in C and is not actively developed. The latest commit, it seems, was a pull request from me back in 2018 that fixed a null-termination issue that led to memory corruption.

Because I couldn't rely on jshon being correct, I rewrote it in Haskell here:

https://github.com/dapphub/dapptools/tree/master/src/jays

This is also not developed actively but it's a single simple ~200 line Haskell program.


In the common case where you trust your input entirely you can just interpret your string as JavaScript. Then you don't even need to use quotes for the key names.

    $ alias fooson="node --eval \"console.log(JSON.stringify(eval('(' + process.argv[1] + ')')))\""
    $ fooson "{time: $(date +%s), dir: '$HOME'}"
    {"time":1457195712,"dir":"/Users/jpm"}
It may be a bit nicer to place that JavaScript in your path as a node script instead of using an alias.

    #!/usr/bin/env node
    console.log(JSON.stringify(eval('(' + process.argv[2] + ')')))
Since fooson's argument is being interpreted as JavaScript, you can access your environment through process.env. But you could make a slightly easier syntax in various ways. Like with this script:

    #!/usr/bin/env node
    for(const [k, v] of Object.entries(process.env)) {
        if (!global.hasOwnProperty(k)) {
            global[k] = v;
        }
    }
    console.log(JSON.stringify(eval('(' + process.argv[2] + ')')))
Now environment variables can be accessed as if they were JS variables. This can let you handle strings with annoying quoting.

    $ export BAR="\"'''\"\""
    $ fooson '{bar: BAR}'
    {"bar": "\"'''\"\""}
If you wanted to do this without trusting your input so much, a JSON dialect where you can use single-quoted strings would get you pretty far.

    $ fooson "{'time': $(date +%s), 'dir': '$HOME'}"
    {"time":1457195712,"dir":"/Users/jpm"}
If you taught the utility to expand env variables itself you'd be able to handle strings with mixed quoting as well.

    $ export BAR="\"'''\"\""
    $ fooson '{"bar": "$BAR"}'
    {"bar": "\"'''\"\""}
You'd only need small modifications to a JSON parser to make this work.


Ended up sniping myself. Made a fairly complete version of what I was imagining: https://github.com/itsjohncs/construct-json#readme


Does it suffer from the Norway problem?


It does not.

    jo -- -b foo=no -s a=12 q=[1,2,3]
    {"foo":true,"a":"12","q":[1,2,3]}


I think you're thinking of YAML; JSON doesn't interpret "no" as a boolean. Scroll down here to see the JSON grammar: https://www.crockford.com/mckeeman.html

I actually think the "Norway problem" is a PEBKAC from users not learning the data format. But this tool may confuse some people or applications who don't know what a boolean, integer, float or string are, and try to mix types when the program reading them wasn't designed to. Probably the issue will come up whenever people mix different kinds of versions ("1", "1.1", "1.1.1" should be parsed as an int, float, and string, respectively)


    jo a=12

Should “a” be a number or a string in the resulting JSON?

And if it’s a number, how can I tell it to output a string?


Depending on your exact needs, you might also want to try Next Generation Shell. It's a fully featured programming language for the DevOps-y stuff with convenient "print JSON" switch so "ngs -pj YOUR_EXPR" evaluates the expression, serializes the result as JSON and prints it.

Disclosure: I'm the author

Project link: https://github.com/ngs-lang/ngs

Enjoy!


Dang yo, why isn't it "Jo – a shell command to create JSON written in C (2016) (jpmens.net)"?

You guys do that for "written in Rust"


The original title does not contain “written in C”. Do moderators usually rewrite the title to add things like “written in Rust”?


Direct link to repo:

https://github.com/jpmens/jo


You can implement this and many other single purpose CLI tools with inline Python or Perl or other language. Easier to remember because it's your favorite language.

    python -c "import json; print(json.dumps(dict(a=10, b=False)))"


While a handy trick, it doesn't entirely solve the original problem when the values have dynamic content. In your example, replace 10 with $FOO and then you are back to square one, having to escape python-strings within a shell-string. Better avoid the problem entirely by not using shell to begin with. To instead continue on the dirty track, replace 10 with int(sys.argv[1]) and call it as python -c "..." $FOO.


I've been using it for years and it's a great companion of jq and jp (JMESPath).


Awesome, thank you. Lovely counterpart to jq :)


next to "copy as cURL" in dev tools, this might make the regular rotation


Could be simply called `json`.


Good luck in search results.



