These are full-blown computers with plenty of RAM, and using Python on them is just fine if it meets your CPU usage spec. People get too carried away with the term "embedded". Not everything is an MSP430 with a tiny amount of memory or processing power. That said, I've been using modern C++ and Rust on my hobby projects: they have most of the advantages of Python and blow it away in memory usage and performance, at least for embedded work where we're not building huge web apps or similar.
This is not really the classic embedded space; it is a full ARM processor running a full Linux operating system. It's more like a small server than an MCU board, and it couldn't be used for hard real-time requirements.
Rapid prototyping and, in the case of the Jetson, computer vision and TensorFlow running on the GPU. Both have excellent support for Python.
Sure, you can make those things run natively, but the people using them are used to Python, a lot of code is already written in Python for them, and there is little to no performance penalty in running it in Python (everything performance-critical runs natively anyway; Python is only used to "glue" those bits seamlessly together).
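A minimal sketch of that glue pattern, using NumPy as a stand-in for whatever native library does the heavy lifting (the function and array shapes here are illustrative, not from the thread). Python only orchestrates; every per-element operation executes in compiled code:

```python
import numpy as np

def normalize(frame):
    """Zero-mean, unit-variance normalization of an image frame.

    All per-pixel arithmetic happens inside NumPy's compiled C
    routines; the Python interpreter never touches individual pixels.
    """
    frame = frame.astype(np.float32)
    return (frame - frame.mean()) / (frame.std() + 1e-8)

# A hypothetical 480x640 8-bit camera frame.
frame = np.random.randint(0, 256, size=(480, 640), dtype=np.uint8)
out = normalize(frame)
print(out.shape, out.dtype)
```

The same division of labor holds for TensorFlow or OpenCV on a Jetson: the Python lines above are a negligible fraction of the runtime, so rewriting them in C++ buys you essentially nothing.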
And once you have RAM in the gigabytes, even the extra memory overhead of Python becomes a moot point.
In this particular case, I think everything compute-intensive happens on the GPU anyway, so there's not much to lose by using a high-level language on the CPU side.