
According to the methodology on that page, the standalone version of Keras (using "from keras.models" imports, as recommended by the Keras docs) would be classified as "Other". (I tried to find the source code to verify this, but couldn't.)
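
For reference, here's what I mean about the import styles. This is my guess at what an import-based scan would key on, since I couldn't check their actual classifier:

    # Standalone Keras, as the Keras docs recommend -- the string
    # "tensorflow" never appears in the imports, so a scan of framework
    # imports would plausibly bucket the paper as "Other":
    from keras.models import Sequential
    from keras.layers import Dense

    # The tf.keras style, which the same scan would presumably count
    # as TensorFlow:
    # from tensorflow.keras.models import Sequential
    # from tensorflow.keras.layers import Dense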

And if that is correct, then I'd be astonished if the vast majority of the "Other" papers aren't Keras. I work in ML and I don't think I've seen a paper that didn't use PyTorch, TensorFlow or Keras in years.

And if that's the case, then almost certainly more papers use TF than PyTorch, once you count Keras (which by default runs on a TensorFlow backend) as TF: PyTorch is 42%, TF is 23%, but Other is 36%.
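
Back-of-the-envelope version (the fraction of "Other" that is actually Keras is my guess, not a measured number):

    # Shares reported on that page: PyTorch 42%, TensorFlow 23%, Other 36%.
    pytorch, tensorflow, other = 0.42, 0.23, 0.36

    # Hypothetical: suppose some fraction of "Other" is standalone Keras,
    # which by default runs on a TensorFlow backend.
    for keras_fraction in (0.5, 0.75, 0.9):
        tf_family = tensorflow + keras_fraction * other
        print(f"{keras_fraction:.0%} of Other is Keras -> "
              f"TF family {tf_family:.0%} vs PyTorch {pytorch:.0%}")

    # The crossover is at (0.42 - 0.23) / 0.36 ~= 53%: once a bit more
    # than half of "Other" is Keras, the TF family edges past PyTorch.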

(In terms of biases: I hate working in TensorFlow and much prefer PyTorch and Keras. But numbers are numbers.)




JAX?


Are there any papers that use JAX for anything other than demonstrating JAX itself? I can't think of one off the top of my head.

Perhaps I should have specified "papers other than those introducing new frameworks or benchmarking speed".

There are a bunch of interesting papers using custom libraries for distributed training, and ones aimed at showing off the performance of specific hardware (NVIDIA has a bunch of interesting work in this space, and Intel and other smaller vendors have done things too).


It's still early days for JAX, but there's Neural Tangents (https://arxiv.org/abs/1912.02803) and Reformer (https://arxiv.org/abs/2001.04451) from ICLR.


I agree about it being early days.

Reformer is a good example that I'd missed.

Neural Tangents is another paper demoing a framework.




