Accessible to non-researchers, especially those with a programming background but not an AI background.
The TF/Keras approach advocates the minimum amount of code and effort needed to make model changes, with sensible default configurations and layer architectures.
STRONGLY disagree. I’m just a hobbyist, but trying to read Keras models can be a god damn nightmare if the author has to do anything even slightly non-standard. Keras seems to REALLY want you to believe that you can just throw a bunch of layers together and call .fit and everything will just work, but it never seems to be that simple unless you’re training on MNIST or ImageNet.
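For reference, the "stack some layers and call .fit" pattern being described looks roughly like this; a minimal sketch with toy random data (shapes and dataset are made up for illustration), using the standard tf.keras Sequential API:

```python
import numpy as np
import tensorflow as tf

# Toy stand-ins for a real dataset (shapes chosen arbitrarily for illustration).
x = np.random.rand(64, 784).astype("float32")
y = np.random.randint(0, 10, size=(64,))

# Stack layers, compile, fit -- the workflow the comment is describing.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x, y, epochs=1, batch_size=32)
```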
I disagree with you because we switched from TF 1/Keras to PyTorch and our codebase shrank to half its size. Our team is a bunch of developers with little AI background. The problem with TensorFlow is that it is mostly not readable, and onboarding new people to a project is really hard. In contrast, PyTorch is much more readable, and people with a Python background can easily adapt to a PyTorch project after a short machine learning lesson.
Especially with the caveat of "with a programming background", it is far easier to reason about and debug PyTorch with just Python knowledge, compared to TensorFlow/Keras, which sooner or later requires you to learn a condensed history of TensorFlow/Keras development to understand why things are the way they are.
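For comparison, the explicit PyTorch training loop the commenters have in mind is plain Python you can step through with a debugger; a minimal sketch with toy random data (shapes and hyperparameters are illustrative, not from the thread):

```python
import torch
import torch.nn as nn

# Toy stand-ins for a real dataset.
x = torch.randn(64, 784)
y = torch.randint(0, 10, (64,))

model = nn.Sequential(nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 10))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for step in range(10):
    optimizer.zero_grad()
    logits = model(x)          # an ordinary Python call you can step into
    loss = loss_fn(logits, y)
    loss.backward()
    optimizer.step()
    print(step, loss.item())   # values are inspectable immediately, no session/graph
```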
Keras is NOT a good example of a beginner-friendly library. It's a thin wrapper facade that hides all of the actual complexity behind "Train ImageNet in 3 lines of code!"
The reason Keras became so popular is that it borrowed a lot of concepts from Lua Torch (which predates even Theano), and anyone who worked with Torch immediately sees it when reading Keras code. But Torch was written in Lua, so naturally it received less recognition than it deserved. You will not lose anything by simply moving to PyTorch.
Check out the fastai library. It's something like what Keras is to TensorFlow.
As a non-researcher, mostly a programmer who has spent a lot of time delving into this ecosystem, I find PyTorch the most like "standard programming", with fastai giving you working models in three lines.
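A fastai "three-liner" looks roughly like this (a sketch using the fastai v2 vision API and its small MNIST sample dataset; not code from the thread):

```python
from fastai.vision.all import *

# Download a small MNIST subset, build dataloaders from its folder layout,
# then fine-tune a pretrained ResNet-18.
path = untar_data(URLs.MNIST_SAMPLE)
dls = ImageDataLoaders.from_folder(path)
learn = vision_learner(dls, resnet18, metrics=accuracy)
learn.fine_tune(1)
```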
I haven't used TensorFlow's eager execution, though; it is supposedly closer to PyTorch than the graph-building model.
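For context, eager execution (the default in TF 2.x) evaluates operations immediately, much like PyTorch, rather than building a graph to run in a session; a small sketch:

```python
import tensorflow as tf

x = tf.constant([[1.0, 2.0], [3.0, 4.0]])
y = x @ tf.transpose(x)         # runs immediately, no Session.run() needed
print(y.numpy())                # concrete values are available right away
print(tf.executing_eagerly())   # True by default in TF 2.x
```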
I am a non-researcher with a programming background who has been working in ML for the past ~2 years. PyTorch was a godsend for me and felt much more programmatic and Pythonic than TensorFlow. Keras is also good, but claiming that PyTorch makes it harder for non-researchers is wrong, IMO.