I just don't see the point of all these simple introductions to some ML topic. If you want to do a good job, it is hard. If you need a simple introduction and won't just dig in and persevere with the harder stuff, you probably aren't going to succeed in whatever ML task you aspire to.
I upvoted because I'm quite familiar with ML theory and like to read the latest tweaks, etc. but I've never used a tool like TensorFlow. Papers, blog posts, etc. usually give diagrams and/or fully-fledged code snippets. That's great for the high-level overview, and low-level nitty-gritty, respectively.
Yet minimal code examples like this let me see straight away what TensorFlow programs look like, without having to distinguish between fundamental aspects and snippet-specific ones.
My only remark would be to put a comment at the top, something like "Fit a straight line, of the form y=m*x+b".
I guessed that the "m" and "b" variables were referring to these common usages, but was wary of this assumption until line 14.
For all I know, "m" and "b" could be common parameter names for some TensorFlow config or something :)
Another suggestion: 'operation' would be better named 'train_step' -- it's more descriptive and eliminates the need for the comment at the end.
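To make the naming suggestion concrete, here's a rough sketch of what that tutorial example computes, written in plain NumPy rather than TensorFlow (so the fitting logic is explicit). The data, learning rate, and the train_step name are my own choices for illustration:

```python
import numpy as np

# Sketch: fit a straight line y = m*x + b by gradient descent on mean
# squared error. "m" and "b" are slope and intercept, as in the tutorial.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, 100)
y = 2.0 * x + 0.5  # true line: m=2.0, b=0.5 (noise-free, for simplicity)

m, b = 0.0, 0.0
learning_rate = 0.5

def train_step(m, b):
    # Gradients of mean((m*x + b - y)**2) with respect to m and b.
    err = m * x + b - y
    grad_m = 2 * np.mean(err * x)
    grad_b = 2 * np.mean(err)
    return m - learning_rate * grad_m, b - learning_rate * grad_b

for _ in range(1000):
    m, b = train_step(m, b)

# m and b should approach the true values 2.0 and 0.5.
```

In the TensorFlow version, train_step would instead be the optimizer op that session.run() executes each iteration.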
Also, it's best practice and common TensorFlow idiom to do:

    init = tf.initialize_all_variables()
    ...
    with tf.Session() as session:
        session.run(init)
Not just for style, but because creating new graph nodes after creating the session requires tearing down and then re-setting-up some of the internal session stuff. It's much faster to move the initialize_all_variables() creation outside of the session.
I actually ended up coding the hard stuff from scratch in matlab/julia. I wish I had seen a simple example of how to use Theano early on -- then I would have been on my way much sooner.
Learning to use TensorFlow/Theano/etc. isn't like learning another programming language or package. I think it is more like learning a new skill.
> I actually ended up coding the hard stuff from scratch in matlab/julia
I always find it easier to use and understand the higher-level tools (like TensorFlow) once I've written my own rudimentary implementation from scratch. Most of my work is done in Julia, and sometimes that code is fast enough that I don't have to use anything else.
Yes, this is why we teach math starting from Vector Calculus and Real Analysis, and start programming by having students implement the Fast Fourier Transform in Befunge. Only wussies need "introductions" and "tutorials". Just dig into the hard stuff and persevere, damnit!!