
I am really excited about this library. Tensors are the future; matrices are going to look so old in a few years. In fairness, Theano already knew that. I cannot wait to dig into this and am very relieved that Google's best are still using Python for this endeavour...but...using Python 2.7.

I have just recently been persuaded by the community that 3.5 is cost-free, and here I have this enormous counterexample. For the science guys, the message is clearly not getting through, and I'm not surprised: 3.x offers them nothing. Hence they're blissfully continuing with 2.7.

So I guess I'll have to run two Pythons on my system. Not the ideal situation.

Now if Python 3.(x > 5) gave us native GPU support, which might require putting (a GPU-enabled) Numpy into the standard library... as opposed to, say, spending two years re-inventing Tornado....




> Tensors are the future; matrices are going to look so old in a few years.

I am honestly curious about this point of view. Is there any example where actual multidimensional tensors have any relevance? What I mostly see around is just standard linear algebra operations on matrices and vectors lifted to higher-dimensional tensors point-wise (for instance, applying a certain operation to all 2-dimensional subtensors of a given tensor).

I never saw the need for modeling matrices in terms of tensors; the latter seem more complex to use, and tensor operations proper (like symmetrization, antisymmetrization, wedge products...) are usually not even available in libraries.

(And by the way, both matrices and tensors are now more than a century old...)


> What I mostly see around is just standard linear algebra operations on matrices and vectors lifted to higher-dimensional tensors point-wise

Equally, what is matrix multiplication but a bunch of 1-dimensional dot products applied pointwise? Why do we need matrices?

I do get what you're saying, and part of it is that ML / CS folk just use 'tensor' as a fancy word for a multi-dimensional array, whereas physics folk use the word for a related coordinate-basis-independent geometric concept. But for numerical computing, broadcasting simple operations over slices of some big array is a really useful thing to be able to do fast and to express concisely.

Numerics libraries which don't bother to generalise arrays beyond rank 1 and 2 always feel rather inelegant and limiting to me. Rank 3+ arrays are often really useful (images, video, sensory data, count data grouped by more than 2 factors, ...), and lots of operations generalise to them in a nice way. Good array programming environments (numpy, torch, APL) take advantage of this to provide an elegant general framework for broadcasting operations without ugly special cases.
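A rough numpy sketch of what that buys you (the shapes and the video-frame framing are just made up for illustration):

    import numpy as np

    # A batch of 10 grayscale frames, each 64x64: a rank-3 array.
    frames = np.random.rand(10, 64, 64)

    # Normalise every frame at once; the reductions broadcast over the slices.
    mu = frames.mean(axis=(1, 2), keepdims=True)
    sd = frames.std(axis=(1, 2), keepdims=True)
    normalised = (frames - mu) / sd

    # Apply one 64x64 linear map to every frame in a single call,
    # instead of looping over the rank-2 slices by hand.
    transform = np.random.rand(64, 64)
    result = frames @ transform  # shape (10, 64, 64)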


The traditional pure-mathematical way of looking at it is that vectors are members of a vector space, matrices are linear maps, and matrix multiplication is composition of linear maps.

So what algebraic concept do tensors correspond to?


Multilinear maps. You can view contraction with a vector (a dot product) as mapping a vector into the scalars, contraction with a matrix as mapping two vectors into the scalars, and contraction with a tensor as mapping several vectors into the scalars. You don't have to contract all the indices at once, either: contracting a rank-m tensor with n vectors leaves a rank (m - n) tensor.
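A concrete sketch of the same idea in numpy, with einsum doing the contractions (the shapes are arbitrary):

    import numpy as np

    T = np.random.rand(4, 4, 4)  # rank-3 tensor, i.e. a trilinear map
    u, v, w = np.random.rand(4), np.random.rand(4), np.random.rand(4)

    # Contract all three indices: three vectors -> a scalar.
    s = np.einsum('ijk,i,j,k->', T, u, v, w)

    # Contract two indices: two vectors -> a rank-1 tensor (a vector).
    vec = np.einsum('ijk,i,j->k', T, u, v)

    # Contract one index: one vector -> a rank-2 tensor (a matrix),
    # a linear map still waiting for its remaining arguments.
    mat = np.einsum('ijk,i->jk', T, u)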


In my domain (finance), correlations between vectors are unstable but (maybe) dependent on cross-sectional relationships in the problem space. Some of the mathematics behind elastic body deformation (car tyres in mechanical engineering, fluid dynamics in weather forecasting) has high applicability. Tensors are required.

It's true that tensors are hard to reason about - they overclock my brain most of the time - but there is no doubt that, just as moving from scalars to vectors massively increases your real-world modelling power, so does the natural extrapolation to matrices, and from there to tensors.


It's not that tensors are particularly hard to reason about. It's just that representing them as multidimensional arrays hides very well the fact that they are elements of a tensor product (hence "tensors"). Natural operations like, well, the tensor product are often not even available.
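For what it's worth, numpy can express the tensor product directly, even if higher-level libraries rarely surface it as such; a minimal sketch:

    import numpy as np

    a = np.random.rand(3)     # rank-1
    B = np.random.rand(3, 3)  # rank-2

    # Tensor product: ranks add, so the product is rank-3, shape (3, 3, 3).
    aB = np.tensordot(a, B, axes=0)

    # The same thing in index notation:
    aB2 = np.einsum('i,jk->ijk', a, B)
    assert np.allclose(aB, aB2)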


I agree that tensors are often used simply as a convenient representation for multiple lower-dimensional objects. I suspect that is because there is still plenty of value in many fields, including mine, in exploring scalar, vector, and matrix representations of problems, so tensors go unexploited: they're used as convenient data structures, not as algorithmic necessities. Still, as more and more people get access to data and explore it, competition increases and the "alpha" of simpler analyses shrinks; my view is that moving deeper into dimensionality will then be the only choice for those who want to innovate. So it is probably necessary to have the tools at our disposal already.


> Tensors are the future

"Tensor said the tensor. Tension, apprehension, And dissension have begun."

https://en.wikipedia.org/wiki/The_Demolished_Man

Tensors were the future even before they were cool.



