I’d love to see this in torch. What are the odds?



Implementing a custom layer isn't hard [0]. That said, if it were me, I'd rather add it in the runtime, e.g. via a TensorRT plugin.

[0] e.g. https://jamesmccaffrey.wordpress.com/2021/09/02/example-of-a...
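For reference, a custom layer in the spirit of [0] is just a Module composing built-in ops. A toy sketch (the layer name and filtering logic are made up for illustration, and it uses torch.fft rather than VkFFT):

    import torch

    class SpectralFilter(torch.nn.Module):
        # Toy custom layer: learn a per-frequency gain and apply it
        # in the frequency domain. Composes built-in torch.fft ops.
        def __init__(self, n_freqs: int):
            super().__init__()
            self.gain = torch.nn.Parameter(torch.ones(n_freqs))

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            spec = torch.fft.rfft(x)          # real-to-complex FFT
            spec = spec * self.gain           # learned filter
            return torch.fft.irfft(spec, n=x.shape[-1])

    # For input length N, rfft produces N // 2 + 1 frequency bins:
    layer = SpectralFilter(n_freqs=65)
    y = layer(torch.randn(4, 128))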


That tutorial only shows you how to remix existing operators, which won't work here: backing FFTs with VkFFT means adding a new kernel, not composing existing ops. You need to do this instead:

https://pytorch.org/tutorials/advanced/dispatcher.html

More complex but still not that hard.
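For anyone curious what that looks like, here's a minimal sketch from the Python side via torch.library (the "vkfft" namespace, op schema, and CPU stand-in are all hypothetical; a real VkFFT integration would register a CUDA or Vulkan kernel in C++ via TORCH_LIBRARY):

    import torch

    # Hypothetical op namespace and schema; one kernel can be
    # registered per dispatch key (CPU, CUDA, ...).
    lib = torch.library.Library("vkfft", "DEF")
    lib.define("fft(Tensor x) -> Tensor")

    def fft_cpu(x: torch.Tensor) -> torch.Tensor:
        # Stand-in that delegates to PyTorch's own FFT; a CUDA or
        # Vulkan registration could instead hand the buffer to VkFFT.
        return torch.fft.fft(x)

    lib.impl("fft", fft_cpu, "CPU")

    # The op now routes through the dispatcher like any built-in:
    y = torch.ops.vkfft.fft(torch.randn(8, dtype=torch.complex64))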



But not backed by VkFFT. The implication of the comment is that FFTs on the various backends would be easier if they were implemented on top of VkFFT in the first place. I'm not sure that's true, though, as I don't know how much code the various backends share.

Edit: as an example I experienced first-hand, coremltools, which converts PyTorch models to CoreML, only gained FFT support very recently. It's not really a PyTorch backend but a code converter, though, so it wouldn't benefit at all from PyTorch's FFT being backed by VkFFT. Still, it's a good example that one shouldn't take FFT support for granted.


Of course, but cuFFT is half the speed of VkFFT.



