
If you wanted to do compute in a shader before WebGPU, using WebGL instead, then I think the answer is kind of yes. It wasn't "for free" (it took extra code), but that was how you had to do it: run the computation in a fragment shader and render the results into a texture. Now that WebGPU supports compute shaders properly, you no longer have to do compute in a shader that produces textures. A sketch of the WebGPU version follows below.
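
A minimal sketch of the WebGPU compute path, assuming a browser with WebGPU enabled. The doubling shader, buffer sizes, and the doubleOnGpu name are illustrative, not taken from the comment above:

    // Run a compute shader on a storage buffer -- no texture or render pass involved.
    async function doubleOnGpu(input: Float32Array): Promise<Float32Array> {
      const adapter = await navigator.gpu.requestAdapter();
      if (!adapter) throw new Error("WebGPU not available");
      const device = await adapter.requestDevice();

      // Storage buffer the compute shader reads and writes directly.
      const storage = device.createBuffer({
        size: input.byteLength,
        usage: GPUBufferUsage.STORAGE | GPUBufferUsage.COPY_SRC | GPUBufferUsage.COPY_DST,
      });
      device.queue.writeBuffer(storage, 0, input);

      const module = device.createShaderModule({
        code: `
          @group(0) @binding(0) var<storage, read_write> data: array<f32>;
          @compute @workgroup_size(64)
          fn main(@builtin(global_invocation_id) id: vec3<u32>) {
            if (id.x < arrayLength(&data)) {
              data[id.x] = data[id.x] * 2.0;
            }
          }`,
      });

      const pipeline = device.createComputePipeline({
        layout: "auto",
        compute: { module, entryPoint: "main" },
      });
      const bindGroup = device.createBindGroup({
        layout: pipeline.getBindGroupLayout(0),
        entries: [{ binding: 0, resource: { buffer: storage } }],
      });

      // Staging buffer so the CPU can read the result back.
      const readback = device.createBuffer({
        size: input.byteLength,
        usage: GPUBufferUsage.MAP_READ | GPUBufferUsage.COPY_DST,
      });

      const encoder = device.createCommandEncoder();
      const pass = encoder.beginComputePass();
      pass.setPipeline(pipeline);
      pass.setBindGroup(0, bindGroup);
      pass.dispatchWorkgroups(Math.ceil(input.length / 64));
      pass.end();
      encoder.copyBufferToBuffer(storage, 0, readback, 0, input.byteLength);
      device.queue.submit([encoder.finish()]);

      await readback.mapAsync(GPUMapMode.READ);
      const result = new Float32Array(readback.getMappedRange().slice(0));
      readback.unmap();
      return result;
    }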



A texture in WebGL is just a memory buffer that can be accessed by shaders. It doesn't end up on screen automatically.
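
A minimal sketch of the off-screen render-to-texture pattern in WebGL2 (the size and variable names are illustrative). The texture lives in GPU memory; nothing reaches the screen unless you also draw to the default framebuffer, i.e. the canvas:

    const canvas = document.createElement("canvas");
    const gl = canvas.getContext("webgl2")!;

    const size = 256;
    const tex = gl.createTexture();
    gl.bindTexture(gl.TEXTURE_2D, tex);
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, size, size, 0, gl.RGBA, gl.UNSIGNED_BYTE, null);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);

    // Attach the texture to a framebuffer so shaders can render into it off-screen.
    const fbo = gl.createFramebuffer();
    gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);
    gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, tex, 0);

    // ... draw a full-screen quad with a fragment shader that writes the results ...

    // Read the computed values back to the CPU; the canvas itself stays blank.
    const out = new Uint8Array(size * size * 4);
    gl.readPixels(0, 0, size, size, gl.RGBA, gl.UNSIGNED_BYTE, out);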





