
This is usually done with shaders and a ring of buffers that are ping-ponged between passes to maintain state.
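Ping-ponging in a nutshell: one shader reads the previous state and writes the next, and the host swaps the two bindings each frame. A minimal WGSL sketch of the shader side (the buffer names and the decay rule are made up for illustration):

  // State lives in two buffers. Each dispatch reads `src` and writes
  // `dst`; the host swaps the two bind groups between frames.

  @group(0) @binding(0) var<storage, read> src : array<f32>;
  @group(0) @binding(1) var<storage, read_write> dst : array<f32>;

  @compute @workgroup_size(64)
  fn step(@builtin(global_invocation_id) gid : vec3<u32>) {
    if (gid.x >= arrayLength(&src)) { return; }
    // Illustrative update rule: decay the previous value.
    dst[gid.x] = src[gid.x] * 0.99;
  }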



Fragment shaders can only go so far. Can you do something like this in WebGPU?

  outputTexture2d[inputTexture2d[inputPosition]]++

In other words, if you have a large texture with (x, y) coordinates of points, can you draw another texture that shows a density cloud of those points? In WebGL2 this becomes a PhD-level problem.


One needs to do multiple passes; I'm not sure that counts as a PhD-level problem.
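In WebGPU proper, though, this particular scatter is a single compute pass: the ++ becomes an atomicAdd into a storage buffer. A rough WGSL sketch (the binding layout, the names, and the assumption that points are already in grid coordinates are mine, not from any spec):

  // Scatter points into a density grid with atomic increments.
  // `points` holds (x, y) coordinates; `counts` is the histogram.

  struct Params {
    grid_size : vec2<u32>,
    num_points : u32,
  };

  @group(0) @binding(0) var<uniform> params : Params;
  @group(0) @binding(1) var<storage, read> points : array<vec2<f32>>;
  @group(0) @binding(2) var<storage, read_write> counts : array<atomic<u32>>;

  @compute @workgroup_size(64)
  fn main(@builtin(global_invocation_id) gid : vec3<u32>) {
    if (gid.x >= params.num_points) { return; }

    // Clamp the point into the grid, then pick its cell.
    let p = clamp(points[gid.x], vec2<f32>(0.0),
                  vec2<f32>(params.grid_size) - 1.0);
    let cell = vec2<u32>(p);

    // This is the "++" from the question: an atomic scattered write.
    atomicAdd(&counts[cell.y * params.grid_size.x + cell.x], 1u);
  }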


Total graphics / shaders / GPU noob here. Does that mean you'll essentially get free visualisations (albeit nonsensical ones) as a byproduct of your computations?


If you wanted to do compute in a shader before WebGPU, with WebGL instead, then I think the answer is kind of yes. It wasn't "for free" in the sense of needing no code, but you had no choice: the computation had to be expressed as rendering to a texture. Now that WebGPU supports compute shaders properly, you no longer have to do compute in a shader that produces textures.


A texture in WebGL is just a memory buffer that can be accessed by shaders. It doesn't end up on screen automatically.


No. In WebGPU's case it's an @compute shader rather than a combination of @vertex and @fragment shaders (which is what does graphics).

You could certainly visualize it, but not as a side effect.


I think it depends, but given an arbitrary compute pipeline you should be able to write the results (or intermediate results) to the screen with minimal effort.
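For example, a fullscreen triangle whose fragment shader just reads the compute output is enough to get it on screen. A WGSL sketch (the names and the tone mapping are arbitrary, and it assumes the canvas matches the grid size):

  // Visualize the counts buffer written by a compute pass.
  // One oversized triangle covers the viewport; the fragment
  // shader maps each pixel to a grid cell and tone-maps the count.

  struct Params {
    grid_size : vec2<u32>,
  };

  @group(0) @binding(0) var<uniform> params : Params;
  @group(0) @binding(1) var<storage, read> counts : array<u32>;

  @vertex
  fn vs(@builtin(vertex_index) i : u32) -> @builtin(position) vec4<f32> {
    // Fullscreen-triangle trick: vertices (-1,-1), (3,-1), (-1,3).
    let xy = vec2<f32>(f32((i << 1u) & 2u), f32(i & 2u));
    return vec4<f32>(xy * 2.0 - 1.0, 0.0, 1.0);
  }

  @fragment
  fn fs(@builtin(position) pos : vec4<f32>) -> @location(0) vec4<f32> {
    let cell = vec2<u32>(pos.xy);  // assumes canvas size == grid size
    let n = counts[cell.y * params.grid_size.x + cell.x];
    let v = 1.0 - exp(-f32(n) * 0.1);  // arbitrary tone mapping
    return vec4<f32>(v, v, v, 1.0);
  }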



