
I'm not really experienced when it comes to GPU programming, so forgive me if I'm wrong about this, but some of the things you say don't make much sense to me:

> 2d vector graphics include things like "bones" and "tweening", which are CPU algorithms. (Much like how bone processing in 3d world is also CPU-side processing).

Changing the position of bones does seem like something you would do on a CPU (or at least picking which keyframes of a pre-loaded animation to use), but as far as I'm aware, 99% of the work for this sort of thing, the actual skinning, is done in a vertex shader, since it's just matrix math applied to vertex positions.
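
To make that concrete, here's a minimal sketch of linear blend skinning as a CUDA kernel (standing in for the vertex shader; all the names here are made up for illustration). The CPU's only per-frame job is uploading the small bone palette; the per-vertex matrix math all happens on the GPU:

  // Minimal linear-blend-skinning kernel (CUDA). In a real engine this
  // math lives in the vertex shader; everything here is illustrative.
  #include <cuda_runtime.h>

  struct Mat4 { float m[16]; };  // column-major 4x4 matrix

  // Multiply a 4x4 matrix by a position (w = 1).
  __device__ float3 transformPoint(const Mat4& M, float3 p) {
      return make_float3(
          M.m[0]*p.x + M.m[4]*p.y + M.m[8]*p.z  + M.m[12],
          M.m[1]*p.x + M.m[5]*p.y + M.m[9]*p.z  + M.m[13],
          M.m[2]*p.x + M.m[6]*p.y + M.m[10]*p.z + M.m[14]);
  }

  // One thread per vertex: blend up to 4 bone transforms by weight.
  __global__ void skinVertices(const float3* restPos,
                               const int4*   boneIds,     // 4 bone indices per vertex
                               const float4* boneWeights, // 4 weights, summing to 1
                               const Mat4*   bonePalette, // CPU uploads these per frame
                               float3*       outPos,
                               int           numVerts) {
      int v = blockIdx.x * blockDim.x + threadIdx.x;
      if (v >= numVerts) return;

      float3 p   = restPos[v];
      int4   ids = boneIds[v];
      float4 w   = boneWeights[v];

      float3 a = transformPoint(bonePalette[ids.x], p);
      float3 b = transformPoint(bonePalette[ids.y], p);
      float3 c = transformPoint(bonePalette[ids.z], p);
      float3 d = transformPoint(bonePalette[ids.w], p);

      outPos[v] = make_float3(
          w.x*a.x + w.y*b.x + w.z*c.x + w.w*d.x,
          w.x*a.y + w.y*b.y + w.z*c.y + w.w*d.y,
          w.x*a.z + w.y*b.z + w.z*c.z + w.w*d.z);
  }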

> Consider the creation of a Bezier curve, in 2d or 3d. Do you expect this to be a CPU algorithm, or GPU algorithm? Answer: clearly a CPU algorithm.

Why is it clearly a CPU algorithm? If you throw the Bezier data into a uniform buffer, you can use a compute shader that writes to an image and just checks whether each pixel falls within the bounds of the curve. You don't need to use the graphics pipeline at all if you're not using vertices. Or just throw a quad on the screen and jump straight to the fragment shader, like I did with my circle vector.
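
Here's a brute-force sketch of that idea as a CUDA kernel: one thread per pixel, control points for a single quadratic Bezier in __constant__ memory (the rough analog of a uniform buffer). A real renderer would use an analytic distance to the curve rather than sampling it, but the point stands: the curve gets rasterized without a single vertex going through the pipeline. All names here (strokeCurve, halfWidth, etc.) are made up for illustration:

  // Brute-force "is this pixel on the curve?" kernel (CUDA). One thread
  // per pixel; control points live in __constant__ memory, the rough
  // analog of a uniform buffer. Illustrative only.
  #include <cuda_runtime.h>
  #include <math.h>

  __constant__ float2 ctrl[3];    // quadratic Bezier control points
  __constant__ float  halfWidth;  // half the stroke width, in pixels

  // Evaluate B(t) = (1-t)^2 P0 + 2(1-t)t P1 + t^2 P2.
  __device__ float2 bezier(float t) {
      float u = 1.0f - t;
      float a = u * u, b = 2.0f * u * t, c = t * t;
      return make_float2(a*ctrl[0].x + b*ctrl[1].x + c*ctrl[2].x,
                         a*ctrl[0].y + b*ctrl[1].y + c*ctrl[2].y);
  }

  __global__ void strokeCurve(unsigned char* image, int width, int height) {
      int x = blockIdx.x * blockDim.x + threadIdx.x;
      int y = blockIdx.y * blockDim.y + threadIdx.y;
      if (x >= width || y >= height) return;

      float2 p = make_float2(x + 0.5f, y + 0.5f);  // pixel center
      float best = 1e30f;
      for (int i = 0; i <= 64; ++i) {              // sample along the curve
          float2 q = bezier(i / 64.0f);
          float dx = p.x - q.x, dy = p.y - q.y;
          best = fminf(best, dx*dx + dy*dy);       // squared distance
      }
      // Write white where the pixel lies within the stroke.
      image[y * width + x] = (best <= halfWidth * halfWidth) ? 255 : 0;
  }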



