
This requires a depth map in the image.

Would it also work on an arbitrary image by calculating a depth map from the blurriness?




No, because “blurriness” (low local contrast) may indicate something besides a particular depth.

Consider, for instance, a head-on photograph of a print of a shallow-focused photo. The region that print occupies will have plenty of variation in contrast, yet it exists at a single depth. Also, blurring increases both in front of and behind the plane of focus; how could we tell which of the two depths the blurring indicated?

Something similar to what you suggest is, however, done in contrast-detection autofocus, which takes repeated samples at different focal distances and keeps the sharpest. Maybe that's something to think about, e.g. for a static subject. http://en.wikipedia.org/wiki/Autofocus#Contrast_detection
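
For a static subject that amounts to a depth-from-focus sweep. Here's a minimal sketch of the idea, assuming you already have a focal stack of grayscale frames and their focal distances (`frames` and `focal_distances` are hypothetical inputs, not anything from the article): score local sharpness with the variance of the Laplacian and, per pixel, keep the focal distance of the sharpest frame.

    import numpy as np
    from scipy import ndimage

    def sharpness_map(gray, window=9):
        # Local contrast measure: variance of the Laplacian over a sliding window.
        lap = ndimage.laplace(gray.astype(np.float64))
        mean = ndimage.uniform_filter(lap, size=window)
        mean_sq = ndimage.uniform_filter(lap * lap, size=window)
        return mean_sq - mean * mean

    def depth_from_focus(frames, focal_distances, window=9):
        # For each pixel, keep the focal distance of the frame that is locally sharpest.
        scores = np.stack([sharpness_map(f, window) for f in frames])  # (n, H, W)
        best = np.argmax(scores, axis=0)                               # index of sharpest frame per pixel
        return np.asarray(focal_distances)[best]                       # per-pixel depth estimate

Like contrast-detection autofocus itself, this only says anything where there is local texture; in flat regions the argmax is arbitrary.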


Yup, there's no simple way to recover depth from blur: there will be featureless regions where you can't tell whether there was any blur at all -- in other words, there is no universal way to tell whether a region has gone through a low-pass filter or is natively low-frequency.

Would heuristics work well? I can think of a handful, but none of them seem really good.
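
One obvious heuristic is to treat local high-frequency energy as a sharpness proxy and flag regions with almost no texture as "can't tell". A rough sketch under that assumption (the `texture_floor` threshold and the filter sizes are made up for illustration and would need tuning to the image's value range):

    import numpy as np
    from scipy import ndimage

    def blur_heuristic(gray, window=15, texture_floor=1e-3):
        gray = gray.astype(np.float64)
        highpass = gray - ndimage.gaussian_filter(gray, sigma=2.0)   # high-frequency residual
        energy = ndimage.uniform_filter(highpass * highpass, size=window)
        blurriness = 1.0 / (1.0 + energy)                            # low energy -> "blurry"
        unreliable = energy < texture_floor                          # featureless: can't tell either way
        return blurriness, unreliable

Even where it is reliable, it still can't distinguish blur in front of the plane of focus from blur behind it, which is the other ambiguity raised upthread.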



