
I'm using Capitally (https://mycapitally.com), which I started building to replace my 3 spreadsheets that needed constant attention and broke down quite often. It now completely covers tracking my investments - spreadsheet no. 1 - with two still to go: portfolio strategy and budgeting.


I will add <span ng-if="Modernizr.Android">another </span>Android ;)


Fog, refocus, background separation, chroma shifting.

As for the fill: if the depthmap is of good quality and has no sharp edges, there is no need for it, unless you go berserk with the scale of displacement.


There already is a library for extracting the depth map here: https://github.com/spite/android-lens-blur-depth-extractor (internet rocks, doesn't it ;) )
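
For anyone curious what the extraction involves: Lens Blur JPEGs carry the depthmap as a base64-encoded image inside the XMP metadata (the GDepth namespace). A rough sketch of the naive case - real files split the XMP across multiple APP1 segments, which the library above handles properly, and the payload format is declared in GDepth:Mime (PNG is assumed here):

  // Naive sketch: pull the GDepth:Data payload out of a Lens Blur JPEG.
  // Assumes the XMP fits in one segment and the depthmap is a PNG.
  function extractDepthMap(jpegArrayBuffer) {
    var text = new TextDecoder('latin1').decode(jpegArrayBuffer);
    var match = text.match(/GDepth:Data="([^"]+)"/);
    if (!match) return null;
    var img = new Image();
    img.src = 'data:image/png;base64,' + match[1]; // grayscale depthmap
    return img;
  }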

As for the shader - it's just a few lines of actual code.
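
Something along these lines (a sketch with made-up uniform names, not Depthy's actual source):

  // Depth-displacement fragment shader, as a GLSL string for WebGL.
  var displaceFrag = `
    precision mediump float;
    uniform sampler2D uImage;   // the photo
    uniform sampler2D uDepth;   // grayscale depthmap, white = near
    uniform vec2 uOffset;       // parallax shift from mouse/gyro
    varying vec2 vUv;
    void main() {
      // Near pixels shift more than far ones - that is the whole effect.
      // Hard depth edges tear, because adjacent pixels suddenly sample
      // from very different places (hence the "no sharp edges" advice).
      float depth = texture2D(uDepth, vUv).r;
      gl_FragColor = texture2D(uImage, vUv + uOffset * depth);
    }`;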

Depthy is a quick weekend hack. There's a long way to go before it gets there.


Has anyone managed to replicate the lens blur effect used in the new Android camera app? Or does anyone at least know which research paper it's based on?



That seems to just be using the depth map information stored by the app, not replicating the effect from scratch.


They recalculate the blur. Were you asking about calculating the depthmap?


Yes, I was wondering how to fully replicate the effect without the need for the app.


It's exactly that - a single image, and a depth map calculated from a series of shots taken while moving the camera upwards. Both are bundled in the fake-bokeh image. There are obviously no extra pixels. However... if you choose your subject wisely, the effect can be pretty believable using a simple displacement map.

On iOS though it's something different. There's no depth map in there, just a flat image moving counter to your hand movements.
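
You can fake that iOS-style version in a browser with nothing but device orientation (a sketch; '.wallpaper' is a placeholder for any oversized image layer):

  // iOS-style parallax: translate a flat, oversized image against the
  // device tilt. No depthmap involved.
  var layer = document.querySelector('.wallpaper');
  window.addEventListener('deviceorientation', function (e) {
    var x = -(e.gamma || 0) * 0.5; // counter left/right tilt
    var y = -(e.beta || 0) * 0.5;  // counter front/back tilt
    layer.style.transform = 'translate(' + x + 'px,' + y + 'px)';
  });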


It sounds like there are two issues here. One is how the depth map is generated, and the other is how the resulting image file is formatted. For the former, several still images are collected while the camera is moving, which provides the parallax used to generate the depth map. For the latter, I don't know, but it would certainly be possible to bundle both the depth map and multiple "layers" of photograph that could be used to actually expose new pixels in the background when the foreground moves.
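
The geometry of the former is plain triangulation: for a small camera translation, a feature's shift between frames is inversely proportional to its distance. A sketch with illustrative numbers:

  // depth = focalLength(px) * baseline(m) / disparity(px)
  function depthFromDisparity(focalPx, baselineM, disparityPx) {
    return focalPx * baselineM / disparityPx;
  }
  // e.g. f = 3000px, 5cm of camera movement, 30px feature shift:
  // depthFromDisparity(3000, 0.05, 30) === 5 (meters)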


There is an app for iOS - seene.co. The amount of movement you have to do to capture enough pixels is prohibitive for my taste. I think Google has nailed it - it's super simple.

As for storing the layers - you would only have the "from above" pixels, and only a few. That's probably why there is only Lens Blur in their app in the first place.

If you just want a small displacement effect like in Depthy, the key is having no sharp edges in the depthmap. We'll try to tackle this in one of the upcoming updates...


Now that you've explained the iOS 7 trick, it's obvious. (Given the environment: a background-filling image reacting to physical device movement, as opposed to an image on a website.)


On iOS you have the icons as the foreground plane - that's what makes the trick work.

If you have an Android device around, try Depthy on it - it's way better that way imo.


There are more pixels captured than in a single-shot image, as you said: a "series of shots made upwards". So it captures some pixels that are hidden when the camera moves upwards, but if the simulated parallax is bigger than the original camera movement, there will still be missing pixels. This could probably be improved by making bigger movements with the camera, as with other 3D reconstruction software.
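
Put differently, the capture only reveals background within the disparity range the real movement produced, so a viewer can avoid holes by clamping the simulated shift to that range (a sketch, names made up):

  // Keep the simulated parallax within what the capture actually saw,
  // so we never request background pixels that were never photographed.
  function clampParallax(requestedPx, capturedDisparityPx) {
    return Math.max(-capturedDisparityPx,
                    Math.min(requestedPx, capturedDisparityPx));
  }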


I've created a parallax viewer for lens blur photos. It's an open source web app available at http://depthy.stamina.pl/ . It lets you extract the depthmap, works on Chrome with WebGL, and looks pretty awesome on some photos. There are quite a few things you can do with this kind of image, so feel free to play around with the source code on GitHub: https://github.com/panrafal/depthy

