Hacker News

(Note: I am an OpenFlexure maintainer.) Camera sensors are very rarely the limiting factor in a microscope, unless you are working in fairly exotic modes where speed, timing, or low-light performance matter. The key reason it is often better to use something like a Raspberry Pi camera than a phone is that you know exactly which sensor you have and can design for it. There is also a benefit to not having a lens fixed in front of the sensor, which would force you to add extra lenses acting as eyepieces to view a virtual image. Using the Pi camera with either a microscope objective and a tube lens (or, in the low-cost version, just the Pi camera's own lens and a spacer), we can get diffraction-limited performance in a really small, light footprint. (More detail on the optics, for the nerds: https://build.openflexure.org/openflexure-microscope/v7.0.0-... )
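To make the "sensor is rarely the limit" point concrete, here is a minimal back-of-envelope sketch (not OpenFlexure code; the objective NA and pixel pitch below are illustrative numbers, not specs) comparing the Abbe diffraction limit of the optics to how finely the sensor samples the image:

```python
# Back-of-envelope check: is the sensor or the optics the limiting factor?
# All numbers are illustrative assumptions, not OpenFlexure specifications.

def abbe_limit_um(wavelength_um: float, na: float) -> float:
    """Abbe diffraction limit: smallest resolvable feature, d = lambda / (2 * NA)."""
    return wavelength_um / (2.0 * na)

def sample_size_um(pixel_pitch_um: float, magnification: float) -> float:
    """Size of one sensor pixel projected back into object space."""
    return pixel_pitch_um / magnification

# Illustrative values: green light (0.55 um), a 40x / 0.65 NA objective,
# and 1.12 um pixels (typical of small CMOS sensors like Pi cameras).
d = abbe_limit_um(0.55, 0.65)    # optical limit, ~0.42 um
s = sample_size_um(1.12, 40.0)   # 0.028 um per pixel in object space

# Nyquist: we need at least two samples per resolvable feature.
# Here s is far below d / 2, so the optics, not the sensor, set the limit.
assert s <= d / 2
```

With numbers in this ballpark the sensor oversamples the optics by a wide margin, which is why the sensor choice matters less than knowing exactly what it is.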

However, the camera/sensor isn't the clever bit. The main benefit of OpenFlexure is the automated stage. The range of motion is small and the motion is slow, so it really isn't the right microscope for looking at something like a bug leg. But if you want to take loads of high-resolution images with a high-powered objective and stitch them into a composite image (or take time-lapses that automatically refocus at regular intervals), we are considerably smaller, more affordable, and more customisable than commercial alternatives, with lots of options for scripting.
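For a flavour of what scripting a tiled scan involves, here is a hedged sketch, in plain Python, of the planning step: generating a serpentine grid of stage targets with overlap between neighbouring fields of view so the tiles can be stitched. The function name, field-of-view sizes, and overlap fraction are made up for illustration; this is not the actual OpenFlexure scanning code.

```python
# Hypothetical sketch of planning a tiled scan for stitching.
# Units and values are illustrative only.

def scan_positions(n_cols, n_rows, fov_x, fov_y, overlap=0.3):
    """Yield (x, y) stage targets covering an n_cols x n_rows mosaic.

    Each step moves by (1 - overlap) of the field of view, so adjacent
    tiles share 30% of their area by default. Rows alternate direction
    (serpentine order) to minimise stage travel.
    """
    step_x = fov_x * (1.0 - overlap)
    step_y = fov_y * (1.0 - overlap)
    for row in range(n_rows):
        cols = range(n_cols) if row % 2 == 0 else range(n_cols - 1, -1, -1)
        for col in cols:
            yield (col * step_x, row * step_y)

# A 3x2 mosaic with a 100 x 80 (arbitrary units) field of view:
positions = list(scan_positions(3, 2, fov_x=100.0, fov_y=80.0))
# 6 tiles; the second row runs right-to-left.
```

A real scan loop would move the stage to each target, autofocus, and capture before handing the tiles to a stitcher; the serpentine ordering is just a travel-time optimisation.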

As an example of what is possible, check out this multi-gigapixel composite image of a cervical smear, and the resolution when you zoom in: https://images.openflexure.org/cap_demo/viewer.html Note that this was collected with an experimental branch of the software (open source, of course). We need to do some tidying and bug fixes before it is ready for release.




I mean... that seems like a very valuable piece of technology, but I feel it would ideally be hardware-agnostic. If you had a piece of software that takes video from a camera passing over a near-static field and generates a composite high-res image, that would be very useful.

You could then have an automated stage, a hand-operated one, or just move the slide by hand under the microscope.

I think the camera could then be an RPi camera, a phone, or anything else.
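The camera-agnostic core of such software is estimating the translation between overlapping frames. Here is a toy, pure-Python sketch of that idea using brute-force cross-correlation; real stitchers use FFT-based phase correlation, sub-pixel refinement, and global optimisation, so `best_shift` and its exhaustive search are purely for illustration.

```python
# Toy illustration: find the integer (dx, dy) shift that best aligns
# two equal-sized grayscale frames (2D lists of pixel intensities).
# Brute force over a small search window; not how production stitchers work.

def best_shift(a, b, max_shift=4):
    """Return (dx, dy) such that b[y + dy][x + dx] best matches a[y][x]."""
    h, w = len(a), len(a[0])
    best, best_score = (0, 0), float("-inf")
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            score, n = 0.0, 0
            for y in range(h):
                for x in range(w):
                    ys, xs = y + dy, x + dx
                    if 0 <= ys < h and 0 <= xs < w:
                        score += a[y][x] * b[ys][xs]
                        n += 1
            if n and score / n > best_score:  # normalise by overlap area
                best_score, best = score / n, (dx, dy)
    return best

# Synthetic check: a single bright pixel at (5, 5) in frame a appears
# at (7, 6) in frame b, i.e. the scene is shifted by dx=+2, dy=+1.
a = [[255 if (y, x) == (5, 5) else 0 for x in range(10)] for y in range(10)]
b = [[255 if (y, x) == (6, 7) else 0 for x in range(10)] for y in range(10)]
shift = best_shift(a, b)  # (2, 1)
```

Once pairwise shifts are known, compositing is just pasting each frame at its accumulated offset, which is indeed independent of whether the motion came from a motorised stage, a hand-cranked one, or sliding the sample by hand.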



