
We're having similar issues in aerospace: time-series data out the wazoo.



I used to deal with radar data on roughly this order of magnitude, which had to be synced to the output of other sensors.
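Syncing streams like that usually comes down to matching each radar sample to the nearest-in-time reading from the other sensor. A minimal sketch of that nearest-timestamp alignment (hypothetical data; assumes both timestamp lists are sorted, in seconds):

```python
import bisect

def align_nearest(radar_ts, sensor_ts):
    """For each radar timestamp, return the index of the closest
    sensor timestamp. Both input lists must be sorted."""
    out = []
    for t in radar_ts:
        i = bisect.bisect_left(sensor_ts, t)
        # The closest match is either the neighbor just before or just after t.
        best = min(
            (j for j in (i - 1, i) if 0 <= j < len(sensor_ts)),
            key=lambda j: abs(sensor_ts[j] - t),
        )
        out.append(best)
    return out

radar = [0.00, 0.10, 0.20, 0.30]
imu   = [0.01, 0.09, 0.22, 0.28, 0.35]
print(align_nearest(radar, imu))  # → [0, 1, 2, 3]
```

Real pipelines would add a maximum-gap tolerance and interpolation, but the nearest-neighbor lookup is the core of it.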

I really miss the challenges of real-time signal and data processing, and I want back into it.


Video data from multiple cameras FTW.


So I'm listening to this Radiolab episode [0] about drones with cameras that can solve crime (traffic and other societal ills), and it occurs to me: with everyone soon to be walking around with DSLR-quality smartphones, couldn't we triangulate all this video/audio data and provide substantially better resolution of daily life? Think of it as a continuous Meerkat/Periscope localized around an event in four dimensions.

[0] http://www.radiolab.org/story/eye-sky/


I was involved in a project that conceptualized taking real-time video streams from smartphones, then synchronizing them and adjusting/correcting their quality before making them presentable... in real time!

Think of a soccer stadium, with fans taking "video" of the game. All the feeds would be gathered, synchronized, quality adjusted and put online for anyone to view, from any angle.
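Once each phone's clock offset is estimated (e.g. via NTP or audio cross-correlation; the offsets here are hypothetical), the synchronization step reduces to picking, from each feed, the frame closest to a shared target time. A minimal sketch:

```python
def synced_frames(feeds, offsets, t_target):
    """feeds: per-phone lists of frame timestamps in that phone's
    local clock. offsets: per-phone estimated clock offset so that
    local_time + offset = shared time. Returns one frame index per
    feed, the frame nearest to the shared target time t_target."""
    picks = []
    for ts, off in zip(feeds, offsets):
        corrected = [t + off for t in ts]
        picks.append(min(range(len(ts)),
                         key=lambda i: abs(corrected[i] - t_target)))
    return picks

# Two ~30 fps feeds; the second phone's clock runs 0.5 s ahead.
feeds = [[0.000, 0.033, 0.067, 0.100],
         [0.500, 0.533, 0.567, 0.600]]
offsets = [0.0, -0.5]
print(synced_frames(feeds, offsets, 0.067))  # → [2, 2]
```

Getting the offsets right is the hard part; picking matched frames afterward is cheap.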


Yeah... there is a ton of research out there on multi-sensor fusion, super-resolution, and synthetic viewpoint reconstruction.

Even the buzzword-du-jour is getting involved[0].

[0] https://www.youtube.com/watch?v=cizgVZ8rjKA&feature=youtu.be





