I think it can be simpler than that. If a piece of code is going to do a velocity calculation from motion estimation between two frames, it should expect to get new data every 1 / FPS seconds. A local clock could help validate that: if data arrives too fast or too slow, beyond some tolerance, the system could signal that there is an issue. If the images carry timestamps, it could check those too.
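A minimal sketch of that kind of check, in Python. The 30 FPS rate, the 10% tolerance, and all of the names are assumptions for illustration, not anything from the actual flight software:

    import time

    EXPECTED_PERIOD = 1.0 / 30.0   # assumed 30 FPS navigation camera
    TOLERANCE = 0.10               # assumed +/-10% acceptable jitter

    class FrameTimingMonitor:
        """Flags frames whose arrival or embedded timestamps deviate from 1/FPS."""

        def __init__(self, expected_period=EXPECTED_PERIOD, tolerance=TOLERANCE):
            self.expected = expected_period
            self.tol = tolerance * expected_period
            self.last_arrival = None   # local monotonic clock reading
            self.last_stamp = None     # timestamp embedded in the image, if any

        def check(self, image_stamp=None):
            now = time.monotonic()
            ok = True
            if self.last_arrival is not None:
                # Data arriving too fast or too slow relative to the expected rate.
                if abs((now - self.last_arrival) - self.expected) > self.tol:
                    ok = False
            if image_stamp is not None and self.last_stamp is not None:
                # Embedded timestamps should also step by roughly one frame period.
                if abs((image_stamp - self.last_stamp) - self.expected) > self.tol:
                    ok = False
            self.last_arrival = now
            self.last_stamp = image_stamp
            return ok   # False => signal the issue, don't feed this pair to the estimator

The point is just that the consumer of the frames has its own notion of how often data should show up, independent of whatever clock produced the timestamps.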

Then you’d have to hope the inertial guidance is good enough at those altitudes. The article says it doesn’t use image-based calculations for the landing itself because of the possibility of kicking up dust.




> "...This glitch caused a single image to be lost, but more importantly, it resulted in all later navigation images being delivered with inaccurate timestamps..."

It seems that the timestamping is decoupled from frame capture. That is, as if a burst happens first and the burst's frames are then assigned sequential timestamps after the fact. So if a burst has a glitched/skipped frame, the rest of that series and the subsequent bursts become mis-attributed.

I understand that making the timestamping atomic with frame capture is likely to slow down the burst. So there has to be some expected internal timeline against which to validate the timeline that results from bursting.
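A toy sketch of that failure mode and the kind of count-based validation I mean; the frame period, burst size, and names are all made up:

    # Timestamps assigned by position in the received series, after the burst,
    # rather than atomically at capture time.
    PERIOD = 0.1   # assumed frame period within a burst, seconds

    def assign_burst_timestamps(frames, burst_start):
        """The fragile way: stamp each frame by its index in the received list."""
        return [(f, burst_start + i * PERIOD) for i, f in enumerate(frames)]

    # A burst that should have produced frames 0..4, but frame 2 was lost.
    received = ["f0", "f1", "f3", "f4"]
    stamped = assign_burst_timestamps(received, burst_start=100.0)
    # f3 is now stamped 100.2 and f4 100.3 -- each one frame period too early,
    # and anything chained off this count inherits the same offset.

    def validate_burst(frames, expected_count):
        """Expected internal timeline: the burst should contain a known frame count."""
        return len(frames) == expected_count

    assert not validate_burst(received, expected_count=5)  # catches the skipped frame

Checking the received count (or total span) of a burst against what the capture schedule predicts is cheap and doesn't require per-frame atomic timestamping.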



