> It gives them time to confirm a new technology and update their maps of areas with information from the new wavelengths they're just now gathering.

Fair enough. I just tried to do a diff between the versions:

Current implementation [0]:

* Camera module in the front

* Front-facing radar

* 12 ultrasonic sensors

New implementation:

* 8 surround cameras

* Front-facing radar

* 12 ultrasonic sensors (updated)

[0]: https://www.quora.com/What-kind-of-sensors-does-the-Tesla-Mo...

So the difference is basically just a few extra cameras and updated sensors. It doesn't seem like a huge step or a completely new platform, at least when looking at the components.
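
To spell that diff out, here's a minimal sketch of it as a set comparison in Python; the component names are just the ones listed above, and the hw1/hw2 labels are mine, not Tesla's:

    # Rough diff of the two sensor suites listed above (hw1/hw2 labels are mine).
    hw1 = {"front camera module", "front-facing radar", "12 ultrasonic sensors"}
    hw2 = {"8 surround cameras", "front-facing radar", "12 ultrasonic sensors (updated)"}

    print("removed:", sorted(hw1 - hw2))  # ['12 ultrasonic sensors', 'front camera module']
    print("added:", sorted(hw2 - hw1))    # ['12 ultrasonic sensors (updated)', '8 surround cameras']
    print("kept:", sorted(hw1 & hw2))     # ['front-facing radar']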




What about the 40x faster processor with neural nets thrown in there somewhere? There's a bit more to it than just new cameras/sensors.


> neural nets thrown in there somewhere?

What techniques were they using to process the data before? Surely it was some form of statistical learning. And if they weren't using it, their competitors certainly were.


For version 8 of their software, the camera is now the primary sensor, and that's where the biggest hardware difference between the old and new platforms lies.



