Prof. Thrun was part of the team that used laser range-finders to assist with image stitching in StreetView. Since both sensors (rangefinders and cameras) were mounted on a mobile vehicle, wheel odometry and GPS "ground truth" must have been logged at each waypoint as well.
The author of the post suggested that Google might have used the GPS data (and possibly the rangefinder scans) to build a simulated world for training the self-driving car (useful for testing new tweaks?). This is a good idea, and it probably did not take much effort given the logs already collected for StreetView.
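To make the idea concrete, here is a minimal sketch of what "replaying a drive log as a simulation" could look like. Everything here is hypothetical: the waypoint list stands in for a recorded GPS track, and the two functions (resampling the log into a reference trajectory, and measuring how far a simulated vehicle drifts from it) are illustrative, not anything Google has described.

```python
import math

# Hypothetical logged GPS waypoints (easting, northing in metres),
# standing in for a recorded StreetView drive.
LOGGED_WAYPOINTS = [(0.0, 0.0), (5.0, 0.5), (10.0, 1.5), (15.0, 3.0), (20.0, 5.0)]

def replay_track(waypoints, step=1.0):
    """Resample a logged drive into evenly spaced poses (x, y, heading).

    This mimics turning a raw GPS log into a reference trajectory that
    a simulated vehicle can be asked to follow.
    """
    poses = []
    for (x0, y0), (x1, y1) in zip(waypoints, waypoints[1:]):
        seg_len = math.hypot(x1 - x0, y1 - y0)
        heading = math.atan2(y1 - y0, x1 - x0)
        n = max(1, int(seg_len // step))
        for i in range(n):
            t = i / n
            poses.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0), heading))
    # Close out the track with the final logged waypoint.
    poses.append((waypoints[-1][0], waypoints[-1][1], poses[-1][2]))
    return poses

def cross_track_error(pose, target):
    """Signed lateral offset of a simulated vehicle pose from a reference pose."""
    x, y, _ = pose
    tx, ty, th = target
    return -(x - tx) * math.sin(th) + (y - ty) * math.cos(th)
```

In a setup like this, a new software tweak could be scored by replaying the same logged track and comparing cross-track errors before and after the change, which is presumably the kind of cheap regression testing the post had in mind.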