I'm pretty sure cheap, high-quality lidar is coming very quickly - so many companies are working on it. There will be no market for an awkward Kinect. The Kinect isn't great compared to modern lidar systems anyhow; it was just more accessible.
Not entirely. LIDAR has other issues that matter when building self-driving cars: it can't read signs, and it doesn't work well in the rain, though progress is being made on that. And don't forget that the cameras Tesla uses don't rely solely on the visible spectrum we see; they can still see through fog via infrared, for example.
Can't LIDAR eventually be made to use IR wavelengths and thus synthesize 3-D images through fog?
Also, the question of seeing through fog is more nuanced than a simple can-see/can't-see:
"Just like it is impossible to give a simple answer
to the question “How far can I see with a thermal
imaging camera?”, it is equally impossible to say
how much shorter the range will be in foggy or
rainy conditions. This is not only dependant on the
atmospheric conditions and the type of fog but it is
also dependent on the IR camera used and on the
properties of the target (size, temperature difference
of the target and background, etc)" [http://www.flir.com/uploadedFiles/FOG_techNote_LR.pdf]
LIDAR typically uses lasers in the so-called eye-safe range (around 1.4 micrometers and up). That band sits right around an absorption peak of water, which is exactly why it can't damage the eyes of pedestrians and other bystanders: the fluid in the eye absorbs the beam before it reaches the retina. By the same construction, LIDAR suffers in fog, rain, snow, etc.
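To put a rough number on the fog penalty, here's a minimal sketch (my own illustration, not from the thread - the extinction coefficients, the Koschmieder visibility rule of thumb, and the diffuse-target assumption are all assumptions) of how a lidar return falls off with range once you combine 1/R^2 geometric spreading with two-way Beer-Lambert attenuation:

    # Rough sketch: two-way Beer-Lambert attenuation of a lidar return.
    # Extinction coefficients and the Koschmieder estimate below are
    # illustrative assumptions, not measurements from this discussion.
    import math

    def return_power_fraction(range_m, extinction_per_m):
        """Relative return power from a diffuse target at `range_m`:
        1/R^2 geometric spreading times exp(-2*alpha*R) two-way
        atmospheric loss (Beer-Lambert)."""
        geometric = 1.0 / (range_m ** 2)
        atmospheric = math.exp(-2.0 * extinction_per_m * range_m)
        return geometric * atmospheric

    # Koschmieder-style estimate: extinction ~ 3.912 / visibility.
    clear_air = 3.912 / 20_000   # roughly 20 km visibility
    dense_fog = 3.912 / 100      # roughly 100 m visibility

    for r in (50, 100, 200):
        clear = return_power_fraction(r, clear_air)
        foggy = return_power_fraction(r, dense_fog)
        print(f"{r:>4} m: fog return is {foggy / clear:.1e} x the clear-air return")

With those assumed numbers, dense fog cuts the 100 m return by three to four orders of magnitude relative to clear air. In real fog that loss is mostly scattering by the droplets themselves; the water-absorption effect described above comes on top of it.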