Very cool. Also, love seeing rambling wrecks from Georgia Tech here!
While this is a very cool project, building an obvious demo that people can try out would make it stand out in the current ecosystem of tools like this.
Thanks for the suggestion! I just added links to the demo applications earlier in the README. All applications are Jupyter notebooks that you can open in Google Colab.
I personally wouldn’t put the Emotion one first on the GitHub README. It was the only one I opened before clicking the license plate one, where I a) saw it was a whole other GitHub demo and b) opened two files to find both doing parsing and model loading without any SQL, before getting bored and closing the project.
Maybe I’m not the target market, but the 2nd and 3rd examples in your list here, which actually have SQL query examples, were much more interesting and relevant IMO.
Thanks for the helpful suggestion! Just reordered the examples in the README.
Here are the illustrative queries:
-- Object detection in a surveillance video
SELECT id, YoloV5(data)
FROM ObjectDetectionVideos
WHERE id < 20;
-- Emotion analysis in movies
SELECT id, bbox, EmotionDetector(Crop(data, bbox))
FROM HAPPY JOIN LATERAL UNNEST(FaceDetector(data)) AS Face(bbox, conf)
WHERE id < 15;
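For readers unfamiliar with the `JOIN LATERAL UNNEST` pattern in the second query: each frame's `FaceDetector` call returns a set of faces, and `UNNEST` fans the frame out into one row per detected face, on which `EmotionDetector(Crop(...))` then runs. A minimal plain-Python sketch of those semantics, using hypothetical stub detectors rather than the project's real UDFs:

```python
def face_detector(frame):
    # Hypothetical stub: pretend every frame contains two faces,
    # each reported as (bounding box, confidence).
    return [((10, 10, 50, 50), 0.9), ((60, 20, 100, 60), 0.8)]

def crop(frame, bbox):
    # Hypothetical stub: stand-in for cropping the frame to the bbox.
    return ("crop-of", frame, bbox)

def emotion_detector(face_crop):
    # Hypothetical stub: stand-in for the emotion classifier.
    return "happy"

# The HAPPY table: one row per video frame.
frames = [{"id": i, "data": f"frame-{i}"} for i in range(20)]

# JOIN LATERAL UNNEST(FaceDetector(data)) AS Face(bbox, conf):
# emit one output row per (frame, detected face) pair.
rows = []
for frame in frames:
    if frame["id"] < 15:                          # WHERE id < 15
        for bbox, conf in face_detector(frame["data"]):
            rows.append((frame["id"], bbox,
                         emotion_detector(crop(frame["data"], bbox))))

print(len(rows))  # 15 frames x 2 stub faces = 30 rows
```

The key point is that the lateral join changes the row granularity from "one row per frame" to "one row per face", so per-face expressions like `Crop(data, bbox)` become ordinary column expressions.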