William Gibson's novels The Peripheral (2014) and Agency (2020) have characters who control implanted smartphones (like a HUD on your eyes and an audio channel built into your ears) by pressing the tip of the tongue against the roof of the mouth. It's possible he was inspired by the video or some other source, but yeah, it's clear the idea has been around for a while.
IIRC the book(s) mainly referenced "subvocal" controls, based on detecting almost-talking movements in the user's face, throat, and voice box.
I'd argue those are qualitatively different: more like a modern-day system of saying "AssistantBot: Send E-Mail", as opposed to a geometric mapping one could use for, say, painting a picture.