
There are two things to note here:

1. A cross-application connectivity layer that pipes data and actions between apps

2. A natural language interface to control #1

Thinking about them separately is useful, because although chat is the new UI hotness, #1 is valuable on its own and the two can potentially be deployed separately.
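
To make the separation concrete, here's a rough Swift sketch (all names made up, just illustrating the layering): the connectivity layer is a plain typed action API, and natural language is only one of several possible drivers sitting on top of it.

    import Foundation

    // #1: the cross-app connectivity layer as a plain typed API.
    protocol AppAction {
        var identifier: String { get }   // e.g. "photos.applyFilter"
        func perform(with input: [String: Any]) async throws -> [String: Any]
    }

    enum ActionError: Error {
        case unknownAction(String)
    }

    // A registry that any UI (buttons, shortcuts, chat) can drive directly.
    struct ActionRegistry {
        private var actions: [String: AppAction] = [:]

        mutating func register(_ action: AppAction) {
            actions[action.identifier] = action
        }

        func run(_ identifier: String, input: [String: Any]) async throws -> [String: Any] {
            guard let action = actions[identifier] else {
                throw ActionError.unknownAction(identifier)
            }
            return try await action.perform(with: input)
        }
    }

    // #2: the natural language interface is just a thin resolver on top,
    // mapping an utterance to an (action, parameters) pair.
    protocol UtteranceResolver {
        func resolve(_ utterance: String) async throws -> (identifier: String, input: [String: Any])
    }

The point of the sketch: nothing in the registry knows or cares whether a button or a language model called it, which is why the two can ship separately.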

As presented here, I suspect the natural language interface will be faster and easier than buttons for complex queries against the cross-app layer, but potentially slower than buttons for simple things (like "start dark mode").

But personally, I believe #1 combined with some AI context awareness is the more powerful of the two features.

...

And btw, I left Apple last year to build a local-first, developer-extensible assistant for the Mac that's pretty similar. If this interests you, I'd love to chat (email in my profile, along with a waitlist).




I was also thinking about how they could have integrated this into every app, but I figured simple tasks would be more reliable across all apps if they ran AI recognition on screenshots of the desktop, injected context like the list of installed apps, and then emitted a virtual keyboard event or a virtual click at an X/Y coordinate.
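
The output half of that is the easy part, at least on macOS. A minimal sketch using the standard CoreGraphics CGEvent API (it needs the Accessibility permission, and the screenshot-recognition half that decides *where* to click is the hard part I'm waving away):

    import CoreGraphics

    // Post a synthetic left click at a screen coordinate.
    // Requires the Accessibility permission in System Settings.
    func syntheticClick(at point: CGPoint) {
        let down = CGEvent(mouseEventSource: nil, mouseType: .leftMouseDown,
                           mouseCursorPosition: point, mouseButton: .left)
        let up = CGEvent(mouseEventSource: nil, mouseType: .leftMouseUp,
                         mouseCursorPosition: point, mouseButton: .left)
        down?.post(tap: .cghidEventTap)
        up?.post(tap: .cghidEventTap)
    }

    // e.g. click wherever the vision model decided the button is
    syntheticClick(at: CGPoint(x: 200, y: 300))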

Just some afternoon postulating haha


#1 has existed in Windows and other OSes for ages: accessibility APIs and UI test tooling (called UI Automation on Windows).
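
For anyone curious what that looks like on the Mac side, the counterpart is the AXUIElement accessibility API. A rough sketch (my own helper, needs the Accessibility permission) that reads another app's window titles by pid, the same kind of tree walk UI Automation does on Windows:

    import ApplicationServices

    // List the window titles of another app, identified by process ID.
    func windowTitles(ofPid pid: pid_t) -> [String] {
        let app = AXUIElementCreateApplication(pid)
        var value: CFTypeRef?
        guard AXUIElementCopyAttributeValue(app, kAXWindowsAttribute as CFString, &value) == .success,
              let windows = value as? [AXUIElement] else { return [] }
        return windows.compactMap { window in
            var title: CFTypeRef?
            AXUIElementCopyAttributeValue(window, kAXTitleAttribute as CFString, &title)
            return title as? String
        }
    }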


It looks like this is a high-level interface, more like Intents on Android; e.g., in the video it opens the Logo Templates page of Adobe Express, probably without any automated UI interaction.
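
If it is that kind of hook, the Apple-platform analogue would be the App Intents framework: the app declares a high-level action and the system invokes it directly, no UI scripting involved. A sketch of what that declaration looks like (the intent and the router are hypothetical, not Adobe's actual code):

    import AppIntents

    // A hypothetical intent exposing one in-app destination to the system.
    struct OpenLogoTemplates: AppIntent {
        static var title: LocalizedStringResource = "Open Logo Templates"
        static var openAppWhenRun: Bool = true

        @MainActor
        func perform() async throws -> some IntentResult {
            // Navigate to the templates screen in-app (hypothetical router).
            // AppRouter.shared.open(.logoTemplates)
            return .result()
        }
    }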



