This reminds me of Blinkendroid https://code.google.com/p/blinkendroid/ which was the first time I'd seen multiple devices interact with each other like this. Sadly that project never managed to get much traction despite how much potential it had.
Can't actually think of a case where I would want or need this. Even my Chrome tabs following me around has become kind of annoying lately, since I look up different things in different places without much interest in them elsewhere. E.g. I'm not going to read HN on my phone.
Indeed. It's a nice proof of concept and immediately gets me thinking about how I'd do it. However, the Evernote demo makes it look like a solution waiting for a problem. I imagine there is a compelling use case - will be interested to see it when it appears.
I think this is fantastic, but I couldn't find API docs / tutorial anywhere on your site. Demonstrating use cases is important, but I don't think it's enough to get developers to download / register.
There is a link to the GitHub repos in the 'Get Started' section, where it says "The demo apps are open source and come with the respective development guide." Unfortunately only the Android version is there now. Will try to make it more visible!
I couldn't get this to work between an S5 and a Nexus 5. What are the requirements? Same WiFi? NFC or Bluetooth enabled? How does it know which sides the devices are on?
Hey zillwc, thanks for trying the demo out. The requirements are an internet connection (any should do: WiFi, 3G/4G, etc.) and the availability of Location Services.
I'm not sure about the connection protocols, but it looks like it determines which side(s) each device is on by the direction that you swipe to connect them.
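If that guess is right, the core of it could be as simple as mapping the swipe's exit edge on one device to the opposite entry edge on the other. This is purely a hypothetical sketch of that idea; none of the names here come from the actual SDK:

```python
# Hypothetical: infer device adjacency from swipe direction.
# A swipe from device A toward device B exits A through one edge,
# which should face B's opposite edge.

def edge_from_swipe(dx: float, dy: float) -> str:
    """Return the screen edge a swipe with delta (dx, dy) exits through.
    Screen coordinates: +x is right, +y is down (Android convention)."""
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "bottom" if dy > 0 else "top"

OPPOSITE = {"left": "right", "right": "left", "top": "bottom", "bottom": "top"}

def infer_adjacency(dx: float, dy: float) -> tuple[str, str]:
    """Given the swipe delta on device A, return (A's edge, B's edge)."""
    a_edge = edge_from_swipe(dx, dy)
    return a_edge, OPPOSITE[a_edge]

# A rightward swipe on A implies A's right edge touches B's left edge.
print(infer_adjacency(120.0, 10.0))  # ('right', 'left')
```

In practice the backend would presumably also need rough timing and location to decide *which* two devices to pair, which would explain why the demo asks for Location Services.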