The Wingnut AR demo that Apple showed during WWDC featuring the Unreal Engine was just mindblowing! Thankfully, it's available on Unreal's YouTube channel so you can still watch it if you don't have Safari or the app - https://www.youtube.com/watch?v=S14AVwaBF-Y
This is nice but what's unique about this? Is it significantly better or more performant? Is it just nice to have an integrated solution?
I'd love to see a breakdown of ARKit vs something like Vuforia. Here's a video with the equivalent functionality to the WWDC demo -- https://www.youtube.com/watch?v=JvE_7filGsY
>This is nice but what's unique about this? Is it significantly better or more performant? Is it just nice to have an integrated solution?
Ignoring all that other stuff, it's officially sanctioned and supported, part of the platform, requires no external libs, has Cocoa level documentation, and tons of developers will be using it very soon.
It all depends on whether Apple makes ARKit a priority, though. Apple has advertised SpriteKit on various occasions, then mostly broke it in iOS 9.0/9.1 [1], [2]. Game Center went down once a single game (Letterpress) started using it as designed [3]. What will you do if ARKit breaks in iOS 12.0? If you use a third-party lib, at least you can roll back to an earlier version, or even fix the bug yourself if it is open source.
I would stay as far away from Apple as possible when it comes to gamedev tools.
> part of the platform, requires no external libs [...]
Sadly this also means Apple will only update it in subsequent versions of the OS. This is a big deal when choosing a target platform, e.g. ARKit on iOS 11 vs. ARKit on iOS 12.
For example, when iOS 12 comes out your target audience will be split, some on iOS 11's ARKit and some on iOS 12's. You'll have to decide if you want the new features in 12 or the ability to run on devices still using iOS 11 (or gate the newer calls at runtime, as in the sketch below).
Compare with other game platforms, which insulate you from underlying tech changes. For example a Unity app can be updated and use the latest Unity features, even on an old iOS.
[FWIW Apple could update ARKit independently of iOS, but that's not been their pattern]
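For what it's worth, the split between OS versions can be softened in the app itself with a runtime availability check: build against the newer SDK and gate the new calls, so one binary serves both audiences. A minimal sketch under that assumption (the function name, view setup and label text are purely illustrative):

```swift
import ARKit
import UIKit

// Sketch: one binary that still installs on the older OS, lighting up the ARKit
// path only where the OS actually provides it.
func presentFeature(in viewController: UIViewController) {
    if #available(iOS 11.0, *) {
        // The OS ships ARKit; start the AR flow.
        let arView = ARSCNView(frame: viewController.view.bounds)
        viewController.view.addSubview(arView)
        arView.session.run(ARWorldTrackingConfiguration())
    } else {
        // Older iOS: keep a non-AR fallback instead of dropping those users.
        let label = UILabel(frame: viewController.view.bounds)
        label.text = "The AR view requires a newer iOS version"
        label.textAlignment = .center
        viewController.view.addSubview(label)
    }
}
```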
The building destruction physics were pretty impressive. Maybe I'm out of touch with modern games engines but I've not seen anything quite like that before.
This kind of thing is the least interesting application of AR in my opinion. I want high-quality information overlays about the real world. I do understand that some people just want Starcraft on a coffee table, but it seems like a low ambition for Apple to showcase.
>This kind of thing is the least interesting application of AR in my opinion. I want high-quality information overlays about the real world. I do understand that some people just want Starcraft on a coffee table, but it seems like a low ambition for Apple to showcase.
It makes for a compelling proof of concept. Having a lot of individual things interacting with the environment independently is most of what you need, technically, for the high-quality overlays you're talking about. If you can do the former you can do the latter, it's just a sexier way of showing off the capabilities.
I don't get the point of seeing anything in the background of the table, in this case the audience. Why not paint it with sky, then it's less distracting?
Oh, I get it, then it's not AR. Well, maybe AR is not such a good fit in this case to begin with.
Then it's just watching a 360° YouTube video on your phone.
This is still new and doesn't quite make full use of it yet, but there are many reasons why you'd want to see the environment. For one, it gives a sense of scale: if you place a life-size dinosaur, seeing the surroundings lets you understand its scale better. Beyond that, the AR application could very well interact with the world too, which opens up new possibilities.
This is pretty new technology and no one knows the best way to use it yet. Maybe in the future we'll start putting a sky in the background. Maybe we'll find better ways to use AR. Both AR and VR are still in their infancy.
Good point, except that in this example, you don't understand the scale better (since all the people and spaceships fit on a table). I believe (for this example) an immersive experience, with VR goggles would be much better.
I definitely agree. The examples Google gave of Tango were much more in line with the points I was speaking of. But again, I'd love to see what new creative things others come up with.
It demonstrates the power they have with AR. If they can show a town appear on a table with air strikes coming in in realtime, it implies that simpler more practical AR use cases will be much easier.
Prior to this, as a developer, you might think AR is limited to what Pokemon GO and the first example (like a cup of coffee on a table) show. After seeing this, if you were thinking of doing anything simpler than what they were showing, you'd likely see it as much more feasible than you previously did and perhaps look into including some of those features.
And this was at WWDC (their developer conference), so it certainly applies to a developer audience.
The content itself wasn't that big of a deal; the fact that the AR technology worked so well, and is available to anyone using the SDK, is what was impressive and hype-worthy.
Doing AR from scratch is a huge endeavour but this should make it simple enough so that we start seeing more applications using it for more useful stuff.
I thought the Samsung S8, as a premium phone, should have had AR.
Compare that to 60-120fps on iOS 11. Yup, Apple's Metal 2 could give a boost by reducing latency from 16.6ms to 8.x ms, and even more with direct-to-display; that matches the PlayStation VR refresh rate. Real-world tracking is what we need.
https://www.youtube.com/watch?v=Yphh1Ue3D6g
Also, Apple has almost never been the first to do anything. Tim Cook even admitted this yesterday. What Apple brings to the table is an actual polished solution. Which, having used HoloLens extensively, it absolutely is not.
Well, Apple has just released the 'smart keyboard' for its iPads, the 'smart' bit being that with a physical connection, "it doesn't need charging and automatically transfers data". Such innovation! Something keyboards have been lacking for decades...
I know, I have been using thin and light portable keyboards that magnetically snap to my PC and align the power/data connection since the early '80s.
So you're saying the 'smart' bit is the magnets? Because I don't know what keyboards you were using, but the power and data connections were always aligned when I connected up my keyboards.
Unfortunately, the old-fashioned keyboards also worked when they weren't hard up against the screen - thank god we have these new 'smart' keyboards that have to lie flush against the screen to work.
How was it mindblowing? I thought it was a pretty poor example of what AR can do. It was just a 3d game on a tabletop. Pokemon Go is a lot better, and it's pretty basic.
Having designed an AR experience for children and doing quite a bit of research into this problem a few years ago... You incorporate thumb controls on the sides of the device where you would naturally hold it. You use physical movement of the tablet to move things in the environment. It's limiting, but it works and can allow for fun interaction with the environment.
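As a rough illustration of the "move things by physically moving the device" idea, here is a hedged ARKit/SceneKit sketch; the selectedNode property and the half-metre offset are illustrative choices, not anything prescribed by the framework:

```swift
import ARKit
import SceneKit
import simd

// Sketch of "move the object by moving the device": each rendered frame, keep the
// selected node half a metre in front of the camera, so physically moving the
// tablet drags the object through the scene.
class MoveByDeviceViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()
    var selectedNode: SCNNode?    // whatever the user is currently dragging

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)
        sceneView.delegate = self
        sceneView.session.run(ARWorldTrackingConfiguration())
    }

    func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
        guard let camera = sceneView.session.currentFrame?.camera,
              let node = selectedNode else { return }
        var offset = matrix_identity_float4x4
        offset.columns.3.z = -0.5                          // 0.5 m in front of the lens
        node.simdTransform = simd_mul(camera.transform, offset)
    }
}
```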
I wasn't aware of that, what format does Google use for its live streams?
I know that YouTube on Chrome uses VP9, which is not a web standard - or at least not more of a web standard than HTTP Live Streaming (which is what Apple uses for its keynote).
You're right, I am legitimately confused. If there are no web standard video encodings, is it therefore wrong to say that "Google uses web standards for their video?" What exactly is Google doing better?
YouTube uses MPEG-DASH as its adaptive streaming protocol, which works over standard web protocols with royalty free codecs and containers, unlike HLS.
As for why there isn't a web standard video encoding and container, all the major browser vendors except one have announced support for VP9, its successor, and WebM.
I get that you don't want to use Safari, but actually "forcing us to use Safari" is exactly what Google did. Google chose not to implement HTTP Live Streaming, and therefore it's not available in Chrome. Microsoft made the other choice, which is why it works in Edge.
The double standard here is really quite breathtaking. Apple doesn't implement a feature: Apple's fault. Google doesn't implement a feature: Apple's fault too.
HLS requires the MPEG2-TS container format, which is not royalty free and therefore has no chance of becoming a web standard or of being implemented by Mozilla on free platforms.
I don't see the double standard. All but one browser on GP's platform support the royalty free option. Only Safari on his platform supports the encumbered option, and he doesn't want to be forced to use it. On my own preferred platform, no browsers support HLS.
Since the votes on this thread won't let me reply:
@pjmlp: I choose to use a platform where developer experience is the primary goal because developing software is both my vocation and avocation and why I am reading "Hacker News." If I wanted a walled garden media consumption toy, I would get a LeapPad. Look, we can snark all day, but that doesn't change the conclusion that tommoor was right about web standards.
@millstone: There are no royalty-required formats that are web standards for a reason: to make the web free and open to all. That is why it is not a red herring. Bringing Google into it, on the other hand, is — whether Google pays the fees has no bearing on whether the format should be a web standard. Apple chose not to implement hardware decoding for the unencumbered format. All other major consumer hardware manufacturers have.
Not that I like HLS even a tiny bit, but even if MPEG2-TS was patent encumbered (and I'm not sure it is) HLS now supports fragmented MP4 just like DASH (this was done so that content providers don't need to store two different muxings of the video for DASH and HLS).
Apple, as usual, wrote their own private format. A pretty sucky one at that. But don't get fooled, DASH is not much better, and for the few things it offers over HLS it comes with a massive implementation difficulty. There is basically no DASH-compliant player around; not even the one developed by the DASH Industry Forum implements the whole standard. It would be way easier for every browser vendor to support HLS than DASH.
"Royalty free" seems like a red herring. I'm not familiar with MPEG2-TS in particular, but Chrome already happily plays H.264 on macOS, plus there is an explicit exception in H.264 license for non-paid (including ad-supported) content. I don't see how implementing HLS would increase Google's licensing cost in any way.
I appreciate the free software position. It seems MPEG-DASH is indeed better suited for it, though maybe only slightly [1]. (Regardless, turns out Apple did not release anything as copyleft this year, so maybe it's better that free software purists could not have watched the video?)
The case for "encumbered" options is simple: it's what's decoded in hardware, for users who prefer their device to last the entire video.
If Apple used fMP4 for this video, that gives them no excuse not to support other browsers. They can just ship a JavaScript HLS client built on the MSE web standard on the web page.
It is an Apple developer conference for software developers that care about making beautiful applications that take full advantage of native experiences on Apple platforms.
Any developer on this community can watch the video.
I can still be an outsider and be genuinely interested in what their platform has to offer and, maybe, later decide that I want to invest in their platform. I become a better developer, Apple gains a new developer, everyone wins.
Forcing people to use macOS to view recorded sessions or events goes against that.
However, the default experience is a little box that says "Streaming is available in Safari and the WWDC app". Now I have to go out of my way to learn that they stream via HLS, obtain the stream's URL and feed it to VLC. †
Compare this with Microsoft's and Google's videos (available through Channel9 and YouTube, respectively) that are accessible "virtually" from every operating system and device.
If your goal is to attract new developers to the platform, adopting a more widespread industry standard (such as DASH, which all other browsers implement) is probably the way to go, IMO.
--
† By the way, the box also breaks the "Copy Video Location" menu item, so I have to open the inspector or install an extension to find out what the real URL is.
It does seem like a lost opportunity to do some outreach though. It's not hard to provide streams both in HLS and DASH, and streaming the developer conference to everyone could surely attract new developers to their platform (and not only developers, the same thing applies to presentations of new devices and consumer software).
To me it really looks like a statement against people without macs; Apple doesn't care about them, not even in its own interest of making money. They won't speak to you as long as you don't have an iDevice, full stop.
All computer ecosystems up to the late '90s were like that; the only thing special about Apple is that they are the surviving one from those days.
It was IBM's "mistakes" that made the PC different from all other computer ecosystems; however, the current trends of commodity hardware and race-to-the-bottom prices are pushing PC OEMs back to the '80s-'90s culture of full hardware/software integration.
Most apps these days run on Android, too, which makes using a single-platform framework like this a lot harder to justify. It's the same reason you don't see many games using SpriteKit -- it just doesn't make sense when you know you'll need to support Android in the future.
I wonder if it would make sense for Apple itself to support frameworks like ARKit and SpriteKit on Android. I think it would make people a lot more comfortable relying on them.
It absolutely does not make sense for Apple to support this on Android. AR is something that uniquely leverages assets that Apple has: its various AR related acquisitions (Metaio, FlyBy Media, Faceshift, Emotient, WiFiSLAM, PrimeSense) and most importantly their prowess in chip design.
It is trivial for the Android ecosystem to replicate an Instagram for their platform; it will be much more difficult to do this for AR, and thus AR will be a meaningful source of competitive advantage.
There are real economic barriers to having the necessary hardware for AR. High-end Android phones don't ship anywhere near the numbers of iPhones to give it the scale required to make the necessary investments in hardware and software worth it (both from OEMs and Google). And that will ultimately limit the opportunity and market for developers, disincentivizing them to invest in it.
This is an area where Apple's scale, margins and ability to deploy new technologies and see meaningful adoption from users and developers (think about how fast Touch ID rolled down into the entire iPhone/iPad installed base, setting it up for Apple Pay years later) pulls up the ladder on would-be competitors.
The fact that Tango isn't on the Pixel means Google has no confidence in it and nobody should invest in it, as Google will likely abandon it quickly. It's a shame; I think it's a great idea, and they'll only end up playing catch-up later if they do abandon it.
1. The technology is still new and not yet polished
2. Pixel 1 was a rushed job made in 9 months
3. Tango needs extra hardware that increases cost
I have a feeling that the Pixel 2 might have it though, as they have had much more time to work on it and the technology itself has matured a lot since, as we saw at I/O. Not only that, they also showed off VPS, a visual positioning system for indoor use built on Tango. Definitely a killer feature for the Pixel.
You're probably right, but I think it's important to note the difference between Touch ID (a user feature) and ARKit (a developer feature). Users won't know or care about ARKit if developers don't adopt it.
This is the most relevant post in this thread. Holding up an iPad for hours pointing it at a table to play Millennium Falcon animated chess makes very little sense, but it will be incredibly cool when your friends can all sit around the table with their Apple Glasses on, and see the same AR projection :)
Unreal Engine and Unity are going to have ARKit support. If your ambition is to make an AR app for both iOS and Android, I suspect one of those engines is the better choice.
I don't believe giving ARKit away makes sense for Apple at all. It is, from what I can see, a pretty advanced technology, and widespread adoption of it on other platforms will not help them sell more iPhones.
Of course it makes sense and of course it will help sell phones.
This only works on iPhones/iPads with the A9 or A10 chip, which means that if you want to use AR you need to upgrade from your iPhone 6. It will also create a suite of AR apps that only run on iOS.
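In practice you wouldn't hard-code a list of A9/A10 devices; ARKit exposes a runtime support flag you can query before offering the feature. A tiny sketch, with the fallback branch purely illustrative:

```swift
import ARKit

// Sketch: ask ARKit whether world tracking is available on this device instead
// of maintaining a device list by hand.
if ARWorldTrackingConfiguration.isSupported {
    // Safe to offer the AR experience on this iPhone/iPad.
} else {
    // e.g. show a plain 3D preview for older devices such as the iPhone 6.
}
```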
Being iPhone only didn't stop Instagram from launching, getting huge, and branching out to other platforms. If this allows you to make a quicker prototype, it's a big deal.
The only downside to this wild success is that profit is a lot more concentrated than it used to be. It's really hard to bust onto the charts as an outsider when there's so many incumbents.
For example, sucks to be someone who launched a game this week when Monument Valley 2 took over everything.
Mobile apps in general seem to be a race to the bottom from a business standpoint, and Android is much more so than Apple. I strongly dislike both platforms, and generally harbor a very negative opinion of Apple in general, but if I were to venture into mobile development I'd target Apple first and worry about Android (much) later.
From what I've gathered this totally depends on your market. Creating a paid app or an app targeting US University students? iOS first. Creating an ad supported app or targeting an international market? Android first or use a framework that supports compiling to both platforms.
There's also the possibility that this ends up being very useful to the business market; either for in-house apps or for apps sold to businesses. If it solves the problem they have they'd be willing to pay real money as opposed to the problem of getting consumers to spend.
Indeed. I work for an education company in Mexico. We target upper class schools and the distribution between iOS and Android is roughly 50/50 in our market.
Apple keeps doubling down on platform specific services and APIs but as their market share dwindles this becomes increasingly untenable. Porting Apple Music to Android was a step in the right direction but they have a long way to go in the new world where services are more important than devices.
> But the essential point is that nobody can afford to ignore Android and building to iOS-specific APIs doesn't make sense.
For the broad market maybe, but for niche markets a good example would be ForeFlight https://www.foreflight.com/products/foreflight-mobile/ It's pretty much the gold standard among pilots, from private to airline transport, and it's iOS only. Makes support and dealing with fragmentation a whole lot easier.
This is an almost silly point. How many features less significant than this hold companies hostage to Microsoft? So it's ok when Microsoft makes Microsoft/Windows-only frameworks and software but not when Apple does it? Sheesh. Of course vendors are going to specialize their offerings.
Of course Apple and developers can choose to ignore Android.
The market within the iOS space is huge, and if building an iOS-only app using ARKit is more profitable, then why not? It's all about ROI. There isn't an equivalent to ARKit on Android, and building one would be costly and time consuming.
How large of a device marketshare could a developer count on at this point from Tango to justify investing heavily in that platform? One thing is for sure, developers can count on a significant share of paying customers if they go with ARKit.
Unless you are targeting a specific audience (for example artists that use the iPad Pro) for most developers it really doesn't make sense to confine yourself to iOS.
Mobile platforms have gotten large enough that market share isn't necessarily a useful metric. There are many hundreds of millions of iOS devices out there. That's a pretty decent market to be addressing.
(It also happens to be almost all of the top end of the mobile market, so an iOS only offering has a TAM of hundreds of millions of people who are more likely to spend money. Not bad really :)
I'm willing to bet the only reason they did that is because they already had Beats customers on Android.
The whole point of these APIs is to let developers make better apps on iOS so people want an iOS device. If they port any of them to Android, all they're doing is making it easier for people to leave their platform.
Sure that's the goal but in practice I don't think it's working. People use cross platform tools like Unity or React Native or OpenCV because they know they have to support Android and can't kick that can a year or two down the road like they used to.
So then use Unity with ARKit and wait for an equivalent Android AR library. I've looked into ARKit myself already - it is dead simple to get started. And so much data is already provided, including scale and lighting estimates.
Yes, you lose out on android for now, but in the mean time you support millions of iOS devices in a category that will be getting a ton of media attention in the next year.
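To give a sense of the "dead simple" claim, a minimal native session that reads the per-frame lighting estimate looks roughly like this; the class name and the print statement are placeholders, not Apple sample code:

```swift
import ARKit
import SceneKit

// Minimal sketch: run world tracking and read the per-frame lighting estimate
// ARKit hands you. Positions are already in metres, so real-world scale is free.
class MinimalARViewController: UIViewController, ARSessionDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)
        sceneView.session.delegate = self
        sceneView.automaticallyUpdatesLighting = true      // apply the estimate to the scene

        let configuration = ARWorldTrackingConfiguration()
        configuration.isLightEstimationEnabled = true
        sceneView.session.run(configuration)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        if let estimate = frame.lightEstimate {
            // Ambient intensity in lumens; around 1000 is a well-lit scene.
            print("Ambient light estimate:", estimate.ambientIntensity)
        }
    }
}
```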
> Porting Apple Music to Android was a step in the right direction but they have a long way to go in the new world where services are more important than devices.
Are they though? There are still huge numbers of people buying phones and computers, which cost a lot of money, periodically. There are paid services, sure, like Apple Music, which might become self-sustaining businesses, but a lot of these other "services" by the big tech companies are operated at a loss, like email, maps, cloud storage, messaging, video calling, AI assistants, etc.
The only reason to make free services like all those at a loss is to bring value to a platform so users will stick to it, and Apple keeping its services platform-specific makes its platform even more sticky. Apple benefits from this directly, because it makes money selling its platform. To use iOS you need a device from which Apple makes loads of cash.
Google, on the other hand, is still essentially just an ads company with hobbies, and the profit from the platform it's building is much murkier, more indirect, and IMHO flimsier. The only way their platform brings them substantial profits is by collecting data to better target ads, which is concerning too.
The lifetime of devices is growing, so in the long term we have to see if Apple can keep selling as many devices as it does; their main source of income is not entirely future-proof. If you ask me, however, Google might be in a more delicate position, especially considering the growing issues with internet ads (concerning privacy, tracking, obnoxiousness etc) and that so many players are in the ads business nowadays. Some, like Facebook, are moving in deep, challenging Google in user data analytics and engagement time.
tl;dr: Only some services (like Music) make money. All the rest are more valuable if they serve your core business, and Apple is very much doing that.
Why would Apple port core technologies to a competing platform? Did Google port Play Services to Windows? No. And no one would expect them to, since it is a core differentiator for their platform.
ARKit isn't a backend service; it's an OS library, so it doesn't make sense for it to be cross-platform.
Why has this comment thread devolved into comparing the number of Android OS installs versus the number of iPhones sold?
They are apples and oranges, and the point is that iPhones going back a few years give you a known hardware specification that ARKit is guaranteed to work on. If you think about this for Android, you'd need to slice it by the devices that are compatible; it's not going to be guaranteed by OS version alone.
I hope that ARKit can be a lesson in what works well and what doesn't, and spur the open source community to create something like it, but cross platform. As more phones get dual cameras, what we can do with software like this will only get better.
ARKit doesn't seem to require dual cameras because it runs on iPads too. I do wonder if it uses the depth sensing second camera in the iPhone 7 Plus, or if that data isn't fast/precise enough to be useful.
Apple added a new depth api that gives real time information from the dual cameras. I’d be surprised if ARKit doesn’t use it to enhance the experience on the 7 Plus.
That said, it isn’t required. I have run the demo on my own iPad...the tracking is seriously impressive even with just the one camera.
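For reference, the real-time depth data mentioned above is exposed through AVFoundation in iOS 11; a rough, error-handling-free sketch of streaming depth frames from the dual camera might look like this (whether ARKit itself consumes that data is, as the parent says, speculation):

```swift
import AVFoundation

// Rough sketch of the iOS 11 depth pipeline: stream AVDepthData from the
// 7 Plus dual camera. Session teardown and capability checks are omitted.
final class DepthStreamer: NSObject, AVCaptureDepthDataOutputDelegate {
    let session = AVCaptureSession()
    let depthOutput = AVCaptureDepthDataOutput()

    func start() throws {
        guard let device = AVCaptureDevice.default(.builtInDualCamera,
                                                   for: .video,
                                                   position: .back) else { return }
        session.beginConfiguration()
        session.addInput(try AVCaptureDeviceInput(device: device))
        session.addOutput(depthOutput)
        depthOutput.setDelegate(self, callbackQueue: DispatchQueue(label: "depth"))
        session.commitConfiguration()
        session.startRunning()
    }

    func depthDataOutput(_ output: AVCaptureDepthDataOutput,
                         didOutput depthData: AVDepthData,
                         timestamp: CMTime,
                         connection: AVCaptureConnection) {
        // depthData.depthDataMap is a CVPixelBuffer of per-pixel disparity/depth values.
    }
}
```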
My argument probably isn't as valid but I still think the guarantee of 2 iPhone generations is safer than the fragmentation of Android devices that would be "released in the past 2 years".
Agree that ARKit is likely to be more successful than an Android equivalent - it's not just version fragmentation but also hardware capability fragmentation.
Not only is the install rate for the latest Android dramatically smaller than the latest iOS, but camera capabilities (and GPU horsepower to drive the AR itself) are all over the place, with no guarantee that the camera module in the phone will provide even a satisfactory experience, or that the GPU will be sufficiently capable.
Camera APIs on Android are also a complete jibbering mess, which adds to the difficulty of anything that seeks to work across all (or even most) devices.
One distinct iOS advantage isn't just that adoption of new versions is rapid, but also that hardware capability variance is low, so devs can be confident that their products work well, as opposed to barely working.
I'm currently working on a client project that uses the Vuforia library paired with an array of iPad Pros to pull off this sort of AR-light effect. But Vuforia relies on a tracking marker, and it's a huge pain in the ass for us. The idea of using lower-level camera and inertia sensor data to perform tracking without a marker is something I didn't expect to see without a Tango-like cluster of depth sensors.
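For contrast with the marker-based Vuforia flow, the marker-less ARKit equivalent is roughly: detect a horizontal plane, hit-test a screen tap against it, and anchor content there. A hedged sketch (the box geometry and gesture wiring are just illustrative):

```swift
import ARKit
import SceneKit

// Sketch of the marker-less flow: let ARKit detect a horizontal plane from camera
// and inertial data alone, then place content where a tap intersects it.
class MarkerlessViewController: UIViewController {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)

        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal          // no printed tracking marker
        sceneView.session.run(configuration)

        let tap = UITapGestureRecognizer(target: self, action: #selector(place(_:)))
        sceneView.addGestureRecognizer(tap)
    }

    @objc func place(_ gesture: UITapGestureRecognizer) {
        let point = gesture.location(in: sceneView)
        guard let hit = sceneView.hitTest(point, types: .existingPlaneUsingExtent).first
        else { return }
        let box = SCNNode(geometry: SCNBox(width: 0.1, height: 0.1,
                                           length: 0.1, chamferRadius: 0))
        box.simdTransform = hit.worldTransform               // anchored on the detected plane
        sceneView.scene.rootNode.addChildNode(box)
    }
}
```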
AR goes beyond gaming - I thought the idea of using it for things like interior design was great. I remember Apple mentioning that IKEA used ARKit to project models of their furniture into rooms.
Both metrics are equally useful or useless depending on what you want to quantify.
If you compare this to "serious" gaming technology (i.e. what the triple-A creators are making, and what parents are willing to pay $70 a title for to keep their 16 year old from nagging them), the iPad and iPhone both are severely lacking in computational power to produce anything that the heavy consumer will spend "real money" on (rare exceptions like Farmville and Candy Crush notwithstanding). This is completely useless for serious gaming.
This could be huge for, say, realtors though. If a prospective purchaser really likes the home but feels 'iffy' about that wall blocking off light into the dining room, the realtor can now power up an app and show in real time a rendering of how the house would look if you knocked out that wall and had an open-floor layout with that beautiful Southern light pouring in. The ability to pop in new landscaping to let the buyer visualize replacing the front with hedges and hydrangeas against hyacinths could easily make the difference between that realtor making her 6% on that $850,000 or not.
Never seen anyone using AR in real life apart from some gimmicky stuff. The idea is nice, the execution is lacking, and the applications would be way better if you had it always on (as in, glasses or something similar), not through a phone or tablet surface.
There are Free To Play games these days, and I don't see why you would not consider them gaming, even though 90% of users don't pay anything (no DLC). They are still gaming.
You don't need the most market share to have the most gaming activity. People that use iPhones are much more active with gaming apps for whatever reason.
I personally disagree. Give me a good AR maps app, or just something that will put Wikipedia on top of the real world, and I'll have to buy battery packs to make up for all the extra time I'll be on my phone.
Google Nintendo's AR Games if you haven't already seen them. Mind blowing fun, and years ahead of Apple's efforts. The archery and golf games were incredible.
I was hoping for years that it was a sign that Nintendo was going to release Super MARio...
The guy from "Pair" in the first episode of "Planet of the Apps" [https://www.planetoftheapps.com/en-us] tried to make a similar SDK his main business model but eventually failed to get funding from the VCs.
I hope in the end he decided to keep his focus on the app (which is quite amazing), otherwise he's in big trouble now!
I really think that a lot of new apps will come out in this space, now that the technology is much more approachable.
Apple does not provide JavaScript access to their APIs, so I doubt it. You'll only be able to use this in native apps. Something like PhoneGap or Cordova could try, but I wouldn't be surprised if there were serious performance issues.