iOS 7: Catch me if you can (allenpike.com)
133 points by jaysonlane on June 26, 2013 | 127 comments



This needs a dose of cynicism I think. The success of the iPhone in 2007 really wasn't driven by hardware capability (except arguably the capacitive touch screen). Even then, other phones had faster processors, more memory, and equivalent GPUs. Apple won because it invented new usage models, not because it drove its designers to "show off what's possible in" 2007.

The stuff this is talking about is just eye candy. The original iPhone was much more than an eye candy improvement.


It was a bit of both. The original iPhone wasn't a hardware revolution, but it was an OS revolution. It was the first time anyone put an OS on a phone that had a blazing fast graphics layer.

Touchscreen phones existed before then, but because of platform performance they were always of the "poke, wait, stuff magically appears, poke, wait, stuff magically appears..." model.

Apple didn't just invent new usage models, they invented the platform that made them implementable. The deep integration of UIKit with hardware acceleration, as well as animation as a first-class consideration in the API, were (and still are) some of Apple's greatest advantages.
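To make the "animation as a first-class consideration" point concrete, here's a minimal, purely illustrative Objective-C sketch (using the block-based API that arrived a little later than the original iPhone; the view passed in is hypothetical): one UIKit call describes the end state, and Core Animation interpolates and composites the transition on the GPU.

  #import <UIKit/UIKit.h>

  // Illustrative helper: `someView` is assumed to be a subview already on screen.
  static void fadeAndSlide(UIView *someView) {
      [UIView animateWithDuration:0.3 animations:^{
          // Describe only the final state; the framework animates the change.
          someView.alpha = 0.5;
          someView.center = CGPointMake(someView.center.x, someView.center.y + 100.0);
      }];
  }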


My impression was that the iPhone was fully thought out in advance so that everything would work together. Take away a major piece, and the whole thing stops making sense. It needed a touch screen so that it didn't need a keyboard. There couldn't be a keyboard, because it took space away from the display. Replacing a keyboard with a touch screen was radical at that time, and it had to work really well. People would need visual feedback to be able to interact with it effectively. The visual feedback couldn't lag, so that required fast graphics. And so on...


Sort of. It wasn't quite such a clear causal chain - there were some things that positioned them well in the first place.

Quartz, which is the drawing and compositing engine developed for OSX, is really at the core of it. For one thing, Quartz is really fast, and long before iPhone was even a rumor Apple had already tied it deeply into OpenGL and made it hardware accelerated.

Quartz is really the "secret sauce" to the responsiveness and graphics performance of iOS, and it predates iOS by a pretty wide margin. Core Graphics was built on top of Quartz, as well as Core Animation, which form the primary ways third party devs interact with the graphics layer.

If Apple hadn't done OSX first, or done it differently, I don't think they could have pulled off iPhone.


For sure, it wasn't conceived in a vacuum. Apple's existing IP would have provided essential pieces to make it possible.

I've only owned a single smart phone so far, and it's an Android device. But I've noticed that Apple's devices have smoother interfaces. I've wondered how much of that can be traced to platform factors, and how much is simply execution and polish. And whether the OS design, and the difference between ObjC and Java, play a significant role. My impression was that Java was a dog, but it seems to run pretty well on Android. It's possible that with the meticulous attention to detail of a company like Apple, Android could be made as responsive and smooth as iOS. Or, maybe, platform issues make that virtually impossible. It would be more technically intriguing to think that native code and Quartz are decisively better, but I'm leaning toward the more boring explanation that Apple just worked harder on the UI, and expended the effort to do all the little things right. The basis for my opinion is the observation that true explanations are more often boring.

I'm not much of an expert on graphics, but I've had a suspicion that dedicated graphics hardware makes iPhone-type devices possible. Highly-integrated and highly power efficient graphics hardware was only just becoming available when the iPhone was in development. And, as you mentioned, the software layers to take full advantage of it in a GUI weren't something you could just take for granted.


CoreAnimation is another great example: it was first made public in the summer of '06. It's easier to see why they were working on such a thing now.


>>It was a bit of both. The original iPhone wasn't a hardware revolution, but it was an OS revolution.

I disagree. Only a very small minority cares about operating systems and revolutionizing them.

What the iPhone revolutionized was user experience. For the first time, someone invented a device that was easy to understand, fun to use, and seemingly limitless in capability. When Steve went on stage and showed it off, the minds of everyone in the audience went into overdrive to start imagining all the possible things they could do with it. Both the hardware and the OS were designed to maximize those aspects of the device.

This is what sets the iPhone apart from its competitors, who to this day focus on hardware features and software gimmicks. In my opinion neither Android nor Windows Phone has managed to capture and learn to communicate in the higher-level thinking that constitutes UX. They are still great operating systems. They just work in a very different context.


I don't think we're actually disagreeing. I'm not saying that the average iPhone user has remarked "Core Graphics is great! It makes my phone so responsive and fast!". I'd be very surprised if any lay iOS user has ever said that ;)

But the idea is that the level of interactivity and the UX around the iPhone would have been impossible without the platform Apple first built for OSX.

Many other manufacturers at the time had UX ambitions like Apple, but were held back by their own platforms. This infamous video comparing the concept renders of the Nokia N97 vs. what shipped really demonstrates this:

http://www.youtube.com/watch?v=vJpEuMidcSU

It's not that Apple didn't innovate on the UX, they sure as hell did, but rather that they bought themselves a huge lead on the competition by having, for all practical purposes, the only platform around that could even do something like that.


Wow, I never saw that video, but your analysis is spot-on.


When Steve went on stage and showed it off, the minds of everyone in the audience went into overdrive to start imagining all the possible things they could do with it.

Interesting that it wasn't the first time an Apple CEO went on stage and showed off a pocket tablet, and the minds of everyone in the audience went into overdrive to start imagining all the possible things they could do with it.

What the Newton, in 1993, revolutionized was user experience. For the first time, someone invented a device that was easy to understand, fun to use, and seemingly limitless in capability. Except it wasn't limitless. The public mind was thrilled with the concept, embraced the wondrous new platform, and got...egg freckles? http://fortunebrainstormtech.files.wordpress.com/2011/10/db9... The OS, hardware, UI, UX, all were revolutionary - and some revolutions fail. Steve killed it promptly upon his return, and I imagine nonetheless studied it a great deal, taking some 13 years between unveilings to get it right, making sure the engaged public didn't overwhelm the second attempt.


That is a fair assessment. I think the part that can really be attributed to Apple is the hardware accelerated, fluid, instantaneous and very polished interface with touch gestures. It's not any one UI component or gesture in particular but the beautiful integration of all of them together.

The other thing that iPhone had going for it was the iTunes integration that would save you from having to buy an iPod. This brought a lot of monetary value.

For the rest, I remember during the first iPhone reveal thinking it was a more polished, better funded version of the openMoko phone. ( https://www.google.ca/search?q=openmoko&safe=off&source=lnms... )

openMoko was an open-source, volunteer-made project. One thing openMoko had going for it was its plan for promoting third-party native apps from the get-go, whereas Apple blocked them for almost a year.

From a 2006 article a few months prior to the iPhone reveal, here is what we got about openMoko:

"The Neo1973 is based on a Samsung S3C2410 SoC (system-on-chip) application processor, powered by an ARM9 core. It will have 128MB of RAM, and 64MB of flash, along with an upgradable 64MB MicroSD card.

Typical of Chinese phone designs, the Neo1973 sports a touchscreen, rather than a keypad -- in this case, an ultra-high resolution 2.8-inch VGA (640 x 480) touchscreen. "Maps look stunning on this screen," Moss-Pultz said.

The phone features an A-GPS (assisted GPS) receiver module connected to the application processor via a pair of UARTs. The commercial module has a closed design, but the API is apparently open.

Similarly, the phone's quad-band GSM/GPRS module, built by FIC, runs the proprietary Nucleus OS on a Texas Instruments baseband powered by an ARM7 core. It communicates with Linux over a serial port, using standard "AT" modem commands.

The Neo1973 will charge when connected to a PC via USB. It will also support USB network emulation, and will be capable of routing a connected PC to the Internet, via its GPRS data connection. [...] Moss-Pultz adds, "Applications are the ringtones of the future." [...] As for additional software components, Moss-Pultz admits, "Quite a lot is there, and quite a lot is not there. We're hoping to change this." In addition to a dialer, phonebook, media player, and application manager, the stack will likely include the Minimo browser [...] He adds, "Mobile phones are the PCs of the 21st century, in terms of processing power and broadband network access. "


I think you may be underestimating the impact of the touch screen. It wasn't just that it was touch; it was that the screen was the phone. I can't find it, but I remember reading industry reactions that were incredulous that it was even possible to have enough power to drive the screen for significant periods. That large touch screen really set the iPhone apart.


You're probably thinking of a RIM employee who told a story about how the engineers thought Apple was lying about the iPhone because the large screen was too power-hungry:

The iPhone "couldn't do what [Apple was] demonstrating without an insanely power hungry processor, it must have terrible battery life," Shacknews poster Kentor heard from his former colleagues of the time. "Imagine their surprise [at RIM] when they disassembled an iPhone for the first time and found that the phone was battery with a tiny logic board strapped to it."

http://www.electronista.com/articles/10/12/27/rim.thought.ap...


Indeed I was. You found it a minute before me.


Seems doubtful that people would be incredulous over driving the screen; capacitive touchpads of the time didn't take much power. Color iPAQs from 8 years earlier managed a few hours with a CCFL backlight--white LEDs weren't widely available back then, and lithium-ion batteries of 1999 sucked.

If iPhone guts powering a Windows phone with a resistive screen had come out before the iPhone, it wouldn't have sold like crazy. Windows phone software had a start menu... the fastest text input was drawing misunderstood letters with a stylus, resistive screens had to be manually calibrated and were very inaccurate unless you used a stylus; iPhone-like typing just didn't work.


I think you meant 'performance' instead of 'capability'.

As far as 'performance', Apple has never been one to put performance numbers up as a selling point. Macs have never been marketed as an X (M|G)hz machine with Y (M|G)B RAM. Those were details Steve felt should be abstracted away from the user. So, I'd agree with you that the success of iPhone has never really been driven by 'hardware performance', as that is the popular habit of Apple.

That said, it remains that the iPhone is still a fairly powerful smartphone, as they come, and the author seems to be arguing (rightfully so, IMO) that Apple is cashing in on that fact by leveraging it for the aesthetic design of their software.

Arguably, all mobile OSs have to be built with some amount of mindfulness to the performance of the lowest common denominator device. Android's could be pretty low, whereas Apple's lowest common denominator device for iOS 7 still has moderately powerful specs.


> Macs have never been marketed as an X (M|G)hz machine with Y (M|G)B RAM.

Not quite true, I don't think. Around the late 90s, early 00s there was a lot of marketing around the high Mhz of the Mac CPU compared with PCs - and arguments about the merits of RISC vs CISC. That was back in the Mhz wars when Mhz was all most people cared about and paid attention to.


> Not quite true, I don't think. Around the late 90s, early 00s there was a lot of marketing around the high Mhz of the Mac CPU compared with PCs

Funnily enough, that exactly corresponds to Apple's dark period.


> there was a lot of marketing around the high Mhz of the Mac CPU compared with PCs

Em, no. At a time when Apple used PowerPC instead of Intel processors, Apple wanted to de-emphasize Megahertzes because the PowerPC chips had lower clock frequencies than the Intel chips of the day. Instead, Apple would advertise the PowerMac as being twice as fast as PC workstations, and it would speak of megaflops and ‘the first desktop supercomputer’. See also ‘The Megahertz Myth’[1].

So yes, Apple did compare the performance of their pro models (PowerMac, PowerBook) with PC counterparts, but they didn’t use clock frequency or memory speed to make their point.

[1] http://en.wikipedia.org/wiki/Megahertz_myth


>This needs a dose of cynicism I think. The success of the iPhone in 2007 really wasn't driven by hardware capability (except arguably the capacitive touch screen). Even then, other phones had faster processors, more memory, and equivalent GPUs.

No, they really didn't. Any citation?


Nokia N95. First available March 2007, three months before the iPhone was introduced.

  * Memory - N95: 160MB, versus iPhone: 128MB
  * CPU - N95: dual CPU, 332 MHz Texas Instruments OMAP 2420 (ARM11-based), versus
    iPhone: 412 MHz Samsung 32-bit RISC ARM 1176JZ(F)-S v1.0
  * Display - N95: 240x320 24-bit, 2.6", versus iPhone: 320x480 18-bit, 3.5"
  * Network ability - N95: HSDPA 3.5G, versus iPhone: GSM/EDGE
  * Camera - N95: 5MP, versus iPhone: 2MP
Much as the iPhone was lauded - one of the criticisms leveled at the iPhone was that the hardware was quite underwhelming. My specs were lifted from the respective Wikipedia pages, but this is hardly a subjective or minority opinion.


So, on par or nearly on par specs on most counts (except, of course, 3G) but vastly superior specs on some counts? People criticised the first iPhone for its lack of 3G. That much is true.

But its horsepower? That would have been an esoteric opinion at the time. Legend has it that Nokia was so astonished by the iPhone, they got their high speed cameras out to see whether the phone was really doing 60fps scrolling.

The first iPhone was a towering technical achievement (except for the lack of 3G). It was a phone that made all the right compromises at exactly the right time.

I think you would have been hard pressed to find anyone disputing that in 2007. Sure, people doubted whether it would really be useful without the ability to install third party apps, people doubted whether Edge would be enough, people bemoaned missing features – all good reasons to not want the original iPhone – but the technology was without a doubt great.

And all you are engaging in is revisionist history.

(I’m writing this doubting that iOS7 will be any of that. It’s a train-wreck on the iPad currently. Yeah, it’s just a preview but an OS is so complex, I don’t see how it’s possible to turn this mess into a coherent whole in the few months they have left. No matter how many hours they pour into it. I’m pessimistic.)


The Nokia N95 was a top of the line phone, released a couple months earlier. "Other phones", as in 95% of them, didn't have those specs.

And even then, the specs didn't make much difference. It was just another phone, bulky, with a small, dark screen, clunky buttons and a shitty interface: http://www.youtube.com/watch?feature=player_detailpage&v=HwT...


"The Nokia N95 was a top of the line phone"

Whereas, of course, the iPhone was brought out as a budget, bottom line phone, marketed at low end consumers, given away by carriers...

Only in the RDF could a phone that was 15g lighter (120 vs 135), and smaller (99x53x21mm vs 115x61x12) be described as "bulky".

Of course, "bulky" will be redefined shortly to mean "less deep, which as everyone knows, is the only metric that matters" - just like the iPhone's aspect ratio was perfect - until the iPhone 5 came out, of course.


>Whereas, of course, the iPhone was brought out as a budget, bottom line phone, marketed at low end consumers, given away by carriers...

Whereas, of course, the Nokia was 40% more expensive -- at $699.

Not to mention that, hw specs aside, it was so last decade, and the inefficient BS OS it had made it dog slow, unintuitive and unfit for the intertubes.

>Of course, "bulky" will be redefined shortly to mean "less deep, which as everyone knows, is the only metric that matters" - just like the iPhone's aspect ratio was perfect - until the iPhone 5 came out, of course.

Err, what? Bulk is about volume. Unless you have 2D pockets, the Nokia being smaller in the other 2 dimensions doesn't mean shit. Just look at the thing in the video:

http://www.youtube.com/watch?feature=player_detailpage&v=HwT...

Also look at the UI. Is this supposed to be a joke?

http://www.youtube.com/watch?v=2GmLchZN1Dc

So, yes, it had a better camera (since Nokia mostly slapped on a huge-ass compact camera lens on it), and it had 3G (since Nokia could not care less about battery life with 3G on).


heh. slow down turbo. nobody's arguing that the execution on the iphone wasn't completely awesome.

that being said, i find it funny that you flippantly discard the number of phones with better specs in one breath, and then acknowledge them in the next. (despite them having shitty implementations)

i am totally entertained picturing you as a slavering apple fanboy raging at a hacker news thread. makes me giggle.

m3mnoch.

p.s. btw, using your number and a gsma.com estimated 500 new phone models per year, 5% of them that "have those specs" amounts to 25 phone models that had better specs than the iphone. just sayin'.


My impression is that other 2007 phones were wasting their hardware on 1990s style UIs because their OSes were designed for <100 MHz processors. The iPhone actually used its performance.


    iOS 7 was clearly designed to show off what’s possible in 2013.
I absolutely agree with this point.

And, while I disagree with the post's main thesis — I believe the changes were in no way directed towards the web — it's a pretty entertaining take on the updates, OP.


I'd rephrase that as ios7 was clearly made to catch up with android 4 and holo, but maybe that's just me.

there's nothing revolutionary in ios7 shown so far. that said, I can't wait for Apple to sue Android over the things they just stole from them.


I haven't used Android since it first came out, but does it have the dynamic physics, layering, dynamic image compositing (blur, etc), parallax, etc, that all are in iOS 7? iOS 7 seems to me to be very much about how it feels and behaves in motion. The current beta still isn't performance tuned, so my guess is the final build is going to be a pretty futuristic experience on the iPhone 5S.

I think a lot of people are getting hung up on lack of textures/lighting as "copying Android" but I would love to hear how Metro and Android are comparable on the things that are going to define iOS 7's experience: motion, dynamics, translucency, layering, and depth.


I agree, but it's funny. iOS 7 is all about dynamic, physical interfaces - so much so that they built a physics engine into UIKit. At the same time, they're ripping apart the user experience so that nothing onscreen _looks_ like a physical object.

Before, interfaces looked like a bunch of real-world objects. Now, they look like a bunch of thin-ass lines and boxes, but behave like real-world objects.

I agree that iOS is all about motion, dynamics, translucency, layering and depth. But the verdict is still out on whether those things make for better interfaces.
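For anyone curious what "a physics engine into UIKit" looks like from the API side, here's a rough sketch using the UIKit Dynamics classes (UIDynamicAnimator, UIGravityBehavior, UICollisionBehavior); the view controller and the `box` view are invented for illustration.

  #import <UIKit/UIKit.h>

  @interface BouncyViewController : UIViewController
  @property (nonatomic, strong) UIDynamicAnimator *animator;
  @property (nonatomic, strong) UIView *box;
  @end

  @implementation BouncyViewController

  - (void)viewDidLoad {
      [super viewDidLoad];

      self.box = [[UIView alloc] initWithFrame:CGRectMake(100, 0, 80, 80)];
      self.box.backgroundColor = [UIColor darkGrayColor];
      [self.view addSubview:self.box];

      // The animator owns the simulation; behaviors are attached to views.
      self.animator = [[UIDynamicAnimator alloc] initWithReferenceView:self.view];

      // Gravity pulls the box down; the collision behavior turns the reference
      // view's bounds into walls, so the box bounces instead of falling off screen.
      UIGravityBehavior *gravity = [[UIGravityBehavior alloc] initWithItems:@[self.box]];
      UICollisionBehavior *collision = [[UICollisionBehavior alloc] initWithItems:@[self.box]];
      collision.translatesReferenceBoundsIntoBoundary = YES;

      [self.animator addBehavior:gravity];
      [self.animator addBehavior:collision];
  }

  @end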


Motion, dynamics, and depth are a hallmark of the Windows Phone Metro experience.


Windows phone metro doesn't include a physics engine. The motions are all tweened.

Windows phone metro includes no depth at all, it truly is flat and 2D spatial.

Dynamics?


I don't know (or frankly care) how it is implemented. While the UI is flat, the apps use transitions and a parallax effect to build the illusion of the depth (similar to how Apple does it using their gyroscopic parallax effect on the home screen).


There is no depth or layering in metro, not even small drop shadows and definitely not layered application screens. The panorama metaphor is panning left or right with some disjoint movement of the background (not really parallax, very fake) and cutting off lots of text...

Let's wait until we can judge them side by side. I have a windows phone device, but no apps (my market doesn't really have anything decent), so I'm not really sure if the metaphors have evolved much from the panoramas.


All of the new features are just 'feature libraries' on top of CoreAnimation, which has been there since 1.0. You could have made a demo of all of this that ran (slower) on iOS 2.0 with just CoreAnimation and the accelerometer hardware API as a 3rd party developer. CoreAnimation is the impressive part.
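As a rough illustration of what that would have looked like for a third-party developer, here's a sketch pairing the old UIAccelerometer API (available since iPhone OS 2.0) with a Core Animation-backed view; the class is invented, and the real iOS 7 parallax is of course more involved.

  #import <UIKit/UIKit.h>
  #import <QuartzCore/QuartzCore.h>

  @interface TiltView : UIView <UIAccelerometerDelegate>
  @end

  @implementation TiltView

  - (void)didMoveToWindow {
      [super didMoveToWindow];
      // 2.0-era accelerometer API (since superseded by Core Motion).
      UIAccelerometer *accel = [UIAccelerometer sharedAccelerometer];
      accel.updateInterval = 1.0 / 60.0;  // 60 Hz updates
      accel.delegate = self;
  }

  - (void)accelerometer:(UIAccelerometer *)accelerometer
          didAccelerate:(UIAcceleration *)acceleration {
      // Map device tilt to a small offset from the superview's center;
      // the GPU-composited layer keeps the motion smooth.
      CGFloat dx = acceleration.x * 20.0;
      CGFloat dy = acceleration.y * 20.0;
      self.layer.position = CGPointMake(CGRectGetMidX(self.superview.bounds) + dx,
                                        CGRectGetMidY(self.superview.bounds) + dy);
  }

  @end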


Well I've had a parallax (sic? I don't think I've spelt that right) Live Wallpaper for some time now. Android lends itself to allowing other developers to innovate without waiting for the OS to introduce those "features". There are also other Launchers that can provide the iOS 7 graphical embellishments while other Launchers have provided even more 3D enhancements. I've used those before, but I'm running a fairly stock UI out of personal choice. And that I think is still why Android gives me the best platform -- choice. There are a variety of device manufacturers and I can replace whole system components, but at the end of the day, while my particular device is unique to my experience, I can run the same software as another device that looks, and in some cases feels, completely different than mine.


In your estimation, what did iOS "steal" from Android?

For example, a common iOS 7 feature cited as "stolen" is the app switching interface. But that's the same interface Safari has used for switching tabs since the very first iPhone. They simply repurposed their Safari UI for the OS.


I make my living writing iOS apps, but come on - there's plenty here that has been "borrowed". Off the top of my head:

- The insta-airplane-mode-wifi-bluetooth buttons in iOS7. These are a straight lift from Android, and have been hotly demanded by users for a long time.

- Notification Center is an incredibly uncanny look-and-feel-alike of notifications in Android. It's basically an outright clone - though this isn't an iOS7 development.

- The new app switcher is a takeoff from webOS. Swipe to close an app is also implemented almost precisely like it was on webOS. Claims that this came from Android are IMO off the mark. I don't think it's fair to say they repurposed the Safari tab-switcher, considering the gestures bear such an uncanny resemblance to the webOS implementation.


"hotly demanded by users for a long time."

I never need to switch to airplane mode or toggle Bluetooth on/off in a hurry. I wonder who these people are that find the settings slider for airplane mode so time consuming.


People who use Bluetooth regularly find themselves having to turn it on and off for the sake of battery life. It's great to be able to do so without diving through multiple menus.

It's also not just airplane mode - do-not-disturb mode is also really useful, and it should be fairly obvious why someone would want to toggle it without menu-diving.

Ditto orientation lock. In iOS 6 it's tap-tap-swipe-tap-tap; now it's just swipe-tap. This last one is kind of a platform problem though - orientation lock is useful in part because some apps excel at landscape mode, while others suck at it but insist on enabling it.


I'm a bit baffled by this myself. I'm on the iOS 7 beta, and it's far closer to WP7 than Android. (Which is awesome -- I loved WP7, just wished it had apps.)


I think you could argue the control center is borrowed from Android (although the iOS 7 version looks kinda spiffier). The interface for moving between apps is similar. I had to open Safari to see your comparison. The demo I see on the product page for iOS7 reminds me more of Android than Safari. I remember seeing a few other things in the WWDC presentation but I don't remember what they were.

All that said, I hope both sides "steal" good ideas early and often.


Catch up? People like you really have no idea how far behind Android is as an OS. It's only very recently that it's gotten close to 60fps scrolling, low audio latency and decent power management.

One example of the advancement in iOS 7: how about taking their animation framework a step further (it's already ahead of what's available on Android) and introducing a full rigid-body physics engine that's easily at developers' disposal?

What Apple has always done is provide good developer support in the form of great APIs.


There is some "flat" design that is similar to Android, but the similarities are fairly superficial and not what the article was addressing.


And it is not even flat. Apple talks about three things regarding the new design: depth, deference and clarity. Nothing about flatness, on the contrary, it's about establishing visual hierarchy via layers. Also, an even bigger part will be not the static appearance but the subtle motions when interacting, thanks to UIKit Dynamics.


> Nothing about flatness, on the contrary, it's about establishing visual hierarchy via layers.

This is exactly what the Android design is about as well.


>I'd rephrase that as ios7 was clearly made to catch up with android 4 and holo, but maybe that's just me.

Catch up? Actually it was more of a "let's go another 100 miles ahead, Android seems to be finally catching up to where we've been the last 3 years".


To quote you: "Citation"?

What iOS features have been present for three years that Android is just now getting?

Let's be clear too, three years ago, iOS was at version 3.1.3.

I would love to be enlightened, as an iPhone 5/iPad 4/rMBP owner.


60fps scrolling? A semi-decent animation framework for developers? Low audio latency and good power management? Pretty much all the advantages Apple has from their decade of experience of developing OS X frameworks?


> 60fps scrolling?

It's pretty easy to make UITableViews chug if you do something stupid when populating cells as well.

> A semi-decent animation framework for developers?

The property animation system introduced in Honeycomb and backported to what, 2.2? works very nicely.

> Low audio latency

Audio has been an issue on Android. I've not delved into it myself but this is certainly something that only seems recently fixed. Not sure this warrants calling the platform far behind though.

> and good power management?

I'm not sure what you're getting at here.


There is a property animation system on Android, but it's woefully lacking when compared to iOS. IIRC it doesn't even support 2.5D transform with a Z axis and variable camera distance (which is useful for perspective transform). On iOS 7 they now have a full rigid-body physics engine built in and comes with a nice high level API for developers to take advantage of.
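For reference, the "variable camera distance" bit is a single field on Core Animation's transform struct. A tiny illustrative sketch (the function name is mine):

  #import <QuartzCore/QuartzCore.h>

  // Tilt a layer around the Y axis with perspective. m34 acts as -1/zDistance,
  // so a smaller camera distance gives stronger foreshortening.
  static void applyPerspectiveTilt(CALayer *layer, CGFloat angleRadians, CGFloat cameraDistance) {
      CATransform3D t = CATransform3DIdentity;
      t.m34 = -1.0 / cameraDistance;
      t = CATransform3DRotate(t, angleRadians, 0.0, 1.0, 0.0);
      layer.transform = t;
  }

Something like applyPerspectiveTilt(view.layer, M_PI / 8, 500.0) is all it takes on iOS.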


"On iOS 7 they now have a full rigid-body physics engine built in and comes with a nice high level API for developers to take advantage of."

That's great, but the original comment I was discussing was the claim that "Android is just now getting what iOS had three years ago". iOS 7 is three to six months away, still.


It's pretty easy to make UITableViews chug if you do something stupid when populating cells as well.

Only the chuginess comes standard with Android.

The property animation system introduced in Honeycomb and backported to what, 2.2? works very nicely.

It's not really close to CA.

>Not sure this warrants calling the platform far behind though.

If you casually dismiss everything else too, of course not.

>I'm not sure what you're getting at here.

Like Android apps allowed to run rampant in the bg, draining the batteries pronto -- a complaint you hear all the time, and of which you can also find some measurements.


> Only the chuginess comes standard with Android.

Sounds to me like you're trolling. This is not the case unless one is using a low end device.

> If you casually dismiss everything else too, of course not.

Pot, meet kettle.

> Like Android apps allowed to run rampant in the bg, draining the batteries pronto -- a complaint you hear all the time, and of which you can also find some measurements.

I honestly think you're just making things up at this point? Have you even used an Android device?

Yes, LTE was a big drain when it first came out. Generally Android devices handle power management very nicely.

At this point you just sound like a combative troll.


This troll word. Do you use it with anything you find too uncomfortable to hear?

It's like discussing with a 10-year old. Either respond to what I write, ask me a question, ask for clarifications, or provide counter-arguments -- or don't and refrain from this subthread. The "you are a troll" accusation gets old quickly, as I approach forty. I could not fucking care less about going to a forum and making "joke comments". What I write is what I believe to be true, based on what I know.

So.

>Sounds to me like you're trolling. This is not the case unless one is using a low end device.

No, it's very much the case, EVEN on high end devices. And there have been posts from Android engineers on the issue, blaming various stuff, from the stop the world GC to the drawing thread scheduling. Here's one:

https://plus.google.com/100838276097451809262/posts/VDkV9XaJ...

It also links to another Google person, saying that "that's not it, it's because of the extra security Android offers, that has an overhead". Which is funny considering:

http://appleinsider.com/articles/13/05/14/mobile-malware-exp...

>>Like Android apps allowed to run rampant in the bg, draining the batteries pronto -- a complaint you hear all the time, and of which you can also find some measurements.

>I honestly think you're just making things up at this point? Have you even used an Android device?

Yes, I have. A Samsung mid-range one. Also borrowed briefly a high end LG one. Not impressed on both counts.

As for the battery thing, not only is it an issue, but a whole cottage industry has sprung up around it -- with battery-saving apps being among the most popular (LOL):

http://reviews.cnet.com/8301-19736_7-57581440-251/five-andro...

http://lifehacker.com/5990553/betterbatterystats-tells-you-e...

http://www.topapps.net/android/top-10-battery-saver-apps-for...

http://www.techradar.com/news/phone-and-communications/mobil...

I could list similar stuff forever...


> No, it's very much the case, EVEN on high end devices. And there have been posts from Android engineers on the issue, blaming various stuff, from the stop the world GC to the drawing thread scheduling. Here's one:

You're still just being ignorant. This was a post by a former Android Software Engineer in Test intern who didn't know what he was talking about, which is why he was corrected multiple times all over the internet. In your own link he mentions that he was wrong but is leaving the thread up for posterity! It's like you only post things you assume support your pre-determined world view without even reading them. Quote from Andrew on that post:

> BEFORE READING: A LOT OF MY ANALYSIS OF ANDROID PERFORMANCE IS WRONG, HOWEVER I AM LEAVING THIS POST UP BECAUSE OF MY COMMENTARY ON THE ISSUE.

So yes, I think at this point you're just trolling. I have addressed your points - performance is an issue on low end devices, not on high end devices. This is not unexpected. There are lots of battery packs for iPhones, that doesn't mean iPhones have relatively shit battery life, it just means there is a market for extending that battery.


>What iOS features have been present for three years that Android is just now getting?

The basic feature it didn't have -- and I'm not sure it's getting it yet -- is that it's not about features and feature lists at all, but about overall polish, coherence of implementation and ecosystem.

That said, just randomly, some things that it didn't have:

- An actual deployment base for the new versions. Even TODAY (much less 3 years ago) half of the people out there use a 2+ year old Android version. (Google's own mid-2013 data).

- A good security story. Latest study: "Android accounts for 92% of mobile malware, malicious apps increase 614%"

- Tons of top notch apps that are iOS only (or appear on iOS years ahead of Android launch) because devs cannot be bothered with a platform that's too fragmented and with more people using it as a feature phone and not caring about apps. From Instagram, to Paper, to Vine, to Letterpress, Flipboard, and tons besides. (Sure, there are also Android apps that are not on iOS. But the highly celebrated apps, including apps that result in billion dollar businesses like Instagram, are always if not exclusive, then first on the iOS side).

- Low latency audio APIs. Their absence killed the musicians' app market that thrives on iOS. Android started getting 1-2 such apps just this last month, but it's still DOA in the audio apps market.

- A huge ecosystem of add-on peripherals -- from heart monitors to MIDI keyboards to camera tethering, to glucose meters, to external speaker bases.


Thanks Zach. I don't think the changes were directed towards the web either - they were a side effect:

> As a side effect, they’ve embraced conventions that will be hard to emulate with commodity hardware or web tech.


Gauging iOS's progress on aesthetics alone is like a Formula 1 driver on the highway saying "I have no idea where I'm going, but I'm making great time!".

They only recently implemented a sane notifications system and quick-access settings, and they STILL don't have anything comparable to Intents on Android, which I believe is a revolutionary feature that should be core to any mobile OS.


>If we felt really crazy, we’d make simple things like home screens and modal dialogs subtly shift in 3D, real-time, in response to gyroscope input. (To a mobile web developer that sounds like a troll feature request.)

Uh, except that's all completely possible on the modern web, thanks to tools like CSS3 and deviceorientation APIs. In fact, when iOS7 was revealed a few weeks ago, developers raced to duplicate the parallax background feature, and did exactly that in very short order: http://matthewlehner.net/ios-7-style-parallax-background-bas...


If this article is accurate - and that iOS 7 has so much eyecandy that it will only work on the latest high end hardware - does this mean that a low end budget iPhone is off the table? Or is this Apple's way of segmenting a high end version of the OS and then a version with all the eyecandy disabled for the cheap iPhone?

It's interesting that Apple and Google seem to be going in polar opposite directions right now - with Apple going for high end effects, and KLP reportedly focusing on reducing the minimum requirements and letting it run on lower end hardware.


iOS 7 runs fine on most hardware right now, even with beta 2; blur effects aren't exactly that big of a power draw.

Don't believe everything you read in articles before it's released. Best not to form opinions on things that aren't in general availability. Apple isn't going for "high end effects" at all with iOS 7, more fluid effects that overall provide a consistent experience. Can't get too into details due to NDA but it isn't nearly as crazy as this article makes it out to be.

It is impressive what they've changed since Ive took over though, very impressive how much of a change they've made actually. It definitely feels designed all the way through.


I don't care about 3d or blur or other eye candy. I'd happily trade all that crap off for the battery life of my old Nokia 1100.


The 3D and blur aren't what's draining your battery. The big ass LTE radio sucking down videos, music, and photos is what's doing it. That and the huge, brilliant LCD screen staring at you.

Optimizing battery life by getting rid of animations and graphical effects would be extremely penny-wise, pound-foolish.


In time. I think the battery life vs. computing power tradeoff took a turn in 2007 because batteries were finally robust enough to handle a bright screen for a reasonable amount of time.

Ideally I would like a device which has a battery which, when bought, will last the lifetime of the product.


One that recharges itself continuously via radioactive decay? I wonder if, regulatory challenges aside, a radioactive decay trickle charge could be enough of a trickle charge to meaningfully extend daily battery life...


None of what goes into a modern battery is notably radioactive, unless you're suggesting a battery that's similar to an RTG?


I don't remember what RTG stands for, but I'm pretty sure they mean adding radioactive bits to the battery.


RTG: http://en.wikipedia.org/wiki/Radioisotope_thermoelectric_gen...

Basically instead of having a big complicated nuclear reactor, you just throw some radioactive material in a package, let it decay, and generate electricity off the heat. It's mechanically much more simple and has the bonus of not involving critical masses of radioactive material ;)

Not very realistic though. Besides the obvious issues of carrying around lumps of plutonium, they're rather bulky, and disposal is a pretty major problem.


I have a charged external 12 AmpHr battery in my briefcase. It helps a lot.


> The 3D, the blur, the compositing – all of them are disabled or degraded on the [iPhone 4's] A4.

I hadn't heard that before, and it's quite interesting. Does anyone know if previous iOS versions have similar feature segmentation, other than the obvious (and perhaps artificial) ones like Siri and 3d maps?


Besides what others have already mentioned, third party devs (including myself) find ourselves disabling features on the iPhone 4 also.

The important thing to recall is that the iPhone 4 quadrupled the number of pixels you had to render while doing almost nothing for CPU performance. Graphics performance-wise the iPhone 4 was a huge downgrade from the 3GS.

If you're doing any sort of graphics work that's CPU-bound, your iPhone 4 performance is going to be dismal.


Background apps were a pretty big one when iOS 4 came out, and they couldn't run on the 3G. (I actually jailbroke to get the feature, and it was painfully slow.)


It's not unheard of. The camera's panorama feature, taking photos while recording video, some other stuff:

http://support.apple.com/kb/HT5457

Also, some OS X stuff, like AirPlay mirroring or Power Nap require up-to-date hardware:

http://support.apple.com/kb/ht5444


Navigation isn't available on the 4.


Anything that pushed native over web in my mind is a step backwards. Let's think about the challenges that come along with native ios apps:

- Less accessible. Objective C is a strange, low-level, difficult-to-learn, niche language used only for mac development.

- More work. See above explanation of Objective C. You need an IDE to even be remotely efficient, an IDE that Apple also controls.

- More work to port anywhere else. Popular app on iOS and want it on android? Time to rebuild the entire thing from scratch in another language!

- More expensive. Want to sell your app? Better give Apple their 30% cut, in addition to the yearly fee in order to have an app of any sort.

The common theme of these points being that Apple wants control over everything, and has control over everything. But that's a point for a separate rant. My point here is that the web has none of these drawbacks. The web is a uniform interface across devices, the languages are fairly simple and widely used, and the environment is open.

Apple could still take control and sell web apps, but they seem to be very purposely not doing that, and pushing native as hard as they can while occasionally implementing a buggy web app feature here or there. It just makes me sad. We don't need parallax when we move our phones around. We don't need blur effects everywhere. We need an easy, clear, open, and accessible way to develop apps for mobile devices.


I find the web programming model to be very difficult and limiting relative to desktop or mobile UI development. I don't take on any development work in that area because it is so unpleasant to me (JavaScript, HTML/CSS, DOM manipulation, and making that stuff work on multiple browsers reliably). I suspect I am not the only one hoping that someone comes up with an entirely new web development model that is much more like classic desktop UI development, and portable. JavaServer Faces was one attempt I can think of. GWT is another.

The user experience with web apps isn't there yet either, particularly for mobile where you don't have a mouse, keyboard, and screen real estate to burn.

There are of course many other advantages to web-based applications. Good application UX is just not one of those advantages and I think it is the main reason that native continues to rule mobile, not anything Apple or any other mobile OS vendor is doing.


> Less accessible. Objective C is a strange, low-level, difficult-to-learn, niche language used only for mac development.

I don't think you know what a low level language is. Also, Obj-C is quite a breeze to learn.


More expensive. Want to sell your app? Better give Apple their 30% cut, in addition to the yearly fee in order to have an app of any sort.

Can you name an app that charges money that isn't in the App Store? There's plenty of apps in the App Store that are "HTML apps" (built using PhoneGap, Titanium, etc)


Ah, you forget the main Advantage (to Apple)

- native means control.


> - Less accessible. Objective C is a strange, low-level, difficult-to-learn, niche language used only for mac development.

It's not hard for someone that already knows C, or really any other language. The learning curve is in platform knowledge, and every platform -- including the web -- is different.

> - More work. See above explanation of Objective C. You need an IDE to even be remotely efficient, an IDE that Apple also controls.

The IDE is a major win, not a downside. It automates the mundane, and applies equally well to web development -- I use IntelliJ for web work.

> - More work to port anywhere else. Popular app on iOS and want it on android? Time to rebuild the entire thing from scratch in another language!

That's unfortunate, but it provides the best user experience.

> - More expensive. Want to sell your app? Better give Apple their 30% cut, in addition to the yearly fee in order to have an app of any sort.

On the other hand, Apple has a mountain of customers' credit cards on file. The 30% buys you a lot.

> My point here is that the web has none of these drawbacks. The web is a uniform interface across devices, the languages are fairly simple and widely used, and the environment is open.

The web has a massive pile of its own drawbacks -- it's a language monoculture, low performance, no platform to speak of, much less platform consistency. Development is a hodgepodge of different interdependent tools, server-side and client-side. None of your users actually own anything, and webapps can just disappear at any time.

Neither proprietary native platforms NOR the web are perfect, and both are imperfect in very different ways.


I have never used Objective C, although I use C quite a bit. JavaScript is much higher level than C, thus you can code faster with fewer bugs in JavaScript. Of course it all depends on whether you need low-level performance.


Objective-C is about on the same level as Java.


Well... No. And I say this as someone who writes Obj-C all day and actually likes Obj-C.

Obj-C is a relatively thin layer around C, it's nowhere near Java's level of abstraction.

For one thing, at least on iOS, there is no actual garbage collector. ARC is a static analyzer that injects manual memory management code into your app - it does not collect garbage at runtime.

Even with ARC you are still managing your own memory - it is not possible to write a complex iOS app without understanding manual memory management and what the different reference types do in this regard. This is in contrast with Java where you really don't have to understand memory at all to write non-leaky code.
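A toy example of what "still managing your own memory" means in practice -- ARC will happily retain a reference cycle forever unless you mark one side weak (the classes are invented for illustration):

  #import <Foundation/Foundation.h>

  @class Child;

  @interface Parent : NSObject
  @property (nonatomic, strong) Child *child;
  @end

  @interface Child : NSObject
  // If this were strong, Parent -> Child -> Parent would form a retain cycle
  // and neither object would ever be deallocated under ARC.
  @property (nonatomic, weak) Parent *parent;
  @end

  @implementation Parent
  @end

  @implementation Child
  @end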

You're also allowed to (though not encouraged to) do low-level hackery like pointer arithmetic. There is nothing stopping you from reaching down and fiddling with raw memory addresses and values. This is in sharp contrast with Java and other high-level languages where you're very explicitly forbidden from doing anything that might blow your foot off.

The runtime itself is also really thin compared to the JVM. Hell, frequently the first you know something is wrong is when your program segfaults. Try that with Java.

It's not Obj-C itself, but on OSX/iOS, as soon as you get into any of the deeper libraries you're in C-land. Core Graphics is exclusively a C API, as are many of the other APIs like calendaring and address book. You really can't go deep into the platform without touching raw C.

Anyways, Obj-C isn't on the same abstraction plane as Java, not even close.


It was a generalized statement for people who want to begin to learn it. I personally find that Obj-C offers a very similar level of abstraction for beginners as Java does (especially with ARC, and not everyone has to deal with CoreAudio, AddressBook and other C libraries or integrate C and C++ code into their app).

I'm a professional iOS dev as well and everything you said was correct, so no arguing there.


Good point. Hadn't really thought of it this way before. I guess as a mobile web dev I'm going to be always behind the curve here. Something interesting to think about.


Will it be true that Apple still has enough weight to cause ripple effects in design paradigms completely unrelated to a mobile device? I am guessing so, but mostly because a lot of designers are coupled strongly to Apple ideology and it affects their designs directly.

This is of course painful as someone outside of this bubble - watching designs that conform to a device and a way of thinking instead of living and breathing in their own right.


Well, at least this design is a lot closer to current Android and WP8. When developers designed specifically for iOS and then ported to Android as an afterthought, the results often looked completely out of place.

I'm hopeful that it will be easier for clever designers to make apps that look very similar on all three platforms, and also look at home on all three platforms.


Good point. That may not actually be too bad. I would hate to see blur and super-thin lines propagating through all designs, but those would hopefully be minor details if the apps themselves are more similar.


From my brief, albeit limited, time running iOS 7 beta 2 on my 3rd gen iPad, I have to say that I'm thoroughly underwhelmed. Bugginess aside, it just feels like a significant step backwards in terms of intuitiveness and usability.

It's as if Apple (obviously) had to do something and prioritized jumping on the flat UI bandwagon over a well thought-out / polished experience.


This is assuming that everyone is chasing the same level of perfection as apple. If the web can provide a "good enough" interface will the bulk of developers go to the extra effort of native apps?

Currently things like games are (largely) not good enough on mobile web and so the issue is forced, but building a native app for a nicer blur effect? Probably not.


I'd argue that the bulk of developers are already doing native. The thing is, extra effort is required to replicate with web technologies even simple things that native SDKs offer. If everything you know is HTML, JS, CSS then yes, learning the iOS or Android SDK will require some effort, but it is worth it. Even the best webapps still feel subpar on iOS 6; on iOS 7 they will feel very clumsy.


Nice to see a non-negative analysis of iOS7. Not having tested it myself, I hadn't realized that iOS7 was actually pushing the boundaries of what's possible with modern hardware. My previous impression was that, if anything, it was a simplification of sorts and that seemed a dubious direction. Thanks for pointing this out!


Apple haven't done anything with iOS 7 that I can't do with MOAI. (http://getmoai.com/) By taking this direction, they've given me even more reason to ignore their Native GUI frameworks and continue to focus on implementing a clean, usable GUI using non-native (read: 100% cross platform) technologies.

I think this is a good thing, personally. I don't see any reason to write code specifically for iOS any more, when I can, just as easily, declare a similar GUI function in a language that will work on all platforms.


iOS 7 implementing "troll feature requests" for web developers had me chuckling. For the most part, I'm finding iOS 7 to be great.


Apple doesn't need to introduce delicious eye-candy to its interfaces that nobody needs to hamper web developers...they can just keep neutering the shit out of WebView by keeping it several generations behind the Safari javascript engine.


Bing! Give the man a prize! I totally agree.


Another cynical thought that comes to mind: this won't only help with staying ahead of competitors, but perhaps also with persuading owners of older iPhones to get a newer device, even though they were fully satisfied up to this point.


Why is blur supposed to be computationally intensive? It's very simple mathematically...


It's not the computation itself, it's the GPU memory bandwidth it takes up. (Assuming you're doing it in real-time of course.)


If you blur an MxM area with an NxN kernel you have approximately MxMxNxN multiply-adds. That can add up to a lot, especially on a retina display. You are right that it is simple mathematically though :)
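To put rough, illustrative numbers on that (assuming a naive, non-separable blur): a full-screen blur on a 640x1136 retina display with a 15x15 kernel is about 640 x 1136 x 225, roughly 164 million multiply-adds per frame, or nearly 10 billion per second at 60fps. A separable blur cuts the per-pixel cost from NxN to 2N, but it's still far from free.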


Shaders.


Most certainly you want to use a shader, but that doesn't fix all your problems. You have to push a (very large) texture across the bus. The GPU will chew on it and spit out the result, which you then have to pull back across the bus into main memory so that it plays nicely with other UI elements that are CPU-bound and not cached in VRAM.

So you've removed the processing bottleneck in exchange for a bus bandwidth bottleneck :(


are you sure it has to be pulled back to main memory?


Yeah, that doesn't make sense to me either. Especially since the entire point of the recent changes to iOS is precisely to get everything on-screen handled by the GPU in the first place.


As is factoring large numbers :)


Actually, in a naive implementation it is very computationally intensive, especially with large convolution kernels (like on iOS 7, where the blur radius appears huge) and high resolutions.

Even optimized implementations on GPU are still computationally intensive compared to alpha blending.


What's funny here is the idea that somehow commodity SoCs will not soon be catching up and providing the same hardware benefit to all the competitors.

Hiding behind a hardware wall is not a safe place to be these days.


"Try copying this assholes"

Seriously? I can't get over the fact that there are iOS fanboys out there still with their heads so far in the sand.

Along with more characterizations of web development that are a decade out of date... sigh. (And anyone who doesn't understand what I mean should watch the Shadow DOM and Polymers presentations from this year's Google IO. They really need to put the Sandbox on a public url for a quick impressive demo...)


  > I can't get over the fact that there are iOS fanboys out there still
  > with their heads so far in the sand.
Those iOS fanboys are just knowledgeable enough. Why don't you spend some time and educate yourself on a couple of things. First, where the first usage of the cornerstones of modern web tech appeared in the wild: I mean <canvas>, CSS transitions and animations. You may be surprised. Second, learn a bit about what native SDKs offer. Once you know the state of "modern web" and the state of modern mobile OSes SDKs you may be less tempted to try to compare them. Trying to portray web stack as superior is laughable and the person making such claim probably has his head in the darker place than sand.


> Trying to portray web stack as superior is laughable and the person making such claim probably has his head in the darker place than sand.

No such claim was made. He simply pointed out (correctly) that the mobile web is in much better shape than it is being given credit for. In that, I agree.


Ah, yes. Assume I'm ignorant about the technology you claim is superior (the exact same thing happened the last time I tried to point out that the web has advanced a bit in the last 10 years). I didn't even try to make any claim about the web being better than native, but for your own knowledge, I have experience with native SDKs from win32, to cocoa, to android and iOS.

>Trying to portray web stack as superior is laughable

In what regard? Surely not in ease of development? Ease of learning? Ease/portability/availability of tools? Portability of the end app itself? Available developer pool? Open source code to look at and use? I can't imagine a way that any native platform beats the web in any of those categories.

Or just in terms of making a fancy animation and having reusable UI elements? You're probably right, but oh wait, if only you'd investigated Shadow DOM and Polymers like I recommended you'd see that web technologies are rapidly advancing on that front as well.


> Surely not in ease of development?

Nope. Building a complex application in JS+HTML+CSS is still an enormously painful undertaking.

> Ease of learning?

Also nope. Complex applications are still complex.

> Ease/portability/availability of tools?

Compared to the development tools for other platforms (IDEs, profilers, etc), the web tools are far, far, far more limited.

> Portability of the end app itself?

Portability is a cost-cutting measure, but never a user-appreciated feature. The web would need to stand alone as a first-tier platform for this to be a net gain; as it is, a web-only experience is subpar.

> Available developer pool?

Quality engineers capable of producing first-class applications for your users still cost a lot and are rare on the ground.

> Open source code to look at and use?

This is not even remotely unique to the web.

> You're probably right, but oh wait, if only you'd investigated Shadow DOM and Polymers like I recommended you'd see that web technologies are rapidly advancing on that front as well.

Catching up to 1990s application architecture is not really a cause to celebrate. The technology stack that drives desktop and mobile devices is massive. We don't build custom interfaces directly on top of OpenGL and call it a day.


Hmm, I suspect that the comparison is relative, whereas many of your comments are absolute. E.g. quality engineers are certainly hard to find, but are good JavaScript developers _harder_ or _easier_ to find than iOS developers? I don't really know, but I suspect that good JavaScript developers are easier to find. Also, I think web development tools are pretty much on par with Xcode.


Due to the degree that each of the major mobile players "cross-pollinates" or "copies" these days, why are we even still talking about who is copying who? Who cares? Looking at these new iOS demos, I see a lot of things that were first done elsewhere. But to most people (outside of our ivory tower), that doesn't matter, and it shouldn't.

But Apple shouldn't be faulted for some of these "borrowed" elements in iOS 7. Google does the same thing, Microsoft does the same thing, and so does everyone else. In fact, I expect them to copy good things, because I as a consumer want the best for my money. Stay on your own stubborn, outdated path and end up like RIM.


Right? I feel like for the first 4-5 years it was whining that Android stole everything from Apple (arguable) and then after that it was whining about iOS leaching Android features.

But now... who cares? Or has the effort to care? Look at both of the latest platform releases. It's new APIs. It's new services. It's new ways of giving them more data to improve their services. They're working on UI and UX issues and small features.

We've reached such feature parity - I honestly believe a choice between iOS/Android/WP8/RIM is going to give you a good, fairly polished experience. (The biggest differentiator left isn't bullshit gradients and transparencies; if anything it's personal preferences or App availability [I need my Google Voice!])


> We've reached such feature parity - I honestly believe a choice between iOS/Android/WP8/RIM is going to give you a good, fairly polished experience.

This is exactly right. Just like arguing over the pronunciation of "pecan", or which text editor is the "best", you're probably not going to convince anyone with an existing opinion on the "best" mobile OS. Who cares?


"In short, we’d replace 2007′s sliding textures with motion, dimension, and physics."

I'm pretty sure Steve Jobs would be rolling in his grave if the UI was that busy. He was into sleek.


> Steve Jobs would be rolling in his grave

Please, let’s not play that game.


Yeah, I am an iPhone owner and prefer it to Android, but this article is just asinine.

First of all, you can absolutely do anything iOS does on the web; it's just a question of how much effort you're willing to put in. There's not a library already in existence that you can use easily which perfectly mimics iOS 7, but if there were then iOS 7 would need more work. But there will be web libraries that allow you to mimic iOS 7.

The whole concept of "X platform can't do this" is so wrong headed to begin with. When it comes to software, the first thing anyone should learn is that anything can do just about anything. The limitation is hardware.


Explain how you are going to animate blurred glass at 60 FPS if Safari doesn't give you a fast path to the GPU?

I don't understand your comment. Are you just saying that both Javascript and Objective-C are Turing complete? If that is the point you are making, well that isn't really debatable.


http://www.theverge.com/2013/6/11/4418188/apple-ios-7-design...

I'm baffled why all the Apple people are excited about features I disable in Windows because they are obnoxious and ugly.


You really think you need direct GPU access for a simple blur effect?

There's already plenty of ways to accomplish blur effects with CSS and JS. Do a little research.


Yes, you can build Call of Duty with canvas and JavaScript. No, you shouldn't. A claim that faster hardware is a solution is circular: to get native performance today, faster hardware is needed. At that point, native performance on that hardware becomes great and theoretically is doing more amazing things than it did on the old hardware. In other words HTML-based apps always play catch up, unless native can't use the hardware anymore, or the interpreted bottleneck is reduced.


Of course mobile web apps always run slower than native, but that wasn't the issue here. The claim was that web apps CANNOT do what iOS 7 does and that's just horseshit.


You may be surprised, but the iPhone 5, or even more so the Galaxy S4, are capable of much more than that crap. Like full 3D with bump mapping, environment mapping and dynamic lighting.

The main problem is that iOS 7 does not evoke a desire to copy it.

Whereas various items of the old iPhone's UI can be found all over the web. That UI style is so routinely copied nobody even notices. Some things from the iPhone have even become some sort of a standard.



