Developers, please take note of the author's statement below...
"Many developers assume that everyone wants their data to be “in the cloud”, but that's actually not true for a lot of my customers. Professional researchers often sign agreements in their children's blood stating that their data will be stored on an encrypted disk, won't leave their laptop, and will be destroyed when the research project is completed. “The cloud” is the last place they want their data to go."
There are so many great note-taking and productivity applications that I just can't use because the majority of my notes are of a confidential nature. If my company-provided MacBook were to be compromised I would not be held liable; however, if my personal Dropbox or Evernote account were compromised I would be held accountable.
I'm very much in the camp of not wanting my data in the cloud. I don't autoupload photos, for example, because I want control over them.
What I would like is a home cloud server which would handle all the services I could get from the cloud with explicit sharing with chosen people (e.g. my family).
I think (hope) that there is a distinction that can arise between "cloud" applications and services and "offsite" or "online" applications.
If you plug together a handful of off the shelf Amazon components, slap a label on it, and open the doors, perhaps that is rightly called a "cloud" service ... the end provider has no accountability to you (or your users) and you have no idea what's going on behind the curtain. It's all just magic happening many layers of abstraction away.
But if you build systems, own the platform, write the architecture and provide something that you understand and have accountability for, end to end, I think it can satisfy the skeptics (of which I am one).
So in this case, the researcher that can't store the documents on dropbox ... hopefully he could upload them with duplicity to an online storage platform that was built and run like this[1].
And I hope that this would be possible because such a distinction could be made ...
SpaceMonkey stores your data on your home device, and then replicates it across other users' devices for redundancy. If you have liability problems using something like Dropbox, SpaceMonkey may not solve them.
It's not only about protection from hacking; it's often also about protection from access by authorities – in particular American authorities. Non-disclosure agreements, data privacy laws and attorney-client privilege are simply not compatible with most hosted services, especially not abroad, where your local law cannot protect your local legal obligations.
I think the people here suggesting things like BitTorrent Sync and ownCloud missed the commenter's request, i.e. a personal cloud that works with all these other services. ownCloud etc. are just clunky implementations of Dropbox that offer less uptime and fewer features. They don't offer slick, out-of-the-box integration with other apps/services.
Synology is doing good things in this area. They offer a "private cloud" backup and file-sharing solution, served from a (linux based) NAS, with client programs for Mac, Windows, Android and iOS. I have a home/small office model and I love it.
1. Even in Seattle, internet response time is erratic. Delays anywhere from a second to a couple of minutes are commonplace. Using an app requiring constant traffic over the internet is quite unpleasant.
2. The backup problem. If my cloud account "goes dark" for whatever reason, I'm dead in the water, and I'm helpless to fix it.
3. I simply won't use a cloud solution that doesn't encrypt the data on my machine before sending it to the cloud server. Encrypting it after it gets to the cloud server is unacceptable. I currently use Jungledisk for backups on Amazon's cloud service because it does encryption locally.
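For anyone who hasn't seen the pattern: encrypting locally means the storage provider only ever handles ciphertext. Here's a minimal sketch of the idea using the Web Crypto API in TypeScript; uploadToStorage is a hypothetical placeholder for whatever transport you use, not Jungledisk's actual mechanism.

    // Sketch: encrypt with AES-GCM before anything leaves the machine,
    // so the storage provider only ever sees ciphertext.
    declare function uploadToStorage(iv: Uint8Array, data: Uint8Array): Promise<void>; // hypothetical transport

    async function encryptAndUpload(plaintext: Uint8Array, key: CryptoKey): Promise<void> {
      // Fresh random IV per upload; GCM requires that an IV never repeat for a given key.
      const iv = crypto.getRandomValues(new Uint8Array(12));
      const ciphertext = await crypto.subtle.encrypt({ name: "AES-GCM", iv }, key, plaintext);
      await uploadToStorage(iv, new Uint8Array(ciphertext));
    }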
> 2. The backup problem. If my cloud account "goes dark"
> for whatever reason, I'm dead in the water, and
> I'm helpless to fix it.
Isn't the cloud part of your back-up strategy? For example, I use OneNote for all of my notes. It syncs the notes on my PC to the cloud. As long as I'm connected, I have a backup online.
Changes also sync to my other computers when they're online.
Additionally, the notebooks are included in the local data backups we make (in our case off-sited to our ISP).
Sensitive stuff is encrypted with TrueCrypt, and the TrueCrypt volume is stored on Dropbox. MI5/CIA could probably break the encryption, but there are easier ways for them to get at it... http://xkcd.com/538/
I also wouldn't want to use a cloud storage system that didn't at least have the option to keep a local cache of my data so that I could access it offline and with local speed. But that's how most of the "cloud storage" services I'm familiar with work, so it's not really much of a problem.
This is a great point, but also orthogonal to the tech/platform choice. One can still develop a front-end using web tech which connects to a localhost server (see for example Google Refine -- which runs a localhost web server). It may make packaging and distribution a tad more annoying, but there are tools that can do this, and then you get the benefit of cross-platform adoption -- as well as the option to make it a hosted service, if desired.
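To make that concrete, a minimal sketch of the pattern in TypeScript on Node.js: bind the server to 127.0.0.1 only, so the browser is just the rendering layer and the data never leaves the machine. The port and the loadLocalRows helper are illustrative, not anything Refine actually ships.

    import * as http from "http";

    declare function loadLocalRows(): unknown[]; // hypothetical local-disk loader

    const server = http.createServer((req, res) => {
      if (req.url === "/api/data") {
        // Data is served straight from the local machine; nothing goes offsite.
        res.writeHead(200, { "Content-Type": "application/json" });
        res.end(JSON.stringify({ rows: loadLocalRows() }));
      } else {
        res.writeHead(200, { "Content-Type": "text/html" });
        res.end("<html><body><h1>Local app</h1></body></html>");
      }
    });

    // Bind to the loopback interface only: reachable from this machine, not the network.
    server.listen(3333, "127.0.0.1");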
> a front-end using web tech which connects to a localhost server
But what advantage does that offer over a native application? You’d then need a ‘native’ web server and would still rely on the local browser in a way that opens you up to more incompatibilities than just using the native interface framework?
Assuming you are really a one-man team, you’d still have to set up the web server, database etc., which I imagine to be somewhat cumbersome on an ‘unknown’ platform.
An interesting usecase would be a LAN-local server, which would (with server-side data processing) avoid the slow uploading of data over the internet and could still utilise native computation speeds on the server.
The terms of these agreements seem pretty arbitrary and probably provide a false sense of security. Properly encrypted data in the cloud is completely secure; in fact, it should be indistinguishable from random bits.
On the other hand, data on an encrypted disk is not exactly the same thing. It must be made available to the OS whenever the user is logged in, so any breach in security, say from an email attachment or malicious website, would expose its unencrypted contents.
I wonder what the required policy is for backups? Can they be stored on servers if encrypted? Remote servers?
What I have never understood is people's willingness to accept ad-ridden web services when, some years ago, everybody had their panties in a bunch because Opera or GetRight had ads in their software.
Back when I was part of an EU research project, we could not use Skype, DimDim or any other video-conferencing platform because a lot of the material we worked on was protected, as the author comments, under a lot of privacy contracts.
Probably still not acceptable for many of these situations, especially when you're signing these kinds of data custody agreements. Oftentimes they specify requirements for physical custody of the hardware, and even if they don't, adding third parties into the mix (Amazon and you) may make the auditing requirements more complicated.
A lot of the time, these requirements are more about auditing and liability than about technical security measures.
> So I'm left with <canvas>, and <canvas> is slow.
I consider this a bit of a frustrating pseudo-myth. It's true canvas is slow compared to lots of native drawing, but it's usually presented as a false equivalence. Canvas didn't set out to replace native, hand-crafted OpenGL drawing. It's an alternative for cross-platform graphics on the web, where you get canvas or you get Flash (or SVG, or cobbling together colored DOM elements to animate while crying and eating ice cream out of a gallon tub. We've been there. Don't lie.).
What's worse, looking at the videos demoing his Mac app[1], they don't even look like they'd need canvas levels of performance. They look like they'd work just fine in SVG. What's wrong in this case with the very good performance across many platforms (even tablets) of the SVG-backed RaphaelJS?[2]
Unless his app can do things he'd rather not demo, I'm guessing this post is more a post-hoc rationalization of picking the Mac as his preferred development platform. At the risk of being a bit rude, it's worth noting that he doesn't bother with all desktops, just the Mac, which raises further suspicion that this is really just a rationalization piece and not about performance.
(Unrelated to the post at hand, this canvas pseudo-myth also upsets me because I spend big chunks of free time helping people with their canvas work, and it's almost always an issue on the programmer's part. That's fine; I've never faulted a programmer for writing less-than-optimal code. But programmers often add their voices to the chorus of "canvas is slow" without first looking for the fault in their own code.)
> What's wrong in this case with the very good performance across many platforms (even tablets) of the SVG-backed RaphaelJS?
So I attempted to determine the accuracy of this claim. I ran a benchmark from Kevin Roast [1], who seems to author a lot of canvas demos. For each of the 8 tests in his benchmark, I recorded the lowest FPS reading I saw (i.e., the framerate dropped to X at some point during the 5-second test).
On the Mac side, I can see both sides of the issue. Arguably 26-29fps is good enough for a wide variety of applications. At the same time, I can understand the author really wanting to blow past 29fps.
On the iPad side, the issue is more clear. I think most people would say that 18fps is unacceptable for a drawing application.
(If these numbers are contributing to the "pseudo-myth" of slow Canvas performance, please point me to a reasonable benchmark. This one is just the most comprehensive one that I found.)
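For what it's worth, here is roughly how such a low-water-mark number can be recorded; a sketch using requestAnimationFrame, not Kevin Roast's actual harness:

    // Count frames per one-second window and keep the minimum seen over the run.
    function recordMinFps(durationMs: number, done: (minFps: number) => void): void {
      let minFps = Infinity;
      let frames = 0;
      let windowStart = performance.now();
      const testStart = windowStart;

      function tick(now: number): void {
        frames++;
        if (now - windowStart >= 1000) {   // close a one-second sampling window
          minFps = Math.min(minFps, frames);
          frames = 0;
          windowStart = now;
        }
        if (now - testStart < durationMs) {
          requestAnimationFrame(tick);
        } else {
          done(minFps);
        }
      }
      requestAnimationFrame(tick);
    }

    // Usage: recordMinFps(5000, fps => console.log("lowest FPS:", fps));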
> At the risk of being a bit rude, it's worth noting that he doesn't bother with all Desktops, just Mac, which causes further suspicion that this is really just a rationalization piece and not about performance.
He addresses the choice of the Mac platform in some detail in the "Drawbacks" and "Conclusions" section of the post. At the risk of being a bit rude, it's poor form to dismiss someone's reasoning as a "rationalization" without addressing the reasoning on the merits. To the extent that his choice of Mac over Windows et al is specious, it is not a claim that is supported by your comment.
In the sentence you quoted I'm referring to SVG, not canvas, and the "in this case" refers to mapping apps, not games. I never made any claims with regards to games.
In the test you quoted, the goal is to stress it until it can only handle 30fps (it says so on the page), so you necessarily must see 30fps for the test to move on to the next one. That is why your median score on the Mac is 29fps: it degrades smoothly on the desktop compared to a mobile device.
If you want to see many devices peg 60fps on the canvas (the cap imposed by requestAnimationFrame), you can use a demo like MS's Fish one.[1] Your Mac ought to get 60fps for 1000 fish on a 1920x1075 canvas with no sweat. This is not a very interesting test, and I don't know what it will look like on an iPad, but it more than accounts for any animation you might see in a mapping application.
“Your Mac ought to get 60fps for 1000 fish on a 1920 x 1075 canvas with no sweat.”
I just checked, and indeed, my new MacBook Air with external display ran it just fine at 60fps. My 3 year old Mac mini handled 45fps. My iPad 4 managed to crank out 25fps in a 981x644 window.
This is written by someone who actually writes a popular web framework for Erlang - the exact opposite of throwing a static site on S3. That gives this piece more credibility - he knows how to build active server-side software as well.
Good point. I'm personally of the opinion that one should not publicly denounce a thing unless they've spent time in the trenches trying to make it work.
> Browser vendors with cross-platform drawing kits just don't invest the same kind of effort into drawing lines and circles. As a result, large <canvas> areas seem laggy on most browsers, whereas native apps “just work”. (“Are the graphics pre-computed?”)
This is incorrect. Firefox and Safari use Core Graphics (Quartz) just as Mac native apps do. Things like lines and circles go through the exact same SSE-accelerated routines. I don't know off the top of my head whether Chrome uses Skia or Core Graphics on the Mac, but either way, its blitting routines use SSE optimizations as well.
Technical reasons aside, the canvas tag is still orders of magnitude slower than writing Core Graphics by hand. Personally I'd say almost unacceptably slow, but YMMV.
You succeeded in picking out a minor technical inaccuracy, but not in addressing the point the author was making about rendering performance.
One of the reasons it's slow is because CG bitmap contexts are pre-multiplied, but the Canvas spec requires things to be not-premultiplied, so there's a bit of extra math that needs to happen w/ Canvas vs. straight CG/Quartz.
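Concretely, that extra math is a premultiply/unpremultiply round trip on every readback. A sketch of the per-pixel conversion (illustrative, not WebKit's actual code):

    // Premultiplied stores RGB already scaled by alpha; the canvas API exposes
    // straight (non-premultiplied) RGBA, so each getImageData/putImageData
    // implies one of these conversions per pixel.
    function premultiply(r: number, g: number, b: number, a: number): [number, number, number, number] {
      const f = a / 255;
      return [Math.round(r * f), Math.round(g * f), Math.round(b * f), a];
    }

    function unpremultiply(r: number, g: number, b: number, a: number): [number, number, number, number] {
      if (a === 0) return [0, 0, 0, 0]; // fully transparent: the RGB values are lost
      const f = 255 / a;
      return [Math.round(r * f), Math.round(g * f), Math.round(b * f), a];
    }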
> the Canvas spec requires things to be not-premultiplied
It does, but unfortunately neither WebKit nor Gecko respects this. Take a PNG with an alpha channel, draw it to a canvas over a white div, then read back the pixels; you get the pixels premultiplied by the white behind it. Same with other backing colors. There are cases where it won't premultiply, but unfortunately they're a minority.
(The reason I ran across this particular case is that I use PNGs to compress data, and using the alpha channel to store data is impossible because of the premultiplication.)
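A sketch of the kind of round-trip test that shows the problem, assuming any image with partially transparent pixels; after the internal premultiply/unpremultiply round trip, low-alpha pixels come back altered:

    // Draw an alpha-channel image to a canvas and read the pixels back.
    // Per the spec the result is straight RGBA, but the premultiplied
    // backing store loses precision, which is what breaks alpha-as-data.
    function testAlphaRoundTrip(img: HTMLImageElement): void {
      const canvas = document.createElement("canvas");
      canvas.width = img.width;
      canvas.height = img.height;
      const ctx = canvas.getContext("2d")!;
      ctx.drawImage(img, 0, 0);
      const pixels = ctx.getImageData(0, 0, canvas.width, canvas.height).data;
      console.log("first pixel RGBA:", pixels[0], pixels[1], pixels[2], pixels[3]);
    }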
Well, of course browsers are ultimately going to render with the platform's native drawing functions, which have undoubtedly been optimized. The issue is the high level of abstraction between the description of what is to be drawn and the drawing calls themselves. This is fine for a majority of tasks, but when it becomes a bottleneck your hands are pretty tied. Native apps have much more direct control over threading, vector processing, audio devices, networking, etc.
All good thoughts, and similar to why I develop native apps for iOS vs web apps.
I think John Gruber summarized it nicely when he said: "Facebook, bless them, has it right. What's great about the web is ubiquitous network availability, not running within a browser tab. Websites are just services, and what you see in a browser tab is merely one possible interface to that service. The best possible interface to that service is often, if not usually, going to be a native app, not a web app."[1]
The web does not mean ubiquitous network availability. That's a turn of phrase that seems to originate only from Silicon Valley and Redmond. When I get on the London Underground, my network is gone. When I drive from Cape Town to Hout Bay, my network disappears. When I go into my favourite coffee shop in Berlin, my network disappears.
If an app (Evernote, for example) relies on a non-store-and-forward network connection to hook me up with my data, it gets uninstalled. I can't use Trello for the same reason. When I need my data, I need my data.
Right, that's a huge hurdle for web apps to overcome. But what happens when the interface itself requires an update from the cloud? We're doomed.
I think dismissing web tech because of what Facebook does or John Gruber says is short-sighted. Case in point: the people over at Sencha rebuilt Facebook in HTML5 to run faster than Facebook's native app [1]. It may take an additional level of skill and careful planning, and it may be easier to shoot yourself in the foot, but that doesn't mean a web app can't provide a compelling experience with performance on par with a native app.
I've seen the case of Sencha Facebook app brought up many times now, but I've never seen someone point out their huge flaw in the scrolling. At one point in the demo, they trivially dismiss the fact that they made the scrolling inertia smaller to give the data a chance to load; but that's in my opinion THE main flaw of the app. As a matter of fact, that's the first thing I notice on every web app I've seen to date: the quirky scrolling physics. For me, Sencha's implementation falls right in that uncanny valley and it kills the UX of the app each time for me. I don't see why they put so much emphasis on the loading speed or smoothness (not that it's not important), when all this effort to improve the UX is completely offset by the scrolling.
Ask everyone again. This was true a long time ago (long time being, like, 1.5-2 years), where the Facebook app was so slow and so crashy (and the API down so often) that it was mostly useless.
Facebook is, IMO, one of the gold standards now for large-scale iOS app building. I had the pleasure of hearing one of their people speak about this a few weeks ago and they're very much at the forefront of iOS engineering.
I've been using the FB app since it arrived on iOS. At one point it was a decent app, but it is far from being the gold standard. The last few updates have just introduced more bugs and lag to the experience. Lately I can't even reliably tell whether I tapped the 'comments and likes' text. You can't tell if you missed the ridiculously small hit target or if it is just taking the usual 3-5 seconds to give you any kind of indication it is doing anything. That's just one example of a frustrating experience, but I consistently have others: zooming and panning pictures randomly closes them, likes sometimes never show up, my own posts won't be visible on the phone, only on the web, etc.
Apparently Facebook's iOS engineers weren't that excellent back then (relatively speaking, I mean), so comparing their old Facebook effort to a well-made web app is not telling.
Plus, the Facebook app is way different now. I don't see how it'd be possible for them to implement web-based chat heads without killing the performance (even on my iPad 3 it isn't 100% smooth). Lots of times these clever design concepts cannot be implemented on the web because of its relative performance drawback.
I'm going to take a wild stab that the primary reason he develops on the Mac is his own comfort and expertise with it - which is fine, of course. With the advancements made in web tech today, I don't think all the performance reasons he listed ring entirely true. See for example Google Refine https://code.google.com/p/google-refine/ which is a similar tool to his Wizard tool, with a web front-end.
> his own comfort and expertise with it - which is fine, of
> course. With the advancements made in web tech today
Do you have any expertise developing for desktop/native mobile? I have a lot of expertise developing for the web (I spent four times as long as a web developer before switching to iOS programming full-time), and every time I see someone presenting web tech as a superior way to build apps, I have a hard time taking it seriously.
If there is a lack of expertise it must be the other way: web devs don't know what native frameworks offer.
Sure, we can push and twist and stretch a system whose foundations were meant for marking up document structure and styling document presentation until it fits the needs of apps, but what's really the point?
Yes, there is the promise of develop once, run on every platform, which in practice is more like develop the core once, then endlessly tweak for idiosyncrasies.
And when web tech really works for a mobile app, it usually just means that your app should have been a web page anyway.
You've misread my statement as some sort of attack on native development and counter-attacked accordingly, even personally. I never said web tech was superior. I'm simply presenting a counter-argument to those who continue to deride web development and claim it can't hold a candle to native apps. There are endless examples of web applications done right that have compelling experiences and perform on par with native apps. I've provided a couple of examples in this thread, but there are countless others. I'm simply claiming web apps can be done right.
At the risk of sounding cavalier :) see https://news.ycombinator.com/item?id=5659447 -- which is a better summary of the point I was trying to make. It seems he likes the Mac native platform, which is great, but don't try to sell me on the idea that all other platforms and the web, in particular fall short.
No offense, but with that intro to your post and the way you follow it up, you strike me as exactly the person in question who has been out of the web dev game for a bit and doesn't know how much it's progressed in the last 3 years alone.
It's not hard to do responsive design. Yes, in the iOS world you can still hard-code 5 layouts if you want. Those of us who have also done Android design work understand why that's untenable moving forward. I'll take relative layouts etc. and be very happy with Bootstrap/Foundation.
This made me scratch my head. Desktop apps have been accommodating multiple screen and window sizes since before the Internet existed. Do you think we hard-code a separate layout for every window size?
I don't know if responsive design is "hard" or not, but I observe that on the web, fixed widths are still incredibly common, and even major sites break easily. I visit Google, and if my window is not at least a thousand pixels wide, the Sign In button is clipped and not initially visible. This would simply be considered a bug in a desktop app.
Besides, I always thought that layout was one of the weakest parts of HTML and CSS - witness the endless parade of hacks to achieve equal height columns.
As someone who is not a "web designer" but occasionally has to knock up a page or two, I agree with you 100%. To me the whole CSS business is just a big hack, and not at all usable or intuitive. YMMV, but I think there's a reason why people come up with jokes like [1]...
As someone who feels fairly proficient in CSS and also has experience with desktop application layout (Swing, Qt, XAML mostly), I can say that CSS does feel like a hack. It was conceived as a (more or less) simple stylesheet language for (mostly) textual documents. By now it's used not only to style increasingly complex document layouts but also application UIs, and in that respect it definitely falls short of what widget-based toolkits have achieved for years.
It gets better with some of the CSS3 layouts, which more closely mirror what's been available in the desktop world for ages, but using those is just a recipe for »your site looks like crap on my browser« mails, because they're not yet standardised or not yet widely implemented (or your target audience uses something other than the bleeding-edge version).
> This made me scratch my head. Desktop apps have been accommodating multiple screen and window sizes since before the Internet existed. Do you think we hard-code a separate layout for every window size?
I think the problem is that the youth of today don't know anything other than web development, so they are full of false assumptions.
Yeah, I personally have a giggle whenever I see a complaint about catering to multiple resolutions. I seem to remember devs having similar issues with desktop games (and apps) in the whole VGA -> SVGA -> XGA migration (1987 to 1990-ish... my dates may be off). Were none of the lessons learned then applicable to mobile dev today?
So the same coded version of Office works on your tablet and phone and the UI adapts properly? That's pretty impressive.
I wasn't implying desktop designers hard-code everything; in fact, many of them do, or assume the app will NEVER be used on anything smaller than even the absurdly small 800x600. We're talking about mobile apps, responsiveness, and the cost/benefit of targeting multiple devices with one app vs. six native apps.
Neither Microsoft Office nor Google Docs "just worked" on tablets and phones without major reworking of their UIs. It's not obvious that the web has any advantage here - I just now visited docs.google.com and was encouraged to "download the Google Drive iOS app" so that I can "edit documents." (!)
If you target multiple platforms with one app, you get an app that works from OK to poorly on lots of platforms. Look at Light Table as an example: done entirely via web technologies, easy to port everywhere, but feels incredibly alien and _wrong_ on my Mac. (No offense Chris!)
Compare to Sublime Text 2, which to my understanding has lots of platform-specific code to make it feel native on each platform, but also a shared core (using Cairo, etc). So "six native apps" need not cost anywhere near six times a single native app.
So in the end, the cost of targeting multiple platforms with a single app is surely lower than targeting each platform individually, but that's just a classic cost/quality tradeoff - the web limits your polish. And what good is having your app on multiple platforms if it's inferior to native alternatives on all of them? I know as a Mac user, I'll pick the Mac app that feels like a Mac app every time.
Your comparison of Wizard with Refine is misguided. The two products do completely different things. Refine "refines" data into a usable form. Wizard is a stats and visualization tool for understanding an already-clean dataset. Last I checked, Refine actually runs a local web server, even if its client is in the browser, so it's not that different from a native app. Wizard also does incredible amounts of visualization that Refine doesn't do.
(disclaimer: I purchased Wizard a few months ago and have been exceptionally happy with it -- it's a brilliant product)
So, pair Refine with d3.js and you've got both. I see no reason to require a native application for data visualization unless you absolutely need to load enormous amounts of data, and the value of the visualization itself tends to diminish the more datasets you shove into it. d3.js has very powerful visualization tools: https://github.com/mbostock/d3/wiki/Gallery. He chose to build his tool on the native Mac platform, which is great, and it's great to hear the product is really useful for you; I'm just defending the alternative tech choices. If you wanted to build this in web tech, I think you surely could. I'm not saying it should have been built in web tech, but I disagree with the notion that it couldn't be.
Sure, your web app might feel instantaneous when your server is sitting across the LAN, but many users have crappy Internet connections, or are downloading a Torrent, or are living in New Zealand...
Yup. New Zealand's internet speeds are just abysmal. On behalf of kiwis everywhere, can I ask that you all stop writing your own wrappers for web-based videos? Instead, upload your videos to YouTube and embed that on your site. Their buffering is orders of magnitude better than some of the half-baked crap I see from other people, probably because streaming video is YouTube's core business, so they've focused a lot of time and attention on getting it right. If you live in urban parts of the United States or Asia it's not a problem you'd notice, but for the rest of us it's a daily nuisance.
On the other hand, I have a great home internet connection here in the USA, yet YouTube is terrible for me. I constantly get HD video streaming at 250KB/s, which is well below the bitrate, meaning I have to wait for it to buffer. And yes, I've tried the hacks (blacklisting YT CDN IPs) to no avail.
Your ISP might also be proxying YouTube IPs to local cache servers, the way Time Warner does in the US (though not necessarily improving performance, judging by the recent excitement over how to disable it).
With each web project I take part in, I feel more and more in sync with what the author states.
The proper way is to have desktop applications that take proper advantage of the hardware and operating system integration and use the network for communication.
Leave the browser for the documents. No need for browser compatibility headaches or JavaScript/CSS/HTML hacks.
Just use your favourite programming language and take advantage of the OS capabilities to full extent.
That's nice sentiment from developer perspective. For a manager or entrepreneur it would be an unacceptable waste of time and money, so I don't think it's going to happen anytime soon.
Debugging CSS is much less fun than using a proven multiplatform C++ toolkit. Only when node.js got popular could I write, in my web code, the ever-reliable callbacks I use all the time in C++.
But of course there are more people who can write web apps than people who can add this symbol at the end of the line -> ;
Only if you are speaking about database front-ends that are nothing more than enhanced versions of "form submit", which used to be done in RPG, COBOL, Clipper and VB, and now the web.
There are many other areas where desktop is still the way to go.
> Try running a profiler on a native Mac application that does a fair amount of drawing. You'll see references to function calls like: sse64CGSFill8by1
Try running a disassembler on any compiled graphics application of the last 10 years, on any platform, and you'll see SSE instructions.
You mean developers, right? Compiler auto-vectorization hasn't really advanced that much. You still need to manually use SIMD intrinsics (or inline asm) to get any kind of worthwhile performance increase.
More accurately: Why I develop this specific class of applications for the desktop rather than the web. This is basically what the author is talking about despite the generalization regarding desktop apps; if he's going to stick with that he should have gone with "desktop" instead of "mac."
You seem to have missed the whole "Features" section. Here's a list of what he said he gains "for free" by developing for Mac (of course, other environments might have similar stuff, but not exactly like OS X's): Undo/Redo with Core Data, PDF export (and "import" into current documents), Multi-touch zoom.
He was being very careful to not overstate his claims, or even to appear to be overstating his claims. Considering how people in forums typically respond, I understand why.
I think the keyword is "text-heavy". Rendering 256 characters into a texture to use with WebGL is one thing, but what about Unicode, or readability at small sizes? I'm honestly asking; I'd love to hear about ideas, libraries, etc.
In OpenGL you can use Core Graphics to render into a context and convert that data to a texture. This could be done for each string, but in doing so you lose the power of the underlying implementation. If there are libraries that make this easier, I have yet to come across them.
Maybe he meant WebGL, but even then it could be done very well, as with more modern OpenGL variants using bitmapped font tables. OpenGL on the Mac is 3.2, IIRC. Nvidia recently tinkered with a proprietary version of OpenVG with beautiful text-rendering capabilities. There are numerous ways to make great text with OpenGL performant in native code, even GUIs built with the IMGUI approach. Some of those techniques could be done with WebGL, I'm sure. So I don't really understand his point about text and OpenGL being a no-go.
Sure, WebGL is not all you can get with native OpenGL, but it's nowhere near a fiasco in performance.
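To make the bitmapped-text idea concrete, here is a sketch of the usual trick on the web: let a 2D canvas rasterize the string, then upload the result as a WebGL texture. Shader and quad setup are omitted, and the 64px line height is an arbitrary assumption.

    // Text-to-texture step only: the 2D canvas does the (Unicode-aware)
    // rasterization, WebGL just samples the result.
    function makeTextTexture(gl: WebGLRenderingContext, text: string, font: string): WebGLTexture {
      const canvas = document.createElement("canvas");
      const ctx = canvas.getContext("2d")!;
      ctx.font = font;
      canvas.width = Math.max(1, Math.ceil(ctx.measureText(text).width));
      canvas.height = 64;                 // illustrative line height
      ctx.font = font;                    // resizing the canvas resets 2D state
      ctx.fillStyle = "white";
      ctx.textBaseline = "top";
      ctx.fillText(text, 0, 0);

      const texture = gl.createTexture()!;
      gl.bindTexture(gl.TEXTURE_2D, texture);
      gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, canvas);
      // Non-power-of-two texture: no mipmaps, clamp to edge.
      gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
      gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
      gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
      return texture;
    }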
There are 75 million Mac users, many of whom have iTunes accounts and credit cards on file – they can buy from the Mac App Store with a single click.
20% of 75 million is 15 million. 15 million copies of any paid app is huge.
Also, let's not forget that Mac sales have been growing for years, while the PC market as a whole has been shrinking. Today, there are 3 times as many Mac users as 5 years ago.
It's great going for low-hanging fruit, but you need to be realistic about things as well. A bigger market means more sales, but potentially only at a lower price. Markets are a funny old beast.
At the same time, every member of that 10% bought a Mac desktop, and therefore bought a product with high margins. If 10% of your potential audience has no problem buying high-margin products, they'll be easier to sell high-margin software to.
Well, according to this [1], the Mac install base is around 66 million users. So if 20% of them bought your app for $1, you'd be $13 million richer. That seems worth it to me. It seems that you are underestimating the size of the market and the potential to make money by writing applications for the Mac. You are also ignoring the fact that Macs sell at a premium, so those 66 million represent not the bottom end of the market but people with money to purchase services that make their lives easier.
My estimate of 75 million was conservative; it's closer to 80 million.
Half of all new Macs are sold to people who are ‘new to Mac’[1]. Even the people who buy a replacement for their old Mac will often find other uses for that old Mac or they will sell it or give it away.
One year ago, the Mac install base was 66 million. Since then, 17 million Macs have been sold, 8.5 million of them to people who never owned a Mac before. Let's imagine that all of the remaining 8.5 million units were bought by existing Mac users, and that half of those were bought because the old Macs died. That gives us a new total of 66 + 17 - 4.25 = 78.75 million.
Until instantaneous data transfer is discovered (where the speed of light isn't the limit), the closer one is to the processor, the faster the application will be. Unless you have huge bandwidth, packet sizes to match, and the assurance of rare (if any) outages, applications on one's personal computer, working on personal data, will always be faster. All the discussion regarding the performance of desktop vs. web applications is moot while we are at least a decade away from everyone having internet connections capable of near-instantaneously transferring hundreds of megabytes of data.
Personally, I can barely stand web applications; I want my computer to move as fast as or faster than I think.
I was experiencing some native-client nostalgia for a bit until I got to "beg Apple to perform an expedited review of the fixes." Seriously? Fuck that shit. I'll keep my lousy performance and server maintenance woes, thanks.
This is exactly why I stopped developing for native. If you want to fix a bug or tweak/add one little feature, it's a huge ordeal. Plus the 30% tax on non-recurring revenue. I'm glad he's doing Mac apps. Less competition in the web space.
I'm just saying you're giving up a major marketing outlet if you aren't on the MAS. Of course you can go it alone, but since it is a Mac app, that would be foolish.
I find this article a very good example of how you can try your very best to describe, in reasonable terms and tone, why a certain decision was the right one for you, and that still won't stop a bunch of people who know neither you nor your business from telling you why your opinion, and you personally, suck.
Well, the raw GPU interface really isn't good for 2D text-heavy graphics! I mean, you could render to an offscreen framebuffer, but doing it on the CPU is much more convenient.
> Our Terra-based auto-tuner for BLAS routines performs within 20% of ATLAS, and our DSL for stencil computations runs 2.3x faster than hand-written C.
That's also why few people will ever use his applications. JavaScript might be more limiting, but it can open your app up to orders of magnitude more users.
I somehow feel that he misses the point: a "web app" and a "desktop-class app" usually have very different purposes and goals.
It almost sounds to me like: fuck these GUI apps, console apps are way faster, and see how easy it is to run them on a headless server.
I don't think it requires much explanation to point that out. It suffices to compare Cocoa with the clusterfrack that is shoehorning UIs into the browser.
That said, he points out performance, which is a pretty poor reason nowadays.
make error cannot copy create file in /path/hft7ahd7wtyc7hoy4awt/blabla/thing
I had already searched and found people with the same problems, but it boils down to friction between Xcode and cross-platform build software like CMake and others.
I don't want to bash Apple, but it feels awkward as a dev to watch Apple dig up their NS tech and put it in everyone's mouth, even though there are businesses with big, working C++ codebases. Being incompatible with the competition is still a strategy, even if Apple uses Unix-flavored stuff.
If you return control to the user in under 150ms, your app will feel quite snappy. From 150ms to 200ms is okay too. Anything above 250ms is noticeable, and considered sluggish by many.
Another rule of thumb is that responses within a tenth of a second feel instant, responses within a second keep the user focused, and you can get up to about ten seconds before they'll want to do something else while they wait.
C++ isn't a pure win. It's far more complex, and leakier abstraction-wise, than Objective-C. Most people write Mac and iOS apps in Objective-C, and it works well and is predictable. It's also far easier to find people who do Obj-C well than C++ (in the Mac environment, but this holds for all systems I've ever seen as well).
It's difficult to say which is the leakier abstraction, but I feel that in C++ I have the option of writing only high level code.
In Obj-C it's not unusual to have to mix in C. And sometimes the code I write is quite fragile due to the type system and not having some features available.
The worst thing about C++ for UI work is that it offers essentially no introspection and is a nightmare for tools authors. It's good for writing high-performance application kernels but for everyday UI building it's just overly complex and awkward.
Also agreed that QML is light-years ahead of anything else for UI work. Declarative is the way to go, but most are focusing in the wrong direction (e.g. storyboards).
Any programming language will be awful for UIs, be it C++, Obj-C, Python or C#.
Qt, for example, uses a hybrid approach. The UI itself is not defined in C++ but in a declarative language (QML); C++ is used for the application logic. That wasn't so in Qt from the start; it was brought in because of dynamic UI requirements, especially on mobile.
That's my point. Learn the language just to use some platform-specific, non-portable technology like Cocoa? No thanks. Better to stick with what's reusable, like C++ and Qt.
Iron law of native desktop app development: Code to your platform's native GUI API -- no wrappers. Or your user base will notice and one-star that shit.
You have less of an excuse on the Mac, which has literally the best GUI API in Cocoa that anyone has ever invented.
Iron law of native desktop app development: Code to your platform's native GUI API -- no wrappers.
Not so iron if you have many platforms to support. It usually isn't worth it to maintain tons of separate toolkits, one per platform, with no common abstraction at all. It's just too costly.
If you don't mind your app looking like shit on all platforms, this is a viable strategy. A lot of specialist applications are like this: for example, I used to develop Qt-based GUI tools for robotics.
If your app looking like shit will cost you significant sales -- as is the case for most productivity and design apps -- then yes, you have to port the GUI bits to each and every toolkit you're using.
Not true. Lots of applications use generic abstractions and look good at the same time: Firefox, LibreOffice, VirtualBox, etc. If you don't know how to make a good GUI using generic toolkits, that doesn't mean it's not possible.
Anyway, if you buy an application for its "looks", there is something seriously wrong already. It should look good, no doubt, but it should be functional first.
>as is the case for most productivity and design apps
Completely the opposite. They tend to provide completely custom, non-native-looking UIs, and therefore using cross-platform toolkits for them makes even more sense.
Oh, yeah. And Mac users are an oh-so-great majority of desktop users... Please. I have nothing against Macs, and when I'm rich enough I might even buy one, but I won't then try to force the whole world to support my niche OS. Or niche aesthetics, for that matter.
Linux doesn't dictate a native API for GUIs. It does de facto dictate the display server protocol, though (currently X11, with Wayland as the next generation), so any API would be built with that in mind. That said, the most widely used APIs are Qt and GTK+.
Ever noticed how the Linux app space tends to be a horrible mishmash of different GUI styles? That's why Mac users are nazis when it comes to obeying the Apple Human Interface Guidelines: the alternative is every developer doing whatever he thinks looks good, with no consistency to your desktop. OS and UI consistency is a problem if not done well.
This is not specific to Mac users. Most people prefer applications which integrate well with their desktop environment. It's a tradeoff: something like Qt will try to mimic the native UI closely, even if not 100%, but it saves you the time of learning each native toolkit. I'd say it usually pays off, and that last small percentage isn't worth the effort (especially if the project is supposed to be cross-platform).
Who downvoted this guy, and what for, when he's only stating his opinion on what tools to learn if one wants to develop for the desktop?
And I personally think he (shmerl) is right. Learning a couple of almost completely redundant APIs, each with its own problems, gotchas and plain stupid decisions, is hardly a good thing.
Years ago I would have agreed with you, now I'm not so sure.
Most users know so little about the UI of their computers that they don't know whether to click the mouse once or twice.
They spend their day working on Windows, check out Facebook on their iPhone on the way home, play a game on a PlayStation, then send some email using Gmail in Safari.
Then you have the major changes within operating systems over recent years: Windows Vista/7 and Windows 8 both introduced major UI changes, as has OS X.
People's interaction with computers has become much more diverse. Unless the UI is jarringly different, they just aren't going to notice.
Having a hard time phrasing this without being a jerk, but many of these "native is better than web" pseudo-rants contain misinformation or seem to lack an understanding of the current state of web technologies like persistence, app caching, etc.
Sorry, but I guess you're the one lacking an understanding of both.
Yeah, app caching looks nice in theory. Now implement it reliably in practice.
What will you use in place of Core Data? Local storage? IndexedDB (with no support in the default browser on either iOS or Android)?
What will your replacement for UITableView with reusable cells look like (see the sketch at the end of this comment)? How about the same combined with NSFetchedResultsController? I am not even talking about Core Graphics, animation, etc.
Sorry, but anyone really wanting to advance web tech on mobile should first learn what the native SDKs really offer, and then think long and hard about an answer to this question: "Should we? Why?"
How about finally finding a way to serve responsive images on the web instead of fighting wars against native apps?
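To be fair to the web side, cell reuse can be hand-rolled in the DOM. A sketch of a recycled-row list with fixed row heights, assuming the container is a fixed-height element with overflow: auto; this is illustrative, not a claim about any particular library:

    // UITableView-style reuse: a small pool of row elements gets rebound
    // on scroll instead of creating one DOM node per item.
    function virtualList(container: HTMLElement, items: string[], rowHeight: number): void {
      const visible = Math.ceil(container.clientHeight / rowHeight) + 1;
      const spacer = document.createElement("div");
      spacer.style.position = "relative";
      spacer.style.height = `${items.length * rowHeight}px`; // fakes the full scroll height
      container.appendChild(spacer);

      // The reusable "cells": only `visible` of them, no matter how long the list is.
      const cells = Array.from({ length: visible }, () => {
        const cell = document.createElement("div");
        cell.style.position = "absolute";
        cell.style.height = `${rowHeight}px`;
        spacer.appendChild(cell);
        return cell;
      });

      function rebind(): void {
        const first = Math.floor(container.scrollTop / rowHeight);
        cells.forEach((cell, i) => {
          const index = first + i;
          cell.style.top = `${index * rowHeight}px`;
          cell.textContent = index < items.length ? items[index] : "";
        });
      }
      container.addEventListener("scroll", rebind);
      rebind();
    }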
I'm well aware of what native SDKs offer. I don't know why you assumed I'm unfamiliar with native development.
As for the last statement, and much of the others, I'm not even sure what you're asking. If you want a nice syntax for it, then yeah, wait for the spec to be finished. Otherwise, you're probably implementing that logic yourself, the same as if you wanted a native app to pull in different assets (unless we're talking about UI assets and a sane OS, but still, it's not like that's a compelling missing feature).
Look, I'd be happy to be wrong but there are half a dozen inaccuracies in this very article about what is possible with web technology and how browsers themselves work.
Am I suggesting Angry Birds in web tech (even though it's already been done)? No. Am I suggesting that shit-basic apps like Facebook, Twitter, Reddit, etc. make more sense as simple, functional, accessible web apps? Absolutely.
And then Sencha remade it, and it managed to be faster than the native app. Also note that the actual mobile Facebook site remains as fast as the native app, and was always faster than the "native" app that just wrapped web tech. So, ironically, the web app still wins when built and used properly for a Facebook-type app.
Native apps should always be faster except in the case where the native app isn't architected properly or web services are poorly implemented. Unfortunately, I've seen that both are often the case.
For CRUD apps like Facebook, it's often easier to do them right as web apps, but a good native app developer with a good backend team should almost always be able to beat them.
In that case it's incredibly disappointing that Twitter's and Facebook's mobile web apps continue to be every bit as fast as their native apps. I KNOW that you are right; as I've said, I'm well aware of (and worried by) the overhead of doing things like video decoding in the browser... but for literally a few HTTP requests, infinite scrolling and TEXT (which is Facebook, Twitter, Instagram, Path, etc.), Chrome for Android is simply going to be as fast as the native app.
I agree about Reddit, Twitter, etc., but the fact that these apps even exist, and are so popular, says a lot about the capabilities of web apps. It's so bad that websites create apps to access websites, and on HN people applaud the huge progress in web design...
You're right, of course; I didn't mean "all" of it by any means. In fact, last night I lamented web app overhead and its effect on performance. That having been said, it doesn't make his bits about graphics acceleration, local caching or storage any less inaccurate.
Someone develops for the Mac because thinking about cross-platform support and making it work in the browser is too hard, so they just bet the farm on letting Apple handle the hard stuff, hope that performance doesn't later drop off, and then, when clients ask why it is no longer so good, say "but I develop on this closed platform, and it should be good..." etc.
"Many developers assume that everyone wants their data to be “in the cloud”, but that's actually not true for a lot of my customers. Professional researchers often sign agreements in their children's blood stating that their data will be stored on an encrypted disk, won't leave their laptop, and will be destroyed when the research project is completed. “The cloud” is the last place they want their data to go."
There are so many great note taking and productivity application that I just can't use because the majority of my notes are of a confidential nature. If my company provided macbook where to be compromised I would not be held liable, however if my personal dropbox or evernote account where compromised I would be held accountable.