This event was by far the most disappointing Mac event in history. A lot of the time was wasted on:
- Mildly funny jokes and comparisons with '90s technology.
- 90% of the talk being about the Touch Bar.
- Awful demos of Photoshop and a cringey DJ.
I was hoping we would see:
- A new MacBook with all-day battery life, a Touch Bar, and an even thinner design. OK, I understand that they are trying to consolidate their product line, but the category of the 12", super-small web-browsing machine with an adequate processor is left without any update.
- A MacBook Pro with some real innovation. They could just copy Microsoft with a detachable screen (oh, but that would cannibalize the iPad market), pen input, or a touch screen. Instead we get this Touch Bar thing, which is great, but I'm disappointed that it's the only thing they've innovated on here.
- The MacBook Air killed off.
- No iMac update (!!!).
- No monitor announcement.
Microsoft really hit it out of the park yesterday. Apple's entire presentation felt like they were trying to fill 1.5 hours of time with bullshit.
Also, Panos Panay comes across as genuine, authentic, passionate and knowledgeable, whereas Jony Ive comes across as an evangelical designer who feels "fake". I don't know how else to explain it.
> Microsoft really hit it out of the park yesterday.
Did we watch the same event? Microsoft introduced a $3,000 desktop PC in an era when nobody uses desktops anymore. It introduced a minor update to the Surface Book that starts at $2,300 with a dual-core CPU, only 8GB of RAM, and last-gen graphics hardware.
For the same price as the new Surface Book i7, I can get a 15" MBP with a bigger screen, twice the RAM, and a quad-core CPU, and it's Microsoft that hit it out of the park?!
No. Microsoft introduced an entirely new kind of tool for digital artists. I'm not one myself, but the Surface Studio makes me wish I were. And the people I know who are, are over the moon about it. They can't wait to get their hands on one.
Yes, it's Microsoft that hit it out of the park, and made Apple look amazingly weak by comparison. Today's event wouldn't have been particularly impressive even on its own. By comparison with yesterday's, it's just embarrassing.
I'm a digital artist myself. I'm more of a goal-based guy, so I use both Windows and Mac hardware and software. I know that I'm probably the target market for both the MacBook Pro and the Surface Studio, but I have to say, I'm not impressed. And I'd argue that the only people who are would have to be Microsoft or Apple fans. Certainly not tech-based artists.
First, this is a business for me. And I suspect it's a business for anyone that Apple or Microsoft expects to sell this stuff to. And having to pay $3,000 for outdated hardware is a tough pill to swallow. The hardware in the MacBook "Pro" is simply inexcusable. (REALLY???? 16GB RAM MAX!?!?!?) But the hardware in the Surface Studio is equally galling. They give you an Nvidia what, a 965, for $3,000??? And the BEST I could even POSSIBLY get would be a 980??? And for that magnanimous gesture on their part I would be obliged to pay a MINIMUM of $4,100??? For LAST-GENERATION graphics cards???? On a machine purportedly about graphics???? (A 1080 would run both faster AND cooler using LESS power, Microsoft. And at this point in time, for a business investment, 64GB of RAM should really be offered by anyone NOT trying to screw you. What's the deal, Microsoft???)
I think to objective observers who were waiting for these presentations, both proved SEVERELY underwhelming considering the pent-up expectations. I mean... OK... if you put a gun to my head, I'll probably buy the Surface Studio. But don't expect me to pretend that I don't know that both Microsoft and Apple are robbing me.
I think you (and a few others here) are making the mistake of equating digital art with a need for high-end graphics cards. 99% of what most digital artists consider to be "digital art" does not require state-of-the-art GPUs. Graphic design work, retouching, painting, etc. barely even scratch the surface of what the last generation of graphics cards could handle.
Let's call high-end graphics cards what they really are: gaming console graphics hardware stuffed into PCs. Apart from a relatively small number of professionals who work with 3D rendering, high-end graphics cards are an even greater waste of money (unless of course you're buying the machine for gaming, which doesn't really fit the "digital artist" target of these machines).
Microsoft and Apple may still be pricing these products too high ("robbing you" as you say), but that's a separate discussion.
"...I can't help but notice that you (and others here) are making the mistake of equating digital art with a need for high-end graphics cards. 99% of what most digital artists consider to be "digital art" does not require state-of-the-art GPUs. Graphic design work, retouching, painting, etc. barely even scratch the surface of what the last generation of graphics cards could handle..."
I can't help but notice that you are equating the 99% of "digital art" that can be handled by mobile graphics cards with the 1% of "digital art" that anyone's actually going to PAY someone to produce. As I said, this is a business. If I could make the living I currently make retouching photos, then I'd gladly pay $3,000 for an underpowered machine...
but I can't,
because no one is going to pay us that kind of money to retouch photos, are they?
Look, these machines, to a business, are INVESTMENTS. You invest for the FUTURE, not to take advantage of the past. A 1070 or 1080 is not too much to ask considering Pascal's MANIFEST superiority in efficiency. Additionally, I was being kind: I think MS should OFFER 64GB in the Surface Studio, but a 128GB option would really be necessary to future-proof this thing.
I'll tell you what MS and Apple are going for here... it's a money grab. And they are setting themselves up to come back to guys like me every 18 to 24 months for another mandatory money grab, instead of just giving us a 60-month machine from the outset.
Offer me a 60-month Surface Studio at the $3,000 price point and I'd update every workstation here. But that's not what they're offering, is it?
What kind of art are you talking about? I've done a lot of graphic design (yes, paid) and used Photoshop and Illustrator for nearly a decade (though I now use Affinity, as of last year), and I have never been in a situation where I was running out of resources.
Up until this year, I have never owned a machine with more than 8GB of RAM, and never a video card with more than 1GB of RAM until now either.
Are you perhaps talking about video editing or CGI rendering?
"...Are you perhaps talking about video editing or CGI rendering?.."
Yes. Video editing and CGI. The workflow normally requires the use of MASSIVE memory-mapped files. Additionally, a given artist may do several "lo-res" renders before shipping off to the farm. ("Lo-res" being a relative term, because in most cases we're talking about 2560x1440 or 1080p minimum.) I really don't see how your tech guys are able to get that to work for you on cards that have 1GB of memory. But if they can, then you'd better NEVER let those guys go, because they are worth their weight in gold. (Unless you don't do any CGI or video editing???)
Yes, this is the case in most VFX and animation studios. Pretty much everything is done on Linux, usually CentOS. That being said, 8GB is a joke. 32GB is the minimum, and 64GB is what new machines would be ordered with for almost all artist workstations.
Source: I was a system engineer for VFX and animation for a number of years.
> Then again, with Chrome taking up >4GB of RAM, they probably could have gotten half that performance
Can confirm. I got up to 15GB of RAM usage today before closing 30+ tabs to get back down to 7GB. Maybe it's ridiculous, but given that the information I need from those tabs is 100% text, it's infuriating.
Oh, and also sitting awfully comfortably in that 7GB is Dropbox, which is doing who knows what with all of its RAM... I do like the Linux command-line tool, though.
I had the same issue, but at home, where my workflow is different. My solution was to use the extension "The Great Suspender", which sleeps tabs that haven't been used in a while. Might help you too.
Firefox is even worse. I caught it using 37GB of virtual memory the other day on Linux, and a pretty significant chunk of physical RAM too. I don't even have 37GB of physical memory.
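If anyone wants to check this on their own box rather than eyeballing top, here's a rough sketch in Python (assuming the third-party psutil package; the process-name matching is simplistic and may need adjusting for your OS) that sums resident and virtual memory across every browser process:

```python
# Rough sketch: sum memory across all browser processes (Chrome and Firefox
# spawn one process per tab/site, so per-process numbers in top undercount).
# Assumes the third-party psutil package is installed: pip install psutil
import psutil

BROWSERS = ("chrome", "firefox")  # substrings to match in process names

totals = {b: [0, 0] for b in BROWSERS}  # browser -> [resident bytes, virtual bytes]

for proc in psutil.process_iter(["name", "memory_info"]):
    name = (proc.info["name"] or "").lower()
    mem = proc.info["memory_info"]
    if mem is None:  # access denied for this process
        continue
    for b in BROWSERS:
        if b in name:
            totals[b][0] += mem.rss  # resident set size (actual RAM in use)
            totals[b][1] += mem.vms  # virtual address space (the "37GB virt" number)

for b, (rss, vms) in totals.items():
    print(f"{b}: {rss / 2**30:.1f} GiB resident, {vms / 2**30:.1f} GiB virtual")
```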
"Then again, with Chrome taking up >4GB of RAM, they probably could have gotten half that performance just by closing their browser or using Safari." This is so true. I've recently stopped using Chrome and switched to Safari. Now my 5-year old MacBook Pro runs like butter.
Browsers and web is frankly getting ridiculous. The modern web has a lot going for it but when moving the web forward people seem to completely forget about the efficiency argument. I wonder how much of the worlds power we waste casually like that.
As a developer and a digital artist I am often running multiple VMs, my various development tools and 2-5 Adobe products on top of my 2 browsers and other misc programs. Agreed that for any single piece of digital design software 16gb should be enough (even to a certain extent After Effects). But if I could have a high quality laptop, that runs Adobe software, is Unix-y AND run the whole Adobe suite plus my development tools... (whilst not having to close all my research notes in Firefox)... gee whizz would I throw my money at that.
I'm probably an outlier though so can understand why it doesnt exist.
I don't think you're an outlier at all. 32GB is the absolute minimum I need to get my work done without disk thrashing. If I spent more time on video than audio I'd want 64GB.
That's completely realistic for pro and semi-pro users.
The cost of making a laptop that could take 32GB or even 64GB of RAM must be tiny. No one would care if Apple only supplied 16GB but left a couple of slots free, or even if the stock overpriced 16GB could be swapped for 32GB.
Why cut corners and make "pro" machines that can't be used for professional work?
It's counterproductive. It may increase margins in the short term, but over time it alienates users and erodes the appeal of the brand.
It seems "Think Different" has become "Don't Expect Much."
To be fair, I think most people's interaction with "graphics arts" is Lightroom, which is a pretty darn slow piece of software. Not the complicated stuff, where they reused code from Photoshop, but the simple stuff like opening the file import dialog.
I think people are imagining that a better computer will improve its performance, but it won't. There's a sleep statement in there or something. (Their blog post says something like "a cloud industry-leading innovation in picking-your-camera-out-of-a-list user experience story flow." Really? Just suck the files off my memory card in the background as soon as you detect one.)
Most people I know working as professional artists/designers are working with 2D assets in software like Photoshop. Even the few working in print don't need crazy GPUs. I know some people who deal with video and 3D, but most of them farm the heavy lifting out to servers. I'm not sure what you do, but in my experience this is more than enough for most digital artists and designers.
Considering the extremely long time between hardware refreshes on the MacBook Pro, I entirely share your sentiment and am agog that Apple didn't wait for a suitable Kaby Lake part. People have been waiting years; what's another six months? If you're going to charge me top-end prices, bloody well give me top-end hardware. It isn't top end by definition just because there's a Microsoft or Apple logo plastered on the side, sorry.
This is 100% where I'm at. I'd buy a Surface monitor right now, because I could be assured of using it with a... well, I almost said "a real computer", and I feel a little bad for that, but I also sorta don't. I need forward compatibility with my hardware, and my workstation already blows the doors off the little box they've saddled that great monitor with.
The specs for the Cintiq Companion 2 are in the same ballpark and, from personal use, the thing has a hard time providing a smooth UI when editing very large Photoshop, Illustrator, or Unity projects.
I think you're right that artists largely don't need what high-end gaming PCs might require to play the newest Steam titles at full graphics settings, but for what Microsoft is asking here, they do at least need enough GPU/CPU strength to have a very responsive UI when editing very large and complex graphical projects at 2560 x 1600 and beyond.
My worry about the Surface is simply that the Minority Report-style advertisement isn't really possible with a real-world PSD or 3ds Max scene.
High-end graphics cards are usually much better than a gaming console's graphics hardware, in that they offer much better frame rates and a lot more resolution; at the same time, even a single card can be more expensive than a complete gaming console.
So, yes, they are for gaming 99.9% of the time. But please don't compare them with consoles.
> Let's call high-end graphics cards for what they really are: gaming console graphics hardware stuffed into PCs.
Look, I'm sorry, I'm with you for most of it, but you're holding this stick by completely the wrong end. Gaming consoles are closer to APUs than to discrete high-end GPUs.
Your rant about the GPUs in the $3k Studio is heavily misguided. What you are paying for with that $3k is mostly the new screen and its capabilities. The closest rival to that screen is a Wacom tablet, which is very near the same price point without all the hardware MS has added. Are you aware of the new stuff that is packed into that one screen? That is the question you need to answer for yourself to understand that price point.
As for having "outdated" cards, that is more a function of the production timeline. I am sure the Studio has been in the works for a while, and those were probably the best-in-class cards at the time.
Pro tip: When you have to explain to people why they shouldn't be disappointed by something they've thought about, you've almost always already lost the argument.
And the video card is outdated, period, no air quotes. There is simply no argument there. Making an excuse for it is nice and all, but it simply is a case of charging a premium for old tech.
He is directly rebutting one person's opinion, not coming up with a talking point after surveying people's reactions to the new announcements. Who's to say it's not the parent who was mistaken?
I know several artists who bought the Surface for exactly this reason and found it disappointing. The hardware in the Surface Pro is pretty mediocre unless you spend a lot of money on it (at which point a Cintiq Companion becomes an intriguing idea if you don't know what Wacom's customer support is like).
Most of those artists ended up buying iPad Pros to replace them with.
"Making an excuse for it is nice and all, but it simply is a case of charging a premium for old tech"
This is a weird complaint to me. So what if the display hardware is dated, if it is perfectly up to the task it's presented with? Maxing out the specs on a machine for no other reason than to max out the specs seems to me to be nothing but a waste.
The GP was the one making the complaint. I have no interest in the device. But if it is important to people, then it is important to people.
And even if it is perfectly suited to the rest of the hardware and any additional RAM/performance would be wasted[1], that's not a good excuse to overcharge for old commodity hardware.
[1] I would be extremely surprised if the rest of the system were designed around the video card. Actually, designed twice, since there are two video card options.
Another pro tip: when marketing to professionals, they will get it over time. If you're a real professional, you will follow the news, see the reviews and tests, and after a while you will understand new technology that was not properly marketed in the first place.
This "digital art graphic card" is starting to sound like "for my Big Data project I need these nonstandard tools and big compute pellets" when all they have is moderate data queries and a batch-processing workflow.
Not even just that: it's the last-gen mobile cards, which means you have to spend $4.2k just to have 4GB of VRAM on something that, at the end of the day, is really a desktop. The desktop 980 is ~$600 with 6GB of VRAM.
Except that you are paying for it. If it costs money (which it does) and you pay (which you do), then you are very much paying for it. And that's the problem. Not only are the 900-series GPUs a generation behind, the shift to the Pascal architecture delivers previous-gen performance for literally half the price (see GTX 970 vs. 1060).
Maybe the thing you want to pay for is the screen, but there's no avoiding the outdated GPU tax that you have to pay to get it.
Yeah, but my point was that if they were too far along with old tech, there were still drastically better pre-existing options for a product that isn't really mobile.
I share your surprise that Microsoft didn't use the 1080. I would really love to hear the backstory on that: was it getting it to fit? Was it the power envelope? Was it driver support? Etc. I expect that if they pre-sell a million Studios or more, we'll see an updated unit ("Studio Pro", anyone?) that is a 64GB/1080 machine, but time will tell.
If they started out with a max TDP of 100W for the cooling system, upgrading it to 150W in that small a space sounds like too much of a change to meet their deadline.
Clearly this does not excuse the lack of re-engineering, but it might explain why they went this route.
You're surprised that they didn't use a top-of-the-line card when most of their target audience will never even make the thing sweat? I have a 1080 and use it to play games at 4K while driving two additional 1080p monitors. It kicks ass, and I push it much harder than most digital artists ever will.
Well, there are two reasons I was surprised they didn't use the 1080. One was that this sort of lag-free drawing/manipulation is surprisingly hard on a GPU. When I'm turning a 3D part around in "quality" render mode in TurboCAD and modifying it on the fly, it works the GPU out fairly well. Unlike games, it's pretty much forced to recompute lighting fairly often.
The other was that they have already crossed the "luxury" threshold from "just another PC" into "special hotness" territory. So unless Nvidia is giving them a smoking deal on "old" GPUs, it seems that from a marketing perspective you would want to check off all the boxes. It's reported that the 1080 would use less power and generate less heat, and Nvidia says it is 20% faster. That seems like it would be a "no-brainer". But I clearly don't know.
That is why it surprised me when I read that the high-end Studio only had the GTX 980 in it.
I think the idea that it's a digital drafting table that can also be used as a desktop computer is the right way to think of it (as many have alluded to in this thread).
I wonder why MS didn't stick with AMD (like the Xbox One) if they weren't going to use a 1080 (or a 1070, which might make more sense).
As someone who used to enjoy drawing/doodling but never made the leap from mouse to digital pen, sketching and doodling on the Surface 4 was eye-opening. Any number of programs paired with the pen/screen feel a lot like ink drawing, without the mess and with multi-level undo.
The combination of a decent pen and great screen makes for a very satisfying and immediate drawing experience.
While I really like the look of the Surface Studio, I was shocked at the GPU choice and the fact that it comes with a hybrid drive. I mean, I do think it is still a good machine, even for the price, but come on!
In some use cases they are, since they contain an SSD as a large cache. They are also a lot cheaper than an SSD, so it's nice to get some of the benefits without all the cost.
They don't need to "max out" a card, because PS is not a game.
What counts is the time it takes to perform an operation like a Camera RAW import or a blur. If that goes down by a significant percentage, productivity goes up by a significant amount.
For drawing, good acceleration is the difference between a distracting experience that's laggy and unusable, and a smooth experience that gets out of the way of the work.
Besides, there are people with an interest in art and machine learning. Good luck with a 980M if that's your area of interest.
I am also puzzled by this... Maybe knowing they have more horsepower than they need empowers them. Or maybe they are doing a lot of high poly 3D work or something.
The architects at Foster + Partners designing the Apple campus use Revit, and being architects, they want the coolest tech. The Surface Studio hits that sweet spot better than anything from Apple at the moment.
I'm struggling to believe you're a digital artist if you're complaining about the Surface Studio. A Cintiq monitor, with no computer, costs nearly what the Surface Studio does, and gives you an inferior display to boot.
The machine is not "purportedly about graphics", it's about 2D artists, and they made that clear in pretty much every presentation I've seen on the device.
OK, graphics shouldn't really be that big of a problem if companies were trying to maximize them. Instead, they're trying to minimize the footprint of the actual computer without sacrificing productivity. Luckily, this provides an incentive for the eGPU market, which will hopefully give us compact devices with modular, powerful computing specs.
>> And the BEST I could even POSSIBLY get would be a 980???
Not only that: it's a mobile processor, the 980M, so it would lag some distance behind a desktop 980. That is cutting serious corners for a desktop at that price point.
Actual graphics and design pros (as opposed to Apple 'Pros') absolutely care about specs. Come hang out in VR developer land for a while, where interest in specs extends to system architecture choices.
Isn't something like the Mac Pro aimed more at you? It's packed with graphics hardware. Although admittedly they haven't updated that for several years.
For me, the most revealing piece of information from that Penny Arcade article was how much the 27" Cintiq costs ($2500). Makes the Surface Studio seem very affordable for what it offers.
In the last two years, all of the artists at our studio (which does animation and interactive work) have switched to Cintiqs. We were really apprehensive about making the jump, but they all seem extremely happy, and some have even reported that the pain they experienced after extensive tasks has gone away. I'd be really interested to see how they like the Surface Studio compared to a Cintiq.
Different strokes, I imagine. My fiancée is a graphic artist who had a Wacom tablet and a Cinema Display as her setup for a long time, but she got the 27" Cintiq a few months ago and LOVES it. I had similar skepticism that it'd really be that big of a step up, especially given what she was spending on it, but she says it's a noticeable improvement in her productivity.
And are you not a bit disappointed that the products you describe, Apple's desktop products, appear to have been totally ignored by Apple? No new iMac, Mac Pro, or Mac mini hardware was announced; it seems like it's not a focus for them at all.
I think their focus has been on mobility for some time now. Very few people actually need the added horsepower a desktop platform can offer due to the relaxed power and space budget. I've been working from laptops exclusively since 2003 or so.
lol, why would I be disappointed? Just use a laptop and dock it to a monitor when needed.
Why would anyone need a desktop in this day and age? Moore's law means the things that desktops used to do 5 years ago, laptops do now.
The compute requirements for photo editing, for example, haven't changed in 5 years. But the processors and systems (IO, displays, etc..) keep getting faster. It's only natural that laptops would cannibalize desktops. And phones/tablets will cannibalize laptops in a few years.
Can I suggest you learn something about a domain before commenting on it?
Photo editing can be hugely resource intensive. Camera RAW files have been getting larger over time, and all but the simplest edits demand an SSD, a fast processor, and at least 32GB of memory. 128GB isn't unusual for commercial work.
You can get by with less, but sooner or later - usually sooner - you run out of RAM and your machine starts thrashing.
Many Photoshop features are hardware-accelerated, so having a good graphics card makes a significant difference to editing speed.
3D animation definitely needs a good graphics card. So does video editing.
Audio doesn't, but it needs as much raw processor power as possible. I know DJs/producers who are running dual 12-core Xeon systems for their mixes, and adding PCI DSP cards on top.
Surface Studio looks very nice, but it fails to tick most of these boxes. The reality is there's a significant market for creative power users willing to pay good money for multicore server-grade towers for their work.
Neither Apple nor MS are paying attention to this market. The Mac Pro is an underpowered curiosity now, and Surface Studio has - sadly - been hobbled by greed and penny pinching.
MS is definitely not paying attention to the high-end workstation market. They are paying attention to the trends in interaction and the new interfaces that technology is allowing us. The product here is new styles of content creation, and an accelerated pace of current content creation, via the form factor and the accessory knob thing. If you have needs that demand extreme resources, you can probably afford remote rendering or processing using the myriad of wonderful networking technologies that have advanced so much.
It can't be an effective interaction device and a server-level resource at the same time. Anyone who is enough of an enthusiast to require a dual Xeon workstation is clearly not who MS is targeting with a single product. Leave the multicore server-grade towers to HP, Dell, Lenovo, etc. because there's nothing to innovate there - you just throw hardware, bought at market price, into a box. What is MS supposed to innovate on there if all you care about is specs/$ and ignore the design and use-case?
This is a very niche market. As laptops get more and more capable, it'll be an even narrower niche.
I love having just as much power as I can get. I'd love to have a multi-CPU POWER9 loaded with a couple of terabytes of RAM and a couple of flash cards attached to a fast bus, but I wouldn't know what to do with it.
> The Mac Pro is overpowered for them.
I'm not sure if you are trolling?
Moore's law has slowed down, by the way, and is predicted to end in 4-6 years (2020-2022), as it will be physically impossible to shrink transistors any further unless there is a change from the current silicon CMOS technology.
In fact, manufacturers are no longer even targeting the doubling of transistor counts! Instead they are focusing on speeding them up.
I'm not sure if you're being facetious but for those of us who enjoy playing video games a desktop is the best choice. If you don't care about portability, why not get something that is by far more powerful, cheaper, customizable, and future-proof?
You're in the wrong discussion if you think gaming has any relevance to this line of laptops. People who own MacBook Pros don't play games seriously. Gaming is strictly for Windows devices; it's largely an afterthought in the Mac world.
Are there many studios outside of Blizzard that are developing for OS X? I hadn't realized there were more devs working with it. Every other game I know of that has Mac support is usually running in some WINE equivalent, which isn't great.
So we have Blizzard and Source engine, who am I missing?
Ahaha, ahaha... No, no it's not. I can't imagine you play any 'serious' games. Hell, even WoW, a game which is well supported by the developer on OS X, plays like crap on my $2,600 MBP.
Well, I should have worded that better; I shouldn't have been so dismissive. By 'serious' I meant 'resource-intensive'. I don't know all of those games, but the games I do know are definitely not very taxing on a GPU (aside from Skyrim, but what settings are you playing it at? Does that seem worth it for the price tag?)
I don't think you'd argue that a mac is a good gaming rig from a performance / $ perspective (or any other perspective really).
Define 'good enough'. These aren't inexpensive machines. In the context of gaming, $2600 is far too much to pay for a laptop that gets ~30fps at medium settings.
I've never used the Cintiq, but I've been using Wacom tablets for a good 15 years. I owned and used the Surface Pro 2 and 3, and the iPad Pro with Pencil.
Both have their place: pen on screen or pen off. Working with your pen on a screen, especially a smaller screen, means you lose a good chunk of your visible space. Overall opinion: it is great for artists that two major companies have made a significant effort to get pen-on-display right.
"Overwatch was fast and fun on medium settings. Civ VI is what I’ve played the most on the Studio and it looks and runs fantastic."
"When I first saw the device months ago in that secret room at MS, they asked me what I thought. I said, “Well I have no idea if anyone else will want it, but you have made my dream computer.”
Nothing about that is, I went out and paid my own money to buy this.
So you must've somehow missed the "Just to be clear, I don’t get paid to do any of this stuff but I enjoy doing it and I like to think I’m helping make the Surface better for artists." while reading the article, right?
Advertising is more than just paying actors to pretend to like something. News articles rephrasing press releases are often also ads. Video game reviews usually don't involve simply handing money to someone; that does not make them unbiased.
People spreading viral videos are often unpaid, even if they are a critical part of a marketing campaign. Hand someone some free hardware and they don't think "is this worth X", just "is X cool".
I suggest familiarizing yourself with the history of Gabe/PA and the Surface. Gabe initiated the relationship by writing a post out of the blue about its potential as a drawing tool, as well as his complaints about it. In response, Microsoft invited him to a lab to try prototype Surface hardware/software and made a lot of changes based on his feedback. So they've basically made a machine designed to be liked by him, and they continue to get his feedback to ensure that doesn't change.
So, it's not surprising or questionable that his reviews are positive.
I actually read his first post about the Surface when it came out. The point is that MS is continuing the relationship for good press, and by becoming part of the process they have become biased.
If the consideration is that they made the product work for his use case, then that's not corruption in any meaningful sense of the word. That is exactly the behavior you want to reward companies for.
IIRC, he returned his last trial-Surface device and bought one with his own cash.
Based on the article (he's been lent this device for the last week or so), I imagine the same thing may well happen if he's happy with it - which it sounds like he may be.
Pretty sure the guys who've been making a very successful business out of Penny Arcade for the last couple of decades don't need to take backroom deals from Microsoft in order to make ends meet.
Apple delivered meat and potatoes for ordinary users (albeit with a weird garnish), while Microsoft delivered a molecular gastronomy dessert that a tiny niche has any use for. I don't see how that is to Microsoft's credit, much less "embarrassing" to Apple.
They haven't replaced my laptop with anything new. They just released a newer model. Similar to every yearly company announcement ever. Nobody cares about version 7 with nothing new, they care about version 1 of something innovative.
Wacom hasn't released an innovative product in 10 years, and much like Texas Instruments with their $130 monopoly on school calculators, everyone is still paying Wacom the same $2,500 they did decades ago for a Cintiq.
The lack of competition for Wacom is criminal and has allowed them to coast on last decade's product line for a very long time.
Hell, the most innovative thing Wacom has done in 10 years is rebrand the Graphire as the Bamboo. Ooooooh!
I imagine the day the Apple Pencil was announced was a very grim day in whatever is left of the Wacom R&D department.
Does Wacom have an entire suite of comparable products, or do they cater specifically to a niche market? Was their announcement yesterday? Life has context.
> What does that have to do with new MBPs?
I'll go ahead and quote the original comment you responded to: "This event was by far the most disappointing Mac event in history". Apple had an event where they announced not much more than the fact that we are now in late 2016. Nobody is saying the MBP itself is bad; the event is comparatively lacking.
* I think you can reasonably make a case that Windows 95/98/NT were better than MacOS 7/8/9. (You could probably make a reasonable case for the opposite; I certainly wouldn't argue the point).
* The Zune was better than the iPod. Unfortunately for MS not better than the iPhone a few months later. The "Metro" design language it introduced was quite a bit better than Apple's at the time (and started the "flat design" trend that even Apple would end up aping, years later).
* Various areas where MS beat Apple by default, at least circa the 90s-00s, owing to Apple making no or a half-hearted effort. Office suites, gaming consoles, web browsers, the server, 3D graphics, yadda yadda.
* C# is a far better language than Objective-C, and Visual Studio is a far better IDE than Xcode.
* Edge is arguably a better browser than Safari. MS have also made some great tooling for webdev recently, VSCode and Typescript, while we haven't heard much from Apple.
* The Surface tablets have been arguably better than recent iPads, by virtue of running a full OS.
> * C# is a far better language than Objective-C, and Visual Studio is a far better IDE than Xcode.
C# is a nicer language, and VS is a better IDE when it comes to language integration, but the platform APIs are far more productive on OS X than they are on Windows, and the GUI toolkit was good from the get-go, whereas MS never fully commits to a toolkit (MFC, WinForms, WPF, now UWP, which is like a restricted, not fully compatible WPF). Why do you think there are so many great, lightweight alternatives to Photoshop with non-destructive image editing on OS X, like Affinity Photo, Acorn, and Pixelmator, and none on Windows? Why is it that Apple can dogfood and write everything in their modern platform APIs, even rewriting their file browser, Finder, in them, while Windows ships out of the box with exactly zero .NET apps? Recently Windows 10 brought some .NET stuff out of the box on the desktop, but they're all toy apps no one would want to use, and certainly no equivalent to Apple Mail, Photos, iMovie, GarageBand, etc. The UWP mail client is pitiful.
The C#/.NET platform in general, on the desktop, as far as commercial desktop apps sold to consumers go, is a dead wasteland. The few .NET apps I've encountered as an end user I tend to associate with "slow, heavy, not featureful", particularly WPF apps. .NET's greatest success is the same as Java's: on servers, or on the desktop for in-house business software where the well-being of the end user is not a priority and no one cares if the app feels sluggish or has a terrible UI.
The lack of dogfooding has been a common complaint among Windows developers for a long time. For example:
> I understand deadlines and priorities, and I know that probably Microsoft just had to ship something at some point, but it really seems that there was a big lack of dogfooding in the WPF case.
> There’s a striking example of this: what was the number one complaint that developers had about WPF since 2006?… Blurry text and images. And when did Microsoft fix it?… Only in 2010, when they started using WPF for Visual Studio.
> Another issue that has been bugging me since I started using WPF was the airspace limitation. It seems that it’s finally going to be fixed in 4.5. Why do I think it’s being solved now? Because they probably needed some native WinRT component to play nice with WPF…
Microsoft still doesn't really use .NET outside of dev tools and server apps. UWP apps are just toy apps; UWP OneNote isn't even close to desktop OneNote. And so on. MS themselves don't really produce high-quality desktop apps with C#, and if they won't, then who will? Compare that to Apple, where everything they make makes heavy use of their platform APIs such as Cocoa, Core Image, Core Animation, etc. How could .NET not be a barren wasteland for desktop apps?
Apple has always had the better developer platform, dogfooded and thus battle-tested, and now they're also getting a nicer programming language to work with in Swift. Their IDE is still no Visual Studio, but AppCode from JetBrains fixes that.
Well, you could argue that Apple hasn't done a tablet PC. They've done a tablet-sized phone. Of course, where you draw the line between those is probably different for different people.
They did, just not in terms of the underlying concept. The sheer size and weight of the iPad was innovative. Doing away with a physical keyboard altogether was innovative. Using a capacitive touchscreen for a far more pleasant user experience was innovative. In short, many small innovations rather than one big one.
What's interesting is that it's a dual volte-face: Apple produced a gimmicky, consumer-oriented iteration, whilst Microsoft gave us something genuinely innovative and gorgeous-looking, but aimed at a high-end professional market with the money to spend. It follows on from Apple giving us desktop users a transparency abomination straight from the Windows Vista era. What is happening?
To continue the food analogy, it's as if Apple reheated some leftovers: the kind that taste better the next day, but when it comes down to it, it's the same food.
Some of you may remember when, many many years ago, Microsoft showed off an interactive coffee table: the Surface. And now, finally, with the Studio they have brought to reality a device that will be nirvana for designers, factory planners, circuit designers, and anybody else who can actually use an interactive surface (pun intended). I'm a product manager for enterprise software, and I'm bubbling with ideas that such an eminently usable large-screen touch device brings to the market. After decades of heaping scorn on MS for the stuff I had to work with that hasn't improved (talking about you, Word, PowerPoint, Excel, Visual Studio), they have, post-Ballmer, become an entirely different company, and I'm pleased about that. Competition is good for innovation, and the Studio is truly innovative in its engineering and in the things it makes possible.
> Microsoft introduced an entirely new kind of tool for digital artists.
It isn't an entirely new tool; it is an HDR, higher-res, larger version of a Cintiq, with a built-in slow PC running last-generation mobile graphics. The hinge and such is definitely improved industrial design.
Those screen improvements alone are probably enough for lots of professionals to switch from the Cintiq, even if they lose support for professional-tier desktop hardware. But what about the pen? As far as I can tell, the pen doesn't even have tilt detection, and from the videos you have to use the dial/puck to simulate rotating or tilting the pen.
In spite of that, it was definitely more innovative than anything Apple showed.
SURE! Except, of course, you're not going to be allowed to code it in Python, R, or Octave. Instead, it will be Microsoft(TM) Visual(TM) R#(TM) with the new NetDNA/DCOM/.NET/CloudyMcCloudface Framework (TM)!! (now, with 40% more Cortana!(TM))
> Microsoft introduced an entirely new kind of tool for digital artists
Actually, I believe they introduced a competitor to the Wacom Cintiq, which comes in a variety of sizes and does not require you to use a desktop or Windows 10.
Calling it "an entirely new kind of tool" is really overstating it.
Have you used a Surface before? I'm asking because when I picked one up about a month ago, I had an awful time with it. That makes me think their announcement won't really do anything but look interesting. I could be missing something, though.
>> when I picked one up about a month ago, I had an awful time with it
Did you get to play with it at a store or a friend's house, or actually take it home and use it for some time? Everyone I've met who has a Surface (Book, 3, 4, etc.) loves it.
Microsoft announced a laptop that gets 6 more hours of battery life than the MBP, while Apple used a chipset a full year behind Microsoft's.
As for the desktop: if you think people don't use desktops, you're out of touch with reality. Microsoft introduced the Surface Studio primarily targeting what Apple has started to abandon: creatives and creative professionals.
A 27" Cintiq tablet costs $2,800 on Amazon, and that's without the stand, which is an additional $600. This is what video editors, audio engineers, and graphic artists use in their studios. The Surface Studio replaces all of that, and it replaces the computer itself. What are you missing here?
> Microsoft announced a laptop that gets 6 more hours of battery life than the MBP, while Apple used a chipset a full year behind Microsoft's.
Which means Microsoft has introduced a dual-core, low-voltage part significantly less powerful than the one in the MBP. Let's not kid ourselves here: Apple was not going to use a 15W LV Kaby Lake part in either MBP, and the 45W parts are still 4-6 months from release.
> They do use the 15W Skylake in the non-touchbar MBP.
True. I wasn't talking about the red-headed stepchild, but you've got a point there. The hold-up there might have been that there's no Iris KL (incidentally, I'm pissed that they've apparently gone with the non-Iris Pro on the 15", what the fuck).
Well, not only is there no Iris KL, there's no quad-core KL yet either. A dual-core 15W CPU in a pro machine seems... inadequate. Also, all the 15" models have dedicated graphics, so an Iris Pro would seem like just a waste of TDP to me.
The IGP on the CPU is considerably more power-efficient; the dedicated graphics can be pretty much switched off for anything that isn't gaming or GPGPU-related.
Yup. I'm just saying that having (better, more power-hungry) integrated graphics seems like a waste when you already have a dedicated GPU for when you want more performance.
Dude, I use a desktop. A Mac mini, in fact, which has been struggling for the latter half of the past 6 years.
I'm seriously considering getting a PC instead, even though I've been writing http://taoofmac.com for 14 years.
(I do happen to work for Microsoft, by the way, but I would very much prefer to keep my home setup on a separate tech stack for the sake of keeping an open view. I'm now ogling the Skull Canyon NUC and wondering how well it will run elementary OS.)
On your blog, you wrote:
As far as I’m concerned, Apple is completely out of touch with my segment (call it UNIX-centric pros, if you will), so I’m going to seriously rethink my options over the next couple of weeks.
I would guess that just here on HN there are tens of thousands of us who could describe ourselves as "Unix-centric pros", who want powerful unixy client machines that match our unixy servers and who also want all the client-side stuff everyone needs (excellent video, audio, wifi, battery mgt., etc., drivers perfectly matched to hardware and working out of the box) plus a full range of apps. We can get the unixy client from Linux, BSD, or Mac. We can get the reliable, everything works consumer client stuff from Windows or Mac. The only overlap is Mac.
Unfortunately for us, this power user unix workstation stuff is just a historical accident for Apple, not a corporate objective. They're trying to get out of the computer business so they can focus on fashionable, high-margin, designer "lifestyle" products and services for the hip ones who consume stuff, not the old drudges in the back room who make stuff.
Microsoft holds their Windows legacy so dear that they'll slap more layers on the papier-mache balloon that is Windows but never put unix at its core, and all the nifty client-side Linux systems out there seem to have the motto "before you complain, realize that how much of it ends up working depends on how smart you are".
I would wish for Google to develop a line of powerful Linux-based workstations, but they would just spy on me until Google canceled the project altogether.
I wish Apple would create a wholly-owned subsidiary called "Apple Computer, Inc." that would have access to Apple's manufacturing expertise but would build a line of computers for power users who produce stuff--computers that weren't "thinner".
> Unfortunately for us, this power user unix workstation stuff is just a historical accident for Apple, not a corporate objective.
Many who went Apple when they showed OS X to the world never really understood Apple or, for that matter, NeXT culture.
The POSIX layer never mattered; what mattered were the Objective-C frameworks on top and a workflow based on Xerox PARC ideas that Steve Jobs brought to Apple and NeXT.
The UNIX compatibility was there because NeXT was competing against Sun and IRIX workstations, so it needed to be compatible to a certain extent.
Later, Apple kept promoting the same compatibility because, after the Copland failure, they needed to sell computers or close shop, so UNIX with a nice GUI was a way of doing it.
For those of us that develop native applications, it never really mattered.
> Microsoft holds their Windows legacy so dear that they'll slap more layers on the papier-mache balloon that is Windows but never put unix at its core
Bash on Windows isn't at all the same as Windows on Linux.
I do appreciate that they are trying to do as much as they think they can without throwing away the thing they believe is their most valuable corporate asset: the mountain of stuff built over the years for Windows. They have carefully preserved this asset by making huge efforts to ensure that anything that worked on Windows in the past will keep working forever.
The thing is, what I wish they would do for me (which might not be what they need to do for themselves) is to take the equivalent of my Linux server and wrap a Windows API around it, the way a Mac is like a Linux server inside with a Mac API wrapped around it for client software usefulness. I'd be just as happy with a Windows GUI/API as a Mac GUI/API, but what I want is for it to resemble my servers under the GUI, so I can leverage whatever unixy skills and tools I may have.
Having a simulation of a unix shell on top of an NT core makes Windows more useful than it would otherwise be, but it's not the same as a Windows shell on a Linux kernel.
> the way a Mac is like a Linux server inside with a Mac API wrapped around it for client software usefulness
No, Darwin is Mach with BSD welded to it; it is not a pure Unix design. Likewise, WSL is the NT kernel with the Linux syscall interface on top, via the Pico Process interface. The NT kernel was designed for portability; in fact, POSIX compatibility was a design goal from the early days, in contrast to Mach where the idea of mashing together Mach and BSD was never considered.
In many ways, WSL is a much cleaner design than Darwin is.
I did exactly that. After using the Mac forever (I started on 10.1), I got kind of fed up and made the jump to an Intel NUC with Linux. I tried elementary OS and didn't find it to my liking, but I really, really dig Fedora, specifically GNOME 3. The breadth of choice in Linux desktop environments at the moment is neat, and not something I expected when I made the leap to PC.
Sure thing. First, some details. I switched to the NUC three or four months ago as my main machine. Mine is the NUC6i5SYH model. I installed 32GB of RAM and filled both drive slots. Windows 10 is installed on one drive, Fedora 24 is installed on the other.
As long as you have no plans to play games, it's a wonderful machine for development. It's really fast for its small size, noticeably faster than the Macs I'd been using before. I spend most of my time with the NUC doing Ruby development in Fedora, with some Clojure and Node.js on the side. I've done some office type stuff with LibreOffice. I've dabbled in OpenSCAD while working on a 3D print. The NUC is more than enough power for all of that. I haven't played with Windows 10 too much, but it's also snappy when I need to boot into it.
It's a very quiet machine. On full blast, you might get a blowing sound similar to a stressed MBP. But usually in normal usage it's no more than a low hum; I can't even hear it unless I listen for it. You can throttle down the processors in the BIOS to make it nearly (totally?) silent if you like.
The most annoying thing about the NUC was installing the BIOS updates before getting started. The NUC6i5SYH had a lot of problems at launch that needed to be patched. Once I got beyond that, everything was smooth. The NUC is basically a laptop board shoved into a small box. If the things you want to do are possible on a laptop, they'll be possible on the NUC, no problem.
Thanks for the response; it seems that your use case is very similar to mine (Ruby, Node.js, etc.). I actually decided to go with the i3 model with 16GB of RAM, since I don't think I will do anything extremely demanding or graphically intense, and it's super affordable to boot. I'll run elementary OS or Fedora on it; hopefully it will all work out.
I wonder how Visual Studio runs on it. I am a gamer, but I'm very tempted to go NUC for development and stay with the Xbox One for gaming until I can afford a proper gaming rebuild.
I still use a 2010 MacBook Pro (i7, 8GB RAM, 256GB SSD) to run my IDE and all the "normal" apps (Skype, Slack, Chrome), and a NUC (NUC5i7RYH) on Ubuntu to build and run the stack.
I mostly access the NUC via SSH (and rsync my codebase between the two computers). The NUC is also connected to a monitor and I use the MacBook keyboard and mouse to drive both computers with Symless Synergy.
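For anyone curious, the sync step is nothing fancy; here's a minimal sketch of the idea (Python wrapping rsync; the hostname and paths below are placeholders, not my actual setup, and it assumes rsync and ssh are installed):

```python
# Minimal sketch: mirror the local working copy to the NUC over SSH with rsync.
# Hostname and paths are placeholders; assumes rsync and ssh are on the PATH.
import os
import subprocess

local_dir = os.path.expanduser("~/code/myproject/")  # trailing slash: sync contents, not the dir
remote = "user@nuc.local:code/myproject/"            # path relative to the remote home directory

subprocess.run(
    [
        "rsync",
        "-az",            # archive mode, compress over the wire
        "--delete",       # remove remote files deleted locally, keeping both sides identical
        "--exclude", ".git/",
        local_dir,
        remote,
    ],
    check=True,           # raise if rsync exits non-zero
)
```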
If you are fine with dual-core, get a NUC5i5 with 16GB of RAM; Yosemite and higher run absolutely flawlessly there. Skylake NUCs are still a bit of a mess, unfortunately.
> Microsoft introduced a 28" desktop PC in an era when nobody uses desktops anymore.
"a 28" desktop PC" is a bit of an understatement. It's a huge low-latency touch display which works like a drafting table.
> nobody uses desktops anymore
There's a desktop on most desks in our office. A Macbook is certainly an alternative option but not if you're looking for a large touchscreen display.
> For the same price as the new Surface Book i7, I can get an MBP 15" with bigger screen and quad-core CPU
If there were really a performance problem, I think a lot more people would be upset with what was announced. Last-gen means more stability and likely cooler-running hardware. Also, does the 15" MBP have a detachable/touch screen? Does it come with Windows? These are real considerations for potential Surface adopters.
The MacBook is simply a laptop. There's no problem with that, but the Surface is selling more than that, so a direct comparison on cost and performance isn't really fair. Or are you suggesting Apple is the only company capable of putting newer hardware into a computer?
A slight correction, though your point stands: you would need to buy the $2,800 27" Cintiq[1] and a new computer to compete against the Surface Studio offering. This makes the Surface Studio a more compelling offering on the surface, at least to me.
Penny Arcade's Mike Krahulik has been using one for a week and compared it to his Cintiq:
> Tycho asked me to compare it to my Cintiq, and I told him that drawing on the Cintiq now felt like drawing on a piece of dirty plexiglass hovering over a CRT monitor from 1997.
It's kind of amusing watching Apple fans replay the "spec" and "niche" arguments they used to pooh-pooh. Kind of like watching them do a smooth about-face on the superiority of PowerPC processors back in the day.
I've been running a custom desktop PC at work for years. I tend to want the best possible hardware, and getting a custom workstation is the cheapest way of getting it. I'm very happy with my machine, and I've never had any need for a laptop in the office.
It's just trendy to complain about everything Apple does, although most likely 90% of the complainers will be getting these new MacBooks anyway.
If they just push the specs up and increase battery life: "Apple isn't innovating anymore", "So what's new?", "It looks just like my old MBP", etc.
Now that they actually bring something innovative, you get this kind of complaint.
Now, no matter whether people liked this event or not, I find it hard to believe anyone seriously thought yesterday's MS event was any better. Overpriced, useless, last-gen crap at an insane price.
> Overpriced useless last gen crap for insane price.
That perfectly fits my opinion of Apple. ;)
I jumped off the Apple bandwagon years ago, but an important point here is that Apple events used to be exciting. They would announce something interesting and worthy at every event, which is why so many people even started watching the live streams.
The iPhone is still a really great phone, but iOS has been playing catch-up with Android for years. And apart from mobile, Apple hasn't done much that's innovative or interesting for quite some time.
That's why opinions are so harsh. People have come to expect more.
If that's your opinion of Apple, I wonder what you think of the others, then.
Apple products usually have top-notch, latest specs and are among the most powerful products out there, both in laptops and in smartphones. So definitely not last-gen crap.
Overpriced? Expensive, maybe, but probably among the least overpriced, actually. Unlike most manufacturers, who just throw Windows or Android on top of their generic hardware, Apple designs a lot of the hardware and software and their integration in-house, and they actually have costs to cover. (I hear Silicon Valley engineers aren't exactly free.)
I don't know what counts as "much that's innovative". I guess it's subjective. If we look at the last few years: 3D Touch, Touch ID. That's not so bad.
If we look at the whole history of Android... eh... well... hmmm... Samsung Edge... maybe?
I agree that Apple isn't innovating as well as they did back in the days of Jobs, but nobody else really innovated back then either, and they sure as hell don't innovate now, unless you see putting unnecessarily high specs in a smartphone as an innovation.
You mean low-spec RAM? Slow SSDs? Obviously soldered in, lel...
A GPU that is apparently on par with the last, severely slower (compared to the 1000-series Nvidias) generation of GPUs? 2GB of VRAM, lel again.
A CPU that is nothing to gawk at? (2.6-2.8 GHz nominal)
Crappy sound, crappy cooling, zero moisture resistance, all for ~2,800 bucks.
Meanwhile, the Lenovo I've got in order to work with Autodesk Inventor has the same CPU, a 1TB 7200rpm HDD + 128GB SSD (replaceable), up to 32GB of RAM (I've got 16GB in 2 slots out of 4), a 960M with 4GB of VRAM, an SD card reader that is amazing to have, bitching sound that kind of surprised me, and a full-size keyboard, all for a whopping ~$1,000. (And I'm fairly certain it will not die from humid air; I'm guilty of sometimes using it with water dripping off my fingers, even.)
Sure, it only has a 1080p screen, and I don't care about battery life (since I use it as a mobile workstation that is plugged in 99% of the time), and it obviously comes with Windows 10 (THE HORROR).
The point is: stop spreading bullshit about how amazing Apple hardware is. For the price you are paying, it is a joke.
I used to be Thinkpad hardcore too. I switched to Mac when the x200/x201 hybrid that I built died (it fell off the top of a server rack) and work bought me an MBPr. I'm pretty firmly in the Apple camp now. I used to be a hater, and holy shit was I wrong.
Now that my workflow is mostly photography and 4K video editing, I couldn't imagine being back on a Thinkpad with Arch/CentOS. Windows is dead to me and has been for years; I'd go back to Linux FAR before Windows, but the workflow on OS X is just so damn sweet that I'm not even thinking about it.
There's really no company that comes even close to Apple for my use case. That being said, I wouldn't pay $2,800 either. Let the suckers buy it; I buy lightly used for half the price. The new screen is really, really tempting, though.
I think we would be all much more receptive to their innovation if it didn't also include the removal of important features like the Escape key, USB type A, HDMI, and headphone jacks.
If they replaced function keys with the touch bar and left everything else alone (save for faster components), I don't think half as many people would be complaining.
I just don't care about the Touch Bar. It's fine, I guess; I'm not religious about the F keys. But only USB-C? No USB-A? Not even MagSafe?? That's the last nail in the coffin for me, I'm afraid. It's supposed to be a Pro machine, after all.
I sit at an Apple desktop (macbook pro with 2 apple monitors) for most of my day so I can use my giant screen to design interfaces and I'm not cramped into a laptop's tiny 13 inch screen. The last office of about 5,000 devs and designers that I worked at had desktops with monitors. No one worked on laptops except a few people here and there.
I would lose HOURS of productivity per week if I were not on a dual screen set up. It's just better for what I do.
That being said, if I need to go mobile, which I normally don't, I use my MacBook. I think I've disconnected my MacBook from my monitors maybe 2 times in the last 2 years.
> I sit at an Apple desktop (macbook pro with 2 apple monitors)
You're kind of proving his point... even the people who don't need a portable computer get a portable computer instead of a desktop one nowadays. (By the way, I don't agree that desktop computers aren't used anymore: I spend most of my day using a desktop computer. And a desktop telephone, another supposedly disappearing technology.)
> I would lose HOURS of productivity per week if I were
> not on a dual screen set up.
I agree that desktops have their place, but it's not because of dual screens... This latest MBP supports 2 at 5K (or, presumably, 4K if not buying Apple screens...)
Almost everyone I know has a desktop. In fact, the only people I know who don't are my parents and people who only need a laptop for email.
I happily paid $1,200 for my graphics card, and while that is too much in general there is definitely a demand for high end computing and gaming.
Sure, this doesn't really apply but if I worked in a creative position this machine would be a godsend.
I work in an open-plan office of about 40 people; a mix of business and tech staff. I'm the only one with a desktop. The 'killer feature' of the laptop here is being able to take your computer into meetings for various reasons.
Mind you, if you totalled up all the staff hours wasted this year with people wandering around searching and asking others for various dongles, we probably could have afforded another couple of MBPs.
Scala development, and some devops stuff that required spinning up VMs. I mean not a lot; like 6 ~ 8.
At work, I use a laptop with 16GB of RAM for Scala development and no VMs (we do everything in Docker and deploy to Mesos/Marathon).
I do have a desktop at home now with 32GB of ram, but it honestly feels like overkill and I may scale down. There is a lot of dev work that does require a pretty beefy workstation, but I've been doing HD video editing, Photoshop/Lightroom and Scala/Java work on laptops for years.
I run several Docker containers with Java-based microservices on a Win10 Dell XPS 13 laptop, while debugging a PhoneGap client app at the same time. I don't see any performance problems with that. The moment when laptops became good enough for cloud development happened 2 or 3 years ago.
Hm, I don't know -- have you used more powerful stuff? I have a 2x10-core desktop with 64GB of RAM and a relatively powerful Dell Latitude workstation laptop -- and the latter is significantly slower. (I'm also running Java microservices inside containers, mostly.)
No, I haven't used such beasts. I have a desktop with 8 cores/32GB of RAM, but I don't spend much time on it. Top hardware wouldn't add much to my personal performance, so I won't spend money on building such a workstation.
>> A lot of people have desktops, myself included. I'm not really sure where you got such an impression.
Not just that... If nobody uses desktops, why would Apple waste so much engineering and design resources on something like the iMac if there's no market?
>> For the same price as the new Surface Book i7, I can get an MBP 15" with bigger screen and quad-core CPU, and it's Microsoft that hit it out of the park?!
While they are alike in many respects, the Surface Book is an orange in comparison to the MacBook Pro in three important ways:
The Surface Studio isn't really just a desktop PC - the whole swivel stand and touch-screen thing makes it a really interesting proposition for people who do a lot of creative and design work. No, it's not mass market, but it is a lot more interesting than anything we saw today.
Overpriced Microsoft devices, although impressive. I am more worried about the silence from MS on MFC given the world's investment in it. We can't all just rewrite our apps in UWP overnight. The foundation classes were just that - a foundation, and we can't move the giant building built on top.
For the Mac, goodbye MagSafe and SD cards and optical audio in (or any audio line in, actually). What's the point of a super thin device if you have to carry it in a thick bag due to the bulkiness of the 24 adapters to make the device useful or connect it to anything? It beggars belief.
"We've made the thinnest car ever! Steering wheel is an optional extra".
I think we are reaching the pinnacle of computer hardware and software development. We may have even passed it.
In short, I'll stick with my current MacBook Pro for as long as I can. That new giant touchpad looks to make typing tricky.
"MFC can't be used for for Windows Store apps because it uses too many APIs that are banned for store usage. In order to allow MFC usage in your scenario we would probably have to break MFC into multiple DLLs, which would be a prohibitive amount of work."
I think the main difference isn't "who announced something that I'd prefer buying tomorrow", but who announced something that shows a ton of potential growth areas? The MBP refresh announced today seems OK. It's where the MBP is right now. The Microsoft announcement yesterday showed where computers should go. It's a different approach, and kind of subtle, but I see a much larger potential in the Surface Studio's future vision for desktop/portable PCs than in Apple's.
I'm pretty sure our office of 300 people only has desktops in it, so saying that no one uses desktops is a bit much, no? Also, all of our artists (~80) have the Cintiq 24" tablets, which could now be replaced by the Surface Studio. It's a fantastic device, if you have a use case for it.
Most illustrators I know are using a combination of an iMac and Wacom tablet. If it is an agency, they'll possibly be using a Wacom Cintiq, which approaches $2,800. The cost of a Surface Studio is not out of the ordinary.
https://us-store.wacom.com/Catalog/Pen-Displays/Cintiq/Cinti...
For comparison, my 27" Wacom Cintiq cost me $2500, which is just a screen with a stylus and the Studio blows it out of the water in every measurable way.
I'll take my desktops for doing real work. I don't know of any laptop that can drive four- and five-screen setups with roughly 4' x 3' of display area.
The laptop is fine for fooling around, or "working" from home on the couch, but I'd lose my mind trying to do anything on that small a screen, with that few pixels and a cramped, irregular keyboard with no travel. Plus trackpads. Fuck trackpads.
We have about 150+ engineers with Macs - and precisely zero of them asked for a desktop versus a MacBook Pro. Consider the possibility that your statement, "The laptop is fine for fooling around" is a fairly personal experience that isn't necessarily representative of the broad population.
>"People dramatically overestimate their ability to manage their environment," Meyer says, adding that in some ways, using multiple monitors to keep all sorts of data visible is analogous to using a cell phone while driving.
Well, sure, but isn't it rather obvious that using resources to engage in distracting non-work activities is detrimental to work efficiency?
I've got 6880x1440 worth of curved display on my desk, and obviously, watching Netflix, browsing Facebook, or otherwise distracting myself with one of them would be a net productivity loss.
Studies show that it's true for the broad population, too. More screen real estate and faster processing directly and significantly impact focus and working memory.
Perhaps your employees aren't perfectly rational actors whose incentive is to maximize the utility of their tools while at the office?
For example, there seems to be a correlation between 1) wanting to keep git work local to a workstation and not push anywhere with public visibility, and 2) wanting to use a laptop so you can bring that local repository state with you.
Neither desire is necessarily aligned with what benefits the company (or the quality of work), but they do drive a laptop preference.
First, they provide a good summary of previous findings:
"Previous research has demonstrated productivity increases for users performing tasks on larger or multiple screens. For instance, a 9% productivity increase was noted while using wider screens (15” flat panel vs. 46.5” curved screen)
(Czerwinski, et al., 2003)."
"Similarly, a 3.1% increase was noted using dual screens over a single screen (Poder, Godbout, & Bellemare, 2007). Task success has been found to increase for tasks performed on 4 – 17” monitors vs. a single 17” monitor (Truemper, et al., 2008). Likewise, tasks were performed faster and with less workload while using 2 – 17” monitors over a single 17” monitor (Kang & Stasko, 2007)."
"Other studies have noted increased window interaction and open windows while using multi-monitor configurations (Hutchings, Smith, Meyers, Czerwinski, & Robertson, 2004)."
"Issues with single monitor use have been well documented. Generally, higher mental workload, more window switching, repositioning, resizing, inadvertent opening, and closing of files have been associated with single small monitor usage (Czerwinski, et al., 2003). Users generally perceive small single monitors as requiring more workload (Grudin, 2001; Hashizume, Kurosu, Kaneko, 2007)."
Second, their own findings:
"Participants felt more rushed, that they worked harder, and were more frustrated [with a single monitor configuration]."
"Participants spent more active time in the PDF reference document while completing tasks on the single monitor configurations"
"Participants were observed to leave the PDF reference document open and viewable (but not in focus) while working with other source documents when in the dual monitor configurations."
"Participants clicked less during tasks on the dual 22” monitors than the dual 17” monitors and single 17” and 22” monitors."
"Participants switched between windows more frequently during tasks on a single 17” and 22” monitors than the dual 22” monitors."
>> Given that they're using glossy-screened laptops — maybe they care more about looks and hipness than about getting the job done?
Why the hate against glossy screens? The colors pop more and unless you're sitting in a spot that's prone to glare, you're just as productive as anyone else. I've never had an issue with a Macbook's glossy screen affecting my ability to work.
If I need to look at it the whole day, I find glossy hurts my eyes more. If it is about having something pretty as a status symbol, then yeah glossy makes sense.
Sure, that's valid. But everyone's eyes are different, which is why I don't get why there are so many glossy-screen haters out there. I never had that problem sitting in front of my glossy-screened MBP for 8+ hours a day for 3+ years (it died, but not because it had a glossy screen).
>> If it is about having something pretty as a status symbol, then yeah glossy makes sense.
Aren't glossy haters mocking/judging people basically using matte screens as status symbols? i.e., you must be a hipster or shallow and not a real programmer/{insert other profession here}?
Indeed. For me, it comes down to ergonomics. Monitor at eye height, keyboard comfortably positioned. Laptops force you to hunch in, eyes down, hands squished. It's a recipe for RSI. I'm sure some people can manage it fine, but I can't.
Both the Apple and Microsoft presentations were underwhelming for companies of their size. However, the latter introduced a completely new product which seems innovative. Meanwhile, Apple just gave their MacBook Pros a touch bar and basically neglected the updates people have been demanding for their other product lines (e.g. the MacBook Air).
And honestly, the MBP is not the best purchase for specs. You can get a cheaper Dell XPS 15.
This is Microsoft making a product that designers love.
The behemoth which everyone considered dead is finding ways to make inroads into Apple's old stronghold, with design innovations.
Yeah, it's going to have to deal with Windows, but this is a huge turn for a creature with that kind of momentum to overcome.
With the internal stack reorganization (rationalization), having to build their own reference PCs, and buying their own pen tech, they are doing what a good firm should do - work hard on improving.
They have far to go, but there are lots of signs of them moving in the right direction.
Whereas, in sharp contrast, you have the iPad Pro, which still runs iOS, and the touch bar. And let's not forget their moment of courage, where they decided to obsolete the entire world of headphones.
Everyone has already chewed you apart for this but
> nobody uses desktops anymore
That's just not true. Maybe you only use computers for social media and youtube but other people need things that can actually run resource intense programs.
No one uses desktop PCs? I don't know what kind of hipster country you live in, but they are certainly still used heavily.
The product that really hit both of them out of the park was the Razer Blade Pro. Legitimately better in every single way than both the Apple and Microsoft offerings.
http://www.techradar.com/reviews/razer-blade-pro-2016
Lots of people use desktops. And this is more than a desktop PC, the "studio" functionality makes that clear.
And last-gen/overpriced current hardware is what Apple has been doing for a long time, yet it's easy to see by their success that it's not pure specs but rather the actual experience that matters.
There are plenty of existing laptops that can ALREADY outperform a MacBook Pro. Meanwhile, Microsoft is innovating. Apple replaced their laptop lineup; there are no new products, and for my job the lineup is virtually the same. Hell, the lack of an escape key has me looking at ThinkPads + an iPad for music.
For general computing (e.g. web browsing), yes. But that doesn't cover all the possible uses of a computing device. Gamers, designers, scientists, video/media professionals are some of the people who use primarily desktops.
Maybe a relative perception. Apple has a historical relationship with content creation personal workstation landmarks, so a mild event is a let down; Microsoft is new in this, so triggering excitement there is a very positive surprise.
I'd argue that this isn't true, especially for high-end artists, developers, gamers, and... well, my anec-data (gathered in only the finest biased areas) shows that this is blatantly false.
i want the microsoft thingie, but know that the app scaling issues and other MSFT funkery will still be there, under the polished sheen. if only that thing ran some linux flavor, and some linux flavor ran adobe creative suite, we'd finally be free from mac/windows forever! it's the new bernie sanders.
New kind of tool? All-in-one PCs with touch screens have been around for ages. The Microsoft event was just cool branding - taking a page out of Apple's manual, selling something that has existed forever, packaging it neatly, and saying it's innovating when it really, really isn't.
Welcome to the world - a place where perception matters more than reality.
This is why every single company has giant marketing departments and this perception informs so many of our decisions both consciously and unconsciously.
This is the real danger that apple is in - that how it's perceived is changing even if its products continue to be superior.
Without taking a side, can I just say that this entire thread is bringing a little tear of nostalgia to my eye?
I love that even in 2016-- with all the new companies and platforms and wars, and desktops arguably being the least important battleground-- we can still have an old-fashioned bitter Mac-vs-PC argument thread once in a while, with all the swearing and name-calling, just like when I was on Slashdot in 2003. I love it.
Not just Slashdot in 2003. Also Slashdot in 1998, Hacker News in 2008, Usenet in 1988, and all throughout the intervening years and back to the very dawn of recorded history. Sumerians were exchanging angry commentary written on their clay tablets about the virtues of Microsoft, IBM, and Apple computers in the Euphrates swamps six thousand years ago.
Meh, I don't think this is the same. Back then, the PC mainstream looked at Apple buyers like a rabble of fanboys who didn't know what they were buying. And indeed a lot of them were exactly that: artists, musicians and schoolteachers overpaying for hardware in order to get the superior MacOS experience.
Today, the mainstream in tech circles is a mob of disgruntled Apple buyers that Cupertino spent decades courting and accumulating with bold moves (like the MBA and the MBPR), who now don't see a reason to stick around. It's a bad trend for Apple.
I'm glad someone made this comment, although I confess I stopped reading the whole thread after the comment count went above 700 or so, so for all I know there is another like it in the dogpile here.
There is something about this topic that really gets the comments going, and I wonder what it is.
It's not just Apple vs. Microsoft. It is also Apple vs. what Apple should have done, or Microsoft vs. what its competitors are doing.
I propose that a simple A vs. B discussion isn't enough to provoke all this. Today's Apple announcement is being taken as a kind of indirect referendum on the future of nerd-friendly high-end computing hardware as a whole. As you point out, the waning significance of this question (with the rise of phones and tablets) has caused a lot of anxiety out there!
I was going to say you were wrong because it was stated "starts at 16GB" near the end of the event. Checked the store and sure enough, I see no way to increase the RAM.
They even left the fact that it is still LPDDR3 out of the keynote. It was the first question I was thinking of. I wonder when Intel will support LPDDR4.
LPDDR4 is supported by some Intel CPUs, but overall memory support across Intel CPUs is pretty inconsistent for no apparent reason, since they are all based on the same core design.
That said, you won't see any difference in performance between LPDDR3 @ 2133 MHz and LPDDR4 at 2000-2400 MHz.
It's likely not even in full production since the roll out was announced a week ago.
Also, nowhere does it say what speed and bandwidth they'll support. Overall there isn't much performance difference between LPDDR3 and 4, or between DDR3 and 4, when the frequencies are off by only a few hundred MHz.
That said, I'm actually disappointed that the memory on the MBP, especially the 15", is LPDDR. There's no reason to degrade memory performance on a machine which already has a pretty big battery; DDR4 is pretty darn power-efficient as it is.
This to me looks more like another cost-cutting decision by Apple to simplify board design and the supply chain.
No 7th-gen CPUs, low-power memory for no reason, dumping of ports, and a pretty steep price hike.
I'm glad I bought my mid-2015, pretty fully upgraded MBP 15 with a dedicated GPU for £1480 on sale last holiday season. This one is not even meh; it's a full-on pass. The only feature I like is Touch ID. I still don't understand why Apple doesn't allow you to unlock your Mac with your phone, but now I know why.
Low-power memory is kind of a big deal because, unlike the CPU, memory doesn't sleep. Even as it is, I expect big battery-life regressions relative to the already inadequate 8-9 hours of the 2015 models. Apple cut the 13" model's battery by 1/3 and the 15" model's by 1/4. No reason to lose even more battery life by ditching LPDDR3.
As for the 7th gen CPU--there are no available quad cores. I don't think Apple wanted to release redesigned MBPs with the 15" flagship model having a CPU one-generation behind the 13" model.
While memory doesn't sleep, DDR4 supports Deep Power Down and Low Power Auto Refresh even in the non-LP form factor (this is 50-80% power reduction, there are also variable timing modes which can be used to reduce power consumption during normal machine power states).
I am OK with LPDDR3 being used for the 13" one, but on the 15" they should've gone with DDR4, tbh.
As for the chips: yes, there are no quad-core Kaby Lake parts yet, but then again I don't understand why they didn't just stagger the release like usual. The MBP 13 could come out with Kaby Lake dual cores and the 15 could come out later when the quad cores are out, which is December this year.
FWIW it looks like the only available option for 15" rMBP is 16GB. Personally it's more than I need, but it's still interesting that there's no room to adjust.
No 32gb? Is this true? Someone please tell me that's not true.
I was (against all odds), hoping for 64! If this is true, consider me disgusted. I had 16GB in my (long lost, loved) 17-inch Macbook Pro, in 2009! I really cannot fathom what this company is thinking. I'm bewildered, disappointed, angry, in equal measure. I've been waiting for a killer Macbook Pro for literally 18 months! Sorry. I'm going Lenovo/Dell or why not, Surface Studio. For the first time in 20 years. I'm going Windows. I will not support a company which insults me with this. If this is Jony Ive's work, if they are so self-confident in their "design" that they cannot put a decent hardware spec together, then these people need to be retired.
Or maybe I just haven't cottoned on? They just don't care about high end users? Or are they completely out of touch? I don't understand.
You're god damn right it's insulting. Same boat as you. It's really annoying. I'm trying to work here. It's not like software's getting any less bloated.
Now that SSDs in MacBooks are not user-replaceable -- you have to buy as much disk as you'll want for 3 years all up front -- it's nearly $3k for the cheapest 15-inch MacBook, not even including AppleCare for 3 years. Ouch.
My instinct is that they are going to flood the early adopters market with low RAM laptops. Then silently introduce upgrade options something like a few months down the line. This serves the dual benefit of being able to pump them out faster and giving people a reason to buy more of them.
The first example I came across, they did it after 8 months in 2011. Maybe you're right, maybe the Macbook Pro is just a beefier Macbook now. They didn't even splurge on the most up to date Intel chips, but the price went up?
I don't think the very small bumps to CPU clock speed in your example are in any way comparable to the suggestion that they will introduce 32GB as an option or the default in a few months. So I don't think this is an example of what is being discussed.
This is another great reason for serious power users to never buy the first generation of Apple equipment. Historically a quick refresh comes out, and it sometimes takes two gens to get it right.
Glad right now I timed my last Macbook pro purchase to have a year of applecare left on it, and it could easily last me another 2 years thanks to being maxed out when I bought it.
Apple is a company looking to maximize its profit as well as build good products. A lot of laptop manufacturers are starting to catch up on the hardware side (the new Zenbook); it's just the macOS experience that nobody has created an equivalent to yet.
"Starting"? Competitors have caught up. The current Dell XPS 15'' blows the just announced MBP out of the water: same cpu, 32gb ram, equivalent GPU, more ports (including TB3), similar weight and dimensions -- at 50% of the price! The Razer Blade is "only" 20% cheaper, but has a smoking GPU. The HP ZBook Ultrabook is again 20% cheaper, 32gb, etc etc etc.
Apple dropped the ball, big time. I'll ride out my 2012 MBPr (which still works great), but at these conditions, my next laptop will not be from Cupertino.
I used Windows primarily for 15 years, and Mac for the last 10. I've seen the new Windows laptops walk right past Apple these past few months; my rMBP is about 2 years old, so I'm not feeling the pinch of buying outdated equipment so much...
Even the new Lenovo T460p's are sliding by. Asus Zenbook is slick too.
I am more of a 13" guy, although i've used the 15" MBP and windows laptop at one time.
I'm hoping by the time my applecare is up they will throw a massive parade for doing the obvious hardware upgrades.
One possible reason they haven't is getting performance and heat managed in the new slim form factor.
Had a few MacBooks that ran hot in the past, so I hope it's just that and not much else. It sucked having 15" MacBook Pros that had faulty video cards due to poor heat management and died prematurely.
You'll go Lenovo/Dell until you have to use their trackpads for any amount of time. I know; my wife has one. My 5-year old son can tell there's no comparison between a Lenovo trackpad and the Apple version.
Given that 99% of users don't require more than 8GB in a Mac laptop, it's hard to justify increasing the maximum RAM beyond 16GB for the fraction of 1% of users who do. Not impossible, mind you - there is always value in chasing after the 1% - but to some degree, Apple's efficiency at running their operating system and applications on small memory footprints makes it less valuable to increase the maximum RAM.
My girlfriend is a nursing student, not a programmer and has 8 gigs of ram on her laptop. She's constantly asking me to look at her laptop when it beachballs and slows to a crawl when she's doing homework and her memory is maxed. She only has chrome, Word, Excel and some online courses open.
I'm a power user. I currently have 43 applications open simultaneously on my i5 MBAir with 8 GB Ram. I have VMware fusion running Windows 7 (With Visio running inside that), and virtualBox running 4 OpenBSD images, Google Earth, the entire Office Suite (PPT/Excel/Word), Safari with 12-15 Tabs, etc, etc..
Of the approximately 100 million or so Macintosh laptop users, it's entirely possible that a million of them are doing high-end video editing, development, or 3D modeling and require more than 8GB, but I'm pretty confident that the vast majority of Macintosh laptop users will require less out of their system than I do - and, for better or worse, a lot of the high-end scientific workstation/CAD-CAM/server/etc. workloads have departed from the Macintosh laptop platform and are now running on the Mac Pro or, more likely, Linux/Windows.
And yes, I realize I've just argued myself into a corner: those people have had to leave the Macintosh laptop world because they need the horsepower/memory that they can only get on other platforms - but that's who is left in the Mac laptop community, and that's who Apple is targeting.
Unfortunately Chrome is absolutely terrible not just for power usage but also for RAM usage. For some reason Chrome loves memory, it gobbles it up. Safari is much better about it.
I switched from Chrome back to Safari recently and while it took some time adjusting, I have been seeing almost 2 - 3 hours longer battery life, and I have had way less issues with "beachballing".
I can't believe I have to repeat this, in 2016: software workload naturally expands to use all available resources. Period. Nobody likes doing "memory optimization" work when developing.
I regularly have 50 tabs open on Chrome. Switched to Safari a month ago hoping it would leave me more memory.
Turns out my machine suddenly started beachballing, swapping like mad (battery usage up) and crashing. Switching back to Chrome completely fixed that problem.
Haha, not anymore. For some time now, the PRO in Apple products has been for PROsumers, or it's possibly just a meaningless vestigial label now. See also: Final Cut X "Pro", which was obviously not targeted at professionals.
Edit: for comparison, Lenovo T460p can be configured with quad-core i7, Nvidia GPU, 32GB RAM, 1 SSD + 1 HD/SSD, replaceable battery, 1.9mm key travel, base starts at $800 with many components user-replaceable, http://shop.lenovo.com/us/en/laptops/thinkpad/t-series/t460p...
Hell, my three and a half year old W530 has a quad-core i7, Nvidia GPU (Optimus), 32 GB RAM, and a pair of Samsung PRO 850 SSDs.
My one year old MBP (which is really a "2013 MBP") has half the memory and storage of the ThinkPad. In fact, pretty much the only noticeable difference between it and the "new" one is the touch strip. :/
I have a rMBP, but these T460p's are excellent machines; they can even come with a touchscreen. If I had to have only one laptop, and it had to be Windows...
It's not that it's hard, it's just the logistics of running the virtual machine(s) and the applications they contain. You could use a minimal Linux distro like Alpine or Tiny Core, but you still need to run applications on top of that if you're testing or developing.
Spinning up a basic DevStack instance (for example) takes a minimum of 6GB, and that's before you even deploy any test VMs inside that infrastructure. Another example: if you're doing config-management development you may need several VMs running, which in turn may have (say) large Java apps with heavy memory requirements even when fairly unladen.
So, I guess the answer is, it depends on what you're doing and what the memory requirements of the thing you're running on VMs is.
Hyper-V has dynamic memory, so memory resources can be reallocated as needed, and has driver hooks so that linux vms can be resized too. There's also Intel's clear containers push which virtualizes for linux but shares a lot of kernel structures between the host and the VM.
>It seems like a really big opportunity, even if it's really hard.
It's not because it's a problem that is easily solvable by spending a small amount on better hardware. 16GB RAM costs $80 which is cheap if you're only going to use it for VMs.
For a 512GB SSD with no upgrade option vs 2TB, a 14" vs 15" screen, and the crappiest Nvidia graphics option even on the high-end $1500 model... What a joke; those are MBP 2013 specs.
It's not hard to bump into the 16 GB limit when doing video editing, photo editing, software development with virtual machines, etc. All of those tasks are commonly done by freelancers on-the-go, which necessitates a pro-level laptop. Instead we got a gimmicky touch bar and lost compatibility with decades of peripherals.
Well, this fact, coupled with OS X being absolutely SHITTY at memory management and task-switching when you're doing this, gives you a good argument for abandoning the platform.
I've used Windows for 15 years and OS X for the last 10, and haven't found a panacea between the OS, the applications I need to run, and my essential human right to run 100+ tabs. Maybe 32GB of RAM will do it.
I don't think I've ever used more than a gigabyte of RAM programming. I could even do it on a Raspberry Pi. What exactly in your workflow uses 64GB of RAM?
C++ games programming. Build process uses several gigabytes, but running the game in debug configuration takes 50-55GB because we store every allocation at the moment. If I need to run my own servers or bake data I go over that easily.
When you say "store every allocation" do you mean you never release anything, or do you just mean you store info about every allocation? If it's the former, that sounds kind of crazy, is it common for game developers to do that? If it's the latter, you could always write it to disk (which is what malloc stack logging does).
We had a random memory corruption problem recently, so we started storing every allocation without releasing, to verify periodically. We do free up old memory, but only every 10 million allocations or so.
Maybe it's not super common for games programming, but it's definitely common to not use ref counted pointers or anything that could help you here.
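To make that concrete, here's a rough, hypothetical sketch (C++, untested) of the "record every allocation, stamp a canary, verify periodically, free only in batches" approach the parent describes - my own illustration, not their actual code. Every name here (kCanary, kSweepEvery, verify_and_sweep, the registry size) is made up, and a real version would also have to handle operator new[]/delete[], aligned allocations, and performance far more carefully:

    // Hypothetical sketch: track every heap block, stamp a canary past its end,
    // defer all frees, and sweep/verify every N allocations to catch corruption.
    #include <cstdio>
    #include <cstdlib>
    #include <cstring>
    #include <mutex>
    #include <new>

    namespace {
    constexpr unsigned    kCanary     = 0xDEADBEEFu;     // stamped just past each block
    constexpr std::size_t kSweepEvery = 10'000'000;      // "every 10 million allocations or so"
    constexpr std::size_t kMaxRecords = std::size_t(1) << 24;  // sketch-sized fixed registry

    struct Record { void* base; std::size_t size; bool freed; };

    std::mutex  g_lock;
    Record*     g_records = nullptr;   // malloc'd so we never recurse into operator new
    std::size_t g_used = 0, g_total = 0;

    void verify_and_sweep() {
        std::size_t live = 0;
        for (std::size_t i = 0; i < g_used; ++i) {
            unsigned canary;
            std::memcpy(&canary,
                        static_cast<char*>(g_records[i].base) + g_records[i].size,
                        sizeof canary);
            if (canary != kCanary) {               // something wrote past the end of this block
                std::fprintf(stderr, "heap corruption at %p (%zu bytes)\n",
                             g_records[i].base, g_records[i].size);
                std::abort();
            }
            if (g_records[i].freed)
                std::free(g_records[i].base);      // memory only truly returned during a sweep
            else
                g_records[live++] = g_records[i];
        }
        g_used = live;
    }
    }  // namespace

    void* operator new(std::size_t size) {
        void* p = std::malloc(size + sizeof kCanary);
        if (!p) throw std::bad_alloc{};
        std::memcpy(static_cast<char*>(p) + size, &kCanary, sizeof kCanary);

        std::lock_guard<std::mutex> guard(g_lock);
        if (!g_records)
            g_records = static_cast<Record*>(std::malloc(kMaxRecords * sizeof(Record)));
        if (g_used < kMaxRecords)
            g_records[g_used++] = {p, size, false};
        if (++g_total % kSweepEvery == 0)
            verify_and_sweep();
        return p;
    }

    void operator delete(void* p) noexcept {
        if (!p) return;
        std::lock_guard<std::mutex> guard(g_lock);
        for (std::size_t i = g_used; i-- > 0; )    // linear scan: fine for a sketch
            if (g_records[i].base == p && !g_records[i].freed) {
                g_records[i].freed = true;         // deferred; freed for real at the next sweep
                return;
            }
        std::free(p);                              // block predates tracking or registry was full
    }

The deferred free is exactly what makes a scheme like this eat tens of gigabytes: nothing actually goes back to the heap until a sweep runs, which lines up with the 50-55GB debug footprint mentioned above.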
Machine learning stuff - whilst training datasets are usually cloud-deployed, dev data alone can use up a lot of RAM. I've recently started dumping my matrices to disk for dev work. Or turning off Chrome and Firefox, which turn out to be the largest memory hogs on my Ubuntu machine.
16GB of RAM: perhaps 13GB available to the user. If you run Chrome/Spotify/Slack/an editor you're often left with only 8GB usable.
ML work commonly uses data that is 8GB+ -- and regularly 32GB+ -- just for the data itself. Yes, you can work on remote servers, but it's convenient to be able to work and develop locally.
No, that's not why. If you have any application that sucks up memory without releasing it, adding more memory just delays the inevitable.
As for the RAM in the rMBP: I'm only half as disappointed as I would have been a year ago. My workflow is pushing more and more stuff into the cloud, so instead of running a bunch of small VMs locally I can have VMs sized properly for the task at hand rather than limited by my laptop configuration.
That's the beauty of Capitalism. It doesn't matter if someone, no matter the position in society, doesn't understand why someone would need 32GB. As long as there's sufficient demand, it'll be produced by some business.
you are the entirely predictable, and, sorry, but very irritating, "who needs more RAM than I have ever needed" stereotype of every comment board ever. 16GB is peanuts for anybody doing anything serious in graphics, video, machine learning, statistics, or finance, or ....[put your professional subject here].
Not everybody wants to have the weight on their back of a clock-ticking cost of doing their R&D online in the cloud. Many of us, including me, want a highly capable machine with an upfront, quantifiable cost, but that is professionally credible.
Hear hear. The anecdata being thrown around in these comments is ridiculous. Like the above, and also the people saying "everyone i know has a desktop". So what? I need a powerful laptop, and i don't need to justify it to you. End of story. Apple's insistence on limitations are ridiculous and just as offensive as Bill Gates'.
This is highly subjective and anecdotal, but I find OSX to use a lot of RAM.
The in-OS memory compression helps, but when I still had a 16GB Macbook Pro, the system always found a way to use up all of the RAM to the point that the compression would kick in to handle the overage over my physical memory.
My habits aren't any different in terms of extraneous windows/apps open on Windows, and I rarely hit 100% RAM utilization on my 16GB Windows machine.
The system should use all the RAM. You paid for RAM, why have it sit there unused? As long as the next user process gets the RAM it asks for, I want the system to use all my RAM to cache everything.
Edit: even to the point of compressing pages, since it's faster to uncompress them than fetch from disk.
> Mac Pro: Buy. If you like paying 2013 prices for 2013 hardware.
It's even worse than that, if you're in the UK: the price of the 2013 Mac Pro has gone up by £500. Yes yes, Brexit and currency adjustment, and if it were new hardware I'd clench my teeth and bear it. But a £500 increase for hardware that's almost three years old is highway robbery.
Turned my gaming PC into a Hackintosh because games became boring. I'm amazed at how smoothly that went.
Installing OS X took only a little longer than on my native Macs. I just had to tick a few checkmarks in the Hackintosh post-install software to customize for my setup.
OS X 10.11 updates I could install as soon as they were available from Apple. No problems at all. (I of course checked the appropriate forums for compatibility notes but there never were any problems reported). Haven't installed 10.12 yet (as I'm not too thrilled by the OS on my MBP anyways) but I heard it works just fine on Hackintoshes.
Now the caveats: I disabled the onboard audio and attached external USB speakers instead. Because onboard audio seems to be more broken than not and USB audio "just works". I'm also connecting the machine via ethernet and don't have a WiFi adapter attached. Oh, and I got lucky in the first place by choosing OS X compatible hardware (Intel CPU, ASRock mainboard, AMD GPU).
Things that don't work: browser DRM with Safari, so I can't watch Netflix on the machine. iMessage support is cumbersome. And I can't submit binaries to the App Store from the machine.
But otherwise for work it's a beast (compile times are great compared to my MBP) :)
As long as you use Intel CPU/AMD GPU you're golden. Intel's HD GPUs work, too. Nvidia GPUs need 3rd party drivers from Nvidia.
I was looking into it, and the answer seems to be fine if you (1) don't reuse hardware -- just buy exactly the components on tonymacx86; (2) don't install OS updates on release but wait until people have figured out how well they work and tweaked the hackintosh software; (3) are willing to suffer some pain around making imessage work.
edit: the key to a good experience seemed to be finding a working configuration then leaving it alone. If you want to update the OS regularly, you'll probably be unhappy.
I'm getting serious shades of "IBM clone vendor throwing darts at a board to try to appeal to every possible price point" instead of coherent product lines here.
My guess is it's a transitional period. The new low-end 13″ MacBook Pro, the new 12″ MacBook, and the outdated 11″ and 13″ MacBook Air probably ought not to coexist. But the MacBook Air is important to Apple as their entry-level (i.e. cheapest) laptop, and the MacBook is still too expensive and not quite featureful enough to replace it.
I think Apple would like to drop the MacBook Air in favour of a cheaper MacBook at some point, but they can't right now.
I've been wondering for a while now what would have happened if Intel had bought Compaq back in 1991 (when Rod Canion and Jim Harris were still at Compaq). Intel has a high profit margin too, and Compaq had a higher profit margin back in 1991. Even in the late 1990s laptops still had a higher profit margin, and laptop theft was more common in those days. Anyone remember the Apple price increases in 1988? Particularly for the Mac II, it was worse.
I was also thinking for a while about the case where Apple bought NeXT but Gil Amelio, and later Ellen Hancock, stayed on as CEO. While PC margins were declining, I was thinking that Apple could mostly focus on the higher-end PC and workstation markets. I wonder how that would have compared to Tim Cook today.
> This event was by far the most disappointing Mac event in the history.
I very rarely respond with such a glib comment, but: really? Isn't that slightly hyperbolic?
This is very emotive language to describe a completely reasonable and innovative update to a laptop. It was never going to satisfy everyone's needs or wants, especially when some of those wants are not currently physically realistic ("all day battery life" and "even thinner design"?).
For myself, as a 2012 rMBP owner, this is a really solid upgrade that I would be delighted to use.
It's been what feels like forever since the last Mac update, aside from the new MacBook, which is only a little more appealing to most people than the first-gen Air.
I, personally, have been feeling like it's upgrade time for my 2013 Air for a year or two, but there hasn't been anything beyond tiny little CPU bumps that they didn't bother making any noise about. This was going to be NEW! MACBOOK! PROS! for the first time in ages.
Yeah, it was pretty disappointing. Especially when Microsoft sat down in front of every digital artist on the planet yesterday and looked them straight in the eyes with an expression Mac users haven't seen since like 1994 or something.
These things are always personal... for my use case, I do need a quad core CPU and portability, but these CPUs seem like they're the best compromise for speed and TDP that Intel currently offer, so I don't see what else Apple could do there. I also really value the larger track pad, the better screen, and the reduction in weight. So for me it would be a nice upgrade. I can completely see how it wouldn't be suitable for someone else though.
Ok, now from my point of view: a slower CPU (I don't edit video, I don't need 4 cores), a new keyboard layout (I don't have any issues with the current trackpad), the same screen (100% brightness level? never), no USB-A port, no HDMI port (and I hate adapter hell). And the 15" version is not even an option because it doesn't have an Esc key or F-keys - I use them every 5 minutes in my IDE.
> 15" version is not even an option because it doesn't have Esc button and F-buttons
As for the Esc-button, they've added an option to macOS Sierra where you can map the modifier keys to Escape. Very useful to have Caps Lock mapped to Escape.
I don't know about Microsoft, but Apple now has two product launches in a row that failed to introduce anything really new. Especially when it comes to MacBooks, Apple's chief competition isn't even Microsoft; it's their own 1-3 year old products. Why should I upgrade to this new MacBook? It doesn't levitate or make me coffee.
With phones it's a little less clear cut because phones getting faster is still worth paying for, but that will slow down at some point.
It seems to me that Apple has finally hit their post-Jobs point where the radical innovation is outpaced by the comfort of not changing much and just printing money. Of course it's quite possible that iPhone 7S will in fact hover and make coffee, and they are just having an off year, but somehow I am pessimistic about it.
Having personally met Panos Panay as a fellow groomsman at a friend's wedding, I can tell you that guy is full of energy and passion. He's the type of person that brightens up a room given he's so fun and enthusiastic.
I was hoping to get one that supported multiple 4k displays. With the title of "Mac Event" I was anticipating at least 2 Macs updated, not just the MBP.
They killed the Mac Mini when they stopped making it upgradeable. They can kind of get away with that with the Macbooks because of the form factor, but there's no excuse with a Mini. I'm hanging on to my late 2012 model. Specced out to the max and with a hybrid drive, it still boots faster than my more recent rMBP.
I cringed when Apple's event started with a video claiming how great their products are for disabled people, just the same as Microsoft did the day before.
I don't recall either of them then going on to build on that with any product/feature announcements...
I had a friend in uni who was blind. He used a Mac for all of his work because, he said, and I quote, "It's the most beautiful OS I have seen".
He wasn't just legally blind; he had absolutely no vision whatsoever... but with the VoiceOver technology and the accessibility built into OS X 10.4 at the time, it was already amazing to him. Apple has worked very hard to make iOS and OS X accessible not just to those with perfect vision or perfect motor control, but also to those with various disabilities.
And now, they take a huge step back by launching an inaccessible touch bar that removes real keys, hence removing options for your friend. Yay, progress.
I wouldn't be surprised if it's more accessible. I would think it has VoiceOver like the iPhone, which is supposedly one of the better smartphones for visually impaired people.
What exactly are you looking for that they aren't making? I don't know about Windows, but Macintosh and iOS have all sorts of software features for accessibility, and they've had for years.
> Macintosh and iOS have all sorts of software features for accessibility, and they've had for years
Yup, I remember seeing a blind guy on the subway using voiceover on an iPhone 3g (that would be the second iPhone ever) - totally blew my mind at how adeptly he was able to use the thing.
I cringed when Microsoft did it. Apple has been doing stuff like this for years; accessibility has been at the core of the company since the 90s. Just look at all the accessibility meetups, workshops, and presentations that were given at this year's WWDC alone.
No, only the 11-inch one has been discontinued. The $1000 13-inch Air is still on sale[1], and so popular that I would predict it won't be killed off until the cheapest Pro becomes substantially cheaper than $1500.
- A MacBook pro with some real innovation. They could just copy Microsoft with a detachable screen (oh but they would cannibalize iPad market), pen input, touch screen. But, instead we get this touchbar thing which is great but I am just disappointed that it is the only thing they have innovated here.
You know, laptops with a pen input have been around for a long while. Even the convertible kind. I don't think that form factor is fully baked yet. In fact, I have a couple of HP tc1100s from the year 2000. You can still get good use out of them. But as a mass consumer device, they were not a runaway hit.
I suspect that Apple has a roadmap for conservatively rolling out more digitally configurable/reactive/touch interfaces on their laptop form factor. They will do this while working off the strengths of each input device. My bet is that they will come out with e-ink keyboards and some kind of display built into the touchpad, perhaps with a stylus.
I would concur with you that this was a disappointing Mac event, however.
It has been written many times (about Apple's TV business and a rumored Apple TV with an actual screen) that screens, whether monitor or TV, are actually very dumb devices. It makes no sense whatsoever for today's Apple to be in the business of making dumb screens while their main business is value-added products.
So, they passed on the opportunity to make the 'Apple monitor' to their main supplier of LCD screens, LG. Fair enough.
FWIW, I really, really don't want a detachable screen. Other users might, but if you're a programmer using your MBP as a work machine you just want something that opens and closes with a strong hinge, great battery life, and bulletproof wifi.
I think the touch bar is just horrid; it's a step backwards, if not simply a declaration of being out of touch. Who looks at their keyboard anymore? For a company with so many touch-oriented devices, the last thing they needed is an extension of the keyboard.
I don't think it's that bad. I can definitely see the benefits for different applications.
However, watching the presentation at times the touch bar felt like an elaborate workaround for a full-blown touchscreen. Especially during the dj demo.
And with Microsoft's Surface Dial puck for the Surface Studio still in the back of your head, the touch bar for Photoshop or video editing looked a bit outdated.
It has been years since Apple did anything innovative, yet they are gaining users. Status-conscious Chinese and Indian consumers are their new market, and all they want is Apple in their homes and pockets.
Apple said on Tuesday it saw a nearly 30 percent decline in its China revenue in the quarter ending September, the highest fall among all regions, due to tepid demand for its iconic iPhones.
iMacs were refreshed in late 2015. It's typically ~3 years between new generations, often with a hardware refresh somewhere in between. Nothing new here.
> - No monitor announcement.
They partnered with LG (I think?) on a 5k monitor. Sounds like a reasonable replacement for the Thunderbolt Display.
> iMacs were refreshed in late 2015. It's typically ~3 years between new generations, often with a hardware refresh somewhere in between. Nothing new here.
Days from each iMac release to the next:
- 147: May 2015
- 215: Oct 2014
- 387: Sep 2013
- 298: Nov 2012
- 577: May 2011
- 280: Jul 2010
We're currently 380 days since the last release, which is on the long side other than the one 2010-2011 outlier. I don't think anyone expects Apple to ship radically different hardware designs every year but they desperately need a tick-tock strategy which ships identical cases with non-obsolete components if they expect anyone outside of the die-hard Mac fans to keep buying.
This might have made sense a couple of years ago, but I don't think desktop hardware is advancing at a rate where yearly updates are a must. This seems even more true for typical Mac use-cases where users don't quite care as much about gaming performance. Intel deprecated tick-tock for a reason, so I don't think it would make sense for Apple to adopt that strategy now.
I don't think it needs to be a rigid update every year but they should have a policy of not staying more than one generation behind. The 5K iMac jumped to Skylake, so that's fine, but the rest of the iMac line is stuck on Broadwell, the Mac Mini is Haswell, and the Mac Pro is Ivy Bridge.
The solid industrial design, long service life, etc. are worth something but probably not 3-4 hardware generations, especially for the premium market Apple wants to be in. The PC market hasn't been completely stagnant and while nobody is buying a Mac for super hard-core gaming they really don't want to be in a position where someone drops a fair amount of cash and their new computer struggles with a game which came out a year or two back, 4K video playback, etc.
I'm more concerned about the ports. If Apple really is moving to Thunderbolt 3 over USB-C for everything, they need to start shipping iMacs and Mac Minis that support this too.
I think Microsoft has sensed its weaknesses and changed strategy: instead of focusing on what works for "ordinary people", it has segmented its market and is now trying to win people over one segment at a time - developers, gamers, artists. Not sure if they will find more good segments to target, but this is certainly a good strategy for winning over those segments. They obviously have a path to travel, but it seems like a good one to be on.
Regarding the 15" version, I have the 15" 256 GB Intel Graphics 2015 Model and I really, really, was expecting the same unit with the Skylake processor version of the quad core with the Intel graphics that has the on-board graphics RAM.
I was hoping it would be lighter weight and longer battery life.
Instead we have the discrete GPUs that undoubtedly will need more battery life than the Intel internal GPU.
You understand that your point is flawed? Compromise isn't always necessary! The OP was looking for innovation in this space, similar to, say, what Tesla did for cars.
-.- But it is. You can't make thinner devices with more battery. The way battery tech is right now, to make a device thinner you need to sacrifice battery volume and consequently capacity. Nothing short of a major technological breakthrough will change this, and nobody on earth has shown even the slightest sign of having a better battery technology, let alone one ready for consumer tech. So no, my point is not in the least bit flawed.
> Also, Panos Panay sounds like a genuine, authentic, passionate and knowledgeable whereas Jony Ive sounds like an Evangelical designer who feels "fake". I don't know how to explain it.
>Microsoft really hit it out of the park yesterday
Hmmm, you make it sound as if the Microsoft event was a home run - it wasn't. The Surface Studio was interesting, but it's clearly aimed at graphics professionals and its stratospheric pricing puts it out of reach for a lot of people. As for the Surface Book, well, considering all the issues they've had with the previous version, I'm not sure people will be so inclined to upgrade. And let's not even get into the ridiculous pricing for the Surface Book.
>Also, Panos Panay sounds like a genuine, authentic, passionate
You're right about that one. I've never seen anyone so passionate about creating documents and highlighting them.
> stratospheric pricing put it out of reach for a lot of people
Look, I'm not exactly a fan of MS, but $3000 isn't stratospheric in an era where 'budget' gaming builds are $1000 for just the tower and people regularly spend $1500 - $2000 on a laptop. The equivalent artist setup with a MacBook Pro, a high-end external 4K monitor, and a Wacom drawing tablet will easily set you back $3k.
idk seems very impressive to me. engineering is v hard so much respect to the teams at apple for continually pushing consumer electronics to the state of the art.
i think it's pretty obvious the macbooks aren't the big money makers at apple. that lil apple pay button is there to subsidize and justify continued development of the mac.
apple probably wants to kill the mac. but some guy at apple's like what if we put a lil thumb scanner on it and apple pay then can we keep it pls and then the cfo was like sure okay fine the mac can stay only if you add an apple pay button.
also pretty sure all-day battery life isn't innovation and thus probably not what they're working on. how about eternal battery life wherein a solar panel so efficient is able to pull light even in the darkest of situations. how about remote charging.
i'm sick of charging shit. that's what's next, no charging.