No. Microsoft introduced an entirely new kind of tool for digital artists. I'm not one myself, but the Surface Studio makes me wish I were. And the people I know who are, are over the moon about it. They can't wait to get their hands on one.
Yes, it's Microsoft that hit it out of the park, and made Apple look amazingly weak by comparison. Today's event wouldn't have been particularly impressive even on its own. By comparison with yesterday's, it's just embarrassing.
I'm a digital artist myself. I'm more of a goal-based guy, so I use both Windows and Mac hardware and software. I know that I am probably the target market for both the MacBook Pro and the Surface Studio, but I have to say, I'm not impressed. And I'd argue that the only people who are would have to be Microsoft or Apple fans. Certainly not tech-based artists.
First, this is a business for me, and I suspect it's a business for anyone Apple or Microsoft expects to sell this stuff to. Having to pay $3000 for outdated hardware is a tough pill to swallow. The hardware in the MacBook "Pro" is simply inexcusable (really, 16GB RAM max?), but the hardware in the Surface Studio is equally galling. They give you an Nvidia what, a 965, for $3000? And the best I could possibly get would be a 980? And for that magnanimous gesture on their part I would be obliged to pay a minimum of $4100? For last-generation graphics cards, on a machine purportedly about graphics? (A 1080 would run both faster and cooler using less power, Microsoft. And at this point in time, for a business investment, 64GB of RAM should really be offered by anyone not trying to screw you. What's the deal, Microsoft?)
I think that to objective observers who were waiting for these presentations, both proved severely underwhelming considering the pent-up expectations. I mean... OK... if you put a gun to my head, I'll probably buy the Surface Studio. But don't expect me to pretend I don't know that both Microsoft and Apple are robbing me.
I think you (and a few others here) are making the mistake of equating digital art with a need for high-end graphics cards. 99% of what most digital artists consider to be "digital art" does not require state-of-the-art GPUs. Graphic design work, retouching, painting, etc. barely even scratch the surface of what the last generation of graphics cards could handle.
Let's call high-end graphics cards what they really are: gaming-console graphics hardware stuffed into PCs. For all but the relatively small number of professionals who work with 3D rendering, a high-end graphics card is a waste of money (unless of course you're buying the machine for gaming, which doesn't really fit the "digital artist" target of these machines).
Microsoft and Apple may still be pricing these products too high ("robbing you" as you say), but that's a separate discussion.
"...I can't help but notice that you (and others here) are making the mistake of equating digital art with a need for high-end graphics cards. 99% of what most digital artists consider to be "digital art" does not require state-of-the-art GPUs. Graphic design work, retouching, painting, etc. barely even scratch the surface of what the last generation of graphics cards could handle..."
I can't help but notice that you are conflating the 99% of "digital art" that can be handled by mobile graphics cards with the 1% of "digital art" that anyone is actually going to pay someone to produce. As I said, this is a business. If I could make the living I currently make retouching photos, then I'd gladly pay $3000 for an underpowered machine...
but I can't.
because no one is going to pay us that kind of money to retouch photos, are they?
Look, these machines, to a business, are investments. You invest for the future, not to take advantage of the past. A 1070 or 1080 is not too much to ask considering Pascal's manifest superiority in efficiency. Additionally, I was being kind earlier: MS should offer 64GB in the Surface Studio, but a 128GB option would really be necessary to future-proof this thing.
I'll tell you what MS and Apple are going for here... it's a money grab. And they are setting themselves up to come back to guys like me every 18 to 24 months for another mandatory money grab instead of just giving us a 60 month machine from the outset.
Offer me a 60 month Surface Studio at the $3000 price point and I'd update every workstation here. But that's not what they're offering is it?
What kind of art are you talking about? I've done a lot of graphic design (yes, paid) and used Photoshop and Illustrator for nearly a decade (though as of last year I now use Affinity), and I have never been in a situation where I was running out of resources.
Up until this year, I have never owned a machine with more than 8GB of RAM, and never a video card with more than 1GB of RAM until now either.
Are you perhaps talking about video editing or CGI rendering?
"...Are you perhaps talking about video editing or CGI rendering?.."
Yes. Video editing and CGI. The workflow normally requires the use of massive memory-mapped files. Additionally, a given artist may do several "lo-res" renders before shipping off to the farm. ("Lo-res" being a relative term, because in most cases we're talking about 2560x1440 or 1080p minimum.) I really don't see how your tech guys are able to get that to work for you on cards that have 1GB of memory. But if they can, then you'd better never let those guys go, because they are worth their weight in gold. (Unless you don't do any CGI or video editing?)
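For anyone curious what those memory-mapped files look like in practice, here's a minimal Python sketch (the scratch file name and frame size are hypothetical). Memory-mapping lets the OS page frame data in on demand, which is exactly why these workflows soak up as much RAM as you can give them:

    import mmap
    import numpy as np

    # Hypothetical scratch file of uncompressed 2560x1440 RGBA float32
    # frames: one frame is ~56 MiB, so a few hundred frames is tens of GiB.
    FRAME_BYTES = 2560 * 1440 * 4 * 4  # pixels x 4 channels x 4 bytes

    with open("scratch_frames.bin", "r+b") as f:
        mm = mmap.mmap(f.fileno(), 0)  # map the whole file; pages load lazily
        # View frame 0 without copying the rest of the file into memory.
        frame = np.frombuffer(mm, dtype=np.float32, count=FRAME_BYTES // 4)
        print(f"one frame touches {frame.nbytes / 2**20:.0f} MiB of the mapping")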
Yes, this is the case in most, if not all, VFX and animation studios. Pretty much everything is done on Linux, usually CentOS. That being said, 8GB is a joke. 32GB is the minimum, and 64GB is what new machines would be ordered with for almost all artist workstations.
Source: I was a system engineer for VFX and animation for a number of years.
> Then again, with Chrome taking up >4GB of RAM, they probably could have gotten half that performance just by closing their browser or using Safari
Can confirm. Got up to 15GB of RAM usage today before closing 30+ tabs to get back down to 7GB. Maybe my tab count is ridiculous, but given that the information I need from those tabs is 100% text, the memory usage is infuriating.
Oh, and also sitting awfully comfortably in that 7GB area is Dropbox, which is doing who knows what with all of its RAM... I do like the Linux command-line tool, though.
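If you want to see exactly where the memory is going, a quick sketch using the psutil library sums resident memory per process name. (Chrome spawns a separate process per tab, so grouping by name is what reveals the real total.)

    from collections import Counter
    import psutil

    totals = Counter()
    for proc in psutil.process_iter(["name", "memory_info"]):
        try:
            mem = proc.info["memory_info"]
            if mem:
                totals[proc.info["name"]] += mem.rss  # resident set size, bytes
        except (psutil.NoSuchProcess, psutil.AccessDenied):
            continue  # process exited or is off-limits; skip it

    for name, rss in totals.most_common(10):
        print(f"{name:<30} {rss / 2**30:5.2f} GiB")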
I had the same issue, but at home, where my workflow is different. My solution was to use the extension "The Great Suspender", which suspends tabs that haven't been used in a while. Might help you too.
Firefox is even worse. I caught it using 37GB of virtual memory the other day on Linux, and a pretty significant chunk of RAM too. I don't even have 37GB of physical memory.
"Then again, with Chrome taking up >4GB of RAM, they probably could have gotten half that performance just by closing their browser or using Safari." This is so true. I've recently stopped using Chrome and switched to Safari. Now my 5-year old MacBook Pro runs like butter.
Browsers and the web are frankly getting ridiculous. The modern web has a lot going for it, but when moving the web forward people seem to completely forget about the efficiency argument. I wonder how much of the world's power we casually waste like that.
As a developer and a digital artist I am often running multiple VMs, my various development tools, and 2-5 Adobe products on top of my two browsers and other misc programs. Agreed that for any single piece of digital design software 16GB should be enough (even, to a certain extent, After Effects). But if I could have a high-quality laptop that is Unix-y AND runs the whole Adobe suite plus my development tools (whilst not having to close all my research notes in Firefox)... gee whizz, would I throw my money at that.
I'm probably an outlier though, so I can understand why it doesn't exist.
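For a rough sense of why that workload blows past 16GB, here's an illustrative back-of-the-envelope budget in Python (every figure is an assumption, not a measurement):

    # Illustrative memory budget for the workload described above.
    workload_gb = {
        "a couple of VMs (4 GB each)": 8.0,
        "2-5 Adobe apps (incl. After Effects)": 6.0,
        "development tools / IDE": 2.0,
        "two browsers full of research tabs": 4.0,
        "OS and misc programs": 3.0,
    }

    for item, gb in workload_gb.items():
        print(f"{item:<40} {gb:4.1f} GB")
    print(f"{'total':<40} {sum(workload_gb.values()):4.1f} GB")  # ~23 GB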
I don't think you're an outlier at all. 32GB is the absolute minimum I need to get my work done without disk thrashing. If I spent more time on video than audio I'd want 64GB.
That's completely realistic for pro and semi-pro users.
The cost of making a laptop that could take 32GB or even 64GB of RAM must be tiny. No one would care if Apple only supplied 16GB but left a couple of slots free, or even if the stock overpriced 16GB could be swapped for 32GB.
Why cut corners and make "pro" machines that can't be used for professional work?
It's counterproductive. It may increase margins in the short term, but over time it alienates users and erodes the appeal of the brand.
It seems "Think Different" has become "Don't Expect Much."
To be fair, I think most people's interaction with "graphics arts" is Lightroom, which is a pretty darn slow piece of software. Not the complicated stuff, where they reused code from Photoshop, but the simple stuff like opening the file import dialog.
I think people are imagining that a better computer will improve its performance, but it won't. There's a sleep statement in there or something. (Their blog post says something like "a cloud industry-leading innovation in picking-your-camera-out-of-a-list user experience story flow." Really? Just suck the files off my memory card in the background as soon as you detect one.)
Most people I know working as professional artists/designers are working with 2D assets in software like Photoshop. Even the few working in print don't need crazy GPUs. I know some people who deal with video and 3D, but most of them farm the heavy lifting out to servers. I'm not sure what you do, but in my experience this is more than enough for most digital artists and designers.
Considering the extremely long time between hardware refresh on the MacBook Pro I entirely share your sentiment and am agog Apple didn't wait for a suitable Kaby Lake part. People have been waiting years, what's another six months? If you're going to charge me top end prices bloody give me top end hardware. It isn't simply top end by definition with a Microsoft or Apple logo plastered on the side, sorry.
This is 100% where I'm at. I'd buy a Surface Monitor right now, because I could be assured to use it on a...well, I almost said "a real computer", and I feel a little bad for that, but I also sorta don't. I need forward compatibility with my hardware, and my workstation already blows the doors off the little box they've saddled that great monitor with.
The stats for the Cintiq Companion 2 are in the same ballpark and, from personal use, the thing has a hard time providing a smooth UI when editing very large Photoshop, Illustrator, or Unity projects.
I think you're right that artists largely don't need what high end gaming PCs might require to play the newest steam titles at full graphics settings, but for what Microsoft is asking here, they do at least need enough GPU/CPU strength to have a very responsive UI when editing very large and complex graphical projects at 2560 x 1600 and beyond.
My worry about the Surface is simply that the Minority Report-style advertisement isn't really possible with a real-world PSD or 3ds Max scene.
High-end graphics cards are usually much better than a gaming console: they offer much better frame rates and a lot more resolution, and even a single card can cost more than a complete gaming console.
So, yes, they are for gaming 99.9% of the time. But please don't compare them with consoles.
> Let's call high-end graphics cards what they really are: gaming-console graphics hardware stuffed into PCs.
Look, I'm sorry - I'm with you for most of it, but you have this stick held by completely the wrong end. Gaming consoles are closer to APUs than discrete high-end GPUs.
Your rant about the GPUs in the Studio for $3k is heavily misguided. What you are paying for with that $3k is mostly the new screen and its capabilities. The closest rival to that screen is a Wacom tablet, which sits very near the same price point without all the hardware MS has added. Are you aware of the new stuff packed into that one screen? That is the question you need to answer for yourself to understand the price point.
As for having "outdated" cards, that is more a function of the production timeline. I am sure the Studio has been in the works for a while, and those were probably the best-in-class cards at the time.
Pro tip: When you have to explain to people why they shouldn't be disappointed by something they've thought about, you've almost always already lost the argument.
And the video card is outdated, period, no air quotes. There is simply no argument there. Making an excuse for it is nice and all, but it simply is a case of charging a premium for old tech.
He is directly rebutting one person's opinion, not coming up with a talking point after consulting with a survey of people's reactions to the new announcements. Who's to say it's not the parent that was mistaken?
I know several artists who bought the Surface for exactly this reason and found it disappointing. The hardware in the Surface Pro is pretty mediocre unless you spend a lot of money on it (at which point a Cintiq Companion becomes an intriguing idea if you don't know what Wacom's customer support is like).
Most of those artists ended up buying iPad Pros to replace them with.
"Making an excuse for it is nice and all, but it simply is a case of charging a premium for old tech"
This is a weird complaint to me. So what if the display hardware is dated if it is perfectly up to the task it's presented with? Maxing out the specs on a machine for no other reason than to max out the specs seems to me to be nothing but a waste.
The GP was the one making the complaint. I have no interest in the device. But if it is important to people, then it is important to people.
And even if it is perfectly suited to the rest of the hardware and any additional RAM/performance would be wasted[1], that's not a good excuse to overcharge for old commodity hardware.
[1] I would be extremely surprised if the rest of the system were designed around the video card. Actually, designed twice, since there are two video card options.
Another pro tip: when marketing to professionals, they will get it over time. If you're a real professional, you will follow the news, see the reviews and tests, and after a while you will understand new technology that was not properly marketed in the first place.
This "digital art graphic card" is starting to sound like "for my Big Data project I need these nonstandard tools and big compute pellets" when all they have is moderate data queries and a batch-processing workflow.
Not even just that: they're last-gen mobile cards, which means you have to spend $4.2k just to have 4GB of VRAM on something that, at the end of the day, is really a desktop. The 980 desktop version is ~$600 with 6GB VRAM.
Except that you are paying for it. If it costs money (which it does) and you pay (which you do), then you are very much paying for it. And that's the problem. Not only are the 9-series GPUs a generation behind, the shift to the Pascal architecture delivers previous-gen performance for literally half the price (see GTX 970 vs. 1060).
Maybe the thing you want to pay for is the screen, but there's no avoiding the outdated GPU tax that you have to pay to get it.
Yeah, but my point was that if they were too far along with old tech, there were still drastically better pre-existing options for a product that isn't really mobile.
I share your surprise that Microsoft didn't use the 1080. I would really love to hear the back story on that: was it getting it to fit? Was it the power envelope? Was it driver support? Etc. I expect that if they pre-sell a million Studios or more we'll see an updated unit ("Studio Pro", anyone?) that is a 64GB/1080 machine, but time will tell.
If they started out with a max TDP of 100W for the cooling system, upgrading it to 150W in that small space sounds like too much of a change to meet their deadline.
Clearly this doesn't excuse skipping the re-engineering, but it might explain why they went this route.
You're surprised that they didn't use a top of the line card when most of their target audience will never even make the thing sweat? I have a 1080 and use it to play games at 4k while driving two additional 1080p monitors. It kicks ass and I push it much harder than most digital artists ever will.
Well, there are two reasons I was surprised they didn't use the 1080. One is that the sort of lag-free drawing/manipulation involved is surprisingly hard on a GPU. When I'm turning a 3D part around in "quality" render mode in TurboCAD and modifying it on the fly, it works the GPU fairly hard. Unlike games, it's pretty much forced to recompute lighting fairly often.
The other is that they have already crossed the "luxury" threshold from "just another PC" into "special hotness" territory. So unless Nvidia is giving them a smoking deal on "old" GPUs, it seems that from a marketing perspective you would want to check off all the boxes. It's reported that the 1080 would use less power and generate less heat, and Nvidia says it is 20% faster. That seems like it would be a no-brainer. But I clearly don't know.
That is why it surprised me when I read that the high-end Studio only has the GTX 980 in it.
I think the idea that it's a digital drafting table that can also be used as a desktop computer is the right way to think of it (as many have alluded to in this thread).
I wonder why MS didn't stick with AMD (like the Xbox One) if they weren't going to use a 1080 (or a 1070, which might make more sense).
As someone who used to enjoy drawing/doodling but never made the leap from mouse to digital pen, sketching and doodling on the Surface 4 was eye-opening. Any number of programs paired with the pen/screen feel a lot like ink drawing, without the mess and with multi-level undo.
The combination of a decent pen and great screen makes for a very satisfying and immediate drawing experience.
While I really like the look of the Surface Studio, I was shocked at the GPU choice and the fact that it comes with a hybrid drive. I mean, I do think it is still a good machine, even at the price, but come on!
In some use cases they are, since they contain an SSD as a large cache. They are also a lot cheaper than an SSD, so it's nice to get some of the benefits without all the cost.
They don't need to "max out" a card, because PS is not a game.
What counts is the time it takes to perform an operation like a Camera RAW import or a blur. If that goes down by a significant percentage, productivity goes up by a significant amount.
For drawing, good acceleration is the difference between a distracting experience that's laggy and unusable, and a smooth experience that gets out of the way of the work.
Besides, there are people with an interest in art and machine learning. Good luck with a 980M if that's your area of interest.
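On the time-per-operation point, here's a minimal benchmark sketch in Python with Pillow (the file name is hypothetical, and Pillow runs on the CPU, so this only measures the baseline a GPU-accelerated path would have to beat):

    import time
    from PIL import Image, ImageFilter

    img = Image.open("test_photo.tif")  # hypothetical high-resolution image

    start = time.perf_counter()
    blurred = img.filter(ImageFilter.GaussianBlur(radius=25))
    elapsed = time.perf_counter() - start

    # Shaving seconds off an operation repeated hundreds of times a day
    # is where the real productivity gain comes from.
    print(f"GaussianBlur(25) on {img.width}x{img.height}px: {elapsed:.2f}s")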
I am also puzzled by this... Maybe knowing they have more horsepower than they need empowers them. Or maybe they are doing a lot of high poly 3D work or something.
The architects at Foster + Partners designing the Apple campus use Revit and being architects they want the coolest tech. Surface Studio hits that sweetspot better than anything from Apple at the moment.
I'm struggling to believe you're a digital artist if you're complaining about the Surface Studio. A Cintiq monitor, with no computer, costs nearly what the Surface Studio does, and gives you an inferior display to boot.
The machine is not "purportedly about graphics", it's about 2D artists, and they made that clear in pretty much every presentation I've seen on the device.
OK, graphics wouldn't really be that big of a problem if companies were trying to maximize it. Instead, they're trying to minimize the footprint of the actual computer without sacrificing productivity capabilities. Luckily, this provides an incentive for the eGPU market, which will hopefully give us compact devices with modular, powerful computing specs.
>> And the best I could possibly get would be a 980?
Not only that: it's a mobile processor, the 980M, so it would lag some distance behind a desktop 980. That is cutting serious corners for a desktop at that price point.
Actual graphics and design pros (as opposed to Apple 'Pros') absolutely care about specs. Come hang out in VR developer land for a while, where interest in specs extends to system architecture choices.
Isn't something like the Mac Pro aimed more at you? It's packed with graphics hardware. Although admittedly they haven't updated that for several years.
For me, the most revealing piece of information from that Penny Arcade article was how much the 27" Cintiq costs ($2500). Makes the Surface Studio seem very affordable for what it offers.
In the last two years all of the artists at our studio (which does animation and interactive work) have switched to Cintiqs. We were really apprehensive about making the jump, but they all seem extremely happy, and some have even reported that pain they experienced after long sessions has gone away. I'd be really interested to see how they like the Surface Studio compared to a Cintiq.
Different strokes, I imagine. My fiancée is a graphic artist who had a Wacom tablet and a Cinema Display as her setup for a long time, but she got the 27" Cintiq a few months ago and LOVES it. I had similar skepticism that it'd really be that big of a step up, especially given what she was spending on it, but she says it's a noticeable improvement in her productivity.
And are you not a bit disappointed that the products you describe, Apple's desktop products, appear to have been totally ignored by Apple? No new iMac, Mac Pro, or Mac mini hardware was announced; it seems like desktops are not a focus for them at all.
I think their focus has been on mobility for some time now. Very few people actually need the added horsepower a desktop platform can offer due to the relaxed power and space budget. I've been working from laptops exclusively since 2003 or so.
lol, why would I be disappointed? Just use a laptop and dock it to a monitor when needed.
Why would anyone need a desktop in this day and age? Moore's law means the things that desktops used to do 5 years ago, laptops do now.
The compute requirements for photo editing, for example, haven't changed in 5 years. But the processors and systems (IO, displays, etc..) keep getting faster. It's only natural that laptops would cannibalize desktops. And phones/tablets will cannibalize laptops in a few years.
Can I suggest you learn something about a domain before commenting on it?
Photo editing can be hugely resource intensive. Camera RAW files have been getting larger over time, and all but the simplest edits demand an SSD, a fast processor, and at least 32GB of memory. 128GB isn't unusual for commercial work.
You can get by with less, but sooner or later - usually sooner - you run out of RAM and your machine starts thrashing.
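The arithmetic behind those numbers is straightforward; here's a quick sketch (the sensor resolution and layer count are illustrative, not a claim about any particular camera or workflow):

    # Decoded for editing, a RAW file expands to uncompressed pixels.
    megapixels = 50          # illustrative high-resolution camera body
    channels = 3             # RGB
    bytes_per_channel = 2    # 16-bit working depth

    base = megapixels * 1_000_000 * channels * bytes_per_channel
    print(f"one decoded image: {base / 2**30:.2f} GiB")  # ~0.28 GiB

    # Adjustment layers, smart objects, and history states can each hold
    # another full-size buffer, so a working document multiplies quickly.
    layers = 20
    print(f"with {layers} full-size buffers: {layers * base / 2**30:.1f} GiB")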
Many Photoshop features are hardware-accelerated, so having a good graphics card makes a significant difference to editing speed.
3D animation definitely needs a good graphics card. So does video editing.
Audio doesn't, but it needs as much raw processor power as possible. I know DJs/producers who are running dual 12-core Xeon systems for their mixes, and adding PCI DSP cards on top.
Surface Studio looks very nice, but it fails to tick most of these boxes. The reality is there's a significant market for creative power users willing to pay good money for multicore server-grade towers for their work.
Neither Apple nor MS are paying attention to this market. The Mac Pro is an underpowered curiosity now, and Surface Studio has - sadly - been hobbled by greed and penny pinching.
MS is definitely not paying attention to the high-end workstation market. They are paying attention to trends in interaction and the new interfaces technology is allowing us. The product here is the new style of content creation, and the accelerated pace of current content creation, via the form factor and the accessory knob thing. If you have needs that demand extreme resources, you can probably afford remote rendering or processing using the myriad of wonderful networking technologies that have advanced so much.
It can't be an effective interaction device and a server-level resource at the same time. Anyone who is enough of an enthusiast to require a dual Xeon workstation is clearly not who MS is targeting with a single product. Leave the multicore server-grade towers to HP, Dell, Lenovo, etc. because there's nothing to innovate there - you just throw hardware, bought at market price, into a box. What is MS supposed to innovate on there if all you care about is specs/$ and ignore the design and use-case?
This is a very niche market. As laptops get more and more capable, it'll be an even narrower niche.
I'd love to have as much power as I can get. I'd love a multi-CPU POWER9 loaded with a couple of terabytes of RAM and a couple of flash cards attached to a fast bus, but I wouldn't know what to do with it.
> The Mac Pro is overpowered for them.
I'm not sure if you are trolling?
Moore's law has slowed down, btw, and is predicted to end in 4-6 years (2020-2022), as it would be physically impossible to shrink transistors any further unless there is a shift away from current silicon CMOS technology.
In fact, manufacturers are no longer even targeting the doubling of transistor counts! Instead they are focusing on speeding them up.
I'm not sure if you're being facetious but for those of us who enjoy playing video games a desktop is the best choice. If you don't care about portability, why not get something that is by far more powerful, cheaper, customizable, and future-proof?
You're in the wrong discussion if you think gaming has any relevance to this line of laptops. People that own Macbook Pros don't play games seriously. Gaming is strictly for Windows devices, as it's largely an afterthought in the Mac world.
Are there many more studios outside of Blizzard that are developing for OS X? I hadn't realized there were more devs working with it. Every other game I know of that has Mac support is usually running in some WINE equivalent which isn't great.
So we have Blizzard and Source engine, who am I missing?
Ahaha ahaha... no, no it's not. I can't imagine you play any 'serious' games. Hell, even WoW, a game which is well supported by the developer on OS X, plays like crap on my $2600 MBP.
Well, I should have worded that better; I shouldn't have been so dismissive. By 'serious' I meant 'resource intensive'. I don't know all of those games, but the games I do know are definitely not very taxing on a GPU (aside from Skyrim, but what settings are you playing it at? Does that seem worth it for the price tag?)
I don't think you'd argue that a mac is a good gaming rig from a performance / $ perspective (or any other perspective really).
Define 'good enough'. These aren't inexpensive machines. In the context of gaming, $2600 is far too much to pay for a laptop that gets ~30fps at medium settings.
I never used the Cintiq, but I've been using Wacom tablets for a good 15 years. I owned & used the Surface Pro 2 & 3, and the iPad Pro with pencil.
Both have their place - pen on screen or pen off. Working with your pen on a screen, especially a smaller screen, means you lose a good chunk of your visible space. Overall opinion -- it is great for artists that two major companies have made a significant effort on getting the pen on display right.
"Overwatch was fast and fun on medium settings. Civ VI is what I’ve played the most on the Studio and it looks and runs fantastic."
"When I first saw the device months ago in that secret room at MS, they asked me what I thought. I said, “Well I have no idea if anyone else will want it, but you have made my dream computer.”
Nothing about that says, "I went out and paid my own money to buy this."
So you must've somehow missed the "Just to be clear, I don’t get paid to do any of this stuff but I enjoy doing it and I like to think I’m helping make the Surface better for artists." while reading the article, right?
Advertising is more than just paying actors to pretend to like something. News articles rephrasing press releases are often also ads. Video game reviews usually don't involve simply handing money to someone; that does not make them unbiased.
People spreading viral videos are often unpaid, even if they are a critical part of a marketing campaign. Hand someone some free hardware and they don't ask "is this worth $X?", just "is X cool?"
I suggest familiarizing yourself with the history of Gabe/PA and the Surface. Gabe initiated the relationship by writing a post out of the blue about its potential as a drawing tool, as well as complaining. In response, Microsoft invited him to a lab to try prototype Surface hardware/software and made a lot of changes based on his feedback. So, they've basically made a machine designed to be liked by him, and they continue to get his feedback to ensure that doesn't change.
So, it's not surprising or questionable that his reviews are positive.
I actually read his first post about the Surface when it came out. The point is that MS is continuing the relationship for good press, and by becoming part of the process they have become biased.
If the consideration is that they made the product work for his use case, then that's not corruption in any meaningful sense of the word. That is exactly the behavior you want to reward companies for.
IIRC, he returned his last trial-Surface device and bought one with his own cash.
Based on the article (he's been lent this device for the last week or so), I imagine the same thing may well happen if he's happy with it - which it sounds like he may be.
Pretty sure the guys who've been making a very successful business out of Penny Arcade for the last couple of decades don't need to take backroom deals from Microsoft in order to make ends meet.
Apple delivered meat and potatoes for ordinary users (albeit with a weird garnish), while Microsoft delivered a molecular gastronomy dessert that a tiny niche has any use for. I don't see how that is to Microsoft's credit, much less "embarrassing" to Apple.
They haven't replaced my laptop with anything new. They just released a newer model. Similar to every yearly company announcement ever. Nobody cares about version 7 with nothing new, they care about version 1 of something innovative.
Wacom hasn't released an innovative product in 10 years, and much like Texas Instruments and their $130 monopoly on school calculators, everyone is still paying the same $2500 to Wacom for a Cintiq that they did decades ago.
The lack of competition for Wacom is criminal and has allowed them to coast on last decade's product line for a very long time.
Hell, the most innovative thing Wacom has done in 10 years is rebrand Graphire into Bamboo. Ooooooh!
I imagine the day the Apple Pencil was announced was a very grim day in whatever is left of the Wacom R&D department.
Does Wacom have an entire suite of comparable products, or do they cater specifically to a niche market? Was their announcement yesterday? Life has context.
> What does that have to do with new MBPs?
I'll go ahead and quote the comment you originally responded to: "This event was by far the most disappointing Mac event in the history". Apple had an event where they announced little more than that we are now in late 2016. Nobody is saying the MBP itself is bad; the event is comparatively lacking.
* I think you can reasonably make a case that Windows 95/98/NT were better than MacOS 7/8/9. (You could probably make a reasonable case for the opposite; I certainly wouldn't argue the point).
* The Zune was better than the iPod. Unfortunately for MS not better than the iPhone a few months later. The "Metro" design language it introduced was quite a bit better than Apple's at the time (and started the "flat design" trend that even Apple would end up aping, years later).
* Various areas where MS beat Apple by default, at least circa the 90s-00s, owing to Apple making no or a half-hearted effort. Office suites, gaming consoles, web browsers, the server, 3D graphics, yadda yadda.
* C# is a far better language than Objective-C, and Visual Studio is a far better IDE than Xcode.
* Edge is arguably a better browser than Safari. MS have also made some great tooling for webdev recently, VSCode and Typescript, while we haven't heard much from Apple.
* The Surface tablets have been arguably better than recent iPads, by virtue of running a full OS.
> * C# is a far better language than Objective-C, and Visual Studio is a far better IDE than Xcode.
C# is a nicer language, and VS a better IDE when it comes to language integration, but... the platform APIs are far more productive on OS X than they are on Windows, and the GUI toolkit was good from the get-go, whereas MS never fully commits to a toolkit (MFC, WinForms, WPF, now UWP, which is like a restricted, not fully compatible WPF).

Why do you think there are so many great, lightweight alternatives to Photoshop with non-destructive image editing on OS X, like Affinity Photo, Acorn, and Pixelmator, and none on Windows? Why is it that Apple can dogfood and write everything in their modern platform APIs (they even rewrote their file browser, Finder, in them), but Windows out of the box comes with exactly zero .NET apps? Recently Windows 10 brought some .NET stuff out of the box on the desktop, but they're all toy apps no one would want to use, and certainly no equivalent to Apple Mail, Photos, iMovie, GarageBand, etc. The UWP mail client is pitiful.
The C#/.NET platform in general, on the desktop, as far as commercial desktop apps sold to consumers go, is a dead wasteland. The few .NET apps I've seen as an end user I tend to associate with "slow, heavy, not featureful", particularly WPF apps. .NET's greatest success is the same as Java's: on servers, or on the desktop for in-house business software, where the well-being of the end user is not a priority and no one cares if the app feels sluggish or has a terrible UI.
The lack of dogfooding has been a common complaint among Windows developers for a long time. For example:
> I understand deadlines and priorities, and I know that probably Microsoft just had to ship something at some point, but it really seems that there was a big lack of dogfooding in the WPF case.
> There’s a striking example of this: what was the number one complaint that developers had about WPF since 2006?… Blurry text and images. And when did Microsoft fix it?… Only in 2010, when they started using WPF for Visual Studio.
> Another issue that has been bugging me since I started using WPF was the airspace limitation. It seems that it’s finally going to be fixed in 4.5. Why do I think it’s being solved now? Because they probably needed some native WinRT component to play nice with WPF…
Microsoft still doesn't really use .NET outside of dev tools and server apps. UWP apps are just toy apps; UWP OneNote isn't even close to desktop OneNote. And so on. MS themselves don't really produce high-quality desktop apps with C#. If they won't, then who will? Compare with Apple and how everything they make heavily uses their platform APIs, such as Cocoa, Core Image, and Core Animation. How could .NET not be a barren wasteland for desktop apps?
Apple has always had the better developer platform, dogfooded and thus battle-tested, and now they're also getting a nicer programming language to work with in Swift. Their IDE is still no Visual Studio, but AppCode from JetBrains fixes that.
Well, you could argue that Apple hasn't done a tablet PC. They've done a tablet-sized phone. Of course, where you draw the line between those is probably different for different people.
They did innovate, just not in terms of the underlying concept. The sheer size and weight of the iPad were innovative. Doing away with a physical keyboard altogether was innovative. Using a capacitive touchscreen for a far more pleasant user experience was innovative. In short, many small innovations rather than one big one.
What's interesting is that it's a double volte-face: Apple produced a gimmicky, consumer-oriented iteration, whilst Microsoft gave us something genuinely innovative and gorgeous-looking, but aimed at a high-end professional market with the money to spend. It follows on from Apple giving us desktop users a transparency abomination straight from the Windows Vista era. What is happening?
To continue the food analogy, it's as if Apple reheated some leftovers -- the kind that taste better the next day, but when it comes down to it, it's the same food.
Some of you may remember when, many many years ago, Microsoft showed off an interactive coffee table: the Surface. And now, finally, with the Studio they have brought to reality a device that will be nirvana for designers, factory planners, circuit designers, and anybody else who can actually use an interactive surface (pun intended). I'm a product manager for enterprise software, and I'm bubbling with ideas for what such an eminently usable large-screen touch device brings to the market. After decades of heaping scorn on MS for the stuff I had to work with that hasn't improved (talking about you, Word, PowerPoint, Excel, Visual Studio), they have post-Ballmer become an entirely different company, and I'm pleased about that. Competition is good for innovation, and the Studio is truly innovative in its engineering and the things it makes possible.
> Microsoft introduced an entirely new kind of tool for digital artists.
It isn't an entirely new tool; it is an HDR, higher-res, larger version of a Cintiq with a slow built-in PC and last-generation mobile graphics. The hinge and so on are definitely improved industrial design.
Those screen improvements alone are probably enough for lots of professionals to switch from a Cintiq, even if they lose support for professional-tier desktop hardware. But what about the pen? As far as I can tell, the pen doesn't even have tilt detection, and from the videos you have to use the dial/puck to simulate rotating or tilting the pen.
In spite of that, it was definitely more innovative than anything Apple showed.
SURE! Except, of course, you're not going to be allowed to code it in Python, R, or Octave. Instead, it will be Microsoft(TM) Visual(TM) R#(TM) with the new NetDNA/DCOM/.NET/CloudyMcCloudface Framework (TM)!! (now, with 40% more Cortana!(TM))
> Microsoft introduced an entirely new kind of tool for digital artists
Actually, I believe they introduced a competitor to the Wacom Cintiq, which comes in a variety of sizes and does not require you to use a desktop or Windows 10.
Calling it 'an entirely new kind of tool' is really overstating it.
Have you used Surface before? Asking because when I picked one up about a month ago, I had an awful time with it. Makes me think their announcement won't really do anything but look interesting. I could be missing something, though.
>> when I picked one up about a month ago, I had an awful time with it
Did you get to play with it at a store or a friend's house, or did you actually take it home and use it for some time? Everyone I've met that has a Surface (Book, 3, 4, etc.) loves it.