I interpreted that comment as a jab at the PC industry. The implication is that in the last 5+ years the PC industry has failed to offer 600 million users a compelling reason to upgrade. Isn't that the correct interpretation, considering they brought it up while positioning a new device intended to replace the PC? I seriously doubt it's intended to mean (to quote the article) "LOL poor people".
Granted, this interpretation would make for a boring thinkpiece and would not get me to #1 on HN.
>> The implication being that in the last 5+ years the PC industry has failed to offer 600 million users a compelling reason to upgrade
Compelling reason why I had to upgrade my iPhone 3G, iPhone 4 and original iPad after less than 3 years? Painfully slow and unusable.
Well if the compelling reason to upgrade is "painfully slow and unusable", then maybe a >= 5 year old PC that still works isn't a bad thing. If I'm not mistaken, my 2011 MBP's quad core i7 can still outperform the CPUs in today's Macbook Airs (the most popular Mac notebooks) in terms of pure processing power.
>> Compelling reason why I had to upgrade my iPhone 3G, iPhone 4 and original iPad after less than 3 years? Painfully slow and unusable.
Exactly. Upgrading is a cost; not just money but also time. I upgraded my 1st gen iPad's OS exactly once (5 -> 6 I think) and it was a completely horrendous experience, involving PC iTunes.
The main driver of PC upgrades, apart from improved games, is bloat and the slow deterioration of Windows installations. It's not as bad as it used to be but reinstalling the OS and cleaning the fans can often substantially improve an old PC.
I still have my original iPad, but it's used for a very limited set of things as the browser is crashy and many apps aren't available for its OS.
Users want neither forced upgrades, manufactured obsolescence, nor bogus SaaS rentalware like Adobe Creative Cloud.
I concur with you on not wanting forced upgrades and manufactured obsolescence.
But I have found rentalware (specifically Adobe Creative Cloud, as that's what I use) to be a somewhat positive experience. Adobe pushes out updates frequently enough that you see a positive increment in productivity without the updates being annoying. Plus it has brought the price within affordable range for people like me who could afford the software if we saved up for a few years, but not the full payment in one go. Having the payment broken down over months makes it more manageable. I have been converted from a pirate to a legitimate paying user just because of this. Outright licenses for Adobe's software cost more than some kidney operations in my country before this SaaS model.
JetBrains has also moved to a SaaS model. If they continue with legitimate updates every few months, then that subscription is also worth paying for.
The only subscription I do not see any benefit in, as an individual developer, is Microsoft Office 365. That's more aimed at businesses whose whole operation depends on the software.
> The only subscription I do not see any benefit in, as an individual developer, is Microsoft Office 365. That's more aimed at businesses whose whole operation depends on the software.
If the Office 365 subscription (for businesses) only included the Office suite, then I'd agree with you, but it also includes hosted email (for your own domain) and 1TB of OneDrive storage. To put that in perspective, Google Apps is a comparable product that offers hosted email, and it costs around the same but only includes 30GB of storage.
On paper at least, the subscription provides quite a bit of value. If only OneDrive for Business wasn't a piece of crap...
> nor bogus SaaS rentalware like Adobe Creative Cloud
Eh, as SaaS goes, at least Adobe's has some compelling reasons to use it, the biggest one being cost. Sure, you'll pay more in the long run, but try telling someone starting a small business that they should shell out multiple thousands of dollars for Adobe CS, or multiple hundreds of dollars per individual program if buying a la carte. That's the situation my fiancée was in, and I was finally able to move her to Adobe's SaaS offering and off of pirated Adobe CS because $40/mo is much easier to swallow than the prior cost. That they offer tech support, that she can easily migrate computers, and that there are no viruses bundled with the installer are all good selling points when coming from that side.
I still have all three of those iOS devices (and my wife's now retired 4S) mentioned in my original post.
The saddest thing?
The iPhone 4 has only one easy job to do. Play music in my bedroom. It can't even do that reliably. My iPhone 3G (now a desk clock) is more stable than the 4.
There's a reason I left iOS for Android on the phone side and Windows on the tablet side. I've been burned enough that I don't care to be burned again. Granted, Apple's devices are probably less prone to bloat now that Federighi took over from Forstall, but there's nothing compelling for me on the iOS side right now. A Pencil-enabled Mini would be very interesting, but that's about it.
The hardware of the iPhone is nice. I'd like it but with Android (and a microSD card slot). But then again, I wouldn't be willing to pay for that, so I'll continue using the used LG G3 I picked up for about $200.
I'd like an Android phone with the update availability of iOS. Nexus is the closest contender, but they still stop supporting devices after about 1.5-2 years.
I have a G3 like the GP... Unlocked, with latest nightly (Marshmallow) builds of Cyanogenmod. Pretty painless.
You can argue that Google and the OEMs should be making these upgrades easier and more widely available... And they should... But at least 3rd-parties filling in is an option in many cases. You can shop around to find the combination of hardware and software support that you want.
The problem with getting locked into a walled garden - no matter how nice it is - is that you're still locked in when winter comes.
> It's not as bad as it used to be but reinstalling the OS and cleaning the fans can often substantially improve an old PC.
And if you're reinstalling the OS anyway, add an SSD (at least for the system files). This gives a substantial performance boost on most old PCs for under 100 $/€.
Completely agree. I have a 5 year old Asus K54C (i3-2330m model).
About a year ago I upgraded RAM from 3GB to 6GB, which dealt with most of my multitasking and "large files in memory" issues.
About 2 months ago I bought a Samsung Evo 250GB SSD for under €90. This solved just about every other frustration I had with the machine - slow boot times, slow program load times etc.
The only issue I really have now is maxing out one core too easily and having the fan spin up loudly. Investigations reveal a CPU upgrade is possible.[0]
I have no intention of buying another laptop for a further 12-24 months, even if the plastic case of this one is held together with super glue and tape.
Though, if you buy a cheap SSD, I would be careful not to depend on it for important data. Cheaper SSDs often have very low write endurance, and using them as the OS drive where there will be swap and hibernation files, as well as potentially large install and update activity, can drive them to a quick death.
Is it no longer standard operating procedure to disable hibernation when you put an SSD in?
In any case, with an SSD, my boot times from cold start have been faster than coming back from hibernation was with spinning-rust disks. Plus, clean starts are less likely to leave things in a fubar state than often-buggy hibernation.
Surely true, phone hardware was quite 'limited' at first, unlike PCs, which were mostly capable of swallowing any task thrown at them (unless transcoding 1080p in real time or rebuilding LFS counts). On PCs the main issues are: a subpar and/or unsupported GPU (so the main CPU ends up burning cycles on entertainment), the absence of an SSD, and a bad software stack.
>> Surely true, phone hardware was quite 'limited' at first
Well here's my problem with that argument. The 3G, 4 and iPad 1 were quite fast until the bloated OS upgrades started coming out.
The one thing that would keep me from being bitter is if Apple offered an easy way to downgrade to the original OS version of each one.
I don't run a lot of apps. The most important apps to me are the ones that come with the phone. If that meant living without some apps and new features but being able to keep the devices in use longer, I would have surely made that trade-off.
True, but I can also understand how for some time iPad 1-level hardware was just a bit too limiting. Now the bloated upgrades, as I said, also helped force users to "want" the new one more.
The thing that saddens me is that businesses will always do that; it's the survival instinct of profit-making. On the other hand, we could have nice, legitimately good open source software layers, but it just doesn't happen the right way on smartphones, so we're stuck with Android/iOS and their tricks.
>> Now the bloated upgrades, as I said, also helped force users to "want" the new one more.
My experience after switching to Android is not that though. Most of the major releases, including the upcoming N, offer performance improvements.
Granted, I don't know what that says about the code base, whether it was already so vastly inefficient that they find tons of room for improvements or whatnot, but it's definitely a refreshing change from iOS for me.
That's ironic, since Apple has a ~80% adoption rate for iOS 9 and Google has an adoption rate of <40% for Android 5+.
I long ago switched away from iPhones to Android devices, but I'd be lying if I didn't admit that Apple's public commitment to privacy and Android's recent security woes didn't make me think twice about what the next phone in my pocket will be. I look at the system updates page on my Moto X longingly, but it still says February 1st is the most recent update available. Not sure how long I'll have to wait but I hope I don't get exploited in the meantime.
> That's ironic, since Apple has a ~80% adoption rate for iOS 9 and Google has an adoption rate of <40% for Android 5+.
But I also wonder what the story is behind that adoption rate discrepancy.
Apple makes their own phones, so they can push out upgrades easily. Most of the Android phones sold were made by Samsung, right? And Samsung has a wide variety of phone models and a penchant for making mods that slow down updates and/or make them infeasible on some models.
Aging iPhones also tend to become less enjoyable (carefully selected words) to use on their last viable OS update, so people tend to either upgrade to the latest iPhone or switch platforms.
Speaking of Moto X'es, I just got a notification for an update right now, while my wife who has the same model on the same network got her notification over a week ago. I'm guessing they like to stagger their updates?
I think part of the Android version spread has to do with prepaid phones. At least 2 of my family members regularly lose/destroy their cheap Android phones and then replace them. These cheaper phones ($30 - $60) are usually running an older 4.x version (until recently, often 3.x).
True, after version 7 iOS became too demanding. Ironically, I had much love for the first OS X releases, which got better even on a first-gen iMac until 10.5, I believe. Apple put that gene to rest.
All valid points, but it hasn't always been this way. There is a model for profitably and sustainably running a business that builds solid products and then services them.
We often act as though the world we live in is the only way things could have possibly proceeded, but after WW2 U.S. manufacturing invented the idea of the consumer to cope with manufacturing capacity built up during war time.
We are still effectively living in a manufacturing boom time created by a war machine.
I've pretty much gotten to the point of upgrading my hardware only when continuing to use my current hardware is more painful than upgrading.
Is this slowness/clunkiness livable for what I'm doing? When the answer is "no" more days than "yes", it's time for me to go shopping. Until then, I'm ok with typing on something you might find on the Millennium Falcon.
The irony of those comments is that one of the Mac's selling points is that after 5 years the average person can still expect it to be a decent machine running the latest OS and not corrupted by bloatware.
(My six year old PC laptop is an incredibly heavy piece of junk that crashes all the time due to the combination of corrupted hard disk and Vista, and it actually is sad that I still occasionally find a cause to use it, not least because it's still the least unreliable device for running Skype that I own)
This is why Chromebooks are brilliant devices for a lot of users. You lose out on functionality, but you completely avoid bloat. They reliably boot in 6-8 seconds and stay responsive with a dozen tabs open, even with a puny little Atom or ARM processor.
> I seriously doubt it's intended to mean (to quote the article) "LOL poor people".
He didn't mean it that way, but it is a reflection of the culture that permeates Apple as a company and the bubble in which they live.
I remember once one of my wealthy friends saying "I don't understand why people live in one bedroom apartments". I stared at her for a solid minute to determine if she was joking but no, she was dead serious. She genuinely did not understand. She was born and raised in a wealthy family and she just had lost track of the rest of the world.
Schiller and the Apple execs have similar blinders on and once in a while, the mask drops in a public speech because the speech writers and their proofreaders have similar blinders on and didn't realize the enormity of the implication.
As a hardware company, if a huge percentage of your user base is still using a product you made five years ago, there are two schools of thought:
1. "Great! Our products are so well-made, stable and backwards-compatible that users do not have to upgrade."
2. "This is terrible. We have failed so thoroughly to innovate that people are content with using five-year-old technology."
It's not like Apple is unaware of virtues of #1, but as a company they are clearly in bucket #2. In five years, if 600MM users are still using the iPhone 6, I'm certain they would also consider that "very sad". I don't think Schiller and the exec team have blinders, they are just holding the rest of the PC industry to the same standard to which they hold themselves.
I feel like you're just exemplifying the exact opposite here. You're making a lot of assumptions about a statement that, in context, is pretty innocuous. For as much as you're saying that Apple as a company has these blinders on, I think you have just as much of an issue with blinders. The statement can be taken at least 2 ways. It's telling that you've turned one short statement into a generalization about an entire company.
It's not as clearcut as you make it out to be though. More often than not those old PCs effectively deny a whole bunch of people access to reliable and fast modern computing. In that sense an iPad Pro would be very well worth the price. Think hidden opportunity costs through crappy, unstable UX.
Unfortunately the real problem with Apple is that they are still somewhat on track to lose the "functional high ground", as "it just works" doesn't apply as much as it did only some years ago.
I hear that the MS Surface is actually quite usable these days; I don't have first-hand experience with it though. I would be interested to know, as I'm feeling more and more inclined not to recommend Apple products to that cohort anymore.
But now you've changed the substance of the utterance entirely. He didn't say, "I don't understand why people use 5 year old computers." He said that it's "sad" that they do. And, when you believe that your product -- especially the latest iteration of that product -- genuinely improves people's lives, it follows that anyone who doesn't have it is 'missing out on something great.'
This controversy ranks right up there with the 'let's make her smile' bit from the iPad Pro announcement. Manufactured to create a story, but not impactful in any way.
This story is as old as the world. "Qu'ils mangent de la brioche" - "Let them eat cake", as the great princess Marie Antoinette said upon learning that the peasants had no bread.
That was a fictional phrase written by a famous writer and attributed to Marie Antoinette in order to discredit her, by those who stood to benefit from that.
I wouldn't say it's a bubble so much as they just aren't motivated to consider those viewpoints. Apple's products are high cost and high status, they have no reason to consider the lower end of the economic spectrum; that's not their market. It would be like Rolls Royce running radio ads in rural Kentucky. Yeah, many people there would probably like one, but they can't get one.
I don't think that it means "LOL poor people" either. But I do think that it's bullshit marketing talk. Apple is perfectly happy to be proud (rightly, I think) of the enduring utility of older pieces of Apple hardware -- but if some other company's hardware continues to be perfectly serviceable for 5 years, that's sad?
Also, if you have a 5 year old PC that's genuinely not serving your needs any more, sure, you could pay $600 for an iPad Pro that also probably won't serve your needs, or you could pay around $400 for a new PC that will serve your needs.
Yep, eBay is booming with second-hand workstations like this. My cousin bought an HP Z210 with a quad-core E3-1270 at 3.40GHz and 16GB RAM for 320€. He added an SSD and a cheap GPU, and it's an awesome 500€ machine that can last for years.
Totally there. If your electricity is cheap, it's an awesome deal. Since, oh, about 2009, there have been a lot of choices that are akin to buying a '57 Chevy.
Yup. Clock for clock, Skylake is about 10-20% faster than Sandy Bridge; IPC improvements have been <5% per year. The apparent improvement since that time has come from slowly cranking up the stock clock rates. If you overclock a Sandy Bridge to >4 GHz, which is extremely reasonable, then it keeps up just fine with a Skylake in most tasks.
CPU performance is largely "good enough" for most users. OS bloat has finally stopped: Win8.1 is just as fast as Win7 (and is more stable) and Win10 is faster and skinnier. Most users don't do anything intensive and probably wouldn't even notice if you substituted in a low-end processor. For those that do have big needs, GPU offloading has taken off in a big way.
This is kind of unfortunate in other respects. CPU performance (especially single-threaded) is extremely important for high refresh rates. At 144 Hz there's no margin for any weak link in the system. But I recognize that I'm kind of a niche user in that regard.
Sometimes. There are multiple types of execution units in a CPU core (even multiples of the same type), and a thread can dispatch to multiple units at once (superscalar execution). It can also reorder the instruction stream to keep all the units occupied (out-of-order execution), preemptively execute along the most likely direction a branch will take (speculative execution), etc.
Basically, it's all a massive game to keep all the units of a core busy to execute the desired instruction stream as fast as possible. Over time, successive CPU architectures have gotten better at playing the game: better occupancy, more execution units, and more powerful units (SSE, AVX, etc), which translates into a greater number of instructions executed per clock cycle (IPC).
That's why a Skylake is much faster than a Pentium 4, even though the P4 might run at a higher clockrate. The Skylake has better IPC.
And as a side note: what Hyperthreading does is duplicate the part of the core that manages registers and instruction dispatch for a thread. So you have a second thread that can utilize any execution units that the first thread left unoccupied.
Bulldozer works somewhat similarly: two threads share a single module, and each module has a pair of integer cores but they share a floating-point unit. So it's kind of like a super-Hyperthreading, where they include a duplicate of (what they hope is) the most needed execution hardware. Doesn't always work out in reality though.
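As a rough illustration of the "keep the units busy" game described above (a minimal sketch in C, not from any comment here, and assuming the compiler doesn't reassociate the floating-point adds, i.e. no -ffast-math): the single-accumulator loop below is one long dependency chain where every add must wait for the previous result, while splitting the sum across four independent accumulators gives an out-of-order, superscalar core several chains it can keep in flight at once.

    #include <stddef.h>
    #include <stdio.h>

    /* One accumulator: each add depends on the previous one, so the loop is
       bound by add latency and most execution units sit idle. */
    double sum_single(const double *x, size_t n) {
        double s = 0.0;
        for (size_t i = 0; i < n; i++)
            s += x[i];
        return s;
    }

    /* Four independent accumulators: the core can dispatch the adds of each
       iteration to different units in the same cycle, raising IPC. */
    double sum_unrolled(const double *x, size_t n) {
        double s0 = 0.0, s1 = 0.0, s2 = 0.0, s3 = 0.0;
        size_t i = 0;
        for (; i + 4 <= n; i += 4) {
            s0 += x[i];
            s1 += x[i + 1];
            s2 += x[i + 2];
            s3 += x[i + 3];
        }
        for (; i < n; i++)   /* leftover elements */
            s0 += x[i];
        return (s0 + s1) + (s2 + s3);
    }

    int main(void) {
        double x[8] = {1, 2, 3, 4, 5, 6, 7, 8};
        printf("%g %g\n", sum_single(x, 8), sum_unrolled(x, 8));  /* 36 36 */
        return 0;
    }

Same arithmetic either way, but on a large array the second version typically runs noticeably faster on any modern out-of-order core, which is exactly the IPC difference being discussed.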
Agreed that the "LOL poor people" reading is probably not the intention; however, I am sure the internet will have a field day with it. With that in mind, it will likely generate some really interesting discussions around the quality and longevity of Apple's products.
Imho smartphone tech is plateauing, and I get the feeling Apple is going to struggle more and more to justify the prices they charge for iPhones and other products. My iPhone 5 (4-5 years of use) broke 3 weeks ago, and after reviewing phone prices I just couldn't bring myself to spend 3-4 times the amount on a phone when I can get one "just as good", but Android.
In fact I was chatting to a friend who recently also went from iPhone to Android, something he said that I took to heart was that spending more than R3500 (about $230) on a cellphone is an unnecessary luxury. I am willing to bet that limit could be pushed down too.
Couldn't you make the opposite argument from the same starting point?
That as the technology plateaus, and the upgrade cycle becomes longer, there's much more reason to spend more on a handset because you're not going to feel like replacing it a year later?
Spending a bit more doesn't seem like an issue for something that will last 4-5 years.
Your friend may say spending more than R3500 on a cellphone is an unnecessary luxury. I say that the cellphone is the single most important piece of technology I own, and so it isn't the area I want to be stingy on.
Apple isn't necessarily the best choice even with that in mind, but that's a separate argument.
That is a valid point, but if I can get a good Android phone that lasts half as long for a quarter of the price, then I am afraid I am going with the Android. But really, now I am just throwing around numbers with no real experience of how long this Android phone will last (Moto G 3rd Gen; so far so good).
Something I also found during my transition: I had two Android phones in the space of 3 months (I used a friend's spare while I replaced my iPhone), and transitioning from one to the other was incredibly frictionless. I mean it actually made me pause and "wow" a bit.
I suppose my argument was more that the quality gap of iPhone to many Android phones has closed a hell of a lot over the last few years, and I am just not sure I can justify paying so much more for an iPhone anymore. Especially as the ZAR weakens against the USD.
You're assuming that the item you're spending more on is worth more. It's really hard to quantify for iPhone vs Android, but my super-simplified calculation is:
iPhone = 4 stars, Android = 4 stars
If the Android is $200, and the iPhone is $400, I'm gonna go with the Android
I built my desktop from scratch (for the first time) in 2005, added a couple components from other computers as they died, and still use it with no issues. I've thought about upgrading components in it, but since they haven't died yet, and it still works just fine, there's just not much of a point.
Built a monster PC late last year and I see it lasting me a long time as well. I don't want to play the game of always jumping on the latest and greatest. Your pockets will always get emptied out and I've discovered accumulating material possessions and keeping up with the Joneses doesn't give me the endorphin rush like it used to. Thanks for this comment.
I did this back in 2009 and dropped a wad of cash on a Mac Pro tower. Thing is a frickin' champ. Upgraded it to 12GB of RAM and a nice SSD for the OS drive and never looked back. Turns out 4 Nehalem cores at 2.66GHz will last you quite a while if you don't do video editing work or play the latest games at full resolution.
This is the way to go, if you know how. Unfortunately a lot of people's brains just shut off when they have to think about their computer on that level.
Just bought a like-new PC on eBay for $200. It's not about being poor, because anyone who says they can't scrape up $200 if they had to is lying to you.
It's the shocking fact that (especially if cash strapped) you can live your life, without constantly buying more and newer things.
Of course Apple's stance is that you must buy their new products every year. But I can get on Facebook and email and whatever else on my $200 PC just fine; the value isn't there for a lot of people.
There are people that are extremely poor, yes. They need help and society should help them. But they wouldn't be in the market for a new computer. My point is the cost of computing is about as low as it can be expected to get.
As opposed to what? It's still less than half the price of a usable iPad (I know the Mini 2 is $270, but let's not pretend they're comparable), let alone Mac.
That is only about two months of the cable internet bill or cell phone data plan that you would need to really be able to use the computer. To do anything that the vast majority of people want to use a computer for, anyway...
Yes. The irony is that iPhone and iPad are no better as explained in the article(s)[0].
"Based on my experience with Apple hardware newer than my laptop, I’m guessing it’s the former. Since 2011, I’ve bought a couple of iPhones (both dead), numerous connectors and power cords (all dead; I’ve switched to knockoffs), and a replacement laptop battery (currently half-dead and in need of replacement)."
I agree with your interpretation. It would seem to have more to do with the slowing sales numbers for tablets as well as the incorrect assumption we would be in a post-PC era at this point. That day will come eventually and while large tablets may not be the answer, Apple is certainly hoping it will be. There has to be a good bit of concern at Apple that they are still largely dependent on the iPhone.
There is quite a large cohort of PC users with minimal needs beyond a web browser for email, news and some basic entertainment (Youtube and simple gaming). Google has also done an amazing job with the Chrome ecosystem providing web services that meet all of these needs.
As someone sitting at a nearly 5 year old Macbook Pro, I took the comment as an off hand throwaway. I understand not liking the comment, but this isn't news. A company thinks everyone should be using the latest of their products. Oh no.
> A company thinks everyone should be using the latest of their products. Oh no.
An alternative way to read this is that Apple themselves don't believe their own products will be worth using after 5 years, and will therefore fail to support them properly in the long term. Anyone in the market for a computer to last longer than that (eg most home users, and many corporate users who don't lease) might have to look elsewhere.
This is not an offhand throwaway comment. This speaks to the absolute heart of Apple's strategy.
>Apple themselves don't believe their own products will be worth using after 5 years
I think you got the gist of this thread! It's not actually about the poor but about competing products lasting longer than their products' planned lifespan, and that's a bad thing for them.
Apple makes no money on the OS. That is included with the hardware and upgrades are free. So aside from services such as App Store and iTunes, Apple has to sell hardware to make money. It's not in their interest to encourage people to hang on to older hardware as long as possible.
> It's not in their interest to encourage people to hang on to older hardware as long as possible.
There's a difference between encouraging people to hang on to hardware for as long as possible and actively driving away your customers by laughing at them or suggesting that your own products will be obsolete before the customer is ready to update again.
To maximise their profit on hardware Apple should be encouraging users to stay within the Apple ecosystem, even if those customers only upgrade every 10 years. This is especially true in today's climate of very powerful computers that will easily do most things a home user wants to do even after a decade. As someone who writes web software I am acutely aware that there are users who are still using old PCs with old software.
For Apple to push their customers to abandon Apple products in favour of things that last longer is corporate suicide.
I think this may be a consequence of Hacker News not having a downvote option for stories. I like the fact that they don't, and I wouldn't suggest making the change, but it tends to allow stories with a higher dislike-to-like ratio to rise, because the number of people who have an opinion is very high. This does tend to prevent some filtering of stories I do think should stay up at the top. It would be an interesting experiment to add a downvote button for stories to see what happened, even if that button didn't actually affect where the stories showed in the list at first.
As someone typing on a 2011 Macbook Air, I really don't like that comment. What's wrong with using a 5 year old computer? Fortunately their hardware still lasts for you and me. I had to send mine in for repair twice: once for the keyboard and once for the touchpad. If the company is pushing people to dump their 5 year old hardware, I'm going to be a little nervous about buying their new products.
This was my immediate thought while watching the press conference too. It sounded more like a simple tone-deaf statement. I buy Apple hardware because I think it's really high quality and I can expect it to last. I was troubled that maybe Apple doesn't align with my beliefs.
That's an odd fact about markets. You have the high-rate-of-progress phase, where technological advances almost sell themselves. I didn't have to wonder for long to want a Pentium 200 MMX instead of my Pentium 75. Nowadays it's really tough to convince someone naturally and strongly. They have to push for new 'needs' (4K?) or allow crapware and subpar software to trigger the religious reflex newer = better = faster.
4K monitors are a little marginal. 27" is really too small; most people need some DPI enlargement to feel comfortable. The bigger panels increase significantly in price, but I think 32-36" would probably be about right in terms of PPI.
27" 1440P monitors are fantastic though, they strike the perfect balance between resolution, size, and price. If you're used to 22-24" monitors I really encourage you to give it a try, it's very nice for productivity. You can get a cheap Korean VA/IPS monitor for $200 on eBay or a very nice one for $300 or so (the nicer ones have DC backlights which help prevent eyestrain from PWM flickering). Definitely worth it.
I originally planned on getting a single 32" 4K IPS monitor, but got tired of waiting for it to come out in Canada.
I test drove a few smaller 28" 4K screens in the local computer store and realized that I could easily live with 1:1 on a 28". I ended up getting a Dell 27" 4K monitor because I wanted an IPS panel.
I realize I'm in a very small minority of users who can use 4K with no scaling at 27" -- I use a standing desk and I have single vision computer glasses for the distance my monitors are away from my face.
When I first got the 27" 4K monitor, I was in awe of the glorious real estate it afforded me. That awe didn't last long, however, because I ended up wanting even more real estate. So I bought a second, matching 4K monitor, and it's great. It's like having 8x1080p monitors.
I would love the additional real estate from a third and fourth monitor (i.e., 8K equivalent) but things would get harder at that point because of neck movements, stand availability and the fact that I have a small form factor PC with no room for additional video cards.
I tried a 28" (TN) 4K, an Acer B286HK. Horrible ghosting and capacitor whine - if I had Wikipedia open you could hear the capacitors scream from across the room, and lines would blur into each other while scrolling. I eventually replaced it with a Dell P2715Q (IPS) too, and that's a really great monitor.
All in all though I would have been better off going 1440p at 27". I need some DPI scaling that probably gets me back to 1440p anyway, and it would be easier to drive in video games.
So that's what I recommend now based on personal experience. YMMV, but the Crossover 2795QHD seems ideal for productivity work. The Acer XB270HU or XF270HU are ideal for gaming, but adaptive sync is unnecessary for productivity so you might as well get a Crossover instead.
I do agree that 1440p is probably best for most people. Most of the people who see my setup just ask "How the hell do you do that?"
I'm one of those people who has a bad habit of having 50 windows open at any given time. Sure I could change my habit, as I actually "only" need tiles of about 10 screens at any given time, but so far, having two 27" 4K monitors side by side has improved my productivity significantly.
I originally had them stacked vertically, but I changed the orientation to side by side landscape when I changed desks, and I found the new wide orientation works even better for me. Having needed reading glasses for the past few years also helps, because the glasses I have magnify the screens a little bit too, so it helps.
28" 4k panels are around 150 DPI, which is only 50% more than the "standard" 22" 1080p panel. 32" 8k panels, though, are going to be 275 DPI.
It is all personal experience, but my 21" 1080p monitors have awful pixelation that is very noticeable at standard viewing distances. Obviously, they have pitifully low DPI. Reasonable desktops (modern Gnome, latest KDE/Qt) now fully support DPI scaling, so there will be no problem for people using much higher-fidelity displays going forward, because it's not 2005 anymore and we don't have to be bound by the awful, incredibly low fixed DPI in Windows.
I bought a 4K monitor last year, and since it was about as expensive as all the good 24"-28" models, I went with a 40" model (the Philips BDM4065UC, to be exact). And it's amazing. 4K is awesome when you have the screen area to actually fit more content on the screen without having to Alt-Tab all the time (e.g. code + API documentation + test results). It's basically four 1080p monitors in one.
I never used anything above 21/22" IIRC, except as a secondary monitor. I'll give it a shot, although nowadays I'm having a retro fetish; give me a 10" orange/black CRT and I'll be happy.
I swear, if a modern tech company went up and said "We think you should buy our product", someone would start yelling about how insulting it is that a company is endorsing capitalism and materialism.
We've gone from "Won't somebody think of the children" to "Won't somebody think of every single possible group that is possible to somehow offend in some way"
What Apple is saying today is something that it has always said; it's only recently that people in America are too poor to buy American products.
I grew up around Windows and Linux, and whenever I mention that in a job interview, people think I am not being serious, since "real developers" use Macs.
In my first software job, I was told to abandon my personal cheap Linux machine and use the company's brand new, expensive Apple, even though I was more productive on my cheap Linux box.
Apple has always been like that; it's just that most people who come from money do not hear it.
Edit:
Also you will notice this in poorer countries - a strong ecosystem for recycling exists everywhere outside of the US/UK.
When I bought my first bike in the UK, I wanted to repair it and was told to just ditch it and get a new one!
It was extremely odd to me, since I expected that bike to last at least 20 years.
When my 3 year old laptop stopped working, I had it repaired rather than buying a newer one, even though repairing was almost as expensive as getting a new one. It just felt really wrong ditching a laptop rather than repairing it.
Maybe you are right - western societies do worship materialism.
> Also you will notice this in poorer countries - a strong ecosystem for recycling exists everywhere outside of the US/UK. When I bought my first bike in the UK, I wanted to repair it and was told to just ditch it and get a new one!
I'd call the US and UK poor for lacking a system of recycling. And indeed in many ways they are poorer than the best-functioning societies.
Maybe OP wanted to say "reuse/refurbish" instead of "recycle". In less rich countries there are thriving second hand markets, online and offline. Many sell the obsolete hardware they no longer need instead of throwing it away.
I still have some 9-year-old "capacitor plague" Dells. I got them gratis--due to the bad capacitors, naturally--and desoldered and re-capped them for about $15 each.
But desoldering and replacing non-SM PCB components is not the sort of repair that most people would feel comfortable doing, even if they can change the oil in their cars.
Based on the cost differential of something like an XBox mod chip versus the same mod chip plus no-hassle installation service, buying an all-new device assembled mostly by robots and low-wage Chinese people is simply more cost-effective in every country where knowing how to use a soldering iron is worth more than a few dollars per hour.
The only way around that is to teach kids how to build and repair custom electronics as part of the regular school curriculum. I don't know about you, but I learned to make sand-mold aluminum castings, silk-screened t-shirts, wooden bookends, and to spot-weld sheet metal in my 8th grade shop class. My classmates and I really would have been better off replacing the t-shirt module with custom-etched circuit boards and simple electronics soldering. That was close to the time when Heathkit stopped making hobbyist electronics kits, too.
> I grew up around Windows and Linux, and whenever I mention that in a job interview, people think I am not being serious, since "real developers" use Macs.
How true is this? I have noticed the proliferation of Macs in the developer community and I was wondering how much it would hurt me professionally by not owning one. Do devs really make snap judgements based on your platform of choice and could this impact me professionally in the long term?
Real real developers use a precision flathead screwdriver and a bank of DIP switches.~
One of the great triumphs of computing was abstracting the algorithm away from the specifics of the hardware using compilers. We have the good fortune to be able to configure those compilers to generate several varieties of executable containers, such as ELF, a.out, COFF, EXE (MZ), EXE (PE/COFF), JAR, Mach-O, and others. As long as the OS has an installed virtual machine or ABI that can handle the executable, it will be able to run it.
If anyone got snooty with me about my hardware, I think I'd consider myself lucky to not need to work with them.
Does it have a text editor? Does it have a compiler? Yes to both means that real developers can use it.
From my experience this shift occurred over the past 10 years or so, starting around when the MacBook Pro was released. Gradually Macs gained acceptance as developer machines, not just "creative" workstations.
I think platform preference still varies widely by industry. Macs might be common in app and web development but in engineering and systems development Windows is dominant.
It's not that poor people are offended that apple makes fun of them for not buying their products.
It's that we live in a world that systemically vilifies and denigrates poor people for not lifting themselves up by their bootstraps, learning programming, and starting a company and solving income inequality on their own.
And while our culture is collectively shitting on poor people, we are lauding the great entrepreneurs and business people at apple and the likes for being objectively better.
That is why it's offensive. Not because poor people are offended, but because it SHOULD offend you how obtuse it is.
It's sad that a full five years later, CPU performance/USD has barely moved at all. RAM is half the cost now, SSDs a quarter the cost. Not sure about how GPUs have developed?
(Edit: The GTX 960 which today also costs 200 USD seems to be about twice as fast as the GTX 460.)
The average person goes to Walmart/Bestbuy/BJ's/Staples and spends $475 on some shitbox laptop. The 25th percentile user (ie. millions of people) probably spent $275-325.
That average person who bought that average PC in 2011 has an aging HP/Dell/Asus laptop with a failing battery, a $30 Celeron or AMD processor, 2GB of memory, integrated graphics, and a 500GB 5400rpm disk.
I still have my 2008 plastic Macbook refurb that I still use to browse the web.
I had the pre-release Win 10/64 on it for a while, and it actually ran faster than OSX did on that machine (in my experience, Windows runs faster on Mac than OSX itself, but ymmv). And keep in mind that my Macbook pretty much got left behind by OSX after Lion.
Eh, probably more like the top 40-50%. All the people (and yes, there are more than a few of them) who buy anything better than the cheapest POS Walmart laptop. I built a computer for my parents at the beginning of 2012 from entirely lower-midrange components: an AMD CPU with built-in graphics, 4 GB RAM, etc. IIRC the total cost including monitor was a bit over $500.
The only thing wrong with that computer today is that it has a 7200 RPM mechanical hard drive. Even then it's not slow, it's just noticeably slower than a SSD.
Don't forget the 1366x768 screen! They were selling those as standard on 15.6" laptops with the good old "It's a good screen because it has more inches" line.
Only in the last year or two has anything better than that become remotely common.
I really thought this had gotten better with all the high DPI convertibles and Windows + 3rd party software finally being pretty reliable with it. But you're right, Best Buy's most popular 15" laptop today is this:
Specs claim '15.6" display, HD resolution.' with no actual screen resolution listed.
Digging around on Asus's website I see it's available with 1080p, but I'm guessing the 1366x768 base model is the common one. Since 720p is technically "HD", they can claim that in advertising and then dodge around admitting how crappy it is. If Best Buy's product listing were for the 1080p version, they'd be advertising it as either 1080p or Full HD.
I have an even shittier PC than you (Celeron 2.6 GHz for 40 USD, no dedicated GPU) and it is still absolutely fine for everything I do (web development, image processing in python, typing papers in TeX, editing images in Gimp, ...).
If I had to upgrade one component it would be my 22" monitor which is 6 or 7 years old but even for this there is no need.
And replacing my PC with an iPad Pro? lol, wtf? No TeX, no Python/matplotlib, no SublimeText/vim/emacs, no real filesystem, ...
btw, I think you can drop an SSD into almost any shitty PC/laptop and make it usable again. Before SSDs I had to upgrade my PC quite often because it always felt slow, but SSDs have been an absolute game changer for me. I even revived my old trusty T61s by adding an SSD, and now it is good again.
GPUs have developed a lot. I had a high-end card a couple of years ago, and now I can barely run Elite: Dangerous. For the upcoming VR headsets I would definitely need a new GPU. Apple doesn't even have a computer on offer that would be powerful enough for the VR headsets (Oculus/Vive).
8GB RAM is also the bare minimum.
GPUs are much better now since GTX 460 but that's all you'd need to upgrade in this computer (and some storage) if you wanted to play current AAA games in 1080p or even at 1440p.
The head of marketing thinks that using a competitor's product is "sad".
In response, this article calls Apple:
"Insensitive". Offensive. "Hypocritical". "Insulting". And worst of all, promoting inequality by building high quality, expensive products and forgetting the needs of the poor.
Really this isn't terribly offensive. They're selling a product they think is superior, in their [marketing] minds the world would be a better place if you were pulled from the womb, slapped on the ass, and handed an iPad Pro, to be renewed every generation of gadget.
The Apple presentation stage is not the Basilica of St Peter. These are marketing pronouncements, not moral ones, and analyzing them as such is such intense navel gazing that it's actually bad for your neck. If Apple hates the poor, it's for no reason other than that they're outside their customer base.
Perhaps I have some wires crossed from too much time as a sysadmin, but I take a 5+ year old PC (or any machine, really) as a mark of pride, not any bit of shame whatsoever. It speaks to a high degree of reliability, which often speaks well of the operator (even if "choosing robust hardware" is just a component of this).
Some stories to add some color to this:
Ran a very primitive file sharing server for my university on a dual P(2 or 3, don't quite remember) machine that was probably around a decade old by the time they finally ended up retiring it.
My home fileserver is a ~10 TB 4U monster, running on (conveniently) 5 year old hardware and very boring FBSD. Outside of moving apartments, it has not had unplanned downtime once, I will continue using it as long as this is true and would be sad if I didn't get another good few years out of it.
I _WISH_ I could get the same lifetime out of desktop PCs but I tend to find assorted parts failing at an asymptotic rate around 3-5 years. The world in which we all use <5 year old hardware is a sad one, to be avoided, to my eyes. (To clarify, I don't mean this in any luddite sense, I don't believe tech should stop moving forward, but I long for more robust products with longer viable lifespans, such that one can make a choice to upgrade rather than waiting for the inevitable.)
You're right, but I was not trying to make a 1:1 comparison; I was saying that the properties servers seem to have in spades are ones I'd really appreciate seeing in other markets, but that segment seems rarely catered to. (Let me gesture as well to my 2.5 year old phone, whose irreplaceable battery is currently ballooning.)
My statement addressed the "whole computer" as well, although the file server example was very much a special niche tool. I'd go so far as to say 90% of the novel software features that have necessitated hardware refreshes in the last... decade? I could probably live without. This may be mild hyperbole, but I hope the spirit of my statement comes across. I want to spend less time replacing/fixing my platforms and more time with them "just working".
I'm not a fan of moralistic handwringing, particularly when it's brought about by uncharitable interpretations of what is clearly just marketing. This and the related articles are such poor quality that it makes me sad to see them get so much traction here. It's the sort of thing I'd expect at dailytech or slashdot.
I've had that exact same feeling since my iPad 1's life was effectively ended by software updates. It was the quickest device to become obsolete I've ever owned.
I inherited one after never owning a tablet. I was able to turn it into a Plex video-viewing device after jailbreaking and figuring out what the latest supported version of Plex is and it works great for that.
Of course for web browsing in general it's pretty unusable.
As someone who had to develop for that thing - it was underpowered from day 1. iPhone 3GS hardware with 5x the number of pixels to push and higher expectations on software capability? It was never going to work.
The phrase "reduce reuse recycle" is actually a pyramid of effectiveness. The best thing is to reduce your consumption entirely, which in this case translates to not producing or buying new hardware you don't need.
I split "reuse" into "repair" and "repurpose", so that I can put "repair" before "reduce".
I think a product designed to be repairable indefinitely is less wasteful than one that has half as much mass but needs to be completely recycled whenever any critical part fails.
I would dearly love to have a standard laptop form factor, even if means cases would be 5cm thick when closed and 2kg without its innards. If there were a laptop case standard, it would be a whole lot easier to manage repair parts and 3rd party upgrades.
We definitely need a strong reduction in the introduction of new technology that replaces old, still-working hardware. This is so wasteful. Along with this, we need centrally planned production of consumer goods to control the pace of change and prevent overproduction, and these goods need to have government-set prices to ensure that everyone can afford them.
It is ironic that Apple is mentioning this, as I believe this is going to spell the end of their era of massive profits. Phones are now getting to the state where, much like PCs, the older phone is good enough and the new phone is not substantially better. There will always be people buying a phone for 0.2 GHz more CPU or a slightly higher-res screen, but the days of rapid evolution of mobile devices are over, and Apple is going to have problems selling someone a $600-800 phone every year or two.
But with new OS versions, older phones quickly start to show their age. An iPhone 4 or iPhone 4S on the latest version of iOS it supports is really sluggish and slow (and there's no easy way to put iOS 6/7 back on it).
Methinks Jobs had an overly high opinion of iTunes.
Giving iTunes to a Windows user is more like giving a massive, ugly, wooden horse to a Trojan, and it is so incredibly unwieldy that the lintel has to be removed from the city gate to even get it inside. Then a bunch of bearded maniacs come out of it and kill everybody.
I can't even tell you how many times I have been murdered in my sleep by iTunes. Okay, well, yes I can--zero. But that doesn't stop me from thinking it might happen one day.
If there's something sad about that fact, it's that the industry has delivered so little value in the past 5 years that not many people feel compelled to upgrade!
> If there's something sad about that fact, it's that the industry has delivered so little value in the past 5 years that not many people feel compelled to upgrade!
The glass half-full response?
The industry has delivered so much value in the past 5-8 years that not many people feel compelled to upgrade!
Exactly. I haven't upgraded my washing machine or refrigerator in the last 5 years either. The people Schiller was sneering at view PCs as appliances, and they don't watch Apple press conferences anyway.
Actually, my main development tower is about nine years old.
And in that time, I've upgraded the processor, doubled the memory, upgraded the hard drive to an SSD, switched the video card twice, upped the number of connected monitors from one to three, and upgraded the OS from 32-bit Vista to 64-bit Windows 10.
It was pretty leading edge when I built it, and it's still pretty leading edge today. What's sad is the expectation that I should throw my computer away every 18 months.
I try to overspend on my computers to ensure that I have a computer that can last at least five years.
I loved the Macbook Pro unibodies because they were expandable. My 2011 MBP had its memory upgraded and maxed out as prices came down, and its storage upgraded to larger and faster drives and SSDs as prices came down.
If it weren't for one thing, it would have still been my main computer and delayed my switch to Windows.
The infamous overheating problem with my model of MBP turned it into a brick a few months before Apple acknowledged the defect and announced a repair program. The replacement part was known to have the same defect (I had already had my logic board replaced once), and being out of AppleCare, there was no way in hell I was paying >$500 out of pocket for yet another defective board. So I just switched to a Windows machine, since nothing I did was OS dependent anyway.
Granted, I would have switched when the Macbook Pro became unusable (the unibodies are gone and I don't like the new soldered-on approach), but I think that wouldn't have happened until at least 2018.
Unless he's on AMD and the processor upgrade was very minor, he almost certainly did.
It's odd when people say "my computer is a decade old and I've only had to replace every single part multiple times over look at how green I'm being". It's the exact same thing as buying a new computer a couple times, just in smaller increments and with parts that are less likely to be recycled on their own.
Jesus christ, they are in the business of selling computers. This faux outrage over a salesman trying to sell his computers is gross - what's wrong with people?
Now watch as every blog tries to scramble to see who has the most outrage and who is the largest victim.
Some people need to feel offended just as I need my morning coffee.
I'm not even counting how old is my pc/laptop and that's the thing I like about PCs - there's hardly anything new that triggers my Gear Acquisition Syndrome.
I still use my 2009 MBP. I upgraded to an SSD and 8 gigs of RAM. I don't game on it, but I can do almost all of my development on it. I'll admit it's slower than my 2013 MBP at work, but I see no reason to spend more money on a working computer.
My second computer is a Chromebook. My oldest daughter uses it for school work and we couldn't be happier. It's much better than our iPad (which we don't use anymore).
You know what makes me really happy? Any computer being used for fun/interesting/productive things, not just 'the latest ones'.
As a die-hard retrocomputing enthusiast with far more old computers in my basement than new, I'm biased. But I sure think the time has come for the computing industry to start highlighting the need for less computing power, yet still more productive computing.
8-bit computers are awesome. 16-bit machines superb! Get yourself set up with these systems and you can entertain yourself for hours and hours. 8-bit is a great way to learn software development - 2 hours of 8-bit coding a week will keep you sharper than sharp when the time comes to go back to the hipstertools-de-jour. (I kid, I kid.)
Point is this, folks: old computers never die - their users do.
Until earlier this year I was still making heavy use of a Powerbook from 2004. I only stopped using it because the graphics hardware started to go funny.
I sold its battery, memory and power supply to someone still using a slightly older Powerbook.
If anything, it's a fairly strong statement on the ruggedness of the PC platform. I have a 2500K i5 in my old desktop. With a new-ish video card I play all the newest AAA games at high quality. It's incredible how the x86 world really hasn't had any huge performance bumps and how a Q1 2011 CPU is still competitive.
Also, there's the larger narrative of people buying tablets and putting off PC upgrades, so the PC ages. Don't worry, Apple, you're still getting their money. It's just that people aren't ready to replace a general purpose computer that they control and that can run pretty much everything with a walled-garden mobile device designed to get ad impressions and consume media.
If anything, this is Apple's frustration. They have all this success, but people and businesses keep buying PCs. They'll never crack this market. They're too invested in the Jobsian "closed" ecosystem philosophy to be as agile as the PC platform. Mocking those who don't drink their kool-aid just makes them look like sore winners.
edit: I'm aware I can buy a newer chip, but from a single-core vs single-core perspective it's not that much faster. Very little consumer software is properly multi-threaded, which is why my expensive work computer with the newest i7 doesn't feel any faster than my 5 year old desktop at home. Most things are pegged to one core, and at the end of the day single-core performance is what's going to matter.
> It's incredible how the x86 world really hasn't had any huge performance bumps and how a Q1 2011 CPU is still competitive.
Actually, you can get a desktop CPU today that is twice as fast as your 2500K, and not for some number crunching you may never need (even though it's the most important thing for others) but for such common tasks as compiling code:
> Actually, you can get a desktop CPU today that is twice as fast as your 2500K
Alternatively, you can overclock the 2500K to absurd levels (33% or more seems the norm, even on air cooling), which negates most of the advantage of newer models.
I'm a little sour on Apple. They say their products are rugged and "built to last", but an iPad is mostly glass, which you need to encase in a giant rubber protective case if you don't want the screen shattered. Even then they are easy to break and very costly to fix, to the point now where it barely makes sense to fix one over replacing it. Self-repair is less expensive, but so far I'm 50/50 on successes. I would say they are built to last, until the next one comes out.
The killer part is that it's only expensive to repair because Apple has opted not to streamline a repair pipeline. There is absolutely no way a digitizer glued to a piece of glass costs $500 otherwise they'd never make another iPad.
But if they price the part hideously expensively it makes buying a new one a no-brainer. Working in a school for a few years, I was amazed how easily and inexpensively I could replace screens and memory on 2008-era white macbooks when I had access to school channels for parts priced at cost (or close enough). I think we had access to those channels as a stipulation in a state contract to buy assloads of those machines for every kid in the state.
As someone who just picked up a reconditioned six-year-old, i7-equipped ThinkPad for a hell of a lot less money than what it would have cost brand-new, I have to laugh. It may not be the Latest and Greatest, but it's still speedy enough to handle anything I'm going to throw at it. The CPU performance curve has flattened out in the last few years, to the point that it's not worth spending lots of money for a modest performance gain.
Considering CPU speed has been "good enough" since around the Core 2 Duos (late 2006?), the only major improvements since then have been battery life and SSD cost.
Anyone with a computer from the last 10 years that can support an SSD and 8+ GB of RAM is probably fine, travel being the exception.
The only things not lasting more than 5 years in my 20+ years of owning computers are Macs; other devices and PCs are much, much more reliable. MacBook Pro 2008 (died in 3 years), iMac 27 2010 (died in 4.8 years), MacBook Air 2013 (dying) and Mac Mini 2012 (dying) are the reason why I always end up coming back to my 2010 gaming PC and 2010 HTPC, the ones to break the 5+ year mark without a major hardware failure.
I don't understand how the author went from the guy saying it's "really sad" to "Apple is insulting people". I mean, even his first argument for why people don't upgrade IS actually sad, since they can't afford to. I know, because I'm one of those people who can't afford to upgrade my machine.
An external USB3 enclosure for the Ultrabay DVDRW drive will run around $30 on Amazon.
To install the Intel 7260 card, you will need to install a modified "whitelist removed" BIOS, as Lenovo restricts what MiniPCIe cards can go in their systems by default. The system comes with an 802.11n card (mine was an Intel Centrino 6205).
My only complaint about ArrowDirect is that if they have a system model that could have different screen resolutions, you have no way of specifying / requesting certain specs; you get what they pull and send.
Every system I've ordered from them (close to ten now) has arrived in near-mint condition; this T420s would be amazingly perfect if not for a scuff of the rubberized paint/coating on the top lid (which doesn't matter to me one bit).
Echoing the comments from others, my "main" PC is about 7 years old, running the X58 platform (socket 1366); it easily outperforms my work-supplied development laptop.
For the uninitiated, the eBay workstations mentioned are typically these ancient X58s. Most support hex-core Xeons, 24 GB of RAM (or 48 GB unofficially; more on server boards and some workstations), and a pile of PCI Express lanes. As such, you can easily add PCI Express M.2 SSDs, USB 3/3.1, and GPUs to your heart's content. The takeaway is that old PC tech can be had at a fraction of the cost of new hardware with comparable performance.
I understand the marketing nonsense from Apple, the "PC does what" consortium, and hardware vendors on the whole, but there is nothing sad about owning an old PC. The reality is that the best performance for the price lies in "obsolete" platforms.
Is there anything sad about using a 5-year-old Mac? I'm hoping that eventually my MacBook Pro will last 5-10 years. With multi-core systems, SSDs, and 16 GB of RAM in MBPs, do we really need to be upgrading so much? Also, clock speed advances have stalled.
I think I have a rusty, almost 5-year-old PC lying around. It has an Intel i7-2600K overclocked to 4.6 GHz, 24 GB of RAM, a 240 GB SSD and a Radeon 6950 GPU that can run most games well at full HD.
Most of the machines Apple sells are actually slower than this PC.
That's a great exception to the rule. The overwhelming majority of computers Apple is talking about, are probably shitty $400 PCs that were underpowered from day one, which are just barely chugging along.
Thing is, OS bloat has finally ground to a halt. Win8.1 wasn't really any slower than 7, and Win10 is actually faster in my experience. So really, you don't need a stronger machine today than you did five years ago. Any perceived slowdown is just the result of malware, fragmentation, and DLL buildup - refresh the OS and maybe install an SSD and they're fine.
I'm using a laptop from 2010 with an i7-720QM and an SSD, and it's perfectly snappy. I frequently let it do light encoding tasks while I'm using my desktop.
Partly. Yes, the operating systems feel faster, but they are all relying more on GPUs for their rendering. And one way cheap Windows PCs keep their costs down is by using cheap integrated GPUs that are quite underpowered.
I use Windows 10 in VMware every day at 4K resolution on my retina MacBook Pro. It feels a bit sluggish, but I know I'm pushing the limits of VM graphics on a laptop at that resolution. But I can only imagine that a computer that's 6-7 years old would start to struggle graphically in Windows 10.
Apple didn't think it was worth making Mountain Lion run on it.
Meanwhile, I installed the newest 64 bit version of Windows 10 on it as part of the Insider program, and it generally runs faster than my Lion install.
I agree that this was tone-deaf of them, I just find it fascinating that we picked this one instance of tech industry rich guy tone-deafness. Seems to be getting a lot of play for some reason. But it's everywhere if you look.
Thinking of all of the computers I've had since 2007, all of them are usable and still in use at least once a week.
2006-era Mac Mini, Core Duo 1.66 GHz. Runs Windows 7; gave it to my mom. She still uses it.
2009-era Sony Vaio, Core Duo 1.66 GHz, 2 GB RAM. Windows 7. My son uses it for Office and Minecraft.
2009 Dell, Pentium Dual Core 2 GHz, 4 GB of RAM. It's still my only laptop. The display is 1600x900 and is still better than many cheap laptops. The battery is crap though.
2011 Core 2 Duo 2.66 GHz laptop. My wife's computer. It still feels fast.
My workhorse is a 3 GHz i3 with 6 GB of RAM. Bought in 2012.
I agree with this piece. I have long been upset at Apple's forced-obsolescence policy and its creating the notion that devices are disposable. I still have a MacBook from 2009, and another from 2012, that I am doing everything in my power to upgrade and keep from the forced slowdown. Likewise with my iPad and phone. I resist the persistent upgrade requests to the OS, not because I am blasé about security issues or don't want bugs to be fixed, but because those fixes and upgrades come with a cost: premature, forced obsolescence.
I am using a 2009 octo-core Mac Pro, which is now 7 years old, and since that date Apple has not released a single product compelling enough to upgrade that system.
I believe that an iPad (or comparable Android tablet) is better for most computer users than any low/mid-tier 5-year-old PC.
With PCs you can upgrade them and make tweaks to squeeze out every bit of performance, but by and large, for most people, and taking into account that mobile content consumption is on the rise, a tablet is a better upgrade than a new PC. Tasks like email, text, video, music, photography, and Facebook are pretty much done via mobile phone now. For these people, PCs are anachronistic.
I have an HP Pavilion Elite m9458fr bought in July 2009 for 416€ (on eBay, hp_marketplace_fr). I only need a silent, reliable computer with a reasonably fast CPU for web development on Linux. I have no need for a beefy GPU; I just need to connect 2 displays (23" and 19"). I would like to replace it in order to have USB 3 connectors and a SATA III HD, but I can't find anything on the market for a reasonable price. Should I buy an ultra-HD laptop with a magnifying lens?
It seems a somewhat strained interpretation to view this as Apple being anti-poor. But it does point out that Apple is losing touch with what people do with personal computing power.
Phones are much better than five years ago; computers, not so much. I'd much rather spend my dollars where the major improvement in computing technology is than spend to upgrade a desktop that would look almost the same as before.
The main problem I see is that Intel and Microsoft have given up on power users; it's all Apple envy and phone envy. No wonder people don't buy new PCs.
Back in the '00s I had a policy of never rehabilitating an old PC, because a new PC was better in every way.
The other day a friend brought me a MacBook from 2007 with a busted HDD. I put in an SSD I had lying around and we got Win10 running on it with no drama and no Apple malware (iTunes, Boot Camp, etc.). It feels faster than a Skylake machine with one of those useless hybrid hard drives, and after puffing some HCFC gas through the fan it is great.
Any pre-Core 2 machine would go to the trash; I would not even donate it to the poor. But frankly, Broadwell and Skylake are just an excuse to reduce the I/O slots and squeeze out the manufacturers of graphics cards.
They say customers get better battery life, but software screws that up if they really try it. The most you can do is spend a lot of money on a thin-and-light machine that the doorman can slide under the hotel door, or get a 2-in-1 just because you need a trackpad on a touchscreen machine, then fight with the stewardess over if and where to stow it, just to have another reason to get arrested at your destination.
I mean, even IBM sells 360-lineage mainframe chips that clock over 5 GHz and use water cooling. It is not that hard.
Most of the actual machine specs (i.e. processor and max RAM) have barely changed in 5 years. I put 16 GB in my laptop 5 years ago and it's still the most you can squeeze into a 13" MacBook Pro.
It's sad (and somewhat telling) that Apple has not packed more power into this form factor over the past 5 years.
I'm using a ~5-year-old PC I built myself and it still outperforms the hardware in any available Mac product (desktop, tablet or laptop) that isn't a Mac Pro (starting price $2,999).
I don't have a compelling reason to upgrade until the launch of VR headsets.
I actually have 2 USB 3.0 ports. USB 3.0 was first available on ASUS motherboards in late 2009 [1]. Thunderbolt wasn't available at that point but it was unveiled with the Thunderbolt Display, which was July 20th 2011, so coming up fast on 5 years old tech at this point. I personally have no use for 20Gbps transfers so TB2 has zero draw for me to upgrade.
Talking about hyperbole, you're comparing brand-new hardware against something that was released in 1999 (so 16 years of Moore's law), versus the point that the hardware Apple uses in its cutting-edge desktops/laptops is pretty comparable to the stuff commercially available 5 years ago, which is a valid point.
If Apple wants to shame the market for not upgrading in the last 5 years, maybe they should offer a valid reason to upgrade beyond a marketing campaign appealing to 'want' and additional software features, which a recent change actually made free anyway.
The idea that I could replace my PC with an iPad pro is just plain wrong.
My hyperbole was making fun of your silly statement that your 5-year-old computer outperforms all the hardware that Apple makes[1]
My phone is more powerful than any PC[2]
I am the richest man in Seattle[3]
> The idea that I could replace my PC with an iPad pro is just plain wrong.
I'll take your word on that. But that isn't true for all PC users. I have a pretty beefy machine, not the latest and greatest, but a hex-core, 12 GB, GTX 960 and dual 30" monitors, which I use wholly for gaming. YouTubing, video, etc. have all moved to my iPad + Apple TV, and with my PS4 I finally feel like I have a console that provides a real crisp gaming experience. So I'm ready to no longer maintain my $2k PC and switch to something like a Steam Machine.
[1] if you exclude the MacPro.
[2] made before the year 2000
[3] if you exclude the people that have more money than I do.
Silly statement that's factually correct and entirely relevant to a discussion about the lack of reason to upgrade hardware to something Apple currently makes?
When your comparison is to bring up a 16-year hardware difference, or to incorrectly comment on the lack of transfer standards available to me while suggesting that I even need said transfer speeds or peripherals, I don't understand what impact you're trying to have on the discussion.
I excluded the Mac Pro because it costs ~$1k more than I spent on my computer 5 years ago. I could spend the same ~$2k and build a similar computer, or ~$3k and build a far more powerful one, but I realize that isn't available to everyone, so I omitted it as an unfair comparison not relevant to the conversation about an Apple exec suggesting that people don't upgrade their computers frequently enough and should instead buy the latest iPad Pro every 2-3 years.
I really don't get what point you're trying to make, other than being overly pedantic and argumentative. There isn't much reason to upgrade hardware, especially not within the Apple ecosystem and these comments from an Apple exec are very far from the mark. If you don't agree with that statement then I'm happy to discuss, but your current line of reasoning is flawed and unproductive.
> I don't understand what impact you're trying to have on the discussion.
> I really don't get what point you're trying to make
The point I'm making with you is the strong confirmation bias (See https://en.wikipedia.org/wiki/Confirmation_bias) to your belief that there is no compelling reason to upgrade from an older PC to a tablet.
You are selectively fitting evidence (ignoring MacPros) to ensure your anecdotal argument is factually correct, then using that statement as the basis for invalidating the assertion that PC users with old hardware would benefit from "upgrading" to an iPad.
There is no reason for you who presumably has a very powerful, albeit "old", PC to upgrade to a tablet. That does not mean every other person with an old computer will also not benefit from replacing their machine with a tablet. The latter part being the basis for the statement from Apple.
My opinion: folks who don't require KVM could upgrade to a tablet. Anecdotally, my mom no longer uses her laptop and does her emailing, YouTubing, and photo stuff wholly on a 1st-gen iPad. However, her use case is clearly not yours (or mine).
I ignored mac pros? I specifically called them out as not a valid comparison given the price point.
First it's hyperbole, which you incorrectly applied; now it's confirmation bias because I 'ignored' something which was not pertinent but which I called out anyway. Make up your mind.
If you want to argue on the internet, there are plenty of other places to do that, I wish you luck in your sport. However, if you're just going to stick around and argue equivocation needlessly, that's not something I want to be a part of.
Reading related news is interesting. Imagine Apple applying its tactics to cars: you'd probably need a new Apple-designed plug rather than the universal one, special tools to change flat tires, and/or a slightly updated exterior brought to market, and therefore driving a 5-year-old car would be sad.
600 million PC users discovered that running a lightweight unix distribution/DE and not allowing javascript made the life expectancy of their computers triple.
I may not be a fan of Apple any more, but I don't understand how this is sad.
The innards have been upgraded regularly since 2012, and the CPUs and GPUs in the latest 15" models are pretty competitive to everything else on the market not counting machines for gamers.
Yes there are things that are sad about it. A 5-year-old PC is probably not running a recent version of Windows, and has a higher likelihood of being compromised. And if a person wants a new computer but can't afford it, that is sad too.
That said, it was an obviously stupid stat for Schiller to cite. But now we're going to be subjected to a long series of Apple-bashing articles that overreach in the opposite direction. By the end of today we'll see multiple "actually, I'm proud to be running 5-year-old PC" posts.
Why? Because when mining for pageviews, there are few veins as rich as bashing Apple.
Edit to add: If we want to talk about the tech industry and poor people, let's do so. How many new companies are variations of "let us bring things to your door for you for an extra fee" or "let's give you personalized service so you don't have to go shopping/ride a bus/interact with a human"?
A 5 year old PC is likely running Windows 7, which still receives upgrades, is supported, and is quite secure. Of course, all Windows 7 PCs received a free upgrade path to Windows 10. So there's no reason they couldn't be running the most recent Windows, without spending a dime.
Of course, they could also be running something like Ubuntu (the Linux Counter puts the number at 73 million; W3Counter at 2.58% of all devices, or just under half of OS X's share by that counting method), in which case they'd also likely be running a recent OS, one which is arguably more secure than either Windows or OS X.
I don't think what Schiller says is 'offensive'. But it does speak to Apple's mentality, where planned obsolescence and profit trump all, and the faithful buy into it...
> A 5-year-old PC is probably not running a recent version of Windows
Windows 7 was released in 2009, and all official owners of Windows 7 were offered the Windows 10 upgrade.
I'd be surprised if many PCs that are 5 years old aren't running one of those two OSes.
I'm proud to be running my almost-seven-year-old PC, though I think it is about time for an upgrade. I also have a 2013 Macbook Pro, which is more than adequate for work. I'm not exactly poor, and I abhor the way the media runs away with this kind of comment.
But I also get a little peeved at overzealous marketing that takes advantage of people's ignorance. Convincing people that their 5 year old PC no longer meets their needs (when it probably does) is pretty similar to convincing some poor sucker to switch to your super duper maximum strength 4G Long Term Evolution wireless network because you can have 20G of data per month, when you actually only use 0.5G of data per month.