Because engineering is about trade-offs at a business level, not just technical ones. Support costs for older systems go up as the number of users decreases, because you still have to keep spare parts around for those old machines. That’s what Apple support is: if it’s not a supported model, they won’t do any hardware repairs for you, period. Tying software and hardware together that way isn’t a terrible decision from Apple’s perspective. Customers also have to pick: do I continue in the ecosystem, or do I get out? Apple doesn’t want users getting disgruntled because QA is no longer testing software on older hardware.
Also, Apple doesn’t deprecate things by dropping support suddenly; that wouldn’t be great UX. They wait for usage to decline naturally over time and, once the number of users is sufficiently small AND they want to make a technical break, they stop supporting a model.
I think it’s valuable to compare alternatives. Apple supports their software and hardware much better than any competitor, AFAIK. For example, iPhones have a guaranteed support window of five years IIRC, and it often stretches longer despite engineers on the ground wanting to deprecate things faster because of the maintenance costs (e.g., 32-bit macOS support lasted longer than line engineers were happy with). Competitors? Google does three years of upgrades on their flagship phones and laptops, and five years of security updates, if I recall correctly.
That’s why I’d say the framing is wrong. If anything, Apple is a leader in figuring out how to provide the best support in the industry by vertically integrating everything (other SW vendors often have one-off forks for every product, which makes it quicker to launch something but economically limits how long those support operations can last). Google tried several architectural efforts to stabilize this situation for Android, but as I said, they still haven’t managed it even for their own hardware. They may have solved it at the OS level, but firmware still has security flaws, so if a vendor (often the chip vendor, not the OEM) stops making updates — and they do frequently, and definitely don’t go out of their way to unify their software stacks; they want planned obsolescence more than OEMs do — you’re SOL. Apple is a bit better insulated from this since they have a stronger negotiating hand with partners and have vertically integrated a lot of components in-house, which lowers support costs and complexity. Still, they’re not fully immune, because those pressures are multifaceted and complex.
So to answer your question: sure, it’s not strictly technical merit right now, but that’s too myopic. It’s a signaling mechanism to customers that “we are not doing any work to validate that the software we write works on old machines.” There are lots of things to criticize Apple about, but best-in-class support in the commercial space doesn’t feel like the best place to start. I could be wrong, though.