altairprime's comments

You’re describing a presumably-desired outcome here but not why you think this particular limitation would be overturned as a free speech violation. On what basis would you expect that ruling to occur?

Young people are very susceptible to outsourcing decisions (and morality) to their leadership.

Competent people lose their innate tolerance for self-serving (or immoral) decision makers over time; once you have the breadth of knowledge and experience “to see through that nonsense”, your leadership is forced to either improve or get rid of its antagonists. Fortunately for leadership, age-related terminations are rarely litigated successfully in tech, due in large part to the absence of unions, and the penalties when a case does succeed are typically pocket change compared to the alternatives.


Curly quotes are plaintext, same as any other character.

> camera like this is necessarily, at least in part, a closed system that blocks you from controlling the software or hardware on the device you supposedly own

Attestation systems are not inherently in conflict with repurposability. If the vendor lets you install user firmware, that firmware simply won’t produce attestations linked to their signed builds, assuming you retain any attestation functionality at all. If you want attestations tied to their key instead of yours, you just reinstall their signed OS; at boot, the HSM attests to whichever OS signature it finds, using its unique hardware key, and everything works fine (even in a dual-boot scenario).
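A minimal sketch of that boot-time logic, with entirely hypothetical names and with a keyed hash standing in for the asymmetric signature a real HSM would produce:

```python
import hashlib
import hmac

# Hypothetical per-device secret; in a real system this never leaves the HSM.
DEVICE_HW_KEY = b"unique-per-device-secret"

# Hash of the vendor's signed OS build (illustrative value).
VENDOR_OS_HASH = hashlib.sha256(b"vendor-signed-os-build").hexdigest()

def boot_attest(os_image: bytes) -> dict:
    """Attest to whichever OS image actually booted, vendor or user."""
    os_hash = hashlib.sha256(os_image).hexdigest()
    # The device key signs the attestation regardless of whose OS it is...
    signature = hmac.new(DEVICE_HW_KEY, os_hash.encode(), hashlib.sha256).hexdigest()
    return {
        "os_hash": os_hash,
        "signature": signature,
        # ...but only the vendor's signed build links back to their key.
        "vendor_build": os_hash == VENDOR_OS_HASH,
    }

# User firmware still boots and still gets attestations; they just
# aren't linked to the vendor's build.
print(boot_attest(b"vendor-signed-os-build")["vendor_build"])  # True
print(boot_attest(b"user-modded-os-build")["vendor_build"])    # False
```

The point the sketch makes: attestation and modding coexist, because the attestation describes what booted rather than forbidding anything from booting.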

What this does do is prevent you from altering their integrity-attested operating system and then misrepresenting your photos as having been taken by their unmodified operating system. You can, technically, mod it all you want — you just won’t have their signature on the attestation, because you had to sign the build with some key to boot it, and that key certainly won’t be theirs.

They could even release their source code under BSD, GPL, or AGPL and it would make no difference to any of this; no open source license compels disclosing the private keys you signed your build with, and any license that tried to would be radioactive. Can you imagine explaining to your Legal team that compliance requires extracting a private key from an HSM designed specifically to prevent extraction? So it’s never going to happen: open source is about releasing code, not about letting you pass off your own work as someone else’s.

> must be based on reputation

But it already is. For example:

Is this vendor trusted in a court of law? Probably: I would imagine their attestations would stand up to the court’s inspection; given their motivations, they no doubt have an excellent paper trail.

Are your personal attestations, those generated by your modded camera, trusted by a court of law? Well, that’s an interesting question: Did you create a fully reproducible build pipeline so that the court can inspect your customizations and decide whether to trust them? Did you keep records of your changes and the signatures of your builds? Are you willing to provide your source code and build process to the court?
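The reproducible-build question above can be sketched concretely (hypothetical names throughout; `reproducible_build` stands in for a real deterministic build pipeline): a court, or any verifier, re-runs the build from the source you disclosed and checks that its hash matches the one your device attested to.

```python
import hashlib

def reproducible_build(source: bytes) -> bytes:
    """Stand-in for a deterministic build: same source in, same bytes out."""
    return b"BUILD:" + hashlib.sha256(source).digest()

def verify_attested_build(source: bytes, attested_hash: str) -> bool:
    """Rebuild from disclosed source and compare against the attested hash."""
    rebuilt = hashlib.sha256(reproducible_build(source)).hexdigest()
    return rebuilt == attested_hash

# The camera owner's records: disclosed source plus the build hash
# their device attested to at the time.
source = b"my modded camera OS, as disclosed to the court"
attested = hashlib.sha256(reproducible_build(source)).hexdigest()

print(verify_attested_build(source, attested))           # True: records check out
print(verify_attested_build(b"other source", attested))  # False: they don't
```

If the rebuild doesn’t reproduce the attested hash, the court has no reason to trust the modded device’s attestations; if it does, your reputation case looks much like the vendor’s.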

So, your desire for reputation is already satisfied, assuming that they allow OS modding. If they do not, that’s a voluntary business decision, not a mandatory technical one! Nothing about cryptography or reputation justifies any theoretical plan that locks users out of repurposing their device.


I know of no such cases, and would love to know if someone finds one.

I worked for a company that had this happen to an internal development domain, not exposed to the public internet. (We were doing security research on our own software, so we had a pentest payload hosted on that domain as part of a reproduction case for a vulnerability we were developing a fix for.)

Our lawyers spoke to Google's lawyers privately, and our domains got added to a whitelist at Google.


> LLMs seem like a nuke-it-from-orbit solution to the complexities of software. Rather than addressing the actual problems, we reached for something far more complex and nebulous to cure the symptoms.

The author overlooks a core motivation of AI here: to centralize the high-skill high-cost “creative” workers into just the companies that design AIs, so that every other business in the world can fire their creative workers and go back to having industrial cogs that do what they’re told instead of coming up with ‘improvements’ that impact profits. It’s not that the companies are reaching for something complex and nebulous. It’s that companies are being told “AI lets you eject your complex and nebulous creative workers”, which is a vast reduction in nearly everyone’s business complexity. Put in the terms of a classic story, “The Wizard of Oz”, no one bothers to look behind the curtain because everything is easier for them — and if there’s one constant across both people and corporations, it’s the willingness to disregard long-term concerns for short-term improvements so long as someone else has to pay the tradeoff.


Yes! This happened in so many industries. Banking is my go-to example: we used to have local bankers making decisions based on local knowledge of their community, but then decision making was centralized into remote HQs, the local bankers ended up living below the API, and the central HQ people made all the money. See also Seeing Like a State and the concept of legibility.

In the 90s, people were worrying about the efficiency of education workers and the profitability of the Internet backbone companies, which is when US Networks (iirc?) absorbed every local dialup ISP nationwide in order to monopolize.

What’s going to happen next will probably more closely resemble the economic crashes of the early 20th century: the population goes through a subprime-debt cutoff for the 25% of U.S. households who can’t afford 1-bedroom rent, starving a lot of corporations of the workers and buyers propping up “run it until the well is dry” businesses (after those workers’ cars are repossessed and there’s no unprofitable public transit to compensate).

Much of this era’s political exploitation flies under the banner of “privatization”, which is simply an attempt to open up more markets that can be harvested until the field is barren, since otherwise the harvesters would have to reinvest in the ones already destroyed. So look for government-regulated monopolies that cannot be run at a profit: postal mail, for example, has been a multi-decade quest for this outcome. Privatized sewers and roadway services (planning, paving, painting, signaling) come to mind; imagine how much profit a company could extract for its owners by forcing an entire city onto toll roads.


Heroku is pricing for “# of FTE headcount that can be terminated by switching to Heroku”; in that sense, this article’s $3,000/mo bill is well below 1.0 FTE per month at U.S. salaries, so it isn’t interesting for Heroku to address. I’m not defending this pricing lens, but it’s why their pricing is so high: if you aren’t switching to Heroku to lay off at least 1-2 FTEs of salary per billing period, or using Heroku to replace a competitor’s equivalent, the value Heroku assigns you as a customer is net negative and they’d rather you went elsewhere. They can’t slam the door shut on the small fry, or else the unicorns would start up elsewhere, but they can set pricing in FTE terms, and VCs will pay it for their moonshots without breaking a sweat.
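The back-of-envelope arithmetic behind that claim, with an assumed (not sourced) fully-loaded U.S. salary:

```python
# "Headcount replaced" pricing lens, as a rough sketch.
heroku_bill_per_month = 3_000        # the article's bill
assumed_fte_cost_per_year = 150_000  # hypothetical fully-loaded US FTE cost

# Annualized Heroku spend expressed as a fraction of one FTE.
fte_equivalent = (heroku_bill_per_month * 12) / assumed_fte_cost_per_year
print(f"{fte_equivalent:.2f} FTE/year")  # 0.24 -- well below the 1.0 threshold
```

Under this assumption the bill is about a quarter of one FTE, which is why, through the lens described above, Heroku has little incentive to optimize it.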

To clarify with the final paragraphs of context, “He said, Corp said, 3 of 3 coworkers asked corroborated what He said”.

If you email the mods (via the Contact link in the footer), they’ll merge the duplicate discussions.
