
> What, uh, do they think they're going to do?

They published a charter. They're going to establish guidelines for sustainable web development and tools for measuring your impact. Yes, static architectures will probably be one path for improvement.

> There will always be competing interests, so a body that only exists to checks notes talk about "sustainability" on the web feels moot.

I'm not following this point. The existence of entrenched interests means that no opposing interests should be researched? Why is "sustainability" in quotes? Is it not a legitimate pursuit, or are you implying that they have ulterior motives?

> They explicitly say hardware is out of scope. Cool.

Hardware is out of scope "unless related to hosting & infrastructure," AKA the cloud. That is an absolutely massive scope within the hardware realm.

> Honestly, reading the manifesto [2] just makes me more angry. It doesn't say anything.

It sounds like you're looking for the guidelines that this group aims to publish. A manifesto in this context is not intended to be a solution or a prescription; it's a framework for alignment towards a goal. The concrete solutions are the goal of the group.


I'm a waffler on this as well, increasingly leaning away from containers lately. I can recall one time in my pre-Docker career when I was affected by a bug related to the fact that software developed on Mac OS ran differently than software running on CentOS in production. But I have spent countless hours trying to figure out various Docker-related quirks.

If you legitimately need to run your software on multiple OSes in production, by all means, containerize it. But in 15 years I have never had a need to do that. I have a rock solid bash script that deploys and daemonizes an executable on a linux box, takes like 2 seconds to run, and saves me hours and hours of Dockery.
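
For illustration, here's a minimal sketch of the kind of script I mean (the host, paths, and service name are placeholders, and it assumes a systemd unit already exists to daemonize the binary):

  #!/usr/bin/env bash
  # Sketch: copy a freshly built binary to a Linux box and restart its service.
  set -euo pipefail

  HOST="deploy@example-host"   # placeholder SSH target
  APP="myapp"                  # placeholder binary / systemd service name

  scp "./build/$APP" "$HOST:/opt/$APP/$APP.new"
  ssh "$HOST" "sudo mv /opt/$APP/$APP.new /opt/$APP/$APP && sudo systemctl restart $APP"

A real script would add error handling and a health check, but that's the essence of it.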


I don't understand how running a single command to start either a single container or a stack of them with compose, which then pulls all its requirements in something like a tarball and just runs, is seen as more complicated than running random binaries, setting values in php.ini, setting up mysql or postgres, daemonizing said binaries, and making sure libraries and the like are in order.


You're going to be setting all that stuff up either way, though. It'll either be in a Dockerfile, or in a Vagrantfile (or an Ansible playbook, or a shell script, ...). But past a certain point you can't really get away from all that.

So I think it comes down to personal preference. This is going to sound a bit silly, but to me, running things in VMs feels like living in an apartment. Containers feel more like living out of a hotel room.

I know how to maintain an apartment, more or less. I've been living in them my whole life. I know what kinds of things I generally should and should not mess with. I'm not averse to hotels by any means, but if I'm going to spend a lot of time in a place, I will pick the apartment, where I can put all of my cumulative apartment-dwelling hours to good use.


Yes, thank you for answering on my behalf. To underscore this, the decision is whether to set up all of your dependencies and configurations with a tool like bash, or to set it all up within Docker, which involves setting up Docker itself, which sometimes involves setting up (and paying for) things like registries and orchestration tools.

I might tweak the apartment metaphor because I think it's generous to imply that, like a hotel, Docker does everything for you. Maybe Dockerless development is like living in an apartment and working on a boat, while using Docker is like living and working on a houseboat.

There is one thing I definitely prefer Docker for, and that's running images that were created by someone else, when little to no configuration is required. For example, running Postgres locally can be nicer with Docker than without, especially if you need multiple Postgres versions. I use this workflow for proofs of concept, trials, and the like.
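
Concretely, something along these lines is enough to have two Postgres versions side by side (the container names, host ports, and password are just examples):

  # Two Postgres versions on different host ports (example values)
  docker run -d --name pg15 -e POSTGRES_PASSWORD=dev -p 5415:5432 postgres:15
  docker run -d --name pg16 -e POSTGRES_PASSWORD=dev -p 5416:5432 postgres:16

  # Connect to whichever one you need
  psql "postgresql://postgres:dev@localhost:5415/postgres"

No local install, no version conflicts, and docker rm -f pg15 pg16 cleans it all up.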


I suppose like anything, it's a preference based on where the majority of your experience is, and what you're using it for. If you're running things you've written and it's all done the same way, docker probably is just an extra step.

I personally run a bunch of software I've written, as well as open source things. So for me docker makes everything significantly easier, and saves me installing a lot of rubbish I don't understand well.


After 20 years of various things breaking on my (admittedly franken) debian installs after each dist-upgrade, and spending days troubleshooting each time, I recently took the plunge and switched all services to docker-compose.

I then booted into a new fresh clean debian environment, mounted my disks, and:

  cd /opt/docker/configs; for i in */; do (cd "$i" && docker-compose up -d); done
voila, everything was up and working, and no longer tied to my underlying OS. Now at least I can keep my distro and kernel etc all up to date without worrying about anything else breaking.

Sure, I have a new set of problems, but they feel smaller.


Thou hast discovered docker's truest use case.

Like, legit, this is the whole point of docker. Application/service dependencies are no longer tied to the server it is running on, mitigating the worst parts of dependency hell.

Although, in your case, I suppose your tolerance for dependency hell has been quite high ;)


> Application/service dependencies are no longer tied to the server it is running on, mitigating the worst parts of dependency hell.

Until you decide to optimise for resources and do crazy things like “one postgres instance, one influxdb instance” instead of “one instance per microservice”, at which point you get back into hell pretty quick.

Winds me up how massive tiny applications become, and how my choices are to throw money (RAM) at the problem, or money (time) at the problem. I wonder when someone will do the math and prove that developer laziness is a substantial drag on global efficiency. The aggregate cost borne by users has to be orders of magnitude larger than the cost savings made by developers at this point.


> Now at least I can keep my distro and kernel etc all up to date without worrying about anything else breaking.

I get what you are saying, but a word of caution: kernel upgrades can break container runtimes. See https://github.com/containers/podman/issues/10623.


I'm doing exactly the same thing. I started doing everything on Synology with Docker Compose and got rid of most Synology apps in favor of open source applications.

At some point I moved individual containers to other machines and they work perfectly. VPS, NUC, it doesn't matter.


Yea, I'm in the same boat, and I'm wondering if there is a big contingent of devs out there that bristle at Docker. The biggest issue I run into writing my lab software is finding a decent enough container registry, but now I just endorse the free tier of Vultr CR.


I just use the github registry, but I've been paying for their personal pro subscription for years now so it wasn't really an "additional" cost for me.
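
For anyone curious, pushing there is the standard tag-and-push flow against ghcr.io (the username, image name, and tag below are placeholders; the token needs the write:packages scope):

  echo "$GITHUB_TOKEN" | docker login ghcr.io -u USERNAME --password-stdin
  docker tag myimage:latest ghcr.io/USERNAME/myimage:latest
  docker push ghcr.io/USERNAME/myimage:latest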


I did the same: I created a thing that's meant to be a collection of scripts for setting up a VPS without any containers. You can check it out at https://github.com/diogocasado/coderaft


Sure, this is a correct mindset, and an obvious one for people who build and maintain information repositories professionally, but at an average company with hundreds or thousands of people, it's a losing battle to enforce. Paying for Slack and having the ability to find information in conversations that should have been documented but were not is valuable in practice.


There are two interesting cases of blue fish in my home state.

1. The Beardslee Trout, which exists only in a single lake in the world: https://en.wikipedia.org/wiki/Beardslee_trout. Most photos of these fish online, including the Wikipedia thumbnail, are not representative of their stunning color in person, and are most likely not Beardslees at all. The lake itself looks like blue Gatorade, and the fish have obviously adapted accordingly.

2. Lingcod, a fraction of which are a crazy alien blue color not only outside but throughout the flesh. https://oregonmarinereserves.com/2021/08/31/lingcod/. This phenomenon applies to a handful of species, but lingcod is the most well-known.


It's true... the quality of content on the Internet has a bunch of problems, and AI is just one of them. The economic incentives to trick people into staying on your page and scrolling as much as possible are a fundamental part of the problem. Politically-motivated ragebait and lies are a separate huge problem. AI-generated slop is also a problem for content quality and UX, but I'm far more concerned about the impact of AI on the value of human labor and intellectual property than I am about the UX of my search result pages.


The idea that they would give ChatGPT away to consumers for free without mining the data in some form or another is naive.


The partnership is structured so that Apple can legally defend including language in their marketing that says things like "users’ IP addresses are obscured." These corporations have proven time and time again that we need to read these statements with the worst possible interpretation.

For example, when they say "requests are not stored by OpenAI," I have to wonder how they define "requests," and whether a request not having been stored by OpenAI means that the request data is not accessible or even outright owned by OpenAI. If Apple writes request data to an S3 bucket owned by OpenAI, it's still defensible to say that OpenAI didn't store the request. I'm not saying that's the case; my point is that I don't trust these parties and I don't see a reason to give them the benefit of the doubt.

The freakiest thing about it is that I probably have no way to prevent this AI integration from being installed on my devices. How could that be the case if there was no profit being extracted from my data? Why would they spend untold amounts on this deal and forcibly install expensive software on my personal devices at no cost to me? The obvious answer is that there is a cost to me, it's just not an immediate debit from my bank account.


> The partnership is structured so that Apple can legally defend including language in their marketing that says things like "users’ IP addresses are obscured." These corporations have proven time and time again that we need to read these statements with the worst possible interpretation.

What's the worst possible interpretation of Apple and CloudFlare's iCloud Private Relay?


Requests are not stored by OpenAI, but they are stored by Apple and available upon request.

That is how I interpret it. It's similar to that OneDrive language, which basically allowed user-directed privacy invasion.

Inevitably, OpenAI will consume and regurgitate all data it touches.

It is not clean, and anyone thinking OpenAI won't brutalize your data for its race to general AI is delusional in one of several ways.


I’m not sure I understand the paranoia that Apple is secretly storing your data. Sure they could secretly do so but it doesn’t make any sense. Their whole schtick is privacy. What would Apple benefit from violating what is essentially their core value prop? They’d be one whistleblower away from permanent and irreparable loss of image.


They're not doing it secretly. They are, and they admit it.

The question is: is it encrypted E2E everywhere, how controlled is it on device, and how often is it purged?

The ubiquity of cloud means there's a huge privacy attack surface, and it's unclear how much of that is auditable.

Lastly, there's no reason to think Apple will avoid enshittification as the value of their ecosystem and users grows.

Just takes one bad quarter and a greedy MBA to tear down the walls.

Past privacy protection is no guarantee of future protection.


Interesting stuff, particularly the estimation that 140 TW of solar radiation are captured by photosynthesis. In arguments about the efficacy of tree-planting as a climate control effort, I've heard about the effect on albedo and of course on the sinking of CO2, but I've never heard it pointed out that photosynthesis itself captures solar radiation that would otherwise be reflected as heat.


The refusal by these companies to accept the realities of the modern printer market reminds me of the tagline that Michael Scott wrote for Dunder Mifflin:

Limitless paper in a paperless world


Wouldn't work for me without the ability to add other exercises. I'm also not familiar with a lot of these exercises, so descriptions and images would be very helpful.


Thank you for your feedback. This is a hobby project for now, so images are a bit hard to do on budget. But I will definitely add descriptions soon.

