To be fair, many (all?) of these ships are burning https://en.wikipedia.org/wiki/Heavy_fuel_oil, which is not at all like petrol in that it requires preheating and atomisation to burn.
It still doesn't burn well unless it's in a fine mist. Even a puddle of diesel is actually quite difficult to get burning on its own without some other material acting as a wick of sorts. https://youtu.be/7nL10C7FSbE is a great practical demonstration of the different flammability characteristics of some common fuels.
As for the tar that a container ship burns, it's not going to be very volatile. Coal also burns quite well, but unless it's powdered, good luck trying to ignite some with just a match.
I said pre-heated. No fuel burns in its liquid form; it needs to be turned into a gas first. If you heat up diesel it'll light extremely easily, as the vapor concentration is high.
So you learn, no? I'm sure the way fuel was carried during World War 1 is not the way the US military carries it now.
Special ships? That probably depends on the economics. If the electric car industry makes it worth a shipper's while, someone will invest in a special-purpose vehicle.
So now we are at the first stages of this; in time, if the industry succeeds, the money will come.
I think, like most things, there is a balance. The sweet spot may be that I can go hard on the project, but if I feel like I need some time off, there is the ability to step back and have the work continue.
I'm not an expert here, but it may be time for white-box, no-name mobile phones that come with nothing and that we have to set up ourselves. My second computer was the first time I downloaded from both kernel and kerneli and rolled my first Linux-kernel-based GNU system....
If that's the case, my suspicion is this may continue happening until a 'Netflix of scientific papers' appears. Even with Sci-Hub down it's likely someone else will replicate it.
Every piece of value-generating work in the chain of paper publication is done by people paid by universities, and in turn by taxpayer money.
The publishing companies pay nothing to the volunteers who write and peer-review the papers. At most (and often not even that), they pay the person who does the final formatting (which is often already done by the author). So it often boils down to having a program add the publisher's copyright notice.
So you pay $35 per download of a 2 MB file, where all the publisher did is host said 2 MB file.
Does that seem like a fair price?
So universities pay twice.
The university library of my alma mater used to pay 15 million euros a year for online licenses.
That is three large multi-institutional EU projects' worth of money, equivalent to 200 PhD student positions.
1. Academics write and submit papers -> university/tax money pays
2. Academics review papers -> university/tax money pays
3. Academics organize and attend conferences -> university/tax money pays
4. University buys published paper from publishers -> university/tax money pays
So university/tax money pays for writing the paper, quality assurance via reviews, conferences, just to finally buy the paper via some insanely expensive subscription.
For a rational environment like science this model is simply insane.
Somewhere between step 3 and 4, I assume the publisher gets hold of the paper and acquires a license to resell it? How does that part work and why do universities/academics support it if they could just distribute it for free or via an open journal? Is it solely to get the "kudos" of being in particular publications?
It's 100% due to the traditional model of measuring academic performance through the number of published papers weighted (very, very strongly) by the "reputation" of the journal they're published in, justified by the importance of curation and peer review to provide quality control.
Historically, this model grew because the journal publishers provided the necessary infrastructure to print and distribute copies of the articles.
My impression is that it is an offshoot of the metrics mania of the 1990s.
I am one person watching the world go by, but I think that there was, and is, far too much emphasis on things that can be counted.
People became afraid to make subjective judgements of quality. They demanded data. So whatever the thing was that they counted, it got maximised. Quality is a slippery concept, and judging it is harder than counting.
Judge scientists by the number of papers they publish, and they will publish a lot. A manifestation of the quantifying fetish.
The authors give the rights over to the journal when they submit the paper. The authors depend on publishing in high-impact-factor journals for tenure and continued funding.
Not all academics support it and several groups have opted out, publishing in open access only or organizing their own journals and conferences.
I can see a possible future where only a few holdouts rely on 'prestige' as a decision criterion, while the majority of academic fields/groups have switched to open access... and the former only get more and more vocal about the decline of science, up to some screeching point.
Not in academia, but have watched the debates over the past decade or so...
It really seems like it's the inertia of existing administrators who haven't shifted away from judging papers by the prestige of the journals they get published in.
Once the prestige factor goes away, and authors are judged primarily on the quality of their work, the publishers will lose their stranglehold.
Of course, that means a lot of entrenched interests losing revenue streams, so it's going to be a long struggle of grassroots change vs regulatory capture combined with reactionary pushback.
You are not wrong, and I agree, but eventually I suspect there will be some project management (following up with peer reviewers, winnowing out the low-quality papers, etc.) that will need to be paid for on top of the server and bandwidth costs. Whatever service comes about, it will need to collect some money. My view is that it should be small, in the single-digit-dollar range for unlimited monthly access, covering every paper that was funded by a taxpayer anywhere in the world.
This sounds like there is currently a mechanism for winnowing out the crap, but we don't really have that either.
On the contrary, because researchers are driven to publish publish publish, they often reheat the same paper over and over again with minor modifications, or just go conference shopping until they get an acceptance.
With less publishing pressure, quality would go up automatically.
Watson and Crick published papers only every couple of years. That wouldn't work today at all.
Science needs to go back to publishing when you have something to say, not just to fulfill your quota.
I think that what some outside of academia might miss is that publishing in a journal is not about distribution and access anymore. The renown of the journal where you publish is often used as a proxy for the quality of your scientific output by the entities that grant you funding and career prospects. Hopefully it will improve soon, but it is this prestige and evaluation problem that has to be tackled, not the distribution problem.
Even if there was a "Netflix" of scientific papers, a few years after its prime all of the publishers would notice that it's far more lucrative to make their own Netflix and take all of their content for themselves again, forcing people to go back to Torre- I mean, Sci-Hub.
It doesn't excuse the fact that it took an HN front page to get a reaction to a massive issue, namely a client blocked due to either a badly configured sanctions system or a badly defined false-positive determination workflow, with no other way for the client to expedite it, but... it's something, I guess.
Good luck getting a 7-day response from your bank, which has a legal obligation not to tell you why it blocked you, or getting Google's CEO to look into an issue you aired on Twitter.
Two things to consider: That guy is the corporate vice president for developer services, so he probably had to run that response by Legal before committing like that. Also unless this is a really exceptional year, there probably wasn't anyone "at work" at Microsoft last week except on-call rotations.
I like your statement 'the mini-mill has already been made'. The thing is, though, not everyone needs an employee with a Harvard degree. Small companies may be happy with, for example, a finance person with a 4-year degree from a smaller university.
I sure do. As a small web hosting provider (AtlantaWebHost.com), it was one of our proudest moments when our FreeBSD servers handled the Slashdot effect without complaint. This was the Pentium 3 Dell PowerEdge server era.
I am no longer affiliated with AtlantaWebHost.com. The company was bought in 2005 and is still running as a boutique provider.
To this day, I'm still pretty handy with crimping ethernet cables :-) I mostly just do it in my own home and homes of friends if asked for help.
Yes, but at that time serving a Slashdot amount of traffic was actually a challenge with a standard setup. (Not impossible; you just needed to put in extra effort.) These days, unless you're doing something silly, you should be able to handle the HN front page from a Raspberry Pi at home.
No, connections better than ADSL can handle the traffic on the network side, as long as you're not serving huge content. And as long as you're not doing per-user content generation, your throughput is a question of "how fast can nginx serve its memory cache over the network". The original RPi B can do 40 rps serving a 180 kB file; the RPi 4 can do 4k rps (https://ibug.io/blog/2019/09/raspberry-pi-4-review-benchmark...).
So surviving the HN front page with a blog post served from a home connection should be trivial, unless you've disabled caching somewhere.
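To put rough numbers on it, here's a back-of-envelope sketch. The 180 kB page size is borrowed from the benchmark above; the peak rate of 10 requests per second is my assumption for an HN front-page spike, not a measurement.

    # Back-of-envelope: can a home upload link survive an HN front-page spike?
    # All inputs are assumptions for illustration, not measurements.
    page_size_kb = 180   # fully cached page, per the RPi benchmark above
    peak_rps = 10        # assumed peak request rate during the spike

    peak_mbit_s = peak_rps * page_size_kb * 8 / 1000
    print(f"Peak upstream needed: ~{peak_mbit_s:.0f} Mbit/s")
    # ~14 Mbit/s under these assumptions: more than ADSL upload offers,
    # but comfortably within a typical cable/fibre upload, and far below
    # what nginx serving from memory on a Pi can push.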
For Daring Fireball, it's called being Fireballed. John Gruber keeps (or used to keep?) a cached version of linked pages to put up in case linking to them in a new post causes them to go down.
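I don't know how Gruber's setup actually works, but the idea is simple enough to sketch. A minimal, hypothetical version: snapshot the raw HTML of a page you're about to link, so you have something static to serve if the origin falls over (a real setup would also grab assets and rewrite links).

    import pathlib
    import urllib.request

    def snapshot(url: str, out_dir: str = "linked-cache") -> pathlib.Path:
        """Fetch a URL and store the raw HTML so a cached copy can be served later."""
        pathlib.Path(out_dir).mkdir(exist_ok=True)
        # Crude filename derivation; good enough for a sketch.
        name = url.rstrip("/").split("/")[-1] or "index"
        dest = pathlib.Path(out_dir) / (name + ".html")
        with urllib.request.urlopen(url) as resp:
            dest.write_bytes(resp.read())
        return dest

    # snapshot("https://example.com/some-linked-post")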