When it comes to tech topics, this is an insiders' discussion. When it comes to political topics, 99% of people in HN threads have close to zero insight and circle around publicly known information. Big difference.
It is very dangerous to expect deep insights on every aspect of human life from an HN thread, regardless of how well-educated and well-meaning average HN commenters are.
> That's a "having different rules for different ways of making money" thing.
That's the thing which is a consequence of the existing complexity, which in turn is a consequence of trying to do brackets by income.
A flat-rate tax is: you collect VAT on everything, no exceptions; you send everyone a check in a fixed amount as the credit to make it progressive, no exceptions; and you're done.
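A minimal sketch of that arithmetic, with a made-up 25% rate and a made-up $6,000/year credit purely for illustration: the effective rate climbs toward the flat rate as spending rises, and low spenders come out as net recipients.

    // Hypothetical numbers, not a policy proposal: a flat 25% VAT plus a
    // fixed $6,000/year credit paid to everyone.
    const VAT_RATE: f64 = 0.25;
    const CREDIT: f64 = 6_000.0;

    fn effective_rate(annual_spending: f64) -> f64 {
        // Net tax = VAT collected on everything, minus the universal credit.
        (annual_spending * VAT_RATE - CREDIT) / annual_spending
    }

    fn main() {
        for spending in [20_000.0, 60_000.0, 500_000.0] {
            println!(
                "spend ${:>6.0}/yr -> effective rate {:>5.1}%",
                spending,
                effective_rate(spending) * 100.0
            );
        }
        // spend $ 20000/yr -> effective rate  -5.0%  (net recipient)
        // spend $ 60000/yr -> effective rate  15.0%
        // spend $500000/yr -> effective rate  23.8%
    }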
Different marginal rates is where the trouble starts: if you use VAT then rich people have poor people go to the store for them, so you have to use income tax and track everybody's income. But some people get income from investments, and then it isn't realized until they cash out, which allows a bunch of fancy tax dodges; yet trying to tax unrealized gains has a bunch of other serious problems, like liquidity and valuation. Also, you didn't really mean to tax everyone's retirement savings, so now you need a bunch of stuff like the 401(k) to undo the thing you didn't really mean to do, and now you have some more complexity. And it continues like this until you turn around and doctors are paying higher taxes than billionaires, because billionaires have more resources to navigate all the complexity.
Wow, I thought this would take at least another decade, given how difficult driving in London is compared to American cities. I will still be really surprised if they can actually make this work.
When I visited San Francisco recently the Waymos were really awesome and worked well, but also there's barely any traffic compared to London. The streets are all really wide and you can pretty much just pull over anywhere. Some even just stopped in the middle of the road and I was amazed to see people waiting patiently behind them! London is entirely different.
Still, props for trying. Will be very interesting to watch what happens!
It’s pretty clear that the commenter you’re responding to is making the point that a treatment effective in mice still has a long way to go before it’s viable for use on humans, assuming it is ever viable to use on humans.
Mice and humans are quite different, and whilst it looks like this treatment actually reverses the effects of dementia in mice, it’s far from clear that it would have the same impact on humans. By the time people start exhibiting Alzheimer’s symptoms, the brain will already have sustained quite a lot of damage - by which I mean death of neurons - so it’s hard to see how this would actually reverse the disease, as opposed to simply slowing or halting its progression, without these neurons being replaced.
95% of HN comments nowadays amount to: "I have no idea how to wield AI, but I'm going to talk authoritatively on the subject and use fallacies to downplay the claims of others."
It's just anecdotal but when I asked why our Xerox workstations at JSC had dancing Snoopy line art, I was told Charles Schulz himself was a big fan of the space program and he'd drawn art for the program and extended its use to them in perpetuity.
I have been on a team that won a Silver Snoopy, but I was a subcontractor and didn't get one myself; only the Boeing employees I worked with did. Every once in a while I Google them on the off chance I could get one as a piece of memorabilia, but they are thousands of dollars.
Eh, that’s companies rather than individuals, and while it’s still objectionable it’s not quite in the same league.
If you’re running a company you probably already have an accountant, and they’re probably already using one of those pieces of software. Or you’re using something like Xero, which is already on the list.
Eh, I have an M1 Pro and it's definitely showing its age.
My colleagues on M3/M4 machines see a night-and-day difference in programming performance.
CPU, memory bandwidth, latencies, working on JavaScript projects that involve countless IOs on small files... it really shows. I can't wait for the upgrade.
> there’s no sense in trying to solve for some abstract future set of problems that the vast majority of people are never going to have
> That too requires a substantial investment of time and resources.
The discussion has gotten to be pretty abstract at this point. To get back to concrete examples, the egui RAD builder I've been hacking on worked on day 1, first commit. It's been a joy to put together, and no more difficult than building GUI apps with any other toolkit I've worked with. Which causes me to question your statements about additional complexity. You can dig deep and do dark magic with Rust if you want, but you can also treat it like any high level language and get things done quickly. That's part of what makes it a rare gem to me.
Some folks don't like dealing with strict types, or with the borrow checker, but I find that the errors they illuminate for me would have been similarly serious in other languages which lacked the tooling to highlight them. Which adds to my appreciation of Rust.
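For a concrete (hypothetical) example of the kind of error I mean: mutating a collection while iterating over it is a compile-time error in Rust, while in plenty of other languages the same code compiles fine and only misbehaves at runtime.

    fn main() {
        let mut scores = vec![3, 7, 10];

        // This version does not compile: the push needs a mutable borrow of
        // `scores` while the loop still holds an immutable one.
        //
        // for s in &scores {
        //     if *s >= 10 {
        //         scores.push(*s); // error[E0502]: cannot borrow `scores` as mutable
        //     }
        // }

        // What the compiler nudges you toward: collect first, mutate after.
        let bonuses: Vec<i32> = scores.iter().copied().filter(|&s| s >= 10).collect();
        scores.extend(bonuses);

        println!("{:?}", scores); // [3, 7, 10, 10]
    }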
Your audience chooses you over the press releases because you sound like a human, trim out the boring items and the more obvious propaganda, place things in context, reduce jargon/simplify things, also report on other things the Pentagon doesn't have press releases about, and throw in some jokes.
You choose to keep at it because you think military stuff is pretty neat; you get paid by the view; getting briefings from the Pentagon makes you seem important to yourself and others; and you like being a celebrity (albeit a very minor one).
As someone with years of experience with serverless stuff on AWS I might be a bit biased, BUT I'd argue serverless is the sweet spot for most applications. You need to remember, however, that most applications aren't your typical startups or other software products but simply some rather boring line-of-business software nobody outside the company owning it knows of.
Considering how IT departments in most non-software companies are, the minimal operational burden is a massive advantage, and the productivity is great once you have a team with enough cloud expertise. Think bespoke e-commerce backends, product information management systems or data platforms with teams of a handful of developers taking responsibility for the whole application lifecycle.
The cloud-expertise part is a hard requirement, though; luckily, on AWS the curriculum is somewhat standardized through the developer and solutions architect certifications. That helps if you need to do handovers to maintenance or similar.
That said, even as a serverless fan, I immediately thought of containers when the performance requirements came up in the article. Same with the earlier trending "serverless sucks" post about video processing on AWS. Most of the time serverless is great, but it's definitely not a silver bullet.
I’d say these celebrate entrepreneurship more than innovation. Nothing wrong with that, but it does bother me that the true innovators often don’t get credit outside academia and enthusiasts well versed in the history.
The Apple II was not the first PC usable by mere mortals. There are a lot of contenders, but one of the earliest came from Georgia:
Cray was not the first multiprocessor wide vector supercomputer, but it did innovate on it. I’d say Cray broke more fundamental innovation ground than Apple.
HR 1923 ("Circulating Collectible Coin Redesign Act of 2020") [0]: "No head and shoulders portrait or bust of any person, living or dead, and no portrait of a living person may be included in the design on the reverse of specified coins"
Well... the reverse certainly isn't a head-and-shoulders portrait or bust...
All these features sound really awesome and would also benefit many non-kernel cases (especially generalized projections). Very happy to see Linux driving the language forward.
It's less about having an effect and more about moral integrity. They want to signal that they still abide by their professional standards in order to keep their reputation among their peers and the public, those who aren't gleichgeschaltet (brought into line) yet.
Annoyingly, the Ableton Push 3 Standalone runs on Linux. This means that Ableton have a working Linux version of at least the core of Ableton Live. I sincerely hope they release a true Linux version soon. It's the last thing tying me to Windows.
I write macOS software (among other things). I always run the earlier betas on another machine for testing. The primary dev box gets the beta a few weeks before release. It’s never been a problem.
This is 100% on Electron; they didn't do the due diligence that every Mac & iOS dev goes through every summer before the next release. It's been two decades of the same song and dance every year. There's just no excuse.
Most machines (except those that literally only take quarters) take dollar coins, as these are designed to be the same as Susan B. Anthony dollars, which have been around since 1979.
The real key is that they won't stop making the dollar bill and force the issue.