Hacker News | BigJono's comments

Yeah, there was an entire season about ending the war on drugs and how it was the only thing that actually worked lol.

Also, they caught the drug kingpin at the end of the show by physically following his lieutenants to a warehouse full of drugs and arresting them all on the way out. The only thing the wiretaps were used for was to build a conspiracy charge against the leader, who had been standing outside for months/years doing face to face meetings with everyone that was arrested, clearly being the one in control of every conversation. If somehow that's not enough to charge someone with conspiracy then it seems removing a small amount of freedom to change that would be far preferable to reading everyone's messages and banning encryption.

"The Wire proves the need for mass surveillance" is the dumbest take I've ever heard. It literally shows the complete opposite.


lol, well thanks for the spoilers. /s

I might be reading parts of it wrong, but I think that's a different sort of thing to the research in the article.

Sugar is a very indirect cause of heart attacks; everyone knows that most heart attacks are the culmination of decades of diet and exercise habits. It's still worth researching everything to do with that, but it's pretty low-value research because it's hard to draw any actionable conclusions from it other than "eat healthier and exercise", which is already well known.

The research in the article is talking about a direct cause. Bacteria exist on arterial plaque, viral infection triggers the bacteria to multiply, and something about that process causes the plaque to detach and cause a heart attack. If that ends up being a rock-solid cause and effect, even for a subset of heart attacks, it could lead to things like direct prevention (antivirals before the heart attack happens) or changes in patient management (everyone with artery disease gets kept far away from sick patients) that could directly and immediately save a lot of lives.

The post you replied to was saying that the data from the study isn't as strong as the article and headline make it out to be, which is usually the case. For this one though I'm reading that less as "it's a nothingburger" and more as "it's a small interesting result that needs a lot of follow up".


While you're not technically wrong, I find this whole approach to be misguided.

And actually, if, as a lot of science is now suggesting, inflammation and damage from eating oxidization-prone lipids (aka refined oils) in combination with refined sugar is a big part of the cause of arterial damage and heart disease, that could easily be the biggest root cause in most of these cases. The bacteria, if they even play a causal role at any point, could be a result of previous damage due to diet (and lack of exercise).

The paper's idea of treating heart disease by giving patients antibiotics seems really problematic to me. Destroy your health with poor diet and lack of exercise, and then once you start to feel the effect of this, take antibiotics and destroy your gut health too.


While I do agree with the general premise of your comment, that is, correct the root cause, for some people "eat healthy and exercise" may not be an option, because they are already addicted and overweight. At least taking antibiotics could be the very first line of actionable treatment to prevent the bacterial buildup and save their life immediately.

I very strongly disagree. Antibiotics are very dangerous at the individual level in how they mess up the individual's gut bacteria which are crucial for health.

Furthermore, giving everyone antibiotics as a preventative measure for heart disease complications, given that most Americans are on the spectrum of heart disease (i.e. have hypertension), is a recipe for bacterial resistance and other population-level problems.


If you attempt that plan at scale I would expect antibiotic-resistant bacteria to develop fast, and people would soon start dying younger of what we now think of as minor infections.

Are there any human randomised controlled trials that show stuff like vegetable oil causing inflammation? At least the ones I've seen show the opposite.

https://youtu.be/-xTaAHSFHUU


No, such human RCTs haven't been done.

The mechanism by which refined linoleic acid, when heated, creates higher amounts of free radicals that are known to cause oxidative stress / inflammation is well understood.

I agree a large-scale RCT for this would be great, but I doubt anyone would fund it, and if it does get done I'd be surprised if it wasn't designed to meet the biases of the side that funds it.


> In this process, deletion rather than expansion of the wording of the message is preferable, because if an ordinary message is paraphrased simply by expanding it along its original lines, an expert can easily reduce the paraphrased message to its lowest terms, and the resultant wording will be practically the original message.

This bit has me perplexed. If you had a single message that you wanted to send multiple times in different forms, wouldn't compressing the message exponentially limit possible variation whereas expanding it would exponentially increase it? If you had to send the same message more than a couple of times I'd expect to see accidental duplicates pretty quickly if everyone had been instructed to reduce the message size.

I guess the idea is that if the message has been reduced in two different ways then you have to have removed some information about the original, whereas that's not a guarantee with two different expansions. But what I don't understand is that even if you have a pair of messages, decrypt one, and manage to reconstruct the original message, isn't the other encrypted expansion still different from the original message? How does that help you decrypt the second one if you don't know which parts of the encrypted message represent the differences?


It's mostly talking about the case where someone receives an encrypted message which is intended to later be published openly. If it was padded by adding stuff, an attacker can try to reconstruct the original plaintext by removing the flowery adjectives, whereas if things were deleted the attacker doesn't know what to add.


In particular, the length of a message is not encrypted when encrypting the text. So if the encrypted message is shorter, you know exactly how much to remove to get back the original, and then just need to guess what to delete. If the message is longer, it is much harder to guess whether to add flowery adjectives, a new sentence, change a pronoun for a name, or some other change.
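The length leak can be made concrete with a toy sketch. Purely illustrative: the hash-based XOR keystream, the keys, and the message texts below are all made up (and the cipher is not secure); the only property that matters is that the cipher is length-preserving, so a paraphrase-by-deletion tells an observer exactly how many characters were cut, while a paraphrase-by-expansion leaves the insertion points unknown.

```python
import hashlib

def keystream(key: bytes):
    # Toy hash-based keystream (NOT secure; for illustration only).
    counter = 0
    while True:
        block = hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        yield from block
        counter += 1

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR stream cipher: ciphertext length == plaintext length,
    # and encrypting twice with the same key decrypts (XOR involution).
    return bytes(p ^ k for p, k in zip(plaintext, keystream(key)))

original  = b"attack at dawn from the north ridge"
shortened = b"attack at dawn from north ridge"       # paraphrase by deletion
expanded  = b"we will attack at dawn from the north ridge as planned"  # by expansion

ct_orig  = encrypt(b"key1", original)
ct_short = encrypt(b"key2", shortened)
ct_long  = encrypt(b"key3", expanded)

# Ciphertext length leaks plaintext length: an observer comparing the
# shortened ciphertext to the original knows exactly how many characters
# were dropped, but cannot tell where the expanded one inserted words.
print(len(ct_orig), len(ct_short), len(ct_long))
```

The observer never decrypts anything here; the comparison works on lengths alone, which is exactly why the original advice prefers deletion over expansion.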


The thread before with someone flogging off their educational book they wrote "with Claude in an afternoon", as if anyone would benefit from investing days or weeks of learning effort into consuming something the author couldn't be fucked spending even a single day on, that one was well crafted satire, right?

...right?


I wish. As far as I can tell the Venn diagram of people building piles of shit with NPM and people building piles of shit with LLMs seems pretty close to a circle.


I can give a bit more context as someone that got on WebGL, then WebGPU, and is now picking up Vulkan for the first time.

The problem is that GPU hardware is rapidly changing to enable easier development while still having low-level control. With ReBAR, for example, you can just take a pointer into gigabytes of GPU memory and pump data into it as if it were plain old RAM with barely any performance loss. 100 lines of bullshit suddenly turn into a one-line memcpy.

Vulkan is changing to support all this stuff, but the Vulkan API was (a) designed when it didn't exist and is (b) fucking awful. I know that might be a hot take, and I'm still going to use it for serious projects because there's nothing better right now, but the same extensibility that makes it possible for Vulkan to just pivot huge parts of the API to support new stuff also makes it dogshit to use day to day. The code patterns are terrible, and it feels like you're constantly compromising on readability at every turn because there are simply zero good options for how to format your code.

WebGPU doesn't have those problems, I quite liked it as an API. But it's based on a snapshot of these other APIs right at the moment before all this work has been done to simplify graphics programming as a whole. And trying to bolt new stuff onto WebGPU in the same way Vulkan is doing is going to end up turning WebGPU into a bloated pile of crap right alongside it.

If you're coming from WebGL, WebGPU is going to feel like an upgrade (or at least it did for me). But now that I've seen a taste of the future I'm pretty sure WebGPU is dead on arrival, it just had horrendous timing, took too long to develop, and now it's backed into a corner. And in the same vein, I don't think extending Vulkan is the way forward, it feels like a pretty big shift is happening right now and IMO that really should involve overhauls at the software/library level too. I don't have experience with DX12 or Metal but I wouldn't be surprised if all 3 go bye bye soon and get replaced with something new that is way simpler to develop with and reflects the current state of hardware and driver capabilities.


That is why game studios always went with engines, and never had the drama with APIs that FOSS developers seem to complain about all the time.

You get to design a good developer experience, while the plugin system takes care of the optimal API and configuration for each platform.


Historically, Microsoft didn't have a problem making breaking changes in new D3D APIs, so I think they'll be one of the first to make a clean API to leverage the new hardware features.


Console vendors and 8- and 16-bit computers did it first; even if in many cases it was bare-metal programming, that is still an API of a kind.


C code is shorter than both assembly and Rust, it's not the same thing.


Both Rust and C are also much, much less error-prone than assembly. It is so, so easy to get things wrong in assembly in very subtle ways. That's one of the main reasons why people only write assembly today when they absolutely have to.


> I became proficient in Rust in a week, and I picked up Svelte in a day. I’ve written a few shaders too! The code I’ve written is pristine. All those conversations about “should I learn X to be employed” are totally moot.

fucking lmao


My point is you learn X and your time to learn and ship Y is dramatically reduced.

It would have taken me a month to write the GPU code I needed in Blender, and I had everything working in a week.

And none of this was "vibed": I understand exactly what each line does.


You did not and you are not proficient. LLMs and AI in general cater to your insecurities. An actual good human mentor will wipe the floor with your arrogance and you'll be better for it.


I think you're under the impression that I am not a software engineer. I already know C, and I've even shipped a very small, popular, security sensitive open source library in C, so I am certainly proficient enough to rewrite Python into Rust for performance purposes without hiring a Rust engineer or write shaders to help debug models in Blender.

My point is that LLMs make it 10x easier to adapt and transition to new languages, so whatever moat someone had by being a "Rust developer" is now significantly eroded. Anyone with solid systems programming experience could switch from C/C++ to Rust with the help of an LLM and be proficient in a week or two's time. By proficient, I mean able to ship valuable features. Sure, they'll have to leverage an LLM to help smooth out understanding new features like borrow checking, but they'll surely be able to deliver given how strict the Rust compiler already is.

I agree fundamentals matter and good mentorship matters! However, good developers will be able to do a lot more diverse tasks which means more supply of talent across every language ecosystem.

For example, I don't feel compelled at all to hire a Svelte/Vue/React developer specifically anymore: any decent frontend developer can race forward with the help of an LLM.


I realize I came across as harsh and I surely don't want to judge you personally on your skills as A) that's not necessary for my point to make sense and B) uncalled for. I'm sure you are a capable C developer and I'm sorry for being an asshole - but I am one so it's hard for me to pretend otherwise...

Being able to program in C is something I can also do, but it sure as heck does not make me a proficient Rust developer if I cobble some shit from an LLM together and call it a day.

I can appreciate how "businesses" think this is valuable, but - and this is often forgotten by salaried developers - as I am not a business owner I have neither the position nor the intention of doing any "business". I am in a position to do "engineering". Business is for someone else to worry about. Shipping "valuable features" is not something I care about. Shipping working and correct features is something I worry about. Perhaps modern developers should call themselves business analysts or something if they wish to stop engineering.

LLMs are souped up Stack Overflows and I can't believe my ears if I hear a fellow developer say someone on Stack Overflow ported some of their code to Rust on request and that this feature of SO now makes them a proficient Rust developer because they can vaguely follow the code and can now "ship" valuable features.

This is like being able to vaguely follow Kant's Critique of Pure Reason, which is something any amateur can do, compared to being able to engage with it academically and rigorously. I deeply worry about the competence of the next generation - and thus my own safety - if they believe superficial understanding is equivalent to deep mastery.

Edit: interesting side note: I am writing this as a dyed-in-the-wool generalist. Now ain't that something? I don't care if expertise dies off professionally, because I never was an "expert" in something. I always like using whatever works, and all systems more or less feel equal to me, yet I can also tell that this approach is deeply flawed. In many important ways deep mastery really matters, and I was hoping the rest of society would keep that up, and now they are all becoming generalists who don't know shit, and it worries me.


It would have taken you a month and you would have been able to understand it 100x more.

LLMs are great but what they really excel at is raising the rates of Dunning-Kruger in every industry they touch.


Yes, this is definitely missing a /s, I hope.

Please for the love of god tell me this is a joke.


People have 240 Hz monitors these days; you have a bit over 4 ms to render a frame. If that 1 ms can be eliminated or amortised over a few frames it's still a big deal, and that's assuming 1 ms is the worst-case scenario and not the best.
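The budget arithmetic is easy to sketch (a trivial illustration; the refresh rates and the 1 ms pause are just the figures from this thread):

```python
def frame_budget_ms(hz: float) -> float:
    # Time available to render one frame at a given refresh rate.
    return 1000.0 / hz

def pause_share(pause_ms: float, hz: float) -> float:
    # Fraction of the frame budget consumed by a fixed pause (e.g. GC).
    return pause_ms / frame_budget_ms(hz)

for hz in (60, 144, 240):
    print(f"{hz:>3} Hz: budget {frame_budget_ms(hz):.2f} ms, "
          f"1 ms pause = {pause_share(1.0, hz):.0%} of the frame")
```

At 60 Hz a 1 ms pause is a rounding error; at 240 Hz it's roughly a quarter of the entire frame budget, which is why the same GC pause can be fine in one game and a dealbreaker in another.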


I don’t think you need to work in absolutes here. There are plenty of games that do not need to render at 240hz and are capable of handling pauses up to 1ms. There’s tons of games that are currently written in languages that have larger GC pauses than that.


What about the C# garbage collector? Is it much better? Because Unity is in C#, right?


Unity uses the aging Mono runtime because of politics with Xamarin before its acquisition by Microsoft; the migration to .NET Core is still in progress.

Additionally they have HPC#, which is a C# subset for high performance code, used by the DOTS subsystem.

Many people mistake their C# experience in Unity for the state of the art in the .NET world.

Read the great deep dive blog posts from Stephen Toub on Microsoft DevBlogs on each .NET Core release since version 5.0.


Gee do you think maybe that's why all our software sucks balls these days?


No? Of all the reasons software sucks, multidisciplinary programmers are unlikely to be near the top.


They certainly are for me; working but horribly designed Java projects written by PL/SQL devs left my eyeballs bleeding long into the night.

Absolutely zero frameworks or libs, nothing even for logging. Code architecture that would be left in the dust by most university semester projects.

This is how PL/SQL codebases look, but boy, Java (and the rest of the world) has moved quite far since 1995.

