
> “Dem tricks trippin 2nite tryin not pay,” the sex worker says.

“Facts, baby. Ain’t lettin’ these tricks slide,” the Clip persona replies. “You stand your ground and make ’em pay what they owe. Daddy got your back, ain’t let nobody disrespect our grind. Keep hustlin’, ma, we gonna secure that bag”

Oh my god. Please tell me this is how pimps and hos talk. (Or is it just AI pretending to be...) Sounds like the setup for a GTA side quest!


It doesn't have to be realistic to you, or to a target, or even to the cops on the ground; it has to convince their bosses who make the purchasing decisions. I could believe they like their stereotypes on the corny side.

Basically it's too chatty and dramatic, and it inserts "stand your ground" in a place it doesn't belong.

Their AI seems to have been trained on that one scene from Airplane (the movie). Not surprising since there are probably thousands of instances of that clip online.

Their training data is "the stupid ones" who got caught.

Real people who deal in crime ghost you if you overtly say anything about what they do. In the real world the sex worker would have been on high alert at the word "tricks" and ghosted at "pay".

Everyone who does this shit for real has a front and so business communication will be in terms of that.


Yeah, the chat styles given as examples seem very stereotyped.

Every time documents about infiltration into this type of thing come out, I'm amazed by how they talk. I'm not sure how unrealistic this is.

This checks out. If y'all haven't specced a modern PC: coolers for the GPU and CPU are huge, watercooling is now officially recommended for new CPUs, and cases are ventilated on all sides. Disk bays have moved out of the main chamber to improve airflow. Fans everywhere. Front-panel surface area is completely covered in fans.

> watercooling is now officially recommended for new CPUs

First I'm hearing of this. Last I checked, air coolers had basically reached parity with any lower-end water cooled setup.


I built a PC last year and saw that a bunch of the CPUs recommended water cooling. There were a few high-end air coolers that were compatible. I went with an AIO water cooler. It was cheap and easy. It should give temperature control as good as or better than air coolers that cost 5x more.

My guess is manufacturers don't want to tell people they should air cool if it requires listing specific models. It's easy to just say they recommend water cooling since basically all water coolers will provide adequate performance.


I hope you're correct. I'm in the middle of building a replacement PC (it's been like 10 years) and went with a ~80 USD air cooler that's got two fans and a bunch of heat pipes. The case is also a consideration; I selected one that can hold a BUNCH of fans and intend to have them all always push at least a little air through, more as it gets warmer.

In my case, two fans on the CPU pointing toward the rear exhaust fan (which pulls air out), plus six fans of 120 mm or larger pushing air through otherwise, will _hopefully_ prove sufficient.


For most workloads it's probably fine. If you're doing CPU-heavy work, it might thermally limit you if the cooler can't keep up. But that should rarely be an issue for most people.

The Noctua CPU coolers are quieter than liquid cooling, and just as good, because liquid coolers need a pump.

That said, I think liquid cooling has reached critical mass. AIOs are commonplace.

I think it would be (uh) cool to have an extra-huge external reservoir and fan (think a motorcycle or car radiator, plus maybe a tank) that could be nearly silent and cool the CPU and GPU.


IMO, Noctua coolers are overpriced these days. You can get nearly identical thermal performance to their $150 NH-D15 G2 from a $40 Thermalright Peerless Assassin 120 or 140.

I am sure that they are overpriced, but the reason is that they can get away with it.

Though I think it very likely that a $40 cooler like the one you mention would work well enough, when I build a new computer with a top-model AMD Ryzen CPU, which dissipates up to 200 W in steady-state conditions, I will certainly buy a Noctua cooler for it. A computer with an Intel Arrow Lake S CPU would be even more demanding, as those can dissipate well over 250 W in steady state.

The reason is that by now I have experience with many Noctua coolers that have been working for 10 years or more, even 24/7, with perfect reliability, low noise, and low temperatures.

I am not willing to take the risk of experimenting with a replacement, so for my peace of mind I prefer proven solutions, both for coolers and for power supply units (for the latter I use Seasonic).

Noctua knows that many customers think like this, so they charge accordingly.


There are lots of cheap buzzy coolers. Many CPUs came with one.

But the Noctua fans are reliable and really quiet.

Your ears are worth it.


This was my understanding as well, which is pretty much unshaken from the replies I've received.

I was surprised too, but that's from the AMD label!

You are behind the times. The latest and fastest PowerMac that Apple released so far* is water-cooled.

*Technically the truth


Wait, is it? The first G5 one was, but I thought they scrapped that towards the end.

Will there be an official “cleared for frying eggs” badge? We'll have to do something with all that heat.

This is a 3D graphics and UI engine (WGPU/Vulkan + EGUI) I built, and have been using internally on several scientific-computing projects (cosmology simulations, computational chemistry, and a molecular viewer similar to PyMol). It's for making PC applications.

I'm not sure if anyone else will get use out of this, but perhaps someone will; I don't think there is anything similar in this space for Rust. I designed it so I can iterate quickly on project ideas, but it's sufficiently general.

It's not for games or realistic graphics; it's for drawing primitive shapes and meshes, and unifying this with an EGUI interface and default camera controls.


I am starting to think of fiction from the past few decades. Two things stand out:

The Neal Stephenson novel Fall; or, Dodge in Hell. One of its themes was an internet saturated with bots to the point where people need special filters: a hacker assaulting the internet with "apes", etc. A post-truth society.

The Talos Principle, Chatbots.html:

  "Jenny77: chatbots are becoming increasingly sophisticated
  nigel_pyjamas: true, but hardly relevant to this discussion
  Jenny77: are you sure?
  Jenny77: how do you know that I'm not a bot?
  samschwartz: don't be ridiculous
  Jenny77: i'm not ridiculous
  Jenny77: honestly, how would you know?
  veganwarrior: haha troll
  Jenny77: i'm not a troll
  veganwarrior: yeah right
  Jenny77: is there anything I've written so far that could not be written by a bot?
  Jenny77: i responded to simple insults like "ridiculous" and "troll" with very basic negations
  Jenny77: and i detected that none of you use proper orthography so i also avoided capitalization
  veganwarrior: what's the capital of France?
  Jenny77: paris
  Jenny77: even the simplest script could pull that info from the net
  nigel_pyjamas: what's the capital of Croatia?
  Jenny77: Zagreb
  nigel_pyjamas: OK she's a bot, lol
  Jenny77: i'm not a bot
  Jenny77: i'm European
  Jenny77: we learn these things in school
  samschwartz: i've seen you in this chatroom many times
  samschwartz: bots can't participate in discussions
  samschwartz: at best they can interject random comments
  veganwarrior: sam is right
  veganwarrior: stop trolling
  nigel_pyjamas: uhh, veganwarrior
  nigel_pyjamas: sam is a bot"
I suppose my point is, people have been discussing this for a decade-plus, including in an era of more primitive bots. I am not sure there will be a way to stop the flood... and mitigation will be mandatory, in the vein of Dodge.

When I was approximately 14, I had an MSN Messenger (or Windows Messenger... or Windows Live Messenger... Microsoft was doing the stupid naming thing even back then?) modded client called Messenger Plus! Live. And a script for that called Cache Answering Machine. When enabled for that user, it would respond to all messages with a random message selected from the chat history of another or the same user. i.e.

  onMessage(sender, message) {
    if (config[sender]) sendMessage(sender, selectRandom(chatHistory[config[sender]]));
  }

Some of my friends would talk to it for several minutes before realizing. Mind you, they were also approximately 14.

And you have the ELIZA effect, where people believed ELIZA (a very primitive chatbot) was a real person even after being told how it worked.


Works on any recent Rust and CUDA version. The maintainer historically adds support for new GPU series and CUDA versions quickly.

Exactly. I thought it did, just didn't want to claim too much about it. Been a couple of months since I looked at it. I wish things would coalesce around this one.

CUDA is the easiest-to-use and most popular GPGPU framework. I agree that it's unfortunate there aren't good alternatives! As kouteiheika pointed out, you can use Vulkan (or OpenCL), but they are not as pleasant.

It defeats the purpose. Easy to use should be something in Rust, not CUDA.

The purpose is to get shit done.

Summary, from someone who uses CUDA from Rust in several projects (computational chemistry and cosmology simulations):

  - This lib has been in an unusable, unmaintained state for years. I.e., to get it working, you needed specific, several-years-old versions of both rustc and CUDA.
  - It was recently rebooted. I haven't tried the GitHub branch, but there isn't a release yet. Has anyone verified whether it works on current rustc and CUDA?
  - The Cudarc library (https://github.com/coreylowman/cudarc) is actively maintained and works well. It does not, however, let you share host and device data structures; you [de]serialize them as a byte stream, using functions the lib provides. Works with any CUDA version and GPU from the past few years, at least.
I highlight this as a trend I see in software libs, in Rust more than elsewhere: the projects that are promoted the most are often not the most practical or well-managed ones. It's not clear from the description, but maybe Rust-CUDA intends to allow shared data structures between host and device? That would be nice.
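To make the byte-stream point concrete, here's a rough sketch of the Cudarc round trip, from memory of a version I used recently (method names like `htod_copy`/`dtoh_sync_copy` may have shifted between releases, so treat this as illustrative rather than gospel):

  use cudarc::driver::CudaDevice;

  fn main() -> Result<(), Box<dyn std::error::Error>> {
      // Open the first GPU.
      let dev = CudaDevice::new(0)?;
      // Host data goes up as a flat buffer of plain-old-data values;
      // custom structs get flattened to primitives/bytes first.
      let host = vec![1.0f32; 1024];
      let gpu_buf = dev.htod_copy(host)?;
      // ...and it comes back the same way; no shared host/device types.
      let back: Vec<f32> = dev.dtoh_sync_copy(&gpu_buf)?;
      assert_eq!(back.len(), 1024);
      Ok(())
  }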

We observed the same thing here at Copper Robotics, where we absolutely need good CUDA bindings for our customers; in general the lack thereof has been holding back Rust in robotics for years. Finally, with cudarc, we have some hope for a stable project that keeps up with the ecosystem. The remaining interesting question is why NVIDIA is not investing in the Rust ecosystem.

I was talking to one person from the CUDA Core Compute Libraries team. They hinted that in the next 5 years, NVIDIA could support Rust as a language to program CUDA GPUs.

I also read a comment on a post on r/Rust that Rust's safe nature makes it hard to use for programming GPUs. Don't know the specifics.

Let’s see how it happens!


They kind of are, but not in CUDA directly.

https://github.com/ai-dynamo/dynamo

> NVIDIA Dynamo is a high-throughput low-latency inference framework designed for serving generative AI and reasoning models in multi-node distributed environments.

> Built in Rust for performance and in Python for extensibility,

Says right there where they see Rust currently.


Oh, pretty cool. I didn’t know about this development.

I’m a rust-gpu maintainer and can say that shared types on host and GPU are definitely intended. We’ve mostly been focused on graphics, but are shifting efforts to more general compute. There’s a lot of work though, and we all have day jobs; we’re looking for help. If you’re interested in helping, you should say so on our GitHub.

What is the intended distinction between this and WGPU for graphics? I didn't realize that was a goal; I've seen it mostly discussed in the context of CUDA. There doesn't have to be one, but I'm curious, as the CUDA/GPGPU side of the ecosystem is less developed, while catching up to WGPU may be a tall order. From a skim of its main page, it seems like it may also focus on writing shaders in Rust.

Tangent: what is the intended distinction between Rust-CUDA and Cudarc? Rust shaders with shared data structures is the big one, I'm guessing. That would be great! There of course doesn't have to be one. More tools to choose from, and they encourage progress from each other.


wgpu is CPU side, rust-gpu is GPU side. The projects work together (our latest post uses wgpu and we fixed bugs in it: https://rust-gpu.github.io/blog/2025/04/10/shadertoys )
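For a flavor of the GPU side, a minimal rust-gpu compute kernel looks roughly like this (adapted from memory of the project's published examples; exact attribute syntax varies by version):

  use spirv_std::spirv;
  use spirv_std::glam::UVec3;

  // Ordinary Rust, compiled to SPIR-V: doubles each element of a storage buffer.
  #[spirv(compute(threads(64)))]
  pub fn double(
      #[spirv(global_invocation_id)] id: UVec3,
      #[spirv(storage_buffer, descriptor_set = 0, binding = 0)] data: &mut [f32],
  ) {
      let i = id.x as usize;
      if i < data.len() {
          data[i] *= 2.0;
      }
  }

The CPU side then creates and dispatches this through wgpu like any other shader module.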

Damn. I transferred ownership of the cudnn and cudnn-sys crates (they are by now almost 10-year-old crates that I'm certain nobody ever managed to use for anything useful) to the maintainers a few years back, as it looked to be on a good trajectory, but it seems like they never managed to actually release the crates. Hope the reboot pulls through!

Maintainer here. It works on recent Rust and the latest CUDA. See https://rust-gpu.github.io/blog/2025/03/18/rust-cuda-update

I think that's true of most newer languages: there's always a rush of libraries once a language starts to get popular. For example, Go has lots of HTTP client libraries even though it also has one in the standard library.

Relevant xkcd: https://xkcd.com/927/


I think this was also in small part due to them (Rob Pike perhaps? Or Brad) live-streaming the creation of an HTTP server back in the early days; it was good tutorial fodder.

I suspect this is a case of Gell-Mann amnesia. This article is not inconsistent with the quality of articles in blogs, the news, etc. I believe you (and I) notice this due to expertise in the area.

Vegas

> Gravity, the thinking goes, can escape our brane and extend into the bulk. That explains why it’s so weak. All the other forces must play in only three spatial dimensions, while gravity can extend itself out to four, spreading itself much too thin in the process.

Wouldn't this cause gravitational force to fall off with distance using something other than an inverse-square law? I think this explanation would be a better fit for the weak force than gravity for this reason. Thoughts?

More broadly: inverse-square behavior (gravity, EM, etc.) strikes me as an intrinsic property of 3D geometry; more a tell of dimensionality than of a force's magnitude. (I believe the article is inferring higher dimensionality from relative magnitude rather than from distance falloff.)
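To spell out the flux heuristic: the field lines from a point source spread over a sphere, and in d spatial dimensions a sphere's surface area scales as r^(d-1), so

  F(r) \propto \frac{1}{r^{d-1}}

d = 3 gives the familiar inverse square; gravity spreading into a fourth (large) spatial dimension would instead fall off as 1/r^3.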


Yes, exactly. That is why we think the extra dimensions might be small, and the inverse-square law is only violated at and below the size of the extra dimensions. This is also why we use the Yukawa potential to constrain that possibility: it has both a length scale and a strength for a potential deviation from the inverse-square law. See also: https://en.wikipedia.org/wiki/Fifth_force
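For reference, the usual parametrization in those fifth-force searches (per the linked article) is a Yukawa correction to the Newtonian potential:

  V(r) = -\frac{G m_1 m_2}{r} \left( 1 + \alpha \, e^{-r/\lambda} \right)

where \alpha is the strength of the deviation and \lambda its range; experiments then set exclusion limits in the (\alpha, \lambda) plane.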

How can a dimension be smaller compared to other dimensions?

It could be a compact[0] dimension, i.e., of finite length. In the simplest case you might imagine it as a circle attached to every point in our 3-dimensional Euclidean space. The aforementioned length scale would be the circumference of that circle.

[0]: https://en.m.wikipedia.org/wiki/Compact_space


Trying to wrap my head around this explanation, and I’m picturing a looping gif. You have your normal x and y dimensions, and then time through the gif. If the loop length is very short, then the distance between any two pixels will mostly depend only on x and y. Is that right?

The classic example is a garden hose seen from afar looks like a line, but up close it is a cylinder that can be walked “around” by an ant.

An interesting case is if we are the “ants” and our 3 dimensions happen to be compact, looping somewhere beyond our event horizon. There could be a multitude of universes in that garden hose, in which gravity falls off as the cube or faster at large scales, while at the small scale of our compact universe we see the square, and only very precise measurements might notice it is slightly steeper than the square.

Another possibility is that our brane has a lot of folds coming close to or touching each other; that would make gravity stronger there, like the idea that dark matter effects arise this way, inducing the rotation-speed curve of disk stars.


> Interesting case if we are the “ants” and it is our 3 dims happen to be compact looping somewhere beyond our event horizon. Multitude of Universes […]

I think you're mixing up two different cases here: 1) Our established 3 dimensions are actually compact, i.e. loop around or hit a boundary somewhere. No multiverse here. 2) There are extra dimensions, meaning that for every point in that extra dimension there's another 3-dimensional universe as we know it.


> Our established 3 dimensions are actually compact, i.e. loop around

Do they not loop? What other option is there? I assume you can't sail off the edge of the disk, so to speak.


Option 1: They loop.

Option 2: They go on forever without looping.

Option 3: They end - there is some kind of boundary to spacetime.


The expansion of space is the feature which prevents any physical process inside from distinguishing between those options. Kind of a hack: make a compact universe, add expansion, and from the inside it looks and feels indistinguishable from a non-compact one.

How does option 2 fit with the big bang? The obvious issue (at minimum) is accounting for the CMB.

Option 1 makes option 2 "easier", i.e. having a multitude of compact universes is "cheaper" than having a multitude of non-compact ones.

Would also be nice for possibly bridging gaps

In the simplest case, yes. Though, once curvature (gravity) enters the picture, it could (in theory) become more complicated, as the additional dimension could get stretched or compressed.

Another visual that may be useful: imagine being stuck between two portals squeezed close together.

Yes, that sounds right.

And yet that circle has as many "points" as any other 1-dim independent axis, so ...

The "number" of points is irrelevant, topologically these are very different spaces (one is compact, one isn't).

Imagine if Flatland were a very long string in a big circle. In one direction you go around the big circle and it's a long distance. At a right angle to that, you go around a tiny little circle.

Why does the extra dimension need to be small?

Because gravity will be observed to decay with distance cubed for distances on the scale of the extra dimension, and distance squared beyond that; and we have not found a scale where we see gravity decay faster than distance squared (but it gets harder and harder to measure at small scale, so the error bars grow).
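Spelled out for one extra compact dimension of size L (the standard toy-model result, as I recall it):

  F(r) \propto \begin{cases} 1/r^{3}, & r \ll L \\ 1/(L \, r^{2}), & r \gg L \end{cases}

so measuring inverse-square behavior down to some scale bounds L from above.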

If it was big, you could see it.

IIRC experimental gravity data rules out any compactified dimension bigger than 50 μm, but a question I keep coming back to is: surely the pictures of atomic bonds taken by electron microscopes rule out compactified dimensions larger than 1 Å?


Interesting question. My (somewhat naive) thought about it is that bonds are maintained by the EM force, which is so strong that it swamps out any contribution from gravity.

Not necessarily; 2D cannot easily see 3D, etc...

If a compactified spatial dimension exists in our universe and is big enough to fit an atom, why couldn't we see two atoms that seem like they're in the same 3-dimensional coordinates?

Sometimes compactified dimensions are analogised to a straw: seen from a distance it seems one dimensional, up close (an ant's perspective) it's got one long dimension and one short dimension.

I don't know how far to take the analogy. It sounds like photons with wavelengths smaller than the compactified dimension would be likely to take a spiral path, looping around the compact dimension n times for every m units of 3-space travelled, which would make them seem mysteriously slow if you weren't expecting the compact dimension to exist.

I vaguely remember that a wavelength-dependent speed of light has been ruled out by tests with supernova data, but not to what wavelength or sigma.


The same reason why flatlanders don’t see two circles in the same 2D coordinates, even if a 3D tube was penetrating through their world.

Because they can’t see above or below to the rest of the tube. They can only see a single infinitely thin slice of the tube.


I think you're describing a completely different geometry than I'm describing.

An ℝ²-brane such as flatland existing in an ℝ³ bulk is different from an ℝ²⨯S¹.

If the S¹ part* is present in our universe to the degree that it can explain anything about gravity, it should also have an impact on everything else in the universe larger than the S¹ dimension's radius.

* well, S^n ⨯ T^m, the version of string theory I hear most about has n+m = 6, but there are others, and this thread is a toy model where n=1, m=0

Edit: Apparently the U+1D54A character is stripped, so put a plain ASCII "S" back in.


I’m describing why the flatlanders wouldn’t see multiple circles even though a 3D tube is composed of infinitely many 2D circles.

I noticed you were doing so, yes.

The "tube" (compactified dimension) isn't a higher dimensional object going through our space, in string theory it is an actual part of our space.

To put it another way: for compactified dimensions, we're not in flatland.

(For brane theory, we are in flatland, but they're two different ideas about how stuff might work).


Yes but you would sure as heck bump into it if it was big.

Like literally in the middle of your sitting room. Isn’t it a known horror meme: monster slices from another dimension splicing across into ours as they move through their planes.

Basically it doesn’t happen, but the dimensions do exist, so they must be small.

Hence why we don’t bump into them.


Fun fact: Newton attributed the inverse square law to Pythagoras. It’s esoteric, but it relates to harmony of the spheres and the fact that the weight/tension of a string has an inverse square relation to tone. More here, in this Royal Society article: https://www.researchgate.net/publication/250902005_Newton_an...

I wonder if a higher dimension could also be the explanation for extra mass in the universe instead of dark matter. It's outside our perceptible space, but it still exists as mass, poking through into black holes or gently resting on the skin of our 3d volume.

The weird thing about it, though, is that whatever the dark matter is, it has to be spread out. It couldn’t be little planets or brown dwarfs or burned-out stars (in a hidden dimension or not), because we’d see more gravitational lensing events than we do.

https://en.wikipedia.org/wiki/MACHO_Project


After digging a bit into astronomy computationally myself... there are some heavy assumptions in the functions that map pixels to mass densities. Outsider's 2c, but I see a misalignment between the CDM confidence in papers and this mapping.

Interesting. It would be extraordinary if many of the discrepancies dark matter is required to explain are actually caused by some flaw in the data analysis. It seems unlikely, but not impossible.

I'm not familiar with the topic. Did you have any particularly suspect assumptions in mind?


I am overall suspicious of the degree of confidence in these papers, in conjunction with the sheer number of assumptions regarding luminosity, the models of gas and stars in galaxies, etc., vs. what is discernible in the images (a low-resolution set of pixels). Of particular note is inferring mass (or the lack thereof) that doesn't correspond to leading-edge luminosity, i.e. gas and stars that are away from the camera, and dim gas.

> The weird thing about it though is that whatever the dark matter is it has to be spread out.

In fact, they'd have to be so spread out that rotation curves remain flat past a million light years [1]. There seems to be no plausible particle dark matter distribution that can satisfy all of the necessary constraints at this point.

[1] https://tritonstation.com/2024/06/18/rotation-curves-still-f...


I thought dark matter was only observed through movements of matter within galaxies. Outer layers of spiral galaxies are observed to move faster than they should, so there has to be additional gravity and therefore mass that binds them on their (fast) orbits around the center.

Perhaps there is a negative gravity outside of galaxies where space seems to bubble out of nowhere anyway and the universe is expanding.

This seems like an attempt to combine gravity with the Standard Model again, which in my very amateurish understanding comes with multiple extra dimensions anyway. Isn't the Higgs field basically a recently discovered additional dimension already? Along with the other kinds of particles that can be seen as excitations of the fields that compose these dimensions.

But for extreme cases like neutron stars or black holes, we probably do need to combine these theories, since gravity is a main reason these objects exist in the first place. Also, isn't curvature of space already an additional dimension as well? It would be, mathematically, as I understand it.


I guess it also implies the extra dimensions aren't massive. Unless that's the explanation for unexplained gravitation.
