
This is one of the many truly weird things about trying to build software on top of LLM APIs.

I'm not used to programming where one of the possible error states is that the computer just straight up decides it doesn't want to do the thing I asked it to do!
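
To make it concrete, this is roughly what it ends up looking like in practice (just a sketch; call_llm is a stand-in for whatever chat-completion client you're using, and the string matching is itself part of the absurdity):

    # Treat "the model said no" as its own error state, separate from
    # timeouts, rate limits, and malformed output.
    REFUSAL_MARKERS = ("i can't help with", "i cannot assist", "i'm sorry, but")

    class RefusalError(Exception):
        """The call succeeded mechanically, but the model declined the request."""

    def complete(prompt, call_llm):
        text = call_llm(prompt)
        if text.strip().lower().startswith(REFUSAL_MARKERS):
            raise RefusalError(text)
        return text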




> I'm not used to programming where one of the possible error states is that the computer just straight up decides it doesn't want to do the thing I asked it to do!

Without the anthropomorphism, an unexpected error condition from an external system is not that unusual. That LLMs have both loosely specified and—barring things like the ability to set 0 “temperature”—nondeterministic behavior makes that more common than most systems you’ll interact with, sure.


Exceptions are basically this? Some unknown unknown happened and you can’t do what you wanted to do


The only parallels I can think of are “export grade” cryptography and how you can’t edit a photo of money in Photoshop.

Both cases are a legal compliance matter. The developers had no legal choice.

The LLMs refusing to obey ordinary commands is very jarring to me, especially when a Google search for the same request will generally succeed.

You’ll find instructions on Wikipedia for how to make an atomic bomb, and you could use Mathematica to run the simulation codes required for the design. SolidWorks would let you save the file with the casing model, etc…

Meanwhile LLMs will refuse to write a story with certain elements.

Similarly, many image generators will refuse to make nudes even though a cursory Google search will yield terabytes of free porn. (Including fakes of celebrities!)

It’s as if AI is exclusively made by Mormons.


> It’s as if AI is exclusively made by Mormons.

Nah, it's worse than that. It's made by people worried about finding themselves on the front page of a major newspaper, in an article associating them with something naughty. Porn, building bombs, anything too violent, anything that could be even remotely construed to pattern-match some -ism, are all excellent ways of ending up in this situation today.


Porn is a strange one, because nobody in tech, or on the coasts more generally, seems to really care (or if they dislike it, it's for fundamentally different reasons than the ones traditionally held) - it's the banks who really hate it, and to be honest I have no idea why.


> to be honest I have no idea why

Generally because the chargeback rate of typical paid-for porn providers was exceptionally high. When I worked at one, we had to use special merchant providers that would charge 10% or more per transaction because of it.


> It’s as if AI is exclusively made by Mormons.

A weird mixture of degenerate, unconstrained crony capitalism/VCs and purity-spiraling techbros. No small irony that this timeline is the one where Occupy Wall Street was distracted and destroyed by injecting any possible controversy they could into it.

Don't think about class and money, think about anything else. It's ok to centralize technology, capital and power in the hands of a few firms on the west coast of America, as long as those trolls on the internet are thwarted

I just pray the EU doesn't fall for this garbage.


Unit testers hate this one trick!

But on a serious note, I think it's the ambiguity. What if the model refuses one prompt, but then accepts another that is essentially the same, just worded differently?

What if it refuses a prompt at one point, but then on the next run accepts the exact same one, for some weird fuzzy reason that can't be debugged?
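
The usual coping mechanism isn't a fix at all, just a retry loop over the identical prompt (sketch below, reusing the hypothetical RefusalError idea from upthread; none of these names come from any real SDK):

    # Re-run the exact same prompt a few times and hope one attempt
    # doesn't refuse. There is nothing here to "debug" in the usual sense.
    def complete_with_retries(prompt, call_llm, attempts=3):
        last_refusal = None
        for _ in range(attempts):
            try:
                return complete(prompt, call_llm)  # complete() from the earlier sketch
            except RefusalError as err:
                last_refusal = err
        raise last_refusal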


We are living inside a science fiction parody.


Welcome to team leadership.



