> They use an AI tool to comb public information about your company and file hundreds of copyright infringement, IP, and trade secret theft cases. The scale means you can’t just ignore it or settle for a nominal amount.
The "scale" itself is the problem. Because companies are so huge, and their reach is so huge, it invites techniques that increase the efficiency of attacks. Human beings weren't meant to handle things at such scales, and that is part of the reason we have the problem of AI in the first place.
If we lived in smaller, more self-sufficient communities, then we would not have scale and the people in such communities would not have much desire to develop AI either. AI is the natural reaction of a large populace who look for a technological solution to the immense chaos of information.
This is an unnecessarily complicated rationalization.
The only reason large companies attract these attacks is because large companies have large bank accounts.
That’s it. That’s the reason. There’s no need to theorize about small communities or humans developing AI in reaction to something.
Scammers see a tool, an opportunity for exploitation, and a target with something they want (money). They leverage the tool to try to extract what they want from the most likely target.
That’s what’s happening here. There isn’t some deeper philosophical explanation. Scammers just want money and they’ll use any tool they can in order to get it.
> If we lived in smaller, more self-sufficient communities, then we would not have scale and the people in such communities would not have much desire to develop AI either.
Yes, because they’d be too busy worrying about food and disease to bother with much else. Agrarian societies weren’t fun.
I suppose you've lived in one? Or are you just fearmongering? Because agrarian societies certainly weren't always worrying about food and disease. What about the Amish? They certainly manage decently without too much technology.
Humans were also not meant to handle corporations of a scale that could engage in things like IP theft, data theft, wage fixing, and mass psychology, with impunity. It strikes me as a cat and mouse game. Not that I think either side will make things better for the rest of us.
> If we lived in smaller, more self-sufficient communities, then we would not have scale and the people in such communities would not have much desire to develop AI either.
You could say AI is the child of the internet, as it was the internet that cooked up the massive trillion-token datasets. Without the internet, no AI.
But if you look closely, the internet already resembles AI: you can search for images instead of generating them, and there are billions to choose from. You can search for information, and you can chat on social networks instead of using an LLM. It's the same as AI but even better, because it's human made. An HGI made of humans and networks.
This attack works on a single person or a small business that produces a lot of public works. Any prolific author or artist is exposed to this. Any marketer, etc.
Your thesis that small communities would have no desire for AI doesn’t make sense. Small communities absolutely want automation, both in the physical world and the information world.
I mean, sure, if we still lived as small groups of hunter-gatherers, many of today's problems would not exist. But many of yesterday's problems could never have been solved.
It isn't a dichotomy. But making a dichotomy is a thing technophiles do a lot, and I'm not sure why. It's clear that we could go back to smaller communities and keep some of the newer ways. I think technophiles irrationally put everything that isn't "advanced tech society" into the box of hunter-gatherer because they can't psychologically cope with the idea that there is an alternative to the unsustainable technological world they have become emotionally attached to.