This is the part I don't understand. Personally, I don't know anyone paying for LLMs, only companies that are immediately pivoting to find cheaper or internal options to contain that expense.
Considering the massive cost and short lifespan of previous models, current-generation models have to not only break even on their own costs but also make up for the hundreds of millions lost on previous generations.
As soon as they find a way to embed advertising into LLM outputs, the small utility LLMs provide now will be gone.
The only hope I guess is that the local LLM stuff that Microsoft is pushing will actually work out. If inference is done on the device… well, at least using it doesn’t need to be profitable. Training it… I guess will have to be justified by the feature being enough of a value-add that it makes the OS more profitable.
Designing operating systems that people would actually pay real money for doesn’t seem to be a focus for anybody except Apple. But I guess there’s hope on some theoretical level at least.
That's the real AI future, not some atomic war with robots: manipulative AI bowing to the dollars of advertisers, responding to something overheard by a smart speaker 20 minutes ago by packaging an interstitial ad for Coca-Cola and delivering it stealthily, in-experience, to your child in their bedroom on a VR headset, straight to their cornea.
The ways of circumventing this influence are currently being dismantled, and these AI RTB ad systems are well funded and already being built. The AI news feed will echo the message, and the advertiser will be there again in their AI internet search.
We will cede our agency over reality to machines responding to those who wish to reshape reality in their own interest, as more of the human experience gets commodified, packaged, and traded as a security to investors looking to extract value from being alive.
Weirdly, the old Deutsche Mark doesn't seem to have its own code point in the currency symbols block starting at U+20A0, whereas the Spanish equivalent (the peseta, ₧, not just Pt) does.
It's not a Unicode issue; there just isn't a dedicated symbol for it, since everyone just used the letters DM. Unicode (at least back then) was mostly a superset of existing character sets and their distinct glyphs.
That would be a fine answer, but for the fact that other currencies like the rupee (₨) that are "just letters" do have their own code point. Being made up of two characters doesn't necessarily make something not a symbol, in semiotics or in Unicode.
In fact, this is one of the root problems: there are plenty of Unicode symbols you can build out of others, either by juxtaposing, overstriking, or using a combining character, but this isn't done consistently.
That doesn't mean the new money doesn't have value, it just has a few percent less value. It's a wealth transfer from people who currently have money to the new money. It works out well for people who have a negative net worth, as well!
It is a wealth transfer from people who hold and earn cash (workers, since pay increases lag the rate at which the currency loses purchasing power) to people who have assets and COLA-adjusted annuities (wealthy people and old people).
I don't think sarcastic comments really help; people who aren't capable of having real conversations about this already bring enough of that. It is definitely worth talking openly about whether it's worth trading short-term inflationary pain for long-term climate pain.
I guess you're right. The parent speaks of inflation as if printing money is the new norm. If you are going to take money from your citizens, then be upfront about it!
It is essentially forced borrowing from the holders of the printed currency. So yeah, it would work. It wouldn't necessarily be fair or popular, but it would work. You just have to account for the new money also being worth less because of the increase in the denominator of the equation.
It wouldn't work linearly, but it still works. If you print 10% of your GDP in a year, you'll be sitting on only 1/11th of the GDP in cash at the end of the process.
Look at the Covid fiscal response causing a permanent ~30% rise in the price level over the last 4 years. The climate+demographic response money printing operation will be way bigger than that.
Most of the inflation from the last 4 years is attributable to Russia invading Ukraine. You can't have the largest natural gas exporter and second largest oil exporter invade one of the largest grain exporters without causing basically everything in a supermarket or restaurant to be more expensive.
Also shipping interruptions and lockdowns. Giving people money to not work is going to have a much larger inflationary effect than giving people money to build things we want.
> Most of the inflation from the last 4 years is attributable to Russia invading Ukraine
Source? Other than media articles repeating "due to the war in Ukraine"
Assuming you are talking about the USA: supposedly the USA is a net /exporter/ of grains [0].
[0] Not loading for me but https://www.ers.usda.gov/data-products/ag-and-food-statistic... . Copilot said "The United States is a net grain exporter. According to the USDA Economic Research Service (ERS), the U.S. typically exports more agricultural goods, including grains, than it imports1. In fiscal year 2023, the value of U.S. agricultural exports was $178.7 billion, despite a decline from the previous year. Grains and feeds are among the leading U.S. agricultural exports"
As a reader, I find it irritating when people write to learn about something without clearly flagging that they have almost zero experience using the thing they are writing about.
There are so many articles in tech where the writer probably has less experience with the subject than literally anyone who will read the post. The result is effectively a content farm of what a new software engineer will learn in their first few months (if not years) on the job, written by software engineers in their first few months, with effectively no net information.
I'll offer the opposite perspective. People writing about stuff that they are currently learning is often better, because they have a much clearer model of what's obvious and what isn't.
Someone with 20 years of experience with a technology will usually have a much harder time re-connecting with that beginner's mindset and doing a great job of providing the information that other newcomers most need to understand.
That's not to say that there isn't plenty of junk content out there, but I blame that more on inexperienced writers than on people who are writing about technology that they don't have a great deal of experience with.
A great writer should be able to write about something while they're learning it and still produce content that's genuinely useful.
But they should still present themselves accurately, because at that stage they don't know what they don't know and they may be misleading people without realizing it.
This is why I like the TIL format. Saying "Today I Learned" is a great shorthand for "I'm not an expert and I may have missed something but here's what I've figured out so far..."
There are some topics that we need more expert voices on, because the subject matter is genuinely complicated and requires a veteran hand to guide people through it. Otherwise we end up with a bunch of "expert beginners" sitting on their local maxima and thinking they're at the pinnacle of understanding. Some of us really do want to hear how experts think, imperfect as their explanations may be. Dev-fluencers are already taking over the space, gish-galloping their absolute nonsense everywhere for that sweet YouTube $$$.
I imagine you are speaking of the trend of Medium-style articles where someone writes a "guide" on how to use a trendy tool, rather than a blog post about something they actually did with the tool. It is why I usually ignore anything on a blogging platform.
I LOVE reading dev blogs about the journey of making something. I understand the frustration when you know they are doing it "wrong". But more often than not, for me at least, I learn something new.