Seriously? The only comparison you can make is that both were hyped. Digging even a millimeter under the surface reveals they’re completely different.
Blockchain was and still is rife with scams and “you just don’t understand the technology bro” hype men. Just check out Dirty Bubble Media. Blockchain was rarely if ever a product that solved a problem; the whole point is The Line Goes Up. That’s why no one uses blockchain in industry, and why crypto bros kept proposing silly use cases like ticket sales and property deeds. These are people who have apparently never heard of a relational database.
The hype around AI is due to increased attention on things that have already existed and have already been studied, used, and improved for decades now. There was never much R&D into blockchain tech because the tech isn’t the point. For ML, there are researchers who have worked on these problems for decades. It doesn’t need to justify its own existence; the justification is that it can solve real problems.
Again, I do tend to agree that there is a lot more "there" there with generative AI. But I think it's also true that it's too early to be sure.
You're comparing the two technologies at totally different points in their hype cycles. The right comparison point for where AI is now is Bitcoin / very early Ethereum in the late 2000s to early 2010s. Nobody knew where it was all going; some people saw endless potential, other people saw nonsense and scams. The explosion of bitcoin into mainstream consciousness in the early 2010s is akin to the explosion of ChatGPT over the past 6 to 9 months.
But what's next? That's what matters. The early 2010s bitcoin boom now pretty clearly looks like a fad, in hindsight. Is ChatGPT also mostly a fad, or is it going to be a lasting fixture of productivity and/or entertainment moving forward? I think it's the latter - it has already changed my habits at work in ways that I think will be permanent - but I just think it's too early to say for sure.
(And to be clear, I'm not talking about machine learning as an academic discipline; I totally agree with you that there is definitely enough evidence to say there is a lot more "there" there than research into chained hashing to solve double-spend-like problems.)
Well, if you narrowly define AI to be ChatGPT and other generative LLMs, I think I agree to some extent. Unlike blockchain, they do have use cases, but it remains to be seen if those use cases can justify the money being thrown at them. How much is code completion really worth?
However, I disagree insofar as the outcome truly depends on an unknown technology. Blockchain was never going to revolutionize finance or any of its other grand claims. At best (and that’s if it worked), it would be a new database type that all of the existing financial systems would plug into. It was a libertarian pipe dream, naive about how the world actually works.
For any AI application, the situation is different. If we simply replace “AI” with “automated system,” we can see why. Pretty much every company would like to replace its workers with machines. And maybe machines can do things that humans would never be able to do (for example, search the entire internet for a very specific topic).
Yes that's what I'm talking about because that's what the article is talking about! The article is explicitly not about the ML / AI academic research. I agree that's well established.
What the article is about is the current hype cycle of people trying to take the newest generation of "AI" tools, of which GPT-4 is the leading edge and most widely known, and make useful products with them. And whether that is going to be a big deal or a fad is, as yet, unproven.
It is super easy to say, in 2023, that "blockchain was never going to revolutionize finance". But in 2013, that was an unknown. For what it's worth, you could go back to my commenting history from that period to find me saying "bitcoin is never going to revolutionize finance"; I was a skeptic then. But that doesn't mean I was definitely going to be right - I was just educated-guessing, just like the people on the other side of the conversation. That guess looks to have been prescient with the benefit of hindsight, but I've been wrong about lots of stuff too: I thought the iPad was stupid, I hated "Web 2.0", I thought the Facebook IPO was doomed, the list goes on and on.
My best guess is that building products on top of "generative AI" is going to prove to be a big deal, but I don't know that, and it's hard not to be influenced by an ongoing hype cycle, is all I'm saying.
> For any AI application, the world is different. If we simply replace AI with “automated system” we can see why. Pretty much every company would like to replace their workers with machines. And maybe machines can do things that humans would never be able to do (for example, search the entire internet for a very specific topic).
Sure, but again, we just don't know yet if the "AI Engineering" thing this article is talking about is going to, in any way, turn into any of that, or if it's going to be more of a bust.