> The next big platform is multiple cores/multiple CPUs. The next big language is functional and helps deal with consistency across time and across CPUs.
So the popular opinion in the Internet echo chamber keeps telling me, but somehow I don't buy it.
If you do, please try to answer this simple question: what single application of widespread importance benefits on a game-changing scale from running on multiple cores?
It's not office productivity/business automation applications like word processors, spreadsheets, and accounting packages. They could run just fine on a typical desktop PC years ago. Sure, it's useful to run multiple applications simultaneously, but the OS can handle the scaling in that case.
It's not mass information distribution/web applications. The bottlenecks there are typically caused by limited communications bandwidth or database issues. While concurrency is obviously a big factor internally in databases, most of us don't actually write database engines.
It's not games. Most AAA titles today still don't scale up in that way, and one mid-range graphics card with its specialist processor would blow away a top-end quad-Xeon workstation when it comes to real-time rendering. Again, there is some degree of concurrency here, but many intensive graphics rendering problems are embarrassingly parallel in several ways, so this isn't much of a challenge even for today's mainstream programming languages and design techniques.
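To make "embarrassingly parallel" concrete, here's a minimal sketch of my own (not taken from any real engine): each image tile below is shaded independently of every other tile, so even a mainstream language can spread the work over all cores with an off-the-shelf process pool and no locking. The `shade_tile` function and tile layout are toy placeholders standing in for real per-pixel work.

```python
# Minimal sketch of an embarrassingly parallel render loop. Every tile is
# computed independently, so distributing the work across cores needs no
# shared state or coordination. shade_tile is a hypothetical stand-in.
from concurrent.futures import ProcessPoolExecutor

WIDTH, HEIGHT, TILE = 640, 480, 64

def shade_tile(origin):
    x0, y0 = origin
    # Stand-in for real shading: each pixel depends only on its own
    # coordinates, never on another tile's results.
    return [(x, y, (x * y) % 256)
            for x in range(x0, min(x0 + TILE, WIDTH))
            for y in range(y0, min(y0 + TILE, HEIGHT))]

if __name__ == "__main__":
    tiles = [(x, y) for x in range(0, WIDTH, TILE)
                    for y in range(0, HEIGHT, TILE)]
    with ProcessPoolExecutor() as pool:       # one worker per core by default
        shaded = pool.map(shade_tile, tiles)  # tiles never talk to each other
    pixels = [p for tile in shaded for p in tile]
    print(f"shaded {len(pixels)} pixels across {len(tiles)} independent tiles")
```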
I suspect the most likely mainstream beneficiaries of better multi-core/multi-CPU support would be things where there really is heavy calculation going on behind the scenes and it's not always uniform: multimedia processing, CAD, etc.
However, what about the alternative directions the industry might take? The Internet age has emphasised some basic realities of software development that, as an industry, we weren't good at recognising before.
For one thing, many useful tools are not million-lines-of-code monsters but relatively simple programs with far fewer lines of code. It's knowing what those lines should do that counts. That means rapid development matters, and that in turn requires flexible designs and easy prototyping.
For another thing, data matters far more than any particular piece of software. Protecting that data matters much more in a connected world with fast and widespread communications, so security is more important than ever, and we need software that doesn't crash, doesn't suffer from data-loss bugs, and so on.
So I'm going to go out on a limb here and suggest that multi-core/multi-CPU is not in fact going to be the dominant factor in the success of near-future languages. I think flexibility and robustness are going to be far more important.
It may turn out that the attributes of a more declarative programming style support these other factors as well. It may be that functional programming becomes the default for many projects as a consequence. But I don't think any future rise of functional programming will be driven by a compelling advantage to do with implementing modest concurrency on multi-core systems. That just isn't where the real bottlenecks are (in most cases).
> If you do, please try to answer this simple question: what single application of widespread importance benefits on a game-changing scale from running on multiple cores?
Computer vision and machine learning both benefit a lot from multiple cores. They seem to be really big growth areas at the moment and have the potential to dramatically change the way we interact with computers. It's already happening: recommendation engines on e-commerce sites are a great example of machine learning in practice. I believe we're going to see this sort of thing appearing in more and more places.
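For what it's worth, here's a toy sketch of why these workloads scale so well across cores: scoring candidate items for a recommendation is independent per item, so it parallelises trivially. The dot-product scorer and the data here are hypothetical stand-ins, not any real engine's method.

```python
# Toy sketch: per-item recommendation scoring is independent work, so it can
# be farmed out across cores with no coordination. The preference vector,
# catalogue, and dot-product scorer are all invented for illustration.
from concurrent.futures import ProcessPoolExecutor

USER_TASTE = [0.9, 0.1, 0.4]  # hypothetical per-user preference vector

def score(item):
    name, features = item
    # Each item's score depends only on that item and the user vector.
    return name, sum(u * f for u, f in zip(USER_TASTE, features))

if __name__ == "__main__":
    catalogue = [("book", [0.8, 0.0, 0.3]),
                 ("film", [0.2, 0.9, 0.1]),
                 ("album", [0.7, 0.2, 0.6])]
    with ProcessPoolExecutor() as pool:
        ranked = sorted(pool.map(score, catalogue), key=lambda r: -r[1])
    print(ranked)  # highest-scoring recommendation first
```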
Web browsers already take advantage of multiple cores, by the way. Mozilla is developing the Rust language partly because of dissatisfaction with the concurrency support in existing languages (one of the three reasons given in the project FAQ).
I think there's a large opportunity cost to dismissing concurrency & parallelism at this moment.
> I think there's a large opportunity cost to dismissing concurrency & parallelism at this moment.
I'm not dismissing the idea, nor claiming that it has no value for any application. Clearly such parallelism would be valuable to a significant number of projects, some of which perhaps don't make the best use of their host hardware today. I'm just trying to keep the multi-core idea in perspective relative to the other ways our programming languages might improve.
Better multi-core support can get you a constant-factor speed-up in computationally expensive work, but Amdahl's Law tends to spoil even that, as the sketch below illustrates. On the other hand, a language with a type system that allows you to prevent entire classes of programmer error could lead to a step change in security or robustness. A language expressive enough to capture the developer's intent in ways that today's programming models do not could lead to entirely new techniques for keeping designs flexible and for supporting new rapid development processes, or it could create opportunities for optimisers that bring the performance of more expressive languages to a level where they compete on speed with the lower-level languages used in the same fields today. I suspect that across the field of programming as a whole, such improvements would be far more widely applicable.
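To put a number on the Amdahl's Law point: if a fraction p of a program parallelises perfectly across n cores, the overall speedup is 1 / ((1 - p) + p / n). A quick back-of-envelope calculation (my own illustration, assuming a generous 90% parallelisable workload):

```python
# Back-of-envelope Amdahl's Law: a fraction p of the work parallelises
# perfectly across n cores; the rest stays serial.
def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

for cores in (2, 4, 8, 64):
    # Even with 90% of the program parallelisable, the serial 10% caps the gain.
    print(f"{cores:3d} cores -> {amdahl_speedup(0.9, cores):.2f}x speedup")
# 64 cores give only about 8.8x; the limit as n grows is 1/(1-p) = 10x.
```

Even in that optimistic case, 64 cores buy you less than a 9x speedup, and no number of cores gets you past 10x.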