That's not such a long time for math; IMO there haven't been that many innovations in the field since then. The main caveats are that the benchmarks may no longer be meaningful, and GPU techniques won't be part of that book due to its age.
People have tested LP and MILP solvers from around the year 2000 against recent ones while correcting for hardware. Hardware improvements accounted for a ~20x speedup, while LP solvers in general sped up 180x and MILP solvers a full 1000x ("Progress in mathematical programming solvers from 2001 to 2020").
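Just to make the arithmetic explicit, here's a quick sketch under my own reading of those numbers (assuming the 180x/1000x are algorithmic factors on top of the ~20x from hardware, and that the factors compose multiplicatively):

```python
# Back-of-the-envelope only. Assumption: the quoted 180x / 1000x are
# purely algorithmic speedups, separate from the ~20x hardware factor,
# and the two compose multiplicatively.
hardware = 20            # ~20x from hardware improvements alone
lp_algorithmic = 180     # LP solver algorithmic speedup, 2001-2020
milp_algorithmic = 1000  # MILP solver algorithmic speedup, 2001-2020

lp_total = hardware * lp_algorithmic       # combined machine+solver speedup
milp_total = hardware * milp_algorithmic

print(f"LP:   {lp_total}x total")    # 3600x
print(f"MILP: {milp_total}x total")  # 20000x
```

In other words, on this reading a 2020 solver on 2020 hardware is a few orders of magnitude faster than a 2000 solver on 2000 hardware, with most of that coming from the algorithms rather than the silicon.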
Solvers from 2008 are on an entirely different level of performance: there are many problems unsolvable by them that more modern solvers solve to zero duality gap in less than a second.
In MINLPs the difference is even more striking. This doesn't mean that those books are useless (they are quite good), but do not expect a solver based on those techniques to even play in the same league as modern solvers.
Can you send me some of these results? I am pretty skeptical of such dramatic algorithmic improvements.
I don't think the point of an encyclopedia is to cover every single topic, as nice as that would be. If you're in the market for an encyclopedia, you're probably looking for a starting point, survey, or summary of what's good to know. The algorithms you're thinking of are probably in very dry papers and monographs, accessible only to experts. If you were writing a commercial-grade generic MINLP solver, you would surely be looking at the latest papers for ideas, or you simply wouldn't be competitive with existing solvers.
There are so many techniques that have only been invented in the last couple of years, like RINS, MCF cuts, conflict analysis, symmetry detection, dynamic search, ... (see e.g. Tobias Achterberg's line of work).
On the other hand, hardware improvements were not as relevant for LP and MILP solvers as one would expect: for instance, as of now there is still no solver that really uses GPU compute (though people are working on that). The reason is that parallelizing simplex solvers is quite tough, since the algorithm is inherently sequential (it's a descent over simplex vertices) and the actual linear algebra is very sparse (if not entirely matrix-free). You can do some things like lookahead for better pricing or row/column generation approaches, but you have to be very careful with that (interior point methods are arguably nicer to parallelize, but in many cases carry a performance penalty compared to simplex).
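To make the pricing point concrete, here's a toy sketch (my own illustration, not any real solver's code): the reduced costs c_j - y·A_j can be evaluated independently per column, which is the part you *can* farm out, while choosing the entering column and updating the basis must finish before the next round of prices means anything, which is what keeps the iterations sequential.

```python
# Toy Dantzig pricing step (illustrative only, not real solver code).
# Each column's reduced cost c_j - y.A_j depends only on that one column,
# so all columns could be priced concurrently; the entering-column choice
# and the basis update that follow are the sequential bottleneck.

def reduced_cost(c_j, y, col):
    # c_j minus the dual values dotted with column j of A
    return c_j - sum(y_i * a_ij for y_i, a_ij in zip(y, col))

y = [1.0, 0.5]                      # dual values from the current basis
columns = [[1, 2], [3, 0], [0, 4]]  # nonbasic columns of A (dense here;
                                    # real problems are extremely sparse)
c = [2.0, 2.0, 5.0]                 # objective coefficients

# This map is embarrassingly parallel in principle:
prices = [reduced_cost(c_j, y, col) for c_j, col in zip(c, columns)]
entering = min(range(len(prices)), key=lambda j: prices[j])

print(prices)    # [0.0, -1.0, 3.0]
print(entering)  # column 1 has the most negative reduced cost, so it enters
```

The catch in practice is that with very sparse data the per-column work is tiny, so thread coordination overhead can easily eat the gains.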
MILP/MINLP solvers look much nicer to parallelize at first glance, since you can parallelize across branches in the branch-and-bound, but in practice that is also pretty hard: modern solvers are so efficient that it can easily happen that you spend a lot of compute exploring a branch that is quickly proven unnecessary to explore by a different branch (e.g. SCIP, the fastest open-source MINLP solver, is completely single-threaded and still _somewhat_ competitive). This means that a lot of the algorithmic improvements are hidden inside the parallelization improvements, i.e. a lot of time has been spent on the question "What do we have to do to parallelize the solver without just wasting the additional threads?"
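A toy illustration of that pruning effect (my own sketch, nothing to do with SCIP's actual implementation): in a serial best-first branch-and-bound, the incumbent found while diving in one branch immediately kills other open nodes; a naive parallel search would burn threads on exactly those nodes before the incumbent gets shared.

```python
# Toy best-first branch-and-bound for 0/1 knapsack (illustrative only).
# The `pruned` counter shows how many subproblems get discarded because
# an incumbent found elsewhere in the tree makes them provably useless --
# exactly the work a naive parallel search risks doing anyway.
import heapq

values  = [60, 100, 120, 80]   # sorted by value/weight density
weights = [10, 20, 30, 25]
capacity = 50

def bound(i, value, weight):
    # Fractional-relaxation upper bound over the remaining items
    for j in range(i, len(values)):
        if weight + weights[j] <= capacity:
            weight += weights[j]; value += values[j]
        else:
            return value + values[j] * (capacity - weight) / weights[j]
    return value

best = 0      # incumbent objective value
pruned = 0
# Max-heap on the bound (negated); state = (item index, value, weight)
heap = [(-bound(0, 0, 0), 0, 0, 0)]
while heap:
    neg_ub, i, value, weight = heapq.heappop(heap)
    if -neg_ub <= best:   # incumbent from another branch prunes this node
        pruned += 1
        continue
    if i == len(values):
        best = max(best, value)
        continue
    # Branch: take item i (if feasible), or skip it
    if weight + weights[i] <= capacity:
        v, w = value + values[i], weight + weights[i]
        best = max(best, v)   # every feasible partial solution is an incumbent
        heapq.heappush(heap, (-bound(i + 1, v, w), i + 1, v, w))
    heapq.heappush(heap, (-bound(i + 1, value, weight), i + 1, value, weight))

print(best, pruned)   # optimum 220; 5 nodes pruned by the incumbent
```

Run serially, the incumbent of 220 wipes out every remaining open node; speculatively exploring those nodes on spare threads would have been pure waste, which is why "just parallelize the tree" doesn't deliver the speedup you'd hope for.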
That doesn't really explain an 1800€ price tag. I've bought 500-1000 page encyclopedias of academic subjects for $40 or $50.
Projects like this are built on the work of academics, the majority of whom are publicly funded. By and large they resent the for-profit publishers who benefit from their work and sometimes reduce them to needing to pirate their own work.
This book is like 7 volumes and 4600 pages on very niche subject matter. The price is high, but if you need it, you need it. I'm guessing most academics can look up individual papers as needed and don't need a comprehensive summary like this, as nice as that may be.
Wtaf.