Dijkstra advocates creating programs by formal transformation of specifications written in mathematical logic. Computers themselves just do boolean algebra on strings of bits, so Dijkstra is not advocating formal methods because they are somehow essential to how computation is carried out inside the machine; they are not, especially in the form he proposes.
He is proposing formal methods as a way of thinking about developing computer programs; this is clear from reading "A Discipline of Programming". You cannot say "formal methods are a way not to think", because the theoretical existence of a sequence of transformations that yields the final program does not tell you how to find those transformations, so some thinking on the programmer's part is still needed. Of course you can come up with a program through a way of thinking completely uninformed by the formal specification and then formalize it after the fact, but that is not what Dijkstra is talking about in this article.

What I am saying in the comment is that I do not believe many people benefit that much from a formal approach to program development, just as not many new theorems in mathematics, outside of mathematical logic, were discovered using the tools of mathematical logic. Even the proofs themselves in mathematics aren't done in a completely formal way.
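For a flavour of the kind of transformation Dijkstra has in mind, here is a rough sketch in his weakest-precondition notation (my own illustration, not a quote from the book). Say the specification is to establish R: m >= x and m >= y and (m = x or m = y). Calculating backwards from R:

    wp("m := x", R)  =  (x >= x) and (x >= y) and (x = x or x = y)  =  x >= y
    wp("m := y", R)  =  y >= x

Choosing exactly those weakest preconditions as the guards of an alternative construct,

    if x >= y -> m := x  []  y >= x -> m := y  fi

gives wp(IF, R) = (x >= y or y >= x) and (x >= y => x >= y) and (y >= x => y >= x) = true, so the program establishes R from any initial state. Note, though, that the calculus only lets you check the construction; nothing in it tells you to try assignments to m in the first place. That is exactly the thinking that remains with the programmer.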
> What I am saying in the comment is that I do not believe many people benefit that much from a formal approach to program development
A lot of systems would benefit from formal (or more formal) methods. I worked in aviation software, and formal methods would have saved us so much trouble, especially since the code size is often relatively small (10k-200k LOC) and, because we're dealing with other systems, the I/O spec was reasonably complete. Sure, there would still be issues, but feature creep was not one of them (except on one project), and feature creep only makes formalisms difficult when you're looking at whole-program formalisms. So safety-critical systems certainly benefit.
What else might? How about medical systems (see the Therac-25, where software faults in the operator interface led to deaths). Network code, especially fundamentals like routers, switches and core services like DNS that so much depends on. Cryptographic and other security systems. Compilers, obviously; like network code, they're too ubiquitous to be left to error-prone ad hoc approaches. Anything handling money. Anything handling privacy (ensuring compliance with HIPAA and the various European privacy rights). Software driving production automation or managing warehouses and inventory, where failures mean lost production time and wasted stock; there are lots of money and productivity gains to be had from software that works.
If you accept formal methods on the small scale (essentially the promise of functional programming a la Haskell and the ML family), you can be confident in composing large systems out of smaller, formally verified systems.
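As a minimal sketch of what "formal methods in the small" can look like in that style (the module and names here are made up for illustration, not from any particular codebase): the invariant is stated and enforced once, at a module boundary, and anything composed from the module inherits it.

    -- A hypothetical Balance module: the invariant "a balance is never
    -- negative" is checked in exactly one place (the exported functions),
    -- and the constructor is hidden, so no client code can violate it.
    module Balance (Balance, mkBalance, credit, debit, balanceValue) where

    newtype Balance = Balance Integer
      deriving (Eq, Show)

    -- The only way to create a Balance goes through this check.
    mkBalance :: Integer -> Maybe Balance
    mkBalance n
      | n >= 0    = Just (Balance n)
      | otherwise = Nothing

    -- Crediting a non-negative amount preserves the invariant.
    credit :: Integer -> Balance -> Maybe Balance
    credit amount (Balance n)
      | amount >= 0 = Just (Balance (n + amount))
      | otherwise   = Nothing

    -- A debit is refused rather than producing a negative balance.
    debit :: Integer -> Balance -> Maybe Balance
    debit amount (Balance n)
      | amount >= 0 && amount <= n = Just (Balance (n - amount))
      | otherwise                  = Nothing

    balanceValue :: Balance -> Integer
    balanceValue (Balance n) = n

Nothing here needs a proof assistant; the point is only that a small unit with a precisely stated interface can be reasoned about (or even proven correct) in isolation, and larger systems built from such units don't have to re-establish the invariant.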