Cryptocurrency mining ASICs cannot be used for password cracking because they are not designed to hash arbitrary strings, and they do not support arbitrary scrypt parameterization. They are designed to take in a block template of some sort and then iterate the nonce internally, to minimize load on the communication bus and the host processor.
An interesting thing about GPU password cracking that many people don't realize is that the GPU is responsible for not only hashing candidate passwords, but also generating those candidate passwords. The bus bandwidth isn't enough and the CPU isn't fast enough to keep it fed with candidate passwords otherwise.
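To make that concrete, here's a rough CUDA-style sketch of the idea (hypothetical names, a toy FNV-1a hash standing in for the real target hash; this isn't hashcat's actual code): each thread reconstructs its own candidate from its global index, so the host only ships a keyspace description and reads back a hit.

    // Hypothetical sketch, not hashcat's code: each thread derives its
    // candidate password from its global index, so no candidates cross the bus.
    #include <cstdint>
    #include <cuda_runtime.h>

    #define CHARSET_LEN 36
    #define PW_LEN 6
    __device__ const char d_charset[CHARSET_LEN + 1] =
        "abcdefghijklmnopqrstuvwxyz0123456789";

    // Toy FNV-1a hash standing in for MD5/NTLM/whatever is actually being attacked.
    __device__ uint64_t toy_hash(const char *s, int len) {
        uint64_t h = 1469598103934665603ULL;
        for (int i = 0; i < len; ++i) { h ^= (unsigned char)s[i]; h *= 1099511628211ULL; }
        return h;
    }

    __global__ void crack_kernel(uint64_t target, uint64_t base, uint64_t total,
                                 unsigned long long *found_idx) {
        uint64_t idx = base + blockIdx.x * (uint64_t)blockDim.x + threadIdx.x;
        if (idx >= total) return;

        // Decode the candidate straight from the index (a mask-style keyspace walk);
        // the candidate is generated on the device, never sent from the host.
        char pw[PW_LEN];
        uint64_t n = idx;
        for (int i = 0; i < PW_LEN; ++i) { pw[i] = d_charset[n % CHARSET_LEN]; n /= CHARSET_LEN; }

        if (toy_hash(pw, PW_LEN) == target)
            atomicExch(found_idx, (unsigned long long)idx);  // host re-derives pw from idx
    }

The point is that per-candidate data never touches PCIe; the host just launches the kernel over index ranges and re-derives the winning password from the returned index.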
You could, of course, build password cracking ASICs with internal processors for generating candidate passwords, but I'm not convinced that would beat sticking with GPUs or maybe FPGAs there.
You'd be better off with FPGAs, which are available at significantly smaller process nodes (and therefore lower running costs) than ASICs you roll yourself. There's also a certain level of reusability that you don't get with SHA-256 etched into silicon.
Not true re GPU bandwidth/dictionaries. See cudaHashcat - we're using it on a switched PCIe bus, feeding dictionaries with plenty of bandwidth. You're not limited to mask attacks, which is what you're describing.
Are you sure about that? I'm more familiar with oclHashcat, and it certainly can't keep the GPU busy with a pure dictionary attack. It'll feed the GPU a dictionary, then apply rules on the GPU to generate variants.
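Roughly, the split looks like this hypothetical CUDA sketch (toy rules and a toy hash, not oclHashcat's actual kernels): the base words cross the bus once, then each thread expands one (word, rule) pair on the device, so the GPU generates far more candidates than it ever receives.

    // Hypothetical sketch of dictionary + on-GPU rules; not oclHashcat code.
    #include <cstdint>
    #include <cuda_runtime.h>

    #define MAX_WORD 16

    struct Word { char buf[MAX_WORD]; int len; };

    // Toy hash standing in for the real target algorithm.
    __device__ uint64_t toy_hash(const char *s, int len) {
        uint64_t h = 1469598103934665603ULL;
        for (int i = 0; i < len; ++i) { h ^= (unsigned char)s[i]; h *= 1099511628211ULL; }
        return h;
    }

    // Toy rule set: 0 = as-is, 1 = capitalize first letter, 2..11 = append digit '0'..'9'.
    __device__ int apply_rule(const Word &in, int rule, char *out) {
        int len = in.len;
        for (int i = 0; i < len; ++i) out[i] = in.buf[i];
        if (rule == 1 && len > 0 && out[0] >= 'a' && out[0] <= 'z') out[0] -= 32;
        if (rule >= 2 && len + 1 < MAX_WORD) out[len++] = '0' + (rule - 2);
        return len;
    }

    __global__ void dict_rules_kernel(const Word *dict, uint64_t dict_size, int num_rules,
                                      uint64_t target, unsigned long long *found_idx) {
        uint64_t idx = blockIdx.x * (uint64_t)blockDim.x + threadIdx.x;
        if (idx >= dict_size * num_rules) return;

        // One base word, one rule per thread: the variant is built on the GPU,
        // so a large rule set multiplies the dictionary without touching PCIe again.
        char cand[MAX_WORD];
        int len = apply_rule(dict[idx / num_rules], idx % num_rules, cand);

        if (toy_hash(cand, len) == target)
            atomicExch(found_idx, (unsigned long long)idx);
    }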