My understanding is that for at least three years now, chip feature sizes have been small enough that the design software has to evolve (as in: the algorithm is simulated evolution) physical layouts for the logic-level designs, in order to avoid unintended self-interaction, whether quantum (tunnelling) or classical (e.g. parasitic capacitance between neighbouring structures). Humans can’t do that by hand for multi-billion-transistor chips.
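To make the "simulated evolution" idea concrete, here is a deliberately tiny toy sketch in Python, not any real EDA tool's algorithm: a genetic-style loop that evolves a 1-D placement of cells so connected cells sit close together (short wires) while any two cells closer than a minimum gap incur a penalty, a crude stand-in for coupling/capacitance effects. All names, constants, and weights (`N_CELLS`, `MIN_GAP`, the penalty weight) are illustrative assumptions.

```python
import random

# Toy illustration only: evolve a 1-D placement of cells.
# Cost = total wire length between connected cells
#      + penalty when any two cells sit closer than MIN_GAP
# (a very rough proxy for unwanted coupling between neighbours).

N_CELLS = 20
NETS = [(random.randrange(N_CELLS), random.randrange(N_CELLS)) for _ in range(30)]
MIN_GAP = 1.0  # hypothetical minimum spacing before a "coupling" penalty applies

def cost(placement):
    # placement: list of x-coordinates, one per cell (lower is better)
    wire = sum(abs(placement[a] - placement[b]) for a, b in NETS)
    coupling = sum(
        max(0.0, MIN_GAP - abs(placement[i] - placement[j]))
        for i in range(N_CELLS) for j in range(i + 1, N_CELLS)
    )
    return wire + 10.0 * coupling  # weight chosen arbitrarily for this toy

def mutate(placement, scale=0.5):
    # Nudge one randomly chosen cell by a small Gaussian offset
    child = placement[:]
    i = random.randrange(N_CELLS)
    child[i] += random.gauss(0, scale)
    return child

def crossover(a, b):
    # Single-point crossover of two placements
    cut = random.randrange(1, N_CELLS)
    return a[:cut] + b[cut:]

def evolve(pop_size=50, generations=200):
    pop = [[random.uniform(0, N_CELLS) for _ in range(N_CELLS)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        survivors = pop[: pop_size // 4]   # keep the best quarter
        children = []
        while len(survivors) + len(children) < pop_size:
            p1, p2 = random.sample(survivors, 2)
            children.append(mutate(crossover(p1, p2)))
        pop = survivors + children
    return min(pop, key=cost)

if __name__ == "__main__":
    best = evolve()
    print("best placement cost:", round(cost(best), 2))
```

Real place-and-route works in 2-D/3-D with routing layers, timing, and physics-aware design rules, so this is only meant to show why you hand the search to an optimiser rather than a human once the problem has billions of interacting parts.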