But my point is that by hyper-optimising for individuals, it will at some point cross a line where it's not really insurance as you understand it any more. The unlikely event you're covering against becomes the insurance company getting it wrong: for example, your house floods in an area that wasn't predicted to flood for millennia. As the insurer's predictions improve, the chance of such an unlikely event shrinks. How small can that chance get before you decide to just take the risk yourself?
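To put rough numbers on that question, here's a minimal sketch of the expected-value comparison a customer faces once the insurer's model is nearly always right. All figures (the premium, the probability the model is wrong, the loss) are made up for illustration, and it deliberately ignores risk aversion, which is the whole reason insurance exists in the first place:

```python
# Illustrative sketch: is it cheaper to carry the residual risk yourself
# than to pay the premium? All numbers below are hypothetical.

def self_insure_is_cheaper(annual_premium: float,
                           p_model_wrong: float,
                           loss_if_wrong: float) -> bool:
    """Compare the premium against the expected loss of going uninsured."""
    expected_loss = p_model_wrong * loss_if_wrong
    return expected_loss < annual_premium

# Example: a $400/year policy on a $300,000 house, as the insurer's
# chance of mispricing your risk shrinks.
for p in (1e-2, 1e-3, 1e-4, 1e-5):
    decision = "self-insure" if self_insure_is_cheaper(400, p, 300_000) else "buy policy"
    print(f"P(model wrong) = {p:.0e}: expected loss = ${p * 300_000:,.2f} -> {decision}")
```

On pure expected value the crossover here is around a 1-in-750 chance of the model being wrong; below that, the premium buys less and less, which is the line the argument is pointing at.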