Most of the use-cases you describe wouldn't be resolved by re-reading the entirety of the topics from the beginning of time.
They'd resume from their last committed offset and continue where they left off.
If you do have to catastrophically recover from the beginning of time, then sure you'd have a rough time. But that's true for any system that would have to do that. It's not Kafka-specific.
Now, if your consumer was entirely in-memory, made no use of persistent storage itself, and had to recover from the beginning of time, then I'd say that type of problem is your own architectural failure. I have yet to come across a tool, pattern, framework, or architecture that bullet-proofs your foot.
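To make the "resume from the last committed offset" point concrete, here's a minimal sketch of the commit-and-resume pattern. This is a toy, not the real Kafka client: the list stands in for a topic partition and the dict stands in for the durable offset store (what Kafka keeps in its `__consumer_offsets` topic). The names `run_consumer`, `committed`, and `stop_after` are all invented for illustration.

```python
log = [f"event-{i}" for i in range(10)]   # toy append-only partition
committed = {}                            # stand-in for the durable offset store

def run_consumer(group, stop_after=None):
    """Process records starting at the group's last committed offset."""
    offset = committed.get(group, 0)      # resume point, not the beginning of time
    processed = []
    for record in log[offset:]:
        processed.append(record)
        offset += 1
        committed[group] = offset         # commit after each record
        if stop_after and len(processed) >= stop_after:
            break                         # simulate a crash/restart
    return processed

first = run_consumer("g1", stop_after=4)  # "crashes" after 4 records
second = run_consumer("g1")               # restarts: picks up at offset 4,
                                          # no full replay of the topic
```

The second run only sees `event-4` through `event-9`; the catastrophic "re-read everything" case only happens if the committed offsets themselves are lost.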