You know, I’d like to see a TV series set in a post-pandemic world where things are, you know… quite normal, and the story is about how scientists saved the world instead of zombies, apocalyptic destruction, and humanity generally sliding back into the Middle Ages. But that would be so boring, right?