The Walking Dead. The fact that it is a *post*-apocalyptic series just shows how lazy the zombie genre has gotten in general. Whereas the original Dawn of the Dead showed us how the epidemic unfolded at the beginning, The Walking Dead just takes the lazy route of showing us a world that has already been overrun by the infected. I don't want to see that. I want to see the interesting stuff at the beginning--how society gradually collapses, how we go from our daily lives to holy heck, this is the end of the world as we know it. So far the original Dawn of the Dead is the only zombie film I've seen that has ever done this.
-
Watch "Fear The Walking Dead" then; that's actually what it's about.