In the first couple of seasons of The Walking Dead, the only villains were the undead roaming the Earth, and that seemed to be enough to hold the audience's interest. When it turned out not to be, the show began introducing human villains. The shift fit the premise of a post-apocalyptic world in which other surviving humans are a bigger threat than the zombies.