Dystopia: an imagined place or state in which everything is unpleasant or bad, typically a totalitarian or environmentally degraded one
This is the accepted definition of a dystopia, one writers and readers alike have embraced. However, now that dystopia has become such a popular genre thanks to the YA explosions of The Hunger Games and Divergent, as well as older staples like 1984 and The Handmaid's Tale, it's worth discussing why the accepted definition is wrong, or at least incomplete.
For a dystopian society, or more specifically the characters living within that dystopia, to be believable, there must be upsides. Why is the common citizen, and possibly your protagonist, content to live in this society, at least initially? Of course, we can blame it on indoctrination; we saw that approach in The Giver. We can blame it on military might, as in The Hunger Games, or on a secretive police faction that makes dissenters disappear, as in 1984 or the film Equilibrium.