Over the last century, science fiction and speculative satire have been valuable genres, often skewering social and political trends (as in Animal Farm and The Handmaid’s Tale). It could be argued that it is the job of such fiction to shine a light on our reality in the hope that the reflected glow will lead to needed changes. What are some memorable books, films, shows, and comics that have done this?