The increasing prevalence of misinformation in society may adversely affect democratic decision making, which depends on a well-informed public. False information can originate from a number of sources, including rumors, literary fiction, mainstream media, corporate vested interests, governments, and nongovernmental organizations. The rise of the Internet and user-driven content has provided a venue for quick and broad dissemination of information, not all of which is accurate. Consequently, a large body of research spanning a number of disciplines has sought to understand misinformation and determine which interventions are most effective in reducing its influence. This essay summarizes research into misinformation, bringing together studies from psychology, political science, education, and computer science. Cognitive psychology investigates why individuals struggle to correct misinformation and inaccurate beliefs, and why myths are so difficult to dislodge. Two important findings involve (i) various "backfire effects," which arise when refutations ironically reinforce misconceptions, and (ii) the role of worldviews in accentuating the persistence of misinformation. Computer scientists simulate the spread of misinformation through social networks and develop algorithms to automatically detect or neutralize myths. We draw together these research threads to provide guidelines on how to refute misconceptions effectively without risking backfire effects.
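To make the network-simulation strand concrete, the sketch below shows one simple way such a spread simulation can be set up: an independent-cascade-style model on a randomly generated follower graph, where each newly exposed person passes the myth on with some probability. The model choice, the parameters (network size, edge probability, share probability), and all function names are illustrative assumptions for this example, not the specific algorithms used in the studies surveyed here.

```python
import random

def random_network(n_people=200, edge_prob=0.03, seed=1):
    """Build a random directed follower graph as an adjacency list.
    (Illustrative assumption: a simple Erdos-Renyi-style random graph.)"""
    rng = random.Random(seed)
    return {
        person: [other for other in range(n_people)
                 if other != person and rng.random() < edge_prob]
        for person in range(n_people)
    }

def simulate_spread(network, initial_sharers, share_prob=0.2, seed=2):
    """Return the set of people reached by the myth under an
    independent-cascade-style process."""
    rng = random.Random(seed)
    exposed = set(initial_sharers)
    frontier = list(initial_sharers)
    while frontier:
        sharer = frontier.pop()
        for follower in network[sharer]:
            # Each not-yet-exposed follower adopts and reshares the myth
            # with probability share_prob (an assumed, fixed parameter).
            if follower not in exposed and rng.random() < share_prob:
                exposed.add(follower)
                frontier.append(follower)
    return exposed

if __name__ == "__main__":
    net = random_network()
    reached = simulate_spread(net, initial_sharers=[0])
    print(f"Myth reached {len(reached)} of {len(net)} people")
```

Running the sketch repeatedly with different share probabilities gives a rough sense of how small changes in individual sharing behavior can shift a myth from fizzling out to saturating the network, which is the kind of question such simulations are typically used to explore.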