About this item

Something is wrong with American journalism. Long before "fake news" became the calling card of the Right, Americans had lost faith in their news media. But lately, the feeling that something is off has become impossible to ignore. That's because the majority of our mainstream news is no longer just liberal; it's woke. Today's newsrooms are propagating radical ideas that were fringe as recently as a decade ago, including "antiracism," intersectionality, open borders, and critical race theory. How did this come to be?

It all has to do with who our news media is written by -- and who it is written for. In Bad News: How Woke Media Is Undermining Democracy, Batya Ungar-Sargon reveals how American journalism underwent a status revolution over the twentieth century -- from a blue-collar trade to an elite profession.


