The news media has been under attack from all angles, probably more so than ever over the past few years. From wholly fake stories circulating across social media and spreading like wildfire to highly partisan news channels broadcasting whatever they think their viewers want to hear to pull in big ratings, the legitimacy of the news media is being questioned left and right.
President Trump regularly attacks a number of outlets, including CNN, the Washington Post, and the New York Times, as "fake news." There are, no doubt, many honorable and highly ethical journalists out there. However, even some of the most respected among them have had their credibility questioned, which makes us wonder whether many of you no longer believe the news is real. Rhetoric aside, do you really believe the news is "fake," or do you think that, for the most part, at least among the reputable news sources, it is real?
What do you think? Are the majority of news stories from the mainstream media honest and truthful, or are they simply made up to push an agenda to a specific audience? We are not asking about fringe or conspiracy-type blogs, which we won't even name (you know who they are), but about mainstream, long-standing media. Tell us where you stand, then see how others voted on this question.