Just to widen the topic slightly, I have recently been watching "The Vietnam War" series on Netflix. The documentary not only follows events on the ground throughout the conflict, but also shows how the war deepened divisions within America, fuelling the civil rights and campus movements of the 1960s-70s.
Americans were lied to by their presidents throughout that decade and a half, and it ended with the United States as a nation virtually tearing itself apart in riots and confrontation.
Having watched the news over the last few nights, it seems as if the same situation is once again developing in America: a president who constantly lies to the nation, a poor economy, and racial discrimination and inequality still rife in many states.
I genuinely fear for the future of the United States at present.