October 2018
The Liberal World Order
Since the end of World War II, something never before seen has been happening in the world: the spread of a "liberal world order." Liberal in this case means free, not left-wing. Some might think this spread was inevitable, the natural trajectory of a world grown older and wiser, but it was not inevitable at all. It would not have happened without the United States not only promoting this order but also protecting it with money and military force. Most Americans understood this, and more...