American Empire

From its beginning, the United States has been an empire. After the founding fathers called for an ‘Empire of Liberty,’ their children did their damnedest to bring it to fruition.