Since 1945, and specifically since the colonization of Palestine through the establishment of Israel and the passing of the imperial baton from Britain, the US has pursued imperial domination around the globe while safely claiming the distinction of not being an overt colonial power.