Racism in the United States
Thursday, January 24th, 2019

The campaign and subsequent election of Barack Obama as President of the United States brought out into the open the ugly underbelly of racism that has existed in the United States since the beginning. It's always been here; it just sort of went underground with the passage of the Civil ...