Not American and not that well versed in American politics....

BUT having visited America pre-Obama and after the commencement of his presidency, I must say that I felt racial tension much more on visits to America after Obama became president. The oppressed/oppressor tension was much more pronounced. Americans really seem to be battling it out over this thing called "race".

The people on NC are generally quite open, but the average White American I routinely meet outside of America is shockingly racist, particularly the Bible thumpers, which always leaves me feeling baffled and glad I have no dealings with organized religion, especially Christianity. I recently met a plump White Christian American woman whose racism, sexism, and homophobia were quite astounding. I have no idea how people reconcile their hatred with God.
Originally Posted by Gabi2009
YES!!!! After a black man became president, it's like every white person let their guard down and just stopped "pretending" to be color blind. I have friends of all colors, but I found myself having to pull up some of my white friends and correct them. I love them to death, but I realized that even though I may not be 100% black, I'm their only non-white friend and they were completely naive about race relations. I believe white people have become afraid of black people, black men in particular, and having a black man running the country is freaking them out.