Racism: do you think it's going away in the US or making a comeback?
When I was 6, I never knew what racism was. I went to a predominantly white elementary school, and I never really experienced racism from the kids until 1st grade. I remember that when it was Black History Month, they never spent time talking about it and the reason why we have it. They only talked about MLK, and that was for 5 minutes. Anyways, everybody would stare at me throughout the whole entire day. The kids kept asking me about my hair and then touching it without permission. I complained to the teachers about it and they just ignored me. There was one time I forgot to put lotion on, and one boy asked me if I was trying to be white and said that it would be better if I was (around 4th grade). After a while I began to hate myself and wish I was white. I realized what I was thinking was wrong, so in 5th grade I told my dad to take me out of the school and I began a self-love journey.
Now I go to a diverse middle school, which I like.
I went to a laundromat a few days ago with my family, and by accident I took a man's basket. After I returned it to him, he came up to my family saying, "Do you speak English?" We answered yes, and out of nowhere we hear, "You f*****g thieves, why the hell did you touch my basket? We don't need you people in our country, you should've stuck to picking cotton!" The man left, and of course we were mad as hell.
After that I was wondering if racism is making a comeback in America and whether it will ever go away. I've dealt with racism a lot, and it's getting to the point where I think the US is turning a blind eye to racism and pretending it doesn't exist and slavery never happened.