In my opinion, racism has not truly changed in the United States. I do, however, believe that the country has grown more tolerant since the abolition of slavery and the passing of the Fourteenth Amendment, which granted people of African descent equal rights as citizens of the United States.
Less than fifty years ago, America was a society of segregation and racism.
Racism is defined as the belief that a particular race is superior to another. Although it is clear things have changed, racism is still visible in modern America. Relationships between African Americans and whites are generally better than they were in the forties and fifties. Today, it is far less common to see a black man step off the sidewalk to let a white man walk by, or to see a black man sitting in a separate section of the bus or train because a white man says he has to.
But beliefs in racial superiority still persist, and much of this has to do with ignorance.
As white people, we belong to a global system that favors whiteness over blackness at every turn. It is also a system that bestows unearned privileges on white people simply because of the color of our skin. We are born with these privileges, and no matter how much we may not want them, we are forced to keep them. They cannot be given away.
For example, if I walk into any drugstore in the country that carries hair products, I can be sure I will find something designed for my hair. Black hair products are much harder to find; often African Americans have to drive for miles to buy what they need. Further, I know that when a box of Band-Aids says “flesh color,” it means my skin color, not that of my Asian or Latina friends. If, in an attempt to give back my privileges, I said to the drugstore clerk, “I don’t want the privilege of always being able to get shampoo for my hair when my black friend can’t,” the clerk would think I was nuts, even if he agreed with me…