I live in the southeastern United States, in Florida, and I've observed that racism isn't very common. Every once in a while you'll hear someone make a controversial joke, but no real hate. It seems that political correctness has driven out open expression with regard to race, and it seems to me that whites are actually starting to be discriminated against. It seems some people of color are starting to look down on whites, as if they were the superior ones.
As for discrimination based on features other than skin color, I've seen nearly none of it, but then again I live in a very good part of the city, and I'm somewhat sheltered.