Pinkie-Dawn
This topic has been going through my mind as I examine how many of the misdeeds committed in other countries are rooted in cultural tradition. For example, some animals are being driven to the brink of extinction because one part of their anatomy is used as medicine, believed to heal the sick in an instant. This presents animal protection organizations with a tough challenge: convincing these people of the immorality and dangers of wiping those animals out.

Another example concerns the age of consent, which varies from country to country. Yet it feels like Americans are trying to convert every country to their own age of consent, out of disgust at seeing underage people having sex and at the amount of fanservice involving underage characters in foreign entertainment such as anime.

But one question has always struck me: does one country have the right to change another country's culture? What other countries do to their own people and environment is their business, no matter how immoral it is or how badly it affects their neighbors, and their culture is what makes them unique, for better or for worse. Your thoughts?