Mina Cash-Valdez
10/04/2018 9:41 PM
I think the meat industry has really become a central part of American "culture." As a capitalist society, we have made meat a very cheap option in grocery stores and in fast food, and these companies profit off of those in poverty by making meat products extremely cheap. Why would I order a $7 salad when I'm struggling to pay the bills and can get a 4 for $4 deal and feed my family? On top of that, keeping people eating meat and dealing with health issues makes a lot of people a lot of money. Meat is also a huge factor in misogyny: ads market burgers as manly and strong, and meat culture is a huge aspect of the patriarchy. There is also an extreme divide in our society's mind between human and animal; we think we are above and separate from them. But in indigenous cultures that eat meat, the mindset toward it is completely different. As much as it may seem like a reach to say this, meat in the United States is a symbol of power and domination, and shifting our meat-focused food culture would require an entire shift in our economic system.