Today, men are changing themselves to look like women, and women are changing themselves to look like men. There are a couple of other strange things going on nowadays too; men are trying to look more like lizards and cats... Society is pushing us to accept their gender identity, yet this is leading to things like men using the women's restrooms. Is all of this shit really worth debating over? Or is this what the masterminds of this country-wide scheme want? Maybe you should quit thinking the way you are told to and start questioning it instead.

Do you really think a civilized society would promote the thought that you should be more like something else and less like yourself? No, a civilized society would teach you the importance of self-acceptance. You have no control over what you look like or what gender you are. One essential key to true happiness is letting go of the things you will never have control over. You need to love yourself for who you are.

All this "I was born a girl in a man's body" is complete bullshit. They decided that they liked the idea of being a woman after they were born. I believe that people can dress up however they want and do whatever they want, but to go and have surgery done to "improve" yourself... That's just hateful. It's hateful for anyone to expect you to look a certain way, and hateful to yourself to think so negatively of the way you are. Love yourself, and live your life with nothing to prove to anyone but yourself.
Here's a video on the subject; it's more about the deceit and the bullshit you are fed. Courtesy of HighImpactFlix: