Feminism… Feminist… Female… Girlie girlism… I really don’t get it?! Some people who claim to be feminists aren’t really feminists but rather sexists. All they do is hate on the opposite gender and talk about how being a woman sucks because men use and abuse them, but I’m pretty sure that’s not what feminism is. I always thought feminism was about women being confident in being women, fighting for what they believe in, coming together with some sort of common support system, and overall loving the fact that they were put in this world as women! So I’m not too sure what feminism really is. I also thought feminism was meant to help men and women work and live together happily, equally, and successfully. It’s easy to bash the other sex, I mean hey, it’s human, but do we really need to bash each other to get attention or make a point? Is all the drama and media coverage necessary? Our past seems to have shaped our present-day society and our perspectives on both men and women. Women are looked at as the inferior, weaker gender and usually positioned as an accent to their partner (usually a man). So what gives, man?

