What is feminism anyway?

Feminism can be defined as the movement aimed at securing equal rights for women, and we all know women had to fight and suffer for those rights. In this day and age, women can vote, are entitled to equal pay, and enjoy all these wonderful things. But does that really mean both genders are equal?

It seems to me that there is a double standard in society: men are expected to be sexually active with as many partners as possible and it's overlooked, but if a woman does the same, she gets a reputation. Are we really stuck in the age when women should be seen and not heard? When women had to save themselves for marriage? It's no wonder so many women have self-esteem issues!

Society should empower women to be happy with who they are, regardless of how sexually active they are or aren't. Your opinions?…