I have a genuine question on my mind. Many claim that men and women are equals, and that men have no right to dominate women. I agree with that in principle. But if nature had made them equal, how is it that throughout history, in almost all societies, women have been weaker and dominated by men? Evolution seems to have settled the question by indicating that women are weaker than men and are, by nature, meant to be dominated by them. Isn't it then against nature to claim that women are equal to men?
Before feminists haul me over red-hot coals, let me argue their case.
There is one significant invention that has changed the lives of women forever and empowered them like nothing else. If modern women can claim equality with men in almost all fields, they probably owe it to this invention. That invention is the contraceptive (with perhaps a partial assist from the rising cost of living).
Throughout history, women spent most of their lives either pregnant or rearing small children. This meant they were always vulnerable to various threats and had to depend on men for their safety and well-being. That explains why, throughout evolution, women were subjugated by men and never attained equal status.
With the rise of contraceptives and a higher marriage age, women are now able to spend their time pursuing their own interests. They can earn college degrees, get jobs, join the army, enter boxing rings, dance, and so on. So they can claim to be the equals of men.
It is silly that so many religions advocate so strongly against contraceptives. This has nothing to do with morality; it is just a ploy to deny women equal status with men.