Is it the religion that undermines women? Religion is just words on paper. Without misogynistic assholes, the religion has no power. And if you take the religion away, I doubt the misogynists are just going to start respecting women because they are no longer under the influence of religion.

Typical Arab attitudes towards women existed long before Islam, as far as we can tell from social anthropology. And it would be dishonest to claim that the status of women in that era did not improve after the arrival of Islam.

Paul was a misogynist like many men of his era, but the early Christian church had many female leaders prior to the Holy Roman Empire. The priesthood may have been restricted to men, but even that is debatable, since it is difficult to say exactly when Christianity developed a formalized priesthood before the Holy Roman Church. In any case, the vast majority of early church leaders, male or female, were not officially "priests."

The way I see it: nature determines sex at conception, and there is no physical or metaphysical law, logic, or reasoning to support the claim that there are differences between men and women beyond the physical. People who still believe otherwise are struggling with their ego, their need for power and control, or their self-esteem. Every claim asserting differences in rights, intelligence, or anything else can be refuted by logical and ethical reasoning.

People use all kinds of things to justify evil. If you read enough, know enough, and are bright enough, you will realize that religion doesn't even make the top of the list. Politics, money, and non-religious ideologies have contributed to more evil in the world in just the last hundred years than you can realistically attribute to any religion over the last 2,500.

It is a lot easier to blame evil on someone else's religion than to admit that humans are just fearful, ignorant animals with a penchant for violence and hatred in all its forms.