Some of the largest female-oppressive forces in our culture are religious in nature. We have the big three — Christianity, Judaism, and Islam — telling women that they are less than men, that they can never aspire to hold positions of power within the religious hierarchy, that they must be subservient and submissive to men, and so on. And a LARGE portion of Americans claim to belong to these religions. I can't see women ever really getting further than they are right now unless and until we do something about religion.