I find this very odd. How could a major cultural lineage be wrong about something so much a part of ordinary experience?
Perhaps because women weren’t allowed much say in the writing of medical books at that time? I may be wrong, but there weren’t many women doctors to begin with before the 20th century.
I didn’t just mean the Victorians. Either the ancients were wrong about women being the hornier sex, or we’re wrong that men are.
Or both.