If women become the dominant sex, will it be permanent?
Question: Many experts believe that, if current trends continue, women will become increasingly dominant in government, business, and the professions. If women do become the dominant sex, do you think they will consolidate their hold on power and retain it indefinitely, or will men gradually claw it back?
Created by: infohound at 07:04:38 AM, Friday, July 15, 2011 PDT
Comments (344)