I think "equality" in the US has been in slow-motion, and male-dominated far too long. While it's refreshing to see homosexuals come out of the closet, and demand their civil rights....it's somewhat frustrating that women were fighting this battle long before any man did.
Hey, women still don't have pay equity or equal rights, regardless of their sexual orientation.
Does this really mean that only men can advance women's causes, because men control the political process and most of the money?