Does our modern society unjustly give good men a bad rap sometimes? Do all men, good or bad, get painted with the same cynical brush? Are women simply better than men, or better off without them? I’ll give you my unfiltered thoughts, some findings from a study I read on men and women working together, and my take on how true love conquers all.