How nice: Americans no longer disapprove of working women. I know that might not seem like much to you ladies who've been working your behinds off your entire adult lives, but, believe me, this is a huge deal.

According to research by Kathleen Gerson of New York University and Jerry Jacobs of the University of Pennsylvania, traditional gender roles are breaking down, and Americans are finally accepting the reality that the vast majority of women (70 percent) work outside the home.