Is it important to be a working woman to earn respect in society or in the family? I work day in and day out, but at times I feel that if I were working outside the home and earning, my family would have more respect for me. Does working full-time affect the way people look at you?