While I'm not going to say that all romance novels are great paragons of literature or don't have problematic (oh SO problematic) gender relationships, the majority have moved past rape=love. I think most of the current stigma comes from the fact that, as meganbmoore said, it's fiction published by and for women, focusing on emotions, which (as a trait gendered female) our culture values less than books about how much pain a middle-class white male suffers in this country.