women's fiction

Women's fiction is an umbrella term for women-centered books that focus on women's life experiences and are marketed to female readers; it includes many mainstream novels. It is distinct from women's writing, which refers to literature written by (rather than marketed to) women. No comparable label exists in English for works of fiction marketed to men.

The Romance Writers of America organization defines women's fiction as "a commercial novel about a woman on the brink of life change and personal growth. Her journey details emotional reflection and action that transforms her and her relationships with others, and includes a hopeful/upbeat ending with regard to her romantic relationship."
