Dear Maria,
Historically, women have been left out of--or at least marginalized within--the traditional medical field. For instance, women weren't even included as subjects in medical research until the 1970s, and today it's still far too common for women to be treated with medicines and medical practices that were tested on male subjects.
Knowing this, it has unfortunately become women's responsibility to take care of themselves and to ensure that their own health is taken seriously.
However, this doesn't mean that we shouldn't also lobby the traditional medical field to include women--it just means we have to do both. Women's health also implies children's health, since women remain the primary caregivers, so women taking control of their own health often means taking care of children's health, too.
Good luck with your essay.
--Amy