Women have always been deeply involved in health care, but until the second half of the nineteenth century they were not allowed to become fully qualified doctors. Before then, especially in rural areas, women learnt from each other and from books how to diagnose illnesses, prepare medicines and practise as midwives. During the First World War, many medical schools opened their doors to women for the first time, although until recently women were effectively excluded from prestigious specialities such as surgery.