It's almost like the fashion is to remove references to the biological role of women, and language specific to women, from common use! As a feminist, I think it's scary and may have to do with trying to either placate or include men who feel excluded from what is, essentially, a process which only affects biological women. I'm sure someone has much more historical knowledge than me, but if I recall from what I've read, women managed the process of giving birth with the aid of other women until male doctors came along and decided to medicalise the whole thing.