What actually is feminism, and why is it still important?
Over the last couple of months, the F word... feminism has popped up in numerous discussions in my house, with my male housemates adamantly claiming that they won't call themselves feminists because "it's a bunch of aggressive women" or "it's just about women's problems".
As a proud feminist, I took it upon myself to shed some light on this misunderstood topic. Stereotyping feminists is a big issue, and what I found is that, when asked, not one of my housemates could actually define what feminism was.
Why is that a problem? Because how can you automatically assume something is bad or "man-hating" if you don't actually know what it is?
So let's start with a basic definition. According to the IWDA, "Quite simply, feminism is about all genders having equal rights and opportunities". They highlight it as respecting diversity of experiences, identities, knowledge and more, championing a level playing field for all. This includes "intersectional feminism", which acknowledges the interconnectedness between gender and other forms of discrimination, for example those based on faith, sexual identity or race.
I am by no means forcing anyone to call themselves a feminist if they don't want to; ultimately, everyone has their own choices to make. However, we all have a responsibility to learn about wider opinions and issues before judging. Feminism is so often viewed as a dirty word, but what it truly means is working towards equality for all...
For all those fellow or future feminists out there, I think the key to breaking the stigma is educating others on what it truly means. Should you want to read some very interesting feminist literature, I have a few suggestions that I found really eye-opening when I started delving into the subject.