All women should be feminists. Merriam-Webster defines feminism as “the theory of the