So maybe I'll be starting a ranting series on here, because honestly, all of us need one. Lately, I've noticed some opinions going around on these topics, so here are mine:
I, for one, strongly believe in feminism. People give it a bad reputation and make all feminists look like they hate men, but that's not true! I don't understand why men are STILL seen as the boss of everything. Women are here! Women should be heard! "Doing it like a girl" shouldn't be an insult! The wage gap is incredibly stupid too. If you're doing the same job as a white man, you should be paid the same amount as a white man.
Also, black. lives. matter! Sure, all lives matter, but isn't it CLEAR that black lives are the ones being targeted? Why is it so bad to give them recognition? So many have been killed with no explanation. They shouldn't have been. They deserve to live just as much as anyone else.

Not all terrorists are Muslims, and not all Muslims are terrorists! This one hits home because I am Muslim. You wouldn't believe how many times I've been called a terrorist. People have this perception that Muslims are all bad people, but nobody ever pays attention to the good things they do. Real Muslims are such peaceful people, I promise you! Groups like ISIS aren't true Muslims; they kill people for their "religion," which is so wrong and shouldn't happen. And if you say "all Muslims are terrorists" and you happen to be a Christian, remember that the KKK was a Christian terrorist group. Does that make you a terrorist? Of course not! So saying "all Muslims are terrorists" is incredibly false, and don't listen to news sources like Fox, which are known to be biased. Don't fall for it!
So those were a few of my opinions. I just needed to rant about this topic :)