I love bears. I really do. I love their faces and noses and ears and paws and stubby tails. I love how they make me feel cozy and safe when I go to bed, or when I watch TV, or when I’m just alone in my room. I still love shows and things from my childhood like The Care Bears or the Gummi Bears. They take me back to simpler times when this adult life gets to be too much to handle. Nothing brings me peace quite the way my bears do.
But I’ve noticed a trend in recent years, and saying that “it offends me” is putting it very lightly. I’ve witnessed countless people in the media, in entertainment, and even in my personal life (of all places!) who assert that bears are dangerous or deadly. I’ve heard news reports talk about chasing “bears” out of urban areas. There was a movie called Grizzly Man that is one of the most hateful pieces of propaganda I’ve ever seen in my life. I’ve seen nature documentaries that push these ideas by hiding from and avoiding “bears” in the wild.
I put “bears” in quotes there because what they’re showing isn’t really bears at all. They’re trying to put the idea in our minds that this animal is a bear:
That’s so wrong it makes me red with anger. THIS is a bear, and if you think otherwise you need to really do some self-searching as to why you desire such savagery and viciousness:
Why would anyone push the idea that a bear is anything else? Why would you WANT bears to be anything else?
I ask you: who are they to try to tell the rest of us what bears are, what bears do, or how “dangerous” they can be?! How dare they! No bear that *I* know would ever attack or eat a human.
Bears are not scary.
Bears do not eat people.
Bears do not harm anyone or anything. They love people and comfort them and bring joy.
If you want to keep pushing hateful ideas like some have about bears, I guess I can’t stop you, but I want you to know that those are only opinions and MY bears would never do any of those things.