Post by Lovelace on Apr 27, 2016 2:15:41 GMT
Growing up, I was very religious, mainly because my mother was, and I always wanted to please her. We would go to CCD and church every Sunday, and it made her happy, so I tried to be happy in the process.
Yet as I grew older and learned more, I started to relate to the church less and less. It was getting tedious and boring, and I became more wary of the fact that all the leaders were old white men.
I remember one Sunday before Lent when Father handed out a list of sins and started reading them aloud. I was cringing at some of them:
-Abortion
-Advising someone to have an abortion
-ANY sex before marriage
-Laziness
Etc. etc. etc.
When did the religion that was supposed to be unbiased, that was supposed to let Jesus and God love us all, suddenly get so political? Why were they dictating what I should believe and what I should see as moral and right?
Since coming to college, I have lost more and more interest in organized religion; I feel like I can decide what I believe without old white men deciding it for me. I still feel guilty when my mom wants us to go to church, as my siblings and I have all lost interest. I'm still a spiritual guy, and I believe there is something out there, but I don't need the church to control my faith for me.