Did Christian Nursing Schools teach you anything special?

I am a nursing student and a Christian. I go to a state school in a very Christian area, so many of the teachers and students in my program also come from a Christian background. Of course, though, we don't learn about Christian aspects of nursing because it is a state school.

I am wondering, for those of you who did go to Christian nursing schools, did they teach you anything special? Like how Christians view nursing, how to deal with patient death, or how to look at suffering in the world.

A lot of people used to tell me that if you had an illness, you either didn't have enough faith or you had somehow sinned. I really don't agree with this; I don't think it is what Jesus taught, and I want to stop having nursing school remind me of how people used to blame me for my dad dying of cancer when I was a child. (I'm not even sure what they thought I did wrong! But it still hurts.)

I think the best way to do this is to get another perspective.

There are some Christian schools that incorporate Christian values into nursing, although I personally don't attend one of these schools.

I agree that illness is not always because of one person's sin or lack of faith, although sometimes it can be. Biblically, the world was broken and filled with suffering when Adam and Eve disobeyed God. Although this wasn't God's original plan, He uses suffering for His glory, often in ways we cannot perceive or understand.

I hope to take some classes in the future about Christianity and how it influences patient care as nurses. I still have a lot to learn.

I attended a Christian nursing school. Nobody pushed any strange beliefs on us. Definitely no blaming people. In fact, I remember my psych teacher adamantly saying it is always the rapist's fault, not the victim's.

I think the main difference in a Christian education is that they assume you are a believer and that you are in nursing school because you feel it is God's plan for you. Therefore, they see nursing as a calling and an opportunity to share the gospel & minister to others. In fact, they would often share old-school tidbits in case we ended up serving in a third-world country.

Specializes in ICU, Postpartum, Onc, PACU.

I'm two classes away from a religion minor because we were required to take so many religion classes at my private Christian school. In the nursing program itself, though, I don't remember anything specific being taught about incorporating religion into our lives as nurses. I grew up in that particular religion and am pretty much a low-key Christian (if not borderline agnostic), and a lot of the people there were from the community. Some of them weren't even Christian, let alone members of the school's specific religion.

They might have put a Bible verse at the beginning or end of a syllabus and prayed in class once in a while, but that was about it. They knew we were getting our religion fill from the numerous other religion classes that were mandatory, I guess :D

I don't believe (and neither does that particular religion I grew up with) that people are sick or ill because they or someone close to them has sinned. If you're gonna go there, we all have sinned and therefore we should all be stricken down with something awful and be miserable.

Some religious people refuse medical care for their child while wearing prescription glasses themselves, so there's a lot of hypocritical thinking going on in those situations, and it turns people off to religion in general (from what I've seen).

I always felt like I could ask a nursing teacher about an issue I might have within nursing, but what did I know back then? I had no situations to ponder since I hadn't encountered anything personally, but that was the freedom the school gave us, in retrospect.

I always think of a friend of mine (the daughter of one of the teachers at that school) who said, "It would be fun if, when Jesus comes back, he says that all this time wasted on holy wars, arguing about which religion is right, and belittling people for not believing as we do was all for nothing, and that all he wanted us to do was be good people and kind to each other." :saint:

xo