Any advice would be wonderful!
I've recently accepted the fact that I feel nursing might be for me. I love the thought of working to nurse patients back to health, helping families, working hard, and staying busy instead of sitting all day. I love how the nurses who helped me during my hospital stays made me feel. I'm just afraid because I have met people who are bitter about nursing... saying it's all politics, that you're treated with zero respect, and that you're basically a glorified bedpan changer. Is nursing only cleaning up patients and getting abused? From what I knew, I thought there was more "action".
I don't want to go through the huge ordeal of going to school and working my tail off to earn the degree while I have small children, only to find the career isn't what it's cracked up to be. I suppose I am just nervous. Can anyone give me advice?
Is nursing more than bed pans and being demeaned?
Is it worth it?
I can handle the dirty work, but I guess I just want to be reassured that as a nurse I will be more than a servant: someone who makes educated decisions and makes a difference.
Also, do you believe this is a good career for a family woman? I'm hoping to still be able to properly mother my babies and be an intimate wife to my husband. I don't want a job where I never see my family.
Please don't think I am being negative; I am just trying to find some reassurance after the negative advice and commentary I have received.
Kaitybar7
Also, don't you think me taking the initiative to ask actual nurses is doing proper research?