I worked in a previous state where we were never allowed to obtain informed consent; that was the doctor's or surgeon's job. Now, at the hospital I'm working at in Texas, it's routine for the nurse to have the patient fill out the consent form. I was told that in Texas the doctor can write an order for the nurse to obtain consent. Does anyone know if this is true? I'm not really sure where to look to find the answer, and it makes me uncomfortable.