I keep reading here that nurses are constantly walked on and berated by doctors. That doctors seem to have a God complex and see themselves as superior and all-knowing, as opposed to the nurses, whom they seem to perceive as lowly and worthless. Well, has anyone here ever stood up to a doctor that's been treating them like dirt? If so, what took place, and were the consequences good or bad?