So I'm currently in a Ph.D. Nursing Education program, working on a course in advanced nursing theory. I was reading an article entitled "Theoretical development in the context of nursing-The hidden epistemology of nursing theory," which discusses the deprofessionalization of nurses and other allied health professions. This is the first time I've heard of this idea.
The homework is about concepts and nursing theory, not about the professionalization of healthcare workers, so this is not a homework question. I'll probably try to work an angle into the discussion board, something like "nurses no longer need theory because of deprofessionalization," just to see how my classmates react. In the meantime, I thought I'd ask what people here think about it.