Hi everyone! I have noticed that different people in society hold very different views of nursing. Some think nursing is a "dirty job," meaning it involves many unpleasant tasks. Others think nurses are "overworked and underpaid." What can we do, as members of the nursing profession, to change these views?