OK. So I just gave my first med-surg test online, and the majority of the class scored between 70% and 75%. I thought I had prepped them well--I chose questions that are most practical to actual clinical practice. I made a study guide specifically targeted to the questions on the test, and in a review game I even included some of the actual test questions and answers.
They never really taught us in my MSN program how to decide which questions to throw out based on how many people got them wrong. I will look at the questions and responses to see if there were traps that students fell into when answering, or if they just did not understand the question.
Any guidelines on when to throw out a question--like if 75% or 50% got something wrong?
Mar 7, '13
One of the biggest things that I remember from my MSN program was to never teach to a test. When I give a test, I do an item analysis to see how many students miss each question, and if 80% miss the same question I take another look at it and decide from there whether to throw it out. I just gave an exam in pediatrics on Monday, and one question, accompanied by a picture, asked where you would see substernal retractions. Most students (90%) picked the wrong answer. This is very important to know, and it was very obvious where the substernal retractions should be seen, so I did not throw that one out.
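For anyone who wants to do this "how many missed it" check systematically, here is a minimal sketch of computing the difficulty index (the fraction of students who got each item right) from a response matrix. The question ids and data below are made up for illustration:

```python
# Hypothetical responses: one dict per student, mapping
# question id -> True (correct) / False (incorrect).
responses = [
    {"q1": True,  "q2": False, "q3": True},
    {"q1": True,  "q2": False, "q3": False},
    {"q1": False, "q2": False, "q3": True},
    {"q1": True,  "q2": True,  "q3": True},
]

def difficulty_index(responses, question):
    """Fraction of students who answered the question correctly.
    A very low value (e.g. only 20% correct) flags an item worth
    re-examining before deciding whether to throw it out."""
    correct = sum(1 for r in responses if r[question])
    return correct / len(responses)

for q in ("q1", "q2", "q3"):
    print(q, round(difficulty_index(responses, q), 2))
```

A low difficulty index alone doesn't mean the item is bad -- as the substernal-retractions example shows, sometimes the content is essential and the students simply didn't know it.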
It really is up to your judgement.
Another tip that has worked great--and I am seeing better test grades as a result--is to allow the students to use handwritten notes for their exams. These have to be written in their own handwriting. By writing the notes out, they retain more information than they do by just reading. It has been a great study tool, and they really do not even go back to look at their notes when they take the tests.
Hope this helps.
Mar 8, '13
If you have the software to do it, you can run the statistics to see if there is a good correlation between the answer given on that question and the overall score for the test. If the people who did better on the test in general got this question wrong, that question is probably not a good indicator as to which students know their stuff and which don't. If the people who scored well on the test as a whole did pretty well on that particular question, then that question is doing a good job of distinguishing between those who know the material and those who don't. (It might just be more difficult than the average question -- which might be something you want to keep. You want to have a few difficult -- but good -- questions on tests so that you can identify the A students from the C ones.)
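The statistic described above is usually called the point-biserial correlation between a 0/1 item score and the total test score. If your testing software doesn't report it, it can be computed by hand; the data and function name below are illustrative, not from any particular package:

```python
import statistics

def point_biserial(item_scores, total_scores):
    """Correlation between a 0/1 item score and the total test score.
    Positive values mean stronger students tend to get the item right;
    values near zero or negative suggest a flawed item."""
    n = len(item_scores)
    mean_total = statistics.mean(total_scores)
    sd_total = statistics.pstdev(total_scores)
    p = sum(item_scores) / n          # proportion who got the item right
    q = 1 - p
    # mean total score among students who answered the item correctly
    mean_correct = statistics.mean(
        t for i, t in zip(item_scores, total_scores) if i == 1
    )
    return (mean_correct - mean_total) / sd_total * (p * q) ** 0.5

item = [1, 1, 0, 1, 0, 0, 1, 1]          # hypothetical item scores
totals = [88, 92, 61, 79, 55, 70, 95, 84]  # hypothetical test totals
print(round(point_biserial(item, totals), 2))
```

A difficult question with a solidly positive point-biserial is exactly the kind of item worth keeping to separate the A students from the C students.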
If you don't have the software to run those item analysis statistics, you can do a little of that manually. Identify the top 5 or 10 student scores on the test ... and then see how they answered that particular question. That will give you a clue as to how the better students in the class perceived it. Sometimes, after considering all the possibilities ... I conclude that the question was good and legitimate, but just difficult. I try to have at least a couple of those on each test so that the best students will get better grades than those who didn't do the work.
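The manual check described above can be formalized as the classic upper-lower discrimination index: the fraction of a top group who got the item right minus the fraction of a bottom group who did. A rough sketch, with made-up student data:

```python
# Each student is a (total_score, answers) pair, where answers maps
# question id -> True/False. All data here are hypothetical.
students = [
    (95, {"q7": True}),  (90, {"q7": True}),  (85, {"q7": True}),
    (80, {"q7": False}), (75, {"q7": True}),  (60, {"q7": False}),
    (55, {"q7": False}), (50, {"q7": True}),  (45, {"q7": False}),
    (40, {"q7": False}),
]

def discrimination_index(students, question, group_size=5):
    """(upper-group fraction correct) - (lower-group fraction correct).
    Values near +1 mean the item separates strong from weak students;
    near zero or negative suggests a problem with the item."""
    ranked = sorted(students, key=lambda s: s[0], reverse=True)

    def fraction_correct(group):
        return sum(1 for _, ans in group if ans[question]) / len(group)

    return fraction_correct(ranked[:group_size]) - fraction_correct(ranked[-group_size:])

print(discrimination_index(students, "q7"))
```

Here 4 of the top 5 and only 1 of the bottom 5 got the item right, so the index is 0.6 -- the item discriminates well even if it was missed by half the class.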
I also look at whether or not the students chose the same "wrong" answer. Is there any legitimate reason why they chose it? Sometimes, they simply read/interpret a question differently from what I intended. If that's the case, then I didn't write it as well as I thought, and I need to take some responsibility for that.
Sometimes, I conclude that I didn't teach that content very well -- and I alter my teaching methods.
Other times, I conclude that lots of the students didn't study very much and didn't do "their part" in the learning process. In those cases, I am not afraid to let the bad grades stand so that students can learn a valuable lesson about the consequences of not putting forth an effort to learn the material.
You might want to get a book (or find some good articles) on test construction and analysis. There is an art and science to constructing a good test and writing good test questions. It may be worth investing in a little continuing education for yourself in this area.
Mar 23, '13
We stopped this in our school because we felt giving them a "cheat sheet" (even one they wrote themselves) gave them an advantage they would not have when taking the NCLEX, and it did not help them learn to think critically through clinical situations.