giving you credit on difficult exam questions??


Hi!

I am just wondering if your nursing instructors review the exams after they are taken. Like, do they give you credit on particular questions where a lot of students didn’t do well? Or do you just submit the exam and that’s it, no review?

The nursing instructors review the exams in the program I’m in now... But when I was in this same program a few years ago, they did not do that. Is this a new thing, or a not-so-common thing? I’m just super curious.

Specializes in Physiology, CM, consulting, nsg edu, LNC, COB.

Answering as former faculty here. Some faculty offer pre- and post-exam reviews. Smart students attend both. Interestingly, I found that the better students were the ones who attended my reviews. Chicken or egg?
You can always do an exam review with your faculty member on an individual basis, either before the exam to clear up questions or misunderstandings you have, and afterwards, to go over the things you weren’t clear on and get suggestions for further study. 
As to your question about “giving credit on particular questions where a lot of students don’t do well,” some programs may not allow that. Seen from another perspective, if 75% of the class gets one item wrong, and you give them credit for it anyway, what message does that send to the 25% who learned the material? 
We were required to use a massive item bank for our exams, all items that had been tested for accuracy and rated hard, medium, or easy based on testing results by the psychometric test development company that developed them. For my first exam I selected 10% hard, 65% medium, and 25% easy, and made sure my classes and reading assignments covered everything carefully. Passing was 75, and 1/3 of the class failed. For the second exam: 10% hard, 50% medium, 40% easy, and still 1/4 of the class failed. None of the failing students had attended any review sessions or come to office hours for extra help.


But I had them saying that if X number of questions were “thrown out” they would have passed. I heard them out on this, let them explain how they had figured this out to the fourth decimal place, and how unfair it was because they had “paid a lot of money to come to this school.” (This was at a community college program subsidized by the hospital, and their tuition was half what anybody else was paying.) Then there was silence.


“OK, then, what you’re saying is that I should let you pass and take care of sick and injured people or my mom or kid when you can’t get a minimum of three out of four questions right, and most of them not even hard? And because you paid tuition you’re automatically entitled to sit for licensure? If you spent half the time studying or coming to review sessions that you spent on these detailed calculations, you’d probably pass on the merits.”

I know this isn’t exactly what you’re asking, but here’s some more advice. Nursing faculty WANT you to learn and succeed at competence. If your faculty doesn’t offer review sessions, ask them to. Start up a study group, divide up the material, and share so you all get the benefits; nursing is a collaborative profession, so this is good practice too. If somebody tells you they don’t have time for this but they still b**** about how hard it is, ask yourself how committed they are to learning what they have to know to BE A NURSE, or whether they should reconsider, do something else, or come back when life allows.

Off soapbox, LOL. 

Specializes in oncology.
18 hours ago, ktweis said:

The nursing instructors review the exams in the program I’m in now... 

I established test review sessions on the printed and distributed course calendar before the course even started. I did this because some students could get childcare paid for 'if it was something required for class.' Yes, I think test review is required. Our tests always had the rationale established before the test was sent to be copied. I believe test review is vital. I did credit some questions if the class score was low: sometimes because the questions were incorrectly written, sometimes because the material was not covered well. Many students would complain they got that 'credited' question right. I did a tally of the outcome and found it really did balance out; everyone achieved the credit in equal amounts.

At one time I instituted a 'student success plan' for reviewing tests. We had an established test review session, and students could also request to review the test in our office for 2 weeks after the test. The students who completed EVERY test review received 5 points (5 tests). If you only attended 4, 3, 2, or 1, you did not receive bonus points. I also developed a tool for students to review why they may have gotten their answers wrong: erased, vocabulary issue, etc. This tool was developed to give them insight. I had many faculty who taught other nursing courses complain about those measly 5 points (5 bonus out of 300). I did the best thing I could do: I retired.

1 hour ago, Hannahbanana said:

Nursing faculty WANT you to learn and succeed at competence

I can say, in 40 years of nursing education, that some faculty do not share this view. Faculty are so worried that they might have to fail a student for clinical performance that they will do anything to fail them on classroom scores.

Specializes in CNA, Nursing Student.

They will only drop questions if they review and almost everyone missed it because the question was unclear or had multiple answers that were arguably the best choice. 

Specializes in Physiology, CM, consulting, nsg edu, LNC, COB.
On 2/27/2021 at 4:28 PM, peach2218 said:

They will only drop questions if they review and almost everyone missed it because the question was unclear or had multiple answers that were arguably the best choice.

This is unlikely if the faculty uses the kind of test bank I described. These questions have already been tested for confusion or bad writing, and found to be valid measures of learning. So don't hang your hat on this idea.

However, this sort of experience is good feedback for the faculty if there are multiple people WRITING exam items. It's harder than you think, and they might learn to improve.

Our instructors do a statistical analysis on our tests after everyone has taken them, and then throw out a question if more than 50% of the class missed it. This generally results in about one question getting thrown out per 50 question test. 
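For anyone curious what that kind of post-exam item analysis looks like in practice, here is a minimal sketch in Python. The response data and the 50% cutoff are hypothetical, just mirroring the rule described above:

```python
# Hypothetical item analysis: flag exam questions missed by more than
# half the class. Responses are marked 1 (correct) or 0 (incorrect)
# per student, per question.
responses = {
    "Q1": [1, 1, 0, 1, 1, 1, 0, 1],   # 2 of 8 missed -> keep
    "Q2": [0, 0, 1, 0, 0, 1, 0, 0],   # 6 of 8 missed -> flag
    "Q3": [1, 0, 1, 1, 0, 1, 1, 1],   # 2 of 8 missed -> keep
}

MISS_THRESHOLD = 0.50  # throw out items missed by >50% of the class

def flagged_items(responses, threshold=MISS_THRESHOLD):
    """Return question IDs whose miss rate exceeds the threshold."""
    flags = []
    for qid, marks in responses.items():
        miss_rate = marks.count(0) / len(marks)
        if miss_rate > threshold:
            flags.append(qid)
    return flags

print(flagged_items(responses))  # -> ['Q2']
```

On a 50-question exam, a rule like this flagging roughly one item per test is consistent with what's described above.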

Specializes in Physiology, CM, consulting, nsg edu, LNC, COB.

This is appropriate if they are creating items without psychometric backup. Many standardized tests (like the SAT, GRE, certification exams, and NCLEX) include some items that are being tested for inclusion in a future version of the test for exactly that purpose. This is why you always answer every question as best you can, even if it sounds like a “bad question,” without letting it make you go crazy, because it might be one of those.

Specializes in oncology.
On 2/26/2021 at 9:04 AM, Hannahbanana said:

We were required to use a massive item bank for our exams, all items that had been tested for accuracy and rated hard, medium, or easy based on testing results by the psychometric test development company who developed them.

Quote

This is unlikely if the faculty uses the kind of test bank I described. These questions have already been tested for confusion or bad writing, and found to be valid measures of learning. So don't hang your hat on this idea

Are you saying your test questions came from an outside source? 

Quote

by the psychometric test development company who developed them.

I gotta say the test development company hoodwinked you and probably got a lot of $$$ in the process. And I don't know any school that has a massive test bank unless they are acquiring it outside the faculty; then test security goes out the window... Our questions were developed by faculty who were teaching the material, and we had at least a decade or more of data to back up the stats. We worked with a psychometric testing expert and software program, but neither she nor the company wrote our tests. She taught us how to improve our tests. We also had running stats for how students had done the last semester and ALL semesters. This kind of data does not come from an outside source, and it helps you evaluate your curriculum by comparing what is taught with testing outcomes. Unfortunately, faculty, current practice, lecture hours, and textbooks change, and I do not believe any test bank is fail-safe. In addition, with the trends and changes in nursing practice, additional questions need to be added.

I just gotta add: in 40 years of writing nursing tests I have attended dozens of test-writing workshops that all "had a new way" to test. From adding case studies in a series of questions that may or may not depend on the last answer, verbiage such as "all of the following," writing "which of these" versus "which," underlining, no underlining, adding names for the patient ("Mrs. Smith has..."), adding charts for students to get information from, those who said essays were the way to go, adding true/false, etc. My eyes just glaze over at the newest expert.

 

Specializes in Physiology, CM, consulting, nsg edu, LNC, COB.
12 hours ago, londonflo said:

Are you saying your test questions came from an outside source? ... I gotta say the test development company hoodwinked you and probably got a lot of $$$ in the process. ...

Precisely. Let me help you understand how we (and many other testing entities like schools and certifying bodies) do this. It may not change your belief but at least you’ll have some facts to consider.

Exam security: the test item bank was on a computer that was offline and isolated from any outside content. Also see below. 
I disagree with your unstated assumption that faculty excel at test-item writing skills. I have had extensive education and experience in this at the instruction level and for professional certification examinations, including coaching others, and I can tell you that it’s not as easy as you think. Subject experts are often lousy item writers because they are famously unable to put themselves in the mindset of the naive learner. They often write confusing distractors because of the way they set up the stem, for example. This is why a psychometrically validated item set is so valuable, with the added bonus that faculty didn’t have to spend time creating test items of uneven quality.

While you were obviously dismayed at the ED consultants telling you how to improve your test construction, removing sources of unconscious bias (gender, age, marital status, ethnicity, or other “color” to enliven a setup) makes for a better test item. True-false sections are good at seeing how learners discern assumptions or “always/never” decision points. Each type of test item looks at different cognitive processes, and all are important. 
We constructed our exams before we gave our didactic instruction, thus assuring that our exams measured performance on our planned learning objectives: we made sure that we covered the material we planned to test.
Our exams were not online, but printed out on paper the day of the exam with scannable mark-sense answer sheets. These lent themselves admirably to exactly the kind of ongoing analyses you mention, which we did do. This included stats on distractor placement (a, b, c, or d) and correct-answer placement; it’s surprising how often patterns emerge, most often from unconscious bias on the item writers’ part. The test bank also included annual updates to account for changes in practice or new knowledge PRN. 
I hope this clarifies my earlier posts for you. I can assure you that we took exam content, security, instruction quality, and evaluation very seriously and were quite satisfied with this method of facilitating all of them. 
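The answer-placement statistics mentioned above can be illustrated with a short Python sketch. The answer key here is invented, purely to show the kind of pattern a tally can surface:

```python
from collections import Counter

# Hypothetical answer key for a 20-item exam. A well-balanced key spreads
# correct answers roughly evenly across a, b, c, and d; a cluster at one
# letter suggests unconscious placement bias by the item writers.
answer_key = list("ccacbccdacbccadccbcc")

def placement_counts(key):
    """Tally how often each option letter holds the correct answer."""
    return Counter(key)

counts = placement_counts(answer_key)
expected = len(answer_key) / 4          # an even split would be 5 per letter
for letter in "abcd":
    n = counts.get(letter, 0)
    marker = "  <- check for placement bias" if n > 2 * expected else ""
    print(f"{letter}: {n}{marker}")
```

With this invented key, "c" holds 12 of the 20 correct answers, far above the even-split expectation of 5, which is exactly the sort of pattern the ongoing analyses described above are meant to catch.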

Specializes in oncology.
4 hours ago, Hannahbanana said:

Exam security: the test item bank was on a computer that was offline and isolated from any outside content.

Ours also.

4 hours ago, Hannahbanana said:

I disagree with your unstated assumption that faculty excel at test-item writing skills.

I did not say this. I said that faculty wrote their own questions, were able to receive critiques from a nationally known test expert (who developed the standardized entrance exam that your college probably used), and EACH question had to be approved by 3 other faculty members (not in the course) for clarity, item bias, etc.

4 hours ago, Hannahbanana said:

I have had extensive education and experience in this at the instruction level and for professional certification examinations, including coaching others, and I can tell you that it’s not as easy as you think.

Yes, I have submitted test questions to NCSBN and Excelsior and have had them accepted. No, it is not easy. I have only written test items that would be used for prelicensure. I would think writing for professional certification exams would muddy one's thinking about what a student/new nurse should know.

4 hours ago, Hannahbanana said:

Subject experts are often lousy item writers because they are famously unable to put themselves in the mindset of the naive learner.

So non-subject experts write better questions??

4 hours ago, Hannahbanana said:

While you were obviously dismayed at the ED consultants telling you how to improve your test construction, removing sources of unconscious bias (gender, age, marital status, ethnicity, or other “color” to enliven a setup) makes for a better test item. True-false sections are good at seeing how learners discern assumptions or “always/never” decision points. Each type of test item looks at different cognitive processes, and all are important. 

 

 

No, I was not dismayed. In fact, I learned something in every expert-provided session. I also have graduate credits for test writing and test administration from a top-10 university. I just added the comments about the evolution of nursing testing because everyone is so sure they know item construction, but it has varied over my time in nursing education. Where were you in the last 40 years, while we were learning the nuances and trends of testing and paying a consultant?

Also, we were active in committing testing content to writing and establishing the paper test to be administered at a future date, prior to the lecture delivery.

4 hours ago, Hannahbanana said:

Our exams were not online, but printed out on paper the day of the exam with scannable mark-sense answer sheets

Yes, we have never given an online test, but we do use Scantron.

4 hours ago, Hannahbanana said:

This included stats on item distractor placement (a, b, c, or d) and correct answer placement; it’s surprising how often patterns emerge as most often unconscious bias on the item writers’ part influences these.

Yes, I found this important too. If a distractor was never chosen by any student, it was discussed for possible revision... and you?

4 hours ago, Hannahbanana said:

The test bank also included annual updates to account for changes in practice or new knowledge PRN. 

We admitted classes twice a year and were required to do item analysis with each test and update the test bank every semester.

Frankly, I find your pompous attitude insulting, and the fact that you are not actively teaching at the prelicensure level says a lot.

Best wishes in your current and future endeavors. Education may have dodged a bullet.

Specializes in oncology.
3 hours ago, Hannahbanana said:

While you were obviously dismayed at the ED consultants telling you how to improve your test construction, removing sources of unconscious bias (gender, age, marital status, ethnicity, or other “color” to enliven a setup) makes for a better test item. True-false sections are good at seeing how learners discern assumptions or “always/never” decision points. Each type of test item looks at different cognitive processes, and all are important. 

Of course I know this; I have been in nursing education for 40 years. You are not telling me anything new. I have taught in diploma, ADN, and BSN programs. I was involved with developing a 2+2 BSN program (the second in the US) in the early 80s and was instrumental in establishing another 2+2 program in the mid-80s.

Specializes in Physiology, CM, consulting, nsg edu, LNC, COB.

As ever, when we post here we aren't always posting to one person, but providing info and/or background to people coming from many different perspectives. Thank you for taking the time to engage.
