Test Item Analysis
Statistical information on test effectiveness helps faculty members pinpoint student performance issues and improve course assessments. Blackboard offers an item analysis for tests that provides statistics on individual questions and on overall test performance, helping you create useful assessments, adjust credit on current attempts, and gauge how well a test supports course objectives.
Item analysis can be run on:
- Tests deployed in a content area
- Test columns in the Grade Center
Run Test Item Analysis
1. Select “Item Analysis” from the drop-down menu next to the test title.
2. Click “Run” on the page that follows.
3. Click the report title link to display the report.
The Test Summary provides test statistics that include the following:
- Possible Points – total number of points for the test.
- Possible Questions – the total number of questions in the test.
- In Progress Attempts – the number of students currently taking the test who have not yet submitted it.
- Completed Attempts – the number of submitted tests.
- Average Score – the average score reported for the test in the Grade Center. An asterisk (*) next to a score indicates that some attempts have not been graded, so the average score might change after all attempts are graded.
- Average Time – the average completion time for all submitted attempts.
- Discrimination – indicates the number of questions that fall into the Good (greater than 0.3), Fair (between 0.1 and 0.3), and Poor (less than 0.1) categories. A discrimination value is listed as Cannot Calculate when the question’s difficulty is 100% or when all students receive the same score on a question.
- Difficulty – the percentage of students who answered the question correctly; indicates the number of questions that fall into the Easy (greater than 80%), Medium (between 30% and 80%), and Hard (less than 30%) categories. A brief computation sketch for these two statistics follows the note below.
NOTE: Item analysis uses only graded attempts in its calculations.
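To make the Difficulty and Discrimination statistics concrete, here is a minimal Python sketch of how such values are commonly computed. Blackboard does not document its exact formulas in this post, so the use of a Pearson (point-biserial) correlation for discrimination, along with the sample scores, is an assumption for illustration only; difficulty follows the definition above (percentage of students answering correctly).

```python
# Illustrative sketch only -- not Blackboard's actual implementation.
# item_scores: one score per graded attempt for a single question.
# total_scores: each student's total test score for the same attempts.

def difficulty(item_scores, max_points=1.0):
    """Percentage of students who answered the question correctly."""
    correct = sum(1 for s in item_scores if s == max_points)
    return 100.0 * correct / len(item_scores)

def discrimination(item_scores, total_scores):
    """Pearson (point-biserial) correlation between item score and total
    test score -- an assumed method for the Discrimination statistic."""
    n = len(item_scores)
    mean_i = sum(item_scores) / n
    mean_t = sum(total_scores) / n
    cov = sum((i - mean_i) * (t - mean_t) for i, t in zip(item_scores, total_scores))
    var_i = sum((i - mean_i) ** 2 for i in item_scores)
    var_t = sum((t - mean_t) ** 2 for t in total_scores)
    if var_i == 0 or var_t == 0:
        # Everyone scored the same on the item (or on the test),
        # which corresponds to a "Cannot Calculate" value.
        return None
    return cov / (var_i * var_t) ** 0.5

# Hypothetical data: 1 = correct, 0 = incorrect for one question
# across five graded attempts, plus each student's total score.
item = [1, 1, 0, 1, 0]
totals = [95, 88, 60, 75, 52]
print(f"Difficulty: {difficulty(item):.0f}%")                 # 60% -> Medium
print(f"Discrimination: {discrimination(item, totals):.2f}")  # about 0.91 -> Good
```

In this made-up example, 60% of students answered the question correctly (Medium difficulty), and the question correlates strongly with total test performance (about 0.91, Good discrimination).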
The Item Analysis Report page also lets you access the Test Canvas to edit the test if necessary; click the link in the upper right-hand corner of the report screen.
For a more detailed item analysis report, use the filter choices at the top of the report box to drill down to specific questions. For example, you can display only questions that contain metadata such as categories or difficulty designations.
For more information on adding metadata to test questions, see related assessment tutorials on this blog.
Shortlink for this post: http://ids.commons.udmercy.edu/?p=560