UITS will no longer provide printed copies. You will receive a file containing your item analysis along with your exam results.
Every exam must include a bubble sheet with the instructor’s netid. Without this sheet you will not receive any results.
Students must enter their Peoplesoft number beginning in column A of the Identification Number grid on their exam sheet.
You will continue to receive an email from SCHDSCN@uconn.edu with the subject “Secure Mail.” That email contains a link to your exam files, delivered as a zip file containing four files:
examreport.txt – Grades for each student, a summary of class performance, and the item analysis.
studentanswers.csv – A spreadsheet of all student scores and their answers to each question. This file can be easily imported into any spreadsheet program.
huskyct.csv – A slimmed-down version of the above spreadsheet that simplifies importing exam grades into HuskyCT.
rawdata.txt – A raw data file kept for archival purposes.
To upload your exam results to HuskyCT go to http://irc.uconn.edu/wpcontent/uploads/sites/77/2014/06/UploadBubbleSheetGrades.pdf for detailed instructions.
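As a sketch of how a file like studentanswers.csv might be processed outside a spreadsheet program, the snippet below scores each student against an answer key using only Python's standard library. The column names (name, id, q1, q2, ...) are illustrative assumptions, not the file's documented layout; check the header row of your actual file before adapting this.

```python
import csv
from io import StringIO

# Hypothetical layout: one row per student, with name, ID, then one column per answer.
SAMPLE = """name,id,q1,q2,q3
Smith,123456789,A,B,C
Jones,987654321,A,C,C
"""

KEY = ["A", "B", "C"]  # instructor's answer key, one entry per item

scores = {}
for row in csv.DictReader(StringIO(SAMPLE)):
    answers = [row[f"q{i + 1}"] for i in range(len(KEY))]
    # Count how many of this student's answers match the key.
    scores[row["name"]] = sum(a == k for a, k in zip(answers, KEY))

print(scores)  # Smith answered all 3 correctly; Jones missed q2
```

Replace SAMPLE with `open("studentanswers.csv")` and KEY with your own key to run this against a real results file.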
Procedure For Having Exams Scanned
This service is available to all University Departments. The system uses standard 8.5 x 11 Exam Scanning sheets (see Appendix 1).
USE ONLY BLUE BUBBLE SHEETS.
*Note: The UITS Scheduling department DOES NOT stock blank forms in bulk for security reasons. They also DO NOT have the staff to provide extensive consulting services. Please have all forms prepared before submitting them for scanning.
Blank exam forms are available through Central Stores. Departments should contact Central Stores (see “Obtaining Blank Scanner Forms”) with the necessary form number and order enough forms for an entire semester.
Bring the completed exam and its key sheets to UITS, Room M035, in an envelope marked with the professor’s name and unit number for processing. (See hours below.)
Hours Exams Can Be Received For Scanning – Drop box always available
Regular – Room M035
 Monday through Friday – 08:00 A.M. to 04:00 P.M.
 Saturday and Sunday – Closed
FINAL EXAMS
 Monday through Friday – 08:00 A.M. to 04:00 P.M.
 Saturday – Varies. See note below.**
 Sunday – Closed
** Saturday exam hours are determined by Saturday’s exam schedule.
Introduction
The purposes of this manual are to acquaint faculty with the Professor Test Scoring Service and to provide the procedures for scoring an exam. This manual also provides advice about how to interpret the various computer printouts that result from test scoring.
Professors are commonly concerned about the accuracy and fairness of their tests. Establishing the validity of a test is a fairly complicated process, but a simple
and practical criterion comes from the test scores themselves. If a test’s total score is used as an anchor, then each item may be judged against this anchor. For
example, did students who scored well on the whole test tend to get item 14 right? And did students who fared poorly on the whole exam tend to get item
14 wrong? What is the correlation between responses on item 14 and total test scores? Of course, if the whole test is flawed, these questions cannot give sensible answers. Statistical answers produced by the test scoring process cannot substitute for careful test preparation. But if the test has good coverage of content, and if items are carefully written, then the total test score serves as a sturdy anchor.
Receiving Exam Results Through E-Mail
UITS has set up a secure mail mechanism to allow exam results to be emailed.
YOU WILL NEED TO KNOW YOUR NETID AND PASSWORD TO USE THIS OPTION. If you do not know your netid or password, you can look up your information at http://netid.uconn.edu
In order to receive exam results you will be required to complete a ‘2’ key sheet (Key Sheet Number 2). The first three positions must be NID, followed by one space and your netid (example: NID PYP02099).
NOTE: To bubble in the numbers in your netid use the numbers in the ‘Identification Number’ grid. You would begin using the numbers in column B of the id number grid.
If you have any questions about this procedure, please call UITS Production Control at 860.486.3732.
Description of the Test Scoring System
Test and Item Description
The test must have a multiple-choice, matching, or true-false format. Each item may have up to five response options on the same answer sheet:
A, B, C, D, E or 1, 2, 3, 4, 5
Matching sets may contain no more than five items. Students answering true-false items are instructed to fill in A (or 1) for TRUE, and B (or 2) for FALSE.
The test scoring sheets have room for 200 responses. In reporting scores, the 200 responses are split into four subparts of 50 items each. That’s handy if you have two, three, or four separate parts of no more than 50 items per part. But if you intend to use only a single, total score, then you can ignore the part scores and use the total score. On the first scanning, the machine will print the four part scores in the front left position of the document, preceded by a serial number. Therefore, please indicate if the exam is being rescored, as the new scores will be printed further to the right of the original scores.
Preparing Answer Sheets
There are three types of key sheets to be used when preparing an exam.
 The first shows the department name, course title, date of exam, course number and section, and number of versions.
 The second is required and defines your NET ID. The exam results will be sent through a secure email mechanism.
 The third shows the instructor’s name, test version, and answers. There must be a corresponding key sheet for every version the students have indicated.
The following instructions must be followed explicitly.
NCS Exam Scanning Key Sheet Instructions
KEY SHEET NUMBER 1 – REQUIRED SHEET – DEPARTMENT SHEET (See Appendix 1) 

SHEET LOCATION – FIELD INFORMATION
In the NAME AREA (required) – DEPARTMENT NAME: 4-character abbreviated name.
In the IDENTIFICATION NUMBER AREA (required) –
 COURSE NUMBER (columns A–C): 3-digit numeric field. The current program allows only 3 digits; any three digits of your course number can be used.
 SECTION NUMBER (columns D–E): 2-digit numeric field. Enter “00”, or, if you want your exam sorted by section, enter two digits of your section number.
 KEY SHEET NUMBER (column J): 1-digit numeric field. Always enter a ‘1’ in this column.
In the SPECIAL CODES AREA (required) – VERSION NUMBER (column K): 1-digit numeric field. Enter 1 for one version, or 2–5 for multiple versions.
KEY SHEET NUMBER 2 – REQUIRED SHEET – NETID SHEET (See Appendix 2)
NO RESULTS WILL BE RECEIVED WITHOUT THIS SHEET!

SHEET LOCATION – FIELD INFORMATION
In the NAME AREA (required) – Starting in the first column, the characters “NID” followed by a space, then the first three characters of the NetID. The numeric part of the NetID is entered in the bottom portion of the sheet in the identification number area starting in column ‘B’. (Note: be sure to start in column ‘B’.)
In the IDENTIFICATION NUMBER AREA (required) – KEY SHEET NUMBER (column J): always enter a 2 in this column.
KEY SHEET NUMBER 3 – REQUIRED SHEET – INSTRUCTOR’S SHEET (See Appendix 3)
***A separate sheet must be filled out for every version of the exam***

SHEET LOCATION – FIELD INFORMATION
In the NAME AREA (required) – INSTRUCTOR’S NAME
In the IDENTIFICATION NUMBER AREA (required) – KEY SHEET NUMBER (column J): always enter a 3 in this column.
In the SPECIAL CODES AREA (required) – VERSION NUMBER (column K): 1-digit numeric field. Enter 1 when only one version is given. When multiple versions are used, instructor sheets MUST be in numerical order (1–5).
ANSWER SECTION (required) – Fill in as many answers as desired. Items left blank on the key will not be graded. Multiple answers are not allowed.
STUDENT ANSWER SHEET (See Appendix 4) 

SHEET LOCATION – FIELD INFORMATION
In the NAME AREA – STUDENT’S NAME
In the IDENTIFICATION NUMBER AREA – 9-digit numeric field (Peoplesoft number) (columns A–I). Column ‘J’ MUST NOT BE FILLED IN OR HAVE ANY STRAY MARKS IN IT.
In the SPECIAL CODES AREA (required) –
 EXAM VERSION (column K): If there are multiple versions, the exam version must be filled in. If there is only one version of the exam, column K need not be filled in.
 STUDENT SECTION NUMBER (columns L–M): If there are multiple sections, the section number must be coded in or the scanner program will default to 00. If there is one section, students need not fill it in.
Appropriate use of the “Identification Number”
In accordance with the Policy on the Use of the Social Security Number at the University of Connecticut, issued 9/1/2005, Social Security numbers will not be used in this field. Professors must have students bubble in their Peoplesoft number in the “Identification Number” field on the Student Answer Sheet to upload results to HuskyCT. (See Appendix 4).
Computer Printouts will no longer be produced as of 1/1/2015
The Item Analysis contains technical information about the test items and is extremely useful in revising a test. It shows how many students achieved any particular score, and what percentile rank is associated with any score. At the bottom of the frequency distribution are several summary statistics, including a coefficient termed INTERNAL CONSISTENCY. Internal consistency is an estimate of test homogeneity, and answers the question: “How well do the test items represent a single domain?” If the estimate is high (say, .75 or better), the items seem to be measuring the same property. If the estimate is low, then the test lacks homogeneity, and you should wonder whether it makes sense to total the items into a single score. This statistic may not be appropriate for criterion-referenced tests.
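The report does not state which internal-consistency formula it uses; for items scored right/wrong, the Kuder-Richardson 20 (KR-20) coefficient is one standard estimate of test homogeneity. The sketch below, using small illustrative data, shows how such a coefficient is computed from a student-by-item matrix of 0/1 responses.

```python
# Responses coded 1 = correct, 0 = incorrect; rows are students, columns are items.
# These four students and three items are illustrative, not real exam data.
responses = [
    [1, 1, 1],
    [1, 1, 0],
    [1, 0, 0],
    [0, 0, 0],
]

def kr20(matrix):
    """Kuder-Richardson 20 estimate of internal consistency."""
    n, k = len(matrix), len(matrix[0])
    totals = [sum(row) for row in matrix]
    mean = sum(totals) / n
    variance = sum((t - mean) ** 2 for t in totals) / n  # population variance of totals
    # Sum of p*q over items, where p is the proportion answering the item correctly.
    pq = 0.0
    for i in range(k):
        p = sum(row[i] for row in matrix) / n
        pq += p * (1 - p)
    return (k / (k - 1)) * (1 - pq / variance)

print(round(kr20(responses), 2))  # 0.75 for this illustrative matrix
```

On this toy matrix the estimate is .75, right at the manual's "high" threshold; a real exam's matrix would simply have more rows and columns.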
The core of the item analysis is presented next. Appendix 5 gives an example. All scores in the class are ranked from highest to lowest. The upper 27% and the lower 27% of the scores are set aside; the middle 46% is ignored for the time being.*** These upper and lower groups of students represent those who have performed well on the test and those who haven’t fared well. One indication of an item’s worth is whether it can distinguish these two groups. Under the columns headed A,B,C,D,E (or 1,2,3,4,5 if numeric responses were used) the behavior of the upper and lower groups is compared for each portion of an item. The correct answer among A,B,C,D,E is noted with a # sign.
Let’s look at item 1 on this sample test (Appendix 5). Option A is the correct answer and was chosen by 32 of 43 students. Options B, C & D are called distractors. Option B appears to be a good distractor because a total of 10 students selected it and 6 of these students were in the low group. Option C is not a good distractor because no student selected it; we note that Option D was selected by only 1 student. The last line under Item 1 labeled “Mean” gives the average score on the exam for those students selecting each option. Under the column headers WRONG and RIGHT, the total class is considered. This is merely information about how many got the item wrong or right, and the test means for these two groups.
All of these little comparisons of upper and lower groups are combined into a single index in the far right column of the page. The discrimination index*** (titled DISCRIM) tells what proportion of the upper group got the item right, minus the proportion of the bottom group. The index ranges from +1.00 (the whole top group got the item right; the whole bottom group got it wrong) to –1.00 (the entire upper group missed the item; the bottom group got it right). Obviously, a high discrimination index is a good sign. A negative index usually indicates a poorly constructed or ambiguous item, or it could be a sign that you have marked the wrong answer on your answer key. In general, coefficients of .20 and higher are suitable for achievement tests. Notice that item 1 has only fair discrimination because several people in the low group also selected Option A.
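The upper/lower-group calculation can be sketched in a few lines. The scores below are illustrative, and this simple version takes a rounded 27% cut without the tie-handling rule the footnote describes, so it is an approximation of what the report computes.

```python
# Each student: (total exam score, 1 if they got this item right, else 0).
# Illustrative data for one item on a 10-student exam.
students = [(48, 1), (45, 1), (44, 1), (40, 1), (38, 0),
            (35, 1), (30, 0), (28, 1), (25, 0), (20, 0)]

def discrimination(students, frac=0.27):
    """Proportion of upper group answering correctly minus proportion of lower group."""
    ranked = sorted(students, key=lambda s: s[0], reverse=True)
    g = max(1, round(len(ranked) * frac))  # size of the upper and lower groups
    upper, lower = ranked[:g], ranked[-g:]
    p_upper = sum(right for _, right in upper) / g
    p_lower = sum(right for _, right in lower) / g
    return p_upper - p_lower

print(round(discrimination(students), 2))  # 0.67: all of the top 3 but only 1 of the bottom 3 were right
```

Here the upper three students all answered correctly while only one of the bottom three did, giving an index of about .67, comfortably above the .20 guideline.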
On the right of the page, two other pieces of information surround the discrimination index. The EASINESS figure represents what percent of the whole class got the item right.**** Easiness figures are challenging to interpret. For example, an item may show that 97% of a class got it right. Do you ascribe that to your
stellar teaching, or have you written an exceedingly easy item?
Finally, the R=PBISER stands for the Point-BISERial correlation. This is a special case of the correlation coefficient most of us are familiar with. It shows the relationship between getting an item right and the total test score. So, item 1’s point-biserial coefficient of .42 shows a modest connection between performance on this item and performance on the test as a whole. If an item ever gave a point-biserial correlation of 1.00, that item would be equivalent to the entire test!
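Under its standard definition, the point-biserial coefficient can be computed directly from total scores and per-item correctness. The sketch below uses illustrative data and the population standard deviation; the report's exact computational convention is not stated in this manual.

```python
import math

# Total exam scores paired with 1/0 correctness on one item (illustrative data).
totals = [48, 45, 44, 40, 38, 35, 30, 28, 25, 20]
right  = [ 1,  1,  1,  1,  0,  1,  0,  1,  0,  0]

def point_biserial(totals, right):
    """r_pb = (M1 - M0) / s * sqrt(p * q), the standard point-biserial formula."""
    n = len(totals)
    mean = sum(totals) / n
    sd = math.sqrt(sum((t - mean) ** 2 for t in totals) / n)  # population SD
    p = sum(right) / n          # proportion who got the item right
    q = 1 - p
    m1 = sum(t for t, r in zip(totals, right) if r) / sum(right)        # mean score, item right
    m0 = sum(t for t, r in zip(totals, right) if not r) / (n - sum(right))  # mean score, item wrong
    return (m1 - m0) / sd * math.sqrt(p * q)

print(round(point_biserial(totals, right), 2))
```

A positive value here means the students who got the item right also tended to score higher on the exam overall, which is exactly what the DISCRIM column checks in a coarser way.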
*** When two or more scores fall at the 27th or the 73rd percentile, all of the tied scores are dumped into the upper or lower category. Thus, the claim of exactly 27% of the scores is accurate only when there is a single score at those percentile ranks.
**** Please note that the EASINESS index was formerly called the “difficulty” index, and was the inverse of the number presently used.
APPENDIX 1
APPENDIX 2
APPENDIX 3
APPENDIX 4