IRMS Lecture w3 BB PDF
Quality of Measurement Instruments; Introduction to SPSS
Introduction to Research Methods & Statistics
2013 – 2014
Hemmo Smit
Overview
Read:
- Leary: Chapter 3 (pp. 53-70)
Two aspects of the quality of a measure: reliability and validity
Reliability coefficient
- Lies between 0 and 1
- Rule of thumb: .70 or higher is sufficient
- NOTE: for diagnostic purposes, higher reliability is required!
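In classical test theory terms (the framework behind Leary's treatment of reliability), the coefficient can be read as the proportion of total variance that is systematic (true-score) variance; a minimal sketch, assuming that decomposition:

```latex
% Reliability as the proportion of systematic (true-score) variance
% in the total variance of observed scores; bounded between 0 and 1.
r_{xx} = \frac{\sigma^2_{\text{true}}}{\sigma^2_{\text{total}}}
       = \frac{\sigma^2_{\text{true}}}{\sigma^2_{\text{true}} + \sigma^2_{\text{error}}},
\qquad 0 \le r_{xx} \le 1
```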
Determine with repeated measurement
1) Test-retest reliability
- For one measurement or a whole instrument
- Measure twice and compare the outcomes (see the sketch after this list)
- Consistency of a measurement over time
4) Replication
- For a whole study
- Repeat the whole study and compare the outcomes
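A minimal sketch of a test-retest check in Python, using hypothetical scores (the data are only illustrative; the .70 rule of thumb is the one given above):

```python
import numpy as np

# Hypothetical scores for the same 8 respondents, measured at two points in time.
time1 = np.array([12, 15, 9, 20, 17, 11, 14, 18], dtype=float)
time2 = np.array([13, 14, 10, 19, 18, 10, 15, 17], dtype=float)

# Test-retest reliability: the correlation between the two administrations.
r_test_retest = np.corrcoef(time1, time2)[0, 1]
print(f"Test-retest reliability: {r_test_retest:.2f}")  # rule of thumb: >= .70
```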
Internal Consistency
Repeated measurement:
- Each item is a small measurement instrument
- All items are parallel test forms of each other
1) Item-total correlation
2) Split-half reliability
3) Cronbach's Alpha
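A minimal sketch of these three estimates in Python, on hypothetical item scores (NumPy only, not SPSS output; the split-half step includes the usual Spearman-Brown correction, which the slide does not show explicitly):

```python
import numpy as np

# Hypothetical item scores: 6 respondents x 4 items (e.g., 5-point ratings).
items = np.array([
    [4, 5, 4, 5],
    [2, 3, 2, 2],
    [5, 5, 4, 4],
    [3, 3, 3, 4],
    [1, 2, 2, 1],
    [4, 4, 5, 5],
], dtype=float)
n_items = items.shape[1]
total = items.sum(axis=1)

# 1) Item-total correlation: each item vs. the sum of the remaining items.
for i in range(n_items):
    rest = total - items[:, i]
    r = np.corrcoef(items[:, i], rest)[0, 1]
    print(f"Item {i + 1} corrected item-total correlation: {r:.2f}")

# 2) Split-half reliability: correlate the two half-test scores,
#    then apply the Spearman-Brown correction for full test length.
half1 = items[:, ::2].sum(axis=1)   # items 1 and 3
half2 = items[:, 1::2].sum(axis=1)  # items 2 and 4
r_halves = np.corrcoef(half1, half2)[0, 1]
split_half = 2 * r_halves / (1 + r_halves)
print(f"Split-half reliability (Spearman-Brown): {split_half:.2f}")

# 3) Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of total score).
item_vars = items.var(axis=0, ddof=1)
alpha = n_items / (n_items - 1) * (1 - item_vars.sum() / total.var(ddof=1))
print(f"Cronbach's alpha: {alpha:.2f}")
```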
Assessing Cronbach’s α
α          Assessment
< .60      Insufficient
.60 – .80  Reasonable
> .80      Good
Sources of Measurement Error
1) transient states
2) stable attributes
3) situational factors
4) characteristics of the measure
5) mistakes
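In classical test theory notation, these sources all end up in the error part of an observed score; a sketch, assuming that decomposition:

```latex
% Observed score = true score + measurement error,
% where the error term collects transient states, situational factors,
% characteristics of the measure, recording mistakes, etc.
X_{\text{observed}} = X_{\text{true}} + E_{\text{measurement}}
```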
Validity
[Figure: systematic bias as a component of observed score X]
Note: Validity requires reliability, but not the other way around
Validity of Measurement Instruments (1)
1) Face validity
Does it appear to measure what it’s supposed to measure?
2) Content validity
- Does the measure cover all aspects of the construct?
- Requires independent observers
Note: Not in Leary
Validity of Measurement Instruments (2)
3) Construct Validity
Does a measure relate to other measures as it should?
a) Convergent validity: strong correlations with instruments that measure comparable or opposing constructs (see the sketch after this list)
b) Discriminant validity: weak or no correlation with instruments that measure different constructs
4) Criterion-Related Validity
Does a measure relate to a particular behavioral criterion?
a) Concurrent Validity: present behavior
b) Predictive Validity: future behavior
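A rough sketch of how convergent and discriminant validity could be checked numerically; the scales and data below are hypothetical, chosen only to illustrate the expected correlation pattern:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100

# Hypothetical scores on a new anxiety scale and two comparison instruments.
new_anxiety_scale = rng.normal(size=n)
established_anxiety_scale = new_anxiety_scale * 0.8 + rng.normal(scale=0.6, size=n)
shoe_size = rng.normal(size=n)  # measures an unrelated construct

# Convergent validity: strong correlation with a measure of a comparable construct.
r_convergent = np.corrcoef(new_anxiety_scale, established_anxiety_scale)[0, 1]

# Discriminant validity: weak / no correlation with a measure of a different construct.
r_discriminant = np.corrcoef(new_anxiety_scale, shoe_size)[0, 1]

print(f"Convergent (new vs. established anxiety scale): {r_convergent:.2f}")
print(f"Discriminant (new anxiety scale vs. shoe size): {r_discriminant:.2f}")
```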
Validity of a Study
Read:
- Leary: Chapter 6
- Howell: Chapter 2