Abstract
To measure the inter-expert and intra-expert agreement in sleep spindle scoring, and to quantify how many experts are needed to build a reliable dataset of sleep spindle scorings.

The EEG dataset comprised 400 randomly selected 115 s segments of stage 2 sleep from 110 sleeping subjects in the general population (age 57 ± 8 years, range: 42-72 years). To assess expert agreement, a total of 24 Registered Polysomnographic Technologists (RPSGTs) scored spindles in a subset of the EEG dataset at a single electrode location (C3-M2). Intra-expert and inter-expert agreements were calculated as F1-scores, Cohen's kappa (κ), and intra-class correlation coefficients (ICC).

We found an average intra-expert F1-score agreement of 72 ± 7% (κ: 0.66 ± 0.07). The average inter-expert agreement was 61 ± 6% (κ: 0.52 ± 0.07). Amplitude and frequency of discrete spindles were calculated with higher reliability than the estimation of spindle duration. Reliability of sleep spindle scoring can be improved by using qualitative confidence scores, rather than a dichotomous yes/no scoring system. We estimate that 2-3 experts are needed to build a spindle scoring dataset with 'substantial' reliability (κ: 0.61-0.8), and 4 or more experts are needed to build a dataset with 'almost perfect' reliability (κ: 0.81-1).

Spindle scoring is a critical part of sleep staging, and spindles are believed to play an important role in development, aging, and diseases of the nervous system.
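The two agreement metrics reported above can be computed directly from a pair of dichotomous yes/no scorings. The sketch below is illustrative only: the per-epoch rater vectors are hypothetical, not data from this study, and the paper's actual event-level matching procedure is not reproduced here.

```python
# Minimal sketch of the two agreement metrics named in the abstract,
# applied to hypothetical per-epoch yes/no spindle scorings.

def f1_agreement(a, b):
    """F1-score treating rater `a` as reference and rater `b` as test."""
    tp = sum(1 for x, y in zip(a, b) if x == 1 and y == 1)  # both scored a spindle
    fp = sum(1 for x, y in zip(a, b) if x == 0 and y == 1)  # only b scored one
    fn = sum(1 for x, y in zip(a, b) if x == 1 and y == 0)  # only a scored one
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

def cohens_kappa(a, b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(a)
    p_o = sum(1 for x, y in zip(a, b) if x == y) / n  # observed agreement
    p_a1 = sum(a) / n  # rater A's "spindle" rate
    p_b1 = sum(b) / n  # rater B's "spindle" rate
    p_e = p_a1 * p_b1 + (1 - p_a1) * (1 - p_b1)  # expected chance agreement
    return (p_o - p_e) / (1 - p_e)

# Hypothetical scorings (1 = spindle present, 0 = absent) from two raters
rater_a = [1, 0, 1, 1, 0, 0, 1, 0]
rater_b = [1, 0, 0, 1, 0, 1, 1, 0]
print(f1_agreement(rater_a, rater_b))   # 0.75
print(cohens_kappa(rater_a, rater_b))   # 0.5
```

Note that κ is systematically lower than raw percent agreement whenever chance agreement is high, which is why the abstract reports both.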
DOI: 10.1016/j.clinph.2014.10.158
PubMed ID: 25434753
PubMed Central ID: PMC4426257