The Centers for Disease Control and Prevention (CDC) has identified nine attributes of surveillance systems that can be assessed to gauge a system's overall performance.
Simplicity

Definition: Ease of operation of the surveillance system (e.g., ease of data collection and reporting). Refers to both the system's structure and its ease of operation.
Type of attribute: Qualitative
Indicators & measures: Not standardized
Other considerations: Can be assessed by mapping the information flow in the surveillance system (a useful tool is BPMN)1
Examples:
- Number and type of data sources reporting to the system
- Number of organizations involved in reporting
Questions to ask:
- Are the system's design, elements, and levels difficult to understand and describe from an external perspective?
- Are users able to operate the system easily, at all levels?
Flexibility

Definition: Ability of the surveillance system to adapt to changing information needs or operating conditions with little additional time, personnel, or funding.
Type of attribute: Qualitative
Indicators & measures: Observational
Other considerations:
- A practical way to assess flexibility is to review how the surveillance system responded to new diseases, integrations, or disruptions.
- Simpler systems (see Simplicity) tend to be more flexible.
Examples:
- Number of components that need modification when adding a new disease
- Time required to change a reporting source
Questions to ask:
- Does the system use data standards and formats that allow integration with other systems (interoperability)?
- How much effort (time, staff, funds) is required to adapt the system to changing needs?
Quote from paper2: “In four weeks (from week 5 to week 9 of 2018), the system was able to adapt to the epidemiological situation with minimal additional resources and personnel. Indeed, updates were not made in the IT systems of each Eds’ but at the level of the national ANSP server (by one person). This surveillance system was also flexible thank to the reactivity of ED physicians who timely implemented coding of visits related to dengue fever.”
Data Quality

Definition: Completeness and validity of the data recorded in the surveillance system.
Type of attribute: Quantitative
Indicators & measures:
- Completeness: percentage of unknown or blank responses
- Validity: percentage of plausible values (e.g., no impossible dates, sex, or age values)
Other considerations: Influenced by case definitions, clarity of forms, training of reporters, and data management practices. Also influenced by the absence of validation logic in electronic data-entry forms.
Examples:
- Percentage of ICD-10 codes missing in a surveillance system based on a routine health information system
- Percentage of impossible (out-of-range) age and date values in patient visit data
Questions to ask:
- What percentage of the data in the system is missing or invalid?
- Are case definitions, forms, training, and data management optimized for quality?
Quote from paper3: “Over 99% of cases had complete fields for treatment start date, age, sex, type of TB patient, site of disease, and initial sputum smear result. However, a high percentage of cases had blank values for HIV test date (47%), treatment regimen (12%), and treatment outcome date (30%).”
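The completeness and validity indicators above reduce to simple percentages over the record set. The sketch below shows one way to compute them; the record layout, field names, and plausibility range are illustrative assumptions, not part of any real surveillance system.

```python
from datetime import date

# Hypothetical patient-visit records; field names are illustrative only.
records = [
    {"age": 34, "sex": "F", "visit_date": date(2023, 5, 1)},
    {"age": None, "sex": "M", "visit_date": date(2023, 5, 2)},  # missing age
    {"age": 212, "sex": "F", "visit_date": date(2023, 5, 3)},   # impossible age
    {"age": 60, "sex": "", "visit_date": date(2030, 1, 1)},     # blank sex
]

def completeness(records, field):
    """Percent of records with a non-blank value for `field`."""
    filled = sum(1 for r in records if r[field] not in (None, ""))
    return 100 * filled / len(records)

def validity_age(records, low=0, high=120):
    """Percent of non-missing ages falling in a plausible range."""
    ages = [r["age"] for r in records if r["age"] is not None]
    valid = sum(1 for a in ages if low <= a <= high)
    return 100 * valid / len(ages)

print(f"Age completeness: {completeness(records, 'age'):.0f}%")  # 75%
print(f"Age validity:     {validity_age(records):.0f}%")         # 67%
```

In practice the plausibility ranges and blank-value conventions come from the system's data dictionary, and the same checks can be built into electronic forms as validation logic.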
Acceptability

Definition: Willingness of persons and organizations to participate in the surveillance system.
Type of attribute: Mixed
Indicators & measures:
- Participation rates
- Completeness of forms
- Timeliness of reporting
Other considerations:
- Reflects perceptions of the interaction between users and the surveillance system.
- Affected by the importance of the health event, the burden on reporters, confidentiality protections, etc. Statutory reporting requirements can aid acceptability.
Examples:
- Number or percentage of individuals reporting timely information to the system
- In-depth interviews with users about their motivation, or lack thereof, for using the system
Questions to ask:
- What percentage of reporting organizations participate? What are the participation rates in surveys?
- How complete and timely is reporting by organizations and individuals?
- Are participants satisfied with the system and willing to participate?
Quote from paper4: “Regarding the acceptability domain of the CMSS, the results showed that 44% of experts believed that the surveillance system questionnaire was filled by patients’ experts and 44.5% of the experts believed that the deceased child’s families did not have the necessary cooperation to complete the questionnaire information”
Sensitivity

Definition:
- Proportion of disease cases detected by the system.
- Ability to detect outbreaks and monitor changes in the number of cases.
Type of attribute: Quantitative
Indicators & measures: Proportion of total cases detected
Other considerations:
- The emphasis is on estimating the proportion of all individuals with the condition that the surveillance system detects.
- Influenced by health behavior, diagnostics, and reporting.
- Sensitivity can be assessed using capture-recapture methods and other data sources.
Example: Proportion of cases detected by the system out of total cases identified by an external, verified source (see secondary data analysis)
Quote from paper5: “The proportion of states with HPAI detected by the surveillance system compared to the number of states with actual outbreaks of HPAI, which is the sensitivity of the system was 15.4%…”
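As a rough illustration of the capture-recapture approach mentioned above, the sketch below uses the Chapman two-source estimator to estimate the total number of cases from two partially overlapping sources, and then computes the system's sensitivity against that estimate. The counts are invented, and the estimator assumes the two sources are independent.

```python
def chapman_estimate(n_system, n_other, n_both):
    """Chapman two-source capture-recapture estimate of total cases:
    N = (M + 1)(C + 1) / (R + 1) - 1, where M and C are the counts in
    each source and R is the count appearing in both."""
    return (n_system + 1) * (n_other + 1) / (n_both + 1) - 1

def sensitivity(n_system, n_other, n_both):
    """Proportion of estimated true cases captured by the surveillance system."""
    return n_system / chapman_estimate(n_system, n_other, n_both)

# Illustrative counts: 120 cases in the surveillance system, 90 in an
# independent source (e.g., a laboratory registry), 60 appearing in both.
print(f"Estimated total cases: {chapman_estimate(120, 90, 60):.0f}")  # 180
print(f"System sensitivity:    {sensitivity(120, 90, 60):.1%}")       # 66.8%
```

Record linkage between the two sources (deciding which cases are "in both") is usually the hard part in practice, and dependence between sources biases the estimate.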
Predictive Value Positive

Definition: Proportion of reported cases that truly have the health-related event under surveillance.
Type of attribute: Quantitative
Indicators & measures: Same as the definition
Other considerations:
- PVP affects resource allocation; a low PVP can lead to misdirected resources.
- Affects the initiation of potentially unnecessary outbreak investigations.
- Calculating PVP requires confirming reported cases, sometimes against external data sources such as medical records, registries, and death certificates.
Example: Proportion of confirmed cases out of all cases reported to the system
Questions to ask:
- How often are reported cases confirmed as true cases?
- What proportion of identified outbreaks are genuine?
- How does PVP affect resource allocation?
Representativeness

Definition: The extent to which the system accurately describes the occurrence of a health-related event over time and its distribution in the population by place and person.
Type of attribute: Quantitative
Indicators & measures: Numbers or rates of disease by age, sex, residence, geography, etc.
Other considerations:
- Surveillance data should reflect the health-related event's characteristics of time, place, and person.
- Evaluating representativeness can identify excluded subgroups, leading to improvements in data collection.
- Awareness of the strengths and limitations of the system's data is crucial.
Example: Proportion of geographic areas reporting data to the surveillance system
Questions to ask:
- How well do the data reflect the actual distribution of the health-related event?
- Are certain population subgroups systematically excluded?
- How consistent are the data sources over time?
Quote from paper6: “The results of the present study also suggested that the contributions of the private and public sectors to the surveillance of TB are significantly different, and therefore, the representativeness of the reported cases of TB to the CDSS is seriously under question.”
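One simple screen for representativeness is to compare each subgroup's share of reported cases against its share of the population. The sketch below does this by region with invented numbers; a ratio well below 1 may signal under-reporting, though it can also reflect genuinely lower incidence in that subgroup, so this is a prompt for investigation rather than proof.

```python
# Hypothetical case counts and population shares by region.
cases = {"North": 500, "South": 300, "East": 150, "West": 50}
population = {"North": 0.40, "South": 0.30, "East": 0.15, "West": 0.15}

total_cases = sum(cases.values())
for region, pop_share in population.items():
    case_share = cases[region] / total_cases
    ratio = case_share / pop_share  # ~1.0 if reporting mirrors population
    print(f"{region}: {case_share:.0%} of cases vs "
          f"{pop_share:.0%} of population (ratio {ratio:.2f})")
```

Here "West" contributes 15% of the population but only 5% of reported cases (ratio 0.33), which would justify a closer look at reporting coverage in that region.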
Timeliness

Definition: The speed between steps in a public health surveillance system.
Type of attribute: Quantitative
Indicators & measures:
- Time interval between the onset of a health-related event and its reporting to the relevant public health agency
- Time required to identify trends, outbreaks, or the effect of control and prevention measures
Other considerations:
- The relevant time interval can vary with the type of health-related event.
- Factors affecting timeliness include the patient's recognition of symptoms, acquisition of medical care, the attending physician's diagnosis, and laboratory and reporting processes.
- Electronic data collection and interoperability can enhance timeliness.
Examples:
- Time between a positive laboratory test and the case being reported to the surveillance system
- Time between a patient's onset of symptoms and the laboratory result
Questions to ask:
- How quickly are health-related events reported after their onset?
- How long does it take to identify trends or outbreaks?
Quote from paper7: “Analyzing the three key date intervals in the system reporting process shows that most timeliness variance between the three periods occurs in the first 3 days from when the specimen is collected. The interval between the specimen date and laboratory report dated … was the longest interval in each period”
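Timeliness indicators are typically summarized as the distribution of date differences between consecutive steps. A minimal sketch, using invented specimen and report dates:

```python
from datetime import date
from statistics import median

# Hypothetical (specimen_date, report_date) pairs for three cases.
pairs = [
    (date(2023, 3, 1), date(2023, 3, 3)),
    (date(2023, 3, 2), date(2023, 3, 7)),
    (date(2023, 3, 5), date(2023, 3, 6)),
]

# Delay in whole days for each case.
delays = [(report - specimen).days for specimen, report in pairs]
print(f"Median specimen-to-report interval: {median(delays)} days")  # 2 days
```

The same pattern applies to any pair of steps (symptom onset to specimen, laboratory result to notification); medians and percentiles are usually preferred over means because reporting delays are right-skewed.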
Stability

Definition: The reliability (ability to collect, manage, and provide data properly without failure) and availability (being operational when needed) of the public health surveillance system.
Type of attribute: Mixed
Indicators & measures:
- Number of unscheduled outages and amount of system downtime
- Costs associated with system repairs
- Percentage of time the system is fully operational
- Desired versus actual time required for data collection, management, and release
Other considerations: Stability can be impacted by a lack of dedicated resources, such as workforce shortages.
Examples:
- Average number of hours of system downtime per month
- Percentage of personnel who have completed all required trainings
Questions to ask:
- How often does the system experience unscheduled outages?
- What are the costs associated with system repairs?
- Is the system fully operational most of the time?
Quote from paper8: “All hospitals have a sufficient number of computers, and any one of these computers can be used to install JSANDS. The 12 computers that were used for running JSANDS in the pilot hospitals were functional and did not need repair during the period of implementation.”
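The "percentage of time fully operational" indicator is a straightforward uptime calculation; the downtime figure below is hypothetical.

```python
def availability(downtime_hours, period_hours=30 * 24):
    """Percent of the period the system was operational
    (default period: a 30-day month)."""
    return 100 * (period_hours - downtime_hours) / period_hours

# Illustrative: 6 hours of unscheduled downtime in a 30-day month.
print(f"Availability: {availability(6):.2f}%")  # 99.17%
```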
Reference List:
1 Dente MG, Riccardo F, Bolici F, Colella NA, Jovanovic V, Drakulovic M, Vasic M, Mamlouk H, Maazaoui L, Bejaoui M, Zakhashvili K, Kalandadze I, Imnadze P, Declich S; MeSA Working Group. Implementation of the One Health approach to fight arbovirus infections in the Mediterranean and Black Sea Region: Assessing integrated surveillance in Serbia, Tunisia and Georgia. Zoonoses Public Health. 2019 May;66(3):276-287. doi: 10.1111/zph.12562. Epub 2019 Feb 5.
2 Vilain P, Vincent M, Fouillet A, Mougin-Damour K, Combes X, Vague A, Vaniet F, Filleul L, Menudier L. Flexibility of ED surveillance system to monitor dengue outbreak in Reunion Island. Online J Public Health Inform. 2019 May 30;11(1):e389. doi: 10.5210/ojphi.v11i1.9872.
3 Sharma A, Ndisha M, Ngari F, Kipruto H, Cain KP, Sitienei J, Bloss E. A review of data quality of an electronic tuberculosis surveillance system for case-based reporting in Kenya. Eur J Public Health. 2015 Dec;25(6):1095-7. doi: 10.1093/eurpub/ckv092. Epub 2015 May 25.
4 Bahardoust M, Rajabi A, Barakati SH, Naserbakht M, Ghadami S, Talachian E, Motevalian SA. Evaluation of Timeliness, Simplicity, Acceptability, and Flexibility in Child Mortality Surveillance System for Children Aged 1-59 Months in Iran. Int J Prev Med. 2019 Nov 28;10:205. doi: 10.4103/ijpvm.IJPVM_452_18.
5 Waziri NE, Nguku P, Olayinka A, Ajayi I, Kabir J, Okolocha E, Tseggai T, Joannis T, Okewole P, Kumbish P, Ahmed M, Lombin L, Nsubuga P. Evaluating a surveillance system: live-bird market surveillance for highly pathogenic avian influenza, a case study. Pan Afr Med J. 2014 Jul 21;18 Suppl 1(Suppl 1):11. doi: 10.11694/pamj.supp.2014.18.1.4188.
6 Kazerooni PA, Nejat M, Akbarpoor M, Sedaghat Z, Fararouei M. Underascertainment, underreporting, representativeness and timeliness of the Iranian communicable disease surveillance system for tuberculosis. Public Health. 2019 Jun;171:50-56. doi: 10.1016/j.puhe.2019.03.008. Epub 2019 May 14.
7 Clare T, Twohig KA, O’Connell AM, Dabrera G. Timeliness and completeness of laboratory-based surveillance of COVID-19 cases in England. Public Health. 2021 May;194:163-166. doi: 10.1016/j.puhe.2021.03.012. Epub 2021 Apr 1.
8 Khader Y, Alyahya M, El-Khatib Z, Batieha A, Al-Sheyab N, Shattnawi K. The Jordan Stillbirth and Neonatal Mortality Surveillance (JSANDS) System: Evaluation Study. J Med Internet Res. 2021 Jul 21;23(7):e29143. doi: 10.2196/29143.