Dataset columns: id (string, 16-27 characters), title (string, 18-339 characters), abstract (string, 95-38.7k characters), category (string, 7-44 characters)
10.1101/2020.11.03.20225144
Characteristics of those most vulnerable to employment changes during the COVID-19 pandemic: a nationally representative cross-sectional study in Wales
Background: The public health response to the SARS-CoV-2 (COVID-19) pandemic has had a detrimental impact on employment and there are concerns the impact may be greatest amongst the most vulnerable. We examined the characteristics of those who experienced changes in employment status during the early months of the pandemic. Methods: Data were collected from a cross-sectional, nationally representative household survey of the working-age population (18-64 years) in Wales in May/June 2020 (N=1,379). We looked at changes in employment and being placed on furlough since February 2020 across demographics, contract type, job skill level, health status and household factors. Chi-squared or Fisher's exact tests and multinomial logistic regression models examined associations between demographics, subgroups and employment outcomes. Results: Of our respondents, 91.0% remained in the same job in May/June 2020 as they were in February 2020, 5.7% were now in a new job, and 3.3% experienced unemployment. In addition, 24% of our respondents reported being placed on furlough. Non-permanent contract types, low self-reported mental wellbeing and household financial difficulties were all significant factors in experiencing unemployment. Being placed on furlough was more likely in younger (18-29 years) and older (60-64 years) workers, those in lower skilled jobs and those from households with less financial security. Conclusion: A number of vulnerable population groups were observed to experience detrimental employment outcomes during the initial stage of the COVID-19 pandemic. Targeted support is needed to mitigate both the direct impacts on employment and the indirect impacts on financial insecurity and health. What is already known on this subject? (1) The response to the current global pandemic caused by SARS-CoV-2 (COVID-19) is already having a significant impact on people's ability to work and their employment status. (2) Emerging UK employment data have raised concerns about the disproportionate impact on specific demographic groups. What this study adds: (1) Groups that reported higher proportions of being placed on furlough included younger (18-29 years) and older (50-64 years) workers, people from more deprived areas, those in lower skilled jobs, and those from households with less financial security. (2) Job insecurity in the early months of the COVID-19 pandemic was experienced more by those self-employed or employed on atypical or fixed-term contract arrangements compared to those holding permanent contracts. (3) To ensure that health and wealth inequalities are not exacerbated by COVID-19 or the economic response to the pandemic, interventions should include the promotion of secure employment and target the groups identified as most susceptible to the emerging harms of the pandemic.
public and global health
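The Wales survey above analyses a three-level employment outcome (same job, new job, unemployed) with multinomial logistic regression. As a rough, hedged illustration of that kind of model, here is a minimal scikit-learn sketch on toy data; the column names and values are invented stand-ins for the survey fields named in the abstract, not the study's data or specification.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Toy stand-in for the survey: contract type, wellbeing and household finances
df = pd.DataFrame({
    "contract": ["permanent", "fixed-term", "permanent", "atypical", "permanent", "fixed-term"] * 50,
    "low_wellbeing": np.tile([0, 1, 0, 1, 0, 1], 50),
    "financial_difficulty": np.tile([0, 1, 1, 1, 0, 0], 50),
    "outcome": ["same job", "unemployed", "same job", "new job", "same job", "unemployed"] * 50,
})

X = pd.get_dummies(df[["contract", "low_wellbeing", "financial_difficulty"]], drop_first=True)
y = df["outcome"]

# With three outcome classes, the default lbfgs solver fits a multinomial model
model = LogisticRegression(max_iter=1000).fit(X, y)

# Exponentiated coefficients give odds-ratio-style effects per outcome class
effects = pd.DataFrame(np.exp(model.coef_), index=model.classes_, columns=X.columns)
print(effects.round(2))
```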
10.1101/2020.11.02.20224782
Prevalence Of COVID-19 In Rural Versus Urban Areas in a Low-Income Country: Findings from a State-Wide Study in Karnataka, India
Although the vast majority of confirmed cases of COVID-19 are in low- and middle-income countries, there are relatively few published studies on the epidemiology of SARS-CoV-2 in these countries. The few there are focus on disease prevalence in urban areas. We conducted state-wide surveillance for COVID-19 in both rural and urban areas of Karnataka between June 15 and August 29, 2020. We tested for both viral RNA and antibodies targeting the receptor binding domain (RBD). Adjusted seroprevalence across Karnataka was 46.7% (95% CI: 43.3-50.0), including 44.1% (95% CI: 40.0-48.2) in rural and 53.8% (95% CI: 48.4-59.2) in urban areas. The proportion of those testing positive on RT-PCR ranged from 1.5 to 7.7% in rural areas and 4.0 to 10.5% in urban areas, suggesting a rapidly growing epidemic. The relatively high prevalence in rural areas is consistent with the higher level of mobility measured in rural areas, perhaps because of agricultural activity. Overall seroprevalence in the state implies that at least 31.5 million residents had been infected by August, nearly an order of magnitude more than the number of confirmed cases.
infectious diseases
10.1101/2020.11.02.20223941
Predicting the Clinical Management of Skin Lesions using Deep Learning
Automated machine learning approaches to skin lesion diagnosis from images are approaching dermatologist-level performance. However, current machine learning approaches that suggest management decisions rely on predicting the underlying skin condition to infer a management decision without considering the variability of management decisions that may exist within a single condition. We present the first work to explore image-based prediction of clinical management decisions directly without explicitly predicting the diagnosis. In particular, we use clinical and dermoscopic images of skin lesions along with patient metadata from the Interactive Atlas of Dermoscopy dataset (1,011 cases; 20 disease labels; 3 management decisions) and demonstrate that predicting management labels directly is more accurate than predicting the diagnosis and then inferring the management decision (13.73 ± 3.93% and 6.59 ± 2.86% improvement in overall accuracy and AUROC respectively), statistically significant at p < 0.001. Directly predicting management decisions also considerably reduces the over-excision rate as compared to management decisions inferred from diagnosis predictions (24.56% fewer cases wrongly predicted to be excised). Furthermore, we show that training a model to also simultaneously predict the seven-point criteria and the diagnosis of skin lesions yields an even higher accuracy (improvements of 4.68 ± 1.89% and 2.24 ± 2.04% in overall accuracy and AUROC respectively) of management predictions. Finally, we demonstrate our model's generalizability by evaluating on the publicly available MClass-D dataset and show that our model agrees with the clinical management recommendations of 157 dermatologists as much as they agree amongst each other.
dermatology
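The dermatology entry above compares management decisions predicted directly from images against management decisions inferred from a predicted diagnosis, including the resulting over-excision rate. The small sketch below shows one way such a comparison could be scored; the label names and the diagnosis-to-management lookup are illustrative assumptions, not the paper's taxonomy or pipeline.

```python
import numpy as np

# Hypothetical label spaces; the real dataset uses 20 diagnoses and 3 management decisions
DIAG_TO_MGMT = {
    "nevus": "no further action",
    "seborrheic keratosis": "no further action",
    "atypical nevus": "clinical follow-up",
    "basal cell carcinoma": "excision",
    "melanoma": "excision",
}

def compare_strategies(y_mgmt_true, y_mgmt_direct, y_diag_pred):
    """Score direct management prediction vs. management inferred from a predicted
    diagnosis, and report how often each strategy wrongly recommends excision."""
    y_true = np.asarray(y_mgmt_true)
    direct = np.asarray(y_mgmt_direct)
    inferred = np.array([DIAG_TO_MGMT[d] for d in y_diag_pred])

    def over_excision(pred):
        return np.mean((pred == "excision") & (y_true != "excision"))

    return {
        "accuracy_direct": float(np.mean(direct == y_true)),
        "accuracy_inferred": float(np.mean(inferred == y_true)),
        "over_excision_direct": float(over_excision(direct)),
        "over_excision_inferred": float(over_excision(inferred)),
    }
```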
10.1101/2020.11.03.20221978
Altered Cortical Thickness Development in 22q11.2 Deletion Syndrome and Association with Psychotic Symptoms
Schizophrenia has been extensively associated with reduced cortical thickness (CT), and its neurodevelopmental origin is increasingly acknowledged. However, the exact timing and extent of alterations occurring in preclinical phases remain unclear. With a high prevalence of psychosis, 22q11.2 deletion syndrome (22q11DS) is a neurogenetic disorder that represents a unique opportunity to examine brain maturation in high-risk individuals. In this study, we quantified trajectories of CT maturation in 22q11DS and examined the association of CT development with the emergence of psychotic symptoms. Longitudinal structural MRI data with 1-6 time points were collected from 324 participants aged 5-35 years (N=148 22q11DS, N=176 controls), resulting in a total of 636 scans (N=334 22q11DS, N=302 controls). Mixed model regression analyses were used to compare CT trajectories between participants with 22q11DS and controls. Further, CT trajectories were compared between participants with 22q11DS who developed positive psychotic symptoms during development (N=61; 146 scans) and those who remained free of such symptoms (N=47; 98 scans). Compared to controls, participants with 22q11DS showed widespread increased CT, focal reductions in the posterior cingulate gyrus and superior temporal gyrus (STG), and accelerated cortical thinning during adolescence, mainly in fronto-temporal regions. Within 22q11DS, individuals who developed psychotic symptoms showed exacerbated cortical thinning in the right STG. Together, these findings suggest that genetic predisposition for psychosis is associated with increased CT starting from childhood and altered maturational trajectories of CT during adolescence, affecting predominantly fronto-temporal regions. In addition, accelerated thinning in the STG may represent an early biomarker associated with the emergence of psychotic symptoms.
psychiatry and clinical psychology
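The 22q11DS study above fits mixed-model regressions to longitudinal cortical-thickness measurements with one to six scans per participant. Below is a minimal statsmodels sketch of that kind of model, assuming a long-format table with hypothetical column names (subject, age, group, ct); it is not the authors' exact specification.

```python
import pandas as pd
import statsmodels.formula.api as smf

def fit_ct_trajectory(df: pd.DataFrame):
    """Fixed effects for age, group and their interaction; random intercept and
    random age slope per subject to account for repeated scans."""
    model = smf.mixedlm("ct ~ age * group", data=df,
                        groups=df["subject"], re_formula="~age")
    return model.fit()

# Usage sketch: result = fit_ct_trajectory(scans_df); print(result.summary())
```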
10.1101/2020.11.03.20225466
Delirium and Neuropsychological Outcomes in Critically Ill Patients with COVID-19: an Institutional Case Series
Objective: To characterize the clinical course of delirium for COVID-19 patients in the intensive care unit, including post-discharge cognitive outcomes. Patients and Methods: A retrospective chart review was conducted for patients diagnosed with COVID-19 (n=148) admitted to an intensive care unit at Michigan Medicine between March 1, 2020 and May 31, 2020. A validated chart review method was used to identify presence of delirium, and various measures (e.g., Family Confusion Assessment Method, Short Blessed Test, Patient Health Questionnaire-9) were used to determine neuropsychological outcomes between 1-2 months after hospital discharge. Results: Delirium was identified in 108/148 (73%) patients in the study cohort, with median (interquartile range) duration lasting 10 (4-17) days. In the delirium cohort, 50% (54/108) of patients were African American, and delirious patients were more likely to be female (76/108, 70%) (absolute standardized differences >.30). Sedation regimens, inflammation, deviation from delirium prevention protocols, and hypoxic-ischemic injury were likely contributing factors, and the most common disposition for delirious patients was a skilled care facility (41/108, 38%). Among patients who were delirious during hospitalization, 4/17 (24%) later screened positive for delirium at home based on caretaker assessment, 5/22 (23%) demonstrated signs of questionable cognitive impairment or cognitive impairment consistent with dementia, and 3/25 (12%) screened positive for depression within two months after discharge. Conclusion: Patients with COVID-19 commonly experience a prolonged course of delirium in the intensive care unit, likely with multiple contributing factors. Furthermore, neuropsychological impairment may persist after discharge.
anesthesia
10.1101/2020.11.03.20225441
Infection patterns of endemic human coronaviruses in rural households in coastal Kenya.
Introduction: The natural history and transmission patterns of endemic human coronaviruses are of increased interest following the emergence of severe acute respiratory syndrome coronavirus-2 (SARS-CoV-2). Methods: In rural Kenya, 483 individuals from 47 households were followed for six months (2009-10) with nasopharyngeal swabs collected twice weekly regardless of symptoms. A total of 16,918 swabs were tested for human coronavirus (hCoV) OC43, NL63 and 229E and other respiratory viruses using polymerase chain reaction. Results: From 346 (71.6%) household members, 629 hCoV infection episodes were defined, with 36.3% being symptomatic, varying by hCoV type and decreasing with age. Symptomatic episodes (aHR=0.6; 95% CI: 0.5-0.8) or those with elevated peak viral load (medium aHR=0.4 (0.3-0.6); high aHR=0.31 (0.2-0.4)) had longer viral shedding compared to their respective counterparts. Homologous reinfections were observed in 99 (19.9%) of 497 first infections. School-age children (55%) were the most common index cases, with those having medium (aOR=5.3 (2.3-12.0)) or high (aOR=8.1 (2.9-22.5)) peak viral load most often generating secondary cases. Conclusion: Household coronavirus infection was common, frequently asymptomatic and mostly introduced by school-age children. Secondary transmission was influenced by the viral load of index cases. Homologous-type reinfection was common. These data may be insightful for SARS-CoV-2.
epidemiology
10.1101/2020.11.04.20225573
Household Transmission of SARS-CoV-2: Insights from a Population-based Serological Survey
Background: Knowing the transmissibility of asymptomatic infections and the risk of infection from household and community exposures is critical to SARS-CoV-2 control. Limited previous evidence is based primarily on virologic testing, which disproportionately misses mild and asymptomatic infections. Serologic measures are more likely to capture all previously infected individuals. Objective: Estimate the risk of SARS-CoV-2 infection from household and community exposures, and identify key risk factors for transmission and infection. Design: Cross-sectional household serosurvey and transmission model. Setting: Geneva, Switzerland. Participants: 4,524 household members aged ≥5 years from 2,267 households enrolled April-June 2020. Measurements: Past SARS-CoV-2 infection confirmed through IgG ELISA. Chain-binomial models based on the number of infections within households were used to estimate the cumulative extra-household infection risk and the infection risk from exposure to an infected household member, by demographics and infector's symptoms. Results: The chance of being infected by a SARS-CoV-2 infected household member was 17.3% (95%CrI, 13.7-21.7%) compared to a cumulative extra-household infection risk of 5.1% (95%CrI, 4.5-5.8%). Infection risk from an infected household member increased with age, with 5-9 year olds having 0.4 times (95%CrI, 0.07-1.4) the odds of infection, and those aged ≥65 years having 2.7 (95%CrI, 0.88-7.4) times the odds of infection, relative to 20-49 year olds. Working-age adults had the highest extra-household infection risk. Seropositive asymptomatic household members had 69.6% lower odds (95%CrI, 33.7-88.1%) of infecting another household member compared to those reporting symptoms, accounting for 14.7% (95%CrI, 6.3-23.2%) of all household infections. Limitations: Self-reported symptoms, small numbers of seropositive children and imperfect serologic tests. Conclusion: The risk of infection from exposure to a single infected household member was more than three times that of extra-household exposures over the first pandemic wave. Young children had a lower risk of infection from household members. Asymptomatic infections are far less likely to transmit than symptomatic ones but do cause infections. Funding Source: Swiss Federal Office of Public Health, Swiss School of Public Health (Corona Immunitas research program), Fondation de Bienfaisance du Groupe Pictet, Fondation Ancrage, Fondation Privee des Hopitaux Universitaires de Geneve, and Center for Emerging Viral Diseases.
epidemiology
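The Geneva serosurvey above uses chain-binomial models of within-household infection counts to separate the extra-household infection risk from the per-contact risk posed by an infected household member. The sketch below implements a classic Longini-Koopman-style household final-size calculation as a simplified stand-in for that idea; it is not the paper's Bayesian model, and the two escape probabilities are free parameters.

```python
from functools import lru_cache
from math import comb

def household_final_size_probs(n, q_community, q_household):
    """Probability of each final number of infections among n susceptible
    household members, given the probability of escaping community infection
    over the study period (q_community) and of escaping infection from a single
    infected household member (q_household)."""
    @lru_cache(maxsize=None)
    def m(j, k):
        # m(j, k): probability that exactly j of k susceptibles end up infected
        if j < k:
            return (comb(k, j) * m(j, j)
                    * q_community ** (k - j)
                    * q_household ** (j * (k - j)))
        # j == k: probabilities over 0..k must sum to one
        return 1.0 - sum(m(i, k) for i in range(k))

    return {j: m(j, n) for j in range(n + 1)}

# Example: a four-person household with illustrative escape probabilities
print(household_final_size_probs(4, q_community=0.95, q_household=0.83))
```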
10.1101/2020.10.30.20223115
COVID-19 in Hospitalized Ethiopian Children: Characteristics and Outcome Profile
Background: Considering the number of people affected and the burden on the health care system due to the coronavirus pandemic, there are still gaps in understanding of the disease, leaving space for researchers to contribute new evidence. This scarcity of evidence is especially notable for children with the virus. Therefore, this study aimed to assess the characteristics and outcome profile of children with COVID-19 admitted to Millennium COVID-19 Care Center in Ethiopia. Methods: A prospective cohort study was conducted among 90 children with COVID-19 who were admitted from June 23 to September 17, 2020. Data were summarized using frequency tables, mean ± standard deviation or median with interquartile range values. A chi-square test/Fisher's exact test was used to compare disease severity between groups. Results: The median age of the participants was 15 years and 57 were females. The most commonly reported route of disease transmission was close contact with a diagnosed person (41/90). Only three had a history of pre-existing comorbid illness. One-third (31/90) had one or more symptoms at diagnosis, the most common being cough (20/90). Among the 90 patients, 59 were asymptomatic, 14 had mild disease and the remaining 17 had moderate disease. Based on the chi-square/Fisher's exact test result, no statistically significant difference in disease severity was observed across age groups or by sex. Conclusions: Pediatric patients seemed to have a milder disease presentation and a favorable outcome compared to reports from other countries and to the adult pattern observed in our country.
infectious diseases
10.1101/2020.11.03.20225482
Using Capture-Recapture Methods to Estimate Local Influenza Hospitalization Incidence Rates
Background: Accurate population estimates of disease incidence and burden are needed to set appropriate public health policy. The capture-recapture (C-R) method combines data from multiple sources to provide better estimates than is possible using single sources. Methods: Data were derived from clinical virology test results and from an influenza vaccine effectiveness study from seasons 2016-2017 to 2018-2019. The Petersen C-R method was used to estimate the population size of influenza cases; these estimates were then used to calculate adult influenza hospitalization burden using a Centers for Disease Control and Prevention (CDC) multiplier method. Results: Over all seasons, 343 influenza cases were reported in the clinical database and 313 in the research database. Fifty-nine cases (17%) reported in the clinical database were not captured in the research database, and 29 (9%) cases in the research database were not captured in the clinical database. Influenza hospitalizations were higher among the vaccinated (58%) than the unvaccinated (35%) in the current season and were similar among the unvaccinated (51%) and vaccinated (49%) in the previous year. Completeness of the influenza hospitalization capture was estimated to be 76%. The incidence rates for influenza hospitalizations varied by age and season and averaged 307-309 cases/100,000 adult population annually. Conclusion: Using capture-recapture methods with more than one database, along with an adjusted multiplier method, improves population estimates of influenza disease burden compared with relying on a single data source.
infectious diseases
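The capture-recapture entry above applies the Petersen estimator to the overlap between a clinical virology database and a research-study database. Below is a minimal sketch of that estimator (with the Chapman small-sample correction) using the counts quoted in the abstract; it does not reproduce the paper's CDC multiplier adjustments or its 76% completeness figure, which involve additional corrections.

```python
def capture_recapture_estimates(n_source1, n_source2, n_both):
    """Lincoln-Petersen estimate of the total number of cases, plus the
    Chapman small-sample correction."""
    petersen = n_source1 * n_source2 / n_both
    chapman = (n_source1 + 1) * (n_source2 + 1) / (n_both + 1) - 1
    return petersen, chapman

# Counts from the abstract: 343 clinical, 313 research, 59 clinical-only,
# 29 research-only, hence 343 - 59 = 284 cases captured in both sources.
petersen, chapman = capture_recapture_estimates(343, 313, 343 - 59)
observed_union = 284 + 59 + 29
print(f"Estimated influenza cases: {petersen:.0f} (Petersen), {chapman:.0f} (Chapman); "
      f"observed in either source: {observed_union}")
```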
10.1101/2020.11.04.20225656
Factors influencing the COVID-19 daily deaths peak across European countries
OBJECTIVES: The purpose of this study was to determine predictors of the height of the COVID-19 daily deaths peak and the time to the peak, in order to explain their variability across European countries. STUDY DESIGN: For 34 European countries, publicly available data were collected on daily numbers of COVID-19 deaths, population size, healthcare capacity, government restrictions and their timing, tourism and change in mobility during the pandemic. METHODS: Univariate and multivariate generalised linear models using different selection algorithms (forward, backward, stepwise and genetic algorithm) were analysed with the height of the COVID-19 daily deaths peak and the time to the peak as dependent variables. RESULTS: The proportion of the population living in urban areas, mobility at the day of the first reported death and the number of infections when borders were closed were identified as significant predictors of the height of the COVID-19 daily deaths peak. Testing the model with a variety of selection algorithms provided consistent results. Total hospital bed capacity, population size, number of foreign travellers and day of border closure were found to be significant predictors of the time to the COVID-19 daily deaths peak. CONCLUSIONS: Our analysis demonstrated that countries with higher proportions of the population living in urban areas, with a lower reduction in mobility at the beginning of the pandemic, and countries which closed their borders when more people were already infected experienced a higher peak of COVID-19 deaths. Greater bed capacity, bigger population size and later border closure could delay the time to reach the deaths peak, whereas a high number of foreign travellers could accelerate it.
epidemiology
10.1101/2020.11.04.20151290
ZINC SUFFICIENCY AND COVID-19 MORTALITY IN SOCIALLY SIMILAR EUROPEAN POPULATIONS
The impact of zinc (Zn) sufficiency/supplementation on COVID-19-associated mortality and incidence (SARS-CoV-2 infections) remains unknown. During an infection, the levels of free Zn are reduced as part of nutritional immunity to limit the growth and replication of the pathogen and the ensuing inflammatory damage. Considering its key role in immune competency and the frequently recorded deficiency in large sections of different populations, Zn has been prescribed for both prophylactic and therapeutic purposes in COVID-19 without any corroborating evidence for its protective role. Multiple trials are underway evaluating the effect of Zn supplementation on COVID-19 outcome in patients receiving standard-of-care treatment. However, the trial designs presumably lack the power to identify negative effects of Zn supplementation, especially in the vulnerable groups of the elderly and patients with comorbidities (contributing 9 out of 10 deaths; up to >8000-fold higher mortality). In this study, we have analyzed COVID-19 mortality and incidence (case) data from 23 socially similar European populations with comparable confounders (population: 522.47 million; experiencing up to >150-fold difference in death rates) and at the matching stage of the pandemic (12 March - 26 June 2020; 1st wave of COVID-19 incidence and mortality). Our results suggest a positive correlation between populations' Zn-sufficiency status and COVID-19 mortality (r(23): 0.7893-0.6849, p-value<0.0003) as well as incidence (r(23): 0.8084-0.5658; p-value<0.005). The observed association is contrary to what would be expected if Zn sufficiency were protective in COVID-19. Thus, controlled trials or retrospective analyses of adverse-event patients' data should be undertaken to correctly guide the practice of Zn supplementation in COVID-19.
epidemiology
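The zinc analysis above rests on country-level Pearson (and rank) correlations between Zn-sufficiency status and COVID-19 outcomes. For readers unfamiliar with that calculation, here is a minimal SciPy sketch; the two arrays are made-up placeholders, not the study's data.

```python
from scipy.stats import pearsonr, spearmanr

# Illustrative country-level inputs: estimated Zn sufficiency (%) and
# cumulative COVID-19 deaths per million (invented numbers).
zn_sufficiency = [92.1, 88.5, 95.0, 90.2, 85.7, 93.4, 87.9, 91.0]
deaths_per_million = [610, 310, 820, 540, 150, 700, 260, 480]

r, p = pearsonr(zn_sufficiency, deaths_per_million)
rho, p_s = spearmanr(zn_sufficiency, deaths_per_million)
print(f"Pearson r = {r:.2f} (p = {p:.3f}); Spearman rho = {rho:.2f} (p = {p_s:.3f})")
```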
10.1101/2020.11.04.20151290
NUTRITIONAL IMMUNITY, ZINC SUFFICIENCY AND COVID-19 MORTALITY IN SOCIALLY SIMILAR EUROPEAN POPULATIONS
The impact of zinc (Zn) sufficiency/supplementation on COVID-19-associated mortality and incidence (SARS-CoV-2 infections) remains unknown. During an infection, the levels of free Zn are reduced as part of nutritional immunity to limit the growth and replication of the pathogen and the ensuing inflammatory damage. Considering its key role in immune competency and the frequently recorded deficiency in large sections of different populations, Zn has been prescribed for both prophylactic and therapeutic purposes in COVID-19 without any corroborating evidence for its protective role. Multiple trials are underway evaluating the effect of Zn supplementation on COVID-19 outcome in patients receiving standard-of-care treatment. However, the trial designs presumably lack the power to identify negative effects of Zn supplementation, especially in the vulnerable groups of the elderly and patients with comorbidities (contributing 9 out of 10 deaths; up to >8000-fold higher mortality). In this study, we have analyzed COVID-19 mortality and incidence (case) data from 23 socially similar European populations with comparable confounders (population: 522.47 million; experiencing up to >150-fold difference in death rates) and at the matching stage of the pandemic (12 March - 26 June 2020; 1st wave of COVID-19 incidence and mortality). Our results suggest a positive correlation between populations' Zn-sufficiency status and COVID-19 mortality (r(23): 0.7893-0.6849, p-value<0.0003) as well as incidence (r(23): 0.8084-0.5658; p-value<0.005). The observed association is contrary to what would be expected if Zn sufficiency were protective in COVID-19. Thus, controlled trials or retrospective analyses of adverse-event patients' data should be undertaken to correctly guide the practice of Zn supplementation in COVID-19.
epidemiology
10.1101/2020.11.03.20189472
Demarcation line determination for diagnosis of gastric cancer disease range using unsupervised machine learning in magnifying narrow-band imaging
Objectives: It is important to determine an accurate demarcation line (DL) between cancerous lesions and the background mucosa in magnifying narrow-band imaging (M-NBI)-based diagnosis. However, this is difficult for novice endoscopists. Our aim was to automatically determine an accurate DL using a machine learning method. Methods: We used an unsupervised machine learning approach to determine the DLs because it can reduce the burden of training machine learning models and labeling large datasets. Our method consists of the following four steps: 1) An M-NBI image is segmented into superpixels (groups of neighboring pixels) using simple linear iterative clustering. 2) Image features are extracted for each superpixel. 3) The superpixels are grouped into several clusters using the k-means method. 4) The boundaries of the clusters are extracted as DL candidates. To validate the proposed method, 23 M-NBI images of 11 cases were used for performance evaluation. The evaluation investigated the similarity of the DLs identified by endoscopists and our method, and the Euclidean distance between the two DLs was calculated. For one of the 11 cases, a histopathological examination was also conducted and used to evaluate the proposed system. Results: The average Euclidean distances for the 11 cases were 10.65, 11.97, 7.82, 8.46, 8.59, 9.72, 12.20, 9.06, 22.86, 8.45, and 25.36. The results indicated that an appropriate selection of the number of clusters enabled the proposed method to detect DLs that were similar to those of the endoscopists. The DLs identified by our method captured the complex shapes of the DLs, similar to those identified by experienced doctors. It was also confirmed that the proposed system could generate pathologically valid DLs by increasing the number of clusters. Conclusions: Our proposed system can support the training of inexperienced doctors, as well as enrich the knowledge of experienced doctors in endoscopy.
gastroenterology
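The endoscopy study above spells out its four-step pipeline: SLIC superpixels, per-superpixel features, k-means clustering, and cluster boundaries as demarcation-line candidates. The sketch below strings those steps together with scikit-image and scikit-learn; the feature choice (mean Lab colour per superpixel) and all parameter values are assumptions for illustration, not the authors' settings.

```python
import numpy as np
from skimage import io, color, segmentation
from sklearn.cluster import KMeans

def demarcation_line_candidates(image_path, n_segments=400, n_clusters=3):
    """Return a per-pixel cluster map and a boolean mask of cluster boundaries
    (the demarcation-line candidates) for an RGB M-NBI image."""
    img = io.imread(image_path)

    # 1) Segment into superpixels with simple linear iterative clustering (SLIC)
    labels = segmentation.slic(img, n_segments=n_segments, compactness=10, start_label=1)

    # 2) One feature vector per superpixel: mean colour in Lab space
    lab = color.rgb2lab(img)
    ids = np.unique(labels)
    feats = np.array([lab[labels == i].mean(axis=0) for i in ids])

    # 3) Group superpixels into clusters with k-means
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(feats)
    superpixel_to_cluster = dict(zip(ids, km.labels_))
    cluster_map = np.vectorize(superpixel_to_cluster.get)(labels)

    # 4) Boundaries between clusters are the demarcation-line candidates
    boundaries = segmentation.find_boundaries(cluster_map, mode="thick")
    return cluster_map, boundaries
```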
10.1101/2020.11.05.20223289
Longitudinal proteomic profiling of dialysis patients with COVID-19 reveals markers of severity and predictors of death
End-stage kidney disease (ESKD) patients are at high risk of severe COVID-19. We measured 436 circulating proteins in serial blood samples from hospitalised and non-hospitalised ESKD patients with COVID-19 (n=256 samples from 55 patients). Comparison to 51 non-infected patients revealed 221 differentially expressed proteins, with consistent results in a separate subcohort of 46 COVID-19 patients. 203 proteins were associated with clinical severity, including IL6, markers of monocyte recruitment (e.g. CCL2, CCL7), neutrophil activation (e.g. proteinase-3) and epithelial injury (e.g. KRT19). Machine learning identified predictors of severity including IL18BP, CTSD, GDF15, and KRT19. Survival analysis with joint models revealed 69 predictors of death. Longitudinal modelling with linear mixed models uncovered 32 proteins displaying different temporal profiles in severe versus non-severe disease, including integrins and adhesion molecules. These data implicate epithelial damage, innate immune activation, and leucocyte-endothelial interactions in the pathology of severe COVID-19 and provide a resource for identifying drug targets.
infectious diseases
10.1101/2020.11.04.20226308
A random-walk-based epidemiological model
Random walkers on a two-dimensional square lattice are used to explore the spatio-temporal growth of an epidemic. We have found that a simple random-walk system generates non-trivial dynamics compared with traditional well-mixed models. Phase diagrams characterizing the long-term behaviors of the epidemics are calculated numerically. The functional dependence of the basic reproductive number R0 on the model's defining parameters reveals the role of spatial fluctuations and leads to a novel expression for R0. Special attention is given to simulations of inter-regional transmission of the contagion. The scaling of the epidemic with respect to space and time scales is studied in detail in the critical region, which is shown to be compatible with the directed-percolation universality class.
epidemiology
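The record above builds its epidemic from random walkers on a two-dimensional square lattice. As a toy illustration of that setup, the following NumPy sketch moves walkers on a torus and lets infection occur when susceptible and infected walkers share a lattice site; all sizes and rates are arbitrary choices, not the paper's parameters.

```python
import numpy as np

def lattice_random_walk_sir(L=100, n_walkers=2000, n_infected0=10,
                            p_infect=0.5, p_recover=0.05, steps=500, seed=0):
    """Toy spatial SIR: S/I/R walkers take random steps on an L x L torus;
    an S walker sharing a site with at least one I walker may become infected."""
    rng = np.random.default_rng(seed)
    pos = rng.integers(0, L, size=(n_walkers, 2))
    state = np.zeros(n_walkers, dtype=np.int8)          # 0 = S, 1 = I, 2 = R
    state[rng.choice(n_walkers, n_infected0, replace=False)] = 1

    moves = np.array([[1, 0], [-1, 0], [0, 1], [0, -1]])
    history = []
    for _ in range(steps):
        pos = (pos + moves[rng.integers(0, 4, n_walkers)]) % L   # one random step each

        site_id = pos[:, 0] * L + pos[:, 1]
        infected_sites = np.unique(site_id[state == 1])
        exposed = (state == 0) & np.isin(site_id, infected_sites)
        state[exposed & (rng.random(n_walkers) < p_infect)] = 1  # transmission

        state[(state == 1) & (rng.random(n_walkers) < p_recover)] = 2  # recovery
        history.append((np.sum(state == 0), np.sum(state == 1), np.sum(state == 2)))
    return np.array(history)   # columns: S, I, R counts over time
```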
10.1101/2020.11.05.20225300
Children hospitalized for COVID-19 during the first winter of the pandemic in Buenos Aires, Argentina
Background: Although there are reports on COVID-19 in pediatrics, it is possible that the characteristics of each population, their health systems and how they faced the pandemic made the disease show distinctive features in different countries. Objective: We aimed to describe the characteristics of patients hospitalized for COVID-19 in a tertiary pediatric hospital in the City of Buenos Aires, Argentina. Methods: Descriptive study, including all patients hospitalized for COVID-19 in a tertiary pediatric hospital from 04/26/2020 to 10/31/2020. Demographic, clinical and epidemiological characteristics of the patients are described. Results: In the studied period, 578 patients were hospitalized for COVID-19. The median age was 4.2 years and 83% had a history of close contact with a confirmed COVID-19 case. Regarding severity, 30.8% were asymptomatic, 60.4% mild, 7.4% moderate, and 1.4% severe. Among those with symptoms, the most frequent was fever, followed by sore throat and cough. Conclusion: We reported 578 cases of children and adolescents hospitalized for COVID-19, most of whom showed a mild or asymptomatic condition.
pediatrics
10.1101/2020.11.05.20226415
Psychological resilience, coping behaviours, and social support among healthcare workers during the COVID-19 pandemic: A systematic review of quantitative studies
Aim: To appraise and synthesize studies examining resilience, coping behaviours, and social support among healthcare workers during the coronavirus pandemic. Background: A wide range of evidence has shown that healthcare workers, currently on the frontlines in the fight against COVID-19, are not spared from the psychological and mental health-related consequences of the pandemic. However, evidence synthesizing the role of coping behaviours, resilience, and social support in safeguarding the mental health of healthcare workers during the pandemic is largely lacking. Evaluation: This is a systematic review with a narrative synthesis. A total of 31 articles were included in the review. Key Issues: Healthcare workers utilized both problem-centred and emotion-centred coping to manage the stress associated with the coronavirus pandemic. Coping behaviours, resilience, and social support were associated with positive mental and psychological health outcomes. Conclusion: Substantial evidence supports the effectiveness of coping behaviours, resilience, and social support in preserving psychological and mental health among healthcare workers during the COVID-19 pandemic. Implications for Nursing Management: In order to safeguard the mental health of healthcare workers during the pandemic, hospital and nursing administrators should implement proactive measures to sustain resilience in healthcare workers, build coping skills, and implement creative ways to foster social support, through theory-based interventions, supportive leadership, and fostering a resilient work environment.
nursing
10.1101/2020.11.04.20226316
Don't wait, re-escalate: delayed action results in longer duration of COVID-19 restrictions
Non-pharmaceutical public health interventions have significant economic and social costs, and minimizing their duration is paramount. Assuming that interventions are sufficient to reduce infection prevalence, we use a simple linear SIR model with case importation to determine the relationship between the timing of restrictions, the duration of measures necessary to return the incidence to a set point, and the final size of the outbreak. The predictions of our linear SIR model agree well with COVID-19 data from Atlantic Canada, and are consistent with the predictions of more complex deterministic COVID-19 models. We conclude that earlier re-escalation of restrictions results in shorter disruptions, smaller outbreaks, and consequently, lower economic and social costs. Our key message is succinctly summarized as "don't wait, re-escalate", since delaying re-escalation of restrictions results in not only more infections, but also longer periods of restrictions.
infectious diseases
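The "don't wait, re-escalate" entry above argues, from a linear SIR model with case importation, that a longer delay before re-escalating restrictions leads to a longer period under restrictions and a larger outbreak. The crude Euler-integration sketch below illustrates that relationship; the growth, decay and importation rates are invented for illustration and the model is a simplification of the paper's.

```python
def days_of_restrictions(I0=10.0, delay_days=7.0, growth=0.08, decay=0.05,
                         imports=0.2, dt=0.1):
    """Linearised incidence grows (plus importation) until restrictions are
    re-escalated after `delay_days`, then decays back to the starting set point.
    Returns the peak incidence and the days spent under restrictions."""
    I, t = float(I0), 0.0
    while t < delay_days:                      # growth phase before re-escalation
        I += (growth * I + imports) * dt
        t += dt
    peak = I
    while I > I0:                              # decay phase under restrictions
        I += (-decay * I + imports) * dt
        t += dt
        if t > 3650:                           # guard if set point sits below the import equilibrium
            break
    return {"peak_incidence": round(peak, 1), "days_under_restrictions": round(t - delay_days, 1)}

# Acting 7 vs 21 days after incidence starts rising
print(days_of_restrictions(delay_days=7))
print(days_of_restrictions(delay_days=21))
```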
10.1101/2020.11.05.20226100
Severity of Respiratory Infections due to SARS-CoV-2 in Working Population: Age and Body Mass Index Outweigh ABO Blood Group
Background: With increasing rates of SARS-CoV-2 infections and the intention to avoid a lockdown, the risks for the working population are of great interest. No large studies have been conducted which allow risk assessment for this population. Methods: DKMS is a non-profit donor center for stem cell donation and reaches out to registered volunteers between 18 and 61 years of age. To identify risk factors for severe COVID-19 courses in this population we performed a cross-sectional study. Self-reported data on oro- or nasopharyngeal swabs, risk factors, symptoms and treatment were collected with a health questionnaire and linked to existing genetic data. We fitted multivariable logistic regression models for the risk of contracting SARS-CoV-2, the risk of severe respiratory infection and the risk of hospitalization. Findings: Of 4,440,895 contacted volunteers, 924,660 (20.8%) participated in the study. Among 157,544 participants tested, 7,948 reported SARS-CoV-2 detection. Of those, 947 participants (11.9%) reported an asymptomatic course, 5,014 (63.1%) mild/moderate respiratory infections, and 1,987 (25%) severe respiratory tract infections. In total, 286 participants (3.6%) were hospitalized for respiratory tract infections. The risk of hospitalization, in comparison to a 20-year-old person of normal weight, was 2.1-fold higher (95%-CI, 1.2-3.69, p=0.01) for a person of the same age with a BMI of 35-40 kg/m2, 5.33-fold higher (95%-CI, 2.92-9.70, p<0.001) for a 55-year-old person of normal weight and 11.2-fold higher (95%-CI, 10.1-14.6, p<0.001) for a 55-year-old person with a BMI of 35-40 kg/m2. Blood group A was associated with a 1.15-fold higher risk of contracting SARS-CoV-2 (95%-CI 1.08-1.22, p<0.001) than blood group O but did not impact COVID-19 severity. Interpretation: In this relatively healthy population, the risk of hospitalization due to SARS-CoV-2 infection was moderate. Age and BMI were major risk factors. These data may help to tailor risk-stratified preventive measures. Funding: DKMS initiated and conducted this study. The Federal Ministry of Education and Research (BMBF) supported the study by a research grant (COVID-19 call (202), reference number 01KI20177).
infectious diseases
10.1101/2020.11.05.20226449
The sensitivity improved two-test algorithm "SIT2": a universal optimization strategy for SARS-CoV-2 serology
Background: Serological tests are widely used in various medical disciplines for diagnostic and monitoring purposes. Unfortunately, the sensitivity and specificity of test systems are often poor, leaving room for false-positive and false-negative results. However, conventional methods used to increase specificity decrease sensitivity and vice versa. Using SARS-CoV-2 serology as an example, we propose here a novel testing strategy: the "Sensitivity Improved Two-Test" or "SIT2" algorithm. Methods: SIT2 involves confirmatory re-testing of samples with results falling in a predefined retesting zone of an initial screening test, with adjusted cut-offs to increase sensitivity. We verified and compared the performance of SIT2 to single tests and orthogonal testing (OTA) in an Austrian cohort (1,117 negative, 64 post-COVID positive samples) and validated the algorithm in an independent British cohort (976 negatives, 536 positives). Results: The specificity of SIT2 was superior to single tests and non-inferior to OTA. The sensitivity was maintained or even improved using SIT2 when compared to single tests or OTA. SIT2 allowed correct identification of infected individuals even when a live virus neutralization assay could not detect antibodies. Compared to single testing or OTA, SIT2 significantly reduced total test errors to 0.46% (0.24-0.65) or 1.60% (0.94-2.38) at 5% or 20% seroprevalence, respectively. Conclusion: For SARS-CoV-2 serology, SIT2 proved to be the best diagnostic choice at both 5% and 20% seroprevalence in all tested scenarios. It is an easy-to-apply algorithm and can potentially be helpful for the serology of other infectious diseases.
infectious diseases
10.1101/2020.11.05.20226449
Increasing test specificity without impairing sensitivity - lessons learned from SARS-CoV-2 serology
Background: Serological tests are widely used in various medical disciplines for diagnostic and monitoring purposes. Unfortunately, the sensitivity and specificity of test systems are often poor, leaving room for false-positive and false-negative results. However, conventional methods used to increase specificity decrease sensitivity and vice versa. Using SARS-CoV-2 serology as an example, we propose here a novel testing strategy: the "Sensitivity Improved Two-Test" or "SIT2" algorithm. Methods: SIT2 involves confirmatory re-testing of samples with results falling in a predefined retesting zone of an initial screening test, with adjusted cut-offs to increase sensitivity. We verified and compared the performance of SIT2 to single tests and orthogonal testing (OTA) in an Austrian cohort (1,117 negative, 64 post-COVID positive samples) and validated the algorithm in an independent British cohort (976 negatives, 536 positives). Results: The specificity of SIT2 was superior to single tests and non-inferior to OTA. The sensitivity was maintained or even improved using SIT2 when compared to single tests or OTA. SIT2 allowed correct identification of infected individuals even when a live virus neutralization assay could not detect antibodies. Compared to single testing or OTA, SIT2 significantly reduced total test errors to 0.46% (0.24-0.65) or 1.60% (0.94-2.38) at 5% or 20% seroprevalence, respectively. Conclusion: For SARS-CoV-2 serology, SIT2 proved to be the best diagnostic choice at both 5% and 20% seroprevalence in all tested scenarios. It is an easy-to-apply algorithm and can potentially be helpful for the serology of other infectious diseases.
infectious diseases
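The SIT2 records above describe confirmatory re-testing of samples whose screening results fall in a predefined retesting zone, with cut-offs adjusted to preserve sensitivity. The function below encodes one plausible reading of that rule; the zone boundaries and cut-off values are arbitrary placeholders in signal-to-cutoff units, not the validated thresholds from the paper.

```python
def sit2_result(screen, confirm=None,
                retest_low=0.5, retest_high=2.0, confirm_cutoff=1.0):
    """Hedged sketch of a sensitivity-improved two-test rule: clearly reactive
    screening results are positive, clearly non-reactive results are negative,
    and borderline results in [retest_low, retest_high) are resolved by a
    confirmatory second assay."""
    if screen >= retest_high:
        return "positive"
    if screen < retest_low:
        return "negative"
    if confirm is None:
        return "retest required"
    return "positive" if confirm >= confirm_cutoff else "negative"

# Example: a borderline screen resolved by the confirmatory assay
print(sit2_result(screen=0.8, confirm=1.4))   # -> "positive"
```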
10.1101/2020.11.06.20226969
Interventions targeting nonsymptomatic cases can be important to prevent local outbreaks: SARS-CoV-2 as a case-study
During infectious disease epidemics, an important question is whether cases travelling to new locations will trigger local outbreaks. The risk of this occurring depends on the transmissibility of the pathogen, the susceptibility of the host population and, crucially, the effectiveness of surveillance in detecting cases and preventing onward spread. For many pathogens, transmission from presymptomatic and/or asymptomatic (together referred to as nonsymptomatic) infectious hosts can occur, making effective surveillance challenging. Here, using SARS-CoV-2 as a case-study, we show how the risk of local outbreaks can be assessed when nonsymptomatic transmission can occur. We construct a branching process model that includes nonsymptomatic transmission, and explore the effects of interventions targeting nonsymptomatic or symptomatic hosts when surveillance resources are limited. We consider whether the greatest reductions in local outbreak risks are achieved by increasing surveillance and control targeting nonsymptomatic or symptomatic cases, or a combination of both. We find that seeking to increase surveillance of symptomatic hosts alone is typically not the optimal strategy for reducing outbreak risks. Adopting a strategy that combines an enhancement of surveillance of symptomatic cases with efforts to find and isolate nonsymptomatic infected hosts leads to the largest reduction in the probability that imported cases will initiate a local outbreak.
epidemiology
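The branching-process study above asks how surveillance targeting symptomatic versus nonsymptomatic cases changes the probability that an imported case sparks a local outbreak. The sketch below uses a deliberately simplified single-type approximation (one blended reproduction number and Poisson offspring) rather than the paper's model; all parameter values are illustrative.

```python
import numpy as np

def local_outbreak_risk(R_symptomatic=3.0, R_nonsymptomatic=2.0, frac_nonsymptomatic=0.5,
                        control_symptomatic=0.5, control_nonsymptomatic=0.0, n_iter=500):
    """Blend symptomatic and nonsymptomatic transmission into one effective
    reproduction number, then iterate q = exp(R_eff * (q - 1)) from q = 0 to get
    the extinction probability of a Poisson branching process."""
    R_eff = (frac_nonsymptomatic * R_nonsymptomatic * (1 - control_nonsymptomatic)
             + (1 - frac_nonsymptomatic) * R_symptomatic * (1 - control_symptomatic))
    q = 0.0
    for _ in range(n_iter):
        q = np.exp(R_eff * (q - 1.0))
    return max(0.0, 1.0 - q)   # probability an imported case triggers a local outbreak

# Compare two ways of allocating control effort across case types
print(local_outbreak_risk(control_symptomatic=0.6, control_nonsymptomatic=0.0))
print(local_outbreak_risk(control_symptomatic=0.3, control_nonsymptomatic=0.3))
```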
10.1101/2020.11.06.20227108
Primary school staff perspectives of school closures due to COVID-19, experiences of schools reopening and recommendations for the future: a qualitative survey in Wales
School closures due to the COVID-19 global pandemic are likely to have a range of negative consequences spanning the domains of child development, education and health, in addition to the widening of inequalities and inequities. Research is required to improve understanding of the impact of school closures on the education, health and wellbeing of pupils and school staff, and of the challenges posed during reopening, and, importantly, to identify how countries can return to in-school education and to inform policy. This qualitative study aimed to reflect on the perspectives and experiences of primary school staff (teaching pupils aged 3-11) in Wales regarding school closures and the initial reopening of schools, and to identify recommendations for the future. A total of 208 school staff completed a national online survey through the HAPPEN primary school network, consisting of questions about school closures (March to June 2020), the phased reopening of schools (June to July 2020) and a return to full-time education. Thematic analysis of survey responses highlighted that primary school staff perceive that gaps in learning, health and wellbeing have increased and inequalities have widened during school closures. Findings from this study identified five recommendations: (i) prioritise the health and wellbeing of pupils and staff; (ii) focus on enabling parental engagement and support; (iii) improve digital competence amongst pupils, teachers and parents; (iv) consider opportunities for smaller class sizes and additional staffing; and (v) improve the mechanism of communication between schools and families, and between government and schools.
public and global health
10.1101/2020.11.06.20222398
Optimal test-assisted quarantine strategies for COVID-19
Objective: To evaluate the effectiveness of SARS-CoV-2 testing in shortening the duration of quarantines for COVID-19 and to identify the most effective choices of testing schedules. Design: We performed extensive simulations to evaluate the performance of quarantine strategies when one or more SARS-CoV-2 tests were administered during the quarantine. Simulations were based on statistical models for the transmissibility and viral loads of SARS-CoV-2 infections and the sensitivities of available testing methods. Sensitivity analyses were performed to evaluate the impact of perturbations in model assumptions on the outcomes of optimal strategies. Results: We found that SARS-CoV-2 testing can effectively reduce the length of a quarantine without compromising safety. A single RT-PCR test performed before the end of quarantine can reduce quarantine duration to 10 days. Two tests can reduce the duration to 8 days, and three highly sensitive RT-PCR tests can justify a 6-day quarantine. More strategic testing schedules and longer quarantines are needed if tests are administered with less sensitive RT-PCR tests or antigen tests. Shorter quarantines can be utilized for applications that tolerate a residual post-quarantine transmission risk comparable to a 10-day quarantine. Conclusions: Testing could substantially reduce the length of isolation, reducing the physical and mental stress caused by lengthy quarantines. With increasing capacity and lowered costs of SARS-CoV-2 tests, test-assisted quarantines could be safer and more cost-effective than 14-day quarantines and warrant more widespread use. Research in Context: What is already known on this topic? (1) Recommendations for quarantining individuals who could have been infected with COVID-19 are based on limited evidence. (2) Despite recent theoretical and case studies of test-assisted quarantines, there has been no substantive investigation to quantify the safety and efficacy of, nor an exhaustive search for, optimal test-assisted quarantine strategies. What this study adds: (1) Our simulations indicate that the 14-day quarantine approach is overly conservative and can be safely shortened if testing is performed. (2) Our recommendations include testing schedules that could be immediately adopted and implemented as government and industry policies. Role of the Funding Source: A major technology company asked that we perform simulations to understand the optimal strategy for managing personnel quarantining before forming cohorts of individuals who would work closely together. The funding entity did not influence the scope or output of the study but requested that we include antigen testing as a component of the quarantining process. Patrick Yu and Peter Matos are employees of Corporate Medical Advisors, and International S.O.S employs Julie McCashin. Other funding sources are research grants and did not influence the investigation.
health systems and quality improvement
10.1101/2020.11.06.20227405
Gout, rheumatoid arthritis and the risk of death from COVID-19: an analysis of the UK Biobank
Objectives: To assess whether gout and/or rheumatoid arthritis (RA) are risk factors for coronavirus disease 2019 (COVID-19) diagnosis, and whether gout and/or RA are risk factors for death with COVID-19. Methods: We used data from the UK Biobank. Multivariable-adjusted logistic regression was employed in the following analyses: Analysis A, to test for association between gout or RA and COVID-19 diagnosis (n=473,139); Analysis B, to test for association between gout or RA and death with COVID-19 in a case-control cohort of people who died or survived with COVID-19 (n=2,059); Analysis C, to test for association between gout or RA and death with COVID-19 in the entire UK Biobank cohort (n=473,139). Results: RA, but not gout, was associated with COVID-19 diagnosis in analysis A. Neither RA nor gout was associated with risk of death in the COVID-19-diagnosed group in analysis B. However, RA was associated with risk of death related to COVID-19 in the entire UK Biobank cohort in analysis C, independent of comorbidities and other measured risk factors (OR=1.9 [95% CI 1.2; 3.0]). Gout was not associated with death related to COVID-19 in the same UK Biobank analysis (OR=1.2 [95% CI 0.8; 1.7]). Conclusion: Rheumatoid arthritis is a risk factor for death with COVID-19 in the UK Biobank cohort. These findings require replication in larger data sets that also allow inclusion of a wider range of factors. Key messages: Information on the risk of death from COVID-19 for people with gout and rheumatoid arthritis is scarce. In an analysis of the UK Biobank there is an increased risk of death related to COVID-19 for people with rheumatoid arthritis, independent of the included co-morbidities, but not for people with gout. The findings need to be replicated in other datasets where the influence of therapies for rheumatoid arthritis can be tested.
rheumatology
10.1101/2020.11.06.20227330
Delays, masks, the elderly, and schools: first COVID-19 wave in the Czech Republic
Running across the globe for more than a year, the COVID-19 pandemic keeps demonstrating its strength. Despite a lot of understanding, uncertainty regarding the efficiency of interventions still persists. We developed an age-structured epidemic model parameterized with sociological data for the Czech Republic and found that (1) delaying the spring 2020 lockdown by four days produced twice as many confirmed cases by the end of the lockdown period, (2) personal protective measures such as face masks appear more effective than just a reduction of social contacts, (3) only sheltering the elderly is by no means effective, and (4) leaving schools open is a risky strategy. Despite the onset of vaccination, an evidence-based choice and timing of non-pharmaceutical interventions still remains the most important weapon against the COVID-19 pandemic. One-sentence summary: We address several issues regarding COVID-19 interventions that still elicit controversy and pursue ignorance.
epidemiology
10.1101/2020.11.07.20227082
A saliva-based RNA extraction-free workflow integrated with Cas13a for SARS-CoV-2 detection
A major bottleneck in scaling up COVID-19 testing is the need for sophisticated instruments and well-trained healthcare professionals, who are already overwhelmed due to the pandemic. Moreover, highly sensitive SARS-CoV-2 diagnostics are contingent on an RNA extraction step, which, in turn, is restricted by constraints in the supply chain. Here, we present CASSPIT (Cas13 Assisted Saliva-based & Smartphone Integrated Testing), which will allow direct use of saliva samples without the need for an extra RNA extraction step for SARS-CoV-2 detection. CASSPIT utilizes CRISPR-Cas13a based SARS-CoV-2 RNA detection, and lateral-flow assay (LFA) readout of the test results. The sample preparation workflow includes an optimized chemical treatment and heat inactivation method, which, when applied to COVID-19 clinical samples, showed a 97% positive agreement with the RNA extraction method. With CASSPIT, the LFA-based visual limit of detection (LoD) for SARS-CoV-2 RNA spiked into saliva samples was ~200 copies; image analysis-based quantification further improved the analytical sensitivity to ~100 copies. Upon validation of clinical sensitivity on RNA extraction-free saliva samples (n=76), a 98% agreement between the lateral-flow readout and RT-qPCR data was found (Ct<35). To enable user-friendly test results with provision for data storage and online consultation, we subsequently integrated the lateral-flow strips with a smartphone application. We believe CASSPIT will eliminate our reliance on RT-qPCR by providing comparable sensitivity and will be a step toward establishing nucleic acid-based point-of-care (POC) testing for COVID-19.
infectious diseases
10.1101/2020.11.08.20224915
State- and County-Level COVID-19 Public Health Orders in California: Constructing a Dataset and Describing Their Timing, Content, and Stricture
Without vaccines, non-pharmaceutical interventions have been the most widely used approach to controlling the spread of COVID-19 epidemics. Various jurisdictions have implemented public health orders as a means of reducing effective contacts and controlling their local epidemics. Multiple studies have examined the effectiveness of various orders (e.g. use of face masks) for epidemic control. However, orders occur at different timings across jurisdictions and some orders on the same topic are stricter than others. We constructed a county-level longitudinal data set of more than 2,400 public health orders issued by California and its 58 counties pertaining to its 40 million residents. First, we describe methods used to construct the dataset, which enables the characterization of the evolution over time of California state- and county-level public health orders dealing with COVID-19 from January 1, 2020 through June 30, 2021. Public health orders are both an interesting and important outcome in their own right and also a key input into analyses looking at how such orders may impact COVID-19 epidemics. To construct the dataset, we developed and executed a search strategy to identify COVID-19 public health orders over this time period for all relevant jurisdictions. We characterized each identified public health order in terms of the timing of when it was announced, went into effect and (potentially) expired. We also adapted an existing schema to describe the topic(s) each public health order dealt with and the level of stricture each imposed, applying it to all identified orders. Finally, as an initial assessment, we examined the patterns of public health orders within and across counties, focusing on the timing of orders, the rate of increase and decrease in stricture, and on variation and convergence of orders within regions.
health policy
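To make the record structure described in the abstract above concrete, here is a minimal Python sketch of how a single order might be represented; the field names and example values are assumptions for illustration, not the authors' actual schema.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class PublicHealthOrder:
    """One COVID-19 public health order, characterized as in the abstract above:
    issuing jurisdiction, timing (announced / effective / expired), topic, stricture."""
    jurisdiction: str           # "California" or a county name
    announced: date             # date the order was announced
    effective: date             # date the order went into effect
    expires: Optional[date]     # expiration date, if any
    topic: str                  # e.g. "face masks", "gatherings" (illustrative topics)
    stricture: int              # ordinal stricture level, higher = stricter (assumed coding)

# Hypothetical example record, for illustration only
order = PublicHealthOrder("Alameda County", date(2020, 3, 16), date(2020, 3, 17),
                          None, "shelter in place", 4)
print(order.topic, order.stricture)
```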
10.1101/2020.11.08.20227876
Design and implementation of an international, multi-arm, multi-stage platform master protocol for trials of novel SARS-CoV-2 antiviral agents: Therapeutics for Inpatients with COVID-19 (TICO/ACTIV-3)
BackgroundSafe and effective therapies for COVID-19 are urgently needed. In order to meet this need, the Accelerating COVID-19 Therapeutic Interventions and Vaccines (ACTIV) public-private partnership initiated the Therapeutics for Inpatients with COVID-19 (TICO). TICO is a multi-arm, multi-stage (MAMS) platform master protocol, which facilitates the rapid evaluation of the safety and efficacy of novel candidate anti-viral therapeutic agents for adults hospitalized with COVID-19. Four agents have so far entered the protocol, with rapid answers already provided for three of these. Other agents are expected to enter the protocol throughout 2021. This protocol contains a number of key design and implementation features that, along with challenges faced by the protocol team, are presented and discussed. Protocol Design and ImplementationThree clinical trial networks, encompassing a global network of clinical sites, participated in the protocol development and implementation. TICO utilizes a MAMS design with an agile and robust approach to futility and safety evaluation at 300 patients enrolled, with subsequent expansion to full sample size and an expanded target population if the agent shows an acceptable safety profile and evidence of efficacy. Rapid recruitment to multiple agents is enabled through the sharing of placebo as well as the confining of agent-specific information to protocol appendices, and modular consent forms. In collaboration with the Food and Drug Administration, a thorough safety data collection and DSMB schedule was developed for the study of agents with limited in-human data. ChallengesChallenges included ensuring drug supply and reliable recruitment allowing for changing infection rates across the global network of sites, the need to balance the collection of data and samples without overburdening clinical staff, and obtaining regulatory approvals across a global network of sites. ConclusionThrough a robust multi-network partnership, the TICO protocol has been successfully used across a global network of sites for rapid generation of efficacy data on multiple novel antiviral agents. The protocol design and implementation features used in this protocol, and the approaches to address challenges, will have broader applicability. Mechanisms to facilitate improved communication and harmonization among country-specific regulatory bodies are required.
infectious diseases
10.1101/2020.11.09.20227280
One size does not fit all: Single-subject analyses reveal substantial individual variation in electroencephalography (EEG) characteristics of antidepressant treatment response
Electroencephalography (EEG) characteristics associated with treatment response show potential for informing treatment choices for major depressive disorder, but to date, no robust markers have been identified. Variable findings might be due to the use of group analyses on a relatively heterogeneous population, which neglect individual variation. However, the correspondence between group-level findings and individual brain characteristics has not been extensively investigated. Using single-subject analyses, we explored the extent to which group-based EEG connectivity and complexity characteristics associated with treatment response could be identified in individual patients. Resting-state EEG data and Montgomery-Åsberg Depression Rating Scale symptom scores were collected from 43 patients with depression (23 females) before treatment, and at 1 and 12 weeks of treatment with escitalopram, bupropion or both. The multivariate statistical technique partial least squares was used to: 1) identify differences in EEG connectivity (weighted phase lag index) and complexity (multiscale entropy) between responders and non-responders to treatment (≥50% and <50% reduction in symptoms, respectively, by week 12), and 2) determine whether group patterns could be identified in individual patients. The group analyses distinguished responders from non-responders. Responders showed decreased alpha and increased beta connectivity and early, widespread decreases in coarse scale entropy over treatment. Non-responders showed an opposite connectivity pattern, and later, spatially confined decreases in coarse scale entropy. These EEG characteristics were identified in ~40-60% of individual patients. Substantial individual variation highlighted by the single-subject analyses might explain why robust EEG markers of antidepressant treatment response have not been identified. As up to 60% of patients in our sample were not well represented by the group results, individual variation needs to be considered when investigating clinically useful characteristics of antidepressant treatment response. Author summaryMajor depression affects over 300 million people worldwide, placing great personal and financial burden on individuals and society. Although multiple forms of treatment exist, we are not able to predict which treatment will work for which patients, so finding the right treatment can take months to years. Neuroimaging biomarker research aims to find characteristics of brain function that can predict treatment outcomes, allowing us to identify the most effective treatment for each patient faster. While promising findings have been reported, most studies look at group-average differences at intake between patients who do and do not recover with treatment. We do not yet know if such group-level characteristics can be identified in individual patients, however, and therefore if they can indeed be used to personalize treatment. In our study, we conducted individual patient analyses, and compared the individual patterns identified to group-average brain characteristics. We found that only ~40-60% of individual patients showed the same brain characteristics as their group-average. These results indicate that commonly conducted group-average studies miss potentially important individual variation in the brain characteristics associated with antidepressant treatment outcome. This variation should be considered in future research so that individualized prediction of treatment outcomes can become a reality. 
Trial registrationclinicaltrials.gov; https://clinicaltrials.gov; NCT00519428
psychiatry and clinical psychology
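As a point of reference for the connectivity measure named above, the weighted phase lag index can be computed from epoch-wise cross-spectra; the NumPy sketch below uses simulated data and is not the study's pipeline (the multiscale entropy and partial least squares steps are omitted).

```python
import numpy as np

def wpli(x, y):
    """Weighted phase lag index (Vinck et al., 2011) between two channels.
    x, y: arrays of shape (n_epochs, n_samples). Returns wPLI per frequency bin."""
    X = np.fft.rfft(x, axis=1)                  # per-epoch spectra (plain periodogram, no taper)
    Y = np.fft.rfft(y, axis=1)
    im = np.imag(X * np.conj(Y))                # imaginary part of the cross-spectrum
    num = np.abs(im.mean(axis=0))               # |E[Im(Sxy)]|
    den = np.abs(im).mean(axis=0) + 1e-12       # E[|Im(Sxy)|]
    return num / den

# Toy example: two noisy channels sharing a lagged 10 Hz (alpha) component
rng = np.random.default_rng(0)
t = np.arange(250) / 250.0
alpha = np.sin(2 * np.pi * 10 * t)
x = alpha + 0.5 * rng.standard_normal((40, 250))
y = np.roll(alpha, 5) + 0.5 * rng.standard_normal((40, 250))
print(wpli(x, y).round(2)[:15])                 # high values near the shared frequency
```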
10.1101/2020.11.08.20228056
Protocol for a Sequential, Prospective Meta-Analysis to Describe COVID-19 in Pregnancy and Newborn Periods
We urgently need answers to basic epidemiological questions regarding SARS-CoV-2 infection in pregnant and postpartum women and its effect on their newborns. While many national registries, health facilities, and research groups are collecting relevant data, we need a collaborative and methodologically rigorous approach to better combine these data and address knowledge gaps, especially those related to rare outcomes. We propose that using a sequential, prospective meta-analysis (PMA) is the best approach to generate data for policy- and practice-oriented guidelines. As the pandemic evolves, additional studies identified retrospectively by the steering committee or through living systematic reviews will be invited to participate in this PMA. Investigators can contribute to the PMA by either submitting individual patient data or running standardized code to generate aggregate data estimates. For the primary analysis, we will pool data using two-stage meta-analysis methods. The meta-analyses will be updated as additional data accrue in each contributing study and as additional studies meet study-specific time or data accrual thresholds for sharing. At the time of publication, investigators of 25 studies, including more than 76,000 pregnancies, in 41 countries had agreed to share data for this analysis. Among the included studies, 12 have a contemporaneous comparison group of pregnancies without COVID-19, and four studies include a comparison group of non-pregnant women of reproductive age with COVID-19. Protocols and updates will be maintained publicly. Results will be shared with key stakeholders, including the World Health Organization (WHO) Maternal, Newborn, Child, and Adolescent Health (MNCAH) Research Working Group. Data contributors will share results with local stakeholders. Scientific publications will be published in open-access journals on an ongoing basis.
public and global health
10.1101/2020.11.11.20229914
Comparative analysis of three point-of-care lateral flow immunoassays for detection of anti-SARS-CoV-2 antibodies: data from 100 healthcare workers in Brazil
Since the start of the Coronavirus Disease 2019 (COVID-19) pandemic, Brazil has had the third-highest number of confirmed cases and the second-highest number of recovered patients. SARS-CoV-2 detection by real-time RT-PCR is the gold standard but requires a certified laboratory infrastructure with high-cost equipment and trained personnel. However, for large-scale testing, diagnostics should be fast, cost-effective, widely available, and deployed for the community, such as serological tests based on lateral flow immunoassay (LFIA) for IgM/IgG detection. We evaluated three different commercial point-of-care (POC) LFIAs for anti-SARS-CoV-2 IgM and IgG detection in capillary whole blood of 100 healthcare workers (HCW) from Sao Paulo university hospital previously tested by RT-PCR: 1) COVID-19 IgG/IgM BIO (Bioclin, Brazil); 2) Diagnostic kit for IgM/IgG Antibody to Coronavirus (SARS-CoV-2) (Livzon, China); and 3) SARS-CoV-2 Antibody Test (Wondfo, China). A total of 84 RT-PCR-positive and 16 RT-PCR-negative HCW were tested. The data were also analyzed by the number of days post symptom onset (DPS) in three groups: <30 (n=26), 30-59 (n=42), and >59 (n=16). The observed sensitivity was 85.71%, 47.62%, and 44.05% for Bioclin, Wondfo, and Livzon, respectively, with a specificity of 100% for all LFIA. Bioclin was more sensitive (p<0.01), regardless of the DPS. Thus, the Bioclin assay may be used as a POC test to monitor SARS-CoV-2 seroconversion in HCW.
infectious diseases
10.1101/2020.11.09.20228023
Diagnosis and Tracking of SARS-CoV-2 Infection By T-Cell Receptor Sequencing
In viral diseases T cells exert a prominent role in orchestrating the adaptive immune response and yet a comprehensive assessment of the T-cell repertoire, compared and contrasted with antibody response, after severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection is currently lacking. A prior population-scale study of the municipality of Vo', Italy, conducted after the initial SARS-CoV-2 outbreak uncovered a high frequency of asymptomatic infected individuals and their role in transmission in this town. Two months later, we sampled the same population's T-cell receptor repertoire structure in terms of both diversity (breadth) and frequency (depth) to SARS-CoV-2 antigens to identify associations with both humoral response and protection. For this purpose, we analyzed T-cell receptor and antibody signatures from over 2,200 individuals, including 76 PCR-confirmed SARS-CoV-2 cases (25 asymptomatic, 42 symptomatic, 9 hospitalized). We found that 97.4% (74/76) of PCR confirmed cases had elevated levels of T-cell receptors specific for SARS-CoV-2 antigens. The depth and breadth of the T-cell receptor repertoire were both positively associated with neutralizing antibody titers; helper CD4+ T cells directed towards viral antigens from spike protein were a primary factor in this correlation. Higher clonal depth of the T-cell response to the virus was also significantly associated with more severe disease course. A total of 40 additional suspected infections were identified based on T-cell response from the subjects without confirmatory PCR tests, mostly among those reporting symptoms or having household exposure to a PCR-confirmed infection. Taken together, these results establish that T cells are a sensitive, reliable and persistent measure of past SARS-CoV-2 infection that are differentially activated depending on disease morbidity.
infectious diseases
10.1101/2020.11.09.20228197
Risk of acute arterial and venous thromboembolic events in Eosinophilic Granulomatosis with Polyangiitis (Churg-Strauss syndrome)
Background and objectiveSystemic small vessel vasculitides carry an increased risk of acute arterial and venous thromboembolic events (AVTE); however, this risk has not been systematically explored in Eosinophilic Granulomatosis with Polyangiitis (EGPA). This study assessed the occurrence and main risk factors of AVTE among EGPA patients as compared to the general community from the population-based Bruneck cohort. MethodsWe conducted a retrospective multicenter cohort study on 573 EGPA patients. Clinical and serological data were collected at diagnosis. Occurrence of AVTE and time to the first AVTE after EGPA diagnosis were recorded. Age-standardized event rate (SER) of AVTE as compared to the reference cohort was assessed. Cox regression was applied to identify AVTE predictors. Results129 EGPA patients (22.5%) had AVTE, considered as potentially life-threatening in 55.8%. Seventy patients experienced an AVTE prior to diagnosis (of whom 58.6% in the two years before diagnosis) and 75 following EGPA diagnosis, of whom 56% in the two subsequent years. The SER of AVTE as compared to the reference cohort was 2.10 (95% CI 1.67-2.63). This risk was particularly increased in patients with history of AVTE and with a Birmingham Vasculitis Activity Score ≥20 at diagnosis. Patients receiving immunosuppression within 2 months of diagnosis were at lower risk, while antiplatelet or anticoagulant treatment did not confer measurable benefit. ConclusionEGPA is associated with AVTE in approximately one quarter of patients, particularly around diagnosis. Immunosuppressants seemed to exert a protective effect, while anticoagulant and antiplatelet agents did not.
allergy and immunology
10.1101/2020.11.09.20228114
Genetic analyses of gynecological disease identify genetic relationships between uterine fibroids and endometrial cancer, and a novel endometrial cancer genetic risk region at the WNT4 1p36.12 locus
Endometriosis, polycystic ovary syndrome (PCOS) and uterine fibroids have been proposed as endometrial cancer risk factors; however, disentangling their relationships with endometrial cancer is complicated due to shared risk factors and comorbidities. Using genome-wide association study (GWAS) data, we explored the relationships between these non-cancerous gynecological diseases and endometrial cancer risk by assessing genetic correlation, causal relationships and shared risk loci. We found significant genetic correlation between endometrial cancer and PCOS, and uterine fibroids. Adjustment for genetically predicted body mass index (a risk factor for PCOS, uterine fibroids and endometrial cancer) substantially attenuated the genetic correlation between endometrial cancer and PCOS but did not affect the correlation with uterine fibroids. Mendelian randomization analyses provided evidence of a causal relationship between only uterine fibroids and endometrial cancer. Gene-based analyses revealed risk regions shared between endometrial cancer and endometriosis, and uterine fibroids. Multi-trait GWAS analysis of endometrial cancer and the genetically correlated gynecological diseases identified a novel genome-wide significant endometrial cancer risk locus at 1p36.12, which replicated in an independent endometrial cancer dataset. Interrogation of functional genomic data at 1p36.12 revealed biologically relevant genes, including WNT4 which is necessary for the development of the female reproductive system. In summary, our study provides genetic evidence for a causal relationship between uterine fibroids and endometrial cancer. It further provides evidence that the comorbidity of endometrial cancer, PCOS and uterine fibroids may partly be due to shared genetic architecture. Notably, this shared architecture has revealed a novel genome-wide risk locus for endometrial cancer.
oncology
10.1101/2020.11.09.20228684
Retrospective Analysis of Interventions to Epidemics using Dynamic Simulation of Population Behavior
Retrospective analyses of interventions to epidemics, in which the effectiveness of strategies implemented is compared to hypothetical alternatives, are valuable for performing the cost-benefit calculations necessary to optimize infection countermeasures. SIR (susceptible-infected-removed) models are useful in this regard but are limited by the challenge of deciding how and when to update the numerous parameters as the epidemic changes in response to population behaviors. Behaviors of particular interest include facemask adoption (at various levels) and social distancing. We present a method that uses a "dynamic spread function" to systematically capture the continuous variation in the population behavior, and the gradual change in infection evolution, resulting from interventions. No parameter updates are made by the user. We use the tool to quantify the reduction in infection rate realizable from the population of New York City adopting different facemask strategies during COVID-19. Assuming a baseline facemask of 67% filtration efficiency, calculations show that increasing the efficiency to 80% could have reduced the roughly 5000 new infections per day occurring at the peak of the epidemic to around 4000. Population behavior that may not be varied as part of the retrospective analysis, such as social distancing in a facemask analysis, are automatically captured as part of the calibration of the dynamic spread function.
infectious diseases
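For orientation, the sketch below integrates a plain SIR model with a time-varying transmission rate standing in for the abstract's "dynamic spread function"; the population, seeding, and rate curve are illustrative assumptions, and the paper's calibration procedure is not reproduced.

```python
import numpy as np

def simulate_sir(beta_of_t, gamma=1/7, N=8_400_000, I0=100, days=180, dt=0.1):
    """Forward-Euler SIR with a time-varying transmission rate beta_of_t(t).
    Returns daily new infections. All parameter values are illustrative."""
    S, I, R = N - I0, float(I0), 0.0
    daily_new = []
    steps_per_day = int(round(1 / dt))
    for step in range(int(days / dt)):
        t = step * dt
        new_inf = beta_of_t(t) * S * I / N      # incidence rate (per day)
        new_rec = gamma * I                     # recovery rate (per day)
        S -= new_inf * dt
        I += (new_inf - new_rec) * dt
        R += new_rec * dt
        if step % steps_per_day == 0:
            daily_new.append(new_inf)
    return np.array(daily_new)

# Hypothetical behavior change (mask adoption, distancing) modeled as a smooth drop in beta
beta = lambda t: 0.9 - 0.6 / (1 + np.exp(-(t - 30) / 5))
print(simulate_sir(beta)[:5].round(1))
```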
10.1101/2020.11.09.20223792
The Individual and Social Determinants of COVID-19 Diagnosis in Ontario, Canada: A Population-Wide Study
BackgroundOptimizing the public health response to reduce coronavirus disease 2019 (COVID-19) burden necessitates characterizing population-level heterogeneity of COVID-19 risks. However, heterogeneity in severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) testing may introduce biased estimates depending on analytic design. MethodsWe explored the potential for collider bias and characterized individual, environmental, and social determinants of testing and diagnosis using cross-sectional analyses among 14.7 million community-dwelling people in Ontario, Canada. Among those diagnosed, we used separate analytic designs to compare predictors of: 1) individuals testing positive versus negative; 2) symptomatic individuals only testing positive versus testing negative; and 3) individuals testing positive versus individuals not testing positive (i.e., testing negative or not being tested). Analyses included tests conducted between March 1 and June 20, 2020. ResultsOf a total of 14,695,579 individuals, 758,691 were tested for SARS-CoV-2, of whom 25,030 (3.3%) tested positive. The further the odds of testing from the null, the more variability observed in the odds of diagnosis across analytic design, particularly among individual factors. There was less variability in testing by social determinants across analytic designs. Residing in areas with highest household density (adjusted odds ratio [aOR]: 1.86; 95%CI: 1.75-1.98), highest proportion of essential workers (aOR: 1.58; 95%CI: 1.48-1.69), lowest educational attainment (aOR: 1.33; 95%CI: 1.26-1.41), and highest proportion of recent immigrants (aOR: 1.10; 95%CI: 1.05-1.15) were consistently related to increased odds of SARS-CoV-2 diagnosis regardless of analytic design. InterpretationWhere testing is limited, risk factors may be better estimated using population comparators rather than test-negative comparators. Optimizing COVID-19 responses necessitates investment and sufficient coverage of structural interventions tailored to heterogeneity in social determinants of risk, including household crowding, occupation, and structural racism.
epidemiology
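The three analytic designs compared above differ only in who enters the regression; a minimal statsmodels sketch on synthetic data follows, with column names and effect sizes that are assumptions rather than the study's variables.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 20000
high_density = rng.integers(0, 2, n)                       # stand-in area-level exposure
tested = rng.binomial(1, 0.05 + 0.03 * high_density)       # testing depends on exposure (toy)
symptomatic = rng.binomial(1, 0.4, n)
positive = np.where(tested == 1, rng.binomial(1, 0.10 + 0.10 * high_density), 0)
df = pd.DataFrame(dict(high_density=high_density, tested=tested,
                       symptomatic=symptomatic, positive=positive))

# Design 1: among those tested, positives vs test-negatives
d1 = smf.logit("positive ~ high_density", data=df[df.tested == 1]).fit(disp=0)
# Design 2: as design 1, restricted to symptomatic individuals
d2 = smf.logit("positive ~ high_density",
               data=df[(df.tested == 1) & (df.symptomatic == 1)]).fit(disp=0)
# Design 3: diagnosed vs everyone else (population comparator)
d3 = smf.logit("positive ~ high_density", data=df).fit(disp=0)

for name, m in [("test-negative", d1), ("symptomatic test-negative", d2), ("population", d3)]:
    print(name, "OR:", round(np.exp(m.params["high_density"]), 2))
```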
10.1101/2020.11.09.20227843
Identifying Neuroanatomical and Behavioral Features for Autism Spectrum Disorder Diagnosis in Children using Machine Learning
Autism spectrum disorder (ASD) is a neurodevelopmental disorder that can cause significant social, communication, and behavioral challenges. Diagnosis of ASD is complicated and there is an urgent need to identify ASD-associated biomarkers and features to help automate diagnostics and develop predictive ASD models. The present study adopts a novel evolutionary algorithm, the conjunctive clause evolutionary algorithm (CCEA), to select features most significant for distinguishing individuals with and without ASD, and is able to accommodate datasets having a small number of samples with a large number of feature measurements. The dataset is unique and comprises both behavioral and neuroimaging measurements from a total of 28 children from 7 to 14 years old. Potential biomarker candidates identified include brain volume, area, cortical thickness, and mean curvature in specific regions around the cingulate cortex, frontal cortex, and temporal-parietal junction, as well as behavioral features associated with theory of mind. A separate machine learning classifier (i.e., k-nearest neighbors algorithm) was used to validate the CCEA feature selection and for ASD prediction. Study findings demonstrate how machine learning tools might help move the needle on improving diagnostic and predictive models of ASD.
neurology
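With only 28 participants, leave-one-out cross-validation is a natural way to check a k-nearest-neighbors classifier on the selected features; the scikit-learn sketch below uses synthetic stand-ins for the neuroimaging and behavioral measures and does not reproduce the CCEA itself.

```python
import numpy as np
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.standard_normal((28, 5))          # 28 children, 5 selected features (synthetic)
y = rng.integers(0, 2, 28)                # 1 = ASD, 0 = comparison group (labels illustrative)
X[y == 1] += 0.8                          # inject a weak group difference so the toy task is learnable

clf = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=3))
acc = cross_val_score(clf, X, y, cv=LeaveOneOut()).mean()
print(f"leave-one-out accuracy: {acc:.2f}")
```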
10.1101/2020.11.09.20226746
Risk Score Stratification of Alzheimer's Disease and Mild Cognitive Impairment using Deep Learning
Alzheimer Disease (AD) is a progressive neurodegenerative disease that can significantly impair cognition and memory. AD is the leading cause of dementia and affects one in ten people aged 65 and older. Current diagnostic methods for AD heavily rely on the use of Magnetic Resonance Imaging (MRI) since non-imaging methods can vary widely, leading to inaccurate diagnoses. Furthermore, recent research has revealed a substage of AD, Mild Cognitive Impairment (MCI), that is characterized by symptoms between normal cognition and dementia, which makes it more prone to misdiagnosis. A large battery of clinical variables is currently used to detect cognitive impairment and classify early mild cognitive impairment (EMCI), late mild cognitive impairment (LMCI), and AD from cognitively normal (CN) patients. The goal of this study was to derive a simplified risk-stratification algorithm for diagnosis and identify a few significant clinical variables that can accurately classify these four groups using an empirical deep learning approach. Over 100 variables that included neuropsychological/neurocognitive tests, demographics, genetic factors, and blood biomarkers were collected from EMCI, LMCI, AD, and CN patients from the Alzheimer's Disease Neuroimaging Initiative (ADNI) database. Feature engineering was performed with 5 different methods and a neural network was trained on 90% of the data and tested on 10% using 10-fold cross validation. Prediction performance used area under the curve (AUC) of the receiver operating characteristic analysis. The five different feature selection methods consistently identified the top predictive variables as the Clinical Dementia Rating Scale - Sum of Boxes (CDRSB), Delayed total recall (LDELTOTAL), Modified Preclinical Alzheimer Cognitive Composite with Trails test (mPACCtrailsB), the Modified Preclinical Alzheimer Cognitive Composite with Digit test (mPACCdigit), and Mini-Mental State Examination (MMSE). The best classification model yielded an AUC of 0.984, and the simplified risk-stratification score yielded an AUC of 0.963 on the test dataset. Our results show that this deep-learning algorithm and the simplified risk score derived from it accurately diagnose EMCI, LMCI, AD and CN patients using a few commonly available neurocognitive tests. The project was successful in creating an accurate, clinically translatable risk-stratified scoring aid for primary care providers to diagnose AD in a fast and inexpensive manner.
primary care research
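A compact version of the kind of pipeline the abstract describes — keep a handful of top variables, train a small neural network, and report cross-validated multiclass AUC — is sketched below on synthetic data; the architecture, variable set, and numbers are assumptions, not the study's.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import StratifiedKFold, cross_val_predict
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for >100 clinical variables across four groups (CN, EMCI, LMCI, AD)
X, y = make_classification(n_samples=600, n_features=100, n_informative=8,
                           n_classes=4, n_clusters_per_class=1, random_state=0)

model = make_pipeline(StandardScaler(),
                      SelectKBest(f_classif, k=5),                       # a few top variables
                      MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0))

proba = cross_val_predict(model, X, y, method="predict_proba",
                          cv=StratifiedKFold(10, shuffle=True, random_state=0))
print("one-vs-rest AUC:", round(roc_auc_score(y, proba, multi_class="ovr"), 3))
```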
10.1101/2020.11.10.20229120
PERCEIVED STIGMATIZATION, SOCIODEMOGRAPHIC FACTORS, AND MENTAL HEALTH HELP-SEEKING BEHAVIORS AMONG PSYCHIATRIC OUTPATIENTS ATTENDING A PSYCHIATRIC HOSPITAL IN LAGOS, NIGERIA.
Among the general population of patients with mental illness is a sub-population (psychiatric outpatients) who often encounter limited mental health help-seeking behaviors due to many unknown factors. Therefore, this study aimed to explore some predictors of mental health help-seeking behaviors among psychiatric outpatients. This cross-sectional study used accidental (convenience) sampling to recruit 42 psychiatric outpatients receiving treatment at the Federal Neuropsychiatric Hospital, Yaba, Lagos State, Nigeria. Their mean age was 27.03 ± 7.05 years (age range = 18-48 years). Data were collected using standardized questionnaires, and analyzed using SPSS (v. 22). Statistical significance was set at p<.05. The first finding showed a positive but not significant relationship between perceived stigmatization and mental health help-seeking behavior. The second showed that gender had no significant influence on mental health help-seeking behavior. The third showed that age had a positive but not significant relationship with mental health help-seeking behavior. The last finding indicated that clinical diagnosis, religious affiliation, marital status, and educational qualification had a significant joint prediction on mental health help-seeking behaviour, with 28% variance explained. Only religious affiliation had a significant independent prediction. Our findings have practical implications for enhancing mental health help-seeking behavior and strengthening an interdisciplinary approach to mental health care.
psychiatry and clinical psychology
10.1101/2020.11.10.20229203
The genetic risk for COVID-19 severity is associated with defective innate immune responses
Recent genome-wide association studies (GWASs) of COVID-19 patients of European ancestry have identified genetic loci significantly associated with disease severity (1). Here, we employed the detailed clinical, immunological and multi-omics dataset of the Human Functional Genomics Projects (HFGP) to explore the physiological significance of the host genetic variants that influence susceptibility to severe COVID-19. A genomics investigation intersected with functional characterization of individuals with high genetic risk for severe COVID-19 susceptibility identified several major patterns: i. a large impact of genetically determined innate immune responses in COVID-19, with increased susceptibility for severe disease in individuals with defective monocyte-derived cytokine production; ii. genetic susceptibility related to ABO blood groups is probably mediated through the von Willebrand factor (VWF) and endothelial dysfunction. We further validated these identified associations at transcript and protein levels by using independent disease cohorts. These insights allow a physiological understanding of genetic susceptibility to severe COVID-19, and indicate pathways that could be targeted for prevention and therapy. One Sentence summaryIn this study, we explore the physiological significance of the genetic variants associated with COVID-19 severity using detailed clinical, immunological and multi-omics data from large cohorts. Our findings allow a physiological understanding of genetic susceptibility to severe COVID-19, and indicate pathways that could be targeted for prevention and therapy.
genetic and genomic medicine
10.1101/2020.11.10.20228890
Anti-SARS-CoV-2 IgG responses are powerful predicting signatures for the outcome of COVID-19 patients
The COVID-19 global pandemic is far from ending. There is an urgent need to identify applicable biomarkers for early prediction of the outcome of COVID-19. Growing evidence has revealed that SARS-CoV-2 specific antibodies evolved with disease progression and severity in COVID-19 patients. We assumed that antibodies may serve as biomarkers for predicting disease outcome. By taking advantage of a newly developed SARS-CoV-2 proteome microarray, we surveyed IgG responses against 20 proteins of SARS-CoV-2 in 1,034 hospitalized COVID-19 patients on admission and followed them for up to 66 days. The microarray results were further correlated with clinical information, laboratory test results and patient outcomes. A Cox proportional hazards model was used to explore the association between SARS-CoV-2 specific antibodies and COVID-19 mortality. We found that nonsurvivors induced higher levels of IgG responses against most non-structural proteins than survivors on admission. In particular, the magnitude of IgG antibodies against 8 non-structural proteins (NSP1, NSP4, NSP7, NSP8, NSP9, NSP10, RdRp, and NSP14) and 2 accessory proteins (ORF3b and ORF9b) possessed significant predictive power for patient death, even after further adjustments for demographics, comorbidities, and common laboratory biomarkers for disease severity (all with p trend < 0.05). Additionally, IgG responses to all of these 10 non-structural/accessory proteins were also associated with the severity of disease, and differential kinetics and serum positive rate of these IgG responses were confirmed in COVID-19 patients of varying severities within 20 days after symptom onset. The AUCs for these IgG responses, determined by computational cross-validations, were between 0.62 and 0.71. Our findings have important implications for improving clinical management, and especially for developing medical interventions and vaccines.
infectious diseases
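The Cox proportional hazards step mentioned above can be illustrated with the lifelines package; the frame below is simulated (one antibody signal plus age), so the hazard ratios it produces are placeholders, not the study's estimates.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 500
igg_nsp7 = rng.normal(size=n)                         # standardized anti-NSP7 IgG signal (simulated)
age = rng.normal(65, 12, n)
hazard = np.exp(0.5 * igg_nsp7 + 0.03 * (age - 65))   # toy data-generating hazard
time = rng.exponential(60 / hazard)                   # days from admission to death
event = (time < 66).astype(int)                       # deaths observed within follow-up
time = np.minimum(time, 66)                           # administrative censoring at 66 days

df = pd.DataFrame({"time": time, "event": event, "igg_nsp7": igg_nsp7, "age": age})
cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
cph.print_summary()                                   # hazard ratios with 95% CIs
```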
10.1101/2020.11.10.20228528
Two-step strategy for the identification of SARS-CoV-2 variant of concern 202012/01 and other variants with spike deletion H69-V70, France, August to December 2020
We report the implementation of a two-step strategy for the identification of SARS-CoV-2 variants carrying the spike deletion H69-V70 (ΔH69/ΔV70). This spike deletion resulted in an S-gene target failure (SGTF) of a three-target RT-PCR assay (TaqPath kit). Whole genome sequencing performed on 37 samples with SGTF revealed several receptor-binding domain mutations co-occurring with ΔH69/ΔV70. More importantly, this strategy enabled the first detection of the variant of concern 202012/01 in France on December 21, 2020. Since September, a SARS-CoV-2 spike (S) deletion, H69-V70 (ΔH69/ΔV70), has attracted increasing attention. This deletion was detected in the cluster-5 variant identified both in minks and humans in Denmark. This cluster-5 variant carries a receptor binding domain (RBD) mutation Y453F and was associated with reduced susceptibility to neutralizing antibodies in sera from recovered COVID-19 patients [1-3]. The ΔH69/ΔV70 deletion has also co-occurred with two other RBD mutations of increasing interest [4]: N439K, which is currently spreading in Europe and might also have reduced susceptibility to SARS-CoV-2 antibodies [5]; and N501Y, which is part of the SARS-CoV-2 variant of concern (VOC) 202012/01 recently detected in England [6]. Although the impact of ΔH69/ΔV70 on SARS-CoV-2 pathogenesis is not clear, enhanced surveillance is urgently needed. Herein we report the implementation of a two-step strategy enabling rapid detection of VOC 202012/01 or other variants carrying ΔH69/ΔV70.
infectious diseases
10.1101/2020.11.10.20215145
The National Seroprevalence of SARS-CoV-2 Antibody in the Asymptomatic Population
As the COVID-19 pandemic continues to ravage the world, there is a great need to understand the dynamics of spread. Currently, the seroprevalence of asymptomatic COVID-19 doubles every 3 months; this silent epidemic of new infections may be the main driving force behind the rapid increase in SARS-CoV-2 cases. Public health officials quickly recognized that clinical cases were just the tip of the iceberg. In fact, a great deal of the spread was being driven by the asymptomatically infected who continued to go out, socialize and go to work. While seropositivity is an insensitive marker for acute infection, it does tell us about the prevalence of COVID-19 in the population. ObjectiveDescribe the seroprevalence of SARS-CoV-2 infection in the United States over time. MethodologyRepeated convenience samples from a commercial laboratory dedicated to the assessment of life insurance applicants were tested for the presence of antibodies to SARS-CoV-2, in several time periods between May and December of 2020. US census data were used to estimate the population prevalence of seropositivity. ResultsThe raw seroprevalence in the May-June, September, and December timeframes was 3.0%, 6.6% and 10.4%, respectively. Higher rates were noted in younger vs. older age groups. Total seroprevalence in the US is estimated at 25.7 million cases. ConclusionsThe seroprevalence of SARS-CoV-2 demonstrates a significantly larger pool of individuals who have contracted COVID-19 and recovered, implying lower rates of hospitalization and death per case than have been reported so far.
epidemiology
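The move from raw convenience-sample positivity to a population figure is direct-standardization arithmetic; the sketch below uses made-up strata, weights, and a rounded adult population, not the study's data.

```python
# Direct standardization: weight stratum-specific positivity by census population shares.
# All numbers are illustrative placeholders.
strata = {              # age group: (sample positivity, census population share)
    "18-39": (0.12, 0.38),
    "40-59": (0.09, 0.33),
    "60+":   (0.06, 0.29),
}
adjusted = sum(p * share for p, share in strata.values())
us_adults = 255_000_000                      # rough adult population denominator (assumption)
print(f"standardized seroprevalence: {adjusted:.1%}")
print(f"implied ever-infected: {adjusted * us_adults / 1e6:.1f} million")
```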
10.1101/2020.11.10.20211995
CoSIR: Managing an Epidemic via Optimal Adaptive Control of Transmission Policy
Multiple macro-phenomena such as disease epidemics, online information propagation, and economic activity can be well-approximated using simple dynamical systems. Shaping these phenomena with adaptive control of key levers has long been the holy grail of policymakers. In this paper, we focus on optimal control of transmission rate in epidemic systems following the widely applicable SIR dynamics. We first demonstrate that the SIR model with infectious patients and susceptible contacts (i.e., product of transmission rate and susceptible population) interpreted as predators and prey respectively reduces to a Lotka-Volterra (LV) predator-prey model. The modified SIR system (LVSIR) has a stable equilibrium point, an "energy" conservation property, and exhibits bounded cyclic behavior. We exploit this mapping using a control-Lyapunov approach to design a novel adaptive control policy (CoSIR) that nudges the SIR model to the desired equilibrium. Combining CoSIR policy with data-driven estimation of parameters and adjustments for discrete transmission levels yields a control strategy with practical utility. Empirical comparison with periodic lockdowns on simulated and real COVID-19 data demonstrates the efficacy and adaptability of our approach.
epidemiology
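The mapping asserted in the abstract can be written out as follows; this is a sketch of the correspondence only, and the specific controlled dynamics for the transmission rate are the paper's, not reproduced here.

```latex
% Standard SIR with a controllable transmission rate \beta(t):
\frac{dS}{dt} = -\beta(t)\,\frac{S I}{N}, \qquad
\frac{dI}{dt} = \beta(t)\,\frac{S I}{N} - \gamma I, \qquad
\frac{dR}{dt} = \gamma I .

% With v(t) \equiv \beta(t) S(t) / N (the abstract's "susceptible contacts"),
% the infection equation takes the predator form of a Lotka--Volterra pair:
\frac{dI}{dt} = \bigl( v - \gamma \bigr)\, I .

% The LVSIR system closes the pair by prescribing dynamics for \beta(t) so that v
% obeys a complementary prey-type equation, schematically
\frac{dv}{dt} = \bigl( b - c\, I \bigr)\, v ,
% for suitable constants b, c; the exact prescription and the control-Lyapunov
% argument are given in the paper.
```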
10.1101/2020.11.10.20229369
BrainCheck: Validation of a Computerized Cognitive Test Battery for Detection of Mild Cognitive Impairment and Dementia
BackgroundEarly detection of dementia is critical for intervention and care planning but remains difficult. Computerized cognitive testing provides an accessible and promising solution to address these current challenges. This study evaluated a computerized cognitive testing battery (BrainCheck) for its diagnostic accuracy and ability to distinguish the severity of cognitive impairment. Methods99 participants diagnosed with Dementia, Mild Cognitive Impairment (MCI), or Normal Cognition (NC) completed the BrainCheck battery. Statistical analyses compared participant performances on BrainCheck based on their diagnostic group. ResultsBrainCheck battery performance showed significant differences between the NC, MCI, and Dementia groups, achieving ≥88% sensitivity/specificity for separating NC from Dementia, and ≥77% sensitivity/specificity in separating the MCI group from NC/Dementia groups. Three-group classification found true positive rates ≥80% for the NC and Dementia groups and ≥64% for the MCI group. ConclusionsBrainCheck was able to distinguish between diagnoses of Dementia, MCI, and NC, providing a potentially reliable tool for early detection of cognitive impairment.
neurology
10.1101/2020.11.11.20229567
No evidence for intervention-associated DNA methylation changes in monocytes of patients with posttraumatic stress disorder or anorexia nervosa
DNA methylation patterns can be responsive to environmental influences. This observation has sparked interest in the potential for psychological interventions to influence epigenetic processes. Recent studies have observed correlations between DNA methylation changes and therapy outcome. However, most did not control for changes in cell composition from pre- to post-therapy. This study had two aims: first, we sought to replicate therapy-associated changes in DNA methylation of commonly assessed candidate genes in isolated monocytes from 60 female patients with post-traumatic stress disorder (PTSD) using targeted deep bisulfite sequencing (DBS). Our second, exploratory goal was to identify novel genomic regions with substantial pre- to post-intervention DNA methylation changes by performing whole-genome bisulfite sequencing (WGBS) in two patients with PTSD and three patients with anorexia nervosa (AN) before and after intervention. Equivalence testing and Bayesian analyses provided evidence against physiologically meaningful intervention-associated DNA methylation changes in monocytes of PTSD patients in commonly investigated target genes (NR3C1, FKBP5, SLC6A4, OXTR). Furthermore, WGBS yielded only a limited set of candidate regions with suggestive evidence of differential methylation pre- to post-therapy. These differential methylation patterns did not prove replicable when investigated in the entire cohort. We conclude that there is no evidence for major, recurrent intervention-associated DNA methylation changes in the investigated genes in monocytes of patients with either PTSD or AN. Author SummaryMany mental health problems have a developmental origin, and epigenetic mechanisms have been proposed to explain the link between stressful or adverse experiences and subsequent health outcomes. More recently, studies have begun to examine whether psychological therapies might influence or even reverse supposedly acquired DNA methylation marks. Correlations between response to therapy and DNA methylation changes in peripheral tissue have been reported; however, these results might be confounded by differences in cell composition between time points and not reflect true DNA methylation changes. Here, we attempted to replicate previously reported results in a homogeneous cell population (monocytes) and further to identify novel intervention-responsive regions in the whole genome in patients with post-traumatic stress disorder (PTSD) and anorexia nervosa (AN). Our results showed that the improvement in symptomatology in PTSD and AN patients was not reflected in changes in DNA methylation in monocytes, neither in the previously studied candidate genes nor in the regions identified by whole-genome bisulfite sequencing. This study provides evidence against DNA methylation changes in peripheral tissue following therapy, and we suggest that previous findings are most likely explained by differences in cell composition.
psychiatry and clinical psychology
10.1101/2020.11.10.20227397
For a structured response to the psychosocial consequences of the restrictive measures imposed by the global COVID-19 health pandemic: The MAVIPAN longitudinal prospective cohort study protocol.
BackgroundThe COVID-19 pandemic and the isolation measures taken to control it have caused important disruptions in economies and labour markets, changed the way we work and socialize, forced schools to close and healthcare and social services to reorganize in order to redirect resources to the pandemic response. This unprecedented crisis forces individuals to make considerable efforts to adapt and can have serious psychological and social consequences that are likely to persist once the pandemic has been contained and restrictive measures lifted. These impacts will be significant for vulnerable individuals and will most likely exacerbate existing social and gender health and social inequalities. This crisis also takes a toll on the capacity of our healthcare and social services structures to provide timely and adequate care. In order to minimize these consequences, there is an urgent need for high-quality, real-time information on the psychosocial impacts of the pandemic. The MAVIPAN (Ma vie et la pandemie/My life with the pandemic) study aims to document how individuals, families, healthcare workers, and health organisations that provide services are affected by the pandemic and how they adapt. MethodsThe MAVIPAN study is a 5-year longitudinal prospective cohort study that was launched on April 29th, 2020 in the province of Quebec, which, at that time, was the epicenter of the pandemic in Canada. Quantitative data is collected through online questionnaires approximately 5 times a year depending on the evolution of the pandemic. Questionnaires include measures of health, social, behavioral and individual determinants as well as psychosocial impacts. Qualitative data will be collected with individual and group interviews that seek to deepen our understanding of coping strategies. DiscussionThe MAVIPAN study will support the healthcare and social services system response by providing the evidence base needed to identify those who are most affected by the pandemic and by guiding public health authorities' decision-making regarding intervention and resource allocation to mitigate these impacts. It is also a unique opportunity to advance our knowledge of coping mechanisms and adjustment strategies. Trial registrationNCT04575571 (retrospectively registered)
public and global health
10.1101/2020.11.12.20214478
Upper Limb Motor Improvement after TBI: Systematic Review of Interventions
BackgroundTraumatic Brain Injury (TBI) is a leading cause of adult morbidity and mortality. Individuals with TBI have impairments in both cognitive and motor domains. Motor improvements post-TBI are attributable to adaptive neuroplasticity and motor learning. The majority of studies focus on remediation of balance and mobility issues. There is limited understanding of the use of interventions for upper limb (UL) motor improvements in this population. ObjectiveWe examined the evidence regarding the effectiveness of different interventions to augment UL motor improvement after a TBI. MethodsWe systematically examined the evidence published in English from 1990-2020. The modified Downs and Black checklist helped assess study quality (total score: 28). Studies were classified as excellent: 24-28, good: 19-23, fair: 14-18, and poor: ≤13 in quality. Effect sizes helped quantify intervention effectiveness. ResultsTwenty-three studies were retrieved. Study quality was excellent (n=1), good (n=5) or fair (n=17). Interventions used included strategies to decrease muscle tone (n=6), constraint-induced movement therapy (n=4), virtual reality gaming (n=5), noninvasive stimulation (n=3), arm motor ability training (n=1), stem-cell transplant (n=1), task-oriented training (n=2), and feedback provision (n=1). Motor impairment outcomes included Fugl-Meyer Assessment, Modified Ashworth Scale, and kinematic outcomes (error and movement straightness). Activity limitation outcomes included Wolf Motor Function Test and Motor Activity Log. Effect sizes for the majority of interventions ranged from medium (0.5-0.79) to large (≥0.8). Only ten studies included retention testing. ConclusionThere is preliminary evidence that using some interventions may enhance UL motor improvement after a TBI. Answers to emergent questions can help select the most appropriate interventions in this population.
rehabilitation medicine and physical therapy
10.1101/2020.11.10.20219675
Association of the Transthyretin Variant V122I With Polyneuropathy Among Individuals of African Descent
IntroductionHereditary transthyretin-mediated (hATTR) amyloidosis is an underdiagnosed, progressively debilitating disease caused by mutations in the transthyretin (TTR) gene. The V122I variant, one of the most common pathogenic TTR mutations, is found in 3-4% of Black individuals, and has been associated with cardiomyopathy. MethodsTo better understand the phenotypic consequences of carrying V122I, we conducted a phenome-wide association study scanning 427 ICD diagnosis codes for association with this variant in Black participants of the UK Biobank (n = 6,062). Significant associations were tested for replication in the Penn Medicine Biobank (n = 5,737) and the Million Veteran Program (n = 82,382). ResultsOur analyses discovered a significant association between V122I and polyneuropathy diagnosis (odds ratio = 6.4, 95% confidence interval [CI] = 2.6 to 15.6, P = 4.2 x 10^-5) in the UK Biobank, which was replicated in the Penn Medicine Biobank (P = 6.0 x 10^-3) and the Million Veteran Program (P = 1.8 x 10^-4). Polyneuropathy prevalence among V122I carriers was 2.1-9.0% across biobanks. The cumulative incidence of common hATTR amyloidosis manifestations (carpal tunnel syndrome, polyneuropathy, cardiomyopathy, heart failure) was significantly higher in V122I carriers versus non-carriers (hazard ratio = 2.8, 95% CI = 1.7-4.5, P = 2.6 x 10^-5) in the UK Biobank, with 37.4% of V122I carriers having a diagnosis of any one of these manifestations by age 75. ConclusionsOur findings show that, although the V122I variant is known to be associated with cardiomyopathy, carriers are also at significantly increased risk of developing polyneuropathy. These results also emphasize the underdiagnosis of disease in V122I carriers, with a significant proportion of subjects showing phenotypic changes consistent with hATTR. Greater understanding of the manifestations associated with V122I is critical for earlier diagnosis and treatment.
genetic and genomic medicine
10.1101/2020.11.12.20230318
The implementation of remote home monitoring models during the COVID-19 pandemic in England
BackgroundThere is a paucity of evidence for the implementation of remote home monitoring for COVID-19 infection. The aims of this study were to identify the key characteristics of remote home monitoring models for COVID-19 infection, explore the experiences of staff implementing these models, understand the use of data for monitoring progress against outcomes, and document variability in staffing and resource allocation. MethodsThis was a multi-site mixed methods study that combined qualitative and quantitative approaches to analyse the implementation and impact of remote home monitoring models during the first wave of the COVID-19 pandemic (July to September 2020) in England. The study combined interviews (n=22) with staff delivering these models across eight sites in England with the collection and analysis of data on staffing models and resource allocation. FindingsThe models varied in relation to the healthcare settings and mechanisms used for patient triage, monitoring and escalation. Implementation was embedded in existing staff workloads and budgets. Good communication within clinical teams, culturally-appropriate information for patients/carers and the combination of multiple approaches for patient monitoring (app and paper-based) were considered facilitators in implementation. The mean cost per monitored patient varied from £400 to £553, depending on the model. InterpretationIt is necessary to provide the means for evaluating the effectiveness of these models, for example, by establishing comparator data. Future research should also focus on the sustainability of the models and patient experience (considering the extent to which some of the models exacerbate existing inequalities in access to care). FundingThe study was funded by the National Institute for Health Research (NIHR) Health Services and Delivery Research programme (16/138/17 - Rapid Service Evaluation Research Team (RSET); 16/138/31 - The Birmingham, RAND and Cambridge Evaluation (BRACE) Centre Team).
health systems and quality improvement
10.1101/2020.11.12.20230441
Early Prediction of Hemodynamic Shock in the Intensive Care Units with Deep Learning on Thermal Videos
Shock is one of the major killers in Intensive Care Units, and early interventions can potentially reverse it. In this study, we advance a non-contact thermal imaging modality for continuous monitoring of hemodynamic shock, working on 103,936 frames from 406 videos recorded longitudinally from 22 patients. Deep learning was used to preprocess and extract the Center-to-Peripheral Difference (CPD) in temperature values from the videos. This time-series data, along with heart rate, was then analyzed using Long Short-Term Memory models to predict the shock status up to the next 6 hours. Our models achieved the best area under the receiver operating characteristics curve of 0.81 ± 0.06 and area under the precision-recall curve of 0.78 ± 0.05 at 5 hours, providing sufficient time to stabilize the patient. Our approach thus provides a reliable shock prediction using an automated decision pipeline that can provide better care and save lives.
intensive care and critical care medicine
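As a rough illustration of the kind of model the thermal-video study describes, the sketch below wires a small LSTM over two-channel vital-sign windows (CPD temperature difference plus heart rate) to produce a binary shock prediction. It is a minimal sketch, not the authors' pipeline: the window length, layer sizes and the placeholder data are assumptions made here purely for illustration.

```python
# Minimal sketch (not the authors' code): binary shock prediction from
# two-channel vital-sign windows (CPD temperature difference + heart rate)
# using an LSTM; data shapes and hyperparameters are illustrative.
import numpy as np
import tensorflow as tf

T = 60                                              # hypothetical window length
X = np.random.rand(200, T, 2).astype("float32")     # placeholder windows: [CPD, heart rate]
y = np.random.randint(0, 2, size=200)               # placeholder labels: shock within next 6 h

model = tf.keras.Sequential([
    tf.keras.layers.LSTM(32, input_shape=(T, 2)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=[tf.keras.metrics.AUC(name="auroc"),
                       tf.keras.metrics.AUC(curve="PR", name="auprc")])
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
print(model.evaluate(X, y, verbose=0))   # [loss, AUROC, AUPRC] on the toy data
```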
10.1101/2020.11.13.20231431
Racial and Ethnic Differentials in COVID-19-Related Job Exposures by Occupational Status in the US
Researchers and journalists have argued that work-related factors may be partly responsible for disproportionate COVID-19 infection and death rates among vulnerable groups. We evaluate these claims by examining racial and ethnic differences in the likelihood of work-related exposure to COVID-19. We extend previous studies by considering 12 racial and ethnic groups and five types of potential occupational exposure to the virus: exposure to infection, physical proximity to others, face-to-face discussions, interactions with external customers and the public, and working indoors. Most importantly, we stratify our results by occupational standing, defined as the proportion of workers within each occupation with at least some college education. This measure serves as a proxy for whether workplaces and workers employ significant COVID-19-related risk reduction strategies. We use the 2018 American Community Survey to identify recent workers by occupation, and link 409 occupations to information on work context from the Occupational Information Network to identify potential COVID-related risk factors. We then examine the racial/ethnic distribution of all frontline workers and frontline workers at highest potential risk of COVID-19, by occupational standing and by sex. The results indicate that, contrary to expectation, White frontline workers are often overrepresented in high-risk jobs while Black and Latino frontline workers are generally underrepresented in these jobs. However, disaggregation of the results by occupational standing shows that, in contrast to Whites and several Asian groups, Latino and Black frontline workers are overrepresented in lower status occupations overall and in lower status occupations associated with high risk, and are thus less likely to have adequate COVID-19 protections. Our findings suggest that greater work exposures likely contribute to a higher prevalence of COVID-19 among Latino and Black adults and underscore the need for measures to reduce potential exposure for workers in low status occupations and for the development of programs outside the workplace.
infectious diseases
10.1101/2020.11.12.20230870
Optimizing COVID-19 control with asymptomatic surveillance testing in a university environment
The high proportion of transmission events derived from asymptomatic or presymptomatic infections makes SARS-CoV-2, the causative agent in COVID-19, difficult to control through the traditional non-pharmaceutical interventions (NPIs) of symptom-based isolation and contact tracing. As a consequence, many US universities developed asymptomatic surveillance testing labs to augment NPIs and control outbreaks on campus throughout the 2020-2021 academic year (AY); several of those labs continue to support asymptomatic surveillance efforts on campus in AY2021-2022. At the height of the pandemic, we built a stochastic branching process model of COVID-19 dynamics at UC Berkeley to advise optimal control strategies in a university environment. Our model combines behavioral interventions in the form of group size limits to deter superspreading, symptom-based isolation, and contact tracing, with asymptomatic surveillance testing. We found that behavioral interventions offer a cost-effective means of epidemic control: group size limits of six or fewer greatly reduce superspreading, and rapid isolation of symptomatic infections can halt rising epidemics, depending on the frequency of asymptomatic transmission in the population. Surveillance testing can overcome uncertainty surrounding asymptomatic infections, with the most effective approaches prioritizing frequent testing with rapid turnaround time to isolation over test sensitivity. Importantly, contact tracing amplifies population-level impacts of all infection isolations, making even delayed interventions effective. Combination of behavior-based NPIs and asymptomatic surveillance also reduces variation in daily case counts to produce more predictable epidemics. Furthermore, targeted, intensive testing of a minority of high transmission risk individuals can effectively control the COVID-19 epidemic for the surrounding population. Even in some highly vaccinated university settings in AY2021-2022, asymptomatic surveillance testing offers an effective means of identifying breakthrough infections, halting onward transmission, and reducing total caseload. We offer this blueprint and easy-to-implement modeling tool to other academic or professional communities navigating optimal return-to-work strategies.
epidemiology
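The UC Berkeley study above rests on a stochastic branching-process model with overdispersed (superspreading-prone) transmission and test-and-isolate truncation of the infectious period. The toy simulation below illustrates that mechanism only in outline; the negative-binomial offspring distribution is a standard way to represent superspreading, but every parameter value and the crude test-interval adjustment are assumptions made here for illustration, not the authors' calibrated model.

```python
# Toy branching-process sketch (not the UC Berkeley model): secondary cases per
# infection are drawn from a negative binomial (mean R_eff, dispersion k) to
# capture superspreading, and test-and-isolate is approximated by shortening
# the effective infectious period. All parameter values are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def outbreak_size(R0=2.5, k=0.16, infectious_days=7, test_interval=None,
                  generations=8, seeds=5):
    # crude adjustment: with testing every `test_interval` days, detection takes
    # ~test_interval/2 days on average, truncating the infectious period
    frac = 1.0 if test_interval is None else min(1.0, (test_interval / 2) / infectious_days)
    R_eff = R0 * frac
    active, total = seeds, seeds
    for _ in range(generations):
        if active == 0:
            break
        p = k / (k + R_eff)                              # negative binomial success prob
        offspring = int(rng.negative_binomial(k, p, size=active).sum())
        active, total = offspring, total + offspring
    return total

for interval in (None, 14, 7, 2):
    sizes = [outbreak_size(test_interval=interval) for _ in range(500)]
    label = "no surveillance" if interval is None else f"testing every {interval} days"
    print(f"{label}: mean outbreak size over 8 generations = {np.mean(sizes):.1f}")
```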
10.1101/2020.11.13.20231332
Multi-state network meta-analysis of cause-specific survival data
Multiple randomized controlled trials, each comparing a subset of competing interventions, can be synthesized by means of a network meta-analysis to estimate relative treatment effects between all interventions in the evidence base. Often there is an interest in estimating the relative treatment effects regarding time-to-event outcomes. Cancer treatment effectiveness is frequently quantified by analyzing overall survival (OS) and progression-free survival (PFS). In this paper we introduce a method for the joint network meta-analysis of PFS and OS that is based on a time-inhomogeneous tri-state (stable, progression, and death) Markov model where time-varying transition rates and relative treatment effects are modeled with known parametric survival functions or fractional polynomials. The data needed to run these analyses can be extracted directly from published survival curves. We demonstrate use by applying the methodology to a network of trials for the treatment of non-small-cell lung cancer. The proposed approach allows the joint synthesis of OS and PFS, relaxes the proportional hazards assumption, extends to a network of more than two treatments, and simplifies the parameterization of decision and cost-effectiveness analyses.
health economics
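The tri-state (stable, progression, death) Markov model described above implies that PFS and OS can be recovered from the three time-varying transition hazards by integrating the Kolmogorov forward equations. The sketch below does this numerically with crude Euler steps and made-up hazard functions; it illustrates only the bookkeeping, not the paper's fractional-polynomial parameterization or its network meta-analytic estimation.

```python
# Illustrative sketch of the tri-state (stable -> progression -> death) model:
# given time-varying transition hazards, integrate the Kolmogorov forward
# equations to recover PFS(t) and OS(t). The hazard functions below are
# placeholders, not values estimated in the paper.
import numpy as np

def h12(t): return 0.04 * (t + 0.1) ** 0.3   # stable -> progression
def h13(t): return 0.01                       # stable -> death
def h23(t): return 0.08                       # progression -> death

dt, horizon = 0.05, 36.0                      # time in months
times = np.arange(0.0, horizon, dt)
p_stable, p_prog, p_dead = 1.0, 0.0, 0.0
pfs, os_ = [], []
for t in times:
    out_stable = (h12(t) + h13(t)) * p_stable * dt   # leaves "stable"
    to_prog = h12(t) * p_stable * dt                  # of which, into "progression"
    out_prog = h23(t) * p_prog * dt                   # leaves "progression" for "death"
    p_stable -= out_stable
    p_prog += to_prog - out_prog
    p_dead += (out_stable - to_prog) + out_prog
    pfs.append(p_stable)            # progression-free survival = P(stable)
    os_.append(p_stable + p_prog)   # overall survival = P(not dead)

i12 = int(round(12 / dt))
print(f"PFS at 12 months ~ {pfs[i12]:.2f}, OS at 12 months ~ {os_[i12]:.2f}")
```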
10.1101/2020.11.14.20231894
Phe2vec: Automated Disease Phenotyping based on Unsupervised Embeddings from Electronic Health Records
ObjectiveRobust phenotyping of patient data from electronic health records (EHRs) at scale is a current challenge in the field of clinical informatics. We introduce Phe2vec, an automated framework for disease phenotyping from EHRs based on unsupervised learning, and we assess its effectiveness against standard rule-based algorithms from the Phenotype KnowledgeBase (PheKB). Materials and MethodsPhe2vec is based on pre-computing embeddings of medical concepts and patients longitudinal clinical history. Disease phenotypes are then derived from a seed concept and its neighbors in the embedding space. Patients are similarly linked to a disease if their embedded representation is close to the phenotype. We implemented Phe2vec using 49,234 medical concepts from structured EHRs and clinical notes from 1,908,741 patients in the Mount Sinai Health System. We assessed performance on ten diverse diseases that have a PheKB algorithm. ResultsPhe2vec phenotypes derived using Word2vec, GloVe, and Fasttext embeddings led to promising performance in disease definition and patient cohort identification with respect to phenotypes and cohorts obtained by PheKB. When comparing Phe2vec and PheKB disease patient cohorts head-to-head using chart review, Phe2vec performed on par or better in nine out of ten diseases in terms of positive predictive values. DiscussionPhe2vec offers a solution to improve time-consuming phenotyping pipelines. Differently from other approaches in the literature, it is data-driven and unsupervised, can easily scale to any disease and was validated against widely adopted expert-based standards. ConclusionPhe2vec aims to optimize clinical informatics research by augmenting current frameworks to characterize patients by condition and derive reliable disease cohorts.
health informatics
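A minimal sketch of the Phe2vec idea, not the authors' implementation: concept codes from longitudinal records are embedded with Word2vec, a phenotype is defined as the neighborhood of a seed concept, and patients are scored by the similarity of their averaged record embedding to that phenotype. The toy concept codes, tiny corpus and all hyperparameters below are illustrative assumptions.

```python
# Conceptual sketch of the Phe2vec idea (not the authors' pipeline).
import numpy as np
from gensim.models import Word2Vec

patient_records = [                     # each "sentence" = one patient's ordered concept codes
    ["E11", "R73", "Z79.4", "E11"],     # diabetes-like record (codes illustrative)
    ["I10", "E78", "I10"],              # hypertension/lipid-like record
    ["E11", "E66", "R73"],
]
model = Word2Vec(sentences=patient_records, vector_size=32, window=5,
                 min_count=1, sg=1, epochs=50, seed=1)

seed_code = "E11"                                   # seed concept for the phenotype
neighbors = [c for c, _ in model.wv.most_similar(seed_code, topn=2)]
phenotype_vec = np.mean([model.wv[c] for c in [seed_code] + neighbors], axis=0)

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

for i, rec in enumerate(patient_records):
    patient_vec = np.mean([model.wv[c] for c in rec], axis=0)   # averaged record embedding
    print(f"patient {i}: similarity to phenotype = {cosine(patient_vec, phenotype_vec):.2f}")
```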
10.1101/2020.11.13.20230060
ELIPSE-COL: A novel ELISA test based on rational envisioned synthetic peptides for detection of SARS-CoV-2 infection in Colombia.
BackgroundThe COVID-19 pandemic caused by infection with the betacoronavirus SARS-CoV-2 is the greatest global public health challenge of the last 100 years. Governments and health institutes face challenges during the pandemic related to diagnosis, mitigation, treatment, and timely detection after the epidemic peak, for the prevention of new infections and the evaluation of the real impact of the COVID-19 disease in different geographic areas. To develop a valuable tool to study the seroprevalence of SARS-CoV-2 infection in Colombia, an "in-house" ELISA was developed for the detection of IgG anti-SARS-CoV-2 antibodies in serum. MethodsThe test was standardized using a pool of synthetic peptides as antigen, derived from antigenic regions of the spike, nucleocapsid, envelope, and membrane structural proteins, which were designed based on the genomic information of SARS-CoV-2 circulating in Colombia. In the ELISA standardization process, 94 positive sera were used, including sera from asymptomatic and symptomatic patients (mild and severe), and 123 negative sera, including pre-pandemic historical negatives originating from patients living in arbovirus endemic areas or patients with a history of respiratory diseases, and sera from patients with a negative rRT-PCR test for SARS-CoV-2. ResultsThe in-house peptide ELIPSE-COL test showed promising performance, being able to detect reactivity in sera from asymptomatic and symptomatic patients. The sensitivity and specificity of the assay were 91.4% and 83.7% respectively. ConclusionThe ELIPSE-COL assay was developed as an ELISA test using synthetic peptides for the study of the seroprevalence of SARS-CoV-2 infection in Colombia. SUMMARY BOXO_LIDetection of IgG anti-SARS-CoV-2 antibodies is required for the evaluation of the pandemic impact and vaccination strategies. C_LIO_LIELIPSE-COL is an in-house test based on synthetic peptides as antigen, derived from antigenic regions of the spike, nucleocapsid, envelope, and membrane structural proteins. C_LIO_LIThe sensitivity and specificity of the assay were 91.4% and 83.7% respectively, suggesting a promising performance. C_LIO_LIThe ELIPSE-COL test is a valuable tool for the study of seroprevalence in Colombia. C_LI
infectious diseases
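For readers who want to see where figures like the reported 91.4% sensitivity and 83.7% specificity come from, the arithmetic is simply true positives over all positive sera and true negatives over all negative sera. The counts below are hypothetical, chosen only to approximately reproduce the reported values given the stated panel of 94 positive and 123 negative sera; they are not taken from the paper.

```python
# Back-of-the-envelope check with hypothetical confusion-matrix counts
# (illustrative only, not the authors' data).
true_pos, false_neg = 86, 8      # out of 94 RT-PCR-positive sera
true_neg, false_pos = 103, 20    # out of 123 negative / pre-pandemic sera

sensitivity = true_pos / (true_pos + false_neg)
specificity = true_neg / (true_neg + false_pos)
print(f"sensitivity ~ {sensitivity:.1%}, specificity ~ {specificity:.1%}")
# -> sensitivity ~ 91.5%, specificity ~ 83.7%, close to the reported values
```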
10.1101/2020.11.13.20231209
Plasma Markers of Disrupted Gut Permeability in Severe COVID-19 Patients
A disruption of the crosstalk between the gut and the lung has been implicated as a driver of severity during respiratory-related diseases. Lung injury causes systemic inflammation, which disrupts gut barrier integrity, increasing the permeability to gut microbes and their products. This exacerbates inflammation, resulting in positive feedback. We aimed to test whether severe Coronavirus disease 2019 (COVID-19) is associated with markers of disrupted gut permeability. We applied a multi-omic systems biology approach to analyze plasma samples from COVID-19 patients with varying disease severity and SARS-CoV-2 negative controls. We investigated the potential links between plasma markers of gut barrier integrity, microbial translocation, systemic inflammation, metabolome, lipidome, and glycome, and COVID-19 severity. We found that severe COVID-19 is associated with high levels of markers of tight junction permeability and translocation of bacterial and fungal products into the blood. These markers of disrupted intestinal barrier integrity and microbial translocation correlate strongly with higher levels of markers of systemic inflammation and immune activation, lower levels of markers of intestinal function, disrupted plasma metabolome and glycome, and higher mortality rate. Our study highlights an underappreciated factor with significant clinical implications, disruption in gut functions, as a potential force that may contribute to COVID-19 severity.
infectious diseases
10.1101/2020.11.16.20232363
Studying trajectories of multimorbidity: a systematic scoping review of longitudinal approaches and evidence.
ObjectivesMultimorbidity - the co-occurrence of at least two chronic diseases in an individual - is an important public health challenge in ageing societies. The vast majority of multimorbidity research takes a cross-sectional approach, but longitudinal approaches to understanding multimorbidity are an emerging research area, being encouraged by multiple funders. To support development in this research area, the aim of this study is to scope the methodological approaches and substantive findings of studies which have investigated longitudinal multimorbidity trajectories. DesignWe conducted a systematic search for relevant studies in four online databases (Medline, Scopus, Web of Science, and Embase) using pre-defined search terms and inclusion and exclusion criteria. The search was complemented by searching reference lists of relevant papers. From the selected studies we systematically extracted data on study methodology and findings, and summarised them in a narrative synthesis. ResultsWe identified 34 studies investigating multimorbidity longitudinally, all published in the last decade, and predominantly in high-income countries from the Global North. Longitudinal approaches employed included constructing change variables, multilevel regression analysis (e.g. growth curve modelling), longitudinal group-based methodologies (e.g. latent class modelling), analysing disease transitions, and visualisation techniques. Commonly identified risk factors for multimorbidity onset and progression were older age, higher socio-economic and area-level deprivation, overweight, and poorer health behaviours. ConclusionThe nascent research area employs a diverse range of longitudinal approaches that characterize accumulation and disease combinations, and to a lesser extent disease sequencing and progression. Gaps include understanding the long-term, life course determinants of different multimorbidity trajectories, and doing so across diverse populations, including those from low and middle-income countries. This can provide a detailed picture of morbidity development, with important implications from a clinical and intervention perspective. STRENGTHS AND LIMITATIONS OF THE STUDYO_LIThis is the first systematic review to focus on studies that take a longitudinal, rather than cross-sectional, approach to multimorbidity. C_LIO_LISystematic searches of online academic databases were performed using pre-defined search terms, as well as searching of reference lists, and this is reported using PRISMA guidelines. C_LIO_LIFor selected papers, data was double extracted using standardised proformas to aid narrative synthesis. C_LIO_LIDue to the heterogeneity of the studies included, their weaknesses were described in the narrative synthesis, but we did not perform quality assessment using standardised tools. C_LI
epidemiology
10.1101/2020.11.16.20232009
The total number and mass of SARS-CoV-2 virions
Quantitatively describing the time course of the SARS-CoV-2 infection within an infected individual is important for understanding the current global pandemic and possible ways to combat it. Here we integrate the best current knowledge about the typical viral load of SARS-CoV-2 in bodily fluids and host tissues to estimate the total number and mass of SARS-CoV-2 virions in an infected person. We estimate that each infected person carries 10^9-10^11 virions during peak infection, with a total mass in the range of 1-100 micrograms, which curiously implies that all SARS-CoV-2 virions currently circulating within human hosts have a collective mass of only 0.1-10 kg. We combine our estimates with the available literature on host immune response and viral mutation rates to demonstrate how antibodies markedly outnumber the spike proteins and how the genetic diversity of virions in an infected host covers all possible single nucleotide substitutions. SignificanceKnowing the absolute numbers of virions in an infection promotes better understanding of the disease dynamics and the response of the immune system. Here we use the best current knowledge on the concentrations of virions in infected individuals to estimate the total number and mass of SARS-CoV-2 virions in an infected person. Although each infected person carries an estimated 1-100 billion virions during peak infection, their total mass is no more than 0.1 mg. This curiously implies that all SARS-CoV-2 virions currently in all human hosts have a mass of between 100 grams and 10 kilograms. Combining the known mutation rate and our estimate of the number of infectious virions, we quantify the formation rate of genetic variants.
infectious diseases
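The headline mass figures follow from straightforward arithmetic once a per-virion mass is assumed. The sketch below uses roughly 1 femtogram per virion and an illustrative figure of about 10^8 concurrently infected hosts; both are assumptions made here (the authors' exact inputs may differ), but they reproduce the orders of magnitude quoted in the abstract.

```python
# Order-of-magnitude check of the mass estimates. Both the ~1 fg per-virion
# mass and the ~1e8 concurrently infected hosts are illustrative assumptions,
# not figures taken from the paper.
virions_per_person = (1e9, 1e11)     # estimated range at peak infection
mass_per_virion_g = 1e-15            # ~1 femtogram per virion (assumption)
concurrently_infected = 1e8          # illustrative global figure (assumption)

per_person_ug = [n * mass_per_virion_g * 1e6 for n in virions_per_person]
global_kg = [n * concurrently_infected * mass_per_virion_g / 1e3 for n in virions_per_person]
print(f"per-person viral mass: {per_person_ug[0]:g}-{per_person_ug[1]:g} micrograms")
print(f"global circulating viral mass: {global_kg[0]:g}-{global_kg[1]:g} kg")
# -> roughly 1-100 micrograms per person and 0.1-10 kg globally
```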
10.1101/2020.11.16.20231639
Potential of machine learning to predict early ischemic events after carotid endarterectomy or stenting: A comparison with surgeon predictions
BackgroundCarotid endarterectomy (CEA) and carotid artery stenting (CAS) are recommended for high stroke-risk patients with carotid artery stenosis to reduce ischemic events. However, we often face difficulty in determining the best treatment strategy. ObjectiveWe aimed to develop an accurate post-CEA/CAS outcome prediction model using machine learning that will serve as a basis for a new decision support tool for patient-specific treatment planning. MethodsRetrospectively collected data from 165 consecutive patients with carotid stenosis who underwent CEA or CAS were divided into training and test samples. The following five machine learning algorithms were tuned, and their predictive performance evaluated by comparison with surgeon predictions: an artificial neural network, logistic regression, support vector machine, random forest, and extreme gradient boosting (XGBoost). Seventeen clinical factors were introduced into the models. Outcome was defined as any ischemic stroke within 30 days after treatment, including asymptomatic diffusion-weighted imaging abnormalities. ResultsThe XGBoost model performed the best in the evaluation; its sensitivity, specificity, positive predictive value, and accuracy were 31.9%, 94.6%, 47.2%, and 86.2%, respectively. These statistical measures were comparable to those of surgeons. Internal carotid artery peak systolic velocity, low density lipoprotein cholesterol, and procedure (CEA or CAS) were the factors that contributed most according to the XGBoost algorithm. ConclusionWe were able to develop a post-procedural outcome prediction model comparable to surgeons in performance. The accurate outcome prediction model will make it possible to make a more appropriate patient-specific selection of CEA or CAS for the treatment of carotid artery stenosis.
neurology
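As a sketch of the modeling setup described above (not the authors' code or data), the snippet below fits an XGBoost classifier to a placeholder table of 17 clinical features and reports sensitivity, specificity, positive predictive value and accuracy on a held-out split. Feature values, labels and hyperparameters are illustrative assumptions.

```python
# Minimal XGBoost classification sketch on placeholder tabular data
# (illustrative only; not the authors' features, labels or tuning).
import numpy as np
from xgboost import XGBClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(0)
X = rng.normal(size=(165, 17))            # 17 clinical factors (placeholder values)
y = rng.integers(0, 2, size=165)          # 30-day ischemic event (placeholder labels)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

clf = XGBClassifier(n_estimators=200, max_depth=3, learning_rate=0.1,
                    eval_metric="logloss")
clf.fit(X_tr, y_tr)
tn, fp, fn, tp = confusion_matrix(y_te, clf.predict(X_te)).ravel()
print(f"sensitivity={tp/(tp+fn):.2f}, specificity={tn/(tn+fp):.2f}, "
      f"ppv={tp/(tp+fp):.2f}, accuracy={(tp+tn)/(tp+tn+fp+fn):.2f}")
```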
10.1101/2020.11.16.20232561
Locus coeruleus integrity is related to tau burden and memory loss in autosomal-dominant Alzheimer's disease
Abnormally phosphorylated tau, an indicator of Alzheimer's disease, accumulates in the first decades of life in the locus coeruleus (LC), the brain's main noradrenaline supply. However, technical challenges in reliable in-vivo assessments have impeded research into the role of the LC in Alzheimer's disease. We studied participants with or known to be at-risk for mutations in genes causing autosomal-dominant Alzheimer's disease (ADAD) of early onset, providing a unique window into the pathogenesis of Alzheimer's largely disentangled from age-related factors. Using high-resolution MRI and tau PET, we revealed lower rostral LC integrity in symptomatic participants. LC integrity was associated with individual differences in tau burden and memory decline. Post-mortem analyses in a separate set of carriers of the same mutation confirmed substantial neuronal loss in the LC. Our findings link LC degeneration to tau burden and memory in Alzheimer's and highlight a role of the noradrenergic system in this neurodegenerative disease.
neurology
10.1101/2020.11.16.20232850
The impact of COVID-19 employment shocks on suicide and safety net use: An early-stage investigation
This paper examines whether the COVID-19-induced employment shocks are associated with increases in suicides and safety net use in the second and third quarters of 2020. We exploit plausibly exogenous regional variation in the magnitude of the employment shocks in Japan and adopt a difference-in-differences research design to examine and control for possible confounders. Our preferred point estimates suggest that a one-percentage-point increase in the unemployment rate in the second quarter of 2020 is associated with, approximately, an additional 0.52 suicides, 28 unemployment benefit recipients, 88 recipients of a temporary loan program, and 10 recipients of public assistance per 100,000 population per month. A simple calculation based on these estimates suggests that if a region experienced a one-percentage-point increase in the unemployment rate caused by the COVID-19 crisis in the second quarter of 2020, which is roughly equivalent to the third-highest regional employment shock, this would be associated with 37.4%, 60.5%, and 26.5% increases in the total, female, and male suicide rates respectively in July 2020 compared with July 2019. Our baseline findings are robust to several different model specifications, although we do not assert that our research design perfectly solves the problem of estimation bias.
health economics
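The study's identification strategy is a difference-in-differences design exploiting regional variation in the employment shock. The stylized sketch below shows the general shape of such a specification, an outcome rate regressed on a shock-by-post-period interaction with region and month fixed effects and region-clustered standard errors, using simulated data; it is not the authors' exact model, variables or data.

```python
# Stylized difference-in-differences sketch on simulated regional panel data
# (illustrative only; the effect size and structure are made up).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
rows = []
for r in range(40):                                   # 40 hypothetical regions
    shock = rng.uniform(0, 3)                         # pp increase in unemployment rate
    for m in range(12):                               # 12 months
        post = int(m >= 6)                            # post-shock period indicator
        rate = 1.5 + 0.1 * (r % 2) + 0.5 * shock * post + rng.normal(0, 0.2)
        rows.append({"region": r, "month": m, "shock": shock, "post": post, "rate": rate})
df = pd.DataFrame(rows)

# Only the interaction enters: the shock main effect is absorbed by region fixed
# effects and the post indicator by month fixed effects.
fit = smf.ols("rate ~ shock:post + C(region) + C(month)", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["region"]})
print(f"DiD estimate = {fit.params['shock:post']:.3f} (SE {fit.bse['shock:post']:.3f})")
```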
10.1101/2020.11.16.20232967
Immune memory in mild COVID-19 patients and unexposed donors from India reveals persistent T cell responses after SARS-CoV-2 infection
Understanding the causes of the diverse outcomes of the COVID-19 pandemic in different geographical locations is important for worldwide vaccine implementation and pandemic control responses. We analyzed 42 unexposed healthy donors and 28 mild COVID-19 subjects up to 5 months after recovery for SARS-CoV-2 specific immunological memory. Using HLA class II predicted peptide megapools, we identified SARS-CoV-2 cross-reactive CD4+ T cells in around 66% of the unexposed individuals. Moreover, we found detectable immune memory in mild COVID-19 patients several months after recovery in the crucial arms of protective adaptive immunity, CD4+ T cells and B cells, with a minimal contribution from CD8+ T cells. Interestingly, the persistent immune memory in COVID-19 patients is predominantly targeted towards the Spike glycoprotein of SARS-CoV-2. This study provides evidence of both high-magnitude pre-existing and persistent immune memory in the Indian population. By providing knowledge on cellular immune responses to SARS-CoV-2, our work has implications for the development and implementation of vaccines against COVID-19.
infectious diseases
10.1101/2020.11.17.20232918
SARS-CoV-2 epidemic after social and economic reopening in three US states reveals shifts in age structure and clinical characteristics
In the United States, state-level re-openings in spring 2020 presented an opportunity for the resurgence of SARS-CoV-2 transmission. One important question during this time was whether human contact and mixing patterns could increase gradually without increasing viral transmission, the rationale being that new mixing patterns would likely be associated with improved distancing, masking, and hygiene practices. A second key question to follow during this time was whether clinical characteristics of the epidemic would improve after the initial surge of cases. Here, we analyze age-structured case, hospitalization, and death time series from three states - Rhode Island, Massachusetts, and Pennsylvania - that had successful re-openings in May 2020 without summer waves of infection. Using a Bayesian inference framework on eleven daily data streams and flexible daily population contact parameters, we show that population-average mixing rates dropped by >50% during the lockdown period in March/April, and that the correlation between overall population mobility and transmission-capable mobility was broken in May as these states partially re-opened. We estimate the reporting rates (fraction of symptomatic cases reporting to the health system) at 96.0% (RI), 72.1% (MA), and 75.5% (PA); in Rhode Island, when accounting for cases caught through general-population screening programs, the reporting rate estimate is 94.5%. We show that elderly individuals were less able to reduce contacts during the lockdown period when compared to younger individuals. Attack rate estimates through August 31, 2020 are 6.4% (95% CI: 5.8% - 7.3%) of the total population infected for Rhode Island, 5.7% (95% CI: 5.0% - 6.8%) in Massachusetts, and 3.7% (95% CI: 3.1% - 4.5%) in Pennsylvania, with some validation available through published seroprevalence studies. Infection fatality rate (IFR) estimates for the spring epidemic are higher in our analysis (>2%) than previously reported values, likely resulting from the epidemics in these three states affecting the most vulnerable sub-populations, especially the most vulnerable of the ≥80 age group.
epidemiology
10.1101/2020.11.17.20233106
Combined polygenic risk scores of different psychiatric traits predict general and specific psychopathology in childhood
BackgroundPolygenic risk scores (PRSs) operationalize genetic propensity towards a particular mental disorder and hold promise as early predictors of psychopathology, but before a PRS can be used clinically, explanatory power must be increased and the specificity for a psychiatric domain established. To enable early detection it is crucial to study these psychometric properties in childhood. We examined whether PRSs associate more with general or with specific psychopathology in school-aged children. Additionally, we tested whether psychiatric PRSs can be combined into a multi-PRS score for improved performance. MethodsWe computed 16 PRSs based on GWASs of psychiatric phenotypes, but also neuroticism and cognitive ability, in mostly adult populations. Study participants were 9247 school-aged children from three population-based cohorts of the DREAM-BIG consortium: ALSPAC (UK), The Generation R Study (Netherlands) and MAVAN (Canada). We associated each PRS with general and specific psychopathology factors, derived from a bifactor model based on self-, parental-, teacher-, and observer reports. After fitting each PRS in separate models, we also tested a multi-PRS model, in which all PRSs are entered simultaneously as predictors of the general psychopathology factor. ResultsSeven PRSs were associated with the general psychopathology factor after multiple testing adjustment, two with specific externalizing and five with specific internalizing psychopathology. PRSs predicted general psychopathology independently of each other, with the exception of depression and depressive symptom PRSs. Most PRSs associated with a specific psychopathology domain, were also associated with general child psychopathology. ConclusionsThe results suggest that PRSs based on current GWASs of psychiatric phenotypes tend to be associated with general psychopathology, or both general and specific psychiatric domains, but not with one specific psychopathology domain only. Furthermore, PRSs can be combined to improve predictive ability. PRS users should therefore be conscious of non-specificity and consider using multiple PRSs simultaneously, when predicting psychiatric disorders.
psychiatry and clinical psychology
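The multi-PRS idea above amounts to entering all polygenic scores simultaneously as predictors of the general psychopathology factor and comparing explained variance against any single score. The toy sketch below shows that comparison on simulated data; the number of scores, effect sizes and the plain linear-regression setup are illustrative assumptions, not the authors' analysis.

```python
# Toy comparison of single-PRS vs multi-PRS prediction on simulated data
# (illustrative only; effect sizes and setup are made up).
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)
n, n_prs = 2000, 16
prs = rng.normal(size=(n, n_prs))                          # 16 standardized PRSs
weights = rng.normal(scale=0.05, size=n_prs)               # small true effects (assumption)
p_factor = prs @ weights + rng.normal(scale=1.0, size=n)   # general psychopathology factor

single_r2 = max(
    LinearRegression().fit(prs[:, [j]], p_factor).score(prs[:, [j]], p_factor)
    for j in range(n_prs)
)
multi_r2 = LinearRegression().fit(prs, p_factor).score(prs, p_factor)
print(f"best single-PRS R^2 = {single_r2:.3f}; combined multi-PRS R^2 = {multi_r2:.3f}")
```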
10.1101/2020.11.18.20233833
Cumulative COVID-19 incidence, mortality, and prognosis in cancer survivors: a population-based study in Reggio Emilia, Northern Italy.
The aim of this population-based study was to evaluate the impact of having had cancer on COVID-19 risk and prognosis during the first wave of the pandemic (27 February - 13 May 2020) in Reggio Emilia Province. Prevalent cancer cases diagnosed between 1996 and December 2019 were linked with the provincial COVID-19 surveillance system. We compared cancer survivors' (CS) cumulative incidence of being tested, testing positive for SARS-CoV-2, being hospitalized, and dying of COVID-19 with that of the general population; we compared COVID-19 prognosis in CS and in patients without cancer. 15,391 people (1527 CS) underwent RT-PCR for SARS-CoV-2, of whom 4541 (447 CS) tested positive; 541 (113 CS) died of COVID-19. The cumulative incidences of being tested, testing positive, COVID-19 hospitalization, and death were higher in CS: age- and sex-adjusted incidence rate ratios were 1.28 [95%CI = 1.21, 1.35], 1.06 [95%CI = 0.96, 1.18], 1.27 [95%CI = 1.09, 1.48], and 1.39 [95%CI = 1.12, 1.71], respectively. CS had worse prognosis when diagnosed with COVID-19, particularly those below the age of 70 (age- and sex-adjusted odds ratio (OR) of death 5.03 [95%CI = 2.59, 9.75]), while the OR decreased after age 70. The OR of death was higher for patients with a recent diagnosis, i.e. <2 years (OR=2.92; 95%CI = 1.64, 5.21), or metastases (OR=2.09; 95%CI = 0.88, 4.93). Cancer patients showed the same probability of being infected as the general population, despite a slightly higher probability of being tested; nevertheless, they were at higher risk of death once infected. Novelty and impactCancer survivors during the first wave of the pandemic showed higher COVID-19 cumulative incidence and mortality. When infected, they had worse prognosis, particularly in people younger than age 70 or those with a recent diagnosis.
public and global health
10.1101/2020.11.17.20233577
Fighting COVID-19 with Flexible Testing: Models and Insights
Increasing testing capacities plays a substantial role in safely reopening the economy and avoiding a new wave of COVID-19. Pooled testing can expand testing capabilities by pooling multiple individual samples, but it also raises accuracy concerns. In this study, we propose a flexible testing strategy that adopts pooled testing or individual testing according to epidemic dynamics. We identify the prevalence threshold between individual and pooled testing by modeling the expected number of tests per confirmed case. Incorporating an epidemic model, we show pooled testing is more effective in containing epidemic outbreaks and can generate more reliable test results than individual testing, because the reliability of test results depends on both the testing method and prevalence. To the best of our knowledge, our study is the first to evaluate the interplay between pooled testing and a rapidly evolving outbreak. Our results allay accuracy concerns about pooled testing and provide theoretical support for empirical studies.
public and global health
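The "expected number of tests per confirmed case" logic above is closely related to the classic Dorfman two-stage pooling calculation, in which the expected number of tests per person for pool size n at prevalence p is 1/n + 1 - (1 - p)^n. The sketch below evaluates that textbook expression to show how the preferred strategy flips from pooled to individual testing as prevalence rises; it illustrates the general principle, not necessarily the exact model in the paper.

```python
# Dorfman two-stage pooling arithmetic (textbook formula): pooling is preferred
# whenever the expected number of tests per person drops below 1.
def tests_per_person(p: float, n: int) -> float:
    return 1.0 / n + 1.0 - (1.0 - p) ** n

pool_size = 5
for p in (0.01, 0.05, 0.10, 0.20, 0.30):
    e = tests_per_person(p, pool_size)
    strategy = "pool" if e < 1 else "individual"
    print(f"prevalence {p:.0%}: {e:.2f} expected tests/person -> {strategy}")
# For pool size 5, the break-even prevalence is roughly 27%.
```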
10.1101/2020.11.17.20229344
Improved Detection of Air Trapping on Expiratory Computed Tomography Using Deep Learning
BackgroundRadiologic evidence of air trapping (AT) on expiratory computed tomography (CT) scans is associated with early pulmonary dysfunction in patients with cystic fibrosis (CF). However, standard techniques for quantitative assessment of AT are highly variable, resulting in limited efficacy for monitoring disease progression. ObjectiveTo investigate the effectiveness of a convolutional neural network (CNN) model for quantifying and monitoring AT, and to compare it with other quantitative AT measures obtained from threshold-based techniques. Materials and MethodsPaired volumetric whole lung inspiratory and expiratory CT scans were obtained at four time points (0, 3, 12 and 24 months) on 36 subjects with mild CF lung disease. A densely connected CNN (DN) was trained using AT segmentation maps generated from a personalized threshold-based method (PTM). Quantitative AT (QAT) values, presented as the relative volume of AT over the lungs, from the DN approach were compared to QAT values from the PTM method. Radiographic assessment, spirometric measures, and clinical scores were correlated to the DN QAT values using a linear mixed effects model. ResultsQAT values from the DN were found to increase from 8.65% ± 1.38% to 21.38% ± 1.82% over a two-year period. Comparison of CNN model results to intensity-based measures demonstrated a systematic drop in the Dice coefficient over time (decreased from 0.86 ± 0.03 to 0.45 ± 0.04). The trends observed in DN QAT values were consistent with clinical scores for AT, bronchiectasis, and mucus plugging. In addition, the DN approach was found to be less susceptible to variations in expiratory deflation levels than the threshold-based approach. ConclusionThe CNN model effectively delineated AT on expiratory CT scans, which provides an automated and objective approach for assessing and monitoring AT in CF patients.
radiology and imaging
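The Dice coefficient used above to compare the CNN and threshold-based air-trapping maps is a simple overlap measure, 2|A∩B|/(|A|+|B|) for two binary masks. The small helper below (not the authors' code) shows the computation on a synthetic example.

```python
# Dice overlap between two binary segmentation masks (illustrative helper).
import numpy as np

def dice(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    denom = a.sum() + b.sum()
    return 1.0 if denom == 0 else 2.0 * np.logical_and(a, b).sum() / denom

a = np.zeros((8, 8), dtype=bool); a[2:6, 2:6] = True   # 16-pixel square
b = np.zeros((8, 8), dtype=bool); b[3:7, 3:7] = True   # shifted 16-pixel square
print(f"Dice = {dice(a, b):.2f}")   # overlap 9 pixels -> 2*9/(16+16) = 0.56
```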
10.1101/2020.11.18.20233700
The impact of the coronavirus disease 2019 (COVID-19) outbreak on cancer practice in Japan: using an administrative database
BackgroundRecent research has reported the impact of the coronavirus disease 2019 (COVID-19) pandemic on the clinical practice of specific cancer types. The aim of this study was to reveal the impact of the COVID-19 outbreak on the clinical practice of various cancers. MethodsWe included hospitalized patients aged 18 years or older diagnosed between July 2018 and June 2020 with one of the top 12 most common cancers in Japan (colon/rectum, lung, gastric, breast, bladder & urinary tract, pancreas, non-Hodgkin lymphoma, liver, prostate, esophagus, uterus, and gallbladder & biliary tract) using Diagnostic Procedure Combination data, an administrative database in Japan. The intervention point was defined as April 2020, based on the declaration of a state of emergency by the Japanese government. The change in the number of monthly admissions for each cancer was tested by interrupted time series (ITS) analysis, and monthly cases with radical surgery or chemotherapy for each cancer were described. Results403,344 cases were included during the study period. The most common cancer was colon/rectum (20.5%), followed by lung (17.5%). In almost all cancer types, the number of admissions decreased in May 2020. In particular, colorectal, lung, gastric, breast, uterine, or esophageal cancer cases decreased by over 10%. The number of admissions with surgery or chemotherapy decreased in colorectal, lung, gastric, breast, uterine, or esophageal cancer. ITS analysis indicated that cases with gastric or esophageal cancer were affected more than other types of cancer. ConclusionsThe COVID-19 outbreak had a negative impact on the number of cancer admissions; the magnitude of the impact varied by cancer diagnosis.
oncology
10.1101/2020.11.18.20233825
COVIDStrategyCalculator: A software to assess testing- and quarantine strategies for incoming travelers, contact person management and de-isolation
While large-scale vaccination campaigns against SARS-CoV-2 are being rolled out at the time of writing, non-pharmaceutical interventions (NPIs), including the isolation of infected individuals and quarantine of exposed individuals, remain central measures to contain the spread of SARS-CoV-2. Strategies that combine NPIs with innovative SARS-CoV-2 testing strategies may increase containment efficacy and help to shorten quarantine durations. We developed a user-friendly software tool that implements a recently published stochastic within-host viral dynamics model that captures temporal attributes of the viral infection, such as test sensitivity, infectiousness and the occurrence of symptoms. Based on this model, the software allows users to evaluate the efficacy of arbitrary, user-defined NPI and testing strategies in reducing the transmission potential in different contexts. The software thus enables decision makers to explore NPI strategies and perform hypothesis testing, e.g. with regard to the utilization of novel diagnostics or with regard to containing novel virus variants.
health informatics
10.1101/2020.11.17.20233601
Valuation system connectivity is correlated with poly-drug use in young adults
Poly-drug consumption contributes to fatal overdose in more than half of all poly-drug users. Analyzing decision-making networks may give insight into the motivations behind poly-drug use. We correlated average functional connectivity of the valuation system (VS), executive control system (ECS) and valuation-control complex (VCC) in a large population sample (n=992) with drug use behaviour. VS connectivity is correlated with sedative use, ECS connectivity is separately correlated with hallucinogens and opiates. Network connectivity is also correlated with drug use via two-way interactions with other substances including alcohol and tobacco. These preliminary findings can contribute to our understanding of the common combinations of substance co-use and associated neural patterns.
addiction medicine
10.1101/2020.11.18.20233767
The impact of early public health interventions on SARS-CoV-2 transmission and evolution
BackgroundMany countries have attempted to mitigate and control COVID-19 through the implementation of non-pharmaceutical interventions, particularly with the aim of reducing population movement and contact. However, it remains unclear how the different control strategies impacted the local phylodynamics of the causative SARS-CoV-2 virus. AimTo assess the duration of chains of virus transmission within individual countries and the extent to which countries export viruses to their geographic neighbours. MethodsTo address core questions in genomic epidemiology and public health, we analysed complete SARS-CoV-2 genomes to infer the relative frequencies of virus importation and exportation, as well as virus transmission dynamics, within countries of northern Europe. To this end, we examined virus evolution and phylodynamics in Denmark, Finland, Iceland, Norway and Sweden during the first year of the pandemic. ResultsThe Nordic countries differed markedly in the invasiveness of the control strategies implemented. In particular, Sweden did not initially employ any strict population movement limitations and experienced markedly different transmission chain dynamics, which were more numerous and tended to have more cases, a set of features that increased with time during the first eight months of 2020. ConclusionTogether with Denmark, Sweden was also characterised as a net exporter of SARS-CoV-2. Hence, Sweden effectively constituted an epidemiological and evolutionary refugium that enabled the virus to maintain active transmission and spread to other geographic localities. In sum, our analysis reveals the utility of genomic surveillance where active transmission chain monitoring is a key metric.
epidemiology
10.1101/2020.11.16.20232900
High throughput wastewater SARS-CoV-2 detection enables forecasting of community infection dynamics in San Diego county
Large-scale wastewater surveillance has the ability to greatly augment the tracking of infection dynamics, especially in communities where the prevalence rates far exceed the testing capacity. However, current methods for viral detection in wastewater are severely lacking in terms of scaling up for high throughput. In the present study, we employed an automated magnetic-bead based concentration approach for viral detection in sewage that can effectively be scaled up for processing 24 samples in a single 40-minute run. The method compared favorably to conventionally used methods for viral concentration from wastewater, with higher recovery efficiencies from input sample volumes as low as 10 ml, and can enable the processing of over 100 wastewater samples in a day. The sensitivity of the high-throughput protocol was sufficient to detect as few as two cases in a hospital building with a known COVID-19 caseload. Using the high-throughput pipeline, samples from the influent stream of the primary wastewater treatment plant of San Diego county (serving 2.3 million residents) were processed for a period of 13 weeks. Wastewater estimates of SARS-CoV-2 viral genome copies in raw untreated wastewater correlated strongly with clinically reported cases by the county, and when used alongside past reported case numbers and temporal information in an autoregressive integrated moving average (ARIMA) model enabled prediction of new reported cases up to 3 weeks in advance. Taken together, the results show that high-throughput surveillance could greatly strengthen comprehensive community prevalence assessments by providing robust, rapid estimates. ImportanceWastewater monitoring has a lot of potential for revealing COVID-19 outbreaks before they happen because the virus is found in the wastewater before people have clinical symptoms. However, application of wastewater-based surveillance has been limited by long processing times, specifically at the concentration step. Here we introduce a much faster method of processing the samples and show its robustness through direct comparisons with existing methods. We demonstrate that we can predict cases in San Diego a week ahead with excellent accuracy, and three weeks ahead with fair accuracy, using city sewage. The automated viral concentration method will greatly alleviate the major bottleneck in wastewater processing by reducing the turnaround time during epidemics.
epidemiology
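A minimal sketch of the forecasting step this record describes: an ARIMA model of reported cases with the wastewater signal as an exogenous regressor, projected three weeks ahead. The weekly series, the (1,1,1) order and the future wastewater values are placeholders, not the study's fitted specification.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Hypothetical weekly data: reported cases and wastewater viral load (log10 gene copies).
weeks = pd.date_range("2020-07-05", periods=13, freq="W")
cases = pd.Series([210, 260, 340, 400, 380, 430, 520, 610, 700, 820, 900, 1100, 1250],
                  index=weeks, name="reported_cases")
wastewater = pd.Series(np.log10([2e5, 3e5, 4e5, 5e5, 4.5e5, 5.5e5, 7e5, 8e5,
                                 1e6, 1.3e6, 1.5e6, 2e6, 2.4e6]),
                       index=weeks, name="log10_gene_copies")

# Fit an ARIMA model on past case counts with the wastewater signal as an exogenous input.
fit = ARIMA(cases, exog=wastewater, order=(1, 1, 1)).fit()

# Forecast three weeks ahead, supplying the already-measured wastewater values for those
# weeks; this is what lets sewage lead the clinical case counts.
future_wastewater = np.log10([2.8e6, 3.1e6, 3.5e6]).reshape(-1, 1)
print(fit.forecast(steps=3, exog=future_wastewater))
```

In the surveillance setting the model would be refit as each new week of data arrives, and forecast accuracy naturally degrades from the one-week to the three-week horizon.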
10.1101/2020.11.17.20233403
The stochastic dynamics of early epidemics: probability of establishment, initial growth rate, and infection cluster size at first detection
Emerging epidemics and local infection clusters are initially prone to stochastic effects that can substantially impact the epidemic trajectory. While numerous studies are devoted to the deterministic regime of an established epidemic, mathematical descriptions of the initial phase of epidemic growth are comparatively rare. Here, we review existing mathematical results on the epidemic size over time, and derive new results to elucidate the early dynamics of an infection cluster started by a single infected individual. We show that the initial growth of epidemics that eventually take off is accelerated by stochasticity. These results are critical to improve early cluster detection and control. As an application, we compute the distribution of the first detection time of an infected individual in an infection cluster as a function of testing effort, and estimate that the SARS-CoV-2 variant of concern Alpha, detected in September 2020, first appeared in the United Kingdom in early August 2020. We also compute a minimal testing frequency to detect clusters before they exceed a given threshold size. These results improve our theoretical understanding of early epidemics and will be useful for the study and control of local infectious disease clusters.
epidemiology
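A minimal sketch of the kind of stochastic calculation this record describes: a single-introduction branching process with Poisson offspring, used to estimate both the probability that a cluster establishes and the generation at which it is first detected under a given per-case detection probability. The parameter values are illustrative, not the paper's estimates.

```python
import math
import random

R0 = 1.5          # mean offspring number per case (illustrative)
DETECT_P = 0.1    # probability each case is independently detected (illustrative)
MAX_GEN = 30      # generations simulated per cluster
RUNS = 20_000     # independent clusters, each started by a single case

def poisson(lam):
    """Poisson sample via Knuth's algorithm, avoiding external dependencies."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        k += 1
        p *= random.random()
        if p <= limit:
            return k - 1

def simulate_cluster():
    """Return (established?, generation of first detection or None)."""
    active, first_detection = 1, None
    for gen in range(MAX_GEN):
        if first_detection is None and any(random.random() < DETECT_P for _ in range(active)):
            first_detection = gen
        active = sum(poisson(R0) for _ in range(active))   # next generation of cases
        if active == 0:
            return False, first_detection                  # cluster went extinct
        if active > 10_000:
            break                                          # clearly established; stop early
    return True, first_detection

results = [simulate_cluster() for _ in range(RUNS)]
p_establish = sum(ok for ok, _ in results) / RUNS
print("P(establishment) ~", round(p_establish, 3))
detection_gens = sorted(g for ok, g in results if ok and g is not None)
print("median generation of first detection ~", detection_gens[len(detection_gens) // 2])
```

For Poisson offspring with R0 = 1.5 the extinction probability q solves q = exp(R0(q - 1)), giving an establishment probability of roughly 0.58, which the simulated fraction should approximate; lowering DETECT_P pushes the first-detection generation later, the trade-off the paper quantifies.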
10.1101/2020.11.17.20233460
Global seroprevalence of SARS-CoV-2 antibodies: a systematic review and meta-analysis
BackgroundMany studies report the seroprevalence of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) antibodies. We aimed to synthesize seroprevalence data to better estimate the level and distribution of SARS-CoV-2 infection, identify high-risk groups, and inform public health decision making. MethodsIn this systematic review and meta-analysis, we searched publication databases, preprint servers, and grey literature sources for seroepidemiological study reports, from January 1, 2020 to December 31, 2020. We included studies that reported a sample size, study date, location, and seroprevalence estimate. We corrected estimates for imperfect test accuracy with Bayesian measurement error models, conducted meta-analysis to identify demographic differences in the prevalence of SARS-CoV-2 antibodies, and meta-regression to identify study-level factors associated with seroprevalence. We compared region-specific seroprevalence data to confirmed cumulative incidence. PROSPERO: CRD42020183634. ResultsWe identified 968 seroprevalence studies including 9.3 million participants in 74 countries. There were 472 studies (49%) at low or moderate risk of bias. Seroprevalence was low in the general population (median 4.5%, IQR 2.4-8.4%); however, it varied widely in specific populations from low (0.6% perinatal) to high (59% persons in assisted living and long-term care facilities). Median seroprevalence also varied by Global Burden of Disease region, from 0.6% in Southeast Asia, East Asia and Oceania to 19.5% in Sub-Saharan Africa (p<0.001). National studies had lower seroprevalence estimates than regional and local studies (p<0.001). Compared to Caucasian persons, Black persons (prevalence ratio [RR] 3.37, 95% CI 2.64-4.29), Asian persons (RR 2.47, 95% CI 1.96-3.11), Indigenous persons (RR 5.47, 95% CI 1.01-32.6), and multi-racial persons (RR 1.89, 95% CI 1.60-2.24) were more likely to be seropositive. Seroprevalence was higher among people aged 18-64 compared to those aged 65 and over (RR 1.27, 95% CI 1.11-1.45). Health care workers in contact with infected persons had a 2.10 times (95% CI 1.28-3.44) higher risk compared to health care workers without known contact. There was no difference in seroprevalence between sex groups. Seroprevalence estimates from national studies were a median 18.1 times (IQR 5.9-38.7) higher than the corresponding SARS-CoV-2 cumulative incidence, but there was large variation between Global Burden of Disease regions, from 6.7 in South Asia to 602.5 in Sub-Saharan Africa. Notable methodological limitations of serosurveys included failure to report test information, no statistical correction for demographics or test sensitivity and specificity, use of non-probability sampling, and use of non-representative sample frames. DiscussionMost of the population remains susceptible to SARS-CoV-2 infection. Public health measures must be improved to protect disproportionately affected groups, including racial and ethnic minorities, until vaccine-derived herd immunity is achieved. Improvements in serosurvey design and reporting are needed for ongoing monitoring of infection prevalence and the pandemic response. FundingPublic Health Agency of Canada through the COVID-19 Immunity Task Force.
public and global health
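The test-accuracy correction this record describes can be illustrated with a simpler frequentist analogue of the Bayesian measurement error models the review actually used: the Rogan-Gladen estimator, which adjusts an observed seropositive proportion for imperfect sensitivity and specificity. The sensitivity, specificity and observed prevalence below are placeholders.

```python
def rogan_gladen(observed_prev, sensitivity, specificity):
    """Adjust an apparent prevalence for imperfect test accuracy.

    corrected = (observed + specificity - 1) / (sensitivity + specificity - 1),
    truncated to [0, 1] so sampling noise cannot push it out of range.
    """
    corrected = (observed_prev + specificity - 1) / (sensitivity + specificity - 1)
    return min(max(corrected, 0.0), 1.0)

# Placeholder values: 6% apparent seroprevalence with a 90%-sensitive, 99%-specific assay.
print(rogan_gladen(0.06, sensitivity=0.90, specificity=0.99))  # ~0.056
```

The Bayesian models used in the review propagate uncertainty in sensitivity and specificity rather than treating them as fixed, but the direction of the adjustment is the same.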
10.1101/2020.11.17.20233684
Linking vestibular function and sub-cortical grey matter volume changes in a longitudinal study of aging adults
Emerging evidence suggests a relationship between impairments of the vestibular (inner ear balance) system and alterations in the function and the structure of the central nervous system in older adults. However, it is unclear whether age-related vestibular loss is associated with volume loss in brain regions known to receive vestibular input. To address this gap, we investigated the association between vestibular function and the volumes of four structures that process vestibular information (the hippocampus, entorhinal cortex, thalamus, and basal ganglia) in a longitudinal study of 97 healthy, older participants from the Baltimore Longitudinal Study of Aging. Vestibular testing included cervical vestibular-evoked myogenic potentials (cVEMP) to measure saccular function, ocular VEMP (oVEMP) to measure utricular function, and video head-impulse tests to measure the horizontal semi-circular canal vestibulo-ocular reflex (VOR). Participants in the sample had vestibular and brain MRI data for a total of 1 (18.6%), 2 (49.5%) and 3 (32.0%) visits. Linear mixed-effects regression was used to model regional volume over time as a function of vestibular physiological function, correcting for age, sex, intracranial volume, and inter-subject random variation in the baseline levels of and rates of change of volume over time. We found that poorer saccular function, characterized by lower cVEMP amplitude, is associated with reduced bilateral volumes of the basal ganglia and thalamus at each time point, demonstrated by a 0.0714 cm3 ± 0.0344 (unadjusted p=0.038; 95% CI: 0.00397-0.139) lower bilateral-mean volume of the basal ganglia and a 0.0440 cm3 ± 0.0221 (unadjusted p=0.046; 95% CI: 0.000727-0.0873) lower bilateral-mean volume of the thalamus for each 1-unit lower cVEMP amplitude. We also found a relationship between a lower mean VOR gain and lower left hippocampal volume (β=0.121, unadjusted p=0.018, 95% CI: 0.0212-0.222). There were no significant associations between volume and oVEMP. These findings provide insight into the specific brain structures that undergo atrophy in the context of age-related loss of peripheral vestibular function. Comprehensive SummaryHumans rely on their vestibular, or inner ear balance, system to manage everyday life. In addition to sensing head motion and head position with respect to gravity, the vestibular system helps to maintain balance and gaze stability. Furthermore, evidence is mounting that vestibular function is linked to structural changes in the central nervous system (CNS). Yet, the exact processes by which vestibular function alters brain structural integrity are unclear. One possible mechanism is that progressive vestibular deafferentation results in neurodegeneration of structures that receive vestibular input. In support of this putative mechanism, recent studies report the association of vestibular impairment with volume loss of brain areas that receive vestibular information, specifically the hippocampus and entorhinal cortex, in older adults. The present work investigates the extent to which age-related vestibular loss contributes over time to volume reduction in four brain regions that receive vestibular input: the hippocampus, entorhinal cortex, thalamus, and basal ganglia.
Using data from a cohort of healthy, older adults followed between 2013 and 2017 in the Baltimore Longitudinal Study of Aging, we assessed regional brain volume as a function of vestibular function, while accounting for common confounds of brain volume change (e.g., age, sex, head size). We found that poor vestibular function is associated with significantly reduced volumes of the thalamus, basal ganglia, and left hippocampus. Notably, this study is one of the first to demonstrate relationships between age-related vestibular loss and gray matter loss in brain regions that receive vestibular input. Further research is needed to understand in greater detail the observed link between vestibular function and CNS structure. Which brain areas are impacted by age-related vestibular loss? How and in what sequence are they impacted? As the world's aging population grows and the prevalence of age-related vestibular impairment rises, answering questions like these becomes increasingly important. One day, these answers will provide targets for preemptive interventions, like physical pre-habilitation, to stave off adverse changes in brain structure before they occur and progress towards clinical significance.
otolaryngology
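A minimal sketch of the longitudinal model family this record describes: a linear mixed-effects regression of regional volume on a vestibular measure, with fixed adjustments for age, sex and intracranial volume and subject-level random intercepts and slopes over time. The synthetic data frame and column names are hypothetical stand-ins, not the authors' exact specification.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic long-format data standing in for the real cohort: one row per visit.
rng = np.random.default_rng(0)
n_subj, n_visits = 60, 3
subj = np.repeat(np.arange(n_subj), n_visits)
visit_years = np.tile(np.arange(n_visits, dtype=float), n_subj)
age = np.repeat(rng.uniform(60, 90, n_subj), n_visits) + visit_years
sex = np.repeat(rng.integers(0, 2, n_subj), n_visits)
icv = np.repeat(rng.normal(1450, 100, n_subj), n_visits)       # intracranial volume, cm3
cvemp = np.repeat(rng.normal(1.0, 0.3, n_subj), n_visits)      # cVEMP amplitude
subj_intercept = np.repeat(rng.normal(0, 0.2, n_subj), n_visits)
subj_slope = np.repeat(rng.normal(0, 0.05, n_subj), n_visits)
thalamus = (6.0 + 0.04 * cvemp - 0.01 * (age - 75) + 0.002 * (icv - 1450)
            - 0.02 * visit_years + subj_intercept + subj_slope * visit_years
            + rng.normal(0, 0.05, n_subj * n_visits))
df = pd.DataFrame(dict(subject=subj, visit_years=visit_years, age=age, sex=sex,
                       icv=icv, cvemp=cvemp, thalamus_vol=thalamus))

# Fixed effects for cVEMP, age, sex, ICV and time; random intercept and slope per subject.
model = smf.mixedlm("thalamus_vol ~ cvemp + age + sex + icv + visit_years",
                    data=df, groups=df["subject"], re_formula="~visit_years")
result = model.fit()
print(result.params["cvemp"])   # the vestibular effect of interest
```

Under this parameterisation the coefficient on cvemp plays the role of the per-unit cVEMP effect reported in the abstract, although the published model may differ in covariates, coding and the regions examined.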
10.1101/2020.11.17.20233452
Patterns and persistence of SARS-CoV-2 IgG antibodies in a US metropolitan site
BackgroundEstimates of seroprevalence to SARS-CoV-2 vary widely and may influence vaccination response. We ascertained IgG levels across a single US metropolitan site, Chicago, from June 2020 through December 2020. MethodsParticipants (n=7935) were recruited through electronic advertising and received materials for a self-sampled dried blood spot assay through the mail or via a minimal-contact, in-person method. IgG to the receptor binding domain of SARS-CoV-2 was measured using an established highly sensitive and highly specific assay. ResultsOverall seroprevalence was 17.9%, with no significant difference between methods of contact. Only 2.5% of participants reported having had a diagnosis of COVID-19 based on virus detection, consistent with a 7-fold greater exposure to SARS-CoV-2 measured by serology than detected by viral testing. The range of IgG levels observed in seropositive participants from this community survey overlapped with the range of IgG levels associated with COVID-19 cases having a documented positive PCR test. Among the subset of participants who underwent repeat testing, half of seropositive individuals retained detectable antibodies for 3-4 months. ConclusionsQuantitative IgG measurements with a highly specific and sensitive assay indicate more widespread exposure to SARS-CoV-2 than observed by viral testing. The range of IgG concentrations produced by these asymptomatic exposures is similar to IgG levels occurring after documented non-hospitalized COVID-19, which is considerably lower than that produced by hospitalized COVID-19 cases. The differing ranges of IgG response, coupled with the rate of decay of antibodies, may influence response to subsequent viral exposure and vaccination.
infectious diseases
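The headline 7-fold figure in this record follows from simple arithmetic on the reported proportions; a one-line check using the values stated in the abstract:

```python
seroprevalence = 0.179        # IgG-positive fraction in the survey
reported_diagnoses = 0.025    # fraction reporting a virus-detection-based diagnosis
print(round(seroprevalence / reported_diagnoses, 1))  # ~7.2, i.e. roughly 7-fold
```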