Columns: id (string, length 16-27), title (string, length 18-339), abstract (string, length 95-38.7k), category (string, length 7-44)
10.1101/2020.09.20.20198499
Elevated serum uric acid is a facilitating mechanism for insulin resistance mediated accumulation of visceral adipose tissue
OBJECTIVE: Serum uric acid (SUA) has been associated with cardiometabolic conditions such as insulin resistance (IR) and visceral adipose tissue (VAT) accumulation. Here, we aimed to clarify a unifying mechanism linking elevated SUA to IR and VAT. METHODS: We conducted analyses in 226 subjects from the UIEM cohort with both euglycemic hyperinsulinemic clamp (EHC) and dual X-ray absorptiometry (DXA) measurements for IR and VAT accumulation, and explored the role of SUA and adiponectin by developing a network of causal mediation analyses to assess their impact on IR and VAT. These models were then translated to two population-based cohorts comprising 6,337 subjects from the NHANES 2003-2004 and 2011-2012 cycles in the US and ENSANUT Medio Camino 2016 in Mexico, using HOMA2-IR and adipo-IR as indicators of peripheral and adipose tissue IR, and METS-VF as a surrogate for VAT accumulation. RESULTS: SUA has a mediating role within a bidirectional relationship between IR and visceral obesity, which was similar whether gold-standard measurements or surrogate measures for IR and VAT were used. Furthermore, adiponectin acts as a linking mediator between elevated SUA and both peripheral IR and VAT accumulation. The proportion of the mechanism mediated was greater for IR-mediated (peripheral or adipose tissue) VAT accumulation than for VAT-mediated IR (10.53% [9.23%-12.00%] vs. 5.44% [3.78%-7.00%]). Normal-range SUA levels can be used to rule out underlying cardiometabolic abnormalities in both men and women. CONCLUSIONS: Elevated SUA acts as a mediator within the bidirectional relationship between IR and VAT accumulation, and these observations could be applicable at a phenotype scale.
endocrinology
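As a rough illustration of the causal mediation logic described in the record above, the sketch below runs a single product-of-coefficients mediation analysis on synthetic data: exposure IR, mediator SUA, outcome VAT. The variable names, coefficients, and the simple proportion-mediated formula are illustrative assumptions, not the authors' network of mediation models.

```python
# Minimal mediation sketch on synthetic data (not the UIEM/NHANES analyses).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1000
ir = rng.normal(size=n)                           # exposure: insulin resistance (e.g., HOMA2-IR)
sua = 0.3 * ir + rng.normal(size=n)               # mediator: serum uric acid
vat = 0.5 * ir + 0.4 * sua + rng.normal(size=n)   # outcome: VAT surrogate

med = sm.OLS(sua, sm.add_constant(ir)).fit()                              # mediator model: SUA ~ IR
a = med.params[1]
out = sm.OLS(vat, sm.add_constant(np.column_stack([ir, sua]))).fit()      # outcome model: VAT ~ IR + SUA
direct, b = out.params[1], out.params[2]
indirect = a * b                                                          # mediated (indirect) effect
prop_mediated = indirect / (indirect + direct)
print(f"indirect={indirect:.3f}, direct={direct:.3f}, proportion mediated={prop_mediated:.1%}")
```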
10.1101/2020.09.20.20198226
Large-Scale Hypothesis Testing for Causal Mediation Effects with Applications in Genome-wide Epigenetic Studies
In genome-wide epigenetic studies, it is of great scientific interest to assess whether the effect of an exposure on a clinical outcome is mediated through DNA methylation. However, statistical inference for causal mediation effects is challenged by the fact that one needs to test a large number of composite null hypotheses across the whole epigenome. Two popular tests, the Wald-type Sobel's test and the joint significance test using the traditional null distribution, are underpowered and thus can miss important scientific discoveries. In this paper, we show that the null distribution of Sobel's test is not the standard normal distribution and the null distribution of the joint significance test is not uniform under the composite null of no mediation effect, especially in finite samples and under the singular-point null case in which the exposure has no effect on the mediator and the mediator has no effect on the outcome. Our results explain why these two tests are underpowered and, more importantly, motivate us to develop a more powerful Divide-Aggregate Composite-null Test (DACT) for the composite null hypothesis of no mediation effect by leveraging epigenome-wide data. We adopted Efron's empirical null framework for assessing the statistical significance of the DACT test. We showed analytically that the proposed DACT method had improved power and could well control the type I error rate. Our extensive simulation studies showed that, in finite samples, the DACT method properly controlled the type I error rate and outperformed Sobel's test and the joint significance test for detecting mediation effects. We applied the DACT method to the US Department of Veterans Affairs Normative Aging Study, an ongoing prospective cohort study which included men who were aged 21 to 80 years at entry. We identified multiple DNA methylation CpG sites that might mediate the effect of smoking on lung function, with effect sizes ranging from -0.18 to -0.79 and the false discovery rate controlled at level 0.05, including CpG sites in the genes AHRR and F2RL3. Our sensitivity analysis found small residual correlations (less than 0.01) of the error terms between the outcome and mediator regressions, suggesting that our results are robust to unmeasured confounding factors.
genetic and genomic medicine
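For reference, the sketch below computes the two classical mediation tests the record above argues are underpowered: Sobel's Wald-type test and the joint significance (MaxP) test, for a single CpG site. The coefficient estimates and standard errors are hypothetical placeholders; the DACT procedure itself (Efron's empirical null) is not reproduced.

```python
# Sobel's test and the joint significance test for one exposure -> mediator -> outcome path.
import numpy as np
from scipy import stats

alpha_hat, se_alpha = 0.10, 0.04   # exposure -> mediator estimate and SE (hypothetical)
beta_hat, se_beta = 0.08, 0.05     # mediator -> outcome estimate and SE (hypothetical)

# Sobel's test: z = a*b / sqrt(a^2*se_b^2 + b^2*se_a^2), referred to N(0,1)
z_sobel = (alpha_hat * beta_hat) / np.sqrt(alpha_hat**2 * se_beta**2 + beta_hat**2 * se_alpha**2)
p_sobel = 2 * stats.norm.sf(abs(z_sobel))

# Joint significance (MaxP) test: p = max(p_alpha, p_beta)
p_alpha = 2 * stats.norm.sf(abs(alpha_hat / se_alpha))
p_beta = 2 * stats.norm.sf(abs(beta_hat / se_beta))
p_joint = max(p_alpha, p_beta)

print(f"Sobel p={p_sobel:.3f}, joint-significance p={p_joint:.3f}")
```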
10.1101/2020.09.22.20199869
Exploratory and confirmatory factor analysis of the Cognitive Social Capital Scale in a Colombian sample
During the last two decades, the concept of social capital has been used with increasing frequency in the health sciences due to the direct and indirect relationships between social capital and populations' physical and mental health. Therefore, it is necessary to build an instrument to quantify this concept confidently and reliably. The study aimed to perform exploratory and confirmatory factor analyses on a seven-item scale to measure social capital in adults of Colombia's general population. An online validation study was done, including a sample of 700 adults aged between 18 and 76 years; 68% were females. Participants completed a seven-item scale called the Cognitive Social Capital Scale (CSCS). Cronbach's alpha and McDonald's omega were computed to test internal consistency. Exploratory and confirmatory factor analyses were conducted to explore the dimensionality of the CSCS. The CSCS presented low internal consistency (Cronbach's alpha of 0.56 and McDonald's omega of 0.59) and poor dimensionality. A five-item version (CSCS-5) was then tested. The CSCS-5 showed high internal consistency (Cronbach's alpha of 0.79 and McDonald's omega of 0.80) and a one-dimensional structure with acceptable goodness-of-fit indicators. In conclusion, the CSCS-5 presents high internal consistency and a one-dimensional structure for measuring cognitive social capital in a Colombian sample. It can be recommended for measuring social capital in the general Colombian population. Further research should corroborate these findings in pencil-and-paper applications and explore other reliability and validity indicators.
public and global health
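A minimal sketch of the internal-consistency statistic reported in the record above: Cronbach's alpha for a k-item scale, computed from an item-by-respondent matrix. The synthetic single-factor data stand in for actual CSCS responses; McDonald's omega (which requires a factor model) is not shown.

```python
# Cronbach's alpha on synthetic scale data (rows = respondents, columns = items).
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: (n_respondents, k_items) matrix of item scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(1)
latent = rng.normal(size=(700, 1))                       # one underlying factor
items = latent + rng.normal(scale=1.0, size=(700, 5))    # five correlated Likert-like items
print(f"alpha = {cronbach_alpha(items):.2f}")
```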
10.1101/2020.09.22.20199844
Quantification of abdominal fat from computed tomography using deep learning and its association with electronic health records in an academic biobank
Objective: To develop a fully automated algorithm for abdominal fat segmentation, deploy this method at scale in an academic biobank, and associate the results with diagnoses. Materials and Methods: We built a fully automated image curation and labeling technique using deep learning and distributed computing to identify subcutaneous and visceral abdominal fat compartments from 47,587 CT scans of 13,422 patients in the Penn Medicine Biobank (PMBB). A classification network identified the inferior and superior borders of the abdomen, and a segmentation network differentiated visceral and subcutaneous fat. Following technical evaluation of our method, we conducted studies to validate known relationships with adiposity. Results: When compared with 100 manually annotated cases, the classification network was on average within one 5 mm slice for both the superior (0.3±0.6 slices) and inferior (0.7±0.6 slices) borders. The segmentation network also demonstrated excellent performance, with interclass correlation coefficients of 0.99 (p<2e-16) for subcutaneous and 0.99 (p<2e-16) for visceral fat on 100 testing cases. We performed integrative analyses of abdominal fat with the phenome extracted from the electronic health record and found highly significant associations with diabetes mellitus, hypertension, and renal failure, among other phenotypes. Conclusion: This work presents a fully automated and highly accurate method for the quantification of abdominal fat that can be applied to routine clinical imaging studies to fuel translational scientific discovery.
radiology and imaging
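A small sketch of the per-slice quantification step implied by the record above: given a CT slice in Hounsfield units and a predicted visceral/subcutaneous label mask, convert voxel counts into areas. The HU window, label encoding, and pixel spacing are assumptions for illustration; the segmentation network itself is not shown.

```python
# Convert a fat segmentation mask into subcutaneous and visceral areas (cm^2).
import numpy as np

FAT_HU = (-190, -30)   # commonly used adipose HU window (assumption)

def fat_areas_cm2(hu_slice: np.ndarray, label_mask: np.ndarray, pixel_mm=(0.7, 0.7)):
    """label_mask: 0=background, 1=subcutaneous, 2=visceral (hypothetical encoding)."""
    is_fat = (hu_slice >= FAT_HU[0]) & (hu_slice <= FAT_HU[1])
    px_cm2 = (pixel_mm[0] / 10.0) * (pixel_mm[1] / 10.0)
    sat = np.sum(is_fat & (label_mask == 1)) * px_cm2
    vat = np.sum(is_fat & (label_mask == 2)) * px_cm2
    return sat, vat

# Toy example: a 512x512 slice of pure fat, half labeled subcutaneous, half visceral
hu = np.full((512, 512), -100.0)
labels = np.zeros((512, 512), dtype=int)
labels[:, :256], labels[:, 256:] = 1, 2
print(fat_areas_cm2(hu, labels))
```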
10.1101/2020.09.20.20196907
Microfluidic Affinity Profiling reveals a Broad Range of Target Affinities for Anti-SARS-CoV-2 Antibodies in Plasma of Covid Survivors
The clinical outcome of SARS-CoV-2 infections, which can range from asymptomatic to lethal, is crucially shaped by the concentration of antiviral antibodies and by their affinity to their targets. However, the affinity of polyclonal antibody responses in plasma is difficult to measure. Here we used Microfluidic Antibody Affinity Profiling (MAAP) to determine the aggregate affinities and concentrations of anti-SARS-CoV-2 antibodies in plasma samples of 42 seropositive individuals, of whom 19 were healthy donors, 20 displayed mild symptoms, and 3 were critically ill. We found that the dissociation constants, Kd, of anti-receptor-binding-domain antibodies spanned 2.5 orders of magnitude, from sub-nanomolar to 43 nM. Using MAAP we found that antibodies of seropositive individuals induced the dissociation of pre-formed spike-ACE2 receptor complexes, which indicates that MAAP can be adapted as a complementary receptor-competition assay. By comparison with cytopathic-effect-based neutralisation assays, we show that MAAP can reliably predict the cellular neutralisation ability of sera, which may be an important consideration when selecting the most effective samples for therapeutic plasmapheresis and tracking the success of vaccinations.
allergy and immunology
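To make the reported Kd range concrete, the sketch below evaluates a simple equilibrium binding isotherm: the fraction of antigen bound as a function of antibody binding-site concentration for a sub-nanomolar Kd versus the weakest reported 43 nM. The concentrations and the no-depletion assumption are illustrative, not fitted MAAP parameters.

```python
# Fraction of antigen bound at equilibrium for two dissociation constants.
import numpy as np

def fraction_bound(ab_nM: np.ndarray, kd_nM: float) -> np.ndarray:
    """Assumes antibody binding sites in excess (no ligand depletion)."""
    return ab_nM / (ab_nM + kd_nM)

ab = np.array([0.1, 1.0, 10.0, 100.0])   # antibody binding-site concentrations in nM (hypothetical)
for kd in (0.5, 43.0):                   # sub-nanomolar vs. the weakest reported Kd
    print(f"Kd={kd} nM:", np.round(fraction_bound(ab, kd), 2))
```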
10.1101/2020.09.19.20198077
Testing the Ability of Convolutional Neural Networks to Learn Radiomic Features
Background and Objective: Radiomics and deep learning have emerged as two distinct approaches to medical image analysis. However, their relative expressive power remains largely unknown. Theoretically, hand-crafted radiomic features represent a mere subset of the features that neural networks can approximate, thus making deep learning a more powerful approach. On the other hand, automated learning of hand-crafted features may require a prohibitively large number of training samples. Here we directly test the ability of convolutional neural networks (CNNs) to learn and predict the intensity, shape, and texture properties of tumors as defined by standardized radiomic features. Methods: Conventional 2D and 3D CNN architectures with an increasing number of convolutional layers were trained and tested on their ability to predict the values of 16 standardized radiomic features from real and synthetic PET images of tumors. In addition, several ImageNet-pretrained advanced networks were tested. A total of 4000 images were used for training, 500 for validation, and 500 for testing. Results: Features quantifying size and intensity were predicted with high accuracy, while shape irregularity and heterogeneity features had very high prediction errors and generalized poorly. For example, the mean normalized prediction error of tumor diameter with a 5-layer CNN was 4.23 ± 0.25, while the error for tumor sphericity was 15.64 ± 0.93. We additionally found that learning shape features required an order of magnitude more samples compared to intensity and size features. Conclusions: Our findings imply that CNNs trained to perform various image-based clinical tasks may generally under-utilize the shape and texture information that is more easily captured by radiomics. We speculate that to improve CNN performance, shape and texture features can be computed explicitly and added as auxiliary variables to the networks, or supplied as synthetic inputs.
radiology and imaging
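The sketch below shows the kind of regression setup the record above describes: a small CNN trained to predict a single radiomic feature value (e.g., diameter or sphericity) from a 2D tumor patch. The architecture, channel counts, and input size are illustrative assumptions, not the paper's networks.

```python
# Toy CNN regressor for one radiomic feature value (PyTorch).
import torch
import torch.nn as nn

class RadiomicRegressor(nn.Module):
    def __init__(self, n_layers: int = 5):
        super().__init__()
        layers, ch = [], 1
        for i in range(n_layers):
            layers += [nn.Conv2d(ch, 16 * (i + 1), 3, padding=1), nn.ReLU(), nn.MaxPool2d(2)]
            ch = 16 * (i + 1)
        self.features = nn.Sequential(*layers)
        self.head = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(ch, 1))

    def forward(self, x):
        return self.head(self.features(x)).squeeze(-1)

model = RadiomicRegressor()
x = torch.randn(8, 1, 64, 64)              # batch of synthetic 64x64 tumor patches
y_hat = model(x)                           # predicted feature values
y = torch.rand(8)                          # "ground-truth" radiomic values (synthetic)
loss = nn.functional.mse_loss(y_hat, y)    # regression loss used for training
print(y_hat.shape, loss.item())
```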
10.1101/2020.09.22.20196048
College Openings, Mobility, and the Incidence of COVID-19 Cases
School and college reopening-closure policies are considered one of the most promising non-pharmaceutical interventions for mitigating infectious diseases. Nonetheless, the effectiveness of these policies is still debated, largely due to the lack of empirical evidence on behavior during implementation. We examined the association of U.S. college reopenings with changes in human mobility within campuses and in COVID-19 incidence in the counties of the campuses over a twenty-week period around college reopenings in the Fall of 2020. We used an integrative framework, with a difference-in-differences design comparing areas with a college campus, before and after reopening, to areas without a campus, and a Bayesian approach to estimate the daily reproductive number (Rt). We found that college reopenings were associated with increased campus mobility and with increased COVID-19 incidence of 3.4 cases per 100,000 (95% confidence interval [CI]: 1.5-5.4), or a 25% increase relative to the pre-period mean. This reflected our estimate of increased local transmission after reopening. A greater increase in county COVID-19 incidence resulted from campuses that drew students from counties with high COVID-19 incidence in the weeks before reopening (χ2(2) = 10.19, p = 0.006). Even by Fall of 2021, large shares of populations remained unvaccinated, increasing the relevance of understanding non-pharmaceutical decisions over an extended period of a pandemic. Our study sheds light on movement and social mixing patterns during the closure and reopening of colleges during a public health threat, and offers strategic instruments for benefit-cost analyses of school reopening/closure policies.
public and global health
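A minimal sketch of the difference-in-differences design used in the record above: a county-by-week panel with campus counties as the treated group and reopening as the post period, estimated with two-way fixed effects and county-clustered standard errors. The panel is synthetic and the variable names are illustrative, not the study's data or exact specification.

```python
# Difference-in-differences on a synthetic county-week panel (statsmodels).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
counties, weeks = 200, 20
df = pd.DataFrame([{"county": c, "week": w,
                    "has_college": int(c < 100),
                    "post": int(w >= 10)}
                   for c in range(counties) for w in range(weeks)])
df["treated"] = df["has_college"] * df["post"]
# Outcome: cases per 100k, with a +3.4 treatment effect built into the simulation
df["cases_per_100k"] = 13 + 3.4 * df["treated"] + rng.normal(scale=5, size=len(df))

# Two-way fixed effects DiD with cluster-robust SEs by county
fit = smf.ols("cases_per_100k ~ treated + C(county) + C(week)", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["county"]})
print(fit.params["treated"], fit.bse["treated"])
```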
10.1101/2020.09.22.20196048
College Openings in the United States Increased Mobility and COVID-19 Incidence
School and college reopening-closure policies are considered one of the most promising non-pharmaceutical interventions for mitigating infectious diseases. Nonetheless, the effectiveness of these policies is still debated, largely due to the lack of empirical evidence on behavior during implementation. We examined the association of U.S. college reopenings with changes in human mobility within campuses and in COVID-19 incidence in the counties of the campuses over a twenty-week period around college reopenings in the Fall of 2020. We used an integrative framework, with a difference-in-differences design comparing areas with a college campus, before and after reopening, to areas without a campus, and a Bayesian approach to estimate the daily reproductive number (Rt). We found that college reopenings were associated with increased campus mobility and with increased COVID-19 incidence of 3.4 cases per 100,000 (95% confidence interval [CI]: 1.5-5.4), or a 25% increase relative to the pre-period mean. This reflected our estimate of increased local transmission after reopening. A greater increase in county COVID-19 incidence resulted from campuses that drew students from counties with high COVID-19 incidence in the weeks before reopening (χ2(2) = 10.19, p = 0.006). Even by Fall of 2021, large shares of populations remained unvaccinated, increasing the relevance of understanding non-pharmaceutical decisions over an extended period of a pandemic. Our study sheds light on movement and social mixing patterns during the closure and reopening of colleges during a public health threat, and offers strategic instruments for benefit-cost analyses of school reopening/closure policies.
public and global health
10.1101/2020.09.23.20199091
Spatial proteomic characterization of HER2-positive breast tumors through neoadjuvant therapy predicts response
Addition of HER2-targeted agents to neoadjuvant chemotherapy has dramatically improved pathological complete response (pCR) rates in early-stage HER2-positive breast cancer. Still, up to 50% of patients have residual disease following treatment, while others are likely overtreated. Here, we performed multiplex spatial proteomic characterization of 122 samples from 57 HER2-positive breast tumors from the neoadjuvant TRIO-US B07 clinical trial, sampled pre-treatment, after 14-21 days of HER2-targeted therapy, and at surgery. We demonstrate that proteomic changes following a single cycle of HER2-targeted therapy aid the identification of tumors that ultimately undergo pCR, outperforming pre-treatment measures or transcriptomic changes. We further developed and validated a classifier that robustly predicts pCR using a single marker, CD45, measured on-treatment, and show that CD45-positive cell counts measured via conventional immunohistochemistry perform comparably. These results demonstrate novel biomarkers that enable the stratification of sensitive tumors early during neoadjuvant HER2-targeted therapy, with implications for tailoring subsequent therapy.
oncology
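The sketch below illustrates a single-marker classifier of the kind described in the record above: predicting pCR from an on-treatment CD45 measurement via logistic regression and summarizing discrimination with the AUC. The data, effect size, and model choice are illustrative assumptions, not the trial's validated classifier.

```python
# Single-marker pCR classifier on synthetic data (scikit-learn).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)
n = 57
pcr = rng.integers(0, 2, size=n)                          # 1 = pCR, 0 = residual disease
cd45 = 1.0 + 0.8 * pcr + rng.normal(scale=0.5, size=n)    # higher on-treatment CD45 in responders (assumed)

clf = LogisticRegression().fit(cd45.reshape(-1, 1), pcr)
scores = clf.predict_proba(cd45.reshape(-1, 1))[:, 1]
print("AUC:", round(roc_auc_score(pcr, scores), 2))
```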
10.1101/2020.09.24.20201053
Psychopathology in mothers of children with pathogenic Copy Number Variants
Caring for children with pathogenic neurodevelopmental Copy Number Variants (CNVs) (i.e., deletions and duplications of genetic material) can place a considerable burden on parents and their quality of life. Our study is the first to examine the frequency of psychiatric diagnoses in mothers of children with CNVs compared with the frequency of psychiatric problems in age-matched mothers from a large community study. The 268 mothers of children with CNVs had a higher frequency of depression than the 2,680 age-matched mothers (p<0.001). Mothers of children with CNVs also reported higher frequencies of anorexia, bulimia, alcohol abuse and drug addiction than the age-matched mothers from the community sample. The frequency of depression arising after the birth of the index child was similar between the two groups (48% in mothers of children with CNVs vs. 44% in mothers from the community sample, p=0.43), but mothers of children with CNVs had a higher frequency of anxiety (55%) compared with mothers from the community sample (30%, p=0.03). Our study highlights the need for health-care providers to devise treatment plans that not only focus on meeting the child's needs, but also assess and, if needed, address the mental health needs of the parent.
psychiatry and clinical psychology
10.1101/2020.09.24.20197293
Associations between inhibitory control, stress, and alcohol (mis)use during the first wave of the COVID-19 pandemic in the UK: a national cross-sectional study utilising data from four birth cohorts
Background: The impact of COVID-19 on the UK population's alcohol intake is unknown. We assessed change in alcohol use and hazardous drinking during the first lockdown, and tested the hypothesis that variation would be predicted by stress and inhibitory control. Methods: We interrogated cross-sectional data from the first sweep of the COVID-19 longitudinal survey, comprising 4 national cohorts (13,453 respondents, 19-62 years). Respondents self-reported their alcohol use, stress, and inhibitory control. We regressed change in drinking and alcohol misuse on stress and inhibitory control, adjusting for covariates to account for demographics. Findings: 29.08% of 30-year-olds increased alcohol use post-COVID-19. Stress was a major contributing factor to increased alcohol use in 30-year-olds (adjusted OR 3.92, 95% CI 1.17-13.15), as was inhibitory control in 19-year-olds (adjusted OR 1.14, 95% CI 1.05-1.23), 30-year-olds (adjusted OR 1.18, 95% CI 1.05-1.33) and 50-year-olds (adjusted OR 1.06, 95% CI 1.01-1.12). We identified several interactions between stress and inhibitory control in all age groups, suggesting a complex, age-specific relationship between the risk factors and alcohol use and misuse during the pandemic. Interpretation: In the UK, alcohol use increased in up to 30% of the population during COVID-19, resulting from a combination of factors including poor inhibitory control and stress. It is critical in future lockdowns that clinicians and public health officials are aware of the challenges faced by different age groups, and prioritise and personalise interventions and prevention measures appropriately. Funding: ESRC, Foundation for Liver Research. Putting research into context. Evidence before the study: We searched PubMed, Web of Science, EBSCO Discovery, bioRxiv, medRxiv, and PsyArXiv for articles published between Jan 1, 2020 and Sep 1, 2020, with the following keywords: "covid-19", "coronavirus", and "alcohol". We prioritised the selection of references based on relevance, importance, opportunity for further reading, and whether the work had been peer-reviewed. There have been several published articles that address the issue of alcohol use and misuse during COVID-19, including a number of editorials and some limited empirical work. There were no nationally representative studies about alcohol use in the UK. In addition, all of the studies identified simply reported figures on alcohol use during the pandemic, and to the best of our knowledge, none covered risk factors for alcohol misuse. Added value of this study: Using data from the COVID-19 national longitudinal survey (first sweep), comprising data from 18,000 people across five national cohorts (aged 19-74), we tested the hypothesis that people who reported higher levels of stress, and who self-reported low impulse control, would show higher rates of alcohol use and misuse during the pandemic lockdown. First, we show the proportion of adults across the UK who were drinking more during the pandemic, and how this differed by age and gender. Second, we show that while higher levels of stress were associated with higher levels of alcohol intake in some groups (e.g., 30-year-olds), the relationship was complex and multifaceted. Stress-induced alcohol use and misuse was dependent on age and personality characteristics, with low impulse control predictive of higher levels of alcohol consumption in 19-, 30- and 50-year-olds, and several stress x personality interactions. Implications of all the available evidence: Stress, as well as poor inhibitory control, were risk factors for susceptibility to increased alcohol intake and hazardous drinking during the early stages of the COVID-19 pandemic and lockdown. The government, healthcare professionals, and the global media should consider the impact of changes in lifestyle and stress that might affect alcohol consumption among at-risk individuals during any future lockdowns. Similarly, additional support needs to be put in place for those who may go on to develop an alcohol use disorder or relapse.
addiction medicine
10.1101/2020.09.24.20196097
Genotype-phenotype correlations and novel molecular insights into the DHX30-associated neurodevelopmental disorders
Background: We aimed to define the clinical and mutational spectrum of, and to provide novel molecular insights into, the DHX30-associated neurodevelopmental disorder. Methods: Clinical and genetic data from affected individuals were collected through a family support group, GeneMatcher, and our network of collaborators. We investigated the impact of novel missense variants with respect to ATPase and helicase activity, stress granule (SG) formation, global translation, and their effect on embryonic development in zebrafish. SG formation was additionally analyzed in CRISPR/Cas9-mediated DHX30-deficient HEK293T and zebrafish models, along with in vivo behavioral assays. Results: We identified 25 previously unreported individuals, ten of whom carry novel variants, two of which are recurrent, and provide evidence of gonadal mosaicism in one family. All 19 individuals harboring heterozygous missense variants within helicase core motifs (HCMs) have global developmental delay, intellectual disability, severe speech impairment and gait abnormalities. These variants impair the ATPase and helicase activity of DHX30, trigger SG formation, interfere with global translation, and cause developmental defects in a zebrafish model. Notably, 4 individuals harboring heterozygous variants resulting either in haploinsufficiency or in truncated proteins presented with a milder clinical course, similar to an individual bearing a de novo mosaic HCM missense variant. Functionally, we established DHX30 as an ATP-dependent RNA helicase and as an evolutionarily conserved factor in SG assembly. Based on the clinical course and the variant location and type, we establish two distinct clinical subtypes. DHX30 loss-of-function mutations cause a milder phenotype, whereas a severe phenotype is caused by HCM missense mutations that, in addition to the loss of ATPase and helicase activity, lead to a detrimental gain of function with respect to SG formation. Behavioral characterization of dhx30-deficient zebrafish revealed altered sleep-wake activity and social interaction, partially resembling the human phenotype. Conclusions: Our study highlights the usefulness of social media in defining novel Mendelian disorders, and exemplifies how functional analyses accompanied by clinical and genetic findings can define clinically distinct subtypes for ultra-rare disorders. Such approaches require close interdisciplinary collaboration between families/legal representatives of the affected individuals, clinicians, molecular genetics diagnostic laboratories, and research laboratories.
genetic and genomic medicine
10.1101/2020.09.24.20201178
Face Masks, Public Policies and Slowing the Spread of COVID-19: Evidence from Canada
We estimate the impact of indoor face mask mandates and other non-pharmaceutical interventions (NPI) on COVID-19 case growth in Canada. Mask mandate introduction was staggered from mid-June to mid-August 2020 in the 34 public health regions of Ontario, Canada's largest province by population. Using this variation, we find that mask mandates are associated with a 22 percent weekly reduction in new COVID-19 cases, relative to the trend in the absence of a mandate. Province-level data provide corroborating evidence. We control for mobility behaviour using Google geo-location data and for lagged case totals and case growth as information variables. Our analysis of additional survey data shows that mask mandates led to an increase of about 27 percentage points in self-reported mask wearing in public. Counterfactual policy simulations suggest that adopting a nationwide mask mandate in June could have reduced the total number of diagnosed COVID-19 cases in Canada by over 50,000 over the period July-November 2020. Jointly, our results indicate that mandating mask wearing in indoor public places can be a powerful policy tool to slow the spread of COVID-19. JEL codes: I18, I12, C23
health economics
10.1101/2020.09.25.20201327
40 Hz Auditory Steady-State Responses Predict Clinical Outcomes in Clinical-High-Risk Participants: A MEG Study
Objective: To examine whether 40-Hz auditory steady-state responses (ASSR) in participants at clinical high risk for psychosis predict clinical outcomes. Method: In this study, magnetoencephalography (MEG) data were collected during a 40-Hz ASSR paradigm in 116 participants meeting clinical high-risk (CHR-P) for psychosis criteria, a clinical control group characterized by affective disorders and/or substance abuse (CHR-N: n=38), 32 first-episode psychosis patients (FEP, 14 antipsychotic-naive), and 49 healthy controls. We examined 40-Hz ASSR source activity in bilateral Heschl's gyrus, superior temporal gyrus, Rolandic operculum, and the thalamus. Group differences in ASSR amplitudes were tested and correlated with neuropsychological scores, psychosocial functioning, and clinical symptoms. Linear discriminant analysis was used to assess whether the 40-Hz ASSR predicts transition to psychosis and persistence of APS. Results: Compared to controls, 40-Hz ASSR responses in CHR-P participants were impaired in the right Rolandic operculum (d=0.41) and right thalamus (d=0.43), particularly in those with combined UHR/BS symptoms and in CHR-P participants who transitioned to psychosis (n=11). FEP patients showed significant impairments in the right thalamus (d=0.58), while the CHR-N group was unaffected. Importantly, right thalamus 40-Hz ASSRs predicted transition to psychosis (transitioned [n=11] vs. non-transitioned [n=105]; classification accuracy 73.3%, AUC=0.827), whereas this was not the case for persistent APS (persistent [n=41] vs. non-persistent [n=37]; classification accuracy 56.4%). Conclusions: The current study indicates that MEG-recorded 40-Hz ASSRs constitute a potential biomarker for predicting transition to psychosis in CHR-P participants.
psychiatry and clinical psychology
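A minimal sketch of the prediction step named in the record above: linear discriminant analysis using a single right-thalamus 40-Hz ASSR feature to separate transitioned from non-transitioned participants, with cross-validated accuracy. The feature values, group separation, and cross-validation scheme are illustrative assumptions, not the study's MEG pipeline.

```python
# LDA classification of transition vs. non-transition from one synthetic ASSR feature.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
n_trans, n_nontrans = 11, 105
assr = np.concatenate([rng.normal(-0.5, 1.0, n_trans),    # reduced ASSR amplitude in transitioned group (assumed)
                       rng.normal(0.0, 1.0, n_nontrans)]).reshape(-1, 1)
y = np.concatenate([np.ones(n_trans), np.zeros(n_nontrans)])

lda = LinearDiscriminantAnalysis()
acc = cross_val_score(lda, assr, y, cv=5, scoring="accuracy")
print("cross-validated accuracy:", acc.mean().round(3))
```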
10.1101/2020.09.25.20201848
Effect of cannabis liberalization on suicide and mental illness following recreational access: a state-level longitudinal analysis in the USA
Objective: To standardize the implementation dates of various cannabis liberalization policies and determine whether previous research by Anderson et al. [D.M. Anderson, D.I. Rees, J.J. Sabia, American Journal of Public Health 104, 2369-2376] on medical marijuana access and population-level suicidality is robust to additional years of data and further cannabis liberalization in the form of recreational marijuana access. Design: A state-level longitudinal (panel) analysis. Suicide mortality rates from the National Center for Health Statistics and mental health morbidity rates from the National Survey on Drug Use and Health were employed with the procedures outlined by Anderson et al., using weighted ordinary least squares for three different specifications with various combinations of control variables as a sensitivity analysis to test for robustness. Setting: All 50 states and Washington, DC for the period 1999-2019. Participants: USA population. Interventions: Cannabis liberalization policies in the form of recreational or medical access. Primary and Secondary Outcome Measures: State-level population mental health outcomes in the form of suicide mortality among various age groups for males and females, defined by the International Classification of Diseases, Tenth Revision, and rates of mental illness, serious mental illness, major depression, and suicidal ideation, defined by the Substance Abuse and Mental Health Services Administration. Results: Recreational marijuana access was associated with a 6.29% reduction (95% CI -11.82% to -0.42%) in suicide rates for males in the 40 to 49 age group. No other mental health outcomes were consistently affected by cannabis liberalization. Conclusions: Adverse mental health outcomes do not follow cannabis liberalization at the state level, confirming the findings of Anderson et al. In addition, there is evidence that recreational marijuana access reduces suicide rates for middle-aged males. Strengths and limitations of this study: (1) Cannabis liberalization policies, which vary greatly throughout the literature, are explicitly defined and corrected from previous studies. (2) SAMHSA suppresses state-level geographical information for individual-level responses in the NSDUH, and the analysis relied on population averages for a small number of age groups published in the NSDUH State Prevalence Estimates, which did not allow us to evaluate gender differences for mental health outcomes. (3) The reliability of NSDUH and suicide data to estimate true population rates is highly debated. (4) Population-level analyses of longitudinal data can be evaluated with multiple accepted methods from the medical literature, and it is not clear whether weighted ordinary least squares is the most appropriate approach for this type of analysis.
health policy
10.1101/2020.09.26.20200097
Threshold-free genomic cluster detection to track transmission pathways in healthcare settings
Background: Over the past decade, whole-genome sequencing (WGS) has become the gold standard for tracking the spread of infections in healthcare settings. However, a critical barrier to the routine application of WGS for infection prevention is the lack of reliable criteria for determining whether a genomic linkage is consistent with transmission. Methods: Here, we sought to understand the genomic landscape in a high-transmission healthcare setting by performing WGS on 435 carbapenem-resistant Enterobacterales (CRE) isolates collected from 256 patients through admission and biweekly surveillance culturing of virtually every hospitalized patient over a 1-year period. Findings: Our analysis revealed that the standard approach of employing a single-nucleotide variant (SNV) threshold to define transmission would lead to both false-positive and false-negative inferences. False-positive inferences were driven by the frequent importation of closely related strains, which were presumably linked via transmission at a connected healthcare facility. False-negative inferences stemmed from the diversity of colonizing populations being spread among patients, with multiple examples of hypermutator strains emerging within patients and leading to putative transmission links separated by large genetic distances. Motivated by the limitations of an SNV threshold, we implemented a novel threshold-free transmission cluster inference approach whereby each of the 234 acquired CRE isolates was linked back to the imported CRE isolate with which it shared the most variants. This approach yielded clusters that varied in levels of genetic diversity but were highly enriched in patients sharing epidemiologic links. Holistic examination of clusters highlighted extensive variation in the magnitude of onward transmission stemming from the more than 100 importation events and revealed patterns in cluster propagation that could inform improvements to infection prevention strategies. Interpretation: Overall, our results show how the integration of culture surveillance data into genomic analyses can overcome the limitations of cluster detection based on SNV thresholds and improve our ability to track pathways of pathogen transmission in healthcare settings. Funding: CDC U54 CK000481, CDC U54 CK00016 04S2. S.E.H. was supported by the University of Michigan NIH Training Program in Translational Research T32-GM113900 and the University of Michigan Rackham pre-doctoral fellowship. Research in context. Evidence before this study: We searched PubMed for studies published before May 1, 2021, with no start date restriction, with the search "transmission AND whole-genome AND (snp OR snv) AND (cut-off OR threshold) NOT (SARS-CoV-2 OR virus or HIV)". We identified 18 reports that used whole genome sequencing to study transmission, primarily in healthcare settings. Several of these studies attempted to identify optimal single nucleotide variant (SNV) cutoffs for delineating transmission. These studies were all single-site and had only partial sampling of healthcare facilities. Moreover, even when the same species was considered, different optimal SNV thresholds were reported. Added value of this study: To understand the limitations of an SNV-threshold approach for tracking transmission, we leveraged a data set that comprised admission and every-other-week CRE surveillance culturing for every patient entering a hospital over the course of one year. By performing genomic analysis of 435 isolates from the 256 CRE-colonized patients, we systematically demonstrated pitfalls of the use of SNV thresholds for transmission inference that stem from the importation of closely related strains from connected healthcare facilities, variation in the genetic heterogeneity of colonizing populations, and uneven evolutionary rates of CRE strains colonizing patients. We went on to implement an alternative approach for tracking transmission in healthcare facilities that relies on genetic context, instead of genetic distance, to group patients into intra-facility transmission clusters. We applied this approach to our CRE genomes and demonstrated that the resultant transmission clusters are strongly enriched in patients with spatiotemporal overlap, and that clusters can be interrogated to identify putative targets for interrupting transmission. Implications of all the available evidence: Advances in the speed and economy of genome sequencing are making it increasingly feasible to perform routine sequencing to track transmission in healthcare settings. However, a critical barrier to these efforts is the lack of clear criteria for inferring transmission that generalize to diverse strains of healthcare pathogens and are robust to variation in organism prevalence and differences in the connectivity of local healthcare networks. Here, we show that by combining genome sequencing with surveillance data, healthcare transmission can be inferred in a threshold-free manner. The success of this approach in a setting with high importation and transmission rates bodes well for its generalizability to less challenging healthcare settings.
epidemiology
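The core of the threshold-free linkage idea described in the record above can be expressed very compactly: assign each acquired isolate to the imported isolate with which it shares the most variants, rather than applying a fixed SNV cutoff. The variant sets and isolate names below are toy data for illustration only.

```python
# Threshold-free cluster assignment by maximum shared variants (toy example).
from collections import defaultdict

imported = {                              # imported isolate -> set of variant positions (toy)
    "imp_A": {1, 5, 9, 12, 40},
    "imp_B": {2, 5, 7, 33},
}
acquired = {
    "acq_1": {1, 5, 9, 12, 41, 42},       # closest to imp_A despite extra private variants
    "acq_2": {2, 5, 7, 33, 90},
}

clusters = defaultdict(list)
for acq, vars_a in acquired.items():
    best = max(imported, key=lambda imp: len(vars_a & imported[imp]))   # most shared variants wins
    clusters[best].append(acq)

print(dict(clusters))   # {'imp_A': ['acq_1'], 'imp_B': ['acq_2']}
```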
10.1101/2020.09.27.20202481
The Characteristic of Auditory Function and Cochlear Pathophysiology in a Noise-exposed Cohort: A Cross-sectional Study
Background: To determine the characteristics and sex differences of auditory perception and cochlear function in individuals with long-term occupational noise exposure. Methods: Young workers with long-term occupational noise exposure from a shipyard were recruited in the current study as the hidden hearing loss (HHL) risk group. Age-matched office workers in the same shipyard who had no occupational noise-exposure history were enrolled in the control group. The auditory processing measures of speech-in-noise (SIN) score and gap detection threshold (GDT) were further examined by sex. The cochlear function measures of action potential (AP) and summating potential (SP)/AP values were tested and compared by sex and side. The correlation between the SIN score and cochlear function was studied by sex. The correlation between either auditory processing ability or cochlear function and occupational-noise working length (OWL) was also analysed in the HHL risk group. Results: Significantly decreased SIN scores and a higher GDT for the 4 kHz gap marker were found only in men in the HHL risk group. Although the hearing thresholds of the women in the HHL risk group were slightly but significantly worse than those of the women in the control group, no significant defects in auditory processing or temporal resolution were found between the two groups. Significantly decreased cochlear function and increased SP/AP values in the left ear were found only in men in the HHL risk group. Neither the AP amplitude nor the AP latency differed significantly between the two groups by sex. A correlation study indicated that only the correlation between the SIN score and the AP amplitude of the right ear in men was significant. No significant correlation was found between the SIN score and cochlear function in women. The AP latency of the right ear was significantly correlated with OWL only in men. Conclusion: In individuals with normal hearing and long-term occupational noise exposure, defects in auditory processing, temporal resolution and cochlear function showed sex differences, none of which were significant in women. In men, a weak correlation between the SIN score and the AP amplitude of the right ear was found. There was only a weak correlation between OWL and the AP latency of the right ear in men. Our findings indicate that men are more vulnerable to occupational noise than women. Considering the noise-exposure dose differences between the control and HHL risk groups, our measures are insensitive to cochlear synaptopathy in humans.
otolaryngology
10.1101/2020.09.27.20202481
Characteristic and Sex Differences in Auditory Function and Cochlear Pathophysiology in a Noise-exposed Cohort: A Cross-sectional Study
Background: To determine the characteristics and sex differences of auditory perception and cochlear function in individuals with long-term occupational noise exposure. Methods: Young workers with long-term occupational noise exposure from a shipyard were recruited in the current study as the hidden hearing loss (HHL) risk group. Age-matched office workers in the same shipyard who had no occupational noise-exposure history were enrolled in the control group. The auditory processing measures of speech-in-noise (SIN) score and gap detection threshold (GDT) were further examined by sex. The cochlear function measures of action potential (AP) and summating potential (SP)/AP values were tested and compared by sex and side. The correlation between the SIN score and cochlear function was studied by sex. The correlation between either auditory processing ability or cochlear function and occupational-noise working length (OWL) was also analysed in the HHL risk group. Results: Significantly decreased SIN scores and a higher GDT for the 4 kHz gap marker were found only in men in the HHL risk group. Although the hearing thresholds of the women in the HHL risk group were slightly but significantly worse than those of the women in the control group, no significant defects in auditory processing or temporal resolution were found between the two groups. Significantly decreased cochlear function and increased SP/AP values in the left ear were found only in men in the HHL risk group. Neither the AP amplitude nor the AP latency differed significantly between the two groups by sex. A correlation study indicated that only the correlation between the SIN score and the AP amplitude of the right ear in men was significant. No significant correlation was found between the SIN score and cochlear function in women. The AP latency of the right ear was significantly correlated with OWL only in men. Conclusion: In individuals with normal hearing and long-term occupational noise exposure, defects in auditory processing, temporal resolution and cochlear function showed sex differences, none of which were significant in women. In men, a weak correlation between the SIN score and the AP amplitude of the right ear was found. There was only a weak correlation between OWL and the AP latency of the right ear in men. Our findings indicate that men are more vulnerable to occupational noise than women. Considering the noise-exposure dose differences between the control and HHL risk groups, our measures are insensitive to cochlear synaptopathy in humans.
otolaryngology
10.1101/2020.09.27.20202762
Cytokines (IL-17, IL-23 and IL-33) in Systemic lupus erythematosus in Trinidad and Tobago.
Systemic lupus erythematosus (SLE) is the most common autoimmune disease. It is characterized by the presence of hundreds of autoantibodies against many organs and tissues, including a large number of autoantibodies specific to self-antigens mainly of nuclear origin, such as the Smith antigen, double-stranded DNA (dsDNA), anti-Sjögren's-syndrome-related antigens A and B (SSA/Ro and SSB/La, respectively) and ribonucleoproteins, which are the hallmarks of the disease. Type I and II interferons, interleukin-6 (IL-6), IL-1, tumor necrosis factor-alpha (TNF-α), and immunomodulatory cytokines such as IL-10 and TGF-β are essential players in SLE. Additionally, T-cell-derived cytokines such as IL-17, IL-21, and IL-2 are dysregulated in SLE. In this study of a cohort of 60 individuals attending hospital clinics in Trinidad and Tobago, blood samples were analyzed, the levels of these key cytokines were measured, and disease activity was assessed with the SLE Disease Activity Index (SLEDAI) 2000 score. The results confirmed that serum IL-17 and IL-23 levels were positively correlated with the SLEDAI 2000 score in these patients. These findings have diagnostic and therapeutic implications. However, more work must be done targeting other cytokines relevant to autoimmunity and SLE in particular. Interleukin-33 is not an SLE marker, as has been noted in other populations.
rheumatology
10.1101/2020.09.27.20199737
High-Quality Masks Reduce Infections and Deaths in the US
Objectives: To evaluate the effectiveness of widespread adoption of masks or face coverings to reduce community transmission of the SARS-CoV-2 virus that causes Covid-19. Methods: We employed an agent-based stochastic network simulation model, in which Covid-19 progresses across census tracts according to a variant of the SEIR model. We considered a mask order that was initiated 3.5 months after the first confirmed Covid-19 case. We evaluated scenarios in which wearing a mask reduces transmission and susceptibility by 50% or 80%, and an individual wears a mask with a probability of 0%, 20%, 40%, 60%, 80%, or 100%. Results: If 60% of the population wears masks that are 50% effective, this decreases the cumulative infection attack rate (CAR) by 25%, the peak prevalence by 51%, and the population mortality by 25%. If 100% of people wear masks (or 60% wear masks that are 80% effective), this decreases the CAR by 38%, the peak prevalence by 67%, and the population mortality by 40%. Conclusions: After community transmission is present, masks can significantly reduce infections.
infectious diseases
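The sketch below captures the masking logic of the record above in a much simpler form: a deterministic SEIR integration in which mask coverage and per-contact effectiveness scale the transmission rate, comparing the cumulative infection attack rate (CAR) with and without masks. All parameters, and the squared reduction factor for transmission plus susceptibility, are illustrative assumptions, not the paper's calibrated agent-based network model.

```python
# Deterministic SEIR with a mask-adjusted transmission rate (daily Euler steps).
def seir_attack_rate(beta, coverage=0.0, effectiveness=0.0,
                     sigma=1/4, gamma=1/7, days=365, N=1.0):
    beta_eff = beta * (1 - coverage * effectiveness) ** 2   # masks cut transmission and susceptibility
    S, E, I, R = N - 1e-4, 0.0, 1e-4, 0.0
    for _ in range(days):
        new_inf = beta_eff * S * I / N
        S, E, I, R = (S - new_inf,
                      E + new_inf - sigma * E,
                      I + sigma * E - gamma * I,
                      R + gamma * I)
    return R / N   # cumulative attack rate

base = seir_attack_rate(beta=0.35)
masked = seir_attack_rate(beta=0.35, coverage=0.6, effectiveness=0.5)
print(f"CAR without masks: {base:.2f}, with 60% coverage of 50%-effective masks: {masked:.2f}")
```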
10.1101/2020.09.28.20203109
DeepCOVID: An Operational Deep Learning-driven Framework for Explainable Real-time COVID-19 Forecasting
How do we forecast an emerging pandemic in real time in a purely data-driven manner? How do we leverage rich heterogeneous data based on various signals such as mobility, testing, and/or disease exposure for forecasting? How do we handle noisy data and generate uncertainties in the forecast? In this paper, we present DeepCOVID, an operational deep learning framework designed for real-time COVID-19 forecasting. DeepCOVID works well with sparse data and can handle noisy heterogeneous data signals by propagating the uncertainty from the data in a principled manner, resulting in meaningful uncertainties in the forecast. The deployed framework also consists of modules for both real-time and retrospective exploratory analysis to enable interpretation of the forecasts. Results from real-time predictions (featured on the CDC website and FiveThirtyEight.com) since April 2020 indicate that our approach is competitive among the methods in the COVID-19 Forecast Hub, especially for short-term predictions.
epidemiology
10.1101/2020.09.28.20202978
A steady trickle-down from metro districts and improving epidemic-parameters characterized the increasing COVID-19 cases in India
Background: By mid-September of 2020, the number of daily new infections in India crossed 95,000. We aimed to characterize the spatio-temporal shifts in the disease burden as infections rose during the first wave of COVID-19. Methods: We gathered the publicly available district-level (equivalent of counties) granular data for the 15 April to 31 August 2020 period. We used the epidemiological data from the 186 districts with the highest case burden as of August 31 (559,566 active cases and 2,715,656 cumulative infections), and the governing epidemic parameters were estimated by fitting the data to a susceptible-asymptomatic-infected-recovered-dead (SAIRD) model. The space-time trends in the case burden and epidemic parameters were analyzed. When the physical proximity of the districts did not explain the spreading patterns, we developed a metric for the accessibility of the districts via air and train travel. The districts were categorized as large metro, metro, urban and sub-urban, and the spatial shifts in case burden were analyzed. Results: The center of the burden of current active infections, which on May 15 was in the large metro districts with easy international access, shifted continuously and smoothly towards districts accessible by domestic airports and by trains. A linear trend analysis showed a continuous improvement in the governing epidemic parameters consistently across the four categories of districts. The reproduction numbers improved from 1.77 ± 0.58 on May 15 to 1.07 ± 0.13 on August 31 in large metro districts (p-value of trend 0.0001053), and from 1.58 ± 0.39 on May 15 to 0.94 ± 0.11 on August 31 in sub-urban districts (p-value of trend 0.0067). The recovery rate per infected person per day improved from 0.0581 ± 0.009 on May 15 to 0.091 ± 0.010 on August 31 in large metro districts (p-value of trend 0.26 x 10^-12), and from 0.059 ± 0.011 on May 15 to 0.100 ± 0.010 on August 31 in sub-urban districts (p-value of trend 0.12 x 10^-16). The death rate of symptomatic individuals, which includes the case-fatality rate as well as the time from symptoms to death, consistently decreased from 0.0025 ± 0.0014 on May 15 to 0.0013 ± 0.0003 on August 31 in large metro districts (p-value of trend 0.0010), and from 0.0018 ± 0.0008 on May 15 to 0.0014 ± 0.0003 on August 31 in sub-urban districts (p-value of trend 0.2789). Conclusions: As the daily infections continued to rise at a national level, the "center" of the pandemic burden shifted smoothly and predictably towards smaller districts in a clear hierarchy of accessibility from an international travel perspective. This observed trend was meant to serve as an alert to reorganize healthcare resources towards remote districts. The geographical spreading patterns continue to be relevant as the second wave of infections began in March 2021 with a center in the mid-range districts. Funding: None
epidemiology
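For orientation, the sketch below integrates one plausible reading of the SAIRD compartmental structure named in the record above (susceptible-asymptomatic-infected-recovered-dead) with SciPy. The flow structure and parameter values are illustrative assumptions; they are not the fitted district-level estimates.

```python
# Illustrative SAIRD compartmental model integrated with solve_ivp.
import numpy as np
from scipy.integrate import solve_ivp

def saird(t, y, beta, delta, gamma, mu, N):
    S, A, I, R, D = y
    new_inf = beta * S * (A + I) / N         # both asymptomatic and symptomatic transmit (assumed)
    dS = -new_inf
    dA = new_inf - delta * A                 # asymptomatic progress to symptomatic
    dI = delta * A - gamma * I - mu * I      # symptomatic recover or die
    dR = gamma * I
    dD = mu * I
    return [dS, dA, dI, dR, dD]

N = 1e6
y0 = [N - 100, 80, 20, 0, 0]
sol = solve_ivp(saird, (0, 120), y0, args=(0.25, 0.2, 0.09, 0.0015, N),
                t_eval=np.linspace(0, 120, 121))
print("active symptomatic infections on day 120:", int(sol.y[2, -1]))
```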
10.1101/2020.09.29.20203646
Tracking WhatsApp behaviors during a crisis: A longitudinal observation of messaging activities during the COVID-19 pandemic
Background: Worldwide, social media traffic increased following the onset of the coronavirus disease (COVID-19) pandemic. Although the spread of COVID-19 content has been described for several social media platforms (e.g., Twitter, Facebook), little is known about how content is spread via private messaging platforms such as WhatsApp. Objective: In this study, we documented: (i) how WhatsApp is used to transmit COVID-19 content; (ii) the characteristics of WhatsApp users based on their usage patterns; and (iii) how usage patterns link to well-being. Methods: We used the experience sampling method to track day-to-day WhatsApp usage during the COVID-19 pandemic. For one week, participants reported each day the extent to which they had received, forwarded, or discussed COVID-19 content. The final dataset comprised 924 data points collected from 151 participants. Results: During the week-long monitoring, most participants (143/151, 95%) reported at least one COVID-19-related use of WhatsApp. When a taxonomy was generated based on usage patterns, 1 in 10 participants (21/151, 14%) were found to have received and shared a high volume of forwarded COVID-19 content - akin to super spreaders identified on other social media platforms. Finally, those who engaged with more COVID-19 content in their personal chats were more likely to report having COVID-19 thoughts throughout the day. Conclusions: These findings provide a rare window into discourse on private messenger platforms. In turn, this can inform risk communication strategies during the pandemic.
public and global health
10.1101/2020.09.29.20203653
Identification of atypical circulating tumor cells with prognostic value in metastatic breast cancer patients
Background: Circulating tumor cells (CTCs) have strong potential as a quasi-non-invasive tool for setting up precision medicine strategies for cancer patients. Tremendous efforts have been made to develop second-generation "filtration-based" technologies to detect CTCs, revealing a surprising heterogeneity among those cells. Here, we performed the largest simultaneous analysis of all atypical circulating tumor cells (aCTCs) detected with a filtration-based technology in a cohort of metastatic breast cancer (mBC) patients, and correlated their presence with clinicopathological and survival data. Methods: The PERMED-01 study enrolled patients with mBC refractory to systemic therapy. We prospectively analyzed aCTCs present at the time of inclusion in the study, using the ScreenCell(R) Cyto device (n=91). Subset cut-offs were established and evaluated for correlation with clinicopathological data, including progression-free survival (PFS) and overall survival (OS). Results: The median number of aCTCs found in mBC was 8.3 per mL of blood. Three subsets of aCTCs, absent from controls, were observed in mBC patients: single aCTCs (s-aCTCs), circulating tumor micro-emboli (CTM), and giant aCTCs (g-aCTCs). The presence of g-aCTCs was associated with shorter PFS and OS in multivariate analyses. For 23 cases, the analysis was completed with advanced immunofluorescence staining and showed that CTM and g-aCTCs displayed a hybrid phenotype for epithelial and mesenchymal markers. Conclusions: This study highlights the heterogeneity of aCTCs in mBC patients at both the cytomorphological and molecular levels when using the ScreenCell(R) Cyto device. It reveals the g-aCTC subset as a prognostic factor and a potential stratification tool that might help orient the therapeutic care of late-stage mBC patients.
oncology
10.1101/2020.09.28.20203166
Comparison of infection control strategies to reduce COVID-19 outbreaks in homeless shelters in the United States: a simulation study
Background: COVID-19 outbreaks have occurred in homeless shelters across the US, highlighting an urgent need to identify the most effective infection control strategy to prevent future outbreaks. Methods: We developed a microsimulation model of SARS-CoV-2 transmission in a homeless shelter and calibrated it to data from cross-sectional polymerase-chain-reaction (PCR) surveys conducted during COVID-19 outbreaks in five shelters in three US cities from March 28 to April 10, 2020. We estimated the probability of averting a COVID-19 outbreak when an exposed individual is introduced into a representative homeless shelter of 250 residents and 50 staff over 30 days under different infection control strategies, including daily symptom-based screening, twice-weekly PCR testing and universal mask wearing. Results: The proportion of PCR-positive residents and staff at the shelters with observed outbreaks ranged from 2.6% to 51.6%, which translated to basic reproduction number (R0) estimates of 2.9-6.2. The probability of averting an outbreak diminished with higher transmissibility (R0) within the simulated shelter and increasing incidence in the local community. With moderate community incidence (~30 confirmed cases/1,000,000 people/day), the estimated probabilities of averting an outbreak in a low-risk (R0=1.5), moderate-risk (R0=2.9), and high-risk (R0=6.2) shelter were, respectively: 0.35, 0.13 and 0.04 for daily symptom-based screening; 0.53, 0.20, and 0.09 for twice-weekly PCR testing; 0.62, 0.27 and 0.08 for universal masking; and 0.74, 0.42 and 0.19 for these strategies combined. Conclusions: In high-risk homeless shelter environments and locations with high community incidence of COVID-19, even intensive infection control strategies (incorporating daily symptom-screening, frequent PCR testing and universal mask wearing) are unlikely to prevent outbreaks, suggesting a need for non-congregate housing arrangements for people experiencing homelessness. In lower-risk environments, combined interventions should be employed to reduce outbreak risk.
infectious diseases
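A highly simplified sketch of the "probability of averting an outbreak" quantity reported in the record above: repeat a stochastic branching simulation after a single introduction and count the runs in which transmission dies out before exceeding a small case threshold, with an intervention represented as a multiplier on R0. The threshold, multiplier, and branching assumptions are illustrative, not the calibrated shelter microsimulation.

```python
# Probability of averting an outbreak via a toy Poisson branching process.
import numpy as np

def prob_averted(R0, intervention_multiplier=1.0, threshold=3, n_sims=5000, max_gens=20, seed=0):
    rng = np.random.default_rng(seed)
    R_eff = R0 * intervention_multiplier
    averted = 0
    for _ in range(n_sims):
        infected, total = 1, 1
        for _ in range(max_gens):
            infected = rng.poisson(R_eff * infected)   # offspring of the current generation
            total += infected
            if infected == 0 or total > threshold:
                break
        averted += total <= threshold                  # "outbreak" defined as > threshold cases (assumption)
    return averted / n_sims

print("no intervention:", prob_averted(R0=2.9))
print("combined interventions (assumed 50% reduction):", prob_averted(R0=2.9, intervention_multiplier=0.5))
```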
10.1101/2020.09.29.20204107
Prospective assessment of catheter-associated bacteriuria in nursing home residents: clinical presentation, epidemiology, and colonization dynamics
Background: Catheterization facilitates continuous bacteriuria, the clinical significance of which remains unclear. This study aimed to determine the clinical presentation, epidemiology, and dynamics of bacteriuria in a cohort of long-term catheterized nursing home residents. Methods: Prospective urine culture, urinalysis, chart review, and assessment of signs and symptoms of infection were performed weekly for 19 study participants over 7 months. All bacteria at ≥10^3 CFU/ml were cultured, isolated, identified, and tested for susceptibility to select antimicrobials. Results: 226 of the 234 urine specimens were polymicrobial (97%), with an average of 4.7 isolates per weekly specimen. 228 urines (97%) exhibited ≥10^6 CFU/ml, 220 (94%) exhibited abnormal urinalysis, 126 (54%) were associated with at least one possible sign or symptom of infection, and 82 (35%) would potentially meet a standardized definition of CAUTI, but only 3 had a caregiver diagnosis of CAUTI. 286 (30%) of the bacterial isolates were resistant to a tested antimicrobial agent, and bacteriuria composition was remarkably stable despite a combined total of 54 catheter changes and 23 weeks of antimicrobial use. Conclusions: Bacteriuria composition was largely polymicrobial, including persistent colonization by organisms previously considered to be urine culture contaminants. Neither antimicrobial use nor catheter changes sterilized the urine, at most resulting in transient reductions in bacterial burden followed by new acquisition of resistant isolates. Thus, this patient population exhibits a high prevalence of bacteriuria coupled with potential indicators of infection, necessitating further exploration to identify sensitive markers of true infection. Funding: This work was supported by the NIH (R00 DK105205, R01 DK123158, UL1 TR001412)
infectious diseases
10.1101/2020.09.29.20203869
Systematic review and meta-analysis of randomized trials of hydroxychloroquine for the prevention of COVID-19
BackgroundRecruitment into randomized trials of hydroxychloroquine (HCQ) for prevention of COVID-19 has been adversely affected by a widespread conviction that HCQ is not effective for prevention. In the absence of an updated systematic review, we conducted a meta-analysis of randomized trials that study the effectiveness of HCQ to prevent COVID-19. MethodsA search of PubMed and medRxiv with expert consultation found ten completed randomized trials: seven pre-exposure prophylaxis trials and three post-exposure prophylaxis trials. We obtained or calculated the risk ratio of COVID-19 diagnosis for assignment to HCQ versus no HCQ (either placebo or usual care) for each trial, and then pooled the risk ratio estimates. ResultsThe pooled risk ratio estimate of the pre-exposure prophylaxis trials was 0.72 (95% CI: 0.58-0.91) when using either a fixed effect or a standard random effects approach, and 0.72 (95% CI: 0.52-1.00) when using a conservative modification of the Hartung-Knapp random effects approach. The corresponding estimates for the post-exposure prophylaxis trials were 0.91 (95% CI: 0.71-1.16) and 0.91 (95% CI: 0.54-1.55). All trials found a similar rate of serious adverse effects in the HCQ and no HCQ groups. DiscussionA benefit of HCQ as prophylaxis for COVID-19 cannot be ruled out based on the available evidence from randomized trials. However, the "not statistically significant" findings from early prophylaxis trials were widely interpreted as definite evidence of lack of effectiveness of HCQ. This interpretation disrupted the timely completion of the remaining trials and thus the generation of precise estimates for pandemic management before the development of vaccines.
epidemiology
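A minimal sketch of the fixed-effect (inverse-variance) pooling of trial risk ratios described above; the per-trial risk ratios and confidence intervals below are placeholders, not the actual trial results.

import numpy as np

# Hypothetical per-trial risk ratios and 95% CIs (placeholders only)
trials = [(0.75, 0.50, 1.12), (0.68, 0.45, 1.03), (0.80, 0.55, 1.16)]

log_rr = np.array([np.log(rr) for rr, lo, hi in trials])
# Recover standard errors from the CI width on the log scale
se = np.array([(np.log(hi) - np.log(lo)) / (2 * 1.96) for rr, lo, hi in trials])

weights = 1.0 / se**2                      # inverse-variance weights
pooled = np.sum(weights * log_rr) / np.sum(weights)
pooled_se = np.sqrt(1.0 / np.sum(weights))

print(f"Pooled RR = {np.exp(pooled):.2f} "
      f"(95% CI {np.exp(pooled - 1.96 * pooled_se):.2f}-{np.exp(pooled + 1.96 * pooled_se):.2f})")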
10.1101/2020.09.29.20203877
Analysis of COVID-19 case numbers: adjustment for diagnostic misclassification on the example of German case reporting data
BackgroundReported COVID-19 case numbers are key to monitoring pandemic spread and decision-making on policy measures but require careful interpretation as they depend substantially on testing strategy. A high and targeted testing activity is essential for a successful Test-Trace-Isolate strategy. However, it also leads to increased numbers of false-positives and can foster a debate on the actual pandemic state, which can slow down action and acceptance of containment measures. AimWe evaluate the impact of misclassification in COVID-19 diagnostics on reported case numbers and estimated numbers of disease onsets (epidemic curve). MethodsWe developed a statistical adjustment of reported case numbers for erroneous diagnostic results that facilitates a misclassification-adjusted real-time estimation of the epidemic curve based on nowcasting. Under realistic misclassification scenarios, we provide adjusted case numbers for Germany and illustrate misclassification-adjusted nowcasting for Bavarian data. ResultsWe quantify the impact of diagnostic misclassification on time-series of reported case numbers, highlighting the relevance of a specificity smaller than one when test activity changes over time. Adjusting for misclassification, we find that the increase of cases starting in July might have been smaller than indicated by raw case counts, but cannot be fully explained by increasing numbers of false-positives due to increased testing. The effect of misclassification becomes negligible when true incidence is high. ConclusionsAdjusting case numbers for misclassification can improve this important measure on short-term dynamics of the pandemic and should be considered in data-based surveillance. Further limitations of case reporting data exist and have to be considered.
epidemiology
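A minimal sketch of the kind of adjustment described above: correcting daily reported positives for imperfect test sensitivity and specificity (a Rogan-Gladen-style correction). The counts, sensitivity, and specificity are illustrative, not the values used by the authors; the point of the example is that a rise in reported cases can partly reflect rising test volume when specificity is below one.

import numpy as np

def adjust_cases(reported_pos, n_tests, sensitivity=0.95, specificity=0.999):
    """Estimate true positives from reported positives under imperfect testing.
    reported_pos = sens*true + (1-spec)*(n_tests - true)  =>  solve for true."""
    reported_pos = np.asarray(reported_pos, dtype=float)
    n_tests = np.asarray(n_tests, dtype=float)
    true = (reported_pos - (1 - specificity) * n_tests) / (sensitivity - (1 - specificity))
    return np.clip(true, 0, None)

# Illustrative week: rising test volume with roughly constant true incidence
reported = [120, 130, 150, 160, 180, 200, 220]
tests = [50_000, 60_000, 80_000, 90_000, 110_000, 130_000, 150_000]
print(np.round(adjust_cases(reported, tests)))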
10.1101/2020.09.28.20203265
Altered EEG markers of synaptic plasticity in a human model of NMDA receptor deficiency: anti-NMDA receptor encephalitis
Plasticity of synaptic strength and density is a vital mechanism enabling memory consolidation, learning, and neurodevelopment. It is strongly dependent on the intact function of N-methyl-D-aspartate receptors (NMDAR). The importance of NMDAR is further evident as their dysfunction is involved in many diseases such as schizophrenia, Alzheimer's disease, neurodevelopmental disorders, and epilepsies. Synaptic plasticity is thought to be reflected by changes of sleep slow wave slopes across the night, namely higher slopes after wakefulness at the beginning of sleep than after a night of sleep. Hence, a functional NMDAR deficiency should theoretically lead to altered overnight changes of slow wave slopes. Here we investigated whether pediatric patients with anti-NMDAR encephalitis, being a very rare but unique human model of NMDAR deficiency due to autoantibodies against receptor subunits, indeed show alterations in this sleep EEG marker for synaptic plasticity. We retrospectively analyzed 12 whole-night EEGs of 9 patients (age 4.3-20.8 years, 7 females) and compared them to a control group of 45 healthy individuals with the same age distribution. Slow wave slopes were calculated for the first and last hour of non-rapid eye movement (NREM) sleep (factor hour) for patients and controls (factor group). There was a significant interaction between hour and group (p = 0.013), with patients showing a smaller overnight decrease of slow wave slopes than controls. Moreover, we found smaller slopes during the first hour in patients (p = 0.022), whereas there was no group difference during the last hour of NREM sleep (p = 0.980). Importantly, the distribution of sleep stages was not different between the groups, and in our main analyses of patients without severe disturbance of sleep architecture, neither was the incidence of slow waves. These possible confounders could therefore not account for the differences in the slow wave slope values, which we also saw in the analysis of the whole sample of EEGs. These results suggest that quantitative EEG analysis of slow wave characteristics may reveal impaired synaptic plasticity in patients with anti-NMDAR encephalitis, a human model of functional NMDAR deficiency. Thus, in the future, the changes of sleep slow wave slopes may contribute to the development of electrophysiological biomarkers of functional NMDAR deficiency and synaptic plasticity in general. Highlights: Changes of slow waves in overnight EEGs are thought to reflect synaptic plasticity. Synaptic plasticity is strongly dependent on intact NMDAR function. Antibody-mediated NMDAR deficiency occurs in patients with anti-NMDAR encephalitis. In this human model of NMDAR deficiency, we found altered slow wave changes. Sleep EEG measures may mark NMDAR-related impairments of synaptic plasticity.
neurology
10.1101/2020.09.28.20202879
Impact of visual impairment on balance and visual processing functions in students with special educational needs
IntroductionVision is a critical factor for children's development. However, prevalence of visual impairment (VI) is high in students with special educational needs (SEN). Other than vision disability, this group of students is prone to having functional deficits. It is unclear whether visual problems relate to these compromised functional deficits. This study aimed to assess the impact of vision on visual processing functions and balance performance in SEN students through a community service in special schools. MethodsA total of 104 (chronological age 14.3 ± 4.3 years, 43 females) SEN students in Taiwan were assessed and classified as having normal vision (NV) or vision impairment (VI). Visual acuity (distance and near) and contrast sensitivity (CS) were measured as the visual outcomes. Visual processing function assessment included facial expression recognition by Heidi expression test, in terms of card matching (FEC), and examiner's facial expression matching (FEE), and visual orientation recognition (by mailbox game, VO). Dynamic balance was assessed with the Timed Up and Go (TUG) test, while static standing balance was assessed using a force plate to measure the postural sway in double-legged feet-together and tandem stance with eyes open and closed conditions. Static balance was presented in terms of the change in the centre of pressure in maximal medial-lateral (ML) and antero-posterior (AP) sways, sway variability (V), and sway path length (L). ResultsAlthough visual acuity was significantly worse in VI than NV (p < 0.001), CS was similar in the two groups (p = 0.08). VO, FEC, and FEE also did not differ significantly between groups (p > 0.05). NV performed better in the TUG than VI (p = 0.03). There was a significant interaction between eye condition and the vision group (p < 0.05) for static balance. Pairwise comparisons showed that NV swayed significantly less in ML than VI under tandem stance-open eye condition (p = 0.04), but significantly more in closed eye condition (p = 0.03). Conversely, VI had less V and shorter L than NV under tandem stance-closed eye condition (p = 0.03). ConclusionThis study is the first to our knowledge to examine the effect of vision on visual processing functions and balance performance in SEN students. Vision did not appear to be the major reason for impairment in visual processing. However, vision plays an important role in maintaining dynamic and static balance in SEN students.
occupational and environmental health
10.1101/2020.09.30.20204636
Impact of reproduction number on multiwave spreading dynamics of COVID-19 with temporary immunity : a mathematical model
COVID-19 is caused by a hitherto nonexistent pathogen, hence the immune response to the disease is currently unknown. Studies conducted over the past few weeks have found that the antibody titre levels in the blood plasma of infected patients decrease over time, as is common for acute viral infections. Fully documented reinfection cases from Hong Kong, India, Belgium and USA, as well as credible to anecdotal evidence of second-time cases from other countries, bring into sharp focus the question of what profile the epidemic trajectories may take if immunity were really to be temporary in a significant fraction of the population. Here we use mathematical modeling to answer this question, constructing a novel delay differential equation model which is tailored to accommodate different kinds of immune response. We consider two immune responses here : (a) where a recovered case becomes completely susceptible after a given time interval following infection and (b) where a first-time recovered case becomes susceptible to a lower virulence infection after a given time interval following recovery, and becomes permanently immunized by a second infection. We find possible solutions exhibiting large number of waves of disease in the first situation and two to three waves in the second situation. Interestingly however, these multiple wave solutions are manifest only for some intermediate values of the reproduction number R, which is governed by public health intervention measures. For sufficiently low as well as sufficiently high R, we find conventional single-wave solutions despite the short-lived immunity. Our results cast insight into the potential spreading dynamics of the disease and might also be useful for analysing the spread after a vaccine is invented, and mass vaccination programs initiated.
infectious diseases
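Not the authors' delay differential equation model, but a minimal discrete-time sketch of scenario (a), in which recovered individuals become fully susceptible again after a fixed delay; R0, the infectious period, the immunity duration, and the wave-detection threshold are all illustrative assumptions.

import numpy as np

def simulate(r0=2.0, infectious_days=7, immune_days=120, days=1000, n=1_000_000, i0=100):
    """Discrete-time SIR with temporary immunity: recoveries re-enter S after immune_days."""
    beta, gamma = r0 / infectious_days, 1.0 / infectious_days
    s, i = n - i0, i0
    recoveries = np.zeros(days)            # daily recoveries, later recycled into S
    cases = []
    for t in range(days):
        new_inf = beta * s * i / n
        new_rec = gamma * i
        waned = recoveries[t - immune_days] if t >= immune_days else 0.0
        s += waned - new_inf
        i += new_inf - new_rec
        recoveries[t] = new_rec
        cases.append(new_inf)
    return np.array(cases)

daily = simulate(r0=2.0)
# Count epidemic waves as local maxima above a small threshold
peaks = [t for t in range(1, len(daily) - 1)
         if daily[t] > daily[t - 1] and daily[t] > daily[t + 1] and daily[t] > 50]
print(f"{len(peaks)} wave(s), peak days: {peaks}")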
10.1101/2020.09.29.20204420
On Mendelian Randomization Mixed-Scale Treatment Effect Robust Identification (MR MiSTERI) and Estimation for Causal Inference
Standard Mendelian randomization analysis can produce biased results if the genetic variant defining the instrumental variable (IV) is confounded and/or has a horizontal pleiotropic effect on the outcome of interest not mediated by the treatment. We provide novel identification conditions for the causal effect of a treatment in the presence of unmeasured confounding by leveraging an invalid IV for which both the IV independence and exclusion restriction assumptions may be violated. The proposed Mendelian Randomization Mixed-Scale Treatment Effect Robust Identification (MR MiSTERI) approach relies on (i) an assumption that the treatment effect does not vary with the invalid IV on the additive scale; (ii) that the selection bias due to confounding does not vary with the invalid IV on the odds ratio scale; and (iii) that the residual variance for the outcome is heteroscedastic and thus varies with the invalid IV. Although assumptions (i) and (ii) have, respectively, appeared in the IV literature, assumption (iii) has not; we formally establish that their conjunction can identify a causal effect even with an invalid IV subject to pleiotropy. MR MiSTERI is shown to be particularly advantageous in the presence of pervasive heterogeneity of pleiotropic effects on the additive scale. For estimation, we propose a simple and consistent three-stage estimator that can be used as a preliminary estimator for a carefully constructed one-step-update estimator, which is guaranteed to be more efficient under the assumed model. In order to incorporate multiple, possibly correlated and weak IVs, a common challenge in MR studies, we develop a MAny Weak Invalid Instruments (MR MaWII MiSTERI) approach for strengthened identification and improved estimation accuracy. Both simulation studies and UK Biobank data analysis results demonstrate the robustness of the proposed MR MiSTERI method.
epidemiology
10.1101/2020.09.29.20204388
Aggregative trans-eQTL analysis detects trait-specific target gene sets in whole blood
Large scale genetic association studies have identified many trait-associated variants and understanding the role of these variants in downstream regulation of gene-expressions can uncover important mediating biological mechanisms. In this study, we propose Aggregative tRans assoCiation to detect pHenotype specIfic gEne-sets (ARCHIE), as a method to establish links between sets of known genetic variants associated with a trait and sets of co-regulated gene-expressions through trans associations. ARCHIE employs sparse canonical correlation analysis based on summary statistics from trans-eQTL mapping and genotype and expression correlation matrices constructed from external data sources. A resampling based procedure is then used to test for significant trait-specific trans-association patterns in the background of highly polygenic regulation of gene-expression. Simulation studies show that compared to standard trans-eQTL analysis, ARCHIE is better suited to identify "core"-like genes through which effects of many other genes may be mediated and which can explain disease specific patterns of genetic associations. By applying ARCHIE to available trans-eQTL summary statistics reported by the eQTLGen consortium, we identify 71 gene networks which have significant evidence of trans-association with groups of known genetic variants across 29 complex traits. Around half (50.7%) of the selected genes do not have any strong trans-associations and could not have been detected by standard trans-eQTL mapping. We provide further evidence for causal basis of the target genes through a series of follow-up analyses. These results show ARCHIE is a powerful tool for identifying sets of genes whose trans regulation may be related to specific complex traits. The method has potential for broader application to the identification of networks of various types of molecular traits that mediate the genetic associations of complex traits.
genetic and genomic medicine
10.1101/2020.09.29.20199893
Multi-trait genome-wide association study identifies new loci associated with sleep apnoea risk
BackgroundSleep apnoea is a disorder characterised by periods of halted breathing during sleep. Despite its association with severe health conditions such as cardiovascular disease, liver problems and metabolic syndrome, the aetiology of sleep apnoea remains understudied, and previous genetic analyses have not identified many robustly associated genetic risk variants. MethodsWe performed a genome-wide association study (GWAS) meta-analysis of sleep apnoea across five cohorts (N_total = 523,366) and multi-trait analysis of GWAS (MTAG) to boost statistical power, leveraging the high genetic correlation between sleep apnoea and snoring. GWAS and MTAG results were adjusted to control for the effects of body mass index (BMI) using the multi-trait-based conditional & joint analysis (mtCOJO) method. Lead hit replication was conducted in an independent sample GWAS from 23andMe, Inc. that included BMI as a covariate (N_total = 1,477,352; N_cases = 175,522). Lastly, we explored genetic correlations with other complex traits and performed a phenome-wide screen for causally associated phenotypes using the latent causal variable method. ResultsOur MTAG with snoring uncovered a total of 49 independent significant sleep apnoea loci; adjusting for BMI showed that approximately half of those loci act via their effects on BMI/obesity. Twenty-nine variants replicated in the 23andMe cohort. Several complex traits, including multisite chronic pain, diabetes, eye disorders, high blood pressure, osteoarthritis, chronic obstructive pulmonary disease, and BMI-related traits were genetically correlated with sleep apnoea. ConclusionsOur study identifies multiple robust genetic markers for sleep apnoea, increasing our understanding of the aetiology of this condition and its relationship with other complex traits.
genetic and genomic medicine
10.1101/2020.09.29.20203950
Alzheimer's disease variant portal (ADVP): a catalog of genetic findings for Alzheimer's disease
Alzheimer's Disease (AD) genetics has made substantial progress through genome-wide association studies (GWASs). An up-to-date resource providing harmonized, searchable information on AD genetic variants with links to genes and supporting functional evidence is needed. We developed the Alzheimer's Disease Variant Portal (ADVP), an extensive collection of associations curated from >200 GWAS publications from the Alzheimer's Disease Genetics Consortium (ADGC) and other researchers. Publications are reviewed systematically to extract top associations for harmonization and genomic annotation. ADVP V1.0 catalogs 6,990 associations with disease-risk, expression quantitative traits, endophenotypes and neuropathology across >900 loci, >1,800 variants, >80 cohorts, and 8 populations. ADVP integrates with the NIAGADS Alzheimer's GenomicsDB where investigators can cross-reference other functional evidence. ADVP is a valuable resource for investigators to quickly and systematically explore high-confidence AD genetic findings and provides insights into population- and tissue-specific AD genetic architecture. ADVP is continually maintained and enhanced by NIAGADS and is freely accessible (https://advp.niagads.org).
genetic and genomic medicine
10.1101/2020.09.30.20204909
Lineage-specific protection and immune imprinting shape the age distributions of influenza B cases
How a history of influenza virus infections contributes to protection is not fully understood, but such protection might explain the contrasting age distributions of cases of the two lineages of influenza B, B/Victoria and B/Yamagata. Fitting a statistical model to those distributions using surveillance data from New Zealand, we found they could be explained by historical changes in lineage frequencies combined with cross-protection between strains of the same lineage. We found additional protection against B/Yamagata in people for whom it was their first influenza B infection, similar to the immune imprinting observed in influenza A. While the data were not informative about B/Victoria imprinting, B/Yamagata imprinting could explain the fewer B/Yamagata than B/Victoria cases in cohorts born in the 1990s and the bimodal age distribution of B/Yamagata cases. Longitudinal studies can test if these forms of protection inferred from historical data extend to more recent strains and other populations.
epidemiology
10.1101/2020.10.02.20194241
Detecting cryptic clinically-relevant structural variation in exome sequencing data increases diagnostic yield for developmental disorders
Structural Variation (SV) describes a broad class of genetic variation greater than 50 bp in size. SVs can cause a wide range of genetic diseases and are prevalent in rare developmental disorders (DD). Patients presenting with DD are often referred for diagnostic testing with chromosomal microarrays (CMA) to identify large copy-number variants (CNVs) and/or with single gene, gene-panel, or exome sequencing (ES) to identify single nucleotide variants, small insertions/deletions, and CNVs. However, patients with pathogenic SVs undetectable by conventional analysis often remain undiagnosed. Consequently, we have developed the novel tool InDelible, which interrogates short-read sequencing data for split-read clusters characteristic of SV breakpoints. We applied InDelible to 13,438 probands with severe DD recruited as part of the Deciphering Developmental Disorders (DDD) study and discovered 64 rare, damaging variants in genes previously associated with DD missed by standard SNV, InDel or CNV discovery approaches. Clinical review of these 64 variants determined that about half (30/64) were plausibly pathogenic. InDelible was particularly effective at ascertaining variants between 21 and 500 bp in size, and increased the total number of potentially pathogenic variants identified by DDD in this size range by 42.3%. Of particular interest were seven confirmed de novo variants in MECP2 which represent 35.0% of all de novo protein truncating variants in MECP2 among DDD patients. InDelible provides a framework for the discovery of pathogenic SVs that are likely missed by standard analytical workflows and has the potential to improve the diagnostic yield of ES across a broad range of genetic diseases.
genetic and genomic medicine
10.1101/2020.10.01.20205096
Interactions between seasonal human coronaviruses and implications for the SARS-CoV-2 pandemic: A retrospective study in Stockholm, Sweden, 2009-2020
ObjectivesThe four seasonal coronaviruses 229E, NL63, OC43, and HKU1 are frequent causes of respiratory infections and show annual and seasonal variation. Increased understanding about these patterns could be informative about the epidemiology of SARS-CoV-2. MethodsResults from PCR diagnostics for the seasonal coronaviruses, and other respiratory viruses, were obtained for 55,190 clinical samples analysed at the Karolinska University Hospital, Stockholm, Sweden, between 14 September 2009 and 2 April 2020. ResultsSeasonal coronaviruses were detected in 2,130 samples (3.9%) and constituted 8.1% of all virus detections. OC43 was most commonly detected (28.4% of detections), followed by NL63 (24.0%), HKU1 (17.6%), and 229E (15.3%). The overall fraction of positive samples was similar between seasons, but at species level there were distinct biennial alternating peak seasons for the Alphacoronaviruses, 229E and NL63, and the Betacoronaviruses, OC43 and HKU1, respectively. The Betacoronaviruses peaked earlier in the winter season (Dec-Jan) than the Alphacoronaviruses (Feb-Mar). Coronaviruses were detected across all ages, but diagnostics were more frequently requested for paediatric patients than adults and the elderly. OC43 and 229E incidence was relatively constant across age strata, while that of NL63 and HKU1 decreased with age. ConclusionsBoth the Alphacoronaviruses and Betacoronaviruses showed alternating biennial winter incidence peaks, which suggests some type of immune mediated interaction. Symptomatic reinfections in adults and the elderly appear relatively common. Both findings may be of relevance for the epidemiology of SARS-CoV-2.
infectious diseases
10.1101/2020.10.01.20205021
Stochastic forecasting of COVID-19 daily new cases across countries with a novel hybrid time series model
An unprecedented outbreak of the novel coronavirus (COVID-19) in the form of peculiar pneumonia has spread globally since its first case in Wuhan province, China, in December 2019. Soon after, the infected cases and mortality increased rapidly. The future of the pandemic's progress was uncertain, and thus, predicting it became crucial for public health researchers. These future predictions help the effective allocation of health care resources, stockpiling, and help in strategic planning for clinicians, government authorities, and public health policymakers after understanding the extent of the effect. The main objective of this paper is to develop a hybrid forecasting model that can generate real-time out-of-sample forecasts of COVID-19 outbreaks for five profoundly affected countries, namely the USA, Brazil, India, UK, and Canada. A novel hybrid approach based on the Theta method and Autoregressive neural network (ARNN) model, named Theta-ARNN (TARNN) model, is developed. Daily new cases of COVID-19 are nonlinear, non-stationary, and volatile; thus a single specific model cannot be ideal for future prediction of the pandemic. However, the newly introduced hybrid forecasting model with an acceptable prediction error rate can help healthcare and government for effective planning and resource allocation. The proposed method outperforms traditional univariate and hybrid forecasting models for the test data sets on average.
epidemiology
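A minimal sketch of the general hybrid idea (a smooth trend model plus a neural network fitted to lags of its residuals); a simple drift trend and scikit-learn's MLPRegressor stand in for the Theta and ARNN components named above, and random numbers replace real case counts.

import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
y = np.cumsum(rng.poisson(5, 200)).astype(float)       # placeholder "daily cases" series

# Step 1: linear (drift) trend as a crude stand-in for the Theta-method forecast
t = np.arange(len(y))
slope, intercept = np.polyfit(t, y, 1)
trend = intercept + slope * t
resid = y - trend

# Step 2: neural network on lagged residuals (stand-in for the ARNN component)
p = 7                                                   # number of residual lags
X = np.column_stack([resid[i:len(resid) - p + i] for i in range(p)])
target = resid[p:]
mlp = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0).fit(X, target)

# One-step-ahead hybrid forecast = trend forecast + predicted residual
trend_next = intercept + slope * len(y)
resid_next = mlp.predict(resid[-p:].reshape(1, -1))[0]
print(f"hybrid one-step forecast: {trend_next + resid_next:.1f}")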
10.1101/2020.10.02.20198663
Quantifying the dynamics of COVID-19 burden and impact of interventions in Java, Indonesia
BackgroundAs in many countries, quantifying COVID-19 spread in Indonesia remains challenging due to testing limitations. In Java, non-pharmaceutical interventions (NPIs) were implemented throughout 2020. However, as a vaccination campaign launches, cases and deaths are rising across the island. MethodsWe used modelling to explore the extent to which data on burials in Jakarta using strict COVID-19 protocols (C19P) provide additional insight into the transmissibility of the disease, epidemic trajectory, and the impact of NPIs. We assess how implementation of NPIs in early 2021 will shape the epidemic during the period of likely vaccine roll-out. ResultsC19P burial data in Jakarta suggest a death toll approximately 3.3 times higher than reported. Transmission estimates using these data suggest earlier, larger, and more sustained impact of NPIs. Measures to reduce sub-national spread, particularly during Ramadan, substantially mitigated spread to more vulnerable rural areas. Given current trajectory, daily cases and deaths are likely to increase in most regions as the vaccine is rolled-out. Transmission may peak in early 2021 in Jakarta if current levels of control are maintained. However, relaxation of control measures is likely to lead to a subsequent resurgence in the absence of an effective vaccination campaign. ConclusionSyndromic measures of mortality provide a more complete picture of COVID-19 severity upon which to base decision-making. The high potential impact of the vaccine in Java is attributable to reductions in transmission to date and dependent on these being maintained. Increases in control in the relatively short-term will likely yield large, synergistic increases in vaccine impact. Key questions - What is already known? In many settings, limited SARS-CoV-2 testing makes it difficult to estimate the true trajectory and associated burden of the virus. Non-pharmaceutical interventions (NPIs) are key tools to mitigate SARS-CoV-2 transmission. Vaccines show promise but effectiveness depends upon prioritization strategies, roll-out and uptake. What are the new findings? This study gives evidence of the value of syndrome-based mortality as a metric, which is less dependent upon testing capacity, with which to estimate transmission trends and evaluate intervention impact. NPIs implemented in Java earlier in the pandemic have substantially slowed the course of the epidemic, with movement restrictions during Ramadan preventing spread to more vulnerable rural populations. Population-level immunity remains below proposed herd-immunity thresholds for the virus, though it is likely substantially higher in Jakarta. What do the new findings imply? Given current levels of control, upwards trends in deaths are likely to continue in many provinces while the vaccine is scheduled to be rolled out. A key exception is Jakarta, where population-level immunity may increase to a level where the epidemic begins to decline before the vaccine campaign has reached high coverage. Further relaxation of measures would lead to more rapidly progressing epidemics, depleting the eventual incremental effectiveness of the vaccine. Maintaining adherence to control measures in Jakarta may be particularly challenging if the epidemic enters a decline phase but will remain necessary to prevent a subsequent large wave. Elsewhere, higher levels of control with NPIs are likely to yield high synergistic vaccine impact.
epidemiology
10.1101/2020.10.01.20204073
COVID-19 Classification of X-ray Images Using Deep Neural Networks
ObjectivesIn the midst of the coronavirus disease 2019 (COVID-19) outbreak, chest X-ray (CXR) imaging is playing an important role in diagnosis and monitoring of patients with COVID-19. Machine learning solutions have been shown to be useful for X-ray analysis and classification in a range of medical contexts. In this study, we propose a machine learning model for detection of patients tested positive for COVID-19 from CXRs that were collected from inpatients hospitalized in four different hospitals. We additionally present a tool for retrieving similar patients according to the model's results on their CXRs. MethodsIn this retrospective study, 1384 frontal CXRs of COVID-19 confirmed patients imaged between March-August 2020, and 1024 matching CXRs of non-COVID patients imaged before the pandemic, were collected and used to build a deep learning classifier for detecting patients positive for COVID-19. The classifier consists of an ensemble of pre-trained deep neural networks (DNNs), specifically ResNet34, ResNet50, ResNet152, and VGG16, and is enhanced by data augmentation and lung segmentation. We further implemented a nearest-neighbors algorithm that uses DNN-based image embeddings to retrieve the images most similar to a given image. ResultsOur model achieved accuracy of 90.3% (95% CI: 86.3%-93.7%), specificity of 90% (95% CI: 84.3%-94%), and sensitivity of 90.5% (95% CI: 85%-94%) on a test dataset comprising 15% (350/2326) of the original images. The AUC of the ROC curve is 0.96 (95% CI: 0.93-0.97). ConclusionWe provide deep learning models, trained and evaluated on CXRs, that can assist medical efforts and reduce medical staff workload in handling COVID-19. Key Points: A machine learning model was able to detect chest X-ray (CXR) images of patients tested positive for COVID-19 with accuracy and detection rate above 90%. A tool was created for finding existing CXR images with imaging characteristics most similar to a given CXR, according to the model's image embeddings.
radiology and imaging
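A minimal sketch of the retrieval idea described above: finding the most similar images via cosine similarity between DNN embeddings. The embeddings here are random placeholders; in the paper's setting they would come from the trained classifier rather than be generated at random.

import numpy as np

rng = np.random.default_rng(0)
embeddings = rng.normal(size=(2326, 512))      # placeholder image embeddings (N x d)

def most_similar(query_idx, embeddings, k=5):
    """Return indices of the k images whose embeddings are closest in cosine similarity."""
    e = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sims = e @ e[query_idx]
    order = np.argsort(-sims)
    return order[order != query_idx][:k]       # exclude the query image itself

print(most_similar(0, embeddings))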
10.1101/2020.09.30.20204990
Epidemiological Risk Factors of SARS-Cov-2 Infections
Since the first recognitions by governments of the pandemic characteristic of the SARS-CoV-2 infections, public health agencies have warned the public about the dangers of the virus to persons with a variety of underlying physical conditions, many of which are more commonly found in persons over 50 years old or in certain ethnic groups. To investigate the statistical, rather than physiological, basis of such warnings, this study examines correlations globally on a nation-by-nation basis between the statistical data concerning COVID-19 fatalities among the populations of the ninety-nine countries with the greatest number of SARS-CoV-2 infections plus the statistics of potential co-morbidities that may influence the severity of the infections. It examines reasons that may underlie the degree to which advanced age increases the risk of mortality of an infection and contrasts the risk factors of SARS-CoV-2 infections with those of influenzas and their associated pneumonias.
epidemiology
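A minimal sketch of the kind of country-level correlation analysis described above, using a small placeholder table rather than the actual 99-country dataset; the column names and values are invented for illustration.

import pandas as pd

# Placeholder country-level data (per-capita COVID-19 deaths and comorbidity prevalences)
df = pd.DataFrame({
    "deaths_per_100k":   [45.0, 12.0, 60.0, 5.0, 30.0, 22.0],
    "pct_over_65":       [18.0,  7.0, 21.0, 3.0, 16.0, 12.0],
    "diabetes_prev_pct": [ 8.0, 10.0,  7.0, 6.0,  9.0,  8.5],
    "obesity_prev_pct":  [25.0, 20.0, 28.0, 8.0, 23.0, 22.0],
})

# Pairwise Pearson correlations between mortality and candidate risk factors
print(df.corr()["deaths_per_100k"].round(2))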
10.1101/2020.10.02.20205559
A machine learning-based holistic approach for diagnoses within the Alzheimer's disease spectrum
Alzheimer's disease (AD) is a neurodegenerative condition driven by a multifactorial etiology. We employed a machine learning (ML) based algorithm and the wealth of information offered by the Alzheimer's Disease Neuroimaging Initiative (ADNI) database to investigate the relative contribution of clinically relevant factors for identifying subjects affected by Mild Cognitive Impairment (MCI), a transitional status between healthy aging and dementia. Our ML-based Random Forest (RF) algorithm did not help predict clinical outcomes and the AD conversion of MCI subjects. On the other hand, non-converting (ncMCI) subjects were correctly classified and predicted. Two neuropsychological tests, the FAQ and ADAS13, were the most relevant features used for the classification and prediction of younger, under 70, ncMCI subjects. Structural MRI data combined with systemic parameters and the cardiovascular status were instead the most critical factors for the classification of over 70 ncMCI subjects. Our results support the notion that AD is not an organ-specific condition and results from pathological processes inside and outside the Central Nervous System.
neurology
10.1101/2020.10.02.20205716
Poor metabolic health increases COVID-19-related mortality in the UK Biobank sample
Previous studies link obesity, components of metabolic health, such as hypertension or inflammation, to increased hospitalisations and death rates of patients with COVID-19. Here, in two overlapping samples of over 1,000 individuals from the UK Biobank we investigate whether metabolic health as measured by waist circumference, dyslipidaemia, hypertension, diabetes, and systemic inflammation is related to increased COVID-19 infection and mortality rates. Using logistic regression and controlling for confounding variables such as socioeconomic status, age, sex or ethnicity, we find that individuals with worse metabolic health (measured on average eleven years prior to 2020) have an increased risk for COVID-19-related death (adjusted odds ratio: 1.67). We also find that specific factors contributing to increased mortality are increased serum glucose levels, systolic blood pressure and waist circumference.
epidemiology
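A minimal sketch of the adjusted-odds-ratio analysis described above, using statsmodels logistic regression on simulated data; the variables, effect sizes, and sample size are placeholders, not UK Biobank values.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1000
poor_metabolic = rng.binomial(1, 0.3, n)
age = rng.normal(68, 8, n)
sex = rng.binomial(1, 0.45, n)

# Simulated outcome with a true positive effect of poor metabolic health
logit = -3 + 0.5 * poor_metabolic + 0.05 * (age - 68) + 0.2 * sex
death = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([poor_metabolic, age, sex]))
fit = sm.Logit(death, X).fit(disp=0)
print(f"adjusted OR for poor metabolic health: {np.exp(fit.params[1]):.2f}")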
10.1101/2020.10.02.20205948
How geographic access to care shapes disease burden: the current impact of post-exposure prophylaxis and potential for expanded access to prevent human rabies deaths in Madagascar
BackgroundPost-exposure prophylaxis (PEP) is highly effective at preventing human rabies deaths; however, access to PEP is limited in many rabies endemic countries. The 2018 decision by Gavi to add human rabies vaccine to its investment portfolio should expand PEP availability and reduce rabies deaths. We explore how geographic access to PEP impacts the rabies burden in Madagascar and the potential benefits of improved provisioning. Methodology & Principal FindingsWe use spatially resolved data on numbers of bite patients seeking PEP across Madagascar and estimates of travel times to the closest clinic providing PEP (N = 31) in a Bayesian regression framework to estimate how geographic access predicts reported bite incidence. We find that travel times strongly predict reported bite incidence across the country. Using resulting estimates in an adapted decision tree, we extrapolate rabies deaths and reporting and find that geographic access to PEP shapes burden sub-nationally. We estimate 960 human rabies deaths annually (95% Prediction Interval (PI): 790-1120), with PEP averting an additional 800 deaths (95% PI: 640-970) each year. Under these assumptions, we find that expanding PEP to one clinic per district (83 additional clinics) could reduce deaths by 19%, but even with all major primary clinics provisioning PEP (1733 additional clinics), we still expect substantial rabies mortality. Our quantitative estimates are most sensitive to assumptions of underlying rabies exposure incidence, but qualitative patterns of the impacts of travel times and expanded PEP access are robust. Conclusions & SignificancePEP is effective at preventing rabies deaths, and in the absence of strong surveillance, targeting underserved populations may be the most equitable way to provision PEP. Given the potential for countries to use Gavi funding to expand access to PEP in the coming years, this framework could be used as a first step to guide expansion and improve targeting of interventions in similar endemic settings where PEP access is geographically restricted and baseline data on rabies risk is lacking. While better PEP access should save many lives, improved outreach, surveillance, and dog vaccination will be necessary, and if rolled out with Gavi investment, could catalyze progress towards achieving zero rabies deaths. Author SummaryCanine rabies causes an estimated 60,000 deaths each year across the world, primarily in low- and middle-income countries where people have limited access to both human vaccines (post-exposure prophylaxis or PEP) and dog rabies vaccines. Given that we have the tools to prevent rabies deaths, a global target has been set to eliminate deaths due to canine rabies by 2030, and recently, Gavi, a multilateral organization that aims to improve access to vaccines in the poorest countries, added human rabies vaccine to its portfolio. In this study, we estimated reported incidence of patients seeking PEP in relation to travel times to clinics provisioning PEP and extrapolate human rabies deaths in Madagascar. We find that PEP currently averts around 800 deaths each year, but that the burden remains high (1000 deaths/year), particularly in remote, hard-to-reach areas. We show that expanding PEP availability to more clinics could significantly reduce rabies deaths in Madagascar, but our results reaffirm that expansion alone will not achieve the global goal of zero human deaths from dog-mediated rabies by 2030. Combining PEP expansion with outreach, surveillance, and mass dog vaccination programs will be necessary to move Madagascar, and other low- and middle-income countries, forward on the path to rabies elimination.
public and global health
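Not the authors' Bayesian framework, but a minimal sketch of regressing reported bite-patient counts on travel time to the nearest PEP clinic with a Poisson GLM and a population offset; all numbers below are simulated placeholders.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_districts = 40
pop = rng.integers(20_000, 300_000, n_districts)
travel_hours = rng.uniform(0.5, 12, n_districts)

# Simulate reported bites: reporting declines with travel time to the clinic
true_rate = 0.001 * np.exp(-0.15 * travel_hours)        # per-person annual reporting rate
bites = rng.poisson(true_rate * pop)

X = sm.add_constant(travel_hours)
fit = sm.GLM(bites, X, family=sm.families.Poisson(), offset=np.log(pop)).fit()
print(f"reported bite incidence changes by a factor of {np.exp(fit.params[1]):.2f} per extra hour of travel")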
10.1101/2020.10.02.20202614
Combined Metabolic Activators accelerates recovery in mild-to-moderate COVID-19
There is a need to treat COVID-19 patients suffering from respiratory problems, resulting in decreased oxygen levels and thus leading to mitochondrial dysfunction and metabolic abnormalities. Here, we investigated if a high oral dose of a mixture of Combined Metabolic Activators (CMA) can restore metabolic function and thus aid the recovery of COVID-19 patients. We conducted a placebo-controlled, open-label phase 2 study and a double-blinded phase 3 clinical trial to investigate the time to symptom-free recovery in ambulatory patients using a mixture of CMA consisting of NAD+ and glutathione precursors. The results of both studies showed that the time to complete recovery was significantly shorter in the CMA group in the phase 2 trial (6.6 vs 9.3 days) and the phase 3 trial (5.7 vs 9.2 days). A comprehensive analysis of the blood metabolome and proteome showed that the plasma levels of proteins and metabolites associated with inflammation and antioxidant metabolism are significantly improved in patients treated with the metabolic activators as compared to placebo. The results show that treating patients infected with COVID-19 with a high dose of CMAs leads to a more rapid symptom-free recovery, suggesting a role for such a therapeutic regime in the treatment of infections leading to respiratory problems.
infectious diseases
10.1101/2020.10.02.20205971
The revolution will be hard to evaluate: How co-occurring policy changes affect research on the health effects of social policies
Extensive empirical health research leverages variation in the timing and location of policy changes as quasi-experiments. Multiple social policies may be adopted simultaneously in the same locations, creating co-occurrence which must be addressed analytically for valid inferences. The pervasiveness and consequences of co-occurring policies have received limited attention. We analyzed a systematic sample of 13 social policy databases covering diverse domains including poverty, paid family leave, and tobacco. We quantified policy co-occurrence in each database as the fraction of variation in each policy measure across different jurisdictions and times that could be explained by co-variation with other policies (R2). We used simulations to estimate the ratio of the variance of effect estimates under the observed policy co-occurrence to variance if policies were independent. Policy co-occurrence ranged from very high for state-level cannabis policies to low for country-level sexual minority rights policies. For 65% of policies, greater than 90% of the place-time variation was explained by other policies. Policy co-occurrence increased the variance of effect estimates by a median of 57-fold. Co-occurring policies are common and pose a major methodological challenge to rigorously evaluating health effects of individual social policies. When uncontrolled, co-occurring policies confound one another, and when controlled, resulting positivity violations may substantially inflate the variance of estimated effects. Tools to enhance validity and precision for evaluating co-occurring policies are needed.
epidemiology
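A minimal sketch of the two quantities described above: the share of a focal policy's place-time variation explained by co-occurring policies (R^2), and the resulting inflation in the variance of its estimated effect when the other policies are controlled. The policy indicators are simulated, not drawn from the 13 databases.

import numpy as np

rng = np.random.default_rng(0)
n = 500                                     # place-time observations

# Simulate a focal policy that strongly co-occurs with two other policies
other = rng.binomial(1, 0.5, size=(n, 2))
focal = rng.binomial(1, 0.1 + 0.8 * other.mean(axis=1))

# R^2 of the focal policy regressed on co-occurring policies
X = np.column_stack([np.ones(n), other])
beta, *_ = np.linalg.lstsq(X, focal, rcond=None)
resid = focal - X @ beta
r2 = 1 - resid.var() / focal.var()

# Variance inflation for the focal policy's effect estimate when the others are controlled
vif = 1 / (1 - r2)
print(f"R^2 = {r2:.2f}, variance inflation factor = {vif:.1f}")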
10.1101/2020.10.02.20204735
Decoupling sleep and brain size in childhood: An investigation of genetic covariation in the ABCD study
BackgroundChildhood sleep problems are common and among the most frequent and impairing comorbidities of childhood psychiatric disorders. However, little is known about the genetic architecture of childhood sleep and potential etiological links between sleep, brain morphology, and pediatric-onset psychiatric symptoms. MethodsUsing data from the Adolescent Brain and Cognitive Development Study (N_phenotype = 4,428 for discovery/replication, N_genetics = 4,728, age: 9-10), we assessed phenotypic relationships, heritability, and genetic correlation between childhood sleep disturbances (SDs: insomnia, arousal, breathing, somnolence, hyperhidrosis, sleep-wake transitions), brain size (surface area [SA], cortical thickness, volume), and dimensional psychopathology. ResultsSDs showed widespread positive associations with multiple domains of childhood psychopathology; however, only insomnia showed replicable associations with smaller brain SA. Among the SDs assessed, only insomnia showed significant SNP-based heritability (h^2_SNP = 0.15, p < 0.05), and showed substantial genetic correlations with externalizing symptoms and attention-deficit hyperactivity disorder (ADHD; rG > 0.80, p < 0.05), suggesting significant pleiotropy across these complex childhood traits. We find no evidence of genetic correlation between childhood insomnia and brain size. Polygenic risk scores (PRS) calculated from genome-wide association studies (GWAS) of adult insomnia and adult brain size did not predict childhood insomnia; instead, PRS trained using ADHD GWAS predicted decreased SA at baseline, as well as insomnia and externalizing symptoms longitudinally. ConclusionsThese findings demonstrate a distinct genetic architecture underlying childhood insomnia and brain size and indicate that childhood insomnia should be considered along the dimensional axis of externalizing traits. Uncovering shared and unique genetic risk across childhood traits may inform our understanding of the developmental origins of comorbid psychiatric disorders.
psychiatry and clinical psychology
10.1101/2020.10.05.20206961
Communicating personalised risks from COVID-19: guidelines from an empirical study
As increasing amounts of data accumulate on the effects of the novel coronavirus SARS-CoV-2 and the risk factors that lead to poor outcomes, it is possible to produce personalised estimates of the risks faced by groups of people with different characteristics. The challenge of how to communicate these then becomes apparent. Based on empirical work (total n=5,520, UK) supported by in-person interviews with the public and physicians, we make recommendations on the presentation of such information. These include: using predominantly percentages when communicating the absolute risk, but also providing, for balance, a format which conveys a contrasting (higher) perception of risk (expected frequency out of 10,000); using a visual linear scale cut at an appropriate point to illustrate the maximum risk, explained through an illustrative persona who might face that highest level of risk; and providing context to the absolute risk through presenting a range of other personas illustrating people who would face risks of a wide range of different levels. These personas should have their major risk factors (age, existing health conditions) described. By contrast, giving people absolute likelihoods of other risks they face in an attempt to add context was considered less helpful.
infectious diseases
10.1101/2020.10.05.20207142
Evaluating the Effect of Prebiotics on the Gut Microbiome Profile and Beta-cell Function in Youth with Newly-Diagnosed Type 1 Diabetes: Protocol of a Randomized Controlled Trial
IntroductionData show that disturbances in the gut microbiota play a role in glucose homeostasis, type 1 diabetes (T1D) risk and progression. The prebiotic high amylose maize starch (HAMS) alters the gut microbiome profile and metabolites favorably with an increase in bacteria producing short chain fatty acids (SCFAs) that have significant anti-inflammatory effects. HAMS also improves glycemia, insulin sensitivity and secretion in healthy non-diabetic adults. Additionally, a recent study testing an acetylated and butyrylated form of HAMS (HAMS-AB) that further increases SCFA production prevented T1D in a rodent model without adverse safety effects. The overall objective of this human study will be to assess how daily HAMS-AB consumption impacts the gut microbiome profile, SCFA production, β-cell health, function and glycemia as well as immune responses in newly-diagnosed T1D youth. Methods and AnalysisWe hypothesize that HAMS-AB intake will improve the gut microbiome profile, increase SCFA production, improve β-cell health, function and glycemia as well as modulate the immune system. We describe here a pilot, randomized crossover trial of HAMS-AB in 12 newly-diagnosed T1D youth with residual β-cell function. In Aim 1, we will determine the effect of HAMS-AB on the gut microbiome profile and SCFA production; in Aim 2, we will determine the effect of HAMS-AB on β-cell health, function and glycemia; and in Aim 3, we will determine the peripheral blood effect of HAMS-AB on frequency, phenotype and function of specific T cell markers. We anticipate beneficial effects from a simple, inexpensive, and safe dietary approach. Ethics and DisseminationThe Institutional Review Board at Indiana University approved the study protocol. The findings of this trial will be submitted to a peer-reviewed pediatric journal. Abstracts will be submitted to relevant national and international conferences. Trial registration number: NCT04114357; Pre-results. Article Summary - Strengths and limitations of this study: The study design (randomized controlled trial, RCT) is the most robust methodology to assess the effectiveness of therapeutic interventions. The findings of this RCT, whether positive or negative, will contribute to the formulation of further recommendations on the use of high amylose maize starch that has been acetylated and butyrylated for improving beta-cell function in children with newly diagnosed type 1 diabetes (T1D).
endocrinology
10.1101/2020.10.05.20206938
A Bayesian approach for estimating typhoid fever incidence from large-scale facility-based passive surveillance data
Decisions about typhoid fever prevention and control are based on estimates of typhoid incidence and their uncertainty. Lack of specific clinical diagnostic criteria, poorly sensitive diagnostic tests, and scarcity of accurate and complete datasets contribute to difficulties in calculating age-specific population-level typhoid incidence. Using data from the Strategic Alliance across Africa & Asia (STRATAA) programme, we integrated demographic censuses, healthcare utilization surveys, facility-based surveillance, and serological surveillance from Malawi, Nepal, and Bangladesh to account for under-detection of cases. We developed a Bayesian approach that adjusts the count of reported blood-culture-positive cases for blood culture detection, blood culture collection, and healthcare seeking--and how these factors vary by age--while combining information from prior published studies. We validated the model using simulated data. The ratio of observed to adjusted incidence rates was 7.7 (95% credible interval (CrI): 6.0-12.4) in Malawi, 14.4 (95% CrI: 9.3-24.9) in Nepal, and 7.0 (95% CrI: 5.6-9.2) in Bangladesh. The probability of blood culture collection led to the largest adjustment in Malawi, while the probability of seeking healthcare contributed the most in Nepal and Bangladesh; adjustment factors varied by age. Adjusted incidence rates were within the seroincidence rate limits of typhoid infection. Estimates of blood-culture-confirmed typhoid fever without these adjustments results in considerable underestimation of the true incidence of typhoid fever. Our approach allows each phase of the reporting process to be synthesized to estimate the adjusted incidence of typhoid fever while correctly characterizing uncertainty, which can inform decision-making for typhoid prevention and control.
epidemiology
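A minimal sketch of the multiplier logic behind the adjustment described above: the observed blood-culture-confirmed count is scaled up by the probabilities of healthcare seeking, blood-culture collection, and blood-culture sensitivity, with uncertainty propagated by Monte Carlo. The counts and Beta distributions below are illustrative, not the STRATAA estimates, and the full model additionally stratifies by age.

import numpy as np

rng = np.random.default_rng(0)
observed_cases = 120                     # blood-culture-confirmed cases (illustrative)
person_years = 100_000

# Illustrative detection probabilities with Beta uncertainty
p_seek = rng.beta(30, 20, 10_000)        # probability of seeking healthcare
p_culture = rng.beta(25, 25, 10_000)     # probability a blood culture is collected
p_sens = rng.beta(60, 40, 10_000)        # blood-culture sensitivity

adjusted = observed_cases / (p_seek * p_culture * p_sens)
rate = adjusted / person_years * 100_000
lo, mid, hi = np.percentile(rate, [2.5, 50, 97.5])
print(f"adjusted incidence: {mid:.0f} per 100,000 (95% interval {lo:.0f}-{hi:.0f})")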
10.1101/2020.10.06.20207621
Methylome-wide association study of antidepressant use in Generation Scotland and the Netherlands Twin Register implicates the innate immune system
Antidepressants are an effective treatment for major depressive disorder (MDD), although individual response is unpredictable and highly variable. Whilst the mode of action of antidepressants is incompletely understood, many medications are associated with changes in DNA methylation in genes that are plausibly linked to their mechanisms. Studies of DNA methylation may therefore reveal the biological processes underpinning the efficacy and side effects of antidepressants. We performed a methylome-wide association study (MWAS) of self-reported antidepressant use accounting for lifestyle factors and MDD in Generation Scotland (GS:SFHS, N=6,428, EPIC array) and the Netherlands Twin Register (NTR, N=2,449, 450K array) and ran a meta-analysis of antidepressant use across these two cohorts. We found 10 CpG sites significantly associated with self-reported antidepressant use in GS:SFHS, with the top CpG located within a gene previously associated with mental health disorders, ATP6V1B2 (β = -0.055, corrected p = 0.005). Other top loci were annotated to genes including CASP10, TMBIM1, MAPKAPK3, and HEBP2, which have previously been implicated in the innate immune response. Next, using penalised regression, we trained a methylation-based score of self-reported antidepressant use in a subset of 3,799 GS:SFHS individuals that predicted antidepressant use in a second subset of GS:SFHS (N=3,360, β = 0.377, p = 3.12×10^-11, R^2 = 2.12%). In an MWAS analysis of prescribed selective serotonin reuptake inhibitors, we showed convergent findings with those based on self-report. In NTR, we did not find any CpGs significantly associated with antidepressant use. The meta-analysis identified the two CpGs of the ten above that were common to the two arrays used as being significantly associated with antidepressant use, although the effect was in the opposite direction for one of them. Antidepressants were associated with epigenetic alterations in loci previously associated with mental health disorders and the innate immune system. These changes predicted self-reported antidepressant use in a subset of GS:SFHS and identified processes that may be relevant to our mechanistic understanding of clinically relevant antidepressant drug actions and side effects.
psychiatry and clinical psychology
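A minimal sketch of the score-training step described above: penalised (L1/LASSO-style) regression of self-reported antidepressant use on CpG methylation values in a training subset, then prediction in a held-out subset. The methylation matrix, outcome, and penalty strength below are simulated placeholders rather than the Generation Scotland data or the authors' exact model.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n, n_cpg = 2000, 500
meth = rng.normal(size=(n, n_cpg))                        # simulated CpG methylation values
signal = meth[:, :10] @ rng.normal(0.3, 0.05, 10)         # 10 truly associated CpGs
use = rng.binomial(1, 1 / (1 + np.exp(-(-2 + signal))))   # simulated antidepressant use

X_tr, X_te, y_tr, y_te = train_test_split(meth, use, test_size=0.5, random_state=0)

# L1-penalised logistic regression used as the methylation-based score
model = LogisticRegression(penalty="l1", solver="liblinear", C=0.1).fit(X_tr, y_tr)
score = model.decision_function(X_te)
print(f"held-out AUC of the methylation score: {roc_auc_score(y_te, score):.2f}")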
10.1101/2020.10.05.20200949
Whole Genome Sequencing identifies novel structural variant in a large Indian family affected with X-linked agammaglobulinemia
BackgroundX-linked agammaglobulinemia (XLA, OMIM #300755) is a primary immunodeficiency disorder caused by pathogenic variations in the BTK gene, characterized by failure of development and maturation of B lymphocytes. The estimated prevalence worldwide is 1 in 190,000 male births. Recently, genome sequencing has been widely used in difficult to diagnose and familial cases. We report a large Indian family suffering from XLA with five affected individuals. MethodsWe performed complete blood count, immunoglobulin assay, and lymphocyte subset analysis for all patients and analyzed Btk expression for one patient and his mother. Whole exome sequencing (WES) was performed for four patients, and whole genome sequencing (WGS) for two patients. Carrier screening was done for 17 family members using Multiplex Ligation-dependent Probe Amplification (MLPA) and haplotype ancestry mapping using fineSTRUCTURE was performed. ResultsAll patients had hypogammaglobulinemia and low CD19+ B cells. One patient who underwent Btk estimation had low expression and his mother showed a mosaic pattern. On structural variant analysis of WGS data, we found a novel large deletion of 5,296 bp at locus chrX:100,624,323-100,629,619 encompassing exons 3-5 of the BTK gene. Family screening revealed seven carriers for the deletion. Two patients had a successful HSCT. Haplotype mapping revealed mainly South Asian ancestry. ConclusionWhole genome sequencing led to identification of the accurate genetic mutation which could help in early diagnosis leading to improved outcomes, prevention of permanent organ damage and improved quality of life, as well as enabling prenatal diagnosis.
genetic and genomic medicine
10.1101/2020.10.05.20205468
Listening Difficulties in Children with Normal Audiograms: Relation to Hearing and Cognition
ObjectivesChildren presenting at audiology services with caregiver-reported listening difficulties often have normal audiograms. The appropriate approach for the further assessment and clinical management of these children is currently unclear. In this Sensitive Indicators of Childhood Listening Difficulties (SICLiD) study we assessed listening ability using a reliable and validated caregiver questionnaire (the ECLiPS) in a large (n = 146) and heterogeneous sample of 6-13 year-old children with normal audiograms. Scores on the ECLiPS were related to a multifaceted laboratory assessment of the childrens audiological, psycho- and physiological-acoustic and cognitive abilities. This report is an overview of the SICLiD study and focuses on the childrens behavioral performance. The overall goals of SICLiD were to understand the auditory and other neural mechanisms underlying childhood listening difficulties and to translate that understanding into clinical assessment and, ultimately, intervention. DesignCross-sectional behavioral assessment of children with listening difficulties and an age-matched typically developing control group. Caregivers completed the ECLiPS and the resulting Total standardized composite score formed the basis of further descriptive statistics, univariate and multivariate modeling of experimental data. ResultsAll scores of the ECLiPS, the SCAN-3:C, a standardized clinical test suite for auditory processing, and the NIH Cognition Toolbox were significantly lower for children with listening difficulties than for their typically developing peers, using group comparisons via t-tests and Wilcoxon Rank Sum tests. A similar effect was observed on the LiSN-S test for speech sentence-in-noise intelligibility, but only reached significance for the Low Cue and High Cue conditions, and the Talker Advantage derived score. Stepwise regression to examine the factors contributing to the ECLiPS Total scaled score (pooled across groups) yielded a model that explained 42% of its variance based on the SCAN-3:C composite, LiSN-S Talker Advantage, and the NIH Toolbox Picture Vocabulary and Dimensional Change Card Sorting scores (F4,95 = 17.35, p < 0.001). High correlations were observed between many test scores including the ECLiPS, SCAN-3:C and NIH Toolbox composite measures. LiSN-S Advantage measures generally correlated weakly and non-significantly with non-LiSN-S measures. However, a significant interaction was found between extended high frequency threshold and LiSN-S Talker Advantage. ConclusionsChildren with listening difficulties but normal audiograms have problems with the cognitive processing of auditory and non-auditory stimuli that include both fluid and crystallized reasoning. Analysis of poor performance on the LiSN-S Talker Advantage measure identified subclinical hearing loss as a minor contributing factor to talker segregation. Beyond auditory tests, evaluations of children with complaints of listening difficulties should include standardized caregiver observations and consideration of broad cognitive abilities.
otolaryngology
10.1101/2020.10.06.20208132
BayesSMILES: Bayesian Segmentation Modeling for Longitudinal Epidemiological Studies
The coronavirus disease of 2019 (COVID-19) is a pandemic. To characterize its disease transmissibility, we propose a Bayesian change point detection model using daily actively infectious cases. Our model builds on a Bayesian Poisson segmented regression model that can 1) capture the epidemiological dynamics under the changing conditions caused by external or internal factors; 2) provide uncertainty estimates of both the number and locations of change points; and 3) adjust for any explanatory time-varying covariates. Our model can be used to evaluate public health interventions, identify latent events associated with spreading rates, and yield better short-term forecasts.
epidemiology
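A change point in a Poisson segmented regression can be located, in a simplified non-Bayesian analogue of the model above, by refitting two Poisson segments at every candidate split and keeping the most likely one; the simulated counts and single change point below are illustrative only.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)

# Simulated daily actively infectious counts with one change in growth rate at day 40.
t = np.arange(80)
log_mu = 3.0 + np.where(t < 40, 0.08 * t, 0.08 * 40 - 0.03 * (t - 40))
y = rng.poisson(np.exp(log_mu))

def two_segment_loglik(cp):
    """Sum of log-likelihoods of separate Poisson regressions fitted before and after a candidate change point."""
    ll = 0.0
    for seg in (t < cp, t >= cp):
        X = sm.add_constant(t[seg].astype(float))
        ll += sm.GLM(y[seg], X, family=sm.families.Poisson()).fit().llf
    return ll

best_cp = max(range(10, 70), key=two_segment_loglik)
print("Estimated change point (day):", best_cp)
```

The full model additionally quantifies uncertainty in the number and locations of change points and adjusts for covariates, which this sketch does not attempt.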
10.1101/2020.10.06.20206714
Deep transcriptome profiling of multiple myeloma with quantitative measures using the SPECTRA approach
Complex diseases, including cancer, are highly heterogeneous, and large molecular datasets are increasingly part of describing an individuals unique experience. Gene expression is particularly attractive because it captures genetic, epigenetic and environmental consequences. SPECTRA is an approach to describe variation in a transcriptome as a set of unsupervised quantitative variables. Spectra variables provide a deep dive into the transcriptome, representing both large (prominent, high-level) and small (deeper, more subtle) sources of variance. Spectra variables are ideal for modeling alongside other variables for any outcome of interest. Each spectrum can also be considered a phenotypic trait, providing new avenues for disease characterization or to explore disease risk. We applied the SPECTRA approach to multiple myeloma (MM), the second most common blood cancer. Using RNA sequencing from malignant CD138+ cells, we derived 39 spectra in 767 patients from the MMRF CoMMpass study. We included spectra in prediction models for several clinical endpoints, compared to established expression-based risk scores, and used descriptive modeling to identify associations with patient characteristics. Spectra-based risk scores added predictive value beyond established clinical risk factors and other expression-based risk scores for overall survival, progression-free survival, and time to first-line treatment failure. We identified significant associations between CD138+ spectra and tumor cytogenetics, race, gender, and age at diagnosis. The SPECTRA approach provides quantitative measures of transcriptome variation to deeply profile tumors. This framework more comprehensively represents signals in the transcriptome and offers greater flexibility to model clinical outcomes and characteristics.
epidemiology
10.1101/2020.10.05.20205963
What to do when everything happens at once: Analytic approaches to estimate the health effects of co-occurring social policies
Social policies have great potential to improve population health and reduce health disparities. Thus, increasing empirical research seeks to quantify the health effects of social policies by exploiting variation in the timing of policy changes across places. Multiple social policies are often adopted simultaneously or in close succession in the same locations, creating co-occurrence which must be handled analytically for valid inferences. Although this is a substantial methodological challenge for studies aiming to isolate social policy effects, limited prior work has systematically considered analytic solutions within a causal framework or assessed whether these solutions are being adopted. We designated seven analytic solutions to policy co-occurrence, including efforts to disentangle individual policy effects and efforts to estimate the combined effects of co-occurring policies. We leveraged an existing systematic review of social policies and health to evaluate how often policy co-occurrence is identified as a threat to validity and how often each analytic solution is applied in practice. Of the 55 studies, only 17 (31%) reported checking for any co-occurring policies, although 36 (67%) used at least one approach that helps address policy co-occurrence. The most common approaches were: adjusting for measures of co-occurring policies; defining the outcome on subpopulations likely to be affected by the policy of interest (but not other co-occurring policies); and selecting a less-correlated measure of policy exposure. As health research increasingly focuses on policy changes, we must systematically assess policy co-occurrence and apply analytic solutions to strengthen future studies on the health effects of social policies.
epidemiology
10.1101/2020.10.07.20208587
Remote home monitoring (virtual wards) during the COVID-19 pandemic: a systematic review
ObjectivesThe aim of this review was to analyse the implementation and impact of remote home monitoring models (virtual wards) during COVID-19, identifying their main components, processes of implementation, target patient populations, impact on outcomes, costs and lessons learnt. DesignA rapid systematic review to capture an evolving evidence base. We used the Preferred Reporting Items for Systematic Reviews and Meta-Analysis (PRISMA) statement. SettingThe review included models led by primary and secondary care across seven countries. Participants27 articles were included in the review. Main outcome measuresImpact of remote home monitoring on virtual length of stay, escalation, emergency department attendance/reattendance, admission/readmission and mortality. ResultsThe aim of the models was to keep patients safe in the right setting. Most models were led by secondary care and confirmation of COVID-19 was not required (in most cases). Monitoring was carried out via online platforms, paper-based systems with telephone calls or (less frequently) through wearable sensors. Models based on phone calls were considered more inclusive. Patient/carer training was identified as a determining factor of success. We could not reach substantive conclusions regarding patient safety and the identification of early deterioration due to lack of standardised reporting and missing data. Economic analysis was not reported for most of the models and did not go beyond reporting resources used and the amount spent per patient monitored. ConclusionsFuture research should focus on staff and patient experiences of care and inequalities in patients access to care. Attention needs to be paid to the cost-effectiveness of the models and their sustainability, evaluation of their impact on patient outcomes by using comparators, and the use of risk-stratification tools. Protocol registrationThe review protocol was published on PROSPERO (CRD: 42020202888). Research in context: Evidence before this study - Remote home monitoring models for other conditions have been studied, but their adaptation to monitor COVID-19 patients and the analysis of their implementation constitute gaps in research. Added value of this study - The review covers a wide range of remote home monitoring models (pre-hospital as well as step-down wards) implemented in primary and secondary care sectors in eight countries and focuses on their implementation and impact on outcomes (including costs). Implications of all the available evidence - The review provides a rapid overview of an emerging evidence base that can be used to inform changes in policy and practice regarding the home monitoring of patients during COVID-19. Attention needs to be paid to the cost-effectiveness of the models and their sustainability, evaluation of their impact on patient outcomes by using comparators, and the use of risk-stratification tools.
health systems and quality improvement
10.1101/2020.10.07.20208264
Detecting and isolating false negatives of SARS-CoV-2 primers and probe sets among the Japanese Population: A laboratory testing methodology and study
ObjectivesIn this study, a comparative study between primers from Japans and USs disease control centers was conducted. As a further investigation, virus sequence alignment with the primer oligonucleotides was analyzed. Design or methods11,652 samples from the Japanese population were tested for SARS-CoV-2 positivity using recommended RT-PCR primer-probe sets from Japan National Institute of Infectious Disease (NIID) and US Centers for Disease Control and Prevention (CDC). ResultsOf the 102 positive samples, 17 samples (16.7% of total positives) showed inconsistent results when tested simultaneously for the following primers: JPN-N2, JPN-N1, CDC-N1, and CDC-N2. As a result, CDC recommended primer-probe sets showed relatively higher sensitivity and accuracy. Further virus sequence alignment analysis showed evidence of virus mutations at primer binding sites. ConclusionsThe inconsistency in the RT-PCR results for JPN-N1, JPN-N2, CDC-N1, and CDC-N2 primer-probe sets could be attributed to virus mutations at primer binding sites, as observed in the sequence analysis. The use of the JPN-N2 primer combined with the CDC-N2 primer produces the most effective result to reduce false negatives in the Japan region. In addition, adding CDC-N1 will also help to detect false negatives.
infectious diseases
10.1101/2020.10.07.20166769
Hepatic resection versus transarterial chemoembolisation for intermediate-stage hepatocellular carcinoma;a predicted mortality risk-based decision analysis
BackgroundThe selection criterion for liver resection (LR) in intermediate-stage (IM) hepatocellular carcinoma (HCC) is still controversial. This study aims to compare LR and transarterial chemoembolization in the range of predicted death risk. MethodsThe multivariable Cox regression model (MVR) was estimated to predict mortality at 5yr. The cut-off values were determined by a two-piece-wise linear regression model, decision curve analysis with the MVR model, and the hazard ratio curve for treatment plotted against the predicted mortality. Results825 IM-HCC with hepatitis B cirrhosis were included for analysis (TACE, n=622; LR, n=203). The 5-yr overall survival rate of LR patients was higher than the TACE group (52.8% vs. 20.8%; P<0.0001). The fitted lines for LR and TACE crossed at a predicted death risk of 100% (P for interaction=0.008). The benefit of LR versus TACE decreased progressively as predicted death risk>0.55 (95%CI: 0.45,0.62). When the predicted death risk was over 0.7, decision curve analysis suggested that LR and TACE did not increase net benefit. Patients were then divided into four subgroups by the cut-off values (<0.45, ≥0.45/<0.62, ≥0.62/<0.7, ≥0.7). In the stratified analysis of treatment in different subgroups, hazard ratios were 0.39 (95%CI: 0.27, 0.56), 0.36 (95%CI: 0.23, 0.56), 0.51 (95%CI: 0.27, 0.98), and 0.46 (95%CI: 0.27, 0.80), respectively. ConclusionsLR reached the maximal relative utility in the interval of 0.45 to 0.62, and neither LR nor TACE increased net benefit at a 5-yr death risk over 0.7.
gastroenterology
10.1101/2020.10.07.20166769
Decision curve analysis to identify optimal candidates of liver resection for intermediate-stage hepatocellular carcinoma with hepatitis B cirrhosis:A cohort study
BackgroundThe selection criterion for liver resection (LR) in intermediate-stage (IM) hepatocellular carcinoma (HCC) is still controversial. This study aims to compare LR and transarterial chemoembolization in the range of predicted death risk. MethodsThe multivariable Cox regression model (MVR) was estimated to predict mortality at 5yr. The cut-off values were determined by a two-piece-wise linear regression model, decision curve analysis with the MVR model, and the hazard ratio curve for treatment plotted against the predicted mortality. Results825 IM-HCC with hepatitis B cirrhosis were included for analysis (TACE, n=622; LR, n=203). The 5-yr overall survival rate of LR patients was higher than the TACE group (52.8% vs. 20.8%; P<0.0001). The fitted lines for LR and TACE crossed at a predicted death risk of 100% (P for interaction=0.008). The benefit of LR versus TACE decreased progressively as predicted death risk>0.55 (95%CI: 0.45,0.62). When the predicted death risk was over 0.7, decision curve analysis suggested that LR and TACE did not increase net benefit. Patients were then divided into four subgroups by the cut-off values (<0.45, ≥0.45/<0.62, ≥0.62/<0.7, ≥0.7). In the stratified analysis of treatment in different subgroups, hazard ratios were 0.39 (95%CI: 0.27, 0.56), 0.36 (95%CI: 0.23, 0.56), 0.51 (95%CI: 0.27, 0.98), and 0.46 (95%CI: 0.27, 0.80), respectively. ConclusionsLR reached the maximal relative utility in the interval of 0.45 to 0.62, and neither LR nor TACE increased net benefit at a 5-yr death risk over 0.7.
gastroenterology
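Decision curve analysis, used in the study above to compare LR and TACE, scores a treatment strategy by its net benefit at a chosen risk threshold. A small self-contained sketch of the standard net-benefit calculation on simulated predictions (the risk distribution is illustrative, not the study data):

```python
import numpy as np

def net_benefit(pred_risk, event, threshold):
    """Net benefit = TP/N - FP/N * pt/(1 - pt) for treating everyone whose predicted risk >= pt."""
    treat = pred_risk >= threshold
    n = len(event)
    tp = np.sum(treat & (event == 1))
    fp = np.sum(treat & (event == 0))
    return tp / n - fp / n * threshold / (1 - threshold)

rng = np.random.default_rng(2)
risk = rng.uniform(0, 1, 500)              # predicted 5-yr death risk (illustrative)
event = rng.binomial(1, risk)              # observed death, generated to be consistent with the risk

for pt in (0.45, 0.62, 0.70):
    print(f"threshold {pt:.2f}: net benefit = {net_benefit(risk, event, pt):.3f}")
```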
10.1101/2020.10.07.20208017
Hearing loss is associated with gray matter differences in older adults at risk for and with Alzheimer's disease
Using data from the COMPASS-ND study we investigated associations between hearing loss and hippocampal volume as well as cortical thickness in older adults with subjective cognitive decline (SCD), mild cognitive impairment (MCI), and Alzheimers dementia (AD). SCD participants with greater pure-tone HL exhibited lower hippocampal volume, but more cortical thickness in the left superior temporal gyrus and right pars opercularis. Greater speech-in-noise reception thresholds were associated with lower cortical thickness bilaterally across much of the cortex in AD. The AD group also showed a trend towards worse speech-in-noise thresholds compared to the SCD group. Highlights: In SCD, greater pure-tone hearing loss was associated with lower right hippocampal volume. Pure-tone hearing loss was not associated with brain atrophy in MCI or AD. Individuals with AD exhibited a trend towards poorer speech-in-noise (SiN) thresholds than SCD. In AD, greater atrophy across large portions of the cortex was associated with greater SiN thresholds.
neurology
10.1101/2020.10.07.20208280
A high-throughput microfluidic nano-immunoassay for detecting anti-SARS-CoV-2 antibodies in serum or ultra-low volume dried blood samples
Novel technologies are needed to facilitate large-scale detection and quantification of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) specific antibodies in human blood samples. Such technologies are essential to support seroprevalence studies, vaccine clinical trials, and to monitor quality and duration of immunity. We developed a microfluidic nano-immunoassay for the detection of anti-SARS-CoV-2 IgG antibodies in 1024 samples per device. The method achieved a specificity of 100% and a sensitivity of 98% based on the analysis of 289 human serum samples. To eliminate the need for venipuncture, we developed low-cost, ultra-low volume whole blood sampling methods based on two commercial devices and repurposed a blood glucose test strip. The glucose test strip permits the collection, shipment, and analysis of 0.6 µL whole blood easily obtainable from a simple fingerprick. The nano-immunoassay platform achieves high-throughput, high sensitivity and specificity, negligible reagent consumption, and a decentralized and simple approach to blood sample collection. We expect this technology to be immediately applicable to current and future SARS-CoV-2 related serological studies and to protein biomarker diagnostics in general.
infectious diseases
10.1101/2020.10.08.20209593
COVID-19 Susceptibility and Severity Risks in a Survey of Over 500,000 Individuals
BackgroundThe enormous toll of the COVID-19 pandemic has heightened the urgency of collecting and analyzing population-scale datasets in real time to monitor and better understand the evolving pandemic. MethodsThe AncestryDNA COVID-19 Study collected self-reported survey data on symptoms, outcomes, risk factors, and exposures for over 563,000 adult individuals in the U.S. in just under four months, including over 4,700 COVID-19 cases as measured by a self-reported positive test. ResultsWe replicated previously reported associations between several risk factors and COVID-19 susceptibility and severity outcomes, and additionally found that differences in known exposures accounted for many of the susceptibility associations. A notable exception was elevated susceptibility for males even after adjusting for known exposures and age (adjusted odds ratio [aOR]=1.36, 95% confidence interval [CI] = (1.19, 1.55)). We also demonstrated that self-reported data can be used to build accurate risk models to predict individualized COVID-19 susceptibility (area under the curve [AUC]=0.84) and severity outcomes including hospitalization and critical illness (AUC=0.87 and 0.90, respectively). The risk models achieved robust discriminative performance across different age, sex, and genetic ancestry groups within the study. ConclusionThe results highlight the value of self-reported epidemiological data to rapidly provide public health insights into the evolving COVID-19 pandemic. What is already known on this subject: The COVID-19 pandemic has exacted a historic toll on human lives, healthcare systems and global economies, with over 83 million cases and over 1.8 million deaths worldwide as of January 2021. COVID-19 risk factors for susceptibility and severity have been extensively investigated by clinical and public health researchers. Several groups have developed risk models to predict COVID-19 illness outcomes based on known risk factors. What this study adds: We performed association analyses for COVID-19 susceptibility and severity in a large, at-home survey and replicated much of the previous clinical literature. Associations were further adjusted for known COVID-19 exposures, and we observed elevated positive test odds for males even after adjustment for these known exposures. We developed risk models and evaluated them across different age, sex, and genetic ancestry cohorts, and showed robust performance across all cohorts in a holdout dataset. Our results establish large-scale, self-reported surveys as a potential framework for investigating and monitoring rapidly evolving pandemics.
health informatics
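Risk models of the kind described above can be sketched as a logistic regression on self-reported predictors, evaluated by AUC in a holdout set. The variables, effect sizes, and sample below are simulated stand-ins, not the AncestryDNA data.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)

# Simulated self-reported survey data (columns are illustrative, not the study variables).
n = 20000
df = pd.DataFrame({
    "age": rng.integers(18, 90, n),
    "male": rng.integers(0, 2, n),
    "known_exposure": rng.binomial(1, 0.10, n),
    "household_case": rng.binomial(1, 0.05, n),
})
logit = -4 + 0.02 * df.age + 0.3 * df.male + 2.0 * df.known_exposure + 2.5 * df.household_case
df["positive_test"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X_train, X_test, y_train, y_test = train_test_split(
    df.drop(columns="positive_test"), df.positive_test, test_size=0.3, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print("Susceptibility model AUC in holdout set:", round(auc, 2))
print("Adjusted OR for male sex:",
      round(float(np.exp(model.coef_[0][list(X_train.columns).index("male")])), 2))
```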
10.1101/2020.10.08.20209288
The Statistical Monitoring by Adaptive RMSTD Tests: an efficient, informative, and customizable method for the complete internal quality control intended for low-frequent sampling of control measures
Two control mechanisms are relevant to perform an internal quality assurance: a permissible limit LSMC applied to single measures of control samples and a retrospective statistical analysis to detect increased imprecision and baseline drifts. A common statistical metric is the root mean square (total) deviation (RMSD/RMSTD). To focus on recent changes under low-frequent sampling conditions, the monitored amount of retrospective data is usually very small. Unfortunately, the calculated RMSTD of a small data set with n<50 samples has a significant statistical uncertainty that needs to be considered in adequate limit definitions. In particular, the minimum reasonable limit LRMSTD(n), applied to the RMSTD of a series of n samples, decreases from LSMC (e.g., 2.33*standard_deviation+bias) for n=1 towards Ltrue_RMSTD for n→∞ (long-term statistics). Two mathematical approaches were derived to reliably estimate an optimal function to adjust LRMSTD(n) to small sample sizes. This knowledge led to the development of a new quality-control method: the Statistical Monitoring by Adaptive RMSTD Tests (SMART). SMART requires just one mandatory limit (either LSMC or Ltrue_RMSTD) per analyte. By definition of up to 7 possible alert levels, SMART can recognize and evaluate, at an early stage, both the significance of a single outlier and the establishment of critical trends or shifts in recent SMC data. SMART is intended to efficiently monitor and evaluate small amounts of control data.
health systems and quality improvement
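The core quantity above is the RMSTD of the last n control measures, compared against a limit that shrinks from LSMC at n=1 towards the long-run limit as n grows. The sketch below computes the RMSTD and uses a placeholder interpolation for the size-adjusted limit; the actual SMART limit function is derived in the paper and is not reproduced here.

```python
import numpy as np

def rmstd(values, target):
    """Root mean square (total) deviation of control measures from the target value."""
    d = np.asarray(values) - target
    return np.sqrt(np.mean(d ** 2))

# Illustrative single-measure limit, as in the abstract: 2.33 * SD + bias.
sd, bias, target = 1.0, 0.2, 10.0
L_smc = 2.33 * sd + abs(bias)
L_true = np.sqrt(sd ** 2 + bias ** 2)          # long-run RMSTD as n -> infinity

def limit_rmstd(n, alpha=0.5):
    """Placeholder interpolation between L_SMC (n=1) and L_true (large n); NOT the SMART formula."""
    w = 1.0 / n ** alpha
    return w * L_smc + (1 - w) * L_true

rng = np.random.default_rng(4)
recent = rng.normal(target + bias, sd, size=12)  # last 12 control measurements
print("RMSTD(12) =", round(rmstd(recent, target), 2), "| limit =", round(limit_rmstd(12), 2))
```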
10.1101/2020.10.09.20209858
COVID-19 serological survey using micro blood sampling
During August 2020, we carried out a serological survey among students and employees at the Okinawa Institute of Science and Technology Graduate University (OIST), Japan, testing for the presence of antibodies against SARS-CoV-2, the causative agent of COVID-19. We used an FDA-authorized 2-step ELISA protocol (1, 2) in combination with at-home self-collection of blood samples using a custom low-cost finger prick-based capillary blood collection kit. Although our survey did not find any COVID-19 seropositive individuals among the OIST cohort, it reliably detected all positive control samples obtained from a local hospital and excluded all negative controls. We found that high serum antibody titers can persist for at least 6.5 months post infection. Among our controls, we found strong cross-reactivity of antibodies in samples from a serum pool from two MERS patients in the anti-SARS-CoV-2-S ELISA. Here we show that a centralized ELISA in combination with patient-based capillary blood collection using as little as one drop of blood can reliably assess the seroprevalence among communities. Anonymous sample tracking and an integrated website created a streamlined procedure. Major parts of the workflow were automated on a liquid handler, demonstrating scalability. We anticipate this concept to serve as a prototype for reliable serological testing among larger populations.
public and global health
10.1101/2020.10.09.20209999
COVID-19 Disease Severity and Determinants among Ethiopian Patients: A study of the Millennium COVID-19 Care Center
BackgroundThe COVID-19 pandemic started a little later in Ethiopia than Europe and most of the initial cases were reported to have a milder disease course and a favorable outcome. This changed as the disease spread into the population and the more vulnerable began to develop severe disease. Understanding the risk factors for severe disease in Ethiopia was needed to provide optimal health care services in a resource limited setting. ObjectiveThe study assessed COVID-19 patients admitted to Millennium COVID-19 Care Center in Ethiopia for characteristics associated with COVID-19 disease severity. MethodsA cross-sectional study was conducted from June to August 2020 among 686 randomly selected patients. Chi-square test was used to detect the presence of a statistically significant difference in the characteristics of the patients based on disease severity (Mild vs Moderate vs Severe). A multinomial logistic regression model was used to identify risk factors of COVID-19 disease severity where Adjusted Odds ratio (AOR), 95% CIs for AOR and P-values were used for significance testing. ResultsHaving moderate as compared with mild disease was significantly associated with having hypertension (AOR=2.30, 95%CI=1.27,4.18, p-value=0.006), diabetes mellitus (AOR=2.61, 95%CI=1.31,5.19, p-value=0.007), fever (AOR=6.12, 95%CI=2.94,12.72, p-value=0.0001) and headache (AOR=2.69, 95%CI=1.39,5.22, p-value=0.003). Similarly, having severe disease as compared with mild disease was associated with age group (AOR=4.43, 95%CI=2.49,7.85, p-value=0.0001 for 40-59 years and AOR=18.07, 95%CI=9.29,35.14, p-value=0.0001 for ≥60 years), sex (AOR=1.84, 95%CI=1.12,3.03, p-value=0.016), hypertension (AOR=1.97, 95%CI=1.08,3.59, p-value=0.028), diabetes mellitus (AOR=3.93, 95%CI=1.96,7.85, p-value=0.0001), fever (AOR=13.22, 95%CI=6.11, 28.60, p-value=0.0001) and headache (AOR=4.82, 95%CI=2.32, 9.98, p-value=0.0001). In addition, severe disease as compared with moderate disease was found to be significantly associated with age group (AOR=4.87, 95%CI=2.85, 8.32, p-value=0.0001 for 40-59 years and AOR=18.91, 95%CI=9.84,36.33, p-value=0.0001 for ≥60 years), fever (AOR=2.16, 95%CI=1.29,3.63, p-value=0.004) and headache (AOR=1.79, 95%CI=1.03, 3.11, p-value=0.039). ConclusionsRisk factors associated with severe COVID-19 in Ethiopia are age greater than 60 years, male sex, a diagnosis of hypertension or diabetes mellitus, and the presence of fever and headache. This is consistent with severity indicators identified by WHO and suggests the initial finding of milder disease in Ethiopia may have been because the first people to get COVID-19 in Ethiopia were less than 60 years of age with fewer health problems.
infectious diseases
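Adjusted odds ratios from a multinomial logistic regression with mild disease as the reference category can be obtained by exponentiating the fitted coefficients. A minimal sketch with statsmodels on simulated patient data (variable names and effects are illustrative, not the study data):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(5)

# Simulated patient-level data with a few of the risk factors named in the abstract.
n = 686
df = pd.DataFrame({
    "age_60_plus": rng.binomial(1, 0.20, n),
    "male": rng.binomial(1, 0.50, n),
    "hypertension": rng.binomial(1, 0.25, n),
    "diabetes": rng.binomial(1, 0.15, n),
    "fever": rng.binomial(1, 0.40, n),
})
# Severity coded 0 = mild (reference), 1 = moderate, 2 = severe.
lin_sev = -1.5 + 1.5 * df.age_60_plus + 0.6 * df.male + 0.7 * df.diabetes + 1.2 * df.fever
severity = rng.binomial(1, 1 / (1 + np.exp(-lin_sev))) + rng.binomial(1, 0.3, n)
severity = np.clip(severity, 0, 2)

X = sm.add_constant(df)
fit = sm.MNLogit(severity, X).fit(disp=False)
aor = np.exp(fit.params)                       # adjusted odds ratios vs. the mild category
aor.columns = ["moderate vs mild", "severe vs mild"]
print(aor.round(2))
```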
10.1101/2020.10.08.20209619
Associations between governor political affiliation and COVID-19 cases, deaths, and testing in the United States
IntroductionThe response to the COVID-19 pandemic became increasingly politicized in the United States (US) and political affiliation of state leaders may contribute to policies affecting the spread of the disease. This study examined differences in COVID-19 infection, death, and testing by governor party affiliation across 50 US states and the District of Columbia. MethodsA longitudinal analysis was conducted in December 2020 examining COVID-19 incidence, death, testing, and test positivity rates from March 15 through December 15, 2020. A Bayesian negative binomial model was fit to estimate daily risk ratios (RRs) and posterior intervals (PIs) comparing rates by gubernatorial party affiliation. The analyses adjusted for state population density, rurality, census region, age, race, ethnicity, poverty, number of physicians, obesity, cardiovascular disease, asthma, smoking, and presidential voting in 2020. ResultsFrom March to early June, Republican-led states had lower COVID-19 incidence rates compared to Democratic-led states. On June 3, the association reversed, and Republican-led states had higher incidence (RR=1.10, 95% PI=1.01, 1.18). This trend persisted through early December. For death rates, Republican-led states had lower rates early in the pandemic, but higher rates from July 4 (RR=1.18, 95% PI=1.02, 1.31) through mid-December. Republican-led states had higher test positivity rates starting on May 30 (RR=1.70, 95% PI=1.66, 1.73) and lower testing rates by September 30 (RR=0.95, 95% PI=0.90, 0.98). ConclusionGubernatorial party affiliation may drive policy decisions that impact COVID-19 infections and deaths across the US. Future policy decisions should be guided by public health considerations rather than political ideology.
epidemiology
10.1101/2020.10.09.20209833
Investigating the relationship between IGF-I, -II and IGFBP-3 concentrations and later-life cognition and brain volume
BackgroundThe insulin/insulin-like signalling (IIS) pathways, including Insulin-like Growth Factors (IGFs), vary with age. However, their association with late-life cognition and neuroimaging parameters is not well characterised. MethodsUsing data from the British 1946 birth cohort we investigated associations of IGF-I, -II and IGFBP-3 (measured at 53 and 60-64 years) with cognitive performance (word learning test (WLT) and visual letter search (VLS) - at 60-64y and 69y) and cognitive state (Addenbrookes Cognitive Exam-III (ACE-III) - at 69-71y), and in a proportion, quantified neuroimaging measures (whole brain volume (WBV); white matter hyperintensity volume (WMHV); hippocampal volume (HV)). Regression models included adjustments for demographic, lifestyle and health factors. ResultsHigher IGF-I and IGF-II at 53y was associated with higher ACE-III scores (β 0.07 95%CI [0.02,0.12]; scoreACE-III 89.48 [88.86,90.1], respectively). IGF-II at age 53y was additionally associated with higher WLT scores (scoreWLT 20 [19.35,20.65]). IGFBP-3 at 60-64y was associated with favourable VLS score at 60-64y and 69y (β 0.07 [0.01,0.12]; β 0.07 [0.02,0.12], respectively), higher memory and cognitive state at 69y (β 0.07 [0.01,0.12]; β 0.07 [0.01,0.13], respectively) and reduced WMHV (β -0.1, [-0.21,-0.00]). IGF-I/IGFBP-3 at 60-64y was associated with lower VLS scores at 69y (β -0.08, [-0.15,-0.02]). ConclusionsIncreased measures of IIS parameters (IGF-I, -II and IGFBP-3) relate to better cognitive state in later life. There were apparent associations with specific cognitive domains (IGF-II relating to memory; IGFBP-3 to memory, processing speed and WMHV; and IGF-I/IGFBP-3 molar ratio with slower processing speed). IGFs and IGFBP-3 are associated with favourable cognitive function outcomes.
epidemiology
10.1101/2020.10.09.20207464
SARS-CoV-2 infects brain astrocytes of COVID-19 patients and impairs neuronal viability
Although increasing evidence confirms neuropsychiatric manifestations associated mainly with severe COVID-19 infection, the long-term neuropsychiatric dysfunction has been frequently observed after mild infection. Here we show the spectrum of the cerebral impact of SARS-CoV-2 infection ranging from long-term alterations in mildly infected individuals (orbitofrontal cortical atrophy, neurocognitive impairment, excessive fatigue and anxiety symptoms) to severe acute damage confirmed in brain tissue samples extracted from the orbitofrontal region (via endonasal trans-ethmoidal approach) from individuals who died of COVID-19. We used surface-based analyses of 3T MRI and identified orbitofrontal cortical atrophy in a group of 81 mildly infected patients (77% referred anosmia or dysgeusia during acute stage) compared to 145 healthy volunteers; this atrophy correlated with symptoms of anxiety and cognitive dysfunction. In an independent cohort of 26 individuals who died of COVID-19, we used histopathological signs of brain damage as a guide for possible SARS-CoV-2 brain infection, and found that among the 5 individuals who exhibited those signs, all of them had genetic material of the virus in the brain. Brain tissue samples from these 5 patients also exhibited foci of SARS-CoV-2 infection and replication, particularly in astrocytes. Supporting the hypothesis of astrocyte infection, neural stem cell-derived human astrocytes in vitro are susceptible to SARS-CoV-2 infection through a non-canonical mechanism that involves spike-NRP1 interaction. SARS-CoV-2-infected astrocytes manifested changes in energy metabolism and in key proteins and metabolites used to fuel neurons, as well as in the biogenesis of neurotransmitters. Moreover, human astrocyte infection elicits a secretory phenotype that reduces neuronal viability. Our data support the model in which SARS-CoV-2 reaches the brain, infects astrocytes and consequently leads to neuronal death or dysfunction. These deregulated processes are also likely to contribute to the structural and functional alterations seen in the brains of COVID-19 patients.
neurology
10.1101/2020.10.09.20207464
Morphological, cellular and molecular basis of brain infection in COVID-19 patients
Although increasing evidence confirms neuropsychiatric manifestations associated mainly with severe COVID-19 infection, the long-term neuropsychiatric dysfunction has been frequently observed after mild infection. Here we show the spectrum of the cerebral impact of SARS-CoV-2 infection ranging from long-term alterations in mildly infected individuals (orbitofrontal cortical atrophy, neurocognitive impairment, excessive fatigue and anxiety symptoms) to severe acute damage confirmed in brain tissue samples extracted from the orbitofrontal region (via endonasal trans-ethmoidal approach) from individuals who died of COVID-19. We used surface-based analyses of 3T MRI and identified orbitofrontal cortical atrophy in a group of 81 mildly infected patients (77% referred anosmia or dysgeusia during acute stage) compared to 145 healthy volunteers; this atrophy correlated with symptoms of anxiety and cognitive dysfunction. In an independent cohort of 26 individuals who died of COVID-19, we used histopathological signs of brain damage as a guide for possible SARS-CoV-2 brain infection, and found that among the 5 individuals who exhibited those signs, all of them had genetic material of the virus in the brain. Brain tissue samples from these 5 patients also exhibited foci of SARS-CoV-2 infection and replication, particularly in astrocytes. Supporting the hypothesis of astrocyte infection, neural stem cell-derived human astrocytes in vitro are susceptible to SARS-CoV-2 infection through a non-canonical mechanism that involves spike-NRP1 interaction. SARS-CoV-2-infected astrocytes manifested changes in energy metabolism and in key proteins and metabolites used to fuel neurons, as well as in the biogenesis of neurotransmitters. Moreover, human astrocyte infection elicits a secretory phenotype that reduces neuronal viability. Our data support the model in which SARS-CoV-2 reaches the brain, infects astrocytes and consequently leads to neuronal death or dysfunction. These deregulated processes are also likely to contribute to the structural and functional alterations seen in the brains of COVID-19 patients.
neurology
10.1101/2020.10.09.20210252
Multi-type branching and graph product theory of infectious disease outbreaks
The heterogeneity of human populations is a challenge to mathematical descriptions of epidemic outbreaks. Numerical simulations are deployed to account for the many factors influencing the spreading dynamics. Yet, the results from numerical simulations are often as complicated as the reality, leaving us with a sense of confusion about how the different factors account for the simulation results. Here, using a multi-type branching together with a graph tensor product approach, I derive a single equation for the effective reproductive number of an infectious disease outbreak. Using this equation I deconvolute the impact of crowd management, targeted testing, contact heterogeneity, stratified vaccination, mask use and smartphone tracing app use. This equation can be used to gain a basic understanding of infectious disease outbreaks and their simulations.
infectious diseases
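For a multi-type branching process, the effective reproduction number is the dominant eigenvalue of the mean next-generation matrix. A small numerical sketch for two contact groups, with illustrative matrix entries, showing how an intervention targeted at one group changes R_eff:

```python
import numpy as np

# Next-generation matrix K: K[i, j] = mean number of new type-i cases generated by one type-j case.
# Two illustrative groups: high-contact (many daily interactions) and low-contact.
K = np.array([
    [1.8, 0.6],   # infections produced in the high-contact group
    [0.9, 0.4],   # infections produced in the low-contact group
])

R_eff = np.max(np.abs(np.linalg.eigvals(K)))
print("Effective reproduction number:", round(float(R_eff), 2))

# An intervention that halves onward transmission from the high-contact group
# (e.g., targeted testing and isolation) scales the corresponding column of K.
K_intervention = K.copy()
K_intervention[:, 0] *= 0.5
R_new = np.max(np.abs(np.linalg.eigvals(K_intervention)))
print("R_eff after targeted intervention:", round(float(R_new), 2))
```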
10.1101/2020.10.08.20209122
Duration of Oxygen Requirement and Predictors in Severe COVID-19 Patients in Ethiopia: A Survival Analysis
BackgroundWith the rising number of new cases of COVID-19, understanding the oxygen requirement of severe patients assists in identifying at risk groups and in making an informed decision on building hospitals capacity in terms of oxygen facility arrangement. Therefore, the study aimed to estimate time to getting off supplemental oxygen therapy and identify predictors among COVID-19 patients admitted to Millennium COVID-19 Care Center in Ethiopia. MethodsA prospective observational study was conducted among 244 consecutively admitted COVID-19 patients from July to September, 2020. Kaplan Meier plots, median survival times and Log-rank test were used to describe the data and compare survival distribution between groups. Cox proportional hazard survival model was used to identify determinants of time to getting off supplemental oxygen therapy, where hazard ratio (HR), P-value and 95%CI for HR were used for testing significance and interpretation of results. ResultsMedian time to getting off supplemental oxygen therapy among the studied population was 6 days (IQR, 4.3-20.0). Factors that affect time to getting off supplemental oxygen therapy were age group (AHR=0.52, 95%CI=0.32,0.84, p-value=0.008 for ≥70 years) and shortness of breath (AHR=0.71, 95%CI=0.52,0.96, p-value=0.026). ConclusionsThe median duration of supplemental oxygen therapy requirement among COVID-19 patients was 6 days, and being 70 years or older and having shortness of breath were found to be associated with prolonged duration of supplemental oxygen therapy requirement. This result can be used as a guide in planning institutional resource allocation and patient management to provide well-equipped care to prevent complications and death from the disease.
infectious diseases
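A minimal sketch of the survival workflow described above (Kaplan-Meier median time and a Cox proportional hazards model) using the lifelines package on simulated data; the covariates and effect sizes are illustrative, not the study data. Here the "event" is getting off oxygen, so hazard ratios below 1 correspond to a longer oxygen requirement.

```python
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

rng = np.random.default_rng(6)

# Simulated data: days on supplemental oxygen, whether weaning was observed, and two covariates.
n = 244
age_70_plus = rng.binomial(1, 0.2, n)
dyspnea = rng.binomial(1, 0.5, n)
days = rng.exponential(6, n) * np.exp(0.6 * age_70_plus + 0.3 * dyspnea)  # longer with risk factors
observed = rng.binomial(1, 0.9, n)                                        # 1 = weaned off, 0 = censored

df = pd.DataFrame({"days": days, "event": observed,
                   "age_70_plus": age_70_plus, "dyspnea": dyspnea})

km = KaplanMeierFitter().fit(df["days"], event_observed=df["event"])
print("Median time to getting off oxygen:", round(km.median_survival_time_, 1), "days")

cph = CoxPHFitter().fit(df, duration_col="days", event_col="event")
print(cph.summary[["exp(coef)", "p"]])        # hazard ratios (AHR) for the covariates
```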
10.1101/2020.10.09.20209429
The role of masks, testing and contact tracing in preventing COVID-19 resurgences: a case study from New South Wales, Australia
ObjectivesThe early stages of the COVID-19 pandemic illustrated that SARS-CoV-2, the virus that causes the disease, has the potential to spread exponentially. Therefore, as long as a substantial proportion of the population remains susceptible to infection, the potential for new epidemic waves persists even in settings with low numbers of active COVID-19 infections, unless sufficient countermeasures are in place. We aim to quantify vulnerability to resurgences in COVID-19 transmission under variations in the levels of testing, tracing, and mask usage. SettingThe Australian state of New South Wales, a setting with prolonged low transmission, high mobility, non-universal mask usage, and a well-functioning test-and-trace system. ParticipantsNone (simulation study) ResultsWe find that the relative impact of masks is greatest when testing and tracing rates are lower (and vice versa). Scenarios with very high testing rates (90% of people with symptoms, plus 90% of people with a known history of contact with a confirmed case) were estimated to lead to a robustly controlled epidemic, with a median of ~180 infections in total over October 1 - December 31 under high mask uptake scenarios, or 260-1,200 without masks, depending on the efficacy of community contact tracing. However, across comparable levels of mask uptake and contact tracing, the number of infections over this period was projected to be 2-3 times higher if the testing rate was 80% instead of 90%, 8-12 times higher if the testing rate was 65%, or 30-50 times higher with a 50% testing rate. In reality, NSW diagnosed 254 locally-acquired cases over this period, an outcome that had a low probability in the model (4-7%) under the best-case scenarios of extremely high testing (90%), near-perfect community contact tracing (75-100%), and high mask usage (50-75%), but a far higher probability if any of these were at lower levels. ConclusionsOur work suggests that testing, tracing and masks can all be effective means of controlling transmission. A multifaceted strategy that combines all three, alongside continued hygiene and distancing protocols, is likely to be the most robust means of controlling transmission of SARS-CoV-2. Strengths and limitations of this study: A key methodological strength of this study is the level of detail in the model that we use, which allows us to capture many of the finer details of the extent to which controlling COVID-19 transmission relies on the balance between testing, contact tracing, and mask usage. Another key strength is that our model is stochastic, so we are able to quantify the probability of different epidemiological outcomes under different policy settings. A key limitation is the shortage of publicly-available data on the efficacy of contact tracing programs, including data on how many people were contacted for each confirmed index case of COVID-19.
epidemiology
10.1101/2020.10.09.20207936
Functional connectivity directionality between large-scale resting-state networks in children and adolescence from the Healthy Brain Network sample
Mental disorders often emerge during adolescence and have been associated with age-related differences in connection strengths of brain networks (static functional connectivity), manifesting in non-typical trajectories of brain development. However, little is known about the direction of information flow (directed functional connectivity) in this period of functional brain progression. We employed dynamic graphical models (DGM) to estimate directed functional connectivity from resting state functional magnetic resonance imaging data on 1143 participants, aged 6 to 17 years from the healthy brain network (HBN) sample. We tested for effects of age, sex, cognitive abilities and psychopathology on estimates of direction flow. Across participants, we show a pattern of reciprocal information flow between visual-medial and visual-lateral connections, in line with findings in adults. Investigating directed connectivity patterns between networks, we observed a positive association for age and direction flow from the cerebellar to the auditory network, and for the auditory to the sensorimotor network. Further, higher cognitive abilities were linked to lower information flow from the visual occipital to the default mode network. Additionally, examining the degree networks overall send and receive information to each other, we identified age-related effects implicating the right frontoparietal and sensorimotor network. However, we did not find any associations with psychopathology. Our results revealed that the directed functional connectivity of large-scale brain networks is sensitive to age and cognition during adolescence, warranting further studies that may explore trajectories of development in more fine-grained network parcellations and in different populations.
radiology and imaging
10.1101/2020.10.11.20208926
Novel Mutations in Human Luteinizing Hormone Beta Subunit Related to Polycystic Ovary Syndrome among Sudanese Women
IntroductionPolycystic ovary syndrome (PCOS) is a common disorder that is not fully understood. Multiple hormonal and metabolic factors impact on disease pathophysiology resulting in various phenotypic characteristics among the PCOS population. Luteinizing hormone beta subunit (LHB) (protein ID P01229) is mapped on (chr19p13.3) and consists of three exons. Luteinizing hormone (LH) has a central role in stimulating ovarian steroidogenesis, in particular androgen production, and in promoting ovulation. ObjectivesTo determine if genetic variations of LHB are associated with PCOS among Sudanese families. MethodsA prospective laboratory-based cross-sectional study was conducted to examine genetic mutations in LHB that associate with PCOS in families (cases; n=35 families, 90 females and controls; n=11 families, 30 females) in Khartoum State, Sudan. Quantitative enzyme linked immuno-sorbent assay (ELISA) and polymerase chain reaction (PCR) with Sanger sequencing were used to analyze biochemical parameters and detect polymorphisms. Protein structure and function bioinformatics analysis was conducted using standard software. ResultsPCOS cases had significantly different biochemical parameters from the controls (LH: p<0.001; testosterone: p<0.001; fasting glucose: p=0.02; insulin: p=0.01; triglycerides: p=0.03; total cholesterol: p<0.001; high density lipoprotein (HDL): p=0.012; low density lipoprotein (LDL): p<0.001). There were no differences in follicle stimulating hormone (FSH) (p=0.984) or prolactin (p=0.068). Sanger sequencing revealed 5 single nucleotide polymorphisms (rs5030775, A18T; rs746167425, R22K; rs1800447, W28R; rs35270001, H30R; and rs34349826, I35T) located on (exon 2) of the LHB gene that were statistically correlated with serum LH, testosterone and insulin levels among PCOS families. ConclusionThis is the first molecular family-based study in Sudan exploring the genetics of the LHB gene in women manifesting PCOS. These novel mutations give further information about the role of genetic inheritance and may explain some of the altered ovarian function and responses in women with PCOS.
endocrinology
10.1101/2020.10.11.20210914
α4β2* Nicotinic Cholinergic Receptor Target Engagement in Parkinson Disease Gait-Balance Disorders
ObjectiveAttentional function deficits secondary to degeneration of brain cholinergic systems are significant contributors to gait-balance deficits in Parkinson disease (PD). As an initial step towards assessing if α4β2* nicotinic acetylcholine receptor (nAChR) stimulation improves attention and gait-balance function, we assessed target engagement of the α4β2* nAChR partial agonist varenicline. MethodsNon-demented PD participants with cholinergic deficits were identified with [18F]fluoroethoxybenzamicol positron emission tomography (PET). α4β2* nAChR occupancy after subacute oral varenicline treatment was measured with [18F]flubatine PET. With a dose selected from the receptor occupancy experiment, varenicline effects on gait, balance, and cognition were assessed in a double-masked placebo-controlled crossover study. Primary endpoints were normal pace gait speed and a measure of postural stability. ResultsAll varenicline doses (0.25 mg per day, 0.25 mg b.i.d., 0.5 mg b.i.d., and 1.0 mg b.i.d.) produced 60% - 70% receptor occupancy. We selected 0.5 mg po b.i.d for the crossover study. Thirty-three (of thirty-four) participants completed the crossover study with excellent tolerability. Varenicline had no significant impact on the postural stability measure and caused slower normal pace gait speed. Varenicline narrowed the difference in normal pace gait speed between dual task and no dual task gait conditions, reduced dual task cost, and improved performance on a sustained attention test. We obtained identical conclusions in 28 participants in whom treatment compliance was confirmed by plasma varenicline measurements. InterpretationVarenicline occupied a significant fraction of α4β2* nicotinic acetylcholine receptors, was tolerated well, enhanced attentional function, and altered gait performance. These results are consistent with relevant target engagement. Varenicline or similar agents may be worth further evaluation for mitigation of gait and balance disorders in PD.
neurology
10.1101/2020.10.12.20211292
The ADEPT Study, A Comparative Study of Dentists Ability to Detect Enamel-only Proximal Caries in Bitewing Radiographs With and Without the use of AssistDent(R) Artificial Intelligence Software
IntroductionReversal of enamel-only proximal caries by non-invasive treatments is important in preventive dentistry. However, detecting such caries using bitewing radiography is difficult, and the subtle patterns are often missed by dental practitioners. AimsTo investigate whether the ability of dentists to detect enamel-only proximal caries is enhanced by the use of AssistDent(R) Artificial Intelligence (AI) software. Materials and MethodsIn the ADEPT (AssistDent Enamel-only Proximal caries assessmenT) study, twenty-three dentists were randomly divided into a control arm, without AI assistance, and an experimental arm in which AI assistance provided on-screen prompts indicating potential enamel-only proximal caries. All participants analysed a set of 24 bitewings in which an expert panel had previously identified 65 enamel-only carious lesions and 241 healthy proximal surfaces. ResultsThe control group found 44.3% of the caries, whereas the experimental group found 75.8%. The experimental group incorrectly identified caries in 14.6% of the healthy surfaces compared to 3.7% in the control group. The increase in sensitivity of 71% and decrease in specificity of 11% are statistically significant (p<0.01). ConclusionsAssistDent(R) Artificial Intelligence software significantly improves dentists ability to detect enamel-only proximal caries and could be considered as a tool to support preventive dentistry in general practice. Key PointsEnamel-only proximal caries are often missed by dentists when examining bitewing radiographs. The use of AssistDent(R) Artificial Intelligence software results in a 71% increase in ability to detect enamel-only proximal caries accompanied by a 11% decrease in specificity. Artificial Intelligence software could be considered as a tool to support preventive dentistry in general practice.
dentistry and oral medicine
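The headline 71% relative increase in sensitivity and 11% decrease in specificity in the record above follow directly from the detection rates it reports; a short check of that arithmetic:

```python
# Detection rates reported in the abstract (65 enamel-only lesions, 241 healthy surfaces).
control_sens, ai_sens = 0.443, 0.758            # proportion of lesions detected without / with AI
control_fp_rate, ai_fp_rate = 0.037, 0.146      # proportion of healthy surfaces marked as caries

relative_sens_increase = (ai_sens - control_sens) / control_sens
specificity_drop = (1 - control_fp_rate) - (1 - ai_fp_rate)

print(f"Relative increase in sensitivity: {relative_sens_increase:.0%}")  # ~71%
print(f"Absolute decrease in specificity: {specificity_drop:.0%}")        # ~11%
```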
10.1101/2020.10.12.20211094
COVID-19 vaccination rate and protection attitudes can determine the best prioritisation strategy to reduce fatalities
Background: The unprecedented rapid development of vaccines against the SARS-CoV-2 virus creates in itself a new challenge for governments and health authorities: the effective vaccination of large numbers of people in a short time and, possibly, with a shortage of vaccine doses. Whom to vaccinate first and in what sequence, if any at all, to avoid the most fatalities remains an open question. Methods: A compartmental model considering age-related groups was developed to evaluate and compare vaccine distribution strategies in terms of the total avoidable fatalities. Population groups are established based on relevant differences in mortality (due to e.g. their age) and risk-related traits (such as their behaviour and number of daily person-to-person interactions). Vaccination distribution strategies were evaluated for different vaccine effectiveness levels, population coverage and vaccination rates using data mainly from Spain. Findings: Our results show that, if children could also be included in the vaccination, a rollout giving priority to groups with the highest number of daily person-to-person interactions can achieve large reductions in total fatalities. This is due to the importance of the avoided subsequent infections inflicted on the rest of the population by highly interactive individuals. If children are excluded from the vaccination, the differences between priority strategies become smaller and appear highly dependent on rollout rate, coverage and the levels of self-protection and awareness exercised by the population. Interpretation: These results are in possible contradiction with several published plans for COVID-19 vaccination and highlight the importance of conducting an open, comprehensive and thorough analysis of this problem leaving behind possible preconceptions.
epidemiology
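A compartmental comparison of prioritisation strategies can be sketched as a small contact-structured SIR system with a fixed daily vaccination rate, tallying fatalities from cumulative infections. All parameters below (group sizes, mixing matrix, infection fatality ratios, rollout rate) are illustrative assumptions, not the values fitted to Spanish data in the study.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Two groups: 0 = high-contact/low-IFR, 1 = low-contact/high-IFR. All values are illustrative.
N = np.array([0.75, 0.25])                   # population fractions
beta = np.array([[0.5, 0.2], [0.2, 0.15]])   # transmission rates between groups
gamma = 0.2                                  # recovery rate (1/days)
ifr = np.array([0.0005, 0.05])               # infection fatality ratio per group
nu = 0.004                                   # fraction of the total population vaccinated per day

def model(t, y, priority):
    S, I, C = y[:2], y[2:4], y[4:6]          # susceptible, infectious, cumulative infections
    v = np.zeros(2)
    target = priority if S[priority] > 1e-6 else 1 - priority
    v[target] = nu                           # all doses go to the current priority group
    lam = beta @ (I / N)                     # force of infection on each group
    dS = -lam * S - np.minimum(v, S)
    dI = lam * S - gamma * I
    dC = lam * S
    return np.concatenate([dS, dI, dC])

y0 = np.concatenate([N - 1e-4, np.full(2, 1e-4), np.zeros(2)])
for priority, label in [(0, "high-contact group first"), (1, "high-IFR group first")]:
    sol = solve_ivp(model, (0, 365), y0, args=(priority,), max_step=1.0)
    fatalities = float(np.sum(ifr * sol.y[4:6, -1]))
    print(f"{label}: projected fatalities per capita = {fatalities:.4f}")
```

Swapping the priority order changes which group is depleted of susceptibles first, which is the mechanism the study explores under different rollout rates, coverage levels and protection attitudes.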
10.1101/2020.10.12.20211169
Testing and isolation to prevent overloaded health care facilities and to reduce death rates in the SARS-CoV-2 pandemic in Italy
BackgroundDuring the first wave of COVID-19, hospital and intensive care unit beds became overwhelmed in Italy, leading to an increased death burden. Based on data from Italian regions, we disentangled the impact of various factors contributing to the bottleneck situation of health care facilities, not well addressed in classical SEIR-like models. Particular emphasis was placed on the dark figure (the number of undetected cases), on the dynamically changing hospital capacity, and on different testing, contact tracing, and quarantine strategies. MethodsWe first estimated the dark figure for different Italian regions. Using parameter estimates from the literature and, alternatively, parameters derived from a fit to the initial phase of COVID-19 spread, the model was optimized to fit data (infected, hospitalized, ICU, dead) published by the Italian Civil Protection. ResultsWe showed that testing influenced the infection dynamics by isolation of newly detected cases and subsequent interruption of infection chains. The time-varying reproduction number (Rt) in high-testing regions decreased to <1 earlier than in low-testing regions. While an early test and isolate (TI) scenario resulted in up to ~32% reduction in peak hospital occupancy, the late TI scenario resulted in an overwhelmed health care system. ConclusionsAn early TI strategy would have drastically reduced the strain on hospital accessibility and, hence, the death toll (~45% reduction in Lombardia) and could have mitigated the lack of health care facilities in the course of the pandemic, but it would not have kept the number of hospitalizations within pre-pandemic hospital capacity. We showed that contact tracing and quarantine without testing would have a similar effect and might be an efficient strategy when sufficient test capacities are not available.
epidemiology
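A back-of-the-envelope sketch of why the test-and-isolate mechanism described above lowers the reproduction number: detected cases stop transmitting once isolated, so the effective infectious period shrinks for the detected fraction. All numbers are illustrative assumptions, not estimates fitted to the Italian data.

```python
# Rough effect of test-and-isolate (TI) on the reproduction number:
# detected cases transmit only until they are isolated.
# All parameter values are illustrative, not fitted to the Italian data.
beta        = 0.3      # transmissions per day while infectious and unisolated
infectious  = 10       # mean infectious period in days
detect_frac = 0.6      # share of infections eventually detected (complement of the dark figure)
delay       = 3        # days from onset of infectiousness to isolation (testing + turnaround)

R_no_testing = beta * infectious
# Detected cases transmit for `delay` days only; undetected ones for the full period.
R_with_TI = beta * (detect_frac * min(delay, infectious)
                    + (1 - detect_frac) * infectious)
print(f"R without testing      : {R_no_testing:.2f}")
print(f"R with test-and-isolate: {R_with_TI:.2f}")
```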
10.1101/2020.10.13.20211797
All-cause mortality among patients treated with repurposed antivirals and antibiotics for COVID-19 in Mexico City: A Real-World Observational Study
AimTo evaluate all-cause mortality risk in patients with laboratory-confirmed COVID-19 in Mexico City treated with repurposed antivirals and antibiotics. MethodsThis real-world retrospective cohort study considered 395,343 patients evaluated for suspected COVID-19 between February 24 and September 14, 2020 in 688 primary-to-tertiary medical units in Mexico City. Patients with a positive RT-PCR for SARS-CoV-2 were included; those receiving unspecified antivirals were excluded; and antivirals prescribed to <30 patients were eliminated. Survival and mortality risks were determined for patients receiving antivirals, antibiotics, both, or none. Results136,855 patients were analyzed; mean age 44.2 (SD:16.8) years; 51.3% were men. 16.6% received antivirals (3%), antibiotics (10%), or both (3.6%). Antivirals studied were Oseltamivir (n=8414), Amantadine (n=319), Lopinavir-Ritonavir (n=100), Rimantadine (n=61), Zanamivir (n=39), and Acyclovir (n=36). Survival with antivirals (73.7%, p<0.0001) and antibiotics (85.8%, p<0.0001) was lower than with no antiviral/antibiotic (93.6%). After multivariable adjustment, increased risk of death occurred with antivirals (HR=1.72, 95%CI:1.61-1.84) in ambulatory (HR=4.7, 95%CI:3.94-5.62) and non-critical (HR=2.03, 95%CI:1.86-2.21) patients. Oseltamivir increased mortality risk in the general population (HR=1.72, 95%CI:1.61-1.84), ambulatory (HR=4.79, 95%CI:4.01-5.75), non-critical (HR=2.05, 95%CI:1.88-2.23), and pregnant (HR=8.35, 95%CI:1.77-39.30) patients, as well as in hospitalized (HR=1.13, 95%CI:1.01-1.26) and critical (HR:1.22, 95%CI:1.05-1.43) patients after propensity score-matching. Antibiotics were a risk factor in the general population (HR=1.13, 95%CI:1.08-1.19) and pediatrics (HR=4.22, 95%CI:2.01-8.86), but a protective factor in hospitalized (HR=0.81, 95%CI:0.77-0.86) and critical patients (HR=0.67, 95%CI:0.63-0.72). ConclusionsNo significant benefit for repurposed antivirals was observed; oseltamivir was associated with increased mortality. Antibiotics increased mortality risk in the general population but may increase survival in hospitalized and critical patients. WHAT IS ALREADY KNOWN: Current recommendations for using repurposed antivirals and antibiotics for COVID-19 are conflicting. A few antivirals (i.e., lopinavir-ritonavir) have been shown to provide no additional benefit for COVID-19 in clinical trials; other antivirals may be in widespread use in real-world settings without formal assessment in clinical trials. Real-world use of repurposed antivirals and antibiotics for COVID-19 has not been evaluated in population-based studies; important populations have been left largely understudied (ambulatory patients, pregnant women, and pediatrics). WHAT THIS STUDY ADDS: This is the first real-world observational study evaluating amantadine, rimantadine, zanamivir, and acyclovir for COVID-19; no registered studies to evaluate these drugs exist. Only one study has evaluated risk of death for oseltamivir. Lopinavir-ritonavir has been previously evaluated in clinical trials. Repurposed antivirals and antibiotics were commonly prescribed in 688 ambulatory units and hospitals of Mexico City despite unclear recommendations for their use outside clinical trials. Oseltamivir was associated with increased mortality risk; other repurposed antivirals (zanamivir, amantadine, rimantadine, and acyclovir) had no significant and consistent impact on mortality. Antibiotics were associated with increased mortality risk in the general population but may increase survival in hospitalized and critical patients.
pharmacology and therapeutics
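A minimal sketch of the kind of adjusted survival analysis described in the entry above: a 1:1 propensity-score-matched comparison followed by a Cox proportional-hazards model. The column names ('oseltamivir', 'age', 'sex', 'time', 'death') are hypothetical placeholders, and the scikit-learn/lifelines calls are an illustrative choice, not necessarily the authors' software.

```python
# Sketch of a propensity-score-matched Cox analysis for a treated vs. untreated
# comparison.  Column names are hypothetical placeholders for the study's data.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors
from lifelines import CoxPHFitter

def matched_cox(df, treatment, covars, duration="time", event="death"):
    # 1) propensity score for receiving the treatment
    ps = LogisticRegression(max_iter=1000).fit(df[covars], df[treatment])
    df = df.assign(ps=ps.predict_proba(df[covars])[:, 1])

    treated, control = df[df[treatment] == 1], df[df[treatment] == 0]
    # 2) 1:1 nearest-neighbour matching on the propensity score
    #    (with replacement, for simplicity of the sketch)
    nn = NearestNeighbors(n_neighbors=1).fit(control[["ps"]])
    _, idx = nn.kneighbors(treated[["ps"]])
    matched = pd.concat([treated, control.iloc[idx.ravel()]])

    # 3) Cox proportional-hazards model on the matched sample
    cph = CoxPHFitter()
    cph.fit(matched[[duration, event, treatment] + covars],
            duration_col=duration, event_col=event)
    return cph.hazard_ratios_[treatment]

# Example call (df would hold one row per patient):
# hr = matched_cox(df, treatment="oseltamivir", covars=["age", "sex"])
```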
10.1101/2020.10.12.20211755
Using Satellite Images and Deep Learning to Identify Associations Between County-Level Mortality and Neighborhood Features: A Cross-Sectional Study
What is the relationship between mortality and satellite images as elucidated through the use of Convolutional Neural Networks? BackgroundFollowing a century of increase, life expectancy in the United States has stagnated and begun to decline in recent decades. Using satellite images and street view images, prior work has demonstrated associations of the built environment with income, education, access to care and health factors such as obesity. However, assessment of learned image feature relationships with variation in crude mortality rate across the United States has been lacking. ObjectiveWe sought to investigate if county-level mortality rates in the U.S. could be predicted from satellite images. MethodsSatellite images of neighborhoods surrounding schools were extracted with the Google Static Maps application programming interface for 430 counties representing approximately 68.9% of the US population. A convolutional neural network was trained using crude mortality rates for each county in 2015 to predict mortality. Learned image features were interpreted using Shapley Additive Feature Explanations, clustered, and compared to mortality and its associated covariate predictors. ResultsPredicted mortality from satellite images in a held-out test set of counties was strongly correlated to the true crude mortality rate (Pearson r=0.72). Direct prediction of mortality using a deep learning model across a cross-section of 430 U.S. counties identified key features in the environment (e.g. sidewalks, driveways and hiking trails) associated with lower mortality. Learned image features were clustered, and we identified 10 clusters that were associated with education, income, geographical region, race and age. ConclusionsThe application of deep learning techniques to remotely-sensed features of the built environment can serve as a useful predictor of mortality in the United States. Although we identified features that were largely associated with demographic information, future modeling approaches that directly identify image features associated with health-related outcomes have the potential to inform targeted public health interventions.
public and global health
10.1101/2020.10.12.20211755
Using Satellite Images and Deep Learning to Identify Associations Between County-Level Mortality and Residential Neighborhood Features Proximal to Schools: A Cross-Sectional Study
What is the relationship between mortality and satellite images as elucidated through the use of Convolutional Neural Networks? BackgroundFollowing a century of increase, life expectancy in the United States has stagnated and begun to decline in recent decades. Using satellite images and street view images, prior work has demonstrated associations of the built environment with income, education, access to care and health factors such as obesity. However, assessment of learned image feature relationships with variation in crude mortality rate across the United States has been lacking. ObjectiveWe sought to investigate if county-level mortality rates in the U.S. could be predicted from satellite images. MethodsSatellite images of neighborhoods surrounding schools were extracted with the Google Static Maps application programming interface for 430 counties representing approximately 68.9% of the US population. A convolutional neural network was trained using crude mortality rates for each county in 2015 to predict mortality. Learned image features were interpreted using Shapley Additive Feature Explanations, clustered, and compared to mortality and its associated covariate predictors. ResultsPredicted mortality from satellite images in a held-out test set of counties was strongly correlated to the true crude mortality rate (Pearson r=0.72). Direct prediction of mortality using a deep learning model across a cross-section of 430 U.S. counties identified key features in the environment (e.g. sidewalks, driveways and hiking trails) associated with lower mortality. Learned image features were clustered, and we identified 10 clusters that were associated with education, income, geographical region, race and age. ConclusionsThe application of deep learning techniques to remotely-sensed features of the built environment can serve as a useful predictor of mortality in the United States. Although we identified features that were largely associated with demographic information, future modeling approaches that directly identify image features associated with health-related outcomes have the potential to inform targeted public health interventions.
public and global health
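A minimal PyTorch sketch of the modelling idea behind the two satellite-image entries above: a small convolutional network regressing county-level crude mortality from image tiles. The architecture, tile size, and training loop are placeholders; the study's actual network and its SHAP-based feature interpretation are not reproduced here.

```python
# Minimal CNN regression from satellite tiles to a county mortality rate.
# Architecture and hyper-parameters are illustrative only.
import torch
import torch.nn as nn

class TileRegressor(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 1)            # predicted crude mortality rate

    def forward(self, x):                       # x: (batch, 3, H, W) RGB tiles
        return self.head(self.features(x).flatten(1)).squeeze(1)

model = TileRegressor()
opt, loss_fn = torch.optim.Adam(model.parameters(), lr=1e-3), nn.MSELoss()

# One illustrative training step on random tensors standing in for image tiles
x = torch.randn(8, 3, 128, 128)                 # 8 fake 128x128 tiles
y = torch.rand(8)                               # fake county mortality rates
opt.zero_grad()
loss = loss_fn(model(x), y)
loss.backward()
opt.step()
```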
10.1101/2020.10.11.20210781
Increased brain volume from higher cereal and lower coffee intake: Shared genetic determinants and impacts on cognition and metabolism
It is unclear how different diets may affect human brain development and whether genetic and environmental factors play a part. We investigated diet effects in the UK Biobank data from 18,879 healthy adults and discovered anti-correlated brain-wide grey matter volume (GMV)-association patterns between coffee and cereal intake, coinciding with their anti-correlated genetic constructs. The Mendelian randomisation approach further indicated a causal effect of higher coffee intake on reduced total GMV, likely exerted through regulating the expression of genes responsible for synaptic development in the brain. The identified genetic factors may further affect people's lifestyle habits and body/blood fat levels through the mediation of cereal/coffee intake, and the brain-wide expression pattern of the gene CPLX3, a dedicated marker of subplate neurons that regulate cortical development and plasticity, may underlie the shared GMV-association patterns among coffee/cereal intake and cognitive functions. All the main findings were successfully replicated in the newly-released independent UK Biobank data from 16,412 healthy adults. Our findings thus revealed that high-cereal and low-coffee diets shared similar brain and genetic constructs, leading to long-term beneficial associations regarding cognitive, BMI and other metabolic measures. This study has important implications for public health, especially during the pandemic, given the poorer outcomes of COVID-19 patients with greater BMIs. Significance statementWe investigated diet effects on brain structure and its genetic constructs using the UK Biobank data and discovered a causal effect of higher coffee intake on reduced total grey matter volume (GMV) and replicable anti-correlated brain-wide GMV-association patterns between cereal and coffee intake. Further, the high-cereal and low-coffee diets shared similar brain and genetic constructs, leading to long-term beneficial associations regarding cognitive, BMI, and other metabolic indicators. Our study has important implications for public health, especially during the pandemic, given the poorer outcomes of COVID-19 patients with greater BMIs.
public and global health
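A compact sketch of the Mendelian-randomisation logic invoked in the entry above: per-SNP effects on the exposure (coffee intake) and on the outcome (total GMV) are combined into an inverse-variance-weighted causal estimate. The three SNP summary statistics below are invented placeholders, not UK Biobank results.

```python
# Inverse-variance-weighted (IVW) Mendelian randomisation estimate from
# per-SNP summary statistics.  All numbers are placeholders.
import numpy as np

beta_exposure = np.array([0.04, 0.06, 0.03])         # SNP effects on coffee intake (assumed)
beta_outcome  = np.array([-0.010, -0.013, -0.006])   # SNP effects on total GMV (assumed)
se_outcome    = np.array([0.004, 0.005, 0.003])      # standard errors of the outcome effects

wald = beta_outcome / beta_exposure                  # per-SNP causal-effect (Wald) ratios
w = (beta_exposure / se_outcome) ** 2                # IVW weights
ivw_estimate = np.sum(w * wald) / np.sum(w)
ivw_se = np.sqrt(1.0 / np.sum(w))
print(f"IVW causal estimate: {ivw_estimate:.3f} (SE {ivw_se:.3f})")
```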
10.1101/2020.10.12.20211607
Tracking the Progression & Influence of Beta-Amyloid Plaques Using Percolation Centrality and Collective Influence Algorithm: A Study using PET images
(1) Background: Network analysis allows investigators to explore the many facets of brain networks, particularly the proliferation of disease. One of the hypotheses behind the disruption in brain networks in Alzheimer's disease is the abnormal accumulation of beta-amyloid plaques and tau protein tangles. In this study, the potential use of percolation centrality to track beta-amyloid movement was investigated as a feature of the given PET image-based networks; (2) Methods: The PET image-based network construction is possible using a public access database - the Alzheimer's Disease Neuroimaging Initiative, which provided 551 scans. For each image, the Julich atlas provides 121 regions of interest, which are the network nodes. In addition, the influential nodes for each scan are calculated using the collective influence algorithm; (3) Results: Analysis of variance (p<0.05) yields the region of interest Gray Matter Broca's Area for the PiB tracer type for five nodal metrics. In comparison, for AV45, the Gray Matter Hippocampus region is significant for three of the nodal metrics. Pairwise variance analysis between the clinical groups yields five and twelve statistically significant ROIs for AV45 and PiB, respectively, capable of distinguishing between pairs of clinical conditions. Multivariate linear regression between the percolation centrality values for nodes and psychometric assessment scores reveals that the Mini-Mental State Examination is reliable; (4) Conclusion: Percolation centrality effectively (41% of ROIs) indicates that the regions of interest involved in memory, visual-spatial skills, and language are crucial to the percolation of beta-amyloid within the brain network, compared to the other widely used nodal metrics. Ranking the regions of interest based on the collective influence algorithm indicates the anatomical areas strongly influencing the beta-amyloid network.
neurology
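A minimal sketch of the nodal metric named in the entry above, using networkx's percolation_centrality on a toy graph. The regions, edges, and per-node "percolation" states (standing in for normalised tracer uptake) are invented, not taken from the ADNI scans.

```python
# Percolation centrality on a toy brain-region graph with networkx.
# The graph and the node "percolation" states are illustrative placeholders.
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("GM_Broca", "GM_Hippocampus"),
    ("GM_Hippocampus", "GM_Precuneus"),
    ("GM_Precuneus", "GM_Visual_V1"),
    ("GM_Broca", "GM_Visual_V1"),
])
# Node states in [0, 1], e.g. normalised tracer uptake per ROI (assumed values)
states = {"GM_Broca": 0.9, "GM_Hippocampus": 0.7,
          "GM_Visual_V1": 0.2, "GM_Precuneus": 0.5}

pc = nx.percolation_centrality(G, states=states)
for roi, score in sorted(pc.items(), key=lambda kv: -kv[1]):
    print(f"{roi:18s} {score:.3f}")
```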
10.1101/2020.10.12.20210500
Predicting Olfactory Loss In Chronic Rhinosinusitis Using Machine Learning
ObjectiveCompare machine learning (ML) based predictive analytics methods to traditional logistic regression in classification of olfactory dysfunction in chronic rhinosinusitis (CRS-OD), and identify predictors within a large multi-institutional cohort of refractory CRS patients. MethodsAdult CRS patients enrolled in a prospective, multi-institutional, observational cohort study were assessed for baseline CRS-OD using a smell identification test (SIT) or brief SIT (bSIT). Four different ML methods were compared to traditional logistic regression for classification of CRS normosmics versus CRS-OD. ResultsData were collected for 611 study participants who met inclusion criteria between April 2011 and July 2015. 34% of enrolled patients demonstrated olfactory loss on psychophysical testing. Differences between CRS normosmics and those with smell loss included objective disease measures (CT and endoscopy scores), age, sex, prior surgeries, socioeconomic status, steroid use, polyp presence, asthma, and aspirin sensitivity. Most ML methods performed favorably in terms of predictive ability. Top predictors include factors previously reported in the literature, as well as several socioeconomic factors. ConclusionOlfactory dysfunction is a variable phenomenon in CRS patients. ML methods perform well compared to traditional logistic regression in classification of normosmia versus smell loss in CRS, and are able to include numerous risk factors into prediction models. Several actionable features were identified as risk factors for CRS-OD. These results suggest that ML methods may be useful for current understanding and future study of hyposmia secondary to sinonasal disease, the most common cause of persistent olfactory loss in the general population.
otolaryngology
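A minimal scikit-learn sketch of the comparison described in the entry above: cross-validated discrimination of a tree-ensemble classifier versus logistic regression for a binary olfactory-dysfunction label. The data are synthetic stand-ins for the cohort's clinical and socioeconomic features, and the particular classifiers are an assumption (the abstract does not name the four ML methods used).

```python
# Compare an ML classifier with logistic regression for predicting olfactory
# dysfunction (binary label).  Data and class balance are synthetic placeholders.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=611, n_features=15, n_informative=6,
                           weights=[0.66, 0.34], random_state=0)  # ~34% smell loss

for name, clf in [("logistic regression", LogisticRegression(max_iter=1000)),
                  ("random forest", RandomForestClassifier(n_estimators=300, random_state=0))]:
    auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
    print(f"{name:20s} AUC = {auc.mean():.3f} +/- {auc.std():.3f}")
```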
10.1101/2020.10.12.20211573
Preventing COVID-19 spread in closed facilities by regular testing of employees - an efficient intervention in long-term care facilities and prisons?
BackgroundDifferent levels of control measures were introduced to contain the global COVID-19 pandemic, many of which have been controversial, particularly the comprehensive use of diagnostic tests. Regular testing of high-risk individuals (pre-existing conditions, older than 60 years of age) has been suggested by public health authorities. The WHO suggested the use of routine screening of residents, employees, and visitors of long-term care facilities (LTCF) to protect the resident risk group. Similar suggestions have been made by the WHO for other closed facilities, including incarceration facilities (e.g., prisons or jails), where, in parts of the U.S., accelerated release of approved inmates has been taken as a measure to mitigate COVID-19. Methods and findingsHere, the simulation model underlying the pandemic preparedness tool CovidSim 1.1 (http://covidsim.eu/) is extended to investigate the effect of regular testing of employees to protect immobile resident risk groups in closed facilities. The reduction in the number of infections and deaths within the risk group is investigated. Our simulations are adjusted to reflect the situation of LTCFs in Germany, and incarceration facilities in the U.S. COVID-19 spreads in closed facilities due to contact with infected employees even under strict confinement of visitors in a pandemic scenario without targeted protective measures. Testing is only effective in conjunction with targeted contact reduction between the closed facility and the outside world - and will be most inefficient under strategies aiming for herd immunity. The frequency of testing, the quality of tests, and the waiting time for obtaining test results have noticeable effects. The exact reduction in the number of cases depends on disease prevalence in the population and the levels of contact reductions. Testing every 5 days with a good quality test and a processing time of 24 hours can lead to up to a 40% reduction in the number of infections. However, the effects of testing vary substantially among types of closed facilities and can even be counterproductive in U.S. incarceration facilities. ConclusionsThe introduction of COVID-19 in closed facilities is unavoidable without a thorough screening of persons that can introduce the disease into the facility. Regular testing of employees in closed facilities can contribute to reducing the number of infections there, but is only meaningful as an accompanying measure, whose economic benefit needs to be assessed carefully.
health policy
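A small simulation sketch of why test frequency, sensitivity, and result turnaround all matter for the screening strategy described above: it estimates how many infectious days an employee spends inside the facility before a positive result triggers isolation. All parameter values are assumptions, unrelated to the CovidSim parameterisation.

```python
# How many infectious days does an employee work before being isolated,
# as a function of the testing interval?  All parameters are assumed.
import random

def mean_infectious_days_at_work(test_every=5, sensitivity=0.9, turnaround=1.0,
                                 infectious_days=8, trials=50_000):
    total = 0.0
    for _ in range(trials):
        t = random.uniform(0.0, test_every)        # time from onset to the next test
        while t < infectious_days:                 # take scheduled tests while infectious
            if random.random() < sensitivity:      # first positive test
                total += min(t + turnaround, infectious_days)
                break
            t += test_every
        else:
            total += infectious_days               # never detected while infectious
    return total / trials

for interval in (3, 5, 10):
    print(f"test every {interval:2d} d: "
          f"{mean_infectious_days_at_work(test_every=interval):.1f} infectious days at work")
```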
10.1101/2020.10.13.20211953
Identification of drugs associated with reduced severity of COVID-19: A case-control study in a large population
BackgroundUntil drugs specifically developed to treat COVID-19 become more widely accessible, it is crucial to identify whether existing medications have a protective effect against severe disease. Towards this objective, we conducted a large population study in Clalit Health Services (CHS), the largest healthcare provider in Israel, insuring over 4.7 million members. MethodsTwo case-control matched cohorts were assembled to assess which medications, acquired in the last month, decreased the risk of COVID-19 hospitalization. Case patients were adults aged 18-95 hospitalized for COVID-19. In the first cohort, five control patients, from the general population, were matched to each case (n=6202); in the second cohort, two non-hospitalized SARS-CoV-2 positive control patients were matched to each case (n=6919). The outcome measures for a medication were: odds ratio (OR) for hospitalization, 95% confidence interval (CI), and the p-value, using Fisher's exact test. False discovery rate was used to adjust for multiple testing. ResultsMedications associated with most significantly reduced odds for COVID-19 hospitalization include: ubiquinone (OR=0.185, 95% CI (0.058 to 0.458), p<0.001), ezetimibe (OR=0.488, 95% CI (0.377 to 0.622), p<0.001), rosuvastatin (OR=0.673, 95% CI (0.596 to 0.758), p<0.001), flecainide (OR=0.301, 95% CI (0.118 to 0.641), p<0.001), and vitamin D (OR=0.869, 95% CI (0.792 to 0.954), p<0.003). Remarkably, acquisition of artificial tears, eye care wipes, and several ophthalmological products was also associated with decreased risk for hospitalization. ConclusionsUbiquinone, ezetimibe and rosuvastatin, all related to the cholesterol synthesis pathway, were associated with reduced hospitalization risk. These findings point to a promising protective effect which should be further investigated in controlled, prospective studies. FundingThis research was supported in part by the Intramural Research Program of the National Institutes of Health, NCI.
infectious diseases
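A minimal sketch of the per-medication screen described in the entry above: a 2x2 odds ratio with Fisher's exact test for each drug, followed by Benjamini-Hochberg false-discovery-rate adjustment across all drugs. The counts and drug names are invented placeholders.

```python
# Per-medication screen: odds ratio, Fisher's exact test, then FDR adjustment
# across medications.  The 2x2 counts below are invented placeholders.
from scipy.stats import fisher_exact
from statsmodels.stats.multitest import multipletests

# ((cases exposed, cases unexposed), (controls exposed, controls unexposed))
meds = {
    "drug_A": ((12, 988), (110, 4890)),
    "drug_B": ((45, 955), (180, 4820)),
    "drug_C": ((30, 970), (150, 4850)),
}

pvals, ors = [], {}
for name, ((a, b), (c, d)) in meds.items():
    or_, p = fisher_exact([[a, b], [c, d]])
    ors[name] = or_
    pvals.append(p)

reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
for (name, or_), p, sig in zip(ors.items(), p_adj, reject):
    print(f"{name}: OR={or_:.2f}, FDR-adjusted p={p:.3g}, significant={sig}")
```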
10.1101/2020.10.13.20183111
A delayed modulation of solar radiation on the COVID-19 transmission reflects an incubation period
Laboratory experiments have revealed the meteorological sensitivity of the virus that causes coronavirus disease 2019 (COVID-19). However, no consensus has been reached about how outdoor meteorological conditions modulate the virus transmission, as it is also constrained by non-meteorological conditions. Here, we find that statistically, non-meteorological factors constrain the growth rate of cumulative confirmed cases least when the number of cases in a country reaches around 1,300-3,200. The least-constrained growth rate correlates with the ultraviolet flux and temperature significantly (correlation coefficients r=-0.55±0.09 and -0.40±0.10 at p < 0.01, respectively), but not with precipitation, humidity, and wind. The ultraviolet correlation exhibits a delay of about seven days, providing a meteorological measure of the incubation period. Our work reveals a seasonality of COVID-19 and a high risk of a pandemic resurgence in winter, implying a need for seasonal adaptation in public policies. One-sentence summaryA delayed modulation of COVID-19 transmission by ultraviolet radiation provides independent evidence for a 7-day incubation period and implies a strong seasonality
epidemiology
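A short sketch of the lagged-correlation idea in the entry above: scan correlations between a daily ultraviolet-flux series and the epidemic growth-rate series shifted by candidate delays, and report the delay with the strongest negative correlation. Both series below are synthetic, with a 7-day delay built in for illustration.

```python
# Lag scan between a daily UV-flux series and an epidemic growth-rate series.
# Both series are synthetic stand-ins for the country-level data in the study.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
days = np.arange(120)
uv = 5 + 3 * np.sin(2 * np.pi * days / 365) + rng.normal(0, 0.3, days.size)
growth = 0.2 - 0.03 * np.roll(uv, 7) + rng.normal(0, 0.02, days.size)  # built-in 7-day delay

best = None
for lag in range(0, 15):                    # growth on day t responds to UV on day t - lag
    r, p = pearsonr(uv[:-lag or None], growth[lag:])
    if best is None or r < best[1]:
        best = (lag, r, p)
print(f"strongest negative correlation at lag {best[0]} d: r={best[1]:.2f} (p={best[2]:.3g})")
```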
10.1101/2020.10.12.20211557
Nonspecific blood tests as proxies for COVID-19 hospitalization: are there plausible associations after excluding noisy predictors?
This study applied causal criteria from directed acyclic graphs for handling covariates in associations for the prognosis of severe COVID-19 (coronavirus disease 2019) cases. To identify nonspecific blood tests and risk factors as predictors of hospitalization due to COVID-19, one has to exclude noisy predictors by comparing the concordance statistics (AUC) for positive and negative cases of SARS-CoV-2 (severe acute respiratory syndrome coronavirus 2). Predictors with a significant AUC in the negative stratum should either be controlled for their confounders or eliminated (when confounders are unavailable). Models were classified according to the difference in AUC between strata. The framework was applied to an open database with 5644 patients from Hospital Israelita Albert Einstein in Brazil with SARS-CoV-2 RT-PCR (reverse transcription-polymerase chain reaction) exams. C-reactive protein (CRP) was a noisy predictor: hospitalization could have happened due to causes other than COVID-19 even when SARS-CoV-2 RT-PCR is positive and CRP is reactive, as most cases are asymptomatic to mild. Candidate markers of the characteristic response to moderate-to-severe COVID-19 inflammation were combinations of eosinophils, monocytes and neutrophils, with age as a risk factor; creatinine, as a risk factor, sharpens the odds ratio of the model with monocytes, neutrophils, and age.
epidemiology
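A minimal sketch of the noisy-predictor check described in the entry above: compute the AUC of a candidate blood marker for hospitalization separately in the SARS-CoV-2-positive and SARS-CoV-2-negative strata; a marker that discriminates equally well in the negative stratum is flagged as nonspecific. The data and the two marker behaviours are synthetic.

```python
# Noisy-predictor check: a marker that predicts hospitalisation equally well in
# SARS-CoV-2-negative patients is not COVID-19-specific.  Data are synthetic.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)

def stratum(n, signal):
    """Simulate one stratum: a marker value and a hospitalisation label."""
    hosp = rng.binomial(1, 0.15, n)
    marker = rng.normal(0, 1, n) + signal * hosp    # `signal` = marker/outcome association
    return marker, hosp

for name, sig_pos, sig_neg in [("CRP-like (nonspecific)", 1.0, 1.0),
                               ("eosinophil-like (specific)", 1.0, 0.0)]:
    m_pos, h_pos = stratum(600, sig_pos)            # RT-PCR positive stratum
    m_neg, h_neg = stratum(5000, sig_neg)           # RT-PCR negative stratum
    print(f"{name}: AUC(+)={roc_auc_score(h_pos, m_pos):.2f}, "
          f"AUC(-)={roc_auc_score(h_neg, m_neg):.2f}")
```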
10.1101/2020.10.14.20212803
Characteristics and Factors Associated with COVID-19 Infection, Hospitalization, and Mortality Across Race and Ethnicity
BackgroundData on the characteristics of COVID-19 patients disaggregated by race/ethnicity remain limited. We evaluated the sociodemographic and clinical characteristics of patients across racial/ethnic groups and assessed their associations with COVID-19 outcomes. MethodsThis retrospective cohort study examined 629,953 patients tested for SARS-CoV-2 in a large health system spanning California, Oregon, and Washington between March 1 and December 31, 2020. Sociodemographic and clinical characteristics were obtained from electronic health records. Odds of SARS-CoV-2 infection, COVID-19 hospitalization, and in-hospital death were assessed with multivariate logistic regression. Results570,298 patients with known race/ethnicity were tested for SARS-CoV-2, of whom 27.8% were non-White minorities. 54,645 individuals tested positive, with minorities representing 50.1%. Hispanics represented 34.3% of infections but only 13.4% of tests. While generally younger than White patients, Hispanics had higher rates of diabetes but fewer other comorbidities. 8,536 patients were hospitalized and 1,246 died, of whom 56.1% and 54.4% were non-White, respectively. Racial/ethnic distributions of outcomes across the health system tracked with state-level statistics. Increased odds of testing positive and hospitalization were associated with all minority races/ethnicities. Hispanic patients also exhibited increased morbidity, and Hispanic race/ethnicity was associated with in-hospital mortality (OR: 1.39 [95% CI: 1.14-1.70]). ConclusionMajor healthcare disparities were evident, especially among Hispanics who tested positive at a higher rate, required excess hospitalization and mechanical ventilation, and had higher odds of in-hospital mortality despite younger age. Targeted, culturally-responsive interventions and equitable vaccine development and distribution are needed to address the increased risk of poorer COVID-19 outcomes among minority populations. Key pointsRacial/ethnic disparities are evident in the disaggregated characteristics of COVID-19 patients. Minority patients experience increased odds of SARS-CoV-2 infection and COVID-19 hospitalization. Hospitalized Hispanic patients presented with more severe illness, experienced increased morbidity, and faced increased mortality.
public and global health
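A minimal sketch of the adjusted odds-ratio estimation described in the entry above: a multivariable logistic regression of a binary outcome on a race/ethnicity factor plus covariates, with odds ratios and 95% confidence intervals taken from the exponentiated coefficients. The data frame, categories, and covariates are synthetic placeholders, not the health-system records.

```python
# Adjusted odds ratios from a multivariable logistic regression.
# The data frame and column names are synthetic placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 5000
df = pd.DataFrame({
    "hospitalized": rng.binomial(1, 0.1, n),
    "ethnicity": rng.choice(["White", "Hispanic", "Black", "Asian"], n),
    "age": rng.normal(50, 15, n),
    "diabetes": rng.binomial(1, 0.12, n),
})

model = smf.logit("hospitalized ~ C(ethnicity, Treatment('White')) + age + diabetes",
                  data=df).fit(disp=False)
odds_ratios = pd.DataFrame({"OR": np.exp(model.params),
                            "2.5%": np.exp(model.conf_int()[0]),
                            "97.5%": np.exp(model.conf_int()[1])})
print(odds_ratios.round(2))
```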