doi | title | authors | author_corresponding | author_corresponding_institution | date | version | type | license | category | jatsxml | abstract | published | server
---|---|---|---|---|---|---|---|---|---|---|---|---|---|
10.1101/19006189 | Psychiatric comorbid disorders of cognition: A machine learning approach using 1,159 UK Biobank participants | Li, C.; Gheorghe, D. A.; Gallacher, J. E.; Bauermeister, S. | Sarah Bauermeister | University of Oxford | 2020-03-02 | 2 | PUBLISHAHEADOFPRINT | cc_by | psychiatry and clinical psychology | https://www.medrxiv.org/content/early/2020/03/02/19006189.source.xml | BackgroundConceptualising comorbidity is complex and the term is used variously. Here, it is the coexistence of two or more diagnoses which might be defined as chronic and, although they may be pathologically related, they may also act independently [1]. Of interest here is the comorbidity of common psychiatric disorders and impaired cognition.
ObjectivesTo examine whether anxiety and/or depression are important longitudinal predictors of cognitive change.
MethodsUK Biobank participants were assessed at three time points: baseline (n= 502,664), 1st follow-up (n= 20,257) and 1st imaging study (n= 40,199). The 1,175 participants with no missing data were aged 40 to 70 years (41% female). Machine learning (ML) was applied, with reaction time intraindividual variability (cognition) as the main outcome measure.
FindingsUsing the area under the Receiver Operating Characteristic (ROC) curve, the anxiety model achieved the best performance with an Area Under the Curve (AUC) of 0.68, followed by the depression model with an AUC of 0.63. The cardiovascular and diabetes model and the covariates model performed more weakly in predicting cognition, with AUCs of 0.60 and 0.56, respectively.
ConclusionsOutcomes suggest psychiatric disorders are more important comorbidities of long-term cognitive change than diabetes and cardiovascular disease, and demographic factors. Findings suggest that psychiatric disorders (anxiety and depression) may have a deleterious effect on long-term cognition and should be considered as an important comorbid disorder of cognitive decline.
Clinical implicationsImportant predictive effects of poor mental health on longitudinal cognitive decline should be considered in both secondary and primary care.
Summary Box

What is already known about this subject? (3-4 bullet points)

- Poor mental health is associated with cognitive deficits.
- One in four older adults experience a decline in affective state with increasing age.
- ML approaches have certain advantages in identifying patterns of information useful for the prediction of an outcome.

What are the new findings? (3-4 bullet points)

- Psychiatric disorders are important comorbid disorders of long-term cognitive change.
- Machine-learning methods such as sequence-learning-based methods offer non-parametric joint modelling, allow for a multiplicity of factors, and provide prediction models that are more robust and accurate for longitudinal data.
- The RNN analysis found that anxiety and depression were stronger predictors of change in IIV over time than either cardiovascular disease and diabetes or the covariate variables.

How might it impact on clinical practice in the foreseeable future?

- The important predictive effect of mental health on longitudinal cognition should be noted and its comorbidity relationship with other conditions, such as cardiovascular disease, likewise considered in primary care and other clinical settings. | 10.1136/ebmental-2020-300147 | medrxiv |
10.1101/19006189 | Psychiatric comorbid disorders of cognition: A machine learning approach using 1,175 UK Biobank participants | Li, C.; Gheorghe, D. A.; Gallacher, J. E.; Bauermeister, S. | Sarah Bauermeister | University of Oxford | 2020-06-28 | 3 | PUBLISHAHEADOFPRINT | cc_by | psychiatry and clinical psychology | https://www.medrxiv.org/content/early/2020/06/28/19006189.source.xml | BackgroundConceptualising comorbidity is complex and the term is used variously. Here, it is the coexistence of two or more diagnoses which might be defined as chronic and, although they may be pathologically related, they may also act independently [1]. Of interest here is the comorbidity of common psychiatric disorders and impaired cognition.
ObjectivesTo examine whether anxiety and/or depression are important longitudinal predictors of cognitive change.
MethodsUK Biobank participants were assessed at three time points: baseline (n= 502,664), 1st follow-up (n= 20,257) and 1st imaging study (n= 40,199). The 1,175 participants with no missing data were aged 40 to 70 years (41% female). Machine learning (ML) was applied, with reaction time intraindividual variability (cognition) as the main outcome measure.
FindingsUsing the area under the Receiver Operating Characteristic (ROC) curve, the anxiety model achieved the best performance with an Area Under the Curve (AUC) of 0.68, followed by the depression model with an AUC of 0.63. The cardiovascular and diabetes model and the covariates model performed more weakly in predicting cognition, with AUCs of 0.60 and 0.56, respectively.
ConclusionsOutcomes suggest psychiatric disorders are more important comorbidities of long-term cognitive change than diabetes and cardiovascular disease, and demographic factors. Findings suggest that psychiatric disorders (anxiety and depression) may have a deleterious effect on long-term cognition and should be considered as an important comorbid disorder of cognitive decline.
Clinical implicationsImportant predictive effects of poor mental health on longitudinal cognitive decline should be considered in both secondary and primary care.
Summary Box

What is already known about this subject? (3-4 bullet points)

- Poor mental health is associated with cognitive deficits.
- One in four older adults experience a decline in affective state with increasing age.
- ML approaches have certain advantages in identifying patterns of information useful for the prediction of an outcome.

What are the new findings? (3-4 bullet points)

- Psychiatric disorders are important comorbid disorders of long-term cognitive change.
- Machine-learning methods such as sequence-learning-based methods offer non-parametric joint modelling, allow for a multiplicity of factors, and provide prediction models that are more robust and accurate for longitudinal data.
- The RNN analysis found that anxiety and depression were stronger predictors of change in IIV over time than either cardiovascular disease and diabetes or the covariate variables.

How might it impact on clinical practice in the foreseeable future?

- The important predictive effect of mental health on longitudinal cognition should be noted and its comorbidity relationship with other conditions, such as cardiovascular disease, likewise considered in primary care and other clinical settings. | 10.1136/ebmental-2020-300147 | medrxiv |
10.1101/19007682 | Interleukin-6 signaling effects on ischemic stroke and other cardiovascular outcomes: a Mendelian Randomization study | Georgakis, M. K.; Malik, R.; Gill, D. K.; Franceschini, N.; Sudlow, C. L.; INVENT Consortium, ; CHARGE Inflammation Working Group, ; Dichgans, M. | Martin Dichgans | Ludwig-Maximilians-University (LMU) Munich | 2019-09-27 | 1 | PUBLISHAHEADOFPRINT | cc_by | epidemiology | https://www.medrxiv.org/content/early/2019/09/27/19007682.source.xml | BackgroundStudies in humans and experimental models highlight a role of interleukin-6 (IL-6) in cardiovascular disease. Indirect evidence suggests that inhibition of IL-6 signaling could lower risk of coronary artery disease. However, whether such an approach would be effective for ischemic stroke and other cardiovascular outcomes remains unknown.
MethodsIn a genome-wide association study (GWAS) of 204,402 European individuals, we identified genetic proxies for downregulated IL-6 signaling as genetic variants in the IL-6 receptor (IL6R) locus that were associated with lower C-reactive protein (CRP) levels, a downstream effector of IL-6 signaling. We then applied two-sample Mendelian randomization (MR) to explore associations with ischemic stroke and its major subtypes (large artery stroke, cardioembolic stroke, small vessel stroke) in the MEGASTROKE dataset (34,217 cases and 404,630 controls), with coronary artery disease in the CARDIoGRAMplusC4D dataset (60,801 cases and 123,504 controls), and with other cardiovascular outcomes in the UK Biobank (up to 321,406 individuals) and in phenotype-specific GWAS datasets. All effect estimates were scaled to the CRP-decreasing effects of tocilizumab, a monoclonal antibody targeting IL-6R.
ResultsWe identified 7 genetic variants as proxies for downregulated IL-6 signaling, which showed effects on upstream regulators (IL-6 and soluble IL-6R levels) and downstream effectors (CRP and fibrinogen levels) of the pathway that were consistent with pharmacological blockade of IL-6R. In MR, proxies for downregulated IL-6 signaling were associated with lower risk of ischemic stroke (Odds Ratio [OR]: 0.89, 95%CI: 0.82-0.97) and coronary artery disease (OR: 0.84, 95%CI: 0.77-0.90). Focusing on ischemic stroke subtypes, we found significant associations with risk of large artery (OR: 0.76, 95%CI: 0.62-0.93) and small vessel stroke (OR: 0.71, 95%CI: 0.59-0.86), but not cardioembolic stroke (OR: 0.95, 95%CI: 0.74-1.22). Proxies for IL-6 signaling inhibition were further associated with a lower risk of myocardial infarction, aortic aneurysm, atrial fibrillation and carotid plaque.
ConclusionsWe provide evidence for a causal effect of IL-6 signaling on ischemic stroke, particularly large artery and small vessel stroke, and a range of other cardiovascular outcomes. IL-6R blockade might represent a valid therapeutic target for lowering cardiovascular risk and should thus be investigated in clinical trials.
CLINICAL PERSPECTIVE

What is new

- We identified genetic proxies for downregulated IL-6 signaling that had effects on upstream and downstream regulators of the IL-6 signaling pathway consistent with those of pharmacological IL-6R blockade.
- Genetically downregulated IL-6 signaling was associated with a lower risk of ischemic stroke, and in particular large artery and small vessel stroke.
- Similar associations were obtained for a broad range of other cardiovascular outcomes.

What are the clinical implications

- Inhibition of IL-6 signaling is a promising therapeutic target for lowering risk of stroke and other cardiovascular outcomes and should be further investigated in clinical trials. | 10.1161/CIRCGEN.119.002872 | medrxiv |
10.1101/19007435 | Genetic Diversity of the cagA gene of Helicobacter pylori strains from Sudanese Patients with Different Gastroduodenal Diseases | Hassan, H. G.; Idris, A. B.; Hassan, M. A.; Altayb, H. N.; Yasin, K.; Beirage, N.; Abdel Hamid, M. M. | Hadeel Gassim Hassan | Institute of Endemic Diseases, Medical Sciences Campus, University of Khartoum, Khartoum, Sudan | 2019-09-27 | 1 | PUBLISHAHEADOFPRINT | cc_by_nd | gastroenterology | https://www.medrxiv.org/content/early/2019/09/27/19007435.source.xml | BackgroundThere is an increase in the prevalence of Helicobacter pylori infection in Sudan, accompanied by a high incidence of upper gastrointestinal malignancy. The cytotoxin-associated gene (cagA) is a marker of a pathogenicity island (PAI) in H. pylori and plays a crucial role in determining the clinical outcome of Helicobacter infections.
ObjectiveThis study aimed to determine the frequency and heterogeneity of the cagA gene of H. pylori and correlate the presence of cagA gene with clinical outcomes.
Materials and methodsFifty endoscopy biopsies were collected from Fedail and Soba hospitals in Khartoum state. DNA was extracted using the Guanidine chloride method followed by PCR to amplify 16S rRNA and cagA gene of H. pylori using specific primers. DNA amplicons of cagA gene were purified and sequenced. Bioinformatics and statistical analysis were done to characterize and to test the association between cagA gene and gastric complications.
ResultsThe cagA gene was detected in 20/37 (54%) of the samples that were found positive for H. pylori. There was no association between endoscopy findings and the presence of the cagA gene (p = 0.225). Specific amino acid variations were found at seven loci related to strains from patients with duodenitis, gastric ulcer, and gastric atrophy (R448H, T457K, S460L, IT463-464VA, D470E, A482Q, KNV490-491-492TKT), while the mutations in the cancerous strain were A439P, T457P, and H500Y.
ConclusionDisease-specific variations of cagA of H. pylori strains, in the region of amino acid residues 428-510, were evident among Sudanese patients with different gastroduodenal diseases. A novel mutation (K458N) was detected in a patient with duodenitis, which affects the positive electrostatic surface of cagA. Phylogenetic analysis showed a high level of diversity of cagA from Sudanese H. pylori strains. | null | medrxiv |
10.1101/19007419 | Patterns of Autism Symptoms: Hidden Structure in the ADOS and ADI-R instruments | Lefort-besnard, J.; Vogeley, K.; Schilbach, L.; Varoquaux, G.; Thirion, B.; Dumas, G.; Bzdok, D. | Jeremy Lefort-besnard | RWTH Aachen | 2019-09-27 | 1 | PUBLISHAHEADOFPRINT | cc_by_nd | psychiatry and clinical psychology | https://www.medrxiv.org/content/early/2019/09/27/19007419.source.xml | We simultaneously revisited the ADI-R and ADOS with a comprehensive data-analytics strategy. Here, the combination of pattern analysis algorithms and extensive data resources (n=266 patients aged 7 to 49 years) allowed the identification of coherent clinical constellations in and across ADI-R and ADOS assessments widespread in clinical practice. The collective results of the clustering and sparse regression approaches suggest that autism subtypes and severity for a given individual may be most clearly manifested in the social and communication domains of the ADI-R. Additionally, our quantitative investigation revealed that disease-specific patterns of ADI-R and ADOS scores can be uncovered when studying sex, age or level of FIQ in patients.
10.1101/19007666 | Tumor Necrosis Factor (TNF) Blocking Agents Reduce Risk for Alzheimer's Disease in Patients with Rheumatoid Arthritis and Psoriasis | Zhou, M.; Kaelber, D.; Xu, R.; Gurney, M. | Mengshi Zhou | Case Western Reserve University | 2019-09-27 | 1 | PUBLISHAHEADOFPRINT | cc_no | neurology | https://www.medrxiv.org/content/early/2019/09/27/19007666.source.xml | This large, retrospective case-control study of electronic health records from 56 million unique adult patients examined whether or not treatment with a Tumor Necrosis Factor (TNF) blocking agent is associated with lower risk for Alzheimer's disease (AD) in patients with rheumatoid arthritis (RA), psoriasis, and other inflammatory diseases which are mediated in part by TNF and for which a TNF blocker is an approved treatment. The analysis compared the diagnosis of AD as an outcome measure in patients receiving at least one prescription for a TNF blocking agent (etanercept, adalimumab, and infliximab) or for methotrexate. Adjusted odds ratios (AORs) were estimated using the Cochran-Mantel-Haenszel (CMH) method and presented with 95% confidence intervals (CIs) and p-values. RA was associated with a higher risk for AD (Adjusted Odds Ratio (AOR) = 2.06, 95% Confidence Interval: (2.02-2.10), P-value <0.0001), as did psoriasis (AOR = 1.37 (1.31-1.42), P <0.0001), ankylosing spondylitis (AOR = 1.57 (1.39-1.77), P <0.0001), inflammatory bowel disease (AOR = 2.46 (2.33-2.59), P < 0.0001), ulcerative colitis (AOR = 1.82 (1.74-1.91), P <0.0001), and Crohn's disease (AOR = 2.33 (2.22-2.43), P <0.0001). The risk for AD in patients with RA was lower among patients treated with etanercept (AOR = 0.34 (0.25-0.47), P <0.0001), adalimumab (AOR = 0.28 (0.19-0.39), P < 0.0001), or infliximab (AOR = 0.52 (0.39-0.69), P <0.0001).
Methotrexate was also associated with a lower risk for AD (AOR = 0.64 (0.61-0.68), P <0.0001), while lower risk was found in patients with a prescription history for both a TNF blocker and methotrexate. Etanercept and adalimumab also were associated with lower risk for AD in patients with psoriasis: AOR = 0.47 (0.30-0.73) and 0.41 (0.20-0.76), respectively. There was no effect of gender or race, while younger patients showed greater benefit from a TNF blocker than did older patients. This study identifies a subset of patients in whom systemic inflammation contributes to risk for AD through a pathological mechanism involving TNF and who therefore may benefit from treatment with a TNF blocking agent. | 10.1371/journal.pone.0229819 | medrxiv |
10.1101/19007666 | Tumor Necrosis Factor (TNF) Blocking Agents are Associated with Lower Risk for Alzheimer's Disease in Patients with Rheumatoid Arthritis and Psoriasis | Zhou, M.; Kaelber, D.; Xu, R.; Gurney, M. | Mengshi Zhou | Case Western Reserve University | 2020-03-11 | 2 | PUBLISHAHEADOFPRINT | cc_no | neurology | https://www.medrxiv.org/content/early/2020/03/11/19007666.source.xml | This large, retrospective case-control study of electronic health records from 56 million unique adult patients examined whether or not treatment with a Tumor Necrosis Factor (TNF) blocking agent is associated with lower risk for Alzheimer's disease (AD) in patients with rheumatoid arthritis (RA), psoriasis, and other inflammatory diseases which are mediated in part by TNF and for which a TNF blocker is an approved treatment. The analysis compared the diagnosis of AD as an outcome measure in patients receiving at least one prescription for a TNF blocking agent (etanercept, adalimumab, and infliximab) or for methotrexate. Adjusted odds ratios (AORs) were estimated using the Cochran-Mantel-Haenszel (CMH) method and presented with 95% confidence intervals (CIs) and p-values. RA was associated with a higher risk for AD (Adjusted Odds Ratio (AOR) = 2.06, 95% Confidence Interval: (2.02-2.10), P-value <0.0001), as did psoriasis (AOR = 1.37 (1.31-1.42), P <0.0001), ankylosing spondylitis (AOR = 1.57 (1.39-1.77), P <0.0001), inflammatory bowel disease (AOR = 2.46 (2.33-2.59), P < 0.0001), ulcerative colitis (AOR = 1.82 (1.74-1.91), P <0.0001), and Crohn's disease (AOR = 2.33 (2.22-2.43), P <0.0001). The risk for AD in patients with RA was lower among patients treated with etanercept (AOR = 0.34 (0.25-0.47), P <0.0001), adalimumab (AOR = 0.28 (0.19-0.39), P < 0.0001), or infliximab (AOR = 0.52 (0.39-0.69), P <0.0001).
Methotrexate was also associated with a lower risk for AD (AOR = 0.64 (0.61-0.68), P <0.0001), while lower risk was found in patients with a prescription history for both a TNF blocker and methotrexate. Etanercept and adalimumab also were associated with lower risk for AD in patients with psoriasis: AOR = 0.47 (0.30-0.73) and 0.41 (0.20-0.76), respectively. There was no effect of gender or race, while younger patients showed greater benefit from a TNF blocker than did older patients. This study identifies a subset of patients in whom systemic inflammation contributes to risk for AD through a pathological mechanism involving TNF and who therefore may benefit from treatment with a TNF blocking agent. | 10.1371/journal.pone.0229819 | medrxiv |
10.1101/19007757 | Machine learning-based prediction of response to PARP inhibition across cancer types. | Hill, K. E.; Rattani, A.; Lietz, C. E.; Garbutt, C.; Choy, E.; Cote, G. M.; Culhane, A.; Kelly, A. D.; Haibe-Kains, B.; Spentzos, D. | Dimitrios Spentzos | Department of Orthopedic Surgery, Center for Sarcoma and Connective Tissue Oncology, Massachusetts General Hospital Cancer Center | 2019-09-27 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | genetic and genomic medicine | https://www.medrxiv.org/content/early/2019/09/27/19007757.source.xml | PARP inhibitors (PARPi) are FDA approved for the treatment of BRCA1/2 deficient breast and ovarian cancer, but a growing body of pre-clinical evidence suggests the drug class holds therapeutic potential in other cancer types, independent of BRCA1/2 status. Large-scale pharmacogenomic datasets offer the opportunity to develop predictors of response to PARPis in many cancer types, expanding their potential clinical applicability. Response to the PARPi olaparib was used to identify a multi-gene PARPi response signature in a large in vitro dataset including multiple cancer types, such as breast, ovarian, pancreatic, lung cancer, osteosarcoma and Ewing sarcoma, using machine learning approaches. The signature was validated on multiple independent in vitro datasets, also testing for response to another PARPi, rucaparib, as well as two clinical datasets using the cisplatin response as a surrogate for PARPi response. Finally, integrative pharmacogenomic analysis was performed to identify drugs which may be effective in PARPi resistant tumors. A PARPi response signature was defined as the 50 most differentially transcribed genes between PARPi resistant and sensitive cell lines from several different cancer types. 
Cross validated predictors generated with LASSO logistic regression using the PARPi signature genes accurately predicted PARPi response in a training set of olaparib treated cell lines (80-89%), an independent olaparib treated in vitro dataset (66-77%), and an independent rucaparib treated in vitro dataset (80-87%). The PARPi signature also significantly predicted in vitro breast cancer response to olaparib in another separate experimental dataset. The signature also predicted clinical response to cisplatin and survival in human ovarian cancer and osteosarcoma datasets. Robust transcriptional differences between PARPi sensitive and resistant tumors accurately predict PARPi response in vitro and cisplatin response in vivo for multiple tumor types with or without known BRCA1/2 deficiency. These signatures may prove useful for predicting response in patients treated with PARP inhibitors. | null | medrxiv |
10.1101/19007674 | Whole genome sequencing identifies putative associations between genomic polymorphisms and clinical response to the antiepileptic drug levetiracetam | Vavoulis, D. V.; Pagnamenta, A. T.; Knight, S. J.; Pentony, M. M.; Armstrong, M.; Galizia, E. C.; Balestrini, S.; Sisodiya, S.; Taylor, J. C. | Dimitrios V Vavoulis | University of Oxford | 2019-09-27 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | genetic and genomic medicine | https://www.medrxiv.org/content/early/2019/09/27/19007674.source.xml | In the context of pharmacogenomics, whole genome sequencing provides a powerful approach for identifying correlations between response variability to specific drugs and genomic polymorphisms in a population, in an unbiased manner. In this study, we employed whole genome sequencing of DNA samples from patients showing extreme response (n=72) and non-response (n=27) to the antiepileptic drug levetiracetam, in order to identify genomic variants that underlie response to the drug. Although no common SNP (MAF>5%) crossed the conventional genome-wide significance threshold of 5x10-8, we found common polymorphisms in genes SPNS3, HDC, MDGA2, NSG1 and RASGEF1C, which collectively predict clinical response to levetiracetam in our cohort with [~]91% predictive accuracy ([~]94% positive predictive value, [~]85% negative predictive value). Among these genes, HDC, NSG1, MDGA2 and RASGEF1C are potentially implicated in synaptic neurotransmission, while SPNS3 is an atypical solute carrier transporter homologous to SV2A, the known molecular target of levetiracetam. Furthermore, we performed gene- and pathway-based statistical analysis on sets of rare and low-frequency variants (MAF<5%) and we identified associations between genes or pathways and response to levetiracetam. 
Our findings include a) the genes PRKCB and DLG2, which are involved in glutamatergic neurotransmission, a known target of anticonvulsants, including levetiracetam; b) the genes FILIP1 and SEMA6D, which are involved in axon guidance and modelling of neural connections; and c) pathways with a role in synaptic neurotransmission, such as WNT5A-dependent internalization of FZD4 and disinhibition of SNARE formation. Targeted analysis of genes involved in neurotransmitter release and transport further supports the possibility of association between drug response and genes NSG1 and DLG2. In summary, our approach to utilise whole genome sequencing on subjects with extreme response phenotypes is a feasible route to generate plausible hypotheses for investigating the genetic factors underlying drug response variability in cases of pharmaco-resistant epilepsy.
AUTHOR SUMMARYLevetiracetam (LEV) is a prominent antiepileptic drug prescribed for the treatment of both focal and generalised epilepsy. The molecular mechanism mediating its action is not well understood, but it involves the modulation of synaptic neurotransmission through binding to the synaptic vesicle glycoprotein SV2A. Identifying genomic polymorphisms that predict response to the drug is important, because it can help clinicians prescribe the most appropriate treatment in a patient-specific manner. In this study, we employed whole genome sequencing (WGS) of DNA samples from extreme responders or non-responders to LEV and we identified a small group of common variants, which successfully predict response to the drug in our cohort. These variants are mostly located in genes implicated in synaptic function. Furthermore, we identified significant associations between clinical response to LEV and low-frequency variants in genes and pathways involved in excitatory neurotransmission or in the moulding of neural networks in the brain. Our approach to utilise WGS on subjects with extreme response phenotypes is a feasible route to generate plausible hypotheses on the genomic basis of pharmaco-resistant epilepsy. We expect that the rapidly decreasing cost of WGS will allow conducting similar studies on a larger scale in the near future. | null | medrxiv |
10.1101/19007781 | Acquisition of extended-spectrum beta-lactamase-producing Enterobacteriaceae (ESBL-PE) carriage after exposure to systemic antimicrobials during travel: systematic review and meta-analysis. | Wuerz, T.; Kassim, S.; Atkins, K. | Terence Wuerz | University of Manitoba | 2019-09-27 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | infectious diseases | https://www.medrxiv.org/content/early/2019/09/27/19007781.source.xml | BackgroundInternational travel is an important risk factor for colonization with extended-spectrum beta-lactamase-producing Enterobacteriaceae (ESBL-PE). Antimicrobial use during travel likely amplifies this risk, yet to what extent, and whether it varies by antimicrobial class, has not been established.
MethodsWe conducted a systematic review that included prospective cohorts reporting both receipt of systemic antimicrobials and acquired ESBL-PE isolated from stool or rectum during international travel. We performed a random effects meta-analysis to estimate odds of acquiring ESBL-PE due to antimicrobials during travel, overall and by antimicrobial class.
ResultsFifteen studies were included. The study population was mainly female travellers from high income countries recruited primarily from travel clinics. Participants travelled most frequently to Asia and Africa with 10% reporting antimicrobial use during travel. The combined odds ratio (OR) for ESBL-PE acquisition during travel was 2.37 for antimicrobial use overall (95% confidence interval [CI], 1.69 to 3.33), but there was substantial heterogeneity between studies. Fluoroquinolones were the antibiotic class associated with the highest combined OR of ESBL-PE acquisition, compared to no antimicrobial use (OR 4.68, 95% CI, 2.34 to 9.37).
ConclusionsThe risk of ESBL-PE colonization during travel is increased substantially with exposure to antimicrobials, especially fluoroquinolones. While a small proportion of colonized individuals will develop a resistant infection, there remains the potential for onward spread among returning travellers. Public health efforts to decrease inappropriate antimicrobial usage during travel are warranted.
Research in context

Evidence before this study
Antimicrobial resistance (AMR) among bacteria that commonly cause human infection is of increasing public health concern. International travel has recently been associated with colonization with extended-spectrum beta-lactamase-producing Enterobacteriaceae (ESBL-PE), increasing the spread of drug resistance among these important pathogens. We searched Pubmed, Embase, MEDLINE, Web of Science, SCOPUS, and the Cochrane Library for prospective cohort studies published between January 2000 and June 2018 reporting on acquisition of ESBL-PE among travellers, which reported on antimicrobial use during travel. 15 studies were included, which were at moderate risk of bias. The pooled odds ratio for acquisition of ESBL-PE during travel was 2.37 among antimicrobial users, compared to non-users (95% CI, 1.69 to 3.33). The magnitude of this association was stronger among travellers reporting fluoroquinolone use (OR 4.68, 95% CI 2.34 to 9.37).

Added value of this study
This is the first study to quantify the association between antimicrobial use during travel, overall and by specific antimicrobial class, with ESBL-PE acquisition across broad populations of travellers and destination countries.

Implications of all the available evidence
Further study into the mechanisms by which antimicrobials, such as fluoroquinolones, contribute to AMR may identify protective measures. Meanwhile, antimicrobial use during travel for prevention or treatment of mild-to-moderate travellers' diarrhea should not be recommended routinely. Where indicated, alternatives to fluoroquinolone antimicrobials should be considered. | 10.1016/j.tmaid.2020.101823 | medrxiv |
10.1101/19007393 | Contribution of Marijuana Legalization to the U.S. Opioid Mortality Epidemic: Individual and Combined Experience of 27 States and District of Columbia | Bleyer, A.; Barnes, B. | Archie Bleyer | Oregon Health and Science University | 2019-09-27 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc | public and global health | https://www.medrxiv.org/content/early/2019/09/27/19007393.source.xml | BackgroundPrior studies of U.S. states as of 2013 and one state as of 2015 suggested that marijuana availability reduces opioid mortality (marijuana protection hypothesis). This investigation tested the hypothesis with opioid mortality trends updated to 2017 and by evaluating all states and the District of Columbia (D.C.).
MethodsOpioid mortality data obtained from the U.S. Centers for Disease Control and Prevention were used to compare opioid death rate trends in each marijuana-legalizing state and D.C. before and after medicinal and recreational legalization implementation and their individual and cumulative aggregate trends with concomitant trends in non-legalizing states. The Joinpoint Regression Program identified statistically-significant mortality trends and when they occurred.
ResultsOf 23 individually evaluable legalizing jurisdictions, 78% had evidence for a statistically-significant acceleration of opioid death rates after medicinal or recreational legalization implementation at greater rates than their pre-legalization rate or the concurrent composite rate in non-legalizing states. All four jurisdictions evaluable for recreational legalization had evidence (p <0.05) for subsequent opioid death rate increases, one had a distinct acceleration, and one a reversal of prior decline. Since 2009-2012, when the cumulative-aggregate opioid death rate in the legalizing jurisdictions was the same as in the non-legalizing group, the legalizing group's rate accelerated increasingly faster (p=0.009). By 2017 it was 67% greater than in the non-legalizing group (p <<0.05).
ConclusionsThe marijuana protection hypothesis is not supported by recent U.S. data on opioid mortality trends. Instead, legalizing marijuana appears to have contributed to the nation's opioid mortality epidemic. | null | medrxiv
10.1101/19007393 | Contribution of Marijuana Legalization to the U.S. Opioid Mortality Epidemic: Individual and Combined Experience of 27 States and District of Columbia | Bleyer, A.; Barnes, B. | Archie Bleyer | Oregon Health and Science University | 2019-10-10 | 2 | PUBLISHAHEADOFPRINT | cc_by_nc | public and global health | https://www.medrxiv.org/content/early/2019/10/10/19007393.source.xml | BackgroundPrior studies of U.S. states as of 2013 and one state as of 2015 suggested that marijuana availability reduces opioid mortality (marijuana protection hypothesis). This investigation tested the hypothesis with opioid mortality trends updated to 2017 and by evaluating all states and the District of Columbia (D.C.).
MethodsOpioid mortality data obtained from the U.S. Centers for Disease Control and Prevention were used to compare opioid death rate trends in each marijuana-legalizing state and D.C. before and after medicinal and recreational legalization implementation and their individual and cumulative aggregate trends with concomitant trends in non-legalizing states. The Joinpoint Regression Program identified statistically-significant mortality trends and when they occurred.
ResultsOf 23 individually evaluable legalizing jurisdictions, 78% had evidence for a statistically-significant acceleration of opioid death rates after medicinal or recreational legalization implementation at greater rates than their pre-legalization rate or the concurrent composite rate in non-legalizing states. All four jurisdictions evaluable for recreational legalization had evidence (p <0.05) for subsequent opioid death rate increases, one had a distinct acceleration, and one a reversal of prior decline. Since 2009-2012, when the cumulative-aggregate opioid death rate in the legalizing jurisdictions was the same as in the non-legalizing group, the legalizing group's rate accelerated increasingly faster (p=0.009). By 2017 it was 67% greater than in the non-legalizing group (p <<0.05).
ConclusionsThe marijuana protection hypothesis is not supported by recent U.S. data on opioid mortality trends. Instead, legalizing marijuana appears to have contributed to the nation's opioid mortality epidemic. | null | medrxiv
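The Joinpoint Regression Program used in the record above identifies statistically significant trend changes by fitting segmented linear models and locating the breakpoints that minimize error. As a rough stdlib-only illustration of the underlying idea (not the NCI software, and using a synthetic death-rate series rather than CDC data), the sketch below scans candidate joinpoints and keeps the one whose continuous two-segment fit has the lowest squared error:

```python
# Single-joinpoint search: fit y = a + b*t + c*max(0, t - k) for each
# candidate breakpoint k and keep the k with the lowest squared error.
# Illustrative sketch only, not the NCI Joinpoint Regression Program.

def solve3(A, b):
    """Solve a 3x3 linear system by Gaussian elimination with pivoting."""
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(3):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [M[r][j] - f * M[col][j] for j in range(4)]
    return [M[i][3] / M[i][i] for i in range(3)]

def fit_joinpoint(ts, ys, candidates):
    """Return (sse, k, slope_before, slope_after) for the best breakpoint."""
    best = None
    for k in candidates:
        rows = [[1.0, t, max(0.0, t - k)] for t in ts]
        # Normal equations X'X beta = X'y for the hinge regression
        XtX = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
        Xty = [sum(r[i] * y for r, y in zip(rows, ys)) for i in range(3)]
        a, b, c = solve3(XtX, Xty)
        sse = sum((y - (a + b * t + c * max(0.0, t - k))) ** 2
                  for t, y in zip(ts, ys))
        if best is None or sse < best[0]:
            best = (sse, k, b, b + c)
    return best

# Synthetic death-rate series: slope 0.2/yr before 2012, 1.0/yr after.
years = list(range(2005, 2018))
ts = [y - years[0] for y in years]  # small offsets keep the numerics stable
rates = [3 + 0.2 * t + (0.8 * (t - 7) if t > 7 else 0.0) for t in ts]
sse, k, pre, post = fit_joinpoint(ts, rates, range(2, 11))
joinpoint_year = years[0] + k
```

Because the planted breakpoint at offset 7 (calendar year 2012) makes the two-segment model fit the series exactly, the scan recovers it along with the pre- and post-break slopes; the real program additionally runs permutation tests on the number of joinpoints.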
10.1101/19007542 | Subacute effects of the psychedelic ayahuasca on the salience and default mode networks | Pasquini, L.; Palhano-Fontes, F.; de Araujo, D. B. | Draulio B de Araujo | Brain Institute, Federal University Rio Grande do Norte | 2019-09-29 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | psychiatry and clinical psychology | https://www.medrxiv.org/content/early/2019/09/29/19007542.source.xml | BackgroundNeuroimaging studies have just begun to explore the acute effects of psychedelics on the functional organization of large-scale brain networks. Even less is known about the neural correlates of subacute effects taking place days after the psychedelic experience. This study explores the subacute changes of primary sensory brain networks and networks supporting higher-order affective and self-referential functions 24h after a single session with the psychedelic ayahuasca.
MethodsWe leveraged task-free functional magnetic resonance imaging data one day before and one day after a randomized placebo-controlled trial exploring the effects of ayahuasca in naive healthy participants (21 placebo/22 ayahuasca). We derived intra- and inter-network functional connectivity of the salience, default mode, visual, and sensorimotor networks, and assessed post-session connectivity changes between the ayahuasca and placebo groups. Connectivity changes were associated with Hallucinogen Rating Scale scores assessed during the acute effects.
ResultsOur findings revealed increased anterior cingulate cortex connectivity within the salience network, decreased posterior cingulate cortex connectivity within the default mode network, and increased connectivity between the salience and default mode networks one day after the session in the ayahuasca group compared to placebo. Connectivity of primary sensory networks did not differ between groups. Salience network connectivity increases correlated with altered somesthesia scores, decreased default mode network connectivity correlated with altered volition scores, and increased salience-default mode network connectivity correlated with altered affect scores.
ConclusionThese findings provide preliminary evidence for subacute functional changes induced by the psychedelic ayahuasca on higher-order cognitive brain networks that support interoceptive, affective, and self-referential functions. | 10.1177/0269881120909409 | medrxiv |
10.1101/19007542 | Subacute effects of the psychedelic ayahuasca on the salience and default mode networks | Pasquini, L.; Palhano-Fontes, F.; de Araujo, D. B. | Draulio B de Araujo | Brain Institute, Federal University Rio Grande do Norte | 2020-02-21 | 2 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | psychiatry and clinical psychology | https://www.medrxiv.org/content/early/2020/02/21/19007542.source.xml | BackgroundNeuroimaging studies have just begun to explore the acute effects of psychedelics on the functional organization of large-scale brain networks. Even less is known about the neural correlates of subacute effects taking place days after the psychedelic experience. This study explores the subacute changes of primary sensory brain networks and networks supporting higher-order affective and self-referential functions 24h after a single session with the psychedelic ayahuasca.
MethodsWe leveraged task-free functional magnetic resonance imaging data one day before and one day after a randomized placebo-controlled trial exploring the effects of ayahuasca in naive healthy participants (21 placebo/22 ayahuasca). We derived intra- and inter-network functional connectivity of the salience, default mode, visual, and sensorimotor networks, and assessed post-session connectivity changes between the ayahuasca and placebo groups. Connectivity changes were associated with Hallucinogen Rating Scale scores assessed during the acute effects.
ResultsOur findings revealed increased anterior cingulate cortex connectivity within the salience network, decreased posterior cingulate cortex connectivity within the default mode network, and increased connectivity between the salience and default mode networks one day after the session in the ayahuasca group compared to placebo. Connectivity of primary sensory networks did not differ between groups. Salience network connectivity increases correlated with altered somesthesia scores, decreased default mode network connectivity correlated with altered volition scores, and increased salience-default mode network connectivity correlated with altered affect scores.
ConclusionThese findings provide preliminary evidence for subacute functional changes induced by the psychedelic ayahuasca on higher-order cognitive brain networks that support interoceptive, affective, and self-referential functions. | 10.1177/0269881120909409 | medrxiv |
10.1101/19007856 | Travel time to health facilities as a marker of geographical accessibility across heterogeneous land coverage in Peru | Carrasco-Escobar, G.; Manrique, E.; Tello-Lizarraga, K.; Miranda, J. J. | Gabriel Carrasco-Escobar | University of California San Diego | 2019-09-29 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | public and global health | https://www.medrxiv.org/content/early/2019/09/29/19007856.source.xml | Geographical accessibility to health facilities is conditioned by topography and environmental conditions, overlapped with the different transport facilities available in rural and urban areas. To better estimate the travel time to the most proximate health facility and determine the differences across heterogeneous land coverage types, this study explored a novel cloud-based geospatial modeling approach, using the unique geographical and ecological diversity of the Peruvian territory as a case study. Geospatial data on 145,134 cities and villages and 8,067 health facilities in Peru were combined with land coverage types, road infrastructure, navigable river networks, and digital elevation data to produce high-resolution (30 m) estimates of travel time to the most proximate health facility across the country. This study estimated important variations in travel time between urban and rural settings across the 16 major land coverage types in Peru, which, in turn, overlap with the socio-economic profiles of the villages. The median travel time to primary, secondary, and tertiary healthcare facilities was 1.9-, 2.3-, and 2.2-fold higher in rural than urban settings, respectively. Higher travel time values were also observed in areas with a high proportion of the population with unsatisfied basic needs.
In so doing, this study provides a new methodology to estimate travel time to health facilities as a tool to enhance the understanding and characterization of the profiles of accessibility to health facilities in low- and middle-income countries (LMIC), calling for a service delivery redesign to maximize high quality of care. | 10.3389/fpubh.2020.00498 | medrxiv |
10.1101/19007856 | Travel time to health facilities as a marker of geographical accessibility across heterogeneous land coverage in Peru | Carrasco-Escobar, G.; Manrique, E.; Tello-Lizarraga, K.; Miranda, J. J. | Gabriel Carrasco-Escobar | University of California San Diego | 2019-12-26 | 2 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | public and global health | https://www.medrxiv.org/content/early/2019/12/26/19007856.source.xml | Geographical accessibility to health facilities is conditioned by topography and environmental conditions, overlapped with the different transport facilities available in rural and urban areas. To better estimate the travel time to the most proximate health facility and determine the differences across heterogeneous land coverage types, this study explored a novel cloud-based geospatial modeling approach, using the unique geographical and ecological diversity of the Peruvian territory as a case study. Geospatial data on 145,134 cities and villages and 8,067 health facilities in Peru were combined with land coverage types, road infrastructure, navigable river networks, and digital elevation data to produce high-resolution (30 m) estimates of travel time to the most proximate health facility across the country. This study estimated important variations in travel time between urban and rural settings across the 16 major land coverage types in Peru, which, in turn, overlap with the socio-economic profiles of the villages. The median travel time to primary, secondary, and tertiary healthcare facilities was 1.9-, 2.3-, and 2.2-fold higher in rural than urban settings, respectively. Higher travel time values were also observed in areas with a high proportion of the population with unsatisfied basic needs.
In so doing, this study provides a new methodology to estimate travel time to health facilities as a tool to enhance the understanding and characterization of the profiles of accessibility to health facilities in low- and middle-income countries (LMIC), calling for a service delivery redesign to maximize high quality of care. | 10.3389/fpubh.2020.00498 | medrxiv |
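The travel-time surfaces described in the record above are, at their core, cumulative least-cost distances over a friction raster: each cell carries a crossing time, and the value at a cell is the cheapest accumulated time from any health-facility cell. A minimal sketch of that accumulation using multi-source Dijkstra on a toy grid (the 4-neighbour moves and grid values are illustrative assumptions; the study itself used a cloud-based implementation at 30 m resolution with land cover, roads, and rivers shaping the friction values):

```python
import heapq

def travel_time(friction, sources):
    """Multi-source Dijkstra over a friction grid.

    friction[r][c] is the time (minutes) needed to cross cell (r, c);
    sources lists facility cells; the result holds, for every cell,
    the minutes needed to reach the nearest facility.
    """
    rows, cols = len(friction), len(friction[0])
    INF = float("inf")
    cost = [[INF] * cols for _ in range(rows)]
    heap = []
    for r, c in sources:
        cost[r][c] = 0.0
        heapq.heappush(heap, (0.0, r, c))
    while heap:
        d, r, c = heapq.heappop(heap)
        if d > cost[r][c]:
            continue  # stale heap entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                # cost of one step: average crossing time of the two cells
                nd = d + (friction[r][c] + friction[nr][nc]) / 2
                if nd < cost[nr][nc]:
                    cost[nr][nc] = nd
                    heapq.heappush(heap, (nd, nr, nc))
    return cost

# Toy landscape: a fast road along the top row crossing slow terrain,
# with a single health facility at cell (0, 0).
grid = [
    [1, 1, 1, 1],    # road: 1 minute per cell
    [10, 10, 10, 10],
    [10, 10, 10, 10],
]
minutes = travel_time(grid, [(0, 0)])
```

Even on this toy grid the road shapes accessibility: cells far down the slow terrain are reached fastest by travelling along the road first, which is exactly the rural/urban asymmetry the record quantifies.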
10.1101/19007385 | A Quantitative and Narrative Evaluation of Goodman and Gilman's Pharmacological Basis of Therapeutics | Piper, B. J.; Alinea, A. A.; Wroblewski, J. R.; Graham, S. M.; Chung, D. Y.; McCutcheon, L. R.; Birkett, M. A.; Kheloussi, S. S.; Shah, V. M.; Zalim, Q. K.; Arnott, J. A.; McLaughlin, W. A.; Lucchessi, P. A.; Miller, K. A.; Waite, G. N.; Bordonaro, M. | Brian J Piper | Geisinger Commonwealth School of Medicine | 2019-09-29 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc | medical education | https://www.medrxiv.org/content/early/2019/09/29/19007385.source.xml | ObjectiveGoodman and Gilman's The Pharmacological Basis of Therapeutics (GGPBT) has been a cornerstone in the education of pharmacists, physicians, and pharmacologists for decades. The objectives of this report were to describe and evaluate the 13th edition of GGPBT, including: 1) author characteristics; 2) recency of citations; 3) conflict of interest (CoI) disclosure; and 4) expert evaluation of chapters.
MethodsContributors' (N = 115) sex, professional degrees, and presence of undisclosed potential CoI, as reported by the Center for Medicare and Medicaid's Open Payments (2013 to 2017), were examined. Years of publication of citations were extracted relative to comparison textbooks (Katzung's Basic and Clinical Pharmacology (KatBCP) and DiPiro's Pharmacotherapy: A Pathophysiologic Approach (DiPPAPA)). Content experts in pharmacy and pharmacology education provided chapter reviews.
ResultsThe percentage of GGPBT contributors who were female (20.9%) was equivalent to that in KatBCP (17.0%). Citations in GGPBT (11.5 ± 0.2 years old) were significantly older than those in KatBCP (10.4 ± 0.2) and DiPPAPA (9.1 ± 0.1, p < .0001). Contributors to GGPBT received three million dollars in undisclosed remuneration from pharmaceutical companies (maximum for a single author = $743,718). In contrast, DiPPAPA made CoI information available. However, self-reported disclosures were not uniformly congruent with the data reported by Open Payments. Reviewers noted several strengths but also some areas for improvement.
ConclusionGGPBT will continue to be an important component of the biomedical curriculum. Areas for improvement include more diverse authorship, improved conflict of interest transparency, and greater inclusion of more recent citations. | 10.3390/pharmacy8010001 | medrxiv
10.1101/19007096 | Havana Syndrome Among Canadian Diplomats: Brain Imaging Reveals Acquired Neurotoxicity | Friedman, A.; Calkin, C.; Adams, A.; Suarez, G. A.; Bardouille, T.; Hacohen, N.; Green, A. L.; Gupta, R. R.; Hashmi, J.; Kamintsky, L.; Kim, J. S.; Laroche, R.; MacKenzie, D.; Milikovsky, D.; Oystreck, D.; Newton, J.; Noel, G.; Ofer, J.; Quraan, M.; Reardon, C.; Ross, M.; Rutherford, D.; Schmidt, M.; Serlin, Y.; Sweeney, C.; Verge, J.; Walsh, L.; Bowen, C. | Alon Friedman | Dalhousie University | 2019-09-29 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | neurology | https://www.medrxiv.org/content/early/2019/09/29/19007096.source.xml | BACKGROUNDIn late 2016, US diplomats stationed in Havana began presenting with a variety of neurological manifestations that proved difficult to diagnose. Though previous studies suggested a likely association with brain injury, the mechanism of injury, brain regions involved, and etiology remained unknown.
METHODSWe conducted a multimodal study examining 26 Canadian diplomats and their family members, the majority of whom presented with symptoms similar to their American counterparts while residing in Havana. Assessments included a medical history, self-reported symptom questionnaires, cognitive assessments, blood tests, and brain imaging assessments (magnetic resonance imaging (MRI) and magnetoencephalography (MEG)). Individuals showing signs of brain injury underwent further neurological, visual, and audio-vestibular assessments. Eight participants were tested both before and after living in Havana.
RESULTSOur assessment documents multiple functional and structural impairments, including significant spatial memory impairment, abnormal brain-stem evoked potentials, degradation of fibre tracts in the fornix and posterior corpus callosum, blood-brain barrier injury to the right basal forebrain and anterior insula, and abnormal paroxysmal slowing events of cortical activity. Subsequent mass-spectrometry and blood analyses documented reduced serum cholinesterase activity and the presence of organophosphates (Temephos) and pyrethroid metabolites (3-phenoxybenzoic acid or 3-BPA).
CONCLUSIONSOur findings confirm brain injury, specify the regions involved, and raise the hypothesis of overexposure to cholinesterase inhibitors as a plausible etiology. If correct, our hypothesis bears public health ramifications (see Discussion) and suggests a course of action for reducing exposure in the future.
FUNDINGGlobal Affairs Canada. | null | medrxiv |
10.1101/19007237 | The global burden of pressure ulcers among patients with spinal cord injury: a systematic review and meta-analysis. | Shiferaw, W. S.; Yirga, T.; Mulugeta, H.; Aynalem, Y. A. | Wondimeneh Shibabaw Shiferaw | Debre Berhan University | 2019-09-29 | 1 | PUBLISHAHEADOFPRINT | cc_no | nursing | https://www.medrxiv.org/content/early/2019/09/29/19007237.source.xml | BackgroundPressure ulcer, one of the common and challenging public health problems affecting patients with spinal cord injury, is the formation of lesions and ulceration on the skin, especially over bony prominences. It has a significant impact on the patient and the health care system. Moreover, it imposes psychological, physical, and social burdens and decreases the quality of life (QoL) of patients. Despite its serious complications, limited evidence is available on the global magnitude of pressure ulcers among patients with spinal cord injury. Hence, the objective of this systematic review and meta-analysis was to estimate the global magnitude of pressure ulcers among patients with spinal cord injury.
MethodsPubMed, Scopus, Google Scholar, African Journals Online, PsycINFO and Web of Science were systematically searched online to retrieve related articles. The Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guideline was followed. A random-effects model was fitted to estimate the summary effect. To investigate heterogeneity across the included studies, the I2 test was employed. Publication bias was examined using a funnel plot and Egger's regression test. All statistical analyses were done using STATA version 14 software for Windows.
ResultsTwenty-four studies comprising 600,078 participants were included in this meta-analysis. The global pooled magnitude of pressure ulcers among patients with spinal cord injury was 32.36% (95% CI: 28.21, 36.51%). Based on the subgroup analysis, the highest magnitude of pressure ulcers was observed in Africa: 41.19% (95% CI: 31.70, 52.18).
ConclusionThis systematic review and meta-analysis revealed that about one in three patients with spinal cord injury had pressure ulcers, implying that the overall global magnitude of pressure ulcers is relatively high. Therefore, policymakers (FMoH) and other concerned bodies need to give special attention to reducing the magnitude of pressure ulcers in patients with spinal cord injury. | 10.1186/s12891-020-03369-0 | medrxiv
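The pooled magnitude in the record above comes from a random-effects model with an I2 heterogeneity test. A stdlib-only sketch of the standard DerSimonian-Laird estimator for pooling proportions (the study counts below are invented for illustration, not the 24 included studies):

```python
def pool_random_effects(events, totals):
    """DerSimonian-Laird random-effects pooled proportion, 95% CI, and I^2."""
    k = len(events)
    p = [e / n for e, n in zip(events, totals)]
    v = [pi * (1 - pi) / n for pi, n in zip(p, totals)]   # within-study variance
    w = [1.0 / vi for vi in v]                            # fixed-effect weights
    p_fixed = sum(wi * pi for wi, pi in zip(w, p)) / sum(w)
    q = sum(wi * (pi - p_fixed) ** 2 for wi, pi in zip(w, p))  # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)                    # between-study variance
    ws = [1.0 / (vi + tau2) for vi in v]                  # random-effects weights
    pooled = sum(wi * pi for wi, pi in zip(ws, p)) / sum(ws)
    se = (1.0 / sum(ws)) ** 0.5
    i2 = max(0.0, (q - (k - 1)) / q) * 100 if q > 0 else 0.0
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se), i2

# Hypothetical prevalence studies: pressure-ulcer events / sample size.
events = [30, 45, 80, 20]
totals = [100, 120, 300, 50]
pooled, ci, i2 = pool_random_effects(events, totals)
```

The between-study variance tau2 widens the confidence interval and pulls the pooled estimate toward an unweighted average when heterogeneity (I2) is high, which is why random-effects pooling is the usual choice for prevalence meta-analyses like this one.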
10.1101/19007815 | Clinical and laboratory characteristics of clozapine treated schizophrenia patients referred to a national immunodeficiency clinic reveals a B-cell signature resembling CVID. | Ponsford, M. J.; Steven, R.; Bramhall, K.; Burgess, M.; Wijetilleka, S.; Carne, E.; McGuire, F.; Price, C. R.; Moody, M.; Zouwail, S.; Tahir, T.; Farewell, D.; El-Shanawany, T.; Jolles, S. | Mark J Ponsford | Cardiff University | 2019-10-02 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | allergy and immunology | https://www.medrxiv.org/content/early/2019/10/02/19007815.source.xml | PurposeAn association between antibody deficiency and clozapine use in individuals with Schizophrenia has recently been reported. We hypothesized that if clozapine-associated hypogammaglobulinaemia was clinically relevant this would manifest in referral patterns.
MethodsRetrospective case note review of patients referred and assessed by Immunology Centre for Wales (ICW) between January 2005 and July 2018 with extraction of clinical and immunologic features for individuals with diagnosis of schizophrenia-like illness.
Results1791 adult patients were assessed at ICW during this period; 23 patients had a psychiatric diagnosis of schizophrenia or schizo-affective disorder. Principal indications for referral were findings of low calculated globulin and immunoglobulins. Clozapine was the single most commonly prescribed antipsychotic (17/23), disproportionately increased relative to reported use in the general schizophrenia population (OR 6.48, 95% CI: 1.79 to 23.5). Clozapine therapy was noted in 6/7 (86%) of patients subsequently requiring immunoglobulin replacement therapy (IgRT). Marked reductions of class-switched memory B-cells (CSMB) and plasmablasts were observed in clozapine-treated individuals relative to healthy age-matched controls. Duration of clozapine therapy was associated with CSMB decline. One patient discontinued clozapine, with gradual recovery of IgG levels without use of IgRT.
ConclusionOur findings are consistent with an enrichment of clozapine treatment among individuals with schizophrenia referred for ICW assessment over the last 13 years. These individuals displayed clinical patterns closely resembling the primary immunodeficiency CVID, which, however, appears reversible upon drug cessation. This has diagnostic, monitoring and treatment implications for psychiatry and immunology teams and directs prospective studies to address causality and the wider implications for this patient group. | 10.1136/jclinpath-2019-206235 | medrxiv
10.1101/19007195 | A Comparison of the Randomized Clinical Trial Efficacy and Real-World Effectiveness of Tofacitinib for the Treatment of Inflammatory Bowel Disease: A Cohort Study | Rudrapatna, V. A.; Glicksberg, B. S.; Butte, A. J. | Atul J Butte | University of California, San Francisco | 2019-10-02 | 1 | PUBLISHAHEADOFPRINT | cc_no | gastroenterology | https://www.medrxiv.org/content/early/2019/10/02/19007195.source.xml | BackgroundReal-world data are receiving attention from regulators, biopharmaceuticals and payors as a potential source of clinical evidence. However, the suitability of these data to produce evidence commensurate with randomized controlled trials (RCTs) and the best practices in their use remain unclear. We sought to compare the real-world effectiveness of Tofacitinib in the treatment of IBD against efficacy rates published by corresponding RCTs.
MethodsElectronic health records at the University of California, San Francisco (UCSF) were queried and reviewed to identify 86 Tofacitinib-treated IBD patients through 4/2019. The primary endpoint was treatment effectiveness. This was measured by time-to-treatment-discontinuation and by the primary endpoints of RCTs in Ulcerative Colitis (UC) and Crohn's Disease (CD). Endpoints were measured and analyzed following a previously published protocol and analysis plan.
Findings86 patients (68 with UC, 18 with CD) initiated Tofacitinib for IBD treatment. Most of the data needed to calculate baseline and follow-up disease activity indices were documented within the EHR (77% for UC, 91% for CD). Baseline characteristics of the UCSF and RCT cohorts were similar, except for a longer disease duration and 100% treatment failure of Tumor Necrosis Factor inhibitors in the former. None of the UCSF cohort would have met the RCT eligibility criteria, for multiple reasons.
The rates of achieving the RCT primary endpoints were highly similar to the published rates for both UC (16%, P=0.5) and CD (38%, P=0.8). However, treatment persistence was substantially higher: 69% for UC (week 52) and 75% for CD (week 26).
InterpretationAn analysis of routinely collected clinical data can reproduce published Tofacitinib efficacy rates, but also indicates far greater treatment durability than suggested by RCTs including possible benefit in CD. These results underscore the value of real-world studies to complement RCTs.
FundingThe National Institutes of Health and UCSF Bakar Institute
Research in Context. Evidence before this study: Tofacitinib is the most recently approved treatment for Ulcerative Colitis. Data related to treatment efficacy for either IBD subtype are generally limited, whether from controlled trials or real-world studies. A search of clinicaltrials.gov was performed in January 2019 for completed phase 2 or 3, interventional, placebo-controlled clinical trials matching the terms "Crohn's Disease" OR "Ulcerative Colitis" in the conditions field, and matching "Placebo" AND ("Tofacitinib" OR "CP-690,550") in the Interventions field. We identified three Phase 3 trials for UC (OCTAVE trials, all initially reported in a single article in 2016) and three Phase 2 trials of CD (two published in the same article in 2017, one reported in 2014). The Phase 3 UC trials reported a 57.6% pooled clinical response rate in the Tofacitinib-assigned groups after 8 weeks (induction), and a 37.5% pooled remission rate among eligible induction trial responders in the Tofacitinib-assigned groups at 52 weeks. The 2017 CD trial reported a 70.8% pooled rate of response or remission in the Tofacitinib-assigned groups after 8 weeks, and a 47.6% pooled rate of response or remission among enrolled induction-trial responders at 26 weeks. A bias assessment of both UC and CD trials indicated a high risk of attrition bias and an unclear risk of bias related to conflicts of interest. We also performed a search of pubmed.gov in January 2019 using the search terms ("Colitis" OR "Crohn's") AND ("Tofacitinib" OR "CP-690,550") OR "real-world" to identify cohort studies of Tofacitinib efficacy in routine clinical practice. No studies meeting these criteria were identified.
Added value of this studyThis is one of the early studies to closely compare the results of clinical trials with the continuously-updated data captured in the electronic health records, and the first to assess the efficacy-effectiveness gap for Tofacitinib. We found that none of the patients treated at our center thus far would have qualified for the clinical trial based on published eligibility criteria. We found that the drug appeared to perform similarly to its efficacy when using the endpoints reported in clinical trials, but treatment persistence was significantly greater than would have been expected from the reported trial outcomes: 69% for UC at week 52 and 75% for CD at week 26.
Implications of all the available evidenceTofacitinib is an effective treatment for Ulcerative Colitis and may be efficacious for Crohn's disease. Controlled trials may not be representative of real-world cohorts, may not be optimally designed to identify efficacious drugs, and may not accurately predict patterns of use in clinical practice. Further studies using real-world data, as well as methods to enable their proper use, are needed to confirm and continuously monitor the efficacy and safety of drugs, both for on- and off-label use. | null | medrxiv
10.1101/19007989 | A Personalized Multi-Component Lifestyle Intervention Program Leads to Improved Quality of Life in Persons with Chronic Kidney Disease | Headley, S. A.; Hutchinson, J. C.; Thompson, B. A.; Ostroff, M. L.; Courtney J. Doyle-Campbell, C. J.; Cornelius, A. E.; Dempsey, K.; Siddall, J.; Miele, E. M.; Evans, E. E.; Wood, B.; Sirois, C. M.; Winston, B. A.; Whalen, S. K.; Germain, M. J. | Samuel A Headley | Springfield College | 2019-10-02 | 1 | PUBLISHAHEADOFPRINT | cc_no | nephrology | https://www.medrxiv.org/content/early/2019/10/02/19007989.source.xml | IntroductionLifestyle interventions have been shown to produce favorable changes in some health outcomes in patients with chronic kidney disease (CKD). However, few such studies employing "real-world" methods have been completed in patients with CKD.
ObjectiveThis study tested the effectiveness of a comprehensive, multicomponent, lifestyle intervention, delivered through individualized counseling on a variety of health outcomes in pre-dialysis CKD patients.
MethodsEligible patients were randomly assigned to the intervention (TR) or usual care (UC) group. A six-month home-based program was implemented, involving personalized counseling to increase physical activity to recommended levels among stage G3a to G4 CKD patients while substituting plant proteins for animal proteins. Physical function, cardiovascular function, dietary intake, medication use, and health-related quality of life (HRQOL) were assessed at baseline and after 1 month, 3 months (M3) and 6 months (M6).
ResultsForty-two patients (age 60.2 ± 9.2, BMI 34.5 ± 7.8) participated in this study (TR=27, UC=15). The intervention reduced (p<0.05) brachial (bSBP) and central (cSBP) systolic blood pressures at month 3 (M3), but both reductions were attenuated at month 6 (M6). Scores on the effect-of-kidney-disease subscale of the HRQOL measure improved in the intervention group at M3 and M6. There was no change in the other measures of HRQOL or in any physical function scores.
ConclusionsThis personalized multi-component lifestyle intervention enabled CKD patients to self-report fewer concerns with how CKD affected their daily lives independent of changes in physical function. | null | medrxiv |
10.1101/19008128 | Primary motor cortex has independent representations for ipsilateral and contralateral arm movements but correlated representations for grasping | Downey, J. E.; Quick, K. M.; Schwed, N.; Weiss, J. M.; Wittenberg, G. F.; Boninger, M. L.; Collinger, J. L. | Jennifer L Collinger | University of Pittsburgh | 2019-10-02 | 1 | PUBLISHAHEADOFPRINT | cc_no | neurology | https://www.medrxiv.org/content/early/2019/10/02/19008128.source.xml | Motor commands for the arms and hands generally originate in contralateral motor cortex anatomically. However, ipsilateral primary motor cortex shows activity related to arm movement despite the lack of direct connections. The extent to which the activity related to ipsilateral movement is independent from that related to contralateral movement is unclear based on conflicting conclusions in prior work. Here we present the results of bilateral arm and hand movement tasks completed by two human subjects with intracortical microelectrode arrays implanted in left primary motor cortex for a clinical brain-computer interface study. Neural activity was recorded while they attempted to perform arm and hand movements in a virtual environment. This enabled us to quantify the strength and independence of motor cortical activity related to continuous movements of each arm. We also investigated the subjects' ability to control both arms through a brain-computer interface system. Through a number of experiments, we found that ipsilateral arm movement was represented independently of, but more weakly than, contralateral arm movement. However, the representation of grasping was correlated between the two hands. This difference between hand and arm representation was unexpected, and poses new questions about the different ways primary motor cortex controls hands and arms. | 10.1093/cercor/bhaa120 | medrxiv
10.1101/19007898 | Machine learning assisted DSC-MRI radiomics as a tool for glioma classification by grade and mutation status | Sudre, C.; Panovska-Griffiths, J.; Sanverdi, E.; Brandner, S.; Katsaros, V. K.; Stanjalis, G.; Pizzini, F. B.; Ghimenton, C.; Surlan-Popovic, K.; Avsenik, J.; Spampinato, M. V.; Nigro, M.; Chatterjee, A. R.; Attye, A.; Grand, S.; Krainik, A.; Anzalone, N.; Conte, G. M.; Romeo, V.; Ugga, L.; Elefante, A.; Ciceri, E. F.; Guadagno, E.; Kapsalaki, E.; Roettger, D.; Gonzalez, J.; Boutelier, T.; Cardoso, J. M.; Bisdas, S. | Sotirios Bisdas | UCL | 2019-10-02 | 1 | PUBLISHAHEADOFPRINT | cc_no | neurology | https://www.medrxiv.org/content/early/2019/10/02/19007898.source.xml | BackgroundMachine learning assisted MRI radiomics, which combines MRI techniques with machine learning methodology, is rapidly gaining attention as a promising method for staging of brain gliomas. This study assesses the diagnostic value of such framework applied to dynamic susceptibility contrast (DSC)-MRI in classifying treatment-naive gliomas from a multi-center patient pool into WHO grades II-IV and across their isocitrate dehydrogenase (IDH) mutation status.
Methods333 patients from 6 tertiary centres, diagnosed histologically and molecularly with primary gliomas (IDH-mutant=151 or IDH-wildtype=182) were retrospectively identified. Raw DSC-MRI data was post-processed for normalised leakage-corrected relative cerebral blood volume (rCBV) maps. Shape, intensity distribution (histogram) and rotational invariant Haralick texture features over the tumour mask were extracted. Differences in extracted features between IDH-wildtype and IDH-mutant gliomas and across three glioma grades were tested using the Wilcoxon two-sample test. A random forest algorithm was employed (2-fold cross-validation, 250 repeats) to predict grades or mutation status using the extracted features.
ResultsFeatures from all types (shape, distribution, texture) showed significant differences across mutation status. WHO grade II-III differentiation was mostly driven by shape features while texture and intensity features were more relevant for the III-IV separation. An increasing number of features became significant when differentiating grades further apart from one another. Gliomas were correctly stratified by IDH mutation status in 71% of the cases and by grade in 53% of the cases. In addition, 87% of glioma grades were predicted within an error distance of 1.
ConclusionDespite large heterogeneity in the multi-center dataset, machine learning assisted DSC-MRI radiomics holds potential to address the inherent variability and presents a promising approach for non-invasive glioma molecular subtyping and grading.
Key points- On highly heterogeneous, multi-centre data, machine learning on DSC-MRI features can correctly predict glioma IDH subtyping in 71% of cases and glioma grade II-IV in 53% of cases (87% <1 grade difference)
- Shape features best distinguish grade II from grade III gliomas.
- Texture and distribution features best distinguish grade III from grade IV tumours.
Importance of studyThis work illustrates the diagnostic value of combining machine learning and dynamic susceptibility contrast-enhanced MRI (DSC-MRI) radiomics in classifying gliomas into WHO grades II-IV as well as across their isocitrate dehydrogenase (IDH) mutation status. Despite the data heterogeneity inherent to the multi-centre design of the studied cohort (333 subjects, 6 centres) that greatly increases the theoretical challenges of machine learning frameworks, good classification performance (accuracy of 53% across grades (87% <1 grade difference) and 71% across mutation status) was obtained. Therefore, our results provide a proof-of-concept for this emerging precision medicine field that has good generalisability and scalability properties. Introspection on the classification errors highlighted mostly borderline cases and helped underline the challenges of a categorical classification in a pathological continuum.
With its strong generalisability property, its ability to further incorporate participating centres and its possible use to identify borderline cases, the proposed machine learning framework has the potential to contribute to the clinical translation of machine-learning assisted diagnostic tools in neuro-oncology. | 10.1186/s12911-020-01163-5 | medrxiv |
10.1101/19006635 | Spinal muscular atrophy diagnosis and carrier screening from whole-genome sequencing data | Chen, X.; Sanchis-Juan, A.; French, C. E.; Connell, A. J.; Chawla, A.; Halpern, A. L.; Taft, R. J.; NIHR BioResource, ; Bentley, D. R.; Butchbach, M. E.; Raymond, F. L.; Eberle, M. A. | Michael A Eberle | Illumina, Inc | 2019-10-02 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | genetic and genomic medicine | https://www.medrxiv.org/content/early/2019/10/02/19006635.source.xml | PurposeSpinal muscular atrophy (SMA), caused by loss of the SMN1 gene, is a leading cause of early childhood death. Due to the near identical sequences of SMN1 and SMN2, analysis of this region is challenging. Population-wide SMA screening to quantify the SMN1 copy number (CN) is recommended by the American College of Medical Genetics.
MethodsWe developed a method that accurately identifies the CN of SMN1 and SMN2 using genome sequencing (GS) data by analyzing read depth and eight informative reference genome differences between SMN1/2.
ResultsWe characterized SMN1/2 in 12,747 genomes, identified 1568 samples with SMN1 gains or losses and 6615 samples with SMN2 gains or losses and calculated a pan-ethnic carrier frequency of 2%, consistent with previous studies. Additionally, 99.8% of our SMN1 and 99.7% of SMN2 CN calls agreed with orthogonal methods, with a recall of 100% for SMA and 97.8% for carriers, and a precision of 100% for both SMA and carriers.
ConclusionThis SMN copy number caller can be used to identify both carrier and affected status of SMA, enabling SMA testing to be offered as a comprehensive test in neonatal care and an accurate carrier screening tool in GS sequencing projects. | 10.1038/s41436-020-0754-0 | medrxiv |
10.1101/19006635 | Spinal muscular atrophy diagnosis and carrier screening from whole-genome sequencing data | Chen, X.; Sanchis-Juan, A.; French, C. E.; Connell, A. J.; Delon, I.; Kingsbury, Z.; Chawla, A.; Halpern, A. L.; Taft, R. J.; NIHR BioResource, ; Bentley, D. R.; Butchbach, M. E.; Raymond, F. L.; Eberle, M. A. | Michael A Eberle | Illumina, Inc | 2019-12-18 | 2 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | genetic and genomic medicine | https://www.medrxiv.org/content/early/2019/12/18/19006635.source.xml | PurposeSpinal muscular atrophy (SMA), caused by loss of the SMN1 gene, is a leading cause of early childhood death. Due to the near identical sequences of SMN1 and SMN2, analysis of this region is challenging. Population-wide SMA screening to quantify the SMN1 copy number (CN) is recommended by the American College of Medical Genetics.
MethodsWe developed a method that accurately identifies the CN of SMN1 and SMN2 using genome sequencing (GS) data by analyzing read depth and eight informative reference genome differences between SMN1/2.
ResultsWe characterized SMN1/2 in 12,747 genomes, identified 1568 samples with SMN1 gains or losses and 6615 samples with SMN2 gains or losses and calculated a pan-ethnic carrier frequency of 2%, consistent with previous studies. Additionally, 99.8% of our SMN1 and 99.7% of SMN2 CN calls agreed with orthogonal methods, with a recall of 100% for SMA and 97.8% for carriers, and a precision of 100% for both SMA and carriers.
ConclusionThis SMN copy number caller can be used to identify both carrier and affected status of SMA, enabling SMA testing to be offered as a comprehensive test in neonatal care and an accurate carrier screening tool in GS sequencing projects. | 10.1038/s41436-020-0754-0 | medrxiv |
10.1101/19007963 | Treatment advantage in HBV/HIV coinfection compared to HBV monoinfection in a South African cohort | Maponga, T. G.; McNaughton, A. L.; Van Schalkwyk, M.; Hugo, S.; Nwankwo, C.; Taljaard, J.; Mokaya, J.; Smith, D. A.; van Vuuren, C.; Goedhals, D.; Gabriel, S.; Andersson, M. I.; Preiser, W.; van Rensburg, C.; Matthews, P. C. | Philippa C Matthews | University of Oxford | 2019-10-02 | 1 | PUBLISHAHEADOFPRINT | cc_by | infectious diseases | https://www.medrxiv.org/content/early/2019/10/02/19007963.source.xml | ObjectivePrompted by international targets for elimination of hepatitis B virus (HBV) infection, we performed a cross-sectional observational study of adults with chronic HBV (CHB) infection in South Africa, characterising individuals with HBV monoinfection vs. those coinfected with HBV/HIV, to evaluate the impact of therapy and to guide improvements in clinical care as guidelines for antiviral therapy change over time.
DesignWe prospectively recruited 115 adults with CHB, over a period of one year at a university hospital in Cape Town, South Africa. HIV coinfection was present in 39 (34%) subjects. We recorded cross-sectional demographic, clinical and laboratory data.
ResultsAdults with HBV monoinfection were comparable to those with HBV/HIV coinfection in terms of age, sex and body mass. HBeAg-positive status was more common among those with HIV coinfection (p=0.01). However, compared to HBV/HIV coinfection, HBV monoinfected patients were less likely to have had assessment with elastography (p<0.0001) and less likely to be on antiviral treatment (p<0.0001). The HBV monoinfected group was more likely to have detectable HBV viraemia (p=0.04), and features suggesting underlying liver disease including moderate/severe thrombocytopaenia (p=0.007), elevated bilirubin (p=0.004), and APRI score >2 (p=0.02). Three cases of hepatocellular carcinoma were documented, all in patients with HBV monoinfection.
ConclusionIn this setting, individuals with HBV monoinfection are disadvantaged in terms of clinical assessment and appropriate antiviral therapy compared to those with HIV coinfection, associated with relatively worse liver health. Enhanced advocacy, education, resources and infrastructure are required to optimise interventions for CHB. | 10.1016/j.jinf.2020.04.037 | medrxiv |
10.1101/19007963 | Treatment advantage in HBV/HIV coinfection compared to HBV monoinfection in a South African cohort | Maponga, T. G.; McNaughton, A. L.; Van Schalkwyk, M.; Hugo, S.; Nwankwo, C.; Taljaard, J.; Mokaya, J.; Smith, D. A.; van Vuuren, C.; Goedhals, D.; Gabriel, S.; Andersson, M. I.; Preiser, W.; van Rensburg, C.; Matthews, P. C. | Philippa C Matthews | University of Oxford | 2019-12-19 | 2 | PUBLISHAHEADOFPRINT | cc_by | infectious diseases | https://www.medrxiv.org/content/early/2019/12/19/19007963.source.xml | ObjectivePrompted by international targets for elimination of hepatitis B virus (HBV) infection, we performed a cross-sectional observational study of adults with chronic HBV (CHB) infection in South Africa, characterising individuals with HBV monoinfection vs. those coinfected with HBV/HIV, to evaluate the impact of therapy and to guide improvements in clinical care as guidelines for antiviral therapy change over time.
DesignWe prospectively recruited 115 adults with CHB, over a period of one year at a university hospital in Cape Town, South Africa. HIV coinfection was present in 39 (34%) subjects. We recorded cross-sectional demographic, clinical and laboratory data.
ResultsAdults with HBV monoinfection were comparable to those with HBV/HIV coinfection in terms of age, sex and body mass. HBeAg-positive status was more common among those with HIV coinfection (p=0.01). However, compared to HBV/HIV coinfection, HBV monoinfected patients were less likely to have had assessment with elastography (p<0.0001) and less likely to be on antiviral treatment (p<0.0001). The HBV monoinfected group was more likely to have detectable HBV viraemia (p=0.04), and features suggesting underlying liver disease including moderate/severe thrombocytopaenia (p=0.007), elevated bilirubin (p=0.004), and APRI score >2 (p=0.02). Three cases of hepatocellular carcinoma were documented, all in patients with HBV monoinfection.
ConclusionIn this setting, individuals with HBV monoinfection are disadvantaged in terms of clinical assessment and appropriate antiviral therapy compared to those with HIV coinfection, associated with relatively worse liver health. Enhanced advocacy, education, resources and infrastructure are required to optimise interventions for CHB. | 10.1016/j.jinf.2020.04.037 | medrxiv |
10.1101/19007971 | Optimizing the deployment of ultra-low volume and indoor residual spraying for dengue outbreak response | Cavany, S. M.; Espana, G.; Lloyd, A. L.; Waller, L. A.; Kitron, U.; Astete, H.; Elson, W. H.; Vazquez-Prokopec, G. M.; Scott, T. W.; Morrison, A. C.; Reiner, R. C.; Perkins, T. A. | T. Alex Perkins | University of Notre Dame | 2019-10-02 | 1 | PUBLISHAHEADOFPRINT | cc_by | epidemiology | https://www.medrxiv.org/content/early/2019/10/02/19007971.source.xml | Recent years have seen rising incidence of dengue and large outbreaks of Zika and chikungunya, which are all caused by viruses transmitted by Aedes aegypti mosquitoes. In most settings, the primary intervention against Aedes-transmitted viruses is vector control, such as indoor, ultra-low volume (ULV) spraying. Targeted indoor residual spraying (TIRS) has the potential to more effectively impact Aedes-borne diseases, but its implementation requires careful planning and evaluation. The optimal time to deploy these interventions and their relative epidemiological effects are not well understood, however. We used an agent-based model of dengue virus transmission calibrated to data from Iquitos, Peru to assess the epidemiological effects of these interventions under differing strategies for deploying them. Specifically, we compared strategies where spray application was initiated when incidence rose above a threshold based on incidence in recent years to strategies where spraying occurred at the same time(s) each year. In the absence of spraying, the model predicted 361,000 infections [inter-quartile range (IQR): 347,000 - 383,000] in the period 2000-2010. The ULV strategy with the fewest median infections was spraying twice yearly, in March and October, which led to a median of 172,000 infections [IQR: 158,000 - 183,000] over the 11-year study period, a 52% reduction from baseline. 
Compared to spraying once yearly in September, the best threshold-based strategy utilizing ULV had fewer median infections (254,000 vs. 261,000), but required more spraying (351 vs. 274 days). For TIRS, the best strategy was threshold-based, which led to the fewest infections of all strategies tested (9,900; [IQR: 8,720 - 11,400], a 94% reduction), and required fewer days spraying than the equivalent ULV strategy (280). Although spraying twice each year is likely to avert the most infections, our results indicate that a threshold-based strategy can better balance the translation of spraying effort into impact, particularly if used with a residual insecticide.
Author SummaryOver half of the world's population is at risk of infection by dengue virus (DENV) from Aedes aegypti mosquitoes. While most infected people experience mild or asymptomatic infections, dengue can cause severe symptoms, such as hemorrhage, shock, and death. A vaccine against dengue exists, but it can increase the risk of severe disease in people who have not been previously infected by one of the four DENV serotypes. In many places, therefore, the best currently available way to prevent outbreaks is by controlling the mosquito population. Our study used a simulation model to explore alternative strategies for deploying insecticide in the city of Iquitos in the Peruvian Amazon. Our simulations closely matched empirical patterns from studies of dengue's ecology and epidemiology in Iquitos, such as mosquito population dynamics, human household structure, demography, human and mosquito movement, and virus transmission. Our results indicate that an insecticide that has a long-lasting, residual effect will have the biggest impact on reducing DENV transmission. For non-residual insecticides, we find that it is best to begin spraying close to the start of the dengue transmission season, as mosquito populations can rebound quickly and resume previous levels of transmission. | 10.1371/journal.pcbi.1007743 | medrxiv |
10.1101/19007740 | MicroRNA-mRNA networks define translatable molecular outcome phenotypes in osteosarcoma. | Lietz, C. E.; Garbutt, C.; Barry, W. T.; Deshpande, V.; Chen, Y.-L.; Lozano-Calderon, S. A.; Wang, Y.; Lawney, B.; Ebb, D.; Cote, G. M.; Duan, Z.; Hornicek, F. J.; Choy, E.; Nielsen, G. P.; Haibe-Kains, B.; Quackenbush, J.; Spentzos, D. | Dimitrios Spentzos | Department of Orthopaedic Surgery, Massachusetts General Hospital, Harvard Medical School, Boston, MA. | 2019-10-03 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | genetic and genomic medicine | https://www.medrxiv.org/content/early/2019/10/03/19007740.source.xml | BackgroundThere is a lack of well validated biomarkers in osteosarcoma, a rare, recalcitrant disease with variable outcome and poorly understood biologic behavior, for which treatment standards have stalled for decades. The only standard prognostic factor in osteosarcoma remains the amount of pathologic necrosis following pre-operative chemotherapy, which does not adequately capture the biologic complexity of the tumor and has not resulted in optimized patient therapeutic stratification. New, robust biomarkers are needed to understand prognosis and better reflect the underlying biologic and molecular complexity of this disease.
MethodsWe performed microRNA sequencing in 74 frozen osteosarcoma biopsy samples, the largest single center translationally analyzed cohort to date, and separately analyzed a multi-omic dataset from a large (n = 95) NCI supported national cooperative group cohort. Molecular patterns were tested for association with outcome and used to identify novel therapeutics for further study by integrative pharmacogenomic analysis.
ResultsMicroRNA profiles were found to predict Recurrence Free Survival (5-microRNA profile, median RFS 59 vs 202 months, log rank p=0.06, HR 1.87, 95% CI 0.96-3.66). The profiles were independently prognostic of RFS when controlled for metastatic disease at diagnosis and pathologic necrosis following chemotherapy in multivariate Cox proportional hazards regression (5-microRNA profile, HR 3.31, 95% CI 1.31-8.36, p=0.01). Strong trends for survival discrimination were observed in the independent NCI dataset, and transcriptomic analysis revealed that the downstream microRNA regulatory targets are also predictive of survival (median RFS 17 vs 105 months, log rank p=0.007). Additionally, DNA methylation patterns held prognostic significance. Through machine learning based integrative pharmacogenomic analysis, the microRNA biomarkers identify novel therapeutics for further study and stratified application in osteosarcoma.
ConclusionsOur results support the existence of molecularly defined phenotypes in osteosarcoma associated with distinct outcome independent of clinicopathologic features. We validated candidate microRNA profiles and their associated molecular networks for prognostic value in multiple independent datasets. These networks may define previously unrecognized osteosarcoma subtypes with distinct molecular context and clinical course potentially appropriate for future application of tailored treatment strategies in different patient subgroups. | 10.1038/s41598-020-61236-3 | medrxiv |
10.1101/19006817 | Science map of Cochrane systematic reviews receiving the most altmetric attention: network visualization and machine learning perspective | Kolahi, J.; Khazaei, S.; Bidram, E.; Kelishadi, R. | Jafar Kolahi | Dental Hypotheses | 2019-10-03 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | health informatics | https://www.medrxiv.org/content/early/2019/10/03/19006817.source.xml | We aimed to analyze and visualize the science map of Cochrane systematic reviews (CSR) with high Altmetric attention scores (AAS). On 10 May 2019, the Altmetric data of the CSR Database were obtained from the Altmetric database (Altmetric LLP, London, UK). Bibliometric data of the top 5% of CSR were extracted from the Web of Science. Keyword co-occurrence, co-authorship, and co-citation network analyses were then employed using VOSviewer software. A random forest model was used to analyze the citation patterns. A total of 12016 CSR with AAS were found (total mentions: 259968), with Twitter being the most popular Altmetric resource. Consequently, the top 5% (607 articles, mean AAS: 171.2, 95% confidence level (CL): 14.4, mean citations: 42.1, 95%CL: 1.3) with the highest AAS were included in the study. Keyword co-occurrence network analysis revealed female, adult, and child as the most popular keywords. Helen V. Worthington (University of Manchester, Manchester, UK), the University of Oxford and the UK had the greatest impact on the network at the author, organization and country levels, respectively. The co-citation network analysis revealed that The Lancet and the CSR database had the most influence on the network. However, AAS were not correlated with citations (r=0.15), although they were correlated with policy document mentions (r=0.61). The results of the random forest model confirmed the importance of policy document mentions.
Despite the popularity of CSR in the Twittersphere, disappointingly, they were rarely shared and discussed within the new academic tools that are emerging, such as F1000 prime, Publons, and PubPeer.
Article Highlights- The CSR database was most mentioned on Twitter.
- Twitter and News were the most prominent contributors to altmetric scores. | 10.5530/jscires.9.3.36 | medrxiv |
10.1101/19006817 | Science map of Cochrane systematic reviews receiving the most altmetric attention: network visualization and machine learning perspective | Kolahi, J.; Khazaei, S.; Bidram, E.; Kelishadi, R.; Iranmanesh, P.; Nekoofar, M. H.; Dummer, P. M. H. | Jafar Kolahi | Dental Hypotheses | 2020-04-07 | 2 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | health informatics | https://www.medrxiv.org/content/early/2020/04/07/19006817.source.xml | We aimed to analyze and visualize the science map of Cochrane systematic reviews (CSR) with high Altmetric attention scores (AAS). On 10 May 2019, the Altmetric data of the CSR Database were obtained from the Altmetric database (Altmetric LLP, London, UK). Bibliometric data of the top 5% of CSR were extracted from the Web of Science. Keyword co-occurrence, co-authorship, and co-citation network analyses were then employed using VOSviewer software. A random forest model was used to analyze the citation patterns. A total of 12016 CSR with AAS were found (total mentions: 259968), with Twitter being the most popular Altmetric resource. Consequently, the top 5% (607 articles, mean AAS: 171.2, 95% confidence level (CL): 14.4, mean citations: 42.1, 95%CL: 1.3) with the highest AAS were included in the study. Keyword co-occurrence network analysis revealed female, adult, and child as the most popular keywords. Helen V. Worthington (University of Manchester, Manchester, UK), the University of Oxford and the UK had the greatest impact on the network at the author, organization and country levels, respectively. The co-citation network analysis revealed that The Lancet and the CSR database had the most influence on the network. However, AAS were not correlated with citations (r=0.15), although they were correlated with policy document mentions (r=0.61). The results of the random forest model confirmed the importance of policy document mentions.
Despite the popularity of CSR in the Twittersphere, disappointingly, they were rarely shared and discussed within the new academic tools that are emerging, such as F1000 prime, Publons, and PubPeer.
Article Highlights- The CSR database was most mentioned on Twitter.
- Twitter and News were the most prominent contributors to altmetric scores. | 10.5530/jscires.9.3.36 | medrxiv |
10.1101/19006619 | Association between CCR5-Δ32 homozygosity and mortality in 37,650 participants from three U.S.-based cohorts | Jiang, X.; Huang, H.; Grodstein, F.; Kraft, P. | Peter Kraft | Harvard T.H. Chan School of Public Health | 2019-10-03 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | epidemiology | https://www.medrxiv.org/content/early/2019/10/03/19006619.source.xml | An analysis of 409,693 UK Biobank participants recently published in Nature Medicine identified a relative 21% increase in all-cause mortality among participants who were homozygous for the Δ32 deletion in the C-C motif chemokine receptor 5 gene (CCR5).1 This is a timely and potentially cautionary result in light of He Jiankui's controversial germline editing of CCR5 to induce mutations that putatively mimic the effects of Δ32, which is known to reduce the risk of HIV infection. To provide additional evidence on the association between the Δ32 deletion and mortality and assess its generalizability, we present results from three large-scale population-based US cohorts: the Nurses' Health Study (NHS),2 the NHSII and the Health Professionals Follow-Up Study (HPFS).3 | null | medrxiv |
10.1101/19006288 | Methodology for tDCS integration with fMRI | Esmaeilpour, Z.; Shereen, A. D.; Ghobadi-Azari, P.; Datta, A.; Woods, A. J.; Ironside, M.; O'Shea, J.; Kirk, U.; Bikson, M.; Ekhtiari, H. | Zeinab Esmaeilpour | City University of New York | 2019-10-03 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | psychiatry and clinical psychology | https://www.medrxiv.org/content/early/2019/10/03/19006288.source.xml | Integration of tDCS with fMRI holds promise for investigating the underlying mechanisms of stimulation effects. There are 118 published tDCS studies (up to 1st Oct 2018) that used fMRI as a proxy measure of neural activation to answer mechanistic, predictive, and localization questions about how brain activity is modulated by tDCS. fMRI can potentially contribute: as a measure of cognitive state-level variance in baseline brain activation before tDCS; as a guide for designing stimulation montages that aim to target functional networks during specific tasks; and as an outcome measure of functional response to tDCS. In this systematic review we explore the methodological parameter space of tDCS integration with fMRI. The existing tDCS-fMRI literature shows little replication across these permutations; few studies used comparable study designs. Here, we use a case study with both task and resting state fMRI before and after tDCS in a cross-over design to discuss methodological confounds. We further outline how computational models of current flow should be combined with imaging data to understand sources of variability in responsiveness. Through the case study, we demonstrate how modeling and imaging methodology can be integrated for individualized analysis. Finally, we discuss the importance of conducting tDCS-fMRI with stimulation equipment certified as safe to use inside the MR scanner, and of correcting for image artifacts caused by tDCS. tDCS-fMRI can address important questions on the functional mechanisms of tDCS action (e.g.
target engagement) and has the potential to support enhancement of behavioral interventions, provided studies are designed rationally. | 10.1002/hbm.24908 | medrxiv |
10.1101/19006569 | BAck Complaints in the Elders - Chiropractic (BACE-C): Design of a cohort study in chiropractic care | Jenks, A.; Rubinstein, S. M.; Hoekstra, T.; van Tulder, M.; de Luca, K.; French, S.; Newell, D.; Field, J.; Axen, I.; Koes, B.; Hartvigsen, J. | Alan Jenks | Vrije Universiteit | 2019-10-04 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | epidemiology | https://www.medrxiv.org/content/early/2019/10/04/19006569.source.xml | BackgroundLow back pain is a common condition among older adults that significantly influences physical function and participation. Compared to their younger counterparts, there is limited information available about the clinical course of low back pain in older people, in particular those presenting for chiropractic care. Improving our understanding of this patient population and the course of their low back pain may provide input for studies researching safer and more effective care than is currently provided.
ObjectivesThe primary objectives are to examine, over one year, the clinical course of low back pain intensity, healthcare costs and improvement rates in people 55 years and older who visit a chiropractor for a new episode of low back pain.
MethodsAn international prospective, multi-center cohort study with one-year follow-up. Chiropractic practices are to be recruited in the Netherlands, Sweden, United Kingdom and Australia. Treatment will be left to the discretion of the chiropractor. Inclusion/Exclusion criteria: Patients 55 years and older who are accepted for care having presented to a chiropractor with a new episode of low back pain and who have not been to a chiropractor in the previous six months for an episode of low back pain are to be included, independent of whether or not they have seen another type of health care provider. Patients who are unable to complete the web-based questionnaires because of language restrictions or those with computer literacy restrictions will be excluded as well as those with cognitive disorders. In addition, those with a suspected tumor, fracture, infection or any other potential red flag or condition considered to be a contraindication for chiropractic care will be excluded. Data will be collected using online questionnaires at baseline, and at 2 and 6 weeks and at 3, 6, 9 and 12 months.
Trial RegistrationNederlandse Trial Register NL7507 | 10.1186/s12998-020-00302-z | medrxiv |
10.1101/19005744 | Postgraduate education among family and community physicians in Brazil: the Trajetorias MFC project | Fontenelle, L. F.; Rossi, S. V.; Oliveira, M. H. M. d.; Brandao, D. J.; Sarti, T. D. | Leonardo Ferreira Fontenelle | Universidade Vila Velha | 2019-10-05 | 1 | PUBLISHAHEADOFPRINT | cc_by | medical education | https://www.medrxiv.org/content/early/2019/10/05/19005744.source.xml | Neither primary health care nor family and community medicine is recognized as a knowledge area in Brazil, for the purpose of postgraduate education (master's, Ph.D.) or research. Our objective was to describe the postgraduate education trajectories of family and community physicians in Brazil. In this observational, exploratory study, we used data from SBMFC and SisCNRM to compile the list of family and community physicians, and then downloaded their curricula vitae from the Lattes Platform, verifying all data for consistency. A master's degree was held by one in eight, and a Ph.D., by one in forty; most degrees were in collective health. Women (versus men) were less likely to hold master's degrees, and even less likely to hold Ph.D. degrees. Professional (versus academic) master's degrees and those in other areas (versus in medicine or collective health) were also associated with lower probability of obtaining a Ph.D. degree. Certified specialists (versus those with a medical residency) with a postgraduate degree were more likely to have earned it before becoming family and community physicians. We suggest that researchers in public health critically examine the relative benefits of different postgraduate trajectories for the professional performance of family and community physicians. | 10.1136/fmch-2020-000321 | medrxiv |
10.1101/19005744 | Postgraduate education among family and community physicians in Brazil: the Trajetorias MFC project | Fontenelle, L. F.; Rossi, S. V.; Oliveira, M. H. M. d.; Brandao, D. J.; Sarti, T. D. | Leonardo Ferreira Fontenelle | Universidade Vila Velha | 2019-12-11 | 2 | PUBLISHAHEADOFPRINT | cc_by | medical education | https://www.medrxiv.org/content/early/2019/12/11/19005744.source.xml | Neither primary health care nor family and community medicine is recognized as a knowledge area in Brazil, for the purpose of postgraduate education (master's, Ph.D.) or research. Our objective was to describe the postgraduate education trajectories of family and community physicians in Brazil. In this observational, exploratory study, we used data from SBMFC and SisCNRM to compile the list of family and community physicians, and then downloaded their curricula vitae from the Lattes Platform, verifying all data for consistency. A master's degree was held by one in eight, and a Ph.D., by one in forty; most degrees were in collective health. Women (versus men) were less likely to hold master's degrees, and even less likely to hold Ph.D. degrees. Professional (versus academic) master's degrees and those in other areas (versus in medicine or collective health) were also associated with lower probability of obtaining a Ph.D. degree. Certified specialists (versus those with a medical residency) with a postgraduate degree were more likely to have earned it before becoming family and community physicians. We suggest that researchers in public health critically examine the relative benefits of different postgraduate trajectories for the professional performance of family and community physicians. | 10.1136/fmch-2020-000321 | medrxiv |
10.1101/19006379 | A method for complete characterization of complex germline rearrangements from long DNA reads | Mitsuhashi, S.; Ohori, S.; Katoh, K.; Frith, M. C.; Matsumoto, N. | Naomichi Matsumoto | Yokohama City University Graduate School of Medicine | 2019-10-05 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | genetic and genomic medicine | https://www.medrxiv.org/content/early/2019/10/05/19006379.source.xml | Many genetic/genomic disorders are caused by genomic rearrangements. Standard methods can often characterize these variations only partly, e.g. copy number changes. We describe full characterization of complex chromosomal rearrangements, based on whole-genome-coverage sequencing of long DNA reads from four patients with chromosomal translocations. We developed a new analysis pipeline, which filters out rearrangements seen in humans without the same disease, reducing the number of loci per patient from a few thousand to a few dozen. For one patient with two reciprocal chromosomal translocations, we find that the translocation points have complex rearrangements of multiple DNA fragments involving 5 chromosomes, which we could order and orient by an automatic algorithm, thereby fully reconstructing the rearrangement. Some important properties of these rearrangements, such as sequence loss, are holistic: they cannot be inferred from any part of the rearrangement, but only from the fully-reconstructed rearrangement. In this patient, the rearrangements were evidently caused by shattering of the chromosomes into multiple fragments, which rejoined in a different order and orientation with loss of some fragments. Our approach promises to fully characterize many congenital germline rearrangements, provided they do not involve poorly-understood loci such as centromeric repeats. | 10.1186/s13073-020-00762-1 | medrxiv |
10.1101/19007013 | Examining the effect of smoking on suicidal ideation and attempts: A Mendelian randomisation study | Harrison, R.; Munafo, M. R.; Davey Smith, G.; Wootton, R. E. | Robyn E Wootton | University of Bristol | 2019-10-05 | 1 | PUBLISHAHEADOFPRINT | cc_by | epidemiology | https://www.medrxiv.org/content/early/2019/10/05/19007013.source.xml | Background: Previous literature has demonstrated a strong association between cigarette smoking and suicide-related behaviours, characterised as ideation, plans, attempts and suicide-related death. This association has not previously been examined in a causal inference framework and has important implications for suicide prevention strategies.
Aims: We aimed to examine the evidence for an association between smoking behaviours (initiation, smoking status, heaviness, lifetime smoking) and suicidal thoughts or attempts by triangulating across observational and Mendelian randomisation (MR) analyses.
Methods: First, in the UK Biobank, we calculated observed associations between smoking behaviours and suicidal thoughts or attempts. Second, we used Mendelian randomisation (MR) to explore the relationship between smoking and suicide, using genetic variants as instruments to reduce bias from residual confounding and reverse causation.
Results: Our observational analysis showed a relationship between smoking behaviour and suicidal behaviour, particularly between smoking initiation and suicide attempts (OR = 2.07, 95% CI = 1.91 to 2.26, p<0.001). The MR and single-SNP analyses, however, did not support this. Despite past literature showing a positive dose-response relationship, our results showed no clear evidence for a causal effect of smoking on suicidal behaviours.
Conclusion: This was the first MR study to explore the effect of smoking on suicidal behaviours. Our results suggest that, despite observed associations, there is no strong evidence for a causal effect of smoking behaviour on suicidal behaviour. Further research is needed into alternative risk factors for suicide that might make better intervention targets. | 10.1192/bjp.2020.68 | medrxiv |
10.1101/19007013 | Examining the effect of smoking on suicidal ideation and attempts: A triangulation of epidemiological approaches | Harrison, R.; Munafo, M. R.; Davey Smith, G.; Wootton, R. E. | Robyn E Wootton | University of Bristol | 2019-10-25 | 2 | PUBLISHAHEADOFPRINT | cc_by | epidemiology | https://www.medrxiv.org/content/early/2019/10/25/19007013.source.xml | Background: Previous literature has demonstrated a strong association between cigarette smoking and suicide-related behaviours, characterised as ideation, plans, attempts and suicide-related death. This association has not previously been examined in a causal inference framework and has important implications for suicide prevention strategies.
Aims: We aimed to examine the evidence for an association between smoking behaviours (initiation, smoking status, heaviness, lifetime smoking) and suicidal thoughts or attempts by triangulating across observational and Mendelian randomisation (MR) analyses.
Methods: First, in the UK Biobank, we calculated observed associations between smoking behaviours and suicidal thoughts or attempts. Second, we used Mendelian randomisation (MR) to explore the relationship between smoking and suicide, using genetic variants as instruments to reduce bias from residual confounding and reverse causation.
Results: Our observational analysis showed a relationship between smoking behaviour and suicidal behaviour, particularly between smoking initiation and suicide attempts (OR = 2.07, 95% CI = 1.91 to 2.26, p<0.001). The MR and single-SNP analyses, however, did not support this. Despite past literature showing a positive dose-response relationship, our results showed no clear evidence for a causal effect of smoking on suicidal behaviours.
Conclusion: This was the first MR study to explore the effect of smoking on suicidal behaviours. Our results suggest that, despite observed associations, there is no strong evidence for a causal effect of smoking behaviour on suicidal behaviour. Further research is needed into alternative risk factors for suicide that might make better intervention targets. | 10.1192/bjp.2020.68 | medrxiv |
10.1101/19007336 | Implementation of a specialist pneumonia intervention nurse service significantly lowers mortality for community acquired pneumonia | Free, R. C.; Richardson, M.; Pillay, C.; Skeemer, J.; Hawkes, K.; Broughton, R.; Woltmann, G. | Gerrit Woltmann | University Hospitals of Leicester NHS Trust | 2019-10-05 | 1 | PUBLISHAHEADOFPRINT | cc_by | respiratory medicine | https://www.medrxiv.org/content/early/2019/10/05/19007336.source.xml | Objectives: To evaluate clinical outcomes associated with implementing a specialist pneumonia intervention nursing (SPIN) service to improve adherence with BTS guidelines for hospitalised community acquired pneumonia (CAP).
Design: Retrospective cohort study comparing periods before (2011-13) and after (2014-16) SPIN service implementation.
Setting: Single NHS trust across two hospital sites in Leicester City, England.
Participants: 13,496 adult (aged ≥16) admissions to hospital with a primary diagnosis of CAP.
Interventions: The SPIN service was set up in 2013/2014 to provide clinical review of new CAP admissions, assurance of guideline adherence, delivery of CAP clinical education, and clinical follow-up after discharge.
Main outcome measures: The primary outcomes were the proportion of CAP cases receiving antibiotic treatment within 4 hours of admission and the change in crude in-hospital mortality rate. Secondary outcomes were adjusted mortality rate and length of stay (LOS).
Results: The SPIN service reviewed 38% of CAP admissions in 2014-16; 82% of these admissions received antibiotic treatment in <4 hours (versus 68.5% in the national audit). Compared with the pre-SPIN period, there was a significant reduction in both 30-day (OR=0.77 [0.70-0.85], p<0.0001) and in-hospital mortality (OR=0.66 [0.60-0.73], p<0.0001) after service implementation, with review by the service having the largest independent 30-day mortality benefit (HR=0.60 [0.53-0.67], p<0.0001). There was no change in LOS (median 6 days).
Conclusions: Implementation of a SPIN service improves adherence with BTS guidelines and achieves significant reductions in CAP-associated mortality. This enhanced model of care is low cost, highly effective and readily adoptable in secondary care.
Key messages: What is the key question? Does a specialist nurse-led intervention affect BTS guideline adherence and mortality for patients admitted to hospital with community acquired pneumonia (CAP)?
What is the bottom line? Implementing specialist nurse teams for CAP delivers improved guideline adherence and survival for patients admitted with the condition.
Why read on? This study shows that a low-cost specialist nursing service focussed on CAP is associated with a significant improvement in BTS guideline adherence and patient survival. | 10.1136/bmjresp-2020-000863 | medrxiv |
10.1101/19007336 | Implementation of a specialist pneumonia intervention nurse service significantly lowers mortality for community acquired pneumonia | Free, R. C.; Richardson, M.; Pillay, C.; Skeemer, J.; Hawkes, K.; Broughton, R.; Haldar, P.; Woltmann, G. | Gerrit Woltmann | University Hospitals of Leicester NHS Trust | 2019-10-10 | 2 | PUBLISHAHEADOFPRINT | cc_by | respiratory medicine | https://www.medrxiv.org/content/early/2019/10/10/19007336.source.xml | Objectives: To evaluate clinical outcomes associated with implementing a specialist pneumonia intervention nursing (SPIN) service to improve adherence with BTS guidelines for hospitalised community acquired pneumonia (CAP).
Design: Retrospective cohort study comparing periods before (2011-13) and after (2014-16) SPIN service implementation.
Setting: Single NHS trust across two hospital sites in Leicester City, England.
Participants: 13,496 adult (aged ≥16) admissions to hospital with a primary diagnosis of CAP.
Interventions: The SPIN service was set up in 2013/2014 to provide clinical review of new CAP admissions, assurance of guideline adherence, delivery of CAP clinical education, and clinical follow-up after discharge.
Main outcome measures: The primary outcomes were the proportion of CAP cases receiving antibiotic treatment within 4 hours of admission and the change in crude in-hospital mortality rate. Secondary outcomes were adjusted mortality rate and length of stay (LOS).
Results: The SPIN service reviewed 38% of CAP admissions in 2014-16; 82% of these admissions received antibiotic treatment in <4 hours (versus 68.5% in the national audit). Compared with the pre-SPIN period, there was a significant reduction in both 30-day (OR=0.77 [0.70-0.85], p<0.0001) and in-hospital mortality (OR=0.66 [0.60-0.73], p<0.0001) after service implementation, with review by the service having the largest independent 30-day mortality benefit (HR=0.60 [0.53-0.67], p<0.0001). There was no change in LOS (median 6 days).
Conclusions: Implementation of a SPIN service improves adherence with BTS guidelines and achieves significant reductions in CAP-associated mortality. This enhanced model of care is low cost, highly effective and readily adoptable in secondary care.
Key messages: What is the key question? Does a specialist nurse-led intervention affect BTS guideline adherence and mortality for patients admitted to hospital with community acquired pneumonia (CAP)?
What is the bottom line? Implementing specialist nurse teams for CAP delivers improved guideline adherence and survival for patients admitted with the condition.
Why read on? This study shows that a low-cost specialist nursing service focussed on CAP is associated with a significant improvement in BTS guideline adherence and patient survival. | 10.1136/bmjresp-2020-000863 | medrxiv |
10.1101/19007187 | The day-to-day experiences of caring for children with Osteogenesis Imperfecta: A qualitative descriptive study | Castro, A. R.; Marinello, J.; Chougui, K.; Morand, M.; Bilodeau, C.; Rauch, F.; Tsimicalis, A. | Aimee R Castro | McGill University and Shriners Hospitals for Children-Canada | 2019-10-05 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | nursing | https://www.medrxiv.org/content/early/2019/10/05/19007187.source.xml | Aims and objectives: This study aimed to explore the day-to-day experiences of caregivers caring for children with Osteogenesis Imperfecta (OI).
Background: OI is a rare genetic condition known to cause bone fragility. Family caregivers, such as parents, of children with OI play an important role in helping these children live well at home.
Design: Qualitative description.
Methods: A qualitative descriptive study was conducted, adhering to the COREQ guidelines. Adult caregivers (n=18) of children with OI were recruited at a children's hospital in Montreal, Canada to participate in individual interviews. The interviews were transcribed verbatim and inductively thematically analysed.
Results: The following caregiving themes were identified in these interviews: regular day-to-day caregiving activities, including morning routines, evening routines, and the facilitation of their child's mobilization; periods that made the caregiving routine more challenging, such as fractures, surgeries, and pain; and the long-term strategies caregivers developed to support day-to-day care, such as managing the environment, accessing medical and school resources, and coordinating care and respite.
Conclusions: The results showcase what being a caregiver for a child with OI involves on a day-to-day basis.
Relevance to clinical practice: The recommendations include suggestions for future clinical, policy, and research endeavours to develop better policies and interventions to support the unique needs of family caregivers of children with OI. These recommendations may be relevant to other clinicians and policymakers working with families living with rare and chronic physical conditions. | 10.1111/jocn.15310 | medrxiv |
10.1101/19007187 | The day-to-day experiences of caring for children with Osteogenesis Imperfecta: A qualitative descriptive study | Castro, A. R.; Marinello, J.; Chougui, K.; Morand, M.; Bilodeau, C.; Tsimicalis, A. | Aimee R Castro | McGill University and Shriners Hospitals for Children-Canada | 2019-11-05 | 2 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | nursing | https://www.medrxiv.org/content/early/2019/11/05/19007187.source.xml | Aims and objectives: This study aimed to explore the day-to-day experiences of caregivers caring for children with Osteogenesis Imperfecta (OI).
Background: OI is a rare genetic condition known to cause bone fragility. Family caregivers, such as parents, of children with OI play an important role in helping these children live well at home.
Design: Qualitative description.
Methods: A qualitative descriptive study was conducted, adhering to the COREQ guidelines. Adult caregivers (n=18) of children with OI were recruited at a children's hospital in Montreal, Canada to participate in individual interviews. The interviews were transcribed verbatim and inductively thematically analysed.
Results: The following caregiving themes were identified in these interviews: regular day-to-day caregiving activities, including morning routines, evening routines, and the facilitation of their child's mobilization; periods that made the caregiving routine more challenging, such as fractures, surgeries, and pain; and the long-term strategies caregivers developed to support day-to-day care, such as managing the environment, accessing medical and school resources, and coordinating care and respite.
Conclusions: The results showcase what being a caregiver for a child with OI involves on a day-to-day basis.
Relevance to clinical practice: The recommendations include suggestions for future clinical, policy, and research endeavours to develop better policies and interventions to support the unique needs of family caregivers of children with OI. These recommendations may be relevant to other clinicians and policymakers working with families living with rare and chronic physical conditions. | 10.1111/jocn.15310 | medrxiv |
10.1101/19007708 | Systematic review and network meta-analysis with individual participant data on Cord Management at Preterm Birth (iCOMP): study protocol | Seidler, A. L. L.; Duley, L.; Katheria, A.; De Paco Matallana, C.; Dempsey, E.; Rabe, H.; Kattwinkel, J.; Mercer, J.; Josephsen, J.; Fairchild, K.; Andersson, O.; Hosono, S.; Sundaram, V.; Datta, V.; El-Naggar, W.; Tarnow-Mordi, W.; Debray, T. P. A.; Hooper, S.; Kluckow, M.; Polglase, G.; Davis, P.; Montgomery, A.; Hunter, K. E.; Barba, A.; Simes, J.; Askie, L. | Anna Lene L Seidler | University of Sydney, NHMRC Clinical Trials Centre | 2019-10-05 | 1 | PUBLISHAHEADOFPRINT | cc_by_nd | obstetrics and gynecology | https://www.medrxiv.org/content/early/2019/10/05/19007708.source.xml | Introduction: Timing of cord clamping and other cord management strategies may improve outcomes at preterm birth. However, it is unclear whether benefits apply to all preterm subgroups, such as those who usually receive immediate neonatal care. Previous and current trials compare various policies, including immediate cord clamping, time- or physiology-based deferred cord clamping, and cord milking. Individual participant data (IPD) enables exploration of different strategies within subgroups. Network meta-analysis (NMA) enables comparison and ranking of all available interventions using a combination of direct and indirect comparisons.
Objectives: 1) To evaluate the effectiveness of cord management strategies for preterm infants on neonatal mortality and morbidity, overall and for different participant characteristics, using IPD meta-analysis; and 2) to evaluate and rank the effect of different cord management strategies for preterm births on mortality and other key outcomes using NMA.
Methods and analysis: We will conduct a systematic search of Medline, Embase, clinical trial registries, and other sources for all planned, ongoing and completed randomised controlled trials comparing alternative cord management strategies at preterm birth (before 37 weeks' gestation). IPD will be sought for all trials. First, deferred clamping and cord milking will be compared with immediate clamping in pairwise IPD meta-analyses. The primary outcome will be death prior to hospital discharge. Effect differences will be explored for pre-specified subgroups of participants. Second, all identified cord management strategies will be compared and ranked in an IPD NMA for the primary outcome and the key secondary outcomes of intraventricular haemorrhage (any grade) and infant blood transfusions (any). Treatment effect differences by participant characteristics will be identified. Inconsistency and heterogeneity will be explored.
Ethics and dissemination: Approved by the University of Sydney Human Research Ethics Committee (2018/886). Results will be relevant to clinicians, guideline developers and policy-makers, and will be disseminated via publications, presentations, and media releases.
Registration: Australian New Zealand Clinical Trials Registry: ACTRN12619001305112.
STRENGTHS AND LIMITATIONS OF THIS STUDY
- This will be the most comprehensive review to date of interventions for umbilical cord management in preterm infants, and the findings will be highly relevant to clinicians and guideline developers.
- The use of individual participant data will allow assessment of the best treatment option for key subgroups of participants.
- Network meta-analysis will enable the comparison and ranking of all available treatment options using direct and indirect evidence.
- For some of the trials it will not be possible to obtain individual participant data, so published aggregate results will be used instead.
- Risk of bias in the primary trials will be assessed using Cochrane criteria, and certainty of evidence for the meta-analyses will be appraised using the GRADE approach for the pairwise comparisons and the CINeMA approach for the network meta-analysis. | 10.1136/bmjopen-2019-034595 | medrxiv |
10.1101/19008029 | Potential impact of outpatient stewardship interventions on antibiotic exposures of bacterial pathogens | Tedijanto, C.; Grad, Y. H.; Lipsitch, M. | Christine Tedijanto | Center for Communicable Disease Dynamics, Department of Epidemiology, Harvard T.H. Chan School of Public Health | 2019-10-05 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | infectious diseases | https://www.medrxiv.org/content/early/2019/10/05/19008029.source.xml | The relationship between antibiotic stewardship and population levels of antibiotic resistance remains unclear. In order to better understand shifts in selective pressure due to stewardship, we use publicly available data to estimate the effect of changes in prescribing on exposures to frequently used antibiotics experienced by potentially pathogenic bacteria that are asymptomatically colonizing the microbiome. We quantify this impact under four hypothetical stewardship strategies. In one scenario, we estimate that elimination of all unnecessary outpatient antibiotic use could avert 6 to 48% (IQR: 17 to 31%) of exposures across pairwise combinations of sixteen common antibiotics and nine bacterial pathogens. All scenarios demonstrate that stewardship interventions, facilitated by changes in clinician behavior and improved diagnostics, have the opportunity to broadly reduce antibiotic exposures across a range of potential pathogens. Concurrent approaches, such as vaccines aiming to reduce infection incidence, are needed to further decrease exposures occurring in "necessary" contexts. | 10.7554/eLife.52307 | medrxiv |
10.1101/19008029 | Potential impact of outpatient stewardship interventions on antibiotic exposures of bacterial pathogens | Tedijanto, C.; Grad, Y. H.; Lipsitch, M. | Christine Tedijanto | Center for Communicable Disease Dynamics, Department of Epidemiology, Harvard T.H. Chan School of Public Health | 2020-01-17 | 2 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | infectious diseases | https://www.medrxiv.org/content/early/2020/01/17/19008029.source.xml | The relationship between antibiotic stewardship and population levels of antibiotic resistance remains unclear. In order to better understand shifts in selective pressure due to stewardship, we use publicly available data to estimate the effect of changes in prescribing on exposures to frequently used antibiotics experienced by potentially pathogenic bacteria that are asymptomatically colonizing the microbiome. We quantify this impact under four hypothetical stewardship strategies. In one scenario, we estimate that elimination of all unnecessary outpatient antibiotic use could avert 6 to 48% (IQR: 17 to 31%) of exposures across pairwise combinations of sixteen common antibiotics and nine bacterial pathogens. All scenarios demonstrate that stewardship interventions, facilitated by changes in clinician behavior and improved diagnostics, have the opportunity to broadly reduce antibiotic exposures across a range of potential pathogens. Concurrent approaches, such as vaccines aiming to reduce infection incidence, are needed to further decrease exposures occurring in "necessary" contexts. | 10.7554/eLife.52307 | medrxiv |
10.1101/19007021 | A validation of machine learning-based risk scores in the prehospital setting | Spangler, D. N.; Hermansson, T.; Smekal, D.; Blomberg, H. | Douglas Nils Spangler | Uppsala University | 2019-10-05 | 1 | PUBLISHAHEADOFPRINT | cc_by_nd | emergency medicine | https://www.medrxiv.org/content/early/2019/10/05/19007021.source.xml | Background: The triage of patients in pre-hospital care is a difficult task, and improved risk assessment tools are needed both at the dispatch center and on the ambulance to differentiate between low- and high-risk patients. This study develops and validates a machine learning-based approach to predicting hospital outcomes based on routinely collected prehospital data.
Methods: Dispatch, ambulance, and hospital data were collected in one Swedish region from 2016-2017. Dispatch center and ambulance records were used to develop gradient boosting models predicting hospital admission, critical care (defined as admission to an intensive care unit or in-hospital mortality), and two-day mortality. Model predictions were used to generate composite risk scores, which were compared to National Early Warning Score (NEWS) values and actual dispatched priorities in a similar but prospectively gathered dataset from 2018.
Results: A total of 38,203 patients were included from 2016-2018. Concordance indexes (area under the receiver operating characteristic curve) for dispatched priorities ranged from 0.51 to 0.66, while those for NEWS scores ranged from 0.66 to 0.85. Concordance ranged from 0.71 to 0.80 for risk scores based only on dispatch data, and 0.79 to 0.89 for risk scores including ambulance data. Dispatch data-based risk scores consistently outperformed dispatched priorities in predicting hospital outcomes, while models including ambulance data also consistently outperformed NEWS scores. Model performance in the prospective test dataset was similar to that found using cross-validation, and calibration was comparable to that of NEWS scores.
Conclusions: Machine learning-based risk scores outperformed a widely used rule-based triage algorithm and human prioritization decisions in predicting hospital outcomes. Performance was robust in a prospectively gathered dataset, and scores demonstrated adequate calibration. Future research should investigate the generality of these results to prehospital triage in other settings, and establish the impact of triage tools based on these methods by means of a randomized trial. | 10.1371/journal.pone.0226518 | medrxiv |
10.1101/19008417 | Quantifying the success of measles vaccination campaigns in the Rohingya refugee camps | Chin, T.; Buckee, C. O.; Mahmud, A. S. | Ayesha S. Mahmud | University of California, Berkeley | 2019-10-05 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | epidemiology | https://www.medrxiv.org/content/early/2019/10/05/19008417.source.xml | In the wake of the Rohingya population's mass migration from Myanmar, one of the world's largest refugee settlements was constructed in Cox's Bazar, Bangladesh to accommodate nearly 900,000 new refugees. Refugee populations are particularly vulnerable to infectious disease outbreaks due to many population and environmental factors. A large measles outbreak, with over 2,500 cases, occurred among the Rohingya population between September and December 2017. Here, we estimate key epidemiological parameters and use a dynamic mathematical model of measles transmission to evaluate the effectiveness of the reactive vaccination campaigns in the refugee camps. We also estimate the potential for subsequent outbreaks under different vaccination coverage scenarios. Our modeling results highlight the success of the vaccination campaigns in rapidly curbing transmission and emphasize the public health importance of maintaining high levels of vaccination in this population, where high birth rates and historically low vaccination coverage rates create suitable conditions for future measles outbreaks. | 10.1016/j.epidem.2020.100385 | medrxiv |
10.1101/19008011 | Can the impact of childhood adiposity on disease risk be reversed? A Mendelian randomization study | Richardson, T. G.; Sanderson, E.; Elsworth, B.; Tilling, K.; Davey Smith, G. | Tom G Richardson | MRC Integrative Epidemiology Unit (IEU) | 2019-10-05 | 1 | PUBLISHAHEADOFPRINT | cc_by | epidemiology | https://www.medrxiv.org/content/early/2019/10/05/19008011.source.xml | Objective: To evaluate whether early life adiposity has an independent effect on later life disease risk or whether its influence is mediated by adulthood body mass index (BMI).
Design: Two-sample univariable and multivariable Mendelian randomization.
Setting: The UK Biobank (UKB) prospective cohort study and four large-scale genome-wide association study (GWAS) consortia.
Participants: 453,169 participants enrolled in the UKB and a combined total of over 700,000 individuals from different GWAS consortia.
Exposures: Measured BMI during adulthood (mean age: 56.5) and self-reported adiposity at age 10.
Main outcome measures: Coronary artery disease (CAD), type 2 diabetes (T2D), breast cancer and prostate cancer.
Results: Individuals with genetically predicted higher BMI in early life had increased odds of CAD (OR:1.49, 95% CI:1.33-1.68) and T2D (OR:2.32, 95% CI:1.76-3.05) based on univariable MR (UVMR) analyses. However, there was little evidence of a direct effect (i.e. not via adult BMI) based on multivariable MR (MVMR) estimates (CAD OR:1.02, 95% CI:0.86-1.22, T2D OR:1.16, 95% CI:0.74-1.82). In the MVMR analysis of breast cancer risk, there was strong evidence of a protective direct effect for early BMI (OR:0.59, 95% CI:0.50-0.71), although adult BMI did not appear to have a direct effect on this outcome (OR:1.08, 95% CI:0.93-1.27). Adding age of menarche as an additional exposure provided weak evidence of a total causal effect (UVMR OR:0.98, 95% CI:0.91-1.06) but strong evidence of a direct causal effect, independent of early and adult BMI (MVMR OR:0.90, 95% CI:0.85-0.95). Weak evidence of a causal effect was observed in the MVMR analysis of prostate cancer (early life BMI OR:1.06, 95% CI:0.81-1.40, adult BMI OR:0.87, 95% CI:0.70-1.08).
Conclusions: Our findings suggest that increased CAD and T2D risk attributed to early life adiposity can be mitigated if individuals reduce their weight in later life. However, having a low BMI during childhood may increase risk of breast cancer regardless of changes to weight in later life, with timing of puberty also putatively playing an important role. | 10.1136/bmj.m1203 | medrxiv |
10.1101/19007922 | Mathematical modeling of directed acyclic graphs to explore competing causal mechanisms underlying epidemiological study data | Havumaki, J.; Eisenberg, M. | Marisa Eisenberg | University of Michigan | 2019-10-05 | 1 | PUBLISHAHEADOFPRINT | cc_by_nd | epidemiology | https://www.medrxiv.org/content/early/2019/10/05/19007922.source.xml | Accurately estimating the effect of an exposure on an outcome requires understanding how variables relevant to a study question are causally related to each other. Directed acyclic graphs (DAGs) are used in epidemiology to understand causal processes and determine appropriate statistical approaches to obtain unbiased measures of effect. Compartmental models (CMs) are also used to represent different causal mechanisms, by depicting flows between disease states on the population level. In this paper, we extend a mapping between DAGs and CMs to show how DAG-derived CMs can be used to compare competing causal mechanisms by simulating epidemiological studies and conducting statistical analyses on the simulated data. Through this framework, we can evaluate how robust simulated epidemiological study results are to different biases in study design and underlying causal mechanisms. As a case study, we simulated a longitudinal cohort study to examine the obesity paradox: the apparent protective effect of obesity on mortality among diabetic ever-smokers, but not among diabetic never-smokers. Our simulations illustrate how study design bias (e.g., reverse causation), can lead to the obesity paradox. Ultimately, we show the utility of transforming DAGs into in silico laboratories within which researchers can systematically evaluate bias, and inform analyses and study design. | 10.1098/rsif.2019.0675 | medrxiv |
10.1101/19007922 | Using compartmental models to simulate directed acyclic graphs to explore competing causal mechanisms underlying epidemiological study data | Havumaki, J.; Eisenberg, M. | Marisa Eisenberg | University of Michigan | 2020-05-28 | 2 | PUBLISHAHEADOFPRINT | cc_by_nd | epidemiology | https://www.medrxiv.org/content/early/2020/05/28/19007922.source.xml | Accurately estimating the effect of an exposure on an outcome requires understanding how variables relevant to a study question are causally related to each other. Directed acyclic graphs (DAGs) are used in epidemiology to understand causal processes and determine appropriate statistical approaches to obtain unbiased measures of effect. Compartmental models (CMs) are also used to represent different causal mechanisms, by depicting flows between disease states on the population level. In this paper, we extend a mapping between DAGs and CMs to show how DAG-derived CMs can be used to compare competing causal mechanisms by simulating epidemiological studies and conducting statistical analyses on the simulated data. Through this framework, we can evaluate how robust simulated epidemiological study results are to different biases in study design and underlying causal mechanisms. As a case study, we simulated a longitudinal cohort study to examine the obesity paradox: the apparent protective effect of obesity on mortality among diabetic ever-smokers, but not among diabetic never-smokers. Our simulations illustrate how study design bias (e.g., reverse causation), can lead to the obesity paradox. Ultimately, we show the utility of transforming DAGs into in silico laboratories within which researchers can systematically evaluate bias, and inform analyses and study design. | 10.1098/rsif.2019.0675 | medrxiv |
10.1101/19007997 | Safety of Flexible Sigmoidoscopy in Pregnant Patients with known or suspected Inflammatory Bowel Disease | Ko, M. S.; Rudrapatna, V. A.; Avila, P.; Mahadevan, U. | Uma Mahadevan | University of California, San Francisco | 2019-10-05 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | gastroenterology | https://www.medrxiv.org/content/early/2019/10/05/19007997.source.xml | I. Background and Aims: Lower gastrointestinal endoscopy is the gold standard for the diagnosis and staging of Inflammatory Bowel Disease (IBD). However, there are limited safety data in pregnant populations, resulting in conservative society guidelines and practice patterns favoring diagnostic delay. The aim of this study is to investigate whether the performance of flexible sigmoidoscopy is associated with adverse events in pregnant patients with known or suspected IBD.
II. Methods: A retrospective cohort study was conducted at the University of California San Francisco (UCSF) between April 2008 and April 2019. Female patients aged between 18 and 48 years who were pregnant at the time of endoscopy were identified. All patient records were reviewed to determine disease, pregnancy course, and lifestyle factors. Two independent reviewers performed the data abstraction. Adverse events were assessed for temporal relation (defined as within 4 weeks) with endoscopy. Any discrepancies between the two reviewers' data were reviewed by a third independent investigator. Descriptive statistics of the data were calculated, and comparisons of continuous and categorical data were made using a one-sided Wilcoxon rank-sum test and Fisher's exact test, respectively.
III. Results: We report the outcomes of 48 pregnant patients across all trimesters who underwent lower endoscopy for suspected or established IBD. There were no hospitalizations or adverse obstetric events temporally associated with sigmoidoscopy. 78% of patients experienced a change in treatment following sigmoidoscopy. 12% of the patients with known IBD were found to have no endoscopic evidence of disease activity despite symptoms.
IV. Conclusions: Lower endoscopy in the pregnant patient with known or suspected IBD is low risk and affects therapeutic decision making. It should not be delayed in patients with appropriate indications. | 10.1007/s10620-020-06122-8 | medrxiv
10.1101/19007872 | Predicting youth diabetes risk using NHANES data and machine learning | Vangeepuram, N.; Liu, B.; Chiu, P.-h.; Wang, L.; Pandey, G. | Nita Vangeepuram | Icahn School of Medicine at Mount Sinai | 2019-10-05 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | pediatrics | https://www.medrxiv.org/content/early/2019/10/05/19007872.source.xml | Background: Prediabetes and diabetes mellitus (preDM/DM) have become alarmingly prevalent among youth in recent years. However, simple questionnaire-based screening tools to reliably assess diabetes risk are only available for adults, not youth.
Methods: As a first step in developing such a tool, we used a large-scale dataset from the National Health and Nutritional Examination Survey (NHANES) to examine the performance of a published pediatric clinical screening guideline in identifying youth with preDM/DM based on American Diabetes Association diagnostic biomarkers. We assessed the agreement between the clinical guideline and biomarker criteria using established evaluation measures (sensitivity, specificity, positive/negative predictive value, F-measure for the positive/negative preDM/DM classes, and Kappa). We also compared the performance of the guideline to those of machine learning (ML) based preDM/DM classifiers derived from the NHANES dataset.
Results: Approximately 29% of the 2858 youth in our study population had preDM/DM based on biomarker criteria. The clinical guideline had a sensitivity of 43.1% and specificity of 67.6%, positive/negative predictive values of 35.2%/74.5%, positive/negative F-measures of 38.8%/70.9%, and Kappa of 0.1 (95% CI: 0.06-0.14). The performance of the guideline varied across demographic subgroups. Some ML-based classifiers performed comparably to or better than the screening guideline, especially in identifying preDM/DM youth (p=5.23x10^-5).
Conclusions: We demonstrated that a recommended pediatric clinical screening guideline did not perform well in identifying preDM/DM status among youth. Additional work is needed to develop a simple yet accurate screener for youth diabetes risk, potentially by using advanced ML methods and a wider range of clinical and behavioral health data.
Key Messages:
- As a first step in developing a youth diabetes risk screening tool, we used a large-scale dataset from the National Health and Nutritional Examination Survey (NHANES) to examine the performance of a published pediatric clinical screening guideline in identifying youth with prediabetes/diabetes based on American Diabetes Association diagnostic biomarkers.
- In this cross-sectional study of youth, we found that the screening guideline correctly identified 43.1% of youth with prediabetes/diabetes, the performance of the guideline varied across demographic subgroups, and machine learning based classifiers performed comparably to or better than the screening guideline in identifying youth with prediabetes/diabetes.
- Additional work is needed to develop a simple yet accurate screener for youth diabetes risk, potentially by using advanced ML methods and a wider range of clinical and behavioral health data. | 10.1038/s41598-021-90406-0 | medrxiv
10.1101/19007872 | Estimating youth diabetes risk using NHANES data and machine learning | Vangeepuram, N.; Liu, B.; Chiu, P.-h.; Wang, L.; Pandey, G. | Nita Vangeepuram | Icahn School of Medicine at Mount Sinai | 2020-08-12 | 2 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | pediatrics | https://www.medrxiv.org/content/early/2020/08/12/19007872.source.xml | Background: Prediabetes and diabetes mellitus (preDM/DM) have become alarmingly prevalent among youth in recent years. However, simple questionnaire-based screening tools to reliably assess diabetes risk are only available for adults, not youth.
Methods: As a first step in developing such a tool, we used a large-scale dataset from the National Health and Nutritional Examination Survey (NHANES) to examine the performance of a published pediatric clinical screening guideline in identifying youth with preDM/DM based on American Diabetes Association diagnostic biomarkers. We assessed the agreement between the clinical guideline and biomarker criteria using established evaluation measures (sensitivity, specificity, positive/negative predictive value, F-measure for the positive/negative preDM/DM classes, and Kappa). We also compared the performance of the guideline to those of machine learning (ML) based preDM/DM classifiers derived from the NHANES dataset.
Results: Approximately 29% of the 2858 youth in our study population had preDM/DM based on biomarker criteria. The clinical guideline had a sensitivity of 43.1% and specificity of 67.6%, positive/negative predictive values of 35.2%/74.5%, positive/negative F-measures of 38.8%/70.9%, and Kappa of 0.1 (95% CI: 0.06-0.14). The performance of the guideline varied across demographic subgroups. Some ML-based classifiers performed comparably to or better than the screening guideline, especially in identifying preDM/DM youth (p=5.23x10^-5).
Conclusions: We demonstrated that a recommended pediatric clinical screening guideline did not perform well in identifying preDM/DM status among youth. Additional work is needed to develop a simple yet accurate screener for youth diabetes risk, potentially by using advanced ML methods and a wider range of clinical and behavioral health data.
Key Messages:
- As a first step in developing a youth diabetes risk screening tool, we used a large-scale dataset from the National Health and Nutritional Examination Survey (NHANES) to examine the performance of a published pediatric clinical screening guideline in identifying youth with prediabetes/diabetes based on American Diabetes Association diagnostic biomarkers.
- In this cross-sectional study of youth, we found that the screening guideline correctly identified 43.1% of youth with prediabetes/diabetes, the performance of the guideline varied across demographic subgroups, and machine learning based classifiers performed comparably to or better than the screening guideline in identifying youth with prediabetes/diabetes.
- Additional work is needed to develop a simple yet accurate screener for youth diabetes risk, potentially by using advanced ML methods and a wider range of clinical and behavioral health data. | 10.1038/s41598-021-90406-0 | medrxiv
10.1101/19007377 | Measuring Total Healthcare Utilization Among Intimate Partner Violence Survivors In Primary Care | Logeais, M. E.; Wang, Q.; Renner, L. M.; Clark, C. J. | Mary E Logeais | University of Minnesota | 2019-10-05 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | primary care research | https://www.medrxiv.org/content/early/2019/10/05/19007377.source.xml | Rising health care costs are influenced by health care utilization, which encompasses hospital, ambulatory and non-face-to-face episodes of care. In this study, we created a novel health care utilization-scoring tool that was used to examine whether one psychosocial factor, intimate partner violence (IPV), leads to higher utilization of health care services when controlling for relevant confounders. We sought to fill gaps in knowledge about how social and behavioral issues impact utilization, particularly non-face-to-face episodes of care.
We conducted a retrospective cross-sectional study in 2017 examining patients seen at 11 University-affiliated primary care clinics from January 2015 to December 2016 who were screened for IPV. A total of 31,305 patients were screened, of whom 280 screened positive. We controlled for medical complexity by deriving the revised Charlson Comorbidity Index for each patient. We calculated a novel utilization score, which was a weighted sum of hospital, ambulatory and non-face-to-face encounters. Missed appointments were also measured.
IPV-positive and IPV-negative patients were similar with respect to medical complexity. IPV-positive patients had significantly higher mean utilization scores (54 vs. 40, p<0.001) and more missed appointments (3 vs. 1.3, p<0.001). IPV was associated with increased total utilization (p=0.015), as well as non-face-to-face and ambulatory visits (p=0.025 and p=0.015, respectively) for female patients and was associated with more missed appointments for both males and females (p<0.001).
These data support more inclusive population-specific interventions focusing on social determinants of health to reduce both face-to-face and non-face-to-face utilization, which may improve health care expenditures, outcomes and provider satisfaction. | null | medrxiv |
10.1101/19007633 | PERIPHERAL BLOOD AS TOOL TO DETERMINE GENE EXPRESSION PATTERNS IN PATIENTS WITH PSYCHIATRIC, NEUROLOGICAL AND OTHER COMMON DISORDERS: A SYSTEMATIC REVIEW AND META-ANALYSIS PROTOCOL | Panzenhagen, A. C.; Alves-Teixeira, A.; Wissmann, M. S.; Girardi, C. S.; Santos, L.; Silveira, A. K.; Gelain, D. P.; Moreira, J. C. F. | Alana Castro Panzenhagen | Universidade Federal do Rio Grande do Sul | 2019-10-05 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | psychiatry and clinical psychology | https://www.medrxiv.org/content/early/2019/10/05/19007633.source.xml | Introduction: Common diseases are influenced by a variety of factors that can enhance one person's susceptibility to developing a specific condition. Complex traits have been investigated at several biological levels. One that reflects the high interconnectivity and interaction of genes, proteins and transcription factors is the transcriptome. In this study, we present the protocol for a systematic review and meta-analysis aiming to summarize the available evidence regarding transcriptomic gene expression levels in peripheral blood samples, comparing subjects with psychiatric, neurological and other common disorders to healthy controls.
Methods and analysis: The investigation of transcriptomic levels in peripheral blood offers a unique opportunity to unravel the etiology of common diseases in patients ex vivo. However, the experimental results should be minimally consistent across studies for them to be considered the best approximation of the true effect. In order to test this, we will systematically identify all transcriptome studies that compared subjects with common disorders to their respective control samples. We will apply meta-analyses to assess the overall differentially expressed genes across the studies of each condition.
Ethics and dissemination: The data that will be used to conduct this study are available online and have already been published following their own ethical laws. Therefore, this study requires no further ethical approval. The results of this study will be published in leading peer-reviewed journals of the area and also presented at relevant national and international conferences.
Strengths and limitations of this study: We present a new and systematically centered method to assess the overall effect of transcriptomic levels in the blood of subjects with common conditions.
Meta-analyses are a robust statistical method to assess effect sizes across studies.
The analysis is limited by the availability of studies, as well as their quality and comprehensiveness.
Subgroup and meta-regression analyses will also be limited by the amount and quality of sample characterization variables made available by original studies. | null | medrxiv
10.1101/19008508 | A conceptual model for pluralistic healthcare behavior: results from a qualitative study in southwestern Uganda | Sundararajan, R.; Mwanga-Amumpaire, J.; King, R.; Ware, N. C. | Radhika Sundararajan | Weill Cornell Medicine | 2019-10-05 | 1 | PUBLISHAHEADOFPRINT | cc_no | public and global health | https://www.medrxiv.org/content/early/2019/10/05/19008508.source.xml | Introduction: Medical pluralism, or concurrent utilization of multiple therapeutic modalities, is common in various international contexts, and has been characterized as a factor contributing to poor health outcomes in low-resource settings. Traditional healers are ubiquitous providers in most regions, including the study site of southwestern Uganda. It is not well understood why patients in pluralistic settings continue to engage with both therapeutic healthcare modalities, rather than simply selecting one or the other. The goal of this study was to identify factors that motivate pluralistic healthcare utilization, and to create a general conceptual framework of pluralistic health behavior.
Methods: In-depth interviews were conducted between September 2017 and February 2018 with patients seeking care from traditional healers (N=30) and at an outpatient medicine clinic (N=30) in Mbarara, Uganda; the study is nested within a longitudinal project examining HIV testing engagement among traditional healer-utilizing communities. Inclusion criteria were age ≥18 years and ability to provide informed consent. Participants were recruited from healer practices representing the range of healer specialties. Following an inductive approach, interview transcripts were reviewed and coded to identify conceptual categories explaining healthcare utilization.
Results: We identified three broad categories relevant to healthcare utilization among study participants: 1) traditional healers treat patients with "care"; 2) biomedicine uses "modern" technologies; and 3) peer "testimony" influences healthcare engagement. These categories describe variables at the healthcare provider, healthcare system, and peer levels that interrelate to motivate individual engagement with pluralistic health resources.
Conclusions: Patients perceive clear advantages and disadvantages to biomedical and traditional care in medically pluralistic settings. We identified factors at the healthcare provider, healthcare system, and peer levels which influence patients' therapeutic itineraries. Our findings provide a basis to improve health outcomes in medically pluralistic settings, and underscore the importance of recognizing traditional healers as important stakeholders in community health.
STRENGTHS AND LIMITATIONS OF THIS STUDY
- Medical pluralism is common in both high- and low-resource settings, and has been characterized as a factor leading to poor health outcomes for both infectious and non-communicable diseases
- This study identifies factors that motivate utilization of healthcare in a medically pluralistic community
- Patients in pluralistic settings perceive clear advantages and disadvantages of both traditional care and biomedicine; characteristics of healthcare providers, the healthcare system, and peer influences motivate patients to engage with particular healthcare modalities
- Patients often prefer traditional healing instead of biomedicine; this utilization is not simply a function of limited access to biomedical resources
- Traditional healers should be considered important stakeholders in community health | 10.1136/bmjopen-2019-033410 | medrxiv
10.1101/19007765 | A machine learning approach for precision diagnosis of juvenile-onset SLE | Robinson, G. A.; Peng, J.; Dönnes, P.; Coelewij, L.; Radziszewska, A.; Wincup, C.; Peckham, H.; Isenberg, D. A.; Ioannou, Y.; Ciurtin, C.; Pineda-Torra, I.; Jury, E. C. | Elizabeth C Jury | University College London | 2019-10-05 | 1 | PUBLISHAHEADOFPRINT | cc_no | rheumatology | https://www.medrxiv.org/content/early/2019/10/05/19007765.source.xml | Juvenile-onset systemic lupus erythematosus (JSLE) is an autoimmune rheumatic disease characterised by systemic inflammation and organ damage, with disease onset often coinciding with puberty. JSLE is associated with more severe disease manifestations and a higher mortality rate compared to adult SLE. Due to the heterogeneous clinical and immunological manifestations of JSLE, delayed diagnosis and poor treatment efficacy are major barriers for improving patient outcome. In order to define a unique immunophenotyping profile distinguishing JSLE patients from age-matched healthy controls, immune-based machine learning (ML) approaches were applied. Balanced random forest analysis discriminated JSLE patients from healthy controls with an overall 91% prediction accuracy. The top-ranked immunological features were selected from the optimal ML model and were validated by partial least squares discriminant analysis and logistic regression analysis. Patients could be clustered into four distinct groups based on the top hits from the ML model, providing an opportunity for tailored therapy. Moreover, complex correlations between the JSLE immune profile and clinical features of disease were identified. Further immunological association studies are essential for developing data-driven personalised medicine approaches to aid diagnosis of JSLE for targeted therapy and improved patient outcomes. | null | medrxiv
10.1101/19007955 | PROBLEM BASED LEARNING APPLIED TO PRACTICE IN BRAZILIANS MEDICAL SCHOOL: A MINI SYSTEMATIC REVIEW | Bussolaro, F. A.; Thereza Bussolaro, C. L. | Claudine L Thereza Bussolaro | University of Alberta | 2019-10-05 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | medical education | https://www.medrxiv.org/content/early/2019/10/05/19007955.source.xml | Background: Active learning is a well-established educational methodology in medical schools worldwide, although its implementation in Brazilian clinical settings is quite challenging. The objective of this study is to systematically review the literature and conduct a reflective analysis of how problem-based learning (PBL) has been applied to clinical teaching in medical schools in Brazil.
Material & methods: A systematic literature search was conducted in three databases. A total of 250 papers related to PBL in Brazilian medical schools were identified through the database searches. Four studies were finally selected for the review.
Results: Four fields of medicine were explored in the four selected papers: gynecology/family medicine, medical semiology, psychiatry, and pediatrics. Overall, all the papers reported some level of strategic adaptation of the original PBL methodology for application in Brazilian medical schools' curricula and to characteristics specific to Brazil.
Conclusion: PBL application in Brazilian medical schools requires some level of alteration from the original format, to better adapt to Brazilian students' maturity, health system priorities and the medical labor market. | null | medrxiv
10.1101/19007955 | PROBLEM BASED LEARNING APPLIED TO PRACTICE IN BRAZILIANS MEDICAL SCHOOL: A MINI SYSTEMATIC REVIEW | Bussolaro, F. A.; Thereza Bussolaro, C. L. | Claudine L Thereza Bussolaro | University of Alberta | 2020-01-20 | 2 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | medical education | https://www.medrxiv.org/content/early/2020/01/20/19007955.source.xml | Background: Active learning is a well-established educational methodology in medical schools worldwide, although its implementation in Brazilian clinical settings is quite challenging. The objective of this study is to systematically review the literature and conduct a reflective analysis of how problem-based learning (PBL) has been applied to clinical teaching in medical schools in Brazil.
Material & methods: A systematic literature search was conducted in three databases. A total of 250 papers related to PBL in Brazilian medical schools were identified through the database searches. Four studies were finally selected for the review.
Results: Four fields of medicine were explored in the four selected papers: gynecology/family medicine, medical semiology, psychiatry, and pediatrics. Overall, all the papers reported some level of strategic adaptation of the original PBL methodology for application in Brazilian medical schools' curricula and to characteristics specific to Brazil.
Conclusion: PBL application in Brazilian medical schools requires some level of alteration from the original format, to better adapt to Brazilian students' maturity, health system priorities and the medical labor market. | null | medrxiv
10.1101/19008086 | Forecasting Seizure Risk over Days | Proix, T.; Truccolo, W.; Leguia, M. G.; King-Stephens, D.; Rao, V. R.; Baud, M. O. | Maxime O Baud | University of Bern | 2019-10-05 | 1 | PUBLISHAHEADOFPRINT | cc_no | neurology | https://www.medrxiv.org/content/early/2019/10/05/19008086.source.xml | For persons with epilepsy, much suffering stems from the apparent unpredictability of seizures. Historically, efforts to predict seizures have sought to detect changes in brain activity in the seconds to minutes preceding seizures (pre-ictal period), a timeframe that limits preventative interventions. Recently, converging evidence from studies using chronic intracranial electroencephalography (cEEG) revealed that brain activity in epilepsy has a robust cyclical structure over hours (circadian) and days (multidien). These cycles organize pro-ictal states, hours- to days-long periods of heightened seizure risk, raising the possibility of forecasting seizures over horizons longer than the pre-ictal period. Here, using cEEG from 18 subjects, we developed point-process generalized linear models incorporating cyclical variables at multiple time-scales to show that seizure risk can be forecasted accurately over days in most subjects. Personalized risk-stratification days in advance of seizures is unprecedented and may enable novel preventative strategies. | 10.1016/s1474-4422(20)30396-3 | medrxiv
10.1101/19008490 | Traditional Healers as Client Advocates in the HIV-endemic Region of Maputo, Mozambique: Results from a qualitative study | Sundararajan, R.; Langa, P. V.; Morshed, T.; Manuel, S. | Radhika Sundararajan | Weill Cornell Medicine | 2019-10-05 | 1 | PUBLISHAHEADOFPRINT | cc_no | hiv aids | https://www.medrxiv.org/content/early/2019/10/05/19008490.source.xml | Traditional healers are commonly utilized throughout sub-Saharan Africa instead of, and in concert with, biomedical facilities. Traditional healers are trusted providers and prominent community members, and could be important partners in improving engagement with HIV services in endemic contexts. Our study sought to understand the roles of healers in the urban setting of Maputo, Mozambique, where HIV prevalence is high and testing rates are low. Qualitative data were gathered through minimally-structured interviews with 36 healers. Analysis followed an inductive, grounded theory approach. Data reveal three themes relevant to improving engagement with HIV services in this endemic region: 1) healers have positive attitudes towards biomedicine; 2) healers advocate for their sick clients; and 3) clients are reticent to present to biomedical facilities. Healers describe their roles as cooperative with biomedical providers to provide healthcare for their clients. Results suggest that healers could be considered critical enablers to effective HIV programs in communities. They have social and symbolic capital that positions them to beneficially influence clients, and are natural partners for interventions to improve uptake of HIV services. | 10.1080/17290376.2021.1909492 | medrxiv
10.1101/19006841 | Machine learning prediction of motor response after deep brain stimulation in Parkinson's disease | Habets, J. G.; Duits, A. A.; Sijben, L. C.; De Greef, B.; Mulders, A.; Temel, Y.; Kuijf, M. L.; Kubben, P. L.; Herff, C.; Janssen, M. L. | Jeroen GV Habets | Department of Neurosurgery, School for Mental Health and Neuroscience, Maastricht University, The Netherlands | 2019-10-05 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | neurology | https://www.medrxiv.org/content/early/2019/10/05/19006841.source.xml | Introduction: Despite careful patient selection for subthalamic nucleus deep brain stimulation (STN DBS), some Parkinson's disease patients show limited improvement of motor disability. Non-conclusive results from previous prediction studies maintain the need for a simple tool for neurologists that reliably predicts postoperative motor response for individual patients. Establishing such a prediction tool would help the clinician improve patient counselling, expectation management, and postoperative patient satisfaction. Predictive machine learning models can be used to generate individual outcome predictions instead of correlating pre- and postoperative variables on a group level.
Methods: We developed a machine learning logistic regression prediction model which generates probabilities for experiencing weak motor response one year after surgery. The model analyses preoperative variables and was trained on 90 patients using ten-fold cross-validation. We intentionally chose to leave out pre-, intra- and postoperative imaging and neurophysiology data, to ensure usability in clinical practice. Weak responders (n = 27) were defined as patients who fail to show clinically relevant improvement on the Unified Parkinson Disease Rating Scale (UPDRS) II, III or IV.
Results: The model predicts weak responders with an average area under the receiver operating characteristic curve of 0.88 (standard deviation: 0.14), a true positive rate of 0.85, a false positive rate of 0.25, and a diagnostic accuracy of 78%. The reported influences of the individual preoperative variables are useful for clinical interpretation of the model, but cannot be interpreted in isolation from the other variables in the model.
Conclusion: The very good diagnostic accuracy of the presented prediction model confirms the utility of machine-learning based motor response prediction one year after STN DBS implantation, based on clinical preoperative variables.
After reproduction and validation in a prospective cohort, this prediction model holds tremendous potential as a supportive tool for clinicians during preoperative counseling. | 10.7717/peerj.10317 | medrxiv
10.1101/19006841 | Machine learning prediction of motor response after deep brain stimulation in Parkinson's disease | Habets, J. G.; Duits, A. A.; Sijben, L. C.; De Greef, B.; Mulders, A.; Temel, Y.; Kuijf, M. L.; Kubben, P. L.; Herff, C.; Janssen, M. L. | Jeroen GV Habets | Department of Neurosurgery, School for Mental Health and Neuroscience, Maastricht University, The Netherlands | 2019-10-10 | 2 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | neurology | https://www.medrxiv.org/content/early/2019/10/10/19006841.source.xml | Introduction: Despite careful patient selection for subthalamic nucleus deep brain stimulation (STN DBS), some Parkinson's disease patients show limited improvement of motor disability. Non-conclusive results from previous prediction studies maintain the need for a simple tool for neurologists that reliably predicts postoperative motor response for individual patients. Establishing such a prediction tool would help the clinician improve patient counselling, expectation management, and postoperative patient satisfaction. Predictive machine learning models can be used to generate individual outcome predictions instead of correlating pre- and postoperative variables on a group level.
Methods: We developed a machine learning logistic regression prediction model which generates probabilities for experiencing weak motor response one year after surgery. The model analyses preoperative variables and was trained on 90 patients using ten-fold cross-validation. We intentionally chose to leave out pre-, intra- and postoperative imaging and neurophysiology data, to ensure usability in clinical practice. Weak responders (n = 27) were defined as patients who fail to show clinically relevant improvement on the Unified Parkinson Disease Rating Scale (UPDRS) II, III or IV.
Results: The model predicts weak responders with an average area under the receiver operating characteristic curve of 0.88 (standard deviation: 0.14), a true positive rate of 0.85, a false positive rate of 0.25, and a diagnostic accuracy of 78%. The reported influences of the individual preoperative variables are useful for clinical interpretation of the model, but cannot be interpreted in isolation from the other variables in the model.
Conclusion: The very good diagnostic accuracy of the presented prediction model confirms the utility of machine-learning based motor response prediction one year after STN DBS implantation, based on clinical preoperative variables.
After reproduction and validation in a prospective cohort, this prediction model holds tremendous potential as a supportive tool for clinicians during preoperative counseling. | 10.7717/peerj.10317 | medrxiv
10.1101/19007054 | Computer-assisted craniometric evaluation for diagnosis and follow-up of craniofacial asymmetries: SymMetric v. 1.0 | Alho, E. J. L.; Rondinoni, C.; Furokawa, F. A. O.; Monaco, B. A. | Eduardo Joaquim Lopes Alho | University of Sao Paulo | 2019-10-06 | 1 | PUBLISHAHEADOFPRINT | cc_no | pediatrics | https://www.medrxiv.org/content/early/2019/10/06/19007054.source.xml | PurposeThe current assessment of patients with craniofacial asymmetries is accomplished by physical examination, anamnesis and radiological imaging.
We propose a semi-automated, computer-assisted craniofacial evaluation (SymMetric v 1.0) based on orthogonal photography of the patients head in 3 positions. The system is simple, low-cost, no-radiation or special resources needed. Although it does not substitute CT in cases of doubt between craniosynostosis and positional plagiocephaly, multiple numeric evaluations indicate regional deformities and severity of the asymmetry, which can help in the clinical decision of indicating or not the orthosis in positional deformities, determining treatment duration or evaluating surgical outcomes after correction.
MethodsA Matlab-based tool was developed for digital processing of photographs taken in 3 positions (anterior, superior and lateral). The software guides the user to select visible and reproducible landmarks in each photograph acquisition and calculates multiple indexes and metrics, generating a set of comprehensive plots to offer the user an overview of head and facial symmetry across the orthogonal views. For purposes of demonstration, we evaluated 2 patients (one control and one with non-sinostotic deformity).
ResultsThe results show a clear differentiation between the control and plagiocephalic patient metrics, mainly in the superior view, showing potential for diagnosis of the condition, and also detect the clinical improvement during helmet treatment at follow-up, 3 and 5 months after orthosis use.
ConclusionWe presented a proof-of-concept for a low-cost, radiation-free evaluation system for craniofacial asymmetries that can be useful in a clinical context for diagnosis and follow-up of patients. | 10.1007/s00381-019-04451-2 | medrxiv
10.1101/19007823 | Incidence of Respiratory distress and its predictors among neonates admitted at neonatal intensive care unit, Black Lion Specialized Hospital, Addis Ababa, Ethiopia, 2019. | Aynalem, Y. A.; Asefaw, H. M.; Yirga, T.; Habtewold, T. D.; Sinshaw, A. E.; shiferaw, w. s. | Yared Asmare Aynalem Sr. | Debre Berhan Universty | 2019-10-08 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc | pediatrics | https://www.medrxiv.org/content/early/2019/10/08/19007823.source.xml | BackgroundAlthough respiratory distress is one of the major causes of neonatal morbidity and mortality throughout the globe, it is an even more serious concern in resource-limited nations such as Ethiopia. Moreover, few studies are available from developing countries, and data from different settings are needed to tackle it. Therefore, we aimed to assess the incidence and predictors of respiratory distress among neonates admitted to the neonatal intensive care unit (NICU) at Black Lion Specialized Hospital, Ethiopia.
MethodsAn institution-based retrospective follow-up study was conducted among 571 neonates from January 2013 to March 2018. Data were collected by reviewing patients' charts using a systematic sampling technique with a pretested checklist, entered using Epi-data 4.2 and analyzed with STATA 14. Median time, the Kaplan-Meier survival estimation curve and the log-rank test were computed. Bivariable and multivariable Gompertz parametric hazards models were fitted to detect the determinants of respiratory distress. Hazard ratios with 95% confidence intervals were calculated. Variables with reported p-values < 0.05 were considered statistically significant.
ResultsThe proportion of respiratory distress among neonates admitted to the Black Lion Specialized Hospital neonatal intensive care unit was 42.9% (95%CI: 39.3-46.1%), with an incidence of 8.1/100 (95%CI: 7.3, 8.9). Being male [AHR=2.4 (95%CI: 1.1, 3.1)], birth via caesarean section [AHR: 1.9 (95%CI: 1.6, 2.3)], home delivery [AHR: 2.9 (95%CI: 1.5, 5.2)], maternal diabetes mellitus [AHR: 2.3 (95%CI: 1.4, 3.6)], preterm birth [AHR: 2.9 (95%CI: 1.6, 5.1)] and APGAR score less than 7 [AHR: 3.1 (95%CI: 1.8, 5.0)] were found to be significant predictors of respiratory distress.
ConclusionsThe incidence of respiratory distress among neonates was found to be high. Neonates delivered at home, delivered by caesarean section, born preterm, with an APGAR score <7, or born to diabetic mothers were more likely to develop respiratory distress. All concerned bodies should work on preventing respiratory distress and give special attention to its multifactorial causes; institutional delivery should therefore be further promoted. In addition, strategies need to be established and/or strengthened to prevent respiratory distress among babies with a low APGAR score, preterm babies, babies born to mothers with diabetes mellitus, and those delivered by caesarean section. | 10.1371/journal.pone.0235544 | medrxiv
10.1101/19005967 | Problematic Internet Use in Children and Adolescents: Associations with psychiatric disorders and impairment | Restrepo, A.; Scheininger, T.; Clucas, J.; Alexander, L.; Salum, G.; Georgiades, K.; Paksarian, D.; Merikangas, K.; Milham, M. | Anita Restrepo | Child Mind Institute | 2019-10-08 | 1 | PUBLISHAHEADOFPRINT | cc_by_nd | psychiatry and clinical psychology | https://www.medrxiv.org/content/early/2019/10/08/19005967.source.xml | ObjectiveHere, we leveraged the ongoing, large-scale Child Mind Institute Healthy Brain Network, a transdiagnostic self-referred, community sample of children and adolescents (ages 5-21), to examine the associations between Problematic Internet Use (PIU) and psychopathology, general impairment, physical health and sleep disturbances.
MethodsA total sample of 564 (190 female) participants between the ages of 7-15 (mean = 10.80, SD = 2.16), along with their parents/guardians, completed diagnostic interviews with clinicians, answered a myriad of self-report questionnaires, and underwent physical testing as part of the Healthy Brain Network protocol.
ResultsPIU was positively associated with depressive disorders (aOR = 2.34; CI: 1.18-4.56; p = .01), the combined subtype of ADHD (aOR = 1.79; CI: 1.08-2.98; p = .02), greater levels of impairment (Standardized Beta = 4.79; CI: 3.21-6.37; p < .01) and increased sleep disturbances (Standardized Beta = 3.01; CI: 0.58-5.45; p = .02), even when accounting for demographic covariates and psychiatric comorbidity.
ConclusionThe association between PIU and psychopathology, as well as its impact on impairment and sleep disturbances, highlight the urgent need to gain an understanding of mechanisms in order to inform public health recommendations on internet use in U.S. youth. | 10.1186/s12888-020-02640-x | medrxiv |
10.1101/19007831 | Frontal and cerebellar atrophy supports FTLD-ALS clinical continuum and neuropsychology | Pizzarotti, B.; Palesi, F.; Vitali, P.; Castellazzi, G.; Anzalone, N.; Alvisi, E.; Martinelli, D.; Bernini, S.; Cotta Ramusino, M.; Ceroni, M.; Micieli, G.; Sinforiani, E.; D'Angelo, E.; Costa, A.; Gandini Wheeler-Kingshott, C. A. M. | Beatrice Pizzarotti | University of Pavia, Pavia, Italy | 2019-10-08 | 1 | PUBLISHAHEADOFPRINT | cc_no | neurology | https://www.medrxiv.org/content/early/2019/10/08/19007831.source.xml | BackgroundFrontotemporal Spectrum Disorder (FTSD) and Amyotrophic Lateral Sclerosis (ALS) are neurodegenerative diseases often considered as a continuum from clinical, epidemiologic and genetic perspectives. We used localized brain volume alterations to evaluate common and specific features of FTSD, FTSD-ALS and ALS patients to further understand this clinical continuum.
MethodsWe used voxel-based morphometry on structural MRI images to localize volume alterations in group comparisons: patients (20 FTSD, seven FTSD-ALS, 18 ALS) versus healthy controls (39 CTR), and patient groups between themselves. We used mean whole-brain cortical thickness [Formula] to assess whether its correlations with local brain volume could propose mechanistic explanations of the heterogeneous clinical presentations. We also assessed whether volume reduction can explain cognitive impairment, measured with frontal assessment battery, verbal fluency and semantic fluency.
ResultsCommon (mainly frontal) and specific areas with reduced volume were detected between FTSD, FTSD-ALS and ALS patients, confirming suggestions of a clinical continuum, while at the same time defining morphological specificities for each clinical group (e.g. a difference of cerebral and cerebellar involvement between FTSD and ALS). [Formula] values suggested extensive network disruption in the pathological process, with indications of a correlation between cerebral and cerebellar volumes and [Formula] in ALS. The analysis of the neuropsychological scores indeed pointed towards an important role for the cerebellum, along with fronto-temporal areas, in explaining impairment of executive and linguistic functions.
ConclusionsWe identified common elements that explain the FTSD-ALS clinical continuum, while also identifying specificities of each group, partially explained by different cerebral and cerebellar involvement. | 10.3389/fnagi.2020.593526 | medrxiv |
10.1101/19007831 | Frontal and cerebellar atrophy supports FTSD-ALS clinical continuum | Pizzarotti, B.; Palesi, F.; Vitali, P.; Castellazzi, G.; Anzalone, N.; Alvisi, E.; Martinelli, D.; Bernini, S.; Cotta Ramusino, M.; Ceroni, M.; Micieli, G.; Sinforiani, E.; D'Angelo, E.; Costa, A.; Gandini Wheeler-Kingshott, C. A. M. | Beatrice Pizzarotti | University of Pavia, Pavia, Italy | 2020-07-13 | 2 | PUBLISHAHEADOFPRINT | cc_no | neurology | https://www.medrxiv.org/content/early/2020/07/13/19007831.source.xml | BackgroundFrontotemporal Spectrum Disorder (FTSD) and Amyotrophic Lateral Sclerosis (ALS) are neurodegenerative diseases often considered as a continuum from clinical, epidemiologic and genetic perspectives. We used localized brain volume alterations to evaluate common and specific features of FTSD, FTSD-ALS and ALS patients to further understand this clinical continuum.
MethodsWe used voxel-based morphometry on structural MRI images to localize volume alterations in group comparisons: patients (20 FTSD, seven FTSD-ALS, 18 ALS) versus healthy controls (39 CTR), and patient groups between themselves. We used mean whole-brain cortical thickness [Formula] to assess whether its correlations with local brain volume could propose mechanistic explanations of the heterogeneous clinical presentations. We also assessed whether volume reduction can explain cognitive impairment, measured with frontal assessment battery, verbal fluency and semantic fluency.
ResultsCommon (mainly frontal) and specific areas with reduced volume were detected between FTSD, FTSD-ALS and ALS patients, confirming suggestions of a clinical continuum, while at the same time defining morphological specificities for each clinical group (e.g. a difference of cerebral and cerebellar involvement between FTSD and ALS). [Formula] values suggested extensive network disruption in the pathological process, with indications of a correlation between cerebral and cerebellar volumes and [Formula] in ALS. The analysis of the neuropsychological scores indeed pointed towards an important role for the cerebellum, along with fronto-temporal areas, in explaining impairment of executive and linguistic functions.
ConclusionsWe identified common elements that explain the FTSD-ALS clinical continuum, while also identifying specificities of each group, partially explained by different cerebral and cerebellar involvement. | 10.3389/fnagi.2020.593526 | medrxiv |
10.1101/19007773 | Autoantibodies against the prion protein in individuals with PRNP mutations | Frontzek, K.; Carta, M. C.; Losa, M.; Epskamp, M.; Meisl, G.; Anane, A.; Brandel, J.-P.; Camenisch, U.; Castilla, J.; Haik, S.; Knowles, T.; Lindner, E.; Lutterotti, A.; Minikel, E. V.; Roiter, I.; Safar, J. G.; Sanchez-Valle, R.; Zakova, D.; Hornemann, S.; Aguzzi, A. | Karl Frontzek | University of Zurich | 2019-10-08 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | neurology | https://www.medrxiv.org/content/early/2019/10/08/19007773.source.xml | ObjectiveTo determine whether naturally occurring autoantibodies against the prion protein are present in individuals with genetic prion disease mutations and controls, and if so, whether they are protective against prion disease.
MethodsIn this case-control study, we collected 124 blood samples from individuals with a variety of pathogenic PRNP mutations and 78 control individuals with a positive family history of genetic prion disease but lacking disease-associated PRNP mutations. Antibody reactivity was measured using an indirect ELISA for the detection of human IgG1-4 antibodies against wild-type human prion protein. Multivariate linear regression models were constructed to analyze differences in autoantibody reactivity between a) PRNP mutation carriers versus controls and b) asymptomatic versus symptomatic PRNP mutation carriers. Robustness of results was examined in matched cohorts.
ResultsWe found that antibody reactivity was present in a subset of both PRNP mutation carriers and controls. Autoantibody levels were not influenced by PRNP mutation status nor clinical manifestation of prion disease. Post hoc analyses showed anti-PrPC autoantibody titers to be independent of personal history of autoimmune disease and other immunological disorders, as well as PRNP codon 129 polymorphism.
ConclusionsPathogenic PRNP variants do not notably stimulate antibody-mediated anti-PrPC immunity. Anti-PrPC IgG autoantibodies are not associated with the onset of prion disease. The presence of anti-PrPC autoantibodies in the general population without any disease-specific association suggests that relatively high titers of naturally occurring antibodies are well tolerated. Clinicaltrials.gov identifier NCT02837705. | 10.1212/WNL.0000000000009183 | medrxiv |
10.1101/19008144 | Implementation of nursing process in Ethiopia and its association with working environment and knowledge: a systematic review and meta-analysis | shiferaw, w. s.; Aynalem, Y. A.; yirga, T.; Dargie, A. | wondimeneh shibabaw shiferaw | debre berhan university | 2019-10-08 | 1 | PUBLISHAHEADOFPRINT | cc_no | nursing | https://www.medrxiv.org/content/early/2019/10/08/19008144.source.xml | BackgroundThe nursing process is a scientific problem-solving approach that directs nurses in caring for clients effectively and improves the quality of health care services. In Ethiopia, the national pooled prevalence of implementation of the nursing process remains unknown. Hence, the objective of this systematic review and meta-analysis was to estimate the level of implementation of the nursing process and its association with knowledge and working environment.
MethodsPubMed, Scopus, Cochrane Library, Google Scholar, PsycINFO and CINAHL were systematically searched online to retrieve related articles. The Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guideline was followed. To investigate heterogeneity across the included studies, the I2 test was employed. Publication bias was examined using a funnel plot and Egger's regression test statistic. The random-effect model was fitted to estimate the summary effects and odds ratios (ORs). All statistical analysis was done using STATA version 14 software for Windows.
ResultsSeven studies comprising 1,268 participants were included in this meta-analysis. The estimated pooled prevalence of implementation of the nursing process in Ethiopia was 42.44% (95% CI (36.91, 47.97%)). Based on the subgroup analysis, the highest implementation of the nursing process was observed in studies with a sample size greater than or equal to two hundred, 44.69% (95% CI: 35.34, 54.04). Working in a stressful environment [(OR 0.41, 95%CI (0.08, 2.12)] and having good knowledge about the nursing process [(OR 2.44, 95%CI (0.34, 17.34)] were not significantly associated with the implementation of the nursing process.
ConclusionThe overall implementation of the nursing process in Ethiopia is relatively low. Nurses working in a stressful environment were less likely to implement the nursing process, whereas nurses with good knowledge of the nursing process were more likely to implement it. Therefore, policymakers (FMOH) and other concerned bodies need to give special attention to improving the implementation of the nursing process. | 10.1155/2020/6504893 | medrxiv
10.1101/19008078 | Setting Standards to Promote Artificial Intelligence in Colon Mass Endoscopic Sampling | Zheng, Y.; Su, R.; Wang, W.; Meng, S.; Xiao, H.; Zhang, W.; Xu, H.; Bu, Y.; Zhong, Y.; Zhang, Y.; Qiu, H.; Qin, W.; Zhang, Y.; Xu, W.; Chen, H.; Zhang, C.; Wu, S.; Han, Z.; Zheng, X.; Zhu, H.; Wu, S.; Pan, W.; He, Y.; Hu, Y. | Yiqun Hu | Zhongshan Hospital Xiamen University | 2019-10-08 | 1 | PUBLISHAHEADOFPRINT | cc_no | oncology | https://www.medrxiv.org/content/early/2019/10/08/19008078.source.xml | ObjectiveArtificial intelligence (AI) has undeniable value in the detection, characterization, and monitoring of tumors during cancer imaging. However, major AI explorations in digestive endoscopy have not been systematically planned and, more importantly, most AI products are based on single-center studies (ScSs). ScSs result in data scarcity, redundancy and island effects, which limits the application of AI to endoscopy. We investigate the disadvantages of picture processing that may affect AI detection, and make improvements in AI detection and image-recognition accuracy.
DesignCurrent investigation aggregates a total of 2,500 gastroenteroscopy samples from various hospitals in multiple regions and carries out deep learning.
ResultsWe found that factors not conducive to AI recognition are common, such as: (a) the gastrointestinal tract is not cleaned up completely; (b) shooting angle (the left, right and top of the polyp not clearly exposed), shooting distance (shooting too close or too far leaves the mass unclear), shooting light (insufficient or overexposed light source on the mass) and unstable shooting lead to poor-quality pictures.
ConclusionWe set standards for a multicenter cooperation involving three-level medical institutions at the provincial, municipal and county levels to improve recognition accuracy as well as the efficiency of diagnosis and treatment. | null | medrxiv
10.1101/19008052 | Molecular Epidemiology, Diagnostics and Mechanisms of Antibiotic Resistance in Mycobacterium tuberculosis complex in Africa: A Systematic Review of Current Reports | Osei Sekyere, J.; Reta, M. A.; Maningi, N. E.; Fourie, P. B. | John Osei Sekyere | University of Pretoria | 2019-10-08 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | infectious diseases | https://www.medrxiv.org/content/early/2019/10/08/19008052.source.xml | BackgroundTuberculosis (TB) remains a major global public health problem. However, a systematic review of TB resistance epidemiology in Africa is lacking.
MethodsA comprehensive systematic search of PubMed, Web of Science and ScienceDirect for English research articles reporting on the molecular epidemiology of Mycobacterium tuberculosis complex resistance in Africa from January 2007 to December 2018 was undertaken.
Results and conclusionQualitative and quantitative synthesis were respectively undertaken with 232 and 186 included articles, representing 32 countries. TB monoresistance rate was highest for isoniazid (59%) and rifampicin (27%), particularly in Zimbabwe (100%), Swaziland (100%), and Sudan (67.9%) whilst multidrug resistance (MDR) rate was substantial in Zimbabwe (100%), Sudan (34.6%), Ivory Coast (24.5%) and Ethiopia (23.9%). Resistance-conferring mutations were commonly found in katG (n=3694), rpoB (n=3591), rrs (n=1272), inhA (n=1065), pncA (n=1063) and embB (n=705) in almost all included countries: S315G/I/N/R/T, V473D/F/G/I, Q471H/Q/R/Y, S303C/L etc. in katG; S531A/F/S/G, H526A/C/D/G, D516A/E/G etc. in rpoB; A1401G, A513C etc. in rrs; C15T, G17A/T, -A16G etc. in inhA; Ins456C, Ins 172G, L172P, C14R, Ins515G etc in pncA. Commonest lineages and families such as T (n=8139), LAM (n=5243), Beijing (n=5471), Cameroon (n=3315), CAS (n=2021), H (n=1773) etc., with the exception of T, were not fairly distributed; Beijing, Cameroon and CAS were prevalent in South Africa (n=4964), Ghana (n=2306), and Ethiopia/Tanzania (n=799/635) respectively. Resistance mutations were not lineage-specific and sputum (96.2%) were mainly used for diagnosing TB resistance using the LPA (38.5%), GeneXpert (17.2%), whole-genome sequencing (12.3%) and PCR/amplicon sequencing (9%/23%). Intercountry spread of strains were limited while intra-country dissemination was common. TB resistance and its diagnosis remain a major threat in Africa, necessitating urgent action to contain this global menace. | 10.1016/j.jinf.2019.10.006 | medrxiv |
10.1101/19007278 | Epidemiology and predictors of repeat positive chlamydia tests: the Brant County cohort, Ontario, Canada | Santos, J.; Babayan, A.; Huang, M.; Jolly, A. | Ann Jolly | University of Ottawa | 2019-10-08 | 1 | PUBLISHAHEADOFPRINT | cc_by_nd | infectious diseases | https://www.medrxiv.org/content/early/2019/10/08/19007278.source.xml | ObjectivesRepeat positive tests for chlamydia (CR) may help explain current high rates of chlamydia despite years of screening, partner notification and treatment to reduce sequelae. We wanted to determine the numbers of CRs over time as a proportion of all chlamydia cases, and define the differences in demographic, clinical, behavioural, and public health management indicators, between individuals who have experienced a CR and individuals who experienced a single infection in Brant County, Ontario.
MethodsA retrospective cohort was developed using notifiable disease data extracted from the integrated public health system. Cases were laboratory confirmed chlamydia and gonorrhea infections in Brant County between January 1st, 2006 and December 31st, 2015. During the study period, 3,499 chlamydia cases and 475 gonorrhea cases were diagnosed. The total number of individuals with chlamydia in that period was 3,060, including 157 coinfections with gonorrhea. Differences between those with reinfection and those with single infection were evaluated using univariate and multivariate (Cox proportional hazards model) methods.
ResultsFour hundred and ninety-nine (16.30%) individuals experienced a CR more than 28 days from initial infection, of which 328 (65.73%) occurred within 2 years and 211 (42.28%) within 1 year. The median time to CR was 276 days, consistent with existing Canadian literature. Independent risk factors for CR included being male, being 25 years old or younger, and not receiving recommended treatment for the initial and/or subsequent infection.
ConclusionsThese findings suggest that inadequate treatment plays a significant role in CR, after accounting for young age and male gender, likely due to untreated sex partners.
Key Messages: Sixteen percent of people experienced a second positive chlamydia test more than 28 days after their initial positive test in a cohort of 3,499 patients.
Those who had a second positive test were more likely to be male, younger than 25 and had not received recommended antimicrobials.
Confirmation of any kind of partner notification was missing in 88% of records. | null | medrxiv
10.1101/19007864 | Not all animals are equal - farm living and allergy in Upper Bavaria | Wjst, M. | Matthias Wjst | Helmholtz Zentrum Muenchen | 2019-10-08 | 1 | PUBLISHAHEADOFPRINT | cc_no | allergy and immunology | https://www.medrxiv.org/content/early/2019/10/08/19007864.source.xml | BackgroundA lower allergy and asthma prevalence in farm children was described three decades ago in Switzerland.
ObjectiveAfter years of research into bacterial exposure at farms, the origin of the farm effect is still unknown. We now hypothesize that there is no such effect in large industrial cattle farms with slatted floors indoors, but only in small farms where animals graze outdoors and carry a higher endoparasite load.
MethodsWe re-analyze an earlier epidemiological study by record linkage to later agricultural surveys. The Asthma and Allergy Study in 1989/90 was a cross-sectional study of 1714 ten-year-old children in 63 villages covering ten different districts of Upper Bavaria. The farm effect is defined here as the association of the number of cows per villager with the lifetime prevalence of allergic rhinitis in the children of that village.
ResultsThe farm effect is restricted to small villages only. Furthermore, districts with higher Fasciola infection rates in cows show a significantly stronger farm effect than districts with lower infection rates.
ConclusionsThe results warrant further research into human immune response to endoparasites in livestock. | null | medrxiv |
10.1101/19007005 | Does e-cigarette use in non-smoking young adults act as a gateway to smoking? A systematic review and meta-analysis | Khouja, J. N.; Suddell, S. F.; Peters, S. E.; Taylor, A. E.; Munafo, M. R. | Jasmine N Khouja | University of Bristol | 2019-10-08 | 1 | PUBLISHAHEADOFPRINT | cc_by | epidemiology | https://www.medrxiv.org/content/early/2019/10/08/19007005.source.xml | ObjectiveThe aim of this review was to investigate whether e-cigarette use compared to non-use in young non-smokers is associated with subsequent cigarette smoking.
Data sourcesPubMed, Embase, Web of Science, Wiley Cochrane Library databases, and the 2018 Society for Research on Nicotine and Tobacco and Society for Behavioural Medicine conference abstracts.
Study selectionAll studies of young people (up to age 30 years) with a measure of e-cigarette use prior to smoking and an outcome measure of smoking where an odds ratio could be calculated were included (excluding reviews and animal studies).
Data ExtractionIndependent extraction was completed by multiple authors using a pre-prepared extraction form.
Data synthesisOf 9,199 results, 17 studies were included in the meta-analysis. There was strong evidence for an association between e-cigarette use among non-smokers and later smoking (OR 4.59, 95% CI 3.60 to 5.85) when the results were meta-analysed in a random effects model. However, there was high heterogeneity (I2 = 88%).
ConclusionsWhilst the association between e-cigarette use among non-smokers and subsequent smoking appears strong, the available evidence is limited by the reliance on self-report measures of smoking history without biochemical verification. None of the studies included negative controls which would provide stronger evidence for whether the association may be causal. Much of the evidence also failed to consider the nicotine content of e-liquids used by non-smokers meaning it is difficult to make conclusions about whether nicotine is the mechanism driving this association. | 10.1136/tobaccocontrol-2019-055433 | medrxiv |
10.1101/19007005 | Is e-cigarette use in non-smoking young adults associated with later smoking? A systematic review and meta-analysis | Khouja, J. N.; Suddell, S. F.; Peters, S. E.; Taylor, A. E.; Munafo, M. R. | Jasmine N Khouja | University of Bristol | 2020-01-02 | 2 | PUBLISHAHEADOFPRINT | cc_by | epidemiology | https://www.medrxiv.org/content/early/2020/01/02/19007005.source.xml | ObjectiveThe aim of this review was to investigate whether e-cigarette use compared to non-use in young non-smokers is associated with subsequent cigarette smoking.
Data sourcesPubMed, Embase, Web of Science, Wiley Cochrane Library databases, and the 2018 Society for Research on Nicotine and Tobacco and Society for Behavioural Medicine conference abstracts.
Study selectionAll studies of young people (up to age 30 years) with a measure of e-cigarette use prior to smoking and an outcome measure of smoking where an odds ratio could be calculated were included (excluding reviews and animal studies).
Data ExtractionIndependent extraction was completed by multiple authors using a pre-prepared extraction form.
Data synthesisOf 9,199 results, 17 studies were included in the meta-analysis. There was strong evidence for an association between e-cigarette use among non-smokers and later smoking (OR 4.59, 95% CI 3.60 to 5.85) when the results were meta-analysed in a random effects model. However, there was high heterogeneity (I2 = 88%).
ConclusionsWhilst the association between e-cigarette use among non-smokers and subsequent smoking appears strong, the available evidence is limited by the reliance on self-report measures of smoking history without biochemical verification. None of the studies included negative controls which would provide stronger evidence for whether the association may be causal. Much of the evidence also failed to consider the nicotine content of e-liquids used by non-smokers meaning it is difficult to make conclusions about whether nicotine is the mechanism driving this association. | 10.1136/tobaccocontrol-2019-055433 | medrxiv |
10.1101/19008755 | Clinical Impact of Metagenomic Next-Generation Sequencing of Plasma Cell-Free DNA for the Diagnosis of Infectious Diseases: A Multicenter Retrospective Cohort Study | Hogan, C. A.; Yang, S.; Garner, O. B.; Green, D. A.; Gomez, C. A.; Dien Bard, J.; Pinsky, B. A.; Banaei, N. | Niaz Banaei | Stanford University | 2019-10-08 | 1 | PUBLISHAHEADOFPRINT | cc_no | pathology | https://www.medrxiv.org/content/early/2019/10/08/19008755.source.xml | BackgroundMetagenomic next-generation sequencing (mNGS) of plasma cell-free DNA has emerged as an attractive diagnostic modality allowing broad-range pathogen detection, noninvasive sampling, and earlier diagnosis. However, little is known about its real-world clinical impact as used in routine practice.
MethodsWe performed a retrospective cohort study of all patients for whom plasma mNGS (Karius test) was performed for all indications at 5 U.S. institutions over 1.5 years. Comprehensive chart review was performed, and a standardized assessment of the clinical impact of the mNGS, based on the treating team's interpretation of Karius results and patient management, was established.
ResultsA total of 82 Karius tests were evaluated, from 39 (47.6%) adults and 43 (52.4%) children and a total of 53 (64.6%) immunocompromised patients. Karius positivity rate was 50/82 (61.0%), with 24 (48.0%) showing two or more organisms (range, 2-8). The Karius test results led to positive impact in 6 (7.3%), negative impact in 3 (3.7%), no impact in 70 (85.4%), and was indeterminate in 3 (3.7%). Cases with positive Karius result and clinical impact involved bacteria and/or fungi but not DNA viruses or parasites. In 10 patients who underwent 16 additional repeated tests, only one was associated with clinical impact.
ConclusionsThe real-world impact of the Karius test as currently used in routine clinical practice is limited. Further studies are needed to identify high-yield patient populations, define the complementary role of mNGS to conventional microbiological methods, and how best to integrate mNGS into current testing algorithms.
SummaryIn a multicenter retrospective cohort study, we show that the real-world clinical impact of plasma metagenomic next-generation sequencing (mNGS) for the noninvasive diagnosis of infections is limited (positive impact 7.3%). Further studies are needed to optimize the impact of mNGS. | 10.1093/cid/ciaa035 | medrxiv |
10.1101/19008300 | Diagnosing Cornelia de Lange syndrome and related neurodevelopmental disorders using RNA-sequencing | Rentas, S.; Rathi, K.; Kaur, M.; Raman, P.; Krantz, I.; Sarmady, M.; Abou Tayoun, A. | Ahmad Abou Tayoun | Al Jalila Children\'s Specialty Hospital | 2019-10-08 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | genetic and genomic medicine | https://www.medrxiv.org/content/early/2019/10/08/19008300.source.xml | PurposeNeurodevelopmental phenotypes represent major indications for children undergoing clinical exome sequencing. However, 50% of cases remain undiagnosed even upon exome reanalysis. Here we show RNA sequencing (RNA-seq) on human B lymphoblastoid cell lines (LCL) is highly suitable for neurodevelopmental Mendelian gene testing and demonstrate the utility of this approach in suspected cases of Cornelia de Lange syndrome (CdLS).
MethodsGenotype-Tissue Expression project transcriptome data for LCL, blood, and brain was assessed for neurodevelopmental Mendelian gene expression. Detection of abnormal splicing and pathogenic variants in these genes was performed with a novel RNA-seq diagnostic pipeline and using a validation CdLS-LCL cohort (n=10) and test cohort of patients who carry a clinical diagnosis of CdLS but negative genetic testing (n=5).
ResultsLCLs share isoform diversity of brain tissue for a large subset of neurodevelopmental genes and express 1.8-fold more of these genes compared to blood (LCL, n=1706; whole blood, n=917). This enables testing of over 1000 genetic syndromes. The RNA-seq pipeline had 90% sensitivity for detecting pathogenic events and revealed novel diagnoses such as abnormal splice products in NIPBL and pathogenic coding variants in BRD4 and ANKRD11.
ConclusionThe LCL transcriptome enables robust frontline and/or reflexive diagnostic testing for neurodevelopmental disorders. | 10.1038/s41436-019-0741-5 | medrxiv |
10.1101/19008151 | Effect of a simple exercise programme on hospitalisation-associated disability in older patients: a randomised controlled trial. | Ortiz-Alonso, J.; Bustamante-Ara, N.; Valenzuela, P. L.; Vidan, M. T.; Rodriguez-Romo, G.; Mayordomo-Cava, J.; Javier-Gonzalez, M.; Hidalgo-Gamarra, M.; Lopez-Tatis, M.; Valades Malagon, M. I.; Santos-Lozano, A.; Lucia, A.; Serra-Rexach, J. A. | Pedro L. Valenzuela Sr. | University of Alcala | 2019-10-08 | 1 | PUBLISHAHEADOFPRINT | cc_no | geriatric medicine | https://www.medrxiv.org/content/early/2019/10/08/19008151.source.xml | ObjectiveHospitalisation-associated disability (HAD, defined as the loss of ability to perform one or more basic activities of daily living [ADL] independently at discharge) is a frequent condition among older patients. The present study aimed to assess whether a simple inpatient exercise programme decreases the incidence of HAD in acutely hospitalised very old patients.
DesignIn this randomized controlled trial (Activity in GEriatric acute CARe, AGECAR) participants were assigned to a control or intervention (exercise) group, and were assessed at baseline, admission, discharge, and 3 months thereafter.
Setting and participants268 patients (mean age 88 years, range 75-102) admitted to an acute care for elders (ACE) unit of a Public Hospital were randomized to a control (n=125) or intervention (exercise) group (n=143).
MethodsBoth groups received usual care, and patients in the intervention group also performed simple supervised exercises (walking and rising from a chair, for a total daily duration of ~20 min). We measured incident HAD at discharge and after 3 months (primary outcome); and Short Physical Performance Battery (SPPB), ambulatory capacity, number of falls, re-hospitalisation and death during a 3-month follow-up (secondary outcomes).
ResultsMedian duration of hospitalisation was 7 days (interquartile range 4 days). Compared with admission, the intervention group had a lower risk of HAD than controls at discharge (odds ratio [OR]: 0.32; 95% confidence interval [CI]: 0.11-0.92) and at 3-month follow-up (OR: 0.24; 95% CI: 0.08-0.74). No intervention effect was noted for the other secondary endpoints (all p>0.05), although a trend towards a lower mortality risk was observed in the intervention group (p=0.078).
Conclusion and implicationsThese findings demonstrate that a simple inpatient exercise programme significantly decreases the risk of HAD in acutely hospitalised, very old patients.
Trial registrationNCT01374893 (https://clinicaltrials.gov/ct2/show/NCT01374893).
Brief summaryA simple inpatient intervention consisting of walking and rising from a chair (~20 minutes/day) considerably decreases the risk of hospitalisation-associated disability in acutely hospitalised older patients. | 10.1016/j.jamda.2019.11.027 | medrxiv |
10.1101/19008185 | DyeVer PLUS EZ system for Preventing Contrast-Induced Acute Kidney Injury in Patients Undergoing Diagnostic Coronary Angiography and/or Percutaneous Coronary Intervention: A UK-Based Cost-Utility Analysis | javanbakht, M.; Rezaie Hemami, M.; Mashayekhi, A.; Branagan-Harris, M. | Mehdi javanbakht | Optimax Access | 2019-10-08 | 1 | PUBLISHAHEADOFPRINT | cc_no | health economics | https://www.medrxiv.org/content/early/2019/10/08/19008185.source.xml | BackgroundContrast-induced acute kidney injury (CI-AKI) is a complication commonly associated with invasive angiographic procedures and is considered the leading cause of hospital-acquired acute kidney injury. CI-AKI can lead to a prolonged hospital stay, with a substantial economic impact, and increased mortality. The DyeVert PLUS EZ system (FDA approved and CE marked) is a device that has been developed to divert a portion of the theoretical injected contrast media volume (CMV), reducing the overall injected contrast media and aortic reflux and potentially improving long-term health outcomes.
ObjectivesTo assess the long-term costs and health outcomes associated with the introduction of the DyeVert PLUS EZ system into the health care service for the prevention of CI-AKI in a cohort of patients with chronic kidney disease (CKD) stage 3-4 undergoing Diagnostic Coronary Angiography (DAG) and/or Percutaneous Coronary intervention (PCI), compared with current practice.
MethodsA de novo economic model was developed based on the current pathway of managing patients undergoing DAG and/or PCI and on evidence related to the clinical effectiveness of DyeVert, in terms of its impact on relevant clinical outcomes and health service resource use. Clinical data used to populate the model were derived from the literature or were based on assumptions informed by expert clinical input. Costs included in the model were obtained from the literature and UK-based routine sources. Probabilistic distributions were assigned to the majority of model parameters so that a probabilistic analysis could be undertaken, while deterministic sensitivity analyses were also carried out to explore the impact of key parameter variation on the model results.
ResultsBase-case results indicate that the intervention leads to cost savings (-£3,878) and improved effectiveness (+0.02 QALYs) over the patient's lifetime, compared with current practice. Output from the probabilistic analysis supports the high likelihood of the intervention being cost-effective across presented willingness-to-pay (WTP) thresholds. The overall long-term cost saving for the NHS associated with introduction of the intervention for each cohort of patients is over £175 million. The cost savings are mainly driven by lower risk of subsequent diseases and associated costs.
ConclusionsIntroduction of the DyeVert PLUS EZ system has the potential to reduce costs for the health care service and lead to improved clinical outcomes for patients with CKD stage 3-4 undergoing angiographic procedures.
Key Points for Decision Makers
- An economic model has been developed to consider the cost-effectiveness of the DyeVert PLUS EZ system for use amongst patients undergoing angiographic procedures.
- Results of the economic analysis indicate that the DyeVert PLUS EZ system is highly likely to be cost saving and result in improved patient outcomes. | 10.1007/s41669-020-00195-x | medrxiv |
10.1101/19008664 | Surgical Intervention for Paediatric Infusion-Related Extravasation Injury: A Systematic Review | Little, M.; Dupre, S.; Wormald, J.; Gale, C.; Gardiner, M.; Jain, A. | Chris Gale | Neonatal Medicine, Department of Medicine, Imperial College London, Chelsea and Westminster Campus, London, UK, SW10 9NH | 2019-10-08 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | pediatrics | https://www.medrxiv.org/content/early/2019/10/08/19008664.source.xml | ObjectivesThis systematic review aims to assess the quality of literature supporting surgical interventions for paediatric extravasation injury and to determine and summarize their outcomes.
MethodsWe performed a systematic review by searching Ovid MEDLINE and EMBASE as well as AMED, CINAHL, Cochrane Central Register of Controlled Trials (CENTRAL), Cochrane Database of Systematic Reviews and clinicaltrials.gov from inception to February 2019. All studies other than case reports were eligible for inclusion if the population was younger than 18 years old, there was a surgical intervention aimed at treating extravasation injury and they reported on outcomes. Risk of bias was graded according to the National Institutes of Health (NIH) study quality assessment tools.
Results26 studies involving 728 children were included - one before-and-after study and 25 case series. Extravasation injuries were mainly confined to skin and subcutaneous tissues but severe complications were also encountered, including amputation (one toe and one below elbow). Of the surgical treatments described, the technique of multiple puncture wounds and instillation of saline and/or hyaluronidase was the most commonly used. However, there were no studies in which its effectiveness was tested against another treatment or a control and details of functional and aesthetic outcomes were generally lacking.
ConclusionThere is a lack of high quality evidence to support treatment of extravasation injury in children. A definitive trial of extravasation injuries, or a centralized extravasation register using a universal grading scheme and core outcome set with adequate follow-up, are required to provide evidence to guide clinician decision-making.
Strengths and Limitations
- A systematic review was performed according to PRISMA guidelines and registered on PROSPERO.
- Two authors used a bespoke inclusion/exclusion form to independently assess study eligibility.
- Studies were eligible for inclusion if the population was younger than 18 years old, if there was a surgical intervention aimed at treating extravasation injury in any setting and if they reported on short- or long-term outcomes.
- Two researchers also independently assessed the included studies' risk of methodological bias using the National Institutes of Health (NIH) study quality assessment tools.
- 18 years old may represent a relatively arbitrary cut-off age to differentiate between paediatric and adult in terms of extravasation injury. | 10.1136/bmjopen-2019-034950 | medrxiv |
10.1101/19008664 | Surgical Intervention for Paediatric Infusion-Related Extravasation Injury: A Systematic Review | Little, M.; Dupre, S.; Wormald, J.; Gardiner, M.; Gale, C.; Jain, A. | Chris Gale | Neonatal Medicine, Department of Medicine, Imperial College London, Chelsea and Westminster Campus, London, UK, SW10 9NH | 2019-10-10 | 2 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | pediatrics | https://www.medrxiv.org/content/early/2019/10/10/19008664.source.xml | ObjectivesThis systematic review aims to assess the quality of literature supporting surgical interventions for paediatric extravasation injury and to determine and summarize their outcomes.
MethodsWe performed a systematic review by searching Ovid MEDLINE and EMBASE as well as AMED, CINAHL, Cochrane Central Register of Controlled Trials (CENTRAL), Cochrane Database of Systematic Reviews and clinicaltrials.gov from inception to February 2019. All studies other than case reports were eligible for inclusion if the population was younger than 18 years old, there was a surgical intervention aimed at treating extravasation injury and they reported on outcomes. Risk of bias was graded according to the National Institutes of Health (NIH) study quality assessment tools.
Results26 studies involving 728 children were included - one before-and-after study and 25 case series. Extravasation injuries were mainly confined to skin and subcutaneous tissues but severe complications were also encountered, including amputation (one toe and one below elbow). Of the surgical treatments described, the technique of multiple puncture wounds and instillation of saline and/or hyaluronidase was the most commonly used. However, there were no studies in which its effectiveness was tested against another treatment or a control and details of functional and aesthetic outcomes were generally lacking.
ConclusionThere is a lack of high quality evidence to support treatment of extravasation injury in children. A definitive trial of extravasation injuries, or a centralized extravasation register using a universal grading scheme and core outcome set with adequate follow-up, are required to provide evidence to guide clinician decision-making.
Strengths and Limitations
- A systematic review was performed according to PRISMA guidelines and registered on PROSPERO.
- Two authors used a bespoke inclusion/exclusion form to independently assess study eligibility.
- Studies were eligible for inclusion if the population was younger than 18 years old, if there was a surgical intervention aimed at treating extravasation injury in any setting and if they reported on short- or long-term outcomes.
- Two researchers also independently assessed the included studies' risk of methodological bias using the National Institutes of Health (NIH) study quality assessment tools.
- 18 years old may represent a relatively arbitrary cut-off age to differentiate between paediatric and adult in terms of extravasation injury. | 10.1136/bmjopen-2019-034950 | medrxiv |
10.1101/19008268 | Distance to white matter trajectories is associated with treatment response to internal capsule deep brain stimulation in treatment-resistant depression | Liebrand, L. C.; Natarajan, S. J.; Caan, M. W. A.; Schuurman, P. R.; van den Munckhof, P.; de Kwaasteniet, B.; Luigjes, J.; Bergfeld, I. O.; Denys, D.; van Wingen, G. A. | Luka C. Liebrand | Dept. of Psychiatry, Amsterdam UMC | 2019-10-08 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | psychiatry and clinical psychology | https://www.medrxiv.org/content/early/2019/10/08/19008268.source.xml | ObjectiveDeep brain stimulation (DBS) is an innovative treatment for treatment-resistant depression. DBS is usually targeted at specific anatomical landmarks, with patients responding to DBS in approximately 50% of cases. Attention has recently shifted to white matter tracts to explain DBS response, with initial open-label trials targeting white matter tracts yielding much higher response rates (>70%).
MethodsWe associated distance to individual white matter tracts around the stimulation target in the ventral anterior limb of the internal capsule to treatment response. We performed diffusion magnetic resonance tractography of the superolateral branch of the medial forebrain bundle and the anterior thalamic radiation in fourteen patients that participated in our randomized clinical trial. We combined the tract reconstructions with the postoperative images to identify the DBS leads and estimated the distance between tracts and leads, which we subsequently associated with treatment response.
ResultsStimulation closer to both tracts was significantly correlated with a larger symptom decrease (r=0.61, p=0.02), suggesting that stimulation more proximal to the tracts was beneficial. There was no difference in lead placement with respect to anatomical landmarks, which could mean that differences in treatment response were driven by individual differences in white matter anatomy.
ConclusionsOur results suggest that deep brain stimulation of the ventral anterior limb of the internal capsule could benefit from targeting white matter bundles. We recommend acquiring diffusion magnetic resonance data for each individual patient. | 10.1016/j.nicl.2020.102363 | medrxiv |
10.1101/19008367 | Identification of newborns at risk for autism using electronic medical records and machine learning | Rahman, R.; Kodesh, A.; Levine, S. Z.; Sandin, S.; Reichenberg, A.; Schlessinger, A. | Avner Schlessinger | Icahn School of Medicine at Mount Sinai | 2019-10-08 | 1 | PUBLISHAHEADOFPRINT | cc_no | psychiatry and clinical psychology | https://www.medrxiv.org/content/early/2019/10/08/19008367.source.xml | ImportanceCurrent approaches for early identification of individuals at high risk for autism spectrum disorder (ASD) in the general population are limited, where most ASD patients are not identified until after the age of 4. This is despite substantial evidence suggesting that early diagnosis and intervention improves developmental course and outcome.
ObjectiveDevelop a machine learning (ML) method predicting the diagnosis of ASD in offspring in a general population sample, using parental electronic medical records (EMR) available before childbirth.
DesignPrognostic study of EMR data within a single Israeli health maintenance organization, for the parents of 1,397 ASD children (ICD-9/10) and 94,741 non-ASD children born from January 1st, 1997 through December 31st, 2008. The complete EMR record of the parents was used to develop various ML models to predict the risk of having a child with ASD.
Main outcomes and measuresRoutinely available parental sociodemographic information, medical histories and prescribed medications data until offspring's birth were used to generate features to train various machine learning algorithms, including multivariate logistic regression, artificial neural networks, and random forest. Prediction performance was evaluated with 10-fold cross validation, by computing C statistics, sensitivity, specificity, accuracy, false positive rate, and precision (positive predictive value, PPV).
ResultsAll ML models tested had similar performance, achieving an average C statistics of 0.70, sensitivity of 28.63%, specificity of 98.62%, accuracy of 96.05%, false positive rate of 1.37%, and positive predictive value of 45.85% for predicting ASD in this dataset.
Conclusion and relevanceML algorithms combined with EMR capture early life ASD risk. Such approaches may be able to enhance the ability for accurate and efficient early detection of ASD in large populations of children.
Key points
QuestionCan autism risk in children be predicted using the pre-birth electronic medical record (EMR) of the parents?
FindingsIn this population-based study that included 1,397 children with autism spectrum disorder (ASD) and 94,741 non-ASD children, we developed a machine learning classifier for predicting the likelihood of childhood diagnosis of ASD with an average C statistic of 0.70, sensitivity of 28.63%, specificity of 98.62%, accuracy of 96.05%, false positive rate of 1.37%, and positive predictive value of 45.85%.
MeaningThe results presented serve as a proof-of-principle of the potential utility of EMR for the identification of a large proportion of future children at a high-risk of ASD. | 10.1192/j.eurpsy.2020.17 | medrxiv |
10.1101/19008250 | The Causal Effects of Health Conditions and Risk Factors on Social and Socioeconomic Outcomes: Mendelian Randomization in UK Biobank | Harrison, S.; Davies, A. R.; Dickson, M.; Tyrrell, J.; Green, M. J.; Katikireddi, S. V.; Campbell, D.; Munafo, M.; Dixon, P.; Jones, H. E.; Rice, F.; Davies, N. M.; Howe, L. D. | Sean Harrison | MRC Integrative Epidemiology Unit (IEU), Population Health Sciences, Bristol Medical School, University of Bristol | 2019-10-08 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | public and global health | https://www.medrxiv.org/content/early/2019/10/08/19008250.source.xml | ObjectivesTo estimate the causal effect of health conditions and risk factors on social and socioeconomic outcomes in UK Biobank. Evidence on socioeconomic impacts is important to understand because it can help governments, policy-makers and decision-makers allocate resources efficiently and effectively.
DesignWe used Mendelian randomization to estimate the causal effects of eight health conditions (asthma, breast cancer, coronary heart disease, depression, eczema, migraine, osteoarthritis, type 2 diabetes) and five health risk factors (alcohol intake, body mass index [BMI], cholesterol, systolic blood pressure, smoking) on 19 social and socioeconomic outcomes.
SettingUK Biobank.
Participants337,009 men and women of white British ancestry, aged between 39 and 72 years.
Main outcome measuresAnnual household income, employment, deprivation (measured by the Townsend deprivation index [TDI]), degree level education, happiness, loneliness, and 13 other social and socioeconomic outcomes.
ResultsResults suggested that BMI, smoking and alcohol intake affect many socioeconomic outcomes. For example, smoking was estimated to reduce household income (mean difference = -£24,394, 95% confidence interval (CI): -£33,403 to -£15,384), the chance of owning accommodation (absolute percentage change [APC] = -21.5%, 95% CI: -29.3% to -13.6%), being satisfied with health (APC = -32.4%, 95% CI: -48.9% to -15.8%), and of obtaining a university degree (APC = -73.8%, 95% CI: -90.7% to -56.9%), while also increasing deprivation (mean difference in TDI = 1.89, 95% CI: 1.13 to 2.64, approximately 236% of a decile of TDI). There was evidence that asthma increased deprivation and decreased both household income and the chance of obtaining a university degree, and migraine reduced the chance of having a weekly leisure or social activity, especially in men. For other associations, estimates were null.
ConclusionsHigher BMI, alcohol intake and smoking were all estimated to adversely affect multiple social and socioeconomic outcomes. Effects were not detected between health conditions and socioeconomic outcomes using Mendelian randomization, with the exceptions of depression, asthma and migraines. This may reflect true null associations, selection bias given the relative health and age of participants in UK Biobank, and/or lack of power to detect effects.
What is known?
- Studies have shown associations between poor health and adverse social (e.g. wellbeing, social contact) and socioeconomic (e.g. educational attainment, income, employment) outcomes, but there is also strong evidence that social and socioeconomic factors influence health.
- These bidirectional relationships make it difficult to establish whether health conditions and health risk factors have causal effects on social and socioeconomic outcomes.
- Mendelian randomization is a technique that uses genetic variants robustly related to an exposure of interest (here, health conditions and risk factors for poor health) as a proxy for the exposure.
- Since genetic variants are randomly allocated at conception, they tend to be unrelated to the factors that typically confound observational studies, and are less likely to suffer from reverse causality, making causal inference from Mendelian randomization analyses more plausible.
What this study adds
- This study suggests causal effects of higher BMI, smoking and alcohol use on a range of social and socioeconomic outcomes, implying that population-level improvements in these risk factors may, in addition to the well-known health benefits, have social and socioeconomic benefits for individuals and society.
- There was evidence that asthma increased deprivation, decreased household income and the chance of having a university degree; migraine reduced the chance of having a weekly leisure or social activity, especially in men; and depression increased loneliness and decreased happiness.
- There was little evidence for causal effects of cholesterol, systolic blood pressure or breast cancer on social and socioeconomic outcomes. | 10.1093/ije/dyaa114 | medrxiv |
10.1101/19008581 | Brain network disruption predicts memory and attention deficits after surgical resection of glioma | Romero-Garcia, R.; Hart, M. G.; Owen, M.; Assem, M.; Coelho, P.; McDonald, A.; Woodberry, E.; Price, S. J.; Burke, A. G.; Santarius, T.; Erez, Y.; Suckling, J. | Rafael Romero-Garcia | Department of Psychiatry, University of Cambridge | 2019-10-08 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | radiology and imaging | https://www.medrxiv.org/content/early/2019/10/08/19008581.source.xml | ObjectiveThe aim of this study is to test brain tumour interactions with brain networks thereby identifying protective features and risk factors for memory recovery after surgical resection.
MethodsSeventeen patients with diffuse non-enhancing glioma (aged 22-56 years) were longitudinally MRI-scanned before and after surgery, and during a 12-month recovery period (47 MRI in total after exclusion). After each scanning session, a battery of memory tests was performed using a tablet-based screening tool, including free verbal memory, overall verbal memory, episodic memory, orientation, forward digit span and backwards digit span. Using structural MRI and Neurite Orientation Dispersion and Density Imaging (NODDI) derived from diffusion-weighted images, we respectively estimated lesion overlap and Neurite Density with brain networks derived from normative data in healthy participants (somato-motor, dorsal attention, ventral attention, fronto-parietal and Default Mode Network [DMN]). Linear Mixed Models (LMMs) that regressed out the effect of age, gender, tumour grade, type of treatment, total lesion volume and total neurite density were used to test the potential longitudinal associations between imaging markers and memory recovery.
ResultsMemory recovery was not significantly associated with tumour location based on traditional lobe classification nor with the type of treatment received by patients (i.e. surgery alone or surgery with adjuvant chemoradiotherapy). Non-local effects of tumours were evident on Neurite Density, which was reduced not only within the tumour, but also beyond the tumour boundary. In contrast, high preoperative Neurite Density outside the tumour, but within the DMN, was associated with better memory recovery (LMM, Pfdr<10^-3). Furthermore, postoperative and follow-up Neurite Density within the DMN and fronto-parietal network were also associated with memory recovery (LMM, Pfdr=0.014 and Pfdr=0.001, respectively). Preoperative tumour, and post-operative lesion, overlap with the DMN showed a significant negative association with memory recovery (LMM, Pfdr=0.002 and Pfdr<10^-4, respectively).
ConclusionImaging biomarkers of cognitive recovery and decline can be identified using NODDI and resting-state networks. Brain tumours and their corresponding treatment affecting brain networks that are fundamental for memory functioning, such as the DMN, can have a major impact on patients' memory recovery. | 10.3171/2021.1.jns203959 | medrxiv |
10.1101/19008581 | Disruptive and protective outcomes to memory and attention when treating diffuse glioma | Romero-Garcia, R.; Suckling, J.; Owen, M.; Assem, M.; Sinha, R.; Coelho, P.; Woodberry, E.; Price, S. J.; Burke, A. G.; Santarius, T.; Erez, Y.; Hart, M. G. | Rafael Romero-Garcia | Department of Psychiatry, University of Cambridge | 2019-11-05 | 2 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | radiology and imaging | https://www.medrxiv.org/content/early/2019/11/05/19008581.source.xml | ObjectiveThe aim of this study is to test brain tumour interactions with brain networks thereby identifying protective features and risk factors for memory recovery after surgical resection.
MethodsSeventeen patients with diffuse non-enhancing glioma (aged 22-56 years) were longitudinally MRI-scanned before and after surgery, and during a 12-month recovery period (47 MRI in total after exclusion). After each scanning session, a battery of memory tests was performed using a tablet-based screening tool, including free verbal memory, overall verbal memory, episodic memory, orientation, forward digit span and backwards digit span. Using structural MRI and Neurite Orientation Dispersion and Density Imaging (NODDI) derived from diffusion-weighted images, we respectively estimated lesion overlap and Neurite Density with brain networks derived from normative data in healthy participants (somato-motor, dorsal attention, ventral attention, fronto-parietal and Default Mode Network [DMN]). Linear Mixed Models (LMMs) that regressed out the effect of age, gender, tumour grade, type of treatment, total lesion volume and total neurite density were used to test the potential longitudinal associations between imaging markers and memory recovery.
ResultsMemory recovery was not significantly associated with tumour location based on traditional lobe classification nor with the type of treatment received by patients (i.e. surgery alone or surgery with adjuvant chemoradiotherapy). Non-local effects of tumours were evident on Neurite Density, which was reduced not only within the tumour, but also beyond the tumour boundary. In contrast, high preoperative Neurite Density outside the tumour, but within the DMN, was associated with better memory recovery (LMM, Pfdr<10^-3). Furthermore, postoperative and follow-up Neurite Density within the DMN and fronto-parietal network were also associated with memory recovery (LMM, Pfdr=0.014 and Pfdr=0.001, respectively). Preoperative tumour, and post-operative lesion, overlap with the DMN showed a significant negative association with memory recovery (LMM, Pfdr=0.002 and Pfdr<10^-4, respectively).
ConclusionImaging biomarkers of cognitive recovery and decline can be identified using NODDI and resting-state networks. Brain tumours and their corresponding treatment affecting brain networks that are fundamental for memory functioning, such as the DMN, can have a major impact on patients' memory recovery. | 10.3171/2021.1.jns203959 | medrxiv |
10.1101/19008581 | Memory recovery is related to default mode network impairment and neurite density during brain tumours treatment | Romero-Garcia, R.; Suckling, J.; Owen, M.; Assem, M.; Sinha, R.; Coelho, P.; Woodberry, E.; Price, S. J.; Burke, A. G.; Santarius, T.; Erez, Y.; Hart, M. G. | Rafael Romero-Garcia | Department of Psychiatry, University of Cambridge | 2020-10-22 | 3 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | radiology and imaging | https://www.medrxiv.org/content/early/2020/10/22/19008581.source.xml | ObjectiveThe aim of this study is to test brain tumour interactions with brain networks thereby identifying protective features and risk factors for memory recovery after surgical resection.
MethodsSeventeen patients with diffuse non-enhancing glioma (aged 22-56 years) were longitudinally MRI-scanned before and after surgery, and during a 12-month recovery period (47 MRI in total after exclusion). After each scanning session, a battery of memory tests was performed using a tablet-based screening tool, including free verbal memory, overall verbal memory, episodic memory, orientation, forward digit span and backwards digit span. Using structural MRI and Neurite Orientation Dispersion and Density Imaging (NODDI) derived from diffusion-weighted images, we respectively estimated lesion overlap and Neurite Density with brain networks derived from normative data in healthy participants (somato-motor, dorsal attention, ventral attention, fronto-parietal and Default Mode Network [DMN]). Linear Mixed Models (LMMs) that regressed out the effect of age, gender, tumour grade, type of treatment, total lesion volume and total neurite density were used to test the potential longitudinal associations between imaging markers and memory recovery.
ResultsMemory recovery was not significantly associated with tumour location based on traditional lobe classification nor with the type of treatment received by patients (i.e. surgery alone or surgery with adjuvant chemoradiotherapy). Non-local effects of tumours were evident on Neurite Density, which was reduced not only within the tumour, but also beyond the tumour boundary. In contrast, high preoperative Neurite Density outside the tumour, but within the DMN, was associated with better memory recovery (LMM, Pfdr<10^-3). Furthermore, postoperative and follow-up Neurite Density within the DMN and fronto-parietal network were also associated with memory recovery (LMM, Pfdr=0.014 and Pfdr=0.001, respectively). Preoperative tumour, and post-operative lesion, overlap with the DMN showed a significant negative association with memory recovery (LMM, Pfdr=0.002 and Pfdr<10^-4, respectively).
ConclusionImaging biomarkers of cognitive recovery and decline can be identified using NODDI and resting-state networks. Brain tumours and their corresponding treatment affecting brain networks that are fundamental for memory functioning, such as the DMN, can have a major impact on patients' memory recovery. | 10.3171/2021.1.jns203959 | medrxiv |
10.1101/19008581 | Memory recovery is related to default mode network impairment and neurite density during brain tumours treatment | Romero-Garcia, R.; Suckling, J.; Owen, M.; Assem, M.; Sinha, R.; Coelho, P.; Woodberry, E.; Price, S. J.; Burke, A. G.; Santarius, T.; Erez, Y.; Hart, M. G. | Rafael Romero-Garcia | Department of Psychiatry, University of Cambridge | 2021-02-01 | 4 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | radiology and imaging | https://www.medrxiv.org/content/early/2021/02/01/19008581.source.xml | ObjectiveThe aim of this study is to test brain tumour interactions with brain networks thereby identifying protective features and risk factors for memory recovery after surgical resection.
MethodsSeventeen patients with diffuse non-enhancing glioma (aged 22-56 years) were longitudinally MRI-scanned before and after surgery, and during a 12-month recovery period (47 MRIs in total after exclusion). After each scanning session, a battery of memory tests was performed using a tablet-based screening tool, including free verbal memory, overall verbal memory, episodic memory, orientation, forward digit span and backwards digit span. Using structural MRI and Neurite Orientation Dispersion and Density Imaging (NODDI) derived from diffusion-weighted images, we respectively estimated lesion overlap and Neurite Density with brain networks derived from normative data in healthy participants (somato-motor, dorsal attention, ventral attention, fronto-parietal and Default Mode Network [DMN]). Linear Mixed Models (LMMs) that regressed out the effects of age, gender, tumour grade, type of treatment, total lesion volume and total neurite density were used to test the potential longitudinal associations between imaging markers and memory recovery.
ResultsMemory recovery was not significantly associated with tumour location based on traditional lobe classification nor with the type of treatment received by patients (i.e. surgery alone or surgery with adjuvant chemoradiotherapy). Non-local effects of tumours were evident on Neurite Density, which was reduced not only within the tumour, but also beyond the tumour boundary. In contrast, high preoperative Neurite Density outside the tumour, but within the DMN, was associated with better memory recovery (LMM, P_FDR < 10^-3). Furthermore, postoperative and follow-up Neurite Density within the DMN and fronto-parietal network were also associated with memory recovery (LMM, P_FDR = 0.014 and P_FDR = 0.001, respectively). Preoperative tumour, and post-operative lesion, overlap with the DMN showed a significant negative association with memory recovery (LMM, P_FDR = 0.002 and P_FDR < 10^-4, respectively).
ConclusionImaging biomarkers of cognitive recovery and decline can be identified using NODDI and resting-state networks. Brain tumours, and their corresponding treatment, that affect brain networks fundamental for memory functioning, such as the DMN, can have a major impact on patients' memory recovery. | 10.3171/2021.1.jns203959 | medrxiv |
10.1101/19008326 | Towards a brain signature of chronic pain using cerebral blood flow spatial covariance analysis in people with chronic knee pain | Iwabuchi, S. J.; Xing, Y.; Cottam, W. J.; Drabek, M. M.; Tadjibaev, A.; Fernandes, G. S.; Petersen, K. K.; Arendt-Nielsen, L.; Graven-Nielsen, T.; Valdes, A. M.; Zhang, W.; Doherty, M.; Walsh, D.; Auer, D. P. | Dorothee P Auer | University of Nottingham | 2019-10-08 | 1 | PUBLISHAHEADOFPRINT | cc_no | radiology and imaging | https://www.medrxiv.org/content/early/2019/10/08/19008326.source.xml | Chronic musculoskeletal pain is a common problem globally. Current evidence suggests that maladaptive modulation of central pain pathways is associated with pain chronicity, e.g. chronic post-operative pain after knee replacement. Other factors such as low mood, anxiety and a tendency to catastrophize also seem to be important contributors. We aimed to identify a chronic pain brain signature that discriminates chronic pain from pain-free conditions using cerebral blood flow (CBF) measures, and explore how this signature relates to the chronic pain experience. In 44 chronic knee pain patients and 29 pain-free controls, we acquired CBF data (using arterial spin labelling) and T1-weighted images. Participants completed a series of questionnaires related to affective processes, and pressure and cuff algometry to assess pain sensitization. Two factor scores, representing negative affect and pain sensitization respectively, were extracted from these measures. A spatial covariance principal components analysis of CBF identified five components that significantly discriminated chronic pain patients from controls, with the unified network achieving 0.83 discriminatory accuracy (area under the curve). 
In chronic knee pain, significant patterns of relative hypoperfusion were evident in anterior regions of the default mode and salience network hubs, while hyperperfusion was seen in posterior default mode regions, the thalamus, and sensory regions. One component was positively correlated with the pain sensitization score (r=.43, p=.006), suggesting that this CBF pattern reflects the neural activity changes encoding pain sensitization. Here, we report the first chronic knee pain-related brain signature, pointing to a CBF pattern underpinning the central aspects of pain sensitization. | 10.1097/j.pain.0000000000001829 | medrxiv |
10.1101/19008318 | APOE interacts with tau PET to influence memory independently of amyloid PET | Weigand, A. J.; Thomas, K. R.; Bangen, K. J.; Eglit, G. M.; Delano-Wood, L.; Gilbert, P. E.; Brickman, A. M.; Bondi, M. W. | Mark W Bondi | VA San Diego Healthcare System/University of California, San Diego | 2019-10-08 | 1 | PUBLISHAHEADOFPRINT | cc_no | neurology | https://www.medrxiv.org/content/early/2019/10/08/19008318.source.xml | ObjectiveApolipoprotein E (APOE) interacts with AD pathology to promote disease progression. Studies of APOE risk primarily focus on amyloid, however, and little research has assessed its interaction with tau pathology independent of amyloid. The current study investigated the moderating effect of APOE genotype on independent associations of amyloid and tau PET with cognition.
MethodsParticipants included 297 older adults without dementia from the Alzheimer's Disease Neuroimaging Initiative. Regression equations modeled associations between cognitive domains and (1) cortical Aβ PET levels adjusting for tau PET and (2) medial temporal lobe (MTL) tau PET levels adjusting for Aβ PET, including interactions with APOE ε4 carrier status.
ResultsAdjusting for tau PET, Aβ was not associated with cognition and did not interact with ε4 status. In contrast, adjusting for Aβ PET, MTL tau PET was significantly associated with all cognitive domains. Further, there was a moderating effect of ε4 status on MTL tau and memory, with the strongest negative associations in ε4 carriers and at high levels of tau. This interaction persisted even among only Aβ negative individuals.
InterpretationAPOE ε4 genotype strengthens the negative association between MTL tau and memory independently of Aβ, although the converse is not observed, and this association may be particularly strong at high levels of tau. These findings suggest that APOE may interact with tau independently of Aβ and that elevated MTL tau confers negative cognitive consequences in Aβ negative ε4 carriers. | 10.1002/alz.12173 | medrxiv |
10.1101/19008383 | Real-world Drug Regimens for Multiple Myeloma in a Swiss Population (2012 to 2017): cost-outcome description | Eichler, K.; Rapold, R.; Wieser, S.; Reich, O.; Blozik, E. | Klaus Eichler | Winterthur Institute of Health Economics, Zurich University of Applied Sciences | 2019-10-08 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | oncology | https://www.medrxiv.org/content/early/2019/10/08/19008383.source.xml | BackgroundNovel drugs are dynamically changing current treatment regimens for multiple myeloma (MM). Novel drugs have improved prognosis of MM patients in clinical studies but are expensive. Little is known about up-to-date real-world application and costs.
MethodsWe performed a retrospective observational cohort analysis (cost-outcome description; 2012-2017) in a claims database of a major Swiss health insurance company which covers 14% of the Swiss population (Helsana Versicherungen AG). We used primary (MM diagnoses via ICD-10) and secondary features (prescribed MM-specific drugs) as inclusion criteria and defined a hierarchy of drug regimens to classify treatments as: 1) proteasome inhibitor (PI)-based regimen (e.g. bortezomib); 2) IMID-based regimen (e.g. lenalidomide); 3) chemotherapy (CHEMO)-based regimen (e.g. bendamustine); 4) monoclonal antibody (MAB)-based regimen (e.g. daratumumab). Direct medical costs of mandatory health insurance were analysed in 2017 Swiss Francs (CHF; third-party payer perspective).
ResultsOverall, we identified n=1054 prevalent MM patients (2012-2017) and n=378 incident MM patients (2015-2017; men: 47.1%; age group <=75 years: 48.7%). The number of prevalent patients per year increased over time (from n=314 in 2012 to n=645 in 2017).
PI-based regimens were the most frequent first line approach for incident patients (76.0%), followed by IMID-based (21.9%) and CHEMO-based regimens (2.1%). Only four patients were treated with MAB-drugs. For later lines, IMID-based regimens were most often used (2nd line: 56.4%; 3rd line: 2 of 3 patients), followed by PI-based regimens (43.6% and 1 of 3 patients, respectively). 161 of 1054 prevalent MM patients (15.3%) were treated with autologous hematopoietic stem cell transplantation (HSCT), 4 patients with allogeneic HSCT.
Average costs per patient per treatment line varied considerably (reliable data available from 2012 to 2014; mean duration of lines between 112 and 388 days): PI-based regimens: CHF 81352; IMID-based: CHF 73495; CHEMO-based: CHF 683. Mean daily costs under MM treatment increased stepwise from CHF 209 in 2012 to CHF 254 in 2017 (relative increase: 21.5%). Annual direct medical costs in Switzerland for seven novel MM drugs were extrapolated to be 60.1 million CHF in 2012 and 118.6 million CHF in 2017 (relative increase: 97.3%), corresponding to mean annual outpatient MM drug costs per patient of CHF 28000 in 2017.
Annual death rates decreased systematically from 18.6% in 2012 to 15.5% in 2017 (p for trend: 0.03). No statistically significant difference in death rates emerged for 2017 compared with 2012 (risk ratio: 0.83; 95%-CI: 0.63 to 1.10; absolute risk reduction: 3.1%).
ConclusionsCurrent treatment patterns for MM patients in Switzerland show variation in both applied drug regimens and costs. An increasing prevalent population of MM patients, in combination with increasing costs per day under treatment, leads to a substantial and growing budget impact for the Swiss social insurance system. | null | medrxiv |
10.1101/19008482 | Inconsistent Addiction Treatment for Patients Undergoing Cardiac Surgery for Injection Drug Use-Associated Infective Endocarditis | Nguemeni Tiako, M. J.; Hong, S.; Bin Mahmood, S. U.; Mori, M.; Mangi, A.; Yun, J.; Juthani-Mehta, M.; Geirsson, A. | Arnar Geirsson | Yale School of Medicine | 2019-10-10 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | addiction medicine | https://www.medrxiv.org/content/early/2019/10/10/19008482.source.xml | IntroductionCases of injection drug use-related infective endocarditis (IDU-IE) requiring surgery are rising in the setting of the current U.S. opioid epidemic. We thus aimed to determine the nature of addiction interventions in the perioperative period.
MethodsThis is a retrospective review of surgical IDU-IE from 2011 to 2016 at a tertiary care center in New Haven, Connecticut. The data collected included substances consumed recreationally, consultations by social work (SW) and psychiatry, pharmacotherapy for addiction, and evidence of enrollment in a drug rehabilitation program upon discharge.
Among patients with active drug use (ADU), we compared the 24-month survival of those who received at least one form of addiction intervention to that of those who did not.
ResultsForty-two patients (75%) had active drug use. Among them, 22 used heroin. Forty-one patients (73.2%) saw SW, 17 (30.4%) saw psychiatry; 14 (25%) saw neither SW nor psychiatry.
Twenty-one patients (37.5%) received methadone, 6 (10.7%) received buprenorphine, 1 (1.8%) received naltrexone; 26 (46.4%) did not receive any pharmacotherapy. Fifteen patients (26.8%) attended a drug rehabilitation program, 13 (86.7%) of whom had seen SW and 8 (53%) psychiatry. Among patients with ADU, there was no statistically significant difference in survival between those who received at least one intervention and those who did not (p=0.1 by log rank).
ConclusionAddiction interventions are deployed inconsistently for patients with surgical IDU-IE. Untreated substance use disorder and recurrent endocarditis are the leading causes of death in this population. Studying best practices for perioperative interventions in IDU-IE and establishing protocols are of the utmost importance. | 10.1097/adm.0000000000000710 | medrxiv |
10.1101/19008474 | Joint modelling of individual trajectories, within-individual variability and a later outcome: systolic blood pressure through childhood and left ventricular mass in early adulthood | Parker, R. M. A.; Leckie, G.; Goldstein, H.; Howe, L. D.; Heron, J.; Hughes, A. D.; Phillippo, D. M.; Tilling, K. | Richard M.A. Parker | University of Bristol | 2019-10-10 | 1 | PUBLISHAHEADOFPRINT | cc_by | epidemiology | https://www.medrxiv.org/content/early/2019/10/10/19008474.source.xml | Within-individual variability of repeatedly-measured exposures may predict later outcomes: e.g. blood pressure (BP) variability (BPV) is an independent cardiovascular risk factor above and beyond mean BP. Since two-stage methods, known to introduce bias, are typically used to investigate such associations, we introduce a joint modelling approach, examining associations of both mean BP and BPV across childhood with left ventricular mass (indexed to height; LVMI) in early adulthood, with data from the UK's Avon Longitudinal Study of Parents and Children (ALSPAC) cohort. Using multilevel models, we allow BPV to vary between individuals (a "random effect") as well as to depend on covariates (allowing for heteroscedasticity). We further distinguish within-clinic variability ("measurement error") from visit-to-visit BPV. BPV was predicted to be greater at older ages, at higher bodyweights, and in females, and was positively correlated with mean BP. BPV had a positive association with LVMI (a 10% increase in SD(BP) was predicted to increase LVMI by mean = 0.42% (95% credible interval: -0.47%, 1.38%)), but this association became negative (mean = -1.56%, 95% credible interval: -5.01%, 0.44%) once the effect of mean BP on LVMI was adjusted for. This joint modelling approach offers a flexible method of relating repeatedly-measured exposures to later outcomes. | 10.1093/aje/kwaa224 | medrxiv |
10.1101/19008458 | Living alone, loneliness and lack of emotional support as predictors of suicide and self-harm: seven-year follow up of the UK Biobank cohort | Shaw, R. J.; Cullen, B.; Graham, N.; Lyall, D. M.; Mackay, D.; Okolie, C.; Pearsall, R.; Ward, J.; John, A.; Smith, D. J. | Richard John Shaw | University of Glasgow | 2019-10-10 | 1 | PUBLISHAHEADOFPRINT | cc_no | epidemiology | https://www.medrxiv.org/content/early/2019/10/10/19008458.source.xml | BackgroundThe association between loneliness and suicide is poorly understood. We investigated how living alone, loneliness and emotional support were related to suicide and self-harm in a longitudinal design.
MethodsBetween 2006 and 2010 UK Biobank recruited and assessed in detail over 0.5 million people in middle age. Data were linked to prospective hospital admission and mortality records. Adjusted Cox regression models were used to investigate relationships between living arrangements, loneliness and emotional support, and both suicide and self-harm as outcomes.
ResultsFor men, both living alone (Hazard Ratio (HR) 2.16, 95%CI 1.51-3.09) and living with non-partners (HR 1.80, 95%CI 1.08-3.00) were associated with death by suicide, independently of loneliness, which had a modest relationship with suicide (HR 1.43, 95%CI 1.01-2.03). For women, there was no evidence that living arrangements, loneliness or emotional support were associated with death by suicide. Associations between living alone and self-harm were explained by health for women, and by health, loneliness and emotional support for men. In fully adjusted models, loneliness was associated with hospital admissions for self-harm in both women (HR 1.89, 95%CI 1.57-2.28) and men (HR 1.74, 95%CI 1.40-2.16).
LimitationsLoneliness and emotional support were operationalized using single item measures.
ConclusionsFor men - but not for women - living alone or living with a non-partner increased the risk of suicide, a finding not explained by subjective loneliness. Overall, loneliness may be more important as a risk factor for self-harm than for suicide. Loneliness also appears to lessen the protective effects of cohabitation.
Highlights:
- First cohort study to investigate loneliness's relationship with deaths by suicide
- Loneliness is associated with a modest increased risk of death by suicide for men
- For men, living with a partner reduces the risk of death by suicide
- Loneliness increases the risk of hospitalization for self-harm for men and women | 10.1016/j.jad.2020.10.026 | medrxiv |
10.1101/19008458 | Living alone, loneliness and lack of emotional support as predictors of suicide and self-harm: a nine-year follow up of the UK Biobank cohort | Shaw, R. J.; Cullen, B.; Graham, N.; Lyall, D. M.; Mackay, D.; Okolie, C.; Pearsall, R.; Ward, J.; John, A.; Smith, D. J. | Richard John Shaw | University of Glasgow | 2020-05-12 | 2 | PUBLISHAHEADOFPRINT | cc_no | epidemiology | https://www.medrxiv.org/content/early/2020/05/12/19008458.source.xml | BackgroundThe association between loneliness and suicide is poorly understood. We investigated how living alone, loneliness and emotional support were related to suicide and self-harm in a longitudinal design.
MethodsBetween 2006 and 2010 UK Biobank recruited and assessed in detail over 0.5 million people in middle age. Data were linked to prospective hospital admission and mortality records. Adjusted Cox regression models were used to investigate relationships between living arrangements, loneliness and emotional support, and both suicide and self-harm as outcomes.
ResultsFor men, both living alone (Hazard Ratio (HR) 2.16, 95%CI 1.51-3.09) and living with non-partners (HR 1.80, 95%CI 1.08-3.00) were associated with death by suicide, independently of loneliness, which had a modest relationship with suicide (HR 1.43, 95%CI 1.01-2.03). For women, there was no evidence that living arrangements, loneliness or emotional support were associated with death by suicide. Associations between living alone and self-harm were explained by health for women, and by health, loneliness and emotional support for men. In fully adjusted models, loneliness was associated with hospital admissions for self-harm in both women (HR 1.89, 95%CI 1.57-2.28) and men (HR 1.74, 95%CI 1.40-2.16).
LimitationsLoneliness and emotional support were operationalized using single item measures.
ConclusionsFor men - but not for women - living alone or living with a non-partner increased the risk of suicide, a finding not explained by subjective loneliness. Overall, loneliness may be more important as a risk factor for self-harm than for suicide. Loneliness also appears to lessen the protective effects of cohabitation.
Highlights:
- First cohort study to investigate loneliness's relationship with deaths by suicide
- Loneliness is associated with a modest increased risk of death by suicide for men
- For men, living with a partner reduces the risk of death by suicide
- Loneliness increases the risk of hospitalization for self-harm for men and women | 10.1016/j.jad.2020.10.026 | medrxiv |
10.1101/19008458 | Living alone, loneliness and lack of emotional support as predictors of suicide and self-harm: a nine-year follow up of the UK Biobank cohort | Shaw, R. J.; Cullen, B.; Graham, N.; Lyall, D. M.; Mackay, D.; Okolie, C.; Pearsall, R.; Ward, J.; John, A.; Smith, D. J. | Richard John Shaw | University of Glasgow | 2020-10-13 | 3 | PUBLISHAHEADOFPRINT | cc_by | epidemiology | https://www.medrxiv.org/content/early/2020/10/13/19008458.source.xml | BackgroundThe association between loneliness and suicide is poorly understood. We investigated how living alone, loneliness and emotional support were related to suicide and self-harm in a longitudinal design.
MethodsBetween 2006 and 2010 UK Biobank recruited and assessed in detail over 0.5 million people in middle age. Data were linked to prospective hospital admission and mortality records. Adjusted Cox regression models were used to investigate relationships between living arrangements, loneliness and emotional support, and both suicide and self-harm as outcomes.
ResultsFor men, both living alone (Hazard Ratio (HR) 2.16, 95%CI 1.51-3.09) and living with non-partners (HR 1.80, 95%CI 1.08-3.00) were associated with death by suicide, independently of loneliness, which had a modest relationship with suicide (HR 1.43, 95%CI 1.01-2.03). For women, there was no evidence that living arrangements, loneliness or emotional support were associated with death by suicide. Associations between living alone and self-harm were explained by health for women, and by health, loneliness and emotional support for men. In fully adjusted models, loneliness was associated with hospital admissions for self-harm in both women (HR 1.89, 95%CI 1.57-2.28) and men (HR 1.74, 95%CI 1.40-2.16).
LimitationsLoneliness and emotional support were operationalized using single item measures.
ConclusionsFor men - but not for women - living alone or living with a non-partner increased the risk of suicide, a finding not explained by subjective loneliness. Overall, loneliness may be more important as a risk factor for self-harm than for suicide. Loneliness also appears to lessen the protective effects of cohabitation.
Highlights:
- First cohort study to investigate loneliness's relationship with deaths by suicide
- Loneliness is associated with a modest increased risk of death by suicide for men
- For men, living with a partner reduces the risk of death by suicide
- Loneliness increases the risk of hospitalization for self-harm for men and women | 10.1016/j.jad.2020.10.026 | medrxiv |
10.1101/19007526 | Quaternary Prevention: Is this Concept Relevant to Public Health? A Bibliometric and Descriptive Content Analysis. | Depallens, M. A.; Guimaraes, J. M. d. M.; Almeida Filho, N. | Miguel Andino Depallens | Federal University of the South of Bahia | 2019-10-10 | 1 | PUBLISHAHEADOFPRINT | cc_no | public and global health | https://www.medrxiv.org/content/early/2019/10/10/19007526.source.xml | ObjectiveTo measure and map research output on Quaternary Prevention (P4) and outline research trends; to assess the papers content, mainly regarding methods and subjects approached in order to contribute to the improvement of global knowledge about P4 and to evaluate its relevance for public health.
DesignBibliometric and descriptive content analysis.
Articles reviewedScientific articles about P4 recorded in PubMed, LILACS, SciELO or CINAHL published until August 2018, with corresponding full articles available in Portuguese, English, Spanish, German or French.
Main outcome measuresYear of publication, first author's name and nationality, journal's name, country and ranking, publication language, methods used and main subjects reported.
Results65 articles were included, published in 33 journals from 16 countries between 2003 and 2018, with a peak of publications in 2015. The first authors came from 17 different countries, 23% of them Brazilian, and Uruguay was the leading nation in per-capita scientific production. 40% of all the selected articles were in English, 32% in Portuguese, 26% in Spanish. 28% of the papers were published in Q1 or Q2 journals. Research output on P4 began in southern Europe, moved to South America and then expanded worldwide. 88% of the articles were bibliographic research, and 38% of all articles focused on specific examples of medical overuse (including several screening tests).
ConclusionsQuaternary prevention represents an ethical and valid approach to prevent the occurrence of iatrogenic events and to achieve equal and fair access to health services. Conceptual, geographical and linguistic elements, as well as WONCA conferences and the type of healthcare system in the authors' country, were fundamental factors affecting research output. The quality and quantity of available studies are still limited, so further investigations are recommended to assess the effective impact of P4 on public health. | 10.1590/0102-311X00231819 | medrxiv |