doi | title | authors | author_corresponding | author_corresponding_institution | date | version | type | license | category | jatsxml | abstract | published | server
---|---|---|---|---|---|---|---|---|---|---|---|---|---|
10.1101/19004440 | Medal: a patient similarity metric using medication prescribing patterns | Lopez Pineda, A.; Pourshafeie, A.; Ioannidis, A.; McCloskey Leibold, C.; Chan, A.; Frankovich, J.; Bustamante, C. D.; Wojcik, G. L. | Arturo Lopez Pineda | Stanford University | 2019-08-26 | 2 | PUBLISHAHEADOFPRINT | cc_by_nd | health informatics | https://www.medrxiv.org/content/early/2019/08/26/19004440.source.xml | ObjectivePediatric acute-onset neuropsychiatric syndrome (PANS) is a complex neuropsychiatric syndrome characterized by an abrupt onset of obsessive-compulsive symptoms and/or severe eating restrictions, along with at least two concomitant debilitating cognitive, behavioral, or neurological symptoms. A wide range of pharmacological interventions along with behavioral and environmental modifications, and psychotherapies have been adopted to treat symptoms and underlying etiologies. Our goal was to develop a data-driven approach to identify treatment patterns in this cohort.
Materials and MethodsIn this cohort study, we extracted medical prescription histories from electronic health records. We developed a modified dynamic programming approach to perform global alignment of those medication histories. Our approach is unique since it considers time gaps in prescription patterns as part of the similarity strategy.
ResultsThis study included 43 consecutive new-onset pre-pubertal patients who had at least 3 clinic visits. Our algorithm identified six clusters with distinct medication usage history which may represent clinicians practice of treating PANS of different severities and etiologies i.e., two most severe groups requiring high dose intravenous steroids; two arthritic or inflammatory groups requiring prolonged nonsteroidal anti-inflammatory drug (NSAID); and two mild relapsing/remitting group treated with a short course of NSAID. The psychometric scores as outcomes in each cluster generally improved within the first two years.
Discussion and conclusionOur algorithm shows potential to improve our knowledge of treatment patterns in the PANS cohort, while helping clinicians understand how patients respond to a combination of drugs. | null | medrxiv |
10.1101/19004440 | Discovering prescription patterns in pediatric acute-onset neuropsychiatric syndrome patients | Lopez Pineda, A.; Pourshafeie, A.; Ioannidis, A.; McCloskey Leibold, C.; Chan, A. L.; Bustamante, C. D.; Frankovich, J.; Wojcik, G. L. | Arturo Lopez Pineda | Stanford University | 2020-05-28 | 3 | PUBLISHAHEADOFPRINT | cc_by_nd | health informatics | https://www.medrxiv.org/content/early/2020/05/28/19004440.source.xml | ObjectivePediatric acute-onset neuropsychiatric syndrome (PANS) is a complex neuropsychiatric syndrome characterized by an abrupt onset of obsessive-compulsive symptoms and/or severe eating restrictions, along with at least two concomitant debilitating cognitive, behavioral, or neurological symptoms. A wide range of pharmacological interventions along with behavioral and environmental modifications, and psychotherapies have been adopted to treat symptoms and underlying etiologies. Our goal was to develop a data-driven approach to identify treatment patterns in this cohort.
Materials and MethodsIn this cohort study, we extracted medical prescription histories from electronic health records. We developed a modified dynamic programming approach to perform global alignment of those medication histories. Our approach is unique since it considers time gaps in prescription patterns as part of the similarity strategy.
ResultsThis study included 43 consecutive new-onset pre-pubertal patients who had at least 3 clinic visits. Our algorithm identified six clusters with distinct medication usage history which may represent clinicians practice of treating PANS of different severities and etiologies i.e., two most severe groups requiring high dose intravenous steroids; two arthritic or inflammatory groups requiring prolonged nonsteroidal anti-inflammatory drug (NSAID); and two mild relapsing/remitting group treated with a short course of NSAID. The psychometric scores as outcomes in each cluster generally improved within the first two years.
Discussion and conclusionOur algorithm shows potential to improve our knowledge of treatment patterns in the PANS cohort, while helping clinicians understand how patients respond to a combination of drugs. | null | medrxiv |
10.1101/19003434 | The impact of reactive mass vaccination campaigns on measles outbreaks in the Katanga region, Democratic Republic of Congo | Funk, S.; Takahashi, S.; Hellewell, J.; Gadroen, K.; Carrion-Martin, I.; van Lenthe, M.; Rivette, K.; Dietrich, S.; Edmunds, W. J.; Siddiqui, R.; Rao, V. B. | Sebastian Funk | London School of Hygiene & Tropical Medicine | 2019-08-17 | 1 | PUBLISHAHEADOFPRINT | cc_by | epidemiology | https://www.medrxiv.org/content/early/2019/08/17/19003434.source.xml | The Katanga region in the Democratic Republic of Congo (DRC) has been struck by repeated epidemics of measles, with large outbreaks occurring in 2010-13 and 2015. In many of the affected health zones, reactive mass vaccination campaigns were conducted in response to the outbreaks. Here, we attempted to determine how effective the vaccination campaigns in 2015 were in curtailing the ongoing outbreak. We further sought to establish whether the risk of large measles outbreaks in different health zones could have been determined in advance to help prioritise areas for vaccination campaign and speed up the response. In doing so, we first attempted to identify factors that could have been used in 2015 to predict in which health zones the greatest outbreaks would occur. Administrative vaccination coverage was not a good predictor of the size of outbreaks in different health zones. Vaccination coverage derived from surveys, on the other hand, appeared to give more reliable estimates of health zones of low vaccination coverage and, consequently, large outbreaks. On a coarser geographical scale, the provinces most affected in 2015 could be predicted from the outbreak sizes in 2010-13. This, combined with the fact that the vast majority of reported cases were in under-5 year olds, would suggest that there are systematic issues of undervaccination. If this was to continue, outbreaks would be expected to continue to occur in the affected health zones at regular intervals, mostly concentrated in under-5 year olds. We further used a model of measles transmission to estimate the impact of the vaccination campaigns, by first fitting a model to the data including the campaigns and then re-running this without vaccination. We estimated the reactive campaigns to have reduced the size of the overall outbreak by approximately 21,000 (IQR: 16,000-27,000; 95% CI: 8300-38,000) cases. There was considerable heterogeneity in the impact of campaigns, with campaigns started earlier after the start of an outbreak being more impactful. Taken together, these findings suggest that while a strong routine vaccination regime remains the most effective means of measles control, it might be possible to improve the effectiveness of reactive campaigns by considering predictive factors to trigger a more targeted vaccination response. | null | medrxiv |
10.1101/19003434 | The impact of reactive mass vaccination campaigns on measles outbreaks in the Katanga region, Democratic Republic of Congo | Funk, S.; Takahashi, S.; Hellewell, J.; Gadroen, K.; Carrion-Martin, I.; van Lenthe, M.; Rivette, K.; Dietrich, S.; Edmunds, W. J.; Siddiqui, M. R.; Rao, V. B. | Sebastian Funk | London School of Hygiene & Tropical Medicine | 2019-08-26 | 2 | PUBLISHAHEADOFPRINT | cc_by | epidemiology | https://www.medrxiv.org/content/early/2019/08/26/19003434.source.xml | The Katanga region in the Democratic Republic of Congo (DRC) has been struck by repeated epidemics of measles, with large outbreaks occurring in 2010-13 and 2015. In many of the affected health zones, reactive mass vaccination campaigns were conducted in response to the outbreaks. Here, we attempted to determine how effective the vaccination campaigns in 2015 were in curtailing the ongoing outbreak. We further sought to establish whether the risk of large measles outbreaks in different health zones could have been determined in advance to help prioritise areas for vaccination campaign and speed up the response. In doing so, we first attempted to identify factors that could have been used in 2015 to predict in which health zones the greatest outbreaks would occur. Administrative vaccination coverage was not a good predictor of the size of outbreaks in different health zones. Vaccination coverage derived from surveys, on the other hand, appeared to give more reliable estimates of health zones of low vaccination coverage and, consequently, large outbreaks. On a coarser geographical scale, the provinces most affected in 2015 could be predicted from the outbreak sizes in 2010-13. This, combined with the fact that the vast majority of reported cases were in under-5 year olds, would suggest that there are systematic issues of undervaccination. If this was to continue, outbreaks would be expected to continue to occur in the affected health zones at regular intervals, mostly concentrated in under-5 year olds. We further used a model of measles transmission to estimate the impact of the vaccination campaigns, by first fitting a model to the data including the campaigns and then re-running this without vaccination. We estimated the reactive campaigns to have reduced the size of the overall outbreak by approximately 21,000 (IQR: 16,000-27,000; 95% CI: 8300-38,000) cases. There was considerable heterogeneity in the impact of campaigns, with campaigns started earlier after the start of an outbreak being more impactful. Taken together, these findings suggest that while a strong routine vaccination regime remains the most effective means of measles control, it might be possible to improve the effectiveness of reactive campaigns by considering predictive factors to trigger a more targeted vaccination response. | null | medrxiv |
10.1101/19003434 | The impact of reactive mass vaccination campaigns on measles outbreaks in the Katanga region, Democratic Republic of Congo | Funk, S.; Takahashi, S.; Hellewell, J.; Gadroen, K.; Carrion-Martin, I.; van Lenthe, M.; Rivette, K.; Dietrich, S.; Edmunds, W. J.; Siddiqui, M. R.; Rao, V. B. | Sebastian Funk | London School of Hygiene & Tropical Medicine | 2019-10-10 | 3 | PUBLISHAHEADOFPRINT | cc_by | epidemiology | https://www.medrxiv.org/content/early/2019/10/10/19003434.source.xml | The Katanga region in the Democratic Republic of Congo (DRC) has been struck by repeated epidemics of measles, with large outbreaks occurring in 2010-13 and 2015. In many of the affected health zones, reactive mass vaccination campaigns were conducted in response to the outbreaks. Here, we attempted to determine how effective the vaccination campaigns in 2015 were in curtailing the ongoing outbreak. We further sought to establish whether the risk of large measles outbreaks in different health zones could have been determined in advance to help prioritise areas for vaccination campaign and speed up the response. In doing so, we first attempted to identify factors that could have been used in 2015 to predict in which health zones the greatest outbreaks would occur. Administrative vaccination coverage was not a good predictor of the size of outbreaks in different health zones. Vaccination coverage derived from surveys, on the other hand, appeared to give more reliable estimates of health zones of low vaccination coverage and, consequently, large outbreaks. On a coarser geographical scale, the provinces most affected in 2015 could be predicted from the outbreak sizes in 2010-13. This, combined with the fact that the vast majority of reported cases were in under-5 year olds, would suggest that there are systematic issues of undervaccination. If this was to continue, outbreaks would be expected to continue to occur in the affected health zones at regular intervals, mostly concentrated in under-5 year olds. We further used a model of measles transmission to estimate the impact of the vaccination campaigns, by first fitting a model to the data including the campaigns and then re-running this without vaccination. We estimated the reactive campaigns to have reduced the size of the overall outbreak by approximately 21,000 (IQR: 16,000-27,000; 95% CI: 8300-38,000) cases. There was considerable heterogeneity in the impact of campaigns, with campaigns started earlier after the start of an outbreak being more impactful. Taken together, these findings suggest that while a strong routine vaccination regime remains the most effective means of measles control, it might be possible to improve the effectiveness of reactive campaigns by considering predictive factors to trigger a more targeted vaccination response. | null | medrxiv |
10.1101/19004994 | Comparison of risk factors for coronary heart disease morbidity versus mortality | Batty, G. D.; Kivimaki, M.; Bell, S. | George David Batty | University College London | 2019-08-17 | 1 | PUBLISHAHEADOFPRINT | cc_no | epidemiology | https://www.medrxiv.org/content/early/2019/08/17/19004994.source.xml | Owing to the often prohibitively high costs of medical examinations, or an absence of infrastructure for linkage of study members to morbidity registries, much aetiological research in the field of cardiovascular research relies on death records. Because they are regarded as being more distal to risk factor assessment than morbidity endpoints, mortality data are generally maligned in this context for seemingly providing less clear insights into aetiology. The relative utility of mortality versus morbidity registries is, however, untested. In a pooling of data from three large cohort studies whose participants had been linked to both death and morbidity registries for coronary heart disease, we related a range of established and emerging risk factors to these two methods of ascertainment. A mean duration of study member surveillance of 10.1 years (mortality) and 9.9 years (morbidity) for a maximum of 20,956 study members (11,868 women) in the analytical sample yielded 289 deaths from coronary heart disease and 770 hospitalisations for this condition. The direction of the age- and sex-adjusted association was the same for 21 of the 24 risk factor- morbidity/mortality combinations. The only marked discordance in effect estimates, such that different conclusions about the association could be drawn, was for social support, total cholesterol, and fruit/vegetable consumption whereby null effects were evident for selected outcomes. In conclusion, variation in disease definition typically did not have an impact on the direction of the association of an array of risk factors for coronary heart disease. | 10.1177/2047487319882512 | medrxiv |
10.1101/19002923 | An emergent, high-fatality lung disease in systemic juvenile arthritis | Saper, V.; Chen, G.; Deutsch, G.; Guillerman, R. P.; Birgmeier, J.; Jagadeesh, K.; Canna, S.; Schulert, G.; Deterding, R.; Xu, J.; Leung, A.; Qin, X.; Bouzoubaa, L.; Abulaban, K.; Baszis, K.; Behrens, E.; Birmingham, J.; Casey, A.; Cidon, M.; Cron, R.; De, A.; De Benedetti, F.; Ferguson, I.; Fishman, M.; Goodman, S.; Graham, B.; Grom, A.; Haines, K.; Hazen, M.; Henderson, L.; Ho, A.; Ibarra, M.; Inman, C.; Jerath, R.; Khawaja, K.; Kingsbury, D.; Klein-Gitelman, M.; Lai, K.; Lapidus, S.; Lin, C.; Lin, J.; Liptzin, D.; Milojevic, D.; Mombourquette, J.; Onel, K.; Ozen, S.; Perez, M.; Phillippi, K | Elizabeth Mellins | Stanford University | 2019-08-20 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | rheumatology | https://www.medrxiv.org/content/early/2019/08/20/19002923.source.xml | ObjectiveTo investigate characteristics and risk factors of a novel parenchymal lung disease, increasingly detected in systemic juvenile idiopathic arthritis (sJIA).
MethodsIn a multi-center retrospective study, 61 cases were investigated, using physician-reported clinical information and centralized analyses of radiologic, pathologic and genetic data.
ResultsLung disease (LD) was associated with distinctive features, including acute erythematous clubbing and a high frequency of anaphylactic reactions to the IL-6 inhibitor, tocilizumab. Serum ferritin elevation and/or significant lymphopenia preceded LD detection. The most prevalent chest CT pattern was septal thickening, involving the periphery of multiple lobes +/- ground glass opacities. Predominant pathology (23/36) was pulmonary alveolar proteinosis and/or endogenous lipoid pneumonia (PAP/ELP), with atypical features, including regional involvement and concomitant vascular changes. Apparent severe delayed drug hypersensitivity occurred in some cases. 5-year survival was 42%. Whole-exome sequencing (20/61) did not identify a novel monogenic defect or PAP-related or macrophage activation syndrome (MAS)-related mutations as a likely primary cause. Trisomy 21 (T21) increased LD risk, as did young sJIA onset. Refractory sJIA was not required for LD development. Exposure to interleukin (IL)-1 and IL-6 inhibitors (46/61) was associated with multiple LD features. By several indicators, severity of sJIA was comparable in drug-exposed subjects and published sJIA cohorts. MAS at sJIA onset was increased in the drug-exposed, but it was not associated with LD features.
ConclusionsA rare, life-threatening LD in sJIA is defined by a constellation of unusual clinical characteristics. The pathology, a PAP/ELP variant, suggests macrophage dysfunction. Inhibitor exposure may promote LD, independent of sJIA severity, in a small subset of treated patients. Treatment/prevention strategies are needed. | 10.1136/annrheumdis-2019-216040 | medrxiv |
10.1101/19005082 | Measuring stroke outcomes using linked administrative data: Population-based estimates and validation of home-time as a surrogate measure of functional status. | Gattellari, M.; Goumas, C.; Jalaludin, B.; Worthington, J. | Melina Gattellari | Department of Neurology, Royal Prince Alfred Hospital, Camperdown, New South Wales, Australia | 2019-08-20 | 1 | PUBLISHAHEADOFPRINT | cc_no | health informatics | https://www.medrxiv.org/content/early/2019/08/20/19005082.source.xml | BackgroundAdministrative data offer cost-effective, whole-of-population stroke surveillance yet the lack of validated outcomes is a short-coming. The number of days spent living at home after stroke ("home-time") is a patient-centred outcome that can be objectively ascertained from administrative data. Population-based validation against both severity and outcome measures and for all subtypes is lacking.
MethodsStroke hospitalisations from a state-wide census in New South Wales, Australia, from July 1, 2005 to March 31, 2014 were linked to pre-hospital data, post-stroke admissions and deaths. We calculated correlations between 90-day home-time and Glasgow Coma Scale (GCS) scores, measured upon a patients initial contact with paramedics, and Functional Independence Measure (FIM) scores, measured upon entry to rehabilitation after the acute hospital stroke admission. Negative binomial regression models were used to identify predictors of home-time.
ResultsPatients with stroke (N=74,501) spent a median of 53 days living at home after the event. Median home-time was 60 days after ischaemic stroke, 49 days after subarachnoid haemorrhage and 0 days after intracerebral haemorrhage. GCS and FIM scores significantly correlated with home-time (p-values<0.001). Female sex predicted less home-time in ischaemic stroke, while being married increased home time after ischaemic stroke and subarachnoid haemorrhage.
ConclusionsHome-time measured using administrative data is a robust, replicable and valid patient-centred outcome enabling inexpensive population-based surveillance. | 10.1111/ijcp.13484 | medrxiv |
10.1101/19004671 | Genetic risk of obesity as a modifier of associations between neighbourhood environment and body mass index: an observational study of 335,046 UK Biobank participants | Mason, K. E.; Palla, L.; Pearce, N.; Phelan, J.; Cummins, S. | Kate E Mason | University of Liverpool; London School of Hygiene & Tropical Medicine | 2019-08-20 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | epidemiology | https://www.medrxiv.org/content/early/2019/08/20/19004671.source.xml | BackgroundThere is growing recognition that recent global increases in obesity are the product of a complex interplay between genetic and environmental factors. However, in gene-environment studies of obesity, environment usually refers to individual behavioural factors that influence energy balance, while more upstream environmental factors are overlooked. We examined gene-environment interactions between genetic risk of obesity and two neighbourhood characteristics likely to be associated with obesity (proximity to takeaway/fast-food outlets and availability of physical activity facilities).
MethodsWe used data from 335,046 adults aged 40-70 in the UK Biobank cohort to conduct a population-based cross-sectional study of interactions between neighbourhood characteristics and genetic risk of obesity, in relation to BMI. Proximity to a fast-food outlet was defined as distance from home address to nearest takeaway/fast-food outlet, and availability of physical activity facilities as the number of formal physical activity facilities within one kilometre of home address. Genetic risk of obesity was operationalised by 91-SNP and 69-SNP weighted genetic risk scores, and by six individual SNPs considered separately. Multivariable, mixed effects models with product terms for the gene-environment interactions were estimated.
ResultsAfter accounting for likely confounding, the association between proximity to takeaway/fast-food outlets and BMI was stronger among those at increased genetic risk of obesity, with evidence of an interaction with polygenic risk scores (P=0.018 and P=0.028 for 69-SNP and 91-SNP scores, respectively) and in particular with a SNP linked to MC4R (P=0.009), a gene known to regulate food intake. We found very little evidence of a gene-environment interaction for availability of physical activity facilities.
ConclusionsIndividuals at an increased genetic risk of obesity may be more sensitive to exposure to the local fast-food environment. Ensuring that neighbourhood residential environments are designed to promote a healthy weight may be particularly important for those with greater genetic susceptibility to obesity. | 10.1136/bmjnph-2020-000107 | medrxiv |
10.1101/19004671 | Genetic risk of obesity as a modifier of associations between neighbourhood environment and body mass index: an observational study of 335,046 UK Biobank participants | Mason, K. E.; Palla, L.; Pearce, N.; Phelan, J.; Cummins, S. | Kate E Mason | University of Liverpool; London School of Hygiene & Tropical Medicine | 2019-08-26 | 2 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | epidemiology | https://www.medrxiv.org/content/early/2019/08/26/19004671.source.xml | BackgroundThere is growing recognition that recent global increases in obesity are the product of a complex interplay between genetic and environmental factors. However, in gene-environment studies of obesity, environment usually refers to individual behavioural factors that influence energy balance, while more upstream environmental factors are overlooked. We examined gene-environment interactions between genetic risk of obesity and two neighbourhood characteristics likely to be associated with obesity (proximity to takeaway/fast-food outlets and availability of physical activity facilities).
MethodsWe used data from 335,046 adults aged 40-70 in the UK Biobank cohort to conduct a population-based cross-sectional study of interactions between neighbourhood characteristics and genetic risk of obesity, in relation to BMI. Proximity to a fast-food outlet was defined as distance from home address to nearest takeaway/fast-food outlet, and availability of physical activity facilities as the number of formal physical activity facilities within one kilometre of home address. Genetic risk of obesity was operationalised by 91-SNP and 69-SNP weighted genetic risk scores, and by six individual SNPs considered separately. Multivariable, mixed effects models with product terms for the gene-environment interactions were estimated.
ResultsAfter accounting for likely confounding, the association between proximity to takeaway/fast-food outlets and BMI was stronger among those at increased genetic risk of obesity, with evidence of an interaction with polygenic risk scores (P=0.018 and P=0.028 for 69-SNP and 91-SNP scores, respectively) and in particular with a SNP linked to MC4R (P=0.009), a gene known to regulate food intake. We found very little evidence of a gene-environment interaction for availability of physical activity facilities.
ConclusionsIndividuals at an increased genetic risk of obesity may be more sensitive to exposure to the local fast-food environment. Ensuring that neighbourhood residential environments are designed to promote a healthy weight may be particularly important for those with greater genetic susceptibility to obesity. | 10.1136/bmjnph-2020-000107 | medrxiv |
10.1101/19005041 | Triage-based care of people with Back Pain: STarT Back or Start diagnosing? An observational study. | Germon, T.; Jack, A.; Hobart, J. | Tim Germon | University Hospitals Plymouth | 2019-08-20 | 1 | PUBLISHAHEADOFPRINT | cc_no | health policy | https://www.medrxiv.org/content/early/2019/08/20/19005041.source.xml | ObjectivesBack pain is a massive public health problem. The STarT Back Screening Tool (SBST) was developed for use in primary care to triage people with lumbar pain, classifying them as low, medium or high "risk" of prolonged symptoms. This classification guides non-surgical interventions including manual treatments, exercise and cognitive behavioural therapy. Claims suggest SBST brings generic health and cost benefits. National guidance recommends STarT Back is used at the first primary care consultation but can be used at any stage. For SBST to be an effective triage tool it should distinguish structural from non-structural pain. We tested this requirement in consecutive people referred to a single triage practitioner, hypothesising it was not possible conceptually.
DesignAn observational study of the relationship between routine, prospectively collected triage data and diagnosis.
SettingA secondary care spinal triage service based in a teaching hospital.
ParticipantsWe studied consecutive referrals with lumbar pain triaged by a single extended scope practitioner (ESP) over 22 months (Nov 2015-Sept 2017).
Main Outcome MeasuresSBST and pain visual analogue scores (VAS: 0-10) were collected at the initial consultation. We compared data for people with and without surgically remedial lesions.
Results1041 people were seen (61% female, mean age 53), n=234 (28%) had surgically amenable explanations for pain. People with surgical lesions were older (58 v 51yrs), more likely male (48 v 35%) and had higher VAS scores (6.8 v 6.1). Surgery and non-surgery subgroups had similar SBST total and domain score distribution profiles. The surgery subgroup had less low risk (9%v21%) and more high risk (37% v 30%) classified people.
ConclusionSBST scores did not differentiate surgical from non-surgical pathologies. It seems unlikely that symptom questionnaires can estimate prognosis accurately unless everyone has the same diagnosis, not just the same symptom. Diagnosis, rather than questionnaire scores, should guide treatment and inform prognosis.
Summary Box
What is already known on this topic?
- The symptom of low back pain is a common cause of disability worldwide.
- The majority of people with low back pain probably do not have a structural problem in their lumbar spine to explain the extent of their disability.
- National guidelines throughout the world attempt to facilitate the identification and treatment of people with "non-specific low back pain", and prescribe treatment for people given this label.
What this study adds?
- It seems unlikely that symptom questionnaires can estimate prognosis accurately unless everyone has the same diagnosis, not just the same symptom.
- The practice of recommending treatment and prognosticating in the absence of a diagnosis needs further scrutiny. | null | medrxiv |
10.1101/19003590 | The HIV-Associated Neurocognitive Disorders in Zambia (HANDZ) Study: Protocol of a research program in pediatric HIV in sub-Saharan Africa | Adams, H. R.; Mwanza-Kabaghe, S.; Mbewe, E. G.; Kabundula, P. P.; Potchen, M. J.; Maggirwar, S.; Johnson, B. A.; Schifitto, G.; Gelbard, H. A.; Birbeck, G. L.; Bearden, D. R. | Heather R Adams | University of Rochester School of Medicine and Dentistry | 2019-08-20 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | hiv aids | https://www.medrxiv.org/content/early/2019/08/20/19003590.source.xml | Approximately 10% of youth in sub-Saharan Africa are infected with the Human Immunodeficiency Virus. In Zambia, it is estimated that over 72,000 children have HIV infection, and despite access to combination antiretroviral therapy, many will experience HIV-associated neurocognitive deficits (HAND) encompassing cognitive and psychiatric sequelae such as global intellectual delay, executive dysfunction, and depressed mood. However, little is known about the neurocognitive profile of such children, the long-term outcomes and impacts of HAND, or the predictors and risk factors for HAND-related impairment. We have initiated the first-ever prospective, longitudinal study of neurocognition in children with HIV-infection in Zambia. Our overarching study goals are to validate cognitive and psychiatric testing tools in children with HIV infection in Zambia, and to determine if inflammatory biomarkers and brain imaging can prospectively identify children at high risk of developing HAND. This article outlines the study methods, highlights several challenges encountered in the initiation of the study, and offers solutions to these challenges. | null | medrxiv |
10.1101/19004416 | Cytological Abnormalities and its relation to CD4 count among HIV seropositive women living in Ahvaz, southwest of Iran | darvishi, a.; alavi, s. m.; khafaie, m. a.; sokooti, A.; Molavi, S.; salmanzadeh, s. | shokrollah salmanzadeh | Ahvaz Jundishapur University of Medical Sciences | 2019-08-20 | 1 | PUBLISHAHEADOFPRINT | cc_by | hiv aids | https://www.medrxiv.org/content/early/2019/08/20/19004416.source.xml | IntroductionHuman immunodeficiency virus (HIV) infection is a known risk factor for abnormal cervical cytology and cervical cancer. The aim of this study was to investigate cervical cytological abnormalities and its relation with CD4 (T4 Lymphocyte) count among HIV seropositive women.
MethodsWe conducted a study on 58 HIV positive women referred to Ahvaz Counseling Center for Behavioral Disease, southwest of Iran between 2016 and 2017. Pap smear was performed for all participants from the cervix and endocervix. Patients characteristics including age, duration of disease, treatment with anti-retroviral treatment (ART), marital status, number of children, and contraception method were also recorded. Cervical cytological abnormalities reported as Bethesda system (TBS). A regular blood sample was taken from all the patients to evaluate the CD4 cells counts. Logistic regression models were used to obtain OR of presences of cytological abnormalities related to CD4 counts, controlling for important factors.
ResultsOut of 58 patients only 5 were not under ART. We demonstrated that 29.3 % of patients had squamous cell abnormalities and these abnormalities, was more prevalent among 30-40 years old patients (70.6%). The prevalence of ASC-US (Atypical Squamous Cells of Undetermined Significance), LSIL (Low-Grade Squamous Intraepithelial Lesions) and HSIL (High-Grade Squamous Intraepithelial Lesions) were 19.0%, 3.4%, and 6.9% respectively. Overall 9 patients need to repeat Pap smear test. Presence of cervical cytological abnormalities was not associated with the CD4 count, even after adjusting for the variable such age, duration of disease and ART.
ConclusionWe found a high prevalence of ASC-US in HIV-infected women which was independent of age, duration of diseases and history of ART. Though cervical cancer screening in this population might have a substantial public health benefit.
Summary box
- More than 70% of cervical cancer incidences are associated with genital HPV infections
- The prevalence of squamous cell abnormalities among HIV-infected women was about six times higher than in the general population
- We demonstrated that squamous cell abnormalities are more prevalent in middle-aged women (30 to 40 years)
- The high prevalence of squamous cell abnormalities in HIV-infected women warrants the need for regular Pap smear screening | null | medrxiv |
10.1101/19005058 | Prevalence and clinical importance of titin truncating variants in adults without known congestive heart failure | Pirruccello, J. P.; Bick, A.; Friedman, S.; Chaffin, M.; Aragam, K. G.; Choi, S. H.; Lubitz, S. A.; Ho, C.; Ng, K.; Philippakis, A.; Ellinor, P. T.; Kathiresan, S.; Khera, A. V. | Amit V Khera | Massachusetts General Hospital | 2019-08-20 | 1 | PUBLISHAHEADOFPRINT | cc_no | cardiovascular medicine | https://www.medrxiv.org/content/early/2019/08/20/19005058.source.xml | BackgroundCross-sectional studies of various forms of dilated cardiomyopathy have noted a truncating mutation in the gene encoding titin ( TTNtv) in 7-30% of patients, but the clinical importance of identifying a TTNtv in an asymptomatic adult is largely unknown. In contrast to cross-sectional studies, prospective cohort studies allow for unbiased estimates of the disease risks associated with a genotype exposure.
ObjectivesTo determine the prevalence of cardiac imaging abnormalities and risk of incident disease among middle-aged TTNtv carriers without known congestive heart failure.
MethodsWe analyze exome sequencing data of 45,747 participants of the UK Biobank without known congestive heart failure to identify TTNtv carriers. Among 10,552 with cardiac magnetic resonance imaging (MRI), we determine the relationship between TTNtv carrier status and left ventricular ejection fraction. In this prospective cohort, we quantify the absolute and relative risks of incident disease in TTNtv carriers versus noncarriers.
ResultsAmong 45,747 middle-aged participants without known congestive heart failure, 196 (0.43%) harbored a TTNtv. The average ejection fraction was 61% in TTNtv carriers versus 65% in noncarriers (P = 1.8 x 10-8), with a 9.3-fold increase (95% CI 3.9 - 22.2) in odds of subnormal ejection fraction (P = 5.7 x 10-5). Over a median follow-up of 6.9 years, a composite endpoint of incident dilated cardiomyopathy, congestive heart failure, or all-cause mortality was observed in 6.6% of TTNtv carriers versus 2.9% of non-carriers (adjusted hazard ratio 2.5; 95% CI 1.4 - 4.3; p = 1.1 x 10-3).
ConclusionsApproximately 1 in 230 middle-aged adults without known congestive heart failure harbored a TTNtv. These carriers had a substantially increased relative risk--but modest absolute risk--of having a subnormal ejection fraction or manifesting clinical disease during prospective follow-up.
Condensed AbstractCross-sectional studies of dilated cardiomyopathy have noted a truncating mutation in the gene encoding titin ( TTNtv) in up to 30% of patients--but the clinical importance of TTNtv in asymptomatic adults is largely unknown. Here, we observe a TTNtv in 0.43% of 45,747 middle-aged adults. Average ejection fraction was 61% in TTNtv carriers versus 65% in non-carriers (p<0.001). Over a median follow-up of 7 years, incident congestive heart failure or mortality was observed in 6.6% of TTNtv carriers versus 2.9% of non-carriers (hazard ratio 2.5; p = 0.001). | 10.1016/j.jacc.2020.01.013 | medrxiv |
10.1101/19005074 | An Analysis for Key Indicators of Reproducibility in Radiology | Wright, B.; Vo, N.; Nolan, J.; Johnson, A. L.; Braaten, T.; Tritz, D.; Vassar, M. | Bryan Wright | Oklahoma State University Center for Health Sciences | 2019-08-20 | 1 | PUBLISHAHEADOFPRINT | cc_no | radiology and imaging | https://www.medrxiv.org/content/early/2019/08/20/19005074.source.xml | BackgroundGiven the central role of radiology in patient care, it is important that radiological research is grounded in reproducible science. It remains unexamined whether there is a lack of reproducibility or transparency in radiologic research.
PurposeThe purpose of this study was to analyze published radiology literature for the presence or absence of key indicators of reproducibility.
MethodsThis cross-sectional, retrospective study was performed by conducting a search of the National Library of Medicine to identify publications contained within journals in the field of Radiology. Journals that were not written in English or MEDLINE indexed were excluded from the analysis. Studies published from January 1, 2014 to December 31, 2018 were used to generate a random list of 300 publications for this meta-analysis. A pilot-tested, Google form was used to evaluate key indicators of reproducibility in the queried publications.
ResultsOur initial search returned 295,543 records, from which 300 were randomly selected for analysis. Of these 300 records, 294 met the inclusion criteria. Among the empirical publications, 5.6% contained a data availability statement (11/195, 95% CI: 3.0-8.3), 0.51% provided clearly documented raw data (1/195), 12.0% provided a materials availability statement (23/191, 8.4-15.7), none provided analysis scripts, 4.1% provided a preregistration statement (8/195, 1.9-6.3), 2.1% provided a protocol statement (4/195, 0.4-3.7), and 3.6% were preregistered (7/195, 1.5-5.7).
ConclusionOur findings demonstrate that key indicators of reproducibility are missing in the field of radiology. Thus, the ability to reproduce radiological studies may be problematic and may have potential clinical implications. | 10.1186/s13244-020-00870-x | medrxiv |
10.1101/19003939 | Harmonizing Palliative Care: National Survey to Evaluate the Knowledge and Attitude of Emergency Physicians towards Palliative Care | AlAnsari, A. M.; Suroor, S. N.; AboSerea, S. M.; Abd-El-Gawad, W. | Wafaa Abd-El-Gawad | AinShams university | 2019-08-20 | 1 | PUBLISHAHEADOFPRINT | cc_no | palliative medicine | https://www.medrxiv.org/content/early/2019/08/20/19003939.source.xml | Background and AimAlthough the challenges of integrating palliative care practices across care settings are real and well recognized until now little is known about palliative care practice of emergency physicians (EPs) and their accessibility to palliative care services in Kuwait. So the aim of this study was to explore the attitude, and knowledge encountered by EPs in providing palliative care in all general hospitals in Kuwait.
MethodA cross-sectional survey was performed in the emergency rooms of all general hospitals in Kuwait using Palliative Care Attitude and Knowledge (PCAK) questionnaire.
ResultsOf the total number of physicians working in emergency rooms (n=156), 104 (66.67%) had completed the survey. 76.9% (n=80) of the EPs had either uncertain attitude toward palliative care. Most of the EPs (n=73, 70.28%) didn't discuss the patients' need for palliative care either with the patients or their families. Only 16 (15.4%) of the EPs responded correctly to most of the questions, while nearly half of the EPs (n=51, 49%) had poor knowledge, especially in the most effective management of refractory dyspnoea (n=18, 17.3%). Experience ≥11 yrs and better knowledge scores were independent predictors of positive attitude after adjustment of age, sex, qualifications, specialty, position, and nationality [OR: 5.747 (CI: 1.031-25.00), 1.458 (CI: 1.148-1.851); p-value: 0.021, 0.002 respectively].
ConclusionsDespite recognizing palliative care as an important competence, the majority of the emergency physicians in Kuwait had uncertain attitude and poor knowledge towards palliative care. Lack of knowledge, direct accessibility to palliative care services and lack of support from palliative medicine specialists were the main reasons for uncertain and negative attitude. Efforts should be done to enhance physician training and provide palliative care resources in order to improve the quality of care given to patients visiting emergency departments.
What this paper adds
- Studies proved that the emergency room may be a suitable place for early referral of patients who may benefit from palliative care, especially old age, to prevent upcoming undesired admissions and hospital deaths.
- The integration of palliative care concepts and consultation teams into emergency medicine may help to avoid unnecessary and burdensome treatments, tests, and procedures that are not aligned with patients' goals of care.
- Although the challenges of integrating palliative care practices across care settings are real and well recognized, until now little is known about palliative care practice of emergency physicians and their accessibility to palliative care services in Kuwait.
- Recently, a newly developed tool called the Palliative Care Attitude and Knowledge (PCAK) questionnaire was created to assess the attitude and knowledge of non-palliative physicians toward palliative care. So the aim of this study was to explore the attitude and knowledge encountered by emergency physicians in providing palliative care using PCAK in emergency departments in all general hospitals in Kuwait.
- Studies showed that early palliative care consultation was shown to improve quality of life for cancer patients and may even lengthen their survival.
What this study adds
- Despite recognizing palliative care as an important competence, the majority of the emergency physicians in Kuwait had uncertain attitude and poor knowledge towards palliative care. Lack of knowledge, direct accessibility to palliative care services and lack of support from palliative medicine specialists were the main reasons for uncertain and negative attitude.
- Efforts should be done to enhance physician training and provide palliative care resources in order to improve the quality of care given to patients visiting emergency departments. | 10.1136/bmjspcare-2019-002141 | medrxiv |
10.1101/19003913 | Research on Artificial Intelligence and Primary Care: A Scoping Review | Kueper, J. K.; Terry, A. L.; Zwarenstein, M.; Lizotte, D. J. | Jacqueline K Kueper | Western University | 2019-08-21 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | primary care research | https://www.medrxiv.org/content/early/2019/08/21/19003913.source.xml | ObjectiveThe purpose of this study was to assess the nature and extent of the body of research on artificial intelligence (AI) and primary care.
MethodsWe performed a scoping review, searching 11 published and grey literature databases with subject headings and key words pertaining to the concepts of 1) AI and 2) primary care: MEDLINE, EMBASE, Cinahl, Cochrane Library, Web of Science, Scopus, IEEE Xplore, ACM Digital Library, MathSciNet, AAAI, arXiv. Screening included title and abstract and then full text stages. Final inclusion criteria: 1) research study of any design, 2) developed or used AI, 3) used primary care data and/or study conducted in a primary care setting and/or explicit mention of study applicability to primary care; exclusion criteria: 1) narrative, editorial, or textbook chapter, 2) not applicable to primary care population or settings, 3) full text inaccessible in the English Language. We extracted and summarized seven key characteristics of included studies: overall study purpose(s), author appointments, primary care functions, author intended target end user(s), target health condition(s), location of data source(s) (if any), subfield(s) of AI.
ResultsOf 5,515 non-duplicate documents, 405 met our eligibility criteria. The body of literature is primarily focused on creating novel AI methods or modifying existing AI methods to support physician diagnostic or treatment recommendations, for chronic conditions, using data from higher income countries. Meaningfully more studies had at least one author with a technology, engineering, or math appointment than with a primary care appointment (57 (14%) compared to 217 (54%)). Predominant AI subfields were supervised machine learning and expert systems.
DiscussionOverall, AI research associated with primary care is at an early stage of maturity with respect to widespread implementation in practice settings. For the field to progress, more interdisciplinary research teams with end-user engagement and evaluation studies are needed.
Summary Boxes
Section 1: What is already known on this topic
- Advancements in technology and the availability of health data have increased opportunities for artificial intelligence to be used for primary care purposes.
- No comprehensive review of research on artificial intelligence associated with primary care has been performed.
Section 2: What this study adds
- The body of research on artificial intelligence and primary care is driven by authors without appointments in primary care departments and is focused on developing artificial intelligence methods to support diagnostic and treatment decisions.
- There is a need for more interdisciplinary research teams and evaluation of artificial intelligence projects in real world practice settings. | 10.1370/afm.2518 | medrxiv |
10.1101/19004796 | Photo-Integrated Conversation Moderated by Robots for Cognitive Health in Older Adults: A Randomized Controlled Trial | Otake-Matsuura, M.; Tokunaga, S.; Watanabe, K.; Abe, M. S.; Sekiguchi, T.; Sugimoto, H.; Kishimoto, T.; Kudo, T. | Mihoko Otake-Matsuura | RIKEN | 2019-08-21 | 1 | PUBLISHAHEADOFPRINT | cc_by_nd | geriatric medicine | https://www.medrxiv.org/content/early/2019/08/21/19004796.source.xml | Background and ObjectivesSocial interaction might prevent or delay dementia, but little is known about the specific effects of various social activity interventions on cognition. This study conducted a single-site randomized controlled trial (RCT) of Photo-Integrated Conversation moderated by a Robot (PICMOR), a group conversation intervention program for resilience against cognitive decline and dementia.
Research Design and MethodsIn the RCT, PICMOR was compared to an unstructured group conversation condition. Sixty-five community-living older adults participated in this study. The intervention was provided once a week for 12 weeks. Primary outcome measures were the cognitive functions; process outcome measures included the linguistic characteristics of speech to estimate interaction quality. Baseline and post-intervention data were collected. PICMOR contains two key features: (i) photos taken by the participants are displayed and discussed sequentially; and (ii) a robotic moderator manages turn-taking to make sure that participants are allocated the same amount of time.
ResultsAmong the primary outcome measures (i.e., cognitive functions), verbal fluency significantly improved in the intervention group. Among the process outcome measures (i.e., linguistic characteristics of speech), the amount of speech and richness of words were larger for the intervention group.
Discussion and ImplicationsThis study demonstrated for the first time the positive effects of a robotic social activity intervention on cognitive function in healthy older adults via RCT. The group conversation generated by PICMOR may improve participants cognitive function controlling the amount of speech produced to make it equal. PICMOR is available and accessible to community-living older adults. | 10.3389/frobt.2021.633076 | medrxiv |
10.1101/19003756 | Health literacy, cognitive ability and self-reported diabetes in the English Longitudinal Study of Ageing | Fawns-Ritchie, C.; Price, J.; Deary, I. J. | Chloe Fawns-Ritchie | University of Edinburgh | 2019-08-22 | 1 | PUBLISHAHEADOFPRINT | cc_no | epidemiology | https://www.medrxiv.org/content/early/2019/08/22/19003756.source.xml | ObjectiveTo examine the association of health literacy and cognitive ability with risk of diabetes.
Research Design and Methods: Participants were 8,669 English Longitudinal Study of Ageing participants (mean age 66.7 years, SD 9.7) who completed health literacy and cognitive ability tests at wave 2 (2004-2005), and who answered a self-reported question on whether a doctor had ever diagnosed them with diabetes. Logistic regression was used to examine the cross-sectional associations of health literacy and cognitive ability with diabetes status. In those without diabetes at wave 2, Cox regression was used to test the associations of health literacy and cognitive ability with risk of diabetes over a median of 9.5 years follow-up (n=6,961).
ResultsAdequate (compared to limited) health literacy (OR 0.72, 95% CI 0.61-0.84) and higher cognitive ability (OR per 1 SD 0.73, CI 0.67-0.80) were both associated with lower odds of self-reported diabetes. Adequate health literacy (HR 0.64; CI 0.53-0.77) and higher cognitive ability (HR 0.77, CI 0.69-0.85) were also associated with lower risk of self-reporting diabetes during follow-up. When both health literacy and cognitive ability were added to the same model, these associations were slightly attenuated. Additional adjustment for health behaviours, education and social class attenuated associations further, and neither health literacy nor cognitive ability were significantly associated with diabetes.
ConclusionsAdequate health literacy and better cognitive ability were associated with reduced risk of diabetes. These associations were independent of each other, but not of other health- and socioeconomic-related variables. | 10.1136/bmjopen-2021-058496 | medrxiv |
10.1101/19004879 | Dynamic Changes in Prescription Opioids from 2006 to 2017 in Texas | Ighodaro, E. T.; McCall, K. L.; Chung, D. Y.; Nichols, S. D.; Piper, B. J. | Brian James Piper | Geisinger Commonwealth School of Medicine | 2019-08-22 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | pharmacology and therapeutics | https://www.medrxiv.org/content/early/2019/08/22/19004879.source.xml | Study ObjectiveThe US is experiencing an epidemic of opioid overdoses which may be at least partially due to an over-reliance on opioid analgesics in the treatment of chronic non-cancer pain and subsequent escalation to heroin or illicit fentanyl. As Texas was reported to be among the lowest in the US for opioid use and misuse, further examination of this state is warranted.
Study DesignThis study was conducted to quantify prescription opioid use in Texas.
Data SourceData was obtained from the publically available US Drug Enforcement Administrations Automation of Reports and Consolidated Orders System (ARCOS) which monitors controlled substances transactions from manufacture to commercial distribution.
Measurement and Main ResultsData for 2006-2017 from Texas for ten prescription opioids including eight primarily used to relieve pain (codeine, fentanyl, hydrocodone, hydromorphone, meperidine, morphine, oxycodone, oxymorphone) and two (buprenorphine and methadone) for the treatment of an Opioid Use Disorder (OUD) were examined. The change in Morphine Mg Equivalent (MME) of all opioids (+23.3%) was only slightly greater than the states population gains (21.1%). Opioids used to treat an OUD showed pronounced gains (+90.8%) which were four-fold faster than population growth. Analysis of individual agents revealed pronounced elevations in codeine (+387.5%), hydromorphone (+106.7%), and oxycodone (+43.6%) and a reduction in meperidine (-80.3%) in 2017 relative to 2006. Methadone in 2017 accounted for a greater portion (39.5%) of the total MME than hydrocodone, oxycodone, morphine, hydromorphone, oxymorphone, and meperidine, combined. There were differences between urban and rural areas in the changes in hydrocodone and buprenorphine.
ConclusionsCollectively, these findings indicate that continued vigilance is needed in Texas to appropriately treat pain and an OUD while minimizing the potential for prescription opioid diversion and misuse. Texas may lead the US in a return to pre opioid crisis prescription levels. | 10.7717/peerj.8108 | medrxiv |
10.1101/19004853 | Candidate molecular predictors of outcome after aneurysmal subarachnoid haemorrhage: a systematic review of haemoglobin metabolism, inflammation and oxidative injury pathways. | Gaastra, B.; Galea, I. | Ben Gaastra | Wessex Neurological Centre, University Hospital Southampton, Southampton, UK | 2019-08-22 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | neurology | https://www.medrxiv.org/content/early/2019/08/22/19004853.source.xml | Aneurysmal subarachnoid haemorrhage (aSAH) is a devastating form of stroke associated with significant morbidity and mortality. Very little is known about the predictors of poor outcome and the pathophysiological mechanisms underlying neurological injury following aSAH. Three molecular pathways have been shown to be important: haemoglobin metabolism, inflammation and oxidative injury. The aim of this review is to use a systematic approach to identify a panel of key genes within these three pathways in order to focus future studies investigating predictors of poor outcome and the mechanisms of neurological injury following aSAH. Manual searching and bioinformatic mining tools were used. Studies of experimental or human SAH were included, and outcome was broadly defined to include all encountered readouts such as mortality, neurological scores, and neuropathological markers of tissue damage. If two or more molecules belonged to the same biochemical pathway, this pathway was examined in detail to identify all its components, which were then searched individually for any evidence of association with outcome using the same broad definition as before. This resulted in the identification of 58 candidate genes within the three pathways of interest (haemoglobin metabolism, inflammation and oxidative injury) potentially linked to outcome after aSAH. | null | medrxiv |
10.1101/19004929 | Social Media Surveillance for Perceived Therapeutic Effects of Cannabidiol (CBD) Products | Tran, T.; Kavuluru, R. | Ramakanth Kavuluru | University of Kentucky | 2019-08-22 | 1 | PUBLISHAHEADOFPRINT | cc_no | health informatics | https://www.medrxiv.org/content/early/2019/08/22/19004929.source.xml | BackgroundCBD products have risen in popularity given CBDs therapeutic potential and lack of legal oversight, despite lacking conclusive scientific evidence for widespread over-the-counter usage for many of its perceived benefits. While medical evidence is being generated, social media surveillance offers a fast and inexpensive alternative to traditional surveys in ascertaining perceived therapeutic purposes and modes of consumption for CBD products.
MethodsWe collected all comments from the CBD subreddit posted between January 1 and April 30, 2019 as well as comments submitted to the FDA regarding regulation of cannabis-derived products and analyzed them using a rule-based language processing method. A relative ranking of popular therapeutic uses and product groups for CBD is obtained based on frequency of pattern matches including precise queries that entail identifying mentions of the condition, a CBD product, and some "trigger" phrase indicating therapeutic use. We validated the social media-based findings using a similar analysis on comments to the U.S. Food and Drug Administrations (FDA) 2019 request-for-comments on cannabis-derived products.
ResultsCBD is mostly discussed as a remedy for anxiety disorders and pain and this is consistent across both comment sources. Of comments posted to the CBD subreddit during the monitored time span, 6.19% mentioned anxiety at least once with at least 6.02% of these comments specifically mentioning CBD as a treatment for anxiety (i.e., 0.37% of total comments). The most popular CBD product group is oil and tinctures.
ConclusionSocial media surveillance of CBD usage has the potential to surface new therapeutic use-cases as they are posted. Contemporary social media data indicate, for example, that stress and nausea are frequently mentioned as therapeutic use cases for CBD without corresponding evidence, that affirms or denies, in the research literature. However, the abundance of anecdotal claims warrants serious scientific exploration moving forward. Meanwhile, as FDA ponders regulation, our effort demonstrates that social data offers a convenient affordance to surveil for CBD usage patterns in a way that is fast and inexpensive and can inform conventional electronic surveys. | 10.1016/j.drugpo.2020.102688 | medrxiv |
10.1101/19004929 | Social Media Surveillance for Perceived Therapeutic Effects of Cannabidiol (CBD) Products | Tran, T.; Kavuluru, R. | Ramakanth Kavuluru | University of Kentucky | 2020-02-06 | 2 | PUBLISHAHEADOFPRINT | cc_no | health informatics | https://www.medrxiv.org/content/early/2020/02/06/19004929.source.xml | BackgroundCBD products have risen in popularity given CBD's therapeutic potential and lack of legal oversight, despite lacking conclusive scientific evidence for widespread over-the-counter usage for many of its perceived benefits. While medical evidence is being generated, social media surveillance offers a fast and inexpensive alternative to traditional surveys in ascertaining perceived therapeutic purposes and modes of consumption for CBD products.
MethodsWe collected all comments from the CBD subreddit posted between January 1 and April 30, 2019 as well as comments submitted to the FDA regarding regulation of cannabis-derived products and analyzed them using a rule-based language processing method. A relative ranking of popular therapeutic uses and product groups for CBD is obtained based on frequency of pattern matches including precise queries that entail identifying mentions of the condition, a CBD product, and some "trigger" phrase indicating therapeutic use. We validated the social media-based findings using a similar analysis on comments to the U.S. Food and Drug Administration's (FDA) 2019 request-for-comments on cannabis-derived products.
ResultsCBD is mostly discussed as a remedy for anxiety disorders and pain and this is consistent across both comment sources. Of comments posted to the CBD subreddit during the monitored time span, 6.19% mentioned anxiety at least once with at least 6.02% of these comments specifically mentioning CBD as a treatment for anxiety (i.e., 0.37% of total comments). The most popular CBD product group is oil and tinctures.
ConclusionSocial media surveillance of CBD usage has the potential to surface new therapeutic use-cases as they are posted. Contemporary social media data indicate, for example, that stress and nausea are frequently mentioned as therapeutic use cases for CBD without corresponding evidence that affirms or denies these claims in the research literature. However, the abundance of anecdotal claims warrants serious scientific exploration moving forward. Meanwhile, as the FDA ponders regulation, our effort demonstrates that social data offers a convenient affordance to surveil for CBD usage patterns in a way that is fast and inexpensive and can inform conventional electronic surveys. | 10.1016/j.drugpo.2020.102688 | medrxiv |
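The rule-based matching the two CBD surveillance records above describe (a condition mention plus a CBD product plus a "trigger" phrase in the same comment) can be sketched in a few lines. The term lists, regular expressions and sample comments below are hypothetical stand-ins, not the authors' lexicons or data; this is a minimal illustration of the pattern-matching idea only.

```python
import re

# Hypothetical mini-lexicons; the study's actual term lists are not given in this record.
CONDITIONS = {"anxiety": r"\banxiety\b", "pain": r"\b(chronic )?pain\b"}
PRODUCTS   = r"\b(cbd\s+)?(oil|tincture|gummies|vape|capsule)s?\b"
TRIGGERS   = r"\b(helps?|helped|treats?|relieves?|for my)\b"

def tally(comments):
    """Return per-condition counts of (any mention, therapeutic-use mention)."""
    out = {c: [0, 0] for c in CONDITIONS}
    for text in comments:
        t = text.lower()
        for cond, pat in CONDITIONS.items():
            if re.search(pat, t):
                out[cond][0] += 1
                # "precise query": condition + product + trigger phrase in one comment
                if re.search(PRODUCTS, t) and re.search(TRIGGERS, t):
                    out[cond][1] += 1
    return out

comments = [
    "CBD oil helps my anxiety a lot",
    "Tried gummies for chronic pain, not sure yet",
    "Anyone else get anxiety from caffeine?",
]
for cond, (n_any, n_use) in tally(comments).items():
    print(f"{cond}: mentioned in {n_any} comments, therapeutic-use pattern in {n_use}")
```

Relative rankings of conditions and product groups, of the kind reported above, would then follow from sorting these frequencies.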
10.1101/19005025 | Multidrug therapy with terbinafine plus daily fluconazole is more effective than terbinafine alone or terbinafine plus weekly fluconazole in current epidemic of altered dermatophytosis in India: Results of a randomized pragmatic trial | Singh, S.; Jha, B.; Shukla, P.; Anchan, V. N. | Sanjay Singh | Institute of Medical Sciences, Banaras Hindu University, Varanasi, India | 2019-08-22 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | dermatology | https://www.medrxiv.org/content/early/2019/08/22/19005025.source.xml | BackgroundTreatment responsiveness of tinea has decreased considerably in the recent past in India. We tested the effectiveness of oral terbinafine daily plus fluconazole weekly (TFw) and terbinafine daily plus fluconazole daily (TFd) versus oral terbinafine daily (T) in tinea corporis, tinea cruris and tinea faciei in a pragmatic randomized open trial.
MethodsOne hundred and seventeen microscopy-confirmed patients were allocated to T (6 mg/kg/day), TFw (terbinafine 6 mg/kg/day+fluconazole 12 mg/kg once weekly), or TFd (terbinafine 6 mg/kg/day+fluconazole 6 mg/kg/day) groups by concealed randomization and treated for 8 weeks or until cure. Each group included 39 patients.
ResultsAt 4 weeks, 9 (23.1%), 8 (20.5%) and 14 (35.9%) patients were cured in the T, TFw and TFd groups, respectively (P=0.279). At 8 weeks, the number of patients cured was as follows: T 13 (33.3%), TFw 18 (46.2%) and TFd 25 (64.1%). TFd was more effective than T (P=0.012); other comparisons were not significantly different. However, the effect size as calculated by the number needed to treat (NNT) (versus terbinafine) was 8 for TFw and 4 for TFd. Relapse rates one month after cure were similar in all groups (P=0.664).
ConclusionsIn view of the cure rates and NNT, terbinafine plus daily fluconazole is more effective than terbinafine alone or terbinafine plus weekly fluconazole in the current epidemic of altered dermatophytosis in India.
One Sentence SummaryTerbinafine plus daily fluconazole is more effective than terbinafine alone or terbinafine plus weekly fluconazole in the current epidemic of altered dermatophytosis in India. | null | medrxiv |
10.1101/19004986 | Poor glycemic control and associated factors among diabetic patients in Ethiopia; A Systemic review and meta-analysis | teklehaimanot, b. f.; berhe, a. k.; welearegawi, g. g. | berhane fseha teklehaimanot | adigrat university | 2019-08-22 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | endocrinology | https://www.medrxiv.org/content/early/2019/08/22/19004986.source.xml | IntroductionDiabetes is a major global public health problem, and the burden is especially high in low-income countries, including Ethiopia, because of limited resources for screening and early diagnosis. To prevent diabetic complications, including organ damage and microvascular complications, blood glucose levels should be maintained at an optimal level. However, there was no pooled national picture of poor glycemic control and its associated factors.
MethodsSeveral database search engines, including PubMed, Google Scholar, the Cochrane Library, MEDLINE, HINARI and African Journals Online (AJOL), were used. The Joanna Briggs Critical Appraisal Tools and the Newcastle-Ottawa Scale for assessing the quality of cross-sectional studies were used for quality assessment. The meta-analysis was conducted using STATA 14 software. The I2 statistic and Egger's weighted regression were used to assess heterogeneity and publication bias.
ResultsA total of 134 studies were identified from different database search engines and other sources. After removing duplicates, excluding records without abstracts and reviewing the full texts, 12 studies were included in the meta-analysis. The pooled prevalence of poor glycemic control among diabetic patients in Ethiopia was 64.72% (95% confidence interval 63.16-66.28%). Subgroup analysis of poor glycemic control among diabetic patients in different regions of the country showed a consistently high prevalence, ranging from 62.5% in Tigray region to 65.6% in Oromia region. Residence, dyslipidemia and diet adherence were significantly associated with poor glycemic control among diabetic patients in Ethiopia.
ConclusionThe prevalence of poor glycemic control among diabetic patients was high in Ethiopia and consistent across different regions of the country. The most important factors associated with poor glycemic control among diabetic patients were rural residence, dyslipidemia and non-adherence to a dietary plan. | null | medrxiv |
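A pooled prevalence with an I2 heterogeneity statistic, as reported in the review above, is typically obtained with random-effects meta-analysis. Below is a minimal DerSimonian-Laird sketch on raw proportions (pooling on the logit scale, or in STATA as the authors did, are equally valid choices); the study-level proportions and sample sizes are invented.

```python
import numpy as np

def random_effects_pool(p, n):
    """DerSimonian-Laird random-effects pooling of proportions (illustrative sketch)."""
    p, n = np.asarray(p, float), np.asarray(n, float)
    v = p * (1 - p) / n                      # within-study variance of each proportion
    w = 1 / v
    p_fixed = np.sum(w * p) / np.sum(w)      # fixed-effect estimate
    Q = np.sum(w * (p - p_fixed) ** 2)       # Cochran's Q
    df = len(p) - 1
    tau2 = max(0.0, (Q - df) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
    I2 = max(0.0, (Q - df) / Q) * 100 if Q > 0 else 0.0
    w_re = 1 / (v + tau2)                    # random-effects weights
    p_re = np.sum(w_re * p) / np.sum(w_re)
    se = np.sqrt(1 / np.sum(w_re))
    return p_re, (p_re - 1.96 * se, p_re + 1.96 * se), I2

# Hypothetical study-level proportions of poor glycemic control and sample sizes
props = [0.62, 0.66, 0.64, 0.68]
sizes = [250, 310, 180, 220]
pooled, ci, i2 = random_effects_pool(props, sizes)
print(f"pooled prevalence {pooled:.3f}, 95% CI {ci[0]:.3f}-{ci[1]:.3f}, I^2 {i2:.1f}%")
```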
10.1101/19004606 | Characterizing traumatic brain injury and its association with homelessness in a community-based sample of precariously housed adults and youth | Stubbs, J. L.; Thornton, A. E.; Gicas, K. M.; O'Connor, T. A.; Livingston, E. M.; Lu, H. Y.; Mehta, A. K.; Lang, D. J.; Vertinsky, A. T.; Field, T. S.; Heran, M. K.; Leonova, O.; Buchanan, T.; Barr, A. M.; MacEwan, W.; Honer, W. G.; Panenka, W. J. | William J. Panenka | University of British Columbia | 2019-08-23 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | public and global health | https://www.medrxiv.org/content/early/2019/08/23/19004606.source.xml | We characterized the prevalence, mechanisms, and sex difference of lifetime traumatic brain injury (TBI) in a precariously housed sample. We also examined the impact of TBI severity and timing on becoming and staying homeless. 285 precariously housed participants (adults n = 226, youths n = 59) completed the Brain Injury Screening Questionnaire (BISQ) in addition to other health assessments. A history of TBI was reported in 82.1% of the sample, with 64.6% reporting > 1 TBI, and 21.4% reporting a moderate or severe TBI (msTBI). 10.1% of adults had traumatically-induced lesions on MRI scans. Assault was the most common mechanism of injury overall, and females reported significantly more TBIs due to physical abuse than males (adjusted OR = 1.26, 95% CI = 1.14 - 1.39, p = 9.18e-6). The first msTBI was significantly closer to the first experience of homelessness (b = 2.79, p = 0.003) and precarious housing (b = 2.69, p = 7.47e-4) than was the first mild TBI. Traumatic brain injuries more proximal to the initial loss of stable housing were associated with a longer lifetime duration of homelessness (RR = 1.04, 95% CI = 1.02 - 1.06, p = 6.8e-6) and precarious housing (RR = 1.03, 95% CI = 1.01 - 1.04, p = 5.5e-10). These findings demonstrate the high prevalence of TBI in vulnerable persons and the severity- and timing-related risk that TBI may confer for the onset and prolongation of homelessness. | 10.1177/07067437211000665 | medrxiv |
10.1101/19005413 | Relationship between early childhood non-parental childcare and diet, physical activity, sedentary behaviour, and sleep: A systematic review of longitudinal studies | Costa, S.; Benjamin Neelon, S.; Winpenny, E.; Phillips, V.; Adams, J. | Silvia Costa | Loughborough University | 2019-08-23 | 1 | PUBLISHAHEADOFPRINT | cc_no | public and global health | https://www.medrxiv.org/content/early/2019/08/23/19005413.source.xml | BackgroundThe rising prevalence of childhood obesity is a global public health concern. Evidence suggests that exposure to non-parental childcare before age six years is associated with increased risk of obesity and with diet and activity behaviours (physical activity, sedentary behaviour, and sleep). However, findings are inconsistent and mostly from cross-sectional studies, making it difficult to identify the direction of causation in associations. This review identified and synthesised the published research on longitudinal associations between non-parental childcare during early childhood and diet and activity behaviours.
MethodsSeven databases were searched using a predefined search strategy. Results were independently double-screened through title/abstract and full-text stages according to predefined criteria. Included studies were tabulated, and evaluated for risk of bias using the Nutrition Evidence Library Bias Assessment Tool.
ResultsOf 18793 references screened, 13 studies met eligibility criteria and were included in the review. Eight studies reported on diet and seven studies reported on activity behaviour outcomes (three on physical activity, three on sedentary behaviour, and one on sleep). These studies included results on 89 tested childcare:outcome associations. Of 63 associations testing diet outcomes, 37 (59%) were null, and the remainder showed inconsistent patterns. There was an indication of a potential benefit of Head Start providers (vs other care, including parental) on dietary behaviours. Of 26 associations testing activity behaviour outcomes, 22 (85%) were null, and the remainder were inconsistent. Most studies (92%) did not use (or did not report using) valid and reliable outcome measures, and outcome assessors were not blinded (or it was unclear if they were blinded) to children's exposure status (77%).
ConclusionsThe scarce available literature indicates little and mixed evidence of a longitudinal association between exposure to non-parental childcare before age six years and diet or activity behaviours. This reflects a paucity of research, rather than clear evidence of no effect. There is an urgent need for studies investigating the longitudinal associations of non-parental childcare on diet and activity behaviours to assess potential lasting effects and mechanisms. Studies should assess whether and how associations vary by provider and child sub-groups, as well as differences by intensity and duration of care. | 10.3390/ijerph16234652 | medrxiv |
10.1101/19005249 | Impairment of dual-task gait dynamics in older adults with mild cognitive impairment: Relationships to neuropsychological status, fitness and brain morphology | Hawkins, T. C.; Samuel, R.; Fiatarone Singh, M. A.; Gates, N.; Wilson, G. C.; Jain, N.; Meiklejohn, J.; Brodaty, H.; Wen, W.; Singh, N.; Baune, B. T.; Suo, C.; Baker, M. K.; Foroughi, N.; Wang, Y.; Sachdev, P. S.; Valenzuela, M. J.; Hausdorff, J. M.; Mavros, Y. | Yorgi Mavros | The University of Sydney | 2019-08-23 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | geriatric medicine | https://www.medrxiv.org/content/early/2019/08/23/19005249.source.xml | BackgroundIndividuals with Mild Cognitive Impairment (MCI) have more gait variability under dual-task conditions than cognitively healthy adults. However, characteristics associated with this susceptibility of gait to dual-task stress are unknown.
MethodsTesting was performed at baseline in the Study of Mental And Resistance Training (SMART). Ninety-three adults with MCI (age 70±6.8 years; 66.6% female) performed a single- and dual-task walk (cognitive distractor=letter fluency), in random order. Linear and non-linear gait variability were measured using force-sensitive insoles. Cognitive performance during dual-tasking was assessed by the number of correct words vocalized. Cognitive function, brain Magnetic Resonance Imaging (MRI), muscle strength, aerobic capacity, body composition, physical and psychosocial function were also assessed as potential correlates of gait dynamics.
ResultsGait dynamics worsened during dual-tasking, with decrements in both stride time variability (p<0.001) and detrended fluctuation analysis (DFA) (p=0.001). Lower aerobic capacity and thinner posterior cingulate cortex were associated with greater decrements in DFA (p<0.05). Smaller hippocampal volume, worse psychological well-being and poorer static balance were associated with greater decrements in stride time variability (p<0.05). By contrast, cognitive performance did not change under dual-task conditions compared to seated testing (p=0.13).
ConclusionsUnder dual-task conditions, participants with MCI preserved their cognitive performance at the expense of gait stability. Decrements in dual-tasking gait were associated with lower aerobic fitness, balance, psychological well-being, and brain volume in cognitively-relevant areas of the posterior cingulate and hippocampus, all potentially modifiable characteristics. Trials of targeted interventions are needed to determine the potential plasticity of gait variability in high-risk cohorts. | null | medrxiv |
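Detrended fluctuation analysis (DFA), one of the non-linear gait-variability measures in the record above, can be illustrated compactly. The window scales and the synthetic stride-time series below are arbitrary choices for illustration and do not reproduce the study's processing pipeline.

```python
import numpy as np

def dfa_alpha(x, scales=(4, 8, 16, 32, 64)):
    """Detrended fluctuation analysis scaling exponent alpha of a 1-D series."""
    x = np.asarray(x, float)
    y = np.cumsum(x - x.mean())              # integrated, mean-centred profile
    F = []
    for n in scales:
        n_win = len(y) // n
        f = []
        for i in range(n_win):
            seg = y[i * n:(i + 1) * n]
            t = np.arange(n)
            trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear detrend
            f.append(np.mean((seg - trend) ** 2))
        F.append(np.sqrt(np.mean(f)))
    # alpha is the slope of log F(n) versus log n
    alpha, _ = np.polyfit(np.log(scales), np.log(F), 1)
    return alpha

rng = np.random.default_rng(0)
stride_times = 1.05 + 0.03 * rng.standard_normal(512)   # synthetic stride-time series (s)
print(f"DFA alpha = {dfa_alpha(stride_times):.2f}")      # ~0.5 for uncorrelated noise
```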
10.1101/19004812 | Association of Single Nucleotide Polymorphisms with Dyslipidemia in Antiretroviral Exposed HIV Patients: A Case-Control Study in a Ghanaian population | Obirikorang, C.; ACHEAMPONG, E.; Quaye, L.; Yorke, J.; Amos-Abanyie, E. K.; Abena Akyaw, P.; Odame Anto, E.; Bannison Bani, S.; Adu Asamoah, E.; Nsenbah Batu, E. | Emmanuel ACHEAMPONG | Department of Molecular Medicine, School of Medical Science, Kwame Nkrumah University of Science and Technology (KNUST), Kumasi, Ghana | 2019-08-23 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | hiv aids | https://www.medrxiv.org/content/early/2019/08/23/19004812.source.xml | Dyslipidemia is a potential complication of long-term usage of antiretroviral therapy (ART) and is also known to be associated with genetic factors. The host genetic variants associated with dyslipidemia in HIV patients on ART in Ghana have not been fully explored. The study constituted a total of 289 HIV-infected patients on stable ART for at least a year and 85 age-matched apparently healthy control subjects with no history of HIV or dyslipidemia. Fasting blood was collected into EDTA tubes for lipid measurements. Lipid profiles were determined as a measure of dyslipidemia. HIV-infected patients were categorized into two groups: those with dyslipidemia (HIV-Dys+) (n=90; 31.1%) and those without dyslipidemia (n=199; 68.9%), based on the NCEP-ATP III criteria. Four candidate single nucleotide polymorphisms (SNPs) in genes (ABCA1-rs2066714, LDLR-rs6511720, APOA5-rs662799 and DSCAML1-rs10892151) were determined. Genotyping was performed on isolated genomic DNA of study participants using PCR followed by a multiplex Ligation Detection Reaction (LDR). The percentages of the population who had the rare homozygote alleles for rs6511720 (T/T), rs2066714 (G/G), rs10892151 (T/T) and rs662799 (G/G) were 5.5%, 14.4%, 6.6% and 10.0% among HIV+Dys+ subjects; 2.0%, 9.1%, 6.5% and 4.0% among HIV+Dys- subjects; and 3.5%, 4.7%, 4.7% and 2.4% among HIV-Dys- subjects. A statistically significant difference in genotypic prevalence of APOA5 polymorphisms was observed among the different groups (p=0.0196). Compared to the AA genotype of the APOA5 polymorphism, individuals with the rare homozygote genotype [aOR=4.01, 95% CI (1.57-22.39), p=0.004] were significantly more likely to develop dyslipidemia after controlling for age, gender, treatment duration and CD4 counts among the HIV+Dys+ subjects. There was also a significant association between the GG genotype of ABCA1 and dyslipidemia [aOR=3.29, 95% CI (1.08-12.43); p=0.042]. Individuals with the rare homozygote variant (GG) of APOA5 (rs662799) had a significantly increased likelihood of developing dyslipidemia [OR=2.24, 95% CI (1.20-6.83); p=0.0370], holding other variables constant, in the HIV+Dys- subjects. Our data accentuate the presence of SNPs in four candidate genes and their association with dyslipidemia among HIV patients exposed to ART in the Ghanaian population, especially variants in APOA5-rs662799 and ABCA1-rs2066714. These findings provide baseline information that necessitates a pre-symptomatic strategy for monitoring dyslipidemia in ART-treated HIV patients. There is a need for longitudinal studies to validate a comprehensive number of SNPs and their association with dyslipidemia. | 10.1371/journal.pone.0227779 | medrxiv |
10.1101/19004812 | Association of Single Nucleotide Polymorphisms with Dyslipidemia in Antiretroviral Exposed HIV Patients in a Ghanaian population | Obirikorang, C.; ACHEAMPONG, E.; Quaye, L.; Yorke, J.; Amos-Abanyie, E. K.; Abena Akyaw, P.; Odame Anto, E.; Bannison Bani, S.; Adu Asamoah, E.; Nsenbah Batu, E. | Emmanuel ACHEAMPONG | Department of Molecular Medicine, School of Medical Science, Kwame Nkrumah University of Science and Technology (KNUST), Kumasi, Ghana | 2019-11-05 | 2 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | hiv aids | https://www.medrxiv.org/content/early/2019/11/05/19004812.source.xml | Dyslipidemia is a potential complication of long-term usage of antiretroviral therapy (ART) and is also known to be associated with genetic factors. The host genetic variants associated with dyslipidemia in HIV patients on ART in Ghana have not been fully explored. The study constituted a total of 289 HIV-infected patients on stable ART for at least a year and 85 age-matched apparently healthy control subjects with no history of HIV or dyslipidemia. Fasting blood was collected into EDTA tubes for lipid measurements. Lipid profiles were determined as a measure of dyslipidemia. HIV-infected patients were categorized into two groups: those with dyslipidemia (HIV-Dys+) (n=90; 31.1%) and those without dyslipidemia (n=199; 68.9%), based on the NCEP-ATP III criteria. Four candidate single nucleotide polymorphisms (SNPs) in genes (ABCA1-rs2066714, LDLR-rs6511720, APOA5-rs662799 and DSCAML1-rs10892151) were determined. Genotyping was performed on isolated genomic DNA of study participants using PCR followed by a multiplex Ligation Detection Reaction (LDR). The percentages of the population who had the rare homozygote alleles for rs6511720 (T/T), rs2066714 (G/G), rs10892151 (T/T) and rs662799 (G/G) were 5.5%, 14.4%, 6.6% and 10.0% among HIV+Dys+ subjects; 2.0%, 9.1%, 6.5% and 4.0% among HIV+Dys- subjects; and 3.5%, 4.7%, 4.7% and 2.4% among HIV-Dys- subjects. A statistically significant difference in genotypic prevalence of APOA5 polymorphisms was observed among the different groups (p=0.0196). Compared to the AA genotype of the APOA5 polymorphism, individuals with the rare homozygote genotype [aOR=4.01, 95% CI (1.57-22.39), p=0.004] were significantly more likely to develop dyslipidemia after controlling for age, gender, treatment duration and CD4 counts among the HIV+Dys+ subjects. There was also a significant association between the GG genotype of ABCA1 and dyslipidemia [aOR=3.29, 95% CI (1.08-12.43); p=0.042]. Individuals with the rare homozygote variant (GG) of APOA5 (rs662799) had a significantly increased likelihood of developing dyslipidemia [OR=2.24, 95% CI (1.20-6.83); p=0.0370], holding other variables constant, in the HIV+Dys- subjects. Our data accentuate the presence of SNPs in four candidate genes and their association with dyslipidemia among HIV patients exposed to ART in the Ghanaian population, especially variants in APOA5-rs662799 and ABCA1-rs2066714. These findings provide baseline information that necessitates a pre-symptomatic strategy for monitoring dyslipidemia in ART-treated HIV patients. There is a need for longitudinal studies to validate a comprehensive number of SNPs and their association with dyslipidemia. | 10.1371/journal.pone.0227779 | medrxiv |
10.1101/19005280 | Evaluation of Patient-Level Retrieval from Electronic Health Record Data for a Cohort Discovery Task | Chamberlin, S. R.; Bedrick, S. D.; Cohen, A. M.; Wang, Y.; Wen, A.; Liu, S.; Liu, H.; Hersh, W. | Steve R Chamberlin | Oregon Health & Science University | 2019-08-25 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc | health informatics | https://www.medrxiv.org/content/early/2019/08/25/19005280.source.xml | ObjectiveGrowing numbers of academic medical centers offer patient cohort discovery tools to their researchers, yet the performance of systems for this use case is not well-understood. The objective of this research was to assess patient-level information retrieval (IR) methods using electronic health records (EHR) for different types of cohort definition retrieval.
Materials and MethodsWe developed a test collection consisting of about 100,000 patient records and 56 test topics that characterized patient cohort requests for various clinical studies. Automated IR tasks using word-based approaches were performed, varying four different parameters for a total of 48 permutations, with performance measured using B-Pref. We subsequently created structured Boolean queries for the 56 topics for performance comparisons. In addition, we performed a more detailed analysis of 10 topics.
ResultsThe best-performing word-based automated query parameter settings achieved a mean B-Pref of 0.167 across all 56 topics. The way a topic was structured (topic representation) had the largest impact on performance. Performance not only varied widely across topics, but there was also a large variance in sensitivity to parameter settings across the topics. Structured queries generally performed better than automated queries on measures of recall and precision, but were still not able to recall all relevant patients found by the automated queries.
ConclusionWhile word-based automated methods of cohort retrieval offer an attractive solution to the labor-intensive nature of this task currently used at many medical centers, we generally found suboptimal performance in those approaches, with better performance obtained from structured Boolean queries. Insights gained in this preliminary analysis will help guide future work to develop new methods for patient-level cohort discovery with EHR data. | 10.1093/jamiaopen/ooaa026 | medrxiv |
10.1101/19005280 | Evaluation of Patient-Level Retrieval from Electronic Health Record Data for a Cohort Discovery Task | Chamberlin, S. R.; Bedrick, S. D.; Cohen, A. M.; Wang, Y.; Wen, A.; Liu, S.; Liu, H.; Hersh, W. | Steve R Chamberlin | Oregon Health & Science University | 2019-11-12 | 2 | PUBLISHAHEADOFPRINT | cc_by_nc | health informatics | https://www.medrxiv.org/content/early/2019/11/12/19005280.source.xml | ObjectiveGrowing numbers of academic medical centers offer patient cohort discovery tools to their researchers, yet the performance of systems for this use case is not well-understood. The objective of this research was to assess patient-level information retrieval (IR) methods using electronic health records (EHR) for different types of cohort definition retrieval.
Materials and MethodsWe developed a test collection consisting of about 100,000 patient records and 56 test topics that characterized patient cohort requests for various clinical studies. Automated IR tasks using word-based approaches were performed, varying four different parameters for a total of 48 permutations, with performance measured using B-Pref. We subsequently created structured Boolean queries for the 56 topics for performance comparisons. In addition, we performed a more detailed analysis of 10 topics.
ResultsThe best-performing word-based automated query parameter settings achieved a mean B-Pref of 0.167 across all 56 topics. The way a topic was structured (topic representation) had the largest impact on performance. Performance not only varied widely across topics, but there was also a large variance in sensitivity to parameter settings across the topics. Structured queries generally performed better than automated queries on measures of recall and precision, but were still not able to recall all relevant patients found by the automated queries.
ConclusionWhile word-based automated methods of cohort retrieval offer an attractive solution to the labor-intensive nature of this task currently used at many medical centers, we generally found suboptimal performance in those approaches, with better performance obtained from structured Boolean queries. Insights gained in this preliminary analysis will help guide future work to develop new methods for patient-level cohort discovery with EHR data. | 10.1093/jamiaopen/ooaa026 | medrxiv |
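B-Pref, the headline metric in the two cohort-retrieval records above, is designed for collections with incomplete relevance judgments. The sketch below implements one common formulation of the measure; the patient IDs, ranking and judgments are made up.

```python
def bpref(ranked_ids, judgments):
    """bpref for a single topic: judgments maps id -> 1 (relevant) / 0 (non-relevant)."""
    R = sum(1 for v in judgments.values() if v == 1)
    N = sum(1 for v in judgments.values() if v == 0)
    if R == 0:
        return 0.0
    score, nonrel_seen = 0.0, 0
    for pid in ranked_ids:
        if pid not in judgments:
            continue                    # unjudged items are simply skipped
        if judgments[pid] == 0:
            nonrel_seen += 1
        else:
            denom = min(R, N)
            penalty = min(nonrel_seen, denom) / denom if denom else 0.0
            score += 1.0 - penalty      # relevant item penalised by non-relevant items above it
    return score / R

ranking = ["p7", "p2", "p9", "p4", "p1"]                  # hypothetical system ranking
judged = {"p2": 1, "p9": 0, "p4": 1, "p1": 0, "p5": 1}    # hypothetical judgments
print(f"bpref = {bpref(ranking, judged):.3f}")
```

A mean over the 56 topics of such per-topic scores would correspond to the summary figure reported above.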
10.1101/19004051 | Enhancing multi-center generalization of machine learning-based depression diagnosis from resting-state fMRI | Nakano, T.; Takamura, M.; Ichikawa, N.; Okada, G.; Okamoto, Y.; Yamada, M.; Suhara, T.; Yamawaki, S.; Yoshimoto, J. | Junichiro Yoshimoto | Nara Institute of Science and Technology | 2019-08-25 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | psychiatry and clinical psychology | https://www.medrxiv.org/content/early/2019/08/25/19004051.source.xml | Resting-state fMRI has the potential to find abnormal behavior in brain activity and to diagnose patients with depression. However, resting-state fMRI has a bias depending on the scanner site, which makes it difficult to diagnose depression at a new site. In this paper, we propose methods to improve the performance of the diagnosis of major depressive disorder (MDD) at an independent site by reducing the site bias effects using regression. For this, we used a subgroup of healthy subjects of the independent site to regress out site bias. We further improved the classification performance of patients with depression by focusing on melancholic depressive disorder. Our proposed methods would be useful to apply depression classifiers to subjects at completely new sites. | 10.3389/fpsyt.2020.00400 | medrxiv |
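The harmonisation step described in the fMRI record above uses healthy subjects at the independent site to regress out site bias before classification. A heavily reduced stand-in is to centre each feature on the site-specific mean of healthy controls; the paper's actual regression model is richer, and all numbers below are synthetic.

```python
import numpy as np

def remove_site_bias(X, site, is_healthy):
    """Centre each feature on the site-specific mean of healthy controls.

    X: (subjects x features) matrix of resting-state features.
    site: array of site labels; is_healthy: boolean array.
    A very reduced stand-in for regression-based site harmonisation.
    """
    X = np.asarray(X, float).copy()
    for s in np.unique(site):
        ctrl = (site == s) & is_healthy
        X[site == s] -= X[ctrl].mean(axis=0)   # site effect estimated from controls only
    return X

rng = np.random.default_rng(1)
X = rng.standard_normal((8, 3)) + np.array([0, 0, 0, 0, 2, 2, 2, 2])[:, None]  # site offset
site = np.array(["A"] * 4 + ["B"] * 4)
healthy = np.array([True, True, False, False, True, True, False, False])
print(remove_site_bias(X, site, healthy).round(2))
```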
10.1101/19005116 | Epigenome-wide association study of seizures in childhood and adolescence | Caramaschi, D.; Hatcher, C.; Mulder, R.; Felix, J.; Cecil, C.; Relton, C.; Walton, E. | Doretta Caramaschi | University of Bristol | 2019-08-26 | 1 | PUBLISHAHEADOFPRINT | cc_by | epidemiology | https://www.medrxiv.org/content/early/2019/08/26/19005116.source.xml | The occurrence of seizures in childhood is often associated with neurodevelopmental impairments and school underachievement. Common genetic variants associated with epilepsy have been identified and epigenetic mechanisms have also been suggested to play a role. In this study we analysed the association of genome-wide blood DNA methylation with the occurrence of seizures in ~800 children from the Avon Longitudinal Study of Parents and Children, UK, at birth (cord blood), during childhood and adolescence (peripheral blood). We also analysed the association between the lifetime occurrence of any seizures before age 13 and blood DNA methylation levels. We sought replication of the findings in the Generation R Study and explored causality using Mendelian randomization, i.e. using genetic variants as proxies. The results showed five CpG sites which were associated cross-sectionally with seizures either in childhood or adolescence (1-5% absolute methylation difference at pFDR<0.05), although the evidence of replication in an independent study was weak. One of these sites was located in the BDNF gene, which is highly expressed in the brain, and showed high correspondence with brain methylation levels. The Mendelian randomization analyses suggested that seizures might be causal for changes in methylation rather than vice-versa. In addition, seizure-associated methylation changes could affect other outcomes such as growth, cognitive skills and educational attainment. In conclusion, we present a link between seizures and DNA methylation which suggests that DNA methylation changes might mediate some of the effects of seizures on growth and neurodevelopment. | 10.1186/s13148-019-0793-z | medrxiv |
10.1101/19004358 | Estimating the health impact of vaccination against 10 pathogens in 98 low and middle income countries from 2000 to 2030 | Li, X.; Mukandavire, C.; Cucunuba, Z. M.; Abbas, K.; Clapham, H. E.; Jit, M.; Johnson, H. L.; Papadopoulos, T.; Vynnycky, E.; Brisson, M.; Carter, E. D.; Clark, A.; de Villiers, M. J.; Eilertson, K.; Ferrari, M. J.; Gamkrelidze, I.; Gaythorpe, K.; Grassly, N. C.; Hallett, T. B.; Jackson, M. L.; Jean, K.; Karachaliou, A.; Klepac, P.; Lessler, J.; Li, X.; Moore, S. M.; Nayagam, S.; Nguyen, D. M.; Razavi, H.; Razavi-Shearer, D.; Resch, S.; Sanderson, C.; Sweet, S.; Sy, S.; Tam, Y.; Tanvir, H.; Tran, Q. M.; Trotter, C. L.; Truelove, S.; van Zandvoort, K.; Verguet, S.; Walker, N.; Winter, A.; Fergu | Neil M Ferguson | MRC Centre for Global Infectious Disease Analysis, Department of Infectious Disease Epidemiology, Imperial College London | 2019-08-27 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | public and global health | https://www.medrxiv.org/content/early/2019/08/27/19004358.source.xml | BackgroundThe last two decades have seen substantial expansion of childhood vaccination programmes in low and middle income countries (LMICs). Here we quantify the health impact of these programmes by estimating the deaths and disability-adjusted life years (DALYs) averted by vaccination with ten antigens in 98 LMICs between 2000 and 2030.
MethodsIndependent research groups provided model-based disease burden estimates under a range of vaccination coverage scenarios for ten pathogens: hepatitis B (HepB), Haemophilus influenzae type b (Hib), human papillomavirus (HPV), Japanese encephalitis (JE), measles, Neisseria meningitidis serogroup A (MenA), Streptococcus pneumoniae, rotavirus, rubella, yellow fever. Using standardized demographic data and vaccine coverage estimates for routine and supplementary immunization activities, the impact of vaccination programmes on deaths and DALYs was determined by comparing model estimates from the no vaccination counterfactual scenario with those from a default coverage scenario. We present results in two forms: deaths/DALYs averted in a particular calendar year, and in a particular annual birth cohort.
FindingsWe estimate that vaccination will have averted 69 (2.5-97.5% quantile range 52-88) million deaths between 2000 and 2030 across the 98 countries and ten pathogens considered, 35 (29-45) million of these between 2000-2018. From 2000-2018, this represents a 44% (36-57%) reduction in deaths due to the ten pathogens relative to the no vaccination counterfactual. Most (96% (93-97%)) of this impact is in under-five age mortality, notably from measles. Over the lifetime of birth cohorts born between 2000 and 2030, we predict that 122 (96-147) million deaths will be averted by vaccination, of which 58 (39-75) and 38 (26-52) million are due to measles and Hepatitis B vaccination, respectively. We estimate that recent increases in vaccine coverage and introductions of additional vaccines will result in a 72% (61-79%) reduction in lifetime mortality caused by these 10 pathogens in the 2018 birth cohort.
InterpretationIncreases in vaccine coverage and the introduction of new vaccines into LMICs over the last two decades have had a major impact in reducing mortality. These public health gains are predicted to increase in coming decades if progress in increasing coverage is sustained. | 10.1016/s0140-6736(20)32657-x | medrxiv |
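The deaths-averted figures above come from differencing model runs: deaths under the no-vaccination counterfactual minus deaths under the default coverage scenario, summed over pathogens and either calendar years or birth cohorts. The toy arrays below are random placeholders rather than the consortium's model outputs; they only show the arithmetic of the comparison.

```python
import numpy as np

# Hypothetical model output: deaths (millions) by year under two coverage scenarios
years = np.arange(2000, 2031)
rng = np.random.default_rng(2)
deaths_novax   = {"measles": rng.uniform(1.0, 2.0, years.size),
                  "hepB":    rng.uniform(0.5, 1.0, years.size)}
deaths_default = {p: d * rng.uniform(0.3, 0.6, years.size) for p, d in deaths_novax.items()}

# Impact in a calendar year = counterfactual deaths minus deaths under observed coverage
averted_by_year = sum(deaths_novax[p] - deaths_default[p] for p in deaths_novax)
print(f"total deaths averted 2000-2030: {averted_by_year.sum():.1f} million")
print(f"deaths averted in 2018: {averted_by_year[years == 2018][0]:.2f} million")
```

The birth-cohort view reported above is the same subtraction, but with deaths indexed by cohort and summed over each cohort's lifetime rather than by calendar year.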
10.1101/19005199 | Pre-diagnostic loss to follow-up in an active case-finding TB program: a mixed-methods study from rural Bihar, India | Garg, T.; Gupta, V.; Sen, D.; Verma, M.; Brouwer, M.; Mishra, R.; Bhardwaj, M. | Tushar Garg | Innovators In Health | 2019-08-27 | 1 | PUBLISHAHEADOFPRINT | cc_by | public and global health | https://www.medrxiv.org/content/early/2019/08/27/19005199.source.xml | backgroundDespite active case-finding (ACF) identifying more presumptive and confirmed TB cases, high pre-diagnostic loss to follow-up (PDLFU) among presumptive TB cases referred for diagnostic testing remains a concern. We aimed to quantify the PDLFU and identify the barriers and enablers to undergoing a diagnostic evaluation in an ACF program implemented in a rural population of 1.02 million in the Samastipur district of Bihar, India.
methodsDuring their routine work, Accredited Social Health Activists (ASHA, a community health worker or CHW), informal providers, and community laypersons identified people at risk of TB and referred them to the program. A field coordinator (FC) screened them for TB symptoms at the patient's home. The identified presumptive TB cases were accompanied by the CHW to a designated government facility for diagnostics. Those with a confirmed TB diagnosis were put on treatment by the CHW and followed up until treatment completion. All services were provided free of cost and patients were supported throughout the care pathway, including a transport allowance.
resultsA total of 11146 presumptive TB cases were identified from January 2018 to December 2018, out of which 4912 (44.1%) underwent a diagnostic evaluation. The key enablers were CHW accompaniment and support in addition to the free TB services in the public sector. The major barriers identified were transport challenges, deficient family and health provider support, and poor services in the public system.
conclusionIf we are to find missing cases, the health system needs urgent reform, and diagnostic services need to be patient-centric. A strong patient support system engaging all stakeholders and involvement of CHWs in routine TB care is an effective solution.
STRENGTHS AND LIMITATIONS OF THIS STUDY
- First such study to explore the reasons for pre-diagnostic loss to follow-up
- A mixed-method design including the views of both patients and community health workers
- Uses operational data from a routine programmatic setting at an NGO site
- No record of the actual number of people screened intuitively before being referred to the program.
- No record of patients accessing diagnostics in private sector and those completing the diagnostic process. | 10.1136/bmjopen-2019-033706 | medrxiv |
10.1101/19003442 | Descriptive epidemiology of physical activity energy expenditure in UK adults. The Fenland Study. | Lindsay, T.; Westgate, K.; Wijndaele, K.; Hollidge, S.; Kerrison, N.; Forouhi, N.; Griffin, S.; Wareham, N.; Brage, S. | Soren Brage | MRC Epidemiology Unit, University of Cambridge | 2019-08-27 | 1 | PUBLISHAHEADOFPRINT | cc_no | epidemiology | https://www.medrxiv.org/content/early/2019/08/27/19003442.source.xml | BackgroundPhysical activity (PA) plays a role in the prevention of a range of diseases including obesity and cardiometabolic disorders. Large population-based descriptive studies of PA, incorporating precise measurement, are needed to understand the relative burden of insufficient PA levels and to inform the tailoring of interventions. Combined heart and movement sensing enables the study of physical activity energy expenditure (PAEE) and intensity distribution. We aimed to describe the sociodemographic correlates of PAEE and moderate-to-vigorous physical activity (MVPA) in UK adults.
MethodsThe Fenland study is a population-based cohort study of 12,435 adults aged 29-64 years-old in Cambridgeshire, UK. Following individual calibration (treadmill), participants wore a combined heart rate and movement sensor continuously for 6 days in free-living, from which we derived PAEE (kJ*day-1*kg-1) and time in MVPA (>3 & >4 METs) in bouts greater than 1 minute and 10 minutes. Socio-demographic information was self-reported. Stratum-specific summary statistics and multivariable analyses were performed.
ResultsWomen accumulated a mean(sd) 50(20) kJ*day-1*kg-1 of PAEE, and 83(67) and 33(39) minutes*day-1 of 1-min bouted and 10-min bouted MVPA respectively. By contrast, men recorded 59(23) kJ*day-1*kg-1, 124(84) and 60(58) minutes*day-1. Age and BMI were also important correlates of PA. Association with age was inverse in both sexes, more strongly so for PAEE than MVPA. Obese individuals accumulated less PA than their normal-weight counterparts whether considering PAEE or allometrically-scaled PAEE (-10 kJ*day-1*kg-1 vs -15 kJ*day-1*kg-2/3 in men). Higher income and manual work were associated with higher PA; manual workers recorded 13-16 kJ*kg-1*day-1 more PAEE than sedentary counterparts. Overall, 86% of women and 96% of men accumulated 21.4 min*day-1 of MVPA (>3 METs) on average (150 minutes per week). These values were 49% and 74% if only considering bouts >10 min (15% and 31% for >4 METs).
ConclusionsPA varied by age, sex and BMI, and was higher in manual workers and those with higher incomes. Light physical activity, a component of PA that is currently not quantified as a target in UK guidelines, was the main driver of PAEE. | 10.1186/s12966-019-0882-6 | medrxiv |
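Bouted MVPA of the kind reported above (>3 or >4 METs, in bouts of at least 1 or 10 minutes) can be computed from a per-minute intensity series with a simple run-length pass. The MET series below is simulated; the Fenland pipeline of individual calibration and combined heart-rate-and-movement modelling is not reproduced here.

```python
import numpy as np

def mvpa_minutes(mets, threshold=3.0, min_bout=10):
    """Minutes above `threshold` METs accumulated in bouts of at least `min_bout` minutes."""
    above = np.asarray(mets) > threshold
    total, run = 0, 0
    for flag in np.append(above, False):       # sentinel to flush the final run
        if flag:
            run += 1
        else:
            if run >= min_bout:
                total += run
            run = 0
    return total

rng = np.random.default_rng(3)
day = rng.choice([1.2, 2.0, 3.5, 5.0], size=24 * 60, p=[0.55, 0.25, 0.12, 0.08])  # per-minute METs
print("MVPA (>3 METs), 1-min bouts :", mvpa_minutes(day, min_bout=1), "min")
print("MVPA (>3 METs), 10-min bouts:", mvpa_minutes(day, min_bout=10), "min")
```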
10.1101/19004465 | Physician Suicide: A Scoping Review to Highlight Opportunities for Prevention | Leung, T. I.; Pendharkar, S. S.; Chen, C.-Y. A.; Snyder, R. | Tiffany I. Leung | Faculty of Health, Medicine & Life Sciences, Maastricht University | 2019-08-29 | 1 | PUBLISHAHEADOFPRINT | cc_no | medical education | https://www.medrxiv.org/content/early/2019/08/29/19004465.source.xml | ObjectiveThe aim of this scoping review is to map the current landscape of published research and perspectives on physician suicide. Findings could serve as a roadmap for further investigations and potentially inform efforts to prevent physician suicide.
MethodsOvid MEDLINE, PsycInfo, and Scopus were searched for English-language publications from August 21, 2017 through April 28, 2018. Inclusion criteria were a primary outcome or thesis focused on suicide (including suicide completion, attempts, and thoughts or ideation) among medical students, postgraduate trainees, or attending physicians. Opinion articles were included. Studies that were non-English, or those that only mentioned physician burnout, mental health or substance use disorders were excluded. Data extraction was performed by two authors.
ResultsThe search yielded 1,596 articles, of which 347 articles passed to the full-text review round. The oldest article was an editorial from 1903; 210 (60.3%) articles were published from 2000 to present. Authors originated from 37 countries and 143 (41.2%) were opinion articles. Most discussed were suicide risk factors and culture of practice issues, while least discussed themes included public health and postvention.
ConclusionsConsistency and reliability of data and information about physician suicides could be improved. Data limitations partly contribute to these issues. Also, various suicide risk factors for physicians have been explored, and several remain poorly understood. Based on this scoping review, a public health approach, including surveillance and early warning systems, investigations of sentinel cases, and postvention may be impactful next steps in preventing physician deaths by suicide. | 10.2478/gp-2020-0014 | medrxiv |
10.1101/19004820 | Classifying dementia progression using microbial profiling of saliva | Bathini, P.; Foucras, S.; Perna, A.; Doucey, M.-A.; Berreux, J.-L.; Annoni, J.-M.; Alberi, L. | Lavinia Alberi | Swiss Integrative Center for Human Health, Fribourg, Switzerland | 2019-08-29 | 1 | PUBLISHAHEADOFPRINT | cc_no | neurology | https://www.medrxiv.org/content/early/2019/08/29/19004820.source.xml | IntroductionThere is increasing evidence linking periodontal infections to Alzheimer's disease. Saliva sampling can reveal information about the host and pathogen interactions that can inform about physiological and pathological brain states.
MethodsA cross-sectional cohort of age-matched subjects (78) was segmented according to their chemosensory (University of Pennsylvania smell identification test; UPSIT) and cognitive scores (mini-mental score evaluation; MMSE and clinical dementia rating; CDR). Mid-morning saliva was sampled from each subject and processed for microbiome composition and cytokine analysis. Linear discriminant analysis (LDA) was used to unravel specific changes in microbial and immunological signatures and logistic regression analysis (LRA) was employed to identify taxa that varied in abundance among patient groups.
ResultsUsing olfaction, we distinguish in the cognitively normal population a segment with high chemosensory scores (CNh, 27) and another segment with chemosensory scores (CNr, 16) as low as the MCI group (21) but higher than the AD group (17). We could identify stage-specific changes in microbial signatures but no clearly distinct cytokine profiles. Periodontal pathogen species such as F. villosus decline with increasing severity of AD, while opportunistic oral bacteria such as L. wadei show a significant enrichment in MCI. Conclusions: The salivary microbiome indicates stage-dependent changes in oral bacteria favoring opportunistic bacteria at the expense of periodontal bacteria, while the inflammatory profiles remain mainly unchanged in the sampled population. | 10.1002/dad2.12000 | medrxiv |
10.1101/19004895 | Apolipoprotein B underlies the causal relationship of circulating blood lipids with coronary heart disease | Richardson, T. G.; Sanderson, E.; Palmer, T. M.; Ala-Korpela, M.; Ference, B. A.; Davey Smith, G.; Holmes, M. V. | Tom G Richardson | University of Bristol | 2019-08-29 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | cardiovascular medicine | https://www.medrxiv.org/content/early/2019/08/29/19004895.source.xml | BackgroundCirculating blood lipids cause coronary heart disease (CHD). However, the precise way in which one or more lipoprotein lipid-related entities account for this relationship remains unclear. We sought to explore the causal relationships of blood lipid traits with risk of CHD using multivariable Mendelian randomization.
MethodsWe conducted GWAS of circulating blood lipid traits in UK Biobank (up to n=440,546) for LDL cholesterol, triglycerides and apolipoprotein B to identify lipid-associated SNPs. Using data from CARDIoGRAMplusC4D for CHD (consisting of 60,801 cases and 123,504 controls), we performed univariable and multivariable Mendelian randomization (MR) analyses. Similar analyses were conducted for HDL cholesterol and apolipoprotein A-I.
FindingsGWAS identified multiple independent SNPs associated at P<5x10-8 for LDL cholesterol (220), apolipoprotein B (n=255), triglycerides (440), HDL cholesterol (534) and apolipoprotein A-I (440). Between 56% and 93% of SNPs identified for each lipid trait had not been previously reported in large-scale GWAS. Almost half (46%) of these SNPs were associated at P<5x10-8 with more than one lipid-related trait. Assessed individually using MR, each of LDL cholesterol (OR 1.66 per 1 standard deviation higher trait; 95%CI: 1.49, 1.86; P=2.4x10-19), triglycerides (OR 1.34; 95%CI: 1.25, 1.44; P=9.1x10-16) and apolipoprotein B (OR 1.73; 95%CI: 1.56, 1.91; P=1.5x10-25) had effect estimates consistent with a higher risk of CHD. In multivariable MR, only apolipoprotein B (OR 1.92; 95%CI: 1.31, 2.81; P=7.5x10-4) retained a robust effect, with the estimate for LDL cholesterol (OR 0.85; 95%CI: 0.57, 1.27; P=0.44) reversing and that of triglycerides (OR 1.12; 95%CI: 1.02, 1.23; P=0.01) becoming markedly weaker.
Individual MR analyses showed a 1-SD higher HDL-C (OR 0.80; 95%CI: 0.75, 0.86; P=1.7x10-10) and apolipoprotein A-I (OR 0.83; 95%CI: 0.77, 0.89; P=1.0x10-6) to lower the risk of CHD but these effect estimates weakened to include the null on accounting for apolipoprotein B.
ConclusionsApolipoprotein B is of fundamental causal relevance in the aetiology of CHD, and underlies the relationship of LDL cholesterol and triglycerides with CHD. | 10.1371/journal.pmed.1003062 | medrxiv |
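Multivariable MR of the kind used above can be expressed as an inverse-variance-weighted regression of SNP-outcome effects on the matrix of SNP-exposure effects, without an intercept. The simulation below is illustrative only: the SNP effects, standard errors and the "true" causal vector are invented to mimic, qualitatively, a setting where apolipoprotein B carries the signal.

```python
import numpy as np

def multivariable_mr(beta_exposures, beta_outcome, se_outcome):
    """Multivariable IVW MR: weighted regression of SNP-CHD effects on SNP-lipid effects.

    beta_exposures: (SNPs x exposures) matrix, e.g. columns = LDL-C, TG, apoB.
    Returns per-exposure causal estimates on the outcome scale.
    """
    X = np.asarray(beta_exposures, float)
    y = np.asarray(beta_outcome, float)
    w = 1.0 / np.asarray(se_outcome, float) ** 2        # inverse-variance weights
    W = np.diag(w)
    # weighted least squares without intercept, as in standard IVW
    return np.linalg.solve(X.T @ W @ X, X.T @ W @ y)

rng = np.random.default_rng(4)
G = rng.normal(size=(200, 3)) * [0.08, 0.05, 0.07]       # simulated SNP effects on LDL-C, TG, apoB
true = np.array([0.0, 0.1, 0.65])                        # apoB carries most of the effect
y = G @ true + rng.normal(scale=0.02, size=200)
print("estimates (LDL-C, TG, apoB):", multivariable_mr(G, y, np.full(200, 0.02)).round(2))
```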
10.1101/19005298 | Transmission Dynamics of and Insights from the 2018-2019 Measles Outbreak in New York City: A Modeling Study | Yang, W. | Wan Yang | Columbia University | 2019-08-29 | 1 | PUBLISHAHEADOFPRINT | cc_no | epidemiology | https://www.medrxiv.org/content/early/2019/08/29/19005298.source.xml | In 2018-2019, New York City experienced the largest measles outbreak in the US in nearly three decades. To identify key factors contributing to this outbreak to aid future public health interventions, here we developed a model-inference system to infer the transmission dynamics of measles in the affected community, based on incidence data. Our results indicate that delayed vaccination of young children aged 1-4 years enabled the initial spread of measles and that increased infectious contact among this age group, likely via gatherings intended to expose unvaccinated children (i.e. "measles parties"), further aggravated the outbreak and led to the spread of measles beyond this age group. We found that around half of infants were susceptible to measles by age 1 (the age-limit to receive the first vaccine dose in the US); as such, infants experienced a large number of infections during the outbreak. We showed that without the implemented vaccination campaigns, the outbreak severity, including numbers of infections and hospitalizations, would be 10 times higher and predominantly affect infants and children under 4. These results suggest that recommending the first vaccine dose before age 1 and the second dose before age 4 could allow pro-vaccine parents to vaccinate and protect infants and young children more effectively, should high levels of vaccine hesitancy persist. In addition, enhanced public health education is needed to reduce activities that unnecessarily expose children to measles and other infections. | 10.1126/sciadv.aaz4037 | medrxiv |
10.1101/19005298 | Transmission Dynamics of and Insights from the 2018-2019 Measles Outbreak in New York City: A Modeling Study | Yang, W. | Wan Yang | Columbia University | 2019-09-14 | 2 | PUBLISHAHEADOFPRINT | cc_no | epidemiology | https://www.medrxiv.org/content/early/2019/09/14/19005298.source.xml | In 2018-2019, New York City experienced the largest measles outbreak in the US in nearly three decades. To identify key factors contributing to this outbreak to aid future public health interventions, here we developed a model-inference system to infer the transmission dynamics of measles in the affected community, based on incidence data. Our results indicate that delayed vaccination of young children aged 1-4 years enabled the initial spread of measles and that increased infectious contact among this age group, likely via gatherings intended to expose unvaccinated children (i.e. "measles parties"), further aggravated the outbreak and led to the spread of measles beyond this age group. We found that around half of infants were susceptible to measles by age 1 (the age-limit to receive the first vaccine dose in the US); as such, infants experienced a large number of infections during the outbreak. We showed that without the implemented vaccination campaigns, the outbreak severity, including numbers of infections and hospitalizations, would be 10 times higher and predominantly affect infants and children under 4. These results suggest that recommending the first vaccine dose before age 1 and the second dose before age 4 could allow pro-vaccine parents to vaccinate and protect infants and young children more effectively, should high levels of vaccine hesitancy persist. In addition, enhanced public health education is needed to reduce activities that unnecessarily expose children to measles and other infections. | 10.1126/sciadv.aaz4037 | medrxiv |
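The "how much worse without the vaccination campaigns" estimate in the two measles records above is, at heart, a counterfactual run of a transmission model. The sketch below uses a deliberately minimal deterministic SIR with an ongoing campaign that immunises susceptibles; the parameter values are placeholders (R0 is set far below measles-like values for readability) and none of the age structure or model-inference machinery of the study is included.

```python
def run_sir(S0, I0, beta, gamma, vax_per_day=0.0, days=240, dt=0.1):
    """Minimal deterministic SIR; `vax_per_day` susceptibles are immunised each day."""
    S, I, cum = float(S0), float(I0), float(I0)
    N = S0 + I0
    for _ in range(int(days / dt)):
        new_inf = beta * S * I / N * dt
        new_rec = gamma * I * dt
        new_vax = min(max(S - new_inf, 0.0), vax_per_day * dt)
        S -= new_inf + new_vax
        I += new_inf - new_rec
        cum += new_inf
    return cum

# Illustrative parameters only (beta/gamma here gives R0 = 4, far below measles reality)
base = dict(S0=20000, I0=5, beta=0.4, gamma=0.1)
print(f"no campaign  : {run_sir(**base):,.0f} cumulative infections")
print(f"with campaign: {run_sir(**base, vax_per_day=400):,.0f} cumulative infections")
```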
10.1101/19004689 | Institutional behaviour in the success of the Clean India programme | Curtis, V. | Val Curtis | London School of Hygiene and Tropical Medicine | 2019-08-29 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | public and global health | https://www.medrxiv.org/content/early/2019/08/29/19004689.source.xml | IntroductionWhilst many less developed countries are struggling to provide universal access to safe sanitation, in the last five years India has almost reached its target of eliminating open defaecation. The object of this study was to understand how the Indian Government effected this sanitation transformation.
MethodsThe study employed interviews with 17 actors in the Government's Clean India programme across the national capital and four states which were analysed using a theory of change grounded in Behaviour Centred Design.
ResultsThe Swachh Bharat Mission (Gramin) claims to have improved the coverage of toilets in rural India from 39% to over 95% of households between 2014 and mid 2019. From interviews with relevant actors we constructed a theory of change for the programme in which high-level political support and disruptive leadership changed environments in districts, which led to psychological changes in district officials, which, in turn, led to changed behaviour concerning sanitation programming. The Prime Minister's setting of the ambitious goal to eliminate open defecation by the 150th birthday of Mahatma Gandhi (October 2019) galvanised government bureaucracy, while early success in 100 flagship districts reduced the scepticism of government employees, a cadre of 500 young professionals placed in districts imparted new ideas and energy, social and mass media was used to engage and motivate the public and key players, and new norms of ethical behaviour were demonstrated by leaders. As a result, district officials engaged emotionally with the programme and felt pride at their achievements in ridding villages of open defecation.
ConclusionsThough many challenges remain, Governments seeking to achieve the Sustainable Development Goal of universal access to safe sanitation can emulate the success of India's Swachh Bharat Mission.
SUMMARY BOXES
What is already known?
- At least 47 countries are not on track to reach the Sustainable Development Goal of universal access to safe sanitation by 2030 and some 0.6 billion people are still defecating in the open.
- It is not clear how governments in low income countries can be galvanised to act to resolve this pressing public health problem.

What are the new findings?
- The experience of the Clean India programme suggests that countries can almost eliminate open defecation.
- The success of the programme was due to factors including: the setting of ambitious targets; the use of modern communications strategies and monitoring technology; and the provision of visible reward and recognition for employees.

What do the new findings imply?
- Disruptive leadership is needed to create working environments where sometimes jaded civil servants are given an opportunity to make a difference.
- Politicians who embrace the cause of sanitation may find that there are votes in toilets. | 10.1136/bmjgh-2019-001892 | medrxiv |
10.1101/19004689 | Explaining the outcomes of the 'Clean India' campaign: institutional behaviour and sanitation transformation in India | Curtis, V. | Val Curtis | London School of Hygiene and Tropical Medicine | 2019-09-14 | 2 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | public and global health | https://www.medrxiv.org/content/early/2019/09/14/19004689.source.xml | IntroductionWhilst many less developed countries are struggling to provide universal access to safe sanitation, in the last five years India has almost reached its target of eliminating open defaecation. The object of this study was to understand how the Indian Government effected this sanitation transformation.
MethodsThe study employed interviews with 17 actors in the Government's Clean India programme across the national capital and four states which were analysed using a theory of change grounded in Behaviour Centred Design.
ResultsThe Swachh Bharat Mission (Gramin) claims to have improved the coverage of toilets in rural India from 39% to over 95% of households between 2014 and mid 2019. From interviews with relevant actors we constructed a theory of change for the programme in which high-level political support and disruptive leadership changed environments in districts, which led to psychological changes in district officials, which, in turn, led to changed behaviour concerning sanitation programming. The Prime Minister's setting of the ambitious goal to eliminate open defecation by the 150th birthday of Mahatma Gandhi (October 2019) galvanised government bureaucracy, while early success in 100 flagship districts reduced the scepticism of government employees, a cadre of 500 young professionals placed in districts imparted new ideas and energy, social and mass media was used to engage and motivate the public and key players, and new norms of ethical behaviour were demonstrated by leaders. As a result, district officials engaged emotionally with the programme and felt pride at their achievements in ridding villages of open defecation.
ConclusionsThough many challenges remain, Governments seeking to achieve the Sustainable Development Goal of universal access to safe sanitation can emulate the success of India's Swachh Bharat Mission.
SUMMARY BOXES
What is already known?
- At least 47 countries are not on track to reach the Sustainable Development Goal of universal access to safe sanitation by 2030 and some 0.6 billion people are still defecating in the open.
- It is not clear how governments in low income countries can be galvanised to act to resolve this pressing public health problem.

What are the new findings?
- The experience of the Clean India programme suggests that countries can almost eliminate open defecation.
- The success of the programme was due to factors including: the setting of ambitious targets; the use of modern communications strategies and monitoring technology; and the provision of visible reward and recognition for employees.

What do the new findings imply?
- Disruptive leadership is needed to create working environments where sometimes jaded civil servants are given an opportunity to make a difference.
- Politicians who embrace the cause of sanitation may find that there are votes in toilets. | 10.1136/bmjgh-2019-001892 | medrxiv |
10.1101/19003707 | Systematic analysis of tumor-infiltrating immune cells in human endometrial cancer: a retrospective study | Zhou, X.; Ling, Z.; yang, b. | bing yang | The First Affiliated Hospital of Zunyi Medical University | 2019-08-29 | 1 | PUBLISHAHEADOFPRINT | cc0_ng | obstetrics and gynecology | https://www.medrxiv.org/content/early/2019/08/29/19003707.source.xml | ObjectiveThe prognostic effect of tumor-infiltrating immune cells (TIICs) on endometrial cancer (EMC) has not been extensively investigated. In the present study, we systematically analyzed the role of TIICs in EMC development.
MethodsPatient data were downloaded from The Cancer Genome Atlas (TCGA). We comprehensively analyzed the TIIC populations in EMC tissue and their role in EMC progression and prognosis by using a deconvolution algorithm (CIBERSORT) and clinically annotated expression profiles.
ResultsThe proportions of gamma delta T cells, resting NK cells, M1 macrophages, and resting mast cells were significantly different between normal endometrium and EMC tissue. The proportions of CD8+ T cells, resting memory CD4 T cells, and M0 macrophages showed an inverse, moderate correlation. The proportions of resting dendritic cells, resting memory CD4 T cells, and T regulatory cells (Tregs) decreased in accordance with the cancer cell differentiation grade (G); a lower proportion of activated dendritic cells and gamma delta T cells and a higher proportion of Tregs predicted longer EMC survival time, and vice versa. A low proportion of gamma delta T cells indicated a better response to therapy.
ConclusionCollectively, our data suggested subtle differences in the cellular composition of TIICs in EMC, and these differences were likely to be important determinants of both prognosis and therapy of EMC. | null | medrxiv |
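The record above estimates tumor-infiltrating immune cell proportions with the CIBERSORT deconvolution algorithm. As a rough illustration of reference-based deconvolution — CIBERSORT itself uses ν-support vector regression, whereas this sketch substitutes non-negative least squares on a simulated signature matrix — the snippet below recovers mixing fractions from a bulk expression profile; all inputs are hypothetical.

```python
# Minimal illustration of reference-based immune-cell deconvolution.
# CIBERSORT itself uses nu-support-vector regression; this sketch uses
# non-negative least squares on a made-up signature matrix instead.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)

n_genes, n_cell_types = 500, 6
signature = rng.gamma(shape=2.0, scale=1.0, size=(n_genes, n_cell_types))  # hypothetical signature matrix

true_fractions = np.array([0.30, 0.25, 0.20, 0.10, 0.10, 0.05])            # hypothetical TIIC mixture
bulk = signature @ true_fractions + rng.normal(0, 0.05, n_genes)           # simulated bulk expression

coef, _ = nnls(signature, bulk)          # non-negative mixing coefficients
fractions = coef / coef.sum()            # renormalise to proportions summing to 1

for k, f in enumerate(fractions):
    print(f"cell type {k}: estimated fraction {f:.3f} (true {true_fractions[k]:.2f})")
```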
10.1101/19005165 | The consistent burden in published estimates of delirium occurrence in medical inpatients over four decades: a systematic review and meta-analysis study | Gibb, K.; Seeley, A.; Quinn, T.; Siddiqi, N.; Shenkin, S.; Rockwood, K.; Davis, D. | Daniel Davis | UCL | 2019-08-29 | 1 | PUBLISHAHEADOFPRINT | cc_by | geriatric medicine | https://www.medrxiv.org/content/early/2019/08/29/19005165.source.xml | IntroductionDelirium is associated with a wide range of adverse patient safety outcomes. We sought to identify if trends in healthcare complexity were associated with changes in reported delirium in adult medical patients in the general hospital over the last four decades.
MethodsWe used identical criteria to a previous systematic review, including studies using DSM and ICD-10 criteria for delirium diagnosis. Random effects meta-analysis pooled estimates across studies; meta-regression estimated temporal changes; funnel plots assessed publication bias.
ResultsOverall delirium occurrence was 23% (95% CI 19%-26%) (33 studies). There was no change between 1980 and 2019, nor was case-mix (average age of sample, proportion with dementia) different.
DiscussionThe incidence and prevalence of delirium in hospitals appear to be stable, though publication bias may mask true changes. Nonetheless, delirium remains a challenging and urgent priority for clinical diagnosis and care pathways. | 10.1093/ageing/afaa040 | medrxiv |
10.1101/19005165 | The consistent burden in published estimates of delirium occurrence in medical inpatients over four decades: a systematic review and meta-analysis study | Gibb, K.; Seeley, A.; Quinn, T.; Siddiqi, N.; Shenkin, S.; Rockwood, K.; Davis, D. | Daniel Davis | UCL | 2019-09-26 | 2 | PUBLISHAHEADOFPRINT | cc_by | geriatric medicine | https://www.medrxiv.org/content/early/2019/09/26/19005165.source.xml | IntroductionDelirium is associated with a wide range of adverse patient safety outcomes. We sought to identify if trends in healthcare complexity were associated with changes in reported delirium in adult medical patients in the general hospital over the last four decades.
MethodsWe used identical criteria to a previous systematic review, including studies using DSM and ICD-10 criteria for delirium diagnosis. Random effects meta-analysis pooled estimates across studies; meta-regression estimated temporal changes; funnel plots assessed publication bias.
ResultsOverall delirium occurrence was 23% (95% CI 19%-26%) (33 studies). There was no change between 1980 and 2019, nor was case-mix (average age of sample, proportion with dementia) different.
DiscussionThe incidence and prevalence of delirium in hospitals appear to be stable, though publication bias may mask true changes. Nonetheless, delirium remains a challenging and urgent priority for clinical diagnosis and care pathways. | 10.1093/ageing/afaa040 | medrxiv |
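The two delirium records above pool occurrence estimates with a random-effects meta-analysis. The sketch below shows the DerSimonian-Laird calculation on a handful of made-up study proportions (not the 33 studies actually analysed), as a minimal illustration of how a pooled occurrence and its confidence interval are obtained.

```python
# Minimal sketch of DerSimonian-Laird random-effects pooling of study-level
# delirium occurrence proportions (illustrative numbers only).
import numpy as np

events = np.array([30, 55, 18, 72, 40])      # hypothetical delirium cases per study
n      = np.array([120, 260, 90, 300, 150])  # hypothetical study sizes

p = events / n
v = p * (1 - p) / n                          # within-study variance of each proportion

w_fe = 1 / v
pooled_fe = np.sum(w_fe * p) / np.sum(w_fe)

q = np.sum(w_fe * (p - pooled_fe) ** 2)      # Cochran's Q
df = len(p) - 1
tau2 = max(0.0, (q - df) / (np.sum(w_fe) - np.sum(w_fe ** 2) / np.sum(w_fe)))

w_re = 1 / (v + tau2)                        # random-effects weights
pooled_re = np.sum(w_re * p) / np.sum(w_re)
se_re = np.sqrt(1 / np.sum(w_re))

print(f"pooled occurrence {pooled_re:.3f} "
      f"(95% CI {pooled_re - 1.96*se_re:.3f}-{pooled_re + 1.96*se_re:.3f}), tau^2={tau2:.4f}")
```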
10.1101/19005207 | The Impact of Pre-exposure Prophylaxis for Human Immunodeficiency Virus on Gonorrhea Prevalence | Pharaon, J.; Bauch, C. T. | Chris T Bauch | University of Waterloo | 2019-08-29 | 1 | PUBLISHAHEADOFPRINT | cc_no | hiv aids | https://www.medrxiv.org/content/early/2019/08/29/19005207.source.xml | Pre-exposure prophylaxis (PrEP) has been shown to be highly effective in reducing the risk of HIV infection in gay and bisexual men who have sex with men (GbMSM). However, PrEP does not protect against other sexually transmitted infections (STIs). In some populations, PrEP has also led to riskier behaviour such as reduced condom usage, with the result that the prevalence of bacterial STIs like gonorrhea has increased. Here we develop a compartmental model of the transmission of HIV and gonorrhea, and the impacts of PrEP, condom usage, STI testing frequency and potential changes in sexual risk behaviour stemming from the introduction of PrEP in a population of GbMSM. We find that introducing PrEP causes an increase in gonorrhea prevalence for a wide range of parameter values, including at the current recommended frequency of STI testing once every 3 months for individuals on PrEP. Moreover, the model predicts that a higher STI testing frequency alone is not enough to prevent a rise in gonorrhea prevalence, unless the testing frequency is increased to impractical levels. However, testing every 2 months in combination with sufficiently high condom usage by individuals on PrEP would be successful in maintaining gonorrhea prevalence at pre-PrEP levels. The results emphasize that programs making PrEP more available should be accompanied by efforts to support condom usage and frequent STI testing, in order to avoid an increase in the prevalence of gonorrhea and other bacterial STIs. | 10.1007/s11538-020-00762-7 | medrxiv |
10.1101/19004960 | Identification of host-pathogen-disease relationships using a scalable Multiplex Serology platform in UK Biobank | Mentzer, A. J.; Brenner, N.; Allen, N.; Littlejohns, T. J.; Chong, A. Y.; Cortes, A.; Almond, R.; Hill, M.; Sheard, S.; McVean, G.; UK Biobank Infection Advisory Board, ; Collins, R.; Hill, A. V.; Waterboer, T. | Alexander J Mentzer | University of Oxford | 2019-08-29 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | infectious diseases | https://www.medrxiv.org/content/early/2019/08/29/19004960.source.xml | BackgroundCertain infectious agents are recognised causes of cancer and potentially other chronic diseases. Identifying associations and understanding pathological mechanisms involving infectious agents and subsequent chronic disease risk will be possible through measuring exposure to multiple infectious agents in large-scale prospective cohorts such as UK Biobank.
MethodsFollowing expert consensus we designed a Multiplex Serology platform capable of simultaneously measuring quantitative antibody responses against 45 antigens from 20 infectious agents implicated in non-communicable diseases, including human herpes, hepatitis, polyoma, papilloma, and retroviruses, as well as Chlamydia trachomatis, Helicobacter pylori and Toxoplasma gondii. This panel was assayed in a random subset of UK Biobank participants (n=9,695) to test associations between infectious agents and recognised demographic and genetic risk factors and disease outcomes.
FindingsSeroprevalence estimates for each infectious agent were consistent with those expected from the literature. The data confirmed epidemiological associations of infectious agent antibody responses with sociodemographic characteristics (e.g. lifetime sexual partners with C. trachomatis; P=1·8x10-149), genetic variants (e.g. rs6927022 with Epstein-Barr virus (EBV) EBNA1 antibodies, P=9·5x10-91) and disease outcomes including human papillomavirus-16 seropositivity and cervical intraepithelial neoplasia (odds ratio 2·28, 95% confidence interval 1·38-3·63), and quantitative EBV viral capsid antigen responses and multiple sclerosis through genetic correlation (MHC rG=0·30, P=0·01).
InterpretationThis dataset, intended as a pilot study to demonstrate applicability of Multiplex Serology in epidemiological studies, is itself one of the largest studies to date covering diverse infectious agents in a prospective UK cohort including those traditionally under-represented in population cohorts such as human immunodeficiency virus-1 and C. trachomatis. Our results emphasise the validity of our Multiplex Serology approach in large-scale epidemiological studies opening up opportunities for improving our understanding of host-pathogen-disease relationships. These data are available to researchers interested in examining the relationship between infectious agents and human health. | 10.1038/s41467-022-29307-3 | medrxiv |
10.1101/19004044 | A systematic review and meta-analysis on the effectiveness of an early invasive strategy compared to a conservative approach in elderly patients with non-ST elevation acute coronary syndrome | Reano, J. D. P.; Shiu, L. A.; Miralles, K.; Dimalala, M. G.; Reyes, M. J.; Pestano, N.; Tumanan-Mendoza, B.; Punzalan, F. E.; Castillo, R. | Joan Dymphna Palma Reano | Manila Doctors Hospital | 2019-08-29 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | cardiovascular medicine | https://www.medrxiv.org/content/early/2019/08/29/19004044.source.xml | BackgroundElderly patients, 65 years old and older, represent a large proportion (>50%) of hospital-admitted patients with acute coronary syndrome (ACS). Data are conflicting on the efficacy of early routine invasive (within 48-72 hours of initial evaluation) versus conservative management of ACS in this population.
ObjectiveWe aimed to determine the effectiveness of routine early invasive strategy compared to conservative treatment in reducing major adverse cardiovascular events in elderly patients with non-ST elevation (NSTE) ACS.
Data SourcesWe conducted a systematic review of randomized controlled trials through PubMed, Cochrane, and Google Scholar database.
Study SelectionThe studies included were randomized controlled trials that evaluated the effectiveness of an invasive strategy compared to conservative treatment among elderly patients ≥65 years old diagnosed with NSTEACS. Studies were included if they assessed any of the following outcomes: death, cardiovascular mortality, myocardial infarction (MI), stroke, recurrent angina, and need for revascularization. Five articles were subsequently included in the meta-analysis.
Data ExtractionThree independent reviewers extracted the data of interest from the articles using a standardized data collection form that included study quality indicators. Disparity in assessment was settled by an independent adjudicator.
Data SynthesisAll pooled analyses were based on a fixed effects model. A total of 2,495 patients were included, 1337 in the invasive strategy group, and 1158 in the conservative treatment group.
ResultsMeta-analysis showed a lower incidence of revascularization in the invasive group (2%) than in the conservative treatment group (8%), with an overall risk ratio of 0.31 (95% CI 0.16-0.61, I2=0%). There was also a lower incidence of stroke in the invasive (2%) versus conservative group (3%), but this was not statistically significant. A significant benefit was noted in the reduction of all-cause mortality (RR 0.63, 95% CI 0.55-0.72, I2=84%) and myocardial infarction (RR 0.62, 95% CI 0.49-0.79, I2=63%) but with significant heterogeneity.
ConclusionThere was a significantly lower rate of revascularization in the invasive strategy group compared to the conservative treatment group. In the reduction of all-cause mortality and MI, there was benefit favoring invasive strategy but with significant heterogeneity. These findings do not support the bias against early routine invasive intervention in the elderly group with NSTEACS. However, further studies focusing on the elderly with larger population sizes are still needed. | 10.1371/journal.pone.0229491 | medrxiv |
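The meta-analysis record above reports fixed-effects pooled risk ratios with I2 heterogeneity. As a minimal sketch of inverse-variance fixed-effect pooling — using invented 2x2 counts rather than the five included trials — the snippet below derives a pooled RR, its 95% CI and I2.

```python
# Minimal sketch of fixed-effect (inverse-variance) pooling of risk ratios
# with an I^2 heterogeneity estimate. The counts below are hypothetical.
import numpy as np

# events / totals in invasive and conservative arms for three hypothetical trials
ev_inv, n_inv = np.array([10, 25, 8]), np.array([200, 400, 150])
ev_con, n_con = np.array([28, 60, 20]), np.array([180, 380, 140])

log_rr = np.log((ev_inv / n_inv) / (ev_con / n_con))
var_lrr = 1/ev_inv - 1/n_inv + 1/ev_con - 1/n_con    # variance of each log risk ratio

w = 1 / var_lrr
pooled = np.sum(w * log_rr) / np.sum(w)
se = np.sqrt(1 / np.sum(w))

q = np.sum(w * (log_rr - pooled) ** 2)                # Cochran's Q
df = len(log_rr) - 1
i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0

print(f"pooled RR {np.exp(pooled):.2f} "
      f"(95% CI {np.exp(pooled - 1.96*se):.2f}-{np.exp(pooled + 1.96*se):.2f}), I^2 = {i2:.0f}%")
```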
10.1101/19004143 | Where should new parkrun events be located? Modelling the potential impact of 200 new events on socio-economic inequalities in access and participation. | Schneider, P. P.; Smith, R. A.; Bullas, A. M.; Bayley, T.; Haake, S. S.; Brennan, A.; Goyder, E. | Paul P Schneider | University of Sheffield | 2019-08-29 | 1 | PUBLISHAHEADOFPRINT | cc_by | public and global health | https://www.medrxiv.org/content/early/2019/08/29/19004143.source.xml | Backgroundparkrun, an international movement which organises free weekly 5km running events, has been widely praised for encouraging inactive individuals to participate in physical activity. Recently, parkrun received funding to establish 200 new events across England, specifically targeted at deprived communities. This study aims to investigate the relationships between geographic access, deprivation, and participation in parkrun, and to inform the planned expansion by proposing future event locations.
MethodsWe conducted an ecological spatial analysis, using data on 455 parkrun events, 2,842 public green spaces, and 32,844 English census areas. Poisson regression was applied to investigate the relationships between the distances to events, deprivation, and parkrun participation rates. Model estimates were incorporated into a location-allocation analysis, to identify locations for future events that maximise deprivation-weighted parkrun participation.
ResultsThe distance to the nearest event (in km) and the Index of Multiple Deprivation (score) were both independently negatively associated with local parkrun participation rates. Rate ratios were 0.921 (95%CI = 0.921-0.922) and 0.959 (0.959-0.959), respectively. The recommended 200 new event locations were estimated to increase weekly runs by 6.9% (from 82,824 to 88,506). Of the additional runs, 4.1% (n=231) were expected to come from the 10% most deprived communities.
ConclusionParticipation in parkrun is widespread across England. We provide recommendations for new parkrun event locations, in order to increase participation from deprived communities. However, the creation of new events alone is unlikely to be an effective strategy. Further research is needed to study how barriers to participation can be reduced.
Online Map, data, and source codeAn interactive online map is available here, and the annotated R source code and all data that were used to generate the results of this study are provided on a repository. | null | medrxiv |
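The parkrun record above combines a Poisson rate model (rate ratios 0.921 per km of distance and 0.959 per IMD point) with a location-allocation step that maximises deprivation-weighted participation. The sketch below illustrates one plausible greedy version of that allocation on simulated census areas and candidate green spaces; the rate ratios are taken from the abstract, while the base rate, populations and distances are assumptions for illustration.

```python
# Minimal sketch: distance/deprivation rate model plus a greedy
# location-allocation pass over candidate green spaces (all inputs simulated).
import numpy as np

rng = np.random.default_rng(1)
n_areas, n_candidates = 200, 30

pop  = rng.integers(1_000, 8_000, n_areas)                        # census-area populations
imd  = rng.uniform(5, 60, n_areas)                                # deprivation scores
dist_existing  = rng.uniform(1, 25, n_areas)                      # km to nearest current event
dist_candidate = rng.uniform(0.5, 25, (n_areas, n_candidates))    # km to each candidate site

BASE_RATE = 0.002          # hypothetical weekly runs per person at distance 0 km, IMD 0
RR_KM, RR_IMD = 0.921, 0.959                                      # rate ratios from the abstract

def weekly_runs(dist):
    return pop * BASE_RATE * RR_KM ** dist * RR_IMD ** imd

chosen, current = [], dist_existing.copy()
for _ in range(5):                                                # pick 5 new event locations
    best_gain, best_j = -1.0, None
    for j in range(n_candidates):
        if j in chosen:
            continue
        new_dist = np.minimum(current, dist_candidate[:, j])
        gain = np.sum(imd * (weekly_runs(new_dist) - weekly_runs(current)))  # deprivation-weighted gain
        if gain > best_gain:
            best_gain, best_j = gain, j
    chosen.append(best_j)
    current = np.minimum(current, dist_candidate[:, best_j])
    print(f"chose candidate {best_j}, deprivation-weighted gain {best_gain:,.0f}")
```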
10.1101/19005132 | Clinical benefits and adverse effects of genetically-elevated free testosterone levels: a Mendelian randomization analysis | Mohammadi-Shemirani, P.; Chong, M.; Pigeyre, M.; Morton, R. W.; Gerstein, H. C.; Pare, G. | Guillaume Pare | McMaster University | 2019-08-29 | 1 | PUBLISHAHEADOFPRINT | cc_by | genetic and genomic medicine | https://www.medrxiv.org/content/early/2019/08/29/19005132.source.xml | BACKGROUNDTestosterone products are increasingly being prescribed to males for a variety of possible health benefits but the causal relationship between testosterone and health-related outcomes is unclear. Evidence from well-powered randomized controlled trials are difficult to obtain, particularly regarding effects on long-term or adverse outcomes. We sought to determine the effects of genetically-predicted calculated free testosterone (CFT) on 23 health outcomes.
METHODSGenetic variants associated with CFT were determined from 136,531 white British males in the UK Biobank. One-sample and two-sample Mendelian randomization (MR) analyses were performed to infer the effects of genetically-predicted CFT on 23 health outcomes selected based on relevance with known or suspected effects of testosterone therapy.
FINDINGSIn males from the UK Biobank, 81 independent genetic variants were associated with CFT levels at genome-wide significance (p<5x10-8). Each 0.1 nmol/L increase in genetically-predicted CFT was associated with clinical benefits on increased heel bone mineral density (0.053 SD; 95% CI = 0.038 to 0.068; p=8.77x10-12) and decreased body fat percentage (-1.86%; 95% CI = -2.35 to -1.37; p=1.56x10-13), and adverse effects on increased risk of prostate cancer (OR=1.28; 95% CI=1.11 to 1.49; p=1.0x10-3), risk of androgenic alopecia (OR=1.82; 95% CI = 1.55 to 2.14; p=3.52x10-13), risk of benign prostate hyperplasia (BPH) (OR=1.81; 95% CI = 1.34 to 2.44; p=1.05x10-4) and hematocrit percentage (1.49%; 95% CI = 1.24 to 1.74; p=3.49x10-32).
CONCLUSIONSLong-term elevated free testosterone levels cause prostate cancer, BPH, and hair loss while reducing body fat percentage and increasing bone density. They also have a neutral effect on type 2 diabetes and cardiovascular and cognitive outcomes. Well-powered randomized trials are needed to address the effects of shorter-term use of exogenous testosterone on these outcomes. | 10.7554/elife.58914 | medrxiv |
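The Mendelian randomization record above uses genetic variants as instruments for free testosterone. As a minimal sketch of the inverse-variance-weighted (IVW) two-sample MR estimator — with simulated per-variant effect sizes rather than the 81 CFT-associated variants — the snippet below combines per-SNP Wald ratios into a single causal estimate.

```python
# Minimal sketch of an inverse-variance-weighted two-sample MR estimate
# (simulated SNP-exposure and SNP-outcome effects).
import numpy as np

rng = np.random.default_rng(2)
n_snps = 81
beta_exp = rng.normal(0.05, 0.02, n_snps)                        # SNP -> free testosterone effects
true_causal = 0.3
beta_out = true_causal * beta_exp + rng.normal(0, 0.01, n_snps)  # SNP -> outcome effects
se_out = np.full(n_snps, 0.01)

w = beta_exp ** 2 / se_out ** 2        # IVW weights
ratio = beta_out / beta_exp            # per-SNP Wald ratios
ivw = np.sum(w * ratio) / np.sum(w)
se_ivw = np.sqrt(1 / np.sum(w))

print(f"IVW causal estimate {ivw:.3f} "
      f"(95% CI {ivw - 1.96*se_ivw:.3f} to {ivw + 1.96*se_ivw:.3f})")
```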
10.1101/19005223 | Identifying subtypes of a stigmatized medical condition | Gabashvili, I. | Irene Gabashvili | Aurametrix, MEBO Research | 2019-08-29 | 1 | PUBLISHAHEADOFPRINT | cc_by_nd | endocrinology | https://www.medrxiv.org/content/early/2019/08/29/19005223.source.xml | BackgroundSome conditions - such as obesity, depression and functional odor disorders - come with a social stigma. Understanding the etiology of these conditions helps to avoid stereotypes and find remedies. One of the major obstacles facing researchers, especially for those studying socially distressing metabolic malodor, is the difficulty in assembling biologically homogenous study cohorts.
ObjectiveThe aim of this study was to examine phenotypic variance, self-reported data and laboratory tests for the purpose of identifying clinically relevant and etiologically meaningful subtypes of idiopathic body odor and the "People are Allergic To Me" (PATM) syndrome.
MethodsParticipants with undiagnosed body odor conditions enrolled in this research study, initiated by the healthcare charity MEBO Research and sponsored by the Wishart Research group at the Metabolomics Innovation Centre, University of Alberta, Canada. Primary outcomes were differences in metabolite concentrations measured in urine, blood and breath of test and control groups. Principal component analyses and other statistical tests were carried out for these measurements.
ResultsWhile none of the existing laboratory tests could reliably predict chronic malodor symptoms, several measurements distinguished phenotypes at a significance level below 5%. Types of malodor can be differentiated by self-reported consumption of (or sensitivity to) added sugars (p<0.01), blood alcohols after glucose challenge (especially ethanol: p<0.0005), urinary excretion of phenylalanine, putrescine, and combinations of blood or urine metabolites.
ConclusionsOur preliminary results suggest that malodor heterogeneity can be addressed by analyses of phenotypes based on patients' dietary and olfactory observations. Our studies highlight the need for more trials. Future research focused on comprehensive metabolomics and microbiome sequencing will play an important role in the diagnosis and treatment of malodor.
Trial RegistrationThe study discussed in the manuscript was registered as NCT02692495 at clinicaltrials.gov. The results were compared with our earlier study registered as NCT02683876. | null | medrxiv |
10.1101/19005181 | Characteristics of Cardiac Memory in Patients with Implanted Cardioverter Defibrillator: the CAMI study | Haq, K.; Cao, J.; Tereshchenko, L. G. | Larisa G Tereshchenko | Oregon Health & Science University | 2019-08-30 | 1 | PUBLISHAHEADOFPRINT | cc_no | cardiovascular medicine | https://www.medrxiv.org/content/early/2019/08/30/19005181.source.xml | ObjectiveThe goal of this study was to determine factors associated with cardiac memory (CM) in patients with implantable cardioverter-defibrillators (ICD).
MethodsPatients with structural heart disease (n=20; mean age 72.6±11.6 y; 80% male; mean left ventricular ejection fraction (LVEF) 31.7±7.6%; history of myocardial infarction (MI) in 75%, ventricular tachycardia (VT) in 85%) and preserved atrioventricular (AV) conduction received primary (80%) or secondary (20%) prevention dual-chamber ICD. Standard 12-lead ECG was recorded in AAI and DDD mode, before and after 7 days of right ventricular (RV) pacing in DDD mode with short AV delay. Direction (azimuth and elevation) and magnitude of spatial QRS, T, and ventricular gradient (SVG) vectors were measured before and after 7 days of RV pacing. CM was quantified as the degree of alignment between QRSDDD-7 and TAAI-7 vectors (QRSDDD-7-TAAI-7 angle). Circular statistics and mixed models with a random slope and intercept were adjusted for days 1-7 change in cardiac activation, LVEF, known risk factors, and use of medications known to affect CM.
ResultsQRSDDD-7-TAAI-7 angle strongly correlated (circular r = -0.972; P<0.0001) with TAAI-7-TDDD-7 angle. In the mixed models, history of MI (-180°(95%CI -320° to -40°); P=0.011) and female sex (-162°(95%CI -268° to -55°); P=0.003) counteracted CM-T azimuth changes (+132°(95%CI 80°-184°); P<0.0001). History of VT (+27(95%CI 4-46) mV*ms; P=0.007) amplified CM-T area increase (+15(95%CI 6-24) mV*ms; P<0.0001).
ConclusionsExisting cardiac remodeling affects CM in response to RV pacing. Women develop less CM than men. Activation memory is another manifestation of CM. | 10.19102/icrm.2021.120204 | medrxiv |
10.1101/19005181 | Characteristics of Cardiac Memory in Patients with Implanted Cardioverter Defibrillator: the CAMI study | Haq, K.; Cao, J.; Tereshchenko, L. G. | Larisa G Tereshchenko | Oregon Health & Science University | 2019-11-26 | 2 | PUBLISHAHEADOFPRINT | cc_no | cardiovascular medicine | https://www.medrxiv.org/content/early/2019/11/26/19005181.source.xml | ObjectiveThe goal of this study was to determine factors associated with cardiac memory (CM) in patients with implantable cardioverter-defibrillators (ICD).
MethodsPatients with structural heart disease (n=20; mean age 72.6±11.6 y; 80% male; mean left ventricular ejection fraction (LVEF) 31.7±7.6%; history of myocardial infarction (MI) in 75%, ventricular tachycardia (VT) in 85%) and preserved atrioventricular (AV) conduction received primary (80%) or secondary (20%) prevention dual-chamber ICD. Standard 12-lead ECG was recorded in AAI and DDD mode, before and after 7 days of right ventricular (RV) pacing in DDD mode with short AV delay. Direction (azimuth and elevation) and magnitude of spatial QRS, T, and ventricular gradient (SVG) vectors were measured before and after 7 days of RV pacing. CM was quantified as the degree of alignment between QRSDDD-7 and TAAI-7 vectors (QRSDDD-7-TAAI-7 angle). Circular statistics and mixed models with a random slope and intercept were adjusted for days 1-7 change in cardiac activation, LVEF, known risk factors, and use of medications known to affect CM.
ResultsQRSDDD-7-TAAI-7 angle strongly correlated (circular r = -0.972; P<0.0001) with TAAI-7-TDDD-7 angle. In the mixed models, history of MI (-180°(95%CI -320° to -40°); P=0.011) and female sex (-162°(95%CI -268° to -55°); P=0.003) counteracted CM-T azimuth changes (+132°(95%CI 80°-184°); P<0.0001). History of VT (+27(95%CI 4-46) mV*ms; P=0.007) amplified CM-T area increase (+15(95%CI 6-24) mV*ms; P<0.0001).
ConclusionsExisting cardiac remodeling affects CM in response to RV pacing. Women develop less CM than men. Activation memory is another manifestation of CM. | 10.19102/icrm.2021.120204 | medrxiv |
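The cardiac memory records above quantify CM as the alignment (angle) between spatial vectors described by azimuth, elevation and magnitude. The sketch below shows one way to compute such an angle after converting to Cartesian coordinates; the coordinate convention and the example values are assumptions, not the study's definitions.

```python
# Minimal sketch: angle between two spatial vectors given as
# (azimuth, elevation, magnitude). Convention and values are illustrative.
import numpy as np

def to_cartesian(azimuth_deg, elevation_deg, magnitude=1.0):
    az, el = np.radians(azimuth_deg), np.radians(elevation_deg)
    return magnitude * np.array([np.cos(el) * np.cos(az),
                                 np.cos(el) * np.sin(az),
                                 np.sin(el)])

def spatial_angle(v1, v2):
    cos_angle = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))

qrs_ddd7 = to_cartesian(azimuth_deg=40.0, elevation_deg=25.0, magnitude=1.6)  # hypothetical paced QRS vector
t_aai7   = to_cartesian(azimuth_deg=55.0, elevation_deg=30.0, magnitude=0.5)  # hypothetical baseline T vector

print(f"QRS(DDD-7)-T(AAI-7) angle: {spatial_angle(qrs_ddd7, t_aai7):.1f} degrees")
```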
10.1101/19005504 | Impact of spectrograms on the classification of wheezes and crackles in an educational setting. An interrater study. | Aviles Solis, J. C.; Storvoll, I.; Vanbelle, S.; Melbye, H. | Juan Carlos Aviles Solis | UiT, The Arctic University of Norway | 2019-08-31 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | medical education | https://www.medrxiv.org/content/early/2019/08/31/19005504.source.xml | BackgroundChest auscultation is a widely used method in the diagnosis of lung diseases. However, the interpretation of lung sounds is a subjective task and disagreements arise. New technological developments like the use of visual representation of sounds through spectrograms could improve the agreement when classifying lung sounds, but this is not yet known.
AimsTo test if the use of spectrograms improves the agreement when classifying wheezes and crackles.
MethodsWe used 30 lung sound recordings. The sample contained 15 normal recordings and 15 with wheezes or crackles. We produced spectrograms of the recordings. Twenty-three third- to fifth-year medical students at UiT the Arctic University of Norway classified the recordings using an online questionnaire. We first showed the students examples of how wheezes and crackles looked in the spectrogram. Then, we played the recordings in a random order two times, first without the spectrogram, then with live spectrograms displayed. We asked them to classify the sounds for the presence of wheezes and crackles. We calculated kappa values for the agreement between each student and the expert classification with and without display of spectrograms and tested for significant improvement. We also calculated Fleiss kappa for the 23 observers with and without the spectrogram.
ResultsWhen classifying wheezes, 13/23 students (1 with p<.05) had a positive change in k; when classifying crackles, 16/23 did (2 with p<.05). All the statistically significant changes were in the direction of improved kappa values (.52-.75). Fleiss kappa values were k=.51 and k=.56 (p=.63) for wheezes without and with spectrograms. For crackles, these values were k=.22 and k=.40 (p<0.01) in the same order.
ConclusionsThe use of spectrograms had a positive impact on the inter-rater agreement and the agreement with experts. We observed a higher improvement in the classification of crackles compared to wheezes. | 10.1038/s41598-020-65354-w | medrxiv |
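The spectrogram record above relies on Cohen's kappa (each student versus the expert classification) and Fleiss' kappa (across the 23 raters) to measure agreement. The snippet below is a minimal, simulated illustration of both statistics for a binary present/absent classification; the ratings are generated, not the study data.

```python
# Minimal sketch of Cohen's kappa (one rater vs expert) and Fleiss' kappa
# (many raters) for a binary classification, on simulated ratings.
import numpy as np

rng = np.random.default_rng(3)

# Cohen's kappa: one rater versus the expert reference over 30 recordings
expert  = rng.integers(0, 2, 30)
student = np.where(rng.random(30) < 0.8, expert, 1 - expert)   # agrees ~80% of the time

p_obs = np.mean(student == expert)
p_exp = np.mean(expert) * np.mean(student) + (1 - np.mean(expert)) * (1 - np.mean(student))
cohen_kappa = (p_obs - p_exp) / (1 - p_exp)

# Fleiss' kappa: 23 raters classify each of 30 recordings into 2 categories
counts = np.zeros((30, 2), dtype=int)
for _ in range(23):
    rater = np.where(rng.random(30) < 0.7, expert, 1 - expert)
    counts[np.arange(30), rater] += 1

n_raters = counts.sum(axis=1)[0]
p_i = (np.sum(counts ** 2, axis=1) - n_raters) / (n_raters * (n_raters - 1))
p_bar = p_i.mean()
p_j = counts.sum(axis=0) / counts.sum()
p_e = np.sum(p_j ** 2)
fleiss_kappa = (p_bar - p_e) / (1 - p_e)

print(f"Cohen's kappa vs expert: {cohen_kappa:.2f}")
print(f"Fleiss' kappa across raters: {fleiss_kappa:.2f}")
```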
10.1101/19005512 | High occurrence of transportation and logistics occupations among vascular dementia patients: an observational study | van Loenhoud, A. C.; de Boer, C.; Wols, K.; Pijnenburg, Y. A. L.; Lemstra, A. W.; Bouwman, F. H.; Prins, N. D.; Scheltens, P.; Ossenkoppele, R.; van der Flier, W. M. | Anna C. van Loenhoud | Alzheimer Center Amsterdam, Department of Neurology, Vrije Universiteit Amsterdam, Amsterdam UMC | 2019-08-31 | 1 | PUBLISHAHEADOFPRINT | cc_no | occupational and environmental health | https://www.medrxiv.org/content/early/2019/08/31/19005512.source.xml | BackgroundGrowing evidence suggests a role of occupation in the emergence and manifestation of dementia. Occupations are often defined by complexity level, although working environments and activities differ in several other important ways. We aimed to capture the multi-faceted nature of occupation through its measurement as a qualitative (instead of a quantitative) variable and explored its relationship with different types of dementia.
MethodsWe collected occupational information of 2,121 dementia patients with various suspected etiologies from the Amsterdam Dementia Cohort (age: 67±8, 57% male, MMSE: 21±5). Our final sample included individuals with Alzheimer's disease (AD) dementia (n=1,467), frontotemporal dementia (n=281), vascular dementia (n=98), Lewy Body disease (n=174) and progressive supranuclear palsy/corticobasal degeneration (n=101). Within the AD group, we used neuropsychological data to further characterize patients by clinical phenotypes. All participants were categorized into one of 11 occupational classes, across which we evaluated the distribution of dementia (sub)types with Chi2 analyses. We gained further insight into occupation-dementia relationships through post-hoc logistic regressions that included various demographic and health characteristics as explanatory variables.
ResultsThere were significant differences in the distribution of dementia types across occupation groups (Chi2=85.87, p<.001). Vascular dementia was relatively common in the Transportation/Logistics sector, and higher vascular risk factors partly explained this relationship. Alzheimers disease occurred less in Transportation/Logistics and more in Health Care/Welfare occupations, which related to a higher/lower percentage of males. We found no relationships between occupational classes and clinical phenotypes of AD (Chi2=53.65, n.s.).
ConclusionsRelationships between occupation and dementia seem to exist beyond complexity level, which offers new opportunities for disease prevention and improvement of occupational health policy. | 10.1186/s13195-019-0570-4 | medrxiv |
10.1101/19005512 | High occurrence of transportation and logistics occupations among vascular dementia patients: an observational study | van Loenhoud, A. C.; de Boer, C.; Wols, K.; Pijnenburg, Y. A. L.; Lemstra, A. W.; Bouwman, F. H.; Prins, N. D.; Scheltens, P.; Ossenkoppele, R.; van der Flier, W. M. | Anna C. van Loenhoud | Alzheimer Center Amsterdam, Department of Neurology, Vrije Universiteit Amsterdam, Amsterdam UMC | 2020-01-08 | 2 | PUBLISHAHEADOFPRINT | cc_no | occupational and environmental health | https://www.medrxiv.org/content/early/2020/01/08/19005512.source.xml | BackgroundGrowing evidence suggests a role of occupation in the emergence and manifestation of dementia. Occupations are often defined by complexity level, although working environments and activities differ in several other important ways. We aimed to capture the multi-faceted nature of occupation through its measurement as a qualitative (instead of a quantitative) variable and explored its relationship with different types of dementia.
MethodsWe collected occupational information of 2,121 dementia patients with various suspected etiologies from the Amsterdam Dementia Cohort (age: 67±8, 57% male, MMSE: 21±5). Our final sample included individuals with Alzheimer's disease (AD) dementia (n=1,467), frontotemporal dementia (n=281), vascular dementia (n=98), Lewy Body disease (n=174) and progressive supranuclear palsy/corticobasal degeneration (n=101). Within the AD group, we used neuropsychological data to further characterize patients by clinical phenotypes. All participants were categorized into one of 11 occupational classes, across which we evaluated the distribution of dementia (sub)types with Chi2 analyses. We gained further insight into occupation-dementia relationships through post-hoc logistic regressions that included various demographic and health characteristics as explanatory variables.
ResultsThere were significant differences in the distribution of dementia types across occupation groups (Chi2=85.87, p<.001). Vascular dementia was relatively common in the Transportation/Logistics sector, and higher vascular risk factors partly explained this relationship. Alzheimers disease occurred less in Transportation/Logistics and more in Health Care/Welfare occupations, which related to a higher/lower percentage of males. We found no relationships between occupational classes and clinical phenotypes of AD (Chi2=53.65, n.s.).
ConclusionsRelationships between occupation and dementia seem to exist beyond complexity level, which offers new opportunities for disease prevention and improvement of occupational health policy. | 10.1186/s13195-019-0570-4 | medrxiv |
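The occupation-dementia records above test whether dementia types are distributed unevenly across occupational classes with a Chi2 analysis. The sketch below runs a chi-square test of independence on a hypothetical occupation-by-diagnosis contingency table and inspects standardised residuals; the counts are illustrative only.

```python
# Minimal sketch of a chi-square test of independence on an
# occupation-by-dementia-type contingency table (hypothetical counts).
import numpy as np
from scipy.stats import chi2_contingency

# rows: three illustrative occupational classes; columns: AD, FTD, VaD, LBD
table = np.array([
    [120, 25, 20, 15],   # e.g. Health Care/Welfare
    [ 60, 15, 30, 10],   # e.g. Transportation/Logistics
    [200, 40, 25, 30],   # e.g. all other occupations combined
])

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")

# standardised residuals flag which cells drive the association,
# e.g. the excess of vascular dementia in the second (logistics) row
residuals = (table - expected) / np.sqrt(expected)
print(np.round(residuals, 2))
```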
10.1101/19005124 | The Mezurio smartphone application: Evaluating the feasibility of frequent digital cognitive assessment in the PREVENT dementia study | Lancaster, C.; Koychev, I.; Blane, J.; Chinner, A.; Wolter, L.; Hinds, C. | Claire Lancaster | University of Oxford | 2019-09-04 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | psychiatry and clinical psychology | https://www.medrxiv.org/content/early/2019/09/04/19005124.source.xml | BackgroundSmartphones may significantly contribute to the detection of early cognitive decline at scale by enabling remote, frequent, sensitive, economic assessment. Several prior studies have sustained engagement with participants remotely over a period of a week; extending this to a period of a month would clearly give greater opportunity for measurement. However, as such study durations are increased, so too is the need to understand how participant burden and scientific value might be optimally balanced.
ObjectivesWe explore the 'little but often' approach to assessment employed by the Mezurio app, interacting with participants every day for over a month. We aim to understand whether this extended remote study duration is feasible, and which factors might promote sustained participant engagement over such study durations.
MethodsThirty-five adults (aged 40-59 years) with no diagnosis of cognitive impairment were prompted to interact with the Mezurio smartphone app platform for up to 36 days, completing short, daily episodic memory tasks in addition to optional executive function and language tests. A subset (n=20) completed semi-structured interviews focused on their experience using the app.
ResultsAverage compliance with the schedule of learning for subsequent memory test was 80%, with 88% of participants still actively engaged by the final task. Thematic analysis of participants experiences highlighted schedule flexibility, a clear user-interface, and performance feedback as important considerations for engagement with remote digital assessment.
ConclusionsDespite the extended study duration, participants demonstrated high compliance with the tasks scheduled and were extremely positive about their experiences. Long durations of remote digital interaction are therefore definitely feasible, but only when careful attention is paid to the design of the users experience. | 10.2196/16142 | medrxiv |
10.1101/19005124 | The Mezurio smartphone application: Evaluating the feasibility of frequent digital cognitive assessment in the PREVENT dementia study | Lancaster, C.; Koychev, I.; Blane, J.; Chinner, A.; Wolters, L.; Hinds, C. | Claire Lancaster | University of Oxford | 2019-09-14 | 2 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | psychiatry and clinical psychology | https://www.medrxiv.org/content/early/2019/09/14/19005124.source.xml | BackgroundSmartphones may significantly contribute to the detection of early cognitive decline at scale by enabling remote, frequent, sensitive, economic assessment. Several prior studies have sustained engagement with participants remotely over a period of a week; extending this to a period of a month would clearly give greater opportunity for measurement. However, as such study durations are increased, so too is the need to understand how participant burden and scientific value might be optimally balanced.
ObjectivesWe explore the 'little but often' approach to assessment employed by the Mezurio app, interacting with participants every day for over a month. We aim to understand whether this extended remote study duration is feasible, and which factors might promote sustained participant engagement over such study durations.
MethodsThirty-five adults (aged 40-59 years) with no diagnosis of cognitive impairment were prompted to interact with the Mezurio smartphone app platform for up to 36 days, completing short, daily episodic memory tasks in addition to optional executive function and language tests. A subset (n=20) completed semi-structured interviews focused on their experience using the app.
ResultsAverage compliance with the schedule of learning for subsequent memory test was 80%, with 88% of participants still actively engaged by the final task. Thematic analysis of participants experiences highlighted schedule flexibility, a clear user-interface, and performance feedback as important considerations for engagement with remote digital assessment.
ConclusionsDespite the extended study duration, participants demonstrated high compliance with the tasks scheduled and were extremely positive about their experiences. Long durations of remote digital interaction are therefore definitely feasible, but only when careful attention is paid to the design of the users experience. | 10.2196/16142 | medrxiv |
10.1101/19005496 | Modified Long-Axis In-Plane Ultrasound Versus Short-Axis Out-of-Plane Ultrasound For Radial Arterial Cannulation: A Prospective Randomized Controlled Trial | Wang, J.; Zhang, L.; Lai, Z.; Huang, Q.; Wu, G.; Lin, L.; Liu, J.; Weng, X. | Liangcheng Zhang | Department of Anesthesiology, Fujian Medical University Union Hospital, No.29 Xin-Quan Road, Fuzhou, 350001, China. | 2019-09-04 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | anesthesia | https://www.medrxiv.org/content/early/2019/09/04/19005496.source.xml | BACKGROUNDGiven a low first-pass success rate of the conventional SAX (short-axis) or LAX (long-axis) approach, ultrasound-guided radial artery cannulation in adults with a radial artery diameter of less than 2.2 mm may still be challenging.
OBJECTIVETo assess the efficacy of modified long-axis in-plane(M-LAIP) versus short-axis out-of-plane (SAOP) or conventional palpation(C-P) approaches for ultrasound-guided radial artery cannulation.
DESIGNA prospective, randomized and controlled trial.
SETTINGOperating room in a tertiary university hospital, from 1 July 2018 to 24 November 2018.
PATIENTSA total of 201 patients (age 18 to 85 years, radial artery diameter less than 2.2 mm) were included. Patients with a history of forearm surgery, ulnar artery occlusion, an abnormal Allen test, etc., were excluded from this study.
INTERVENTIONSAll patients were randomized 1:1:1 to M-LAIP, SAOP or C-P.
MAIN OUTCOME MEASURESThe primary outcome was the cannulation success rate. Secondary outcomes included first location time and cannulation time, number of attempts.
RESULTSThe cannulation success rate was significantly higher in the M-LAIP group than in the SAOP group or C-P group (first success rate: 80.3% vs 53.8% or 33.8%; p=0.000; total success rate: 93.9% vs 78.5% or 50.8%; p=0.000). First location time (s) was significantly longer in the M-LAIP group compared with the SAOP group (31(28-35[12-44]) vs 15(14-17[10-21]); p=0.000) and the C-P group (31(28-35[12-44]) vs 12(8-13.5[6-37]); p=0.000). However, the time of cannulation in the M-LAIP group (29(24-45[16-313])) was significantly shorter than that in the SAOP group (45(28.5-135.5[14-346]), p=0.002) and in the C-P group (138(27-308[12-363]), p=0.000). The number of attempts decreased in the M-LAIP group compared with SAOP or C-P group (1.29±0.63 vs 1.8±0.89 or 2.22±0.93, p=0.000).
CONCLUSIONThe M-LAIP procedure for ultrasound-guided radial artery cannulation can offer a higher success rate of the first-attempt and total cannulation, fewer attempts and less time of cannulation.
TRIAL REGISTRATIONThe study was registered in the Chinese Clinical Trial Registry (http://www.chictr.org.cn/index.aspx, number: ChiCTR-IOR-17011474). | null | medrxiv |
10.1101/19005454 | Comprehensive Spatiotemporal Analysis of Opioid Poisoning Mortality in Ohio from 2010 to 2016 | Park, C.; Clemenceau, J. R.; Seballos, A.; Crawford, S.; Lopez, R.; Coy, T.; Atluri, G.; Hwang, T. H. | Tae Hyun Hwang | Cleveland Clinic | 2019-09-04 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | epidemiology | https://www.medrxiv.org/content/early/2019/09/04/19005454.source.xml | ObjectiveWe aimed to identify (1) differences in opioid poisoning mortality among population groups, (2) geographic clusters of opioid-related deaths over time, and (3) health conditions co-occurring with opioid-related death in Ohio by computational analysis.
Materials and MethodsWe used a large-scale Ohio vital statistic dataset from the Ohio Department of Health (ODH) and U.S. Census data from 2010-2016. We surveyed population differences with demographic profiling and use of relative proportions, conducted spatiotemporal pattern analysis with spatial autocorrelation via Moran statistics at the census tract level, and performed comorbidity analysis using frequent itemset mining and association rule mining.
ResultsOur analyses found higher rates of opioid-related death in people aged 25-54, whites, and males. We also found that opioid-related deaths in Ohio became more spatially concentrated during 2010-2016, and tended to be most clustered around Cleveland, Columbus and Cincinnati. Drug abuse, anxiety and cardiovascular disease were found to predict opioid-related death.
DiscussionComprehensive data-driven spatiotemporal analysis of opioid-related deaths provides essential identification of demographic, geographic and health factors related to opioid abuse. Future research should access personal health information for more detailed comorbidity analysis, as well as expand spatiotemporal models for real-time use.
ConclusionComputational analyses revealed demographic differences in opioid poisoning, changing regional patterns of opioid-related deaths, and health conditions co-occurring with opioid overdose for Ohio from 2010-2016, providing essential knowledge for both government officials and caregivers to establish policies and strategies to best combat the opioid epidemic. | 10.1038/s41598-021-83544-y | medrxiv |
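The spatiotemporal record above measures geographic clustering of opioid-related deaths with Moran statistics at the census tract level. The snippet below computes a global Moran's I on simulated tract rates with a row-standardised neighbour matrix, as a minimal sketch of that spatial autocorrelation step; all inputs are generated, not Ohio data.

```python
# Minimal sketch of global Moran's I on simulated census-tract death rates.
import numpy as np

rng = np.random.default_rng(4)
n = 100
rates = rng.gamma(shape=2.0, scale=5.0, size=n)        # hypothetical tract death rates

# hypothetical symmetric binary adjacency, then row-standardised weights
adj = rng.random((n, n)) < 0.05
adj = np.triu(adj, 1)
adj = adj | adj.T
w = adj / np.maximum(adj.sum(axis=1, keepdims=True), 1)

z = rates - rates.mean()
moran_i = (n / w.sum()) * (z @ w @ z) / np.sum(z ** 2)
expected_i = -1.0 / (n - 1)                            # expectation under no spatial autocorrelation

print(f"Moran's I = {moran_i:.3f} (expected under randomness {expected_i:.3f})")
```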
10.1101/19003855 | Identifying 31 novel breast cancer susceptibility loci using data from genome-wide association studies conducted in Asian and European women | Shu, X.; Long, J.; Cai, Q.; Kweon, S.-S.; Choi, J.-Y.; Kubo, M.; Park, S. K.; Bolla, M. K.; Dennis, J.; Wang, Q.; Yang, Y.; Shi, J.; Guo, X.; Li, B.; Tao, R.; Aronson, K. J.; Chan, K. Y. K.; Chan, T. L.; Gao, Y.-T.; Hartman, M.; Ho, W.-K.; Ito, H.; Iwasaki, M.; Iwata, H.; John, E. M.; Kasuga, Y.; Khoo, U. S.; Kim, M.-K.; Kurian, A. W.; Kwong, A.; Li, J.; Lophatananon, A.; Low, S.-K.; Mariapun, S.; Matsuda, K.; Matsuo, K.; Muir, K.; Noh, D.-Y.; Park, B.; Park, M.-H.; Shen, C.-Y.; Shin, M.-H.; Spinelli, J. J.; Takahashi, A.; Tseng, C.; Tsugane, S.; Wu, A. H.; Xiang, Y.-B.; Yamaji, T.; Zheng, Y.; | Wei Zheng | Division of Epidemiology, Department of Medicine, Vanderbilt Epidemiology Center, Vanderbilt-Ingram Cancer Center, Vanderbilt University Medical Center, Nashvil | 2019-09-04 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | epidemiology | https://www.medrxiv.org/content/early/2019/09/04/19003855.source.xml | Common genetic variants in 183 loci have been identified in relation to breast cancer risk in genome-wide association studies (GWAS). These risk variants combined explain only a relatively small proportion of breast cancer heritability, particularly in Asian populations. To search for additional genetic susceptibility loci for breast cancer, we performed a meta-analysis of data from GWAS conducted in Asians (24,206 cases and 24,775 controls). Variants showing an association with breast cancer risk at P < 0.01 were evaluated in GWAS conducted in European women including 122,977 cases and 105,974 controls. In the combined analysis of data from both Asian and European women, the lead variant in 28 loci not previously reported showed an association with breast cancer risk at P < 5 x10-8. In the meta-analysis of all GWAS data from Asian and European descendants, we identified SNPs in three additional loci in association with breast cancer risk at P < 5 x10-8. The associations for 10 of these loci were replicated in an independent sample of 16,787 cases and 16,680 controls of Asian women (P < 0.05). Expression quantitative trait locus (eQTL) and gene-based analyses provided evidence for the possible involvement of the YBEY, MAN2C1, SNUPN, TBX1, SEMA4A, STC1, MUTYH, LOXL2, and LINC00886 genes underlying the associations observed in eight of these 28 newly identified risk loci. In addition, we replicated the association for 78 of the 166 previously reported risk variants at P < 0.05 in women of Asian descent using GWAS data. These findings improve our understanding of breast cancer genetics and etiology and extend to Asian populations previous findings from studies of European women. | 10.1038/s41467-020-15046-w | medrxiv |
10.1101/19005439 | Determinants of first aid knowledge and basic practice among elementary school teachers in Debre Tabor City, Northwest Ethiopia | Taklual, W.; Mekie, M.; Yenew, C. | Wubet Taklual | Debre Tabor University | 2019-09-05 | 1 | PUBLISHAHEADOFPRINT | cc_no | pediatrics | https://www.medrxiv.org/content/early/2019/09/05/19005439.source.xml | BackgroundUnintentional injuries are the leading causes of morbidity and mortality in the pediatric population, especially in low- and middle-income countries. Giving immediate help to an injured child is a crucial step in saving the child from further disability and/or death. This study aimed to assess the determinant factors of first aid knowledge and basic practice among elementary school teachers in Debre Tabor, Ethiopia.
MethodAn institution-based cross-sectional study was employed in Debre Tabor City. A single population proportion formula was used for sample size calculation, and a total of 216 elementary school teachers were included in the study. A simple random sampling technique with proportional allocation was applied for selection of the study participants. Data entry was done in EpiData version 3.1 and the data were exported to SPSS version 21 for analysis. Binary and multivariable logistic regression analyses were performed to identify determinants of knowledge on first aid. Crude and adjusted odds ratios were used to determine the significance and strength of association at a 95% confidence interval.
ResultOur study revealed that 45.8% of the subjects were knowledgeable on first aid. More than 75% of study participants reported that they had encountered a child who needed first aid. Among these, 64% provided first aid. The multivariable analysis revealed that service year (AOR=3.51, 95%CI: (1.06, 11.59)), educational status (AOR=12.15, 95%CI: (3.17, 46.67)), previous first aid training (AOR=0.43, 95%CI: (0.21, 0.87)) and information about first aid (AOR=0.12, 95%CI: (0.03, 0.48)) were found to be significantly associated with having knowledge on first aid.
ConclusionSchool teachers have low knowledge on first aid. Educational status, service year, previous first aid training and information on first aid were predictors of first aid knowledge. Introducing essential first aid training into the curriculum during teacher training should be considered. | 10.2174/1874944502013010380 | medrxiv |
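The first aid record above reports crude and adjusted odds ratios from binary and multivariable logistic regression. The sketch below fits a logistic model to simulated teacher data and exponentiates coefficients to obtain adjusted odds ratios with confidence intervals; the covariates and effect sizes are invented for illustration.

```python
# Minimal sketch of adjusted odds ratios from a multivariable logistic
# regression on simulated teacher data (not the survey dataset).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 216
trained   = rng.integers(0, 2, n)                    # previous first aid training
degree    = rng.integers(0, 2, n)                    # degree-level education
service10 = rng.integers(0, 2, n)                    # >10 years of service

logit_p = -1.0 + 0.9 * trained + 1.2 * degree + 0.8 * service10
knowledgeable = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

X = sm.add_constant(np.column_stack([trained, degree, service10]))
fit = sm.Logit(knowledgeable, X).fit(disp=False)

or_adj = np.exp(fit.params[1:])                      # adjusted odds ratios
ci = np.exp(fit.conf_int()[1:])
for name, orr, (lo, hi) in zip(["training", "education", "service"], or_adj, ci):
    print(f"{name}: AOR {orr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```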
10.1101/19004804 | AETIONOMY, a Cross-Sectional Study Aimed at validating a new taxonomy of Neurodegenerative Diseases: Study design and subject characteristics | Corvol, J.-C.; Bujac, S.; Carvalho, S.; Clarke, B.; Marovac, J.; Mangone, G.; Rascol, O.; Meissner, W. G.; Magnin, E.; Foubert-Samier, A.; Catala, H.; Markaki, I.; Tsitsi, P.; Sanchez-Valle, R.; Heneka, M. T.; Molinuevo, J.-L.; Wuellner, U.; Svenningsson, P.; Scordis, P.; Hofmann-Apitius, M. | Jean-Christophe Corvol | Sorbonne Universite, APHP, Inserm, ICM | 2019-09-05 | 1 | PUBLISHAHEADOFPRINT | cc_no | neurology | https://www.medrxiv.org/content/early/2019/09/05/19004804.source.xml | BackgroundAlthough advances in the understanding of neurodegenerative diseases (NDDs) have led to improvements in classification and diagnosis and most importantly to new therapies, the unmet medical needs remain significant due to high treatment failure rates. The AETIONOMY project funded by the Innovative Medicine Initiative (IMI) aims at using multi-OMICs and bioinformatics to identify new classifications for NDDs based on common molecular pathophysiological mechanisms in view of improving the availability of personalised treatments.
ObjectivesThe purpose of the AETIONOMY cross-sectional study is to validate novel patient classification criteria provided by these tools.
MethodsThis was a European multi-centre, cross-sectional, clinical study conducted at 6 sites in 3 countries. Standardised clinical data, biosamples from peripheral blood, cerebrospinal fluid, skin biopsies, and data from a multi-OMICs approach were collected in patients suffering from Alzheimer's and Parkinson's disease, as well as healthy controls.
ResultsFrom September 2015 to December 2017 a total of 421 participants were recruited including 95 Healthy Controls. Nearly 1,500 biological samples were collected. The study achieved its objective with respect to Parkinson's disease (PD) recruitment; however, it was unable to recruit many new Alzheimer's disease (AD) patients. Overall, data from 413 evaluable subjects (405 PD and 8 AD) are available for analysis. PD patients and controls were well matched with respect to age (mean 63.4 years); however, close gender matching was not achieved. Approximately half of all PD patients and one At-Risk subject were taking dopamine agonists; rates of Levodopa usage were slightly higher (~60%). Median MDS-UPDRS Part III Scores (OFF state) ranged from 45 (SD 18) in those with Genetic PD to 2 (SD 3) in Healthy Controls. The standardised methodologies applied resulted in a high-quality database with very few missing data.
ConclusionThis is one of the collaborative multi-OMICs studies in individuals suffering from PD and AD involving a control group. It is expected that the integration of data will provide new biomarker-led descriptions of clusters of patient subgroups. | null | medrxiv |
10.1101/19004804 | AETIONOMY, a Cross-Sectional Study Aimed at validating a new taxonomy of Neurodegenerative Diseases: Study design and subject characteristics | Corvol, J.-C.; Bujac, S.; Carvalho, S.; Clarke, B.; Marovac, J.; Mangone, G.; Rascol, O.; Meissner, W. G.; Magnin, E.; Foubert-Samier, A.; Catala, H.; Markaki, I.; Tsitsi, P.; Sanchez-Valle, R.; Heneka, M. T.; Molinuevo, J.-L.; Wuellner, U.; Svenningsson, P.; Scordis, P.; Hofmann-Apitius, M. | Jean-Christophe Corvol | Sorbonne Universite, APHP, Inserm, ICM | 2019-09-26 | 2 | PUBLISHAHEADOFPRINT | cc_no | neurology | https://www.medrxiv.org/content/early/2019/09/26/19004804.source.xml | BackgroundAlthough advances in the understanding of neurodegenerative diseases (NDDs) have led to improvements in classification and diagnosis and most importantly to new therapies, the unmet medical needs remain significant due to high treatment failure rates. The AETIONOMY project funded by the Innovative Medicine Initiative (IMI) aims at using multi-OMICs and bioinformatics to identify new classifications for NDDs based on common molecular pathophysiological mechanisms in view of improving the availability of personalised treatments.
ObjectivesThe purpose of the AETIONOMY cross-sectional study is to validate novel patient classification criteria provided by these tools.
MethodsThis was a European multi-centre, cross-sectional, clinical study conducted at 6 sites in 3 countries. Standardised clinical data, biosamples from peripheral blood, cerebrospinal fluid, skin biopsies, and data from a multi-OMICs approach were collected in patients suffering from Alzheimer's and Parkinson's disease, as well as healthy controls.
ResultsFrom September 2015 to December 2017 a total of 421 participants were recruited including 95 Healthy Controls. Nearly 1,500 biological samples were collected. The study achieved its objective with respect to Parkinson's disease (PD) recruitment; however, it was unable to recruit many new Alzheimer's disease (AD) patients. Overall, data from 413 evaluable subjects (405 PD and 8 AD) are available for analysis. PD patients and controls were well matched with respect to age (mean 63.4 years); however, close gender matching was not achieved. Approximately half of all PD patients and one At-Risk subject were taking dopamine agonists; rates of Levodopa usage were slightly higher (~60%). Median MDS-UPDRS Part III Scores (OFF state) ranged from 45 (SD 18) in those with Genetic PD to 2 (SD 3) in Healthy Controls. The standardised methodologies applied resulted in a high-quality database with very few missing data.
ConclusionThis is one of the collaborative multi-OMICs studies in individuals suffering from PD and AD involving a control group. It is expected that the integration of data will provide new biomarker-led descriptions of clusters of patient subgroups. | null | medrxiv |
10.1101/19005462 | Increased Frequency of Acute Illness and Hospitalizations in Infants and Toddlers with Congenital Adrenal Hyperplasia | Tseng, T.; Seagroves, A.; Koppin, C. M.; Keenan, M. F.; Putterman, E.; Nguyen, E.; Chand, S.; Geffner, M. E.; Chang, T.; Kim, M. S. | Mimi S Kim | Children\'s Hospital Los Angeles - University of Southern California | 2019-09-05 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | endocrinology | https://www.medrxiv.org/content/early/2019/09/05/19005462.source.xml | PurposeInfants and toddlers with classical congenital adrenal hyperplasia (CAH) are at high risk for adrenal crisis and associated sequelae. To better understand acute illness at this early age, we determined the frequency and severity of acute illness and hospitalizations between 0-4 years of age, both within CAH and compared to controls. We also evaluated the impact of pre-hospital stress-dose hydrocortisone on Emergency Department (ED) visits and hospitalizations.
MethodsWe performed a retrospective study of 40 CAH youth and 27 age-matched controls at a tertiary center. Characteristics of acute illnesses during the first 4 years of life were recorded, including fever, vomiting, diarrhea, ED visits, hospitalizations, abnormal electrolytes, and stress-dose hydrocortisone usage.
ResultsCAH youth had more frequent illnesses requiring stress-dosing when they were younger than 2 years old [4.0 (1.0-6.0)] compared to when they were 2-4 years old [3.0 (1.0-4.0), P < 0.05], with the most illnesses during their first year of life. As well, CAH infants and toddlers had more hospitalizations younger than 2 years old compared to 2-4 years old (36 vs 2). 25% (3/12) of CAH youth with abnormal electrolytes in the ED did not receive any stress-dosing (oral/IM) prior to the ED, and only 25% (3/12) had received intramuscular hydrocortisone at home. CAH youth had more frequent ED visits (7.4 times as many) and hospitalizations (38 to 0) compared to controls.
ConclusionsVery young children with classical CAH are at high risk for acute illness and hospitalizations during their first 2 years of life, and do not receive adequate stress-dosing prior to the ED despite appropriate education. Our findings underscore the need for earlier recognition of acute illness in this vulnerable population and improved education regarding administration of stress-dose hydrocortisone to prevent morbidity. | null | medrxiv |
10.1101/19005140 | Opioid Overdose in Ohio: Comprehensive Analysis of Associated Socioeconomic Factors | Park, C.; Crawford, S.; Lopez, R.; Seballos, A.; Clemenceau, J.; Coy, T.; Atluri, G.; Hwang, T. H. | Tae Hyun Hwang | Cleveland Clinic | 2019-09-05 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | epidemiology | https://www.medrxiv.org/content/early/2019/09/05/19005140.source.xml | ObjectiveOur study focused on identifying socioeconomic factors associated with death by opioid overdose in Ohio communities at the census tract level.
Materials and MethodsA large-scale vital statistic dataset from Ohio Department of Health (ODH) and U.S. Census datasets were used to obtain opioid-related death rate and socioeconomic characteristics for all census tracts in Ohio. Regression analysis was performed to identify the relationships between socioeconomic factors of census tracts and the opioid-related death rate for both urban and rural tracts.
ResultsIn Ohio from 2010-2016, whites, males, and people aged 25-44 had the highest opioid-related death rates. At the census tract level, higher death rates were associated with certain socioeconomic characteristics (e.g. percentage of the census tract population living in urban areas, percentage divorced/separated, percentage of vacant housing units). Predominately rural areas had a different population composition than urban areas, and death rates in rural areas exhibited fewer associations with socioeconomic characteristics.
DiscussionPredictive models of opioid-related death rates based on census tract-level characteristics held for urban areas more than rural ones, reflecting the recently observed rural-to-urban geographic shift in opioid-related deaths. Future research is needed to examine the geographic distribution of opioid abuse throughout Ohio and in other states.
ConclusionRegression analysis identified associations between population characteristics and opioid-related death rates of Ohio census tracts. These analyses can help government officials and law enforcement officers prevent, predict and combat opioid abuse at the community level. | null | medrxiv |
10.1101/19005546 | Novel delivery of cellular therapy to reduce ischaemia reperfusion injury in kidney transplantation | Thompson, E. R.; Bates, L.; Ibrahim, I. K.; Sewpaul, A.; Stenberg, B.; McNeill, A.; Figueiredo, R.; Girdlestone, T.; Wilkins, G. C.; Irwin, E. A.; Tingle, S. J.; Scott, W. E.; Lemos, H.; Mellor, A. L.; Roobrouck, V. D.; Ting, A.; Hosgood, S. A.; Nicholson, M. L.; Fisher, A. J.; Ali, S.; Sheerin, N. S.; Wilson, C. H. | Emily R Thompson | NIHR Blood and Transplant Research Unit in Organ Donation and Transplantation, Institute of Transplantation, Freeman Hospital, Newcastle upon Tyne, UK | 2019-09-08 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | transplantation | https://www.medrxiv.org/content/early/2019/09/08/19005546.source.xml | Ex-vivo normothermic machine perfusion (NMP) of donor kidneys prior to transplantation provides a platform for direct delivery of cellular therapeutics to optimise organ quality prior to transplantation. Multipotent Adult Progenitor Cells (MAPC®) possess potent immunomodulatory properties which could prove beneficial in minimising subsequent ischaemia reperfusion injury. We investigated the potential reconditioning capability of MAPC cells in kidney NMP.
MethodsPairs (5) of human kidneys from the same donor were simultaneously perfused for 7 hours. The right or left kidney was randomly allocated to receive MAPC treatment. Serial samples of perfusate, urine and tissue biopsies were taken for comparison with the control paired kidney.
ResultsMAPC-treated kidneys demonstrated improved urine output (p<0.01), decreased expression of the kidney injury biomarker NGAL (p<0.01), improved microvascular perfusion on contrast enhanced ultrasound (cortex p<0.05, medulla p<0.01), downregulation of IL-1{beta} (p<0.05) and upregulation of IL-10 (p<0.05) and Indolamine-2, 3-dioxygenase (p<0.05). A mouse model of intraperitoneal chemotaxis demonstrated decreased neutrophil recruitment when stimulated with perfusate from MAPC-treated kidneys (p<0.01). Immunofluorescence revealed pre-labelled MAPC cells home to the perivascular space in the kidneys during NMP. MAPC therapy was not associated with detrimental physiological or embolic events.
ConclusionWe report the first successful delivery of cellular therapy to a kidney during NMP. Kidneys treated with MAPC cells demonstrate improvement in clinically relevant functional parameters and injury biomarkers. This novel method of cell therapy delivery provides an exciting opportunity to recondition organs prior to clinical transplantation.
One Sentence SummaryEx-vivo reconditioning of human kidneys using Multipotent Adult Progenitor Cell therapy delivered during normothermic machine perfusion. | 10.1111/ajt.16100 | medrxiv |
10.1101/19005579 | Changes in association between school meals and children's dietary quality during implementation of the Healthy, Hunger-Free Kids Act of 2010 | Berger, A. T.; Widome, R.; Erickson, D. J.; Laska, M. N.; Harnack, L. J. | Aaron T. Berger | Division of Epidemiology and Community Health, School of Public Health, University of Minnesota, Twin Cities, Minneapolis, Minnesota, USA | 2019-09-08 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | epidemiology | https://www.medrxiv.org/content/early/2019/09/08/19005579.source.xml | PurposeTo estimate the effect of Healthy, Hunger-Free Kids Act of 2010 (HHFKA) implementation on dietary quality of all US school-aged children and adolescents, and examine whether those effects differed by demographic group.
MethodsWe used survey regression on 2007-2016 National Health and Nutrition Examination Survey (NHANES) data to estimate the proportion of energy intake from school foods and the association between school food intake and dietary quality, before and after HHFKA passage/implementation. To account for demographic changes in the US population over time, inverse probability weighting was employed. The product of the proportion of energy from school foods and the association between school food intake and dietary quality estimated the effect of HHFKA implementation on dietary quality.
ResultsSchool food intake quantity remained stable during the study period. HHFKA implementation improved students dietary quality by 4.3 Healthy Eating Index-2010 (HEI) points (95% CI: 2.5, 6.1) on days when school foods were eaten, and by 1.3 HEI points (95% CI: 0.73, 1.8) averaged over all days annually.
ConclusionsHHFKA implementation improved the total dietary quality of US school students. US students would benefit from eating school meals in the post-HHFKA era, and HHFKA regulations should not be relaxed. | 10.1016/j.annepidem.2020.05.013 | medrxiv |
10.1101/19005579 | Changes in association between school meals and children's dietary quality during implementation of the Healthy, Hunger-Free Kids Act of 2010 | Berger, A. T.; Widome, R.; Erickson, D. J.; Laska, M. N.; Harnack, L. J. | Aaron T. Berger | Division of Epidemiology and Community Health, School of Public Health, University of Minnesota, Twin Cities, Minneapolis, Minnesota, USA | 2020-05-13 | 2 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | epidemiology | https://www.medrxiv.org/content/early/2020/05/13/19005579.source.xml | PurposeTo estimate the effect of Healthy, Hunger-Free Kids Act of 2010 (HHFKA) implementation on dietary quality of all US school-aged children and adolescents, and examine whether those effects differed by demographic group.
MethodsWe used survey regression on 2007-2016 National Health and Nutrition Examination Survey (NHANES) data to estimate the proportion of energy intake from school foods and the association between school food intake and dietary quality, before and after HHFKA passage/implementation. To account for demographic changes in the US population over time, inverse probability weighting was employed. The product of the proportion of energy from school foods and the association between school food intake and dietary quality estimated the effect of HHFKA implementation on dietary quality.
ResultsSchool food intake quantity remained stable during the study period. HHFKA implementation improved students dietary quality by 4.3 Healthy Eating Index-2010 (HEI) points (95% CI: 2.5, 6.1) on days when school foods were eaten, and by 1.3 HEI points (95% CI: 0.73, 1.8) averaged over all days annually.
ConclusionsHHFKA implementation improved the total dietary quality of US school students. US students would benefit from eating school meals in the post-HHFKA era, and HHFKA regulations should not be relaxed. | 10.1016/j.annepidem.2020.05.013 | medrxiv |
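The HHFKA records above describe the key scaling step in words: the all-days effect is the product of the share of intake coming from school foods and the association on days when school foods are eaten. Below is a minimal sketch of that arithmetic using only the two point estimates quoted in the abstract; the paper's actual estimation uses survey regression with inverse probability weighting, which this does not reproduce.

```python
# Minimal arithmetic sketch of the scaling step described in words above: the
# all-days effect equals the share of intake coming from school foods times the
# association on days when school foods are eaten. The two numbers are taken
# from the abstract; the paper's actual estimation is survey-weighted.
effect_on_school_food_days = 4.3  # HEI-2010 points (abstract)
all_days_effect = 1.3             # HEI-2010 points averaged over all days (abstract)

implied_school_food_share = all_days_effect / effect_on_school_food_days
print(f"implied school-food share of intake: {implied_school_food_share:.2f}")  # ~0.30
```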
10.1101/19005421 | Systematic Review and Meta-Analysis of the Associations Between Body Mass Index, Prostate Cancer, Advanced Prostate Cancer and Prostate Specific Antigen | Harrison, S.; Tilling, K.; Turner, E. L.; Martin, R. M.; Lennon, R. J.; Lane, J. A.; Donovan, J.; Hamdy, F. C.; Neal, D. E.; Bosch, J. L. R.; Jones, H. E. | Sean Harrison | University of Bristol | 2019-09-09 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | oncology | https://www.medrxiv.org/content/early/2019/09/09/19005421.source.xml | PurposeThe relationship between body-mass index (BMI) and prostate cancer remains unclear. However, there is an inverse association between BMI and prostate-specific antigen (PSA), used for prostate cancer screening. We conducted this review to estimate the associations between BMI and (1) prostate cancer, (2) advanced prostate cancer, and (3) PSA.
MethodsWe searched PubMed and Embase for studies until 02 October 2017 and obtained individual participant data from four studies. In total, 78 studies were identified for the association between BMI and prostate cancer, 21 for BMI and advanced prostate cancer, and 35 for BMI and PSA. We performed random-effects meta-analysis of linear associations of log PSA and prostate cancer with BMI and, to examine potential non-linearity, of associations between categories of BMI and each outcome.
ResultsIn the meta-analyses with continuous BMI, a 5 kg/m2 increase in BMI was associated with a percentage change in PSA of -5.88% (95% CI -6.87% to -4.87%). Using BMI categories, compared to normal weight men, the PSA levels of overweight men were 3.43% lower (95% CI -5.57% to -1.23%), and obese men were 12.9% lower (95% CI -15.2% to -10.7%). Prostate cancer and advanced prostate cancer analyses showed little or no evidence of associations.
ConclusionThere is little or no evidence of an association between BMI and risk of prostate cancer or advanced prostate cancer, and strong evidence of an inverse and non-linear association between BMI and PSA. The association between BMI and prostate cancer is likely biased if missed diagnoses are not considered. | 10.1007/s10552-020-01291-3 | medrxiv |
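For readers unfamiliar with the random-effects pooling used in the BMI/PSA review above, here is a small, self-contained DerSimonian-Laird sketch. The study-level effect sizes and standard errors are made-up placeholders, not the review's data, and the authors may have used different software.

```python
# Self-contained DerSimonian-Laird random-effects pooling, analogous in spirit
# to pooling the percent change in PSA per 5 kg/m^2 BMI reported above. The
# study-level estimates and standard errors below are made-up placeholders.
import numpy as np

def dersimonian_laird(effects, std_errors):
    e = np.asarray(effects, dtype=float)
    se = np.asarray(std_errors, dtype=float)
    w = 1.0 / se**2                                   # fixed-effect weights
    fixed = np.sum(w * e) / np.sum(w)
    q = np.sum(w * (e - fixed) ** 2)                  # Cochran's Q
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (len(e) - 1)) / c)           # between-study variance
    w_re = 1.0 / (se**2 + tau2)
    pooled = np.sum(w_re * e) / np.sum(w_re)
    se_pooled = np.sqrt(1.0 / np.sum(w_re))
    return pooled, pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled

est, lo, hi = dersimonian_laird([-6.1, -5.2, -7.0, -4.9], [0.9, 1.2, 1.5, 1.0])
print(f"pooled percent change: {est:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```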
10.1101/19005603 | Natural Language Processing for Mimicking Clinical Trial Recruitment in Critical Care: A Semi-automated Simulation Based on the LeoPARDS Trial | Tissot, H.; Shah, A.; Agbakoba, R.; Folarin, A.; Brealey, D.; Harris, S.; Dobson, R.; Asselbergs, F. | Hegler Tissot | University College London | 2019-09-09 | 1 | PUBLISHAHEADOFPRINT | cc_no | health informatics | https://www.medrxiv.org/content/early/2019/09/09/19005603.source.xml | Clinical trials often fail on recruiting an adequate number of appropriate patients. Identifying eligible trial participants is a resource-intensive task when relying on manual review of clinical notes, particularly in critical care settings where the time window is short. Automated review of electronic health records has been explored as a way of identifying trial participants, but much of the information is in unstructured free text rather than a computable form. We developed an electronic health record pipeline that combines structured electronic health record data with free text in order to simulate recruitment into the LeoPARDS trial. We applied an algorithm to identify eligible patients using a moving 1-hour time window, and compared the set of patients identified by our approach with those actually screened and recruited for the trial. We manually reviewed clinical records for a random sample of additional patients identified by the algorithm but not identified for screening in the original trial. Our approach identified 308 patients, of whom 208 were screened in the actual trial. We identified all 40 patients with CCHIC data available who were actually recruited to LeoPARDS in our centre. The algorithm identified 96 patients on the same day as manual screening and 62 patients one or two days earlier. Analysis of electronic health records incorporating natural language processing tools could effectively replicate recruitment in a critical care trial, and identify some eligible patients at an earlier stage. If implemented in real-time this could improve the efficiency of clinical trial recruitment. | 10.1109/JBHI.2020.2977925 | medrxiv |
10.1101/19005603 | Natural Language Processing for Mimicking Clinical Trial Recruitment in Critical Care: A Semi-automated Simulation Based on the LeoPARDS Trial | Tissot, H.; Shah, A.; Agbakoba, R.; Folarin, A.; Romao, L.; Brealey, D.; Harris, S.; Roguski, L.; Dobson, R.; Asselbergs, F. | Hegler Tissot | University College London | 2019-09-26 | 2 | PUBLISHAHEADOFPRINT | cc_no | health informatics | https://www.medrxiv.org/content/early/2019/09/26/19005603.source.xml | Clinical trials often fail on recruiting an adequate number of appropriate patients. Identifying eligible trial participants is a resource-intensive task when relying on manual review of clinical notes, particularly in critical care settings where the time window is short. Automated review of electronic health records has been explored as a way of identifying trial participants, but much of the information is in unstructured free text rather than a computable form. We developed an electronic health record pipeline that combines structured electronic health record data with free text in order to simulate recruitment into the LeoPARDS trial. We applied an algorithm to identify eligible patients using a moving 1-hour time window, and compared the set of patients identified by our approach with those actually screened and recruited for the trial. We manually reviewed clinical records for a random sample of additional patients identified by the algorithm but not identified for screening in the original trial. Our approach identified 308 patients, of whom 208 were screened in the actual trial. We identified all 40 patients with CCHIC data available who were actually recruited to LeoPARDS in our centre. The algorithm identified 96 patients on the same day as manual screening and 62 patients one or two days earlier. Analysis of electronic health records incorporating natural language processing tools could effectively replicate recruitment in a critical care trial, and identify some eligible patients at an earlier stage. If implemented in real-time this could improve the efficiency of clinical trial recruitment. | 10.1109/JBHI.2020.2977925 | medrxiv |
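The moving 1-hour eligibility window mentioned in the LeoPARDS records above can be illustrated with a toy sliding-window check over timestamped structured observations. The criterion, thresholds and column names below are illustrative assumptions, not the trial's actual eligibility rules or the authors' pipeline.

```python
# Toy illustration of a moving 1-hour eligibility window over timestamped
# structured observations, loosely modelled on the approach described above.
# The criterion (hypotension despite vasopressors), thresholds and column
# names are assumptions, not the LeoPARDS rules or the authors' code.
import pandas as pd

obs = pd.DataFrame({
    "patient_id": [1, 1, 1, 2, 2],
    "time": pd.to_datetime(["2019-01-01 10:05", "2019-01-01 10:40",
                            "2019-01-01 11:10", "2019-01-01 10:20",
                            "2019-01-01 10:50"]),
    "mean_arterial_pressure": [62, 58, 70, 75, 72],
    "on_vasopressor": [True, True, True, False, False],
})

def eligible_in_window(window: pd.DataFrame) -> bool:
    # toy criterion: any hypotensive reading while on vasopressors in the hour
    return bool(((window["mean_arterial_pressure"] < 65)
                 & window["on_vasopressor"]).any())

start = pd.Timestamp("2019-01-01 10:00")
windows = [start + pd.Timedelta(hours=k) for k in range(3)]  # three hourly windows

for pid, grp in obs.groupby("patient_id"):
    for w_start in windows:
        w = grp[(grp["time"] >= w_start) & (grp["time"] < w_start + pd.Timedelta(hours=1))]
        if not w.empty and eligible_in_window(w):
            print(f"patient {pid} flagged in window starting {w_start}")
```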
10.1101/19005934 | White adipose tissue inflammation is not attenuated by short-term calorie restriction in obese humans | Sbierski-Kind, J.; Mai, K.; Kath, J.; Jurisch, A.; Streitz, M.; Kuchenbecker, L.; Juerchott, K.; Spranger, L.; Jumpertz von Schwartzenberg, R.; Decker, A.-M.; Krueger, U.; Volk, H.-D.; Spranger, J. | Julia Sbierski-Kind | Charite - Universitaetsmedizin Berlin, Corporate Member of Freie Universitaet Berlin, Humboldt-Universitaet zu Berlin | 2019-09-09 | 1 | PUBLISHAHEADOFPRINT | cc_no | endocrinology | https://www.medrxiv.org/content/early/2019/09/09/19005934.source.xml | Obesity is a growing global health problem due to its association with chronic low-grade inflammation contributing to metabolic complications. Multiple studies indicate that white adipose tissue (WAT) inflammation can promote type 2 diabetes. However, the function and regulation of both innate and adaptive immune cells in human WAT under conditions of obesity and calorie restriction (CR) is not fully understood yet. Using a randomized interventional design, we investigated postmenopausal obese women who either underwent CR for three months followed by a 4 weeks phase of weight maintenance or had to maintain a stable weight over the whole study period. A comprehensive immune phenotyping protocol was conducted using validated multiparameter flow cytometry analysis in blood and subcutaneous WAT (SAT) (n=21). The T cell receptor repertoire was analyzed by next generation sequencing (n=20) and cytokine levels were determined in SAT (n=22). Metabolic parameters were determined by hyperinsulinemic-euglycemic clamp and then correlated to immune cell subsets. We found that insulin resistance (IR) correlates significantly with a shift towards the memory T cell compartment in SAT. Among various T cell subsets, predominantly CD8+ effector memory T cells were associated with obesity-related IR. Interestingly, T cell receptor analysis revealed a diverse repertoire in SAT arguing against an antigen-driven intra-SAT expansion of effector memory T cells. Surprisingly, neither inflammatory cytokine levels nor leucocyte subpopulations were significantly altered upon CR. Our findings demonstrate the accumulation of effector memory T cells in obese SAT contributing to chronic inflammation. The long-standing effect of obesity-induced changes in SAT was demonstrated by preserved immune cell composition after short-term CR induced weight loss. | 10.4049/jimmunol.2000108 | medrxiv |
10.1101/19005470 | Influenza-associated mortality for different causes of death during the 2010-2011 through the 2014-2015 influenza seasons in Russia | Goldstein, E. | Edward Goldstein | Harvard TH Chan School of Public Health | 2019-09-09 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | epidemiology | https://www.medrxiv.org/content/early/2019/09/09/19005470.source.xml | BackgroundThere is limited information on the volume of influenza-associated mortality in Russia.
MethodsUsing previously developed methodology (Goldstein et al., Epidemiology 2012), we regressed the monthly rates of mortality for respiratory causes, circulatory cause, and for certain infectious and parasitic diseases (available from the Russian Federal State Statistics Service (Rosstat)) linearly against the monthly proxies for the incidence of influenza A/H3N2, A/H1N1 and B (obtained using data from the Smorodintsev Research Institute of Influenza (RII) on levels of ILI/ARI consultations and the percent of respiratory specimens testing positive for influenza A/H3N2, A/H1N1 and B), adjusting for the baseline rates of mortality not associated with influenza circulation and temporal trends.
ResultsFor the 2010/11 through the 2014/15 seasons, influenza circulation was associated with an average annual 3662 (95% CI (2487,4836)) deaths for respiratory causes, 9558 (2280,16836) deaths for circulatory causes, and 343 (63,624) deaths for certain infectious and parasitic diseases, with influenza B making a substantial contribution to the last two categories of deaths. The largest numbers of both respiratory and circulatory deaths (5220 (3407,7033) and 16380 (3907,28853), respectively) were estimated during the 2014/15 season.
ConclusionsInfluenza circulation is associated with a substantial mortality burden in Russia, particularly for circulatory deaths. Those results support the potential utility of influenza vaccination (with the role played by influenza B pointing to the benefit of quadrivalent influenza vaccines), as well as of administration of antiviral drugs for older individuals and individuals with certain underlying health conditions during periods of active influenza circulation. | null | medrxiv |
10.1101/19005470 | Influenza-associated mortality for different causes of death during the 2010-2011 through the 2014-2015 influenza seasons in Russia | Goldstein, E. | Edward Goldstein | Harvard TH Chan School of Public Health | 2019-09-14 | 2 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | epidemiology | https://www.medrxiv.org/content/early/2019/09/14/19005470.source.xml | BackgroundThere is limited information on the volume of influenza-associated mortality in Russia.
MethodsUsing previously developed methodology (Goldstein et al., Epidemiology 2012), we regressed the monthly rates of mortality for respiratory causes, circulatory cause, and for certain infectious and parasitic diseases (available from the Russian Federal State Statistics Service (Rosstat)) linearly against the monthly proxies for the incidence of influenza A/H3N2, A/H1N1 and B (obtained using data from the Smorodintsev Research Institute of Influenza (RII) on levels of ILI/ARI consultations and the percent of respiratory specimens testing positive for influenza A/H3N2, A/H1N1 and B), adjusting for the baseline rates of mortality not associated with influenza circulation and temporal trends.
ResultsFor the 2010/11 through the 2014/15 seasons, influenza circulation was associated with an average annual 3662 (95% CI (2487,4836)) deaths for respiratory causes, 9558 (2280,16836) deaths for circulatory causes, and 343 (63,624) deaths for certain infectious and parasitic diseases, with influenza B making a substantial contribution to the last two categories of deaths. The largest numbers of both respiratory and circulatory deaths (5220 (3407,7033) and 16380 (3907,28853), respectively) were estimated during the 2014/15 season.
ConclusionsInfluenza circulation is associated with a substantial mortality burden in Russia, particularly for circulatory deaths. Those results support the potential utility of influenza vaccination (with the role played by influenza B pointing to the benefit of quadrivalent influenza vaccines), as well as of administration of antiviral drugs for older individuals and individuals with certain underlying health conditions during periods of active influenza circulation. | null | medrxiv |
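A hedged sketch of the regression design described in the influenza records above: monthly cause-specific mortality rates regressed linearly on incidence proxies for A/H3N2, A/H1N1 and B, with a linear trend and a periodic baseline. The input file, column names and the harmonic baseline parameterisation are assumptions; the published method may differ in detail.

```python
# Hedged sketch of the regression design described above: monthly mortality
# rates regressed linearly on influenza incidence proxies with a linear trend
# and a periodic (harmonic) baseline. The input file, column names and the
# baseline parameterisation are assumptions; the published model may differ.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("monthly_mortality_and_proxies.csv")  # assumed input
df["t"] = np.arange(len(df))
df["sin12"] = np.sin(2 * np.pi * df["t"] / 12)  # annual periodicity
df["cos12"] = np.cos(2 * np.pi * df["t"] / 12)

fit = smf.ols(
    "resp_mortality_rate ~ proxy_h3n2 + proxy_h1n1 + proxy_b + t + sin12 + cos12",
    data=df,
).fit()
print(fit.params)

# Deaths attributable to influenza: fitted proxy contributions summed over
# months, scaled by population (assuming the rate is deaths per person-month).
attributable_rate = (fit.params["proxy_h3n2"] * df["proxy_h3n2"]
                     + fit.params["proxy_h1n1"] * df["proxy_h1n1"]
                     + fit.params["proxy_b"] * df["proxy_b"])
print("attributable deaths:", float((attributable_rate * df["population"]).sum()))
```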
10.1101/19006031 | Cognitive functioning and lifetime Major Depressive Disorder in UK Biobank | de Nooij, L.; Harris, M. A.; Adams, M. J.; Clarke, T.-K.; Shen, X.; Cox, S. R.; McIntosh, A. M.; Whalley, H. C. | Laura de Nooij | University of Edinburgh | 2019-09-09 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | psychiatry and clinical psychology | https://www.medrxiv.org/content/early/2019/09/09/19006031.source.xml | BackgroundCognitive impairment associated with lifetime Major Depressive Disorder (MDD) is well-supported by meta-analytic studies, but population-based estimates remain scarce. Previous UK Biobank studies have only shown limited evidence of cognitive differences related to probable MDD. Using updated cognitive and clinical assessments in UK Biobank, this study investigated population-level differences in cognitive functioning associated with lifetime MDD.
MethodsAssociations between lifetime MDD and cognition (performance on six tasks and general cognitive functioning (g-factor)) were investigated in UK Biobank (N-range 7,457-14,836, age 45-81 years, 52% female), adjusting for demographics, education and lifestyle. Lifetime MDD classifications were based on the Composite International Diagnostic Interview. Within the lifetime MDD group, we additionally investigated relationships between cognition and (i) recurrence, (ii) current symptoms, (iii) severity of psychosocial impairment (while symptomatic), and (iv) concurrent psychotropic medication use.
ResultsLifetime MDD was robustly associated with a lower g-factor ({beta} = -0.10, PFDR = 4.7x10-5), with impairments in attention, processing speed and executive functioning ({beta} [≥] 0.06). Clinical characteristics revealed differential profiles of cognitive impairment among case individuals; those who reported severe psychosocial impairment and use of psychotropic medication performed worse on cognitive tests. Severe psychosocial impairment and reasoning showed the strongest association ({beta} = -0.18, PFDR = 7.5x10-5).
ConclusionsFindings describe small but robust associations between lifetime MDD and lower cognitive performance within a population based sample. Overall effects were of modest effect size, suggesting limited clinical relevance. However, deficits within specific cognitive domains were more pronounced in relation to clinical characteristics, particularly severe psychosocial impairment. | 10.1192/j.eurpsy.2020.24 | medrxiv |
10.1101/19006031 | Cognitive functioning and lifetime Major Depressive Disorder in UK Biobank | de Nooij, L.; Harris, M. A.; Adams, M. J.; Clarke, T.-K.; Shen, X.; Cox, S. R.; McIntosh, A. M.; Whalley, H. C. | Laura de Nooij | University of Edinburgh | 2019-11-18 | 2 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | psychiatry and clinical psychology | https://www.medrxiv.org/content/early/2019/11/18/19006031.source.xml | BackgroundCognitive impairment associated with lifetime Major Depressive Disorder (MDD) is well-supported by meta-analytic studies, but population-based estimates remain scarce. Previous UK Biobank studies have only shown limited evidence of cognitive differences related to probable MDD. Using updated cognitive and clinical assessments in UK Biobank, this study investigated population-level differences in cognitive functioning associated with lifetime MDD.
MethodsAssociations between lifetime MDD and cognition (performance on six tasks and general cognitive functioning (g-factor)) were investigated in UK Biobank (N-range 7,457-14,836, age 45-81 years, 52% female), adjusting for demographics, education and lifestyle. Lifetime MDD classifications were based on the Composite International Diagnostic Interview. Within the lifetime MDD group, we additionally investigated relationships between cognition and (i) recurrence, (ii) current symptoms, (iii) severity of psychosocial impairment (while symptomatic), and (iv) concurrent psychotropic medication use.
ResultsLifetime MDD was robustly associated with a lower g-factor ({beta} = -0.10, PFDR = 4.7x10-5), with impairments in attention, processing speed and executive functioning ({beta} [≥] 0.06). Clinical characteristics revealed differential profiles of cognitive impairment among case individuals; those who reported severe psychosocial impairment and use of psychotropic medication performed worse on cognitive tests. Severe psychosocial impairment and reasoning showed the strongest association ({beta} = -0.18, PFDR = 7.5x10-5).
ConclusionsFindings describe small but robust associations between lifetime MDD and lower cognitive performance within a population based sample. Overall effects were of modest effect size, suggesting limited clinical relevance. However, deficits within specific cognitive domains were more pronounced in relation to clinical characteristics, particularly severe psychosocial impairment. | 10.1192/j.eurpsy.2020.24 | medrxiv |
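The UK Biobank records above regress a general cognitive factor (g) on lifetime MDD with covariates. The sketch below shows one common way to build such an analysis: extract the first principal component of standardised task scores as g, then fit an adjusted linear model. Column names are placeholders, lifetime_mdd is assumed to be coded 0/1, and the paper's factor derivation and covariate set may differ.

```python
# Sketch of one common way to set up the analysis described above: derive a
# general cognitive factor (g) as the first principal component of
# standardised task scores, then regress it on lifetime MDD with covariates.
# File and column names are placeholders; lifetime_mdd is assumed coded 0/1.
import pandas as pd
import statsmodels.formula.api as smf
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("ukb_cognition.csv").dropna()  # assumed input
tasks = ["reaction_time", "pairs_matching", "reasoning",
         "trail_making", "symbol_digit", "numeric_memory"]

z = StandardScaler().fit_transform(df[tasks])
df["g_factor"] = PCA(n_components=1).fit_transform(z)[:, 0]
# Note: the sign of a principal component is arbitrary; flip it if needed so
# that higher g_factor means better performance.

fit = smf.ols("g_factor ~ lifetime_mdd + age + C(sex) + education_years",
              data=df).fit()
print(fit.params["lifetime_mdd"], fit.pvalues["lifetime_mdd"])
```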
10.1101/19005231 | Against Empathy: distinct correlates of empathy and compassion with burnout and affective symptoms in health professionals and students | Romani-Sponchiado, A.; Jordan, M.; Stringaris, A.; Salum, G. A. | Aline Romani-Sponchiado | Universidade Federal do Rio Grande do Sul | 2019-09-09 | 1 | PUBLISHAHEADOFPRINT | cc_no | psychiatry and clinical psychology | https://www.medrxiv.org/content/early/2019/09/09/19005231.source.xml | Concern for the well-being of medical professionals has increased considering the high rates of depression and suicidal ideation observed among medical students and residents. However, the causes of such psychological distress among health professionals are still unknown. One possibility is that such negative outcomes arise from individual differences in how clinicians respond to the emotional states of their patients: while some tend to respond with empathy (feeling what others feel), others tend to respond with compassion (caring about what others feel). The aim of this study is to investigate the hypothesis that empathy is related to higher levels of burnout and affective symptoms, while compassion is related to lower levels of these outcomes. We surveyed 464 undergraduate students and professionals in medicine (34.3%), psychology (47%) and nursing (18.8%), 79.7% female, with a median age of 23.3. The survey included the concern and perspective taking subscales from the Interpersonal Reactivity Index (IRI); empathy and behavioral contagion from the Empathy Index (EI); the depression, anxiety, and anger subscales from PROMIS; and the Medical Student Well-Being Index (MSWBI). Empathy was associated with higher symptoms of burnout, depression, anxiety and anger; while higher levels of compassion were associated with lower levels of these outcomes. Our findings provide new evidence that the well-being of medical professionals might be affected differently depending on socio-emotional traits relevant to emotional connection. | 10.1590/1516-4446-2020-0941 | medrxiv
10.1101/19005629 | The purine pathway in liver tissue biopsies from donors for transplantation is associated to immediate graft function and survival | Xu, J.; Hassan-Ally, M.; Casas-Ferreira, A. M.; Suvitaival, T.; Ma, Y.; Vilca-Melendez, H.; Rela, M.; Heaton, N.; Jassem, W.; Legido-Quigley, C. | Cristina Legido-Quigley | King\'s College London | 2019-09-09 | 1 | PUBLISHAHEADOFPRINT | cc_no | transplantation | https://www.medrxiv.org/content/early/2019/09/09/19005629.source.xml | Background & AimsThe current shortage of livers for transplantation has increased the use of organs sourced from donation after circulatory death (DCD). These organs are prone to higher incidence of graft failure, but the underlying mechanisms are largely unknown. Here we aimed to find biomarkers of liver function before transplantation to better inform clinical evaluation.
MethodsMatched pre- and post-transplant liver biopsies from DCD (n=24) and donation after brain death (DBD, n=70) were collected. Liver biopsies were analysed using mass spectroscopy molecular phenotyping. First, a discrimination analysis DCD vs DBD was used to parse metabolites associated to DCD. Then a data-driven approach was used to predict Immediate Graft Function (IGF). The metabolites were tested in models to predict survival.
ResultsFive metabolites in the purine pathway were selected and investigated. The ratios of adenine monophosphate (AMP), adenine, adenosine and hypoxanthine to urate differed between DBD and DCD biopsies at pre-transplantation stage (q<0.05). The ratios of AMP and adenine to urate also differed in biopsies from recipients undergoing IGF (q<0.05). Using random forest, a panel composed of alanine aminotransferase (ALT) and the AMP, adenine and hypoxanthine ratios to urate predicted IGF with AUC 0.84 (95% CI [0.71, 0.97]). In comparison, AUC 0.71 (95%CI [0.52, 0.90]) was achieved by clinical measures. Survival analysis revealed that the metabolite classifier could stratify 6-year survival outcomes (p = 0.0073) while clinical data and donor class could not.
ConclusionsAt liver pre-transplantation stage, a panel composed of purine metabolites and ALT in tissue could improve prediction of IGF and survival.
Lay summaryNew liver function biomarkers could help clinicians assess livers before transplantation. Purines are small molecules that are found in healthy livers, and in this work we found that their levels changed critically in livers from cardiac death donors. Measuring them before transplantation improved the prediction of the livers immediate graft function.
Graphic abstract
[Figure 1]
HighlightsO_LIThe ratios of purine metabolites to urate differ between DCD and DBD in liver tissue at pre-transplantation.
C_LIO_LIThe ratios of purine metabolites to urate and ALT pre-transplantation can improve prediction of IGF after transplantation.
C_LIO_LIPurine metabolites ratios to urate stratified 6-year survival outcome better than clinical data and donor class.
C_LI | 10.3390/jcm9030711 | medrxiv |
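The purine-pathway record above reports a random forest over ALT and purine-to-urate ratios evaluated by AUC for immediate graft function. Below is a generic, hedged sketch of that kind of evaluation with scikit-learn; the file name, feature names and cross-validation scheme are illustrative assumptions rather than the authors' exact pipeline.

```python
# Generic sketch of a random-forest + cross-validated AUC evaluation of the
# kind reported in the purine-pathway record above (ALT plus purine-to-urate
# ratios predicting immediate graft function). File name, feature names and
# the CV scheme are illustrative assumptions, not the authors' pipeline.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import StratifiedKFold, cross_val_predict

df = pd.read_csv("biopsy_metabolites.csv")  # assumed input
features = ["alt", "amp_urate_ratio", "adenine_urate_ratio",
            "hypoxanthine_urate_ratio"]
X = df[features]
y = df["immediate_graft_function"]  # assumed coded 1 = IGF, 0 = delayed/no IGF

cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
probs = cross_val_predict(
    RandomForestClassifier(n_estimators=500, random_state=0),
    X, y, cv=cv, method="predict_proba",
)[:, 1]
print("cross-validated AUC:", round(roc_auc_score(y, probs), 3))
```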
10.1101/19006080 | Analysis of corneal real astigmatism changes and high order aberration after lower eyelid epiblepharon repair surgery | LEE, D. | Dongcheol LEE | Department of ophthalmology, Keimyung University School of Medicine | 2019-09-09 | 1 | PUBLISHAHEADOFPRINT | cc_by | ophthalmology | https://www.medrxiv.org/content/early/2019/09/09/19006080.source.xml | ObjectiveTo investigate changes in corneal low and high order aberrations (LOA and HOA), which cause visual disturbances after lower eyelid epiblepharon repair surgery.
Methods and analysisThis was a retrospective, cross-sectional study, which included 108 eyes from 54 patients. Wavefront analyses for calibrated LOAs and HOAs (root mean square, coma, three-piece aberrations [Trefoil], secondary astigmatisms, and spherical aberrations [SA]) were performed via a Galilei G4 Dual Scheimpflug Analyzer preoperatively, at the first and second follow-ups (f/u), and at G1, G2, and G3 (<45 days, 45-75 days, and >75 days post-surgery). Several risk factors (age, sex, body mass index, and corneal keratitis presence) were assessed.
ResultsIn LOAs, flat keratometer (K) and axis values decreased significantly from baseline at the first f/u. At the second f/u, mean K and axis decreased. In HOAs, coma and trefoil increased from baseline at the first f/u and normalized by the second f/u. SA decreased at the second f/u and in G3. The various risk factors did not affect postoperative outcomes, axis, and secondary astigmatisms. After correcting for risk factors, at the first f/u, cylinder, coma, trefoil, and SA increased significantly from the baseline, while axis and flat K decreased. At the second f/u, cylinder increased, while axis and mean K decreased significantly from the baseline.
ConclusionEpiblepharon repair surgery may impact axis changes. Flat K, coma, and trefoil may be affected by mechanical force changes immediately following surgery. Mean K and SA may change with cornea state changes during healing. | 10.1038/s41598-020-64386-6 | medrxiv |
10.1101/19006197 | RISK6, a universal 6-gene transcriptomic signature of TB disease risk, diagnosis and treatment response | Penn-Nicholson, A.; Mbandi, S. K.; Thompson, E.; Mendelsohn, S.; Suliman, S.; Chegou, N. N.; Malherbe, S. T.; Darboe, F.; Erasmus, M.; Hanekom, W. A.; Bilek, N.; Fisher, M.; Kaufmann, S. H.; Winter, J.; Murphy, M.; Wood, R.; Morrow, C.; Van Rhijn, I.; Moody, D. B.; Murray, M.; Andrade, B.; Sterling, T.; Sutherland, J.; Naidoo, K.; Padayatchi, N.; Walzl, G.; Hatherill, M.; Zak, D.; Scriba, T. | Thomas Scriba | University of Cape Town | 2019-09-09 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | infectious diseases | https://www.medrxiv.org/content/early/2019/09/09/19006197.source.xml | Improved tuberculosis diagnostics and tools for monitoring treatment response are urgently needed. We developed a robust and simple, PCR-based host-blood transcriptomic signature, RISK6, for multiple applications: identifying individuals at risk of incident disease, as a screening test for subclinical or clinical tuberculosis, and for monitoring tuberculosis treatment. RISK6 utility was validated by blind prediction using quantitative real-time (qRT) PCR in seven independent cohorts.
Prognostic performance significantly exceeded that of previous signatures discovered in the same cohort. Performance for diagnosing subclinical and clinical disease in HIV-uninfected and HIV-infected persons, assessed by area under the receiver-operating characteristic curve, exceeded 85%. As a screening test for tuberculosis, the sensitivity at 90% specificity met or approached the benchmarks set out in World Health Organization target product profiles for non-sputum-based tests. RISK6 scores correlated with lung immunopathology activity, measured by positron emission tomography, and tracked treatment response, demonstrating utility as treatment response biomarker, while predicting treatment failure prior to treatment initiation. Performance of the test in capillary blood samples collected by finger-prick was noninferior to venous blood collected in PAXgene tubes. These results support incorporation of RISK6 into rapid, capillary blood-based point-of-care PCR devices for prospective assessment in field studies. | 10.1038/s41598-020-65043-8 | medrxiv |
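The RISK6 record above benchmarks the signature by its sensitivity at 90% specificity. That quantity can be read off a ROC curve as in the small sketch below, using synthetic scores (specificity = 1 - false positive rate); the data are simulated and unrelated to the study.

```python
# Small synthetic illustration of the benchmark quoted above: sensitivity read
# off the ROC curve at a fixed 90% specificity (specificity = 1 - FPR).
# Scores and labels are simulated and unrelated to the RISK6 data.
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(0)
labels = np.r_[np.zeros(200, dtype=int), np.ones(60, dtype=int)]
scores = np.r_[rng.normal(0.0, 1.0, 200), rng.normal(1.5, 1.0, 60)]

fpr, tpr, _ = roc_curve(labels, scores)
sensitivity_at_90_spec = tpr[fpr <= 0.10].max()
print(f"sensitivity at 90% specificity: {sensitivity_at_90_spec:.2f}")
```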
10.1101/19006163 | Rise and Regional Disparities in Buprenorphine Utilization in the United States | Pashmineh, A. A. R.; Cruz-Mullane, A.; Podd, J. C.; Lam, W. S.; Kaleem, S. S.; Lockard, L. B.; Mandel, M. R.; Chung, D. Y.; Davis, C. S.; Nichols, S. D.; McCall, K. L.; Piper, B. J. | Brian James Piper | Geisinger Commonwealth School of Medicine | 2019-09-09 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | addiction medicine | https://www.medrxiv.org/content/early/2019/09/09/19006163.source.xml | AimsBuprenorphine is an opioid partial-agonist used to treat Opioid Use Disorders (OUD). While several state and federal policy changes have attempted to increase buprenorphine availability, access remains well below optimal levels. This study characterized how buprenorphine utilization in the United States has changed over time and whether there are regional disparities in distribution.
MeasurementsBuprenorphine weights distributed from 2007 to 2017 were obtained from the Drug Enforcement Administration. Data was expressed as the percent change and as the mg per person in each state. Separately, the formulations for prescriptions covered by Medicaid (2008 to 2018) were examined.
FindingsBuprenorphine distributed to pharmacies increased about seven-fold (476.8 to 3,179.9 kg) while the quantities distributed to hospitals grew five-fold (18.6 to 97.6 kg) nationally from 2007 to 2017. Buprenorphine distribution per person was almost 20-fold higher in Vermont (40.4 mg/person) relative to South Dakota (2.1 mg/person). There was a strong association between the number of waivered physicians per 100K population and distribution per state (r(49) = +0.76, p < .0005). The buprenorphine/naloxone sublingual film (Suboxone) was the predominant formulation (92.6% of 0.31 million Medicaid prescriptions) in 2008 but this accounted for less than three-fifths (57.3% of 6.56 million prescriptions) in 2018.
ConclusionsAlthough buprenorphine availability has substantially increased over the last decade, distribution was very non-homogenous across the US. | 10.1002/pds.4984 | medrxiv |
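The buprenorphine record above reports a state-level correlation (r(49) = +0.76) between waivered physicians per 100,000 population and buprenorphine distributed per person. A minimal sketch of that calculation follows, with an assumed input file and placeholder column names.

```python
# Minimal sketch of the state-level association reported above: buprenorphine
# distributed per person against waivered physicians per 100,000 population.
# The input file and column names are assumptions.
import pandas as pd
from scipy.stats import pearsonr

states = pd.read_csv("state_buprenorphine_2017.csv")  # one row per state (assumed)
states["mg_per_person"] = states["buprenorphine_mg"] / states["population"]
states["waivered_per_100k"] = 1e5 * states["waivered_physicians"] / states["population"]

r, p = pearsonr(states["waivered_per_100k"], states["mg_per_person"])
print(f"r = {r:+.2f}, p = {p:.4g}")  # the abstract reports r(49) = +0.76
```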
10.1101/19006007 | Associations between reasons for vaping and current vaping and smoking status: Evidence from a UK based cohort | Khouja, J. N.; Taylor, A. E.; Munafo, M. R. | Jasmine N Khouja | University of Bristol | 2019-09-09 | 1 | PUBLISHAHEADOFPRINT | cc_by | epidemiology | https://www.medrxiv.org/content/early/2019/09/09/19006007.source.xml | BackgroundThis study aimed to discover which young adults vape, the reasons given for vaping, and which reasons for vaping are associated with continued vaping/smoking.
MethodsIn a UK cohort of 3,994 young adults, we explored the association of retrospectively-recalled reasons for vaping by 23 years with vaping/smoking status at 24 years. Using logistic regression, we assessed the association with vaping behaviour among ever vapers who had ever smoked (n=668), and with smoking behaviour among individuals who regularly smoked prior to vaping (n=412).
ResultsVaping to quit smoking was associated with higher likelihood of vaping (odds ratio [OR] = 3.51, 95% confidence interval [95%CI] = 2.29 to 5.38), but lower likelihood of smoking at 24 years (OR = 0.50, 95%CI = 0.32 to 0.78). Vaping to cut down smoking was associated with higher likelihood of vaping (OR = 2.90, 95%CI = 1.87 to 4.50) and smoking at 24 years (OR = 1.62, 95%CI = 1.02 to 2.58). Vaping out of curiosity was associated with lower likelihood of vaping at 24 years (OR = 0.41, 95%CI = 0.26 to 0.63) but higher likelihood of smoking at 24 years (OR = 1.66, 95%CI = 1.04 to 2.65).
ConclusionsIntention to quit smoking appears important for young adults to stop smoking using e-cigarettes; vaping to cut down is associated with continued smoking, but vaping to quit is associated with discontinued smoking. Vaping out of curiosity is less likely to lead to a change in smoking/vaping behaviour (i.e., current smokers continue to smoke). | 10.1016/j.drugalcdep.2020.108362 | medrxiv
10.1101/19006007 | Associations between reasons for vaping and current vaping and smoking status: Evidence from a UK based cohort | Khouja, J. N.; Taylor, A. E.; Munafo, M. R. | Jasmine N Khouja | University of Bristol | 2020-05-20 | 2 | PUBLISHAHEADOFPRINT | cc_by | epidemiology | https://www.medrxiv.org/content/early/2020/05/20/19006007.source.xml | BackgroundThis study aimed to discover which young adults vape, the reasons given for vaping, and which reasons for vaping are associated with continued vaping/smoking.
MethodsIn a UK cohort of 3,994 young adults, we explored the association of retrospectively-recalled reasons for vaping by 23 years with vaping/smoking status at 24 years. Using logistic regression, we assessed the association with vaping behaviour among ever vapers who had ever smoked (n=668), and with smoking behaviour among individuals who regularly smoked prior to vaping (n=412).
ResultsVaping to quit smoking was associated with higher likelihood of vaping (odds ratio [OR] = 3.51, 95% confidence interval [95%CI] = 2.29 to 5.38), but lower likelihood of smoking at 24 years (OR = 0.50, 95%CI = 0.32 to 0.78). Vaping to cut down smoking was associated with higher likelihood of vaping (OR = 2.90, 95%CI = 1.87 to 4.50) and smoking at 24 years (OR = 1.62, 95%CI = 1.02 to 2.58). Vaping out of curiosity was associated with lower likelihood of vaping at 24 years (OR = 0.41, 95%CI = 0.26 to 0.63) but higher likelihood of smoking at 24 years (OR = 1.66, 95%CI = 1.04 to 2.65).
ConclusionsIntention to quit smoking appears important for young adults to stop smoking using e-cigarettes; vaping to cut down is associated with continued smoking, but vaping to quit is associated with discontinued smoking. Vaping out of curiosity is less likely to lead to a change in smoking/vaping behaviour (i.e., current smokers continue to smoke). | 10.1016/j.drugalcdep.2020.108362 | medrxiv
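The vaping records above report odds ratios with 95% confidence intervals from logistic regression. As a hedged sketch, the snippet below fits a logistic model of vaping status on reasons for vaping and exponentiates the coefficients and their confidence bounds; the file and variable names are placeholders, and the paper's models include additional covariates.

```python
# Hedged sketch of the logistic models described above: vaping status at age
# 24 regressed on reasons for vaping, with odds ratios and 95% CIs obtained by
# exponentiating the coefficients. File and variable names are placeholders
# (outcome assumed coded 0/1); the paper's models adjust for more covariates.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("vaping_cohort.csv")  # assumed input
fit = smf.logit(
    "current_vaper_24 ~ reason_quit_smoking + reason_cut_down + reason_curiosity",
    data=df,
).fit(disp=0)

ci = fit.conf_int()
odds_ratios = pd.DataFrame({
    "OR": np.exp(fit.params),
    "2.5%": np.exp(ci[0]),
    "97.5%": np.exp(ci[1]),
})
print(odds_ratios.round(2))
```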
10.1101/19005611 | A deep learning approach with event-related spectral EEG data in attentional deficit hyperactivity disorder | Dubreuil-Vall, L.; Ruffini, G.; Camprodon, J. | Laura Dubreuil-Vall | Massachusetts General Hospital, Harvard Medical School | 2019-09-11 | 1 | PUBLISHAHEADOFPRINT | cc_no | psychiatry and clinical psychology | https://www.medrxiv.org/content/early/2019/09/11/19005611.source.xml | Attention deficit hyperactivity disorder (ADHD) is a heterogeneous neurodevelopmental disorder that affects 5% of the pediatric and adult population worldwide. The diagnosis remains essentially clinical, based on history and exam, with no available biomarkers. In this paper, we describe a deep convolutional neural network (DCNN) for ADHD classification derived from the time-frequency decomposition of electroencephalography data (EEG), particularly of event-related potentials (ERP) during the Flanker Task collected from 20 ADHD adult patients and 20 healthy controls (HC). The model reaches a classification accuracy of 88%, superior to resting state EEG spectrograms and with the key advantage, compared with other machine learning approaches, of avoiding the need for manual selection of EEG spectral or channel features. Finally, through the use of feature visualization techniques, we show that the main features exciting the DCNN nodes are a decreased power in the alpha band and an increased power in the delta-theta band around 100ms for ADHD patients compared to HC, suggestive of attentional and inhibition deficits, which have been previously suggested as pathophysiological signatures of ADHD. While confirmation with larger clinical samples is necessary, these results highlight the potential of this methodology to develop CNS biomarkers of practical clinical utility. | 10.3389/fnins.2020.00251 | medrxiv
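The ADHD record above classifies event-related time-frequency EEG maps with a deep convolutional network. The PyTorch sketch below is a generic stand-in, not the paper's architecture: it only shows how a (channels x frequency x time) input can be passed through a small CNN to produce a two-class output.

```python
# Generic PyTorch stand-in (not the paper's architecture) showing how an ERP
# time-frequency input of shape (EEG channels x frequency bins x time bins)
# can be classified by a small convolutional network. All shapes are illustrative.
import torch
import torch.nn as nn

class SpectrogramCNN(nn.Module):
    def __init__(self, n_eeg_channels: int = 32, n_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(n_eeg_channels, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d((4, 4)),
        )
        self.classifier = nn.Linear(32 * 4 * 4, n_classes)

    def forward(self, x):  # x: (batch, EEG channels, freq bins, time bins)
        return self.classifier(self.features(x).flatten(1))

model = SpectrogramCNN()
dummy = torch.randn(8, 32, 40, 120)  # 8 trials, 32 channels, 40 freqs, 120 times
print(model(dummy).shape)            # torch.Size([8, 2])
```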
10.1101/19005884 | Geographic, environmental, and demographic correlates of central nervous system infections in Lao PDR (2003 through 2011): a retrospective secondary spatial analysis | Rattanavong, S.; Dubot-Peres, A.; Mayxay, M.; Vongsouvath, M.; Lee, S. J.; Cappelle, J.; Newton, P. N.; Parker, D. M. | Daniel M. Parker | University of California, Irvine | 2019-09-11 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | public and global health | https://www.medrxiv.org/content/early/2019/09/11/19005884.source.xml | BackgroundCentral nervous system (CNS) infections are important contributors to morbidity and mortality and the causative agents for [~]50% patients are never identified. The causative agents of some CNS infections have distinct spatial and temporal patterns.
Methodology/Principal FindingsHere we present the results of a spatial epidemiological and ecological analysis of CNS infections in Lao PDR (2003 - 2011). The data came from hospitalizations for suspected CNS infection at Mahosot Hospital in Vientiane. Out of 1,065 patients, 450 were assigned a confirmed diagnosis. While many communities in Lao PDR are in rural and remote locations, most patients in these data came from villages along major roads. Japanese encephalitis virus ((JEV); n=94) and Cryptococcus spp. (n=70) were the most common infections. JEV infections peaked in the rainy season and JEV patients came from villages with higher surface flooding during the same month as admission. JEV infections were spatially dispersed throughout rural areas and were most common in children. Cryptococcus spp. infections clustered near Vientiane (an urban area) and among adults.
Conclusions/SignificanceThe spatial and temporal patterns identified in this analysis are related to complex environmental, social, and geographic factors. For example, JEV infected patients came from locations with environmental conditions (surface water) that are suitable to support larger mosquito vector populations. Most patients in these data came from villages that are near major roads; likely the result of geographic and financial access to healthcare and also indicating that CNS diseases are underestimated in the region (especially from more remote areas). As Lao PDR is undergoing major developmental and environmental changes, the space-time distributions of the causative agents of CNS infection will also likely change. There is a major need for increased diagnostic abilities; increased access to healthcare, especially for rural populations; and for increased surveillance throughout the nation.
AUTHOR SUMMARYInfections of the central nervous system (CNS) are important with regard to public health. However many CNS infections are never diagnosed. In this analysis we investigated spatial and temporal patterns in hospitalized patients with suspected CNS infections in Lao PDR. We found that patients were most likely to come from villages located along major roads and highways. Patients from remote areas may have more difficulty reaching healthcare facilities. The most commonly diagnosed infection in these patients was Japanese encephalitis virus (JEV). Patients with this disease came from locations that were optimal for the mosquito vectors that spread JEV, rural areas with surface water and during the rainy season. Our analyses suggest that CNS infections should be a priority for public health workers in the region. Diagnostic capabilities should be increased throughout the nation; surveillance efforts should be broadened; and efforts should be increased toward providing easy access to healthcare for rural and remote populations. | 10.1371/journal.pntd.0008333 | medrxiv |
10.1101/19005884 | Spatial epidemiology of Japanese encephalitis virus and other infections of the central nervous system infections in Lao PDR (2003 - 2011): a retrospective analysis | Rattanavong, S.; Dubot-Peres, A.; Mayxay, M.; Vongsouvath, M.; Lee, S. J.; Cappelle, J.; Newton, P. N.; Parker, D. M. | Daniel M. Parker | University of California, Irvine | 2019-09-26 | 2 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | public and global health | https://www.medrxiv.org/content/early/2019/09/26/19005884.source.xml | BackgroundCentral nervous system (CNS) infections are important contributors to morbidity and mortality and the causative agents for [~]50% patients are never identified. The causative agents of some CNS infections have distinct spatial and temporal patterns.
Methodology/Principal FindingsHere we present the results of a spatial epidemiological and ecological analysis of CNS infections in Lao PDR (2003 - 2011). The data came from hospitalizations for suspected CNS infection at Mahosot Hospital in Vientiane. Out of 1,065 patients, 450 were assigned a confirmed diagnosis. While many communities in Lao PDR are in rural and remote locations, most patients in these data came from villages along major roads. Japanese encephalitis virus ((JEV); n=94) and Cryptococcus spp. (n=70) were the most common infections. JEV infections peaked in the rainy season and JEV patients came from villages with higher surface flooding during the same month as admission. JEV infections were spatially dispersed throughout rural areas and were most common in children. Cryptococcus spp. infections clustered near Vientiane (an urban area) and among adults.
Conclusions/SignificanceThe spatial and temporal patterns identified in this analysis are related to complex environmental, social, and geographic factors. For example, JEV infected patients came from locations with environmental conditions (surface water) that are suitable to support larger mosquito vector populations. Most patients in these data came from villages that are near major roads; likely the result of geographic and financial access to healthcare and also indicating that CNS diseases are underestimated in the region (especially from more remote areas). As Lao PDR is undergoing major developmental and environmental changes, the space-time distributions of the causative agents of CNS infection will also likely change. There is a major need for increased diagnostic abilities; increased access to healthcare, especially for rural populations; and for increased surveillance throughout the nation.
AUTHOR SUMMARYInfections of the central nervous system (CNS) are important with regard to public health. However many CNS infections are never diagnosed. In this analysis we investigated spatial and temporal patterns in hospitalized patients with suspected CNS infections in Lao PDR. We found that patients were most likely to come from villages located along major roads and highways. Patients from remote areas may have more difficulty reaching healthcare facilities. The most commonly diagnosed infection in these patients was Japanese encephalitis virus (JEV). Patients with this disease came from locations that were optimal for the mosquito vectors that spread JEV, rural areas with surface water and during the rainy season. Our analyses suggest that CNS infections should be a priority for public health workers in the region. Diagnostic capabilities should be increased throughout the nation; surveillance efforts should be broadened; and efforts should be increased toward providing easy access to healthcare for rural and remote populations. | 10.1371/journal.pntd.0008333 | medrxiv |
10.1101/19005884 | Spatial epidemiology of Japanese encephalitis virus and other infections of the central nervous system in Lao PDR (2003 - 2011): a retrospective analysis | Rattanavong, S.; Dubot-Peres, A.; Mayxay, M.; Vongsouvath, M.; Lee, S. J.; Cappelle, J.; Newton, P. N.; Parker, D. M. | Daniel M. Parker | University of California, Irvine | 2019-10-25 | 3 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | public and global health | https://www.medrxiv.org/content/early/2019/10/25/19005884.source.xml | BackgroundCentral nervous system (CNS) infections are important contributors to morbidity and mortality and the causative agents for [~]50% patients are never identified. The causative agents of some CNS infections have distinct spatial and temporal patterns.
Methodology/Principal FindingsHere we present the results of a spatial epidemiological and ecological analysis of CNS infections in Lao PDR (2003 - 2011). The data came from hospitalizations for suspected CNS infection at Mahosot Hospital in Vientiane. Out of 1,065 patients, 450 were assigned a confirmed diagnosis. While many communities in Lao PDR are in rural and remote locations, most patients in these data came from villages along major roads. Japanese encephalitis virus ((JEV); n=94) and Cryptococcus spp. (n=70) were the most common infections. JEV infections peaked in the rainy season and JEV patients came from villages with higher surface flooding during the same month as admission. JEV infections were spatially dispersed throughout rural areas and were most common in children. Cryptococcus spp. infections clustered near Vientiane (an urban area) and among adults.
Conclusions/SignificanceThe spatial and temporal patterns identified in this analysis are related to complex environmental, social, and geographic factors. For example, JEV infected patients came from locations with environmental conditions (surface water) that are suitable to support larger mosquito vector populations. Most patients in these data came from villages that are near major roads; likely the result of geographic and financial access to healthcare and also indicating that CNS diseases are underestimated in the region (especially from more remote areas). As Lao PDR is undergoing major developmental and environmental changes, the space-time distributions of the causative agents of CNS infection will also likely change. There is a major need for increased diagnostic abilities; increased access to healthcare, especially for rural populations; and for increased surveillance throughout the nation.
AUTHOR SUMMARYInfections of the central nervous system (CNS) are important with regard to public health. However many CNS infections are never diagnosed. In this analysis we investigated spatial and temporal patterns in hospitalized patients with suspected CNS infections in Lao PDR. We found that patients were most likely to come from villages located along major roads and highways. Patients from remote areas may have more difficulty reaching healthcare facilities. The most commonly diagnosed infection in these patients was Japanese encephalitis virus (JEV). Patients with this disease came from locations that were optimal for the mosquito vectors that spread JEV, rural areas with surface water and during the rainy season. Our analyses suggest that CNS infections should be a priority for public health workers in the region. Diagnostic capabilities should be increased throughout the nation; surveillance efforts should be broadened; and efforts should be increased toward providing easy access to healthcare for rural and remote populations. | 10.1371/journal.pntd.0008333 | medrxiv |
10.1101/19006320 | Dissociative experiences in fibromyalgia are mediated by symptoms of autonomic dysfunction | Aslanyan, D.; Iodice, V.; Davies, K. A.; Critchley, H. D.; Eccles, J. A. | Jessica A Eccles | Brighton and Sussex Medical School | 2019-09-11 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | neurology | https://www.medrxiv.org/content/early/2019/09/11/19006320.source.xml | BackgroundFibromyalgia is characterised by chronic widespread pain. Quality of life is further reduced by autonomic and cognitive symptoms, including subjective brain-fog and dissociative experiences. Although an association with joint hypermobility suggests variant connective tissue is a factor in both fibromyalgia and dysautonomia, the mechanisms underlying the neuropsychiatric symptoms are poorly understood.
Methods21 fibromyalgia patients and 21 healthy controls were assessed for joint hypermobility, dissociative experiences, autonomic symptoms and interoceptive sensibility. Mediation analyses were conducted according to the method of Baron and Kenny.
ResultsPatients with fibromyalgia reported greater dissociative experiences and autonomic symptoms. The relationship between fibromyalgia and dissociative experiences was fully mediated by symptoms of orthostatic intolerance. Fibromyalgia, dissociative experiences and orthostatic intolerance all were associated with joint hypermobility and interoceptive sensibility.
ConclusionsThis exploratory investigation highlights the relationship between dissociative experiences in the context of fibromyalgia and subjective experience of aberrant physiological responses. These findings can enhance the recognition and management of neuropsychiatric symptoms in patients with fibromyalgia, wherein dissociative experiences reflect disturbance of self-representation that can arise through abnormalities in internal agency, autonomic (dys)control and interoceptive prediction errors. | null | medrxiv |
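The fibromyalgia record above cites the Baron and Kenny mediation method. Its three regression steps can be sketched directly, with fibromyalgia status as predictor, orthostatic-intolerance symptoms as mediator and dissociative experiences as outcome; the input file and column names are placeholders for the study's actual measures.

```python
# Sketch of the three Baron & Kenny regression steps cited above, with
# fibromyalgia status (0/1) as predictor, an orthostatic-intolerance score as
# mediator and a dissociative-experiences score as outcome. The input file and
# column names are placeholders for the study's actual measures.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("fibromyalgia_study.csv")  # assumed input

step1 = smf.ols("dissociation ~ fibromyalgia", data=df).fit()             # total effect
step2 = smf.ols("orthostatic_intolerance ~ fibromyalgia", data=df).fit()  # predictor -> mediator
step3 = smf.ols("dissociation ~ fibromyalgia + orthostatic_intolerance",
                data=df).fit()                                            # direct effect

print("total effect:  ", round(step1.params["fibromyalgia"], 3))
print("direct effect: ", round(step3.params["fibromyalgia"], 3))
# Full mediation is suggested when the direct effect is no longer
# distinguishable from zero while the mediator remains associated (step 3).
```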
10.1101/19005819 | Cohort Profile: the Oxford Parkinson's Disease Centre Discovery Cohort Magnetic Resonance Imaging sub-study (OPDC-MRI) | Griffanti, L.; Klein, J. C.; Szewczyk-Krolikowski, K.; Menke, R. A. L.; Rolinski, M.; Barber, T. R.; Lawton, M.; Evetts, S. G.; Begeti, F.; Crabbe, M.; Rumbold, J.; Wade-Martins, R.; Hu, M. T.; Mackay, C. E. | Clare E Mackay | University of Oxford | 2019-09-11 | 1 | PUBLISHAHEADOFPRINT | cc_by | neurology | https://www.medrxiv.org/content/early/2019/09/11/19005819.source.xml | PurposeThe Oxford Parkinsons Disease Centre (OPDC) Discovery Cohort magnetic resonance imaging (MRI) sub-study (OPDC-MRI) collects high quality multimodal brain MRI together with deep longitudinal clinical phenotyping in patients with Parkinsons, at-risk individuals and healthy elderly participants. The primary aim is to detect pathological changes in brain structure and function, and develop, together with the clinical data, biomarkers to stratify, predict and chart progression in early-stage Parkinsons and at-risk individuals.
ParticipantsParticipants are recruited from the OPDC Discovery Cohort, a prospective, longitudinal study. Baseline MRI data is currently available for 290 participants: 119 patients with early idiopathic Parkinsons, 15 Parkinsons patients with pathogenic mutations of the LRRK2 or GBA genes, 68 healthy controls and 87 individuals at risk of Parkinsons (asymptomatic carriers of GBA mutation and patients with idiopathic rapid eye movement sleep behaviour disorder - RBD).
Findings to dateDifferences in brain structure in early Parkinsons were found to be subtle, with small changes in the shape of the globus pallidus and evidence of alterations in microstructural integrity in the prefrontal cortex that correlated with performance on executive function tests. Brain function, as assayed with resting fMRI, yielded more substantial differences, with basal ganglia connectivity reduced in early Parkinsons and RBD but not in Alzheimers, suggesting that the effect is pathology specific. Imaging of the substantia nigra with the more recent adoption of sequences sensitive to iron and neuromelanin content shows promising results in identifying early signs of Parkinsonian disease.
Future plansOngoing studies include the integration of multimodal MRI measures to improve discrimination power. Follow-up clinical data are now accumulating and will allow us to correlate baseline imaging measures to clinical disease progression. Follow-up MRI scanning started in 2015 and is currently ongoing, providing the opportunity for future longitudinal imaging analyses with parallel clinical phenotyping.
Article Summary: Strengths and limitations of this study
High quality 3T MRI data in a very well phenotyped and longitudinally followed cohort of Parkinsons and RBD.
All imaging data were acquired on the same MRI scanner, quite unique for a study of this duration. The protocol includes both standard sequences, comparable across other studies, and sequences acquired to investigate study-specific research questions.
Clinical longitudinal data are acquired every 18 months and will be used to relate baseline imaging with clinical progression. Information about conversion to Parkinsons of the at-risk individuals will also be available, providing the ultimate validation of potential biomarkers. MRI follow-up is also ongoing, which will allow longitudinal imaging analyses.
Statistical maps of published results and support data relative to the analyses are available to share.
OPDC-MRI phenotyping is deep and relatively frequent, however the size of the cohort is not at the level of population-level cohort studies. MRI sequences are high quality, but could not exploit the latest advances in the field in order to maintain continuity. | 10.1136/bmjopen-2019-034110 | medrxiv
10.1101/19005702 | Sex-related differences in the clinical diagnosis of frontotemporal dementia | Llorca-Bofi, V.; Illan-Gala, I.; Blesa, R. | Vicent Llorca-Bofí | Autonomous University of Barcelona, Spain | 2019-09-11 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc | neurology | https://www.medrxiv.org/content/early/2019/09/11/19005702.source.xml | Disclaimer text: The authors have withdrawn this manuscript because some authors did not consent with its publication and the data are too preliminary to be disseminated. Therefore, the authors do not wish this work to be cited as reference for the project. If you have any questions, please contact the corresponding author. | null | medrxiv |
10.1101/19005702 | Sex-related differences in the clinical diagnosis of frontotemporal dementia | Llorca-Bofi, V.; Illan-Gala, I.; Blesa, R. | Vicent Llorca-Bofí | Autonomous University of Barcelona, Spain | 2020-02-16 | 2 | WITHDRAWN | cc_by_nc | neurology | https://www.medrxiv.org/content/early/2020/02/16/19005702.source.xml | Disclaimer text: The authors have withdrawn this manuscript because some authors did not consent with its publication and the data are too preliminary to be disseminated. Therefore, the authors do not wish this work to be cited as reference for the project. If you have any questions, please contact the corresponding author. | null | medrxiv |
10.1101/19005652 | Exposure to perfluoroalkyl substances in a cohort of women firefighters and office workers in San Francisco | Trowbridge, J. A.; Gerona, R. R.; Lin, T.; Rudel, R. A.; Bessonneau, V.; Buren, H.; Morello-Frosch, R. | Rachel Morello-Frosch | University of California, Berkeley | 2019-09-11 | 1 | PUBLISHAHEADOFPRINT | cc_no | occupational and environmental health | https://www.medrxiv.org/content/early/2019/09/11/19005652.source.xml | BackgroundStudies in male firefighters have demonstrated increased exposures to carcinogenic compounds and increased rates of certain cancers compared to the general population. Many chemicals related to these occupational exposures have been associated with breast tumor development in animal and human studies, yet, there have been no studies on women firefighters due to their low numbers in most fire departments. To address this data gap, the Women Firefighters Biomonitoring Collaborative (WFBC) created a biological sample archive and analyzed levels of perfluoroalkyl substances (PFAS) among women firefighters and office workers in San Francisco.
MethodsActive duty women firefighters (n=86) and office workers (n=84) were recruited from the San Francisco Fire Department and the City and County of San Francisco, respectively. Serum samples were collected and analyzed using liquid chromatography tandem mass spectrometry (LC MS/MS) to measure and compare PFAS levels between firefighters and office workers. For PFAS congeners detected in at least 70% of our study population, we examined differences in serum PFAS levels controlling for dietary, demographic and other confounders. Among firefighters, we assessed associations between occupational activities and PFAS levels.
ResultsEight of 12 PFAS congeners were detected at levels above the limit of detection and seven were detected in at least 70% of the study population. Four PFAS were detected in all study participants (PFNA, PFOA, PFOS, PFHxS). In regression models comparing PFAS levels by occupation and adjusting for potential confounders, firefighters had higher geometric mean (GM) concentrations of PFAS compared to office workers: 2.39 (95% CI = 1.64, 3.48), 2.32 (95% CI = 1.17, 4.62) and 1.26 (95% CI = 0.99, 1.59) times higher for PFHxS, PFUnDA and PFNA, respectively. In analyses limited to firefighters, PFAS levels varied by assigned position in the fire department--firefighters and officers had higher PFNA, PFOA, PFDA, and PFUnDA compared to drivers. Additionally, firefighters who reported having used firefighting foam had higher concentrations of PFOA compared to firefighters who reported never having used foam.
ConclusionOur study found ubiquitous exposures to PFAS among WFBC participants, with women firefighters exposed to higher levels of some PFAS compared to office workers, suggesting that some of these exposures may be occupationally related. | 10.1021/acs.est.9b05490 | medrxiv |
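The GM ratios reported in the record above (e.g., 2.39-fold higher PFHxS among firefighters) come from regression models on log-transformed serum concentrations. The following is a minimal, purely illustrative sketch of that kind of computation; it assumes synthetic log-normal data and hypothetical group sizes, not the study's measurements, and the variable names are invented for illustration.

```python
# Illustrative sketch only: computing a geometric-mean (GM) ratio of a serum
# PFAS congener between two occupational groups on the log scale.
# All values below are simulated placeholders, not study data.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical serum concentrations (ng/mL), log-normally distributed
firefighters = rng.lognormal(mean=1.2, sigma=0.6, size=86)
office_workers = rng.lognormal(mean=0.4, sigma=0.6, size=84)

def geometric_mean(x):
    """GM = exp(mean(log(x))); suited to right-skewed biomarker data."""
    return float(np.exp(np.mean(np.log(x))))

gm_ratio = geometric_mean(firefighters) / geometric_mean(office_workers)

# Equivalent regression view: regress log(concentration) on a group indicator;
# exp(beta) for the indicator is the GM ratio, and confounder columns can be
# appended to the design matrix to obtain adjusted ratios.
y = np.log(np.concatenate([firefighters, office_workers]))
X = np.column_stack([
    np.ones(y.size),
    np.r_[np.ones(firefighters.size), np.zeros(office_workers.size)],
])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"unadjusted GM ratio: {gm_ratio:.2f}, exp(beta) = {np.exp(beta[1]):.2f}")
```

Adjusted ratios, like those quoted in the abstract, would add dietary and demographic confounders as further columns before exponentiating the group coefficient.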
10.1101/19005975 | Absence of Testicular Adrenal Rest Tumors in Newborns, Infants, and Toddlers with Classical Congenital Adrenal Hyperplasia | Kim, M. S.; Koppin, C. M.; Mohan, P.; Goodarzian, F.; Ross, H. M.; Geffner, M. E.; De Filippo, R.; Kokorowski, P. | Mimi S Kim | Children\'s Hospital Los Angeles | 2019-09-11 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | endocrinology | https://www.medrxiv.org/content/early/2019/09/11/19005975.source.xml | INTRODUCTIONTesticular adrenal rest tumors (TART) are a known consequence for males with classical congenital adrenal hyperplasia (CAH) due to 21-hydroxylase deficiency. TART are associated with potential infertility in adults. However, little is known about TART in very young males with CAH.
OBJECTIVEWe assessed the prevalence of TART in newborn, infant, and toddler males with classical CAH via scrotal ultrasound.
METHODSMales with CAH had scrotal ultrasounds during the first 4 years of life, evaluating testes for morphology, blood flow, and presence of TART. Newborn screen 17-hydroxyprogesterone (17-OHP) and serum 17-OHP at the time of ultrasound were recorded. Bone ages were considered very advanced if ≥2 SD above chronological age.
RESULTSThirty-one ultrasounds in 16 males were performed. An initial ultrasound was obtained in four newborns at diagnosis (6.8 ± 2.1 days), six infants (2.2 ± 0.9 months), and six toddlers (2.4 ± 0.9 years). Eleven males had at least one repeat ultrasound. A large proportion (11/16) were in poor hormonal control with an elevated 17-OHP (325 ± 298 nmol/L). One infant was in very poor hormonal control (17-OHP 447 nmol/L) at initial ultrasound, and two toddlers had advanced bone ages (+3.2 and +4.5 SD) representing exposure to postnatal androgens. However, no TART were detected in any subjects.
CONCLUSIONSTART were not found in males up to 4 years of age with classical CAH despite settings with expected high ACTH drive. Further research into the occurrence of TART in CAH may elucidate factors which contribute to the detection and individual predisposition to TART. | 10.1159/000504135 | medrxiv |
10.1101/19005793 | Dyspareunia in their own words: A comprehensive qualitative description of endometriosis-associated sexual pain | Wahl, K.; Imtiaz, S.; Smith, K. B.; Joseph, K. S.; Yong, P. J.; Cox, S. M. | Paul J Yong | University of British Columbia | 2019-09-12 | 1 | PUBLISHAHEADOFPRINT | cc_no | sexual and reproductive health | https://www.medrxiv.org/content/early/2019/09/12/19005793.source.xml | BackgroundDyspareunia is a classic symptom of endometriosis but is neglected in research and clinical contexts. This study explored the experience of this endometriosis-associated sexual pain.
MethodsThis was a qualitative descriptive study that included people who had experienced endometriosis-associated dyspareunia alone or with a partner. Data collection involved semi-structured interviews with a female researcher that began with an open-ended question about dyspareunia and included interview prompts related to the nature of sexual pain. Interviews were recorded, transcribed verbatim, and analysed for themes.
Results17 participants completed interviews. The mean participant age was 33.3 (SD=7.2) and most participants identified as white (82%), were college-educated (71%), identified as heterosexual (65%), and were partnered (59%). Location, onset, and character emerged as important, interrelated features of endometriosis-associated dyspareunia, as did severity and impact. Dyspareunia occurred at the vaginal opening (n=7) and in the abdomen/pelvis (n=13). Pain at the vaginal opening began with initial penetration and had pulling, burning and stinging qualities. Pain in the pelvis was typically experienced with deep penetration or in certain positions and was described as sharp, stabbing and/or cramping. Dyspareunia ranged from mild to severe, and for some participants had a marked psychosocial impact.
ConclusionsDyspareunia is a heterogeneous symptom of endometriosis that ranges in severity and impact. Disaggregating dyspareunia into superficial and deep types may better reflect the etiologies of this pain, thereby improving outcome measurement in intervention studies and clinical care. | 10.1016/j.esxm.2020.10.002 | medrxiv |
10.1101/19006478 | Multifocal breast cancers are more prevalent in BRCA2 versus BRCA1 mutation carriers | McCrorie, A. D.; Ashfield, S.; McIlmunn, C.; Morrison, P. J.; Boyd, C.; Savage, K. I.; McIntosh, S. A. | Stuart A McIntosh | Queen\'s University Belfast | 2019-09-13 | 1 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | surgery | https://www.medrxiv.org/content/early/2019/09/13/19006478.source.xml | Multifocal/multicentric breast cancer is generally considered to be where two or more breast tumours are present within the same breast, and is seen in [~]10% of breast cancer cases. This study investigates the prevalence of multifocality/multicentricity in a cohort of BRCA1/2 mutation carriers with breast cancer from Northern Ireland via cross-sectional analysis. Data from 211 women with BRCA1/2 mutations (BRCA1 - 91), (BRCA2 - 120), with breast cancer were collected including age, tumour focality, size, type, grade, and receptor profile. The prevalence of multifocality/multicentricity within this group was 25%, but within subgroups, prevalence amongst BRCA2 carriers was more than double that of BRCA1 carriers (p=0.001). Women affected by multifocal/multicentric tumours had proportionately higher oestrogen receptor positivity (p=0.001) and lower triple negativity (p=0.004). These observations are likely to be driven by the higher BRCA2 mutation prevalence observed within this cohort. Odds of a BRCA2 carrier developing multifocal/multicentric cancer were almost four-fold higher than a BRCA1 carrier (OR: 3.71, CI: 1.77-7.78, p=0.001). These findings were subsequently validated in a second, large independent cohort of patients with BRCA-associated breast cancers from a UK-wide multicentre study. This confirmed a significantly higher prevalence of multifocal/multicentric tumours amongst BRCA2 mutation carriers compared with BRCA1 mutation carriers. This has important implications for clinicians involved in the treatment of BRCA2-associated breast cancer, both in the diagnostic process, in ensuring that tumour focality is adequately assessed to facilitate treatment decision-making, and for breast surgeons, particularly if breast conserving surgery is being considered as a treatment option for these patients. | 10.1002/cjp2.155 | medrxiv |
10.1101/19006478 | Multifocal breast cancers are more prevalent in BRCA2 versus BRCA1 mutation carriers | McCrorie, A. D.; Ashfield, S.; Begley, A.; McIlmunn, C.; Morrison, P. J.; Boyd, C.; Eccles, B.; Breville-Heygate, S.; Copson, E. R.; Cutress, R. I.; Eccles, D. M.; Savage, K. I.; McIntosh, S. A. | Stuart A McIntosh | Queen\'s University Belfast | 2019-12-30 | 2 | PUBLISHAHEADOFPRINT | cc_by_nc_nd | surgery | https://www.medrxiv.org/content/early/2019/12/30/19006478.source.xml | Multifocal/multicentric breast cancer is generally considered to be where two or more breast tumours are present within the same breast, and is seen in [~]10% of breast cancer cases. This study investigates the prevalence of multifocality/multicentricity in a cohort of BRCA1/2 mutation carriers with breast cancer from Northern Ireland via cross-sectional analysis. Data from 211 women with BRCA1/2 mutations (BRCA1 - 91), (BRCA2 - 120), with breast cancer were collected including age, tumour focality, size, type, grade, and receptor profile. The prevalence of multifocality/multicentricity within this group was 25%, but within subgroups, prevalence amongst BRCA2 carriers was more than double that of BRCA1 carriers (p=0.001). Women affected by multifocal/multicentric tumours had proportionately higher oestrogen receptor positivity (p=0.001) and lower triple negativity (p=0.004). These observations are likely to be driven by the higher BRCA2 mutation prevalence observed within this cohort. Odds of a BRCA2 carrier developing multifocal/multicentric cancer were almost four-fold higher than a BRCA1 carrier (OR: 3.71, CI: 1.77-7.78, p=0.001). These findings were subsequently validated in a second, large independent cohort of patients with BRCA-associated breast cancers from a UK-wide multicentre study. This confirmed a significantly higher prevalence of multifocal/multicentric tumours amongst BRCA2 mutation carriers compared with BRCA1 mutation carriers. This has important implications for clinicians involved in the treatment of BRCA2-associated breast cancer, both in the diagnostic process, in ensuring that tumour focality is adequately assessed to facilitate treatment decision-making, and for breast surgeons, particularly if breast conserving surgery is being considered as a treatment option for these patients. | 10.1002/cjp2.155 | medrxiv |
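For readers less familiar with the odds ratio reported in the two records above (OR 3.71, 95% CI 1.77-7.78), the sketch below shows how an odds ratio and Wald confidence interval are derived from a 2x2 table of mutation status by tumour focality. The counts are hypothetical placeholders chosen only to yield an OR of roughly similar magnitude; they are not the study's data.

```python
# Illustrative sketch only: odds ratio and Wald 95% CI from a 2x2 table.
# rows: BRCA2, BRCA1; columns: multifocal/multicentric, unifocal
import math

a, b = 42, 78   # BRCA2: multifocal, unifocal (hypothetical counts)
c, d = 11, 80   # BRCA1: multifocal, unifocal (hypothetical counts)

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```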
10.1101/19006429 | Inferring the role of the microbiome on survival in patients treated with immune checkpoint inhibitors: causal modeling, timing, and classes of concomitant medications | Spakowicz, D.; Hoyd, R.; Husain, M.; Bassett, J. S.; Wang, L.; Tinoco, G.; Patel, S.; Burkart, J.; Miah, A.; Li, M.; Johns, A.; Grogan, M.; Carbone, D. P.; Verschraegen, C. F.; Kendra, K.; Otterson, G. A.; Li, L.; Presley, C.; Owen, D. H. | Daniel Spakowicz | The Ohio State University College of Medicine | 2019-09-13 | 1 | PUBLISHAHEADOFPRINT | cc_by | oncology | https://www.medrxiv.org/content/early/2019/09/13/19006429.source.xml | The microbiome has been shown to affect the response to Immune Checkpoint Inhibitors (ICIs) in a small number of cancers. Here, we sought to more broadly survey cancers to identify those in which the microbiome will play a role using retrospective analyses. We created a causal model for the relationship between medications, the microbiome and ICI response and used it to guide the abstraction of electronic health records of 690 patients who received ICI therapy for advanced cancer. Medications associated with changes to the microbiome, including antibiotics, corticosteroids, proton pump inhibitors, histamine receptor blockers, non-steroid anti-inflammatories and statins, were abstracted. We tested the effect of medication timing on overall survival (OS) and evaluated the robustness of medication effects in each cancer. Finally, we compared the size of the effect observed for antibiotic classes to taxa correlated with ICI response and a literature review of culture-based antibiotic susceptibilities. Of the medications assessed, only antibiotics and corticosteroids were significantly associated with lower OS. The hazard ratios (HRs) for antibiotics and corticosteroids were highest near the start of ICI treatment but remained significant when given prior to ICI. Antibiotics and corticosteroids remained significantly associated with OS even when controlling for multiple factors such as Eastern Cooperative Oncology Group performance status and stage. When grouping antibiotics by class, β-lactams showed the strongest association with OS across all tested cancers. The timing and strength of these effects after controlling for confounding factors are consistent with a role for the microbiome in response to ICIs. | 10.1186/s12885-020-06882-6 | medrxiv
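The antibiotic and corticosteroid hazard ratios described in the record above would typically come from Cox proportional-hazards models of overall survival with medication exposures and clinical covariates (e.g., ECOG performance status, stage) as predictors. The sketch below assumes the lifelines package and entirely simulated placeholder data; it illustrates that model form only, is not the authors' analysis code, and handling the medication-timing windows mentioned in the abstract would need a time-varying extension (e.g., lifelines' CoxTimeVaryingFitter).

```python
# Illustrative sketch only: Cox proportional-hazards model of overall survival
# with baseline medication exposures and clinical covariates.
# Requires `lifelines`; all data below are simulated placeholders.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 690
df = pd.DataFrame({
    "antibiotics": rng.integers(0, 2, n),      # exposed near ICI start (0/1)
    "corticosteroids": rng.integers(0, 2, n),
    "ecog": rng.integers(0, 3, n),
    "stage_iv": rng.integers(0, 2, n),
})

# Simulate survival times with higher hazard (shorter times) for exposed groups
linpred = 0.5 * df["antibiotics"] + 0.4 * df["corticosteroids"] + 0.3 * df["ecog"]
df["os_months"] = rng.exponential(scale=24 * np.exp(-linpred))
df["event"] = rng.integers(0, 2, n)  # crude placeholder censoring indicator

cph = CoxPHFitter()
cph.fit(df, duration_col="os_months", event_col="event")
cph.print_summary()  # exp(coef) column gives the hazard ratios
```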