
An Extensive Meta-Metagenomic Search Identifies SARS-CoV-2-Homologous Sequences in Pangolin Lung Viromes

ABSTRACT

In numerous instances, tracking the biological significance of a nucleic acid sequence can be augmented through the identification of environmental niches in which the sequence of interest is present. Many metagenomic data sets are now available, with deep sequencing of samples from diverse biological niches. While any individual metagenomic data set can be readily queried using web-based tools, meta-searches through all such data sets are less accessible. In this brief communication, we demonstrate such a meta-metagenomic approach, examining close matches to the severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) in all high-throughput sequencing data sets in the NCBI Sequence Read Archive accessible with the "virome" keyword. In addition to the homology to bat coronaviruses observed in descriptions of the SARS-CoV-2 sequence (F. Wu, S. Zhao, B. Yu, Y. M. Chen, et al., Nature 579:265–269, 2020, https://doi.org/10.1038/s41586-020-2008-3; P. Zhou, X. L. Yang, X. G. Wang, B. Hu, et al., Nature 579:270–273, 2020, https://doi.org/10.1038/s41586-020-2012-7), we note a strong homology to numerous sequence reads in metavirome data sets generated from the lungs of deceased pangolins reported by Liu et al. (P. Liu, W. Chen, and J. P. Chen, Viruses 11:979, 2019, https://doi.org/10.3390/v11110979). While analysis of these reads indicates the presence of a similar viral sequence in pangolin lung, the similarity is not sufficient to either confirm or rule out a role for pangolins as an intermediate host in the recent emergence of SARS-CoV-2. In addition to the implications for SARS-CoV-2 emergence, this study illustrates the utility and limitations of meta-metagenomic search tools in effective and rapid characterization of potentially significant nucleic acid sequences.

IMPORTANCE Meta-metagenomic searches allow for high-speed, low-cost identification of potentially significant biological niches for sequences of interest.





Case 1: Progressive Dysphagia in a Teenager with Down Syndrome





Disinfectant Efficacy: Understanding the Expectations and How to Design Effective Studies That Include Leveraging Multi-Site Data to Drive an Efficient Program

For manufacturers of both sterile and nonsterile pharmaceuticals, there is an expectation that the manufacturing process is performed in a manner that prevents extraneous contamination so that the products are provided in a safe, integral, pure, and unadulterated form. As part of that process, cleaning and disinfection are an absolute necessity. Although cleaning and disinfection support control of microbial contamination through preventive and corrective action, specific compendial methods do not currently exist. The intent of this paper is to provide general guidance on how to perform disinfectant efficacy validation and implementation. This includes how to ensure the underlying concepts are understood, how to interpret facility data and use it to demonstrate contamination-control awareness across your facilities, and how to leverage the data to reduce redundancies in validation or verification. This paper represents the thoughts and best practices of the authoring team and their respective companies and provides an efficient way to qualify disinfectants without compromising the quality of the study. If you choose to follow the recommendations in this paper, you must ensure that the supporting rationale is sound and the scientific data are documented. It is the belief of the authoring team that only then will this approach meet regulatory requirements.





Touching the Surface: Diverse Roles for the Flagellar Membrane in Kinetoplastid Parasites [Review]

While flagella have been studied extensively as motility organelles, with a focus on internal structures such as the axoneme, more recent research has illuminated the roles of the flagellar surface in a variety of biological processes. Parasitic protists of the order Kinetoplastida, which include trypanosomes and Leishmania species, provide a paradigm for probing the role of flagella in host-microbe interactions and illustrate that this interface between the flagellar surface and the host is of paramount importance. An increasing body of knowledge indicates that the flagellar membrane serves a multitude of functions at this interface: attachment of parasites to tissues within insect vectors, close interactions with intracellular organelles of vertebrate cells, transactions between flagella from different parasites, junctions between the flagella and the parasite cell body, emergence of nanotubes and exosomes from the parasite directed to either host or microbial targets, immune evasion, and sensing of the extracellular milieu. Recent whole-organelle or genome-wide studies have begun to identify protein components of the flagellar surface that must mediate these diverse host-parasite interactions. The increasing corpus of knowledge on kinetoplastid flagella will likely prove illuminating for other flagellated or ciliated pathogens as well.





Bioavailability Based on the Gut Microbiota: a New Perspective [Review]

The substantial discrepancy between the strong effects of functional foods and various drugs, especially traditional Chinese medicines (TCMs), and the poor bioavailability of these substances remains a perplexing problem. Understanding the gut microbiota, which acts as an effective bioreactor in the human intestinal tract, provides an opportunity for the redefinition of bioavailability. Here, we discuss four different pathways associated with the role of the gut microbiota in the transformation of parent compounds to beneficial or detrimental small molecules, which can enter the body’s circulatory system and be available to target cells, tissues, and organs. We further describe and propose effective strategies for improving bioavailability and alleviating side effects with the help of the gut microbiota. This review also broadens our perspectives for the discovery of new medicinal components.





Fur-Dam Regulatory Interplay at an Internal Promoter of the Enteroaggregative Escherichia coli Type VI Secretion sci1 Gene Cluster [Article]

The type VI secretion system (T6SS) is a weapon for delivering effectors into target cells that is widespread in Gram-negative bacteria. The T6SS is a highly versatile machine, as it can target both eukaryotic and prokaryotic cells, and it has been proposed that T6SSs are adapted to the specific needs of each bacterium. The expression of T6SS gene clusters and the activation of the secretion apparatus are therefore tightly controlled. In enteroaggregative Escherichia coli (EAEC), the sci1 T6SS gene cluster is subject to a complex regulation involving both the ferric uptake regulator (Fur) and DNA adenine methylase (Dam)-dependent DNA methylation. In this study, an additional, internal, promoter was identified within the sci1 gene cluster using +1 transcriptional mapping. Further analyses demonstrated that this internal promoter is controlled by a mechanism strictly identical to that of the main promoter. The Fur binding box overlaps the –10 transcriptional element and a Dam methylation site, GATC-32. Hence, the expression of the distal sci1 genes is repressed and the GATC-32 site is protected from methylation in iron-rich conditions. The Fur-dependent protection of GATC-32 was confirmed by an in vitro methylation assay. In addition, the methylation of GATC-32 negatively impacted Fur binding. The expression of the sci1 internal promoter is therefore controlled by iron availability through Fur regulation, whereas Dam-dependent methylation maintains a stable ON expression in iron-limited conditions.

IMPORTANCE Bacteria use weapons to deliver effectors into target cells. One of these weapons, the type VI secretion system (T6SS), assembles a contractile tail acting as a spring to propel a toxin-loaded needle. Its expression and activation therefore need to be tightly regulated. Here, we identified an internal promoter within the sci1 T6SS gene cluster in enteroaggregative E. coli. We show that this internal promoter is controlled by Fur and Dam-dependent methylation. We further demonstrate that Fur and Dam compete at the –10 transcriptional element to finely tune the expression of T6SS genes. We propose that this elegant regulatory mechanism allows the optimum production of the T6SS in conditions where enteroaggregative E. coli encounters competing species.





The supportive care needs of people living with pulmonary fibrosis and their caregivers: a systematic review

Background

People with pulmonary fibrosis often experience a protracted time to diagnosis, high symptom burden and limited disease information. This review aimed to identify the supportive care needs reported by people with pulmonary fibrosis and their caregivers.

Methods

A systematic review was conducted according to PRISMA guidelines. Studies that investigated the supportive care needs of people with pulmonary fibrosis or their caregivers were included. Supportive care needs were extracted and mapped to eight pre-specified domains using a framework synthesis method.

Results

A total of 35 studies were included. The most frequently reported needs were in the domain of information/education, including information on supplemental oxygen, disease progression and prognosis, pharmacological treatments and end-of-life planning. Psychosocial/emotional needs were also frequently reported, including management of anxiety, anger, sadness and fear. An additional domain of "access to care" was identified that had not been specified a priori; this included access to peer support, psychological support, specialist centres and support for families of people with pulmonary fibrosis.

Conclusion

People with pulmonary fibrosis report many unmet needs for supportive care, particularly related to insufficient information and lack of psychosocial support. These data can inform the development of comprehensive care models for people with pulmonary fibrosis and their loved ones.





Diagnostic Utility and Impact on Clinical Decision Making of Focused Assessment With Sonography for HIV-Associated Tuberculosis in Malawi: A Prospective Cohort Study

ABSTRACT

Background

The focused assessment with sonography for HIV-associated tuberculosis (TB) (FASH) ultrasound protocol has been increasingly used to help clinicians diagnose TB. We sought to quantify the diagnostic utility of FASH for TB among individuals with HIV in Malawi.

Methods

Between March 2016 and August 2017, 210 adults with HIV who had 2 or more signs and symptoms concerning for TB (fever, cough, night sweats, weight loss) were enrolled from a public HIV clinic in Lilongwe, Malawi. The treating clinicians conducted a history, physical exam, FASH protocol, and additional TB evaluation (laboratory diagnostics and chest radiography) on all participants. The clinician made a final treatment decision based on all available information. At the 6-month follow-up visit, we categorized participants based on clinical outcomes and diagnostic tests as having probable/confirmed TB or unlikely TB; the association of FASH with probable/confirmed TB was calculated using Fisher's exact tests. The impact of FASH on empiric TB treatment was determined by asking the clinicians prospectively whether they would start treatment at 2 time points in the baseline visit: (1) after the initial history and physical exam; and (2) after the history, physical exam, and FASH protocol.

Results

A total of 181 participants underwent final analysis, of whom 56 were categorized as probable/confirmed TB and 125 were categorized as unlikely TB. The FASH protocol was positive in 71% (40/56) of participants with probable/confirmed TB compared to 24% (30/125) of participants with unlikely TB (odds ratio = 7.9; 95% confidence interval = 3.9–16.1; P < .001). Among those classified as probable/confirmed TB, FASH increased the likelihood of empiric TB treatment before obtaining any other diagnostic studies from 9% (5/56) to 46% (26/56) at the point of care. For those classified as unlikely TB, FASH increased the likelihood of empiric treatment from 2% to 4%.

Conclusion

In the setting of HIV coinfection in Malawi, FASH can be a helpful tool that augments the clinician's ability to make a timely diagnosis of TB.
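The reported odds ratio can be reproduced from the 2×2 counts stated in the results; a minimal sketch in Python (the counts are taken directly from the abstract, the variable names are illustrative):

```python
# 2x2 table from the FASH results:
#                         FASH positive   FASH negative
# probable/confirmed TB        40              16        (n = 56)
# unlikely TB                  30              95        (n = 125)
a, b = 40, 56 - 40    # probable/confirmed TB: FASH+, FASH-
c, d = 30, 125 - 30   # unlikely TB: FASH+, FASH-

# cross-product (unadjusted) odds ratio
odds_ratio = (a * d) / (b * c)
print(round(odds_ratio, 1))  # 7.9, matching the reported value
```

This reproduces only the point estimate; the confidence interval and P value require the full Fisher's exact computation used by the authors.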





A Qualitative Assessment of Provider and Client Experiences With 3- and 6-Month Dispensing Intervals of Antiretroviral Therapy in Malawi

ABSTRACT

Introduction

Multimonth dispensing (MMD) of antiretroviral therapy (ART) is a differentiated model of care that can help overcome health system challenges and reduce the burden of HIV care on clients. Although 3-month dispensing has been the standard of care, interest has increased in extending refill intervals to 6 months. We explored client and provider experiences with MMD in Malawi as part of a cluster randomized trial evaluating 3- versus 6-month ART dispensing.

Methods

Semi-structured in-depth interviews were conducted with 17 ART providers and 62 stable adult clients with HIV on ART. Clients and providers were evenly divided by arm and were eligible for an interview if they had been participating in the study for 1 year (clients) or 6 months (providers). Questions focused on the perceived challenges and benefits of the 3- or 6-month ART supply. Interviews were transcribed, and data were coded and analyzed using constant comparison.

Results

Both clients and providers reported that the larger medication supply had benefits. Clients reported decreased costs due to less frequent travel to the clinic and increased time for income-generating activities. Clients in the 6-month dispensing arm reported a greater sense of personal freedom and normalcy. Providers felt that the 6-month dispensing interval reduced their workload. They also expressed concern about clients' challenges with ART storage at home, but clients reported no storage problems. Although providers mentioned the potential risk of clients sharing the larger medication supply with family or friends, clients emphasized the value of ART and reported only rare, short-term sharing, mostly with their spouses. Providers also mentioned clients' lack of motivation to seek care for illnesses that might occur between refill appointments.

Conclusions

The 6-month ART dispensing arm was particularly beneficial to clients through decreased costs, increased time for income generation, and a greater sense of normalcy. Providers' concerns about storage, sharing, and return visits to the facility did not emerge in client interviews. Further data are needed on the feasibility of implementing a large-scale program with 6-month dispensing.





Insights Into Provider Bias in Family Planning from a Novel Shared Decision Making Based Counseling Initiative in Rural, Indigenous Guatemala





National Surgical, Obstetric, and Anesthesia Plans Supporting the Vision of Universal Health Coverage





Erratum. Ten-Year Outcome of Islet Alone or Islet After Kidney Transplantation in Type 1 Diabetes: A Prospective Parallel-Arm Cohort Study. Diabetes Care 2019;42:2042-2049





Diabetes, Cognitive Decline, and Mild Cognitive Impairment Among Diverse Hispanics/Latinos: Study of Latinos-Investigation of Neurocognitive Aging Results (HCHS/SOL)

OBJECTIVE

Hispanics/Latinos are the largest ethnic/racial minority group in the U.S., have the highest prevalence of diabetes, and are at increased risk for neurodegenerative disorders. Currently, little is known about the relationship between diabetes and cognitive decline and disorders among diverse Hispanics/Latinos. The purpose of this study is to clarify these relationships in diverse middle-aged and older Hispanics/Latinos.

RESEARCH DESIGN AND METHODS

The Study of Latinos–Investigation of Neurocognitive Aging (SOL-INCA) is an ancillary study of the Hispanic Community Health Study/Study of Latinos (HCHS/SOL). HCHS/SOL is a multisite (Bronx, NY; Chicago, IL; Miami, FL; and San Diego, CA), probability-sampled (i.e., representative of targeted populations), and prospective cohort study. Between 2016 and 2018, SOL-INCA enrolled diverse Hispanics/Latinos aged ≥50 years (n = 6,377). Global cognitive decline and mild cognitive impairment (MCI) were the primary outcomes.

RESULTS

Prevalent diabetes at visit 1, but not incident diabetes at visit 2, was associated with significantly steeper global cognitive decline (βGC = –0.16 [95% CI –0.25; –0.07]; P < 0.001), domain-specific cognitive decline, and higher odds of MCI (odds ratio 1.74 [95% CI 1.34; 2.26]; P < 0.001) compared with no diabetes in age- and sex-adjusted models.

CONCLUSIONS

Diabetes was associated with cognitive decline and increased MCI prevalence among diverse Hispanics/Latinos, primarily among those with prevalent diabetes at visit 1. Our findings suggest that significant cognitive decline and MCI may be considered additional disease complications of diabetes among diverse middle-aged and older Hispanics/Latinos.





Differential Health Care Use, Diabetes-Related Complications, and Mortality Among Five Unique Classes of Patients With Type 2 Diabetes in Singapore: A Latent Class Analysis of 71,125 Patients

OBJECTIVE

With rising health care costs and finite health care resources, understanding the population needs of different type 2 diabetes mellitus (T2DM) patient subgroups is important. Sparse data exist for the application of population segmentation on health care needs among Asian T2DM patients. We aimed to segment T2DM patients into distinct classes and evaluate their differential health care use, diabetes-related complications, and mortality patterns.

RESEARCH DESIGN AND METHODS

Latent class analysis was conducted on a retrospective cohort of 71,125 T2DM patients. Latent class indicators included patient’s age, ethnicity, comorbidities, and duration of T2DM. Outcomes evaluated included health care use, diabetes-related complications, and 4-year all-cause mortality. The relationship between class membership and outcomes was evaluated with the appropriate regression models.

RESULTS

Five classes of T2DM patients were identified. The prevalence of depression was high among patients in class 3 (younger females with short-to-moderate T2DM duration and high psychiatric and neurological disease burden) and class 5 (older patients with moderate-to-long T2DM duration and high disease burden with end-organ complications). They were the highest tertiary health care users. Class 5 patients had the highest risk of myocardial infarction (hazard ratio [HR] 12.05, 95% CI 10.82–13.42), end-stage renal disease requiring dialysis initiation (HR 25.81, 95% CI 21.75–30.63), stroke (HR 19.37, 95% CI 16.92–22.17), lower-extremity amputation (HR 12.94, 95% CI 10.90–15.36), and mortality (HR 3.47, 95% CI 3.17–3.80).

CONCLUSIONS

T2DM patients can be segmented into classes with differential health care use and outcomes. Depression screening should be considered for the two identified classes of patients.





Incidence and Associations of Chronic Kidney Disease in Community Participants With Diabetes: A 5-Year Prospective Analysis of the EXTEND45 Study

OBJECTIVE

To determine the incidence of and factors associated with an estimated glomerular filtration rate (eGFR) <60 mL/min/1.73 m2 in people with diabetes.

RESEARCH DESIGN AND METHODS

We identified people with diabetes in the EXamining ouTcomEs in chroNic Disease in the 45 and Up Study (EXTEND45), a population-based cohort study (2006–2014) that linked the Sax Institute’s 45 and Up Study cohort to community laboratory and administrative data in New South Wales, Australia. The study outcome was the first eGFR measurement <60 mL/min/1.73 m2 recorded during the follow-up period. Participants with eGFR <60 mL/min/1.73 m2 at baseline were excluded. We used Poisson regression to estimate the incidence of eGFR <60 mL/min/1.73 m2 and multivariable Cox regression to examine factors associated with the study outcome.

RESULTS

Of 9,313 participants with diabetes, 2,106 (22.6%) developed incident eGFR <60 mL/min/1.73 m2 over a median follow-up time of 5.7 years (interquartile range, 3.0–5.9 years). The eGFR <60 mL/min/1.73 m2 incidence rate per 100 person-years was 6.0 (95% CI 5.7–6.3) overall, 1.5 (1.3–1.9) in participants aged 45–54 years, 3.7 (3.4–4.0) for 55–64 year olds, 7.6 (7.1–8.1) for 65–74 year olds, 15.0 (13.0–16.0) for 75–84 year olds, and 26.0 (22.0–32.0) for those aged 85 years and over. In a fully adjusted multivariable model incidence was independently associated with age (hazard ratio 1.23 per 5-year increase; 95% CI 1.19–1.26), geography (outer regional and remote versus major city: 1.36; 1.17–1.58), obesity (obese class III versus normal: 1.44; 1.16–1.80), and the presence of hypertension (1.52; 1.33–1.73), coronary heart disease (1.13; 1.02–1.24), cancer (1.30; 1.14–1.50), and depression/anxiety (1.14; 1.01–1.27).

CONCLUSIONS

In participants with diabetes, the incidence of an eGFR <60 mL/min/1.73 m2 was high. Older age, remoteness of residence, and the presence of various comorbid conditions were associated with higher incidence.





Markers of Early Life Infection in Relation to Adult Diabetes: Prospective Evidence From a National Birth Cohort Study Over Four Decades





An increase in MYC copy number has a progressive negative prognostic impact in patients with diffuse large B-cell and high-grade lymphoma, who may benefit from intensified treatment regimens

MYC translocations, a hallmark of Burkitt lymphoma, occur in 5-15% of diffuse large B-cell lymphoma and have a negative prognostic impact. Numerical aberrations of MYC have also been detected in these patients, but their incidence and prognostic role are still controversial. We analyzed the clinical impact of MYC increased copy number in 385 patients with diffuse large B-cell lymphoma screened at diagnosis for MYC, BCL2, and BCL6 rearrangements. We enumerated the number of MYC copies, defining as amplified those cases with an uncountable number of extra copies. The prevalence of MYC translocation, increased copy number, and amplification was 8.8%, 15%, and 1%, respectively. Patients with 3 or 4 gene copies, accounting for more than 60% of patients with MYC copy number changes, had a more favorable outcome compared to patients with >4 copies or translocation of MYC, and their outcome was not influenced by the type of first-line treatment received. Stratification according to the number of MYC extra copies showed a negative correlation between an increasing number of copies and survival. Patients with >7 copies or amplification of MYC had the poorest prognosis. Patients with >4 copies of MYC showed a prognosis similar to, and trending towards worse than, that of patients with MYC translocation. The survival of patients with >4 copies, translocation, or amplification of MYC seemed to be superior if intensive treatments were used. Our study underlines the importance of fluorescence in situ hybridization testing at diagnosis of diffuse large B-cell lymphoma to detect the rather frequent and clinically significant numerical aberrations of MYC.





Impact of cytogenetic abnormalities on outcomes of adult Philadelphia-negative acute lymphoblastic leukemia after allogeneic hematopoietic stem cell transplantation: a study by the Acute Leukemia Working Committee of the Center for International Blood and Marrow Transplant Research

Cytogenetic risk stratification at diagnosis has long been one of the most useful tools to assess prognosis in acute lymphoblastic leukemia (ALL). To examine the prognostic impact of cytogenetic abnormalities on outcomes after allogeneic hematopoietic cell transplantation, we studied 1731 adults with Philadelphia-negative ALL in complete remission who underwent myeloablative or reduced intensity/non-myeloablative conditioning transplant from unrelated or matched sibling donors reported to the Center for International Blood and Marrow Transplant Research. A total of 632 patients had abnormal conventional metaphase cytogenetics. The leukemia-free survival and overall survival rates at 5 years after transplantation in patients with abnormal cytogenetics were 40% and 42%, respectively, which were similar to those in patients with a normal karyotype. Of the previously established cytogenetic risk classifications, modified Medical Research Council-Eastern Cooperative Oncology Group score was the only independent prognosticator of leukemia-free survival (P=0.03). In the multivariable analysis, monosomy 7 predicted post-transplant relapse [hazard ratio (HR)=2.11; 95% confidence interval (95% CI): 1.04-4.27] and treatment failure (HR=1.97; 95% CI: 1.20-3.24). Complex karyotype was prognostic for relapse (HR=1.69; 95% CI: 1.06-2.69), whereas t(8;14) predicted treatment failure (HR=2.85; 95% CI: 1.35-6.02) and overall mortality (HR=3.03; 95% CI: 1.44-6.41). This large study suggested a novel transplant-specific cytogenetic scheme with adverse [monosomy 7, complex karyotype, del(7q), t(8;14), t(11;19), del(11q), tetraploidy/near triploidy], intermediate (normal karyotype and all other abnormalities), and favorable (high hyperdiploidy) risks to prognosticate leukemia-free survival (P=0.02). 
Although some previously established high-risk Philadelphia-negative cytogenetic abnormalities in ALL can be overcome by transplantation, monosomy 7, complex karyotype, and t(8;14) continue to pose significant risks and yield inferior outcomes.





Appropriation of GPIbα from platelet-derived extracellular vesicles supports monocyte recruitment in systemic inflammation

Interactions between platelets, leukocytes and the vessel wall provide alternative pathological routes of thrombo-inflammatory leukocyte recruitment. We found that when platelets were activated by a range of agonists in whole blood, they shed platelet-derived extracellular vesicles which rapidly and preferentially bound to blood monocytes compared to other leukocytes. Platelet-derived extracellular vesicle binding to monocytes was initiated by P-selectin-dependent adhesion and was stabilised by binding of phosphatidylserine. These interactions resulted in the progressive transfer of the platelet adhesion receptor GPIbα to monocytes. GPIbα+-monocytes tethered and rolled on immobilised von Willebrand Factor or were recruited and activated on endothelial cells treated with TGF-β1 to induce the expression of von Willebrand Factor. In both models monocyte adhesion was ablated by a function-blocking antibody against GPIbα. Monocytes could also bind platelet-derived extracellular vesicles in mouse blood in vitro and in vivo. Intratracheal instillations of diesel nanoparticles, to model chronic pulmonary inflammation, induced accumulation of GPIbα on circulating monocytes. In intravital experiments, GPIbα+-monocytes adhered to the microcirculation of the TGF-β1-stimulated cremaster muscle, while in the ApoE–/– model of atherosclerosis, GPIbα+-monocytes adhered to the carotid arteries. In trauma patients, monocytes bore platelet markers within 1 hour of injury, the levels of which correlated with severity of trauma and resulted in monocyte clearance from the circulation. Thus, we have defined a novel thrombo-inflammatory pathway in which platelet-derived extracellular vesicles transfer a platelet adhesion receptor to monocytes, allowing their recruitment in large and small blood vessels, and which is likely to be pathogenic.





Extensive multilineage analysis in patients with mixed chimerism after allogeneic transplantation for sickle cell disease: insight into hematopoiesis and engraftment thresholds for gene therapy

Although studies of mixed chimerism following hematopoietic stem cell transplantation in patients with sickle cell disease (SCD) may provide insights into the engraftment needed to correct the disease and into immunological reconstitution, an extensive multilineage analysis is lacking. We analyzed chimerism simultaneously in peripheral erythroid and granulomonocytic precursors/progenitors, highly purified B and T lymphocytes, monocytes, granulocytes and red blood cells (RBC). Thirty-four patients with mixed chimerism and ≥12 months of follow-up were included. A selective advantage of donor RBC and their progenitors/precursors led to full chimerism in mature RBC (despite partial engraftment of other lineages), and resulted in the clinical control of the disease. Six patients with donor chimerism <50% had hemolysis (reticulocytosis) and higher HbS than their donor. Four of them had donor chimerism <30%, including a patient with an AA donor (hemoglobin >10 g/dL) and three with AS donors (hemoglobin <10 g/dL). However, only one vaso-occlusive crisis occurred, with 68.7% HbS. Except in the patients with the lowest chimerism, the donor engraftment was lower for T cells than for the other lineages. In a context of mixed chimerism after hematopoietic stem cell transplantation for SCD, myeloid (rather than T cell) engraftment was the key efficacy criterion. Results show that myeloid chimerism as low as 30% was sufficient to prevent a vaso-occlusive crisis in transplants from an AA donor but not consistently from an AS donor. However, the correction of hemolysis requires higher donor chimerism levels (i.e. ≥50%) in both AA and AS recipients. In the future, this group of patients may need a different therapeutic approach.





Iron absorption from supplements is greater with alternate day than with consecutive day dosing in iron-deficient anemic women

In iron-depleted women without anemia, oral iron supplements induce an increase in serum hepcidin (SHep) that persists for 24 hours, decreasing iron absorption from supplements given later on the same or next day. Consequently, iron absorption from supplements is highest if iron is given on alternate days. Whether this dosing schedule is also beneficial in women with iron-deficiency anemia (IDA) given high-dose iron supplements is uncertain. The primary objective of this study was to assess whether, in women with IDA, alternate-day administration of 100 and 200 mg iron increases iron absorption compared to consecutive-day iron administration. Secondary objectives were to correlate iron absorption with SHep and iron status parameters. We performed a cross-over iron absorption study in women with IDA (n=19; median hemoglobin 11.5 g/dL; mean serum ferritin 10 µg/L) who received either 100 or 200 mg iron as ferrous sulfate given at 8 AM on days 2, 3 and 5 labeled with stable iron isotopes 57Fe, 58Fe and 54Fe; after a 16-day incorporation period, the other labeled dose was given at 8 AM on days 23, 24 and 26 (days 2, 3 and 5 of the second period). Iron absorption on days 2 and 3 (consecutive) and day 5 (alternate) was assessed by measuring erythrocyte isotope incorporation. For both doses, SHep was higher on day 3 than on day 2 (P<0.001) or day 5 (P<0.01) with no significant difference between days 2 and 5. Similarly, for both doses, fractional iron absorption (FIA) on days 2 and 5 was 40-50% higher than on day 3 (P<0.001), while absorption on day 2 did not differ significantly from day 5. There was no significant difference in the incidence of gastrointestinal side effects comparing the two iron doses (P=0.105). Alternate-day dosing of oral iron supplements in anemic women may be preferable because it sharply increases FIA. 
If needed, to provide the same total amount of iron with alternate day dosing, twice the daily target dose should be given on alternate days, as total iron absorption from a single dose of 200 mg given on alternate days was approximately twice that from 100 mg given on consecutive days (P<0.001). In IDA, even if hepatic hepcidin expression is strongly suppressed by iron deficiency and erythropoietic drive, the intake of oral iron supplements leads to an acute hepcidin increase for 24 hours. The study was funded by ETH Zürich, Switzerland. This study has been registered at www.clinicaltrials.gov as #NCT03623997.





Bone marrow niche dysregulation in myeloproliferative neoplasms

The bone marrow niche is a complex and dynamic structure composed of a multitude of cell types which functionally create an interactive network facilitating hematopoietic stem cell development and maintenance. Its specific role in the pathogenesis, response to therapy, and transformation of myeloproliferative neoplasms has only recently been explored. Niche functionality is likely affected not only by the genomic background of the myeloproliferative neoplasm-associated mutated hematopoietic stem cells, but also by disease-associated ‘chronic inflammation’, and subsequent adaptive and innate immune responses. ‘Cross-talk’ between mutated hematopoietic stem cells and multiple niche components may contribute to propagating disease progression and mediating drug resistance. In this timely article, we will review current knowledge surrounding the deregulated bone marrow niche in myeloproliferative neoplasms and suggest how this may be targeted, either directly or indirectly, potentially influencing therapeutic choices both now and in the future.





A post-stem cell transplant risk score for Philadelphia-negative acute lymphoblastic leukemia





CRISPR/Cas9-mediated gene deletion efficiently retards the progression of Philadelphia-positive acute lymphoblastic leukemia in a p210 BCR-ABL1T315I mutation mouse model





Disease progression in myeloproliferative neoplasms: comparing patients in accelerated phase with those in chronic phase with increased blasts (<10%) or with other types of disease progression





Suppressive effects of anagrelide on cell cycle progression and the maturation of megakaryocyte progenitor cell lines in human induced pluripotent stem cells





5-formylcytosine and 5-hydroxymethyluracil as surrogate markers of TET2 and SF3B1 mutations in myelodysplastic syndrome, respectively





Functional assessment of glucocerebrosidase modulator efficacy in primary patient-derived macrophages is essential for drug development and patient stratification





Inorganic Nitrate Promotes Glucose Uptake and Oxidative Catabolism in White Adipose Tissue Through the XOR-Catalyzed Nitric Oxide Pathway

An aging global population combined with sedentary lifestyles and unhealthy diets has contributed to an increasing incidence of obesity and type 2 diabetes. These metabolic disorders are associated with perturbations to nitric oxide (NO) signaling and impaired glucose metabolism. Dietary inorganic nitrate, found in high concentration in green leafy vegetables, can be converted to NO in vivo and demonstrates antidiabetic and antiobesity properties in rodents. Alongside tissues including skeletal muscle and liver, white adipose tissue is also an important physiological site of glucose disposal. However, the distinct molecular mechanisms governing the effect of nitrate on adipose tissue glucose metabolism and the contribution of this tissue to the glucose-tolerant phenotype remain to be determined. Using a metabolomic and stable-isotope labeling approach, combined with transcriptional analysis, we found that nitrate increases glucose uptake and oxidative catabolism in primary adipocytes and white adipose tissue of nitrate-treated rats. Mechanistically, we determined that nitrate induces these phenotypic changes in primary adipocytes through the xanthine oxidoreductase–catalyzed reduction of nitrate to NO and independently of peroxisome proliferator–activated receptor-α. The nitrate-mediated enhancement of glucose uptake and catabolism in white adipose tissue may be a key contributor to the antidiabetic effects of this anion.





Pervasive Small RNAs in Cardiometabolic Research: Great Potential Accompanied by Biological and Technical Barriers

Advances in small RNA sequencing have revealed the enormous diversity of small noncoding RNA (sRNA) classes in mammalian cells. At this point, most investigators in diabetes are aware of the success of microRNA (miRNA) research and appreciate the importance of posttranscriptional gene regulation in glycemic control. Nevertheless, miRNAs are just one of multiple classes of sRNAs and likely represent only a minor fraction of sRNA sequences in a given cell. Despite the widespread appreciation of sRNAs, very little research into non-miRNA sRNA function has been completed, likely due to some major barriers that present unique challenges for study. To emphasize the importance of sRNA research in cardiometabolic diseases, we highlight the success of miRNAs and competitive endogenous RNAs in cholesterol and glucose metabolism. Moreover, we argue that sequencing studies have demonstrated that miRNAs are just the tip of the iceberg for sRNAs. We are likely standing at the precipice of immense discovery for novel sRNA-mediated gene regulation in cardiometabolic diseases. To realize this potential, we must first address critical barriers with an open mind and refrain from viewing non-miRNA sRNA function through the lens of miRNAs, as they likely have their own set of distinct regulatory factors and functional mechanisms.





Loss of cerebellar function selectively affects intrinsic rhythmicity of eupneic breathing [RESEARCH ARTICLE]

Yu Liu, Shuhua Qi, Fridtjof Thomas, Brittany L. Correia, Angela P. Taylor, Roy V. Sillitoe, and Detlef H. Heck

Respiration is controlled by central pattern generating circuits in the brain stem, whose activity can be modulated by inputs from other brain areas to adapt respiration to autonomic and behavioral demands. The cerebellum is known to be part of the neuronal circuitry activated during respiratory challenges, such as air hunger, but has not been found to be involved in the control of spontaneous, unobstructed breathing (eupnea). Here we applied a measure of intrinsic rhythmicity, the CV2, which evaluates the similarity of subsequent intervals and is thus sensitive to changes in rhythmicity at the temporal resolution of individual respiratory intervals. The variability of intrinsic respiratory rhythmicity was reduced in a mouse model of cerebellar ataxia compared with healthy littermates. Despite that difference, the average respiratory rate and the average coefficient of variation (CV) were comparable between healthy and ataxic mice. We argue that these findings are consistent with a proposed role of the cerebellum in modulating the duration of individual respiratory intervals, which could serve the purpose of coordinating respiration with other rhythmic orofacial movements, such as fluid licking and swallowing.
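The distinction the abstract draws between the CV and the CV2 can be made concrete in a short sketch. It assumes the standard CV2 definition of Holt et al. (2|I(n+1) - I(n)| / (I(n+1) + I(n)), averaged over successive interval pairs); the interval data are invented for illustration:

```python
# Sketch: contrast the ordinary CV with the CV2 local-variability
# measure the abstract refers to (assuming the standard Holt et al.
# definition). The interval data here are invented for illustration.

def cv(intervals):
    """Coefficient of variation: global variability of all intervals."""
    n = len(intervals)
    mean = sum(intervals) / n
    var = sum((x - mean) ** 2 for x in intervals) / n
    return (var ** 0.5) / mean

def cv2(intervals):
    """Mean CV2: 2|I(n+1) - I(n)| / (I(n+1) + I(n)) averaged over
    successive pairs; sensitive to interval-to-interval changes."""
    pairs = zip(intervals, intervals[1:])
    vals = [2 * abs(b - a) / (b + a) for a, b in pairs]
    return sum(vals) / len(vals)

# Two series with identical interval distributions (hence identical CV)
# but different ordering: the sorted series changes slowly from one
# interval to the next, so its CV2 is lower even though its CV is not.
irregular = [0.30, 0.45, 0.28, 0.50, 0.32, 0.47, 0.29, 0.52]
slow_drift = sorted(irregular)

print(round(cv(irregular), 3), round(cv(slow_drift), 3))   # identical CVs
print(round(cv2(irregular), 3), round(cv2(slow_drift), 3)) # CV2 differs
```

This is why a change in CV2 can appear, as reported, without any change in the average rate or the average CV.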





Improving mental health in autistic young adults: a qualitative study exploring help-seeking barriers in UK primary care

Background

Autistic people are at increased risk of developing mental health problems. To reduce the negative impact of living with autism in a non-autistic world, efforts to improve take-up and access to care, and support in early years, which will typically start with a GP appointment, must be grounded in the accounts of autistic young adults.

Aim

To explore how autistic young adults understand and manage mental health problems, with a particular focus on help-seeking.

Design and setting

A cross-sectional, qualitative study. Autistic participants were purposively selected to represent a range of mental health conditions including anxiety and depression. A subsample were recruited from a population cohort screened for autism in childhood. The study concerns access to primary care.

Method

Nineteen autistic young adults without learning disabilities, aged 23 or 24 years, were recruited. In-depth, semi-structured interviews explored how they understood and managed mental health problems. Data were analysed thematically.

Results

Young adults preferred self-management strategies. Multiple factors contributed to a focus on self-management, including: beliefs about the aetiology of mental health difficulties and increased vulnerability within the context of a diagnosis of autism, knowledge of self-management, and a view that formal support was unavailable or inadequate. Families had limited awareness of professional support.

Conclusion

Young autistic adults without learning disabilities, and their families, may hold erroneous beliefs about autism and mental health. This may affect help-seeking and contribute to an exacerbation of symptoms. GPs need to be alert to the fact that autistic young adults in their care may be experiencing mental health difficulties but may not recognise them as such.





Optimising management of UTIs in primary care: a qualitative study of patient and GP perspectives to inform the development of an evidence-based, shared decision-making resource

Background

Urinary tract infections (UTIs) are one of the most common bacterial infections managed in general practice. Many women with symptoms of uncomplicated UTI may not benefit meaningfully from antibiotic treatment, but the evidence base is complex and there is no suitable shared decision-making resource to guide antibiotic treatment and symptomatic care for use in general practice consultations.

Aim

To develop an evidence-based, shared decision-making intervention leaflet to optimise management of uncomplicated UTI for women aged <65 years in the primary care setting.

Design and setting

Qualitative telephone interviews with GPs and patient focus group interviews.

Method

In-depth interviews were conducted to explore how consultation discussions around diagnosis, antibiotic use, self-care, safety netting, and prevention of UTI could be improved. Interview schedules were based on the Theoretical Domains Framework.

Results

Barriers to an effective joint consultation and appropriate prescribing included: lack of GP time, misunderstanding of depth of knowledge and miscommunication between the patient and the GP, the nature of the consultation (such as telephone consultations), and a history of previous antibiotic therapy.

Conclusion

Consultation time pressures combined with late symptom presentation are a challenge for even the most experienced GPs; however, it is clear that enhanced patient–clinician shared decision making is urgently needed for UTIs. This communication should incorporate the provision of self-care, safety-netting, and preventive advice to help guide patients on when to consult. A shared decision-making information leaflet was iteratively co-produced with patients, clinicians, and researchers at Public Health England using study data.





Infection in older adults: a qualitative study of patient experience

Background

Infection is common in older adults. Serious infection has a high mortality rate and is associated with unplanned hospital admissions. Little is known about the factors that prompt older patients to seek medical advice when they may have an infection.

Aim

To explore the symptoms of infection from the perspective of older adults, and when and why older patients seek healthcare advice for a possible infection.

Design and setting

A qualitative interview study among adults aged ≥70 years with a clinical diagnosis of infection recruited from ambulatory care units in Oxford, UK.

Method

Interviews were semi-structured and based on a flexible topic guide. Participants were given the option to be interviewed with their carer. Thematic analysis was facilitated using NVivo (version 11).

Results

A total of 28 participants (22 patients and six carers) took part. Patients (aged 70–92 years) had experienced a range of different infections. Several early non-specific symptoms were described (fever, feeling unwell, lethargy, vomiting, pain, and confusion/delirium). Internally minimising symptoms was common, and participants with historical experience of infection tended to be better able to interpret their symptoms. Factors influencing seeking healthcare advice included prompts from family, specific or intolerable symptoms, symptom duration, and being unable to manage with self-care. For some, not wanting to be a burden affected their desire to seek help.

Conclusion

Tailored advice to older adults highlighting early symptoms of infection may be beneficial. Knowing whether patients have had previous experience of infection may help healthcare professionals in assessing older patients with possible infection.





Understanding how patients establish strategies for living with asthma: a qualitative study in UK primary care as part of IMP2ART

Background

In the context of a variable condition such as asthma, patient recognition of deteriorating control and knowing what prompt action to take is crucial. Yet, implementation of recommended self-management strategies remains poor.

Aim

To explore how patients with asthma and parents/carers of children with asthma develop and establish recommended self-management strategies for living with asthma, and how clinicians can best support the process.

Design and setting

A qualitative study in UK primary care.

Method

Patients with asthma and parents/carers of children with asthma from 10 general practices were purposively sampled (using age, sex, and duration of asthma) to participate in focus groups or interviews between May 2016 and August 2016. Participants’ experiences of health care, management of asthma, and views on supported self-management were explored. Interviews and focus group sessions were audio-recorded and transcribed verbatim. Iterative thematic analysis was conducted, guided by the research questions and drawing on habit theory in discussion with a multidisciplinary research team.

Results

A total of 49 participants (45 patients; 4 parents/carers) took part in 32 interviews and five focus groups. Of these, 11 reported using an action plan. Patients learnt how to self-manage over time, building knowledge from personal experience and other sources, such as the internet. Some regular actions, for example, taking medication, became habitual. Dealing with new or unexpected scenarios required reflective abilities, which may be supported by a tailored action plan.

Conclusion

Patients reported learning intuitively how to self-manage. Some regular actions became habitual; dealing with the unexpected required more reflective cognitive skills. In order to support implementation of optimal asthma self-management, clinicians should consider both these aspects of self-management and support, and educate patients proactively.





An alternative COVID-19 checklist





Delivering long-term cancer care in primary care





Impacts of Operational Failures on Primary Care Physicians’ Work: A Critical Interpretive Synthesis of the Literature [Systematic Review]

PURPOSE

Operational failures are system-level errors in the supply of information, equipment, and materials to health care personnel. We aimed to review and synthesize the research literature to determine how operational failures in primary care affect the work of primary care physicians.

METHODS

We conducted a critical interpretive synthesis. We searched 7 databases for papers published in English from database inception until October 2017 for primary research of any design that addressed problems interfering with primary care physicians’ work. All potentially eligible titles/abstracts were screened by 1 reviewer; 30% were subject to second screening. We conducted an iterative critique, analysis, and synthesis of included studies.

RESULTS

Our search retrieved 8,544 unique citations. Though no paper explicitly referred to "operational failures," we identified 95 papers that conformed to our general definition. The included studies show a gap between what physicians perceived they should be doing and what they were doing, which was strongly linked to operational failures—including those relating to technology, information, and coordination—over which physicians often had limited control. Operational failures actively configured physicians’ work by requiring significant compensatory labor to deliver the goals of care. This labor was typically unaccounted for in scheduling or reward systems and had adverse consequences for physician and patient experience.

CONCLUSIONS

Primary care physicians’ efforts to compensate for suboptimal work systems are often concealed, risking an incomplete picture of the work they do and problems they routinely face. Future research must identify which operational failures are highest impact and tractable to improvement.





Efficacy and Safety of Use of the Fasting Algorithm for Singaporeans With Type 2 Diabetes (FAST) During Ramadan: A Prospective, Multicenter, Randomized Controlled Trial [Original Research]

PURPOSE

We aimed to evaluate the efficacy and safety of use of the Fasting Algorithm for Singaporeans with Type 2 Diabetes (FAST) during Ramadan.

METHODS

We performed a prospective, multicenter, randomized controlled trial. The inclusion criteria were age ≥21 years, baseline glycated hemoglobin (HbA1c) level ≤9.5%, and intention to fast for ≥10 days during Ramadan. Exclusion criteria included baseline estimated glomerular filtration rate <30 mL/min, diabetes-related hospitalization, and short-term corticosteroid therapy. Participants were randomized to intervention (use of FAST) or control (usual care without FAST) groups. Efficacy outcomes were HbA1c level and fasting blood glucose and postprandial glucose changes, and the safety outcome was incidence of major or minor hypoglycemia during the Ramadan period. Glycemic variability and diabetes distress were also investigated. Linear mixed models were constructed to assess changes.

RESULTS

A total of 97 participants were randomized (intervention: n = 46, control: n = 51). The HbA1c improvement during Ramadan was 4 times greater in the intervention group (–0.4%) than in the control group (–0.1%) (P = .049). The mean fasting blood glucose level decreased in the intervention group (–3.6 mg/dL) and increased in the control group (+20.9 mg/dL) (P = .034). The mean postprandial glucose level showed greater improvement in the intervention group (–16.4 mg/dL) compared to the control group (–2.3 mg/dL). There were more minor hypoglycemic events based on self-monitored blood glucose readings in the control group (intervention: 4, control: 6; P = .744). Glycemic variability was not significantly different between the 2 groups (P = .284). No between-group differences in diabetes distress were observed (P = .479).

CONCLUSIONS

Our findings emphasize the importance of efficacious, safe, and culturally tailored epistemic tools for diabetes management.





Anticoagulants Safety and Effectiveness in General Practice: A Nationwide Prospective Cohort Study [Original Research]

PURPOSE

Most real-world studies on anticoagulants have been based on health insurance databases or performed in secondary care. The aim of this study was to compare safety and effectiveness between patients treated with vitamin K antagonists (VKAs) and patients treated with direct oral anticoagulants (DOACs) in a general practice setting.

METHODS

The CACAO study (Comparison of Accidents and their Circumstances with Oral Anticoagulants) is a multicenter prospective cohort study conducted among ambulatory patients taking an oral anticoagulant. Participants were patients from the study’s cross-sectional phase receiving oral anticoagulants because of nonvalvular atrial fibrillation, for secondary prevention of venous thromboembolism, or both. They were followed as usual for 1 year by their general practitioners, who collected data on changes in therapy, thromboembolic events, bleeding, and deaths. All events were adjudicated by an independent committee. We used a propensity score and a Cox regression model to derive hazard ratios.

RESULTS

Between April and December 2014, a total of 3,082 patients were included. At 1 year, 42 patients (1.7%) had experienced an arterial or venous event; 151 (6.1%) had experienced bleeding, including 47 (1.9%) who experienced major bleeding; and 105 (4.1%) had died. There was no significant difference between the VKA and DOAC groups regarding arterial or venous events, or major bleeding. The VKA group had a lower risk of overall bleeding (hazard ratio = 0.65; 95% CI, 0.43-0.98) but twice the risk of death (hazard ratio = 1.98; 95% CI, 1.15-3.42).

CONCLUSIONS

VKAs and DOACs had fairly similar safety and effectiveness in general practice. The substantially higher incidence of deaths with VKAs is consistent with known data from health insurance databases and calls for further research to understand its cause.





Predicting Opioid Use Following Discharge After Cesarean Delivery [Original Research]

PURPOSE

Although cesarean delivery is the most common surgical procedure in the United States, postoperative opioid prescribing varies greatly. We hypothesized that patient characteristics, procedural characteristics, or both would be associated with high vs low opioid use after discharge. This information could help individualize prescriptions.

METHODS

In this prospective cohort study, we quantified opioid use for 4 weeks following hospital discharge after cesarean delivery. Predischarge characteristics were obtained from health records, and patients self-reported total opioid use postdischarge on weekly questionnaires. Opioid use was quantified in milligram morphine equivalents (MMEs). Binomial and Poisson regression analyses were performed to assess predictors of opioid use after discharge.
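As a hedged illustration of the MME bookkeeping described above, the sketch below converts self-reported opioid use into milligram morphine equivalents using standard CDC-style conversion factors; the drug names and tallies are invented examples, not study data:

```python
# Sketch: converting self-reported opioid use to milligram morphine
# equivalents (MMEs) with standard CDC-style conversion factors. The
# weekly tally below is an invented example, not study data.

CONVERSION = {          # mg of drug -> morphine-equivalent mg
    "morphine": 1.0,
    "oxycodone": 1.5,
    "hydrocodone": 1.0,
    "tramadol": 0.1,
}

def total_mme(use):
    """use: dict of drug name -> total mg taken over the period."""
    return sum(CONVERSION[drug] * mg for drug, mg in use.items())

week = {"oxycodone": 50, "tramadol": 100}   # hypothetical patient report
print(total_mme(week))          # 85.0 MMEs
print(total_mme(week) > 75)     # above the study's 75-MME high-use cutoff
```

Summing MMEs across the 4 postdischarge weeks in this way is what allows patients to be dichotomized into the high-use (>75 MMEs) and low-use (≤75 MMEs) groups reported in the results.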

RESULTS

Of the 233 patients starting the study, 203 (87.1%) completed at least 1 questionnaire and were included in analyses (86.3% completed all 4 questionnaires). A total of 113 patients were high users (>75 MMEs) and 90 patients were low users (≤75 MMEs) of opioids postdischarge. The group reporting low opioid use received on average 44% fewer opioids in the 24 hours before discharge compared with the group reporting high opioid use (mean = 33.0 vs 59.3 MMEs, P <.001). Only a minority of patients (11.4% to 15.8%) stored leftover opioids in a locked location, and just 31 patients disposed of leftover opioids.

CONCLUSIONS

Knowledge of predischarge opioid use can be useful as a tool to inform individualized opioid prescriptions, help optimize nonopioid analgesia, and reduce opioid use. Additional studies are needed to evaluate the impact of implementing such measures on prescribing practices, pain, and functional outcomes.





Effect of an Interactive Website to Engage Patients in Advance Care Planning in Outpatient Settings [Original Research]

PURPOSE

Online programs may help to engage patients in advance care planning in outpatient settings. We sought to implement an online advance care planning program, PREPARE (Prepare for Your Care; http://www.prepareforyourcare.org), at home and evaluate the changes in advance care planning engagement among patients attending outpatient clinics.

METHODS

We undertook a prospective before-and-after study in 15 primary care clinics and 2 outpatient cancer centers in Canada. Patients were aged 50 years or older (primary care) or 18 years or older (cancer care) and free of cognitive impairment. They used the PREPARE website over 6 weeks, with reminders sent at 2 or 4 weeks. We used the 55-item Advance Care Planning Engagement Survey, which measures behavior change processes (knowledge, contemplation, self-efficacy, readiness) on 5-point scales and actions relating to substitute decision makers, quality of life, flexibility for the decision maker, and asking doctors questions on an overall scale from 0 to 21; higher scores indicate greater engagement.

RESULTS

In total, 315 patients were screened and 172 enrolled, of whom 75% completed the study (mean age = 65.6 years, 51% female, 35% had cancer). The mean behavior change process score was 2.9 (SD 0.8) at baseline and 3.5 (SD 0.8) at follow-up (mean change = 0.6; 95% CI, 0.49-0.73); the mean action measure score was 4.0 (SD 4.9) at baseline and 5.2 (SD 5.4) at follow-up (mean change = 1.2; 95% CI, 0.54-1.77). The effect size was moderate (0.75) for the former and small (0.23) for the latter. Findings were similar in both primary care and cancer care populations.
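The reported effect sizes can be reproduced from the summary statistics above if they are computed as standardized mean changes (mean change divided by the baseline SD); this is an assumption about the authors' method, sketched below:

```python
# Sketch: reproduce the abstract's effect sizes as standardized mean
# changes (mean change / baseline SD) -- an assumption about how they
# were computed, using only the summary numbers reported above.

def standardized_change(baseline_mean, followup_mean, baseline_sd):
    return (followup_mean - baseline_mean) / baseline_sd

process = standardized_change(2.9, 3.5, 0.8)  # behavior change process score
action = standardized_change(4.0, 5.2, 4.9)   # action measure score

print(round(process, 2))  # 0.75, the reported "moderate" effect
print(round(action, 2))   # 0.24, close to the reported "small" 0.23
```

0.6/0.8 gives exactly the reported 0.75; 1.2/4.9 gives 0.24, with the published 0.23 plausibly reflecting rounding in the reported means.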

CONCLUSIONS

Implementation of the online PREPARE program in primary care and cancer care clinics increased advance care planning engagement among patients.





Prognosis and Survival of Older Patients With Dizziness in Primary Care: A 10-Year Prospective Cohort Study [Original Research]

PURPOSE

The prognosis of older patients with dizziness in primary care is unknown. Our objective was to determine the prognosis and survival of patients with different subtypes and causes of dizziness.

METHODS

In a primary care prospective cohort study, 417 older adults with dizziness (mean age 79 years) received a full diagnostic workup in 2006-2008. A panel of physicians classified the subtype and primary cause of dizziness. Main outcome measures were mortality and dizziness-related impairment assessed at 10-year follow-up.

RESULTS

At 10-year follow-up 169 patients (40.5%) had died. Presyncope was the most common dizziness subtype (69.1%), followed by vertigo (41.0%), disequilibrium (39.8%), and other dizziness (1.7%). The most common primary causes of dizziness were cardiovascular disease (56.8%) and peripheral vestibular disease (14.4%). Multivariable adjusted Cox models showed a lower mortality rate for patients with the subtype vertigo compared with other subtypes (hazard ratio [HR] = 0.62; 95% CI, 0.40-0.96), and for peripheral vestibular disease vs cardiovascular disease as primary cause of dizziness (HR = 0.46; 95% CI, 0.25-0.84). After 10 years, 47.7% of patients who filled out the follow-up measurement experienced substantial dizziness-related impairment. No significant difference in substantial impairment was seen between different subtypes and primary causes of dizziness.

CONCLUSIONS

The 10-year mortality rate was lower for the dizziness subtype vertigo compared with other subtypes. Patients with dizziness primarily caused by peripheral vestibular disease had a lower mortality rate than patients with cardiovascular disease. The proportion of older patients with dizziness who still experience substantial dizziness-related impairment 10 years later is high, which indicates that current treatment strategies used by family physicians may be suboptimal.










Identifying Outcomes Important to Patients with Glomerular Disease and Their Caregivers

Background and objectives

Shared decision making in patients with glomerular disease remains challenging because outcomes important to patients remain largely unknown. We aimed to identify and prioritize outcomes important to patients and caregivers and to describe reasons for their choices.

Design, setting, participants, & measurements

We purposively sampled adult patients with glomerular disease and their caregivers from Australia, Hong Kong, the United Kingdom, and the United States. Participants identified, discussed, and ranked outcomes in focus groups using the nominal group technique; a relative importance score (between zero and one) was calculated. Qualitative data were analyzed thematically.

Results

Across 16 focus groups, 134 participants (range, 19–85 years old; 51% women), including 101 patients and 33 caregivers, identified 58 outcomes. The ten highest-ranked outcomes were kidney function (importance score of 0.42), mortality (0.29), need for dialysis or transplant (0.22), life participation (0.18), fatigue (0.17), anxiety (0.13), family impact (0.12), infection and immunity (0.12), ability to work (0.11), and BP (0.11). Three themes explained the reasons for these rankings: constraining day-to-day experience, impaired agency and control over health, and threats to future health and family.
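One common way to turn nominal-group rankings into a relative importance score between zero and one is a normalized rank-weight sum; the weighting scheme and the toy rankings below are illustrative assumptions, not the study's published method:

```python
# Sketch: one plausible way to compute a 0-to-1 relative importance
# score from nominal-group rankings. The linear weighting scheme and
# the toy rankings are assumptions for illustration only.

def importance_scores(rankings, top_n=10):
    """rankings: list of per-participant ordered lists (most important
    first). Each ranked outcome gets a linear weight (top_n down to 1);
    totals are normalized by the maximum attainable weight."""
    scores = {}
    for ranked in rankings:
        for pos, outcome in enumerate(ranked[:top_n]):
            scores[outcome] = scores.get(outcome, 0) + (top_n - pos)
    max_score = top_n * len(rankings)   # outcome ranked first by everyone
    return {o: s / max_score for o, s in scores.items()}

rankings = [   # hypothetical participants, most important outcome first
    ["kidney function", "mortality", "fatigue"],
    ["kidney function", "fatigue", "mortality"],
    ["mortality", "kidney function", "life participation"],
]
scores = importance_scores(rankings)
print(sorted(scores.items(), key=lambda kv: -kv[1]))
```

Under such a scheme, an outcome's score falls as fewer participants rank it and as it appears lower in their lists, which is consistent with the spread of scores (0.42 down to 0.11) reported above.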

Conclusions

Patients with glomerular disease and their caregivers highly prioritize kidney health and survival, but they also prioritize life participation, fatigue, anxiety, and family impact.





The Elusive Promise of Bioimpedance in Fluid Management of Patients Undergoing Dialysis





Ask and It Shall Be Given: Patient-Centered Outcomes in Glomerular Diseases





What It Means to Live with Focal Segmental Glomerulosclerosis





Kidney Health Initiative Roadmap for Kidney Replacement Therapy: A Patient’s Perspective





IL1α Antagonizes IL1β and Promotes Adaptive Immune Rejection of Malignant Tumors

We assessed the contribution of IL1 signaling molecules to malignant tumor growth using IL1β–/–, IL1α–/–, and IL1R1–/– mice. Tumors grew progressively in IL1R1–/– and IL1α–/– mice but were often absent in IL1β–/– mice. This was observed whether tumors were implanted intradermally or injected intravenously and was true across multiple distinct tumor lineages. Antibodies to IL1β prevented tumor growth in wild-type (WT) mice but not in IL1R1–/– or IL1α–/– mice. Antibodies to IL1α promoted tumor growth in IL1β–/– mice and reversed the tumor-suppressive effect of anti-IL1β in WT mice. Depletion of CD8+ T cells and blockade of lymphocyte mobilization abrogated the IL1β–/– tumor suppressive effect, as did crossing IL1β–/– mice to SCID or Rag1–/– mice. Finally, blockade of IL1β synergized with blockade of PD-1 to inhibit tumor growth in WT mice. These results suggest that IL1β promotes tumor growth, whereas IL1α inhibits tumor growth by enhancing T-cell–mediated antitumor immunity.