data Five Challenges Companies Must Overcome to Make Use of All Their Data By feedproxy.google.com Published On :: Tue, 29 Aug 2017 22:48:11 +0000 Full Article Industry Perspectives
data How to Design a Data Center in a Norwegian Fjord By feedproxy.google.com Published On :: Wed, 30 Aug 2017 12:00:00 +0000 Full Article Design Europe Featured
data The unreasonable importance of data preparation By feedproxy.google.com Published On :: Tue, 24 Mar 2020 10:00:00 +0000 In a world focused on buzzword-driven models and algorithms, you’d be forgiven for forgetting about the unreasonable importance of data preparation and quality: your models are only as good as the data you feed them. This is the garbage in, garbage out principle: flawed data going in leads to flawed results, algorithms, and business decisions. […] Full Article AI & ML Deep Dive
data How data privacy leader Apple found itself in a data ethics catastrophe By feedproxy.google.com Published On :: Wed, 22 Apr 2020 12:17:23 +0000 Three months ago, Apple released a new credit card in partnership with Goldman Sachs that aimed to disrupt the highly regulated world of consumer finance. However, a well-known software developer tweeted that he was given 20x the credit line offered to his wife, despite the fact that they have been filing joint tax returns and […] Full Article AI & ML Deep Dive
data Driving 21st Century Growth: The Looming Transatlantic Battle Over Data By feedproxy.google.com Published On :: Tue, 14 Feb 2017 13:23:00 +0000 Corporate Members Event 29 March 2017 - 12:15pm to 1:30pm Chatham House, London Event participants Dr Christopher Smart, Whitehead Senior Fellow, Chatham House; Senior Fellow, Mossavar-Rahmani Center for Business and Government, Harvard Kennedy School; Special Assistant to President Obama, International Economics, Trade and Investment (2013-15) Chair: Kenneth Cukier, Senior Editor of Digital Products, The Economist As US and European governments grapple with the challenges of reinforcing their economic relationships, traditional negotiations over tax and trade policy may soon be overwhelmed by a far thornier issue: the regulation of data storage, protection and analysis. As traditional global trade in goods and services has levelled off, cross-border data flows continue to expand rapidly. Christopher Smart will outline the economic promise of data analytics to drive dramatic productivity gains, particularly for industry and financial services. He will explore contrasting political debates in the United States and Europe over personal privacy and national security and analyse how these have influenced many of the assumptions that drive the regulation of data flows. This event is open to corporate members only. This event will be preceded by an informal welcome reception from 12:15. To enable as open a debate as possible, this event will be held under the Chatham House Rule. Members Events Team Email Full Article
data Regulating the Data that Drive 21st-Century Economic Growth - The Looming Transatlantic Battle By feedproxy.google.com Published On :: Fri, 23 Jun 2017 09:14:01 +0000 28 June 2017 This paper examines how governments on both sides of the Atlantic are establishing frameworks that attempt to govern the commercial uses of data. It covers areas such as data analytics driving productivity and growth, the 'industrial internet of things', and the policy context and political forces shaping data rules in the US and Europe. Read online Download PDF Dr Christopher Smart Former Associate Fellow, US and the Americas Programme @csmart Data centre for T-Systems, a subsidiary of Deutsche Telekom. Photo by: Thomas Trutschel/Photothek/Getty Images Summary As the US government and European governments once again grapple with the challenges of reinforcing and expanding the transatlantic economic relationship, traditional negotiations over trade or tax policy may soon be upstaged by a far thornier and more important issue: how to regulate the storage, protection and analysis of data. Growth in the traditional global trade in goods and services has levelled off, but cross-border data flows continue to expand rapidly and the challenges of developing policies that protect privacy, security and innovation are already tremendous. For example, data analytics are driving dramatic productivity gains in industry, particularly for large and complex installations whose safety and efficiency will increasingly depend on flows of data across jurisdictions. 
Meanwhile, ‘fintech’ (financial technology) start-ups and large banks alike are testing new modes of accumulating, analysing and deploying customer data to provide less expensive services and manage the risk profile of their businesses. While the US debate on the use of data has often been framed around the trade-off between national security and personal privacy, Europeans often face an even more complex set of concerns that include worries that their digital and technology firms lag behind dominant US competitors. The political and regulatory uncertainty helps neither side, and leaves transatlantic companies struggling to comply with uncertain and conflicting rules in different jurisdictions. A global consensus on data regulation is currently well out of reach, but given the expanding importance of data in so many areas, basic agreement on regulatory principles is crucial between the US and the EU. This paper proposes a ‘Transatlantic Charter for Data Security and Mobility’, which could help shape a common understanding. While it would hardly resolve all concerns – or indeed contradictions – around the prevailing traditions on both sides of the Atlantic, it could provide the basis for better cooperation and establish a framework to protect the promise of the digital age amid an unpredictable and emotional debate. Department/project US and the Americas Programme, US Geoeconomic Trends and Challenges Full Article
data Why We Need a Transatlantic Charter for Data Security and Mobility By feedproxy.google.com Published On :: Tue, 27 Jun 2017 09:59:17 +0000 28 June 2017 Dr Christopher Smart @csmart Former Associate Fellow, US and the Americas Programme Setting common guidelines for data flows is crucial both to protect the goods and services that already depend on big data and to support the next generation of productivity gains and business opportunities. Data centre for T-Systems, a subsidiary of Deutsche Telekom. Photo by: Thomas Trutschel/Photothek/Getty Images While trade and tax remain at the heart of the difficult economic conversations between Europe and the US, a new issue has emerged as a potential source of even greater friction: data. Growth in the traditional global trade in goods and services has levelled off, but cross-border data flows continue to expand rapidly and the challenges of developing policies that protect privacy, security and innovation are already tremendous. For example, data analytics are driving dramatic productivity gains in industry, particularly for large and complex installations whose safety and efficiency will increasingly depend on flows of those data across jurisdictions. Meanwhile, ‘fintech’ (financial technology) start-ups and large banks alike are testing new modes of accumulating, analysing and deploying customer data to provide less expensive services and manage the risk profile of their businesses. The rules that govern the collection, transmission and storage of data are perhaps among the more surprising sources of controversy in the transatlantic relationship. Similar liberal democracies with similar geostrategic interests might be expected to approach the handling of personal, corporate and government data in more or less the same way. 
And yet the US and its key European partners have struck different balances in the trade-offs between national security and citizens’ rights, between freedom of expression and personal privacy, and between free enterprise and market regulation. While the US debate on the use of data has often been framed around the trade-off between national security and personal privacy, Europeans often face an even more complex set of concerns that include worries that their digital and technology firms lag behind dominant US competitors. The political and regulatory uncertainty helps neither side, and leaves transatlantic companies struggling to comply with uncertain and conflicting rules in different jurisdictions. This makes more determined efforts by US and European policymakers to agree basic principles that will guide the usage and protection of personal and commercial data all the more important. While common regulations or even greater alignment among regulators seem out of reach, a ‘Transatlantic Charter for Data Security and Mobility’ would provide a set of principles for more specific rules amid political landscapes and technological developments that are evolving rapidly. It could also provide the basis for firms, whether in manufacturing or financial services or health care, to draft their own voluntary standards on how they protect data even as they develop new algorithms that improve productivity, safety and customer satisfaction. Embarrassing leaks, careful denials and endless lawsuits will continue to shape the awkward efforts of policymakers to find common ground around issues like cyberespionage, defence of common networks and the sharing of personal data with law enforcement. Cyberattacks with the aim of disrupting government operations or influencing election campaigns will add still further pressures. 
These will all serve as a noisy backdrop to a related but separate debate over how commercial firms should exploit the opportunities of global networks and ‘big data’ analytics while protecting national interests and privacy. Yet, setting common guidelines for commercial data transmission and storage remains crucial both to protect the goods and services that already depend on sophisticated data-gathering and analysis, and to support the next generation of productivity gains and business opportunities. Global firms yearn for clarity and predictability as they organize themselves to make the most of the data revolution. Neither is likely to become a reality soon. The EU’s new General Data Protection Regulation will take effect in 2018, but its implementation will inevitably be coloured by the fact that American firms currently dominate the information technology business. Last year’s ‘Privacy Shield’ agreement between the US and the EU renews the permission for firms with transatlantic business interests to transfer data, subject to compliance with basic standards of protection, but the agreement remains vulnerable to European court challenges. Britain’s decision to leave the EU adds a further complication, as it establishes its own set of data protection rules that may not easily align with either European or US requirements. Meanwhile, the World Trade Organization continues to debate new rules for digital trade, even as markets like China, Russia and Brazil make up their own. If this ‘Transatlantic Charter for Data Security and Mobility’ were adopted bilaterally, say as part of the annual reviews of the US–EU Privacy Shield agreement, it could form the basis for broader cooperation on these issues, helping to drive progress in the G7 and G20 and ultimately perhaps in trade agreements under the WTO. 
It would hardly secure complete alignment on these questions, but it could help establish the framework for a debate that all too often lurches to extremes and risks damaging a fundamental alliance for global stability – along with a fundamental driver of 21st-century economic progress. To comment on this article, please contact Chatham House Feedback Full Article
data Webinar: Big Data, AI and Pandemics By feedproxy.google.com Published On :: Tue, 07 Apr 2020 22:30:01 +0000 Members Event Webinar 28 April 2020 - 5:00pm to 6:00pm Online Event participants David Aanensen, Director, The Centre for Genomic Pathogen Surveillance; Marietje Schaake, International Policy Director, Stanford University Cyber Policy Center; Stefaan Verhulst, Co-Founder and Chief of Research and Development, NYU Govlab; Chair: Marjorie Buchser, Executive Director, Digital Society Initiative, Chatham House Artificial Intelligence (AI) has the potential to benefit healthcare through a variety of applications including predictive care, treatment recommendations, identification of pathogens and disease patterns as well as the identification of vulnerable groups. With access to increasingly complex data sets and the rise of sophisticated pattern detection, AI could offer new means to anticipate and mitigate pandemics. However, the risks associated with AI such as bias, infringement on privacy and limited accountability become amplified under the pressurized lens of a global health crisis. Emergency measures often neglect standard checks and balances due to time constraints. Whether temporary or permanent, AI applications during the epidemic have the potential to mark a watershed moment in human history and normalize the deployment of those tools with little public debate. This webinar discusses the nature of beneficial tech while also identifying issues that arise out of fast-tracking AI solutions during emergencies and pandemics. Can emerging tech help detect and fight viruses? Should surveillance tech be widely accepted and rolled out during times of a global health emergency? And how can policymakers act to ensure the responsible use of data without hindering AI’s full potential? This webinar is being run in collaboration with Chatham House’s Digital Society Initiative (DSI) and Centre for Universal Health. 
Our DSI brings together policy and technology communities to help forge a common understanding and jointly address the challenges that rapid advances in technology pose for domestic and international politics. The Centre for Universal Health is a multi-disciplinary centre established to help accelerate progress towards the health-related Sustainable Development Goals (SDGs) by 2030, in particular SDG 3: ‘To ensure healthy lives and promote well-being for all at all ages’. Full Article
data Extending the Limits of Quantitative Proteome Profiling with Data-Independent Acquisition and Application to Acetaminophen-Treated Three-Dimensional Liver Microtissues By feedproxy.google.com Published On :: 2015-05-01 Roland Bruderer, May 1, 2015; 14:1400-1410. Research Full Article
data PaxDb, a Database of Protein Abundance Averages Across All Three Domains of Life By feedproxy.google.com Published On :: 2012-08-01 M. Wang, Aug 1, 2012; 11:492-500. Technological Innovation and Resources Full Article
data Interpretation of Shotgun Proteomic Data: The Protein Inference Problem By feedproxy.google.com Published On :: 2005-10-01 Alexey I. Nesvizhskii, Oct 1, 2005; 4:1419-1440. Tutorial Full Article
data Targeted Data Extraction of the MS/MS Spectra Generated by Data-independent Acquisition: A New Concept for Consistent and Accurate Proteome Analysis By feedproxy.google.com Published On :: 2012-06-01 Ludovic C. Gillet, Jun 1, 2012; 11:O111.016717. Research Full Article
data What is synthetic data and how can it help protect privacy? By www.techworld.com Published On :: Tue, 01 Oct 2019 07:00:00 GMT Hazy CEO Harry Keen explains why artificially produced data is gaining favour over information generated by real-world events Full Article
data Housing minister announces plans to boost UK proptech sector with data By www.techworld.com Published On :: Tue, 22 Oct 2019 07:12:00 GMT Esther McVey said the government will release local data on properties and land to help the proptech sector thrive Full Article
data Visa's acquisition of Plaid throws up data reuse concerns By www.techworld.com Published On :: Tue, 21 Jan 2020 14:00:00 GMT What happens when a service you shared your personal data with is acquired by a giant corporation? Full Article
data CBD Press Release: Managing biodiversity data from local government: Guidance for local authorities on publishing through the GBIF network, helping preserve knowledge about biodiversity By www.cbd.int Published On :: Fri, 25 May 2012 00:00:00 GMT Full Article
data CBD News: The GBIF Secretariat has launched the inaugural GBIF Ebbe Nielsen Challenge, hoping to inspire innovative applications of open-access biodiversity data by scientists, informaticians, data modelers, cartographers and other experts competing for a By gbif.challengepost.com Published On :: Thu, 04 Dec 2014 00:00:00 GMT Full Article
data CBD News: The ecologically or biologically significant marine areas (EBSA) booklet series provide snapshot summaries of the pages upon pages of data compiled by participating experts, to provide an inspiring overview of some of the most ecologically or bi By www.cbd.int Published On :: Fri, 22 Jun 2018 00:00:00 GMT Full Article
data CBD News: Online platform allows policymakers and other partners to access global data layers, upload and manipulate their own datasets, and query multiple datasets to provide key information on the Aichi Biodiversity Targets and nature-based Sustainable By www.undp.org Published On :: Fri, 06 Jul 2018 00:00:00 GMT Full Article
data CBD Notification SCBD/SSSF/AS/ML/GD/88414 (2019-114): Tracking Economic Instruments and Finance for Biodiversity: Invitation to contribute data on positive incentives relevant to Aichi Biodiversity Target 3 to the OECD PINE database By www.cbd.int Published On :: Thu, 12 Dec 2019 00:00:00 GMT Full Article
data CBD Notification SCBD/SSSF/AS/SBG/ESE/88552 (2019-116): Request to Identify and Submit Data on Other Effective Area-Based Conservation Measures By www.cbd.int Published On :: Wed, 18 Dec 2019 00:00:00 GMT Full Article
data A finite element data assimilation method for the wave equation By www.ams.org Published On :: Tue, 07 Apr 2020 14:09 EDT Erik Burman, Ali Feizmohammadi and Lauri Oksanen Math. Comp. 89 (2020), 1681-1709. Abstract, references and article information Full Article
data Innovative UK companies using and sharing open data By www.techworld.com Published On :: Wed, 04 Dec 2019 16:23:00 GMT Full Article
data SAS Notes for SAS®9 - 65885: The ability to connect to a Google BigQuery database via OAuth Authentication has been added to SAS/ACCESS Interface to Google BigQuery By feedproxy.google.com Published On :: Fri, 1 May 2020 09:59:50 EST The ability to connect to a Google BigQuery database via OAuth is now available with this hot fix. Three new options have been added, REFRESH_TOKEN=, CLIENT_ID=, and CLIENT_SECRET=. You can use these options with  Full Article BIGQUERY+SAS/ACCESS+Interface+to+Google+
data SAS Notes for SAS®9 - 65884: The ability to connect to a Google BigQuery database via proxy has been added to the SAS/ACCESS Interface to Google BigQuery By feedproxy.google.com Published On :: Thu, 30 Apr 2020 12:50:08 EST The ability to connect to a Google BigQuery database via a proxy is available with this hot fix. You can use the newly added option, PROXY=, with the following methods of connection to the Google BigQuery database: Full Article BIGQUERY+SAS/ACCESS+Interface+to+Google+
data How to Make Sound Decisions with Limited Data During the Coronavirus Pandemic By www8.gsb.columbia.edu Published On :: Thu, 02 Apr 2020 17:56:26 +0000 Leadership Operations Risk Management Strategy Thursday, April 2, 2020 - 13:00 Coronavirus presents an unprecedented predicament: Every day, leaders must make momentous decisions with life-or-death consequences for many, but there is a dearth of data. Oded Netzer is a Columbia Business School professor and Data Science Institute affiliate who builds statistical and econometric models to measure consumer behavior that help business leaders make data-driven decisions. Here, he discusses how leaders from all fields can make sound decisions with scarce data to guide them. Full Article
data CT scan database of 1000 sets was created for teaching AI to diagnose COVID-19 By www.eurekalert.org Published On :: Fri, 08 May 2020 00:00:00 EDT (Moscow Research and Practical Clinical Center for Diagnostics and Telemedicine Technologies) Researchers of the Moscow Diagnostics and Telemedicine Center collected a dataset that includes more than a thousand sets of chest CT scans of patients with imaging findings of COVID-19. As of today, it is the largest completely anonymized database of CT studies, which has no analogues in Russia or in the world. It is available for download and can be used for developing services based on artificial intelligence technologies. Full Article
data Study: could dark matter be hiding in existing data? By www.eurekalert.org Published On :: Mon, 04 May 2020 00:00:00 EDT (DOE/Lawrence Berkeley National Laboratory) A new study, led by researchers at Berkeley Lab and UC Berkeley, suggests new paths for catching the signals of dark matter particles that have their energy absorbed by atomic nuclei. Full Article
data NASA CubeSat mission to gather vital space weather data By www.eurekalert.org Published On :: Thu, 07 May 2020 00:00:00 EDT (NASA/Goddard Space Flight Center) NASA has selected a new pathfinding CubeSat mission to gather data not collected since the agency flew the Dynamics Explorer in the early 1980s. Full Article
data Data from 2 space lasers comprehensively estimate polar ice loss and sea level rise By www.eurekalert.org Published On :: Thu, 30 Apr 2020 00:00:00 EDT (American Association for the Advancement of Science) Ice sheet losses from Greenland and Antarctica have outpaced snow accumulation and contributed approximately 14 millimeters to sea level rise over 16 years (2003 to 2019), a new analysis of data from NASA's laser-shooting satellites has revealed. Full Article
data The Data Must Be Accessible to All By feedproxy.google.com Published On :: 2020-04-01 Lila M. Gierasch, Apr 1, 2020; 19:569-570. Editorial Full Article
data Acquiring and Analyzing Data Independent Acquisition Proteomics Experiments without Spectrum Libraries By feedproxy.google.com Published On :: 2020-04-20 Lindsay K. Pino, Apr 20, 2020; 0:P119.001913v1-mcp.P119.001913. Perspective Full Article
data Data-driven motion detection and event-by-event correction for brain PET: Comparison with Vicra By jnm.snmjournals.org Published On :: 2020-02-14T14:01:21-08:00 Head motion degrades image quality and causes erroneous parameter estimates in tracer kinetic modeling in brain PET studies. Existing motion correction methods include frame-based image registration (FIR) and correction using real-time hardware-based motion tracking (HMT) information. However, FIR cannot correct for motion within one predefined scan period, while HMT is not readily available in the clinic since it typically requires attaching a tracking device to the patient. In this study, we propose a motion correction framework with a data-driven algorithm, i.e., using the PET raw data itself, to address these limitations. Methods: We propose a data-driven algorithm, Centroid of Distribution (COD), to detect head motion. In COD, the central coordinates of the line of response (LOR) of all events are averaged over 1-sec intervals to generate a COD trace. A point-to-point change in the COD trace in one direction that exceeded a user-defined threshold was defined as a time point of head motion, which was followed by manually adding additional motion time points. All the frames defined by such time points were reconstructed without attenuation correction and rigidly registered to a reference frame. The resulting transformation matrices were then used to perform the final motion-compensated reconstruction. We applied the new COD framework to 23 human dynamic datasets, all containing large head motions, with 18F-FDG (N = 13) and 11C-UCB-J (N = 10), and compared its performance with FIR and with HMT using the Vicra, which can be considered the "gold standard". 
Results: The COD method yielded 1.0±3.2% (mean ± standard deviation across all subjects and 12 grey matter regions) SUV difference for 18F-FDG (3.7±5.4% for 11C-UCB-J) compared to HMT while no motion correction (NMC) and FIR yielded -15.7±12.2% (-20.5±15.8%) and -4.7±6.9% (-6.2±11.0%), respectively. For 18F-FDG dynamic studies, COD yielded differences of 3.6±10.9% in Ki value as compared to HMT, while NMC and FIR yielded -18.0±39.2% and -2.6±19.8%, respectively. For 11C-UCB-J, COD yielded 3.7±5.2% differences in VT compared to HMT, while NMC and FIR yielded -20.0±12.5% and -5.3±9.4%, respectively. Conclusion: The proposed COD-based data-driven motion correction method outperformed FIR and achieved comparable or even better performance as compared to the Vicra HMT method in both static and dynamic studies. Full Article
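The COD detection step described in this abstract reduces to two small operations: binning event midpoints into 1-second averages, then flagging jumps in the resulting trace. The Python sketch below is an illustrative reimplementation of that idea only, not the authors' code; the event arrays, the binning interval, and the motion threshold are hypothetical inputs chosen for demonstration.

```python
import numpy as np

def cod_trace(event_times, event_midpoints, interval=1.0):
    """Average the LOR central coordinates over fixed time intervals
    to generate a centroid-of-distribution (COD) trace."""
    edges = np.arange(0.0, event_times.max() + interval, interval)
    bins = np.digitize(event_times, edges) - 1
    n_bins = len(edges) - 1
    trace = np.full((n_bins, 3), np.nan)
    for b in range(n_bins):
        sel = bins == b
        if sel.any():
            trace[b] = event_midpoints[sel].mean(axis=0)
    return trace

def detect_motion(trace, threshold=1.0):
    """Flag time points where the point-to-point COD change in any one
    direction exceeds the threshold (candidate motion boundaries)."""
    diffs = np.abs(np.diff(trace, axis=0))
    return np.where((diffs > threshold).any(axis=1))[0] + 1
```

The abstract's framework then reconstructs the frames between flagged time points without attenuation correction and rigidly registers them to a reference frame; that registration step is outside this sketch.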
data Clinical evaluation of a data-driven respiratory gating algorithm for whole-body positron emission tomography with continuous bed motion By jnm.snmjournals.org Published On :: 2020-02-14T14:01:21-08:00 Respiratory gating is the standard to overcome respiration effects degrading image quality in positron emission tomography (PET). Data-driven gating (DDG) using signals derived from PET raw data is a promising alternative to gating approaches requiring additional hardware. However, continuous bed motion (CBM) scans require dedicated DDG approaches for axially-extended PET, compared to DDG for conventional step-and-shoot scans. In this study, a CBM-capable DDG algorithm was investigated in a clinical cohort, comparing it to hardware-based gating using gated and fully motion-corrected reconstructions. Methods: 56 patients with suspected malignancies in thorax or abdomen underwent whole-body 18F-FDG CBM-PET/CT imaging using DDG and hardware-based respiratory gating (pressure-sensitive belt gating, BG). Correlation analyses were performed on both gating signals. Besides static reconstructions, BG and DDG were used for optimally-gated PET (BG-OG, DDG-OG) and fully motion-corrected PET (elastic motion correction; BG-EMOCO, DDG-EMOCO). Metabolic volumes, SUVmax and SUVmean of lesions were compared amongst the reconstructions. Additionally, the quality of lesion delineation in different PET reconstructions was independently evaluated by three experts. Results: Global correlation coefficients between BG and DDG signals amounted to 0.48±0.11, peaking at 0.89±0.07 when scanning the kidney and liver region. In total, 196 lesions were analyzed. SUV measurements were significantly higher in BG-OG, DDG-OG, BG-EMOCO and DDG-EMOCO compared to static images (P<0.001; median SUVmax: static, 14.3±13.4; BG-EMOCO, 19.8±15.7; DDG-EMOCO, 20.5±15.6; BG-OG, 19.6±17.1; DDG-OG, 18.9±16.6). 
No significant differences were found between BG-OG and DDG-OG, or between BG-EMOCO and DDG-EMOCO. Visual lesion delineation was significantly better in BG-EMOCO and DDG-EMOCO than in static reconstructions (P<0.001); no significant difference was found comparing BG and DDG (EMOCO and OG, respectively). Conclusion: DDG-based motion compensation of CBM-PET acquisitions outperforms static reconstructions, delivering image quality comparable to hardware-based approaches. The new algorithm may be a valuable alternative for CBM-PET systems. Full Article
data Moving towards multicenter therapeutic trials in ALS: feasibility of data pooling using different TSPO positron emission tomography (PET) radioligands. By jnm.snmjournals.org Published On :: 2020-04-03T15:14:37-07:00 Rationale: Neuroinflammation has been implicated in Amyotrophic Lateral Sclerosis (ALS) and can be visualized using translocator protein (TSPO) radioligands. To become a reliable pharmacodynamic biomarker for ALS multicenter trials, some challenges have to be overcome. We aimed to investigate whether multicenter data pooling of different TSPO tracers (11C-PBR28 and 18F-DPA714) is feasible, after validation of an established 11C-PBR28 PET pseudoreference analysis technique for 18F-DPA714. Methods: 7 ALS-Belgium (58.9±6.7 years, 5M) and 8 HV-Belgium (52.1±15.2 years, 3M); and 7 ALS-US (53.4±9.8 years, 5M) and 7 HV-US (54.6±9.6 years, 4M) from a previously published study (1) underwent dynamic 18F-DPA714 (Leuven, Belgium) or 11C-PBR28 (Boston, US) PET-MR scans. For 18F-DPA714, volume of distribution (VT) maps were compared to standardized uptake value ratios (SUVR)40-60 calculated using the pseudoreference regions (1) cerebellum, (2) occipital cortex, and (3) whole brain without ventricles (WB-ventricles). Also for 11C-PBR28, SUVR60-90 using WB-ventricles were calculated. Results: In line with previous studies, increased 18F-DPA714 uptake (17.0±5.6%) in primary motor cortices was observed in ALS, as measured by both VT and SUVR40-60 approaches. Highest sensitivity was found for SUVRWB-ventricles (average cluster 21.6±0.1%). 18F-DPA714 VT ratio and SUVR40-60 results were highly correlated (r>0.8, p<0.001). A similar pattern of increased uptake (average cluster 20.5±0.5%) in primary motor cortices was observed in ALS with 11C-PBR28 using the SUVRWB-ventricles. Analysis of the 18F-DPA714 and 11C-PBR28 data together resulted in a more extensive pattern of significantly increased glial activation in the bilateral primary motor cortices. 
Conclusion: The same pseudoreference region analysis technique used for 11C-PBR28 PET imaging can be extended to 18F-DPA714 PET. Therefore, in ALS, standardized analysis across these two tracers enables pooling of TSPO PET data across multiple centers and increases the power of TSPO as a biomarker for future therapeutic trials. Full Article
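The pseudoreference quantification behind this pooling approach reduces to a simple ratio: mean uptake in a target region divided by mean uptake in a reference region that is assumed unaffected by disease. The sketch below is a minimal illustration under that assumption, not the study's pipeline; the mask arguments and toy values are hypothetical, with "whole brain without ventricles" taken from the abstract as one example pseudoreference region.

```python
import numpy as np

def suvr(uptake, target_mask, pseudoref_mask):
    """Standardized uptake value ratio: mean uptake in the target region
    (e.g. primary motor cortex) divided by mean uptake in a pseudoreference
    region (e.g. whole brain without ventricles)."""
    return uptake[target_mask].mean() / uptake[pseudoref_mask].mean()
```

Because the ratio cancels global scaling differences between scans, a common pseudoreference definition is what makes SUVRs from different tracers and sites comparable, which is the basis of the multicenter pooling proposed here.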
data Data Driven Respiratory Gating Outperforms Device-Based Gating for Clinical FDG PET/CT By jnm.snmjournals.org Published On :: 2020-04-03T15:14:37-07:00 A data-driven method for respiratory gating in PET has recently been commercially developed. We sought to compare the performance of the algorithm to an external, device-based system for oncological [18F]-FDG PET/CT imaging. Methods: 144 whole-body [18F]-FDG PET/CT examinations were acquired using a Discovery D690 or D710 PET/CT scanner (GE Healthcare), with a respiratory gating waveform recorded by an external, device-based respiratory gating system. In each examination, two of the bed positions covering the liver and lung bases were acquired with a duration of 6 minutes. Quiescent-period gating retaining ~50% of coincidences was then able to produce images with an effective duration of 3 minutes for these two bed positions, matching the other bed positions. For each exam, 4 reconstructions were performed and compared: data-driven gating (DDG-retro), external device-based gating (RPM Gated), no gating but using only the first 3 minutes of data (Ungated Matched), and no gating retaining all coincidences (Ungated Full). Lesions in the images were quantified and image quality was scored by a radiologist, blinded to the method of data processing. Results: The use of DDG-retro was found to increase SUVmax and to decrease the threshold-defined lesion volume in comparison to each of the other reconstruction options. Compared to RPM-gated, DDG-retro gave an average increase in SUVmax of 0.66 ± 0.1 g/mL (n=87, p<0.0005). Although results from the blinded image evaluation were most commonly equivalent, DDG-retro was preferred over RPM gated in 13% of exams while the opposite occurred in just 2% of exams. This was a significant preference for DDG-retro (p=0.008, n=121). Liver lesions were identified in 23 exams. Considering this subset of data, DDG-retro was ranked superior to Ungated Full in 6/23 (26%) of cases. 
Gated reconstruction using the external device failed in 16% of exams, while DDG-retro always provided a clinically acceptable image. Conclusion: In this clinical evaluation, the data-driven respiratory gating technique provided superior performance compared to the external device-based system. For the majority of exams the performance was equivalent, but data-driven respiratory gating had superior performance in 13% of exams, leading to a significant preference overall. Full Article
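Quiescent-period gating, as used in the study above, keeps only the coincidences recorded during the low-motion part of the respiratory cycle. The fragment below is a schematic sketch under the assumption that lower waveform amplitude corresponds to the quiescent end-expiration phase; the ~50% retention target comes from the abstract, and the per-event amplitude array is a hypothetical input.

```python
import numpy as np

def quiescent_gate(amplitudes, keep_fraction=0.5):
    """Return a boolean mask selecting events whose respiratory amplitude
    falls below the quantile that retains roughly `keep_fraction` of
    coincidences (low amplitude = quiescent phase, by assumption)."""
    cutoff = np.quantile(amplitudes, keep_fraction)
    return amplitudes <= cutoff
```

Retaining ~50% of coincidences is what lets a 6-minute gated bed position match the effective 3-minute duration of the ungated bed positions, as the abstract describes.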
data Reshaping the amyloid buildup curve in Alzheimer's disease? Partial-volume effect correction of longitudinal amyloid PET data By jnm.snmjournals.org Published On :: 2020-05-01T11:16:57-07:00 It has been hypothesized that the brain β-amyloid buildup curve plateaus at an early symptomatic Alzheimer's disease (AD) stage. Atrophy-related partial-volume effects (PVEs) degrade signal in hot-spot imaging techniques such as amyloid positron emission tomography (PET). This longitudinal analysis of amyloid-sensitive PET data investigated the shape of the β-amyloid curve in AD by applying PVE correction (PVEC). We analyzed baseline and 2-year follow-up data of 216 symptomatic individuals on the AD continuum (positive amyloid status) enrolled in the Alzheimer's Disease Neuroimaging Initiative (17 AD dementia, 199 mild cognitive impairment), including 18F-florbetapir PET, magnetic resonance imaging, and Mini-Mental State Examination (MMSE) scores. For PVEC, the modified Müller-Gärtner method was applied. Compared with non-PVE-corrected data, PVE-corrected data yielded significantly higher regional and composite standardized uptake value ratio (SUVR) changes over time (P=0.0002 for composite SUVRs). Longitudinal SUVR changes in relation to MMSE decreases showed a significantly higher slope of the regression line in the PVE-corrected than in the non-PVE-corrected PET data (F=7.1, P=0.008). These PVEC results indicate that the β-amyloid buildup curve does not plateau at an early symptomatic disease stage. Further evaluation of the impact of PVEC on the in vivo characterization of time-dependent AD pathology, including the reliable assessment and comparison of other amyloid tracers, is warranted. Full Article
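The Müller-Gärtner correction referenced above compensates grey-matter PET signal for atrophy-related spill-in and spill-out. The 1D sketch below illustrates only the basic two-compartment idea, not the full modified method used in the study; the kernel stands in for the scanner point-spread function, and `wm_value` is an assumed white-matter activity estimate:

```python
import numpy as np

def muller_gartner_1d(pet, gm_mask, wm_mask, wm_value, kernel, gm_thresh=0.3):
    """Two-compartment (grey/white matter) PVE correction on a 1D profile:
    subtract white-matter spill-in, then divide by the smoothed grey-matter
    fraction to undo spill-out. Returns NaN outside reliable grey matter."""
    def smooth(m):
        # Convolution with the PSF kernel models resolution blurring.
        return np.convolve(np.asarray(m, dtype=float), kernel, mode="same")
    gm_frac = smooth(gm_mask)
    wm_frac = smooth(wm_mask)
    out = np.full(np.shape(pet), np.nan)
    sel = gm_frac > gm_thresh  # only correct where grey matter dominates
    out[sel] = (pet[sel] - wm_value * wm_frac[sel]) / gm_frac[sel]
    return out
```

On a synthetic profile built with the same kernel, this recovers the true grey-matter activity exactly, which is the sense in which PVEC "restores" signal lost to atrophy.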
data Combining Precursor and Fragment Information for Improved Detection of Differential Abundance in Data-Independent Acquisition [Technological Innovation and Resources] By feedproxy.google.com Published On :: 2020-02-01T00:05:30-08:00 In bottom-up, label-free discovery proteomics, biological samples are acquired in a data-dependent (DDA) or data-independent (DIA) manner, with peptide signals recorded in intact (MS1) and fragmented (MS2) form. While DDA offers only the MS1 space for quantification, DIA provides both MS1 and MS2 at high quantitative quality. DIA profiles of complex biological matrices such as tissues or cells can contain quantitative interferences, and the interferences in the MS1 and MS2 signals are often independent. When comparing biological conditions, these interferences can compromise the detection of differential peptide or protein abundance and lead to false positive or false negative conclusions. We hypothesized that the combined use of MS1 and MS2 quantitative signals could improve the ability to detect differentially abundant proteins. We therefore developed a statistical procedure incorporating both the MS1 and the MS2 quantitative information of DIA. We benchmarked the performance of the MS1-MS2-combined method against the individual use of MS1 or MS2 in DIA using four previously published controlled mixtures, as well as two previously unpublished controlled mixtures. In the majority of comparisons, the combined method outperformed the individual use of MS1 or MS2. This was particularly true for comparisons with low fold changes, few replicates, and situations where MS1 and MS2 were of similar quality. When applied to a previously unpublished investigation of lung cancer, the MS1-MS2-combined method increased the coverage of known activated pathways. Since recent technological developments continue to increase the quality of MS1 signals (e.g., using the BoxCar scan mode on Orbitrap instruments), the combination of MS1 and MS2 information has high potential for future statistical analysis of DIA data. Full Article
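The intuition behind combining MS1 and MS2 can be shown with a toy stand-in for the paper's statistical procedure: if the two levels give independent estimates of the same log fold change, inverse-variance pooling yields a combined estimate with a smaller standard error than either alone, which is exactly what helps at low fold changes and few replicates. (The function name and the pooling scheme here are illustrative assumptions, not the authors' actual model.)

```python
import numpy as np

def pool_log_fc(fc1, se1, fc2, se2):
    """Inverse-variance weighted pooling of two independent log-fold-change
    estimates (e.g. one from MS1, one from MS2) and its standard error."""
    w1, w2 = 1.0 / se1 ** 2, 1.0 / se2 ** 2
    fc = (w1 * fc1 + w2 * fc2) / (w1 + w2)  # precision-weighted mean
    se = np.sqrt(1.0 / (w1 + w2))           # always <= min(se1, se2)
    return fc, se
```

When MS1 and MS2 are of similar quality (se1 ≈ se2), the pooled standard error shrinks by about a factor of √2, matching the paper's observation that the combined method helps most in that regime.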
data The Data Must Be Accessible to All [Editorials] By feedproxy.google.com Published On :: 2020-04-01T00:05:32-07:00 Full Article
data Open government data to public use, and Australia may start to catch up with the world By www.smh.com.au Published On :: Mon, 02 Nov 2015 13:15:00 GMT Public servants need to ditch the control and encourage entrepreneurship. Full Article
data Delayed Australian data breach notification bill lands By www.smh.com.au Published On :: Fri, 04 Dec 2015 04:27:12 GMT Australians will be informed of certain breaches of their personal information under new laws proposed by the Turnbull government, but only if the breached company or organisation turns over $3 million in revenue a year. Full Article
data Australian public service failing to share information: Public Sector Data Management report By www.smh.com.au Published On :: Tue, 08 Dec 2015 02:59:08 GMT A report has revealed stunning examples of public service inefficiency when it comes to releasing and managing data. Full Article
data Pro sport and big data: coaches may be more in favour than athletes By www.smh.com.au Published On :: Mon, 14 Dec 2015 13:00:00 GMT Professional sport is still working out how to tackle big data and understand how technology can assist elite athletes, according to top-level sports officials in the United States. Full Article
data ACT government defends seeking access to Canberrans' metadata By www.smh.com.au Published On :: Sun, 31 Jan 2016 13:00:00 GMT The ACT government has defended its right to seek access to Canberrans' private phone and internet records without a warrant. Full Article
data Ricochet uses power of the dark web to help journalists, sources dodge metadata laws By www.smh.com.au Published On :: Fri, 19 Feb 2016 04:19:51 GMT A new internet messaging tool that sidesteps the federal government's metadata collection regime to help journalists protect whistleblowers and assist human rights activists has received a tick of approval from security experts. Full Article
data Can the government really protect your privacy when it 'de-identifies' public data? By www.smh.com.au Published On :: Mon, 05 Dec 2016 12:45:00 GMT We don't really know how to use big data and protect personal information at the same time. Full Article
data How federal government departments are protecting Australians' data against cyber hack By www.smh.com.au Published On :: Mon, 15 May 2017 10:09:02 GMT Cyber Security Minister Dan Tehan says the government can't rule out vulnerabilities to cyber threats. Full Article
data Medicare details available on dark web is just tip of data breach iceberg By www.smh.com.au Published On :: Mon, 17 Jul 2017 14:00:00 GMT The next wave of government reform will have to focus on data management. Full Article
data Construction of mega new IT data storage centre under way in Fyshwick By www.smh.com.au Published On :: Sun, 27 Aug 2017 14:00:00 GMT Fyshwick is set to get another massive IT data storage facility from 2018. Full Article
data Privacy Commissioner’s small budget to make policing new data breach laws difficult, experts say By www.smh.com.au Published On :: Fri, 23 Feb 2018 01:13:02 GMT New laws that mandate companies notify individuals about data breaches add to Privacy Commissioner's already-stacked caseload, but do not come with new funding. Full Article