Lichtman Blames Bad Election Prediction on Conservative Media 'Disinformation' and Elon Musk
By www.breitbart.com | Published: Thu, 14 Nov 2024 01:59:28 +0000
Historian and political scientist Allan Lichtman said Tuesday on NewsNation's "Cuomo" that conservative media disinformation and billionaire Elon Musk were why his election prediction that Vice President Harris would win the presidency was incorrect.
Filed under: Clips, Politics, 2024 Presidential Election, Allan Lichtman, Donald Trump, Kamala Harris
'Nostradamus' Of US Polls Blames Incorrect Prediction On Elon Musk And "Explosion" Of Disinformation
By www.ndtv.com | Published: Thu, 14 Nov 2024 07:32:09 +0530
Allan Lichtman, often dubbed the "Nostradamus" of US presidential polls, is blaming the "explosion" of disinformation and billionaire Elon Musk for his incorrect prediction that Kamala Harris would win the 2024 election.
Government Condemns Genocide Disinformation Efforts
By allafrica.com | Published: Wed, 13 Nov 2024 10:05:13 GMT
[SAnews.gov.za] The Government of South Africa has condemned the spreading of disinformation concerning its case before the International Court of Justice (ICJ).
Filed under: External Relations, Human Rights, Middle East and Africa, South Africa, Southern Africa
Disinformation enabled Donald Trump's second term and is a crisis for democracies everywhere
By www.bmj.com | Published: 2024-11-12T04:06:24-08:00
Donald Trump did not win the 2020 election, but asserting that he did became a prerequisite for Republicans standing for nomination to Congress or the Senate to win their primaries. An entire party became a vehicle for disinformation.[1] Trump did win the 2024 presidential election, and key to that victory was building on the success of that lie. If you control enough of the information ecosystem, truth no longer matters.

Another telling example: Haitian migrants in Springfield, Ohio, are not eating cats and dogs. US vice-president-elect JD Vance, the source of that claim, admitted as much even as he justified it. "If I have to create stories so that the American media actually pays attention to the suffering of the American people, then that's what I'm going to do," he said.[2]

Disinformation in politics is nothing new. History is replete with claims that were fabricated to advance political aims. Although...
Navigating the world of disinformation, deepfakes and AI-generated deception [Book Review]
By betanews.com | Published: Mon, 11 Nov 2024 11:48:17 +0000
Online scams aren't anything new, but thanks to artificial intelligence they're becoming more sophisticated and harder to detect. We've also seen a rise in disinformation and deepfakes, many of them made possible, or at least more plausible, by AI. This means that venturing onto the internet is increasingly like negotiating a digital minefield. With FAIK, Perry Carpenter, a risk management specialist at KnowBe4, sets out to dissect what makes these threats work and the motivations behind them, as well as offering some strategies to protect yourself. This is no dry technical guide, though; it's all presented in a very readable style…
Filed under: Article, Artificial Intelligence (AI), Book, cybersecurity, Deepfake
Quality of Informing: Bias and Disinformation Philosophical Background and Roots
On the Difference or Equality of Information, Misinformation, and Disinformation: A Critical Research Perspective
Stress disorder in an age of media disinformation
By tribune.com.pk | Published: Fri, 08 Sep 23 20:55:39 +0500
Constant bombardment of 'bad news' can easily make individuals desensitised and disillusioned.
Filed under: Opinion
China, Russia target Pentagon agency with disinformation campaign
By www.washingtontimes.com | Published: Wed, 23 Oct 2024 17:07:18 -0400
China and Russia have joined forces in an aggressive disinformation campaign involving both covert and overt operations designed to discredit a key Pentagon agency.
Australia plans huge fines if big tech fails to tackle disinformation
By cio.economictimes.indiatimes.com | Published: Mon, 26 Jun 2023 12:48:51 +0530
Under the proposed legislation, the owners of platforms like Facebook, Google, Twitter, TikTok and podcasting services would face penalties worth up to five per cent of annual global turnover, some of the highest proposed anywhere in the world.
Donate and Subscribe to your Local News Outlet, Fight Disinformation
By www.24-7pressrelease.com | Published: Wed, 28 Aug 2024 08:00:00 GMT
The United States' 2024 elections are heating up, but please also pay attention to your local news outlets if you live in parts of the country where mayoral, county executive and governors' races are taking place.
Accenture invests in Reality Defender to fight deepfake extortion, fraud, and disinformation
By www.kmworld.com | Published: Wed, 23 Oct 2024 12:00:46 EST
The investment in Reality Defender demonstrates Accenture's strong commitment to helping clients confidently navigate the genAI-driven threat landscape, mitigate financial fraud, and maintain the integrity of their digital communications.
DHS disinformation board's work, plans remain a mystery
By federalnewsnetwork.com | Published: Thu, 05 May 2022 11:36:23 +0000
A newly formed Disinformation Governance Board remains shrouded in secrecy a week after the Biden administration's announcement of the new effort was met with widespread criticism.
Filed under: Congress, Defense, Government News, Management, Media News, Workforce, Alejandro Mayorkas, Department of Homeland Security, disinformation, Russia, social media
MeFi: Elon Musk's Perfect Disinformation Machine
By www.metafilter.com | Published: Tue, 12 Nov 2024 15:02:42 GMT
Hidden changes are warping the already broken ecosystem that determines how many citizens, particularly in America, construct their sense of reality. (slSubstack)

Y'all, I know there are folks who don't like Substack/NYT links, but here we are and I thought this was worth posting in the aftermath of a disastrous election.
Danielle Smith and disinformation; scented candle reviews as COVID indicator; a surgeon in Tigray and more
By www.cbc.ca | Published: Fri, 21 Oct 2022 18:44:37 EDT
What Danielle Smith posted on her subscribers-only social media; how litter boxes in schools became a Republican talking point; Yankee scented candle reviews as COVID indicator; a surgeon struggles to care for patients through Ethiopia's civil war; Brent Bambury returns and more.
Filed under: Radio/Day 6
Online Disinformation and Political Discourse: Applying a Human Rights Framework
By www.chathamhouse.org | Published: Tue, 05 Nov 2019 11:03:02 +0000
Research paper, 5 November 2019. Although some digital platforms now have an impact on more people's lives than does any one state authority, the international community has been slow to hold these platforms' activities to account by reference to human rights law. This paper examines how human rights frameworks should guide digital technology.

Photo caption: A man votes in Manhattan, New York City, during the US elections on 8 November 2016 (Getty Images).

Summary:
- Online political campaigning techniques are distorting our democratic political processes. These techniques include the creation of disinformation and divisive content; exploiting digital platforms' algorithms, and using bots, cyborgs and fake accounts to distribute this content; maximizing influence through harnessing emotional responses such as anger and disgust; and micro-targeting on the basis of collated personal data and sophisticated psychological profiling techniques. Some state authorities distort political debate by restricting, filtering, shutting down or censoring online networks.
- Such techniques have outpaced regulatory initiatives and, save in egregious cases such as shutdown of networks, there is no international consensus on how they should be tackled. Digital platforms, driven by their commercial impetus to encourage users to spend as long as possible on them and to attract advertisers, may provide an environment conducive to manipulative techniques.
- International human rights law, with its careful calibrations designed to protect individuals from abuse of power by authority, provides a normative framework that should underpin responses to online disinformation and distortion of political debate. Contrary to popular view, it does not entail that there should be no control of the online environment; rather, controls should balance the interests at stake appropriately.
- The rights to freedom of thought and opinion are critical to delimiting the appropriate boundary between legitimate influence and illegitimate manipulation. When digital platforms exploit decision-making biases in prioritizing bad news and divisive, emotion-arousing information, they may be breaching these rights. States and digital platforms should consider structural changes to digital platforms to ensure that methods of online political discourse respect personal agency and prevent the use of sophisticated manipulative techniques.
- The right to privacy includes a right to choose not to divulge your personal information, and a right to opt out of trading in and profiling on the basis of your personal data. Current practices in collecting, trading and using extensive personal data to 'micro-target' voters without their knowledge are not consistent with this right. Significant changes are needed.
- Data protection laws should be implemented robustly, and should not legitimate extensive harvesting of personal data on the basis of either notional 'consent' or the data handler's commercial interests. The right to privacy should be embedded in technological design (such as by allowing the user to access all information held on them at the click of a button); and political parties should be transparent in their collection and use of personal data, and in their targeting of messages. Arguably, the value of personal data should be shared with the individuals from whom it derives.
- The rules on the boundaries of permissible content online should be set by states, and should be consistent with the right to freedom of expression. Digital platforms have had to rapidly develop policies on retention or removal of content, but those policies do not necessarily reflect the right to freedom of expression, and platforms are currently not well placed to take account of the public interest. Platforms should be far more transparent in their content regulation policies and decision-making, and should develop frameworks enabling efficient, fair, consistent internal complaints and content monitoring processes. Expertise on international human rights law should be integral to their systems.
- The right to participate in public affairs and to vote includes the right to engage in public debate. States and digital platforms should ensure an environment in which all can participate in debate online and are not discouraged from standing for election, from participating or from voting by online threats or abuse.
Thinking out loud: Is disinformation here to stay?
By www.chathamhouse.org | Published: Tue, 25 Oct 2022 12:52:14 +0000
Event: 10 November 2022, 6:00pm to 6:45pm, Chatham House. This event is postponed.

Have you ever wondered how Chatham House researchers approach the big deals that become research? Do you enjoy meeting other Chatham House members and engaging with questions that open your mind? 'Thinking Out Loud' invites a small group of members to a live, unscripted discussion with a Chatham House researcher. This in-person event is a way for researchers and members to think out loud to help shape ideas for future research.

Kate Jones, Associate Fellow, International Law Programme at Chatham House, will pose some key questions facing how speech is governed in an online world: How has big tech influenced the way we think about speech and its limitations? Can disinformation be eliminated or even greatly reduced? Where should the responsibilities fall between government and business when it comes to speech regulation? What might the information landscape look like in 10 years' time? Should that affect how we tackle disinformation today?

As with all members events, questions from the audience drive the conversation.
What can governments do about online disinformation from abroad?
By www.newscientist.com | Published: Tue, 27 Aug 2024 15:25:19 +0100
A cyberterrorism charge in Pakistan connected to riots in the UK illustrates how authorities are reaching across borders to tackle disinformation, but bringing overseas suspects to justice won't always be possible.
Which AI chatbot is best at avoiding disinformation?
By www.newscientist.com | Published: Wed, 02 Oct 2024 23:00:35 +0100
AI chatbots from Google and Microsoft sometimes parrot disinformation when answering questions about Russia's invasion of Ukraine, but their performance depends on language and changes over time.
Politics of Disinformation: The Influence of Fake News on the Public Sphere [electronic book]. Edited by Guillermo López-García, Dolors Palau-Sampio, Bella Palomo, Eva Campos-Domínguez, Pere Masip and Ramon Llull. Hoboken, NJ: John Wiley & Sons, Inc., 2021. (Catalogue record: encore.st-andrews.ac.uk)
The Malfunction of US Education Policy: Elite Misinformation, Disinformation, and Selfishness. Richard P. Phelps. Lanham: Rowman & Littlefield, 2023. (Catalogue record: darius.uleth.ca)
Social Media in 2020: A Year of Misinformation and Disinformation
By www.rss-specifications.com | Published: Mon, 22 Feb 2021 17:30:47 -0500
One of the most important things to learn is that misinformation, and especially disinformation, do not just affect us cognitively; they affect us emotionally. They often trigger an emotional response. They can make us outraged or angry, and self-righteous. And this can activate us to do something, like press the share button. Tuning in to our emotions, to how information makes us feel, can help us recognize when someone might be trying to manipulate us into becoming part of their disinformation campaign.
Disinformation, AI and 'cyber chakravyuh'
By www.thehindu.com | Published: Tue, 13 Aug 2024 00:59:00 +0530
This year may well be the one when the world confronts a cornucopia of security threats.
Filed under: Lead
'Chaotic disaster': Obama hits Trump's coronavirus response, warns of disinformation ahead of election
By www.nbcnews.com | Published: Sat, 09 May 2020 19:33:47 GMT
The former president was also critical of the Justice Department directing prosecutors to drop their case against Michael Flynn, warning that the "rule of law is at risk."
Why Fake Video, Audio May Not Be As Powerful In Spreading Disinformation As Feared
By feeds.scpr.org | Published: Thu, 07 May 2020 04:00:18 -0700
Photo caption: "Deepfakes" are digitally altered images that make incidents appear real when they are not. Such altered files could have broad implications for politics. (Marcus Marritt for NPR)

Philip Ewing | NPR

Sophisticated fake media hasn't emerged as a factor in the disinformation wars in the ways once feared, and two specialists say it may have missed its moment.

Deceptive video and audio recordings, often nicknamed "deepfakes," have been the subject of sustained attention by legislators and technologists, but so far have not been employed to decisive effect, said two panelists at a video conference convened on Wednesday by NATO. One speaker borrowed Sherlock Holmes' reasoning about the significance of something that didn't happen.

"We've already passed the stage at which they would have been most effective," said Keir Giles, a Russia specialist with the Conflict Studies Research Centre in the United Kingdom. "They're the dog that never barked."

The perils of deepfakes in political interference have been discussed too often and many people have become too familiar with them, Giles said during the online discussion, hosted by NATO's Strategic Communications Centre of Excellence. Following all the reports and revelations about election interference in the West since 2016, citizens know too much to be hoodwinked in the way a fake video might once have fooled large numbers of people, he argued: "They no longer have the power to shock."

Tim Hwang, director of the Harvard-MIT Ethics and Governance of AI Initiative, agreed that deepfakes haven't proven as dangerous as once feared, although for different reasons. Hwang argued that users of "active measures" (efforts to sow misinformation and influence public opinion) can be much more effective with cheaper, simpler and just as devious types of fakes: mis-captioning a photo or turning it into a meme, for example.

Influence specialists working for Russia and other governments also imitate Americans on Facebook, for another example, worming their way into real Americans' political activities to amplify disagreements or, in some cases, try to persuade people not to vote. Other researchers have suggested this work continues on social networks and has become more difficult to detect.

Defense is stronger than attack

Hwang also observed that the more deepfakes are made, the better machine learning becomes at detecting them. A very sophisticated, real-looking fake video might still be effective in a political context, he acknowledged, and at a cost of around $10,000 to create, it would be easily within the means of a government's active measures specialists. But the risks of attempting a major disruption with such a video may outweigh an adversary's desire to use one. People may be too media literate, as Giles argued, and the technology to detect a fake may mean it can be deflated too swiftly to have an effect, as Hwang said. "I tend to be skeptical these will have a large-scale impact over time," he said.

One technology boss told NPR in an interview last year that years' worth of work on corporate fraud protection systems has given an edge to detecting fake media. "This is not a static field. Obviously, on our end we've performed all sorts of great advances over this year in advancing our technology, but these synthetic voices are advancing at a rapid pace," said Brett Beranek, head of security business for the technology firm Nuance. "So we need to keep up."

Beranek described how systems developed to detect telephone fraudsters could be applied to verify the speech in a fake clip of video or audio. Corporate clients that rely on telephone voice systems must be wary of people attempting to pose as others with artificial or disguised voices. Beranek's company sells a product that helps to detect them, and that countermeasure also works well in detecting fake audio or video. Machines using neural networks can detect known types of synthetic voices. Nuance also says it can analyze a recording of a real known voice (say, that of a politician) and then contrast its characteristics against a suspicious recording.

Although the world of cybersecurity is often described as one in which attackers generally have an edge over defenders, Beranek said he thought the inverse was true in terms of this kind of fraud detection. "For the technology today, the defense side is significantly ahead of the attack side," he said.

Shaping the battlefield

Hwang and Giles acknowledged in the NATO video conference that deepfakes likely will proliferate and become lower in cost to create, perhaps becoming simple enough to make with a smartphone app. One prospective response is the creation of more of what Hwang called "radioactive data": material earmarked in advance so that a fake incorporating it is easier to detect. If images of a political figure were so tagged beforehand, they could be spotted quickly if they were incorporated by computers into a deceptive video.

Also, the sheer popularity of new fakes, if that is what happens, might make them less valuable as a disinformation weapon. More people could become more familiar with them, as well as being detectable by automated systems, plus they may also have no popular medium on which to spread. Big social media platforms already have declared affirmatively that they'll take down deceptive fakes, Hwang observed. That might make it more difficult for a scenario in which a politically charged fake video went viral just before Election Day.

"Although it might get easier and easier to create deepfakes, a lot of the places where they might spread most effectively, your Facebooks and Twitters of the world, are getting a lot more aggressive about taking them down," Hwang said. That won't stop them, but it might mean they'll be relegated to sites with too few users to have a major effect, he said. "They'll percolate in these more shady areas."

Copyright 2020 NPR. This content is from Southern California Public Radio; view the original story at SCPR.org.
Vote-by-mail debate raises fears of election disinformation
By www.seattletimes.com | Published: Tue, 05 May 2020 08:34:47 -0700
WASHINGTON (AP) — A bitterly partisan debate unfolding on whether more Americans should cast their votes through the mail during a pandemic is provoking online disinformation and conspiracy theories that could undermine trust in the results, even if there are no major problems. With social distancing guidelines possibly curtailing in-person voting at the polls in […]
Filed under: Local Politics, Nation, Nation & World, Politics
Fake news isn't new: Modern disinformation uses centuries-old techniques, author says
By www.cbc.ca | Published: Fri, 4 Oct 2019 14:51:46 EDT
Author Heidi Tworek says we can learn from media manipulation's long history to understand how disinformation functions now.
Filed under: Radio/Spark
'We Roar': Graduate alum Ali Nouri fights COVID-19 disinformation as Federation of American Scientists' president
By www.princeton.edu | Published: Tue, 28 Apr 2020 11:06:00 -0400
Ali Nouri, a 2006 Princeton graduate alumnus and president of the Federation of American Scientists, is the latest guest on the "We Roar" podcast.
Op-Ed: As coronavirus cases multiply, so does government disinformation
By www.latimes.com | Published: Sat, 11 Apr 2020 15:00:33 -0400
A graph of the spread of fake news (conspiracy theories, propaganda and disinformation) would likely run parallel to that of the coronavirus itself.
Webinar: Russian Disinformation's Golden Moment: Challenges and Responses in the COVID-19 Era
By feedproxy.google.com | Published: Tue, 21 Apr 2020 23:55:01 +0000
Invitation-only research event, 7 May 2020, 3:00pm to 4:30pm.
Speakers: Anneli Ahonen, Head, StratCom East Task Force, European External Action Service; Keir Giles, Senior Consulting Fellow, Russia and Eurasia Programme, Chatham House; Thomas Kent, Adjunct Associate Professor, Harriman Institute, Columbia University, and Senior Fellow, the Jamestown Foundation.
Chairs: James Nixey, Programme Director, Russia and Eurasia, Chatham House; Glen Howard, President, The Jamestown Foundation.

The COVID-19 pandemic provides the ideal environment for malign influence to thrive, as it feeds on fear and a vacuum of authoritative information. What are the current challenges posed by Russian disinformation, and how should Western nations be responding? In this discussion, jointly hosted by the Jamestown Foundation and the Chatham House Russia and Eurasia Programme, the speakers will consider what best practice looks like in safeguarding Western societies against the pernicious effects of disinformation. This event will be held on the record.
Department/project: Russia and Eurasia Programme, Russia's Domestic Politics
Tackling Cyber Disinformation in Elections: Applying International Human Rights Law
By feedproxy.google.com | Published: Wed, 18 Sep 2019 10:30:02 +0000
Research event: 6 November 2019, 5:30pm to 7:00pm, Chatham House, 10 St James's Square, London SW1Y 4LE.
Speakers: Susie Alegre, Barrister and Associate Tenant, Doughty Street Chambers; Evelyn Aswad, Professor of Law and the Herman G. Kaiser Chair in International Law, University of Oklahoma; Barbora Bukovská, Senior Director for Law and Policy, Article 19; Kate Jones, Director, Diplomatic Studies Programme, University of Oxford.
Chair: Harriet Moynihan, Associate Fellow, International Law Programme, Chatham House.

Cyber operations are increasingly used by political parties, their supporters and foreign states to influence electorates, from algorithms promoting specific messages to micro-targeting based on personal data and the creation of filter bubbles. The risks of digital tools spreading disinformation and polarizing debate, as opposed to deepening democratic engagement, have been highlighted by concerns over cyber interference in the UK's Brexit referendum, the 2016 US presidential elections and in Ukraine.

While some governments are adopting legislation in an attempt to address some of these issues, for example Germany's 'NetzDG' law and France's 'Law against the manipulation of information', other countries have proposed an independent regulator, as in the case of the UK's Online Harms white paper. Meanwhile, the digital platforms, as the curators of content, are under increasing pressure to take their own measures to address data mining and manipulation in the context of elections.

How do international human rights standards, for example on freedom of thought, expression and privacy, guide the use of digital technology in the electoral context? What practical steps can governments and technology actors take to ensure policies, laws and practices are in line with these fundamental standards? And with a general election looming in the UK, will these steps come soon enough?

This event brings together a wide range of stakeholders including civil society, the tech sector, legal experts and government, and coincides with the publication of a Chatham House research paper on disinformation, elections and the human rights framework.
Department/project: International Law Programme; Cyber, Sovereignty and Human Rights; Rights, Accountability and Justice
Online Disinformation and Political Discourse: Applying a Human Rights Framework
By feedproxy.google.com | Published: Tue, 05 Nov 2019 11:03:02 +0000
Research paper, 6 November 2019, by Kate Jones, Associate Fellow, International Law Programme. Although some digital platforms now have an impact on more people's lives than does any one state authority, the international community has been slow to hold these platforms' activities to account by reference to human rights law. This paper examines how human rights frameworks should guide digital technology.

Summary:
- Online political campaigning techniques are distorting our democratic political processes. These techniques include the creation of disinformation and divisive content; exploiting digital platforms' algorithms, and using bots, cyborgs and fake accounts to distribute this content; maximizing influence through harnessing emotional responses such as anger and disgust; and micro-targeting on the basis of collated personal data and sophisticated psychological profiling techniques. Some state authorities distort political debate by restricting, filtering, shutting down or censoring online networks.
- Such techniques have outpaced regulatory initiatives and, save in egregious cases such as shutdown of networks, there is no international consensus on how they should be tackled. Digital platforms, driven by their commercial impetus to encourage users to spend as long as possible on them and to attract advertisers, may provide an environment conducive to manipulative techniques.
- International human rights law, with its careful calibrations designed to protect individuals from abuse of power by authority, provides a normative framework that should underpin responses to online disinformation and distortion of political debate. Contrary to popular view, it does not entail that there should be no control of the online environment; rather, controls should balance the interests at stake appropriately.
- The rights to freedom of thought and opinion are critical to delimiting the appropriate boundary between legitimate influence and illegitimate manipulation. When digital platforms exploit decision-making biases in prioritizing bad news and divisive, emotion-arousing information, they may be breaching these rights. States and digital platforms should consider structural changes to digital platforms to ensure that methods of online political discourse respect personal agency and prevent the use of sophisticated manipulative techniques.
- The right to privacy includes a right to choose not to divulge your personal information, and a right to opt out of trading in and profiling on the basis of your personal data. Current practices in collecting, trading and using extensive personal data to 'micro-target' voters without their knowledge are not consistent with this right. Significant changes are needed.
- Data protection laws should be implemented robustly, and should not legitimate extensive harvesting of personal data on the basis of either notional 'consent' or the data handler's commercial interests. The right to privacy should be embedded in technological design (such as by allowing the user to access all information held on them at the click of a button); and political parties should be transparent in their collection and use of personal data, and in their targeting of messages. Arguably, the value of personal data should be shared with the individuals from whom it derives.
- The rules on the boundaries of permissible content online should be set by states, and should be consistent with the right to freedom of expression. Digital platforms have had to rapidly develop policies on retention or removal of content, but those policies do not necessarily reflect the right to freedom of expression, and platforms are currently not well placed to take account of the public interest. Platforms should be far more transparent in their content regulation policies and decision-making, and should develop frameworks enabling efficient, fair, consistent internal complaints and content monitoring processes. Expertise on international human rights law should be integral to their systems.
- The right to participate in public affairs and to vote includes the right to engage in public debate. States and digital platforms should ensure an environment in which all can participate in debate online and are not discouraged from standing for election, from participating or from voting by online threats or abuse.
Department/project: International Law Programme; Cyber, Sovereignty and Human Rights; Rights, Accountability and Justice
disinformation EU–US Cooperation on Tackling Disinformation By feedproxy.google.com Published On :: Tue, 17 Sep 2019 15:23:58 +0000 3 October 2019 Disinformation, as the latest iteration of propaganda suitable for a digitally interconnected world, shows no signs of abating. This paper provides a holistic overview of the current state of play and outlines how EU and US cooperation can mitigate disinformation in the future. Read online Download PDF Sophia Ignatidou Academy Associate, International Security Programme @SophiaIgnatidou LinkedIn A congressional staff member displays printouts of social media posts during a hearing before the House Select Intelligence Committee, 1 November 2017, in Washington, DC. Photo: Getty Images. EU and US cooperation on tackling disinformation needs to be grounded in an international human rights framework in order to bridge the differences between the two parties and include other countries facing this challenge. The disinformation debate needs to be reformulated to cover systemic issues rather than merely technical or security concerns. A lag in regulatory development has led to systemic vulnerabilities. In this context, policymakers need to push for more evidence-based analysis, which is only attainable if technology companies engage in honest debate and allow meaningful access to data – as determined by government-appointed researchers rather than the companies themselves – while taking into account and respecting users' privacy. Data governance needs to be the focus of attempts to tackle disinformation: data's implications for information, market and power asymmetries feed into and exacerbate the problem. Policymakers should focus on regulating the distribution of online content rather than the subject matter itself, since regulating the latter may have implications for freedom of speech. Disinformation is largely the result of inefficient gatekeeping by highly extractive digital companies. 
The old gatekeepers, journalists and their respective regulators, need to be actively engaged in devising the new regulatory framework. Legacy media need urgently to consider the issue of 'strategic silence' and avoid being co-opted by political actors aiming to manipulate the accelerated, reactive news cycle by engaging in divisive 'clickbait' rhetoric verging on disinformation and propaganda. When strategic silence is not an option, contextual analysis is fundamental. The EU delegation should assist the coordination of EU–US efforts to tackle disinformation by drawing on the work and expertise of the G7 Rapid Response Mechanism (RRM), the Transatlantic Commission on Election Integrity (TCEI), the European Centre of Excellence for Countering Hybrid Threats (Hybrid CoE) and the High-level Panel on Digital Cooperation, and by working with the International Telecommunication Union (ITU) to foster a long-term interdisciplinary forum that harnesses technological innovation to protect democracy from threats such as disinformation. The EU and US must avoid rushed regulation that may condone enhanced surveillance or vilify, in the name of security, journalism that scrutinizes those in power. Department/project International Security Programme, Internet Governance Full Article
disinformation Webinar: Russian Disinformation's Golden Moment: Challenges and Responses in the COVID-19 Era By feedproxy.google.com Published On :: Tue, 21 Apr 2020 23:55:01 +0000 Invitation Only Research Event 7 May 2020 - 3:00pm to 4:30pm Event participants: Anneli Ahonen, Head, StratCom East Task Force, European External Action Service; Keir Giles, Senior Consulting Fellow, Russia and Eurasia Programme, Chatham House; Thomas Kent, Adjunct Associate Professor, Harriman Institute, Columbia University, and Senior Fellow, the Jamestown Foundation. Chairs: James Nixey, Programme Director, Russia and Eurasia, Chatham House; Glen Howard, President, The Jamestown Foundation. The COVID-19 pandemic provides the ideal environment for malign influence to thrive, as it feeds on fear and a vacuum of authoritative information. What are the current challenges posed by Russian disinformation, and how should Western nations be responding? In this discussion, jointly hosted by the Jamestown Foundation and the Chatham House Russia and Eurasia Programme, the speakers will consider what best practice looks like in safeguarding Western societies against the pernicious effects of disinformation. This event will be held on the record. Department/project Russia and Eurasia Programme, Russia's Domestic Politics Anna Morgan Administrator, Ukraine Forum +44 (0)20 7389 3274 Email Full Article
disinformation Tackling Cyber Disinformation in Elections: Applying International Human Rights Law By feedproxy.google.com Published On :: Wed, 18 Sep 2019 10:30:02 +0000 Research Event 6 November 2019 - 5:30pm to 7:00pm Chatham House | 10 St James's Square | London | SW1Y 4LE Event participants: Susie Alegre, Barrister and Associate Tenant, Doughty Street Chambers; Evelyn Aswad, Professor of Law and the Herman G. Kaiser Chair in International Law, University of Oklahoma; Barbora Bukovská, Senior Director for Law and Policy, Article 19; Kate Jones, Director, Diplomatic Studies Programme, University of Oxford. Chair: Harriet Moynihan, Associate Fellow, International Law Programme, Chatham House. Cyber operations are increasingly used by political parties, their supporters and foreign states to influence electorates – from algorithms promoting specific messages to micro-targeting based on personal data and the creation of filter bubbles. The risks of digital tools spreading disinformation and polarizing debate, as opposed to deepening democratic engagement, have been highlighted by concerns over cyber interference in the UK's Brexit referendum, the 2016 US presidential elections and in Ukraine. While some governments are adopting legislation in an attempt to address some of these issues, for example Germany's 'NetzDG' law and France's 'Law against the manipulation of information', other countries have proposed an independent regulator, as in the case of the UK's Online Harms white paper. Meanwhile, the digital platforms, as the curators of content, are under increasing pressure to take their own measures to address data mining and manipulation in the context of elections. How do international human rights standards, for example on freedom of thought, expression and privacy, guide the use of digital technology in the electoral context? 
What practical steps can governments and technology actors take to ensure policies, laws and practices are in line with these fundamental standards? And with a general election looming in the UK, will these steps come soon enough? This event brings together a wide range of stakeholders, including civil society, the tech sector, legal experts and government, and coincides with the publication of a Chatham House research paper on disinformation, elections and the human rights framework. Department/project International Law Programme, Cyber, Sovereignty and Human Rights, Rights, Accountability and Justice Jacqueline Rowe Programme Assistant, International Law Programme 020 7389 3287 Email Full Article
disinformation Online Disinformation and Political Discourse: Applying a Human Rights Framework By feedproxy.google.com Published On :: Tue, 05 Nov 2019 11:03:02 +0000 6 November 2019 Although some digital platforms now have an impact on more people's lives than does any one state authority, the international community has been slow to hold these platforms to account under human rights law. This paper examines how human rights frameworks should guide digital technology. Download PDF Kate Jones Associate Fellow, International Law Programme @katejones77 LinkedIn A man votes in Manhattan, New York City, during the US elections on 8 November 2016. Photo: Getty Images. Summary: Online political campaigning techniques are distorting our democratic political processes. These techniques include the creation of disinformation and divisive content; exploiting digital platforms' algorithms, and using bots, cyborgs and fake accounts to distribute this content; maximizing influence by harnessing emotional responses such as anger and disgust; and micro-targeting on the basis of collated personal data and sophisticated psychological profiling techniques. Some state authorities distort political debate by restricting, filtering, shutting down or censoring online networks. Such techniques have outpaced regulatory initiatives and, save in egregious cases such as the shutdown of networks, there is no international consensus on how they should be tackled. Digital platforms, driven by their commercial impetus to encourage users to spend as long as possible on them and to attract advertisers, may provide an environment conducive to manipulative techniques. International human rights law, with its careful calibrations designed to protect individuals from abuse of power by authority, provides a normative framework that should underpin responses to online disinformation and distortion of political debate. 
Contrary to the popular view, it does not entail that there should be no control of the online environment; rather, controls should balance the interests at stake appropriately. The rights to freedom of thought and opinion are critical to delimiting the appropriate boundary between legitimate influence and illegitimate manipulation. When digital platforms exploit decision-making biases by prioritizing bad news and divisive, emotion-arousing information, they may be breaching these rights. States and digital platforms should consider structural changes to digital platforms to ensure that methods of online political discourse respect personal agency and prevent the use of sophisticated manipulative techniques. The right to privacy includes a right to choose not to divulge your personal information, and a right to opt out of trading in and profiling on the basis of your personal data. Current practices in collecting, trading and using extensive personal data to 'micro-target' voters without their knowledge are not consistent with this right. Significant changes are needed. Data protection laws should be implemented robustly, and should not legitimate extensive harvesting of personal data on the basis of either notional 'consent' or the data handler's commercial interests. The right to privacy should be embedded in technological design (such as by allowing the user to access all information held on them at the click of a button); and political parties should be transparent in their collection and use of personal data, and in their targeting of messages. Arguably, the value of personal data should be shared with the individuals from whom it derives. The rules on the boundaries of permissible content online should be set by states, and should be consistent with the right to freedom of expression. 
Digital platforms have had to rapidly develop policies on retention or removal of content, but those policies do not necessarily reflect the right to freedom of expression, and platforms are currently not well placed to take account of the public interest. Platforms should be far more transparent in their content regulation policies and decision-making, and should develop frameworks enabling efficient, fair, consistent internal complaints and content monitoring processes. Expertise on international human rights law should be integral to their systems. The right to participate in public affairs and to vote includes the right to engage in public debate. States and digital platforms should ensure an environment in which all can participate in debate online and are not discouraged from standing for election, from participating or from voting by online threats or abuse. Department/project International Law Programme, Cyber, Sovereignty and Human Rights, Rights, Accountability and Justice Full Article
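The paper's suggestion that privacy be embedded in technological design – for instance, letting users retrieve all information held on them at the click of a button – amounts in practice to a subject-access export. A minimal sketch of the idea (all data, names and fields here are hypothetical, not any platform's real API):

```python
import json
from datetime import date

# Hypothetical per-user records a platform might hold, grouped by category.
USER_DATA = {
    "u123": {
        "profile": {"name": "Sample User", "joined": "2018-05-01"},
        "inferred_interests": ["politics", "cycling"],
        "ad_targeting_segments": ["urban-25-34"],
    }
}

def export_user_data(user_id: str) -> str:
    """Return every record held on a user as one machine-readable JSON document."""
    records = USER_DATA.get(user_id, {})
    return json.dumps(
        {"user_id": user_id, "exported_on": date.today().isoformat(), "data": records},
        indent=2,
    )
```

The design point is that the export covers inferred and derived data (interests, ad segments), not just what the user typed in, since those categories are exactly what micro-targeting relies on.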
disinformation Far-Right Spreads COVID-19 Disinformation Epidemic Online By www.technewsworld.com Published On :: 2020-05-05T10:14:48-07:00 Far-right groups and individuals in the United States are exploiting the COVID-19 pandemic to promote disinformation, hate, extremism and authoritarianism. "COVID-19 has been seized by far-right groups as an opportunity to call for extreme violence," states a report from ISD, based on a combination of natural language processing, network analysis and ethnographic online research. Full Article
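The network-analysis side of the methodology the ISD report describes can be illustrated with a toy version of one common technique: flagging pairs of accounts that post identical text, a crude signal of copy-paste amplification. This is only an illustrative sketch with made-up data, not the report's actual method:

```python
from collections import defaultdict
from itertools import combinations

# Toy dataset: (account, message text) pairs. Identical texts shared by
# several accounts are a crude signal of coordinated amplification.
posts = [
    ("acct_a", "the truth they won't tell you #wakeup"),
    ("acct_b", "the truth they won't tell you #wakeup"),
    ("acct_c", "the truth they won't tell you #wakeup"),
    ("acct_d", "nice weather today"),
]

def coordination_pairs(posts):
    """Group accounts by identical message text; return co-posting account pairs."""
    by_text = defaultdict(set)
    for account, text in posts:
        by_text[text].add(account)
    pairs = set()
    for accounts in by_text.values():
        pairs.update(combinations(sorted(accounts), 2))  # edges of a co-posting graph
    return pairs
```

Real investigations combine many weak signals like this one (shared URLs, posting-time correlation, text similarity rather than exact matches) before drawing conclusions; a single match proves nothing.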
disinformation Russian Disinformation Ongoing Problem, Says FBI Chief By packetstormsecurity.com Published On :: Thu, 06 Feb 2020 17:27:36 GMT Full Article headline government usa russia fraud cyberwar facebook social fbi
disinformation [Ticker] 'Significant weaknesses' on EU disinformation approach By euobserver.com Published On :: Tue, 05 May 2020 10:03:20 +0200 A new report from the European Regulators Group for Audiovisual Media Services (ERGA) on the implementation of the EU Commission's 2018 code of practice on disinformation reveals "significant weaknesses" linked to a lack of transparency and the code's voluntary approach. ERGA proposes shifting from the current flexible self-regulatory approach to a co-regulatory one. The code targeted companies such as Google, Facebook and Twitter. Full Article
disinformation Disinformation and the digital wild west By www.eversheds.com Published On :: 2019-12-02 The digital modern age offers huge potential – the sharing of new technologies, the promotion of economic growth and the ability for instant communication. But whilst products of this age, such as social media, may help to progress today's ... Full Article
disinformation Press Freedom and Tackling Disinformation in the COVID-19 context By www.ipsnews.net Published On :: Mon, 04 May 2020 06:29:44 +0000 ONLINE HIGH-LEVEL DIALOGUE Moderated by Jorge Ramos (journalist & author, Univision) 4 May 2020, 17:00-18:15 CET (GMT+2) / 11:00am-12:15pm EDT JOIN LIVE: http://en.unesco.org/commemorations/worldpressfreedomday The post Press Freedom and Tackling Disinformation in the COVID-19 context appeared first on Inter Press Service. Full Article Press Freedom
disinformation Microsoft's AI Research Draws Controversy Over Possible Disinformation Use By feedproxy.google.com Published On :: Mon, 04 Nov 2019 21:00:00 GMT Microsoft's AI could enable its popular chatbot to comment on news, but critics see a tool for spreading disinformation Full Article robotics robotics/artificial-intelligence
disinformation Why Fake Video, Audio May Not Be As Powerful In Spreading Disinformation As Feared By www.npr.org Published On :: Thu, 07 May 2020 05:00:25 -0400 "Deepfakes" have received a lot of attention as a way to potentially spread misleading or false information and influence public opinion. But two specialists say that might not be a huge concern. Full Article
disinformation Podcast: Camille François on COVID-19 and the ABCs of disinformation By webfeeds.brookings.edu Published On :: Tue, 28 Apr 2020 23:42:33 +0000 Camille François is a leading investigator of disinformation campaigns and author of the well-known "ABC" or "Actor-Behavior-Content" disinformation framework, which has informed how many of the biggest tech companies tackle disinformation on their platforms. Here, she speaks with Lawfare's Quinta Jurecic and Evelyn Douek for that site's series on disinformation, "Arbiters of Truth." Earlier this… Full Article
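François's "ABC" framework separates a campaign into its manipulative Actor, deceptive Behavior and harmful Content, so that a response can target whichever element is actually objectionable. The framework is analytical rather than software, but as a rough illustration it can be modeled as a simple record (all field values below are invented examples):

```python
from dataclasses import dataclass

@dataclass
class DisinfoAssessment:
    """Toy record following the Actor-Behavior-Content framing."""
    actor: str     # who is behind the campaign, e.g. a state agency or for-profit firm
    behavior: str  # deceptive technique, e.g. fake accounts or bot amplification
    content: str   # what the messaging actually claims

    def summary(self) -> str:
        return f"Actor: {self.actor} | Behavior: {self.behavior} | Content: {self.content}"

# Invented example case for illustration.
case = DisinfoAssessment(
    actor="unattributed network",
    behavior="coordinated fake accounts",
    content="false health claims",
)
```

Keeping the three axes distinct matters in practice: a platform can act against deceptive behavior (fake accounts) without ever ruling on whether the content itself is true.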
disinformation Trends in online disinformation campaigns By webfeeds.brookings.edu Published On :: Fri, 08 May 2020 22:23:23 +0000 Ben Nimmo, director of investigations at Graphika, discusses two main trends in online disinformation campaigns: the decline of large scale, state-sponsored operations and the rise of small scale, homegrown copycats. Full Article