Kiwi News

Quite a lot about tech in Amnesty’s annual report. Some quotes:

“Big Tech’s surveillance business model is pouring fuel on this fire of hate, enabling those with malintent to hound, dehumanize and amplify dangerous narratives to consolidate power or polling.”

“Over the past year the rapid trajectory of generative AI has transformed the scale of the threat posed by the gamut of technologies already in existence – from spyware to state automation and social media’s runaway algorithms.”

“There is a vast chasm between the risks posed by the unchecked advancement of technologies and where we need to be in terms of regulation and protection. It’s our future foretold, and it will only worsen unless the rampant proliferation of unregulated technology is curtailed.”

“Amnesty International exposed how Facebook’s algorithms contributed to ethnic violence in Ethiopia in the context of armed conflict. This is a prime example of how technology is weaponized to pit communities against each other, particularly in times of instability. The human rights organization forecasts that these problems will escalate in a landmark election year, with the surveillance-based business model underpinning major social media platforms such as Facebook, Instagram, TikTok and YouTube acting as a catalyst for human rights violations in the context of elections.”

“We’ve seen how hate, discrimination and disinformation are amplified and spread by social media algorithms optimized to maximize ‘engagement’ above all else. They create an endless and dangerous feedback loop, particularly at times of heightened political sensitivity. Tools can generate synthetic images, audio and video in seconds, as well as target specific audience groups at scale, but electoral regulation has yet to catch up with this threat. To date we’ve seen too much talk with too little action,” said Agnès Callamard.

“Politicians have long used manipulation of ‘us vs. them’ narratives to win votes and outmanoeuvre legitimate questions about economic and security fears. We’ve seen how unregulated technologies, such as facial recognition, have been used to entrench discrimination. Coupled with this, Big Tech’s surveillance business model is pouring fuel on this fire of hate, enabling those with malintent to hound, dehumanize and amplify dangerous narratives to consolidate power or polling. It’s a chilling spectre of what’s to come as technological advances rapaciously outpace accountability,” said Agnès Callamard.
