Open Weaver and Citizen Digital Foundation come together for a bootcamp on Responsible AI

Open Weaver, in partnership with Citizen Digital Foundation, is organizing a bootcamp on ‘Responsible AI’ and ‘Building an AI-powered Fake News Detection Engine’. The bootcamp, scheduled for March 3, 2023, will cover tech governance and its significance in responsible application development.

The keynote, ‘Techtonic Shifts & AI Governance’, will address the growing concern that AI systems can perpetuate and amplify biases in our society. It will highlight the importance of governance in ensuring that AI systems are developed, deployed, and used responsibly, taking into account the social, cultural, and historical contexts in which they operate. The talk will explore the complex and multifaceted issues that get fed into AI, often exacerbating existing gender, racial, caste, religious and ableist discrimination, and the need for new-age developers and businesses to consider preventive and reactive methods to mitigate potential harms.

Following the keynote and an interactive Q&A session with students, we will proceed to a bootcamp on ‘Building a Fake News Detection Engine’.

Fake news has existed since the advent of the printing press, but in today’s age of the internet and social media, it has become a serious threat to our peaceful existence as a society. All of us collectively endured the ill effects of fake news during the COVID pandemic: from magical cures for COVID to misinformation about the availability of medicines and hospital beds, we were flooded with fake news in the form of WhatsApp forwards, video clips, news articles and more.

Such fake news can have disastrous effects on everything ranging from our local elections and national politics to climate change and the global economy.
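The bootcamp’s actual engine is not described in this announcement, but the general idea behind text-based fake news detection can be illustrated with a classifier trained on labelled headlines. Below is a minimal, self-contained Naive Bayes sketch in Python; the toy training examples and labels are invented purely for illustration and are not from the bootcamp material.

```python
# Minimal bag-of-words Naive Bayes sketch for flagging suspect headlines.
# The tiny training set below is invented for illustration only.
import math
from collections import Counter, defaultdict

TRAIN = [
    ("miracle herb cures covid overnight doctors stunned", "fake"),
    ("forward this message to unlock free hospital beds", "fake"),
    ("secret video proves election was rigged share now", "fake"),
    ("health ministry releases weekly vaccination statistics", "real"),
    ("city council approves new budget for public transport", "real"),
    ("researchers publish peer reviewed study on air quality", "real"),
]

def train(examples):
    """Count per-label word frequencies for Naive Bayes with add-one smoothing."""
    word_counts = defaultdict(Counter)
    label_counts = Counter()
    vocab = set()
    for text, label in examples:
        words = text.split()
        word_counts[label].update(words)
        label_counts[label] += 1
        vocab.update(words)
    return word_counts, label_counts, vocab

def classify(text, word_counts, label_counts, vocab):
    """Return the label with the highest smoothed log-probability for the text."""
    total = sum(label_counts.values())
    best_label, best_score = None, float("-inf")
    for label in label_counts:
        score = math.log(label_counts[label] / total)          # prior
        denom = sum(word_counts[label].values()) + len(vocab)  # smoothing denominator
        for word in text.split():
            score += math.log((word_counts[label][word] + 1) / denom)
        if score > best_score:
            best_label, best_score = label, score
    return best_label

word_counts, label_counts, vocab = train(TRAIN)
print(classify("miracle cures doctors stunned share now",
               word_counts, label_counts, vocab))  # → fake
```

A production system would of course use a far larger labelled corpus, richer features, and fact-checking signals beyond word counts; this sketch only shows the classification skeleton.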

Nidhi Sudan, Co-Founder, Citizen Digital Foundation, said, “We choose doctors based on their qualifications, their expertise and experience. Further trust comes from word of mouth, accuracy of their diagnosis, the quality and efficacy of the treatment, and how well we feel at the end of it. When it comes to technologies, we implicitly trust the processes, because the average user has no means to see and understand how personalised, intuitive solutions work. That alone places immense fiduciary responsibility on creators of new technologies to factor in accountable, transparent, trustworthy processes that proactively prevent harm. Learnings from fallouts of extractive technologies in the last decade should drive embedded responsible governance in technological development.”

For more details and to watch the recording, visit Responsible AI | Build a Fake News Detection Engine Bootcamp - Live Sessions & Events - Open Weaver