Hi Ziski, 

Done voting. Good luck to the panel. 

Vanj D. Padilla, CDPO, JD
Founding Chairperson
Shared Knowledge Asia Pacific (SKAP)

"if your dreams do not scare you, it means they are not big enough."


On Thu, Aug 8, 2024 at 1:17 PM Franziska Putz <fputz@wikimedia.org> wrote:

Hi friends,

I hope you're all enjoying Wikimania, online or onsite! If you're in Katowice, please come say hi to me. 

I'm sharing this message because we need to crowdsource some WikiLove for an important policy event. As many of you know, explaining the Wikimedia model to policy stakeholders is a key priority for our movement. We've identified a BIG stage where we'd like to do this: the South by Southwest 2025 [1] conference in Austin, Texas. They have a unique selection process, and we need your help to win a popularity contest.

How you can help: We submitted a panel proposal (details below the line) titled "How can fictional futures help us strengthen truth and facts online?". The panels that receive the most votes are the ones selected. Voting is open from now until August 18th. If there's one thing we're really good at, it's crowdsourcing. Please vote for our panel here [2], and help us explain the model to this audience.

Grateful if you could also share this with your friends in the movement and beyond!

Best,

Ziski

[1] https://www.sxsw.com/

[2] https://panelpicker.sxsw.com/vote/151603

_________

The panel, "How can fictional futures help us strengthen truth and facts online?", will bring together speakers from the worlds of tech, fiction, and journalism to talk about how we can create a future where reliable, well-sourced, and human-verified information is not only flourishing but also increasingly trusted by the public. The speakers include:

  • Sewell Chan (Moderator) – Editor-in-Chief, Texas Tribune 

  • Karen Lord – Award-winning science fiction author 

  • Stephen Harrison – Contributing Editor, Slate

  • Rebecca MacKinnon – Vice President, Global Advocacy, Wikimedia Foundation

As part of these discussions, we will raise awareness of how Wikipedia's human-led content moderation model serves as an antidote to mis- and disinformation. As artificial intelligence becomes increasingly prevalent, Wikipedia's human approach to verifying content matters more than ever. Our teams have been collectively emphasizing this in our external messaging to help educate the media, lawmakers, and the wider public about why this model is important for maintaining information integrity, and why it should be protected.


Franziska Putz (she/her)

Senior Movement Advocacy Manager

Global Advocacy, Wikimedia Foundation

Fputz@wikimedia.org

UTC Timezone
