Here is a tweet describing a problem with social media recommendation systems:
"The algorithm I worked on at Google recommended Alex Jones' videos more than 15,000,000,000 times, to some of the most vulnerable people in the nation." - @gchaslot
What should the penalty for that be? A fine? One large enough for the Foundation to hire all my Google Summer of Code students to add pronunciation remediation to Wiktionary?
If you think that's bad, consider that most of the damage from recommendation systems comes from the vanity of fame rather than from political schemers. Almost all post-Myspace social media has had a bias towards usually undeserved fame. Luckily, that damage is merely memetic and can be repaired with literature. But the schemers turn into fraud cases, so they get more attention than they deserve relative to the larger, general problem.
Best regards, Jim
On Wed, Feb 21, 2018 at 11:27 PM, James Salsman jsalsman@gmail.com wrote:
Here is a good example of instructional software to solve a systemic communication issue:
Ref.: https://www.sciencedaily.com/releases/2018/02/180220093555.htm
How can we sustain progress towards resolving these issues?
Also, does the date by which Titan is likely to be colonized correlate with the extent of that progress? This is not the first time I have asked this question here, and I hope the answer is as clear to everyone else as it is to me: they are inversely correlated.
I wonder if the Foundation could afford to have David Attenborough narrate the interaction between Cambridge Analytica and Cambridge University. It could if it started investing in unskimmable endowment funds. Make donors' money work hard, with screening for sustainability.
Please bring back the regular email to donors suggesting other organizations worthy of their money, and tell them how to avoid being skimmed by high-frequency traders, too.
Best regards, Jim