https://bugzilla.wikimedia.org/show_bug.cgi?id=55036
--- Comment #9 from Kunal Mehta (Legoktm) legoktm.wikipedia@gmail.com --- When you are plowing through 700 wikis, even simple tasks become difficult. A compiled report would let me know which wikis to notify that human intervention is necessary, on which pages, and could denote the type of intervention needed. The prepared report could be language specific, so es.Wikipedia would get a report in Spanish, de.Wikipedia would get a report in German, etc.
Protected redirects are a particular problem: they look like something bots can fix, but they can't, because bots are unable to edit protected pages. This is not one of your examples. Special:DoubleRedirects makes no distinction for this type of problem.
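For illustration only, here is a rough sketch of how a report could separate out the protected entries so they get routed to admins instead of a bot. It assumes Python with the requests library, uses es.wikipedia as an example wiki, and the function names are made up; the API parameters (list=querypage with qppage=DoubleRedirects, and prop=info with inprop=protection) are standard MediaWiki API calls.

    import requests

    API = "https://es.wikipedia.org/w/api.php"  # assumed example wiki

    def double_redirects(limit=50):
        # Fetch entries from Special:DoubleRedirects via list=querypage.
        params = {"action": "query", "list": "querypage",
                  "qppage": "DoubleRedirects", "qplimit": limit,
                  "format": "json", "formatversion": 2}
        data = requests.get(API, params=params).json()
        return [row["title"] for row in data["query"]["querypage"]["results"]]

    def edit_protected(titles):
        # Check protection status so protected redirects can be flagged
        # for human intervention rather than handed to the bot.
        params = {"action": "query", "prop": "info", "inprop": "protection",
                  "titles": "|".join(titles), "format": "json",
                  "formatversion": 2}
        data = requests.get(API, params=params).json()
        return {p["title"] for p in data["query"]["pages"]
                if any(pr["type"] == "edit" for pr in p.get("protection", []))}

    titles = double_redirects()
    print("Needs human intervention:", sorted(edit_protected(titles)))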
Redirect loops may involve more than 2 pages. Among 200 entries such a loop can be difficult to spot.
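A small sketch (not actual bot code, the page titles are hypothetical) of how a report could detect loops of any length, rather than only the two-page case that is easy to spot by eye:

    def find_loop(start, targets):
        # Walk the redirect chain from 'start'; 'targets' maps each
        # redirect title to the title it points at.
        seen = []
        page = start
        while page in targets:
            if page in seen:
                # Loop found; return the pages that form the cycle.
                return seen[seen.index(page):]
            seen.append(page)
            page = targets[page]
        return None  # chain ends at a normal article

    # Hypothetical example: a three-page loop plus a redirect leading into it.
    redirects = {"A": "B", "B": "C", "C": "A", "D": "A"}
    for title in redirects:
        loop = find_loop(title, redirects)
        if loop:
            print(title, "reaches the loop:", " -> ".join(loop))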
Also, while how to deal with redirect loops is obvious to you and me, admins in local communities are often more than uneasy about dealing with an issue they are not familiar with.