When will the database downloads be refreshed at download.wikimedia.org?
Currently, the last refresh was 16 days ago (January 7, 2005). The related wiki
page says refreshes will occur approximately twice a week. Has this schedule now
changed?
Hi,
I propose to set up basic antispam / antivirus checks on the mailing list
server. Administering the mailing lists has become a burden because of spam
and viruses.
I propose several steps:
* first, block mail from invalid domains and mail carrying viruses. This
takes very little server resource, but should remove a large part of the
unwanted traffic.
* if that is not enough, set up dedicated antispam filtering with
SpamAssassin to block most remaining unwanted mail. This will require more
server resources.
Below are the rules I propose to implement for the first step. They are the
rules I have used myself for the last two years.
========================
/etc/postfix/main.cf:
smtpd_helo_required = yes
disable_vrfy_command = yes
unknown_address_reject_code = 554
unknown_client_reject_code = 554
unknown_hostname_reject_code = 554
# Default: not needed
smtpd_recipient_restrictions =
    reject_invalid_hostname,
    reject_non_fqdn_hostname,
    reject_non_fqdn_sender,
    reject_non_fqdn_recipient,
    reject_unknown_sender_domain,
    reject_unknown_recipient_domain,
    reject_unauth_pipelining,
    permit_mynetworks,
    reject_unauth_destination,
    reject_rbl_client relays.ordb.org,
    reject_rbl_client opm.blitzed.org,
    reject_rbl_client list.dsbl.org,
    reject_rbl_client sbl.spamhaus.org,
    reject_rbl_client cbl.abuseat.org,
    reject_rbl_client dul.dnsbl.sorbs.net,
    reject_rbl_client blackholes.easynet.nl,
    reject_rbl_client proxies.blackholes.wirehub.net,
    reject_rbl_client bl.spamcop.net,
    reject_rbl_client dnsbl.njabl.org,
    permit
smtpd_client_restrictions =
    permit_mynetworks,
    reject_unknown_client,
    reject_unknown_sender_domain,
    reject_non_fqdn_sender,
    reject_invalid_hostname,
    reject_non_fqdn_hostname,
    reject_non_fqdn_recipient,
    reject_unknown_recipient_domain,
    reject_unauth_pipelining,
    reject_unauth_destination,
    reject_rbl_client relays.ordb.org,
    reject_rbl_client opm.blitzed.org,
    reject_rbl_client list.dsbl.org,
    reject_rbl_client sbl.spamhaus.org,
    reject_rbl_client cbl.abuseat.org,
    reject_rbl_client dul.dnsbl.sorbs.net,
    reject_rbl_client blackholes.easynet.nl,
    reject_rbl_client proxies.blackholes.wirehub.net,
    reject_rbl_client bl.spamcop.net,
    reject_rbl_client dnsbl.njabl.org,
    permit
smtpd_helo_restrictions =
    permit_mynetworks,
    reject_invalid_hostname,
    reject_unknown_hostname,
    reject_non_fqdn_hostname
smtpd_sender_restrictions =
    reject_unknown_sender_domain,
    reject_non_fqdn_sender,
    reject_invalid_hostname,
    reject_non_fqdn_hostname,
    reject_non_fqdn_recipient,
    reject_unknown_recipient_domain,
    reject_unauth_pipelining,
    permit_mynetworks,
    reject_unauth_destination,
    reject_rbl_client relays.ordb.org,
    reject_rbl_client opm.blitzed.org,
    reject_rbl_client list.dsbl.org,
    reject_rbl_client sbl.spamhaus.org,
    reject_rbl_client cbl.abuseat.org,
    reject_rbl_client dul.dnsbl.sorbs.net,
    reject_rbl_client blackholes.easynet.nl,
    reject_rbl_client proxies.blackholes.wirehub.net,
    reject_rbl_client bl.spamcop.net,
    reject_rbl_client dnsbl.njabl.org,
    permit
header_checks = regexp:/etc/postfix/header_checks
mime_header_checks = regexp:/etc/postfix/mime_header_checks
==========
/etc/postfix/mime_header_checks:
/.*name=".*\.(exe|pif|zip|scr|com|dat|vbs)"/ REJECT
==========
/etc/postfix/header_checks: (new rules could be added as needed)
/Subject:.*Hydrocodone.*/ REJECT
/Subject:.*Valium.*/ REJECT
/Subject:.*Vicodin.*/ REJECT
/Subject:.*Pharmacy.*/ REJECT
/Subject:.*Xanax.*/ REJECT
/Subject:.*Rolex.*/ REJECT
/Subject:.*VIAGRA.*/ REJECT
/Subject:.*Network Critical Update.*/ REJECT
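Note that Postfix regexp tables match case-insensitively by default, so /Subject:.*VIAGRA.*/ also catches lower-case variants. As a quick illustration (not how Postfix itself evaluates the tables), the same patterns can be exercised with equivalent case-insensitive regular expressions:

```javascript
// Illustrative re-implementation of a few of the header_checks patterns
// above. Postfix regexp: tables are case-insensitive by default, hence /i.
const headerChecks = [
  /Subject:.*Hydrocodone.*/i,
  /Subject:.*Rolex.*/i,
  /Subject:.*VIAGRA.*/i,
];

// Attachment-name check from mime_header_checks (same flag).
const mimeCheck = /.*name=".*\.(exe|pif|zip|scr|com|dat|vbs)"/i;

function wouldReject(headerLine) {
  return headerChecks.some((re) => re.test(headerLine)) ||
         mimeCheck.test(headerLine);
}

console.log(wouldReject('Subject: cheap viagra here'));                        // true
console.log(wouldReject('Content-Disposition: attachment; name="setup.exe"')); // true
console.log(wouldReject('Subject: server maintenance tonight'));               // false
```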
Regards,
Yann
--
http://www.non-violence.org/ | Collaborative site on non-violence
http://www.forget-me.net/ | Alternatives on the Net
http://fr.wikipedia.org/ | The free encyclopedia
http://www.forget-me.net/pro/ | Linux training and services
Hi ...
The Graphviz plugin now supports links inside the graphs.
You can view more info at
http://www.wickle.com/wiki/index.php/Graphviz_extension
regards
Thanks to Tels for the suggestion ;))
--
Problems with Windows - Reboot.
Problems with Linux - Be root.
Vic (aKa CoffMan)
vic(a)wickle.com
wickle(a)gmail.com
I'd like the possibility of showing the intro paragraph of a linked
article when the mouse hovers over it.
How about something along the lines of:
http://quatramaran.ens.fr/~monniaux/wikilinks/pagea.html
Done exclusively with JavaScript, *without duplication of content* (read
the HTML source).
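A minimal sketch of how such a hover preview could work, assuming the server can return the rendered (anonymous) page HTML. The fetch URL and the data-title attribute here are hypothetical placeholders; the point is extracting the first paragraph client-side and caching it per title so each article is fetched at most once:

```javascript
// Sketch of a hover-preview helper. extractIntro() pulls the first
// paragraph out of rendered HTML; previewCache avoids refetching.
const previewCache = new Map();

function extractIntro(html) {
  // Grab the contents of the first <p>...</p> and strip inline tags;
  // crude, but enough for a demo.
  const match = html.match(/<p>([\s\S]*?)<\/p>/i);
  return match ? match[1].replace(/<[^>]+>/g, '').trim() : '';
}

async function getIntro(title) {
  if (previewCache.has(title)) return previewCache.get(title);
  // Hypothetical URL; a real deployment would hit the cacheable
  // anonymous rendering of the page.
  const res = await fetch('/wiki/' + encodeURIComponent(title));
  const intro = extractIntro(await res.text());
  previewCache.set(title, intro);
  return intro;
}

// Browser-only wiring, skipped when run outside a DOM.
if (typeof document !== 'undefined') {
  for (const link of document.querySelectorAll('a[data-title]')) {
    link.addEventListener('mouseover', async () => {
      link.title = await getIntro(link.dataset.title);
    });
  }
}
```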
The Commons idea of the Picture of the Day seems great to me, and it gave
me an idea (yes, that happens sometimes, really!). It would be perfect if I
were able to include the picture on the main page of the Czech Wikipedia.
It would also draw people to the Commons project (which is a bit of a
problem on cs:; users seem to be a little scared off by the
English-speaking look of Commons, etc.).
But there is an obvious problem: I don't think it is possible to do that.
The only syntax I came up with was something like
{{commons:Template:Potd|width=300|float=right|lang=cs}}, but I didn't
even expect _that_ to work (and it did not). I doubt there is some other,
working way, but just to be sure: do you know of a method of using the
Commons POTD on another project?
Sure, we could copy all the required pages from Commons and manually add a
copy of each new one every day, but that would be useless duplication of
work, I think.
As a programmer, I can imagine two approaches: either allow cross-project
template inclusion (which seems to be a terrible idea), or allow some kind
of "image redirect" (i.e. on Commons there would be a page called e.g.
Image:POTD.redirect containing something like
"#REDIRECT [[Image:{{Template:Potd/{{CURRENTYEAR}}-{{CURRENTMONTH}}-{{CURRENTDAY}}}}]]"),
which seems a little more reasonable, but it is only a vague idea.
Any comments?
Best regards,
[[ :cs:User:Mormegil | Petr Kadlec ]]
We should implement CVS-style tagging, where specific article revisions
are 'tagged' so that the group of revisions with that tag can be called
up instantly. This could be used for the push to 1.0 by having a tag for
the 1.0 revisions.
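A sketch of the data model such tagging implies: a tag is just a named, immutable mapping from page title to revision ID, so a tagged snapshot (e.g. "1.0") can be resolved instantly without scanning page history. This is an illustration of the idea, not MediaWiki's actual schema; all names and IDs below are made up:

```javascript
// Minimal model of CVS-style tags over article revisions.
// `tags` maps a tag name to a {pageTitle: revisionId} snapshot.
const tags = new Map();

function tagRevisions(tagName, revisionMap) {
  // Store a copy so later edits to revisionMap don't mutate the tag.
  tags.set(tagName, { ...revisionMap });
}

function resolveTag(tagName, pageTitle) {
  const snapshot = tags.get(tagName);
  return snapshot ? snapshot[pageTitle] : undefined;
}

// Hypothetical example: freeze two articles at specific revisions.
tagRevisions('1.0', { Physics: 10452, Chemistry: 9981 });
console.log(resolveTag('1.0', 'Physics')); // 10452
```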
Ashar Voultoiz wrote:
> Update of /cvsroot/wikipedia/phase3/includes In directory
> sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv13580/includes
>
> Modified Files: SpecialPage.php SkinTemplate.php Added Files:
> SpecialMypage.php SpecialMytalk.php SpecialMycontributions.php Log
> Message: Get rid of the username in links when the user is logged in.
> Instead use special pages just like the already existent
> [[Special:Watchlist]]. That's one more step to get pages rendered for
> logged users cached.
Hello,
I forgot to add that the original idea for this patch comes from innocence
and was posted by de:Duesentrieb on the LiveJournal wikitech community:
http://www.livejournal.com/community/wikitech/2943.html
cheers,
- --
Ashar Voultoiz - WP++++
http://en.wikipedia.org/wiki/User:Hashar
http://www.livejournal.com/community/wikitech/
Servers in trouble ? noc (at) wikimedia (dot) org
*David Gerard* (fun at thingy.apana.org.au) said:
> > How about something along the lines of:
> > http://quatramaran.ens.fr/~monniaux/wikilinks/pagea.html
> > Done exclusively with Javascript *without duplication of content* (read
> > the HTML sources).
> That's *nice*. How well does it work when you simulate 10-15 second
> response times, though?
Note that the loaded sub-pages would be of the "anonymous" kind. Unlike
pages generated for logged-in users, these can easily be cached. We could
even specify that they expire only after a certain time (I suppose the leading paragraph
of an article does not change that often, in general, and that when it changes, people
can still live with seeing the old version in a pop-up). As far as I know, the main
problem, in the long run, is Apache/database load, not Squid load.