While looking for ways to package HipHop for Debian, I came across Mike
DuPont's git repository (https://github.com/h4ck3rm1k3/hiphop-php/) that
he had started in January '10, but abandoned because of an apparent lack
of interest from the HipHop developers.
I picked up where he left off but hit a wall when I tried to get the
Debian or upstream developers for curl and libevent to accept the HipHop
patches (see http://bugs.debian.org/638359 and
http://bugs.debian.org/638360).
At that point, I contacted Mike and asked for his help. He suggested we
attempt to get HipHop compiled *without* the patches since the upstream
developers weren't too enthusiastic about them.
He managed to get a working compiler and I'd like to get additional
testing done on it before I submit the work to Debian. Following is a
copy of the email he tried to send to wikitech-l earlier today.
---------- Forwarded message ----------
From: Mike Dupont <jamesmikedupont@googlemail.com>
Date: Mon, Sep 19, 2011 at 9:16 PM
Subject: Progress made on packaging hiphop for debian
To: Wikimedia developers <wikitech-l@lists.wikimedia.org>
I have been working on packaging HipHop PHP for Debian without any
patched libevent and curl, and now have a basic package that can install
the compiler. I am working on getting the runtime sorted out; any help
would be appreciated.
Note: this works with the branch "official", not "master". I will have to
clean that up.
git checkout official
Running dpkg-buildpackage should produce a deb that installs and a
compiler that produces code; the runtime, as I said, needs work.
mike
--
James Michael DuPont
Member of Free Libre Open Source Software Kosova http://flossk.org
> > I ran some benchmarks on one of the WMF machines. The input I used is
> > a 137.5 MB (144,220,582 bytes) OGV file that someone asked me to
> > upload to Commons recently. For each benchmark, I hashed the file 25
> > times and computed the average running time.
> >
> > MD5: 393 ms
> > SHA-1: 404 ms
> > SHA-256: 1281 ms
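(A benchmark along those lines is easy to reproduce; below is a minimal
sketch, assuming Python's hashlib and a local copy of the file saved as
input.ogv. The filename and 1 MB chunk size are illustrative.)

    import hashlib
    import time

    def bench(algo, path, runs=25):
        # Average wall-clock time to hash the file `runs` times.
        total = 0.0
        for _ in range(runs):
            start = time.time()
            h = hashlib.new(algo)
            with open(path, 'rb') as f:
                # Read in 1 MB chunks so memory stays flat for large files.
                for chunk in iter(lambda: f.read(1 << 20), b''):
                    h.update(chunk)
            total += time.time() - start
        return total / runs

    for algo in ('md5', 'sha1', 'sha256'):
        print('%s: %.0f ms' % (algo, bench(algo, 'input.ogv') * 1000))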
Can we keep some perspective, please? MD5 is plenty good enough for the
purposes discussed here. It's fast and, almost as important, easily
supported by many OSes, libraries, etc. As for collisions, there are
plenty of easy solutions, such as:
* Check for a collision before allowing a new revision, and do something
if so (to handle the pre-image attack)
* When reverting, do a SELECT COUNT(*) WHERE md5=? and then do something
more advanced when more than one match is found
* Use the checksum to find the revision fast, but still do a full byte
comparison (both ideas are combined in the sketch after this list).
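A minimal sketch of that lookup, assuming a Python DB-API cursor and a
hypothetical table with md5 and rev_text columns (the names are
illustrative, not MediaWiki's actual schema):

    def find_revision(cur, md5_hex, full_text):
        # Fast path: look up candidate revisions by checksum.
        cur.execute("SELECT rev_id, rev_text FROM revision WHERE md5 = %s",
                    (md5_hex,))
        rows = cur.fetchall()
        # Zero or several matches: fall back to a full byte comparison.
        for rev_id, text in rows:
            if text == full_text:
                return rev_id
        return None

The checksum narrows the search to a handful of rows, and the byte
comparison makes any collision harmless.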
I've only seen one real attack scenario mentioned in this thread -
that of someone creating a new page with the same checksum as an existing
one, for purposes of messing up the reversion system. Are there other
attacks we should worry about?
I'm also of the opinion that we should just store things as CHAR(32),
unless someone thinks space really is at that much of a premium. The big
advantage of 32 chars (i.e. 0-9a-f, aka hexadecimal) is that it's a
standard way to represent things, making use of common tools (e.g. md5sum)
much easier.
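To illustrate that point, a quick sketch assuming Python's hashlib; the
hex digest is exactly the 32-character lowercase string that md5sum
prints:

    import hashlib

    digest = hashlib.md5(b'hello world').hexdigest()
    print(digest)       # 5eb63bbbe01eeed093cb22bb8f5acdc3
    print(len(digest))  # 32 -- fits CHAR(32), same format as md5sum output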
--
Greg Sabino Mullane greg@endpoint.com
End Point Corporation
PGP Key: 0x14964AC8
Incidentally, a book about developing for Wikitude was recently
published: "Professional Augmented Reality Browsers for Smartphones:
Programming for junaio, Layar and Wikitude"
http://www.amazon.com/gp/product/1119992818/
I know nothing about it other than the title :)
phoebe
--
* I use this address for lists; send personal messages to phoebe.ayers
<at> gmail.com *
Do we have anyone on this list who develops for Opera Mini and iOS WebKit?
If so, come help us with Opera's client-side folding architecture:
https://bugzilla.wikimedia.org/show_bug.cgi?id=29517.
Android and others work great, just not iOS.
--tomasz
Hi all;
Just like the scripts to preserve wikis,[1] I'm working on a new script to
download all Wikimedia Commons images, packed by day. But I have limited
spare time. It is sad that volunteers have to do this without any help from
the Wikimedia Foundation.
I also started an effort on Meta (with low activity) to mirror the XML
dumps.[2] If you know of universities or research groups that work with
Wiki[pm]edia XML dumps, they would be promising candidates to mirror them.
If you want to download the texts to your PC, you only need 100 GB of free
space and this Python script.[3]
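In miniature, the idea looks like this (a sketch, assuming Python 2's
urllib and the dumps.wikimedia.org layout; the wiki name and filename
pattern are illustrative):

    import urllib

    # Illustrative: fetch the latest pages-articles dump for one wiki.
    wiki = 'enwiki'
    filename = '%s-latest-pages-articles.xml.bz2' % wiki
    url = 'http://dumps.wikimedia.org/%s/latest/%s' % (wiki, filename)
    urllib.urlretrieve(url, filename)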
I heard that the Internet Archive saves the XML dumps quarterly or so, but
there has been no official announcement. I also heard that the Library of
Congress wanted to mirror the dumps, but there has been no news for a long
time.
L'Encyclopédie has an "uptime"[4] of 260 years[5] and growing. Will
Wiki[pm]edia projects reach that?
Regards,
emijrp
[1] http://code.google.com/p/wikiteam/
[2] http://meta.wikimedia.org/wiki/Mirroring_Wikimedia_project_XML_dumps
[3] http://code.google.com/p/wikiteam/source/browse/trunk/wikipediadownloader.py
[4] http://en.wikipedia.org/wiki/Uptime
[5] http://en.wikipedia.org/wiki/Encyclop%C3%A9die
2011/6/2 Fae <faenwp@gmail.com>
> Hi,
>
> I'm taking part in an images discussion workshop with a number of
> academics tomorrow and could do with a statement about the WMF's long
> term commitment to supporting Wikimedia Commons (and other projects)
> in terms of the public availability of media. Is there an official
> published policy I can point to that includes, say, a 10 year or 100 year
> commitment?
>
> If it exists, this would be a key factor for researchers choosing
> where to share their images with the public.
>
> Thanks,
> Fae
> --
> http://enwp.org/user_talk:fae
> Guide to email tags: http://j.mp/faetags
One of the long-awaited minor features that has finally come in during the
1.18 development cycle is resolving old bug 6672, natively supporting photos
where EXIF metadata specifies a non-default orientation.
This is very common in photos taken with digital cameras; a portrait-mode
image may be saved at a low level as a 4x3 framebuffer with an orientation
tag stating that it must be rotated 90 or 270 degrees (or occasionally an
upside-down photo needing to be rotated 180 :) While most photo-editing
applications understand this metadata natively and will simply show the
image at its natural size, web browsers don't -- and neither did the
server-side processing that MediaWiki was doing.
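A rough sketch of the logical-size computation, assuming Python with PIL;
tag 274 is the standard EXIF Orientation tag, and values 5-8 all involve a
90- or 270-degree rotation, so width and height swap:

    from PIL import Image

    ORIENTATION_TAG = 274  # standard EXIF tag number for Orientation

    def logical_size(path):
        img = Image.open(path)
        width, height = img.size            # physical framebuffer size
        exif = img._getexif() or {}         # JPEG-only; may return None
        orientation = exif.get(ORIENTATION_TAG, 1)
        if orientation in (5, 6, 7, 8):     # rotated 90 or 270 degrees
            width, height = height, width   # swap to the logical size
        return width, height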
Bryan Tong-Minh did most of the earlier work on this a few months ago, for
which he is much to be commended!
In response to a couple of issues noticed on test2.wikipedia.org (bug 31024,
bug 31048) I've made some tweaks to make it work more consistently. From now
on the image's width & height properties will be set to the logical size,
taking the orientation into account. This makes thumbnail resizing more
consistent (sometimes we got wrong sizes because the requested bounding box
would end up applied to the original physical size) and also makes the
orientation change transparent to API clients, such as other wikis using
ForeignAPIRepo (aka 'InstantCommons'). Clients won't need to know or care if
an image uses EXIF rotation; it'll simply always be presented and reported
at its natural logical size.
I've applied these fixes to trunk, REL1_18, and 1.18wmf1 -- phpunit test
cases have been updated, but it's still conceivable something could be wrong
so please do test. :)
Note that any EXIF-rotated photos you've uploaded under the old code will
need to be purged to update their file metadata & clear out any old thumbs,
or they may continue to render incorrectly.
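If you have a lot of affected files, the purge can be scripted through the
API; here's a sketch, assuming Python 2's urllib/urllib2 and a hypothetical
list of titles (action=purge is the standard API module, and it wants a
POST):

    import urllib
    import urllib2

    api = 'https://commons.wikimedia.org/w/api.php'
    titles = ['File:Example.jpg']  # hypothetical: list your affected files

    # action=purge refreshes the file metadata and clears old thumbnails.
    data = urllib.urlencode({'action': 'purge',
                             'titles': '|'.join(titles)})
    urllib2.urlopen(api, data).read()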
https://bugzilla.wikimedia.org/show_bug.cgi?id=6672
https://bugzilla.wikimedia.org/show_bug.cgi?id=31024
https://bugzilla.wikimedia.org/show_bug.cgi?id=31048
-- brion
Hi all,
If you are someone who currently merges code into the 1.17wmf1 branch,
this mail is for you:
We need you to make sure that you merge your code into the 1.18wmf1
branch as well. That also means if you're deploying code and using
syncfile to push individual files, you need to push from both the
1.17wmf1 branch and the 1.18wmf1 branch. If you don't, your code may
not make it into the 1.18wmf1 branch.
Thanks
Rob