[Israel.pm] Using Perl to analyze Wikipedia dumps - my talk yesterday

Amir E. Aharoni amir.aharoni at mail.huji.ac.il
Wed Aug 18 01:56:30 PDT 2010


Hi,

Thanks to everyone who came to Rehovot.pm yesterday.
Thanks to Gabor and Jaime for hosting.
Thanks to Shlomi for the cake.
Thanks to everybody for the interesting questions.

Here's a collection of links related to my talk:

* https://lists.wikimedia.org/mailman/listinfo/wikitext-l - a special
mailing list dedicated to problems with the current MediaWiki "parser"

* http://download.wikimedia.org/ - download dumps of Wikimedia projects

* http://search.cpan.org/~triddle/MediaWiki-DumpFile-0.1.8/lib/MediaWiki/DumpFile.pm
- a CPAN module to read MediaWiki dump files (a short usage sketch
follows the list below)

* http://tinyurl.com/wpiw-he - a project in the Hebrew Wikipedia to find
and sort pages without interlanguage (a.k.a. interwiki) links

* http://en.wikipedia.org/wiki/Kur%C3%B3w - a town in Poland about
which there are Wikipedia articles in 220 languages

* http://www.apertium.org/ - Apertium, a Free machine translation engine

* http://culmus.sourceforge.net/dictionary/index.html - a project to
create a Hebrew-English dictionary from Wiktionary and other sources

* http://www.omegawiki.org - OmegaWiki, a project to create the next
generation multilingual Wiki dictionary, by combining MediaWiki with a
properly structured database

* http://wikimania2011.wikimedia.org/wiki/Main_Page - Wikimania 2011
in Haifa - everybody is welcome!
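
For anyone who wants to play with the dump module, here is a minimal
sketch based on the synopsis in MediaWiki::DumpFile's documentation.
The dump file name is only an example, and the interwiki check is a
deliberately rough heuristic, not the logic used in the project above:

  #!/usr/bin/perl
  use strict;
  use warnings;
  use MediaWiki::DumpFile;

  # Print titles in UTF-8 (relevant for the Hebrew Wikipedia).
  binmode STDOUT, ':encoding(UTF-8)';

  # Example file name - use any pages-articles dump downloaded
  # from http://download.wikimedia.org/
  my $file = 'hewiki-latest-pages-articles.xml';

  my $mw    = MediaWiki::DumpFile->new;
  my $pages = $mw->pages($file);

  # Walk over every page and print the titles of pages whose wikitext
  # contains nothing that looks like an interlanguage link.
  while (defined(my $page = $pages->next)) {
      my $text = $page->revision->text;
      print $page->title, "\n"
          unless $text =~ /\[\[[a-z]{2,3}(-[a-z]+)?:/;
  }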

-- 
אָמִיר אֱלִישָׁע אַהֲרוֹנִי
Amir Elisha Aharoni

http://aharoni.wordpress.com

"We're living in pieces,
 I want to live in peace." - T. Moore

