A little AppConfig tip

June 20, 2014

I have a little tip for AppConfig, one apparently not everyone knows, as I discovered in a household discussion.

The wonderfully useful module AppConfig reads ini files

[foo]
bar = 1
baz = 2

[herp]
derp = 4

and makes them accessible in a configuration object like so:

$cfg->foo_bar; # returns 1

Now, if you want parameters that don’t belong in any section, put those at the top of your config file, before the first section header. For $cfg->hork:

hork = 1

[foo]
bar = 1

If you put it lower down you’ll end up with $cfg->foo_hork or $cfg->herp_hork.
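To make the behavior concrete, here is a minimal sketch. It writes the example config to a temp file so it runs standalone; the CREATE and GLOBAL options tell AppConfig to define variables on the fly as it reads the file, and the accessor names are exactly the section_parameter pattern described above.

```perl
use strict;
use warnings;
use AppConfig;
use File::Temp qw(tempfile);

# Write the example config out so the sketch is self-contained.
my ( $fh, $ini ) = tempfile();
print $fh <<'INI';
hork = 1

[foo]
bar = 1
baz = 2

[herp]
derp = 4
INI
close $fh;

# CREATE lets AppConfig define variables on the fly as it reads the
# file; GLOBAL sets the defaults for those auto-created variables.
my $cfg = AppConfig->new( { CREATE => 1, GLOBAL => { ARGCOUNT => 1 } } );
$cfg->file($ini);

print $cfg->hork,      "\n";    # 1 -- global, no section prefix
print $cfg->foo_bar,   "\n";    # 1 -- section 'foo', parameter 'bar'
print $cfg->herp_derp, "\n";    # 4 -- section 'herp', parameter 'derp'
```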

That is all. Carry on.


June 18, 2014

My first CPAN module: Acme::Buffalo::Buffalo. There’s always room on CPAN for one more silly module.

Years ago I wanted to write Acme::Log4perl::Terror which would peg your application’s log level to the color-coded terror alert issued by the US Department of Homeland Security. Then they stopped issuing them. Oh well. Life is short, write those Acme modules now.

Pale Moon rising

June 15, 2014

At zork*, I am still automating Firefox to perform front-end performance analyses, though barely, thanks to instability in the latest versions of Firefox. With 28 and 29 I experience a lot of frustrating Firefox crashes. Our front-end developer also reports crashes with Firefox 28 and its recommended version of Firebug, 1.12. He reverted to 24 in order to get some work done.

On my Facebook feed, I read that a friend frustrated to madness by 29 had switched to a browser called Pale Moon.  It bills itself as “an Open Source, Firefox-based web browser” and is currently based on 24, the same version that relieved my colleague’s frustration. So I gave it a try.

Pale Moon supports all my extensions: MozRepl, Firebug, PageSpeed, DOMInspector, YSlow, and NetExport. I can generate profiles programmatically with a few small modifications to the script (profiles.ini lives in a different directory, for example, and palemoon -v doesn’t print ‘Mozilla Firefox’, obviously). MozRepl and its Perl interfaces work like a charm.
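The version-string tweak, for instance, amounts to loosening a sanity check along these lines (a sketch; the helper name and the exact check in my script are hypothetical):

```perl
use strict;
use warnings;

# firefox -v prints a banner like "Mozilla Firefox 24.0", while
# palemoon -v prints "Pale Moon 24.x", so any check pinned to the
# Firefox banner needs to accept either one.
sub looks_like_gecko {
    my ($banner) = @_;
    return $banner =~ /^(?:Mozilla Firefox|Pale Moon)\b/ ? 1 : 0;
}

print looks_like_gecko('Pale Moon 24.5.0') ? "ok\n" : "not ok\n";
```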

I haven’t measured it, but startup time is noticeably faster, and it’s much more stable. Since it’s being used in an automated fashion, I am less concerned about UI and more about stability and speed, so these observations make me happy.

If you work in a shop that frowns on alternative browsers, or you wish to abstain from Firefox’s rapid upgrades, you may also consider Firefox ESR.


*“work”, typed on an azerty keyboard. I switch keyboards a lot and the ensuing confusion erodes my grip on sanity.

Newest French Perl Mongers board member introduces herself

June 15, 2014

At the General Assembly of the French Perl Mongers (les Mongueurs de Perl), the board put out a call for new candidates. Sébastien Aperghis-Tramoni urged me to throw my hat in the ring. I asked the board if I would be doing something stupid if I did so. No, answered Laurent Boivin, but you can perhaps prevent someone else from doing something stupid. Oh what the hell, I figured, and put my hand up. I was duly voted in before I could reconsider.

As it is unseemly (peu convenable) for a newly-minted board member to have a dead blog, and because Wendy G.A. van Dijk urged us all heartily to Publish! Publish! Publish!, I am reviving PerlGerl. I’ve been doing a few things lately too, and should really post about them.



Controlling Firefox from Perl with MozRepl

June 24, 2012

On Friday I wrote my first program using AnyEvent and Coro, and it was so nifty that I decided to revive my moribund blog to describe it. This little program solved a whole raft of problems.

The problem involved determining certain properties of each of a list of URLs and updating a database. The rub is that some properties, such as redirects, are most reliably observed in the browser, while others, such as whether the URL is hosted at $WORK, are best determined using $WORK’s Perl modules. Now how can we put these two together to get all the info we need?

The approach I hit on involves using Perl to drive an instance of Firefox configured with some helpful extensions. MozRepl is a cool extension that lets you telnet into Firefox and program it from the inside, with access to the entire browser and the Mozilla API. NetExport is an extension to the extension Firebug which generates HAR (HTTP archive) files capturing all the data necessary for the analysis of front-end performance. HAR files serve as input to the command-line versions of PageSpeed and YSlow.

NetExport exports its results either to file or via HTTP POST. So in my program I start an embedded HTTP server, fire up Firefox, telnet into it, load a URL, and let the httpd capture and process the JSON posted by NetExport. Think of the setup as making Firefox puke into a bucket set out for that purpose. A bit imagé but you get the idea.

How do AnyEvent and Coro enter the picture? I use AnyEvent condition variables to coordinate the work between page loading with MozRepl and output handling with the web server, and a Coro thread to tell the web server to kindly stand over there while I continue with my main line of work. The hardest part is reorienting one’s brain towards the asynchronous way of thinking, not so easy after a lifetime of vanilla scripting.

So let’s see some code. The main script is delightfully short and sweet; the annotated version follows. The name of my employer has been changed to protect the innocent. And of course I use strict and warnings.

use strict;
use warnings;

use AnyEvent;
use Coro;
use WORK::Config;
use WORK::AnyEvent::HTTPD;
use WORK::AnyEvent::HTTPD::Handler::NetExport;
use WORK::AnyEvent::MozRepl;
use Log::Log4perl qw(:easy);

my $cfg = WORK::Config::get_config();

# AnyEvent condition variable
my $cv = \$WORK::AnyEvent::HTTPD::Handler::NetExport::cv;

my @urls = qw(http://www.google.com http://www.yahoo.com);

# Tell NetExport where to post its results.
my $beacon = "http://localhost:9090/netexport";

# Handlers to process POSTed results.
my $h = WORK::AnyEvent::HTTPD::Handler::NetExport->new;

# Fire up a server and ask it to step out of the way.
async {
    # ...
};

# Fire up FF with good ol' system().

# Connect to MozRepl with AnyEvent::Socket.

# Set the POST URL and turn on auto export.

while (@urls) {

    $$cv = AnyEvent->condvar;

    my $url = shift @urls;

    load_page( $url );  # Using MozRepl.

    $$cv->recv;         # The POST handler will send().
}

kill_firefox();  # From within MozRepl.

Neat huh? There’s lots of blanks to fill in, but as this post is already getting a bit long, I will do that in posts to follow.
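In the meantime, here is a rough sketch of the POST-handling blank, using the stock AnyEvent::HTTPD (the WORK:: wrapper presumably does something similar; the helper name and HAR handling here are hypothetical):

```perl
use strict;
use warnings;
use AnyEvent;
use AnyEvent::HTTPD;
use JSON::PP qw(decode_json);

our $cv;    # shared condvar; the main loop blocks in $$cv->recv

# Pull the entry list out of a HAR (HTTP archive) JSON document.
sub har_entries {
    my ($json) = @_;
    my $har = decode_json($json);
    return $har->{log}{entries} || [];
}

my $httpd = AnyEvent::HTTPD->new( port => 9090 );

$httpd->reg_cb(
    '/netexport' => sub {
        my ( $httpd, $req ) = @_;

        # NetExport POSTs the HAR as JSON in the request body.
        my $entries = har_entries( $req->content );
        # ... analyze or store the entries here ...

        $req->respond(
            [ 200, 'OK', { 'Content-Type' => 'text/plain' }, "thanks\n" ] );
        $cv->send;    # wake the main loop waiting in recv()
    },
);
```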

Getting Moose and Rose::DB::Object to play together

April 9, 2011

I’m fairly new to both Moose and Rose::DB::Object, and have been poking around trying to find a simple way to marry the two. Delegation and roles do the trick here.

I created a parameterized role, so that I could tell the role what Rose::DB::Object-derived class to delegate to.

package My::DB::Role;
use MooseX::Role::Parameterized;

parameter 'table' => (
    is  => 'ro',
    isa => 'Str',
);

role {
    my $p = shift;

    my $class =
        $p->table =~ /^My::DB/
            ? $p->table
            : join '::', 'My::DB', $p->table;

    eval "require $class";

    has 'db_obj' => (
        is      => 'rw',
        isa     => $class,
        handles => [ $class->meta->column_names, qw(save load delete) ],
        default => sub { $class->new },  # a fresh Rose::DB::Object row
    );
};

no Moose;

1;

In my consuming class, I specify the role and the shortened name of the Rose::DB::Object-derived class:

package My::Product;
use Moose;

with 'My::DB::Role' => { table => 'Product' };

With this little setup, I have working code:

use Test::More;

use_ok('My::Product');

my $product = My::Product->new;

note $product->created_at;

done_testing;

This outputs:

ok 1 - use My::Product;
# 2011-04-09T09:03:20

This is not thoroughly tested; I just did this tonight. But this looks like the way to go.

Further reading:

  • Kate Yoak posts a comment about her solution at Rohan Almeida’s blog (now listed as an attack site by Google, which I find odd). Link to very fierce attack site; you have been warned.
  • Inheritance from Rose::DB::Object has been possible since Moose 1.15, which introduced the -meta_name option to use Moose. Renaming Moose’s meta method neatly sidesteps the Rose::DB::Object method of the same name.

UTF8 follies

April 8, 2011

Character encoding issues are shortening my life expectancy. It’s been a few years since I dealt with them regularly, so they bite me from time to time.

Recently I puzzled over a set of strings that were showing up double-encoded in my Postgres database. These strings were encoded from incoming Latin 1 to UTF8, travelled around the program with their utf8 flags set, went through JSON->decode and encode multiple times without apparent harm, but once in the database, they had the telltale double-encoding gibberish characters. E.g.:

use Encode;
my $utf8_string = "Télévision extrême à domicile";
print encode('utf8', $utf8_string);
# becomes TÃ©lÃ©vision extrÃªme Ã  domicile

I spent an absurdly long time capturing strings and dumping during execution:

say qx{echo $str | od -c};

od being a handy Unix utility I became very familiar with in my days of converting bibliographic data.

Turns out I was focusing on the wrong leads. These strings became values in a hash which was run through JSON->encode before saving to db. What prompted the re-encoding was not the values of the hash, but the keys. These were string literals in my program which were not marked as UTF8. Perl looked at my keys, looked at my values, saw a mixed bag, and decided to run the whole thing through the shredder again, just to be on the safe side.

The solution was simple:

use utf8;

Because my keys were source-code string literals, I want my source code to be UTF8. use utf8 assures that.
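A minimal demonstration of the difference, using only core Perl (and assuming the source file really is saved as UTF-8; the strings are illustrative):

```perl
use strict;
use warnings;
use Encode qw(encode);

# Without use utf8, a literal is just bytes: Perl sees the two UTF-8
# bytes of "é" as two latin-1 characters, so encoding the string
# again produces the double-encoded gibberish.
my $bytes  = "\xC3\xA9";                  # the UTF-8 bytes of "é"
my $double = encode( 'UTF-8', $bytes );   # "\xC3\x83\xC2\xA9", i.e. "Ã©"

# With use utf8 in effect, the same literal is decoded into a single
# character, and encode() round-trips it correctly.
use utf8;
my $char = "é";                           # one character, utf8 flag on
my $once = encode( 'UTF-8', $char );      # "\xC3\xA9", correctly encoded

print length($bytes), " vs ", length($char), "\n";  # 2 vs 1
```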

Started reading Perl Moderne

April 5, 2011

I picked up a copy of Perl Moderne, the new book by France’s Perl Gang of Four, Damien Krotkine, Sébastien Aperghis-Tramoni, Jérôme Quelin, and Philippe ‘BooK’ Bruhat, and have just started reading it.

Perl doesn’t seem to have the presence in France that it does in the English-speaking world. I hope this book helps change that, because it’s a clearly written, commute-friendly book that covers a lot of ground, from installation to Moose. Non-French Perl hackers can pick up their professional vocabulary here too. (Make your admirers swoon with desire by murmuring table de hachage like Morticia Addams did Gomez: “Tish! You speak French!”)

I am puzzled about a seeming omission in their first-chapter discussion of editors and ready-to-install binaries: there is no mention of ActiveState’s Komodo IDE or ActivePerl. Perhaps because Komodo is payware? I’ll ask next time I see one of them.

Figuring out map coordinates

April 5, 2011

Many sites use vendors other than Google for their mapping needs. These may use projections that dish up non-GPS coordinates. Here’s a rough way to figure out what these coordinates are and how to convert them, without returning to school for a graduate degree in geoinformatics.

  • Make a note of the unfamiliar coordinates and the street address.
  • Search by street address at Google Maps. Get the coordinates by right-clicking at the very tip of the red indicator and choosing “What’s here?”
  • Hop over to the World Coordinate Converter web site. Choose ‘*GPS (WGS84) (deg)’ from the drop-down in the upper box (if it’s not already selected) followed by the Google Maps coordinates.
  • Fool around with target systems in the lower box. Choose some likely candidates and click Convert. Keep this up until you get results close to the mystery coordinates (don’t count on identical).
  • Click on the ? next to the coordinate system name you chose. Put on your reading glasses and find the link in the popup to spatialreference.org. Click on it. This will give you the WKID (Well Known ID) of the projection, in the form of EPSG:\d+.
  • The WKID for the coordinates Google Maps hands you (plain WGS84 latitude/longitude) is EPSG:4326 (http://spatialreference.org/ref/epsg/4326/).
  • Obtain the program gdaltransform by installing the GDAL utilities on your friendly *nix box.
    On Debian or Ubuntu, it’s apt-get install gdal-bin; on Fedora, yum install gdal.

  • Now the magic incantation:

    echo $x $y | /usr/bin/gdaltransform -s_srs [source WKID] -t_srs [target WKID]

    You will get your desired coordinates. Gingerly carry these over to Google Maps and verify that you’re in the desired vicinity. The coordinates may be longitude-first, so you may need to reverse their order.
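Wrapped in Perl, the incantation might look like this. The example converts Lambert 93 (EPSG:2154, a projection common on French sites) to GPS coordinates; the input coordinates and the helper name are illustrative:

```perl
use strict;
use warnings;

# Build the gdaltransform pipeline; source and target are WKIDs.
sub transform_cmd {
    my ( $x, $y, $src, $tgt ) = @_;
    return "echo $x $y | gdaltransform -s_srs $src -t_srs $tgt";
}

my $cmd = transform_cmd( 651475, 6862135, 'EPSG:2154', 'EPSG:4326' );
# my ($lon, $lat) = split ' ', qx{$cmd};   # run it if GDAL is installed
print $cmd, "\n";
```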


Happiness is a test suite.

March 9, 2011

Today I am grateful for my test suite. I had to install my handiwork on a production box for the first time today, and the test suite made short work of identifying and fixing problems. It’s astonishing what testing for the seemingly obvious will catch.

A test suite is a courtesy to the next sap charged with the task, as well. Why make them hate you and your mass of bugs any more than strictly necessary?