Blog Archive

Sunday, March 08, 2009

imperative make

The Unix utility make wants you to declaratively describe a dependency tree, so that each node of that tree can then be built using known rules.

This view always went against my intuition. I don't view my program as a file that depends on its output directory and on a bunch of .o files, which depend on .cpp files, which in turn depend on headers. Instead, I view it as a bunch of source files which have to be compiled and linked into the program, and the fact that sometimes some of them don't have to be recompiled is an optimization the build tool should worry about. In other words, I view making a program as an imperative task:

compile all sources into .obj files
link .obj files into the output file

So now that I'm porting my stuff to FreeBSD, I wanted to re-create the imperative make I've been using under Windows. Instead of porting my selectivebuild.exe, I decided to use Perl.

Final makefile:



.ifndef DEBUG
OUTDIR = Release
GCC_OPTS += -O3
.else
OUTDIR = Debug
GCC_OPTS += -g
.endif

GCC_OPTS += -Wno-multichar
INCL = -I/usr/include/c++/4.2 -I.. -I.

SOURCES = byte_io.cpp mt_alloc.cpp cstring.cpp ... lots more cpp files

default:
	@../compile_sources.pl $(OUTDIR) $(SOURCES) $(INCL) -O"$(GCC_OPTS)"
	@../link_sources.pl $(OUTDIR) core.lib $(SOURCES)

clean:
	rm -r Debug Release

This can be invoked with "make" (a Release build) or "make DEBUG=" (a Debug build). The .o files and the resulting .lib file go into the output directory, which is either Release or Debug accordingly.
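
Concretely, the three invocations and where their outputs land:

make              # Release build: objects and core.lib go into ./Release
make DEBUG=       # Debug build: objects and core.lib go into ./Debug
make clean        # remove both output directories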

So... how do we implement compile_sources.pl and link_sources.pl? Well, the concept is simple. For compile_sources, I run makedepend with the option -f- (output to stdout) and parse the resulting dependencies. I filter the original array of sources down to a list of out-of-date sources, then iterate over that list and call gcc with the options passed via -O.
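
For reference, makedepend's -f- output is a series of lines mapping each object file to the headers it pulls in, roughly like this (the header names here are made up), plus the usual "DO NOT DELETE" marker line that the script skips:

byte_io.o: ../core/byte_io.h /usr/include/c++/4.2/string
cstring.o: ../core/cstring.h ../core/byte_io.h

Each such line gets split on ':', the .o name is turned back into the .cpp name, and the timestamps of the listed headers (and of the source itself) are compared against the object file in the output directory.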

For link_sources, I compare the modification dates of the .o files to that of the output file. If any object is newer than the output file, I link. Based on the extension of the output file, I either call "ar" (for .lib) or "gcc" (otherwise).

A function objfile_of($outdir,$source), shared by both scripts, maps source files to object files.
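
Given its definition in filtercpp.pl below, the mapping keeps each object next to its source, inside the per-configuration subdirectory (the second path is hypothetical, just to show a source with a directory prefix):

objfile_of("Release", "byte_io.cpp");     # returns "./Release/byte_io.o"
objfile_of("Debug",   "../lib/foo.cpp");  # returns "../lib/Debug/foo.o" (hypothetical path)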

First, the utility file, filtercpp.pl:


#!/usr/bin/perl
use File::Basename;
use Getopt::Long;

Getopt::Long::Configure ('bundling');
Getopt::Long::Configure ('no_ignore_case');

# extract -I include paths and the -O compiler options string from @ARGV;
# these become the globals @includes and $compile_opts used by the scripts
# that require this file
our @includes = ();
our $compile_opts = "";
GetOptions("I=s" => \@includes,
           "O=s" => \$compile_opts);
@includes = map("-I$_", @includes);

# map a source file to its object file inside the per-configuration
# output directory, e.g. foo/bar.cpp -> foo/$out_dir/bar.o
sub objfile_of {
    my ($out_dir, $a) = @_;
    $a || return "";
    my ($base, $dir, $ext) = fileparse($a, '\..*');
    return "${dir}$out_dir/${base}.o";
}

# print a command, then run it through the shell
sub do_exec {
    print "@_\n";
    system("@_");
}

1;

Now compile_sources.pl:


#!/usr/bin/perl

## sets the globals @includes and $compile_opts from the -I and -O options

require "../filtercpp.pl";

$out_dir = shift(@ARGV);
$out_dir || die;

@sources = @ARGV;

# return the list of sources whose object file is older than the source
# itself or any of the headers it includes (according to makedepend)
sub get_ood {
    my ($out_dir, $sources) = @_;
    my %ood = ();
    open DEPENDS, "makedepend @{$sources} @includes -f- |" or return ();
    while (<DEPENDS>) {
        chomp;
        if ($_ !~ m/DO NOT DELETE/) {
            my @vars = split(/:/, $_);
            my $source = $vars[0];
            $source =~ s/\.o$/.cpp/;
            $source ne "" || next;
            next if $ood{$source};
            my $obj = &objfile_of($out_dir, $source);
            my $obj_date = ((stat $obj)[9] || 0);
            #print STDERR "examining $obj, modified on $obj_date, rest is $vars[1]\n";
            for my $header (split('\s+', $vars[1])) {
                my $header_date = (stat $header)[9] || 0;
                if ($header_date > $obj_date) {
                    print STDERR "$header ($header_date) is newer than $obj ($obj_date)\n";
                    $ood{$source} = 1;
                    last;
                }
            }
            if (((stat $source)[9] || 0) > $obj_date) {
                $ood{$source} = 1;
            }
        }
    }
    close DEPENDS;
    return keys(%ood);
}

# compile every out-of-date source into its object file
sub compile_sources {
    my ($out_dir, $sources) = @_;
    #print("outdir $out_dir, includes @includes, sources @{$sources}\n");
    my @ood_sources = get_ood($out_dir, $sources);
    mkdir $out_dir unless -d $out_dir;
    if (@ood_sources) {
        print "ood sources: @ood_sources\n";
        for my $source (@ood_sources) {
            my $obj = &objfile_of($out_dir, $source);
            &do_exec("gcc $source @includes $compile_opts -c -o$obj");
        }
    }
}

&compile_sources($out_dir, \@sources);


Now link_sources.pl:


#!/usr/bin/perl

## sets the globals @includes and $compile_opts from the -I and -O options
require "../filtercpp.pl";

$out_dir = shift(@ARGV);
$out_dir || die;

$out = shift(@ARGV);
$out || die;

@sources = @ARGV;

# relink if any object file is newer than the output file;
# .lib outputs are archived with ar, everything else is linked with gcc
sub link_sources {
    my ($out_dir, $sources, $out) = @_;
    my $out_date = (stat "$out_dir/$out")[9] || 0;
    my @objs = map(&objfile_of($out_dir, $_), @{$sources});
    for my $obj (@objs) {
        my $obj_date = (stat $obj)[9] || 0;
        if ($obj_date > $out_date) {
            print("$obj is newer than $out\n");
            if ($out =~ /\.lib$/) {
                # ar only needs the archive name and the objects
                &do_exec("ar cr $out_dir/$out @objs");
            } else {
                &do_exec("gcc @objs -o $out_dir/$out $compile_opts");
            }
            last;
        }
    }
}

&link_sources($out_dir, \@sources, $out);



Monday, March 02, 2009

SSD -- the future is here

There was a time, long long ago, when simple-minded apes didn't know much about storing information. The best they could do was take magnetized platters, stack them up, spin them around an axis and pick up or deposit charges by moving an actuator arm precariously close to the delicate spinning surface of these platters. There is no good reason for storing information like that. But you see, the people who invented the spinning platter lived on a relatively recently colonized continent. The continent was originally inhabited by primates who were, it is agreed, every bit as intelligent as the newcomers who succeeded them, with one major difference: they had not invented the wheel. Because they had not discovered the Great Benefits Of Rotation, their technological development was stunted, and as a result they simply could not compete. It was thus obvious to everyone, given this historical precedent, that Wheel Is Good.

But all technological cycles come to an end. And last weekend, finally, I took off the feathers, laid down the quiver, and picked up night goggles and a bazooka. I mean, I upgraded my home computer to an SSD drive. It was just a little bit painful, not too much.

First, I tried all the cloning utilities I had used previously, including the Ultimate Boot CD and Seagate's Disk Wizard, but they all failed to recognize the SSD as a valid disk. I don't know why, and I didn't care to figure it out, because the closer you are to hardware, the dumber things become. When dealing with hardware directly, you are dealing with the dumbest software and the dumbest code. So it wasn't surprising to me at all that even though the new disk (OCZ 120GB, $284) was SATA-compliant, it was perceived differently from other disks. Perhaps the cloning software was stunned to discover that the new disk had no tracks or cylinders. It's just a guess.

I ended up using XXClone, which has a convenient 30-day trial mode. Good thing cloning a hard drive takes less than 30 days. XXClone is fundamentally different from other disk cloners in that it copies one file at a time, using Windows' Volume Snapshot Service. After this is done, you press a button in XXClone that makes the new disk bootable, and you're done.

There was only one problem during this process, and it had to do with junction points. You see, I use two terabyte-sized RAID disks for storing data. Since I hate drive letters, I access these disks through NTFS junction points (see a tool called junction by Mark Russinovich): c:\work leads to one of these 1TB disks, and c:\photo to another. XXClone, after checking that the amount of space taken up by C: was less than the capacity of the new drive, proceeded to naively recurse into these monstrously large directories and attempted to stuff them into the new disk. Fail. Using "junction *" I found all the reparse points on C: (all my reparse points are top-level) and removed them with "junction -d".

XXClone was even kind enough to swap the drive letters, so that when I booted from the new drive, it was C:, and not G: or something. For this, I am grateful. Windows stores a mapping of volume serial numbers to drive letters somewhere in the registry; this prevents drive letters from randomly reassigning themselves at boot time, but it also prevents disk cloning operations from working smoothly.

So, just like the wheel was a killer app five hundred years ago, solid state is the killer app today.