debmirror-2.16ubuntu1.1/TODO:

It would probably be cleaner and easier to learn if it took apt-style lines to tell where to mirror from and what portions to use.

debmirror-2.16ubuntu1.1/debmirror:

#!/usr/bin/perl -w

=head1 NAME

debmirror - Debian partial mirror script, with ftp, http or rsync and package pool support

=head1 SYNOPSIS

B<debmirror> [I<options>] I<mirrordir>

=head1 DESCRIPTION

This program downloads and maintains a partial local Debian mirror. It can mirror any combination of architectures, distributions, and sections. Files are transferred by ftp, http, https, or rsync, and package pools are fully supported. It also does locking and updates trace files.

The partial mirror created by this program is not suitable to be used as a public Debian mirror. If that is your aim, you should instead follow the instructions at L<http://www.debian.org/mirror/ftpmirror>.

This program mirrors in three steps.

=over 4

=item 1. download Packages and Sources files

First it downloads all Packages and Sources files for the subset of Debian it was instructed to get.

=item 2. download everything else

The Packages and Sources files are scanned, to build up a list of all the files they refer to. A few other miscellaneous files are added to the list. Then the program makes sure that each file in the list is present on the local mirror and is up-to-date, using file size (and optionally checksum) checks. Any necessary files are downloaded.

=item 3. clean up unknown files

Any files and directories on the local mirror that are not in the list are removed.

=back

=cut

sub usage {
  warn join(" ", @_)."\n" if @_;
  warn <<EOF;
Usage: $0 [options] <mirrordir>
For details, see man page.
EOF
  exit(1);
}

=head1 OPTIONS

=over 4

=item I<mirrordir>

This required (unless defined in a configuration file) parameter specifies where the local mirror directory is. If the directory does not exist, it will be created.
Be careful; telling this program that your home directory is the mirrordir is guaranteed to replace your home directory with a Debian mirror!

=item B<-p>, B<--progress>

Displays progress bars as files are downloaded.

=item B<-v>, B<--verbose>

Displays progress between file downloads.

=item B<--debug>

Enables verbose debug output, including ftp protocol dump.

=item B<--dry-run>

Simulate a mirror run. This will still download the meta files to the F<./.temp> working directory, but won't replace the old meta files, won't download debs and source files and only simulates cleanup.

=item B<--help>

Display a usage summary.

=item B<-h>, B<--host>=I<remotehost>

Specify the remote host to mirror from. Defaults to I<ftp.debian.org>; you are strongly encouraged to find a closer mirror.

=item B<-r>, B<--root>=I<directory>

Specifies the directory on the remote host that is the root of the Debian archive. Defaults to F<debian>, which will work for most mirrors. The root directory has a F<dists> subdirectory.

=item B<--method>=I<method>

Specify the method to download files. Currently, supported methods are B<ftp>, B<http>, B<https>, and B<rsync>.

=item B<--passive>

Download in passive mode when using ftp.

=item B<-u>, B<--user>=I<remoteusername>

Specify the remote user name to use to log into the remote host. Defaults to C<anonymous>.

=item B<--passwd>=I<remoteuserpassword>

Specify the remote user password to use to log into the remote ftp host. It is used with B<--user> and defaults to C<anonymous@>.

=item B<--proxy>=I<proxy>

Specifies the http proxy (like Squid) to use for http or ftp methods.

=item B<-d>, B<--dist>=I<dist>

Specify the distribution (etch, lenny, squeeze, sid) of Debian to mirror. This switch may be used multiple times, and multiple distributions may be specified at once, separated by commas. You may also use the stable, testing, and unstable names.

=item B<--omit-suite-symlinks>

With this option set, B<debmirror> will not create the symlink from I<suite> to I<codename>. This is needed for example when mirroring archived Debian releases as they will all have either C<stable> or C<oldstable> as suite in their F<Release> files.
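The I<suite> to I<codename> symlink mentioned above can be sketched in isolation. A minimal illustration (directory names are hypothetical; this is not how debmirror itself creates them):

```shell
# debmirror names dists/ directories after codenames and maintains a
# suite -> codename symlink so apt sources using either name keep working.
mkdir -p dists/sid
ln -sfn sid dists/unstable   # the kind of symlink --omit-suite-symlinks suppresses
readlink dists/unstable      # prints: sid
```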
=item B<-s>, B<--section>=I<section>

Specify the section of Debian to mirror. Defaults to C<main,contrib,non-free,main/debian-installer>.

=item B<-a>, B<--arch>=I<arch>

Specify the architectures to mirror. The default is B<--arch=i386>. Specifying B<--arch=none> will mirror no archs.

=item B<--rsync-extra>=I<dir(s)>

Allows you to also mirror files from a number of directories that are not part of the package archive itself. B<debmirror> will B<always> use rsync for the transfer of these files, irrespective of what transfer method is specified in the B<--method> option. This will therefore not work if your remote mirror does not support rsync, or if the mirror needs a different B<--root> option for rsync than for the main transfer method specified with B<--method>. Note that excluding individual files in the directories is not supported. The following values are supported.

=over 2

=item B<doc>

Download all files and subdirectories in the F<doc> directory, and all README files in the root directory of the archive.

=item B<tools>

Download all files and subdirectories in the F<tools> directory. Note that this directory can contain some rather large files; don't include this type unless you know you need these files.

=item B<indices>

Download all files and subdirectories in the F<indices> directory.

=item B<trace>

Download the remote mirror's trace files for the archive (F<project/trace/*>). This is enabled by default.

=item B<none>

This can be used to disable getting extra files with rsync.

=back

If specified, the update of trace files will be done at the beginning of the mirror run; the other types are done near the end. This switch may be used multiple times, and multiple values may be specified at once, separated by commas; unknown values are ignored.

=item B<--di-dist>=I<dists>

Mirror current Debian Installer images for the specified dists. See further the section L<Mirroring Debian Installer images> below.

=item B<--di-arch>=I<arches>

Mirror current Debian Installer images for the specified architectures. See further the section L<Mirroring Debian Installer images> below.

=item B<--source>

Include source in the mirror (default).

=item B<--nosource>

Do not include source.
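Selections like the dist, arch, section, and source choices above can also be preset in a configuration file (see B<--config-file> and FILES below). Such a file is plain Perl that is loaded with require(), so it simply sets the script's variables and must end with a true value. A minimal sketch; the values shown are examples only:

```perl
# Hypothetical ~/.debmirror.conf -- plain Perl, loaded by debmirror via require().
@dists    = ("sid");              # as with --dist=sid
@arches   = ("i386");             # as with --arch=i386
@sections = ("main", "contrib");  # as with --section=main,contrib
$do_source = 0;                   # as with --nosource
$host      = "ftp.nl.debian.org"; # as with --host
1;  # a require'd file must return a true value
```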
=item B<--i18n>

Additionally download F<Translation-E<lt>langE<gt>.bz2> files, which contain translations of package descriptions. Selection of specific translations is possible using the B<--include> and B<--exclude> options. The default is to download only the English file.

=item B<--getcontents>

Additionally download F<Contents-E<lt>archE<gt>.gz> files. Note that these files can be relatively big and can change frequently, especially for the testing and unstable suites. Use of the available diff files is strongly recommended (see the B<--diff> option).

=item B<--checksums>

Use checksums to determine if files on the local mirror that are the correct size actually have the correct content. Not enabled by default, because it is too paranoid, and too slow. When the state cache is used, B<debmirror> will only check checksums during runs where the cache has expired or been invalidated, so it is worth considering using these two options together.

=item B<--ignore-missing-release>

Don't fail if the F<Release> file is missing.

=item B<--check-gpg>, B<--no-check-gpg>

Controls whether gpg signatures from the F<Release.gpg> file should be checked. The default is to check signatures.

=item B<--keyring>=I<file>

Use I<file> as an additional gpg-format keyring. May be given multiple times. Note that these will be used in addition to $GNUPGHOME/trustedkeys.gpg. The latter can be removed from the set of keyrings by setting $GNUPGHOME to something non-existent when using this option.

On a typical Debian system, the Debian archive keyring can be used directly with this option:

 debmirror --keyring /usr/share/keyrings/debian-archive-keyring.gpg ...

=item B<--ignore-release-gpg>

Don't fail if the F<Release.gpg> file is missing. If the file does exist, it is mirrored and verified, but any errors are ignored.

=item B<--ignore>=I<regex>

Never delete any files whose filenames match the regex. May be used multiple times.

=item B<--exclude>=I<regex>

Never download any files whose filenames match the regex. May be used multiple times.
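Because B<--exclude> and B<--include> take Perl regexes matched against filenames, it can save a failed mirror run to try a pattern against some expected paths first. A rough stand-in check with B<grep> (the pool paths below are made up, and the patterns are plain literals, so ordinary grep matches them the same way Perl would):

```shell
# Two hypothetical pool paths:
printf '%s\n' \
  'pool/main/f/foo/foo_1.0-1_amd64.deb' \
  'pool/main/f/foo-doc/foo-doc_1.0-1_all.deb' > files.txt

# Unanchored 'foo' matches both paths; anchoring on '/foo-doc_' (package
# name plus the version separator) narrows it to the -doc package only:
grep -c 'foo' files.txt        # prints 2
grep -c '/foo-doc_' files.txt  # prints 1
```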
=item B<--include>=I<regex>

Don't exclude any files whose filenames match the regex. May be used multiple times.

=item B<--exclude-deb-section>=I<regex>

Never download any files whose Debian Section (games, doc, oldlibs, science, ...) match the regex. May be used multiple times.

=item B<--limit-priority>=I<regex>

Limit download to files whose Debian Priority (required, extra, optional, ...) match the regex. May be used multiple times.

=item B<--exclude-field>=I<field>=I<regex>

Never download any binary packages where the contents of I<field> match the regex. May be used multiple times. If this option is used and the mirror includes source packages, only those source packages corresponding to included binary packages will be downloaded.

=item B<--include-field>=I<field>=I<regex>

Don't exclude any binary packages where the contents of I<field> match the regex. May be used multiple times. If this option is used and the mirror includes source packages, only those source packages corresponding to included binary packages will be downloaded.

=item B<-t>, B<--timeout>=I<seconds>

Specifies the timeout to use for network operations (either FTP or rsync). Set this to a higher value if you experience failed downloads. Defaults to 300 seconds.

=item B<--max-batch>=I<number>

Download at most I<number> files (and ignore the rest).

=item B<--rsync-batch>=I<number>

Download at most I<number> files with each rsync call and then loop.

=item B<--rsync-options>=I<options>

Specify alternative rsync options to be used. Default options are "-aIL --partial". Care must be taken when specifying alternative options not to disrupt operations; it's best to only add to those options. The most likely option to add is "--bwlimit=x" to avoid saturating the bandwidth of your link.

=item B<--postcleanup>

Clean up the local mirror but only after mirroring is complete and only if there was no error. This is the default, because it ensures that the mirror is consistent at all times.

=item B<--precleanup>

Clean up the local mirror before starting mirroring.
This option may be useful if you have limited disk space, but it will result in an inconsistent mirror when debmirror is running. The deprecated B<--cleanup> option also enables this mode.

=item B<--nocleanup>

Do not clean up the local mirror.

=item B<--skippackages>

Don't re-download F<Packages> and F<Sources> files. Useful if you know they are up-to-date.

=item B<--diff>=I<use|mirror|none>

If B<--diff=use> is specified and the F<Release> file contains entries for diff files, then debmirror will attempt to use them to update F<Packages>, F<Sources>, and F<Contents> files (which can significantly reduce the download size for meta files), but will not include them in the mirror. This is the default behavior and avoids having time-consuming diff files for a fast local mirror.

Specifying B<--diff=mirror> does the same as B<use>, but will also include the downloaded diff files in the local mirror. Specify B<--diff=none> to completely ignore diff files.

Note that if rsync is used as method to download files and the archive being mirrored has "rsyncable" gzipped meta files, then using B<--diff=none> may be the most efficient way to download them. See the B<gzip>(1) man page for information about its rsyncable option.

=item B<--gzip-options>=I<options>

Specify alternative options to be used when calling B<gzip>(1) to compress meta files after applying diffs. The default options are C<-9 -n --rsyncable>, which correspond with the options used to gzip meta files for the main Debian archive. These options may need to be modified if the checksum of the file as gzipped by debmirror does not match the checksum listed in the F<Release> file (which will result in the gzipped file being downloaded unnecessarily after diffs were successfully applied).

=item B<--slow-cpu>

By default debmirror saves some bandwidth by performing cpu-intensive tasks, such as compressing files to generate .gz and .bz2 files. Use this mode if the computer's CPU is slow, and it makes more sense to use more bandwidth and less CPU. This option implies B<--diff=none>.
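The reason the B<--gzip-options> defaults matter is that gzip output is not canonical: the same input can compress to different bytes, and then the local file's checksum no longer matches the F<Release> entry. A small illustration of one such difference, the C<-n> flag (any input text would do; assumes GNU gzip):

```shell
# Without -n, gzip stores a timestamp in the header; with -n it stores
# none. Identical input therefore compresses to different bytes:
echo 'Package: foo' | gzip -9    > with-time.gz
echo 'Package: foo' | gzip -9 -n > no-time.gz
cmp -s with-time.gz no-time.gz || echo 'compressed files differ'
```

This is why debmirror's default C<-9 -n --rsyncable> mirrors how the main Debian archive compresses its meta files: matching options keep the checksums aligned.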
=item B<--state-cache-days>=I<number>

Save the state of the mirror in a cache file between runs. The cache will expire after the specified number of days, at which time a full check and cleanup of the mirror will be done. While the cache is valid, B<debmirror> will trust that the mirror is consistent with this cache.

The cache is only used for files that have a unique name, i.e. binary packages and source files. If a mirror update fails for any reason, the cache will be invalidated and the next run will include a full check.

The main advantage of using the state cache is that it avoids a large amount of disk access while checking which files need to be fetched. It may also reduce the time required for mirror updates.

=item B<--ignore-small-errors>

Normally B<debmirror> will report an error if any deb files or sources fail to download and refuse to update the meta data to an inconsistent mirror. Normally this is a good thing, as it indicates something went wrong during download and should be retried. But sometimes the upstream mirror actually is broken. Specifying B<--ignore-small-errors> causes B<debmirror> to ignore missing or broken deb and source files but still be pedantic about checking meta files.

=item B<--allow-dist-rename>

The directory name for a dist should be equal to its Codename and not to a Suite. If the local mirror currently has directories named after Suites, B<debmirror> can rename them automatically. An existing symlink from I<suite> to I<codename> will be removed, but B<debmirror> will automatically create a new symlink S<suite -E<gt> codename> (immediately after moving meta files in place). This conversion should only be needed once.

=item B<--disable-ssl-verification>

When https is used, debmirror checks that the SSL certificate is valid. If the server has a self-signed certificate, the check can be disabled with this option.

=item B<--debmarshal>

On each pull, keep the repository meta data from dists/* in a numbered subdirectory, and maintain a symlink latest to the most recent pull.
This is similar to Debmarshal in tracking mode; see debmarshal.debian.net for examples and use. debmirror cleanup is disabled when this flag is specified. Separate pool and snapshot cleanup utilities are available at http://code.google.com/p/debmarshal/source/browse/#svn/trunk/repository2

=item B<--config-file>=I<file>

Specify a configuration file. This option may be repeated to read multiple configuration files. By default debmirror reads /etc/debmirror.conf and ~/.debmirror.conf (see section FILES).

=back

=head1 USING DEBMIRROR

=head2 Using regular expressions in options

Various options accept regular expressions that can be used to tune what is included in the mirror. They can be any regular expression valid in I<perl>, which also means that extended syntax is standard. Make sure to anchor regular expressions appropriately: this is not done by debmirror.

The --include and --exclude options can be combined. This combination for example will, if the --i18n option is used, exclude all F<Translation> files, except for the ones for Portuguese (pt) and Brazilian (pt_BR):

 --exclude='/Translation-.*\.bz2$' --include='/Translation-pt.*\.bz2$'

=head2 Mirroring Debian Installer images

Debmirror will only mirror the "current" images that are on the remote mirror. At least one of the options --di-dist or --di-arch must be passed to enable mirroring of the images. The special values "dists" and "arches" can be used to tell debmirror to use the same dists and architectures for D-I images as for the archive, but it is also possible to specify different values. If either option is not set, it will default to the same values as for the archive.

If you wish to create custom CD images using for example I<debian-cd>, you will probably also want to add the option "--rsync-extra=doc,tools".

B<Note:> There are no progress updates displayed for D-I images.

=head2 Archive size

The tables in the file F<mirror_size> give an indication of the space needed to mirror the Debian archive.
They are particularly useful if you wish to set up a partial mirror. Only the size of source and binary packages is included. You should allow for around 1-4 GB of meta data (in F<./dists/E<lt>distE<gt>>) per suite (depending on your settings), plus whatever space is needed for extra directories (e.g. F<doc>, F<tools>) you wish to mirror.

The tables also show how much additional space is required if you add a release on top of its predecessor. Note that the additional space needed for testing and (to a lesser extent) unstable varies during the development cycle of a release. The additional space needed for testing is zero immediately after a stable release and grows from that time onwards.

B<Note:> Debmirror keeps an extra copy of all meta data. This is necessary to guarantee that the local mirror stays consistent while debmirror is running.

=head1 EXAMPLES

Simply make a mirror in F</srv/mirror/debian>, using all defaults (or the settings defined in F</etc/debmirror.conf>):

 debmirror /srv/mirror/debian

Make a mirror of i386 and sparc binaries, main only, and include both unstable and testing versions of Debian; download from 'ftp.nl.debian.org':

 debmirror -a i386,sparc -d sid -d etch -s main --nosource \
   -h ftp.nl.debian.org --progress $HOME/mirror/debian

Make a mirror using rsync (rsync server is 'ftp.debian.org::debian'), excluding the section 'debug' and the package 'foo-doc':

 debmirror -e rsync $HOME/mirror/debian --exclude='/foo-doc_' \
   --exclude-deb-section='^debug$'

=head1 FILES

 /etc/debmirror.conf
 ~/.debmirror.conf

Debmirror will look for the presence of these files and load them in the indicated order if they exist. See the example in /usr/share/doc/debmirror/examples for syntax.

 ~/.gnupg/trustedkeys.gpg

When gpg checking is enabled, debmirror uses gpgv to verify Release and Release.gpg using the default keyring ~/.gnupg/trustedkeys.gpg. This can be changed by exporting GNUPGHOME resulting in $GNUPGHOME/trustedkeys.gpg being used.
(Note that keyring files can also be specified directly with debmirror's --keyring option -- see above).

To add the right key to this keyring you can import it from the debian keyring (in case of the debian archive) using:

 gpg --keyring /usr/share/keyrings/debian-archive-keyring.gpg --export \
   | gpg --no-default-keyring --keyring trustedkeys.gpg --import

or download the key from a keyserver:

 gpg --no-default-keyring --keyring trustedkeys.gpg \
   --keyserver keyring.debian.org --recv-keys <keyid>

The <keyid> can be found in the gpgv error message in debmirror:

 gpgv: Signature made Tue Jan 23 09:07:53 2007 CET using DSA key ID 2D230C5F

=cut

use strict;
use Cwd;
use Storable qw(nstore retrieve);
use Getopt::Long;
use File::Temp qw/ tempfile /;
use File::Path qw(make_path);
use IO::Pipe;
use IO::Select;
use LockFile::Simple;
use Compress::Zlib;
use Digest::MD5;
use Digest::SHA;
use Net::INET6Glue;
use Net::FTP;
use LWP::UserAgent;

# Yeah, I use too many global variables in this program.
our $mirrordir;
our @config_files;
our ($debug, $progress, $verbose, $passive, $skippackages, $getcontents, $i18n);
our ($ua, $proxy, $ftp);
our (@dists, @sections, @arches, @ignores, @excludes, @includes, @keyrings);
our (@excludes_deb_section, @limit_priority);
our (%excludes_field, %includes_field);
our (@di_dists, @di_arches, @rsync_extra);
our $state_cache_days = 0;
our $verify_checksums = 0;
our $pre_cleanup=0;
our $post_cleanup=1;
our $no_cleanup=0;
our $do_source=1;
our $host="ftp.debian.org";
our $user="anonymous";
our $passwd="anonymous@";
our $remoteroot="debian";
our $download_method="ftp";
our $timeout=300;
our $max_batch=0;
our $rsync_batch=200;
our $num_errors=0;
our $bytes_to_get=0;
our $bytes_gotten=0;
our $bytes_meta=0;
our $doing_meta=1;
our $ignore_missing_release=0;
our $ignore_release_gpg=0;
our $start_time = time;
our $dry_run=0;
our $do_dry_run=0;
our $rsync_options="-aIL --partial";
our $ignore_small_errors=0;
our $diff_mode="use";
our $gzip_options="-9 -n --rsyncable";
our
$omit_suite_symlinks=0;
our $allow_dist_rename=0;
our $debmarshal=0;
our $disable_ssl_verification;
our $slow_cpu=0;
our $check_gpg=1;
our $new_mirror=0;

my @errlog;

my $HOME;
($HOME = $ENV{'HOME'}) or die "HOME not defined in environment!\n";

# Load in config files first so options can override them.
Getopt::Long::Configure qw(pass_through);
GetOptions('config-file=s' => \@config_files);
if (@config_files) {
  foreach my $config_file (@config_files) {
    die "Can't open config file $config_file!\n" if ! -r $config_file;
    require $config_file;
  }
} else {
  require "/etc/debmirror.conf" if -r "/etc/debmirror.conf";
  require "$HOME/.debmirror.conf" if -r "$HOME/.debmirror.conf";
}

# This hash contains the releases to mirror. If both codename and suite can be
# determined from the Release file, the codename is used in the key. If not,
# it can also be a suite (or whatever was requested by the user).
# The hash has three subtypes:
# - suite: if both codename and suite could be determined from the Release file,
#   the codename is the key and the value is the name of the suite - used to
#   update the suite -> codename symlinks;
# - mirror: set to 1 if the package archive should be mirrored for the dist;
# - d-i: set to 1 if D-I images should be mirrored for the dist.
# For the last two subtypes the key can also include a subdir.
my %distset=();

# This hash holds all the files we know about. Values are:
# - -1: file was not on mirror and download attempt failed
# -  0: file was not on mirror and either needs downloading or was
#       downloaded this run
# -  1: file is on mirror and wanted according to meta data
# -  2: file is on mirror and listed in state cache, but not (yet)
#       verified as wanted according to meta data
# Values -1 and 2 can occur in the state cache; see $files_cache_version
# below! Filenames should be relative to $mirrordir.
my %files;

# Hash to record size and checksums of meta files and package files (from the
# Release file and Source/Packages files).
my %file_lists;

# Hash to record which Translation files need downloading. Contains size and
# sha1 info. Files also get registered in %files.
my %i18n_get;

# Separate hash for files belonging to Debian Installer images.
# This data is not cached.
my %di_files;

## State cache meta-data
my $use_cache = 0;
my $state_cache_exptime;
# Next variable *must* be changed if the structure of the %files hash is
# changed in a way that makes old state-cache files incompatible.
my $files_cache_version = "1.0";

my $help;
Getopt::Long::Configure qw(no_pass_through);
GetOptions('debug' => \$debug,
           'progress|p' => \$progress,
           'verbose|v' => \$verbose,
           'source!' => \$do_source,
           'checksums!' => \$verify_checksums,
           'md5sums|m' => \$verify_checksums, # back compat
           'passive!' => \$passive,
           'host|h=s' => \$host,
           'user|u=s' => \$user,
           'passwd=s' => \$passwd,
           'root|r=s' => \$remoteroot,
           'dist|d=s' => \@dists,
           'section|s=s' => \@sections,
           'arch|a=s' => \@arches,
           'di-dist=s' => \@di_dists,
           'di-arch=s' => \@di_arches,
           'rsync-extra=s' => \@rsync_extra,
           'precleanup' => \$pre_cleanup,
           'cleanup' => \$pre_cleanup,
           'postcleanup' => \$post_cleanup,
           'nocleanup' => \$no_cleanup,
           'ignore=s' => \@ignores,
           'exclude=s' => \@excludes,
           'exclude-deb-section=s' => \@excludes_deb_section,
           'limit-priority=s' => \@limit_priority,
           'include=s' => \@includes,
           'exclude-field=s' => \%excludes_field,
           'include-field=s' => \%includes_field,
           'skippackages' => \$skippackages,
           'i18n' => \$i18n,
           'getcontents' => \$getcontents,
           'method|e=s' => \$download_method,
           'timeout|t=s' => \$timeout,
           'max-batch=s' => \$max_batch,
           'rsync-batch=s' => \$rsync_batch,
           'state-cache-days=s' => \$state_cache_days,
           'ignore-missing-release' => \$ignore_missing_release,
           'ignore-release-gpg' => \$ignore_release_gpg,
           'check-gpg!'
=> \$check_gpg, 'dry-run' => \$dry_run, 'proxy=s' => \$proxy, 'rsync-options=s' => \$rsync_options, 'gzip-options=s' => \$gzip_options, 'ignore-small-errors' => \$ignore_small_errors, 'diff=s' => \$diff_mode, 'omit-suite-symlinks' => \$omit_suite_symlinks, 'allow-dist-rename' => \$allow_dist_rename, 'debmarshal' => \$debmarshal, 'slow-cpu' => \$slow_cpu, 'disable-ssl-verification' => \$disable_ssl_verification, 'keyring=s' => \@keyrings, 'help' => \$help, ) or usage; usage if $help; usage("invalid number of arguments") if $ARGV[1]; # This parameter is so important that it is the only required parameter, # unless specified in a configuration file. $mirrordir = shift if $ARGV[0]; usage("mirrordir not specified") unless defined $mirrordir; $diff_mode="none" if $slow_cpu; if ($download_method eq 'hftp') { # deprecated $download_method='ftp'; } # Check for patch binary if needed if (!($diff_mode eq "none")) { if (system("patch --version 2>/dev/null >/dev/null")) { say("Patch binary missing, falling back to --diff=none"); push (@errlog,"Patch binary missing, falling back to --diff=none\n"); $diff_mode = "none"; } if (system("ed --version 2>/dev/null >/dev/null")) { say("Ed binary missing, falling back to --diff=none"); push (@errlog,"Ed binary missing, falling back to --diff=none\n"); $diff_mode = "none"; } } # Backwards compatibility: remote root dir no longer needs prefix $remoteroot =~ s%^[:/]%%; # Post-process arrays. Allow commas to separate values the user entered. # If the user entered nothing, provide defaults. 
@dists=split(/,/,join(',',@dists));
@dists=qw(sid) unless @dists;
@sections=split(/,/,join(',',@sections));
@sections=qw(main contrib non-free main/debian-installer) unless @sections;
@arches=split(/,/,join(',',@arches));
@arches=qw(i386) unless @arches;
@arches=() if (join(',',@arches) eq "none");
@di_dists=split(/,/,join(',',@di_dists));
@di_arches=split(/,/,join(',',@di_arches));
if (@di_dists) {
  @di_dists = @dists if ($di_dists[0] eq "dists");
  @di_arches = @arches if (!@di_arches || $di_arches[0] eq "arches");
} elsif (@di_arches) {
  @di_dists = @dists if (!@di_dists);
  @di_arches = @arches if ($di_arches[0] eq "arches");
}
@rsync_extra=split(/,/,join(',',@rsync_extra));
@rsync_extra="trace" unless @rsync_extra;
if (! grep { $_ eq 'trace' } @rsync_extra) {
  print STDERR "Warning: --rsync-extra is not configured to mirror the trace files.\n";
  print STDERR " This configuration is not recommended.\n";
}
@rsync_extra=() if grep { $_ eq "none" } @rsync_extra;
$pre_cleanup=0 if ($no_cleanup);
$pre_cleanup=0 if ($debmarshal);
$post_cleanup=0 if ($no_cleanup);
$post_cleanup=0 if ($pre_cleanup);
$post_cleanup=0 if ($debmarshal);

# Display configuration.
$|=1 if $debug;
if ($passwd eq "anonymous@") {
  if ($download_method eq "http") {
    say("Mirroring to $mirrordir from $download_method://$host/$remoteroot/");
  } else {
    say("Mirroring to $mirrordir from $download_method://$user\@$host/$remoteroot/");
  }
} else {
  say("Mirroring to $mirrordir from $download_method://$user:XXX\@$host/$remoteroot/");
}
say("Arches: ".join(",", @arches));
say("Dists: ".join(",", @dists));
say("Sections: ".join(",", @sections));
say("Including source.") if $do_source;
say("D-I arches: ".join(",", @di_arches)) if @di_arches;
say("D-I dists: ".join(",", @di_dists)) if @di_dists;
say("Pdiff mode: $diff_mode");
say("Slow CPU mode.") if $slow_cpu;
say("Verifying checksums.") if $verify_checksums;
say("Not checking Release gpg signatures.") if !
$check_gpg; say("Passive mode on.") if $passive; say("Proxy: $proxy") if $proxy; say("Download at most $max_batch files.") if ($max_batch > 0); say("Download at most $rsync_batch files per rsync call.") if ($download_method eq "rsync"); if ($pre_cleanup) { say("Will clean up before mirroring."); } elsif ($post_cleanup) { say("Will clean up after mirroring."); } else { say("Will NOT clean up."); } say("Dry run.") if $dry_run; say("Debmarshal snapshots kept.") if $debmarshal; say("Disable SSL verification.") if $disable_ssl_verification; # Set up mirror directory and resolve $mirrordir to a full path for # locking and rsync if (! -d $mirrordir) { make_dir($mirrordir); $new_mirror = 1; } die "You need write permissions on $mirrordir" if (! -w $mirrordir); chdir($mirrordir) or die "chdir $mirrordir: $!"; $mirrordir = cwd(); # Handle the lock file. This is the same method used by official # Debian push mirrors. my $hostname=`hostname -f 2>/dev/null || hostname`; chomp $hostname; my $lockfile="Archive-Update-in-Progress-$hostname"; say("Attempting to get lock ..."); my $lockmgr = LockFile::Simple->make(-format => "%f/$lockfile", -max => 12, -delay => 10, -nfs => 1, -autoclean => 1, -warn => 1, -stale => 1, -hold => 0); my $lock = $lockmgr->lock("$mirrordir") or die "$lockfile exists or you lack proper permissions; aborting"; $SIG{INT}=sub { $lock->release; exit 1 }; $SIG{TERM}=sub { $lock->release; exit 1 }; # Create tempdir if missing my $tempdir=".temp"; make_dir($tempdir) if (! -d $tempdir); die "You need write permissions on $tempdir" if (! -w $tempdir); # Load the state cache. load_state_cache() if $state_cache_days; # Register the trace and lock files. 
my $tracefile="project/trace/$hostname"; $files{$tracefile}=1; $files{$lockfile}=1; my $rsynctempfile; END { unlink $rsynctempfile if $rsynctempfile } sub init_connection { $_ = $download_method; /^http$/ && do { $ua = LWP::UserAgent->new(keep_alive => 1); $ua->timeout($timeout); $ua->proxy('http', $ENV{http_proxy}) if $ENV{http_proxy}; $ua->proxy('http', $proxy) if $proxy; $ua->show_progress($progress); return; }; /^https$/ && do { $ua = LWP::UserAgent->new(keep_alive => 1, ssl_opts => { verify_hostname => ! $disable_ssl_verification }); $ua->timeout($timeout); $ua->proxy('https', $ENV{https_proxy}) if $ENV{https_proxy}; $ua->proxy('https', $proxy) if $proxy; $ua->show_progress($progress); return; }; /^ftp$/ && do { if ($proxy || $ENV{ftp_proxy}) { $ua = LWP::UserAgent->new; $ua->timeout($timeout); $ua->proxy('ftp', $proxy ? $proxy : $ENV{ftp_proxy}); } else { my %opts = (Debug => $debug, Passive => $passive, Timeout => $timeout); $ftp=Net::FTP->new($host, %opts) or die "$@\n"; $ftp->login($user, $passwd) or die "login failed"; # anonymous $ftp->binary or die "could not set binary mode"; $ftp->cwd("/$remoteroot") or die "cwd to /$remoteroot failed"; $ftp->hash(\*STDOUT,102400) if $progress; } return; }; /^rsync$/ && do { return; }; usage("unknown download method: $_"); } init_connection(); # determine remote root for rsync transfers my $rsyncremote; if (length $remoteroot) { $rsyncremote = "$host\:\:$remoteroot/"; if ($user ne 'anonymous') { $rsyncremote = "$user\@$rsyncremote"; } } else { if ($download_method eq 'rsync') { die "rsync cannot be used with a root of $remoteroot/\n"; } } # Update the remote trace files; also update ignores for @rsync_extra. 
rsync_extra(1, @rsync_extra); # Get Release files without caching for http say("Getting meta files ..."); $ua->default_header( "Cache-Control" => "max-age=0" ) if ($ua); foreach my $dist (@dists) { my $tdir="$tempdir/.tmp/dists/$dist"; my $have_release = get_release($tdir, $dist); next unless ($have_release || $ignore_missing_release); my ($codename, $suite, $dist_sdir) = name_release("mirror", $tdir, $dist); if ($have_release) { my $next; make_dir ("dists/$codename$dist_sdir"); make_dir ("$tempdir/dists/$codename$dist_sdir"); rename("$tdir/Release", "$tempdir/dists/$codename$dist_sdir/Release") or die "Error while moving $tdir/Release: $!\n"; $files{"dists/$codename$dist_sdir/Release"}=1; $files{$tempdir."/"."dists/$codename$dist_sdir/Release"}=1; if ($debmarshal) { $next = make_next_snapshot($mirrordir,$dist,$codename, $dist_sdir,$tempdir); } if (-f "$tdir/Release.gpg") { rename("$tdir/Release.gpg", "$tempdir/dists/$codename$dist_sdir/Release.gpg") or die "Error while moving $tdir/Release.gpg: $!\n"; $files{"dists/$codename$dist_sdir/Release.gpg"}=1; $files{$tempdir."/"."dists/$codename$dist_sdir/Release.gpg"}=1; if ($debmarshal) { link_release_into_snapshot($mirrordir,$dist,$next,$tempdir, $codename,$dist_sdir); } } } } # Check that @di_dists contains valid codenames di_check_dists() if @di_dists; foreach my $dist (keys %distset) { next unless exists $distset{$dist}{mirror}; # Parse the Release and extract the files listed for all checksum types. 
if (open RELEASE, "<$tempdir/dists/$dist/Release") {
    my $checksum_type;
    while (<RELEASE>) {
      if (/^(MD5Sum|SHA\d+):/) {
        $checksum_type=$1;
      } elsif (/^ / && defined $checksum_type) {
        my ($checksum, $size, $filename) = /^ +([a-z0-9]+) +(\d+) +(.*)$/;
        $file_lists{"$tempdir/dists/$dist/$filename"}{$checksum_type} = $checksum;
        $file_lists{"$tempdir/dists/$dist/$filename"}{size} = $size;
      }
    }
    close RELEASE;
  }
}

if ($num_errors != 0 && $ignore_missing_release) {
  say("Ignoring failed Release files.");
  push (@errlog,"Ignoring failed Release files\n");
  $num_errors = 0;
}

if ($num_errors != 0) {
  print "Errors:\n ".join(" ",@errlog) if (@errlog);
  die "Failed to download some Release or Release.gpg files!\n";
}

# Enable caching again for http
init_connection if ($ua);

# Calculate expected downloads for meta files
# As we don't actually download most of the meta files (due to getting
# only one compression variant or using diffs), we keep a separate count
# of the actual downloaded amount of data in $bytes_meta.
# The root Release files have already been downloaded
$bytes_to_get = $bytes_meta;
$bytes_gotten = $bytes_meta;

sub add_bytes {
  my $name=shift;
  $bytes_to_get += $file_lists{"$tempdir/$name"}{size}
    if exists $file_lists{"$tempdir/$name"};
}
foreach my $dist (keys %distset) {
  next unless exists $distset{$dist}{mirror};
  foreach my $section (@sections) {
    foreach my $arch (@arches) {
      add_bytes("dists/$dist/$section/binary-$arch/Packages");
      add_bytes("dists/$dist/$section/binary-$arch/Packages.gz");
      add_bytes("dists/$dist/$section/binary-$arch/Packages.bz2");
      add_bytes("dists/$dist/$section/binary-$arch/Release");
      add_bytes("dists/$dist/$section/binary-$arch/Packages.diff/Index")
        unless ($diff_mode eq "none");
    }
    # d-i does not have separate source sections
    if ($do_source && $section !~ /debian-installer/) {
      add_bytes("dists/$dist/$section/source/Sources");
      add_bytes("dists/$dist/$section/source/Sources.gz");
      add_bytes("dists/$dist/$section/source/Sources.bz2");
      add_bytes("dists/$dist/$section/source/Release");
      add_bytes("dists/$dist/$section/source/Sources.diff/Index")
        unless ($diff_mode eq "none");
    }
    add_bytes("dists/$dist/$section/i18n/Index");
  }
}

# Get and parse MD5SUMS files for D-I images.
# (There are not currently other checksums for these.)
di_add_files() if @di_dists;

# Get Packages and Sources files and other miscellany.
my (@package_files, @source_files);
foreach my $dist (keys %distset) {
  next unless exists $distset{$dist}{mirror};
  foreach my $section (@sections) {
    # some suites don't have d-i
    next if ($section =~ /debian-installer/ && di_skip_dist($dist) );
    foreach my $arch (@arches) {
      get_index("dists/$dist/$section/binary-$arch", "Packages");
      link_index($dist,$section,$arch) if $debmarshal;
    }
    # d-i does not have separate source sections
    if ($do_source && $section !~ /debian-installer/) {
      get_index("dists/$dist/$section/source", "Sources");
      link_index($dist,$section,"source") if $debmarshal;
    }
  }
}

# Set download size for meta files to actual values
$doing_meta=0;
$bytes_to_get=$bytes_meta;
$bytes_gotten=$bytes_meta;

# Sanity check. I once nuked a mirror because of this..
if (@arches && ! @package_files) {
  print "Errors:\n ".join(" ",@errlog) if (@errlog);
  die "Failed to download any Packages files!\n";
}
if ($do_source && ! @source_files) {
  print "Errors:\n ".join(" ",@errlog) if (@errlog);
  die "Failed to download any Sources files!\n";
}
if ($num_errors != 0) {
  print "Errors:\n ".join(" ",@errlog) if (@errlog);
  die "Failed to download some Package, Sources or Release files!\n";
}

# Activate dry-run option now if it was given. This delay is needed
# for the ftp method.
$do_dry_run = $dry_run;

# Determine size of Contents and Translation files to get.
if ($getcontents) {
  # Updates of Contents files using diffs are done here; only full downloads
  # are delayed.
  say("Update Contents files.") if ($diff_mode ne "none");
  foreach my $dist (keys %distset) {
    next unless exists $distset{$dist}{mirror};
    foreach my $arch (@arches) {
      next if $dist=~/experimental/;
      next if $dist=~/.*-proposed-updates/;
      next if $arch=~/source/;
      # In Debian Wheezy, the Contents-*.gz moved to '/dists/$dist/$sect/'.
      # This handles the new location, but also checks the old location
      # for backwards compatibility.
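      # For example (illustrative names), with $dist "wheezy" and "main" in
      # @sections, the loop below checks both
      #   dists/wheezy/main/Contents-i386.gz  (new location, $sect = "/main")
      # and
      #   dists/wheezy/Contents-i386.gz       (old location, via the empty
      #                                        string appended to @sects).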
      push my @sects, @sections, "";
      foreach my $sect (@sects) {
        if ($sect ne "") {$sect = "/$sect";}
        if (exists $file_lists{"$tempdir/dists/$dist$sect/Contents-$arch.gz"}) {
          if ($diff_mode ne "none") {
            if (!update_contents("dists/$dist$sect", "Contents-$arch")) {
              add_bytes("dists/$dist$sect/Contents-$arch.gz");
            }
          } elsif (!check_lists("$tempdir/dists/$dist$sect/Contents-$arch.gz")) {
            add_bytes("dists/$dist$sect/Contents-$arch.gz");
          }
        }
      }
    }
  }
}
foreach my $dist (keys %distset) {
  next unless exists $distset{$dist}{mirror};
  foreach my $section (@sections) {
    i18n_from_release($dist,"$section/i18n");
  }
}

# close ftp connection to avoid timeouts, will reopen later
if ($ftp) {
  $ftp->quit;
}

say("Parsing Packages and Sources files ...");
{
  local $/="\n\n"; # Set input separator to read entire package
  my $empty_mirror = 1;
  my %arches = map { $_ => 1 } (@arches, "all");
  my $include = "(".join("|", @includes).")" if @includes;
  my $exclude = "(".join("|", @excludes).")" if @excludes;
  my $exclude_deb_section = "(".join("|", @excludes_deb_section).")"
    if @excludes_deb_section;
  my $limit_priority = "(".join("|", @limit_priority).")" if @limit_priority;
  my $field_filters = scalar(keys %includes_field) || scalar(keys %excludes_field);
  my %binaries;
  foreach my $file (@package_files) {
    next if (!-f $file);
    open(FILE, "<", $file) or die "$file: $!";
    for (;;) {
      unless (defined( $_ = <FILE> )) {
        last if eof;
        die "$file: $!" if $!;
      }
      my ($filename)=m/^Filename:\s+(.*)/im;
      $filename=~s:/+:/:; # remove redundant slashes in paths
      my ($deb_section)=m/^Section:\s+(.*)/im;
      my ($deb_priority)=m/^Priority:\s+(.*)/im;
      my ($architecture)=m/^Architecture:\s+(.*)/im;
      next if (!$arches{$architecture});
      if(!(defined($include) && ($filename=~/$include/o))) {
        next if (defined($exclude) && $filename=~/$exclude/o);
        next if (defined($exclude_deb_section) && defined($deb_section)
                 && $deb_section=~/$exclude_deb_section/o);
        next if (defined($limit_priority) && defined($deb_priority) && !
                 ($deb_priority=~/$limit_priority/o));
      }
      next if $field_filters && !check_field_filters($_);
      my ($package)=m/^Package:\s+(.*)/im;
      $binaries{$package} = 1;
      # File was listed in state cache, or file occurs multiple times
      if (exists $files{$filename}) {
        if ($files{$filename} >= 0) {
          $files{$filename} = 1 if $files{$filename} == 2;
          $empty_mirror = 0;
          next;
        } else {
          # download failed previous run, retry
          $files{$filename} = 0;
        }
      }
      my ($size)=m/^Size:\s+(\d+)/im;
      my %checksums;
      while (m/^(MD5sum|SHA\d+):\s+([A-Za-z0-9]+)/img) {
        $checksums{$1}=$2;
      }
      if (check_file(filename => $filename, size => $size, %checksums)) {
        $files{$filename} = 1;
      } else {
        $files{$filename} = 0;
        $file_lists{$filename} = \%checksums;
        $file_lists{$filename}{size} = $size;
        $bytes_to_get += $size;
      }
      $empty_mirror = 0;
    }
    close(FILE);
  }

  foreach my $file (@source_files) {
    next if (!-f $file);
    open(FILE, "<", $file) or die "$file: $!";
    SOURCE: for (;;) {
      my $stanza;
      unless (defined( $stanza = <FILE> )) {
        last if eof;
        die "$file: $!" if $!;
      }
      my @lines=split(/\n/, $stanza);
      my $directory;
      my %source_files;
      my $parse_source_files=sub {
        my $checksum_type=shift;
        while (@lines && $lines[0] =~ m/^ ([A-Za-z0-9]+ .*)/) {
          my ($checksum, $size, $filename)=split(' ', $1, 3);
          $source_files{$filename}{size}=$size;
          $source_files{$filename}{$checksum_type}=$checksum;
          shift @lines;
        }
      };
      while (@lines) {
        my $line=shift @lines;
        if ($line=~/^Directory:\s+(.*)/i) {
          $directory=$1;
        } elsif ($line=~/^Section:\s+(.*)/i) {
          my $deb_section=$1;
          next SOURCE if (defined($exclude_deb_section) && defined($deb_section)
                          && $deb_section=~/$exclude_deb_section/o);
        } elsif ($line=~/^Priority:\s+(.*)/i) {
          my $deb_priority=$1;
          next SOURCE if (defined($limit_priority) && defined($deb_priority) && !
                          ($deb_priority=~/$limit_priority/o));
        } elsif ($line=~/^Binary:\s+(.*)/i) {
          if ($field_filters) {
            my @binary_names=split(/\s*,\s*/,$1);
            my $fetching_binary=0;
            for my $binary_name (@binary_names) {
              if (exists $binaries{$binary_name}) {
                $fetching_binary=1;
                last;
              }
            }
            next SOURCE unless $fetching_binary;
          }
        } elsif ($line=~/^Files:/i) {
          $parse_source_files->("MD5Sum");
        } elsif ($line=~/^Checksums-(\w+):/i) {
          $parse_source_files->($1);
        }
      }
      foreach my $filename (keys %source_files) {
        my %file_data=%{$source_files{$filename}};
        $filename="$directory/$filename";
        $filename=~s:/+:/:; # remove redundant slashes in paths
        if(!(defined($include) && $filename=~/$include/o)) {
          next if (defined($exclude) && $filename=~/$exclude/o);
        }
        # File was listed in state cache, or file occurs multiple times
        if (exists $files{$filename}) {
          if ($files{$filename} >= 0) {
            $files{$filename} = 1 if $files{$filename} == 2;
            $empty_mirror = 0;
            next;
          } else {
            # download failed previous run, retry
            $files{$filename} = 0;
          }
        }
        if (check_file(filename => $filename, %file_data)) {
          $files{$filename} = 1;
        } else {
          $files{$filename} = 0;
          $file_lists{$filename} = \%file_data;
          $bytes_to_get += $file_data{size};
        }
      }
      $empty_mirror = 0;
    }
    close(FILE);
  }

  # Sanity check to avoid completely nuking a mirror.
  if ($empty_mirror && ! $new_mirror) {
    print "Errors:\n ".join(" ",@errlog) if (@errlog);
    die "No packages after parsing Packages and Sources files!\n";
  }
}

# With pre-mirror cleanup Contents and Translation files need to be
# downloaded before the cleanup as otherwise they would be deleted
# because they haven't been registered yet.
# With post-mirror cleanup it's more neat to do all downloads together.
# This could be simplified if we could register the files earlier.

# Download Contents and Translation files.
init_connection();
get_contents_files() if ($getcontents);
get_i18n_files();

# Pre-mirror cleanup
if ($pre_cleanup) {
  # close ftp connection during cleanup to avoid timeouts
  if ($ftp) {
    $ftp->quit;
  }
  cleanup_unknown_files();
  init_connection();
}

say("Files to download: ".print_dl_size($bytes_to_get - $bytes_gotten));

# Download all package files that we need to get.
batch_get();

sub batch_get {
  if ($download_method eq 'ftp' || $download_method eq 'http'
      || $download_method eq 'https') {
    my $dirname;
    my $i=0;
    foreach my $file (sort keys %files) {
      if (!$files{$file}) {
        if (($dirname) = $file =~ m:(.*)/:) {
          make_dir($dirname);
        }
        if ($ftp) {
          ftp_get($file);
        } else {
          http_get($file);
        }
        if ($max_batch > 0 && ++$i >= $max_batch) {
          push (@errlog,"Batch limit exceeded, mirror run was partial\n");
          $num_errors++;
          last;
        }
      }
    }
    return;
  } else {
    my $opt=$rsync_options;
    my $fh;
    my @result;
    my $i=0;
    my $j=0;
    $opt .= " --progress" if $progress;
    $opt .= " -v" if $verbose or $debug;
    $opt .= " -n" if $do_dry_run;
    $opt .= " --no-motd" unless $verbose;
    foreach my $file (sort keys %files) {
      if (!$files{$file}) {
        my $dirname;
        my @dir;
        ($dirname) = $file =~ m:(.*/):;
        @dir= split(/\//, $dirname);
        for (0..$#dir) {
          push (@result, "" . join('/', @dir[0..$_]) . "/");
        }
        push (@result, "$file");
        if (++$j >= $rsync_batch) {
          $j = 0;
          ($fh, $rsynctempfile) = tempfile();
          if (@result) {
            @result = sort(@result);
            my $prev = "not equal to $result[0]";
            @result = grep($_ ne $prev && ($prev = $_, 1), @result);
            for (@result) {
              print $fh "$_\n";
            }
          }
          system ("rsync --timeout=$timeout $opt $rsyncremote --include-from=$rsynctempfile --exclude='*' $mirrordir");
          die "rsync failed!" if ($?
                                  != 0);
          close $fh;
          unlink $rsynctempfile;
          foreach my $dest (@result) {
            if (-f $dest) {
              if (!check_lists($dest)) {
                say("$dest failed checksum verification");
                $num_errors++;
              }
            } elsif (!-d $dest) {
              say("$dest missing");
              $num_errors++;
            }
          }
          @result = ();
        }
        if ($max_batch > 0 && ++$i >= $max_batch) {
          print "Batch limit exceeded, mirror run will be partial\n";
          push (@errlog,"Batch limit exceeded, mirror run was partial\n");
          $num_errors++;
          last;
        }
      }
    }
    ($fh, $rsynctempfile) = tempfile();
    if (@result) {
      @result = sort(@result);
      my $prev = "not equal to $result[0]";
      @result = grep($_ ne $prev && ($prev = $_, 1), @result);
      for (@result) {
        print $fh "$_\n";
      }
      system ("rsync --timeout=$timeout $opt $rsyncremote --include-from=$rsynctempfile --exclude='*' $mirrordir");
      close $fh;
      foreach my $dest (@result) {
        if (-f $dest) {
          if (!check_lists($dest)) {
            say("$dest failed checksum verification");
            $num_errors++;
          }
        } elsif (!-d $dest) {
          say("$dest missing");
          $num_errors++;
        }
      }
    }
    return;
  }
}

if (! @di_dists) {
  download_finished();
}

say("Everything OK. Moving meta files ...");
if ($debmarshal) {
  update_latest_links($mirrordir, $tempdir, @dists);
}
chdir($tempdir) or die "unable to chdir($tempdir): $!\n";
my $res=0;
foreach my $file (`find . -type f`) {
  chomp $file;
  $file=~s:^\./::;
  # this skips diff files if unwanted
  next if (!exists $files{$file});
  print("Moving $file\n") if ($debug);
  if (! $do_dry_run) {
    $res &= unlink($mirrordir."/".$file) if ($mirrordir."/".$file);
    "$file" =~ m,(^.*)/,;
    make_dir("$mirrordir/$1");
    if (!link($file, $mirrordir."/".$file)) {
      $res &= system("cp $file $mirrordir/$file");
    }
  }
}
chdir($mirrordir) or die "chdir $mirrordir: $!";

# Get optional directories using rsync.
rsync_extra(0, @rsync_extra);

# Download D-I images.
if (@di_dists) {
  di_get_files();
  download_finished();
}

# Update suite->codename symlinks.
if (! $omit_suite_symlinks && !
    $do_dry_run) {
  my %suites;
  opendir (DIR, 'dists') or die "Can't open dists/: $!\n";
  foreach my $file (grep (!/^\.\.?$/, readdir (DIR))) {
    if (-l "dists/$file") {
      my $cur = readlink("dists/$file")
        or die "Error reading symlink dists/$file: $!";
      if (exists $distset{$cur}{suite}
          && ($file eq $distset{$cur}{suite}
              || $file eq "stable-$distset{$cur}{suite}")) {
        $suites{$file} = "ok";
      } else {
        unlink("dists/$file")
          or die "Failed to remove symlink dists/$file: $!";
      }
    }
  }
  closedir (DIR);
  foreach my $dist (keys %distset) {
    next if (! exists $distset{$dist}{suite});
    next if (!-d "dists/$dist");
    my $suite = $distset{$dist}{suite};
    if (! exists $suites{$suite}) {
      symlink("$dist", "dists/$suite")
        or die "Failed to create symlink dists/$suite: $!";
    }
    if ($suite eq "proposed-updates" && !exists $suites{"stable-$suite"}) {
      symlink("$dist", "dists/stable-$suite")
        or die "Failed to create symlink dists/stable-$suite: $!";
    }
  }
}

# Write out trace file.
if (! $do_dry_run) {
  make_dir("project/trace");
  open OUT, ">$tracefile" or die "$tracefile: $!";
  print OUT `date -u`;
  close OUT;
}

# Post mirror cleanup.
cleanup_unknown_files() if ($post_cleanup && ! $debmarshal);

# Mirror cleanup for directories.
if (! $use_cache && ($pre_cleanup || $post_cleanup)) {
  # Remove all empty directories. Not done as part of main cleanup
  # to prevent race problems with pool download code, which
  # makes directories.. Sort so they are removable in bottom-up
  # order.
  chdir($mirrordir) or die "chdir $mirrordir: $!";
  system("find . -depth -type d ! -name . ! -name .. -print0 | xargs -0 rmdir 2>/dev/null")
    if (! $do_dry_run);
}

if ($res != 0) {
  die("Failed to move some meta files.");
}

# Save the state cache.
save_state_cache() if $state_cache_days && !$do_dry_run;

say("All done.");
$lock->release;
print "Errors:\n ".join(" ",@errlog) if (@errlog);
if ($num_errors != 0) {
  print "Failed to download files ($num_errors errors)!\n";
  exit 1 if (!$ignore_small_errors);
}
exit;

sub print_dl_size {
  my $size=shift;
  my $unit;
  if ($size >= 10*1000*1024) {
    $size=int($size/1024/1024);
    $unit="MiB";
  } elsif ($size >= 10*1000) {
    $size=int($size/1024);
    $unit="kiB";
  } else {
    $unit="B";
  }
  return "$size $unit";
}

sub add_bytes_gotten {
  my $size=shift;
  $bytes_gotten += $size;
  if ($doing_meta) {
    $bytes_meta += $size;
  }
}

# Return true if a package stanza is permitted by
# --include-field/--exclude-field.
sub check_field_filters {
  my $stanza = shift;
  for my $name (keys %includes_field) {
    if ($stanza=~/^\Q$name\E:\s+(.*)/im) {
      my $value=$1;
      return 1 if $value=~/$includes_field{$name}/;
    }
  }
  return 0 if keys %includes_field;
  for my $name (keys %excludes_field) {
    if ($stanza=~/^\Q$name\E:\s+(.*)/im) {
      my $value=$1;
      return 0 if $value=~/$excludes_field{$name}/;
    }
  }
  return 1;
}

# Takes named parameters: filename, size.
#
# Optionally can also be passed parameters specifying expected checksums
# for the file, using checksum names as in the Release/Packages/Sources files
# ("SHA1", "MD5Sum", etc).
#
# Size is always checked; verifying the checksum is optional. However, if
# a value of -1 is passed for size, a check of the checksum is forced.
#
# It will return true if the tests show the file matches.
sub check_file {
  my %params=@_;
  my ($filename, $size)=delete @params{qw{filename size}};
  if (! -f $filename) {
    return 0;
  }
  my $disksize = -s _;
  if ($size == $disksize || $size == -1) {
    if ($verify_checksums || $size == -1) {
      # Prefer checking stronger checksums, and failing that, fall back
      # to whatever checksums are present and supported, trying to prefer
      # FOObignum over FOOsmallnum.
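      # For example, if a stanza supplies both MD5Sum and SHA256, SHA256 is
      # used; SHA512 would take precedence over both. Types not in the
      # explicit list below are tried in reverse-sorted order, so e.g. a
      # SHA384 entry would be preferred over a SHA224 one.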
      my ($summer, $checksum);
      foreach my $checksum_type ("SHA512", "SHA256", "SHA1",
                                 reverse sort keys %params) {
        next unless defined $params{$checksum_type};
        if (lc $checksum_type eq 'md5sum') {
          $summer=Digest::MD5->new;
        } elsif ($checksum_type=~/^sha(\d+)$/i) {
          # returns undef on unknown/too large SHA type
          $summer=Digest::SHA->new($1);
        }
        if (defined $summer) {
          $checksum=$params{$checksum_type};
          last;
        }
      }
      if (! defined $summer) {
        die "unsupported checksum type(s): ".(join(" ", keys %params))."\n";
      }
      open HANDLE, $filename or die "$filename: $!";
      $summer->addfile(*HANDLE);
      return 1 if $checksum eq $summer->hexdigest;
    } else {
      return 1;
    }
  }
  return 0;
}

# Always checks both file size and sha1 as the files get updated (this is
# similar to what is done in check_lists, which forces verify_checksums).
sub check_i18n {
  my ($filename, $size, $sha1)=@_;
  my $digest = Digest::SHA->new(1);
  my $ret = 0;
  if (-f "$filename" and ($size == -s _)) {
    open HANDLE, $filename or die "$filename: $!";
    $digest->addfile(*HANDLE);
    $ret = ($sha1 eq $digest->hexdigest);
  }
  return $ret;
}

# Check uncompressed diff content against sha1sum from Index file.
sub check_diff {
  my ($filename, $size, $sha1) = @_;
  my $digest = Digest::SHA->new(1);
  my $ret = 0;
  if (-f "$filename.gz") {
    system_redirect_io("gzip -d", "$filename.gz", "$filename");
    if ($size == -s $filename) {
      open HANDLE, $filename or die "$filename: $!";
      $digest->addfile(*HANDLE);
      $ret = ($sha1 eq $digest->hexdigest);
    }
    unlink ($filename);
  }
  return $ret;
}

# Check file against checksum and size from the Release file.
# It will return true if the checksum matches.
sub check_lists {
  my $file = shift;
  my $t = $verify_checksums;
  my $ret = 1;
  $verify_checksums = 1;
  if (exists $file_lists{$file}) {
    $ret = check_file(filename => $file, %{$file_lists{$file}});
  }
  $verify_checksums = $t;
  return $ret;
}

sub remote_get {
  my $file=shift;
  my $tdir=shift;
  my $res;
  return 1 if ($skippackages);
  $tdir=$tempdir unless $tdir;
  chdir($tdir) or die "unable to chdir($tdir): $!\n";
  if ($download_method eq 'ftp' || $download_method eq 'http'
      || $download_method eq 'https') {
    $res=$ftp ? ftp_get($file) : http_get($file);
    $res=$res && check_lists($file);
    if (-f $file && !$res) {
      say("$file failed checksum verification, removing");
      unlink($file) if (-f $file);
    }
  } else {
    $res=rsync_get($file);
    $res=$res && check_lists($file);
    if (-f $file && !$res) {
      say("$file failed checksum verification");
      # FIXME: make sure the size doesn't match so it gets retried
    }
  }
  chdir($mirrordir) or die "unable to chdir($mirrordir): $!\n";
  return $res;
}

sub print_percent {
  my $message=shift;
  my $percent = $bytes_to_get ? (($bytes_gotten / $bytes_to_get)*100) : 0;
  printf "[%3.0f%%] %s", $percent, $message;
}

# Get a file via http, or possibly ftp if a proxy is being used with that
# method. First displaying its filename if progress is on.
sub http_get {
  my $oldautoflush = $|;
  $| = 1;
  my $file=shift;
  my $url;
  if ($user eq 'anonymous'){
    $url="$download_method://${host}/${remoteroot}/${file}";
  } else {
    $url="$download_method://${user}:${passwd}\@${host}/${remoteroot}/${file}";
  }
  my $ret=1;
  print "$url => " if ($debug);
  print_percent "Getting: $file... " if $progress or $verbose;
  print "\t #" if $progress;
  if (! $do_dry_run) {
    unlink($file) if (-f $file);
    $ret = $ua->mirror($url, $file);
    print $ret->status_line . "\n" if ($debug);
    if ($ret->is_error) {
      $files{$file} = -1;
      warn "failed " . $ret->status_line .
        "\n" if ($progress or $verbose);
      push (@errlog,"Download of $file failed: ".$ret->status_line."\n");
      $num_errors++;
    } elsif ($progress || $verbose) {
      print "ok\n";
    }
    $ret = not ( $ret->is_error );
  } elsif ($progress || $verbose) {
    print "ok\n";
  }
  # Account for actual bytes gotten
  my @stat = stat $file;
  add_bytes_gotten($stat[7]) if (@stat);
  $| = $oldautoflush;
  return $ret;
}

# Get a file via ftp, first displaying its filename if progress is on.
sub ftp_get {
  my $oldautoflush = $|;
  $| = 1;
  my $file=shift;
  my $mtime;
  my @stat = stat $file;
  if (@stat) {
    # already have the file?
    my $size = $ftp->size($file);
    my $mtime = $ftp->mdtm($file);
    if ($mtime && $size && $size == $stat[7] && $mtime == $stat[9]) {
      # size and time match
      print_percent "Keeping: $file\n" if $progress or $verbose;
      add_bytes_gotten($size);
      return 1;
    }
  }
  print_percent "Getting: $file" if $progress or $verbose;
  print "\t #" if $progress;
  my $ret=1;
  if (! $do_dry_run) {
    unlink($file) if (-f $file);
    $ret = $ftp->get($file, $file);
    if ($ret) {
      my $mtime=$ftp->mdtm($file);
      utime($mtime, $mtime, $file) if defined $mtime;
    } else {
      $files{$file} = -1;
      warn " failed:".$ftp->message if ($progress or $verbose);
      push (@errlog,"Download of $file failed: ".$ftp->message."\n");
      $num_errors++;
    }
  }
  my $size=$ftp->size($file);
  add_bytes_gotten($size) if $size;
  $| = $oldautoflush;
  print "\n" if (($verbose and not $progress) or ($do_dry_run and $progress));
  return $ret;
}

sub rsync_get {
  my $file=shift;
  my $opt=$rsync_options;
  (my $dirname) = $file =~ m:(.*/):;
  my @dir= split(/\//, $dirname);
  for (0..$#dir) {
    $opt = "$opt --include=" . join('/', @dir[0..$_]) . "/";
  }
  $opt .= " --progress" if $progress;
  $opt .= " -v" if $debug;
  $opt .= " --no-motd" unless $verbose;
  system ("rsync --timeout=$timeout $opt $rsyncremote --include=$file --exclude='*' .");
  if ($?
      == 0 && -f $file) {
    return 1;
  } else {
    $files{$file} = -1;
    push (@errlog,"Download of $file failed\n");
    $num_errors++;
    return 0;
  }
}

sub rsync_extra {
  my ($early, @extras) = @_;
  my @includes;
  if (! defined $rsyncremote) {
    say("Not able to use rsync to update remote trace files ...");
    return;
  }
  # @ignores is updated during $early to prevent removal of files
  # if cleanup is done early.
  for my $type (@extras) {
    if ($early) {
      if ($type eq "trace") {
        push(@includes, "- /project/trace/$hostname");
        push(@includes, "/project/trace/*");
        push(@ignores, "^project/trace/");
        say("Updating remote trace files (using rsync) ...");
      } elsif ($type eq "doc") {
        push(@ignores, "^doc/");
        push(@ignores, "^README*");
      } elsif ($type eq "tools") {
        push(@ignores, "^tools/");
      } elsif ($type eq "indices") {
        push(@ignores, "^indices/");
      }
    } else {
      if ($type eq "doc") {
        push(@includes, "/doc/***");
        push(@includes, "/README*");
      } elsif ($type eq "tools") {
        push(@includes, "/tools/***");
      } elsif ($type eq "indices") {
        push(@includes, "/indices/***");
      }
    }
  }
  return if (! @includes);
  if (! $early) {
    @extras = grep(!/^trace$/, @extras); # drop 'trace' from list
    say("Updating extra files (using rsync): @extras.");
  }
  rsync_extra_get(@includes);
}

sub rsync_extra_get {
  my @includes = @_;
  my $fh;
  my @result;
  my $opt=$rsync_options;
  $opt .= " --progress" if $progress;
  $opt .= " -v" if $verbose or $debug;
  $opt .= " -n" if $do_dry_run;
  $opt .= " --no-motd" unless $verbose;
  ($fh, $rsynctempfile) = tempfile();
  foreach my $line (@includes) {
    if ($line !~ /^- /) {
      my $dirname;
      my @dir;
      ($dirname) = ($line =~ m:(.*/):);
      @dir= split(/\//, $dirname);
      for (1..$#dir) {
        push (@result, "" . join('/', @dir[0..$_]) .
                       "/");
      }
    }
    push (@result, "$line");
  }
  for (@result) {
    print $fh "$_\n";
  }
  my $ret=system("rsync --timeout=$timeout $opt $rsyncremote --delete --include-from=$rsynctempfile --exclude='*' $mirrordir");
  if ($ret != 0) {
    print STDERR "Warning: failed to use rsync to download extra files.\n";
  }
  close $fh;
  unlink $rsynctempfile;
}

# run system() with stdin and stdout redirected to files
# unlinks stdout target file first to break hard links
sub system_redirect_io {
  my ($command, $fromfile, $tofile) = @_;
  if (-f $tofile) {
    unlink($tofile) or die "unlink($tofile) failed: $!";
  }
  my $cmd="$command <$fromfile >$tofile";
  system("$cmd");
  die "Failed: $cmd\n" if ($? != 0);
}

sub split_dist {
  my $dist = shift;
  my ($dist_raw) = ($dist =~ m:^([^/]+)/?:);
  $dist =~ m:^[^/]+(/.*)?$:;
  my $dist_sdir = $1 // "";
  return ($dist_raw, $dist_sdir);
}

sub get_next_snapshot {
  my ($dist) = @_;
  my $latest = readlink("$mirrordir/dists/$dist/latest");
  if (defined $latest) {
    $latest++;
  } else {
    $latest = 0;
  }
  return $latest;
}

sub make_next_snapshot {
  my ($mirrordir, $dist, $codename, $dist_sdir, $tempdir) = @_;
  my $next = get_next_snapshot($dist);
  make_dir("$mirrordir/dists/$dist/$next");
  unlink("$mirrordir/dists/$dist/$next/Release");
  link("$tempdir/dists/$codename$dist_sdir/Release",
       "$mirrordir/dists/$dist/$next/Release")
    or die "Error while linking $tempdir/dists/$codename$dist_sdir/Release: $!\n";
  return $next;
}

sub update_latest_links {
  my ($mirrordir, $tempdir, @dists) = @_;
  foreach my $dist (@dists) {
    system("diff","-q","$mirrordir/dists/$dist/latest/Release",
           "$tempdir/dists/$dist/Release");
    if ($?)
    {
      my $next = get_next_snapshot($dist);
      say("Updating $mirrordir/dists/$dist/latest to $next");
      unlink("$mirrordir/dists/$dist/latest");
      symlink($next,"$mirrordir/dists/$dist/latest")
        or die "Error while symlinking $mirrordir/dists/$dist/latest to $next: $!\n";
    } else {
      say("Not updating $mirrordir/dists/$dist/latest");
    }
  }
}

sub link_release_into_snapshot {
  my ($mirrordir,$dist,$next,$tempdir,$codename,$dist_sdir) = @_;
  unlink("$mirrordir/dists/$dist/$next/Release.gpg");
  link("$tempdir/dists/$codename$dist_sdir/Release.gpg",
       "$mirrordir/dists/$dist/$next/Release.gpg")
    or die "Error while linking $tempdir/dists/$codename$dist_sdir/Release.gpg: $!\n";
}

sub link_contents_into_snapshot {
  my ($dist,$mirrordir,$arch,$tempdir) = @_;
  my $next = get_next_snapshot($dist);
  push my @sects, @sections, "";
  foreach my $sect (@sects) {
    if ($sect ne "") {$sect = "/$sect";}
    if (exists $file_lists{"$tempdir/dists/$dist$sect/Contents-$arch.gz"}) {
      unlink("$mirrordir/dists/$dist/$next$sect/Contents-$arch.gz");
      link("$tempdir/dists/$dist$sect/Contents-$arch.gz",
           "$mirrordir/dists/$dist/$next$sect/Contents-$arch.gz")
        or die "Error while linking $tempdir/dists/$dist$sect/Contents-$arch.gz: $!\n";
    }
  }
}

sub link_translation_into_snapshot {
  my ($file,$dist,$distpath,$filename,$mirrordir,$tempdir) = @_;
  my $next = get_next_snapshot($dist);
  my $target_path = "$mirrordir/dists/$dist/$next/$distpath";
  say("linking $file");
  unlink("$target_path/$filename");
  make_path($target_path);
  link("$tempdir/$file", "$target_path/$filename")
    or die "Error while linking $tempdir/$file: $!";
}

sub get_release {
  my ($tdir, $dist) = @_;
  make_dir ("$tdir");
  return 0 unless remote_get("dists/$dist/Release", "$tempdir/.tmp");
  # Save current error state so we can roll back if $ignore_release_gpg
  # is set; needed because remote_get() can register errors
  my @t_errlog = @errlog;
  my $t_errors = $num_errors;
  remote_get("dists/$dist/Release.gpg", "$tempdir/.tmp");
  if (! $check_gpg) {
    # Nothing to do.
  } elsif (-f "$tdir/Release" && -f "$tdir/Release.gpg") {
    # Check for gpg
    if (system("gpgv --version >/dev/null 2>/dev/null")) {
      say("gpgv failed: gpgv binary missing?");
      push (@errlog,"gpgv failed: gpgv binary missing?\n");
      $num_errors++;
    } else {
      # Verify Release signature
      my $gpgv_res = 0;
      my $outp = IO::Pipe->new;
      my $errp = IO::Pipe->new;
      my $gpgvout = "";
      my $gpgverr = "";
      if (my $child = fork) {
        $outp->reader;
        $errp->reader;
        my $sel = IO::Select->new;
        $sel->add($outp, $errp);
        while (my @ready = $sel->can_read) {
          for (@ready) {
            my $buf = "";
            my $bytesread = $_->read($buf, 1024);
            if (!defined($bytesread)) {
              die "read error: $!\n";
            } elsif ($bytesread == 0) {
              $sel->remove($_);
              $_->close;
            } else {
              if ($_ == $outp) {
                $gpgvout .= $buf;
              }
              if ($_ == $errp) {
                $gpgverr .= $buf;
              }
            }
          }
        }
        waitpid($child, 0) == -1
          and die "was pid $child automatically reaped?\n";
        $gpgv_res = not $?;
      } else {
        $outp->writer;
        $errp->writer;
        STDOUT->fdopen(fileno($outp), "w") or die;
        STDERR->fdopen(fileno($errp), "w") or die;
        my @gpgv = qw(gpgv --status-fd 1);
        push @gpgv, (map { ('--keyring' => $_) } @keyrings);
        push @gpgv, "$tdir/Release.gpg", "$tdir/Release";
        exec(@gpgv) or die "exec: $gpgv[0]: $!\n";
      }
      # In debug or verbose mode, display the gpg error message on stdout.
      if (! $gpgv_res || $debug) {
        print $gpgvout;
        print $gpgverr;
      }
      if ($verbose && ! $debug) {
        print $gpgverr;
      }
      if (! $gpgv_res) {
        say("Release gpg signature does not verify.");
        push (@errlog,"Release gpg signature does not verify\n");
        $num_errors++;
      }
    }
  } else {
    say("Release gpg signature does not verify, file missing.");
    push (@errlog,"Release gpg signature does not verify\n");
    $num_errors++;
  }
  if ($ignore_release_gpg) {
    @errlog = @t_errlog;
    $num_errors = $t_errors;
  }
  return 1;
}

sub name_release {
  my ($type, $tdir, $dist) = @_;
  my ($buf, $codename, $suite);
  my $origin = "unknown";
  if (-f "$tdir/Release") {
    if (open RELEASE, "<$tdir/Release") {
      while (<RELEASE>) {
        last if /^MD5Sum:/;
        $buf = $buf .
          $_;
      }
      close RELEASE;
    }
    $_ = $buf;
    ($origin) = m/^Origin:\s+(.*)/im if (/^Origin:/im);
    ($codename) = m/^Codename:\s+(.*)/im;
    ($suite) = m/^Suite:\s+(.*)/im;
  } elsif ($ignore_missing_release) {
    $origin = "none";
  }
  # Allow for, for example, "/updates"; split into the
  # raw dist (codename or suite) and the subdirectory.
  my ($dist_raw, $dist_sdir) = split_dist($dist);
  if ($origin eq "none") {
    $codename = $dist_raw;
  } elsif ($origin eq "Ubuntu" or $origin eq "Canonical") {
    if ($suite) {
      say("Ubuntu Release file: using Suite ($suite).");
      $codename = $suite;
    } else {
      say("Invalid Ubuntu Release file.");
      push (@errlog,"Invalid Ubuntu Release file.\n");
      $num_errors++;
      next;
    }
  } elsif ($codename) {
    if ($dist_raw ne $codename && $dist_raw ne $suite) {
      say("Broken Release file: neither Codename nor Suite matches $dist.");
      push (@errlog,"Broken Release file: neither Codename nor Suite matches $dist\n");
      $num_errors++;
      next;
    }
  } elsif ($suite) {
    say("Release file does not contain Codename; using Suite ($suite).");
    $codename = $suite;
  } else {
    say("Release file contains neither Codename nor Suite; using $dist.");
    $codename = $dist_raw;
  }
  # For experimental the suite is the same as the codename
  $suite = "" if (! $suite || $suite eq $codename);
  die("Duplicate dist $codename$dist_sdir.\n")
    if exists $distset{"$codename$dist_sdir"}{$type};
  $distset{"$codename$dist_sdir"}{$type} = 1;
  die("Conflicting suites '$suite' and '$distset{$codename}{suite}' for $codename.\n")
    if (exists $distset{"$codename"}{suite}
        && ($suite ne $distset{$codename}{suite}));
  $distset{$codename}{suite} = "$suite" if ($suite);
  # This should be a one-time conversion only
  if ($suite) {
    if (-d "$tempdir/dists/$suite" && !-l "$tempdir/dists/$suite") {
      rename_distdir("$tempdir/dists", $codename, $suite);
    }
    if (-d "dists/$suite" && !-l "dists/$suite") {
      rename_distdir("dists", $codename, $suite);
    }
  }
  return ($codename, $suite, $dist_sdir);
}

# Get Index file in the passed subdirectory.
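# Called as, for example (illustrative arguments),
#   get_index("dists/sid/main/binary-i386", "Packages");
#   get_index("dists/sid/main/source", "Sources");
# It prefers updating via the .diff/Index pdiffs when available, and
# otherwise fetches whichever of the .gz, uncompressed and .bz2 variants
# are listed in the Release file.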
sub get_index {
  my $subdir=shift;
  my $file=shift;
  make_dir($subdir);
  make_dir("$tempdir/$subdir");
  if ($diff_mode ne "none"
      && exists $file_lists{"$tempdir/$subdir/$file.diff/Index"}) {
    if (!check_lists("$tempdir/$subdir/$file.diff/Index")) {
      make_dir("$tempdir/$subdir/$file.diff");
      if (!remote_get("$subdir/$file.diff/Index")) {
        push (@errlog,"$subdir/$file.diff/Index failed checksum verification, removing\n");
      } else {
        fetch_and_apply_diffs(0, $subdir, $file);
        if (check_lists("$tempdir/$subdir/$file")) {
          if (! $slow_cpu) {
            system_redirect_io("gzip $gzip_options",
                               "$tempdir/$subdir/$file",
                               "$tempdir/$subdir/$file.gz");
            system_redirect_io("bzip2",
                               "$tempdir/$subdir/$file",
                               "$tempdir/$subdir/$file.bz2");
          }
        }
      }
    } else {
      $bytes_gotten += $file_lists{"$tempdir/$subdir/$file.diff/Index"}{size};
      fetch_and_apply_diffs(0, $subdir, "$file");
      if (check_lists("$tempdir/$subdir/$file")) {
        if (! $slow_cpu) {
          system_redirect_io("gzip $gzip_options",
                             "$tempdir/$subdir/$file",
                             "$tempdir/$subdir/$file.gz");
          system_redirect_io("bzip2",
                             "$tempdir/$subdir/$file",
                             "$tempdir/$subdir/$file.bz2");
        }
      }
    }
    $files{"$subdir/$file.diff/Index"}=1 if ($diff_mode eq "mirror");
    $files{"$tempdir/$subdir/$file.diff/Index"}=1;
  }
  if (exists $file_lists{"$tempdir/$subdir/$file.gz"}{size}) {
    if (!check_lists("$tempdir/$subdir/$file.gz")) {
      if (remote_get("$subdir/$file.gz")) {
        system_redirect_io("gzip -d",
                           "$tempdir/$subdir/$file.gz",
                           "$tempdir/$subdir/$file");
        if (!
            $slow_cpu) {
          system_redirect_io("bzip2",
                             "$tempdir/$subdir/$file",
                             "$tempdir/$subdir/$file.bz2");
        }
      } else {
        push (@errlog,"$subdir/$file.gz failed checksum verification\n");
        $num_errors++;
      }
    } else {
      $bytes_gotten += $file_lists{"$tempdir/$subdir/$file.gz"}{size};
    }
  } elsif ($ignore_missing_release) {
    say("Ignoring missing Release file for $subdir/$file.gz");
    push (@errlog,"Ignoring missing Release file for $subdir/$file.gz\n");
    if (remote_get("$subdir/$file.gz")) {
      system_redirect_io("gzip -d",
                         "$tempdir/$subdir/$file.gz",
                         "$tempdir/$subdir/$file");
    }
  } else {
    if (-f "$subdir/$file.gz") {
      say("$subdir/$file.gz exists locally but not in Release");
      die "Won't mirror without $subdir/$file.gz signature in Release";
    } else {
      say("$subdir/$file.gz does not exist locally or in Release, skipping.")
        if ($debug);
    }
  }
  if (exists $file_lists{"$tempdir/$subdir/$file"}) {
    if (!check_lists("$tempdir/$subdir/$file")) {
      if (remote_get("$subdir/$file")) {
        if (! $slow_cpu) {
          system_redirect_io("bzip2",
                             "$tempdir/$subdir/$file",
                             "$tempdir/$subdir/$file.bz2");
        }
      } else {
        push (@errlog,"$subdir/$file failed checksum verification\n");
        $num_errors++;
      }
    } else {
      $bytes_gotten += $file_lists{"$tempdir/$subdir/$file"}{size};
    }
  }
  if (exists $file_lists{"$tempdir/$subdir/$file.bz2"}) {
    if (!check_lists("$tempdir/$subdir/$file.bz2")) {
      if (!remote_get("$subdir/$file.bz2")) {
        push (@errlog,"$subdir/$file.bz2 failed checksum verification, removing\n");
      }
    } else {
      $bytes_gotten += $file_lists{"$tempdir/$subdir/$file.bz2"}{size};
    }
  }
  if (exists $file_lists{"$tempdir/$subdir/Release"}) {
    if (!check_lists("$tempdir/$subdir/Release")) {
      if (!remote_get("$subdir/Release")) {
        push (@errlog,"$subdir/Release failed checksum verification, removing\n");
      }
    } else {
      $bytes_gotten += $file_lists{"$tempdir/$subdir/Release"}{size};
    }
  }
  if ($file eq "Packages") {
    push @package_files, "$tempdir/$subdir/$file";
  } elsif ($file eq "Sources") {
    push @source_files, "$tempdir/$subdir/$file";
  } else {
    die "get_index called with unknown type $file\n";
  }
  $files{"$subdir/$file.gz"}=1;
  $files{"$subdir/$file.bz2"}=1;
  # Uncompressed files are no longer kept on the mirrors
  $files{"$subdir/$file"}=1
    unless exists $file_lists{"$tempdir/$subdir/$file.gz"};
  $files{"$subdir/Release"}=1;
  $files{"$tempdir/$subdir/$file.gz"}=1;
  $files{"$tempdir/$subdir/$file.bz2"}=1;
  $files{"$tempdir/$subdir/$file"}=1;
  $files{"$tempdir/$subdir/Release"}=1;
}

sub update_contents {
  my ($subdir, $file) = @_;
  my $file_ok = check_lists("$tempdir/$subdir/$file.gz");
  # Get the Index file for the diffs
  if (exists $file_lists{"$tempdir/$subdir/$file.diff/Index"}) {
    if (!check_lists("$tempdir/$subdir/$file.diff/Index")) {
      make_dir("$tempdir/$subdir/$file.diff");
      if (!remote_get("$subdir/$file.diff/Index")) {
        push (@errlog,"$subdir/$file.diff/Index failed checksum verification, removing\n");
        return $file_ok;
      }
      #FIXME: before download
      if (-f "$tempdir/$subdir/$file.diff/Index") {
        $bytes_to_get += -s "$tempdir/$subdir/$file.diff/Index";
      }
    }
    $files{"$subdir/$file.diff/Index"}=1 if ($diff_mode eq "mirror");
    $files{"$tempdir/$subdir/$file.diff/Index"}=1;
  } else {
    return $file_ok;
  }
  if (!
-f "$tempdir/$subdir/$file.gz" || $file_ok) { # fetch diffs only fetch_and_apply_diffs(1, $subdir, $file); return $file_ok; } # Uncompress the Contents file system_redirect_io("gzip -d", "$tempdir/$subdir/$file.gz", "$tempdir/$subdir/$file"); # Update it fetch_and_apply_diffs(0, $subdir, $file); # And compress it again if (-f "$tempdir/$subdir/$file") { system_redirect_io("gzip $gzip_options", "$tempdir/$subdir/$file", "$tempdir/$subdir/$file.gz"); unlink "$tempdir/$subdir/$file"; } return check_lists("$tempdir/$subdir/$file.gz"); } sub get_contents_files { my $first = 1; foreach my $dist (keys %distset) { next unless exists $distset{$dist}{mirror}; foreach my $arch (@arches) { next if $dist=~/experimental/; next if $dist=~/.*-proposed-updates/; next if $arch=~/source/; push my @sects, @sections, ""; foreach my $sect (@sects) { if ($sect ne "") {$sect = "/$sect";} if (exists $file_lists{"$tempdir/dists/$dist$sect/Contents-$arch.gz"}) { if (!check_lists("$tempdir/dists/$dist$sect/Contents-$arch.gz")) { if ($first) { say("Get Contents files."); $first = 0; } remote_get("dists/$dist$sect/Contents-$arch.gz"); } $files{"dists/$dist$sect/Contents-$arch.gz"}=1; $files{$tempdir."/"."dists/$dist$sect/Contents-$arch.gz"}=1; if ($debmarshal) { link_contents_into_snapshot($dist,$mirrordir,$arch,$tempdir); } } } } } } # hardlink index files from tempdir to next debmarshal snapshot location sub link_index { my ($dist,$section,$arch) = @_; my ($file,$archdir); if ($arch eq "source") { $file = "Sources"; $archdir = "source"; } else { $file = "Packages"; $archdir = "binary-$arch"; } my $next = get_next_snapshot($dist); make_dir("$mirrordir/dists/$dist/$next/$section/$archdir"); unlink("$mirrordir/dists/$dist/$next/$section/$archdir/$file"); link("$tempdir/dists/$dist/$section/$archdir/$file", "$mirrordir/dists/$dist/$next/$section/$archdir/$file") or warn "Error while linking $tempdir/dists/$dist/$section/$archdir/$file: $!\n"; 
unlink("$mirrordir/dists/$dist/$next/$section/$archdir/$file.gz"); link("$tempdir/dists/$dist/$section/$archdir/$file.gz", "$mirrordir/dists/$dist/$next/$section/$archdir/$file.gz") or die "Error while linking $tempdir/dists/$dist/$section/$archdir/$file.gz: $!\n"; unlink("$mirrordir/dists/$dist/$next/$section/$archdir/$file.bz2"); link("$tempdir/dists/$dist/$section/$archdir/$file.bz2", "$mirrordir/dists/$dist/$next/$section/$archdir/$file.bz2") or die "Error while linking $tempdir/dists/$dist/$section/$archdir/$file.bz2: $!\n"; } sub i18n_from_release { my ($dist,$distpath) = @_; my $subdir = "dists/$dist/$distpath"; my $compdir = $tempdir."/".$subdir; my ($sha1, $size, $filename); my $exclude = "(".join("|", @excludes).")" if @excludes; my $include = "(".join("|", @includes).")" if @includes; # Create i18n directories make_dir($subdir); make_dir($compdir); # Search for translation files in file_lists foreach my $path (keys %file_lists) { next if length($compdir)+1>length($path); # the +1 stands for the slash after $compdir next if substr($path, 0, length($compdir)) ne $compdir; my $filename = substr($path, length($compdir)+1, length($path)-length($compdir)-1); next if $filename !~ /bz2$/; my ($sha1, $size) = ($file_lists{$path}{SHA1}, $file_lists{$path}{size}); if(!(defined($include) && ($subdir."/".$filename)=~/$include/o)) { next if (defined($exclude) && ($subdir."/".$filename)=~/$exclude/o); } next if ! $i18n && $filename !~ /-en/; $files{"$subdir/$filename"}=1; $files{$tempdir."/"."$subdir/$filename"}=1; if (! check_i18n("$tempdir/$subdir/$filename", $size, $sha1)) { $bytes_to_get += $size; $i18n_get{"$subdir/$filename"}{sha1} = $sha1; $i18n_get{"$subdir/$filename"}{size} = $size; $i18n_get{"$subdir/$filename"}{dist} = $dist; $i18n_get{"$subdir/$filename"}{distpath} = $distpath; $i18n_get{"$subdir/$filename"}{filename} = $filename; } } } sub get_i18n_files { say("Get Translation files ..."); foreach my $file (sort keys %i18n_get) { if (!
check_i18n("$tempdir/$file", $i18n_get{$file}{size}, $i18n_get{$file}{sha1})) { remote_get("$file"); if ($debmarshal) { link_translation_into_snapshot($file, $i18n_get{$file}{dist}, $i18n_get{$file}{distpath}, $i18n_get{$file}{filename}, $mirrordir, $tempdir); } } } } sub fetch_and_apply_diffs { my ($fetch_only, $subdir, $type) = @_; local (*INDEX, *FILE); my (%history_sha1, %history_size, %diff_sha1, %diff_size); my ($current_sha1, $current_size, $sha1, $size, $file, $digest, $ret); my $t = $num_errors; # Parse DiffIndex file open(INDEX, "$tempdir/$subdir/$type.diff/Index") or die "$tempdir/$subdir/$type.diff/Index: $!"; $_ = <INDEX>; while (defined($_)) { if (m/^SHA1-Current:/m) { ($current_sha1, $current_size) = m/^SHA1-Current:\s+([A-Za-z0-9]+)\s+(\d+)/m; $_ = <INDEX>; } elsif (m/^SHA1-History:/m) { while (defined($_ = <INDEX>)) { last if (!m/^\s/m); ($sha1, $size, $file) = m/^\s+([A-Za-z0-9]+)\s+(\d+)\s+(.*)/m; $history_sha1{$file} = $sha1; $history_size{$file} = $size; } } elsif (m/^SHA1-Patches:/m) { while (defined($_ = <INDEX>)) { last if (!m/^\s/m); ($sha1, $size, $file) = m/^\s+([A-Za-z0-9]+)\s+(\d+)\s+(.*)/m; $diff_sha1{$file} = $sha1; $diff_size{$file} = $size; } } } close(INDEX); # Download diff files as necessary $ret = 1; foreach $file (sort keys %diff_sha1) { if (!check_diff("$tempdir/$subdir/$type.diff/$file", $diff_size{$file}, $diff_sha1{$file})) { remote_get("$subdir/$type.diff/$file.gz"); #FIXME: before download if (-f "$tempdir/$subdir/$type.diff/$file.gz") { $bytes_to_get += -s "$tempdir/$subdir/$type.diff/$file.gz"; } if (!check_diff("$tempdir/$subdir/$type.diff/$file", $diff_size{$file}, $diff_sha1{$file})) { say("$subdir/$type.diff/$file.gz failed sha1sum check, removing"); push (@errlog,"$subdir/$type.diff/$file.gz failed sha1sum check, removing\n"); unlink "$tempdir/$subdir/$type.diff/$file.gz"; $ret = 0; } } $files{"$subdir/$type.diff/$file.gz"}=1 if ($diff_mode eq "mirror"); $files{"$tempdir/$subdir/$type.diff/$file.gz"}=1; } $num_errors = $t if
($ignore_small_errors); return if ($fetch_only || ! $ret); # Apply diff files open(FILE, "$tempdir/$subdir/$type") or return; $digest = Digest::SHA->new(1); $digest->addfile(*FILE); $sha1 = $digest->hexdigest; $size = -s "$tempdir/$subdir/$type"; foreach $file (sort keys %history_sha1) { next unless ($sha1 eq $history_sha1{$file} && $size eq $history_size{$file}); if (system("gzip -d < \"$tempdir/$subdir/$type.diff/$file.gz\" | patch --ed \"$tempdir/$subdir/$type\"")) { say("Patch $file failed, will fetch $subdir/$type file"); unlink "$tempdir/$subdir/$type"; return; } open(FILE, "$tempdir/$subdir/$type") or return; $digest = Digest::SHA->new(1); $digest->addfile(*FILE); $sha1 = $digest->hexdigest; $size = -s "$tempdir/$subdir/$type"; say("$subdir/$type patched with $subdir/$type.diff/$file.gz"); } if (!($sha1 eq $current_sha1 && $size eq $current_size)) { say("$subdir/$type failed sha1sum check, removing"); push (@errlog,"$subdir/$type failed sha1sum check, removing\n"); unlink "$tempdir/$subdir/$type"; } } # Make a directory including all needed parents. { my %seen; sub make_dir { my $dir=shift; my @parts=split('/', $dir); my $current=''; foreach my $part (@parts) { $current.="$part/"; if (! $seen{$current}) { if (! -d $current) { mkdir($current) or die "mkdir failed: $!"; debug("Created directory: $current"); } $seen{$current}=1; } } } } # Mirror cleanup for unknown files that cannot be found in Packages files. # This subroutine is called on pre- and post-cleanup and takes no arguments. # It uses some global variables like $files, $mirrordir, @ignores. sub cleanup_unknown_files { print("Cleanup mirror") if ($verbose or $progress); if ($use_cache) { say(": using cache."); foreach my $file (sort keys %files) { next if (@di_dists && $file =~ m:installer-\w(-|\w)*/current/images/:); if ($files{$file} == 2 && -f $file) { say("deleting $file") if ($verbose); if (! $do_dry_run) { unlink $file or die "unlink $file: $!"; } } } } else { say($state_cache_days ? ": full." 
: "."); chdir($mirrordir) or die "chdir $mirrordir: $!"; my $ignore; $ignore = "(".join("|", @ignores).")" if @ignores; # Remove all files in the mirror that we don't know about foreach my $file (`find . -type f`) { chomp $file; $file=~s:^\./::; next if (@di_dists && $file =~ m:installer-\w(-|\w)*/current/images/:); unless ((exists $files{$file} && $files{$file} != 2) or (defined($ignore) && $file=~/$ignore/o)) { say("deleting $file") if ($verbose); if (! $do_dry_run) { unlink $file or die "unlink $file: $!"; } } } } # Clean up obsolete files of D-I images di_cleanup() if @di_dists; } sub di_skip_dist { my $dist=shift; if ( $dist eq "woody" || $dist eq "experimental" || $dist =~ /.*-updates/ ) { return 1; } return 0; } sub di_check_dists { DI_DIST: for my $di_dist (@di_dists) { next if di_skip_dist($di_dist); if (exists $distset{$di_dist}) { # Valid dist and also mirroring the archive itself $distset{$di_dist}{"d-i"} = 1; } else { foreach my $dist (keys %distset) { my ($dist_raw, $dist_sdir) = split_dist($dist); if ($di_dist eq $distset{$dist_raw}{suite}) { # Suite specified, use codename instead $distset{"$dist_raw$dist_sdir"}{"d-i"} = 1; next DI_DIST; } } # Only mirroring D-I images, not the archive itself my $tdir="$tempdir/.tmp/dists/$di_dist"; next unless (get_release($tdir, $di_dist) || $ignore_missing_release); name_release("d-i", $tdir, $di_dist); unlink "$tdir/Release"; unlink "$tdir/Release.gpg"; } } } sub di_add_files { my $tdir = "$tempdir/d-i"; my $exclude = "(".join("|", @excludes).")" if @excludes; my $include = "(".join("|", @includes).")" if @includes; foreach my $dist (keys %distset) { next unless exists $distset{$dist}{"d-i"}; foreach my $arch (@di_arches) { next if $arch eq "all"; my $image_dir = "dists/$dist/main/installer-$arch/current/images"; make_dir ("$tdir/$image_dir"); if (!remote_get("$image_dir/MD5SUMS", $tdir)) { say("Failed to download $image_dir/MD5SUMS; skipping."); return; } if (-f "$tdir/$image_dir/MD5SUMS") { $bytes_to_get += -s 
_; # As we did not have the size earlier } local $/; undef $/; # Read whole file open(FILE, "<", "$tdir/$image_dir/MD5SUMS") or die "$tdir/$image_dir/MD5SUMS: $!"; $_ = <FILE>; while (m/^([A-Za-z0-9]{32} .*)/mg) { my ($md5sum, $filename) = split(' ', $1, 3); $filename =~ s:^\./::; if(!(defined($include) && ($image_dir."/".$filename)=~/$include/o)) { next if (defined($exclude) && ($image_dir."/".$filename)=~/$exclude/o); } $di_files{$image_dir}{$filename}{md5sum} = $md5sum; # Check against the version currently on the mirror if (check_file(filename => "$image_dir/$filename", size => -1, MD5Sum => $md5sum)) { $di_files{$image_dir}{$filename}{status} = 1; } else { $di_files{$image_dir}{$filename}{status} = 0; } } close(FILE); } } } # ToDo: for rsync maybe it would make sense to sync the images directly # into place, the whole $image_dir at a time. sub di_get_files { say("Getting Debian Installer images."); my $tdir = "$tempdir/d-i"; foreach my $image_dir (sort keys %di_files) { my $lres = 1; foreach my $file (sort keys %{ $di_files{$image_dir} }) { next unless $di_files{$image_dir}{$file}{status} == 0; # Fetch images into a temporary location $file =~ m:(^.*)/:; make_dir ("$tdir/$image_dir/$1") if $1; if (!remote_get("$image_dir/$file", $tdir) || !check_file(filename => "$tdir/$image_dir/$file", size => -1, MD5Sum => $di_files{$image_dir}{$file}{md5sum})) { $lres = 0; last if (! $do_dry_run); } if (-f "$tdir/$image_dir/$file") { $bytes_to_get += -s _; # As we did not have the size in add_di_files() } } # Move images in place on mirror if ($lres && !
$do_dry_run) { foreach my $file (sort keys %{ $di_files{$image_dir} }) { next unless $di_files{$image_dir}{$file}{status} == 0; $file =~ m:(^.*)/:; make_dir ("$image_dir/$1") if $1; unlink "$image_dir/$file" if (-f "$image_dir/$file"); link("$tdir/$image_dir/$file", "$image_dir/$file"); } # Move the MD5SUMS file in place on mirror unlink "$image_dir/MD5SUMS" if (-f "$image_dir/MD5SUMS"); link("$tdir/$image_dir/MD5SUMS", "$image_dir/MD5SUMS"); } elsif (! $do_dry_run) { say("Failed to download some files in $image_dir; not updating images."); } } } sub di_cleanup { # Clean up obsolete files foreach my $image_dir (`find dists/ -type d -name images`) { next unless $image_dir =~ m:/installer-\w(-|\w)*/current/images$:; chomp $image_dir; chdir("$image_dir") or die "unable to chdir($image_dir): $!\n"; foreach my $file (`find . -type f`) { chomp $file; $file=~s:^\./::; if (! exists $di_files{$image_dir} || ! exists $di_files{$image_dir}{$file}) { next if (exists $di_files{$image_dir} && $file eq "MD5SUMS"); say("deleting $image_dir/$file") if ($verbose); if (! $do_dry_run) { unlink "$file" or die "unlink $image_dir/$file: $!\n"; } } } chdir("$mirrordir") or die "unable to chdir($mirrordir): $!\n"; } # Clean up temporary D-I files (silently) if (-d "$tempdir/d-i") { chdir("$tempdir/d-i") or die "unable to chdir($tempdir/d-i): $!\n"; foreach my $file (`find . -type f`) { chomp $file; $file=~s:^\./::; unlink "$file" or die "unlink $tempdir/d-i/$file: $!\n"; } chdir("$mirrordir") or die "unable to chdir($mirrordir): $!\n"; } } sub download_finished { if ($ftp) { $ftp->quit; } my $total_time = time - $start_time; if ($download_method eq 'rsync' || $bytes_gotten == 0) { say("Download completed in ".$total_time."s."); } else { my $avg_speed = 0; $avg_speed = sprintf("%3.0f",($bytes_gotten / $total_time)) unless ($total_time == 0); say("Downloaded ".print_dl_size($bytes_gotten)." in ".$total_time."s at ".(int($avg_speed/1024*100)/100)." 
kiB/s."); } } sub rename_distdir { my ($dir, $codename, $suite) = @_; say("The directory for a dist should be its codename, not a suite."); if (!$allow_dist_rename) { die("Use --allow-dist-rename to have debmirror do the conversion automatically.\n"); } say("Starting conversion - renaming '$dir/$suite' to '$dir/$codename':"); if (-l "$dir/$codename") { say(" removing symlink '$dir/$codename'; a new symlink for the suite will be created later"); unlink "$dir/$codename"; } if (-d "$dir/$codename") { die("Directory '$dir/$codename' already exists; aborting conversion.\n"); } rename("$dir/$suite", "$dir/$codename"); say(" conversion completed successfully"); } sub save_state_cache { my $cache_file = "$tempdir/debmirror_state.cache"; say("Saving debmirror state cache."); foreach my $file (keys %files) { if ($files{$file} == 2) { delete $files{$file}; } elsif ($files{$file} >= 0){ $files{$file} = 2; } } # Add state cache meta data my $now = time(); $files{cache_version} = $files_cache_version; if (! $state_cache_exptime) { $state_cache_exptime = $now + $state_cache_days * 24 * 60 * 60; } $files{cache_expiration_time} = $state_cache_exptime; if (! nstore(\%files, $cache_file)) { say("Failed to save state cache."); unlink $cache_file if -f $cache_file; } else { my $expires = int(($state_cache_exptime - $now) / (60 * 60)); # hours if ($expires > 0) { my $days = int($expires / 24); my $hours = $expires % 24; say("State cache will expire in " . ($days ? "$days day(s)" : ($hours ? "" : "the next hour")) . ($hours ? ($days ? " and " : "") . "$hours hour(s)" : "") . "."); } else { say("State cache expired during this run; next run will not use cache."); } } } sub load_state_cache { my $cache_file = "$tempdir/debmirror_state.cache"; if (! -f $cache_file) { say("State cache file does not exist; doing full mirroring."); return; } my $rfiles; say("Loading debmirror state cache."); $rfiles = retrieve($cache_file); if (! 
defined $rfiles) { say("Failed to load state cache; doing full mirror check."); return } if (! exists $$rfiles{cache_version}) { say("Cache version missing in state cache; doing full mirroring."); return } elsif ($$rfiles{cache_version} ne $files_cache_version) { say("State cache is incompatible with this version of debmirror; doing full mirror check."); return } else { delete $$rfiles{cache_version}; } if (! exists $$rfiles{cache_expiration_time}) { say("Expiration time missing in state cache; doing full mirror check."); return } elsif ($$rfiles{cache_expiration_time} < time()) { say("State cache has expired; doing full mirror check."); return } else { $state_cache_exptime = $$rfiles{cache_expiration_time}; delete $$rfiles{cache_expiration_time}; } say("State cache loaded successfully; will use cache."); %files = %$rfiles; $use_cache = 1; # Preserve state cache during dry runs if ($dry_run) { $files{$cache_file} = 1; } else { unlink $cache_file if -f $cache_file; } } sub say { print join(' ', @_)."\n" if ($verbose or $progress); } sub debug { print $0.': '.join(' ', @_)."\n" if $debug; } =head1 COPYRIGHT This program is copyright 2000-2001, 2010 by Joey Hess , under the terms of the GNU GPL (either version 2 of the licence or, at your option, any later version), copyright 2001-2002 by Joerg Wendland , copyright 2003-2007 by Goswin von Brederlow and copyright 2009-2010 by Frans Pop . The author disclaims any responsibility for any mangling of your system, unexpected bandwidth usage bills, meltdown of the Debian mirror network, etc., that this script may cause. See NO WARRANTY section of GPL. =head1 AUTHOR Author and current maintainer: Joey Hess Previous maintainers: Joerg Wendland Goswin von Brederlow Frans Pop =head1 MOTTO Waste bandwidth -- put a partial mirror on your laptop today! 
=cut debmirror-2.16ubuntu1.1/GPL0000664000000000000000000004325412243012102012342 0ustar GNU GENERAL PUBLIC LICENSE Version 2, June 1991 Copyright (C) 1989, 1991 Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA Everyone is permitted to copy and distribute verbatim copies of this license document, but changing it is not allowed. Preamble The licenses for most software are designed to take away your freedom to share and change it. By contrast, the GNU General Public License is intended to guarantee your freedom to share and change free software--to make sure the software is free for all its users. This General Public License applies to most of the Free Software Foundation's software and to any other program whose authors commit to using it. (Some other Free Software Foundation software is covered by the GNU Lesser General Public License instead.) You can apply it to your programs, too. When we speak of free software, we are referring to freedom, not price. Our General Public Licenses are designed to make sure that you have the freedom to distribute copies of free software (and charge for this service if you wish), that you receive source code or can get it if you want it, that you can change the software or use pieces of it in new free programs; and that you know you can do these things. To protect your rights, we need to make restrictions that forbid anyone to deny you these rights or to ask you to surrender the rights. These restrictions translate to certain responsibilities for you if you distribute copies of the software, or if you modify it. For example, if you distribute copies of such a program, whether gratis or for a fee, you must give the recipients all the rights that you have. You must make sure that they, too, receive or can get the source code. And you must show them these terms so they know their rights. 
We protect your rights with two steps: (1) copyright the software, and (2) offer you this license which gives you legal permission to copy, distribute and/or modify the software. Also, for each author's protection and ours, we want to make certain that everyone understands that there is no warranty for this free software. If the software is modified by someone else and passed on, we want its recipients to know that what they have is not the original, so that any problems introduced by others will not reflect on the original authors' reputations. Finally, any free program is threatened constantly by software patents. We wish to avoid the danger that redistributors of a free program will individually obtain patent licenses, in effect making the program proprietary. To prevent this, we have made it clear that any patent must be licensed for everyone's free use or not licensed at all. The precise terms and conditions for copying, distribution and modification follow. GNU GENERAL PUBLIC LICENSE TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION 0. This License applies to any program or other work which contains a notice placed by the copyright holder saying it may be distributed under the terms of this General Public License. The "Program", below, refers to any such program or work, and a "work based on the Program" means either the Program or any derivative work under copyright law: that is to say, a work containing the Program or a portion of it, either verbatim or with modifications and/or translated into another language. (Hereinafter, translation is included without limitation in the term "modification".) Each licensee is addressed as "you". Activities other than copying, distribution and modification are not covered by this License; they are outside its scope. 
The act of running the Program is not restricted, and the output from the Program is covered only if its contents constitute a work based on the Program (independent of having been made by running the Program). Whether that is true depends on what the Program does. 1. You may copy and distribute verbatim copies of the Program's source code as you receive it, in any medium, provided that you conspicuously and appropriately publish on each copy an appropriate copyright notice and disclaimer of warranty; keep intact all the notices that refer to this License and to the absence of any warranty; and give any other recipients of the Program a copy of this License along with the Program. You may charge a fee for the physical act of transferring a copy, and you may at your option offer warranty protection in exchange for a fee. 2. You may modify your copy or copies of the Program or any portion of it, thus forming a work based on the Program, and copy and distribute such modifications or work under the terms of Section 1 above, provided that you also meet all of these conditions: a) You must cause the modified files to carry prominent notices stating that you changed the files and the date of any change. b) You must cause any work that you distribute or publish, that in whole or in part contains or is derived from the Program or any part thereof, to be licensed as a whole at no charge to all third parties under the terms of this License. c) If the modified program normally reads commands interactively when run, you must cause it, when started running for such interactive use in the most ordinary way, to print or display an announcement including an appropriate copyright notice and a notice that there is no warranty (or else, saying that you provide a warranty) and that users may redistribute the program under these conditions, and telling the user how to view a copy of this License. 
(Exception: if the Program itself is interactive but does not normally print such an announcement, your work based on the Program is not required to print an announcement.) These requirements apply to the modified work as a whole. If identifiable sections of that work are not derived from the Program, and can be reasonably considered independent and separate works in themselves, then this License, and its terms, do not apply to those sections when you distribute them as separate works. But when you distribute the same sections as part of a whole which is a work based on the Program, the distribution of the whole must be on the terms of this License, whose permissions for other licensees extend to the entire whole, and thus to each and every part regardless of who wrote it. Thus, it is not the intent of this section to claim rights or contest your rights to work written entirely by you; rather, the intent is to exercise the right to control the distribution of derivative or collective works based on the Program. In addition, mere aggregation of another work not based on the Program with the Program (or with a work based on the Program) on a volume of a storage or distribution medium does not bring the other work under the scope of this License. 3. 
You may copy and distribute the Program (or a work based on it, under Section 2) in object code or executable form under the terms of Sections 1 and 2 above provided that you also do one of the following: a) Accompany it with the complete corresponding machine-readable source code, which must be distributed under the terms of Sections 1 and 2 above on a medium customarily used for software interchange; or, b) Accompany it with a written offer, valid for at least three years, to give any third party, for a charge no more than your cost of physically performing source distribution, a complete machine-readable copy of the corresponding source code, to be distributed under the terms of Sections 1 and 2 above on a medium customarily used for software interchange; or, c) Accompany it with the information you received as to the offer to distribute corresponding source code. (This alternative is allowed only for noncommercial distribution and only if you received the program in object code or executable form with such an offer, in accord with Subsection b above.) The source code for a work means the preferred form of the work for making modifications to it. For an executable work, complete source code means all the source code for all modules it contains, plus any associated interface definition files, plus the scripts used to control compilation and installation of the executable. However, as a special exception, the source code distributed need not include anything that is normally distributed (in either source or binary form) with the major components (compiler, kernel, and so on) of the operating system on which the executable runs, unless that component itself accompanies the executable. 
If distribution of executable or object code is made by offering access to copy from a designated place, then offering equivalent access to copy the source code from the same place counts as distribution of the source code, even though third parties are not compelled to copy the source along with the object code. 4. You may not copy, modify, sublicense, or distribute the Program except as expressly provided under this License. Any attempt otherwise to copy, modify, sublicense or distribute the Program is void, and will automatically terminate your rights under this License. However, parties who have received copies, or rights, from you under this License will not have their licenses terminated so long as such parties remain in full compliance. 5. You are not required to accept this License, since you have not signed it. However, nothing else grants you permission to modify or distribute the Program or its derivative works. These actions are prohibited by law if you do not accept this License. Therefore, by modifying or distributing the Program (or any work based on the Program), you indicate your acceptance of this License to do so, and all its terms and conditions for copying, distributing or modifying the Program or works based on it. 6. Each time you redistribute the Program (or any work based on the Program), the recipient automatically receives a license from the original licensor to copy, distribute or modify the Program subject to these terms and conditions. You may not impose any further restrictions on the recipients' exercise of the rights granted herein. You are not responsible for enforcing compliance by third parties to this License. 7. If, as a consequence of a court judgment or allegation of patent infringement or for any other reason (not limited to patent issues), conditions are imposed on you (whether by court order, agreement or otherwise) that contradict the conditions of this License, they do not excuse you from the conditions of this License. 
If you cannot distribute so as to satisfy simultaneously your obligations under this License and any other pertinent obligations, then as a consequence you may not distribute the Program at all. For example, if a patent license would not permit royalty-free redistribution of the Program by all those who receive copies directly or indirectly through you, then the only way you could satisfy both it and this License would be to refrain entirely from distribution of the Program. If any portion of this section is held invalid or unenforceable under any particular circumstance, the balance of the section is intended to apply and the section as a whole is intended to apply in other circumstances. It is not the purpose of this section to induce you to infringe any patents or other property right claims or to contest validity of any such claims; this section has the sole purpose of protecting the integrity of the free software distribution system, which is implemented by public license practices. Many people have made generous contributions to the wide range of software distributed through that system in reliance on consistent application of that system; it is up to the author/donor to decide if he or she is willing to distribute software through any other system and a licensee cannot impose that choice. This section is intended to make thoroughly clear what is believed to be a consequence of the rest of this License. 8. If the distribution and/or use of the Program is restricted in certain countries either by patents or by copyrighted interfaces, the original copyright holder who places the Program under this License may add an explicit geographical distribution limitation excluding those countries, so that distribution is permitted only in or among countries not thus excluded. In such case, this License incorporates the limitation as if written in the body of this License. 9. 
The Free Software Foundation may publish revised and/or new versions of the General Public License from time to time. Such new versions will be similar in spirit to the present version, but may differ in detail to address new problems or concerns. Each version is given a distinguishing version number. If the Program specifies a version number of this License which applies to it and "any later version", you have the option of following the terms and conditions either of that version or of any later version published by the Free Software Foundation. If the Program does not specify a version number of this License, you may choose any version ever published by the Free Software Foundation. 10. If you wish to incorporate parts of the Program into other free programs whose distribution conditions are different, write to the author to ask for permission. For software which is copyrighted by the Free Software Foundation, write to the Free Software Foundation; we sometimes make exceptions for this. Our decision will be guided by the two goals of preserving the free status of all derivatives of our free software and of promoting the sharing and reuse of software generally. NO WARRANTY 11. BECAUSE THE PROGRAM IS LICENSED FREE OF CHARGE, THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING, REPAIR OR CORRECTION. 12. 
IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY AND/OR REDISTRIBUTE THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES. END OF TERMS AND CONDITIONS How to Apply These Terms to Your New Programs If you develop a new program, and you want it to be of the greatest possible use to the public, the best way to achieve this is to make it free software which everyone can redistribute and change under these terms. To do so, attach the following notices to the program. It is safest to attach them to the start of each source file to most effectively convey the exclusion of warranty; and each file should have at least the "copyright" line and a pointer to where the full notice is found. <one line to give the program's name and a brief idea of what it does.> Copyright (C) <year> <name of author> This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program; if not, write to the Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA. Also add information on how to contact you by electronic and paper mail.
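Attaching the notice to the start of each source file, as the paragraph above instructs, looks like this in practice (Python for illustration; the program name, year, and author are invented placeholders, not taken from this package):

```python
# frobnicate.py -- hypothetical example of the recommended per-file notice.
#
# frobnicate: a one-line description of what the program does.
# Copyright (C) 2024 Jane Developer
#
# This program is free software; you can redistribute it and/or modify it
# under the terms of the GNU General Public License as published by the
# Free Software Foundation; either version 2 of the License, or (at your
# option) any later version.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
# General Public License for more details.

def frobnicate(value):
    """Trivial placeholder body; the notice above is the point."""
    return value
```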
If the program is interactive, make it output a short notice like this when it starts in an interactive mode: Gnomovision version 69, Copyright (C) year name of author Gnomovision comes with ABSOLUTELY NO WARRANTY; for details type `show w'. This is free software, and you are welcome to redistribute it under certain conditions; type `show c' for details. The hypothetical commands `show w' and `show c' should show the appropriate parts of the General Public License. Of course, the commands you use may be called something other than `show w' and `show c'; they could even be mouse-clicks or menu items--whatever suits your program. You should also get your employer (if you work as a programmer) or your school, if any, to sign a "copyright disclaimer" for the program, if necessary. Here is a sample; alter the names: Yoyodyne, Inc., hereby disclaims all copyright interest in the program `Gnomovision' (which makes passes at compilers) written by James Hacker. <signature of Ty Coon>, 1 April 1989 Ty Coon, President of Vice This General Public License does not permit incorporating your program into proprietary programs. If your program is a subroutine library, you may consider it more useful to permit linking proprietary applications with the library. If this is what you want to do, use the GNU Lesser General Public License instead of this License. debmirror-2.16ubuntu1.1/doc/0000775000000000000000000000000012243022416012543 5ustar debmirror-2.16ubuntu1.1/doc/design.txt0000664000000000000000000000434012243012102014550 0ustar Here are some design Ideas for debmirror V2.0
=============================================

Files to Mirror:

- debs:   /dists/sid/main/binary-alpha/Packages
- source: /dists/sid/main/source/Sources
- D-I:    /dists/sid/main/debian-installer/binary-alpha/Packages
- disks:  /dists/woody/main/disks-alpha/*  (Any Release files there?)
- experimental: /project/experimental/main/binary-alpha/Packages
                /project/experimental/main/sources/Sources
- extras: doc, indices, project, tools, ls-lR
- trace: /project/trace

Source:

1) Central management core
 - parse config (apt sources.list format?)
 - read options
 - start modules (as needed or let them autostart when queues fill up?)
 - fetch Release.gpg files
 - check Release files and reget as needed
 - check Packages/Sources files and reget as needed
 - check other files
 - finalize cleanup module
 - get other files as needed
 - collect summaries
 - use trace files
 + checking a file registers it with the cleanup module too
 + files leaving a download object are fed into a Signature object and retried a few times
 + when waiting periodically probe modules for stats and display

2) Modules
 - Modules run as threads
 - One queue for files to be processed (in)
 - One queue for files finished processing (out)
 - Some status vars [files/bytes queued, files/byte getting, speed(1,5,15m ?)...] (out)
 - args function to get additional args (run() argument?)
 - help function to display help
 - finalize function (close in queue)
 - pause/continue to suspend the thread
 - wait_finalize (wait for finalize to finish and return summary, die)

3) Clean Module
 - gather a list of files present
 - gather a list of all files that should be there (in queue, from checking file)
 - cleanup files on finalize
 + limit IO/s
 + limit number of files / bytes to be deleted (prevent a mirror wipe)

4) Signature Modules
 - MD5sum
 - SHA1
 - GPG
 + limit IO/s
 + limit throughput

5) Download Modules
 - ftp
 - hftp (ftp via http://user:pass@proxy:port/)
 - http
 - https
 - rsync
 - wget (wget-ftp)
 - print (output what should be done)
 + limit IO/s
 + limit bandwidth
 + limit traffic

debmirror-2.16ubuntu1.1/debian/0000775000000000000000000000000012700374210013220 5ustar debmirror-2.16ubuntu1.1/debian/copyright0000664000000000000000000000061012243012102015147 0ustar Format: http://www.debian.org/doc/packaging-manuals/copyright-format/1.0/ Files: * Copyright: 2000-2001 Joey Hess 2001-2002 Joerg Wendland 2003-2007 Goswin von Brederlow 2009-2010 Frans Pop License: GPL-2+ The full text of the GPL is distributed as in /usr/share/common-licenses/GPL-2 on Debian systems. debmirror-2.16ubuntu1.1/debian/manpages0000664000000000000000000000001412243012102014720 0ustar debmirror.1 debmirror-2.16ubuntu1.1/debian/docs0000664000000000000000000000003312243012102014066 0ustar examples/ mirror_size TODO debmirror-2.16ubuntu1.1/debian/install0000664000000000000000000000002212243012102014602 0ustar debmirror usr/bin debmirror-2.16ubuntu1.1/debian/patches/0000775000000000000000000000000012700372655014662 5ustar debmirror-2.16ubuntu1.1/debian/patches/skip-installer.patch0000664000000000000000000000447412243012102020631 0ustar Description: allow releases to be skipped when fetching installer files.
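The "+ limit bandwidth" items in the download-module design notes above suggest a token bucket. A minimal sketch of that idea (Python for illustration only; debmirror itself is Perl and ships no such module, and the injectable clock is a testing convenience, not part of the design):

```python
# Illustrative token-bucket limiter for the "limit bandwidth" design item.
# Not debmirror code; names and the clock parameter are hypothetical.
class TokenBucket:
    def __init__(self, rate, burst, clock):
        self.rate = rate        # tokens (bytes) refilled per second
        self.capacity = burst   # maximum bucket size in bytes
        self.tokens = burst     # start full so an initial burst is allowed
        self.clock = clock      # injectable time source, e.g. time.monotonic
        self.last = clock()

    def consume(self, nbytes):
        """Return True if nbytes may be sent now, else False."""
        now = self.clock()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= nbytes:
            self.tokens -= nbytes
            return True
        return False
```

A download module would call consume() before writing each chunk and sleep briefly whenever it returns False.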
Author: Jamie Strandboge , Kees Cook Bug: http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=576576 Index: debmirror-2.15ubuntu1/debmirror =================================================================== --- debmirror-2.15ubuntu1.orig/debmirror 2013-05-07 16:26:55.364305760 -0600 +++ debmirror-2.15ubuntu1/debmirror 2013-05-07 16:26:55.356305761 -0600 @@ -85,6 +85,10 @@ F<./.temp> working directory, but won't replace the old meta files, won't download debs and source files and only simulates cleanup. +=item B<--skip-installer>=I + +Don't download debian-installer files for the specified distribution. + =item B<--help> Display a usage summary. @@ -583,7 +587,7 @@ our @config_files; our ($debug, $progress, $verbose, $passive, $skippackages, $getcontents, $i18n); our ($ua, $proxy, $ftp); -our (@dists, @sections, @arches, @ignores, @excludes, @includes, @keyrings); +our (@dists, @sections, @arches, @ignores, @excludes, @includes, @keyrings, @skip_installer); our (@excludes_deb_section, @limit_priority); our (%excludes_field, %includes_field); our (@di_dists, @di_arches, @rsync_extra); @@ -709,6 +713,7 @@ 'postcleanup' => \$post_cleanup, 'nocleanup' => \$no_cleanup, 'ignore=s' => \@ignores, + 'skip-installer=s' => \@skip_installer, 'exclude=s' => \@excludes, 'exclude-deb-section=s' => \@excludes_deb_section, 'limit-priority=s' => \@limit_priority, @@ -801,6 +806,8 @@ $post_cleanup=0 if ($no_cleanup); $post_cleanup=0 if ($pre_cleanup); $post_cleanup=0 if ($debmarshal); +@skip_installer=split(/,/,join(',',@skip_installer)); +@skip_installer=() unless @skip_installer; # Display configuration. $|=1 if $debug; @@ -2549,11 +2556,15 @@ di_cleanup() if @di_dists; } +# Figure out whether debian-installer should be skipped for a given dist. 
+my %skip_installer=("woody" => 1, "experimental" => 1); +foreach my $skipped_dist (@skip_installer) { + $skip_installer{$skipped_dist} = 1; +} + sub di_skip_dist { my $dist=shift; - if ( $dist eq "woody" || - $dist eq "experimental" || - $dist =~ /.*-updates/ ) { + if ( defined($skip_installer{$dist}) ) { return 1; } return 0; debmirror-2.16ubuntu1.1/debian/patches/default-settings.patch0000664000000000000000000000302212243012102021136 0ustar Description: increases rsync batching size, leaves off troublesome -I rsync flag, auto-flushes. Author: Kees Cook Bug: http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=455082 Index: debmirror-2.15ubuntu1/debmirror =================================================================== --- debmirror-2.15ubuntu1.orig/debmirror 2013-05-07 16:26:27.856306029 -0600 +++ debmirror-2.15ubuntu1/debmirror 2013-05-07 16:26:27.848306029 -0600 @@ -322,7 +322,7 @@ =item B<--rsync-options>=I Specify alternative rsync options to be used. Default options are -"-aIL --partial". Care must be taken when specifying alternative +"-aL --partial". Care must be taken when specifying alternative options not to disrupt operations, it's best to only add to those options. @@ -600,7 +600,7 @@ our $download_method="ftp"; our $timeout=300; our $max_batch=0; -our $rsync_batch=200; +our $rsync_batch=300; our $num_errors=0; our $bytes_to_get=0; our $bytes_gotten=0; @@ -611,7 +611,7 @@ our $start_time = time; our $dry_run=0; our $do_dry_run=0; -our $rsync_options="-aIL --partial"; +our $rsync_options="-aL --partial"; our $ignore_small_errors=0; our $diff_mode="use"; our $gzip_options="-9 -n --rsyncable"; @@ -626,6 +626,9 @@ my $HOME; ($HOME = $ENV{'HOME'}) or die "HOME not defined in environment!\n"; +# Switch to auto-flushing mode for stdout. +select STDOUT; $|=1; + # Load in config files first so options can override them. 
Getopt::Long::Configure qw(pass_through); GetOptions('config-file=s' => \@config_files); debmirror-2.16ubuntu1.1/debian/patches/series0000664000000000000000000000025412700372266016076 0ustar default-settings.patch silence-errors.patch drop-redundant-rsync.patch check_file-return.patch skip-installer.patch rsync-retries.patch ubuntu-settings.patch i18n-gz.patch debmirror-2.16ubuntu1.1/debian/patches/ubuntu-settings.patch0000664000000000000000000001426112243012102021043 0ustar Index: debmirror-2.15ubuntu1/debmirror =================================================================== --- debmirror-2.15ubuntu1.orig/debmirror 2013-05-07 16:26:59.908305716 -0600 +++ debmirror-2.15ubuntu1/debmirror 2013-05-07 16:26:59.904305716 -0600 @@ -11,7 +11,7 @@ =head1 DESCRIPTION -This program downloads and maintains a partial local Debian mirror. It can +This program downloads and maintains a partial local Ubuntu mirror. It can mirror any combination of architectures, distributions, and sections. Files are transferred by ftp, and package pools are fully supported. It also does locking and updates trace files. @@ -26,7 +26,7 @@ =item 1. download Packages and Sources files -First it downloads all Packages and Sources files for the subset of Debian it +First it downloads all Packages and Sources files for the subset of Ubuntu it was instructed to get. =item 2. download everything else @@ -65,7 +65,7 @@ This required (unless defined in a configuration file) parameter specifies where the local mirror directory is. If the directory does not exist, it will be created. Be careful; telling this program that your home directory is the -mirrordir is guaranteed to replace your home directory with a Debian mirror! +mirrordir is guaranteed to replace your home directory with an Ubuntu mirror! =item B<-p>, B<--progress> @@ -95,13 +95,13 @@ =item B<-h>, B<--host>=I -Specify the remote host to mirror from. Defaults to I, +Specify the remote host to mirror from. 
Defaults to I, you are strongly encouraged to find a closer mirror. =item B<-r>, B<--root>=I -Specifies the directory on the remote host that is the root of the Debian -archive. Defaults to F, which will work for most mirrors. The root +Specifies the directory on the remote host that is the root of the Ubuntu +archive. Defaults to F, which will work for most mirrors. The root directory has a F subdirectory. =item B<--method>=I @@ -129,7 +129,7 @@ =item B<-d>, B<--dist>=I -Specify the distribution (etch, lenny, squeeze, sid) of Debian to +Specify the distribution (lucid, oneiric, precise) of Ubuntu to mirror. This switch may be used multiple times, and multiple distributions may be specified at once, separated by commas. @@ -145,7 +145,7 @@ =item B<-s>, B<--section>=I -Specify the section of Debian to mirror. Defaults to +Specify the section of Ubuntu to mirror. Defaults to C. =item B<-a>, B<--arch>=I @@ -520,11 +520,11 @@ debmirror /srv/mirror/debian -Make a mirror of i386 and sparc binaries, main only, and include both unstable -and testing versions of Debian; download from 'ftp.kernel.org': +Make a mirror of i386 and amd64 binaries, main and universe only, and include +both LTS and latest versions of Ubuntu; download from 'archive.ubuntu.com': - debmirror -a i386,sparc -d sid -d etch -s main --nosource \ - -h ftp.nl.debian.org --progress $HOME/mirror/debian + debmirror -a i386,amd64 -d lucid -d precise -s main,universe --nosource \ + -h archive.ubuntu.com --progress $HOME/mirror/debian Make a mirror using rsync (rsync server is 'ftp.debian.org::debian'), excluding the section 'debug' and the package 'foo-doc': @@ -551,15 +551,15 @@ with debmirror's --keyring option -- see above). 
To add the right key to this keyring you can import it from the - debian keyring (in case of the debian archive) using: + ubuntu keyring (in case of the Ubuntu archive) using: - gpg --keyring /usr/share/keyrings/debian-archive-keyring.gpg --export \ + gpg --keyring /usr/share/keyrings/ubuntu-archive-keyring.gpg --export \ | gpg --no-default-keyring --keyring trustedkeys.gpg --import or download the key from a keyserver: gpg --no-default-keyring --keyring trustedkeys.gpg \ - --keyserver keyring.debian.org --recv-keys + --keyserver keyserver.ubuntu.com --recv-keys The can be found in the gpgv error message in debmirror: gpgv: Signature made Tue Jan 23 09:07:53 2007 CET using DSA key ID 2D230C5F @@ -597,10 +597,10 @@ our $post_cleanup=1; our $no_cleanup=0; our $do_source=1; -our $host="ftp.debian.org"; +our $host="archive.ubuntu.com"; our $user="anonymous"; our $passwd="anonymous@"; -our $remoteroot="debian"; +our $remoteroot="ubuntu"; our $download_method="ftp"; our $timeout=300; our $max_batch=0; @@ -779,9 +779,9 @@ # Post-process arrays. Allow commas to separate values the user entered. # If the user entered nothing, provide defaults. @dists=split(/,/,join(',',@dists)); -@dists=qw(sid) unless @dists; +@dists=qw(precise) unless @dists; @sections=split(/,/,join(',',@sections)); -@sections=qw(main contrib non-free main/debian-installer) unless @sections; +@sections=qw(main main/debian-installer universe restricted multiverse) unless @sections; @arches=split(/,/,join(',',@arches)); @arches=qw(i386) unless @arches; @arches=() if (join(',',@arches) eq "none"); @@ -2855,7 +2855,7 @@ and copyright 2009-2010 by Frans Pop . The author disclaims any responsibility for any mangling of your system, -unexpected bandwidth usage bills, meltdown of the Debian mirror network, +unexpected bandwidth usage bills, meltdown of the Debian/Ubuntu mirror network, etc, that this script may cause. See NO WARRANTY section of GPL. 
=head1 AUTHOR Index: debmirror-2.15ubuntu1/examples/debmirror.conf =================================================================== --- debmirror-2.15ubuntu1.orig/examples/debmirror.conf 2013-05-07 16:26:59.908305716 -0600 +++ debmirror-2.15ubuntu1/examples/debmirror.conf 2013-05-07 16:26:59.904305716 -0600 @@ -20,13 +20,13 @@ $debug=0; # Download options -$host="ftp.debian.org"; +$host="archive.ubuntu.com"; $user="anonymous"; $passwd="anonymous@"; -$remoteroot="debian"; +$remoteroot="ubuntu"; $download_method="ftp"; -@dists="sid"; -@sections="main,main/debian-installer,contrib,non-free"; +@dists="precise"; +@sections="main,main/debian-installer,universe,restricted,multiverse"; @arches="i386"; # @ignores=""; # @excludes=""; debmirror-2.16ubuntu1.1/debian/patches/check_file-return.patch0000664000000000000000000000213012243012102021244 0ustar Description: adjust logic to report on why a file is needed in verbose mode. Author: Kees Cook Bug: http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=455082 Index: debmirror-2.15ubuntu1/debmirror =================================================================== --- debmirror-2.15ubuntu1.orig/debmirror 2013-05-07 16:26:53.120305782 -0600 +++ debmirror-2.15ubuntu1/debmirror 2013-05-07 16:26:53.112305782 -0600 @@ -1568,6 +1568,7 @@ my %params=@_; my ($filename, $size)=delete @params{qw{filename size}}; if (! 
-f $filename) { + say("Missing: $filename") if ($verbose); return 0; } my $disksize = -s _; @@ -1597,11 +1598,16 @@ open HANDLE, $filename or die "$filename: $!"; $summer->addfile(*HANDLE); return 1 if $checksum eq $summer->hexdigest; + say(sprintf("Mismatch '$filename': sum is %s, expected %s", $summer->hexdigest, $checksum)) + if ($verbose); } else { return 1; } } + elsif ($verbose) { + say(sprintf("Mismatch '$filename': size is %d, expected %d", $disksize, $size)); + } return 0; } debmirror-2.16ubuntu1.1/debian/patches/drop-redundant-rsync.patch0000664000000000000000000000432712243012102021747 0ustar Description: drop the needless section of rsync code for trailing list. Author: Kees Cook Bug: http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=455082 Index: debmirror-2.15ubuntu1/debmirror =================================================================== --- debmirror-2.15ubuntu1.orig/debmirror 2013-05-07 16:26:50.424305809 -0600 +++ debmirror-2.15ubuntu1/debmirror 2013-05-07 16:26:50.416305809 -0600 @@ -1346,12 +1346,16 @@ my @result; my $i=0; my $j=0; + my @tofetch; $opt .= " --progress" if $progress; $opt .= " -v" if $verbose or $debug; $opt .= " -n" if $do_dry_run; $opt .= " --no-motd" unless $verbose; foreach my $file (sort keys %files) { - if (!$files{$file}) { + push(@tofetch, $file) if (!$files{$file}); + } + my $last = scalar(@tofetch); + foreach my $file (@tofetch) { my $dirname; my @dir; ($dirname) = $file =~ m:(.*/):; @@ -1360,7 +1364,10 @@ push (@result, "" . join('/', @dir[0..$_]) . 
"/"); } push (@result, "$file"); - if (++$j >= $rsync_batch) { + $i++; + $j++; + say("want $file ($i/$last $j/$rsync_batch)") if ($progress || $verbose); + if ($j >= $rsync_batch || $i == $last) { $j = 0; ($fh, $rsynctempfile) = tempfile(); if (@result) { @@ -1388,35 +1395,12 @@ } @result = (); } - if ($max_batch > 0 && ++$i >= $max_batch) { + if ($max_batch > 0 && ($i + 1) >= $max_batch) { print "Batch limit exceeded, mirror run will be partial\n"; push (@errlog,"Batch limit exceeded, mirror run was partial\n"); $num_errors++; last; } - } - } - ($fh, $rsynctempfile) = tempfile(); - if (@result) { - @result = sort(@result); - my $prev = "not equal to $result[0]"; - @result = grep($_ ne $prev && ($prev = $_, 1), @result); - for (@result) { - print $fh "$_\n"; - } - system ("rsync --timeout=$timeout $opt $rsyncremote --include-from=$rsynctempfile --exclude='*' $mirrordir"); - close $fh; - foreach my $dest (@result) { - if (-f $dest) { - if (!check_lists($dest)) { - say("$dest failed checksum verification"); - $num_errors++; - } - } elsif (!-d $dest) { - say("$dest missing"); - $num_errors++; - } - } } return; } debmirror-2.16ubuntu1.1/debian/patches/silence-errors.patch0000664000000000000000000000457412243012102020625 0ustar Description: silence any errors from find. Author: Kees Cook Bug: http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=455082 Index: debmirror-2.15ubuntu1/debmirror =================================================================== --- debmirror-2.15ubuntu1.orig/debmirror 2013-05-07 16:26:46.380305848 -0600 +++ debmirror-2.15ubuntu1/debmirror 2013-05-07 16:26:46.372305848 -0600 @@ -1432,7 +1432,7 @@ } chdir($tempdir) or die "unable to chdir($tempdir): $!\n"; my $res=0; -foreach my $file (`find . -type f`) { +foreach my $file (`find . -type f 2>/dev/null`) { chomp $file; $file=~s:^\./::; # this skips diff files if unwanted @@ -1507,7 +1507,7 @@ # makes directories.. Sort so they are removable in bottom-up # order. 
chdir($mirrordir) or die "chdir $mirrordir: $!"; - system("find . -depth -type d ! -name . ! -name .. -print0 | xargs -0 rmdir 2>/dev/null") if (! $do_dry_run); + system("find . -depth -type d ! -name . ! -name .. -print0 2>/dev/null | xargs -0 rmdir 2>/dev/null") if (! $do_dry_run); } if ($res != 0) { @@ -2542,7 +2542,7 @@ my $ignore; $ignore = "(".join("|", @ignores).")" if @ignores; # Remove all files in the mirror that we don't know about - foreach my $file (`find . -type f`) { + foreach my $file (`find . -type f 2>/dev/null`) { chomp $file; $file=~s:^\./::; next if (@di_dists && $file =~ m:installer-\w(-|\w)*/current/images/:); @@ -2683,11 +2683,11 @@ sub di_cleanup { # Clean up obsolete files - foreach my $image_dir (`find dists/ -type d -name images`) { + foreach my $image_dir (`find dists/ -type d -name images 2>/dev/null`) { next unless $image_dir =~ m:/installer-\w(-|\w)*/current/images$:; chomp $image_dir; chdir("$image_dir") or die "unable to chdir($image_dir): $!\n"; - foreach my $file (`find . -type f`) { + foreach my $file (`find . -type f 2>/dev/null`) { chomp $file; $file=~s:^\./::; if (! exists $di_files{$image_dir} || ! exists $di_files{$image_dir}{$file}) { @@ -2703,7 +2703,7 @@ # Clean up temporary D-I files (silently) if (-d "$tempdir/d-i") { chdir("$tempdir/d-i") or die "unable to chdir($tempdir/d-i): $!\n"; - foreach my $file (`find . -type f`) { + foreach my $file (`find . -type f 2>/dev/null`) { chomp $file; $file=~s:^\./::; unlink "$file" or die "unlink $tempdir/d-i/$file: $!\n"; debmirror-2.16ubuntu1.1/debian/patches/rsync-retries.patch0000664000000000000000000000220412243012102020466 0ustar Description: attempt to re-connect to rsync servers that time-out. 
Author: Kees Cook Bug: http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=576577 Index: debmirror-2.15ubuntu1/debmirror =================================================================== --- debmirror-2.15ubuntu1.orig/debmirror 2013-05-07 16:26:57.720305737 -0600 +++ debmirror-2.15ubuntu1/debmirror 2013-05-07 16:26:57.712305738 -0600 @@ -1385,8 +1385,21 @@ print $fh "$_\n"; } } - system ("rsync --timeout=$timeout $opt $rsyncremote --include-from=$rsynctempfile --exclude='*' $mirrordir"); - die "rsync failed!" if ($? != 0); + my $limit = 10; + while (1) { + system ("rsync --timeout=$timeout $opt $rsyncremote --include-from=$rsynctempfile --exclude='*' $mirrordir"); + my $rc = $?; + last if ($rc == 0); + # Retry on connection failures + if ($rc == 5<<8) { + die "rsync failed too many times!" if (--$limit == 0); + say("Pausing before retry..."); + sleep(30); + } + else { + die "rsync failed!"; + } + } close $fh; unlink $rsynctempfile; foreach my $dest (@result) { debmirror-2.16ubuntu1.1/debian/patches/i18n-gz.patch0000664000000000000000000000157512700372655017110 0ustar Description: Download Translation-*.gz in addition to Translation-*.bz2. Launchpad replaces bz2 with xz from xenial, so limiting to bz2 leaves nothing. 
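The retry loop that rsync-retries.patch adds above retries only when rsync exits with status 5 (the patch compares against 5<<8 because Perl's $? holds the packed wait status), sleeps 30 seconds between attempts, gives up after 10 tries, and fails immediately on any other non-zero status. The same control flow in Python (illustrative only; the runner and sleeper are injected stand-ins for invoking rsync):

```python
# Illustrative retry wrapper mirroring rsync-retries.patch: retry only on
# exit code 5 (error starting the rsync client-server protocol), give up
# after `limit` attempts, fail at once on any other non-zero status.
def run_with_retries(run, limit=10, sleep=lambda seconds: None):
    while True:
        rc = run()
        if rc == 0:
            return True
        if rc == 5:
            limit -= 1
            if limit == 0:
                raise RuntimeError("rsync failed too many times!")
            sleep(30)  # pause before retrying, as the patch does
        else:
            raise RuntimeError("rsync failed!")
```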
Author: William Grant Bug-Ubuntu: https://bugs.launchpad.net/ubuntu/+source/debmirror/+bug/1562118 Index: debmirror-2.16ubuntu1.1/debmirror =================================================================== --- debmirror-2.16ubuntu1.1.orig/debmirror +++ debmirror-2.16ubuntu1.1/debmirror @@ -2407,7 +2407,7 @@ sub i18n_from_release { next if substr($path, 0, length($compdir)) ne $compdir; my $filename = substr($path, length($compdir)+1, length($path)-length($compdir)-1); - next if $filename !~ /bz2$/; + next if $filename !~ /\.(?:gz|bz2)$/;; my ($sha1, $size) = ($file_lists{$path}{SHA1}, $file_lists{$path}{size}); if(!(defined($include) && ($subdir."/".$filename)=~/$include/o)) { debmirror-2.16ubuntu1.1/debian/changelog0000664000000000000000000011344212700374206015104 0ustar debmirror (1:2.16ubuntu1.1) trusty; urgency=medium * Download Translation-*.gz in addition to Translation-*.bz2. Launchpad replaces bz2 with xz from xenial. (LP: #1565585) -- William Grant Mon, 04 Apr 2016 14:56:40 +1000 debmirror (1:2.16ubuntu1) trusty; urgency=low * debian/patches/skip-installer.patch: Don't skip -updates when mirroring debian-installer components, as we publish d-i to updates (LP: #1034679) * Merge from Debian unstable. Remaining changes: - debian/{control,rules}: Add quilt for patch management. - Debian bug 455082: + default-settings.patch: change rsync defaults. + silence-errors.patch: throw away find errors. + drop-redundant-rsync.patch: clean up logic in rsync batching. + check_file-return.patch: report why a file needs to sync. - Debian bug 576576: + skip-installer.patch: allow specific releases to be skipped. - Debian bug 576577: + rsync-retries.patch: retry if rsync batch fails connection. - ubuntu-settings.patch: Changed defaults to download Ubuntu, replaced most instances of Debian with Ubuntu in the documentation. * Rebase patches, and drop origin-canonical.patch, included upstream. 
-- Matthew Fischer Tue, 19 Nov 2013 19:31:48 -0700 debmirror (1:2.16) unstable; urgency=low * Fix confusing output with --precleanup. Closes: #708355 * Support new Contents file location used for wheezy, while still handling the old location. Closes: #637442 Thanks, Christoph Pleger -- Joey Hess Mon, 26 Aug 2013 12:41:03 -0400 debmirror (1:2.15ubuntu2) saucy; urgency=low * debian/patches/skip-installer.patch: Don't skip -updates when mirroring debian-installer components, as we publish d-i to updates (LP: #1034679) -- Adam Conrad Fri, 05 Jul 2013 01:32:31 -0600 debmirror (1:2.15ubuntu1) saucy; urgency=low * Merge from Debian unstable. Remaining changes: - debian/{control,rules}: Add quilt for patch management. - Debian bug 455082: + default-settings.patch: change rsync defaults. + silence-errors.patch: throw away find errors. + drop-redundant-rsync.patch: clean up logic in rsync batching. + check_file-return.patch: report why a file needs to sync. - Debian bug 576576: + skip-installer.patch: allow specific releases to be skipped. - Debian bug 576577: + rsync-retries.patch: retry if rsync batch fails connection. - ubuntu-settings.patch: Changed defaults to download Ubuntu, replaced most instances of Debian with Ubuntu in the documentation. * Rebase patches, and drop origin-canonical.patch, included upstream. -- Adam Conrad Tue, 07 May 2013 16:15:24 -0600 debmirror (1:2.15) unstable; urgency=low * Improved interface to gpgv. Thanks, Tom Jones. * Add --keyring option. Thanks, Tom Jones. * Add --exclude-field and --include-field options. Closes: #695767. Thanks, Colin Watson * Supports https. Closes: #697687 Thanks, Fernando Ike * Treat "Origin: Canonical" like "Origin: Ubuntu" Closes: #702319. Thanks, Tal Danzig -- Joey Hess Sat, 04 May 2013 23:44:27 -0400 debmirror (1:2.14ubuntu1.2) raring; urgency=low * origin-canonical.patch: Resolve issue of not being able to mirror apt archives with the Release file set as "Origin: Canonical", by adding additional special case. 
Patch courtesy of Tal Danzig. (LP: #1174797) -- David Medberry Tue, 30 Apr 2013 15:55:53 +0000 debmirror (1:2.14ubuntu1) quantal; urgency=low * Merge from Debian unstable. Remaining changes: - debian/{control,rules}: Add quilt for patch management. - Debian bug 455082: + default-settings.patch: change rsync defaults. + silence-errors.patch: throw away find errors. + drop-redundant-rsync.patch: clean up logic in rsync batching. + check_file-return.patch: report why a file needs to sync. - Debian bug 576576: + skip-installer.patch: allow specific releases to be skipped. - Debian bug 576577: + rsync-retries.patch: retry if rsync batch fails connection. - ubuntu-settings.patch: Changed defaults to download Ubuntu, replaced most instances of Debian with Ubuntu in the documentation. * Refreshed patches. -- Stefano Rivera Sat, 30 Jun 2012 23:01:27 +0200 debmirror (1:2.14) unstable; urgency=low * Add --config-file option. Closes: #638295. Thanks, Stefan Kisdaroczi and Matthias Schmitz. * Do not crash on i18n/Index with --debmarshal Closes: #676219. Thanks, Frank Luithle -- Joey Hess Tue, 26 Jun 2012 19:29:31 -0400 debmirror (1:2.13) unstable; urgency=low * Fix mirroring of Translation files for suites (currently contrib and non-free) for which there are no i18n Index files. Use information from Release files instead. Closes: #673444, #644609. Thanks, Joris Dolderer -- Joey Hess Tue, 19 Jun 2012 14:33:41 -0400 debmirror (1:2.12ubuntu1) quantal; urgency=low * Merge from Debian testing. Remaining changes: - debian/{control,rules}: Add quilt for patch management. - Debian bug 455082: + default-settings.patch: change rsync defaults. + silence-errors.patch: throw away find errors. + drop-redundant-rsync.patch: clean up logic in rsync batching. + check_file-return.patch: report why a file needs to sync. - Debian bug 576576: + skip-installer.patch: allow specific releases to be skipped. - Debian bug 576577: + rsync-retries.patch: retry if rsync batch fails connection. 
    - ubuntu-settings.patch: Changed defaults to download Ubuntu, replaced
      most instances of Debian with Ubuntu in the documentation.
  * Refreshed patches.
  * Updated ubuntu-settings.patch to default to precise.

 -- Stefano Rivera  Fri, 04 May 2012 23:52:36 +0200

debmirror (1:2.12) unstable; urgency=low

  * Always mirror the English "translations" necessary to have any package
    long descriptions at all, even if --i18n is not enabled. (Unless
    disabled via --exclude.)
  * Make i18n/Index parsing not fail if there are non-SHA1 checksums.
    Although currently only SHA1 is supported here. Closes: #644609

 -- Joey Hess  Fri, 13 Apr 2012 12:19:34 -0400

debmirror (1:2.11) unstable; urgency=low

  * Two fixes to output. Closes: #647562
    Thanks, Karl Goetz
  * Support HTTP authentication by setting --user and --password.
    Closes: #650743  Thanks, Eshat Cakar
  * --timeout now also affects http and ftp download. Closes: #662694
    Thanks, Christoph Goehre
  * Remove libcompress-zlib-perl from Depends, this is now provided by
    perl.

 -- Joey Hess  Mon, 05 Mar 2012 17:05:29 -0400

debmirror (1:2.10ubuntu1) precise; urgency=low

  * Merge from Debian unstable. Remaining changes:
    - debian/{control,rules}: Add quilt for patch management.
    - Debian bug 455082:
      + default-settings.patch: change rsync defaults.
      + silence-errors.patch: throw away find errors.
      + drop-redundant-rsync.patch: clean up logic in rsync batching.
      + check_file-return.patch: report why a file needs to sync.
    - Debian bug 576576:
      + skip-installer.patch: allow specific releases to be skipped.
    - Debian bug 576577:
      + rsync-retries.patch: retry if rsync batch fails connection.
    - ubuntu-settings.patch: Changed defaults to download Ubuntu, replaced
      most instances of Debian with Ubuntu in the documentation.
  * Refreshed patches.
  * Slimmed down skip-installer.patch, largely superseded thanks to
    Debian bug 636627.

 -- Stefano Rivera  Sat, 12 Nov 2011 16:23:21 +0200

debmirror (1:2.10) unstable; urgency=low

  * Fix skipping d-i for suites that do not include it.
    Closes: #636627  Thanks, Stefan Kisdaroczi
  * Allow mirroring d-i on kfreebsd-*; skip it for arch 'all'.
    Closes: #637457  Thanks, Stefan Kisdaroczi

 -- Joey Hess  Mon, 31 Oct 2011 10:44:07 -0400

debmirror (1:2.9ubuntu1) oneiric; urgency=low

  * Merge from debian unstable. Remaining changes:
    - debian/{control,rules}: Add quilt for patch management.
    - Debian bug 455082:
      + default-settings.patch: change rsync defaults.
      + silence-errors.patch: throw away find errors.
      + drop-redundant-rsync.patch: clean up logic in rsync batching.
      + check_file-return.patch: report why a file needs to sync.
    - Debian bug 576576:
      + skip-installer.patch: allow specific releases to be skipped.
    - Debian bug 576577:
      + rsync-retries.patch: retry if rsync batch fails connection.
    - ubuntu-settings.patch: Changed defaults to download Ubuntu, replaced
      most instances of Debian with Ubuntu in the documentation.

 -- Stefano Rivera  Wed, 10 Aug 2011 17:44:50 +0200

debmirror (1:2.9) unstable; urgency=low

  * Use Net::INET6Glue to support making ipv6 connections. Closes: #631302
  * Avoid sanity check for empty mirror, to allow mirroring an empty
    mirror if this is the first time debmirror is run. Closes: #635723
    Thanks, Stefan Kisdaroczi

 -- Joey Hess  Thu, 28 Jul 2011 14:57:36 +0200

debmirror (1:2.8ubuntu1) oneiric; urgency=low

  * Merge from Debian unstable. Remaining changes:
    - debian/{control,rules}: Add quilt for patch management.
    - Debian bug 455082:
      + default-settings.patch: change rsync defaults.
      + silence-errors.patch: throw away find errors.
      + drop-redundant-rsync.patch: clean up logic in rsync batching.
      + check_file-return.patch: report why a file needs to sync.
    - Debian bug 576576:
      + skip-installer.patch: allow specific releases to be skipped.
    - Debian bug 576577:
      + rsync-retries.patch: retry if rsync batch fails connection.
    - ubuntu-settings.patch: Changed defaults to download Ubuntu, replaced
      most instances of Debian with Ubuntu in the documentation.
  * Dropped patches:
    - no-motd changes from default-settings.patch, superseded in Debian.
    - functionalize.patch: Accepted in Debian.
    - command-exit-checking.patch: Accepted in Debian.

 -- Stefano Rivera  Sun, 05 Jun 2011 22:00:27 +0200

debmirror (1:2.8) unstable; urgency=low

  * Avoid trying to get d-i for *-updates suites. Closes: #619146

 -- Joey Hess  Mon, 21 Mar 2011 13:03:45 -0400

debmirror (1:2.7) unstable; urgency=high

  * Version dep on LWP. Closes: #614594
  * Fix typo in d-i download code. Closes: #614620

 -- Joey Hess  Tue, 22 Feb 2011 13:54:12 -0400

debmirror (1:2.6) unstable; urgency=high

  * In Release, Packages, and Sources files, support all sizes of SHA
    checksums that are supported by Digest::SHA. Closes: #614383
  * Now depends on libdigest-sha-perl.
  * Check SHA512, SHA256, or SHA1 in preference to MD5.
  * Full checksum validation is now enabled by the --checksums switch.
    (The old --md5sums switch is an alias to that for backwards
    compatibility.)

 -- Joey Hess  Mon, 21 Feb 2011 19:28:07 -0400

debmirror (1:2.5) unstable; urgency=low

  * Clean up output messages.
  * Apply program return code checking patch by Kees Cook.
  * Allow umask to control directory permissions in mkdir. Closes: #589397
  * Add --slow-cpu option that avoids bzipping and gzipping files.
    Closes: #594948
  * Various code cleanups.
  * Deprecate --cleanup, which had become confusing since --postcleanup is
    the default, and add --precleanup.
  * Add --check-gpg (the default) and --no-check-gpg options.
  * Added a warning message if rsync of extra files fails.
  * Default to --rsync-extra=trace, and warn if --rsync-extra setting
    disables trace.
  * Above will ensure that debmirror at least tries to get trace files by
    default, and warns if it cannot. Closes: #156564
  * Avoid getting rsync-extra files if the site being mirrored has
    --root=/, as that cannot work with rsync.
  * A proxy specified with --proxy or ftp_proxy will now also be used for
    ftp connections.
  * Deprecate --method=hftp, just use --proxy with --method=ftp.
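The 1:2.6 entry above switches validation to the strongest available digest (SHA512, then SHA256, then SHA1, with MD5 as a last resort). A minimal shell sketch of that style of check, using a recorded digest the way a Release file records one (file names here are illustrative, not debmirror's own layout):

```shell
# Record and verify an index file's checksum, Release-file style.
# The file name "Packages" and the .sha256 list are illustrative.
printf 'Package: foo\nVersion: 1.0\n' > Packages

# Record the digest (a Release file lists digest, size, and path).
sha256sum Packages > Packages.sha256

# On a later run, verify the file before trusting it; a corrupted or
# tampered file makes this command exit non-zero.
sha256sum -c Packages.sha256
```

The same pattern applies with `sha512sum` or `sha1sum`; a mirror script would try the strongest digest listed in the metadata first.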
  * Run rsync with --no-motd except for in --verbose mode.
  * Support --progress for http. (Per #598382)
  * Apply manpage markup patch from liw. Closes: #599414
  * Fix typo in default rsync options. Closes: #599568
  * Add --debmarshal snapshot mode by Drake Diedrich. Closes: #550007
  * Send verbose/debug mode gpgv error messages to stdout. Closes: #607099

 -- Joey Hess  Sat, 05 Feb 2011 12:40:14 -0400

debmirror (1:2.4.6ubuntu1) natty; urgency=low

  [ Stefano Rivera ]
  * Merge from debian unstable. Remaining changes:
    - debian/{control,rules}: add quilt for patch management.
    - Debian bug 455082:
      - default-settings.patch: change rsync defaults.
      - silence-errors.patch: throw away find errors.
      - functionalize.patch: prepare for batching of non-deb files.
      - drop-redundant-rsync.patch: clean up logic in rsync batching.
      - check_file-return.patch: report why a file needs to sync.
      - command-exit-checking.patch: check exit codes of commands.
    - skip-installer.patch: allow specific releases to be skipped
      (Debian bug 576576).
    - rsync-retries.patch: retry if rsync batch fails connection
      (Debian bug 576577).

  [ Francisco Javier P.L. ]
  * ubuntu-settings.patch: Changed defaults to download Ubuntu
    (LP: #64345), replaced most instances of Debian with Ubuntu in the
    documentation. Changed examples to be more Ubuntu relevant, patch
    thanks to Karl Goetz.

 -- Stefano Rivera  Fri, 05 Nov 2010 22:05:40 +0200

debmirror (1:2.4.6) unstable; urgency=low

  * New maintainer. Frans, we'll miss you. Closes: #595690
  * Moved to git; grafted in debmirror's original development history from
    my home directory. Closes: #594976
  * Fix filename of mirror_size file in man page. Closes: #594975

 -- Joey Hess  Sun, 05 Sep 2010 18:54:21 -0400

debmirror (1:2.4.5) unstable; urgency=low

  * Drop support for the --adddir option which was obsoleted long ago.
  * Ensure MD5SUMS files for D-I images get updated. Closes: #590667.
    Thanks to Stefan Kisdaroczi for reporting the issue and for the patch.
  * Update archive size information (without Etch as that has been
    archived).

 -- Frans Pop  Thu, 05 Aug 2010 16:09:13 +0200

debmirror (1:2.4.4ubuntu2) lucid; urgency=low

  * drop-redundant-rsync.patch: fix index double-increment (LP: #560826)

 -- Steve Beattie  Sun, 11 Apr 2010 10:02:11 -0700

debmirror (1:2.4.4ubuntu1) lucid; urgency=low

  * Merge from debian unstable (LP: #555967). Remaining changes:
    - debian/{control,rules}: add quilt for patch management.
    - Debian bug 455082:
      - default-settings.patch: change rsync defaults.
      - silence-errors.patch: throw away find errors.
      - functionalize.patch: prepare for batching of non-deb files.
      - drop-redundant-rsync.patch: clean up logic in rsync batching.
      - check_file-return.patch: report why a file needs to sync.
      - command-exit-checking.patch: check exit codes of commands.
    - skip-installer.patch: allow specific releases to be skipped
      (Debian bug 576576).
    - rsync-retries.patch: retry if rsync batch fails connection
      (Debian bug 576577).
  * Dropped rsync Release file batching for now, as it makes merging much
    more difficult.

 -- Kees Cook  Mon, 05 Apr 2010 12:38:24 -0700

debmirror (1:2.4.4) unstable; urgency=low

  * Fix typo in mirror_size. Closes: #575352.
  * Change internal use of dry-run variables so that setting $dry_run in
    the config file actually works. Closes: #569348.

 -- Frans Pop  Thu, 25 Mar 2010 19:14:12 +0100

debmirror (1:2.4.3) unstable; urgency=low

  * Don't delete Contents and Translation files if mirror cleanup is done
    early. Closes: #568613.

 -- Frans Pop  Sat, 06 Feb 2010 14:42:36 +0100

debmirror (1:2.4.2) unstable; urgency=low

  * Really allow for Release files without Origin field. Closes: #565593.

 -- Frans Pop  Sun, 31 Jan 2010 14:21:46 +0100

debmirror (1:2.4.1) unstable; urgency=low

  * Typo fixes in NEWS.Debian file spotted by Geoff Simmons.
    Closes: #566377.
  * Allow for Release files without Origin field. Closes: #565593.

 -- Frans Pop  Sun, 24 Jan 2010 07:23:06 +0100

debmirror (1:2.4) unstable; urgency=low

  * The main Debian archive has started to use rsyncable gzip files. Use
    the --rsyncable flag when compressing Packages/Sources files using
    gzip after applying pdiffs to ensure the md5sum of the file matches
    the one in the Release file again. Closes: #560326.
    This change may cause unnecessary download of the gzipped
    Packages/Sources files for other archives that provide pdiffs but
    don't have rsyncable gzipped files; this can be fixed using the new
    option --gzip-options.
  * Fix mirroring of archives without a Release.gpg file. Closes: #561533.
    Thanks to Loïc Minier for tracing the issue.
  * Allow to specify the local mirror directory in config files.
    Closes: #544141.
  * Add versioned dependency on perl (>= 5.10). Closes: #551878.
  * Improve dependencies on gpgv/gnupg.

 -- Frans Pop  Sat, 19 Dec 2009 22:21:38 +0100

debmirror (1:2.3.1) unstable; urgency=low

  * Update example configuration (closes: #549955):
    - fix error in variable names for setting D-I dists & arches
    - add example for setting "extra rsync directories"
  * Rename variables so that @dists can be set again in a configuration
    file. Closes: #549952.
  * Enable LWP::ConnCache for the http transfer method. Closes: #395538.
    Thanks to Gregor Herrmann for pointing out the option.

 -- Frans Pop  Thu, 08 Oct 2009 19:39:41 +0200

debmirror (1:2.3) unstable; urgency=low

  * Support updating Contents files using diff files. This can
    significantly reduce the download size when Contents files change.
    Closes: #436027.
  * Because of the previous change the option --pdiff has been renamed to
    --diff as 'package diffs' no longer covers its use.
  * Fix mirroring archives without a Release file
    (--ignore-missing-release).
  * Minor other fixes and improvements.

 -- Frans Pop  Sat, 03 Oct 2009 13:33:46 +0200

debmirror (1:2.2.1) unstable; urgency=low

  * Only fetch i18n Index files if needed.
  * Fix mirroring D-I images when the archive is also being mirrored for
    the same dist. Closes: #547789.

 -- Frans Pop  Tue, 22 Sep 2009 18:12:05 +0200

debmirror (1:2.2) unstable; urgency=low

  * Allow to include/exclude specific files belonging to D-I images.
  * Add support for downloading package translation files.
    Closes: #436030.
  * Move archive size information to a separate file in /usr/share/doc.

 -- Frans Pop  Sat, 12 Sep 2009 08:59:31 +0200

debmirror (1:2.1.1) unstable; urgency=low

  * Register the trace and lock files only after loading the state cache.

 -- Frans Pop  Mon, 31 Aug 2009 14:16:47 +0200

debmirror (1:2.1) unstable; urgency=low

  * Fix location of debmirror.conf. Closes: #544139.
  * Don't display download speed if rsync is used. Closes: #422100.
  * Support mirroring specific additional files from specific locations on
    the mirror: trace files, ./doc, ./indices and ./tools. The transfer
    method used for this is always rsync, irrespective of what method is
    specified in the --method option. Closes: #153680, #156564.
  * Ubuntu uses an identical Codename for different suites, so just ignore
    it and use the Suite instead. Closes: #544132.

 -- Frans Pop  Sat, 29 Aug 2009 18:55:25 +0200

debmirror (1:2.0) unstable; urgency=low

  * Remove duplicated checks of md5sums for Packages/Sources files.
  * Improve performance of parsing Packages/Sources files (by a factor of
    about 30).
  * Revert change in directory removal as otherwise also empty parent
    directories of empty directories no longer get removed.
  * Fix support for mirrors which need extra directories in dist, such as
    security mirrors, which got broken by the suite->codename symlink
    changes. Thanks to Christoph Goehre for reporting the issue and
    testing the fix. Closes: #543775.
  * No longer requires a leading "/" or ":" for the root directory. This
    means the same --root option can be used for both http/ftp and rsync.
  * Improve accounting of download size and display in B/kiB/MiB depending
    on the size of the download. Closes: #520487.
  * Don't write the trace file until the meta data is also in place, and
    don't write one during a dry run.
  * Add option to use a cache file to save the state of the mirror between
    runs, allowing for more efficient mirror runs. Closes: #483922.
  * Supports mirroring "current" Debian Installer images. With the option
    to specify a different set of dists and arches than for the package
    archive. In this release there are no progress updates displayed yet.
    Closes: #154966.

 -- Frans Pop  Fri, 28 Aug 2009 15:32:37 +0200

debmirror (1:1.0.1) unstable; urgency=low

  * Skip debian-installer sections for source packages. D-I only has
    binary packages; the source is included in the regular sections.
    Closes: #542826. Based on a patch from Ernesto Hernández-Novich, with
    thanks.
  * Allow for the fact that for experimental the suite and codename are
    identical. Thanks to Craig Sanders. Closes: #542929.

 -- Frans Pop  Sun, 23 Aug 2009 07:05:24 +0200

debmirror (1:1.0) unstable; urgency=low

  * Switch to more common versioning scheme; requires an epoch.
  * Clarify version of GPL (version 2 or later).
  * Update periods of activity for various maintainers of the script both
    in the perl script and in the debian/copyright file. Closes: #542061.
  * Set more accurate versioned build dependency on debhelper.
  * Apply patch from Kees Cook to make parsing of Packages/Sources files a
    bit less fragile. Closes: #451021.
  * Add sanity check after parsing Packages/Sources files to avoid
    completely deleting a mirror in case of unexpected errors
    (#451021, #435663).
  * Debian mirrors no longer keep uncompressed packages files; don't
    include them on the local mirror either.
  * Apply patch from A. Mennucc for more efficient removal of empty
    directories. Closes: #453091.
  * Various improvements of the man page for:
    - the --getcontents switch; with thanks to Slaven Rezic;
      closes: #524967
    - example commands; with thanks to Karl Goetz; closes: #491326
    - debmirror.conf configuration file and example
  * Don't fetch Contents files if they are already up-to-date (#436027).
  * Remove redundant slashes in paths from Package files. Closes: #471946.
    Thanks to Raphael Hertzog for the patch.
  * Update tables showing archive size in man page, using new mirror-size
    script. Closes: #498541.
  * Automatically create and update suite->codename symlinks based on info
    in the Release file. Directories for dists will always have the
    codename of the release. Conversion of existing mirrors that use
    suites for directories is supported. See also NEWS.Debian.
    Closes: #426170.

 -- Frans Pop  Thu, 20 Aug 2009 19:43:39 +0200

debmirror (20090807) unstable; urgency=low

  * New maintainer, with thanks to Goswin for his work on previous
    releases.
  * Remove no longer needed prerm script.
  * Correct syntax of NEWS.Debian file.
  * Switch build system to debhelper.
  * Bump standards version to 3.8.2.
  * Improve documentation on how to add an archive keyring for debmirror.
    Thanks to Kees Cook for the patch. Closes: #451157.

 -- Frans Pop  Fri, 07 Aug 2009 19:24:01 +0200

debmirror (20070123ubuntu3) jaunty; urgency=low

  * Fix a regression in the estimated download size calculation introduced
    by skip-installer.patch. (LP: #34376)

 -- Anders Kaseorg  Sun, 15 Feb 2009 16:42:27 -0500

debmirror (20070123ubuntu2) jaunty; urgency=low

  * debian/{control,rules}: Added "quilt" to manage Ubuntu set of patches.
  * Extracted earlier changes into logical patches:
    - gpg-key-documentation.patch (debian bug 451157).
    - gzreadline-update.patch (debian bug 451021).
  * Add rsync-batching.patch: enable sane rsync batching instead of
    one-at-a-time fetches (debian bug 455082).
  * Add rsync-retries.patch: retry rsyncs if server connections time out.
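The 1:1.0 entry above describes deriving the on-disk directory name from the Codename field of the Release file and creating a symlink for the Suite. A small shell sketch of that scheme (paths and field values are illustrative; this is not debmirror's own code):

```shell
# Sketch of the suite->codename symlink layout from the 1:1.0 entry.
# The dists/ layout and the Release contents below are illustrative.
mkdir -p dists/bullseye
printf 'Suite: stable\nCodename: bullseye\n' > dists/bullseye/Release

# Extract both fields from the Release file.
suite=$(sed -n 's/^Suite: //p' dists/bullseye/Release)
codename=$(sed -n 's/^Codename: //p' dists/bullseye/Release)

# The directory keeps the codename; the suite becomes a symlink to it,
# so --dist=stable and --dist=bullseye end up in the same place.
ln -sfn "$codename" "dists/$suite"
```

After this, `dists/stable` resolves to `dists/bullseye`, which is why passing either name to later runs makes no difference.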
  * Add skip-installer.patch: handle installer section and skipping,
    thanks to Jamie Strandboge.

 -- Kees Cook  Fri, 13 Feb 2009 18:18:39 -0800

debmirror (20070123ubuntu1) hardy; urgency=low

  * Patched to handle changes to Compress::Zlib gzreadline semantics
    (LP: #157362).
  * Update documentation to detail the correct keyring to use
    (LP: #90546).

 -- Kees Cook  Mon, 12 Nov 2007 14:10:57 -0800

debmirror (20070123) unstable; urgency=low

  * Add dependency for libdigest-sha1-perl (ACK NMU) (Closes: #386707)
  * Change manpage entry for --pdiff (Closes: #386697)
  * Fix Release.gpg check to use gpgv (Closes: #400526)
  * Fix use of uninitialized value in addition
  * Count errors in pdiff files as small errors (Closes: #400054)
  * Cleanup tempfiles (Closes: 399834)
  * Fix manpage permissions with patch by "Dmitry E. Oboukhov"
    (Closes: #399058)
  * Skip pdiff files if patch binary is missing (Closes: #401245)
  * Skip pdiff files if ed binary is missing and recommend it
    (Closes: #397936)

 -- Goswin von Brederlow  Tue, 23 Jan 2007 14:53:14 +0100

debmirror (20060907) unstable; urgency=low

  * Merge pdiff patch by Peter Colberg (Closes: #366855)
  * Add --pdiff-mode option
  * Add rsync patch by Peter Colberg (Closes: #299342)
  * Disable caching for Release and Release.gpg (Closes: #376495)
  * Default to --postcleanup (Closes: #295423)
  * Print ftp hashes to stdout (Closes: #349856)
    (Patch by Bastian Kleineidam)
  * Fix typo found by Luca Bruno (Closes: #362561)
  * Implement ftp authentication with user/passwd (Closes: #360453)
    (Patch by Peter Baumann)
  * Skip Index files that don't exist locally nor in Release.
    Obsoletes other ideas from the BTS
    (Closes: #369061, #360451, #382271)
  * Fail immediately if the signature cannot be verified (Closes: #316528)
  * Show gpg error message on failure (Closes: #316529)
  * Skip gpg test if --ignore-release-gpg is specified (Closes: #322714)
  * Re-add --skippackages (Closes: #294974)

 -- Goswin von Brederlow  Thu, 7 Sep 2006 15:36:47 +0200

debmirror (20051209) unstable; urgency=low

  * Reorder find arguments (Closes: #316461) Patch by Craig Sanders
  * Move Contents file fetching out of stage 1 to make them not critical
    (Closes: #314282)
  * Add % progress for http method (Closes: #328312)
  * Add archive sizes to the manpage (Closes: #340423)
  * Consider meta file sizes for % progress (Closes: #341910)
  * Don't say 'All done' until really all is done (Closes: #319957)
  * Remove obsolete --nomd5sum option (Closes: #321278)
  * Prefer --proxy over ENV{ftp_proxy} for hftp (Closes: #334360)
  * Add tip about gpg to the manpage (Closes: #316506)
  * Don't check/count source files multiple times

 -- Goswin von Brederlow  Fri, 9 Dec 2005 18:31:21 +0100

debmirror (20050207) unstable; urgency=low

  * Add NEWS.Debian (Closes: #289025)
  * Add ~/.debmirror.conf and /etc/debmirror.conf (Closes: #244023)
  * Typo fix by Nico Golde and more (Closes: #292791)
  * Add example config file

 -- Goswin von Brederlow  Mon, 7 Feb 2005 05:30:34 +0100

debmirror (20050118) unstable; urgency=low

  * Add --no-tty option to gpg (Closes: #289286)
    reported by Holger Ruckdeschel
  * Move cleanup code into function and add missing chdir
    (Closes: #287465) adapted patch by Daniel Parthey
  * Unlink hardlinks before system calls with redirected IO
    (Closes: #288814) adapted patch by Mirko Parthey
  * Unlink metafiles later (Closes: #289752) patch by Ingo Saitz
  * Typo fixes as found by Martin Kourim (Closes: #287732)
  * Add --ignore-small-errors to allow updating inconsistent upstream
    mirrors (Closes: #288973)
  * Hide gpg signature check output if !verbose (Closes: #286575)

 -- Goswin von Brederlow  Tue, 18 Jan 2005 02:59:34 +0200

debmirror (20041219) unstable; urgency=low

  * Tell LockFile::Simple not to force unlocking after an hour even if the
    old debmirror is still running. (Closes: #286330)

 -- Goswin von Brederlow  Sun, 19 Dec 2004 18:18:34 +0200

debmirror (20041209) unstable; urgency=high

  * hide gpg --version output
  * test for gpg and give clueful error
  * add Recommends: gnupg
  * add trailing / to remoteroot for rsync
  * add --ignore-release-gpg and gpg check Release
  * Remember size/md5sums of files to get and check after retrieval
  * L 1046: Only call $ftp->size($file) once to avoid different results
  * Handle EOF in Release when searching for md5sums, patch by dean gaudet
    (Closes: #284037)
  * Fail on chdir failures, patch by dean gaudet (Closes: #283457)
  * Fixed division by 0 as reported by Jeroen van Wolffelaar
    (Closes: #277422) [urgency high, should have been RC]
  * Fixed ftp failures not detected as reported by Dean Gaudet
    (Closes: #281151)

 -- Goswin von Brederlow  Thu, 09 Dec 2004 18:36:34 +0200

debmirror (20040926) unstable; urgency=low

  * Skip Contents files for *-proposed-updates and experimental
  * Skip debian-installer section for experimental and proposed-updates
    (Closes: #267721)
  * Cleanup empty directories only at the very end to avoid races with
    .temp (Closes: #264503)
  * Add -L to default rsync options (Closes: #265575)
  * Add --rsync-options option (Closes: #193797, #219976, #267034)
  * Copy meta files in cases where hardlinks fail (afs) (Closes: #267956)
  * Unlink meta files before download (Closes: #264504)

 -- Goswin von Brederlow  Sun, 16 Sep 2004 14:29:34 +0200

debmirror (20040802) unstable; urgency=low

  * Display Byte counters in MiB and speed in KiB/s
  * Fix progress/verbose output for ftp method broken by --dry-run
  * Fix rsync method for --dry-run
  * Add --rsync-batch option limiting the number of files per rsync call
  * Count 'batch limit exceeded' as error
  * Fix XSI:isms in prerm reported by David Weinehall (Closes: #262893)

 -- Goswin von Brederlow  Mon, 2 Aug 2004 13:43:34 +0200

debmirror (20040730) unstable; urgency=low

  * Don't download Contents-$arch.gz for experimental.
    Thanks to Eric Wong
  * Add main/debian-installer to the default sections
  * Add support for http and hftp, adding --proxy option
    (Adapted from patch by thomas@poindessous.com)
    (Closes: #134187, #154364, #196196, #229666)

 -- Goswin von Brederlow  Fri, 30 Jul 2004 23:05:34 +0200

debmirror (20040729) unstable; urgency=low

  * Download meta files to temp directory first (Closes: #219977)
  * Added --postcleanup
  * Download Release files for sources (Closes: #248903)
  * Typo fix (Closes: #258390). Thanks to Steve Kemp
  * Probable fix for (Closes: #249445)
  * Add --dry-run (Closes: #126954)
  * Code cleanup
    - Reindent long description in debian/control and add rsync method
    - use -depth and -print0 in the find | xargs calls
    - don't use -z for rsync on debs and sources

 -- Goswin von Brederlow  Thu, 29 Jul 2004 19:45:34 +0200

debmirror (20040509) unstable; urgency=low

  * Added --limit-deb-priority

 -- Goswin von Brederlow  Sun, 9 May 2004 20:11:34 +0200

debmirror (20040427) unstable; urgency=low

  * Reindented source code to xemacs CPerl style (Closes: #211214)
  * Added ftp error message to the warning during download and not just
    the errlog
  * Added Depends on bzip2 (Closes: #233558)
  * Due to popular demand: Adding hacks for main/debian-installer
    (Closes: #245499, #232093, #243634)
  * Don't fail if extra metafiles are broken (Closes: #211847)
  * Adopted --exclude-deb-section patch by Meszaros Andras
    (Closes: #245462)
  * Added mdtm check to ftp method (Closes: #149984)
  * Added --ignore-missing-release option (Closes: #221491)

 -- Goswin von Brederlow  Tue, 27 Apr 2004 01:18:34 +0200

debmirror (20040118) unstable; urgency=medium

  * Check for root in binary-indep to ensure files are owned by root.root
    (Closes: #215993)
  * Correct example for non-US (Closes: #213869, #219409)
  * Forgot to toggle Archive-Update-in-Progress-dual and
    project/trace/dual (Closes: #221490, #215500, #211210)
  * Added patch by Marcel Meckel: eliminate warning of uninitialized value
    (Closes: #223059)
  * Adapted parts of patch by Pedro Larroy: Added human readable verbose
    output (Closes: #224694)
  * Added -v --verbose option
  * List errors (if any) at the end
  * Report when the batch limit is exceeded
  * revert 'stopped using regexps on --include' (Closes: #214306)

 -- Goswin von Brederlow  Sun, 18 Jan 2004 16:49:34 +0100

debmirror (20030829) unstable; urgency=low

  * Added oneliner by Alexander Wirt:
    die "You need write permissions on $mirrordir" if (!-w $mirrordir);
  * changed synopsis of usage too (bug #126857)
  * Use Release file to md5sum check Packages/Sources files before and
    after download [Patch by "Ingo Saitz"] + changes (Closes: #149890)
  * Download Packages/Packages.bz2 files too and same for Sources.gz
    (Closes: #159322)
  * Removed "proposed-updates" example from --adddir, --adddir now
    obsolete (Closes: #174857)
  * Preserve partial files with the rsync method (Closes: #181097)
  * Ignore timestamps on rsync method to fix files with broken MD5sum.
    (We already only rsync files with wrong size or wrong MD5sum.)

 -- Goswin von Brederlow  Fri, 29 Aug 2003 13:58:34 +0200

debmirror (20030822.1) unstable; urgency=low

  * Synopsis in manpage now has [options] first (Closes: #126857)
  * added epoch splitting to debian/rules
  * stopped using regexps on --include (Closes: #146763)
  * close ftp connection before scanning the mirror and reopen it after
    (Closes: #149888) [Patch by "Ingo Saitz"] + fix
  * count number of errors when fetching files, stop if metafiles failed
    and report summary at the end otherwise. (Closes: #151164, #154522)
    [PS: rsync method does not report errors for missing files, ftp only]
  * clarify --dist and change default to sid

 -- Goswin von Brederlow  Fri, 22 Aug 2003 21:03:34 +0200

debmirror (20030822) unstable; urgency=low

  * Reduced number of tries till locking fails. Now 2m instead of 12h
  * warn if a lock is busy (Closes: #206710)

 -- Goswin von Brederlow  Fri, 22 Aug 2003 13:29:34 +0200

debmirror (20030813) unstable; urgency=medium

  * New maintainer.
  * Made a Debian-native package
  * postinst-should-not-set-usr-doc-link, postinst now empty, removed.
  * added myself to copyright file, changed Copyright GPL to License GPL.
  * added --max-batch= option
  * added arch=none option for source-only mirrors (closes: #154139)
  * added my contact address to the man page (closes: #167010, #205094)
  * remove backup file in debian/rules:clean

 -- Goswin von Brederlow  Wed, 13 Aug 2003 16:17:34 +0200

debmirror (20020427-1) unstable; urgency=high

  * New Release.
  * Applied patch from Robert MyQueen. Great Kudos to him!
    (Closes: Bug#144726, Bug#12998)
  * urgency=high as requested because the predecessor fixes a grave bug
    and #144726 could also be seen as a RC bug.

 -- Joerg Wendland  Sat, 27 Apr 2002 19:59:34 +0200

debmirror (20020424-1) unstable; urgency=medium

  * New Release.
  * Medium for this upload should close a bug tagged grave.
  * Fix output when using --progress (closes: Bug#127484)
  * Add a great patch by Masato Taruishi, adding rsync support to
    debmirror. (closes: Bug#127844)
  * Use now LockFile::Simple to avoid installation of procmail only for
    having a lockfile utility. It is tested to be compatible with
    programs using lockfile. (closes: Bug#128041)
  * Use Compress::Zlib to decompress Package files and others.
    (closes: Bug#132306)
  * Add --timeout parameter. This should close Bug#130679 as it can be
    set and defaults to 300 seconds instead of the Net::FTP default of
    120 seconds. This timeout is also used for the new rsync method.
    (closes: Bug#130679, Bug#122199) Don't even think about annoying me
    further with timeout problems.

 -- Joerg Wendland  Wed, 24 Apr 2002 22:21:24 +0200

debmirror (20011230-1) unstable; urgency=low

  * New Release.
  * Fixed typo in POD/manpage, thanks to Britton Leo Kerin.
    (closes: Bug#126859)
  * Applied patch from Camille Dominique fixing download of Release
    files. (closes: Bug#126758)
  * Added Depends: libdigest-md5-perl to support --md5sum switch.
    (closes: unreported Bug, thanks to Maik Busch)
  * Added patch from Masato Taruishi adding a --include=regex switch that
    has the opposite effect to the already existing --exclude switch.
    (closes: Bug#125973)

 -- Joerg Wendland  Sun, 30 Dec 2001 13:57:19 +0100

debmirror (20011016-1) unstable; urgency=low

  * Initial Debian release

 -- Joerg Wendland  Thu, 25 Oct 2001 17:12:13 +0200

debmirror-2.16ubuntu1.1/debian/NEWS

debmirror (1:2.5) unstable; urgency=low

  The --cleanup option has been renamed to --precleanup for clarity. (The
  old option name will continue to work for now.)

  Debian mirror admins recommend that all mirrors include trace files. So
  debmirror now defaults to --rsync-extra=trace. It will warn if you
  specify a configuration that does not include trace files, or if the
  trace files cannot be downloaded, using rsync, from the specified
  mirror.

  A proxy specified with --proxy or the ftp_proxy environment variable
  will now be used for ftp mirroring. There is no need to use
  --method=hftp to enable using ftp proxy, and that method is deprecated.

 -- Joey Hess  Sun, 26 Sep 2010 21:37:38 -0400

debmirror (1:2.3) unstable; urgency=low

  * Use diff files to update Contents files; option --pdiff is now --diff

  Contents files are relatively large and can change frequently,
  especially for the testing and unstable suites. Use of the diff files
  to update Contents files will significantly reduce the total download
  size.

  The option '--pdiff' has been renamed to '--diff' because it no longer
  affects only "package diffs". For the configuration file the variable
  to use is now '$diff_mode'.

 -- Frans Pop  Sat, 03 Oct 2009 13:33:39 +0200

debmirror (1:2.2) unstable; urgency=low

  * Support mirroring of translated package descriptions

  If the option --i18n is passed, debmirror will also mirror the files
  containing translated package descriptions. The --include and
  --exclude options can be used to select which translations to mirror.

 -- Frans Pop  Sat, 12 Sep 2009 08:58:07 +0200

debmirror (1:2.1) unstable; urgency=low

  * Support mirroring of trace files, ./doc, ./indices and ./tools

  Mirroring trace files and files from the listed directories can be
  specified using the --rsync-extra option. As the name implies,
  debmirror will always use rsync to transfer these files, irrespective
  of what transfer method is specified in the --method option.

  With this new feature, debmirror can mirror all files needed to create
  custom CD images using debian-cd. Note that current versions of
  debian-cd no longer require the ./indices directory. See the man page
  for details.

 -- Frans Pop  Sat, 29 Aug 2009 18:55:19 +0200

debmirror (1:2.0) unstable; urgency=low

  * Option --root=directory no longer requires "/" or ":" prefix

  It is now possible to leave the default at "debian" and all transfer
  methods should just work. For backwards compatibility debmirror will
  remove a leading "/" or ":" if one is passed.

  * More efficient mirroring through use of a state cache

  Debmirror now has the option to save the state of the mirror to a
  cache file between runs. The cache has a configurable expiry time.
  While the cache is valid, debmirror will trust that the mirror is
  consistent with this cache instead of checking the presence of files
  on disk. Use of the cache avoids a large amount of disk access and may
  reduce the time required for updates.

  The limited validity of the cache ensures that the actual state of the
  mirror still gets checked periodically. You may want to consider using
  the --md5sums option in combination with the state cache. See the man
  page for details.

  * Support for mirroring "current" Debian Installer images

  With the option to specify a different set of dists and arches than
  for the package archive. In this release there are no progress updates
  displayed yet.
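Several entries above refer to settings made in /etc/debmirror.conf or ~/.debmirror.conf, which are plain Perl files. A hedged sketch of such a fragment; @dists and $dry_run are variable names mentioned in the entries above, and $diff_mode is the renamed diff setting, but the full set of supported variables should be checked against the debmirror man page rather than taken from this example:

```perl
# Illustrative ~/.debmirror.conf fragment -- verify variable names
# against debmirror(1) before relying on them.
@dists     = ("bullseye");   # settable in the config file again since 1:2.3.1
$diff_mode = "use";          # renamed from the old pdiff setting in 1:2.3
$dry_run   = 0;              # honoured from the config file since 1:2.4.4

# Perl config files must end with a true value.
1;
```

Because the file is executed as Perl, any valid Perl expression can be used to compute a setting, which is why the trailing `1;` is required.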
-- Frans Pop Fri, 28 Aug 2009 15:32:09 +0200 debmirror (1:1.0) unstable; urgency=low * Uncompressed Packages/Sources files In line with official Debian mirrors, debmirror will no longer include uncompressed Packages and Sources files on the local mirror. The only exception is when the remote mirror itself does not contain a gzipped Packages/Sources file. * Automatic creation and update of suite->codename symlinks Debmirror will now extract the correct codename and suite from the Release file and use those for the local mirror. The directory name for a release will always be the codename, and a symlink will be created for the suite. This means it no longer makes any difference whether you pass --dists= or --dists=. A new option --omit-suite-symlinks allows to skip creation of the symlinks (e.g. when mirroring archived releases). Your local mirror will need to be converted if it currently uses suites as directories. Use the --allow-dist-rename option to allow debmirror to automatically do the conversion for you. This should only need to be done once. -- Frans Pop Thu, 20 Aug 2009 19:44:02 +0200 debmirror (20070123) unstable; urgency=low * New gpgv support The Release.gpg check has been rewritten to use gpgv, the same method as used by apt and debian-installer. As a side effect the keyring file has been changed to ~/.gnupg/trustedkeys.gpg. Existing setups must copy their pubring.gpg to trustedkeys.gpg or import the relevant keys there. This allows to only have the Archive keys in trustedkeys.gpg while still having a pubring.gpg for other use. -- Goswin von Brederlow Tue, 23 Jan 2007 14:53:14 +0100 debmirror (20060907) unstable; urgency=low * Pdiff support The Debian archive added pdiff files for the index files to speed up apt-get update. Debmirror now uses those pdiff files to update the index files but by default keeps them out of the local mirror because they greatly slow down operations for local networks. You can change the behaviour with the --pdiff-mode option. 
* Postcleanup is now default It was mentioned that the default --cleanup removes files before the index files are updated. That can result in files missing from the mirror when the mirroring fails and the index files aren't updated at the end (and also while debmirror runs). The --postcleanup does not have that effect but can lead to temporarily more space usage on the mirror. If you are short on space you might want to make sure you use --cleanup. * Autodetecting non-existing archs and sections In the past it was impossible to mirror i386,amd64 and sarge,etch because sarge has no amd64 architecture. Similarly there is no debian-installer section in contrib. Debmirror now ignores any combination of arch, suite and section that does not exist locally and is not listed in the Release file for the suite. This obsoletes the previously hardcoded exceptions and should allow to mirror unknown archives like Ubuntu without problems. Note that debmirror will fail when a combination of arch, suite and section that exists locally gets dropped from the Release file. There is no danger of loosing a branch when the Release file is corrupted or the upstream changes. -- Goswin von Brederlow Thu, 7 Sep 2006 15:36:47 +0200 debmirror (20050207) unstable; urgency=low * Release.gpg file and check is now mandatory unless ignored Debmirror now checks the Release.gpg signature when mirroring to guard against syncing against a compromised mirror. For security reasons this check is on by default but can be disabled using --ignore-release-gpg. For the check to work the Debian archive key must be added to the (debmirror) users keyring or an alternative keyring must be configured. !!! This breaks existing debmirror scripts and cron !!! !!! jobs in almost all cases. Take care to adapt. !!! * Release files are now mandatory unless ignored Sometimes downloads of meta files can abort for some reason and the download modules are not always reliable on reporting this. 
Debmirror now stops if the md5sum of a meta file does not match the info in the Release file to guard against tampering or download failures. This check can be disabled with --ignore-missing-release but that is strongly discouraged. * output options have been split into verbose, progress and debug Verbose gives an overview of what's happening and file-by-file progress, while the progress option adds individual download progress for the files (FTP and rsync only). Debug isn't useful unless something doesn't work. * download methods now include hftp and http Hftp is ftp over an http proxy like Squid, which most people will (mistakenly) know as ftp_proxy. Hftp requires the use of a proxy while http will use it if given. ftp_proxy or http_proxy are taken from the environment unless overridden by --proxy. * cleanup can now be done pre or post mirroring Cleaning up after mirroring will use more space during mirroring but keeps a consistent mirror available at all times. Cleaning up before mirroring on the other hand will remove obsolete files while they are still referenced from the old Packages/Sources files. --postcleanup is recommended unless space prohibits it. * rsync options can be specified, e.g. to add --bwlimit Take note that --rsync-options override the default options completely and should include "-aIL --partial" for normal operation. * small errors (a missing deb or src) can be ignored Sometimes the upstream mirror is internally inconsistent. By default debmirror will download all available files but not update the meta data (Packages/Sources files) unless the mirror is consistent. Your mirror will stay stuck in the past until the upstream mirror is repaired. With --ignore-small-errors you can sync the mirror even if some files are missing. Users of --cleanup might want to always use --ignore-small-errors to minimize the overall inconsistencies. 
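The md5sum check mentioned above compares each downloaded meta file against the checksum and size recorded in the Release file. Roughly, with invented file contents and an invented Release fragment:

```shell
# Invented meta file plus a matching Release-style MD5Sum entry.
printf 'Package: foo\n' > Packages
size=$(wc -c < Packages)
sum=$(md5sum Packages | awk '{print $1}')
printf ' %s %s main/binary-i386/Packages\n' "$sum" "$size" > Release.fragment

# Verify: the recomputed checksum must match the recorded one,
# otherwise the download is treated as tampered or truncated.
recorded=$(awk '{print $1}' Release.fragment)
if [ "$sum" = "$recorded" ]; then echo "Packages: ok"; else echo "Packages: FAILED"; fi
```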
-- Goswin von Brederlow Mon, 7 Feb 2005 05:30:34 +0100 debmirror-2.16ubuntu1.1/debian/rules0000775000000000000000000000005212243012102014264 0ustar #!/usr/bin/make -f %: dh $@ --with quilt debmirror-2.16ubuntu1.1/debian/compat0000664000000000000000000000000212243012102014405 0ustar 7 debmirror-2.16ubuntu1.1/debian/control0000664000000000000000000000175112243012102014616 0ustar Source: debmirror Section: net Priority: extra Maintainer: Ubuntu Developers XSBC-Original-Maintainer: Joey Hess Build-Depends: debhelper (>= 7.0.50), quilt (>> 0.48-1) Standards-Version: 3.9.3 Vcs-Git: git://git.debian.org/collab-maint/debmirror.git Vcs-Browser: http://git.debian.org/?p=collab-maint/debmirror.git Package: debmirror Architecture: all Depends: ${misc:Depends}, perl (>= 5.10), libnet-perl, libdigest-md5-perl, libdigest-sha-perl, liblockfile-simple-perl, rsync, bzip2, libwww-perl (>= 5.815), libnet-inet6glue-perl Recommends: gpgv, patch, ed Suggests: gnupg Description: Debian partial mirror script, with ftp and package pool support This program downloads and maintains a partial local Debian mirror. It can mirror any combination of architectures, distributions and sections. Files are transferred by ftp, http, hftp or rsync, and package pools are fully supported. It also does locking and updates trace files. debmirror-2.16ubuntu1.1/mirror-size0000775000000000000000000001640112243012102014177 0ustar #!/usr/bin/perl -w # This script can be used on a mirror (or e.g. merkel.debian.org) to # produce an overview of the size of the archive. The numbers reflect # the "raw" size of the archive: the size of packages and source files. # It does not include the size of files containing meta data, nor of # various separate directories. 
# Copyright (c) 2009 Frans Pop use strict; use Cwd; use Getopt::Long; use File::Temp qw/ tempfile /; use Compress::Zlib; our @arches= qw(source all i386 amd64 alpha arm armel hppa ia64 m68k mips mipsel powerpc s390 sparc kfreebsd-i386 kfreebsd-amd64); our @sects= qw(main contrib non-free main/debian-installer); our @suites= qw(oldstable stable testing unstable); our %dists; our (@source_files, @package_files); our (%files, %sizes, %total, %width); our $root="/srv/ftp.debian.org/ftp/dists"; chdir($root) or die "chdir $root: $!"; my ($tfh, $tfile) = tempfile(); END { unlink $tfile if $tfile } ### Collect the data foreach my $suite (@suites) { next unless -f "$root/$suite/Release"; if (open RELEASE, "<$root/$suite/Release") { while (<RELEASE>) { if (/^Codename:/) { ($dists{$suite}) = m/^Codename:\s+(.*)/i; last; } } close RELEASE; } foreach my $sect (@sects) { next unless -d "$root/$suite/$sect"; print(STDERR "Processing $suite $sect\n"); foreach my $arch (@arches) { my $file; if ($arch eq "source") { $file = "source/Sources.gz"; next unless -f "$root/$suite/$sect/$file"; parse_sources($file, $suite, $sect); } else { $file = "binary-$arch/Packages.gz"; next unless -f "$root/$suite/$sect/$file"; parse_packages($file, $suite, $sect, $arch); } } } } ### Print the tables foreach my $suite (@suites) { next unless exists $dists{$suite}; $width{$suite} = exists $sizes{"d$suite"} ? 
16 : 6; } print("Total archive size (binary + source) per section:\n\n"); printf("%10s ", "(in MiB)"); foreach my $suite (@suites) { next unless exists $dists{$suite}; printf("| %$width{$suite}s ", $dists{$suite}); } printf("| %6s\n", "all"); print_ruler(); foreach my $sect (@sects) { next unless exists $sizes{all}{$sect}; if ($sect eq "main/debian-installer") { printf("%-10s", "main/d-i"); } else { printf("%-10s", $sect); } foreach my $suite (@suites) { next unless exists $dists{$suite}; if (exists $sizes{$suite}{$sect}) { $total{$suite} += $sizes{$suite}{$sect}; printf(" | %6i", int((1 + $sizes{$suite}{$sect}) /1024/1024)); if (exists $sizes{"d$suite"}{$sect}) { $total{"d$suite"} += $sizes{"d$suite"}{$sect}; printf(" (+%6i)", int((1 + $sizes{"d$suite"}{$sect}) /1024/1024)); } } else { print(" | " . (" " x $width{$suite})); } } printf(" | %6i\n", int((1 + $sizes{all}{$sect}) /1024/1024)); $total{all} += $sizes{all}{$sect}; } print_ruler(); printf("%-9s ", "total"); foreach my $suite (@suites) { next unless exists $dists{$suite}; printf(" | %6i", int((1 + $total{$suite}) /1024/1024)); printf(" (+%6i)", int((1 + $total{"d$suite"}) /1024/1024)) if exists $total{"d$suite"}; } printf(" | %6i\n", int((1 + $total{all}) /1024/1024)); print("\n\n"); print("Archive size per architecture (source and arch=all packages are shown separately):\n\n"); printf("%10s ", "(in MiB)"); foreach my $suite (@suites) { next unless exists $dists{$suite}; printf("| %$width{$suite}s ", $dists{$suite}); } printf("| %6s\n", "all"); print_ruler(); foreach my $arch (@arches) { next unless exists $sizes{all}{$arch}; my $parch = $arch; $parch =~ s/kfree/k/; printf("%-10s", $parch); foreach my $suite (@suites) { next unless exists $dists{$suite}; if (exists $sizes{$suite}{$arch}) { printf(" | %6i", int((1 + $sizes{$suite}{$arch}) /1024/1024)); printf(" (+%6i)", int((1 + $sizes{"d$suite"}{$arch}) /1024/1024)) if exists $sizes{"d$suite"}{$arch}; } else { printf(" | " . 
(" " x $width{$suite})); } } printf(" | %6i\n", int((1 + $sizes{all}{$arch}) /1024/1024)); } my @ts=gmtime(time()); printf("\nAll numbers reflect the state of the archive per %i %s %i.\n", $ts[3], (qw(Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec))[$ts[4]], $ts[5] + 1900); ### Functions sub print_ruler { print("-" x 11); foreach my $suite (@suites) { next unless exists $dists{$suite}; print("|" . "-" x ($width{$suite} + 2)); } print("|" . "-" x 7 . "\n"); } sub parse_packages { local $/ = "\n\n"; my ($file, $suite, $sect, $arch) = @_; my ($line, $res, $size, $filename, $architecture); system_redirect_io("gzip -d", "$root/$suite/$sect/$file", "$tfile"); open(TFILE, "<", $tfile) or die "$tfile: $!"; for (;;) { my $buf; unless (defined( $buf = <TFILE> )) { last if eof; die "$file: $!" if $!; } $_ = $buf; ($filename) = m/^Filename:\s+(.*)/im; $filename =~ s:/+:/:; # remove redundant slashes in paths ($architecture) = m/^Architecture:\s+(.*)/im; ($size) = m/^Size:\s+(\d+)/im; if (! exists $files{$filename}{$suite}) { $sizes{$suite}{$sect} += $size; $sizes{$suite}{$architecture} += $size; if (($suite eq "stable" && exists $dists{oldstable} && ! exists $files{$filename}{oldstable}) || ($suite eq "testing" && exists $dists{stable} && ! exists $files{$filename}{stable}) || ($suite eq "unstable" && exists $dists{testing} && ! exists $files{$filename}{testing})) { $sizes{"d$suite"}{$sect} += $size; $sizes{"d$suite"}{$architecture} += $size; } } if (! exists $files{$filename}{x}) { $sizes{all}{$sect} += $size; $sizes{all}{$architecture} += $size; } $files{$filename}{x} = 1; $files{$filename}{$suite} = 1; } close(TFILE); } sub parse_sources { local $/ = "\n\n"; my ($file, $suite, $sect) = @_; my ($line, $res, $size, $directory, $filename, $md5sum); system_redirect_io("gzip -d", "$root/$suite/$sect/$file", "$tfile"); open(TFILE, "<", $tfile) or die "$tfile: $!"; for (;;) { my $buf; unless (defined( $buf = <TFILE> )) { last if eof; die "$file: $!" 
if $!; } $_ = $buf; ($directory) = m/^Directory:\s+(.*)/im; while (m/^ ([A-Za-z0-9]{32} .*)/mg) { ($md5sum, $size, $filename)=split(' ', $1, 3); $filename = "$directory/$filename"; $filename =~ s:/+:/:; # remove redundant slashes in paths if (! exists $files{$filename}{$suite}) { $sizes{$suite}{$sect} += $size; $sizes{$suite}{source} += $size; if (($suite eq "stable" && exists $dists{oldstable} && ! exists $files{$filename}{oldstable}) || ($suite eq "testing" && exists $dists{stable} && ! exists $files{$filename}{stable}) || ($suite eq "unstable" && exists $dists{testing} && ! exists $files{$filename}{testing})) { $sizes{"d$suite"}{$sect} += $size; $sizes{"d$suite"}{source} += $size; } } if (! exists $files{$filename}{x}) { $sizes{all}{$sect} += $size; $sizes{all}{source} += $size; } $files{$filename}{x} = 1; $files{$filename}{$suite} = 1; } } close(TFILE); } # run system() with stdin and stdout redirected to files # unlinks stdout target file first to break hard links sub system_redirect_io { my ($command, $fromfile, $tofile) = @_; if (-f $tofile) { unlink($tofile) or die "unlink($tofile) failed: $!"; } system("$command <$fromfile >$tofile"); } debmirror-2.16ubuntu1.1/Makefile0000664000000000000000000000012112243012102013417 0ustar all: debmirror.1 debmirror.1: pod2man debmirror >$@ clean: rm -f debmirror.1 debmirror-2.16ubuntu1.1/tester/0000775000000000000000000000000012243022416013304 5ustar debmirror-2.16ubuntu1.1/tester/.temp/0000775000000000000000000000000012243022416014327 5ustar debmirror-2.16ubuntu1.1/tester/.temp/.tmp/0000775000000000000000000000000012243022416015205 5ustar debmirror-2.16ubuntu1.1/tester/.temp/.tmp/dists/0000775000000000000000000000000012243022416016333 5ustar debmirror-2.16ubuntu1.1/tester/.temp/.tmp/dists/sid/0000775000000000000000000000000012243022416017112 5ustar debmirror-2.16ubuntu1.1/examples/0000775000000000000000000000000012700374210013614 5ustar 
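The system_redirect_io helper in mirror-size above is essentially the following shell idiom: unlink the target first (so rewriting it cannot clobber another name hard-linked to the old content), then run the command with redirected stdin and stdout. File names here are illustrative.

```shell
# Decompress an index into a scratch file, as system_redirect_io
# does with "gzip -d" (invented names).
printf 'Package: foo\n' | gzip > Packages.gz
rm -f Packages.tmp              # break any existing hard link first
gzip -d < Packages.gz > Packages.tmp
cat Packages.tmp                # -> Package: foo
```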
debmirror-2.16ubuntu1.1/examples/debmirror.conf0000664000000000000000000000321512700374210016451 0ustar # Default config for debmirror # The config file is a perl script so take care to follow perl syntax. # Any setting in /etc/debmirror.conf overrides these defaults and # ~/.debmirror.conf overrides those again. Take only what you need. # # The syntax is the same as on the command line and variable names # loosely match option names. If you don't recognize something here # then just stick to the command line. # # Options specified on the command line override settings in the config # files. # Location of the local mirror (use with care) # $mirrordir="/path/to/mirrordir" # Output options $verbose=0; $progress=0; $debug=0; # Download options $host="ftp.debian.org"; $user="anonymous"; $passwd="anonymous@"; $remoteroot="debian"; $download_method="ftp"; @dists="sid"; @sections="main,main/debian-installer,contrib,non-free"; @arches="i386"; # @ignores=""; # @excludes=""; # @includes=""; # @excludes_deb_section=""; # @limit_priority=""; $omit_suite_symlinks=0; $skippackages=0; # @rsync_extra="doc,tools"; $i18n=0; $getcontents=0; $do_source=1; $max_batch=0; # @di_dists="dists"; # @di_archs="arches"; # Save mirror state between runs; value sets validity of cache in days $state_cache_days=0; # Security/Sanity options $ignore_release_gpg=0; $ignore_release=0; $check_md5sums=0; $ignore_small_errors=0; # Cleanup $cleanup=0; $post_cleanup=1; # Locking options $timeout=300; # Rsync options $rsync_batch=200; $rsync_options="-aIL --partial"; # FTP/HTTP options $passive=0; # $proxy="http://proxy:port/"; # Dry run $dry_run=0; # Don't keep diff files but use them $diff_mode="use"; # The config file must return true or perl complains. # Always copy this. 
1; debmirror-2.16ubuntu1.1/mirror_size0000664000000000000000000000362412243012102014261 0ustar Total archive size (binary + source) per section: (in MiB) | lenny | squeeze | sid | all -----------|--------|------------------|------------------|------- main | 118572 | 187314 (+178909) | 214040 (+ 56462) | 353292 contrib | 1380 | 952 (+ 864) | 1444 (+ 572) | 2570 non-free | 5589 | 6141 (+ 4886) | 7303 (+ 2676) | 13117 main/d-i | 361 | 404 (+ 402) | 448 (+ 198) | 963 -----------|--------|------------------|------------------|------- total | 125903 | 194812 (+185062) | 223236 (+ 59910) | 369944 Archive size per architecture (source and arch=all packages are shown separately): (in MiB) | lenny | squeeze | sid | all -----------|--------|------------------|------------------|------- source | 19359 | 27879 (+ 22792) | 31903 (+ 6349) | 48073 all | 12436 | 19942 (+ 18439) | 23142 (+ 6335) | 37198 i386 | 8251 | 14755 (+ 14439) | 15835 (+ 3744) | 26417 amd64 | 8405 | 14879 (+ 14560) | 16126 (+ 3863) | 26811 alpha | 7974 | | 13258 (+ 13258) | 20903 arm | 7084 | | | 7084 armel | 7175 | 10996 (+ 10700) | 11656 (+ 2692) | 20554 hppa | 7615 | 11843 (+ 11526) | 12453 (+ 2450) | 21577 ia64 | 9094 | 13597 (+ 13226) | 14413 (+ 3387) | 25689 mips | 7676 | 11757 (+ 11434) | 12298 (+ 2316) | 21409 mipsel | 7539 | 11564 (+ 11255) | 12057 (+ 2537) | 21315 powerpc | 8101 | 12880 (+ 12568) | 13609 (+ 2994) | 23649 s390 | 7565 | 11146 (+ 10846) | 11743 (+ 2618) | 21016 sparc | 7625 | 12069 (+ 11772) | 12624 (+ 2893) | 22275 kbsd-i386 | | 10585 (+ 10585) | 10887 (+ 2192) | 12778 kbsd-amd64 | | 10912 (+ 10912) | 11227 (+ 2275) | 13188 All numbers reflect the state of the archive per 5 Aug 2010.