## spacesavers2_usurp
spacesavers2_usurp takes in the TSV file generated by spacesavers2_grubbers and a hash from its first column to:

- find the row corresponding to the hash,
- keep one copy of the duplicates as the "original" copy, and
- delete the other copies and replace them with hard (or soft) links.

Deleting these high-value duplicates has the biggest impact on the user's overall digital footprint.
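
Conceptually, the replacement step works like the short Python sketch below. This is only an illustrative sketch, not the actual spacesavers2 implementation; the `replace_with_links` helper and the file paths are hypothetical.

```python
import os

def replace_with_links(original, duplicates, force=False):
    """Replace each duplicate with a hard link to the original copy,
    falling back to a symlink when force=True and a hard link is not
    possible (e.g. the files live on different devices)."""
    for dup in duplicates:
        os.remove(dup)                     # delete the duplicate copy
        try:
            os.link(original, dup)         # recreate it as a hard link
        except OSError:
            if force:
                os.symlink(original, dup)  # cross-device: use a symlink instead
            else:
                raise

# example (hypothetical paths):
# replace_with_links("/data/project/sample.bam",
#                    ["/data/backup/sample.bam"], force=True)
```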
### Inputs
- `--grubber`: output file from spacesavers2_grubbers.
- `--hash`: a hash from the first column of the grubber TSV.
- `--force`: (OPTIONAL) if the duplicates are cross-device, hard links cannot be made; with `--force` you can force the use of symlinks instead.
The GRUBBER file has the following columns:

| Column | Description |
| ------ | ------------------------------------- |
| 1 | combined hash |
| 2 | number of duplicates found |
| 3 | total size of all duplicates (human readable) |
| 4 | size of each duplicate (human readable) |
| 5 | original file |
| 6 | ";"-separated list of duplicate files |
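
As an illustration of this layout, the row matching a (possibly partial) hash can be located with a few lines of Python. This is only a sketch of how the TSV is structured, assuming a hypothetical file name `grubbers.tsv`; it is not part of spacesavers2 itself.

```python
import csv

def find_row(tsv_path, partial_hash):
    """Return the grubbers row whose combined hash (column 1)
    starts with the given (possibly partial) hash."""
    with open(tsv_path) as fh:
        for row in csv.reader(fh, delimiter="\t"):
            if row and row[0].startswith(partial_hash):
                return {
                    "hash": row[0],
                    "ndup": row[1],
                    "total_size": row[2],
                    "per_copy_size": row[3],
                    "original": row[4],
                    "duplicates": row[5].split(";"),
                }
    return None

# example (hypothetical file and hash):
# print(find_row("grubbers.tsv", "someHash"))
```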
```
usage: spacesavers2_usurp [-h] -g GRUBBER -x HASH [-f | --force | --no-force]

spacesavers2_usurp: delete all but one copy of the file matching the hash and replace all other copies with hardlinks

options:
  -h, --help            show this help message and exit
  -g GRUBBER, --grubber GRUBBER
                        spacesavers2_grubbers output TSV file
  -x HASH, --hash HASH  hash (or unique partial hash) from column 1 of spacesavers2_grubbers TSV file
  -f, --force, --no-force
                        forcefully create symlink if hardlink is not possible

Version:
    v0.8
Example:
    > spacesavers2_usurp -g grubbers.TSV -x someHash
```
### Outputs
On-screen confirmation that the duplicates have been replaced with hard (or soft) links.