.. include:: export-tar.rst.inc
.. include:: import-tar.rst.inc

Examples
~~~~~~~~
::

    # export as uncompressed tar
    $ borg export-tar Monday Monday.tar

    # import an uncompressed tar
    $ borg import-tar Monday Monday.tar

    # exclude some file types, compress using gzip
    $ borg export-tar Monday Monday.tar.gz --exclude '*.so'

    # use higher compression level with gzip
    $ borg export-tar --tar-filter="gzip -9" Monday Monday.tar.gz

    # copy an archive from repoA to repoB
    $ borg -r repoA export-tar --tar-format=BORG archive - | borg -r repoB import-tar archive -

    # export a tar, but instead of storing it on disk, upload it to a remote site using curl
    $ borg export-tar Monday - | curl --data-binary @- https://somewhere/to/POST

    # remote extraction via "tarpipe"
    $ borg export-tar Monday - | ssh somewhere "cd extracted; tar x"

Archives transfer script
~~~~~~~~~~~~~~~~~~~~~~~~

Outputs a script that copies all archives from repo1 to repo2:

::

    borg -r repo1 list --format='{archive} {id} {time:%Y-%m-%dT%H:%M:%S}{NL}' | while read N I T
    do
        echo "borg -r repo1 export-tar --tar-format=BORG aid:$I - | borg -r repo2 import-tar --timestamp=$T $N -"
    done

Kept:

- archive name, archive timestamp
- archive contents (all items with metadata and data)

Lost:

- some archive metadata (like the original command line, execution time, etc.)

Please note:

- all data goes over that pipe, again and again for every archive
- the pipe is dumb: deduplication does not reduce the data volume or transfer time here
- maybe add compression
- pipe over ssh for remote transfer
- no special sparse file support
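
The compression and ssh suggestions above can be combined in the generated commands. A minimal sketch, assuming the host name ``remotehost`` and the repo paths are placeholders; the here-document stands in for real ``borg list`` output and is hard-coded purely for illustration:

```shell
# Generate transfer commands that gzip the tar stream and pipe it over ssh.
# "remotehost", "repo1" and "repo2" are placeholders -- adjust for your setup.
# The here-document below simulates the output of:
#   borg -r repo1 list --format='{archive} {id} {time:%Y-%m-%dT%H:%M:%S}{NL}'
CMDS=$(
while read N I T
do
    echo "borg -r repo1 export-tar --tar-format=BORG aid:$I - | gzip | ssh remotehost 'gunzip | borg -r repo2 import-tar --timestamp=$T $N -'"
done <<'EOF'
Monday 0123abcd 2024-09-16T02:00:00
Tuesday 4567ef01 2024-09-17T02:00:00
EOF
)
printf '%s\n' "$CMDS"
```

Since deduplication cannot reduce what goes over the pipe, gzip at least shrinks the transfer volume; for archives whose content is already compressed, the gain will be small.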