mirror of https://github.com/borgbackup/borg.git
Merge pull request #7814 from ThomasWaldmann/tam-fixes-1.1
TAM / security fixes (1.1-maint backport)
This commit is contained in:
commit
e6d40f63d4

@@ -5,6 +5,83 @@ Important notes

This section provides information about security and corruption issues.

.. _archives_tam_vuln:

Pre-1.2.5 archives spoofing vulnerability (CVE-2023-36811)
----------------------------------------------------------

Note: the security fix released with borg 1.2.5/1.2.6 was backported to the
1.1-maint branch. Below is a copy of the related upgrade notes from 1.2.6;
they also apply here:

A flaw in the cryptographic authentication scheme in Borg allowed an attacker to
fake archives and potentially indirectly cause backup data loss in the repository.

The attack requires an attacker to be able to

1. insert files (with no additional headers) into backups
2. gain write access to the repository

This vulnerability does not disclose plaintext to the attacker, nor does it
affect the authenticity of existing archives.

Creating plausible fake archives may be feasible for empty or small archives,
but is unlikely for large archives.

The fix enforces checking the TAM authentication tag of archives at critical
places. Borg now considers archives without a TAM as garbage or an attack.

We are not aware of others having discovered, disclosed or exploited this vulnerability.

Below, when we speak of borg 1.2.6, we mean a borg version >= 1.2.6 **or** a
borg version that has the relevant security patches for this vulnerability applied
(which could also be an older version in that case).

Steps you must take to upgrade a repository:

1. Upgrade all clients using this repository to borg 1.2.6.
   Note: it is not required to upgrade a server, except if the server-side borg
   is also used as a client (and not just for "borg serve").

   Do **not** run ``borg check`` with borg > 1.2.4 before completing the upgrade steps.

2. Run ``BORG_WORKAROUNDS=ignore_invalid_archive_tam borg info --debug <repo> 2>&1 | grep TAM | grep -i manifest``.

   a) If you get "TAM-verified manifest", continue with 3.
   b) If you get "Manifest TAM not found and not required", run
      ``borg upgrade --tam --force <repository>`` *on every client*.

3. Run ``BORG_WORKAROUNDS=ignore_invalid_archive_tam borg list --format='{name} {time} tam:{tam}{NL}' <repo>``.
   "tam:verified" means that the archive has a valid TAM authentication.
   "tam:none" is expected as output for archives created by borg < 1.0.9.
   "tam:none" is also expected for archives resulting from a borg rename
   or borg recreate operation (see #7791).
   "tam:none" could also come from archives created by an attacker.
   You should verify that "tam:none" archives are authentic and not malicious
   (== have good content, have a correct timestamp, can be extracted successfully).
   In case you find corrupt or malicious archives, you must delete them before proceeding.
   In low-risk, trusted environments, you may decide at your own risk to skip step 3
   and just trust in everything being OK.

4. If there are no tam:none archives left at this point, you can skip this step.
   Run ``BORG_WORKAROUNDS=ignore_invalid_archive_tam borg upgrade --archives-tam <repo>``.
   This will unconditionally add a correct archive TAM to all archives not having one.
   ``borg check`` would consider TAM-less or invalid-TAM archives as garbage or a potential attack.
   To see that all archives are now "tam:verified", run: ``borg list --format='{name} {time} tam:{tam}{NL}' <repo>``

5. Please note that you should never use BORG_WORKAROUNDS=ignore_invalid_archive_tam
   for normal production operations - it is only needed once to get the archives in a
   repository into a good state. All archives have a valid TAM now.

Vulnerability timeline:

* 2023-06-13: Vulnerability discovered during code review by Thomas Waldmann
* 2023-06-13...: Work on fixing the issue, upgrade procedure, docs.
* 2023-06-30: CVE was assigned via the GitHub CNA
* 2023-06-30 .. 2023-08-29: Fixed issue, code review, docs, testing.
* 2023-08-30: Released fixed version 1.2.5 (broken upgrade procedure for some repos)
* 2023-08-31: Released fixed version 1.2.6 (fixes upgrade procedure)
* 2023-09-08: Backported fixes / docs to 1.1-maint branch.

.. _hashindex_set_bug:

Pre-1.1.11 potential index corruption / data loss issue
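To mechanize step 3 above, the ``borg list`` output can be filtered for archives still lacking a TAM. A small hypothetical helper (assuming the ``{name} {time} tam:{tam}`` line format shown above; it is not part of borg itself):

```python
def archives_without_tam(listing: str):
    """Return names of archives reported as tam:none by
    borg list --format='{name} {time} tam:{tam}{NL}'."""
    names = []
    for line in listing.splitlines():
        parts = line.split()
        # the tam:... field is the last column in this format
        if parts and parts[-1] == 'tam:none':
            names.append(parts[0])
    return names
```

Each returned archive must be verified (or deleted) before running ``borg upgrade --archives-tam``.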

@@ -99,6 +99,30 @@ General:

caused EROFS. You will need this to make archives from volume shadow copies
in WSL1 (Windows Subsystem for Linux 1).

authenticated_no_key
    Work around a lost passphrase or key for an ``authenticated`` mode repository
    (these are only authenticated, but not encrypted).
    If the key is missing in the repository config, add ``key = anything`` there.

    This workaround is **only** for emergencies and **only** to extract data
    from an affected repository (read-only access)::

        BORG_WORKAROUNDS=authenticated_no_key borg extract repo::archive

    After you have extracted all data you need, you MUST delete the repository::

        BORG_WORKAROUNDS=authenticated_no_key borg delete repo

    Now you can init a fresh repo. Make sure you do not use the workaround any more.

ignore_invalid_archive_tam
    Work around invalid archive TAMs created by borg < 1.2.5, see :issue:`7791`.

    This workaround likely needs to be used only once, when following the upgrade
    instructions for CVE-2023-36811, see :ref:`archives_tam_vuln`.

    In normal production operations, this workaround should never be used.

Some automatic "answerers" (if set, they automatically answer confirmation questions):

BORG_UNKNOWN_UNENCRYPTED_REPO_ACCESS_IS_OK=no (or =yes)
    For "Warning: Attempting to access a previously unknown unencrypted repository"
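``BORG_WORKAROUNDS`` is a comma-separated list read from the environment. A minimal sketch of such parsing (the helper name here is an illustration, not borg's actual code):

```python
import os

def get_workarounds(environ=None):
    # comma-separated, e.g.
    # BORG_WORKAROUNDS=authenticated_no_key,ignore_invalid_archive_tam
    environ = os.environ if environ is None else environ
    raw = environ.get('BORG_WORKAROUNDS', '')
    return {w.strip() for w in raw.split(',') if w.strip()}
```

Code paths then simply test membership, e.g. ``'ignore_invalid_archive_tam' in workarounds``.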

@@ -336,6 +336,7 @@ class Archive:

self.name = name # overwritten later with name from archive metadata
self.name_in_manifest = name # can differ from .name later (if borg check fixed duplicate archive names)
self.comment = None
self.tam_verified = False
self.checkpoint_interval = checkpoint_interval
self.numeric_owner = numeric_owner
self.noatime = noatime

@@ -378,7 +379,9 @@ class Archive:

def _load_meta(self, id):
data = self.key.decrypt(id, self.repository.get(id))
metadata = ArchiveItem(internal_dict=msgpack.unpackb(data, unicode_errors='surrogateescape'))
# we do not require TAM for archives, otherwise we can not even borg list a repo with old archives.
archive, self.tam_verified, _ = self.key.unpack_and_verify_archive(data, force_tam_not_required=True)
metadata = ArchiveItem(internal_dict=archive)
if metadata.version != 1:
raise Exception('Unknown archive metadata version')
return metadata

@@ -815,7 +818,7 @@ Utilization of max. archive size: {csize_max:.0%}

def set_meta(self, key, value):
metadata = self._load_meta(self.id)
setattr(metadata, key, value)
data = msgpack.packb(metadata.as_dict(), unicode_errors='surrogateescape')
data = self.key.pack_and_authenticate_metadata(metadata.as_dict(), context=b'archive')
new_id = self.key.id_hash(data)
self.cache.add_chunk(new_id, data, self.stats)
self.manifest.archives[self.name] = (new_id, metadata.time)

@@ -1435,6 +1438,23 @@ class ArchiveChecker:

except (TypeError, ValueError, StopIteration):
continue
if valid_archive(archive):
# **after** doing the low-level checks and having a strong indication that we
# are likely looking at an archive item here, also check the TAM authentication:
try:
archive, verified, _ = self.key.unpack_and_verify_archive(data, force_tam_not_required=False)
except IntegrityError as integrity_error:
# TAM issues - do not accept this archive!
# either somebody is trying to attack us with a fake archive data or
# we have an ancient archive made before TAM was a thing (borg < 1.0.9) **and** this repo
# was not correctly upgraded to borg 1.2.5 (see advisory at top of the changelog).
# borg can't tell the difference, so it has to assume this archive might be an attack
# and drops this archive.
name = archive.get(b'name', b'<unknown>').decode('ascii', 'replace')
logger.error('Archive TAM authentication issue for archive %s: %s', name, integrity_error)
logger.error('This archive will *not* be added to the rebuilt manifest! It will be deleted.')
self.error_found = True
continue
# note: if we get here and verified is False, a TAM is not required.
archive = ArchiveItem(internal_dict=archive)
name = archive.name
logger.info('Found archive %s', name)

@@ -1670,7 +1690,18 @@ class ArchiveChecker:

self.error_found = True
del self.manifest.archives[info.name]
continue
archive = ArchiveItem(internal_dict=msgpack.unpackb(data))
try:
archive, verified, salt = self.key.unpack_and_verify_archive(data, force_tam_not_required=False)
except IntegrityError as integrity_error:
# looks like there is a TAM issue with this archive, this might be an attack!
# when upgrading to borg 1.2.5, users are expected to TAM-authenticate all archives they
# trust, so there shouldn't be any without TAM.
logger.error('Archive TAM authentication issue for archive %s: %s', info.name, integrity_error)
logger.error('This archive will be *removed* from the manifest! It will be deleted.')
self.error_found = True
del self.manifest.archives[info.name]
continue
archive = ArchiveItem(internal_dict=archive)
if archive.version != 1:
raise Exception('Unknown archive metadata version')
archive.cmdline = [safe_decode(arg) for arg in archive.cmdline]

@@ -1684,7 +1715,7 @@ class ArchiveChecker:

for previous_item_id in archive.items:
mark_as_possibly_superseded(previous_item_id)
archive.items = items_buffer.chunks
data = msgpack.packb(archive.as_dict(), unicode_errors='surrogateescape')
data = self.key.pack_and_authenticate_metadata(archive.as_dict(), context=b'archive', salt=salt)
new_archive_id = self.key.id_hash(data)
cdata = self.key.encrypt(data)
add_reference(new_archive_id, len(data), len(cdata), cdata)

@@ -72,10 +72,11 @@ try:

from .helpers import umount
from .helpers import msgpack, msgpack_fallback
from .helpers import uid2user, gid2group
from .helpers import safe_decode
from .nanorst import rst_to_terminal
from .patterns import ArgparsePatternAction, ArgparseExcludeFileAction, ArgparsePatternFileAction, parse_exclude_pattern
from .patterns import PatternMatcher
from .item import Item
from .item import Item, ArchiveItem
from .platform import get_flags, get_process_id, SyncFile
from .remote import RepositoryServer, RemoteRepository, cache_if_remote
from .repository import Repository, LIST_SCAN_LIMIT, TAG_PUT, TAG_DELETE, TAG_COMMIT

@@ -1711,25 +1712,43 @@ class Archiver:

DASHES, logger=logging.getLogger('borg.output.stats'))
return self.exit_code

@with_repository(fake=('tam', 'disable_tam'), invert_fake=True, manifest=False, exclusive=True)
@with_repository(fake=('tam', 'disable_tam', 'archives_tam'), invert_fake=True, manifest=False, exclusive=True)
def do_upgrade(self, args, repository, manifest=None, key=None):
"""upgrade a repository from a previous version"""
if args.tam:
if args.archives_tam:
manifest, key = Manifest.load(repository, (Manifest.Operation.CHECK,), force_tam_not_required=args.force)
with Cache(repository, key, manifest) as cache:
stats = Statistics()
for info in manifest.archives.list(sort_by=['ts']):
archive_id = info.id
archive_formatted = format_archive(info)
cdata = repository.get(archive_id)
data = key.decrypt(archive_id, cdata)
archive, verified, _ = key.unpack_and_verify_archive(data, force_tam_not_required=True)
if not verified: # we do not have an archive TAM yet -> add TAM now!
archive = ArchiveItem(internal_dict=archive)
archive.cmdline = [safe_decode(arg) for arg in archive.cmdline]
data = key.pack_and_authenticate_metadata(archive.as_dict(), context=b'archive')
new_archive_id = key.id_hash(data)
cache.add_chunk(new_archive_id, data, stats)
cache.chunk_decref(archive_id, stats)
manifest.archives[info.name] = (new_archive_id, info.ts)
print("Added archive TAM: %s -> [%s]" % (archive_formatted, bin_to_hex(new_archive_id)))
else:
print("Archive TAM present: %s" % archive_formatted)
manifest.write()
repository.commit()
cache.commit()
elif args.tam:
manifest, key = Manifest.load(repository, (Manifest.Operation.CHECK,), force_tam_not_required=args.force)

if not hasattr(key, 'change_passphrase'):
print('This repository is not encrypted, cannot enable TAM.')
return EXIT_ERROR

if not manifest.tam_verified or not manifest.config.get(b'tam_required', False):
# The standard archive listing doesn't include the archive ID like in borg 1.1.x
print('Manifest contents:')
for archive_info in manifest.archives.list(sort_by=['ts']):
print(format_archive(archive_info), '[%s]' % bin_to_hex(archive_info.id))
print(format_archive(archive_info))
manifest.config[b'tam_required'] = True
manifest.write()
repository.commit()
if not key.tam_required:
if not key.tam_required and hasattr(key, 'change_passphrase'):
key.tam_required = True
key.change_passphrase(key._passphrase)
print('Key updated')

@@ -1743,7 +1762,7 @@ class Archiver:

manifest, key = Manifest.load(repository, Manifest.NO_OPERATION_CHECK, force_tam_not_required=True)
if tam_required(repository):
os.unlink(tam_required_file(repository))
if key.tam_required:
if key.tam_required and hasattr(key, 'change_passphrase'):
key.tam_required = False
key.change_passphrase(key._passphrase)
print('Key updated')

@@ -3110,6 +3129,8 @@ class Archiver:

If you do **not** want to encrypt the contents of your backups, but still
want to detect malicious tampering, use ``--encryption authenticated``.
To normally work with ``authenticated`` repos, you will need the passphrase, but
there is an emergency workaround, see the ``BORG_WORKAROUNDS=authenticated_no_key`` docs.

If ``BLAKE2b`` is faster than ``SHA-256`` on your hardware, use ``--encryption authenticated-blake2``,
``--encryption repokey-blake2`` or ``--encryption keyfile-blake2``. Note: for remote backups

@@ -4101,6 +4122,23 @@ class Archiver:

Borg 1.x.y upgrades
+++++++++++++++++++

Archive TAM authentication:

Use ``borg upgrade --archives-tam REPO`` to add archive TAMs to all
archives that are not TAM authenticated yet.
This is a convenient method to just trust all archives present - if
an archive does not have TAM authentication yet, a TAM will be added.
Archives created by old borg versions < 1.0.9 do not have TAMs.
Archives created by newer borg versions should have TAMs already.
If you have a high-risk environment, you should not just run this,
but first verify that the archives are authentic and not malicious
(== have good content, have a good timestamp).
Borg 1.2.5+ needs all archives to be TAM authenticated for safety reasons.

This upgrade needs to be done once per repository.

Manifest TAM authentication:

Use ``borg upgrade --tam REPO`` to require manifest authentication
introduced with Borg 1.0.9 to address security issues. This means
that modifying the repository after doing this with a version prior

@@ -4181,6 +4219,8 @@ class Archiver:

help='Enable manifest authentication (in key and cache) (Borg 1.0.9 and later).')
subparser.add_argument('--disable-tam', dest='disable_tam', action='store_true',
help='Disable manifest authentication (in key and cache).')
subparser.add_argument('--archives-tam', dest='archives_tam', action='store_true',
help='add TAM authentication for all archives.')
subparser.add_argument('location', metavar='REPOSITORY', nargs='?', default='',
type=location_validator(archive=False),
help='path to the repository to be upgraded')

@@ -726,7 +726,8 @@ class LocalCache(CacheStatsMixin):

nonlocal processed_item_metadata_chunks
csize, data = decrypted_repository.get(archive_id)
chunk_idx.add(archive_id, 1, len(data), csize)
archive = ArchiveItem(internal_dict=msgpack.unpackb(data))
archive, verified, _ = self.key.unpack_and_verify_archive(data, force_tam_not_required=True)
archive = ArchiveItem(internal_dict=archive)
if archive.version != 1:
raise Exception('Unknown archive metadata version')
sync = CacheSynchronizer(chunk_idx)

@@ -22,6 +22,7 @@ from ..helpers import get_limited_unpacker

from ..helpers import bin_to_hex
from ..helpers import prepare_subprocess_env
from ..helpers import msgpack
from ..helpers import workarounds
from ..item import Key, EncryptedKey
from ..platform import SaveFile
from .nonces import NonceManager

@@ -30,6 +31,10 @@ from .low_level import AES, bytes_to_long, bytes_to_int, num_aes_blocks, hmac_sh

PREFIX = b'\0' * 8


# workaround for lost passphrase or key in "authenticated" or "authenticated-blake2" mode
AUTHENTICATED_NO_KEY = 'authenticated_no_key' in workarounds


class NoPassphraseFailure(Error):
"""can not acquire a passphrase: {}"""


@@ -82,6 +87,13 @@ class TAMRequiredError(IntegrityError):

traceback = False


class ArchiveTAMRequiredError(TAMRequiredError):
__doc__ = textwrap.dedent("""
Archive '{}' is unauthenticated, but it is required for this repository.
""").strip()
traceback = False


class TAMInvalid(IntegrityError):
__doc__ = IntegrityError.__doc__
traceback = False

@@ -91,6 +103,15 @@ class TAMInvalid(IntegrityError):

super().__init__('Manifest authentication did not verify')


class ArchiveTAMInvalid(IntegrityError):
__doc__ = IntegrityError.__doc__
traceback = False

def __init__(self):
# Error message becomes: "Data integrity error: Archive authentication did not verify"
super().__init__('Archive authentication did not verify')


class TAMUnsupportedSuiteError(IntegrityError):
"""Could not verify manifest: Unsupported suite {!r}; a newer version is needed."""
traceback = False

@@ -203,15 +224,17 @@ class KeyBase:

output_length=64
)

def pack_and_authenticate_metadata(self, metadata_dict, context=b'manifest'):
def pack_and_authenticate_metadata(self, metadata_dict, context=b'manifest', salt=None):
if salt is None:
salt = os.urandom(64)
metadata_dict = StableDict(metadata_dict)
tam = metadata_dict['tam'] = StableDict({
'type': 'HKDF_HMAC_SHA512',
'hmac': bytes(64),
'salt': os.urandom(64),
'salt': salt,
})
packed = msgpack.packb(metadata_dict, unicode_errors='surrogateescape')
tam_key = self._tam_key(tam['salt'], context)
tam_key = self._tam_key(salt, context)
tam['hmac'] = HMAC(tam_key, packed, sha512).digest()
return msgpack.packb(metadata_dict, unicode_errors='surrogateescape')


@@ -228,11 +251,13 @@ class KeyBase:

unpacker = get_limited_unpacker('manifest')
unpacker.feed(data)
unpacked = unpacker.unpack()
if AUTHENTICATED_NO_KEY:
return unpacked, True # True is a lie.
if b'tam' not in unpacked:
if tam_required:
raise TAMRequiredError(self.repository._location.canonical_path())
else:
logger.debug('TAM not found and not required')
logger.debug('Manifest TAM not found and not required')
return unpacked, False
tam = unpacked.pop(b'tam', None)
if not isinstance(tam, dict):

@@ -242,7 +267,7 @@ class KeyBase:

if tam_required:
raise TAMUnsupportedSuiteError(repr(tam_type))
else:
logger.debug('Ignoring TAM made with unsupported suite, since TAM is not required: %r', tam_type)
logger.debug('Ignoring manifest TAM made with unsupported suite, since TAM is not required: %r', tam_type)
return unpacked, False
tam_hmac = tam.get(b'hmac')
tam_salt = tam.get(b'salt')

@@ -257,6 +282,52 @@ class KeyBase:

logger.debug('TAM-verified manifest')
return unpacked, True

def unpack_and_verify_archive(self, data, force_tam_not_required=False):
"""Unpack msgpacked *data* and return (object, did_verify, salt)."""
tam_required = self.tam_required
if force_tam_not_required and tam_required:
# for a long time, borg only checked manifest for "tam_required" and
# people might have archives without TAM, so don't be too annoyingly loud here:
logger.debug('Archive authentication DISABLED.')
tam_required = False
data = bytearray(data)
unpacker = get_limited_unpacker('archive')
unpacker.feed(data)
unpacked = unpacker.unpack()
if b'tam' not in unpacked:
if tam_required:
archive_name = unpacked.get(b'name', b'<unknown>').decode('ascii', 'replace')
raise ArchiveTAMRequiredError(archive_name)
else:
logger.debug('Archive TAM not found and not required')
return unpacked, False, None
tam = unpacked.pop(b'tam', None)
if not isinstance(tam, dict):
raise ArchiveTAMInvalid()
tam_type = tam.get(b'type', b'<none>').decode('ascii', 'replace')
if tam_type != 'HKDF_HMAC_SHA512':
if tam_required:
raise TAMUnsupportedSuiteError(repr(tam_type))
else:
logger.debug('Ignoring archive TAM made with unsupported suite, since TAM is not required: %r', tam_type)
return unpacked, False, None
tam_hmac = tam.get(b'hmac')
tam_salt = tam.get(b'salt')
if not isinstance(tam_salt, bytes) or not isinstance(tam_hmac, bytes):
raise ArchiveTAMInvalid()
offset = data.index(tam_hmac)
data[offset:offset + 64] = bytes(64)
tam_key = self._tam_key(tam_salt, context=b'archive')
calculated_hmac = HMAC(tam_key, data, sha512).digest()
if not compare_digest(calculated_hmac, tam_hmac):
if 'ignore_invalid_archive_tam' in workarounds:
logger.debug('ignoring invalid archive TAM due to BORG_WORKAROUNDS')
return unpacked, False, None # same as if no TAM is present
else:
raise ArchiveTAMInvalid()
logger.debug('TAM-verified archive')
return unpacked, True, tam_salt


class PlaintextKey(KeyBase):
TYPE = 0x02

@@ -836,6 +907,19 @@ class AuthenticatedKeyBase(RepoKey):

# It's only authenticated, not encrypted.
logically_encrypted = False

def _load(self, key_data, passphrase):
if AUTHENTICATED_NO_KEY:
# fake _load if we have no key or passphrase
NOPE = bytes(32) # 256 bit all-zero
self.repository_id = NOPE
self.enc_key = NOPE
self.enc_hmac_key = NOPE
self.id_key = NOPE
self.chunk_seed = 0
self.tam_required = False
return True
return super()._load(key_data, passphrase)

def load(self, target, passphrase):
success = super().load(target, passphrase)
self.logically_encrypted = False

@@ -867,6 +951,8 @@ class AuthenticatedKeyBase(RepoKey):

if not decompress:
return payload
data = self.decompress(payload)
if AUTHENTICATED_NO_KEY:
return data
self.assert_id(id, data)
return data
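The pack/verify pair above relies on one trick: the HMAC is computed over the packed metadata with the embedded ``hmac`` field zeroed, so the tag covers everything else in the object, and the verifier re-zeroes that field in place before recomputing. A minimal stdlib-only sketch of the same idea (using JSON instead of msgpack, and a simplified ``key + salt`` key derivation; the real code derives the TAM key via HKDF):

```python
import hmac
import json
import os
from hashlib import sha512

ZERO_HMAC = bytes(64).hex()  # 128 '0' chars, placeholder for the 64-byte tag

def pack_and_authenticate(metadata: dict, key: bytes) -> bytes:
    # Embed a TAM-style block with a zeroed hmac field, serialize
    # deterministically, then splice the real tag into the packed bytes.
    salt = os.urandom(64)
    metadata = dict(metadata, tam={'type': 'HKDF_HMAC_SHA512',
                                   'hmac': ZERO_HMAC,
                                   'salt': salt.hex()})
    packed = json.dumps(metadata, sort_keys=True).encode()
    tam_key = key + salt  # simplified; borg uses HKDF here
    tag = hmac.new(tam_key, packed, sha512).hexdigest()
    return packed.replace(ZERO_HMAC.encode(), tag.encode())

def unpack_and_verify(data: bytes, key: bytes):
    obj = json.loads(data)
    tam = obj.pop('tam')
    salt = bytes.fromhex(tam['salt'])
    # Zero the embedded tag so the bytes match what was originally signed.
    zeroed = data.replace(tam['hmac'].encode(), ZERO_HMAC.encode())
    calculated = hmac.new(key + salt, zeroed, sha512).hexdigest()
    # constant-time comparison, like compare_digest in the diff above
    if not hmac.compare_digest(calculated, tam['hmac']):
        raise ValueError('archive authentication did not verify')
    return obj, True
```

Any change to the packed bytes (archive name, items, timestamps) changes the recomputed HMAC and verification fails, which is exactly why a TAM-less or forged archive cannot pass ``borg check`` after the fix.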

@@ -213,6 +213,12 @@ def get_limited_unpacker(kind):

object_hook=StableDict,
unicode_errors='surrogateescape',
))
elif kind == 'archive':
args.update(dict(use_list=True, # default value
max_map_len=100, # ARCHIVE_KEYS ~= 20
max_str_len=10000, # comment
object_hook=StableDict,
))
elif kind == 'key':
args.update(dict(use_list=True, # default value
max_array_len=0, # not used
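``get_limited_unpacker`` configures per-kind msgpack limits so untrusted input cannot trigger outsized allocations while deserializing. A rough stdlib sketch of the same intent using JSON (note the real unpacker enforces its limits *during* parsing, while this check runs after; limit names mirror the 'archive' kind above):

```python
import json

# limits mirroring the 'archive' kind in the diff above
ARCHIVE_LIMITS = {'max_map_len': 100, 'max_str_len': 10000}

def load_archive_limited(data: bytes, limits=ARCHIVE_LIMITS):
    obj = json.loads(data)
    if not isinstance(obj, dict):
        raise ValueError('expected a map at top level')
    if len(obj) > limits['max_map_len']:
        raise ValueError('too many keys')
    for value in obj.values():
        if isinstance(value, str) and len(value) > limits['max_str_len']:
            raise ValueError('string too long')
    return obj
```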

@@ -1809,9 +1815,10 @@ class ArchiveFormatter(BaseFormatter):

'id': 'internal ID of the archive',
'hostname': 'hostname of host on which this archive was created',
'username': 'username of user who created this archive',
'tam': 'TAM authentication state of this archive',
}
KEY_GROUPS = (
('archive', 'name', 'barchive', 'comment', 'bcomment', 'id'),
('archive', 'name', 'barchive', 'comment', 'bcomment', 'id', 'tam'),
('start', 'time', 'end', 'command_line'),
('hostname', 'username'),
)

@@ -1862,6 +1869,7 @@ class ArchiveFormatter(BaseFormatter):

'bcomment': partial(self.get_meta, 'comment', rs=False),
'end': self.get_ts_end,
'command_line': self.get_cmdline,
'tam': self.get_tam,
}
self.used_call_keys = set(self.call_keys) & self.format_keys
if self.json:

@@ -1912,6 +1920,9 @@ class ArchiveFormatter(BaseFormatter):

def get_ts_end(self):
return self.format_time(self.archive.ts_end)

def get_tam(self):
return 'verified' if self.archive.tam_verified else 'none'

def format_time(self, ts):
return OutputTimestamp(ts)
@ -39,11 +39,11 @@ from ..archiver import Archiver, parse_storage_quota, PURE_PYTHON_MSGPACK_WARNIN
|
|||
from ..cache import Cache, LocalCache
|
||||
from ..constants import * # NOQA
|
||||
from ..crypto.low_level import bytes_to_long, num_aes_blocks
|
||||
from ..crypto.key import KeyfileKeyBase, RepoKey, KeyfileKey, Passphrase, TAMRequiredError
|
||||
from ..crypto.key import KeyfileKeyBase, RepoKey, KeyfileKey, Passphrase, TAMRequiredError, ArchiveTAMRequiredError
|
||||
from ..crypto.keymanager import RepoIdMismatch, NotABorgKeyFile
|
||||
from ..crypto.file_integrity import FileIntegrityError
|
||||
from ..helpers import Location, get_security_dir
|
||||
from ..helpers import Manifest, MandatoryFeatureUnsupported
|
||||
from ..helpers import Manifest, MandatoryFeatureUnsupported, ArchiveInfo
|
||||
from ..helpers import EXIT_SUCCESS, EXIT_WARNING, EXIT_ERROR
|
||||
from ..helpers import bin_to_hex
|
||||
from ..helpers import MAX_S
|
||||
|
@ -1190,7 +1190,7 @@ class ArchiverTestCase(ArchiverTestCaseBase):
|
|||
self.cmd('delete', '--cache-only', self.repository_location)
|
||||
create_json = json.loads(self.cmd('create', '--no-cache-sync', self.repository_location + '::test', 'input',
|
||||
'--json', '--error')) # ignore experimental warning
|
||||
info_json = json.loads(self.cmd('info', self.repository_location + '::test', '--json'))
|
||||
info_json = json.loads(self.cmd('info', self.repository_location + '::test', '--json', '--error')) # ign warn
|
||||
create_stats = create_json['cache']['stats']
|
||||
info_stats = info_json['cache']['stats']
|
||||
assert create_stats == info_stats
|
||||
|
@ -3417,7 +3417,7 @@ class ArchiverCheckTestCase(ArchiverTestCaseBase):
|
|||
corrupted_manifest = manifest + b'corrupted!'
|
||||
repository.put(Manifest.MANIFEST_ID, corrupted_manifest)
|
||||
|
||||
archive = msgpack.packb({
|
||||
archive_dict = {
|
||||
'cmdline': [],
|
||||
'items': [],
|
||||
'hostname': 'foo',
|
||||
|
@ -3425,7 +3425,8 @@ class ArchiverCheckTestCase(ArchiverTestCaseBase):
|
|||
'name': 'archive1',
|
||||
'time': '2016-12-15T18:49:51.849711',
|
||||
'version': 1,
|
||||
})
|
||||
}
|
||||
archive = key.pack_and_authenticate_metadata(archive_dict, context=b'archive')
|
||||
archive_id = key.id_hash(archive)
|
||||
repository.put(archive_id, key.encrypt(archive))
|
||||
repository.commit()
|
||||
|
@ -3554,7 +3555,7 @@ class ManifestAuthenticationTest(ArchiverTestCaseBase):
|
|||
repository.commit()
|
||||
output = self.cmd('list', '--debug', self.repository_location)
|
||||
assert 'archive1234' in output
|
||||
assert 'TAM not found and not required' in output
|
||||
assert 'Manifest TAM not found and not required' in output
|
||||
# Run upgrade
|
||||
self.cmd('upgrade', '--tam', self.repository_location)
|
||||
# Manifest must be authenticated now
|
||||
|
@ -3587,6 +3588,70 @@ class ManifestAuthenticationTest(ArchiverTestCaseBase):
|
|||
assert not self.cmd('list', self.repository_location)
|
||||
|
||||
|
||||
class ArchiveAuthenticationTest(ArchiverTestCaseBase):
|
||||
|
||||
def write_archive_without_tam(self, repository, archive_name):
|
||||
        manifest, key = Manifest.load(repository, Manifest.NO_OPERATION_CHECK)
        archive_data = msgpack.packb({
            'version': 1,
            'name': archive_name,
            'items': [],
            'cmdline': '',
            'hostname': '',
            'username': '',
            'time': datetime.utcnow().strftime(ISO_FORMAT),
        })
        archive_id = key.id_hash(archive_data)
        repository.put(archive_id, key.encrypt(archive_data))
        manifest.archives[archive_name] = (archive_id, datetime.now())
        manifest.write()
        repository.commit()

    def test_upgrade_archives_tam(self):
        self.cmd('init', '--encryption=repokey', self.repository_location)
        self.create_src_archive('archive_tam')
        repository = Repository(self.repository_path, exclusive=True)
        with repository:
            self.write_archive_without_tam(repository, "archive_no_tam")
        output = self.cmd('list', '--format="{name} tam:{tam}{NL}"', self.repository_location)
        assert 'archive_tam tam:verified' in output  # good
        assert 'archive_no_tam tam:none' in output  # could be borg < 1.0.9 archive or fake
        self.cmd('upgrade', '--archives-tam', self.repository_location)
        output = self.cmd('list', '--format="{name} tam:{tam}{NL}"', self.repository_location)
        assert 'archive_tam tam:verified' in output  # still good
        assert 'archive_no_tam tam:verified' in output  # previously TAM-less archives got a TAM now

    def test_check_rebuild_manifest(self):
        self.cmd('init', '--encryption=repokey', self.repository_location)
        self.create_src_archive('archive_tam')
        repository = Repository(self.repository_path, exclusive=True)
        with repository:
            self.write_archive_without_tam(repository, "archive_no_tam")
            repository.delete(Manifest.MANIFEST_ID)  # kill manifest, so check has to rebuild it
            repository.commit()
        self.cmd('check', '--repair', self.repository_location)
        output = self.cmd('list', '--format="{name} tam:{tam}{NL}"', self.repository_location)
        assert 'archive_tam tam:verified' in output  # TAM-verified archive is in rebuilt manifest
        assert 'archive_no_tam' not in output  # check got rid of untrusted not TAM-verified archive

    def test_check_rebuild_refcounts(self):
        self.cmd('init', '--encryption=repokey', self.repository_location)
        self.create_src_archive('archive_tam')
        archive_id_pre_check = self.cmd('list', '--format="{name} {id}{NL}"', self.repository_location)
        repository = Repository(self.repository_path, exclusive=True)
        with repository:
            self.write_archive_without_tam(repository, "archive_no_tam")
        output = self.cmd('list', '--format="{name} tam:{tam}{NL}"', self.repository_location)
        assert 'archive_tam tam:verified' in output  # good
        assert 'archive_no_tam tam:none' in output  # could be borg < 1.0.9 archive or fake
        self.cmd('check', '--repair', self.repository_location)
        output = self.cmd('list', '--format="{name} tam:{tam}{NL}"', self.repository_location)
        assert 'archive_tam tam:verified' in output  # TAM-verified archive still there
        assert 'archive_no_tam' not in output  # check got rid of untrusted not TAM-verified archive
        archive_id_post_check = self.cmd('list', '--format="{name} {id}{NL}"', self.repository_location)
        assert archive_id_post_check == archive_id_pre_check  # rebuild_refcounts didn't change archive_tam archive id

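The tests above revolve around borg's TAM (tamper authentication mechanism): an archive written without a TAM shows up as `tam:none` and is discarded by `check --repair`, while `upgrade --archives-tam` adds TAMs to trusted legacy archives. As a rough standalone sketch of the underlying pack-then-authenticate / verify-before-trust pattern — not borg's actual implementation, which packs msgpack and derives its key via HKDF-HMAC-SHA512; JSON and the function names here are illustrative only:

```python
import hashlib
import hmac
import json

ZERO_TAG = '00' * 64  # placeholder occupying the tag's position during MAC computation

def pack_and_authenticate(meta, key):
    # Embed a zeroed tag, serialize deterministically, MAC, then fill the tag in.
    meta = dict(meta, tam={'type': 'HMAC_SHA512', 'hmac': ZERO_TAG})
    packed = json.dumps(meta, sort_keys=True).encode()
    meta['tam']['hmac'] = hmac.new(key, packed, hashlib.sha512).hexdigest()
    return json.dumps(meta, sort_keys=True).encode()

def unpack_and_verify(blob, key):
    meta = json.loads(blob)
    tam = meta.get('tam')
    if not isinstance(tam, dict) or 'hmac' not in tam:
        # borg distinguishes TAMRequiredError / ArchiveTAMInvalid; simplified here
        raise ValueError('TAM missing or invalid')
    # Swap the tag back out for the placeholder and recompute over the same bytes.
    tag, tam['hmac'] = tam['hmac'], ZERO_TAG
    packed = json.dumps(meta, sort_keys=True).encode()
    if not hmac.compare_digest(tag, hmac.new(key, packed, hashlib.sha512).hexdigest()):
        raise ValueError('TAM verification failed')
    return meta, True

key = b'\x00' * 64
blob = pack_and_authenticate({'name': 'archive1', 'version': 1}, key)
meta, verified = unpack_and_verify(blob, key)
assert verified and meta['name'] == 'archive1'
```

A blob whose contents are changed after authentication (e.g. a renamed archive) no longer matches its tag and is rejected, which is the property the `tam:none` checks above rely on.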

class RemoteArchiverTestCase(ArchiverTestCase):
    prefix = '__testsuite__:'

@@ -11,6 +11,7 @@ from ..crypto.key import PlaintextKey, PassphraseKey, AuthenticatedKey, RepoKey,
     Blake2KeyfileKey, Blake2RepoKey, Blake2AuthenticatedKey
 from ..crypto.key import ID_HMAC_SHA_256, ID_BLAKE2b_256
 from ..crypto.key import TAMRequiredError, TAMInvalid, TAMUnsupportedSuiteError, UnsupportedManifestError
+from ..crypto.key import ArchiveTAMInvalid
 from ..crypto.key import identify_key
 from ..crypto.low_level import bytes_to_long, num_aes_blocks
 from ..helpers import IntegrityError
@@ -338,6 +339,8 @@ class TestTAM:
         blob = msgpack.packb({})
         with pytest.raises(TAMRequiredError):
             key.unpack_and_verify_manifest(blob)
+        with pytest.raises(TAMRequiredError):
+            key.unpack_and_verify_archive(blob)
 
     def test_missing(self, key):
         blob = msgpack.packb({})
@@ -345,6 +348,9 @@ class TestTAM:
         unpacked, verified = key.unpack_and_verify_manifest(blob)
         assert unpacked == {}
         assert not verified
+        unpacked, verified, _ = key.unpack_and_verify_archive(blob)
+        assert unpacked == {}
+        assert not verified
 
     def test_unknown_type_when_required(self, key):
         blob = msgpack.packb({
@@ -354,6 +360,8 @@ class TestTAM:
         })
         with pytest.raises(TAMUnsupportedSuiteError):
             key.unpack_and_verify_manifest(blob)
+        with pytest.raises(TAMUnsupportedSuiteError):
+            key.unpack_and_verify_archive(blob)
 
     def test_unknown_type(self, key):
         blob = msgpack.packb({
@@ -365,6 +373,9 @@ class TestTAM:
         unpacked, verified = key.unpack_and_verify_manifest(blob)
         assert unpacked == {}
         assert not verified
+        unpacked, verified, _ = key.unpack_and_verify_archive(blob)
+        assert unpacked == {}
+        assert not verified
 
     @pytest.mark.parametrize('tam, exc', (
         ({}, TAMUnsupportedSuiteError),
@@ -372,13 +383,26 @@ class TestTAM:
         (None, TAMInvalid),
         (1234, TAMInvalid),
     ))
-    def test_invalid(self, key, tam, exc):
+    def test_invalid_manifest(self, key, tam, exc):
         blob = msgpack.packb({
             'tam': tam,
         })
         with pytest.raises(exc):
             key.unpack_and_verify_manifest(blob)
 
+    @pytest.mark.parametrize('tam, exc', (
+        ({}, TAMUnsupportedSuiteError),
+        ({'type': b'\xff'}, TAMUnsupportedSuiteError),
+        (None, ArchiveTAMInvalid),
+        (1234, ArchiveTAMInvalid),
+    ))
+    def test_invalid_archive(self, key, tam, exc):
+        blob = msgpack.packb({
+            'tam': tam,
+        })
+        with pytest.raises(exc):
+            key.unpack_and_verify_archive(blob)
+
     @pytest.mark.parametrize('hmac, salt', (
         ({}, bytes(64)),
         (bytes(64), {}),
@@ -401,10 +425,12 @@ class TestTAM:
         blob = msgpack.packb(data)
         with pytest.raises(TAMInvalid):
             key.unpack_and_verify_manifest(blob)
+        with pytest.raises(ArchiveTAMInvalid):
+            key.unpack_and_verify_archive(blob)
 
-    def test_round_trip(self, key):
+    def test_round_trip_manifest(self, key):
         data = {'foo': 'bar'}
-        blob = key.pack_and_authenticate_metadata(data)
+        blob = key.pack_and_authenticate_metadata(data, context=b"manifest")
         assert blob.startswith(b'\x82')
 
         unpacked = msgpack.unpackb(blob)
@@ -415,10 +441,23 @@ class TestTAM:
         assert unpacked[b'foo'] == b'bar'
         assert b'tam' not in unpacked
 
-    @pytest.mark.parametrize('which', (b'hmac', b'salt'))
-    def test_tampered(self, key, which):
+    def test_round_trip_archive(self, key):
         data = {'foo': 'bar'}
-        blob = key.pack_and_authenticate_metadata(data)
+        blob = key.pack_and_authenticate_metadata(data, context=b"archive")
         assert blob.startswith(b'\x82')
 
+        unpacked = msgpack.unpackb(blob)
+        assert unpacked[b'tam'][b'type'] == b'HKDF_HMAC_SHA512'
+
+        unpacked, verified, _ = key.unpack_and_verify_archive(blob)
+        assert verified
+        assert unpacked[b'foo'] == b'bar'
+        assert b'tam' not in unpacked
+
+    @pytest.mark.parametrize('which', (b'hmac', b'salt'))
+    def test_tampered_manifest(self, key, which):
+        data = {'foo': 'bar'}
+        blob = key.pack_and_authenticate_metadata(data, context=b"manifest")
+        assert blob.startswith(b'\x82')
+
         unpacked = msgpack.unpackb(blob, object_hook=StableDict)
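The `context=b"manifest"` / `context=b"archive"` argument passed to `pack_and_authenticate_metadata` above is the domain-separation half of the fix: manifest and archive tags are derived for distinct contexts, so a tag computed for one can never verify as the other. A hedged sketch of that idea — simplified; borg's real derivation is HKDF-HMAC-SHA512, and the names and label string here are illustrative:

```python
import hashlib
import hmac

def derive_tam_key(master: bytes, context: bytes) -> bytes:
    # Mix the context into key derivation so each context gets its own key.
    # (Simplified stand-in for an HKDF-HMAC-SHA512 derivation.)
    return hmac.new(master, b'tam-key:' + context, hashlib.sha512).digest()

def authenticate(packed: bytes, master: bytes, context: bytes) -> bytes:
    # Tag the serialized metadata under the context-specific key.
    return hmac.new(derive_tam_key(master, context), packed, hashlib.sha512).digest()

master = b'\x42' * 64
packed = b'\x82fake-packed-metadata'  # stand-in for a msgpack blob
archive_tag = authenticate(packed, master, b'archive')
manifest_tag = authenticate(packed, master, b'manifest')
assert archive_tag != manifest_tag  # domain separation: contexts never collide
assert hmac.compare_digest(archive_tag, authenticate(packed, master, b'archive'))
```

With separate contexts, an attacker who obtains a validly tagged blob of one kind cannot replay it where a blob of the other kind is expected.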
@@ -429,3 +468,18 @@ class TestTAM:
 
         with pytest.raises(TAMInvalid):
             key.unpack_and_verify_manifest(blob)
+
+    @pytest.mark.parametrize('which', (b'hmac', b'salt'))
+    def test_tampered_archive(self, key, which):
+        data = {'foo': 'bar'}
+        blob = key.pack_and_authenticate_metadata(data, context=b"archive")
+        assert blob.startswith(b'\x82')
+
+        unpacked = msgpack.unpackb(blob, object_hook=StableDict)
+        assert len(unpacked[b'tam'][which]) == 64
+        unpacked[b'tam'][which] = unpacked[b'tam'][which][0:32] + bytes(32)
+        assert len(unpacked[b'tam'][which]) == 64
+        blob = msgpack.packb(unpacked)
+
+        with pytest.raises(ArchiveTAMInvalid):
+            key.unpack_and_verify_archive(blob)
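The tampering tests above overwrite the second half of the 64-byte `hmac` or `salt` field with zeros and expect verification to fail. That detect-any-bit-flip property of an HMAC tag can be shown standalone — the names below are illustrative, not borg's API:

```python
import hashlib
import hmac

key = b'\x01' * 64
blob = b'\x82packed-archive-metadata'  # stand-in for a msgpack blob

# Compute the 64-byte HMAC-SHA512 tag over the blob.
tag = hmac.new(key, blob, hashlib.sha512).digest()
assert len(tag) == 64

# Emulate the tampered test: keep the length, zero the second half.
bad_tag = tag[0:32] + bytes(32)
assert len(bad_tag) == 64

# Recompute and compare in constant time: the real tag verifies, the zeroed one fails.
expected = hmac.new(key, blob, hashlib.sha512).digest()
ok = hmac.compare_digest(tag, expected)
bad = hmac.compare_digest(bad_tag, expected)
assert ok and not bad
```

Length checks alone (as the test's `len(...) == 64` asserts emphasize) say nothing about authenticity; only the constant-time comparison against a freshly recomputed tag does.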