mirror of https://github.com/borgbackup/borg.git synced 2024-12-27 10:18:12 +00:00

use cached_hash also to generate all-zero replacement chunks

at least for large numbers of fixed-size replacement chunks,
this will be much faster. also: less memory management overhead.
Thomas Waldmann 2021-01-08 19:29:29 +01:00
parent f3088a9893
commit ef19d937ed

@@ -1662,8 +1662,8 @@ def verify_file_chunks(archive_name, item):
         If a previously missing file chunk re-appears, the replacement chunk is replaced by the correct one.
         """
         def replacement_chunk(size):
-            data = bytes(size)
-            chunk_id = self.key.id_hash(data)
+            chunk = Chunk(None, allocation=CH_ALLOC, size=size)
+            chunk_id, data = cached_hash(chunk, self.key.id_hash)
             cdata = self.key.encrypt(data)
             csize = len(cdata)
             return chunk_id, size, csize, cdata
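The idea behind the change can be sketched as follows: an all-zero ("allocation") chunk and its hash depend only on the chunk size, so both can be memoized instead of being reallocated and rehashed for every missing chunk. This is a minimal illustrative sketch, not borg's actual `cached_hash`/`Chunk` implementation; the function names and the use of sha256 here are assumptions.

```python
import hashlib
from functools import lru_cache

# Hypothetical sketch of the optimization: for all-zero replacement chunks,
# the zero buffer and its hash are fully determined by the size, so both
# can be cached and reused across repeated calls with the same size.
@lru_cache(maxsize=None)
def zeros_and_hash(size):
    """Return (data, chunk_id) for an all-zero chunk of `size` bytes, memoized."""
    data = bytes(size)                       # all-zero buffer, allocated once per size
    chunk_id = hashlib.sha256(data).digest() # hashed once per size (sha256 is a stand-in)
    return data, chunk_id

def replacement_chunk(size):
    """Build a replacement chunk without rehashing the same zero buffer."""
    data, chunk_id = zeros_and_hash(size)
    return chunk_id, data
```

With many missing chunks of the same fixed size, the hash is computed once instead of once per chunk, which is where both the speedup and the reduced allocation overhead come from.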