Details
- Type: Bug
- Status: Waiting for peer review
- Priority: Minor
- Resolution: Unresolved
- Affects Version/s: 3.9.8, 3.10.5, 3.11.1
- Fix Version/s: None
- Environment: php 7.4.15+
Description
The tgz_extractor class reads gzipped tar files, most notably when restoring tgz-formatted course backups.
It performs a read from the file handle with:
`$buffer = gzread($gz, self::READ_BLOCK_SIZE);` where `READ_BLOCK_SIZE` is 65536.
It assumes it gets back 65536 bytes (or whatever remains, if near EOF).
However, as of PHP 7.4.15, this changelog entry was added:
- "Fixed bug #80384 (filter buffers entire read until file closed)"
This fixes an issue in PHP where streams would read in more data than an attached filter actually needed at any point.
The side effect for Moodle is that the gzread call can return fewer than 65536 bytes even when more data remains, depending on what the underlying stream actually is. When that happens, the tar processing fails: it relies on having a full 512-byte block of data available, and when the returned buffer isn't perfectly divisible by 512, that final partial block breaks it. A defensive read loop avoids this, as sketched below.
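A minimal sketch of that defensive pattern. The helper name `gzread_exact` is hypothetical, not Moodle's actual fix; it just illustrates looping until the requested byte count is reached:

```php
<?php
// Loop until the requested byte count is reached or EOF, so callers
// such as the tar block parser always see full 512-byte blocks
// (except for the final, possibly shorter, read at EOF).
function gzread_exact($gz, int $length): string {
    $buffer = '';
    while (strlen($buffer) < $length && !gzeof($gz)) {
        $chunk = gzread($gz, $length - strlen($buffer));
        if ($chunk === false || $chunk === '') {
            break; // Read error or no further data currently available.
        }
        $buffer .= $chunk;
    }
    return $buffer;
}

// Usage in place of the single gzread() call:
// $buffer = gzread_exact($gz, self::READ_BLOCK_SIZE);
```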
This isn't an issue for the default file_system, as it doesn't use a stream with stream_filter_append.
When using a remote-file-store-based file_system (S3, etc.), and without this fix, the file_system plugin has to download the full compressed file and then act on that. With the fix, you can have a stream reader that reads the compressed stream and processes it on the fly in memory, without committing it to disk first, as in the sketch below.
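A minimal sketch of that on-the-fly approach, assuming the file_system plugin can expose a readable stream (the `s3://` URL here stands in for whatever stream wrapper the plugin registers, e.g. the AWS SDK's):

```php
<?php
// Open the remote compressed file as a stream and decode gzip on read.
// window => 15 + 16 tells zlib.inflate to expect gzip framing.
$stream = fopen('s3://bucket/backup.tgz', 'rb');
stream_filter_append($stream, 'zlib.inflate', STREAM_FILTER_READ, ['window' => 15 + 16]);

// Since PHP 7.4.15, reads through the filter may return fewer bytes
// than requested even before EOF, so consumers must tolerate short
// reads and accumulate data as needed.
while (!feof($stream)) {
    $chunk = fread($stream, 65536); // May be shorter than 65536.
    if ($chunk === false || $chunk === '') {
        break;
    }
    // ... feed $chunk to the tar block parser ...
}
fclose($stream);
```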
Issue Links
- has a non-specific relationship to MDL-72935: "Download all submissions downloads broken zip file when streaming from S3" (Open)