Which service (blob, file, queue, table) does this issue concern?
Blob
Which version of the SDK was used?
"@azure/storage-blob": "^10.2.0-preview"
What's the Node.js/Browser version?
Node.js
What problem was encountered?
Storage streams never drained / flushed.
Steps to reproduce the issue?
For one scenario we're using the tar package and uploading streams directly from tar archives, via the 'entry' event. Per the tar docs: the parser "emits 'entry' events with tar.ReadEntry objects, which are themselves readable streams that you can pipe wherever. Each entry will not emit until the one before it is flushed through, so make sure to either consume the data (with on('data', ...) or .pipe(...)) or throw it away with .resume() to keep the stream flowing."
A few files upload fine, but then the process gets 'stuck' for several minutes before slowly continuing on to upload more files. This suggests that BlockBlobURL.upload does not consume the file stream. The promise returned from blobUrl.upload also never resolves.
Our code is something like this:
// Imports elided in the original snippet:
import { createReadStream } from 'fs';
import { Parse } from 'tar';
import * as mime from 'mime';
import { Aborter } from '@azure/storage-blob';

const parser = new Parse();
createReadStream(tarball).pipe(parser);
parser.on('entry', file => {
  // Hand the entry stream to upload via a body factory.
  blobUrl.upload(Aborter.none, () => file, file.size, {
    blobHTTPHeaders: {
      blobContentType: mime.getType(file.path)!,
    },
  });
});
There's a bit more stuff in there around handling promises and such, but that's the gist of it.
Using storage.createBlockBlobFromStreamAsync from the previous SDK worked fine in this scenario, and also manually calling uploadStreamToBlockBlob works...
uploadStreamToBlockBlob(Aborter.none, file, blobUrl, 2 * 1024 * 1024, 20, {
  blobHTTPHeaders: {
    blobContentType: mime.getType(file.path),
  },
});
...but the more ergonomic blobUrl.upload does not.
Have you found a mitigation/solution?
Above ^