Streaming download pool for JS API #21

@max-mapper

It might make sense to do this as a separate module (and it might already exist), but I want this API:

// for example purposes assume urlStream is an object stream that emits a bunch of URLs as strings, or objects for request options
var nugget = require('nugget')
var pump = require('pump')
var downloader = nugget.createDownloadStream() // options could include the parallelism, and you could also pass defaults for request here as options.request or options.defaults maybe
pump(urlStream, downloader, function (err) {
  if (err) throw err
  console.log('done downloading')
})

Internally, createDownloadStream would start a parallel queue of configurable size (maybe powered by https://www.npmjs.com/package/run-parallel-limit). It would return a writable stream that you write URLs into.
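
A minimal sketch of how that queue could work, using only Node's core stream module rather than run-parallel-limit (the parallel option name and the downloadOne helper are made up for illustration): when the pool is full it simply holds on to the write callback, which applies the same concurrency limit to a streaming input.

var stream = require('stream')

function createDownloadStream (opts) {
  opts = opts || {}
  var limit = opts.parallel || 3 // option name is hypothetical
  var inFlight = 0
  var resume = null // held write callback while the pool is full
  var flushed = null // held final callback while downloads are still running

  var downloader = new stream.Writable({
    objectMode: true,
    write: function (url, enc, cb) {
      inFlight++
      downloadOne(url, function () {
        inFlight--
        if (resume) { var next = resume; resume = null; next() }
        else if (inFlight === 0 && flushed) flushed()
      })
      if (inFlight < limit) cb() // pool has room, accept the next URL now
      else resume = cb // pool is full, hold the stream until a job finishes
    },
    final: function (cb) {
      if (inFlight === 0) cb()
      else flushed = cb
    }
  })

  return downloader
}

function downloadOne (url, done) {
  setImmediate(done) // placeholder for the actual request + save-to-disk step
}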

For every URL it receives, it should add it to the queue. It should emit events when it starts and finishes each URL, as well as expose download progress through a static property/object somewhere on the createDownloadStream instance.
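
For the bookkeeping, each job could bump counters on a progress object attached to the returned stream and emit lifecycle events; the event names, the progress shape, and the startJob/downloadOne helpers below are hypothetical, just to show where the hooks would live.

function startJob (downloader, url, done) {
  downloader.progress.started++ // progress would be initialized as { started: 0, finished: 0 } in createDownloadStream
  downloader.emit('download-start', url)
  downloadOne(url, function (err) {
    downloader.progress.finished++
    downloader.emit('download-finish', url, err)
    done()
  })
}

function downloadOne (url, cb) {
  setImmediate(cb) // placeholder for the actual request + save-to-disk step
}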

For error handling, it should only destroy the stream with an error if the error is catastrophic. Maybe you could pass in a function that gets called with (err, resp, body) for each request, so you can handle the response yourself if you want?
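
That could look roughly like this, assuming an onResponse option (hypothetical name) and the request module for the HTTP call; a failed download just frees its slot in the pool instead of destroying the stream.

var request = require('request')

function fetchOne (url, opts, done) {
  request(url, function (err, resp, body) {
    if (opts.onResponse) opts.onResponse(err, resp, body) // the caller decides what a failure means
    done() // the pool keeps going either way; only catastrophic setup errors would destroy the stream
  })
}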

Finally, when it downloads, it should behave like nugget/wget and save the resource to a file on disk. The filename it saves to should be configurable in the object you write in as input. If you just write a single URL string as input, it should do what nugget does by default -- just use the HTTP filename.
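
A rough sketch of that save step, assuming input objects shaped like { url: '...', target: 'file.tar.gz' } (property names hypothetical), with a bare URL string falling back to the last path segment of the URL.

var fs = require('fs')
var path = require('path')
var parseUrl = require('url').parse
var request = require('request')
var pump = require('pump')

function saveOne (input, done) {
  var opts = typeof input === 'string' ? { url: input } : input
  // default to the filename from the URL path, like nugget/wget do
  var target = opts.target || path.basename(parseUrl(opts.url).pathname) || 'index.html'
  pump(request(opts.url), fs.createWriteStream(target), done)
}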
