tar-stream

tar-stream is a streaming tar parser and generator and nothing else. It operates purely using streams, which means you can easily extract/parse tarballs without ever hitting the file system.

Note that you still need to gunzip your data if you have a .tar.gz. We recommend using gunzip-maybe in conjunction with this.

npm install tar-stream

Usage

tar-stream exposes two streams, pack which creates tarballs and extract which extracts tarballs. To modify an existing tarball use both.

It implements USTAR with additional support for pax extended headers. It should be compatible with all popular tar distributions out there (gnutar, bsdtar etc).

If you want to pack/unpack directories on the file system check out tar-fs which provides file system bindings to this module.
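
A minimal sketch of how the two modules fit together, assuming tar-fs is installed and using hypothetical directory and file names:

const fs = require('fs')
const tarfs = require('tar-fs')

// pack the contents of ./my-directory into a tarball stream
tarfs.pack('./my-directory').pipe(fs.createWriteStream('out.tar'))

// extract a tarball stream into ./extracted
fs.createReadStream('out.tar').pipe(tarfs.extract('./extracted'))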

Packing

To create a pack stream use tar.pack() and call pack.entry(header, [callback]) to add tar entries.

const tar = require('tar-stream')
const pack = tar.pack() // pack is a stream

// add a file called my-test.txt with the content "Hello World!"
pack.entry({ name: 'my-test.txt' }, 'Hello World!')

// add a file called my-stream-test.txt from a stream
const entry = pack.entry({ name: 'my-stream-test.txt', size: 11 }, function (err) {
  // the stream was added
  // no more entries
  pack.finalize()
})

entry.write('hello')
entry.write(' ')
entry.write('world')
entry.end()

// pipe the pack stream somewhere
pack.pipe(process.stdout)

Extracting

To extract a tarball use tar.extract() and listen for extract.on('entry', (header, stream, next)) events.

const extract = tar.extract()

extract.on('entry', function (header, stream, next) {
  // header is the tar header
  // stream is the content body (might be an empty stream)
  // call next when you are done with this entry

  stream.on('end', function () {
    next() // ready for next entry
  })

  stream.resume() // just auto drain the stream
})

extract.on('finish', function () {
  // all entries read
})

pack.pipe(extract)

The tar archive is streamed sequentially, meaning you must drain each entry's stream as you get them or else the main extract stream will receive backpressure and stop reading.
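
As noted above, gzipped archives need to be gunzipped before extraction. A minimal sketch using gunzip-maybe with a hypothetical archive.tar.gz:

const fs = require('fs')
const gunzip = require('gunzip-maybe')
const tar = require('tar-stream')

const extract = tar.extract()
// attach an 'entry' handler as shown above ...

fs.createReadStream('archive.tar.gz')
  .pipe(gunzip()) // unzips if the input is gzipped, passes plain tar data through otherwise
  .pipe(extract)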

Extracting as an async iterator

The extraction stream, in addition to being a writable stream, is also an async iterator.

const extract = tar.extract()

someStream.pipe(extract)

for await (const entry of extract) {
  entry.header // the tar header
  entry.resume() // the entry is the stream also
}
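
A sketch of collecting each entry's content into a buffer, assuming each entry stream can itself be consumed with for await (it is a streamx readable in recent versions) and the entries are small enough to buffer:

const extract = tar.extract()

someStream.pipe(extract)

for await (const entry of extract) {
  const chunks = []
  for await (const chunk of entry) chunks.push(chunk)
  const content = Buffer.concat(chunks)
  console.log(entry.header.name, content.length)
}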

Headers

The header object used in entry should contain the following properties. Most of these values can be found by stat'ing a file.

{
  name: 'path/to/this/entry.txt',
  size: 1314,        // entry size. defaults to 0
  mode: 0o644,       // entry mode. defaults to 0o755 for dirs and 0o644 otherwise
  mtime: new Date(), // last modified date for entry. defaults to now.
  type: 'file',      // type of entry. defaults to file. can be:
                     // file | link | symlink | directory | block-device
                     // character-device | fifo | contiguous-file
  linkname: 'path',  // linked file name
  uid: 0,            // uid of entry owner. defaults to 0
  gid: 0,            // gid of entry owner. defaults to 0
  uname: 'maf',      // uname of entry owner. defaults to null
  gname: 'staff',    // gname of entry owner. defaults to null
  devmajor: 0,       // device major number. defaults to 0
  devminor: 0        // device minor number. defaults to 0
}
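
For example, directory and symlink entries carry no body, and a symlink stores its target in linkname (the paths below are hypothetical):

// a directory entry
pack.entry({ name: 'my-dir', type: 'directory' })

// a symlink entry pointing at another path
pack.entry({ name: 'my-dir/link.txt', type: 'symlink', linkname: 'target.txt' })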

Modifying existing tarballs

Using tar-stream it is easy to rewrite paths, change modes, etc. in an existing tarball.

const extract = tar.extract()
const pack = tar.pack()
const path = require('path')

extract.on('entry', function (header, stream, callback) {
  // let's prefix all names with 'tmp'
  header.name = path.join('tmp', header.name)
  // write the new entry to the pack stream
  stream.pipe(pack.entry(header, callback))
})

extract.on('finish', function () {
  // all entries done - let's finalize it
  pack.finalize()
})

// pipe the old tarball to the extractor
oldTarballStream.pipe(extract)

// pipe the new tarball to another stream
pack.pipe(newTarballStream)

Saving tarball to fs

const fs = require('fs')
const tar = require('tar-stream')

const pack = tar.pack() // pack is a stream
const path = 'YourTarBall.tar'
const yourTarball = fs.createWriteStream(path)

// add a file called YourFile.txt with the content "Hello World!"
pack.entry({ name: 'YourFile.txt' }, 'Hello World!', function (err) {
  if (err) throw err
  pack.finalize()
})

// pipe the pack stream to your file
pack.pipe(yourTarball)

yourTarball.on('close', function () {
  console.log(path + ' has been written')
  fs.stat(path, function (err, stats) {
    if (err) throw err
    console.log(stats)
    console.log('Got file info successfully!')
  })
})

Performance

See tar-fs for a performance comparison with node-tar.

License

MIT