
Commit 7b1a0ee

Internal event handler modifications as fix for nathanpeck#10

1 parent 65eef85

4 files changed: +15 -17 lines

CHANGELOG.md (6 additions, 0 deletions)

@@ -1,17 +1,23 @@
 Changelog
 =========
 
+#### 0.6.1 (2014-08-22)
+
+* The internal event emitter wasn't set up properly, causing errors about the upload stream object not having the .emit and/or .once methods. This bug impacted versions 0.5.0 and 0.6.0. Fixes issue #10.
+
 #### 0.6.0 (2014-08-15)
 
 * Fix for a mismatch between documentation and reality in the maxPartSize() and concurrentParts() options.
 * New feature: part size and concurrent part helpers can be chained now.
+* *Warning: this version has a critical bug. It is recommended that you use 0.6.1 instead.*
 
 ### 0.5.0 (2014-08-11)
 
 * Added client caching to reuse an existing s3 client rather than creating a new one for each upload. Fixes #6
 * Updated the maxPartSize to be a hard limit instead of a soft one so that generated ETags are consistent due to the reliable size of the uploaded parts. Fixes #7
 * Added this file. Fixes #8
 * New feature: concurrent part uploads. Now you can optionally enable concurrent part uploads if you wish to allow your application to drain the source stream more quickly and absorb some of the bottleneck when uploading to S3.
+* *Warning: this version has a critical bug. It is recommended that you use 0.6.1 instead.*
 
 ### 0.4.0 (2014-06-23)
 

README.md (2 additions, 10 deletions)

@@ -6,17 +6,9 @@ A pipeable write stream which uploads to Amazon S3 using the multipart file uplo
 
 ### Changelog
 
-#### 0.6.0 (2014-08-15)
+#### 0.6.1 (2014-08-15)
 
-* Fix for mismatch between documentation and reality in the maxPartSize() and concurrentParts() options.
-* New feature: part size and concurrect part helpers can be chained now.
-
-#### 0.5.0 (2014-08-11)
-
-* Added client caching to reuse an existing s3 client rather than creating a new one for each upload. Fixes #6
-* Updated the maxPartSize to be a hard limit instead of a soft one so that generated ETAG's are consistent due to the reliable size of the uploaded parts. Fixes #7
-* Added a changelog.md file. Fixes #8
-* New feature: concurrent part uploads. Now you can optionally enable concurrent part uploads if you wish to allow your application to drain the source stream more quickly and absorb some of the backpressure from a fast incoming stream when uploading to S3.
+Fix for an issue with the internal event emitter being improperly attached. This issue caused crashes in v0.5.0 and v0.6.0, so it is recommended that you upgrade to v0.6.1 if you are using one of the affected versions.
 
 [Historical Changelogs](CHANGELOG.md)
 

lib/s3-upload-stream.js (6 additions, 6 deletions)

@@ -1,6 +1,5 @@
 var Writable = require('stream').Writable,
-    util = require("util"),
-    EventEmitter = require("events").EventEmitter,
+    events = require("events"),
     AWS = require('aws-sdk');
 
 var cachedClient;
@@ -13,6 +12,7 @@ module.exports = {
   // Generate a writeable stream which uploads to a file on S3.
   Uploader: function (connection, destinationDetails, doneCreatingUploadStream) {
     var self = this;
+    var e = new events.EventEmitter();
 
     if (arguments.length == 2) {
       // No connection passed in, assume that the connection details were already specified using
@@ -50,6 +50,8 @@ module.exports = {
       highWaterMark: 4194304 // 4 MB
     });
 
+    events.EventEmitter.call(self);
+
     // Data pertaining to the overall upload
     self.partNumber = 1;
     self.partIds = [];
@@ -104,14 +106,14 @@ module.exports = {
       else {
         // Block uploading (and receiving of more data) until we upload
         // some of the pending parts
-        self.once('chunk', upload);
+        e.once('chunk', upload);
       }
 
       function upload() {
        self.pendingParts++;
        self.flushPart(function (partDetails) {
          --self.pendingParts;
-          self.emit('chunk'); // Internal event
+          e.emit('chunk'); // Internal event
          self.ws.emit('chunk', partDetails); // External event
        });
        next();
@@ -267,5 +269,3 @@ module.exports = {
     );
   }
 };
-
-util.inherits(module.exports.Uploader, EventEmitter);

package.json (1 addition, 1 deletion)

@@ -1,7 +1,7 @@
 {
   "name": "s3-upload-stream",
   "description": "Writeable stream for uploading content of unknown size to S3 via the multipart API.",
-  "version": "0.6.0",
+  "version": "0.6.1",
   "author": {
     "name": "Nathan Peck",
     "email": "[email protected]"
