
s3-plugin-webpack's Introduction

S3 Plugin


This plugin will upload all built assets to S3.

Install Instructions

$ npm i webpack-s3-plugin

Note: This plugin requires Node.js > 0.12.0

Usage Instructions

I notice a lot of people are setting the directory option when the files are part of their build. Please don't set directory if you're uploading your build: the directory option reads files from disk after compilation instead of taking them straight from the build process.
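For contrast, here is a minimal sketch of a case where directory is appropriate: uploading a folder produced outside the current compilation (the dist path here is illustrative):

var S3Plugin = require('webpack-s3-plugin')

var config = {
  plugins: [
    new S3Plugin({
      // Only set directory for files that are NOT emitted by this compilation
      directory: 'dist',
      s3Options: {
        accessKeyId: process.env.AWS_ACCESS_KEY_ID,
        secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY
      },
      s3UploadOptions: {
        Bucket: 'MyBucket'
      }
    })
  ]
}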

You can also use a credentials file from AWS. To use a profile, set your s3Options as follows:

var AWS = require('aws-sdk') // needed for SharedIniFileCredentials

s3Options: {
  credentials: new AWS.SharedIniFileCredentials({profile: 'PROFILE_NAME'})
}

s3UploadOptions defaults to ACL: 'public-read', so you may need to override it if you have other needs. See #28.
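For instance, if your bucket forbids public objects, a minimal sketch of the override might look like this ('private' is a standard S3 canned ACL):

s3UploadOptions: {
  Bucket: 'MyBucket',
  ACL: 'private' // overrides the plugin default of 'public-read'
}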

Require webpack-s3-plugin
var S3Plugin = require('webpack-s3-plugin')
With exclude
var config = {
  plugins: [
    new S3Plugin({
      // Exclude uploading of html
      exclude: /.*\.html$/,
      // s3Options are required
      s3Options: {
        accessKeyId: process.env.AWS_ACCESS_KEY_ID,
        secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
        region: 'us-west-1'
      },
      s3UploadOptions: {
        Bucket: 'MyBucket'
      },
      cdnizerOptions: {
        defaultCDNBase: 'http://asdf.ca'
      }
    })
  ]
}
With include
var config = {
  plugins: [
    new S3Plugin({
      // Only upload css and js
      include: /.*\.(css|js)/,
      // s3Options are required
      s3Options: {
        accessKeyId: process.env.AWS_ACCESS_KEY_ID,
        secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
      },
      s3UploadOptions: {
        Bucket: 'MyBucket'
      }
    })
  ]
}
Advanced include and exclude rules

include and exclude rules behave similarly to Webpack's loader options. In addition to a RegExp you can pass a function which will be called with the path as its first argument. Returning a truthy value will match the rule. You can also pass an Array of rules, all of which must pass for the file to be included or excluded.

import isGitIgnored from 'is-gitignored'

// Up to you how to handle this
var isPathOkToUpload = function(path) {
  return require('my-projects-publishing-rules').checkFile(path)
}

var config = {
  plugins: [
    new S3Plugin({
      // Only upload css and js and only the paths that our rules database allows
      include: [
        /.*\.(css|js)/,
        function(path) { return isPathOkToUpload(path) }
      ],

      // function to check if the path is gitignored
      exclude: isGitIgnored,

      // s3Options are required
      s3Options: {
        accessKeyId: process.env.AWS_ACCESS_KEY_ID,
        secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
      },
      s3UploadOptions: {
        Bucket: 'MyBucket'
      }
    })
  ]
}
With basePathTransform
import gitsha from 'gitsha'

var addSha = function() {
  return new Promise(function(resolve, reject) {
    gitsha(__dirname, function(error, output) {
      if(error)
        reject(error)
      else
        // resolve to the first 5 characters of the sha
        resolve(output.slice(0, 5))
    })
  })
}

var config = {
  plugins: [
    new S3Plugin({
      s3Options: {
        accessKeyId: process.env.AWS_ACCESS_KEY_ID,
        secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
      },
      s3UploadOptions: {
        Bucket: 'MyBucket'
      },
      basePathTransform: addSha
    })
  ]
}


// Will output to /${mySha}/${fileName}
With CloudFront invalidation
var config = {
  plugins: [
    new S3Plugin({
      s3Options: {
        accessKeyId: process.env.AWS_ACCESS_KEY_ID,
        secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
        sessionToken: 'a234jasd'  // (optional) AWS session token for signing requests
      },
      s3UploadOptions: {
        Bucket: 'MyBucket'
      },
      cloudfrontInvalidateOptions: {
        DistributionId: process.env.CLOUDFRONT_DISTRIBUTION_ID,
        Items: ["/*"]
      }
    })
  ]
}
With Dynamic Upload Options
var config = {
  plugins: [
    new S3Plugin({
      s3Options: {
        accessKeyId: process.env.AWS_ACCESS_KEY_ID,
        secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
      },
      s3UploadOptions: {
        Bucket: 'MyBucket',
        ContentEncoding(fileName) {
          if (/\.gz/.test(fileName))
            return 'gzip'
        },

        ContentType(fileName) {
          if (/\.js/.test(fileName))
            return 'application/javascript'
          else
            return 'text/plain'
        }
      }
    })
  ]
}

Options

  • exclude: A pattern to match excluded content against. Behaves similarly to webpack's loader configuration.
  • include: A pattern to match included content against. Behaves the same as exclude.
  • s3Options: Options passed to the S3 client (s3Config).
  • s3UploadOptions: Upload options passed to putObject.
  • basePath: Namespace (key prefix) for uploaded files on S3.
  • directory: A directory to upload. If not supplied, the js/css from the compilation is uploaded.
  • htmlFiles: HTML files to cdnize (defaults to all in the output directory).
  • cdnizerCss: Config for the CSS cdnizer; see below.
  • noCdnizer: Disable the cdnizer (defaults to true if no cdnizerOptions are passed).
  • cdnizerOptions: Options to pass to cdnizer.
  • basePathTransform: Transform the base path to add a folder name. Can return a promise or a string.
  • progress: Enable the progress bar (defaults to true).
  • priority: Upload priority, given as an array of regexes; files not matched by any regex are uploaded first (see the sketch below). Useful for avoiding S3 eventual-consistency issues.
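As a sketch of the priority option (the regexes are illustrative): files not matched by any regex go up first, so pages and scripts only upload after the assets they reference.

new S3Plugin({
  // Files not matched by any regex are uploaded first
  priority: [/\.js$/, /\.html$/],
  s3Options: {
    accessKeyId: process.env.AWS_ACCESS_KEY_ID,
    secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY
  },
  s3UploadOptions: {
    Bucket: 'MyBucket'
  }
})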

Contributing

All contributions are welcome. Please make a pull request and make sure things still pass after running npm run test. For tests you will need either the environment variables set or a .env file. There's a .env.sample, so you can cp .env.sample .env and fill it in. Make sure to add any new environment variables you introduce.

Commands to be aware of

WARNING: The test suite generates random files for certain checks. Make sure to delete any leftover files from your bucket.
  • npm run test - Run the test suite (you must have the .env file set up)
  • npm run build - Run the build

Thanks

  • Thanks to @Omer for fixing credentials from ~/.aws/credentials
  • Thanks to @lostjimmy for pointing out path.sep for Windows compatibility


s3-plugin-webpack's Issues

UnknownEndpoint error when trying to use plugin

  • [x] I have read and understood this plugin's README
  • [x] If filing a bug report, I have included my version of node and s3-plugin-webpack
  • [x] If filing a bug report, I have included which OS (including specific OS version) I am using.
  • [x] If filing a bug report, I have included a minimal test case that reproduces my issue.
  • [x] I understand this is an open-source project staffed by someone with a job and that any help I receive is done on free time. I know I am not entitled to anything and will be polite and courteous.
  • [x] I understand my issue may be closed if it becomes obvious I didn't actually perform all of these steps or the issue is not with the library itself

Issue Details

I have been trying to use this plugin, and although the S3 upload part works (I can confirm files are uploaded), I get an error and I don't see the invalidation show up in the AWS CloudFront Invalidations console for my CloudFront distribution.

I made sure to check that I have an IAM role with the CloudFrontFullAccess policy.

tl;dr files are uploaded to S3 as expected, but there is an UnknownEndpoint error and no invalidation is recorded in my CloudFront dashboard.

Environment

$ node -v
v6.9.4
$ yarn -v
yarn install v0.21.3
[1/4] 🔍  Resolving packages...
success Already up-to-date.
✨  Done in 1.15s.

Using "webpack-s3-plugin": "^1.0.0-rc.0" on MacOS Sierra 10.12.3

Steps to reproduce

Here is my webpack config

// webpack.config.release.js
const prodConfig = require('./webpack.config.prod');
const S3Plugin = require('webpack-s3-plugin');
const webpackMerge = require('webpack-merge');

module.exports = webpackMerge(prodConfig, {

  plugins: [
    new S3Plugin({
      s3Options: {
        accessKeyId: 'xxx',
        secretAccessKey: 'xxx',
        region: 'us-east-1'
      },
      s3UploadOptions: {
        Bucket: 'xxx'
      },
      cdnizerOptions: {
        defaultCDNBase: 'xxx'
      },
      cloudfrontInvalidateOptions: {
        DistributionId: 'xxx',
        Items: ['/index.html']
      }
    })
  ]

});

I run yarn run release

"release": "webpack --config ./webpack.config.release.js  --progress --profile --bail",

Issue

And this is the error I'm getting.

ERROR in S3Plugin: UnknownEndpoint: Inaccessible host: `s3.amazonaws.com'. This service may not be available in the `us-east-1' region.
Child html-webpack-plugin for "index.html":
        + 1 hidden modules
Child favicons-webpack-plugin for "icons/stats.json":
               Asset     Size  Chunks             Chunk Names
    icons/stats.json  5.63 kB       0  [emitted]
        + 1 hidden modules
Uploading [>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>] 100% 0.0s
✨  Done in 605.86s.

Thanks for making this plugin, and let me know if you need any more information!

Delete previous builds

Is there any way to delete previous files? I use hashes in file names and invalidate only index.html, but every time webpack generates a new hash value I get duplicated files in my S3 bucket. Is there any way to avoid that?

When using cdnizerOptions and file-loader, HTML is updated after upload to S3

Using the configuration below, the index.html file is updated properly by cdnizer, but the file is uploaded to S3 before cdnizer completes its path replacement.

new S3Plugin({
      directory: 'dist/',
      s3Options: {
        accessKeyId: process.env.AWS_ACCESS_KEY_ID,
        secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
        region: 'us-east-1'
      },
      cdnizerOptions: {
        defaultCDNBase: '/' + hash,
        files: [ '/app.js' ]
      },
      s3UploadOptions: {
        Bucket: 'mobile.' + process.env.TARGET + '.blah.com'
      }
    })

`webpack-contrib` transfer requirements

Step 1.

The license file needs to be updated as part of the transfer into webpack-contrib.

The current copyright holder ( @MikaAK ) needs to commit the copyright change to act as "signing over" the project to webpack-contrib, which is part of the JS Foundation.

The LICENSE needs to be named LICENSE ( no extension ) if it isn't already.

The LICENSE needs to have the following copy.

DO NOT change the license prop in the package.json

Copyright JS Foundation and other contributors

Permission is hereby granted, free of charge, to any person obtaining
a copy of this software and associated documentation files (the
'Software'), to deal in the Software without restriction, including
without limitation the rights to use, copy, modify, merge, publish,
distribute, sublicense, and/or sell copies of the Software, and to
permit persons to whom the Software is furnished to do so, subject to
the following conditions:

The above copyright notice and this permission notice shall be
included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND,
EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

Step 2.

@bebraw & @d3viant0ne need to be added as NPM owners.

Step 3.

As you cannot transfer a repo into an organization unless you have admin-level access to it, once the above two steps have been satisfied the repo needs to be transferred to me, and I will transfer it into webpack-contrib.

Sync / Clear bucket before uploading?

I have a really quick question: is there an option to sync the assets, e.g. clean the S3 bucket before I upload? Just curious how you might handle asset revving to clear out old assets and keep the bucket clean. Sorry if I missed something obvious, and thanks again for your wonderful work on this. Using it in production with great success!

Cheers.

Please complete these steps and check these boxes (by putting an x inside
the brackets) before filing your issue:

  • [x] I have read and understood this plugin's README
  • [x] If filing a bug report, I have included my version of node and s3-plugin-webpack
  • [x] If filing a bug report, I have included which OS (including specific OS version) I am using.
  • [x] If filing a bug report, I have included a minimal test case that reproduces my issue.
  • [x] I understand this is an open-source project staffed by someone with a job and that any help I receive is done on free time. I know I am not entitled to anything and will be polite and courteous.
  • [x] I understand my issue may be closed if it becomes obvious I didn't actually perform all of these steps or the issue is not with the library itself

Thank you for adhering to this process! This ensures that I can pay attention to issues that are relevant and answer questions faster.

cdnizerOptions error

I've been trying to use this with both the options on this page and with the following:

new S3Plugin({
  // s3Options are required
  s3Options: {
    accessKeyId: 'ACCESSKEY',
    secretAccessKey: 'secretAccessKey',
  },
  s3UploadOptions: {
    Bucket: 'myBucket',
    ACL: 'public-read'
  }
})

In both cases I get an error

    if (!this.cdnizerOptions.files) this.cdnizerOptions.files = [];
                            ^
TypeError: Cannot read property 'files' of undefined
    at new S3Plugin (c:\Users\Fredrik\Documents\LBC\react\twitch-visualizer\node_modules\webpack-s3-plugin\dist\s3_plugin.js:96:29)
    at Object.<anonymous> (c:\Users\Fredrik\Documents\LBC\react\twitch-visualizer\webpack-production.config.js:30:5)
    at Module._compile (module.js:456:26)
    at Object.Module._extensions..js (module.js:474:10)
    at Module.load (module.js:356:32)
    at Function.Module._load (module.js:312:12)
    at Module.require (module.js:364:17)
    at require (module.js:380:17)
    at module.exports (c:\Users\Fredrik\Documents\LBC\react\twitch-visualizer\node_modules\webpack\bin\convert-argv.js:80:13)
    at Object.<anonymous> (c:\Users\Fredrik\Documents\LBC\react\twitch-visualizer\node_modules\webpack\bin\webpack.js:54:40)
    at Module._compile (module.js:456:26)
    at Object.Module._extensions..js (module.js:474:10)
    at Module.load (module.js:356:32)
    at Function.Module._load (module.js:312:12)
    at Function.Module.runMain (module.js:497:10)
    at startup (node.js:119:16)

npm ERR! [email protected] prod: `webpack -p --config webpack-production.config.js --progress --colors`
npm ERR! Exit status 8
npm ERR!
npm ERR! Failed at the [email protected] prod script.
npm ERR! This is most likely a problem with the Twitch-Visualizer package,
npm ERR! not with npm itself.
npm ERR! Tell the author that this fails on your system:
npm ERR!     webpack -p --config webpack-production.config.js --progress --colors
npm ERR! You can get their info via:
npm ERR!     npm owner ls Twitch-Visualizer
npm ERR! There is likely additional logging output above.
npm ERR! System Windows_NT 6.2.9200
npm ERR! command "c:\\Program Files\\nodejs\\node.exe" "c:\\Program Files\\nodejs\\node_modules\\npm\\bin\\npm-cli.js" "run" "prod"
npm ERR! cwd c:\Users\Fredrik\Documents\LBC\react\twitch-visualizer
npm ERR! node -v v0.10.32
npm ERR! npm -v 1.4.28
npm ERR! code ELIFECYCLE
npm ERR!
npm ERR! Additional logging details can be found in:
npm ERR!     c:\Users\Fredrik\Documents\LBC\react\twitch-visualizer\npm-debug.log
npm ERR! not ok code 0

S3Plugin: AccessDenied: Access Denied Error


Please complete these steps and check these boxes (by putting an x inside
the brackets) before filing your issue:

  • [x] I have read and understood this plugin's README
  • [x] If filing a bug report, I have included my version of node and s3-plugin-webpack
  • [x] If filing a bug report, I have included which OS (including specific OS version) I am using.
  • [x] If filing a bug report, I have included a minimal test case that reproduces my issue.
  • [x] I understand this is an open-source project staffed by someone with a job and that any help I receive is done on free time. I know I am not entitled to anything and will be polite and courteous.
  • [x] I understand my issue may be closed if it becomes obvious I didn't actually perform all of these steps or the issue is not with the library itself

Thank you for adhering to this process! This ensures that I can pay attention to issues that are relevant and answer questions faster.

Issue Details

The plugin reports an Access Denied error if the bucket has no Upload/Delete permission for "Everyone". No other permission setting seems to work.

Node version 4.5.0
s3-plugin-webpack version 0.9.2
OS version Windows 8.1
If filing a bug report, please include a list of steps that describe how to
reproduce the bug you are experiencing. Include your config being passed to the S3Plugin.

  1. Create a new bucket.
  2. Ensure the bucket has no permissions other than "Me".
  3. Create an IAM user with all permissions on this new bucket.
  4. Try uploading a file using the s3-plugin-webpack plugin; the error should appear on the console.

SigV4 error issue

SigV4 Policy

My bucket is in ap-northeast-2, which requires SigV4.
When I build, it cannot upload to S3, with this message:

ERROR in S3Plugin: Error: Non-file stream objects are not supported with SigV4 in AWS.S3

Do you have plans to add an option for SigV4?

My environment is below:

Node v4.2.3
[email protected]
OSX 10.10.5

Example config

module.exports = {
  plugins: [
    new S3Plugin({
      // s3Options are required
      s3Options: {
        accessKeyId: process.env.AWS_ACCESS_KEY_ID,
        secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
        region: 'ap-northeast-2'
      },
      s3UploadOptions: {
        Bucket: 'my'
      }
    })
  ]
};

When using an IAM policy, I get ERROR in S3Plugin: AccessDenied: Access Denied

Thanks so much for the hard work @MikaAK !

I'm on Mac OS with a config looking like this:

config.plugins.push(new S3Plugin({
  s3Options: {
    accessKeyId: process.env.AWS_ACCESS_KEY_ID,
    secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
    region: 'us-east-1' // Bucket is US Standard
  },
  s3UploadOptions: {
    Bucket: 'my-bucket-name'
  },
  basePath: 'builds'
}))

However, those keys are not root AWS credentials; instead they're attached to a group with this resource policy:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "s3:ListAllMyBuckets",
            "Resource": "arn:aws:s3:::*"
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::my-bucket-name/*",
                "arn:aws:s3:::my-bucket-name"
            ]
        },
        {
            "Effect": "Allow",
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::my-bucket-name/*",
                "arn:aws:s3:::my-bucket-name"
            ]
        }
    ]
}

The upload throws an AccessDenied error, like so:

ERROR in S3Plugin: AccessDenied: Access Denied
Child html-webpack-plugin for "index.html":
        + 3 hidden modules
Uploading [>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆] 47% 4.3s%

Am I missing something simple? I use S3 a lot, and I use that policy for almost every S3 "uploader" config I have.

Thank you so much!

Cannot completely build and no error shown

After building, I got these messages in my console:

> cross-env NODE_ENV=production webpack --config internals/webpack/webpack.prod.babel.js --color -p

(node:6731) fs: re-evaluating native module sources is not supported. If you are using the graceful-fs module, please update it to a more recent version.
Uploading [>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆∆] 43% 1.0s [816] multi main 28 bytes {0} [built]
    + 1237 hidden modules
Uploading [>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>] 100% 0.0s

npm ERR! Darwin 14.5.0
npm ERR! argv "/usr/local/Cellar/node/6.2.2/bin/node" "/usr/local/bin/npm" "run" "build"
npm ERR! node v6.2.2
npm ERR! npm  v3.9.5
npm ERR! code ELIFECYCLE
npm ERR! [email protected] build: `cross-env NODE_ENV=production webpack --config internals/webpack/webpack.prod.babel.js --color -p`
npm ERR! Exit status 2
npm ERR!
npm ERR! Failed at the [email protected] build script 'cross-env NODE_ENV=production webpack --config internals/webpack/webpack.prod.babel.js --color -p'.

Not sure what's going on under the hood; there is no error log showing.

Region always set to us-west-2 when using AWS best practices

  • I have read and understood this plugin's README
  • If filing a bug report, I have included my version of node and s3-plugin-webpack
  • If filing a bug report, I have included which OS (including specific OS
    version) I am using.
  • If filing a bug report, I have included a minimal test case that reproduces
    my issue.
  • I understand this is an open-source project staffed by someone with a job and
    that any help I receive is done on free time. I know I am not entitled to anything and will be polite and courteous.
  • I understand my issue may be closed if it becomes obvious I didn't
    actually perform all of these steps or the issue is not with the library itself

Issue Details

I don't think these are relevant, but to knock off your checklist super quick:

node version: v5.0.0
s3-plugin-webpack: "webpack-s3-plugin": "^0.9.0"
OS: MacOS 10.12.1 (16B2555)

The README outlines that this plugin supports AWS best practices for credentials. I am experiencing an issue with this that can be reproduced with the following:

  1. Create a set of credentials in AWS.
  2. Create an S3 bucket in the "Standard" (us-east-1) S3 region.
  3. Update your credentials file with:

[default]
region = us-east-1
aws_access_key_id = YOUR_KEY_ID
aws_secret_access_key = YOUR_ACCESS_KEY

  4. Configure the S3 plugin without s3Options.

When doing this, I receive the following error:

ERROR in S3Plugin: PermanentRedirect: The bucket you are attempting to access must be addressed using the specified endpoint. Please send all future requests to this endpoint.

If I create a bucket in us-west-2, update my configuration to point at it, and repeat the test, I receive no error message.

If I specify s3Options with us-east-1, it works.

The problem is the result of

s3Options: _.merge({}, DEFAULT_S3_OPTIONS, s3Options)

Since s3Options is empty when credentials are provided via AWS best practices, your defaults always make it into the client.
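A quick way to see the behavior (a sketch assuming DEFAULT_S3_OPTIONS carries the us-west-2 region, as the error suggests): lodash's merge keeps keys from the defaults object when the user-supplied object is empty.

var _ = require('lodash')

// assumed shape, for illustration only
var DEFAULT_S3_OPTIONS = {region: 'us-west-2'}

// with no user-supplied s3Options, the default region survives the merge
console.log(_.merge({}, DEFAULT_S3_OPTIONS, {}))
// => { region: 'us-west-2' }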

cdnizerOptions.defaultCDNBasePath config not working when output.publicPath is defined


Please complete these steps and check these boxes (by putting an x inside
the brackets) before filing your issue:

  • I have read and understood this plugin's README
  • If filing a bug report, I have included my version of node and s3-plugin-webpack
  • If filing a bug report, I have included which OS (including specific OS
    version) I am using.
  • If filing a bug report, I have included a minimal test case that reproduces
    my issue.
  • I understand this is an open-source project staffed by someone with a job and
    that any help I receive is done on free time. I know I am not entitled to anything and will be polite and courteous.
  • I understand my issue may be closed if it becomes obvious I didn't
    actually perform all of these steps or the issue is not with the library itself

Thank you for adhering to this process! This ensures that I can pay attention to issues that are relevant and answer questions faster.

Issue Details

I noticed that when output.publicPath is defined in a webpack configuration, cdnizerOptions.defaultCDNBasePath is not reflected in the generated index.html.

Environment

$ node -v
v6.9.1
  • OS: Mac OSX 10.12.4
  • Webpack S3 Plugin: ^1.0.0-rc.0

Webpack Config

const ExtractTextPlugin = require('extract-text-webpack-plugin');
const HtmlWebpackPlugin = require('html-webpack-plugin');
const path = require('path');
const webpack = require('webpack');
const S3Plugin = require('webpack-s3-plugin');

module.exports = {
  context: path.resolve(__dirname, 'src'),

  entry: {
    index: './index.js',
    vendor: './vendor.js'
  },

  output: {
    path: path.join(__dirname, '/dest/web/abc'),
    publicPath: '/abc',
    filename: '[name].[chunkhash].bundle.js',
    sourceMapFilename: '[name].[chunkhash].bundle.map'
  },

  module: {
    rules: [{
      test: /\.js$/,
      exclude: /node_modules/,
      use: [{
        loader: 'babel-loader',
        options: {
          presets: [
            ['es2015', { modules: false }]
          ]
        }
      }, {
        loader: 'eslint-loader',
        options: {
          failOnWarning: false,
          failOnError: true
        }
      }]
    }, {
      test: /\.html$/,
      exclude: path.join(__dirname, './src/index.html'),
      use: [{
        loader: 'html-loader'
      }]
    }, {
      test: /\.less$/,
      use: ExtractTextPlugin.extract({
        fallback: 'style-loader',
        use: ['css-loader', 'less-loader']
      })
    }, {
      test: /\.(jpg|png|gif)$/,
      use: [{
        loader: 'file-loader'
      }]
    }, {
      test: /\.woff(2)?(\?v=[0-9]\.[0-9]\.[0-9])?$/,
      loader: 'url-loader?limit=10000&mimetype=application/font-woff'
    }, {
      test: /\.(ttf|eot|svg)(\?v=[0-9]\.[0-9]\.[0-9])?$/,
      loader: 'file-loader'
    }]
  },

  plugins: [
    new webpack.optimize.CommonsChunkPlugin({
      name: ['vendor']
    }),

    new ExtractTextPlugin('[name].[chunkhash].style.css'),

    new HtmlWebpackPlugin({
      template: './index.html',
      chunksSortMode: 'dependency'
    }),

    new S3Plugin({
      s3Options: {
        accessKeyId: process.env.AWS_ACCESS_KEY_ID,
        secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
        region: 'xxx'
      },
      s3UploadOptions: {
        Bucket: 'xxx'
      },
      cdnizerOptions: {
        defaultCDNBase: 'http://abc123.cloudfront.net'
      },
      cloudfrontInvalidateOptions: {
        DistributionId: 'xxx',
        Items: ['/index.html']
      }
    })
  ]

};

Generated index.html

<!DOCTYPE html>

<html lang="en">

  <head>
    <base href="/">

  <link href="/abc/index.61ccaa401863e6e0122e.style.css" rel="stylesheet"></head>

  <body ng-app="ne">

    <ne-bootstrap>Loading...</ne-bootstrap>

  <script type="text/javascript" src="/abc/vendor.a42ee6ffe37f88d1821d.bundle.js"></script><script type="text/javascript" src="/abc/index.61ccaa401863e6e0122e.bundle.js"></script></body>

</html>

Expected

/abc/ would be replaced with the value of defaultCDNBasePath

Note: if the value of output.publicPath is an empty string, the expected behavior is seen

Files other than css/js

My webpack build actually produces images, stylesheets, JavaScript and HTML; however, with this config:

new S3Plugin({
        include: /.*/,
        s3Options: {
          accessKeyId: process.env.AWS_ACCESS_KEY_ID,
          secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
          region: process.env.REGION
        },
        s3UploadOptions: {
          Bucket: process.env.BUCKET,
          ACL: 'public-read'
        }
      })

it uploads only css and js files. Is there something wrong with what I'm doing?

ERROR in S3Plugin: NoSuchDistribution: The specified distribution does not exist. with AWS PROFILE

I get the following error when I'm using AWS_DEFAULT_PROFILE as input.

 ERROR in S3Plugin: NoSuchDistribution: The specified distribution does not exist.

It does not happen when I use AWS access keys, and it only happens for CloudFront invalidation. The normal S3 upload works fine with the env var AWS_DEFAULT_PROFILE.

So I reckon it's

invalidateCloudfront() {
     :

    cloudfront.config.update({
      accessKeyId: clientConfig.s3Options.accessKeyId,
      secretAccessKey: clientConfig.s3Options.secretAccessKey,
    })

     :

That breaks it. I think this should only be performed if accessKeyId and secretAccessKey are given, and not when an AWS profile is used.
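A minimal sketch of that guard, reusing the names from the snippet above:

// only override the SDK-resolved credentials when keys were passed explicitly
if (clientConfig.s3Options.accessKeyId && clientConfig.s3Options.secretAccessKey) {
  cloudfront.config.update({
    accessKeyId: clientConfig.s3Options.accessKeyId,
    secretAccessKey: clientConfig.s3Options.secretAccessKey,
  })
}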

Recursive support?

I'm trying to upload the contents of an entire directory (public/ in my case), which contains subfolders for my css, js etc.

I'm getting the following error:

95% emit
fs.js:844
  return binding.lstat(pathModule._makeLong(path));
                                  ^
Error: ENOENT: no such file or directory, lstat 'bootstrap.min.css'

Errors do not make builds fail

For example: if the requested bucket does not exist, the error is logged to the console, but the build does not fail as it should; it returns success (exit code 0):

ERROR in S3Plugin: NoSuchBucket: The specified bucket does not exist
Child html-webpack-plugin for "index.html":
        + 3 hidden modules
Uploading [>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>] 100% 0.0s

I also see that the output is out of order from what I expect, with upload progress text appearing in seemingly random spots in the log.

Environment:

  • Node v6.2.0
  • s3_plugin_webpack v1.0.3
  • OSX 10.11.5

Example config:

const webpackMerge = require('webpack-merge');
const commonConfig = require('./webpack.prod.js');
const S3Plugin = require('webpack-s3-plugin');

module.exports = webpackMerge(commonConfig, {
  plugins: [
    new S3Plugin({
      exclude: /^.*\.(map|map\.gz)$/,
      s3Options: {
        accessKeyId: process.env.AWS_ACCESS_KEY_ID,
        secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
        region: 'us-west-2'
      },
      s3UploadOptions: {
        Bucket: 'something-that-doesnt-exist'
      },
      cloudfrontInvalidateOptions: {
        DistributionId: process.env.CLOUDFRONT_DISTRIBUTION_ID,
        Items: ['/*']
      }
    })
  ]
});

ContentEncoding fail for ico file

Issue Details

Hi,

thank you for your great work on this plugin!
I have an issue, but I don't really know exactly where it fails.
The problem is that some extensions do not accept ContentEncoding.

Environment

Here is my environment:

  • Docker (Linux 4.9.4-moby x86_64) running on Mac OS 10.12.2
  • Node v6.9.3
  • npm 3.10.10

Plugin information: v0.9.2

Configuration

The plugin configuration is the following one:

common.plugins.push(
    new S3Plugin({
      include: /.*\.(css|js|svg|png|jpe?g|gif|ico|pdf)/,

      s3Options: s3Options,
      s3UploadOptions: {
        Bucket: siteConfiguration.parameters.amazon_bucket,
        /*
         * Images cache-control: 1 day
         * Other resources cache-control: 1 week
         * Data resources cache-control: 1 month with revalidation
         * Scripts and styles cache-control: 1 year
         */
        CacheControl(filename) {
          var cacheControl = configuration.s3CacheControl.misc;
          if (
            configuration.s3CacheControl.imagesExtensions
              .indexOf(path.extname(filename)) >= 0
          ) {
            cacheControl = configuration.s3CacheControl.images;
          } else if (
            configuration.s3CacheControl.noCacheExtensions
              .indexOf(path.extname(filename)) >= 0
          ) {
            cacheControl = configuration.s3CacheControl.noCache;
          } else if (/\.js/.test(filename) || /\.css/.test(filename)) {
            cacheControl = configuration.s3CacheControl.stylesAndScripts;
          }

          return cacheControl;
        },

        ContentEncoding(filename) {
          // gzip only css and js files
          if (
            /\.js/.test(filename) ||
            /\.css/.test(filename) ||
            /\.ico/.test(filename)
          ) {
            return 'gzip';
          }
        },
      },

      basePathTransform: function () {
        return configuration.s3AssetPath;
      },

      cloudfrontInvalidateOptions:
        siteConfiguration.parameters.amazon_cloudfront_distribution_id ?
        {
          DistributionId: siteConfiguration.parameters.amazon_cloudfront_distribution_id,
          Items: [`/${configuration.s3AssetPath}*`],
        } :
        {},
    })
  );

Debug

I launch webpack as usual, and everything worked for me until I decided to gzip the favicon file.
I added a console.log inside the ContentEncoding condition, and the output is:

favicon.ico
bundles/mybundle/core.js
bundles/mybundle/helper.js
bundles/mybundle/layout.css
bundles/mybundle/third-party.js
application.js
assets.js
bundler.js
style.css

When I check the S3 console, the metadata is not filled for the favicon but is filled for both css and js files.


Do you need more information?

Thank you

@iGitScor

cdnizerCss option is not working


Please complete these steps and check these boxes (by putting an x inside
the brackets) before filing your issue:

  • I have read and understood this plugin's README
  • If filing a bug report, I have included my version of node and s3-plugin-webpack
  • If filing a bug report, I have included which OS (including specific OS
    version) I am using.
  • If filing a bug report, I have included a minimal test case that reproduces
    my issue.
  • I understand this is an open-source project staffed by someone with a job and
    that any help I receive is done on free time. I know I am not entitled to anything and will be polite and courteous.
  • I understand my issue may be closed if it becomes obvious I didn't
    actually perform all of these steps or the issue is not with the library itself

Thank you for adhering to this process! This ensures that I can pay attention to issues that are relevant and answer questions faster.

Issue Details

Hello,

I'm not able to use the cdnizerCss option; it is not working at all. Paths for images and fonts remain exactly the same in my css file.

Thanks.

new S3Plugin({
            include: /.*\.(eot|svg|ttf|woff|woff2|otf)/,
            s3Options: {
                accessKeyId: process.env.AWS_ACCESS_KEY_ID,
                secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
                region: process.env.AWS_REGION
            },
            s3UploadOptions: {
                Bucket: 'my-bucket'
            },
            basePathTransform: addSha,
            cdnizerCss: {
                test: /fonts/,
                cdnUrl: "http://test.com"
            },
            cdnizerOptions: {
                defaultCDNBase: 'http://asdf.ca'
              }
        }),

Pass "compilation" object into "basePathTransform"

  • I have read and understood this plugin's README
  • If filing a bug report, I have included my version of node and s3-plugin-webpack
  • If filing a bug report, I have included which OS (including specific OS
    version) I am using.
  • If filing a bug report, I have included a minimal test case that reproduces
    my issue.
  • I understand this is an open-source project staffed by someone with a job and
    that any help I receive is done on free time. I know I am not entitled to anything and will be polite and courteous.
  • I understand my issue may be closed if it becomes obvious I didn't
    actually perform all of these steps or the issue is not with the library itself

Issue Details

It would be nice to have the ability to access the compilation object inside the basePathTransform function.

It would be very useful for fields such as compilation.fullHash and compilation.hash:

new S3Plugin({
    basePath: 'test',
    basePathTransform: (basePath, compilation) => {
        return basePath + compilation.fullHash;
    },
    s3Options: {
        accessKeyId: process.env.AWS_ACCESS_KEY_ID,
        secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
        region: 'us-west-1'
    },
    s3UploadOptions: {
        Bucket: 'MyBucket'
    }
})

As an alternative, you can use https://github.com/webpack/loader-utils#interpolatename to interpolate basePath:

new S3Plugin({
    basePath: 'test/[hash]',
    basePathTransform: basePath => {
        return basePath + '[hash]';
    },
    s3Options: {
        accessKeyId: process.env.AWS_ACCESS_KEY_ID,
        secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
        region: 'us-west-1'
    },
    s3UploadOptions: {
        Bucket: 'MyBucket'
    }
})

js isn't being uploaded

It's uploading css but not js. Please downgrade to 0.6.7 momentarily while this is being fixed. Not sure how the tests passed this!

Upgrading to 0.4.8 causes error

After upgrading to 0.4.8, I get the following error during webpack compilation:

ERROR in S3Plugin: Error: Invalid or empty files list supplied

We had to roll back to 0.4.5 to get things working.

basePathTransform not accepting string

  • I have read and understood this plugin's README
  • If filing a bug report, I have included my version of node and s3-plugin-webpack
  • If filing a bug report, I have included which OS (including specific OS
    version) I am using.
  • If filing a bug report, I have included a minimal test case that reproduces
    my issue.
  • I understand this is an open-source project staffed by someone with a job and
    that any help I receive is done on free time. I know I am not entitled to anything and will be polite and courteous.
  • I understand my issue may be closed if it becomes obvious I didn't
    actually perform all of these steps or the issue is not with the library itself

Issue Details

node v7.3.0, s3-plugin-webpack v0.9.2, Mac OS 10.12.2

First of all: thanks for writing this great plugin 👍. Much appreciated :).

When passing a string as basePathTransform, it does not work, since a function is expected.

var dateFormat = require('dateformat');
new S3Plugin({
    exclude: /.*\.scss$/,
    s3Options: {
        accessKeyId: '...',
        secretAccessKey: '...',
        region: '...'
    },
    s3UploadOptions: {
        Bucket: '...'
    },
    basePathTransform: dateFormat(new Date(), 'yyyy-mm-dd-HH-ss'),
});

When I rewrite the dateFormat part as a promise, it works:

var timestamp = function() {
  return new Promise(function(resolve, reject) {
    resolve(dateFormat(new Date(), 'yyyy-dd-HH-mm-ss'));
  })
}

I was expecting a string to be okay, since the documentation says:

basePathTransform: transform the base path to add a folder name. Can return a promise or a string

But perhaps I misunderstood :)
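For what it's worth, the docs seem to mean that the transform function may return a string, so a plain function wrapper (no promise needed) should also work; a sketch:

var dateFormat = require('dateformat')

new S3Plugin({
  // ...other options as above...
  // basePathTransform must be a function; it may return a string or a promise
  basePathTransform: function() {
    return dateFormat(new Date(), 'yyyy-mm-dd-HH-ss')
  }
})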

Alpine Linux - doesn't run

Hey,

Great work on this. I am using it in a project inside a Docker container on top of Alpine Linux, and this plugin doesn't run as part of my build.

Any ideas why?

Regards

CDNizer + image requires + Uglify

Hi, I'm having a hard time getting CDNizer set up in a way that makes sense. Since this is webpack, a lot of my images (all of them, probably) are going to be translated to requires by webpack, so they won't match the default matchers in CDNizer (which look for CSS urls, img tags, etc.). That in itself is no big deal; I started setting up a custom matcher, but then ran into an issue. This plugin runs CDNizer at the 'after-emit' plugin point, while the UglifyJS plugin does its work on 'optimize-chunk-assets', so CDNizer ends up doing find/replace on source that has already been uglified (assuming you're using the Uglify plugin). If I were to write a custom matcher for the images I'm requiring, I'd have to write a pattern to match what Uglify spits out, and that seems fragile; I'm not confident Uglify is always going to give me e.exports=n.p, for example.

Since image requires and the Uglify plugin are in pretty widespread use, this seems like a significant issue. Wouldn't it be preferable for CDNizer to work on module source at some point before any post-processing occurs? Or is there something I'm missing: can CDNizer somehow be used with image requires and Uglify without issues?

Add ability to disable "progress"

Please complete these steps and check these boxes (by putting an x inside
the brackets) before filing your issue:

  • I have read and understood this plugin's README
  • If filing a bug report, I have included my version of node and s3-plugin-webpack
  • If filing a bug report, I have included which OS (including specific OS
    version) I am using.
  • If filing a bug report, I have included a minimal test case that reproduces
    my issue.
  • I understand this is an open-source project staffed by someone with a job and
    that any help I receive is done on free time. I know I am not entitled to anything and will be polite and courteous.
  • I understand my issue may be closed if it becomes obvious I didn't
    actually perform all of these steps or the issue is not with the library itself

Issue Details

It would be nice to be able to disable progress via options:

new S3Plugin({
    progress: false,
    // or
    silent: true,
})

Adjusting file names when uploading to S3

Hi,
Is it possible to adjust file name when uploading to S3?
My current setup is this:

        new CompressionPlugin({
            asset: "[path]",  // "[path].gz[query]"
            algorithm: "gzip",
            test:  /\.(js|css|map)$/,
            // threshold: 10240,
            minRatio: 10  // 0.8
        }),

        new S3Plugin({
            s3Options: {
                accessKeyId: 'xxx',
                secretAccessKey: 'xxx',
                region: 'xxx'
            },
            s3UploadOptions: {
                Bucket: 'xxx',
                ContentEncoding(fileName) {
                    if (/\.(js|css|map)$/.test(fileName)) {
                        return 'gzip';
                    }
                }
            },
            cloudfrontInvalidateOptions: {
                DistributionId: 'xxx',
                Items: ["/*"]
            }
        }),

As you can see, the CompressionPlugin overwrites .js, .css and .map files with their compressed versions. Ideally, I would like it to retain the original uncompressed files but skip them when uploading to S3, uploading .js.gz as .js instead, etc. Is this possible with your plugin?
Thanks!

Version 0.9.1 is missing in registry.npmjs.org

Hi, and thanks for submitting a fix for cache invalidation, but npm can't find the new version :(

I looked at https://www.npmjs.com/package/webpack-s3-plugin and 0.9.1 is visible, and you have published 0.9.1 in this repo, but https://registry.npmjs.org/webpack-s3-plugin still says 0.9.0 is the latest (even though 0.9.1 has a timestamp in the json file).

So I'm guessing the publish of 0.9.1 somehow failed; could you please try to republish the 0.9.1 version?

/BR
Erik

Content-Type not being set correctly

I have set the Content-Type option as follows, but in S3 it is being set as application/x-www-form-urlencoded:

ContentType: (fileName) => {
          if (/\.js/.test(fileName)) {
            return 'application/javascript'
          }

          if (/\.css/.test(fileName)) {
            return 'text/css'
          }
}

Environment:

  • Node v4.4.7
  • s3-plugin-webpack v0.9.0
  • OSX 10.11.6

Is this an issue with the s3 package itself?

TypeError: Cannot read property 'getAllFilesRecursive' of undefined

Getting the following error using the latest version:

95% emit/Users/Dids/Documents/React/<snip>/node_modules/webpack-s3-plugin/dist/s3_plugin.js:190
                _this2.getAllFilesRecursive(file).then(function (res) {
                      ^

TypeError: Cannot read property 'getAllFilesRecursive' of undefined
    at /Users/Dids/Documents/React/<snip>/node_modules/webpack-s3-plugin/dist/s3_plugin.js:190:23
    at FSReqWrap.oncomplete (fs.js:82:15)

Use AWS CredentialProviderChain for S3 authentication

Hi!

The Amazon JS SDK, https://www.npmjs.com/package/aws-sdk, has a few different ways of letting the developer provide credentials when using their SDK: the ~/.aws/credentials file, the current profile, or environment variables (http://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3.html).

It would be cool if this module also enabled this, as a fallback when the key and secret key have not been explicitly set. The s3 module uses aws-sdk itself, so that could potentially work.

For me it is much more convenient to keep my credentials where they are, in ~/.aws/credentials, where AWS's own tools also look for them.
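For reference, an untested sketch of resolving credentials through the SDK's default chain and handing the result to the plugin (this assumes s3Options accepts a credentials object, as it does for SharedIniFileCredentials):

var AWS = require('aws-sdk')

// walks the default providers: environment variables, shared ini file, etc.
new AWS.CredentialProviderChain().resolve(function(err, credentials) {
  if (err) throw err
  // credentials could then be passed as s3Options: {credentials: credentials}
})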

Error: Cannot find module './helpers'

Getting this error whenever I put the require path in a webpack config.

Running Windows 8

Error: Cannot find module './helpers'
    at Function.Module._resolveFilename (module.js:337:15)
    at Function.Module._load (module.js:287:25)
    at Module.require (module.js:366:17)
    at require (module.js:385:17)
    at Object.<anonymous> (C:\Repositories\Clients\VolunteerLegacy\webhost\VolunteerWidgetv3\node_modules\webpack-s3-plugin\dist\s3_plugin.js:110:16)
    at Module._compile (module.js:435:26)
    at Object.Module._extensions..js (module.js:442:10)
    at Module.load (module.js:356:32)
    at Function.Module._load (module.js:311:12)
    at Module.require (module.js:366:17)
    at require (module.js:385:17)
    at Object.<anonymous> (C:\Repositories\Clients\VolunteerLegacy\webhost\VolunteerWidgetv3\webpack.deploy.config.js:8:16)
    at Module._compile (module.js:435:26)
    at Object.Module._extensions..js (module.js:442:10)
    at Module.load (module.js:356:32)
    at Function.Module._load (module.js:311:12)

Where line 8 is: var S3Plugin = require('webpack-s3-plugin');

use of String.prototype.endsWith not working in node 0.12.10

Getting "TypeError: undefined is not a function" when trying to use the plugin with node 0.12.10.

node_modules/webpack-s3-plugin/dist/s3_plugin.js:170
      return fPath.endsWith(PATH_SEP) ? fPath : fPath + PATH_SEP;

TypeError: undefined is not a function
    at S3Plugin.addSeperatorToPath (node_modules/webpack-s3-plugin/dist/s3_plugin.js:170:20)

Apparently endsWith isn't available in the version of the V8 engine (3.28.71.19) that node 0.12.10 uses, or so I presume.

To improve backwards compatibility, would it be possible to use lodash's endsWith method in the several places in the s3-plugin-webpack codebase where String.prototype.endsWith (and startsWith) is used? Lodash is already included, so this should be a very minor change. Happy to submit a PR if you're okay with this.
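A sketch of the swap (PATH_SEP stands in for the plugin's constant):

var _ = require('lodash')
var PATH_SEP = require('path').sep

function addSeperatorToPath(fPath) {
  // _.endsWith works on engines without String.prototype.endsWith
  return _.endsWith(fPath, PATH_SEP) ? fPath : fPath + PATH_SEP
}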

Thanks!

Document that s3-plugin-webpack uploads objects with a public-read ACL and overrides bucket policies

  • I have read and understood this plugin's README
  • If filing a bug report, I have included my version of node and s3-plugin-webpack
  • If filing a bug report, I have included which OS (including specific OS
    version) I am using.
  • If filing a bug report, I have included a minimal test case that reproduces
    my issue.
  • I understand this is an open-source project staffed by someone with a job and
    that any help I receive is done on free time. I know I am not entitled to anything and will be polite and courteous.
  • I understand my issue may be closed if it becomes obvious I didn't
    actually perform all of these steps or the issue is not with the library itself

Issue Details

Default options set public-read ACL on every uploaded object: https://github.com/MikaAK/s3-plugin-webpack/blob/master/src/helpers.js#L10.

Object ACLs override S3 bucket policies unless those policies explicitly "Deny" (see https://docs.aws.amazon.com/AmazonS3/latest/dev/access-control-auth-workflow-object-operation.html). This pretty much opens the bucket to the world regardless of the policy users have set, and it is mentioned nowhere. The AWS bucket policy examples (https://docs.aws.amazon.com/AmazonS3/latest/dev/example-bucket-policies.html) mostly provide "Allow" examples, so this likely creates security holes in buckets using default policies from the docs.

Backslash prepended to uploaded file name

After upgrading to 0.6.7 I've noticed that the file name is prepended with a backslash. Up to 0.6.6 the uploaded file was correctly named in my bucket as bundle.js, whereas now the uploaded file name reads \bundle.js.

This is how I've got the plugin configured in my webpack.config.js:

new S3Plugin({
  include: /bundle.js/,
  s3Options: {
    accessKeyId: process.env.AWS_ACCESS_KEY_ID,
    secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
    region: 'us-east-1'
  },
  s3UploadOptions: {
    Bucket: 'dev.bucket'
  },
  basePath: 'some/path/dev',
  directory: path.join(__dirname, 'dist')
})

I'm running node v5.8 on Windows 7.

Imported assets still retain base path after cdnizer

new S3Plugin({
    progress: true,
    s3Options: {
      accessKeyId: envr.S3_ACCESS_KEY_ID,
      secretAccessKey: envr.S3_SECRET_ACCESS_KEY,
      region: envr.S3_REGION,
    },
    s3UploadOptions: {
      Bucket: envr.S3_BUCKET
    },
    cdnizerOptions: {
      defaultCDNBase: envr.ASSETS_FQDN
    },
    cloudfrontInvalidateOptions: {
      DistributionId: envr.CF_DISTRIBUTION_ID,
      Items: [`/${envr.BUILD_STAMP}/**`]
    },
    basePathTransform: () => envr.BUILD_STAMP
  })

For some reason, the above config doesn't seem to properly replace imported file base paths.

import CustomerLandingVideoMp4 from '../../assets/images/landing/customerLandingVideo.mp4';

^ This still resolves to

http://localhost:8002/7a173da26e30eae2eec7cc2aad967c74.jpg

Any ideas?

Node security check fails

During a node security check, one vulnerability was found.

(+) 1 vulnerabilities found
 Name     Installed   Patched   Path                                                                                                       More Info                             
 semver   2.2.1       >=4.3.2   [email protected] > [email protected] > [email protected] > [email protected]   https://nodesecurity.io/advisories/31 

I've opened the PR at the end of this chain, hopefully it will be merged shortly.
github/shahata/jsdelivr-cdn-data#7

Dependencies:
node v7.8.0
OS X 10.11.2

Steps to reproduce:

  • Install nsp
  • run nsp check

I see that I am not the first to try to solve this issue with jsdelivr-cdn-data. I would suggest giving it a try, though.

Ability to set s3 metadata to specific file type

Hi. Here is my case:
I'm using S3 to host a multipage application. Since S3 doesn't provide a URL rewriting feature, I use the following workaround: upload HTML files (app, admin) without the .html extension and add the metadata ContentType: text/html.
I'm also going to add a cache-control attribute for all files excluding HTML.

Is there a way to set metadata as described above in the current version? If not, how can I implement it best? Do you plan to add this feature?
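The Dynamic Upload Options section of the README looks like it covers this; a sketch (the extensionless app/admin names assume your build already emits those files):

new S3Plugin({
  s3Options: {
    accessKeyId: process.env.AWS_ACCESS_KEY_ID,
    secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY
  },
  s3UploadOptions: {
    Bucket: 'MyBucket',
    ContentType(fileName) {
      // serve extensionless pages and .html files as html
      if (/^(app|admin)$/.test(fileName) || /\.html$/.test(fileName))
        return 'text/html'
    },
    CacheControl(fileName) {
      // long cache for everything except html
      if (!/\.html$/.test(fileName))
        return 'max-age=31536000'
    }
  }
})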

Uploaded items with basePath get prepended with /

Please complete these steps and check these boxes (by putting an x inside
the brackets) before filing your issue:

  • I have read and understood this plugin's README
  • If filing a bug report, I have included my version of node and s3-plugin-webpack
  • If filing a bug report, I have included which OS (including specific OS
    version) I am using.
  • If filing a bug report, I have included a minimal test case that reproduces
    my issue.
  • I understand this is an open-source project staffed by someone with a job and
    that any help I receive is done on free time. I know I am not entitled to anything and will be polite and courteous.
  • I understand my issue may be closed if it becomes obvious I didn't
    actually perform all of these steps or the issue is not with the library itself

Thank you for adhering to this process! This ensures that I can pay attention to issues that are relevant and answer questions faster.

Issue Details

Node: v6.0.0 32bits (Windows 10)
s3-plugin-webpack: 0.7.3

When using the plugin with the basePath option set, the files are uploaded correctly, but all the files in the root directory have a path separator prepended to their keys.

Example:

new S3Plugin({
      // normal upload options
      basePath: 'stage',
})

This will be uploaded as:

Bucket
└──stage
  ├──\index.html
  ├──\styles.css
  ├──\images
  │  └──logo.png
  └──\scripts
     ├──main.js
     └──admin.js

I was able to fix the issue by bypassing the call to transformBasePath, but I'm not sure about the side effects of that...
