adobe / aio-lib-files

An abstraction on top of cloud blob storage exposing a file system like API

Home Page: https://www.adobe.io

License: Apache License 2.0

Language: JavaScript 100.00%
Topics: sdk, adobe-io, adobe-io-runtime, openwhisk, cloud-storage, cloud-native

aio-lib-files's Introduction


Adobe I/O Lib Files

A Node.js abstraction on top of cloud blob storage that exposes a file-system-like API.

You can initialize the SDK with your Adobe I/O Runtime (a.k.a. OpenWhisk) credentials.

Alternatively, you can bring your own cloud storage keys. Note, however, that only Azure Blob Storage is supported at this time.

Please note that currently you must be a customer of Adobe Developer App Builder to use this library. App Builder is a complete framework that enables enterprise developers to build and deploy custom web applications that extend Adobe Experience Cloud solutions and run on Adobe infrastructure.

Install

npm install @adobe/aio-lib-files

Use

  const filesLib = require('@adobe/aio-lib-files')

  // init the sdk using OpenWhisk credentials
  const files = await filesLib.init({ ow: { namespace, auth } })
  // init when env vars __OW_API_KEY and __OW_NAMESPACE are set (e.g. when running in an OpenWhisk action)
  const files = await filesLib.init()
  // or if you want to use your own cloud storage account
  const files = await filesLib.init({ azure: { storageAccount, storageAccessKey, containerName } })

  // write private file
  await files.write('mydir/myfile.txt', 'some private content')

  // write publicly accessible file
  await files.write('public/index.html', '<h1>Hello World!</h1>')

   // get file url
  const props = await files.getProperties('public/index.html')
  console.log('props = ', props)
  /*
  props =  { name: 'public/index.html',
    creationTime: 2020-12-09T19:49:58.000Z,
    lastModified: 2020-12-09T19:49:58.000Z,
    etag: '"0x8D89C7B9BB75A6F"',
    contentLength: 21,
    contentType: 'text/html',
    isDirectory: false,
    isPublic: true,
    url:
    'https://jestaiotest.blob.core.windows.net/readme-public/public%2Findex.html' }
  */

  // list all files
  const list = await files.list('/') // returns an array of file properties objects, see below
  /*
  list =  [ { name: 'mydir/myfile.txt',
    creationTime: 2020-12-09T19:49:57.000Z,
    lastModified: 2020-12-09T19:49:57.000Z,
    etag: '0x8D89C7B9BB165F8',
    contentLength: 20,
    contentType: 'text/plain',
    isDirectory: false,
    isPublic: false,
    url:
     'https://jestaiotest.blob.core.windows.net/readme/mydir%2Fmyfile.txt' },
  { name: 'public/index.html',
    creationTime: 2020-12-09T19:49:58.000Z,
    lastModified: 2020-12-09T19:49:58.000Z,
    etag: '0x8D89C7B9BB75A6F',
    contentLength: 21,
    contentType: 'text/html',
    isDirectory: false,
    isPublic: true,
    url:
     'https://jestaiotest.blob.core.windows.net/readme-public/public%2Findex.html' } ]
  */

  // read
  const buffer = await files.read('mydir/myfile.txt')
  buffer.toString() // 'some private content'

  // pipe read stream to local file (consider using copy below)
  const rdStream = await files.createReadStream('mydir/myfile.txt')
  const stream = rdStream.pipe(fs.createWriteStream('my-local-file.txt'))
  stream.on('finish', () => console.log('done!'))

  // write read stream to remote file (consider using copy below)
  const rdStream = fs.createReadStream('my-local-file.txt')
  await files.write('my/remote/file.txt', rdStream)

  // delete files in 'my/remote/' dir
  await files.delete('my/remote/')
  // delete all public files
  await files.delete('public/')
  // delete all files including public
  await files.delete('/')

  // copy - higher level utility (works like scp)
  // works for files and directories, both remotely and locally; uses streams under the hood
  // upload a single file
  await files.copy('my-static-app/index.html', 'public/my-static-app/index.html', { localSrc: true })
  // upload a local directory recursively
  await files.copy('my-static-app/', 'public/', { localSrc: true })
  // download to a local directory recursively (works for files as well)
  await files.copy('public/my-static-app/', 'my-static-app-copy', { localDest: true })
  // copy remote directories around (works for files as well)
  await files.copy('public/my-static-app/', 'my/private/folder')

  // Share private files
  const presignUrl = await files.generatePresignURL('mydir/myfile.txt', { expiryInSeconds: 60 })

  // Share private files with read, write, delete permissions
  const rwdPresignUrl = await files.generatePresignURL('mydir/myfile.txt', { expiryInSeconds: 60, permissions: 'rwd' })

Presigned URL types and usage

The Files SDK supports two types of presigned URLs for accessing a file:

  1. External - CDN-based URLs that can be accessed from anywhere, assuming the right presign permissions for private files. Use this type of URL to give external systems and APIs access to your files for remote compute use cases.
  2. Internal - Direct URLs to the file storage. These URLs work only when used within Adobe I/O Runtime actions. Use this type of URL to chain worker actions that process the same file across different Runtime namespaces. As there is no CDN indirection, this type of URL improves the performance of your Runtime actions.

Files.getProperties returns both URL types for a given file path (external as url, internal as internalUrl). Files.generatePresignURL accepts a UrlType option to generate a presigned URL of the given type.

See the usage example below:

  const  { init, UrlType }  = require('@adobe/aio-lib-files')
  const files = await init()

  // getProperties returns both internal and external URLs in the result object
  const props = await files.getProperties('public/my-static-app/index.html')

  // generate a presigned URL of the internal type
  const internalPresignUrl = await files.generatePresignURL('mydir/myfile.txt', { expiryInSeconds: 60, permissions: 'rwd', urltype: UrlType.internal })

Explore

Go to the API documentation.

Debug

Set `DEBUG=@adobe/aio-lib-files*` to see debug logs.

Adobe I/O Files Store Consistency Guarantees

Strong consistency is guaranteed for all operations and across all instances of the Files SDK (as returned by filesLib.init()).

Troubleshooting

"[StateLib:ERROR_INTERNAL] unknown error response from provider with status: unknown"

  • when using @adobe/aio-lib-files in an action bundled with webpack, please make sure to turn off minification and enable resolving of ES6 modules. Add the following lines to your webpack config:
  optimization: {
    minimize: false
  },
  resolve: {
    extensions: ['.js'],
    mainFields: ['main']
  }

Contributing

Contributions are welcome! Read the Contributing Guide for more information.

Licensing

This project is licensed under the Apache V2 License. See LICENSE for more information.


aio-lib-files's Issues

Azure blob store on-the-fly gzip compression

Expected Behaviour

Azure Blob Storage seems to support on-the-fly gzip compression that can be enabled for a configurable list of MIME types: https://docs.microsoft.com/en-us/azure/cdn/cdn-improve-performance. For text-based files such as JSON, gzip compression would greatly reduce the bytes that need to be sent across the wire to a client, e.g. a browser.

Actual Behaviour

Currently, files stored in Azure blob store via aio-lib-files are returned uncompressed as-is.

Reproduce Scenario (including but not limited to)

Steps to Reproduce

  • upload a JSON file via the aio-lib-files library
  • generate a presigned url
  • download and observe the response and size of the file uploaded above

Platform and Version

nodejs:12

global error tests should be rewritten as expect extensions

We have numerous instances of global.expect...

    // eslint-disable-next-line jest/expect-expect
    test('when path is not a valid string', async () => {
      await global.expectToThrowBadArg(files.delete.bind(files, 123), ['filePath', 'string'], { filePath: 123, options: {} })
    })

This means we need many eslint-disable-next-line jest/expect-expect comments.

A better practice, and the Jest-recommended way, is to define our own expect extensions:

expect.extend({
  toThrowBadArg (received, words, options) {
    // more expectations here ...
    // (a custom matcher must return { pass, message })
  }
})

// then tests call it like this:
expect(files.delete.bind(files, 123)).toThrowBadArg(['filePath', 'string'], { filePath: 123, options: {} })

If we do this well, they can be reused across multiple repo/libs

Presigned URLs for internal or external consumption

The presigned URLs enhancements mentioned in #59 will allow developers to share a file for compute purpose with

  1. Another Runtime action in another namespace (i.e. a different Project Firefly application)
  2. An external API or system

While for 2) it is necessary to expose the presigned URL through the CDN, this is not required for 1).

As a developer, I'd like to be able to specify whether the generated presigned URL will be shared between Runtime actions/Project Firefly Applications, or with external systems. The default should be external systems.

document what `public` means for files

public here does not mean accessible to the whole public internet in general.

public means you don’t need any credentials to access the file:

  1. you can access a file across runtime actions only (internalUrl or url property below)
  2. you can access the file publicly on the Internet if you use the CDN link (url property below)
> await files.write('public/test', 'i am public')
11
> await files.getProperties('public/test')
{
  name: 'public/test',
  creationTime: 2023-11-10T09:29:02.000Z,
  lastModified: 2023-11-10T09:29:02.000Z,
  etag: '"0x9GBE1CF7A26B67A"',
  contentLength: 11,
  contentType: 'application/octet-stream',
  isDirectory: false,
  isPublic: true,
  url: 'https://my-cdn-domain.net/example-not-working-container-id-public/public/test',
  internalUrl: 'https://my-blob-storage.net/example-not-working-container-id-public/public/test'
}

Note url vs internalUrl (valid only when isPublic is true in this case). Direct access to the blob storage is restricted.

CDN domain should be returned by the TVM

Currently the CDN URL for prod/stage is hardcoded in the files lib. A better approach would be to return it from the TVM, just as the TVM already returns the blob storage account.

delete should accept a list of files to delete
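
A minimal workaround sketch with the current single-path files.delete API is to fan out the deletes in parallel (the paths and helper name are illustrative):

const filesLib = require('@adobe/aio-lib-files')

// hypothetical helper: delete an explicit list of files, one delete per path
async function deleteMany (paths) {
  const files = await filesLib.init()
  await Promise.all(paths.map(p => files.delete(p)))
}

// usage (illustrative)
// await deleteMany(['mydir/a.txt', 'mydir/b.txt'])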

Expected Behaviour

Actual Behaviour

Reproduce Scenario (including but not limited to)

Steps to Reproduce

Platform and Version

Sample Code that illustrates the problem

Logs taken while reproducing problem

presignUrl with write permissions and BYO credentials fails on PUT Blob

Expected Behaviour

write permissions should allow sending a PUT request to the presigned URL, even with BYO credentials

Actual Behaviour

This is failing for BYO credentials:

  • const files = await Files.init({ azure: {mycreds} })
  • const url = await files.generatePresignURL('hello.txt', { expiryInSeconds: 500, permissions: 'w' })
  • await fetch(url, { method: 'PUT', body: 'hello23' }) => returns 400 missing header (it seems to want the Authorization header)

The same works fine with TVM credentials.

Reproduce Scenario (including but not limited to)

Steps to Reproduce

Platform and Version

Sample Code that illustrates the problem

Logs taken while reproducing problem

document expiryInSeconds

Expected Behaviour

Can't find the limit in the docs, but if you try to set it to a year:

Actual Behaviour

"expiryInSeconds" must be less than or equal to 86400

Reproduce Scenario (including but not limited to)

Steps to Reproduce

Platform and Version

Sample Code that illustrates the problem

Logs taken while reproducing problem

Presigned URLs don't work if the user provided their own Azure credentials, and init might break

Expected Behaviour

  • files.init({ azure: { <with SAS urls> } }) should throw a clear error when using files.generatePresignURL, as there is no way to generate presigned URLs without valid access to the TVM or storage account credentials
  • files.init({ azure: { <with storage account credentials> } }) should be able to generate presigned URLs using its own storage account credentials, in the same way the TVM does when the files SDK is initialized with OpenWhisk credentials

Actual Behaviour

  • files.init({ azure: { <with SAS urls> } }) in a Runtime action: initializes the TVM while it shouldn't, and files.generatePresignURL then fails with a cryptic error message, as it attempts to generate a presigned URL via the TVM for a blob not managed by the TVM storage account
  • files.init({ azure: { <with SAS urls> } }) outside a Runtime action: breaks directly on init, as it attempts to init the TVM without any OpenWhisk credentials

The same happens for files.init({ azure: { <with storage account credentials> } }), although in that case we should be able to support presigned URLs.

Broken auto completion in vscode on instance object

const files = await filesLib.init() // => autocompletes
files. // => doesn't autocomplete

VS Code's IntelliSense uses TypeScript types, which seem to be compatible with and inferred from our JSDoc. But there might be some sort of conflict.

Refresh expired TVM tokens for long running processes

When using OW credentials, tokens for accessing the underlying blob storage are fetched from the CNA Token Vending Machine (TVM).

Those tokens expire within an hour, and tokens in long-running processes are currently not automatically refreshed.

As a workaround, long-running processes should catch Forbidden errors (or periodically check the time interval, ~1 hour) and reinitialize the storage object when needed to refresh the credentials: storage = await storageSDK.init()
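
A minimal sketch of that workaround, assuming the expired-credentials failure surfaces as a thrown error from the files call (the error handling and retry policy below are illustrative):

const filesLib = require('@adobe/aio-lib-files')

let files // re-assignable so we can refresh credentials

async function writeWithRefresh (filePath, content) {
  if (!files) files = await filesLib.init()
  try {
    await files.write(filePath, content)
  } catch (e) {
    // assume the failure is due to expired TVM tokens:
    // re-initialize to fetch fresh credentials, then retry once
    files = await filesLib.init()
    await files.write(filePath, content)
  }
}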

files.list and files.delete are limited to 5000 files

Expected Behaviour

await files.delete('/') should delete all files

Actual Behaviour

files.delete('/') only deletes 5000 files

Reproduce Scenario (including but not limited to)

Run await files.delete('/') on an account that has more than 5000 files. Run files.list('/') before and after the delete command to see how many files were deleted.

Steps to Reproduce

Platform and Version

aio-lib-files version 1.3.1

Sample Code that illustrates the problem

  // Initialize aio-lib-files
  const files = await filesLib.init();
  LOG.debug('aio-lib-files initialized');
  const allFiles = await files.list('/');
  LOG.debug('Listing all files. Number of files: ' + allFiles.length);
  const deletedFiles = await files.delete('/');
  LOG.debug('Number of deleted files: ' + deletedFiles.length);

Logs taken while reproducing problem

No useful logging.
"response": { "result": { "error": "The action did not produce a valid response and exited unexpectedly." }, "status": "action developer error", "success": false }

List should support pagination with a documented default and max limit

As shown by #73, the number of returned elements is capped at 5000, and there is no means to get the next elements.
List should implement a pagination mechanism.
Also, the default page size should be set to a much smaller value than the max 5000, so that it takes I/O Runtime memory and request/response size limits into account.
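
A hypothetical shape for such a paginated API (the pageSize and nextToken options and the returned fields are illustrative, none of them exist in the current library):

// hypothetical pagination loop; the option/field names below do not exist today
async function listAll (files) {
  let nextToken
  do {
    const page = await files.list('/', { pageSize: 100, nextToken })
    page.items.forEach(f => console.log(f.name))
    nextToken = page.nextToken
  } while (nextToken)
}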

Generated presigned URLs do not work in the browser

Expected Behaviour

Presigned URLs generated by the aio-lib-files library should be usable directly in the browser as-is.
If I used the library in a wrong way (see below), would there be a way to support this usage? (Which comes close to #59.)

Actual Behaviour

Presigned URLs generated by the aio-lib-files library cannot be used. For instance, when trying in the browser, I get this error:

<Error>
<Code>BlobNotFound</Code>
<Message>The specified blob does not exist. RequestId:92974b1e-d01e-0114-6272-05e910000000 Time:2021-02-17T21:21:06.1462572Z</Message>
</Error>

Reproduce Scenario (including but not limited to) & Sample Code that illustrates the problem

Run this piece of code. The generated URLs won't work:

        const filesLib = require('@adobe/aio-lib-files');

        // generate presigned-urls for input and output
        const files = await filesLib.init({
            azure:
            {
                storageAccount: process.env.AZURE_STORAGE_ACCOUNT,
                storageAccessKey: process.env.AZURE_STORAGE_KEY,
                containerName: process.env.AZURE_STORAGE_CONTAINER,
            }
        });

        const readUrl = await files.generatePresignURL("./test/files/newton.dn", 
            { 
                expiryInSeconds: 86400, 
                permissions: 'r'
            }
        );
        console.log(readUrl);

        const writeUrl = await files.generatePresignURL("./test/files/newton.dn", 
            { 
                expiryInSeconds: 86400, 
                permissions: 'rwd'
            }
        );
        console.log(writeUrl);

If I used the library in a wrong way, would there be a way to support this usage, so I can use aio-lib-files for all my needs? (Which also comes close to #59, I think.)

Steps to Reproduce

Run the code from the previous point.

Platform and Version

  • Node 12.16.0

Logs taken while reproducing problem

None.

aio library files preview instead of continuous download of library cloud items

When browsing a directory, the cloud library downloads each file's entire content to the local file directory instead of downloading just a preview.

This conflicts with operating-system settings that clients have adjusted so files are only downloaded when used, and it is the reason a client would not be able to continue using the application. A large library whose content is instantly downloaded to a client's machine could run out of disk space in no time.

Either advise clients that the content will always be downloaded to a set directory and explain that the operating-system settings have no effect when using lib-files, or update the application so it behaves in accordance with the operating-system settings.

Expected Behaviour

Actual Behaviour

Reproduce Scenario (including but not limited to)

Steps to Reproduce

Platform and Version

Sample Code that illustrates the problem

Logs taken while reproducing problem

Support private-only container for BYO credentials

Expected Behaviour

Users should be able to call init with a noPublic option.
OR, by default, we should only create a private container and add an addPublic option (breaking change).

Actual Behaviour

Init creates both a private and a public container; this doesn't work for users who need to keep their Azure instance private. A hypothetical call with the proposed option is sketched below.
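
A minimal sketch of the proposed init call (the noPublic option is the proposal, not implemented; the credential variables are assumed to be defined):

// hypothetical: only create/use the private container
const files = await filesLib.init({
  azure: { storageAccount, storageAccessKey, containerName },
  noPublic: true
})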

Typescript error

Expected Behaviour

files.d.ts line 53 should be Promise<T> instead of Promise

Actual Behaviour

types/adobe/files.d.ts:53:106 - error TS2314: Generic type 'Promise<T>' requires 1 type argument(s).

53     protected _wrapProviderRequest(requestPromise: Promise, details: any, filePathToThrowOn404: string): Promise;

Reproduce Scenario (including but not limited to)

Steps to Reproduce

Platform and Version

Sample Code that illustrates the problem

Logs taken while reproducing problem

Support for custom CDN host option

Actual behavior

The CDN domain is fixed (see https://github.com/adobe/aio-lib-files/blob/master/lib/impl/AzureBlobFiles.js#L29) and points to the Adobe I/O Firefly CDN. This is a problem if the user brings their own storage via Files.init({ azure: <storage creds> }), as URLs would be broken.

Expected behavior

Support a custom CDN host option on files init, e.g. Files.init({ host: 'mycdn.domain' }).
EDIT: Let the TVM return the hostname as part of the credentials, as the logic handling this was already added by #66.

Why

  • allows support for bring-your-own storage => DONE in #66
  • allows testing the files lib against non-prod environments

Introduce a `move` function

Expected Behaviour

I've often seen aio-lib-files being used in reviews, and I noticed that the combination of copy followed by delete is used frequently. Could you consider introducing a move function that wraps copy followed by delete?

The expected behavior is that the file looks like it was moved from A to B, even if behind the scenes it's a copy to the new location followed by a deletion of the original file (this is an example; the details of the implementation are left open).

Actual Behaviour

There is no move yet. People have to copy the file and then delete it themselves.
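
A minimal sketch of such a helper on top of the existing copy and delete calls (the move name is the proposal, not current API):

// hypothetical move: copy to the new location, then delete the original
async function move (files, src, dest) {
  await files.copy(src, dest)
  await files.delete(src)
}

// usage (illustrative paths)
// await move(files, 'mydir/myfile.txt', 'archive/myfile.txt')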

Reproduce Scenario (including but not limited to)

There is no move yet. People have to copy the file and then delete it themselves.

Steps to Reproduce

N/A

Platform and Version

All

Sample Code that illustrates the problem

Not a bug, it's an enhancement.

Logs taken while reproducing problem

Not a bug, it's an enhancement.

implement files.exists and better documentation for files.getProperties

Currently, to test whether a file exists, the user must do:

const list = await files.list(filePath)
const exists = list.length > 0

which is not very intuitive.

To make the interface clearer, it would be preferable to expose a files.exists method that acts as sugar for the above, as sketched below.
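
A minimal sketch of that sugar over the existing list call (the exists name is the proposal, not part of the current API):

// proposed sugar, not part of the current API
async function exists (files, filePath) {
  const list = await files.list(filePath)
  return list.length > 0
}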

Also, some users might be tempted to use files.getProperties, but it doesn't throw if the file doesn't exist, which might be misleading. This should be documented properly, OR the behavior of files.getProperties should be changed to throw when the filePath doesn't exist.

(EDITED: the previous example was wrong, thanks to @alexkli for pointing this out.)

Support presignUrls on public files or explicitly throw an error

Expected Behaviour

Presigned URLs for public files might be useful for handing over write or delete permissions; read is obviously not useful.
We do not support this for now, but we also don't throw an error, so the user receives a non-working presigned URL.

Actual Behaviour

via TVM credentials:

via BYO credentials:

In Windows and Jest environment, `upath` normalization to unix path is not working

The upath module claims to always return unix paths even on windows. e.g. upath.normalize('a\\win\\path') returns a/win/path. This is needed to normalize paths before uploading them to the cloud storage which uses unix style only. Also unix paths work fine inside the node process in windows.

While upath seems to work fine from the windows cmd line (converting all paths to unix) it breaks when running inside the jest environment in windows.

In the jest environment, upath even converts the path back to windows: i.e. upath.normalize('a/win/path') returns a\\win\\path. This is how the regular path module would do it.

A current workaround is to use upath.toUnix in front of every upath operation, as illustrated below.
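
For illustration, a one-line sketch of that workaround (the double conversion only matters inside the jest-on-windows environment described above):

const upath = require('upath')

// force unix separators first, so the jest environment cannot flip them back
const normalized = upath.normalize(upath.toUnix('a\\win\\path')) // 'a/win/path'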

We should find a better way than using upath.toUnix by either fixing upath with jest or using something else

test failures with new [email protected] dependency

Workaround is to pin [email protected]; versions 17.8.0 and 17.8.1 make this fail.

 FAIL  test/init.test.js
  ● Test suite failed to run

    Cannot find module '@hapi/hoek/assert' from 'index.js'

      11 | */
      12 |
    > 13 | const TvmClient = require('@adobe/aio-lib-core-tvm')
         |                   ^
      14 | const logger = require('@adobe/aio-lib-core-logging')('@adobe/aio-lib-files', { provider: 'debug' })
      15 |
      16 | const utils = require('./utils')

      at Resolver.resolveModule (node_modules/jest-resolve/build/index.js:259:17)
      at Object.<anonymous> (node_modules/joi/lib/index.js:3:16)
      at Object.<anonymous> (node_modules/@adobe/aio-lib-core-tvm/lib/TvmClient.js:13:13)
      at Object.<anonymous> (lib/init.js:13:19)
      at Object.<anonymous> (test/init.test.js:13:18)

 FAIL  test/Files.test.js
  ● Test suite failed to run

    Cannot find module '@hapi/hoek/assert' from 'index.js'

    However, Jest was able to find:
    	'../lib/Files.js'

    You might want to include a file extension in your import, or update your 'moduleFileExtensions', which is currently ['js', 'json', 'jsx', 'ts', 'tsx', 'node'].

    See https://jestjs.io/docs/en/configuration#modulefileextensions-array-string

      13 | const upath = require('upath')
      14 | const fs = require('fs-extra')
    > 15 | const joi = require('joi')
         |             ^
      16 | const stream = require('stream')
      17 | const cloneDeep = require('lodash.clonedeep')
      18 | const logger = require('@adobe/aio-lib-core-logging')('@adobe/aio-lib-files', { provider: 'debug' })

      at Resolver.resolveModule (node_modules/jest-resolve/build/index.js:259:17)
      at Object.<anonymous> (node_modules/joi/lib/index.js:3:16)
      at Object.<anonymous> (lib/Files.js:15:13)
      at Object.<anonymous> (test/Files.test.js:2:45)

 FAIL  test/impl/AzureBlobFiles.test.js
  ● Test suite failed to run

    Cannot find module '@hapi/hoek/assert' from 'index.js'

    However, Jest was able to find:
    	'../../lib/impl/AzureBlobFiles.js'

    You might want to include a file extension in your import, or update your 'moduleFileExtensions', which is currently ['js', 'json', 'jsx', 'ts', 'tsx', 'node'].

    See https://jestjs.io/docs/en/configuration#modulefileextensions-array-string

      14 |
      15 | const azure = require('@azure/storage-blob')
    > 16 | const joi = require('joi')
         |             ^
      17 | const stream = require('stream')
      18 | const mime = require('mime-types')
      19 | const fetch = require('node-fetch')

      at Resolver.resolveModule (node_modules/jest-resolve/build/index.js:259:17)
      at Object.<anonymous> (node_modules/joi/lib/index.js:3:16)
      at Object.<anonymous> (lib/impl/AzureBlobFiles.js:16:13)
      at Object.<anonymous> (test/impl/AzureBlobFiles.test.js:22:28)

--------------------|----------|----------|----------|----------|-------------------|
File                |  % Stmts | % Branch |  % Funcs |  % Lines | Uncovered Line #s |
--------------------|----------|----------|----------|----------|-------------------|
All files           |        0 |        0 |        0 |        0 |                   |
 lib                |        0 |        0 |        0 |        0 |                   |
  Files.js          |        0 |        0 |        0 |        0 |... 57,858,862,864 |
  FilesError.js     |        0 |        0 |        0 |        0 |... 48,49,50,51,53 |
  init.js           |        0 |        0 |        0 |        0 |... 64,66,67,68,76 |
  utils.js          |        0 |        0 |        0 |        0 |... 25,26,29,34,37 |
 lib/impl           |        0 |        0 |        0 |        0 |                   |
  AzureBlobFiles.js |        0 |        0 |        0 |        0 |... 26,628,638,642 |
--------------------|----------|----------|----------|----------|-------------------|
Jest: "global" coverage threshold for statements (100%) not met: 0%
Jest: "global" coverage threshold for branches (100%) not met: 0%
Jest: "global" coverage threshold for lines (100%) not met: 0%
Test Suites: 3 failed, 3 total
Tests:       0 total
Snapshots:   0 total
Time:        1.326s
Ran all test suites.

Presigned URLs generation enhancements

As a developer, I'd like to share a file created with aio-lib-files from my Runtime action with other entities for enhanced compute scenarios.

These could be:

  • Other Runtime actions, within the same namespace or a different one (i.e. different Project Firefly app)
  • Adobe APIs
  • 3rd party systems and APIs
  • Direct file consumers

For these enhanced compute scenarios I'd like to be able to provide write and delete permissions to the file through the generated presigned URL.

I'd like to be able to configure a TTL in seconds for the generated presigned URL, and this configuration should be explicitly set by myself.

Finally, I'd like the API to be generic enough for me to provide a custom implementation for my own file storage.

append mode

It would be super useful if we could write to a file through the API in append mode. This would allow other use cases, like very basic logging, without having to read and re-write the entire file.

    await files.append('my-static-app/users.log', `created new user ${newUserId}`)
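
Until such an API exists, a minimal emulation sketch with the current API is read-concat-write, which is exactly the cost the proposal wants to avoid (the append helper name is illustrative):

// emulate append today: read the whole file (if any), then rewrite it
async function append (files, filePath, content) {
  let existing = ''
  if ((await files.list(filePath)).length > 0) {
    existing = (await files.read(filePath)).toString()
  }
  await files.write(filePath, existing + content)
}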

presignedUrl file path must already exist

Expected Behaviour

It should be possible to get a presigned URL with write permission for a file that has not yet been created.

Actual Behaviour

An error is thrown.

Reproduce Scenario (including but not limited to)

const rwdPresignUrl = await files.generatePresignURL('dir_could_not_exist/file_does_not_exist_yet.txt', {
  expiryInSeconds: 3600, 
  permissions: 'rwd'
})

Workaround

Write to the file before getting the URL:

await files.write('dir_could_not_exist/file_does_not_exist_yet.txt', '')
const rwdPresignUrl = await files.generatePresignURL('dir_could_not_exist/file_does_not_exist_yet.txt', {
  expiryInSeconds: 3600, 
  permissions: 'rwd' 
})

files.copy returns 414 for own storage

Expected Behaviour

I want to copy a file from one folder to another using files.copy and my own (Azure) storage account.

Actual Behaviour

files works as expected with my own credentials, but when I try to copy a file from folderA to folderB I just get the following error:
[FilesLib:ERROR_INTERNAL] unknown error response from provider with status: 414

Reproduce Scenario (including but not limited to)

Here is the sample code I try to run (the source file exists):
await files.copy('folderA/myFile.json','folderB/')

Steps to Reproduce

  1. init files using own azure storage account
  2. try to copy a file

Platform and Version

using "@adobe/aio-lib-files": "^1.2.3"

Sample Code that illustrates the problem

await files.copy('folderA/myFile.json','folderB/')

Logs taken while reproducing problem

error: [FilesLib:ERROR_INTERNAL] unknown error response from provider with status: 414

Can not write files to a bring-your-own azure storage

I would like to use a BYO azure storage with the Files SDK:

const files = await filesLib.init({ azure: { storageAccount, storageAccessKey, containerName } })

However, files.write gives a bad credentials error even though all creds are correct:

2020-09-22T14:46:05.873Z       stdout: 2020-09-22T14:46:05.872Z @adobe/aio-lib-files:error {
  "sdk": "FilesLib",
  "sdkDetails": {
    "filePath": "my-folder/test.csv",
    "contentType": "Readable"
  },
  "code": "ERROR_BAD_CREDENTIALS",
  "message": "[FilesLib:ERROR_BAD_CREDENTIALS] cannot access `cloud storage provider`, make sure your credentials are valid",
  "stacktrace": "FilesLibError: [FilesLib:ERROR_BAD_CREDENTIALS] cannot access `cloud storage provider`, make sure your credentials are valid\n    at new <anonymous> (/nodejsAction/9KGzgxBe/index.js:41347:9)\n    at /nodejsAction/9KGzgxBe/index.js:23658:39\n    at processTicksAndRejections (internal/process/task_queues.js:97:5)\n    at async AzureBlobFiles._writeStream (/nodejsAction/9KGzgxBe/index.js:118225:5)\n    at async NodeActionRunner.main [as userScriptMain] (/nodejsAction/9KGzgxBe/index.js:41148:25)"
}

Reference: reported by @UrsBoller in the Firefly Forum: https://experienceleaguecommunities.adobe.com/t5/project-firefly-questions/access-to-own-storage-using-aio-lib-state-and-aio-lib-action/qaq-p/378249/comment-id/202#M202

Revoke presigned URLs

As a developer, I'd like to be able to programmatically revoke presigned URLs, even if their TTL hasn't expired yet.

Note: the API should be generic enough to allow different kinds of implementations (the approach could differ a lot from one cloud provider to another, or from one file storage to another).

Allow all origins for presigned URL

One of the biggest advantages of presigned URLs for files is allowing uploads directly from the client/browser. The client could be the Firefly SPA (React Spectrum) or a 3rd-party site. However, it is currently not possible to make fetch requests from client-side JavaScript to the presigned URL (on the storage CDN: https://firefly.azureedge.net/) due to CORS blocking.

const response = await fetch(
  url,
  {
    method: 'PUT',
    body: file,
    headers: {
      'Content-Type': 'image/png'
    }
  }
)

Expected Behaviour

The file should be uploaded to the presigned URL.

Actual Behaviour

Access to fetch at 'https://firefly.azureedge.net/...' from origin 'https://my-namespace.adobeio-static.net' has been blocked by CORS policy: Response to preflight request doesn't pass access control check: No 'Access-Control-Allow-Origin' header is present on the requested resource. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled.
