
The new Azure Storage data transfer utility - AzCopy v10

License: MIT License


azure-storage-azcopy's Introduction

AzCopy v10

AzCopy v10 is a command-line utility that you can use to copy data to and from containers and file shares in Azure Storage accounts. AzCopy v10 provides easy-to-use commands that are optimized for high performance and throughput.

Features and capabilities

✅ Use with storage accounts that have a hierarchical namespace (Azure Data Lake Storage Gen2).

✅ Create containers and file shares.

✅ Upload files and directories.

✅ Download files and directories.

✅ Copy containers, directories, and blobs between storage accounts (Service to Service).

✅ Synchronize data between Local <=> Blob Storage, Blob Storage <=> File Storage, and Local <=> File Storage.

✅ Delete blobs or files from an Azure storage account.

✅ Copy objects, directories, and buckets from Amazon Web Services (AWS) to Azure Blob Storage (Blobs only).

✅ Copy objects, directories, and buckets from Google Cloud Platform (GCP) to Azure Blob Storage (Blobs only).

✅ List files in a container.

✅ Recover from failures by restarting previous jobs.

Download AzCopy

The latest AzCopy binary, along with installation instructions, can be found on the official download page.

Find help

For complete guidance, visit any of these articles on the docs.microsoft.com website.

✳️ Get started with AzCopy (download links here)

✳️ Upload files to Azure Blob storage by using AzCopy

✳️ Download blobs from Azure Blob storage by using AzCopy

✳️ Copy blobs between Azure storage accounts by using AzCopy

✳️ Synchronize between Local File System/Azure Blob Storage (Gen1)/Azure File Storage by using AzCopy

✳️ Transfer data with AzCopy and file storage

✳️ Transfer data with AzCopy and Amazon S3 buckets

✳️ Transfer data with AzCopy and Google GCP buckets

✳️ Use data transfer tools in Azure Stack Hub Storage

✳️ Configure, optimize, and troubleshoot AzCopy

✳️ AzCopy WiKi

Supported Operations

The general format of the AzCopy commands is: azcopy [command] [arguments] --[flag-name]=[flag-value]

  • bench - Runs a performance benchmark by uploading or downloading test data to or from a specified destination

  • copy - Copies source data to a destination location. The supported directions are:

    • Local File System <-> Azure Blob (SAS or OAuth authentication)
    • Local File System <-> Azure Files (Share/directory SAS authentication)
    • Local File System <-> Azure Data Lake Storage (ADLS Gen2) (SAS, OAuth, or SharedKey authentication)
    • Azure Blob (SAS or public) -> Azure Blob (SAS or OAuth authentication)
    • Azure Blob (SAS or public) -> Azure Files (SAS)
    • Azure Files (SAS) -> Azure Files (SAS)
    • Azure Files (SAS) -> Azure Blob (SAS or OAuth authentication)
    • AWS S3 (Access Key) -> Azure Block Blob (SAS or OAuth authentication)
    • Google Cloud Storage (Service Account Key) -> Azure Block Blob (SAS or OAuth authentication) [Preview]
  • sync - Replicates the source to the destination location. The supported directions are:

    • Local File System <-> Azure Blob (SAS or OAuth authentication)
    • Local File System <-> Azure Files (Share/directory SAS authentication)
    • Azure Blob (SAS or public) -> Azure Files (SAS)
  • login - Log in to Azure Active Directory (AD) to access Azure Storage resources.

  • logout - Log out to terminate access to Azure Storage resources.

  • list - List the entities in a given resource

  • doc - Generates documentation for the tool in Markdown format

  • env - Shows the environment variables that you can use to configure the behavior of AzCopy.

  • help - Help about any command

  • jobs - Sub-commands related to managing jobs

  • load - Sub-commands related to transferring data in specific formats

  • make - Create a container or file share.

  • remove - Delete blobs or files from an Azure storage account
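
A few representative invocations following the format above; the account names, container names, paths, and SAS tokens below are placeholders, not working values:

```shell
# Upload a local directory to a blob container
azcopy copy "/path/to/dir" "https://<account>.blob.core.windows.net/<container>?<SAS>" --recursive

# Download a container to a local directory
azcopy copy "https://<account>.blob.core.windows.net/<container>?<SAS>" "/path/to/dir" --recursive

# Service-to-service copy between two storage accounts
azcopy copy "https://<src>.blob.core.windows.net/<container>?<SAS>" \
    "https://<dst>.blob.core.windows.net/<container>?<SAS>" --recursive

# One-way sync from local file system to Blob Storage
azcopy sync "/path/to/dir" "https://<account>.blob.core.windows.net/<container>?<SAS>" --recursive
```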

Find help from your command prompt

For convenience, consider adding the AzCopy directory to your system PATH. That way you can type azcopy from any directory on your system.

To see a list of commands, type azcopy -h and then press the ENTER key.

To learn about a specific command, include its name (for example: azcopy list -h).

AzCopy command help example

If you choose not to add AzCopy to your path, you'll have to change directories to the location of your AzCopy executable and type azcopy or .\azcopy in Windows PowerShell command prompts.
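
For example, on Linux or macOS, assuming the binary was extracted to ~/azcopy (a hypothetical location), the directory can be added to PATH for the current session like so:

```shell
# Add the (hypothetical) AzCopy directory to PATH for this session;
# put the same line in ~/.bashrc or ~/.zshrc to make it permanent.
export PATH="$PATH:$HOME/azcopy"

# The directory now appears on PATH, so `azcopy` resolves from anywhere.
echo "$PATH"
```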

Frequently asked questions

What is the difference between sync and copy?

  • The copy command is a simple transfer operation: it scans/enumerates the source and attempts to transfer every file/blob present at the source to the destination. The supported source/destination pairs are listed in the help message of the tool.

  • The sync command, on the other hand, scans/enumerates both the source and the destination to find the incremental change, and makes sure that whatever is present at the source is replicated at the destination.

  • If your goal is simply to move some files, then copy is definitely the right command, since it offers much better performance. If the use case is to incrementally transfer data (files present only at the source), then sync is the better choice, since only the modified/missing files are transferred. Since sync enumerates both source and destination to find the incremental change, it is relatively slower compared to copy.

Will copy overwrite my files?

By default, AzCopy overwrites files at the destination if they already exist. To avoid this behavior, use the flag --overwrite=false.
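
For instance (the account, container, and SAS below are placeholders), to skip files that already exist at the destination:

```shell
# Copy, but never overwrite existing blobs at the destination
azcopy copy "/path/to/dir" "https://<account>.blob.core.windows.net/<container>?<SAS>" \
    --recursive --overwrite=false
```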

Will sync overwrite my files?

By default, AzCopy sync uses the last-modified time to decide whether to transfer a file that exists at both the source and the destination: if the source file is newer than the destination file, the destination is overwritten. You can change this default behavior and always overwrite files at the destination by using the flag --mirror-mode=true.
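
For example (again with placeholder account, container, and SAS values), to make sync unconditionally overwrite the destination regardless of last-modified times:

```shell
# Sync and always overwrite at the destination, ignoring timestamps
azcopy sync "/path/to/dir" "https://<account>.blob.core.windows.net/<container>?<SAS>" \
    --recursive --mirror-mode=true
```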

Will 'sync' delete files in the destination if they no longer exist in the source location?

By default, the 'sync' command doesn't delete files in the destination unless you use an optional flag with the command. To learn more, see Synchronize files.

How to contribute to AzCopy v10

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.microsoft.com.

When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact [email protected] with any additional questions or comments.

azure-storage-azcopy's People

Contributors

adreed-msft, aishmanoh, dependabot[bot], dvrkps, gapra-msft, jeffreyrichter, jiacfan, johnrusk, microsoft-github-policy-service[bot], microsoftopensource, mohsha-msft, nakulkar-msft, nitin-deamon, normesta, pranavmalik-msft, prjain-msft, rickle-msft, riven-spell, rpohane, seguler, siminsavani-msft, simonwaldherr, strikerzee, t-cguruceaga-msft, tasherif-msft, testwill, thomasmaurer, tiverma-msft, yugangw-msft, zezha-msft


azure-storage-azcopy's Issues

On Windows, AzCopy Cannot Handle Long File Paths

Which version of the AzCopy was used?

10.0.3-preview

Which platform are you using? (ex: Windows, Mac, Linux)

Windows

What command did you run?

Note: Please remove the SAS to avoid exposing your credentials. If you cannot remember the exact command, please retrieve it from the beginning of the log file.

.\azcopy.exe copy "..\..\1234567891234567981234567981321321654684354645313216843513213223111141445 45474567271271727231731324175417234173241732417324173aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaabbb" "https://REDACTED.blob.core.windows.net/uploadtohere?REDACTED"

What problem was encountered?

On upload, AzCopy failed with the error: failed to perform copy command due to error: cannot start job due to error: cannot find source to upload.
For download, it seems like AzCopy can handle slightly longer paths, but there is still a point where it fails.

How can we reproduce the problem in the simplest way?

Create a file whose full path is at least ~250 characters long. Try to transfer it with AzCopy.
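
The repro above can be scripted roughly as follows; this is a POSIX-shell sketch for building a long path (on Windows the same idea applies, where the classic MAX_PATH limit is 260 characters). The azcopy step itself is omitted since it needs real credentials:

```shell
# Build a file whose absolute path is well over 250 characters.
base=$(mktemp -d)
longdir="$base/$(printf 'a%.0s' $(seq 1 120))/$(printf 'b%.0s' $(seq 1 120))"
mkdir -p "$longdir"
touch "$longdir/file.txt"

# Print the directory path length (exceeds 240 even before the filename)
echo "${#longdir}"
```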

Have you found a mitigation/solution?

No.

Install AzCopy on Windows sets path wrong

Which version of the AzCopy was used?

AzCopy 8.1.0-netcore

Note: The version is visible when running AzCopy without any argument

Which platform are you using? (ex: Windows, Mac, Linux)

Windows

What command did you run?

The "Microsoft Azure Storage AzCopy" shortcut installed

Note: Please remove the SAS to avoid exposing your credentials. If you cannot remember the exact command, please retrieve it from the beginning of the log file.

What problem was encountered?

When reviewing the updated path in the new "Microsoft Azure Storage AzCopy" command prompt window, the path is incorrect. It has an extra AzCopy at the end.

C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy\AzCopy

How can we reproduce the problem in the simplest way?

  1. Install AzCopy 8.1 for Windows on Windows 10
  2. Launch AzCopy command prompt from the start menu
  3. Type PATH to view this session's PATH
  4. Change to a different directory and try running AzCopy.exe
  5. Observe command not found
C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy>cd \

C:\>azcopy
'azcopy' is not recognized as an internal or external command,
operable program or batch file.

C:\>

Have you found a mitigation/solution?

Fix the 'set PATH' statement in C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy\LaunchCmd.cmd

Support reading from STDIN

Hey,
would be nice to transfer files with azcopy which are read by STDIN, like:

cat MYFILE | azcopy cp "https://myaccount.blob.core.windows.net/mycontainer/1file.txt?sastokenhere"

This is already implemented in azcopy v7.

Best regards,
Jonas

Cat command output

Is this expected behavior/output of the cat command without having grep in the pipeline?

PS C:\Users\artek\Desktop\OneDrive - Microsoft\Current tasks\PowerShell & CLI & AzCopy\azcopy_windows_amd64_10.0.1> cat "C:\Users\artek.azcopy\ed3dc4a0-8949-9d4f-57dc-f8fccaa7bd4d.log"
2018/10/09 18:00:01 AzcopVersion 10.0.2-Preview
2018/10/09 18:00:01 OS-Environment windows
2018/10/09 18:00:01 OS-Architecture amd64
2018/10/09 18:10:14 AzcopVersion 10.0.2-Preview
2018/10/09 18:10:14 OS-Environment windows
2018/10/09 18:10:14 OS-Architecture amd64

[Question] 10.0.2 prev. Azcopy sync limitations.. Blob storage only?

Running Win 7. Does the new AzCopy sync support (local files) to (file storage)? Or does it only support (blob storage)?

Syncing from local file (C:\GQ) to (Azure File Storage "File Share")

so something like below , should work?
azcopy sync c:\GQ https:\\<azure file storage path> --recursive

Issue when copying from a file share locally onto another file share

Which version of the AzCopy was used?

Note: The version is visible when running AzCopy without any argument

10.0.1

Which platform are you using? (ex: Windows, Mac, Linux)

Linux

What command did you run?

Note: Please remove the SAS to avoid exposing your credentials. If you cannot remember the exact command, please retrieve it from the beginning of the log file.

./azcopy_linux_amd64 cp 'https://<fileshare:.file.core.windows.net//?<redacted_sas>&sharesnapshot=' '<different_azure_file_share>' --recursive=true

What problem was encountered?

000 : File Creation Error operation not supported

How can we reproduce the problem in the simplest way?

If I change the path from the CIFS share to the local directory, the command works (i.e., saving onto the VM itself). But if I make the path something on a new, empty Windows CIFS file share (i.e., restoring my snapshot onto a fresh share), I get errors.

Have you found a mitigation/solution?

No. But I believe it to be related to allocating memory for the file here

err = syscall.Fallocate(int(f.Fd()), 0, 0, fileSize)

Unclear failures in upload from local filesystem to Azure Storage

Ran a copy command that ended in thousands of errors; an example from the log follows. I assume it was due to a slow upload network (from a home machine), so I changed to AZCOPY_CONCURRENCY_VALUE=10 and so far haven't reproduced it. Given that the concurrency is an optional env var, I'd expect AzCopy to dynamically reduce the number of threads (or determine the upload throughput first), or provide a clear error message so the user can set it correctly.

Summary:
.log file created in C:\Users\klaas/.azcopy
519 Done, 904 Failed, 4322 Pending, 0 Skipped 5745 Total , 2-sec Throughput (Mb/s): 5.7147

Job 42742d06-ee4e-a143-7458-16f358102298 summary
Elapsed Time (Minutes): 1009.0905
Total Number Of Transfers: 5745
Number of Transfers Completed: 519
Number of Transfers Failed: 904
Number of Transfers Skipped: 0
TotalBytesTransferred: 798344735
Final Job Status: Cancelled

Put https://klaasbackup.blob.core.windows.net/summerbackup/2018/2018/JUne/DSCF1731.JPG?blockid=MTM1ODQyMmItYjYyYS00MzQ5LTZiOTUtZWJkYWQ0MjFjNmQ1&comp=block&se=2018-10-15T09%3A50%3A29Z&sig=3W7BpmoZYi8i6OXRlFg3mXhxQMIiEjLtUlR1ZR6jCN8%3D&sp=rwdlacup&spr=https&srt=sco&ss=bfqt&st=2018-09-15T01%3A50%3A29Z&sv=2017-11-09&timeout=601: net/http: HTTP/1.x transport connection broken: write tcp 192.168.1.48:55519->52.183.104.36:443: wsasend: An existing connection was forcibly closed by the remote host.
PUT https://klaasbackup.blob.core.windows.net/summerbackup/2018/2018/JUne/DSCF1731.JPG?blockid=mtm1odqymmityjyyys00mzq5ltziotutzwjkywq0mjfjnmq1&comp=block&se=2018-10-15t09%3A50%3A29z&sig=REDACTED&sp=rwdlacup&spr=https&srt=sco&ss=bfqt&st=2018-09-15t01%3A50%3A29z&sv=2017-11-09&timeout=601
PUT https://klaasbackup.blob.core.windows.net/summerbackup/2018/2018/JUne/DSCF1731.JPG?blockid=mtm1odqymmityjyyys00mzq5ltziotutzwjkywq0mjfjnmq1&comp=block&se=2018-10-15t09%3A50%3A29z&sig=REDACTED&sp=rwdlacup&spr=https&srt=sco&ss=bfqt&st=2018-09-15t01%3A50%3A29z&sv=2017-11-09&timeout=601
Put https://klaasbackup.blob.core.windows.net/summerbackup/2018/2018/JUne/DSCF1731.JPG?blockid=MTM1ODQyMmItYjYyYS00MzQ5LTZiOTUtZWJkYWQ0MjFjNmQ1&comp=block&se=2018-10-15T09%3A50%3A29Z&sig=3W7BpmoZYi8i6OXRlFg3mXhxQMIiEjLtUlR1ZR6jCN8%3D&sp=rwdlacup&spr=https&srt=sco&ss=bfqt&st=2018-09-15T01%3A50%3A29Z&sv=2017-11-09&timeout=601: net/http: HTTP/1.x transport connection broken: write tcp 192.168.1.48:55653->52.183.104.36:443: wsasend: An existing connection was forcibly closed by the remote host.
2018/09/15 03:59:27 ERR: [P#0-T#169] UPLOADFAILED: d:///Users/klaas/My%20Pictures/2018/JUne/DSCF1731.JPG : 000 : Chunk Upload Failed -> github.com/Azure/azure-storage-azcopy/ste.newAzcopyHTTPClientFactory.func1.1, /go/src/github.com/Azure/azure-storage-azcopy/ste/mgr-JobPartMgr.go:95

No build links and can't build for MacOS

Hi

I first reported this here
MicrosoftDocs/azure-docs#13735 (comment)

And am now moving it to this repo as it is more appropriate.

The readme suggests that there should be links for 3 downloads

Download the AzCopy executable using one of the following links:

Windows x64
Linux x64
MacOS x64

But there are no links, so I attempted to build on MacOS myself.
Having no experience with GO, I installed Go and added one by one the respective dependencies
until I encountered the following error.

common/credCache_darwin.go:28:2: cannot find package "github.com/jiacfan/keychain" in any of:
        /usr/local/opt/go/libexec/src/github.com/jiacfan/keychain (from $GOROOT)
        /Users/hrant/go-workspace/src/github.com/jiacfan/keychain (from $GOPATH)
cmd/cancel.go:28:2: cannot find package "github.com/spf13/cobra" in any of:
        /usr/local/opt/go/libexec/src/github.com/spf13/cobra (from $GOROOT)
        /Users/hrant/go-workspace/src/github.com/spf13/cobra (from $GOPATH)
common/credCache_linux.go:26:2: cannot find package "github.com/jiacfan/keyctl" in any of:
        /usr/local/opt/go/libexec/src/github.com/jiacfan/keyctl (from $GOROOT)
        /Users/hrant/go-workspace/src/github.com/jiacfan/keyctl (from $GOPATH)
cmd/cancel.go:28:2: cannot find package "github.com/spf13/cobra" in any of:
        /usr/local/opt/go/libexec/src/github.com/spf13/cobra (from $GOROOT)
        /Users/hrant/go-workspace/src/github.com/spf13/cobra (from $GOPATH)
cmd/cancel.go:28:2: cannot find package "github.com/spf13/cobra" in any of:
        /usr/local/opt/go/libexec/src/github.com/spf13/cobra (from $GOROOT)
        /Users/hrant/go-workspace/src/github.com/spf13/cobra (from $GOPATH)

I can't find github.com/jiacfan/keychain to add that dependency.
Please kindly either provide links for builds or advise on the proper build steps.

Many thanks in advance

Cannot SSH to CentOS 7.5 VM after installing azcopy 7.3.0-netcore

What problem was encountered?

Cannot SSH to a CentOS 7.5 VM after installing azcopy 7.3.0-netcore.
Found that the installation changes the permission of the user home directory from 700 to 775.

How can we reproduce the problem in the simplest way?

  1. create VM using Azure gallery image CentOS 7.5
    "imageReference": {
    "publisher": "OpenLogic",
    "offer": "CentOS",
    "sku": "7.5",
    "version": "latest"

  2. SSH to the VM
    before installing azcopy, found the permission of user home directory is 700

[holgo@azcopy home]$ date; ls -al
Wed Sep 26 08:12:12 UTC 2018
total 0
drwxr-xr-x. 3 root root 19 Sep 26 08:06 .
dr-xr-xr-x. 17 root root 224 Aug 15 20:04 ..
drwx------. 5 holgo holgo 124 Sep 26 08:10 holgo

  1. Install azcopy
    wget -O azcopy.tar.gz https://aka.ms/downloadazcopylinux64
    tar -xf azcopy.tar.gz
    sudo ./install.sh
    sudo yum install -y libunwind icu

  2. Now issue can be reproduced where any further SSH attempt will fail with error saying:
    Permission denied (publickey,gssapi-keyex,gssapi-with-mic)

From /var/log/secure, found error pointing to bad ownership or modes for directory /home/holgo

[holgo@azcopy home]$ sudo tail -f /var/log/secure
...
Sep 26 08:16:05 localhost sshd[1831]: Authentication refused: bad ownership or modes for directory /home/holgo
....

  1. re-check to find permission of /home/holgo is changed to 775

[holgo@azcopy home]$ date ; ls -al
Wed Sep 26 08:14:52 UTC 2018
total 0
drwxr-xr-x. 3 root root 19 Sep 26 08:06 .
dr-xr-xr-x. 17 root root 224 Aug 15 20:04 ..
drwxrwxr-x. 6 holgo holgo 177 Jul 19 04:56 holgo
[holgo@azcopy home]$

Have you found a mitigation/solution?

"sudo chmod -R 700 holgo/" will mitigate the issue

Piped input fails with content-length of 0

Attempted to pipe input and got a 400 error because the content-length is zero. Is this a supported scenario?

head -c 1K < /dev/urandom | ./azcopy-v2-linux cp "https://$account/test/file1.dat$token"


azcopy sync fails with "AuthenticationErrorDetail: se is mandatory", however the se is definitely included

AzCopy 10.0.4-Preview on Ubuntu 14.04

Command:
azcopy sync 'https://production8858.blob.core.windows.net/wad-iis-logfiles?se=2030-01-01&sp=rl&sv=2018-03-28&sr=c&sig=REDACTED' /backup --recursive

SAS was generated using:

az storage container generate-sas \
    --account-name production8858 \
    --account-key 'redacted' \
    --name wad-iis-logfiles \
    --permissions rl \
    --expiry 2030-01-01

The "se" value is definitely included, but azcopy fails.

Result:

Failed with error error starting the sync between source https://production8858.blob.core.windows.net/wad-iis-logfiles and destination /backup. 

Failed with error cannot list blobs for download. 

Failed with error -> github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-storage-blob-go/azblob.NewResponseError, /go/src/github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-storage-blob-go/azblob/zz_generated_response_error.go:28

===== RESPONSE ERROR (ServiceCode=AuthenticationFailed) =====
Description=Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
RequestId:c18abdb2-801e-0049-2f14-7c2fba000000
Time:2018-11-14T12:19:11.8402297Z, Details: 
   AuthenticationErrorDetail: se is mandatory. Cannot be empty
   Code: AuthenticationFailed
   GET https://production8858.blob.core.windows.net/wad-iis-logfiles?comp=list&restype=container&sig=REDACTED&sp=rl&sr=c&sv=2018-03-28&timeout=901
   User-Agent: [AzCopy/10.0.4-Preview Azure-Storage/0.3 (go1.10.3; linux)]
   X-Ms-Client-Request-Id: [41df3c4f-57ec-4ac7-7e35-7af1f4d305f0]
   X-Ms-Version: [2018-03-28]
   --------------------------------------------------------------------------------
   RESPONSE Status: 403 Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
   Access-Control-Allow-Origin: [*]
   Access-Control-Expose-Headers: [Access-Control-Allow-Origin]
   Content-Length: [407]
   Content-Type: [application/xml]
   Date: [Wed, 14 Nov 2018 12:19:11 GMT]
   Server: [Microsoft-HTTPAPI/2.0]
   X-Ms-Error-Code: [AuthenticationFailed]
   X-Ms-Request-Id: [c18abdb2-801e-0049-2f14-7c2fba000000]

ContentType for JSON objects not correctly identified

Which version of the AzCopy was used?

10.0.1-Preview

Which platform are you using? (ex: Windows, Mac, Linux)

Windows & Linux

What command did you run?

Note: Please remove the SAS to avoid exposing your credentials. If you cannot remember the exact command, please retrieve it from the beginning of the log file.

azcopy /SetContentType

What problem was encountered?

ContentType detection makes use of the built-in http.DetectContentType in Go, however this does not correctly identify the content type of JSON files (as well as JavaScript files and SVG images). This is particularly problematic when uploading material to be used in a static website. See the following for detail of the problem caused and some work-arounds:

https://liftcodeplay.com/2017/11/28/how-to-fix-azure-storage-blob-content-types/

How can we reproduce the problem in the simplest way?

Use azcopy on a test.json file and inspect the content type, set as application/octet-stream which then prevents the JSON from loading correctly in a client browser. Copy the same file using the Azure Storage Explorer and the correct content type is detected.

Have you found a mitigation/solution?

Two approaches to mitigation;

  1. Crawl all the files in the container after copying and set correct content types based on file extension or;
  2. Copy all files and then do a second invocation of AzCopy to copy the *.json files with an explicit Content Type set, i.e. AzCopy\AzCopy.exe" /Source:"" /Dest:"https://my.blob.core.windows.net/$web" /Pattern:"*.json" /S /Y /SetContentType:"application/json" /Z /V

A better solution for customers would be for AzCopy to apply additional file-extension-based ContentType heuristics for commonly mis-detected types. The JSON MIME-type issue is particularly problematic and hard for end users to debug as a content-type problem.
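
A v10-style sketch of mitigation 2 above, assuming your AzCopy build exposes a --content-type flag on copy (check azcopy copy -h for your build; the path, account, and SAS below are placeholders):

```shell
# Re-copy only the *.json files, forcing the correct MIME type explicitly
# (hypothetical v10 equivalent of the v8 /SetContentType workaround)
azcopy copy "./site/*.json" \
    "https://<account>.blob.core.windows.net/\$web?<SAS>" \
    --content-type "application/json"
```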

azcopy ls is not showing blobs in container

I see this on Linux and OSX.

[admin@ip-0A05000B ~]$ azcopy --version
azcopy version 10.0.4-Preview
[admin@ip-0A05000B ~]$ azcopy login
To sign in, use a web browser to open the page https://microsoft.com/devicelogin and enter the code BLC4LFLT9 to authenticate.
Login succeeded.

[admin@ip-0A05000B ~]$ azcopy ls "https://requawestus2.blob.core.windows.net/cyclecloud"
List is using OAuth token for authentication.

cannot list blobs for download. Failed with error -> github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-storage-blob-go/azblob.NewResponseError, /go/src/github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-storage-blob-go/azblob/zz_generated_response_error.go:28
===== RESPONSE ERROR (ServiceCode=AuthorizationPermissionMismatch) =====
Description=This request is not authorized to perform this operation using this permission.
RequestId:62af0922-301e-0085-2004-7b7e29000000
Time:2018-11-13T03:54:05.3116844Z, Details: 
   Code: AuthorizationPermissionMismatch
   GET https://requawestus2.blob.core.windows.net/cyclecloud?comp=list&restype=container&timeout=901
   Authorization: REDACTED
   User-Agent: [AzCopy/10.0.4-Preview Azure-Storage/0.3 (go1.10.3; linux)]
   X-Ms-Client-Request-Id: [99b8342d-573d-47a7-6e15-34888aa24380]
   X-Ms-Version: [2018-03-28]
   --------------------------------------------------------------------------------
   RESPONSE Status: 403 This request is not authorized to perform this operation using this permission.
   Content-Length: [279]
   Content-Type: [application/xml]
   Date: [Tue, 13 Nov 2018 03:54:05 GMT]
   Server: [Windows-Azure-Blob/1.0 Microsoft-HTTPAPI/2.0]
   X-Ms-Error-Code: [AuthorizationPermissionMismatch]
   X-Ms-Request-Id: [62af0922-301e-0085-2004-7b7e29000000]
   X-Ms-Version: [2018-03-28]

This is a complete reproduction after install on OSX and linux. There are no other options on the ls command - such as saskey for auth.

Uploading an Empty VHD Never Finishes

Which version of the AzCopy was used?

10.0.1-Preview

Which platform are you using? (ex: Windows, Mac, Linux)

Mac

What command did you run?

./azcopy copy "/Users/mrayermann/Downloads/*" "https://REDACTED.blob.core.windows.net/testcontainer/?REDACTED --recursive --overwrite=false --include "test.vhd;"

What problem was encountered?

From the output, it looks like the job never finishes. But, if I look at the container via Storage Explorer, I can see the file in my container.

How can we reproduce the problem in the simplest way?

  1. Create an empty .vhd: touch test.vhd
  2. Use AzCopy to upload it to a container

Have you found a mitigation/solution?

Quit AzCopy after a few minutes, with the assumption that the transfer probably finished.

Set Blob Properties - HTTP Cache-Control, etc

Is there a way to set properties on uploaded files, such as HTTP x-ms-blob-cache-control?

Which version of the AzCopy was used?

AzCopy/v10.0.2-Preview Azure-Storage/0.1 (go1.10.3; Windows_NT)

Which platform are you using? (ex: Windows, Mac, Linux)

Windows 10

Sync overrides the container during the first run

So sync works correctly with the SAS token, but whatever is in the container prior to the first sync gets deleted/overridden. Can we append instead?

BTW: during the 1st run, sync detected 5 files in the folder although there were just 2 (which was correctly reflected in the container after the sync). I then deleted 1 file and ran the sync again, and that time it correctly detected 1 file and correctly deleted 1 file from the container.

PS C:\Users\artek\Desktop\OneDrive - Microsoft\Current tasks\PowerShell & CLI & AzCopy\azcopy_windows_amd64_10.0.1> .\azcopy sync "C:\azcopy" "https://akzrsdemo.blob.core.windows.net/aktest?sv=RSv%2BW5Y%3D" --recursive=true

Job 08599522-c810-624a-6d48-faf60bd9ba7d has started

08599522-c810-624a-6d48-faf60bd9ba7d.log file created in C:\Users\artek/.azcopy
0 Done, 0 Failed, 5 Pending, 0 Skipped, 5 Total, 2-sec Throughput (Mb/s): 0

Job 08599522-c810-624a-6d48-faf60bd9ba7d summary
Elapsed Time (Minutes): 0.0334
Total Number Of Transfers: 5
Number of Transfers Completed: 5
Number of Transfers Failed: 0
Number of Transfers Skipped: 0
TotalBytesTransferred: 15612824
Final Job Status: Completed

PS C:\Users\artek\Desktop\OneDrive - Microsoft\Current tasks\PowerShell & CLI & AzCopy\azcopy_windows_amd64_10.0.1> .\azcopy sync "C:\azcopy" "https://akzrsdemo.blob.core.windows.net/aktest?sv=Y%3D" --recursive=true

Job 1f756979-1816-dd4f-4af4-db129ae90cc2 has started

1f756979-1816-dd4f-4af4-db129ae90cc2.log file created in C:\Users\artek/.azcopy
0 Done, 0 Failed, 1 Pending, 0 Skipped, 1 Total, 2-sec Throughput (Mb/s): 0

Job 1f756979-1816-dd4f-4af4-db129ae90cc2 summary
Elapsed Time (Minutes): 0.0333
Total Number Of Transfers: 1
Number of Transfers Completed: 1
Number of Transfers Failed: 0
Number of Transfers Skipped: 0
TotalBytesTransferred: 14944
Final Job Status: Completed

When resuming, throughput is no longer shown

C:\Users\seguler\Desktop\azcopy_windows_amd64_10.0.0\azcopy_windows_amd64_10.0.0>.\azcopy_windows_amd64.exe jobs resume 8f925120-9bb6-b540-707e-4be86899b4ff --destination-sas="REDACTED"

Job 8f925120-9bb6-b540-707e-4be86899b4ff has started

8f925120-9bb6-b540-707e-4be86899b4ff.log file created in C:\Users\seguler/.azcopy
1350 Done, 0 Failed, 0 Skipped, 58065 Pending, 59415 Total

When output=json is used for copy, some errors aren't outputted in a JSON friendly way

Which version of the AzCopy was used?

10.0.2-preview

Which platform are you using? (ex: Windows, Mac, Linux)

Windows

What command did you run?

I ran

copy "C:\Users\marayerm\Desktop\*" "https://redacted.blob.core.windows.net/one/?REDACTED" --overwrite=false --follow-symlinks --recursive --fromTo=LocalBlob --include "New Text Document.txt;" --output=json

when there is no file at C:\Users\marayerm\Desktop\New Text Document.txt

What problem was encountered?

Although output=json was specified, the output I received was failed to perform copy command due to error: cannot start job due to error: nothing can be uploaded, please use --recursive to upload directories., which is not JSON. I have seen other situations where this happens, but this is the easiest to reproduce. Basically, if output=json is used, then all output should be formatted as JSON objects.

How can we reproduce the problem in the simplest way?

Try to do a copy where the source does not exist.

Have you found a mitigation/solution?

No.

Provide a docker image on hub.docker.com

Which version of the AzCopy was used?

Try to use the new Go version of azcopy.

Which platform are you using? (ex: Windows, Mac, Linux)

Docker, Linux container

What command did you run?

For now, I'm building my own docker image based on this repository, but it's not automated.

Add MD5 check

/CheckMD5 is part of v8.1 of AzCopy and helps us avoid data corruption when downloading very large files from Azure Blob storage.
It would be great to have this feature added to this new AzCopy command line.

[QUESTION] AzCopy list fails with "403 Server failed to authenticate the request."

I am brand new to AzCopy so forgive me, but I have reviewed the readme.md and really tried to figure this one out. Eventually I'd like to test sync but I am stuck at the starting gate.

Version 10.0.2-Preview

Windows 10

Commands

ps> azcopy.exe login
Login succeeded.
ps> azcopy.exe list https://mystorage.blob.core.windows.net/backups
List is using OAuth token for authentication.

Error:

cannot list blobs for download. Failed with error -> github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-storage-blob-go/2018-03-28/azblob.NewResponseError, /go/src/github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-storage-blob-go/2018-03-28/azblob/zz_generated_response_error.go:28
===== RESPONSE ERROR (ServiceCode=AuthenticationFailed) =====
Description=Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
RequestId:db87066e-c01e-0173-67d6-5fb0b6000000
Time:2018-10-09T13:43:58.0479030Z, Details:
   AuthenticationErrorDetail: Issuer validation failed. Issuer did not match.
   GET https://mystorage.blob.core.windows.net/backups?comp=list&restype=container&timeout=901
   Authorization: REDACTED
   User-Agent: [AzCopy/v10.0.2-Preview Azure-Storage/0.1 (go1.10.3; Windows_NT)]
   X-Ms-Client-Request-Id: [8a047954-e7bf-4d30-77b0-ce3ea113e728]
   X-Ms-Version: [2018-03-28]
   --------------------------------------------------------------------------------
   RESPONSE Status: 403 Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
   Content-Length: [422]
   Content-Type: [application/xml]
   Date: [Tue, 09 Oct 2018 13:43:57 GMT]
   Server: [Microsoft-HTTPAPI/2.0]
   X-Ms-Error-Code: [AuthenticationFailed]
   X-Ms-Request-Id: [db87066e-c01e-0173-67d6-5fb0b6000000]

When azcopying blobs between two storage containers, blob metadata keys are capitalized.

Which version of the AzCopy was used?

10.0.0.3

Which platform are you using? (ex: Windows, Mac, Linux)

Linux

What command did you run?

azcopy cp "https://CENSORED.blob.core.windows.net/CONTAINER1?sv=2017-11-09&ss=bfqt&srt=sco&sp=rwdlacup&se=2018-10-25T21:22:33Z&st=2018-10-22T13:22:33Z&spr=https&sig=CENSORED" "https://CENSORED.blob.core.windows.net/CONTAINER2?sv=2017-11-09&ss=bfqt&srt=sco&sp=rwdlacup&se=2018-10-25T21:22:33Z&st=2018-10-22T13:22:33Z&spr=https&sig=CENSORED" --recursive=true

What problem was encountered?

The blob metadata for the original container had 2 keys:

encryptedvalue
filesize

after the transfer the 2 keys were changed to:

Encryptedvalues
Filesize

How can we reproduce the problem in the simplest way?

Run the same transfer command as above, then compare the metadata fields. In the Azure portal the metadata fields look the same, but when you check in Azure Storage Explorer you can see they are different.

Have you found a mitigation/solution?

When you copy without the shared access signature and instead use the account keys themselves, it doesn't occur.

Transfer of .pst file to Office 365 cannot complete with AzCopy for mac

Which version of the AzCopy was used?

Note: The version is visible when running AzCopy without any argument
  • AzCopy 10.0.2-Preview

Which platform are you using? (ex: Windows, Mac, Linux)

  • macOS High Sierra, 10.13.6 (17G65)

What command did you run?

Note: Please remove the SAS to avoid exposing your credentials. If you cannot remember the exact command, please retrieve it from the beginning of the log file.
  • The following command was run, with URL parameters and file paths replaced by underscores for privacy and security reasons:
./azcopy cp "___/backup.pst" "https://___.blob.core.windows.net/ingestiondata?sv=___&sr=___&si=___&sig=___&se=___"

Note that the instructions for Windows specify the following command:

AzCopy.exe /Source:<Location of PST files> /Dest:<SAS URL> /V:<Log file location> /Y

with the following comment regarding /Y:

This required switch allows the use of write-only SAS tokens when you upload the PST files to the Azure storage location. The SAS URL you obtained in step 1 (and specified in /Dest: parameter) is a write-only SAS URL, which is why you must include this switch. Note that a write-only SAS URL will not prevent you from using the Azure Storage Explorer to view a list of the PST files uploaded to the Azure storage location.

I am unsure how to add the equivalent switch here, as I have found no pointers in the documentation or help pages.

What problem was encountered?

  • AzCopy ran silently for a very long time and ended with a failed transfer:
Job a0eef242-719d-c04f-4cd7-8765dcf1b232 has started

a0eef242-719d-c04f-4cd7-8765dcf1b232.log file created in /___/.azcopy
0 Done, 1 Failed, 0 Pending, 0 Skipped, 1 Total

In the log file I have the following header:

2018/10/12 03:31:09 AzcopVersion  10.0.2-Preview
2018/10/12 03:31:09 OS-Environment  darwin
2018/10/12 03:31:09 OS-Architecture  amd64
2018/10/12 03:31:09 Job-Command cp ___/backup.pst https://___.blob.core.windows.net/ingestiondata?se=___&si=___&sig=___&sr=___&sv=___
2018/10/12 03:31:09 JobID=dfb0ba05-f63c-5748-62e2-8b5f5aa5745a, credential type: Anonymous
2018/10/12 03:31:09 scheduling JobID=dfb0ba05-f63c-5748-62e2-8b5f5aa5745a, Part#=0, Transfer#=0, priority=0
2018/10/12 03:31:09 INFO: [P#0-T#0] has worker 213 which is processing TRANSFER
2018/10/12 03:31:09 INFO: [P#0-T#0] Starting transfer: Source "___/backup.pst" Destination "https://___.blob.core.windows.net/ingestiondata/backup.pst?se=___&si=___&sig=___&sr=___&sv=___". Specified chunk size 8388608

Followed by a BUNCH of these that all have the exact same timestamp and "try" number (Try=1):

2018/10/12 03:31:09 ==> OUTGOING REQUEST (Try=1)
PUT https://___.blob.core.windows.net/ingestiondata/backup.pst?blockid=___&comp=___&se=___&si=___&sig=___&sr=___&sv=___&timeout=901
Content-Length: [8388608]
User-Agent: [AzCopy/v10.0.2-Preview Azure-Storage/0.1 (go1.10.3; darwin)]
X-Ms-Client-Request-Id: [7cb4a66d-c7d6-4b5e-7d91-6557c2855b95]
X-Ms-Version: [2018-03-28]

And then a bunch of these (whitespace may be wonky, as I have trouble copying from vim):

2018/10/12 03:35:27 ==> REQUEST/RESPONSE (Try=1/4m17.34582974s[SLOW >3s], OpTime=4m17.345889972s) -- REQUEST ERROR
    PUT https://___.blob.core.windows.net/ingestiondata/backup.pst?blockid=___&comp=___&se=___&si=___&sig=___&sr=___&sv=___&timeout=901
    Content-Length: [8388608]
    User-Agent: [AzCopy/v10.0.2-Preview Azure-Storage/0.1 (go1.10.3; darwin)]
    X-Ms-Client-Request-Id: [1db006dc-c187-4242-6a08-d290227a7d60]
    X-Ms-Version: [2018-03-28]
    --------------------------------------------------------------------------------
    ERROR:
-> github.com/Azure/azure-storage-azcopy/ste.newAzcopyHTTPClientFactory.func1.1, /go/src/github.com/Azure/azure-storage-azcopy/ste/mgr-JobPartMgr.go:95
 HTTP request failed
 
Put https://___.blob.core.windows.net/ingestiondata/backup.pst?blockid=___&comp=___&se=___&si=___&sig=___&sr=___&sv=___&timeout=901: read tcp 10.254.246.200:52028->52.239.148.74:443: read: connection reset by peer

goroutine 264 [running]:
github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-storage-blob-go/2018-03-28/azblob.stack(0xc42014e310, 0xc420a52100, 0x0)
         /go/src/github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-storage-blob-go/2018-03-28/azblob/zc_policy_request_log.go:146 +0xa7
github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-storage-blob-go/2018-03-28/azblob.NewRequestLogPolicyFactory.func1.1(0x461c020, 0xc4209828a0, 0xc420996800, 0x45b502b, 0xc, 0x45      b4175, 0xa)
         /go/src/github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-storage-blob-go/2018-03-28/azblob/zc_policy_request_log.go:96 +0x665
github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-pipeline-go/pipeline.PolicyFunc.Do(0xc42096c8c0, 0x461c020, 0xc4209828a0, 0xc420996800, 0xa, 0x452fa80, 0xc420044800, 0xc42072f8      d0)
         /go/src/github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-pipeline-go/pipeline/core.go:42 +0x44
github.com/Azure/azure-storage-azcopy/ste.NewVersionPolicyFactory.func1.1(0x461c020, 0xc4209828a0, 0xc420996800, 0x0, 0xc4209b7118, 0x402bcd4, 0x45d5b90)
         /go/src/github.com/Azure/azure-storage-azcopy/ste/mgr-JobPartMgr.go:59 +0xe8
github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-pipeline-go/pipeline.PolicyFunc.Do(0xc4209053a0, 0x461c020, 0xc4209828a0, 0xc420996800, 0xc4209b7128, 0xc4209b71e8, 0x4119083, 0      xc4209828b0)
         /go/src/github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-pipeline-go/pipeline/core.go:42 +0x44
github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-storage-blob-go/2018-03-28/azblob.responderPolicy.Do(0x46186a0, 0xc4209053a0, 0xc420744d20, 0x461c020, 0xc4209828a0, 0xc42099680      0, 0xc4209828b0, 0x461bfa0, 0xc42072f8c0, 0x0)
         /go/src/github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-storage-blob-go/2018-03-28/azblob/zz_generated_responder_policy.go:33 +0x53
github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-storage-blob-go/2018-03-28/azblob.anonymousCredentialPolicy.Do(0x46186e0, 0xc4209053c0, 0x461c020, 0xc4209828a0, 0xc420996800, 0      x4913da0, 0x461c020, 0xc4209828a0, 0xc420953660)
         /go/src/github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-storage-blob-go/2018-03-28/azblob/zc_credential_anonymous.go:54 +0x4f
github.com/Azure/azure-storage-azcopy/ste.NewBlobXferRetryPolicyFactory.func1.1(0x461bfa0, 0xc42072f8c0, 0xc420996700, 0x45b9940, 0x16, 0xc4208d39b0, 0x24)
         /go/src/github.com/Azure/azure-storage-azcopy/ste/xfer-retrypolicy.go:362 +0x6c1
github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-pipeline-go/pipeline.PolicyFunc.Do(0xc42096c910, 0x461bfa0, 0xc42072f8c0, 0xc420996700, 0x24, 0x41f7870, 0x454f900, 0xc4208ef6b0      )
         /go/src/github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-pipeline-go/pipeline/core.go:42 +0x44
github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-storage-blob-go/2018-03-28/azblob.NewUniqueRequestIDPolicyFactory.func1.1(0x461bfa0, 0xc42072f8c0, 0xc420996700, 0x45b432d, 0xa,       0xc420768180, 0x3b)
         /go/src/github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-storage-blob-go/2018-03-28/azblob/zc_policy_unique_request_id.go:19 +0x9c
github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-pipeline-go/pipeline.PolicyFunc.Do(0xc4209053e0, 0x461bfa0, 0xc42072f8c0, 0xc420996700, 0x3b, 0xc42000e450, 0xc4208ef710, 0xc420      9b7608)
         /go/src/github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-pipeline-go/pipeline/core.go:42 +0x44
github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-storage-blob-go/2018-03-28/azblob.NewTelemetryPolicyFactory.func1.1(0x461bfa0, 0xc42072f8c0, 0xc420996700, 0x1, 0x0, 0x1, 0xc420      00e450)
        /go/src/github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-storage-blob-go/2018-03-28/azblob/zc_policy_telemetry.go:34 +0x9e
github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-pipeline-go/pipeline.PolicyFunc.Do(0xc4208ef710, 0x461bfa0, 0xc42072f8c0, 0xc420996700, 0xc4208ef710, 0x45b24df, 0xc4209b7678, 0      x4013318)
         /go/src/github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-pipeline-go/pipeline/core.go:42 +0x44
github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-pipeline-go/pipeline.(*pipeline).Do(0xc42072f880, 0x461bfa0, 0xc42072f8c0, 0x4618700, 0xc420744d20, 0xc420996700, 0x2d, 0xc42079      2035, 0x19, 0x0)
         /go/src/github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-pipeline-go/pipeline/core.go:128 +0x81
github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-storage-blob-go/2018-03-28/azblob.blockBlobClient.StageBlock(0xc420792000, 0x5, 0x0, 0x0, 0x0, 0xc420792008, 0x2d, 0xc420792035,       0x19, 0x0, ...)
         /go/src/github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-storage-blob-go/2018-03-28/azblob/zz_generated_block_blob.go:262 +0x4a9
github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-storage-blob-go/2018-03-28/azblob.BlockBlobURL.StageBlock(0xc420792000, 0x5, 0x0, 0x0, 0x0, 0xc420792008, 0x2d, 0xc420792035, 0x      19, 0x0, ...)
         /go/src/github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-storage-blob-go/2018-03-28/azblob/url_block_blob.go:74 +0x131
github.com/Azure/azure-storage-azcopy/ste.(*blockBlobUpload).blockBlobUploadFunc.func1(0xdf)
         /go/src/github.com/Azure/azure-storage-azcopy/ste/xfer-localToBlockBlob.go:338 +0x5a1
github.com/Azure/azure-storage-azcopy/ste.(*jobsAdmin).transferAndChunkProcessor(0xc4201340c0, 0xdf)
         /go/src/github.com/Azure/azure-storage-azcopy/ste/JobsAdmin.go:216 +0xa6
created by github.com/Azure/azure-storage-azcopy/ste.initJobsAdmin
         /go/src/github.com/Azure/azure-storage-azcopy/ste/JobsAdmin.go:160 +0x501

... but yeah, it's a >12MB log file. Is it safe to share all its contents? I can't tell what's sensitive and what's not haha.

How can we reproduce the problem in the simplest way?

Have you found a mitigation/solution?

  • Not yet... I am trying to find a Windows PC I can borrow to try getting it done with the Windows utility. This is rather frustrating. I really appreciate any insights or advice you may have! Thanks for taking the time to read this.

Vague error when writing to destination

Which version of the AzCopy was used?

Note: The version is visible when running AzCopy without any argument

10.0.2

Which platform are you using? (ex: Windows, Mac, Linux)

Linux 16.04

What command did you run?

Note: Please remove the SAS to avoid exposing your credentials. If you cannot remember the exact command, please retrieve it from the beginning of the log file.

./azcopy cp "https://cjwstorage.blob.core.windows.net/dltest/" /mnt --recursive

What problem was encountered?

The logs said 'No such file or directory'. I expected it to say 'no permission to write to destination' or something

2018/10/03 20:43:26 ERR: [P#0-T#2096] DOWNLOADFAILED: https://cjwstorage.blob.core.windows.net/dltest/2631.bin?se=2018-10-14t04%3A28%3A12z&sig=REDACTED&sp=rwdlacup&spr=https%2Chttp&srt=sco&ss=bfqt&st=2018-10-02t20%3A28%3A12z&sv=2017-11-09 : 000 : File Creation Error open /mnt/dltest/2631.bin: no such file or directory
Dst: /mnt/dltest/2631.bin
2018/10/03 20:43:26 ERR: [P#0-T#2100] DOWNLOADFAILED: https://cjwstorage.blob.core.windows.net/dltest/8732.bin?se=2018-10-14t04%3A28%3A12z&sig=REDACTED&sp=rwdlacup&spr=https%2Chttp&srt=sco&ss=bfqt&st=2018-10-02t20%3A28%3A12z&sv=2017-11-09 : 000 : File Creation Error open /mnt/dltest/8732.bin: no such file or directory
Dst: /mnt/dltest/8732.bin
2018/10/03 20:43:26 ERR: [P#0-T#2100] /mnt/dltest/8732.bin: 0: Delete File Error -remove /mnt/dltest/8732.bin: no such file or directory
2018/10/03 20:43:26 JobID=8ee131d6-d92a-494e-4faa-9230f365db52, Part#=0, TransfersDone=2053 of 10000
2018/10/03 20:43:26 INFO: [P#0-T#2114] has worker 104 which is processing TRANSFER
2018/10/03 20:43:26 ERR: [P#0-T#2114] DOWNLOADFAILED: https://cjwstorage.blob.core.windows.net/dltest/8403.bin?se=2018-10-14t04%3A28%3A12z&sig=REDACTED&sp=rwdlacup&spr=https%2Chttp&srt=sco&ss=bfqt&st=2018-10-02t20%3A28%3A12z&sv=2017-11-09 : 000 : File Creation Error open /mnt/dltest/8403.bin: no such file or directory
Dst: /mnt/dltest/8403.bin

How can we reproduce the problem in the simplest way?

Have you found a mitigation/solution?

I didn't have access to the /mnt folder. After a chown, the command worked fine.

Allow anything to be uploaded as a page blob/VHDs as block blobs

Which version of the AzCopy was used?

10.0.2-preview

Which platform are you using? (ex: Windows, Mac, Linux)

Windows

Currently, VHDs are always uploaded as page blobs, and everything else is always uploaded as block blobs. It'd be nice if there were a way to override those defaults.

Passing in a badly formed SAS doesn't error out quickly.

Command: azcopy cp "source" "destination?SAS" --recursive=true
where the SAS was incorrect.
The command enumerated the local files and then racked up failure after failure. If the SAS isn't correct on the first PUT, why continue to process the list?
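One way the requested fail-fast behavior could work is to classify the first authentication failure as fatal and cancel the remaining transfers instead of attempting each one. A sketch of that classification only, assuming a 401/403 from the service means the credential itself is bad; this is not how AzCopy is structured today:

```go
package main

import "fmt"

// isAuthFailure reports whether a response status indicates the SAS itself is
// invalid, in which case a job could abort immediately rather than retrying
// every queued transfer.
func isAuthFailure(status int) bool {
	return status == 401 || status == 403
}

func main() {
	fmt.Println(isAuthFailure(403)) // true: bad or expired SAS, abort the job
	fmt.Println(isAuthFailure(503)) // false: transient error, worth retrying
}
```

A real implementation would also need to distinguish a container-level 403 (bad SAS) from a per-blob 403 (e.g. a token scoped to some blobs only), which is why plumbing this through carefully matters.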

[Question] Error: Combination 'Localfile' not supported for sync command (File Storage)

Azcopy 10.0.2 Preview - Win 7 - Powershell

The following string works for a "copy", but when using sync it throws the error below:
.\azcopy sync "C:\GCDS_dev" "https://azgcdsdevst1.file.core.windows.net/gcdsbuild/GQtest?<retracted_key>" --recursive

error parsing the input given by the user. Failed with error source 'C:\GCDS_dev' / destination 'https://azgcdsdevst1.file.core.windows.net/gcdsbuild/GQtest?"retracted_key"' combination 'LocalFile' not supported for sync command

Any ideas?

Unable to build since 10.0.1: not enough arguments in call to blockBlobUrl.StageBlock

Which version of the AzCopy was used?

10.0.1

Note: The version is visible when running AzCopy without any argument

Which platform are you using? (ex: Windows, Mac, Linux)

Linux

What command did you run?

go get github.com/Azure/azure-storage-azcopy

Note: Please remove the SAS to avoid exposing your credentials. If you cannot remember the exact command, please retrieve it from the beginning of the log file.

What problem was encountered?

go get github.com/Azure/azure-storage-azcopy                                   :(
# github.com/Azure/azure-storage-azcopy/ste
go/src/github.com/Azure/azure-storage-azcopy/ste/xfer-localToBlockBlob.go:336:36: not enough arguments in call to blockBlobUrl.StageBlock
	have (context.Context, string, io.ReadSeeker, azblob.LeaseAccessConditions)
	want (context.Context, string, io.ReadSeeker, azblob.LeaseAccessConditions, []byte)
go/src/github.com/Azure/azure-storage-azcopy/ste/xfer-localToBlockBlob.go:582:37: not enough arguments in call to pageBlobUrl.UploadPages
	have (context.Context, int64, io.ReadSeeker, azblob.BlobAccessConditions)
	want (context.Context, int64, io.ReadSeeker, azblob.PageBlobAccessConditions, []byte)

How can we reproduce the problem in the simplest way?

go get github.com/Azure/azure-storage-azcopy                                   :(

Have you found a mitigation/solution?

10.0.0 worked fine for me

[Question] Azcopy sync --log-level string flag options?

V10.0.2 - Win 7

What are the possible string options for the "azcopy sync" --log-level flag?

"WARNING" is the default, but it seems that the logs stored in C:\Users\<username>\.azcopy contain more than just warnings?

Some of these logs get quite large: a 1.4 GB copy made a 100 MB log!

So decreasing the verbosity would certainly be useful if azcopy were to run on a regular basis.

Sync does not work with OAuth & error messages are not clear

PS C:\Users\artek\Desktop\OneDrive - Microsoft\Current tasks\PowerShell & CLI & AzCopy\azcopy_windows_amd64_10.0.1> .\azcopy sync "C:\azcopy" https://akzrsdemo.blob.core.windows.net/
Using OAuth token for authentication.

error performing the sync between source and destination. Failed with error error starting the sync between source C:/azcopy/ and destination https://akzrsdemo.blob.core.windows.net. Failed with error cannot list blobs for download. Failed with error -> github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-storage-blob-go/2018-03-28/azblob.NewResponseError, /go/src/github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-storage-blob-go/2018-03-28/azblob/zz_generated_response_error.go:28
===== RESPONSE ERROR (ServiceCode=AuthenticationFailed) =====
Description=Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
RequestId:418ad35a-c01e-0033-54f7-5f6b64000000
Time:2018-10-09T17:39:53.7100125Z, Details:
AuthenticationErrorDetail: Authentication scheme Bearer is not supported.
GET https://akzrsdemo.blob.core.windows.net?comp=list&restype=container&timeout=901
Authorization: REDACTED
User-Agent: [AzCopy/v10.0.2-Preview Azure-Storage/0.1 (go1.10.3; Windows_NT)]
X-Ms-Client-Request-Id: [9352f58d-14b0-44cd-5204-f68040769df0]
X-Ms-Version: [2018-03-28]

RESPONSE Status: 403 Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
Content-Length: [421]
Content-Type: [application/xml]
Date: [Tue, 09 Oct 2018 17:39:52 GMT]
Server: [Microsoft-HTTPAPI/2.0]
X-Ms-Error-Code: [AuthenticationFailed]
X-Ms-Request-Id: [418ad35a-c01e-0033-54f7-5f6b64000000]

PS C:\Users\artek\Desktop\OneDrive - Microsoft\Current tasks\PowerShell & CLI & AzCopy\azcopy_windows_amd64_10.0.1> .\azcopy sync "C:\azcopy" https://akzrsdemo.blob.core.windows.net/newcontainer
Using OAuth token for authentication.

error performing the sync between source and destination. Failed with error error starting the sync between source C:/azcopy/ and destination https://akzrsdemo.blob.core.windows.net/newcontainer. Failed with error cannot list blobs for download. Failed with error -> github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-storage-blob-go/2018-03-28/azblob.NewResponseError, /go/src/github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-storage-blob-go/2018-03-28/azblob/zz_generated_response_error.go:28
===== RESPONSE ERROR (ServiceCode=AuthenticationFailed) =====
Description=Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
RequestId:968a0fa4-101e-009e-7ff7-5f7219000000
Time:2018-10-09T17:40:53.3630243Z, Details:
AuthenticationErrorDetail: Authentication scheme Bearer is not supported.
GET https://akzrsdemo.blob.core.windows.net/newcontainer?comp=list&restype=container&timeout=901
Authorization: REDACTED
User-Agent: [AzCopy/v10.0.2-Preview Azure-Storage/0.1 (go1.10.3; Windows_NT)]
X-Ms-Client-Request-Id: [52d95c9f-9b08-4d6d-4868-f0944d33b6c7]
X-Ms-Version: [2018-03-28]

RESPONSE Status: 403 Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
Content-Length: [421]
Content-Type: [application/xml]
Date: [Tue, 09 Oct 2018 17:40:52 GMT]
Server: [Microsoft-HTTPAPI/2.0]
X-Ms-Error-Code: [AuthenticationFailed]
X-Ms-Request-Id: [968a0fa4-101e-009e-7ff7-5f7219000000]

PS C:\Users\artek\Desktop\OneDrive - Microsoft\Current tasks\PowerShell & CLI & AzCopy\azcopy_windows_amd64_10.0.1> .\azcopy sync "C:\azcopy" https://akzrsdemo.blob.core.windows.net/aktest
Using OAuth token for authentication.

error performing the sync between source and destination. Failed with error error starting the sync between source C:/azcopy/ and destination https://akzrsdemo.blob.core.windows.net/aktest. Failed with error cannot list blobs for download. Failed with error -> github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-storage-blob-go/2018-03-28/azblob.NewResponseError, /go/src/github.com/Azure/azure-storage-azcopy/vendor/github.com/Azure/azure-storage-blob-go/2018-03-28/azblob/zz_generated_response_error.go:28
===== RESPONSE ERROR (ServiceCode=AuthenticationFailed) =====
Description=Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
RequestId:534b0f92-701e-0020-34f7-5fa640000000
Time:2018-10-09T17:41:15.4788450Z, Details:
AuthenticationErrorDetail: Authentication scheme Bearer is not supported.
GET https://akzrsdemo.blob.core.windows.net/aktest?comp=list&restype=container&timeout=901
Authorization: REDACTED
User-Agent: [AzCopy/v10.0.2-Preview Azure-Storage/0.1 (go1.10.3; Windows_NT)]
X-Ms-Client-Request-Id: [0e9363fd-39b9-4cb2-75b7-eca33eb4ab84]
X-Ms-Version: [2018-03-28]

RESPONSE Status: 403 Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
Content-Length: [421]
Content-Type: [application/xml]
Date: [Tue, 09 Oct 2018 17:41:15 GMT]
Server: [Microsoft-HTTPAPI/2.0]
X-Ms-Error-Code: [AuthenticationFailed]
X-Ms-Request-Id: [534b0f92-701e-0020-34f7-5fa640000000]

Create if not Exists, for the storage account

My plan is to use azcopy as part of CI jobs to upload release assets, etc.

I'd like to be able to use this in scenarios where the azure storage container or account might be abstracted or parameterized. In this case, it would be ideal to be able to say something along the lines of: azcopy ensure-container --account-name=strgacct1 --container-name=release-assets.

Possible downside: I'm guessing that this tool only deals with Storage auth right now, rather than generic ARM auth. Maybe this isn't a full downside given the changes to Azure Storage auth with RBAC, etc.
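An idempotent "ensure" could be layered on the existing Blob REST create-container call by treating 409 ContainerAlreadyExists as success. A sketch of those semantics only; ensure-container and ensureOutcome are hypothetical names, not existing AzCopy features:

```go
package main

import "fmt"

// ensureOutcome sketches the idempotent semantics a hypothetical
// "ensure-container" command could use on top of the Blob REST call
// PUT https://<account>.blob.core.windows.net/<container>?restype=container:
// 201 means the container was created, 409 (ContainerAlreadyExists) means it
// was already there, and anything else is a real error.
func ensureOutcome(status int) (ok bool, note string) {
	switch status {
	case 201:
		return true, "created"
	case 409:
		return true, "already exists"
	default:
		return false, fmt.Sprintf("unexpected status %d", status)
	}
}

func main() {
	ok, note := ensureOutcome(409)
	fmt.Println(ok, note) // true already exists
}
```

Creating the storage account itself would indeed require ARM auth, but container-level "create if not exists" fits within the Storage auth the tool already uses.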

[Question] Copy fails from local files to azure file storage - 404 errors

10.0.2 Preview - Win 7

Copy fails when I try a local file copy to-> azure file storage with the 10.0.2 preview. Works with current 8.1.0

this is the 8.1.0 line
azcopy /source:C:\GCDS_dev /dest:/https://azgcdsdevst1.file.core.windows.net/gcdsbuild/GQtest /DestKey:<retracted> /s /BlockSizeInMB:32

converted to 10.0.2 preview
azcopy copy C:\GCDS_dev https://azgcdsdevst1.file.core.windows.net/gcdsbuild/GQtest?<retracted_sas> --recursive=true --block-size=32

The result is that all files fail with 404 errors in the log... any ideas?

RESPONSE Status: 404 The specified resource does not exist.
Content-Length: [223]
Content-Type: [application/xml]
Date: [Mon, 08 Oct 2018 14:18:06 GMT]
Server: [Windows-Azure-Blob/1.0 Microsoft-HTTPAPI/2.0]
X-Ms-Error-Code: [ResourceNotFound]
X-Ms-Request-Id: [7ece6fe9-501e-0107-1c11-5f316f000000]
X-Ms-Version: [2018-03-28]
2018/10/08 14:18:07 JobID=5d651ae0-11f9-284d-4ea1-9e641861b502, Part#=0, TransfersDone=24 of 25
2018/10/08 14:18:09 ==> REQUEST/RESPONSE (Try=1/8.618s[SLOW >3s], OpTime=8.624s) -- RESPONSE SUCCESSFULLY RECEIVED
DELETE https://azgcdsdevst1.blob.core.windows.net/gcdstest2/GQtest/GCDS_dev/Plotters/PLT_CAN_Bdy.dwg?"retracted_sas"%3D&timeout=901
User-Agent: [AzCopy/v10.0.2-Preview Azure-Storage/0.1 (go1.10.3; Windows_NT)]
X-Ms-Client-Request-Id: [a661a951-830b-41ce-4a8a-7758ddfc5812]
X-Ms-Version: [2018-03-28]

[Question] Slow performance of sync - Local File to Blob

V10.0.2 Preview - Win 7

.\azcopy sync "C:\GCDS_dev" "https://azgcdsdevst1.blob.core.windows.net/gcdstest2?--Key Retracted--" --recursive

When syncing larger amounts of data (>1 GB, local file to Blob), sync seems to take a long time to even prep the job (i.e., syncing 1.4 GB of data seems to take more than 30 minutes to even start).

The copy function, by contrast, seems to start almost straight away.

I know the sync command obviously has some file comparison work to do before it can do anything, but it still seems extraordinarily slow to begin.

Any idea what could be causing delay?

Is it possible to report file conflict check progress to the command line with a flag?

Docs are insufficient to help users understand what to do

As the README.md exists today, I don't really understand the following:

  • when to use cp vs sync
  • the semantics of sync. How is the sync achieved? Is it two-way? How is integrity checked? Are timestamps used, or only missing files?
  • is it possible to sync between two blob stores?

As an aside, what do I do if I do not want "job" behavior? I have things arranged such that uploads/syncs should be idempotent and hopefully shouldn't need to keep state around. Plus, I'm exclusively using this in CI scenarios where state won't be persisted between runs anyway.

AzCopy Sync Blob to Blob support

Which version of the AzCopy was used?

Note: The version is visible when running AzCopy without any argument

10.0

Which platform are you using? (ex: Windows, Mac, Linux)

Windows

What problem was encountered?

AzCopy sync now supports file system to blob; it should also support blob-to-blob sync.

[question] can we use proxy on v10?

Which version of the AzCopy was used?

Note: The version is visible when running AzCopy without any argument

Considering using v10.

Which platform are you using? (ex: Windows, Mac, Linux)

Linux

What problem was encountered?

This is the question: I want to use a proxy, as on Windows
(i.e. https://blogs.msdn.microsoft.com/azure4fun/2016/10/17/azcopy-unable-to-connect-to-the-remote-server/ )

Having looked at the code, it appears to use the system proxy. Can't the user change it as they want?
https://github.com/Azure/azure-storage-azcopy/blob/master/common/oauthTokenManager.go#L69

If there is any way to set my own proxy, I would really like to know how. Thanks.

Files with uppercase extensions are not uploaded with azcopy

We are migrating to Azure, but we have a lot of files. I have a copy script for every file extension we want to copy to blob storage. We are using AzCopy 8. For example png files:

For regular png files we get the result:

Finished 1252 of total 1252 file(s).
[2018/08/24 00:36:15] Transfer summary:
-----------------
Total files transferred: 1252
Transfer successfully:   1252
Transfer skipped:        0
Transfer failed:         0
Elapsed time:            00.00:00:26

This command, however,

.\AzCopy.exe /@:"C:\Path\inputParams.txt" /Pattern:"*.PNG" /V:"C:\LogPath\png-cap.log"

inputParams.txt is this:

/Source:"D:\RootPath" 
/Dest:https://blip.blob.core.windows.net/root/ 
/DestKey:{omitted}== 		
/SetContentType
/NC:4
/S 
/XO
/Y

Prints this

Finished 0 of total 0 file(s).
[2018/08/24 00:36:32] Transfer summary:
-----------------
Total files transferred: 0
Transfer successfully:   0
Transfer skipped:        0
Transfer failed:         0
Elapsed time:            00.00:00:17

The output of C:\LogPath\png-cap.log is:

[2018/08/24 00:36:15.362+02:00] >>>>>>>>>>>>>>>>
[2018/08/24 00:36:15.378+02:00][VERBOSE] Finished: 0 file(s), 0 B; Average Speed:0 B/s.
[2018/08/24 00:36:15.378+02:00][VERBOSE] 8.0.0 : AzCopy /@:C:\Path\inputParams.txt /Pattern:*.PNG /V:C:\LogPath\png-cap.log
[2018/08/24 00:36:15.471+02:00][VERBOSE] Attempt to parse address 'D:\RootPath' to a directory as a candidate location succeeded.
[2018/08/24 00:36:15.471+02:00][VERBOSE] Source is interpreted as a Local directory: D:\RootPath\.
[2018/08/24 00:36:15.502+02:00][VERBOSE] Attempt to parse address 'https://blip.blob.core.windows.net/root/' to a directory as a candidate location succeeded.
[2018/08/24 00:36:15.518+02:00][VERBOSE] Attempt to parse address 'https://blip.blob.core.windows.net/root/' to a single file location failed: Invalid location 'https://blip.blob.core.windows.net/root/', cannot get valid account, container and blob name.
[2018/08/24 00:36:15.518+02:00][VERBOSE] Destination is interpreted as a Cloud blob directory: https://blip.blob.core.windows.net/root/.
[2018/08/24 00:36:20.370+02:00][VERBOSE] Finished: 0 file(s), 0 B; Average Speed:0 B/s.
[2018/08/24 00:36:25.377+02:00][VERBOSE] Finished: 0 file(s), 0 B; Average Speed:0 B/s.
[2018/08/24 00:36:30.369+02:00][VERBOSE] Finished: 0 file(s), 0 B; Average Speed:0 B/s.
[2018/08/24 00:36:32.647+02:00] Transfer summary:
-----------------
Total files transferred: 0
Transfer successfully:   0
Transfer skipped:        0
Transfer failed:         0
Elapsed time:            00.00:00:17

Is there a way to have the files with capital letters in the extension uploaded?

Detailed information when downloading from Azure blob

Currently, we only see the throughput displayed when we download a file from a blob.
Adding the option --output=json gives a bit more information, but that is not readable from an end-user perspective.
We would need the size of the file being downloaded, the time remaining, the size remaining, etc.
