backblaze / B2_Command_Line_Tool
The command-line tool that gives easy access to all of the capabilities of B2 Cloud Storage
License: Other
I'm having this error with the b2 CLI tool, version 0.4.8. I imagine I need some sort of authorize_automatically key in my ~/.b2_account_info config file?
b2 sync my_folder b2:my_bucket/my_folder
Traceback (most recent call last):
File "/usr/local/bin/b2", line 9, in <module>
load_entry_point('b2==0.4.8', 'console_scripts', 'b2')()
File "/usr/local/lib/python2.7/dist-packages/b2/console_tool.py", line 759, in main
exit_status = ct.run_command(decoded_argv)
File "/usr/local/lib/python2.7/dist-packages/b2/console_tool.py", line 245, in run_command
return self.sync(args)
File "/usr/local/lib/python2.7/dist-packages/b2/console_tool.py", line 675, in sync
recursive=True
File "/usr/local/lib/python2.7/dist-packages/b2/bucket.py", line 114, in ls
response = session.list_file_names(self.id_, start_file_name, fetch_count)
File "/usr/local/lib/python2.7/dist-packages/b2/session.py", line 40, in wrapper
reauthorization_success = self.account_info.authorize_automatically()
AttributeError: 'StoredAccountInfo' object has no attribute 'authorize_automatically'
Reauthorizing manually with b2 authorize_account fixes it temporarily.
Any ideas?
Remove message_and_exit in favor of exceptions.
Should we keep using optparse? (We can't use argparse due to RHEL6 not having it.)
I would like to see a feature to encrypt files during upload, and possibly decrypt them again when downloading. It should be possible to use this feature with backup scripts/cronjobs.
Questions that need to be discussed:
It would be a beautiful thing for B2 CLT to display the upload speed and progress, similar to what wget does.
This kind of feedback would also give assurance that something is happening.
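A display like that could be built from a simple callback fed with a bytes-sent counter. This is only a sketch of the idea; make_progress is a hypothetical helper, not part of the tool:

```python
# Hypothetical wget-style progress display: a callback that turns a bytes-sent
# counter into a percentage and throughput figure on one updating line.
import sys
import time

def make_progress(total_bytes, stream=sys.stderr):
    start = time.time()

    def update(sent_bytes):
        elapsed = max(time.time() - start, 1e-6)
        pct = 100.0 * sent_bytes / total_bytes
        speed = sent_bytes / elapsed / 1024.0
        # "\r" rewrites the same line instead of scrolling the terminal
        stream.write("\r%3.0f%%  %8.1f KB/s" % (pct, speed))
        stream.flush()

    return update

progress = make_progress(1000)
progress(500)  # rewrites the status line at 50%
```

The upload loop would call the returned function after each chunk is sent.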
Hi,
I am quite often getting the following error when uploading files:
urllib2.URLError: <urlopen error [Errno 104] Connection reset by peer>
What can I do to troubleshoot the issue?
Thanks
I've written an (almost complete) bash completion script for b2: https://gist.github.com/mapio/67eb007b2f995b708759. I'll be glad if you want to consider it for inclusion in this repo.
I'm trying to sync several directories for backup.
After a while I get:
URL: https://api001.backblaze.com/b2api/v1/b2_start_large_file
Params: None
Headers: {'Authorization': u'xxxxxxx', 'User-Agent': 'backblaze-b2/0.4.7 python/2.7.6'}
{
"code": "bad_request",
"message": "large files not available yet",
"status": 400
}
Which is fine except the sync stops and I'm missing 100G after the large file.
There should be an option to just show that error and continue and not stop.
Hello,
I'm using b2 0.5.0 on python 2.7, Ubuntu 14.04 64 bit.
I wanted to sync again after I had already synced with a previous version of b2 (0.4.10 afaik), and I get this error:
ERROR: unknown error: 400 bad_request large files not available yet
the largest file in the source directory is 427M, while the file it's trying to upload is about 201M.
root@amaranta:/other/localbackup# /root/tools/backblaze/bin/b2 version
b2 command line tool, version 0.5.0
root@amaranta:/other/localbackup# ls -lh -S | head
total 152G
-rw------- 1 root root 427M Jun 4 2015 duplicity-full-signatures.20150604T075332Z.sigtar.gpg
-rw------- 1 root root 201M Sep 3 2015 duplicity-inc.20150902T014501Z.to.20150903T014502Z.vol2.difftar.gpg
-rw------- 1 root root 201M Nov 15 03:47 duplicity-inc.20151114T024507Z.to.20151115T024505Z.vol7.difftar.gpg
-rw------- 1 root root 201M Jun 26 2015 duplicity-inc.20150625T014502Z.to.20150626T014502Z.vol3.difftar.gpg
-rw------- 1 root root 201M Nov 18 03:46 duplicity-inc.20151117T024502Z.to.20151118T024503Z.vol1.difftar.gpg
-rw------- 1 root root 201M Sep 25 2015 duplicity-inc.20150924T014501Z.to.20150925T014503Z.vol6.difftar.gpg
-rw------- 1 root root 201M Jan 12 03:51 duplicity-inc.20160111T024504Z.to.20160112T024503Z.vol4.difftar.gpg
-rw------- 1 root root 201M Nov 14 03:46 duplicity-inc.20151113T024503Z.to.20151114T024507Z.vol2.difftar.gpg
-rw------- 1 root root 201M Aug 5 2015 duplicity-inc.20150804T014501Z.to.20150805T014501Z.vol1.difftar.gpg
root@amaranta:/other/localbackup# /root/tools/backblaze/bin/b2 sync ./ b2:amarantabackup
+ duplicity-inc.20160111T024504Z.to.20160112T024503Z.vol6.difftar.gpg
ERROR: unknown error: 400 bad_request large files not available yet
root@amaranta:/other/localbackup# ls -l duplicity-inc.20160111T024504Z.to.20160112T024503Z.vol6.difftar.gpg
-rw------- 1 root root 209718570 Jan 12 03:52 duplicity-inc.20160111T024504Z.to.20160112T024503Z.vol6.difftar.gpg
root@amaranta:/other/localbackup# ls -lh duplicity-inc.20160111T024504Z.to.20160112T024503Z.vol6.difftar.gpg
-rw------- 1 root root 201M Jan 12 03:52 duplicity-inc.20160111T024504Z.to.20160112T024503Z.vol6.difftar.gpg
root@amaranta:/other/localbackup#
pip freeze output in b2 virtualenv:
b2==0.5.0
cffi==1.5.2
cryptography==1.3.1
enum34==1.1.2
idna==2.1
ipaddress==1.0.16
ndg-httpsclient==0.4.0
portalocker==0.5.7
pyasn1==0.1.9
pycparser==2.14
pyOpenSSL==16.0.0
requests==2.9.1
six==1.10.0
https://www.backblaze.com/b2/docs/integration_checklist.html#hanerr translated into development items:
Uploading files: re-attempt the upload using the new URL and auth key pair:
Multithreading:
sync
b2_get_upload_url or b2_get_upload_part_url for each thread
Range header on b2_download_file_by_id and b2_download_file_by_name to download individual parts of a file
Retrying:
Does the API support creating a signed temporary url to download files?
Some people have written scripts around b2 that parse the output. I plan on adding a --quiet option for uploads and downloads to disable the progress bar. We can add the same option everywhere later.
I just created a new bucket using the CLI:
$ b2 create_bucket public01 allPublic
af37c3b95c4aa1fd5fXXXX
When I list all available buckets, I can also see that Backblaze created it successfully:
$ b2 list_buckets
af37c3b95c4aa1fd5fXXXX allPublic public01
But as soon as I try to upload a file or list existing files, it throws the "No such bucket: af37c3b95c4aa1fd5fXXXX" error message:
$ b2 upload_file af37c3b95c4aa1fd5fXXXX D7V_7370.jpg D7V_7370.jpg
No such bucket: af37c3b95c4aa1fd5fXXXX
$ b2 ls af37c3b95c4aa1fd5fXXXX
No such bucket: af37c3b95c4aa1fd5fXXXX
I added print response below line 443 for debugging purposes - not sure if that helps:
$ b2 ls af37c3b95c4aa1fd5fXXXX
{u'buckets': [{u'bucketType': u'allPublic', u'bucketId': u'af37c3b95c4aa1fd5fXXXX', u'bucketName': u'public01', u'accountId': u'f739ca1dXXXX'}]}
No such bucket: af37c3b95c4aa1fd5fXXXX
Is there any awesome reason to why the B2 CLI tool is not on npm (or brew) yet?
I started using the b2 tool on a daily basis for testing, and each session I need to authorize again. I've checked the code, and it seems the b2 command-line tool does not save the Application Key. Additionally, there is no automated handling of an expired token, so the command-line tool user needs to watch for error messages and reauthorize manually.
Can I add an option to save the Application Key persistently and request a new token when necessary if HTTP 401 code is received?
Could that option be enabled by default?
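The proposed behavior could be sketched as a retry wrapper around API calls. Everything below (the Unauthorized exception, the reauthorize callback) is a hypothetical stand-in for the tool's real error handling, not its actual API:

```python
# Sketch: retry a B2 API call once after re-authorizing on a 401-style error.
import functools

class Unauthorized(Exception):
    """Stand-in for an HTTP 401 error from the B2 API."""

def with_reauthorization(reauthorize):
    """Decorator: on Unauthorized, call reauthorize() and retry the call once."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            try:
                return func(*args, **kwargs)
            except Unauthorized:
                reauthorize()  # refresh the token using the saved Application Key
                return func(*args, **kwargs)
        return wrapper
    return decorator

# Usage example: the first call fails with 401, the automatic retry succeeds.
state = {"token_valid": False}

def reauthorize():
    state["token_valid"] = True

@with_reauthorization(reauthorize)
def list_file_names():
    if not state["token_valid"]:
        raise Unauthorized()
    return ["a.txt", "b.txt"]

print(list_file_names())  # ['a.txt', 'b.txt']
```

With the Application Key saved persistently, this retry could be transparent to the user.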
I'm having the following error on the latest version of the CLI tool (0.5.2):
jcalonso@my-server:/media/data# b2 sync myData b2:myBucket/myData
Traceback (most recent call last):
File "/usr/local/bin/b2", line 9, in <module>
load_entry_point('b2==0.5.2', 'console_scripts', 'b2')()
File "/usr/local/lib/python2.7/dist-packages/b2/console_tool.py", line 782, in main
exit_status = ct.run_command(decoded_argv)
File "/usr/local/lib/python2.7/dist-packages/b2/console_tool.py", line 258, in run_command
return self.sync(args)
File "/usr/local/lib/python2.7/dist-packages/b2/console_tool.py", line 736, in sync
self._print("+ %s" % filename)
File "/usr/local/lib/python2.7/dist-packages/b2/console_tool.py", line 292, in _print
print(*args, file=self.stdout)
UnicodeEncodeError: 'ascii' codec can't encode character u'\xb4' in position 44: ordinal not in range(128)
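A common fix for this class of crash is to encode console output explicitly instead of relying on Python 2's ascii default. A sketch with a hypothetical safe_print helper (not the tool's real _print):

```python
# Sketch: write text to a stream, degrading gracefully on unencodable characters
# instead of raising UnicodeEncodeError.
import sys

def safe_print(text, stream=None, encoding="utf-8"):
    """Encode explicitly with a replacement fallback, then write one line."""
    stream = stream or sys.stdout
    safe = text.encode(encoding, errors="replace").decode(encoding)
    stream.write(safe + "\n")

safe_print(u"+ file\xb4name.txt")                    # utf-8 handles \xb4 fine
safe_print(u"+ file\xb4name.txt", encoding="ascii")  # prints "+ file?name.txt"
```

The real fix would pick the encoding from the attached terminal, falling back to utf-8 when stdout is a pipe.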
I would love to back up my postgres db on B2.
If you ever feel like giving more reasons to why B2 is awesome you could look into expanding https://github.com/wal-e/wal-e/ with a B2 backend.
I was just testing syncing from b2 to the local filesystem and got a ValueError:
ValueError: unknown url type
host1:B2_Command_Line_Tool user1$ git pull -v
From https://github.com/Backblaze/B2_Command_Line_Tool
= [up to date] master -> origin/master
Already up-to-date.
host1:B2_Command_Line_Tool user1$ ./b2 version
b2 command line tool, version 0.3.11
host1:B2_Command_Line_Tool user1$ ./b2 create_bucket testtemp allPrivate
040e479d4f5c932b5f26011c
host1:B2_Command_Line_Tool user1$ ./b2 sync ./tmp-upload/ b2:testtemp/test/
+ test1.bin
./tmp-upload/test1.bin
32%
DONE.
+ test10.bin
./tmp-upload/test10.bin
5%
18%
40%
48%
61%
76%
90%
96%
DONE.
+ test5.bin
./tmp-upload/test5.bin
10%
26%
37%
59%
84%
DONE.
host1:B2_Command_Line_Tool user1$ ./b2 sync b2:testtemp/test/ ./tmp-download/
+ test1.bin
Traceback (most recent call last):
File "./b2", line 1904, in <module>
main()
File "./b2", line 1886, in main
ct.sync(args)
File "./b2", line 1818, in sync
authorization=True,
File "./b2", line 1436, in download_file_by_id_helper
headers_received_cb=headers_received_cb,
File "./b2", line 994, in download_file_from_url
with OpenUrl(url, None, request_headers) as response:
File "./b2", line 1228, in __enter__
self.file = urllib2.urlopen(request)
File "/usr/local/Cellar/python/2.7.11/Frameworks/Python.framework/Versions/2.7/lib/python2.7/urllib2.py", line 154, in urlopen
return opener.open(url, data, timeout)
File "/usr/local/Cellar/python/2.7.11/Frameworks/Python.framework/Versions/2.7/lib/python2.7/urllib2.py", line 423, in open
protocol = req.get_type()
File "/usr/local/Cellar/python/2.7.11/Frameworks/Python.framework/Versions/2.7/lib/python2.7/urllib2.py", line 285, in get_type
raise ValueError, "unknown url type: %s" % self.__original
ValueError: unknown url type: 4_z040e479d4f5c932b5f26011c_f1122fbae0a47607a_d20160130_m112948_c001_v0001018_t0041
I would like to implement something in Python using b2 and its command-line tool. The current implementation already lets me import it (after I symlink it to b2.py, that is) and call functions on it, which is good. However, for any application that is not console-based or requires some resiliency against errors, it is impossible to use due to the current architecture. It is also not possible to test it automatically in isolation. Clearly the current codebase is already designed for integration with other Python code - at least partially.
The problem is that two layers are mixed with each other. One layer performs operations on the backend (let's call it the controller) and the other displays the retrieved information (let's call it the view). For example, if I wanted to create a GUI application that presents the result of ls, I couldn't, because the function uses print to return results to the user. The layers should be divided so that the backend function returns an iterator of entries (objects), which the view function would then print. Someone needing a library equivalent of ls would call the backend one and then display the result in the GUI (or web page, or whatever they are trying to build).
Another example: if something does not exist or there is some other problem, the current tool calls sys.exit(). If the user supplies invalid data to the GUI application, as an application developer I want to display an error message, not shut down the whole program. Therefore I'd expect the library functions to raise an exception on error, so that it can be handled properly.
It is possible to implement most of the functionality of b2 command line tool again in a way which has properly separated layers and allows for console, gui, web page and other types of outputs, as well as for automated testing in isolation.
On the other hand, the same changes could be developed here in this repository, with keeping most of the interface as it is. To be precise, I'd contribute most (if not all) of the code and tests required to do this properly. As it is a larger amount of work, I'd like to discuss it first here, so if my view and views of official maintainers are not aligned, it can be compared before much work is put into development.
Please comment on the above.
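To make the proposal concrete, here is a minimal sketch of the suggested split, with hypothetical names (ls_bucket, ls_command): the controller returns data or raises an exception, and only the view prints:

```python
# Sketch of separated layers: controller returns/raises, view prints.
class BucketNotFound(Exception):
    pass

def ls_bucket(buckets, name):
    """Controller: return an iterator of entries, or raise instead of sys.exit()."""
    if name not in buckets:
        raise BucketNotFound(name)
    return iter(buckets[name])

def ls_command(buckets, name):
    """Console view: format and print. A GUI would render the same iterator."""
    try:
        for entry in ls_bucket(buckets, name):
            print(entry)
    except BucketNotFound as e:
        print("No such bucket: %s" % e)

buckets = {"photos": ["a.jpg", "b.jpg"]}
ls_command(buckets, "photos")  # prints a.jpg then b.jpg
```

A GUI or web frontend would call ls_bucket directly and catch BucketNotFound, never touching print or sys.exit.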
Using sphinx or some other handy tool.
To reduce I/O, some information could be safely cached in memory:
If ~/.b2_account_info is being rewritten, the beginning of the file is written properly, but the rest of the old contents doesn't go away. Loading such a file works as intended, so the file is not seriously corrupted, but this way it will never shrink.
$ ./b2 clear_account
$ cat ~/.b2_account_info ; echo
{}
"account_auth_token": "XXX",
"account_id": "YYY",
"api_url": "https://api001.backblaze.com",
"download_url": "https://f001.backblaze.com"
}
It seems to be a regression after #6
Fix is in #9.
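For reference, the usual fix for this kind of bug is to either truncate the file before writing, or (safer) write to a temporary file and atomically rename it over the old one so no stale tail can survive. A sketch of the atomic variant (write_account_info is a hypothetical name, not the tool's real function):

```python
# Sketch: rewrite a small JSON config file atomically so it can shrink.
import json
import os
import tempfile

def write_account_info(path, data):
    # Create the temp file in the same directory so the rename stays on one filesystem.
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(path) or ".")
    try:
        with os.fdopen(fd, "w") as f:
            json.dump(data, f)
        os.replace(tmp, path)  # atomic on POSIX; old contents fully replaced
    except Exception:
        os.remove(tmp)
        raise
```

Because the rename is atomic, a concurrent reader sees either the complete old file or the complete new one, never the mixed state shown above.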
I have a use case that sounds pretty common. I am generating a set of files that have a hash in their file name so that any time the files have the same name they will have the same content. Think of rails asset pipeline where a hash is inserted into the file name. I then want to upload all new files to b2 so that they can be used.
The sync command is almost perfect for this, except that even if the files have the same name and contents, they will be re-uploaded because the modification time is newer. It would be nice to have an option that ignores the modification time and just uses the name (and possibly the size).
(Bonus points: add an option to get an error if the size is different, rather than re-uploading or skipping.)
This is similar to rsync's --ignore-existing, which only copies new files.
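The requested option amounts to a pluggable comparison policy in sync. A sketch with a hypothetical needs_upload function (the real tool's scanner works differently; this only illustrates the policies):

```python
# Sketch: decide whether a local file needs uploading under different policies.
def needs_upload(local, remote, compare="mtime"):
    """local/remote are (name, size, mtime) tuples; remote is None if absent."""
    if remote is None:
        return True                    # not on the server yet: always upload
    if compare == "name":
        return False                   # same name already exists: skip (--ignore-existing)
    if compare == "size":
        return local[1] != remote[1]   # re-upload only if sizes differ
    return local[2] > remote[2]        # default: newer local mtime wins

# Same name and size, newer local mtime: the "size" policy skips the upload.
assert needs_upload(("a", 10, 5), ("a", 10, 1), compare="size") is False
```

The bonus-points behavior would be a fourth branch that raises when the sizes differ instead of returning True.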
Following the steps from your documentation on B2 usage here: https://www.backblaze.com/b2/docs/quick_bucket.html
and downloading the b2 script (not from this repo, but from the link provided in the documentation), after running:
b2 authorize_account ACCOUNT_ID ACCOUNT_KEY
I get:
Traceback (most recent call last):
File "/Users/alex/bin/b2", line 949, in <module>
main()
File "/Users/alex/bin/b2", line 912, in main
authorize_account(args)
File "/Users/alex/bin/b2", line 355, in authorize_account
url = auth_urls[option]
KeyError: '--production'
This is under Python 2:
$ which python
/usr/local/bin/python
$ python --version
Python 2.7.9
b2 script version:
b2 version
b2 command line tool, version 0.3.5
After downloading the script from this repo (version 0.3.7), the script works as expected and as outlined in your documentation.
Running: ./b2 get_file_info 4_zXXXX_f112b0a3542d6db50_d20160214_m193100_c001_v0001019_t0020
status: 0
stdout:
{
"accountId": "XXX",
"bucketId": "XXX",
"contentLength": 5,
"contentSha1": "aaf4c61ddcc5e8a2dabede0f3b482cd9aea9434d",
"contentType": "application/octet-stream",
"fileId": "4_zXXXX_f112b0a3542d6db50_d20160214_m193100_c001_v0001019_t0020",
"fileInfo": {
"src_last_modified_millis": "1455478250628"
},
"fileName": "c"
}
expected:
1455478250512
actual:
1455478250628
ERROR
Tests FAILED
Running: ./b2/b2.py get_file_info 4_zXXXX_f11952305af06934c_d20160218_m010453_c001_v0001019_t0038
status: 0
stdout:
{
"accountId": "XXXX",
"bucketId": "XXXX",
"contentLength": 5,
"contentSha1": "aaf4c61ddcc5e8a2dabede0f3b482cd9aea9434d",
"contentType": "application/octet-stream",
"fileId": "4_zXXXX_f11952305af06934c_d20160218_m010453_c001_v0001019_t0038",
"fileInfo": {
"src_last_modified_millis": "1455757481897"
},
"fileName": "sync/c"
}
expected:
1455757481893
actual:
1455757481897
ERROR
After the facts described in #144, I tried sync (b2 v0.5.4) with the seemingly new --skipNewer option, which should act as a workaround for my issue.
This is what happened:
root@amaranta:/other/localbackup# /root/tools/backblaze/bin/b2 sync --skipNewer --delete ./ b2:amarantabackup
delete duplicity-inc.20160430T014501Z.to.20160501T014502Z.manifest.gpg (old version)
delete duplicity-inc.20160430T014501Z.to.20160501T014502Z.vol1.difftar.gpg (old version)
delete duplicity-inc.20160429T014502Z.to.20160430T014501Z.manifest.gpg (old version)
upload duplicity-inc.20160429T014502Z.to.20160430T014501Z.manifest.gpg
upload duplicity-inc.20160501T014502Z.to.20160502T014501Z.manifest.gpg
upload duplicity-inc.20160430T014501Z.to.20160501T014502Z.manifest.gpg
delete duplicity-new-signatures.20160430T014501Z.to.20160501T014502Z.sigtar.gpg (old version)
delete duplicity-new-signatures.20160429T014502Z.to.20160430T014501Z.sigtar.gpg (old version)
delete duplicity-full.20160422T100919Z.vol37.difftar.gpg (old version)
upload duplicity-inc.20160430T014501Z.to.20160501T014502Z.vol1.difftar.gpg
upload duplicity-new-signatures.20160501T014502Z.to.20160502T014501Z.sigtar.gpg
upload duplicity-new-signatures.20160430T014501Z.to.20160501T014502Z.sigtar.gpg
delete duplicity-full.20160422T100919Z.vol45.difftar.gpg (old version)
upload duplicity-new-signatures.20160429T014502Z.to.20160430T014501Z.sigtar.gpg
upload duplicity-inc.20160501T014502Z.to.20160502T014501Z.vol1.difftar.gpg
Unluckily, I hadn't saved a screenshot of the web gui before doing that, so I don't know what it did show.
But, rest assured again, the files were totally unmodified since they were originally uploaded; the tool somehow detected they were changed (local timestamp > remote timestamp?) and re-uploaded them.
I'm including a full ls -l for the first re-uploaded file:
root@amaranta:/other/localbackup# ls -l --time-style="full-iso" duplicity-inc.20160430T014501Z.to.20160501T014502Z.manifest.gpg
-rw------- 1 root root 233 2016-05-01 03:49:12.653555465 +0200 duplicity-inc.20160430T014501Z.to.20160501T014502Z.manifest.gpg
What I notice, right now, is that such files' "uploaded" column in the web gui matched the LOCAL creation time:
duplicity-inc.20160430T014501Z.to.20160501T014502Z.manifest.gpg 233.0 bytes 05/01/2016 03:49
I don't know whether the web gui is autodetecting my browser's timezone, doing geolocation, or there's a "timezone stripping" issue which should be fixed.
Also, it seems that, by now, the "uploaded" field with the "sync" command doesn't actually save the "uploaded time", but the "original creation" time - which is a bit of a different beast, isn't it?
Thanks.
It'd be really useful to be able to exclude paths that match a certain pattern from synchronization. For example, if I'm synchronizing a bunch of node.js projects, I might want to have b2 sync ignore all the node_modules folders.
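An exclude option could be built on shell-style patterns via the standard fnmatch module. A sketch with a hypothetical is_excluded helper (the flag name and matching rules here are assumptions, not the tool's design):

```python
# Sketch: skip paths during sync when a shell-style pattern matches either the
# whole relative path or any single path component (e.g. "node_modules").
import fnmatch

def is_excluded(relative_path, patterns):
    """True if any pattern matches the path or one of its components."""
    parts = relative_path.split("/")
    for pattern in patterns:
        if fnmatch.fnmatch(relative_path, pattern):
            return True
        if any(fnmatch.fnmatch(part, pattern) for part in parts):
            return True
    return False

assert is_excluded("app/node_modules/lib.js", ["node_modules"])
assert not is_excluded("app/src/main.js", ["node_modules"])
```

The local-folder scanner would call this on each relative path before yielding it.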
Having all of the code in a single file is awkward. And now that it can be installed with pip, there is no longer a need to have it all in a single file to make downloading easy.
Backblaze has a sync tool written in C. It would be nice if the sync command in Python were compatible.
I think the only issue is that the C tool has an option for encoding the modification time in the file name. This makes scanning for updates faster because it doesn't have to call b2_get_file_info on each file.
If any of the uploads/downloads/deletes fails during a sync, it prints an error, which then quickly scrolls off the screen.
It would be better to include an error counter in the status line, and summarize the errors at the end.
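The suggestion boils down to a reporter object that accumulates errors and replays them at the end. A minimal sketch (SyncReporter is hypothetical; the real tool's reporter has a different shape):

```python
# Sketch: count sync failures in the status line and summarize them at the end.
class SyncReporter:
    def __init__(self):
        self.errors = []

    def error(self, message):
        self.errors.append(message)

    def status_line(self, done, total):
        return "%d/%d files, %d errors" % (done, total, len(self.errors))

    def summary(self):
        return "\n".join(self.errors) if self.errors else "no errors"

r = SyncReporter()
r.error("upload failed: a.bin")
print(r.status_line(10, 50))  # -> "10/50 files, 1 errors"
```

Worker threads would call error() instead of printing, so failures can't scroll off the screen.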
Maybe you need to change the link to the b2 tool at https://www.backblaze.com/b2/docs/quick_command_line.html, perhaps to point to GitHub? The link from the docs is not working.
$ b2 authorize_account ae2424115563411111111 001eba671154b1ab4f85cb846af9c4aea
Traceback (most recent call last):
File "/home/tommy/src/b2", line 949, in <module>
main()
File "/home/tommy/src/b2", line 912, in main
authorize_account(args)
File "/home/tommy/src/b2", line 355, in authorize_account
url = auth_urls[option]
KeyError: '--production'
I tried b2 from this git repo and it just works.
We have a goal of making the B2 library and command-line tool thread safe. Users of the library (both B2RawApi and B2Api) will want to multi-thread their applications. And we want to multi-thread the command-line tool so that it can upload and download many files in parallel.
This issue is a place to discuss the design issues for the library.
On Windows, the portalocker package has a dependency on win32con, which is not available through pip; it has its own installer. I don't want to force people to run yet another Windows installer to get win32con.
I tried installing a fresh Python from python.org, and then installing b2 from PyPI. When I try to run b2, I get this error:
ImportError: No module named 'win32con'
after updating to b2 command-line version 0.4.4 from 0.3.0 using pip install and removing the manually copied initial b2 script
Given that I authorize to my account with
$ b2 authorize_account someAccountId
and provide application key interactively
and that I check authorization by listing bucket content with
$ b2 ls someBucketName
objectName1
objectName2
...
I fail to upload new objects with
$ b2 upload_file --contentType b2/x-auto someBucketName filenameLocal filenameRemote
ERROR: No such bucket: someBucketName
I checked with the bucket ID (someBucketId), with the same result.
I also tried clearing the account and authorizing again, with the same result.
I double-checked the man pages to make sure I got it right; I noticed the command-line option changes (for contentType), and I think the command-line syntax is ok.
There aren't any tests yet for the new sync code.
Hello,
I've just upgraded my b2 client from 0.5.0 to 0.5.4; now the sync command yields this error:
root@amaranta:/other/localbackup# /root/tools/backblaze/bin/b2 sync --delete ./ b2:amarantabackup
ERROR: destination file is newer: duplicity-full-signatures.20160422T100919Z.sigtar.gpg
this is the local file information (rest assured it was NEVER altered after creation):
root@amaranta:/other/localbackup# ls -l --time-style="full-iso" duplicity-full-signatures.20160422T100919Z.sigtar.gpg
-rw------- 1 root root 1114394497 2016-04-22 15:08:23.830379590 +0200 duplicity-full-signatures.20160422T100919Z.sigtar.gpg
The web gui for this file says:
duplicity-full-signatures.20160422T100919Z.sigtar.gpg 1.1 GB 04/22/2016 19:24
The last columns says "uploaded", by the way.
I see two problems here:
There's another issue (which I think is separate) that I'll report separately.
Is it possible to rename files once they’ve been uploaded? I’ve several times uploaded a 500MB+ file only to discover that I mistyped the remote filename and don’t want to have to re-upload the entire file just to change the filename…
The B2RawApi layer and the post_json() function that go with it have enough interesting code in them that they need to be unit tested. I think the way to do this is to define a super-simple interface that wraps the urllib calls, which can then be stubbed out for testing.
The interface can take (url, headers, data) and return (status, headers, data). It should be clear about what it returns when the URL is malformed, when there is a broken pipe, when the connection fails, and when there is a timeout.
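A sketch of that interface and a test stub for it, with illustrative names (StubHttp, post_json here are simplified stand-ins for the real B2RawApi plumbing):

```python
# Sketch: a single seam over urllib taking (url, headers, data) and returning
# (status, headers, data), so unit tests can substitute a stub.
class StubHttp:
    """Test double implementing the same (url, headers, data) interface."""
    def __init__(self, responses):
        self.responses = responses   # list of (status, headers, body) to replay
        self.requests = []           # record of calls for assertions

    def post(self, url, headers, data):
        self.requests.append((url, headers, data))
        return self.responses.pop(0)

def post_json(http, url, headers, data):
    """The code under test talks only to the seam, never to urllib directly."""
    status, resp_headers, body = http.post(url, headers, data)
    if status != 200:
        raise IOError("HTTP %d: %s" % (status, body))
    return body

stub = StubHttp([(200, {}, '{"ok": true}')])
assert post_json(stub, "https://api.example.com", {}, "{}") == '{"ok": true}'
```

Broken pipes, connection failures, and timeouts would be modeled by having the stub raise the corresponding exceptions, exactly as the spec in the paragraph above requires.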
Using tqdm for the progress bar seems to generate errors with the latest version available on PyPI (3.8.0) as well as the latest stable version from GitHub (4.0.0). This seems to occur on both upload_file and sync operations. It did not occur with version 3.4.0 of tqdm.
I and others have observed that when using download_file_by_name or download_file_by_id, the file you are downloading spools into RAM and is only written to disk once it's fully downloaded.
Therefore, downloading a file larger than the amount of RAM in your system will crash. Is there a way to periodically flush the downloading file to disk?
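The usual remedy is to stream the response body to disk in fixed-size chunks, so only one chunk is in RAM at a time. A sketch, assuming the response is any file-like object with a read(n) method (urllib's response objects qualify):

```python
# Sketch: stream a download to disk chunk by chunk instead of buffering it all.
def download_to_file(response, path, chunk_size=64 * 1024):
    with open(path, "wb") as f:
        while True:
            chunk = response.read(chunk_size)
            if not chunk:              # empty read signals end of body
                break
            f.write(chunk)             # each chunk hits disk before the next read
```

Memory use is then bounded by chunk_size regardless of file size, and the SHA1 check can be folded into the same loop.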
Hi!
Could you please tag new releases on GitHub? It would be useful in order to get notified when a new version is out and to have a nice release history.
Thanks!
It would be great if b2sync were included in this repo or perhaps a separate repository since it's a C++ program?
I had an issue where one of my symlinks was broken and the b2 tool broke; this is the stack trace:
Traceback (most recent call last):
File "/usr/local/bin/b2", line 9, in <module>
load_entry_point('b2==0.5.4', 'console_scripts', 'b2')()
File "/usr/local/lib/python2.7/dist-packages/b2/console_tool.py", line 861, in main
exit_status = ct.run_command(decoded_argv)
File "/usr/local/lib/python2.7/dist-packages/b2/console_tool.py", line 789, in run_command
return command.run(args)
File "/usr/local/lib/python2.7/dist-packages/b2/console_tool.py", line 609, in run
max_workers=max_workers
File "/usr/local/lib/python2.7/dist-packages/b2/sync.py", line 877, in sync_folders
source_folder, dest_folder, args, now_millis, reporter
File "/usr/local/lib/python2.7/dist-packages/b2/sync.py", line 777, in make_folder_sync_actions
for (source_file, dest_file) in zip_folders(source_folder, dest_folder):
File "/usr/local/lib/python2.7/dist-packages/b2/sync.py", line 646, in zip_folders
current_a = next_or_none(iter_a)
File "/usr/local/lib/python2.7/dist-packages/b2/sync.py", line 620, in next_or_none
return six.advance_iterator(iterator)
File "/usr/local/lib/python2.7/dist-packages/b2/sync.py", line 499, in all_files
yield self._make_file(relative_path)
File "/usr/local/lib/python2.7/dist-packages/b2/sync.py", line 553, in _make_file
mod_time = int(round(os.path.getmtime(full_path) * 1000))
File "/usr/lib/python2.7/genericpath.py", line 54, in getmtime
return os.stat(filename).st_mtime
OSError: [Errno 2] No such file or directory: '/media/2a9074d0-4788-45ab-bfae-fc46427c69fa/PersonalData/some-broken-symlink'
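One possible guard, sketched here with a hypothetical iter_local_files scanner (not the tool's real sync code), is to skip entries whose target no longer exists before calling os.path.getmtime():

```python
# Sketch: walk a local tree, warning about broken symlinks instead of crashing.
import os

def iter_local_files(root):
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            full_path = os.path.join(dirpath, name)
            if not os.path.exists(full_path):  # broken symlink: stat would fail
                print("skipping broken symlink: %s" % full_path)
                continue
            yield full_path, os.path.getmtime(full_path)
```

os.path.exists follows symlinks, so a link whose target is gone returns False and is skipped with a warning rather than raising OSError mid-sync.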
For uploading a few files there is a nice feature: the progress bar.
Unfortunately, when using backup scripts to upload over 200 files, this produces thousands of lines of output, and it's hard to debug if any error appears. Please add a flag to turn it off, or report progress with a date and time at each 5 or 10%.
Right now it looks like this for each file:
1201: Apr 13 01:05:58 INFO: 0%
1201: Apr 13 01:05:58 INFO: 0%
1201: Apr 13 01:05:58 INFO: 1%
1201: Apr 13 01:05:58 INFO: 1%
1201: Apr 13 01:05:58 INFO: 1%
1201: Apr 13 01:05:58 INFO: 2%
[... several hundred nearly identical lines per file, continuing through 80% where the excerpt was cut off ...]
1201: Apr 13 01:05:58 INFO: 80%
1201: Apr 13 01:05:58 INFO: 80%
1201: Apr 13 01:05:58 INFO: 81%
1201: Apr 13 01:05:58 INFO: 81%
1201: Apr 13 01:05:58 INFO: 82%
1201: Apr 13 01:05:58 INFO: 82%
1201: Apr 13 01:05:58 INFO: 83%
1201: Apr 13 01:05:58 INFO: 83%
1201: Apr 13 01:05:58 INFO: 84%
1201: Apr 13 01:05:58 INFO: 84%
1201: Apr 13 01:05:58 INFO: 84%
1201: Apr 13 01:05:58 INFO: 85%
1201: Apr 13 01:05:58 INFO: 85%
1201: Apr 13 01:05:58 INFO: 86%
1201: Apr 13 01:05:58 INFO: 86%
1201: Apr 13 01:05:58 INFO: 86%
1201: Apr 13 01:05:58 INFO: 87%
1201: Apr 13 01:05:58 INFO: 87%
1201: Apr 13 01:05:58 INFO: 87%
1201: Apr 13 01:05:58 INFO: 87%
1201: Apr 13 01:05:58 INFO: 88%
1201: Apr 13 01:05:58 INFO: 88%
1201: Apr 13 01:05:58 INFO: 89%
1201: Apr 13 01:05:58 INFO: 89%
1201: Apr 13 01:05:58 INFO: 89%
1201: Apr 13 01:05:58 INFO: 89%
1201: Apr 13 01:05:58 INFO: 90%
1201: Apr 13 01:05:58 INFO: 90%
1201: Apr 13 01:05:58 INFO: 90%
1201: Apr 13 01:05:58 INFO: 91%
1201: Apr 13 01:05:58 INFO: 91%
1201: Apr 13 01:05:58 INFO: 91%
1201: Apr 13 01:05:58 INFO: 91%
1201: Apr 13 01:05:58 INFO: 92%
1201: Apr 13 01:05:58 INFO: 92%
1201: Apr 13 01:05:58 INFO: 93%
1201: Apr 13 01:05:58 INFO: 93%
1201: Apr 13 01:05:58 INFO: 93%
1201: Apr 13 01:05:58 INFO: 94%
1201: Apr 13 01:05:58 INFO: 94%
1201: Apr 13 01:05:58 INFO: 94%
1201: Apr 13 01:05:58 INFO: 94%
1201: Apr 13 01:05:58 INFO: 95%
1201: Apr 13 01:05:58 INFO: 95%
1201: Apr 13 01:05:58 INFO: 95%
1201: Apr 13 01:05:58 INFO: 95%
1201: Apr 13 01:05:58 INFO: 96%
1201: Apr 13 01:05:58 INFO: 96%
1201: Apr 13 01:05:58 INFO: 96%
1201: Apr 13 01:05:58 INFO: 96%
1201: Apr 13 01:05:58 INFO: 96%
1201: Apr 13 01:05:58 INFO: 97%
1201: Apr 13 01:05:58 INFO: 97%
1201: Apr 13 01:05:58 INFO: 97%
1201: Apr 13 01:05:58 INFO: 97%
1201: Apr 13 01:05:58 INFO: 98%
1201: Apr 13 01:05:58 INFO: 98%
1201: Apr 13 01:05:58 INFO: 98%
1201: Apr 13 01:05:58 INFO: 98%
1201: Apr 13 01:05:58 INFO: 99%
1201: Apr 13 01:05:58 INFO: 99%
1201: Apr 13 01:05:58 INFO: 99%
1201: Apr 13 01:05:58 INFO: 99%
1201: Apr 13 01:05:58 INFO: DONE.
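The wall of one-line-per-percent output above comes from logging each progress tick on its own line. A carriage-return display that overwrites a single terminal line avoids the spam; this is a sketch of the idea, not the tool's actual code:

```python
import sys

def report_progress(done_bytes, total_bytes):
    """Overwrite one terminal line per tick instead of appending a new line."""
    pct = 100 * done_bytes // total_bytes
    # '\r' returns the cursor to the start of the line so the next tick
    # overwrites this one rather than stacking up in the scrollback.
    sys.stderr.write('\r%3d%%' % pct)
    sys.stderr.flush()
    if done_bytes >= total_bytes:
        sys.stderr.write('\rDONE.\n')
```

The same hook could also print bytes/second, which would address the upload-speed request as well.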
Trying this on Windows with the latest Python (3.5.1), running the script just says:
  File "b2.py", line 1351
    print self.desc
                  ^
I'm not sure whether anyone expects this to work on Windows, whether I should use an older version of Python, or something else.
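That error is not Windows-specific: `print self.desc` is the Python 2 print *statement*, which is a syntax error under any Python 3. A minimal illustration of the portable fix (the `desc` attribute here is a stand-in, not the actual b2.py code):

```python
# Python 2-only statement syntax (SyntaxError on Python 3):
#     print self.desc
# The function-call form below is valid on both Python 2 and 3.
from __future__ import print_function  # makes print a function on Python 2

class Command(object):
    def __init__(self, desc):
        self.desc = desc

    def show(self):
        print(self.desc)  # parenthesized call works on 2.x and 3.x

Command("upload_file").show()
```

Until the script is ported this way, it has to be run under Python 2.7.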
When uploading a large file, it's annoying to have it fail after most of the parts have been uploaded and then have to start the whole process over. It would be better to check whether there is a matching unfinished upload (file name, content type, etc.) already in progress with parts that match the file being uploaded, and to continue from there.
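The matching step could look something like this sketch. It assumes a hypothetical `uploaded_parts` mapping (part number to SHA1 hex digest) obtained from whatever wraps the B2 list-parts call; only the comparison logic is shown:

```python
import hashlib

def parts_to_upload(local_path, uploaded_parts, part_size):
    """Return the part numbers that still need uploading.

    uploaded_parts maps part number -> SHA1 hex digest for parts the
    server already has (hypothetical wrapper output, not real b2.py API).
    """
    remaining = []
    part_number = 1
    with open(local_path, 'rb') as f:
        while True:
            chunk = f.read(part_size)
            if not chunk:
                break
            sha1 = hashlib.sha1(chunk).hexdigest()
            # Re-upload a part only if it is missing or its checksum differs.
            if uploaded_parts.get(part_number) != sha1:
                remaining.append(part_number)
            part_number += 1
    return remaining
```

Comparing checksums rather than just part counts guards against resuming onto an unfinished upload of a different version of the file.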
I am unsure whether this is intentional, but uploads of larger files end with a "Connection reset by peer" after a while. I've been trying to back up a lot of content, but I haven't found a way to recursively upload files, so my only option has been to upload a .tar archive of everything.
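As a stopgap until recursive upload exists, a directory walk driving the CLI one file at a time sidesteps the giant-tar problem (and each small upload retries independently). This is a sketch assuming the tool's `b2 upload_file <bucket> <localFile> <b2FileName>` usage; with `dry_run=True` it only collects the commands:

```python
import os
import subprocess

def upload_tree(root, bucket, dry_run=True):
    """Walk root and invoke `b2 upload_file` once per regular file."""
    commands = []
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            local = os.path.join(dirpath, name)
            # Use the path relative to root, with forward slashes,
            # as the B2 file name so the tree structure is preserved.
            b2_name = os.path.relpath(local, root).replace(os.sep, '/')
            cmd = ['b2', 'upload_file', bucket, local, b2_name]
            commands.append(cmd)
            if not dry_run:
                subprocess.check_call(cmd)
    return commands
```

Uploading many small files instead of one huge archive also means a mid-transfer reset only costs one file, not the whole backup.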
Uploading smaller files works fine:
However, larger uploads always fail over a 1 Gbit/s line:
This was B2 support tickets #187260 and #191398, Nilay suggested I post here instead.
I'm trying out B2 as a secondary store alongside Glacier, and I'm (ab)using b2.py as an import in a Python script that regularly uploads files to both.
Every few days, B2 will completely barf on uploads. It happened January 13th, and it happened on the 8th before that, and the 4th before that. The traceback looks like this:
B2 exception for /home/iverad/WWJ/ive3rad_wwj_2016_01_13_21_22_12.aac:
Traceback (most recent call last):
File "iveradvpsdel.py", line 177, in <module>
b2.upload_file(['--contentType', content_type, B2_BUCKET_NAME, b, b2_name])
File "/home/iverad/b2.py", line 728, in upload_file
response = post_file(url, headers, local_file, exit_on_error=False)
File "/home/iverad/b2.py", line 331, in post_file
with OpenUrl(url, data_file, headers, exit_on_error) as response_file:
File "/home/iverad/b2.py", line 298, in __enter__
self.file = urllib2.urlopen(request)
File "/usr/lib/python2.7/urllib2.py", line 154, in urlopen
return opener.open(url, data, timeout)
File "/usr/lib/python2.7/urllib2.py", line 431, in open
response = self._open(req, data)
File "/usr/lib/python2.7/urllib2.py", line 449, in _open
'_open', req)
File "/usr/lib/python2.7/urllib2.py", line 409, in _call_chain
result = func(*args)
File "/usr/lib/python2.7/urllib2.py", line 1240, in https_open
context=self._context)
File "/usr/lib/python2.7/urllib2.py", line 1200, in do_open
r = h.getresponse(buffering=True)
File "/usr/lib/python2.7/httplib.py", line 1073, in getresponse
response.begin()
File "/usr/lib/python2.7/httplib.py", line 415, in begin
version, status, reason = self._read_status()
File "/usr/lib/python2.7/httplib.py", line 379, in _read_status
raise BadStatusLine(line)
BadStatusLine: ''
Or this:
B2 exception for /home/iverad/KDKA/ive3rad_kdka_2016_01_13_21_22_14.aac:
Traceback (most recent call last):
File "iveradvpsdel.py", line 177, in <module>
b2.upload_file(['--contentType', content_type, B2_BUCKET_NAME, b, b2_name])
File "/home/iverad/b2.py", line 728, in upload_file
response = post_file(url, headers, local_file, exit_on_error=False)
File "/home/iverad/b2.py", line 331, in post_file
with OpenUrl(url, data_file, headers, exit_on_error) as response_file:
File "/home/iverad/b2.py", line 298, in __enter__
self.file = urllib2.urlopen(request)
File "/usr/lib/python2.7/urllib2.py", line 154, in urlopen
return opener.open(url, data, timeout)
File "/usr/lib/python2.7/urllib2.py", line 431, in open
response = self._open(req, data)
File "/usr/lib/python2.7/urllib2.py", line 449, in _open
'_open', req)
File "/usr/lib/python2.7/urllib2.py", line 409, in _call_chain
result = func(*args)
File "/usr/lib/python2.7/urllib2.py", line 1240, in https_open
context=self._context)
File "/usr/lib/python2.7/urllib2.py", line 1197, in do_open
raise URLError(err)
URLError: <urlopen error [Errno 111] Connection refused>
Right now I'm just letting b2.py do its thing: it retries five times (or whatever) and then gives up, and it just logs these errors for me to retry manually later.
Is there something else I could/should be doing?
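Since both tracebacks (`BadStatusLine: ''` and `URLError: ... Connection refused`) look like transient server-side hiccups, one caller-side option is to wrap the upload call in an exponential-backoff retry loop instead of giving up after b2.py's fixed retries. A sketch, where `fn` stands for whatever callable performs the upload (e.g. a closure over `b2.upload_file(...)`):

```python
import time

def with_retries(fn, attempts=5, base_delay=1.0, transient=(OSError,)):
    """Call fn(), retrying transient failures with exponential backoff.

    `transient` should list the exception types seen in practice --
    for this workload, urllib2.URLError and httplib.BadStatusLine
    (urllib.error.URLError / http.client.BadStatusLine on Python 3).
    """
    delay = base_delay
    for attempt in range(1, attempts + 1):
        try:
            return fn()
        except transient:
            if attempt == attempts:
                raise  # out of attempts: surface the error to the caller
            time.sleep(delay)
            delay *= 2  # back off: 1s, 2s, 4s, ...
```

Backing off (rather than retrying immediately) matters here: if the API endpoint is briefly overloaded, hammering it with instant retries just burns through the retry budget during the same outage.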
Instructions on how to structure dependencies are here:
pyca/cryptography#2880 (comment)