xyou365 / autorclone

AutoRclone: rclone copy/move/sync (automatically) with thousands of service accounts

Home Page: https://www.gfan.loan/?p=235


autorclone's Introduction

AutoRclone: rclone copy/move/sync (automatically) with service accounts (still in the beta stage)

Many thanks to rclone and folderclone.

  • create service accounts using a script
  • add many service accounts to the rclone config file
  • add many service accounts to Google Groups for your organization
  • automatically switch accounts during rclone copy/move/sync
  • Windows is supported

Step 1. Copy code to your VPS or local machine

Before everything, install Python 3, since the scripts are written in Python.

For Linux: install screen, git and the latest rclone. On Debian/Ubuntu you can use this command directly:

sudo apt-get install screen git && curl https://rclone.org/install.sh | sudo bash

After all the dependencies above are installed successfully, run:

sudo git clone https://github.com/xyou365/AutoRclone && cd AutoRclone && sudo pip3 install -r requirements.txt

For Windows: download this project directly, then install the latest rclone. Then run this command (in cmd or PowerShell) inside the project folder:

pip3 install -r requirements.txt

Step 2. Create service accounts using the script

Let us create only the service accounts that we need. Warning: abuse of this feature is not the aim of AutoRclone. We do NOT recommend creating a lot of projects; one project with 100 service accounts gives you plenty of capacity, and overuse might get your projects banned by Google.

Enable the Drive API in the Python Quickstart and save the file credentials.json into the project directory.

If you do not have any projects in your account, then:

  • create 1 project
  • enable the required services
  • create 100 service accounts (1 project × 100 accounts)
  • and download their credentials into a folder named accounts

Note: one service account can copy around 750 GB a day, and one project holds 100 service accounts, so that is about 75 TB a day; for most users this should easily suffice.

The command looks like python3 gen_sa_accounts.py --quick-setup 1 (replace "1" with the number of projects you want).

If you already have N projects and want to create service accounts only in newly created projects, to

  • create 1 additional project (project N+1)
  • enable the required services
  • create 100 service accounts (1 project × 100 accounts)
  • and download their credentials into a folder named accounts

run

python3 gen_sa_accounts.py --quick-setup 1 --new-only

If you want to create service accounts using existing projects (without creating more projects), run python3 gen_sa_accounts.py --quick-setup -1. Note that this will overwrite the existing service accounts.

After it finishes, there will be many JSON key files in a folder named accounts.
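Before moving on, it can be worth sanity-checking the downloaded key files, since an interrupted download can leave JSON without the expected fields. A minimal sketch, not part of AutoRclone; it assumes the default folder name accounts and the standard Google service-account key fields client_email and private_key:

```python
import glob
import json
import os

def check_accounts(folder="accounts"):
    """Split the downloaded key files into (valid, broken).

    A usable service-account key parses as JSON and contains both a
    client_email and a private_key field.
    """
    valid, broken = [], []
    for path in sorted(glob.glob(os.path.join(folder, "*.json"))):
        try:
            with open(path) as f:
                data = json.load(f)
        except (OSError, json.JSONDecodeError):
            broken.append(path)  # unreadable or not JSON at all
            continue
        if "client_email" in data and "private_key" in data:
            valid.append(path)
        else:
            broken.append(path)  # parsed, but missing required fields
    return valid, broken

if __name__ == "__main__":
    ok, bad = check_accounts()
    print(f"{len(ok)} usable key files, {len(bad)} broken")
```

A broken file here is exactly the kind of thing that later surfaces as a KeyError: 'client_email' when the other scripts read the folder.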

Step 3. Add service accounts to Google Groups (optional but recommended for hassle-free long-term use)

We use Google Groups to manage our service accounts because of the official limits on Team Drive membership (limit for individuals and groups directly added as members: 600).

For GSuite Admin

  1. Turn on the Directory API following the official steps (save the generated JSON file to the folder credentials).

  2. Create a group for your organization in the Admin console. After creating a group, you will have an address, for example [email protected].

  3. Run python3 add_to_google_group.py -g [email protected]

For the meaning of the above flags, run python3 add_to_google_group.py -h

For normal user

Create a Google Group, then add the service accounts as members by hand. The limit is 10 at a time and 100 a day, but if you followed the warning and notes above you will have only one project, so this is easily within range.
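Since the manual limit is 10 members at a time, a small hypothetical helper (not shipped with AutoRclone) can print the service-account addresses from the accounts folder in batches of 10 for pasting into the group:

```python
import glob
import json
import os

def sa_emails(folder="accounts"):
    """Collect the client_email of every service-account key file in `folder`."""
    emails = []
    for path in sorted(glob.glob(os.path.join(folder, "*.json"))):
        with open(path) as f:
            data = json.load(f)
        if "client_email" in data:
            emails.append(data["client_email"])
    return emails

def chunks(seq, size=10):
    """Yield successive batches; Google Groups accepts about 10 members per manual add."""
    for i in range(0, len(seq), size):
        yield seq[i:i + size]

if __name__ == "__main__":
    # One comma-separated line per batch, ready to paste into the members box.
    for batch in chunks(sa_emails()):
        print(", ".join(batch))
```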

Step 4. Add service accounts or Google Groups into Team Drive

If you do not use Team Drives, just skip this step. Warning: it is NOT recommended to use service accounts to clone to folders that are not in Team Drives; service accounts work best with Team Drives.

If you have already created a Google Group (Step 3) to manage your service accounts, add the group address [email protected] or [email protected] to your source Team Drive (tdsrc) and destination Team Drive (tddst).

Otherwise, add service accounts directly into Team Drive.

Enable the Drive API in the Python Quickstart and save credentials.json into the project root, if you have not already done it in Step 2.

  • Add service accounts into your source Team Drive: python3 add_to_team_drive.py -d SharedTeamDriveSrcID
  • Add service accounts into your destination Team Drive: python3 add_to_team_drive.py -d SharedTeamDriveDstID

Step 5. Start your task

Let us copy hundreds of TB of resources using service accounts. Note: that was sarcasm; heavy abuse of this (regardless of which cloning script you use) may get you noticed by Google. We recommend you not be a glutton: clone what is important instead of downloading the entire Wikipedia.

For server-side copy:

  • publicly shared folder to Team Drive
  • Team Drive to Team Drive
  • publicly shared folder to publicly shared folder (with write privilege)
  • Team Drive to publicly shared folder

python3 rclone_sa_magic.py -s SourceID -d DestinationID -dp DestinationPathName -b 1 -e 600

  • For the meaning of the above flags, run python3 rclone_sa_magic.py -h

  • Add --disable_list_r if rclone cannot read all the contents of a publicly shared folder.

  • Please make sure rclone can read your source and destination directories. Check with rclone size:

  1. rclone --config rclone.conf size --disable ListR src001:

  2. rclone --config rclone.conf size --disable ListR dst001:
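The src001: and dst001: remotes above come from the rclone.conf that the script generates. For orientation, a service-account remote in an rclone config generally looks like this (the key filename and drive ID below are illustrative placeholders, not real values):

```ini
[dst001]
type = drive
# one of the key files downloaded in Step 2 (illustrative name)
service_account_file = accounts/0a1b2c3d.json
# ID of the destination Shared (Team) Drive (illustrative value)
team_drive = 0AAbCdEfGhIjKlUk9PVA
```

Switching accounts amounts to pointing service_account_file at a different key file, which is what the script automates.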

For local to Google Drive (needs some testing):

  • local to Team Drive
  • local to private folder
  • private folder to anywhere (we think service accounts cannot do anything with a private folder)

python3 rclone_sa_magic.py -sp YourLocalPath -d DestinationID -dp DestinationPathName -b 1 -e 600

  • Run tail -f log_rclone.txt to see what happens in detail (Linux only).
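tail is not available on Windows by default; a minimal cross-platform stand-in (hypothetical, not part of AutoRclone) can follow the log from Python:

```python
import time

def new_lines(f):
    """Return all complete lines appended to f since the last call."""
    lines = []
    while True:
        line = f.readline()
        if not line:
            return lines
        lines.append(line)

def follow(path, poll=0.5):
    """Minimal substitute for `tail -f`, handy on Windows."""
    with open(path, "r", encoding="utf-8", errors="replace") as f:
        f.seek(0, 2)                   # start at the current end of the log
        while True:
            for line in new_lines(f):
                print(line, end="")
            time.sleep(poll)           # wait for rclone to append more

# usage: follow("log_rclone.txt")
```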

Let's also talk about this project in the Telegram group AutoRclone.

Blog (中文): https://gsuitems.com/index.php/archives/13/ | Google Drive Group | Google Drive Channel

autorclone's People

Contributors

tsunayoshisawada, xyou365


autorclone's Issues

Use the new rclone drive backend

with this commit rclone/rclone@a86196a

we got a new option to change service accounts directly.
I think this would be a better way to handle it, instead of killing the task, editing the config to change the service account, and starting the transfer again.

Error when adding service accounts to Google Group

When I tried to add the SA emails generated by AutoRclone to my Google Group, the error message showed:
"Your organization or group is configured to allow only organization members to join"

The emails generated by AutoRclone look like this:
[email protected]
That is very different from my .edu domain.

What am I doing wrong?
Thanks!

Step 2 throws "KeyError: 'installed'"

Heya!

When doing Step 2, i.e. running python3 gen_sa_accounts.py --quick-setup 1 (in my case with plain python, since it is already v3), I get the following error:

C:\Users\epicl\Downloads\AutoRclone-master>python gen_sa_accounts.py --quick-setup 1
Traceback (most recent call last):
  File "gen_sa_accounts.py", line 311, in <module>
    resp = serviceaccountfactory(
  File "gen_sa_accounts.py", line 161, in serviceaccountfactory
    proj_id = loads(open(credentials,'r').read())['installed']['project_id']
KeyError: 'installed'

Any way to fix this? The same happens on Ubuntu, by the way.

After the first service account finishes, speed slows significantly

After I get the

SUCCESS: The process with PID 5232 has been terminated.

The next service account starts, but the speed goes from about 2000 MB/s down to 1-40 MB/s.
Is it possible to get higher speeds once the next account starts?
It does seem to slowly speed up over time, but it still caps at about 100 MB/s.
Any advice is appreciated.

Exceeded the maximum calls(1000) in a single batch request.

sudo python3 add_to_google_group.py -g [email protected]
<googleapiclient.discovery.Resource object at 0x7fc9ef262208>
Readying accounts |##########################      | 1000/1200
Traceback (most recent call last):
  File "add_to_google_group.py", line 68, in <module>
    batch.add(group.members().insert(groupKey=gaddr, body=body))
  File "/usr/local/lib/python3.6/dist-packages/googleapiclient/_helpers.py", line 134, in positional_wrapper
    return wrapped(*args, **kwargs)
  File "/usr/local/lib/python3.6/dist-packages/googleapiclient/http.py", line 1400, in add
    % MAX_BATCH_LIMIT
googleapiclient.errors.BatchError: <BatchError "Exceeded the maximum calls(1000) in a single batch request.">

Consider adding --drive-stop-on-upload-limit

The option makes rclone treat the API's transfer-limit messages as errors. The messages are formatted differently for the transfer limit vs. general API usage limits, so this is reliable. It cuts off the account at 750 GB.

EDIT: On second check I realized this option was added recently, so if this is adopted it should be offered as a non-default option.

Stuck on reading source/destination | checks: 0 files

Copied 736 GB successfully but got stuck on dst002: reading source/destination | checks: 0 files

Total size 4 TB, Team Drive to Team Drive!

Command used: sudo python3 rclone_sa_magic.py -s source -d destination -dp backup -b 1 -e 600

Step 2 impossible, doesn't create accounts

Hi,

I use rclone, but AutoRclone could be great for working around Gdrive limits.

I have a problem when I try to install it on Windows 10.

I have installed everything, and when I launch
python3 gen_sa_accounts.py --quick-setup 5
it creates nothing in the accounts folder.

I'm stuck there; can you please help me?

Failed to start remote control

I'm running the following command:

python3 rclone_sa_magic.py -sp "/home/user/media/" -d "<drive-id>"

and every once in a while it can't start a new remote because of the following errors:

rclone --config ./rclone.conf copy --drive-stop-on-upload-limit --drive-server-side-across-configs --rc --rc-addr="localhost:5572" -v --ignore-existing --tpslimit 3 --transfers 3 --drive-chunk-size 32M --drive-acknowledge-abuse --log-file=log_rclone.txt "/home/user/media/" "dst010:" &
>> Let us go dst010: 19:47:00
2020/06/10 19:47:10 Failed to rc: connection failed: Post "http://localhost:5572/core/pid": dial tcp 127.0.0.1:5572: connect: connection refused
2020/06/10 19:47:10 Failed to rc: connection failed: Post "http://localhost:5572/core/stats": dial tcp 127.0.0.1:5572: connect: connection refused
2020/06/10 19:47:10 Failed to rc: connection failed: Post "http://localhost:5572/core/stats": dial tcp 127.0.0.1:5572: connect: connection refused
2020/06/10 19:47:10 Failed to rc: connection failed: Post "http://localhost:5572/core/stats": dial tcp 127.0.0.1:5572: connect: connection refused
No rclone task detected (possibly done for this account). (1/3)

rclone --config ./rclone.conf copy --drive-stop-on-upload-limit --drive-server-side-across-configs --rc --rc-addr="localhost:5572" -v --ignore-existing --tpslimit 3 --transfers 3 --drive-chunk-size 32M --drive-acknowledge-abuse --log-file=log_rclone.txt "/home/user/media/" "dst011:" &
>> Let us go dst011: 19:47:10
dst011: 510GB Done @ 245.282449MB/s | checks: 2125 files

In this case the downtime is low, but sometimes it takes much longer (I've seen up to an hour). log_rclone.txt shows the following error:

2020/06/10 19:47:00 Failed to start remote control: start server failed: listen tcp 127.0.0.1:5572: bind: address already in use

I'm running Ubuntu Server 20.04 LTS with the latest rclone:

user@server:~$ rclone --version
rclone v1.52.0
- os/arch: linux/amd64
- go version: go1.14.3

Wrong length of team_drive_id or publicly shared root_folder_id

I already use AutoRclone to copy to my shared drives, but this command:

py rclone_sa_magic.py -s "a1b2c3d4e5f6_g7h8i9j0k1l2m3n" -d "usual_shared_drive_id" -dp "name" -b X -e Y

gives me this error:

rclone is detected: ~\rclone.exe
generating rclone config file.
Wrong length of team_drive_id or publicly shared root_folder_id

Also, there is no output log!

add_to_team_drive.py cannot add all 500 service accounts

I generated 500 SA accounts in total. Recently I noticed the script only ever adds between 200 and 300 of them; repeated runs add just 1 or 2 more each time. Each run finishes very quickly, so I wonder whether adding members too fast is being rate-limited by Google?

root@:~/autorclone# python3 add_to_team_drive.py -d ******

Found credentials.
Make sure the Google account that has generated credentials.json
is added into your Team Drive (shared drive) as Manager
(Press any key to continue)
Readying accounts |################################| 500/500
Adding...
Complete.
Elapsed Time:
00:00:08.53

Team Drive to Team Drive

Source: gdriveA, download permission only.
Destination: gdriveB, service accounts can be added.
How can this project copy gdriveA ⇨ gdriveB?

Public Shared Drive - Add on all service accounts?

Could you add a function to add a publicly shared drive to all the service accounts?

I'm trying to download from a public shared drive to my own drive, but each service account needs to have the public shared drive opened for it to work.

Delete SA accounts?

Hi,
Thanks for the project.
When the task is finished, how do I delete all the accounts from the Team Drive?
Thanks

How to write the command so qBittorrent can call it successfully after a download completes

Using the command: python3 /root/AutoRclone/rclone_sa_magic.py -sp "%F" -d 0AHe-XXXXX-XXXXX -dp %N% -b 1 -e 600

It never succeeds. When I tried it, it complained that the ./accounts folder does not exist; it only runs successfully after cd-ing into the AutoRclone folder.

rclone is detected: /usr/bin/rclone
generating rclone config file.
No json files found in ./accounts

Keeps trying to upload when the transfer is already complete

root@instance-3:~/AutoRclone# python3 rclone_sa_magic.py -sp /### -d ### -dp ccc -b 1 -e 100
rclone is detected: /usr/bin/rclone
generating rclone config file.
rclone config file generated.
Start: 10:47:05
screen -d -m -S wrc rclone --config ./rclone.conf copy --drive-server-side-across-configs --rc -vv --ignore-existing --tpslimit 3 --transfers 3 --drive-chunk-size 32M --drive-acknowledge-abuse --log-file=log_rclone.txt "###" "dst001:ccc"
>> Let us go dst001: 10:47:05
2019/10/27 11:07:07 Failed to rc: connection failed: Post http://localhost:5572/core/stats: dial tcp 127.0.0.1:5572: connect: connection refused
2019/10/27 11:07:07 Failed to rc: connection failed: Post http://localhost:5572/core/stats: dial tcp 127.0.0.1:5572: connect: connection refused
2019/10/27 11:07:07 Failed to rc: connection failed: Post http://localhost:5572/core/stats: dial tcp 127.0.0.1:5572: connect: connection refused
No rclone task detected (possibly done for this account). (1/3)
screen -d -m -S wrc rclone --config ./rclone.conf copy --drive-server-side-across-configs --rc -vv --ignore-existing --tpslimit 3 --transfers 3 --drive-chunk-size 32M --drive-acknowledge-abuse --log-file=log_rclone.txt "###" "dst002:ccc"
>> Let us go dst002: 11:07:07
2019/10/27 11:07:17 Failed to rc: connection failed: Post http://localhost:5572/core/stats: dial tcp 127.0.0.1:5572: connect: connection refused
2019/10/27 11:07:17 Failed to rc: connection failed: Post http://localhost:5572/core/stats: dial tcp 127.0.0.1:5572: connect: connection refused
2019/10/27 11:07:17 Failed to rc: connection failed: Post http://localhost:5572/core/stats: dial tcp 127.0.0.1:5572: connect: connection refused
No rclone task detected (possibly done for this account). (2/3)
screen -d -m -S wrc rclone --config ./rclone.conf copy --drive-server-side-across-configs --rc -vv --ignore-existing --tpslimit 3 --transfers 3 --drive-chunk-size 32M --drive-acknowledge-abuse --log-file=log_rclone.txt "###" "dst003:ccc"
>> Let us go dst003: 11:07:17
2019/10/27 11:07:27 Failed to rc: connection failed: Post http://localhost:5572/core/stats: dial tcp 127.0.0.1:5572: connect: connection refused
2019/10/27 11:07:27 Failed to rc: connection failed: Post http://localhost:5572/core/stats: dial tcp 127.0.0.1:5572: connect: connection refused
2019/10/27 11:07:27 Failed to rc: connection failed: Post http://localhost:5572/core/stats: dial tcp 127.0.0.1:5572: connect: connection refused
No rclone task detected (possibly done for this account). (3/3)
All done (3/3).

log_rclone.txt

2019/10/27 11:07:06 DEBUG : ###.mp4: MD5 = 4830ef8fcf6322291a99fe3$
2019/10/27 11:07:06 INFO  : ###.mp4: Copied (new)
2019/10/27 11:07:06 INFO  :
Transferred:       81.953G / 81.953 GBytes, 100%, 70.016 MBytes/s, ETA 0s
Errors:                 0
Checks:                 0 / 0, -
Transferred:            2 / 2, 100%
Elapsed time:    19m58.5s

2019/10/27 11:07:06 DEBUG : 6 go routines active
2019/10/27 11:07:06 DEBUG : rclone: Version "v1.50.0" finishing with parameters ["rclone" "--config" "./rclone.c$
2019/10/27 11:07:07 DEBUG : rclone: Version "v1.50.0" starting with parameters ["rclone" "--config" "./rclone.co$
2019/10/27 11:07:07 NOTICE: Serving remote control on http://127.0.0.1:5572/
2019/10/27 11:07:07 DEBUG : Using config file from "###/AutoRclone/rclone.conf"
2019/10/27 11:07:07 INFO  : Starting HTTP transaction limiter: max 3 transactions/s with burst 1
2019/10/27 11:07:08 DEBUG : ###.mkv: Dest$
2019/10/27 11:07:08 INFO  : Google drive root 'ccc': Waiting for checks to finish
2019/10/27 11:07:08 DEBUG : ###.mp4: Destination exists, skipping
2019/10/27 11:07:08 INFO  : Google drive root 'ccc': Waiting for transfers to finish
2019/10/27 11:07:08 INFO  :
Transferred:             0 / 0 Bytes, -, 0 Bytes/s, ETA -
Errors:                 0
Checks:                 2 / 2, 100%
Transferred:            0 / 0, -
Elapsed time:          0s

2019/10/27 11:07:08 DEBUG : 8 go routines active
2019/10/27 11:07:08 DEBUG : rclone: Version "v1.50.0" finishing with parameters ["rclone" "--config" "./rclone.c$
2019/10/27 11:07:17 DEBUG : rclone: Version "v1.50.0" starting with parameters ["rclone" "--config" "./rclone.co$
2019/10/27 11:07:17 NOTICE: Serving remote control on http://127.0.0.1:5572/
2019/10/27 11:07:17 DEBUG : Using config file from "###/AutoRclone/rclone.conf"
2019/10/27 11:07:17 INFO  : Starting HTTP transaction limiter: max 3 transactions/s with burst 1
2019/10/27 11:07:18 DEBUG : ###.mp4: Destination exists, skipping
2019/10/27 11:07:18 DEBUG : ###.mkv: Dest$
2019/10/27 11:07:18 INFO  : Google drive root 'ccc': Waiting for checks to finish
2019/10/27 11:07:18 INFO  : Google drive root 'ccc': Waiting for transfers to finish
2019/10/27 11:07:18 INFO  :
Transferred:             0 / 0 Bytes, -, 0 Bytes/s, ETA -
Errors:                 0
Checks:                 2 / 2, 100%
Transferred:            0 / 0, -
Elapsed time:          0s

2019/10/27 11:07:18 DEBUG : 9 go routines active
2019/10/27 11:07:18 DEBUG : rclone: Version "v1.50.0" finishing with parameters ["rclone" "--config" "./rclone.c$

How to Authenticate Other/Different Google Account?

I managed to set up AutoRclone and transfer files from Team Drive #1 to Team Drive #2 in Google Account A.

Now I would like to replicate the same for Google Account B, but the script keeps listing the projects from Google Account A.

How do I re-authenticate the script so that I can authenticate Google Account B?

Add support for an exclude list

rclone can use --exclude-from exclude-file.txt to skip files and directories that should not be copied, which is very practical.
I strongly suggest adding this feature!
Thanks!

Sync Feature Request

Hi,

Please add this important feature: syncing Team Drives, without the 750 GB/day limit.

Thanks in advance.

Note: guys, comment here if you are interested in this feature.

gen_sa_accounts.py tries to create SA on deleted projects

When running python3 gen_sa_accounts.py --quick-setup -1, it tries to create SAs on deleted projects too. It should only create SAs on active projects.

gen_sa_accounts.py --list-projects also returns all projects including deleted projects.

Cannot add into the Team Drive

~/AutoRclone# python3 add_to_team_drive.py -d

Found credentials.
Make sure the Google account that has generated credentials.json
is added into your Team Drive (shared drive) as Manager
(Press any key to continue)
Readying accounts |####################            | 460/701
Traceback (most recent call last):
  File "add_to_team_drive.py", line 63, in <module>
    ce = json.loads(open(i, 'r').read())['client_email']
KeyError: 'client_email'

Stuck in Step 2: error when enabling services

python gen_sa_accounts.py --quick-setup 1 --new-only                                             
creat projects: 1
Creating 1 projects
Enabling services
Traceback (most recent call last):
  File "gen_sa_accounts.py", line 323, in <module>
    download_keys=args.download_keys
  File "gen_sa_accounts.py", line 224, in serviceaccountfactory
    _enable_services(serviceusage,ste,services)
  File "gen_sa_accounts.py", line 89, in _enable_services
    batch.execute()
  File "G:\software\code\Anaconda\envs\test\lib\site-packages\googleapiclient\_helpers.py", line 134, in positional_wrapper
    return wrapped(*args, **kwargs)
  File "G:\software\code\Anaconda\envs\test\lib\site-packages\googleapiclient\http.py", line 1524, in execute
    self._execute(http, self._order, self._requests)
  File "G:\software\code\Anaconda\envs\test\lib\site-packages\googleapiclient\http.py", line 1454, in _execute
    self._batch_uri, method="POST", body=body, headers=headers
  File "G:\software\code\Anaconda\envs\test\lib\site-packages\google_auth_httplib2.py", line 198, in request
    uri, method, body=body, headers=request_headers, **kwargs)
  File "G:\software\code\Anaconda\envs\test\lib\site-packages\httplib2\__init__.py", line 1994, in request
    cachekey,
  File "G:\software\code\Anaconda\envs\test\lib\site-packages\httplib2\__init__.py", line 1651, in _request
    conn, request_uri, method, body, headers
  File "G:\software\code\Anaconda\envs\test\lib\site-packages\httplib2\__init__.py", line 1557, in _conn_request
    conn.connect()
  File "G:\software\code\Anaconda\envs\test\lib\site-packages\httplib2\__init__.py", line 1391, in connect
    raise socket_err
  File "G:\software\code\Anaconda\envs\test\lib\site-packages\httplib2\__init__.py", line 1326, in connect
    self.sock = self._context.wrap_socket(sock, server_hostname=self.host)
  File "G:\software\code\Anaconda\envs\test\lib\ssl.py", line 423, in wrap_socket
    session=session
  File "G:\software\code\Anaconda\envs\test\lib\ssl.py", line 870, in _create
    self.do_handshake()
  File "G:\software\code\Anaconda\envs\test\lib\ssl.py", line 1139, in do_handshake
    self._sslobj.do_handshake()
OSError: [Errno 0] Error

RootID of Source?

Is it possible to transfer all of my Drive (root)? What FolderID must I use? Isn't it "root"?

Running gen_sa_accounts.py fails: PermissionError: [Errno 13] Permission denied: 'token.pickle'

$ python3 gen_sa_accounts.py --quick-setup 1
Please visit this URL to authorize this application: https://accounts.google.com/o/oauth2/auth?response_type=code&client_id=95576504074-3o3c4a8au0m9fplnc88blv37psg35ghu.apps.googleusercontent.com&redirect_uri=urn%3Aietf%3Awg%3Aoauth%3A2.0%3Aoob&scope=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fdrive+https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform+https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fiam&state=PjP63ZjIxkrLDfbyRPtsyOmKh0HHGE&prompt=consent&access_type=offline
Enter the authorization code: *********************************
Traceback (most recent call last):
  File "gen_sa_accounts.py", line 311, in <module>
    resp = serviceaccountfactory(
  File "gen_sa_accounts.py", line 175, in serviceaccountfactory
    with open(token, 'wb') as t:
PermissionError: [Errno 13] Permission denied: 'token.pickle'

Mass deleting Service Account keys

For some reason my service accounts have 3-4 keys each, which means that when I run python3 gen_sa_accounts.py --download-keys [project-id] I get 300-400 keys. Is there a way to delete all the excess keys from the service accounts without deleting the service accounts themselves?

How does this great tool bypass daily limits?

Thanks for making this great code.
But I would like to know: can this tool bypass the daily 750 GB copy limit?
And does it bypass the daily 8 TB download limit?
If yes to both, how do we do it?
Thanks.

Allow custom rclone options

It would be really great if we were allowed to pass extra rclone options (like --rc), or to use rclone sync instead of rclone copy.

Cannot run multiple rclone operations at the same time

e.g.
python3 rclone_sa_magic.py -s SourceID1 -d DestinationID1 -dp DestinationPathName -b 1 -e 100
python3 rclone_sa_magic.py -s SourceID2 -d DestinationID2 -dp DestinationPathName -b 101 -e 200

In practice both read the same rclone.conf; please optimize this.

Cannot run after copying the whole folder to another machine

I switched to a new VPS and used rclone to copy the whole folder over; the dependencies are installed, but when copying from a shared folder to a Team Drive it keeps reporting:
Failed to copy: failed to make directory: googleapi: Error 404: File not found: 0AD7f-V*****KFUk9PVA., notFound

The TD ID is definitely correct, because the same command ran fine on the old machine. I am not sure whether I missed something while copying, or whether an SA account can only be used by one machine at a time?

The command used:
python3 rclone_sa_magic.py -s "1N*******cMM-xkc2Y39jYRAHG253uk" -d 0AD7f-V*****KFUk9PVA -dp "/1127" -b 1 -e 600

IndexError: list index out of range

Hello:
I am starting to work with your great tool, but when I add my service accounts to a Google Group using the following command:

python3 add_to_google_group.py -g [email protected]

I get the following error:

Traceback (most recent call last):
  File "add_to_google_group.py", line 43, in <module>
    flow = InstalledAppFlow.from_client_secrets_file(credentials[0], scopes=[
IndexError: list index out of range

My test runs on Ubuntu with 700 service accounts.
Also, if I wanted to add those service accounts by hand, how would I do that?
Should I grab all the emails and send them invitations to the group?
Thanks.

Copy stopping without using all service accounts

I'm having trouble getting AutoRclone to use all of my service accounts. I ran the following command:

python3 rclone_sa_magic.py -s {SourceID} -d {DestID} -b 1 -e 600

It proceeded to do 10 copy operations:

>> Let us go dst001:
through
>> Let us go dst010:

But then it stopped and gave an elapsed time and an "All Done" message. Since I have 100 service accounts, why did it stop after 10?

Slow copy while checking files

Hi, I have a Gdrive with 500 TB of data. I used AutoRclone to copy it into another drive.

Now I have added an additional 50 TB to my original drive, but when using AutoRclone to copy into the other drive, it takes forever to check the existing data.
How can I improve the performance of the check process? Do we need to add an additional flag when running it?

Please help.
