Comments (6)

kriz87 avatar kriz87 commented on July 19, 2024

@SachiaLanlus : good question.

In my test case I downloaded 100 files, each around 500 KB.
download_async -> ~0:30 min
download_sync -> ~3:30 min

Why is download_async that much faster?

@ezhov-evgeny Is there a function to check whether the download_async process is completed for all 100 files?

from webdav-client-python-3.

SachiaLanlus avatar SachiaLanlus commented on July 19, 2024

> @SachiaLanlus : good question.
>
> In my testcase I downloaded 100 files, every file around 500 KB.
> download_async -> ~0:30 min
> download_sync -> ~3:30 min
>
> Why is download_async that much faster?
>
> @ezhov-evgeny Is there a function to check whether the download_async process is completed for all 100 files?

These results come from the difference between async and sync.
The async version spawns a worker thread and returns immediately.
The sync version runs on the main thread, so all requests are processed sequentially.

If you want to check whether download_async has completed, you can use the callback to set a global variable or some other signal.
Here is an example implementation.

from webdav3.client import Client
from threading import Semaphore
import glob

options = {
    'webdav_hostname': "host",
    'webdav_login':    "user",
    'webdav_password': "pswd",
    'VERBOSE': True,
}
max_concurrent_operation = 10
local_path_base = 'archivedwl-231\\'
remote_path_base = '000/upload/'


client = Client(options)

# Bound the number of in-flight async uploads.
semph = Semaphore(max_concurrent_operation)

file_list = [x[x.rfind('\\') + 1:] for x in glob.glob(local_path_base + '*')]

for f in file_list:
    full_local_path = local_path_base + f
    full_remote_path = remote_path_base + f
    semph.acquire()  # blocks once max_concurrent_operation uploads are running
    client.upload_async(remote_path=full_remote_path,
                        local_path=full_local_path,
                        callback=semph.release)
    print(f)

# Wait for all outstanding uploads by reclaiming every semaphore slot.
for i in range(max_concurrent_operation):
    semph.acquire()
print(len(client.list(remote_path_base)))

Actually, that was not my question;
I know the difference between async and sync.


ezhov-evgeny avatar ezhov-evgeny commented on July 19, 2024

@SachiaLanlus

> What is the recommendation of this implementation?

There are no built-in options for controlling the maximum number of threads. I recommend implementing it in your own code to improve performance with such a large number of files.

> Can I set maximum concurrent requests number?

It depends on network bandwidth and WebDAV server performance. Too many threads can actually slow uploading/downloading down. I suggest running a few tests in your environment, for example with 5, 20, and 100 threads, and deciding what works best in your case. I usually use 5-20 threads.

> And if I use python Threading to multi-thread it, should they share one client instance (and use download_async) or spawn one for each thread worker (and use download_sync)?

Actually, upload/download_async just creates a new thread and runs upload/download_sync in that thread.
You can use either method, whichever is easier for you to implement.

  • With the *_async methods, you will need to control the number of invocations to limit the number of concurrent threads.
  • With the *_sync methods, you will also need to create threads yourself, but in this case you can use a thread pool. I recommend that approach.

When you connect to a single server, you can use one client instance; it is thread safe.
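The thread-pool approach recommended above can be sketched with the standard library's concurrent.futures. This is only a sketch: download_all is a hypothetical helper name, and client is assumed to be a shared webdav3 Client instance (the import is omitted so the snippet stays self-contained; any object with a download_sync(remote_path=..., local_path=...) method works).

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def download_all(client, pairs, max_workers=5):
    """Download (remote_path, local_path) pairs with a bounded thread pool.

    One shared `client` is used by all workers, relying on the maintainer's
    note that a single instance is thread safe. Returns a list of
    (remote_path, exception) tuples for the transfers that failed.
    """
    failures = []
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        # Submit every transfer; the pool caps concurrency at max_workers.
        futures = {
            pool.submit(client.download_sync,
                        remote_path=remote, local_path=local): remote
            for remote, local in pairs
        }
        for fut in as_completed(futures):
            try:
                fut.result()  # re-raises any exception from the worker thread
            except Exception as exc:
                failures.append((futures[fut], exc))
    return failures
```

Because fut.result() re-raises worker exceptions on the calling thread, this also sidesteps the "exception lost in a background thread" problem discussed below.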


SachiaLanlus avatar SachiaLanlus commented on July 19, 2024

@ezhov-evgeny

Thanks for your reply.
I will try some experiments with different settings.

I have tried two styles of implementation.
The first uses a Python queue (maxsize=max_concurrent) with a dispatcher/worker/collector framework.
The second uses a semaphore, as in my reply above.
I will try the thread pool you recommend; it looks not only cool but simple!

And:

> When you connect to a single server, you can use one client instance; it is thread safe.

This line is what I needed.

Thank you very much!


kriz87 avatar kriz87 commented on July 19, 2024

Thanks for your code example.
I am pretty new to multithreading, and I have tried your semaphore example.

How can I handle exceptions in threads / semaphores?
In my case I got a lost-connection exception in one thread.
The semaphore was not released and the process ran forever.

I tried the following code, which did not solve my issue:

semph.acquire()
try:
    self.client.download_async(remote_path=rem_path, local_path=loc_path, callback=semph.release)
except Exception as e:
    print(e)
    semph.release()
    pass
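One way to guarantee the semaphore is released even when a transfer fails is to spawn the worker thread yourself and release in a finally block. This is a hedged sketch: guarded_download is a hypothetical helper, and since it is not documented whether webdav3 still invokes the callback when download_async fails, the sketch avoids relying on the callback at all and wraps download_sync instead.

```python
from threading import Semaphore, Thread

def guarded_download(client, remote_path, local_path, semph):
    """Run download_sync in a worker thread, releasing the semaphore in a
    finally block so a failed transfer cannot block the main loop forever."""
    def run():
        try:
            client.download_sync(remote_path=remote_path,
                                 local_path=local_path)
        except Exception as exc:
            # The exception is caught here, inside the worker thread,
            # where it is actually raised; handle or log it as needed.
            print(f"download failed for {remote_path}: {exc}")
        finally:
            semph.release()  # always runs, on success or failure
    Thread(target=run).start()
```

Usage mirrors the semaphore loop above: semph.acquire() before each call, then guarded_download(client, rem_path, loc_path, semph).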


SachiaLanlus avatar SachiaLanlus commented on July 19, 2024

@kriz87
If you want to handle the exception, you have to catch it inside the worker thread itself.
The try/except in your snippet runs on the main thread, but download_async raises in its own worker thread, so nothing is caught there.
I wrote another example:

from webdav3.client import Client
import glob
import queue
import threading

options = {
    'webdav_hostname': "host",
    'webdav_login':    "username",
    'webdav_password': "pswd",
    'VERBOSE': True,
}
max_concurrent_operation = 5
local_path_base = 'userlogs\\'
remote_path_base = '000/upload/'


# Worker: take file names off the input queue until a None sentinel arrives.
# The exception is caught inside the worker thread, where it is raised.
def worker(client):
    while True:
        f = global_input_queue.get()
        if f is None:
            break
        full_local_path = local_path_base + f
        full_remote_path = remote_path_base + f
        while True:
            try:
                client.upload_sync(remote_path=full_remote_path,
                                   local_path=full_local_path)
            except Exception:
                # handle the exception here (log, back off, ...);
                # note that a bare retry loops forever on a permanent error
                continue
            break
        global_output_queue.put(f)


client = Client(options)
file_list = [x[x.rfind('\\') + 1:] for x in glob.glob(local_path_base + '*')]
global_input_queue = queue.Queue()
global_output_queue = queue.Queue()

# dispatch
for f in file_list:
    global_input_queue.put(f)

# start workers (they share one client instance, which is thread safe)
threads = list()
for i in range(max_concurrent_operation):
    t = threading.Thread(target=worker, args=(client,))
    t.start()
    threads.append(t)

# collect: one result per dispatched file
for index in range(len(file_list)):
    entry = global_output_queue.get()
    if entry is not None:
        print(entry)

# shut down: one sentinel per worker, then join
for i in range(max_concurrent_operation):
    global_input_queue.put(None)
for t in threads:
    t.join()

print(len(client.list(remote_path_base)))

