Author: "Will Stevens" [email protected]
$ git clone https://github.com/openstack/python-swiftclient.git
$ cd python-swiftclient
$ python setup.py install
NOTE: This is only needed if you are using auth_version > 1
$ git clone https://github.com/openstack/python-keystoneclient.git
$ cd python-keystoneclient
$ pip install -r requirements.txt && python setup.py install
$ pip install boto
Test the performance of different object storage solutions. Run tests against a Swift API, an S3 API, or both. If you run tests against both the Swift and S3 APIs of a single provider, the average-performance graphs will show the Swift and S3 results on one graph for easy comparison.
- The `./run.py` script creates a new container and uploads files into it, then downloads those files and deletes them from the object store. Each of these operations is timed and the results are logged in the `logs` directory.
- The `./parse.py` script uses the logs as input and generates an HTML file (`html/index.html`) with graphs of the results.
- The `./clean.py` script is used if errors during a `./run.py` run leave files on the object store. Running it removes anything created by `./run.py` that remains on the object store.
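The timing pattern described above can be sketched as follows; `upload` and the log-line format here are hypothetical stand-ins for illustration, not the actual internals of `run.py`:

```python
import time

def timed(operation, *args):
    """Run an operation and return (result, elapsed_seconds)."""
    start = time.time()
    result = operation(*args)
    return result, time.time() - start

# Hypothetical stand-in for an upload call; the real script talks
# to the Swift or S3 API here instead.
def upload(name):
    return f"uploaded {name}"

result, elapsed = timed(upload, "small/file_0001")
log_line = f"upload small/file_0001 {elapsed:.4f}s"
print(log_line)
```

The same wrapper applies unchanged to the download and delete phases, which is what lets the logs compare all three operations on equal terms.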
This utility is completely non-destructive: it will not touch objects already on the object store, and it will not leave objects or containers behind when it finishes. That said, if errors occur during execution it may leave objects on the object store. If this happens, run the `./clean.py` script and it will clean up anything left over after the error.
By default these scripts create and use a container named `global_unique_bucket_name`. After execution this container is removed. If a container by that name already exists on your system, you will want to change that setting in your `config.py` file.
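For illustration, the `unique_container_name` setting described in the configuration section might look like this in `config.py`; treat the value as an example only:

```python
# config.py (fragment)
# Must be a container name that DOES NOT already exist on your account;
# the scripts create it, use it, and then remove it.
unique_container_name = "global_unique_bucket_name"  # change before running
```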
- Run the `./generate_files_*.sh` script that matches the OS you will run the tests from; this populates the `uploads` directory.
- Configure `connections` in the `config.py` settings file.
- Configure `test_cases` in the `config.py` settings file to determine which endpoints to hit, with what size files, and how many batches to run.
- Run `./run.py` to populate the `logs` and `downloads` directories.
- Run `./parse.py` to generate an HTML file with graphs of the results in `html/index.html`.
These scripts are used to initially populate the `uploads` directory with sample files. They create three groups of files in the `small`, `medium`, and `large` directories, which are then used in the tests.
- small: 20 files in 50k increments starting at 50k
- medium: 20 files in 5m increments starting at 5m
- large: 5 files in 200m increments starting at 200m
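The size scheme above is simple arithmetic; the real `generate_files_*.sh` scripts are shell scripts, but the progression can be sketched in Python to make it explicit:

```python
# Size groups as listed above: (file_count, increment_in_bytes)
KB, MB = 1024, 1024 * 1024
GROUPS = {
    "small":  (20, 50 * KB),   # 50k, 100k, ... 1000k
    "medium": (20, 5 * MB),    # 5m, 10m, ... 100m
    "large":  (5, 200 * MB),   # 200m, 400m, ... 1000m
}

def file_sizes(group):
    """Return the list of file sizes (in bytes) for a group."""
    count, step = GROUPS[group]
    return [step * i for i in range(1, count + 1)]

print(file_sizes("large"))  # five sizes, 200 MB through 1000 MB
```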
This is the main configuration file which is used by all the executable scripts. Check the sample file to understand the structure. In general you can configure the following things.
- connections: A dictionary of connection configurations for different object storage providers.
- test_cases: A list of test case dictionaries which outline the parameters of the tests you want to run.
- column_width: The width of the provider column in the graphed output. Use this to adjust the width of the output page.
- unique_container_name: The name of the container the tests should use. Careful: this container will be removed. The script creates it, so you do not have to set up anything; just make sure it is set to a name that DOES NOT already exist on your object storage account.
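A `config.py` following this description might be structured like the sketch below. The keys and values are illustrative guesses based on the text above, not the shipped sample file's actual schema, so check the sample file before relying on any of these names:

```python
# config.py (illustrative sketch; see the shipped sample file for the real schema)
connections = {
    "example-provider": {          # hypothetical provider entry
        "swift": {"auth_url": "https://auth.example.com/v1.0",
                  "user": "account:user", "key": "secret"},
        "s3":    {"host": "s3.example.com",
                  "access_key": "AKIA...", "secret_key": "secret"},
    },
}

test_cases = [
    # hypothetical test case: which connection/API, which file group, batch count
    {"connection": "example-provider", "api": "swift",
     "file_group": "small", "batches": 3},
]

column_width = 150                                   # provider column width in the graphs
unique_container_name = "global_unique_bucket_name"  # must not already exist
```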
This script uses the configured `connections` and `test_cases` and runs the batches of tests. Each test gets logged in the appropriate log file in the `logs` directory.
This script uses the configured `test_cases` and the logs generated by the `./run.py` script, and produces the HTML file `html/index.html` with graphs comparing the performance of the different object storage providers and the APIs they expose.
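The averaging step behind those graphs can be sketched as follows; the log-line format used here is a hypothetical stand-in for whatever `run.py` actually writes:

```python
# Hypothetical log lines: "<operation> <path> <seconds>"
log_lines = [
    "upload small/file_0001 0.42",
    "upload small/file_0002 0.38",
    "download small/file_0001 0.21",
]

def average_by_operation(lines):
    """Return {operation: mean_seconds} over all logged lines."""
    totals = {}
    for line in lines:
        op, _path, seconds = line.split()
        count, total = totals.get(op, (0, 0.0))
        totals[op] = (count + 1, total + float(seconds))
    return {op: total / count for op, (count, total) in totals.items()}

print(average_by_operation(log_lines))
```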
You should not need this script if everything goes well. If an error occurs while running the `./run.py` script, the container and the objects that were uploaded may not get cleaned up correctly. If that happens, run this script and it will clean up the objects and container that were left on the object store.
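The cleanup amounts to "delete every leftover object in the test container, then the container itself." Sketched here against a tiny in-memory stand-in rather than a real Swift or S3 connection, purely to show the order of operations:

```python
# In-memory stand-in for an object store: {container_name: set_of_object_names}
store = {"global_unique_bucket_name": {"small/file_0001", "small/file_0002"}}

def clean(store, container):
    """Delete all objects left in `container`, then remove the container."""
    for obj in list(store.get(container, ())):
        store[container].discard(obj)  # the real script issues a delete API call here
    store.pop(container, None)         # container must be empty before removal

clean(store, "global_unique_bucket_name")
print(store)  # {}
```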