Comments (30)
Current PyPI maintainer @AvdN responded to me by e-mail and assured me this will be fixed once the new people doing the Python 3 rewrite have established themselves and #52 has been fixed.
from rdiff-backup.
I have added Otto as maintainer on PyPI based on the fact that at least one of the members on savannah.nongnu.org (Andrew Foster) has been involved in the mailing list thread and that the project page there has been updated with information on the move of the project.
Seems Travis CI has PyPI integration, so we could push each tag on the master branch there automatically in the future: https://docs.travis-ci.com/user/deployment/pypi/
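For reference, a deploy stanza along the lines of the Travis docs might look like this (an untested sketch; the token placeholder and distribution list are assumptions, not the project's actual config):

```yaml
# Sketch of a .travis.yml deploy stage per the Travis PyPI docs
# (values here are placeholders, not the project's actual config)
deploy:
  provider: pypi
  user: __token__
  password:
    secure: "<encrypted PyPI token>"
  distributions: "sdist bdist_wheel"
  skip_existing: true
  on:
    tags: true
```

The `on: tags: true` condition is what would restrict uploads to tagged releases only.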
I wouldn't want to rely too much on Travis' capabilities, as it makes us dependent on GitHub/Microsoft. AFAICT setup.py upload
does the trick as well; it could be called from within Travis but also outside of it.
I want to bring this topic back and try to follow our contribution guidelines a bit more. I want to improve the current Travis CI/CD pipeline to build and test packages compatible with PyPI.
Since rdiff-backup has C modules, the recommended way is to use manylinux images to compile & test the packages.
Here are the highlights of the changes I'm proposing:
- Replace the debian:sid docker image by manylinux images
- Remove the Dockerfile
- Change travis pipeline to have stages:
- test: where all our tox.ini tests run with a matrix for py35, py36, py37
- deploy: where I will add a new step to build the wheel packages and push them to PyPI with a matrix for manylinux2010_x86_64, manylinux2010_i686, manylinux1_x86_64, manylinux1_i686. That includes py35, py36, py37
The question is: should we keep the Makefile?
@ericzolf @ottok Are you using that Makefile during your development? If not, I would recommend moving this logic into travis.yml or into a bash script under /tools/.
I want to bring this topic back and try to follow our contribution guidelines a bit more. I want to improve the current Travis CI/CD pipeline to build and test packages compatible with PyPI.
We all agree with the objective, I think.
Since rdiff-backup has C modules, the recommended way is to use manylinux images to compile & test the packages.
I'll trust you on this one, no clue.
1. Replace the debian:sid docker image by manylinux images
2. Remove the Dockerfile
I'm confused: how do you use the manylinux images? Aren't they docker images? RTFM with a link is fine.
3. Change travis pipeline to have stages:
* test: where all our tox.ini tests run with a matrix for py35, py36, py37
* deploy: where I will add a new step to build the wheel packages and push them to PyPI with a matrix for manylinux2010_x86_64, manylinux2010_i686, manylinux1_x86_64, manylinux1_i686. That includes py35, py36, py37
- What are the differences between the manylinux 2010 and 1?
- What about py38? It was released almost 2 weeks ago: https://docs.python.org/3/whatsnew/3.8.html
- Rather as a side note, I'd like to have the slow tests taken out of the PR tests and moved to daily tests, so that we lose less time on PRs. We can improve this later.
The question is: should we keep the Makefile?
@ericzolf @ottok Are you using that Makefile during your development? If not, I would recommend moving this logic into travis.yml or into a bash script under /tools/.
I'm not; I'm only using setup.py and tox. The thing is that Otto created it for a good reason: he wanted the testing Docker approach to be available locally to developers, which I think is a rather good thing. Even though I'm happy to test directly on my laptop, this isn't exactly best practice, and a container is more secure. If there is a command-line tool to do the same locally with travis.yml, fine with me.
This said, moving as much cruft as possible out of the root directory is sensible in my view, hence moving stuff to tools/
sounds good to me.
I've added this issue to the Milestone 2.0.0. Feel free to disagree, but for me it belongs to goal no. 3.
Since rdiff-backup has C modules, the recommended way is to use manylinux images to compile & test the packages.
I'll trust you on this one, no clue.
To create a wheel package that is compatible with all Linux distros, we need to compile against a very old version of glibc. The manylinux images provide all of this in a Docker image that is simple to use.
I'm confused: how do you use the manylinux images? Aren't they docker images? RTFM with a link is fine.
Those are Docker images with multiple versions of Python pre-compiled, so with the same image we can build everything from py27 to py38 if we want. We need to use those to comply with the PyPI repository: only manylinux packages can be uploaded to PyPI.
Here is a link to the manylinux project: https://github.com/pypa/manylinux
And a demo of its usage: https://github.com/pypa/python-manylinux-demo
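As a rough illustration of why the image matters: PyPI decides acceptability from the wheel's platform tag, which is the last dash-separated field of the filename (the filenames below are made-up examples, not actual rdiff-backup artifacts):

```python
# Extract the PEP 427 platform tag from a wheel filename.
def wheel_platform_tag(filename: str) -> str:
    # wheel names look like: name-version(-build)?-python-abi-platform.whl
    stem = filename[: -len(".whl")]
    return stem.split("-")[-1]

# manylinux tags are accepted by PyPI...
print(wheel_platform_tag("rdiff_backup-1.4.0-cp37-cp37m-manylinux1_x86_64.whl"))
# -> manylinux1_x86_64
# ...while plain "linux" tags are rejected on upload
print(wheel_platform_tag("rdiff_backup-1.4.0-cp37-cp37m-linux_x86_64.whl"))
# -> linux_x86_64
```

Building inside the manylinux images (and running auditwheel there) is what produces wheels carrying the accepted tags.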
What are the differences between the manylinux 2010 and 1?
Those are different Docker images. manylinux1 is based on CentOS 5 and deprecated, but still required by some platforms. manylinux2010 is based on CentOS 6 and still supported. We can decide to use only 2010, but the effort to make manylinux1 work is minimal.
What about py38? It's been released since almost 2 weeks https://docs.python.org/3/whatsnew/3.8.html
No worries, manylinux already provides a Python 3.8 environment.
Rather as a side note, I'd like to have the slow tests taken out of the PR tests and moved to daily tests, so that we lose less time on PRs. We can improve this later.
Got it
I'm not; I'm only using setup.py and tox. The thing is that Otto created it for a good reason: he wanted the testing Docker approach to be available locally to developers, which I think is a rather good thing. Even though I'm happy to test directly on my laptop, this isn't exactly best practice, and a container is more secure. If there is a command-line tool to do the same locally with travis.yml, fine with me.
This said, moving as much cruft as possible out of the root directory is sensible in my view, hence moving stuff to tools/ sounds good to me.
I've added this issue to the Milestone 2.0.0. Feel free to disagree, but for me it belongs to goal no. 3.
I will wait a bit for @ottok's comments. I'm not a big fan of Makefiles for Python projects and prefer to stick with Python tools. People who want to use Docker to run the build locally will be free to use the bash script I'm going to provide in the /tools/ directory. I will also take a very similar approach for the AppVeyor build once I'm done with this one.
There should not be any conflict between Python tools and Docker. Docker is used just as a wrapper to run the build in an isolated container, which guarantees a certain level of automation and documentation. Anybody can still do the build by running the same Python tools manually, having them installed locally.
Anyway, I think the goal should be (as stated in #58 (comment)) to make an automatic integration from git/GitHub so that the latest versions show up on PyPI automatically, catering for people who wish to use PyPI without too much effort.
Current PyPI maintainer @AvdN responded to me by e-mail and assured me this will be fixed once the new people doing the Python 3 rewrite have established themselves and #52 has been fixed.
@AvdN we now have control over the Savannah site, and #52 (now #1 of rdiff-backup.net) is closed. Can you tell us what the next step could be?
I set up Otto as a maintainer on PyPI for the project a long time ago, so he should be able to do
python setup.py sdist
twine upload rdiff-backup_XYZ.tar.gz
I now made him an owner.
If you want to make pre-build binary wheel files, I can give some pointers on how I do that for ruamel.yaml using manylinux (on my local linux box), appveyor for windows and a MacOS VM for Apple.
There is also an rdiff-backup account on PyPI, which is a maintainer of rdiff-backup; let me know if someone wants the password for it. I am not using it (but emails for the account currently go to me).
@ottok has not been very active recently (he should feel free to chime in if he disagrees), so could you also give me the necessary rights?
Otherwise, any pointer is more than welcome! Is https://bitbucket.org/ruamel/yaml/src/default/ still the correct repository (I see only Windows builds)?
For the record, PyPI project owners/access since December 2019:
Yes! Thank you! The next step, once we've fixed releasing to GitHub, is to release to PyPI.
What is the status with https://pypi.org/project/rdiff-backup/ ? I don't see anything new there even though I added you as admins in December when I got access. I assume you have some plan for an automatic release pipeline as envisioned above?
Do you mind if I do a manual beta release now so we get more testing feedback?
Notes on steps to upload:
$ cat $HOME/.pypirc
[pypi]
username = __token__
password = pypi-AgEIc....
$ git diff
diff --git a/Dockerfile b/Dockerfile
index 1ae1552..709762d 100644
--- a/Dockerfile
+++ b/Dockerfile
@@ -39,3 +39,9 @@ ENV RDIFF_TEST_USER testuser
ENV RDIFF_TEST_GROUP testuser
RUN useradd -ms /bin/bash --uid ${RDIFF_TEST_UID} ${RDIFF_TEST_USER}
+
+RUN DEBIAN_FRONTEND=noninteractive apt-get update -yqq && \
+ apt-get install -y --no-install-recommends \
+ python3-pip
+
+RUN python3 -m pip install --upgrade setuptools wheel auditwheel
diff --git a/Makefile b/Makefile
index 09719cd..fedfc4f 100644
--- a/Makefile
+++ b/Makefile
@@ -32,7 +32,7 @@ test-runtime-root: test-runtime-files
test-runtime-slow: test-runtime-files
@echo "=== Long running performance tests ==="
- ${RUN_COMMAND} tox -c tox_slow.ini -e py
+ ${RUN_COMMAND} tox -c tox_slow.ini -e pypython3 setup.py sdist bdist_wheel
build:
# Build rdiff-backup (assumes src/ is in directory 'rdiff-backup' and it's
@@ -43,8 +43,8 @@ bdist_wheel:
# Prepare wheel for deployment.
# See the notes for target "build"
# auditwheel unfortunately does not work with modern glibc
- ${RUN_COMMAND} ./setup.py bdist_wheel
- # ${RUN_COMMAND} auditwheel repair dist/*.whl
+ ${RUN_COMMAND} python3 setup.py sdist bdist_wheel
+ ${RUN_COMMAND} auditwheel repair dist/*.whl
$ make container
$ make bdist_wheel
$ python3 -m pip install --user --upgrade twine keyrings.alt
$ python3 -m twine upload --verbose dist/rdiff-backup-1.4.0b1.dev34+g1cce874.d20200119.tar.gz
--> fails as PyPI does not allow local version strings
Still trying to figure out the full procedure...
Sorry, I'm still stuck on #221 (time issue, the beginning of the year was tough). I don't mind a manual upload, just leave .travis.yml alone until I've re-factored it please.
OK, I fully figured it out now and uploaded:
$ python3 -m twine upload --verbose dist/rdiff-backup-1.4.0b2.tar.gz
Uploading distributions to https://upload.pypi.org/legacy/
Uploading rdiff-backup-1.4.0b2.tar.gz
100%|| 301k/301k [00:02<00:00, 114kB/s]
View at:
https://pypi.org/project/rdiff-backup/1.4.0b2/
Notes:
- PyPI does not allow local version segments (https://www.python.org/dev/peps/pep-0440/#local-version-segments)
- As scm_version generates the version, I had to locally make a temporary tag v1.4.0b2 to get a PyPi-compatible build (https://github.com/pypa/setuptools_scm/#default-versioning-scheme)
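A minimal sketch of such a filter (the helper name is made up, not from the project's code; PEP 440 calls everything after "+" a local version segment, and setuptools_scm inserts ".devN" between tags):

```python
# Reject versions PyPI would refuse or that we should not publish:
# local segments ("+g1cce874...") and dev builds (".dev34").
def is_publishable(version: str) -> bool:
    return "+" not in version and ".dev" not in version

print(is_publishable("1.4.0b2"))                           # -> True
print(is_publishable("1.4.0b1.dev34+g1cce874.d20200119"))  # -> False
```

This is why building from a commit with a proper tag (like the temporary v1.4.0b2 above) yields a PyPI-compatible version string.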
Uploaded manually https://pypi.org/project/rdiff-backup/1.9.0b0/
I'm allowing myself to take over ownership of this issue to also automate the release to PyPI.
@ikus060, if you have time left, please review/merge my PRs.
@ericzolf Sorry, I'm not as present as I would like to be. At least I'm tracking the changes made to the code.
Tag me on the PRs and I'll take a look.
I'm currently stuck due to https://travis-ci.community/t/pypi-deploy-failing-on-windows-couldnt-install-pip-setuptools-twine-or-wheel/6338 on the branch ericzolf-add-pypi-deploy-58.
This given, I also learned that a PyPI release can only work with a "simple"-looking version, i.e. not something like 1.9.0b1.dev4+g8d7d85b. Need to check if and how I can filter on such a condition.
This given, I also learned that pypi release can only work with a "simple" looking version, i.e. not
something like 1.9.0b1.dev4+g8d7d85b. Need to check if and how I can filter on such a condition.
Yes, we should not deploy development versions to PyPI; only tagged versions should be released.
I agree, and they were tagged, but this makes development of the feature quite difficult.
Anyway the bug is already addressed: travis-ci/dpl#1161 and travis-ci/dpl#1162 but no clue when it'll be released.
To test we can use https://test.pypi.org/, just need to set it up in a smart way...
@ikus060 can you please make us maintainer of https://test.pypi.org/project/rdiff-backup ?
@ericzolf Done, I added you as maintainer. The other users are not registered on test.pypi.org.
In eb20edc I enabled the commented-out line auditwheel repair dist/*.whl,
but it didn't work and I am not familiar with auditwheel. Do we want to have it working and part of the build chain?
In the same test I also extended the Dockerfile. It is good practice to do release builds in a clean environment with well defined dependencies installed.
I understand the beauty of having a git tag that automatically makes everything happen, but it can be hard to get every detail in place. Releases don't happen so often, so I don't think it is that bad either if they are done manually.
In eb20edc I enabled the commented-out line
auditwheel repair dist/*.whl
but it didn't work and I am not familiar with auditwheel. Do we want to have it working and part of the build chain?
No, you can remove it if you're working on the Makefile; it was an experiment, since replaced by the 3 manylinux jobs we now have in our pipeline.
In the same test I also extended the Dockerfile. It is good practice to do release builds in a clean environment with well defined dependencies installed.
Nothing to say against this. This said, the Travis CI environment is well-defined enough for me, and using a Docker container adds even more time to the pipeline, so IMHO it should be considered carefully for each use case.
I understand the beauty of having a git tag that automatically makes everything happen, but it can be hard to get every detail in place. Releases don't happen so often, so I don't think it is that bad either if they are done manually.
Can you be more specific? We're probably two PRs away from being successful, I wouldn't want to give up now.
I understand the beauty of having a git tag that automatically makes everything happen, but it can be hard to get every detail in place. Releases don't happen so often, so I don't think it is that bad either if they are done manually.
Can you be more specific? We're probably two PRs away from being successful, I wouldn't want to give up now.
The way it's working right now is great. We create a tag and voilà, stuff gets compiled and pushed everywhere. That's called continuous delivery. The days of doing this manually are long gone. Nobody wants the burden of doing it manually, and having everything automated makes it easier to pass the ball to others.