scientific-python / cookie

Scientific Python Library Development Guide and Cookiecutter

Home Page: https://learn.scientific-python.org/development

License: BSD 3-Clause "New" or "Revised" License

Python 98.29% Ruby 1.71%
scikit-hep python pypi-package cookiecutter cookiecutter-python3 cookiecutter-python

cookie's People

Contributors

0xtowel, agriyakhetarpal, blakenaccarato, burgholzer, carreau, danielballan, danielhollas, dependabot[bot], dimitripapadopoulos, dlu, eduardo-rodrigues, github-actions[bot], grlee77, henryiii, hugovk, jack-mcivor, jarrodmillman, jeanelsner, jfrost-mo, klieret, kratsg, kreczko, lsetiawan, matthewfeickert, maxmynter, mfisher87, mwtoews, pre-commit-ci[bot], rgommers, tkoyama010

cookie's Issues

Ruff `target-version` can be automatically inferred if `requires-python` is set

According to https://beta.ruff.rs/docs/settings/#target-version, the target-version property used for ruff can be inferred from the project.requires-python field, if it is present.
Repo-review currently requires the target-version to be set, as implemented in:

class RF002(Ruff):
    "Target version must be set"
    requires = {"RF001"}

    @staticmethod
    def check(pyproject: dict[str, Any]) -> bool:
        """
        Must select a minimum version to target. Affects pyupgrade,
        isort, and others.
        """
        match pyproject:
            case {"tool": {"ruff": {"target-version": str()}}}:
                return True
            case _:
                return False

Based on the above, this check could be relaxed a little. Maybe to something like:

class RF002(Ruff):
    "Target version must be set"
    requires = {"RF001"}

    @staticmethod
    def check(pyproject: dict[str, Any]) -> bool:
        """
        Must select a minimum version to target. Affects pyupgrade,
        isort, and others. Can be inferred from project.requires-python.
        """

        match pyproject:
            case {"tool": {"ruff": {"target-version": str()}}}:
                return True
            case {"project": {"requires-python": str()}}:
                return True
            case _:
                return False

Or the check could even suggest not specifying tool.ruff.target-version at all whenever project.requires-python is specified, as sketched below.
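A minimal sketch of what such a companion check might look like, reusing the class layout quoted above (the RF003 name, message, and exact pattern are illustrative only, not part of repo-review):

```python
from typing import Any


class RF003(Ruff):  # hypothetical check; `Ruff` is the same base class used above
    "Redundant target version"
    requires = {"RF001"}

    @staticmethod
    def check(pyproject: dict[str, Any]) -> bool:
        """
        Ruff infers target-version from project.requires-python, so setting
        tool.ruff.target-version as well is redundant.
        """
        match pyproject:
            case {
                "project": {"requires-python": str()},
                "tool": {"ruff": {"target-version": str()}},
            }:
                return False
            case _:
                return True
```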

Homepage in [project.urls] not found

Hi, thank you for this great project. It is really helpful.

I discovered that the "Homepage" URL defined in [project.urls] returns a 404. Maybe the GitHub Pages deployment is not used anymore?

See:

Homepage = "https://scientific-python.github.io/cookie"

Just an idea: this might be another check worth including, verifying that the URLs in pyproject.toml are valid and accessible (rough sketch below).
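A rough sketch of what such a check could do, assuming network access is available (which may not hold when repo-review runs via Pyodide); the function name and use of urllib here are illustrative, not part of repo-review:

```python
import urllib.error
import urllib.request
from typing import Any


def project_urls_reachable(pyproject: dict[str, Any]) -> bool:
    """Return True only if every URL in [project.urls] answers an HTTP HEAD request."""
    urls = pyproject.get("project", {}).get("urls", {})
    for url in urls.values():
        request = urllib.request.Request(url, method="HEAD")
        try:
            urllib.request.urlopen(request, timeout=10)
        except urllib.error.URLError:  # HTTPError (e.g. 404) is a subclass
            return False
    return True
```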

MANIFEST.in for setuptools/pybind11 is excluding too much?

This line:

https://github.com/scikit-hep/cookie/blob/cc1ea50641fc6f87b55ba5143d954a4c672fe07c/%7B%7Bcookiecutter.project_name%7D%7D/MANIFEST-setuptools%2Cpybind11.in#L5

seems to remove all files. As an example, here is a script I wrote for a real project that mimics how distutils parses MANIFEST.in line by line:

$ cat file_list.py 
from distutils.filelist import FileList
file_list = FileList()

for line in open('MANIFEST.in').readlines():
    line = line.strip()
    if not line: continue
    print(line)
    file_list.process_template_line(line)
    print(file_list.files, end='\n'*2)

and the output

graft src
['src/pylibmagic/_version.pyi', 'src/pylibmagic/_version.py', 'src/pylibmagic/__init__.py', 'src/pylibmagic/py.typed', 'src/pylibmagic/__pycache__/__init__.cpython-39.pyc', 'src/pylibmagic.egg-info/PKG-INFO', 'src/pylibmagic.egg-info/not-zip-safe', 'src/pylibmagic.egg-info/SOURCES.txt', 'src/pylibmagic.egg-info/requires.txt', 'src/pylibmagic.egg-info/top_level.txt', 'src/pylibmagic.egg-info/dependency_links.txt']

graft tests
['src/pylibmagic/_version.pyi', 'src/pylibmagic/_version.py', 'src/pylibmagic/__init__.py', 'src/pylibmagic/py.typed', 'src/pylibmagic/__pycache__/__init__.cpython-39.pyc', 'src/pylibmagic.egg-info/PKG-INFO', 'src/pylibmagic.egg-info/not-zip-safe', 'src/pylibmagic.egg-info/SOURCES.txt', 'src/pylibmagic.egg-info/requires.txt', 'src/pylibmagic.egg-info/top_level.txt', 'src/pylibmagic.egg-info/dependency_links.txt', 'tests/test_package.py', 'tests/test_compiled.py', 'tests/__pycache__/test_package.cpython-39-pytest-7.1.1.pyc', 'tests/__pycache__/test_compiled.cpython-39-pytest-7.1.1.pyc']

include LICENSE README.md pyproject.toml setup.py setup.cfg
['src/pylibmagic/_version.pyi', 'src/pylibmagic/_version.py', 'src/pylibmagic/__init__.py', 'src/pylibmagic/py.typed', 'src/pylibmagic/__pycache__/__init__.cpython-39.pyc', 'src/pylibmagic.egg-info/PKG-INFO', 'src/pylibmagic.egg-info/not-zip-safe', 'src/pylibmagic.egg-info/SOURCES.txt', 'src/pylibmagic.egg-info/requires.txt', 'src/pylibmagic.egg-info/top_level.txt', 'src/pylibmagic.egg-info/dependency_links.txt', 'tests/test_package.py', 'tests/test_compiled.py', 'tests/__pycache__/test_package.cpython-39-pytest-7.1.1.pyc', 'tests/__pycache__/test_compiled.cpython-39-pytest-7.1.1.pyc', 'LICENSE', 'README.md', 'pyproject.toml', 'setup.py', 'setup.cfg']

global-exclude __pycache__ *.py[cod] .*
warning: no previously-included files matching '__pycache__' found anywhere in distribution
['src/pylibmagic.egg-info/PKG-INFO', 'src/pylibmagic.egg-info/not-zip-safe', 'LICENSE']

If instead I drop the .* pattern at the end of that line, I get

global-exclude __pycache__ *.py[cod]
warning: no previously-included files matching '__pycache__' found anywhere in distribution
['src/pylibmagic/_version.pyi', 'src/pylibmagic/_version.py', 'src/pylibmagic/__init__.py', 'src/pylibmagic/py.typed', 'src/pylibmagic.egg-info/PKG-INFO', 'src/pylibmagic.egg-info/not-zip-safe', 'src/pylibmagic.egg-info/SOURCES.txt', 'src/pylibmagic.egg-info/requires.txt', 'src/pylibmagic.egg-info/top_level.txt', 'src/pylibmagic.egg-info/dependency_links.txt', 'tests/test_package.py', 'tests/test_compiled.py', 'LICENSE', 'README.md', 'pyproject.toml', 'setup.py', 'setup.cfg']

which looks potentially better. I suspect the intent was a line like exclude .*, since the goal was presumably to exclude (hidden) files starting with a period, but global-exclude produces a pattern that can match anywhere in the path. See my investigation below:

>>> files = ['src/pylibmagic/_version.pyi', 'src/pylibmagic/_version.py', 'src/pylibmagic/__init__.py', 'src/pylibmagic/py.typed', 'src/pylibmagic/__pycache__/__init__.cpython-39.pyc', 'src/pylibmagic.egg-info/PKG-INFO', 'src/pylibmagic.egg-info/not-zip-safe', 'src/pylibmagic.egg-info/SOURCES.txt', 'src/pylibmagic.egg-info/requires.txt', 'src/pylibmagic.egg-info/top_level.txt', 'src/pylibmagic.egg-info/dependency_links.txt', 'tests/test_package.py', 'tests/test_compiled.py', 'tests/__pycache__/test_package.cpython-39-pytest-7.1.1.pyc', 'tests/__pycache__/test_compiled.cpython-39-pytest-7.1.1.pyc', 'LICENSE', 'README.md', 'pyproject.toml', 'setup.py', 'setup.cfg']

>>> from distutils.filelist import translate_pattern
>>> translate_pattern(".*", 0, None, 0) # action: global-exclude
re.compile('(?s:\\.[^/]*)\\Z')
>>> translate_pattern(".*", 1, None, 0) # action: exclude
re.compile('(?s:\\A\\.[^/]*)\\Z')
>>> [f for f in files if translate_pattern(".*", 0, None, 0).search(f)]
['src/pylibmagic/_version.pyi', 'src/pylibmagic/_version.py', 'src/pylibmagic/__init__.py', 'src/pylibmagic/py.typed', 'src/pylibmagic/__pycache__/__init__.cpython-39.pyc', 'src/pylibmagic.egg-info/SOURCES.txt', 'src/pylibmagic.egg-info/requires.txt', 'src/pylibmagic.egg-info/top_level.txt', 'src/pylibmagic.egg-info/dependency_links.txt', 'tests/test_package.py', 'tests/test_compiled.py', 'tests/__pycache__/test_package.cpython-39-pytest-7.1.1.pyc', 'tests/__pycache__/test_compiled.cpython-39-pytest-7.1.1.pyc', 'README.md', 'pyproject.toml', 'setup.py', 'setup.cfg']
>>> [f for f in files if translate_pattern(".*", 1, None, 0).search(f)]
[]

which shows that the global-exclude pattern is perhaps matching too greedily.

See https://github.com/python/cpython/blob/b3f2d4c8bab52573605c96c809a1e2162eee9d7e/Lib/distutils/filelist.py#L115 for reference (anchor=0 or anchor=1).

Follow up after nox PR

  • Docs generation needs testing (I think it's broken) #34
  • Trampolim should be made statically versioned until FFY00/trampolim#4 is fixed.
  • We should probably add test skips for PyPy on macOS and Windows in CIBW
  • Tempting to bump to CIBW 2.0.0b2!
  • Add spellcheck #34

pipx plus copier does not work as described in readme

See below:

Lord Stark:~$ pipx install copier
  installed package copier 9.1.1, installed using Python 3.8.6
  These apps are now globally available
    - copier
⚠️  Note: '/Users/kratsg/.local/bin' is not on your PATH environment variable. These apps will not be globally
    accessible until your PATH is updated. Run `pipx ensurepath` to automatically add it, or manually modify your
    PATH in your shell's config file (i.e. ~/.bashrc).
done! ✨ 🌟 ✨
Lord Stark:~$ pipx inject copier copier-templates-extensions
  injected package copier-templates-extensions into venv copier
done! ✨ 🌟 ✨
Lord Stark:~$ pipx run copier copy gh:scientific-python/cookie itkdb-reports --trust
Traceback (most recent call last):
  File "/Users/kratsg/.local/pipx/.cache/b3c4e793c52a986/bin/copier", line 5, in <module>
    from copier.__main__ import copier_app_run
  File "/Users/kratsg/.local/pipx/.cache/b3c4e793c52a986/lib/python3.8/site-packages/copier/__init__.py", line 6, in <module>
    from .main import *  # noqa: F401,F403
  File "/Users/kratsg/.local/pipx/.cache/b3c4e793c52a986/lib/python3.8/site-packages/copier/main.py", line 46, in <module>
    from .subproject import Subproject
  File "/Users/kratsg/.local/pipx/.cache/b3c4e793c52a986/lib/python3.8/site-packages/copier/subproject.py", line 15, in <module>
    from .template import Template
  File "/Users/kratsg/.local/pipx/.cache/b3c4e793c52a986/lib/python3.8/site-packages/copier/template.py", line 20, in <module>
    from yamlinclude import YamlIncludeConstructor
ModuleNotFoundError: No module named 'yamlinclude'

Python 3.10: not all pre-commit hooks supported

While #41 introduced Python 3.10 to the CI, not all pre-commit hooks support that version.

One such hook is pycln:

ERROR: Package 'pycln' requires a different Python: 3.10.0 not in '<3.10,>=3.6.2'

It is still an open issue: hadialqattan/pycln#78

In addition, even the pyproject.toml templates include lines like

# https://github.com/scikit-hep/cookie/blob/main/%7B%7Bcookiecutter.project_name%7D%7D/pyproject-flit.toml#L34
requires = [
  "typing_extensions>=3.7; python_version<'3.8'",
]

So I assume that this will also create failures.

Allow for black mirror in pre-commit check

Note that the latest black docs now recommend that users use https://github.com/psf/black-pre-commit-mirror instead of https://github.com/psf/black, because the version of black provided by the mirror is faster.

The mirror supports all black tags from 22.3.0 through to the current 23.7.0 tag. Hence, I think the guide should start recommending this mirror, or at the very least the sp-repo-review tool should allow the use of its URL.

Add `hatch-vcs`?

I have been using cookie for hatch projects recently, but I could not find an option to include hatch-vcs using the CLI. The setuptools backend includes setuptools_scm by default; should the hatchling backend include hatch-vcs by default too, or is it left out intentionally? Thanks!

Edit: hatch-vcs is mentioned in the guide, but not included in cookie.

Consider adding hatchling

Yet another PEP 517 backend is hatchling. I'm not sure if it's got anything that makes it stand out from the rest. I think it might support plug-ins. Its parent project, hatch, is an env manager similar to pdm and Poetry.

Ruff select ALL regression

It appears that the issue reported in #283 has reappeared in the latest version (2023.10.27), although it was fixed in the previous version (2023.09.21).

See astropy/astropy#15367; changing cookie's pre-commit version from 2023.09.21 to 2023.10.27 reproduces this error.

A few updates

Based on the UHI repo, I've noticed a few issues that could be improved:

  • It looks like a .gitignore is not included.
  • The CONTRIBUTING should have specific instructions for Poetry (and flit, if needed).
  • There should be a docs extra and test extra
  • Maybe include a .readthedocs.yml
  • CI shouldn't do editable installs.
  • check-manifest doesn't work with other systems, drop from setup.cfg
  • Maybe use pipx for build (with update to docs)

Add guide on executable documentation

Following up on #136, investigate whether we should recommend:

  • Jupytext
  • the ipython sphinx directive
  • the matplotlib plot directive

I suspect that Jupytext is now the best way to do this, and more compatible with a Markdown-based approach, but I want to educate myself more.

Attribution page

We should add some sort of attribution page (or a place on a page) giving credit to the places the content used to live and the people who worked on it.

Error when running repo-review with pandas

Hello,

I receive the error below when trying to run repo-review with pandas (url)

When I tried with other popular repos (numpy, xarray), there was no problem.

Is this a bug? Thank you very much.

Running Python via Pyodide
Traceback (most recent call last):
  File "/lib/python311.zip/_pyodide/_base.py", line 468, in eval_code
    .run(globals, locals)
     ^^^^^^^^^^^^^^^^^^^^
  File "/lib/python311.zip/_pyodide/_base.py", line 310, in run
    coroutine = eval(self.code, globals, locals)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "", line 9, in 
  File "/lib/python3.11/site-packages/repo_review/processor.py", line 214, in process
    result = apply_fixtures({"name": name, **fixtures}, tasks[name].check)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/lib/python3.11/site-packages/repo_review/fixtures.py", line 106, in apply_fixtures
    return func(**kwargs)
           ^^^^^^^^^^^^^^
  File "/lib/python3.11/site-packages/sp_repo_review/checks/pyproject.py", line 92, in check
    return "minversion" in options and float(options["minversion"]) >= 6
                                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
ValueError: could not convert string to float: '7.3.2'
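The failure comes from comparing minversion with float(), which cannot parse versions containing two dots such as "7.3.2". A hedged sketch of a more robust comparison (whether sp-repo-review can or does depend on packaging here is an assumption):

```python
from packaging.version import Version


def minversion_is_at_least_6(options: dict[str, object]) -> bool:
    """float("7.3.2") raises ValueError; Version handles multi-part version strings."""
    return "minversion" in options and Version(str(options["minversion"])) >= Version("6")
```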

No license metadata in pyproject.toml with cookiecutter

Hi,
Thanks for your great guide and these templates; they are very useful, and I discovered many things (copier, cruft, flit_scm, ...).

I am trying to compare several backends and I have generated project templates with "Hatchling", "Flit-core" and "Setuptools with pyproject.toml".
I left the default BSD license for all.

None of the templates has license metadata in the [project] table of the pyproject.toml file.
There is only the classifier License :: OSI Approved :: BSD License.

When the wheel is built with Flit, it contains no LICENSE file.

Here is what I did for the "Hatchling" backend:

$ cookiecutter gh:scientific-python/cookie
  [1/9] The name of your project (package): my_scientific_package
  [2/9] The name of your (GitHub?) org (org): mylogin
  [3/9] The url to your GitHub or GitLab repository (https://github.com/mylogin/my_scientific_package):
  [4/9] Your name (My Name): My full name
  [5/9] Your email ([email protected]): [email protected]
  [6/9] A short description of your project (A great package.): Testing the scientific templates for different backends
  [7/9] Select a license
    1 - BSD
    2 - Apache
    3 - MIT
    Choose from [1/2/3] (1):
  [8/9] Choose a build backend
    1 - Hatchling                      - Pure Python (recommended)
    2 - Flit-core                      - Pure Python (minimal)
    3 - PDM-backend                    - Pure Python
    4 - Whey                           - Pure Python
    5 - Poetry                         - Pure Python
    6 - Setuptools with pyproject.toml - Pure Python
    7 - Setuptools with setup.py       - Pure Python
    8 - Setuptools and pybind11        - Compiled C++
    9 - Scikit-build-core              - Compiled C++ (recommended)
    10 - Meson-python                  - Compiled C++ (also good)
    11 - Maturin                       - Compiled Rust (recommended)
    Choose from [1/2/3/4/5/6/7/8/9/10/11] (1):
  [9/9] Use version control for versioning [y/n] (y):

sp_repo_review: false positive and false negative mypy checks

The MY101 and MY102 checks have been passing and failing (respectively) on the develop branch of PyBaMM, but it should be the other way around.

Running sp_repo_review: https://learn.scientific-python.org/development/guides/repo-review/?repo=pybamm-team%2FPyBaMM&branch=develop

Actual mypy config: https://github.com/pybamm-team/PyBaMM/blob/94aa498176d0b6bb1186aa63bebd9c85f7b74bff/pyproject.toml#L272-L282

I noticed that running the develop version of sp_repo_review and repo_review does not give these false results. Please feel free to close this if this has been fixed but has not been released. Thanks!

PyTest config updates

See #45. Let's move the tiny pytest section in Style over to the PyTest page while we are at it (developer pages).

Packaging tutorial: Directory structure unclear

The guide puts __init__.py and refraction.py directly into src/:

```bash
touch src/__init__.py
```
Place `refraction.py`, our code from the previous section, next to it, at
`src/refraction.py`.

whereas the directory structure listing has src/example/...

.
├── pyproject.toml
├── src
│   └── example
│       ├── __init__.py
│       └── refraction.py

Grouped Dependabot updates

Update to grouped Dependabot updates (which are great, actually), and add a check for groups. This would have helped with upload/download archive.

conda packages

repo-review and sp-repo-review are now conda packages on conda-forge.
There is a bit of a snag with testing repo-review when sp-repo-review is provided in the tests environment. It is not critical, but it would be great if you wouldn't mind taking a look at the PR: conda-forge/repo-review-feedstock#1

This is not really an issue: the packages are already released, and the tests are just an extra double-check that I prefer to define in the recipes.

Also, you are welcome to become a maintainer of that recipe; let me know.

Question - importing test utilities?

Do you have any recommendations for handling the cases where I want some code importable only in tests? One example in open source is https://github.com/pydantic/pydantic/blob/main/tests/test_datetime.py#L20

I think this would require including a tests/__init__.py file? I noticed that you recommended against that here:

In general, do not place a `__init__.py` file in your tests; there's not often a
reason to make the test directory importable, and it can confuse package
discovery algorithms.
so I am interested in your thoughts.
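For what it's worth, one common pattern that avoids a tests/__init__.py is to put shared helpers in tests/conftest.py, which pytest imports without the directory being a package; the fixture below is purely illustrative (the names are made up):

```python
# tests/conftest.py -- pytest picks this up automatically, no package needed
from datetime import datetime, timedelta, timezone

import pytest


@pytest.fixture
def make_aware_datetime():
    """Shared helper: build a timezone-aware datetime for use in any test module."""

    def _make(*args, offset_hours=0):
        return datetime(*args, tzinfo=timezone(timedelta(hours=offset_hours)))

    return _make
```

A test then simply takes `make_aware_datetime` as an argument; no import of the tests directory is required.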

repo-review: does not skip all GH checks for GitLab repos

The following output is observed for a GitLab-hosted repository; GH200 is not skipped, even though the other GitHub Actions checks are.

GitHub Actions:
├── GH100 Has GitHub Actions config ❌
│   All projects should have GitHub Actions config for this series of checks.  If there are no .yml files in .github/workflows, the
│   remaining checks will be skipped.
├── GH101 Has nice names [skipped]
├── GH102 Auto-cancel on repeated PRs [skipped]
├── GH103 At least one workflow with manual dispatch trigger [skipped]
├── GH104 Use unique names for upload-artifact [skipped]
├── GH200 Maintained by Dependabot ❌
│   All projects should have a .github/dependabot.yml file to support at least GitHub Actions regular updates. Something like this:
│
│
│    version: 2
│    updates:
│    # Maintain dependencies for GitHub Actions
│      - package-ecosystem: "github-actions"
│        directory: "/"
│        schedule:
│          interval: "weekly"
│
├── GH210 Maintains the GitHub action versions with Dependabot [skipped]
├── GH211 Do not pin core actions as major versions [skipped]
└── GH212 Require GHA update grouping [skipped]
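Based on the check structure quoted in the Ruff issue above, one possible fix is sketched below; the `GitHub` base class and `dependabot` fixture names are assumptions about the sp-repo-review internals, not confirmed:

```python
from typing import Any


class GH200(GitHub):  # base class name assumed, mirroring the Ruff checks above
    "Maintained by Dependabot"
    # Declaring GH100 as a prerequisite should make this check report
    # [skipped] when no .github/workflows config exists, like GH101-GH104 do.
    requires = {"GH100"}

    @staticmethod
    def check(dependabot: dict[str, Any] | None) -> bool:
        """All projects should have a .github/dependabot.yml file."""
        # `dependabot` stands in for whatever fixture the real check uses.
        return dependabot is not None
```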

Move to copier?

I've been looking at copier, and it looks like a much better maintained cookiecutter-like project, with some really nice features, like a much, much better CLI interface (help text! Types!). What would people think about moving to copier? (I looked at dual-supporting both for a while, but the templates are different enough that it probably wouldn't be reasonable.)

Along with that, what about a rename? Since we just moved it, it might be okay to do, and pipx run copier new gh:scientific-python/cookie seems a little odd for a project that isn't using cookiecutter; would something else be better? template, new, package, pyproject, etc? The repo itself has three things in it now (the guide, the template, and sp-repo-review), but the only one you type on the command line is for the template generation. The old name would still work via redirect (though you'd already have to type copier new instead of cookiecutter, so I think it's not too bad even if it didn't work).

Black not configured in pre-commit to find src files?

I'm playing around with pre-commit and black together, and looked through scikit-hep/cookie for inspiration (with the now-recommended hatch backend), but when pre-commit runs, black doesn't check any files. I've gone through black's docs and looked at other projects a bit and can't seem to see how black figures out the files / directories to search. Is it possible that the generated pyproject.toml in this instance doesn't pass black enough information? Otherwise, is there something else I've missed?

Use pyproject-flake8?

One way to pull all the configuration into a pyproject file is to use the pyproject-flake8 package, like pypa/wheel does. Something to consider; I'd take arguments for or against.

Typo in Packaging tutorial

Thanks for this nice collection of wisdom!

In the Packaging tutorial here, the __init__.py and refraction.py files are placed in the wrong directory, under src instead of src/example. The tree output of the file structure is correct, however.

built wheels install CMakeLists.txt and License in env's lib folder

This is related to the scikit-build option that I chose when asked by cookiecutter. I think the problem is that MANIFEST.in explicitly includes CMakeLists.txt and LICENSE for the sdist, but this really doesn't apply to bdist_wheel.

  1. Binary wheels don't need to include the CMakeLists.txt that was used to build them, so I think that should be excluded by default.
  2. The LICENSE file should be installed relative to the package, not into the environment's lib folder. In fact, the license is already included in the package's corresponding *-info install path.

I was able to solve this by removing
https://github.com/scikit-hep/cookie/blob/eed6d2600ffa9d1517a09abf240b4a4b7dd38e6e/%7B%7Bcookiecutter.project_name%7D%7D/setup-skbuild.py#L16

But there's likely a more elegant solution (probably using exclude_package_data; a rough sketch follows). I'm not sure if this issue was introduced by trying to satisfy installation for different implementations of Python (like PyPy).
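For reference, a minimal sketch of the exclude_package_data route mentioned above (the package name is a placeholder, and whether this interacts cleanly with scikit-build's CMake install step is untested here):

```python
# setup.py (sketch)
from skbuild import setup  # scikit-build's drop-in replacement for setuptools.setup

setup(
    packages=["mypackage"],      # placeholder package name
    package_dir={"": "src"},
    # Keep CMakeLists.txt and LICENSE in the sdist (via MANIFEST.in) while
    # excluding them from the files installed out of the built wheel.
    exclude_package_data={"": ["CMakeLists.txt", "LICENSE"]},
)
```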

Review the PyGrep hooks suggestion

The PyGrep hooks (PC170) may now be partially replaced by ruff: the PGH rule set appears to cover many of the checks suggested in PC170, in particular the "python"-related ones.

pre-commit hooks: black-jupyter and isort interfering with each other

Not sure if I hit a forbidden phase-space, but it seems black-jupyter and isort interfere with each other.

In the current config, black-jupyter wants

from . import _print_unused_id_ranges, _print_used_id_ranges, _scan_groups_and_users

but isort wants

from . import (_print_unused_id_ranges, _print_used_id_ranges,
               _scan_groups_and_users)

which results in

black-jupyter............................................................Failed
- hook id: black-jupyter
- files were modified by this hook

...

isort....................................................................Failed
- hook id: isort
- files were modified by this hook

ad infinitum.

This is documented in https://black.readthedocs.io/en/stable/guides/using_black_with_other_tools.html#isort

The simplest solution is to add

[tool.isort]
profile = "black"

to the pyproject.toml.

Overview: Merging in content from NSLS-II cookiecutter and guide

This issue is an overview of the effort to merge the cookiecutter and guide maintained by Brookhaven National Lab NSLS-II into this repo and move the result upstream into the scientific-python GitHub organization.

This work is being started at the SciPy Dev Summit 2023.

Gitter badge link is incorrect

Current link is

https://gitter.im/{{cookiecutter.url}}/community?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge

which leads to entries like

https://gitter.im/https://github.com/<account or org>/<project name>/community?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge

Instead, it should probably be {{cookiecutter.url}}{{ cookiecutter.github_username }}/{{ cookiecutter.project_slug }}:

https://gitter.im/{{ cookiecutter.github_username }}/{{ cookiecutter.project_slug }}//community?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge

Does anyone know if cookiecutter.github_username also works for organizations? If yes, I can make a PR.

Don't flag ruff select rules when ALL is enabled

Ruff has the option of setting select = ["ALL"], which selects all the rule sets not in "preview". In particular, this includes

  • flake8-bugbear (RF101)
  • isort (RF102)
  • pyupgrade (RF103).

However, repo-review flags these as failures when "ALL" is set, even though the rules are in fact enabled. A rough sketch of an ALL-aware check follows.
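A hedged sketch of how such a check could account for ALL, following the match-based style of the RF002 check quoted earlier (the helper name is illustrative, not part of sp-repo-review, and the newer lint.select spelling is ignored for brevity):

```python
from typing import Any


def rule_selected(pyproject: dict[str, Any], rule: str) -> bool:
    """True if `rule` (e.g. "B" for flake8-bugbear) is enabled directly or via ALL."""
    match pyproject:
        case {"tool": {"ruff": {"select": list(selected)}}}:
            return "ALL" in selected or rule in selected
        case _:
            return False
```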
