pyvista / pytest-pyvista

Plugin to test PyVista plot outputs

Home Page: https://pytest.pyvista.org

License: MIT License

Python 100.00%
pytest image-regression pyvista


pytest-pyvista's Issues

Default cache_dir value could be better?

IMO, this default value is not the best choice:

this_path = pathlib.Path(__file__).parent.absolute()
self.cache_dir = os.path.join(this_path, "image_cache")

I'd like to request some clarification on this choice and propose altering it to use the current working directory.

I would expect pytest . to properly handle the image cache or default to save the cache in a location that is visible/relative to the active project. This current default dumps the images in a non-visible location.

An alternative option is to have cache_dir be easily settable in the conftest.py for a project -- is this currently possible?
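A minimal sketch of the proposed alternative, using pathlib throughout (the variable names below are illustrative and this is not the plugin's actual code):

```python
import pathlib

# Current default: the cache lives next to the installed plugin module,
# hidden away from the project under test.
plugin_cache = pathlib.Path(__file__).parent.absolute() / "image_cache"

# Proposed alternative: resolve against the invocation directory, so that
# `pytest .` keeps the cache visible inside the active project.
cwd_cache = pathlib.Path.cwd() / "image_cache"
print(cwd_cache.name)
```

With the cwd-based default, the cache directory lands next to the project's own files, where it can be committed or inspected.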

Documentation

Let's add some solid documentation for this tool. I think it can be used just about anywhere that uses pyvista, and in many projects that don't but simply want to compare images.

Documentation should use sphinx and follow the pydata-theme. Ping @banesullivan to create a new subdomain for this. Something like "pytest.pyvista.org".

Update development version to 0.2.dev0

๐Ÿž Problem
It looks like the development version, the one in main, is 0.1.dev0 even if there has been a release/0.2. Therefore, a pull-request should be opened to update the version number in main.

Add pyvista tests into CI

It would be useful to run the full pyvista tests on one python/vtk version in this package to avoid unintended breakages downstream.

Not creating images for new tests

The old behavior was that if the image was missing, it would be created. Now you have to use --reset_image_cache to generate figures for new tests.

Can we make it so that if the cached image is missing, it gets generated?
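A sketch of the requested behavior (a hypothetical helper, not the plugin's code): if the cached image is missing, write the freshly rendered image and pass, instead of requiring --reset_image_cache.

```python
from pathlib import Path
import tempfile

def compare_or_create(rendered: bytes, cache_path: Path) -> bool:
    """If no cached image exists, create it and pass; otherwise compare.
    The byte-equality check stands in for the plugin's real image diff."""
    if not cache_path.exists():
        cache_path.parent.mkdir(parents=True, exist_ok=True)
        cache_path.write_bytes(rendered)
        return True  # newly created cache entry, nothing to compare against
    return cache_path.read_bytes() == rendered

cache = Path(tempfile.mkdtemp()) / "image_cache" / "test_sphere.png"
print(compare_or_create(b"fake-png", cache))  # first call creates the file
print(compare_or_create(b"fake-png", cache))  # second call compares
```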

Dump failing images to a directory

I'm dealing with a scenario right now where many tests fail the image regression check on Windows but not on other platforms in CI. It would be nice to have an option to dump the failing images to a specific path so that I can upload them as a CI artifact to inspect. I'm currently doing this manually.
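The manual workaround can be sketched as a small helper (the function and directory names here are hypothetical, not plugin options):

```python
import shutil
import tempfile
from pathlib import Path

def dump_failed_image(image_path: Path, failed_dir: Path) -> Path:
    """Copy a failing comparison image into a directory that CI can
    upload as an artifact for later inspection."""
    failed_dir.mkdir(parents=True, exist_ok=True)
    dest = failed_dir / image_path.name
    shutil.copy2(image_path, dest)
    return dest

# Example usage with temporary paths:
tmp = Path(tempfile.mkdtemp())
src = tmp / "test_windows_render.png"
src.write_bytes(b"fake-png")
print(dump_failed_image(src, tmp / "failed_images").name)
```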

Teardown verify_image_cache after each test

See #49 for some effects where it seems that verify_image_cache is leaking outside of test scope.

We should tear it down after each test, e.g. by removing the object from the theme, to reduce cross-talk.

Consider removing appveyor.yml file

๐Ÿž Problem
the appveyor.yml file got generated as a consequence of the cookiecutter template. This file is used to configure the CI/CD in the Appveyor platform. However, it looks like this repository takes advantage of GitHub actions for CI/CD purposes. Therefore, it looks like this file is no longer needed.

Test names case (in)sensitivity

In Pytest you can do:

import pytest
import pyvista as pv

@pytest.mark.parametrize(
    "title",
    [
        "Mechanical",
        "mechanical",
        "meCHANICAL",
        "ux",
        "UX",
        "CSGZ",
    ],
)
def test_bc_plot_bc_labels(title):

    # Create a simple mesh (a sphere)
    mesh = pv.Sphere()

    # Create a plotter object with a custom title
    plotter = pv.Plotter(title=title)

    # Add the mesh to the plotter
    plotter.add_mesh(mesh)

    # Set up the plotter window and display the plot
    plotter.show()

On a case-insensitive filesystem, the plugin cannot generate distinct cached images for "mechanical" and "meCHANICAL", nor for "ux" and "UX", because the image filenames derived from the test names differ only by case.
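The collision can be demonstrated by lowercasing the image names the parametrized tests would produce (the filename pattern below is illustrative):

```python
titles = ["Mechanical", "mechanical", "meCHANICAL", "ux", "UX", "CSGZ"]
image_names = [f"test_bc_plot_bc_labels[{t}].png" for t in titles]

# On a case-insensitive filesystem (Windows, default macOS), names that
# differ only by case map to the same file on disk:
distinct = {name.lower() for name in image_names}
print(len(image_names) - len(distinct))  # number of colliding names → 3
```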

Reference:

Add option to check if a cached image isn't compared to an image during all tests

It would be very useful to have an option to check that all cached images were compared during a test. This is a complement to checking that all comparisons have an image in the cache.

This is tricky if tests are skipped for any reason. So, it would only be useful if there is a canonical configuration for which no test is skipped. I'm not sure if this is the case today for pyvista; it is quite possible that at least one plotting test is skipped on every vtk version. But this could still be useful for downstream libraries that support a narrower range of vtk versions.
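The proposed check reduces to a set difference over image names, collected however the plugin tracks them (the helper name is hypothetical):

```python
def uncompared_cached_images(cached: set[str], compared: set[str]) -> set[str]:
    """Return cached images that no test compared against during the run.
    This complements the existing check that every comparison has a cache
    entry, which is the same set difference in the other direction."""
    return cached - compared

cached = {"test_sphere.png", "test_cube.png", "test_stale.png"}
compared = {"test_sphere.png", "test_cube.png"}
print(sorted(uncompared_cached_images(cached, compared)))
```

A run with skipped tests would report their cache entries as false positives, which is why a no-skip canonical configuration matters.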

pyvista>=0.37 constraint can conflict with development installs

I've seen a few times recently where installing pytest-pyvista after having a dev install of pyvista causes a reinstall of pyvista from PyPI. I believe this is due to the pyvista>=0.37 constraint on the package and the fact that pyvista is currently at 0.38.dev0 on main which pip doesn't handle gracefully. I suspect this issue will resolve on the next MINOR version release of pyvista, but until then....

Is pyvista>=0.37 really necessary for this extension? I'm wondering if we could relax that constraint or use an earlier compatible version?

We really do not need a release branch for this package

Having a dedicated release branch is massive overkill for this package and adds overhead to cutting releases. We should switch the versioning mechanism to use setuptools_scm and tag directly on the main branch.

@pyvista/developers, thoughts?

Release 0.1.9

Hi @pyvista/maintainers, pinging @germa89 for visibility.

I was wondering if I could make a release for the plugin, since we need to use the new flag introduced in #82

If that's okay, I will proceed.

Add test-level option to choose image name for comparison

This would enable reducing duplicate images in tests. For example, tests in pyvista/pyvista#5428 generate multiple images that are duplicates of one another. It could be something like:

def test_identical_images(verify_image_cache):
    verify_image_cache.file_compare = "identical_images"
    # generate image with first plot and compare to "identical_images.png"
    verify_image_cache.file_compare = "identical_images"
    # generate image with second plot and compare to same "identical_images.png"

Reset cache only if error

I think this is a very specific need, but it might be useful to somebody else.

I would like to regenerate the cache only for the images that fail.

Conflicting flags do not raise an error or warning.

The flags --fail_extra_image_cache and --add_missing_images are incompatible with each other, which is not documented anywhere. We should:

  • Update the documentation to cover this.
  • Decide which flag takes priority in case of conflict.
  • Raise a warning when both flags are active, telling the user that they are incompatible and which one is prioritized.
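The proposed warning could look like the sketch below; which flag wins is an assumption made for illustration, since the issue leaves that decision open:

```python
import warnings

def resolve_image_flags(fail_extra_image_cache: bool,
                        add_missing_images: bool) -> str:
    """Warn when the two incompatible flags are both active and report
    which behavior applies. Prioritizing add_missing_images here is an
    assumption for this sketch, not a decision made in the issue."""
    if fail_extra_image_cache and add_missing_images:
        warnings.warn(
            "--fail_extra_image_cache and --add_missing_images are "
            "incompatible; --add_missing_images takes priority."
        )
        return "add_missing_images"
    if add_missing_images:
        return "add_missing_images"
    if fail_extra_image_cache:
        return "fail_extra_image_cache"
    return "default"

print(resolve_image_flags(True, True))
```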
