Comments (12)

dakotasmith commented on September 26, 2024

I'm not sure you can mix matchers.assert_ with matchers.verify_, or at least I'm uncertain what the expected behavior is. I have always used just one or the other, because I found that when an assert failed, it stopped the rest of the test execution, which is the opposite of soft asserts.

https://github.com/Element-34/py.saunter/blob/master/saunter/matchers.py#L67

Checking the source, it seems that any call to matchers.assert_ is going to run a real assert. If that fails, you get a stack trace because of the pytest_runtest_call() method in your conftest.py file. Mine has something like this:

def pytest_runtest_call(item, __multicall__):
    try:
        __multicall__.execute()
    except Exception as e:
        if hasattr(item.parent.obj, 'driver') or hasattr(item.parent.obj, 'selenium'):
            item.parent.obj.take_named_screenshot('exception')
        raise

However, matchers.verify_ doesn't allow the exception to be raised. When an assert fails, it adds the user message or a generated message to self.verificationErrors, which is then evaluated in pytest_runtest_makereport() in the same conftest.py file.

def pytest_runtest_makereport(__multicall__, item, call):
    if call.when == "call":
        try:
            assert([] == item.parent.obj.verificationErrors)
        except AssertionError:
            call.excinfo = py.code.ExceptionInfo()
[...]

Since matchers.verify_true() is just a try/except around matchers.assert_true(), the "exception" that is written into the list of verificationErrors is the message you are seeing.
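
For reference, the pattern being described looks roughly like this (a paraphrase of the linked matchers.py, not the exact source; names follow the conftest.py snippets in this thread):

# Paraphrase of the verify_/assert_ pattern (illustrative, not the exact
# py.saunter source).
class Matchers(object):
    def __init__(self):
        self.verificationErrors = []

    def assert_true(self, expr, msg=None):
        # Hard check: a failure raises AssertionError and halts the test.
        assert expr, msg

    def verify_true(self, expr, msg=None):
        # Soft check: record the failure and keep the test running.
        try:
            self.assert_true(expr, msg)
        except AssertionError as e:
            # Record the user message, or a generated one when none was given.
            self.verificationErrors.append(str(msg) if msg else repr(e))

m = Matchers()
m.verify_true(False, "soft failure, collected")
print(m.verificationErrors)  # ['soft failure, collected']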

In your example, if the assert is first, it always runs. I know it's a trivial example, but is there any reason the assert_true couldn't be an additional verify_true?

mmaypumphrey commented on September 26, 2024

So, the calls to assert_* are supposed to stop the test and the calls to verify_* are supposed to NOT stop the test. Mixing them within a single test is highly desirable IMO.

The issue above is that the verify_* is stopping the test when it should NOT do that. Secondly, the msg param I specified is not getting output.

The problem is with my second verify_true, not the assert_true at the end.

dakotasmith commented on September 26, 2024

Just trying to help out and investigate.

So, the calls to assert_* are supposed to stop the test and the calls to verify_* are supposed to NOT stop the test. Mixing them within a single test is highly desirable IMO.

I had never thought of it that way. For me it was a hard rule: calls to assert_* are to be used when a test should stop if it throws an exception; calls to verify_* collect messages into a list when an exception is thrown. Though mixing them may be desirable, I never saw it as a feature. In fact, as I've said, I take the opposite tack, which is to not use both verify_ and assert_ methods in one test.

The issue above is that the verify_* is stopping the test when it should NOT do that. Secondly, the msg param I specified is not getting output.

The difference between how assert_ and verify_ process failures comes down to how conftest.py and py.test "realize" the failure and display it to you. If your message is "Expected to fail because expr false" then I do see it being output, along with the stack trace that "caused" the failure, but that might not be what you expect.

Reference the pytest_runtest_makereport() method again in conftest.py. Since verify_ appends failures to a list of verification errors, pytest asserts [] == self.verificationErrors, i.e., that an empty list is equal to the list of verification errors on our current test. Since the only exception this assert will throw is about the contents of a list not being equal, it can only show you the messages that you put in the list, not the entire stack trace, which is why I mentioned on the Google Group thread using very explicit messages.
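
Concretely, a single failed verify_ surfaces something like this (the message text here is hypothetical):

# What pytest ends up asserting when one verify_ call has failed:
verificationErrors = ["Expected to fail because expr false"]
assert [] == verificationErrors
# AssertionError: assert [] == ['Expected to fail because expr false']
# Only the collected strings survive; the original failure's stack trace
# is long gone by the time this assert runs.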

The problem is with my second verify_true, not the assert_true at the end.

Here is my attempt at understanding the problem as you have described it:

  • You expect the output for verify_ methods to be the same as for assert_ methods, where each collected verification error displays the stack trace which led to the underlying assert failure.
  • When writing tests which I don't want to halt upon an expectation mismatch, I use verify_ the whole way through.
  • You would like to mix and match verify_ and assert_ to allow for some evaluations that DO halt the execution of your tests, and some evaluations that will not halt upon mismatch. That also sounds like an awesome feature.
  • You have a preference towards the stack trace style of reporting used in assert_. Admittedly, I like it too, but given how pytest is reporting and what verify_* does, I wasn't sure it would be possible.

Here is the unexpected behavior we are both observing:

While I'm not sure what to expect when mixing matchers.a* and matchers.v*, exceptions are silenced once you use matchers.verify_. I have created an example of that here.

https://gist.github.com/dakotasmith/7bbf7eab1df8d0ff5db6#file-verify_eats_exceptions-py

dakotasmith commented on September 26, 2024

Have you tried just using assert, instead of the matchers.assert_ wrappers?

assert value == expectedvalue

That should still cause py.test to fail the test in the way you and py.saunter expect.
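
As a toy illustration of that mix (stand-in checks, not a real py.saunter test):

# Soft checks collect failures; the bare assert statement halts immediately.
verificationErrors = []

def verify_true(expr, msg):
    try:
        assert expr, msg
    except AssertionError:
        verificationErrors.append(msg)

verify_true(False, "soft: header missing")  # collected, execution continues
assert 1 + 1 == 2                           # hard: would halt here if it failed
verify_true(False, "soft: footer missing")  # still reached and collected
print(verificationErrors)  # ['soft: header missing', 'soft: footer missing']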

mmaypumphrey commented on September 26, 2024

The Python assert doesn't have a verify equivalent. It also doesn't support a msg argument from the test dev.

Your four bullet points summarizing the problem are not quite right. I do NOT want a stack trace--I want my own msg="" string displayed. What I'm getting (and complaining about here!) is a stack trace.

dakotasmith commented on September 26, 2024

I understand Python's assert doesn't have a verify equivalent. I was suggesting that, when you are using matchers.verify_ and want to assert something in a way that throws an exception and halts your test, you use a plain assert instead of matchers.assert_.

dakotasmith commented on September 26, 2024

As for the stack trace, I think the work here would be creating a VerificationError exception and, in conftest.py's pytest_runtest_makereport(), ensuring that

call.excinfo = py.code.ExceptionInfo() 

runs inside a try/except block that raises and catches VerificationError, in place of the current except AssertionError block. The message of VerificationError would be set to accept a list of msgs. But to be fair, I don't know what py.code.ExceptionInfo() does when it is nested that way.

I'm pretty sure that as long as the except block containing the above line catches AssertionError, using verify_ will return the stack trace you see, which does include the message, just with a bit of extra stuff around it. Change the exception, and the stack trace will change.
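
Something along these lines is what I have in mind (a sketch only; VerificationError is hypothetical and not part of py.saunter, and the rest of the hook is trimmed):

import py

class VerificationError(Exception):
    def __init__(self, messages):
        # Accept the whole list of collected msgs.
        Exception.__init__(self, "\n".join(messages))

def pytest_runtest_makereport(__multicall__, item, call):
    if call.when == "call":
        errors = getattr(item.parent.obj, 'verificationErrors', [])
        if errors:
            try:
                raise VerificationError(errors)
            except VerificationError:
                # Capture the VerificationError rather than a bare AssertionError.
                call.excinfo = py.code.ExceptionInfo()
    return __multicall__.execute()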

I'll take a stab at something, but I'm not certain I'll solve it.

adamgoucher commented on September 26, 2024

In conftest.py, if you switch out the existing function for this one, does it display the way you want? I think it's better than what is currently done, since apparently py.test does a bit of monkeypatching of assert.

def pytest_runtest_makereport(__multicall__, item, call):
    if call.when == "call":
        try:
            if len(item.parent.obj.verificationErrors) != 0:
                raise AssertionError(item.parent.obj.verificationErrors)
        except AssertionError:
            call.excinfo = py.code.ExceptionInfo()

    report = __multicall__.execute()

    item.outcome = report.outcome

    if call.when == "call":
        if hasattr(item.parent.obj, 'config') and item.parent.obj.config.getboolean('SauceLabs', 'ondemand'):
            s = saunter.saucelabs.SauceLabs(item)

    return report

adamgoucher commented on September 26, 2024

Actually, here is a better one. It appears to work when:

  • verifications passed, assert failed
  • verifications failed, assert passed
  • verifications failed, assert failed

def pytest_runtest_makereport(__multicall__, item, call):
    if call.when == "call":
        try:
            if len(item.parent.obj.verificationErrors) != 0:
                if call.excinfo:
                    raise AssertionError((call.excinfo.exconly(), item.parent.obj.verificationErrors))
                else:
                    raise AssertionError(item.parent.obj.verificationErrors)
        except AssertionError:
            call.excinfo = py.code.ExceptionInfo()

    report = __multicall__.execute()

    item.outcome = report.outcome

    if call.when == "call":
        if hasattr(item.parent.obj, 'config') and item.parent.obj.config.getboolean('SauceLabs', 'ondemand'):
            s = saunter.saucelabs.SauceLabs(item)

    return report

adamgoucher commented on September 26, 2024

And if you want to get fancy:

                    raise AssertionError({"assert": call.excinfo.exconly(), "verifications": item.parent.obj.verificationErrors})
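
Which would make a mixed failure read along these lines (values hypothetical):

# Illustrative combined report for one hard and two soft failures:
raise AssertionError({"assert": "AssertionError: expr was false",
                      "verifications": ["header missing", "footer missing"]})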

mmaypumphrey commented on September 26, 2024

Adam: Your conftest.py code (2 comments up) still doesn't work on pysaunter 0.54.

[deleted user] commented on September 26, 2024

I have worked with mmaypumphrey recently and have also tried to implement the soft asserts in version 0.64 without success.
