Comments (12)
I'm not sure you can use matchers.assert_ together with matchers.verify_, or at least I'm uncertain what the expected behavior is. I have always used one or the other, because when an assert fails it stops the rest of the test execution, which is the opposite of soft asserts.
https://github.com/Element-34/py.saunter/blob/master/saunter/matchers.py#L67
Checking the source, it seems that any call to matchers.assert_ executes a real assert. If that fails, you get a stack trace because of the pytest_runtest_call() hook in your conftest.py file. Mine looks something like this:
def pytest_runtest_call(item, __multicall__):
    try:
        __multicall__.execute()
    except Exception as e:
        if hasattr(item.parent.obj, 'driver') or hasattr(item.parent.obj, 'selenium'):
            item.parent.obj.take_named_screenshot('exception')
        raise
However, matchers.verify_ doesn't let the exception propagate. When the underlying assert fails, it appends the user-supplied message (or a generated one) to self.verificationErrors, which is then evaluated in pytest_runtest_makereport() in the same conftest.py file:
def pytest_runtest_makereport(__multicall__, item, call):
    if call.when == "call":
        try:
            assert([] == item.parent.obj.verificationErrors)
        except AssertionError:
            call.excinfo = py.code.ExceptionInfo()
        [...]
Since matchers.verify_true() is just a try/except wrapper around matchers.assert_true(), the "exception" that ends up in the list of verification errors is the message you are seeing.
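To make that relationship concrete, here is a minimal sketch of the pattern, assuming names that mirror py.saunter's matchers module (this is an illustrative reconstruction, not the library source):

```python
def assert_true(case, expr, msg=None):
    # Hard assert: a failure raises AssertionError and stops the test.
    if msg is None:
        msg = "%r is not true" % (expr,)
    assert expr, msg

def verify_true(case, expr, msg=None):
    # Soft assert: catch the AssertionError and record its message on
    # the test case instead of letting it propagate.
    try:
        assert_true(case, expr, msg)
    except AssertionError as e:
        case.verification_errors.append(str(e))
```

A failed verify_true therefore leaves only str(e) behind, which is exactly the message the conftest hook later reports.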
In your example, the assert comes first, so it always runs. I know it's a trivial example, but is there any reason the assert_true couldn't be another verify_true?
from py.saunter.
So, the calls to assert_* are supposed to stop the test and the calls to verify_* are supposed to NOT stop the test. Mixing them within a single test is highly desirable IMO.
The issue above is that the verify_* is stopping the test when it should NOT do that. Secondly, the msg param I specified is not getting output.
The problem is with my second verify_true, not the assert_true at the end.
Just trying to help out and investigate.
> So, the calls to assert_* are supposed to stop the test and the calls to verify_* are supposed to NOT stop the test. Mixing them within a single test is highly desirable IMO.
I had never thought of it that way. For me it was a hard rule: calls to assert_* are used when a test should stop on a failure; calls to verify_* collect messages into a list when an exception is thrown. Mixing them may be desirable, but I never saw it as a feature. In fact, as I've said, I take the opposite tack, which is to never use both verify_ and assert_ methods in one test.
> The issue above is that the verify_* is stopping the test when it should NOT do that. Secondly, the msg param I specified is not getting output.
The difference between how assert_ and verify_ process failures comes down to how conftest.py and py.test "realize" the failure and display it to you. If your message is "Expected to fail because expr false", then I do see it being output, along with the stack trace that "caused" the failure, though that might not be what you expect.
Look again at the pytest_runtest_makereport() method in conftest.py. Because verify_ appends failures to a list of verification errors, pytest ultimately asserts [] == self.verificationErrors, that is, that the list of verification errors on the current test is empty. Since the only exception that assert can throw reports the contents of an unequal list, it can only show you the messages you put in the list, not the entire stack trace, which is why I mentioned on the Google Group thread using very explicit messages.
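To see why only the messages survive, consider a stripped-down version of that final check, run outside pytest (the list name and message are illustrative):

```python
# The soft-assert machinery only keeps strings; by the time the report
# hook runs, the original failure's traceback is long gone.
verificationErrors = []
verificationErrors.append("Login button was not visible")  # recorded by a verify_ call

try:
    # Conceptually what pytest_runtest_makereport does at the end:
    assert [] == verificationErrors
except AssertionError:
    # All that can be reported is the list contents, so explicit,
    # descriptive messages are essential.
    print("Verification failures:", verificationErrors)
```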
> The problem is with my second verify_true, not the assert_true at the end.
Here is my attempt at understanding the problem as you have described it:
- You expect the output for verify_ methods to be the same as for assert_ methods, where each collected verification error displays the stack trace that led to the underlying assert failure.
- When writing tests which I don't want to halt upon an expectation mismatch, I use verify_ the whole way through.
- You would like to mix and match verify_ and assert_ to allow for some evaluations that DO halt the execution of your tests, and some evaluations that will not halt upon mismatch. That also sounds like an awesome feature.
- You have a preference towards the stack trace style of reporting used in assert_. Admittedly, I like it too, but given how pytest reports and what verify_* does, I wasn't sure it would be possible.
Here is the unexpected behavior we both are observing:
While I'm not sure what to expect when mixing matchers.assert_* and matchers.verify_*, exceptions are silenced once you use matchers.verify_. I have created an example of that here:
https://gist.github.com/dakotasmith/7bbf7eab1df8d0ff5db6#file-verify_eats_exceptions-py
Have you tried just using a bare assert, instead of the matchers.assert_ wrappers?
assert value == expectedvalue
That should still cause py.test to fail the test in the way you and py.saunter expect.
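A hypothetical sketch of that suggestion, mixing soft verifies with a plain assert (verify_true, FakePage, and the errors list are stand-ins for illustration, not py.saunter API):

```python
class FakePage:
    banner_visible = False   # soft failure: recorded, test continues
    total = 42               # hard check below passes, so the test goes on

errors = []

def verify_true(expr, msg):
    # Soft assert: record the message instead of raising.
    if not expr:
        errors.append(msg)

page = FakePage()
verify_true(page.banner_visible, "banner missing")  # recorded, no halt
assert page.total == 42                             # plain assert: a mismatch would halt here
verify_true(page.total > 0, "total not positive")   # still runs
```

If the bare assert fails, pytest stops the test immediately; any messages already recorded by the verifies are still reported at the end.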
The built-in Python assert doesn't have a verify equivalent. (It does accept a message, as in assert expr, msg, but that message isn't wired into the verification-errors reporting.)
Your four bullet points summarizing the problem are not quite right. I do NOT want a stack trace; I want my own msg="" string displayed. What I'm getting (and complaining about here!) is a stack trace.
I understand that the Python assert doesn't have a verify equivalent. I was suggesting that, when you are using matchers.verify_ and want to assert something in a way that throws an exception and halts your test, you use a bare assert instead of matchers.assert_.
As for the stack trace, I think the work here would be the creation of a VerificationError exception, and in conftest.py's pytest_runtest_makereport, ensuring that
call.excinfo = py.code.ExceptionInfo()
runs inside a try/except block that raises VerificationError, nested inside the existing except AssertionError block. VerificationError's message would be built from the list of msgs. But to be fair, I don't know what py.code.ExceptionInfo does when it is nested that way.
I'm pretty sure that as long as the except block containing the line above catches AssertionError, verify will produce the stack trace you are seeing now, which does include the message, just with some extra stuff around it. Change the exception type, and the stack trace will change.
I'll take a stab at something, but I'm not certain I'll solve it.
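One possible shape for that exception, as a sketch under the assumptions above (the class and its message format are hypothetical, not py.saunter code):

```python
class VerificationError(AssertionError):
    """Aggregate soft-assert failures into one exception whose message is
    the joined list of collected msgs, without the traceback noise."""

    def __init__(self, messages):
        self.messages = list(messages)
        super(VerificationError, self).__init__(
            "\n".join("verify failed: %s" % m for m in self.messages))
```

Raising VerificationError(item.parent.obj.verificationErrors) inside the report hook would then let the failure report show only the joined messages; because it subclasses AssertionError, existing except AssertionError blocks still catch it.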
In conftest.py, if you swap out the existing function for this one, does it display the way you want? I think it's better than what's there currently, since apparently py.test does a bit of monkeypatching of assert.
def pytest_runtest_makereport(__multicall__, item, call):
    if call.when == "call":
        try:
            if len(item.parent.obj.verificationErrors) != 0:
                raise AssertionError(item.parent.obj.verificationErrors)
        except AssertionError:
            call.excinfo = py.code.ExceptionInfo()
    report = __multicall__.execute()
    item.outcome = report.outcome
    if call.when == "call":
        if hasattr(item.parent.obj, 'config') and item.parent.obj.config.getboolean('SauceLabs', 'ondemand'):
            s = saunter.saucelabs.SauceLabs(item)
    return report
Actually, here is a better one. It appears to work when:
- verifications passed, assert failed
- verifications failed, assert passed
- verifications failed, assert failed
def pytest_runtest_makereport(__multicall__, item, call):
    if call.when == "call":
        try:
            if len(item.parent.obj.verificationErrors) != 0:
                if call.excinfo:
                    raise AssertionError((call.excinfo.exconly(), item.parent.obj.verificationErrors))
                else:
                    raise AssertionError(item.parent.obj.verificationErrors)
        except AssertionError:
            call.excinfo = py.code.ExceptionInfo()
    report = __multicall__.execute()
    item.outcome = report.outcome
    if call.when == "call":
        if hasattr(item.parent.obj, 'config') and item.parent.obj.config.getboolean('SauceLabs', 'ondemand'):
            s = saunter.saucelabs.SauceLabs(item)
    return report
And if you want to get fancy:
raise AssertionError({"assert": call.excinfo.exconly(), "verifications": item.parent.obj.verificationErrors})
Adam: Your conftest.py code (two comments up) still doesn't work on pysaunter 0.54.
I have worked with mmaypumphrey recently and have also tried to implement the soft asserts in version 0.64 without success.