pytest and TestNG automation frameworks

One: pytest

1. Install pytest: pip install pytest
2. Workflow: write test cases → collect test cases → execute test cases → generate reports
3. How pytest automatically discovers test cases

The discovery rules are as follows:
1. Search root: by default, test cases are collected from the current directory, i.e. the search starts from whatever directory the pytest command is run in;
2. Search rules:
1) File search: files matching the naming rules test_*.py or *_test.py
2) Rules for identifying test cases inside files that satisfy 1):
2.1) functions whose names start with test_;
2.2) inside test classes whose names start with Test (and that have no __init__ method), methods whose names start with test_
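For illustration, a minimal sketch of a file pytest would collect under these rules (all file, class, and function names here are hypothetical):

# test_login.py -- matches the test_*.py file rule, so pytest scans it
def test_standalone():            # collected: function name starts with test_
    assert 1 + 1 == 2

class TestLogin:                  # collected: class starts with Test and has no __init__
    def test_valid_login(self):   # collected: method name starts with test_
        assert "admin" == "admin"

class Helper:                     # NOT collected: class name does not start with Test
    def test_ignored(self):
        pass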
4. Run test cases by marker name
In pytest, test cases are marked first, and at run time they are filtered by marker name.
1) Register the marker name
Markers are registered through the pytest.ini configuration file. In pytest.ini:
[pytest] # fixed section name
markers = # fixed option name
    marker name 1: description of the marker
    marker name 2
    marker name N
2) Mark the test case/test class (only registered marker names can be used)
Add above the test case: @pytest.mark.<registered marker name>
3) Call the pytest.main() function and pass the run-time arguments as a list
The argument for filtering cases by marker name is: -m <marker name>
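A minimal sketch of the whole flow, assuming a marker named smoke and a file test_demo.py (both names are hypothetical):

# pytest.ini -- register the marker first
# [pytest]
# markers =
#     smoke: core smoke-test cases

# test_demo.py
import pytest

@pytest.mark.smoke
def test_core_flow():
    assert True

if __name__ == "__main__":
    pytest.main(["-m", "smoke"])   # run only the cases marked @pytest.mark.smoke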
5. fixture usage
When writing test cases, we need environment preparation before a case executes and environment cleanup after it finishes:
Environment-preparation code before the case runs (setup code)
Environment-cleanup code after the case runs (teardown code)
In automated testing frameworks these are usually called fixtures.
Fixtures are defined through the @pytest.fixture decorator: a function decorated with @pytest.fixture is a fixture.
Using fixtures involves two parts: fixture definition and fixture invocation. In addition, there are a sharing mechanism (conftest.py) and a nested-calling mechanism.
1) Define a fixture
The setup code and the teardown code are written in one function and separated by the yield keyword: code before yield is setup, code after yield is teardown. In practice a fixture may contain only setup code, or only teardown code.
A fixture has 4 scopes: test session (session), test module (module), test class (class), test case (function).
The scope is set via @pytest.fixture(scope=...); by default, scope="function".
session: the entire pytest run, from start to finish, is one session.
Setting scope to session is often combined with autouse=True, which means the fixture is applied automatically without being requested; function-, class-, and module-level fixtures generally do not turn on autouse=True.
@pytest.fixture(scope="session", autouse=True)
def init3():
    yield
    print("code executed after the test cases finish")   # only teardown code, no setup code
2) Fixture return value: yield <value>. Variables created in the setup code usually need to be passed to the test case, so the fixture must first return them via yield, as in the sketch below.
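A sketch of a fixture whose setup code creates a value the test needs; the name db_conn is hypothetical:

import pytest

@pytest.fixture
def db_conn():
    conn = {"connected": True}   # setup: prepare the resource
    yield conn                   # yield passes the value to the test case
    conn["connected"] = False    # teardown: release the resource

def test_query(db_conn):         # receives the yielded value
    assert db_conn["connected"]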
3) Invoke a fixture
There are 2 invocation methods, shown in the sketch after this list:

  • Add @pytest.mark.usefixtures("fixture function name") above the test case/test class
  • Use the fixture function name as a parameter of the test case function (when using this method, the first method is not required)
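A sketch of both invocation methods (the fixture and test names are hypothetical); note that only parameter injection can read the fixture's return value:

import pytest

@pytest.fixture
def prepare_env():
    print("setup")
    yield "ready"                             # return value, only visible via method 2
    print("teardown")

@pytest.mark.usefixtures("prepare_env")       # method 1: runs the fixture, value not accessible
def test_with_usefixtures():
    assert True

def test_with_param(prepare_env):             # method 2: also receives the yielded value
    assert prepare_env == "ready"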
6. conftest.py sharing mechanism
In large business scenarios, the same preparation and cleanup work is shared by many test cases.
pytest provides a fixture-sharing mechanism that lets different test modules use the same fixture: the conftest.py file. A test file does not need to import conftest.py; when it requests a fixture by its function name, the fixture is looked up in conftest.py automatically.
1) conftest.py hierarchical scope
All test cases in the directory where a conftest.py sits (including subdirectories) can use its fixtures.
A conftest.py in the root directory is visible to the whole project; one inside a Python package is visible within that package.
So what happens if fixtures with the same name exist at several levels?
This comes down to the order in which fixtures are resolved when a test runs: the nearest definition wins.
Fixture in the test file > fixture in the current directory's conftest.py > fixture in the parent directory's conftest.py > fixture in the root directory's conftest.py
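A minimal sketch of the sharing mechanism, assuming this layout (the paths, fixture name, and URL are all hypothetical):

# conftest.py -- placed at the project root, never imported explicitly
import pytest

@pytest.fixture(scope="session")
def base_url():
    yield "http://example.com/api"   # assumed URL, for illustration only

# tests/test_ping.py -- simply requests the fixture by name
def test_ping(base_url):
    assert base_url.startswith("http")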
7. fixture nesting
Nesting means that one fixture can be used as a parameter of another fixture; in the code below, func_fix is an input parameter of func_fix3.
If a test case requests func_fix3, the execution order is as follows
[a narrower scope may depend on an equal or broader scope, but not the other way around]:
setup of func_fix
setup of func_fix3
---- test case ----
teardown of func_fix3
teardown of func_fix
@pytest.fixture(scope="class")
def func_fix():
    print("---- Register account Start 1 -----")
    yield True
    print("---- Register account End 4 -----")

@pytest.fixture
def func_fix3(func_fix):
    print("**** Login account 2 ****")
    print(func_fix)
    yield func_fix
    print("**** Login completed 3 ****")

8. Data-driven testing
Mark the test method with @pytest.mark.parametrize("case", cases), as sketched below.
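A sketch mirroring that form, with assumed sample data; each element of cases produces one independent run of the test:

import pytest

cases = [
    {"a": 1, "b": 2, "expected": 3},
    {"a": 2, "b": 3, "expected": 5},
]   # assumed sample data

@pytest.mark.parametrize("case", cases)
def test_add(case):
    assert case["a"] + case["b"] == case["expected"]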
9. Concurrent execution
Generally done with the third-party plug-in pytest-xdist (e.g. pytest -n 4). The precondition is that there are no dependencies between test cases, because the execution order is not guaranteed; that makes it unsuitable for interface-automation suites whose cases depend on each other.
10. Retry on failure
Retries help improve the stability of automated cases. Use pytest's pytest-rerunfailures plug-in to rerun failed cases. There are two forms: globally, via the command line, pytest --reruns 2 --reruns-delay 5; or locally, by marking the test method with @pytest.mark.flaky(reruns=2, reruns_delay=1).

11. pytest skips test cases

1. Unconditional skip: @pytest.mark.skip()
Add the decorator @pytest.mark.skip(reason=None) above a test method or class to indicate that the case will not be executed. The reason parameter states the reason for skipping and may be omitted.
2. Conditional skip: @pytest.mark.skipif()
@pytest.mark.skipif(condition, reason="xxx")
Description: condition is the skip condition. If it evaluates to True the test is skipped; if False the test runs normally. Both forms are sketched below.
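A sketch of both skip forms (test names are hypothetical):

import sys
import pytest

@pytest.mark.skip(reason="feature not implemented yet")   # always skipped
def test_unconditional_skip():
    assert False   # never reached

@pytest.mark.skipif(sys.platform == "win32", reason="POSIX-only case")
def test_conditional_skip():   # skipped only when the condition is True
    assert True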
12. Commonly used pytest plug-ins
pytest-xdist: distributed execution; the precondition is that the cases are independent, have no dependencies, can run standalone, and have no ordering requirement.
pytest-rerunfailures: reruns failed test cases.
pytest-ordering: controls the execution order of test cases.
pytest-dependency: declares dependencies between test cases so that they execute in the correct order.

13. pytest parameterization approaches
The @pytest.mark.parametrize() decorator parameterizes a test directly.
pytest.fixture() can also parameterize: a fixture-decorated function is passed into other functions as a parameter.
Parameterized fixtures stored in a conftest.py file can be applied to all test cases that can see that file. A sketch of the fixture-based approach follows.
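A sketch of fixture-based parameterization via the params argument (the browser names are assumptions); every test that requests the fixture runs once per param:

import pytest

@pytest.fixture(params=["chrome", "firefox"])
def browser(request):
    yield request.param          # request.param holds the current parameter

def test_open_page(browser):     # runs twice: once per param
    assert browser in ("chrome", "firefox")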

14. pytest's decorators
@pytest.fixture(scope="session/module/class/function", autouse=True): combined with the shared conftest.py file, implements setup and teardown for test cases.
@pytest.mark.parametrize("parameter name", list): parameterizes test cases.
@pytest.mark.usefixtures(): applies an already-defined fixture to a test function or class.
@pytest.mark.skip: skips cases that should not run; parameters can supply a reason or, with skipif, a condition for skipping.
@pytest.mark.timeout: sets a timeout for a test case (pytest-timeout plug-in).
@pytest.mark.run(order=1): controls the execution order of cases; the smaller the number, the earlier it runs (pytest-ordering plug-in).
@pytest.mark.dependency: declares dependencies between test cases so that they execute in the correct order (pytest-dependency plug-in).

15. pytest advantages
Assertions use the plain assert statement
Failed cases report very detailed error information
Test modules and cases are discovered and collected automatically
Flexible fixture management
Markers for tagging cases
Rich plug-in ecosystem
Supports retry on failure
Compatible with unittest
16. Differences between unittest and pytest
1. Case design rules
unittest:
(1) Test classes must inherit unittest.TestCase
(2) Test methods must start with "test"
pytest:
(1) Test file names must start with "test_" or end with "_test"
(2) Test class names must start with "Test"
(3) Test function names must start with "test"
2. Assertions
unittest:
assertXXX methods (assertEqual, assertTrue, ...)
pytest:
plain assert statements
3. Setup and teardown
unittest:
(1) setUp runs before each test case and tearDown runs after each test case.
(2) setUpClass runs once before all cases in the class and tearDownClass runs once after all of them.
Each test file must define its setup/teardown separately.
pytest:
Setup and teardown are customized through fixtures: code before yield is setup, code after yield is teardown. The format is @pytest.fixture(scope="XXX"); scope has four levels: function (the default), class, module, session.
With a conftest.py file, all test files in the current directory can use the fixture automatically.
4. Parameterization
unittest: needs a third-party library, e.g. ddt
pytest: built in, via the decorator @pytest.mark.parametrize
5. Report generation
unittest: generated with HTMLTestRunner
pytest: can integrate the allure plug-in
6. Rerun on failure
unittest: not supported
pytest: pytest --reruns 2 (rerun failed cases 2 times, via the pytest-rerunfailures plug-in)
7. Selecting cases to execute
unittest: not supported
pytest:
(1) Create a pytest.ini file:
[pytest]
markers = demo: description of the marker (the description after the colon is optional)
(2) Add @pytest.mark.<marker name> above the test case/test class to be executed
(3) Pass ["-m", "<marker name>"] to pytest.main() in main.py to filter the cases by marker name and execute them
8. Installation requirements
pytest is a third-party unit-testing library and must be installed separately; unittest is part of the standard library and needs no extra installation.

Two: TestNG

Overview: TestNG is a Java testing framework that can be used for both unit testing and integration testing. It uses annotations to enhance tests, and testers generally use it to drive automated test cases.
The testng.xml file:

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE suite SYSTEM "http://testng.org/testng-1.0.dtd">
<!-- Suite -->
<!-- How TestNG executes test cases: it first finds the methods annotated with @Test in each test class file, adds those test methods to a test suite (Test Suite), and then executes the suite.
That is the macro view of test execution. From the configuration point of view, running TestNG means executing a testng.xml file: a default testng.xml is used when running directly, and we can also drive the run with a testng.xml we write ourselves.
Through the testng.xml file, TestNG can:
1. Build test suites from packages, classes, and methods from different sources
2. Include options such as rerunning failed test cases
3. Support regular expressions
4. Run and pass external parameters into test methods
5. Configure a multi-threaded execution environment
-->
<!-- A suite manages multiple tests, and a test manages multiple classes; the smallest unit is a class file. -->
<suite name="Default Suite">
    <parameter name="browser" value="firefox"/>
    <parameter name="server" value="172.18.40.37"/>
    <!-- listeners are registered at the suite level -->
    <listeners>
        <listener class-name="com.lemon.listener.AllureReportListener"/>
        <listener class-name="com.gfapp.common.TestngListener"/>
    </listeners>
    <test name="Java_Learn">
        <!-- Group testing: similar to tagging. Each test method is assigned a group; at run time the run can be filtered by group, e.g. running only the cases of one group -->
        <!--<groups>
            <run>
                <include name="API Test"/>
                <include name="Function Test"/>-->
        <!-- the exclude tag runs the methods outside the named groups -->
        <!--<exclude name="API Test"/>
                <exclude name="Function Test"/>
            </run>-->
        <!-- Group dependency -->
        <!--<dependencies>
                <group name="app" depends-on="tomcat"/>
            </dependencies>
        </groups>-->
        <classes>
            <!--<class name="com.gfapp.test.TestGroupsDemo"/>
            <class name="com.gfapp.test.TestDependenceDemo"/>-->
            <class name="com.gfapp.test.Testparam"/>
        </classes>
    </test>
</suite>

(1) A suite consists of one or more tests
(2) A test consists of one or more classes
(3) A class consists of one or more methods
<suite> is the root tag of testng.xml.
name: the name of the suite; a mandatory attribute.
parallel: whether TestNG runs this suite in multiple threads; the default is none, and the other levels are methods, tests, classes, instances.
thread-count: the number of threads to use if parallel mode is enabled (ignored otherwise).

<!-- Run with 2 threads: thread-count="2" -->
<!--
    tests level: cases under different <test> tags may execute in different threads; cases under the same <test> tag execute in the same thread.
    thread-count: the maximum number of concurrent threads
-->

1. Commonly used annotations:
@BeforeMethod: the annotated method runs before each test method
@AfterMethod: the annotated method runs after each test method
@BeforeClass: runs once, before any test method of the test class executes
@AfterClass: runs once, after all test methods of the test class have executed
@BeforeTest: runs before any test method belonging to a <test> starts executing
@AfterTest: runs after all test methods belonging to a <test> have finished executing
@BeforeSuite: runs before all test methods of the suite execute
@AfterSuite: runs after all test methods of the suite have executed
@DataProvider: marks a data-provider method. A data provider can only return data of type Object[][] or Iterator<Object[]>. An @Test method that wants to receive data from a DataProvider must set its dataProvider attribute to that provider's name.
@Parameters: describes how to pass parameters to an @Test method
@Test: marks a test method
Note: apart from methods annotated with @Parameters (and @Test methods fed by a DataProvider), which may take formal parameters, the other annotated methods take none.
Commonly used attributes of @Test:
@Test(enabled = false): the case will not run
@Test(timeOut = 5000): timeout test
@Test(groups = "group2"): group test
@Test(expectedExceptions = ArithmeticException.class): expected-exception test
@Test(dependsOnMethods = {"TestNgLearn1"}): dependency test (hard dependency)
Dependencies are divided into hard dependencies and soft dependencies:
Hard dependency: the default mode. All the methods or groups it depends on must pass; otherwise the dependent class or method is skipped and marked as skip in the report.
Soft dependency: whether the depended-on methods or groups pass does not affect the execution of the dependent class or method. Note that a soft dependency should only be used when there is no success/failure causality between the two methods, otherwise the case will fail. It is enabled by adding alwaysRun=true to the annotation, e.g. @Test(dependsOnMethods = {"TestNgLearn1"}, alwaysRun = true).
2. Common TestNG listeners:
IAnnotationTransformer
IAnnotationTransformer2
IHookable
IInvokedMethodListener
IMethodInterceptor
IReporter
ISuiteListener
ITestListener
3. How are TestNG's listeners used? What does TestNG's data driver return?
In automation, TestNG listeners are mainly used for failure screenshots and failure retries.
Failure retry requires implementing two interfaces: IRetryAnalyzer and IAnnotationTransformer.
Failure screenshots require implementing ITestListener.
After implementing them, the listener tags must be added to the testng.xml configuration file.
TestNG's data-driven methods return a two-dimensional array, so when doing data-driven testing, no matter whether our data lives in Excel, XML, a database, or some other storage medium, it ultimately has to be converted into a two-dimensional array.
4. Retry on failure
Divided into local and global.
Local: a. Create a class FaileRetry that implements the IRetryAnalyzer interface and overrides the retry method.
b. In the test class, add the @Test(retryAnalyzer = FaileRetry.class) annotation to the test method.
Global: a. As in the local case, first create a FaileRetry class implementing IRetryAnalyzer and overriding the retry method; the difference is that the variable holding the current run count must be a public static shared variable.
b. Create a class RetryListener implementing the IAnnotationTransformer interface and override the transform method. Its purpose is to check whether a test method already carries the retryAnalyzer attribute; if it does, the global rerun mechanism we configured is not applied (local rerun settings take priority over the global one).
c. Once the first case reaches the maximum number of failed retries, subsequent cases would never enter the rerun mechanism; therefore, after each case finishes its reruns, the current run count must be reset to its initial value.
d. After retrying, all rerun executions are recorded as skipped, which distorts the test totals; after the test finishes, the rerun results should be removed from skipped.
e. Add the listener class to the testng.xml file.
Note: use a recent TestNG version, otherwise problems such as infinite loops may occur. I use version 6.14.3.
Specific code reference: https://blog.csdn.net/Caoqingqing521/article/details/110792727

5. TestNG multi-threaded testing
a. Concurrency at the test-method level
Add @Test(threadPoolSize = 5, invocationCount = 10) to the test method.
Explanation: threadPoolSize is the capacity of the thread pool used to invoke this method; in this example 5 threads execute the method in parallel. invocationCount is the total number of times the method must be invoked.
b. tests-, classes-, and methods-level concurrency can be set on the suite tag in testng.xml.

Their differences are as follows:
tests level: cases under different <test> tags may execute in different threads; cases under the same <test> tag execute in the same thread.
classes level: cases under different <class> tags may execute in different threads; cases under the same <class> tag execute in the same thread.
methods level: all cases may execute in different threads.
6. Parameter-passing methods
a. Passing parameters with @Parameters
Add @Parameters({ "browser", "server" }) directly to the test method.
The values come from the <parameter> tags in testng.xml (see the suite file above).

b. Passing parameters with a DataProvider

// The annotated method must return Object[][] or Iterator<Object[]>.
// It provides data to a test method: each Object[] inside the returned Object[][]
// is bound to the parameter list of the test method. An @Test method that wants
// to receive data from this DataProvider must set dataProvider to the annotation's name.
@DataProvider(name = "user")
public Object[][] getstr() {
    return new Object[][] { { "admin", "123456", "Login successful" },
                            { "admin", "12345", "Wrong password" } };
}
// The DataProvider returns a two-dimensional Object array; each one-dimensional array
// in it is passed to the test method as its parameters. At run time, the @Test method
// executes as many times as there are one-dimensional arrays in the Object[][], and the
// number of parameters of the @Test method equals the number of elements in each one-dimensional array.
@Test(dataProvider = "user")
public void sout(String uname, String pword, String msg) {
    System.out.println(uname + "->" + pword + "->" + msg);
}

Three: Basic use of HttpClient

Before use, import the package (add the Apache HttpClient dependency via Maven).
Steps for sending a request and receiving a response with HttpClient:

  1. Create an HttpClient object:
     HttpClient client = HttpClients.createDefault();
  2. Create an instance of the request method and specify the request URL: create an HttpGet object to send a GET request, or an HttpPost object to send a POST request.
  3. To send request parameters, call the setEntity(HttpEntity entity) method to set them (usually for POST requests).
  4. Call the HttpClient object's execute(HttpUriRequest request) to send the request; this method returns an HttpResponse.
  5. Call HttpResponse's getAllHeaders(), getHeaders(String name), etc. to obtain the server's response headers; call HttpResponse's getEntity() to obtain the HttpEntity object, which wraps the server's response content and through which the program can read it.
  6. Release the connection. Whether execution succeeds or fails, the connection must be released.