HttpRunner 3.x (4) test case structure

In HttpRunner, test cases are organized in a three-tier structure:

1. Test suite (testsuite): corresponds to a directory and contains one or more test case files.

2. Test case (testcase): corresponds to a single file, which can be a YAML/JSON/Python file, containing one or more test steps.

3. Test step (teststep): a single step within a test case.

Test cases are generally written in YAML format.

The YAML test case format is as follows:

config:
    name: xxx
    variables: # configuration variables (config variables)
        varA: "configA"
        varB: "configB"
        varC: "configC"
    parameters: # parameter variables
        varA: ["paramA1"]
        varB: ["paramB1"]
    base_url: "https://postman-echo.com"
    verify: False
    export: ["foo3"]

teststeps:
-
    name: step 1
    ...
-
    name: step 2
    ...

Example

To illustrate how to write the YAML file, we will use the website https://pity.fun/ for testing.

Visit the website, enter the account and password, open the developer tools (F12), and click Login; you can see the login request:

https://api.pity.fun/auth/login, POST method; the request body is JSON:

{"username": "tester", "password": "tester"}

The response body is JSON:

{"code": 0, "msg": "Login Successful", "data": {"token": "eyJ0eXAiOiJKV1QiLCJhbGciOye8", "user": {"id": 6, "username": "tester", "name": "tester", "email": "[email protected]", "role": 1, "phone": "11111111111", "created_at": "2022-12-30 16:34:55", "updated_at": "2023-03-13 15:00:09", "deleted_at": 0, "update_user": 6, "last_login_at": "2023-03-16 22:39:26", "avatar": "https://static.pity.fun/avatar/user_6.png", "is_valid": true}, "expire": 1679236766.573221}}

login.yml

Create login.yml under the testcases folder with the following content:

config:
    name: "Successful login"
    variables:
        username: tester
        password: tester
        expect_foo2: config_bar2
    base_url: "https://api.pity.fun"
    verify: False
    export:
        - token

teststeps:
-
    name: login successful
    variables:
        foo1: bar1
    request:
        method: POST
        url: /auth/login
        headers:
            Content-Type: "application/json"
        json:
            {"username":"$username","password":"$password"}
    extract:
        token: body.data.token
    validate:
        - eq: ["status_code", 200]
        - eq: ["body.code", 0]
        - eq: ["body.msg", "Login Successful"]

Copy the relative path of the file, open a terminal, and run:

hrun filename -s

You can see that the test case passes.

Next, let's go through the contents of this YAML file.

config (required)

Every test case must have exactly one config section.

name (required)

The name of the test case, which will be displayed in logs and reports.

base_url(optional)

The common host of the test case, such as "https://api.pity.fun". If base_url is specified, the url in each test step only needs to be the relative path. This configuration is useful when testing in different environments.
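Conceptually, the final request URL is the base_url joined with each step's relative url. A minimal sketch of that joining rule (build_url is an illustrative helper, not HttpRunner's actual implementation):

```python
def build_url(base_url: str, url: str) -> str:
    """Return url unchanged if it is absolute, else append it to base_url."""
    if url.lower().startswith(("http://", "https://")):
        return url
    return base_url.rstrip("/") + "/" + url.lstrip("/")

print(build_url("https://api.pity.fun", "/auth/login"))
# https://api.pity.fun/auth/login
```

Because an absolute url passes through untouched, individual steps can still hit a different host when needed.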

variables (optional)

Global variables defined here are scoped to the entire test case, and every test step can reference them. Step variables with the same name take precedence over config variables.

Variables are referenced as $variable_name; for example, the username and password in the example are passed in through variables.
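The substitution mechanism can be sketched with a few lines of Python (this only covers plain $name references, not HttpRunner's full ${...} expression syntax):

```python
import re

def render(template: str, variables: dict) -> str:
    # Replace each $name occurrence with the corresponding variable's value.
    return re.sub(r"\$(\w+)", lambda m: str(variables[m.group(1)]), template)

body = render('{"username": "$username", "password": "$password"}',
              {"username": "tester", "password": "tester"})
print(body)
```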

parameters (optional)

Global parameters are used for data-driven testing, and their scope is the entire test case. This will be covered in a later article.
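As a rough preview of the idea (a hedged sketch, not HttpRunner's parameterization engine): independent parameter lists are typically expanded into one test run per combination.

```python
from itertools import product

parameters = {"varA": ["paramA1", "paramA2"], "varB": ["paramB1"]}

# One variable set per combination of the parameter lists.
combinations = [dict(zip(parameters, values))
                for values in product(*parameters.values())]
# 2 combinations: {'varA': 'paramA1', 'varB': 'paramB1'} and
#                 {'varA': 'paramA2', 'varB': 'paramB1'}
```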

verify(optional)

Specifies whether to verify the server's TLS certificate. Setting it to False is especially useful when you want to record the HTTP traffic of a test run, because an SSLError will occur if verify is not set or is set to True.

export (optional)

Specifies the variables the test case exports. Treat each test case as a black box: config variables are its inputs, and config export lists its outputs. When the test case is referenced as a step of another test case, the exported variables are extracted and can be used in the subsequent test steps.

In the example, the token is exported so that a later request can use it. (Note that the export cannot be used from just anywhere: the referencing test case file must contain both steps, as in query.yml below.)
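The black-box model can be sketched as follows (run_testcase is a hypothetical helper for illustration, not HttpRunner code):

```python
def run_testcase(config_variables: dict, extracted: dict, export: list) -> dict:
    # Everything known after the run; only the names listed in export leave the box.
    session = {**config_variables, **extracted}
    return {name: session[name] for name in export}

exported = run_testcase(
    {"username": "tester", "password": "tester"},  # input variables
    {"token": "eyJ0eXAi"},                         # hypothetical extracted value
    export=["token"],
)
# exported == {"token": "eyJ0eXAi"}
```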

teststeps

Each test case has one or more test steps (List[Step]), and each test step corresponds to an API request or a reference to another test case.

name (required)

name is used to define the test step name, which will appear in the log and test report.

variables (optional)

Variables defined in a test step are scoped to the current test step.

If you want to share variables in multiple test steps, you need to define them in config variables.

Variables in test steps will override variables with the same name in config variables.
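The precedence rule behaves like a dict merge with step variables layered on top:

```python
config_variables = {"varA": "configA", "varB": "configB"}
step_variables = {"varA": "stepA"}

# Step variables are merged over config variables, so "varA" is overridden
# while "varB" falls through from config.
effective = {**config_variables, **step_variables}
# effective == {"varA": "stepA", "varB": "configB"}
```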

request (required)

METHOD (required)

Sets the HTTP method. All HTTP methods used in RESTful APIs are supported (GET/POST/PUT/PATCH/DELETE), equivalent to the method argument of requests.request.

URL (required)

Sets the URL. If base_url is set in config, url only needs to be the relative path part. Equivalent to the url argument of requests.request.

PARAMS (optional)

Sets the query string of the URL, equivalent to the params argument of requests.request; typically used with GET requests.

HEADERS (optional)

Set the headers of the request, which is equivalent to the headers in requests.request. Such as the request header in the example

Content-Type: "application/json"

COOKIES (optional)

Set cookies for Http requests, which is equivalent to cookies in requests.request.

DATA (optional)

Set the Body of the http request, which is equivalent to the data in requests.request.

JSON (optional)

Set the body of the http request in json format, which is equivalent to the json in requests.request.

The request body in the example is in json format
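The difference between data and json mirrors what requests does with the same payload; a stdlib sketch of the two body encodings:

```python
import json
from urllib.parse import urlencode

payload = {"username": "tester", "password": "tester"}

# data=payload -> form-encoded body (Content-Type: application/x-www-form-urlencoded)
form_body = urlencode(payload)   # username=tester&password=tester

# json=payload -> JSON body (Content-Type: application/json, set automatically)
json_body = json.dumps(payload)  # {"username": "tester", "password": "tester"}
```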

extract(optional)

Extracts values from the response of the current HTTP request and saves them as variables (such as token). Subsequent test steps can reference them in the form $token.

Principle: Use jmespath to extract the content of Json response body.

In the example, the token is extracted from body.data.token and named token.
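For simple dotted paths, the extraction amounts to walking nested dicts; a minimal stand-in for what jmespath does here (real jmespath supports a far richer query language):

```python
def extract(path: str, response: dict):
    """Walk a dotted path like 'body.data.token' through nested dicts."""
    value = response
    for key in path.split("."):
        value = value[key]
    return value

response = {"body": {"code": 0, "data": {"token": "eyJ0eXAi"}}}
token = extract("body.data.token", response)
# token == "eyJ0eXAi"
```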

validate(optional)

The validation items defined in a test step are scoped to that step and are used to verify the result of the current request.

Principle: Use jmespath to extract the content of Json response and perform assertion verification.

The example asserts that the response status code is 200, and asserts the code and msg fields of the response body.
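The mechanism behind an eq validator can be sketched as pairing a path expression with an expected value (an illustrative model, not HttpRunner's validator code; only simple dotted paths are handled):

```python
def validate(response: dict, checks: list) -> None:
    """Each check pairs a dotted path with its expected value ('eq' semantics)."""
    for path, expected in checks:
        actual = response
        for key in path.split("."):
            actual = actual[key]
        assert actual == expected, f"{path}: {actual!r} != {expected!r}"

response = {"status_code": 200, "body": {"code": 0, "msg": "Login Successful"}}
validate(response, [("status_code", 200), ("body.code", 0),
                    ("body.msg", "Login Successful")])  # passes silently
```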

hooks (optional)

Setup and teardown hooks (setup_hooks / teardown_hooks) can run custom functions before and after a request; they are not covered in this article.

Reference test cases in test steps

Create a new file query.yml, write the following content, and run it; you can see that both steps pass.

This test case has two steps; the first references the login file.

The second step uses $token to reference the token variable exported in step 1.

In a step that references a test case, the variables and export keywords can be used, but other keywords such as validate cannot.

config:
    name: "Query user information"
    base_url: "https://api.pity.fun"

teststeps:
-
    name: successful login
    testcase: ./testcases/login.yml
-
    name: query user information
    request:
        method: GET
        url: /auth/query?token=$token
    validate:
        - eq: ["status_code", 200]
        - eq: ["body.code", 0]
        - eq: ["body.msg", "Operation succeeded"]

variables

Same as the variables keyword in an ordinary test step.

testcase

Specifies the path of the referenced test case file.

export

Variables extracted from the referenced test case; they can be referenced in later test steps.