A Guide to End-to-End API Test Automation with Postman and GitHub Actions
Objective
This blog provides a step-by-step guide to automating API testing with Postman. It also demonstrates how to create a pipeline that runs the test suite periodically.
Further, it explains how the report can be stored in a central S3 bucket and how the execution status can be sent to a designated Slack channel, informing stakeholders about the outcome and enabling them to obtain detailed information about the quality of the API.
Introduction to Postman
To speed up the API testing process and improve the reliability of our APIs, we are going to automate the API functional tests using Postman.
Postman is a great tool for exploring and exercising RESTful APIs.
It offers a clean user interface for creating functional tests that validate an API's behavior.
The collection of tests will then be integrated with GitHub Actions to set up a CI/CD pipeline that automates this API testing workflow.
Getting started with Postman
Setting up the environment
Click on the "New" button on the top left corner.
Select "Environment" as the building block.
Give the desired name to the environment file.
Create a collection
Click on the "New" button on the top left corner.
Select "Collection" as the building block.
Give the desired name to the Collection.
Adding requests to the collection
Organize the requests under test into folders as required.
Enter the API endpoint in the URL field.
Set the Auth credentials necessary to run the endpoint.
Set the header values, if required.
Enter the request body, if applicable.
Send the request by clicking on the "Send" button.
Verify the response status and response body.
Creating tests
Click on the "Tests" tab.
Write the test scripts in JavaScript using Postman's pm test API; a sample script is shown after this list.
Run the tests by clicking the "Send" button and verify that they execute as expected.
Alternatively, the prebuilt snippets provided by Postman can be used to create the tests.
If test data needs to be created first, use the "Pre-request Script" tab.
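As a rough illustration, a script in the "Tests" tab might look like the following; the expected status code and the id field are placeholders, not part of any specific API:

```javascript
// Assert that the request succeeded (expected status code is a placeholder)
pm.test("Status code is 200", function () {
    pm.response.to.have.status(200);
});

// Assert on a field in the JSON response body ("id" is a placeholder field)
pm.test("Response body contains an id", function () {
    const body = pm.response.json();
    pm.expect(body.id).to.exist;
});
```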
Running the Collection
Click the ellipsis (...) next to the collection you created.
Select the environment created in Step 1.
Click on the "Run Collection" button.
Alternatively, the collection and the environment file can be exported and run from the command line with Newman, as shown below.
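For instance, assuming the exported files are named my-collection.json and my-environment.json (placeholder names), the command would look like this:

```bash
# Placeholder file names for the exported collection and environment
newman run my-collection.json -e my-environment.json
```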
Collaboration
The original collection and the environment file can be exported and shared with others by clicking on the "Export" button. These collections and environments can be version controlled using a system such as Git.
When working in a team, members propose changes to the original collection and environment by forking them and raising pull requests:
Create a fork of the collection.
Make the necessary changes in the fork and click "Create Pull Request".
Review the changes, then approve and merge them into the main collection.
Integrating with CI/CD
Creating a pipeline with GitHub Actions
GitHub Actions is a continuous integration and continuous delivery (CI/CD) platform that allows you to automate your build, test, and deployment pipeline.
You can create workflows that build and test every pull request to your repository, or deploy merged pull requests to production. To create a pipeline, follow the steps below:
Create a .yml file inside the .github/workflows folder at the root of the repository.
The file can also be created directly in the GitHub UI.
Configure the necessary actions/steps for the pipeline.
Workflow File
Add a trigger to run the workflow.
The schedule event in the snippet below triggers the workflow at a specific time interval using a cron expression.
The push and pull_request events trigger the workflow for every push and pull request on the develop branch.
The workflow_dispatch event also allows the workflow to be run manually from the GitHub Actions tab.
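A minimal sketch of such a trigger block is shown below; the workflow name and cron interval are assumptions:

```yaml
name: Postman API Tests        # assumed workflow name

on:
  schedule:
    - cron: '0 6 * * *'        # assumed interval: daily at 06:00 UTC
  push:
    branches: [develop]
  pull_request:
    branches: [develop]
  workflow_dispatch:           # allows manual runs from the Actions tab
```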
Create a job to run the Postman collection; a sketch of the job is shown after this list.
Check out the code from the current repository and create a directory to store the results.
Install Node.js.
Install Newman and the necessary dependencies.
Run the collection.
Upload the Newman report to the results directory.
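The job could look roughly like the sketch below; the directory, file, and reporter names are assumptions rather than the exact configuration of the original pipeline:

```yaml
jobs:
  postman-tests:
    runs-on: ubuntu-latest
    steps:
      # Check out the repository and create a directory for the results
      - uses: actions/checkout@v3
      - name: Create results directory
        run: mkdir -p results              # assumed directory name

      # Install Node.js
      - name: Install Node.js
        uses: actions/setup-node@v3
        with:
          node-version: '18'

      # Install Newman and a reporter (the htmlextra reporter is an assumption)
      - name: Install Newman
        run: npm install -g newman newman-reporter-htmlextra

      # Run the collection; the file names are placeholders
      - name: Run Postman collection
        run: >
          newman run my-collection.json -e my-environment.json
          -r cli,htmlextra
          --reporter-htmlextra-export results/report.html

      # Keep the Newman report as a workflow artifact
      - name: Upload Newman report
        if: always()
        uses: actions/upload-artifact@v3
        with:
          name: newman-report
          path: results
```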
Generating the Allure report and hosting it on S3
In addition to the default report that Newman provides, Allure reporting can be used to get a dashboard of the results.
To generate the Allure report, install the Allure dependencies in the installation step above.
Once that is done, add the code below to your .yml file.
Create an S3 bucket that will be used to store the reports.
Create an IAM role for the bucket.
The snippet below uses the aws-actions/configure-aws-credentials@v1 action to configure your AWS credentials.
Allure generates two separate folders and eventually combines them to create a dashboard.
Use the code in the deploy step to upload the contents of the report folder to your S3 bucket.
Once done, you should be able to see the Allure dashboard hosted at the static website URL of your bucket.
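The steps below extend the job sketched earlier and assume the Newman run was configured to write raw Allure results into an allure-results directory; the directory names, bucket name, region, and secret names are assumptions:

```yaml
      # Generate the Allure dashboard from the raw results
      # ("allure-results" and "allure-report" are assumed directory names)
      - name: Generate Allure report
        run: |
          npm install -g allure-commandline
          allure generate allure-results --clean -o allure-report

      # Configure AWS credentials (secret names and region are assumptions)
      - name: Configure AWS credentials
        uses: aws-actions/configure-aws-credentials@v1
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: us-east-1

      # Deploy: upload the generated dashboard to the S3 bucket
      # (bucket name is a placeholder; static website hosting must be enabled)
      - name: Upload Allure report to S3
        run: aws s3 sync allure-report s3://my-postman-reports --delete
```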
Sending a Slack notification with the status of the job
When a job is executed in a CI/CD pipeline, it's important to keep the team members informed about the status of the job.
The GitHub Actions step below sends a notification to a Slack channel with the status of the job.
It uses the "notify-slack-action" GitHub Action, which is defined in the ravsamhq/notify-slack-action repository.
The if: always() condition ensures that this step runs regardless of whether the previous steps in the workflow succeeded or failed.