Building a Load Testing Framework using k6 — Adding Assertions and Custom Metrics (Part 4)

Kushal Bhalaik
4 min read · Oct 29


In Part 3 of this article series, we added support for uploading files using k6. In this post, we will add assertions (also known as checks) and see how to add custom metrics to the existing framework.

Why Assertions and Custom Metrics?

Assertions: Let’s assume you ran a test script for 24 hours with 100 concurrent users, but there is a basic flaw in your test or test data setup (for example, a VPN issue in the environment). Unless you add assertions, your test script will keep running for the full 24 hours without producing useful results, wasting time and resources. So we need assertions at key points in the test script to make sure the outcome of each step is what we intended.

Custom metrics: k6 provides pre-built metrics out of the box, which are good for measuring the overall performance of a flow in an application, but they fall short when you want to measure intermediate results. To achieve this, we add custom metrics at certain levels to make sure we get the desired measurements at intermediate steps.

Adding BDD Assertions/checks:

In order to add assertions we will rely on a library called k6chaijs. A k6 project is not a Node.js project, so to make use of certain dependencies we have to import them remotely into our scripts.

k6 supports jslib, a collection of JavaScript libraries built to work well with the k6 framework.

We will also be using BDD syntax in our test scripts. We begin by importing the following functions:

import { describe, expect } from 'https://jslib.k6.io/k6chaijs/4.3.4.3/index.js';

and then convert the existing code in the testSuite() function to use Jasmine-style describe blocks, like below:

// actual test code (run by VUs)
export default function testSuite() {

  describe('Login', () => {
    // username field assumed from the framework's userInfo object
    login(userInfo.username, userInfo.password);
  });

  describe('POST request', () => {
    // POST request code
  });

  describe('file upload', () => {
    // file upload code
  });
}

We can then add expect (assertion) statements after each API request in a block to verify that the request resulted in the desired behavior.

For example:

describe('POST request', () => {
  let data = { name: "Bert" };

  // Using a JSON string as body
  let res = http.post(`${baseUrl}/post`, JSON.stringify(data), {
    headers: { "Content-Type": "application/json" },
  });

  expect(res.status, 'response status').to.equal(200);
});

You can see these assertions in the execution results:

assertions in the execution results

Also, you can combine these assertions/checks with scenario thresholds to stop execution if the error rate is too high:

export const options = {
  thresholds: {
    checks: ['rate>=0.9'], // 90% of assertions should pass
    http_req_failed: [{
      threshold: 'rate<=0.05', // stop the test if the failure rate is greater than 5%
      abortOnFail: true,
    }],
  },
};
Adding Custom Metrics:

With k6 we can add four different types of custom metrics in our test scripts, namely Counter, Gauge, Rate, and Trend.
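The distinction between the four types can be sketched in plain JavaScript (a mental model only, not the real k6/metrics API): a Counter accumulates a sum, a Gauge keeps only the latest value, a Rate tracks the fraction of non-zero values added, and a Trend keeps every sample so statistics like avg and percentiles can be computed.

```javascript
// Mental model of k6's four custom metric types (not the real k6 API).
const counter = { total: 0, add(v) { this.total += v; } };   // Counter: cumulative sum
const gauge   = { last: 0, add(v) { this.last = v; } };      // Gauge: latest value only
const rate    = {                                             // Rate: fraction of non-zero adds
  hits: 0, n: 0,
  add(v) { this.n++; if (v) this.hits++; },
  get value() { return this.hits / this.n; },
};
const trend   = {                                             // Trend: keeps all samples for stats
  samples: [],
  add(v) { this.samples.push(v); },
  get avg() { return this.samples.reduce((a, b) => a + b, 0) / this.samples.length; },
};

counter.add(2); counter.add(3);   // total = 5
gauge.add(10); gauge.add(7);      // last = 7
rate.add(1); rate.add(0);         // value = 0.5
trend.add(100); trend.add(300);   // avg = 200
```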

In this example, we will add a custom metric of type Trend. We start by importing:

import { Trend } from 'k6/metrics';

Let’s say we want to measure how much time it takes to upload a file. We can define fileUploadStartTime and fileUploadEndTime variables for storing the start and end timestamps respectively, as below, and also create a Trend-type metric called fileUploadTime:

// custom metrics
let fileUploadStartTime;
let fileUploadEndTime;

const fileUploadTime = new Trend('file_upload_time', true); // 'true' marks this as a time metric

Now in order to use this in our code, we will modify the fileUpload test as follows:

  describe('file upload', () => {
    // file upload
    let newFile = init.getARandomFile();

    // requestBody for fileUpload (filename field assumed from the file helper)
    const requestBody = {
      file: http.file(newFile.file, newFile.name),
    };

    fileUploadStartTime = new Date().getTime(); // file upload start time

    const fileUploadResponse = http.post(`${baseUrl}/upload`, requestBody);

    fileUploadEndTime = new Date().getTime(); // file upload end time

    fileUploadTime.add(fileUploadEndTime - fileUploadStartTime); // add total time to metric
  });

Now, after running the test script, we can see that file_upload_time was added to the execution results as below:

So that’s how we can add custom metrics to our scripts; we can also add other types of custom metrics to our code as needed.
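The measure-and-record pattern used above is ordinary JavaScript and works for any intermediate step you want to time: take a start timestamp, run the operation, take an end timestamp, and add the difference to the metric. A standalone sketch of the pattern (the recordedTimes array here is a stand-in for the Trend metric, not the k6 API):

```javascript
// Standalone sketch of the measure-and-record pattern.
// 'recordedTimes' stands in for a k6 Trend metric (i.e. fileUploadTime.add(...)).
const recordedTimes = [];

function timeOperation(operation) {
  const start = new Date().getTime(); // start timestamp, as in fileUploadStartTime
  operation();                        // the step being measured (e.g. the upload request)
  const end = new Date().getTime();   // end timestamp, as in fileUploadEndTime
  recordedTimes.push(end - start);    // record the duration, like fileUploadTime.add(...)
  return end - start;
}

const elapsed = timeOperation(() => {
  let s = 0;
  for (let i = 0; i < 100000; i++) s += i; // placeholder work in place of the upload
});
console.log(recordedTimes.length, elapsed >= 0);
```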


GitHub Code:

Originally posted on


