Testing

Integration Tests

We had two sets of integration tests, one for each of our npm modules. Together, they covered over 97% of our codebase.

Both packages needed rigorous integration tests because they make external API calls and depend on multiple packages working together correctly.

Voiceflow to UBF

vf-to-ubf coverage

For vf-to-ubf, we generate a unified bot format (UBF) from a Voiceflow diagram. We only support certain types of Voiceflow diagrams, so testing requires trying a diverse set of input diagrams and checking that whenever validation passes, we get the UBF diagram we expect. To write these tests, we generated our own set of test files using Voiceflow.

test("Voiceflow Diagram contains unsupported type", async () => {
    // Await the validation promise so the assertion runs before the test ends
    const diagram = JSON.parse(largeDiagrams.getUnsupportedType())
    const resp = await validate.validateDiagram(diagram)
    expect(resp).toBe(false)
})

The varied input diagrams let us cover all edge and error cases, such as using blocks we don't support, adding or removing keys from a diagram, and many other malformed inputs.
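The kind of structural check these tests exercise can be sketched as follows. This is a minimal illustration, not the actual vf-to-ubf implementation; the supported block types, required keys, and function names here are all hypothetical.

```javascript
// Hypothetical sketch: validation rejects a diagram if any node is missing a
// required key or uses a block type the converter does not support.
const SUPPORTED_TYPES = new Set(["speak", "capture", "condition"])  // illustrative
const REQUIRED_KEYS = ["nodeID", "type", "data"]                    // illustrative

function validateNode(node) {
    // Reject nodes missing any required key (e.g. a diagram with keys removed)
    for (const key of REQUIRED_KEYS) {
        if (!(key in node)) return false
    }
    // Reject block types the converter does not handle
    return SUPPORTED_TYPES.has(node.type)
}

function validateDiagramSketch(diagram) {
    return Array.isArray(diagram.nodes) && diagram.nodes.every(validateNode)
}
```

A test then only needs to feed in a diagram with a deliberately unsupported type or deleted key and assert that validation returns false.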

UBF to Twilio

ubf-to-twilio coverage

The main purpose of ubf-to-twilio is to move data into and out of Twilio with API calls. Testing it means taking each call in our Twilio implementation of the API and exhaustively trying diverse sets of inputs.

test("Bot Missing Diagram", async () => {
    // Guard against the test passing silently if no error is thrown
    expect.assertions(1)
    const bot = {"phoneNumber": null, "name": "test", "id": null, "diagram": null, "timestamp": new Date().getTime()}
    try {
        await upload.uploadNewBot(client, bot)
    } catch (e) {
        expect(e.name).toBe("ValidationError")
    }
})

If a request to Twilio succeeds, we check that the right value is returned; if a request fails, we check that the expected error condition is triggered.
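This success-path/failure-path pattern can be sketched with a stubbed client, which is how such calls can be exercised without hitting the network. The stub below and its `uploadService` method are hypothetical stand-ins, not the real ubf-to-twilio or Twilio API.

```javascript
// Hypothetical stub standing in for the Twilio client, so both the success
// and failure paths can be checked without real API calls.
function makeStubClient({ shouldFail }) {
    return {
        uploadService: async (payload) => {
            if (shouldFail) {
                const err = new Error("invalid payload")
                err.name = "ValidationError"
                throw err
            }
            return "FS0000000000"  // a fake service identifier
        }
    }
}

// Success path: the call resolves, so we check the returned value.
async function checkSuccessPath() {
    const client = makeStubClient({ shouldFail: false })
    const sid = await client.uploadService({ name: "test" })
    return sid.startsWith("FS")
}

// Failure path: the call rejects, so we check the error that was raised.
async function checkFailurePath() {
    const client = makeStubClient({ shouldFail: true })
    try {
        await client.uploadService({ name: "test" })
        return false  // the stub should have thrown before this line
    } catch (e) {
        return e.name === "ValidationError"
    }
}
```

In the real tests, the same two checks are made against live Twilio responses rather than a stub.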