Protractor Adventure

by: Dominika Jedrzejczak | March 23, 2017

Once upon a time (OK, it wasn’t THAT long ago), Pearson Poznan took on the job of creating two modules of a new global project – NSE. A decree was issued and soon the whole city knew that Pearson was looking for fierce warriors up to the challenge. Numerous candidates stepped forward, and so a tournament was held. After all trials were completed, 3 warrior princesses and 11 brave warriors started working on the project…

Time flies, and a lot has changed in the project since then. My story is about just one of those changes, and it starts on one sunny summer morning in 2015. As usual, I arrived at the office at the break of dawn to get a head start on the day. I was surprised to see our front-end developer Michał there, since he usually arrived at work a little later. I decided to seize the opportunity and discuss with him the automation problems I had been grappling with the day before. This interesting discussion gave rise to the idea of changing the automation tool and starting to write tests in Protractor. And that’s how this whole thing started…

Choosing and preparing a tool is no easy task, so we decided to discuss the idea with our Head of QA, Grzegorz. This meeting resulted in creating a list of requirements for the new solution:

  • scenarios in the native language (English),
  • support for the Page Object pattern,
  • a solution based on Selenium,
  • tests launched on Selenium Grid,
  • tests launched on multiple browsers,
  • independence from the operating system, and
  • the possibility for developers to support the tests, with the lowest possible entry threshold.

Having all these requirements, we started analyzing whether Protractor could meet them. It took me two months to prepare a configuration that covered the minimum set of requirements. We managed to achieve our goals through the use of the following nodejs modules:

  • protractor – automatic support for Angular’s asynchronicity (the waitForAngular function),
  • cucumberjs – scenarios in the native Gherkin language,
  • astrolabe – Page Objects for tests,
  • protractor-cucumber / protractor-cucumber-framework – connecting Protractor and CucumberJS,
  • chai / chai-as-promised – support for assertions, and
  • grunt-protractor-runner – preparing tasks for launching tests.

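To show how these modules fit together, here is a minimal sketch of a Protractor configuration wiring in CucumberJS; the paths, capability values, and report location are illustrative, not our exact setup.

```javascript
// protractor.conf.js — a minimal sketch; paths and options are illustrative
exports.config = {
  // Selenium Grid hub or a local selenium server
  seleniumAddress: 'http://localhost:4444/wd/hub',

  // Use CucumberJS as the test framework via protractor-cucumber-framework
  framework: 'custom',
  frameworkPath: require.resolve('protractor-cucumber-framework'),

  // Gherkin scenarios live in the features section
  specs: ['features/**/*.feature'],

  capabilities: { browserName: 'chrome' },

  cucumberOpts: {
    require: 'steps/**/*.js',            // step definitions in JS
    format: 'json:reports/results.json'  // json results for the html report
  }
};
```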
The next step was to decide on an approach for organizing the test code. We had already chosen the Page Object pattern, which brought order to a large portion of the code, but other elements still needed to be organized into a logical whole. The following structure is the result of our deliberations:

[Screenshot: the directory structure of the test code]

The respective sections contain the following:

  • features – Gherkin scenarios,
  • steps – definitions of scenario steps in JS,
  • pages – Astrolabe Page Objects,
  • elements – sections singled out from Page Objects (e.g. bigger, more complicated elements on a page, or repeated elements), and
  • support – supporting files, that is:
    • methods – libraries of frequently used functions, e.g. operations on and conversions of dates and strings, additional calculations and data conversions;
    • data – json files with data essential for running and mapping tests (e.g. mapping classes of elements to their logical meaning);
    • users – a collection of json files representing environments and containing the respective user data;
    • photo – graphic files used in tests.
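As an illustration of the support/methods layer, a small date helper of the kind kept there might look like this; the function name and the date format are made up for the example, not taken from our actual library.

```javascript
// support/methods/dates.js — illustrative helper; name and format are assumptions
var dates = {
  // Formats a Date as DD/MM/YYYY
  toDisplayDate: function (date) {
    var pad = function (n) { return n < 10 ? '0' + n : String(n); };
    return pad(date.getDate()) + '/' + pad(date.getMonth() + 1) + '/' + date.getFullYear();
  }
};

module.exports = dates;
```

Keeping such conversions in one place means step definitions stay readable and the same formatting rules apply across all scenarios.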

With time, our needs and requirements for the tool grew. The first change came about when two teammates – Robert and Maciej – prepared an additional nodejs module, grunt-protractor-cucumber-html-report, which generates an html report from the json results produced by cucumber. Creating and adding this module made it possible to present the results of automated test runs graphically, both to other teammates and to business stakeholders. Below is a screenshot from the html report of the testing process:

[Screenshot: the html report of the testing process]
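The Grunt wiring for generating such a report might look roughly like this; the option names below are assumptions based on typical grunt task configuration, so check them against the module’s README before use.

```javascript
// Gruntfile.js fragment — a sketch; option names are assumptions, verify
// against the grunt-protractor-cucumber-html-report README
module.exports = function (grunt) {
  grunt.initConfig({
    'protractor-cucumber-html-report': {
      default_options: {
        options: {
          // json produced by cucumber during the protractor run (assumed path)
          testJSONResultsFile: 'reports/results.json',
          // where the html report is written (assumed path)
          dest: 'reports',
          output: 'report.html'
        }
      }
    }
  });

  grunt.loadNpmTasks('grunt-protractor-cucumber-html-report');
};
```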

We also wanted to make launching tests as easy as possible, so we added two more nodejs modules, grunt-selenium-standalone and selenium-webdriver, which make it possible for Selenium to be downloaded and launched automatically when the tests are run. You can find this first, basic configuration here.
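The end-to-end launch can then be expressed as a single Grunt task chaining Selenium startup, the test run, and shutdown; the task and target names below follow the modules’ conventions as a sketch and may differ from the exact registered names.

```javascript
// Gruntfile.js fragment — a sketch of chaining selenium startup and the test
// run into one command; task/target names are assumptions, check the READMEs
module.exports = function (grunt) {
  grunt.loadNpmTasks('grunt-selenium-standalone');
  grunt.loadNpmTasks('grunt-protractor-runner');

  grunt.registerTask('e2e', [
    'selenium_standalone:serverConfig:install', // download selenium + drivers
    'selenium_standalone:serverConfig:start',   // launch local selenium
    'protractor:run',                           // run the protractor suite
    'selenium_standalone:serverConfig:stop'     // shut selenium down again
  ]);
};
```

With this in place a single `grunt e2e` is enough; nobody has to start a Selenium server by hand first.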

The configuration of the tool has also undergone a lot of changes. The fixed, predefined configuration file was exchanged for a template, which enabled dynamic generation of configurations based on the parameters given while launching the Grunt task. The following variables were configured in this manner:

  • choice of browser,
  • choice of testing environment and its corresponding file containing user database,
  • choice of the manner of representing results (results in a console / html report),
  • choice of selenium server (selenium grid / local selenium), and
  • choice of tag (i.e. the test cases to be run).
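The template approach can be sketched as a function that turns the launch parameters into a configuration object; in the real Grunt task the values come from grunt.option calls, and the parameter names and grid address below are illustrative.

```javascript
// Sketch of dynamic config generation from launch parameters.
// Parameter names and the grid address are assumptions for the example;
// in practice the values come from grunt.option('browser'), etc.
function buildProtractorConfig(opts) {
  return {
    // choice of browser
    capabilities: { browserName: opts.browser || 'chrome' },

    // choice of selenium server: shared grid vs. local selenium
    seleniumAddress: opts.grid
      ? 'http://selenium-grid.example.com:4444/wd/hub' // assumed grid address
      : 'http://localhost:4444/wd/hub',

    // choice of environment selects the matching user-data file
    params: { usersFile: 'support/users/' + (opts.env || 'staging') + '.json' },

    cucumberOpts: {
      // choice of tag limits which test cases run
      tags: opts.tags || [],
      // choice of result representation: json feeds the html report
      format: opts.htmlReport ? 'json:reports/results.json' : 'pretty'
    }
  };
}

module.exports = buildProtractorConfig;
```

Generating the configuration this way keeps a single template under version control while every run can still pick its own browser, environment, and scope.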

The next important step was to prepare a Jenkins task for launching the tests. Thanks to Ola and Jakub’s cooperation, we managed to create a solution that provides daily monitoring of the application as it grows. A testing environment was prepared for this purpose: it is rebuilt at the end of each day from the newest code, and the tests are launched on it. The test data comes from a database copy made before the automated tests are launched and restored right after they complete. Additionally, a task for launching the tests manually on a designated environment was created. The list of available parameters can be seen in the screenshot below:

[Screenshot: parameters of the Jenkins task for launching tests manually]

A lot still remains to be done when it comes to:

  • writing new and maintaining existing scenarios,
  • solving project dependencies (e.g. integration with other systems), and
  • improving the tool configuration and the ways of providing data.

But in the throes of fighting for better and more effective tests we cannot forget about how much has already been achieved. Having said that, concluding my story, I would like to thank everyone who had a hand in this success – be it through daily work on the tool and tests or through taking on other people’s responsibilities to enable them to work on additional technical tasks. I would like to thank all present and past members of NSE teams: Staff Tools, Learning Services and Mental Chillout for everything that we have achieved together. I hope that a lot of adventures and successes still await us! 🙂

Dominika Jedrzejczak

Senior QA Specialist at Pearson Ioki. Often takes part in knowledge-sharing events such as Test Warez, PTaQ, GGC and Test Carrots. Loves dancing, writing poems and Marvel movies. Her motto is "It is never too late to be what you might have been".