
Performance Report

This report documents the results of the performance benchmark for Go2Group synapseRT, including the test environment, test scenarios, test data, and test conclusions.

Purpose

Run the test to find the performance bottlenecks of the product, and record the results as a benchmark for comparison with the next version.

Readers

  • synapseRT product team
  • synapseRT users

Test Environment

Hardware and System

  • OS: Windows 10 Pro 64-bit
  • Processor: Intel(R) Xeon(R) CPU E3-1231 v3 @3.40GHz
  • CPU cores: 4
  • Memory: 32.0 GB

Software

  • JIRA Server: JIRA Core 7.2.3
  • synapseRT NextGen: v8.5.1
  • Database: Oracle 12c
  • Browser: Firefox 47.0.2

Test Data

Testing_Data_JIRA7.2.3_v8.4.3.1_0104(3000TCs&1000REQs&100TPs_REQHierarchyDone_TC&REQLinkageDone).zip

Test Scenarios

Because JIRA is a web-based application, our performance testing focuses on page loading time with a large amount of data.

Perform each of the operations listed below with a large amount of data and record the page loading time (a measurement sketch follows the list):

  1. Navigate to the Requirements page
  2. Navigate to the Test Suites page
  3. Navigate to the Test Plans page
  4. Navigate to the Traceability page
  5. Navigate to the synapseRT Reports page
  6. Add a test cycle to a test plan with a large amount of test cases
  7. Open a test plan issue with a large amount of data
  8. Open a test cycle page with a large amount of test cases
  9. Start a test cycle with a large amount of test cases
  10. Initiate a "Bulk Operation" with a large amount of test cases
  11. Expand a test plan from the Test Plans page (in the Unresolved Plans tab)
  12. Link a large amount of test cases to a test suite
  13. Close a test run dialog box to refresh the Test Cycle page
  14. Gadget: choose a test plan to load its test cycle in the Edit Gadget page
  15. Continue to import test cases to a JIRA project
  16. Expand requirement hierarchy from the Requirements page with requirement hierarchy setup and test case associations
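The report does not specify how the page loading times were captured. The sketch below shows one possible way to automate the measurements, assuming Selenium WebDriver with Firefox and the browser's Navigation Timing API; the JIRA base URL, project key, and page paths are placeholders, and login handling is omitted.

```python
# A minimal sketch of capturing average page loading times, assuming
# Selenium WebDriver with Firefox and the Navigation Timing API.
# The base URL and page paths are placeholders, not values from this test.
from statistics import mean

from selenium import webdriver

BASE_URL = "http://localhost:8080"  # hypothetical JIRA Server address

# Hypothetical paths for the pages exercised in the scenarios above.
PAGES = {
    "Requirements page": "/projects/PTA/requirements",
    "Test Suites page": "/projects/PTA/test-suites",
    "Test Plans page": "/projects/PTA/test-plans",
}


def load_time_seconds(driver, url):
    """Open a page and read its load time from the Navigation Timing API."""
    driver.get(url)
    millis = driver.execute_script(
        "return performance.timing.loadEventEnd"
        " - performance.timing.navigationStart;"
    )
    return millis / 1000.0


def main():
    driver = webdriver.Firefox()  # assumes geckodriver is on the PATH
    try:
        for name, path in PAGES.items():
            # Authentication is omitted for brevity; a real script would
            # log in to JIRA before navigating to project pages.
            samples = [load_time_seconds(driver, BASE_URL + path) for _ in range(3)]
            print(f"{name}: average {mean(samples):.2f}s over {len(samples)} runs")
    finally:
        driver.quit()


if __name__ == "__main__":
    main()
```

Averaging several samples per page mirrors the "Average page loading time" values reported in the results below.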

Test Results

We ran this benchmark test several times. The results for each test were almost identical. One set of results is listed below:

Each entry below lists the test scenario, the test data used, and the average page loading time.

Navigate to the Requirements page

  • There are 1000 requirements in total (a quick arithmetic check of these counts appears after the results table)
  • No. of level one requirements: 21
  • No. of level two requirements: 20 (L1 REQ) * 5 + 1 (L1 REQ) * 79 = 179
  • No. of level three requirements: 16 (L1 REQ) * 5 (L2 REQ) * 10 (L3 REQ) = 800
  • No. of test cases: 300 (L3 REQ) * 10 (TC) = 3000
  • No. of test plans: five, each with four test cycles and 3000 test cases added

4.64s

Navigate to the Test Suites page

  •  There are a total of five test suites
  • Each test suite has 600 test cases associated with it

2.27s

Navigate to the Test Plans page

  • There are 100 test plans in total
  • Five of them have 3000 test cases added each
  • Each of those five test plans has four test cycles, all in "ACTIVE" status
  • No test cases or test cycles are added to the remaining 95 test plans

13.22s

Navigate to the Traceability page

Click the Traceability menu in the project sidebar

1.43s

Navigate to the synapseRT Reports page


1.22s

Add a test cycle to a test plan with a large amount of test cases

PTA-8100

  • There are 3000 test cases

18s

Open a test plan issue with a large amount of data

PTA-8100

  • There are 3000 test cases
  • There are four test cycles (all in "ACTIVE" status)
  • There are 3000 requirements covered

5.78s

Open a test cycle page with a large amount of test cases

PTA-8100/test cycle one (3000 TCs)

  • There are 3000 test cases

19.18s

Start a test cycle with a large amount of test cases

PTA-8100/test cycle one (3000 TCs)

  • There are 3000 test cases

50s

Initiate a "Bulk Operation" with a large amount of test cases

PTA-8100/test cycle one (3000 TCs)

  • Assign all 3000 test cases to a JIRA user

8s

Expand a test plan from the Test Plans page (in the Unresolved Plans tab)

PTA-8100

  • There are 3000 test cases
  • There are five test cycles (all in "ACTIVE" status)

5.76s

Link a large amount of test cases to a test suite

[Performance testing] test suite eight (link)

  • There are 600 test cases

12s

Close a test run dialog box to refresh the Test Cycle page

PTA-8100/test cycle one (3000 TCs)

  • A test cycle with 3000 test cases

<1s

Gadget: Choose a test plan to load its test cycle in the Edit Gadget page

PTA-8100

  • There are 3000 test cases
  • There are three test cycles (all in "ACTIVE" status)

<1s

Continue to import test cases to a JIRA project

  1. The first 600 test cases
  2. The second 600 test cases
  3. The third 600 test cases

1. 2m 50s

2. 3m 10s

3. NA

Expand the requirement hierarchy from the Requirements page (with requirement hierarchy setup and test case associations)

  • There are 1000 requirements in total
  • For the first 20 requirements (level one requirements) in the list (they are visible on the Requirements page by default):
    • Each level one requirement has five direct child requirements (level two requirements)
    • Each level two requirement has ten direct child requirements (level three requirements)
    • Each level three requirement has ten test cases associated with it

1. Open the Requirements page: 4.44s

2. Expand level two requirements: 434ms
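As a quick sanity check of the counts used in the "Navigate to the Requirements page" scenario, the figures from that row add up as follows (a minimal sketch; all numbers are taken directly from the table above):

```python
# Arithmetic check of the Requirements page test data; every count below
# comes from the results table above.
level_one = 21
level_two = 20 * 5 + 1 * 79        # 20 L1 REQs with 5 children each, 1 L1 REQ with 79
level_three = 16 * 5 * 10          # 16 L1 REQs x 5 L2 children x 10 L3 children
assert level_one + level_two + level_three == 1000  # 1000 requirements in total

test_cases = 300 * 10              # 300 L3 REQs with 10 test cases each
assert test_cases == 3000          # 3000 test cases in total
```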

Test Conclusion

  1. The data above come from a single run of the test. When we ran the test multiple times, the results were almost identical, which validates them.
  2. Next step: we will re-run the test with the same environment and scripts, and use this data as a baseline for comparison with the next version (a comparison sketch follows this list).
  3. Looking at the data above, we can conclude that importing a large number of test cases takes a long time, and we need to improve product performance in this area in a future release. However, considering that the data load we used exceeds the average data load, the wait is still acceptable to the user.
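A minimal sketch of how the next version's results could be compared against this baseline; the baseline values are taken from the results table above, while the next-version values are hypothetical placeholders to be filled in after the re-run.

```python
# Compare a future benchmark run against this baseline. Baseline values come
# from the results table above; the next_version values are placeholders.
baseline = {
    "Navigate to the Requirements page": 4.64,   # seconds
    "Navigate to the Test Suites page": 2.27,
    "Navigate to the Test Plans page": 13.22,
}

next_version = {scenario: None for scenario in baseline}  # fill in after re-running

for scenario, old in baseline.items():
    new = next_version.get(scenario)
    if new is None:
        print(f"{scenario}: baseline {old:.2f}s (no new measurement yet)")
        continue
    change = (new - old) / old * 100
    print(f"{scenario}: {old:.2f}s -> {new:.2f}s ({change:+.1f}%)")
```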