Automation in the development of the 1C: Enterprise platform

This article discusses how we automate the development and testing of the 1C: Enterprise 8 technology platform. The 1C: Enterprise 8 platform is a set of tools for creating business applications, together with the runtime environment in which they execute. It is a large project (over ten million lines of code) in C++, Java and JavaScript. Dozens of programmers work on it, simultaneously developing and supporting up to 10 different versions of the product.

The platform runs on various versions of operating systems and DBMSes:

  • OS: Windows, Linux, macOS
  • DBMS: MS SQL, PostgreSQL, IBM DB2, Oracle, proprietary file DBMS
  • Mobile OS: Android, iOS, Windows

It also supports several types of clients - thin, web and mobile clients among them.

Given that a number of versions of the OSes, DBMSes and browsers listed above must be supported, testing the platform becomes a non-trivial task.

image

General Automation Tasks


The goals we set for ourselves:

  • Automate and speed up routine development and testing tasks as much as possible
  • Test continuously, with minimal manual effort
  • Let only high-quality new code into a product version
  • Avoid breaking existing functionality
  • Bring the number of significant defects in the released platform down to zero
  • Detect problems early, to minimize the cost of investigating and fixing them

image
Simultaneous development of several versions of the platform.
We follow the practice of Continuous Integration (CI): working copies of the code are merged into a common main branch several times a day, and after each merge the changed project is automatically built and tested. If problems arise during the build or testing, the change is returned to its author for rework.

image
Development Processes for One Version of the Platform
The tasks of our CI process:

  • Builds
    • A series of builds of various types, with subsequent testing of the changed versions, as part of a continuous cycle. To speed up the localization of individual changes in the test results, we use incremental compilation: only the changed code and its direct dependencies are recompiled (see the sketch after this list). Full builds are produced for the full testing cycle. The need for, and the order of, additional builds is determined by the test results, a preliminary analysis of which is automated.
    • Separate verification of significant changes (integration builds). If an engineer considers a change significant, they first merge it into a separate branch, build it and run all the tests. Only after all tests pass is the change merged into the main branch.
    • Detecting errors as quickly as possible, automatically wherever possible
    • Automation of routine actions (dump analysis, migration of changes between branches, filing of bugs)
  • Tiered testing
    • Regression tests
    • Unit tests
    • Integration tests
    • Stress tests
    • Visual tests
    • Backward compatibility tests
    • Scenario tests
  • Progressive testing
    • Mostly functional tests
  • Acceptance testing
  • Tracking testing progress and changes
    • This also includes manual testing
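
The incremental compilation mentioned above can be illustrated with a minimal sketch (not our actual build system): given the set of changed files and a reverse dependency map, select only those files and their direct dependents for recompilation. The module names and the dependency map here are hypothetical.

```python
# Minimal sketch: pick what to recompile from the changed files and their
# direct dependents. The module names and dependency map are hypothetical.
from typing import Dict, List, Set

# reverse dependency map: module -> modules that directly depend on it
REVERSE_DEPS: Dict[str, List[str]] = {
    "core/strings.cpp": ["core/serializer.cpp", "server/session.cpp"],
    "core/serializer.cpp": ["server/cluster.cpp"],
    "server/session.cpp": [],
    "server/cluster.cpp": [],
}

def modules_to_rebuild(changed: Set[str]) -> Set[str]:
    """Changed modules plus their direct dependents (one level only)."""
    result = set(changed)
    for module in changed:
        result.update(REVERSE_DEPS.get(module, []))
    return result

if __name__ == "__main__":
    print(sorted(modules_to_rebuild({"core/strings.cpp"})))
    # -> ['core/serializer.cpp', 'core/strings.cpp', 'server/session.cpp']
```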

Automatic builds take place several times a day. A full automatic testing cycle takes about a day, which is unfortunately unacceptably long for some tasks (rebalancing testing resources speeds things up, but only when free resources are available at that moment). To mitigate this, we are developing a "lite" suite of tests that should run within an hour and cover about 80% of the functionality, so we can get an overall picture of build quality much faster. In some cases even the full hour is not needed.
Testing now also takes the results of previous test cycles into account: problematic, new and recently fixed tests are run with high priority, so progress in the most volatile functionality is visible right at the start of the testing cycle.

For some build types we apply a "10 failures" rule: a test series is automatically aborted once 10 failures occur within it, freeing up resources for testing other builds, other versions, and so on.
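
Both scheduling ideas can be sketched in a few lines; this is only an illustration, not our real test runner, and the test names and history structure are hypothetical.

```python
# Sketch of two scheduling ideas described above: run problematic/new/fixed
# tests first, and abort a series after 10 failures.
from typing import Callable, Dict, List

MAX_FAILURES = 10

def priority(test: str, history: Dict[str, str]) -> int:
    """Lower value = run earlier. History holds the last known status per test."""
    status = history.get(test)
    if status in (None, "failed", "fixed"):   # new, problematic or just fixed
        return 0
    return 1                                  # stable tests go last

def run_series(tests: List[str], history: Dict[str, str],
               run_one: Callable[[str], bool]) -> None:
    failures = 0
    for test in sorted(tests, key=lambda t: priority(t, history)):
        if not run_one(test):
            failures += 1
            if failures >= MAX_FAILURES:
                print("Series aborted: too many failures, freeing resources")
                break
```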

About 70 physical servers and about 1,500 virtual servers take part in building and testing.

Tools


Jenkins


We use Jenkins as our continuous integration system. At peak periods it performs 20 or more platform builds per day; one build takes about 1.5 hours, plus about an hour of testing. Builds run in parallel for each architecture (Windows, Linux, macOS), each of them in hundreds of threads at once. Several years ago this approach allowed us to cut the build time of one platform version across all architectures from 8 hours to 80 minutes, and we do not intend to stop there.
Through web services, Jenkins is integrated with our task tracker, the Task Base (itself written on the 1C: Enterprise platform): when problems occur, it automatically files bugs directly in the Task Base, attaching links to logs and test artifacts. Jenkins also prepares the platform for publication and, when necessary, filters and parses dumps.

Jenkins also orchestrates testing, which lets us implement arbitrarily complex scenarios on arbitrary hardware configurations, including a large number of virtual machines, and handles additional work such as delivering and installing the platform on 1,500 servers up to 70 times a day.
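
A rough sketch of the kind of glue this integration involves: a post-build step that files a bug in the tracker when a build fails. The tracker URL, endpoint and payload fields below are hypothetical, not the real Task Base API; the environment variables are the standard ones Jenkins exposes to build steps.

```python
# Hypothetical sketch of a post-build step that files a bug in the task
# tracker when a build fails; the URL and payload fields are made up.
import json
import os
import urllib.request

TRACKER_URL = "https://tracker.example.local/api/bugs"  # hypothetical endpoint

def file_bug(job: str, build_number: str, log_url: str) -> None:
    payload = {
        "title": f"Build {job} #{build_number} failed",
        "links": [log_url],                     # links to logs and artifacts
        "severity": "build-failure",
    }
    request = urllib.request.Request(
        TRACKER_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(request, timeout=30)

if __name__ == "__main__":
    # JOB_NAME, BUILD_NUMBER and BUILD_URL are set by Jenkins for build steps.
    file_bug(os.environ.get("JOB_NAME", "platform-build"),
             os.environ.get("BUILD_NUMBER", "0"),
             os.environ.get("BUILD_URL", "") + "console")
```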

Apache JMeter


JMeter has one very valuable quality: its hardware requirements for emulating a large number of users are low. JMeter also lets us generate a mixed load within a single test - HTTP, SOAP, JDBC, LDAP, SMTP, TCP.

In particular, we use JMeter to test the performance of the application cluster and its individual components, as well as for load testing the application cluster with a large number of users (up to 10,000). For this testing, one database server, two 1C servers and one load-generation server are sufficient.
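
Such runs are typically launched from automation in JMeter's non-GUI mode; here is a small sketch of that invocation. The test plan name and the "users" property are assumptions for illustration; -n (non-GUI), -t (test plan), -l (results file) and -J (set a property) are standard JMeter options.

```python
# Sketch of launching a JMeter load test in non-GUI mode from a script.
import subprocess

def run_load_test(users: int, plan: str = "cluster_load.jmx") -> None:
    subprocess.run(
        [
            "jmeter", "-n",            # non-GUI mode
            "-t", plan,                # test plan (hypothetical file name)
            "-l", f"results_{users}.jtl",
            f"-Jusers={users}",        # property read by the test plan
        ],
        check=True,
    )

if __name__ == "__main__":
    run_load_test(10000)
```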

We have 4 test benches on which we test a single cluster and clusters in fault-tolerant and non-fault-tolerant configurations; only two physical machines are needed to test these configurations.

image
JMeter Performance Charts

Test Center


For more complex testing, we use our product Test Center (part of the Corporate Toolkit). Test Center is a configuration on the 1C: Enterprise 8 platform; it allows us to describe multi-user test scenarios, run them automatically and monitor their progress. We run Test Center on so-called conveyors; one conveyor consists of 2 powerful physical servers hosting virtual machines:

  • 1 1C application server
  • 1 database server
  • 1 licensing server
  • 40 servers hosting client sessions

We have put a lot of effort into improving the accuracy of a conveyor; now, when we run tests on the same platform version and configuration, the spread of the results is less than 1.5%. A single conveyor runs either 100 very fast clients (performing operations without pauses) or 1,000 clients close to real users (emulating the work of an ordinary person, with pauses between actions).
From the conveyors we assemble typical test-stand variants:

  • small
  • medium
  • large

15 different configurations of working environments can be assembled from the conveyors. The configurations differ in server composition and fault tolerance; servers can run on Linux or Windows. The databases used for testing (as well as the test scenarios) are prepared in two variants:

  • cloud, for the 1cfresh technology (a database with a large number of relatively small data areas)
  • CORP, for large deployments (one large database)

Separated infobases (for testing work with the 1cfresh technology) use the following configurations:

  • 1C: Accounting
  • Managing Our Company
  • Salary and HR Management

In the CORP variants, the following configurations are tested:

  • Salary and HR Management
  • 1C: ERP Enterprise Management 2

Load tests can involve 1, 2, 4 or 10 conveyors.
Load tests are available in variants for 100, 200, 400, 3,000 and 10,000 users.
In different working-environment configurations, the number of servers in a cluster varies from 1 to 6.
To run tests with 10,000 users in a single database, two working 1C application servers are used. Each cluster configuration is automatically set up from hundreds of parameters at the start of each test. In fact, we can consider the test stand fully prepared automatically, because:

  • the platform is delivered
  • the test scripts are delivered
  • the cluster is configured
  • the database is loaded
  • the configuration files are set up (according to the specified parameters)
  • the infobase publications are prepared

Cluster configuration scripts, configuration files, OS settings and special data processors are stored centrally in Git and are delivered to the stands automatically whenever they change.
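
The preparation steps listed above can be pictured as a simple orchestration script. This is only an illustrative sketch under that assumption - each step function is made up and stands in for a real delivery or configuration action in our tooling.

```python
# Illustrative sketch of automatic stand preparation; each step function is
# hypothetical and represents a real delivery/configuration action.
from typing import Callable, Dict, List

def deliver_platform(params): print("platform delivered")
def deliver_scripts(params): print("test scripts delivered")
def configure_cluster(params): print(f"cluster configured for {params['servers']} servers")
def load_database(params): print(f"database {params['database']} loaded")
def write_config_files(params): print("configuration files written")
def publish_infobases(params): print("infobase publications prepared")

STEPS: List[Callable[[Dict], None]] = [
    deliver_platform, deliver_scripts, configure_cluster,
    load_database, write_config_files, publish_infobases,
]

def prepare_stand(params: Dict) -> None:
    for step in STEPS:
        step(params)   # any failure here stops the preparation

if __name__ == "__main__":
    prepare_stand({"servers": 2, "database": "erp_corp", "users": 10000})
```
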
We also have restructuring test scenarios (updating the product version in a way that changes the database structure). We test restructuring on the same stands. After such a test completes, the final result is checked: the data in the database must be migrated correctly, and the database structure must correspond to the new version. Both the old and the new restructuring mechanisms are tested.

During stress tests, we automatically collect and analyze:

  • all errors reported by the Test Center
  • exceptions from the platform's technological log
  • all queries from the platform's technological log
  • all errors from the log
  • all measurements of the operations performed, with technological information about their execution
  • all hardware load data

Reports are generated automatically for all of this data (their form depends on the test type) and sent to the responsible person. All data is stored and aggregated in a dedicated database of statistics and test results.
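
As a toy illustration of the aggregation step (the real reports are far richer), a sketch that counts errors per source and writes a plain-text summary; the record format and file names are hypothetical.

```python
# Toy sketch of aggregating collected stress-test records into a summary;
# the record format (one JSON object per line) and file names are made up.
import json
from collections import Counter

def summarize(records_path: str, report_path: str) -> None:
    errors = Counter()
    durations = []
    with open(records_path, encoding="utf-8") as src:
        for line in src:
            record = json.loads(line)
            if record["kind"] == "error":
                errors[record["source"]] += 1   # Test Center, tech log, ...
            elif record["kind"] == "operation":
                durations.append(record["duration_ms"])
    with open(report_path, "w", encoding="utf-8") as report:
        report.write(f"errors by source: {dict(errors)}\n")
        if durations:
            report.write(f"operations: {len(durations)}, "
                         f"avg {sum(durations) / len(durations):.1f} ms\n")
```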

image
Test Center Screen

We also carry out load testing of 10,000 users working in the "1C: ERP Enterprise Management 2" configuration on a failover cluster, with simulation of hardware failures, network failures, and shortages of memory, CPU and disk space. This is a large test scenario in which, throughout the entire test, 1C server process hangs are simulated, some processes are "killed" with the taskkill utility, the network is disconnected and restored, and so on. As part of this testing, user scenarios are run in different subsystems - warehouse, purchasing, sales, mutual settlements, etc. The ERP stress test performs about 400 key operations and takes several hours.
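
A crude sketch of the failure-injection part of such a scenario (the real harness is more involved): periodically kill a randomly chosen 1C server worker process with taskkill, mentioned above, and let the cluster recover. The choice of process name and the interval are illustrative assumptions.

```python
# Crude sketch of injecting failures during a long load test: periodically
# kill a randomly chosen worker process with taskkill and let the cluster
# recover. The process name and interval are illustrative.
import random
import subprocess
import time

WORKER_PROCESS = "rphost.exe"   # illustrative choice of worker process

def kill_random_worker() -> None:
    # List PIDs of the worker processes via tasklist, then kill one of them.
    out = subprocess.run(
        ["tasklist", "/FI", f"IMAGENAME eq {WORKER_PROCESS}", "/FO", "CSV", "/NH"],
        capture_output=True, text=True,
    ).stdout
    pids = [line.split('","')[1] for line in out.splitlines() if line.startswith('"')]
    if pids:
        subprocess.run(["taskkill", "/F", "/PID", random.choice(pids)], check=False)

def inject_failures(duration_s: int = 3600, interval_s: int = 300) -> None:
    deadline = time.time() + duration_s
    while time.time() < deadline:
        time.sleep(interval_s)
        kill_random_worker()
```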

One of the ERP testing scenarios (running in parallel with other scenarios)
image

Configuration Performance Comparison


On top of the systems described above runs our internal tool, Configuration Performance Comparison (SEC), which allows us to compare the performance of:

  • different versions of the same configuration on the same platform version
  • two platform versions with the same configuration
  • different DBMS / OS versions with the same platform and configuration

The Configuration Performance Comparison system collects all the same parameters that are collected during regular load tests. It automatically detects the appearance of errors, changes in server load, and changes in query duration (as well as queries that did not exist before).

We analyze both degradations and improvements, since either can be a symptom of a problem.
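
A minimal sketch of the comparison idea: given per-query average durations from two runs, flag queries that are new in the second run and queries whose duration changed noticeably in either direction. The threshold and the data shapes are illustrative assumptions, not the real tool.

```python
# Minimal sketch of comparing two load-test runs: report queries that are new
# in the second run and queries whose average duration changed noticeably.
from typing import Dict, List, Tuple

THRESHOLD = 0.10  # flag changes of more than 10% (illustrative)

def compare_runs(baseline: Dict[str, float],
                 candidate: Dict[str, float]) -> Tuple[List[str], List[str]]:
    new_queries = [q for q in candidate if q not in baseline]
    changed = []
    for query, duration in candidate.items():
        if query in baseline and baseline[query] > 0:
            ratio = duration / baseline[query] - 1.0
            if abs(ratio) > THRESHOLD:          # both slowdowns and speedups
                changed.append(f"{query}: {ratio:+.0%}")
    return new_queries, changed

if __name__ == "__main__":
    base = {"SELECT orders": 120.0, "UPDATE stock": 80.0}
    cand = {"SELECT orders": 150.0, "UPDATE stock": 78.0, "SELECT prices": 30.0}
    print(compare_runs(base, cand))
    # -> (['SELECT prices'], ['SELECT orders: +25%'])
```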

The system can be used to compare:

  • configuration versions
  • platform versions
  • DBMS versions
  • any settings

As a result, we receive reports on the completed stress tests with detailed information and a comparison of performance and hardware load; the reports include DBMS query execution times, recorded exceptions, and so on.

By performance comparison we mean comparing the overall performance of the configurations, the average execution time, and the average APDEX value for each key operation.
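
For reference, APDEX for a key operation is computed from its measured times against a target time T using the standard formula (satisfied plus half of tolerating, divided by the total number of measurements); the numbers below are illustrative.

```python
# Standard APDEX calculation for one key operation: measurements within the
# target time T are "satisfied", within 4*T "tolerating", the rest
# "frustrated". The sample numbers are illustrative.
from typing import List

def apdex(durations_s: List[float], target_s: float) -> float:
    satisfied = sum(1 for d in durations_s if d <= target_s)
    tolerating = sum(1 for d in durations_s if target_s < d <= 4 * target_s)
    return (satisfied + tolerating / 2) / len(durations_s)

if __name__ == "__main__":
    # e.g. "post sales order" with a 2-second target time
    print(round(apdex([1.2, 1.8, 2.5, 3.0, 9.5], target_s=2.0), 2))  # 0.6
```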

Visual testing


All of the tools described above emulate the work of users by calling the appropriate methods of the built-in objects of the configurations under test, making calls to web and HTTP services, and so on. But it is also extremely important to test exactly what the user really sees, especially for users working through the web client (where rendering the interface in the browser can take quite a while). We have run into situations where performance according to the automated tests did not change after switching to a new version, but when we sat a person down with a stopwatch, they got one number on the old version and quite different numbers on the new one. This is due, in particular, to the rendering time of the graphical interface, which could change in the new version for one reason or another.
We wrote our own tool that allows visual testing of almost any application. The tool records the actions of the user who launched the application into a script file, together with images of the working area of the screen. When monitoring new client versions, the scripts are replayed without user participation. During replay, before simulating keystrokes or mouse clicks, the tool waits for the screen image to match the one captured in the recorded script, pixel for pixel.

The tool also measures application performance with an accuracy of 25 milliseconds and writes the results to a log for later automatic comparison. In some cases we loop parts of a script (for example, repeating order entry several times) to analyze degradation of the execution time. Besides measuring performance, this test also gives us confidence that in a new platform version the user will see the same screens in the thin client and in the browser as in the previous version of the application.
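
A much simplified sketch of the replay idea (our tool is a separate application, not this script): wait until a screen region matches a reference image pixel for pixel, then send the recorded click and log how long the wait took. It assumes the pyautogui and Pillow packages; the region, image name and coordinates are made up, and the reference image is assumed to have the same size as the region.

```python
# Much simplified sketch of script replay in a visual test: before sending
# the recorded click we wait until the screen region is pixel-identical to
# the reference image, and we log how long that took.
import time
from PIL import Image, ImageChops
import pyautogui

def wait_for_screen(reference_path: str, region, timeout_s: float = 30.0) -> float:
    """Return seconds waited until the region matches the reference exactly."""
    reference = Image.open(reference_path).convert("RGB")
    start = time.monotonic()
    while time.monotonic() - start < timeout_s:
        actual = pyautogui.screenshot(region=region).convert("RGB")
        if ImageChops.difference(reference, actual).getbbox() is None:
            return time.monotonic() - start
        time.sleep(0.025)   # poll roughly every 25 ms
    raise TimeoutError(f"screen never matched {reference_path}")

if __name__ == "__main__":
    waited = wait_for_screen("order_form.png", region=(0, 0, 1024, 768))
    print(f"form appeared after {waited:.3f} s")
    pyautogui.click(512, 400)   # replay the recorded click
```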

An example of running a script that enters an order in the Managing Our Company configuration: the order is entered 5 times. This is the real speed of the 1C: Enterprise platform when the user reacts instantly to the interface becoming available:


Functional testing


We are also actively developing functional testing. We test combinations of major OS and DBMS versions; each such combination has its own set of virtual machines, and the whole set of combinations forms one conveyor; adding new OS/DBMS combinations to this pipeline is automated. Each functional test turns into a set of tasks that run on all supported combinations; the tasks are picked up by the first free stands. We test the Configurator (the 1C application development environment), built-in language functions, the query language, and so on.
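
Expanding one test into tasks for every combination is straightforward to sketch; the OS and DBMS lists and the task structure below are illustrative, not our real scheduler.

```python
# Simple sketch of expanding one functional test into tasks for every
# supported OS/DBMS combination; the lists and task structure are illustrative.
from itertools import product

OSES = ["Windows Server 2019", "Ubuntu 22.04"]
DBMSES = ["MS SQL", "PostgreSQL", "IBM DB2", "Oracle", "file DBMS"]

def expand(test_name: str):
    """One functional test becomes a task per OS/DBMS combination."""
    return [
        {"test": test_name, "os": os_name, "dbms": dbms}
        for os_name, dbms in product(OSES, DBMSES)
    ]

if __name__ == "__main__":
    tasks = expand("query_language_smoke")
    print(len(tasks), "tasks queued")   # 2 OSes x 5 DBMSes = 10 tasks
```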

When testing the Configurator, we check most of the commands available in the Configurator's command-line mode. In addition, we have a special library (not shipped externally) that allows us to test the internal logic of the Configurator, normally accessible only through the user interface, without resorting to direct UI testing. In this way, most of the functionality for working with configuration extensions, the compare/merge functionality and other Configurator features are covered.
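
As an illustration of one command-line check (the chosen command and all paths here are just examples, not our actual test suite), a test can launch the Designer in batch mode against a file infobase and verify the exit code and the produced artifact.

```python
# Illustrative check of one Configurator command-line operation: launch the
# Designer in batch mode and verify that dumping the configuration succeeded.
# Paths are made up; DESIGNER, /F, /DumpCfg, /Out and /DisableStartupDialogs
# are documented batch-mode parameters of the 1cv8 executable.
import os
import subprocess

def dump_cfg(platform_exe: str, infobase_dir: str, cf_path: str, log_path: str) -> bool:
    result = subprocess.run([
        platform_exe, "DESIGNER",
        "/F", infobase_dir,            # file infobase under test
        "/DumpCfg", cf_path,           # dump the configuration to a .cf file
        "/Out", log_path,              # write Designer messages to a log file
        "/DisableStartupDialogs",
    ])
    return result.returncode == 0 and os.path.exists(cf_path)

if __name__ == "__main__":
    ok = dump_cfg(r"C:\Program Files\1cv8\8.3.20\bin\1cv8.exe",
                  r"C:\test\infobase", r"C:\test\dump.cf", r"C:\test\designer.log")
    print("DumpCfg passed" if ok else "DumpCfg failed")
```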

In this mode, test scripts can be written in the 1C language, and special test-oriented objects are available inside a script. Starting the Configurator in this mode can be combined in one test with launching the client application, which makes the mode useful not only for testing the Configurator itself but also as a way to set up the test environment.

Eating your own dogfood


We have a number of internal tools written on the 1C: Enterprise platform that we use in our daily work. They run on the latest builds of the platform. Below we describe two of them: the Task Base and Employee Reports.

Task Base


Our internal task tracker, the Task Base, is a configuration written on the 1C: Enterprise platform. It consists of 21 independent databases (some production, some test) on different platform versions, with different OSes and DBMSes; the databases are synchronized through the platform's data exchange mechanism. Platform versions are updated daily, and on some servers experimental platform builds with separate new features are installed, so new platform features can be tried out on the Task Base the very next day. Different database instances run in different server environments (OS, DBMS) and on different platform versions, and users connect from different clients (thin client, mobile client) and through the web client from different browsers. In this way, different platform versions are tested in different environments.

Employee Reports


Employee Reports is a time tracker used by the employees of the 1C: Enterprise platform development department. It runs on the latest build of the platform.

    "1C: Document Management"


We also run 1C: Document Management, which is used by all employees of our company, on new, not yet released versions of the platform.

Platform tests in application solutions


Along with automated visual tests of popular application solutions (Enterprise Accounting, Managing Our Company, Salary and HR Management, etc.), we carry out manual tests: scenario testing, visual testing, and manual testing of the main cases according to a test plan. Once the platform reaches a certain level of quality, we ask the developers of applied configurations to switch their development to the new platform version and to test their products on the version being prepared for release.

Platform beta testing by partners


Some of our partners are interested in using early, not yet released versions of the 1C: Enterprise platform. Such partners sign an NDA with 1C, get access to platform builds before the test release is published, and can use the latest platform version under real conditions. This lets partners detect platform problems at an early stage and be confident that those problems will be gone from the release version. We try to handle bug reports from such partners with high priority. By the way, if any reader of this article would like to take part in beta testing of the 1C: Enterprise platform, write to CorpTechSupport@1c.ru.

Plans


Our plans include a transition to Continuous Delivery, a practice in which the main build is always ready for release, in order to shorten the time from the end of development to release. To achieve this, we want to expand test coverage and further develop functional and load testing.