Performance Test Status - Magnetite'24

Written by Daví Alcoforado

Performance testing allows us to evaluate how our systems and applications perform under diverse conditions. This report provides an overview of the testing methodology and the results obtained across various scenarios.

Concurrent Users 🧑‍💻🧑‍💻🧑‍💻: The number of users online simultaneously.
Scenario Completion Time ⏱️: The allowed duration (in seconds) for a scenario to finish.
Test Iterations 🔁: The number of times the test is repeated or looped.

Methodology 📈

Our testing methodology involved collecting critical performance metrics: Total Samples, Average Response Time, Throughput, and Received KB/sec.
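To make these metrics concrete, here is a minimal sketch of how they could be computed with Python's standard library. This is not our actual test harness: TEST_URL, one_request, user_session, and run_scenario are illustrative names, and the scenario's time window is interpreted here as a ramp-up period over which user starts are spread (an assumption).

```python
# Illustrative sketch only; not the actual Magnetite test harness.
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

TEST_URL = "https://example.com/health"  # hypothetical endpoint


def one_request(url: str) -> tuple[float, int]:
    """Issue one request; return (elapsed_seconds, bytes_received)."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=30) as resp:
        body = resp.read()
    return time.perf_counter() - start, len(body)


def user_session(user_index: int, users: int, ramp_seconds: float,
                 loops: int) -> list[tuple[float, int]]:
    """One simulated user: wait for its ramp-up slot, then run `loops` requests."""
    time.sleep(ramp_seconds * user_index / users)  # spread user starts over the ramp
    return [one_request(TEST_URL) for _ in range(loops)]


def run_scenario(users: int, ramp_seconds: float, loops: int) -> dict:
    """Run one scenario and report the four metrics tracked in this report."""
    wall_start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=users) as pool:
        sessions = [pool.submit(user_session, i, users, ramp_seconds, loops)
                    for i in range(users)]
        samples = [s for f in sessions for s in f.result()]
    wall = time.perf_counter() - wall_start

    total = len(samples)                                      # Total Samples
    avg_ms = 1000 * sum(t for t, _ in samples) / total        # Average Response Time
    throughput = total / wall                                 # Throughput (req/sec)
    received_kb_s = sum(b for _, b in samples) / 1024 / wall  # Received KB/sec
    return {"total_samples": total,
            "avg_response_ms": round(avg_ms, 1),
            "throughput_rps": round(throughput, 2),
            "received_kb_s": round(received_kb_s, 2)}
```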

We conducted performance testing on two types of scenarios; an illustrative invocation of both profiles follows the list:

Simple Full Scenarios

  • Designed to handle 25 users within 10 seconds for 1 loop.

  • Focused on end-to-end scenarios, including screen access with relatively lighter loads.

Heavy Full Scenarios

  • Configured to accommodate 20 users within 10 seconds for 1 loop.

  • Focused on end-to-end scenarios involving access screens, loading extensive data, and applying heavy load.
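Under the same assumptions as the sketch above, the two scenario profiles might be invoked like this (the function and parameter names are illustrative, not our actual tooling):

```python
# Hypothetical invocation of the two profiles described above.
simple = run_scenario(users=25, ramp_seconds=10, loops=1)  # Simple Full Scenario
heavy = run_scenario(users=20, ramp_seconds=10, loops=1)   # Heavy Full Scenario
print("Simple Full:", simple)
print("Heavy Full:", heavy)
```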

Current Release Performance Testing Status 💻💪

The charts and tables below show the performance results of brains.app across releases:

Samples vs Average Response Time:

  • The sample count fluctuated over the years: a notable increase in the second half of 2019 relative to the first half of 2018, a decrease in the second half of 2022 relative to the previous period, another increase in the first half of 2023, and a decrease in the second half of 2023. In the first half of 2024, the sample count decreased slightly compared to the previous period, reflecting an adjustment in testing scope.

  • Average response time also varied across periods, with notable spikes in the second half of 2019 and the first half of 2020. Substantial improvements were observed in the first half of 2021 and continued into the second half of 2022, with further gains in both halves of 2023. Although average response time rose slightly in the first half of 2024, this is expected given the lower sample count, and ongoing optimization efforts have helped maintain performance standards.

It's important to note that the response-time results for 2024-H1 suggest optimization efforts have likely been implemented to keep performance within expected standards.

2024-H1 Compared to Previous Periods:

  • In 2024-H1, the sample count showed a slight decrease from 2023-H2, suggesting potential adjustments in testing strategy or focus, while average response time rose slightly.

  • Requests per second doubled, indicating notable improvements in system scalability and responsiveness. This increase reflects enhanced capacity to handle concurrent user requests efficiently.

  • Furthermore, received data rates surged compared to 2023-H2, showcasing the system's robust data processing capabilities and efficient handling of high data loads. This significant increase underscores the system's ability to manage data transfer tasks effectively.
