Working with a Security Application

About Client

The Client is a leading provider of next-generation endpoint protection, threat intelligence and related services. Their software enables customers to prevent damage from targeted attacks, and detects and attributes advanced malware and adversary activity in real time. Detection and attribution of malware is performed jointly by the client-side driver and by data processing in the cloud. The application is not a traditional antivirus; instead it complements antivirus software and fills the gaps left by its ‘pattern based detection’ approach.


Challenges

  • As with any security software that resides on the endpoint, the client is keen to identify performance impact before customers report such issues. Because the ‘Application Under Test’ (AUT) supports multiple operating systems and evolves constantly, another concern is quick and effective coverage across all supported platforms.
  • Another key interest is identifying the impact of running security software of different kinds on the same host. As the AUT uses behaviour/intelligence-based detection whereas traditional antivirus relies on pattern detection, the client wants to know whether the AUT falsely flags any trusted/reputed antivirus, or vice versa.
  • For benchmarking purposes, the client is also interested in using known exploits/tools to trigger detections and validating those detections against predefined rules.
  • Since events generated by the endpoint application are stored and processed by a cloud-hosted service, the client also wants functionality and usability validation of the end-user-facing UI of the cloud front-end.
  • The client’s Engineering team also wants all automated endpoint and cloud tests integrated into a single end-to-end validation flow.
  • The client is also interested in rolling out improved APIs to customers, which requires validation of various positive and negative scenarios.


Solutions

  • Performance Concerns – Musikaar engineers provided a robust, portable framework of automated tests that measures various metrics to accurately describe improvement or degradation in the AUT’s impact on core Operating System (OS) behaviour. To cover the supported OS platforms quickly, not only the execution but also the analysis of results was automated.
  • Antivirus Compatibility Concerns – The Musikaar QA team developed a comprehensive matrix of OS platforms, antivirus products and test cases to uncover critical compatibility issues that could arise from the presence of other antivirus software.
  • Front-end Validation – Musikaar’s manual QA devised functional tests to cover all UI features of the front-end. The same tests were then tweaked and automated by our SDETs to achieve functional coverage as well as capture response-time performance for the various queries that can be executed against the front-end.
  • Known Exploit Detection – Musikaar SDETs used Metasploit/Kali to generate various payloads and exploit known vulnerabilities to trigger detections. Predefined rules were then used to identify any false positives or false negatives, and the results were shared with Engineering.
  • End-to-End Automation – By integrating all the automated endpoint tests and front-end validation tests, SDETs delivered a portable suite that can be executed in almost any environment with very little prep work. This automated suite also doubles as an environment monitor, notifying engineers when certain services become unavailable or when new artefacts become available.
  • API Support – Our engineers wrote extensive cases covering the various positive and negative input scenarios and expected outputs for the newly developed APIs. These tests are executed continuously against the staging environment provided to QA.
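To illustrate the kind of automated performance measurement described under Performance Concerns, the sketch below times a simple file I/O workload and compares a baseline run against a run taken with the AUT installed. The workload, iteration counts and function names are illustrative assumptions, not the client’s actual harness:

```python
import statistics
import tempfile
import time


def timed_file_workload(iterations=5, size_kb=256):
    """Time a write/flush/read cycle; endpoint-agent overhead
    (file-system filtering, scanning) shows up as longer samples."""
    payload = b"x" * (size_kb * 1024)
    samples = []
    for _ in range(iterations):
        start = time.perf_counter()
        with tempfile.NamedTemporaryFile() as f:
            f.write(payload)
            f.flush()
            f.seek(0)
            f.read()
        samples.append(time.perf_counter() - start)
    return samples


def percent_delta(baseline, current):
    """Mean slowdown of `current` relative to `baseline`, in percent;
    positive means degradation, negative means improvement."""
    b = statistics.mean(baseline)
    c = statistics.mean(current)
    return (c - b) / b * 100.0
```

The same before/after pattern extends to boot time, process creation, network throughput and other OS metrics, with the comparison step feeding the automated result analysis mentioned above.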
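The rule-based validation described under Known Exploit Detection reduces to comparing the detections a predefined rule set expects for a given exploit run against the detections the AUT actually reported. A minimal sketch (detection identifiers and names are hypothetical):

```python
def classify_detections(expected, observed):
    """Compare rule-driven expectations (`expected`) against what the
    AUT reported (`observed`); both are sets of detection identifiers."""
    return {
        "true_positives": expected & observed,
        "false_negatives": expected - observed,  # exploit ran, nothing fired
        "false_positives": observed - expected,  # fired without an exploit
    }
```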
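The environment-monitoring role of the end-to-end suite can be sketched as a simple TCP reachability probe; host names and ports would be placeholders for the actual services:

```python
import socket


def service_reachable(host, port, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds within
    `timeout` seconds; used to flag services that have gone down."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

A scheduler would run such probes periodically and notify engineers on a False result; the availability of new artefacts can be polled in the same loop.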
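For the API validation work, positive and negative scenarios can be driven from a simple case table. The harness below is a stdlib-only sketch in which `call` would wrap an HTTP client pointed at the staging environment (all names are hypothetical):

```python
def run_api_cases(call, cases):
    """Run positive/negative API cases.

    `call` takes a params dict and returns a status code; `cases` is a
    list of (name, params, expected_status) tuples.  An exception raised
    by `call` is recorded as status None, so negative cases can expect
    a rejected request."""
    results = []
    for name, params, expected_status in cases:
        try:
            status = call(params)
        except Exception:
            status = None
        results.append((name, status == expected_status))
    return results
```

Running the table continuously against staging, as the suite above does, turns each row into a regression check on the new APIs.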

Required skill set

The client needed manual QA engineers and Software Development Engineers in Test (SDETs) with the following skills:

  • QA skills (Manual and Automation)
  • Hands-on experience of enterprise network and OS basics
  • Troubleshooting and debugging skills
  • Clear, well-presented reporting and analytical skills
  • Hands-on experience of PowerShell and/or Python
  • Exposure to security domain and anti-malware products

Communication Mode

To go over requirements, progress and updates, we used the following modes of communication:

  • Daily email reports as well as milestone based reports
  • Bi-weekly calls and WebEx sessions
  • All sensitive information was transmitted using TLS encrypted file transfer or encrypted emails

Tools & Technologies Used

  • Microsoft Windows 7/8.1/2008 R2/2012 R2, Apple OS X 10.9/10.10 – Target configurations
  • VMware ESXi & CloneZilla – Test-bed preparation and maintenance
  • Python, PowerShell, batch scripts, Selenium, Apache JMeter, MS SQL Express – Functional and performance test automation
  • Metasploit & Kali – For exploit validation
  • Various enterprise application and antivirus software – For compatibility validation

