«System integrator is a centre of expertise that implements technical solutions of data monitoring and analysis systems»

Stanislav Prishchep
Head of IT Security Systems

STEP LOGIC has been developing technical solutions for data monitoring and providing after-sale services in this domain for more than 20 years. The company has implemented a number of major projects, such as a cloud service monitoring centre for a telecoms operator and a SOC in a large federal commercial bank that provides an internal service for the bank's subsidiaries. STEP LOGIC dynamically develops and implements solutions based on machine data analysis platforms. Stanislav Prishchep, Head of IT Security Systems at STEP LOGIC, tells us about this.

– Stanislav, what tools are currently available for monitoring infrastructure, applications and business processes?

– Modern enterprises and organizations operate in a highly automated IT environment, and data monitoring and analysis tools are used across many of its domains. They not only perform auxiliary tasks, such as monitoring IT infrastructure and IT security, but are also used for profit-making. Vivid examples are online services for analysing purchasing power or tracking transport.

There are many developers of machine data monitoring and analysis tools. You can divide them into groups based on the implementation approach and customization options.

The first group consists of mono-functional boxed solutions that perform a specific application task, for example, monitoring information security, the network, or performance.

The second group is monitoring solutions based on the Hadoop ecosystem that can perform several application tasks simultaneously. They allow deep data analysis using different algorithms and analysis methods. However, implementing such a solution takes months, and extracting the necessary information from the data requires the help of data scientists and data engineers. Therefore, the Hadoop ecosystem is most often used for large-scale or highly specialized tasks.

The third group is machine data analysis platforms (Splunk, Elastic) that allow users and engineers to independently extend the application functionality. In this case, a long implementation is not required, as the necessary components have already been integrated with each other by the vendor. Because a single software platform serves several application tasks at once, you can avoid the data duplication and irrational use of computing resources that occur when several mono-functional tools are used. Therefore, in my opinion, solutions based on machine data analysis platforms are now the most promising for wide application.

– What is well-organized monitoring? Does it largely depend on human participation, or is it more important to choose the right automation tools? What skills should the specialist responsible for monitoring have?

– First of all, organizing monitoring, like any other workflow, requires a well-coordinated and competent team of specialists and managers, and, secondly, high-quality technical means to automate their work. Organizational documents may also be required to ensure the proper quality of the process.

Technical means of automating data analysis are steadily expanding their functionality and significantly improve the efficiency of the monitoring process, including through the use of artificial intelligence technologies. But their capabilities are limited and still far from human ones. That is why the key role in this process belongs to the specialist who monitors, uses the system, adapts it, defines the rules, and analyses the results.

For an analyst who uses automated data analysis tools, it is important to:

  • understand the subject area and the capabilities of the automation software they use;
  • be able to track the monitored area and set scenarios and rules to detect incidents;
  • know which mathematical algorithms and data can be used to identify incidents and malfunctions.
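As a minimal illustration of the last point, here is a hedged sketch of one such mathematical rule: a z-score threshold over a metric series. The metric, data, and threshold value are invented for illustration and are not tied to any particular monitoring product.

```python
from statistics import mean, stdev

def detect_anomalies(values, threshold=2.0):
    """Flag points whose z-score exceeds the threshold.

    A simple illustrative rule: a value becomes an incident
    candidate when it deviates from the sample mean by more
    than `threshold` standard deviations.
    """
    if len(values) < 2:
        return []
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:
        return []
    return [i for i, v in enumerate(values)
            if abs(v - mu) / sigma > threshold]

# Example: a latency series with one obvious spike at index 5.
latencies = [102, 98, 101, 99, 100, 450, 103, 97]
print(detect_anomalies(latencies))
```

In practice, such rules are configured inside the monitoring platform rather than written by hand, but the underlying statistics are often this simple.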

– What is the role of the integrator in this process? When is the integrator's help irreplaceable?

– A system integrator is a centre of expertise that implements technical solutions for data monitoring and analysis systems. The integrator is often involved in creating a centre of expertise with monitoring tools for a client, as well as providing technical support and maintenance of the created systems.

In addition, integrators can act as providers of managed services based on their expertise and hosted monitoring tools. For example, they help monitor the performance of the client's information systems and identify hacker attacks and other IT security incidents.

– What experience does your company have in this domain? How long have you been working in it, what achievements are you proud of, and what unique technologies do you have?

– STEP LOGIC has been developing technical solutions for data monitoring and providing after-sale services in this domain for more than 20 years. The company has implemented a number of major projects, such as a cloud service monitoring centre for a telecoms operator and a SOC in a large federal commercial bank that provides an internal service for the bank's subsidiaries.

Our company dynamically develops and implements solutions based on machine data analysis platforms. One of them is a solution for information security monitoring that can be adapted to cyber-security centres of any scale and functional focus. It implements a unique logic for chaining incidents: different security events are aggregated and analysed not separately, but as a sequence of malicious actions within the system. This reduces the financial and time costs for analysts to assess the situation and respond to incidents. The solution is currently undergoing testing and trial operation.
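The chaining idea described above could be approximated, in heavily simplified form, as grouping events per host into time-windowed sequences. This is only an illustrative sketch, not the actual product logic; the field names (`host`, `ts`, `kind`) and the window value are assumptions.

```python
from collections import defaultdict

def chain_events(events, window=300):
    """Group security events into per-host chains.

    Events on the same host are linked into one chain while the
    gap between consecutive events stays within `window` seconds;
    each chain can then be analysed as one sequence of actions
    rather than as isolated alerts.
    """
    chains = []
    by_host = defaultdict(list)
    for ev in sorted(events, key=lambda e: e["ts"]):
        by_host[ev["host"]].append(ev)
    for host, evs in by_host.items():
        current = [evs[0]]
        for prev, ev in zip(evs, evs[1:]):
            if ev["ts"] - prev["ts"] <= window:
                current.append(ev)
            else:
                chains.append(current)
                current = [ev]
        chains.append(current)
    return chains

events = [
    {"host": "srv1", "ts": 0,    "kind": "login_fail"},
    {"host": "srv1", "ts": 60,   "kind": "login_ok"},
    {"host": "srv1", "ts": 120,  "kind": "priv_esc"},
    {"host": "srv2", "ts": 50,   "kind": "port_scan"},
    {"host": "srv1", "ts": 9000, "kind": "login_fail"},
]
for chain in chain_events(events):
    print([e["kind"] for e in chain])
```

A real correlation engine would also link events across hosts and weight them by severity; the point here is only that a chained sequence (failed login, successful login, privilege escalation) carries far more meaning than the same three alerts viewed in isolation.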

In addition to full-fledged developments, we use data analysis platforms in projects where other monitoring software products are either redundant or lack the needed functionality. For example, we provide control of individual metrics and events to detect and quickly analyse malfunctions of a given subsystem, and we aggregate data and extend the functionality of "boxed" software products and business applications. Thus, a specific business task is solved, and the client gets a clear result without purchasing an expensive software product whose functionality would be used only partially.

– Tell us about an interesting and illustrative project.

– Last year, our team implemented a project to create a vulnerability management system for one of the largest private airlines in Russia. Its goal was to increase the transparency of the information security vulnerability management process. We integrated a machine data analysis platform with multiple vulnerability scanners, a vulnerability database, a request management system and a SIEM system.
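As an illustration of what such an integration layer typically does, the sketch below normalizes findings from two hypothetical scanners to one common schema and deduplicates them by host and CVE. All scanner names, schemas, and field names here are invented for illustration and are not those of the airline project.

```python
def normalize(scanner_name, finding):
    """Map a scanner-specific finding to one common schema.

    Illustrative per-scanner field mappings; real scanners each
    have their own export formats.
    """
    if scanner_name == "scanner_a":
        return {"host": finding["ip"], "cve": finding["cve_id"],
                "severity": finding["risk"]}
    if scanner_name == "scanner_b":
        return {"host": finding["target"], "cve": finding["vuln"],
                "severity": finding["score"]}
    raise ValueError(f"unknown scanner: {scanner_name}")

def deduplicate(findings):
    """Keep one record per (host, cve), preferring higher severity."""
    best = {}
    for f in findings:
        key = (f["host"], f["cve"])
        if key not in best or f["severity"] > best[key]["severity"]:
            best[key] = f
    return list(best.values())

raw = [
    ("scanner_a", {"ip": "10.0.0.5", "cve_id": "CVE-2021-44228", "risk": 9.8}),
    ("scanner_b", {"target": "10.0.0.5", "vuln": "CVE-2021-44228", "score": 9.0}),
    ("scanner_a", {"ip": "10.0.0.7", "cve_id": "CVE-2019-0708", "risk": 9.8}),
]
merged = deduplicate(normalize(s, f) for s, f in raw)
print(len(merged))  # two unique (host, cve) pairs remain
```

Deduplicated records like these are what the platform would forward to the request management system as remediation tickets.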

As a result, we were able to reduce the scan cycle from the standard three months to one month, improve timeliness, and reduce information security risks, while the IT and information security departments gained the ability to monitor and analyse the KPIs of the process.

– What is your expert opinion on the prospects for the development of IT monitoring and the growth of its importance for companies in general and for other categories of users?

– IT monitoring has great prospects as a crucial tool for automating decision-making in the era of digitalization. The more work processes are automated, the more digital information becomes available for automatic analysis. Therefore, technical solutions that allow you to process large amounts of data and make the right management decisions are very important and in demand. In my opinion, the most important technologies driving this domain and providing deep, high-quality data analysis are machine learning and artificial intelligence. Systems based on them can learn from previously identified problems, transfer this knowledge, and independently adapt to the infrastructure.

The use of monitoring tools is no longer an auxiliary but a mandatory part of operating IT infrastructure and automating workflows, and a necessary control tool in any application area, from technological processes to business operations and marketing campaigns.

– Thank you for the interview!

By Anna Tumakova.
Source: spbit.ru
