If the press and analysts are to be believed, our IT landscape will look very different in the not-so-distant future. Of course, this is also a recurring topic in IT monitoring. How do you maintain the balance between media noise and what works in practice? What do customers need now, what will they need in a year, and what role does IT monitoring play?
Cloud – still a trend or old hat?
The cloud is no longer a “trend” as such; it has been around for some time, and it received another huge boost from Covid-19 and the resulting shift to home office. I don’t think the pandemic caused a fundamental paradigm shift: the cloud was already dominating the headlines before 2020 and has simply moved a little more into focus because of Covid-19.
While cloud penetration is widespread, local IT still exists and has not lost its relevance – even analysts and journalists who announced the end of the administrator only a few years ago have come around and now talk about hybrid environments: the combination of public cloud services, private cloud and local IT. We see this too – with our customers and with our own IT. Some services are consumed from the cloud – Office 365, the CRM system, storage systems and so on – and their data is stored in the public cloud. The private cloud plays an important role for many companies when it comes to storing sensitive data or running services that cannot or should not be moved to the public cloud. Finally, for most companies, local IT also plays a far greater role than simply providing Internet access: sensitive data is stored locally, applications run in the company’s own data center and, of course, there is the local infrastructure itself.
Monitoring has to support ITOps across all of this. Local IT, of course, must not be overlooked, but that is the bread-and-butter business of most established monitoring solutions. In addition, cloud applications and cloud providers should be included in central monitoring by querying the appropriate interfaces. This means that monitoring solutions need to be constantly expanded and adapted.
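To illustrate the principle of querying such a cloud interface, here is a minimal sketch assuming an AWS environment and the boto3 SDK; the instance ID and region are placeholders, and a real setup would forward the values to the central monitoring solution rather than print them.

```python
# Minimal sketch: pull a metric from a public-cloud interface (here: AWS
# CloudWatch via boto3) so it can be fed into central monitoring.
# Instance ID, region and metric choice are placeholders.
from datetime import datetime, timedelta, timezone

import boto3

cloudwatch = boto3.client("cloudwatch", region_name="eu-central-1")

now = datetime.now(timezone.utc)
response = cloudwatch.get_metric_statistics(
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],
    StartTime=now - timedelta(minutes=15),
    EndTime=now,
    Period=300,                # 5-minute aggregation
    Statistics=["Average"],
)

for point in sorted(response["Datapoints"], key=lambda p: p["Timestamp"]):
    # In practice, this value would be pushed into the central monitoring
    # solution instead of being printed.
    print(point["Timestamp"], round(point["Average"], 2), "% CPU")
```

The same pattern – authenticate, query the provider’s API, hand the values to the central system – applies to most SaaS and cloud services, which is why monitoring solutions keep adding such integrations.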
AI and network monitoring: machine learning or just smart algorithms?
AI has become increasingly prevalent in recent years, in part due to the development of cloud technologies. However, what is marketed as “real” AI in the form of machine learning often still turns out to be a set of more or less complex algorithms. There are endless possibilities, but also a lot of noise and hype.
AI can play an important role in network monitoring: monitoring collects huge amounts of data, and the combination of cloud and AI or machine learning is predestined to analyze this data and recognize patterns. This can mean anomaly detection and improved root cause analysis, but also trend analysis and predictive maintenance. At the moment, it is mainly newer, cloud-based monitoring solutions aimed at the enterprise market that are pushing AI. These focus mostly on security or application performance monitoring based on advanced traffic analysis, i.e. highly specialized solutions aimed at seasoned specialists.
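As a rough illustration of the difference between machine learning and a “smart algorithm”, the sketch below flags anomalies in bandwidth readings with a simple rolling z-score – no learning involved; the sample values are made up.

```python
# Minimal sketch of anomaly detection on monitoring data: a rolling z-score,
# i.e. a "smart algorithm" rather than real machine learning.
import statistics

WINDOW = 12        # number of past samples to compare against
THRESHOLD = 3.0    # flag values more than 3 standard deviations from the mean

# Invented bandwidth samples in Mbit/s; the spike at index 12 is the anomaly.
bandwidth_mbit = [92, 95, 90, 97, 94, 91, 96, 93, 95, 92, 94, 96, 240, 95, 93]

for i in range(WINDOW, len(bandwidth_mbit)):
    window = bandwidth_mbit[i - WINDOW:i]
    mean = statistics.mean(window)
    stdev = statistics.stdev(window)
    z = (bandwidth_mbit[i] - mean) / stdev if stdev else 0.0
    if abs(z) > THRESHOLD:
        print(f"sample {i}: {bandwidth_mbit[i]} Mbit/s looks anomalous (z={z:.1f})")
```

Machine-learning approaches replace the fixed window and threshold with models trained on historical data, which helps with seasonal patterns and correlated metrics – but for many everyday cases, simple statistics already go a long way.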
In the future, classic broad-based monitoring solutions will certainly become smarter in one form or another. Whether that will be machine learning or just smart algorithms remains to be seen. Of course, we at Paessler are interested in the potential of AI, but it cannot be applied for its own sake; it has to offer added value to the classic IT administrator or ITOps teams.
SDx – everything defined by software
What started as the software-defined network has now become SDx – software-defined everything: storage, WAN, LAN, radio, data center and more. While devices used to become ever more intelligent and therefore more sophisticated, that intelligence is now increasingly outsourced to a software layer. The advantages are obvious: the “dumber” and therefore cheaper devices are configured and controlled centrally. Hardware acquisition costs are lower, and there is less configuration and management effort.
While more and more intelligent devices used to have to be integrated into the monitoring system separately and at considerable cost, this has changed somewhat with SDx: by connecting to the software layer, many values can be queried directly there. In parallel, additional factors such as the underlying hardware, traffic and environmental infrastructure can be monitored through a comprehensive monitoring solution, providing an overall picture of the entire SDx environment. In the end, not much has changed for comprehensive monitoring solutions: there is a new IT component that usually brings its own monitoring capabilities, which in turn can be integrated into an overarching monitoring concept.
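As a sketch of what querying the software layer can look like, the snippet below reads edge status from a hypothetical SD-WAN controller REST API; the URL, endpoint and JSON fields are placeholders, since every vendor exposes its own (but broadly comparable) interface.

```python
# Minimal sketch: query the software layer of an SDx environment via a REST
# API and correlate the result with classic device monitoring.
# Controller URL, endpoint and field names are hypothetical placeholders.
import requests

CONTROLLER = "https://sdwan-controller.example.com/api/v1"
TOKEN = "replace-with-api-token"

resp = requests.get(
    f"{CONTROLLER}/edges/status",
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=10,
)
resp.raise_for_status()

for edge in resp.json().get("edges", []):
    # The controller already aggregates health data for the "dumb" devices;
    # the monitoring solution only needs to read it and correlate it with the
    # hardware, traffic and environmental metrics it collects itself.
    print(edge["name"], edge["state"], f'{edge["latency_ms"]} ms')
```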
Digitization (IoT, Industry 4.0 …)
Digitization is often seen only as a marginal phenomenon, as it involves areas that have traditionally been separate from IT – such as manufacturing facilities, medical devices and infrastructure, or building technology. However, as digitization generates more and more data, the task of transporting, storing and processing that data falls to IT. In other words, sooner or later digitization will have a significant impact on IT.
Of the four trends listed, digitization has the least to do with traditional IT, yet it is the one that will ultimately have the strongest impact on it. With regard to monitoring, the goal is to identify immediately where a problem lies in the event of a fault and whether data transport on the network is obstructed. In a hospital context, for example, this only works if the monitoring solution can monitor both the IT and the medical infrastructure. However, there are no comprehensive monitoring systems in the medical sector: the communication servers commonly used there have no monitoring functionality, or only a basic one, and certainly no ambition to extend it to diverse and very complex IT environments. Comprehensive monitoring solutions therefore have to come from the IT side.
The future of monitoring
In the future, SDx, cloud and digitization will be standard. Broad-based monitoring solutions will cover everything from IT to manufacturing to intelligent building or medical technology, providing a central overview as well as dashboards for experts. In the long run, the topic of AI will also become relevant for comprehensive solutions. Monitoring produces huge amounts of data, which currently provides information about the current status of the monitored components. With the help of artificial intelligence, it will also become possible to predict emerging trends, in areas such as predictive maintenance, best practices, unusual behavior and intrusion detection.
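As a very simple illustration of such trend prediction, the following sketch fits a straight line to invented daily disk-usage readings and estimates when the volume would fill up; real solutions would work on much larger histories and more robust models.

```python
# Minimal sketch of trend prediction on monitoring data: a least-squares line
# fitted to daily disk-usage readings to estimate when the volume fills up.
# The usage values are made up for illustration.
usage_percent = [61.0, 61.8, 62.9, 63.5, 64.6, 65.2, 66.3]   # one sample per day
days = list(range(len(usage_percent)))

n = len(days)
mean_x = sum(days) / n
mean_y = sum(usage_percent) / n
slope_num = sum((x - mean_x) * (y - mean_y) for x, y in zip(days, usage_percent))
slope_den = sum((x - mean_x) ** 2 for x in days)
slope = slope_num / slope_den
intercept = mean_y - slope * mean_x

# Extrapolate: on which day does the trend line cross 100 % usage?
days_until_full = (100.0 - intercept) / slope - days[-1]
print(f"Growing ~{slope:.2f} %/day; volume full in about {days_until_full:.0f} days")
```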
Staying ahead of the curve
Monitoring is essentially a technology follower, not a driver. What dominates the media headlines and the focus of analysts today will be put into practice by a few companies tomorrow and by most companies the day after. That is why broad and reliable support for established technologies is the clear priority.
Of course, there is always a need for innovative monitoring tools that serve cutting-edge technology. But most IT administrators and ITOps teams live in the here and now and must face the challenges of the present. Broad-based monitoring solutions such as PRTG must first and foremost meet these challenges and evolve with the needs of their customers. At the moment, this means comprehensive features for monitoring classic (and hybrid) IT environments, integrating cloud services and SDx providers, and increasingly integrating the protocols of digitization.
Martin Hodgson, Country Manager, Paessler AG