Some time ago I published an article about how to store the NetEye SMS Protocol log in an ELK environment. After using it for a while, I discovered that it was not completely correct, as the time/date handling in the Logstash filters is a bit more complicated than expected. In particular, the date was written in the SMS protocol file like this:
Wed Jun 29 10:30:22 CEST 2016
And we used this Logstash date filter to convert it:
date {
  locale => "en"
  match => [ "sms_timestamp_text", "EEE MMM dd HH:mm:ss" ]
}
It seemed to work, but after some time (a few days, once the next month started) we discovered that on the first days of a month the date looks like this:
Fri Jul 1 10:30:22 CEST 2016
Since the log contains a textual time zone, which the date filter does not support, the first draft used this grok rule to extract sms_timestamp_text:
match =>[ "message", "%{SMS_TIMESTAMP_SHORT:sms_timestamp_text}
%{WORD:timezone} %{YEAR}:%{INT:sms_phonenumber}:%{GREEDYDATA:sms_text}"
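SMS_TIMESTAMP_SHORT is a custom grok pattern whose definition is not part of this post; assuming the raw timestamp looks like "Wed Jun 29 10:30:22", a hypothetical definition in a custom patterns file (loaded via patterns_dir), built from the standard DAY, MONTH, MONTHDAY and TIME patterns, could look like this (the "+" tolerates the extra space some systems put in front of a single-digit day):

SMS_TIMESTAMP_SHORT %{DAY} %{MONTH} +%{MONTHDAY} %{TIME}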
We discovered that our date filter no longer worked, because we had used "dd" for a two-digit day. So how can we handle a date that is matched neither by "d" alone nor by "dd" alone? After studying the filter documentation I found the solution: the date filter accepts multiple patterns ("or" rules), so it can match more than one date format. We therefore changed the filter like this:
date {
  locale => "en"
  match => [ "sms_timestamp_text", "EEE MMM dd HH:mm:ss Z yyyy", "EEE MMM d HH:mm:ss Z yyyy" ]
}
Note the new Z and yyyy parts of the pattern: if we do not match the complete date string, the filter will not work correctly. To feed it the complete date, the grok pattern also had to change, like this:
match => [ "message", "%{SMS_TIMESTAMP:sms_timestamp_text}:%{INT:sms_phonenumber}:%{GREEDYDATA:sms_text}" ]
As I mentioned earlier, Logstash cannot parse textual time zones, but that is exactly what we have here. What can we do? We know that our dates are always in the Western European time zone, so we can solve the problem with a mutate filter like this:
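The original snippet is not reproduced here; a minimal sketch of this approach, assuming the textual zone is always CEST or CET, is a gsub that rewrites the zone name into a numeric offset that the "Z" pattern can parse (this mutate must run before the date filter):

mutate {
  # replace the textual time zone with a fixed numeric offset
  gsub => [
    "sms_timestamp_text", "CEST", "+0200",
    "sms_timestamp_text", "CET", "+0100"
  ]
}

After this substitution the timestamp reads e.g. "Fri Jul 1 10:30:22 +0200 2016", which matches the "EEE MMM d HH:mm:ss Z yyyy" pattern shown above.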
Author
Juergen Vigna
I have over 20 years of experience in the IT branch. After first experiences in the field of software development for public transport companies, I finally decided to join the young and growing team of Würth Phoenix. Initially, I was responsible for the internal Linux/Unix infrastructure and the management of CVS software. Afterwards, my main challenge was to establish the now well-known IT System Management Solution WÜRTHPHOENIX NetEye. As a Product Manager I started building NetEye from scratch, analyzing existing open source models, extending and finally joining them into one single powerful solution. After that, my job turned into a passion: constant development, customer installations and support became a personal matter. Today I use my knowledge as a NetEye Senior Consultant as well as NetEye Solution Architect at Würth Phoenix.