Using Macros in a Log Parser

Started by bquiggle, December 28, 2020, 10:27:21 PM

bquiggle

I am trying to create a log parser that will alert me when it detects the string "***" in any of a series of log files. I have managed to get it working if I point it specifically at a single log file, but when I try to use a macro to have it parse all of the files in a folder, I am running into trouble. My parser code is as follows:


<parser trace="0" name="*** Monitor">
   <file>D:\program\log\@{log}</file>
   <rules>
      <rule name="Log Monitor">
         <match repeatCount="0" repeatInterval="1">\*\*\*</match>
         <event>LOGWATCH_TEST</event>
         <logName></logName>
      </rule>
   </rules>
   <macros>
      <macro name="log">.*\.log</macro>
   </macros>
</parser>


Can anyone help me understand what I'm doing wrong here? Part of the trouble is that when I make XML edits, the system automatically reverts or changes some of them even after I save. For instance, it keeps moving the "macros" section below the rules section, but the NetXMS documentation specifies the macros section should come first. It also keeps adding the "logName" tag below the event tag. I don't know what this tag is, and even when I remove it, it comes back. Is this intended behavior?

Victor Kirhenshtein

Hello!

Currently the log parser cannot expand a single <file> entry into multiple files. You can use timestamp macros or external scripts to monitor files with changing names, but not a changing set of files simultaneously.
You can safely ignore the <logName> tag. It is a purely cosmetic issue - instead of omitting it altogether when not set (as with other tags), the code that generates the XML always puts it in. This tag can be used to specify a Windows event log name, to filter by a specific log (useful for Windows event synchronization).
The macros section moving to the bottom is a bug - we will fix it.

Best regards,
Victor

bquiggle

Thank you for the response. That explains the trouble I've been having! So, essentially, I would need to create a script that generates a list of log files to parse? What is the syntax for inserting the output of that script into the parser? Would it be best to create the script somewhere within NetXMS and point the parser to it, or does it need to be done externally using PowerShell or a batch file?

Filipp Sudanov

Hi!

It can be a script, but only one filename in its output is accepted. Usually, when we need to monitor an application, that application writes data to only one log file at a time. If we need to monitor several applications, we specify several files or several log parser definitions.
The documentation was updated with possible ways of using macros and scripts to get the file name - https://www.netxms.org/documentation/adminguide/log-monitoring.html#file-tag
For a script, you just put the script name in backticks, like: `/path/script.sh`
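A minimal sketch of such a script for the backtick syntax above, assuming a POSIX shell on the agent side (the directory is illustrative - the important part is that the script prints exactly one file name, here the most recently modified *.log):

```shell
#!/bin/sh
# Print exactly one file name: the newest *.log in the log directory.
# LOG_DIR default is an illustrative path - adjust to your setup.
LOG_DIR="${LOG_DIR:-/var/log/myapp}"
ls -t "$LOG_DIR"/*.log 2>/dev/null | head -n 1
```

Whatever single path this prints is what the parser will open.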

bquiggle

Thanks. That makes sense. I was able to get it monitoring a couple different dynamically changing files using time/date macros and parsing the files separately.
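For reference, the shape I ended up with was roughly the following (the file name and the %Y%m%d date macro are illustrative - check the file tag documentation for the exact macro syntax your version supports):

```xml
<parser trace="0" name="*** Monitor">
   <!-- date macros in the file name follow the day's rotated log -->
   <file>D:\program\log\app-%Y%m%d.log</file>
   <rules>
      <rule name="Log Monitor">
         <match>\*\*\*</match>
         <event>LOGWATCH_TEST</event>
      </rule>
   </rules>
</parser>
```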

I did have a couple of follow up questions that I haven't been able to find a solution to:

I currently have the parser and actions set up to monitor the logs for a specific string; if it finds it, it emails me the line from the log containing the string. If the string occurs twice in immediate succession, it sends me two separate emails. Is there a good way to group event notification emails when the event occurs multiple times in quick succession?

How would I set up the parser to trigger an event if the log hasn't been written to after a set amount of time? Is this possible?

Victor Kirhenshtein

Quote from: bquiggle on January 05, 2021, 06:26:09 PM
I currently have the parser and actions set up to monitor the logs for a specific string; if it finds it, it emails me the line from the log containing the string. If the string occurs twice in immediate succession, it sends me two separate emails. Is there a good way to group event notification emails when the event occurs multiple times in quick succession?

You can do that with scripts in the event processing policy. When the event is processed, update some key in persistent storage with the current timestamp, and add an additional script filter which compares that key to the current timestamp and only allows further event processing if the previous event is far enough in the past. See the screenshot for an example (it will send a notification for SNMP_EGP_NEIGHBOUR_LOSS only if it comes more than 10 minutes after the previous alert).
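A rough NXSL sketch of such a filtering script (the key name and the 600-second window are example choices; this assumes the string value read back from persistent storage converts implicitly in the numeric comparison):

```
// Illustrative throttling filter for an event processing policy rule.
key = "LogAlert.LastNotification";
last = ReadPersistentStorage(key);
now = time();

if (last != null && now - last < 600)
   return false;   // previous alert too recent - suppress this one

WritePersistentStorage(key, now);
return true;       // allow further processing (send the e-mail)
```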

Quote from: bquiggle on January 05, 2021, 06:26:09 PM
How would I set up the parser to trigger an event if the log hasn't been written to after a set amount of time? Is this possible?

Using the parser it can be done only in a very complex way - you can match each line and generate an event, set a timestamp from that event, and have a scheduled task or DCI check how old that timestamp is.
Alternatively you can use a DCI to read the log file modification time and check how old it is (assuming the modification time is updated after each write, which may not always be the case if the application keeps the log file open).
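The DCI alternative could look roughly like this - the File.Time.Modify parameter name and the file path are assumptions to verify against your agent's parameter list; the transformation script is NXSL:

```
// DCI origin: NetXMS Agent
// Parameter (assumed name): File.Time.Modify(D:\program\log\app.log)
// Transformation script - convert the raw modification time (epoch
// seconds in $1) into an age in seconds:
return time() - $1;
// Then set a threshold such as "last value > 3600" to fire an event
// when the log has not been written to for an hour.
```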

Best regards,
Victor

bquiggle

The filtering script would completely suppress additional alerts though, correct? I still want to see every matched string, because they will contain unique information, but if two of them occur in the same log file back-to-back, I'd like them grouped into a single email.

I will have to do testing with the DCI checking log file modification time and see if that works for me.