
I have a script where all of the logging (output, errors, etc.) is captured in a single log file each day.
This script can be called more than 10k times a day, with a maximum of 3 calls running simultaneously.

The problem I'm facing at the moment is that when calls execute simultaneously, their output sometimes gets interleaved in the same log file.
I was already looking into options like flock and semaphores to solve this, but locking could queue up execution of the process itself, which is not desired in my case.

So I was wondering: is it possible to do the logging in some other kind of process, one that only writes to the log file when no other process is writing? The result would be a 'clean' log file, while calls to the script keep running as they do now, without any delay.
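For reference, the flock approach I looked at would only have to hold the lock for the duration of a single write, not the whole script run — a minimal sketch, assuming a Linux system with util-linux flock and a hypothetical log path /tmp/myscript.log:

```shell
#!/bin/sh
# Hypothetical daily log file; adjust to your setup.
LOGFILE=/tmp/myscript.log

log() {
    # Open the log file on fd 9 in append mode, take an exclusive
    # lock only while this one line is written, then release it
    # when fd 9 is closed at the end of the group.  Concurrent
    # callers block per message, not for the life of the script.
    {
        flock -x 9
        printf '%s %s\n' "$(date '+%F %T')" "$*" >&9
    } 9>>"$LOGFILE"
}

log "processing started"
```

With the lock scoped to one printf, the queueing cost is a single line write, which is usually negligible even at 10k calls a day.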

MC68020
NeG

1 Answer


Why don't you just use the logger command, which should be available on most Linux distros? It is as easy as:

logger "this is my log message"

By default, it should send your logs to /var/log/syslog.

You can call this from your script, which will then generate a log entry via your OS's standard logging daemon and machinery, which handle all the concurrency issues you are trying to manage.
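A short sketch of what that could look like in your script — the tag "myscript" is a hypothetical name of your choosing, used later to identify and filter the entries:

```shell
#!/bin/sh
# Hypothetical tag identifying this script in the system log.
TAG=myscript

# Ordinary informational entry; $$ records which invocation wrote it.
logger -t "$TAG" "run started (pid $$)"

# ... the actual work of the script ...

# Errors can be sent at a higher priority (facility.level) so the
# syslog daemon can route or filter them separately.
logger -t "$TAG" -p user.err "something went wrong"
```

Because each call hands one complete message to the logging daemon, concurrent invocations cannot interleave partial lines the way direct file writes can.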

Check the logger man page for details; there are a lot of options you can tweak, including log levels, tags, etc.

You can also configure your system logger (typically syslog) to handle these logs in a special way, including log rotation, compression, filenames, and so on.
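For example, if your distro uses rsyslog, a drop-in file could route everything tagged "myscript" (a hypothetical tag, matching whatever you pass to logger -t) into its own file — a sketch, not a drop-in you can use unmodified:

```
# /etc/rsyslog.d/myscript.conf  (hypothetical path and tag)
# Send entries tagged "myscript" to a dedicated file, then stop
# processing so they do not also land in /var/log/syslog.
if $programname == 'myscript' then /var/log/myscript.log
& stop
```

You would then point logrotate at /var/log/myscript.log for daily rotation and compression.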

Brad