ProcessEventLog

The ProcessEventLog table contains all log messages for Deephaven workers and query or merge servers, including worker output, which is useful for investigating behavior and diagnosing failures or crashes. Filter by the Process or ProcessInfoId column to retrieve the rows of interest, and sort by the Timestamp column to view the data in order; otherwise, data captured from workers' stdout and stderr may appear out of order.
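The sort matters because stdout and stderr are captured as separate streams. As a toy illustration in plain Python (not the Deephaven query API), sorting on the timestamp restores event order:

```python
# Simulated ProcessEventLog rows as (Timestamp, Level, LogEntry) tuples.
# stdout and stderr are captured as separate streams, so their rows can
# land in the table interleaved out of chronological order.
rows = [
    ("2024-01-01T10:00:02", "STDERR", "Exception in query"),
    ("2024-01-01T10:00:00", "STDOUT", "Worker starting"),
    ("2024-01-01T10:00:01", "STDOUT", "Query initialized"),
]

# Sorting on the Timestamp column restores the true event order.
ordered = sorted(rows, key=lambda row: row[0])
for ts, level, entry in ordered:
    print(ts, level, entry)
```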

| Column Name | Column Type | Description |
|---|---|---|
| Date | String | The date on which the log event was generated. This is the partitioning column. |
| Timestamp | DateTime | The timestamp for the logged event. |
| Host | String | The host name for the logged event. |
| Level | String | The level for the event. This is usually one of the standard log levels (INFO, WARN, etc.), but for worker output captured by the query server, it instead indicates the level of captured output (stdout or stderr). |
| Process | String | The name of the process that generated the event (e.g., RemoteQueryDispatcher, worker_1). |
| ProcessInfoId | String | The unique process info ID of the process generating the entry. |
| AuthenticatedUser | String | The authenticated user who last ran this query. |
| EffectiveUser | String | The effective user who last ran this query. |
| LogEntry | String | The logged event. |

Configuration

The same set of processes that can write to the AuditEventLog can write to the ProcessEventLog, although most of them turn off process event logging by default.

Each process that can write to the ProcessEventLog can override the configuration property <process name>.writeDatabaseProcessLogs, where <process name> is its process name. If true, ProcessEventLog data is written; if false, it is not. For example, to have the Authentication Server write process event logs, set AuthenticationServer.writeDatabaseProcessLogs=true.

Any process that can write to the Process Event Log can override several configuration items. All configuration overrides should be based on the process name or main class name. These properties also impact events written to the Audit Event Log.

| Configuration Property | Description |
|---|---|
| <process name>.captureLog4j | If true, any output sent to the Log4J logger is written into the Process Event Log. |
| <process name>.captureSysout | If true, any standard output is written into the Process Event Log. |
| <process name>.captureSyserr | If true, any standard error output is written into the Process Event Log. |
| <process name>.aliveMessageSeconds | If non-zero, a message is periodically written to the Process Event Log indicating that the process is still alive. |
| <process name>.logLevel | The minimum log level that will be written. The default value is INFO. Other options include FATAL, EMAIL, STDERR, ERROR, WARN, STDOUT, INFO, DEBUG, and TRACE. |
| <process name>.useLas | If true, events are written through the Log Aggregator Service; if false, events are written directly to binary log files. |
| <process name>.useMainClassNameForLogs | Whether to use the main class name for log entries; if false, the value of the process.name property is used instead of the class name. |
| <process name>.writeDatabaseProcessLogs | If true, process event logs are created; if false, they are not. |
| RemoteQueryProcessor.sendLogsToSystemOut | If defined and set to true, tells query workers to send their logs to standard output. This cannot be used when writing to the Process Event Log. |
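For example, a properties fragment for a hypothetical process named MyProcess (the process name and values here are illustrative, not defaults) might look like:

```
MyProcess.writeDatabaseProcessLogs=true
MyProcess.captureSysout=true
MyProcess.captureSyserr=true
MyProcess.captureLog4j=true
MyProcess.logLevel=INFO
MyProcess.aliveMessageSeconds=60
```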

Write entries to CSV files

You can write ProcessEventLog entries to CSV files. To turn this on, specify the following property for the dispatchers and workers:

ProcessEventLog.interceptor=com.illumon.iris.db.util.logging.ProcessEventLogInterceptorCsv

Also use the following property to specify the full path of the directory where the CSV files will be written. This directory must be writable by all the processes that generate these files; typically, making it group-writable by dbmergegrp is adequate:

ProcessEventLog.interceptor.csv.directory=/path/to/directory

CSV file names will consist of the following pattern:

<PQ name if available>-<process name>-<host name>-<optional GUID>.date/timestamp
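How that pattern composes can be sketched in Python; this is an illustrative sketch of the documented naming scheme, not the actual implementation, and it omits any sanitization of the fields:

```python
from datetime import datetime, timezone

def pel_csv_filename(process_name, host_name, pq_name=None, guid=None,
                     when=None):
    """Compose a CSV file name following the documented pattern:
    <PQ name if available>-<process name>-<host name>-<optional GUID>.date

    Illustrative sketch only, assuming daily rollover (yyyy-MM-dd suffix).
    """
    when = when or datetime.now(timezone.utc)
    # Optional fields (PQ name, GUID) are simply omitted when absent.
    parts = [p for p in (pq_name, process_name, host_name, guid) if p]
    return "-".join(parts) + "." + when.strftime("%Y-%m-%d")

print(pel_csv_filename("worker_1", "query-host-1", pq_name="MyQuery",
                       when=datetime(2024, 1, 2, tzinfo=timezone.utc)))
```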

Some messages (during initial worker startup and shutdown) will be logged in the dispatcher’s log instead of the workers' logs.

The following properties define the CSV writer's behavior:

| Configuration Property | Description |
|---|---|
| ProcessEventLog.interceptor.csv.format | An optional CSV format, from org.apache.commons.csv.CSVFormat#Predefined. If none is specified, the default is Excel. |
| ProcessEventLog.interceptor.csv.delimiter | An optional delimiter. If none is specified (the property is absent or commented out), the default is taken from the CSV format. Delimiters must be a single character. |
| ProcessEventLog.interceptor.csv.queueCapacity | To ensure that CSV writes do not affect performance, all CSV operations are submitted to a queue and performed off-thread. This property specifies the queue's capacity. The default is 1,000. |
| ProcessEventLog.interceptor.csv.rolloverDaily | If true, the CSV files roll over daily. The default is true. When rolling over daily (or not at all), the date/timestamp is in the format yyyy-MM-dd. |
| ProcessEventLog.interceptor.csv.rolloverHourly | If true, the CSV files roll over hourly. This takes precedence over daily rollover. The default is false. The date/timestamp will include a time and offset. |
| ProcessEventLog.interceptor.csv.timeZone | The time zone used for file names and timestamps in the CSV files, given as a text time zone name such as America/New_York. The default is the system default time zone. |
| ProcessEventLog.interceptor.csv.flushMessages | How frequently to flush the queue to disk (it always flushes when the queue is emptied). The default value is 100. |
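Putting these together, a properties fragment enabling the CSV interceptor with hourly rollover might look like the following (the directory path and property values here are illustrative, not recommendations):

```
ProcessEventLog.interceptor=com.illumon.iris.db.util.logging.ProcessEventLogInterceptorCsv
ProcessEventLog.interceptor.csv.directory=/path/to/directory
ProcessEventLog.interceptor.csv.rolloverHourly=true
ProcessEventLog.interceptor.csv.timeZone=America/New_York
ProcessEventLog.interceptor.csv.flushMessages=50
```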

If CSV files are not being created, the following steps may help you troubleshoot the issue.

  1. Check the most recent startup log or the Process Event Log for the worker. If there is a configuration error in the interceptor properties, this is where it will most likely show up. The interceptor is designed not to prevent process or worker startup if it is misconfigured.
  2. Check the permissions on the directory to which CSV files are being written. It must be writable by all the processes, typically via the dbmergegrp group.
  3. Since the PQ name is part of the file name, special Linux file path characters can cause issues. For example, a forward slash / is interpreted as a directory separator. In that case, the appropriate subdirectories must be created to hold the CSV files.
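For the forward-slash case in step 3, the subdirectory can be created up front. A sketch, assuming a PQ named Team/MyQuery and a hypothetical CSV directory:

```shell
# Hypothetical CSV directory, standing in for the configured
# ProcessEventLog.interceptor.csv.directory value.
PEL_DIR=/tmp/pel-csv

# A PQ named "Team/MyQuery" puts "Team/" into the file path, so the
# subdirectory must exist before the interceptor writes to it.
mkdir -p "$PEL_DIR/Team"

# In a real deployment, also make it group-writable, e.g.:
#   chgrp dbmergegrp "$PEL_DIR/Team" && chmod g+w "$PEL_DIR/Team"
ls -d "$PEL_DIR/Team"
```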