ProcessEventLog
The `ProcessEventLog` table contains all log messages for Deephaven workers and query or merge servers, including the output from workers, which is useful for investigating behavior and diagnosing failures or crashes. Filter by the `Process` or `ProcessInfoId` column to retrieve the rows of interest, and sort by the `Timestamp` column to view events in order; otherwise, data captured from workers' `stdout` and `stderr` may appear out of order.
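For example, in a Deephaven console you might retrieve and order today's entries for a single worker as follows. This is a sketch assuming the Legacy `db.i` intraday-table API; the worker name `worker_12` is an illustrative assumption:

```groovy
// Fetch today's intraday ProcessEventLog rows for one worker,
// sorted so captured stdout/stderr appears in order
pel = db.i("DbInternal", "ProcessEventLog")
        .where("Date=currentDateNy()", "Process=`worker_12`")
        .sort("Timestamp")
```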
| Column Name | Column Type | Description |
|---|---|---|
| Date | String | The date on which the log event was generated. This is the partitioning column. |
| Timestamp | DateTime | The timestamp of the logged event. |
| Host | String | The host name for the logged event. |
| Level | String | The level of the event. This is usually one of the standard log levels (`INFO`, `WARN`, etc.), but in the case of worker output logged by the query server, it instead indicates the level of captured output (`stdout` or `stderr`). |
| Process | String | The name of the process that generated the event (e.g., `RemoteQueryDispatcher`, `worker_1`). |
| ProcessInfoId | String | The unique process info ID of the process that generated the entry. |
| AuthenticatedUser | String | The authenticated user who last ran this query. |
| EffectiveUser | String | The effective user who last ran this query. |
| LogEntry | String | The logged event. |
Configuration
The same set of processes that can write to the `AuditEventLog` can also write to the `ProcessEventLog`, although most of them turn off process event logging by default. Each such process can override the configuration property `<process name>.writeDatabaseProcessLogs`, based on its process name. If `true`, `ProcessEventLog` data is written; if `false`, it is not. For example, to have the Authentication Server write process event logs, set `AuthenticationServer.writeDatabaseProcessLogs=true`.
Any process that can write to the Process Event Log can override several configuration items. All configuration overrides should be based on the process name or main class name. These properties also impact events written to the Audit Event Log.
| Configuration Property | Description |
|---|---|
| `<process name>.captureLog4j` | If `true`, any output sent to the Log4j logger is written to the Process Event Log. |
| `<process name>.captureSysout` | If `true`, any standard output is written to the Process Event Log. |
| `<process name>.captureSyserr` | If `true`, any standard error output is written to the Process Event Log. |
| `<process name>.aliveMessageSeconds` | If non-zero, a message is periodically written to the Process Event Log indicating that the process is still alive. |
| `<process name>.logLevel` | The minimum log level that will be written. The default is `INFO`. Other options include `FATAL`, `EMAIL`, `STDERR`, `ERROR`, `WARN`, `STDOUT`, `INFO`, `DEBUG`, and `TRACE`. |
| `<process name>.useLas` | If `true`, events are written through the Log Aggregator Service; if `false`, events are written directly to binary log files. |
| `<process name>.useMainClassNameForLogs` | Whether to use the main class name for log entries; if `false`, the value of the `process.name` property is used instead of the class name. |
| `<process name>.writeDatabaseProcessLogs` | If `true`, process event logs are created; if `false`, they are not. |
| `RemoteQueryProcessor.sendLogsToSystemOut` | If defined and set to `true`, tells the query workers to send their logs to standard system output. This cannot be used when writing to the Process Event Log. |
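As an illustration, the following properties stanza enables full output capture for a process. The process name `ExampleService` and the specific values are assumptions chosen for the example, not defaults:

```properties
# Capture Log4j, stdout, and stderr output from ExampleService
ExampleService.captureLog4j=true
ExampleService.captureSysout=true
ExampleService.captureSyserr=true
# Write a liveness message every 60 seconds
ExampleService.aliveMessageSeconds=60
# Record DEBUG-level and higher events
ExampleService.logLevel=DEBUG
# Route events through the Log Aggregator Service
ExampleService.useLas=true
ExampleService.writeDatabaseProcessLogs=true
```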
Write entries to CSV files
You can write `ProcessEventLog` entries to CSV files. To turn this on, specify the following property for the dispatchers and workers:

```
ProcessEventLog.interceptor=com.illumon.iris.db.util.logging.ProcessEventLogInterceptorCsv
```

Also specify the full path of the directory where the CSV files will be written, using the following property. This directory must be writable by all the processes that generate these files; typically, making it group-writable by `dbmergegrp` is adequate:

```
ProcessEventLog.interceptor.csv.directory=/path/to/directory
```

CSV file names follow this pattern:

```
<PQ name if available>-<process name>-<host name>-<optional GUID>.date/timestamp
```
Some messages (during initial worker startup and shutdown) will be logged in the dispatcher’s log instead of the workers' logs.
The following properties define the CSV writer's behavior:
| Configuration Property | Description |
|---|---|
| `ProcessEventLog.interceptor.csv.format` | An optional CSV format, from `org.apache.commons.csv.CSVFormat#Predefined`. If none is specified, the default is Excel. |
| `ProcessEventLog.interceptor.csv.delimiter` | An optional delimiter. If none is specified (the property is absent or commented out), the default is taken from the CSV format. Delimiters must be one character. |
| `ProcessEventLog.interceptor.csv.queueCapacity` | To ensure that CSV writes do not affect performance, all CSV operations are submitted to a queue and performed off-thread. This property specifies the queue's capacity. The default is 1,000. |
| `ProcessEventLog.interceptor.csv.rolloverDaily` | If `true`, the CSV files roll over daily. The default is `true`. When rolling over daily (or not at all), the date/timestamp is in the format `yyyy-MM-dd`. |
| `ProcessEventLog.interceptor.csv.rolloverHourly` | If `true`, the CSV files roll over hourly. This takes precedence over daily rollover. The default is `false`. The date/timestamp includes a time and offset. |
| `ProcessEventLog.interceptor.csv.timeZone` | The time zone used for filenames and timestamps in the CSV files, given as a time zone name such as `America/New_York`. The default is the system default time zone. |
| `ProcessEventLog.interceptor.csv.flushMessages` | How frequently, in messages, to flush the queue to disk; the queue always flushes when it is emptied. The default is 100. |
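Putting these together, a complete CSV interceptor configuration might look like the following sketch. The directory path, rollover choice, time zone, and flush interval are illustrative assumptions:

```properties
# Enable the CSV interceptor for dispatchers and workers
ProcessEventLog.interceptor=com.illumon.iris.db.util.logging.ProcessEventLogInterceptorCsv
# Directory must be writable by all logging processes (e.g., group-writable by dbmergegrp)
ProcessEventLog.interceptor.csv.directory=/var/log/deephaven/pel-csv
# Roll files hourly; hourly rollover takes precedence over daily
ProcessEventLog.interceptor.csv.rolloverHourly=true
# Use New York time for file names and timestamps
ProcessEventLog.interceptor.csv.timeZone=America/New_York
# Flush queued messages to disk every 50 messages
ProcessEventLog.interceptor.csv.flushMessages=50
```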
If you are not seeing CSV files being created, the following steps may help you troubleshoot the issue:

- Check the most recent startup log or the Process Event Log for the worker. If there is a configuration error in the interceptor properties, it will most likely show up there; the interceptor is designed not to prevent process and worker startup when it is misconfigured.
- Check the permissions on the directory to which CSV files are being written. It must be writable by all the relevant processes, typically via the `dbmergegrp` group.
- Since the PQ name is part of the filename, special Linux file path characters can cause issues. For example, a forward slash (`/`) is interpreted as a directory separator; in that case, appropriate subdirectories must be created to hold the CSV files.
Related documentation
- Internal tables overview
- AuditEventLog
- PersistentQueryConfigurationLog
- PersistentQueryStateLog
- ProcessEventLogIndex
- ProcessInfo
- ProcessMetrics
- QueryOperationPerformanceLogIndex
- QueryOperationPerformanceLog
- QueryPerformanceLog
- QueryUserAssignmentLog
- ResourceUtilization
- ServerStateLogIndex
- ServerStateLog
- UpdatePerformanceLogIndex
- UpdatePerformanceLog
- WorkspaceDataSnapshot
- WorkspaceData