
Qumulo Core Audit Logging with Elasticsearch

Qumulo Core 2.12.0 (and higher) clusters support audit logging, a mechanism for tracking filesystem operations and cluster configuration changes.

Once configured, audit logging generates a log entry for each cluster configuration change or filesystem operation. The body of each log message consists of multiple fields in CSV (comma-separated values) format, as shown below:

10.220.200.26,groot-1,"AD\alice",smb2,fs_read_data,ok,123,"/alice/personal/resume.doc",""
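Because each entry is plain CSV, it is straightforward to parse programmatically. The sketch below splits the sample entry above into named fields using Python's standard csv module; the field names are assumptions chosen to match the sample line, not an official Qumulo schema.

```python
import csv
import io

# Assumed names for the nine CSV columns, inferred from the sample
# entry above; not an official Qumulo schema.
FIELDS = ["client_ip", "node", "user", "protocol", "operation",
          "status", "file_id", "path", "secondary_path"]

def parse_audit_line(line):
    """Parse one audit log line into a dict keyed by the assumed field names."""
    values = next(csv.reader(io.StringIO(line)))
    return dict(zip(FIELDS, values))

entry = parse_audit_line(
    '10.220.200.26,groot-1,"AD\\alice",smb2,fs_read_data,ok,123,'
    '"/alice/personal/resume.doc",""')
print(entry["user"], entry["operation"], entry["path"])
```

The csv module handles the quoted fields (such as paths containing commas) automatically, which naive string splitting would not.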

On a moderately busy cluster, audit logging can generate over 2 GB of data within one to two days. With that volume, it becomes exceedingly difficult to search through the logs for the one event you are looking for. And while Qumulo Core generates the audit logs, it does not parse, analyze, index, or visualize the data contained in them. This is where a third-party search and analytics engine like Elasticsearch can assist you.

Elasticsearch with Qumulo Core Audit Logging

Elasticsearch is a highly scalable open-source full-text search and analytics engine that allows you to store, search, and analyze large volumes of data quickly and in near real time. Elasticsearch shards the data on clusters of servers into smaller, manageable pieces to parallelize operations and thus increase performance. We are going to show you how to set up an Elasticsearch cluster within 30 to 60 minutes in order to quickly index and search through Qumulo Core Audit Logs.

Elasticsearch, together with Logstash, Filebeat, and Kibana, forms what is called "the ELK stack." This open-source software can take any file or group of files and store them in a searchable format within a clustered, schema-less database.
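To illustrate what storing log entries in a searchable format looks like, here is a minimal sketch that shapes parsed audit entries into a payload for Elasticsearch's _bulk API, which accepts newline-delimited JSON. In a real pipeline Logstash or Filebeat performs this conversion for you; the index name and field names below are assumptions for illustration.

```python
import json

# Two parsed audit entries; field names are illustrative assumptions.
entries = [
    {"client_ip": "10.220.200.26", "user": "AD\\alice",
     "operation": "fs_read_data", "path": "/alice/personal/resume.doc"},
    {"client_ip": "10.220.200.27", "user": "AD\\bob",
     "operation": "fs_write_data", "path": "/bob/notes.txt"},
]

def to_bulk_payload(docs, index="qumulo-audit"):
    """Build a _bulk API body: an action line, then the document,
    one pair per document, newline-delimited with a trailing newline."""
    lines = []
    for doc in docs:
        lines.append(json.dumps({"index": {"_index": index}}))
        lines.append(json.dumps(doc))
    return "\n".join(lines) + "\n"

print(to_bulk_payload(entries))
```

Once indexed this way, every field of every entry becomes searchable, which is what makes slicing through gigabytes of audit data practical.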

Visualization of Elasticsearch-indexed data occurs through Kibana. With Kibana, you can create reports and visualization panels that reside within user-defined dashboards.

An example dashboard of Qumulo Core Audit Logs may resemble:

[Image: kibana_audit_logs.png]

Of course, there are no limits to the visualizations and reports that you can configure. Once your visualization panels are created, place them within any dashboard for live viewing and analysis as your data is collected.

Configuring an Elasticsearch indexing cluster can be daunting if you are installing the ELK stack components across multiple hardware machines or VMware nodes. Getting a working, resilient Elasticsearch cluster configured can take hours, if not days. This is where Docker really shines.

With a Docker environment, you can completely configure an Elasticsearch cluster in a matter of minutes. To make it extremely simple, we have taken the guesswork out of creating a working Elasticsearch cluster with Docker to collect Qumulo Core Audit Logs.

Head to Qumulo's GitHub and check out How to implement Elasticsearch with Docker for Qumulo Audit Logs for instructions on getting Elasticsearch up and running with Docker to ingest and visualize your Qumulo Audit Logs.
