Monday, October 31, 2016

Oracle Linux and Oracle Cloud as an alternative rsyslog consolidation target

One of the main compliance benchmarks used as a foundation for many security standards is the benchmark developed by CIS, the Center for Internet Security. In general, following the CIS guidelines and implementing them on your systems is a good start towards becoming compliant with a lot of security standards. For example, the DoD security standards are partly based on the CIS benchmark, as are many other security standards that find their origin in it.

Within the Oracle Linux benchmark, one of the topics is the installation, configuration and use of rsyslog. Some of the implementation points are scored when doing an official assessment based upon the CIS benchmark, some are not. One of the topics that is not scored under the rsyslog part is "Configure rsyslog to Send Logs to a Remote Log Host". When implemented, this sends logfiles to another host so that a copy remains intact even if the machine itself is compromised. A common practice when someone hacks into your server is to clean the logfiles; keeping the logfiles in another location makes it much harder to remove the traces of a hack.
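As an illustration, forwarding all messages to a remote log host with rsyslog takes only a single line in /etc/rsyslog.conf. The snippet below is a minimal sketch; the hostname loghost.example.com and port 514 are placeholders for your own central log server.

# forward all facilities and priorities to the central log host over TCP (@@ = TCP, @ = UDP)
*.* @@loghost.example.com:514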

Remote logging in general:
In general it is good practice to ensure your operating system logs are shipped to another location so they cannot be wiped by someone who has hacked your system. Configuring rsyslog to send logs to a remote log host is a good practice from this point of view: you move the logfiles off the server and secure them at a central location.

Additionally, it is beneficial to have all your logfiles consolidated in one location so you can analyze all your systems in one place instead of having to log into every server to go through its logs. Without a centralized solution, analyzing your entire landscape becomes hard and, at scale, almost impossible for a human.

The alternative: 
Even though configuring rsyslog to send information to a central log server works very well, better solutions are available at this moment. The most popular and most commonly used solution is a combination of Elasticsearch and Kibana together with Logstash, commonly referred to as the ELK stack, which stands for Elasticsearch, Logstash and Kibana. Elasticsearch is an open source storage and search solution, while Kibana (also open source) is a dashboard and analysis tool.

The ELK based solutions:
As stated, the ELK based solution in combination with Logstash is popular for small and large IT footprints alike where there is a need to consolidate and analyze logging. Using an ELK based solution is a great alternative to the rsyslog solution described in the CIS benchmark documents for Oracle Linux.

Building the architecture:
The diagram below shows how you can ship all your local logs to a central Elasticsearch and Kibana implementation.


Logstash shipper
Each Oracle Linux node will have a Logstash shipper daemon running. In essence, the Logstash shipper reads all the logfiles you indicated in its configuration and sends every new line it finds to Redis as soon as it appears in the logfile.
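A minimal shipper configuration could look like the sketch below. The file paths, the Redis hostname and the list key "logstash" are assumptions for the sake of the example and will differ in your environment.

# shipper.conf - tail the local syslog files and push every new line into a Redis list
input {
  file {
    path => ["/var/log/messages", "/var/log/secure"]
  }
}
output {
  redis {
    host => "redis.example.com"
    data_type => "list"
    key => "logstash"
  }
}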

Redis
All the Logstash shipper daemons send the logfile lines to Redis. Redis acts as a "buffer" and message broker. This ensures that if the Logstash indexer is unable to receive a message, Redis keeps it in memory and delivers it as soon as the indexer can receive it again. Good practice is to build Redis as a cluster.
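If you use the Redis list approach sketched above, you can get a quick impression of how many log events are currently buffered by using redis-cli; the hostname and the key "logstash" are again placeholders.

# show the number of log events waiting in the buffer for the indexer
redis-cli -h redis.example.com llen logstash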

Logstash indexer
The Logstash indexer receives all messages from Redis, optionally filters them, and ensures each record is stored in Elasticsearch.
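A minimal indexer configuration, matching the shipper sketch above, could look as follows. The grok filter is optional and only shown as an example of parsing the raw syslog line into structured fields; the Redis and Elasticsearch hostnames are placeholders.

# indexer.conf - read events from the Redis list, optionally parse them, store them in Elasticsearch
input {
  redis {
    host => "redis.example.com"
    data_type => "list"
    key => "logstash"
  }
}
filter {
  grok {
    match => { "message" => "%{SYSLOGLINE}" }
  }
}
output {
  elasticsearch {
    hosts => ["elasticsearch.example.com:9200"]
  }
}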

Elasticsearch
Elasticsearch is a search engine based on Lucene and stores all records of all servers that send information to it. In essence this is a searchable collection of the logfile entries of every server you have included in this setup and instructed to consolidate its logs centrally.
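Because everything ends up in one searchable store, you can query all servers at once. As a simple sketch, the following searches the daily logstash-* indices (the default index naming used by Logstash) for sshd related entries; the hostname is a placeholder.

# search the consolidated log entries of every server for sshd events
curl -s 'http://elasticsearch.example.com:9200/logstash-*/_search?q=sshd&pretty'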

Kibana
Kibana is an open source data visualization plugin for Elasticsearch that visualizes and helps in understanding the massive amount of data from all the log files your servers send to Elasticsearch.
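Pointing Kibana at the Elasticsearch cluster is a matter of a few settings in kibana.yml. The sketch below assumes Kibana listens on its default port 5601; the Elasticsearch hostname is a placeholder for your own cluster.

# kibana.yml - connect the Kibana dashboard to the central Elasticsearch cluster
server.port: 5601
elasticsearch.url: "http://elasticsearch.example.com:9200"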

Hybrid cloud
As enterprises move to a more hybrid cloud model, you will have servers on premise, in a private cloud and in the public cloud. Many IT departments want a single consolidation point for all log files. You can use the Oracle Public Cloud, and specifically the Oracle Compute service, to set up a stack capable of hosting an ELK consolidation platform.


The above diagram shows what such an architecture could look like when you consolidate all logging with ELK on the Oracle Public Cloud. This includes machines running in a private cloud as well as in the Oracle Public Cloud and on traditional bare metal servers.
