Fluentd error: Unable to push logs to [elasticsearch]

After application deployments, Kibana stopped showing logs exactly seven days later. The error "Fluentd error: Unable to push logs to [elasticsearch]" appeared in the fluentd logs. The initial response was to increase the fluentd buffer limits as follows:

chunk_limit_size 10M
queue_limit_length 256

The behavior recurred two weeks later with the same error. On closer investigation, the error was preceded by the message "Failed to write to the buffer." This prompted another inspection of the fluentd configuration, which revealed the following setting in the buffer section; per the official Fluentd documentation, it causes fluentd to block incoming events once the buffer fills up:

overflow_action block

The fix is to change overflow_action from block to drop_oldest_chunk, which lets logs flow to Elasticsearch without interruption by discarding the oldest chunk in the buffer when it is full.

<buffer>
@type file
path /var/log/fluentd-buffers/kubernet
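Putting the pieces together, a minimal sketch of the corrected buffer section might look like the following. The path and the numeric limits here are illustrative assumptions, not the exact values from the original configuration:

```
<buffer>
  # @type file persists chunks to disk so they survive a fluentd restart
  @type file
  # illustrative path -- substitute the actual buffer path from your config
  path /var/log/fluentd-buffers/example.buffer
  chunk_limit_size 10M
  queue_limit_length 256
  # drop_oldest_chunk discards the oldest chunk instead of blocking input
  overflow_action drop_oldest_chunk
</buffer>
```

With block, a full buffer stalls the input side and produces the "Failed to write to the buffer" message; drop_oldest_chunk trades the oldest buffered logs for uninterrupted delivery of new ones.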

Mount DFS-R Share in RedHat Enterprise Linux 6

1. Verify that the following packages are installed on the Linux system:
    a. cifs-utils
    b. keyutils

2. The following lines need to be added to the end of the /etc/request-key.conf file:

create cifs.spnego * * /usr/sbin/cifs.upcall %k

create dns_resolver * * /usr/sbin/cifs.upcall %k

3. Mount the DFS-R folder

mount.cifs -o user=windowsuser,password=windowspassword,domain=domainname //domainname/share /localmountpoint

4. Add the following entry to /etc/fstab if the DFS-R share should be mounted automatically on server reboot.

//domainname/share                 /localmountpoint             cifs    user=windowsuser,password=windowspassword,domain=domainname,uid=200,gid=2002 0 0
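Because /etc/fstab is world-readable, the Windows password can be kept out of it by using the standard mount.cifs credentials= option, which points at a root-only file. A sketch, assuming the file is kept at /root/.smbcredentials (the path is illustrative):

```
# /root/.smbcredentials -- illustrative path; restrict with: chmod 600 /root/.smbcredentials
username=windowsuser
password=windowspassword
domain=domainname
```

The fstab entry then references the file instead of embedding the password:

//domainname/share  /localmountpoint  cifs  credentials=/root/.smbcredentials,uid=200,gid=2002  0 0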

***Thanks to Mike Burr***
