Splunk Free Configuration

March 14, 2014

Splunk Free

This is the second post in the series on how to set up pfSense, Squid, and Splunk. Here is a link to the first post: pfSense, Squid, and Splunk free.

Sometime around 2013, Splunk split its product into paid and free editions. The free edition does not allow emailed alerts or user authentication. In this post I will configure Splunk to send a daily digest email through a local Postfix installation. The post starts after Ubuntu and Splunk are installed and Splunk is already being populated with events from pfSense (or another source) via syslog. I am using Splunk version 6.0 and Ubuntu 12.04 for this setup.

The main purpose of setting up Splunk was to get notifications of events. To regain that functionality on the free edition, let's use Python. This post assumes the reader already has a grasp of Python; I am using Python 2.7.3 for these examples. To get started, install the Splunk SDK for Python, a module that lets you interact with Splunk from your own scripts. Once the module is installed, attempt to connect to your Splunk instance. Here is my code:
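The SDK is distributed on PyPI, so assuming pip is available on the Ubuntu box, installing it is one command:

```shell
# Install the Splunk SDK for Python (provides the splunklib module used below)
pip install splunk-sdk
```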

#!/usr/bin/python


#imports
import smtplib
from email.MIMEMultipart import MIMEMultipart
from email.MIMEText import MIMEText
from email.MIMEImage import MIMEImage
import splunklib.client as client
import splunklib.results as results
import sys
from time import sleep
import matplotlib
matplotlib.use('Agg')
import matplotlib.pyplot as plt
import numpy as np
import datetime as DT
from matplotlib.dates import date2num

#variables
HOST = "localhost"
PORT = 8089
USERNAME = "admin"
PASSWORD = ""
strFrom = 'admin@admin.com'
strTo = 'youremail@gmail.com'

##smtp init
##Create the root message and fill in the from, to, and subject headers
msgRoot = MIMEMultipart('related')
msgRoot['Subject'] = 'Message From Splunk'
msgRoot['From'] = strFrom
msgRoot['To'] = strTo
msgRoot.preamble = 'This is a multi-part message in MIME format.'

#SPLUNK
#get Top 20 Block FW ports
service = client.connect(host=HOST,port=PORT,username=USERNAME,password=PASSWORD)
#set the search window: the eight full days ending at last midnight
kwargs = {"earliest_time": "-8d@d", "latest_time": "@d"}

#block the script until the Splunk search job finishes
def polling():
    while True:
        job.refresh()
        if job.is_done():
            break
        sleep(5)

#message body
msgAlternative = MIMEMultipart('alternative')
msgRoot.attach(msgAlternative)

msgText = MIMEText('This is the alternative plain text message.')
msgAlternative.attach(msgText)

###############
#
#First query
#
###############
searchquery_normal = "search block in on em0 | chart count by dest_port | sort count desc | head 20"
#kwargs = {"earliest_time": "-7d@h"}
job = service.jobs.create(searchquery_normal, **kwargs)

polling()
splunk_text += """<h3>Top 20 Blocked Ports & URLS Last 7 Days</h3></b><table border ="0" cellpadding="3"><tr><td valign="top"><table border="1" cellpadding="3"><tr><td><b>Port</b></td><td><b>Count</b></td></tr>\n"""
#append a table row for each Splunk result
for result in results.ResultsReader(job.results(segmentation='none')):
    splunk_text += "<tr><td>" + str(result['dest_port']) + "</td><td>" + str(result['count']) + "</td></tr>\n"

#close first table
splunk_text += """</table></td>"""

########end of table##########
splunk_text += """</tr></table><br>"""

msgText = MIMEText(splunk_text, 'html')
msgAlternative.attach(msgText)

smtpObj = smtplib.SMTP('localhost')
smtpObj.sendmail(strFrom, strTo, msgRoot.as_string())
smtpObj.quit()

I have omitted some of this code for the sake of brevity. The code formats the query results as an HTML table for the email. The machine running Splunk is also running Postfix, which is configured to send outbound mail. The code I removed runs additional queries against Splunk and uses matplotlib to create an image file that graphically represents the data. If anyone is interested, please leave a comment and I can post the full code.
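As a rough idea of what that omitted graphing code might look like, here is a minimal sketch: it renders a bar chart with matplotlib's Agg backend (matching the imports at the top of the script) and embeds the PNG in the `related` message via a Content-ID, so the HTML body can reference it with an `<img>` tag. The port data, filenames, and Content-ID below are placeholders, not values from the original script.

```python
import matplotlib
matplotlib.use('Agg')  # headless rendering; no display needed on a server
import matplotlib.pyplot as plt
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText
from email.mime.image import MIMEImage

# Example data standing in for the Splunk results
ports = ['445', '23', '3389']
counts = [120, 85, 40]

# Draw a simple bar chart of blocked connections per destination port
fig, ax = plt.subplots()
ax.bar(range(len(ports)), counts)
ax.set_xticks(range(len(ports)))
ax.set_xticklabels(ports)
ax.set_xlabel('Destination port')
ax.set_ylabel('Blocked connections')
fig.savefig('/tmp/splunk_ports.png')
plt.close(fig)

# Embed the PNG so the HTML body can show it inline via cid:splunk_ports
msgRoot = MIMEMultipart('related')
msgRoot.attach(MIMEText('<img src="cid:splunk_ports">', 'html'))
with open('/tmp/splunk_ports.png', 'rb') as fp:
    msgImage = MIMEImage(fp.read())
msgImage.add_header('Content-ID', '<splunk_ports>')
msgRoot.attach(msgImage)
```

In the full script the image part would simply be attached to the existing `msgRoot` alongside the HTML table before the message is handed to Postfix.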

The next step would be to set up a daily cron job to run this Python code; you would then have a daily digest email with data about your network!
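A crontab entry for that might look like the following sketch; the script path, log path, and 06:00 schedule are placeholders to adjust for your setup:

```shell
# Edit the crontab of the user that should send the digest:
#   crontab -e
# Then add a line like this to run the script every day at 06:00,
# appending output to a log file for troubleshooting:
0 6 * * * /usr/bin/python /opt/splunk-digest/digest.py >> /var/log/splunk-digest.log 2>&1
```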
