- Copy the script and config file developed by APIsec into a folder accessible to Splunk.
- Update the config file with the Service Account username/password (I used a token for testing) and adjust the log settings to choose which logs to fetch:
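The exact layout of the APIsec config file is not shown here, so as a rough sketch, assume an INI-style file with an `[auth]` section and a `[logs]` section of per-log-type flags; the section and key names below are assumptions. The script could then read it with Python's standard `configparser`:

```python
# Hypothetical sketch of the APIsec config file and how the fetch script
# might parse it -- the real section/key names are assumptions, not the
# actual APIsec format.
import configparser

SAMPLE = """
[auth]
username = svc-splunk
password = <token-or-password>

[logs]
audit = true
scan = true
access = false
"""

config = configparser.ConfigParser()
config.read_string(SAMPLE)

# Fetch only the log types enabled in the [logs] section.
enabled = [name for name in config["logs"]
           if config["logs"].getboolean(name)]
print(enabled)  # -> ['audit', 'scan']
```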
4. From Splunk Enterprise, go to Settings -> Data inputs -> Scripts -> New Local Script.
Select the script provided by APIsec, then set the interval for periodic execution; it can also be a cron schedule.
5. Next, set the Source Type to JSON.
6. These are the final settings:
7. You should be able to see the logs in Splunk:
To receive data from a Splunk Universal Forwarder (UF), we need to enable the receiving port on the Splunk indexer (the default port is 9997) via Settings -> Forwarding and receiving -> Configure receiving.
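Behind the scenes, enabling receiving in the UI adds a stanza like this to `inputs.conf` on the indexer (shown here for the default port 9997):

```ini
# $SPLUNK_HOME/etc/system/local/inputs.conf on the indexer
[splunktcp://9997]
disabled = 0
```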
Install the Universal Forwarder on the machine from which you want to send data to the indexer, and provide the IP address and port of the Splunk indexer during installation:
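The indexer address entered during installation ends up in `outputs.conf` on the forwarder; a minimal sketch, where `<indexer-ip>` is a placeholder for your indexer's address and the group name is arbitrary:

```ini
# $SPLUNK_HOME/etc/system/local/outputs.conf on the Universal Forwarder
[tcpout]
defaultGroup = primary-indexers

[tcpout:primary-indexers]
server = <indexer-ip>:9997
```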
Here, select data from the Splunk forwarder (Settings -> Add Data):
We should see the configured forwarder in Splunk; in my case, it shows my laptop's hostname. Next, the user needs to provide a class name; this will create an app on the Splunk forwarder.
Next, we need to select “Scripts” as a source:
Important note: one catch here is that the script must live inside the Splunk directory tree; as the image below shows, there are only two locations to select a script from: “bin\scripts” and “\etc\system\bin”. Another catch is that the Splunk UF does not run Python scripts directly, so we need a wrapper script in “.bat” or “.sh” (depending on the OS) that calls our Python script. This also means Python and the libraries our script requires must be installed on the client machine.
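On Windows, such a wrapper can be a few lines; this is only a sketch, and the Python script name `apisec_fetch_logs.py` and its path are assumptions:

```bat
@echo off
REM wrapper_script2.bat -- called by the Splunk UF, which cannot run
REM Python directly. Path and script name below are assumptions;
REM adjust them to where the APIsec collector actually lives.
python "%SPLUNK_HOME%\etc\system\bin\apisec_fetch_logs.py"
```

A `.sh` equivalent on Linux would simply invoke `python3` on the same script.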
Here we can configure the interval in seconds or as a CRON job:
Our APIs provide output in JSON, so I set the source type to JSON:
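The scripted-input settings above correspond roughly to an `inputs.conf` stanza like the following on the forwarder; the wrapper path is an assumption, `interval` accepts either seconds or a cron expression, and `_json` is Splunk's built-in JSON source type:

```ini
# inputs.conf stanza generated for the scripted input (sketch)
[script://$SPLUNK_HOME\etc\system\bin\wrapper_script2.bat]
interval = 60
sourcetype = _json
disabled = 0
```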
Lastly, we can see the output here; I have written the wrapper script “wrapper_script2.bat”:




