In this challenge room, you act as John, who has just cleared his third screening interview for a SOC-L2 position at the MSSP Cybertees Ltd. A final challenge awaits: you must apply your knowledge to FIX the problems in Splunk.
You are presented with a Splunk instance and network logs being ingested from an unknown device.
Fix the multi-line event: The incoming logs are multiline and lack proper event separation, which prevents Splunk from parsing them correctly. This is common when the device source is unknown or improperly configured.
The sourcetype is visible in the logs: sourcetype = network_logs
To correct the event boundaries, we need to define the logic in props.conf, located (or created) at:
Bash:
nano /opt/splunk/etc/apps/fixit/default/props.conf
props.conf:
[network_logs]
SHOULD_LINEMERGE = true
BREAK_ONLY_BEFORE = \[Network-log\]
TRANSFORMS-network = network_custom_fields
Note: Since we’re planning to extract fields in the next step, the TRANSFORMS- line is already added here for efficiency.
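To see why breaking before the [Network-log] header works, here is a quick illustration in plain Python (not Splunk itself; the second event is invented sample data modeled on the log in this room):

```python
import re

# Two events glued together, as they arrive without proper event breaking.
# The second event is illustrative sample data, not from the real feed.
stream = ("[Network-log]: User named Johny Bil from Development department "
          "accessed the resource Cybertees.THM/about.html from the source IP "
          "192.168.0.1 and country Japan at: Thu Sep 28 00:13:46 2023\n"
          "[Network-log]: User named Jane Doe from Marketing department "
          "accessed the resource Cybertees.THM/index.html from the source IP "
          "10.0.0.5 and country Canada at: Thu Sep 28 00:14:02 2023")

# Split immediately before each "[Network-log]" header, mimicking what
# BREAK_ONLY_BEFORE = \[Network-log\] tells Splunk to do.
events = [e for e in re.split(r"(?=\[Network-log\])", stream) if e]
print(len(events))  # → 2
```

Each element of `events` now starts with its own `[Network-log]` header, which is exactly the event boundary Splunk needs.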
Once the event boundaries are fixed, we can extract the key values from each event. The goal is to make the logs searchable by defining five custom fields:
We use Regex101 to construct and test our pattern against the sample logs.
Sample log
[Network-log]: User named Johny Bil from Development department accessed the resource Cybertees.THM/about.html from the source IP 192.168.0.1 and country Japan at: Thu Sep 28 00:13:46 2023
Regex Pattern
\[Network-log\]:\sUser\snamed\s([\w\s]+)\sfrom\s([\w]+)\sdepartment\saccessed\sthe\sresource\s([\w]+\.[\w]+\/[\w]+\.[\w]+)\sfrom\sthe\ssource\sIP\s((?:\d{1,3}\.){3}\d{1,3})\sand\scountry\s([\w\s]+)\sat
\[Network-log\]:
\sUser\snamed\s
([\w\s]+) -> first field we want to extract: Username
\sfrom\s
([\w]+) -> second field we want to extract: Department
\sdepartment\saccessed\sthe\sresource\s
([\w]+\.[\w]+\/[\w]+\.[\w]+) -> third field: Domain
\sfrom\sthe\ssource\sIP\s
((?:\d{1,3}\.){3}\d{1,3}) -> fourth field: IP address
\sand\scountry\s
([\w\s]+) -> fifth field: Country
\sat
The date and time are not required.
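Before committing the pattern to transforms.conf, it can be sanity-checked outside Splunk. A minimal sketch using Python's re module (Python's regex syntax agrees with PCRE for everything this pattern uses):

```python
import re

# Sample event from the walkthrough
log = ("[Network-log]: User named Johny Bil from Development department "
       "accessed the resource Cybertees.THM/about.html from the source IP "
       "192.168.0.1 and country Japan at: Thu Sep 28 00:13:46 2023")

# Same pattern as in the transform: five capture groups for
# Username, Department, Domain, Source_IP, and Country.
pattern = (r"\[Network-log\]:\sUser\snamed\s([\w\s]+)\sfrom\s([\w]+)"
           r"\sdepartment\saccessed\sthe\sresource\s"
           r"([\w]+\.[\w]+/[\w]+\.[\w]+)\sfrom\sthe\ssource\sIP\s"
           r"((?:\d{1,3}\.){3}\d{1,3})\sand\scountry\s([\w\s]+)\sat")

m = re.search(pattern, log)
print(m.groups())
# → ('Johny Bil', 'Development', 'Cybertees.THM/about.html',
#    '192.168.0.1', 'Japan')
```

All five groups come back cleanly, so the pattern is safe to drop into transforms.conf.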
transforms.conf:
[network_custom_fields]
REGEX = \[Network-log\]:\sUser\snamed\s([\w\s]+)\sfrom\s([\w]+)\sdepartment\saccessed\sthe\sresource\s([\w]+\.[\w]+\/[\w]+\.[\w]+)\sfrom\sthe\ssource\sIP\s((?:\d{1,3}\.){3}\d{1,3})\sand\scountry\s([\w\s]+)\sat:\s
FORMAT = Username::$1 Department::$2 Domain::$3 Source_IP::$4 Country::$5
WRITE_META = true
fields.conf:
[Username]
INDEXED = true

[Department]
INDEXED = true

[Domain]
INDEXED = true

[Source_IP]
INDEXED = true

[Country]
INDEXED = true
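Because WRITE_META = true writes the extractions into the index and fields.conf marks them as indexed, the new fields can also be searched with Splunk's index-time field syntax (a hedged example; the field value here is just the user from the sample log):

```
index=main Username::"Johny Bil"
```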
Bash:
/opt/splunk/bin/splunk restart
After restart, your five custom fields should appear in the Splunk UI. Select them from the field sidebar to confirm parsing is successful.
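To confirm Splunk actually picked up the settings, the btool utility prints the merged configuration as Splunk sees it (run on the Splunk host; output depends on your install):

```
/opt/splunk/bin/splunk btool props list network_logs --debug
```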
This is the standard path where Splunk stores app files.
/opt/splunk/etc/apps/fixit
This is a sample of our log:
[Network-log]: User named Johny Bil from Development department accessed the resource Cybertees.THM/about.html from the source IP 192.168.0.1 and country Japan at: Thu Sep 28 00:13:46 2023
First, we want to break the event before `[Network-log]`.
NOTE: For example, `BREAK_ONLY_BEFORE = ^\d{4}-\d{2}-\d{2}` would identify the start of a new event whenever a line begins with a date in the format `YYYY-MM-DD`.
The setting we need here is: BREAK_ONLY_BEFORE
We know that inputs.conf is located inside the default folder:
Bash:
nano /opt/splunk/etc/apps/fixit/default/inputs.conf
Here we can find our answer:
/opt/splunk/etc/apps/fixit/bin/network-logs
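For reference, a scripted-input stanza in inputs.conf pointing at that script might look like the sketch below; the interval and index values are assumptions for illustration, not taken from the room's actual file:

```
[script:///opt/splunk/etc/apps/fixit/bin/network-logs]
interval = 5
sourcetype = network_logs
index = main
```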
We can use regex101.com to create our pattern. Copy the sample logs we got at the beginning.
Our event start is [Network-log]
[Network-log]
It’s clearly visible in the logs
Cybertees.THM
Counted via the Country field after extraction.
12
Checked using distinct values in the Department field.
6
28
52
These were the three key config files involved.
fields.conf, props.conf, transforms.conf
Splunk Query:
index=main Username=Robert*
Canada, United States
Simple search on the Domain field shows:
Splunk Query:
index=main Domain=*secret-document.pdf
sarah hall
Turns out that when your logs look like a spaghetti mess, it’s kinda hard to do security analysis.
Who could’ve guessed? After teaching Splunk how to read (thanks, props.conf), we finally got structured events and could extract all the juicy fields. Suddenly, usernames made sense, IPs showed up, and regex wasn’t just a random keyboard smash.
Lesson learned: logs are only useful if they don’t make your eyes bleed.