Fixit

TryHackMe Challenge Walkthrough

Challenge Overview

In this challenge room, you act as John, who has recently cleared his third screening interview for the SOC-L2 position at MSSP Cybertees Ltd. A final challenge is ready to test your knowledge: you will be required to apply it to FIX the problems in Splunk.
You are presented with a Splunk instance and network logs being ingested from an unknown device.

Fix event boundaries

Fix the multi-line event: The incoming logs are multiline and lack proper event separation, which prevents Splunk from parsing them correctly. This is common when the device source is unknown or improperly configured.

The sourcetype is visible in the logs -> sourcetype = network_logs

To correct the event boundaries, we need to define the logic in props.conf, located (or created) at:

Bash:

nano /opt/splunk/etc/apps/fixit/default/props.conf

Bash:

[network_logs]
SHOULD_LINEMERGE = true
BREAK_ONLY_BEFORE = \[Network-log\]
TRANSFORMS-network = network_custom_fields
  • SHOULD_LINEMERGE = true ensures multiline events are merged.
  • BREAK_ONLY_BEFORE tells Splunk to start a new event only when it encounters [Network-log].
  • TRANSFORMS-network points to the custom field extraction stanza we’ll define in transforms.conf.


Note: Since we’re planning to extract fields in the next step, the TRANSFORMS- line is already added here for efficiency.
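
Once Splunk is restarted (Step 4 below), you can sanity-check the boundaries with a quick search. This is only a sketch; index=main is an assumption about where the logs are indexed:

Splunk Query:

index=main sourcetype=network_logs
| eval boundary_ok=if(match(_raw, "^\[Network-log\]"), "yes", "no")
| stats count by boundary_ok

Every event should report boundary_ok=yes once the boundaries are fixed.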

Extract Custom Fields

Once the event boundaries are fixed, we can extract the key values from each event. The goal is to make the logs searchable by defining five custom fields:

  • Username
  • Department
  • Domain
  • Source_IP
  • Country
STEP 1: Build the Regex pattern

We use Regex101 to construct and test our pattern against the sample logs.

Sample log

[Network-log]: User named Johny Bil from Development department accessed the resource Cybertees.THM/about.html from the source IP 192.168.0.1 and country Japan at: Thu Sep 28 00:13:46 2023

Regex Pattern

\[Network-log\]:\sUser\snamed\s([\w\s]+)\sfrom\s([\w]+)\sdepartment\saccessed\sthe\sresource\s([\w]+\.[\w]+\/[\w]+\.[\w]+)\sfrom\sthe\ssource\sIP\s((?:\d{1,3}\.){3}\d{1,3})\sand\scountry\s([\w\s]+)\sat

\[Network-log\]:
\sUser\snamed\s
([\w\s]+) -> first field we want to extract: Username
\sfrom\s
([\w]+) -> second field we want to extract: Department
\sdepartment\saccessed\sthe\sresource\s
([\w]+\.[\w]+\/[\w]+\.[\w]+) -> third field: Domain
\sfrom\sthe\ssource\sIP\s
((?:\d{1,3}\.){3}\d{1,3}) -> fourth field: IP address
\sand\scountry\s
([\w\s]+) -> fifth field: Country
\sat

 

The date and time are not required, so the pattern stops at \sat.
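
Once the event boundaries are applied, you can also test the pattern inline with rex before committing it to transforms.conf (a sketch, assuming index=main):

Splunk Query:

index=main sourcetype=network_logs
| rex field=_raw "\[Network-log\]:\sUser\snamed\s(?<Username>[\w\s]+)\sfrom\s(?<Department>\w+)\sdepartment\saccessed\sthe\sresource\s(?<Domain>\w+\.\w+\/\w+\.\w+)\sfrom\sthe\ssource\sIP\s(?<Source_IP>(?:\d{1,3}\.){3}\d{1,3})\sand\scountry\s(?<Country>[\w\s]+)\sat"
| table Username Department Domain Source_IP Country

If the table shows all five columns populated, the pattern is ready to be used at index time.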

STEP 2: Create transforms.conf

Bash:

[network_custom_fields]
REGEX = \[Network-log\]:\sUser\snamed\s([\w\s]+)\sfrom\s([\w]+)\sdepartment\saccessed\sthe\sresource\s([\w]+\.[\w]+\/[\w]+\.[\w]+)\sfrom\sthe\ssource\sIP\s((?:\d{1,3}\.){3}\d{1,3})\sand\scountry\s([\w\s]+)\sat\:\s
FORMAT = Username::$1 Department::$2 Domain::$3 Source_IP::$4 Country::$5
WRITE_META = true
STEP 3: Create fields.conf

Bash:

[Username]
INDEXED = true

[Department]
INDEXED = true

[Domain]
INDEXED = true

[Source_IP]
INDEXED = true

[Country]
INDEXED = true
STEP 4: Restart Splunk

Bash:

/opt/splunk/bin/splunk restart

After restart, your five custom fields should appear in the Splunk UI. Select them from the field sidebar to confirm parsing is successful.
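
Because WRITE_META = true writes the extractions into the index and fields.conf marks them as indexed, they can also be queried with tstats. A quick check (assuming index=main):

Splunk Query:

| tstats count where index=main by Username

If this returns per-user counts, the index-time extraction is working.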

Q & A

Q1: What is the full path of the FIXIT app directory?

This is the standard path where Splunk stores app files.

Answer

/opt/splunk/etc/apps/fixit

Q2: What Stanza will we use to define Event Boundary in this multi-line Event case?

This is a sample of our log:

[Network-log]: User named Johny Bil from Development department accessed the resource Cybertees.THM/about.html from the source IP 192.168.0.1 and country Japan at: Thu Sep 28 00:13:46 2023

First, we want to break the event before `[Network-log]`.

NOTE:
As an example, BREAK_ONLY_BEFORE = ^\d{4}-\d{2}-\d{2} identifies the start of a new event if it begins with a date in the format `YYYY-MM-DD`.

Answer

BREAK_ONLY_BEFORE

Q3: In inputs.conf, what is the full path of the network-logs script?

We know that inputs.conf is located inside the app's default folder:

Bash:

nano /opt/splunk/etc/apps/fixit/default/inputs.conf

We can find our answer there.

Answer

/opt/splunk/etc/apps/fixit/bin/network-logs
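
For reference, a scripted input of this kind is normally declared with a script:// stanza; the values below (interval, index, sourcetype) are illustrative assumptions, not copied from the room:

Bash:

[script:///opt/splunk/etc/apps/fixit/bin/network-logs]
# interval, index, and sourcetype are assumed values for illustration
interval = 5
index = main
sourcetype = network_logs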

Q4: What regex pattern will help us define the Event's start?

We can use regex101.com to create our pattern.

Copy the sample logs we got at the beginning into the test-string field.


The event start is marked by [Network-log].

Answer

[Network-log]

Q5: What is the captured domain?

It’s clearly visible in the logs and in the extracted Domain field.

Answer

Cybertees.THM

Q6: How many countries are captured in the logs?

Counted via the Country field after extraction.
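
One stats search gives the distinct counts for this question and the next three (again assuming index=main):

Splunk Query:

index=main sourcetype=network_logs
| stats dc(Country) AS countries dc(Department) AS departments dc(Username) AS usernames dc(Source_IP) AS source_ips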

Answer

12

Q7: How many departments are captured in the logs?

Checked using distinct values in the Department field.

Answer

6

Q8: How many usernames are captured in the logs?
Answer

28

Q9: How many source IPs are captured in the logs?
Answer

52

Q10: Which configuration files were used to fix our problem? [Alphabetic order: File1, file2, file3]

These were the three key config files involved.

fields.conf, props.conf, transforms.conf

Answer

fields.conf, props.conf, transforms.conf

Q11: What are the TOP two countries the user Robert tried to access the domain from? [Answer comma-separated and in alphabetic order] [Format: Country1, Country2]

Splunk Query:

index=main Username=Robert*
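
Piping the same search into top ranks the countries directly (limit=2 keeps only the two most frequent):

Splunk Query:

index=main Username=Robert*
| top limit=2 Country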
Answer

Canada, United States

Q12: Which user accessed the secret-document.pdf on the website?

Simple search on the Domain field shows:

Splunk Query:

index=main Domain=*secret-document.pdf
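
Adding a table of the extracted fields makes the matching user obvious:

Splunk Query:

index=main Domain=*secret-document.pdf
| table Username, Domain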
Answer

sarah hall

Conclusion

Turns out that when your logs look like a spaghetti mess, it’s kinda hard to do security analysis.

Who could’ve guessed? After teaching Splunk how to read (thanks, props.conf), we finally got structured events and could extract all the juicy fields. Suddenly, usernames made sense, IPs showed up, and regex wasn’t just a random keyboard smash.

Lesson learned: logs are only useful if they don’t make your eyes bleed.