Building a beginner’s detection lab with Defender, Sentinel, and Splunk

Regan · Published in Detect FYI · Jan 17, 2024 · 5 min read


This is how you’ll feel once you’ve built a detection lab.

If you work in an MSSP as an analyst or consult a lot, you'll likely come in contact with SIEMs, EDRs, and data sources that are new, unfamiliar, or just plain forgotten. Oftentimes, you might find yourself wishing you had access to a particular product or dataset for a given situation. This is where a detection lab comes in handy! In this post, I will go over the practical steps I used to build a detection lab at home: how I decided to host the machines, the boxes I set up, and the steps to get data into Sentinel and Splunk respectively.

Goals

The first thing to define is your goals — what kind of data you configure and bring into which platform will be determined by what you want to get out of your lab. While I primarily work on Microsoft Sentinel and Defender, I also occasionally step into Splunk, so today I'm going to set up both Sentinel and Splunk. You, however, may wish to use something completely different for your SIEM! This is about your learning, so pick something you want to gain practical skills in.

The components I'll be using to build my lab, in no particular order:
- An active Azure tenant
- Hyper-V and Azure Arc — For Azure management, and the ability to deploy the Azure Monitor Agent.
- A Microsoft Sentinel workspace
- Defender for Cloud — Provides access to Defender for Endpoint
- Standalone Splunk with a development license
- Tailscale for VPN networking

You can realistically use any combination of products and tools you have access to. You could easily re-create this at a smaller scale with tools like ELK instead of Splunk, paired with Elastic's EDR agent. You might also want to roll with something like Aurora Lite by Nextron Systems for live Sigma detections, or maybe even Velociraptor.

Building the Lab

The choice to go with Hyper-V as my host was a fairly pragmatic one — I’ve never really used it before. I also wanted a chance to play with Azure Arc and the Windows Admin Center in Azure as the management plane.

On a fresh instance of Windows Server, I started by installing the necessary Hyper-V roles, and dropping the Azure Arc onboarding script onto the server. Within minutes, Arc was talking back to the Azure portal on my host box node-1.
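For reference, the host prep boils down to a couple of commands. This is only a rough sketch, not the generated onboarding script itself; the resource group, tenant, subscription, and region values below are placeholders you'd take from the script the Azure portal generates for you.

```powershell
# Enable the Hyper-V role and management tools (this reboots the host)
Install-WindowsFeature -Name Hyper-V -IncludeManagementTools -Restart

# After the reboot, run the Arc onboarding script generated in the portal.
# Under the hood it installs the Connected Machine agent and calls roughly:
azcmagent connect `
    --resource-group "<resource-group>" `
    --tenant-id "<tenant-id>" `
    --subscription-id "<subscription-id>" `
    --location "<region>"
```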

My Hyper-V host.

Arc unlocks a bunch of useful capabilities. For the purposes of this lab I'm mainly interested in the Windows Admin Center integration, but critically, Arc also allows us to install the Azure Monitor Agent, which lets us ingest Security Events and other log sources off the machine via Data Collection Rules.
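In practice the agent usually gets installed for you when you associate the machine with a Data Collection Rule in the portal, but for reference, a CLI route looks roughly like this (the machine name matches my host, the resource group and region are placeholders, and you'll need the connectedmachine extension for az):

```bash
az connectedmachine extension create \
  --machine-name "node-1" \
  --resource-group "<resource-group>" \
  --location "<region>" \
  --name AzureMonitorWindowsAgent \
  --publisher Microsoft.Azure.Monitor \
  --type AzureMonitorWindowsAgent \
  --enable-auto-upgrade true
```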

From Arc, we're easily able to deploy Windows Admin Center (referred to hereon as WAC) from the management blade and connect to the box. I was immediately struck by how easy it was to set up and how much capability WAC gives us to manage the machine, all through the Azure portal.

⚠️ A small caveat — I had huge issues getting it to work on Firefox, so you might want to use something Chromium-based for the UI to load properly!

Windows Admin Center running on my Hyper-V host.

We can also take a moment to install Tailscale on our host, an awesome zero-config mesh VPN that will let me connect to the lab from anywhere.
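On the Windows host it's just the standard installer, but for the Ubuntu servers coming up next, the usual one-liner from Tailscale's docs is all it takes:

```bash
# Install the Tailscale package and bring the node up (prints an auth URL)
curl -fsSL https://tailscale.com/install.sh | sh
sudo tailscale up
```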

I’ve also created Windows machines for each of the EDR products I want to run, an Ubuntu server for Splunk, and another Ubuntu server which will be used for Caldera. These machines will act as our data sources, with Caldera acting as our Atomic Red Team orchestrator later on.

Setting up Splunk

Following Splunk's documentation, I installed Splunk on a new Ubuntu server as a standalone deployment, pulling triple duty as a Search Head, Indexer, and Deployment Server. This keeps the setup simple and lets me manage all the different apps from one place.
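Roughly speaking, a standalone install on Ubuntu comes down to grabbing the .deb from splunk.com and letting it run as a service (the package filename below is a placeholder for whatever version you download):

```bash
# Install the Splunk Enterprise .deb downloaded from splunk.com
sudo dpkg -i splunk-<version>-linux-amd64.deb

# Accept the license, set admin credentials, and start Splunk
sudo /opt/splunk/bin/splunk start --accept-license

# Have systemd start Splunk on boot
sudo /opt/splunk/bin/splunk enable boot-start -systemd-managed 1

# Open the receiving port the Universal Forwarders will send to
sudo /opt/splunk/bin/splunk enable listen 9997
```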

At this point, you’ll want to deploy the Splunk Universal Forwarder to the boxes you want to collect Windows events from, install the Windows app for Splunk, and create a server class for our endpoints. You’ll then need two deployment apps — one that will tell our forwarders to send the logs to Splunk, and another that will collect Windows security events.

The server classes on my deployment server.

Always remember to check ‘Restart Splunkd’! Otherwise, the changes won’t take effect until the Splunk service on each endpoint is restarted.
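For reference, the two deployment apps really just carry one small .conf file each. Something along these lines works, assuming your Splunk server listens on the default receiving port 9997 (the hostname is a placeholder):

```ini
# App 1, local/outputs.conf - point the forwarders at the indexer
[tcpout]
defaultGroup = primary_indexers

[tcpout:primary_indexers]
server = splunk.lab.internal:9997

# App 2, local/inputs.conf - collect the Windows Security event log
[WinEventLog://Security]
disabled = 0
renderXml = false
```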

With a quick search, we can validate that we’re receiving events in Splunk:

Big data!
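The exact search will depend on your index and how the Windows add-on is configured, but a sanity check along these lines is enough to confirm events are flowing:

```spl
index=* (source="WinEventLog:Security" OR source="XmlWinEventLog:Security")
| stats count by host, source
```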

Setting up Sentinel

After creating your Sentinel workspace, the first thing to do is go through the Content hub and deploy all the Microsoft solutions and connectors, most critically the Defender XDR solution. Once those have been deployed, and assuming you've got Defender active in your tenant, you can connect it to your workspace. Be sure to dive into the ‘Defender XDR’ connector and enable all the telemetry sources!

We can now see Defender sending the full suite of telemetry to Sentinel:

Bigger data!
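If you want to check from the Logs blade yourself, a quick query against one of the Defender device tables confirms the telemetry is landing (any of the Advanced Hunting tables will do):

```kusto
DeviceProcessEvents
| where TimeGenerated > ago(1h)
| summarize Events = count() by DeviceName
```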

Now that we have the foundations in place, the last thing I’ll do in this part is run Yamato Security’s script for quickly enabling the necessary audit logs for Sigma rule coverage on our Windows machines.

Always check the code before running it on your endpoints! Lab or not, we’re cybersecurity professionals!
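To give a rough idea of what a script like that configures, it largely boils down to auditpol and registry tweaks along these lines (illustrative examples only, not an excerpt of the actual script):

```bat
:: Turn on process creation auditing, one of the highest-value Sigma sources
auditpol /set /subcategory:"Process Creation" /success:enable /failure:enable

:: Include the full command line in event 4688
reg add "HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System\Audit" /v ProcessCreationIncludeCmdLine_Enabled /t REG_DWORD /d 1 /f
```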

At this point you may wish to onboard your Windows endpoints to Azure Arc so you can deploy the Azure Monitor Agent and collect the Windows events in Sentinel as well. Likewise, you should also consider installing the Azure app for Splunk so you can ingest those juicy Defender Advanced Hunting events into Splunk. And just like that, you’ve got an insane amount of log and EDR coverage going into your SIEMs! You’re now ready to build Caldera or your threat emulation framework of choice and get to work!
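One last sanity check: once the Azure Monitor Agent and a Data Collection Rule are in place, the Windows events should land in Sentinel's SecurityEvent table, which you can confirm with something like:

```kusto
SecurityEvent
| where TimeGenerated > ago(1h)
| summarize Events = count() by Computer, EventID
| sort by Events desc
```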

If this has been helpful, or you have any questions about my specific setup, my DMs are open at @rcegann on Twitter.


Security Engineer with a focus on Microsoft Sentinel, the Defender stack, and a bit of Splunk. Opinions are my own. Hack the planet.