Home Lab Chronicles: Building a Robust Network Infrastructure

I recently put together a home lab designed to be both affordable and powerful, with a strong focus on flexibility and scalability.
In this blog post, I’ll share the details of my setup and why I chose the components I did, including the TP-Link Omada ecosystem and various open-source solutions.
My setup revolves around TP-Link’s Omada ecosystem, a choice I made for its balance of functionality and budget-friendliness. Here’s a quick look at the core pieces of hardware:

One of the most important aspects of my home lab is the virtualized environment I’ve built using Proxmox.
Proxmox is an open-source type 1 hypervisor platform, and it runs on my old Dell computer, which is equipped with a quad-core Intel Core i7-3770 processor and 32GB of RAM.
Not the most powerful PC, but it’s more than enough to run a variety of projects.
This setup allows me to run a range of virtual machines (VMs) for different use cases:
This environment gives me the flexibility to test different systems without needing multiple physical machines, making it cost-effective and space-efficient.
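Everything in Proxmox is also exposed through a REST API, which makes it easy to script quick inventory checks on those VMs. Below is a minimal sketch using the third-party proxmoxer Python library; the host address, node name, and credentials are placeholders, not my actual setup.

```python
# Minimal sketch: list the VMs on a Proxmox node via its REST API.
# Assumes the third-party "proxmoxer" and "requests" packages are installed;
# the host address, node name, and credentials are placeholders.
from proxmoxer import ProxmoxAPI

proxmox = ProxmoxAPI(
    "192.168.1.10",        # Proxmox host (placeholder)
    user="root@pam",
    password="changeme",   # placeholder credential
    verify_ssl=False,      # lab box with a self-signed certificate
)

for vm in proxmox.nodes("pve").qemu.get():  # "pve" is the default node name
    print(vm["vmid"], vm["name"], vm["status"])
```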
A key part of my network setup is the use of VLANs (Virtual Local Area Networks). I’ve created multiple VLANs to segregate traffic for specific purposes:

Segmenting traffic like this not only improves security but also ensures better performance and easier troubleshooting.

Backing up my virtual environments and data is crucial, and for that, I use two 4TB external drives.
One is dedicated to backing up my Proxmox server, while the other backs up my Synology NAS.
This backup strategy ensures that even if something goes wrong with my setup, I can restore my data quickly and efficiently.
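Backups only count if they actually ran, so I like the idea of a quick freshness check. Here’s a minimal Python sketch; the mount point and the 36-hour threshold are illustrative assumptions, not the exact paths on my drives.

```python
# Minimal sketch: warn if the newest Proxmox backup file is older than expected.
# The mount point and the 36-hour threshold are illustrative assumptions.
import time
from pathlib import Path

BACKUP_DIR = Path("/mnt/backup1/dump")  # placeholder path for the external drive
MAX_AGE_HOURS = 36

files = sorted(BACKUP_DIR.glob("vzdump-*"), key=lambda p: p.stat().st_mtime)
if not files:
    print("No backup files found!")
else:
    age_hours = (time.time() - files[-1].stat().st_mtime) / 3600
    status = "OK" if age_hours <= MAX_AGE_HOURS else "STALE"
    print(f"{status}: newest backup {files[-1].name} is {age_hours:.1f}h old")
```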
To keep everything organized and avoid the typical clutter of cables, I’ve rack-mounted my equipment in a 6U rack.
It holds my 16-port switches, a 24-port patch panel, and a surge-protected power strip.
The clean, efficient setup not only saves space but also makes managing the hardware much easier.
To crimp my Cat6 cables, I stripped about 1.5 inches of the jacket, arranged the wires in the T568B order (white-orange, orange, white-green, blue, white-blue, green, white-brown, brown), and inserted them into RJ-45 connectors.

Using a crimping tool to terminate the connectors, I made 3- to 5-inch patch cables to keep the cable management in my rack tidy.
Finally, I used a cable tester to verify proper alignment and ensure each connection was secure and functioning correctly.

Proper wire labeling is essential in my home lab for organization and clarity. I used self-adhesive labels and a label maker to create clear, durable tags, ensuring consistent formatting.

Each cable is labeled on both ends with key details like cable type (e.g., Cat6), destination (e.g., “Living Room AP” for the access point), and any relevant VLAN information.
This systematic approach simplifies troubleshooting and makes future modifications more efficient.
Instead of purchasing outdated Cisco hardware, I decided to use Cisco Packet Tracer for my networking simulations.
This software provides me with all the tools I need to experiment with Cisco configurations and hone my networking skills without the expense or space requirements of actual hardware.
One of the most interesting aspects of my home lab is running Wazuh, an open-source security platform, on an Ubuntu Linux VM.
With Wazuh, I can test endpoint protection, intrusion detection, and other cybersecurity measures, allowing me to practice handling real-world security incidents in a safe environment.

One of the main reasons I decided to go with TP-Link’s suite of products was the Omada controller function.
Having a single, centralized interface to manage all of my devices makes network administration much more efficient.
The TP-Link Omada ecosystem is also incredibly budget-friendly, which was a big factor in my decision.
Building this home lab has been an incredibly rewarding experience.
It’s not just about having a space to test and experiment—it’s about creating a real-world learning environment that mimics the challenges faced in professional IT and network administration.
Whether you’re new to networking or looking to deepen your skills, a home lab can provide endless opportunities to grow.
If you’re looking to build your own lab, my advice is simple: start with a clear goal, find budget-friendly equipment that meets your needs, and take advantage of open-source software wherever possible.

How We Recovered from a Ransomware Attack

While many people have heard about the infamous Qlocker ransomware, which encrypted files using the 7-Zip utility, we had the misfortune of dealing with eCh0raix, a different strain that exploited vulnerabilities in QNAP’s firmware.
The attack wasn’t the result of poor network security or a careless mistake on our part—hackers took advantage of an unpatched security flaw in the NAS firmware, making our defenses futile against the attack.
It started as a regular day—until we noticed that many of our company files were suddenly inaccessible.
Digging deeper, we discovered the files had been encrypted, and a ransom note demanding 0.01 Bitcoin (approximately $550 at the time) had been left alongside them.
The hackers provided a Tor link where we were expected to pay the ransom in exchange for the decryption key.
The attackers were swift and precise, locking down our incremental backups as well and leaving us with a decision: either pay the ransom or restore from an older backup, knowing we’d lose about 24 hours’ worth of data. We chose not to pay.
Luckily, we had been diligent about our backups. We were able to restore our critical data from an earlier backup, effectively losing only a day’s worth of work.
Despite the relatively minor data loss, we decommissioned the QNAP NAS as soon as we could.
We replaced it with a Synology NAS, which has a more robust reputation for security.
In hindsight, this decision was crucial—just a year later, QNAP devices were hit with yet another ransomware attack on September 3, 2022.
At that point, it became clear that hackers had made QNAP a persistent target.
This experience brought to light a few crucial lessons. One of the most important? Always change the default ports on your NAS.
Many NAS devices, including QNAP, use default ports (like 8080 and 443) that are easy targets for hackers scanning for vulnerable systems.
By changing these ports, you can add an extra layer of protection against automated attacks.
Additionally, always make sure your NAS firmware is up to date.
QNAP released patches after these vulnerabilities were exploited, but many users had already been affected by the time those updates were made available.
Regular patching is key to staying ahead of potential attacks.
Beyond those immediate fixes, the incident taught us two broader lessons. First, no matter how secure your network infrastructure is, vulnerabilities in your storage devices can still be exploited.
The eCh0raix ransomware proved that even strong network defenses are useless if firmware is left unpatched. The second lesson was the importance of a reliable backup strategy.
Because we regularly performed full and incremental backups, we avoided the devastating consequences that many businesses face when ransomware strikes.
Our ability to quickly restore from a backup made all the difference in resolving the issue without paying the ransom.
During the attack, I scoured the Reddit forums for a possible fix, where many other QNAP users shared their frustration with the eCh0raix and Qlocker attacks.
Some had opted to pay the ransom, while others were able to recover files through backups or decryption tools that were being developed by security researchers.
But for many, there was no easy fix, and they were left weighing the cost of the ransom against the potential loss of critical business data.
The decision to switch to a Synology NAS was a proactive measure to avoid future ransomware attacks.
While no system is invulnerable, Synology has a solid track record in terms of security patches and rapid response to vulnerabilities.
Additionally, we implemented even more stringent backup policies, ensuring multiple layers of redundancy, including offsite storage solutions.
Ultimately, this experience served as a harsh reminder that technology can be both an asset and a liability.
Vulnerabilities will always exist, and bad actors are constantly looking for ways to exploit them.
If you’re using a QNAP or any NAS device, make sure it’s regularly updated, and more importantly, maintain a reliable and well-tested backup system.
Avoiding ransomware attacks may not always be possible, but being prepared for recovery is within your control.
And as for QNAP? They seem to have angered someone in the hacker world because their vulnerabilities keep being targeted—proving once again that cybersecurity is a cat-and-mouse game.

My Experience Installing Docker: A Journey into Containerization

Recently, I took the plunge into the world of containerization by installing Docker on my dusty old Dell computer, equipped with a quad-core 3.4GHz Intel Core i7-3770 and maxed out at 32GB of RAM. For several years I had kept hearing about containerization; I was familiar with virtualization but didn’t know much about running apps in containers, so I did some searching to find out what it was all about.
Eager to explore how Docker can simplify application development and deployment, I set out to familiarize myself with this powerful tool using Proxmox as my virtualization platform.
I decided to install Docker inside a Proxmox container (LXC), leveraging its capabilities to create a streamlined environment. The installation process was relatively straightforward, thanks to the wealth of online resources available.

After getting Docker up and running, I quickly deployed Portainer to give myself a web UI for managing containers. Running my first container with the hello-world image was a thrilling moment; seeing the success message confirmed everything was working as expected, and I felt a sense of accomplishment. I allocated 14GB of RAM, 4 CPU cores, and 90GB of disk space to the Proxmox container hosting Docker, which keeps performance reasonable on my old hardware.
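For anyone who prefers scripting over the CLI, the same hello-world check can be done with the Docker SDK for Python. This is a minimal sketch of that idea rather than exactly what I typed, and it assumes the docker pip package is installed.

```python
# Minimal sketch: reproduce "docker run hello-world" with the Docker SDK for Python.
# Assumes the "docker" pip package is installed and the daemon socket is reachable.
import docker

client = docker.from_env()          # connect to the local Docker daemon
print(client.version()["Version"])  # confirm the daemon answers

output = client.containers.run("hello-world", remove=True)
print(output.decode())              # the familiar "Hello from Docker!" banner
```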
With Docker installed and Portainer set up, I began to explore its capabilities further. I experimented with pulling and running various container images, including setting up a simple web server using Nginx. The convenience of quickly launching containers and managing services through Portainer was a game-changer for my workflow. Using Docker Compose also simplified the orchestration of multi-container applications, making it even more intuitive to manage.
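As a rough illustration of how simple that Nginx test was, here is the equivalent with the Docker SDK for Python, rather than what I actually clicked through in Portainer; the host port and container name are just values I would pick for a throwaway lab test.

```python
# Minimal sketch: launch an Nginx container and map it to a host port.
# The host port and container name are illustrative choices.
import docker

client = docker.from_env()
web = client.containers.run(
    "nginx:latest",
    name="lab-web",          # illustrative name
    ports={"80/tcp": 8080},  # host port 8080 -> container port 80
    detach=True,
)
print(web.status)            # "created"/"running"
# Later: web.stop(); web.remove()
```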
Despite the smooth installation, I faced a few challenges along the way. Navigating Docker’s networking configurations initially proved tricky, and I spent some time learning how to connect containers effectively. Additionally, getting accustomed to Docker’s command-line interface took some practice, but as I continued to use it, I became more comfortable with the commands.
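What finally made the networking click for me was realizing that containers on the same user-defined bridge network can reach each other by name. Here is a minimal sketch of that idea with the Docker SDK for Python; the network and container names are placeholders.

```python
# Minimal sketch: put two containers on a user-defined bridge network so they
# can reach each other by container name. Names here are placeholders.
import docker

client = docker.from_env()
net = client.networks.create("labnet", driver="bridge")

web = client.containers.run("nginx:latest", name="web", network="labnet", detach=True)

# From a second container on the same network, "web" resolves via Docker's DNS:
probe = client.containers.run(
    "curlimages/curl", ["-sI", "http://web"], network="labnet", remove=True
)
print(probe.decode().splitlines()[0])   # e.g. "HTTP/1.1 200 OK"

web.stop(); web.remove(); net.remove()  # clean up the test resources
```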
Overall, installing Docker on my old Dell machine was a rewarding experience that opened up new avenues in application development. While there were some challenges, each one provided valuable lessons and insights. As I continue to explore Docker’s capabilities, I’m excited about the potential it holds for streamlining my future projects and enhancing my skills in the ever-evolving tech landscape. The combination of Proxmox, Docker, and Portainer has truly transformed my approach to virtualization and containerization.

Deploying Wazuh in My Home Lab: A Personal Experience

My goal was to create a robust monitoring solution for three devices:
a physical Windows desktop, a virtualized Linux machine running Ubuntu, and another virtualized Windows machine.
By leveraging Proxmox for virtualization, I aimed to gain practical experience that would help me navigate the complexities of security compliance frameworks and improve my skills in securing networks.
However, the journey was filled with excitement and a few challenges along the way.
I’ve been running Proxmox for several months now, so it was relatively straightforward to create an environment for Wazuh to run on.
I created virtual machines for both the Ubuntu and Windows systems.
However, I quickly realized that configuring the networking correctly took some trial and error.
I had to double-check IP addresses and ensure that all devices were communicating properly, which led to a few frustrating moments.
I needed to allocate more resources (RAM) so that everything would run smoothly. Here’s a snapshot of the resource allocation for my Wazuh deployment:
Once I got the virtual machines up and running, I dove into installing Wazuh. At first, I was optimistic, but I faced some challenges. The installation process was not as smooth as I had hoped.
For instance, I had some trouble getting the Wazuh agent on my Windows desktop to connect to the Wazuh manager. I spent quite a bit of time tweaking configurations and checking logs to figure out what was wrong.
It turned out that I had to adjust some firewall settings to allow communication between the devices.
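By default the Wazuh agent talks to the manager on 1514/tcp and enrolls over 1515/tcp, so a quick reachability test from the agent machine narrows down whether a firewall is in the way. Here is a minimal Python sketch; the manager address is a placeholder.

```python
# Minimal sketch: check from an agent machine that the Wazuh manager's default
# ports are reachable (1514/tcp agent traffic, 1515/tcp enrollment).
# The manager address is a placeholder.
import socket

MANAGER = "192.168.1.50"  # placeholder Wazuh manager address

for port, purpose in [(1514, "agent communication"), (1515, "agent enrollment")]:
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(3)
        reachable = s.connect_ex((MANAGER, port)) == 0
        print(f"{port} ({purpose}): {'reachable' if reachable else 'blocked?'}")
```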
To access the Wazuh dashboard remotely, I decided to set up NGINX as a reverse proxy. This part of the process was a bit daunting for me as a newcomer.
I followed various guides but ran into issues with the DNS setup. There were confusing moments when the site wouldn’t load; usually a new A record starts resolving right away, but this time it took a bit of refreshing before the site popped up with a valid SSL certificate.
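If you run into the same DNS lag, a small script that confirms the record resolves and that NGINX is serving a valid certificate saves a lot of blind refreshing. This is just a sketch; the hostname is a placeholder and it assumes the requests package is available.

```python
# Minimal sketch: confirm the dashboard hostname resolves and NGINX serves a
# valid certificate. The hostname is a placeholder; requires the "requests" package.
import socket
import requests

HOST = "wazuh.example.com"  # placeholder DNS name pointed at the reverse proxy

try:
    ip = socket.gethostbyname(HOST)
    print(f"{HOST} resolves to {ip}")
except socket.gaierror:
    print(f"{HOST} does not resolve yet (DNS may still be propagating)")
else:
    r = requests.get(f"https://{HOST}", timeout=10)  # raises SSLError if the cert is bad
    print(f"HTTPS responded with status {r.status_code}")
```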
Once Wazuh was up and running, I was excited to start monitoring the activity on my devices.
I was amazed at how Wazuh gathered data from the Windows desktop, tracking login attempts and system changes.
However, I soon realized that I needed to spend time familiarizing myself with the dashboard.
At first, it was overwhelming to interpret the alerts and logs. I found myself sifting through notifications, trying to determine what was normal and what might be a genuine threat.
I also had to spend time learning how to customize rules for monitoring, especially for the Ubuntu VM. The initial settings didn’t quite match my needs, so I had to dig into the documentation to figure out how to tailor the alerts for my setup.
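Once you’re comfortable in the dashboard, the same information is also available through the Wazuh RESTful API on port 55000, which is handy for quick scripted checks like "are all my agents still connected?". Here is a minimal sketch based on my understanding of that API; the manager address and credentials are placeholders, and certificate verification is skipped only because my lab manager uses a self-signed certificate.

```python
# Minimal sketch: list agents and their connection status via the Wazuh API
# (default port 55000). Address and credentials are placeholders; TLS verification
# is disabled here only because the lab manager uses a self-signed certificate.
import requests

MANAGER = "https://192.168.1.50:55000"  # placeholder manager address
AUTH = ("wazuh", "changeme")            # placeholder API credentials

token = requests.post(
    f"{MANAGER}/security/user/authenticate", auth=AUTH, verify=False, timeout=10
).json()["data"]["token"]

agents = requests.get(
    f"{MANAGER}/agents",
    headers={"Authorization": f"Bearer {token}"},
    verify=False,
    timeout=10,
).json()["data"]["affected_items"]

for agent in agents:
    print(agent["id"], agent["name"], agent["status"])
```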
Deploying Wazuh in my home lab has been a journey of discovery filled with its share of challenges.
While I faced issues with networking, installation, and configuration, each hurdle taught me something new about cybersecurity and system monitoring.
As I continue to refine my setup and expand my knowledge, I’m excited to see how Wazuh can help me stay vigilant against potential threats in a network.