mkiesel.ch

geopipe - Or How to Avoid Accidental Travel Bans

03. Aug 2022, by manu in posts

Intro #

About two weeks ago I found myself wanting to do some statistics on a specific range of servers. And by statistics, I mean some light, non-intrusive port scanning. Now the thing with port scanning is that some countries already treat it as a gray or even outright illegal activity. And if you are careless with, for example, nmap -sC, you can potentially execute code on the target that would definitely not be legal.

With that in mind, I wanted to stay in a somewhat safe harbor by only scanning Swiss servers. I noticed pretty quickly that just because a domain ends in .ch (the TLD↗ of Switzerland) does not mean the server is located in Switzerland. My safe harbor would only exist on servers physically in Switzerland where Swiss law applies. So I needed a way to filter out domains that don’t resolve to an IP inside the Swiss cyberspace.

But I Hear You Ask #

What about CDNs↗ and all the sites behind Cloudflare and other Proxies or WAFs↗? And couldn’t you just scan the known Swiss IP space instead of domain names ending in .ch?

Regarding proxies: this is absolutely true. For now, I’m gonna ignore those sites, since there is no reliable, let alone automated, way to find the real IP addresses behind them if they are set up correctly. The problem with scanning by IP is simple: a web server can just ignore requests that arrive with the header Host: <IP> instead of Host: myserver.ch. This can even be by design, for example when a company runs its own reverse proxy. The server hosts many domain names and routes incoming traffic according to, among other things, the Host header.

MaxMind #

I actually did not do any research into other solutions; the one way to geolocate IPs that I knew of was MaxMind↗. Since it is free (as in it requires a free account), works offline, and there are many libraries/modules and examples around its GeoLite2 database, it was my database of choice.

All the Cool Kidz Use Golang #

So I was in need of a tool that could simply filter out domains that didn’t have an IP inside the Swiss cyberspace. I don’t code that much and wanted to keep it simple. I would put this tool in the “pipeline tools” category, like for example httpx↗, hakrawler↗, gobuster↗, ffuf↗ and co. “Pipelining” here means reading input via stdin and writing the processed results to stdout.

Most tools inside this category are written in Golang↗. The three things I’d been told about the language before starting were that it’s blazingly fast, simple to write, but really ugly to look at. Sometimes I actually wanna try out something new, and after discovering that Gregory Oschwald↗ made a MaxMind GO module↗, I installed it and began my journey.

A Wild Deadlock Appears #

After the initial “Hi Mom” I took thelicato’s fire↗ as my base code and started from there. I quickly figured out that goroutines would be the way to go. They are kinda threads, but not really. Calling a function with the keyword go in front starts it in the background and continues with the code after it. Those routines are super powerful but easy to mess up.

They for example can’t return a value directly the way a normal function can. Instead they rely on channels to send messages to each other or back to the main function. This is where I met the enemy of this project, the deadlock error message, for the first time. When defining a channel you either make it unbuffered or give it a buffer capacity. On an unbuffered channel every send blocks until another goroutine reads from it, so you must read from it as many times as you write to it. If you don’t do that, the program will hang. To signal the end of a channel you can close it. If you close it at the wrong moment, or never close it while someone is still waiting to read, a deadlock will occur at runtime.

As a Golang noob, that messed me up. I did not know exactly when to close a channel, when to wait for finished workers, and when to read from channels.

Too Many Addresses #

I got the first PoC working pretty fast but had a little performance problem, which needs a bit of backstory. The MaxMind database wants an IP address, not a domain name, for geolocation. So, easy peasy, do some DNS resolution before looking up the location, right? The problem is that a single domain can have multiple IP addresses assigned to it. Since I did not yet understand goroutines and wait groups, this messed my program up.

After taking all the domains from stdin I would count how many there were, and at the end I would read the output channel of the workerDB goroutine exactly that many times. But this will hang the program or end in a deadlock, since I can have 5 domains on stdin but 8 IP addresses after the DNS lookup. So for the first PoC I did the DNS lookups inside main() and waited for them to finish before starting the workers. That is not great for performance, since everything waits until all DNS lookups are done.

Stop, Wait a Minute #

With a little hint from a work friend, I eventually figured out that I needed multiple wait groups for my program to function. Wait groups are, in short, a way to track how a set of workers is doing. You can wait at specific points in your code for the workers to finish before going further.

You can find the finished project here↗.