Get your privacy back
Generate fake web browsing and mitigate tracking
PartyLoud is a highly configurable and straightforward free tool that helps you prevent tracking, directly from your Linux terminal and with no special skills required. Once started, you can forget it is running. It provides several flags; each flag lets you customize your experience and change PartyLoud's behaviour to suit your needs.
- Simple. 3 files only, no installation required, just clone this repo and you're ready to go.
- Powerful. Thread-based navigation.
- Stealthy. Optimized to emulate user navigation.
- Portable. You can use this script on any Unix-based OS.
This project was inspired by noisy.py
How It Works
- URLs and keywords are loaded (either from partyloud.conf and badwords or from user-defined files)
- If the proxy flag has been used, the proxy configuration is tested
- For each URL in the URL list a thread is started; each thread has a user agent associated with it
- Each thread starts by sending an HTTP request to the given URL
- The response is filtered using the keywords in order to prevent 404s and malformed URLs
- A new URL is chosen from the list generated after filtering
- The current thread sleeps for a random time
- Actions from 4 to 7 are repeated with the new URL until the user sends a kill signal (CTRL-C or enter key)
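The per-thread loop described above can be sketched roughly as follows. This is an illustrative sketch, not the script's actual internals: the function names, the link-extraction regex, and the 1-10 second sleep range are all assumptions.

```shell
#!/usr/bin/env bash
# Sketch of one PartyLoud-style engine loop (illustrative only).

# Extract candidate links from an HTML page, drop any that match a
# keyword blocklist (one fixed-string pattern per line, as in badwords),
# and pick one of the survivors at random.
pick_next_url() {
    local html="$1" blocklist="$2"
    grep -oE 'href="https?://[^"]+"' <<<"$html" \
        | sed -E 's/^href="//; s/"$//' \
        | grep -vFf "$blocklist" \
        | shuf -n 1
}

# Repeat: fetch the current URL, choose the next one from the response,
# then sleep for a random time before the next request.
engine_loop() {
    local url="$1" blocklist="$2" page next
    while true; do
        page="$(curl -sL --max-time 10 "$url")" || break
        next="$(pick_next_url "$page" "$blocklist")"
        [ -n "$next" ] || break          # no usable link: stop this engine
        url="$next"
        sleep "$(( RANDOM % 10 + 1 ))"   # random pause between requests
    done
}
```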
- Configurable URL list and blocklist
- Multi-threaded request engine (the number of threads equals the number of URLs in partyloud.conf)
- Error recovery mechanism to protect engines from failures
- Spoofed user agents to prevent fingerprinting (each engine has a different user agent)
- Dynamic UI
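The per-engine user-agent spoofing could be illustrated like this. The agent strings and the modulo assignment scheme are examples for the sketch, not the list or logic the script actually uses.

```shell
#!/usr/bin/env bash
# Illustrative user-agent rotation: each engine keeps one fixed agent.

USER_AGENTS=(
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:115.0) Gecko/20100101 Firefox/115.0"
    "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0 Safari/537.36"
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 Version/17.0 Safari/605.1.15"
)

# Engine i always uses the same agent, so each simulated "browser"
# presents a consistent identity across its requests.
agent_for_engine() {
    local i="$1"
    echo "${USER_AGENTS[$(( i % ${#USER_AGENTS[@]} ))]}"
}

# Usage inside an engine (curl's -A flag sets the User-Agent header):
#   curl -s -A "$(agent_for_engine "$i")" "$url"
```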
Clone the repository:
git clone https://github.com/realtho/PartyLoud.git
Navigate to the directory and make the script executable:
cd PartyLoud
chmod +x partyloud.sh
Usage: ./partyloud.sh [options...]
 -d --dns <file>                     DNS servers are sourced from the specified FILE; each request will use a different DNS server from the list
                                     !! WARNING: THIS FEATURE IS EXPERIMENTAL !!
                                     !! PLEASE REPORT ISSUES ON GITHUB !!
 -l --url-list <file>                read URL list from the specified FILE
 -b --blocklist <file>               read blocklist from the specified FILE
 -p --http-proxy <http://ip:port>    set a HTTP proxy
 -s --https-proxy <https://ip:port>  set an HTTPS proxy
 -n --no-wait                        disable wait between one request and another
 -h --help                           display this help
To stop the script press either enter or CTRL-C
Isn't this literally just a cli based frontend to curl?
The core of the script is a curl request, but this tool does more than that. When you run the script, several threads are started. Each thread makes a different HTTP request and parses the output to choose the next URL, simulating web navigation. The script stays alive until the user stops it (either by pressing enter or via CTRL-C).
How does the error recovery mechanism work?
"Error recovery mechanism" is an elegant way of saying that, if an HTTP request returns a status code starting with 4 or 5 (an error), the script will use a backup URL in order to continue normal execution.
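The recovery idea can be sketched like this, assuming a hypothetical helper that checks the status code and substitutes a backup URL on 4xx/5xx (function names are illustrative, not taken from the script):

```shell
#!/usr/bin/env bash
# Sketch of error recovery: fall back to a backup URL on 4xx/5xx
# instead of killing the engine.

# True when the HTTP status code signals a client or server error.
is_error_code() {
    case "$1" in
        4*|5*) return 0 ;;
        *)     return 1 ;;
    esac
}

# Probe a URL and decide where the engine should continue from:
# the URL itself on success, the backup URL on error.
fetch_or_fallback() {
    local url="$1" backup="$2" code
    code="$(curl -s -o /dev/null -w '%{http_code}' "$url")"
    if is_error_code "$code"; then
        echo "$backup"       # resume navigation from the backup URL
    else
        echo "$url"
    fi
}
```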
May I fork your project?
How easy is this fake traffic to detect?
Unfortunately it's pretty easy, but keep in mind that this is a beta and I'll fix this "issue" in upcoming releases.
What does badwords do?
badwords is just a list of keywords used to filter URLs in order to prevent 404s and traversing non-HTML content (like images, CSS, JS). You can create your own, but, unless you have special needs, I recommend you use the default one, or at least use it as a template.
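Conceptually, the filtering is a fixed-string match of each candidate URL against the keyword file. A minimal sketch (the keywords shown in the comment are examples in the spirit of the default badwords file, not its actual contents):

```shell
#!/usr/bin/env bash
# Drop any URL on stdin that contains a keyword from a badwords-style
# file (one fixed-string pattern per line).
filter_urls() {
    local blocklist="$1"
    grep -vFf "$blocklist"
}

# Example keyword file contents:
#   .css
#   .js
#   .png
```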
What does partyloud.conf do?
partyloud.conf is just a list of root URLs used to start the fake navigation. You can create your own conf file, but keep in mind that the more URLs you add, the more threads you start. This is an open issue: upcoming releases will come with a maximum thread number in order to avoid fork bombs.
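The planned cap could look something like the sketch below: one background engine per URL, but no more than a fixed maximum. MAX_THREADS, fake_engine, and start_engines are all hypothetical names for illustration; the current script has no such cap.

```shell
#!/usr/bin/env bash
# Sketch: spawn one background "engine" per conf URL, capped at
# MAX_THREADS to avoid an accidental fork bomb.
MAX_THREADS=8

fake_engine() { sleep 0.1; }       # stand-in for the real browsing loop

# Read URLs from a conf-style file, start an engine per URL up to the
# cap, wait for them, and report how many were started.
start_engines() {
    local conf="$1" count=0 url
    while IFS= read -r url; do
        [ -n "$url" ] || continue
        [ "$count" -lt "$MAX_THREADS" ] || break   # respect the cap
        fake_engine "$url" &
        count=$(( count + 1 ))
    done < "$conf"
    wait                           # let the demo engines finish
    echo "$count"
}
```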