Mass Site Visitor - a Traffic bot
Version 1.0 out NOW (Nov 2019). See the details below.
Mass Site Visitor (MSV) is designed to generate large amounts of traffic to websites of your choice through a list of proxies in a multi-threaded way. It runs automated browser sessions in the background on your PC and directs their traffic through proxies. MSV lets you define visitor behavior, from simple page visits to complex conditional clicks on any element of your web page.
1. JS referrer spoofing (for Google Analytics traffic sources)
2. Non-headless browser mode, to influence video/audio view counts on web-sites that track playback from visitors and count anonymous, non-logged-in viewers. Pro-tip: don't use JS referrer spoofing with this feature; media-sharing sites don't care much about traffic sources, so spoofing is only useful for influencing Google Analytics.
3. More proxy filters for the scraper.
4. Test all proxies (uses the connection timeout and number of attempts from the Options window).
5. Private proxy support; consult the manual on how to set up your proxy list to supply authentication info for a private proxy provider.
6. Editable browser dimensions. Check the Options menu and use the same format as the default resolutions provided with the program.
7. Firefox support via geckodriver, for cases when Chrome is not enough. Firefox Nightly is the preferred version, because it was the only one that worked properly without crashes on different machines.
8. Alexa toolbar/extension. Warning: works stably only with Firefox Nightly.
9. Multiple web-site destinations for a session. Enter one web-site per line in the text field; one of them will be randomly selected to visit each time.
10. Search for the element to click within an iframe. If a link is inside an embedded third-party web-page, consult the manual on how to use this feature. If the link isn't in an iframe, just leave the last 2 text fields of the script as they are (one containing NO, the other empty).
Version 0.6 released (3 Dec 2017):
Version 0.7 released (24 Jan 2018):
- selenium library updated to 3.6.0 - stability improved
- Third-party dependencies listed in the About window
- Open working folder button added to the Options window - for users' convenience
- Bugfixes/code refactoring/stability improvements
Version 1.0 released (10 Nov 2019):
- 3 additional proxy scraping targets
- Bugfixes, stability improvements
Version 1.0.1 released (22 Feb 2020):
- Locale spoofing (setting browser language)
- New Selenium version used
- Latest Chrome/Firefox support
- Bugfixes, stability improvements
You can set referrers and user agents, which, along with proxies, will be randomly selected to create a browser session. Referrers and user agents should be stored in text files and can be appended to an already loaded list in the program by pressing the 'Load the list' button. Remember to add the word socks5 after the IP of a public socks4/5 proxy, separated by a whitespace.
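As an illustration of the format just described (the addresses below are placeholder examples from documentation-reserved IP ranges, not real proxies), a proxy list file mixing plain HTTP entries with socks entries might look like this:

```
198.51.100.7:8080
203.0.113.5:1080 socks5
192.0.2.44:4145 socks5
```

Entries without a type annotation are treated as ordinary HTTP proxies; only socks proxies need the socks5 word after a space.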
For every visit in a browser session you can choose to save an image snapshot of the result (to see how the web-site looks from different IPs), and with customized on-page behavior you can also set the program to click some links (check out Settings - Options).
There is an option to filter out non-responsive proxies, keeping the ones that work, and save the healthy ones to a file (Settings - Export Healthy Proxies).
If you don't have a proxy list of your own, and the one supplied with MSV isn't satisfactory (over time those proxies may stop working), you can scrape public proxies from the web (Settings - Import Proxies from the web). When scraping proxies from the web you don't have to worry about adding the proxy type for a socks proxy; the program saves all the info in the right format.
The log of events that happen in the context of the program is available for the user to study. It contains messages about the proxy scraping process, as well as about the process of sending traffic through proxies to the target website.
See the video preview
Don't forget to reduce the thread count if you are using a low-end machine. Running several dozen threads with browser sessions that are constantly created and destroyed, each trying to load a web-page, is a performance-heavy task.
If you have trouble using the bot with Chrome/Firefox, it may be due to an outdated version of the webdriver, which is provided with the bot in the bin/ folder (chromedriver for Chrome, geckodriver for Firefox). You should download the latest versions from the official web-sites, check these links