User-Agents Database

User Agents

User Agent (date added)
Atlas (added 11/02/2004 0:25:57)
This robot traverses a web site and maps every web page found, indexing them in a comma-separated (.csv) file. Atlas constructs a site map of your web site, logging all web pages and their locations. This enables you to analyze the site and monitor its size.
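The behavior Atlas describes, walking one site and logging every page into a CSV site map, can be sketched roughly as follows. This is a minimal illustration using only the standard library; the field names and the breadth-first strategy are assumptions, not Atlas's actual design:

```python
import csv
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

class LinkParser(HTMLParser):
    """Collect href values from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def site_map(start_url, limit=100):
    """Traverse one site breadth-first; return (url, links_found) rows."""
    host = urlparse(start_url).netloc
    seen, queue, rows = {start_url}, [start_url], []
    while queue and len(rows) < limit:
        url = queue.pop(0)
        try:
            html = urlopen(url).read().decode("utf-8", "replace")
        except OSError:
            continue  # unreachable page: skip, as a mapper would
        parser = LinkParser()
        parser.feed(html)
        rows.append((url, len(parser.links)))
        for href in parser.links:
            absolute = urljoin(url, href)
            # stay on the same host, so only this site is mapped
            if urlparse(absolute).netloc == host and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return rows

def write_csv(rows, path):
    """Write the site map rows to a .csv file."""
    with open(path, "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["url", "links_found"])
        writer.writerows(rows)
```

A real crawler would also honor robots.txt and throttle its requests; this sketch only shows the traverse-and-log loop.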
ChangeDetection (added 6/02/2004 1:05:26)
Every time your website changes, we'll notify your visitors.
Check&Get (added 6/02/2004 0:15:29)
Check&Get is a handy and powerful bookmark manager and web monitoring program that lets you organize your browser bookmarks, check your favorite Internet pages, and detect whether their content has changed or become unavailable.
CheckHost (added 6/01/2021 12:38:29)
contentkingapp.com (added 17/03/2018 10:13:02)
DomainCrawler (added 3/11/2009 1:24:04)
Fell for bad bot trap.
DomainWatcher Bot (added 20/12/2012 19:49:53)
eCatch (added 6/02/2004 1:09:37)
Wysigot captures and monitors all internet, network and local sites.
FreeWebMonitoring SiteChecker (added 30/05/2015 8:56:27)
GetterroboPlus Puu (added 27/07/2005 22:57:34)
The Puu robot gathers data from sites registered in the search engine "straight FLASH!!" in order to build announcement pages reporting the renewal status of those sites. The robot runs every day. Its purpose is one or more of:
- gathering: gathers data from the original standard tags for Puu, which contain information on the sites registered in the search engine
- maintenance: link validation

This robot patrols the sites registered in the search engine "straight FLASH!!".
Golem (added 12/02/2004 20:48:11)
Golem generates status reports on collections of URLs supplied by clients. Designed to assist with editorial updates of Web-related sites or products.
Haste (added 5/02/2004 20:51:56)
We are checking the connectivity of the Web. 'Haste' checks selected websites only once, compiling averages on link numbers. If a web site is found to be in error, 'Haste' logs this information.
Intelliseek (added 13/05/2006 12:17:04)
Internet Shinchakubin (added 25/07/2005 0:57:01)
Makes a list of new links and changed pages, based on the pages the user clicked most frequently in the past 31 days. The client may run this software once or a few times a day, manually or at a specified time.

Shipped to SHARP's PC users since February 2000.
InternetSeer (added 6/02/2004 1:10:38)
InternetSeer remotely monitors your web site to ensure that it is available 24/7. If InternetSeer is unable to reach your site, we will send you an immediate email alert that your site is unreachable.
internetVista monitor (Mozilla compatible) (added 9/01/2017 9:38:57)
Katipo (added 8/03/2004 0:51:26)
Watches all the pages you have previously visited and tells you when they have changed.
Lachesis (added 6/02/2004 1:15:42)
Web response time tool (from Intel).
LinkWalker (added 6/02/2004 1:07:37)
LinkWalker crawls the web for broken links, offering the service to site owners. It generates a database of links and sends reports of bad ones to webmasters.
Mirror Checking (added 27/09/2005 11:49:46)
Monster (added 25/07/2005 0:46:52)
The Monster has two parts: a Web searcher and a Web analyzer. The searcher builds a list of the WWW sites in a desired domain (for example, all WWW sites in the mit.edu, com, or org domains). In the User-Agent field, $TYPE is set to 'Mapper' for the Web searcher and 'StAlone' for the Web analyzer.
Muninn (added 25/07/2005 0:53:52)
Muninn looks at museums within my reach and tells me about current exhibitions.
My Little Web Survey (added 2/05/2006 23:35:30)
A little script that crawls the web. It only reads HTML pages and ignores any other format. It then follows the links and stores the "Server" header field.
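The survey described above amounts to fetching each page and tallying its "Server" response header. A minimal sketch of that tallying step, with an injectable fetch function so it can be tried offline (the bucketing choices are my own illustration, not the script's actual code):

```python
from collections import Counter
from urllib.request import urlopen

def server_survey(urls, fetch=urlopen):
    """Fetch each URL and tally its 'Server' response header.

    `fetch` defaults to urllib's urlopen; it is a parameter so the
    tallying logic can be exercised without network access.
    """
    counts = Counter()
    for url in urls:
        try:
            with fetch(url) as response:
                # some servers omit the header; bucket those separately
                counts[response.headers.get("Server", "(none)")] += 1
        except OSError:
            counts["(unreachable)"] += 1
    return counts
```

Run over a list of crawled URLs, this yields a rough census of web server software, which is presumably the point of storing the field.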
Net Probe (added 2/11/2007 0:26:15)
Netcraft (added 5/02/2004 20:52:36)
Providing network security services, including application testing, code reviews, and automated penetration testing.
NetTransmitterStudio (added 2/05/2006 1:16:10)
Requests the same URL over and over again, with a few days in between.
notifyninja.com (added 28/12/2016 11:05:07)
URL not working (404).
PiltdownMan (added 25/07/2005 1:25:14)
The PiltdownMan robot is used to get a list of links from the search engines in our database. These links are followed, and the page they refer to is downloaded to gather statistics. The robot runs roughly once a month and visits the first 10 pages listed in every search engine for a group of keywords.

To maintain a database of search engines, we needed an automated tool. That's why we began the creation of this robot.
ping.blo.gs (added 17/03/2011 6:27:24)
ping.wordblog.de (added 30/01/2012 11:49:36)
Pingdom (added 5/11/2006 22:37:19)
Web site monitoring.
PostFavorites (added 3/05/2006 1:26:28)
Yahoo Search My Web:
- Save what you like to build your own personal web
- "Re-find" pages instantly when you need them again
- Share your personal web
- Better than bookmarks
PRTG Network Monitor (added 25/09/2018 16:14:09)
Random User Agent (added 14/11/2005 23:55:13)
In each request the user agent changes.

On 2005.11.10 the first request was for WebKnight (http://www.aqtronix.com/webknight/); it was redirected to PageID=99, and that second request was the last.
Right Web Monitor (added 6/02/2004 1:08:40)
Right Web Monitor is a handy tool that lets you automatically check your favorite Internet resources and detect changes in their content.
rwws.com (added 6/11/2005 0:10:04)
mail.rwws.com
Requests the WebKnight FAQ each time with a different user agent (within a single day), and not at a fixed interval, so it is not really monitoring.
Connecting to http://mail.rwws.com/ sometimes shows a "coming soon" page and at other times a Windows 2000 "under construction" site.
A very bad sign; this doesn't feel right!
SandCrawler (added 1/05/2006 0:06:59)
Microsoft's SandCrawler: used to monitor which server software you are running, so they know their market share.
ScooperBot (added 13/07/2014 10:51:49)
CustomScoop provides an all-in-one monitoring solution that covers traditional and social media. Every account includes on-demand analytics, distribution tools, and personalized service.
Site Valet (added 31/07/2005 23:45:09)
A deluxe site monitoring and analysis service.
SiteCheck (added 25/09/2015 11:59:27)
sitedownchecker.com crawler (added 25/09/2015 13:24:48)
SiteMonitor (added 9/10/2015 20:54:13)
SMPU (added 3/05/2006 21:41:15)
Referer: http://www.norhaus.com/smpu.html
SMPU is an HTTP/1.0 URI parser and spider. The purpose of SMPU is resource collection and web site analysis.

- SMPU does not request any page more than once on any crawl.
- We will send you any information we have collected by request.

What does it do?

More often than not SMPU is used as a download utility, as it can recursively download some (or all) resources on a website. If you are seeing many requests that are all different then your server's contents are being either wholly or partially mirrored by the user.

If you are seeing occasional requests the chances are SMPU is being used as a spider to traverse the internet looking for something, and found a reference to your site.

What can I make it do?

Plenty of things. As a download utility it's pretty good, but it is more powerful as an analysis tool. You should familiarise yourself with the arguments to get an idea of what it can do. It is free to download, and if you are a regular command-prompt user it's a pretty useful tool to have around.
Snappy (added 5/11/2006 22:54:45)
SnykeBot (added 6/02/2004 1:09:07)
Snyke is a web page monitoring "agent".
Surfbot (added 9/09/2006 0:32:04)
SurveyBot (added 7/02/2004 22:34:38)
Each week SurveyBot will query websites for statistics and other useful information. This information goes into the creation of the Whois Source domain search engine.

Uses referrer: http://www.whois.sc/
The Informant (added 10/02/2004 1:40:03)
The Informant robot continually checks the Web pages that are relevant to user queries. Users are notified of any new or updated pages. The robot runs daily, but the number of hits per site per day should be quite small, and these hits should be randomly distributed over several hours. Since the robot does not actually follow links (aside from those returned by the major search engines such as Lycos), it does not fall victim to the common looping problems. The robot will support the Robot Exclusion Standard by early December 1996.
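The Robot Exclusion Standard mentioned above is the robots.txt convention that many of the bots in this list claim to honor. Python's standard library can check whether a named crawler is allowed to fetch a URL; a generic sketch (the rule text and robot names are made up for illustration, not the Informant's own):

```python
from urllib.robotparser import RobotFileParser

def allowed(robot_name, url, robots_txt):
    """Return True if the given robots.txt text permits robot_name to fetch url."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch(robot_name, url)
```

A polite crawler fetches /robots.txt once per host, runs every candidate URL through a check like this, and skips anything disallowed for its user-agent token.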
UptimeBot (added 7/04/2004 17:30:24)
Uptimerobot (added 20/12/2012 17:13:24)
