User-Agents Database

User Agent | Date Added
ChristCrawler.com | 11/02/2004 16:14:07
A Christian internet spider that searches web sites to find Christian-related material
cIeNcIaFiCcIoN.nEt | 11/02/2004 16:17:52
Robot responsible for indexing pages for www.cienciaficcion.net
CindooSpider | 22/02/2009 23:52:09
CipinetBot | 8/02/2004 15:25:43
Climate Change Spider | 8/02/2004 15:26:47
cmsworldmap.com | 4/02/2011 12:39:31
Search engine for the CMS system from linkfactory.dk

fell for bad bot trap
Collective | 11/02/2004 16:21:03
Collective is a clever Internet search engine: all found URLs are guaranteed to contain your search terms.

Collective is a highly configurable program designed to interrogate online search engines and online databases. It ignores web pages that lie about their content, as well as dead URLs. It can be very strict: it searches each web page it finds for your search terms to ensure those terms are present, and any positive URLs are added to an HTML file for you to view at any time, even before the program has finished. Collective can wander the web for days if required.
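The Collective entry describes a verify-then-collect loop: fetch each candidate page, confirm that every search term actually appears in its text, and record only the positive URLs. A minimal sketch of that check in Python — the page texts here are stubbed in-memory stand-ins; a real run would fetch over HTTP and skip dead URLs:

```python
def page_matches(text, terms):
    """True only if every search term actually appears in the page text,
    so a page that lies about its content in metadata is rejected."""
    lowered = text.lower()
    return all(term.lower() in lowered for term in terms)

def collect_hits(pages, terms):
    """pages maps URL -> fetched text (stubbed here). Returns the URLs
    whose text really contains all the terms, in input order."""
    return [url for url, text in pages.items() if page_matches(text, terms)]

pages = {
    "http://a.example/": "a guide to solar power systems",
    "http://b.example/": "solar keyword stuffing, nothing else",
}
print(collect_hits(pages, ["solar", "power"]))  # ['http://a.example/']
```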
Conceptbot | 11/02/2004 16:26:22
The Conceptbot spider is used to research concept-based search indexing techniques. It uses a breadth-first search to spread out the number of hits on a single site over time. The spider runs at irregular intervals and is still under construction.
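Breadth-first crawl ordering, as Conceptbot's description sketches it, naturally interleaves pages from different sites instead of exhausting one host before moving on. A toy illustration in Python over an in-memory link graph — the graph and URLs are invented for the example:

```python
from collections import deque

def bfs_crawl_order(links, seed):
    """Return the order a breadth-first crawler would visit pages.

    links maps a URL to the URLs it points to (a toy in-memory link
    graph standing in for real HTTP fetches and link extraction)."""
    seen = {seed}
    order = []
    frontier = deque([seed])
    while frontier:
        url = frontier.popleft()
        order.append(url)
        for nxt in links.get(url, []):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return order

links = {
    "a.example/": ["a.example/1", "b.example/"],
    "b.example/": ["b.example/1", "a.example/1"],
}
print(bfs_crawl_order(links, "a.example/"))
# ['a.example/', 'a.example/1', 'b.example/', 'b.example/1']
```

Note how the two hosts alternate in the visit order: depth-first traversal of the same graph would finish all of `a.example` first.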
CoolBot | 11/02/2004 16:27:43
The CoolBot robot is used to build and maintain the directory of the German search engine Suchmaschine21.
Cowbot | 8/02/2004 15:27:55
Crawlson | 10/09/2020 14:28:39
CrazyWebCrawler | 24/09/2015 23:55:45
Cusco | 11/02/2004 16:42:30
The Cusco robot is part of the CUCE indexing system. It gathers information from several sources: HTTP, databases, or the filesystem. At the moment, its universe is the .pt domain, and the information it gathers is available at the Portuguese search engine Cusco (http://www.cusco.pt/).
DeepIndex | 8/02/2004 15:29:32
Desert Realm Spider | 8/02/2004 15:30:24
The spider indexes fantasy and science fiction sites by using a customizable keyword algorithm. Only home pages are indexed, but all pages are looked at for links. Pages are visited randomly to limit impact on any one webserver.
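A "customizable keyword algorithm" of the kind the Desert Realm entry describes can be as simple as a weighted keyword score compared against a threshold. A sketch under that assumption — the weights, threshold, and sample phrases below are invented for illustration, not taken from the spider itself:

```python
import re

def keyword_score(text, weights):
    """Sum the configured weight of every word occurrence in the page."""
    words = re.findall(r"[a-z']+", text.lower())
    return sum(weights.get(w, 0) for w in words)

# Hypothetical weights: positive for genre terms, negative for off-topic ones.
WEIGHTS = {"dragon": 3, "starship": 3, "fantasy": 2, "recipe": -5}

def is_on_topic(text, threshold=4):
    """Index the page only when its weighted score clears the threshold."""
    return keyword_score(text, WEIGHTS) >= threshold

print(is_on_topic("a fantasy starship saga"))  # True  (2 + 3 = 5)
print(is_on_topic("a cake recipe"))            # False (-5)
```

Making the weight table and threshold configuration data, rather than code, is what makes such a filter "customizable": retargeting the spider to another genre means editing a dictionary.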
DeuSu | 9/10/2015 20:50:57
DeuSu is my personal project. It is a search engine with its very own search index. To create this search index, over a billion web pages have to be crawled. The DeuSu robot is the software used to do this.
Die Blinde Kuh | 11/02/2004 0:10:22
The robot is used for indexing and checking the registered URLs in the German-language search engine for kids. It is a non-commercial one-woman project of Birgit Bachmann, living in Hamburg, Germany.
DIE-KRAEHE | 8/02/2004 15:28:58
DienstSpider | 11/02/2004 16:54:43
Indexing and searching the NCSTRL (Networked Computer Science Technical Report Library) and ERCIM collections
Digger | 11/02/2004 16:56:06
Indexing web sites for the Diggit! search engine
Digital Integrity Robot | 11/02/2004 17:03:37
Direct Hit Grabber | 11/02/2004 17:04:55
Direct Hit Grabber indexes documents and collects Web statistics for the Direct Hit Search Engine (available at www.directhit.com and our partners' sites)
DittoSpyder | 8/09/2006 0:13:47
Ditto image search engine
DNAbot | 11/02/2004 17:07:07
A search robot written in 100% Java, with its own built-in database engine and web server. Currently in Japanese.
DragonBot | 11/02/2004 17:10:23
Collects web pages related to East Asia
DuckDuckBot | 10/09/2015 23:29:37
DuckDuckPreview | 25/09/2015 13:35:06
DWCP (Dridus' Web Cataloging Project) | 11/02/2004 17:11:45
The DWCP robot is used to gather information for Dridus' Web Cataloging Project, which is intended to catalog domains and URLs (no content).
Eco-Portal Spider | 8/02/2004 15:33:26
ELFINBOT | 11/02/2004 17:17:04
ELFIN is used to index and add data to the "Lets Find It Now Search Engine" (http://letsfinditnow.com). The robot runs every 30 days.
EMPAS_ROBOT | 8/02/2004 15:30:46
Environmental Sustainability Spider | 8/02/2004 15:34:26
EroCrawler | 8/09/2006 0:17:58
Adult search engine
ES.NET | 5/07/2005 11:02:08
Innerprise develops full-text indexing search engine software technology enabling search for your Web site, intranet, or the Web. Advanced crawler features ensure that only documents you want indexed are indexed. Key features provide support for common file types, secure servers, multiple servers, and complete automation through built-in schedulers.
Esther | 11/02/2004 22:45:21
This crawler is used to build the search database at http://search.falconsoft.com/
EuroSeek Arachnoidea | 10/02/2004 1:19:00
Evliya Celebi | 11/02/2004 22:46:23
Crawls pages under the ".tr" domain, or pages with a Turkish character encoding (iso-8859-9 or windows-1254)
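The Evliya Celebi selection rule — accept ".tr" hosts, or any page whose response declares a Turkish charset — can be sketched as a simple predicate. The Content-Type parsing below is a deliberate simplification of real HTTP header handling:

```python
from urllib.parse import urlparse

# The two encodings named in the entry above.
TURKISH_CHARSETS = {"iso-8859-9", "windows-1254"}

def should_crawl(url, content_type=""):
    """Accept URLs under the .tr top-level domain, or responses whose
    Content-Type header declares a Turkish character encoding."""
    host = (urlparse(url).hostname or "").lower()
    if host == "tr" or host.endswith(".tr"):
        return True
    # Naive parse of e.g. "text/html; charset=iso-8859-9".
    for part in content_type.split(";"):
        key, _, value = part.strip().partition("=")
        if key.lower() == "charset":
            return value.strip().strip('"').lower() in TURKISH_CHARSETS
    return False

print(should_crawl("http://example.com.tr/page"))                    # True
print(should_crawl("http://example.com/", "text/html; charset=ISO-8859-9"))  # True
print(should_crawl("http://example.com/", "text/html; charset=utf-8"))       # False
```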
ExactSeek Crawler | 8/02/2004 19:40:22
ExactSeek.com is an internet search engine and directory that receives and indexes over 30,000 new site submissions daily. To date, more than 2 million web sites have been indexed and added to the ExactSeek database, and another 2 to 3 million web sites will be added in the near future. Our goal is not to index the entire Web, but to provide searchers with a "quality" database of between 4 and 5 million web sites. In addition to standard web search results, ExactSeek also offers targeted searches of specialized databases. Currently, visitors can use niche search engines to find newsletters, articles, mp3 files, images, and comparison shopping sites.

Uses UAs: "eseek-crawler" and "exactseek-crawler*"
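The trailing "*" in "exactseek-crawler*" reads as a shell-style wildcard, so log entries for this bot can be matched with Python's fnmatch. The patterns come from the entry above; treating "*" as a glob is an assumption about the database's notation:

```python
from fnmatch import fnmatchcase

# UA patterns listed for the ExactSeek Crawler entry.
EXACTSEEK_PATTERNS = ["eseek-crawler", "exactseek-crawler*"]

def is_exactseek(user_agent):
    """Case-sensitive shell-style match against the listed UA patterns."""
    return any(fnmatchcase(user_agent, p) for p in EXACTSEEK_PATTERNS)

print(is_exactseek("exactseek-crawler/0.1"))  # True
print(is_exactseek("eseek-crawler"))          # True
print(is_exactseek("Mozilla/5.0"))            # False
```

fnmatchcase is used rather than fnmatch so the match does not silently become case-insensitive on platforms with case-insensitive filesystems.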
Exalead | 8/02/2004 15:35:38
Excite ArchitextSpider | 10/02/2004 1:19:52
Its purpose is to generate a Resource Discovery database and to generate statistics. The ArchitextSpider collects information for the Excite and WebCrawler search engines.
EZResult | 10/02/2004 1:30:10
FAST | 8/02/2004 15:36:30
Crawler for alltheweb.com
FastBug | 8/02/2004 15:37:06
FastCrawler | 11/02/2004 22:49:58
FastCrawler is used to build the databases for search engines used by 1klik.dk and its partners
FeedFetcher-Google | 13/04/2008 22:39:11
Feedster Crawler | 8/02/2004 15:39:37
FemtosearchBot | 21/12/2018 13:26:33
fell for bad bot trap
Findexa Crawler | 2/08/2007 23:00:00
Norwegian search engine
Findxbot | 17/02/2015 22:04:25
Findx bot is a web scraping bot used by the search engine findx. Findx's goal is to create an independent European search engine with a strong focus on privacy and user choice. Findx bot scrapes sites to include in its index and help direct users to those sites.
FlickBot | 8/02/2004 15:40:11
