Showing 6 open source projects for "web crawler bot"

  • 1
    ostRAT

    OpenSourceTelegramRAT - Remote PC access via Telegram Bot.

    ostRAT is free and open source computer remote-control software, licensed under the GPLv3. It works via a Telegram bot and offers many functions, for example: Screenshot (sends a screenshot), Off (turns off the computer), Url (opens the entered link), Write (types your text on the computer), Move (moves the mouse to the given x and y coordinates), and more. A minimal sketch of this command pattern follows below. WARNING: Using the bot is recommended only on your own device. Failure to comply with this recommendation may result in criminal liability.
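    ostRAT's sources are not reproduced in this listing, but the command set it describes maps naturally onto a Telegram bot's command handlers. Here is a minimal sketch assuming the pyTelegramBotAPI (telebot) and pyautogui packages; the token placeholder and handler names are illustrative assumptions, not ostRAT's actual code, and it should only ever run on a machine you own.

```python
# Sketch of a Telegram remote-control bot in the style ostRAT describes,
# using pyTelegramBotAPI (telebot) and pyautogui. Handler names and the
# token placeholder are illustrative assumptions, not ostRAT's code.
# Only run this against a machine you own.
import io

import pyautogui
import telebot

bot = telebot.TeleBot("YOUR_BOT_TOKEN")  # token issued by @BotFather

@bot.message_handler(commands=["screenshot"])
def send_screenshot(message):
    # Capture the screen and send it back as a photo.
    buf = io.BytesIO()
    pyautogui.screenshot().save(buf, format="PNG")
    buf.seek(0)
    bot.send_photo(message.chat.id, buf)

@bot.message_handler(commands=["write"])
def type_text(message):
    # "/write hello" -> types "hello" on the controlled machine.
    text = message.text.partition(" ")[2]
    pyautogui.write(text)

@bot.message_handler(commands=["move"])
def move_mouse(message):
    # "/move 100 200" -> moves the mouse pointer to (100, 200).
    x, y = (int(v) for v in message.text.split()[1:3])
    pyautogui.moveTo(x, y)

bot.infinity_polling()
```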
  • 2
    pyspider

    A powerful Spider (Web Crawler) system in Python

    pyspider is a powerful spider (web crawler) system in Python. Its components are connected by a message queue, and every component, including the message queue itself, runs in its own process or thread and is replaceable. That means that when one stage is slow, you can run many instances of that processor to make full use of multiple CPUs, or deploy across multiple machines. This architecture makes pyspider very fast (see the project's benchmarks). A typical handler script is sketched below.
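    pyspider crawls are written as handler classes; the script below is adapted from the project's documented quickstart example (the seed URL and CSS selectors are the documentation's sample values):

```python
# A typical pyspider handler script, adapted from the project's
# documented quickstart; the seed URL and selectors are the docs'
# sample values.
from pyspider.libs.base_handler import *


class Handler(BaseHandler):
    crawl_config = {}

    @every(minutes=24 * 60)          # re-run the seed once a day
    def on_start(self):
        self.crawl('http://scrapy.org/', callback=self.index_page)

    @config(age=10 * 24 * 60 * 60)   # treat pages as fresh for 10 days
    def index_page(self, response):
        # Queue every outbound http(s) link for detail crawling.
        for each in response.doc('a[href^="http"]').items():
            self.crawl(each.attr.href, callback=self.detail_page)

    def detail_page(self, response):
        # Whatever this returns is stored as the crawl result.
        return {
            "url": response.url,
            "title": response.doc('title').text(),
        }
```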
  • 3
    diskover

    File system crawler and disk space usage software

    diskover is a file system crawler and disk space usage tool that uses Elasticsearch to index your file metadata. diskover crawls and indexes your files on a local computer or on a remote storage server over network mounts. It helps you manage your storage by identifying old and unused files, and gives better insight into data change ("hotfiles"), file duplication ("dupes"), and wasted space. It is designed to help deal with managing large amounts of data growth and provide detailed storage...
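    Because diskover stores file metadata in Elasticsearch, the index can be queried directly with the official client. A hedged sketch follows; the index pattern and field names ("last_modified", "filesize", "filename") are assumptions about diskover's schema, so check them against your own index mapping.

```python
# Sketch: query a diskover Elasticsearch index for large files that
# have not been modified in over a year. The index pattern and field
# names are assumptions about diskover's schema; verify them against
# your own index mapping.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

resp = es.search(
    index="diskover-*",                      # assumed index pattern
    query={"range": {"last_modified": {"lte": "now-1y"}}},
    sort=[{"filesize": {"order": "desc"}}],  # biggest offenders first
    size=20,
)

for hit in resp["hits"]["hits"]:
    doc = hit["_source"]
    print(doc.get("filename"), doc.get("filesize"), doc.get("last_modified"))
```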
  • 4
    Python Crawler Library

    Python Web Crawler Library

    A simple library for crawling the web. It lets you create macros for crawling web sites and performing simple actions, such as logging in and other basic interactions with a site. A sketch of the idea follows below.
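    The library's own API is not documented in this listing, so here is the same "log in, then crawl" macro idea sketched with the widely used requests and BeautifulSoup packages; the URL and form field names are hypothetical.

```python
# Sketch of a "log in, then crawl" macro like the one this library
# describes, written with requests + BeautifulSoup rather than the
# library's own (undocumented here) API. The URL and form field
# names are hypothetical.
import requests
from bs4 import BeautifulSoup

BASE = "https://example.com"  # hypothetical site

with requests.Session() as session:
    # The macro's "log in" step: POST credentials, keep the cookies.
    session.post(f"{BASE}/login",
                 data={"username": "alice", "password": "secret"})

    # The crawl step: fetch a page and list its links.
    page = session.get(f"{BASE}/dashboard")
    soup = BeautifulSoup(page.text, "html.parser")
    for link in soup.find_all("a", href=True):
        print(link["href"])
```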
  • 5
    Universal information crawler (Uicrawler) is a fast, precise, and reliable Internet crawler. Uicrawler is a program/automated script that browses the World Wide Web in a methodical, automated manner and builds an index of the documents it accesses. A minimal sketch of this pattern follows below.
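    The "methodical, automated" browsing described here is the classic breadth-first crawl that records an index as it goes. A minimal sketch (not Uicrawler's actual code) using requests and BeautifulSoup:

```python
# Minimal breadth-first crawler of the kind the description outlines:
# keep a frontier queue, fetch each page once, and record an index of
# the documents visited. Not Uicrawler's actual code.
from collections import deque
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

def crawl(seed, max_pages=50):
    frontier = deque([seed])
    index = {}                       # url -> page title
    while frontier and len(index) < max_pages:
        url = frontier.popleft()
        if url in index:
            continue                 # already visited
        try:
            resp = requests.get(url, timeout=10)
        except requests.RequestException:
            continue                 # skip unreachable documents
        soup = BeautifulSoup(resp.text, "html.parser")
        index[url] = soup.title.string if soup.title else ""
        # Enqueue outbound links, resolving relative URLs.
        for a in soup.find_all("a", href=True):
            frontier.append(urljoin(url, a["href"]))
    return index

if __name__ == "__main__":
    for url, title in crawl("https://example.com").items():
        print(url, "->", title)
```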
  • 6
    A configurable knowledge management framework. It works out of the box, but it is meant mainly as a framework for building complex information retrieval and analysis systems. Its three major components, the Crawler, the Analyzer, and the Indexer, can also be used separately; the pipeline pattern is sketched below.
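    The Crawler -> Analyzer -> Indexer split is a standard pipeline architecture. The sketch below shows the composition pattern the description implies, with hypothetical class names rather than the framework's own API.

```python
# Pipeline sketch of the Crawler -> Analyzer -> Indexer architecture
# the description implies. Class and method names are hypothetical,
# not the framework's actual API; the point is that each stage can
# also be used on its own.
class Crawler:
    def fetch(self, urls):
        # Stand-in for real fetching; yields (url, raw_text) pairs.
        for url in urls:
            yield url, f"raw document from {url}"

class Analyzer:
    def analyze(self, url, text):
        # Stand-in analysis: tokenize into lowercase terms.
        return url, text.lower().split()

class Indexer:
    def __init__(self):
        self.inverted = {}           # term -> set of urls

    def add(self, url, terms):
        for term in terms:
            self.inverted.setdefault(term, set()).add(url)

# Compose the three stages, or use any one of them separately.
crawler, analyzer, indexer = Crawler(), Analyzer(), Indexer()
for url, text in crawler.fetch(["http://example.com/a", "http://example.com/b"]):
    indexer.add(*analyzer.analyze(url, text))
print(indexer.inverted.get("document"))
```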