robotstxt (0.6.2)

A 'robots.txt' Parser and 'Webbot'/'Spider'/'Crawler' Permissions Checker.

Provides functions to download and parse 'robots.txt' files. Ultimately, the package makes it easy to check whether bots (spiders, crawlers, scrapers, ...) are allowed to access specific resources on a domain.
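A minimal sketch of the typical workflow, based on the package's documented entry points; the domain is only an example, and exact signatures may differ between versions:

    library(robotstxt)

    # Download and parse a domain's robots.txt into an object
    rtxt <- robotstxt(domain = "wikipedia.org")

    # Check whether the default bot ("*") may access specific paths
    rtxt$check(paths = c("/", "/api/"), bot = "*")

    # Convenience wrapper that downloads, parses, and checks in one call
    paths_allowed(paths = "/images/", domain = "wikipedia.org")

The checks return logical values, so a scraper can gate its requests directly on the result.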

Maintainer: Peter Meissner
Author(s): Peter Meissner [aut, cre], Oliver Keys [ctb], Rich Fitz John [ctb]

License: MIT + file LICENSE

Uses: future, future.apply, httr, magrittr, spiderbar, stringr, testthat, knitr, dplyr, rmarkdown, covr
Reverse suggests: newsanchor, rzeit2, spiderbar, webchem

Released almost 2 years ago.

6 previous versions

Related packages: Rserve, XML, httpRequest, rjson, RCurl, OAIHarvester, RgoogleMaps, sendmailR, twitteR, scrapeR, RJSONIO, imguR, googleVis, factualR, ROAuth, Rook, osmar, whisker, FastRWeb, ggmap (20 best matches, based on common tags).
