Title:
|
ANALYSIS OF THE USAGE STATISTICS OF ROBOTS EXCLUSION STANDARD |
Author(s):
|
Ajay, Smitha, Jaliya Ekanayake |
ISBN:
|
972-8924-19-4 |
Editors:
|
Pedro Isaías, Miguel Baptista Nunes and Inmaculada J. Martínez |
Year:
|
2006 |
Edition:
|
V II, 2 |
Keywords:
|
Robots Exclusion standard, robots, hidden web |
Type:
|
Short Paper |
First Page:
|
345 |
Last Page:
|
348 |
Language:
|
English |
Paper Abstract:
|
The Robots Exclusion Standard [4] is a de-facto standard used to inform crawlers, spiders, or web robots about the disallowed sections of a web server. Since its inception in 1994, the Robots Exclusion Standard has been used extensively. In this paper, we present the results of a statistical analysis of the usage of the Robots Exclusion Standard. Based on the results obtained, we propose that organizations such as the W3C adopt the Robots Exclusion Standard and make it an official standard. |
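For readers unfamiliar with the mechanism the abstract refers to, the sketch below shows how a crawler might consult a robots.txt file before fetching a URL, using Python's standard urllib.robotparser module. The host name, user-agent names, and paths are illustrative assumptions, not taken from the paper.

# Minimal sketch of a crawler honouring the Robots Exclusion Standard.
# The robots.txt content, host, user agents, and paths below are hypothetical.

from urllib.robotparser import RobotFileParser

# A small robots.txt as a site operator might publish it at
# https://example.com/robots.txt.
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
Disallow: /tmp/

User-agent: FastCrawler
Disallow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# A generic robot may fetch public pages but not the disallowed sections.
print(parser.can_fetch("GenericBot", "https://example.com/index.html"))      # True
print(parser.can_fetch("GenericBot", "https://example.com/private/a.html"))  # False

# A robot named in a more specific rule group is excluded entirely.
print(parser.can_fetch("FastCrawler", "https://example.com/index.html"))     # False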