# /opt/alt/python38/lib64/python3.8/urllib/robotparser.py

""" robotparser.py

    Copyright (C) 2000  Bastian Kleineidam

    You can choose between two licenses when using this package:
    1) GNU GPLv2
    2) PSF license for Python 2.2

    The robots.txt Exclusion Protocol is implemented as specified in
    http://www.robotstxt.org/norobots-rfc.txt
"""

import collections
import urllib.parse
import urllib.request

__all__ = ["RobotFileParser"]

RequestRate = collections.namedtuple("RequestRate", "requests seconds")


class RobotFileParser:
    """ This class provides a set of methods to read, parse and answer
    questions about a single robots.txt file.
    """

    def __init__(self, url=''):
        self.entries = []
        self.sitemaps = []
        self.default_entry = None
        self.disallow_all = False
        self.allow_all = False
        self.set_url(url)
        self.last_checked = 0

    def mtime(self):
        """Returns the time the robots.txt file was last fetched.

        This is useful for long-running web spiders that need to
        check for new robots.txt files periodically.
        """
        return self.last_checked

    def modified(self):
        """Sets the time the robots.txt file was last fetched to the
        current time.
        """
        import time
        self.last_checked = time.time()

    # (remainder of the module not shown)
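

# ---------------------------------------------------------------------------
# Usage sketch (illustration only, not part of the module above): how callers
# typically use this class via urllib.robotparser to answer "may this agent
# fetch this URL?" questions. The robots.txt URL and the user-agent string
# below are assumed placeholders.
if __name__ == "__main__":
    from urllib import robotparser

    rp = robotparser.RobotFileParser()
    rp.set_url("https://www.example.com/robots.txt")  # assumed example URL
    rp.read()  # fetch the robots.txt file and feed it to the parser

    # can_fetch() checks the parsed rules for the given user agent and URL.
    print(rp.can_fetch("ExampleBot/1.0", "https://www.example.com/private/"))

    # mtime() reports when robots.txt was last fetched/parsed, so a
    # long-running spider can decide when to call read() again.
    print(rp.mtime())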