File: //lib64/python3.9/urllib/__pycache__/robotparser.cpython-39.pyc
""" robotparser.py

    Copyright (C) 2000  Bastian Kleineidam

    You can choose between two licenses when using this package:
    1) GNU GPLv2
    2) PSF license for Python 2.2

    The robots.txt Exclusion Protocol is implemented as specified in
    http://www.robotstxt.org/norobots-rfc.txt
"""

import collections
import urllib.parse
import urllib.request

__all__ = ["RobotFileParser"]

RequestRate = collections.namedtuple("RequestRate", "requests seconds")


class RobotFileParser:
    """ This class provides a set of methods to read, parse and answer
    questions about a single robots.txt file.

    """

    def __init__(self, url=''):
        self.entries = []
        self.sitemaps = []
        self.default_entry = None
        self.disallow_all = False
        self.allow_all = False
        self.set_url(url)
        self.last_checked = 0

    def mtime(self):
        """Returns the time the robots.txt file was last fetched.

        This is useful for long-running web spiders that need to
        check for new robots.txt files periodically.

        """
        return self.last_checked

    def modified(self):
        """Sets the time the robots.txt file was last fetched to the
        current time.

        """
        import time
        self.last_checked = time.time()