Handling Allowed and Disallowed URLs from Robots.txt #2137
PedroFonsecaDEV started this conversation in General
Replies: 2 comments 1 reply
-
PedroFonsecaDEV: As an example, let's look at the GitHub robots.txt file. I could not find any documentation on how to handle allowed and disallowed URLs. Especially cases like:
Could you please provide a code example?
Thanks
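A minimal sketch of what checking URLs against GitHub's robots.txt could look like, assuming the `robots-parser` npm package and Node 18+ for the global `fetch` (the thread itself names no specific parser, so treat the choice as an assumption):

```ts
import robotsParser from 'robots-parser';

// Fetch and parse GitHub's robots.txt (Node 18+ provides a global fetch).
const robotsUrl = 'https://github.com/robots.txt';
const robotsTxt = await (await fetch(robotsUrl)).text();
const robots = robotsParser(robotsUrl, robotsTxt);

// Query individual URLs; the optional second argument is a user-agent string.
console.log(robots.isAllowed('https://github.com/apify/crawlee'));
console.log(robots.isDisallowed('https://github.com/search?q=robots'));
// Both return undefined for URLs on a different host than the robots.txt itself.
```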
-
B4nan: See #229 (comment)
-
PedroFonsecaDEV: Hi @B4nan, yeah, I had seen the comment before, but I couldn't find an example of how to use the robots parser and Crawlee together. Could you please provide us with a basic example? For instance, how to use the robots parser in combination with …
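One way to wire a parser like the one above into Crawlee is to filter discovered links through the `transformRequestFunction` option of `enqueueLinks`, so disallowed URLs never enter the queue. This is a sketch of one possible integration, not an official Crawlee feature; the crawler class and start URL are just examples:

```ts
import { CheerioCrawler } from 'crawlee';
import robotsParser from 'robots-parser';

// Fetch and parse the robots.txt once, before the crawl starts.
const robotsUrl = 'https://github.com/robots.txt';
const robots = robotsParser(robotsUrl, await (await fetch(robotsUrl)).text());

const crawler = new CheerioCrawler({
    maxRequestsPerCrawl: 50,
    async requestHandler({ request, enqueueLinks, log }) {
        log.info(`Processing ${request.url}`);
        await enqueueLinks({
            // Returning false from transformRequestFunction skips the link,
            // so URLs disallowed by robots.txt are dropped before enqueueing.
            transformRequestFunction: (req) =>
                robots.isAllowed(req.url) ? req : false,
        });
    },
});

await crawler.run(['https://github.com/']);
```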