depop SSL: CERTIFICATE_VERIFY_FAILED #50

Open
pastor91 opened this issue May 10, 2024 · 1 comment

Comments

@pastor91

Hello!

When running the command for depop, "python scraper.py -n", I get the error below. Has anyone experienced this error?

PS C:\Users\berze\Downloads\Vinted-Scraper-main> python scraper.py -d
Creation of the directory failed or the folder already exists
Traceback (most recent call last):
  File "C:\Users\berze\AppData\Local\Programs\Python\Python311\Lib\site-packages\urllib3\connectionpool.py", line 715, in urlopen
    httplib_response = self._make_request(
                       ^^^^^^^^^^^^^^^^^^^
  File "C:\Users\berze\AppData\Local\Programs\Python\Python311\Lib\site-packages\urllib3\connectionpool.py", line 404, in _make_request
    self._validate_conn(conn)
  File "C:\Users\berze\AppData\Local\Programs\Python\Python311\Lib\site-packages\urllib3\connectionpool.py", line 1058, in _validate_conn
    conn.connect()
  File "C:\Users\berze\AppData\Local\Programs\Python\Python311\Lib\site-packages\urllib3\connection.py", line 419, in connect
    self.sock = ssl_wrap_socket(
                ^^^^^^^^^^^^^^^^
  File "C:\Users\berze\AppData\Local\Programs\Python\Python311\Lib\site-packages\urllib3\util\ssl_.py", line 449, in ssl_wrap_socket
    ssl_sock = _ssl_wrap_socket_impl(
               ^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\berze\AppData\Local\Programs\Python\Python311\Lib\site-packages\urllib3\util\ssl_.py", line 493, in _ssl_wrap_socket_impl
    return ssl_context.wrap_socket(sock, server_hostname=server_hostname)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\berze\AppData\Local\Programs\Python\Python311\Lib\site-packages\cloudscraper\__init__.py", line 98, in wrap_socket
    return self.ssl_context.orig_wrap_socket(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\berze\AppData\Local\Programs\Python\Python311\Lib\ssl.py", line 517, in wrap_socket
    return self.sslsocket_class._create(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\berze\AppData\Local\Programs\Python\Python311\Lib\ssl.py", line 1108, in _create
    self.do_handshake()
  File "C:\Users\berze\AppData\Local\Programs\Python\Python311\Lib\ssl.py", line 1379, in do_handshake
    self._sslobj.do_handshake()
ssl.SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: certificate has expired (_ssl.c:1006)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\berze\AppData\Local\Programs\Python\Python311\Lib\site-packages\requests\adapters.py", line 440, in send
    resp = conn.urlopen(
           ^^^^^^^^^^^^^
  File "C:\Users\berze\AppData\Local\Programs\Python\Python311\Lib\site-packages\urllib3\connectionpool.py", line 799, in urlopen
    retries = retries.increment(
              ^^^^^^^^^^^^^^^^^^
  File "C:\Users\berze\AppData\Local\Programs\Python\Python311\Lib\site-packages\urllib3\util\retry.py", line 592, in increment
    raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='depop.com', port=443): Max retries exceeded with url: / (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: certificate has expired (_ssl.c:1006)')))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\berze\Downloads\Vinted-Scraper-main\scraper.py", line 602, in <module>
    download_depop_data(userids)
  File "C:\Users\berze\Downloads\Vinted-Scraper-main\scraper.py", line 353, in download_depop_data
    s.get("https://depop.com")
  File "C:\Users\berze\AppData\Local\Programs\Python\Python311\Lib\site-packages\requests\sessions.py", line 542, in get
    return self.request('GET', url, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\berze\AppData\Local\Programs\Python\Python311\Lib\site-packages\cloudscraper\__init__.py", line 257, in request
    self.perform_request(method, url, *args, **kwargs)
  File "C:\Users\berze\AppData\Local\Programs\Python\Python311\Lib\site-packages\cloudscraper\__init__.py", line 190, in perform_request
    return super(CloudScraper, self).request(method, url, *args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\berze\AppData\Local\Programs\Python\Python311\Lib\site-packages\requests\sessions.py", line 529, in request
    resp = self.send(prep, **send_kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\berze\AppData\Local\Programs\Python\Python311\Lib\site-packages\requests\sessions.py", line 645, in send
    r = adapter.send(request, **kwargs)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\berze\AppData\Local\Programs\Python\Python311\Lib\site-packages\requests\adapters.py", line 517, in send
    raise SSLError(e, request=request)
requests.exceptions.SSLError: HTTPSConnectionPool(host='depop.com', port=443): Max retries exceeded with url: / (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: certificate has expired (_ssl.c:1006)')))
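
For what it's worth, a minimal check that takes the scraper and cloudscraper out of the picture can help isolate whether the failure comes from the local CA bundle rather than from depop.com itself. This is only a diagnostic sketch, assuming requests and certifi are installed (cloudscraper depends on both) and no intercepting proxy is in the way:

import certifi
import requests

# Try the same request with plain requests and an explicit, up-to-date
# certifi bundle. If this succeeds, the scraper's failure likely comes
# from an outdated local certificate store or an intercepting proxy,
# not from depop.com's certificate.
try:
    resp = requests.get("https://depop.com", timeout=15, verify=certifi.where())
    print("TLS handshake OK, HTTP status:", resp.status_code)
except requests.exceptions.SSLError as exc:
    print("Certificate verification still fails:", exc)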
@Relax594

Relax594 commented May 13, 2024

Depop has increased its protection further. They have also started temporarily banning IPs/proxies; you will either get a 403 response or, as in your case, a failure on port 443 (timeout). I haven't found a solution for myself yet, and @Gertje823 has yet to respond to this or #47.
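
If it helps to tell a ban apart from a genuine certificate problem, a stdlib-only sketch along these lines fetches the certificate the server presents without verifying it, so its validity dates can be inspected (assuming Python 3.11 and a direct connection, no proxy):

import socket
import ssl

hostname = "depop.com"

# Verification is disabled on purpose: the goal is only to download the
# certificate the server presents so its validity dates can be read.
context = ssl.create_default_context()
context.check_hostname = False
context.verify_mode = ssl.CERT_NONE

with socket.create_connection((hostname, 443), timeout=15) as sock:
    with context.wrap_socket(sock, server_hostname=hostname) as tls:
        der_cert = tls.getpeercert(binary_form=True)

# Convert to PEM and print; paste the output into any X.509 decoder
# (or run openssl x509 -noout -dates) to check whether it has really expired.
print(ssl.DER_cert_to_PEM_cert(der_cert))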
