OIDC Refresh Token - [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1129) #651

Open
sergeiwaigant opened this issue Sep 27, 2023 · 0 comments
Labels
needs_verify type/bug Something isn't working

SUMMARY

We are using VMware Tanzu (TKGI) version 1.17, which provides a Kubernetes cluster running version 1.26.5.
With the tkgi CLI we can fetch a kubeconfig that looks like this:

apiVersion: v1
clusters:
- cluster:
    certificate-authority-data: <vmware-self-signed-CA>
    server: https://cluster-3936.domain.company:8443
  name: 3936
contexts:
- context:
    cluster: 3936
    user: srvapp
  name: 3936
current-context: 3936
kind: Config
preferences: {}
users:
- name: srvapp
  user:
    auth-provider:
      config:
        client-id: pks_cluster_client
        client-secret: ""
        id-token: <token>
        idp-certificate-authority-data: <company-root-CA>
        idp-issuer-url: https://tkgi.domain.company:8443/oauth/token
        refresh-token: <refresh-token>
      name: oidc

The k8s API uses a certificate signed by the VMware self-signed CA.
The idp-issuer-url is served with a certificate signed by the company root CA.

Now the issue:
As long as the id-token is within its 5-minute validity window, the k8s modules work fine and can perform all tasks against the k8s API. Once the token has expired, the client tries to contact the idp-issuer-url and fails with the following error:

urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='tkgi.domain.company', port=8443): Max retries exceeded with url: /oauth/token/.well-known/openid-configuration (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1129)')))

Traceback (most recent call last):
  File "/home/user/.virtualenv/ansible_venv/lib64/python3.9/site-packages/urllib3/connectionpool.py", line 714, in urlopen
    httplib_response = self._make_request(
  File "/home/user/.virtualenv/ansible_venv/lib64/python3.9/site-packages/urllib3/connectionpool.py", line 403, in _make_request
    self._validate_conn(conn)
  File "/home/user/.virtualenv/ansible_venv/lib64/python3.9/site-packages/urllib3/connectionpool.py", line 1053, in _validate_conn
    conn.connect()
  File "/home/user/.virtualenv/ansible_venv/lib64/python3.9/site-packages/urllib3/connection.py", line 419, in connect
    self.sock = ssl_wrap_socket(
  File "/home/user/.virtualenv/ansible_venv/lib64/python3.9/site-packages/urllib3/util/ssl_.py", line 449, in ssl_wrap_socket
    ssl_sock = _ssl_wrap_socket_impl(
  File "/home/user/.virtualenv/ansible_venv/lib64/python3.9/site-packages/urllib3/util/ssl_.py", line 493, in _ssl_wrap_socket_impl
    return ssl_context.wrap_socket(sock, server_hostname=server_hostname)
  File "/usr/lib64/python3.9/ssl.py", line 501, in wrap_socket
    return self.sslsocket_class._create(
  File "/usr/lib64/python3.9/ssl.py", line 1041, in _create
    self.do_handshake()
  File "/usr/lib64/python3.9/ssl.py", line 1310, in do_handshake
    self._sslobj.do_handshake()
ssl.SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1129)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/user/.ansible/tmp/ansible-tmp-1695801734.8432624-1801678-279625003024084/AnsiballZ_k8s.py", line 107, in <module>
    _ansiballz_main()
  File "/home/user/.ansible/tmp/ansible-tmp-1695801734.8432624-1801678-279625003024084/AnsiballZ_k8s.py", line 99, in _ansiballz_main
    invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)
  File "/home/user/.ansible/tmp/ansible-tmp-1695801734.8432624-1801678-279625003024084/AnsiballZ_k8s.py", line 47, in invoke_module
    runpy.run_module(mod_name='ansible_collections.kubernetes.core.plugins.modules.k8s', init_globals=dict(_module_fqn='ansible_collections.kubernetes.core.plugins.modules.k8s', _modlib_path=modlib_path),
  File "/usr/lib64/python3.9/runpy.py", line 225, in run_module
    return _run_module_code(code, init_globals, run_name, mod_spec)
  File "/usr/lib64/python3.9/runpy.py", line 97, in _run_module_code
    _run_code(code, mod_globals, init_globals,
  File "/usr/lib64/python3.9/runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "/tmp/ansible_kubernetes.core.k8s_payload_sdtugg_q/ansible_kubernetes.core.k8s_payload.zip/ansible_collections/kubernetes/core/plugins/modules/k8s.py", line 479, in <module>
  File "/tmp/ansible_kubernetes.core.k8s_payload_sdtugg_q/ansible_kubernetes.core.k8s_payload.zip/ansible_collections/kubernetes/core/plugins/modules/k8s.py", line 473, in main
  File "/tmp/ansible_kubernetes.core.k8s_payload_sdtugg_q/ansible_kubernetes.core.k8s_payload.zip/ansible_collections/kubernetes/core/plugins/module_utils/k8s/runner.py", line 52, in run_module
  File "/tmp/ansible_kubernetes.core.k8s_payload_sdtugg_q/ansible_kubernetes.core.k8s_payload.zip/ansible_collections/kubernetes/core/plugins/module_utils/k8s/client.py", line 351, in get_api_client
  File "/tmp/ansible_kubernetes.core.k8s_payload_sdtugg_q/ansible_kubernetes.core.k8s_payload.zip/ansible_collections/kubernetes/core/plugins/module_utils/k8s/client.py", line 148, in _create_configuration
  File "/tmp/ansible_kubernetes.core.k8s_payload_sdtugg_q/ansible_kubernetes.core.k8s_payload.zip/ansible_collections/kubernetes/core/plugins/module_utils/k8s/client.py", line 146, in _create_configuration
  File "/tmp/ansible_kubernetes.core.k8s_payload_sdtugg_q/ansible_kubernetes.core.k8s_payload.zip/ansible_collections/kubernetes/core/plugins/module_utils/k8s/client.py", line 126, in _load_config
  File "/home/user/.virtualenv/ansible_venv/lib64/python3.9/site-packages/kubernetes/config/kube_config.py", line 822, in load_kube_config
    loader.load_and_set(config)
  File "/home/user/.virtualenv/ansible_venv/lib64/python3.9/site-packages/kubernetes/config/kube_config.py", line 589, in load_and_set
    self._load_authentication()
  File "/home/user/.virtualenv/ansible_venv/lib64/python3.9/site-packages/kubernetes/config/kube_config.py", line 288, in _load_authentication
    if self._load_auth_provider_token():
  File "/home/user/.virtualenv/ansible_venv/lib64/python3.9/site-packages/kubernetes/config/kube_config.py", line 307, in _load_auth_provider_token
    return self._load_oid_token(provider)
  File "/home/user/.virtualenv/ansible_venv/lib64/python3.9/site-packages/kubernetes/config/kube_config.py", line 413, in _load_oid_token
    self._refresh_oidc(provider)
  File "/home/user/.virtualenv/ansible_venv/lib64/python3.9/site-packages/kubernetes/config/kube_config.py", line 450, in _refresh_oidc
    response = client.request(
  File "/home/user/.virtualenv/ansible_venv/lib64/python3.9/site-packages/kubernetes/client/api_client.py", line 373, in request
    return self.rest_client.GET(url,
  File "/home/user/.virtualenv/ansible_venv/lib64/python3.9/site-packages/kubernetes/client/rest.py", line 244, in GET
    return self.request("GET", url,
  File "/home/user/.virtualenv/ansible_venv/lib64/python3.9/site-packages/kubernetes/client/rest.py", line 217, in request
    r = self.pool_manager.request(method, url,
  File "/home/user/.virtualenv/ansible_venv/lib64/python3.9/site-packages/urllib3/request.py", line 74, in request
    return self.request_encode_url(
  File "/home/user/.virtualenv/ansible_venv/lib64/python3.9/site-packages/urllib3/request.py", line 96, in request_encode_url
    return self.urlopen(method, url, **extra_kw)
  File "/home/user/.virtualenv/ansible_venv/lib64/python3.9/site-packages/urllib3/poolmanager.py", line 376, in urlopen
    response = conn.urlopen(method, u.request_uri, **kw)
  File "/home/user/.virtualenv/ansible_venv/lib64/python3.9/site-packages/urllib3/connectionpool.py", line 826, in urlopen
    return self.urlopen(
  File "/home/user/.virtualenv/ansible_venv/lib64/python3.9/site-packages/urllib3/connectionpool.py", line 826, in urlopen
    return self.urlopen(
  File "/home/user/.virtualenv/ansible_venv/lib64/python3.9/site-packages/urllib3/connectionpool.py", line 826, in urlopen
    return self.urlopen(
  File "/home/user/.virtualenv/ansible_venv/lib64/python3.9/site-packages/urllib3/connectionpool.py", line 798, in urlopen
    retries = retries.increment(
  File "/home/user/.virtualenv/ansible_venv/lib64/python3.9/site-packages/urllib3/util/retry.py", line 592, in increment
    raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='tkgi.domain.company', port=8443): Max retries exceeded with url: /oauth/token/.well-known/openid-configuration (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1129)')))
", "module_stdout": "", "msg": "MODULE FAILURE
See stdout/stderr for the exact error", "rc": 1}

It looks like the modules are not using the certificate in idp-certificate-authority-data when talking to the idp-issuer-url to fetch a new token. Switching to the shell module and running kubectl commands instead works fine, even after the token has expired.
In fact, one workaround is to run something like kubectl get pods to refresh the token in the kubeconfig; afterwards the modules start working again.
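For illustration, the workaround can be sketched as an extra task that runs before any kubernetes.core tasks (a sketch only; the task name is made up, and kube_environment/kube_context are the same variables used in the reproduction playbook below):

```yaml
# Workaround sketch: let kubectl perform the OIDC refresh (it honors
# idp-certificate-authority-data), so the kubeconfig already contains
# a fresh id-token by the time kubernetes.core modules run.
- name: "Refresh OIDC token in kubeconfig via kubectl"
  environment: "{{ kube_environment }}"
  ansible.builtin.command: kubectl --context {{ kube_context }} get pods
  changed_when: false
```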

I can reproduce the issue with plain Python and can also solve it there:

$ python3
Python 3.9.16 (main, May 31 2023, 12:21:58) 
[GCC 8.5.0 20210514 (Red Hat 8.5.0-18)] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import requests
>>> response = requests.get('https://tkgi.domain.company:8443')

<<<< same error as above >>>>

When exporting REQUESTS_CA_BUNDLE to point to a certificate bundle that includes the company root CA certificate, the issue disappears. From what I understand, the requests library otherwise uses the certificate bundle shipped with the certifi Python package.

$ export REQUESTS_CA_BUNDLE=/etc/ssl/certs/ca-bundle.trust.crt
$ python3
Python 3.9.16 (main, May 31 2023, 12:21:58) 
[GCC 8.5.0 20210514 (Red Hat 8.5.0-18)] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import requests
>>> response = requests.get('https://tkgi.domain.company:8443')
>>> print(response)
<Response [200]>

I already tried setting the same environment variable for the k8s task in Ansible and in the shell where Ansible is executed, but without success.
Even replacing the .pem files that I found in the virtualenv didn't help:

find ~/.virtualenv/ansible_venv/ -name '*.pem'
/home/user/.virtualenv/ansible_venv/lib/python3.9/site-packages/pip/_vendor/certifi/cacert.pem
/home/user/.virtualenv/ansible_venv/lib/python3.9/site-packages/certifi/cacert.pem

I am not able to figure out which certificate bundle the k8s module uses when talking to the idp-issuer-url, or whether any is used at all. From my perspective it should use the one given in the kubeconfig.

Maybe someone could look into the code to analyze this issue, which is IMHO a bug.
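To make the expected behavior concrete, here is a minimal sketch of the plumbing I would expect somewhere in the OIDC refresh path: decode the base64 idp-certificate-authority-data from the auth-provider config and hand the resulting PEM file to the HTTP client as its CA bundle, instead of falling back to the certifi default. The function name and the fake certificate below are hypothetical, just to show the round trip:

```python
import base64
import tempfile

def ca_bundle_from_auth_provider(provider_config):
    """Decode the base64 idp-certificate-authority-data from an
    auth-provider config block and write it to a temporary PEM file,
    returning a path usable as an HTTP client's CA bundle."""
    ca_b64 = provider_config["idp-certificate-authority-data"]
    ca_pem = base64.b64decode(ca_b64)
    ca_file = tempfile.NamedTemporaryFile(suffix=".pem", delete=False)
    ca_file.write(ca_pem)
    ca_file.close()
    return ca_file.name

# Hypothetical example with a fake CA so the round trip is visible.
fake_pem = b"-----BEGIN CERTIFICATE-----\nMIIB...\n-----END CERTIFICATE-----\n"
provider = {"idp-certificate-authority-data": base64.b64encode(fake_pem).decode()}
path = ca_bundle_from_auth_provider(provider)
print(open(path, "rb").read() == fake_pem)  # True
```

The returned path could then be used for the token refresh request, e.g. `requests.get(idp_issuer_url, verify=path)`, which is effectively what setting REQUESTS_CA_BUNDLE does by hand above.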

ISSUE TYPE
  • Bug Report
COMPONENT NAME
  • kubernetes.core.k8s
  • kubernetes.core.k8s_info
ANSIBLE VERSION
ansible [core 2.13.12]
  config file = /home/user/.ansible.cfg
  configured module search path = ['/home/user/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /home/user/.virtualenv/ansible_venv/lib64/python3.9/site-packages/ansible
  ansible collection location = /home/user/.ansible/collections:/usr/share/ansible/collections
  executable location = /home/user/.virtualenv/ansible_venv/bin/ansible
  python version = 3.9.16 (main, May 31 2023, 12:21:58) [GCC 8.5.0 20210514 (Red Hat 8.5.0-18)]
  jinja version = 3.1.2
  libyaml = True
COLLECTION VERSION
ansible-galaxy collection list kubernetes.core

# /home/user/.virtualenv/ansible_venv/lib/python3.9/site-packages/ansible_collections
Collection      Version
--------------- -------
kubernetes.core 2.4.0  

# /home/user/.ansible/collections/ansible_collections
Collection      Version
--------------- -------
kubernetes.core 2.4.0  

# /home/user/.virtualenv/ansible_venv/lib64/python3.9/site-packages/ansible_collections
Collection      Version
--------------- -------
kubernetes.core 2.4.0  
CONFIGURATION
ansible-config dump --only-changed

CALLBACKS_ENABLED(/home/user/.ansible.cfg) = ['profile_tasks']
DEFAULT_HOST_LIST(/home/user/.ansible.cfg) = ['/home/user/workspace/hosts']
DEFAULT_LOCAL_TMP(/home/user/.ansible.cfg) = /home/user/.ansible/tmp/ansible-local-1829836glz6caxc
DEFAULT_ROLES_PATH(/home/user/.ansible.cfg) = ['/home/user/roles']
DEFAULT_VAULT_PASSWORD_FILE(/home/user/.ansible.cfg) = /home/user/.ansible.vault
HOST_KEY_CHECKING(/home/user/.ansible.cfg) = False
OS / ENVIRONMENT
cat /etc/redhat-release 
Red Hat Enterprise Linux release 8.8 (Ootpa)
STEPS TO REPRODUCE

Please see the summary above

- name: "Create namespace"
  environment: "{{ kube_environment }}"
  kubernetes.core.k8s:
    context: "{{ kube_context }}"
    name: "{{ namespace }}"
    kind: Namespace
    state: present
EXPECTED RESULTS

The module should use the CA certificate which is given in the kubeconfig when talking to the OIDC auth endpoint while refreshing the token.

ACTUAL RESULTS

Please see the summary above

urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='tkgi.domain.company', port=8443): Max retries exceeded with url: /oauth/token/.well-known/openid-configuration (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1129)')))
@gravesm gravesm added type/bug Something isn't working needs_verify labels Sep 27, 2023