Learn how to run dlt (or any Python code) with Airflow using PythonOperator, PythonVirtualenvOperator, KubernetesPodOperator, and external compute services
Wow, what a pleasure to read, and so informative!
While running the workloads on GKE, did you use Application Default Credentials for authentication? I'm having a hard time with Google's gcloud.auth.default credentials and dlt-compatible classes.
In my case it's important to rely on ADC, because generating service account keys is forbidden.
Hi Osvaldo,
I am replying here for the people who might end up here from somewhere else (Google? ChatGPT?).
As per our conversation in the dltHub Slack, you were able to solve the problem on your own using:
```python
import dlt
from dlt.sources.filesystem import filesystem, read_csv
from dlt.sources.credentials import GcpOAuthCredentials

# OAuth-based credentials class, used instead of a service account key
creds = GcpOAuthCredentials()
```