Module pipelines.utils.dump_earth_engine_asset.tasks

Tasks for dumping data directly from BigQuery to GCS and registering the dumped files as Earth Engine assets.

Functions

def create_table_asset(service_account: str, service_account_secret_path: str, project_id: str, gcs_file_asset_path: str, ee_asset_path: str)

Create a table asset in Earth Engine.

Parameters:

service_account: Service account email, in the format of earth-engine@.iam.gserviceaccount.com.
service_account_secret_path: Path to the .json file containing the service account secret.
project_id: Earth Engine project ID.
gcs_file_asset_path: Path to the asset in Google Cloud Storage, in the format of "gs:////file.csv".
ee_asset_path: Path at which the asset will be created in Earth Engine, in the format of projects//assets/ or users//.
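The implementation is not shown here; below is a minimal sketch of what this task might do with the `earthengine-api` package. The `build_table_manifest` helper is illustrative, not part of the module, and the exact `ee.data` calls are an assumption about how the ingestion is started.

```python
def build_table_manifest(ee_asset_path: str, gcs_file_asset_path: str) -> dict:
    """Build the ingestion request body for a CSV table asset (illustrative helper)."""
    return {
        "id": ee_asset_path,
        "sources": [{"primaryPath": gcs_file_asset_path}],
    }


def create_table_asset(
    service_account: str,
    service_account_secret_path: str,
    project_id: str,
    gcs_file_asset_path: str,
    ee_asset_path: str,
) -> None:
    import ee  # earthengine-api; imported lazily so the helper above stays dependency-free

    # Authenticate with the service account key and point at the EE project.
    credentials = ee.ServiceAccountCredentials(
        service_account, service_account_secret_path
    )
    ee.Initialize(credentials, project=project_id)

    # Start an ingestion task that reads the CSV from GCS into the table asset.
    task_id = ee.data.newTaskId()[0]
    ee.data.startTableIngestion(
        task_id, build_table_manifest(ee_asset_path, gcs_file_asset_path)
    )
```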

def download_data_to_gcs(project_id: str = None, query: str = None, gcs_asset_path: str = None, bd_project_mode: str = 'prod', billing_project_id: str = None, location: str = 'US')

Run a query in BigQuery and save the results to Google Cloud Storage.

def get_earth_engine_key_from_vault(vault_path_earth_engine_key: str)

Get the Earth Engine service account key from Vault.
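HashiCorp Vault is typically accessed with the `hvac` client; the sketch below assumes the key is stored as a KV v2 secret and that Vault address and token come from environment variables. The `save_key_to_file` helper (for handing the key to the Earth Engine client as a .json file) is illustrative.

```python
import json


def save_key_to_file(secret: dict, path: str) -> str:
    """Write the service-account key JSON to disk (illustrative helper)."""
    with open(path, "w") as handle:
        json.dump(secret, handle)
    return path


def get_earth_engine_key_from_vault(vault_path_earth_engine_key: str) -> dict:
    import os

    import hvac  # imported lazily; needs VAULT_ADDR / VAULT_TOKEN configured

    client = hvac.Client(
        url=os.environ["VAULT_ADDR"], token=os.environ["VAULT_TOKEN"]
    )
    # KV v2 read; the secret payload lives under data.data.
    response = client.secrets.kv.read_secret_version(
        path=vault_path_earth_engine_key
    )
    return response["data"]["data"]
```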

def get_project_id(project_id: str = None, bd_project_mode: str = 'prod')

Get the project ID, falling back to the project configured for bd_project_mode when none is given.
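A sketch of the fallback logic this task likely implements. Where the configured project comes from is an assumption; here it is stubbed with per-mode environment variables (`BD_PROJECT_PROD` / `BD_PROJECT_STAGING`, hypothetical names), while the real task may read a basedosdados config file instead.

```python
import os


def get_project_id(project_id: str = None, bd_project_mode: str = "prod") -> str:
    """Return the given project ID, or fall back to the one configured for the mode."""
    if project_id:
        return project_id
    # Assumption: the configured project is exposed per mode via an env var.
    env_var = f"BD_PROJECT_{bd_project_mode.upper()}"
    fallback = os.environ.get(env_var)
    if fallback is None:
        raise ValueError(f"No project ID given and {env_var} is not set")
    return fallback
```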

def trigger_cron_job(project_id: str, ee_asset_path: str, cron_expression: str)

Determine whether the cron job for the asset should be triggered.
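Deciding whether to trigger comes down to matching the current time against the cron expression. The sketch below handles only `*`, `*/n`, and plain numeric fields; a real task would more likely use a library such as `croniter` and compare against the last recorded trigger.

```python
from datetime import datetime


def _field_matches(field: str, value: int) -> bool:
    """Match one cron field against a value; supports '*', '*/n', and numbers."""
    if field == "*":
        return True
    if field.startswith("*/"):
        return value % int(field[2:]) == 0
    return value == int(field)


def cron_matches(cron_expression: str, moment: datetime) -> bool:
    """Return True if `moment` satisfies the five-field cron expression."""
    minute, hour, day, month, weekday = cron_expression.split()
    return (
        _field_matches(minute, moment.minute)
        and _field_matches(hour, moment.hour)
        and _field_matches(day, moment.day)
        and _field_matches(month, moment.month)
        # cron weekday: 0 = Sunday; datetime.weekday(): 0 = Monday
        and _field_matches(weekday, (moment.weekday() + 1) % 7)
    )
```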

def update_last_trigger(project_id: str, ee_asset_path: str, execution_time: datetime.datetime)

Update the last trigger time recorded for the asset.
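The last execution time has to be persisted between runs; the sketch below uses an in-memory store keyed by project and asset path. The real task presumably writes to a shared backend (for example Redis or a database), which is an assumption, and `get_last_trigger` is an illustrative companion reader.

```python
from datetime import datetime
from typing import Dict, Optional, Tuple

# Illustrative in-memory store; a real deployment needs shared persistence.
_LAST_TRIGGERS: Dict[Tuple[str, str], datetime] = {}


def update_last_trigger(
    project_id: str, ee_asset_path: str, execution_time: datetime
) -> None:
    """Record when the asset was last triggered."""
    _LAST_TRIGGERS[(project_id, ee_asset_path)] = execution_time


def get_last_trigger(project_id: str, ee_asset_path: str) -> Optional[datetime]:
    """Read back the last trigger time, or None if never triggered."""
    return _LAST_TRIGGERS.get((project_id, ee_asset_path))
```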