{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "# Calculation of global climatology and annual cycles of Cloud Fractional Cover from EUMETSAT's CM SAF CLARA-A3 dataset\n", "\n", "## 1. Introduction\n", "\n", "This notebook provides an introduction to one variable of **EUMETSAT's CM SAF CLARA-A3** dataset, available at the [Climate Data Store](https://cds.climate.copernicus.eu/#!/home) (CDS). The overall dataset contains data for the Essential Climate Variables (ECVs) _Cloud Properties_, _Surface Radiation Budget_ and _Earth Radiation Budget_, while this notebook focuses on **Cloud Fractional Cover**, part of the ECV _Cloud Properties_, available here: [Cloud properties global gridded monthly and daily data from 1979 to present derived from satellite observations](https://cds.climate.copernicus.eu/cdsapp#!/dataset/satellite-cloud-properties?tab=overview).\n", "\n", "In addition, the tutorial covers the **surface downwelling longwave/shortwave flux** (part of the ECV _Surface Radiation Budget_) to demonstrate the relation between clouds and radiation. These data are also available at the CDS: [Surface radiation budget from 1979 to present derived from satellite observations](https://cds.climate.copernicus.eu/cdsapp#!/dataset/satellite-surface-radiation-budget?tab=overview).\n", "\n", "The tutorial covers the full workflow from start to finish. It begins with a short introduction to the dataset and to accessing the data from the Climate Data Store of the Copernicus Climate Change Service (C3S). This is followed by a step-by-step guide on how to process and visualize the data. Once you feel comfortable with the Python code, you are invited to adjust or extend it according to your interests! After a short introduction on how to use a Jupyter notebook, the analysis starts!"
] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**Figure 1:** Summary of the analysis options for a cloud and radiation dataset: climatology of cloud fractional cover (a), annual cycle of cloud fractional cover for three selected cities (b) and climatology of surface downwelling longwave radiation (c); each based on CLARA-A3 data from 1979 to 2020." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### How to access the notebook\n", "\n", "This tutorial is in the form of a [Jupyter notebook](https://jupyter.org/). It can be run in a cloud environment or on your own computer. You will not need to install any software for the training, as there are a number of free cloud-based services to create, edit, run and export Jupyter notebooks such as this. Here are some suggestions (simply click on one of the links below to run the notebook):" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "\n", "
Run the tutorial via free cloud platforms.\n" ] },
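Once the notebook is running, the CLARA-A3 cloud data described above can be requested from the CDS with the `cdsapi` package. The sketch below only assembles a request dictionary; the dataset name `satellite-cloud-properties` comes from the catalogue link above, but the keyword values (product family, origin, variable, aggregation) are assumptions and should be checked against the CDS download form. The actual `retrieve` call is commented out because it requires a CDS account and a configured `~/.cdsapirc` key file.

```python
# Sketch of a CDS API request for CLARA-A3 monthly-mean cloud fractional
# cover. The keyword values below are assumptions based on the CDS
# download form and may need adjusting; check the dataset page on the CDS.

def build_clara_request(years, months):
    """Assemble a request dict for monthly-mean cloud fractional cover."""
    return {
        "product_family": "clara_a3",                  # assumed keyword value
        "origin": "eumetsat",                          # assumed keyword value
        "variable": "cloud_fraction",                  # assumed keyword value
        "time_aggregation": "monthly_mean",            # assumed keyword value
        "year": [str(y) for y in years],
        "month": [f"{m:02d}" for m in months],
        "format": "zip",
    }

# Full record used in this tutorial: 1979-2020, all months.
request = build_clara_request(range(1979, 2021), range(1, 13))

# Uncomment to download (requires a CDS account and ~/.cdsapirc):
# import cdsapi
# client = cdsapi.Client()
# client.retrieve("satellite-cloud-properties", request, "clara_a3_cfc.zip")
```

Downloading the full record yields one NetCDF file per month, which can then be opened as a single Dataset as shown below.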
<xarray.Dataset> Size: 59GB\n", "Dimensions: (time: 504, lat: 720, bndsize: 2, lon: 1440)\n", "Coordinates:\n", " * lat (lat) float32 3kB -89.88 -89.62 -89.38 ... 89.38 89.62 89.88\n", " * lon (lon) float32 6kB -179.9 -179.6 -179.4 ... 179.4 179.6 179.9\n", " * time (time) datetime64[ns] 4kB 1979-01-01 ... 2020-12-01\n", "Dimensions without coordinates: bndsize\n", "Data variables: (12/18)\n", " lat_bnds (time, lat, bndsize) float32 3MB dask.array<chunksize=(1, 720, 2), meta=np.ndarray>\n", " lon_bnds (time, lon, bndsize) float32 6MB dask.array<chunksize=(1, 1440, 2), meta=np.ndarray>\n", " time_bnds (time, bndsize) datetime64[ns] 8kB dask.array<chunksize=(1, 2), meta=np.ndarray>\n", " record_status (time) int8 504B dask.array<chunksize=(1,), meta=np.ndarray>\n", " nobs (time, lat, lon) float64 4GB dask.array<chunksize=(1, 720, 1440), meta=np.ndarray>\n", " nobs_day (time, lat, lon) float64 4GB dask.array<chunksize=(1, 720, 1440), meta=np.ndarray>\n", " ... ...\n", " cfc_day (time, lat, lon) float64 4GB dask.array<chunksize=(1, 720, 1440), meta=np.ndarray>\n", " cfc_night (time, lat, lon) float64 4GB dask.array<chunksize=(1, 720, 1440), meta=np.ndarray>\n", " cma_prob (time, lat, lon) float64 4GB dask.array<chunksize=(1, 720, 1440), meta=np.ndarray>\n", " cma_prob_day (time, lat, lon) float64 4GB dask.array<chunksize=(1, 720, 1440), meta=np.ndarray>\n", " cma_prob_night (time, lat, lon) float64 4GB dask.array<chunksize=(1, 720, 1440), meta=np.ndarray>\n", " cfc_unc_mean (time, lat, lon) float64 4GB dask.array<chunksize=(1, 720, 1440), meta=np.ndarray>\n", "Attributes: (12/37)\n", " title: CM SAF cLoud, Albedo and RAdiation dataset, ...\n", " summary: This file contains AVHRR-based Thematic Clim...\n", " id: DOI:10.5676/EUM_SAF_CM/CLARA_AVHRR/V003\n", " product_version: 3.0\n", " creator_name: DE/DWD\n", " creator_email: contact.cmsaf@dwd.de\n", " ... 
...\n", " CMSAF_included_Daily_Means: 31.0\n", " CMSAF_L1_processor: PyGAC, level1c4pps\n", " CMSAF_L2_processor: PPSv2018-patch5\n", " CMSAF_L3_processor: CMSAFGACL3_V3.0\n", " variable_id: cfc\n", " license: The CM SAF data are owned by EUMETSAT and ar...
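With the monthly files opened as the Dataset shown above, the climatology and the annual cycle each reduce to a single xarray reduction. The snippet below demonstrates both on a small synthetic stand-in for the `cfc` variable (random values on a coarse grid with the same 504-month time axis), since the real 59 GB dataset is not shipped with this notebook; the city coordinates are illustrative.

```python
import numpy as np
import pandas as pd
import xarray as xr

# Synthetic stand-in for the CLARA-A3 'cfc' variable: monthly means on a
# coarse 2.5-degree lat/lon grid covering 1979-2020, values in percent.
time = pd.date_range("1979-01-01", "2020-12-01", freq="MS")   # 504 months
lat = np.arange(-88.75, 90, 2.5)
lon = np.arange(-178.75, 180, 2.5)
rng = np.random.default_rng(0)
cfc = xr.DataArray(
    rng.uniform(0, 100, size=(time.size, lat.size, lon.size)),
    coords={"time": time, "lat": lat, "lon": lon},
    dims=("time", "lat", "lon"),
    name="cfc",
)

# (a) Climatology: average over the full record at every grid point.
climatology = cfc.mean("time")

# (b) Annual cycle at a selected location (nearest grid point): average
#     all Januaries together, all Februaries together, and so on.
city = cfc.sel(lat=52.5, lon=13.4, method="nearest")   # ~Berlin, illustrative
annual_cycle = city.groupby("time.month").mean("time")

print(climatology.shape)           # one value per (lat, lon) grid point
print(annual_cycle["month"].values)  # months 1..12
```

On the real dataset the same two lines apply unchanged to `ds["cfc"]`; with the dask-backed chunks shown in the repr above, the reductions are evaluated lazily and computed chunk by chunk.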