API testing
Climate Datasets and API Availability
| Dataset | API Available? | API Access Registration Required? | Notes / Source |
|---|---|---|---|
| CHIRPS (Climate Hazards Group InfraRed Precipitation with Station data) | No official public API | N/A | Data usually accessed via FTP or direct downloads, not REST APIs. |
| CHIRTS (CHIRPS + Temperature) | No official public API | N/A | Similar access methods as CHIRPS; no dedicated API documented. |
| AgERA5 | Yes (via Copernicus Climate Data Store API) | Yes (CDS registration and API key required) | Access through CDS API with Python cdsapi client; requires license acceptance and .cdsapirc setup. See CDS API documentation and ag5Tools. |
| ERA5 | Yes | Yes (registration and API key required) | Available via Copernicus Climate Data Store API; requires personal access token setup. |
| TerraClimate | No official public API | N/A | Data usually accessed via direct download; no standard public API. |
| IMERG (Integrated Multi-satellitE Retrievals for GPM) | Yes | Yes (registration and API key required) | NASA provides APIs via GES DISC; EarthData login and token required. |
| TAMSAT (Tropical Applications of Meteorology using SATellite data) | No official public API | N/A | Data typically accessed via downloads; no public API documented. |
| NASA Earth Exchange Global Daily Downscaled Projections (NEX-GDDP) | No official public API | N/A | Data accessible via EarthData portal; APIs limited or indirect; registration usually required. |
| NASA POWER (Prediction Of Worldwide Energy Resources) | Yes | No API key required for basic use; optional registration for higher limits | NASA POWER API available; no EarthData login or token needed for standard access |
| CMIP6 (Coupled Model Intercomparison Project Phase 6) | Yes (via some services) | Depends on service (e.g., Open-Meteo no key, others yes) | Accessible via APIs like Open-Meteo Climate API (no key) and others that may require registration. |
Open-Meteo Climate API Testing with CMIP6 Climate Model Data
In this section, we present the results of our testing of the Open-Meteo Climate API, which provides high-resolution climate model data from CMIP6 simulations. The API allows querying daily climate variables such as maximum temperature and precipitation for specified locations and time periods. Our testing focused on retrieving data for Nairobi, Kenya, over the year 2024, handling API constraints, and evaluating performance and data quality.
The following table summarizes the key aspects of the API testing process, including request parameters, data retrieval, error handling, and observed limitations.
| Test Aspect | Description / Result |
|---|---|
| API Endpoint | - https://climate-api.open-meteo.com/v1/climate |
| Authentication | - No authentication or API key required for non-commercial use |
| Request Parameters | - Latitude: -1.2921 (Nairobi) - Longitude: 36.8219 - Date range: chunked intervals (initially 10-day limit observed, monthly chunks implemented for convenience) - Model: MRI_AGCM3_2_S - Variables: daily max temperature, precipitation - Units: Celsius, mm - Time format: ISO8601 |
| Data Retrieval | - Successfully retrieved daily climate model data in chunks - Combined monthly chunks to form a full year dataset |
| Data Format | - JSON response parsed into R list - Converted into clean data frame with columns: date, temperature_max_c, precipitation_mm |
| Response Time | - Average ~0.19 seconds per monthly chunk - First request slightly longer (~1 second) |
| Error Handling | - Used tryCatch() to handle request failures gracefully - Warnings printed for failed requests without stopping the loop |
| Data Completeness | - Full year (2024) daily data obtained by aggregating chunked requests - Duplicate rows removed |
| Data Accuracy | - Data matches expected climate model outputs for Nairobi region and specified variables |
| Limitations | - API enforces maximum date range per request of approximately 10 days (empirically observed) - Chunking required for longer periods (months, years) |
| Usability | - Easy integration with R packages (httr, jsonlite, dplyr) - No API key needed - Flexible parameterization |
| Scalability | - Chunked requests enable multi-year dataset retrieval - Total time scales with number of chunks requested |
| Documentation | - API usage and testing process documented with clear tables and explanations - Official API docs: Open-Meteo Climate API - Markdown format used for easy integration and version tracking |
To illustrate the practical implementation of this testing approach, the following R code demonstrates how to retrieve and aggregate monthly climate data chunks from the Open-Meteo Climate API for Nairobi in 2024, including error handling and timing of requests.
library(httr)
library(jsonlite)
library(dplyr)
url <- "https://climate-api.open-meteo.com/v1/climate"
# Function to get data for a given date range
get_climate_data <- function(start_date, end_date) {
  params <- list(
    latitude = -1.2921,
    longitude = 36.8219,
    start_date = start_date,
    end_date = end_date,
    models = "MRI_AGCM3_2_S",
    daily = "temperature_2m_max,precipitation_sum",
    temperature_unit = "celsius",
    precipitation_unit = "mm",
    timeformat = "iso8601"
  )
  response <- GET(url, query = params)
  if (status_code(response) == 200) {
    data <- content(response, as = "parsed", type = "application/json")
    dates <- unlist(data$daily$time)
    temp_max <- unlist(data$daily$temperature_2m_max)
    precip <- unlist(data$daily$precipitation_sum)
    df <- data.frame(
      date = as.Date(dates),
      temperature_max_c = temp_max,
      precipitation_mm = precip
    )
    return(df)
  } else {
    warning(paste("Request failed with status:", status_code(response), "for", start_date, "to", end_date))
    return(NULL)
  }
}
# Create monthly start and end dates covering 2024
start_dates <- seq(as.Date("2024-01-01"), as.Date("2024-12-01"), by = "month")
# Each month ends the day before the next month's start; December ends on 2024-12-31
end_dates <- c(start_dates[-1] - 1, as.Date("2024-12-31"))
# Initialize empty data frame
full_year_data <- data.frame()
# Loop over each month and append data with timing
for (i in seq_along(start_dates)) {
  cat("Fetching data from", as.character(start_dates[i]), "to", as.character(end_dates[i]), "\n")
  time_taken <- system.time({
    monthly_data <- tryCatch(
      get_climate_data(as.character(start_dates[i]), as.character(end_dates[i])),
      error = function(e) {
        cat("Error for", as.character(start_dates[i]), "to", as.character(end_dates[i]), "\n")
        return(NULL)
      }
    )
  })
  cat("Time taken:", time_taken["elapsed"], "seconds\n\n")
  if (!is.null(monthly_data)) {
    full_year_data <- bind_rows(full_year_data, monthly_data)
  }
}
# Remove duplicate rows if any
full_year_data <- distinct(full_year_data)
print(full_year_data)
Below is a sample of the output generated by the R code, showing daily maximum temperature and precipitation data retrieved from the Open-Meteo Climate API for Nairobi in 2024.
> print(full_year_data)
date temperature_max_c precipitation_mm
1 2024-01-01 29.3 0.00
2 2024-01-02 28.5 1.46
3 2024-01-03 26.8 2.19
4 2024-01-04 26.7 0.00
5 2024-01-05 27.1 0.00
6 2024-01-06 29.8 0.88
7 2024-01-07 28.1 6.00
8 2024-01-08 24.8 9.36
9 2024-01-09 19.7 0.44
10 2024-01-10 22.0 0.88
11 2024-01-11 23.1 0.44
12 2024-01-12 23.9 1.32
13 2024-01-13 25.1 1.76
14 2024-01-14 26.0 0.15
15 2024-01-15 27.3 0.00
16 2024-01-16 28.1 0.00
17 2024-01-17 27.9 0.00
18 2024-01-18 29.0 0.00
19 2024-01-19 29.6 0.00
20 2024-01-20 29.7 0.00
21 2024-01-21 29.5 0.00
22 2024-01-22 28.4 0.00
23 2024-01-23 29.2 0.00
24 2024-01-24 30.5 0.00
25 2024-01-25 29.2 0.00
26 2024-01-26 28.4 0.00
27 2024-01-27 27.4 0.00
28 2024-01-28 28.1 0.00
29 2024-01-29 28.5 0.00
30 2024-01-30 29.1 0.00
31 2024-01-31 28.7 0.00
32 2024-02-01 26.9 0.00
33 2024-02-02 25.1 0.15
34 2024-02-03 27.6 0.00
35 2024-02-04 25.3 0.74
36 2024-02-05 26.4 0.30
37 2024-02-06 27.1 0.00
38 2024-02-07 29.4 0.00
39 2024-02-08 27.7 0.15
40 2024-02-09 27.1 0.15
41 2024-02-10 27.9 0.00
42 2024-02-11 28.3 0.00
43 2024-02-12 29.6 0.00
44 2024-02-13 30.3 0.89
45 2024-02-14 31.1 0.15
46 2024-02-15 29.8 0.00
47 2024-02-16 29.5 0.00
48 2024-02-17 30.6 0.00
49 2024-02-18 30.0 0.45
50 2024-02-19 29.6 0.00
51 2024-02-20 29.0 0.00
52 2024-02-21 30.4 0.00
53 2024-02-22 31.5 0.00
54 2024-02-23 31.5 0.30
55 2024-02-24 29.9 12.19
56 2024-02-25 22.5 21.86
57 2024-02-26 25.6 10.56
58 2024-02-27 26.1 4.61
59 2024-02-28 25.1 4.17
60 2024-02-29 24.6 3.87
61 2024-03-01 27.0 2.53
62 2024-03-02 26.2 2.53
63 2024-03-03 26.1 0.15
64 2024-03-04 26.3 0.30
Overview of AgERA5 API Access and Usage
The following table summarizes the key aspects of accessing and using the AgERA5 dataset through the Copernicus Climate Data Store (CDS) API. It highlights important details about authentication, request parameters, data retrieval, and practical considerations to help users effectively download and utilize AgERA5 climate data programmatically.
| Test Aspect | Description / Result |
|---|---|
| API Endpoint | - Accessed via CDS API client to dataset "sis-agrometeorological-indicators" |
| Authentication | - Requires registration on Copernicus Climate Data Store (CDS) and API key configured in .cdsapirc file |
| Request Parameters | - Variable: "2m_temperature" - Statistic: "24_hour_minimum" - Year: "1980" - Months: All 12 months - Days: All days 1-31 - Version: "1_0" - Area: Bounding box around Nairobi (N: -1.2, W: 36.74, S: -1.37, E: 36.9) |
| Data Retrieval | - Requested full-year daily minimum 2m temperature data for Nairobi region - Uses client.retrieve() method with .download() to save data locally |
| Data Format | - Data downloaded as NetCDF file (default format for AgERA5) |
| Response Time | - Dependent on data volume; full year and area request may take several minutes to complete |
| Error Handling | - Errors raised if .cdsapirc file missing or API key invalid - Network or quota errors possible during download |
| Data Completeness | - Full year coverage requested; actual completeness depends on dataset availability and license acceptance |
| Data Accuracy | - Data sourced from Copernicus AgERA5 reanalysis product, widely validated for agricultural climate applications |
| Limitations | - Must manually accept dataset license on CDS website before API download allowed - Large requests may be throttled or require chunking |
| Usability | - Python cdsapi client provides straightforward API access - Requires basic Python programming knowledge and environment setup |
| Scalability | - Supports requests for multiple years, variables, and spatial areas with appropriate chunking |
| Documentation | - Official CDS API documentation: https://cds.climate.copernicus.eu/how-to-api - Dataset page includes "Show API request code" feature for custom queries |
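Before running the request below, CDS credentials must be stored in a .cdsapirc file in the user's home directory. The exact endpoint URL and key format depend on the current CDS release, so the template below is indicative only (the key value is a placeholder); consult the how-to-api page linked above for the authoritative format.
url: https://cds.climate.copernicus.eu/api
key: <your-personal-access-token>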
The following Python script demonstrates how to use the cdsapi client to request and download daily minimum 2-meter temperature data for the year 1980 over the Nairobi region from the AgERA5 dataset.
import cdsapi
dataset = "sis-agrometeorological-indicators"
request = {
"variable": "2m_temperature",
"statistic": ["24_hour_minimum"],
"year": ["1980"],
"month": [
"01", "02", "03",
"04", "05", "06",
"07", "08", "09",
"10", "11", "12"
],
"day": [
"01", "02", "03",
"04", "05", "06",
"07", "08", "09",
"10", "11", "12",
"13", "14", "15",
"16", "17", "18",
"19", "20", "21",
"22", "23", "24",
"25", "26", "27",
"28", "29", "30",
"31"
],
"version": "1_0",
"area": [-1.2, 36.74, -1.37, 36.9]
}
client = cdsapi.Client()
client.retrieve(dataset, request).download()
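A target file name can optionally be passed to .download() (for example client.retrieve(dataset, request).download("agera5_tmin_1980.zip")). As a hedged sketch only, the snippet below shows how the downloaded data might then be inspected with xarray; it assumes the delivery is a ZIP of daily NetCDF files and that the variable and coordinate names are Temperature_Air_2m_Min_24h, lat, and lon, all of which should be verified against the actual files.
import zipfile
import xarray as xr

# Assumed local file names; adjust to match the actual CDS download
zip_path = "agera5_tmin_1980.zip"
extract_dir = "agera5_tmin_1980"

# Unpack the daily NetCDF files bundled in the CDS delivery
with zipfile.ZipFile(zip_path) as zf:
    zf.extractall(extract_dir)

# Open all daily files as a single dataset along the time dimension (requires dask)
ds = xr.open_mfdataset(f"{extract_dir}/*.nc", combine="by_coords")

# Select the grid cell nearest to Nairobi (variable and coordinate names are assumptions)
nairobi_tmin = ds["Temperature_Air_2m_Min_24h"].sel(lat=-1.2921, lon=36.8219, method="nearest")

# AgERA5 temperatures are typically stored in Kelvin; convert to Celsius for inspection
print((nairobi_tmin - 273.15).to_series().head())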
NASA POWER API testing process
The following table summarizes the key aspects and results of testing the NASA POWER API for retrieving daily temperature data at a specific location, highlighting the API’s usability, data quality, and practical considerations for integration in climate data analysis workflows.
| Test Aspect | Description / Result |
|---|---|
| API Endpoint | - https://power.larc.nasa.gov/api/temporal/daily/point?start=20150101&end=20150630&latitude=-1.2921&longitude=36.8219&community=ag&parameters=T2M&format=json&user=Michel&header=true&time-standard=lst |
| Authentication | - No API key required; simple user identifier optional (user=Michel) |
| Request Parameters | - Latitude: -1.2921 (Nairobi) - Longitude: 36.8219 - Date range: 2015-01-01 to 2015-06-30 - Community: ag (Agroclimatology) - Parameters: T2M (2m air temperature) - Format: JSON - Time standard: Local Solar Time (LST) |
| Data Retrieval | - Successful GET request with HTTP 200 status code - Data returned as nested JSON with daily temperature values keyed by date |
| Data Format | - JSON response parsed using jsonlite::fromJSON() - Extracted temperature data converted to data frame with columns: date, temperature_c |
| Response Time | - Typically under a few seconds for daily point data requests |
| Error Handling | - Checked HTTP status code before parsing - Used conditional stop on failure |
| Data Completeness | - Returned data covers entire requested date range - Dates correctly parsed and ordered |
| Data Accuracy | - Temperature values consistent with expected climatology for Nairobi region |
| Limitations | - Data availability depends on spatial resolution (~0.5° x 0.625°) - Exact coordinates may yield empty data if outside grid - No bulk or multi-point queries in single request (point data only) |
| Usability | - Easy integration with R packages (httr, jsonlite, dplyr) - No complex authentication or API keys needed - Clear parameterization and documentation |
| Scalability | - Suitable for single-point daily data retrieval - Larger spatial or temporal coverage requires multiple requests |
| Documentation | - API usage documented with example R code and troubleshooting notes - Official docs: NASA POWER API and the API getting-started guide |
To illustrate the practical implementation of this testing approach, the following R code demonstrates how to retrieve daily 2-meter air temperature (T2M) data from the NASA POWER API for Nairobi over January to June 2015, including HTTP status checking and conversion of the JSON response into a tidy data frame.
library(httr)
library(jsonlite)
library(dplyr)
# Define the API URL (copied directly from the tested query described above)
api_url <- "https://power.larc.nasa.gov/api/temporal/daily/point?start=20150101&end=20150630&latitude=-1.2921&longitude=36.8219&community=ag&parameters=T2M&format=json&user=Michel&header=true&time-standard=lst"
# Send GET request to NASA POWER API
response <- GET(api_url)
# Check if request was successful
if (status_code(response) == 200) {
  json_data <- content(response, as = "text", encoding = "UTF-8")
  parsed_data <- fromJSON(json_data, flatten = TRUE)
  # Daily T2M values are keyed by date directly under properties$parameter$T2M
  temp_data <- parsed_data$properties$parameter$T2M
  temperature_df <- data.frame(
    date = as.character(names(temp_data)),
    temperature_c = as.numeric(unlist(temp_data))
  ) %>%
    mutate(
      date = as.Date(date, format = "%Y%m%d")
    ) %>%
    arrange(date)
  print(head(temperature_df))
} else {
  stop(paste("API request failed with status:", status_code(response)))
}
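The URL above hard-codes every parameter in a single string. As a hedged sketch outside the tested R workflow, the same request can also be built from named parameters, which makes it easier to loop over locations or date ranges; the parameter names and values mirror those in the tested URL.
import requests

BASE_URL = "https://power.larc.nasa.gov/api/temporal/daily/point"

# Same parameters as the tested request above
params = {
    "start": "20150101",
    "end": "20150630",
    "latitude": -1.2921,
    "longitude": 36.8219,
    "community": "ag",
    "parameters": "T2M",
    "format": "json",
    "user": "Michel",
    "header": "true",
    "time-standard": "lst",
}

response = requests.get(BASE_URL, params=params, timeout=60)
response.raise_for_status()

# Daily T2M values are keyed by YYYYMMDD date under properties -> parameter -> T2M
t2m = response.json()["properties"]["parameter"]["T2M"]
for date, value in sorted(t2m.items())[:5]:
    print(date, value)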
The following is a sample JSON response from the NASA POWER API, showing daily 2-meter air temperature (T2M) values for Nairobi, Kenya, over the first month of 2015, including geographic coordinates and elevation metadata.
{
"type": "Feature",
"geometry": {
"type": "Point",
"coordinates": [36.822, -1.292, 1642.2]
},
"properties": {
"parameter": {
"T2M": {
"20150101": 19.56,
"20150102": 19.63,
"20150103": 20.4,
"20150104": 21.33,
"20150105": 20.41,
"20150106": 20.89,
"20150107": 21.16,
"20150108": 21.25,
"20150109": 21.64,
"20150110": 22.14,
"20150111": 21.48,
"20150112": 21.56,
"20150113": 21.01,
"20150114": 20.76,
"20150115": 21.24,
"20150116": 22.95,
"20150117": 23,
"20150118": 23.38,
"20150119": 23.88,
"20150120": 21.16,
"20150121": 20.88,
"20150122": 21.35,
"20150123": 21.58,
"20150124": 21.15,
"20150125": 20.51,
"20150126": 21.26,
"20150127": 22.12,
"20150128": 22.69,
"20150129": 22.12,
"20150130": 22.25,
"20150131": 22.48,
Crop Calendar Datasets: API Availability and Access
The following table summarizes key crop calendar datasets, highlighting their API availability, registration requirements, and access methods.
| Dataset | API Available? | API Access Registration Required? | Notes / Source |
|---|---|---|---|
| AgMIP-GGCMI Crop Calendar | No official public API | N/A | Provided as R-package and static gridded datasets; crop calendars at 0.5° resolution for 18 crops with rainfed/irrigated separation; used for crop model calibration; data downloadable from GitHub and Zenodo repositories. |
| MIRCA-OS (Monthly Irrigated and Rainfed Cropped Area, Open Source) | No official public API | N/A | Dataset provides monthly cropped area for irrigated and rainfed systems globally; data accessed via direct download (e.g., from FAO or associated portals); no REST API documented. |
| WorldCereal Project Crop Calendars | No official public API | N/A | Crop calendar data developed for global cereal crops; typically accessed via project portals or data repositories; no dedicated API available publicly. |
This table presents the testing results and key characteristics of the WorldCereal Project Crop Calendars dataset, highlighting its access method, data format, and usability in comparison to API-based climate data services.
| Test Aspect | Description / Result |
|---|---|
| API Endpoint | - No official API endpoint available for WorldCereal Crop Calendars. - Dataset downloadable as a ZIP file from: https://tandf.figshare.com/articles/dataset/Global_crop_calendars_of_maize_and_wheat_in_the_framework_of_the_WorldCereal_project/20005293 |
| Authentication | - No authentication or API key required; data accessed via direct download |
| Request Parameters | - Not applicable; data provided as static gridded maps and shapefiles |
| Data Retrieval | - Download ZIP file containing crop calendar maps for maize and wheat - Maps include Start of Season (SOS), End of Season (EOS), location info, and agro-ecological zones (AEZ) metadata |
| Data Format | - ZIP archive containing GeoTIFF maps and shapefiles - Maps display SOS, EOS in day of year (DOY) format - AEZ polygons with crop calendar attributes |
| Response Time | - Not applicable; download speed depends on user internet connection and file size |
| Error Handling | - No API errors; download failures depend on server or network issues |
| Data Completeness | - Covers global maize and wheat crop calendars at 0.5° resolution - Includes major cereal seasons and up to two maize seasons - Stratified into 106 agro-ecological zones |
| Data Accuracy | - Crop calendars derived from multiple reputable sources (GEOGLAM, USDA-FAS, FAO, JRC-ASAP) and machine learning models trained on ERA5 climate data - Widely used for global crop mapping and agro-ecological stratification |
| Limitations | - Static dataset; no temporal updates via API - Focused on maize and wheat only - Requires GIS software or programming tools to read and process maps |
| Usability | - Suitable for GIS analysis and crop modeling - Requires manual download and local processing - No programmatic access for automated workflows |
| Scalability | - Scalable for global coverage but limited by manual download and local processing resources |
| Documentation | - Detailed documentation and methodology available in project publications and ESA WorldCereal website - Dataset description: WorldCereal Crop Calendars on Figshare |
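Although no API is available, the downloaded maps can still be used programmatically. The snippet below is a minimal sketch that assumes the ZIP has already been downloaded manually from the Figshare page as worldcereal_crop_calendars.zip and that it contains a maize start-of-season GeoTIFF (the file name sos_maize.tif is a placeholder; check the archive listing printed by the script); it samples the SOS day of year at the Nairobi coordinates used elsewhere on this page.
import zipfile
import rasterio

zip_path = "worldcereal_crop_calendars.zip"  # downloaded manually from Figshare
extract_dir = "worldcereal_crop_calendars"

# Unpack the archive and list its contents to find the actual GeoTIFF names
with zipfile.ZipFile(zip_path) as zf:
    zf.extractall(extract_dir)
    print(zf.namelist())

# Placeholder file name; replace with a real GeoTIFF from the archive listing
sos_map = f"{extract_dir}/sos_maize.tif"

# Sample the start-of-season value (day of year) at the Nairobi point (lon, lat order)
with rasterio.open(sos_map) as src:
    for value in src.sample([(36.8219, -1.2921)]):
        print("Maize SOS (DOY):", value[0])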
Climate Dataset Availability and API Access via Google Earth Engine
Google Earth Engine (GEE) is a powerful cloud-based platform that hosts a vast collection of geospatial datasets, including numerous key climate data products relevant to agroecological research. This section summarizes the availability of important climate datasets within GEE, highlighting which datasets can be accessed directly through the platform, their corresponding Earth Engine asset IDs, and the programmatic interfaces (APIs) supported for data retrieval and analysis. For datasets not currently hosted on GEE, alternative access methods and APIs are noted to guide integration into agroecological workflows.
| Dataset Name | Available in GEE? | Earth Engine Dataset ID / Notes | Provider / Source | Data Type | Access Method (API) | Notes / Comments |
|---|---|---|---|---|---|---|
| CHIRPS Precipitation | Yes | `UCSB-CHG/CHIRPS/DAILY` | UCSB/CHG | ImageCollection | Python, JavaScript, R | Daily precipitation, 1981-present, ~5.5 km resolution |
| ERA5 Climate Reanalysis | Yes | `ECMWF/ERA5_LAND_HOURLY` | ECMWF | ImageCollection | Python, JavaScript, R | Land reanalysis data with temperature, precipitation, etc. |
| MODIS NDVI | Yes | `MODIS/006/MOD13Q1` | NASA | ImageCollection | Python, JavaScript, R | Vegetation index, 250m resolution |
| SRTM Digital Elevation | Yes | `USGS/SRTMGL1_003` | USGS | Image | Python, JavaScript, R | 30m resolution digital elevation model |
| AgERA5 Agricultural Data | No | N/A | Various | TBD | External APIs (Copernicus CDS) | Accessed via Copernicus Climate Data Store API; not an official GEE dataset (a community-maintained asset, `projects/climate-engine-pro/assets/ce-ag-era5-v2/daily`, is used in the AgERA5 example below) |
| CMIP6 Climate Projections | Yes | `NASA/NEX-GDDP-CMIP6` | NASA | ImageCollection | Python, JavaScript | Downscaled CMIP6 climate model projections |
| NASA POWER | No | N/A | NASA | Various | NASA POWER API (external) | Not in GEE; accessible via NASA POWER API without EarthData login |
| NASA Earth Exchange Global Daily Downscaled Projections (NEX-GDDP) | Yes | `NASA/NEX-GDDP` | NASA | ImageCollection | Python, JavaScript | Downscaled CMIP5 climate projections |
| TAMSAT | No | N/A | TAMSAT | Various | Downloads only | Not available in GEE; data accessed via downloads |
| IMERG (Integrated Multi-satellitE Retrievals for GPM) | Yes | `NASA/GPM_L3/IMERG_V06` | NASA | ImageCollection | Python, JavaScript | Satellite precipitation estimates |
| TerraClimate | Yes | `IDAHO_EPSCOR/TERRACLIMATE` | University of Idaho | ImageCollection | Python, JavaScript | Monthly climate and water balance data |
| CHIRTS (CHIRPS + Temperature) | No | N/A | UCSB/CHG | TBD | External APIs (ClimateSERV) or R package chirps | Not available in GEE; accessed via ClimateSERV API or R package chirps |
Notably, while many key datasets such as CHIRPS, ERA5, and TerraClimate are fully integrated into GEE and accessible via its Python and JavaScript APIs, some important datasets like AgERA5 and NASA POWER require external APIs or direct downloads. To demonstrate practical usage and confirm data accessibility, we conducted a test extracting daily precipitation data from the CHIRPS dataset using the Earth Engine Python API, followed by analogous point extractions from ERA5 and from a community-hosted AgERA5 asset. The following code snippets illustrate this process and highlight the ease of programmatic access to high-quality climate data through GEE.
CHIRPS
!pip install earthengine-api
import ee
import datetime
ee.Initialize()
# Load CHIRPS daily dataset
chirps = ee.ImageCollection('UCSB-CHG/CHIRPS/DAILY')
# Define Nairobi point geometry (longitude, latitude)
nairobi = ee.Geometry.Point(36.817223, -1.286389)
# Define date range
start_date = datetime.date(2020, 1, 1)
end_date = datetime.date(2020, 1, 31)
delta = datetime.timedelta(days=1)
current_date = start_date
print("Date, Precipitation (mm)")
while current_date <= end_date:
    next_date = current_date + delta
    # Filter the collection to the single daily image for this date
    image = chirps.filterDate(str(current_date), str(next_date)).first()
    # first() yields a server-side null when no image exists; getInfo() resolves it client-side
    if image.getInfo() is None:
        print(f"{current_date}, No image found")
    else:
        # Extract precipitation at the Nairobi point
        precip = image.reduceRegion(
            reducer=ee.Reducer.mean(),
            geometry=nairobi,
            scale=5566,  # CHIRPS native resolution ~5.55 km
            maxPixels=1e9,
            bestEffort=True
        ).get('precipitation').getInfo()
        print(f"{current_date}, {precip}")
    current_date = next_date
The above code queries the CHIRPS daily precipitation dataset for Nairobi over the month of January 2020. It filters the dataset by date and extracts the mean precipitation value at the specified geographic point for each day. This approach demonstrates how Google Earth Engine enables efficient, programmatic access to high-resolution climate data, facilitating detailed temporal analysis at specific locations relevant to agroecological studies.
Date, Precipitation (mm)
2020-01-01, 0
2020-01-02, 0
2020-01-03, 0
2020-01-04, 0
2020-01-05, 0
2020-01-06, 0
2020-01-07, 19.17416000366211
2020-01-08, 0
2020-01-09, 0
2020-01-10, 0
2020-01-11, 0
2020-01-12, 0
2020-01-13, 28.82002067565918
2020-01-14, 0
2020-01-15, 0
2020-01-16, 0
2020-01-17, 0
2020-01-18, 0
2020-01-19, 0
2020-01-20, 0
2020-01-21, 0
2020-01-22, 0
2020-01-23, 0
2020-01-24, 0
2020-01-25, 34.24251937866211
2020-01-26, 26.745319366455078
2020-01-27, 0
2020-01-28, 0
2020-01-29, 0
2020-01-30, 0
2020-01-31, 26.745319366455078
ERA5
import ee
import datetime
# Initialize the Earth Engine API
ee.Initialize()
# Load ERA5 daily dataset and select the total precipitation band
era5 = ee.ImageCollection('ECMWF/ERA5/DAILY').select('total_precipitation')
# Define Nairobi point geometry (longitude, latitude)
nairobi = ee.Geometry.Point(36.817223, -1.286389)
# Define date range
start_date = datetime.date(2020, 1, 1)
end_date = datetime.date(2020, 1, 31)
delta = datetime.timedelta(days=1)
current_date = start_date
print("Date, Total Precipitation (mm)")
while current_date <= end_date:
    next_date = current_date + delta
    # Filter the collection to the single daily image for this date
    image = era5.filterDate(str(current_date), str(next_date)).first()
    # first() yields a server-side null when no image exists; getInfo() resolves it client-side
    if image.getInfo() is None:
        print(f"{current_date}, No image found")
    else:
        # Extract total precipitation (in meters) at the Nairobi point
        precip_m = image.reduceRegion(
            reducer=ee.Reducer.mean(),
            geometry=nairobi,
            scale=30000,  # ERA5 native resolution ~31 km; 30,000 m used here
            maxPixels=1e9,
            bestEffort=True
        ).get('total_precipitation').getInfo()
        # Convert from meters to millimeters
        precip_mm = precip_m * 1000 if precip_m is not None else None
        print(f"{current_date}, {precip_mm}")
    current_date = next_date
Date, Total Precipitation (mm)
2020-01-01, 0.004870817065238953
2020-01-02, 0.4606954753398895
2020-01-03, 1.0118503123521805
2020-01-04, 0.11597387492656708
2020-01-05, 1.1076070368289948
2020-01-06, 4.25216369330883
2020-01-07, 2.2987890988588333
2020-01-08, 4.17330302298069
2020-01-09, 1.657482236623764
2020-01-10, 6.145646795630455
2020-01-11, 2.6914402842521667
2020-01-12, 5.795121192932129
2020-01-13, 10.143805295228958
2020-01-14, 4.366599023342133
2020-01-15, 1.1030733585357666
2020-01-16, 1.7602518200874329
2020-01-17, 11.607900261878967
2020-01-18, 9.046618826687336
2020-01-19, 0.5304403603076935
2020-01-20, 0.12766756117343903
2020-01-21, 0.15154294669628143
2020-01-22, 1.0277777910232544
2020-01-23, 3.675827756524086
2020-01-24, 0.8690860122442245
2020-01-25, 0.6284154951572418
2020-01-26, 1.7681196331977844
2020-01-27, 2.2757556289434433
2020-01-28, 6.732601672410965
2020-01-29, 9.409807622432709
2020-01-30, 4.783229902386665
2020-01-31, 3.619477152824402
AgERA5
import ee
import datetime
# Initialize the Earth Engine API
ee.Initialize()
# Load AgERA5 daily dataset and select the precipitation band
agera5 = ee.ImageCollection('projects/climate-engine-pro/assets/ce-ag-era5-v2/daily').select('Precipitation_Flux')
# Define Nairobi point geometry (longitude, latitude)
nairobi = ee.Geometry.Point(36.817223, -1.286389)
# Define date range
start_date = datetime.date(2020, 1, 1)
end_date = datetime.date(2020, 1, 31)
delta = datetime.timedelta(days=1)
current_date = start_date
print("Date, Total Precipitation (mm)")
while current_date <= end_date:
    next_date = current_date + delta
    # Filter the image collection to include only images from the current day
    daily_collection = agera5.filterDate(current_date.isoformat(), next_date.isoformat())
    # AgERA5 provides one image per day with daily total precipitation, so the first
    # (and only) image for that day can be used without further aggregation
    daily_image = daily_collection.first()
    # first() yields a server-side null when no image exists; getInfo() resolves it client-side
    if daily_image.getInfo() is None:
        print(f"{current_date}, No image found")
    else:
        # Extract the precipitation value at the Nairobi point
        precip = daily_image.reduceRegion(
            reducer=ee.Reducer.mean(),
            geometry=nairobi,
            scale=10000,  # ~9.6 km native resolution
            maxPixels=1e9,
            bestEffort=True
        ).get('Precipitation_Flux')
        # Resolve the server-side value client-side
        precip_mm = precip.getInfo()
        if precip_mm is None:
            print(f"{current_date}, No data at point")
        else:
            print(f"{current_date}, {precip_mm}")
    current_date = next_date
Date, Total Precipitation (mm)
2020-01-01, 0
2020-01-02, 0.3799999952316284
2020-01-03, 0.7699999809265137
2020-01-04, 0.6100000143051147
2020-01-05, 0.10999999940395355
2020-01-06, 0.3700000047683716
2020-01-07, 2.0399999618530273
2020-01-08, 3.0899999141693115
2020-01-09, 1.909999966621399
2020-01-10, 0.949999988079071
2020-01-11, 0.4000000059604645
2020-01-12, 7.159999847412109
2020-01-13, 4.840000152587891
2020-01-14, 1.850000023841858
2020-01-15, 0.8399999737739563
2020-01-16, 0.8100000023841858
2020-01-17, 4.25
2020-01-18, 0.9900000095367432
2020-01-19, 0.6299999952316284
2020-01-20, 0.10000000149011612
2020-01-21, 0.5
2020-01-22, 0.7799999713897705
2020-01-23, 1.7899999618530273
2020-01-24, 0.6200000047683716
2020-01-25, 0.33000001311302185
2020-01-26, 1.309999942779541
2020-01-27, 5.070000171661377
2020-01-28, 13.260000228881836
2020-01-29, 5.409999847412109
2020-01-30, 8.0600004196167
2020-01-31, 10.65999984741211
Point-Specific Data Extraction in Google Earth Engine (GEE)
The precipitation data used in this analysis are sourced from the CHIRPS daily dataset via Google Earth Engine’s Python API. Data extraction is performed using the reduceRegion() function applied over a single point geometry representing the location of Nairobi.
Key aspects of this approach include:
- Point Geometry: The `ee.Geometry.Point` object ensures that extraction targets a specific geographic coordinate, avoiding larger polygons or arbitrary regions.
- Native Spatial Resolution: The `scale` parameter is set to approximately 5566 meters, which corresponds to the native resolution of the CHIRPS dataset. This ensures that the value retrieved corresponds to the single raster pixel underlying the point.
- Use of Mean Reducer: Applying `ee.Reducer.mean()` over a point geometry effectively returns the precipitation value of that pixel, as no aggregation over multiple pixels occurs.
- No Aggregation Beyond Pixel Level: Since CHIRPS pixels cover roughly 5.5 km by 5.5 km, the extracted value represents the average precipitation over this area intersecting the point, without further spatial averaging or tiling aggregation.
- Best Effort Parameter: The `bestEffort=True` option allows Earth Engine to automatically adjust the processing scale or reduce the resolution when necessary, ensuring the extraction completes within computational limits without compromising the point-specific nature of the data.
- Maximum Pixels Parameter: The `maxPixels=1e9` argument sets a very high ceiling on the number of pixels Earth Engine will process in the region of interest, ensuring that even if rescaling or large requests occur, the computation is not blocked by overly strict pixel limits. For point geometries this ceiling is typically not reached, but it adds robustness for efficient data extraction.
- Validation and Best Practices: This method ensures exact pixel-level data retrieval for the defined points, aligning with Google Earth Engine best practices for point-based raster value extraction. For multiple points or more complex geometries, the `sampleRegions()` or `reduceRegions()` functions could be employed (a minimal sketch follows the summary below).
In summary, the method extracts point-specific precipitation values at the pixel level from CHIRPS data, which provides spatially explicit and temporally continuous climate data suitable for subsequent onset detection and analysis at the selected locations.
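For cases with several locations, the following minimal sketch (not part of the tested workflow) shows how the same daily CHIRPS value could be extracted for multiple points in a single call using reduceRegions(); the Kisumu and Mombasa coordinates are illustrative placeholders, and only the Nairobi point comes from the examples above.
import ee

ee.Initialize()

chirps = ee.ImageCollection('UCSB-CHG/CHIRPS/DAILY')

# Illustrative points; only Nairobi is taken from the examples above
points = ee.FeatureCollection([
    ee.Feature(ee.Geometry.Point(36.817223, -1.286389), {'name': 'Nairobi'}),
    ee.Feature(ee.Geometry.Point(34.761711, -0.091702), {'name': 'Kisumu'}),   # placeholder
    ee.Feature(ee.Geometry.Point(39.668206, -4.043477), {'name': 'Mombasa'}),  # placeholder
])

# Single day from the tested period
image = chirps.filterDate('2020-01-07', '2020-01-08').first()

# reduceRegions() applies the reducer to every feature in one server-side call
sampled = image.reduceRegions(
    collection=points,
    reducer=ee.Reducer.mean(),
    scale=5566  # CHIRPS native resolution ~5.55 km
)

# Print the name and precipitation value for each point
for feature in sampled.getInfo()['features']:
    props = feature['properties']
    print(props['name'], props.get('mean'))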
The table below compares different scale values and their effect on CHIRPS data extraction:
| Scale Value | Effect on CHIRPS Precipitation Data Extraction |
|---|---|
| Approx. 5566 meters | Returns the value of the single native-resolution pixel under the point. |
| Much smaller (e.g., 10m) | Samples the same native pixel repeatedly; no added spatial detail, more compute. |
| Larger than native | Aggregates multiple pixels, returning spatial averages over larger areas. |
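To make the comparison concrete, the following hedged snippet reruns the CHIRPS point extraction at the three scale settings from the table for a single rainy day from the sample output above (2020-01-07); it reuses the same collection and Nairobi point defined in the CHIRPS example and simply prints the value returned at each scale.
import ee

ee.Initialize()

chirps = ee.ImageCollection('UCSB-CHG/CHIRPS/DAILY')
nairobi = ee.Geometry.Point(36.817223, -1.286389)

# One day with non-zero rainfall in the sample output above
image = chirps.filterDate('2020-01-07', '2020-01-08').first()

# Repeat the same point reduction at three different scale settings
for scale in [10, 5566, 30000]:
    value = image.reduceRegion(
        reducer=ee.Reducer.mean(),
        geometry=nairobi,
        scale=scale,  # metres per pixel used for the reduction
        maxPixels=1e9,
        bestEffort=True
    ).get('precipitation').getInfo()
    print(f"scale={scale} m -> precipitation={value} mm")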