Birdhouse


Birdhouse is a GitHub organization comprised of Python projects related to Web Processing Services to support climate data analysis.

The full documentation is available on ReadTheDocs and in the docs/ folder.

Overview

Introduction

Birdhouse is a collaborative project open for the community to participate. It is a software framework containing a collection of Web Processing Services (WPS). The deployed algorithms focus on Earth system and environmental data processing, with the philosophy of streamlining software development and deployment. By supporting climate, earth observation and biodiversity data and processes, Birdhouse can be used in a wide array of Earth science projects and workflows.

The core benefit of this project is to allow the seamless use of climate services developed by a diverse network of national meteorological offices, regional climate service providers, academics, not-for-profit research centers and private industry. As governments move toward open-data policies, there will be a need for analytical services that extract value out of the deluge of information. Using an interoperable software architecture, institutions can provide both data and services, allowing users to process the data remotely from a laptop instead of having to acquire and maintain large storage infrastructures.

What is WPS?

Geographic Information Processing for the Web
The Web Processing Service (WPS) offers a simple web-based method of finding, accessing, and using all kinds of calculations and models.

A WPS is a technical solution (WPS Concepts) in which processes are hosted on a server and accessed over the web (Fig. 1). These processes conform to a standardized format, ensuring that they follow the principle of reusable design: they can be instantiated multiple times for different input arguments or data sources, customized following the same structure to handle new inputs, and are modular, hence can be combined to form new processes. In addition, a WPS can be installed close to the data to enable processing directly out of the archive. A WPS can also be linked to a theoretically limitless combination of other WPSs, or OpenGIS Web Services (OWS) in general.

Our understanding of a process is the same as in the OGC standard: ‘any algorithm, calculation or model that either generates new data or transforms some input data into output data’. A submitted process is a job. A service provides a collection of processes containing scientific methods that focus on climate impact and extreme weather events. A combination of processes is called a workflow, and a collection of WPS-related software compartments is a framework. WPS divides the operation into server and client side, with appropriate security in between to avoid misuse.
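
The request pattern described above can be illustrated by building the standard WPS key-value-pair (KVP) request URLs by hand. This is only a sketch: the endpoint URL and the `hello` process identifier are placeholders, and in practice clients such as Birdy or OWSLib construct these requests for you.

```python
from urllib.parse import urlencode

# Hypothetical WPS endpoint; a locally running demo service might use a URL of this shape.
WPS_URL = "http://localhost:8094/wps"

def wps_request_url(base_url, request, identifier=None, version="1.0.0"):
    """Build an OGC WPS 1.0.0 key-value-pair (GET) request URL."""
    params = {"service": "WPS", "version": version, "request": request}
    if identifier:
        params["identifier"] = identifier
    return base_url + "?" + urlencode(params)

# Ask the server which processes it offers:
caps = wps_request_url(WPS_URL, "GetCapabilities")
# Ask for the signature of a single process:
desc = wps_request_url(WPS_URL, "DescribeProcess", identifier="hello")
```

Opening either URL in a browser returns an XML document describing the service or the process; this is exactly what WPS clients parse behind the scenes.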

_images/WPS_principe.png

Note

Read the documentation on Geographic Information Processing for the Web

WPS Use Case

Todo

needs to be updated.

A user runs WPS processes remotely on a machine with direct access to climate data archives.

_images/wps_adamsteer.png

Architecture

Framework Overview

_images/framework.png

ESGF is currently the main climate data resource (but more resources are possible). The ESGF Solr index is used to find ESGF data. The ESGF identity provider with OpenID and X.509 certificates is used for authentication.
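
Queries against the ESGF Solr index go through the ESGF search API. The snippet below is only an illustration of what services like Malleefowl do internally; the index node URL and the facet names are assumptions based on common ESGF usage, not a fixed part of birdhouse.

```python
from urllib.parse import urlencode

# Hypothetical ESGF index node; birdhouse deployments make this configurable.
ESGF_SEARCH = "https://esgf-data.dkrz.de/esg-search/search"

def esgf_query(base_url, **facets):
    """Build a faceted query against the ESGF search API (backed by Solr)."""
    params = {"format": "application/solr+json", "limit": 10}
    params.update(facets)  # e.g. project, variable, time_frequency
    return base_url + "?" + urlencode(params)

url = esgf_query(ESGF_SEARCH, project="CMIP5", variable="tas", time_frequency="mon")
```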

There are several WPS services. Malleefowl is the main one for the Phoenix client. Malleefowl is used to search and download (with caching) ESGF data and to retrieve certificates. Malleefowl also has a workflow engine (dispel4py) to chain WPS processes.

The results of the WPS processes are stored on the file system and are accessible via URL (with a token id).

Results can be shown on a Map using a Web Mapping Service (ncWMS, adagucserver).

The PyCSW Catalog Service is used to register WPS services and also to publish WPS outputs. Results published in PyCSW can also be used again as input sources for processes.

WPS services can be accessed through web applications like Phoenix or from scripts.

Note

See also the Birdhouse Presentation.

Birdhouse is the home of Web Processing Services used in climate science and components to support them (the birds):

Client Side Components

  • Phoenix: a web-based WPS client with ESGF data access
  • Birdy: a WPS command line client and native library

Server Side Components

WPS services for climate data analysis:

  • Flyingpigeon: services for the climate impact community
  • Black Swan: services for the extreme weather event assessments
  • Hummingbird: provides cdo and compliance-checker as a service
  • Emu: some example WPS processes for demo

Many climate analysis operations are implemented using OpenClimateGIS including the python package icclim.

Supporting Services and libraries:

  • Twitcher: an OWS Security Proxy
  • Malleefowl: access to climate data (ESGF, …) as a service
  • Eggshell: provides common functionality for Birdhouse WPS services

You can find the source code of all birdhouse components on GitHub. Docker images with birdhouse components are available on Docker Hub.

Files and Folders

Warning

This section is outdated. We are moving to a new deployment without Buildout.

The birds have a similar folder structure; library dependencies are stored within the conda environments.

Three folder locations have to be pointed out:

  • repository clones: the code fetched by git clone. It is recommended to store the repositories in ~/birdhouse.
  • anaconda: by default, the installation process creates a folder ~/anaconda for general Anaconda-specific software.
  • conda environments: all birds (repositories) are built with their own environment to avoid a mismatch of dependencies. By default, the conda environments are in ~/.conda/envs/.

To change the default settings, create a Makefile.config with:

$ cp Makefile.config.example Makefile.config

and change the paths according to your needs.

Furthermore, the conda packages can be defined in environment.yml. It is recommended to pin the versions. The bird-specific packages are defined here, while general versions are set in requirements/conda_pinned.
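
A hypothetical environment.yml with pinned versions might look like the fragment below; the package names, channels and version numbers are only an illustration, not taken from a real bird.

```yaml
name: emu                 # conda environment of this bird
channels:
  - birdhouse
  - conda-forge
  - defaults
dependencies:
  - python=2.7            # pin versions to get reproducible builds
  - pywps=3.2.5
  - netcdf4=1.2.4
```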

Log files are located at ~/birdhouse/var/log/pywps/.

Guidelines

Here are some general guidelines to guide you through the learning curve of installing birdhouse modules and setting up a running birdhouse ecosystem, administering the server-side birdhouse components, or even improving and developing your own specific functions:

Installation Guidelines

Warning

This section is outdated …

Birdhouse consists of several components like Malleefowl and Emu. Each of them can be installed individually. The installation is done using the Python-based build system Buildout. Most of the dependencies are maintained in the Anaconda Python distribution. For convenience, each birdhouse component has a Makefile to ease the installation so you don’t need to know how to call the Buildout build tool.

Requirements

Birdhouse uses the Anaconda Python distribution for most of the dependencies. If Anaconda is not already installed, it will be installed during the installation process. Anaconda has packages for Linux, MacOSX and Windows. However, not all packages used by birdhouse are available in the default package channel of Anaconda. The missing packages are supplied by birdhouse on Binstar. We currently maintain only packages for Linux 64-bit and partly for MacOSX.

So the short answer to the requirements is: you need a Linux 64-bit installation.

Birdhouse is currently used on Ubuntu 14.04 and CentOS 6.x. It should also work on Debian, LinuxMint and Fedora.

Birdhouse also installs a few system packages using apt-get on Debian based distributions and yum on RedHat/CentOS based distributions. For this you need a user account with sudo permissions. Installing system packages can be done in a separate step. So your installation user does not need any special permissions. All installed files will go into a birdhouse Anaconda environment in the home folder of the installation user.

Installing from source

The installation of birdhouse components from source is done with a few commands. Here is an example for the Emu WPS service:

$ git clone https://github.com/bird-house/emu.git
$ cd emu
$ make clean install
$ make start
$ firefox http://localhost:8094/wps

All the birdhouse components follow the same installation pattern. If you want to see all the options of the Makefile then type:

$ make help

You will find more information about these options in the Makefile documentation.

Read the documentation of each birdhouse component for the details of the installation and how to configure the components. The birdhouse bootstrap documentation gives some examples of the different ways of performing the installation.

On the WPS client side we have:

  • Phoenix: a Pyramid web application.
  • Birdy: a simple WPS command line tool.

On the WPS server side we have:

  • Malleefowl: provides base WPS services to access data.
  • Flyingpigeon: provides WPS services for the climate impact community.
  • Hummingbird: provides WPS services for CDO and climate metadata checks.
  • Emu: just some WPS processes for testing.

Nginx, gunicorn and supervisor

Birdhouse sets up a PyWPS server (and also the Phoenix web application) using Buildout. We use the Gunicorn HTTP application server (similar to Tomcat for Java servlet applications) to run these web applications with the WSGI interface. In front of the Gunicorn application server we use the Nginx HTTP server (similar to the Apache web server). All these web services are started/stopped and monitored by a Supervisor service.
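
Conceptually, Supervisor ends up with one program section per service, similar to this hypothetical fragment (the paths, port and WSGI entry point are placeholders; the real configuration is generated by Buildout):

```ini
[program:emu]
command=/home/user/.conda/envs/emu/bin/gunicorn -b 127.0.0.1:8094 emu:app
autostart=true
autorestart=true
stdout_logfile=/home/user/birdhouse/var/log/supervisor/emu.log
```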

See the following image to get an idea of how this looks:

_images/WsgiApp.png

When installing a birdhouse WPS service, you don’t need to care about this setup. This is all done by Buildout and using some extensions provided by birdhouse.

The Makefile of a birdhouse application has convenience targets to start/stop a WPS service controlled by the Supervisor and to check the status:

$ make start    # start wps service
$ make stop     # stop wps service
$ make status   # show status of wps service
Supervisor status ...
/home/pingu/.conda/envs/birdhouse/bin/supervisorctl status
emu                              RUNNING   pid 25698, uptime 0:00:02
malleefowl                       RUNNING   pid 25702, uptime 0:00:02
mongodb                          RUNNING   pid 25691, uptime 0:00:02
nginx                            RUNNING   pid 25699, uptime 0:00:02
phoenix                          RUNNING   pid 25694, uptime 0:00:02
pycsw                            RUNNING   pid 25700, uptime 0:00:02
tomcat                           RUNNING   pid 25693, uptime 0:00:02

You can also use the Supervisor monitor web service, which by default is available at http://localhost:9001/. The Supervisor monitor app looks like the following screenshot.

_images/supervisor-monitor.png

Using birdhouse with Docker

An alternative way to install and deploy birdhouse Web Processing Services is by using Docker. The birdhouse WPS servers are available as a Docker image on Docker Hub. See an example on how to use them with the Emu WPS Docker image.

Administrator Guidelines

Warning

This section is outdated …

Set up a birdhouse ecosystem server

If you are already familiar with installing a single standalone WPS (follow the Installation Guidelines in the documentation of e.g. Emu), then you are ready to set up a birdhouse containing Flyingpigeon (providing scientific analysis methods), Malleefowl (to search and fetch data) and Phoenix (a graphical interface for a web browser, including a WMS).

General Remarks

  • Check the Requirements of your system!
  • The installation is done as a normal user; root rights cause conflicts.

Clone the Repositories from GitHub

It is recommended to collect the repositories in a separate folder (e.g. birdhouse, but it can have a name of your choice):

$ mkdir birdhouse
$ cd birdhouse

  • fetch the source code:

$ git clone https://github.com/bird-house/flyingpigeon.git
$ git clone https://github.com/bird-house/pyramid-phoenix.git
$ git clone https://github.com/bird-house/malleefowl.git
  • phoenix password

To be able to log into the Phoenix GUI once the services are running, it is necessary to generate a password: go into the pyramid-phoenix folder and run:

$ make passwd

This will automatically write a password hash into pyramid-phoenix/custom.cfg.

  • installation

You can run the installation with default settings. It will create an anaconda environment in your HOME directory and deploy all required software dependencies there. Read the ‘changing the default configuration’ section first if you would like to change the defaults.

In all of the three folders (malleefowl, flyingpigeon and pyramid-phoenix) run:

$ make install

This installation will take some minutes to fetch all dependencies and install them into separate conda environments. With the default settings, the installation creates the following folders:

$ ls ~/anaconda/

contains general software required by Anaconda.

$ ls ~/.conda/envs/

contains the separate environments of the birds for their specific software dependencies.

$ ls ~/birdhouse/var/

contains the local cache for fetched input files, output files and logs. This folder grows (while fetching files and storing job outputs) under productive usage of birdhouse.

  • start the services

In each of the bird folders, run:

$ make start

or:

$ make restart

and to check if the services are running, run:

$ make status
  • launching the Phoenix GUI

If the services are running, you can launch the GUI in a common web browser. By default, phoenix is set to port 8081:

firefox http://localhost:8081

or:

firefox https://localhost:8443/

Now you can log in (upper right corner) with your Phoenix password created previously. Phoenix is just a graphical interface with no more function than looking nice ;-).

  • register a service in the GUI

Your first administrator step is to register Flyingpigeon as a service. To do so, log in with your Phoenix password. In the upper right corner is a tool symbol to open the ‘settings’. Click on ‘Services’ and then ‘Register a Service’.

Flyingpigeon is at port 8093 by default.

The appropriate URL is:

http://localhost:8093/wps

Provide a service title and name as you like, e.g. Service Title: Flyingpigeon, Service Name: flyingpigeon.

Check ‘Service Type’: ‘Web Processing Service’ (default) and register.

Optionally, you can check ‘Public access?’ to allow unregistered users to launch jobs (NOT recommended).

  • launching a job

Now your birdhouse ecosystem is set up. Malleefowl, which was also installed, is already running in the background and will do a lot of work silently. There is no need to register Malleefowl manually!

Launching a job can be performed as a process (Process menu) or with the wizard. To get familiar with the processes provided by each of the birds, read the appropriate documentation for each of the services listed in the overview:

  • changing the default configuration:

The default configuration can be changed by creating a Makefile.config file. There is an example provided to be used:

$ cp Makefile.config.example Makefile.config

and set the appropriate path. You have to do this in all bird repositories.

Furthermore, you might change the hostname (to provide your service to the outside), the ESGF node connection, the ports, or the log level for more/less information in the administrator log files. Here is an example pyramid-phoenix/custom.cfg:

[settings]
hostname = localhost
http-port = 8081
https-port = 8443
log-level = DEBUG
# run 'make passwd' to generate the password hash
phoenix-password = sha256:513....
# generate secret
# python -c "import os; print(''.join('%02x' % ord(x) for x in os.urandom(16)))"
phoenix-secret = d5e8417....30
esgf-search-url = https://esgf-data.dkrz.de/esg-search
wps-url = http://localhost:8091/wps
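
Note that the phoenix-secret one-liner in the comment above is written for Python 2. With Python 3, os.urandom() already returns bytes, and the same 32-character hex secret can be generated more simply (a sketch):

```python
import os

# 16 random bytes rendered as 32 hexadecimal characters,
# suitable as a value for phoenix-secret.
secret = os.urandom(16).hex()
```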
  • Administration HELP:

In case of questions or troubleshooting, feel welcome to join the birdhouse chat and get in contact with the developers directly:

Birdhouse-Chatroom

Backups

See the mongodb documentation on how to backup the database. With the following command you can make a dump of the users collection of the Phoenix database:

$ mongodump --port 27027 --db phoenix_db --collection users


Developer Guidelines

Code of Conduct

Note

Before we start please be aware that contributors to this project are expected to act respectfully toward others in accordance with the OSGeo Code of Conduct.

Contribution Workflow

The Birdhouse project openly welcomes contributions (bug reports, bug fixes, code enhancements/features, etc.). This document outlines some guidelines for contributing to birdhouse. The birdhouse Communication channels are also a great place to connect and participate in the birdhouse community and development, where everybody is welcome to raise questions and discussions.

Here are some basic guides to contribute smoothly to birdhouse:

Source code

The source code of all birdhouse components is available on GitHub. Using the usual git mechanisms, you can fork, clone and pull source code into your repositories for modification and enhancement. Once your improvement is ready, make a pull request to integrate your work into the origin birdhouse repositories.

Note

Please keep your forks close to the origin repositories and don’t forget the pull requests.

Issue tracker

To keep track of contributions and development, please use the issue tracker on GitHub for the corresponding birdhouse component.

Please find the coding guide in the Wiki.

Code Style

A good way to start contributing is to enhance existing code with better or new functions. Birdhouse uses PEP8 checks to ensure a consistent coding style. Currently the following PEP8 rules are enabled in setup.cfg:

[flake8]
ignore=F401,E402
max-line-length=120
exclude=tests

See the flake8 documentation on how to configure further options.

To check the coding style run flake8:

$ flake8 emu   # emu is the folder with python code
# or
$ make pep8    # make calls flake8

To make it easier to write code according to the PEP8 rules, enable PEP8 checking in your editor. In the following we give examples of how to enable code checking in different editors.

Sublime
{
  // set vertical rulers in specified columns.
  "rulers": [79],

  // turn on word wrap for source and text
  // default value is "auto", which means off for source and on for text
  "word_wrap": true,

  // set word wrapping at this column
  // default value is 0, meaning wrapping occurs at window width
  "wrap_width": 79
}

Todo

Add PEP8 instructions for more editors: PyCharm, Kate, Emacs, Vim, Spyder.

Writing functions:

Todo

Guideline for writing functions. Where to place, how to comment.

Writing a WPS process

In birdhouse, we are using the PyWPS implementation of a Web Processing Service. Please read the PyWPS documentation on how to implement a WPS process.
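
The shape of a process can be sketched without PyWPS itself: a process advertises its inputs and outputs (via DescribeProcess) and implements a handler that maps one to the other. The snippet below is plain Python for illustration only, with made-up identifiers; in real PyWPS you subclass its Process class and use its input/output types as described in the PyWPS documentation.

```python
# Metadata a WPS would advertise via DescribeProcess (identifiers are made up):
PROCESS = {
    "identifier": "say_hello",
    "title": "Say Hello",
    "inputs": {"name": "string"},
    "outputs": {"output": "string"},
}

def hello_handler(inputs):
    """Toy handler: turn validated WPS inputs into outputs.

    In PyWPS the handler receives request/response objects instead of
    plain dicts, but the input-to-output mapping is the same idea.
    """
    name = inputs.get("name", "World")
    return {"output": "Hello %s" % name}

result = hello_handler({"name": "Birdhouse"})
```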

Note

To get started quickly, you can try the Emu WPS with some example processes for PyWPS.

Writing tests

Todo

Guideline to write tests. Look at the Emu to see examples.

Writing documentation

Last but not least, a very important point is to write good documentation about your work! Each WPS (bird) has a docs folder for this, where the documentation is written in reStructuredText and generated with Sphinx.

The documentation is automatically published to ReadTheDocs with GitHub webhooks. It is important to keep the Code Style and to write explanations for your functions. There is an auto-api for the documentation of functions.
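
For example, a function with a Sphinx-friendly reStructuredText docstring looks like this (the helper itself is made up; only the docstring style matters):

```python
def celsius_to_kelvin(value):
    """Convert a temperature from degrees Celsius to Kelvin.

    :param value: temperature in degrees Celsius
    :type value: float
    :returns: temperature in Kelvin
    :rtype: float
    """
    return value + 273.15
```

Sphinx extensions such as autodoc pick up these fields and render them in the generated HTML documentation.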

Todo

explanation of enabling Sphinx automatic API documentation.

The main documentation (which you are reading now) is the starting point to get an overview of birdhouse. Each birdhouse component comes with its own Sphinx documentation and is referenced by the main birdhouse document.

Environment with conda

Todo

How to create a conda package

Make your own Bird

If you are familiar with all the chapters above, you are ready to create your own WPS. The WPS in birdhouse are named after birds, so this section gives you a guideline on how to make your own bird. Birds are sorted thematically, so before setting up a new one, make sure it is not already covered and just missing some processes, and be clear about the new theme you would like to provide.

We now have a Cookiecutter template to create a new bird (PyWPS application). It is the recommended and fastest way to create your own bird:

https://github.com/bird-house/cookiecutter-birdhouse

Note

The cookiecutter is brand-new. Please give feedback and help to improve it.

Communication

There are numerous ways to interact with the Birdhouse community: for example, join the chat or follow our blog. We are also present at several conferences, where you can enjoy one of our presentations.

Chat-room

The easiest way to drop a line to the developers is our Gitter chat room. If you have a quick technical question for one of the developers, or just want to follow the discussions, feel welcome to join.

Meetings

More complex discussions take place regularly in video conferences. Check out the information for upcoming birdhouse meetings. There you will also find the minutes of previous video conferences; feel welcome to join an upcoming one.

Blog-post

In the blog you can find interesting articles and information related to birdhouse in general. We also regularly report on the main steps forward in the software development, so that you can keep track of what is going on in the birdhouse. If you want to receive notifications of new articles, follow birdhouse news on our blog.

Newsletter

To be informed about the main progress in the birdhouse development as well as related information you can subscribe to our newsletter.

Wiki

The birdhouse wiki provides an area for supporting information that frequently changes and / or is outside the scope of the formal documentation.

Projects using birdhouse

Birdhouse is used in the following projects:

Todo

add some introduction sentences for each project

COPERNICUS

OGC-Testbeds

Todo

Add references to OGC testbed.

  • OGC Testbed 13: Enhancement of scheduling services
  • OGC Testbed 14: Enhancement of security

PAVICS

  • PAVICS: Platform for climate analysis and visualization by Ouranos and CRIM, Canada.

A2C2

  • A2C2: Atmospheric flow Analogues for Climate Change

Publications

Talks and articles

Articles, book sections and conference proceedings and presentations related to the birdhouse projects:

Todo

Update the bibtex file

2018:

2017:

2016:

2015:

2014:

2013:

  • Gerics Hamburg User-Developer Workshop

References

[HEAC+18]N. Hempelmann, C. Ehbrecht, C. Alvarez-Castro, P. Brockmann, W. Falk, J. Hoffmann, S. Kindermann, B. Koziol, C. Nangini, S. Radanovics, R. Vautard, and P. Yiou. Web processing service for climate impact and extreme weather event analyses. flyingpigeon (version 1.0). Computers & Geosciences, 110(Supplement C):65 – 72, 2018. URL: http://www.sciencedirect.com/science/article/pii/S0098300416302801, doi:https://doi.org/10.1016/j.cageo.2017.10.004.
[JGG+14]C. Jung, M. Gasthuber, A. Giesler, M. Hardt, J. Meyer, F. Rigoll, K. Schwarz, R. Stotzka, and A. Streit. Optimization of data life cycles. Journal of Physics: Conference Series, 513(3):032047, 2014. URL: http://stacks.iop.org/1742-6596/513/i=3/a=032047.
[JMS17]Christopher Jung, Jörg Meyer, and Achim Streit, editors. Helmholtz Portfolio Theme Large-Scale Data Management and Analysis (LSDMA). KIT Scientific Publishing, Karlsruhe, 2017. ISBN 978-3-7315-0695-9. 46.12.02; LK 01. doi:10.5445/KSP/1000071931.

Release Notes

Washington (December 2018, v0.6.1)

Birdhouse will be at the AGU 2018 in Washington, D.C.

Warning

In the making …

Dar es Salaam (September 2018, v0.6.0)

Birdhouse was present at the FOSS4G 2018 in Dar es Salaam.

Highlighted Changes:

  • Ansible playbook to install PyWPS applications.
  • Skipped Buildout deployment … not all birds are converted yet.
  • Updated Cookiecutter template for new deployment.
  • Using PyWPS OpenDAP support.
  • Initial version of Birdy native client.

Released Birds:

  • Ansible Playbook for PyWPS 0.1.0
  • Cookiecutter Template for PyWPS 0.3.0
  • Birdy WPS Client: 0.4.0
  • Emu WPS: 0.9.0
  • Hummingbird WPS: 0.6.0

Maintained Birds with Buildout:

New Birds in the making:

Montréal (March 2018, v0.5.0)

We had a workshop in Montréal with CRIM and Ouranos.

Highlighted Changes:

  • Birdhouse has a Logo :)
  • A Cookiecutter template for Birdhouse WPS birds is available.
  • A new WPS Bird Black Swan for extreme weather event assessments is started by LSCE, Paris. This bird is spawned off Flyingpigeon.
  • A new Python library, Eggshell, was started to provide common base functionality to WPS birds like Flyingpigeon and Black Swan.
  • The Twitcher security proxy now supports X.509 certificates for authentication to WPS services.

Released Birds:

New Birds in the making:

Bonn (August 2016, v0.4.0)

Birdhouse was present at the FOSS4G 2016 in Bonn.

Highlighted Changes:

  • Leaflet map with time-dimension plugin.
  • using twitcher security proxy.
  • using conda environments for each birdhouse compartment.
  • using ansible to deploy birdhouse compartments.
  • added weather-regimes and analogs detection processes.
  • allow upload of files to processes.
  • updated Phoenix user interface.

Paris (October 2015, v0.3.0)

  • updated documents on readthedocs
  • OAuth2 used for login with GitHub, Ceda, …
  • LDAP support for login
  • using ncWMS and adagucwms
  • register and use Thredds catalogs as data source
  • publish local netcdf files and Thredds catalogs to birdhouse Solr
  • quality check processes added (cfchecker, qa-dkrz)
  • generation of docker images for each birdhouse component
  • using dispel4py as workflow engine in Malleefowl
  • using Celery task scheduler/queue to run and monitor WPS processes
  • improved Phoenix web client
  • using birdy wps command line client

Paris (September 2014, v0.2.0)

  • Phoenix UI as WPS client with ESGF faceted search component and a wizard to chain WPS processes
  • PyWPS based processing backend with supporting processes of Malleefowl
  • WMS service (included in Thredds) for visualization of NetCDF files
  • OGC CSW catalog service for published results and OGC WPS services
  • ESGF data access with wget and OpenID
  • Caching of accessed files from ESGF Nodes and Catalog Service
  • WPS processes: cdo, climate-indices, ensemble data visualization, demo processes
  • IPython environment for WPS processes
  • initial unit tests for WPS processes
  • Workflow engine Restflow for running processing chains. Currently there is only a simple workflow used: get data with wget - process data.
  • Installation based on anaconda and buildout
  • buildout recipes (birdhousebuilder) available on PyPI to simplify installation and configuration of multiple WPS server
  • Monitoring of all used services (WPS, WMS, CSW, Phoenix) with supervisor
  • moved source code and documentation to birdhouse on GitHub

Helsinki (May 2014, v0.1.2)

  • presentation of birdhouse at EGI, Helsinki
  • stabilized birdhouse and CSC processes
  • updated documentation and tutorials

Vienna (April 2014, v0.1.1)

  • presentation of birdhouse at EGU, Vienna.
  • “quality check” workflow for CORDEX data.

Hamburg (December 2013, v0.1.0)

  • First presentation of Birdhouse at GERICS (German Climate Service Center), Hamburg.

License

Birdhouse is Open Source and released under the Apache License, Version 2.0.

Copyright [2014-2017] [Carsten Ehbrecht]

Licensed under the Apache License, Version 2.0 (the “License”);
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an “AS IS” BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

Glossary

Anaconda
Anaconda Python distribution
Python distribution for large-scale data processing, predictive analytics, and scientific computing. https://www.continuum.io/
Binstar
Anaconda Server
Anaconda cloud
Binstar is a service that allows you to create and manage public and private Anaconda package repositories. https://anaconda.org/ https://docs.continuum.io/
Bokeh
Bokeh is a Python interactive visualization library that targets modern web browsers for presentation. Its goal is to provide elegant, concise construction of novel graphics in the style of D3.js, but also deliver this capability with high-performance interactivity over very large or streaming datasets. http://bokeh.pydata.org/en/latest/
Buildout
Buildout is a Python-based build system for creating, assembling and deploying applications from multiple parts, some of which may be non-Python-based. It lets you create a buildout configuration and reproduce the same software later. http://www.buildout.org/en/latest/
CDO
Climate Data Operators
CDO is a collection of command line Operators to manipulate and analyse Climate and NWP model Data. https://code.zmaw.de/projects/cdo
cfchecker
The NetCDF Climate Forecast Conventions compliance checker. https://pypi.python.org/pypi/cfchecker
climate indice
A climate index is a calculated value that can be used to describe the state and the changes in the climate system. http://icclim.readthedocs.io/en/latest/intro.html#climate-indices-label
CMIP5
In climatology, the Coupled Model Intercomparison Project (CMIP) is a framework and the analog of the Atmospheric Model Intercomparison Project (AMIP) for global coupled ocean-atmosphere general circulation models. https://en.wikipedia.org/wiki/Coupled_model_intercomparison_project
Conda
The conda command is the primary interface for managing Anaconda installations. http://conda.pydata.org/docs/index.html
CORDEX
The CORDEX vision is to advance and coordinate the science and application of regional climate downscaling through global partnerships. http://www.cordex.org/
COWS
The COWS Web Processing Service (WPS) is a generic web service and offline processing tool developed within the Centre for Environmental Data Archival (CEDA). http://cows.ceda.ac.uk/cows_wps.html
CSW
Catalog Service
Catalog Service for the Web (CSW), sometimes seen as Catalog Service - Web, is a standard for exposing a catalogue of geospatial records in XML on the Internet (over HTTP). The catalogue is made up of records that describe geospatial data (e.g. KML), geospatial services (e.g. WMS), and related resources. https://en.wikipedia.org/wiki/Catalog_Service_for_the_Web
Dispel4py
Dispel4Py is a Python library for describing abstract workflows for distributed data-intensive applications. http://www2.epcc.ed.ac.uk/~amrey/VERCE/Dispel4Py/index.html
Docker
Docker - An open platform for distributed applications for developers and sysadmins. https://www.docker.com/
Docker Hub
Docker Hub manages the lifecycle of distributed apps with cloud services for building and sharing containers and automating workflows. https://hub.docker.com/
Emu
Emu is a Python package with some test processes for Web Processing Services. http://emu.readthedocs.io/en/latest/
ESGF
Earth System Grid Federation
An open source effort providing a robust, distributed data and computation platform, enabling world wide access to Peta/Exa-scale scientific data. http://esgf.llnl.gov/
GeoPython
GitHub organisation of Python projects related to geospatial. https://geopython.github.io/
GeoServer
GeoServer is an open source software server written in Java that allows users to share and edit geospatial data. http://docs.geoserver.org/stable/en/user/index.html
GitHub
GitHub is a web-based Git repository hosting service. https://github.com/ https://en.wikipedia.org/wiki/GitHub
Gunicorn
Gunicorn "Green Unicorn" is a Python WSGI HTTP Server for UNIX. http://gunicorn.org/
Homebrew
The missing package manager for OS X. http://brew.sh/
ICCLIM
Indice Calculation CLIMate
ICCLIM (Indice Calculation CLIMate) is a Python library for computing a number of climate indices. http://icclim.readthedocs.io/en/latest/
Linuxbrew
Linuxbrew is a fork of Homebrew, the Mac OS package manager, for Linux. http://brew.sh/linuxbrew/
Malleefowl
Malleefowl is a Python package to simplify the usage of Web Processing Services. http://malleefowl.readthedocs.io/en/latest/
NetCDF
NetCDF (Network Common Data Form) is a set of software libraries and self-describing, machine-independent data formats that support the creation, access, and sharing of array-oriented scientific data. https://en.wikipedia.org/wiki/NetCDF
Nginx
nginx [engine x] is an HTTP and reverse proxy server. http://nginx.org/
ocgis
OpenClimateGIS
OpenClimateGIS (OCGIS) is a Python package designed for geospatial manipulation, subsetting, computation, and translation of climate datasets stored in local NetCDF files or files served through THREDDS data servers. https://www.earthsystemcog.org/projects/openclimategis/ https://github.com/NCPP/ocgis
OGC
Open Geospatial Consortium
The Open Geospatial Consortium (OGC) is an international voluntary consensus standards organization that originated in 1994. https://en.wikipedia.org/wiki/Open_Geospatial_Consortium, http://www.opengeospatial.org/standards/wps
OpenID
OpenID (OID) is an open standard and decentralized protocol by the non-profit OpenID Foundation that allows users to be authenticated by certain co-operating sites (known as Relying Parties or RP) using a third party service. https://en.wikipedia.org/wiki/OpenID, http://openid.net/
OWSLib
OWSLib is a Python package for client programming with Open Geospatial Consortium web service interface standards and their related content models. OWSLib includes a WPS client library, which Birdhouse uses to access WPS services. http://geopython.github.io/OWSLib/, http://geopython.github.io/OWSLib/#wps
Phoenix
Pyramid Phoenix is a web application built with the Python web framework Pyramid. Phoenix provides a user interface that makes it easier to interact with Web Processing Services. http://pyramid-phoenix.readthedocs.io/en/latest
PyCSW
pycsw is an OGC CSW server implementation written in Python. Started in 2010 (more formally announced in 2011), pycsw allows for the publishing and discovery of geospatial metadata, providing a standards-based metadata and catalogue component of spatial data infrastructures. http://pycsw.org/, https://github.com/geopython/pycsw
PyPI
Python Package Index
The Python Package Index is a repository of software for the Python programming language. https://pypi.python.org/pypi
Pyramid
Pyramid is a Python web framework. http://www.pylonsproject.org/
PyWPS
Python Web Processing Service is an implementation of the Web Processing Service standard from the Open Geospatial Consortium. http://pywps.org/
RestFlow
RestFlow is a dataflow programming language and runtime engine designed to make it easy for scientists to build and execute computational pipelines. https://github.com/restflow-org/restflow/wiki
Supervisor
Supervisor is a client/server system that allows its users to monitor and control a number of processes on UNIX-like operating systems. http://supervisord.org/
Taverna
Taverna is an open source and domain-independent Workflow Management System – a suite of tools used to design and execute scientific workflows. http://www.taverna.org.uk/
TDS
THREDDS
The THREDDS Data Server (TDS) is a web server that provides metadata and data access for scientific datasets, using a variety of remote data access protocols. http://www.unidata.ucar.edu/software/thredds/current/tds/
VisTrails
VisTrails is an open-source scientific workflow and provenance management system that supports data exploration and visualization. http://www.vistrails.org/index.php/Main_Page
WMS
Web Map Service
A Web Map Service (WMS) is a standard protocol for serving georeferenced map images over the Internet that are generated by a map server using data from a GIS database. https://en.wikipedia.org/wiki/Web_Map_Service
Workflow
Workflow Management System
A workflow management system (WfMS) is a software system for the set-up, performance and monitoring of a defined sequence of tasks, arranged as a workflow. https://en.wikipedia.org/wiki/Workflow_management_system
WPS
Web Processing Service
WPS (Web Processing Service) is an open OGC standard for describing, discovering, and running processes over a simple web-based interface. See: Wordcounter Example.
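As a concrete sketch of that web-based interface: a WPS client issues plain HTTP requests such as GetCapabilities (to discover processes) and Execute (to run one). The endpoint URL and the ``wordcounter`` process below are hypothetical placeholders, not a real Birdhouse service:

```python
from urllib.parse import urlencode

# Hypothetical WPS endpoint -- replace with a real service URL.
WPS_URL = "http://localhost:8094/wps"

def wps_request_url(request, **params):
    """Build a WPS key-value-pair (KVP) request URL per OGC WPS 1.0.0."""
    query = {"service": "WPS", "version": "1.0.0", "request": request}
    query.update(params)
    return WPS_URL + "?" + urlencode(query)

# Discover which processes the server offers:
caps = wps_request_url("GetCapabilities")

# Run a process (e.g. a word counter) with one literal input:
run = wps_request_url("Execute", identifier="wordcounter",
                      DataInputs="text=hello world")
```

In practice a client library such as OWSLib builds and parses these requests for you; the sketch only shows the KVP encoding that travels over the wire.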
WSGI
WSGI (Web Server Gateway Interface) is an interface specification by which a web server and a Python web application communicate. http://wsgi.tutorial.codepoint.net/
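To illustrate the interface: a WSGI application is just a callable taking an environment dict and a ``start_response`` function. A minimal sketch using only the standard library, invoked directly the way a server such as Gunicorn would invoke it:

```python
from wsgiref.util import setup_testing_defaults

def application(environ, start_response):
    """A minimal WSGI application: the callable every WSGI server invokes."""
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"Hello from WSGI\n"]

# Call the application directly, as a WSGI server would:
environ = {}
setup_testing_defaults(environ)  # fill in a plausible CGI-style environment
collected = {}

def start_response(status, headers):
    collected["status"] = status
    collected["headers"] = headers

body = b"".join(application(environ, start_response))
```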
x509
In cryptography, X.509 is an ITU-T standard for a public key infrastructure (PKI) and Privilege Management Infrastructure (PMI). https://en.wikipedia.org/wiki/X.509
XML-RPC
XML-RPC is a specification and a set of implementations that allow software running on disparate operating systems and in different environments to make procedure calls over the Internet. http://xmlrpc.scripting.com/default.html
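A short self-contained sketch of the idea using Python's standard library: a toy XML-RPC server exposes an ``add`` procedure, and a client calls it over HTTP as if it were a local function:

```python
import threading
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

# Start a toy XML-RPC server on a free local port.
server = SimpleXMLRPCServer(("127.0.0.1", 0), logRequests=False)
server.register_function(lambda a, b: a + b, "add")
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# Call the remote procedure over HTTP as if it were local.
proxy = ServerProxy(f"http://127.0.0.1:{port}/")
result = proxy.add(2, 3)

server.shutdown()
```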