For netCDF and IO
netCDF4: recommended if you want to use xarray for reading or writing netCDF files
scipy: used as a fallback for reading/writing netCDF3
pydap: used as a fallback for accessing OPeNDAP
h5netcdf: an alternative library for reading and writing netCDF4 files that does not use the netCDF-C libraries
pynio: for reading GRIB and other geoscience specific file formats
zarr: for chunked, compressed, N-dimensional arrays
cftime: recommended if you want to encode/decode datetimes for non-standard calendars or dates before year 1678 or after year 2262
PseudoNetCDF: recommended for accessing CAMx, GEOS-Chem (bpch), NOAA ARL files, ICARTT files (ffi1001) and many others
rasterio: for reading GeoTIFFs and other gridded raster datasets
iris: for conversion to and from iris' Cube objects
cfgrib: for reading GRIB files via the ECMWF ecCodes library
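Whichever of the libraries above is installed, xarray selects a backend through the engine keyword of open_dataset and to_netcdf. A minimal round-trip sketch, using the scipy engine (the netCDF3 fallback) and an illustrative file name:

```python
import os
import tempfile

import numpy as np
import xarray as xr

# Build a tiny dataset and write it out; ``engine="scipy"`` selects the
# netCDF3 fallback backend. "example.nc" is just an illustrative name.
ds = xr.Dataset({"t": ("x", np.arange(3.0))})
path = os.path.join(tempfile.mkdtemp(), "example.nc")
ds.to_netcdf(path, engine="scipy")

# Reopen with the same backend; other engines ("netcdf4", "h5netcdf", ...)
# work the same way when the corresponding library is installed.
reopened = xr.open_dataset(path, engine="scipy")
```

Omitting engine lets xarray pick the first available backend automatically.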
For accelerating xarray

bottleneck: speeds up NaN-skipping and rolling window aggregations by a large factor

Alternative data containers
sparse: for sparse arrays
pint: for units of measure
At the time of writing, xarray requires a highly experimental version of pint (install with
pip install git+https://github.com/andrewgsavage/pint.git@refs/pull/6/head). Even then, interaction with non-numpy array libraries, e.g. dask or sparse, is broken.
Any numpy-like objects that support NEP-18. Note that while such libraries should work in principle, they are untested; integration tests for individual libraries are in the process of being written.
Minimum dependency versions
xarray adopts a rolling policy regarding the minimum supported version of its dependencies:
Python: 42 months (NEP-29)
numpy: 24 months (NEP-29)
pandas: 12 months
scipy: 12 months
sparse, pint and other libraries that rely on NEP-18 for integration: very latest available versions only, until the technology has matured. This extends to dask when used in conjunction with any of these libraries. numpy >=1.17.
all other libraries: 6 months
The above should be interpreted as the minor version (X.Y) initially published no more than N months ago. Patch versions (x.y.Z) are not pinned, and only the latest available at the moment of publishing the xarray release is guaranteed to work.
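The rolling rule above can be sketched as a small (hypothetical) helper: a minor version qualifies if its initial publication falls within the policy's window, measured in whole months.

```python
from datetime import date


def is_supported(first_release: date, today: date, window_months: int) -> bool:
    """Illustrative check for the rolling policy: a minor version (X.Y)
    qualifies if it was initially published no more than ``window_months``
    months before ``today``. This helper is not part of xarray."""
    elapsed = (today.year - first_release.year) * 12 + (
        today.month - first_release.month
    )
    return elapsed <= window_months
```

For example, with the 12-month pandas window, a minor release first published 17 months ago would no longer be supported, while one published exactly 12 months ago still would be.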
You can see the actual minimum tested versions:
xarray itself is a pure Python package, but its dependencies are not. The easiest way to get everything installed is to use conda. To install xarray with its recommended dependencies using the conda command line tool:
$ conda install xarray dask netCDF4 bottleneck
We recommend using the community maintained conda-forge channel if you need difficult-to-build dependencies such as cartopy, pynio or PseudoNetCDF:
$ conda install -c conda-forge xarray cartopy pynio pseudonetcdf
New releases may also appear in conda-forge before being updated in the default channel.
If you don’t use conda, be sure you have the required dependencies (numpy and pandas) installed first. Then, install xarray with pip:
$ pip install xarray
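If you are unsure which required or optional dependencies are present in your environment, a quick (illustrative) check is to attempt importing them; the module names below are examples drawn from the lists above:

```python
import importlib


def installed(modules):
    """Report which of the given modules can be imported in the
    current environment (True/False per module name)."""
    status = {}
    for name in modules:
        try:
            importlib.import_module(name)
            status[name] = True
        except ImportError:
            status[name] = False
    return status


# Example: check the required dependencies plus a couple of optional backends.
print(installed(["numpy", "pandas", "netCDF4", "zarr"]))
```

Any module reported False can then be installed with pip or conda as described above.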
To run the test suite after installing xarray, install pytest (via pypi or conda) and run
pytest in the root directory of the xarray repository
Fixed-point performance monitoring of (a part of) our code can be seen on this page.
To run these benchmark tests on a local machine, first install
airspeed-velocity: a tool for benchmarking Python packages over their lifetime.
$ asv run  # this will install some conda environments in ./.asv/envs