API reference

This page provides an auto-generated summary of xarray’s API. For more details and examples, refer to the relevant chapters in the main part of the documentation.

Top-level functions

apply_ufunc(func, *args, …) Apply a vectorized function for unlabeled arrays on xarray objects.
align(*objects[, join, copy, indexes, exclude]) Given any number of Dataset and/or DataArray objects, returns new objects with aligned indexes and dimension sizes.
broadcast(*args, **kwargs) Explicitly broadcast any number of DataArray or Dataset objects against one another.
concat(objs[, dim, data_vars, coords, …]) Concatenate xarray objects along a new or existing dimension.
merge(objects[, compat, join]) Merge any number of xarray objects into a single Dataset as variables.
where(cond, x, y) Return elements from x or y depending on cond.
set_options(**kwargs) Set options for xarray in a controlled context.
full_like(other, fill_value[, dtype]) Return a new object with the same shape and type as a given object.
zeros_like(other[, dtype]) Shorthand for full_like(other, 0, dtype)
ones_like(other[, dtype]) Shorthand for full_like(other, 1, dtype)
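A minimal sketch of a few of these top-level functions together (concat along a new dimension, elementwise selection with where, and zeros_like):

```python
import xarray as xr

# Two 1-d DataArrays sharing the dimension name "x"
a = xr.DataArray([1, 2, 3], dims="x", name="a")
b = xr.DataArray([4, 5, 6], dims="x", name="a")

# concat along a *new* dimension (here called "run")
stacked = xr.concat([a, b], dim="run")

# where: take values from a where the condition holds, else 0
clipped = xr.where(a > 1, a, 0)

# zeros_like preserves shape, dims and (by default) dtype
z = xr.zeros_like(a)
```

Here `stacked` has shape (2, 3) and `clipped` holds [0, 2, 3].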

Dataset

Creating a dataset

Dataset([data_vars, coords, attrs, compat]) A multi-dimensional, in-memory array database.
decode_cf(obj[, concat_characters, …]) Decode the given Dataset or Datastore according to CF conventions into a new Dataset.
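For example, a small Dataset built from scratch: data_vars maps variable names to (dims, values) pairs, and coords labels the dimensions.

```python
import numpy as np
import xarray as xr

ds = xr.Dataset(
    data_vars={
        "temperature": (("time", "city"), np.array([[15.0, 20.0], [16.0, 21.0]])),
    },
    coords={"time": [0, 1], "city": ["basel", "zurich"]},
    attrs={"source": "synthetic example"},
)
```

The variable and city names here are arbitrary illustration values.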

Attributes

Dataset.dims Mapping from dimension names to lengths.
Dataset.sizes Mapping from dimension names to lengths.
Dataset.data_vars Dictionary of xarray.DataArray objects corresponding to data variables.
Dataset.coords Dictionary of xarray.DataArray objects corresponding to coordinate variables.
Dataset.attrs Dictionary of global attributes on this dataset.
Dataset.encoding Dictionary of global encoding attributes on this dataset.
Dataset.indexes OrderedDict of pandas.Index objects used for label-based indexing.
Dataset.get_index(key) Get an index for a dimension, with fall-back to a default RangeIndex

Dictionary interface

Datasets implement the mapping interface with keys given by variable names and values given by DataArray objects.

Dataset.__getitem__(key) Access variables or coordinates of this dataset as a DataArray.
Dataset.__setitem__(key, value) Add an array to this dataset.
Dataset.__delitem__(key) Remove a variable from this dataset.
Dataset.update(other[, inplace]) Update this dataset’s variables with those from another dataset.
Dataset.items(…) Iterate over (variable name, DataArray) pairs.
Dataset.values(…) Iterate over the DataArray objects in this dataset.
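A short sketch of the mapping interface in action:

```python
import xarray as xr

ds = xr.Dataset({"a": ("x", [1, 2, 3])})

# __getitem__ returns one variable as a DataArray
da = ds["a"]

# __setitem__ adds a new variable (reusing the existing dimension "x")
ds["b"] = ("x", [10, 20, 30])

# Iteration and membership follow the mapping protocol
names = list(ds)

# __delitem__ removes a variable again
del ds["b"]
```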

Dataset contents

Dataset.copy([deep]) Returns a copy of this dataset.
Dataset.assign(**kwargs) Assign new data variables to a Dataset, returning a new object with all the original variables in addition to the new ones.
Dataset.assign_coords(**kwargs) Assign new coordinates to this object.
Dataset.assign_attrs(*args, **kwargs) Assign new attrs to this object.
Dataset.pipe(func, *args, **kwargs) Apply func(self, *args, **kwargs)
Dataset.merge(other[, inplace, …]) Merge the arrays of two datasets into a single dataset.
Dataset.rename(name_dict[, inplace]) Returns a new object with renamed variables and dimensions.
Dataset.swap_dims(dims_dict[, inplace]) Returns a new object with swapped dimensions.
Dataset.expand_dims(dim[, axis]) Return a new object with an additional axis (or axes) inserted at the corresponding position in the array shape.
Dataset.drop(labels[, dim]) Drop variables or index labels from this dataset.
Dataset.set_coords(names[, inplace]) Given names of one or more variables, set them as coordinates
Dataset.reset_coords([names, drop, inplace]) Given names of coordinates, reset them to become variables
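All of these content-manipulation methods return new objects rather than modifying in place, so they chain naturally. A minimal sketch:

```python
import xarray as xr

ds = xr.Dataset({"u": ("x", [1.0, 2.0])}, coords={"x": [10, 20]})

ds2 = ds.assign(v=ds["u"] * 2)        # add a derived data variable
ds3 = ds2.rename({"u": "velocity"})   # rename a variable
ds4 = ds3.assign_attrs(units="m/s")   # attach metadata

# The original dataset is untouched: "v" exists only on the new objects
```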

Comparisons

Dataset.equals(other) Two Datasets are equal if they have matching variables and coordinates, all of which are equal.
Dataset.identical(other) Like equals, but also checks all dataset attributes and the attributes on all variables and coordinates.
Dataset.broadcast_equals(other) Two Datasets are broadcast equal if they are equal after broadcasting all variables against each other.
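The three comparison methods form a hierarchy of strictness, illustrated here: equals ignores attributes, identical does not, and broadcast_equals tolerates size-1 dimensions.

```python
import xarray as xr

a = xr.Dataset({"t": ("x", [1, 2])})
b = xr.Dataset({"t": ("x", [1, 2])}, attrs={"note": "copy"})

assert a.equals(b)          # same variables and values
assert not a.identical(b)   # attrs differ, so not identical

# c carries an extra dimension of size 1; after broadcasting it matches a
c = xr.Dataset({"t": (("x", "y"), [[1], [2]])})
assert a.broadcast_equals(c)
```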

Indexing

Dataset.loc Attribute for location based indexing.
Dataset.isel([drop]) Returns a new dataset with each array indexed along the specified dimension(s).
Dataset.sel([method, tolerance, drop]) Returns a new dataset with each array indexed by tick labels along the specified dimension(s).
Dataset.squeeze([dim, drop]) Return a new object with squeezed data.
Dataset.reindex([indexers, method, …]) Conform this object onto a new set of indexes, filling in missing values with NaN.
Dataset.reindex_like(other[, method, …]) Conform this object onto the indexes of another object, filling in missing values with NaN.
Dataset.set_index([append, inplace]) Set Dataset (multi-)indexes using one or more existing coordinates or variables.
Dataset.reset_index(dims_or_levels[, drop, …]) Reset the specified index(es) or multi-index level(s).
Dataset.reorder_levels([inplace]) Rearrange index levels using input order.
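A sketch of positional versus label-based indexing; note that label-based slices with sel are inclusive of both endpoints, unlike plain Python slicing.

```python
import xarray as xr

ds = xr.Dataset(
    {"t": (("time", "city"), [[15.0, 20.0], [16.0, 21.0]])},
    coords={"time": [2000, 2001], "city": ["basel", "zurich"]},
)

by_position = ds.isel(time=0)            # integer indexing
by_label = ds.sel(city="zurich")         # label indexing
window = ds.sel(time=slice(2000, 2001))  # inclusive label slice
```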

Computation

Dataset.apply(func[, keep_attrs, args]) Apply a function over the data variables in this dataset.
Dataset.reduce(func[, dim, keep_attrs, …]) Reduce this dataset by applying func along some dimension(s).
Dataset.groupby(group[, squeeze]) Returns a GroupBy object for performing grouped operations.
Dataset.groupby_bins(group, bins[, right, …]) Returns a GroupBy object for performing grouped operations.
Dataset.rolling([min_periods, center]) Rolling window object.
Dataset.resample([freq, dim, how, skipna, …]) Returns a Resample object for performing resampling operations.
Dataset.diff(dim[, n, label]) Calculate the n-th order discrete difference along given axis.
Dataset.quantile(q[, dim, interpolation, …]) Compute the qth quantile of the data along the specified dimension.

Aggregation: all any argmax argmin max mean median min prod sum std var

Missing values: isnull notnull count dropna fillna where

ndarray methods: argsort clip conj conjugate imag round real cumsum cumprod

Grouped operations: assign assign_coords first last fillna where
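A sketch combining a plain aggregation with a grouped reduction over a non-dimension coordinate (the "season" labels here are illustrative):

```python
import xarray as xr

ds = xr.Dataset(
    {"precip": ("time", [1.0, 3.0, 2.0, 4.0])},
    coords={"time": [0, 1, 2, 3], "season": ("time", ["wet", "wet", "dry", "dry"])},
)

# Aggregate over a dimension
total = ds.sum(dim="time")

# Grouped reduction: one mean per season
means = ds.groupby("season").mean()
```

The wet-season mean is 2.0 and the dry-season mean is 3.0.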

Reshaping and reorganizing

Dataset.transpose(*dims) Return a new Dataset object with all array dimensions transposed.
Dataset.stack(**dimensions) Stack any number of existing dimensions into a single new dimension.
Dataset.unstack(dim) Unstack an existing dimension corresponding to a MultiIndex into multiple new dimensions.
Dataset.shift(**shifts) Shift this dataset by an offset along one or more dimensions.
Dataset.roll(**shifts) Roll this dataset by an offset along one or more dimensions.
Dataset.sortby(variables[, ascending]) Sort object by labels or values (along an axis).
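stack and unstack are inverses via a pandas MultiIndex, as this sketch shows:

```python
import xarray as xr

ds = xr.Dataset(
    {"t": (("x", "y"), [[1, 2], [3, 4]])},
    coords={"x": [0, 1], "y": ["a", "b"]},
)

# Collapse x and y into a single MultiIndex dimension "z"
flat = ds.stack(z=("x", "y"))

# unstack restores the original two dimensions
round_trip = flat.unstack("z")
```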

DataArray

DataArray(data[, coords, dims, name, attrs, …]) N-dimensional array with labeled coordinates and dimensions.
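For example, wrapping a NumPy array with labeled dimensions and coordinates (the names here are arbitrary):

```python
import numpy as np
import xarray as xr

da = xr.DataArray(
    np.arange(6).reshape(2, 3),
    dims=("row", "col"),
    coords={"row": ["r0", "r1"], "col": [10, 20, 30]},
    name="example",
    attrs={"units": "counts"},
)
```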

Attributes

DataArray.values The array’s data as a numpy.ndarray
DataArray.data The array’s data as a dask or numpy array
DataArray.coords Dictionary-like container of coordinate arrays.
DataArray.dims Tuple of dimension names associated with this array.
DataArray.sizes Ordered mapping from dimension names to lengths.
DataArray.name The name of this array.
DataArray.attrs Dictionary storing arbitrary metadata with this array.
DataArray.encoding Dictionary of format-specific settings for how this array should be serialized.
DataArray.indexes OrderedDict of pandas.Index objects used for label-based indexing.
DataArray.get_index(key) Get an index for a dimension, with fall-back to a default RangeIndex

ndarray attributes: ndim shape size dtype

DataArray contents

DataArray.assign_coords(**kwargs) Assign new coordinates to this object.
DataArray.assign_attrs(*args, **kwargs) Assign new attrs to this object.
DataArray.pipe(func, *args, **kwargs) Apply func(self, *args, **kwargs)
DataArray.rename(new_name_or_name_dict) Returns a new DataArray with renamed coordinates or a new name.
DataArray.swap_dims(dims_dict) Returns a new DataArray with swapped dimensions.
DataArray.expand_dims(dim[, axis]) Return a new object with an additional axis (or axes) inserted at the corresponding position in the array shape.
DataArray.drop(labels[, dim]) Drop coordinates or index labels from this DataArray.
DataArray.reset_coords([names, drop, inplace]) Given names of coordinates, reset them to become variables.
DataArray.copy([deep]) Returns a copy of this array.

ndarray methods: astype item

Indexing

DataArray.__getitem__(key)
DataArray.__setitem__(key, value)
DataArray.loc Attribute for location based indexing like pandas.
DataArray.isel([drop]) Return a new DataArray whose dataset is given by integer indexing along the specified dimension(s).
DataArray.sel([method, tolerance, drop]) Return a new DataArray whose dataset is given by selecting index labels along the specified dimension(s).
DataArray.squeeze([dim, drop]) Return a new object with squeezed data.
DataArray.reindex([method, tolerance, copy]) Conform this object onto a new set of indexes, filling in missing values with NaN.
DataArray.reindex_like(other[, method, …]) Conform this object onto the indexes of another object, filling in missing values with NaN.
DataArray.set_index([append, inplace]) Set DataArray (multi-)indexes using one or more existing coordinates.
DataArray.reset_index(dims_or_levels[, …]) Reset the specified index(es) or multi-index level(s).
DataArray.reorder_levels([inplace]) Rearrange index levels using input order.

Comparisons

DataArray.equals(other) True if two DataArrays have the same dimensions, coordinates and values; otherwise False.
DataArray.identical(other) Like equals, but also checks the array name and attributes, and attributes on all coordinates.
DataArray.broadcast_equals(other) Two DataArrays are broadcast equal if they are equal after broadcasting them against each other such that they have the same dimensions.

Computation

DataArray.reduce(func[, dim, axis, keep_attrs]) Reduce this array by applying func along some dimension(s).
DataArray.groupby(group[, squeeze]) Returns a GroupBy object for performing grouped operations.
DataArray.groupby_bins(group, bins[, right, …]) Returns a GroupBy object for performing grouped operations.
DataArray.rolling([min_periods, center]) Rolling window object.
DataArray.resample([freq, dim, how, skipna, …]) Returns a Resample object for performing resampling operations.
DataArray.get_axis_num(dim) Return axis number(s) corresponding to dimension(s) in this array.
DataArray.diff(dim[, n, label]) Calculate the n-th order discrete difference along given axis.
DataArray.dot(other) Perform dot product of two DataArrays along their shared dims.
DataArray.quantile(q[, dim, interpolation, …]) Compute the qth quantile of the data along the specified dimension.

Aggregation: all any argmax argmin max mean median min prod sum std var

Missing values: isnull notnull count dropna fillna where

ndarray methods: argsort clip conj conjugate imag searchsorted round real T cumsum cumprod

Grouped operations: assign_coords first last fillna where
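A sketch of diff and rolling on a 1-d array (min_periods=1 keeps the first, partial window rather than dropping it to NaN):

```python
import xarray as xr

da = xr.DataArray([1.0, 2.0, 4.0, 7.0], dims="time", name="signal")

# First-order discrete difference along "time"
deltas = da.diff("time")

# Rolling mean with a window of 2
smooth = da.rolling(time=2, min_periods=1).mean()
```

deltas holds [1, 2, 3] and smooth holds [1, 1.5, 3, 5.5].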

Reshaping and reorganizing

DataArray.transpose(*dims) Return a new DataArray object with transposed dimensions.
DataArray.stack(**dimensions) Stack any number of existing dimensions into a single new dimension.
DataArray.unstack(dim) Unstack an existing dimension corresponding to a MultiIndex into multiple new dimensions.
DataArray.shift(**shifts) Shift this array by an offset along one or more dimensions.
DataArray.roll(**shifts) Roll this array by an offset along one or more dimensions.
DataArray.sortby(variables[, ascending]) Sort object by labels or values (along an axis).

Universal functions

These functions are copied from NumPy, but extended to work on NumPy arrays, dask arrays and all xarray objects. You can find them in the xarray.ufuncs module:

angle arccos arccosh arcsin arcsinh arctan arctan2 arctanh ceil conj copysign cos cosh deg2rad degrees exp expm1 fabs fix floor fmax fmin fmod frexp hypot imag iscomplex isfinite isinf isnan isreal ldexp log log10 log1p log2 logaddexp logaddexp2 logical_and logical_not logical_or logical_xor maximum minimum nextafter rad2deg radians real rint sign signbit sin sinh sqrt square tan tanh trunc
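These wrappers behave like their NumPy namesakes while preserving labels. Note that applying the plain NumPy ufunc directly to a DataArray also keeps dims and coords intact, as this sketch shows:

```python
import numpy as np
import xarray as xr

da = xr.DataArray([0.0, np.pi / 2], dims="x", coords={"x": [0, 1]})

# The result is still a labeled DataArray, not a bare ndarray
result = np.sin(da)
```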

IO / Conversion

Dataset methods

open_dataset(filename_or_obj[, group, …]) Load and decode a dataset from a file or file-like object.
open_mfdataset(paths[, chunks, concat_dim, …]) Open multiple files as a single dataset.
open_rasterio(filename[, chunks, cache, lock]) Open a file with rasterio (experimental).
Dataset.to_netcdf([path, mode, format, …]) Write dataset contents to a netCDF file.
save_mfdataset(datasets, paths[, mode, …]) Write multiple datasets to disk as netCDF files simultaneously.
Dataset.to_array([dim, name]) Convert this dataset into an xarray.DataArray
Dataset.to_dataframe() Convert this dataset into a pandas.DataFrame.
Dataset.to_dict() Convert this dataset to a dictionary following xarray naming conventions.
Dataset.from_dataframe(dataframe) Convert a pandas.DataFrame into an xarray.Dataset
Dataset.from_dict(d) Convert a dictionary into an xarray.Dataset.
Dataset.close() Close any files linked to this object
Dataset.compute(**kwargs) Manually trigger loading of this dataset’s data from disk or a remote source into memory and return a new dataset.
Dataset.persist(**kwargs) Trigger computation, keeping data as dask arrays
Dataset.load(**kwargs) Manually trigger loading of this dataset’s data from disk or a remote source into memory and return this dataset.
Dataset.chunk([chunks, name_prefix, token, lock]) Coerce all arrays in this dataset into dask arrays with the given chunks.
Dataset.filter_by_attrs(**kwargs) Returns a Dataset with variables that match specific conditions.
Dataset.info([buf]) Concise summary of a Dataset variables and attributes.
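A small conversion sketch. (to_netcdf requires a netCDF backend library to be installed; the to_dict/from_dict round trip below needs no external writer.)

```python
import xarray as xr

ds = xr.Dataset({"a": ("x", [1, 2])}, coords={"x": [10, 20]})

# Serialize to a plain dictionary and back
d = ds.to_dict()
restored = xr.Dataset.from_dict(d)

# Convert to a pandas.DataFrame indexed by the "x" coordinate
df = ds.to_dataframe()
```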

DataArray methods

open_dataarray(*args, **kwargs) Open a DataArray from a netCDF file containing a single data variable.
DataArray.to_dataset([dim, name]) Convert a DataArray to a Dataset.
DataArray.to_netcdf(*args, **kwargs) Write DataArray contents to a netCDF file.
DataArray.to_pandas() Convert this array into a pandas object with the same shape.
DataArray.to_series() Convert this array into a pandas.Series.
DataArray.to_dataframe([name]) Convert this array and its coordinates into a tidy pandas.DataFrame.
DataArray.to_index() Convert this variable to a pandas.Index.
DataArray.to_masked_array([copy]) Convert this array into a numpy.ma.MaskedArray
DataArray.to_cdms2() Convert this array into a cdms2.Variable
DataArray.to_dict() Convert this xarray.DataArray into a dictionary following xarray naming conventions.
DataArray.from_series(series) Convert a pandas.Series into an xarray.DataArray.
DataArray.from_cdms2(variable) Convert a cdms2.Variable into an xarray.DataArray
DataArray.from_dict(d) Convert a dictionary into an xarray.DataArray
DataArray.compute(**kwargs) Manually trigger loading of this array’s data from disk or a remote source into memory and return a new array.
DataArray.persist(**kwargs) Trigger computation in constituent dask arrays
DataArray.load(**kwargs) Manually trigger loading of this array’s data from disk or a remote source into memory and return this array.
DataArray.chunk([chunks, name_prefix, …]) Coerce this array’s data into a dask array with the given chunks.
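A sketch of the pandas round trip: to_series turns the dimension into the Series index, and from_series reverses it.

```python
import xarray as xr

da = xr.DataArray(
    [1.0, 2.0, 3.0], dims="x", coords={"x": [10, 20, 30]}, name="v"
)

# Dimension "x" becomes the Series index; the array name becomes the Series name
s = da.to_series()

# Convert back; dims, coords and name are recovered
back = xr.DataArray.from_series(s)
```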

Plotting

plot.plot(darray[, row, col, col_wrap, ax, …]) Default plot of DataArray using matplotlib.pyplot.
plot.contourf(x, y, z, ax, **kwargs) Filled contour plot of 2d DataArray
plot.contour(x, y, z, ax, **kwargs) Contour plot of 2d DataArray
plot.hist(darray[, figsize, size, aspect, ax]) Histogram of DataArray
plot.imshow(x, y, z, ax, **kwargs) Image plot of 2d DataArray using matplotlib.pyplot
plot.line(darray, *args, **kwargs) Line plot of 1-dimensional DataArray index against values
plot.pcolormesh(x, y, z, ax[, infer_intervals]) Pseudocolor plot of 2d DataArray
plot.FacetGrid(data[, col, row, col_wrap, …]) Initialize the matplotlib figure and FacetGrid object.

Testing

testing.assert_equal(a, b) Like numpy.testing.assert_array_equal(), but for xarray objects.
testing.assert_identical(a, b) Like xarray.testing.assert_equal(), but also matches the objects’ names and attributes.
testing.assert_allclose(a, b[, rtol, atol, …]) Like numpy.testing.assert_allclose(), but for xarray objects.
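A sketch contrasting exact and approximate assertions:

```python
import xarray as xr
import xarray.testing

a = xr.DataArray([1.0, 2.0], dims="x")
b = xr.DataArray([1.0, 2.0 + 1e-9], dims="x")

# Within tolerance: passes silently
xr.testing.assert_allclose(a, b, rtol=1e-6)

# Exact comparison raises AssertionError for the same pair
raised = False
try:
    xr.testing.assert_equal(a, b)
except AssertionError:
    raised = True
```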

Exceptions

MergeError Error class for merge failures due to incompatible arguments.

Advanced API

Variable(dims, data[, attrs, encoding, fastpath]) A netcdf-like variable consisting of dimensions, data and attributes which describe a single Array.
IndexVariable(dims, data[, attrs, encoding, …]) Wrapper for accommodating a pandas.Index in an xarray.Variable.
as_variable(obj[, name]) Convert an object into a Variable.
register_dataset_accessor(name) Register a custom property on xarray.Dataset objects.
register_dataarray_accessor(name) Register a custom accessor on xarray.DataArray objects.
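A minimal accessor sketch; the accessor name "stats" and the spread method are hypothetical, chosen only for illustration. The decorated class is instantiated on first attribute access, receiving the DataArray it is attached to.

```python
import xarray as xr

@xr.register_dataarray_accessor("stats")
class StatsAccessor:
    def __init__(self, da):
        self._da = da

    def spread(self):
        # Maximum minus minimum over all values
        return float(self._da.max() - self._da.min())

da = xr.DataArray([3.0, 1.0, 8.0], dims="x")
# The custom namespace is now available on every DataArray
da.stats.spread()
```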

These backends provide a low-level interface for lazily loading data from external file formats or protocols; they can be invoked manually to create arguments for the from_store and dump_to_store Dataset methods:

backends.NetCDF4DataStore(netcdf4_dataset[, …]) Store for reading and writing data via the Python-NetCDF4 library.
backends.H5NetCDFStore(filename[, mode, …]) Store for reading and writing data via h5netcdf
backends.PydapDataStore(ds) Store for accessing OpenDAP datasets with pydap.
backends.ScipyDataStore(filename_or_obj[, …]) Store for reading and writing data via scipy.io.netcdf.