class nbodykit.source.catalog.CSVCatalog(*args, **kwargs)

A CatalogSource that uses CSVFile to read data from disk.

Multiple files can be read at once by supplying a list of file names or a glob asterisk pattern as the path argument. See Reading Multiple Data Files at Once for examples.
Examples
Please see the documentation for examples.
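As a minimal sketch (the file name and column names are illustrative, and the nbodykit import is guarded so the file-writing part stands on its own), a plain-text file can be generated with numpy and read back; column names must be supplied explicitly, since the file stores no header:

```python
import numpy

# write 100 rows of whitespace-separated particle data (no header line)
rng = numpy.random.default_rng(42)
data = rng.random((100, 3))
numpy.savetxt('particles.csv', data)

try:
    from nbodykit.source.catalog import CSVCatalog
    # each name in `names` becomes a column of the catalog
    cat = CSVCatalog('particles.csv', names=['x', 'y', 'z'])
except ImportError:
    cat = None  # nbodykit not installed; the call above shows the intended usage
```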
Attributes

Index: The attribute giving the global index rank of each particle in the list.
attrs: A dictionary storing relevant meta-data about the CatalogSource.
columns: All columns in the CatalogSource, including those hard-coded into the class’s definition and any override columns provided by the user.
csize: The total, collective size of the CatalogSource, i.e., summed across all ranks.
hardcolumns: The union of the columns in the file and any transformed columns.
size: The number of objects in the CatalogSource on the local rank.
use_cache: If set to True, use the built-in caching features of dask to cache data in memory.
Methods

Selection(): A boolean column that selects a subset slice of the CatalogSource.
Value(): When interpolating a CatalogSource on to a mesh, the value of this array is used as the Value that each particle contributes to a given mesh cell.
Weight(): The column giving the weight to use for each particle on the mesh.
compute(*args, **kwargs): Our version of dask.compute() that computes multiple delayed dask collections at once.
copy(): Return a shallow copy of the object, where each column is a reference of the corresponding column in self.
get_hardcolumn(col): Return a column from the underlying file source.
gslice(start, stop[, end, redistribute]): Execute a global slice of a CatalogSource.
make_column(array): Utility function to convert an array-like object to a dask.array.Array.
read(columns): Return the requested columns as dask arrays.
save(output, columns[, datasets, header]): Save the CatalogSource to a bigfile.BigFile.
sort(keys[, reverse, usecols]): Return a CatalogSource, sorted globally across all MPI ranks in ascending order by the input keys.
to_mesh([Nmesh, BoxSize, dtype, interlaced, …]): Convert the CatalogSource to a MeshSource, using the specified parameters.
view([type]): Return a “view” of the CatalogSource object, with the returned type set by type.
class nbodykit.source.catalog.BinaryCatalog(*args, **kwargs)

A CatalogSource that uses BinaryFile to read data from disk.

Multiple files can be read at once by supplying a list of file names or a glob asterisk pattern as the path argument. See Reading Multiple Data Files at Once for examples.
Examples
Please see the documentation for examples.
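For example (a sketch; the dtype and the size keyword follow the pattern used elsewhere in the nbodykit documentation, and the import is guarded), a raw binary file laid out as a numpy structured array can be created and read back:

```python
import numpy

# the on-disk layout: 1024 rows, each holding a ('f4', 3) Position
# and a ('f4', 3) Velocity, i.e. 24 bytes per row
dtype = numpy.dtype([('Position', ('f4', 3)), ('Velocity', ('f4', 3))])
data = numpy.zeros(1024, dtype=dtype)
data.tofile('particles.bin')

try:
    from nbodykit.source.catalog import BinaryCatalog
    # the same dtype tells the reader how to interpret the raw bytes
    cat = BinaryCatalog('particles.bin', dtype, size=1024)
except ImportError:
    cat = None  # nbodykit not installed; the call above shows the intended usage
```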
Attributes

Index: The attribute giving the global index rank of each particle in the list.
attrs: A dictionary storing relevant meta-data about the CatalogSource.
columns: All columns in the CatalogSource, including those hard-coded into the class’s definition and any override columns provided by the user.
csize: The total, collective size of the CatalogSource, i.e., summed across all ranks.
hardcolumns: The union of the columns in the file and any transformed columns.
size: The number of objects in the CatalogSource on the local rank.
use_cache: If set to True, use the built-in caching features of dask to cache data in memory.
Methods

Selection(): A boolean column that selects a subset slice of the CatalogSource.
Value(): When interpolating a CatalogSource on to a mesh, the value of this array is used as the Value that each particle contributes to a given mesh cell.
Weight(): The column giving the weight to use for each particle on the mesh.
compute(*args, **kwargs): Our version of dask.compute() that computes multiple delayed dask collections at once.
copy(): Return a shallow copy of the object, where each column is a reference of the corresponding column in self.
get_hardcolumn(col): Return a column from the underlying file source.
gslice(start, stop[, end, redistribute]): Execute a global slice of a CatalogSource.
make_column(array): Utility function to convert an array-like object to a dask.array.Array.
read(columns): Return the requested columns as dask arrays.
save(output, columns[, datasets, header]): Save the CatalogSource to a bigfile.BigFile.
sort(keys[, reverse, usecols]): Return a CatalogSource, sorted globally across all MPI ranks in ascending order by the input keys.
to_mesh([Nmesh, BoxSize, dtype, interlaced, …]): Convert the CatalogSource to a MeshSource, using the specified parameters.
view([type]): Return a “view” of the CatalogSource object, with the returned type set by type.
class nbodykit.source.catalog.BigFileCatalog(*args, **kwargs)

A CatalogSource that uses BigFile to read data from disk.

Multiple files can be read at once by supplying a list of file names or a glob asterisk pattern as the path argument. See Reading Multiple Data Files at Once for examples.
Examples
Please see the documentation for examples.
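A common round trip (a sketch with guarded imports; the dataset argument is an assumption about where save() places the columns) writes a catalog with save() and reads it back:

```python
# nbar * BoxSize**3 gives the expected number of objects in the
# UniformCatalog used to create the file below
nbar, BoxSize = 1e-4, 100.0
expected = nbar * BoxSize**3  # about 100 objects on average

try:
    from nbodykit.lab import UniformCatalog
    from nbodykit.source.catalog import BigFileCatalog
    cat = UniformCatalog(nbar=nbar, BoxSize=BoxSize, seed=42)
    cat.save('particles.bigfile', ['Position', 'Velocity'])
    # read the columns back; `dataset` selects the directory inside the
    # file that holds the columns (assumed to be './' here)
    cat2 = BigFileCatalog('particles.bigfile', dataset='./', header='Header')
except ImportError:
    pass  # nbodykit not installed; the calls above show the intended usage
```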
Attributes

Index: The attribute giving the global index rank of each particle in the list.
attrs: A dictionary storing relevant meta-data about the CatalogSource.
columns: All columns in the CatalogSource, including those hard-coded into the class’s definition and any override columns provided by the user.
csize: The total, collective size of the CatalogSource, i.e., summed across all ranks.
hardcolumns: The union of the columns in the file and any transformed columns.
size: The number of objects in the CatalogSource on the local rank.
use_cache: If set to True, use the built-in caching features of dask to cache data in memory.
Methods

Selection(): A boolean column that selects a subset slice of the CatalogSource.
Value(): When interpolating a CatalogSource on to a mesh, the value of this array is used as the Value that each particle contributes to a given mesh cell.
Weight(): The column giving the weight to use for each particle on the mesh.
compute(*args, **kwargs): Our version of dask.compute() that computes multiple delayed dask collections at once.
copy(): Return a shallow copy of the object, where each column is a reference of the corresponding column in self.
get_hardcolumn(col): Return a column from the underlying file source.
gslice(start, stop[, end, redistribute]): Execute a global slice of a CatalogSource.
make_column(array): Utility function to convert an array-like object to a dask.array.Array.
read(columns): Return the requested columns as dask arrays.
save(output, columns[, datasets, header]): Save the CatalogSource to a bigfile.BigFile.
sort(keys[, reverse, usecols]): Return a CatalogSource, sorted globally across all MPI ranks in ascending order by the input keys.
to_mesh([Nmesh, BoxSize, dtype, interlaced, …]): Convert the CatalogSource to a MeshSource, using the specified parameters.
view([type]): Return a “view” of the CatalogSource object, with the returned type set by type.
class nbodykit.source.catalog.HDFCatalog(*args, **kwargs)

A CatalogSource that uses HDFFile to read data from disk.

Multiple files can be read at once by supplying a list of file names or a glob asterisk pattern as the path argument. See Reading Multiple Data Files at Once for examples.
Examples
Please see the documentation for examples.
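As a sketch (both the h5py and nbodykit imports are guarded, and the file and dataset names are illustrative), an HDF5 file whose datasets share the same length can be read directly:

```python
import numpy

pos = numpy.random.default_rng(7).random((100, 3))

try:
    import h5py
    # each dataset in the file becomes a column, named by its path
    with h5py.File('particles.hdf5', 'w') as ff:
        ff.create_dataset('Position', data=pos)
        ff.create_dataset('Mass', data=numpy.ones(100))

    from nbodykit.source.catalog import HDFCatalog
    cat = HDFCatalog('particles.hdf5')
except ImportError:
    cat = None  # h5py or nbodykit not installed; the calls show intended usage
```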
Attributes

Index: The attribute giving the global index rank of each particle in the list.
attrs: A dictionary storing relevant meta-data about the CatalogSource.
columns: All columns in the CatalogSource, including those hard-coded into the class’s definition and any override columns provided by the user.
csize: The total, collective size of the CatalogSource, i.e., summed across all ranks.
hardcolumns: The union of the columns in the file and any transformed columns.
size: The number of objects in the CatalogSource on the local rank.
use_cache: If set to True, use the built-in caching features of dask to cache data in memory.
Methods

Selection(): A boolean column that selects a subset slice of the CatalogSource.
Value(): When interpolating a CatalogSource on to a mesh, the value of this array is used as the Value that each particle contributes to a given mesh cell.
Weight(): The column giving the weight to use for each particle on the mesh.
compute(*args, **kwargs): Our version of dask.compute() that computes multiple delayed dask collections at once.
copy(): Return a shallow copy of the object, where each column is a reference of the corresponding column in self.
get_hardcolumn(col): Return a column from the underlying file source.
gslice(start, stop[, end, redistribute]): Execute a global slice of a CatalogSource.
make_column(array): Utility function to convert an array-like object to a dask.array.Array.
read(columns): Return the requested columns as dask arrays.
save(output, columns[, datasets, header]): Save the CatalogSource to a bigfile.BigFile.
sort(keys[, reverse, usecols]): Return a CatalogSource, sorted globally across all MPI ranks in ascending order by the input keys.
to_mesh([Nmesh, BoxSize, dtype, interlaced, …]): Convert the CatalogSource to a MeshSource, using the specified parameters.
view([type]): Return a “view” of the CatalogSource object, with the returned type set by type.
class nbodykit.source.catalog.TPMBinaryCatalog(*args, **kwargs)

A CatalogSource that uses TPMBinaryFile to read data from disk.

Multiple files can be read at once by supplying a list of file names or a glob asterisk pattern as the path argument. See Reading Multiple Data Files at Once for examples.
Attributes

Index: The attribute giving the global index rank of each particle in the list.
attrs: A dictionary storing relevant meta-data about the CatalogSource.
columns: All columns in the CatalogSource, including those hard-coded into the class’s definition and any override columns provided by the user.
csize: The total, collective size of the CatalogSource, i.e., summed across all ranks.
hardcolumns: The union of the columns in the file and any transformed columns.
size: The number of objects in the CatalogSource on the local rank.
use_cache: If set to True, use the built-in caching features of dask to cache data in memory.
Methods

Selection(): A boolean column that selects a subset slice of the CatalogSource.
Value(): When interpolating a CatalogSource on to a mesh, the value of this array is used as the Value that each particle contributes to a given mesh cell.
Weight(): The column giving the weight to use for each particle on the mesh.
compute(*args, **kwargs): Our version of dask.compute() that computes multiple delayed dask collections at once.
copy(): Return a shallow copy of the object, where each column is a reference of the corresponding column in self.
get_hardcolumn(col): Return a column from the underlying file source.
gslice(start, stop[, end, redistribute]): Execute a global slice of a CatalogSource.
make_column(array): Utility function to convert an array-like object to a dask.array.Array.
read(columns): Return the requested columns as dask arrays.
save(output, columns[, datasets, header]): Save the CatalogSource to a bigfile.BigFile.
sort(keys[, reverse, usecols]): Return a CatalogSource, sorted globally across all MPI ranks in ascending order by the input keys.
to_mesh([Nmesh, BoxSize, dtype, interlaced, …]): Convert the CatalogSource to a MeshSource, using the specified parameters.
view([type]): Return a “view” of the CatalogSource object, with the returned type set by type.
class nbodykit.source.catalog.FITSCatalog(*args, **kwargs)

A CatalogSource that uses FITSFile to read data from disk.

Multiple files can be read at once by supplying a list of file names or a glob asterisk pattern as the path argument. See Reading Multiple Data Files at Once for examples.
Examples
Please see the documentation for examples.
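For instance (a sketch; the fitsio and nbodykit imports are guarded, and the file name is illustrative), a structured array written as a FITS binary table can be read back:

```python
import numpy

# a structured array maps naturally onto a FITS binary table extension
data = numpy.zeros(100, dtype=[('RA', 'f8'), ('DEC', 'f8'), ('Z', 'f8')])
data['Z'] = numpy.linspace(0.1, 0.5, 100)

try:
    import fitsio
    fitsio.write('particles.fits', data, clobber=True)

    from nbodykit.source.catalog import FITSCatalog
    cat = FITSCatalog('particles.fits')  # columns: RA, DEC, Z
except ImportError:
    cat = None  # fitsio or nbodykit not installed; the calls show intended usage
```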
Attributes

Index: The attribute giving the global index rank of each particle in the list.
attrs: A dictionary storing relevant meta-data about the CatalogSource.
columns: All columns in the CatalogSource, including those hard-coded into the class’s definition and any override columns provided by the user.
csize: The total, collective size of the CatalogSource, i.e., summed across all ranks.
hardcolumns: The union of the columns in the file and any transformed columns.
size: The number of objects in the CatalogSource on the local rank.
use_cache: If set to True, use the built-in caching features of dask to cache data in memory.
Methods

Selection(): A boolean column that selects a subset slice of the CatalogSource.
Value(): When interpolating a CatalogSource on to a mesh, the value of this array is used as the Value that each particle contributes to a given mesh cell.
Weight(): The column giving the weight to use for each particle on the mesh.
compute(*args, **kwargs): Our version of dask.compute() that computes multiple delayed dask collections at once.
copy(): Return a shallow copy of the object, where each column is a reference of the corresponding column in self.
get_hardcolumn(col): Return a column from the underlying file source.
gslice(start, stop[, end, redistribute]): Execute a global slice of a CatalogSource.
make_column(array): Utility function to convert an array-like object to a dask.array.Array.
read(columns): Return the requested columns as dask arrays.
save(output, columns[, datasets, header]): Save the CatalogSource to a bigfile.BigFile.
sort(keys[, reverse, usecols]): Return a CatalogSource, sorted globally across all MPI ranks in ascending order by the input keys.
to_mesh([Nmesh, BoxSize, dtype, interlaced, …]): Convert the CatalogSource to a MeshSource, using the specified parameters.
view([type]): Return a “view” of the CatalogSource object, with the returned type set by type.
class nbodykit.source.catalog.Gadget1Catalog(*args, **kwargs)

A CatalogSource that uses Gadget1File to read data from disk.

Multiple files can be read at once by supplying a list of file names or a glob asterisk pattern as the path argument. See Reading Multiple Data Files at Once for examples.
Attributes

Index: The attribute giving the global index rank of each particle in the list.
attrs: A dictionary storing relevant meta-data about the CatalogSource.
columns: All columns in the CatalogSource, including those hard-coded into the class’s definition and any override columns provided by the user.
csize: The total, collective size of the CatalogSource, i.e., summed across all ranks.
hardcolumns: The union of the columns in the file and any transformed columns.
size: The number of objects in the CatalogSource on the local rank.
use_cache: If set to True, use the built-in caching features of dask to cache data in memory.
Methods

Selection(): A boolean column that selects a subset slice of the CatalogSource.
Value(): When interpolating a CatalogSource on to a mesh, the value of this array is used as the Value that each particle contributes to a given mesh cell.
Weight(): The column giving the weight to use for each particle on the mesh.
compute(*args, **kwargs): Our version of dask.compute() that computes multiple delayed dask collections at once.
copy(): Return a shallow copy of the object, where each column is a reference of the corresponding column in self.
get_hardcolumn(col): Return a column from the underlying file source.
gslice(start, stop[, end, redistribute]): Execute a global slice of a CatalogSource.
make_column(array): Utility function to convert an array-like object to a dask.array.Array.
read(columns): Return the requested columns as dask arrays.
save(output, columns[, datasets, header]): Save the CatalogSource to a bigfile.BigFile.
sort(keys[, reverse, usecols]): Return a CatalogSource, sorted globally across all MPI ranks in ascending order by the input keys.
to_mesh([Nmesh, BoxSize, dtype, interlaced, …]): Convert the CatalogSource to a MeshSource, using the specified parameters.
view([type]): Return a “view” of the CatalogSource object, with the returned type set by type.
class nbodykit.source.catalog.ArrayCatalog(data, comm=None, use_cache=False, **kwargs)

A CatalogSource initialized from a dictionary or structured ndarray.
Attributes

Index: The attribute giving the global index rank of each particle in the list.
attrs: A dictionary storing relevant meta-data about the CatalogSource.
columns: All columns in the CatalogSource, including those hard-coded into the class’s definition and any override columns provided by the user.
csize: The total, collective size of the CatalogSource, i.e., summed across all ranks.
hardcolumns: The union of the columns in the file and any transformed columns.
size: The number of objects in the CatalogSource on the local rank.
use_cache: If set to True, use the built-in caching features of dask to cache data in memory.
Methods

Selection(): A boolean column that selects a subset slice of the CatalogSource.
Value(): When interpolating a CatalogSource on to a mesh, the value of this array is used as the Value that each particle contributes to a given mesh cell.
Weight(): The column giving the weight to use for each particle on the mesh.
compute(*args, **kwargs): Our version of dask.compute() that computes multiple delayed dask collections at once.
copy(): Return a shallow copy of the object, where each column is a reference of the corresponding column in self.
get_hardcolumn(col): Return a column from the underlying data array/dict.
gslice(start, stop[, end, redistribute]): Execute a global slice of a CatalogSource.
make_column(array): Utility function to convert an array-like object to a dask.array.Array.
read(columns): Return the requested columns as dask arrays.
save(output, columns[, datasets, header]): Save the CatalogSource to a bigfile.BigFile.
sort(keys[, reverse, usecols]): Return a CatalogSource, sorted globally across all MPI ranks in ascending order by the input keys.
to_mesh([Nmesh, BoxSize, dtype, interlaced, …]): Convert the CatalogSource to a MeshSource, using the specified parameters.
view([type]): Return a “view” of the CatalogSource object, with the returned type set by type.
get_hardcolumn(col)

Return a column from the underlying data array/dict. Columns are returned as dask arrays.

hardcolumns

The union of the columns in the file and any transformed columns.
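As a brief sketch (the nbodykit import is guarded, and the column names are arbitrary), a dictionary of same-length numpy arrays becomes a catalog, with each key exposed as a dask-backed column:

```python
import numpy

rng = numpy.random.default_rng(11)
data = {
    'Position': rng.random((100, 3)) * 1000.0,  # 100 points in a 1000-unit box
    'Mass': rng.random(100),
}

try:
    from nbodykit.source.catalog import ArrayCatalog
    cat = ArrayCatalog(data)  # cat['Position'] is a dask array of shape (100, 3)
except ImportError:
    cat = None  # nbodykit not installed; the call above shows the intended usage
```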
class nbodykit.source.catalog.LogNormalCatalog(Plin, nbar, BoxSize, Nmesh, bias=2.0, seed=None, cosmo=None, redshift=None, unitary_amplitude=False, inverted_phase=False, comm=None, use_cache=False)

A CatalogSource containing biased particles that have been Poisson-sampled from a log-normal density field.
References

Cole and Jones (1991)
Agrawal et al. (2017)
Attributes

Index: The attribute giving the global index rank of each particle in the list.
attrs: A dictionary storing relevant meta-data about the CatalogSource.
columns: All columns in the CatalogSource, including those hard-coded into the class’s definition and any override columns provided by the user.
csize: The total, collective size of the CatalogSource, i.e., summed across all ranks.
hardcolumns: A list of the hard-coded columns in the CatalogSource.
size: The number of objects in the CatalogSource on the local rank.
use_cache: If set to True, use the built-in caching features of dask to cache data in memory.
Methods

Position(): Position, assumed to be in Mpc/h.
Selection(): A boolean column that selects a subset slice of the CatalogSource.
Value(): When interpolating a CatalogSource on to a mesh, the value of this array is used as the Value that each particle contributes to a given mesh cell.
Velocity(): Velocity, in km/s.
VelocityOffset(): The corresponding RSD offset, in Mpc/h.
Weight(): The column giving the weight to use for each particle on the mesh.
compute(*args, **kwargs): Our version of dask.compute() that computes multiple delayed dask collections at once.
copy(): Return a shallow copy of the object, where each column is a reference of the corresponding column in self.
get_hardcolumn(col): Construct and return a hard-coded column.
gslice(start, stop[, end, redistribute]): Execute a global slice of a CatalogSource.
make_column(array): Utility function to convert an array-like object to a dask.array.Array.
read(columns): Return the requested columns as dask arrays.
save(output, columns[, datasets, header]): Save the CatalogSource to a bigfile.BigFile.
sort(keys[, reverse, usecols]): Return a CatalogSource, sorted globally across all MPI ranks in ascending order by the input keys.
to_mesh([Nmesh, BoxSize, dtype, interlaced, …]): Convert the CatalogSource to a MeshSource, using the specified parameters.
view([type]): Return a “view” of the CatalogSource object, with the returned type set by type.
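A typical construction (a sketch with a guarded import; the linear power spectrum follows the pattern of the nbodykit cosmology module) draws a biased log-normal mock at redshift 0.55:

```python
# nbar is the mean number density in (Mpc/h)^-3, so the catalog holds
# roughly nbar * BoxSize**3 objects (the sampling is Poisson)
nbar, BoxSize = 3e-4, 500.0
expected = nbar * BoxSize**3  # about 37500 objects on average

try:
    from nbodykit.lab import cosmology, LogNormalCatalog
    Plin = cosmology.LinearPower(cosmology.Planck15, redshift=0.55)
    cat = LogNormalCatalog(Plin=Plin, nbar=nbar, BoxSize=BoxSize,
                           Nmesh=64, bias=2.0, seed=42)
except ImportError:
    cat = None  # nbodykit not installed; the calls above show the intended usage
```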
class nbodykit.source.catalog.UniformCatalog(nbar, BoxSize, seed=None, comm=None, use_cache=False)

A CatalogSource that has uniformly-distributed Position and Velocity columns.

The random numbers generated do not depend on the number of available ranks.
Attributes

Index: The attribute giving the global index rank of each particle in the list.
attrs: A dictionary storing relevant meta-data about the CatalogSource.
columns: All columns in the CatalogSource, including those hard-coded into the class’s definition and any override columns provided by the user.
csize: The total, collective size of the CatalogSource, i.e., summed across all ranks.
hardcolumns: A list of the hard-coded columns in the CatalogSource.
rng: A MPIRandomState that behaves as numpy.random.RandomState but generates random numbers in a manner independent of the number of ranks.
size: The number of objects in the CatalogSource on the local rank.
use_cache: If set to True, use the built-in caching features of dask to cache data in memory.
Methods

Position(): The position of particles, uniformly distributed in BoxSize.
Selection(): A boolean column that selects a subset slice of the CatalogSource.
Value(): When interpolating a CatalogSource on to a mesh, the value of this array is used as the Value that each particle contributes to a given mesh cell.
Velocity(): The velocity of particles, uniformly distributed in 0.01 x BoxSize.
Weight(): The column giving the weight to use for each particle on the mesh.
compute(*args, **kwargs): Our version of dask.compute() that computes multiple delayed dask collections at once.
copy(): Return a shallow copy of the object, where each column is a reference of the corresponding column in self.
get_hardcolumn(col): Construct and return a hard-coded column.
gslice(start, stop[, end, redistribute]): Execute a global slice of a CatalogSource.
make_column(array): Utility function to convert an array-like object to a dask.array.Array.
read(columns): Return the requested columns as dask arrays.
save(output, columns[, datasets, header]): Save the CatalogSource to a bigfile.BigFile.
sort(keys[, reverse, usecols]): Return a CatalogSource, sorted globally across all MPI ranks in ascending order by the input keys.
to_mesh([Nmesh, BoxSize, dtype, interlaced, …]): Convert the CatalogSource to a MeshSource, using the specified parameters.
view([type]): Return a “view” of the CatalogSource object, with the returned type set by type.
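For example (a sketch with a guarded import), positions are uniform over the box and velocities uniform over 0.01 x BoxSize, so the parameters alone fix the expected object count and the velocity range:

```python
nbar, BoxSize = 1e-3, 200.0
vmax = 0.01 * BoxSize         # upper bound of each velocity component
expected = nbar * BoxSize**3  # about 8000 objects on average

try:
    from nbodykit.lab import UniformCatalog
    cat = UniformCatalog(nbar=nbar, BoxSize=BoxSize, seed=42)
except ImportError:
    cat = None  # nbodykit not installed; the call above shows the intended usage
```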
class nbodykit.source.catalog.RandomCatalog(csize, seed=None, comm=None, use_cache=False)

A CatalogSource that can have columns added via a collective random number generator.

The random number generator stored as rng behaves as numpy.random.RandomState but generates random numbers only on the local rank, in a manner independent of the number of ranks.
Attributes

Index: The attribute giving the global index rank of each particle in the list.
attrs: A dictionary storing relevant meta-data about the CatalogSource.
columns: All columns in the CatalogSource, including those hard-coded into the class’s definition and any override columns provided by the user.
csize: The total, collective size of the CatalogSource, i.e., summed across all ranks.
hardcolumns: A list of the hard-coded columns in the CatalogSource.
rng: A MPIRandomState that behaves as numpy.random.RandomState but generates random numbers in a manner independent of the number of ranks.
size: The number of objects in the CatalogSource on the local rank.
use_cache: If set to True, use the built-in caching features of dask to cache data in memory.
Methods

Selection(): A boolean column that selects a subset slice of the CatalogSource.
Value(): When interpolating a CatalogSource on to a mesh, the value of this array is used as the Value that each particle contributes to a given mesh cell.
Weight(): The column giving the weight to use for each particle on the mesh.
compute(*args, **kwargs): Our version of dask.compute() that computes multiple delayed dask collections at once.
copy(): Return a shallow copy of the object, where each column is a reference of the corresponding column in self.
get_hardcolumn(col): Construct and return a hard-coded column.
gslice(start, stop[, end, redistribute]): Execute a global slice of a CatalogSource.
make_column(array): Utility function to convert an array-like object to a dask.array.Array.
read(columns): Return the requested columns as dask arrays.
save(output, columns[, datasets, header]): Save the CatalogSource to a bigfile.BigFile.
sort(keys[, reverse, usecols]): Return a CatalogSource, sorted globally across all MPI ranks in ascending order by the input keys.
to_mesh([Nmesh, BoxSize, dtype, interlaced, …]): Convert the CatalogSource to a MeshSource, using the specified parameters.
view([type]): Return a “view” of the CatalogSource object, with the returned type set by type.
rng

A MPIRandomState that behaves as numpy.random.RandomState but generates random numbers in a manner independent of the number of ranks.
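A short sketch (the import is guarded; the column name and the exact uniform() signature are assumptions based on the MPIRandomState description) of adding a random column collectively:

```python
csize, seed = 1000, 42  # the global size and seed fully determine the values

try:
    from nbodykit.source.catalog import RandomCatalog
    cat = RandomCatalog(csize=csize, seed=seed)
    # draw from the collective generator; running on a different number of
    # ranks would produce the same global column (signature assumed)
    cat['Mass'] = cat.rng.uniform(low=0.0, high=1.0)
except ImportError:
    cat = None  # nbodykit not installed; the calls above show the intended usage
```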
class nbodykit.source.catalog.FKPCatalog(data, randoms, BoxSize=None, BoxPad=0.02, use_cache=True)

An interface for simultaneous modeling of a data CatalogSource and a randoms CatalogSource, in the spirit of Feldman, Kaiser, and Peacock, 1994.

The main functionality of this class is to:

- provide a uniform interface to the columns of the data CatalogSource and the randoms CatalogSource, using column names prefixed with “data/” or “randoms/”
- compute the shared BoxSize of the source, by finding the maximum Cartesian extent of the randoms
- provide an interface to a mesh object, which knows how to “paint” the FKP density field from the data and randoms
Attributes

attrs: A dictionary storing relevant meta-data about the CatalogSource.
columns: Columns for individual species can be accessed using a species/ prefix and the column name, i.e., data/Position.
hardcolumns: Hardcolumns of the form species/name.
species: List of species names.
use_cache: If set to True, use the built-in caching features of dask to cache data in memory.
Methods

compute(*args, **kwargs): Our version of dask.compute() that computes multiple delayed dask collections at once.
copy(): Return a shallow copy of the object, where each column is a reference of the corresponding column in self.
get_hardcolumn(col): Construct and return a hard-coded column.
make_column(array): Utility function to convert an array-like object to a dask.array.Array.
read(columns): Return the requested columns as dask arrays.
save(output, columns[, datasets, header]): Save the CatalogSource to a bigfile.BigFile.
to_mesh([Nmesh, BoxSize, dtype, interlaced, …]): Convert the FKPCatalog to a mesh, which knows how to “paint” the FKP density field.
view([type]): Return a “view” of the CatalogSource object, with the returned type set by type.
to_mesh(Nmesh=None, BoxSize=None, dtype='f4', interlaced=False, compensated=False, window='cic', fkp_weight='FKPWeight', comp_weight='Weight', nbar='NZ', selection='Selection', position='Position')

Convert the FKPCatalog to a mesh, which knows how to “paint” the FKP density field.

Additional keywords to the to_mesh() function include the FKP weight column, the completeness weight column, and the column specifying the number density as a function of redshift.
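As a sketch (guarded imports; the number densities are illustrative), a data catalog and a denser randoms catalog combine into an FKP source whose columns carry a species prefix:

```python
# the randoms are typically much denser than the data; their number-density
# ratio (often called alpha) is used to normalize the density field
nbar_data, nbar_randoms = 1e-4, 1e-3
alpha = nbar_data / nbar_randoms  # about 0.1

try:
    from nbodykit.lab import UniformCatalog, FKPCatalog
    data = UniformCatalog(nbar=nbar_data, BoxSize=500.0, seed=42)
    randoms = UniformCatalog(nbar=nbar_randoms, BoxSize=500.0, seed=84)
    fkp = FKPCatalog(data, randoms)
    pos = fkp['data/Position']  # species-prefixed column access
except ImportError:
    fkp = None  # nbodykit not installed; the calls above show the intended usage
```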
class nbodykit.source.catalog.HaloCatalog(source, cosmo, redshift, mdef='vir', mass='Mass', position='Position', velocity='Velocity')

A wrapper CatalogSource of halo objects to interface nicely with halotools.sim_manager.UserSuppliedHaloCatalog.
Attributes

Index: The attribute giving the global index rank of each particle in the list.
attrs: A dictionary storing relevant meta-data about the CatalogSource.
columns: All columns in the CatalogSource, including those hard-coded into the class’s definition and any override columns provided by the user.
csize: The total, collective size of the CatalogSource, i.e., summed across all ranks.
hardcolumns: A list of the hard-coded columns in the CatalogSource.
size: The number of objects in the CatalogSource on the local rank.
use_cache: If set to True, use the built-in caching features of dask to cache data in memory.
Methods

Concentration(): The halo concentration, computed using nbodykit.transform.HaloConcentration().
Mass(): The halo mass column, assumed to be in units of \(M_\odot/h\).
Position(): The halo position column, assumed to be in units of \(\mathrm{Mpc}/h\).
Radius(): The halo radius, computed using nbodykit.transform.HaloRadius().
Selection(): A boolean column that selects a subset slice of the CatalogSource.
Value(): When interpolating a CatalogSource on to a mesh, the value of this array is used as the Value that each particle contributes to a given mesh cell.
Velocity(): The halo velocity column, assumed to be in units of km/s.
VelocityOffset(): The redshift-space distance offset due to the velocity, in units of distance.
Weight(): The column giving the weight to use for each particle on the mesh.
compute(*args, **kwargs): Our version of dask.compute() that computes multiple delayed dask collections at once.
copy(): Return a shallow copy of the object, where each column is a reference of the corresponding column in self.
get_hardcolumn(col): Construct and return a hard-coded column.
gslice(start, stop[, end, redistribute]): Execute a global slice of a CatalogSource.
make_column(array): Utility function to convert an array-like object to a dask.array.Array.
read(columns): Return the requested columns as dask arrays.
save(output, columns[, datasets, header]): Save the CatalogSource to a bigfile.BigFile.
sort(keys[, reverse, usecols]): Return a CatalogSource, sorted globally across all MPI ranks in ascending order by the input keys.
to_halotools([BoxSize, selection]): Return the CatalogSource as a halotools.sim_manager.UserSuppliedHaloCatalog.
to_mesh([Nmesh, BoxSize, dtype, interlaced, …]): Convert the CatalogSource to a MeshSource, using the specified parameters.
view([type]): Return a “view” of the CatalogSource object, with the returned type set by type.
Concentration()

The halo concentration, computed using nbodykit.transform.HaloConcentration(). This uses the analytic formulas for concentration from Dutton and Maccio 2014.
Radius()

The halo radius, computed using nbodykit.transform.HaloRadius(). Assumed units of \(\mathrm{Mpc}/h\).
VelocityOffset
()[source]¶The redshift-space distance offset due to the velocity in units of distance. The assumed units are \(\mathrm{Mpc}/h\).
This multiplies Velocity
by \(1 / (a 100 E(z)) = 1 / (a H(z)/h)\).
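That conversion factor is easy to verify numerically. The sketch below assumes a flat ΛCDM \(E(z) = \sqrt{\Omega_{m,0}(1+z)^3 + (1-\Omega_{m,0})}\) with an illustrative `Om0`; nbodykit takes the cosmology from the catalog itself.

```python
import math

def rsd_factor(z, Om0=0.3):
    """Sketch of the VelocityOffset factor 1 / (a * 100 * E(z)).

    Multiplying a peculiar velocity in km/s by this factor gives a
    comoving displacement in Mpc/h.  A flat LCDM E(z) is assumed and
    Om0 is an illustrative value.
    """
    a = 1.0 / (1.0 + z)
    Ez = math.sqrt(Om0 * (1.0 + z) ** 3 + (1.0 - Om0))
    return 1.0 / (a * 100.0 * Ez)

# at z = 0, a = E = 1, so a 500 km/s velocity maps to a 5 Mpc/h offset
offset = 500.0 * rsd_factor(0.0)
```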
to_halotools
(BoxSize=None, selection='Selection')[source]¶Return the CatalogSource as a halotools.sim_manager.UserSuppliedHaloCatalog
.
The Halotools catalog only holds the local data, although halos are
labeled via the halo_id
column using the global index.
Returns: cat – the Halotools halo catalog, storing the local halo data
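The global labeling of local halos can be illustrated with a serial sketch. The function name and the plain list of per-rank sizes below are stand-ins for the MPI machinery (an allgather of the local catalog sizes): each rank offsets its local indices by the total size held on all lower ranks, so the resulting halo_id values are globally unique.

```python
import numpy as np

def global_halo_ids(local_sizes, rank):
    """Sketch of globally unique halo_id assignment across ranks.

    ``local_sizes`` lists the catalog size on every MPI rank (a plain
    list standing in for an MPI allgather); ``rank`` is this process's
    rank.  The ids start at the exclusive prefix sum of lower ranks.
    """
    start = int(np.sum(local_sizes[:rank]))  # sizes on ranks below us
    return np.arange(start, start + local_sizes[rank])

# three ranks holding 4, 3, and 5 halos respectively
ids = global_halo_ids([4, 3, 5], rank=1)  # ids 4..6 on rank 1
```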
nbodykit.source.catalog.
HODCatalog
(halos, logMmin=13.031, sigma_logM=0.38, alpha=0.76, logM0=13.27, logM1=14.08, seed=None, use_cache=False, comm=None)[source]¶A CatalogSource that uses the HOD prescription of Zheng et al 2007 to populate an input halo catalog with galaxies.
The mock population is done using halotools
. See the documentation
for halotools.empirical_models.Zheng07Cens
and
halotools.empirical_models.Zheng07Sats
for further details
regarding the HOD.
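The mean occupation functions of this HOD can be sketched directly from the default parameters in the signature above. This is a simplified, serial rendering of the Zheng et al. (2007) functional forms; the actual Monte Carlo population of halos is performed by halotools.

```python
import math

def mean_ncen(M, logMmin=13.031, sigma_logM=0.38):
    """Mean central occupation of the Zheng et al. (2007) HOD (sketch):
    <Ncen> = 0.5 * (1 + erf((log10 M - logMmin) / sigma_logM))."""
    return 0.5 * (1.0 + math.erf((math.log10(M) - logMmin) / sigma_logM))

def mean_nsat(M, logM0=13.27, logM1=14.08, alpha=0.76, **cen_kws):
    """Mean satellite occupation (sketch): zero below the cutoff M0,
    else <Ncen> * ((M - M0) / M1) ** alpha."""
    M0, M1 = 10.0 ** logM0, 10.0 ** logM1
    if M <= M0:
        return 0.0
    return mean_ncen(M, **cen_kws) * ((M - M0) / M1) ** alpha

# halos at the characteristic mass logMmin host a central half the time
ncen = mean_ncen(10 ** 13.031)
```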
The columns generated in this catalog are:
csize
size
conc_NFWmodel
Velocity
Position
For further details, please see the documentation.
Note
Default HOD values are from Reid et al. 2014
References
Zheng et al. (2007), arXiv:astro-ph/0703457
Attributes
Index |
The attribute giving the global index rank of each particle in the list. |
attrs |
A dictionary storing relevant meta-data about the CatalogSource. |
columns |
All columns in the CatalogSource, including those hard-coded into the class’s definition and override columns provided by the user. |
csize |
The total, collective size of the CatalogSource, i.e., summed across all ranks. |
hardcolumns |
The union of the columns in the file and any transformed columns. |
size |
The number of objects in the CatalogSource on the local rank. |
use_cache |
If set to True , use the built-in caching features of dask to cache data in memory. |
Methods
Position () |
Galaxy positions, in units of Mpc/h |
Selection () |
A boolean column that selects a subset slice of the CatalogSource. |
Value () |
When interpolating a CatalogSource on to a mesh, the value of this array is used as the Value that each particle contributes to a given mesh cell. |
Velocity () |
Galaxy velocity, in units of km/s |
VelocityOffset () |
The RSD velocity offset, in units of Mpc/h |
Weight () |
The column giving the weight to use for each particle on the mesh. |
compute (*args, **kwargs) |
Our version of dask.compute() that computes multiple delayed dask collections at once. |
copy () |
Return a shallow copy of the object, where each column is a reference of the corresponding column in self . |
get_hardcolumn (col) |
Return a column from the underlying data array/dict. |
gslice (start, stop[, end, redistribute]) |
Execute a global slice of a CatalogSource. |
make_column (array) |
Utility function to convert an array-like object to a dask.array.Array . |
read (columns) |
Return the requested columns as dask arrays. |
repopulate ([seed]) |
Update the HOD parameters and then re-populate the mock catalog |
save (output, columns[, datasets, header]) |
Save the CatalogSource to a bigfile.BigFile . |
sort (keys[, reverse, usecols]) |
Return a CatalogSource, sorted globally across all MPI ranks in ascending order by the input keys. |
to_mesh ([Nmesh, BoxSize, dtype, interlaced, …]) |
Convert the CatalogSource to a MeshSource, using the specified parameters. |
view ([type]) |
Return a “view” of the CatalogSource object, with the returned type set by type . |
nbodykit.source.catalog.
MultipleSpeciesCatalog
(names, *species, **kwargs)[source]¶A CatalogSource interface for handling multiple species of particles.
This CatalogSource stores a copy of the original CatalogSource objects
for each species, providing access to the columns via the format
species/column
, where “species” is one of the species names provided.
Examples
Initialization:
>>> data = UniformCatalog(nbar=3e-5, BoxSize=512., seed=42)
>>> randoms = UniformCatalog(nbar=3e-5, BoxSize=512., seed=84)
>>> cat = MultipleSpeciesCatalog(['data', 'randoms'], data, randoms)
Accessing the Catalogs for individual species:
>>> data = cat["data"] # a copy of the original "data" object
Accessing individual columns:
>>> data_pos = cat["data/Position"]
Setting new columns:
>>> cat["data"]["new_column"] = 1.0
>>> assert "data/new_column" in cat
Attributes
attrs |
A dictionary storing relevant meta-data about the CatalogSource. |
columns |
Columns for individual species can be accessed using a species/ prefix and the column name, i.e., data/Position . |
hardcolumns |
Hardcolumn of the form species/name |
species |
List of species names |
use_cache |
If set to True , use the built-in caching features of dask to cache data in memory. |
Methods
compute (*args, **kwargs) |
Our version of dask.compute() that computes multiple delayed dask collections at once. |
copy () |
Return a shallow copy of the object, where each column is a reference of the corresponding column in self . |
get_hardcolumn (col) |
Construct and return a hard-coded column. |
make_column (array) |
Utility function to convert an array-like object to a dask.array.Array . |
read (columns) |
Return the requested columns as dask arrays. |
save (output, columns[, datasets, header]) |
Save the CatalogSource to a bigfile.BigFile . |
to_mesh ([Nmesh, BoxSize, dtype, interlaced, …]) |
Convert the catalog to a mesh, which knows how to “paint” the combined density field, summed over all particle species. |
view ([type]) |
Return a “view” of the CatalogSource object, with the returned type set by type . |
__getitem__
(key)[source]¶This provides access to the underlying data in two ways:

The CatalogSource object for an individual species can be accessed if key is a species name.
Individual columns can be accessed using the format species/column.
__setitem__
(col, value)[source]¶Add columns to any of the species catalogs.
Note
New column names should be prefixed by ‘species/’ where
‘species’ is a name in the species
attribute.
columns
¶Columns for individual species can be accessed using a species/
prefix and the column name, i.e., data/Position
.
hardcolumns
¶Hardcolumn of the form species/name
species
¶List of species names
to_mesh
(Nmesh=None, BoxSize=None, dtype='f4', interlaced=False, compensated=False, window='cic', weight='Weight', selection='Selection', value='Value', position='Position')[source]¶Convert the catalog to a mesh, which knows how to “paint” the combined density field, summed over all particle species.