beamds.beam.data package#
Submodules#
beamds.beam.data.beam_data module#
- class beamds.beam.data.beam_data.BeamData(*args, data=None, path=None, name=None, all_paths=None, index=None, label=None, columns=None, lazy=True, device=None, target_device=None, schema=None, override=False, compress=None, split_by='keys', chunksize=1000000000, chunklen=None, n_chunks=None, key_map=None, partition=None, archive_size=1000000, preferred_orientation='index', read_kwargs=None, write_kwargs=None, quick_getitem=False, orientation=None, glob_filter=None, info=None, synced=False, write_metadata=True, read_metadata=True, metadata_path_prefix=None, key_fold_map=None, chunksize_policy='round', **kwargs)[source]#
Bases: BeamName
- property all_paths#
- as_tensor(device=None, dtype=None, return_vector=False)[source]#
Convert the data to tensors in place. Parameters: device, dtype, return_vector.
- clone(*args, data=None, path=None, all_paths=None, key_map=None, index=None, label=None, columns=None, schema=None, orientation=None, info=None, constructor=None, key_fold_map=None, **kwargs)[source]#
- columns_chunk_file_extension = '.columns_chunk'#
- property columns_map#
- columns_partition_directory_name = '.columns_part'#
- property conf#
- static data_batch(data, index=None, label=None, orientation=None, info=None, flatten_index=False, flatten_label=False)[source]#
- property data_type#
- property data_types#
- default_data_file_name = 'data_container'#
- property device#
- property dtypes#
- property flatten_data#
- property flatten_items#
- get_default_params(*args, **kwargs)[source]#
Get the default parameters for the class. Parameters: args, kwargs.
- static get_n_chunks(data, n_chunks=None, chunklen=None, chunksize=None, size=None, chunksize_policy='round')[source]#
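The interplay between `n_chunks`, `chunklen`, and `chunksize` is not documented here. Given the class defaults (`chunksize=1000000000`, `chunksize_policy='round'`), a plausible reading is that the chunk count is derived from whichever hint is supplied, with the policy controlling how fractional counts are rounded. A minimal sketch of that derivation, assuming a priority order of `n_chunks` > `chunklen` > `chunksize` — the helper name and exact precedence are assumptions, not BeamData's actual implementation:

```python
import math

def n_chunks_sketch(size, total_len, n_chunks=None, chunklen=None,
                    chunksize=10**9, chunksize_policy='round'):
    """Hypothetical sketch: derive a chunk count from one of three hints.

    size      -- total size in bytes
    total_len -- number of rows/items
    Assumed priority: explicit n_chunks > chunklen > chunksize.
    """
    rounder = {'round': round, 'ceil': math.ceil, 'floor': math.floor}[chunksize_policy]
    if n_chunks is not None:
        return max(1, n_chunks)
    if chunklen is not None:
        # chunklen fixes the number of items per chunk
        return max(1, rounder(total_len / chunklen))
    # otherwise target roughly chunksize bytes per chunk
    return max(1, rounder(size / chunksize))

# e.g. a 3.4 GB object with the default 1 GB chunksize and 'round' policy
print(n_chunks_sketch(size=3_400_000_000, total_len=10_000_000))  # 3
```

Under this reading, `chunksize_policy` only matters when the requested size does not divide the data evenly.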
- property has_index#
- property has_label#
- property hash#
- property index#
- index_chunk_file_extension = '.index_chunk'#
- index_partition_directory_name = '.index_part'#
- property info#
- property key_map#
- property label#
- metadata_files = {'all_paths': '.all_paths.pkl', 'aux': '.aux.pkl', 'conf': '.conf.pkl', 'index': '.index', 'info': '.info.fea', 'label': '.label', 'schema': '.schema.pkl'}#
- property metadata_paths#
- property objects_type#
- property orientation#
- property parent#
- property path#
- property root_path#
- property schema#
- property schema_type#
- property shape#
- property simplified#
- property size#
- static slice_scalar_or_list(data, keys, data_type=None, keys_type=None, replace_missing=False)[source]#
- property stack#
- property stacked_index#
- property stacked_labels#
- property stacked_values#
- store(path=None, data=None, compress=None, chunksize=None, chunklen=None, n_chunks=None, partition=None, split_by=None, archive_size=None, override=None, split=True, chunksize_policy=None, **kwargs)[source]#
- property total_size#
- property values#
- static write_object(data, path, override=True, size=None, archive=False, compress=None, chunksize=1000000000, chunklen=None, n_chunks=None, partition=None, file_type=None, schema=None, textual_serialization=False, split_by=None, split=True, priority=None, blacklist_priority=None, chunksize_policy='round', **kwargs)[source]#
- static write_tree(data, path, sizes=None, split_by='keys', archive_size=1000000, chunksize=1000000000, override=True, chunklen=None, n_chunks=None, partition=None, file_type=None, root=False, schema=None, split=False, textual_serialization=False, blacklist_priority=None, chunksize_policy='round', **kwargs)[source]#
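`write_tree` persists a nested data structure under a root path; with the default `split_by='keys'`, a natural reading is that each dictionary key becomes a path component, so every leaf object lands in its own sub-path. A minimal sketch of that key-to-path flattening — the traversal and naming scheme here are illustrative assumptions, not BeamData's actual on-disk layout (which also involves archiving small items via `archive_size` and chunking large ones):

```python
def flatten_tree_paths(data, root=''):
    """Hypothetical sketch: map a nested dict of objects to
    file-system-like paths, one leaf per path (split_by='keys')."""
    paths = {}
    for key, value in data.items():
        sub = f"{root}/{key}" if root else str(key)
        if isinstance(value, dict):
            # recurse: nested keys extend the path
            paths.update(flatten_tree_paths(value, sub))
        else:
            paths[sub] = value
    return paths

tree = {'train': {'x': [1, 2, 3], 'y': [0, 1, 1]}, 'test': {'x': [4, 5]}}
print(flatten_tree_paths(tree))
# {'train/x': [1, 2, 3], 'train/y': [0, 1, 1], 'test/x': [4, 5]}
```

The per-leaf writes themselves would then go through something like `write_object`, which handles compression, chunking, and file-type selection for a single object.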