beamds.beam.utils package#
Submodules#
beamds.beam.utils.utils_all module#
- class beamds.beam.utils.utils_all.BeamDict(initial_data=None, **kwargs)[source]#
Bases: dict, Namespace
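Since BeamDict derives from both dict and Namespace, its entries are presumably reachable both as mapping keys and as attributes. A minimal sketch under that assumption (the attribute/key equivalence is inferred from the bases, not documented above):

```python
from beamds.beam.utils.utils_all import BeamDict

# Assumed behaviour: keys passed via initial_data or **kwargs are accessible
# both dict-style and attribute-style.
cfg = BeamDict({'lr': 1e-3}, batch_size=32)

print(cfg['lr'])        # mapping access (dict base)
print(cfg.batch_size)   # attribute access (Namespace base, assumed to mirror the keys)
```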
- class beamds.beam.utils.utils_all.BeamJsonEncoder(*, skipkeys=False, ensure_ascii=True, check_circular=True, allow_nan=True, sort_keys=False, indent=None, separators=None, default=None)[source]#
Bases: JSONEncoder
- default(obj)[source]#
Implement this method in a subclass such that it returns a serializable object for o, or calls the base implementation (to raise a TypeError).
For example, to support arbitrary iterators, you could implement default like this:

    def default(self, o):
        try:
            iterable = iter(o)
        except TypeError:
            pass
        else:
            return list(iterable)
        # Let the base class default method raise the TypeError
        return super().default(o)
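Because BeamJsonEncoder is a regular JSONEncoder subclass, it can be passed to json.dumps through the standard cls argument. Which extra types its default() actually serializes is not documented here, so the NumPy array below is only an assumed example:

```python
import json
import numpy as np
from beamds.beam.utils.utils_all import BeamJsonEncoder

payload = {'name': 'model', 'weights': np.arange(3)}  # ndarray support is assumed

# json.dumps delegates any type it does not know to BeamJsonEncoder.default().
print(json.dumps(payload, cls=BeamJsonEncoder))
```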
- exception beamds.beam.utils.utils_all.CachedAttributeException[source]#
Bases: Exception
Custom exception to be raised instead of AttributeError in cached properties.
- class beamds.beam.utils.utils_all.DataBatch(index, label, data)#
Bases: tuple
- data#
Alias for field number 2
- index#
Alias for field number 0
- label#
Alias for field number 1
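DataBatch is a namedtuple with the fields index, label and data (field numbers 0, 1 and 2), so positional unpacking and named access are interchangeable:

```python
from beamds.beam.utils.utils_all import DataBatch

batch = DataBatch(index=[0, 1, 2], label=[1, 0, 1], data=[[0.1], [0.2], [0.3]])

# Unpack positionally or read the named fields; both views refer to the same objects.
index, label, data = batch
assert data is batch.data and index is batch[0]
```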
- class beamds.beam.utils.utils_all.DataObject(data, data_type=None)[source]#
Bases: object
- property data_type#
- class beamds.beam.utils.utils_all.LimitedSizeDict(size_limit=None, on_removal=None)[source]#
Bases: OrderedDict
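The constructor arguments suggest an OrderedDict that evicts the oldest entries once size_limit is exceeded and reports each eviction through on_removal. Both the eviction policy and the callback signature below are assumptions based on the signature, not documented behaviour:

```python
from beamds.beam.utils.utils_all import LimitedSizeDict

# Assumed semantics: keep at most size_limit items, drop the oldest first,
# and call on_removal with the removed entry (callback signature assumed).
cache = LimitedSizeDict(size_limit=2, on_removal=lambda *removed: print('evicted:', removed))

cache['a'] = 1
cache['b'] = 2
cache['c'] = 3          # presumably evicts 'a'
print(list(cache.keys()))
```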
- class beamds.beam.utils.utils_all.LimitedSizeDictFactory(size_limit=None, on_removal=None)[source]#
Bases: object
- class beamds.beam.utils.utils_all.ThreadSafeDict(*args, **kwargs)[source]#
Bases: dict
- pop(k[, d]) → v, remove specified key and return the corresponding value. [source]#
If the key is not found, return the default if given; otherwise, raise a KeyError.
- popitem()[source]#
Remove and return a (key, value) pair as a 2-tuple.
Pairs are returned in LIFO (last-in, first-out) order. Raises KeyError if the dict is empty.
- setdefault(key, default=None)[source]#
Insert key with a value of default if key is not in the dictionary.
Return the value for key if key is in the dictionary, else default.
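ThreadSafeDict re-implements the mutating dict methods, presumably wrapping them in an internal lock so that concurrent writers cannot corrupt the mapping. A sketch under that assumption:

```python
from concurrent.futures import ThreadPoolExecutor
from beamds.beam.utils.utils_all import ThreadSafeDict

results = ThreadSafeDict()

def store(i):
    # setdefault is one of the overridden methods; each call is assumed to be atomic.
    results.setdefault(i, i * i)

with ThreadPoolExecutor(max_workers=8) as pool:
    list(pool.map(store, range(100)))

print(len(results))  # expected: 100
```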
- class beamds.beam.utils.utils_all.Timer(logger, name='', silent=False, timeout=None, task=None, task_args=None, task_kwargs=None, graceful=False)[source]#
Bases: object
- property elapsed#
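Given the constructor signature (logger, name, silent, timeout, ...) and the elapsed property, Timer looks like a context manager that times a block and logs the result; the with-statement usage and the acceptance of a standard logging.Logger below are assumptions:

```python
import logging
import time
from beamds.beam.utils.utils_all import Timer

logger = logging.getLogger('beam-demo')

# Assumed usage: time the enclosed block and expose the duration via .elapsed.
with Timer(logger, name='preprocessing') as t:
    time.sleep(0.5)

print(t.elapsed)
```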
- beamds.beam.utils.utils_all.beam_traceback(exc_type=None, exc_value=None, tb=None, context=3)[source]#
- beamds.beam.utils.utils_all.build_container_from_tupled_keys(keys, values, sorted_keys=None)[source]#
- beamds.beam.utils.utils_all.deserialize_annotation(annotation_str, global_ns=None)[source]#
Convert serialized annotation back to its original format.
- beamds.beam.utils.utils_all.dict_to_signature(d, global_ns=None)[source]#
Convert a dictionary representation back to a Signature object.
- beamds.beam.utils.utils_all.find_port(port=None, get_port_from_beam_port_range=True, application='none', blacklist=None, whitelist=None)[source]#
- beamds.beam.utils.utils_all.get_notebook_name()[source]#
Execute JavaScript code to save the Jupyter notebook name to the variable notebook_name.
- beamds.beam.utils.utils_all.getmembers(object, predicate=None)[source]#
Return all members of an object as (name, value) pairs sorted by name. Optionally, only return members that satisfy a given predicate.
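Usage mirrors inspect.getmembers: pass an object and an optional predicate to filter the (name, value) pairs, for example:

```python
import inspect
from beamds.beam.utils import utils_all

# Only the functions defined in the module, sorted by name.
functions = utils_all.getmembers(utils_all, predicate=inspect.isfunction)
print([name for name, _ in functions][:5])
```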
- beamds.beam.utils.utils_all.include_patterns(*patterns)[source]#
Factory function for the ignore parameter of shutil.copytree(). The arguments define a sequence of glob-style patterns that specify which files NOT to ignore. The returned callable is invoked for each directory in the file hierarchy rooted at the source directory when used with shutil.copytree().
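As described above, the returned callable plugs into the ignore parameter of shutil.copytree() so that only the matching files are copied (the src/dst paths here are placeholders):

```python
import shutil
from beamds.beam.utils.utils_all import include_patterns

# Copy only Python sources and YAML configs; everything else is ignored.
shutil.copytree('src', 'dst', ignore=include_patterns('*.py', '*.yaml'))
```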
- beamds.beam.utils.utils_all.jupyter_like_traceback(exc_type=None, exc_value=None, tb=None, context=3)[source]#
- class beamds.beam.utils.utils_all.nested_defaultdict(default_factory=None, **kwargs)[source]#
Bases: defaultdict
- beamds.beam.utils.utils_all.parse_string_number(x, time_units=None, unit_prefixes=None, timedelta_format=True, return_units=False)[source]#
- beamds.beam.utils.utils_all.retry(func=None, retries=3, logger=None, name=None, verbose=False, sleep=1, timeout=None)[source]#
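The func=None default suggests retry works both as a bare decorator and as a decorator factory; the sketch below assumes the factory form, with failures retried up to retries times and sleep seconds between attempts:

```python
import random
from beamds.beam.utils.utils_all import retry

# Assumed decorator-factory usage.
@retry(retries=5, sleep=2, name='flaky-download')
def flaky_download():
    if random.random() < 0.7:
        raise ConnectionError('transient failure')
    return 'payload'

print(flaky_download())
```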
- beamds.beam.utils.utils_all.run_forever(func=None, *args, sleep=1, name=None, logger=None, **kwargs)[source]#
- beamds.beam.utils.utils_all.serialize_annotation(annotation)[source]#
Convert annotation to a serializable format.
- beamds.beam.utils.utils_all.signature_to_dict(signature)[source]#
Convert a Signature object to a dictionary representation.
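signature_to_dict and dict_to_signature appear to be inverses (with serialize_annotation/deserialize_annotation handling the annotations), so a round trip through the dictionary form should reproduce the original inspect.Signature. The round trip is assumed from the docstrings:

```python
import inspect
from beamds.beam.utils.utils_all import dict_to_signature, signature_to_dict

def fit(x: list, lr: float = 1e-3) -> dict:
    return {'x': x, 'lr': lr}

sig = inspect.signature(fit)
as_dict = signature_to_dict(sig)       # serializable representation
restored = dict_to_signature(as_dict)  # assumed to rebuild an equivalent Signature

print(restored)
```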
- beamds.beam.utils.utils_all.slice_array(x, index, x_type=None, indices_type=None, wrap_object=False)[source]#
- beamds.beam.utils.utils_all.tqdm_beam(x, *args, threshold=10, stats_period=1, message_func=None, enable=None, notebook=True, **argv)[source]#
Beam’s wrapper for the tqdm progress bar. It provides a universal interface for both Jupyter notebooks and .py files. In addition, it supports lazy progress-bar initialization: the progress bar is created only if its estimated duration exceeds a threshold (see the usage sketch after the parameter list).
- Parameters:
x – the iterable to wrap.
threshold (float) – The smallest expected duration (in seconds) for which a progress bar is generated. This feature is used only if enable is set to None.
stats_period (float) – The initial time period (in seconds) used to calculate the iteration statistics (iters/sec). These statistics are used to estimate the expected duration of the entire iteration.
message_func (func) – A dynamic message to add to the progress bar. For example, this message can plot the instantaneous loss.
enable (boolean/None) – Whether to enable the progress bar, disable it, or (when set to None) use the lazy progress bar.
notebook (boolean) – A boolean that overrides the internal calculation of is_notebook. Set to False when you want to avoid printing notebook-styled tqdm bars (for example, due to multiprocessing).
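A typical call simply wraps an iterable; with the default enable=None the bar appears lazily, only when the loop is expected to run longer than threshold seconds:

```python
import time
from beamds.beam.utils.utils_all import tqdm_beam

# ~10 seconds of work: longer than threshold=5, so the bar materializes.
for item in tqdm_beam(range(100), threshold=5, stats_period=1):
    time.sleep(0.1)
```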
beamds.beam.utils.utils_ds module#
- beamds.beam.utils.utils_ds.collate_chunks(*xs, keys=None, dim=0, on='index', how='outer', method='tree', squeeze=True, logger=None)[source]#
- beamds.beam.utils.utils_ds.divide_chunks(x, chunksize=None, n_chunks=None, partition=None, squeeze=False, dim=0, x_type=None, chunksize_policy='round')[source]#
- beamds.beam.utils.utils_ds.get_chunks(x, chunksize=None, n_chunks=None, partition=None, dim=0)[source]#
- beamds.beam.utils.utils_ds.hash_tensor(x, fast=False, coarse=False)[source]#
This function returns a deterministic hash of the tensor content.
- Parameters:
x – the tensor to hash
fast – whether to consider only the first and last elements of the tensor for hashing
coarse – whether to apply coarse hashing, where the tensor is quantized into a low-resolution (16-bit) tensor
- Returns:
an integer representing the hash value
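A sketch of the assumed usage with a PyTorch tensor; fast and coarse trade exactness for speed as described above:

```python
import torch
from beamds.beam.utils.utils_ds import hash_tensor

x = torch.randn(1000, 8)

# Identical content should produce identical hashes; fast/coarse are cheaper
# but look only at part of the data or at a 16-bit quantization of it.
assert hash_tensor(x) == hash_tensor(x.clone())
print(hash_tensor(x, fast=True), hash_tensor(x, coarse=True))
```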
- beamds.beam.utils.utils_ds.recursive_chunks(x, chunksize=None, n_chunks=None, partition=None, squeeze=False, dim=0, x_type=None, chunksize_policy='round')[source]#
- beamds.beam.utils.utils_ds.recursive_collate_chunks(*xs, dim=0, on='index', how='outer', method='tree')[source]#
- beamds.beam.utils.utils_ds.recursive_flatten(x, flat_array=False, x_type=None, tolist=True, _root=True, depth=-1)[source]#
- beamds.beam.utils.utils_ds.set_seed(seed=-1, constant=0, increment=False, deterministic=False)[source]#
- Parameters:
seed – set to -1 to avoid changing the current seed, 0 to randomly select a seed, or a value in [1, 2**31) to use that value as the new seed
constant – a constant to be added to the seed
increment – whether to generate incremental seeds
deterministic – whether to set torch to be deterministic
- Returns:
None
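A typical call seeds the random number generators once at program start, following the parameter semantics above:

```python
from beamds.beam.utils.utils_ds import set_seed

# Fix a specific seed and request deterministic torch kernels.
set_seed(seed=42, deterministic=True)

# Or let the function pick a random seed (seed=0 per the docstring).
set_seed(seed=0)
```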