oasislmf.pytools.summary.manager

Entry point to the run method of summarypy: reads an event loss stream and aggregates it into the relevant summary loss streams

With numba (latest version 0.59) we are limited in our use of classes and have to pass each data structure into the different functions. The attributes are named consistently throughout the code; their definitions are below.
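This "pass every array into the function" pattern can be illustrated with a minimal sketch (hypothetical function name; the real functions carry a @numba.njit decorator, omitted here so the sketch runs with plain numpy):

```python
import numpy as np

# Hypothetical sketch of the pattern numba imposes: no classes, so every
# data structure is a plain ndarray passed explicitly into each function.
def store_item_loss(loss_index, loss_summary, sidx, loss):
    # write the loss of the current item into the row of each summary set
    for summary_set_index in range(loss_index.shape[0]):
        loss_summary[loss_index[summary_set_index], sidx] += loss

# toy state: 4 summary rows, sidx 0..2, current item maps to rows 0 and 2
loss_summary = np.zeros((4, 3))
loss_index = np.array([0, 2])
store_item_loss(loss_index, loss_summary, 1, 5.0)
```

The same arrays are threaded through every function in the module, which is why the attribute names are kept identical everywhere.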

Static intermediary data structure:

summary_pipes : dict summary_set_id to path, then updated to stream when reading starts

summary_sets_id : array of all summary_set_id that have a stream

summary_set_id_to_summary_set_index : array mapping summary_set_id => summary_set_index

summary_set_index_to_loss_ptr : mapping array summary_set_index => starting index of summary_id loss (first dimension) in the loss_summary table

item_id_to_summary_id : mapping array item_id => [summary_id for each summary set]

item_id_to_risks_i : mapping array item_id => risks_i (index of the corresponding risk)

event based intermediary data structure:

summary_set_index_to_present_loss_ptr_end : copy of summary_set_index_to_loss_ptr that stores the end of the list of summary_id for each summary set

present_summary_id : array that stores all the summary_id found in the event

each summary set starts at summary_set_index_to_loss_ptr and ends at summary_set_index_to_present_loss_ptr_end

loss_index : stores the index in loss_summary of each summary corresponding to an item, for quick lookup on read

loss_summary : 2D array of losses for each summary

loss_summary[loss_index[summary_set_index], sidx] = loss

is_risk_affected : array storing whether a risk has already been affected in this event
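The ptr / ptr_end pair above delimits, per summary set, the slice of present_summary_id seen so far in the event. A toy illustration (hypothetical sizes and values, assuming the append-to-slice behaviour described above):

```python
import numpy as np

# set 0 owns rows starting at 0, set 1 owns rows starting at 5
summary_set_index_to_loss_ptr = np.array([0, 5])
ptr_end = summary_set_index_to_loss_ptr.copy()   # nothing seen yet
present_summary_id = np.zeros(10, dtype=np.int64)

def mark_present(summary_set_index, summary_id):
    # append summary_id to the slice owned by this summary set
    present_summary_id[ptr_end[summary_set_index]] = summary_id
    ptr_end[summary_set_index] += 1

mark_present(0, 3)
mark_present(0, 7)
mark_present(1, 2)

# summary ids present in the event for set 0:
ids_set0 = present_summary_id[summary_set_index_to_loss_ptr[0]:ptr_end[0]]
```

Resetting each event then only requires copying summary_set_index_to_loss_ptr back into ptr_end, without reallocating any array.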

Module Contents

Classes

SummaryReader

Abstract class to read event stream

Functions

create_summary_object_file(static_path, run_type)

create and write summary object into static path

load_summary_object(static_path, run_type)

load already prepared summary data structures if present, otherwise create them

get_summary_object(static_path, run_type)

read static files to get summary static data structure

write_summary_objects(static_path, run_type, ...)

read_summary_objects(static_path, run_type)

nb_extract_risk_info(item_id_to_risks_i, ...)

extract_risk_info(len_item_id, summary_map)

extract relevant information regarding item and risk mapping from summary_map

read_buffer(byte_mv, cursor, valid_buff, event_id, ...)

read valid part of byte_mv and load relevant data for one event

mv_write_event(byte_mv, event_id, len_sample, ...)

load event summary loss into byte_mv

get_summary_set_id_to_summary_set_index(summary_sets_id)

create an array mapping summary_set_id => summary_set_index

get_summary_xref_info(summary_xref, summary_sets_id, ...)

extract mapping from summary_xref

run(files_in, static_path, run_type, low_memory, ...)

read event loss streams and write the aggregated summary loss streams

main(create_summarypy_files, static_path, run_type, ...)

Attributes

oasislmf.pytools.summary.manager.logger[source]
oasislmf.pytools.summary.manager.SPECIAL_SIDX_COUNT = 6[source]
oasislmf.pytools.summary.manager.SUMMARY_HEADER_SIZE[source]
oasislmf.pytools.summary.manager.SIDX_LOSS_WRITE_SIZE[source]
oasislmf.pytools.summary.manager.SUPPORTED_SUMMARY_SET_ID[source]
oasislmf.pytools.summary.manager.risk_key_type[source]
oasislmf.pytools.summary.manager.summary_info_dtype[source]
oasislmf.pytools.summary.manager.SUPPORTED_RUN_TYPE[source]
oasislmf.pytools.summary.manager.create_summary_object_file(static_path, run_type)[source]

create and write summary object into static path

oasislmf.pytools.summary.manager.load_summary_object(static_path, run_type)[source]

load already prepared summary data structures if present, otherwise create them

oasislmf.pytools.summary.manager.get_summary_object(static_path, run_type)[source]

read static files to get summary static data structure

oasislmf.pytools.summary.manager.write_summary_objects(static_path, run_type, summary_info, summary_set_id_to_summary_set_index, summary_set_index_to_loss_ptr, item_id_to_summary_id, item_id_to_risks_i)[source]
oasislmf.pytools.summary.manager.read_summary_objects(static_path, run_type)[source]
oasislmf.pytools.summary.manager.nb_extract_risk_info(item_id_to_risks_i, summary_map_item_ids, summary_map_loc_ids, summary_map_building_ids)[source]
oasislmf.pytools.summary.manager.extract_risk_info(len_item_id, summary_map)[source]

extract relevant information regarding item and risk mapping from summary_map

Args:

len_item_id: number of items

summary_map: numpy ndarray view of the summary_map

Returns:

(number of risks, mapping array item_id => risks_i)
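A hedged sketch of how such a mapping can be built, assuming (as the arguments of nb_extract_risk_info suggest) that a risk is identified by a (loc_id, building_id) pair; the function name and the dict-based grouping are illustrative only, not the real implementation:

```python
import numpy as np

def sketch_extract_risk_info(item_ids, loc_ids, building_ids, len_item_id):
    # item_id_to_risks_i is indexed directly by item_id (ids start at 1)
    item_id_to_risks_i = np.zeros(len_item_id + 1, dtype=np.int64)
    risk_keys = {}
    for item_id, loc_id, building_id in zip(item_ids, loc_ids, building_ids):
        key = (loc_id, building_id)
        if key not in risk_keys:
            risk_keys[key] = len(risk_keys)   # next fresh risk index
        item_id_to_risks_i[item_id] = risk_keys[key]
    return len(risk_keys), item_id_to_risks_i
```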

oasislmf.pytools.summary.manager.read_buffer(byte_mv, cursor, valid_buff, event_id, item_id, summary_sets_id, summary_set_index_to_loss_ptr, item_id_to_summary_id, loss_index, loss_summary, present_summary_id, summary_set_index_to_present_loss_ptr_end, item_id_to_risks_i, is_risk_affected, has_affected_risk)[source]

read valid part of byte_mv and load relevant data for one event

oasislmf.pytools.summary.manager.mv_write_event(byte_mv, event_id, len_sample, last_loss_summary_index, last_sidx, output_zeros, has_affected_risk, summary_set_index, summary_set_index_to_loss_ptr, summary_set_index_to_present_loss_ptr_end, present_summary_id, loss_summary, summary_index_cursor, summary_sets_cursor, summary_stream_index)[source]

load event summary loss into byte_mv

Args:

byte_mv: numpy byte view to write to the stream

event_id: event id

len_sample: max sample id

last_loss_summary_index: last summary index written (used to restart from the last summary when the buffer was full)

last_sidx: last sidx written in the buffer (used to restart from the correct sidx when the buffer was full)

see other args definition in run method

class oasislmf.pytools.summary.manager.SummaryReader(summary_sets_id, summary_set_index_to_loss_ptr, item_id_to_summary_id, loss_index, loss_summary, present_summary_id, summary_set_index_to_present_loss_ptr_end, item_id_to_risks_i, is_risk_affected, has_affected_risk)[source]

Bases: oasislmf.pytools.common.event_stream.EventReader

Abstract class to read event stream

This class provides a generic interface to read multiple event streams using:

- selector : handles back pressure; the program is paused and uses no resources if nothing is in the stream buffer

- memoryview : reads a chunk (PIPE_CAPACITY) of data at a time, then works on it using a numpy byte view of this buffer

To use it, those methods need to be implemented:

- __init__(self, …) : the constructor taking all data structures needed to read and store the event stream

- read_buffer(self, byte_mv, cursor, valid_buff, event_id, item_id) : simply points to a local numba.jit function named read_buffer (a template is provided below); this function should implement the specific logic of where and how to store the event information
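A hedged sketch of what such a local read_buffer-style function does: walk the valid part of the byte view, decode (sidx, loss) pairs for the current item until the (0, 0.0) closure, and return the updated cursor. The record layout (int32 sidx, float32 loss) is an assumption mirroring the Oasis loss stream format, and the dict target stands in for the real loss_summary arrays:

```python
import struct

def read_buffer_sketch(byte_mv, cursor, valid_buff, losses):
    # consume complete 8-byte (sidx, loss) records only
    while valid_buff - cursor >= 8:
        sidx, loss = struct.unpack_from('<if', byte_mv, cursor)
        cursor += 8
        if sidx == 0:          # (0, 0.) marks the end of the item
            break
        losses[sidx] = loss
    return cursor
```

Returning the cursor lets the caller shift any incomplete trailing record to the front of the buffer before the next read, which is what makes chunked reading safe.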

Those two methods may be overridden:

- item_exit(self) : specific logic to run when an item is finished (only executed once the stream is finished but no (0, 0) closure was present)

- event_read_log(self) : which KPIs to log when a full event is read

usage snippet:

    with ExitStack() as stack:
        streams_in, (stream_type, stream_agg_type, len_sample) = init_streams_in(files_in, stack)
        reader = CustomReader(<read relevant attributes>)
        for event_id in reader.read_streams(streams_in):
            <event logic>

read_buffer(byte_mv, cursor, valid_buff, event_id, item_id)[source]
oasislmf.pytools.summary.manager.get_summary_set_id_to_summary_set_index(summary_sets_id)[source]

create an array mapping summary_set_id => summary_set_index
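Since summary_set_id values are small integers (see SUPPORTED_SUMMARY_SET_ID), a plausible shape for this mapping is a dense array indexed directly by id, giving O(1) lookup inside jitted code; the sketch below (hypothetical function name, -1 as the "no stream" sentinel) shows the idea:

```python
import numpy as np

def sketch_summary_set_id_to_index(summary_sets_id):
    # dense lookup table: mapping[summary_set_id] -> summary_set_index
    mapping = np.full(summary_sets_id.max() + 1, -1, dtype=np.int64)
    for summary_set_index, summary_set_id in enumerate(summary_sets_id):
        mapping[summary_set_id] = summary_set_index
    return mapping
```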

oasislmf.pytools.summary.manager.get_summary_xref_info(summary_xref, summary_sets_id, summary_set_id_to_summary_set_index)[source]

extract mapping from summary_xref

oasislmf.pytools.summary.manager.run(files_in, static_path, run_type, low_memory, output_zeros, **kwargs)[source]
Args:

files_in: list of file paths to read events from

run_type: type of the source that is sending the stream

static_path: path to the static files

low_memory: if true, output a summary index file

output_zeros: if true, output 0 losses

**kwargs:

oasislmf.pytools.summary.manager.main(create_summarypy_files, static_path, run_type, **kwargs)[source]