oasislmf.pytools.summary.manager
================================

.. py:module:: oasislmf.pytools.summary.manager

.. autoapi-nested-parse::

   Entry point to the run method of summarypy.

   Reads an event loss stream and aggregates it into the relevant summary loss streams.

   With numba (latest version 0.59), we are limited in our usage of classes and have to pass
   each data structure into the different functions. The attributes are named the same
   throughout the code; their definitions are below.

   Static intermediary data structures:
       summary_pipes : dict
           summary_set_id to path, then updated to stream when reading starts
       summary_sets_id : array
           all summary_set_id that have a stream
       summary_set_id_to_summary_set_index : array
           mapping summary_set_id => summary_set_index
       summary_set_index_to_loss_ptr : array
           mapping summary_set_index => starting index of summary_id loss
           (first dimension) in the loss_summary table
       item_id_to_summary_id : array
           mapping item_id => [summary_id for each summary set]
       item_id_to_risks_i : array
           mapping item_id => risks_i (index of the corresponding risk)

   Event-based intermediary data structures:
       summary_set_index_to_present_loss_ptr_end : array
           copy of summary_set_index_to_loss_ptr that stores the end of the list of
           summary_id for each summary set
       present_summary_id : array
           stores all the summary_id found in the event; each summary set starts at
           summary_set_index_to_loss_ptr and ends at
           summary_set_index_to_present_loss_ptr_end
       loss_index : array
           stores the index in loss_summary of each summary corresponding to an item,
           for quick lookup on read
       loss_summary : 2D array
           losses for each summary: loss_summary[loss_index[summary_set_index], sidx] = loss
       is_risk_affected : array
           stores whether a risk has already been affected or not in this event

Attributes
----------
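The mapping arrays above can be illustrated with a small sketch. The values here are made up for illustration; the real arrays are built from the summary xref inputs by this module's functions.

```python
import numpy as np

# Hedged sketch of how the static mapping arrays fit together; all values
# are hypothetical, not produced by the actual module.
summary_sets_id = np.array([1, 2], dtype=np.int32)

# summary_set_id => summary_set_index (unused ids map to -1)
summary_set_id_to_summary_set_index = np.full(summary_sets_id.max() + 1, -1, dtype=np.int32)
for index, set_id in enumerate(summary_sets_id):
    summary_set_id_to_summary_set_index[set_id] = index

# item_id => [summary_id for each summary set]; row 0 pads for 1-based item ids
item_id_to_summary_id = np.array([
    [0, 0],
    [1, 1],  # item 1 belongs to summary 1 in both sets
    [1, 2],  # item 2 belongs to summary 1 in set 1, summary 2 in set 2
], dtype=np.int32)

# look up the summary_id of item 2 in summary set 2
summary_set_index = summary_set_id_to_summary_set_index[2]
summary_id = item_id_to_summary_id[2, summary_set_index]
```

This two-step lookup (id => index => pointer into a flat table) is what lets the event loop stay in plain numba-friendly arrays instead of dicts or classes.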
.. autoapisummary::

   oasislmf.pytools.summary.manager.logger
   oasislmf.pytools.summary.manager.SPECIAL_SIDX_COUNT
   oasislmf.pytools.summary.manager.SUMMARY_HEADER_SIZE
   oasislmf.pytools.summary.manager.SIDX_LOSS_WRITE_SIZE
   oasislmf.pytools.summary.manager.SUPPORTED_SUMMARY_SET_ID
   oasislmf.pytools.summary.manager.risk_key_type
   oasislmf.pytools.summary.manager.summary_info_dtype
   oasislmf.pytools.summary.manager.SUPPORTED_RUN_TYPE

Classes
-------

.. autoapisummary::

   oasislmf.pytools.summary.manager.SummaryReader

Functions
---------

.. autoapisummary::

   oasislmf.pytools.summary.manager.create_summary_object_file
   oasislmf.pytools.summary.manager.load_summary_object
   oasislmf.pytools.summary.manager.get_summary_object
   oasislmf.pytools.summary.manager.write_summary_objects
   oasislmf.pytools.summary.manager.read_summary_objects
   oasislmf.pytools.summary.manager.nb_extract_risk_info
   oasislmf.pytools.summary.manager.extract_risk_info
   oasislmf.pytools.summary.manager.read_buffer
   oasislmf.pytools.summary.manager.mv_write_event
   oasislmf.pytools.summary.manager.get_summary_set_id_to_summary_set_index
   oasislmf.pytools.summary.manager.get_summary_xref_info
   oasislmf.pytools.summary.manager.run
   oasislmf.pytools.summary.manager.main

Module Contents
---------------

.. py:data:: logger

.. py:data:: SPECIAL_SIDX_COUNT
   :value: 6

.. py:data:: SUMMARY_HEADER_SIZE

.. py:data:: SIDX_LOSS_WRITE_SIZE

.. py:data:: SUPPORTED_SUMMARY_SET_ID

.. py:data:: risk_key_type

.. py:data:: summary_info_dtype

.. py:data:: SUPPORTED_RUN_TYPE

.. py:function:: create_summary_object_file(static_path, run_type)

   Create and write the summary objects into the static path.

.. py:function:: load_summary_object(static_path, run_type)

   Load the already prepared summary data structures if present, otherwise create them.

.. py:function:: get_summary_object(static_path, run_type)

   Read the static files to get the summary static data structures.
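The load-or-create behaviour described for ``load_summary_object`` can be sketched as a small caching pattern. This is a minimal illustration only: the file name and build function below are hypothetical, not the module's actual ones.

```python
import numpy as np
from pathlib import Path

# Hedged sketch of a load-or-create pattern: reuse a pre-built array saved
# under static_path if it exists, otherwise build it and persist it so the
# next run can skip the build. "summary_object.npy" is an illustrative name.
def load_or_create_array(static_path, build_fn, name="summary_object.npy"):
    path = Path(static_path) / name
    if path.exists():
        return np.load(path)  # already prepared: just load it
    arr = build_fn()          # otherwise build the structure ...
    np.save(path, arr)        # ... and persist it for later runs
    return arr
```

Persisting the prepared arrays is what makes repeated runs against the same static files cheap, at the cost of one extra write on the first run.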
.. py:function:: write_summary_objects(static_path, run_type, summary_info, summary_set_id_to_summary_set_index, summary_set_index_to_loss_ptr, item_id_to_summary_id, item_id_to_risks_i)

.. py:function:: read_summary_objects(static_path, run_type)

.. py:function:: nb_extract_risk_info(item_id_to_risks_i, summary_map_item_ids, summary_map_loc_ids, summary_map_building_ids)

.. py:function:: extract_risk_info(len_item_id, summary_map)

   Extract the relevant information regarding the item-to-risk mapping from summary_map.

   Args:
       len_item_id: number of items
       summary_map: numpy ndarray view of the summary_map

   Returns:
       (number of risks, mapping array item_id => risks_i)

.. py:function:: read_buffer(byte_mv, cursor, valid_buff, event_id, item_id, summary_sets_id, summary_set_index_to_loss_ptr, item_id_to_summary_id, loss_index, loss_summary, present_summary_id, summary_set_index_to_present_loss_ptr_end, item_id_to_risks_i, is_risk_affected, has_affected_risk)

   Read the valid part of byte_mv and load the relevant data for one event.

.. py:function:: mv_write_event(byte_mv, event_id, len_sample, last_loss_summary_index, last_sidx, output_zeros, has_affected_risk, summary_set_index, summary_set_index_to_loss_ptr, summary_set_index_to_present_loss_ptr_end, present_summary_id, loss_summary, summary_index_cursor, summary_sets_cursor, summary_stream_index)

   Load the event summary losses into byte_mv.

   Args:
       byte_mv: numpy byte view to write to the stream
       event_id: event id
       len_sample: max sample id
       last_loss_summary_index: last summary index written (used to restart from the
           last summary when the buffer was full)
       last_sidx: last sidx written in the buffer (used to restart from the correct
           sidx when the buffer was full)

   See the other args' definitions in the run method.
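The idea behind ``extract_risk_info`` / ``nb_extract_risk_info`` can be sketched from the parameter names: items sharing the same (loc_id, building_id) pair belong to the same risk, so each unique pair gets a risk index ``risks_i``. This is a hedged plain-Python rendition of that idea, not the module's actual numba implementation.

```python
import numpy as np

# Illustrative sketch: assign a risk index to each unique
# (loc_id, building_id) key and map every item_id to its risk index.
# Inputs and the grouping key are assumptions drawn from the signature of
# nb_extract_risk_info, not the verified implementation.
def extract_risk_info_sketch(item_ids, loc_ids, building_ids):
    item_id_to_risks_i = np.zeros(item_ids.max() + 1, dtype=np.int32)
    risk_keys = {}
    for item_id, loc_id, building_id in zip(item_ids, loc_ids, building_ids):
        key = (loc_id, building_id)
        if key not in risk_keys:
            risk_keys[key] = len(risk_keys)  # first time seen => next risk index
        item_id_to_risks_i[item_id] = risk_keys[key]
    return len(risk_keys), item_id_to_risks_i
```

The resulting ``item_id_to_risks_i`` array is what lets ``read_buffer`` mark a risk as affected (``is_risk_affected``) the first time any of its items appears in an event.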
.. py:class:: SummaryReader(summary_sets_id, summary_set_index_to_loss_ptr, item_id_to_summary_id, loss_index, loss_summary, present_summary_id, summary_set_index_to_present_loss_ptr_end, item_id_to_risks_i, is_risk_affected, has_affected_risk)

   Bases: :py:obj:`oasislmf.pytools.common.event_stream.EventReader`

   Abstract class to read an event stream.

   This class provides a generic interface to read multiple event streams using:

   - selector: handles back pressure; the program is paused and doesn't use resources
     if nothing is in the stream buffer
   - memoryview: reads a chunk (PIPE_CAPACITY) of data at a time, then works on it
     using a numpy byte view of this buffer

   To use it, these methods need to be implemented:

   - __init__(self, ...): the constructor, with all the data structures needed to read
     and store the event stream
   - read_buffer(self, byte_mv, cursor, valid_buff, event_id, item_id): simply points to
     a local numba.jit function named read_buffer (a template is provided below); this
     function should implement the specific logic of where and how to store the event
     information

   These two methods may be overwritten:

   - item_exit(self): specific logic to run when an item is finished (only executed once
     the stream is finished but no 0,0 closure was present)
   - event_read_log(self): what KPIs to log when a full event has been read

   Usage snippet::

       with ExitStack() as stack:
           streams_in, (stream_type, stream_agg_type, len_sample) = init_streams_in(files_in, stack)
           reader = CustomReader()
           for event_id in reader.read_streams(streams_in):
               ...

   .. py:attribute:: summary_sets_id

   .. py:attribute:: summary_set_index_to_loss_ptr

   .. py:attribute:: item_id_to_summary_id

   .. py:attribute:: loss_index

   .. py:attribute:: loss_summary

   .. py:attribute:: present_summary_id

   .. py:attribute:: summary_set_index_to_present_loss_ptr_end

   .. py:attribute:: item_id_to_risks_i

   .. py:attribute:: is_risk_affected

   .. py:attribute:: has_affected_risk

   .. py:attribute:: logger

   .. py:method:: read_buffer(byte_mv, cursor, valid_buff, event_id, item_id)
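A ``read_buffer``-style function can be sketched as a cursor walk over the valid part of the byte view. The (sidx, loss) int32/float32 record layout below is an assumption for illustration; the real stream format, the (0, 0) item closure, and the ``numba.jit`` decoration are defined by oasislmf's event stream code.

```python
import numpy as np

# Hedged sketch of the read_buffer pattern: walk the valid part of a byte
# buffer as (sidx, loss) int32/float32 pairs, aggregate losses per sidx, and
# stop at the 0-sidx record that closes the item. Record layout is assumed.
def read_buffer_sketch(byte_mv, cursor, valid_buff, loss_out):
    while cursor + 8 <= valid_buff:           # need a full 8-byte record
        sidx = int(np.frombuffer(byte_mv, np.int32, 1, cursor)[0])
        loss = float(np.frombuffer(byte_mv, np.float32, 1, cursor + 4)[0])
        cursor += 8
        if sidx == 0:                         # (0, 0) closure ends the item
            break
        loss_out[sidx] += loss
    return cursor                             # caller resumes from here
```

Returning the cursor rather than consuming the whole buffer is what lets the reader resume cleanly when a record straddles two chunks.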
.. py:function:: get_summary_set_id_to_summary_set_index(summary_sets_id)

   Create an array mapping summary_set_id => summary_set_index.

.. py:function:: get_summary_xref_info(summary_xref, summary_sets_id, summary_set_id_to_summary_set_index)

   Extract the mappings from summary_xref.

.. py:function:: run(files_in, static_path, run_type, low_memory, output_zeros, **kwargs)

   Args:
       files_in: list of file paths to read events from
       static_path: path to the static files
       run_type: type of the source that is sending the stream
       low_memory: if true, output the summary index file
       output_zeros: if true, output 0 losses
       **kwargs:

.. py:function:: main(create_summarypy_files, static_path, run_type, **kwargs)