skutils package
Submodules
skutils.FemtoDAQController module
- class skutils.FemtoDAQController.FemtoDAQController(url: str, verbose: bool = False)[source]
Bases:
object
- property adc_max_val
- property adc_min_val
- property channels
- configureCoincidence(mode: Literal['hit_pattern', 'multiplicity'], multiplicity: int | None = None, hit_pattern: Dict[str, str] | None = None)[source]
Configures coincidence prior to an experimental run.
- Parameters:
mode –
the coincidence mode for triggering. Must be one of two options.
- "multiplicity" : global trigger requires at least the specified number of individual channels to trigger
- "hit_pattern" : global trigger requires specific channels to trigger or not trigger. AKA coincidence/anti-coincidence/ignore hit pattern
multiplicity – Required if mode = "multiplicity". The minimum number of individual channel triggers required to define an Event. This argument is ignored if mode is "hit_pattern"
hit_pattern –
Required if mode = "hit_pattern". This argument must be a dictionary. Keys are channel numbers, and each value is one of:
'COINCIDENCE' : a trigger is required on this channel
'ANTICOINCIDENCE' : a trigger is not allowed on this channel
'IGNORE' : triggers on this channel have no impact on the Event
All channels must be present in the dictionary passed to configureCoincidence. helpers.HitPatternCoincidenceBuilder is a simple builder that fills in unspecified channels with "IGNORE". This argument is ignored if mode is "multiplicity"
Hit Pattern Example
hit_pattern = {"channel_0_trigger_hit_pattern": "COINCIDENCE", "channel_1_trigger_hit_pattern": "ANTICOINCIDENCE"}
digitizer.configureCoincidence("hit_pattern", hit_pattern=hit_pattern)
Multiplicity Example
multiplicity = 3
digitizer.configureCoincidence("multiplicity", multiplicity=multiplicity)
- configureRecording(channels: Sequence[int], run_name: str = 'API_Recording', format_type: str = 'gretina', record_waves: bool = True, record_summaries: bool = False, directory: str | None = None, seq_file_size_MB: int = 100, only_record_triggered_channels: bool = False)[source]
Configures file recording prior to an experimental run.
- Parameters:
channels – list of channels to record during this experimental run
run_name – The name of this experimental run. This string, along with a date code, is prepended to the names of all data files generated during this run
format_type – The file format to use. Call getRecordingFormatInfo for a full list of data formats and the data products they support
record_waves – True to save waveforms. This will raise an error if the specified file format doesn’t support waveform recording. Default is True.
record_summaries – True to save pulse summaries. This will raise an error if the specified file format doesn’t support pulse summary recording. Default is False.
directory – The remote directory on your FemtoDAQ unit to save data to. If left as None, defaults to the data partition directory on the FemtoDAQ unit. SkuTek recommends keeping this at its default unless you want the FemtoDAQ unit to save data over an NFS mount.
seq_file_size_MB – Maximum size data file in MB before a new file is created. 100MB by default.
only_record_triggered_channels –
If True, then only record the channels in the channels list that triggered in the event. This is more efficient and reduces the likelihood of noise waveforms ending up in your data files. If left as False, the default, then all specified channels will be written to disk even if no trigger was detected. This is less efficient, but ensures that the same channels will be in each event.
Warning: Setting this to True can result in varying event sizes and, in some cases, empty cells in row/column type file formats such as the IGOR Pulse Height format.
- Raises:
RuntimeError – if record_waves is True, but the specified file format does not support waveform recording.
RuntimeError – if record_summaries is True, but the specified file format does not support pulse summary recording.
RuntimeError – if directory is specified, but does not exist or is inaccessible from your FemtoDAQ digitizer.
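Recording Example
A minimal sketch, not taken from the library documentation: it assumes a FemtoDAQ unit reachable at the hypothetical URL "http://vireo-000123.tek" and uses illustrative run and channel values.
from skutils import FemtoDAQController

digitizer = FemtoDAQController("http://vireo-000123.tek")  # hypothetical unit URL
# Record waveforms from channels 0 and 1 in the native gretina format,
# rolling over to a new file every 100 MB.
digitizer.configureRecording(
    channels=[0, 1],
    run_name="overnight_background",
    format_type="gretina",
    record_waves=True,
    record_summaries=False,
    seq_file_size_MB=100,
)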
- configureSoftwareStreaming(channels: Sequence[int], format: str, target_ip: str, target_port: int | str, only_stream_triggered_channels: bool = False)[source]
Configures streaming readout from software prior to an experimental run.
- Parameters:
channels – list of channels to stream during this experimental run
target_ip – The IP address to stream to.
target_port – The network port at the specified IP address to stream to.
only_stream_triggered_channels – If True, then only stream the channels in the channels list that triggered in the event. This is more efficient and reduces the likelihood of noise waveforms ending up in your data. If left as False, the default, then all specified channels will be streamed even if no trigger was detected. This is less efficient, but ensures that the same channels will be in each event.
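Streaming Example
A minimal sketch continuing with the digitizer object from the recording example above; the destination address, port, and format string are illustrative assumptions (check getRecordingFormatInfo for the formats your unit supports).
# Stream channels 0 and 1 to a hypothetical analysis machine at 192.168.1.50:9000.
digitizer.configureSoftwareStreaming(
    channels=[0, 1],
    format="gretina",          # assumed format string; verify against getRecordingFormatInfo
    target_ip="192.168.1.50",  # illustrative destination
    target_port=9000,
    only_stream_triggered_channels=True,
)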
- downloadFile(filename: str, save_to: str | None = None, silent: bool = False)[source]
Download a remote file, save it to a location on disk, and optionally print progress information.
- Parameters:
filename – Remote file to download
save_to – the local destination to save the file to
silent – if True, suppress printed output
- downloadLastRunDataFiles(save_to: str | None = None)[source]
Iterate through all data files from the last run and download them.
- Parameters:
save_to – an optional parameter specifying where to save the data.
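Download Example
A minimal sketch continuing with the digitizer object from the recording example above; the local directory name is illustrative.
# Download everything recorded during the most recent run into ./data.
digitizer.downloadLastRunDataFiles(save_to="./data")

# Or list the remote files from the last run and download them explicitly.
for remote_file in digitizer.getListOfDataFiles(last_run_only=True):
    digitizer.downloadFile(remote_file, save_to="./data")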
- getBaselineRestorationExclusion()[source]
Get the window used by baseline restoration exclusion to exclude your triggered pulse.
- getCoincidenceSettings()[source]
Obtain the current Coincidence settings.
See the FemtoDAQ web docs for more information on the returned packet; this function returns the "data" field of that packet.
- getDigitalOffset(channel: int)[source]
Get the digital offset of a specified channel, in ADC counts, relative to the displayed value. This means the offset is not inverted when you enable inverting waveforms.
- Parameters:
channel – channel to get the offset from
- getEnableBaselineRestoration()[source]
Get whether the Baseline Restoration feature is enabled. Baseline Restoration is not supported on all products.
- getEnableTrigger(channel: int)[source]
Get whether triggering is enabled for a channel.
- Parameters:
channel – Channel to get the trigger enabled status from.
- getHistogramQuantity(channel: int)[source]
Get the quantity histogrammed at each event; see setHistogramQuantity for the meaning of each value.
- Parameters:
channel – Channel to get the quantity histogrammed per event
- getHistogramScaling(channel: int)[source]
Get the state of the histogram scaling for a specified channel.
- Parameters:
channel – Channel to get the histogram scaling state from.
- getListOfDataFiles(last_run_only: bool = False) Sequence[str] [source]
Get the list of all remote data files.
- Parameters:
last_run_only – If true, only gets the data files recorded in the last run.
- getPulseHeightAveragingWindow()[source]
Get how many pulse heights are averaged together when forming a trigger.
- getPulseHeightWindow()[source]
Get the window, relative to the trigger, in which the maximum pulse value is measured.
- getRecordingFormatInfo()[source]
returns the output of utils.get_streaming_formats from the target unit
- getRecordingSettings()[source]
Get the current recording settings. For details on the exact return value, see the FemtoDAQ WebAPI docs. The function specifically returns the "data" portion of the packet.
- getRunStatistics() Dict[str, Any] [source]
returns a dictionary which contains at least the following keys
'run_time' : duration of the run
'number_of_packets_streamed_from_software' : number of packets that have been streamed from our software streaming system
'number_of_events_recorded' : number of events saved to disk via the Recording System
- getSoftwareStreamSettings()[source]
Retrieve the stream settings currently configured for the Vireo.
- Returns:
Dict of a JSON packet. The JSON packet should look like this:
{
    "soft_stream_channels": channels,
    "soft_stream_dest_ip": target_ip,
    "soft_stream_dest_port": int | str,
    "only_stream_triggered": bool
}
- getTriggerActiveWindow()[source]
Get the duration of the time window when the instrument is counting triggers occurring on all enabled ADC channels.
- getTriggerAveragingWindow(channel: int)[source]
Get the duration of the leading and trailing summation windows in ADC samples.
- Parameters:
channel – channel to get the trigger averaging window of.
- getTriggerEdge(channel: int)[source]
Get which edge a trigger occurs on for a specified channel.
- Parameters:
channel – channel to get the trigger edge setting from.
- getTriggerSensitivity(channel: int)[source]
Get the trigger threshold of the specified channel.
- Parameters:
channel – channel to obtain the trigger threshold of.
- getTriggerXPosition()[source]
Get the position within the N-sample window where the trigger will be located.
- property name
- property num_channels
- property num_wave_samples
- property product_name
- property serial_number
- setAnalogOffsetPercent(channel: int, offset_percent: int)[source]
Set the analog offset as a percentage for a given channel. This value cannot be read back.
- Parameters:
channel – Channel to set the offset
offset_percent – The percent offset for analog baseline offset ranging from -100 to 100 as an integer
- Raises:
ValueError – If the offset percentage is not in the valid range
- setBaselineRestorationExclusion(window_width: int)[source]
Set the window excluded from baseline restoration.
- Parameters:
window_width – width of the window to exclude from baseline restoration.
- setBiasVoltage(voltage: float)[source]
Set the intended voltage offset for biasing a detector.
- Parameters:
voltage – The voltage, in volts, to offset the HV output bias.
- setBiasVoltageRaw(voltage: int)[source]
Set the raw DAC value used to bias output for a detector.
- Parameters:
voltage – an integer indicating the voltage in raw DAC bytes
- setDigitalOffset(channel: int, offset: int)[source]
Set the digital offset of a specified channel, in ADC counts, relative to the displayed value. This means the offset is not inverted when you enable inverting waveforms.
- Parameters:
channel – channel to set the offset on
offset – Offset in ADC counts
- setEnableBaselineRestoration(enable: bool)[source]
Enable (or disable) Baseline Restoration. Baseline Restoration is not supported on all products.
- setEnableTrigger(channel: int, enable: bool)[source]
Set the status of a trigger for a specified channel.
- Parameters:
channel – Channel to enable or disable the triggering on.
enable – Enable or disable triggering on this channel
- setGlobalId(global_id: int)[source]
Set the globalID for an experiment to the device.
- Parameters:
global_id – a 0-255 integer representing an ID in an experiment
- setHistogramQuantity(channel: int, quantity: int)[source]
Set the quantity histogrammed at each event.
0 : maximum value of the trace after averaging
1 : running sum over the PulseHeight window without averaging
2 : running average of the PulseHeight window sum (AKA the average of mode 1)
3 : maximum value of the trigger waveform after averaging
- Parameters:
channel – channel to set what quantity is being histogrammed
quantity – The quantity to histogram on this channel (see the values above).
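Histogram Configuration Example
A minimal sketch continuing with the digitizer object from the recording example above; channel and quantity values are illustrative.
# Histogram the averaged trace maximum (quantity 0) on channel 0, and
# bin by 2 so the histogram covers the full ADC range (see setHistogramScaling).
digitizer.setHistogramQuantity(channel=0, quantity=0)
digitizer.setHistogramScaling(channel=0, state=1)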
- setHistogramScaling(channel: int, state: int)[source]
Set the histogram scaling for a specified channel. If state is 1, bin by 2; if 0, do not bin. To cover the whole ADC range, state must be 1.
- Parameters:
channel – Channel to set the histogram scaling state
state – State to set the histogram scaling in, typically 1 for bin by 2 or 0 for no binning.
- setPulseHeightAveragingWindow(window_width: int)[source]
Set how many pulse heights are averaged together when forming a trigger.
- Parameters:
window_width – integer width in ADC samples
- setPulseHeightWindow(window_width: int)[source]
Set the active window for measuring pulse heights compared to the trigger
- Parameters:
window_width – Number of ADC samples after the trigger to look for maximum pulse height values
- setQuadQDCWindows(base_width: int, fast_width: int, slow_width: int, tail_width: int)[source]
Set the windows for FPGA-based integration of an event.
- Parameters:
base_width – pre-trigger area to integrate, in ADC counts. This window is not configurable: it is 128 samples wide and ends 8 samples before the FAST window
fast_width – Width of the fast window. It starts at the maximum value of the sliding window integration and will always cover the peak of the pulse.
slow_width – starts at the end of the fast window, integer in ADC counts.
tail_width – Starts at the end of the slow window, integer in ADC counts.
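QuadQDC Example
A minimal sketch continuing with the digitizer object from the recording example above; the window widths are placeholders, not recommended values.
# Configure the FPGA QDC integration windows (widths in ADC samples).
# base_width is fixed by the firmware per the note above, but the argument is still passed.
digitizer.setQuadQDCWindows(
    base_width=128,
    fast_width=32,
    slow_width=128,
    tail_width=256,
)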
- setTriggerActiveWindow(window_width: int)[source]
Set the duration of the time window when the instrument is counting triggers occurring on all enabled ADC channels. For example, when dealing with HPGe detectors, the Trigger Active Window should cover the duration of the energy shaping filters. Additional trigger pulses in an ADC channel signal a pileup. These events should be excluded from the pulse height histogram.
In waveform capture mode, the waveform will be truncated to the size of “Trigger Active Window”.
If your events are short and do not need to use the entire 8192 sample window, you can reduce your file size and increase your event throughput.
- Parameters:
window_width – number of samples to keep the trigger active.
- setTriggerAveragingWindow(channel: int, window_width: int)[source]
Set the trigger averaging window. The valid values are determined by the device, but a typical set would be [1, 2, 4, 8, 16, 32] ADC samples to average for triggering.
- Parameters:
channel – channel to set the trigger averaging window
window_width – width of the trigger averaging window
- setTriggerEdge(channel: int, direction: Literal['rising'] | Literal['falling'] | int)[source]
Set whether the trigger is to be on the rising or falling edge of a waveform. This applies AFTER inversion.
- Parameters:
channel – Channel to set the trigger edge detection on.
direction – Direction of travel, rising or falling edge.
- setTriggerSensitivity(channel: int, sensitivity: int)[source]
Set the trigger threshold of the specified channel
- Parameters:
channel – channel to set the trigger threshold of.
sensitivity – Threshold of the trigger in ADC counts.
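Trigger Configuration Example
A minimal sketch continuing with the digitizer object from the recording example above; channel, window, and threshold values are illustrative.
# Enable triggering on channel 0, trigger on the rising edge,
# average 4 ADC samples, and require a threshold of 50 ADC counts.
digitizer.setEnableTrigger(channel=0, enable=True)
digitizer.setTriggerEdge(channel=0, direction="rising")
digitizer.setTriggerAveragingWindow(channel=0, window_width=4)
digitizer.setTriggerSensitivity(channel=0, sensitivity=50)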
- setTriggerXPosition(x_position: int)[source]
Set the position of the trigger in the N-sample window.
- Parameters:
x_position – The position of the trigger in the N-sample window.
- property trigger_sensitivity_max
- property trigger_sensitivity_min
- property wave_max_val
- property wave_min_val
skutils.GretinaFileWriter module
- class skutils.GretinaFileWriter.GretinaFileWriter(fname, ascii_version_header=[])[source]
Bases:
object
Writes Skutek formatted Gretina data to data files - the same format as the “.bin” files saved natively by our digitizer.
- property checksum
- write_data_packet(channel, data, timestamp, wave_type, module_num=0, version_num=0, bitdepth=14, compression=0)[source]
writes a packet to the file
- Args:
channel (int): channel index of data
data (numpy.ndarray): data. Must be a 1D numpy array of dtype uint16 or int16
timestamp (int): trigger timestamp
wave_type (str): GebTypes.raw_waveform or GebTypes.raw_histogram
module_num (int): module number to go in skutek words. Default is 0
version_num (int): version number to go in skutek words. Default is 0
bitdepth (int): bitdepth to be encoded in packet. Defaults to 14-bit
compression (int): compression level for zlib. 0 is none, 9 is max
- write_event(event, timestamp, wave_type, module_num=0, version_num=0, bitdepth=14, compression=0)[source]
Writes an event to the file. The event is typically a pandas DataFrame whose column headers are the integer channel numbers (see Args below).
- Args:
event (numpy.ndarray or pandas.DataFrame): The event data. If a pandas DataFrame, the columns must be the integer channel numbers. If a numpy array, the first column is assumed to be channel 0
timestamp (int): trigger timestamp
wave_type (str): skutils.GebTypes.raw_waveform OR skutils.GebTypes.raw_histogram
module_num (int): module number to go in skutek words. Default is 0
version_num (int): version number to go in skutek words. Default is 0
bitdepth (int): bitdepth to be encoded in packets. Defaults to 14-bit
compression (int): compression level for zlib. 0 is none, 9 is max
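Writer Example
A minimal sketch, assuming an illustrative output filename and synthetic waveform data; how the writer closes or flushes the file is not shown because it is not documented above.
import numpy as np
from skutils import GebTypes
from skutils.GretinaFileWriter import GretinaFileWriter

wave = np.zeros(1024, dtype=np.int16)  # illustrative single-channel waveform

writer = GretinaFileWriter("example_run.bin")  # illustrative filename
writer.write_data_packet(
    channel=0,
    data=wave,
    timestamp=123456789,  # illustrative trigger timestamp
    wave_type=GebTypes.raw_waveform,
    bitdepth=14,
    compression=0,
)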
skutils.GretinaLoader module
- class skutils.GretinaLoader.EventMetadata(packets, number)[source]
Bases:
object
- load_arrays_from_file(fp, big_endian, wave_type)[source]
Loads histograms or traces for this event into a DataFrame.
- class skutils.GretinaLoader.GretinaLoader(**kwargs)[source]
Bases:
object
Object to load event data from Skutek Gretina formats. Unrecognized packet types will be ignored.
UNTESTED WITH BIG ENDIAN DATA
- Attributes:
filename (str): filename being loaded
memmap (bool): whether or not the whole file was loaded into memory
cache_event_metadata (bool): if True, cache the positions of events as they are loaded. This makes traversing backwards through the file much faster. Default is True
fp (IoBuffer): open file or memory-mapped file object
big_endian (bool): whether the file is big endian or little endian. Defaults to the system byte order, but is updated as soon as an endian packet is parsed. None if no ascii version packet is parsed
ascii_header (list): list of ascii version data, typically [ddc_apps_version, git_branch_name, firmware_revision]. Empty list [] if no ascii version packet is parsed
event_number (int): index of most recently parsed event
packet_parser_functions (dict): dictionary of functions to parse packet contents. Wave and histogram packets are parsed independently of this list. Functions are called with arguments (file_pointer, packet_start, packet_end) and are expected not to advance the file position.
- fetch_event(desired_event)[source]
Fetches the specified event in the file. Does not advance the event number or file position.
- Returns:
- Tuple:
dict: dictionary of metadata about the packet. Keys are 'packets', 'channels', 'wave_type_string', 'timestamp', 'event_number'
pandas.DataFrame: The event data. Columns are channels (smallest first), rows are indexed by event sample
- Raises:
RuntimeError – if the event does not exist in this file
- load_and_sort_all_events()[source]
Retrieves a list of all events sorted by timestamp. Unlike fetch_event and seek_event, all packets with the same timestamp are defined as an event even if the packets are non-contiguous in the file.
Regardless of instantiation parameters, this function will enable memmap to speed the process up.
- Returns:
- tuple:
list of dicts: metadata for all events, ordered by timestamp (smallest first)
list of DataFrames: data for all events, ordered by timestamp (smallest first)
- next_event()[source]
Reads next event from the current file location and returns a metadata dictionary and a pandas dataframe of the event data. This will advance the location in the file.
Events are defined as a series of contiguous packets with the same timestamp
- Warning:
if end of file has been reached, then this will return (None, None)
- Returns:
- Tuple:
dict: dictionary of metadata about the packet. Keys are 'packets', 'channels', 'wave_type_string', 'timestamp', 'event_number'. OR None if EOF
pandas.DataFrame: The event data. Columns are channels (smallest first), rows are indexed by event sample. OR None if EOF
- property next_event_num
- peek_at_next_event_metadata()[source]
Finds all packets associated with the next event and returns metadata about the next event which can be used to read it. Does not advance file position
returns None if there are no more events in the file
- peek_at_next_packet_metadata()[source]
Returns the metadata for the next data packet in the file without advancing the file position. Returns None if the end of the file is hit. If non-data packets are encountered while peeking, positional information is passed to whatever function is defined for that packet type in packet_parser_functions.
- Args:
None
- Returns:
PacketMetadata: metadata and info about the next data packet, including GEB Headers, Skutek words, and position information in the file. OR None if we've reached the end of the file
- class skutils.GretinaLoader.PacketMetadata(packet_header, word1, word2, word3, start)[source]
Bases:
object
- ALL_DTYPES = {1342177280: [[dtype('uint32'), dtype('int32')], [dtype('>u4'), dtype('>i4')]], 1342177296: [[dtype('uint16'), dtype('int16')], [dtype('>u2'), dtype('>i2')]], 1342177312: [[dtype('uint8')], [dtype('uint8')]]}
- HIST_DTYPES = [[dtype('uint32'), dtype('int32')], [dtype('>u4'), dtype('>i4')]]
- PS_DTYPES = [[dtype('uint8')], [dtype('uint8')]]
- TRACE_DTYPES = [[dtype('uint16'), dtype('int16')], [dtype('>u2'), dtype('>i2')]]
skutils.GretinaMmapWriter module
- class skutils.GretinaMmapWriter.GretinaMmapWriter(*args, preallocate=1073741824, **kwargs)[source]
Bases:
GretinaFileWriter
Writes Skutek formatted Gretina data to a preallocated memory-mapped file - the same format as the ".bin" files saved natively by our digitizer.
skutils.GretinaUdpWriter module
- class skutils.GretinaUdpWriter.GretinaUdpWriter(target_ip, target_port, ascii_version_header=[])[source]
Bases:
GretinaFileWriter
Writes Skutek formatted Gretina data to a UDP socket - the same format as the “.bin” files saved natively by our digitizer.
skutils.Loader module
- class skutils.Loader.BaseLoader(fpath: str, rebuild_events_with_window: int | None = None)[source]
Bases:
object
The base class that all Loader types extend. Loaders subclass BaseLoader and implement loadChannelBatch.
All BaseLoader derived classes can be used as a context manager, i.e.:
- with <loader>(file) as loader:
# Do whatever
- NOTE:
An individual BaseLoader instance can be run exactly once before the file needs to be reopened; please keep this in mind.
- channelByChannel() Generator[ChannelData, Any, None] [source]
Get the individual channels, loaded one at a time
- lazyLoad() Generator[EventInfo, Any, None] [source]
Lazily yield events; returns the next event in generator fashion for iteration.
- loadChannelBatch() Sequence[ChannelData] | None [source]
The base method for loading channels, this loads a sequence of channels (events) or individual channels.
This is specialized for all loader types.
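Loader Usage Example
A minimal sketch of the common BaseLoader interface using EventCSVLoader (documented below); the file name is illustrative, and each yielded item is an EventInfo built from ChannelData.
from skutils import EventCSVLoader

with EventCSVLoader("run_data.csv") as loader:  # illustrative file name
    for event in loader.lazyLoad():
        print(event.channels, event.channel_multiplicity, event.min_timestamp)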
- class skutils.Loader.ChannelData(channel: int, timestamp: int, pulse_summary: Dict[str, Any] | None, wave: ndarray | None)[source]
Bases:
object
All data related to a single channel. It may or may not be part of an event, and it may or may not have a pulse summary or wave.
Check self.has_wave and self.has_summary to see if either are present.
- property num_wave_samples: int
Returns the number of samples in the wave, 0 if this has no wave
- property pileup: bool
Returns true if there have been multiple triggers in this channel
- class skutils.Loader.EventCSVLoader(fpath: str, rebuild_events_with_window: int | None = None)[source]
Bases:
BaseLoader
Loader for the Vireo EventCSV format (a TSV-type format).
- loadChannelBatch() Sequence[ChannelData] | None [source]
The base method for loading channels, this loads a sequence of channels (events) or individual channels.
This is specialized for all loader types.
- class skutils.Loader.EventInfo(channel_data: Sequence[ChannelData])[source]
Bases:
object
Information related to a group of channels collated as an “Event” as defined by either the file format or a rebuilt coincidence window.
- property channel_multiplicity: int
Returns the number of channels that triggered at least once in this event
- property channels: Sequence[int]
List of all channels in the event
- property max_timestamp: int
The largest timestamp found in the list of timestamps
- property min_timestamp: int
The smallest timestamp found in the list of timestamps
- property pileup_count: Sequence[int] | Sequence[None]
Returns a list of the number of triggers fired for each channel
- property pulse_heights: Sequence[int] | Sequence[None]
Returns the pulse heights on each channel.
- property timestamp_range: int
Range of timestamps from the maximum and minimum
- property timestamps: Sequence[int]
All timestamps found throughout all of the channels we have
- property total_triggers: int
Returns the total number of triggers that fired across all channels. AKA Hit Multiplicity
- property trigger_heights: Sequence[int] | Sequence[None]
Returns the trigger heights on each channel.
- class skutils.Loader.GretaLoader(fpath: str, rebuild_events_with_window: int | None = None)[source]
Bases:
BaseLoader
Loader for the SkuTek GRETA single-packet format
- loadChannelBatch() Sequence[ChannelData] | None [source]
The base method for loading channels, this loads a sequence of channels (events) or individual channels.
This is specialized for all loader types.
- class skutils.Loader.GretaPacketRoutingHeader[source]
Bases:
Structure
- checksum
Structure/Union member
- flags
Structure/Union member
- length
Structure/Union member
- sequence_number
Structure/Union member
- subtype
Structure/Union member
- timestamp
Structure/Union member
- type
Structure/Union member
- version
Structure/Union member
- class skutils.Loader.GretaPacketTotal[source]
Bases:
Structure
- header
Structure/Union member
- subheader
Structure/Union member
- class skutils.Loader.GretaPacketWaveSubheader[source]
Bases:
Structure
- channel
Structure/Union member
- module_number
Structure/Union member
- pulse_height
Structure/Union member
- qdc_base_sum
Structure/Union member
- qdc_fast_sum
Structure/Union member
- qdc_slow_sum
Structure/Union member
- qdc_tail_sum
Structure/Union member
- reserved_0
Structure/Union member
- reserved_1
Structure/Union member
- size
Structure/Union member
- start_location
Structure/Union member
- subheader_version
Structure/Union member
- trig_count
Structure/Union member
- trigger_height
Structure/Union member
- triggered
Structure/Union member
- class skutils.Loader.IGORPulseHeightLoader(fpath: str, rebuild_events_with_window: int | None = None)[source]
Bases:
BaseLoader
IGOR pulse height loader. This format does not contain waveforms, only the pulse height and timestamp of a summary.
Only the pulse_height section of ChannelData and the timestamp will be filled
- loadChannelBatch() Sequence[ChannelData] | None [source]
The base method for loading channels, this loads a sequence of channels (events) or individual channels.
This is specialized for all loader types.
- class skutils.Loader.IGORWaveLoader(fpath: str, rebuild_events_with_window: int | None = None)[source]
Bases:
BaseLoader
A loader for the IGOR wave format. This is an event-type format, so events will be built correctly as long as the original event was recorded as one.
- loadChannelBatch() Sequence[ChannelData] | None [source]
The base method for loading channels, this loads a sequence of channels (events) or individual channels.
This is specialized for all loader types.
- class skutils.Loader.WrappedGretinaLoader(fpath: str, rebuild_events_with_window: int | None = None)[source]
Bases:
BaseLoader
Unlike the original GretinaLoader, this class wraps it in the standard BaseLoader interface for consistency.
- loadChannelBatch() Sequence[ChannelData] | None [source]
The base method for loading channels, this loads a sequence of channels (events) or individual channels.
This is specialized for all loader types.
skutils.constants module
- class skutils.constants.GebPacketHeader(type, length, timestamp)
Bases:
tuple
- length
Alias for field number 1
- timestamp
Alias for field number 2
- type
Alias for field number 0
- class skutils.constants.GebTypes[source]
Bases:
object
- WAVE_TYPE_STRINGS = {1342177280: 'histogram', 1342177296: 'waveform', 1342177312: 'pulse_summary'}
- endian_indicator = 1343234128
- endian_indicator_nonnative = 1344278608
- general_ascii = 1342177440
- raw_histogram = 1342177280
- raw_pulse_summary = 1342177312
- raw_waveform = 1342177296
- skutek_type_prefix = 1342177280
- version_info_ascii = 1342177441
- class skutils.constants.PulseSummary0(pulse_height, trig_height, trig_count, triggered, qdc_base_sum, qdc_fast_sum, qdc_slow_sum, qdc_tail_sum)
Bases:
tuple
- pulse_height
Alias for field number 0
- qdc_base_sum
Alias for field number 4
- qdc_fast_sum
Alias for field number 5
- qdc_slow_sum
Alias for field number 6
- qdc_tail_sum
Alias for field number 7
- trig_count
Alias for field number 2
- trig_height
Alias for field number 1
- triggered
Alias for field number 3
- class skutils.constants.SkutekWord1(version, module, signed, channel)
Bases:
tuple
- channel
Alias for field number 3
- module
Alias for field number 1
- signed
Alias for field number 2
- version
Alias for field number 0
skutils.file_tools module
Module contents
- class skutils.BaseLoader(fpath: str, rebuild_events_with_window: int | None = None)[source]
Bases:
object
The base class that all Loader types extend. Loaders subclass BaseLoader and implement loadChannelBatch.
All BaseLoader derived classes can be used as a context manager, i.e.:
- with <loader>(file) as loader:
# Do whatever
- NOTE:
An individual BaseLoader instance can be run exactly once before the file needs to be reopened; please keep this in mind.
- channelByChannel() Generator[ChannelData, Any, None] [source]
Get the individual channels, loaded one at a time
- lazyLoad() Generator[EventInfo, Any, None] [source]
Lazily yield events; returns the next event in generator fashion for iteration.
- loadChannelBatch() Sequence[ChannelData] | None [source]
The base method for loading channels, this loads a sequence of channels (events) or individual channels.
This is specialized for all loader types.
- class skutils.ChannelData(channel: int, timestamp: int, pulse_summary: Dict[str, Any] | None, wave: ndarray | None)[source]
Bases:
object
All data related to a single channel. It may or may not be part of an event, and it may or may not have a pulse summary or wave.
Check self.has_wave and self.has_summary to see if either are present.
- property num_wave_samples: int
Returns the number of samples in the wave, 0 if this has no wave
- property pileup: bool
Returns true if there have been multiple triggers in this channel
- class skutils.EventCSVLoader(fpath: str, rebuild_events_with_window: int | None = None)[source]
Bases:
BaseLoader
Loader for the Vireo EventCSV format (a TSV-type format).
- loadChannelBatch() Sequence[ChannelData] | None [source]
The base method for loading channels, this loads a sequence of channels (events) or individual channels.
This is specialized for all loader types.
- class skutils.EventInfo(channel_data: Sequence[ChannelData])[source]
Bases:
object
Information related to a group of channels collated as an “Event” as defined by either the file format or a rebuilt coincidence window.
- property channel_multiplicity: int
Returns the number of channels that triggered at least once in this event
- property channels: Sequence[int]
List of all channels in the event
- property max_timestamp: int
The largest timestamp found in the list of timestamps
- property min_timestamp: int
The smallest timestamp found in the list of timestamps
- property pileup_count: Sequence[int] | Sequence[None]
Returns a list of the number of triggers fired for each channel
- property pulse_heights: Sequence[int] | Sequence[None]
Returns the pulse heights on each channel.
- property timestamp_range: int
Range of timestamps from the maximum and minimum
- property timestamps: Sequence[int]
All timestamps found throughout all of the channels we have
- property total_triggers: int
Returns the total number of triggers that fired across all channels. AKA Hit Multiplicity
- property trigger_heights: Sequence[int] | Sequence[None]
Returns the trigger heights on each channel.
- class skutils.FemtoDAQController(url: str, verbose: bool = False)[source]
Bases:
object
- property adc_max_val
- property adc_min_val
- property channels
- configureCoincidence(mode: Literal['hit_pattern', 'multiplicity'], multiplicity: int | None = None, hit_pattern: Dict[str, str] | None = None)[source]
Configures coincidence prior to an experimental run.
- Parameters:
mode –
the coincidence mode for triggering. Must be one of two options.
- "multiplicity" : global trigger requires at least the specified number of individual channels to trigger
- "hit_pattern" : global trigger requires specific channels to trigger or not trigger. AKA coincidence/anti-coincidence/ignore hit pattern
multiplicity – Required if mode = "multiplicity". The minimum number of individual channel triggers required to define an Event. This argument is ignored if mode is "hit_pattern"
hit_pattern –
Required if mode = "hit_pattern". This argument must be a dictionary. Keys are channel numbers, and each value is one of:
'COINCIDENCE' : a trigger is required on this channel
'ANTICOINCIDENCE' : a trigger is not allowed on this channel
'IGNORE' : triggers on this channel have no impact on the Event
All channels must be present in the dictionary passed to configureCoincidence. helpers.HitPatternCoincidenceBuilder is a simple builder that fills in unspecified channels with "IGNORE". This argument is ignored if mode is "multiplicity"
Hit Pattern Example
hit_pattern = {"channel_0_trigger_hit_pattern": "COINCIDENCE", "channel_1_trigger_hit_pattern": "ANTICOINCIDENCE"}
digitizer.configureCoincidence("hit_pattern", hit_pattern=hit_pattern)
Multiplicity Example
multiplicity = 3
digitizer.configureCoincidence("multiplicity", multiplicity=multiplicity)
- configureRecording(channels: Sequence[int], run_name: str = 'API_Recording', format_type: str = 'gretina', record_waves: bool = True, record_summaries: bool = False, directory: str | None = None, seq_file_size_MB: int = 100, only_record_triggered_channels: bool = False)[source]
Configures file recording prior to an experimental run.
- Parameters:
channels – list of channels to record during this experimental run
run_name – The name of this experimental run. This string, along with a date code, is prepended to the names of all data files generated during this run
format_type – The file format to use. Call getRecordingFormatInfo for a full list of data formats and the data products they support
record_waves – True to save waveforms. This will raise an error if the specified file format doesn’t support waveform recording. Default is True.
record_summaries – True to save pulse summaries. This will raise an error if the specified file format doesn’t support pulse summary recording. Default is False.
directory – The remote directory on your FemtoDAQ unit to save data to. If left as None, defaults to the data partition directory on the FemtoDAQ unit. SkuTek recommends keeping this at its default unless you want the FemtoDAQ unit to save data over an NFS mount.
seq_file_size_MB – Maximum size data file in MB before a new file is created. 100MB by default.
only_record_triggered_channels –
If True, then only record the channels in the channels list that triggered in the event. This is more efficient and reduces the likelihood of noise waveforms ending up in your data files. If left as False, the default, then all specified channels will be written to disk even if no trigger was detected. This is less efficient, but ensures that the same channels will be in each event.
Warning: Setting this to True can result in varying event sizes and, in some cases, empty cells in row/column type file formats such as the IGOR Pulse Height format.
- Raises:
RuntimeError – if record_waves is True, but the specified file format does not support waveform recording.
RuntimeError – if record_summaries is True, but the specified file format does not support pulse summary recording.
RuntimeError – if directory is specified, but does not exist or is inaccessible from your FemtoDAQ digitizer.
- configureSoftwareStreaming(channels: Sequence[int], format: str, target_ip: str, target_port: int | str, only_stream_triggered_channels: bool = False)[source]
Configures streaming readout from software prior to an experimental run.
- Parameters:
channels – list of channels to stream during this experimental run
target_ip – The IP address to stream to.
target_port – The network port at the specified IP address to stream to.
only_stream_triggered_channels – If True, then only stream the channels in the channels list that triggered in the event. This is more efficient and reduces the likelihood of noise waveforms ending up in your data. If left as False, the default, then all specified channels will be streamed even if no trigger was detected. This is less efficient, but ensures that the same channels will be in each event.
- downloadFile(filename: str, save_to: str | None = None, silent: bool = False)[source]
Download a remote file, save it to a location on disk, and optionally print progress information.
- Parameters:
filename – Remote file to download
save_to – the local destination to save the file to
silent – if True, suppress printed output
- downloadLastRunDataFiles(save_to: str | None = None)[source]
Iterate through all data files from the last run and download them.
- Parameters:
save_to – an optional parameter specifying where to save the data.
- getBaselineRestorationExclusion()[source]
Get the window used by baseline restoration exclusion to exclude your triggered pulse.
- getCoincidenceSettings()[source]
Obtain the current Coincidence settings.
See the FemtoDAQ web docs for more information on the returned packet; this function returns the "data" field of that packet.
- getDigitalOffset(channel: int)[source]
Get the digital offset of a specified channel, in ADC counts, relative to the displayed value. This means the offset is not inverted when you enable inverting waveforms.
- Parameters:
channel – channel to get the offset from
- getEnableBaselineRestoration()[source]
Get whether the Baseline Restoration feature is enabled. Baseline Restoration is not supported on all products.
- getEnableTrigger(channel: int)[source]
Get whether triggering is enabled for a channel.
- Parameters:
channel – Channel to get the trigger enabled status from.
- getHistogramQuantity(channel: int)[source]
Get the quantity histogrammed at each event; see setHistogramQuantity for the meaning of each value.
- Parameters:
channel – Channel to get the quantity histogrammed per event
- getHistogramScaling(channel: int)[source]
Get the state of the histogram scaling for a specified channel.
- Parameters:
channel – Channel to get the histogram scaling state from.
- getListOfDataFiles(last_run_only: bool = False) Sequence[str] [source]
Get the list of all remote data files.
- Parameters:
last_run_only – If true, only gets the data files recorded in the last run.
- getPulseHeightAveragingWindow()[source]
Get how many pulse heights are averaged together when forming a trigger.
- getPulseHeightWindow()[source]
Get the window, relative to the trigger, in which the maximum pulse value is measured.
- getRecordingFormatInfo()[source]
returns the output of utils.get_streaming_formats from the target unit
- getRecordingSettings()[source]
Get the current recording settings. For details on the exact return value, see the FemtoDAQ WebAPI docs. The function specifically returns the "data" portion of the packet.
- getRunStatistics() Dict[str, Any] [source]
returns a dictionary which contains at least the following keys
'run_time' : duration of the run
'number_of_packets_streamed_from_software' : number of packets that have been streamed from our software streaming system
'number_of_events_recorded' : number of events saved to disk via the Recording System
- getSoftwareStreamSettings()[source]
Retrieve the stream settings currently configured for the Vireo.
- Returns:
Dict of a JSON packet. The JSON packet should look like this:
{
    "soft_stream_channels": channels,
    "soft_stream_dest_ip": target_ip,
    "soft_stream_dest_port": int | str,
    "only_stream_triggered": bool
}
- getTriggerActiveWindow()[source]
Get the duration of the time window when the instrument is counting triggers occurring on all enabled ADC channels.
- getTriggerAveragingWindow(channel: int)[source]
Get the duration of the leading and trailing summation windows in ADC samples.
- Parameters:
channel – channel to get the trigger averaging window of.
- getTriggerEdge(channel: int)[source]
Get which edge a trigger occurs on for a specified channel.
- Parameters:
channel – channel to get the trigger edge setting from.
- getTriggerSensitivity(channel: int)[source]
Get the trigger threshold of the specified channel.
- Parameters:
channel – channel to obtain the trigger threshold of.
- getTriggerXPosition()[source]
Get the position within the N-sample window where the trigger will be located.
- property name
- property num_channels
- property num_wave_samples
- property product_name
- property serial_number
- setAnalogOffsetPercent(channel: int, offset_percent: int)[source]
Set the analog offset as a percentage for a given channel. This value cannot be read back.
- Parameters:
channel – Channel to set the offset
offset_percent – The percent offset for analog baseline offset ranging from -100 to 100 as an integer
- Raises:
ValueError – If the offset percentage is not in the valid range
- setBaselineRestorationExclusion(window_width: int)[source]
Set the window excluded from baseline restoration.
- Parameters:
window_width – width of the window to exclude from baseline restoration.
- setBiasVoltage(voltage: float)[source]
Set the intended voltage offset for biasing a detector.
- Parameters:
voltage – The voltage, in volts, to offset the HV output bias.
- setBiasVoltageRaw(voltage: int)[source]
Set the raw DAC value used to bias output for a detector.
- Parameters:
voltage – an integer indicating the voltage in raw DAC bytes
- setDigitalOffset(channel: int, offset: int)[source]
Set the digital offset of a specified channel, in ADC counts, relative to the displayed value. This means the offset is not inverted when you enable inverting waveforms.
- Parameters:
channel – channel to set the offset on
offset – Offset in ADC counts
- setEnableBaselineRestoration(enable: bool)[source]
Enable (or disable) Baseline Restoration. Baseline Restoration is not supported on all products.
- setEnableTrigger(channel: int, enable: bool)[source]
Set the status of a trigger for a specified channel.
- Parameters:
channel – Channel to enable or disable the triggering on.
enable – Enable or disable triggering on this channel
- setGlobalId(global_id: int)[source]
Set the globalID for an experiment to the device.
- Parameters:
global_id – a 0-255 integer representing an ID in an experiment
- setHistogramQuantity(channel: int, quantity: int)[source]
Set the quantity histogrammed at each event.
0 : maximum value of the trace after averaging
1 : running sum over the PulseHeight window without averaging
2 : running average of the PulseHeight window sum (AKA the average of mode 1)
3 : maximum value of the trigger waveform after averaging
- Parameters:
channel – channel to set what quantity is being histogrammed
quantity – The quantity to histogram on this channel (see the values above).
- setHistogramScaling(channel: int, state: int)[source]
Set the histogram scaling for a specified channel. If state is 1, bin by 2; if 0, do not bin. To cover the whole ADC range, state must be 1.
- Parameters:
channel – Channel to set the histogram scaling state
state – State to set the histogram scaling in, typically 1 for bin by 2 or 0 for no binning.
- setPulseHeightAveragingWindow(window_width: int)[source]
Set how many pulse heights are averaged together when forming a trigger.
- Parameters:
window_width – integer width in ADC samples
- setPulseHeightWindow(window_width: int)[source]
Set the active window for measuring pulse heights compared to the trigger
- Parameters:
window_width – Number of ADC samples after the trigger to look for maximum pulse height values
- setQuadQDCWindows(base_width: int, fast_width: int, slow_width: int, tail_width: int)[source]
Set the windows for FPGA-based integration of an event.
- Parameters:
base_width – pre-trigger area to integrate, in ADC counts. This window is not configurable: it is 128 samples wide and ends 8 samples before the FAST window
fast_width – Width of the fast window. It starts at the maximum value of the sliding window integration and will always cover the peak of the pulse.
slow_width – starts at the end of the fast window, integer in ADC counts.
tail_width – Starts at the end of the slow window, integer in ADC counts.
- setTriggerActiveWindow(window_width: int)[source]
Set the duration of the time window when the instrument is counting triggers occurring on all enabled ADC channels. For example, when dealing with HPGe detectors, the Trigger Active Window should cover the duration of the energy shaping filters. Additional trigger pulses in an ADC channel signal a pileup. These events should be excluded from the pulse height histogram.
In waveform capture mode, the waveform will be truncated to the size of “Trigger Active Window”.
If your events are short and do not need to use the entire 8192 sample window, you can reduce your file size and increase your event throughput.
- Parameters:
window_width – number of samples to keep the trigger active.
- setTriggerAveragingWindow(channel: int, window_width: int)[source]
Set the trigger averaging window. The valid values are determined by the device, but a typical set would be [1, 2, 4, 8, 16, 32] ADC samples to average for triggering.
- Parameters:
channel – channel to set the trigger averaging window
window_width – width of the trigger averaging window
- setTriggerEdge(channel: int, direction: Literal['rising'] | Literal['falling'] | int)[source]
Set whether the trigger is to be on the rising or falling edge of a waveform. This applies AFTER inversion.
- Parameters:
channel – Channel to set the trigger edge detection on.
direction – Direction of travel, rising or falling edge.
- setTriggerSensitivity(channel: int, sensitivity: int)[source]
Set the trigger threshold of the specified channel
- Parameters:
channel – channel to set the trigger threshold of.
sensitivity – Threshold of the trigger in ADC counts.
- setTriggerXPosition(x_position: int)[source]
Set the position of the trigger in the N-sample window.
- Parameters:
x_position – The position of the trigger in the N-sample window.
- property trigger_sensitivity_max
- property trigger_sensitivity_min
- property wave_max_val
- property wave_min_val
- class skutils.GebTypes[source]
Bases:
object
- WAVE_TYPE_STRINGS = {1342177280: 'histogram', 1342177296: 'waveform', 1342177312: 'pulse_summary'}
- endian_indicator = 1343234128
- endian_indicator_nonnative = 1344278608
- general_ascii = 1342177440
- raw_histogram = 1342177280
- raw_pulse_summary = 1342177312
- raw_waveform = 1342177296
- skutek_type_prefix = 1342177280
- version_info_ascii = 1342177441
- class skutils.GretaLoader(fpath: str, rebuild_events_with_window: int | None = None)[source]
Bases:
BaseLoader
Loader for the SkuTek GRETA single-packet format
- loadChannelBatch() Sequence[ChannelData] | None [source]
The base method for loading channels, this loads a sequence of channels (events) or individual channels.
This is specialized for all loader types.
- class skutils.IGORPulseHeightLoader(fpath: str, rebuild_events_with_window: int | None = None)[source]
Bases:
BaseLoader
IGOR pulse height loader. This format does not contain waveforms, only the pulse height and timestamp of a summary.
Only the pulse_height section of ChannelData and the timestamp will be filled
- loadChannelBatch() Sequence[ChannelData] | None [source]
The base method for loading channels, this loads a sequence of channels (events) or individual channels.
This is specialized for all loader types.
- class skutils.IGORWaveLoader(fpath: str, rebuild_events_with_window: int | None = None)[source]
Bases:
BaseLoader
A loader for the IGOR wave format. This is an event-type format, so events will be built correctly as long as the original event was recorded as one.
- loadChannelBatch() Sequence[ChannelData] | None [source]
The base method for loading channels, this loads a sequence of channels (events) or individual channels.
This is specialized for all loader types.
- skutils.LegacyGretinaLoader
alias of
GretinaLoader
- class skutils.WrappedGretinaLoader(fpath: str, rebuild_events_with_window: int | None = None)[source]
Bases:
BaseLoader
Unlike the original GretinaLoader, this class wraps it in the standard BaseLoader interface for consistency.
- loadChannelBatch() Sequence[ChannelData] | None [source]
The base method for loading channels, this loads a sequence of channels (events) or individual channels.
This is specialized for all loader types.