ramanchada2

Purpose

The ramanchada2 software package is meant to fill the gap between theoretical Raman analysis and experimental Raman spectroscopy by providing means to compare data of different origin. The software is at an early development stage, but it is already able to solve practical problems.

Features

Read simulated data

Process simulated data from VASP and CRYSTAL and provide the same interface for both. CRYSTAL data contain intensities for multiple orientations -- laser beam incident perpendicular or parallel to the observation direction -- and information for mono-crystals. VASP provides data only for poly-crystals, and in a different format, so the perpendicular and parallel intensities are calculated by an implemented algorithm.

Models

LMFIT theoretical models can be built from spectral information obtained by simulations or provided by the user. These models can be fit to experimental data, providing calibration information. With a poor initial calibration the minimisation procedure naturally fails. An iterative procedure aiming to solve this problem was adopted in the code. On the first iteration the experimental spectrum lines are artificially broadened. This makes it possible for the minimisation procedure to find parameters that are close enough to be used as an initial guess for the second iteration. In a few iterations the algorithm is able to fit the original experimental data. This idea is implemented and is at proof-of-concept level.
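The iterative-broadening idea can be illustrated outside the package with plain LMFIT and SciPy. The sketch below is a toy reconstruction, not the package's implementation: the synthetic data, the single-Gaussian model and the broadening widths are all assumptions made for the example.

```python
import numpy as np
from lmfit.models import GaussianModel
from scipy.ndimage import gaussian_filter1d

# Synthetic "experimental" spectrum: one narrow line plus noise.
x = np.linspace(0, 100, 1000)
y = np.exp(-((x - 42.0) / 0.5) ** 2) + np.random.default_rng(0).normal(0, 0.02, x.size)

model = GaussianModel()
params = model.make_params(center=32.0, sigma=10.0, amplitude=1.0)  # deliberately poor guess

# Fit progressively less-broadened copies of the data;
# each result seeds the next, narrower iteration.
for width in (50, 20, 5, 0):
    y_iter = gaussian_filter1d(y, width) if width else y
    result = model.fit(y_iter, params, x=x)
    params = result.params

print(result.params['center'].value)  # should end up close to 42
```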

Generate spectra

Spectra can be generated from the theoretical models. Random Poissonian noise and an artificial randomly generated baseline can be added to the generated spectra, making them convenient tools for testing new analysis methods.
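A minimal sketch of generating such a synthetic spectrum with the functions documented further below (from_delta_lines, convolve, add_baseline, add_poisson_noise); the rc2.spectrum access path and the chosen line positions are assumptions made for the example.

```python
import numpy as np
import ramanchada2 as rc2

# Two artificial Raman lines (positions and amplitudes are illustrative).
spe = rc2.spectrum.from_delta_lines(deltas={500: 100, 1500: 50}, nbins=2000)

# Broaden the delta lines, then add a random baseline and Poisson noise.
spe = spe.convolve(lambda x: np.exp(-(x / 5) ** 2))
spe = spe.add_baseline(n_freq=10, amplitude=5, rng_seed=42)
spe = spe.add_poisson_noise(scale=0.1, rng_seed=42)
```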

Spectrum manipulation

A number of filters can be applied to spectra (experimental and generated). Scaling on both x and y axes is possible. Scaling can be linear or defined by an arbitrary user-defined function. Convolution is possible with a set of predefined functions as well as with a user-defined model.
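Since every filter returns a new Spectrum (see Concept below), manipulations can be chained. The sketch below assumes the rc2.spectrum.from_test_spe constructor; the parameter values are arbitrary and the method names follow the argument lists documented further below.

```python
import ramanchada2 as rc2

spe = rc2.spectrum.from_test_spe()                    # random built-in test spectrum
spe_proc = (spe
            .subtract_moving_median(window_size=50)   # crude baseline removal
            .moving_average(window_size=5)            # smoothing
            .scale_xaxis_linear(factor=1.001)         # small linear x recalibration
            .normalize(strategy='unity_area'))        # unit-area normalisation
```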

Concept

The code is object oriented and written in Python. The main elements are Spectrum and the theoretical models. Theoretical models are based on the LMFIT library, while Spectrum is a custom-made class. A Spectrum object contains data for the x and y axes and metadata coming from experimental files or other sources. It is planned to add information about the uncertainties in x and y. All filters and manipulation procedures are available as class methods. Measures are taken to keep Spectrum instances immutable, so filters generate new spectra, leaving the original unchanged. Additionally, a Spectrum carries information about its history -- the sequence of applied filters.

File formats

.cha

The ramanchada software package introduced the .cha file format, which is an HDF5 file with a simple layout.

Cache in .cha files

The concept of keeping previous variants of the data is employed in ramanchada2. If configured to do so, the software saves the data for all Spectrum instances to a tree-organized .cha file. When a particular chain of operations is requested by the user, the software checks whether the final result is present in the cache file; if so, it is provided, otherwise the software checks for its parent. When a parent or one of the grandparents is present, it is taken as a starting point and the needed steps are applied to produce the final result. The current implementation uses the h5py library to access local HDF files. An implementation based on h5pyd, which supports network operations, is foreseen.
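A sketch of requesting a processing chain through the cache, using from_cache_or_calc and the required_steps layout documented further below. The step names, the cache file name and whether a constructor may appear as the first step are assumptions made for the example.

```python
import ramanchada2 as rc2

# Every step names a Spectrum processing with its args and kwargs.
steps = [
    {'proc': 'from_test_spe', 'args': [], 'kwargs': {}},
    {'proc': 'subtract_moving_median', 'args': [], 'kwargs': {'window_size': 50}},
    {'proc': 'normalize', 'args': [], 'kwargs': {}},
]

# Intermediate results are looked up in (and written to) the .cha cache file.
spe = rc2.spectrum.from_cache_or_calc(required_steps=steps, cachefile='example.cha')
```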

Nexus format

The latest ramanchada2 package allows export of a spectrum to NeXus format.

Decorated Functions in Spectrum

Function: __add__

Docstring: No docstring available


Function: __init__

Docstring: No docstring available


Function: __mul__

Docstring: No docstring available


Function: __sub__

Docstring: No docstring available


Function: __truediv__

Docstring: No docstring available


Function: abs_nm_to_shift_cm_1

Docstring: Convert wavelength to Raman shift in wavenumbers

Arguments:
  • spe: internal use only
  • laser_wave_length_nm: Laser wavelength

Returns: Corrected x-values


Function: abs_nm_to_shift_cm_1_filter

Docstring: Convert wavelength to Raman shift in wavenumbers

Arguments:
  • spe: internal use only
  • laser_wave_length_nm: Laser wavelength

Returns: Spectrum with corrected x-values


Function: add_baseline

Docstring: Add an artificial baseline to the spectrum. A random baseline is generated in the frequency domain using uniform random numbers. The baseline in the frequency domain is tapered with a Bohman window to reduce its bandwidth to the first n_freq frequencies and is then transformed to the "time" domain. Additionally, the func parameter allows an arbitrary user-defined function to be added as baseline.

Arguments:
  • n_freq: Must be > 2. Number of lowest frequency bins distinct from zero.
  • amplitude: Upper boundary for the uniform random generator.
  • pedestal: Additive constant pedestal to the spectrum.
  • func: Callable. User-defined function to be added as baseline. Example: func = lambda x: x*.01 + x**2*.0001.
  • rng_seed: int, optional. Seed for the random generator.
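A possible call combining the random baseline with the user-defined term from the example above; the parameter values and the rc2.spectrum.from_test_spe starting point are assumptions.

```python
import ramanchada2 as rc2

spe = rc2.spectrum.from_test_spe()
spe_b = spe.add_baseline(
    n_freq=15,                              # keep only the 15 lowest frequency bins
    amplitude=3,                            # bound of the uniform random generator
    pedestal=10,                            # constant offset
    func=lambda x: x * .01 + x**2 * .0001,  # extra user-defined baseline term
    rng_seed=0,
)
```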

Function: add_gaussian_noise

Docstring: Add gaussian noise to the spectrum.

An i.i.d. random number drawn from $N(0, \sigma)$ is added to every sample.

Arguments:
  • old_spe: internal use only
  • new_spe: internal use only
  • sigma: Sigma of the gaussian noise.
  • rng_seed: int or rng state, optional, seed for the random generator. If a state is provided, it is updated in-place.

Returns: modified Spectrum


Function: add_gaussian_noise_drift

Docstring: Add cumulative gaussian noise to the spectrum.

Exponential-moving-average-like gaussian noise is added to each sample. The goal is to mimic the low-frequency noise (or random substructures in spectra). The additive noise is $$a_i = coef*\sum_{j=0}^{i-1}g_j + g_i,$$

where $$g_i = \mathcal{N}(0, 1+\frac{coef}{\sqrt 2}).$$

This way drifting is possible while keeping the $$\sigma(\Delta(a)) \approx 1.$$

Arguments:
  • old_spe: internal use only
  • new_spe: internal use only
  • sigma: Sigma of the gaussian noise.
  • coef: float in [0, 1], drifting coefficient. If coef == 0, the result is identical to add_gaussian_noise().
  • rng_seed: int or rng state, optional. Seed for the random generator. If a state is provided, it is updated in-place.

Returns: modified Spectrum
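A short sketch contrasting pure white noise (coef=0) with drifting noise; the starting spectrum and the parameter values are illustrative assumptions.

```python
import ramanchada2 as rc2

spe = rc2.spectrum.from_test_spe()
spe_white = spe.add_gaussian_noise_drift(sigma=1, coef=0.0, rng_seed=1)  # same as add_gaussian_noise
spe_drift = spe.add_gaussian_noise_drift(sigma=1, coef=0.2, rng_seed=1)  # low-frequency drift added
```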


Function: add_poisson_noise

Docstring: Add poisson noise to the spectrum.

For each particular sample the noise is proportional to $\sqrt{scale*a_i}$.

Arguments:
  • old_spe: internal use only
  • new_spe: internal use only
  • scale: float, optional, default is 1. Scale the amplitude of the noise.
  • rng_seed: int or rng state, optional. Seed for the random generator. If a state is provided, it is updated in-place.

Returns: modified Spectrum


Function: apply_processing

Docstring: No docstring available


Function: bayesian_gaussian_mixture

Docstring: Decompose the spectrum into a Bayesian Gaussian Mixture

Arguments:
  • spe: internal use only
  • n_samples: optional. Defaults to 5000. Resampled dataset size.
  • n_components: optional. Defaults to 50. Number of expected Gaussian components.
  • max_iter: optional. Defaults to 100. Maximum number of iterations.
  • moving_minimum_window: optional. Defaults to None. If None, no moving minimum is subtracted, otherwise as specified.
  • random_state: optional. Defaults to None. Random generator seed to be used.
  • trim_range: optional. Defaults to None. If None, no trimming is applied; otherwise the trim range is given in x-axis values.
Returns:

BayesianGaussianMixture: Fitted Bayesian Gaussian Mixture


Function: calibrate_by_deltas_filter

Docstring: No docstring available


Function: calibrate_by_deltas_model

Docstring:

  • Builds a composite model based on a set of user-specified delta lines.
  • The initial guess is calculated based on the 10th and 90th percentiles of the distributions.

The phase space of the model is flat with a large number of narrow minima. In order to find the best fit, the experimental data are successively convolved with Gaussians of different widths, starting from wide and going to narrow. The model for the calibration is a 3rd-order polynomial, which could potentially be changed to a higher-order polynomial. In order to avoid inverting the calibration function, the result is tabulated and interpolated linearly for each bin of the spectrum. This algorithm is useful for coarse calibration.


Function: central_moments

Docstring: No docstring available


Function: convolve

Docstring: Convolve the spectrum with an arbitrary lineshape.

Arguments:
  • old_spe: internal use only
  • new_spe: internal use only
  • lineshape: callable, str or np.ndarray. If callable: should have a single positional argument x, e.g. lambda x: np.exp(-(x/5)**2). If predefined peak profile: can be gaussian, lorentzian, voigt, pvoigt, moffat or pearson4. If np.ndarray: lineshape in samples.
  • **kwargs: Additional kwargs will be passed to lineshape function.

Returns: modified Spectrum
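A sketch of the two main ways to specify the lineshape, by name or as a callable; the starting spectrum is an assumption and the named profile is used with its default parameters.

```python
import numpy as np
import ramanchada2 as rc2

spe = rc2.spectrum.from_delta_lines(deltas={500: 100, 1500: 50}, nbins=2000)

spe_voigt = spe.convolve('voigt')                           # predefined profile by name
spe_custom = spe.convolve(lambda x: np.exp(-(x / 5) ** 2))  # arbitrary callable lineshape
```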


Function: derivative_sharpening

Docstring: Derivative-based sharpening.

Sharpen the spectrum by subtracting the second derivative and adding the fourth derivative.

Arguments:
  • old_spe: internal use only
  • new_spe: internal use only
  • filter_fraction: float in (0, 1]. Default is 0.6. Depth of filtration.
  • signal_width: The width of features to be enhanced in sample count
  • der2_factor: Second derivative scaling factor
  • der4_factor: Fourth derivative scaling factor

Returns: modified Spectrum


Function: drop_spikes

Docstring: Removes single-bin spikes.

Remove x, y pairs recognised as spikes using left and right successive differences and standard-deviation-based threshold.

Arguments:
  • old_spe: internal use only
  • new_spe: internal use only
  • n_sigma: optional, default is 10. Threshold is n_sigma times the standard deviation.

Returns: modified Spectrum


Function: dropna

Docstring: Remove non-finite numbers on both axes

Arguments:
  • old_spe: internal use only
  • new_spe: internal use only

Returns: modified Spectrum


Function: find_peak_multipeak

Docstring: Find groups of peaks in spectrum.

Arguments:
  • spe: internal use only
  • prominence: optional. Defaults to None. If None, the prominence value will be spe.y_noise. A reasonable value for prominence is const * spe.y_noise_MAD.
  • wlen: optional. Defaults to None. wlen value used in scipy.signal.find_peaks. If wlen is None, 200 will be used.
  • width: optional. Defaults to None. width value used in scipy.signal.find_peaks. If width is None, 2 will be used.
  • hht_chain: optional. Defaults to None. List of hht_chain window sizes. If None, no hht sharpening is performed.
  • bgm_kwargs: kwargs for bayesian_gaussian_mixture
  • sharpening: 'hht' or None. Defaults to None. If 'hht', hht sharpening will be performed before finding peaks.
  • strategy: optional. Defaults to 'topo'. Peak-finding method.
Returns:

ListPeakCandidateMultiModel: Located peak groups


Function: find_peak_multipeak_filter

Docstring: Same as find_peak_multipeak but the result is stored as metadata in the returned spectrum.

Arguments:
  • old_spe: internal use only
  • new_spe: internal use only
  • *args, **kwargs: same as find_peak_multipeak

Function: fit_peak_multimodel

Docstring: No docstring available


Function: fit_peak_positions

Docstring: Calculate peak positions and amplitudes.

Sequence of multiple processing steps:

  • subtract_moving_minimum
  • find_peak_multipeak
  • filter peaks whose x-location uncertainty is better than the threshold
Arguments:
  • spe: internal use only
  • mov_min: optional. Defaults to 40. Subtract moving minimum with the specified window.
  • center_err_threshold: optional. Defaults to 0.5. Threshold for the centroid standard deviation. Only peaks with better uncertainty will be returned.
  • find_peaks_kw: optional keyword arguments to be used with find_peak_multipeak
  • fit_peaks_kw: optional keyword arguments to be used with fit_peaks_multipeak
Returns:

Dict[float, float]: {positions: amplitudes}
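A possible use of fit_peak_positions; the test spectrum and the threshold values are illustrative assumptions.

```python
import ramanchada2 as rc2

spe = rc2.spectrum.from_test_spe()

# Peak centres (x-axis values) mapped to amplitudes, keeping only
# peaks whose centroid uncertainty is below the threshold.
peaks = spe.fit_peak_positions(mov_min=40, center_err_threshold=0.5)
for pos, amp in sorted(peaks.items()):
    print(f'{pos:10.2f}  {amp:g}')
```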


Function: fit_peaks_filter

Docstring: Same as fit_peak_multipeak but the result is stored as metadata in the returned spectrum.

Arguments:
  • old_spe: internal use only
  • new_spe: internal use only
  • should_break: same as in fit_peaks_multipeak
  • *args, **kwargs: same as fit_peaks_multipeak

Function: from_cache_or_calc

Docstring: Load spectrum from cache or calculate if needed.

The cache is a nested structure of spectra. Every processing step applied to a spectrum produces a descendant of the initial one. If part of the required processing chain is already available, only the missing steps are calculated and added to the cache.

Arguments:
  • required_steps: List of required steps in the form [{'proc': str, 'args': List[Any], 'kwargs': Dict[str, Any]}, ...]
  • cachefile: optional. Defaults to None. Filename of the cache. If None no cache is used

Function: from_chada

Docstring: No docstring available


Function: from_delta_lines

Docstring: Generate Spectrum with delta lines.

Arguments:
  • deltas: Keys of the dictionary are the x positions of the deltas; values are the amplitudes of the corresponding deltas.
  • xcal: Callable, optional. x axis calibration function.
  • nbins: int, optional. Number of bins in the spectrum.
  • xaxis: Array-like, optional. The xaxis of the new spectrum. If xaxis is provided, xcal should be None and nbins is ignored.

Example:

This will produce a spectrum with 1000 bins in the range [-1000, 2000):

xcal = lambda x: x*3 -1000, nbins=1000
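The docstring example above, written out as a call; the line positions and amplitudes are illustrative assumptions.

```python
import ramanchada2 as rc2

# 1000 bins spanning [-1000, 2000) through the calibration function.
spe = rc2.spectrum.from_delta_lines(
    deltas={500: 100, 1500: 50},
    xcal=lambda x: x * 3 - 1000,
    nbins=1000,
)
```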

Function: from_local_file

Docstring: Read experimental spectrum from a local file.

Arguments:
  • in_file_name: Path to a local file containing a spectrum.
  • filetype: Specify the filetype. Filetype can be any of: spc, sp, spa, 0, 1, 2, wdf, ngs, jdx, dx, txt, txtr, tsv, csv, dpt, prn, rruf, spe (Princeton Instruments) or None. If None, the filetype is determined from the file extension.
  • backend: native, rc1_parser or None. If None, both backends are tried.
  • custom_meta: Dict. Add custom metadata fields to the created spectrum.
Raises:
  • ValueError: When called with unsupported file formats.
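A possible call; the file name is a placeholder and the metadata fields are arbitrary assumptions.

```python
import ramanchada2 as rc2

# With filetype=None the format is inferred from the file extension.
spe = rc2.spectrum.from_local_file(
    'measurement.txt',
    filetype=None,
    custom_meta={'sample': 'test sample', 'operator': 'me'},
)
```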

Function: from_simulation

Docstring: Generate spectrum from simulation file.

The returned spectrum has only a few x/y pairs -- one for each simulated line. Values along the x-axis will not be uniform. To make them uniform, one needs to resample the spectrum.

Arguments:
  • in_file: Path to a local file, or file-like object.
  • sim_type: If vasp: .dat file from VASP simulation. If crystal_out: .out file from CRYSTAL simulation (not preferred). If crystal_dat: .dat file from CRYSTAL simulation.
  • use: One of the directions I_tot, I_perp, I_par, I_xx, I_xy, I_xz, I_yy, I_yz, I_zz. I_tot, I_perp and I_par are available for both CRYSTAL and VASP; I_xx, I_xy, I_xz, I_yy, I_yz and I_zz are available only for CRYSTAL. If a Dict is passed, the keys should be directions and the values weighting factors, e.g. use={'I_perp': .1, 'I_par': .9}.
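A sketch of reading a CRYSTAL simulation with the weighted-direction form of use, followed by resampling to a uniform grid as recommended above; the file name and ranges are placeholder assumptions.

```python
import ramanchada2 as rc2

spe = rc2.spectrum.from_simulation(
    'simulation.dat',
    sim_type='crystal_dat',
    use={'I_perp': .1, 'I_par': .9},
)

# The simulated lines are not on a uniform x grid; resample before further processing.
spe_uniform = spe.resample_NUDFT_filter(x_range=(0, 2000), xnew_bins=2000)
```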

Function: from_spectral_component_collection

Docstring: from_spectral_component_collection

Arguments:
  • spe_components: SpectralComponentCollection
  • x: int or array-like, optional, default 2000. x axis of the spectrum.

Function: from_stream

Docstring: No docstring available


Function: from_test_spe

Docstring: Create new spectrum from test data.

Arguments:
  • index: int or None, optional, default is None. If int: will be used as an index of filtered list. If None: a random spectrum will be taken.
  • **kwargs: The rest of the parameters will be used as filter.

Function: from_theoretical_lines

Docstring: Generate spectrum from lmfit shapes.

Arguments:
  • shapes: The shapes to be used for spectrum generation.
  • params: Shape parameters to be used with the shapes.
  • x: Array with x values, by default np.array(2000).

Function: gen_samples

Docstring: No docstring available


Function: get_spikes

Docstring: Get single-bin spikes only.

Get x, y pairs recognised as spikes using left and right successive differences and standard-deviation-based threshold and linear interpolation.

Arguments:
  • old_spe: internal use only
  • new_spe: internal use only
  • n_sigma: optional, default is 10. Threshold is n_sigma times the standard deviation.

Returns: modified Spectrum


Function: hdr_from_multi_exposure

Docstring: Create an HDR spectrum from several spectra with different exposures.

The resulting spectrum will have the details of low-intensity peaks from the long-exposure-time spectrum. As the long-exposure-time spectrum might be saturated, the information for high-intensity peaks will be taken from the short-exposure-time spectrum.

Arguments:
  • spes_in (List[Spectrum]): Set of spectra with different exposures
  • meta_exposure_time (str, optional): The name of the metadata parameter having the exposure/integration time. The units should be the same for all the spectra. Defaults to 'intigration times(ms)'.
  • meta_ymax (str, optional): The name of the metadata parameter having the maximum ADC value. This value will be used as a threshold. If a value in a spectrum is higher, the value will be taken from a spectrum with shorter exposure. Defaults to 'yaxis_max'.
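A sketch of combining three exposures into one HDR spectrum; the file names are placeholders, the access path is an assumption, and the metadata keys are the documented defaults.

```python
import ramanchada2 as rc2

spes = [rc2.spectrum.from_local_file(f)
        for f in ('expo_10ms.txt', 'expo_100ms.txt', 'expo_1000ms.txt')]

spe_hdr = rc2.spectrum.hdr_from_multi_exposure(
    spes_in=spes,
    meta_exposure_time='intigration times(ms)',  # default key name, as documented above
    meta_ymax='yaxis_max',                       # default saturation-threshold key
)
```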

Function: hht_sharpening

Docstring: Hilbert-Huang based sharpening.

In order to reduce the overshooting, a moving minimum is subtracted from the result.

Arguments:
  • old_spe: internal use only
  • new_spe: internal use only
  • movmin: optional. Default is 100. Window size for the moving minimum.

Returns: modified Spectrum


Function: hht_sharpening_chain

Docstring: Hilbert-Huang based chain sharpening.

Sequence of Hilbert-Huang sharpening procedures are performed.

Arguments:
  • old_spe: internal use only
  • new_spe: internal use only
  • movmin: List[int], optional. Default is [150, 50]. The number of values in the list defines how many iterations of HHT sharpening will be performed, and the values define the moving minimum window sizes for the corresponding operations.

Returns: modified Spectrum


Function: moving_average

Docstring: Moving average filter.

Arguments:
  • old_spe: internal use only
  • new_spe: internal use only
  • window_size: int, optional, default is 10.

Returns: modified Spectrum


Function: moving_average_convolve

Docstring: Moving average filter.

Arguments:
  • old_spe: internal use only
  • new_spe: internal use only
  • window_size: int, optional, default is 10.

Returns: modified Spectrum


Function: moving_median

Docstring: Moving median filter.

The resultant spectrum is the moving median of the input.

Arguments:
  • old_spe: internal use only
  • new_spe: internal use only
  • window_size: int, optional, default is 10.

Returns: modified Spectrum


Function: moving_minimum

Docstring: Moving minimum baseline estimator. Successive values are calculated as minima of a rolling rectangular window.


Function: normalize

Docstring: Normalize the spectrum.

Arguments:
  • strategy: If unity: normalize to sum(y). If min_unity: subtract the minimum and normalize to 'unity'. If unity_density: normalize to Σ(y_i*Δx_i). If unity_area: same as unity_density. If minmax: scale amplitudes in range [0, 1]. If 'L1' or 'L2': L1 or L2 norm without subtracting the pedestal.

Function: pad_zeros

Docstring: Extend x-axis by 100% in both directions.

The x-axis of the resultant spectrum will be $[x_{lower}-(x_{upper}-x_{lower}),\; x_{upper}+(x_{upper}-x_{lower})]$. The length of the new spectrum is 3 times the original. The added x-values have a uniform step. In the middle is the original spectrum with the original x and y values. The corresponding y-values for the newly added x-values are always zeros.

Arguments:
  • old_spe: internal use only
  • new_spe: internal use only

Returns: modified Spectrum


Function: recover_spikes

Docstring: Recover single-bin spikes.

Recover x, y pairs recognised as spikes using left and right successive differences and standard-deviation-based threshold and linear interpolation.

Arguments:
  • old_spe: internal use only
  • new_spe: internal use only
  • n_sigma: optional, default is 10. Threshold is n_sigma times the standard deviation.

Returns: modified Spectrum


Function: resample_NUDFT

Docstring: Resample the spectrum using Non-uniform discrete fourier transform.

The x-axis of the result will be uniform. The corresponding y-values will be calculated with NUDFT and inverse FFT.

Arguments:
  • spe: internal use only
  • x_range: optional. Defaults to (0, 4000). The x_range of the new spectrum.
  • xnew_bins: optional. Defaults to 100. Number of bins of the new spectrum
  • window: optional, Defaults to None. The window to be used for lowpass filter. If None 'blackmanharris' is used. If no low-pass filter is required, one can use window=lambda x: [1]*len(x).
  • cumulative: optional. Defaults to False. If True, the resultant spectrum will be cumulative and normalized (in analogy with CDF).
Returns:

(x_values, y_values)


Function: resample_NUDFT_filter

Docstring: Resample the spectrum using Non-uniform discrete fourier transform.

The x-axis of the result will be uniform. The corresponding y-values will be calculated with NUDFT and inverse FFT.

Arguments:
  • old_spe: internal use only
  • new_spe: internal use only
  • x_range: optional. Defaults to (0, 4000). The x_range of the new spectrum.
  • xnew_bins: optional. Defaults to 100. Number of bins of the new spectrum
  • window: optional, Defaults to None. The window to be used for lowpass filter. If None 'blackmanharris' is used. If no low-pass filter is required, one can use window=lambda x: [1]*len(x).
  • cumulative: optional. Defaults to False. If True, the resultant spectrum will be cumulative and normalized (in analogy with CDF).

Returns: modified Spectrum


Function: resample_spline

Docstring: Resample the spectrum using spline interpolation.

The x-axis of the result will be uniform. The corresponding y-values will be calculated with spline interpolation.

Arguments:
  • spe: internal use only
  • x_range: optional. Defaults to (0, 4000). The x_range of the new spectrum.
  • xnew_bins: optional. Defaults to 100. Number of bins of the new spectrum
  • spline: optional. Defaults to 'pchip'. Name of the spline function to be used.
  • cumulative: optional. Defaults to False. If True, the resultant spectrum will be cumulative and normalized (in analogy with CDF).
Returns:

(x_values, y_values)


Function: resample_spline_filter

Docstring: Resample the spectrum using spline interpolation.

The x-axis of the result will be uniform. The corresponding y-values will be calculated with spline interpolation.

Arguments:
  • old_spe: internal use only
  • new_spe: internal use only
  • x_range: optional. Defaults to (0, 4000). The x_range of the new spectrum.
  • xnew_bins: optional. Defaults to 100. Number of bins of the new spectrum
  • spline: optional. Defaults to 'pchip'. Name of the spline function to be used.
  • cumulative: optional. Defaults to False. If True, the resultant spectrum will be cumulative and normalized (in analogy with CDF).

Returns: modified Spectrum


Function: scale_xaxis_fun

Docstring: Apply arbitrary calibration function to the x-axis values.

Arguments:
  • old_spe: internal use only
  • new_spe: internal use only
  • fun: function to be applied
  • args: Additional arguments to the provided functions

Returns: Corrected spectrum

Raises:
  • ValueError: If the new x-values are not strictly monotonically increasing.

Function: scale_xaxis_linear

Docstring: Scale x-axis using a factor.

Arguments:
  • old_spe: internal use only
  • new_spe: internal use only
  • factor: Defaults to 1. Multiply x-axis values with factor
  • preserve_integral: optional. Defaults to False. If True, preserves the integral in the sense $\sum_i y_{orig,i}\,\Delta x_{orig,i} = \sum_i y_{new,i}\,\Delta x_{new,i}$.

Returns: Corrected spectrum


Function: scale_yaxis_linear

Docstring: Scale y-axis values

This function provides the same result as spe*const

Arguments:
  • old_spe: internal use only
  • new_spe: internal use only
  • factor: optional. Defaults to 1. y-values scaling factor.

Returns: corrected spectrum


Function: set_new_xaxis

Docstring: Substitute x-axis values with new ones

Arguments:
  • old_spe: internal use only
  • new_spe: internal use only
  • xaxis: new x-axis values

Returns: corrected spectrum

Raises:
  • ValueError: If the provided array does not match the shape of the spectrum.

Function: shift_cm_1_to_abs_nm

Docstring: Convert Raman shift in wavenumbers to wavelength

Arguments:
  • spe: internal use only
  • laser_wave_length_nm: Laser wavelength

Returns: Corrected x-values


Function: shift_cm_1_to_abs_nm_filter

Docstring: Convert Raman shift in wavenumbers to wavelength

Arguments:
  • spe: internal use only
  • laser_wave_length_nm: Laser wavelength

Returns: Spectrum with corrected x-values


Function: smoothing_RC1

Docstring: Smooth the spectrum.

The spectrum will be smoothed using the specified filter. This method is inherited from ramanchada1 for compatibility reasons.

Arguments:
  • old_spe: internal use only
  • new_spe: internal use only
  • method: method to be used
  • **kwargs: keyword arguments to be passed to the selected method

Returns: modified Spectrum


Function: spe_distribution

Docstring: No docstring available


Function: spike_indices

Docstring: Find spikes in spectrum

Single-bin spikes are located using left and right successive differences. The threshold is based on the standard deviation of the metric, which makes this algorithm suboptimal.

Arguments:
  • spe: internal use only
  • n_sigma: The threshold value is n_sigma times the standard deviation of the metric.

Returns: List of spike indices


Function: subtract_baseline_rc1_als

Docstring: No docstring available


Function: subtract_baseline_rc1_snip

Docstring: No docstring available


Function: subtract_moving_median

Docstring: Subtract moving median filter.

The resultant spectrum is the moving median of the input subtracted from the input.

Arguments:
  • old_spe: internal use only
  • new_spe: internal use only
  • window_size: int, optional, default is 10.

Returns: modified Spectrum


Function: subtract_moving_minimum

Docstring: No docstring available


Function: trim_axes

Docstring: Trim axes of the spectrum.

Arguments:
  • old_spe: internal use only
  • new_spe: internal use only
  • method: 'x-axis' or 'bins'. If 'x-axis', boundaries will be interpreted as x-axis values. If 'bins', boundaries will be interpreted as indices.
  • boundaries: lower and upper boundary for the trimming.

Returns: modified Spectrum


Function: xcal_argmin2d_iter_lowpass

Docstring: Calibrate spectrum

The calibration is done in multiple steps. Both the spectrum and the reference are passed through a low-pass filter to preserve only the general structure of the spectrum. low_pass_nfreqs defines the number of frequencies to be preserved in each step. After all low-pass steps, a final step without a low-pass filter is performed. Each calibration step is performed using the ~ramanchada2.spectrum.calibration.by_deltas.xcal_fine() algorithm.

Arguments:
  • old_spe (Spectrum): internal use only
  • new_spe (Spectrum): internal use only
  • ref (Dict[float, float]): wavenumber - amplitude pairs
  • low_pass_nfreqs (List[int], optional): The number of elements defines the number of low-pass steps and their values define the amount of frequencies to keep. Defaults to [100, 500].

Function: xcal_fine

Docstring: Iterative calibration with provided reference based on ~ramanchada2.misc.utils.argmin2d.align()

Iteratively apply polynomial of poly_order degree to match the found peaks to the reference locations. The pairs are created using ~ramanchada2.misc.utils.argmin2d.align() algorithm.

Arguments:
  • old_spe (Spectrum): internal use only
  • new_spe (Spectrum): internal use only
  • ref (Union[Dict[float, float], List[float]]): If a dict is provided: wavenumber -- amplitude pairs. If a list is provided: wavenumbers only.
  • poly_order (NonNegativeInt): polynomial degree to be used, usually 2 or 3.
  • max_iter (NonNegativeInt): max number of iterations for the iterative polynomial alignment
  • should_fit (bool, optional): Whether the peaks should be fit or to associate the positions with the maxima. Defaults to False.
  • find_peaks_kw (dict, optional): kwargs to be used in find_peaks. Defaults to {}.
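A sketch of a fine-calibration call; the reference positions are illustrative placeholders, not real calibration lines, and the starting spectrum is an assumption.

```python
import ramanchada2 as rc2

spe = rc2.spectrum.from_test_spe()

# Reference: wavenumber -> relative amplitude (illustrative values only).
ref_peaks = {1001.4: 1.0, 1602.3: 0.6, 2330.0: 0.3}

spe_cal = spe.xcal_fine(
    ref=ref_peaks,
    poly_order=2,      # usually 2 or 3
    max_iter=30,
    should_fit=False,  # associate peak positions with maxima instead of fitting
)
```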

Function: xcal_fine_RBF

Docstring: Wavelength calibration using Radial basis function interpolation

Please be cautious! Interpolation might not be the most appropriate approach for this type of calibration.

**kwargs are passed to RBFInterpolator


Function: xcal_rough_poly2_all_pairs

Docstring: No docstring available


Function: y_noise_savgol

Docstring: No docstring available


Function: y_noise_savgol_DL

Docstring: No docstring available


#!/usr/bin/env python3


"""
# Purpose
The `ramanchada2` software package is meant to fill the gap between theoretical
Raman analysis and experimental Raman spectroscopy by providing means to
compare data of different origin. The software is at an early development stage,
but it is already able to solve practical problems.

# Features

## Read simulated data
Process simulated data from [VASP][] and [CRYSTAL][] and provide the same
interface for both. CRYSTAL data contain intensities for multiple orientations --
laser beam incident perpendicular or parallel to the observation direction --
and information for mono-crystals. VASP provides data only for poly-crystals,
and in a different format, so the perpendicular and parallel intensities are
calculated by an implemented [algorithm][].

## Models
[LMFIT][] theoretical models can be built from spectral information obtained by
simulations or provided by the user. These models can be fit to experimental
data, providing calibration information. With a poor initial calibration the
minimisation procedure naturally fails. An iterative procedure aiming to solve
this problem was adopted in the code. On the first iteration the experimental
spectrum lines are artificially broadened. This makes it possible for the
minimisation procedure to find parameters that are close enough to be used as
an initial guess for the second iteration. In a few iterations the algorithm is
able to fit the original experimental data. This idea is implemented and is at
proof-of-concept level.

## Generate spectra
Spectra can be generated from the theoretical models. Random Poissonian noise and
an artificial randomly generated baseline can be added to the generated spectra,
making them convenient tools for testing new analysis methods.

## Spectrum manipulation
A number of filters can be applied to spectra (experimental and generated).
Scaling on both x and y axes is possible. Scaling can be linear or defined by an
arbitrary user-defined function. Convolution is possible with a set of predefined
functions as well as with a user-defined model.

# Concept
The code is object oriented and written in Python. The main elements are Spectrum
and the theoretical models. Theoretical models are based on the LMFIT library,
while Spectrum is a custom-made class. A Spectrum object contains data for the
x and y axes and metadata coming from experimental files or other sources. It is
planned to add information about the uncertainties in x and y. All filters and
manipulation procedures are available as class methods. Measures are taken to
keep Spectrum instances immutable, so filters generate new spectra, leaving the
original unchanged. Additionally, a Spectrum carries information about its
history -- the sequence of applied filters.

# File formats

## `.cha`
The [ramanchada][] software package introduced the `.cha` file format, which is
an [HDF5][] file with a simple layout.

### Cache in .cha files
The concept of keeping previous variants of the data is employed in `ramanchada2`.
If configured to do so, the software saves the data for all Spectrum instances to
a tree-organized `.cha` file. When a particular chain of operations is requested
by the user, the software checks whether the final result is present in the cache
file; if so, it is provided, otherwise the software checks for its parent. When a
parent or one of the grandparents is present, it is taken as a starting point and
the needed steps are applied to produce the final result. The current
implementation uses the [h5py][] library to access local HDF files. An
implementation based on [h5pyd][], which supports network operations, is foreseen.

## Nexus format

The latest ramanchada2 package allows export of a spectrum to [NeXus][] format.


[CRYSTAL]: https://www.crystal.unito.it/index.php
[HDF5]: https://hdfgroup.org/
[LMFIT]: https://lmfit.github.io/lmfit-py/index.html
[VASP]: https://www.vasp.at/
[algorithm]: https://doi.org/10.1103/PhysRevB.54.7830
[h5py]: https://h5py.org/
[h5pyd]: https://github.com/HDFGroup/h5pyd
[ramanchada]: https://github.com/h2020charisma/ramanchada
[NeXus]: https://www.nexusformat.org/

.. include:: ../../../../../../spectrum_functions.md
"""

from __future__ import annotations

from . import spectrum
from . import theoretical_lines
__all__ = [
    'auxiliary',
    'io',
    'misc',
    'protocols',
    'spectral_components',
    'spectrum',
    'theoretical_lines'
]
__version__ = '1.3.0'


import logging


class CustomFormatter(logging.Formatter):
    green = "\x1b[32m"
    blue = "\x1b[34m"
    yellow = "\x1b[33m"
    red = "\x1b[31m"
    bold_red = "\x1b[31;1m"
    reset = "\x1b[0m"
    fmt = "%(asctime)s %(name)s %(levelname)s - %(message)s"
    fmt = "%(levelname)s - %(filename)s:%(lineno)d %(funcName)s() - %(message)s"

    FORMATS = {
        logging.DEBUG: green + fmt + reset,
        logging.INFO: blue + fmt + reset,
        logging.WARNING: yellow + fmt + reset,
        logging.ERROR: red + fmt + reset,
        logging.CRITICAL: bold_red + fmt + reset
    }

    def format(self, record):
        log_fmt = self.FORMATS.get(record.levelno)
        formatter = logging.Formatter(log_fmt)
        return formatter.format(record)


def basicConfig(level=logging.INFO):
    ch = logging.StreamHandler()
    ch.setLevel(level)
    ch.setFormatter(CustomFormatter())
    logging.basicConfig(handlers=[ch], force=True)


stream = logging.StreamHandler()
stream.setFormatter(CustomFormatter())
logging.basicConfig(handlers=[stream], force=True)
logger = logging.getLogger(__name__)