Atomic magnetometers are emerging as an alternative to SQUID magnetometers for the detection of biological magnetic fields. They have been used to measure both magnetocardiography (MCG) and magnetoencephalography (MEG) signals. One of the virtues of atomic magnetometers is their ability to operate as a multi-channel detector while sharing many common elements. Here we study two configurations of such a multi-channel atomic magnetometer optimized for MEG detection. We describe measurements of auditory evoked fields (AEF) from a human brain as well as the localization of dipolar phantoms and of the AEF sources. A clear N100m peak in the AEF was observed with a signal-to-noise ratio higher than 10 after averaging over 250 stimuli. Currently the intrinsic magnetic noise level is 4 fT Hz$^{-1/2}$ at 10 Hz. We compare the performance of the two systems with regard to current-source localization and discuss the future development of atomic MEG systems.
COBISS.SI-ID: 10306644
Cloud computing represents one of the fastest growing areas of technology and offers a new computing model for various applications and services. This model is particularly interesting for the area of biometric recognition, where scalability, processing power, and storage requirements become an increasingly pressing issue with each new generation of recognition technology. Beyond the availability of computing resources, another important aspect of cloud computing with respect to biometrics is accessibility. Since biometric cloud services are easily accessible, it is possible to combine different existing implementations and design new multi-biometric services that, in addition to almost unlimited resources, also offer superior recognition performance and, consequently, ensure improved security to their client applications. Unfortunately, the literature on how best to combine existing implementations of cloud-based biometric experts into a multi-biometric service is virtually nonexistent. In this paper, we try to close this gap and evaluate different strategies for combining existing biometric experts into a multi-biometric cloud service. We analyze the (fusion) strategies from different perspectives, such as performance gains, training complexity, and resource consumption, and present results and findings important to software developers and other researchers working in the areas of biometrics and cloud computing. The analysis is conducted based on two biometric cloud services, which are also presented in the paper.
COBISS.SI-ID: 10478420
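The fusion strategies evaluated in the paper above are not spelled out in the abstract; a minimal sketch of one common family, score-level fusion by a (weighted) sum of normalized expert scores, is shown below. The function names and the min-max normalization step are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def minmax_normalize(scores):
    """Map one expert's raw similarity scores to [0, 1],
    a common pre-fusion step so experts are comparable."""
    lo, hi = scores.min(), scores.max()
    return (scores - lo) / (hi - lo)

def fuse_sum(score_matrix, weights=None):
    """Weighted-sum fusion of per-expert scores.

    score_matrix : shape (n_samples, n_experts), already normalized.
    weights      : per-expert weights; defaults to a plain average.
    Returns one fused score per sample.
    """
    n_experts = score_matrix.shape[1]
    if weights is None:
        weights = np.full(n_experts, 1.0 / n_experts)
    return score_matrix @ weights
```

With equal weights this reduces to the classic sum rule; unequal weights can favour the stronger expert, at the cost of a small training step to estimate them.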
Similarity scores represent the basis for identity inference in biometric verification systems. However, because of so-called mismatched conditions across enrollment and probe samples, as well as identity-dependent factors, these scores typically exhibit statistical variations that affect the verification performance of biometric systems. To mitigate these variations, score normalisation techniques, such as the z-norm, the t-norm or the zt-norm, are commonly adopted. In this study, the authors study the problem of score normalisation in the scope of biometric verification and introduce a new class of non-parametric normalisation techniques, which make no assumptions regarding the shape of the distribution from which the scores are drawn (as the parametric techniques do). Instead, they estimate the shape of the score distribution and use the estimate to map the initial distribution to a common (predefined) distribution. Based on the new class of normalisation techniques, they also develop a hybrid normalisation scheme that combines non-parametric and parametric techniques into hybrid two-step procedures. They evaluate the performance of the non-parametric and hybrid techniques in face-verification experiments on the FRGCv2 and SCFace databases and show that the non-parametric techniques outperform their parametric counterparts and that the hybrid procedure is not only feasible, but also retains some desirable characteristics of both the non-parametric and the parametric techniques.
COBISS.SI-ID: 10602068
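To make the contrast in the study above concrete, the sketch below pairs a parametric z-norm (standardization against impostor-score statistics) with a simple non-parametric alternative that maps scores through their empirical CDF onto standard-normal quantiles. This is a generic illustration of the two families, assuming hypothetical inputs; it is not the paper's specific procedure.

```python
import numpy as np
from statistics import NormalDist

def z_norm(scores, impostor_scores):
    """Parametric z-norm: standardize scores using the mean and
    standard deviation of an impostor score set."""
    mu = impostor_scores.mean()
    sigma = impostor_scores.std()
    return (scores - mu) / sigma

def rank_norm(scores):
    """Non-parametric normalisation: rank each score, convert ranks
    to empirical-CDF values in (0, 1), then map them to standard-normal
    quantiles, so no shape is assumed for the input distribution."""
    n = len(scores)
    ranks = scores.argsort().argsort() + 1      # 1..n
    u = ranks / (n + 1.0)                       # stay strictly inside (0, 1)
    return np.array([NormalDist().inv_cdf(p) for p in u])
```

A two-step hybrid in the spirit of the paper would apply a mapping like `rank_norm` first and a parametric step such as `z_norm` on its output.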
In the present study, a comprehensive statistical analysis of the chemical composition of atmospheric particulate matter was carried out. The data were collected from April 2003 to August 2008 with a 7-day time resolution in the Port of Koper and analyzed by the proton-induced X-ray emission (PIXE) method. The concentrations of fifteen chemical elements (S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Ni, Cu, Zn, Br, Sr and Pb) were further explored by applying positive matrix factorization (PMF). Factorization of the long-term PIXE data enables us to study the seasonal variability of the chemical elements and source loadings by performing trend and seasonal decomposition of the derived sources. The PMF analysis identified six source factors, three natural-regional sources and three local-anthropogenic sources, which were extensively analyzed using statistical methods from pattern recognition and signal processing.
COBISS.SI-ID: 1536548804
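The factorization step above decomposes the samples-by-elements concentration matrix into non-negative source contributions and source profiles. As a rough illustration, the sketch below implements plain non-negative matrix factorization with Lee-Seung multiplicative updates; note that PMF proper additionally weights each matrix entry by its measurement uncertainty, which this simplified stand-in omits.

```python
import numpy as np

def nmf(X, k, n_iter=500, seed=0):
    """Minimal non-negative factorization X ≈ G @ F via Lee-Seung
    multiplicative updates. G holds source contributions per sample,
    F holds source profiles per chemical element. (Plain NMF; actual
    PMF also weights residuals by per-entry uncertainties.)"""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    G = rng.random((n, k)) + 1e-3
    F = rng.random((k, m)) + 1e-3
    for _ in range(n_iter):
        # Updates preserve non-negativity and reduce ||X - G F||_F
        F *= (G.T @ X) / (G.T @ G @ F + 1e-12)
        G *= (X @ F.T) / (G @ F @ F.T + 1e-12)
    return G, F
```

For the study above, `X` would be the weekly element-concentration matrix and `k = 6` the number of identified source factors; the columns of `G` then give the time series on which trend and seasonal decomposition is performed.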
Thermometers in laboratory environments and industrial applications are often subject to extraneous, usually unwanted and uncontrolled magnetic fields. The influence of magnetic fields can be minimized, but cannot be fully cancelled out. Moreover, in most cases there is no awareness of the existence of magnetic fields, let alone of their effect on measurement instrumentation. The goal of this paper was to analyse and experimentally demonstrate the magnetic sensitivity of thermocouples exposed to low magnetic fields, both dc and ac. From the results, it can be concluded that, for temperature measurements of the highest accuracy in the above-cryogenic temperature range, magnetic sensitivity should ideally be estimated and taken into account, either as the correction of an error and/or as an additional source of measurement uncertainty. Special consideration should be given to the thermocouple orientation relative to the magnetic field direction, the influence of metal enclosures, and magnetization effects on the ferromagnetic components of thermocouples.
COBISS.SI-ID: 10414932