Yao, Zhigang
(2011)
A Statistical Approach to the Inverse Problem in Magnetoencephalography.
Doctoral Dissertation, University of Pittsburgh.
(Unpublished)
Abstract
Magnetoencephalography (MEG) is an imaging technique used to measure the magnetic field outside the human head produced by the electrical activity inside the brain. The MEG inverse problem, identifying the location of the electric sources from the magnetic signal measurements, is ill-posed; that is, there are infinitely many mathematically correct solutions. Common source localization methods assume the source does not vary with time and do not provide estimates of the variability of the fitted model. We reformulate the MEG inverse problem by considering time-varying sources, and we model their time evolution using a state space model. Based on our model, we investigate the inverse problem by finding the posterior source distribution given the multiple channels of observations at each time, rather than fitting fixed source estimates. A computational challenge arises because the data likelihood is nonlinear, so Markov chain Monte Carlo (MCMC) methods, including conventional Gibbs sampling, are difficult to implement. We propose two new Monte Carlo methods based on sequential importance sampling. Unlike the usual MCMC sampling scheme, our new methods work in this situation without the need to tune a high-dimensional transition kernel, which is very costly. We have created a set of C programs under Linux and use Parallel Virtual Machine (PVM) software to speed up the computation.

Common methods used to estimate the number of sources in MEG data include principal component analysis and factor analysis, both of which make use of the eigenvalue distribution of the data. Other methods involve the information criterion and minimum description length. Unfortunately, all these methods are very sensitive to the signal-to-noise ratio (SNR). First, we consider a wavelet approach, a residual analysis approach, and a Fourier approach to estimate the noise variance.
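As a minimal illustration of the noise-variance estimation idea, the following sketch uses the median absolute deviation of first differences along time, which act like finest-scale Haar wavelet detail coefficients: they nearly cancel a smooth signal while retaining the noise. The channel count, sample length, and signal model are illustrative assumptions, not the dissertation's data or its exact estimator.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 20 channels, 500 samples of a smooth (sinusoidal)
# per-channel signal plus white sensor noise with true std dev 1.0.
C, T = 20, 500
t = np.linspace(0.0, 8.0 * np.pi, T)
signal = rng.normal(size=(C, 1)) * np.sin(t)   # smooth signal per channel
data = signal + rng.normal(size=(C, T))        # add unit-variance noise

# First differences nearly cancel the smooth signal but keep the noise,
# whose differences have standard deviation sigma * sqrt(2). The constant
# 0.6745 converts the median absolute deviation to a Gaussian std dev.
d = np.diff(data, axis=1).ravel()
sigma_hat = np.median(np.abs(d)) / (0.6745 * np.sqrt(2.0))
noise_var_hat = sigma_hat ** 2
```

Because the median is insensitive to the sparse large coefficients a smooth signal leaves behind, this kind of estimator stays close to the true noise level even when the SNR is high.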
Second, a Neyman-Pearson detection-theory-based eigen-thresholding method is used to decide the number of signal sources. We apply our methods to simulated data where we know the truth. A real MEG dataset recorded without a human subject is also tested. Our methods allow us to estimate the noise more accurately and are robust in deciding the number of signal sources.
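A toy version of the eigen-thresholding step might look like the following sketch. The dimensions, mixing matrix, and fixed cutoff factor are illustrative assumptions: a true Neyman-Pearson rule would set the cutoff from a target false-alarm probability, and the noise variance would come from an estimate rather than being taken as known.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data: 20 channels, 500 samples, 3 sources in unit noise.
C, T, K = 20, 500, 3
A = rng.normal(size=(C, K))                    # assumed mixing matrix
t = np.linspace(0.0, 8.0 * np.pi, T)
S = np.sin(np.outer(np.arange(1, K + 1), t))   # smooth source time courses
data = 5.0 * (A @ S) + rng.normal(size=(C, T))

# Eigenvalues of the sample channel covariance, largest first: a few
# large "signal" eigenvalues stand out above the noise bulk near
# sigma^2 = 1, which is what PCA/factor-analysis approaches exploit.
evals = np.linalg.eigvalsh(np.cov(data))[::-1]

# Count eigenvalues above a cutoff set relative to the noise variance.
# The factor 3 is an ad hoc stand-in for a Neyman-Pearson threshold.
noise_var = 1.0
k_hat = int(np.sum(evals > 3.0 * noise_var))
```

Here the threshold scales with the noise variance, which is why an accurate noise estimate matters: an inflated estimate raises the cutoff and swallows weak sources, while an underestimate lets noise eigenvalues masquerade as signal.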
Details
Item Type: 
University of Pittsburgh ETD

Status: 
Unpublished 
Creators/Authors: 
Yao, Zhigang
ETD Committee: 

Date: 
30 September 2011 
Date Type: 
Completion 
Defense Date: 
1 June 2011 
Approval Date: 
30 September 2011 
Submission Date: 
15 August 2011 
Access Restriction: 
5 year - Restrict access to University of Pittsburgh for a period of 5 years. 
Institution: 
University of Pittsburgh 
Schools and Programs: 
Dietrich School of Arts and Sciences > Statistics 
Degree: 
PhD - Doctor of Philosophy 
Thesis Type: 
Doctoral Dissertation 
Refereed: 
Yes 
Uncontrolled Keywords: 
ill-posed problem; Neyman-Pearson eigen-thresholding; sequential importance sampling; state space model; virtual dimensionality; Fourier basis; wavelet basis; parallel computing 
Other ID: 
http://etd.library.pitt.edu/ETD/available/etd08152011141540/, etd08152011141540 
Date Deposited: 
10 Nov 2011 19:59 
Last Modified: 
15 Nov 2016 13:49 
URI: 
http://dscholarship.pitt.edu/id/eprint/9115 