A Statistical Approach to the Inverse Problem in Magnetoencephalography

Yao, Zhigang (2011) A Statistical Approach to the Inverse Problem in Magnetoencephalography. Doctoral Dissertation, University of Pittsburgh. (Unpublished)

Primary Text: PDF (2MB)

Abstract

Magnetoencephalography (MEG) is an imaging technique used to measure the magnetic field outside the human head produced by the electrical activity inside the brain. The MEG inverse problem, identifying the location of the electric sources from the magnetic signal measurements, is ill-posed; that is, there are infinitely many mathematically correct solutions. Common source localization methods assume that the source does not vary with time and provide no estimates of the variability of the fitted model. We reformulate the MEG inverse problem by considering time-varying sources and model their time evolution with a state space model. Based on this model, we investigate the inverse problem by finding the posterior source distribution given the multiple channels of observations at each time point, rather than fitting fixed source estimates. A computational challenge arises because the data likelihood is nonlinear, so Markov chain Monte Carlo (MCMC) methods, including conventional Gibbs sampling, are difficult to implement. We propose two new Monte Carlo methods based on sequential importance sampling. Unlike the usual MCMC sampling scheme, our new methods work in this situation without needing to tune a high-dimensional transition kernel, which is very costly. We have created a set of C programs under Linux and use Parallel Virtual Machine (PVM) software to speed up the computation.

Common methods used to estimate the number of sources in MEG data include principal component analysis and factor analysis, both of which make use of the eigenvalue distribution of the data. Other methods involve information criteria and minimum description length. Unfortunately, all of these methods are very sensitive to the signal-to-noise ratio (SNR). First, we consider a wavelet approach, a residual analysis approach, and a Fourier approach to estimate the noise variance. Second, an eigenthresholding method based on Neyman-Pearson detection theory is used to decide the number of signal sources. We apply our methods to simulated data where we know the truth, and also test them on a real MEG dataset recorded without a human subject. Our methods allow us to estimate the noise more accurately and are robust in deciding the number of signal sources.
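The sequential importance sampling scheme named in the abstract can be illustrated with a toy example. The C program below (the dissertation mentions a set of C programs under Linux, so C is used here) runs plain SIS on a one-dimensional state space model with a random-walk state and a nonlinear observation map. Everything concrete in it, the quadratic observation function h, the noise levels, and the particle count, is an illustrative assumption; the dissertation's actual source model, MEG lead field, proposal distributions, and PVM parallelization are not reproduced.

/*
 * Toy sketch of sequential importance sampling (SIS) for a
 * one-dimensional state space model: x_t = x_{t-1} + w_t (random walk),
 * y_t = h(x_t) + v_t with nonlinear h. Compile with: cc sis.c -lm
 * Resampling is deliberately omitted: plain SIS only reweights, which
 * is why importance weights can degenerate over long series.
 */
#include <stdio.h>
#include <stdlib.h>
#include <math.h>

#define N_PARTICLES 1000
#define N_STEPS     50

/* Hypothetical nonlinear observation map (stand-in for the MEG lead field). */
static double h(double x) { return x * x; }

/* Standard normal draw via Box-Muller. */
static double randn(void)
{
    double u1 = (rand() + 1.0) / ((double)RAND_MAX + 2.0);
    double u2 = (rand() + 1.0) / ((double)RAND_MAX + 2.0);
    return sqrt(-2.0 * log(u1)) * cos(6.283185307179586 * u2);
}

int main(void)
{
    double x[N_PARTICLES], w[N_PARTICLES];
    const double sigma_state = 0.1, sigma_obs = 0.5;
    int i, t;

    srand(1);
    /* Initialize particles from the prior and weights uniformly. */
    for (i = 0; i < N_PARTICLES; i++) {
        x[i] = randn();
        w[i] = 1.0 / N_PARTICLES;
    }

    for (t = 0; t < N_STEPS; t++) {
        /* Simulated observation, for illustration only. */
        double y = h(0.5) + sigma_obs * randn();
        double wsum = 0.0, est = 0.0;

        for (i = 0; i < N_PARTICLES; i++) {
            /* Propagate through the random-walk state equation. */
            x[i] += sigma_state * randn();
            /* Reweight by the Gaussian observation likelihood. */
            double r = y - h(x[i]);
            w[i] *= exp(-0.5 * r * r / (sigma_obs * sigma_obs));
            wsum += w[i];
        }
        /* Normalize weights and form the posterior mean estimate. */
        for (i = 0; i < N_PARTICLES; i++) {
            w[i] /= wsum;
            est += w[i] * x[i];
        }
        printf("t=%2d  posterior mean = %f\n", t, est);
    }
    return 0;
}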
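The abstract's first step, estimating the noise variance, can likewise be sketched. The toy program below uses the classical wavelet device of taking the median absolute deviation of the finest-scale Haar detail coefficients and dividing by 0.6745 (the standard normal consistency factor). This standard estimator is offered only as a stand-in; the dissertation's own wavelet, residual analysis, and Fourier constructions are not reproduced here.

/*
 * Toy sketch of wavelet-based noise variance estimation: estimate the
 * noise standard deviation from the median absolute deviation of the
 * finest-scale Haar detail coefficients. Compile with: cc noise.c -lm
 */
#include <stdio.h>
#include <stdlib.h>
#include <math.h>

static int cmp(const void *a, const void *b)
{
    double d = *(const double *)a - *(const double *)b;
    return (d > 0) - (d < 0);
}

int main(void)
{
    /* Hypothetical channel recording: smooth signal plus noise. */
    double y[64];
    int n = 64, i;
    srand(1);
    for (i = 0; i < n; i++)
        y[i] = sin(i / 8.0) + 0.3 * ((double)rand() / RAND_MAX - 0.5);

    /* Finest-scale Haar detail coefficients: (y[2i] - y[2i+1]) / sqrt(2). */
    double d[32];
    int m = n / 2;
    for (i = 0; i < m; i++)
        d[i] = (y[2 * i] - y[2 * i + 1]) / sqrt(2.0);

    /* Median absolute deviation of the details, scaled to estimate sigma. */
    for (i = 0; i < m; i++)
        d[i] = fabs(d[i]);
    qsort(d, m, sizeof d[0], cmp);
    double mad = (m % 2) ? d[m / 2] : 0.5 * (d[m / 2 - 1] + d[m / 2]);
    double sigma_hat = mad / 0.6745;

    printf("estimated noise standard deviation: %f\n", sigma_hat);
    return 0;
}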
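For the second step, deciding the number of sources by eigenthresholding, the following toy program counts how many eigenvalues of the data covariance exceed a threshold tied to the estimated noise variance. The eigenvalues, the noise-variance estimate, and the threshold margin are all placeholders; in particular, the margin stands in for the Neyman-Pearson calibration of the false-alarm level described in the abstract, which is not reproduced.

/*
 * Toy sketch of eigenvalue thresholding to estimate the number of
 * signal sources, assuming the data-covariance eigenvalues are already
 * computed (e.g., via LAPACK) and the noise variance has already been
 * estimated. The threshold margin below is a placeholder for a
 * Neyman-Pearson-calibrated test level.
 */
#include <stdio.h>

int main(void)
{
    /* Hypothetical eigenvalues, sorted in decreasing order. */
    double lambda[] = { 9.1, 4.3, 2.2, 1.1, 1.05, 0.98, 0.95, 0.9 };
    int p = (int)(sizeof lambda / sizeof lambda[0]);

    double sigma2_hat = 1.0;  /* assumed noise-variance estimate */
    double margin = 0.25;     /* placeholder for the NP-derived margin */
    double tau = sigma2_hat * (1.0 + margin);

    /* Count eigenvalues significantly above the noise floor. */
    int k = 0;
    while (k < p && lambda[k] > tau)
        k++;

    printf("estimated number of sources: %d\n", k);
    return 0;
}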


Details

Item Type: University of Pittsburgh ETD
Status: Unpublished
Creators/Authors:
    Creator: Yao, Zhigang | Email: zhy16@pitt.edu | Pitt Username: ZHY16
ETD Committee:
    Committee Chair: Gleser, Leon J. | Email: gleser@pitt.edu | Pitt Username: GLESER
    Committee Co-Chair: Eddy, William F. | Email: bill@stat.cmu.edu
    Committee Member: Krafty, Robert T. | Email: krafty@pitt.edu | Pitt Username: KRAFTY
    Committee Member: Iyengar, Satish | Email: ssi@pitt.edu | Pitt Username: SSI
Date: 30 September 2011
Date Type: Completion
Defense Date: 1 June 2011
Approval Date: 30 September 2011
Submission Date: 15 August 2011
Access Restriction: 5 years (access restricted to the University of Pittsburgh for a period of 5 years).
Institution: University of Pittsburgh
Schools and Programs: Dietrich School of Arts and Sciences > Statistics
Degree: PhD - Doctor of Philosophy
Thesis Type: Doctoral Dissertation
Refereed: Yes
Uncontrolled Keywords: ill-posed problem; Neyman-Pearson eigenthresholding; sequential importance sampling; state space model; virtual dimensionality; Fourier basis; wavelet basis; parallel computing
Other ID: http://etd.library.pitt.edu/ETD/available/etd-08152011-141540/, etd-08152011-141540
Date Deposited: 10 Nov 2011 19:59
Last Modified: 15 Nov 2016 13:49
URI: http://d-scholarship.pitt.edu/id/eprint/9115
