Dos Santos Diniz, Eduardo Jose
(2024)
Multi-Scale Spatiotemporal Neural Computation: On the relationship between dynamical attractors, spiking neural networks, and convolutional neural circuits.
Doctoral Dissertation, University of Pittsburgh.
(Unpublished)
Abstract
Understanding spatiotemporal neural dynamics and developing biologically inspired artificial neural networks remain open challenges in computational neuroscience. Critical gaps persist in elucidating cortical rhythms, memory consolidation, and biological networks' remarkable spatiotemporal processing capabilities. This dissertation hypothesizes that asymmetric connectivity and dedicated fast-slow processing pathways in neural systems enhance depth, robustness, and versatility in handling complex spatiotemporal patterns. Our first contribution elucidates how neurons communicate and synchronize activity via temporally precise spikes by examining the dynamics of spike-coding networks. Developing models of cortical neural oscillators reveals the origins of spontaneous transitions between active and silent states underlying slow-wave sleep rhythms, demonstrating how the intricate balance of excitation and inhibition orchestrates these oscillations. Our second contribution establishes a mathematical equivalence between Hopfield networks' associative memory models and spike-coding networks by showing that fast and slow asymmetric connectivity weights induce equivalent cyclic attractor dynamics in both systems. Introducing asymmetric weights in slow connections enables both models to learn and generate complex temporal firing sequences, transitioning between quasi-attractor states that represent stored memories. Simulations demonstrate the efficacy of spike-coding networks for encoding and retrieving temporal sequences while performing the n-back working memory task. Our third contribution harnesses generative adversarial networks for unpaired cross-modality translation from 3 Tesla to 7 Tesla magnetic resonance imaging. We propose a fast-slow convolutional network architecture that improves translation performance by balancing local and global information processing.
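The cyclic attractor mechanism described above can be illustrated with a minimal Hopfield-style toy (an assumption-laden sketch, not the dissertation's actual model): the standard asymmetric Hebbian rule maps each stored pattern onto its successor, so the network steps through a stored sequence instead of settling into a fixed point.

```python
# Minimal sketch of sequence generation via asymmetric Hebbian weights:
# W[i, j] ~ sum_mu X[mu+1, i] * X[mu, j] pushes the state from each stored
# pattern toward the next, yielding a cyclic (quasi-)attractor rather than
# the fixed-point attractors of a symmetric Hopfield network.
import numpy as np

rng = np.random.default_rng(0)
N, P = 64, 4                                  # neurons, patterns in the cycle
X = rng.choice([-1.0, 1.0], size=(P, N))      # random bipolar patterns

# Asymmetric Hebbian rule: pattern mu is wired to its successor (mu+1 mod P).
W = sum(np.outer(X[(mu + 1) % P], X[mu]) for mu in range(P)) / N

s = X[0].copy()
visited = []
for _ in range(P):
    s = np.sign(W @ s)                        # synchronous update step
    visited.append(int(np.argmax(X @ s)))     # index of the closest pattern

print(visited)  # the state steps through the stored cycle: [1, 2, 3, 0]
```

Making `W` symmetric (the classical Hebbian rule) would instead produce fixed-point attractors at the stored patterns; the asymmetry is what converts stored memories into a temporal firing sequence.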
In sum, this dissertation elucidates brain mechanisms underlying rhythms and memory, unifies foundational computational frameworks, and extracts principles for improving artificial neural network design.
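The fast-slow pathway idea can likewise be sketched generically (this is an illustrative assumption, not the dissertation's architecture): a "fast" pathway filters the signal at full resolution to capture local detail, while a "slow" pathway operates on a downsampled copy to capture global context before the two are merged.

```python
# Generic one-dimensional fast-slow block: full-resolution filtering (local)
# plus downsample -> filter -> upsample (global), merged by summation.
import numpy as np

def fast_slow_block(x, k_fast, k_slow, stride=4):
    # Fast pathway: full-resolution convolution preserves local structure.
    fast = np.convolve(x, k_fast, mode="same")
    # Slow pathway: coarse convolution on a downsampled copy sees a wider
    # effective receptive field, i.e. more global context per filter tap.
    coarse = np.convolve(x[::stride], k_slow, mode="same")
    slow = np.repeat(coarse, stride)[: len(x)]  # nearest-neighbor upsample
    return fast + slow                          # merge local and global info

rng = np.random.default_rng(1)
x = np.sin(np.linspace(0, 8 * np.pi, 256)) + 0.1 * rng.standard_normal(256)
y = fast_slow_block(x, k_fast=np.array([0.25, 0.5, 0.25]), k_slow=np.ones(5) / 5)
print(y.shape)  # (256,)
```

In a convolutional network the same split is realized with learned kernels per pathway; the design choice is the same trade: the fast branch keeps fine spatial detail, the slow branch supplies context that a small full-resolution kernel cannot see.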
Details
Item Type: University of Pittsburgh ETD
Status: Unpublished
Creators/Authors: Dos Santos Diniz, Eduardo Jose
Date: 11 January 2024
Date Type: Publication
Defense Date: 1 September 2023
Approval Date: 11 January 2024
Submission Date: 8 September 2023
Access Restriction: No restriction; release the ETD for access worldwide immediately.
Number of Pages: 180
Institution: University of Pittsburgh
Schools and Programs: Swanson School of Engineering > Electrical Engineering
Degree: PhD - Doctor of Philosophy
Thesis Type: Doctoral Dissertation
Refereed: Yes
Uncontrolled Keywords: Mathematical Neuroscience, Computational Neuroscience, Neural Computation, Neural Coding, Neural Synchronization, Brain Oscillations, Memory Consolidation, N-Back Working Memory Task, Convolutional Neural Networks, Machine Learning, Deep Learning, Spike-Coding Networks, Leaky Integrate-and-Fire Models, Phase-Locking, Wilson-Cowan Models, Neural Oscillators, Bifurcation Analysis, Hopfield Networks, Circulant Matrices, CycleGANs, Fast-Slow Pathways
Date Deposited: 11 Jan 2024 19:39
Last Modified: 11 Jan 2024 19:39
URI: http://d-scholarship.pitt.edu/id/eprint/45468