Thesis Defense
Date:
This talk examines neural synchrony (i.e., the coordinated timing of neuronal activity) as a potential mechanism for visual binding. I introduce three artificial neural network models that induce synchrony through different dynamics, and show that these models improve object representation, robustness, and human-like generalization. I then compare these computational results with shared temporal variance (STV, a proxy for synchrony) measured in primate IT cortex, revealing information carried by temporal signals that firing rates alone do not capture. Together, this work provides new evidence that temporal coordination supports visual perception and offers a framework for aligning ANN dynamics more closely with those of the brain.
