PySilSub: An open-source Python toolbox for implementing the method of silent substitution in vision and nonvisual photoreception research

Joel Thomas Martin*, Geoffrey M Boynton, Daniel Hart Baker, Alex Wade, Manuel Spitschan

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

The normal human retina contains several classes of photosensitive cell—rods for low-light vision, three cone classes for daylight vision, and intrinsically photosensitive retinal ganglion cells (ipRGCs) expressing melanopsin for non-image-forming functions, including pupil control, melatonin suppression, and circadian photoentrainment. The spectral sensitivities of the photoreceptors overlap significantly, which means that most lights will stimulate all photoreceptors to varying degrees. The method of silent substitution is a powerful tool for stimulating individual photoreceptor classes selectively and has found much use in research and clinical settings. The main hardware requirement for silent substitution is a spectrally calibrated light stimulation system with at least as many primaries as there are photoreceptors under consideration. Device settings that will produce lights to selectively stimulate the photoreceptor(s) of interest can be found using a variety of analytic and algorithmic approaches. Here we present PySilSub (https://github.com/PySilentSubstitution/pysilsub), a novel Python package for silent substitution featuring flexible support for individual colorimetric observer models (including human and mouse observers), multiprimary stimulation devices, and solving silent substitution problems with linear algebra and constrained numerical optimization. The toolbox is registered with the Python Package Index and includes example data sets from various multiprimary systems. We hope that PySilSub will facilitate the application of silent substitution in research and clinical settings.
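The linear-algebra route to silent substitution mentioned in the abstract can be sketched briefly. The example below is illustrative only and does not use PySilSub's API: the spectral data, primary counts, and photoreceptor ordering are hypothetical placeholders, whereas a real application would use a calibrated stimulation device and a colorimetric observer model.

```python
"""Illustrative sketch of the linear-algebra approach to silent substitution.

NOT PySilSub's API; all names, shapes, and data here are hypothetical.
In practice, spectral data come from a calibrated multiprimary device and
photoreceptor sensitivities from an observer model.
"""
import numpy as np

# Hypothetical calibration: spectral power of 5 primaries sampled at
# 81 wavelengths (380-780 nm in 5 nm steps); rows = primaries.
wavelengths = np.arange(380, 785, 5)
rng = np.random.default_rng(0)
primary_spds = rng.random((5, wavelengths.size))   # placeholder SPDs

# Hypothetical spectral sensitivities for S, M, L cones, rods, melanopsin;
# rows = photoreceptor classes (real curves would come from standard
# observer functions, not random numbers).
sensitivities = rng.random((5, wavelengths.size))  # placeholder curves

# Excitation matrix A: A[i, j] = excitation of photoreceptor i by primary j.
A = sensitivities @ primary_spds.T                 # shape (5, 5)

# Desired change in photoreceptor excitation about a background setting:
# modulate melanopsin (index 4) while silencing the cones and rods.
target_contrast = np.array([0.0, 0.0, 0.0, 0.0, 1.0])

# Solve A @ dw = target for the change in primary weights dw. With as many
# primaries as photoreceptors A is square; with more primaries, or when the
# solution must stay within the device gamut, least squares or constrained
# numerical optimization would be used instead.
dw = np.linalg.solve(A, target_contrast)

background = np.full(5, 0.5)                       # mid-gamut background
modulation = background + dw
print("Change in primary weights:", np.round(dw, 3))
```

In the toolbox itself, these steps are handled by its device, observer, and problem abstractions described in the paper, which also address gamut constraints through constrained numerical optimization.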
Original language: English
Article number: 10
Number of pages: 16
Journal: Journal of Vision
Volume: 23
Issue number: 7
DOIs
Publication status: Published - 14 Jul 2023

Bibliographical note

© 2023 The Authors
