Here we present a protocol for collecting large-volume, four-color, single-molecule localization imaging data from neural tissue. We have applied this technique to map the locations and identities of chemical synapses across whole cells in mouse retinae. Our sample preparation approach improves 3D STORM image quality by reducing tissue scattering, photobleaching, and optical distortions associated with deep imaging. This approach can be extended to other tissue types, enabling life scientists to perform volumetric super-resolution imaging in diverse biological models.
For a detailed application of this protocol, please refer to Sigal et al., 2015.

Figures 1–10 (images not reproduced here; no captions available)
Supplementary files associated with this preprint:
Volumetric super-resolution imaging by serial ultrasectioning and STochastic Optical Reconstruction Microscopy (STORM) Protocol
Reagent Recipe Table
Posted 04 Aug, 2021
© Research Square 2022