KEYWORDS: Photoacoustic spectroscopy, Near infrared, Semiconductors, Pulsed laser operation, Polymers, New and emerging technologies, Neurons, Nanoparticles, Modulation, In vivo imaging
Toward non-genetic, non-invasive neural modulation with submillimeter precision, we report the development and application of photoacoustic nanotransducers (PANs) for neural stimulation in cultured primary neurons and in the live brain. Our PANs, based on synthesized semiconducting polymer nanoparticles, efficiently generate localized ultrasound through the photoacoustic effect upon absorption of nanosecond pulsed light in the NIR-II window. We showed that PAN binding to the neuron membrane, through both non-specific interaction and specific targeting of mechanosensitive ion channels, successfully activates primary neurons in culture. We also demonstrated motor cortex activation in the live mouse brain and the subsequent motor responses evoked through PANs.
Miniature stimulated Raman scattering (SRS) imaging systems, such as an SRS handheld probe or endoscope, will enable label-free in vivo optical biopsy for human patient investigation. For such miniature systems, the challenge lies in the design and fabrication of an achromatic micro-objective with low optical aberrations. Recent advances in achromatic metalenses with diffraction-limited performance open an opportunity to tackle this challenge. Here, we demonstrate the first proof-of-concept of metalens-enabled SRS imaging. Metalenses hold great potential for developing endoscopic SRS and nonlinear imaging systems for future clinical applications.
Breast-conserving surgery is a well-accepted breast cancer treatment. However, it remains challenging for the surgeon to accurately localize the tumor during surgery. Moreover, the guidance provided by current methods is one-dimensional distance information, which is indirect and unintuitive, leading to a high re-excision rate and prolonged surgical time. To address these problems, we have developed a fiber-delivered optoacoustic guide (OG), which mimics the traditional localization guide wire and is preoperatively placed into the tumor mass, together with an augmented reality (AR) system that provides real-time visualization of the tumor location with sub-millimeter variance. Through a nano-composite light-diffusion sphere and a light-absorbing layer formed at the tip of an optical fiber, the OG creates an omnidirectional acoustic source inside the tumor mass under pulsed laser excitation. The generated optoacoustic signal has a high dynamic range (~58 dB) and spreads over a large apex angle of 320 degrees. An acoustic radar with three ultrasound transducers is then attached to the breast skin and triangulates the location of the OG tip. With an AR system sensing the location of the acoustic radar, the relative position of the OG tip inside the tumor with respect to the AR display is calculated and rendered. This provides direct visual feedback of the tumor location to surgeons, which will greatly ease surgical planning during the operation and shorten surgical time. A proof-of-concept experiment using a tablet and a stereo-vision camera is demonstrated, achieving a tracking variance of 0.25 mm.
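The triangulation step described above can be sketched as a standard trilateration problem: given three coplanar transducer positions and the distances to the OG tip (speed of sound times time of flight), subtracting the sphere equations linearizes the in-plane coordinates, and the depth follows from one residual sphere. The sketch below is a minimal illustration of that geometry, not the authors' implementation; the sensor layout and distances are hypothetical.

```python
import numpy as np

def trilaterate(sensors, dists):
    """Locate a point from its distances to three sensors on the z = 0 plane.

    sensors: (3, 3) array of transducer positions, all with z = 0
    dists:   (3,) array of measured distances to the source
    Returns the solution below the sensor plane (inside the tissue).
    """
    p1, p2, p3 = sensors
    d = np.asarray(dists, dtype=float)
    # Subtracting sphere equation 1 from equations 2 and 3 linearizes (x, y):
    #   2 (p_i - p_1) . (x, y) = d1^2 - d_i^2 + |p_i|^2 - |p_1|^2
    A = 2.0 * np.array([p2[:2] - p1[:2], p3[:2] - p1[:2]])
    b = np.array([
        d[0]**2 - d[1]**2 + p2[:2] @ p2[:2] - p1[:2] @ p1[:2],
        d[0]**2 - d[2]**2 + p3[:2] @ p3[:2] - p1[:2] @ p1[:2],
    ])
    xy = np.linalg.solve(A, b)
    # Recover depth from sphere 1; take the root below the skin surface.
    z = -np.sqrt(max(d[0]**2 - np.sum((xy - p1[:2])**2), 0.0))
    return np.array([xy[0], xy[1], z])

# Hypothetical example: transducers 6 cm apart, OG tip 4 cm deep (units: cm)
sensors = np.array([[0.0, 0.0, 0.0], [6.0, 0.0, 0.0], [0.0, 6.0, 0.0]])
true_tip = np.array([2.0, 3.0, -4.0])
dists = np.linalg.norm(sensors - true_tip, axis=1)
print(trilaterate(sensors, dists))  # ~ [ 2.  3. -4.]
```

In practice the measured times of flight are noisy, so a least-squares or Kalman-filtered variant would be used; the closed-form solution above conveys the basic geometry.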