Harmony becomes cacophony when healthy cells become cancerous

John Stuart Reid, technical director of the CymaScope laboratory, and Professor Ji, of Rutgers University, have completed a study toward creating an AI-based system to improve cancer surgery that could also lead to a new method of early cancer detection. The study has been published in the Water Journal.

The discovery that cells create sound was made by Professor James Gimzewski of UCLA in 2002. Using an Atomic Force Microscope, he and his colleague, Dr. Andrew Pelling, were able to listen to the sounds of cells for the first time. Surprisingly, they found that the sounds lie in the audible range; in other words, if our ears were sensitive enough we would be able to hear the sounds of our own cells. (Perhaps it is fortunate that we cannot!) Professor Gimzewski named their new approach to cell biology “sonocytology,” combining “sono” (sound) with “cytology” (the study of cells). But Atomic Force Microscopy is technically challenging, requiring an acoustically isolated room and many other demanding precautions, rendering it less attractive than other, more recently explored methods of listening to cell sounds. In the new study the sounds from cells were derived by Raman Spectroscopy, in which a laser probe strikes not one cell but typically thousands, the light being modulated by the movements of myriad cell membranes. Because many cells influence the laser beam, the Raman system provides an accessible method of cell sound detection. As the laser light reflects from the tissue sample it carries with it tiny fluctuations that are collected by an electronic detector and simultaneously made audible by a computer, rendering the cell sounds usable as a diagnostic tool.
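The paper does not publish its signal chain, but the general idea of turning detector fluctuations into audible sound can be sketched in a few lines. The following is an illustrative example only, assuming a hypothetical `samples` list of detector intensity readings: the values are normalized to 16-bit range and written as a mono WAV file using Python’s standard library.

```python
import math
import struct
import wave

def fluctuations_to_wav(samples, path, sample_rate=44100):
    """Normalize a sequence of detector intensity fluctuations to
    16-bit PCM and write them as a mono WAV file (illustrative only)."""
    peak = max(abs(s) for s in samples) or 1.0  # avoid division by zero
    pcm = [int(32767 * s / peak) for s in samples]
    with wave.open(path, "wb") as wav:
        wav.setnchannels(1)          # mono
        wav.setsampwidth(2)          # 16-bit samples
        wav.setframerate(sample_rate)
        wav.writeframes(struct.pack("<%dh" % len(pcm), *pcm))

# A synthetic 1 kHz tone stands in for real detector data here.
samples = [math.sin(2 * math.pi * 1000 * n / 44100) for n in range(4410)]
fluctuations_to_wav(samples, "cell_sound.wav")
```

In a real system the samples would stream continuously from the detector rather than being written to a file, but the normalization-and-playback step is conceptually the same.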

In the paper published in the Water Journal, titled “Imaging Cancer and Healthy Cell Sounds in Water by CymaScope, followed by Quantitative Analysis by Planck-Shannon Classifier,” first steps are discussed in creating a real-time system for surgeons, based on visual data provided by a CymaScope instrument, in which the sounds of healthy and cancer cells are imprinted onto medical-grade water, rather like a fingerprint on glass, leaving a visual signature of the cell sounds. A typical cymascopic image of a healthy cell sound is symmetrical and beautiful, while that of a cancer cell is typically skewed and ugly by comparison. The visual imagery would be displayed to the surgeon via specially adapted eyewear, augmented by a digital number derived in real time via software calculations and appearing in the eyewear, thereby supporting the surgeon’s decision about where to make the incision. The system could also lead to an AI-based method of early cancer detection.
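The paper’s actual quantitative method is the Planck-Shannon classifier; as a much simpler, purely hypothetical illustration of how a single number might be derived from the symmetry of a cymascopic image, one could compare an image grid with its left-right mirror:

```python
def asymmetry_score(image):
    """Return a score in [0, 1] comparing an image (a list of rows of
    non-negative pixel intensities) with its left-right mirror:
    0 means perfectly symmetric, larger values mean more skew.
    A hypothetical stand-in, NOT the paper's Planck-Shannon classifier."""
    diff = total = 0.0
    for row in image:
        for a, b in zip(row, reversed(row)):
            diff += abs(a - b)
            total += abs(a) + abs(b)
    return diff / total if total else 0.0

symmetric = [[1, 2, 1], [3, 5, 3]]   # mirror-symmetric pattern
skewed    = [[9, 2, 0], [7, 5, 1]]   # lopsided pattern
print(asymmetry_score(symmetric))    # 0.0
print(asymmetry_score(skewed))       # 0.625
```

A score like this could in principle be computed per video frame and shown as the running number in the surgeon’s eyewear, though the published work uses a far more sophisticated statistical fit.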

The use of sound in medical modalities is growing each year, for both therapeutic and diagnostic applications, and this drug-free approach to medicine is finding welcome support among many physicians and in hospitals worldwide. In keeping with the title of this news item, “Harmony becomes cacophony when healthy cells become cancerous,” sound has a great future in medicine, a voice that deserves to be shouted out for all to hear.

The full article can be read at: http://dx.doi.org/10.14294/WATER.2019.6