University of Arkansas biomedical engineering professor Kyle Quinn has received a $1.6 million grant to develop non-invasive, real-time “optical biopsies” of chronic skin wounds.

FAYETTEVILLE – Biomedical engineering professor Kyle Quinn has received a four-year, $1.6 million grant from the National Institutes of Health to develop non-invasive, real-time “optical biopsies” of chronic skin wounds.

The goal of Quinn and researchers in his lab is to provide digital histopathology images — microscopic views of tissue used to study the onset and progression of disease — and other quantitative information without the need for an invasive biopsy, tissue processing or staining with histology dyes.

CHRONIC SKIN WOUNDS

Chronic wounds are skin injuries that fail to progress through the normal healing process. There are many types, including pressure ulcers, diabetic foot ulcers, venous stasis ulcers and arterial insufficiency ulcers. These non-healing wounds can have different underlying causes but are often characterized by inflamed tissue, poor blood circulation, callus formation or infection. They affect more than 150 million people worldwide and account for approximately $50 billion in health care costs annually in the United States alone.

The initial clinical assessment of a chronic skin wound involves visual inspection, but more detailed characterization relies on histological analysis of wound tissue biopsies. While this approach is useful in clinics and research labs for understanding wound pathophysiology and developing new products to treat chronic skin wounds, it is inherently invasive, time-consuming and qualitative.

For several years, Quinn has been working on an alternative, quantitative imaging system that addresses some limitations of conventional histological analysis. Researchers in his lab use multiphoton microscopy to view tissue in three dimensions at the cellular level and generate 3D maps of wound metabolism. This imaging technique is non-invasive, which allows them to measure changes in cellular metabolism and skin organization over time within the same wounds.

DEEP LEARNING AND AI

Although their metabolic imaging technique can provide highly detailed assessments of cellular function, the analysis of their image data takes time. Researchers must manually map relevant image regions to specific layers of the skin or wound regions, which is a slow and tedious process. To speed it up, Quinn has partnered with Justin Zhan, professor of computer science and computer engineering. Zhan, a data science expert, is helping Quinn combine multiphoton microscopy and “deep learning,” an artificial intelligence-based approach to analysis.

Deep learning is a subset of machine learning, the use of computer algorithms to extract meaningful information from data. It relies on neural networks — models inspired by the organization of neurons in the human brain — built from multiple layers, which allows an algorithm to learn more complex tasks. The deep learning approach will enable Quinn to provide rapid quantitative analysis of chronic skin wounds.
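The core idea can be illustrated with a toy sketch. This is not the lab's actual pipeline — which applies deep convolutional networks to multiphoton microscopy images — but a minimal, self-contained example of the same principle: a small neural network with a hidden layer, trained by gradient descent (backpropagation) to label pixels as "wound" or "not wound" from two made-up features.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Toy "pixels": two made-up features each (stand-ins for, say, local
# intensity and texture), labeled 1 for wound region, 0 for intact skin.
pixels = [((0.9, 0.8), 1), ((0.8, 0.9), 1), ((0.1, 0.2), 0), ((0.2, 0.1), 0)]

# A network with one hidden layer of two units -- the stacked layers
# that let deep learning fit more complex decision rules.
w_hidden = [[0.3, 0.2], [0.2, 0.3]]
b_hidden = [0.0, 0.0]
w_out = [0.2, 0.2]
b_out = 0.0

def forward(x):
    h = [sigmoid(sum(w * xi for w, xi in zip(ws, x)) + b)
         for ws, b in zip(w_hidden, b_hidden)]
    y = sigmoid(sum(w * hj for w, hj in zip(w_out, h)) + b_out)
    return h, y

# Train with plain gradient descent on squared error.
lr = 1.0
for _ in range(2000):
    for x, t in pixels:
        h, y = forward(x)
        dy = (y - t) * y * (1 - y)                   # output-layer error signal
        for j in range(2):
            dh = dy * w_out[j] * h[j] * (1 - h[j])   # hidden-layer error signal
            w_out[j] -= lr * dy * h[j]
            for i in range(2):
                w_hidden[j][i] -= lr * dh * x[i]
            b_hidden[j] -= lr * dh
        b_out -= lr * dy

# The trained network now labels the toy pixels correctly.
predictions = [round(forward(x)[1]) for x, _ in pixels]
```

In practice the same training loop runs over millions of parameters and real annotated microscopy images, which is what makes a diverse multi-lab data set so valuable.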

“Through deep learning we can train a computer algorithm to delineate wound regions accurately and very quickly,” Quinn said. “This will greatly speed up our analysis and remove the subjectivity and bias that is inherent when you ask humans to assess images and identify features.”

Quinn and Jake Jones, Quinn’s former doctoral student, have published preliminary results on the use of deep learning to identify wound features in the Journal of Investigative Dermatology and Lasers in Surgery and Medicine. Jones, who earned his Ph.D. and now works as a product manager for an optics manufacturing company, performed most of the preliminary work.

In addition to Zhan, Quinn will collaborate with leaders in the wound healing field, including Aristidis Veves, research director of the Joslin-Beth Israel Deaconess Foot Center, and Marjana Tomic-Canic, professor of dermatology at the University of Miami. By combining wound image data from multiple labs, the team will have a more diverse data set with which to rigorously train neural networks that work broadly across different kinds of wounds.
