BiliScreen is a new smartphone app that is designed to screen for pancreatic cancer by having users snap a selfie. It’s shown here with a 3-D printed box that helps control lighting conditions to detect signs of jaundice in a person’s eye. (Dennis Wise/University of Washington)
Pancreatic cancer has one of the worst prognoses — with a five-year survival rate of 9 percent — in part because there are no telltale symptoms or non-invasive screening tools to catch a tumor before it spreads.
Now, University of Washington researchers are developing an app that could allow people to easily screen for pancreatic cancer and other diseases — by snapping a smartphone selfie.
BiliScreen uses a smartphone camera, computer vision algorithms and machine learning tools to detect increased bilirubin levels in a person’s sclera, or the white part of the eye. The app is described in a paper to be presented Sept. 13 at Ubicomp 2017, the Association for Computing Machinery’s International Joint Conference on Pervasive and Ubiquitous Computing.
One of the earliest symptoms of pancreatic cancer, as well as other diseases, is jaundice, a yellow discoloration of the skin and eyes caused by a buildup of bilirubin in the blood. The ability to detect signs of jaundice when bilirubin levels are minimally elevated — but before they’re visible to the naked eye — could enable an entirely new screening program for at-risk individuals.
In an initial clinical study of 70 people, the BiliScreen app — used in conjunction with a 3-D printed box that controls the eye’s exposure to light — correctly identified cases of concern 89.7 percent of the time, compared to the blood test currently used.
“The problem with pancreatic cancer is that by the time you’re symptomatic, it’s frequently too late,” said lead author Alex Mariakakis, a doctoral student at the Paul G. Allen School of Computer Science & Engineering. “The hope is that if people can do this simple test once a month — in the privacy of their own homes — some might catch the disease early enough to undergo treatment that could save their lives.”
BiliScreen builds on earlier work from the UW’s Ubiquitous Computing Lab, which previously developed BiliCam, a smartphone app that screens for newborn jaundice by taking a picture of a baby’s skin. A recent study in the journal Pediatrics showed BiliCam provided accurate estimates of bilirubin levels in 530 infants.
In collaboration with UW Medicine doctors, the UbiComp lab specializes in using cameras, microphones and other components of common consumer devices — such as smartphones and tablets — to screen for disease.
BiliScreen provides estimates of bilirubin levels in a person’s blood. Elevated levels can be an early warning sign for pancreatic cancer, hepatitis and other diseases. (Dennis Wise/University of Washington)
The blood test that doctors currently use to measure bilirubin levels — which is typically not administered to adults unless there is reason for concern — requires access to a health care professional and is inconvenient for frequent screening. BiliScreen is designed to be an easy-to-use, non-invasive tool that could help determine whether someone ought to consult a doctor for further testing. Beyond diagnosis, BiliScreen could also potentially ease the burden on patients with pancreatic cancer who require frequent bilirubin monitoring.
In adults, the whites of the eyes are more sensitive than skin to changes in bilirubin levels, which can be an early warning sign for pancreatic cancer, hepatitis or the generally harmless Gilbert’s syndrome. Changes in the sclera are also more consistent across races and ethnicities than changes in skin color.
Yet by the time people notice the yellowish discoloration in the sclera, bilirubin levels are already well past cause for concern. The UW team wondered if computer vision and machine learning tools could detect those color changes in the eye before humans can see them.
“The eyes are a really interesting gateway into the body — tears can tell you how much glucose you have, sclera can tell you how much bilirubin is in your blood,” said senior author Shwetak Patel, the Washington Research Foundation Entrepreneurship Endowed Professor in Computer Science & Engineering and Electrical Engineering. “Our question was: Could we capture some of these changes that might lead to earlier detection with a selfie?”
BiliScreen uses a smartphone’s built-in camera and flash to collect pictures of a person’s eye as the user snaps a selfie. The team developed a computer vision system that automatically isolates the white parts of the eye, the only region whose color is relevant to the bilirubin estimate. The app then calculates the color information from the sclera — based on the wavelengths of light that are being reflected and absorbed — and correlates it with bilirubin levels using machine learning algorithms.
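The pipeline described above (mask the sclera, summarize its color, map that color to a bilirubin estimate) can be sketched roughly as follows. This is a minimal illustration, not the BiliScreen implementation: the brightness and saturation thresholds, the chromaticity features and the linear model weights are all invented for the example, whereas the actual app uses trained machine learning models.

```python
import numpy as np

def isolate_sclera(rgb):
    """Crude sclera mask: keep bright, low-saturation pixels.
    rgb: H x W x 3 float array with values in [0, 1]."""
    brightness = rgb.mean(axis=2)
    saturation = rgb.max(axis=2) - rgb.min(axis=2)
    return (brightness > 0.5) & (saturation < 0.25)

def color_features(rgb, mask):
    """Mean chromaticity (normalized r and g) of the masked region."""
    pixels = rgb[mask]
    totals = pixels.sum(axis=1, keepdims=True) + 1e-9
    chroma = pixels / totals
    return chroma.mean(axis=0)[:2]  # (r, g); b = 1 - r - g is redundant

# Hypothetical pre-fit linear model mapping chromaticity to bilirubin (mg/dL).
WEIGHTS = np.array([40.0, -25.0])
BIAS = -5.0

def estimate_bilirubin(rgb):
    """Toy end-to-end estimate: a neutral white sclera maps to ~0,
    and yellower (higher-r) pixels push the estimate upward."""
    mask = isolate_sclera(rgb)
    feats = color_features(rgb, mask)
    return float(feats @ WEIGHTS + BIAS)
```

With these toy weights, a neutral white patch yields an estimate near zero, while a yellow-tinted patch yields a higher value, mirroring the idea that yellower scleras indicate more bilirubin.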
The UW team tested two different accessories for BiliScreen: a 3-D printed box to control lighting conditions and glasses that help the app calibrate colors. The goal is to remove the need for additional accessories, potentially by mining data from facial pictures. (Dennis Wise/University of Washington)
To account for different lighting conditions, the team tested BiliScreen with two different accessories: paper glasses printed with colored squares to help calibrate color and a 3-D printed box that blocks out ambient lighting. Using the app with the box accessory — reminiscent of a Google Cardboard headset — led to slightly better results.
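The calibration idea behind the paper glasses can be sketched in the same spirit: given the observed colors of printed squares with known reference values, fit a correction matrix and apply it to the image before extracting scleral color. The reference palette and the least-squares formulation here are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

# Hypothetical reference colors of the printed calibration squares (RGB in [0, 1]).
REFERENCE = np.array([
    [1.0, 1.0, 1.0],   # white
    [0.5, 0.5, 0.5],   # gray
    [1.0, 0.0, 0.0],   # red
    [0.0, 1.0, 0.0],   # green
    [0.0, 0.0, 1.0],   # blue
])

def fit_color_correction(observed):
    """Least-squares 3x3 matrix M such that observed @ M ~ REFERENCE.
    observed: N x 3 colors of the squares as seen under ambient light."""
    M, *_ = np.linalg.lstsq(observed, REFERENCE, rcond=None)
    return M

def correct(pixels, M):
    """Apply the fitted correction and clamp back to valid color range."""
    return np.clip(pixels @ M, 0.0, 1.0)
```

For example, if the ambient light imposes a per-channel color cast, the fitted matrix recovers the reference colors exactly, and the same matrix can then be applied to the sclera pixels.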
Next steps for the research team include testing the app on a wider range of people at risk for jaundice and underlying conditions, as well as continuing to make usability improvements — including removing the need for accessories like the box and glasses.
“This relatively small initial study shows the technology has promise,” said co-author Dr. Jim Taylor, a professor in the UW Medicine Department of Pediatrics whose father died of pancreatic cancer at age 70.
“Pancreatic cancer is a terrible disease with no effective screening right now,” Taylor said. “Our goal is to have more people who are unfortunate enough to get pancreatic cancer to be fortunate enough to catch it in time to have surgery that gives them a better chance of survival.”
Co-authors include Allen School undergraduate student Megan A. Banks, research study coordinator Lauren Phillipi and assistant professor of medicine Lei Yu.
The research was funded by the National Science Foundation, the Coulter Foundation and endowment funds from the Washington Research Foundation.
In 2015, scientists from Rice University revealed they had created light-driven nanosubmarines. These tiny molecular machines are activated by ultraviolet light and build on earlier work by Bernard Feringa, whose ground-breaking research won the Nobel Prize in Chemistry in 2016. These single-molecule machines have now been shown to be able to target, and drill into, specific cancer cells, paving the way for a variety of highly targeted future nanomedicine treatments.
These molecular machines consist of 244 atoms with a tail-like propeller that creates propulsion when exposed to UV light. After proving the concept worked back in 2015, the team moved on to exploring whether the molecular motor could penetrate an individual cell.
"We thought it might be possible to attach these nanomachines to the cell membrane and then turn them on to see what happened," explains chemist James Tour.
First the team needed to attach the molecular motor to a component that allowed it to target a specific cell. In these early experiments, the team used a peptide that drove the molecule to attach itself to the membrane of human prostate cancer cells. The molecules were shown to effectively locate and attach to the targeted cells, but they did not drill into them until specifically triggered by UV light. Once triggered, the motors spun at two to three million rotations per second, breaking through the cell membrane and killing the cell within one to three minutes.
The obvious challenge that needs to be overcome is to develop an activation trigger other than ultraviolet light, which currently limits the molecular motors to being controllable when concentrated at the surface of tissue. Other triggers are currently being investigated, with near-infrared (IR) light looking like the best option to control these motors when delivered deep into a body.
"In this process, the motor will absorb two photons simultaneously and get enough energy to start the rotor," says Gufeng Wang, a chemist on the Rice University team. "Since near IR light has deep penetration depth, we are no longer limited to the surface of the tissue."
There is much work that still needs to be done before these molecular motors become a real, clinical treatment, but there are a variety of exciting outcomes this technology promises. As well as targeting and destroying cancer cells, the molecular motors could be utilized to deliver drugs directly into diseased cells.
As well as working on additional activation mechanisms, the team is embarking on a series of small animal tests to examine the effectiveness of the molecules on living organisms.
"The researchers are already proceeding with experiments in microorganisms and small fish to explore the efficacy in-vivo," says Tour. "The hope is to move this swiftly to rodents to test the efficacy of nanomachines for a wide range of medicinal therapies."
The research was published in the journal Nature.