March 5, 2021
Society of Photo-Optical Instrumentation Engineers

Effects of image capture and correction approaches on quantifying results of lateral flow assays with mobile phones

Wenbo Wang, Liming Hu, Matthew D. Keller

There have been numerous attempts to use mobile phone images to interpret results of lateral flow assays (LFAs). Many initial efforts created attachments to position test strips relative to the camera or added external light sources. To see widespread use, especially in low-resource settings, for aiding test interpretation or performing some level of quantification, a mobile phone LFA reader should not require materials beyond the test itself and should be phone-agnostic. To assess the feasibility of this approach, twelve CareStart malaria LFA cassettes were run using spiked whole blood. About 880 images of these cassettes were acquired using three brands of phones under various lighting conditions, imaging distances, and viewing angles. Test strip regions were converted to 1-dimensional (1D) intensity profiles along the direction of flow. Corrections for color accuracy, gamma, and white balance were implemented, and features such as peak height and area under the curve of the control and test lines were used in linear regression. A fully connected neural network and a 1D convolutional neural network were also trained directly on the 1D profiles of test strips without feature extraction. The best regression models achieved an R² of 0.77 and a prediction error of 102 ng/ml. A multi-class support vector machine provided 84% accuracy for a semi-quantitative approach classifying results as negative/weak, medium, or high positive. Across all analyses, corrections to color, white balance, etc. did not provide meaningful improvements, and limiting analysis to a single phone was not substantially better. Thus, there is promise for a device-agnostic mobile phone LFA reader.
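The abstract outlines the processing pipeline only at a high level. The sketch below, which is not the authors' code, illustrates one plausible version of the feature-extraction path: a simple gamma and gray-world white-balance correction, collapsing a cropped strip image to a 1D intensity profile along the flow direction, and computing peak-height and area-under-the-curve features for the control and test lines to feed a linear regression. The gamma value, peak windows, and cropping assumptions are illustrative, not taken from the paper.

```python
# Sketch (assumed, not the published method): 1D profile extraction and
# peak-based features for LFA quantification from a cropped strip image.
import numpy as np
from scipy.signal import find_peaks
from sklearn.linear_model import LinearRegression

def correct_image(rgb, gamma=2.2):
    """Undo an assumed display gamma, then apply gray-world white balance."""
    img = (rgb.astype(np.float64) / 255.0) ** gamma      # linearize (gamma assumed)
    means = img.reshape(-1, 3).mean(axis=0)
    img = img * (means.mean() / means)                    # gray-world channel scaling
    return np.clip(img, 0.0, 1.0)

def intensity_profile(strip_rgb):
    """Collapse a cropped strip (flow along image columns, assumed) to a 1D profile."""
    gray = strip_rgb.mean(axis=2)                         # simple luminance proxy
    profile = gray.mean(axis=0)                           # average across strip width
    return profile.max() - profile                        # lines are darker than background

def line_features(profile, window=15):
    """Peak height and local area under the curve for the two strongest peaks,
    assumed to be the control and test lines."""
    peaks, props = find_peaks(profile, prominence=0.01)
    if len(peaks) < 2:
        return np.zeros(4)
    top2 = peaks[np.argsort(props["prominences"])[-2:]]
    feats = []
    for p in sorted(top2):
        lo, hi = max(0, p - window), min(len(profile), p + window)
        feats += [profile[p], np.trapz(profile[lo:hi])]   # height and local AUC
    return np.array(feats)

# Hypothetical usage: `strips` is a list of cropped RGB strip images and
# `y` the spiked analyte concentrations (ng/ml).
# X = np.vstack([line_features(intensity_profile(correct_image(img))) for img in strips])
# model = LinearRegression().fit(X, y)
```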
