Image Credit: Microsoft Research YouTube channel (InnerEye software)
When we think of computer vision, facial recognition (the technology your iPhone uses to unlock itself) is what comes to mind, at least for me. Computer vision is “how computers see and understand digital images and videos,” and we often associate it with the tasks our phones are capable of: identifying faces, pets, and so on. Having such advanced technology right at our fingertips is amazing, but computer vision has applications far beyond a phone. Applications as far as hospitals, in fact.
In a tech world increasingly dominated by artificial intelligence, computer vision was bound to reach the medical world at some point. Now that it has, it’s being used to diagnose and treat patients.
MaxQ AI is a small company offering software that can identify abnormalities in a patient’s brain. The algorithm behind the software is trained on millions of brain scans uploaded by developers. Put to the test, it points out irregularities in a scan uploaded by a medical professional, and the patient can then be given effective treatment based on the software’s conclusions.
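The “train on labeled scans, then flag a new one” workflow can be sketched in miniature. This is not MaxQ AI’s actual algorithm (which is proprietary); it is a minimal, hypothetical nearest-centroid classifier on toy 8×8 “scans,” where abnormal examples carry a simulated bright lesion:

```python
import numpy as np

# Hypothetical illustration only: a nearest-centroid classifier standing in
# for the kind of model that could flag an abnormal scan. Not MaxQ AI's
# actual method -- just "train on labeled scans, then classify a new one."

def train_centroids(scans, labels):
    """Average the training scans for each label ('normal' / 'abnormal')."""
    centroids = {}
    for label in set(labels):
        group = [s for s, l in zip(scans, labels) if l == label]
        centroids[label] = np.mean(group, axis=0)
    return centroids

def classify(scan, centroids):
    """Assign the label whose centroid is closest to the scan."""
    return min(centroids, key=lambda l: np.linalg.norm(scan - centroids[l]))

# Toy 8x8 "scans": abnormal ones get a bright spot in the center.
rng = np.random.default_rng(0)
normal = [rng.normal(0.2, 0.05, (8, 8)) for _ in range(20)]
abnormal = []
for _ in range(20):
    s = rng.normal(0.2, 0.05, (8, 8))
    s[3:5, 3:5] += 0.8  # simulated lesion
    abnormal.append(s)

model = train_centroids(normal + abnormal, ["normal"] * 20 + ["abnormal"] * 20)

test_scan = rng.normal(0.2, 0.05, (8, 8))
test_scan[3:5, 3:5] += 0.8
print(classify(test_scan, model))  # → abnormal
```

Real systems use deep neural networks rather than centroids, but the shape of the pipeline (labeled training scans in, a flag on a new scan out) is the same.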
The software is still in the process of being approved by the US Food and Drug Administration, but MaxQ AI CEO Gene Saragnese hopes that existing partnerships with Samsung, General Electric Company, and the International Business Machines Corporation will allow the software to benefit up to seventy-five percent of the world’s hospitals.
Microsoft’s InnerEye is another computer vision application aimed at identifying irregularities. Given a three-dimensional scan, the software can calculate the dimensions of the organ or other body part displayed. Then, it can pinpoint tumors and abnormalities. Like MaxQ AI’s software, InnerEye requires lots of training data and can be extremely useful to medical professionals.
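Computing a structure’s dimensions from a 3D scan is easy to illustrate once the structure has been segmented. The sketch below is not InnerEye’s code; it assumes a hypothetical binary mask marking the structure’s voxels and a known physical voxel spacing, and reports the bounding-box extent in millimetres:

```python
import numpy as np

# Hedged sketch (not InnerEye's actual implementation): given a 3D binary
# mask of a segmented structure and the scanner's voxel spacing, compute
# the structure's bounding-box dimensions in millimetres.

def bounding_dimensions_mm(mask, spacing_mm):
    """Extent of the True region along each axis, in millimetres."""
    coords = np.argwhere(mask)  # voxel coordinates of the structure
    extent_voxels = coords.max(axis=0) - coords.min(axis=0) + 1
    return extent_voxels * np.asarray(spacing_mm)

# Toy volume: a 10 x 6 x 4-voxel block inside a 32^3 scan,
# with 1 x 1 x 2 mm voxels (coarser slices along the last axis).
volume = np.zeros((32, 32, 32), dtype=bool)
volume[5:15, 10:16, 20:24] = True

print(bounding_dimensions_mm(volume, (1.0, 1.0, 2.0)))  # → [10.  6.  8.]
```

The hard part in practice is producing that mask in the first place, which is exactly the segmentation problem the trained model solves.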
Triton, another computer vision-based software, is offered by Gauss and can help track blood loss during surgery (and it’s all run from an iPad!). With the software, a blood-soaked sponge can be analyzed to estimate the patient’s total blood loss and rate of blood loss. Trained on sample data, Triton draws these conclusions by estimating how much blood is concentrated in the sponge being held.
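The idea of reading blood volume off an image can be sketched crudely. Triton’s real method is proprietary and far more sophisticated; this toy version just measures per-pixel “redness” and scales it by a made-up calibration constant:

```python
import numpy as np

# Illustrative only: not Gauss's actual algorithm. Estimate blood on a
# sponge photo by summing per-pixel redness, scaled by an assumed
# (hypothetical) calibration factor in mL per unit of redness.

ML_PER_REDNESS_UNIT = 0.5  # hypothetical calibration constant

def estimate_blood_ml(image_rgb):
    """Sum per-pixel redness (red minus mean of green/blue), then scale."""
    r = image_rgb[..., 0].astype(float)
    gb = image_rgb[..., 1:].astype(float).mean(axis=-1)
    redness = np.clip(r - gb, 0, None) / 255.0
    return redness.sum() * ML_PER_REDNESS_UNIT

# Toy 4x4 "sponge photo": top half saturated red, bottom half white.
sponge = np.full((4, 4, 3), 255, dtype=np.uint8)
sponge[:2, :, 1:] = 0  # zero green/blue in the top half -> pure red

print(round(estimate_blood_ml(sponge), 2))  # → 4.0
```

A production system would also need to correct for lighting, sponge type, and dilution, which is where the training on sample data comes in.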
In a study of C-section patients, Triton identified more hemorrhages than surgeons could detect with the naked eye, allowing treatment to be adjusted. Additionally, patients monitored with Triton had generally shorter hospital stays.
Full circle, back to facial recognition! AiCure is a startup whose software monitors a patient’s ingestion of medication. The patient takes their prescribed medication in front of a phone camera, and facial recognition technology verifies the process. In the context of clinical trials, AiCure’s software can help researchers track the number of participants who drop out.
Computer vision has a home in the hospital. Applying advanced artificial intelligence to the medical field allows targeted treatment, faster diagnosis, and more. The eyes of a trained algorithm can aid in creating a more personalized medical plan, allowing for more effective and efficient treatment.
Sources
DeepAI (https://deepai.org/machine-learning-glossary-and-terms/computer-vision)
Emerj (https://emerj.com/ai-sector-overviews/computer-vision-healthcare-current-applications/)