When tissue is removed during surgery, it usually takes a human pathologist about 30 minutes to diagnose a brain tumor from the sample. A new artificial intelligence system can do the same in under 150 seconds, and slightly more accurately than its human counterparts.
Optical histology images show two different forms of brain tumor: diffuse astrocytoma (left) and meningioma (right)
In a new study published in the journal Nature Medicine, scientists describe a diagnostic technique that combines advanced optical imaging with the power of artificial intelligence. The system can deliver fast, accurate diagnoses of brain tumors in real time, while the patient is still on the operating table. In tests, the AI's diagnoses were slightly more accurate than a human pathologist's, and far faster. Encouragingly, the new system could be deployed in places that lack trained neuropathologists, and it may also help diagnose other types of cancer.
In cancer surgery, surgeons routinely extract potentially problematic tissue for laboratory analysis. These intraoperative biopsies support more accurate diagnoses and help the medical team plan the next steps, such as follow-up surgery to remove the tumor. About 1.1 million brain samples a year in the U.S. must be biopsied by trained neuropathologists, a process that is "time-, resource-, and labor-intensive," according to the new study.
In fact, these diagnoses involve more than a dozen steps, including transporting tissue from the operating room to the laboratory, temporarily freezing it, then thawing and dehydrating the sample, clearing it with xylene, and finally analyzing it under a microscope. Most importantly, performing all these steps requires a trained neuropathologist, and such specialists are in short supply today. "Given the 42 per cent vacancy rate for neuropathology researchers, further shortages are expected," the study notes.
To streamline the process, Daniel Orringer, a neurosurgeon at New York University, and colleagues developed a diagnostic technique called stimulated Raman histology (SRH), a new optical imaging method, and paired it with an AI deep neural network. SRH uses scattered laser light to illuminate features that are not normally visible with standard imaging techniques. During surgery, the images produced by SRH are evaluated by the AI algorithm in under 150 seconds, compared with the 20 to 30 minutes a human neuropathologist needs.
Even more impressively, the AI can detect biopsy features that are invisible to the naked eye. "As surgeons, we can only act on what we see; this technology allows us to see what would otherwise be invisible, improving the speed and accuracy of surgery and reducing the risk of misdiagnosis," says Orringer.
To build the deep neural network, the scientists trained the system on 2.5 million images from 415 patients. By the end of training, the AI could classify brain tissue into 13 common categories of brain tumor, including malignant glioma, lymphoma, metastatic tumor, diffuse astrocytoma, and meningioma.
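To make the classification step concrete, here is a minimal, hypothetical sketch of how the output layer of a trained image classifier turns one sample's raw class scores into one of 13 tumor-type labels: the network emits one score per class, and the predicted diagnosis is the class with the highest softmax probability. This is an illustration of the general technique, not the authors' actual network, and the label list is only loosely based on categories named in the article.

```python
import numpy as np

# Hypothetical list of 13 diagnostic classes, loosely following those
# mentioned in the article; the real study's label set differs in detail.
CLASSES = [
    "malignant glioma", "lymphoma", "metastatic tumor",
    "diffuse astrocytoma", "meningioma", "pilocytic astrocytoma",
    "ependymoma", "medulloblastoma", "pituitary adenoma",
    "schwannoma", "gliosis", "white matter", "gray matter",
]

def softmax(logits):
    """Convert raw class scores into probabilities that sum to 1."""
    z = logits - logits.max()      # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def classify(logits):
    """Return (label, confidence) for one image's 13 raw class scores."""
    probs = softmax(np.asarray(logits, dtype=float))
    i = int(probs.argmax())
    return CLASSES[i], float(probs[i])

# Fabricated example scores in which "meningioma" (index 4) dominates.
label, conf = classify([0.1, 0.3, 0.2, 0.5, 4.2, 0.0, 0.1,
                        0.2, 0.1, 0.3, 0.0, 0.1, 0.2])
print(label)   # -> meningioma
```

In practice the raw scores would come from a convolutional network's final layer after processing an SRH image; everything upstream of that layer is what the 2.5 million training images are used to fit.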
The researchers then ran a clinical trial with 278 patients with brain tumors or epilepsy at three different medical institutions to test the system's effectiveness. SRH images were evaluated by both human experts and the AI. The AI identified tumors with 94.6% accuracy, versus 93.9% for human neuropathologists. Interestingly, the mistakes humans made were different from those the AI made. That is actually good news, because it suggests the AI's errors can be characterized and corrected in the future to yield an even more accurate system.
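A rough back-of-the-envelope calculation shows why non-overlapping errors are encouraging. The 94.6% and 93.9% accuracy figures are from the study, but the independence assumption below is ours, added purely for illustration: if the AI and the pathologist tended to err on different cases, a workflow that escalates any case where either flags a problem would rarely have both wrong at once.

```python
# Accuracy figures reported in the trial.
ai_accuracy = 0.946
human_accuracy = 0.939

ai_error = 1 - ai_accuracy        # ~5.4% of cases
human_error = 1 - human_accuracy  # ~6.1% of cases

# ASSUMPTION (not from the study): treat the two error sets as
# statistically independent. Then the chance that BOTH misdiagnose
# the same case is the product of the two error rates.
both_wrong = ai_error * human_error
print(f"P(both wrong) = {both_wrong:.4%}")  # roughly 0.33% of cases
```

The study's actual observation was stronger than independence: the two kinds of readers made qualitatively different mistakes, which is why the authors see room to combine them.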
SRH could revolutionise the field of neuropathology, because it improves decision-making during surgery and can provide expert-level assessments in hospitals that lack neuropathologists. In addition, the study notes that since many histological features of brain tumors also appear in other forms of cancer, the system may eventually be used in other fields, including dermatology, gynecology, breast surgery, and head and neck surgery.
To be sure, artificial intelligence is gradually overtaking humans at tasks like this. Google, for example, has developed a system that outperforms human experts at diagnosing breast and lung cancer. AI that surpasses humans can make us nervous (which is understandable), but in the medical field, we should let it take a bigger step.
Google Artificial Intelligence to Detect Breast Cancer
Google researchers have successfully trained artificial intelligence to detect breast cancer, with even higher accuracy than doctors
As part of its push into the medical field, Google researchers have trained an artificial intelligence that detects breast cancer with even greater accuracy than doctors. In the Google-funded study, a team of independent researchers from different hospitals and universities, researchers from Google's health division, and engineers from DeepMind, Google's British AI subsidiary, analyzed nearly 29,000 mammograms from the UK and the US. The study found that the AI reduced false negatives by 9.7% in the US and 2.7% in the UK, and false positives by 5.7% and 1.2%, respectively. It achieved this while processing less information: in a controlled study of 500 mammograms randomly selected from the US dataset, the human readers were given each patient's age, breast cancer history, and previous mammograms, while the AI worked from the scans alone.
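For readers unfamiliar with the terminology, false negatives (missed cancers) and false positives (healthy patients incorrectly flagged) are defined from a screening test's confusion matrix. The small worked example below uses invented counts, not data from the Google/DeepMind study, just to show how the two rates are computed:

```python
# Hypothetical screening results for 1000 mammograms (invented counts,
# NOT from the Google/DeepMind study).
true_positives = 45    # cancer present, correctly flagged
false_negatives = 5    # cancer present, missed by the screen
true_negatives = 860   # no cancer, correctly cleared
false_positives = 90   # no cancer, flagged anyway

# False-negative rate: share of real cancers the screen missed.
fnr = false_negatives / (false_negatives + true_positives)
# False-positive rate: share of healthy cases incorrectly flagged.
fpr = false_positives / (false_positives + true_negatives)

print(f"false-negative rate = {fnr:.1%}")  # 10.0%
print(f"false-positive rate = {fpr:.1%}")  # 9.5%
```

A reduction in false negatives means fewer cancers slip through screening; a reduction in false positives means fewer patients are sent for unnecessary follow-up.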
Google provides only limited demographic information about the sample, but if the AI is applied to a more diverse real-world population, it may help uncover cancers hidden behind dense tissue. The paper also points out that the AI could reduce radiologists' workload in the UK, where, unlike in the US, every mammogram usually receives a second opinion.
There will no doubt be more such results in the next decade as Google expands into medical technology. Sundar Pichai, Google's chief executive, has said healthcare is one of the biggest applications for artificial intelligence and that "the benefits will continue to be felt over the next 10 to 20 years". The latest study builds on Google's existing deep-learning work on detecting the spread of breast cancer. Over the past few years, Google has also trained AI to detect diabetic retinopathy and heart disease, and to track the progression of multiple sclerosis. In 2014, Google tried to launch a smart contact lens with embedded microchips to detect blood sugar levels, but stopped work on the lens after finding that tears do not provide enough fluid to measure blood sugar reliably. Against this backdrop, Google has acquired Fitbit, including its subsidiary Fitbit Health Solutions, in an effort to pair Fitbit's data with health insurance offerings.
But Google has also drawn controversy over its mining of user data, even when it says the purpose is benevolent. In 2017, British authorities found that DeepMind had illegally obtained the health records of 1.6 million people for a study of kidney injury. And just a few months ago, it emerged that Google had collected the medical data of millions of Americans without their knowledge, in an effort called Project Nightingale (which Google maintains complies with the Health Insurance Portability and Accountability Act).
Etta Pisano, a professor of radiology at Beth Israel Lahey Health, doesn't think the medical community needs to rush. Pisano points to the failure of early computer-aided detection (CAD) technology, first introduced in the 1990s, which "showed great promise in experimental testing, but was not successful in real life": its diagnostic accuracy never matched that of humans. However, Pisano wrote that breast cancer screening "may be an ideal application for artificial intelligence in the field of medical imaging", because large datasets are available and the task is more binary than diagnoses that must weigh multiple factors.