One of the biggest areas where technology is impacting the diagnosis and treatment of disease is the burgeoning market of wearable technology. Led by companies like Fitbit, Pebble, Apple, and Samsung, the wearable health and fitness-tracking scene has exploded as businesses rush to integrate their products more fully into people’s lives — making them all but indispensable for many. But wearables aren’t just vanity tech for the fitness-obsessed: for example, an Apple Watch was credited with saving the life of a high school football player in Massachusetts. When this young man started experiencing chest and back pain after practice, he used his Apple Watch to check his heart rate, which had reached 145 beats per minute. He immediately contacted his trainer, was rushed to the hospital, diagnosed with rhabdomyolysis, and treated just in time for a full recovery. Building on the success of the Apple Watch as a health accessory, at Apple’s Worldwide Developer Conference in June 2017 the company announced improvements to watchOS that will allow users to better manage diabetes with glucose and insulin delivery monitoring. Needless to say, other manufacturers in the wearables market are sure to follow suit. As wearable technology continues to be used to monitor and manage medical conditions, it’s conceivable that doctors in the near future will come to rely on a patient’s Apple Watch or Fitbit to provide valuable insight into their life and habits.
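The alerting behavior described above — a device flagging a reading that crosses a threshold — can be sketched in a few lines. This is a toy illustration, not a real wearable API; the `Reading` type, field names, and the 120 bpm threshold are all invented for the example.

```python
# Toy sketch of a wearable-style heart-rate alert. Not a real device
# API: the Reading type and the threshold value are illustrative only.
from dataclasses import dataclass

@dataclass
class Reading:
    minute: int   # minutes since monitoring began
    bpm: int      # heart rate in beats per minute

def flag_elevated(readings, threshold=120):
    """Return the readings whose heart rate exceeds the threshold."""
    return [r for r in readings if r.bpm > threshold]

readings = [Reading(0, 72), Reading(10, 98), Reading(20, 145)]
alerts = flag_elevated(readings)
# The 145 bpm reading, like the one in the story above, is flagged.
```

In a real product, of course, thresholds would be personalized and the flagged reading would trigger a notification rather than just land in a list.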
To no one’s surprise, another significant technological revolution in the medical community is the rise of big data. As more devices, including the wearables mentioned above, collect data, and as a growing number of studies are published online, the medical community can spot patterns and detect trends in ways that weren’t previously possible. Population science is just one field that has benefited from this explosion of available data. Rather than treating individuals in relative isolation, the vast amount of medical data available now helps doctors and scientists better understand entire population subcategories and geographical regions, see how diseases are spreading, and determine what steps need to be taken to stop them. But big data helps individual patients, too. Thanks to comprehensive electronic medical records collected over many years, doctors have much more complete information about any given patient’s history and past treatments.
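The population-level trend detection described above boils down to aggregating many individual records. Here is a minimal sketch of the idea, assuming anonymized case records with invented field names and data:

```python
# Hypothetical illustration: aggregate anonymized case records by
# region to spot where a disease is being reported most. The records,
# region names, and fields are invented for the example.
from collections import Counter

cases = [
    {"region": "North", "week": 1},
    {"region": "North", "week": 2},
    {"region": "North", "week": 2},
    {"region": "South", "week": 1},
]

# Count reported cases per region and pick the current hotspot.
counts = Counter(c["region"] for c in cases)
hotspot, n = counts.most_common(1)[0]
print(hotspot, n)  # prints "North 3"
```

Real population-science work layers statistics, demographics, and time-series models on top of this kind of aggregation, but the core move — pooling records that were once siloed — is the same.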
Another closely related technological breakthrough that’s already beginning to impact the medical community is the rise of cognitive computing, a subset of artificial intelligence that focuses on large-scale learning, reasoning, and understanding — similar to how the human mind functions. As technology continues to evolve and big data becomes ever more pervasive, healthcare professionals risk drowning in a sea of information. In the pursuit of their primary goal — taking care of patients — doctors may miss critical or even life-saving insights if they’re overwhelmed by the sheer amount of data available. This is where cognitive computing will play an increasingly pivotal role. One example is IBM’s Watson Health platform: already a leader in artificial intelligence, IBM is hoping to leverage Watson Health in the medical community to help doctors diagnose and treat diseases by searching and indexing vast stores of data that aren’t readily accessible to medical professionals. Writing for Fortune, Laura Lorenzetti poses one possible Watson Health scenario:
“Imagine if you had a rare, undiagnosed disease that’s stumped doctor after doctor. What if there were a single, secure database that could read your symptoms, then run through thousands of clinical studies, similar patient records, and medical textbooks to present a risk-matched list of potential diseases?
Just one year after its launch, IBM Watson Health is already starting to make this seemingly impossible task a reality, thanks to its powerful cognitive computing platform and a wide-reaching partnership strategy.”
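The “risk-matched list” in the quote above is, at its simplest, a ranking problem: score candidate diseases by how well they match a patient’s symptoms. Here is a deliberately naive sketch of that idea — the disease profiles and symptom sets are invented, and real systems like Watson Health use far richer evidence than set overlap:

```python
# Toy sketch of a risk-matched candidate list: rank diseases by how
# many of the patient's symptoms each (invented) profile matches.
# Data here is illustrative only, not medical knowledge.
profiles = {
    "rhabdomyolysis": {"muscle pain", "dark urine", "weakness"},
    "influenza": {"fever", "cough", "muscle pain"},
    "migraine": {"headache", "nausea"},
}

def rank_candidates(symptoms, profiles):
    """Return (disease, match_count) pairs, best match first."""
    scores = {d: len(symptoms & p) for d, p in profiles.items()}
    return sorted(scores.items(), key=lambda kv: -kv[1])

patient = {"muscle pain", "dark urine"}
ranking = rank_candidates(patient, profiles)
# rhabdomyolysis (2 matches) ranks above influenza (1) and migraine (0)
```

A production system would weight symptoms by specificity, draw on clinical studies and patient records, and attach calibrated risk scores, but the shape of the output — an ordered list of candidates for a doctor to review — is the same.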
Thanks to the rise of new technologies like wearables, big data, and cognitive computing, modern medicine is able to diagnose, treat, and monitor diseases in ways doctors of the past could hardly have imagined. In the coming years, these trends are set to continue, leading to even more exciting developments in the medical field.