It can feel like electronic records have always been part of healthcare. In a computer-saturated society, electronic health records (EHRs) are simultaneously unremarkable and a popular subject of complaint. Digitization promised quality and efficiency, but doctors are spending more time documenting than diagnosing. Add in high rates of burnout and new challenges for patient-provider interactions, and you get a medical profession in the throes of transformation.

But electronic health records haven’t always seemed inevitable. In fact, EHRs were uncommon even 10 years ago. To offer some perspective, here’s a brief history of electronic health records, synthesized primarily from a 2016 research study published in the Yearbook of Medical Informatics.

Laying the groundwork

Doctors have been documenting patient health since antiquity: ancient Egyptian hieroglyphs dating to roughly 3000-1600 BC depict the practice of keeping medical records. However, paper medical records weren’t used consistently until the early 1900s. In the past 100 years, terminology has shifted from “medical record” to “health record,” suggesting that a patient’s chart should also include health and lifestyle information.

Medical record keeping became commonplace during the 20th century, but it was new computer technology developed in the 1960s and ’70s that laid the foundation for electronic health records. In the ’70s and ’80s, a number of academic medical facilities started using EHRs. These systems mostly existed on large mainframe computers, and their primary purpose was to facilitate research and improve medical care.

EHRs become possible

By the early ’90s, digital record keeping began to spread. Computer hardware became more affordable, powerful and compact. Local area networks and the Internet provided faster and easier information access, kicking off the first web-based EHRs. As the limitations of paper medical records became increasingly apparent, the Institute of Medicine started advocating for EHR adoption.

However, widespread use of EHRs was delayed by high costs, data entry errors, a lackluster response from physicians, and a lack of real incentives. For many facilities, the high initial cost of digitizing medical records appeared to outweigh any gains in efficiency, so EHR adoption was gradual and sporadic. For the time being, EHRs would complement, not replace, paper medical records.

Early issues persist

Despite slow adoption, the 1990s defined the contours of fundamental, ongoing challenges and applications for EHRs. For example, it quickly became clear that medical information in EHRs could be used for clinical decision support (CDS), which generated a new domain of medical informatics. Early CDS functions lived primarily in academic EHRs, and they included simple checks like drug-allergy warnings.
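To make that concrete, here is a minimal sketch in Python of the kind of drug-allergy check an early CDS function might have performed. The drug names and the allergy-to-drug mapping are hypothetical examples, not a clinical reference.

# Hypothetical mapping from a documented allergy to drugs that should
# trigger a warning (illustrative only, not clinical guidance).
ALLERGY_TO_DRUGS = {
    "penicillin": {"penicillin", "amoxicillin", "ampicillin"},
    "sulfa": {"sulfamethoxazole"},
}

def drug_allergy_warnings(prescribed_drug, patient_allergies):
    """Return warnings if the prescribed drug conflicts with any
    allergy documented on the patient's chart."""
    drug = prescribed_drug.lower()
    return [
        f"WARNING: {prescribed_drug} conflicts with documented {allergy} allergy"
        for allergy in patient_allergies
        if drug in ALLERGY_TO_DRUGS.get(allergy.lower(), set())
    ]

# Example: a chart listing a penicillin allergy flags an amoxicillin order.
print(drug_allergy_warnings("Amoxicillin", ["Penicillin"]))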

Initial use of third-party applications within EHRs also pointed to a need for standards. Health Level Seven (HL7) was a leading early standard that remains relevant today. Standards provided a common set of data element definitions, which allowed different EHR systems to interface.
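As a rough illustration of what a shared standard buys, here is a Python sketch that splits one pipe-delimited, HL7 v2-style PID (patient identification) segment into named fields. The sample data is invented and the field subset is simplified; production systems use full HL7 parsing libraries.

# An invented, HL7 v2-style PID segment: fields separated by "|",
# components within a field separated by "^".
SAMPLE_PID = "PID|1||123456^^^GENHOSP^MR||Doe^Jane||19800101|F"

def parse_pid(segment):
    """Label a simplified subset of PID fields by their standard positions."""
    fields = segment.split("|")
    if fields[0] != "PID":
        raise ValueError("not a PID segment")
    return {
        "patient_id": fields[3],     # PID-3: patient identifier list
        "name": fields[5],           # PID-5: patient name (family^given)
        "date_of_birth": fields[7],  # PID-7: date/time of birth
        "sex": fields[8],            # PID-8: administrative sex
    }

# Because both sender and receiver agree on these positions, a record
# produced by one EHR can be read by another.
print(parse_pid(SAMPLE_PID))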

By 2004, 13 percent of US healthcare facilities had fully implemented an EHR system. If EHRs hadn’t exactly taken off since 1990, it wasn’t for a shortage of technological solutions. Rather, the new technology faced serious procedural, professional, political and, especially, ethical issues. Patients, providers, and healthcare institutions recognized the need for policies and standards that would enable data security and interoperability.

Successful incentives

In 2009, EHRs got the boost they were waiting for with the Health Information Technology for Economic and Clinical Health (HITECH) Act. The HITECH Act motivated widespread digitization of healthcare by incentivizing the meaningful use of EHRs and related technology.

These graduated incentive payments worked: as of 2018, over 98 percent of hospitals used EHRs. While adoption by individual physicians has been slower (closer to 70 percent), it is now common for providers to share health records electronically.

Widespread EHR adoption has been accompanied by a growing industry and rapid innovation. As of 2017, there were over 186 vendors supplying EHR technology to US hospitals. This quickly evolving industry is still working through key challenges like interoperability and security, but the inevitable era of EHRs has arrived.
