Chapter 1
History of the U.S. Healthcare System
LEARNING OBJECTIVES
The student will be able to:
Identify five milestones of medicine and medical education and their importance to health care.
Identify five milestones of the hospital system and their importance to health care.
Identify five milestones of public health and their importance to health care.
Identify five milestones of health insurance and their importance to health care.
Explain the difference between primary, secondary, and tertiary prevention.
Explain the concept of the iron triangle as it applies to health care.
DID YOU KNOW THAT?
When the practice of medicine first began, tradesmen such as barbers practiced medicine. They often used the same razor to cut hair and to perform surgery.
In 2010, the United States spent $2.6 trillion on health care, or 17.6% of the gross domestic product, the highest share in the world.
In 2011, U.S. Census data indicate there were 48.6 million uninsured U.S. citizens, which is a decrease from 50 million in 2010.
The Centers for Medicare and Medicaid Services (CMS) predicts annual healthcare costs will be $4.64 trillion by 2020, which represents nearly 20% of the U.S. gross domestic product.
The United States is one of only a few developed countries that does not have universal healthcare coverage.
In 2002, the Joint Commission issued hospital standards requiring hospitals to inform patients when their outcomes were not consistent with typical care results.
INTRODUCTION
It is important as a healthcare consumer to understand the history of the U.S. healthcare delivery system, how it operates today, who participates in the system, what legal and ethical issues arise as a result of the system, and what problems continue to plague the healthcare system. We are all consumers of health care. Yet, in many instances, we are ignorant of what we are actually purchasing. If we were going to spend $1,000 on an appliance or a flat screen television, many of us would research the product to determine if what we are purchasing is the best product for us. This same concept should be applied to purchasing healthcare services.
Increasing healthcare consumer awareness will protect you in both the personal and professional aspects of your life. You may decide to pursue a career in health care either as a provider or as an administrator. You may also decide to manage a business where you will have the responsibility of providing health care to your employees. And lastly, from a personal standpoint, you should have the knowledge from a consumer point of view so you can make informed decisions about what matters most—your health. The federal government agrees with this philosophy. Recently, the Centers for Medicare and Medicaid Services (CMS) used its claims data to publish the hospital costs of the 100 most common treatments nationwide. The purpose of this effort is to give consumers data on healthcare costs, because costs vary considerably across the United States. This effort may also encourage price competition for healthcare services (Godert, 2013).
As the U.S. population's life expectancy continues to increase and the population continues to "gray," the United States will be confronted with more chronic health issues, because more chronic health conditions develop as we age. The U.S. healthcare system is one of the most expensive systems in the world. According to 2010 statistics, the United States spent $2.6 trillion on healthcare expenditures, or 17.6% of its gross domestic product (CMS, 2013a). The gross domestic product (GDP) is the total value of the finished goods and services produced within a country in a year. In other words, healthcare spending amounts to nearly 18% of the value of everything produced within the borders of the United States in a year. Estimates indicate that healthcare spending will reach $4.6 trillion by 2020, which would represent nearly 20% of the gross domestic product. In 2011, there were 48.6 million uninsured U.S. citizens, a decrease from 50 million in 2010 (Kaiser Family Foundation [KFF], 2013). The Institute of Medicine's (IOM) 1999 report indicated that nearly 100,000 citizens die each year as a result of medical errors. Although there have been quality improvement initiatives in the healthcare industry, such as the Patient Safety and Quality Improvement Act of 2005, recent research indicates that medical error rates in hospitals remain high (Classen et al., 2011).
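To see how these shares are calculated, the share of GDP is simply total healthcare spending divided by total GDP. The GDP figures below are implied by the numbers cited above rather than reported separately in this chapter:

Implied 2010 GDP ≈ $2.6 trillion ÷ 0.176 ≈ $14.8 trillion, so healthcare spending of $2.6 trillion equals about 17.6% of GDP.
Implied 2020 GDP ≈ $4.64 trillion ÷ 0.20 ≈ $23 trillion, so projected healthcare spending of $4.64 trillion would equal roughly 20% of GDP.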
Employers are offering fewer healthcare benefits. In 2002, 72% of employers offered health insurance benefits; by 2010, that share had dropped to 67.5%. The decline has been concentrated among smaller businesses with only a small number of employees who need benefits (Kliff, 2012).
These rates are some of the highest in the world but, unlike most developed countries, the United States does not offer healthcare coverage as a right of citizenship. Most developed countries have a universal healthcare program, which means access for all citizens. Many of these systems are run by the federal government, have centralized health policy agencies, are financed through different forms of taxation, and payment for healthcare services is made by a single payer—the government (Shi & Singh, 2008). France and the United Kingdom have been discussed as possible models for the United States to follow to improve access to health care, but these programs have problems of their own and may not be the ultimate solution for the United States. Because the United States does not offer any type of universal healthcare coverage, many citizens who are not eligible for government-sponsored programs are expected to provide the service for themselves through the purchase of health insurance or the purchase of actual services. Many citizens cannot afford these options, and as a result they do not receive routine medical care. The passage of the Patient Protection and Affordable Care Act of 2010 (PPACA, or ACA) has attempted to increase access to affordable health care. One of the mandates of the Act is the establishment of state-run health insurance marketplaces, which give consumers the opportunity to search for affordable health insurance plans. There is also a mandate that individuals who do not have health insurance purchase it if they can afford it or pay a fine. Both of these mandates should decrease the number of uninsured in the United States. These programs will be closely evaluated to assess whether their goals are achieved.
CONSUMER PERSPECTIVE ON HEALTH CARE
Basic Concepts of Health
Prior to discussing this complex system, it is important to identify three major concepts of healthcare delivery: primary, secondary, and tertiary prevention. These concepts are vital to understanding the U.S. healthcare system because different components of the healthcare system focus on these different areas of health, which often results in a lack of coordination among the components.
Primary, Secondary, and Tertiary Prevention
According to the American Heritage Dictionary (2001), prevention is defined as "slowing down or stopping the course of an event." Primary prevention avoids the development of a disease. Health promotion activities such as health education are examples of primary prevention. Other examples include smoking cessation programs, immunization programs, and educational programs for pregnancy and employee safety. State health departments often develop targeted, large-scale education campaigns regarding a specific health issue in their area. Secondary prevention activities focus on early disease detection, which prevents progression of the disease. Screening programs, such as high blood pressure testing, are examples of secondary prevention activities, as are colonoscopies and mammograms. Many local health departments implement secondary prevention activities. Tertiary prevention reduces the impact of an already established disease by minimizing disease-related complications; it focuses on rehabilitation and monitoring of diseased individuals. A person with high blood pressure who takes medication to control it, and the physician who prescribes that medication, are both examples of tertiary prevention. Traditional medicine focuses on tertiary prevention, although more primary care providers are encouraging and educating their patients on healthy behaviors (Centers for Disease Control and Prevention [CDC], 2007).
We, as healthcare consumers, would like to receive primary prevention to prevent disease. We would like to participate in secondary prevention activities, such as screening for cholesterol or blood pressure, because they help us manage any health problems we may be experiencing and reduce the potential impact of a disease. And we would also like to visit our physicians for tertiary measures so that, if we do have a disease, it can be managed with a prescribed drug or some other type of treatment. From our perspective, these three areas of health should be better coordinated for the healthcare consumer so the United States will have a healthier population.
In order to understand the current healthcare delivery system and its issues, it is important to learn the history of the development of the U.S. healthcare system. Four major sectors of our healthcare system that have shaped our current system of operations will be discussed in this chapter: (1) the history of practicing medicine and the development of medical education, (2) the development of the hospital system, (3) the history of public health, and (4) the history of health insurance. Tables 1-1 to 1-4 list several important milestones by date and illustrate historic highlights of each system component. The lists are by no means exhaustive, but they provide an introduction to how each sector has evolved as part of the U.S. healthcare system.
MILESTONES OF MEDICINE AND MEDICAL EDUCATION
The early practice of medicine did not require a major course of study, training, board exams, and licensing, as is required today. During this period, anyone who had the inclination to set up a physician practice could do so; oftentimes, clergy were also medical providers, as well as tradesmen such as barbers. The red and white striped poles outside barber shops represented blood and bandages because the barbers were often also surgeons. They used the same blades to cut hair and to perform surgery (Starr, 1982). Because there were no restrictions, competition was very intense. In most cases, physicians did not possess any technical expertise; they relied mainly on common sense to make diagnoses (Stevens, 1971). During this period, there was no health insurance, so consumers decided when they would visit a physician and paid for their visits out of their own pockets. Often, physicians treated their patients in the patients’ homes. During the late 1800s, the medical profession became more cohesive as more technically advanced services were delivered to patients. The establishment of the American Medical Association (AMA) in 1847 as a professional membership organization for physicians was a driving force for the concept of private practice in medicine. The AMA was also responsible for standardizing medical education (AMA, 2013a; Goodman & Musgrave, 1992).
In the early history of medical education, physicians gradually established large numbers of medical schools because they were inexpensive to operate, increased their prestige, and enhanced their income. A medical school required only four or more physicians, a classroom, some discussion rooms, and the legal authority to confer degrees. Physicians received the students' tuition payments directly and operated the school from this influx of money. Many physicians affiliated with established colleges in order to confer degrees. Because there were no entry restrictions, as more students entered medical schools, the existing internship programs with individual physicians dissolved and the Doctor of Medicine (MD) degree became the standard (Vault Career Intelligence, 2013). Although there were major issues with the quality of education because of the lack of educational requirements, medical school education became the gold standard for practicing medicine (Sultz & Young, 2006). The publication of the Flexner Report in 1910, which evaluated medical schools in Canada and the United States, forced medical schools to develop standardized curricula and admission testing, both of which are still in place today.
TABLE 1-1 Milestones of Medicine and Medical Education 1700–2013
• 1700s: Training and apprenticeship under one physician was common until hospitals were founded in the mid-1700s. In 1765, the first medical school was established at the University of Pennsylvania.
• 1800s: Medical training was provided through internships with existing physicians, who often were poorly trained themselves. There were only four medical schools in the United States, and they graduated only a handful of students. There was no formal tuition and no mandatory testing.
• 1847: The AMA was established as a membership organization for physicians to protect the interests of its providers. It did not become powerful until the 1900s, when it organized its physician members into county and state medical societies. The AMA wanted to ensure it was protecting its members' financial well-being. It also began to focus on standardizing medical education.
• 1900s to 1930s: The medical profession was represented by general or family practitioners who operated in solo practices. A small percentage of physicians were women. Total expenditures for medical care were less than 4% of the gross domestic product.
• 1904: The AMA created the Council on Medical Education to establish standards for medical education.
• 1910: Formal medical education is attributed to Abraham Flexner, who wrote an evaluation of medical schools in the United States and Canada indicating that many schools were substandard. He recommended closing several schools, enacting admission requirements, and setting a standard curriculum. The Flexner Report led to standardized admissions testing for students, the Medical College Admission Test (MCAT), which is still used as part of the admissions process today.
• 1930s: The healthcare industry was dominated by male physicians and hospitals. The relationship between patient and physician was considered sacred, and payments for physician care were made personally by the patient.
• 1940s to 1960s: When group health insurance was offered, the relationship between patient and physician changed because of third-party payers (insurance). In the 1950s, federal grants supported medical school operations and teaching hospitals. In the 1960s, the Regional Medical Programs provided research grants and emphasized service innovation and provider networking.
• 2008: There is increased racial diversity among medical school graduates. Although whites continue to represent the largest number of medical school graduates, the number of white graduates continues to decline, and Asians represent the largest minority group among graduates. Women continue to enter the medical workforce in great numbers, but men still outnumber women physicians.
• 2011–2012: In 2011, the ACA established the Center for Medicare & Medicaid Innovation to examine new ways to deliver care to patients. In 2012, the ACA provided incentives for physicians to establish accountable care organizations.
• 2012–2013: The average annual cost of a public medical school for an in-state resident was $30,000; the annual cost of a private medical school was $50,000. Approximately 47% of medical students were female.
In 2008, there was increased racial diversity among medical school graduates. Although whites continue to represent the largest number of medical school graduates, their numbers are declining, and Asians represent the largest minority group among graduates. Women continue to enter the medical workforce in great numbers, but men still outnumber women physicians. In 2012–2013, the average annual cost of a public medical school for an in-state resident was $30,000; the annual cost of a private medical school was $50,000 (Association of American Medical Colleges [AAMC], 2013).
MILESTONES OF THE HOSPITAL SYSTEM
In the early 19th century, almshouses or poorhouses were established to serve the indigent, providing shelter while treating illness. Government-operated pesthouses segregated those who could spread disease. These institutions laid the framework for the hospital. Initially, wealthy people did not want to go to hospitals because the conditions were deplorable and the providers were not skilled, so hospitals, which were first built in urban areas, were used mainly by the poor. During this period, many hospitals were owned by the physicians who practiced in them (Rosen, 1983).
In the early 20th century, with the establishment of more standardized medical education, hospitals became more accepted across socioeconomic classes and became the symbol of medicine. With the establishment of the AMA, which protected the interests of providers, the profession became more prestigious. During the 1930s and 1940s, hospital ownership shifted from physician-owned to church-related and government-operated (Starr, 1982).
In 1973, the first Patient Bill of Rights was established to protect healthcare consumers in hospitals. In 1974, a federal law was passed that required all states to have Certificate of Need (CON) laws to ensure that the state approved any capital expenditures associated with the construction and expansion of hospitals and medical facilities. The Act was repealed in 1987, but as of 2011, 36 states still had some type of CON mechanism (National Conference of State Legislatures [NCSL], 2013). The concept of CON was important because it encouraged state planning to ensure that the medical system was based on need. In 1985, the Emergency Medical Treatment and Active Labor Act (EMTALA) was enacted to ensure that consumers were not refused treatment for an emergency. During this period, inpatient hospital care was the norm; by the 1980s, however, many hospitals were offering outpatient or ambulatory surgery, a trend that continues into the 21st century. The Balanced Budget Act of 1997 authorized outpatient Medicare reimbursement to support these cost-saving measures (CDC, 2001). Hospitalists, a specialty created in 1996, are providers who focus specifically on the care of patients while they are hospitalized. This new type of provider reflected the recognized need to provide quality hospital care (American Hospital Association [AHA], 2013; Sultz & Young, 2006). In 2002, the Joint Commission on the Accreditation of Healthcare Organizations (now The Joint Commission) issued standards to increase consumer awareness by requiring hospitals to inform patients if their results were not consistent with typical results (AHA, 2013).
Hospitals are the foundation of our healthcare system. As our health insurance system evolved, the first type of health insurance was hospital insurance. As society's health needs increased, different types of medical facilities expanded. The focus shifted toward ambulatory or outpatient services, both because consumers prefer them and because they are more cost effective. In 1980, the AHA estimated that 87% of hospitals offered outpatient surgery. Although hospitals are still an integral part of our healthcare delivery system, the method of their delivery has changed: more hospitals have recognized the trend toward outpatient services and have integrated those services into their delivery.
MILESTONES OF PUBLIC HEALTH
The development of public health is important to note because it occurred separately from the development of private medical practice. Physicians were worried that government health departments could regulate how they practiced medicine, which could limit their income. Public health specialists approached health from a collectivistic and preventive care viewpoint: protect as many citizens as possible from health issues and provide strategies to prevent health issues from occurring. Private practitioners held an individualistic viewpoint: citizens would most often pay for physician services from their health insurance or out of their own pockets, and physicians would provide guidance on how to cure their diseases, not prevent them. The two contrasting viewpoints still exist today, but there have been efforts to increase coordination and collaboration between traditional medicine and public health.
TABLE 1-2 Milestones of the Hospital and Healthcare Systems 1820–2013
• 1820s: Almshouses or poorhouses, the precursor of hospitals, were developed to serve the poor primarily. They provided food and shelter to the poor and consequently treated the ill. Pesthouses, operated by local governments, were used to quarantine people who had contagious diseases such as cholera. The first hospitals were built around urban areas in New York City, Philadelphia, and Boston and were used often as a refuge for the poor. Dispensaries or pharmacies were established to provide free care to those who could not afford to pay and to dispense drugs to ambulatory patients.
• 1850s: A hospital system was finally developed, but conditions were deplorable because providers were unskilled. Hospitals were owned primarily by the physicians who practiced in them.
• 1890s: Patients went to hospitals because they had no choice. Providers became more cohesive because they had to rely on each other for referrals and access to hospitals, which gave them more professional power.
• 1920s: The development of medical technology increased the quality of medical training and specialization and contributed to the economic development of the United States. The establishment of hospitals became the symbol of the institutionalization of health care. In 1929, President Coolidge signed the Narcotic Control Act, which provided funding for hospital construction for drug addicts.
• 1930s to 1940s: Hospitals that had once been physician-owned were now owned by church groups, larger facilities, and governments at all levels.
• 1970 to 1980: The first Patient Bill of Rights was introduced to protect healthcare consumer representation in hospital care. In 1974, the National Health Planning and Resources Development Act required states to have CON laws to qualify for federal funding.
• 1980 to 1990: According to the AHA, 87% of hospitals were offering ambulatory surgery. In 1985, EMTALA was enacted, requiring hospitals to provide screening and stabilizing treatment regardless of the consumer's ability to pay.
• 1990 to 2000s: As a result of the Balanced Budget Act cuts of 1997, the federal government authorized an outpatient Medicare reimbursement system.
• 1996: Hospitalists, clinicians who provide care once a patient is hospitalized, emerged as a new type of provider.
• 2002: The Joint Commission on the Accreditation of Healthcare Organizations (now The Joint Commission) issued standards to increase consumer awareness by requiring hospitals to inform patients if their results were not consistent with typical results.
• 2011: The 1974 federal law requiring all states to have certificate of need (CON) laws, which ensured state approval of capital expenditures associated with the construction and expansion of hospitals and medical facilities, had been repealed in 1987; however, as of 2011, 36 states still had some type of CON mechanism.
• 2013: The Centers for Medicare & Medicaid Services developed the Bundled Payments for Care Improvement initiative, under which acute care hospitals and other providers enter into payment arrangements that include financial and performance accountability for episodes of care for each patient.
During the 1700s into the 1800s, the concept of public health was born. In their reports, Edwin Chadwick, Dr. John Snow, and Lemuel Shattuck demonstrated a relationship between the environment and disease (Chadwick, 1842; Turnock, 1997). As a result of their work, public health law was enacted and, by the 1900s, public health departments were focused on the environment and its relationship to disease outbreaks.
TABLE 1-3 Milestones in Public Health 1700–2013
• 1700 to 1800: The United States was experiencing strong industrial growth. Long work hours in unsanitary conditions resulted in massive disease outbreaks. U.S. public health practices targeted reducing epidemics, or large patterns of disease in a population. Some of the first public health departments were established in urban areas as a result of these epidemics.
• 1800 to 1900: Three very important events occurred. In 1842, Britain's Edwin Chadwick produced the General Report on the Sanitary Condition of the Labouring Population of Great Britain, which is considered one of the most important documents of public health. This report stimulated a similar U.S. survey. In 1854, Britain's John Snow performed an analysis that determined contaminated water was the cause of the cholera epidemic in London, establishing a link between the environment and disease. In 1850, Lemuel Shattuck, drawing on Chadwick's report, developed a state public health law that became the foundation for public health activities.
• 1900 to 1950: In 1920, Charles Winslow defined public health as the science and art of preventing disease, prolonging life, and promoting physical health and efficiency through organized community efforts.
• During this period, most states had public health departments that focused on sanitary inspections, disease control, and health education. Throughout the years, public health functions included child immunization programs, health screenings in schools, community health services, substance abuse programs, and sexually transmitted disease control.
• In 1923, a vaccine for diphtheria and whooping cough was developed. In 1928, Alexander Fleming discovered penicillin. In 1933, the influenza virus was first identified in humans. In 1946, the National Mental Health Act (NMHA) provided funding for research, prevention, and treatment of mental illness.
• 1950 to 1980: In 1950, cigarette smoking was identified as a cause of lung cancer.
• In 1952, Dr. Jonas Salk developed the polio vaccine.
• The Poison Prevention Packaging Act of 1970 was enacted to prevent children from accidentally ingesting harmful substances; childproof caps were developed for use on all drugs. In 1980, the eradication of smallpox was announced.
• 1980 to 1990: The first recognized cases of AIDS occurred in the United States in the early 1980s.
• 1988: The Institute of Medicine report defined public health as organized community efforts to address the public interest in health by applying scientific and technical knowledge to promote health. The first Healthy People report (1987) was published, recommending a national prevention strategy.
• 1990 to 2000: In 1997, Oregon voters approved a referendum that allowed physicians to assist terminally ill, mentally competent patients to commit suicide. From 1998 to 2006, 292 patients exercised their rights under the law.
• 2000s: The second Healthy People Report was published in 2000. The terrorist attack on the United States on September 11, 2001, impacted and expanded the role of public health. The Public Health Security and Bioterrorism Preparedness and Response Act of 2002 provided grants to hospitals and public health organizations to prepare for bioterrorism as a result of September 11, 2001.
• 2010: The ACA was passed; its major goal is to improve the nation's overall level of public health. The third Healthy People report was published.
• 2013: The ACA provided funding to state Medicaid programs to increase preventive services at little or no cost.
Disease control and health education were also integral components of public health departments. In 1916, The Johns Hopkins University, one of the most prestigious universities in the world, established the first school of public health (Duke University Library, 2013). Winslow's definition of public health focuses on the prevention of disease, while the IOM defines public health as the organized community effort to protect the public by applying scientific knowledge (IOM, 1988; Winslow, 1920). These definitions are exemplified by the development of vaccines for whooping cough, polio, smallpox, and diphtheria, as well as the discovery of penicillin. All of these efforts focus on protecting the public from disease.
The three most important public health achievements are (1) the recognition by the Surgeon General that tobacco use is a health hazard; (2) the development of vaccines that have eradicated some diseases and controlled many childhood diseases; and (3) early detection programs for high blood pressure and heart attacks, along with smoking cessation programs, which have dramatically reduced the number of deaths in this country (Novick, Morrow, & Mays, 2008).
Assessment, policy development, and assurance, the core functions of public health, were developed based on the 1988 report The Future of Public Health, which indicated that public health activities protecting the community had eroded (IOM, 1988). There was poor collaboration between public health and private medicine, no strong mission statement, weak leadership, and politicized decision making. Assessment was recommended because it focuses on systematic, continuous data collection on health issues, which ensures that public health agencies remain vigilant in protecting the public (IOM, 1988; Turnock, 1997). Policy development should include planning at all health levels, not just the federal level, and federal agencies should support local health planning (IOM, 1988). Assurance focuses on evaluating the processes that have been put in place to ensure that programs are being implemented appropriately. Together, these core functions ensure that public health remains focused on the community, has effective programs in place, and evaluates those programs to confirm that they work (Turnock, 1997).
The Healthy People 2000 report, an initiative begun in 1987, was created to implement a new national prevention strategy with three goals: increase life expectancy, reduce health disparities, and increase access to preventive services. Three categories were identified: health promotion, health protection, and preventive services. Surveillance activities were also emphasized. Healthy People provided a vision to reduce preventable disability and death. Target objectives were set over the years to measure progress (CDC, 2013a).
The Healthy People 2010 report was released in 2000. The report had a health promotion and disease prevention focus, identifying preventable threats to public health and setting goals to reduce those threats. Nearly 500 objectives were developed across 28 focus areas, ranging from access to care, food safety, education, and environmental health to tobacco and substance abuse. An important component of Healthy People 2010 was the development of an infrastructure to ensure that public health services are provided; infrastructure includes skilled labor, information technology, organizations, and research. In 2010, Healthy People 2020 was released. It contains 1,200 objectives that focus on 42 topic areas. According to the Centers for Disease Control and Prevention (CDC), a smaller set of Healthy People 2020 objectives, called Leading Health Indicators (LHIs), has been selected to communicate high-priority health issues (CDC, 2013a). The goals of all of these reports are consistent with the definitions of public health in both Winslow's and the IOM's reports.
It is important to mention the impact that the September 11, 2001, terrorist attacks on the United States, the anthrax attacks, outbreaks of global diseases such as severe acute respiratory syndrome (SARS), and the natural disaster of Hurricane Katrina have had on the scope of public health responsibilities. As a result of these major events, public health has expanded its area of responsibility. The terms "bioterrorism" and "disaster preparedness" appear more frequently in public health literature and have become part of strategic planning. In response to September 11, 2001, the Public Health Security and Bioterrorism Preparedness and Response Act of 2002 provided grants to hospitals and public health organizations to prepare for bioterrorism (CDC, 2009).
Public health is challenged by its very success because the public now takes public health measures for granted: successful vaccines target almost all childhood diseases, tobacco use has decreased significantly, accident prevention has increased, workplaces are safer because of the Occupational Safety and Health Administration (OSHA), fluoride is added to the public water supply, and mortality from heart attacks has decreased (Turnock, 1997). When a major event occurs, such as anthrax poisoning or a SARS outbreak, people immediately assume that public health will automatically control these problems. The public may not realize how much effort, dedication, and research take place to protect them.
TABLE 1-4 Milestones of the U.S. Health Insurance System 1800–2014
• 1800 to 1900: Insurance was purchased by individuals much as one would purchase car insurance. In 1847, the Massachusetts Health Insurance Co. of Boston was the first insurer to issue "sickness insurance." In 1853, a French mutual aid society established a prepaid hospital care plan in San Francisco, California; this plan resembled the modern Health Maintenance Organization (HMO).
• 1900 to 1920: In 1913, the International Ladies Garment Workers began the first union-provided medical services. The National Convention of Insurance Commissioners drafted the first model for regulation of the health insurance industry.
• 1920s: The blueprint for health insurance was established in 1929 when J. F. Kimball began a hospital insurance plan for school teachers at the Baylor University Hospital in Texas. This initiative became the model for Blue Cross plans nationally. The Blue Cross plans were nonprofit and covered only hospital charges so as not to infringe on private physicians’ income.
• 1930s: There were discussions regarding the development of a national health insurance program. However, the AMA opposed the move (Raffel & Raffel, 1994). With the Depression and U.S. participation in World War II, the funding required for this type of program was not available. In 1935, President Roosevelt signed the Social Security Act (SSA), which created “old age insurance” to help those of retirement age. In 1936, Vassar College, in New York, was the first college to establish a medical insurance group policy for students.
• 1940s to 1950s: The War Labor Board froze wages, forcing employers to offer health insurance to attract potential employees. In 1947, the Blue Cross Commission was established to create a national doctors network. By 1950, 57% of the population had hospital insurance.
• 1965: President Johnson signed Medicare and Medicaid programs into law.
• 1970s to 1980s: In 1973, President Nixon signed the HMO Act, a predecessor of managed care. In 1982, Medicare proposed paying for hospice or end-of-life care, and diagnosis-related groups (DRGs) and prospective payment guidelines were developed to control insurance reimbursement costs. In 1985, the Consolidated Omnibus Budget Reconciliation Act (COBRA) required employers to offer partially subsidized health coverage to terminated employees.
• 1990 to 2000: President Clinton’s Health Security Act proposed a universal healthcare coverage plan, which was never passed. In 1993, the Family Medical Leave Act (FMLA) was enacted, which allowed employees up to 12 weeks of unpaid leave because of family illness. In 1996, the Health Insurance Portability and Accountability Act (HIPAA) was enacted, making it easier to carry health insurance when changing employment. It also increased the confidentiality of patient information. In 1997, the Balanced Budget Act (BBA) was enacted to control the growth of Medicare spending. It also established the State Children’s Health Insurance Program (SCHIP).
• 2000: The SCHIP, now known as the Children’s Health Insurance Program (CHIP), was implemented.
• 2000: The Medicare, Medicaid, and SCHIP Benefits Improvement and Protection Act provided some relief from the BBA by providing across-the-board program increases.
• 2003: The Medicare Prescription Drug, Improvement, and Modernization Act was passed, which created Medicare Part D, prescription plans for the elderly.
• 2006: Massachusetts mandated all residents have health insurance by 2009.
• 2009: President Obama signed the American Recovery and Reinvestment Act (ARRA), which protected health coverage for the unemployed by providing a 65% subsidy for COBRA coverage to make the premiums more affordable.
• 2010: The ACA was signed into law, making it illegal for insurance companies to rescind insurance on their sick beneficiaries. Consumers can also appeal coverage claim denials by the insurance companies. Insurance companies are unable to impose lifetime limits on essential benefits.
• 2013: As of October 1, individuals can buy qualified health benefit plans from the Health Insurance Marketplaces. If an employer does not offer insurance, effective 2015, consumers can purchase it from the federal Health Insurance Marketplace.
• 2014: The ACA requires all individuals to purchase health insurance if they can afford it.
MILESTONES OF THE HEALTH INSURANCE SYSTEM
There are two key concepts in group insurance: “risk is transferred from the individual to the group and the group shares the cost of any covered losses incurred by its member” (Buchbinder & Shanks, 2007). Like life insurance or homeowner’s insurance, health insurance was developed to provide protection should a covered individual experience an event that requires health care. In 1847, a Boston insurance company offered sickness insurance to consumers (Starr, 1982).
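The arithmetic of risk pooling can be shown with a simple hypothetical example (the figures below are illustrative only and do not come from the sources cited in this chapter). Suppose 1,000 group members each face a 1-in-100 chance of incurring a $50,000 hospital bill in a given year:

Expected losses for the group = 1,000 × 0.01 × $50,000 = $500,000
Expected cost per member = $500,000 ÷ 1,000 = $500

A premium somewhat above $500 per member (to cover administrative costs and reserves) therefore protects every member against a $50,000 loss that few individuals could absorb alone; the risk has been transferred to, and shared by, the group.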
During the 19th century, large employers such as coal mining and railroad companies offered medical services to their employees by providing company doctors; fees were deducted from workers' pay to cover the service. In 1913, the International Ladies Garment Workers provided the first union-sponsored health insurance, negotiated as part of its contract (Duke University Library, 2013). During this period, there were several proposals for a national health insurance program, but the efforts failed. The AMA was worried that any national health insurance would impact the financial security of its providers, and it persuaded the federal government to support private insurance efforts instead (Raffel & Raffel, 1994).
In 1929, a group hospital insurance plan was offered to teachers at a hospital in Texas. This plan became the foundation of the nonprofit Blue Cross plans. To placate the AMA, Blue Cross initially offered only hospital insurance, avoiding any infringement on physicians' incomes (Blue Cross Blue Shield Association [BCBS], 2007; Starr, 1982). In 1935, the Social Security Act was passed and was considered "old age" insurance. During this period, there was continued discussion of a national health insurance program, but with the impact of the Depression and World War II, there was no funding for such a program; the government felt that the Social Security Act was sufficient to protect consumers. These events were a catalyst for the development of a health insurance system that included private participation. Although a universal health coverage program was proposed during President Clinton's administration in the 1990s, it was never passed, and in 2009 there was a major public outcry at regional town hall meetings opposing any type of government universal healthcare coverage. In 2006, Massachusetts mandated health coverage for all of its residents, so universal health coverage may emerge as a state-level initiative (KFF, 2013).
By the 1950s, nearly 60% of the population had hospital insurance (AHA, 2007), and disability insurance was attached to Social Security. In the 1960s, President Johnson signed Medicare and Medicaid into law, protecting the elderly, the disabled, and the indigent. In the 1970s, President Nixon signed the HMO Act, which promoted health maintenance organizations (HMOs) focused on cost-effective health delivery. In the 1980s, diagnosis-related groups (DRGs) and prospective payment guidelines were established to provide guidelines for treatment; these DRGs were attached to appropriate insurance reimbursement categories for treatment. The Consolidated Omnibus Budget Reconciliation Act (COBRA) was passed to provide health insurance protection when an individual loses or changes jobs. In 1993, the Family Medical Leave Act (FMLA) was passed to protect an employee in the event of a family illness; an employee can receive up to 12 weeks of unpaid leave and maintain his or her health insurance coverage during this period. In 1994, the Uniformed Services Employment and Reemployment Rights Act (USERRA) entitled individuals who leave for military service to return to their jobs. In 1996, the Health Insurance Portability and Accountability Act (HIPAA) was passed to provide stricter confidentiality regarding individuals' health information. In 1997, the Balanced Budget Act (BBA) was passed, which required major reductions in Medicare spending and authorized Medicare reimbursement for outpatient services (CMS, 2013b).
At the start of the 21st century, cost, access, and quality continue to be issues for U.S. health care. Employers continue to play an integral role in health insurance coverage; in 2009, nearly 57% of the population was covered by employer insurance (AMA, 2013b). The largest public coverage program is Medicare, which covers 14% of the population. The State Children's Health Insurance Program (SCHIP), renamed the Children's Health Insurance Program (CHIP), was implemented to ensure that children who are not Medicaid eligible receive health care. The Medicare, Medicaid, and SCHIP Benefits Improvement and Protection Act provided some relief from the BBA of 1997 by restoring some funding to these consumer programs. In 2003, a consumer law, the Medicare Prescription Drug, Improvement, and Modernization Act, created a major overhaul of the Medicare system (CMS, 2013b). The Act created Medicare Part D, a prescription plan that became effective in 2006 and offers the elderly different prescription programs based on their needs. The program has been criticized because it is so complex that the elderly had a difficult time understanding which plan to select. It also has not been cost effective: the program's cost has been estimated at $550 billion, and its estimated 10-year cost is $1.2 trillion (Brownlee, 2007). In 2008, the National Defense Authorization Act expanded the FMLA to allow families of military service members to take a leave of absence when a spouse, parent, or child is called to active military service. The 2010 ACA requires individuals to purchase health insurance by 2014. Despite these efforts, health insurance coverage continues to be an issue for the United States.
CURRENT SYSTEM OPERATIONS
Government’s Participation in Health Care
The U.S. government plays an important role in healthcare delivery. The United States has three governmental levels participating in the healthcare system: federal, state, and local. The federal government provides a range of regulatory and funding mechanisms including Medicare and Medicaid, established in 1965 as federally funded programs to provide health access to the elderly (65 years or older) and the poor, respectively. Over the years, these programs have expanded to include the disabled. They also have developed programs for military personnel, veterans, and their dependents.
As a result of EMTALA, federal law ensures access to emergency services regardless of ability to pay (Regenstein, Mead, & Lara, 2007). The federal government determines a national healthcare budget, sets reimbursement rates, and formulates standards for providers serving eligible Medicare and Medicaid patients (Barton, 2003). The state level is responsible for regulatory and funding mechanisms but also provides healthcare programs as dictated by the federal government. The local or county level of government is responsible for implementing programs dictated by both the federal and state levels.
The U.S. healthcare system is not a true system because of its fragmentation and lack of centralized decision making (Shi & Singh, 2008). The United States has several federal health regulatory agencies, including the CDC for public health, the Food and Drug Administration (FDA) for pharmaceutical controls, and the Centers for Medicare & Medicaid Services (CMS) for the indigent, the disabled, and the elderly. There is also The Joint Commission, a private organization that oversees healthcare organizations, and the Agency for Healthcare Research and Quality (AHRQ), the primary federal source for research on the quality of health services delivery. The Center for Mental Health Services (CMHS), in partnership with state health departments, leads national efforts to assess mental health delivery services. Although the federal government is to be commended for the many agencies that focus on major healthcare issues, with multiple organizations there is often duplication of effort and miscommunication, which results in inefficiencies (KFF, 2013). However, there are several regulations in place that protect patient rights. One of the first pieces of legislation was the Sherman Antitrust Act of 1890 and its ensuing legislation, which ensures fair competition in the marketplace for patients by prohibiting monopolies (Niles, 2013). Regulations such as HIPAA protect patient information; COBRA gives workers and their families the right to continue healthcare coverage if they lose their jobs; the Newborns' and Mothers' Health Protection Act (NMHPA) of 1996 prevents health insurance companies from discharging a mother and child too early from the hospital; the Women's Health and Cancer Rights Act (WHCRA) of 1998 prevents discrimination against women who have cancer; the Mental Health Parity Act (MHPA) of 1996 and its 2008 amendment require health insurance companies to provide fair coverage for mental health conditions; the Genetic Information Nondiscrimination Act of 2008 prohibits U.S. insurance companies and employers from discriminating based on genetic test results; the Lilly Ledbetter Fair Pay Act of 2009 provides protection against unlawful employment practices related to compensation discrimination; and finally, the ACA of 2010 focuses on increasing access to health care, improving the quality of healthcare delivery, and increasing the number of individuals who have health insurance. All of these regulations are considered social regulations because they were enacted to protect the healthcare consumer.
Private Participation in Health Care
The private sector focuses on the financing and delivery aspects of the system. Healthcare costs are paid by a health insurance plan, either private or government, and by the enrollee of the plan. Approximately 34% of 2009 healthcare expenditures were paid by private health insurance, that is, insurance offered by a private insurance company such as Blue Cross; private out-of-pocket expenses or payments, the funds paid by the individual, accounted for 14%; and federal, state, and local governments paid 40%. Out-of-pocket payments are considered the individual's cost share of his or her healthcare costs. Approximately 57% of private healthcare financing is through employer health insurance, a type of voluntary health insurance set up by an individual's employer. The delivery of services is through legal entities such as hospitals, clinics, physicians, and other medical providers (National Center for Health Statistics [NCHS], 2011). These providers are an integral part of the medical care system and need to coordinate their care with the various layers of U.S. government. In order to ensure access to health care, communication is vital between the public and private components of healthcare delivery.
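As a purely illustrative calculation (applying the 2009 percentage shares to the $2.6 trillion total cited earlier in this chapter for 2010, since a 2009 total is not given here), these shares translate roughly as follows:

Private health insurance: 0.34 × $2.6 trillion ≈ $0.9 trillion
Out-of-pocket payments: 0.14 × $2.6 trillion ≈ $0.4 trillion
Federal, state, and local governments: 0.40 × $2.6 trillion ≈ $1.0 trillion

These three shares sum to 88% of spending; the remaining 12% or so comes from other payers and programs that are not broken out in the figures above.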
Figure 1-1 The Iron Triangle of Health Care
Source: Reproduced from Kissick, William, MD, DrPH, Medicine's Dilemmas, p. 3. New Haven, CT: Yale University Press, 1994. Reprinted by permission.
ASSESSING YOUR HEALTHCARE SYSTEM USING THE IRON TRIANGLE
Many healthcare systems are evaluated using the Iron Triangle of Health Care, a concept introduced in 1994 by Dr. William Kissick that focuses on the balance of three factors: quality, cost, and access to health care (see Figure 1-1) (Kissick, 1994). If one factor is emphasized, such as cost reduction, it may create inequities in quality and access because costs are being cut. Because lack of access is a problem in the United States, healthcare systems may focus on increasing access, which could increase costs. In order to assess the success of a healthcare delivery system, it is vital that consumers analyze the balance between cost, access, and quality. Are you receiving quality care from your provider? Do you have easy access to your healthcare system? Is it costly to receive health care? Although the Iron Triangle is used by many experts to analyze large healthcare delivery systems, as a healthcare consumer, you can also use it to evaluate your own healthcare delivery. An effective healthcare system should maintain a balance among the three components.
CONCLUSION
Despite high U.S. healthcare expenditures, U.S. disease rates remain higher than those of many other developed countries because the United States has an expensive system that is available only to those who can afford it (Regenstein, Mead, & Lara, 2007). Findings from the 11th annual MetLife survey indicate that healthcare costs are worrying employees and their employers: over 60% of employees are worried they will not be able to pay the out-of-pocket expenses not covered by insurance, and employers are increasing their employees' cost sharing for healthcare benefits because of cost increases (Business Wire, 2013). Because the United States does not have universal health coverage, there are more health disparities across the nation. Persons living in poverty are more likely to be in poor health and less likely to use the healthcare system than those with incomes above the poverty line. If the United States offered universal health coverage, per capita expenditures would be more evenly distributed and likely more effective. The major problem for the United States is that health insurance is a major determinant of access to health care. With nearly 49 million uninsured people in the United States who have limited access to routine health care, disease and mortality rates will not improve. Reflecting its fragmented development, the U.S. healthcare system is based on individualism and self-determination, focusing on the individual rather than on the collective needs of the population. In a 2013 report, the CDC indicated that U.S. infant mortality rates declined between 2005 and 2011 because of declines in certain geographic areas. Despite this positive result, the United States still ranks much lower than other developed countries because of continued high preterm birth rates. This is an important statistic because infant mortality is often used to compare the health status of nations worldwide; although our healthcare expenditures are very high, our infant mortality rate is higher than that of many other countries. Racial disparities in disease and death rates continue to be a concern (CDC, 2013b). Both private and public participants in the U.S. health delivery system need to increase their collaboration to reduce these disease rates. Leaders need to continue to assess our healthcare system using the Iron Triangle to ensure there is a balance among access, cost, and quality.