Very little is taught about the basic structure and history of the U.S. health system in the residency programs where thousands of foreign medical graduates (FMGs), also known as international medical graduates (IMGs), get their training.
So this primer is for those who hope to join a residency in the future, those currently in residency, and newly minted practicing IMGs.
US healthcare 101
America’s history of healthcare is a bit different than most first world nations. Our staunch belief in capitalism has prevented us from developing the kind of national healthcare the United Kingdom, France, and Canada have used for decades. As a result, we have our own system of sorts that has evolved drastically over the past century and is both loved and hated.
Whichever end of the spectrum you lean toward, there’s no doubt about it: the history of healthcare in America is a long and winding road. How we got to where we are in 2017 is quite a story, so let’s dive in.
The History of Healthcare: From the Late 1800’s to Now
The Late 1800’s
The earliest formalized records in America’s history of healthcare are dated toward the end of the 19th century. The industrial revolution brought steel mill jobs to many U.S. cities, but the dangerous nature of the work led to more and more workplace injuries.
As these manufacturing jobs became increasingly prevalent, their unions grew stronger. In order to shield their union members from catastrophic financial losses due to injury or illness, they began to offer various forms of sickness protection. At the time, there was very little organized structure and most decisions were made on a trial and error basis.
The 1900’s
With the turn of the century came a push for organized medicine, led in part by the American Medical Association (AMA), which was growing stronger and added some 62,000 physicians over the following decade. But because the working class wasn't supportive of the idea of compulsory healthcare, the U.S. didn't see the kind of groundswell that leading European nations would see soon after.
The 26th President of the United States, Theodore Roosevelt (1901-1909), believed health insurance was important because “no country could be strong whose people were sick and poor.” Even so, he didn’t lead the charge for stronger healthcare in America. Most of the initiative in the early 1900’s was led by organizations outside the government.
The 1910’s
One of the organizations heavily involved with advancing healthcare was the American Association for Labor Legislation (AALL), which drafted legislation targeting the working class and low-income citizens (including children). Under the proposed bill, qualified recipients would receive sick pay, maternity benefits, and a death benefit of $50.00 to cover funeral expenses. The cost of these benefits would be split between states, employers, and employees.
The AMA initially supported the bill, but some medical societies expressed objections, citing concerns over how doctors would be compensated. The fierce opposition caused the AMA to back down and ultimately pull support for the AALL bill. Union leaders, too, feared compulsory health insurance would weaken them, as a portion of their power came from being able to negotiate insurance benefits for union members.
As one might expect, the private insurance industry also opposed the AALL Bill because they feared it would undermine their business. If Americans received compulsory insurance through the government, they might not see the need to purchase additional insurance policies privately (especially life insurance), which could put them out of business — or at the very least, cut into their profits. In the end, the AALL bill couldn’t garner enough support to move forward and global events turned Americans' attention toward the war effort.
After the start of World War I, Congress passed the War Risk Insurance Act, which covered military servicemen in the event of death or injury. The Act was later amended to extend financial support to the servicemen’s dependents. The War Risk Insurance program essentially ended with the conclusion of the war in 1918, though benefits continued to be paid to survivors and their families.
The 1920’s
Post World War I, the cost of healthcare became a more pressing matter as hospitals and physicians began to charge more than the average citizen could afford. Seeing that this was becoming an issue, a group of teachers created a program through Baylor University Hospital where they agreed to pre-pay a monthly fee in exchange for future medical services (covering up to 21 days of hospital care per year). The resulting organization was not-for-profit and only covered hospital services. It was essentially the precursor to Blue Cross.
The 1930’s
When the Great Depression hit in the 30's, healthcare started to become a more heated debate. One might believe such conditions would create the perfect climate for compulsory, universal healthcare, but in reality, they did not. Rather, unemployment and "old age" benefits took precedence.
While “The Blues” (Blue Cross and Blue Shield) began to expand across the country, the 32nd President of the United States, Franklin Delano Roosevelt (1933-1945), knew healthcare would grow to be a substantial problem, so he got to work on a health insurance bill that included the “old age” benefits so desperately needed at the time.
However, the AMA once again fiercely opposed any plan for a national health system, causing FDR to drop the health insurance portion of the bill. The resulting Social Security Act of 1935 created a system of "old-age" benefits and allowed states to create provisions for people who were either unemployed or disabled (or both).
The 1940’s
As the U.S. entered World War II after the attack on Pearl Harbor, attention fell from the health insurance debate. Essentially all government focus was placed on the war effort, including the Stabilization Act of 1942, which was written to fight inflation by limiting wage increases.
Since U.S. businesses were prohibited from offering higher salaries, they began looking for other ways to recruit new employees as well as incentivizing existing ones to stay. Their solution was the foundation of employer-sponsored health insurance as we know it today. Employees enjoyed this benefit, as they didn’t have to pay taxes on their new form of compensation and they were able to secure healthcare for themselves and their families. After the war ended, this practice continued to spread as veterans returned home and began looking for work.
While this was an improvement for many, it left out vulnerable groups of people: retirees, the unemployed, those unable to work due to a disability, and those whose employer did not offer health insurance. In an effort not to alienate at-risk citizens, some government officials felt it was important to keep pushing for a national healthcare system.
The Wagner-Murray-Dingell Bill was introduced in 1943, proposing universal health care funded through a payroll tax. True to the pattern of healthcare history thus far, the bill faced intense opposition and eventually died in committee.
When FDR died in 1945, Harry Truman (1945-1953) became the 33rd President of the United States. He took over FDR’s old national health insurance platform from the mid-30’s, but with some key changes. Truman’s plan included all Americans, rather than only working class and poor citizens who had a hard time affording care — and it was met with mixed reactions in Congress.
Some members of Congress called the plan "socialist" and suggested that it came straight out of the Soviet Union, adding fuel to the Red Scare that was already gripping the nation. Once again, the AMA took a hard stance against the bill, also claiming the Truman Administration was toeing "the Moscow party line." The AMA even introduced their own plan, which proposed private insurance options, departing from their previous platform that opposed third parties in healthcare.
Even after Truman was re-elected in 1948, his health insurance plan died as public support dropped off and the Korean War began. Those who could afford it began purchasing health insurance plans privately and labor unions used employer-sponsored benefits as a bargaining chip during negotiations.
The 1950’s
As the government became primarily concerned with the Korean War, the national health insurance debate was tabled, once again. While the country tried to recover from its third war in 40 years, medicine was moving forward. It could be argued that the effects of penicillin in the 40's opened people's eyes to the benefits of medical advancements and discoveries.
In 1952, Jonas Salk's team at the University of Pittsburgh created an effective polio vaccine, which was tested nationwide two years later and was approved in 1955. During this same time frame, the first organ transplant was performed when Dr. Joseph Murray and Dr. David Hume took a kidney from one man and successfully placed it in his twin brother.
Of course, with such leaps in medical advancement, came additional cost — a story from the history of healthcare that is still repeated today. During this decade, the price of hospital care doubled, again pointing to America’s desperate need for affordable healthcare. But in the meantime, not much changed in the health insurance landscape.
The 1960’s
By 1960, the government started tracking National Health Expenditures (NHE) and calculated them as a percentage of Gross Domestic Product (GDP). At the start of the decade, NHE accounted for 5 percent of GDP.
When John F. Kennedy (1961-1963) was sworn in as the 35th President of the United States, he wasted no time at all on a healthcare plan for senior citizens. Seeing that NHE would continue to increase and knowing that retirees would be most affected, he urged Americans to get involved in the legislative process and pushed Congress to pass his bill. But in the end, it failed miserably against harsh AMA opposition and again — fear of socialized medicine.
After Kennedy was assassinated on November 22, 1963, Vice President Lyndon B. Johnson (1963-1969) took over as the 36th President of the United States. He picked up where Kennedy left off with a senior citizen’s health plan. He proposed an extension and expansion of the Social Security Act of 1935, as well as the Hill-Burton Program (which gave government grants to medical facilities in need of modernization, in exchange for providing a “reasonable” amount of medical services to those who could not pay).
Johnson's plan focused solely on making sure senior and disabled citizens were still able to access affordable healthcare, both through physicians and hospitals. Though Congress made hundreds of amendments to the original bill, it did not face nearly the opposition that preceding legislation had — one can only speculate as to why it had an easier path to passage.
It passed the House and Senate with comfortable margins and went to the President's desk. Johnson signed the Social Security Act of 1965 on July 30 of that year, with former President Harry Truman sitting at the table with him. This bill laid the groundwork for what we now know as Medicare and Medicaid.
The 1970’s
By 1970, NHE accounted for 6.9 percent of GDP, due in part to "unexpectedly high" Medicare expenses. Because the U.S. had not formalized a health insurance system (it was still just people who could afford it buying insurance), the government didn't really have any idea how much it would cost to provide healthcare for an entire group of people — especially an older group more likely to have health problems. This was quite a leap in a ten-year span, and it wouldn't be the last time we'd see such a jump. This decade would mark another push for national health insurance — this time from unexpected places.
Richard Nixon (1969-1974) was elected the 37th President of the United States in 1968. As a teen, he watched two brothers die and saw his family struggle through the 1920’s to care for them. To earn extra money for the household, he worked as a janitor. When it came time to apply for colleges, he had to turn Harvard down because his scholarship didn’t include room and board.
Entering the White House as a Republican, many were surprised when he proposed new legislation that strayed from party lines in the healthcare debate. With Medicare still fresh in everyone's minds, it wasn't a stretch to believe additional healthcare reform would come hot on its heels, so members of Congress were already working on a plan.
In 1971, Senator Edward (Ted) Kennedy proposed a single-payer plan (a modern version of a universal, or compulsory system) that would be funded through taxes. Nixon didn't want the government reaching so far into Americans' lives, so he proposed his own plan, which required employers to offer health insurance to employees and even provided subsidies to those who had trouble affording the cost.
Nixon believed that basing a health insurance system in the open marketplace was the best way to strengthen the existing makeshift system of private insurers. In theory, this would have allowed the majority of Americans to have some form of health insurance. People of working age (and their immediate families) would have insurance through their employers and then they’d be on Medicare when they retired. Lawmakers believed the bill satisfied the AMA because doctors’ fees and decisions would not be influenced by the government.
Kennedy and Nixon ended up working together on a plan, but in the end, Kennedy buckled under pressure from unions and he walked away from the deal — a decision he later said was “one of the biggest mistakes of his life.” Shortly after negotiations broke down, Watergate hit and all the support Nixon’s healthcare plan had garnered completely disappeared. The bill did not survive his resignation and his successor, Gerald Ford (1974-1977) distanced himself from the scandal.
However, Nixon was able to accomplish two healthcare-related tasks. The first was an expansion of Medicare in the Social Security Amendments of 1972 and the other was the Health Maintenance Organization Act of 1973 (the HMO Act), which established some order in the healthcare industry chaos. But by the end of the decade, American medicine was considered to be in "crisis," aided by an economic recession and heavy inflation.
The 1980’s
By 1980, NHE accounted for 8.9 percent of GDP, an even larger leap than the decade prior. Under the Reagan Administration (1981-1989), regulations loosened across the board and privatization of healthcare became increasingly common.
In 1986, Reagan signed the Consolidated Omnibus Budget Reconciliation Act (COBRA), which allowed former employees to continue to be enrolled in their previous employer’s group health plan — as long as they agreed to pay the full premium (employer portion plus employee contribution). This provided health insurance access to the recently unemployed who might have otherwise had difficulty purchasing private insurance (due to a pre-existing condition, for example).
The 1990’s
By 1990, NHE accounted for 12.1 percent of GDP — the largest increase thus far in the history of healthcare. Like others before him, the 42nd President of the United States, Bill Clinton (1993-2001), saw that this rapid increase in healthcare expenses would be damaging to the average American and attempted to take action.
Shortly after being sworn in, Clinton proposed the Health Security Act of 1993. It proposed ideas similar to those in FDR's and Nixon's plans — a mix of universal coverage while respecting the private insurance system that had formed on its own in the absence of legislation. Individuals could purchase insurance through "state-based cooperatives," companies could not deny anyone based on a pre-existing condition, and employers would be required to offer health insurance to full-time employees.
Multiple issues stood in the way of the Clinton plan, including foreign affairs, the complexity of the bill, an increasing national deficit, and opposition from big business. After a period of debate toward the end of 1993, Congress left for winter recess with no conclusions or decisions, leading to the bill’s quiet death.
In 1996, Clinton signed the Health Insurance Portability and Accountability Act (HIPAA), which established privacy standards for individuals. It also guaranteed that a person’s medical records would be available upon their request and placed restrictions on how pre-existing conditions were treated in group health plans.
The final healthcare contribution from the Clinton Administration was part of the Balanced Budget Act of 1997. It was called the Children’s Health Insurance Program (CHIP) and it expanded Medicaid assistance to “uninsured children up to age 19 in families with incomes too high to qualify them for Medicaid.” CHIP is run by each individual State and is still in use today.
In the meantime, employers were trying to find ways to cut back on healthcare costs. In some cases, this meant offering HMOs, which by design, are meant to cost both the insurer and the enrollee less money. Typically this includes cost saving measures, such as narrow networks and requiring enrollees to see a primary care physician (PCP) before a specialist. Generally speaking, insurance companies were trying to gain more control over how people received healthcare. This strategy worked overall — the 90’s saw slower healthcare cost growth than previous decades.
The 2000’s
By the year 2000, NHE accounted for 13.3 percent of GDP — just a 1.2 percentage-point increase over the past decade. When George W. Bush (2001-2009) was elected the 43rd President of the United States, he wanted to update Medicare to include prescription drug coverage. This idea eventually turned into the Medicare Prescription Drug, Improvement, and Modernization Act of 2003 (sometimes called Medicare Part D). Enrollment was (and still is) voluntary, although millions of Americans use the program.
The history of healthcare slowed down at that point, as the national healthcare debate was tabled while the U.S. focused on the increased threat of terrorism and the second Iraq War. It wasn't until election campaign rumblings began in 2006 and 2007 that insurance worked its way back into the national discussion.
When Barack Obama (2009-2017) was elected the 44th President of the United States in 2008, he wasted no time getting to work on health care reform. He worked closely with Senator Ted Kennedy to create a new healthcare law that mirrored the one Kennedy and Nixon worked on in the 70’s.
Like Nixon's bill, it mandated that applicable large employers provide health insurance, in addition to requiring that all Americans carry health insurance, even if their employer did not offer it. The bill would establish an open Marketplace, on which insurance companies could not deny coverage based on pre-existing conditions. American citizens earning less than 400 percent of the poverty level would qualify for subsidies to help cover the cost. It wasn't universal or single-payer coverage, but instead used the existing private insurance industry model to extend coverage to millions of Americans. The bill circulated through the House and the Senate for months, going through multiple revisions, but ultimately passed and moved to the President's desk.
2010 to Now
While the country focused on the second Iraq War, the cost of healthcare took another leap. By 2010, NHE accounted for 17.4 percent of GDP. This period of time would bring a new, but divisive chapter in the history of healthcare in America.
On March 23, 2010, President Obama signed the Patient Protection and Affordable Care Act (PPACA), commonly called the Affordable Care Act (ACA) into law. Because the law was complex and the first of its kind, the government issued a multi-year rollout of its provisions. In theory, this should have helped ease insurance companies (and individuals) through the transition, but in practice, things weren’t so smooth. The first open enrollment season for the Marketplace started in October 2013 and it was rocky, to say the least.
Nevertheless, 8 million people signed up for insurance through the ACA Marketplace during the first open enrollment season. The numbers increased to 11.7 million in 2015 and it’s estimated that the ACA has covered an average of 11.4 million annually ever since.
It’s no secret that the ACA was met with heavy opposition for a variety of reasons (the individual mandate and the employer mandate being two of the most hotly contested). Some provisions were even taken before the Supreme Court on the basis of constitutionality. In addition, critics highlighted the problems with healthcare.gov as a sign this grand “socialist” plan was destined to fail.
Regardless of the controversy, it could be argued that the most helpful part of the ACA was its pre-existing condition clause. Over the course of the 20th century, insurance companies began denying coverage to individuals with pre-existing conditions, such as asthma, heart attacks, strokes, and AIDS. Exactly when pre-existing condition exclusions entered the history of our healthcare is debatable, but it very likely coincided with the rise of for-profit insurance companies across the landscape. Back in the 20's, not-for-profit Blue Cross charged the same amount regardless of age, sex, or pre-existing condition, but eventually they changed their status to compete with the newcomers. And as the cost of healthcare increased, so did the number of people being denied coverage.
Prior to the passage of the ACA, it's estimated that one in seven Americans was denied health insurance because of a pre-existing condition — a list that was extensive and often elusive, thanks to variations between insurance companies and language like "including, but not limited to the following."
In addition, the ACA allowed for immediate coverage of maternal and prenatal care, which had previously been far more restrictive in private insurance policies. Usually, women had to pay an additional fee for maternity coverage for at least 12 months prior to prenatal care being covered — otherwise, the pregnancy was viewed as a pre-existing condition and services involving prenatal care (bloodwork, ultrasounds, check-ups, etc) were not included in the policy.
The Future of Healthcare: History Shall Repeat Itself
Since Donald Trump was sworn in as the 45th President of the United States on January 20, 2017, many have been questioning what will happen with our healthcare system — specifically, what will happen to the ACA, since Trump ran on a platform of “repealing and replacing” the bill.
As open enrollment for 2017 drew to a close, it became apparent to lawmakers that either repealing or replacing the ACA would be no easy task. If they were to repeal the bill, what would happen to the 11 million Americans currently insured through the Marketplace? If they were to come up with a replacement plan, what would it look like? What changes would be made? Would there be a Marketplace? Would insurance companies be able to deny coverage based on pre-existing conditions again?
There are plenty of questions that need to be answered and it doesn't seem like a decision will be reached anytime soon, but one thing is for sure: changes will be made. What those changes will be depends on Congress, Trump, the new Secretary of Health and Human Services, Tom Price — and you, the voter.
The history of healthcare in America will continue to evolve and it will be interesting to see where this administration takes us and the effects their plan will have on Americans. Regardless, we'll have to keep an eye on the national health expenditure numbers. According to the latest data available, NHE was 17.8 percent of GDP in 2015, signaling slower growth than the previous decade, but only time will tell the full story.