
Compliance Concepts

In Securing HP NonStop Servers in an Open Systems World, 2006

Health Insurance Portability & Accountability Act

The Health Insurance Portability and Accountability Act (HIPAA) was passed in the USA in 1996. HIPAA is specific to a single country and to a specific industry, healthcare. The deadline for compliance by large business entities was April 2005. For smaller companies the deadline was April 2006. When a company fails to comply, the individuals responsible face substantial civil and criminal penalties, including imprisonment.

HIPAA outlines several general objectives. Those that pertain to information security are:

Protect the health information of individuals against unauthorized access

Specific requirements under this general objective put IT departments under pressure to:

Implement procedures for creating, changing, and safeguarding passwords

Implement unique names and/or numbers to individually identify and track user identities

Implement procedures to verify that persons or entities seeking access to protected health information are who they claim to be

Implement technical policies and procedures that allow access only to those persons or software programs that have “a need to know”

Implement automatic procedures that terminate an electronic session after a predetermined time of inactivity

Implement procedures for monitoring log-in attempts and reporting discrepancies

Implement regular reviews of system activity via audit logs, access reports, and security incident tracking reports

Implement hardware, software, and/or procedural mechanisms that record and review activity of systems that store or use protected health information

Implement a mechanism to encrypt and decrypt protected health information

Implement policies and procedures to protect protected health information from improper alteration or destruction

Implement electronic mechanisms to corroborate that protected health information has not been altered or destroyed in an unauthorized manner

Implement technical security measures to guard against unauthorized access to protected health information transmitted over an electronic communications network

Implement security measures to ensure that electronically transmitted protected health information is not improperly modified without detection until such time as it is properly destroyed
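Several of the requirements above translate directly into code; automatic session termination is the most mechanical. The sketch below is a minimal illustration, assuming a 15-minute policy value and class and method names invented here (HIPAA mandates the safeguard, not any particular implementation):

```python
import time

# Assumed policy value; HIPAA requires "a predetermined time of inactivity"
# but does not mandate a specific duration.
INACTIVITY_LIMIT_SECONDS = 15 * 60

class Session:
    """A user session that terminates itself after a period of inactivity."""

    def __init__(self, user_id, clock=time.monotonic):
        self.user_id = user_id
        self._clock = clock            # injectable for testing
        self._last_activity = clock()
        self.active = True

    def touch(self):
        """Record user activity, resetting the inactivity timer."""
        if self.active:
            self._last_activity = self._clock()

    def check_timeout(self):
        """Terminate the session if the inactivity limit was exceeded."""
        if self.active and self._clock() - self._last_activity > INACTIVITY_LIMIT_SECONDS:
            self.active = False
        return self.active
```

The injectable clock makes the timeout testable; a production system would also write the forced logoff to an audit log, in line with the monitoring requirements in the same list.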


URL: https://www.sciencedirect.com/science/article/pii/B9781555583446500044

Statutory and regulatory GRC

Leighton Johnson, in Security Controls Evaluation, Testing, and Assessment Handbook (Second Edition), 2020

HIPAA—1996

The Health Insurance Portability and Accountability Act (Public Law 104-191, 110 Stat. 1936), enacted August 21, 1996, originally had Title I, protecting health insurance coverage for workers and their families when they change or lose their jobs, and Title II, known as the Administrative Simplification (AS) provisions, requiring the establishment of national standards for electronic health care transactions and national identifiers for providers, health insurance plans, and employers.

The effective compliance date of the updated HIPAA Privacy Rule was April 14, 2003, with a 1-year extension for certain “small plans.” The HIPAA Privacy Rule regulates the use and disclosure of Protected Health Information (PHI) held by “covered entities.”

(Covered entities are the medical practitioners and organizations which collect the PHI from the people who receive the medical treatment.)

The regulatory part of the HIPAA Privacy Rule is explained later in this chapter; however, suffice it to say, the Department of Health and Human Services has extended the HIPAA Privacy Rule to independent contractors of covered entities who fit within the definition of "business associates." This has led to the legal concept of "downstream liability" for PHI and users of the extended PHI in the "covered entities" area. Now, PHI is any information held by a covered entity that concerns health status, provision of health care, or payment for health care that can be linked to an individual.

Covered entities can disclose PHI without a patient's written authorization only to facilitate treatment, payment, or health care operations. Any other disclosure of PHI requires the covered entity to obtain written authorization from the individual.


URL: https://www.sciencedirect.com/science/article/pii/B9780128184271000033

Regulatory overview

Arnab Ray, in Cybersecurity for Connected Medical Devices, 2022

Secure design principles for privacy

While the FDA is responsible for ensuring that medical devices are safe and effective, its parent organization, the Department of Health and Human Services (HHS), is responsible for the privacy of individual data collected by the various stakeholders in the healthcare system, such as hospitals, physicians' offices, and insurance companies. It is also responsible for the privacy of data stored on and transmitted by medical devices. HHS's mandate to protect the private data of individuals originates from Title II of HIPAA, specifically from two sections: the Privacy Rule and the Security Rule.

The Security Rule defines three classes of security safeguards required for compliance: administrative, physical, and technical (Fig. 3.2).


Figure 3.2. HIPAA control classes.

Technical safeguards are the most germane in the current context, since they define the authentication, authorization, access control, logging, and data security–related privacy controls.

HIPAA-driven controls apply to electronic Protected Health Information, or ePHI. ePHI is electronic data that can be used to identify a particular individual through data about his or her medical condition (past, current, or future), the kind of medical care provided to the individual, or payment information for past, present, or future care. Identification of an individual may happen in two ways:

Direct identification: Identification by retrieving a direct identifier, such as a name or Social Security number, from electronic health data. For example, suppose a therapy device in a hospital stores the names of the patients to whom therapy was applied over the past week. Someone walking by pulls up the records screen, finds the name of a celebrity, and reports it to a gossip website.

Indirect identification: Identification by retrieving indirect identifiers, such as date of birth, gender, and zip code, and linking them with data from other sources. Suppose the therapy device did not store the name of the patient but did store gender, date of birth, and zip code. Given that celebrities' dates of birth are public knowledge on Wikipedia, and the gossip website knows that the celebrity they suspect of having a condition lives in Beverly Hills, there is a very high probability that with the zip code, gender, and date of birth from the medical record, they can link the celebrity to the medical condition. This kind of identification is in fact quite easy; according to a study [28], three indirect identifiers used together (date of birth, zip code, and gender) can uniquely identify 87% of the American population.
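The linkage attack described above can be sketched in a few lines. All records, names, and attribute values below are invented for illustration; the point is only that an exact join on the three quasi-identifiers is enough to re-identify a record that carries no direct identifier:

```python
# Hypothetical "deidentified" therapy records: no names, only quasi-identifiers.
medical_records = [
    {"dob": "1975-06-04", "gender": "F", "zip": "90210", "condition": "condition A"},
    {"dob": "1982-11-19", "gender": "M", "zip": "10001", "condition": "condition B"},
]

# A public directory (e.g., scraped biographical data) with the same attributes.
public_directory = [
    {"name": "Famous Person", "dob": "1975-06-04", "gender": "F", "zip": "90210"},
]

def link(records, directory):
    """Join on (dob, gender, zip); a unique match re-identifies the record."""
    matches = []
    for rec in records:
        key = (rec["dob"], rec["gender"], rec["zip"])
        hits = [p for p in directory
                if (p["dob"], p["gender"], p["zip"]) == key]
        if len(hits) == 1:
            matches.append((hits[0]["name"], rec["condition"]))
    return matches
```

Here `link` recovers the association between "Famous Person" and "condition A" even though the medical records never stored a name, which is exactly the failure mode the study cited above quantifies.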

ePHI may be deidentified, i.e., attributes that can potentially lead to identification of individuals are removed before the data are stored electronically. HIPAA defines two methods by which a manufacturer may determine whether data have been deidentified: analysis by an expert, or removal of the 18 attributes specified by the Privacy Rule. While deidentification means an electronic record is no longer considered ePHI, removing the burden of protecting its confidentiality and the liability for a breach, in most cases it also removes many of the reasons the data were being kept in the first place. If the manufacturer of a home renal therapy device stores patient data so that it can automate the delivery of solutions to the patient's home when the supply runs low, not retaining the address or the name of the patient would compromise the whole model of therapy; patients may forget to order supplies and find themselves unable to provide self-care as a result.
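A minimal sketch of the attribute-removal approach, assuming records stored as flat dictionaries. Note that DIRECT_IDENTIFIERS here is a small illustrative subset of fields, not the Privacy Rule's actual list of 18 attributes:

```python
# Illustrative subset of identifier fields to strip; the Privacy Rule's real
# enumeration covers 18 attribute categories (names, geographic subdivisions
# smaller than a state, dates, phone numbers, SSNs, biometrics, etc.).
DIRECT_IDENTIFIERS = {"name", "address", "ssn", "phone", "email", "dob"}

def deidentify(record):
    """Return a copy of the record with the listed identifier fields removed."""
    return {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
```

As the surrounding text notes, the output of `deidentify` may be useless for workflows (like resupply) that needed the removed fields, and, per the linkage discussion above, removal of direct identifiers alone does not guarantee the record cannot be re-identified.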

The GDPR is an EU data privacy regulation for the protection of personal data of EU citizens. GDPR defines mandatory legal requirements on how personally identifying data of EU citizens are collected, stored, processed, and destroyed, even if the corporation collecting and storing the data is outside the EU, as well as the rights of the individuals whose data are collected over their own data.

Besides capturing in legislation the notion of “privacy by design,” i.e., recommending the consideration of privacy as a design driver while designing the security of a system and the definition of the information architecture of a system, GDPR introduces a novel concept as a privacy design control: pseudonymization. Recognizing the fact that data that have been totally deidentified are of little business value, GDPR proposes pseudonymization as a compromise between total anonymization and no anonymization and encourages companies to adopt it as an information security design paradigm.

As per GDPR, pseudonymization is defined as follows:

Pseudonymization means the processing of personal data in such a manner that the personal data can no longer be attributed to a specific data subject without the use of additional information, provided that such additional information is kept separately and is subject to technical and organizational measures to ensure that the personal data are not attributed to an identified or identifiable natural person.

What this means, from an implementation point of view, is that data containing personally identifiable information are split across data stores with independent technical cybersecurity controls. An example of pseudonymized data in the domain of medical devices is provided below.

An infusion pump stores therapy data about a patient on the pump. The data are indexed by a patient id and kept encrypted on the device. A separate electronic database managed by the HDO maintains the association between the patient id and direct identifiers of the patient, such as name, SSN, and address. This database, often implemented as part of an electronic health records (EHR) system, is subject to encryption and other cybersecurity controls that are distinct and independent from the controls employed by the MDM on the pump.

Here, the properties of pseudonymization are satisfied as GDPR defines them. A data breach in which the PHI stored on the device is exposed does not immediately lead to leaking the association between the medical condition and the individual. A compromise of the EHR alone likewise does not divulge this association. The association can be made only by compromising both data sources, and the chances of that are low, given that they are maintained by independent organizations (the HDO and the device manufacturer) with independently designed and implemented controls.


URL: https://www.sciencedirect.com/science/article/pii/B9780128182628000103

Information Systems Legislation

Craig Wright, in The IT Regulatory and Standards Compliance Handbook, 2008

The Health Insurance Portability and Accountability Act

The Health Insurance Portability and Accountability Act (HIPAA, or the Kennedy-Kassebaum Act) was implemented as law in 1996. The sections relevant to security and this paper are the Privacy Rule and the Security Rule.

The Privacy Rule defines patient medical records, or protected healthcare information (PHI), and controls the use and disclosure of PHI, requiring robust measures to ensure patient privacy.

The Security Rule complements the Privacy Rule by defining the administrative, physical, and technical security safeguards required to protect PHI. Security standards are defined for each of these groupings. HIPAA provides stiff penalties for those who violate it, including criminal prosecution.


URL: https://www.sciencedirect.com/science/article/pii/B9781597492669000217

Database Activity Monitoring

Josh Shaul, Aaron Ingram, in Practical Oracle Security, 2007

The Health Insurance Portability and Accountability Act

The Health Insurance Portability and Accountability Act (HIPAA) includes a Privacy Rule that took effect in 2003. The rule ensures the security and privacy of Protected Health Information (PHI), which includes, among other things, the patient's name, address, date of birth, Social Security number, medical data, payment history, account number, and doctor's name. Track all unexpected access to this information, whether it is a read or a write. Breaches incur a fine per record, so be sure to record the number of records affected by database commands. As usual, track DBAs, DDL commands, backups/restorations, and configuration changes.
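One way to sketch such per-record audit tracking is an append-only log that captures who issued each command and how many records it touched. The table and field names below are invented for illustration; real deployments typically capture this from database audit facilities rather than application code:

```python
import datetime

audit_log = []  # append-only in-memory stand-in for a real audit store

def audit(user, command, table, rows_affected, is_dba=False):
    """Record one database command against a PHI table."""
    audit_log.append({
        "when": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "user": user,
        "command": command,              # e.g. SELECT, UPDATE, DDL, BACKUP
        "table": table,
        "rows_affected": rows_affected,  # needed to size a per-record breach fine
        "dba": is_dba,                   # DBA activity is tracked separately
    })
```

Recording `rows_affected` for reads as well as writes is the key point from the text: if a breach occurs, the fine is assessed per record, so the log must be able to answer "how many records did this command expose?"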


URL: https://www.sciencedirect.com/science/article/pii/B978159749198350010X

Regulations Governing Protected Health Information

Paul Cerrato, in Protecting Patient Information, 2016

Technology is only part of the equation

The HIPAA regulations spend a lot of ink outlining how electronic data should be protected, but they also discuss physical safeguards. When referring to “reasonable and appropriate” precautions, they suggest that shredding paper documents containing PHI before discarding them is one such measure. Securing medical records with lock and key and limiting access to those keys is likewise necessary. Of course, keeping locked doors closed is another obvious safeguard but not always one that decision makers or their staffs abide by. More than one medical provider has been guilty of keeping the door to a server closet open because it was getting too hot in the closet.

HIPAA regulations also put a premium on written policy statements and staff training, as mentioned earlier. All the technology in the world cannot replace a clear cut set of institutional guidelines and a culture that values patient safety and privacy. However, HHS realizes that the needs and capabilities of healthcare organizations vary widely and attempts to take these differences into consideration as it spells out the administrative requirements needed to mitigate the risk of an information leak.

HHS expects you to appoint a privacy official to develop and implement the organization's privacy policies and procedures, as well as to designate a contact person in the office for complaints or requests for information.

Equally important is a workforce training program that educates all workforce members on your policies and procedures. And the HIPAA regulation makes it clear that the workforce does not just include employees but also volunteers, trainees, and anyone else whose conduct is under the direct control of your organization, whether or not they are paid for their services. The federal regulations also insist that you have a mechanism in place that applies sanctions against workers who violate the policies and procedures in the Privacy Rule. In other words, workers need to be held accountable for their actions and realize that there can be serious consequences for ignoring the privacy safeguards put in place. More details on what the policy and procedures manual should contain and what the training should consist of will be covered in subsequent chapters.


URL: https://www.sciencedirect.com/science/article/pii/B9780128043929000034

Compliance Monitoring with Nessus 3

Russ Rogers, in Nessus Network Auditing (Second Edition), 2008

Understanding Compliance

So, what are we talking about when we refer to compliance? When the Internet began to be used for legitimate business in the 1990s, there wasn't as much concern about the security of servers or transactions. In fact, looking back now, it's a bit frightening to consider how many servers were possibly compromised during those emerging years.

Somewhere around the year 2000, however, people started to realize just how dangerous it would be to lose all that information that was being stored and transmitted across the Internet. These “visionaries”, as it were, sat down and created the first documents recommending a variety of security policies, processes, and configurations.

However, a quick look through the news archives since the year 2000 will show that a good number of organizations failed to embrace the importance of these recommendations, resulting in the loss of sensitive or private information belonging to customers, employees, or the company itself. This is where compliance comes into the picture. The governing organizations realized the need to create enforceable regulations in a variety of industries. This was the birth of a number of regulations, such as HIPAA, FERPA, GLBA, NERC, PCI, and ISO 17799. But even with all these great strides toward compliance, only recently have we seen progress toward actually enforcing these regulations and following through with punitive action against those organizations that fail to comply.

Table 13.1 shows examples of the regulations and legislation in a variety of industries. It should not, however, be considered an exhaustive list. The detailed discussion of every available regulation is beyond the scope of this chapter.

Table 13.1. Example Security Guidelines and Regulations

Industry                 | Compliance Documents            | Web Page
Healthcare Industry      | HIPAA                           | http://www.hhs.gov/ocr/hipaa
Financial                | PCI Compliance                  | https://www.pcisecuritystandards.org/
Education                | FERPA                           | http://www.ed.gov/policy/gen/guid/fpco/ferpa/index.html
Utility / Public Service | NERC                            | http://www.nerc.com/
International            | ISO 27002:2005                  | http://www.iso.org/iso/home.htm
Federal Government       | NIST 800 Series, FIPS documents | http://csrc.nist.gov/

HIPAA

The Health Insurance Portability and Accountability Act (HIPAA) was passed by Congress in August 1996 to improve the efficiency and effectiveness of the health care system in the United States. As part of HIPAA, privacy and security rules were included, providing guidance for the protection of private and sensitive patient information. HIPAA provides only the most general guidance and does not include technical details concerning the implementation of hardware and software.

Payment Card Industry (PCI)

The payment card industry, or PCI, is the term used to describe organizations that process all types of payment cards, including credit cards, debit cards, ATM cards, and pre-paid cards. On September 7th, 2006, the PCI Security Standards Council was created by American Express, Visa, MasterCard, Discover, and the Japan Credit Bureau in order to manage and maintain the PCI Data Security Standard (DSS). The DSS provides requirements to organizations that process payment cards for providing secure transmission and storage of customer information before, during, and after a transaction occurs.

FERPA

The Family Educational Rights and Privacy Act is not actually a recent development; it became law in 1974. The real goal of FERPA was to protect student information from unauthorized disclosure or use. But over time, as technology has evolved, security pundits have started using it as a basis for technological security requirements wherever student information is stored or transmitted.

NERC

The North American Electric Reliability Corporation (NERC) was created as a non-profit organization in 1968 to provide guidance on ensuring the reliability of all interconnected power systems in the United States, as well as parts of Canada and Mexico. NERC standards, which cover the security of those systems, include penalties for power organizations that fail to comply.

ISO/IEC 27002:2005

ISO/IEC 27002:2005 was originally released by the International Organization for Standardization as ISO 17799 and subsequently renumbered in July 2007. This document provides guidance at an international level about the security requirements of information systems. 27002:2005 is widely regarded as the best international guide to best practices for risk assessment, security policy, access control, and more.

NIST 800 Series

The National Institute of Standards and Technology (NIST) provides guidance to the Federal Government on a number of subjects, including information security. The publication series we're most interested in is the 800 series, which addresses information security specifically. The series was established in 1990; the latest document released at the time of this publication is the DRAFT version of SP 800-115, which addresses information security testing.

WARNING

It should be noted that these documents are constantly being updated, and eventually retired, over time. Professionals in the field of information security should continue monitoring their evolution to ensure they’re addressing every possible requirement.


URL: https://www.sciencedirect.com/science/article/pii/B9781597492089000137

The Database Environment

Jan L. Harrington, in Relational Database Design (Third Edition), 2009

Other Factors in the Database Environment

Choosing hardware and software to maintain a database and then designing and implementing the database itself was once enough to establish a database environment. Today, however, security concerns loom large, coupled with government regulations on the privacy of data. In addition, a new database is unlikely to be the first database in an organization that has been in business for a while; the new database may need to interact with an existing database that cannot be merged into the new database. In this section, we’ll briefly consider how those factors influence database planning.

Security

Before the Internet, database management was fairly simple in that we were rarely concerned about security. A user name and password were enough to secure access to a centralized database. The most significant security threats were internal—from employees who either corrupted data by accident or purposely exceeded their authorized access.

Most DBMSs provide some type of internal security mechanism. However, that layer of security is not enough today. Adding a database server to a network that has a full-time connection to the Internet means that database planning must also involve network design. Authentication servers, firewalls, and other security measures therefore need to be included in the plans for a database system.

The need for added security does, however, come at a price. The planning time and additional hardware and software increase the cost of implementing the database. The cost of maintaining the database also increases, as network traffic must be monitored far more closely than with classic centralized architectures. Unfortunately, there is no alternative. Data is the lifeblood of almost every modern organization, and it must be protected.

The cost of a database security breach can be devastating to a business. The loss of trade secrets, the release of confidential customer information—even if the unauthorized disclosure of data doesn't cause any problems, security breaches can be a public relations nightmare, causing customers to lose confidence in the organization and convincing them to take their business elsewhere.

Note: Because database security is so vitally important, Chapter 16 is devoted entirely to this topic.

Government Regulations and Privacy

Until the past 10 years or so, decisions about what data must be secured to maintain privacy have been left up to the organization storing the data. In the United States, however, that is no longer the case for many types of data. Government regulations determine who can access the data and what they may access. The following are some of the U.S. laws that may affect owners of databases.

Health Insurance Portability and Accountability Act (HIPAA): HIPAA is intended to safeguard the privacy of medical records. It restricts the release of medical records to the patient alone (or the parent/guardian in the case of those under 18) or to those the patient has authorized in writing to retrieve records. It also requires the standardization of the formats of patient records so they can be transferred easily among insurance companies and the use of unique identifiers for patients. (The Social Security number may not be used.) Most importantly for database administrators, the law requires that security measures be in place to protect the privacy of medical records.

Family Educational Rights and Privacy Act (FERPA): FERPA is designed to safeguard the privacy of educational records. Although the U.S. federal government has no direct authority over private schools, it does wield considerable power over funds that are allocated to schools. Therefore, FERPA denies federal funds to those schools that don't meet the requirements of the law. It states that parents have a right to view the records of children under 18 and that the records of older students (those 18 and over) cannot be released to anyone but the student without the written permission of the student. Schools therefore have the responsibility to ensure that student records are not disclosed to unauthorized people, increasing the need for secure information systems that store student information.

Children's Online Privacy Protection Act: Provisions of this law govern which data can be requested from children (those under 13) and which of those data can be stored by a site operator. It applies to Web sites, “pen pal services,” e-mail, message boards, and chat rooms. In general, the law aims to restrict the soliciting and disclosure of any information that can be used to identify a child—beyond information required for interacting with the Web site—without approval of a parent or guardian. Covered information includes first and last name, any part of a home address, e-mail address, telephone number, Social Security number, or any combination of the preceding. If covered information is necessary for interaction with a Web site—for example, registering a user—the Web site must collect only the minimally required amount of information, ensure the security of that information, and not disclose it unless required to do so by law.

Legacy Databases

Many businesses keep their data “forever.” They never throw anything out, nor do they delete electronically stored data. For a business that has been using computing since the 1960s or 1970s, this typically means that old database applications are still in use. We refer to such databases that use pre-relational data models as legacy databases. The presence of legacy databases presents several challenges to an organization, depending on the need to access and integrate the older data.

If legacy data are needed primarily as an archive (either for occasional access or retention required by law), then a company may choose to leave the database and its applications as they stand. The challenge in this situation occurs when the hardware on which the DBMS and application programs run breaks down and cannot be repaired. The only alternative may be to recover as much of the data as possible and convert it to be compatible with newer software.

Businesses that need legacy data integrated with more recent data must answer the question “Should the data be converted for storage in the current database, or should intermediate software be used to move data between the old and the new as needed?” Because we are typically talking about large databases running on mainframes, neither solution is inexpensive.

The seemingly most logical alternative is to convert legacy data for storage in the current database. The data must be taken from the legacy database and reformatted for loading into the new database. An organization can hire one of a number of companies that specialize in data conversion, or it can perform the transfer itself. In both cases, a major component of the transfer process is a program that reads data from the legacy database, reformats them as necessary so that they match the requirements of the new database, and then loads them into the new database. Because the structure of legacy databases varies so much among organizations, the transfer program is usually custom-written for the business using it.

Just reading the procedure makes it seem fairly simple, but keep in mind that because legacy databases are old, they often contain “bad data” (data that are incorrect in some way). Once bad data get into a database, it is very difficult to get rid of them. Somehow, the problem data must be located and corrected. If there is a pattern to the bad data, that pattern must be identified to prevent any more bad data from getting into the database. The process of cleaning the data can therefore be the most time-consuming part of data conversion. Nonetheless, it is still far better to spend the time cleaning the data as they come out of the legacy database than attempting to find and correct the data once they get into the new database.

The bad data problem can be compounded by missing mandatory data. If the new database requires that data be present (for example, requiring a zip code for every order placed in the United States) and some of the legacy data are missing the required values, there must be some way to “fill in the blanks” and provide acceptable values. Supplying values for missing data can be handled by conversion software, but application programs that use the data must then be modified to identify and handle the instances of missing data.
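The conversion step described above, including supplying a sentinel for missing mandatory fields, might be sketched as follows. The legacy field names and the placeholder zip code are assumptions for illustration; in practice the transfer program is custom-written for each organization's legacy schema:

```python
# Sentinel for a missing mandatory field; downstream applications must be
# modified to recognize and handle it, as the text notes.
MISSING_ZIP = "00000"

def convert(legacy_record):
    """Reformat one legacy record to match the new database's requirements."""
    return {
        # Clean up legacy fixed-width padding and inconsistent casing.
        "customer_name": legacy_record.get("CUSTNAME", "").strip().title(),
        # Mandatory in the new schema: fill in the blank with the sentinel.
        "zip_code": legacy_record.get("ZIP") or MISSING_ZIP,
    }
```

Keeping the sentinel distinct from any real value is the design point: the load succeeds, but the missing data remain identifiable so they can be corrected later rather than silently passing as clean.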

Data migration projects also include the modification of application programs that ran solely using the legacy data. In particular, it is likely that the data manipulation language used by the legacy database is not the same as that used by the new database.

Some very large organizations have determined that it is not cost effective to convert data from a legacy database. Instead, they choose to use some type of middleware that moves data to and from the legacy database in real time as needed. An organization that has a widely used legacy database can usually find middleware. IBM markets software that translates and transfers data between IMS (the legacy product) and DB2 (the current, relational product). When such an application does not exist, it will need to be custom-written for the organization.

Note: One commonly used format for transferring data from one database to another is XML, which you will read more about in Chapter 18.


URL: https://www.sciencedirect.com/science/article/pii/B9780123747303000012

Sara Gerke, ... Glenn Cohen, in Artificial Intelligence in Healthcare, 2020

12.4.3.1 United States

The Health Insurance Portability and Accountability Act (HIPAA) Privacy Rule (45 C.F.R. Part 160 as well as subparts A and E of Part 164) is the key federal law to protect health data privacy ([94], p. 38). However, HIPAA has significant gaps when it comes to today's healthcare environment since it only covers specific health information generated by "covered entities" or their "business associates." HIPAA does not apply to nonhealth information that supports inferences about health, such as a purchase of a pregnancy test on Amazon ([95], p. 232; [94], p. 39). Moreover, the definition of "covered entities" also limits its scope; it generally includes insurance companies, insurance services, insurance organizations, healthcare clearinghouses, and healthcare providers (45 C.F.R. §§ 160.102, 160.103), but not much beyond that ([95], p. 231; [94], p. 39). In particular, much of the health information collected by technology giants such as Amazon, Google, IBM, Facebook, and Apple, which are all investing heavily in the field of AI in healthcare but are not "covered entities," will fall outside of HIPAA ([94], p. 39). HIPAA also does not apply to user-generated health information ([95], p. 232; [94], p. 39). For example, a Facebook post about a disease falls outside of HIPAA's regime ([95], p. 232).

A different problem with HIPAA is its reliance on de-identification as a privacy strategy. Under HIPAA, de-identified health information can be shared freely for research and commercial purposes [[95], p. 231; 45 C.F.R. § 164.502(d)(2)]. HIPAA provides two options for de-identification: (1) a determination by someone with appropriate knowledge of, and experience with, generally accepted statistical and scientific principles and methods; or (2) the removal of 18 identifiers (e.g., names, social security numbers, and biometric identifiers) of the individual or of relatives, household members, or employers of the individual, provided the covered entity has no actual knowledge that the remaining information could be used to identify an individual [45 C.F.R. § 164.514(b)]. But this may not adequately protect patients because of the possibility of data triangulation: re-identifying data thought to be de-identified under the statute by combining multiple datasets ([94], pp. 39, 40; [96]). The problem of data triangulation was also recently featured in a lawsuit, Dinerstein v. Google [70], in which the plaintiff alleged that medical records shared with Google contained enough information to enable Google to re-identify patients given all the other data at its disposal.
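The second ("safe harbor") option can be pictured as stripping identifier fields from a record. The sketch below uses a small illustrative subset of the 18 categories; the field names are hypothetical placeholders, not the regulatory list, and a real safe-harbor pass must also generalize dates and small geographic units, which this omits.

```python
# Illustrative subset of HIPAA's 18 safe-harbor identifier categories.
# These field names are hypothetical, not the regulatory text.
SAFE_HARBOR_FIELDS = {
    "name", "street_address", "phone", "email",
    "ssn", "medical_record_number", "biometric_id",
}

def deidentify(record):
    """Return a copy of `record` with identifier fields removed.

    This is only a sketch: full safe-harbor de-identification covers
    all 18 categories and requires no actual knowledge that the
    remaining data could identify the individual.
    """
    return {k: v for k, v in record.items() if k not in SAFE_HARBOR_FIELDS}

record = {"name": "J. Doe", "ssn": "000-00-0000", "diagnosis": "J45.909"}
print(deidentify(record))  # {'diagnosis': 'J45.909'}
```

The data-triangulation critique in the text is precisely that what survives this kind of filtering (diagnoses, visit dates, coarse locations) can still single a person out once joined with other datasets.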

For all these reasons, HIPAA is not adequate to protect the health privacy of patients. It is time for federal law to take seriously the protection of health-relevant data that is not covered by HIPAA ([95], p. 232; [97], pp. 9, 16). Such a federal law should facilitate both innovation, including health AI applications, and adequate protection of individuals’ health privacy.

While HIPAA preempts less protective state law, it does not preempt states whose laws are more protective. Inspired by the EU GDPR, California has recently taken action at the state level: the California Consumer Privacy Act of 2018 (CCPA) became effective on January 1, 2020 (Cal. Civ. Code § 1798.198). The CCPA grants various rights to California residents with regard to personal information held by businesses. The term “business” is defined in Section 1798.140(c) of the California Civil Code and applies to “a sole proprietorship, partnership, limited liability company, corporation, association, or other legal entity that is organized or operated for the profit or financial benefit of its shareholders or other owners that collects consumers’ personal information or on the behalf of which that information is collected and that alone, or jointly with others, determines the purposes and means of the processing of consumers’ personal information, that does business in the State of California, and that satisfies one or more of the following thresholds:

A. Has annual gross revenues in excess of twenty-five million dollars (…).

B. Alone or in combination, annually buys, receives for the business’s commercial purposes, sells, or shares for commercial purposes, alone or in combination, the personal information of 50,000 or more consumers, households, or devices.

C. Derives 50 percent or more of its annual revenues from selling consumers’ personal information.”
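The three thresholds are a disjunction: meeting any one of them is enough. As a hedged sketch (the function and parameter names are illustrative, and real applicability turns on legal analysis, not arithmetic):

```python
def meets_ccpa_threshold(annual_revenue_usd,
                         consumers_processed,
                         share_of_revenue_from_selling_pi):
    """Return True if any of the three Section 1798.140(c) thresholds
    is satisfied. Names and simplifications here are illustrative only.
    """
    return (
        annual_revenue_usd > 25_000_000                # threshold A
        or consumers_processed >= 50_000               # threshold B
        or share_of_revenue_from_selling_pi >= 0.5     # threshold C
    )

# A company over the revenue bar qualifies even with few consumers.
assert meets_ccpa_threshold(30_000_000, 100, 0.0)
# A small business under every bar does not.
assert not meets_ccpa_threshold(1_000_000, 100, 0.1)
```

Note that threshold B counts consumers, households, or devices, so even a modest revenue business with a popular app can fall within the definition.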

The CCPA defines the term “personal information” broadly as “information that identifies, relates to, describes, is reasonably capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household,” including a real name, alias, postal address, social security number, and biometric information [Cal. Civ. Code § 1798.140(o)(1)]. Personal information does not, however, include “publicly available information,” that is, “information that is lawfully made available from federal, state, or local government records” [Cal. Civ. Code § 1798.140(o)(2)].

The CCPA does not apply to protected health information that is collected by HIPAA covered entities or their business associates [Cal. Civ. Code § 1798.145(c)(1)]. However, it applies to a great deal of information in so-called “shadow health records”—health data that is collected outside of the health system ([98], p. 449). Thus the CCPA is a welcome attempt to at least partially fill in legal gaps and improve the data protection of individuals.


URL: https://www.sciencedirect.com/science/article/pii/B9780128184387000125

The Healthcare Environment

Tony W. York, Don MacAlister, in Hospital and Healthcare Security (Sixth Edition), 2015

Health Insurance Portability and Accountability Act

The U.S. Health Insurance Portability and Accountability Act (HIPAA) was enacted as federal law in 1996 under the direction and control of the Department of Health and Human Services (HHS). The law applies to health information, referred to as Protected Health Information (PHI), created or maintained by healthcare providers who engage in certain electronic transactions, health plans, and healthcare clearinghouses. The Office for Civil Rights (OCR) is the departmental component responsible for implementing and enforcing the privacy regulation. The agency issued a final Privacy Rule that became effective in April 2001 and became enforceable for most covered entities in April 2005. The regulations are designed to safeguard PHI maintained or transmitted in electronic form. Personal computers, external portable hard drives (including iPods and similar devices), magnetic tape, removable storage devices such as USB memory sticks, CDs, DVDs and other digital memory cards, PDAs and smartphones, and the Internet and extranets are all examples of electronic media that may contain PHI.

There is a distinction between the HIPAA terms Privacy Rule (PR) and Security Rule (SR). The Privacy Rule basically defines what data must be protected, regardless of format, and how it can and cannot be utilized by the organization maintaining and responsible for controlling the data. The Security Rule provides requirements for protecting the defined PHI and defines physical safeguards as “physical measures, policies, and procedures to protect a covered entity’s electronic information systems and related buildings and equipment, from natural and environmental hazards, and unauthorized intrusion.”26

The Physical Safeguards section, titled Facility Access Control, requires healthcare facilities to implement policies and procedures to limit physical access to its electronic information systems, and the facility (or facilities) in which they are housed, while establishing that properly authorized access is allowed. This includes implementation of procedures to control and validate a person’s access based on role (or function), including visitor control, and control of access to software programs.

The Facility Security Plan section requires defining and documenting the safeguards used to protect the healthcare facility. It is a “reasonable and appropriate” expectation that requires implementation of policies and procedures to safeguard the facility, and the equipment housed within it, from unauthorized physical access, tampering and theft. This includes documenting repairs and modifications of physical security equipment and safeguards on a regular basis, including changing locks, making routine maintenance checks and installing new security devices.

The Facility Security Plan should establish basic expectations for the physical attributes of the building proper, and of the areas surrounding a specific workstation, or class of workstation, that can access electronic protected health information. At their heart, workstation security safeguards should be designed to restrict access to PHI to authorized users only.

The device and media controls of the security plan are expected to govern the receipt and removal of hardware and electronic media that contain PHI, into and out of the facility, as well as the movement of these items within the facility. Disposal of electronic media containing PHI should ensure the devices are rendered unusable and/or the data inaccessible. This may include degaussing the media or physically damaging the device beyond repair.

It is obvious from the Security Rule requirements that HIPAA compliance in the healthcare environment involves several disciplines, including security, risk management, and information technology. It requires a multidisciplinary management approach to implementing compliance policies and procedures.

With regard to HIPAA enforcement activities (nonprivacy), the Centers for Medicare &amp; Medicaid Services (CMS) continues to operate a complaint-driven process, addressing complaints filed against covered entities by requesting and reviewing documentation of their compliance status and/or corrective actions.

Historically, the OCR’s approach to Privacy Rule violations has been passive; however, this started to change in 2011 when the OCR fined Cignet Health in Maryland $4.3 million for its violation of the Privacy Rule. The OCR found Cignet had violated the rights of 41 patients by denying them access to their medical records.27 In the same time period, OCR reported a $1 million settlement with Massachusetts General Hospital for losing 192 patient records.

The American Recovery and Reinvestment Act of 2009 (ARRA) established a tiered civil penalty structure for HIPAA violations noted in Table 1-2. The Department of Health and Human Services still has discretion in determining the amount of the penalty, based on the nature and extent of the violation, but is prohibited from imposing civil penalties (except in cases of willful neglect) if the violation is corrected within 30 days.

Table 1-2. HIPAA Violations and Enforcement28

HIPAA Violation | Minimum Penalty | Maximum Penalty
Individual did not know (and by exercising reasonable diligence would not have known) that he/she violated HIPAA | $100 per violation, with an annual maximum of $25,000 for repeat violations (Note: maximum that can be imposed by State Attorneys General regardless of the type of violation) | $50,000 per violation, with an annual maximum of $1.5 million
HIPAA violation due to reasonable cause and not due to willful neglect | $1,000 per violation, with an annual maximum of $100,000 for repeat violations | $50,000 per violation, with an annual maximum of $1.5 million
HIPAA violation due to willful neglect but violation is corrected within the required time period | $10,000 per violation, with an annual maximum of $250,000 for repeat violations | $50,000 per violation, with an annual maximum of $1.5 million
HIPAA violation is due to willful neglect and is not corrected | $50,000 per violation, with an annual maximum of $1.5 million | $50,000 per violation, with an annual maximum of $1.5 million
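The tiered structure in Table 1-2 can be expressed as a simple lookup. In this sketch the tier keys are shorthand labels of my own, not statutory terms, and only the per-violation minimums and the shared $50,000/$1.5 million ceilings from the table are encoded:

```python
# Per-violation minimums from Table 1-2; the per-violation maximum
# ($50,000) and annual cap ($1.5 million) are the same for every tier.
PENALTY_TIER_MINIMUMS = {
    "did_not_know":        100,      # could not have known, with diligence
    "reasonable_cause":    1_000,    # reasonable cause, not willful neglect
    "willful_corrected":   10_000,   # willful neglect, corrected in time
    "willful_uncorrected": 50_000,   # willful neglect, not corrected
}
MAX_PER_VIOLATION = 50_000
ANNUAL_CAP = 1_500_000

def penalty_range(tier):
    """Return (minimum, maximum) penalty per violation for a tier."""
    return PENALTY_TIER_MINIMUMS[tier], MAX_PER_VIOLATION

# For the top tier, minimum and maximum per violation coincide.
assert penalty_range("willful_uncorrected") == (50_000, 50_000)
assert penalty_range("reasonable_cause") == (1_000, 50_000)
```

The spread between tiers shows how the ARRA structure scales culpability: the floor rises five hundredfold from the lowest tier to the highest, while the ceiling stays constant.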

The term “convergence,” meaning a coordinated effort between the security and information technology functions, is clearly at play in managing HIPAA requirements, in addition to other major areas of managing security risk. The security function of the healthcare organization will be charged with, and responsible for, the physical protection of the data and for handling law enforcement disclosures.

It is not uncommon to receive requests for information from law enforcement officials who are unaware that their requests violate HIPAA, other laws, or confidentiality protocols. An example of such a situation occurred in Wisconsin, where a nurse was prosecuted for refusing to release information to law enforcement, citing the rules of HIPAA. The nurse was charged with obstructing an officer and contempt of court for refusing to allow an officer to serve a patient with a restraining order.29 It would appear there was no request by law enforcement for PHI, and that simply acknowledging the patient was in the facility at a specific location would not have violated the HIPAA Privacy Rule.


URL: https://www.sciencedirect.com/science/article/pii/B9780124200487000015

What does the Health Insurance Portability and Accountability Act (HIPAA) allow patients to do?

The law was created to give individuals more control and access to their medical information, protect individually identifiable medical information (protected health information) from threats of loss or disclosure, and simplify the administration of health insurance claims and lower costs.

What are the main purposes of HIPAA?

The HIPAA legislation had four primary objectives: assure health insurance portability by eliminating job-lock due to pre-existing medical conditions; reduce healthcare fraud and abuse; enforce standards for health information; and guarantee the security and privacy of health information.