Compliance Doesn’t Equal Security, But What About Privacy?
Data privacy, or information privacy, typically refers to the ability of an individual to determine how, when, and to what extent their personal information can be shared. Privacy is the protection of personal data from unauthorized access, as well as the right to control how that data is collected, stored, and disclosed.
Overlap of Privacy, Security, and Compliance
Data privacy, data security, and compliance most definitely overlap, but they are not all the same. As stated above, privacy is the protection of individual rights and the control of personal data. Security includes the physical and technical controls that are put in place to protect data from compromise. And compliance is the practice of ensuring organizations are following specific third-party regulations and/or requirements for protecting data.
A particular data system can be considered compliant because it is meeting the minimum compliance requirements, but that doesn’t necessarily mean it is fully secure. And a system can be secure, but still violate privacy if authorized users of the system collect or use personal data in inappropriate ways. At the same time, organizations are unable to build and maintain a comprehensive data privacy program without basic security controls to protect that data from potential cyberattacks. Working together, privacy, security, and compliance can help protect organizational data and reduce overall risk.
Data Privacy Requirements
Since the onset of the COVID-19 pandemic, data privacy has become a more prevalent concern for consumers, due both to the sensitivity of sharing health-related information and to the fact that more and more services have moved online. Every day, another news headline reminds us of yet another breach of our personal data. Along with growing compliance requirements, there is now an expectation that organizations have comprehensive privacy programs in place, and implementing one should be a priority for organizations that don't yet have one.
Colleges and universities collect and store a vast amount of data, and students today want to know that their financial and health information, as well as their education records, are protected and not used for purposes they have not approved. They want to be made aware of any personal information shared beyond the original recipient, and to have a choice in whether or not their personal information is released. Higher education has always had a mix of federal compliance requirements that protect specific types of sensitive data, but general consumer privacy requirements are increasing and becoming more strictly enforced. Here are several key requirements and how they apply:
Family Educational Rights and Privacy Act (FERPA)
FERPA protects the privacy of student educational records and prohibits educational institutions from disclosing personally identifiable information in education records without written consent.
Health Insurance Portability and Accountability Act (HIPAA)
HIPAA safeguards Protected Health Information (PHI) and applies to both student and non-student PHI. Colleges and universities that conduct HIPAA-covered transactions electronically, such as insurance claims, are considered healthcare providers under HIPAA.
Gramm Leach Bliley Act (GLBA)
GLBA protects consumer financial data and applies to how higher education institutions collect, store, and use student financial records containing personally identifiable information (PII). For example, records regarding tuition payments and/or financial aid data must be protected in accordance with GLBA. GLBA regulations include both a Privacy Rule (that focuses on requirements around customer privacy notices) and a Safeguards Rule (covering requirements for protecting the confidentiality and security of customer information).
General Data Protection Regulation (GDPR)
The GDPR, which went into effect in May 2018, remains the strictest privacy regulation, and applies to any organization with a physical presence in the EU, as well as any institution that handles the personal information of EU residents.
California Consumer Privacy Act (CCPA)
The United States does not have a national data privacy law, but there are now several state privacy laws in place that must be monitored. The CCPA was the first comprehensive state privacy law and offers broad protections for consumers, including guidelines for how for-profit organizations must handle data belonging to any resident of California. It went into effect in January 2020, and most recently, in July of 2021, California's Attorney General provided a CCPA update and a list of 27 enforcement actions his office had taken. The California Privacy Rights Act (CPRA) goes into effect in January 2023 and will supersede and expand the protections of the CCPA.
A non-profit organization might be exempt from the CCPA, but its third-party service providers most likely are not. If they are processing information on behalf of your organization, the responsibility for their compliance and their ability to protect the personal information of your users falls, at least in part, on your organization. That is why it is so important to thoroughly assess vendors before allowing them access to data.
Virginia Consumer Data Privacy Act (CDPA)
The Virginia CDPA establishes a framework for controlling and processing personal data in the Commonwealth. This Act was greatly inspired by the CCPA, and has an effective date of January 1, 2023.
Colorado Privacy Act (CPA)
The CPA became the third comprehensive state data privacy law when it was signed in July 2021. It was largely modeled after the CCPA and CDPA and will go into effect on July 1, 2023.
Many of the requirements within the state laws reflect those of the GDPR, including provisions such as the right to access personal data, the right to request that it be deleted (the so-called "right to be forgotten"), and the right to opt out of the sale of personal information. It is important to keep the organization's privacy program up to date and maintain compliance with all applicable state, federal, and international laws, as well as industry privacy standards. However, understanding the ins and outs of changing privacy legislation can be difficult for compliance teams. With pending data privacy legislation in Massachusetts, Minnesota, New York, North Carolina, Ohio, and Pennsylvania, it is only going to get more confusing. So, what CAN you do?
Building a Privacy Program
Rather than trying to tackle each compliance regulation individually, below are some recommended steps for building a comprehensive privacy program for your organization:
1. Establish a Privacy Team
The organization’s Privacy Team or committee should include executive leadership from both the academic and business sides, as well as leaders responsible for information security, compliance, and ethics, and your general counsel. The organization may also be required to appoint a Data Protection/Data Privacy Officer who is responsible for leading the program.
2. Assess data collection and processing
It is critical to understand where sensitive data lives throughout the organization. Inventory the organizational data from all areas and develop an information map of the systems and applications in use:
a. What data is being collected?
b. Where is the data being sourced?
c. Why is the data collected/for what purpose?
d. How is it processed?
e. Who has access?
f. Who is responsible for managing the systems?
g. How long is the data retained?
h. Where is the data transferred to?
i. How is data disposed of?
This may also be an opportunity to assess the reasons why information is being collected across your various departments. Focus on data minimization and collect only what is necessary to provide the intended services.
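The inventory questions above can be captured as one structured record per system or application, which makes the information map easy to maintain and query. A minimal sketch in Python; the field names and the minimization check are illustrative, not drawn from any particular standard:

```python
from dataclasses import dataclass, field

@dataclass
class DataInventoryRecord:
    """One entry in the information map, answering the inventory
    questions (a)-(i) for a single system or application."""
    system_name: str
    data_collected: list          # a. what data is being collected?
    source: str                   # b. where is the data sourced?
    purpose: str                  # c. why is the data collected?
    processing: str               # d. how is it processed?
    authorized_roles: list        # e. who has access?
    system_owner: str             # f. who manages the system?
    retention_days: int           # g. how long is it retained?
    transferred_to: list = field(default_factory=list)  # h. transfers
    disposal_method: str = "secure deletion"            # i. disposal

    def unnecessary_fields(self, required: set) -> list:
        """Data minimization check: flag collected fields that are
        not required for the stated purpose."""
        return [f for f in self.data_collected if f not in required]
```

A departmental review can then iterate over all records and question any field flagged by `unnecessary_fields`, supporting the collect-only-what-you-need principle described above.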
3. Risk Assessment
Once an organization knows where its data is, it’s time to figure out who is responsible for the different data sets and how each should be managed. In this step you will identify any risks to personal information and prioritize the overall impact.
a. Review current policies and procedures in place that govern personal data
b. Identify biggest areas of risk and gaps in compliance
c. Develop an action plan, prioritize actions, and develop strategies for controlling risks
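One common way to carry out the prioritization in step (c) is to score each identified risk by likelihood and impact and rank by the product. A simplified sketch; the rating scale and example risks are hypothetical:

```python
def prioritize_risks(risks):
    """Sort identified risks by a simple likelihood x impact score.

    `risks` is a list of (description, likelihood, impact) tuples,
    where likelihood and impact are each rated 1 (low) to 5 (high).
    Returns the list ordered highest score first, so the action plan
    addresses the biggest gaps before the smaller ones.
    """
    return sorted(risks, key=lambda r: r[1] * r[2], reverse=True)

risks = [
    ("Unencrypted backups of student financial records", 3, 5),
    ("Stale accounts retain access to health data", 4, 4),
    ("Paper records stored in unlocked cabinets", 2, 3),
]
for desc, likelihood, impact in prioritize_risks(risks):
    print(f"{likelihood * impact:>2}  {desc}")
```

Real programs typically use a fuller risk register (owners, treatment status, residual risk), but even this ordering gives the Privacy Team a defensible basis for sequencing remediation work.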
4. Third Party Vendor Management
As organizations outsource an increasing number of services to third parties, it is critical to have a vendor management process that can determine if service providers are protecting data in accordance with the regulations that apply to your organization. Whenever an area wants to deploy a new vendor or solution that will have access to sensitive data, a security and privacy risk assessment is essential.
a. Review if/how data is being shared with the third party under consideration
b. Create and maintain an inventory of vendor involvement
c. Update contract language to cover personal data processing
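The vendor inventory in step (b) can double as a compliance checklist: flag any vendor that receives personal data but lacks a completed risk assessment or data-processing contract language. A hedged sketch with illustrative fields:

```python
from dataclasses import dataclass

@dataclass
class VendorRecord:
    """One entry in the third-party vendor inventory (fields are
    illustrative, not a formal schema)."""
    name: str
    data_shared: list        # personal data elements the vendor receives
    assessment_completed: bool   # security/privacy risk assessment done?
    dpa_in_contract: bool        # data-processing language in contract?

def vendors_needing_review(inventory):
    """Return names of vendors that receive personal data but are
    missing either a completed assessment or contract coverage."""
    return [
        v.name for v in inventory
        if v.data_shared and not (v.assessment_completed and v.dpa_in_contract)
    ]
```

Running this check before each renewal cycle keeps the inventory from silently drifting out of compliance as vendors and data flows change.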
5. Policy Development
a. If data is going to be used for a secondary purpose, individuals must be notified.
b. Clearly define your policy for handling data access requests and how users can request changes.
c. Review breach investigation and notification policies.
6. Security Controls/Procedures
Once risks have been defined, it is important to implement security controls and safeguards to remediate each risk, whether the information is stored electronically or in paper files.
a. Implement and enforce access controls.
b. Implement a chain of custody process documenting the history of the possession and handling of information/records.
c. Develop or update any needed security procedures and necessary controls (e.g., encryption).
d. Define and update the incident response process and facilitate breach response exercises.
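The chain-of-custody process in step (b) can be implemented as an append-only, hash-linked log, so altering any earlier entry invalidates every later one. A minimal sketch, with illustrative record fields:

```python
import hashlib
import json
import time

def add_custody_entry(chain, record_id, action, handler):
    """Append a custody event; each entry embeds the hash of the
    previous entry, linking the log end to end."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    entry = {
        "record_id": record_id,
        "action": action,        # e.g. "accessed", "transferred", "disposed"
        "handler": handler,
        "timestamp": time.time(),
        "prev_hash": prev_hash,
    }
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    chain.append(entry)
    return entry

def verify_chain(chain):
    """Recompute every hash and link; returns False if any entry
    was altered or reordered after it was written."""
    prev = "0" * 64
    for entry in chain:
        if entry["prev_hash"] != prev:
            return False
        body = {k: v for k, v in entry.items() if k != "hash"}
        if hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest() != entry["hash"]:
            return False
        prev = entry["hash"]
    return True
```

For paper records, the same idea applies manually: each transfer form references the previous custodian's signed entry, so gaps in handling are detectable.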
7. Educate staff and users
Develop a training/awareness program and ensure all users involved in the processing of personal information understand the importance of data protection, data privacy principles, and procedures that are in place to ensure compliance.
Colleges and universities should also inform their students about what data is being collected from them and how that data is being stored, used, and protected; allow students to update their own data on demand; and provide students with the option to opt out of sharing at any time. Transparency is critical because unless students have access to privacy policies and practices, and understand how institutions plan to use their data, it is difficult for them to trust an institution's ability to protect it.
A comprehensive privacy program will define how data is collected, stored, and used. It should define who can access that data, when they can access it, and under what specific conditions. The program must also include a data classification process and a record retention policy so all staff understand how long different types of information can be kept. Take the time to educate the community about risks and best practices, advocate for a shared responsibility to protect data, and develop a culture that inspires awareness and knowledge throughout all departments.
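The data classification and record retention policy described above can be expressed as a simple lookup that flags records past their retention period. A hedged sketch; the classes and periods below are hypothetical examples for illustration, not legal guidance:

```python
from datetime import date, timedelta

# Hypothetical retention periods per data classification, in days.
# Actual periods must come from counsel and applicable regulations.
RETENTION_PERIODS = {
    "education_record": 5 * 365,   # e.g. FERPA-covered records
    "financial_record": 7 * 365,   # e.g. GLBA-covered records
    "marketing_contact": 365,
}

def is_past_retention(classification, created, today=None):
    """Return True if a record of this classification should have
    been disposed of by `today` under the policy table above."""
    today = today or date.today()
    period = RETENTION_PERIODS[classification]
    return today > created + timedelta(days=period)
```

A periodic job over the information map can use a check like this to queue records for disposal, so staff are not left guessing how long each type of information can be kept.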
If you had to choose between security, compliance, and privacy, which would you pick? Don’t answer that, it’s a trick question! A compliant organization will have solid administrative, technical, and physical security safeguards to ensure confidentiality, integrity, and availability of data, as well as the policies and practices in place to enforce privacy. Your privacy program should allow your organization to prove (through all documented policies and practices) that it has taken every step possible to meet various compliance requirements, protect information from compromise, and ensure data privacy.
Additional guidance from the Security Advisor Team below:
[Bivens]: "Privacy" can mean so many different things to different people that it's easy to overlook privacy protection until something bad happens, and by then the damage—monetary or reputational—may be severe.
To address citizens' privacy concerns, more and more governments are passing privacy-protection laws. These statutes don't always overlap cleanly, and can frustrate management's search for one-size-fits-all compliance solutions.
Meanwhile, many companies are learning—sometimes the hard way—how an increase in the amount of data decreases the anonymity of its sources. A single item, such as a street address or telephone number, isn't very revealing, but when combined with enough other information, such as state or county tax records or location data, it can disclose complete identities—and greatly increase liability if said data is leaked or stolen.
To stay ahead of the curve, I advise organizations whose business involves the use of personal information to:
1. Never collect more information than you need. It's often tempting to ask users for "just one more thing while you're here", thinking, perhaps, the information will be useful someday. Weigh the immediate value of the information against the cost of protecting it. A cost-versus-benefit question is less emotional, and is a business decision that's often much easier to make.
2. Track where (and how) customer data is used. Document "users of information" to help answer questions about who has access to personal data (and make your auditors smile a little more). When the security of a system is compromised, knowing what data the breach exposed—and where it came from—can prove invaluable in fulfilling legal reporting requirements and customer notifications. It's been my experience that organizations that can quickly and accurately answer questions about the scope and impact of a breach inspire confidence in customers, partners, and regulators, and improve the outcome of an otherwise destructive event.
3. Keep up with changes in the regulatory landscape so you're not caught off-guard. Just as software updates are a crucial part of a threat-reduction plan, so is knowing—and complying with—the latest statutory and regulatory requirements.
The tangled web of laws, standards, data-retention rules, and reporting requirements has become increasingly difficult to track, and is a key area where vigilance in monitoring may prove well worth its cost.