Data Disposition: Answering the “Why”, “What”, and “How” of Data Minimization

In our previous entries, we’ve discussed WHY it’s so important to have policies and procedures in place for proper data management. Understanding the necessity of a data minimization and disposition program is the first step to properly securing and managing your data lifecycle. But what comes next? Over the next two articles, we’re going to discuss identifying WHAT you should remove as part of your data governance workflow. In an upcoming article series, we’ll look at the key factors in HOW that data needs to be disposed of. 

 Part 1: Define the Policies That Govern Data 

An organization should maintain a baseline set of policies governing its retention of data. Unfortunately, there is no “one size fits all” recommendation for defining something like a retention policy. It all depends on the needs of the organization, so it’s critical that those responsible for data establish which policies should be enforced. Considerations may include: 

  • Data age: It’s been said that making decisions based on old data is worse than making decisions based on no data. Understanding your organization’s need for fresh, updated information is critical, and this varies from business to business. Many research organizations treat 5-year-old data as obsolete, while other organizations won’t think of removing data until it’s crossed the 10-year mark. Naturally, the speed of change and development in your organization plays a key role in this decision. 
  • Regulatory requirements: These determinations can affect both data that needs to be kept longer and data that needs to be removed. Some governing entities like the SEC require that you hold on to certain data for at least 7 years. However, privacy regulations like the GDPR and CCPA, along with the data subject access requests (DSARs) they enable, may require you to remove data on clients and customers much sooner than that. 
  • Legal requirements: Due to the nature of some data, it’s a legal necessity to maintain all records associated with some historical issue. For instance, we’ve seen cases where organizations that used to manufacture products with asbestos are still required to maintain all documents surrounding the use of it even to this day. 
  • Usage: This consideration goes hand-in-hand with Data Age. You may have a policy in place that data older than 5 years can and should be disposed of. However, there are exceptions to a policy like this: you may have foundational or framework information that is still used on a daily basis and should be exempted. This may require developing structural guidelines around where your data is stored and applying the policy differently to data in those repositories. 
  • Duplication: There are myriad reasons why data gets duplicated on networks, including errant processes, backups, differing permissions, and convenience, just to name a few. Developing a policy for how duplicative data should be handled on the network is critical: not only does duplication create a storage issue by simply wasting space, it also leads to branching of data sources. If a document exists in two places and separate people begin to work on separate copies, you now have unique documents that have branched from one another, which may lead to incomplete or erroneous decision making. Duplication is one of the largest problems we see in systems today, and it requires effective data management to control. 
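To make the Data Age and Duplication considerations above concrete, here is a minimal sketch of how a retention review might be automated on a simple file share. The 5-year threshold, the `scan_for_disposition` name, and the use of modification time and SHA-256 content hashing are illustrative assumptions, not a prescribed implementation; a real program would also consult regulatory and legal-hold rules before flagging anything.

```python
import hashlib
import os
import time

RETENTION_YEARS = 5  # illustrative threshold from the "Data age" example above
SECONDS_PER_YEAR = 365 * 24 * 3600

def scan_for_disposition(root, now=None):
    """Walk a file tree and flag candidates for disposition review:
    files past the retention threshold, and byte-identical duplicates."""
    now = now or time.time()
    aged = []            # files older than the retention window
    seen_hashes = {}     # content hash -> first path seen with that content
    duplicates = []      # (duplicate path, original path) pairs
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            # Age check: last-modified time older than the retention window.
            if now - os.path.getmtime(path) > RETENTION_YEARS * SECONDS_PER_YEAR:
                aged.append(path)
            # Duplication check: hash file contents and compare.
            with open(path, "rb") as f:
                digest = hashlib.sha256(f.read()).hexdigest()
            if digest in seen_hashes:
                duplicates.append((path, seen_hashes[digest]))
            else:
                seen_hashes[digest] = path
    return aged, duplicates
```

Note that the output is a review list, not a delete list: per the Usage consideration, foundational documents in exempt repositories would be filtered out before any disposal step.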


Below are a few ways we can help you on your journey integrating responsible data intelligence into your company:

  1. Schedule a demo with us. We can help show you around, answer questions, and help you see if iCONECT is right for you. Click Here!
  2. Learn more about our platform. Click Here!
  3. Download our RFP Toolkit. See how we stack up and let us help you through your journey. Click Here!
  4. Share this blog post with someone who you think would benefit from it! Share via LinkedIn, Facebook or email.


Jonathan Younie, Product Manager, Data Governance – LINKEDIN PROFILE  

As an information systems management professional with over 25 years’ experience, Jonathan oversees the iCONECT Data Governance program. Throughout his career, Jonathan has been responsible for oversight of organizations’ entire data, network, and software engineering landscape with a focus on information security. In his most recent role as CTO and CISO for Ramsey Quantitative Systems, Inc. (RQSI), a quantitative investment firm, Jonathan was responsible for planning and overseeing the development of the company’s internal software systems, network infrastructure, disaster recovery, and business continuity, as well as ensuring the safety of the organization’s intellectual property. Jonathan brings an extensive understanding of organizational data needs to the iCONECT team and plays a key role in helping our clients find opportunities to manage and control their data so they can focus on driving productivity and progress. He holds more than 15 technology certifications in the software, network, and security engineering fields, as well as technical training certifications.