
What is Data Preparation and Why Does it Matter?




A 2012 article in the Harvard Business Review called ‘data scientist’ the “sexiest job of the 21st century”, but eight years down the line, data scientists themselves aren’t feeling too happy with the job. Their main gripe: data preparation.


Research suggests that data scientists spend anywhere between 50% and 80% of their time preparing data for analysis, and they are not happy about it. Although it makes up such a large part of the job, data scientists find data preparation to be tedious, “janitorial” work which, according to Jawbone VP Monica Rogati, “at times, feels like everything we do”.

In this article, we will take a look at the data preparation process: what it is, why it matters, how it works, where it's used, and where it's headed.


Topics covered:

  • What is data preparation?

  • Why does data preparation matter?

  • How does data preparation work?

  • Where is data preparation used?

  • What is the future of data preparation?

What is Data Preparation?

Data preparation is the process of cleaning and transforming large amounts of raw data for analysis. It is the first step in the data science process, leading into data exploration and analytics. Data preparation takes a lot of time and effort, but data scientists need to prepare their data before they can extract any useful insights from it. Through data preparation, data from different sources with differing levels of quality can be merged into a clean, uniform format. Most artificial intelligence and machine learning (AI/ML) systems depend on big data to function, and for big data to be useful, it needs to be prepared first.

For example, the e-commerce site Groupon depends heavily on data preparation services to connect its subscribers with activities, travel opportunities, retailers, and other services. Groupon collects up to one terabyte of data per day, which is cleaned, transformed, and analyzed using a big data management platform, and shared with the sales and marketing team for insight.

Why does data preparation matter?

Achieving a clean and consistent format is crucial when data needs to be mined for insights. Without preparation, machine learning programs can miss important patterns in the data and fail to include them in the analysis. Data prep can help:

  • Fix errors: Data prepping helps catch errors at the source. Once the data has moved to the analysis stage, errors become much harder to catch and correct.

  • Improve quality: Data cleansing and transformation will ensure that all data is accurate and of high quality.

  • Increase efficiency: High-quality data can be processed and analyzed more quickly, leading to more efficient decision-making.





Looking for more AI/ML insights? Check out:

Supervised vs. Unsupervised Learning in Machine Learning

Artificial Intelligence vs. Machine Learning

Artificial Intelligence in Cyber Security





How does data preparation work?

The type of data can vary depending on the business it is being prepared for, but the basic framework of data preparation remains the same. The following five steps are common to all forms of data preparation:

  1. Accumulation: The first step is to gather the relevant data. This could come from an existing data catalog, or be entered manually.

  2. Assessment: Once data has been gathered it needs to be given context, also known as data discovery. Here the data is studied, and patterns and outliers in the data are identified. This helps to give the data structure and prepare it for cleansing.

  3. Cleaning: Data cleaning, also called data cleansing, is usually the most time-consuming part of data preparation. This involves identifying errors, filling in missing values, removing extraneous information, and masking sensitive data.

  4. Wrangling: Also known as data munging, data transforming, or data enrichment, wrangling involves updating data entries to make them more understandable. This will aid with analysis later.

  5. Storage: Finally, data is stored for later use, or transferred to a third-party application for analysis and processing.
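The five steps above can be sketched with a toy example in Python using pandas (the sample data, column names, and thresholds here are illustrative assumptions, not from any real pipeline):

```python
import pandas as pd

# 1. Accumulation: gather raw records (here an inline sample; real data
#    would come from a data catalog, database, files, or manual entry).
raw = pd.DataFrame({
    "customer_id": [101, 102, 102, 103, 104],
    "purchase_total": ["49.99", "n/a", "n/a", "120.50", "75.00"],
    "email": ["a@x.com", "b@x.com", "b@x.com", None, "d@x.com"],
})

# 2. Assessment: study the data to discover its structure and problems.
print(raw.describe(include="all"))
print("missing values per column:\n", raw.isna().sum())

# 3. Cleaning: remove duplicates, fix bad values, fill in missing
#    entries, and mask sensitive data (the email local part).
clean = raw.drop_duplicates(subset="customer_id").copy()
clean["purchase_total"] = pd.to_numeric(clean["purchase_total"], errors="coerce")
clean["purchase_total"] = clean["purchase_total"].fillna(clean["purchase_total"].median())
clean["email"] = clean["email"].fillna("unknown").str.replace(r"^[^@]+", "***", regex=True)

# 4. Wrangling: enrich the data to make it easier to analyze later,
#    e.g. a categorical spend tier derived from the numeric total.
clean["spend_tier"] = pd.cut(
    clean["purchase_total"],
    bins=[0, 50, 100, float("inf")],
    labels=["low", "mid", "high"],
)

# 5. Storage: persist the prepared data for analysis or hand-off
#    to a third-party application.
clean.to_csv("prepared_customers.csv", index=False)
```

In a production pipeline each step would be far more involved (profiling tools for assessment, validation rules for cleaning, a warehouse or lake for storage), but the shape of the workflow is the same.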

Where is data preparation used?


Data preparation is a necessary and important component of big data analytics and data science. Therefore, any industry that uses big data will need to have that data prepared, including:


Retail: Retailers need to correctly anticipate the needs of their customers, and have a plan to address them if they want to stay competitive. To be able to create this kind of predictive modeling, data scientists use large amounts of data collected from customer profiles (purchase history, interests, frequency of purchase, etc.). This data is cleaned and formatted through data preparation, to make it suitable for modeling.


Medicine: The healthcare industry has begun to invest heavily in the data science process. Large amounts of patient data are now handled by big data platforms, and healthcare professionals depend on the insights generated by them to help with diagnosis.

Social security: Governments around the world are looking into using data preparation and big data analysis in public services, especially social security. In the US, the Social Security Administration (SSA) uses data prep to comb through the large amounts of unstructured data it receives from disability claims. Structuring the data through data transformation can also help detect fraudulent claims.


Mass Media: Entertainment is consumed across a variety of different platforms and media. Data prepping can help collate input on consumer interests and preferences, from across multiple media, and make them ready for analysis.


What is the future of data preparation?

We live in the data era. Data scientists may not enjoy working on data preparation, but they cannot deny its importance. Any industry that depends on big data collection and analysis needs data preparation services to help it understand the data. As things stand now, human-driven data preparation is still a key part of data science, and its importance will only grow as more and more businesses adopt AI/ML solutions. Data sets are only going to get bigger, more complex, and faster-moving.


There have been some recent attempts at automation. IBM, for example, has been working on automating parts of the data preparation process to reduce the time data scientists spend cleaning and transforming big data, giving them more time to focus on analytics. As the call for automation in data preparation grows stronger, we may start to see increasing use of augmented analytics programs in data management, easing some of the burden on data scientists.


Is automation then the future of data preparation? Possibly.



Are you looking for a job in Information Technology?

See all of our current openings here!





Check out our latest video on YouTube!



About the Company:

Peterson Technology Partners (PTP) has been Chicago's premier Information Technology (IT) staffing, consulting, and recruiting firm for over 22 years. Named after Chicago's historic Peterson Avenue, PTP has built its reputation by developing lasting relationships, leading digital transformation, and inspiring technical innovation throughout Chicagoland.

Based in Park Ridge, IL, PTP's 250+ employees have a narrow focus on a single market (Chicago) and expertise in four innovative technical areas:

  • Cybersecurity

  • Artificial Intelligence

  • Data Science

  • Cloud & DevOps

PTP exists to ensure that all of our partners (clients and candidates alike) make the best hiring and career decisions.

Connect: LinkedIn | Facebook | Twitter | YouTube

Review: Google | Glassdoor | Yelp | All Review Links

Apply: Stack Overflow | Dice | LinkedIn | Glassdoor | All Job Openings

Peterson Technology Partners is an equal opportunity employer.
