ETL, ELT, and More: 6 Ways to Manage Multiple Data Sources

ETL, or extract, transform, load, is not just another random business acronym. ETL is a process by which multiple data sources are brought together into one centralized database (or, occasionally, a few). The process involves three essential steps, illustrated in a short code sketch after the list:

  1. The extraction of data from its original source
  2. The transformation of data through deduplication, combination, and quality assurance
  3. The loading of data into the final, central database
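To make the three steps concrete, here is a minimal, illustrative sketch in Python using pandas and SQLite. The file names, column names, and cleansing rules are invented for the example; a production pipeline would add error handling, logging, and scheduling.

```python
import sqlite3

import pandas as pd

# 1. Extract: pull data from the original sources. The file names here are
# hypothetical stand-ins for real source systems (a CRM, an ERP, etc.).
orders = pd.read_csv("crm_orders.csv")
customers = pd.read_csv("erp_customers.csv")

# 2. Transform: deduplicate, combine, and apply basic quality assurance.
orders = orders.drop_duplicates(subset="order_id")
customers["email"] = customers["email"].str.strip().str.lower()
combined = orders.merge(customers, on="customer_id", how="left")
combined = combined[combined["order_total"] >= 0]  # drop invalid rows

# 3. Load: write the cleansed result into the central database
# (SQLite stands in for the real warehouse in this sketch).
with sqlite3.connect("warehouse.db") as conn:
    combined.to_sql("fact_orders", conn, if_exists="replace", index=False)
```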

The ETL process might sound complicated, but rest assured that it is a necessary one for businesses that want to capitalize on their data. Fortunately, tools exist that help companies carry out the ETL process as smoothly as possible. Alongside tools, it is often best to seek out expertise from a partner that specializes in data management and integration. These partners get to know an organization’s unique needs and culture in a way that allows them to craft strategies that best fit the business’s goals.

Before detailing any of these tools and partners, however, let’s dive a little deeper into what ETL means in the context of business. We’ll also cover the difference between ETL and ELT, along with other methods of integrating data from multiple sources.

What is ETL in Business?

ETL is crucial for businesses that want to optimize their ability to analyze their data. Not only does ETL consolidate multiple sources of data into one spot, but it can also enable multiple types of data to work together.

ETL tools make this consolidation possible by enabling the migration of data between a variety of sources, destinations, and tools. But how exactly does this occur?

The Steps of ETL

Extract

In this initial step, the desired data, whether structured or unstructured, is copied from its original sources into a single staging repository. Common data sources at this stage include transactional databases, CRM and ERP systems, flat files, and SaaS applications.

After the various desired sources of data are identified, extraction can generally happen in one of three ways: a full extraction of the entire data set, an incremental extraction of only new or changed records, or extraction triggered by update notifications from the source system.

Transform

The halfway point of ETL is transformation: the cleansing and deduplication that data must undergo to prepare it for effective, accurate analysis. This is the most important, and often the most arduous, step of ETL, typically spanning stages such as cleansing, standardization, deduplication, verification, and sorting.

Transformation is a particularly crucial step of the ETL process to get right because it significantly improves data integrity by ensuring that various types and sources of data reach their end destination in viable and ready-to-use forms.
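For a sense of what this stage looks like in practice, below is a hedged sketch of a transformation step that standardizes formats, deduplicates records, and quarantines rows that fail a quality check. The column names and rules are invented for illustration.

```python
import pandas as pd

def transform(raw: pd.DataFrame) -> pd.DataFrame:
    df = raw.copy()

    # Standardize formats so records from different sources can be compared.
    df["phone"] = df["phone"].str.replace(r"\D", "", regex=True)
    df["signup_date"] = pd.to_datetime(df["signup_date"], errors="coerce")

    # Deduplicate: keep only the most recently updated record per customer.
    df = df.sort_values("last_updated").drop_duplicates(
        subset="customer_id", keep="last"
    )

    # Quality assurance: quarantine failing rows for review instead of
    # silently loading them downstream.
    valid = df["email"].str.contains("@", na=False)
    df[~valid].to_csv("quarantine.csv", index=False)
    return df[valid]
```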

Load

How a load occurs in the ETL process depends mostly on how a company plans to use its data. It is crucial to account for the end host system’s capabilities and how it operates, so that loading doesn’t negatively impact the system’s performance.

Depending on what host system a company uses, there are generally two different ways to load data into a data warehouse: a full load, in which the entire transformed data set is written at once, and an incremental load, in which only new or changed records are added at scheduled intervals. A sketch of both approaches follows.
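Here is that hedged sketch, again using SQLite as a stand-in warehouse. The table and column names are invented, and timestamps are assumed to be ISO-8601 strings so that string comparison orders them correctly.

```python
import sqlite3

import pandas as pd

def full_load(df: pd.DataFrame, conn: sqlite3.Connection) -> None:
    # Full load: rebuild the entire target table in one pass.
    df.to_sql("dim_customers", conn, if_exists="replace", index=False)

def incremental_load(df: pd.DataFrame, conn: sqlite3.Connection) -> None:
    # Incremental load: append only records newer than the last load.
    # Assumes the table already exists (e.g., from an initial full load).
    last = conn.execute(
        "SELECT COALESCE(MAX(last_updated), '1970-01-01') FROM dim_customers"
    ).fetchone()[0]
    fresh = df[df["last_updated"] > last]
    fresh.to_sql("dim_customers", conn, if_exists="append", index=False)
```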

ETL is just one approach (with minor variations) to integrating data from multiple sources.

ETL vs. ELT

ELT is similar to ETL, but the order of operations is different. Instead of transforming data before loading it into the final host system, the data is loaded raw and transformed as needed later on.
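In code, that reordering looks something like the hedged sketch below: the raw extract lands in the target system untouched, and transformation happens later as a query inside that system. SQLite and the table names are illustrative stand-ins for a cloud warehouse.

```python
import sqlite3

import pandas as pd

with sqlite3.connect("lake.db") as conn:
    # Extract + Load: land the raw data as-is, with no upfront cleansing.
    pd.read_csv("crm_orders.csv").to_sql(
        "raw_orders", conn, if_exists="replace", index=False
    )

    # Transform (later, on demand): the target system does the work.
    conn.execute("DROP TABLE IF EXISTS clean_orders")
    conn.execute("""
        CREATE TABLE clean_orders AS
        SELECT DISTINCT order_id, customer_id, order_total
        FROM raw_orders
        WHERE order_total >= 0
    """)
```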

ETL vs. ELT Pros and Cons

When deciding between ETL and ELT, several factors must be considered. Both approaches have their pros and cons.

ETL has been in use longer than ELT, so there are numerous well-established processes and tools to implement it. ETL is also more flexible, since it can be implemented in both on-premise and cloud environments. And because data is transformed before being loaded, ETL allows you to remove or encrypt sensitive data before it reaches the target database.

The downside to ETL is that data is unavailable while undergoing the transformation process. It’s also not suitable for large volumes of data since the transformation stage takes so much time. 

ELT, on the other hand, allows for immediate access to data. All data, whether it’s unstructured or not, is immediately transferred to a data lake, where it can be transformed as needed. While ELT can be more efficient than ETL, if the process involves large volumes of unstructured data, it can be hard to analyze and report on that data. There are also limited tools available to support ELT, and it’s ideally suited to the cloud. 

Because ELT offloads processing from the ETL infrastructure to the target system, it is often used for transfers that involve big data and modern cloud-based data warehouses with powerful processing capabilities. ETL, on the other hand, is often used in scenarios where data requires significant cleansing and manipulation before being loaded into the target system.

Other Data Integration Methods

Other ways to facilitate data integration include data virtualization, change data capture (CDC), data replication, and streaming data integration.

Whichever data integration method a company uses, the same truth applies: Companies need to consider integrating their data in order to gather and utilize profit-boosting and efficiency-improving analytics.

What are ETL Tools?

ETL tools are software programs designed to make data implementation processes easier. There are four common types of ETL tools: enterprise-grade commercial tools, open-source tools, cloud-based tools, and custom-built solutions.

What ETL Tools Should You Be Using?

Which ETL tools a company needs to complete its data transition quickly, easily, and successfully depends on its particular circumstances. Evaluation factors to consider before choosing a tool or suite of tools include, but are not limited to, budget, the data sources and destinations that must be supported, scalability requirements, and the level of in-house technical expertise.

How Kenway Can Help

Kenway is a data management consulting company that offers expertise in the areas of data warehouse modernization, data governance frameworks, and data storage and migration.

Our firm has a steady history of providing companies of all sizes and industries with comprehensive data management and governance solutions. Reach out to us today about taking your first steps towards centralization of data through ETL or another data integration method.

ETL vs. ELT FAQs

Which is better, ETL or ELT?

ETL and ELT both have their advantages and disadvantages, and neither is better than the other. ETL is typically preferred for projects that require significant data cleansing and manipulation. ELT is typically preferred when the target environment has the capacity to transform the data after it has been loaded.

What is the key difference between ETL and ELT?

The key difference between ETL and ELT is the order of the steps in the process. With ETL, data is transformed before it’s loaded into the target database. With ELT, the data is loaded into the target database first and then transformed.

Is ELT replacing ETL?

While ELT is better suited to cloud-based environments and allows for faster access to data, there are still use cases for ETL. ETL is still preferred for moving from on-premise databases to the cloud.

 

Data Literacy Framework: The Key to Accomplishing Your Data Goals

You have a data strategy. You have a roadmap of expensive projects and impressive technology to help you implement them. There are ambitious goals tied to your success. You’ve even created new policies, processes, and procedures. Then, you launch. Low to moderate success is achieved. 

Expectations were so high! The project or tool had so much potential! So, what went wrong? Front-line staff bypassed procedures and entered the bare minimum data to fly through tasks more quickly. Executives ignored carefully developed KPIs in favor of "gut instinct." 

Despite all your preparation, something was missing. Just as a garden’s success is dependent on how well you prepare and enrich the soil, execution of your data strategy relies on how well you prepare and enrich the people implementing it. Everyone, from front-line staff to executives, needs to have the skills to participate. 

That’s why it’s not just enough to invest in tools and planning. Investing in data literacy, so your employees can leverage data and technology to achieve the ambitious goals you set, is also critical. The way you approach this process—your data literacy framework—makes a difference. The right framework should incorporate considerations for your data goals, your workforce’s current capabilities, and the tools you will use to implement your data strategy. 

Here’s why it’s important to cultivate a data-literate workforce and what you should consider as you build your data literacy framework. 

Understanding Data Literacy

According to MIT, data literacy is the ability to read, work with, analyze, and argue with data. To build a data-literate company, employees need to have different levels of competency with each aspect of data literacy. For example, front-line workers need to be more adept at reading and working with data, whereas managers need to be more skilled in analyzing and arguing with data. 

Improving employees’ data literacy skills enables them to read data and the charts built from it, work with data in their day-to-day tools, analyze it for patterns and trends, and argue with it, questioning assumptions and backing decisions with evidence.

With these capabilities, they can incorporate data in their day-to-day tasks and bring your business closer to realizing its data strategy goals. 

Why Is Data Literacy Important?

Businesses that invest in data literacy programs see wide-ranging benefits, from higher levels of productivity to increased data utilization. 

Improve Employee Productivity, Satisfaction, and Propensity to Innovate

According to a Tableau report, data-literate employees are more productive and make faster, better decisions, which translates to a better customer experience. With better access to data, and the skills to use that data effectively, employees are more capable of innovating. Offering a data literacy program also increases loyalty—nearly 80% of employees say they’re more likely to stay at a company that offers data upskilling.

Make Better Use of Data

By closing the data literacy skills gap, you can empower employees to leverage data to solve business challenges. For example, there’s no shortage of people analytics tools available to help HR teams track turnover, engagement, diversity, and other key metrics. When they know how to choose the right data sources, interpret data sets, delineate between causation and correlation, and communicate their findings, they can take full advantage of these tools.

Increase Data Maturity

Improving data literacy is a key aspect of progressing through the stages of data maturity. At the highest level of data maturity, data management isn’t solely the responsibility of IT. Instead, IT works in unison with the larger business to develop and maintain data management strategies, and employees at all levels are capable of using data to drive decision-making.

Barriers to Establishing Data Literacy

Considering these benefits, why is it so difficult to increase data literacy and realize its potential? Cultural and technical hurdles often get in the way. As you build your data literacy framework, it’s important to think about how these barriers impact your business. 

Building a Data Literacy Framework

Effective data literacy programs are geared towards empowering employees at all levels of the organization to use data effectively. According to the above Tableau report, organizations that offer training for a wide variety of skills to all employees see better results than those that only offer narrowly focused training programs. So, how do you offer the right education without overwhelming your workforce?

By following a data literacy framework, you can take a methodical approach to developing a program that will have lasting, tangible benefits for your workforce and your business.

1. Generate the Need 

Because improving data literacy requires a cultural shift, getting leadership buy-in is essential. To get leaders and other key stakeholders on board with the data literacy program, show them its value to the business. 

2. Assess the Data Literacy of Your Workforce 

Based on their previous experience and the data functions they’re currently expected to perform, individual employees’ current data literacy skill levels will vary. By assessing their current capabilities, you can create tailored programs that will drive comprehension and better data utilization.

3. Teach Basic Data Concepts

At this stage, it’s important to get individual contributors engaged with data. They need to understand the “why” behind the project. Teach the value of data, and how it can improve day-to-day workflows, so that individual employees understand its role in their work.

4. Develop a Common Language

Even though you’re asking employees to level up their skills, they shouldn’t be expected to become data gurus. Engage employees at all levels by simplifying difficult concepts and using ordinary language instead of technical jargon.

5. Develop Employee Data Management Capabilities

As employees gain more awareness of and access to data, they play a larger role in maintaining its accuracy, accessibility, and safety. Educate them on the data management best practices and company policies they need to know to promote data integrity.

6. Apply Data Knowledge 

As employees become more data literate, encourage them to use data more often. Promote data as a tool that empowers them to solve problems, innovate, and collaborate with confidence. 

Advance Data Literacy in Your Workforce

A successful data literacy program backed by a solid framework can help your company transform into a data-driven organization. It ensures that the time, energy, and money you invest in your data strategy, roadmap, and tools pay off. Instead of experiencing the disappointment of poor-performing projects, you can execute data initiatives with confidence. 

At Kenway, we understand the important role education plays in the success of any data initiative. Whether you need help developing your data literacy program or want to ensure that literacy is a key component of your next data project, we can help. To learn how we help other companies like yours, read our case studies.


FAQs

What is a data literacy framework? 

A data literacy framework guides your data literacy program. It includes considerations for your workforce’s current capabilities, your data goals, and an educational plan geared towards helping them gradually develop their skills. 

What are the main characteristics of data literacy?

According to MIT, the main characteristics of data literacy are reading, working with, analyzing, and arguing with data. The more competent employees are in these areas, the higher their level of data literacy.

How do you develop data literacy?

To develop data literacy, follow a well-developed plan to promote a data-literate culture. Employees should be encouraged to become more aware of and engaged with data in their day-to-day work. To promote success, an effective data literacy program should be tailored to meet employees at their current level of competency. 

 

Benefits of Data Process Transformation

Digital transformation is hardly a new concept. With 91% of businesses engaged in some form of digital initiative, it’s widely understood that upgrading technology stacks is essential to remain competitive in today’s landscape. 

The problem, however, is how many organizations begin the digital transformation process only for it to fail—84%, in fact. While there’s a variety of reasons behind these failures, here at Kenway Consulting, we’ve been helping organizations conduct successful digital transformations for nearly two decades.

As an organization begins to digitally transform its business processes and data into more actionable forms, roadblocks and unforeseen challenges tend to arise. There are myriad reasons organizations can struggle to achieve a successful digital transformation, but two stick out:

    1. Fragmented, disjointed data processes were in place from the beginning of a company’s transformation
    2. Strategy for the transformation was an afterthought, causing major disruptions as organizations underwent a faulty transformation 

In this blog, we will focus on a vital aspect of digital transformations: the evolution of data processes. A data process transformation is one of the most difficult, yet critical, parts of a digital transformation journey—and having poor data quality comes with a major cost for an organization. Here, we will share key insights we’ve learned throughout our years of experience in helping organizations with transforming their data processes and developing a sound strategy so that they may ultimately reap the benefits rather than fail.

What is Digital Transformation?

Digital transformation is the process of leveraging technology, organizational processes, and people to develop or enhance existing business models and revenue streams.

One of the most popular types of digital transformation is data process transformation, which mainly consists of moving data to the cloud. Moving data from legacy systems to the cloud is typically unique to each use case and involves many intricacies that must be planned for ahead of migration.

When undergoing data process transformation, there are several main considerations for success. Below are a handful of key steps in transforming a company’s data processes (a simplified code sketch of steps 3 and 4 follows the list):

      1. Data Discovery: The identification and understanding of the data being transferred from the legacy system to the new platform. This step is essential for comprehending the data being moved and why it matters to current business processes, so that it can be leveraged correctly.
      2. Data Governance: The collection of clearly defined policies, procedures, standards, processes, roles, and responsibilities that ensure the effective and efficient use of data in enabling an organization to achieve its goals. At this stage, business and data quality rules are defined.
      3. Data Migration: The design and development of a data migration and transformation process that takes the data from its current format within the present-state solution and transforms it to fit the structure of the future solution.
      4. Data Review & Correction: Data steward review and correction of data issues flagged by the data quality rules applied during the data migration and transformation process.
      5. Ongoing Data Assessment: The development of an ongoing process for handling new scenarios and escalating them to the Data Governance Committee to confirm the appropriate application of business rules.
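As a simplified illustration of steps 3 and 4, the hedged sketch below applies hypothetical data quality rules while migrating legacy records and routes failures to a data steward review queue. The field names and rules are invented for the example.

```python
import pandas as pd

# Hypothetical data quality rules, defined during the Data Governance step.
RULES = {
    "missing_account_id": lambda df: df["account_id"].isna(),
    "negative_balance": lambda df: df["balance"] < 0,
}

def migrate(legacy: pd.DataFrame) -> tuple[pd.DataFrame, pd.DataFrame]:
    # Step 3, Data Migration: reshape legacy records to the future-state structure.
    staged = legacy.rename(columns={"acct_no": "account_id", "bal": "balance"})
    staged["dq_issue"] = None

    # Apply each quality rule; record the first rule a row violates.
    failed = pd.Series(False, index=staged.index)
    for name, rule in RULES.items():
        hits = rule(staged) & ~failed
        staged.loc[hits, "dq_issue"] = name
        failed |= hits

    # Step 4, Data Review & Correction: failures go to the steward queue.
    return staged[~failed], staged[failed]
```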

It’s common for an organization to run into disruptions during a data process transformation—expecting the unexpected is crucial, but that also comes with its own set of challenges. As teams navigate how best to approach the evolution of their data processes, seeking external guidance from a group of experts could be the difference between a company’s success and failure. Hiring an expert team of corporate technology consultants helps organizations put their best foot forward when undergoing such an important process.

In a recent case study, Kenway Consulting defined and implemented Data Transformation to support an improved future state for an industry-leading healthcare solutions provider. Read the full data process transformation case study here. 

Benefits of Digital Transformation

Whether consumer habits are shifting or a global pandemic occurs, undergoing a digital transformation, specifically a data process transformation, can quickly become a priority to fit current and upcoming business needs. The benefits of digital transformation are many; they include more accurate forecasting of market trends, improved internal processes, and more data-driven decision-making. Here are a few more:

1. Keeps organizations competitive

Utilization of Business Intelligence (BI) tools to functionally organize data provides companies with insight into everyday activities, allowing leaders to identify more productive operating methods, price risks, and forecast market patterns ahead of their competitors. This kind of visibility into a company’s data pays off: data-driven businesses are 58% more likely to beat revenue goals than those that do not prioritize optimizing their data.

Data-centric companies keep data easily accessible and organized in a way that supports business objectives, empowering employees to deliver the best possible product or service to the end user. This is why data-driven companies are 23 times more likely to acquire customers than their peers. If the data used to make big-picture decisions is easy to gather and analyze, businesses are better equipped to deliver the best possible product for their customers.

2. Improved data quality

In our experience, we’ve observed that seamlessly undergoing a data process transformation is a major hurdle for our clients—specifically when assessing a business's data quality. 

Data quality issues can come in many forms, such as misplaced data, human error, and formatting inconsistencies. In fact, 41% of companies say inconsistent data across their tech stack is their biggest challenge. And as data grows over the years, it becomes harder to draw correlations between data sets, resulting in less effective analytics and insights, as well as diminished business-user trust.

Through the data transformation process, organizations have the opportunity to clean their data and implement more resilient data governance frameworks that ensure accuracy and establish a single source of truth. 

3. Increased data usage

Companies are collecting an overabundance of data on a daily basis, leaving a majority of it under-utilized. In fact, 60-73% of all organizational data is never analyzed for Business Intelligence purposes. As a result, organizations are paying the price in the form of missed revenue opportunities, lower efficiency, and productivity/quality issues. With a Business Intelligence tool, organizational leaders and employees can use data to improve efficiencies and make more informed decisions. In fact, 74% of business leaders expect long-term gains in productivity by making data insights available to frontline employees.


Key Insights for Ensuring Digital Transformation Success

While there are many benefits of digital transformation that pay off in the long term, there are also a variety of risks that are important to mitigate upfront. The biggest risk of all is the high probability of failure, with 70% of all major business transformations being unsuccessful.

As organizations discuss data process transformations and run risk assessments, here are some key considerations to have when developing a migration strategy:  

1. Have a plan to ensure internal adoption

Only 37.8% of organizations report being data-driven. Many internal team members will face a massive learning curve as they learn how to properly manage, store, and utilize data to make decisions.

If a company’s team does not embrace the new technology, the digital transformation will fail for lack of adoption. To help employees not only accept but fully embrace new data processes, a change management strategy is necessary. Successful change management ensures:

Companies that invested in a rigorous change management approach reported a 79% success rate—three times the average for all other initiatives. 

2. Evaluate all potential costs 

The average digital transformation budget for mid- to large-scale companies is $14 million. Because of this, the cost of investing in data process transformation is a common concern for business leaders. But the truth is, while transformation is not cheap, the long-term savings far outweigh the upfront costs. In fact, data-driven companies are 162% more likely to outperform laggards.

3. Assess your resources

From lack of data process transformation expertise to the absence of trained staff to manage the change, there are numerous ways an organization can experience a shortage of adequate resources when developing and rolling out new data processes. 

This lack of expertise can lead to several roadblocks when refining data processes: unclear goals, poor planning, and little to no risk assessment are a few that can make an organization’s data transformation collapse. Not to mention, companies can’t stop running just because they are transforming their data processes—the show must go on. Hiring a corporate consultant can help provide the resources a company needs to ensure its digital transformation is a success. Plus, while the consultants work as an extension of a company’s team, employees can better use their time to focus on the day-to-day. 

Additionally, data isn’t the only thing that needs to be transformed—team structures need to evolve as well. The maintenance required to support an ongoing data process transformation is often overlooked. Organizations need to have a team in place to maintain new data processes once they’ve been integrated. It’s critical for teams to be adequately trained to actively embrace the technology, and for new hires to be made to fill the gaps necessary to ensure successful adoption.

Ace Your Digital Transformation with a Partner You Can Trust

Data process transformation is no longer a want for organizations hoping to remain competitive; it’s a requirement. With so many risks involved in the digital transformation process, hiring a management and technology consulting company with proven experience in data transformation can help avoid bottlenecks and accelerate success.

At Kenway Consulting, we lead businesses to victory by integrating our guiding principles into everything we do. These principles guide how we approach our interactions with our consultants, clients, recruits, and those with whom we network. We believe that the means to success is actually more important than success itself. Our principles focus on the following themes:

      1. Treat each individual with respect
      2. Integrity
      3. Means over outcomes
      4. Communication
      5. Entrepreneurial spirit & tenacity
      6. Value & quality

We’ve been helping organizations successfully implement modern data platforms for nearly two decades. Our team specializes in helping organizations build their IT strategy and the data architecture and design required to support it. Connect with us to learn how we can help with your data process transformation needs.

How to Make the Most of Data Visualization

The human brain can process entire images that the eye sees in as little as 13 milliseconds.

Whether you are the CEO, technology director, or compliance officer within your organization, information in the form of graphs and charts is not only easier to digest, but also promotes data-driven decision-making.

But there is a problem: Most organizations generate massive amounts of data every day that is left unused, taking up storage in its rawest form. Consequently, teams are unaware of the data available to them, and even when they are aware of it, they are unsure how to access or interpret it. Scattered and disorganized data requires hours of manual consolidation, cleansing, and validation, and the output is ultimately prone to manual errors.

If your teams are inundated with spreadsheets and spending an inordinate amount of time manually gathering, cleansing, and reconciling disparate data sources rather than providing value-add analysis, then it might be time for a change.

If your organization’s leadership is not leveraging organizational data as a valuable asset to drive proactive risk mitigation and decision-making, then the real question is, “How much am I spending by NOT investing in my data?” 

The cold, hard truth is that organizations can no longer afford to rely on spreadsheets and dirty data to make business decisions; however, data visualization can help to automate the consolidation and aggregation of data, equipping teams with the power to quickly interpret information to drive business results and increase overall team efficiency and satisfaction.

What is Data Visualization?

Before we cover how visualizing your data can help your organization, you may be wondering, what is data visualization? 

Data visualization is the transformation of unstructured or raw data into a visual form to communicate complex data relationships and data-driven insights in a way that is captivating and easy to understand. By succinctly summarizing copious amounts of organizational information into visually appealing reports, teams do not have to dissect and analyze underlying data to understand trends over time.

Data visualization bridges the gap between data and action by providing access to real-time metrics, allowing businesses to be better positioned when it comes to: 

In addition, data visualizations provide leaders the opportunity to harness existing data and leverage it to learn from past mistakes, build on past successes, and anticipate developments that drive innovation and accurately predict future outcomes. 

Key Benefits of Data Visualization

The average organization collects data across 400 different sources. However, about 47% of that data goes completely unused because it is disorganized, unstructured, and dirty, which can cost your organization countless hours and dollars.

In order to fully realize the value that your data can offer, good data visualization is imperative. However, investing in data visualization tools and technologies without organizational buy-in or foundational data practices can actually prevent companies from maximizing ROI in the long run. In order to ensure sustainable value realization, you must first establish core data governance practices, clean up data sources, and determine the data needs of your organization.

Once these foundational data practices are in place, data visualization can deliver the following key benefits:

1. Increased Comprehensibility of Data and the Breakdown of Data Silos

Because visual data is processed much faster by the human brain, presenting data in an easily consumable format can streamline organizational production. In contrast to text, which has historically been the preferred medium for exchanging information, humans can process visual images 60,000x faster. Furthermore, data visualization provides a much more interactive approach to displaying data, allowing users to quickly understand the story the data is telling without needing words to provide context. Presenting data to your executives or teams in a visual manner allows for far fewer gaps in communication throughout the enterprise, which can ultimately shorten business meetings by 24% and give you and your teams more time for other value-add initiatives.

Additionally, data visualization can break down data silos within your organization and reduce the amount of time spent on manual reporting. Sixty percent of employees believe they could save six or more hours if static reporting was automated. Business Intelligence tools bridge the gap between siloed data and reporting by utilizing centralized data to display accessible visual reports. Ultimately, implementing a centralized Business Intelligence solution can help prevent wasted efforts on non-value-add activities, while also acting as a catalyst for cross-functional collaboration. 

2. Save Costs & Drive ROI

How can visualizing your data really drive a return on investment? The answer to this question is unique to every organization and depends on the problem you are trying to solve; however, the competitive advantages to investing in Business Intelligence are as follows:

In addition to understanding the tangible benefits of implementing a Business Intelligence solution, it is equally important to note the true costs of not having one. How much will a lack of visibility, process inefficiencies, lost employee productivity, and outdated IT enhancements cost your organization over time? Some potential costs to consider include:

While ROI looks different for every organization, statistics show that data visualization offers an average of $13.01 ROI on every dollar spent. Business Intelligence tools make your data centralized and easily accessible, so employees spend more time on business functions rather than compounding the problem large amounts of data can present.

Data Visualization Best Practices

The quantitative and qualitative benefits of implementing Business Intelligence tools are endless; however, to fully capitalize on your investment in data visualization, you will want to consider these five data visualization best practices:

1. Identify Your Most Critical Data

The first best practice is to establish a core set of data that is most relevant to the entire enterprise. By first defining the business impact you are striving to achieve with data visualization, you can then identify your most critical data elements. Are you hoping to:

Once you identify your most critical data elements, you can begin to strategically and actively reduce the volume of data you do not need and acquire new data elements to paint a holistic picture of your organization.

2. Establish Data Governance

Establishing data governance to aggregate and organize data, by effectively managing data definitions and values, is imperative to lay the foundation for sustainable value realization from an investment in data visualization. A few key first steps include:

But these steps are only the beginning. Keep in mind that data management requires ongoing evaluation of data quality to best promote accurate reporting.

3. Implement a Centralized Data Model 

In order to blend data sources into cohesive visualizations, it is best practice to create a centralized data repository. Whether the aggregation occurs within a reporting tool itself or in a reporting database, it is imperative to blend data sources to provide cross-functional reporting. Benefits of having a centralized data model include:

4. Create a Data-Driven Culture

A key consideration of any digital transformation is to ensure employees within the organization embrace the new technology. In order to increase adoption and combat any resistance to change, it is essential to develop a cultural framework that motivates your employees to leverage the Business Intelligence tools available to them. Of course, this is easier said than done. Today, only 24% of companies report having a truly data-driven culture. A few challenges to overcome include:

In spite of these challenges, a data-driven culture is possible to achieve. You set yourself up for success when:

5. Know the Audience

Another data visualization best practice is to know your audience. When designing reports, it is important to understand who the intended audience is and what information the end user needs. For example, executive-level audiences will require a different level of granularity than employees completing day-to-day tasks. A few dashboards to consider for varying data visualization across audiences include the following:

Rather than simply transplanting information that previously lived in PowerPoint into a Business Intelligence tool, you can fully harness the power of data visualization by asking questions such as:

From there, you can design the right reports by leveraging data to surface actionable insights and improve business performance.

Success Stories: Implementing Data Visualization Into Your Organization

While there is no one-size-fits-all solution when it comes to visual analytics, at Kenway Consulting, our expertise and obsession with all things data have helped us paint the picture of transformative business opportunities for organizations just like yours. Here are a few examples of how Kenway has used data visualization to help organizations ranging from small businesses to large, global enterprises.

Gain Insights From Untapped Data

A mobile application company developed an app focused on virtual engagement to provide cultural institutions with enhanced experiences for their visitors. The application collected large amounts of data from its users but had no way to make that data insightful for clients. The organization was looking for an analytics platform that could:

Kenway developed a fully automated, end-to-end process to support the visualizations needed to help the client’s customers understand the value of their data and make informed decisions. The reports created provided insight into who was visiting their institutions, where visitors were spending most of their time, the most-visited areas of the property, and more. 

Read the full case study here.

Establish a Single Source of Truth 

A leading asset management firm had the goal of harnessing massive amounts of data to become more strategic and intentional in targeting its wealth advisor clients. However, due to numerous data inefficiencies and process gaps across the organization, it struggled to support its sales teams in understanding the full breadth of their relationships with current and potential clients. The company lacked real clarity around advisor profiles such as:

The main problems faced by the organization included:

      1. Siloed data sources
      2. Disconnects in organizational communication
      3. Slow and ineffective processes

Kenway collaborated with the business to understand its needs, analyze the current state, and work with technology teams to build a design that would deliver results. To ensure the organization was set up for success and continued growth, we took the unique approach of working together with the asset management company, as opposed to helicoptering in and leaving them with a design and recommendations that were not tailored to their needs. 

This partnership also allowed the asset management company to cultivate institutional knowledge and build in-house capabilities and data visuals needed to support and adapt the modern data platform over time. This focus on enabling critical business outcomes built upon a solid baseline of governance and architectural capabilities helped to ensure sustainability and long-term success.

Read the full case study here.

Visualize Forecasting Data

When it comes to your business, it is better to be proactive rather than reactive. While we cannot predict the future, business forecasting can help you prepare for potential outcomes. Data visualization can be especially helpful in the development of forecasting charts. 

Forecasting charts analyze your descriptive analytics (historical data) over a specific period of time and provide predictive analytics, or trend lines, that extend past the current date to help you predict future business outcomes. Reliable forecasting can be beneficial when trying to:

At Kenway, we offer Business Intelligence solutions like Power Business Intelligence to bring your business-critical insights to life through customized reports and dashboards. Using applications such as What-If Analysis, your organization can plan for best-case and worst-case scenarios over the next 6-12 months. 
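Tools like Power BI produce these trend lines visually, but the underlying idea can be sketched in a few lines of Python. The revenue figures below are invented; the point is simply fitting a trend to history and extending it past the current date.

```python
import numpy as np

# Descriptive analytics: twelve months of (invented) historical revenue.
months = np.arange(12)
revenue = np.array([110, 115, 121, 119, 130, 134,
                    138, 145, 143, 152, 158, 163], dtype=float)

# Predictive analytics: fit a linear trend line and extend it six months
# past the current date.
slope, intercept = np.polyfit(months, revenue, deg=1)
future_months = np.arange(12, 18)
forecast = slope * future_months + intercept

for month, value in zip(future_months, forecast):
    print(f"month {month}: projected revenue {value:.1f}")
```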

By leveraging historical data as a proxy for future inventory aging, we created a dashboard for a retailer client that forecasts the concentration of aged inventory for future risk mitigation.
Using historical revenue trends as a proxy for future revenue predictions, we created a dashboard that allows the finance team to gain insight into future company performance.

Drive Revenue by Increasing Timeliness and Accessibility of Customer Data

Data visualization can give your organization better insight into customer data, helping it quickly identify and capitalize on new market opportunities.

In order to identify new customers in new markets, you first need a strong understanding of your current customer base. Aggregating and cleansing customer data that is spread across a range of disparate sources, such as sales, accounting, and marketing, can be extremely time-consuming and nearly impossible through conventional methods and Excel spreadsheets. Even if you manage to combine various data sources, surfacing meaningful insights based on criteria such as product line, region, demographic, or sales territory can prove to be even more difficult.

Blending disparate customer data within a Business Intelligence tool allows you to create standardized KPIs, metrics, and visuals to better analyze the characteristics of your current customer base in real-time and become more intentional and strategic with your go-to-market strategy. 

Kenway has vast experience in leveraging analytics and data visualization to reveal a 360-degree view of your customers. To make this information even more powerful, Kenway can also blend external and internal datasets to present a macroeconomic view of your internal data trends.

Leveraging CRM data, we forecast future opportunities based on the expected probability that current sales targets will materialize.
Leveraging CRM data, we also created a network map to show the prospecting synergies across the sales organization for more intelligent targeting.

No matter how savvy your sales organization and business leaders are, their innate ability to identify new opportunities cannot compete with a tool that can quickly analyze and consolidate terabytes of data.

Selecting the Best Data Visualizations For Your Organization

There is no denying it: Enterprise data collection is not slowing down. In fact, over the next two years, it is expected to increase at a 42.2% annual growth rate. As the volume and complexity of data caches continue to proliferate, Business Intelligence and data visualization tools will enable your entire organization to consume the information being collected and make proactive business decisions.

Not sure how to navigate the future of your data? Kenway can help. From surveys and polls to decision support tools for the C-suite, our Power Business Intelligence portfolio highlights how our Business Intelligence engagements have helped transform data into consumable, interactive dashboards and reports that drive business-impacting decisions. Request a free data strategy consultation today.

 

Making Data Insightful

 

Industry: Technology

Solution: Data Warehouse and Data Analytics

Client: Virtual Engagement and Mobile Application Company

The Situation

An organization focusing on virtual engagement provides cultural institutions with enhanced experiences for their visitors by turning mobile devices into personal concierges and expert tour guides and providing options for augmented reality and virtual reality experiences at zoos and parks. In the process, the mobile app collects invaluable usage data from visitors such as time spent in an exhibit, videos watched, paths taken through buildings, and other key indicators that can then be analyzed to make strategic decisions around marketing, value of exhibits, and areas of improvement. The company is setting a new standard for virtual and mobile experiences at cultural institutions and needs to ensure that the data it collects can be monetized and leveraged by its customers.

The Problem

The application collected large amounts of data from users but lacked an effective way to make that data insightful for clients. The organization was looking for help identifying analytical insights from its aggregate user data (such as exhibit engagement patterns, high traffic areas, visitor demographics, etc.) that would be powerful enough to support strategic decision-making and could be sold back to these cultural institutions.

The organization was also interested in understanding what the Business Intelligence (BI) landscape could offer, and what tools were available to help continue building out its analytical framework.  Specifically, they wanted to know more about:

The Solution

Kenway provided a mix of services to build a solution that uniquely met the needs of this organization, including Vendor Assessment, Data Management, BI, Architecture and Design, and Custom Development. Ultimately, Kenway worked to retrieve the data collected through the app and load that information into a newly built Redshift backend database. The team also wrote APIs to pull all data into staging tables, SQL scripts to move that data out of the staging tables and into a normalized data model, and built a Qlik Sense reporting tool to visualize the data.

To determine the best BI tool on the market, Kenway performed a vendor assessment comparing different BI tools on the Gartner Magic Quadrant; Tableau, Qlik Sense, Power BI, and Amazon QuickSight (not on the quadrant) were all considered. Based on an assessment of the organization, Kenway knew the tool would need to provide an end-to-end process from source to dashboards, the ability to handle large volumes of data and scale to support big data, and easy-to-use, intuitive visualizations.

As an aid to this assessment, Kenway’s BI expert also created a written evaluation to weigh and analyze the features offered by each of the options being considered. The paper highlighted key areas of importance such as stress testing, a strengths-and-weaknesses deep dive, total cost, and logistics and implementation. After all comparisons were finished, the recommendation came down to Qlik Sense and Power BI, both having similar features to meet the client’s needs. Once cost and integration factors were considered, Qlik Sense was identified as the best tool to deliver on the client’s defined requirements.

To bring the application usage data into insightful visuals, Kenway provided a combination of Application Development, Data Management, and Analytics services to further expand the capabilities of its client’s existing Amazon Web Services (AWS) architecture. The team used an AWS pipeline to execute SQL and move the data from a staging table to the production table. By providing the right technical skills, Kenway was able to develop a fully functional “Analytics Pipeline” to bring the data into a data warehouse, built on Redshift, and make it available for the analytics tool. To help enrich the demographic data of the users, a third-party data source was brought in and merged with the client’s app usage data. The demographic data was provided monthly through an SFTP site, and Kenway automated its retrieval, loading, and merging into the client’s data set. This additional data source provided more insightful analytics to the customers.
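To give a flavor of that monthly automation, here is a hedged sketch of a retrieve-load-merge job. The host, paths, credentials, and merge key are placeholders, and SQLite stands in for the client’s Redshift warehouse.

```python
import sqlite3

import pandas as pd
import paramiko

# Retrieve: pull the monthly demographics file from the vendor's SFTP site.
client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect("sftp.example.com", username="etl_user", password="...")
sftp = client.open_sftp()
sftp.get("/outbound/demographics.csv", "demographics.csv")
sftp.close()
client.close()

# Load and merge: enrich app usage data with the third-party demographics.
demographics = pd.read_csv("demographics.csv")
with sqlite3.connect("warehouse.db") as conn:
    usage = pd.read_sql("SELECT * FROM app_usage", conn)
    enriched = usage.merge(demographics, on="user_id", how="left")
    enriched.to_sql("app_usage_enriched", conn, if_exists="replace", index=False)
```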

What We Delivered

Kenway delivered a fully automated, end-to-end process that pulled the app data already being stored on AWS using APIs, loaded it to a normalized data model on the newly-built Redshift data warehouse, and visualized it using Qlik Sense BI reports.  The end-to-end solution included the following:

The Result

Kenway developed a fully automated, end-to-end process to support the visualizations needed to help the client’s customers understand the value of their data and make informed decisions. The reports that were created provided insight into who was visiting their institutions, where visitors were spending most of their time, the most-visited areas of the property, and more.

If you’d like to learn more about how Kenway can help with your Analytics Pipeline or our custom development expertise, reach out to us at [email protected].

 

A few examples of the visuals that were created:

Visual 1 – demographics

Visual 2 – app usage summary

Visual 3 – app openings

Visual 4 – traffic patterns

Visual 5 – favorites within the app

Visual 6 – videos watched within the app

 

 

Top Innovations from Snowflake Summit 2021

Over the past year, Kenway has continued to invest in its partnership with Snowflake through a variety of experiences and certifications such as Snowflake SnowPro Core. Snowflake is a cloud data warehouse that unites siloed data, discovers and securely shares data, and executes diverse analytic workloads. We believe there are many opportunities for our clients to reap the benefits of this innovative cloud warehouse and data lakehouse platform.

As one of Kenway’s certified Snowflake SnowPro Core employees, I recently had the opportunity to attend Snowflake’s two-day virtual summit, which provided me with significant insight into the new features Snowflake will soon be releasing to the public. Here are some of the overall themes and highlights that were discussed:

Connected Industries

Every Snowflake user can access data across the cloud, regardless of region, which provides organizations a better means for data collaboration. Some companies are using Snowflake to bring in their data in real time, which allows them to get up-to-date insights on transactions. Snowflake has substantially increased the data sets available on its Data Marketplace. The Marketplace is where companies can share their data rather than make copies of it and then move it around. Another key topic in the Connected Industries domain was Snowflake’s new integration with ServiceNow. This native integration allows the ServiceNow data to be readily available in Snowflake for companies to manage consumer relationships and become part of a central data repository, rather than being siloed.

Another powerful Snowflake feature that helps industries stay connected is its improved data sharing capability. There are three methods for organizations to share data: share it with other accounts, publish to a private exchange, or publish to the public marketplace. The new API will help guide users through the process of making entire databases, tables, views, or functions available to those with whom they’d like to share. Snowflake sharing eliminates the need to create and maintain processes that move data inside or outside of an organization. Only one copy of the data exists, and updates are made available to consumers in real time.
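As a hedged illustration of the first method (sharing with another account), the sketch below issues the relevant Snowflake SQL through the snowflake-connector-python package. The account, credentials, and object names are all placeholders.

```python
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="..."  # placeholders
)
cur = conn.cursor()

# Create a share and expose a database, schema, and table through it.
cur.execute("CREATE SHARE sales_share")
cur.execute("GRANT USAGE ON DATABASE sales_db TO SHARE sales_share")
cur.execute("GRANT USAGE ON SCHEMA sales_db.public TO SHARE sales_share")
cur.execute("GRANT SELECT ON TABLE sales_db.public.orders TO SHARE sales_share")

# Make the share visible to a consumer account. No data is copied;
# the consumer queries the single live copy.
cur.execute("ALTER SHARE sales_share ADD ACCOUNTS = partner_account")
```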

Global Governance

Last year, Snowflake released a feature that allows users to mask data dynamically. Building on that, they have been working on row access policies and object tagging. Personally identifiable information (PII) is difficult and tedious to identify in data, and doing so is usually a manual effort. Snowflake now has two key features that allow its customers to tag data automatically through classification and anonymized views. Customers can then apply functions to the table that will either generalize or suppress that data. These features protect customers’ information while preserving the analytical value needed to generate insights.
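For illustration, a dynamic masking policy is defined once and then attached to a column; row access policies follow the same pattern. A hedged sketch, again via the Python connector, with all names as placeholders:

```python
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="..."  # placeholders
)
cur = conn.cursor()

# Define a masking policy: privileged roles see real values, others do not.
cur.execute("""
    CREATE MASKING POLICY IF NOT EXISTS email_mask AS (val STRING)
    RETURNS STRING ->
      CASE WHEN CURRENT_ROLE() IN ('PII_ANALYST') THEN val ELSE '***MASKED***' END
""")
cur.execute("ALTER TABLE customers MODIFY COLUMN email SET MASKING POLICY email_mask")

# A row access policy restricts which rows each role can see.
cur.execute("""
    CREATE ROW ACCESS POLICY IF NOT EXISTS us_rows AS (region STRING)
    RETURNS BOOLEAN -> CURRENT_ROLE() = 'ADMIN' OR region = 'US'
""")
cur.execute("ALTER TABLE customers ADD ROW ACCESS POLICY us_rows ON (region)")
```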

Security has also been front and center in conversations about storing data in the cloud. To address this concern, Snowflake has a layered security model that includes network security, identity authentication and access management, and single sign-on. One of the latest improvements is private connectivity to users’ Snowflake internal stages: staged data accessed through client apps remains on the private network. On AWS, users can access Snowflake through the S3 PrivateLink capability, where the data stays on the Amazon private network, eliminating the need for proxies. Azure has a similar capability called Azure Private Endpoint. Another important security feature is the new session policies, which can be set at the account level or for individual users, along with database and UI timeouts.

Platform Optimization

Snowflake continues to improve its performance and efficiency; the most recent updates are better compression and query acceleration. All new data written to a warehouse is compressed even further, with some customers seeing 30% cost savings on storage. The query acceleration service boosts performance, which can deliver up to a 15x improvement with lower latency, giving users better predictability on query response times. These changes happen behind the scenes, with no impact or downtime for Snowflake users. In conjunction with the performance and storage improvements, users will soon have a new administrative experience with improved admin screens to better track usage, storage, and costs for their organizations.

Data Programmability

Snowflake users have long utilized task scheduling to help with their data pipelines. One improvement to this feature is serverless tasks, which automatically determine the appropriate amount of compute resources rather than requiring a user-managed warehouse. Another new feature in this domain is schema detection, which can determine the schema for semi-structured file types such as Parquet, ORC, and Avro. Users will soon also be able to store and process unstructured files as they do structured or semi-structured data, eliminating the need to load data from files in the data lake. Developers will also gain the ability to create functions and stored procedures in a SQL-based language rather than just JavaScript.
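A hedged sketch of schema detection on staged Parquet files using Snowflake’s INFER_SCHEMA table function; the stage and file format names are placeholders:

```python
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="..."  # placeholders
)
cur = conn.cursor()

# A named file format tells Snowflake how to read the staged files.
cur.execute("CREATE FILE FORMAT IF NOT EXISTS parquet_fmt TYPE = PARQUET")

# Detect column names and types directly from the files in the stage.
cur.execute("""
    SELECT column_name, type, nullable
    FROM TABLE(INFER_SCHEMA(
        LOCATION => '@my_stage/events/',
        FILE_FORMAT => 'parquet_fmt'
    ))
""")
for column_name, column_type, nullable in cur.fetchall():
    print(column_name, column_type, nullable)
```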

Another new feature built to make developers’ lives easier is Snowflake’s SQL API. It allows them to submit SQL calls through the API, which supports standard queries as well as DDL and DML statements. These API calls are lightweight, with little overhead. Why is this important? If organizations migrating their applications use Snowflake as their data warehouse, there will be no need for them to refactor the code, since Snowflake already includes a REST API. This key benefit ultimately helps reduce migration costs.
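A hedged sketch of submitting a query through the SQL API’s statements endpoint. The account URL and token are placeholders, and the exact authentication setup (OAuth or key-pair JWT) varies by environment.

```python
import requests

ACCOUNT_URL = "https://my_account.snowflakecomputing.com"  # placeholder
TOKEN = "..."  # OAuth or key-pair JWT token, obtained separately

response = requests.post(
    f"{ACCOUNT_URL}/api/v2/statements",
    json={
        "statement": "SELECT COUNT(*) FROM sales_db.public.orders",
        "timeout": 60,
        "warehouse": "ANALYTICS_WH",
    },
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "Accept": "application/json",
    },
)
response.raise_for_status()
print(response.json()["data"])  # result rows come back as JSON
```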

There were many more great discussions at Snowflake Summit 2021, all of which are now available to watch on-demand. We’ve already begun to share some of these insights with our clients, and look forward to helping them leverage these capabilities.

If you attended the Summit or have a chance to watch the recorded sessions, I encourage you to share your thoughts and favorite takeaways. Please connect with me on LinkedIn to discuss what you found most interesting – I’d love to compare notes!

 

Reflecting on a $900M Miss

A string of mergers in the 1990s turned Citigroup Inc. (Citi) into a financial powerhouse. Many years later, the legacy of those deals and decisions made around system integration (or lack thereof) is now at the core of a major issue for the bank (the nation’s third-largest).

In late 2020, federal banking regulators fined Citi $400 million following a $900 million errant payment caused by a lack of proper risk management and compliance controls. Regulators have ordered Citi to fix its systems, due to “significant ongoing deficiencies,” in order to reduce the likelihood of another error. Adding to Citi’s pain, a court ruled earlier this year that creditors could keep more than $500 million of the accidental payment that Citi sent them in August 2020, directly hitting its bottom line.

Due to the lack of full integration as new businesses were purchased, many of Citi’s various businesses run on their own independent systems and have their own method for tracking legal entities and transactions. There are hundreds of systems inside the bank, and an entity doing business with multiple business units could have different identification codes in each area. This imperfect view of a legal entity and its activity across the bank makes it hard to link all aspects of a relationship, leading to limitations in the company’s ability to manage risk.

Why is this a problem?

A lack of proper risk management and controls can have widespread impact on an organization. For example, financial institutions are required to perform a series of checks on customers and potential customers before entering into a business relationship with them. These Know Your Customer (KYC) guidelines verify the identity, suitability, and risks involved with maintaining customer relationships, and fit within the broader scope of a bank’s Anti-Money Laundering (AML) regulatory policies. They are employed by companies of all sizes to ensure that their proposed customers, agents, consultants, or distributors are compliant – and are who they claim to be. Capturing the true scope of a legal entity is also key to other regulatory and compliance mandates, such as the Single Counterparty Credit Limit (SCCL) rule. The lack of an adequate and cohesive view of entities across systems makes it increasingly difficult to measure the full scope of a relationship and verify that those business relationships are compliant. Institutions must both ensure and be able to prove their compliance with these regulations in order to avoid federal fines such as those imposed on Citi.

Beyond the risk management view, there are also lost revenue opportunities by not having a cohesive view of entities across the bank. A financial institution that is unable to connect the dots between the business owner that has made a commercial loan with the bank and the opportunity that may exist by extending wealth management services to that same business owner is leaving revenue on the table. But, in many banks where client interactions are managed within customer relationship management (CRM) platforms that are siloed by business unit, those connections and opportunities don’t often materialize (and if they do, don’t happen very efficiently).

Who does this affect?

These potential regulatory issues can arise at any bank that:

What is the solution?

Federal regulations mandate that an organization should have a complete and accessible view of who its customers are across systems and business units, as well as associated entities. This is not possible without adequate Data Governance and Data Management across an organization.

Consolidated, accessible, and complete customer data supports compliance with federal regulations and delivers a wide array of benefits for an organization, including:

Kenway’s Information Insight capability blends Data Management, Data Governance, and Business Intelligence, all of which are fundamental in ensuring organizations have a 360-degree view of customers and associated entities across their systems.

Our method focuses on the people, processes and technology surrounding your data ecosystem to create the best solution for your organization. We define and implement processes to help you govern your data from the point of origin, to the point of consumption, to the point of retirement. By taking this approach, we believe that data can be managed in a way that minimizes cost while maximizing an organization’s ability to utilize data as a strategic asset.

If you’re interested in learning more about our experience and approach in this area, contact us at [email protected].


California Love

If you have ever been to California or purchased a product that was sold or distributed in California, you have probably seen the warning: “This product contains chemicals known to the State of California to cause cancer, birth defects, or other reproductive harm.”

It seems like a fair warning until you find it plastered everywhere. I stepped into a hotel elevator once in San Francisco and was presented with a warning on the elevator wall. It was on a paintbrush I purchased at a home improvement store. It was also on the dashboard of a car I rented in San Diego. It led me to cynically think to myself, “Well, what doesn’t cause cancer?”

These warnings proliferated after Proposition 65 became law in 1986, when voters approved it as a ballot proposition. California is somewhat unique in that it allows voters to govern by ballot measures that do not require the support of the executive or legislative branches of the state government. And just as California stepped up to save us all from cancer-causing elevators and paintbrushes, it has once again led the nation, this time in data privacy.

The California Consumer Privacy Act (CCPA) was passed by the state legislature in 2018 and went into effect in 2020. The bill was passed as a compromise to head off a more stringent ballot initiative. However, data privacy advocates were upset that the final product was watered down. So, in 2020, they introduced Proposition 24, also known as the California Privacy Rights Act (CPRA), which voters approved in the November 2020 general election. The new measures go into effect on January 1, 2023.

So, what is CPRA? CPRA does not replace CCPA. Rather, it creates additional consumer rights and modifies existing CCPA rights. CPRA also establishes a new privacy enforcement agency (the California Privacy Protection Agency) and a new category of personal information. Because CPRA was passed through a ballot proposition, the measures within the act can only be amended in ways that strengthen it; in other words, CPRA cannot be watered down through legislative action. Only a subsequent ballot measure, a superseding federal measure, or a judicial ruling of unconstitutionality could reduce its provisions.

One of CPRA’s notable changes is that it expands the scope of a covered business from only those that sell personal information to those that share it as well – a seemingly minor detail that will have a significant impact on businesses. Further, the introduction of “sensitive personal information” expands the definition of personal information and adds new constraints on how that data may be used, how it must be disclosed, and how consumers can limit its use. Sensitive personal information includes government identifiers (e.g., SSN, driver’s license number), financial account and login information, geolocation data, race, ethnicity, sexual orientation, religious beliefs, philosophical beliefs, the content of non-public communications (e.g., email, text messages), genetic data, biometric data, and health information.
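
As a purely illustrative aid, one way a data team might operationalize this new category is to tag fields in its data dictionary. Below is a minimal Python sketch in which every field name and category label is a hypothetical example, not a legal mapping:

    # Hypothetical field-level tagging of "sensitive personal information."
    # Field names and category labels are illustrative, not a legal mapping.
    SENSITIVE_CATEGORIES = {
        "ssn": "government_identifier",
        "drivers_license_no": "government_identifier",
        "account_login": "financial_credentials",
        "precise_geolocation": "geolocation",
        "ethnicity": "protected_characteristic",
        "health_condition": "health_information",
    }

    def is_sensitive(field_name: str) -> bool:
        """Flag whether a field is tagged as sensitive personal information."""
        return field_name in SENSITIVE_CATEGORIES

    print(is_sensitive("ssn"))          # True
    print(is_sensitive("postal_code"))  # False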

CPRA also includes certain GDPR (General Data Protection Regulation) principles, such as data minimization, purpose limitation, and storage limitation. In other words, the personal information that businesses collect must be minimized to only what is reasonably necessary; that personal information can only be used for the purpose that was disclosed to the consumer; and the retention period for each category of personal information must be disclosed.
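
To show how the storage-limitation principle might be made operational, here is a minimal Python sketch assuming hypothetical categories and retention periods; actual periods must come from what the business has disclosed to consumers:

    from datetime import date, timedelta

    # Hypothetical retention limits per category of personal information,
    # expressing the storage-limitation principle. Periods are illustrative
    # only; real periods must match what the business has disclosed.
    RETENTION_POLICY = {
        "government_identifier": timedelta(days=365 * 7),
        "financial_credentials": timedelta(days=90),
        "geolocation": timedelta(days=30),
    }

    def is_expired(category: str, collected_on: date, today: date) -> bool:
        """True if a record has outlived its disclosed retention period."""
        limit = RETENTION_POLICY.get(category)
        return limit is not None and today - collected_on > limit

    print(is_expired("geolocation", date(2021, 1, 1), date(2021, 3, 1)))  # True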

While some of these regulations will result in new policies being drafted and communicated to consumers, others will put more onus on the business and its processes, procedures, and risk mitigation efforts. CCPA provided consumers with the right to request that their data be deleted, the right to opt out, and the right to know how their data was being collected. CPRA now adds the right to correct inaccurate information, the right to access information about how automated decision-making technology is using their data (i.e., meaningful information about the logic involved in such decision-making processes), and the right to opt out of automated decision-making technology that uses their data.

Covered businesses will soon be asking themselves, “When was the last time we completed a risk assessment and cybersecurity audit?” CPRA will require them to conduct both regularly and submit the results to the new privacy enforcement agency. And given that CPRA adds login credentials to the list of data types over which a private citizen can take legal action if breached, cybersecurity is likely to gain heightened focus. Previously, when such a breach occurred, legal action could only be initiated by the Office of the Attorney General. Now, any impacted private citizen covered under CPRA can bring legal action directly against the offending entity.

When CCPA became effective in 2020, many of Kenway’s clients were relieved to see that employee data and data specific to business-to-business transactions were exempt. In effect, CCPA treated consumer personal information differently than employee and business-to-business personal information. CPRA extends those exemptions, but only until January 1, 2023, when they permanently sunset.

It is widely expected that additional states will enact measures like CPRA. Many states have already drafted similar legislation, which is working its way through various legislative committees and forums for debate and amendment. Many of these efforts lost momentum during the COVID-19 pandemic, as attention turned to the national public health crisis and its impacts on state programs and budgets. However, states will likely return their attention to data privacy within their 2021 and 2022 legislative calendars – that is, if superseding federal action does not take place first. Most businesses would prefer a federal data privacy plan (and are lobbying accordingly) to navigating the potential variations of 50 state laws. In that spirit, many companies that do business in Europe are applying the practices they established to address GDPR, since GDPR remains slightly more restrictive than CPRA.

While CPRA will not go into effect until January 1, 2023, now is the time to start preparing. Any organization covered by CPRA will need a crystal-clear understanding of its data ecosystem; without it, implementing any of the required reporting will be impossible. This includes the appropriate Data Governance and Data Management: policies, procedures, data flows, data lineage, a data dictionary, and a catalog. Organizations will also want to revisit their cybersecurity plans and begin planning for regular assessments.

All that said, regulatory compliance cannot be treated like a conventional IT project. Our experts work with clients to create a strategic action plan. This includes leveraging our clients’ internal or external counsel’s interpretation of the law and its applicability to their organization in order to identify gaps and create a plan to resolve them. That plan includes the necessary resources, activities, and scope. We then work with our clients’ teams to connect the dots across all of their systems storing personal information, weave in business processes to pinpoint risks, and build a plan to avoid them. This is all part of the effort to ensure that our clients not only become compliant with the new regulations, but are positioned to stay compliant as new or amended measures are enacted.

Proposition 65 was brought to you by the State of California to warn you of exposure that may impact your health. Kenway is here to warn you that Proposition 24 (CPRA) may impact your business. The clock is ticking – you have slightly less than two years to be ready – and we’re ready to help you through that journey. Even if your business is not covered by CPRA, it is no longer a matter of “if” but, rather, “when.”

We’re here to help when you are ready to start.


Introduction to Data Governance

In today’s age, the generation of large amounts of both structured and unstructured data has expanded at an exponential rate, along with the complexity of data ecosystems. Key data sources are increasing in both number and size, and the way data is captured and assessed is shifting from strictly on-premise databases to cloud technology. Data has become critically valuable to every industry and department, whether it be financial services, data science and analytics teams, sales and marketing, or healthcare. Organizations are becoming more reliant on data to run day-to-day operations and drive decision-making. To keep pace with the ever-expanding wave of source systems, digitization, and Big Data, organizations are making Data Governance an increasingly significant discipline, vital to both business and information technology (IT) strategies.

Read More: Introduction to Data Governance - White Paper (PDF)


The CDO’s toolbox: Data Governance & Data Management

Until the early 2000s, most firms accumulated data at a manageable rate and were able to collect, store, and use that information with little additional effort. However, over the past two decades, the introduction of automation mechanisms (e.g., robotics and IoT), social media, and cloud storage has made it increasingly cheap and easy to collect and store data.

Today, organizations are accumulating a staggering amount of information. Innovative academics have identified a plethora of ways to leverage data, and tech giants have developed (virtually) limitless computing power to process it. The world is now accumulating information so quickly, and at such scale, that laws are struggling to protect it and companies are frantically trying to leverage it.

As one would expect, with an increase in the amount of available data comes an increase in usage across organizations. From executives and employees to customers and regulators, data has become a vital component of interaction among a myriad of stakeholders, making it critical for competing in today’s emerging data-driven economy.

Data as a Strategic Asset

This increase in data usage has led us to a world where the quality of data matters. When stored data is inconsistent in form, has missing components, or is not up to date, it can have a significant impact on an organization.

A colleague shared with me an example of this quality issue from her previous job at a global bank. When generating monthly reports showing details about funds (e.g., returns, exposure by geographic region, product type, and currency), she found herself spending a ton of time exporting data from systems, making manual adjustments to get it into the correct form, realizing it didn’t look right, going back to the numbers to figure out what was wrong, and so on.

In such situations, poor data quality can turn out to be costly, because analysts and managers often spend more time preparing data than analyzing it. Unlike a decade or two ago, poor data quality today has a direct and greater financial impact. Inadequate governance and management often show up as subpar operational performance (e.g., bad decisions due to incorrect reporting), reputation loss (e.g., data leaks), or worse (e.g., regulatory fines due to non-compliance with privacy laws).
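
To illustrate the kinds of checks that catch such issues before a report goes out, here is a minimal Python sketch; the column names, thresholds, and sample rows are all hypothetical:

    from datetime import date

    # A minimal sketch of pre-report data quality checks over fund records.
    # Column names, thresholds, and sample rows are hypothetical.
    fund_rows = [
        {"fund": "Global Equity", "return_pct": 4.2, "region": "EMEA", "as_of": date(2021, 6, 30)},
        {"fund": "Fixed Income", "return_pct": None, "region": "emea", "as_of": date(2020, 12, 31)},
    ]

    def quality_issues(row: dict, report_date: date) -> list:
        """Collect the quality problems that would distort a monthly report."""
        issues = []
        if row["return_pct"] is None:
            issues.append("missing return")            # missing components
        if row["region"] != row["region"].upper():
            issues.append("inconsistent region code")  # inconsistent form
        if (report_date - row["as_of"]).days > 90:
            issues.append("stale as-of date")          # not up to date
        return issues

    for row in fund_rows:
        print(row["fund"], "->", quality_issues(row, date(2021, 7, 31)))

Running checks like these automatically, rather than eyeballing exports, is exactly the kind of manual effort my colleague's process was missing.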

Data Governance and Data Management Lay the Foundation

To thrive in today’s data economy, the CDO/CIO office is often pressured to be intentional about continuously improving data quality across the organization. To do this, they need to rely on both Data Governance and Data Management mechanisms. While the concepts of Data Governance and Data Management are commonly understood and documented, one of the major differences between the two is that Data Governance is a strategy (i.e., macro) and Data Management is a practice (i.e., micro). But both are necessary for any organization to thrive.

Organizations that formalize Data Governance have roles and responsibilities defined for data ownership and stewardship. They also have policies, procedures, and enforcement in place to ensure that data quality standards are upheld from data entry to data delivery. Data Management, on the other hand, is prevalent in organizations that have tools and technologies serving purposes such as visibility into metadata (e.g., what data is in which table, measure calculations, and data lineage) and access controls.
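
As a simple illustration of that metadata visibility, a data dictionary can be thought of as structured records like those in the Python sketch below, where every table, element, and owner name is hypothetical:

    # Hypothetical data dictionary entries: one record per governed data
    # element, capturing ownership (governance) and lineage (management).
    data_dictionary = [
        {
            "element": "customer_id",
            "table": "crm.customers",
            "definition": "Bank-wide golden identifier for a customer",
            "owner": "Retail Banking Data Steward",
            "lineage": ["core_banking.accounts", "crm.customers"],
        },
        {
            "element": "monthly_return_pct",
            "table": "reporting.fund_performance",
            "definition": "Net monthly return, in percent",
            "owner": "Fund Reporting Data Steward",
            "lineage": ["fund_admin.nav_feed", "reporting.fund_performance"],
        },
    ]

    # Answering "where does this number come from?" becomes a lookup, not a hunt
    for entry in data_dictionary:
        print(entry["element"], ":", " -> ".join(entry["lineage"]))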

Examples of common problems solved by Data Governance and Data Management:

Data Governance and Data Management are complementary in nature and essential for an organization to run optimally in today’s world. In the ideal state, the combination of the two allows organizations to understand the positive effects data can have on their business and to create that impact with little effort. It also enables organizations to connect the right insights with the right people, driving value all the way from optimizing business processes to propelling innovation.

Driving Value from Data Governance and Data Management

Structuring and initiating Data Governance and Data Management implementations can appear daunting and intimidating. Common trends suggest organizations often struggle to launch and sustain a Data Governance program, or lack buy-in on basic Data Management tools, because it seems an expensive proposition.

Kenway’s approach to data is different. By understanding and prioritizing use cases supported by business objectives, organizations can structure and pace the development of their data infrastructure in such a way that the project can pay for itself by adding immediate ROI to business initiatives.

We’d love to learn more about your organization’s Data Governance and Data Management initiatives. Drop us a note at [email protected], or check out our Information Insight page to learn more.

Request a Data Governance Consultation