The state of data quality

Transcript

The state of data quality
An Experian Data Quality white paper

Contents

Executive summary
Introduction
  Research overview
  Research methodology
Key findings
  An abundance of channels
  The state of data quality
  Strategy development
  Consequences of poor information
  Value of data sets
  A five year evolution – what has changed?
Creating a centralized data strategy
  Create a central task force
  Consolidate data
  Implement best practices
Conclusion

Executive summary

The perception of data across organizations is changing. No longer is data viewed as just a secondary component of business. Today, the information contained within a database is viewed by senior management and many departments as a critical factor in decision making, customer interaction and service delivery. In fact, 93 percent of companies believe data is essential to their marketing success. However, data inaccuracy leaves organizations at risk.

Unfortunately, the level of inaccurate contact data globally has increased from 17 percent to 22 percent, up five percentage points in just 12 months. With the increasing volume of information collected through a variety of channels, there is more room for human error. This, combined with the prevalence of segmented, departmental approaches to data accuracy, is preventing stakeholders from analyzing, improving and controlling data problems. This year's study revealed that 66 percent of companies lack a coherent, centralized approach to data quality, and ad-hoc approaches are dividing resources and further segmenting information.

Poor data quality is having a negative effect on budgets, marketing efforts and, most importantly, customer satisfaction. Organizations that are not able to control the quality of their data are unable to effectively communicate with their customer base. Data quality is the foundation for any data-driven effort, and in order to succeed in the year ahead, organizations will need to prioritize data accuracy and accessibility.

Thomas Schutz, SVP, General Manager of North American Operations, Experian Data Quality

Introduction

Research overview
In December 2013, Experian Data Quality commissioned a research study to look at current approaches to data quality. This report, 'The state of data quality,' reviews the evolution of data quality and consumer interaction while providing best practices for data management.

Research methodology
Over 1,200 respondents globally took part in the research, produced by Dynamic Markets for Experian Data Quality. Individuals from the U.S., UK, France, Germany, Spain and the Netherlands completed the survey. Industry sectors included in the sample were finance, public sector, retail, manufacturing, utilities and education. Respondents consisted of C-level executives, vice presidents, directors, managers and administrative staff connected to data management, across a variety of functions.

Countries in sample: U.S. 33%, UK 33%, Spain 9%, Netherlands 8%, Germany 8%, France 8%.

Key findings

An abundance of channels

Organizations are interacting with consumers in countless ways. On average, companies use 3.4 channels to collect customer or prospect contact data. Multinational companies operate through more channels than those that operate in a single country. The most common channel for interacting with customers is the organization's website, followed by the sales team and the call center.

[Chart: Channels to collect consumer data – website, call center, face-to-face, physical store, mobile website, mobile application, catalogue, other]

The number of channels has remained consistent year over year. And while websites have dependably been the most popular channel over the past few years, mobile is gaining in prominence. Today, half of organizations are capturing customer contact data through mobile applications.

Organizations are not just collecting information through select channels; they are also sending marketing messages to consumers to create brand awareness and drive purchases. These marketing communications are sent through a number of channels, the most popular of which is email, followed by social media and then mobile telephone. Social media prevalence is increasing year over year, up five percentage points from 2012.

[Chart: Marketing communication channels – email, social media, mobile telephone, landline telephone, physical address, other]

With email being the most popular marketing communication channel, it is not surprising that 83 percent of companies acquire customer or prospect email addresses for their email marketing efforts. These addresses are collected through an average of three channels, the most popular being the company's website and the call center. U.S. companies collect customer and prospect email addresses in a wider variety of ways than other countries do.

While this general diversification of channels is not new, companies are starting to increase their focus on cross-channel marketing. Cross-channel marketing is the coordination of different channels to provide the customer with a consistent experience, rather than a more segmented multi-channel approach. While 87 percent of companies now engage in cross-channel marketing, 83 percent of them face challenges in this area of their operation.

The principal challenges associated with cross-channel marketing relate to data. Having accurate information on the consumer and having enough information on the consumer are the two primary challenges when providing consistent communication across channels. Respondents in the U.S., Germany and Spain face a wider variety of challenges when they engage in cross-channel marketing than respondents in the UK, France and the Netherlands. Multinational companies also report a greater number of cross-channel marketing challenges than companies that only have national offices. These cross-channel marketing challenges directly correlate with the current state of data quality across the globe.

The state of data quality

Data quality continues to be a challenge for many organizations as they look to improve efficiency and customer interaction through data insight. 91 percent of companies suffer from common data errors. The most common data errors are incomplete or missing data, outdated information and inaccurate data. Because of the prevalence of these common errors, the vast majority of companies suspect their contact data might be inaccurate in some way. Globally, the average amount of inaccurate data has risen to 22 percent, from 17 percent just 12 months ago. U.S. organizations believe they have the highest percentage of inaccurate data, at 25 percent.

The level of inaccurate data is staggering when one considers how much businesses are relying on information for business intelligence and improved customer interaction.

The main cause of inaccurate data remains human error, which has consistently been the main cause of errors over the past three years. While all other causes clearly lag behind the front runner, they include a lack of communication between departments and technical limitations. Information collected across various channels is frequently exposed to human error as consumers and individual employees enter information manually. Collectively, 78 percent of companies have problems with the quality of data they collect from various channels. Globally, call centers produce the poorest data quality, followed by websites. However, in the U.S. that order is reversed, with company websites causing the most challenges.

Reasons for data inaccuracy:
- Human error: 59%
- Lack of internal communication between departments: 31%
- Inadequate data strategy: 24%
- A lack of relevant technology: 22%
- Lack of internal manual resources: 22%
- Insufficient budget: 20%
- Inadequacies in relevant technology: 19%
- Inadequate senior management support: 14%
- Other: 5%

The level of inaccurate data relates to the lack of a sophisticated data management strategy, which many organizations are struggling to centralize.

Strategy development

Organizations conceptually see the benefit of having accurate data. The main drivers for having a data quality strategy include increased efficiency, enhanced customer satisfaction, and more informed decision making. Respondents in the U.S. cite more of these factors to explain why their organization has a strategy to maintain data quality, and in general, senior managers selected a greater volume of drivers when it comes to maintaining high-quality contact records. On the industry side, respondents in manufacturing, financial services and utilities relate to more of these issues as drivers toward having a strategy to maintain high-quality contact data, compared to those in education and the public sector.

Interestingly, the benefit of cost savings continues to drop as a motivator for a data quality strategy, which reflects the current desire for data to inform overall business strategy rather than just serve an operational purpose.

There are several key areas that make up a given strategy. They include:

1. Management of data quality
2. Utilization of third parties
3. Adoption of strategies
4. Selection of implementation method

Today, only 30 percent of companies manage their data quality strategy centrally, through a single director, and 66 percent of companies lack a coherent, centralized approach to data quality. Given the number of channels and departments that interact with data, it is challenging to maintain quality information when each department has different standards and methods for data management.

Third parties are frequently used for data management strategy suggestions: 64 percent of companies have used or still use third parties for their data quality strategy. Utilizing third parties for data management is most common in retail and manufacturing, compared to other industry sectors. When reviewing variances across company size, among these generally large companies, more of the smaller ones use third parties for data management. However, given the decentralization of data management strategies in general, it is most likely that these third-party engagements are for one-off campaigns or departmental management practices.

Some companies take advantage of automated software techniques. One in three companies use dedicated point-of-capture software to verify information as it is entered, and another one in three use dedicated back-office software to clean data after it is submitted. Collectively, 55 percent of organizations use some automated method of data management. Automated processes are an indicator of the sophistication of data management methods: companies that use automated data management methods are more likely to have their data strategy managed centrally, by a single director.

However, many companies still rely on manual data cleansing methods. 53 percent of companies perform manual data cleansing tasks, including processes like manually reviewing data in Excel or making one-off manual corrections for seasonal campaigns. While the overall prevalence of manual methods is down from previous years, organizations need to look at utilizing more automated methods to prevent human error.

Finally, there are different deployment methods for data management strategies. Some organizations choose to deploy on-premise software to manage data quality; however, software-as-a-service (SaaS) deployments continue to gain in popularity. Ignorance about SaaS solutions for data management has fallen quite considerably over a 12-month period, from 15 percent last year to nine percent today.

Today, 53 percent of organizations are using SaaS to manage data quality, and only nine percent have no plans to implement a SaaS solution for data quality. The U.S. and France stand out in that more companies there are already using SaaS technology to manage data quality. Manufacturing and retail are leading adopters, with roughly one in five managing all of their contact data through SaaS technology.

However, knowledge about SaaS solutions seems to vary by department and seniority level. In general, more senior staff and those in IT and data management roles are better informed of their organization's position with respect to the use of the cloud.

The lack of a centralized strategy and consistent automated methods fuels a large percentage of data inaccuracies. This leads to a number of consequences.

Consequences of poor information

With a quarter of information believed to be inaccurate in U.S. organizations, companies are facing many consequences.

First, inaccurate data is affecting the company bottom line. 77 percent of companies believe their bottom line is affected by inaccurate and incomplete contact data, and on average, respondents believe 12 percent of revenue is wasted. Despite increased knowledge about data quality and the benefits of utilizing data-driven techniques, the average percentage of wasted revenue has not changed in this survey since 2007.

However, changes in business practices have brought about new consequences. Some relate to customer engagement and loyalty programs, which have made a strong surge in the past few years. 84 percent of companies have a loyalty or customer engagement program. Unfortunately, 74 percent of respondents have encountered problems with these programs. The main causes are inaccurate information, not enough information on the consumer, and an inability to analyze customer information. All of these issues relate to data accuracy and accessibility.

[Chart: Problems with loyalty campaigns – having inaccurate information on the consumer, having enough information about the consumer, inability to analyze customer information, inability to access disparate customer information, inability to create compelling offers, lack of customer participation, other]

Another trend is business intelligence and analytics, frequently referred to today as big data. 89 percent of companies now use their data in a strategic way for business intelligence and analytics. In fact, the U.S. stands out with more companies conducting business intelligence and analytics on their data, compared to the UK and France.

These programs also encounter problems due to poor data quality. 81 percent of organizations encounter problems when trying to generate meaningful business intelligence, mainly due to data inaccuracies. Other problems include a lack of information, a lack of flexible data and systems, and an inability to consolidate data across channels.

[Chart: Problems with business intelligence – inaccurate data, cannot consolidate data across channels, not enough information available, too much information available, no analytics resources, lack of training, lack of flexible data, other]

Finally, marketers are continuing to communicate through email; however, 67 percent have experienced email deliverability problems in the last 12 months. These problems result in poor customer service, an inability to communicate with subscribers, and unnecessary costs.

Value of data sets

As organizations look to gain value from their data, certain data sets have emerged as being more important than others, particularly for marketing. In fact, 93 percent of respondents think some form of data is essential to their marketing success. Contact data tops the list of data and information deemed essential to marketing success, followed by sales data and demographic data.

Information essential to marketing success:
- Contact data: 54%
- Sales data: 44%
- Demographic data: 38%
- Preference data: 31%
- Behavioural data: 26%
- Geolocation data: 20%
- International data: 18%
- Other: 5%

To gain additional insight that may not be contained within an existing database, many organizations are looking to third parties to append insight for marketing and business intelligence purposes. 94 percent of companies append enrichment data to their contact information, and on average, companies append three different types. The top three data sets are business data, geolocation data and demographic data. U.S. companies seem more switched on to data enhancement and append a wider variety of enrichment data. Preference data in particular is especially popular as a route to marketing success in the U.S., compared to other countries.

A five year evolution – what has changed?

Looking back over the past five years of the Experian Data Quality study, what has changed about our perception, use and motivations for data quality?

First, the perception of inaccurate data is increasing. Globally, the average amount of inaccurate data has risen to 22 percent, from 17 percent just 12 months ago. This increase is due to the increasing volumes of data from multiple sources. Organizations generally acknowledge that the proliferation of digital channels and mobile technology has brought more information than ever before. While some of this is unstructured data that is difficult to mine, there is a general increase in overall customer and prospect information.

In addition to increased volumes, data quality has moved from a primarily operational function associated with efficiency and cost savings to a strategic function aligned with consumer insight and overarching business intelligence.

The increase in data has also led to the concept of big data, which came about within the past few years and is generally a buzzword most of us have read about. While many of us have read about the term, no single, consistent definition has emerged in the market. Respondents in this year's survey were asked to review several definitions for big data, and no single answer emerged as the front runner. 45 percent of respondents believe the term refers to a large, unified single-source database. Although this took the highest percentage, multiple-data-source and predictive-analytics definitions were not far behind. More senior-level respondents selected more of these possible interpretations of what the term big data means to them, whereas 40 percent of administrative-level staff admitted they did not know what the term meant. And while there is less confessed ignorance among IT and data management professionals around the term, they demonstrate the widest array of opinion on what it means.

Finally, we have seen the explosion of SaaS. Five years ago, few organizations implemented cloud or hosted solutions. Today, with the explosion of software-as-a-service platforms, many organizations are looking to utilize this deployment method for their data quality management software to decrease implementation time and ensure consistently updated technology. However, as we have seen in previous years, security remains a concern around SaaS that large organizations need to research further.

Creating a centralized data strategy

Given the importance of data-driven efforts, the level of inaccurate data needs to decrease for organizations to gain actionable insight. Improved data quality leads to better consumer interactions and informed business decisions.

Based on the research findings, organizations need to review a centralized approach to data management. One-off, ad-hoc projects of the past no longer suffice given the volume of data and the speed at which it needs to be accessed. Given the number of channels through which data is being entered, organizations need to create a centralized strategy for data management. A central approach ensures consistency across departments, access to many data sources for customer information, and improved best practices related to data management.

In order to create a centralized approach, organizations need to focus on three key areas:

1. Create a central task force. Data management is not the responsibility of one department alone.
2. Consolidate data. In order to create a centralized approach, organizations need to consolidate different sources of information.
3. Implement data best practices. Strong tools and processes will prevent the most common errors within a database.

These areas allow organizations to create a centralized source for data with consistent management practices and easy access to valuable consumer information.

Create a central task force

Data management is not the responsibility of one department alone. Many departments input and utilize data for daily operations. Information is entered through websites, call centers, sales representatives and more. Then departments such as billing, customer service, fulfillment and marketing utilize that information to communicate with customers or supply them with basic goods and services.

In order to manage data quality as it relates to all of these areas, a centralized task force needs to be created. This group should consist of stakeholders, as well as individuals from across the organization who will execute plans.

Stakeholders should consist of members from various departments that have a stake in the quality of information. These individuals can provide detail around how information is collected and utilized. From there, create a data map to showcase all workflows within the organization. This will help the group understand what tools or processes should be implemented, and set a prioritization structure for those projects.

IT should also be involved, to implement priorities and source the technology that will be utilized to enforce and maintain data management. IT can also provide insight into the technical resources available, given other business priorities.

Benchmarks can be taken and regular progress can be checked as organizations move through the process of implementing new solutions. The group can meet on a regular basis to review new statistics and processes, and identify whether the quality of information is improving for all departments. Notably, companies that have a centralized approach to how they review and manage their data quality strategy are more likely to utilize SaaS technology for data quality management than those that are decentralized.

Consolidate data

One of the first tasks in any central strategy is consolidating information. According to the Experian Data Quality study, the average large organization has eight different databases. That statistic most likely does not include other spreadsheets or sources of data that may exist outside of a database, which can be numerous in large organizations.

In order to create a centralized approach to data management, organizations need to consolidate different sources of information. This not only allows data to be accessed more easily, but also allows for consistency in management and standardization processes. In order to consolidate data, organizations should take several steps:

1. Identify all sources of information that should be consolidated and who the owners are for each source.

2. Identify the data infrastructure. Is there an existing database where all information can be stored, or is a new system required that can better accommodate multiple departments? The data quality task force can help determine the need.

3. Clean and standardize as much existing information as possible. Most often, contact data will be an easy source of information that is contained within each source. These details can be used to identify duplicate records across sources in order to consolidate information for each client into a single central source.

4. Utilize software to identify duplicates and remove the possibility of human error. Once potential duplicates are found, a golden record can be identified and all information can fall within that record.

Creating a centralized source can take some time; however, it will be invaluable for quickly accessing customer information and improving business intelligence. Centralization should be paired with data best practices in order to ensure the quality of information.
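Steps 3 and 4 can be sketched in miniature. The example below is illustrative only: the field names, the matching rule (a shared email address) and the survivorship rule (keep the most complete record, fill gaps from its duplicates) are simplifying assumptions, not Experian's method or any vendor's logic.

```python
# Illustrative sketch: standardize contact data, cluster duplicates,
# and merge each cluster into a single "golden record".
# Field names and matching rules are hypothetical examples.

def normalize(record):
    """Standardize fields so duplicates can be matched (step 3)."""
    return {
        "name": record.get("name", "").strip().title(),
        "email": record.get("email", "").strip().lower(),
        "phone": "".join(ch for ch in record.get("phone", "") if ch.isdigit()),
    }

def consolidate(records):
    """Group records by email and merge each group into one golden record (step 4)."""
    clusters = {}
    for rec in map(normalize, records):
        clusters.setdefault(rec["email"], []).append(rec)
    golden = []
    for dupes in clusters.values():
        # Assumed survivorship rule: start from the most complete record,
        # then fill any still-missing fields from the other duplicates.
        dupes.sort(key=lambda r: sum(1 for v in r.values() if v), reverse=True)
        merged = dict(dupes[0])
        for other in dupes[1:]:
            for field, value in other.items():
                if not merged[field] and value:
                    merged[field] = value
        golden.append(merged)
    return golden

crm = [{"name": "jane doe", "email": "JANE@example.com ", "phone": ""},
       {"name": "Jane Doe", "email": "jane@example.com", "phone": "(555) 010-2030"}]
print(consolidate(crm))  # one golden record, phone filled in from the duplicate
```

In practice, matching would combine several standardized contact fields rather than email alone, but the shape of the process (normalize, cluster, pick a golden record, merge) stays the same.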

Implement data best practices

Once information is consolidated, organizations should look to implement best practices around data management within their newly formed system. In order for departments to adopt a new central source, it needs to be easy to use and contain quality information. Otherwise, individuals will revert back to old data sources that better fit their business need.

It is important to note that data quality is not meant to be a one-off cleaning engagement; data is constantly entering a database, expiring or changing format. Just five years ago, the term big data did not exist and the concept of unstructured data from social media and mobile phones was only starting to come to the forefront. As organizations become more sophisticated with their data management practices, it is important that they have a best-practices strategy and then adjust it over time based on market need.

Implementing data management best practices will help combat incomplete or missing data, outdated information and inaccurate data, which are the most common errors within a database. By preventing these errors, information can be maintained over time and remain fit for its desired purpose. Organizations vary in the way they manage their data quality: 99 percent state they have a data management strategy in place, 38 percent perform regular manual analysis in Excel, and 55 percent use some sort of automated method to manage data accuracy.

There are six key best practices.

1. Create benchmarks around data accuracy

As with any business initiative, it is important to track progress in order to demonstrate that the time and budget associated with a project are bringing a return on investment. The data quality task force should look to create benchmarks around data accuracy to help show improvement, and to determine which processes or investments should be continued and which did not work as well as planned.

Benchmarks can easily be taken around package delivery, returned mail, email deliverability or customer service calls. Businesses can also look to leverage third-party consultants to benchmark segments within the database. Over time, these same benchmarks can be reviewed once processes and tools have been implemented.

2. Verify data upon entry

Today, most businesses utilize information as soon as it is entered, for loyalty offers, marketing efforts, fulfillment or billing. While information has always needed to be accurate from an operational standpoint, consumer communication has become more of a factor given the speed at which organizations need to follow up with relevant marketing offers. Inaccurate data affects customer interaction almost immediately.
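The idea of verifying data upon entry can be illustrated with format-level checks. The regular expression and the ten-digit phone rule below are simplifying assumptions for the sketch; real point-of-capture software validates against mailbox, postal and telephony reference data rather than structure alone.

```python
import re

# Illustrative point-of-capture checks (structure only; assumed rules,
# not any specific verification product's behavior).

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def check_email(value):
    """Accept only strings shaped like local@domain.tld."""
    return bool(EMAIL_RE.match(value.strip()))

def check_phone(value, digits=10):
    """Assumed rule: a North American number contains exactly 10 digits."""
    return len([c for c in value if c.isdigit()]) == digits

print(check_email("jane@example.com"))   # True
print(check_email("jane@example"))       # False: no top-level domain
print(check_phone("(555) 010-2030"))     # True
```

Rejecting malformed values at the point of capture is cheaper than back-office cleansing later, because the consumer is still present to correct the entry.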

18 The state of data quality Therefore, it is important to check the validity of information as it is being entered. Software tools can be put in place to verify structured customer information, such as email address, mailing address and telephone number. This standardized and validated information allows organizations to more easily find existing accounts and accurately append third party data sets that rely on basic customer information. With cross-channel marketing efforts, point-of-capture verification processes help to ensure the accuracy of customer information, but also the prevention of duplicate accounts, which keep marketers from valuable consumer insight. By validating customer information, organizations can prevent inaccurate data and ensure that communications not only reach the consumer, but that any customization techniques are more accurate across channels. 3. Validate information with consumers when possible Data expires within a database quickly, in fact, it is estimated that two percent of contact data goes bad each month, which is almost a quarter of the database annually. To keep consumer information up to date, it is important to validate information with the consumer as often as possible. Not all of these methods involve directly reaching out to the consumer. Marketers can watch outbound communication efforts for signs customer information may not be accurate. For example, marketers can watch delivery rates and open rates from email campaigns. If an email bounces or customers go for long stretches without opening messages, the email address may not be active. Marketers can flag email addresses for update at the next consumer interaction, or reach out to the customer for changes in communication preferences. However, when customers call into a call center or go into a branch or store location, associates can verify existing information to make sure it is still the best method of communication for that individual. 

19 The state of data quality 4. Improve searching functionality Among those with contact Duplicate records cause problems for organizations by spreading out account data accuracy issues, the history and creating incomplete customer records. Duplicates are often created main cause of such problems when recent information is entered and it cannot be reconciled with an existing is believed to be human error, record. followed by an inadequate data strategy and a lack of internal Most often this is because the record cannot be found due to a slight variation, manual resources. such as a name abbreviation or a mis-keyed email address. Basic searching functionality within a database is often poor, requiring an exact match to find an existing record. More sophisticated searching can be put in place Duplicate data is among the top to find potential matches and identify more possibilities for the account than just a three data quality errors for one-for-one match. 30 percent 5. Check the database for duplicate entries on a regular basis of organizations. Even with validated data and improved searching, duplicates will inevitably be created due to the nature of human error. Stakeholders should be sure to check the Stakeholders should be sure to database on a regular interval to ensure no duplicate accounts have been created check the database on a regular and to consolidate information whenever possible. interval to ensure no duplicate 6. Review data management processes annually accounts have been created and consolidate information whenever Data management and information requirements change constantly across an possible. organization. Looking back on the past five years, data techniques have changed dramatically as evidence by the research. 
To ensure that data is fit for purpose and can be used in the way the organization intends, the data quality task force should review data management practices annually, identifying new ways that information is being utilized and any processes that are no longer fulfilling their purpose based on benchmark data. By reviewing management processes on a regular basis, organizations can make certain they are using their valuable data asset to its maximum potential.
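The "more sophisticated searching" called for in practice 4 can be illustrated with fuzzy string comparison, which surfaces near-matches that an exact lookup would miss. This is a minimal sketch using Python's standard-library `difflib`; the record shape and the 0.85 threshold are assumptions, and production matching engines use far richer logic (phonetics, address parsing, weighted fields).

```python
from difflib import SequenceMatcher

def similarity(a, b):
    """Case-insensitive similarity ratio (0.0 to 1.0) between two field values."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def find_candidates(record, database, threshold=0.85):
    """Return existing records whose name is close to the incoming one.

    A slight variation — an abbreviation or a mis-keyed character — still
    scores high, so the account can be found instead of duplicated.
    """
    return [r for r in database
            if similarity(record["name"], r["name"]) >= threshold]
```

With this threshold, an incoming "Rob Smith" surfaces an existing "Robert Smith" record as a merge candidate rather than silently creating a duplicate, which an exact-match lookup would have done.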

Conclusion

The utilization of data across organizations has shifted. With the evolution of big data, businesses see the potential for this valuable asset to provide an avenue for better consumer interaction and business decision making. To ensure data is fit for purpose, organizations need to take steps to ensure its accuracy, accessibility and completeness.

Data management best practices should be implemented to standardize data, consolidate it into a single record for each client, and append additional data sets when required. These practices need to be managed centrally across the organization to consolidate resources and ensure all information receives the same validation and standardization.

Data quality is the foundation for any data-driven effort. As data proliferation continues, organizations need to prioritize data quality to ensure the success of these initiatives.

About Experian Data Quality

Experian Data Quality is a global leader in providing data quality software and services to organizations of all sizes. We help our clients proactively manage the quality of their data through world-class validation, matching, enrichment and profiling capabilities. With flexible software-as-a-service and on-premise deployment models, Experian Data Quality software allows organizations around the world to truly connect with their customers by delivering intelligent interactions, every time.

Established in 1990 with offices throughout the United States, Europe and Asia Pacific, Experian Data Quality has more than 13,500 clients worldwide in retail, finance, education, insurance, government, healthcare and other sectors. For more information, visit http://www.qas.com.

For more information about how you can improve your data quality management strategy, please call 1 888 727 8330 or visit us online at www.qas.com.

Experian Data Quality
125 Summer St Ste 1910
Boston, MA 02110-1615
T 888.727.8330
[email protected]
www.qas.com

Intelligent interactions. Every time.

© 2013 Experian Information Solutions, Inc. All rights reserved. Experian and the Experian marks used herein are service marks or registered trademarks of Experian Information Solutions, Inc. Other product and company names mentioned herein are the property of their respective owners. 09/2013
