Category: Croyten’s Blog


In 2011, Saugatuck completed a survey of 200 enterprise IT users and business leaders and roughly 30 vendors, which found that cloud-based business intelligence and analytics would be among the fastest-growing cloud-based business management solutions in the market over the next two years. That growth represents an 84 percent compound annual growth rate, but did the prediction ring true?

Among companies that have been using business intelligence tools since 2007, adoption has remained flat. The 2012 Successful BI Survey shows that approximately 25 percent of the employee base relies on business intelligence tools, a figure that has not changed in the last five years. Given the adoption of new technologies and the integration of mobile capabilities, this result may come as a surprise to many.

For others, however, the result of this survey simply demonstrates that the wrong element is being measured to truly understand what is happening in business intelligence in 2013. The tools for gathering the data don't matter nearly as much as the platform companies use to access the data and what they do with it once it's in the data center. This is the challenge facing the enterprise in this next generation, and one that can be overcome with data analytics and the strategic use of the cloud.

The stagnant adoption of business intelligence tools in the enterprise and the small business is not due to a lack of understanding of the value it presents, but instead the result of significant investments in legacy systems that demanded a focused approach to every network and data center deployment and integration. The process was often cumbersome and expensive, which limited access for a number of potential users. Now, as more companies are embracing the cloud, the playing field is about to change.

The cloud is expanding business intelligence and analytics to include multiple users throughout the organization, simplifying access and making business intelligence and analytics more ubiquitous. The cloud provides a single layer for managing the complexities of business intelligence, including analytics components, networking and storage. As big data continues to play a dominant role in a company's ability to compete effectively, it's no longer enough to simply manage information.

Companies everywhere are examining how best to manage the exponential growth in unstructured data, forcing key decision-makers to determine how to analyze this data in real time and put it to effective use. While Gartner is predicting the business intelligence market will grow 9.7 percent this year, business analytics in the cloud is expected to grow three times faster.

Businesses of all sizes are flocking to the cloud for business intelligence and analytics because it provides vast computing and storage resources without significant investment. Plus, the ability to gather and act on granular information is a key competitive advantage, and one that is difficult and costly to achieve without cloud-based business intelligence and analytics. As the data bubble continues to expand, those able to embrace the cloud will enjoy greater capacity and capability when turning that data into actionable intelligence.

Threats Associated with BYOD

By Amine Mekkaoui

The use of mobile devices among the global workforce is not a new concept, but the introduction of user ownership is a trend that has just gained momentum in the last few years. Professionals in a wide range of industries are relying on their own mobile devices to support the balance between work and home, introducing a whole new set of risks for the corporate network when the proper policies and controls are not in place.

While BYOD (Bring Your Own Device) offers plenty of benefits for the enterprise and the employee, a strategic approach is necessary to mitigate the risks associated with users accessing the network and supported applications from outside of the corporate firewall. Let’s take a look at some of the threats that exist with BYOD and what you need to do to protect your network, your users and your proprietary information.

  1. Lacking a Robust Policy – Now that users are accustomed to relying on their own devices to access the network and their personal email, they also need to know what constitutes acceptable use, who has access to their device(s), and what will happen if the device or the information it contains is compromised. An effective policy outlines expectations and outcomes, while also providing for the proper sharing of information so all employees are informed.

  2. Weak Authentication Methods – It’s a given that employees will need unique usernames and passwords to access the corporate network, but it’s also a given that such credentials are easily captured by hackers. It’s critical that IT management implements and enforces strong authentication methods and limits access to applications. Strong authentication methods demand constant monitoring and regular updates to ensure any breach is immediately identified and mitigated.

  3. No Visibility or Control over Devices – Employees often prefer BYOD as a concept as it suggests they have complete control over their mobile device. While the physical control may remain, IT management establishes its own control over the device with mobile device management or other applications that provide remote access and complete visibility. Access to such technology ensures IT always knows what devices are accessing the network and can immediately locate, lock and wipe clean any compromised or lost device.

  4. Applications – While a number of applications exist to promote the activities of the professional in the field, a larger number exist to waste time or access proprietary information with malicious intent. Any applications downloaded by the user without IT approval are a risk to the corporate network. The simple scan of a QR code could quickly launch malware on the device, with reach into any network to which it is connected. The corporate policy must define what constitutes an approved application and how to avoid downloading malicious software.

While this list merely scratches the surface of the threats that exist with BYOD, it still provides clear insight into what you need to consider within your own environment. Whether yours is a large enterprise, small- to medium-sized business or sole proprietorship, any mobile device used to access your network, server or other IT assets presents a threat to your operation. Before allowing BYOD to flourish, put the right strategy in place to support only the safe use of all mobile devices.

Many companies make one huge mistake when implementing their data governance plan.  They assume that once they develop related policies and implement the needed technology solutions to support the strategy, the rest will take care of itself.

These organizations are, unfortunately, in for a rude awakening.  What they don’t realize is that existing business cultures will have a profound impact on how those initiatives are carried out.  In other words, a company’s “personality” and working environment may have as much to do with data governance success as any other factor in the plan.  For example, the willingness – or lack thereof – of both IT and business stakeholders to embrace new initiatives can make or break the strategy.  Or, pre-existing tension between departments and business units can halt the collaboration needed to get the project off the ground in the first place.  

What are some business culture “problems” that can have the greatest impact on a data governance strategy?

Lack of Communication
Many businesses suffer from poor communication across various levels and departments. And, others are so eager to get their critical projects into play that they dive right in without properly informing and educating their employees about the plan. When it comes to data governance, this approach can create major problems. For example, if stakeholders don’t understand why data governance is important, don’t know how it works, or don’t see how it applies to them, they are likely to be lax when it comes to complying with related policies and procedures.

Too Many “Cooks Stirring the Pot”

While contribution and consensus among all departments that will be affected by data governance is critical, companies that are prone to forming “mega-committees” to spearhead important projects may see their data governance efforts fail. Action and execution will end up taking a back seat to meetings, bureaucracy, and debate, and these businesses will likely never get past the policy-making phase.

Failure to Synchronize and Coordinate
Employees get used to working a certain way, and asking them to significantly alter how they perform their day-to-day activities is likely to be met with some resistance.  Yet, many companies simply demand that employees follow certain data governance processes – no matter how different from current workflows they may be – without any consideration as to whether or not those staff members are capable of carrying those procedures out, and how other responsibilities will be affected.  What these organizations are forgetting is that governance processes are not separate and distinct.  They must be seamlessly integrated into any related IT and business activity they will impact.  

Out of Sight, Out of Mind
Countless companies make the mistake of introducing a major strategy with much noise and fanfare, then executing on that plan quietly, without keeping employees informed of new developments, results, etc.  When it comes to data governance, this is a surefire way for employees to lose interest, because they’ll associate the lack of “hoopla” with a lack of importance.  Conducting ongoing training on new data governance techniques, or setting milestones that track and measure the benefits a data governance strategy is delivering can help keep the initiative at the forefront of employees’ minds, and maintain focus on their goals and responsibilities in carrying out that plan.   

To learn more about governing your data, or for tips to help optimize your data governance strategy, visit our Web site at www.croyten.com.   

Passive vs. Active Data Governance

By Amine Mekkaoui

There are countless ways to oversee and manage your corporate data.  But, the approach you take will likely determine the success or failure of your data governance initiative.

Some companies will take a “passive” approach, giving users the ability to freely interact with information in back-end systems, then monitoring the results of those interactions – after the fact – using various reporting and data assessment tools. Others will be more proactive, monitoring the way data is created, modified, and handled in real time as these tasks are performed, so problems can be detected and corrected before “bad” data is introduced into the operational systems environment.
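To make the contrast concrete, here is a minimal sketch in Python of the two styles. The record structure, the single postal-code rule, and the function names are all hypothetical, chosen only to show where the validation step sits in each approach rather than to reflect any particular product.

    import re

    # A single, hypothetical governance rule: postal codes must be five digits.
    def is_valid(record):
        return bool(re.fullmatch(r"\d{5}", record.get("postal_code", "")))

    # Active governance: validate at write time, so bad data never lands.
    def active_save(record, database):
        if not is_valid(record):
            raise ValueError(f"Rejected record {record.get('id')}: invalid postal code")
        database.append(record)

    # Passive governance: accept everything, then sweep the database later.
    def passive_save(record, database):
        database.append(record)

    def passive_audit(database):
        return [r for r in database if not is_valid(r)]  # flagged only after the fact

    db = []
    active_save({"id": 1, "postal_code": "02139"}, db)   # accepted at entry
    passive_save({"id": 2, "postal_code": "2139"}, db)   # slips in unchecked
    print(passive_audit(db))                             # found later, during the audit

In the passive path, the flawed record sits in the database until the audit runs; in the active path, it never gets in. That timing gap is the subject of the rest of this post.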

In this post, we will discuss the many problems that may arise when companies take a passive approach to data governance, and highlight the reasons why active data governance is the more effective route to take. 

The key issue with passive data governance is that once invalid or incorrect information enters a database, its potential to create problems begins immediately. In the time between when a user adds the bad data and a tool identifies and “cleanses” it, an incorrect bill may be sent to a customer, an inaccurate work order may be transmitted to a field service technician, or, even worse, an invalid report may be sent to a regulatory body. Therefore, catching corrupt information beforehand is critical, particularly in heavily regulated sectors such as financial services and healthcare.

Bad data within a corporate environment can also lead to interruptions in core business processes.  Downtime can be particularly significant if the poor quality information in question triggers or is consumed by automated workflows.  These disruptions not only drain worker productivity, they can negatively affect revenue generation. 

And, most importantly, the validation and correction of bad data after the fact will require more resources than the proactive monitoring of data activities.  The amount of time and cost associated with scanning a massive database for integrity issues, then cleansing or deleting corrupt information, far surpasses the resources needed to flag suspicious data interactions as they occur, and validate them on the fly. 

While passive data governance can serve as a highly effective secondary measure, catching the one or two errors or inconsistencies that may mistakenly bypass more active monitoring measures, the truth is that active data governance is the only true way to promote optimum information accuracy, timeliness, and completeness across an entire business. 

The value of data is simply immeasurable to today’s corporations, and the speed at which information flows throughout and beyond an organization leaves little room for error.  Even the slightest problem with data integrity, even if for a short period of time, could be detrimental to operational efficiency, profitability, and regulatory compliance. 

To learn more about governing your data, or for tips to help optimize your data governance strategy, visit our Web site at www.croyten.com.   

Data Governance: Your MDM Plan Won’t Work Without It

By Amine Mekkaoui

At the heart of every successful master data management (MDM) strategy is master data that is complete and accurate at all times. But, the optimum quality and consistency of master data can only be secured if comprehensive data governance plays an integral role in its creation, collection, storage, handling, and administration.

The Data Governance Institute, a provider of in-depth, vendor-neutral information about best practices in the management and stewardship of enterprise information, has defined data governance as “a system of decision rights and accountabilities for information-related processes, executed according to agreed-upon models which describe who can take what actions with what information, and when, under what circumstances, using what methods.”

And, the experts all agree that MDM initiatives that lack formal data governance policies have a higher likelihood of failure. Why?  Because data governance not only helps to ensure the integrity of the master data that stakeholders use to formulate important business plans and make critical day-to-day business decisions, it aids in effective compliance with regulatory and information disclosure demands.

However, Gartner predicts that 90 percent of organizations will not succeed at their first attempts at data governance.  This failure can be caused by a variety of common factors, including:

  • Too much reliance on IT.  According to Ventana Research’s Mark Smith, responsibility for data quality is not just IT’s job.  It is up to information consumers within functional business units – who have insight into the context in which master data is used – to help administer these assets.   
  • No clear documentation.  Data governance policies and related procedures must be defined and documented in a way that both technical and business stakeholders can easily understand, and must be readily accessible to all those who generate or interact with master data.    
  • Poor enforcement. Data governance processes that are loosely enforced – or not enforced at all – are not likely to be adhered to.  Documentation must not only account for what the rules and guidelines are, but what the possible penalties will be if they are not properly followed.

In some scenarios, bad or invalid master data may be worse than no master data at all.  In order to preserve the correctness and consistency of master data across an organization, companies must implement a formalized data governance program that includes strict “checks and balances” that are overseen by a council of key stakeholders from both the IT team, and various business units.  Only then can master data be optimized to ensure accuracy, comprehensiveness, and most importantly, relevance to all those who rely on it to support core business activities.     

To learn more about best practices in data governance and master data management, visit the Croyten Web site at www.croyten.com.

Why Phased MDM Implementations are More Effective

By Amine Mekkaoui

As more and more companies embark on new master data management (MDM) initiatives, few truly realize just how complicated the deployment stage of their project will be. Even the most solid and well-thought-out plans are likely to face obstacles, require changes, and experience other issues once the implementation actually begins.

That’s why many experts agree that the “phased” approach – as opposed to a “big bang” deployment – is the best way to go. The MDM Institute, in its December 2008 Market Report, describes master data management as a critical strategic initiative and strongly recommends that it be carried out “across multiple lines of business, multiple channels, and therefore across multiple years.”

Companies who have attempted to execute on their entire MDM strategy all at once have run into significant problems, such as:

  • Project delays
  • Cost overruns
  • Loss of end user productivity
  • Unplanned drain on IT resources

Why so many issues?  Because master data management isn’t just a set of technology solutions to be installed and forgotten about.  It’s a rigorous discipline that spans both IT and business.  It requires an evolution of both culture and process – and those changes simply can’t happen overnight.

The implementation of MDM on an enterprise scale will also undoubtedly impact back-end systems, disrupting core business activities. Deploying MDM across the business, in every department simultaneously, can bring critical operations to a screeching halt.  On the other hand, a well-timed series of smaller roll-outs will affect only one or two divisions at a time, making it easy for the company to create a contingency plan and minimize losses from the temporary reduction in output.

Additionally, broad-reaching MDM implementations are highly inflexible because they fail to give project leaders the opportunity to assess the viability of the strategy, and make changes along the way. But, incremental implementations make it easy to assess goals – and whether or not they can actually be met by the current plan – before the entire initiative has been executed upon.  Corrections and adjustments can take place “on the fly”, as the deployment is in progress, ensuring success in both the short-term and the long term.   

To learn more about best practices in master data management deployment, visit the Croyten Web site at www.croyten.com.

Data Validation: What It Means to MDM

By Amine Mekkaoui

While the creation of master data is important, and the seamless dissemination of it to end users is even more important, it is the accuracy and quality of that data that is crucial to the success of your master data management (MDM) strategy. 

Yet few companies carefully consider data quality as they develop their MDM plan, and many fail to put the proper validation mechanisms in place when executing it.  Not only can this seriously hinder MDM success, it can also have a severe impact on core business operations.

Why are validation and quality control so vital? Because information is generated from many sources.  There is application data, which is maintained in various back-end business systems, as well as the metadata that describes its attributes.  There is transaction data, which is created in the course of “live” events or automated messages, and the reference data that provides detail about it.  Then finally, there is master data, which links these together to facilitate the creation and centralization of a single, consistent set of values across all sources.   

Take, for example, a client’s location.  While a customer relationship management (CRM) system may display one address, an accounting package may show another. Yet a third address may be included in an electronic document, such as a purchase order, transferred during the course of a business-to-business transaction.  These types of inconsistencies, if not detected and corrected in a timely manner, can cause major setbacks in MDM projects.  In other words, bad data will ultimately lead to bad master data. 

And, when master data is poor, businesses won’t achieve the levels of flexibility and agility they set out to reach, since they’ll be basing both tactical and strategic decisions on information of sub-par quality. 

How does validation work?  Automated validation can work in several ways.  It can scan the environment to uncover inaccuracies, such as those mentioned in the example above, across multiple data sets and flag them for review.  An IT staff member can then review each flagged item manually and make any needed corrections to promote accuracy throughout the business.
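As a rough illustration of that scan-and-flag step, the sketch below compares a client’s address across three hypothetical source systems and flags any disagreement for manual review. The source names, field names, and normalization rule are assumptions made for the example, not a reference to any specific MDM product.

    # Hypothetical snapshots of the same client in three source systems.
    sources = {
        "crm":        {"client_id": "C-100", "address": "12 Main St, Boston, MA"},
        "accounting": {"client_id": "C-100", "address": "14 Main Street, Boston, MA"},
        "purchasing": {"client_id": "C-100", "address": "12 Main St, Boston, MA"},
    }

    def normalize(address):
        # Light normalization so trivial formatting differences are not flagged.
        return address.lower().replace("street", "st").replace(",", "").strip()

    def flag_mismatches(client_id, sources):
        values = {name: normalize(rec["address"]) for name, rec in sources.items()}
        if len(set(values.values())) > 1:
            return {"client_id": client_id, "conflicting_values": values}
        return None

    issue = flag_mismatches("C-100", sources)
    if issue:
        print("Flagged for review:", issue)  # a data steward resolves it manually

Here the accounting system disagrees with the other two even after normalization, so the record is queued for a person to reconcile rather than being silently overwritten.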

The more advanced quality control techniques allow for the use of dynamic business rules.  These rules can be proactively applied to back-end systems to ensure that bad information doesn’t enter the environment in the first place.  For example, they can prevent end users from entering client last names that include numbers, or mailing addresses that don’t have enough characters.  These business rules can also be used to automatically “cleanse” bad data after the fact, instantly reformatting or altering it based on pre-set guidelines once it has been discovered.  A minimal sketch of both uses follows.
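The sketch below assumes a simple dictionary record and hypothetical field names; a real MDM or data quality tool would express the same rules through its own rule engine, but the proactive check and the after-the-fact cleansing work on the same principle.

    import re

    def validate_customer(record):
        """Proactive check applied before a record is written to a back-end system."""
        errors = []
        if re.search(r"\d", record.get("last_name", "")):
            errors.append("last name must not contain numbers")
        if len(record.get("mailing_address", "")) < 10:
            errors.append("mailing address is too short")
        return errors

    def cleanse_customer(record):
        """After-the-fact cleansing: strip digits and reformat per a pre-set guideline."""
        record["last_name"] = re.sub(r"\d", "", record.get("last_name", "")).strip().title()
        return record

    new_record = {"last_name": "sm1th", "mailing_address": "12 Main St, Boston, MA"}
    problems = validate_customer(new_record)
    if problems:
        print("Rejected at entry:", problems)
        print("Or cleansed by rule:", cleanse_customer(new_record))

Whether a given rule blocks the entry outright or repairs it automatically is a policy decision; the important point is that the rule runs against the record itself, not against a report generated weeks later.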

In order for an MDM initiative to deliver optimum returns, fully-automated controls and validation must be put into place, to ensure that master data is accurate and up-to-date at all times.  However, these controls must be broad-reaching, governing not only how data is handled once it has been created, but how it is generated and updated throughout its lifecycle.