Category: Business Intelligence


When Artificial Intelligence Meets Data Analytics

By Amine Mekkaoui,


If your organization strives to function in a highly technological environment, it is crucial to understand the relationship between big data and artificial intelligence: the latter depends heavily on the former for success, while also helping organizations unlock the potential in their data stores in ways that were previously cumbersome or impossible. Leveraging well-managed, well-presented data can improve an organization enormously. The problem is that handling data is difficult for a variety of reasons.

Data analytics is the process of making sense of data and transforming it into useful knowledge. The process spans many stages and phases, and while software tools exist to assist, data wrangling – the exhaustive work of cleaning and organizing data – is still rarely addressed. Practical data analytics is painful, and a helping hand in the form of automation through artificial intelligence can make a huge difference in this field.

Revolutionizing the speed and efficiency with which data can be transformed into useful knowledge is the goal of The Alan Turing Institute’s Artificial Intelligence for Data Analytics project, otherwise known as AIDA. The initiative aims to combine multidisciplinary work from machine learning, semantic technologies, and programming languages to: (1) build AI assistants for individual tasks; (2) build an open-source platform and integrate the assistants into it; and (3) provide exemplar use cases of real-world data wrangling. It also aims to solve data engineering challenges such as: (a) data organisation (data parsing, integration, dictionary, and transformation); (b) data quality (canonicalisation, missing data, anomaly detection); and (c) feature engineering.
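To make this concrete, below is a minimal Python sketch of two wrangling checks of the kind AIDA aims to automate: missing-data reporting and simple anomaly flagging. The input file, columns, and three-standard-deviation threshold are illustrative assumptions, not part of the AIDA platform itself.

```python
# A sketch of automated data-quality checks; the file name and threshold
# are hypothetical stand-ins for any tabular data set being wrangled.
import pandas as pd

df = pd.read_csv("readings.csv")

# Check 1: report the share of missing values per column.
missing_share = df.isna().mean().sort_values(ascending=False)
print("Missing-value share by column:")
print(missing_share)

# Check 2: flag rows where any numeric value sits more than three
# standard deviations from its column mean, for human review.
numeric = df.select_dtypes("number")
z_scores = (numeric - numeric.mean()) / numeric.std()
candidates = df[(z_scores.abs() > 3).any(axis=1)]
print(f"{len(candidates)} rows flagged as candidate anomalies")
```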



AI-related initiatives like AIDA create better opportunities for producing insights and knowledge: they introduce new methods of analyzing data and make data analytics less labor-intensive. Data analytics once required enormous effort; with the help of AI, the process is not only faster but also capable of making deeper sense of data. Indeed, AI is now seen as highly promising because it thrives across many different industries.

AI in Action

AI and machine learning are powerful levers when it comes to big data. Together with human intuition, they are critical to helping businesses take a more holistic view of all of that data. They revolutionize the way rules, decisions, and predictions get made, and they can dramatically improve the productivity of data scientists, analysts, and researchers – benefiting governments and organizations alike through faster delivery of insights and faster decision-making.

Insurance Sector 

A recent study from the Organisation for Economic Co-operation and Development (OECD, 2020) encourages the insurance sector to prepare for incorporating AI in its specific context. For instance, having more data leads to improved predictive analytics, enabling pricing that is better suited to expected risk. And since insurance is based on predicting how risk is realised, access to big data has the potential to transform the entire insurance production process.

Healthcare

Payers, providers of care, and life sciences companies have started employing several types of AI in categories such as diagnosis and treatment, patient engagement, recommendations, and administrative practice (Future Healthcare Journal, 2019). It will take many years before AI replaces humans in medical domains, but it has already made a promising impact in the medical field: 1) algorithms are already outperforming radiologists at spotting malignant tumours and are guiding researchers in how to construct cohorts for costly clinical trials; 2) machine learning is regarded as the primary capability behind the development of precision medicine; and 3) AI-based capabilities are proving effective at personalising and contextualising care by, for example, sending alerts with relevant, targeted content that prompts action at the right moments.

Government

With AI in data analytics, data-driven governments are reaping more efficient and convenient delivery of public services, as well as better-informed policymaking through predictive analytics, policy simulations, and real-time early-warning systems. These technologies allow them to observe their citizens and physical environment with unprecedented data density, and to analyse those observations (European Liberal Forum, 2019).



By fusing AI and data analytics, Croyten can help ensure that your organization reaps the benefits this advancement is opening up. While AI’s full potential is yet to be explored, it is already making a big difference, not just in the field of data analytics but in the market as a whole. Thanks to artificial intelligence, new and better products are being developed, and the autonomy it offers saves businesses huge amounts of time, leading to quicker decisions gleaned from data.

Data is the new oil, they say. If so, data analytics is the refinery that processes this oil, and artificial intelligence is the upgraded machinery that runs it. Combine them and they can make your organization stand out from the rest.

A Collection of Data and Things

By Amine Mekkaoui,

Within seconds, a company executive in the U.S. can know exactly how many parts their global manufacturing plants are producing. A delivery company can tell you exactly to the minute when their truck will be arriving. A utility company can monitor usage across the country and know when it’s reaching a peak. All of this can be done because of the Internet of Things (IoT) and Big Data.

The IoT is basically a collection of Internet-enabled devices and sensors, beyond your computer, that are connected to the Internet and can send and receive data. Big data is what you get when all of this information is collected and analyzed.

Devices such as smartphones, scanners, sensors, and GPS units can gather and distribute a great deal of information. IoT technology allows the input from these devices to be pulled together. Once it has all been collected, companies can use big data analytics tools to improve business operations, manage equipment and people, target their marketing, and make the business run more effectively and efficiently.

The IoT is forcing people and companies to change the way they look at things. Information is arriving fast, in large amounts, structured and unstructured, and from places we never thought we’d get information from: refrigerators talking to smartphones about a shopping list, fitness trackers measuring the calories you burn, sensors sending vital signs to doctors so they can monitor a patient’s health in real time, and anything else you can imagine. Vendors can then use that information to market directly to consumers or to provide better, timelier service. Inventory in stores could soon rely on a simple shelf sensor that indicates when an item needs to be restocked, as sketched below.
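As a toy illustration of that last idea, here is a short Python sketch of a shelf sensor driving restock alerts. The shelf IDs, readings, and threshold are invented for the example.

```python
# Hypothetical shelf-sensor handler: alert when remaining stock runs low.
RESTOCK_THRESHOLD = 5  # units remaining that should trigger a restock

def on_reading(shelf_id: str, units_remaining: int) -> None:
    """Process one reading from a connected shelf sensor."""
    if units_remaining <= RESTOCK_THRESHOLD:
        print(f"Restock alert: shelf {shelf_id} is down to {units_remaining} units")

# Simulated stream of readings arriving from the shelf over the network.
for shelf_id, units in [("A3", 12), ("A3", 7), ("A3", 4)]:
    on_reading(shelf_id, units)
```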

The next step for businesses is to figure out how to make the most of the data pouring in from things like smart meters, devices, and sensors. How is this data going to affect your next business decision and how is it all going to be analyzed?

Companies need to plan for a continuing influx of data as more devices become connected and interconnected. You need the capacity to store the data, the real-time analytical tools to analyze it, and the ability to monetize it and turn it into something profitable. Without a plan, you could be left behind.

Big Data, Big Decisions

By Amine Mekkaoui,

Big data. It’s a pretty broad term, but it’s used to describe data sets so big or complex that, to get the most value out of them, companies need enhanced data applications and, more importantly, need to know how to manage all of that information.

When it comes to big data, it’s not so much about how much you have, but more about what you can do with it. Managing this data means creating a structure that can store, process, and organize large volumes of structured and unstructured data.

For example, a typical bank offers its customers multiple products: a mortgage, a car loan, a checking account, a savings account, a credit line, and so on. In today’s economy customers also conduct transactions online and via mobile devices, and provide feedback on services via social media. Should all that data be stored in different places? No. Banks and other firms are starting to recognize that keeping all that relevant data in one place can provide a wealth of insights about their customers, which is critical to serving them better and offering them customized products that make sense. Having the data in one place also enables efficient data management control and a single-client view.
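As a simplified sketch of what that single-client view might look like, the Python snippet below joins records from separate product systems on a shared customer ID. The table and column names are illustrative assumptions, not a real banking schema.

```python
# Joining product records from separate systems into one view per customer.
import pandas as pd

checking = pd.DataFrame({"customer_id": [1, 2], "checking_balance": [2500, 900]})
mortgages = pd.DataFrame({"customer_id": [1], "mortgage_balance": [310000]})
feedback = pd.DataFrame({"customer_id": [2], "social_sentiment": ["negative"]})

# Outer joins keep every customer, even those missing a product or feedback.
single_view = (
    checking
    .merge(mortgages, on="customer_id", how="outer")
    .merge(feedback, on="customer_id", how="outer")
)
print(single_view)
```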

Regardless of the size of your company or the data being brought in, big data can provide a whole new way of approaching big decisions.

A consumer-oriented company can use big data to listen to, learn from, and leverage consumer feedback to produce targeted B-to-C campaigns. All of the feedback collected from social media and surveys allows a company to build and update consumer profiles and then execute personalized marketing and advertising campaigns.

Insight-driven organizations (IDOs) need data to drive decisions. Much of the time, though, there is so much data that these organizations don’t even know where to start. Turning big data into insight-driven material can be a very long journey. As an IDO you need to figure out which insights will be most impactful for your clients. Then you need to know whether you have the right data and analytics to create those insights – and if not, how you will get them. Once the insights are available, how do you build them into a strategy for day-to-day decision-making?

Regardless of who is using big data or how it’s being used, there are always concerns about security. Most companies don’t have the infrastructure to store big data on their own IT networks, which means they will be using either the cloud or third-party storage. Transferring data out doesn’t mean companies transfer their liability. With data coming in from all avenues – social media, emails, files, and so on – there are more entry points that need to be protected, along with the external access points where the data is housed.

Finding the most accurate and secure way to use big data will lead to better decision-making – which in turn means more efficient operations, reduced costs, and reduced risk.

What Can You Do with All This Data?

By Amine Mekkaoui,

Today’s market is all about data. Consumers want to capture information relevant to their user experience; marketers want to capture that information to customize offerings for the consumer; enterprises want to turn data into business intelligence so as to secure a core competitive advantage; and data center vendors want to push virtualization so as to support the massive amounts of data to be captured, stored, mined and managed.

From an enterprise standpoint, there are a number of opportunities to capture data. In fact, companies throughout the world are capturing data at every touch point and market feed, hoping to extract the information they need to improve their product offerings and their market positioning. Without a clear strategy in place to direct the capture, organization and management of that data, however, it does nothing more than consume space on the server.

To truly make the most of data capture, the enterprise needs to understand the source and why it was selected, the type of data it wants to capture, and what it hopes to do with that data once in hand. Let’s examine a few possibilities:

  • The mobile consumer – this individual is in a position to share an immense amount of information with the enterprise, including location, purchase history, preferred communication channel and even the information they want to receive via email, text and social media channels. When captured, this information should not only be stored with the contact information, it also can be categorized according to the consumer profile, compiling the likes, dislikes, habits and preferences of a specific target customer.
  • The point of sale – whether in person or through the contact center, the point of sale is one of the best places to capture valuable data. Customers will share a wealth of information about their lives, their preferences, their plans for the future, and much more during this interaction. When that information is captured in the right format, offers can be generated that match their preferences perfectly, creating an opportunity for a cross-sell or upsell conversion. That information should also be stored in the customer account and associated with the profile so as to develop broader-reaching solutions in the future.
  • The free offer – individuals who respond to a free offer or complete a form for more information provide a goldmine of consumer data. The first data capture must be short to ensure completion, but the follow-up call is the perfect opportunity to ask the right questions to qualify the person as a lead, promote them to another buying opportunity, or simply move them to a non-sale opportunity (see the sketch after this list). Whatever the classification, the point is to classify the individual and their information so the company can turn it into intelligent data and potential opportunities.
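Here is one illustrative way that classification step might look in code. The qualification fields and rules are assumptions made for the sake of the example, not a prescribed lead-scoring model.

```python
# Hypothetical follow-up-call classification for a free-offer respondent.
def classify_respondent(budget_confirmed: bool, timeline_months: int) -> str:
    """Bucket a respondent so their information becomes an actionable record."""
    if budget_confirmed and timeline_months <= 3:
        return "qualified lead"
    if budget_confirmed:
        return "future buying opportunity"
    return "non-sale opportunity"

print(classify_respondent(budget_confirmed=True, timeline_months=2))
# -> qualified lead
```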

The sheer volume of data being produced by consumers and the enterprise is putting significant pressure on today’s businesses to capture that data and turn it into a business opportunity. Companies must pay attention to how they capture data, the speed with which they capture it, and how they organize and then use it. The core strategy needs to focus on each of these elements, with a clear direction on how captured data will be used to promote the core competencies of the business. It also needs to ensure data capture is immediate, as sometimes two minutes is too late. With valid channels to capture information in real time, the enterprise is well on its way to turning big data into business intelligence.

In 2011, Saugatuck completed a survey of 200 enterprise IT users and business leaders, and roughly 30 vendors, which found that cloud-based business intelligence and analytics would be among the fastest-growing cloud-based business management solutions in the market over the next two years. That growth represents an 84 percent compounded annual rate – but did the prediction ring true?

Among companies that are currently using business intelligence tools and have been since 2007, the adoption of business intelligence has remained flat. The 2012 Successful BI Survey shows that approximately 25 percent of the employee base relies on business intelligence tools, a figure that has not changed in the last five years. Given the adoption of new technologies and integration into mobile capabilities, this result may come as a surprise to most.

For others, however, the survey result simply demonstrates that the wrong element is being measured to truly understand what is happening in business intelligence in 2013. The tools for gathering the data don’t matter nearly as much as the platform companies use to access the data and what they do with it once it’s in the data center. That is the challenge for the enterprise in this next generation, and one that can be overcome with data analytics and the strategic use of the cloud.

The stagnant adoption of business intelligence tools in the enterprise and the small business is not due to a lack of understanding of the value it presents, but instead the result of significant investments in legacy systems that demanded a focused approach to every network and data center deployment and integration. The process was often cumbersome and expensive, which limited access for a number of potential users. Now, as more companies are embracing the cloud, the playing field is about to change.

The cloud is expanding business intelligence and analytics to multiple users throughout the organization, simplifying access and making business intelligence and the use of analytics more ubiquitous. The cloud provides a single layer for managing the complexities of business intelligence, including the gathering of analytics components, networking, and storage. As big data continues to play a dominant role in a company’s ability to compete effectively, it’s no longer enough to simply manage information.

Companies everywhere are examining how best to manage the exponential growth in unstructured data, forcing key decision-makers to determine how to analyze that data in real time in support of its effective use. While Gartner predicts the business intelligence market will grow 9.7 percent this year, business analytics in the cloud is expected to grow three times faster.

Businesses of all sizes are flocking to the cloud for business intelligence and analytics as it provides vast computing and storage resources without significant investment. Plus, the ability to gather and act on granular information is a key competitive advantage and one that is difficult and costly to achieve without business intelligence analytics in the cloud. As the data bubble continues to expand, those able to embrace the cloud will enjoy greater capacity and capability when turning that data into actionable intelligence. 

Many companies make one huge mistake when implementing their data governance plan.  They assume that once they develop related policies and implement the needed technology solutions to support the strategy, the rest will take care of itself.

These organizations are, unfortunately, in for a rude awakening.  What they don’t realize is that existing business cultures will have a profound impact on how those initiatives are carried out.  In other words, a company’s “personality” and working environment may have as much to do with data governance success as any other factor in the plan.  For example, the willingness – or lack thereof – of both IT and business stakeholders to embrace new initiatives can make or break the strategy.  Or, pre-existing tension between departments and business units can halt the collaboration needed to get the project off the ground in the first place.  

What are some business culture “problems” that can have the greatest impact on a data governance strategy?

Lack of Communication
Many businesses suffer from poor communication across various levels and departments.  And, others are so eager to get their critical projects into play, they often dive right in without properly informing and educating their employees about the plan.  When it comes to data governance, this approach can create major problems.  For example, if stakeholders don’t understand why data governance is important, don’t know how it works, or don’t see how it applies to them, they are likely to be lax when it comes to complying with related policies and procedures.   

Too Many “Cooks Stirring the Pot”

While contribution and consensus among all departments that will be affected by data governance is critical, companies who are prone to forming “mega-committees” to spearhead important projects may see their data governance efforts fail.  Action and execution will end up taking a back seat to meetings, bureaucracy, and debate, and these businesses will likely never get past the policy-making phase.  

Failure to Synchronize and Coordinate
Employees get used to working a certain way, and asking them to significantly alter how they perform their day-to-day activities is likely to be met with some resistance.  Yet, many companies simply demand that employees follow certain data governance processes – no matter how different from current workflows they may be – without any consideration as to whether or not those staff members are capable of carrying those procedures out, and how other responsibilities will be affected.  What these organizations are forgetting is that governance processes are not separate and distinct.  They must be seamlessly integrated into any related IT and business activity they will impact.  

Out of Sight, Out of Mind
Countless companies make the mistake of introducing a major strategy with much noise and fanfare, then executing on that plan quietly, without keeping employees informed of new developments, results, etc.  When it comes to data governance, this is a surefire way for employees to lose interest, because they’ll associate the lack of “hoopla” with a lack of importance.  Conducting ongoing training on new data governance techniques, or setting milestones that track and measure the benefits a data governance strategy is delivering can help keep the initiative at the forefront of employees’ minds, and maintain focus on their goals and responsibilities in carrying out that plan.   

To learn more about governing your data, or for tips to help optimize your data governance strategy, visit our Web site at www.croyten.com.   

Passive vs. Active Data Governance

By Amine Mekkaoui,

There are countless ways to oversee and manage your corporate data.  But, the approach you take will likely determine the success or failure of your data governance initiative.

Some companies will take a “passive” approach, giving users the ability to freely interact with information in back-end systems, then monitoring the results of those interactions – after the fact – using various reporting and data assessment tools. Others will be more proactive, monitoring the way data is created, modified, and handled in real time as these tasks are performed, so problems can be detected and corrected before “bad” data is introduced into the operational environment.

In this post, we will discuss the many problems that may arise when companies take a passive approach to data governance, and highlight the reasons why active data governance is the more effective route to take. 

The key issue with passive data governance is that, once invalid or incorrect information enters a database, its potential to create problems begins immediately. In the time between when a user adds the bad data and a tool identifies and “cleanses” it, an incorrect bill may be sent to a customer, an inaccurate work order may be transmitted to a field service technician, or, even worse, an invalid report may be sent to a regulatory body. Catching corrupt information beforehand is therefore critical, particularly in heavily regulated sectors such as financial services and healthcare.

Bad data within a corporate environment can also lead to interruptions in core business processes.  Downtime can be particularly significant if the poor quality information in question triggers or is consumed by automated workflows.  These disruptions not only drain worker productivity, they can negatively affect revenue generation. 

And, most importantly, validating and correcting bad data after the fact requires more resources than proactively monitoring data activities. The time and cost of scanning a massive database for integrity issues, then cleansing or deleting corrupt information, far surpass the resources needed to flag suspicious data interactions as they occur and validate them on the fly, as the sketch below illustrates.
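The contrast can be sketched in a few lines of Python. The single rule and record shape below are invented for illustration; a real deployment would apply far richer validation.

```python
# Active vs. passive governance, schematically.
def validate(record: dict) -> bool:
    """Toy rule: every record must carry a non-empty customer name."""
    return bool(record.get("name", "").strip())

database: list[dict] = []

# Active: suspicious records are caught at write time, before entering the system.
def insert(record: dict) -> None:
    if not validate(record):
        raise ValueError(f"Rejected before entry: {record}")
    database.append(record)

# Passive: the whole store is scanned after the fact to find records to cleanse.
def batch_cleanse() -> list[dict]:
    return [r for r in database if not validate(r)]
```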

While passive data governance can serve as a highly effective secondary measure, catching the one or two errors or inconsistencies that may mistakenly bypass more active monitoring measures, the truth is that active data governance is the only true way to promote optimum information accuracy, timeliness, and completeness across an entire business. 

The value of data is simply immeasurable to today’s corporations, and the speed at which information flows throughout and beyond an organization leaves little room for error.  Even the slightest problem with data integrity, even if for a short period of time, could be detrimental to operational efficiency, profitability, and regulatory compliance. 

To learn more about governing your data, or for tips to help optimize your data governance strategy, visit our Web site at www.croyten.com.   

Data Governance: Your MDM Plan Won’t Work Without It

By Amine Mekkaoui,

At the heart of every successful master data management (MDM) strategy is master data that is complete and accurate at all times. But, the optimum quality and consistency of master data can only be secured if comprehensive data governance plays an integral role in its creation, collection, storage, handling, and administration.

The Data Governance Institute, a provider of in-depth, vendor-neutral information about best practices in the management and stewardship of enterprise information, has defined data governance as “a system of decision rights and accountabilities for information-related processes, executed according to agreed-upon models which describe who can take what actions with what information, and when, under what circumstances, using what methods.”

And, the experts all agree that MDM initiatives that lack formal data governance policies have a higher likelihood of failure. Why?  Because data governance not only helps to ensure the integrity of the master data that stakeholders use to formulate important business plans and make critical day-to-day business decisions, it aids in effective compliance with regulatory and information disclosure demands.

However, Gartner predicts that 90 percent of organizations will not succeed at their first attempts at data governance.  This failure can be caused by a variety of common factors, including:

  • Too much reliance on IT.  According to Ventana Research’s Mark Smith, responsibility for data quality is not just IT’s job.  It is up to information consumers within functional business units – who have insight into the context in which master data is used – to help administer these assets.   
  • No clear documentation.  Data governance policies and related procedures must be defined and documented in a way that both technical and business stakeholders can easily understand, and must be readily accessible to all those who generate or interact with master data.    
  • Poor enforcement. Data governance processes that are loosely enforced – or not enforced at all – are not likely to be adhered to.  Documentation must not only account for what the rules and guidelines are, but what the possible penalties will be if they are not properly followed.

In some scenarios, bad or invalid master data may be worse than no master data at all.  In order to preserve the correctness and consistency of master data across an organization, companies must implement a formalized data governance program that includes strict “checks and balances” that are overseen by a council of key stakeholders from both the IT team, and various business units.  Only then can master data be optimized to ensure accuracy, comprehensiveness, and most importantly, relevance to all those who rely on it to support core business activities.     

To learn more about best practices in data governance and master data management, visit the Croyten Web site at www.croyten.com

Why Phased MDM Implementations are More Effective

By Amine Mekkaoui,

As more and more companies embark on new master data management (MDM) initiatives, few truly realize just how complicated the deployment stage of their project will be.  Even the most solid and well-thought out plans are likely to face obstacles, require changes, and experience other issues once the implementation actually begins. 

That’s why many experts agree that the “phased” approach – as opposed to a “big bang” deployment – is the best way to go. The MDM Institute, in its December 2008 Market Report, describes master data management as a critical strategic initiative and strongly recommends that it be carried “across multiple lines of business, multiple channels, and therefore across multiple years.”

Companies who have attempted to execute on their entire MDM strategy all at once have run into significant problems, such as:

  • Project delays
  • Cost overruns
  • Loss of end user productivity
  • Unplanned drain on IT resources

Why so many issues? Because master data management isn’t just a set of technology solutions to be installed and forgotten about. It’s a rigorous discipline that spans both IT and business. It requires an evolution of both culture and process – and those changes simply can’t happen overnight.

The implementation of MDM on an enterprise scale will also undoubtedly impact back-end systems, disrupting core business activities. Deploying MDM across the business, in every department simultaneously, can bring critical operations to a screeching halt. On the other hand, a well-timed series of smaller roll-outs will affect only one or two divisions at a time, making it easy for the company to create a contingency plan and minimize losses from any temporary reduction in output.

Additionally, broad-reaching MDM implementations are highly inflexible because they fail to give project leaders the opportunity to assess the viability of the strategy and make changes along the way. Incremental implementations, by contrast, make it easy to assess goals – and whether or not they can actually be met by the current plan – before the entire initiative has been executed. Corrections and adjustments can take place “on the fly”, as the deployment is in progress, ensuring success in both the short term and the long term.

To learn more about best practices in master data management deployment, visit the Croyten Web site at www.croyten.com

Data Validation: What It Means to MDM

By Amine Mekkaoui,

While the creation of master data is important, and the seamless dissemination of it to end users is even more important, it is the accuracy and quality of that data that is crucial to the success of your master data management (MDM) strategy. 

Yet, few companies carefully consider data quality as they develop their MDM plan, and many fail to put the proper validation mechanisms in place when executing that plan. This can not only seriously hinder MDM success, it can have a severe impact on core business operations.

Why are validation and quality control so vital? Because information is generated from many sources.  There is application data, which is maintained in various back-end business systems, as well as the metadata that describes its attributes.  There is transaction data, which is created in the course of “live” events or automated messages, and the reference data that provides detail about it.  Then finally, there is master data, which links these together to facilitate the creation and centralization of a single, consistent set of values across all sources.   

Take, for example, a client’s location.  While a customer relationship management (CRM) system may display one address, an accounting package may show another. Yet a third address may be included in an electronic document, such as a purchase order, transferred during the course of a business-to-business transaction.  These types of inconsistencies, if not detected and corrected in a timely manner, can cause major setbacks in MDM projects.  In other words, bad data will ultimately lead to bad master data. 
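A minimal Python sketch of catching that kind of inconsistency might compare lightly normalized values for the same client across systems. The system names, record layout, and normalization rules are illustrative assumptions.

```python
# Addresses for one client as held by three hypothetical systems.
records = {
    "CRM": "12 Oak St, Boston, MA",
    "Accounting": "12 Oak Street, Boston, MA",
    "Purchase order": "14 Oak St, Boston, MA",
}

def normalize(address: str) -> str:
    """Light normalization so trivial formatting differences aren't flagged."""
    return address.lower().replace("street", "st").replace(",", "").strip()

# More than one distinct normalized value means a genuine conflict to review.
if len({normalize(a) for a in records.values()}) > 1:
    print("Conflicting addresses found; flag for review:", records)
```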

And, when master data is poor, businesses won’t achieve the levels of flexibility and agility they set out to reach, since they’ll be basing both tactical and strategic decisions on information of sub-par quality. 

How does validation work? Automated validation can take several forms. It can scan the environment to uncover inaccuracies, such as those mentioned in the example above, across multiple data sets and flag them for review. An IT staff member can then take a manual look and make any needed corrections to promote accuracy throughout the business.

More advanced quality control techniques allow for dynamic business rules. These rules can be applied proactively to back-end systems to ensure that bad information never enters the environment in the first place – for example, preventing end users from entering client last names that include numbers, or mailing addresses that don’t have enough characters. The same business rules can also be used to automatically “cleanse” bad data after the fact, instantly reformatting or altering it, based on pre-set guidelines, once it has been discovered.
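As a hedged sketch of those two example rules, the Python snippet below rejects last names containing digits and mailing addresses that are too short. The minimum address length is an arbitrary illustrative choice.

```python
import re

MIN_ADDRESS_LENGTH = 10  # illustrative cutoff for a plausible mailing address

def check_record(last_name: str, address: str) -> list[str]:
    """Return rule violations; an empty list means the record passes."""
    violations = []
    if re.search(r"\d", last_name):
        violations.append("last name contains digits")
    if len(address.strip()) < MIN_ADDRESS_LENGTH:
        violations.append("mailing address too short")
    return violations

print(check_record("Sm1th", "12 Oak"))
# -> ['last name contains digits', 'mailing address too short']
```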

In order for an MDM initiative to deliver optimum returns, fully automated controls and validation must be put into place to ensure that master data is accurate and up to date at all times. However, these controls must be broad-reaching, governing not only how data is handled once it has been created, but how it is generated and updated throughout its lifecycle.