Why are so many customers failing in their Big Data initiatives?


I strongly believe that companies with a successful Big Data strategy have an information-centric culture in which all employees are fully aware of the possibilities of well-analyzed and well-visualized information. Better data visualization can help you make better decisions.

As a matter of fact, Gartner’s top predictions for 2012 and beyond included this prediction about Big Data: “Through 2015, more than 85 percent of Fortune 500 organizations will fail to effectively exploit Big Data for competitive advantage.” This leads to the question “Why are so many customers failing in their Big Data initiatives?”

The success of a Big Data implementation is directly proportional to the maturity of the organization.

Drawing on my Big Data project implementation experience, I would like to share an approach that includes the three assessment steps mentioned below, along with the recommendations that lead to a successful Big Data implementation.

I. APPROACH

II. RECOMMENDATION

We recommend a model that demonstrates the real value of Big Data as it applies to an organization. The final recommendations and roadmap, based on our learnings, yield one of two possible outcomes:

• If an organization already has all the necessary tools, processes, systems, and solutions to solve its existing problems, we recommend through a business case that it is not a good candidate for adopting Big Data technologies and can instead resolve its problems with the existing ecosystem.

• If an organization demonstrates the potential value of a Big Data investment, we recommend moving forward with the next steps: take the executable roadmap and blueprint and engage in a Big Data proof of concept (POC).

III. METHODOLOGY

Organizations that approach big data from a value perspective, with partnership between the business and IT, are much more likely to be successful than those that adopt a pure technology approach. For this reason, it is essential to make appropriate investments in both technology and organizational skill sets, ensuring the enterprise is capable of extracting value from big data.

Don’t wait, start now

Start collecting massive amounts of data and store it centrally with Hadoop, hire or train your data scientists, and shift your culture toward an information-centric organization. This will help drive innovation and keep you ahead. Don’t wait, as Big Data is the only way forward.

Phani Kumar Reddy is an Analytics Manager at Bodhtree, managing BI presales with expertise in big data technologies such as Hadoop and BigQuery. He is a passionate leader in Data Analytics, Business Intelligence, and Big Data services.


Balance your Supply Chain with Big Data

Let’s start by going back…way back from a tech perspective. In the 1840s Samuel Finley Breese Morse, the American co-inventor of Morse code, envisioned laying cable across the Atlantic to enable telegraphic communication between the US and Europe. The business benefit metric of the solution was a reduction in message transmission time from 10 days to only a few minutes. With this massive return, the initiative would seem like a “no brainer” from today’s perspective, where communication happens at millisecond speeds from your cell phone; believe it or not, the question commonly asked then was ‘Do we really need communication so fast?’ The project ultimately took over 18 years to complete, until US president James Buchanan finally exchanged messages with Queen Victoria over the transatlantic cable, demonstrating the first business benefit. Let us call this the ‘Paradigm Shift Period’ for communication. Modern businesses now rely on instant communication across the world, with voice and data transfers occurring at lightning speed. People, processes and technologies within business have all evolved to conform to this new paradigm of global data interconnection.

In fact, the original challenge has now come full circle. Business and government have become so efficient at capturing and transmitting data that getting the data is no longer the core of the issue. The challenge and opportunity now lie in processing and interpreting the terabytes, even petabytes, of available structured and unstructured data to influence effective business strategy.

The chances are that you’ve been bombarded with Big Data buzz over the last year. But in spite of all the noise, you’ve probably noticed that few of these descriptions contain focused business use cases for applying Big Data technologies. I am the first to acknowledge and agree with Gartner research that Big Data analytics is riding a hype cycle that will likely peak sometime in 2013. Between now and then, a lot of mind share will go into figuring out whether there is value for your domain, your industry and your job. If you work in supply chain, irrespective of the industry, continue reading to understand how Big Data is expected to bring both direct and indirect impact. Some of these reverberations may fundamentally change the nature and duties of supply chain jobs. In 2010 we witnessed a ‘Paradigm Shift Period’ for Big Data analytics, with major players like SAP announcing the next generation of real-time analytics while many asked a question similar to the one posed 170 years earlier: ‘Do we really need analytics so fast?’ SAP is now seeing its HANA analytics customer base grow rapidly, as are other big players like Oracle. We are witnessing an epic shift in supply chain data analytics that will make the approaches of the last decade seem antiquated.

The Supply Chain Domain

The core of any supply chain strategy is maintaining an appropriate balance between the supply and respective demand. Every other related model, including the well-known JIT (Just in Time), really targets the same goal with different degrees of precision and timeliness. Every time you enter the car repair shop and the mechanic mentions a part will take X days to order, you get a prime, though frustrating, example of a supply-demand imbalance. It is every organization’s goal to maintain a supply-demand balance by optimizing cost and quality with operational efficiencies.

On a much larger scale, I have observed operations at a $40B Hi Tech manufacturer where maintaining the supply-demand balance is a far more complex proposition. Every day, employees and partners in this supply chain ecosystem try to find answers to key supply chain questions, but their view is constrained to only a piece of the picture, since reports rely primarily on structured data. How quickly a person can get accurate and relevant information has a significant impact on the growth, profitability and productivity of the supply chain function.

The following are some ballpark metrics for the annual activities involved in keeping supply aligned with constant variation in market demand:

Does this look ugly? It is. But think about what these numbers will be after data volumes grow 16X by 2016.

It’s a category 5 hurricane of data.

All of the above communication relates to one or more of four areas: assessing demand, assessing supply, fulfilling demand, and delivering the product or service. The efficiency and success of these activities can be tracked through metrics such as lead-time variance, forecast inaccuracies, on-time shipments and quality metrics, to name a few.

Big Data for Supply Chain

NOW, let us bring Big Data into the picture and see how this outlook changes. A Big Data problem exists when data Volume, Velocity and Variety make it difficult or impossible to store, process, and analyse the data using traditional technology and methods. With Big Data technologies, the capability to find answers faster and cheaper has grown exponentially.

While we predict 16X growth in data volumes in just a few years, the human ability to comprehend data does not keep the same pace. From the perspective of people, processes and technology within supply chain management, improvements will need to catch up as you implement Big Data technologies. The probability is high that Big Data technologies will play a key role in handling your rapid data expansion, so gear up your people and processes to match the potential of these technological innovations. Within the broad range of supply chain roles, let us consider the role of the planner to see how his or her activities change with today’s traditional technologies vs. the Big Data technologies of tomorrow.

Key supply chain functions: today (traditional technologies) vs. tomorrow (Big Data technologies)

• Forecasting. Today: running reports and analysis on a daily basis (reports alone can take hours to produce). Tomorrow: forecasting using real-time dashboards, eliminating the concept of running reports; data is ready at lightning speed, with the capability to capture snapshots of analysis.

• Demand Planning. Today: mostly human-fed structured data. Tomorrow: demand planning using structured and unstructured data (e.g. web clickstreams, Facebook likes, Twitter feeds, customer reviews, news article mentions).

• Supply Planning. Today: traditional reports and email communications. Tomorrow: supply planning using real-time data with deep insight into news about vendors and partners.

• Fulfilment & Delivery. Today: tracked through workflows and report status. Tomorrow: proactive delivery tracking to predict possible delays and correlated interdependent events.

There is a fundamental shift from planners reading the data and recommending changes to the machine recommending changes and planners managing the exceptions. This has been the goal of many organizations for the last decade, but recent Big Data technology innovations represent quantum-leap advances toward true strategy automation.

The traditional model makes local copies of data, which the planner edits and writes back. The read/write process might take anywhere from seconds to many hours depending on the task. With Big Data, the turnaround becomes milliseconds. The natural reaction is, “Do I really need information flow that fast?” The important question is not how fast the information flows, but how quickly you can change your decision from A to B, capturing a time-sensitive opportunity or averting a major cost. In the current model, cancelling a wrong work order too late, or not considering all available information in the analysis, can mean a poor decision. Visualize planners viewing all the information they want to see in real time while the competition is still updating data and processing reports.

Bringing the Supply Chain Contacts, Content and Context Together for Decisions

The most critical factor for effective corporate decisions is bringing the contacts, content and context closer to each other. Consider, for example, a supply chain company that knows a part defect could affect assembly, which could in turn delay customer delivery and eventually affect services. Predicting the occurrence of defects well in advance through analysis of historical Big Data has huge ROI potential, because it enables appropriate adjustments to every event in this chain. Additionally, with Big Data recommending related content and relaying all of this to the right contacts, the result is direct ROI in the form of improved quality metrics, increased customer satisfaction and reduced maintenance costs for part replacement.

Today’s Big Data technologies can demonstrate how, in the automobile industry, an alternator part data sheet (Content) can be analysed against all cars sold (Contacts) to reveal the root cause of battery replacements (Context), an issue that has cost the company millions of dollars in repair services. Similar examples can be found in many Big Data technology use cases across industry verticals.

All of these scenarios primarily connect the 3Cs: the Contacts (e.g. customer information or internal employees) and Content (use-case-specific information, e.g. battery failure) with Context (how a battery replacement is due to alternator failure).
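As a rough illustration of how such a 3Cs correlation might be expressed on a Hadoop stack, here is a minimal sketch in Apache Pig Latin. All file paths, field names and the voltage threshold are hypothetical assumptions, not a prescribed implementation.

-- Hypothetical sketch: correlate battery-replacement repair records
-- (Contacts + Content) with alternator test data to surface the Context
repairs = LOAD '/data/repair_records' USING PigStorage(',')
          AS (vin:chararray, part:chararray, repair_date:chararray);
tests = LOAD '/data/alternator_tests' USING PigStorage(',')
        AS (vin:chararray, output_voltage:double);

-- Isolate battery replacement jobs
battery_jobs = FILTER repairs BY part == 'battery';

-- Join on the vehicle identifier
joined = JOIN battery_jobs BY vin, tests BY vin;

-- Keep cars with battery replacements AND out-of-spec alternator output
suspects = FILTER joined BY tests::output_voltage < 13.0;

STORE suspects INTO '/output/alternator_root_cause' USING PigStorage(',');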

With traditional technologies, much of a planner’s time is spent searching for information across multiple tools, reports and manual communication. One gauge of an effective Big Data implementation is reducing the number of reports to one tenth of the current volume. Let the machines do the job of relating and correlating the huge flow of information, and put the planner in the command seat to review recommendations and approve or reject them. This directly increases the planner’s productivity, since he or she can focus on reviewing recommendations rather than searching for information.

Where to Start

All of this means that you first need to conduct an assessment of your supply chain ecosystem with a specific use case in mind to which Big Data technologies will be applied. The specific area targeted for improvement may be forecast inaccuracy, which in today’s model relies mostly on structured data combined with massive exchanges of manual communication, ignoring much of the available market feedback (unstructured data). Measure the baseline and set realistic targets. Traditional forecast/demand planning fundamentally relies on a set of numbers entered by internal and external users. It does not factor in Big Data elements such as sentiment analysis of the market or internal/external unstructured communication (e.g. blogs, chats, tweets, customer reviews). When unstructured information is correlated with structured data, new insights arise, prompting better decisions. Empirical research suggests that a 1% improvement in forecasting drives multi-fold improvements across the entire supply chain. Upon realizing these early Big Data benefits, you can then expand to broader supply chain use cases.
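To make that correlation concrete, the sketch below (again in Pig Latin) joins hypothetical weekly sales history with sentiment scores mined from unstructured market feedback; every path, schema and threshold is an assumption for illustration only.

-- Hypothetical inputs: weekly unit sales per product, and per-product
-- sentiment scores derived from tweets, reviews and blogs
sales = LOAD '/data/sales_weekly' USING PigStorage(',')
        AS (product:chararray, week:int, units:long);
sentiment = LOAD '/data/sentiment_weekly' USING PigStorage(',')
            AS (product:chararray, week:int, score:double);

-- Correlate structured demand history with unstructured market feedback
joined = JOIN sales BY (product, week), sentiment BY (product, week);

-- Flag product-weeks where sentiment turns sharply negative despite
-- strong sales: an early warning that the forecast may be optimistic
signals = FILTER joined BY sentiment::score < -0.5 AND sales::units > 10000;

STORE signals INTO '/output/forecast_risk_signals' USING PigStorage(',');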

ROI

Now, where do you initiate the change and get quick ROI? Our recommendation is to pick the top five supply chain reports you run on your traditional BI and analytics platform, analyse them, and assess whether Big Data technologies would bring improved results. Consider the dimensions of accuracy, precision, and timeliness. For example, forecasting traditionally depends on sales, BU or operations teams entering their forecasts and arriving at some form of consensus. Inherent forecast inaccuracies exist, which are mitigated by a continuous improvement process. With Big Data, you start feeding unstructured market information into the analysis, casting more light on external reactions to your product. This insight provides early indications of demand variations, allowing forecasts to be corrected.

Conclusion

The fundamental disruption of our supply chain ecosystem has begun, with Big Data technology capabilities impacting people, process and technology. Faster, better and cheaper processing of Big Data will drive improvements in people’s behaviour and actions, bringing improved supply/demand balance. Similarly, process improvements learned from supply-chain-driven industries (e.g. automobile) will flow into other industries like Hi Tech and Healthcare. The traditional daily job of a supply chain employee, who reads and writes Content, relates it to a Context and works with a set of Contacts, will change dramatically. Human-driven searching will fundamentally shift to machine-driven searching that maps relevant information and recommendations for faster decision making. Get started with a use case that can be easily measured for ROI realization, then use this success as a launch pad to expand Big Data insights across the organization.


PIG and Big Data – Processing Massive Data Volumes at High Speed

For most organizations, availability of data is not the challenge.  Rather, it’s handling, analyzing, and reporting on that data in a way that can be translated into effective decision-making.

PIG is an open source project intended to support ad-hoc analysis of very large data volumes. It allows us to process data collected from a myriad of sources such as relational databases, traditional data warehouses, unstructured internet data, machine-generated log data, and free-form text.

How does it process?

PIG is used to build complex jobs behind the scenes to spread the load across many servers and process massive quantities of data in an endlessly scalable parallel environment.

Unlike traditional BI tools, which report on structured data, PIG is a high-level data flow language that creates step-by-step procedures on raw data to derive valuable insights. It offers major advantages in efficiency and flexibility for accessing different kinds of data.

What does PIG do?

PIG opens up the power of MapReduce to the non-Java community. The complexity of writing Java programs can be avoided through a simple procedural language abstraction over MapReduce that exposes a more Structured Query Language (SQL)-like interface for big data applications.

PIG provides common data processing operations for web search platforms, such as web log processing. PIG Latin is a language that follows a specific format: data is read from the file system, a number of operations are performed on the data (transforming it in one or more ways), and the resulting relation is written back to the file system.
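A minimal sketch of that read-transform-write pattern, using web log processing as the example; the file paths and schema are hypothetical:

-- Load raw web logs from the file system (path and schema are hypothetical)
logs = LOAD '/data/weblogs' USING PigStorage('\t')
       AS (ip:chararray, ts:chararray, url:chararray, status:int);

-- Keep only successful requests
ok = FILTER logs BY status == 200;

-- Count hits per URL
grouped = GROUP ok BY url;
hits = FOREACH grouped GENERATE group AS url, COUNT(ok) AS total;

-- Write the resulting relation back to the file system
STORE hits INTO '/output/url_hits' USING PigStorage('\t');

Behind the scenes, PIG compiles these steps into MapReduce jobs; a script like this is typically run from the pig command-line client (e.g. pig -x mapreduce script.pig).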

PIG scripts can use functions that you define for tasks such as parsing input data or formatting output data, and even custom operators. UDFs (user-defined functions) are written in Java and permit PIG to support custom processing. UDFs are the way to extend PIG into your particular application domain.
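On the PIG Latin side, a registered UDF is used like any built-in function. In this sketch the jar name and the ParseUserAgent class are hypothetical stand-ins for a Java class written against PIG’s UDF interface:

-- Register a jar containing a custom Java UDF (names are hypothetical)
REGISTER 'myudfs.jar';
DEFINE ParseUserAgent com.example.pig.ParseUserAgent();

logs = LOAD '/data/weblogs' USING PigStorage('\t')
       AS (ip:chararray, ts:chararray, url:chararray, agent:chararray);

-- Apply the UDF to normalise the raw user-agent string
parsed = FOREACH logs GENERATE ip, url, ParseUserAgent(agent) AS browser;

STORE parsed INTO '/output/browsers' USING PigStorage('\t');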

PIG allows rapid prototyping of algorithms for processing petabytes of data. It effectively addresses data analysis challenges such as traffic log analysis and user consumption patterns to find things like best-selling products.

Common Use Cases:

PIG is mostly used for data pipelining, which includes bringing in data feeds, data cleansing, and data enhancement through transformations. A common example is log files.

PIG is also used for iterative data processing, allowing time-sensitive updates to a dataset. A common example is a “bulletin” use case, involving a constant inflow of small pieces of new data that replace the older feeds every few minutes.
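A sketch of that incremental pattern: merge a small incoming feed into the current dataset and keep only the newest record per key. Paths and schema are assumptions for illustration.

-- Merge the existing dataset with the latest small feed
base = LOAD '/data/bulletin/current' AS (id:chararray, ts:long, body:chararray);
delta = LOAD '/data/bulletin/incoming' AS (id:chararray, ts:long, body:chararray);

merged = UNION base, delta;
grouped = GROUP merged BY id;

-- For each id, keep only the most recent version of the record
latest = FOREACH grouped {
    sorted = ORDER merged BY ts DESC;
    newest = LIMIT sorted 1;
    GENERATE FLATTEN(newest);
};

STORE latest INTO '/data/bulletin/next';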

Sailaja Bhagavatula specializes in SAP Business Objects and Hadoop for Bodhtree, a business analytics services company focused on helping customers get maximum value from their data. Bodhtree not only implements the tools to enable processing and analysis of massive data volumes, we also help businesses ensure that the questions being asked target key factors for long-term growth.


Extending SAP Business Objects to All Organizational Decision-Makers

BI tools play a vital role in decision making and innovation at every level of dynamic organizations. SAP Business Objects includes tools that expand the reach of BI information services, enabling the organization to share, integrate and embed BI in applications, services, tools and business processes.

Unification of the BI data used across applications

BI can be used across multiple functions and is generally not specific to any particular department or team. It can be leveraged across applications related to finance, operations, sales or human resources. SAP Business Objects Enterprise provides a unified structure with a powerful semantic layer and integration capability that brings a “single version of the truth” to data drawn from multiple sources.

Share BI with Any Service-Enabled Application

To build applications that extend the advantage of a company’s BI investment, developers can use the SAP Business Objects BI Software Development Kits (SDKs). These SDKs can be used in any Java or .NET based application for authentication, authorization, scheduling, content display, ad hoc query, or server administration. SAP Business Objects also offers a comprehensive set of Web Services that expose BI functionality through a platform-agnostic interface. The software also supports your organization by extending the reach of BI beyond traditional corporate boundaries.

SAP Business Objects Web Services enhance support for your tactical and operational decision making, which improves business process efficiency.

Sridurga Vannemreddi is an SAP Business Objects and Big Data developer with Bodhtree.  For more than a decade, Bodhtree has specialized in business analytics, leveraging close partnerships with leading BI software manufacturers such as SAP Business Objects, Informatica, and IBM Cognos.  Bodhtree offers free assessments to map analytics solutions to the goals and objectives specific to your organization.
