1. Start with a business problem in mind
Exploring huge amounts of data with Hadoop and other advanced analytic tools can be fun for your analytics team, but it can also be a waste of time and resources if the results do not translate into something that solves real-world business problems.
Identify projects that are both promising and practical, and take time to understand the various types of problems big data analytics can solve for your organization. For instance, there’s a lot of conversation around analyzing unstructured data such as video and speech. However, the most important source of big data for many businesses is consumer transactions, which tend to yield structured data.

Payment card and loyalty program transactions, for example, produce abundant, timely streams of data. These are replete with granular details on the what, when, how much, and how often of individual spending. And keep in mind that the cost and complexity of analyzing huge amounts of structured data are often much lower than the cost and complexity of analyzing unstructured data.
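As an illustrative sketch (the field names and figures here are hypothetical, not drawn from any real dataset), structured transaction records of this kind can be summarized with a few lines of standard tooling to answer the "how much" and "how often" questions per customer:

```python
from collections import defaultdict
from datetime import date

# Hypothetical payment-card transactions: (customer_id, date, amount)
transactions = [
    ("C1", date(2024, 1, 3), 25.00),
    ("C1", date(2024, 1, 17), 40.00),
    ("C2", date(2024, 1, 5), 15.50),
    ("C1", date(2024, 2, 2), 30.00),
]

# Aggregate spend per customer: purchase count ("how often")
# and total amount ("how much").
summary = defaultdict(lambda: {"count": 0, "total": 0.0})
for customer, day, amount in transactions:
    summary[customer]["count"] += 1
    summary[customer]["total"] += amount

for customer, stats in sorted(summary.items()):
    print(customer, stats["count"], round(stats["total"], 2))
```

In practice the same grouping logic runs in SQL or a distributed framework over millions of rows, but the structure of the question stays this simple, which is part of why structured data is cheaper to analyze.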
The bottom line is that you need to find out what kind of business problem or challenge can be addressed with the data you have, and ensure that the data being analyzed is current, accurate, and offers real insight.
2. Ask the right questions
The most valuable insight into business performance comes from determining in advance exactly what information is needed and then asking your data specific questions. Many companies implement big data solutions and expect insight without first deciding what they need to know. Vague questions will not receive clear answers.
3. Take a partnering approach
To achieve the best results, work in collaboration with your chosen analytics provider and involve key stakeholders from your own business at the outset. Collaboration will equip you with the knowledge and skills you’ll need to maintain and extend your big data solution in-house in the future.
4. Leverage analytic innovation
Innovations in big data processing and analytics are transforming how businesses get value from their customer data. We’re seeing a shift from approaches that supply periodic snapshots in the form of descriptive reports and dashboards (what happened) to systems that continuously analyze incoming data to produce predictions (what is likely to happen) and prescriptions (what to do about it) that are actionable in real time.
Many types of analytics will increasingly operate inside production streams. Relying less on persistent historical data, they will instead respond to changes in the current environment. Analytic outputs will be combined with complex event processing to enable very rapid responses to customer behavior.
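A minimal sketch of this idea, assuming a toy rule in place of a real complex event processing engine: a monitor watches a stream of spend events and flags, in real time, when activity in a sliding window crosses a threshold (all names and values here are hypothetical):

```python
from collections import deque

def make_stream_monitor(window=3, threshold=100.0):
    """Return a function that observes spend events one at a time and
    flags when total spend over the last `window` events exceeds
    `threshold` -- a stand-in for an in-stream analytic rule."""
    recent = deque(maxlen=window)

    def observe(amount):
        recent.append(amount)          # keep only the current window
        return sum(recent) > threshold # act on the present, not history
    return observe

monitor = make_stream_monitor()
alerts = [monitor(amount) for amount in [30.0, 40.0, 20.0, 90.0]]
```

The monitor holds no persistent history beyond its small window, so it responds to changes in the current environment rather than to periodic snapshots, which is the essential shift the paragraph above describes.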
Big data tools and infrastructure are also making it easier to apply machine learning techniques to explore huge datasets that include a wide variety of structured and unstructured data. The right balance of these techniques with human analytic and domain expertise not only lifts business performance but also improves the ability of companies to learn at a fast pace from data-driven experiments.
5. Balance automation with expertise
More analytics doesn’t mean less need for human expertise. Analytic expertise informed by deep domain knowledge is essential for building effective predictive and decision-making models. Today’s shortage of analytic talent puts more pressure on organizations to ensure they engage with well-trained data scientists, either their own in-house experts or vendors with whom they choose to collaborate.
Make sure the people or businesses with whom you partner for your big data projects really understand the data that drives both the decisions and the building of the analytic models. With the expanding space of open source and commercial data science tools, newer “data scientists” often use these tools without a true understanding of how they work, what the parameters mean, and the impact they can have on your business decisions.