Decoding the Data Science Terrain: Part 2
In our previous post, we described a technology- and process-neutral framework for data science and analytics at a fairly abstract, high level. In this post we will dive into the details by dividing the analytics landscape into six layers:
1. Use cases
2. Data sets
3. Data collection
4. Data preparation
6. Intelligent Actions
In this post, we will talk about Use cases.
The first of many steps in tapping the transformative potential of data is to clearly articulate how (and by how much) the business will benefit from an analytics solution. The business also needs to visualize how the solution will be used and what it will look like. Ideally, this is a business-led approach to using analytics to solve business problems, rather than a technology-driven or vendor-driven one.
Use cases can be defined at two levels: a broader program level or a narrower project level. The former involves connecting with senior business leaders to understand their vision for how analytics can drive transformation. Defining use cases at the project level involves conversations with the potential beneficiaries or owners of the business process, weighted more toward middle managers and analysts.
A good use case must take into account critical elements such as strong support from the business teams, compelling business motivation, what's realistically feasible from a data and technology point of view, and a clear plan for how the solution will be implemented to realize the benefits. For example, the business may want to build a personalization engine that tailors website content to every visitor. Such personalization requires some critical enablers: a unique way of identifying and tracking historical visitor behaviour across all interactions (easier said than done) and a way of embedding personalized recommendations from the models into the website (again, easier said than done). Most importantly, there must be a strong justification for why this would improve critical business performance.
Let us discuss this in the context of a B2B organization. We will look at how leading marketers are applying analytics to solve key challenges that they uniquely face in the B2B industry:
• Product features are not differentiators: B2B companies are searching for differentiators through solutions and data-driven services
• Complex purchasing: Long purchase cycles (several months), many buyers (influencers, decision makers, gate keepers, sponsors), complex deals (integration, customization), multiple divisions (each having their own requirements, may not map to existing product features)
• Channel fragmentation: Buyers interact across multiple channels (web, social, PR, analysts, content, reviews and ratings), using multiple-devices from multiple locations
• Buyers want to be in control: They gather social intelligence about products and solutions, through interactions with other buyers and experts in the field. About 70% of research is complete before a buyer reaches out
Given these realities, marketing teams in B2B organizations are looking to engage with their buyers at every stage: building awareness, helping with research, evaluating alternatives, making the purchase and managing the post-purchase experience.
Marketers in the consumer world of online commerce companies such as Amazon, Priceline and eBay have dramatically increased the precision of their marketing efforts. Getting to that level of precision in B2B marketing, however, is going to be hard. This is probably why CMOs rarely feel trusted by their bosses. The only way to earn the CEO's trust is for marketers to transform themselves into revenue-driven marketers, and the way to do that is to apply analytics systematically across all marketing initiatives.
Marketing is already steeped in technology, including CRM, content management, social marketing, and web and digital marketing. Marketers use this technology for demand generation, brand marketing, events, field marketing, content marketing, partner marketing, PR, analyst relations and social media marketing, with a lot of overlap among these areas. Marketing automation software lets them consolidate these efforts with features such as progressive enhancement, drip marketing and personalization. A great deal of analytics is already delivered with this technology.
The question is: how will more investments in analytics lead to better marketing outcomes without adding to the already complex technology landscape?
Most analytics in B2B marketing works in silos. For example, every channel's visitors, engagement, conversion, etc. is measurable to the last decimal. It's easy to analyse the number of impressions, clicks and downloads, but it's hard to quantify their overall impact in terms of revenue. It's easy to set and achieve siloed goals such as "increase website visitors by x%" or "improve the downloads-to-visits ratio by y%." In the end, however, what matters is the answer to the question: what really works, and why? To answer this question, marketers first need to create a single source of truth, rather than relying on siloed analytics.
To do this, we recommend the creation of a data lake. A data lake integrates data from every technology and platform, in a flexible manner and at a very low level of granularity. Essentially, a data lake can capture and store every interaction with each customer: their visits, the pages visited, time spent, emails received, opened and clicked, content downloaded, campaigns they were part of, and so on. In addition, the data is harmonized across customers, channels, campaigns, programs, events and accounts (normalizing definitions across the organization, automating business rules such as brand and product-to-category mappings, and monitoring data quality).
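To make the harmonization idea concrete, here is a minimal sketch of mapping raw events from two different channels into one common event schema, including an automated product-to-category business rule. All field names, customer IDs and mappings below are hypothetical and purely illustrative; a real data lake pipeline would do this at scale with dedicated ingestion tooling.

```python
# Illustrative sketch: harmonizing raw interaction events from different
# channels into a common schema. Field names and mappings are hypothetical.

# A business rule (product -> category) maintained centrally, not per channel
PRODUCT_TO_CATEGORY = {
    "cloud-backup": "Infrastructure",
    "crm-suite": "Applications",
}

def harmonize_web_event(raw):
    """Map a raw web-analytics event to the lake's common event schema."""
    return {
        "customer_id": raw["visitor_id"],   # unified customer key
        "channel": "web",
        "event_type": raw["action"],        # e.g. page_view, download
        "product": raw.get("product"),
        "category": PRODUCT_TO_CATEGORY.get(raw.get("product"), "Unknown"),
        "timestamp": raw["ts"],
    }

def harmonize_email_event(raw):
    """Map a raw email-platform event to the same schema."""
    return {
        "customer_id": raw["recipient_id"],
        "channel": "email",
        "event_type": raw["event"],         # e.g. open, click
        "product": raw.get("campaign_product"),
        "category": PRODUCT_TO_CATEGORY.get(raw.get("campaign_product"), "Unknown"),
        "timestamp": raw["sent_at"],
    }

web = {"visitor_id": "C42", "action": "download",
       "product": "crm-suite", "ts": "2024-05-01T10:00:00"}
email = {"recipient_id": "C42", "event": "click",
         "campaign_product": "cloud-backup", "sent_at": "2024-05-02T09:30:00"}

events = [harmonize_web_event(web), harmonize_email_event(email)]
for e in events:
    print(e["customer_id"], e["channel"], e["event_type"], e["category"])
```

The key design point is that both channels land in one schema keyed by a unified customer identifier, so cross-channel questions become simple queries rather than stitching exercises.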
A data lake provides a strong foundation for different types of analytics, including reporting, personalization, content insights, ad-hoc analysis, data services, etc. A data lake enables true agility by acting as a single source and a clean and integrated repository of marketing information.
Let’s look briefly at some of the use cases that a data lake enables for a wide range of user communities within the marketing organization:
• Marketing scorecards: This provides an omni-channel perspective about a company’s marketing performance at a glance. A data lake truly enables creation of scorecards that present a single version of the truth
• Self-service data preparation: Tools that leverage the data lake enable the more data-oriented and analytical marketers to prepare their data for advanced modeling
• Data science workbenches: Data scientists interested in discovering new knowledge about customers, accounts, segments, use sandboxes that are rapidly provisioned from the data lake
• Ad-hoc analysis workbenches: Enable marketers to slice and dice performance (for example: what is the contribution of different content assets to MQLs for the Americas region over the last three months?) using a visual or SQL-based approach for querying data in the data lake
• Data services: Aggregated customer behaviour can be used to personalize content and navigation for customers. Such personalization is powered by recommendations delivered to front-end applications via API calls against the data lake
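As a sketch of the kind of ad-hoc question mentioned above, the following shows how "contribution of different content to MQLs for the Americas region" might be answered against harmonized lake records. The records, field names and numbers are entirely hypothetical; in practice this would be a SQL query or a visual-tool pivot over the lake rather than in-memory Python.

```python
# Illustrative sketch: MQL contribution by content asset for one region,
# computed over hypothetical harmonized records from the data lake.
from collections import Counter

mql_events = [
    {"region": "Americas", "content": "whitepaper-a", "mqls": 12},
    {"region": "Americas", "content": "webinar-b",    "mqls": 7},
    {"region": "EMEA",     "content": "whitepaper-a", "mqls": 9},
    {"region": "Americas", "content": "whitepaper-a", "mqls": 5},
]

def mql_contribution(events, region):
    """Total MQLs per content asset for a single region."""
    totals = Counter()
    for e in events:
        if e["region"] == region:
            totals[e["content"]] += e["mqls"]
    return dict(totals)

print(mql_contribution(mql_events, "Americas"))
# -> {'whitepaper-a': 17, 'webinar-b': 7}
```

Because the underlying records are harmonized in one place, the same data answers the regional MQL question, feeds the marketing scorecard, and powers the data-science sandboxes described above.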
As should be obvious by now, a marketing data lake provides the ability to turn a chaotic, multi-structured data environment into a flexible, manageable, strategic asset. This is a good example of a use case for a marketing data lake in a B2B environment. In upcoming posts, we will dive deeper into where the data lake sits in the ecosystem, how much it costs, what concrete benefits it delivers, and more.
Essentially, defining the use cases is a task undertaken up front, driven by the business, and involving stakeholders from the user, technology and analytics teams. We must make sure there is compelling business motivation, strong support from the various stakeholders, technical and data feasibility, and a clear plan for how the data lake will be used once it is built.