Customer-Centric Data : From Here to Maturity

Earlier this month, I was fortunate to have been invited to moderate a panel at the Digital Analytics Hub in New Orleans.

The panel was focused on ‘Cognitive Analytics’ and included:

Many attendees told me it went very well and that they learned a great deal…which, of course, makes me feel like I did my job and that people were satisfied with the content.

If you’re not familiar with the event, and you’re somewhere in the vast marketing technology landscape or represent a budget dedicated to installing it, this would be a fantastic time to get in on what is still a small conference dedicated to practicum and applied program development. All plugs aside, I really enjoyed every event and huddle I attended (so a thank you to Mattias, Michael, and their teams is in order for the invitation and all the hard work of executing it).

The goal here isn’t just to share my appreciation for the conference, but to point out a gap in how companies are currently collecting, organizing, and making use of data. To date, I have not seen any comprehensive guide published that takes a company from its very first installation of Google Analytics or Adobe (Day 0) to an operational customer-centric ecosystem.

Based on the conversations in the huddles I attended, this is probably due to a number of reasons, such as:

  1. Some hesitation about learning or using SQL, JavaScript, server-side code, or any code for that matter (a short illustrative query follows this list).
  2. A known or unknown lack of governance surrounding the data generated by users.
  3. Shortage of stewardship in organizations dealing with large and complex data systems.
  4. Deficiencies in standards for collection and interoperability, as well as skilled resources to commit.
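
To make the first point concrete, here is a minimal sketch of the kind of query most digital analysts actually need day to day. It is written in Python against an in-memory SQLite table; the table, columns, and rows are invented for illustration and are not a real Google Analytics or Adobe export.

```python
# Hypothetical sketch: counting distinct users per acquisition channel.
# The schema (events / user_id / channel / event_name / event_ts) is an
# assumption made for this example, not a real GA or Adobe export format.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE events (user_id TEXT, channel TEXT, event_name TEXT, event_ts TEXT);
    INSERT INTO events VALUES
        ('u1', 'organic', 'page_view', '2017-10-01'),
        ('u1', 'email',   'purchase',  '2017-10-03'),
        ('u2', 'paid',    'page_view', '2017-10-02');
""")

# Distinct users per channel: roughly the level of SQL in question.
for channel, users in conn.execute("""
    SELECT channel, COUNT(DISTINCT user_id) AS users
    FROM events
    GROUP BY channel
    ORDER BY users DESC
"""):
    print(channel, users)
```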

In the past, to deal with situations like this for online analytics, Cardinal Path developed an audit and assessment model called the OAMM (Online Analytics Maturity Model). This was advanced by Stéphane Hamel, a pioneer in the space and an outstanding contributor.

What the OAMM provided was a benchmark for where your company stood, relative to the ‘state of the industry’, in the process of using online data to make marketing and advertising decisions. That model was useful and valuable, and it helped set realistic targets for a digital measurement investment over a reasonable engagement. The problem is that it’s 2017, we’re moving into new digital measurement territory, and the OAMM can no longer effectively address the diversification of the digital analytics spectrum. It does not account for the dawning, broad implications of artificial intelligence, machine learning, and the real-space journey measurement necessary to drive programs.

To begin to address this deficiency, we have started the process of integrating into our offerings an audit that draws on the OAMM, but we’ve evolved it to focus on a ‘Customer-Centric Organization Maturity Assessment’ (or COMA).

This essentially adheres to the following scope questions:

  1. What is the quality of the technical implementation of the data collection tools involved?
  2. How is the organization prepared to manage customer-oriented, diverse application data?
  3. Is there a DMP, EDW, CDW, data lake, or other queryable storage facility (cloud or on-premises)? (A short sketch of what this enables follows the list.)
  4. What types of technologies are tethered to whatever central storage/single source of truth exists?
  5. Are the people working with these systems making efficient use of the whole martech stack?
  6. Can this organization benefit from peripheral tools, modules, or resources that can leverage its data?
  7. Is there any initiative which uses customer data in some automated or intelligent fashion?
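
As a rough illustration of what questions 3 and 4 are getting at, the sketch below joins a hypothetical web analytics export with hypothetical CRM records on an invented customer key; every column name and value here is an assumption made for the example, not a prescribed schema.

```python
# Hypothetical sketch: one joined, queryable view of behavior and value.
import pandas as pd

# Invented web analytics export (sessions per customer per channel).
web_sessions = pd.DataFrame({
    "customer_id": ["c1", "c1", "c2"],
    "channel":     ["organic", "email", "paid"],
    "sessions":    [3, 1, 2],
})

# Invented CRM export (value per customer).
crm_orders = pd.DataFrame({
    "customer_id": ["c1", "c2"],
    "lifetime_value": [420.0, 75.0],
})

# Join on the shared customer key and summarize value by channel,
# the kind of single-source-of-truth view the assessment probes for.
joined = web_sessions.merge(crm_orders, on="customer_id", how="left")
print(joined)
print(joined.groupby("channel")["lifetime_value"].mean())
```

The point isn’t the specific tooling; it’s whether a single, joinable view of customer behavior and value exists and can actually be queried.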

What we try to do with this is determine, through a series of fairly simple assessments, whether investment in analysis of complex concepts like attribution, content valuation, and event prediction makes sense, and even whether the company is ready for more advanced ‘360° Journey Sciences’ (a term we’re coining for the specific subset of Data Science concerned with how customers navigate decision trees to achieve an objective).
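
To ground one of those concepts, here is a minimal, illustrative comparison of first-touch and last-touch attribution over invented customer journeys; a real program would pull journeys from the centralized store discussed above and use far richer models.

```python
# Hypothetical sketch: first-touch vs. last-touch attribution.
from collections import Counter

# Each journey is the ordered list of channels a customer touched before
# converting. The journeys are invented for illustration.
journeys = [
    ["organic", "email", "paid"],
    ["paid", "email"],
    ["organic", "organic", "email"],
]

first_touch = Counter(j[0] for j in journeys)   # credit the first channel touched
last_touch = Counter(j[-1] for j in journeys)   # credit the last channel touched

print("first-touch credit:", dict(first_touch))
print("last-touch credit: ", dict(last_touch))
```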

We feel this helps us take a proper measure of how ready a company is to take on the really complex work, sets it up to bite off the pieces it can handle, and informs, from a practical perspective, what level of investment is sensible given the resources available. This greatly increases the chances that any initiative has a high rate of success and impact, along with the visibility and documentation needed to deliver progress on a timeline.

Data Science and digital intelligence are powerful tools, but they are complicated and require teams of people aligned to grow and support a successful program.

Please feel free to contact us directly to have a conversation about whether this makes sense for how you’re planning to attack 2018. We would love to build use cases and report back on how effective this approach is at laying the building blocks of a valuable customer-centric data program. We’d like to know what your challenges are, where you WANT to go, and how we can help you get there.

*(A naming credit here is also due to the University of Pennsylvania’s Mask & Wig group of 1999, who performed a comedy show I attended with the late Brett D. Hellerman, an inspiring dilettante.)