Insights from DA Hub 2016

DA Hub is the new name for the former Xchange conferences. Most experienced digital analytics practitioners will tell you it is the one conference they never miss. This year there were many tracks and I certainly could not attend them all, but here are some nuggets I took away. Note that in the keynote the comments all come from the speaker; in the Huddles, they can come from anyone in the room.

Keynote with Dean Abbott from SmarterHQ

  • Domain expertise prior to data analysis is what the business needs. Ask the right questions.

  • Everything still has to be shoved into a row/column format, even if the table ends up 1,000 columns wide.

  • Each column represents an attribute, a way to describe something, often the customer (a small sketch follows this list).

  • Cloud computing allows us to solve problems we could not solve before (unless you owned a Cray). More than that, it lets us explore many iterations instead of introducing bias by hand-selecting variables to reduce the computational load.

  • There is still not enough business knowledge in the data. Lots of work still to be done.
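
To make the row/column point concrete, here is a minimal, hypothetical sketch in pandas: each row is a customer and each column is one attribute describing that customer. The column names are invented for illustration; real feature tables at this stage can easily run to 1,000+ columns.

```python
import pandas as pd

# Hypothetical wide customer table: one row per customer,
# one column per attribute (demographics, behavior, engagement, ...).
customers = pd.DataFrame(
    {
        "customer_id": [101, 102, 103],
        "tenure_days": [45, 980, 12],
        "visits_last_30d": [3, 14, 1],
        "avg_order_value": [52.10, 87.45, 0.0],
        "email_opt_in": [True, True, False],
        "last_channel": ["paid_search", "email", "direct"],
    }
).set_index("customer_id")

print(customers.shape)  # (3, 5): rows are customers, columns are attributes
print(customers.head())
```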

From the Huddles

Talent and Innovation

  • Teachers can make good hires since they already have strong presentation skills.

  • One way to keep people is to require they attend two conferences a year and one speaking engagement.

  • Shout-out to the DAA self-assessment tool.

  • Break down barriers with a mentoring program that reaches beyond your department: lunch once a month and calls twice a month.

  • Millennials need recognition more frequently, but it is less about gift cards and more about sincere appreciation, especially from those impacted by the analysis.

  • Vendor presentation "quickies" let firms hear new technology pitches back to back, 30 minutes each.

  • Allow innovation by giving minor funding ($5-$8K) to pet projects and seeing where they go.

Data Science

  • Digital analyst and data science teams are getting bigger but not necessarily working together.

  • First there's growth, then structure, and that structure can smash innovation.

  • Think in terms of roles, not titles or teams.

  • Allow time to presell analysis results.

  • The expected time for a data scientist to return results is 2-3 years (of model building and testing). For a digital analyst it's about 1 month.

Analysis ROI

  • Many recommendations are made, but pushing those recommendations through to implementation can be a struggle.

  • Traction often comes from being aggressive about taking credit for test results, BUT danger lies in extrapolating those results to the larger implementation; that almost never pans out.

  • Circulate every test and the dollar value from it at year end.

  • Consider surveying your stakeholders. Did they use the team's services? Quality? Completeness?

  • Stop doing things you suspect have no value (such as issuing reports that are never read).

  • Use a request tracking tool to help quantify the value your team delivers and also evaluate whether efforts were worth the time.

  • Forrester can be hired to benchmark your team's capabilities within your industry and beyond.

  • Track emails when reports or analyses are sent.

Customer Experience Tools

  • Customer experience tools (Decibel Insight, Clicktale, Crazy Egg) can be used for top-down analysis, which is best married with tracking tools (Google Analytics, Adobe Analytics) for a complete picture.

  • Women tend to scroll more than men.

  • Site engagement can be "gamified" where members score points for taking action.

  • Using customer experience tools alongside NPS lets you see that those posting high scores sometimes struggle as much as those posting low scores.

  • Use customer experience tools to soft launch, apply fixes, launch again and refine, then fully release with minimal issues.

Testing

  • Consider running a dry test, where the new feature or change is offered to the audience but the landing page simply thanks them for their interest and notes it is coming soon.

  • Push product managers to try bigger, scarier tests rather than using testing simply to validate product tweaks.

  • Testing actually has three layers: the presentation layer, the process layer, and the audience layer.

  • There is tension between testing the customer need (for example, an optimized experience) and the business need (conversion).

  • Consider the partial pooling technique when sample sizes are too small (a minimal sketch follows this list).

  • Testing without executive leadership can run rampant.
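
Since the partial pooling tip is easy to gloss over, here is a minimal sketch of the idea, assuming a simple empirical-Bayes style shrinkage of per-segment conversion rates toward the overall rate. The numbers and the prior_strength parameter are invented for illustration; a full hierarchical model is the more rigorous route.

```python
import numpy as np

# Hypothetical per-segment test results: conversions and visitors.
# Small segments have noisy raw conversion rates.
conversions = np.array([3, 45, 1, 210])
visitors = np.array([20, 400, 9, 2000])

raw_rates = conversions / visitors            # no pooling: noisy for small n
overall = conversions.sum() / visitors.sum()  # complete pooling: one rate for everyone

# Partial pooling: shrink each segment's rate toward the overall rate,
# with small segments shrunk the most. prior_strength acts like a
# pseudo-sample of visitors converting at the overall rate (a tuning choice).
prior_strength = 100
pooled_rates = (conversions + prior_strength * overall) / (visitors + prior_strength)

for n, raw, pooled in zip(visitors, raw_rates, pooled_rates):
    print(f"n={n:5d}  raw={raw:.3f}  partially pooled={pooled:.3f}")
```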

Data Product

  • Think of the delivery of analysis as your "data product."

  • How well do you know your stakeholders? Try a "day in the life" exercise.

  • Align insights with existing initiatives, and know that the "inconvenient truth" can be a hard sell.

  • What does the organization think is small change versus a big number?

  • Classic product management cycles can be applied to engineering larger changes. It is the analyst's job to present stories to the stakeholder.

  • Prototype the value of change.

What else? Feel free to add your own insights in the comments below.

See something interesting and need more context? These are just bullet points but I'm happy to help. Just drop some time on my calendar or connect with me on LinkedIn.
