Being Data Driven vs. Data Informed

Pavel Samsonov posted an interesting article on the limits of being “data driven”, and how a focus on being Data Driven leads to outcomes like Nike losing $25B by chasing the things that are easy to measure.

From the article:

Data isn’t worthless. Data is really, really valuable for telling you what has happened in the past. Great expense has gone into producing data that can tell you what’s going on in the present. But as the 7- and 8-figure salaries of quantitative analysts at hedge funds show us, using data to extrapolate what will happen in the future is one of the most challenging things you can try to do with it.

Typically, the way one would do that is by harvesting what’s called warm data — the qualitative data that gives the numbers their meaning — and then using that to tell a story about where the numbers are going.

and

“Validating” what everyone already believes is far easier — but it adds no value. The value of research doesn’t come from elevating people who are already shouting. It comes from finding the people who are not being heard, and adding their voices to the conversation.

Then you can make a real data-driven decision: a decision driven by all the data.

Otherwise, the best you can do is commodity-grade decision-making.

It’s tempting to collect data that you know will validate what you’re already planning to do. That’s not super useful.

But being Data Driven is something we’re supposed to strive toward! The Harvard Business Review has a bunch of thought-leader-y articles on how to do this stuff. And there’s current Literature on how to turn universities into data-driven organizations. It’s hard to resist the pull toward Data Driven-ness!

I’ve been trying for a few years now to identify data that might be used to describe how our online learning platforms and classrooms are being used. My intent is to make these data visible - primarily to get huge chunks of organizational knowledge out of individual people’s heads and into a place where it can be sustainably shared.

My take:

Our work is too complex to be easily described with Dashboards and metrics. Sure, important parts can and should be (classroom utilization, activity patterns in online platforms, aggregated support requests to identify themes, project management, etc.), but huge chunks of our work are organic and inferred and human: things that are difficult to measure, and therefore difficult to crunch into Dashboards or charts.

I think a better approach is to be Data Informed: using what data makes sense to describe historical trends and to identify gaps, but being OK with not having Data on everything, because the work is complex and interconnected and difficult (impossible?) to measure easily. Embracing the ambiguity, but navigating it with the meaningful data that is available, with trust (hiring excellent people you can trust, and who can trust each other, to do their best work), and with collaboration.

Anyway. The article gave me a good and timely sanity check on the Quest for Data.