Leading clearly with data

We all know that data science is critical to the future of work. But what do leaders really need to know in order to leverage data well?

To answer that question, I caught up with Edoardo Airoldi, the Millard E. Gladfelter Professor of Statistics and Data Science at the Fox School and director of Temple University’s Data Science Institute. Here is his advice.

MA: What do executives need to know about understanding data?

EA: Executives don’t need to be in the weeds with data. But they certainly must choose a set of metrics they want to track that tell them about three quantitative aspects of their business: past performance, the efficiency of core processes, and future performance.

Past performance metrics track the outcomes of the business, so they provide insights into whether the company has been doing well. Process metrics track aspects of the business that create value. For instance, a company that depends on customers spending time on its website would track metrics about customer engagement and satisfaction. Future performance metrics provide predictions about the expected value that the business will generate next week, next month and next year.
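The three metric families EA describes can be sketched in code. All metric names and figures below are hypothetical, purely for illustration; the forecast uses a simple moving average as a stand-in for whatever model a real analytics team would use.

```python
# Hypothetical executive metrics, grouped into EA's three families.
past_performance = {"q3_revenue_usd": 4_200_000, "q3_churn_rate": 0.031}
process_metrics = {"avg_session_minutes": 12.4, "nps_score": 46}

# Last four weeks of revenue (illustrative numbers).
weekly_revenue = [98_000, 101_500, 97_250, 103_800]

def moving_average_forecast(history, window=4):
    """Forecast the next period as the mean of the last `window` observations."""
    tail = history[-window:]
    return sum(tail) / len(tail)

future_performance = {
    "next_week_revenue_usd": moving_average_forecast(weekly_revenue),
}
```

The point is not the forecasting method, which here is deliberately naive, but that all three families appear side by side on the same dashboard.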

If I were an executive, those are three sets of metrics I would want to see on a slide deck on a regular basis.

In addition, executives need to make sure that these metrics are measured accurately. In other words, the metrics being tracked must be faithful to the processes they are meant to depict. Proxies may be misleading. So executives really need to rely on someone who can do the statistical analysis on how accurately these metrics reflect what they’re meant to measure. This last aspect of data is one that people are often not familiar with. There is good data and bad data. And the first rule of statistics and artificial intelligence is “junk in, junk out.” So we all need reliable metrics.

MA: I would imagine the same thing could be true about a metric that has been tracked for a long time. Just because you’ve been tracking it doesn’t mean that it’s still an accurate representation of what you want to measure.

EA: Absolutely, yes. There should be a process that periodically reevaluates the metrics being tracked. Do we need to add another metric? Are the historical metrics still accurate in this changing world, pre-COVID or post-COVID? Are these metrics still valid or useful to monitor an evolving business?

MA: How can executives create or encourage a data-driven culture? What is their role?

EA: This is an interesting question. Some companies may have a product manager, for instance, who claims that a process or a product contributed to a 10% growth in revenue. Well, that’s great. But there has to be an internal standard for being able to make that claim. For instance, if the product manager has run an A/B test that demonstrates, beyond any doubt, a causal link between the process or product and the 10% increase in revenue, then they should be allowed to claim 10% of the revenue came from their team. But in the absence of an A/B test or another standard, they should not be able to attribute the 10% increase in revenue to their new process or product. This is particularly important, because being able to make such an attribution claim may unlock additional headcount or resources, proportional to the additional revenue that is being generated.
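The A/B-test standard EA describes usually comes down to a significance test on the difference between two groups. A minimal sketch, assuming conversion counts for a control and a treatment arm (all numbers here are invented), is a two-proportion z-test:

```python
from statistics import NormalDist

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates between two arms."""
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null
    se = (p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (conv_b / n_b - conv_a / n_a) / se
    # Two-sided p-value under the standard normal approximation.
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical experiment: 5% vs. 6% conversion on 10,000 users per arm.
z, p = two_proportion_z_test(conv_a=500, n_a=10_000, conv_b=600, n_b=10_000)
```

A company's internal standard would then say, for example, that only differences with p below an agreed threshold, from a properly randomized experiment, can back an attribution claim.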

Say, for instance, that the claim is based on some kind of regression or correlation analysis that is open to all sorts of confounding variables or issues. Well, then I’d say it’s great that it looks like the new process or product is contributing to an additional 10% of revenue, but we won’t be allocating the resources because you haven’t really demonstrated your claim with evidence that is up to an accepted standard. The best companies do that.

So executives need to set a clear standard for attribution of incremental revenues. To do that, the company has to create a culture where engineers, economists, statisticians and business leaders all work together to make sure that the analytical standard is high. That’s the executive’s job, to raise the bar and set that expectation, and in my experience, things have worked out quite well for the companies that have done that.

MA: For those executives who might be a little data-shy, how can they get more comfortable? How can they learn enough to, let’s say, hire the right people to build this data-driven culture?

EA: One good way would be to make sure that they have analytics teams staffed with statisticians, data scientists or economists, ideally all three—in addition to engineers and people with a machine learning background who are good at AI and writing production code. And it’s not just having that kind of talent, it’s also having them work together.

Another simple and effective way is to engage academic advisors with those sets of skills. They can help the executives hire the right people, or help engineers with the design of the right standards. Oftentimes, if companies are really not familiar with data, they won’t know what questions to ask. So if the company doesn’t have the talent yet, they can turn to outside experts. The Data Science Institute at the Fox School would be a perfect place where they could tap into this kind of talent, for instance. And we strive to serve as a brain trust for all sorts of companies, from start-ups to large companies, in the big tech, financial and biotech space, in the Philadelphia greater metropolitan area and beyond.

MA: What resources should executives be investing in when it comes to data and the future of work?

EA: Be prepared to make investments in storage and online computing resources. Companies may own their own data servers, but there are big questions about what they should store. Depending on the scale, what is not useful today may be key to tomorrow’s business. So keeping as much data as possible is always a good idea.

Executives might want to invest in Microsoft Azure, AWS, Google Cloud or other cloud computing and storage providers. They are not cheap but that’s an expense that needs to be budgeted for. And I cannot stress enough how valuable it is to keep data around.

MA: What pitfalls should leaders be aware of when it comes to applying data and making data-driven decisions at a leadership level?

EA: There are two major pitfalls I see. The first is bad data. The rule of any data analysis, as I said, is “junk in, junk out.” If you measure things incorrectly, or if you forget to account for some special circumstances, that’s bad. In order to avoid it, executives need to make sure that whoever is collecting the data and whoever is analyzing the data talk to each other.
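One concrete form that collector-analyst conversation takes is an agreed set of sanity checks on raw metric records before anything is analyzed. A minimal sketch, with illustrative field names and rules only:

```python
def validate_records(records):
    """Split records into (clean, rejected) using illustrative rules."""
    clean, rejected = [], []
    for r in records:
        if r.get("revenue") is None or r["revenue"] < 0:
            rejected.append(r)  # junk in: missing or impossible revenue
        elif not (0.0 <= r.get("engagement", -1.0) <= 1.0):
            rejected.append(r)  # engagement share must be a valid proportion
        else:
            clean.append(r)
    return clean, rejected

clean, rejected = validate_records([
    {"revenue": 120.0, "engagement": 0.42},
    {"revenue": -5.0, "engagement": 0.10},   # negative revenue: junk
    {"revenue": 80.0, "engagement": 1.7},    # proportion out of range: junk
])
```

The specific rules matter less than the fact that collectors and analysts agree on them, so bad records are caught before they feed a dashboard or a model.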

The second pitfall goes back to correlation versus causation. Data-savvy organizations understand very clearly that regression or correlation or exploratory analysis can give them an idea for what could be happening, but that they require a different, deeper analysis to really quantify the causal mechanism that is driving what they are seeing in the data. So not being able to understand this distinction is another common pitfall that often has disastrous consequences.
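The correlation-versus-causation pitfall is easy to reproduce in a toy simulation. In the invented scenario below, site traffic is a confounder that drives both feature usage and revenue, while the feature itself has zero causal effect; a naive correlation between feature usage and revenue still comes out strong:

```python
import random

random.seed(0)

# Hypothetical confounded data: traffic drives both variables,
# and feature usage has no causal effect on revenue at all.
n = 5_000
traffic = [random.gauss(100, 20) for _ in range(n)]
feature_usage = [0.5 * t + random.gauss(0, 5) for t in traffic]
revenue = [2.0 * t + random.gauss(0, 10) for t in traffic]

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

r = pearson(feature_usage, revenue)  # strong, yet entirely non-causal
```

Only a randomized experiment, or a causal analysis that explicitly accounts for traffic, would reveal that the feature contributes nothing.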

MA: How can executives who want to work with the Data Science Institute get involved?

EA: We are building an industry partners program. We have existing collaborations with Wells Fargo, Google, LinkedIn, Microsoft Research and several other companies, big and small. Our mission is to serve as a brain trust for industry at large, in data analytics, machine learning and AI. Reach out and we’ll be happy to discuss whatever problem you have and see if we can help.

This article originally ran in On the Verge: Business With Purpose, the Fox School’s flagship research publication.