As the accessibility of data continues to grow in nearly every area of business, the expertise of Haitao Li in supply chain modeling, management science and operations management is in high demand.
He’s recently been called upon not just in his research efforts and instruction at the University of Missouri–St. Louis but also for talks with groups across a variety of industries. In April alone, the associate professor of supply chain and analytics gave presentations at Monsanto, the University of Missouri–Columbia and UMSL. He also co-presented a webinar for Gurobi Optimization on “Labor Strategy Optimization for the Professional Services Industry.”
The webinar featured a mathematical programming model developed by Li and co-presenter Cipriano Santos during their years of consulting work with Hewlett-Packard Laboratories. This optimization model uses prescriptive analytics to produce a staffing plan that meets target revenues.
Following this presentation, UMSL Daily sat down with Li to glean information on the importance of data-driven decision support in workforce planning.
About what percentage of a company’s expenditures goes to labor costs?
It’s huge, especially for the professional services industry. You can imagine that it is different from a manufacturing company, where lots of the assets are equipment, hardware and facilities. For service companies, their main assets are people, often with multiple skills. It’s definitely the largest portion of spending for a service company and the service industry.
How much is inefficient labor management costing employers?
It can also be huge. For large international service firms, the inefficiency happens during planning and matching. This can have serious consequences when projects come in that have not been properly staffed ahead of time. The firm has to resort to a contingent workforce or a third-party contractor, which is often more expensive. For companies, their internal workforce is a sunk cost, meaning they have already paid for it. One objective of this application is maximizing utilization of the internal workforce to reduce extra spending on an external workforce. That’s why the planning needs to be done ahead of time. It has a significant impact.
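The principle Li describes, treating the internal workforce as a sunk cost and paying for external capacity only when demand overflows, can be illustrated with a minimal Python sketch. All names and figures here are hypothetical and are not drawn from the HP model.

```python
# Illustrative sketch only: fill project demand from internal staff first,
# then from a pricier contingent workforce. Figures are invented.

def staff_projects(demand_hours, internal_capacity, contractor_rate):
    """Return (internal_hours, contractor_hours, extra_spend).

    Internal hours are a sunk cost (already paid for), so the only
    incremental spend is whatever overflows to the contractor.
    """
    internal = min(demand_hours, internal_capacity)
    contractor = demand_hours - internal
    return internal, contractor, contractor * contractor_rate

# 1,000 hours of demand, 800 internal hours available, $150/hr contractor
print(staff_projects(1000, 800, 150))   # -> (800, 200, 30000)
```

The point of planning ahead is visible even in this toy: every internal hour left idle while a contractor is billed is pure extra spend.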
Layoffs appear to be a simple solution to reducing labor costs. But what new problems do job cuts present?
Just cutting people is often not the most cost-effective approach, because when you cut people, you either need to retrain or reskill the remaining workforce to fill those job positions, which can be costly. There are several additional aspects related to simply cutting people. I talked with several researchers on the issues related to morale. Even for employees who remain, there is a negative impact on morale because people will be sensitive. Even if they are not cut, they may think, “Will I be in the next round?” That actually hurts. This is internal, but let’s not forget the detrimental effect externally. What about the social image of the company? I certainly would not go that route if not absolutely necessary.
I think the opportunity is that as companies have more and more data, data-driven prescriptive analytics may help companies better optimize their workforce capacity, capability and cross-training needs.
What are some of the other main inefficiencies you’ve seen in the way companies manage their labor force?
The approach at many companies is highly accounting-based, often using spreadsheets. They certainly understand the importance of planning. Different companies have their own way to delineate what resources will be needed for their production plans and to meet demand. But I think the gap is that many companies are doing that in a manual, trial-and-error way. Where these decision-support applications can help is to really automate that process. The recommended solution is obtained through a rigorously designed optimization model, so decision-makers will have more confidence.
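The contrast Li draws between spreadsheet trial-and-error and automated optimization can be sketched with a toy exhaustive search. A real solver such as Gurobi handles vastly larger problems with smarter algorithms; here the roles, rates and penalty are all invented for illustration.

```python
# Toy version of "automate the trial-and-error": enumerate every candidate
# staffing plan and keep the cheapest one that covers demand. Hypothetical data.
from itertools import product

ROLES = {"analyst": 60, "engineer": 90}   # hourly cost per role (made up)
DEMAND = {"analyst": 3, "engineer": 2}    # headcount needed per role

def plan_cost(plan, hours=160):
    """Monthly cost of a staffing plan, with a penalty for unmet demand."""
    cost = sum(plan[r] * ROLES[r] * hours for r in plan)
    shortfall = sum(max(DEMAND[r] - plan[r], 0) for r in plan)
    return cost + shortfall * 50_000      # penalty makes shortfalls unattractive

# Try every plan of 0-4 heads per role -- what a spreadsheet user does by hand.
candidates = [dict(zip(ROLES, combo)) for combo in product(range(5), repeat=len(ROLES))]
best = min(candidates, key=plan_cost)
print(best)   # -> {'analyst': 3, 'engineer': 2}
```

The value of the automated version is exactly what Li notes: the recommendation comes from a systematic search rather than whichever scenarios the planner happened to try.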
Are these tools the best way to determine if labor force costs are beneficial or inefficient?
A tool will never replace the process of human decision-making. A model always and only provides decision support. When we implement the model or a tool in the real world, we need to validate the solution. I would start with some sort of benchmarking. Let’s take this labor strategy approach as an example. If you compare our approach with a spreadsheet-based approach, you immediately see the benefit and the amount of improvement in the financial metrics.
But for a problem with more complexity or multiple criteria to focus on, the decision-maker may need to play around with more scenarios, varying the input data or the relative importance of the different metrics, just to test whether the model’s solutions behave properly.
For companies that don’t have proper workforce planning tools in place, where is a good place to start?
The good news is that right now there are indeed many workforce planning tools. You can certainly do a search and find tons of off-the-shelf commercial tools and software for workforce planning and scheduling. Most of those tools have very easy-to-use interfaces. Pretty much all the user needs to do is prepare the data, put it in and then it generates solutions. Certainly, this is a good starting point, but it can be costly.
I want to emphasize the gap. You may ask why a company can’t always rely on this software. Certainly, we have new problems like this labor strategy optimization. This is a very unique problem we identified at HP, but it can extend to other companies in the industry. With the increasingly competitive market environment and globalized economy, we are looking at new problems and new needs for decision-makers. That’s why I see the gap, the limitations, of many off-the-shelf software packages. Often the company or industry may have new needs not addressed, modeled or handled by off-the-shelf software, or it may take the vendors a long time to adapt. That’s why I’m excited about what I do and what our supply chain and analytics program does here.
In an increasingly global economy, what new complexities arise when decision-makers consider offshore operation models?
We talk about offshoring a lot. Offshore is a well-known model and practice in industry for many positive reasons. What our study reveals is that the decision on offshoring can be data-driven. In our study and model, we showed that the amount of workforce we send offshore depends on the offshore risks, plus a wide spectrum of attributes including workforce skills, rank, pay rate and geographic location. We have a graph showing how the optimal labor strategy varies in response as the offshore risks vary.
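A hedged sketch of that risk-versus-rate trade-off: suppose the effective offshore rate is inflated by a risk term that grows with the offshore share (a stand-in assumption standing in for coordination overhead, delays and attrition, not the study’s actual model). Then the cost-minimizing offshore share falls as risk rises, mirroring the kind of graph Li describes. All rates are invented.

```python
# Toy offshore decision: choose the offshore share that minimizes blended cost.
# Rates and the risk model are hypothetical, not the study's figures.

def expected_cost(s, onshore_rate=100, offshore_rate=40, risk=1.0):
    """Blended hourly cost for offshore share s in [0, 1]. The risk term
    grows with the share, modeling coordination overhead, delays, attrition."""
    return (1 - s) * onshore_rate + s * offshore_rate * (1 + risk * s)

def best_share(risk, step=0.025):
    """Grid-search the share in [0, 1] that minimizes expected cost."""
    shares = [round(i * step, 3) for i in range(int(1 / step) + 1)]
    return min(shares, key=lambda s: expected_cost(s, risk=risk))

for risk in (0.5, 1.0, 2.0):
    print(risk, best_share(risk))   # higher risk -> smaller optimal offshore share
```

Running the sweep shows the optimal share dropping from 1.0 to 0.75 to 0.375 as the risk parameter grows, which is the qualitative behavior the study’s graph is meant to convey.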
I think we need to let people know, if you have data available, we could probably help you make better offshore decisions.
What are some of the common offshore risks?
A common offshore risk involves time-zone differences. We do offshore a lot with India and China. This is a significant issue because China is a 13-hour difference – it’s almost the opposite. Coordinating teams all over the world is not simple. I have friends who are doing that. They work with HP and other international Chinese companies. They have to work with their teams in the U.S., which requires them to work either at midnight or very early in the morning. Guess what kind of productivity they have? So we are talking about time zones, time spent coordinating those meetings and the potential disruptions or delays due to lower productivity. In addition, we may encounter higher-than-average attrition of resources.
On the flipside, what are the benefits of offshore operations?
Certainly, labor cost is one consideration. Traditionally, the U.S. is probably higher than the others, but the new trend is that we are actually not. Many industries in China and India are really catching up. You can see a significant increase in their salary. Cost is becoming less of an incentive.
In addition to cost, we are also looking at skills. Many locations have that critical mass of IT people who are well trained and speak fluent English. Also, closeness to the market is certainly a factor. For some of the projects originating in one region, you probably want to be in that vicinity.
So rather than generalizing the offshore risks and benefits industry-wide, it’s better to look at data of individual companies, correct?
Exactly. I think that’s where data-driven optimization comes into play. It’s really hard to tell. We all know the benefits, but we also understand these increasing costs. Cost may not be a good driver anymore, not to mention those risks. How do you factor that? How do you calibrate your decision? That’s exactly where optimization modeling and decision support becomes valuable to best trade off all these factors.