Kevin Micalizzi: Today we'll be discussing forecasting with Jason Jordan. He's a partner at Vantage Point Performance. Welcome back, Jason.
Jason Jordan: Thank you. Thanks for having me.
Micalizzi: So, Jason, for our listeners who haven't had a chance to hear you speak or to see your work, would you share a little bit about yourself?
Jordan: Sure, I'd be happy to. So I'm Jason Jordan. I'm a partner with a company called Vantage Point. And we train and develop sales managers. So we're a sales training company, but focused exclusively on the population of sales managers and sales leaders.
Consequently we do a lot of research, a lot of thinking, a lot of writing, which you might have seen if you've spent some time on Salesforce's website. We look at issues that affect sales management. So pipeline management would be one. Coaching obviously is an issue. How to use CRM data to better assess and manage sales teams.
And then forecasting's right in there as one of the topics that we've done a lot of research on and find a lot of interest in. And so I'm thrilled to join you today and talk about this.
Micalizzi: Excellent. Before we jump into it, I'm Kevin Micalizzi, product marketing senior manager at Salesforce and the executive producer of the Quotable podcast. And I'm joined by one of my favorite co-hosts, Lynne Zaledonis, VP of Product Marketing at Salesforce. Welcome back, Lynne.
Lynne Zaledonis: Great. Thanks so much, Kevin, for the warm introduction. I bet you say that to all your co-hosts. And, Jason, so happy to have you back again. And I'm excited to be talking about forecasting with you today, so thanks for joining us.
Jordan: Yeah, thanks, Lynne.
Micalizzi: Definitely. So, Jason, what exactly do we mean when we say forecasting? Because I think a lot of folks mix up several activities and call it all forecasting.
Jordan: That's a great question. And it's a good question to start this conversation. People use a lot of terms in the sales force, and they don't always mean what we want them to mean or expect them to mean. So coaching is one that has a lot of different meanings. Sales process is another one that has a lot of meanings. And forecasting is surely one that, depending on who's saying what, could mean nothing or everything at the same time.
The biggest issue and confusion that we see with forecasting is it's often used as a synonym for pipeline management, and pipeline management's often used as forecasting. So the first thing that we do with companies when we engage to try to work on the skills of their sales management is to try to pull those two things apart.
And the difference in our minds is that forecasting is really about trying to predict the future. It's kind of obvious when you say it that way.
But we have to contrast that with pipeline management, which is about building a healthy sales pipeline and helping sales reps get those deals from beginning to end so you increase the win rate within those deals.
So if you think about pipeline management as building a pipeline of deals and trying to win them and forecasting as trying to predict the future, it really pulls the two activities apart and makes it much easier to do both better.
And the easiest way to think about what is forecasting is if you're trying to estimate the size of a deal and assign a close date to it and put a probability on it, you're forecasting. And if you're digging into the details of an opportunity and trying to strategically think about how you're going to position yourself against the competition or what the next call should look like, then you're managing your pipeline. It's a meaningful distinction that trips a lot of people up.
Zaledonis: Yeah, but aren't those two practically synonymous? I'm going to guess that what drives your forecast accuracy is strong pipeline management. So I'm sure these two go hand in hand.
Jordan: They do go hand in hand. And the reality is they tend to happen during the same conversation. So you'll schedule a pipeline review. As a manager, you'll schedule a review with your sales rep. And you'll sit down and you'll talk about the sales pipeline. You go through the opportunities. And while you're discussing strategy for each opportunity, you're also updating the data in Salesforce.
You're updating the close date. You're updating the probability if you're inputting that as a salesperson. And it's hard to pull the two apart, because they often happen simultaneously. And a big reason people even have sales pipelines is so they can have accurate forecasts.
So it's logical that people think about them in the same thought, but it's useful to distinguish between them, because you can do them both better if you really understand what the goal of each is.
Zaledonis: So talk a little bit more about that. So separating them makes them stronger?
Jordan: No doubt. Forecasting is something that has to be done. It's critically important, especially at the senior levels in the organization, because forecasts are used for planning and communication outside the organization and within the organization for investment decisions.
There's a lot of reasons you have to have an accurate forecast or else you can't run your business. And that's just separate from the idea of managing the sales pipeline and trying to win more deals.
We have some research that shows that companies that are really effective at managing the sales pipeline grow their year-over-year revenue at a 15% greater rate. And it makes sense, because they're focused on winning deals and really building healthy pipeline.
And companies that forecast better are more accurate and more likely to hit their target as well. But obviously it's iterative, because you're increasing your forecast or decreasing your forecast continually based on what you see coming down the pike.
So I like to think about the difference this way: imagine you're the captain of a ship. Pipeline management is captaining the ship: you're trying to make sure that everything's going right to get the ship to shore as efficiently and as safely as possible. And forecasting is estimating your time of arrival.
If you're not managing the ship well, you're not going to get there in time. But if you spend all your time forecasting, you're not managing the ship. So they do go hand in hand and one surely affects the other.
Micalizzi: So, Jason, I had seen Vantage Point had published in your research that 69% of salespeople don't trust their own forecasts. And I also saw that the “State of Sales” research report from Salesforce identified 79% of teams as currently using or planning to use sales analytic technology.
So if 69% are not trusting their forecasts, but 79% are using some kind of analytics technology, what are they doing wrong? Where's the disconnect here?
Jordan: No forecast is ever going to be perfectly accurate. It's like a weather forecast; it's never going to be exactly right. The issue is getting it as close as possible.
There are a couple things going on. First is the technology and then it's the methodology. And so in the research we did on forecast accuracy, the number-one driver of forecast accuracy we saw in our research was that it's technology-enabled. And this makes sense. Forecasting is a very analytic activity, perhaps the most analytic activity that a sales team engages in, because you are using numbers and calculations and assumptions, and you're trying to get to a number or a range of numbers, which is your forecast.
And so if you're not investing in a technology to do better forecasting, and you're trying to do it through email or Excel spreadsheets, and you don't have an analytic engine that's capable of pulling the data together and analyzing it and doing some correlations, it's kind of hard to really even manage the process. Technology is almost the infrastructure that enables it. And as I said, the research we have shows that if you have a lot of technology you've invested in properly, it's the biggest influence on forecast accuracy.
But then there's the methodology piece of it, and this is where I think a lot of companies fall down and why I think there's so little confidence in the forecast, because the third biggest driver of forecast accuracy in our research was having clearly defined terms for forecasting.
And that's one of the biggest issues we have. Think about it. You're a salesperson, you're a sales manager, you're asked to put in a forecast. And a lot of times those terms aren't clearly defined. What is a committed deal? What does that mean? What is a qualified opportunity? What does that mean?
So the technology can be as good as it can be, and getting better, but if a salesperson is putting in numbers, and they know they're just making them up, then they don't have, nor should they have, great confidence in what it is that they're doing.
So both of these things have to go hand in hand. You have to have the technology that enables the process. Then you have to have the process itself, which is clearly defined and the expectations are set and salespeople know what a qualified opportunity is. They know what they should put in as a committed deal.
And when those two things work together, then it's a great picture. But if you underinvest in either one of those, then you get kind of shoddy forecasting that comes with bad assumptions.
Zaledonis: I can remember that being one of the most painful things when I was in sales. So as you know from my title in the beginning of this, I'm in product marketing, but I spent over a decade in the sales organization, both as a rep and as a manager. And it's that Friday, the forecast is due, and I'm walking around to cubes or texting people, like, what do you got? What do you got for me?
How do you drive that adoption? What makes people proactively keep a tight forecast or have that information available?
Jordan: I can totally relate to your story. The last time I had a full quota-carrying role where I was dedicated to sales — I mean, we're all in sales — but the last time I had a fully dedicated sales role, I was actually VP of sales. I managed a very large account, the second-largest account in the company actually, and we were doing about $7 million a year in revenue.
And my boss would call me on the phone and say, "Hey, what's your forecast for next year? I've got to get this in." With no preparation.
Zaledonis: In a second, yeah.
Jordan: I didn't know his phone call was coming. I'd say, "It's $7 million, right? Because that's what we're doing now." And he'd say, "Are you sure it couldn't be a little bit higher? Seven and a half? Yeah, you know, I'm trying to get these numbers just right. Eight?" And I'd say, "No."
Zaledonis: Keep going, keep going.
Jordan: That was how the forecast got done for the second-largest customer in this pretty large company, a publicly traded company, and that forecast ultimately went to Wall Street somewhere.
And so part of the issue was process. And this is the same whether you want to have success with coaching or whether you want to have success with managing your sales pipeline. You have to have a process. And it has to be done with some rigor.
And so rather than calling me on Friday afternoon at 2 o'clock and trying to beat my forecast higher, we should have had a regularly scheduled meeting where the agenda was forecasting. And it doesn't have to be time consuming. It could be 10 minutes.
But it needs to have some structure around it so that I can say, "Okay, here's how I came to my forecast. I took my sales this year and there's some reasons I moved it up and some reasons I moved it down, and here's what it is."
And then if the assumptions in the process of forecasting are laid bare, then an interesting thing happens. It turns that activity into a coaching activity, because then my boss could have said, "Oh, well, so you moved it up a half-million dollars because of this. Talk to me about that." Or "You discounted your forecast by $700,000 because of this assumption. Talk to me about that."
And so that's when people become accountable to the forecast. They begin to have confidence in the forecast. And when the forecast generally improves is when there's some structure and definition around it. And there's structure and definition around the activity itself so that it does become something people can see and have confidence in, and more importantly, start to test the assumptions and do something approaching coaching. Forecasting can become a very valuable organizational activity as opposed to what was very frequently administrative.
Zaledonis: In my role I talk to a lot of different customers, and talking about what their needs are and what their business processes and practices are. And it's funny.
You kind of talked about the two different categories of forecasting that I hear a lot. Some is that top-down, where management says, "This is your forecast for the year," or what your quota is for the year. People tend to divide that by four, and then that becomes their forecast for the quarter.
And then others where it's bottom-up, where you're taking a look at what you have in stage five perhaps, and then that becomes what your forecast is.
What do you see? What are people telling you, and what do you see? And weigh in with your expertise on those, because I'm sure people would love to hear your thought on those different approaches.
Jordan: There's just so much confusion about this. Again, I think the sales force is less defined than any other part of the organization. You couldn't walk into a manufacturing facility without everything being documented, at least not anymore. You couldn't walk into the finance function without seeing all the numbers in the right columns and rows.
Sales just isn't that way. There is a lot of confusion around this in most organizations. We have a training program just on sales forecasting. In fact, I've done pieces of it at Dreamforce for the last few years in a workshop format.
And when you say we're going to talk about the forecast, people will often stop you and say, "Well, wait a minute. What's the forecast? Are we talking about our target? Are we talking about our budget? Are we talking about our forecast? And if it's forecast, which forecast? Is it our blood commit forecast or our stretch forecast?"
One I just heard the other day was the stars-aligned forecast. If "When You Wish Upon a Star" is playing in your head when you're doing your forecast, you're probably off to a bad start.
So I don't think there's a best way to do it. If the forecast is important for your organization, you should probably do it from all the directions. You should have your salespeople forecasting if they have useful information from the field and customers. You should probably have a centralized forecasting function. You should probably have sales management putting reality checks on what's coming from the field.
And I think, generally speaking, the best practice is to get input from a bunch of places. But in reality, what happens is input comes from a bunch of places and people just pick the one they like the best. So it's a little bit of an organization decision.
The best rule I ever heard for whether or not salespeople should be forecasting — because the vast majority of salespeople, certainly in business-to-business sales forces, do forecast — came from a client of mine named Jim. I always call it Jim's rule. So I asked Jim, "Why do you even have the salespeople forecast? It consumes a lot of time. And I feel pretty comfortable that someone back at the home office could probably do a companywide forecast as effectively as you could rolling it up from several hundred salespeople."
And he said, "Well, we just have a belief that the salespeople are closest to the customer. And they're closest to the market. And they have information that people at central headquarters don't have."
And so that's my test. If the salespeople literally have information about customers and accounts that no one else can have, then they have to be a part of the forecasting process. But at a lot of companies, that's not the case.
So it's kind of a company-by-company decision. But I think there's just a commonsense approach to it that's often missing.
Micalizzi: When you're looking at the way you approach your forecasting, some do it by territory, some by account, opportunity, product. When is it appropriate to use each one?
Jordan: It's a great question. And part of our forecasting program — actually, our training program — is helping people understand what to base your forecast on.
The primary ones you just laid out. You can base it on individual opportunities. You can base it on accounts. You can base it on territories. And I actually draw a distinction between basing it on individual opportunities and basing it on a pipeline of opportunities.
So you can do forecast as an individual opportunity. You can say this opportunity has a 40% chance of closing. Or this other opportunity has a 25% chance of closing. And salespeople put those on individual deals.
Or if you have a lot of deals simultaneously and you have a sales pipeline, ideally you'd get to the point of pulling historical data and just knowing that we win 47.5% of deals at this stage that look like this.
Then you can kind of take it away from the individual deal and start putting it in pipeline percentages with historical averages. And those methods make the most sense, as you might expect, when your sales team is pursuing individual deals: opportunities with a beginning and an end. You identify a lead, you go through stages, and then you win or lose it.
And that's the predominant forecasting methodology. In one of our research studies, 84% of the companies were basing their forecast on opportunities. They weren't distinguishing between an individual opportunity and a pipeline, but it was opportunity-based.
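The stage-weighted pipeline method Jordan describes can be sketched in a few lines. The stage names and win rates below are purely illustrative (only the 47.5% figure echoes the conversation); in practice, each rate would be derived from your own historical close data.

```python
# Stage-weighted pipeline forecast: multiply each deal's value by the
# historical win rate for its current stage, then sum across the pipeline.
# Stage names and rates here are hypothetical, not from the episode.
HISTORICAL_WIN_RATES = {
    "qualified": 0.10,
    "proposal": 0.25,
    "negotiation": 0.475,  # "we win 47.5% of deals at this stage"
}

def weighted_pipeline_forecast(deals):
    """deals: iterable of (dollar_value, stage_name) pairs."""
    return sum(value * HISTORICAL_WIN_RATES[stage] for value, stage in deals)

pipeline = [(100_000, "qualified"), (50_000, "proposal"), (200_000, "negotiation")]
print(weighted_pipeline_forecast(pipeline))  # 10,000 + 12,500 + 95,000 = 117500.0
```

The point of the historical rates is that they replace each rep's deal-by-deal optimism with an average observed over many similar deals.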
But a lot of salespeople don't sell in that way. They don't have individual opportunities. So they may be managing a territory with 200 accounts and probably 200 customers and maybe another 150 prospects. And so they're assigned this territory, whether it's geographic or a kind of virtual territory, and then the business just comes. They're repeat customers or whatnot.
And we'll see companies try to shove that into an opportunity-based or a pipeline-based forecasting methodology. And that's when things start getting really crazy, because it just doesn't fit the way the salespeople are selling. They're managing accounts, they're managing relationships, and the deals just kind of flow through. And in that case, it makes more sense to do more of a trend analysis than assigning probabilities to accounts.
And then there are other salespeople, as you point out, that have major accounts like I did in my last role I mentioned, where I had this one account that was $7 million. In that situation it might not make sense to do that either, picking opportunities. It might make sense to take last year's performance and apply some assumptions to growth in the company or changes in the company strategy. You know a lot about the company, and at that point, you would base your forecast on the account.
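The account-based approach can be sketched just as simply: start from last year's revenue and apply explicitly labeled adjustments, so that each assumption can be surfaced and challenged in a coaching conversation, as Jordan suggests earlier. All names and figures below are made up for illustration.

```python
# Account-based forecast: last year's revenue plus a set of named,
# signed adjustments. Keeping each assumption labeled lets a manager
# ask "talk to me about that" for any single line item.
def account_forecast(last_year_revenue, assumptions):
    """assumptions: dict mapping reason -> dollar adjustment (+/-)."""
    return last_year_revenue + sum(assumptions.values())

assumptions = {
    "new product line rollout": +500_000,   # hypothetical upside
    "customer budget cuts":     -700_000,   # hypothetical downside
}
print(account_forecast(7_000_000, assumptions))  # 6800000
```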
And so it really comes down to the nature of the sale: whether a salesperson is pursuing individual opportunities, managing relationships within a territory, or managing rather large strategic accounts. The nature of what they're doing should dictate the type of forecasting they're doing.
But again, most companies try to shoehorn the forecast into stages in a pipeline, whether or not it's relevant.
Zaledonis: I have a lot of friends who each have a single account, and I'm very familiar with how their deals work. It's like, how much can you get out of XYZ company this year? And that's their forecast, right? That's how they gauge it, because they provide a service or a product that's recurring to that particular company. That's interesting.
But like here at Salesforce, we have a little mix of both. There's some people that have one account that they will work the entire year. And then we have our small business team that's closing 30 deals a month.
How do our executives at our company blend those two types of forecasts? What do you recommend for people in that situation?
Jordan: Well, I think you need to have multiple forecasting methodologies. And this is where you really have to think about how much you want to invest in forecasting. Just in the same way that each of those people may need a different sales process, and so one sales role may need a process that helps them manage individual opportunities. Another sales role may need a process, an account management process that helps them strategically align with individual accounts. They might need different forecasting methodologies as well.
Or even if you're a sales role where you pursue individual opportunities, you may need two different pipelines as well. You may want to track separately a pipeline of new opportunities versus a pipeline of opportunities from repeat customers. Or you may want to have a pipeline of new product sales versus existing product sales.
So you can really break it down and get as specific as you want to. It's just an issue again of how much you want to invest in the process, how important it is to the organization, how many roles you have, how differentiated they are. But I think if you think about the forecast kind of aligning in the same way that sales processes align, one company needs a territory management process, another role may need an account management process, another role may need a pipeline management process. That's where you can start thinking about, how do we do our forecasting?
Because all of this, the sales process, the forecast, the management, the coaching, it all really needs to be driven by the nature of the sale and the nature of the role that you're managing.
Micalizzi: I want to dig a little bit more into the how of forecasting. It seems, in most cases, like there's an expectation that sales managers are just going to extract data. Like that manager who called you and said, "Hey, what's your forecast? Oh, well, you know, it doesn't really line up with what I want to see."
How should they be taking that data from all their reps and pulling it into something coherent and hopefully useful?
Jordan: There's a little bit of irony here, which is that we invest in technology — Salesforce — and in analytics and tools that sit on top of it. If you looked at the entire ecosystem, there's a lot there: plug-ins that create forecasts or pipeline reports.
One of the values of having that type of tool is that it keeps real-time information. You kind of expect the salespeople to keep the data updated continuously, and when something changes with an opportunity, they update the system. And when something changes in an account, they update the system.
And so that's a little bit why I remain vexed at how much time people spend forecasting. If the technology is structured properly and if the definitions and terms and the expectations are set properly, it should be something that's pretty much updated in real time at all times.
And it really shouldn't be that much of an activity. And yet we ask our sales managers to meet with our salespeople and update the forecasts, when in reality they should just be pressing the button that generates the forecast.
So it comes back to this idea of, what's the methodology and what's the technology and how do those two work together? And if you're primitive enough — and I don't think this is a smart way to go about it — but if you're primitive enough that you haven't invested in a CRM tool or any kind of technology to help you enable the forecast, you're literally doing this in an Excel spreadsheet or on the back of an envelope, then you really have to think about what the methodology is and clearly define it.
If you have the technology and the methodology and all of the analytics are embedded in that, then you really just need to make sure the real-time information is put in and that the salespeople understand the impact of putting bad information in, and there's some accountability for keeping the information up-to-date and accurate.
So depending on where you are on the spectrum of leaning on the technology or leaning on the methodology, this can be a very simple activity in the field, one that doesn't consume a lot of time, because the technology should be doing the heavy lifting for you.
It's another thing where I'd love to have a simple, clean answer, but sales forces these days are sophisticated. They have specialized roles, which they should have, because they're focused on very specific segments of customers and different buying patterns, whether it's transactional or consultative, long sales cycle or high velocity. It gets kind of messy.
Zaledonis: Yeah. It's funny you said — we were talking about the how, and I get what you're saying, that it can be clean and fast and it doesn't need to be a big time suck. But what about that art of it, the science versus art thing? There's always that subjective nature that, as a manager, you need to adjust people's forecasts for the sandbaggers or for the people who are overzealous.
What's your guidance on that? Because you might have a quick forecast that is nowhere near accurate based on some personal traits.
Jordan: Oh, yeah, so there's one thing we know about salespeople: they have very happy ears. And you want them to, right? Salespeople need to be the optimists who think they can win every deal. So in some regards, front-line sales managers are critical to making this real, to giving a reality check, because sales managers do see their sales team.
If you have 10 salespeople reporting to you, over time you kind of get to realize that, okay, Jason, that guy's a madman, he's going to throw everything into his forecast, and I know it's never all going to happen.
And Lynne's more prudent, right? She's more thoughtful about it, and she understands the importance of it. And she tends to get it right. And so I do think that intervention can be required. If the salespeople have total control over the forecast, which they do in a lot of companies, they're inputting the assumptions. The sales manager does play a very critical role in keeping everything in check.
And I do think it comes down to having historical perspective on how accurate our people are forecasting. Is it always overstated or is it usually pretty accurate? Is it always understated?
So I think in all cases historical data is there to help you be a better manager, and forecasting is sure a place where historical data can help enormously.
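One minimal way to apply that historical perspective, sketched with made-up names and numbers: compute each rep's average actual-to-forecast ratio from past periods and use it as a calibration factor on their current forecast.

```python
# Historical calibration of a rep's forecast: a ratio below 1.0 flags a
# chronic overstater, above 1.0 an understater (a sandbagger).
# All figures below are hypothetical, for illustration only.
def bias_factor(history):
    """history: list of (forecast, actual) pairs; returns avg actual/forecast."""
    return sum(actual / forecast for forecast, actual in history) / len(history)

history = [(100_000, 80_000), (200_000, 150_000)]  # past quarters, overstated
factor = bias_factor(history)                      # (0.80 + 0.75) / 2 = 0.775
print(round(120_000 * factor))                     # calibrated forecast: 93000
```

A manager would use a number like this as a starting point for a conversation, not as an automatic override.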
Zaledonis: And as you were talking, I was thinking, I guess this comes back to pipeline management, right? If you have strong pipeline management in place, reps understand what drives the right amounts and dates, and all that opportunity rigor is actually what gives you the more accurate forecast. I'm learning from you.
Jordan: Oh, sales management is a diverse role. But part of the sales manager's role is to put processes and procedures in place, set expectations, and then manage to those processes, procedures, and expectations. And I keep coming back to the role of the sales manager in making this all work, because if you just tell people to give a forecast, and you don't give them guidance on what that means, and you don't wrap a management process around it, and they don't have the technology, then sales management kind of gets what it deserves, which is bad forecasting and bad pipeline management.
It really so often in an organization comes down to just defining what it is and putting expectations in place and measuring it. I come back to coaching as another example, but most senior leaders expect their sales managers to be coaching. And salespeople like to get coached. And sales managers know it's impactful and they like to be doing it.
But if you really ask people to show the coaching model, often it's not there. And if they have a coaching model, you ask them when they're supposed to do it, and there's not clear expectations for when it's going to be used, in what situation, what the outcomes are.
And ask them if they're measuring coaching, and that never happens, almost never in any organization.
So forecasting's no different. So if you put the expectation in place, you put the processes in place, you enable with technology, you measure it, then over time it's going to improve. It's impossible that it doesn't improve if you have the system in place for continuous improvement.
Micalizzi: I noticed in a couple places you talk about forced urgency in sales. Would you talk a little bit about that, and how it impacts forecasting and even pipeline management?
Jordan: Sure. The sales force is the most urgent part of any organization. And it's completely by design. So it's based on this centuries-old belief that if people work harder, they're more productive.
I argue with sales leadership about this all the time, but there's just one data point I give.
If you ask someone, "Would you rather your salespeople be working harder or working smarter?" they'll say, "Well, of course, I'd rather have them working smarter." And then they'll kind of lean in and go, "You know, but I think they could be working a little harder, too."
So this idea that if we work harder, we produce more is more prevalent in the sales force than anywhere else. So what do we do? We set aggressive quotas and we report on them weekly and we have daily meetings and daily phone calls to drive a sense of urgency. And there's just nowhere in the organization where there's more urgency. And a lot of times it becomes counterproductive.
The research would say that the more consultative a sale gets and the longer the sales cycle gets, the more salespeople need to stop, think, and be strategic. And the more transactional it is, the more urgency and activity do tend to drive improved performance. If you're banging out 200 calls a day, having the same conversation, 220 calls will probably get you 10% more.
And forecasting is just another place where this is the case. We want all the information urgently. So everyone, every sales manager has to get the data by Friday afternoon so they can put in their forecast by Monday morning.
And it's not always the case that this urgency is warranted. We were working with a company one time, training their sales managers — a big company, you would know the company, anyone from any country in the world would know the company. We were trying to improve the coaching that was taking place, particularly on early-stage opportunities.
And the sales manager said, "You know what? Our sales cycle is 6-to-12 months long. There's really no reason we have to update the forecast, have meetings every Friday afternoon to update the forecast, because what changes in seven days when it's a 12-month sales cycle?" And I just said, "What we could do is we just repurpose this meeting every other week to have coaching conversations. And then twice a month we update the forecast." And senior leadership almost — I'm trying to think of some nonprofane way — they lost their minds at the idea that we didn't have the updated forecast every seven days, even though it's a 12-month sales cycle.
And it's just a clear example of the idea of urgency that we drive into the sales force. And it's often counterproductive.
Zaledonis: Okay. So, Jason, this has been, as somebody who is a former sales rep and a sales manager, it's been really interesting and very informing. I sort of wish I could go back and redo some of those times where I wasn't strong on hitting that forecast.
I heard a couple things from you that I just wanted to summarize as best practices for forecasting for people who are listening. First and foremost: pipeline management and forecasting. While they go hand in hand, they need to be treated separately, because they have different purposes in the sales process.
I also heard a strong need for a process to make sure that your reps are all aligned and that they're being more productive with their time and not wasting a lot of time in forecasting.
And I heard that technology is a big supporting factor for companies that have more accurate forecasts. It's technology and process combined that produce the most accurate forecasting and help people be more effective and efficient in the way they do it.
So I wanted to share those tips with people. And thank you for sharing them with us.
Jordan: Yeah, anytime. Thank you for having me.
Micalizzi: Before we close, though, I want to ask you — we're going to call this our lightning-round question — if you could take all your current knowledge and experience, go back in time and share it with yourself at the beginning of your sales career, what advice would you give yourself?
Jordan: I don't even have to think about it. The most important advice — and I give this to anyone anytime they ask — is that you just have to identify the important things and focus on them. In the sales force, there's too much stuff to do. There's no shortage of things you can do to improve your performance. The issue is finding the one or two things that will really drive performance and make a difference. And if over my career I'd spent all my energy on the important stuff and peeled away the unimportant stuff, I'd probably be sitting in a different place right now.
Micalizzi: Sounds good.
Zaledonis: Awesome. I agree with that advice, too. Some of the best salespeople I worked with did such a great job of ignoring all the little petty things and just going for the big things. And I always spread my time too broadly, trying to do everything.
Jordan: You know, the funny thing, Lynne, is that most people understand what the important stuff is. They just don't do it. Or they let other things get in the way. It's not that people in sales aren't intelligent and driven and motivated. They just get too busy and make bad decisions.
Zaledonis: Yeah. No, I hear you. So, Jason, say we're following all these great steps, we've got our processes and technologies in place for forecasting, and yet we're still missing our forecast.
What advice do you have for sales leaders and sales professionals in general who are missing their forecasts? What do they do? How do they even start to identify where the problem is before they can fix it?
Jordan: Well, there are two possibilities. One is having an inaccurate forecast — and I guess that could mean missing in either direction. Having an inaccurate forecast, I think, often comes down to not really understanding the drivers of your sale.
So if your forecast is inaccurate and you're sometimes overforecasting and sometimes underforecasting by similar margins, then you're probably basing it on some bad assumptions. Either you're trying to shove an account management forecast into an opportunity management model, or you're thinking that activity drives the productivity of your sales team when, in fact, it might be some economic driver or some cyclical event.
So if you're missing it on both sides, then you just need to re-examine your forecasting model, because you're probably basing your forecast on the wrong things.
If your issue is that you're just never reaching your forecast, then either your forecasts are too rosy or you've got an underperforming team. And if your team's underperforming, then, going back to my comment about focus, you really need to understand what it is that's driving success. You need to pay the most attention to that, and then communicate it, measure it, and coach it. A lot of times, sales forces that are habitually underperforming are, again, simply focusing on the wrong things.
If you put the right activity in the right place, eventually the performance will come.
Micalizzi: Perfect. Jason, thank you so much for joining us today.
Jordan: Any time.
Micalizzi: And, Lynne, thank you for co-hosting.
Zaledonis: My pleasure, Kevin.
Micalizzi: And for those of you listening in, remember the best way to stay on top of all things Quotable is by subscribing at www.quotable.com/subscribe.