How Golden State Foods Modeled Its Business Down to the SKU | Lauren Bissell

This is a podcast episode titled, How Golden State Foods Modeled Its Business Down to the SKU | Lauren Bissell. The summary for this episode is: Golden State Foods (GSF), one of the largest diversified suppliers to the foodservice industry, wanted to budget and forecast down to the SKU level but needed a solution that was scalable, easy to maintain, and quickly adopted by their teams. Lauren Bissell, Continuous Improvement Manager - FP&A, will explain how GSF overcame the fear of complexity by taking a focused and systematic approach. She'll also show how the company used training and consultation from KeenVision and the Planful Platform to create detailed models for multiple business groups.
SKU level planning using Dynamic Planning
01:47 MIN
Invest in the knowledge to provide the solution
01:38 MIN
Implementation, admin view
00:28 MIN

Speaker 1: Thanks for coming. I mean, last session of the day. After a break, it's like, what else could you do to try to get people to go back to their room and get ready for the evening or whatever. My goal of this session is really to give you confidence that dynamic planning can be deployed and successfully maintained by the accounting and finance functions within your own organization. Over the past several years, I've attended trainings and Perform conferences, Host Analytics conferences, and listened in on sessions about dynamic planning, and understood the basics of what it could do. We understood how it could apply to our business, but really didn't know where to start in using it effectively for our business. Sometimes I come home from these conferences and I'm like, I learned so much. I just want to go do it. And then you open up dynamic planning and you're like, wait a second, that looks a little different from those pretty screens and interfaces they were showing up on the screen. Another thing too, when we started implementing dynamic planning for some of our business groups years ago, it didn't seem at the time that customers themselves were building and maintaining models. Or if they were, it was really a full-time job for them. Now, I certainly think that's changed over the last couple years, but this was my perspective a few years ago. Just by a show of hands, who was at the Bakerfield Solutions dynamic planning session, I think it was this morning? A couple of folks. I feel like this is kind of an add-on to that, honestly, because from my perspective, their presentation was on the end user input. It had a pretty little interface and you input your budget numbers, and then you can report off of it. But I'm going to go on the back end, the admin side. Most of us here, or some of us here, are administrators of Planful for our organization. This is how we built it for our end users and deployed it for our own business needs.
I guess before we get into it, why am I speaking on this? Well, I'm a wife and a mother, and I guess that doesn't have anything to do with dynamic planning, but I'm a former auditor. Recovered, I guess. It's been so long. But when I started at Golden State Foods, or GSF as we call ourselves, I was in the details of the implementation of Planful for our business needs. We started with consolidation and reporting, and grew into leveraging other parts of the application for budgeting and forecasting purposes. Who are we? GSF, we supply goods to the foodservice industry. We operate as a distributor and a manufacturer for quick service restaurants. We operate within several distinct business groups. We operate in Egypt, Australia and New Zealand, China, and here in the US, of course. And through these business groups, we manufacture goods like hamburger patties, sauces, dressings, condiments, ice cream, smoothies, fresh produce. We do distribution for some of the quick service restaurants as well. It's basically how I justify all of the money I spend at Starbucks. It's bad. We've grown with Planful over time. We started with the consolidation and reporting modules to be our one source of financial truth. We've since adopted structured planning for several of our business groups. And after adopting structured planning, there was still that one missing piece for our manufacturing business groups, and that was the ability to build a budget and forecast at the SKU level. The SKU level is, for us, a level that's more detailed than our chart of accounts. This isn't a dimension in our core application in Planful, in our general ledger. It's something outside of the current dimensionality that we had. That's what I'm going to talk about today. We weren't necessarily looking for a reason to use dynamic planning, but rather thinking about using dynamic planning to actually solve a business need that we had. Why dynamic planning?
What are the benefits that it would provide that Excel and old processes couldn't provide for us? I'll get into that, and then we'll go into some of the actual screens in Planful, how we implemented this successfully, and the approach that we took. What were the pros and cons of going down the path of using dynamic planning to budget and forecast at the SKU level? Well, for us, it came down to the benefits. Increased transparency. You've probably heard a lot of companies talk about the benefits of Planful, whether it's workforce planning, reporting, or their budgeting and forecasting. I don't want to go into detail on this, but things aren't living in spreadsheets on individuals' desktop computers anymore. It's a centralized, collaborative tool that we'd be able to use. It brings some structure to our processes. If you have turnover, spreadsheets don't just die with the computer that was handed in to IT if they weren't saved in a shared folder or whatnot. It's given us some time savings as well. Our board meetings can now take place a week to two weeks earlier than they used to take place. In past years, board meetings took place after the next period closed following quarter end. We'd close a quarter, and then we'd close another period, and then we'd have a board meeting about the quarter. Information's a little outdated at that point. It eliminated some of the administrative duties in consolidating budget and forecast information for us. There are, of course, some costs though. We have to create new scenarios for new budget years and forecast years. There's ongoing maintenance to the templates as your business changes, and there is an incremental investment to implement the solution, of course. I took a sip of water because dimensionality is important. Dimensionality, this is really the reason why we chose dynamic planning as our solution for a SKU level forecast. This allows us to budget and forecast at the SKU level.
Like I said, SKU isn't a dimension in our chart of accounts. Up on the screen here, this is an example of the dimensionality in our chart of accounts. This is the flexibility we have with the dimensionality by business group in dynamic planning. The left list of dimensions here that you're seeing is our general ledger's chart of accounts. The right three lists are the different dimensions we can use in dynamic planning using different models for each business group. The dimensions in purple represent dimensions that are consistent with our chart of accounts. Of course, we've got account, time, scenario, business activity, product, things like that, that are just consistent between our GL and dynamic planning. The yellow highlighted dimensions represent dimensions unique to the specific business group's dynamic planning model based on their business needs. You can see one business group uses customer, while another uses customer category, and another uses brand. It all depends on the different systems that they're using and the different business that they have. One of our business groups actually uses zones to classify where in the facility production takes place. A couple of other things to point out while we're here. Dynamic planning, of course, provides for dimensionality that's different than what's in our general ledger and structured planning. That's why we're here. The dynamic planning dimensionality can actually be different for each model used by our business groups. We started with one business group, and I'll go through that example. But since we've learned how to use dynamic planning, we've implemented this for five of our business groups, and they're all very different. There are five different models with five different hierarchies, and each is custom-tailored to their business and how they want to build their budget and forecast.
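One way to picture the per-model dimensionality being described, shared core dimensions plus group-specific ones, is this minimal sketch. The dimension and group names are invented for illustration; they are not Planful objects.

```python
# Hypothetical sketch of per-model dimensionality; names are illustrative.

# Dimensions shared with the general ledger / chart of accounts
# (the "purple" dimensions on the slide).
CORE_DIMENSIONS = {"Account", "Time", "Scenario", "BusinessActivity", "Product"}

# Each business group's dynamic planning model layers on its own
# dimensions (the "yellow" ones), tailored to that group's systems.
MODEL_DIMENSIONS = {
    "GroupA": CORE_DIMENSIONS | {"SKU", "Customer"},
    "GroupB": CORE_DIMENSIONS | {"SKU", "CustomerCategory"},
    "GroupC": CORE_DIMENSIONS | {"SKU", "Brand", "Zone"},  # Zone: where in the facility production happens
}

def model_specific_dimensions(group: str) -> set:
    """Dimensions unique to one group's model, beyond the shared core."""
    return MODEL_DIMENSIONS[group] - CORE_DIMENSIONS

print(sorted(model_specific_dimensions("GroupC")))  # ['Brand', 'SKU', 'Zone']
```

The point of the sketch is only that every model shares the GL-consistent core while adding its own detail dimensions, which is what makes a different hierarchy per business group possible.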
And then also a quick win for us: in dynamic planning, we can budget using the codes in each of our business group's local chart of accounts, as opposed to using the translated chart of account codes. This is because we operate on multiple ERPs. We use translation tables to map local chart of account information into Planful for consolidation and reporting purposes. Dynamic planning allows us to use the coding from the business group's local chart of accounts. This is just something that our accountants enjoy. When they do reporting in the core application, they're using our corporate account codes and our dimensionality, even though they have their own local chart of account codes. This is just an easy win. We can say, "Hey, we can build in your local codes." I don't know, they love that. I'm all for easy wins. How did we get this done successfully? Where did we start? We already had access to dynamic planning based on the bundle of modules we had purchased at the time. We had been in dynamic planning, just playing around with its functionality. But at a certain point, it was really time to get serious. We identified a couple of things. We knew we didn't have the internal knowledge to build and maintain models ourselves at the time. We also knew that to be successful, we needed to take ownership over this process. There are just too many times that we need to make changes on the fly to not be able to do these ourselves. We started a conversation with Planful, who led us to KeenVision. I just want to be clear, this isn't a session on KeenVision, although the work they did was certainly instrumental to where we are and how we use dynamic planning, but this is how we sought their expertise to teach us how to be successful with the tool. They could have certainly built a SKU level revenue and cost of product model rather quickly for us. But we knew it was important for our long-term success and adoption that we had the internal knowledge to create change and troubleshoot as needed.
Because like I said, our businesses are constantly changing and we're constantly evolving within our own budgeting and forecasting process. We're always trying to make improvements and make changes. We wanted to know how to do those ourselves. Just over 30 hours: this is the number of hours that we spent with KeenVision. This was learning how to build and apply dynamic planning models to our business group. Our statement of work initially quoted us, I believe, 40 hours, but we were able to complete the project and knowledge transfer in just under that. It was a nominal amount of money spent on that. I won't give away their chargeable rate or whatnot, but our approach was to start with one business group. The consultants at KeenVision knew our goal. They knew that we wanted them to teach us how to build it, and so they didn't just go and build it themselves. We met regularly to go through a piece of the application, and then I would go back and spend time building it out. After we completed the SKU level model for one of our business groups, we then applied the knowledge we learned from KeenVision on how to build and maintain it ourselves to four other business groups. KeenVision is still actually a really good resource for us. It's fun to see them in the hall and reminisce on the days of building this stuff. But not all of our business groups operate similarly. One of our business groups had a challenge: they wanted to spread certain costs down to the SKU level. I had heard of this breakback functionality from Planful media releases and some past Perform conferences and whatnot. It was one of those instances where we could actually go to KeenVision and say, "We have this problem. I've heard of breakback. I think this is what we want to use to be able to solve this problem. I don't know how to do it. Can you teach me?" And then it's just an hour conversation.
They get me up to speed on it and then I can deploy it to our business group, which is really nice. All right. Now we're going to get technical. Maybe you've not seen dynamic planning before, or maybe you're an admin and you've seen it a couple times, but you're still maybe unsure as to what it takes to actually set up and maintain these models. I wanted to share a bit of what one of our models actually looks like after we applied what we learned from KeenVision to our business groups. But I do want to make this clear: this is an admin view. This next section of slides I'm going to share is what I consider administrator-only views into dynamic planning. My end users don't see these screens. During an implementation with one of our business groups, I'll have multiple conversations with my end users on the goal, what it should look like, what we need to consider, everything like that, their current process. And then I'll take that information and translate it into dynamic planning. I'll share what it looks like from our end user perspective in a little bit, but I just want to make this clear again that the next few slides are not something my end users have really ever seen before. All right. First step in building a model for our business group. We're doing a SKU level revenue and cost of product budget. The first step for us is creating a master model and building out its dimensionality. This can be done in Spotlight Excel as well, but the screenshots that you're seeing here are actually from the core application. It's really as simple as adding dimensions, creating the hierarchy within each dimension, and adding all your leaf members or rollups within each of those different dimensions that we've created. I consider myself a doer. I like to get things done. A task list is like my favorite thing. I did this, cross it off. And then if it's not on the list, I write it on the list and then I cross it off. It's so satisfying.
Actually, I'm going to put presentation for Perform on my to-do list and then cross it off after this. But this is one of those projects where you really need to put pencil to paper and think and get a vision of what you're building before you start. Sometimes these days it feels like if you're not sitting at your computer and typing away furiously, you're not actually doing work. But it's really important to think through what the end product should look like to be able to effectively game plan how to execute getting to that end product. All right. I've built these little roadmaps. Here we are. This is like the starting line. It's so sad. That's all we have, and a bunch of blank paper. This is the starting line of our SKU level revenue and cost of product model. All we've done is created the master model and built out the dimensionality. Good job, us. All right. An external source model, or ESM. Here I'm building out the foundation of our SKU model. We'll ultimately have several ESMs that make up the SKU model, but this is an example of the ESM the team uses to input their sales volume by SKU. I'm not going to go through the cost side as well. We'll just focus on sales. We don't have enough time to do everything. That's another thing, Bakerfield Solutions in that session this morning, they're like, "We're going to build it out right here." I'm like, are you really? Because that did take me a little bit of time. Good for them. It was a cool session. You can go watch it. I think they record all these. I do recommend it. The reason this is an ESM is because the data's external to Planful and needs to be input by the end user. The fields you see here as rows will ultimately become columns in the end user input screen. You can see the fields we have in column B. This is what type of field this is. Some of them are direct input fields, and some of them are calculated fields or formula fields where there will actually be no input from the end user.
It's really nice what Planful has done. The formula fields follow a similar logic to Excel. They're all formulas that we're familiar with. You can see for division and SKU, I've got a concatenate formula. For business activity, I've got an if formula based off of the brand. It works just like Excel, which is really nice. For example, I'll drill into the business activity field a little more. Business activity is a dimension in our chart of accounts, and we've created a formula to read off of the brand field. Now we can report off of brand or an aggregated version of brand, which is business activity, and that ultimately gets sent back to our core application, since business activity is actually a dimension in our chart of accounts. We've got our master model with dimensions. Now we have an external source model, and this allows us to input SKU level data into the model with dimensions that are currently external to Planful. Remember, SKU is not a dimension in our core chart of accounts, and so that's why we're using dynamic planning, because now SKU is a dimension in our model. All right. We've created an external source model, but the model still doesn't know what to do with the data that will ultimately be entered. It hasn't been entered yet. Creating a source map allows us to tell the model where to move or save the information input into the ESM. You could have multiple source maps for just one ESM, or you can have just one. But an example would be saving various costs back to different GL accounts. On our cost side, we do save things like ingredient costs, labor costs, packaging costs. They all go to different GL accounts. They'll each have their own individual source map to save that information to that different GL account. I'll show you two screenshots here. The first one here that you're seeing, this is moving data from one ESM to another ESM. In our sales volume forecast ESM, we input cases sold by SKU.
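The Excel-style formula fields described here, a concatenate for division and SKU and an if based on brand, could be sketched roughly like this. The real model uses Excel-style formulas inside Planful; the function names and the brand-to-activity mapping below are invented for illustration.

```python
# Hypothetical sketch of the ESM's calculated fields. Names and the
# brand mapping are illustrative, not GSF's actual configuration.

def division_sku(division: str, sku: str) -> str:
    """CONCATENATE-style field combining division and SKU into one key."""
    return f"{division}-{sku}"

def business_activity(brand: str) -> str:
    """IF-style field: derive the chart-of-accounts business activity
    from the brand the end user entered, so they don't type it twice."""
    brand_to_activity = {"BrandX": "Sauces", "BrandY": "Dressings"}  # illustrative
    return brand_to_activity.get(brand, "Other")

print(division_sku("D1", "SKU-001"))   # "D1-SKU-001"
print(business_activity("BrandX"))     # "Sauces"
```

This mirrors the design choice in the talk: calculated fields remove duplicate entry for the end user while still populating GL-consistent dimensions behind the scenes.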
We then send this data to another ESM to marry it with the price information by SKU, and a quick Excel formula within the model then allows us to calculate our revenue. I'm missing one. Let me go back. It's a little out of order, but this is the second screenshot of our source map for our ESM. This is where we're actually telling the model to save sales volume in cases by month back to a specific GL account. Actually, that's not the right one. Here it is. Here. In this screenshot, you can see in column D, row 30, that account there, 90104. This is where our cases sold information gets saved. It's the GL account for that. This source map is basically saying, "Send all of this cases sold information into this GL account, 90104." But that's not all that this source map is doing. It's also reading all the other dimensions that we have here and categorizing the case volume into division, SKU, brand, et cetera. Using division as an example: we have three facilities in this model, and they have their own unique division codes. When the end user goes in and inputs their information, they input their division and a bunch of other information, of course, SKU, brand, and case information by period. What this source map does is save their specific division back to their division code, so then we can report off of it that way. With source maps, we can now move data between ESMs and save data back to our master model. We're maybe halfway there. We can now report off of the information input into our ESMs using dynamic planning. Now we can actually run reports like revenue per SKU or cases sold by brand, but ultimately we want to send this information back to the core application. The reason for that is we budget and forecast volume, revenue, and cost of product through dynamic planning, so we can have this additional layer of dimensionality with the SKU level.
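The behavior of that sales-volume source map, saving every case-volume input to GL account 90104 while carrying along the row's own dimension values, could be sketched like this. The field names and sample rows are invented for illustration; 90104 is the account mentioned in the talk.

```python
# Hypothetical sketch of what the sales-volume source map does: every
# input row's case volume is saved to one GL account, keyed by the row's
# own dimension values (division, SKU, brand, period).

CASES_SOLD_ACCOUNT = "90104"  # the GL account named in the talk

def apply_source_map(rows):
    """Flatten end-user input rows into saved records on the GL account."""
    saved = []
    for row in rows:
        for period, cases in row["cases_by_period"].items():
            saved.append({
                "account": CASES_SOLD_ACCOUNT,
                "division": row["division"],
                "sku": row["sku"],
                "brand": row["brand"],
                "period": period,
                "value": cases,
            })
    return saved

inputs = [{"division": "D1", "sku": "SKU-001", "brand": "BrandX",
           "cases_by_period": {"Jan": 120, "Feb": 95}}]
print(apply_source_map(inputs)[0]["account"])  # "90104"
```

The key idea is the same as in the screenshot: the account is fixed by the map, while division, SKU, and brand are read from whatever the end user entered, which is what makes reporting by those dimensions possible afterward.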
But all of our other expenses, our workforce, our operating expenses, all our allocations are done within the core application within structured planning. We want to be able to write a map so that we can send this information from our revenue and cost of product model back to the core application, to marry it with all that other information so we can run a full P&L for our budget and forecast information. There are two steps to this process. First, we set up a map in dynamic planning and basically tell the application where to send data from the model to the core application. The reason we're using dynamic planning, of course, is because we want to budget and forecast at a level more detailed than our chart of accounts. In some instances, data doesn't get sent back to our core application. SKU is a great example. There's no dimension for SKU in our core chart of accounts, so we don't send it back to the core application. A screen like this may look a little daunting. If I was here three years ago, I'd be like, "How do you know what the heck any of that stuff is doing?" But once someone explains it to you the first time, it totally makes sense. And that's really the only reason why we were able to deploy this to multiple business groups after our initial training. Basically, if we want to report off of SKU level data, we use the dynamic planning model. But if we want to run a full P&L for budget or forecast, we use the core application. The second step to creating a map and writing information back to the core application is to create a data load rule used to load data from dynamic planning. To automate the loading of data from dynamic planning to the core application, we create something called a calculation. Here's our roadmap. Now we can send information back from our SKU level model to the core application. And within the core application, we budget all of our other revenue and costs. And so then from there, we can run a full budget or forecast P&L. All right.
The calculation is really important. When an end user inputs data, they select save data. Through creating a calculation, we're actually telling the model what to do when the end user selects save data. This calculation has a lot of steps in it, but it's basically telling the model to run the source maps that we created, as well as the map we created to send information back to the core application. The calculation is something you set up once, and it doesn't really change. It hasn't changed much for us. With a calculation, we're telling the application to execute certain processes within the model. All right. This is the pretty part. This is where Bakerfield Solutions basically started their session earlier. This is from an end user perspective now. What we're doing here is we actually want to get information into the model now. How do we do that? We create views for our end users. They're based off of the rows that we used to create the ESM, which now become our columns. This screenshot's an example of the design view when we create the input screen for, again, the sales volume forecast input. Starting in row two, the end user would input their division code into column A, SKU code into column B, their brand into column C, and then their sales volume in cases by month into the remaining columns. You'll remember though, if we go back a few slides, which I won't, that when we built out the ESM, we had more fields in our ESM. We had business activity, for example. But you'll remember that we automated the business activity field to be a formula based off of the brand input. For things like this, we're able to automate it before it even gets to the end user to simplify their view. It's information that would be duplicative to enter. We're just trying to save time. Their view when they get into the application, what they have to enter, is just a little more limited. This is the properties view of this view.
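The save-time calculation described above, run the source maps in sequence and then the map back to the core application, could be sketched as a simple pipeline. All step and function names here are invented for illustration; they are not Planful objects.

```python
# Hypothetical sketch of the save-time calculation: when the end user
# selects "save data", the calculation runs each source map in sequence
# and then the map that writes results back to the core application.

def run_calculation(input_rows, source_maps, write_to_core):
    """Run named source-map steps in order, then the write-back step."""
    data = input_rows
    executed = []
    for name, step in source_maps:   # e.g. marry volume with price, compute revenue
        data = step(data)
        executed.append(name)
    write_to_core(data)              # send results back to the core application
    executed.append("write_to_core")
    return executed

core = []
steps = [
    ("marry_volume_with_price", lambda rows: rows),
    ("calculate_revenue", lambda rows: rows),
]
order = run_calculation([{"sku": "SKU-001", "cases": 120}], steps, core.extend)
print(order)  # ['marry_volume_with_price', 'calculate_revenue', 'write_to_core']
```

The design point mirrors the talk: the end user only presses save, and the ordered pipeline, set up once by the admin, takes care of everything downstream.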
This is where we tell the application to run the calculation we created when the end user actually selects save data. That way, anytime data is saved by the end user, the calculations run, the information gets sent back to the core application, and anyone can start to run reports off of it. This is an example of what the end user input would actually look like once all the information is entered. All right. We made it. That's it, all the steps we needed to be able to save data back to our master model and the core application when an end user inputs data and selects save. What did we create here? When the end user selects save, their information gets written back to the core application for integration with inputs from our structured planning and for reporting purposes. For me as an admin, it means no more loading of data manually for business groups that are using Planful for budgeting and forecasting. For the end user, it means different things. It means more centralized visibility into budget and forecast assumptions. It means being able to report off of budget and forecast information at a more detailed level. Excel is great, but we all know that it has its limitations in certain areas. We have a lot more flexibility now when it comes to reporting. Within dynamic planning, we use Spotlight Excel to report on SKU level data. We can do that by product, brand, brand category, or customer, depending on the model and the business group that we're doing it for. This one's funny. Saving. This is an example of the communication sent internally regarding the reporting that's expected within each business group's quarterly business review presentation. If our business groups are using Planful for their forecast, this reporting is all automated. If not, for our latest estimate, that's what we call our quarterly forecast, that information has to be entered manually into the report. I don't know.
No offense, but if I got an email like this, I probably wouldn't read it all. You didn't compose it though. You did review and send it though. But it's a lot. I mean, for some of our business groups, that means actually manually entering information into roughly 25 reports for just a quarterly business review presentation. When the forecast changes, which we all know is inevitable, probably multiple times, and the last time is probably either the day of or right before when you have to send it, that's just more rounds of manual changes that need to be made. I like to do my job. I like to do it right the first time, but it's that second and third time when you have to repeat what you're doing, that's where I find there's always mistakes as well. All right. We're done, right? The model's built and working properly. Information's being saved back to the core application. We can run full P&Ls with budget and forecast information. We marry that with the actual information that's already in the application. But we actually have to go one step further. We always try to focus on adaptability. Will our end users actually be able to use this successfully? What can we give them to make them successful? How can we best support our end users? Selfishly as well, how do I remember how the model works? I mean, some of this stuff is a little complex. At this point too, it was done a few years ago. If I need to go back in and do some maintenance on it, I have a good memory, but I like to try not to test it too much. We wrote a lot of the processes down. We have several business groups using dynamic planning. We've got multiple models with over 20 ESMs. Each is built a little differently to suit the needs of that business group. There was really no way I'd be able to remember all the intricacies and nuances without writing everything down.
For each business group, we wrote down instructions for dynamic planning, structured planning, and then the model architecture, which is super, super nerdy, but it's actually really cool. I got to play with some boxes and colors and stuff. But it has the names of ESMs, the source maps, calculations, and the relationships between these items. The thing about dynamic planning is it's really, really powerful and really, really flexible, but it's also a little more complex from an admin perspective. It doesn't have the pretty little user interface that the core application has, but that allows it to do so many things. The model architecture is actually really important just from an admin perspective. Naming convention, I found, is actually really important. When you're creating maps to send information from one place to another, putting in the name where it's from and where it's going just helps you when you have to then use it or reference it in a calculation later on to execute the process, things like that. I kind of feel bad for the first business group that we implemented this for. Because by the time we got to three, four, five, everything was much cleaner and nicer. It's like the first child, right? I guess. Get all your kinks out. In the dynamic planning user guide, these are actually some excerpts from our user guide on how our end users actually use this model for their purpose. This is what we documented for inputting information into their sales volume forecast. We're actually telling end users exactly where to go. When they get into the application to input sales volume by SKU, you need to navigate to Analyze, then Data, select the folder icon, find your sales volume forecast input view, and click select. Screenshots are so fun, but then it's also really frustrating when Planful changes the user interface or whatever, and it's like, well, now it looks outdated. This is Excel, so it's kind of nice. It doesn't ever really look outdated.
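The naming-convention idea mentioned above, encoding the source and destination in a map's name so it stays self-describing when referenced from a calculation later, could look something like this sketch. The pattern itself is invented for illustration, not a Planful requirement.

```python
# Hypothetical helper for a self-describing map naming convention:
# put the source and destination in the name so the map's purpose is
# obvious wherever it is referenced later.

def map_name(source: str, destination: str) -> str:
    """Build a map name of the form Map_<source>_to_<destination>."""
    return f"Map_{source}_to_{destination}"

print(map_name("SalesVolumeESM", "PriceESM"))   # "Map_SalesVolumeESM_to_PriceESM"
print(map_name("SKUModel", "CoreApplication"))  # "Map_SKUModel_to_CoreApplication"
```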
Here, we're then telling our end users how to navigate within the input screen. Once we're there, you need to open up this division filter. You need to find and select your division and click on get data. Really important. All right. This is the last step in entering sales volume forecast information into the model. We're telling them what information to input. We're also telling them the calculation that will run in the background, and then we warn them of any potential error messages or common error messages. In the sales volume example, we tell them, "You need to enter in your division, your SKU code, your brand, and your sales volume in cases by period." We tell them too that [inaudible] in the background to save business activity data based on the brand field, because we don't want to hide information from them. We just want to tell them the system is already doing this for you, in case you're wondering. Maybe they're not. I don't know. But then [inaudible]... on like Tom Brady inspirational motivational quotes, whatever. What do you got, Google? This is what it is. I'm not a person who defends myself very often. I kind of let my actions speak for me. When I take this approach to sharing what Planful can do with our business groups, I kind of let it speak for itself. We can continue to do things manually in silos. But as deadlines shorten and transparency becomes demanded these days, the successes our other business groups have with the application, how they leverage Planful, and how it makes their life easier is really the most compel... That's all I got. I'm now taking questions from the GSF team.

Speaker 2: First off, wow! That's fantastic what you and your team have done. That's so impressive. Great job. Where do the rebates or discounts for sales lie? Is it in this program, or is it at the general ledger level?

Speaker 1: We can budget and forecast that information in this specific model. For our business group that has some rebate information, we factor that in in our price table. We didn't go through all the ESMs. I was sharing what the end user ESM input looks like for our sales volume, but we have an ESM for the price table. We have our price and any rebates per SKU that we could input as well.

Speaker 3: With having so many different types of industries within the organization, which one is the most challenging to create the dynamic planning for?

Speaker 1: It really depends. It depends on the business and their existing models. Some of these businesses, the Excel models that we have are like quite impressive and also quite scary too. I've found within our own organization it's not necessarily a certain type of industry. It's, what is our cost structure? Do we have standard costing? What does that look like? But it's really trying to simplify the process that we currently have based on the individual business groups that we have and how do we make their life a little easier.

Speaker 4: This is for SKUs, right? If you get a new SKU, do you have to update both the price ESM or just one of them and it'll flow through?

Speaker 1: When you get a new SKU, we have to add it to the dimensionality or else the model doesn't know like where to save that information, because we do report off of SKU. It's a field that we use to save information to. It needs to be added to the hierarchy. And then you would have to just add it as an input then. If you're projecting volume second half of the year, it'll be in your sales volume forecast input field. And then, of course, you'll want to make sure you've got a price and a cost for that as well.

Speaker 5: Just one question, I noticed who your favorite teams are. How come Tom Brady had the Tampa Bay Buccaneers shirt on instead of the Patriots?

Speaker 1: Because... Gosh. Some people in this room know my husband, don't tell my husband because he is like a fine wine and he gets better with age. I am a New England Patriots fan though. Thanks, everyone.



Today's Guests


Lauren Bissell

FP&A Continuous Improvement Manager at Golden State Foods