Archive for the ‘Enterprise Resource Planning’ Category

One of our main activities at Profit Point is to help companies and organizations plan better, making informed decisions that lead to improvements such as more efficient use of resources, lower cost, higher profit and reduced risk. Frequently we use computer models to compare the projected results of multiple alternative futures, so that an organization can better understand the impacts and tradeoffs of different decisions. Companies can usually carry out these processes effectively, since the CEO or Board is empowered to make such decisions and then direct their implementation.

Infrastructure and resource allocation decisions must be made on a national and international basis as well, and they are usually more difficult to reach than decisions within a single company. An example today is the ongoing controversy in Southeast Asia over the use of water from the Mekong River in the countries through which it flows: China, Myanmar, Laos, Thailand, Cambodia and Vietnam.
For a map of the river and region, refer to the link below:

http://e360.yale.edu/content/images/0616-mekong-map.html

The Mekong River rises in the Himalayan Mountains and flows south into the South China Sea. For millennia the river ecosystems downstream have developed based on an annual spring surge of water from snowmelt upstream. The water flow volume during this annual surge period causes the Tonle Sap River, a Mekong tributary in Cambodia, to reverse flow and absorb some of the extra water, resulting in a large temporary lake. That lake is the spawning ground for much of the fish population in the entire Lower Mekong River basin, which is in turn the main protein source for much of the human population in those areas.

Now China has an ambitious dam construction program underway along the upper Mekong, and other countries (along with their development partners) are planning more dams downstream. Laos, for one, has proposed construction of eleven dams, with an eye towards becoming “The Battery of Asia”.

The challenge here is to find and implement a resource allocation tradeoff that meets multiple objectives: satisfying populations and companies that need clean water, countries that need electricity to promote economic development, and fish that need their habitat and life cycle preserved.

Multiple parties have developed measures and models that can help forecast the impact of different infrastructure choices and water release policies on the future Mekong basin. Let’s hope that the governments in Southeast Asia are able to agree on a reasonable path forward, and implement good choices for the future use of the river.

For more information here are a few examples of articles on the Mekong:

http://ngm.nationalgeographic.com/2015/05/mekong-dams/nijhuis-text

http://www.internationalrivers.org/resources/the-lower-mekong-dams-factsheet-text-7908

There is nothing like a bit of vacation to help with perspective.

Recently, I read about the San Diego Big Boom fireworks fiasco — when an elaborate Fourth of July fireworks display was spectacularly ruined after all 7,000 fireworks went off at the same time. If you haven’t seen the video, here is a link.

And I was reading an article in the local newspaper on the recent Higgs news, “Getting from Cape Cod to Higgs boson”; read it here:

And I was thinking about how hard it is to know something, really know it. The data collected at CERN when they smash those particle streams together must look a lot like the first video. A ton of activity, all in a short time, and a bunch of noise in that Big Data. Imagine having to look at the fireworks video and then determine the list of all the individual types of fireworks that went up… I guess that is similar to what the folks at CERN have to do to find the single firecracker that is the Higgs boson.

Sometimes we are faced with seemingly overwhelming tasks of finding that needle in the haystack.

In our business, we help companies look among potentially many millions of choices to find the best way of operating their supply chains. Yeah, I know it is not the Higgs boson. But it could be a way to recover from a devastating earthquake and tsunami that disrupted operations literally overnight. It could be the way to restore profitability to an ailing business in a contracting economy. It could be a way to reduce the greenhouse footprint by eliminating unneeded transportation, or decrease water consumption in dry areas. It could be a way to expand that makes the best use of assets and capital in the long term. It could be to reduce waste by stocking what the customers want.

These ways of running the business, of running the supply chain, that make a real difference, are made possible by the vast amounts of data being collected by ERP systems all over the world, every day. Big Data like the ‘point-of-sale’ info on each unit that is sold from a retailer. Big Data like actual transportation costs to move a unit from LA to Boston, or from Shanghai to LA. Big Data like the price elasticity of a product, or the number of products that can be in a certain warehouse. These data and many, many other data points are being collected every day and can be utilized to improve the operation of the business in nearly real time. In our experience, much of the potential of this vast collection of data is going to waste. The vastness of the Big Data can itself appear to be overwhelming. Too many fireworks at once.

Having the data is only part of the solution. Businesses are adopting systems to organize that data and make it available to their business users in data warehouses and other data cubes. Business users are learning to devour that data with great visualization tools like Tableau and pivot tables. They are looking for the trends or anomalies that will allow them to learn something about their operations. And some businesses are adopting more specialized tools to leverage that data into an automated way of looking deeper into the data. Optimization tools like our Profit Network, Profit Planner, or Profit Scheduler can process vast quantities of data to find the best way of configuring or operating the supply chain.
So, while it is not the Higgs boson that we help people find, businesses do rely on us to make sense of a big bang of data and hopefully see some fireworks along the way.

Uncovering the Value Hiding Behind Environmental Improvement Investments

Supply Chain optimization is a topic of increasing interest today, whether the main intention is to maximize the efficiency of one’s global supply chain system or to pro-actively make it greener. There are many changes that can be made to improve the performance of a supply chain, including where materials are purchased, the types of materials purchased, how those materials get to you, how your products are distributed, and more. An additional question on the mind of some decision makers is: Can I minimize my environmental footprint and improve my profits at the same time?

Many changes you make to your supply chain could either intentionally – or unintentionally – make it greener, effectively reducing the carbon footprint of the product or material at the point it arrives at your receiving bay. Under the right circumstances, if the reduced carbon footprint results from a conscious decision you make and involves a change from ‘the way things were’, then there might be an opportunity to capture some financial value from that decision in the form of Greenhouse Gas (GHG) emission credits, even when these emission reductions occur at a facility other than yours (Scope 3 emissions under the Greenhouse Gas Protocol).

As an example, let’s consider the possible implications of changes in the transportation component of the footprint and decisions that might allow for the creation of additional value in the form of GHG emission credits. In simple terms, credits might be earned if overall fuel usage is reduced by making changes to the trucks or their operation, such as the type of lubricant, wheel width, idling elimination (where it is not mandated), minimizing empty trips, switching from trucks to rail or water transport, using only trucks with pre-defined retrofit packages, using only hybrid trucks for local transportation and insisting on ocean going vessels having certain fuel economy improvement strategies installed. These are just some of the ways fuel can be saved. If, as a result of the choices you make, the total amount of fuel used and emissions produced is reduced, then valuable emission credits could be earned. (It is worth noting that capturing those credits depends on following mandated requirements and gaining approval for the project.)
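To make the arithmetic concrete, here is a toy calculation of the credit value such a fuel reduction might create. The emission factor and credit price below are illustrative assumptions, not values from any specific crediting program:

```python
# Toy estimate of GHG credits earned from reduced truck fuel use.
# Both constants below are illustrative assumptions, not program values.

DIESEL_CO2_KG_PER_LITER = 2.68   # approximate combustion emission factor
CREDIT_PRICE_PER_TONNE = 15.0    # assumed market price, USD per tonne CO2e

def credit_value(liters_saved_per_year: float) -> float:
    """Estimated annual credit value (USD) from diesel saved."""
    tonnes_co2 = liters_saved_per_year * DIESEL_CO2_KG_PER_LITER / 1000.0
    return tonnes_co2 * CREDIT_PRICE_PER_TONNE

# e.g. a fleet change saving 100,000 liters of diesel per year:
print(round(credit_value(100_000), 2))  # → 4020.0
```

Even at a modest assumed credit price, the value scales directly with fuel saved, which is why it can meaningfully shorten a project's payback period.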

Global Market for GHG Credits

If your corporate environmental strategy requires that you retain ownership of these reductions, then you keep the credits created and the value of those credits should be placed on the balance sheet as a Capital Asset. Alternatively, if you are able, the credits can be sold on the open market and the cash realized and placed on the balance sheet. Either way, shareholders will not only get the ‘feel good’ benefit of the environmental improvement, but also the financial benefit from improvement to the balance sheet. If preferred, the credits can be sold to directly offset the purchase price of the material involved, effectively reducing that price and so increasing the margin on the sales price of the end-product and again improving the bottom line. If capital investment is required as part of the supply chain optimization, the credit value can also be a way to shorten the payback period and improve the ROI, or to allow an optimization to occur that otherwise would not.

So, when you consider improving your environmental impact or optimizing your supply chain, consider the possibility that there might be additional value to unlock if you include both environmental and traditional business variables in your supply chain improvement efforts.

Written by: Peter Chant, President, The FReMCo Corporation Inc.

I was sitting on the plane the other day and chatting with the guy in the next seat when I asked him why he happened to be traveling.  He was returning home from an SAP ERP software implementation training course.  When I followed up and asked him how it was going, I got the predictable eye roll and sigh before he said, “It was going OK.”  There are two things that were sad here.  First, the implementation was only “going OK” and second, that I had heard this same type of response from so many different people implementing big ERP that I was expecting his response before he made it.

So, why is it so predictable that the implementations of big ERP systems struggle?  I propose that one of the main reasons is that the implementation doesn’t focus enough on the operational decision-making that drives the company’s performance.

A high-level project history that I’ve heard from too many clients looks something like this:

  1. Blueprinting with wide participation from across the enterprise
  2. Implementation delays
    1. Data integrity is found to be an issue – more resources are focused here
    2. Transaction flow is found to be more complex than originally thought – more resources are focused here
    3. Project management notices the burn rate from both internal and external resources assigned to the project
  3. De-scoping of the project from the original blueprinting
    1. Reports are delayed
    2. Operational functionality is delayed
  4. Testing of transactional flows
  5. Go-live involves operational people at all levels frustrated because they can’t do their jobs

Unfortunately, the de-scoping phase seems to hit some of the key decision-makers in the supply chain, like plant schedulers, supply and demand planners, warehouse managers, dispatchers, buyers, etc. particularly hard, and it manifests in the chaos after go-live.  These are the people that make the daily bread-and-butter decisions that drive the company’s performance, but they don’t have the information they need to make the decisions that they must make because of the de-scoping and the focus on transaction flow.  (It’s ironic that the original sale of these big ERP systems is made at the executive level as a way to better monitor the enterprise’s performance and produce information that will enable better decision-making.)

What then, would be a better way to implement an ERP system?  From my perspective, it’s all about decision-making.  Thus, the entire implementation plan should be developed around the decisions that need to be made at each level in the enterprise.  From blueprinting through the go-live testing plan, the question should be, “Does the user have the information in the form required and the tools (both from the new ERP system and external tools that will still work properly when the new ERP system goes live) to make the necessary decision in a timely manner?”  Focusing on this question will drive user access, data accuracy, transaction flow, and all other elements of the configuration and implementation.  Why? Because the ERP system is supposed to be an enabler and the only reasons to enter data into the system or to get data out is either to make a decision or as the result of a decision.

Perhaps with that sort of a focus there will be a time when I’ll hear an implementation team member rave about how much easier it will be for decision-makers throughout the enterprise once the new system goes live.  I can only hope.

Rich Guy

The rise of zombies in pop culture has given credence to the idea that a zombie apocalypse could happen. In a CFO zombie scenario, CFOs would take over entire companies, roaming the halls eating anything living that got in their way. They would target the brains of supply chain managers and operations people. The proliferation of this idea has led many business people to wonder “How do I avoid a CFO zombie apocalypse?”

Supply chain managers are seeking and developing new and improved ways to exploit the volumes of data available from their ERP systems. They are choosing advanced analytics technologies to understand and design efficient, sustainable supply chains. These advanced analytics technologies rely on the use of optimization technology. I am using the mathematical concept of “optimization” as opposed to the non-mathematical process of making something better.

Mathematical optimization technology is at the heart of more than a few supply chain software applications. These applications “optimize” some process or decision. Optimization-based applications, such as those frequently found in strategic supply chain network planning, factory scheduling, sales and operations planning, and transportation logistics, use well-known mathematical techniques such as linear programming to determine the “best” result scientifically. That “best solution” is usually defined as minimizing or maximizing a single, specific variable, such as cost or profit. However, in many cases the best solution must account for a number of variables or constraints. Advanced analytics technologies can improve a company’s bottom line – and revenue, too! CFOs like this.
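As a minimal sketch of what such a linear program looks like, consider a toy transportation problem: which plant ships how much to which customer at minimum freight cost. The lane costs, plant capacities and customer demands below are invented for illustration:

```python
# A toy transportation LP: ship from two plants to two customers at
# minimum total freight cost. All numbers are made-up illustration data.
from scipy.optimize import linprog

# decision variables: x = [p1->c1, p1->c2, p2->c1, p2->c2]
cost = [4, 6, 5, 3]                  # freight cost per unit on each lane
A_ub = [[1, 1, 0, 0],                # plant 1 capacity row
        [0, 0, 1, 1]]                # plant 2 capacity row
b_ub = [60, 50]                      # plant capacities
A_eq = [[1, 0, 1, 0],                # customer 1 demand row
        [0, 1, 0, 1]]                # customer 2 demand row
b_eq = [40, 50]                      # customer demands

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * 4)
print(round(res.fun))  # minimum total freight cost → 310
```

Real network models differ only in scale: thousands of lanes, plants and products instead of four variables, which is exactly why software is needed to sort through the options.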

Advanced analytics technologies provide easy-to-use, optimization-based decision support solutions to solve complex supply chain and production problems.  And, these solutions can help companies quickly determine how to most effectively use limited resources and exploit opportunities.

So, from my perspective, there are seven practical reasons to embrace advanced analytics technologies:

  1. Your company saves money, increases profits.
  2. You get to use all your ERP system’s data.
  3. It’s straightforward and uncomplicated.
  4. You have the tools to discover great ideas and make better decisions.
  5. At the end of the day, you know the total cost of those decisions.
  6. You have a roadmap to make changes.
  7. You avoid the CFO zombie apocalypse.

This article written by Alan Kosansky and Jim Piermarini was originally published in Supply Chain Brain.

More than a decade has passed since businesses started using Enterprise Resource Planning (ERP) for managing data and transactions throughout the supply chain. Traditionally, ERP systems have provided transparency and insight into transaction-level data in the supply chain that support important business planning activities. Now, a new generation of applications is being developed to help fill the gaps between general business planning and business-specific, tactical and strategic decisions. These ERP-connected applications offer supply chain executives previously unavailable analysis and insights into the decisions that directly impact customer service, profitability and competitive advantage.

Supply Chain Differences

Supply chains are as different as the companies and people that run them. Some companies view their supply chain operations as a “utility” that is expected to function without any investment in intellectual capital. These organizations are content to rely on industry best practices in their supply chain operations and follow the leaders (or the features that are added by ERP software providers) in supply chain improvement. Other organizations see their supply chain operations as a strategic opportunity to develop a competitive advantage and increase market share. They know that with some small departures from the norm and a modest investment in intellectual capital, supply chains can provide enhanced performance to the business. These companies understand that there are opportunities for creative and unique ideas in the supply chain to improve company performance and achieve business strategy objectives.

Today, many C-level executives see their ERP systems as key enablers to company productivity, and for the most part, they are correct. Since ERP systems perform many valuable functions, there is a natural assumption that they can handle whatever business strategy the company adopts. However, new business ideas by definition run the risk of stressing the ERP system features beyond their ability to cope. Usually these failures are discovered only during the implementation of a new business strategy. So what happens when the ERP system fails to support the new business strategy in certain critical details? Those working in the trenches know this scenario all too well. But, what can be done to implement strategic supply chain initiatives when ERP is not equipped to handle business-specific initiatives?

Making the ERP Work

There are three possible approaches for implementing supply chain planning activities that offer a company a competitive advantage:

1. Figure out how to get the ERP system to do it. This approach works well if the company’s needs align well with current industry practices supported by ERP systems. Otherwise, companies may find themselves going down a path that consumes significant resources for a poor fit in the end. Companies that adhere to this path typically do so in part because there is a strong C-level edict in favor of simple, clean upgrades for the ERP system. Faced with this, the IT organization has enormous power to shape the nature of the supply chain operation to fit within the established ERP norms, and thus can act as a barrier for business innovation and supply chain improvement.

2. Modify the ERP system to provide new functionality. This is an approach often promoted by IT organizations committed to supporting the fewest number of tools. While this is an important cost management objective, it is important to understand the full cost to implement and support the system over the long term. What can be accomplished is often limited by the lack of flexibility in large ERP systems and IT organizations. Since ERP systems are mission-critical systems, the support and maintenance of the core functions are of paramount importance. This task, placed on a limited IT staff, leads to large backlogs of enhancement work and long queue times. And while IT departments are well-equipped to manage their primary assets, few if any IT departments have the requisite domain knowledge to cross over into supply chain optimization. Given long wait times, organizations will often choose the simplest approximation of the business change that can be ushered to the top of the queue. This approach can result in a quick-fix style of strategy implementation, rather than a priority-based feature development, and may leave the most important aspects of the initiative lingering in the queue.

3. Add an integrated solution to the ERP system that replaces one or more functions that are needed to achieve the business strategy. This could be from an out-of-the-box third-party provider, or for full competitive advantage, a targeted or custom supply chain application that integrates with the company’s ERP data. This approach has the benefit of including priority-based features that the current ERP system lacks, and the additional benefit of avoiding the ERP enhancement queue. The downside, however, is that it suffers from the stigma of being yet another application and not the ERP system itself. This usually presents a hurdle that requires a careful analysis to understand the total cost relative to the strategic benefit. While not all business changes will overcome this roadblock, there are good reasons to look at this approach. These include:

  • Ensuring a tight fit between the business strategy and the tool execution
  • Minimizing the cost, overhead, and extra setup and maintenance of unneeded functions in a shrink-wrapped general-purpose tool
  • Providing the marketplace with a specialized and unique operation of the supply chain for competitive advantage.

Example from the Field

A leading consumer electronics company with about $2bn in annual sales implemented an integrated solution to its ERP system to manage its order fulfillment process for competitive advantage. The company had recently modified its corporate strategy to increase retail sales through its “big box” customers (Walmart, Best Buy, Staples, etc.). However, key service level agreements were not being met for these customers due to lower than expected order fulfillment measures. A simple inventory analysis recommended large increases in the stock required at the warehouse, with some method of segregating inventory for each big-box customer so it could not be taken by orders from other customers.

In this case, one of the leading causes of low service for customers was that they ordered “just-in-time”. These JIT orders were not being given any priority over other customers’ orders with longer lead times. The company noted that these important customers may have provided accurate plan information, but that was not being used to assure them any better service. The analysis recommended that separate stocks of inventory be set up based on the big-box planning information, and that other customers not be allowed to take from those inventory locations. This would result in a large increase in overall stocks, but should achieve the desired increase in service levels.

One manager questioned this recommendation, wanting to know why the ERP system did not use the big-box planning information to appropriately manage the company’s service levels. She also questioned what could be done to avoid increasing her inventory risk and yet still achieve the business strategy. This is a question many managers face when their analysts say that to improve service you need to increase inventory levels. Often there are alternatives. This key manager’s insight set the path for her company to make a significant shift in their supply chain operations, with remarkable benefits. What follows will answer the question: Can I raise the service level of my key customers without increasing my inventory and capital risk? The short answer is, “yes”. Significant service benefits and risk reductions can be achieved, but only if you are willing to deviate from your ERP’s standard approach to implementing key supply chain initiatives.

The industry standard approach for assigning available inventory to open orders is to use a FIFO (first in, first out) approach. This approach prioritizes orders based on when the order was received and assigns on-hand inventory to those orders that were received and entered into the system first. While this approach has a degree of fairness to it, and is available in all ERP systems, it did not align well with the business objectives of this company. It actually penalized key customers who issued JIT purchase orders while giving ample planning information. These JIT orders would have to wait until all the older orders, from non-key customers, were allocated before they would be assigned any inventory.

The standard ERP process does not take into consideration the customer’s strategic importance or their planning information. Given this FIFO process, the internal recommendation makes sense: set up separate safety stocks for each big-box customer (based on their planning information), in separate inventory locations, and make a rule that directs big-box orders to their separate inventory.

But having separate safety stocks violates the risk-pooling principle: customers served from a shared stock require less total inventory than the sum of what each requires individually. Pooling the inventory helps to avoid unnecessary capital risk. The standard ERP FIFO inventory assignment process could be replaced with one that met customer needs more effectively.
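The pooling effect can be illustrated with a textbook safety-stock calculation, assuming independent, normally distributed demand; the service factor and demand figures below are hypothetical:

```python
# Risk-pooling illustration with the textbook safety-stock formula
# (z * demand sigma). The demand figures are hypothetical.
import math

z = 1.65                      # ~95% cycle-service-level factor
sigmas = [100, 80, 60, 40]    # demand std. dev. for four big-box customers

separate = sum(z * s for s in sigmas)                # one stock per customer
pooled = z * math.sqrt(sum(s * s for s in sigmas))   # one shared stock

print(round(separate))  # → 462 units of safety stock
print(round(pooled))    # → 242 units: far less for the same service level
```

The pooled stock is smaller because independent demand fluctuations partially cancel each other out, which is exactly the capital-risk benefit lost when inventory is walled off per customer.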

The company embarked on a project to take into account several important factors when deciding how much inventory to assign to each order:

  • The priority of the customer
  • The amount of inventory actually in the sales channel of the customer, and
  • The planning information that the customer shared with the company.

Customer priority is a key and strategic factor in deciding which customers receive product, when inventory availability is limited or delayed. This business need meant that strategic and high-volume customers should typically be serviced before others. However, this may not be the case if a strategic or high-volume customer happens to be sitting on a lot of inventory in their channel. In these cases, it may be preferable to share the wealth with smaller volume resellers to maximize the sell-through to retail customers. Moreover, these rules may apply differently for each SKU in a manufacturer’s product line.

The business rules to implement these sorts of complex trade-offs can get complicated. If one wants to retain a certain amount of flexibility in these rules, then the ERP system is a poor place to make these decisions. However, since most, if not all, of the data resides in the ERP system, these decisions must be tightly integrated with the data and transaction handling within the ERP system. So an application was constructed to manage the inventory assignment process in this way to more closely match the business strategy. The new application is run several times a day, extracting needed info from the ERP system, making the assignment of inventory to all open orders, and sending back the info to the ERP system.
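A minimal sketch of what such a non-FIFO assignment pass might look like is below. The order fields, ranking rule and numbers are assumptions for illustration, not the actual application's logic:

```python
# Sketch of a priority-based (non-FIFO) inventory assignment pass.
# Order fields and the ranking rule are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Order:
    customer: str
    qty: int
    priority: int           # 1 = strategic big-box; higher = less critical
    channel_inventory: int  # units already in the customer's sales channel

def assign(on_hand: int, orders: list[Order]) -> dict[str, int]:
    """Allocate on-hand stock by customer priority first, then by how
    little inventory the customer already holds in its channel."""
    allocation = {}
    for o in sorted(orders, key=lambda o: (o.priority, o.channel_inventory)):
        filled = min(o.qty, on_hand)
        allocation[o.customer] = filled
        on_hand -= filled
    return allocation

orders = [
    Order("reseller", 300, priority=2, channel_inventory=50),
    Order("big-box A", 400, priority=1, channel_inventory=900),
    Order("big-box B", 400, priority=1, channel_inventory=100),
]
print(assign(600, orders))
# big-box B (strategic, low channel stock) fills first, then big-box A,
# and the reseller waits, regardless of who ordered first
```

Sorting on (priority, channel inventory) captures two of the three factors above; a production version would also fold in the customer's shared planning information before writing the allocations back to the ERP system.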

Using this integrated solution, overall service levels for these key customers were sharply increased, prompting several supply chain awards from these big-box customers. As a result of the increase in service level, Walmart (a strategic customer) was so pleased they chose to increase their orders of all this company’s products by 100 percent. The overall inventory did not increase.

The new method demonstrated that pooled inventory was an effective approach to containing inventory levels. In subsequent versions of this application, the integration of point of sale data has allowed even more control over the inventory in the various channels to market. As a result, this company has declared this application a business-critical application. It overcame the hurdle, and the application can defend its spot on the chart of critical business applications alongside the ERP system.

Integrated Solution Success

Using an integrated solution to the ERP system was a win-win approach that allowed the business the flexibility to manage order fulfillment for competitive advantage while maintaining the benefits of centralized data and the strong transactional handling capabilities delivered by ERP.

But order fulfillment is not the only area where there is opportunity to supplement the strengths of ERP with flexible and powerful business optimization processes and tools. Other areas where leading companies have decided to enhance their ERP capabilities include optimization-based infrastructure planning, sales and operation planning, distribution route and territory planning, transportation bid optimization, transportation fleet planning, and production scheduling.

These are just some examples of where complex and/or strategic business rules can provide competitive advantage through improved supply chain performance. While ERP systems remain the backbone of all successful large business operations today, they are not the only path available to companies who desire to apply innovative approaches to their business and supporting supply chain activities. Global enterprises that seek a competitive advantage now have the opportunity to leverage their ERP investments by integrating optimization-based solutions to key business strategies.

Profit Point’s S&OP software and service helps global manufacturers improve forecasting, operations planning, sales and profitability.

Profit Point, the leading supply chain optimization software and services company, today announced the release of its Profit S&OP software to complement its S&OP consulting services. Profit Point’s combined S&OP solution provides business decision makers with the process and tools to manage and optimize sales and operations planning across the supply chain.

The Profit S&OP software tool is fully-customizable to meet the needs of supply chains across all industries and is designed to improve tactical planning for the key decision makers across a company, including finance, sales, manufacturing, logistics and supply chain. The software provides a centralized dashboard to gain insights and control over a company’s supply chain, including features to enhance collaborative forecasting and improve manufacturing, distribution, and inventory decisions.

“Global manufacturers struggle to accurately plan for global demand across their product lines in a timely manner,” noted Alan Kosansky, Profit Point’s President. “Our S&OP solution solves this problem with a combination of effective processes and a shared planning tool that provides one set of numbers for the key stakeholders across the entire supply chain.”

Using Profit Point’s S&OP solution, manufacturers can coordinate with their supply chain planners across the globe to build accurate, detailed manufacturing and distribution plans quickly and integrate with point-of-sale demand tracking systems. And, the software connects with existing ERP systems, such as SAP® and Oracle®, so analysis and decisions are up to date across the entire organization.

“Improved planning can help any large manufacturer reduce inventory excess and capital risk.” said Jim Piermarini, Profit Point’s CEO. “But the key to successful planning includes the right technology and the right process to align employees with the company’s strategic objectives.”

Profit S&OP has an integrated optimization engine that seamlessly drives the best scenarios to the forefront of tactical planning sessions. Throughout the process, decision makers are able to visualize and test multiple future scenarios to achieve a collaborative, cross-discipline decision-making process. Key features in the software include the ability to automatically generate an optimal tactical plan down to the bill of materials (BOM) level, integration with an existing ERP data warehouse, a multi-period planning horizon, a scenario analyzer to systematically assess multiple future scenarios, complex BOM exploration, and the ability to visualize plans, timelines and bills of materials to correct bottlenecks and reduce excesses.

To learn more about Profit Point’s sales and operations planning software and services, call us at (866) 347-1130 or contact us here.

About Profit Point:
Profit Point Inc. was founded in 1995 and is now a global leader in supply chain optimization. The company’s team of supply chain consultants includes industry leaders in the fields of infrastructure planning, green operations, supply chain planning, distribution, scheduling, transportation, warehouse improvement and business optimization. Profit Point’s combined software and service solutions have been successfully applied across a breadth of industries and by a diverse set of companies, including Dow Chemical, Coca-Cola, Toys “R” Us, Logitech and Toyota.

Here at Profit Point we regularly hear from clients with well-established Enterprise Resource Planning (ERP) systems that they need something more.  ERP systems are excellent for doing certain things, including:

  1. Providing central repositories of data
  2. Enabling cross functional work processes within and across companies
  3. Costing of goods
  4. Planning resources and materials at a high level

However, the more complicated your business work processes and manufacturing production processes, the less sufficient a standard ERP system will be in providing the best decision support functionality.  Some of the complications that require decision support systems (DSS), and which we have been helping clients deal with lately, include:

  1. Work processes to handle make to order versus make to stock material assignments
  2. Allocation of inventory to customer orders when in an oversold position
  3. Sequence dependent setups / cleanings of manufacturing equipment
  4. Scheduling of production sequenced through a “product wheel”
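To give a sense of why sequence-dependent setups demand analytical support, here is a minimal, hypothetical Python sketch: the product names and changeover costs are invented, and a real DSS would use heuristics or a mixed-integer solver rather than brute force, since the number of orderings explodes factorially.

```python
import itertools

# Hypothetical changeover-cost matrix: changeover[(i, j)] is the cleaning/setup
# cost incurred when product j follows product i on the same equipment.
changeover = {
    ("A", "B"): 2, ("A", "C"): 5,
    ("B", "A"): 4, ("B", "C"): 1,
    ("C", "A"): 3, ("C", "B"): 6,
}

def sequence_cost(seq):
    """Total changeover cost of running products in the given order."""
    return sum(changeover[(a, b)] for a, b in zip(seq, seq[1:]))

def best_sequence(products):
    """Exhaustively search all orderings for the cheapest one.
    Workable for a handful of products only; real problems need smarter search."""
    return min(itertools.permutations(products), key=sequence_cost)

best = best_sequence(["A", "B", "C"])
print(best, sequence_cost(best))
```

Even this toy version shows the structure of the problem: the cost of each production slot depends on what ran before it, so the schedule cannot be built one product at a time.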

DSS are necessary because of the complexity of first finding a feasible solution and then having some means of sorting through the huge number of feasible options to find a “good” or “optimal” solution.  DSS help in these kinds of situations to:

  1. Reduce costs
  2. Reduce manufacturing lead times
  3. Improve customer service
  4. Increase revenue

ERP systems are a necessary part of being able to deliver a DSS by providing the data necessary for making the decisions in question but don’t have the following:

  1. Ability to be tailored to a specific work process or manufacturing environment
  2. Advanced analytical capability to sort through the complexity and volume of options to get to a “good” or “optimal” solution
  3. Graphical user interface tools to be able to allow a user to visualize the data in a way that gives them the insights needed to make decisions

At Profit Point we specialize in listening to our clients’ needs and then building DSS to unlock improvement opportunities that enable our clients to outdistance the competition.

At Profit Point, we often repeat the mantra “People, Process, Technology.”  All three are important for the kinds of projects we work on.  You have to have good systems (the technology part) that support good work processes and people that follow the process and use the systems.  If your people are not committed to following the process and using the systems, you are going nowhere fast.

Recently we were discussing with a senior manager at one of our clients what makes for a good Sales and Operations Planning Process (S&OP Process).  Being someone who is more of a process and technology guy, I was thinking that he might say something like “You have to have a well thought out work process that is clearly communicated to everyone involved” or “You have to have a system that is easy to work with that supports the work process well.”  WRONG!

The first thing he mentioned was that senior management needed to be openly committed to the process and systems.  He illustrated this for us by recounting what another senior manager at this same client said during an S&OP meeting with a large group.  The group was going back and forth discussing a “potential” order from a customer and this particular senior manager said “If it’s not in the system then it’s a rumor and we don’t plan and schedule for rumors.”

As you can imagine, this cut down on the chatter in the room quite quickly.  This client had spent a lot of time and money developing processes and systems that worked well and those two things are necessary but not sufficient.  You have to have leadership that says “We have a work process to follow and a system to use to support executing that process.  Follow the process and use the system.”

Next you have to have people who do exactly that!  If this is not happening then as I heard from another executive “Either the people will change or the people will change!”

You have to be able to trust the data in the system but really at its root this boils down to trusting the people who entered the data in the system.  As I was reminded, this starts at the top!

Big ERP

May 4th, 2010 11:01 pm Category: Enterprise Resource Planning, by: Richard Guy

Many of us have worked with companies that provide large ERP solutions. Some of these experiences have been successful and others somewhat less than ideal. If you work in the manufacturing, supply chain, or logistics areas, then you realize the vast importance of having access to meaningful data, although implementing a large ERP system does not necessarily mean you can get to that data. It has been my experience that having the data available, being able to get to it, and using it to perform strategic or tactical analyses can all be challenging.

I recall a situation that happened to me many years ago. I was working for a large corporation that maintained a large database on its customers. All of this information was on a mainframe. To get access to the data so that we could perform analyses and generate reports required communication with the MIS department. We would schedule a meeting with one of the analysts to discuss what data we needed access to and what reports we required. The analyst would routinely tell us to fill out a job request form number 777. Then this form needed to go through several levels of management approval. If the request made it through the approval process, it was put on the development schedule. Typically, from start to end, the process would take several months. In today’s world that would not be acceptable.

Despite our egalitarian mindset in the U.S., when it comes to customers, let’s face it: They have never been ‘created equal.’ Certainly for decades, manufacturers and distributors have offered better pricing to some customers than others. We’re all familiar with quantity break pricing, column pricing with different discount levels for different categories of customers, and contract pricing. And who doesn’t visit the local supermarket today and notice the ‘buy 3 get 1 free’ offers to encourage us to increase our purchases?

Volume is valuable and warrants better pricing, we are in the habit of believing. And most often this is true. Not only does a high-volume customer drive our buying power with suppliers by helping us reach the next price break level on the purchasing side, but it can make each sale more profitable: The cost of servicing 10 orders that result in a sale of 100 units can be 10 times as great as the cost of servicing a single order for those 100 units.

This bias towards volume underlies traditional customer ranking methods. But many manufacturers today are taking a closer look at these policies and finding them lacking. Instead, they are engaging in a detailed cost analysis effort called ‘cost-to-serve.’ While cost-to-serve can be a very broad subject covering product costs, location costs, transportation costs and service costs, to name a few, this article will take a look primarily at customer costs.

It’s not that heretofore companies have ignored factors that shade the degree of profitability of a large client. Many firms, presented with the opportunity of doing business with, say, Wal-Mart or the federal government, may question whether it’s really worth doing. They’re thinking about the overhead of handling such a client and the cost of meeting client demands – with slim price margins.

What’s different today is that companies are trying to measure these costs precisely and to make informed, scientific decisions based upon them. Whether they engage consulting firms who have developed methods for tackling this measurement, purchase software to help them out, or devise their own internal approach, more and more manufacturers and wholesalers are gathering detailed costs and trying to apply them to decisions about their customers.

Consumer goods companies, for instance, are recording metrics such as the true cost of customer service. How much support time does this customer require of the customer service organization? How much sales time do we devote to them? Does the customer frequently return merchandise, and if so, what is the cost of processing that return? In the case of consumer goods manufacturers, we might also look at custom-branded merchandise: What is the true cost of providing private labeling for a retailer? Are we really capturing in the product cost all of the special handling required by the purchasing and distribution organizations? All of these costs are very important in assessing a customer’s true profitability.

On the other side of the equation, there may be some sales and marketing benefits that a customer brings, and these, too, should be weighed. Does the name ‘Wal-Mart’ on our client list provide positive benefit to the organization? Is another client, who doesn’t seem to purchase very much, an outstanding reference who sends other potential customers our way? If a business can establish a process and gain agreement across the organization on measuring true costs and benefits, it can define policies to more precisely control bottom-line revenue.

Certainly, one of the first decisions that can be made, once true costs are measured and accepted by an organization, is to eliminate customers who are really unprofitable. But cost-to-serve can also come into play in other ways. We may want to devise strategic programs that nurture our best clients to safeguard their business. We may hold special events for them or assign dedicated reps, for instance.

One of the situations where cost-to-serve becomes a critical tool is in inventory allocation, particularly in an inventory shortage situation. When there is insufficient inventory to meet demand, most manufacturers will want to serve the most valuable customers first.

This frequently comes into play in segments of the technology industry, such as computer peripherals, typically with the launch of a popular new consumer product. An extreme example of this might be the launch of a new Wii game player at the start of the holiday season. Armed with true cost-to-serve data, manufacturers could make allocation decisions scientifically to spread the available inventory across the order pool while maximizing profit.

You might ask whether this process can be automated today. The answer is ‘partially.’ Allocation can certainly be automated, but collecting cost-to-serve data on customers usually involves some manual steps, because most companies don’t have all the systems in place to collect this data automatically (and even with sophisticated systems, the data may not be collected in exactly the way you wish). Some spreadsheet work may be required. Once the spreadsheet is in place, however, the process becomes straightforward.

Perhaps you want to rank customers sequentially from top to bottom, or group them into ‘profit’ segments. Once that is done, an algorithm can be designed to optimize the allocation of inventory according to the rules tied to those rankings or segments. The allocation algorithm might be designed to work directly from the spreadsheet, as well, automating even more of the process. In any case, executing the service decisions in accord with true costs ensures we are protecting our most valuable customers.
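As a rough illustration of the rank-driven allocation described above, here is a hypothetical Python sketch; the customer names, ranks, and quantities are invented, and a production allocation engine would layer on many more rules (minimum fills, contract commitments, and so on).

```python
# Hypothetical order pool: each order carries the customer's profit rank
# (1 = most valuable, derived from cost-to-serve analysis) and the quantity.
orders = [
    {"customer": "RetailerX", "rank": 1, "qty": 60},
    {"customer": "DistribY",  "rank": 3, "qty": 50},
    {"customer": "ChainZ",    "rank": 2, "qty": 40},
]

def allocate(orders, available):
    """Fill orders in rank order until inventory runs out; partial fills allowed."""
    allocation = {}
    for order in sorted(orders, key=lambda o: o["rank"]):
        shipped = min(order["qty"], available)
        allocation[order["customer"]] = shipped
        available -= shipped
    return allocation

print(allocate(orders, available=100))
```

With 100 units against 150 units of demand, the top-ranked customer is filled completely, the second-ranked customer takes what remains, and the lowest-ranked customer receives nothing, which is exactly the protect-the-best-customers behavior the ranking is meant to enforce.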

The application of cost-to-serve to inventory allocation takes on an even more interesting aspect for consumer goods manufacturers who ship to retailers. As those of us familiar with this industry are aware, most large retailers have very specific guidelines defining how suppliers must do business with them. The retailers specify how an order must arrive – shipped complete, packed by store, etc.; when it must arrive – ‘arrive by’ date; and a variety of paperwork details including design, content and placement of shipping labels and bills of lading. Associated with each of these requirements is a dollar penalty the supplier will incur, taken as a deduction from the supplier’s invoice, for violation of the guideline.

For a consumer goods manufacturer, these penalties, or ‘chargebacks,’ can mean the difference between a profitable client and an unprofitable one. In this situation, the ability to allocate inventory defensively, to minimize chargebacks (or at least make an informed, scientific decision to incur them), is critical. A powerful allocation engine, in an inventory shortage situation, can maximize profit by factoring potential chargeback costs for late or partial shipment into the equation. In this case, the allocation engine ensures that the cost to serve the retailer is as low as possible.

In addition to retailer penalties, another aspect of ‘allocation-according-to-true-cost’ involves inventory fulfillment location choices. If a company operates a single distribution center in Los Angeles and imports all its product from Asia, there may be only a single fulfillment option. But for the vast majority of consumer goods manufacturers who import from Asia, serve clients nationwide, and operate either multiple distribution centers or a distribution center located in, for instance, the Midwest, there are several options, and a variety of questions arise.

If inventory is constrained at the facility that would normally handle a particular customer’s order, should the order be fulfilled from an alternate facility? To make this decision, we need to factor in not only the additional shipping cost but also to weigh that cost against the value of the customer. There may be low profit customers, viewed from the perspective of cost-to-serve, for whom we do not want to make this investment. In the case of a retailer where a potential penalty is involved, the decision might be made dynamically based on a comparison of the chargeback incurred against the additional cost of shipping. If the chargeback fee would be higher than the additional shipping cost, it may be worthwhile to use the alternate distribution center.
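The chargeback-versus-freight comparison in the paragraph above reduces to a simple rule, sketched below in Python with invented dollar figures; a real engine would also weigh the customer's cost-to-serve value in the same comparison.

```python
def fulfillment_choice(chargeback_fee, extra_ship_cost):
    """Ship from the alternate DC only when the retailer's late/short-ship
    penalty would exceed the added freight. (Illustrative rule only.)"""
    return "alternate_dc" if chargeback_fee > extra_ship_cost else "primary_dc"

# A $500 chargeback versus $180 of extra freight favors the alternate DC.
print(fulfillment_choice(chargeback_fee=500, extra_ship_cost=180))
```

Folding rules like this into the allocation engine is what makes the fulfillment decision ‘dynamic’: it is evaluated per order, at allocation time, rather than fixed by a static sourcing policy.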

This type of on-the-fly fulfillment decision is often called ‘dynamic allocation.’ Another example of dynamic allocation involves intercepting shipments in transit to, say, our hypothetical Midwest distribution center. Least cost fulfillment might dictate fulfilling west coast orders by pulling off inventory required to fulfill them at a deconsolidation facility near the port – before a shipment heads out to the distribution center in the Midwest. Under what conditions is this the least-cost choice? An inventory allocation algorithm based on cost-to-serve can make this decision mathematically, using rules the manufacturer defines.

It’s important to emphasize that the decisions on exactly how to apply cost-to-serve data to inventory allocation will depend on the philosophy of the individual company. For this reason, such allocation solutions are often unique and are adjuncts to the standard capabilities of order management systems. Leading-edge firms who are structuring allocation based on true costs typically do so via point solutions that supplement their central transactional systems.

Profit Point, as the name suggests, provides these point solutions and integrates them into SAP, Oracle, and other order management systems to help clients make the best, most profitable allocation and customer decisions. Our expertise in this area can help clients drive maximal profit to the bottom line.

This article was written by Cindy Engers, a Senior Account Manager at Profit Point.

To learn more about our supply chain data integration and business optimization services, contact us here or call (866) 347-1130.

What is a Monte Carlo model and what good is it? We’re not talking about a type of car produced by General Motors under the Chevy nameplate. “Monte Carlo” is the name of a type of mathematical computer model. A Monte Carlo model is merely a tool for figuring out how risky some particular situation is. It is a method to answer a question like: “What are the odds that such-and-such an event will happen?” Now a good statistician can calculate an answer to this kind of question when the circumstances are simple or if the system that you’re dealing with doesn’t have a lot of forces that work together to give the final result. But when you’re faced with a complicated situation that has several processes that interact with each other, and where luck or chance determines the outcome of each, then calculating the odds for how the whole system behaves can be a very difficult task.

Let’s just get some jargon out of the way. To be a little more technical, any process which has a range of possible outcomes and where luck is what ultimately determines the actual result is called “stochastic”, “random” or “probabilistic”. Flipping a coin or rolling dice are simple examples. And a “stochastic system” would be two or more of these probabilistic events that interact.

Imagine that the system you’re interested in is a chemical or pharmaceutical plant where producing one batch of material requires a mixing and a drying step. Suppose there are 3 mixers and 5 dryers that function completely independently of one another; the department uses a ‘pool concept’ where any batch can use any available mixer and any available dryer. However, since there is not enough room in the area, if a batch completes mixing but there is no dryer available, then the material must sit in the mixer and wait. Thus the mixer can’t be used for any other production. Finally, there are 20 different materials that are produced in this department, and each of them can have a different average mixing and drying time.

Now assume that the graph of the process times for each of the 8 machines looks somewhat like what’s called a ‘bell-shaped curve’. This graph, with its highest point (at the average) right in the middle and left and right sides that are mirror images of each other, is known as a Normal Distribution. But because of the nature of the technology and the machines having different ages, the “bells” aren’t really centered; their average values are pulled to the left or right, so the bell is actually a little skewed to one side or the other. (Therefore, these process times are really not Normally distributed.)

If you’re trying to analyze this department, the fact that the equipment is treated as a pooled resource means it’s not a straightforward calculation to determine the average length of time required to mix and dry one batch of a certain product. And complicating the effort would be the fact that the answer depends on how many other batches are then in the department and what products they are. If you’re trying to modify the configuration of the department, maybe make changes to the scheduling policies or procedures, or add/change the material handling equipment that moves supplies to and from this department, a Monte Carlo model would be the best approach to performing the analysis.

In a Monte Carlo simulation of this manufacturing operation, the model would have a clock and a ‘to-do’ list of the next events that would occur as batches are processed through the unit. The first events to go onto this list would be requests to start a batch, i.e. the paperwork that directs or initiates production. The order and timing for the appearance of these batches at the department’s front-door could either be random or might be a pre-defined production schedule that is an input to the model.

The model “knows” the rules of how material is processed from a command to produce through the various steps in manufacturing and it keeps track of the status (empty and available, busy mixing/drying, possibly blocked from emptying a finished batch, etc.) of all the equipment. And the program also follows the progress and location of each batch. The model has a simulated clock, which keeps moving ahead and as it does, batches move through the equipment according to the policies and logic that it’s been given. Each batch moves from the initial request stage to being mixed, dried and then out the back-door. At any given point in simulated time, if there is no equipment available for the next step, then the batch waits (and if it has just completed mixing it might prevent another batch from being started).

What sets a Monte Carlo model apart however is that when the program needs to make a decision or perform an action where the outcome is a matter of chance, it has the ability to essentially roll a pair of dice (or flip a coin, or “draw straws”) in order to determine the specific outcome. In fact, since rolling dice means that each number has an equal chance of “coming up”, a Monte Carlo model actually contains equations known as “probability distributions”, which will pick a result where certain outcomes have more or less likelihood of occurrence. It’s through the use of these distributions that we can accurately reflect those skewed non-Normal process times of the equipment in the manufacturing department.
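As one small illustration of a skewed, non-Normal process-time distribution, a triangular distribution can stand in; the hours below are invented, not taken from the department described here, and real models would fit a distribution to observed machine data.

```python
import random

# A triangular distribution is a simple stand-in for a skewed process time:
# most mixes finish near the 2.5 h mode, but a long right tail stretches
# toward 6 h. (These hours are illustrative assumptions only.)
def mix_time():
    return random.triangular(low=1.5, high=6.0, mode=2.5)

random.seed(42)
samples = [mix_time() for _ in range(10_000)]
mean = sum(samples) / len(samples)
print(round(mean, 2))  # the mean sits to the right of the 2.5 h mode: the skew at work
```

Repeated calls return different values, but over many draws the proportions settle to the shape of the distribution, which is exactly the behavior the probability-distribution equations in the model rely on.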

The really cool thing about these distributions is that if the Monte Carlo model uses the same distribution repeatedly, it might get a different result each time simply due to the random nature of the process. Suppose that the graph below represents the range of values for the process time of material XYZ (one of the 20 products) in one of the mixers. Notice how the middle of the ‘bell’ is off-center to the right (it’s skewed to the right).


So if the model makes several repeated calls to the probability distribution equation for this graph, sometimes the result will be 2.0-2.5 hrs, other times 3.5-4.0 hrs, and on some occasions >4 hrs. But in the long run, over many repetitions of this distribution, the proportion of times for each of the time bands will be the values that are in the graph (5%, 10%, 15%, 20%, etc.) and were used to define the equation.

So to come back to the manufacturing simulation, as the model moves batches through production, when it needs to determine how much time will be required for a particular mixer or dryer, it runs the appropriate probability equation and gets back a certain process time. In the computer’s memory, the batch will continue to occupy the machine (and the machine’s status will be busy) until the simulation clock gets to the correct time when the process duration has completed. Then the model will check the next step required for the batch and it will move it to the proper equipment (if there is one available) or out of the department all together.
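Tying these pieces together, here is a stripped-down, hypothetical event-driven sketch of the mixer/dryer pool in Python. The process-time distributions are invented stand-ins, the batch-release rule is simplified (all batches queued at time zero), and a real model would track per-product times and richer statistics.

```python
import heapq
import random

random.seed(1)
N_MIXERS, N_DRYERS = 3, 5

def mix_time():  return random.triangular(1.5, 6.0, 2.5)  # hypothetical, skewed
def dry_time():  return random.triangular(2.0, 8.0, 3.0)  # hypothetical, skewed

def simulate(n_batches):
    """Event-driven pass: batches claim any free mixer, then any free dryer;
    a batch that finishes mixing with no free dryer blocks its mixer."""
    events = []                      # priority queue of (time, kind, batch)
    free_mixers, free_dryers = N_MIXERS, N_DRYERS
    waiting_to_mix = list(range(n_batches))
    blocked = []                     # batches done mixing but holding a mixer
    start, finish = {}, {}
    clock = 0.0

    def try_start_mixing():
        nonlocal free_mixers
        while free_mixers and waiting_to_mix:
            b = waiting_to_mix.pop(0)
            free_mixers -= 1
            start[b] = clock
            heapq.heappush(events, (clock + mix_time(), "mixed", b))

    def try_start_drying(b):
        nonlocal free_mixers, free_dryers
        if free_dryers:
            free_dryers -= 1
            free_mixers += 1         # the batch's mixer is released
            heapq.heappush(events, (clock + dry_time(), "dried", b))
            return True
        return False

    try_start_mixing()
    while events:
        clock, kind, b = heapq.heappop(events)
        if kind == "mixed":
            if not try_start_drying(b):
                blocked.append(b)    # hold the mixer until a dryer frees up
        else:                        # "dried"
            free_dryers += 1
            finish[b] = clock
            if blocked:
                try_start_drying(blocked.pop(0))
        try_start_mixing()
    return [finish[b] - start[b] for b in range(n_batches)]

durations = simulate(20)
print(round(max(durations), 1), "h for the slowest batch")
```

The ‘to-do’ list from the article is the `events` heap: the clock always jumps to the next scheduled event, equipment status is updated, and new events are scheduled by drawing fresh process times from the distributions.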

In this way then, the model would continue to process batches until it either ran out of batches in the production schedule that was an input, or until the simulation clock reached some pre-set stopping point. During the course of one run, the computer would have been monitoring the process and recording in memory whatever statistics were relevant to the goal of the analysis. For example, the model might have kept track of the amount of time that certain equipment was blocked from emptying XYZ to the next step. Or if the aim of the project was to calculate the average length of time to produce a batch, the model would have been following the overall duration of each batch from start to finish in the simulated department.

The results from just one run of the Monte Carlo model however are not sufficient to be used as a basis for any decisions. The reason for this is the fact that this is a stochastic system where chance determines the outcome. We can’t really rely on just one set of results, because just through the “luck of the draw” the process times that were picked by those probability distribution equations might have been generally on the high or low side. So the model is run repeatedly for some pre-set number of repetitions, say 100 or 500, and the results of each run are saved.

Once all of the Monte Carlo simulations have been accumulated, it’s possible to make certain conclusions. For example, it might turn out that the overall process time through the department was 10 hrs or more 8% of the time. Or the average length of blocked time, when batches are prevented from moving to the next stage because there was no available equipment, was 12 hrs; or the amount of blocked time was 15 hrs or more in 15% of the simulations.

With information like this, a decision maker would be able to weigh the advantages of adding/changing specific items of equipment as well as modifications to the department’s policies, procedures, or even computer systems. In a larger, more complicated system, a Monte Carlo model such as the one outlined here could help to decrease the overall plant throughput time significantly. At some pharmaceutical plants for instance, where raw materials can be extremely high-value, decreasing the overall throughput time by 30% to 40% would represent a large and very real savings in the value of the work in process inventory.

Hopefully, this discussion has helped to clarify just what a Monte Carlo model is, and how it is built. This kind of model accounts for the fundamental variability that is present in almost all decision making. It does not eliminate risk or prevent a worst-case scenario from actually occurring. Nor does it guarantee a best-case outcome either. But it does give the business manager added insight into what can go wrong or right and the best ways to handle the inherent variability of a process.

This article was written by John Hughes, Profit Point’s Production Scheduling Practice Leader.

To learn more about our supply chain optimization services, contact us here.

Profit Point’s data integration and scheduling optimization services deliver reliable results with reduced operations costs.

North Brookfield, MA

Profit Point today announced that its Profit Data Interface™ software has been selected by Rohm and Haas Company (NYSE: ROH) to integrate its scheduling processes with the company’s ERP data warehouse. The company, which last reported nearly $9 billion in annual sales, produces innovative products for nine industries worldwide through a network of more than 100 manufacturing, technical research and customer service sites. Optimizing and supporting the production and distribution scheduling across this network is a complex and ever-changing process.

“Rohm and Haas has a history of improving our operations to enhance customer service levels and reduce cost,” said Dave Shaw, the company’s Business Process Manager for MFG and Supply Chain. “Production scheduling, which entails constant change to meet demand, is one of the toughest challenges in the supply chain. In the past, the lack of a reliable data interface has limited our ability to react quickly and with a high degree of confidence in our results. Profit Point’s Data Interface software has given us near real-time access to highly reliable data, so we can respond quickly and know that our plan is right.”

Profit Data Interface is a robust application that helps decision makers boost the effectiveness of their ERP data by extending its usefulness with optimization applications. By leveraging existing ERP systems, the software provides a robust and proven method that supply chain managers can rely upon to optimize their critical business processes and improve profitability.

“Rohm and Haas is a recognized leader in the chemicals industry with a reputation for supply chain excellence,” said Jim Piermarini, Profit Point’s CEO. “We have supported their scheduling processes for years. So, it was clear that the next evolution was to directly connect their optimization software to the data store using our Data Interface product.”

Profit Data Interface, which integrates with SAP® and Oracle® data stores, can be used to optimize the entire supply chain including network planning, production and inventory planning, distribution scheduling, sales planning and vehicle routing.

To learn more about Profit Point’s supply chain software and services, visit www.profitpt.com.

About Profit Point:
Profit Point Inc. was founded in 1995 and is now a global leader in supply chain optimization. The company’s team of supply chain consultants includes industry leaders in the fields of infrastructure planning, green operations, supply chain planning, distribution, scheduling, transportation, warehouse improvement and business optimization. Profit Point’s combined software and service solutions have been successfully applied across a breadth of industries and by a diverse set of companies, including General Electric, Dole Foods, Logitech and Toyota.

About Rohm and Haas Company:
Leading the way since 1909, Rohm and Haas is a global pioneer in the creation and development of innovative technologies and solutions for the specialty materials industry. The company’s technologies are found in a wide range of industries including: Building and Construction, Electronics and Electronic Devices, Household Goods and Personal Care, Packaging and Paper, Transportation, Pharmaceutical and Medical, Water, Food and Food Related, and Industrial Process. Innovative Rohm and Haas technologies and solutions help to improve life every day, around the world. Visit www.rohmhaas.com for more information.

Contact:
Richard Guy
Profit Point
(866) 347-1130
http://www.profitpt.com

Leading supply chain consulting firm’s entire line of optimization software is now capable of quickly and easily leveraging SAP’s robust ERP data warehouse.

North Brookfield, MA (PRWEB) August 25, 2008 — Profit Point, a leading supply chain optimization consulting firm, today announced the introduction of Profit Connect, an interface that bridges its line of optimization software applications with SAP’s enterprise resource planning (ERP) applications. With more than 46,000 customers worldwide, SAP is the ERP software of choice for thousands of medium and large businesses. By combining SAP’s central data store with Profit Point’s supply chain optimization software, business managers are now able to gain increased visibility to improve the quality of their critical business decisions.

“Historically, data availability and integrity have been the biggest challenges facing business managers who seek to improve their business operations,” stated Alan Kosansky, Profit Point’s President. “Compatibility with SAP’s real-time data enables our clients to use our industry-leading business optimization tools with easy access to the full universe of SAP data.”

Profit Point’s entire line of supply chain optimization software, which includes tools to improve network design, production and distribution planning, scheduling and vehicle routing, is designed to help manufacturing and distribution managers improve the decisions they make using advanced optimization algorithms and proven supply chain methodologies. By leveraging existing ERP systems, Profit Point’s software provides a robust and proven method that supply chain managers can rely upon to optimize their critical business processes and improve profitability.

“In recent years, we have seen our clients increase their use and reliance on SAP for data management,” said Jim Piermarini, Profit Point’s Chief Technology Officer. “We saw an opportunity to access this data store, so that our clients could easily and accurately aggregate data for their optimization projects and increase the frequency of these business improvement efforts.”

Profit Connect solves the data integration challenges by providing an easy, direct bridge to SAP’s data store. Using Profit Point’s SAP-compatible software, business managers can now avoid data duplication and distortion, improve efficiencies and customer service, cut operational costs and improve decision making through accurate analysis and proven optimization techniques.

To learn more about how Profit Point’s supply chain software can help improve your profitability, contact us here:

(866) 347-1130 or
(435) 487-9141

Send us an Email

If your company is installing, has installed, or is considering installing ERP software, you are probably already aware of this: despite expenditures of millions of dollars and the dedication of dozens (sometimes hundreds) of staff, 60% of ERP projects fail to deliver the results expected of them. This statistic, reported by The Conference Board, means that 6 out of 10 ERP projects are not on time, are over budget, and/or do not deliver the expected value a year or more after launch. Additionally, The Conference Board found that, in most cases, implementation costs run 25% over budget. As high as this failure rate may seem, it is the most favorable outcome among the studies we know of.

Read our complete Enterprise Resource Planning report on how to escape the ERP Blues.

Contact Us Now

610.645.5557



Published articles

  • A Fresh Approach to Improving Total Delivered Cost
  • Filling the Gap: Tying ERP to the Business Strategy
  • 10 Guidelines for Supply Chain Network Infrastructure Planning
  • Making Sound Business Decisions in the Face of Complexity
  • Leveraging Value in the Executive Suite
  • Should you swap commodities with your competitors?
  • Supply Chain: Time to Experiment
  • Optimization Technology Review
  • The Future of Network Planning: On the Verge of a New Cottage Industry?
  • Greening Your Supply Chain… and Your Bottom Line
  • Profit Point’s CEO and CTO Named a "Pro to Know" by Supply & Demand Chain Executive Magazine