Archive for the ‘SAP Integration’ Category

Here at Profit Point, we typically put in a fair amount of effort up front to scope out a project together with our client.  This helps us and our client set appropriate expectations and develop mutually agreeable deliverables, both of which are key to project success.  But another key element of project success is getting good-quality data that will allow our clients to make cost-effective decisions from the analysis work we are doing or the software tool we are implementing.

Decision support models are notorious data hogs.  Whether we are working on a strategic supply chain network design analysis, implementing a production scheduling tool, or building some other optimization model, they all need lots and lots of data.

The first thing we do (usually as part of our scoping effort) is identify each of the data types that will be required and the source of that data.  To do this we start with the decisions that need to be made and the data required to make them successfully.  From there we identify whether the data currently exists in some electronic form (such as an MRP system) or whether it will have to be collected and entered into some system (say, a spreadsheet or database program), and then figure out how the data will get into the tool we are developing.
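
As a minimal illustration (the data types, sources, and load paths below are hypothetical, not from any particular project), this inventory can be captured in a simple structure that records, for each data type, where it lives today and how it will reach the tool:

    # A minimal sketch of a data-requirements inventory (all entries hypothetical).
    # Each data type is mapped to its current source and the planned path into the tool.
    data_requirements = [
        {"data_type": "item master",   "source": "MRP system",      "load_path": "nightly extract"},
        {"data_type": "cycle times",   "source": "spreadsheet",     "load_path": "manual upload"},
        {"data_type": "freight rates", "source": "to be collected", "load_path": "database entry form"},
    ]

    # Flag anything that does not yet exist in electronic form, so collection can start early.
    to_collect = [r["data_type"] for r in data_requirements if r["source"] == "to be collected"]
    print("Data still to be collected:", to_collect)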

Second, we try to get sample data from each data source as early as possible.  This allows us to see whether the assumptions made as part of the scoping effort were valid.  There is nothing like getting your hands on some real data to see if what you and your team were assuming is really true!  Looking at real data often surfaces discoveries and revelations that require design decisions in order to meet the project deliverables.

Third, to help with data validation we find it extremely helpful to be able to visualize the data in an appropriate way.  This could take the form of graphs, maps, Gantt charts, etc. depending on the type of data and model we are working on.  On a recent scheduling project, we had the schedulers review cycle times in a spreadsheet but it wasn’t until they saw the data in Gantt chart form that they noticed problems with the data that needed correcting.
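
As a minimal sketch of the kind of visualization that caught those cycle-time problems (the batch data and the Python plotting code below are illustrative assumptions, not the project's actual tooling):

    # A minimal Gantt-chart sketch with hypothetical batch data: plotting start times
    # and cycle times often exposes data problems that a table of numbers hides.
    import matplotlib.pyplot as plt

    # (resource, batch start hour, cycle time in hours) -- illustrative values only;
    # the 30-hour cycle time on Line 1 is the kind of outlier that jumps out visually.
    batches = [
        ("Line 1", 0, 6), ("Line 1", 6, 30),
        ("Line 2", 0, 8), ("Line 2", 8, 7),
    ]

    fig, ax = plt.subplots()
    rows = sorted({b[0] for b in batches})
    for i, row in enumerate(rows):
        spans = [(start, duration) for r, start, duration in batches if r == row]
        ax.broken_barh(spans, (i * 10, 8))        # one horizontal band per resource
    ax.set_yticks([i * 10 + 4 for i in range(len(rows))])
    ax.set_yticklabels(rows)
    ax.set_xlabel("Hours")
    ax.set_title("Cycle-time review as a Gantt chart")
    plt.show()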

Identifying data sources, getting data as early as possible, and presenting the data in a visual form are absolutely required to make a project successful.  Omitting any of these steps will at best add to the project's cost and/or duration, and at worst doom the project to failure.

I was sitting on the plane the other day, chatting with the guy in the next seat, when I asked him why he happened to be traveling.  He was returning home from an SAP ERP software implementation training course.  When I followed up and asked him how it was going, I got the predictable eye roll and sigh before he said, “It was going OK.”  Two things were sad here.  First, the implementation was only “going OK,” and second, I had heard this same type of response from so many different people implementing big ERP that I was expecting it before he gave it.

So, why is it so predictable that the implementations of big ERP systems struggle?  I propose that one of the main reasons is that the implementation doesn’t focus enough on the operational decision-making that drives the company’s performance.

A high-level project history that I’ve heard from too many clients looks something like this:

  1. Blueprinting with wide participation from across the enterprise
  2. Implementation delays
    1. Data integrity is found to be an issue – more resources are focused here
    2. Transaction flow is found to be more complex than originally thought – more resources are focused here
    3. Project management notices the burn rate from both internal and external resources assigned to the project
  3. De-scoping of the project from the original blueprinting
    1. Reports are delayed
    2. Operational functionality is delayed
  4. Testing of transactional flows
  5. Go-live involves operational people at all levels frustrated because they can’t do their jobs

Unfortunately, the de-scoping phase seems to hit some of the key decision-makers in the supply chain particularly hard: plant schedulers, supply and demand planners, warehouse managers, dispatchers, buyers, and so on.  That shows up in the chaos after go-live.  These are the people who make the daily bread-and-butter decisions that drive the company’s performance, but because of the de-scoping and the focus on transaction flow, they don’t have the information they need to make the decisions they must make.  (It’s ironic that the original sale of these big ERP systems is made at the executive level as a way to better monitor the enterprise’s performance and produce information that will enable better decision-making.)

What, then, would be a better way to implement an ERP system?  From my perspective, it’s all about decision-making.  Thus, the entire implementation plan should be developed around the decisions that need to be made at each level in the enterprise.  From blueprinting through the go-live testing plan, the question should be, “Does the user have the information in the form required, and the tools (both from the new ERP system and from external tools that will still work properly when the new ERP system goes live), to make the necessary decision in a timely manner?”  Focusing on this question will drive user access, data accuracy, transaction flow, and all other elements of the configuration and implementation.  Why?  Because the ERP system is supposed to be an enabler, and the only reasons to enter data into the system or to get data out of it are to make a decision or to record the result of one.

Perhaps with that sort of a focus there will be a time when I’ll hear an implementation team member rave about how much easier it will be for decision-makers throughout the enterprise once the new system goes live.  I can only hope.

Rich Guy

The rise of zombies in pop culture has given credence to the idea that a zombie apocalypse could happen. In a CFO zombie scenario, CFOs would take over entire companies, roaming the halls and eating anything living that got in their way. They would target the brains of supply chain managers and operations people. The proliferation of this idea has led many business people to wonder, “How do I avoid a CFO zombie apocalypse?”

Supply chain managers are seeking and developing new and improved ways to exploit the volumes of data available from their ERP systems. They are choosing advanced analytics technologies to understand and design efficient, sustainable supply chains. These advanced analytics technologies rely on the use of optimization technology. Here I mean the mathematical concept of “optimization,” as opposed to the non-mathematical process of simply making something better.

Mathematical optimization technology is at the heart of more than a few supply chain software applications. These applications “optimize” some process or decision. Optimization-based programs, such as those frequently found in strategic supply chain network planning, factory scheduling, sales and operations planning, and transportation logistics, use well-known mathematical techniques such as linear programming to scientifically determine the “best” result. That “best solution” is usually defined as minimizing or maximizing a single, specific variable, such as cost or profit. However, in many cases the best solution must also account for a number of other variables or constraints. Advanced analytics technologies can improve a company’s bottom line – and they can improve revenue, too! CFOs like this.
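
As a minimal, hypothetical illustration of the kind of linear program these engines solve (the plant names, costs, capacities, and demand below are made-up numbers), consider choosing production quantities at two plants to meet demand at least cost:

    # A minimal linear-programming sketch with hypothetical numbers: choose production
    # at two plants to meet total demand at least cost, subject to capacity limits.
    from scipy.optimize import linprog

    cost = [4.0, 5.5]                  # unit production cost at plant A and plant B
    # Demand constraint x_A + x_B >= 100, written as -x_A - x_B <= -100 for linprog.
    A_ub = [[-1.0, -1.0]]
    b_ub = [-100.0]
    bounds = [(0, 70), (0, 80)]        # capacity limits at each plant

    result = linprog(c=cost, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    print("Optimal total cost:", result.fun)        # 70*4.0 + 30*5.5 = 445.0
    print("Plant A, plant B quantities:", result.x)

Real network planning or scheduling models work the same way; they simply carry many thousands of variables and constraints instead of two and one.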

Advanced analytics technologies provide easy-to-use, optimization-based decision support solutions to solve complex supply chain and production problems.  And, these solutions can help companies quickly determine how to most effectively use limited resources and exploit opportunities.

So, from my perspective, there are seven practical reasons to embrace advanced analytics technologies:

  1. Your company saves money and increases profits.
  2. You get to use all your ERP system’s data.
  3. It’s straightforward and uncomplicated.
  4. You have the tools to discover great ideas and make better decisions.
  5. At the end of the day, you know the total cost of those decisions.
  6. You have a roadmap to make changes.
  7. You avoid the CFO zombie apocalypse.

Despite our egalitarian mindset in the U.S., when it comes to customers, let’s face it: They have never been ‘created equal.’ Certainly for decades, manufacturers and distributors have offered better pricing to some customers than others. We’re all familiar with quantity break pricing, column pricing with different discount levels for different categories of customers, and contract pricing. And who doesn’t visit the local supermarket today and notice the ‘buy 3 get 1 free’ offers to encourage us to increase our purchases?

Volume is valuable and warrants better pricing, we are in the habit of believing. And most often this is true. Not only does a high-volume customer drive our buying power with suppliers by helping us reach the next price break level on the purchasing side, but it can make each sale more profitable: The cost of servicing 10 orders that result in a sale of 100 units can be 10 times as great as the cost of servicing a single order for those 100 units.

This bias towards volume underlies traditional customer ranking methods. But many manufacturers today are taking a closer look at these policies and finding them lacking. Instead, they are engaging in a detailed cost analysis effort called ‘cost-to-serve.’ While cost-to-serve can be a very broad subject covering product costs, location costs, transportation costs and service costs, to name a few, this article will take a look primarily at customer costs.

It’s not that heretofore companies have ignored factors that shade the degree of profitability of a large client. Many firms, presented with the opportunity of doing business with, say, Wal-Mart or the federal government, may question whether it’s really worth doing. They’re thinking about the overhead of handling such a client and the cost of meeting client demands – with slim price margins.

What’s different today is that companies are trying to measure these costs precisely and to make informed, scientific decisions based upon them. Whether they engage consulting firms who have developed methods for tackling this measurement, purchase software to help them out, or devise their own internal approach, more and more manufacturers and wholesalers are gathering detailed costs and trying to apply them to decisions about their customers.

Consumer goods companies, for instance, are recording metrics such as the true cost of customer service. How much support time does this customer require of the customer service organization? How much sales time do we devote to them? Does the customer frequently return merchandise, and if so, what is the cost of processing those returns? In the case of consumer goods manufacturers, we might also look at custom-branded merchandise: What is the true cost of providing private labeling for a retailer? Are we really capturing in the product cost all of the special handling required by the purchasing and distribution organizations? All of these costs are very important in assessing a customer’s true profitability.
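
As a minimal sketch of that kind of assessment (the customers, cost components, and hourly rate below are hypothetical), the per-customer cost components can be rolled up and compared with gross margin:

    # A minimal, hypothetical cost-to-serve rollup: sum the service-cost components
    # tracked per customer and compare them with gross margin to gauge true profitability.
    customers = {
        "Retailer A": {"gross_margin": 120_000, "service_hours": 400,
                       "returns_cost": 15_000, "labeling_cost": 8_000},
        "Retailer B": {"gross_margin": 45_000, "service_hours": 600,
                       "returns_cost": 22_000, "labeling_cost": 0},
    }
    HOURLY_SERVICE_COST = 55.0   # assumed loaded cost of one customer-service hour

    for name, c in customers.items():
        cost_to_serve = (c["service_hours"] * HOURLY_SERVICE_COST
                         + c["returns_cost"] + c["labeling_cost"])
        true_profit = c["gross_margin"] - cost_to_serve
        print(f"{name}: cost to serve = {cost_to_serve:,.0f}, true profit = {true_profit:,.0f}")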

On the other side of the equation, there may be some sales and marketing benefits that a customer brings, and these, too, should be weighed. Does the name ‘Wal-Mart’ on our client list provide a positive benefit to the organization? Is another client who doesn’t seem to purchase very much nonetheless an outstanding reference who sends other potential customers our way? If a business can establish a process and gain agreement across the organization on measuring true costs and benefits, it can define policies to more precisely control bottom-line revenue.

Certainly, one of the first decisions that can be made, once true costs are measured and accepted by an organization, is to eliminate customers who are really unprofitable. But cost-to-serve can also come into play in other ways. We may want to devise strategic programs that nurture our best clients to safeguard their business. We may hold special events for them or assign dedicated reps, for instance.

One of the situations where cost-to-serve becomes a critical tool is in inventory allocation, particularly in an inventory shortage situation. When there is insufficient inventory to meet demand, most manufacturers will want to serve the most valuable customers first.

This frequently comes into play in segments of the technology industry, such as computer peripherals, typically with the launch of a popular new consumer product. An extreme example of this might be the launch of a new Wii game player at the start of the holiday season. Armed with true cost-to-serve data, manufacturers could make allocation decisions scientifically to spread the available inventory across the order pool while maximizing profit.

You might ask whether this process can be automated today. The answer is ‘partially.’ Allocation can certainly be automated, but collecting cost-to-serve data on customers usually involves some manual steps, because most companies don’t have all the systems in place to collect this data automatically (and even with sophisticated systems, the data may not be collected in exactly the way you wish). Some spreadsheet work may be required. Once the spreadsheet is in place, however, the process becomes straightforward.

Perhaps you want to rank customers sequentially from top to bottom, or group them into ‘profit’ segments. Once that is done, an algorithm can be designed to optimize the allocation of inventory according to the rules tied to those rankings or segments. The allocation algorithm might be designed to work directly from the spreadsheet, as well, automating even more of the process. In any case, executing the service decisions in accord with true costs ensures we are protecting our most valuable customers.
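
As a minimal sketch of such a rank-based allocation (the customers, rankings, quantities, and inventory figure below are hypothetical; a real engine would read the rankings from the cost-to-serve spreadsheet rather than a literal list):

    # A minimal rank-based allocation sketch with hypothetical data: when supply is
    # short, fill orders in customer-ranking order, best-ranked customers first.
    orders = [
        {"customer": "A", "rank": 1, "qty": 60},
        {"customer": "B", "rank": 3, "qty": 50},
        {"customer": "C", "rank": 2, "qty": 40},
    ]
    available = 100                                # units of constrained inventory

    allocations = {}
    for order in sorted(orders, key=lambda o: o["rank"]):
        shipped = min(order["qty"], available)     # ship what we can, never more than remains
        allocations[order["customer"]] = shipped
        available -= shipped

    print(allocations)   # {'A': 60, 'C': 40, 'B': 0}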

The application of cost-to-serve to inventory allocation takes on an even more interesting aspect for consumer goods manufacturers who ship to retailers. As those of us familiar with this industry are aware, most large retailers have very specific guidelines defining how suppliers must do business with them. The retailers specify how an order must arrive – shipped complete, packed by store, etc.; when it must arrive – ‘arrive by’ date; and a variety of paperwork details including design, content and placement of shipping labels and bills of lading. Associated with each of these requirements is a dollar penalty the supplier will incur, taken as a deduction from the supplier’s invoice, for violation of the guideline.

For a consumer goods manufacturer, these penalties, or ‘chargebacks,’ can mean the difference between a profitable client and an unprofitable one. In this situation, the ability to allocate inventory defensively, to minimize chargebacks (or at least make an informed, scientific decision to incur them), is critical. A powerful allocation engine, in an inventory shortage situation, can maximize profit by factoring potential chargeback costs for late or partial shipment into the equation. In this case, the allocation engine ensures that the cost to serve the retailer is as low as possible.

In addition to retailer penalties, another aspect of ‘allocation-according-to-true-cost’ involves inventory fulfillment location choices. If a company operates a single distribution center in Los Angeles and imports all its product from Asia, there may be only a single fulfillment option. But for the vast majority of consumer goods manufacturers who import from Asia, serve clients nationwide, and operate either multiple distribution centers or a distribution center located in, for instance, the Midwest, there are several options, and a variety of questions arise.

If inventory is constrained at the facility that would normally handle a particular customer’s order, should the order be fulfilled from an alternate facility? To make this decision, we need not only to factor in the additional shipping cost but also to weigh that cost against the value of the customer. There may be low-profit customers, viewed from the perspective of cost-to-serve, for whom we do not want to make this investment. In the case of a retailer where a potential penalty is involved, the decision might be made dynamically, based on a comparison of the chargeback incurred against the additional cost of shipping. If the chargeback fee would be higher than the additional shipping cost, it may be worthwhile to use the alternate distribution center.
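
A minimal sketch of that trade-off (the function name and dollar figures are hypothetical):

    # A minimal sketch of the dynamic trade-off above, with hypothetical numbers:
    # ship from an alternate facility only when the extra freight is cheaper than the
    # chargeback the retailer would assess for a late or short shipment.
    def choose_fulfillment(chargeback_fee: float, extra_shipping_cost: float) -> str:
        """Return the lower-cost option for this order."""
        if extra_shipping_cost < chargeback_fee:
            return "ship from alternate DC"
        return "accept the chargeback"

    print(choose_fulfillment(chargeback_fee=2_500.0, extra_shipping_cost=1_800.0))
    # -> ship from alternate DC: $1,800 in extra freight beats a $2,500 penalty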

This type of on-the-fly fulfillment decision is often called ‘dynamic allocation.’ Another example of dynamic allocation involves intercepting shipments in transit to, say, our hypothetical Midwest distribution center. Least cost fulfillment might dictate fulfilling west coast orders by pulling off inventory required to fulfill them at a deconsolidation facility near the port – before a shipment heads out to the distribution center in the Midwest. Under what conditions is this the least-cost choice? An inventory allocation algorithm based on cost-to-serve can make this decision mathematically, using rules the manufacturer defines.

It’s important to emphasize that the decisions on exactly how to apply cost-to-serve data to inventory allocation will depend on the philosophy of the individual company. For this reason, such allocation solutions are often unique and are adjuncts to the standard capabilities of order management systems. Leading-edge firms who are structuring allocation based on true costs typically do so via point solutions that supplement their central transactional systems.

Profit Point, as the name suggests, provides these point solutions and integrates them into SAP, Oracle, and other order management systems to help clients make the best, most profitable allocation and customer decisions. Our expertise in this area can help clients drive maximum profit to the bottom line.

This article was written by Cindy Engers, a Senior Account Manager at Profit Point.

To learn more about our supply chain data integration and business optimization services, contact us here or call (866) 347-1130.

Profit Point’s data integration and scheduling optimization services deliver reliable results with reduced operations costs.

North Brookfield, MA

Profit Point today announced that its Profit Data Interface™ software has been selected by Rohm and Haas Company (NYSE: ROH) to integrate its scheduling processes with the company’s ERP data warehouse. The company, which last reported nearly $9 billion in annual sales, produces innovative products for nine industries worldwide through a network of more than 100 manufacturing, technical research and customer service sites. Optimizing and supporting the production and distribution scheduling across this network is a complex and ever-changing process.

“Rohm and Haas has a history of improving our operations to enhance customer service levels and reduce cost,” said Dave Shaw, the company’s Business Process Manager for MFG and Supply Chain. “Production scheduling, which entails constant change to meet demand, is one of the toughest challenges in the supply chain. In the past, the lack of a reliable data interface has limited our ability to react quickly and with a high degree of confidence in our results. Profit Point’s Data Interface software has given us near real-time access to highly reliable data, so we can respond quickly and know that our plan is right.”

Profit Data Interface is a robust application that helps decision makers boost the effectiveness of their ERP data by extending its usefulness with optimization applications. By leveraging existing ERP systems, the software provides a robust and proven method that supply chain managers can rely upon to optimize their critical business processes and improve profitability.

“Rohm and Haas is a recognized leader in the chemicals industry with a reputation for supply chain excellence,” said Jim Piermarini, Profit Point’s CEO. “We have supported their scheduling processes for years. So, it was clear that the next evolution was to directly connect their optimization software to the data store using our Data Interface product.”

Profit Data Interface, which integrates with SAP® and Oracle® data stores, can be used to optimize the entire supply chain including network planning, production and inventory planning, distribution scheduling, sales planning and vehicle routing.

To learn more about Profit Point’s supply chain software and services, visit www.profitpt.com.

About Profit Point:
Profit Point Inc. was founded in 1995 and is now a global leader in supply chain optimization. The company’s team of supply chain consultants includes industry leaders in the fields of infrastructure planning, green operations, supply chain planning, distribution, scheduling, transportation, warehouse improvement and business optimization. Profit Point’s combined software and service solutions have been successfully applied across a breadth of industries and by a diverse set of companies, including General Electric, Dole Foods, Logitech and Toyota.

About Rohm and Haas Company:
Leading the way since 1909, Rohm and Haas is a global pioneer in the creation and development of innovative technologies and solutions for the specialty materials industry. The company’s technologies are found in a wide range of industries including: Building and Construction, Electronics and Electronic Devices, Household Goods and Personal Care, Packaging and Paper, Transportation, Pharmaceutical and Medical, Water, Food and Food Related, and Industrial Process. Innovative Rohm and Haas technologies and solutions help to improve life every day, around the world. Visit www.rohmhaas.com for more information.

Contact:
Richard Guy
Profit Point
(866) 347-1130
http://www.profitpt.com
