OUTPUT CONTROL:
RESPONSE TO RECENT HISTORICAL WORKLOAD AND ORDER STATUS INFORMATION AT THE MAKE-TO-ORDER SHOP FLOOR

Author: Ramon Beekman
Degree course: Master's Thesis TOM-DD Newcastle
Institutions: University of Groningen
              Newcastle University
Supervisors: Dr. M. J. Land
             Dr. G. Pang
Date: 7 December 2015
ABSTRACT

Many MTO organisations find it difficult to adjust their capacity to variation in demand. A key challenge is to adjust capacity in time to avoid the negative effects of a changing workload, such as increasing lead times. Most capacity planning methods determine capacity levels from forecasted demand; however, because customer needs vary, these methods are of limited applicability to MTO companies. The planning methods that are applicable give limited consideration to the workload that is in transit to the shop floor and to its stage, which according to the literature is the best indicator of future workload at the shop floor. Since capacity levels should be in balance with the inflow of orders, this research aims to identify aggregate planning opportunities to adjust capacity in response to recent historical workload and order status information. A case study has led to the identification that capacity needs can be derived from recent differences between the inflow and outflow of load between all stages. A capacity planning model is developed that determines a next-period (e.g. monthly) output level by applying this opportunity. Testing the model with case company data shows that more constant shop floor throughput times can be realised by adjusting capacity to inflow and outflow differences. Therewith, the predictability of order lead times to customers can be improved.
1. INTRODUCTION
Companies differentiate their products in order to be able to supply different customers and markets. This is especially the case for Make-to-Order (MTO) companies (Chen, et al., 2014).
In MTO companies, customer requirements are key to how products are produced and assembled (Yang & Fung, 2014). The need for flexibility to meet individual customer needs is higher than for Make-to-Stock (MTS) companies, because the latter can supply products that conform to customer requirements from stock. Hence many organisations, in particular MTO companies, experience temporary imbalances between available capacity and required capacity.
To overcome these imbalances, capacity planning focuses on determining the capacity levels of productive resources and their timing (Alp & Tan, 2008). A difficulty in capacity planning, however, is the uncertainty of future capacity utilisation in combination with the time needed to adjust capacity (Eickemeyer, et al., 2014). This makes it relevant to investigate the opportunities of capacity planning under uncertainty.
Ways to manage orders in an MTO environment can be classified into input and output control (Kingsman & Hendry, 2002). Input control is defined here as the control of the workflow into a queue, onto the shop floor or into the complete system. Output control is the control of jobs out of the queue, shop or system, and is achieved by adjusting capacity. Output control is more effective in reducing lead time (Kingsman & Hendry, 2002) and is the focus of this paper.
Capacity planners can face difficulties in adjusting capacity, due to capacity constraints such as maximum capacity levels or the time required to change capacity (Eickemeyer, et al., 2014).
Most of the literature discussing capacity planning is of limited applicability to MTO companies, while the applicable capacity planning methods have limitations of their own. Capacity planning is much discussed in the Enterprise Resource Planning and hierarchical (or aggregate) planning literature. The methods discussed there usually derive the required capacity levels from forecasted demand (Stevenson, et al., 2005; Yi-Feng, et al., 2013). However, Stevenson et al. (2005) note that many MTO companies are unable to forecast demand accurately, due to varying customer needs. They state that capacity planning in MTO environments can start from the stage of high probability of order acceptance, after which the planning should be updated at the later stages of order acceptance and release, as plans can change. Although capacity planning literature exists that is applicable to MTO companies, it gives limited consideration to responding to the workload (or 'load') that is in transit to the department (or workstation), depending on the order status. Since this information best indicates the future expected workload (Land, et al., 2015), while capacity levels should be in balance with the recent inflow of orders (Thürer, et al., 2014), no literature is found that discusses how to adjust capacity at an MTO company to recent developments in the load levels while considering the different stages.
The purpose of this paper is to identify aggregate planning opportunities to adjust capacity at the MTO shop floor in response to recent historical workload and order status information. The aggregate planning level is defined here as the planning level that focuses on capacity changes that are realisable in the medium, i.e. monthly, term. This research provides useful practical insights for managers of MTO companies who face capacity constraints and experience problems in adjusting capacity at aggregate planning levels in response to workload that is in transit to the shop floor. This is researched by means of a case study in which workload and order status data are analysed, leading to an identified planning opportunity. From this, a capacity planning model is developed that enables planning output levels on a monthly basis in response to the historical information.
This paper is organised as follows. First, the literature on the factors restricting capacity planning, its objective and related capacity planning methods is reviewed. This clarifies concepts and indicates gaps in the literature, which are used for the design of the research methodology. The paper continues with the case description, highlighting the capacity planning issues the case company faces. From this, a planning opportunity is identified and applied in a capacity planning model.
Lastly, the performance, robustness and implications of the developed model are presented.
2. RELATED RESEARCH
First, literature on restrictions to capacity flexibility will be discussed, as these affect capacity planning. Thereafter, the objective of capacity planning and the corresponding performance measurement will be the main focus. The section ends with a review of existing and related capacity planning methods, their limitations, and an identified gap.
2.1 CAPACITY RESTRICTIONS
The definition of capacity is taken from Alp and Tan (2008), who define it as the maximum amount of potential production in a representative time period when all actual productive resources are utilised. The productive resources are the facilities, machines, managers and employees that actually enable the company to produce goods (Slack, et al., 2010). Alp and Tan (2008) mention that capacity consists of nominal and contingent capacity. The nominal capacity level is determined by the size of the company's internal resources, such as the permanent workers and the leased or owned machines. With contingent capacity, the nominal capacity can be increased temporarily. This can be realised by, for instance, hiring temporary staff from employment agencies, producing in overtime or renting shop floor space (Alp & Tan, 2008).
Two types of constraints that limit the achievement of the required capacity levels are found in the literature. First, there is the maximum capacity level, since the production system is constrained by its available resources (Díaz-Madroñero, et al., 2014). Constraining resources (bottlenecks) determine the maximum throughput (Fernandes, et al., 2014), and thus the maximum potential output in a time period. The second constraint concerns the maximum achievable rate of capacity change within a time period. It results from the degree to which companies are able to adjust capacity in the short term by, for instance, subcontracting, hiring staff or working overtime (Çınar & Güllü, 2012). Moreover, capacity lead time can hinder the ability to change capacity (Ryan, 2004). This lead time is the period required to construct or receive the capacity from the moment of recruitment or investment. It can be long, meaning that decisions to expand capacity should be made well in advance.
2.2 CAPACITY PLANNING PERFORMANCE MEASUREMENT
Before focusing on existing capacity planning methods, the objective and the corresponding performance measurement of capacity planning will be defined. A main disturbing factor in capacity planning at MTO companies is variation in demand, which these companies are mostly unable to reduce (Thürer, et al., 2014). Despite this, companies have to focus on realising short and reliable lead times, since customers increasingly select their suppliers based on lead time performance (Thürer, et al., 2014). When companies fail to meet the promised lead times, lost market share and lost profit are the consequences (Kingsman, et al., 1993).
Lead time is the sum of the throughput times, consisting of waiting and processing time, and can be controlled by adjusting capacity levels (Thürer, et al., 2012; Thürer, et al., 2015). Thürer et al. (2014) mention that capacity levels should not be too high either, as this can empty the shop floor and ultimately cause undesired capacity overages. They therefore state that capacity levels should be in balance with the rate of order inflow. This implies that throughput time (or lead time) variation can be used as the performance measurement of capacity planning: lead times should not be long, while very short lead times should be avoided as well, since they can indicate capacity overages. In the Workload Control literature, realised throughput times are likewise used to measure performance (Soepenberg, et al., 2012).
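As an illustration of this performance measure, the sketch below computes per-order shop floor throughput times and their variation from release and completion dates. The records and field layout are invented for illustration, not case company data.

```python
from datetime import date
from statistics import mean, pstdev

# Hypothetical per-order records (release date, completion date) at a
# department; invented values, not case company data.
orders = [
    (date(2012, 1, 5), date(2012, 3, 1)),
    (date(2012, 1, 20), date(2012, 4, 2)),
    (date(2012, 2, 10), date(2012, 4, 20)),
]

# Shop floor throughput time per order, in days.
throughput_times = [(done - released).days for released, done in orders]

# The variation (here the population standard deviation) is the
# proposed performance measure; the mean describes the typical level.
avg = mean(throughput_times)
spread = pstdev(throughput_times)
```

A lower `spread` at a comparable `avg` would indicate more constant, and hence more predictable, throughput times.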
2.3 CAPACITY PLANNING METHODS
This section clarifies the limitations of existing planning methods and the research used to identify aggregate planning opportunities. Aggregate planning covers the medium-length planning term by means of the creation of a production plan that optimises company goals while satisfying future customer demand (Albey & Bilge, 2011). Martínez-Costa et al. (2014) also mention that it covers the medium term, typically from a month to a year, in which the number of staff and the amount of working time are considered. They state that capacity planning for a longer term only covers adjustments in the facilities and the equipment, but this is not within the scope of this research.
Most literature focusing on capacity planning, for instance on Enterprise Resource Planning, hierarchical planning and Sales and Operations Planning, as well as Selçuk et al. (2006), who feed the shop floor status back into the planning, bases capacity levels on forecasted demand (Ivert, et al., 2015; Stevenson, et al., 2005; Yi-Feng, et al., 2013). Forecasts, created by the marketing, sales or other responsible departments, help to manage demand uncertainty (Ivert, et al., 2015). These forecasts are based on the expected demand in the coming planning period. However, in MTO environments demand cannot be forecasted accurately, due to a high level of customised products and varying customer needs (Stevenson, et al., 2005). Therefore, capacity planning at MTO companies can best be initiated from the stage of high probability of order acceptance, which is not earlier than the start of the tendering process, and needs to be updated at the acceptance and release stages (Stevenson, et al., 2005). These updates are required because the MTO shop floor is subject to a central role of the customer, even after the release stage (Thürer, et al., 2014). The following planning methods are applicable to MTO companies, since they plan capacity based on actual information instead of forecasted demand.
Most capacity planning methods that are applicable to MTO companies respond to the actual load level at the shop floor. For instance, Ryan (2004) developed a planning method focusing on both the size and the timing of the capacity adjustment in response to this load level. The underlying objective is to weigh up the pros and cons of acquiring capacity through many small adjustments (i.e. continuous adjustments) versus a few large adjustments (i.e. discrete adjustments) (Van Mieghem, 2003). Ryan (2004) first determines the optimal timing of capacity changes from the exceedance of a predetermined capacity utilisation level (i.e. a threshold level), after which the optimal size is determined under the timing policy. The threshold level results from the amount of risk the organisation wants to take: the cost of losing customers should be prevented, while capacity overages in the future should be avoided as well. Marathe and Ryan (2009) added throughput time and service level constraints to this method (Chou, et al., 2014).
Thus, capacity adjustments are based on the actual shop floor load level and a load threshold level. However, this can be a misleading trigger, as it does not consider the load before the stage of release to the shop floor, which represents future load.
Research focusing on capacity planning at early stages (i.e. before work is released to the shop floor) includes the work of Rafiei and Rabbani (2012), who focus on capacity planning at the stage of order acceptance. They advise first checking whether orders can be released on time, then checking the capacity after the release stage, evaluating an increase if necessary, and assessing the profitability of the required capacity increase. This should lead to order acceptance or rejection. However, the status of the shop floor in terms of workload and capacity can change over time due to changing circumstances, such as breakdowns, unplanned maintenance, and random fluctuations in rework and output (Çınar & Güllü, 2012), or, as Muchiri and Pintelon (2008) mention, the seven major losses (i.e. major and minor stoppages due to failure or production defects, cutting blade loss due to wear, changeovers, start-up and shut-down, speed losses and quality losses). The limitation of this method is that the required capacity levels are not updated at later stages.
Research that considers the load from the early stages (e.g. order acceptance) to the later stages (e.g. order release at the downstream workstations) includes the work of Land et al. (2015). They stress the importance of capacity response to the workload that is in transit to the department (or workstation), by aggregating the load levels over all stages. These so-called aggregate loads are the earliest possible indication of capacity utilisation at a certain department (Oosterman, et al., 2000). Response to the aggregate load is achieved by setting aggregate load threshold levels and corresponding capacity adjustment rates. Although the aggregate load is a better indication of future load than the shop floor load, it still does not contain information about the individual stages and the recent developments in the load. Yet the literature analysed in this section suggests that developments in the load at later stages, for instance an increasing load at workstation Y located more downstream towards workstation Z, can be a better indication of future release levels (e.g. at workstation Z) than the load at the stage just after order acceptance. No research is found that plans capacity in response to these recent historical developments.
In conclusion, aggregate planning covers the medium-length (weeks to months) planning term, based on the time it takes to acquire capacity after a change. Capacity is restricted by the maximum output in a time period and by the ability to adjust capacity in the short term. MTO capacity planning models are based on actual information, as these methods adjust capacity either in response to the shop floor load, only at the stage of order acceptance, or through aggregation of all load levels without considering the individual stages. Since no capacity planning methods are found that base capacity levels on recent developments in the load between the different stages, while this can be useful to better predict future load levels, a relevant gap in the literature is identified.
3. METHODOLOGY
The methodology used in this paper aims to lay the foundations for identifying opportunities to adjust capacity in response to recent historical workload and order status information. A theory-building case study was performed, as this allows concepts, the domain, relations and predictors to be defined (Karlsson, 2009). Through a single case study, an in-depth analysis took place, which enabled the relations and predictors of the concepts and domain defined by the literature to be researched. The unit of analysis of this research is the MTO company facing capacity planning difficulties in terms of its response to the workload that is in transit to the shop floor.
3.1 CASE SELECTION
A single case study was done, as it allows the in-depth analysis required to study the complexity of capacity planning in detail. The case was selected on the requirement that the company should face relatively high variation in demand, causing difficulties in adjusting capacity. It was also required to select a case company that is able to adjust bottleneck capacity on the medium (weeks to months) term. Lastly, it was preferable to select a case company with some degree of fixed routing in the production process. This allowed production stages to be defined, which was needed to identify planning opportunities.
The company that was selected is a medium-sized production company that had experienced difficulties for several years in adjusting capacity to fulfil demand. The production and supporting departments carry out project activities in the technical automation, electrical engineering and mechanical engineering sectors. The production department consists of four sub-departments and employs around one hundred employees. More information about the case company can be found in the case description.
3.2 DATA ANALYSIS
Data analysis was done by triangulation, meaning that different types of data were used to increase the research validity through comparison of the data types. Qualitative and quantitative data were used with the intention of drawing general conclusions from the case, resulting in knowledge that is applicable to similar cases.
Qualitative data were initially used to validate the case selection. For instance, the difficulties in adjusting capacity to the fluctuating demand needed to be recognisable to management. Also, managers and an expert were asked to give their opinion about the correctness of the data interpretation and the research outcome. Only the validated data were used for further analysis.
Quantitative data were used to identify planning opportunities and for validation. Load levels over a period of five years were analysed to determine the current throughput time performance. From this, a baseline measurement was set, indicating the performance if management had responded adequately to the available information. The data were aggregated over months, decreasing the sensitivity of the analysis to inaccurate data. For the data analysis, throughput diagrams were used, as Soepenberg, Land and Gaalman (2012) find that throughput diagrams give insight into throughput time performance over time. Thereafter, a planning opportunity was identified by analysing the throughput diagrams. This planning opportunity was applied in a capacity planning model that determines a next-period (e.g. monthly) output level. By using case company data as input for the capacity planning model, together with an Excel solver technique, the model and the identified planning opportunity could be validated.
3.3 DATA COLLECTION
Interviews with planners and production managers were conducted for validation purposes.
First of all, questions were asked to define the degree of capacity flexibility at each department, which helped to focus the analysis. As mentioned, the quantitative findings were also validated by asking questions concerning the correctness of the presented data analysis. Furthermore, the research outcome was validated in an interview with an expert, a specialist in the field of optimisation of discrete planning and production systems. This allowed the strengths and limitations of this research to be defined.
Quantitative data were obtained from the Enterprise Resource Planning system of the company.
Per work order, the data included the moment of acceptance and the moments of release and completion at all departments. These were the input for calculating the load levels between the different stages over time and the throughput time performance.
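As a rough illustration of how load levels between stages can be derived from such timestamps, the sketch below computes the load (in hours) in transit between two stages on a given date. The records and field names are illustrative assumptions, not the company's ERP schema.

```python
from datetime import date

# Hypothetical work orders with assembly hours and stage timestamps;
# the field names are invented, not the company's ERP schema.
orders = [
    {"hours": 120, "accepted": date(2012, 1, 3),
     "released": date(2012, 2, 1), "completed": date(2012, 3, 15)},
    {"hours": 80, "accepted": date(2012, 1, 20),
     "released": date(2012, 2, 20), "completed": date(2012, 4, 1)},
]

def load_between(orders, entered, left, at):
    """Load (in hours) that has passed stage `entered` but not yet
    stage `left` on date `at`, i.e. the load in transit between the
    two stages."""
    return sum(o["hours"] for o in orders
               if o[entered] <= at and o[left] > at)

# Load accepted but not yet released on 1 February 2012.
pool = load_between(orders, "accepted", "released", date(2012, 2, 1))
```

Evaluating this function at month ends, for each pair of adjacent stages, yields exactly the load-between-stages series the analysis requires.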
4. CASE DESCRIPTION
The production process of the case company is divided into four phases, each hosted by one department (A: mechanics workshop, B: painting, C: assembly and D: testing). Products are always processed in the sequence A, B, C, D. Each order requires its own activities within the departments. Demand also varies over time, resulting in fluctuating output and hence variation in the effective man-hours at each department and, where applicable, in machine utilisation.
The departments have different characteristics. The mechanics workshop consists of different machines, such as laser cutting, folding and welding machines, all needed to construct the product body. This department is flexible in terms of labour capacity, which means that capacity can be doubled by means of temporary staff (contingent capacity), depending on the availability of machines and the skills of the temporary staff. The painting department only processes the products produced by the mechanics workshop. This department is inflexible in terms of staff capacity due to the required skills and the facilities (e.g. machines and shop floor space). The assembly department receives products from the painting department and from external suppliers. It can double its capacity by hiring temporary staff (contingent capacity), obtainable within several days. Finally, the test department checks whether products conform to high quality standards. This department is inflexible due to the required human skills and the test facilities (in particular shop floor space). Overall, the assembly department has the best opportunities to adjust capacity, since it can adjust bottleneck capacity. Moreover, the other departments, as will be explained later, are able to generate output levels that correspond to changes in the release rates. Since this research focuses on identifying capacity planning opportunities, the assembly department is the main focus of the case analysis.
Figure 1 shows the throughput diagram from the perspective of the assembly department for the year 2012. When an order arrives at the company, the cumulative level of accepted orders increases by the number of assembly hours of the work order. When an order is released to the mechanics workshop, the cumulative level of orders released to the mechanics workshop and painting department increases by the number of assembly hours of the work order. The cumulative levels at the stages of order completion at the mechanics workshop and painting department, order release to the assembly department, and order completion at the assembly department are derived in the same way.
Throughput time variation has been defined as the key measurement of capacity planning and can be derived from the different cumulative order curves. The shop floor throughput times are of particular interest, as these can be controlled by adjusting capacity. The throughput time at the assembly department is measured by the horizontal distance between the brown curve in Figure 1, visualising the cumulatively released orders to the assembly department, and the red curve, visualising the cumulatively completed orders at the assembly department, specified for the moment of order release to the assembly department. It can be observed that, from 1-1-2012 until 1-7-2012, the rate of orders released to the assembly department is higher than the rate of orders completed there. This results in increasing shop floor throughput times from moment I (see Figure 1) until the release level at moment II. Throughput times decrease from moment II onwards, as the upper right horizontal arrow indicates, because of a higher rate of completed orders at the assembly department compared to the rate of released orders. Lastly, it can be observed from Figure 1 that the completion levels at the mechanics workshop and painting department correspond better to the changing release rates to those departments. This justifies focusing on identifying planning opportunities for the assembly department.
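The construction of such cumulative order curves can be sketched as follows; the monthly figures are invented for illustration. The vertical distance between the curves gives the shop floor load, while the horizontal distance read from the diagram gives the throughput time.

```python
# Invented monthly release and completion quantities (hours) at a
# department; not case company data.
released = [500, 600, 700, 650]
completed = [500, 500, 550, 600]

def cumulative(xs):
    """Running totals, i.e. the points of a cumulative order curve."""
    total, out = 0, []
    for x in xs:
        total += x
        out.append(total)
    return out

rel_c = cumulative(released)
com_c = cumulative(completed)

# The vertical distance between the two curves is the shop floor load
# at the end of each month; releases outpacing completions here make
# the load, and hence the throughput times, grow.
load = [r - c for r, c in zip(rel_c, com_c)]
```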
The realised throughput times vary between around 40 and 120 days from the start of 2010 until the end of 2014 (see Figure 2). Managers and planners of the case company mention that they sometimes fail to adjust capacity on time, which can explain this variation. It is therefore interesting to consider the effects of capacity planning based on the actual orders released to the assembly department, as these are known, and to investigate planning opportunities to adjust capacity based on recent developments in the load by using historical information, which is the purpose of this research.
Figure 1 Cumulative order levels from the perspective of the assembly department, with I and II defined as the start and end moments of increasing throughput times.
Figure 2 Throughput times (in days) at the assembly department from the start of 2010 until the end of 2014, with the discussed period highlighted.
5. CAPACITY PLANNING OPPORTUNITIES
The case highlights the relevance of investigating whether throughput time variation can be reduced. This is researched in two steps (see Figure 3). First, the impact of responding to the latest shop floor status is investigated, since capacity levels can be derived from the actual release levels. This sets a baseline measurement, which is then used to improve the performance by identifying planning opportunities in response to recent historical workload and order status information.
Figure 3 Research steps to identify capacity planning opportunities.
5.1 RESPONSE TO LATEST SHOP FLOOR STATUS
From the literature it is derived that throughput times should ideally be kept constant. Future capacity levels should then be adjusted in such a way that the cumulative level of completed orders equals the cumulative level of released orders after a fixed interval. For the assembly department, a constant throughput time of two months can be set as an optimum, based on the average realised throughput times (see Figure 2). The required output levels are then known for the coming two months. However, it is difficult to adjust capacity in such a way that throughput times remain constant over time, due to capacity constraints. The first constraint that can be considered is that the number of completed hours (i.e. the output) cannot exceed a maximum value 𝛾 (i.e. the maximum capacity in hours). Secondly, companies can be limited in their capability to adjust capacity over time. For instance, if well-trained staff is required, the company has to train staff, which has an impact on the realisable output adjustments over time. Hence, a second constraint that can be considered is that output levels cannot exceed or fall below a specific rate of change, 𝛼 (in %), compared to the output of the previous month.
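With O_t denoting the planned output in month t (our notation, not the paper's), the two constraints described above can be written compactly as:

```latex
O_t \le \gamma, \qquad
(1 - \alpha)\, O_{t-1} \;\le\; O_t \;\le\; (1 + \alpha)\, O_{t-1}
```

The first inequality bounds the output by the maximum capacity; the second restricts the relative output change between consecutive months.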
To infer the effect of the capacity constraints on the throughput times when responding to release levels, the following constraint values are set. For 𝛾, the maximum output realised at the assembly department during the period from the start of 2010 until the end of 2014 is taken, which is 8430 hours per month. The other constraint, 𝛼, is set to 33%, a conservative value compared to the output change realised at 1-7-2012 (see Figure 1). The black dotted output curve in Figure 4 visualises the cumulatively completed orders at the assembly department if the planners had responded to the release levels under these capacity constraints. The purple dashed curve visualises the cumulatively completed orders at the assembly department if throughput times had been kept constant. It can be seen that the black dotted curve does not increase at the same rate as the purple dashed curve from 1-2-2012, because the capacity constraints restrict the required output adjustments. This results in varying throughput times (see the black dotted curve in Figure 5), as the horizontal distance between the cumulative release curve and the cumulative completion curve varies (see Figure 4). The cumulative output levels and the corresponding performance over five years under these capacity constraints are set as the baseline measurement. Lastly, it can be concluded that by responding to the cumulative release levels, planners could have realised more constant throughput times.
In conclusion, it is found that throughput times can be controlled by responding to the shop floor release levels. However, the capacity constraints cause throughput time variation when the release rates change by more than the achievable capacity changes. It is therefore interesting to identify whether throughput time performance can be improved by responding to developments in the load at earlier stages.
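A minimal sketch of this baseline policy, assuming the constraint values above (𝛾 = 8430 hours, 𝛼 = 33%) and invented monthly output requirements, could look as follows:

```python
# Sketch of the baseline policy: each month the planner aims for the
# output required to keep the throughput time constant, clipped by
# gamma (maximum output) and alpha (maximum relative change per
# month). The required outputs below are invented values.

GAMMA = 8430   # maximum output per month (hours)
ALPHA = 0.33   # maximum relative output change per month

def plan_output(required, prev_output):
    """Clip the required output to what the constraints allow."""
    lo = prev_output * (1 - ALPHA)
    hi = min(prev_output * (1 + ALPHA), GAMMA)
    return max(lo, min(required, hi))

required = [5000, 8000, 9000, 6000]   # hours per month
output, prev = [], 5000
for r in required:
    prev = plan_output(r, prev)
    output.append(prev)
# The plan lags behind the required jump to 8000 and 9000 hours,
# which is exactly what produces throughput time variation.
```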
Figure 4 Cumulative order levels of released orders to the assembly department, and of completed orders at the assembly department in the current situation, if throughput times had been kept constant, and if the release levels had been responded to, including capacity constraints.
Figure 5 Throughput times (in days) at the assembly department in the current situation, if throughput times had been kept constant, and if the release levels had been responded to, including capacity constraints.
5.2 RESPONSE TO RECENT HISTORICAL WORKLOAD AND ORDER STATUS INFORMATION
Capacity planning in response to the shop floor release levels can cause throughput time variation because of capacity constraints. As this variation needs to be minimised, it is investigated whether capacity can be adjusted in response to recent developments in the load levels between different stages (e.g. order acceptance, release and completion). The conjecture is that differences between the inflow and outflow of load can indicate a need to change capacity: if more orders arrive than are completely processed, this can be a reason to increase capacity, while if more orders are processed than arrive, this can be a reason to decrease capacity.
A throughput diagram is used to identify the inflow and outflow differences. For instance, if fewer orders are accepted (i.e. inflow) than are released (i.e. outflow) for two months, the cumulative level of accepted orders increases less than the cumulative level of released orders. Inflow and outflow differences can therefore be identified by a 'virtual break-even-point': a value based on the latest cumulative order levels and the recent historical (e.g. one month) order rates. The virtual break-even-point indicates the number of months, counted from the month of the taken cumulative order levels, after which the two cumulative order curves would intersect under the assumption that the rates do not change over time. If fewer orders were accepted than released in, for instance, the last month, the virtual break-even-point will lie several months in the future, which can indicate a need to decrease capacity. If more orders were accepted than released in the last month, the curves converge in that month; this generates a negative virtual break-even-point, which can indicate a need to increase capacity. Moreover, the virtual break-even-point can indicate the urgency of a capacity change. For instance, a value further away from zero, caused by smaller differences between the levels and rates, can indicate low urgency to change capacity.
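Under the assumption of constant rates, the virtual break-even-point follows from equating the two extrapolated cumulative curves, as the sketch below illustrates. The function and the numbers are ours, introduced for illustration only.

```python
def break_even(level_up, level_down, rate_up, rate_down):
    """Virtual break-even-point in months: when the upstream and
    downstream cumulative curves would intersect if last month's
    rates persisted. level_* are the latest cumulative levels (in
    hours); rate_* are last month's flows (hours/month). The value is
    negative when the curves are converging, i.e. when inflow exceeds
    outflow. (Equal rates mean no intersection; not handled here.)"""
    return (level_up - level_down) / (rate_down - rate_up)

# Outflow exceeded inflow last month: the intersection lies well in
# the future, suggesting low urgency to reduce capacity.
far = break_even(10_000, 9_000, 480, 520)    # 25.0 months ahead

# Inflow exceeded outflow: a negative value, a signal to add capacity.
near = break_even(10_000, 9_000, 600, 500)   # -10.0
```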
This can be illustrated with two examples. In the period just before moment 'X' (see Figure 6), the rate of acceptance (curve A) is slightly higher than the rate of release to the mechanics workshop and painting department (curve B). As the rate difference is low and the absolute difference between the cumulative order levels is relatively high, this generates an intersection point that lies far away in time, based on the rates of the two curves during the last month. For curves B and C in the same period just before point 'X', the virtual break-even-point is nearer to point 'X', and this can trigger a need to increase capacity. A capacity planning model can be developed which responds to these virtual break-even-points.
Figure 6 Cumulative order levels, with the completed orders at the assembly department if the release levels had been responded to, including capacity constraints.
5.2.1 Symbols
Some notation is required to explain the model that determines the output response in case of increasing or decreasing workload.
The following constants are used to represent the current input/output at the different order stages:
A_t: Orders accepted in month t (in hours),
B_t: Orders released to the mechanics workshop & painting department in month t (in hours),
C_t: Orders completed at the mechanics workshop & painting department in month t (in hours),
D_t: Orders released to the assembly department in month t (in hours),
H_t: Orders completed at the assembly department in month t (in hours),
A_t^c, B_t^c, C_t^c, D_t^c, H_t^c: Cumulative level at the specified order stage at the end of month t (in hours).
These values are used to calculate:
I_t^AB, I_t^BC, I_t^CD, I_t^DH: Virtual break-even-point of the two specified order curves (in months after t−1).
The following decision variables and functions of decision variables are used:
L^AB, L^BC, L^CD, L^DH: Threshold level of load change between the two specified order stages that triggers a capacity adjustment (decision variable),
O^AB, O^BC, O^CD, O^DH: Output adjustment factor applied when the corresponding L is exceeded (decision variable),
Z_t^AB, Z_t^BC, Z_t^CD, Z_t^DH: Response value for each individual pair of order curves, used to adjust output in month t,
Z_t: Final response value to adjust output in month t,
R_t: Required output at the end of month t (in hours),
V_t: Virtual output at the end of month t (in hours),
W_t: Planned work in process at the assembly department at the end of month t (in hours),
T_t: Planned throughput time at the assembly department measured from month t (in months).
Z is a non-linear function of L and O, R is a linear function of Z, V is a non-linear function of R, W is a non-linear function of V, and T is a non-linear function of W and H.
The index t in the objective function runs from m to n:
m: First month in the period used for optimisation,
n: Last month in the period used for optimisation.
5.2.2 Parameters
A few parameters are set:
Maximum capacity level (𝛾). This value is based on the realised output over the years 2010 until 2015, and is set to 8430 hours per month (see section 5.1),
Maximum change in capacity allowed relative to the previous month (𝛼). This value is set to 33% (see section 5.1),
Number of months aggregated to determine the current rate of input and output (𝛽). It is interesting to investigate whether more historical periods (a higher 𝛽) lead to fewer capacity adjustments and decreased throughput time variation,
Throughput time target. This target is set to 2 months (see section 5.1).
5.2.3 Model design
The virtual break-even-points are applied in a model which plans the output of the assembly department based on inflow and outflow differences in the load between different stages. This section starts with the computations of the break-even-points and continues with the subsequent response and its computations. The objective function is to minimise the variance of the resulting throughput times.
The virtual break-even-point is calculated first. Notice that:
The load between order acceptance and release to the mechanics workshop & painting department at the end of month t−1 is:

    A_{t−1}^c − B_{t−1}^c.

The change in this load per period during the previous 𝛽 periods is:

    Σ_{i=t−𝛽}^{t−1} B_i / 𝛽 − Σ_{i=t−𝛽}^{t−1} A_i / 𝛽.

This implies that the virtual break-even-point between the A^c and B^c curves lies I_t^AB periods after t−1, with:

    I_t^AB = (A_{t−1}^c − B_{t−1}^c) / ( Σ_{i=t−𝛽}^{t−1} B_i / 𝛽 − Σ_{i=t−𝛽}^{t−1} A_i / 𝛽 ),   ∀t.
Now if 1/I_t^AB exceeds the decision variable L^AB, or falls below −L^AB, in a month t, then the response factor O^AB is applied in that month; otherwise the response value is 1, i.e.:

    Z_t^AB = O^AB      if 1/I_t^AB > L^AB,
             1/O^AB    if 1/I_t^AB < −L^AB,
             1         otherwise,          ∀t,

with O^AB the decision variable indicating the strength of the response to changes in the load between order acceptance and release to the mechanics workshop and painting department.
Response values for the other pairs of curves, Z_t^BC, Z_t^CD and Z_t^DH, are calculated in the same manner. The final response is the average of all four response values, i.e.:

    Z_t = (Z_t^AB + Z_t^BC + Z_t^CD + Z_t^DH) / 4,   ∀t.
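The piecewise response and the averaging step can be sketched in Python (an illustrative fragment assuming a non-zero break-even-point; the function names are chosen here, not taken from the model):

```python
def response_value(bep, threshold, factor):
    """Response value Z for one pair of curves, following the piecewise
    definition: damp output (factor <= 1) when 1/I exceeds the threshold,
    boost output (1/factor >= 1) when 1/I falls below minus the threshold.

    bep: virtual break-even-point I (months, assumed non-zero).
    threshold: decision variable L.  factor: decision variable O.
    """
    inv = 1.0 / bep
    if inv > threshold:
        return factor
    if inv < -threshold:
        return 1.0 / factor
    return 1.0

def final_response(z_ab, z_bc, z_cd, z_dh):
    """Final response Z_t: the average of the four pair-wise responses."""
    return (z_ab + z_bc + z_cd + z_dh) / 4.0

# Example: one strongly negative break-even-point triggers a boost.
z = final_response(response_value(-0.5, 0.5, 0.8),  # 1/I = -2 < -0.5 -> 1.25
                   response_value(20.0, 0.5, 0.8),  # 1/I = 0.05     -> 1
                   response_value(30.0, 0.5, 0.8),  # 1/I ~ 0.033    -> 1
                   response_value(25.0, 0.5, 0.8))  # 1/I = 0.04     -> 1
```

Because the factor O lies between 0,67 and 1, a single triggered curve pair can pull the final response above or below 1 only moderately, which keeps the output adjustment gradual.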
The response value adjusts the initial output level of the baseline measurement, which is the difference between the cumulative level of release to the assembly department two months earlier (two months being the throughput time target) and the cumulative output level of the previous month. Hence, the required output in month t is:
    R_t = Z_t (D_{t−2}^c − H_{t−1}^c),   ∀t.
Now, capacity constraints can be included. First, a virtual output V_t is generated, which is the output under the constraint of a maximum capacity adjustment rate. If the required output exceeds the previous output by more than the rate α, then V_t is the previous output increased by the factor α. If the relative output change falls below −α, then V_t is the previous output decreased by the factor α. Otherwise, the required output is taken for V_t, i.e.:
    V_t = H_{t−1}(1 − α)   if (R_t − H_{t−1}) / H_{t−1} < −α,
          H_{t−1}(1 + α)   if (R_t − H_{t−1}) / H_{t−1} > α,
          R_t              otherwise,                        ∀t.
Second, the maximum capacity level constraint (𝛾) is included. The planned output level may not exceed the maximum capacity level, therefore:

    H_t = min{ V_t, 𝛾 },   ∀t.
Finally, the cumulative output level at the end of month t is calculated by:

    H_t^c = H_{t−1}^c + H_t,   ∀t.
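One monthly planning step, covering the required output, the adjustment-rate constraint and the maximum capacity level, can be sketched as follows (illustrative Python; the parameter defaults follow section 5.2.2, while the function and argument names are chosen here):

```python
def plan_output(Z_t, D_cum_lag2, H_cum_prev, H_prev, alpha=0.33, gamma=8430.0):
    """Planned output H_t for one month under both capacity constraints.

    Z_t: final response value.
    D_cum_lag2: cumulative release to assembly two months back (D^c_{t-2}).
    H_cum_prev: cumulative output of the previous month (H^c_{t-1}).
    H_prev: output level of the previous month (H_{t-1}).
    """
    R = Z_t * (D_cum_lag2 - H_cum_prev)  # required output R_t
    change = (R - H_prev) / H_prev       # relative output change
    if change > alpha:                   # cap the increase at +alpha
        V = H_prev * (1 + alpha)
    elif change < -alpha:                # cap the decrease at -alpha
        V = H_prev * (1 - alpha)
    else:
        V = R                            # virtual output V_t
    return min(V, gamma)                 # maximum capacity level gamma

# Moderate change: the required output is planned as is.
print(plan_output(1.0, D_cum_lag2=10000.0, H_cum_prev=4000.0, H_prev=5000.0))
# -> 6000.0
```

The cumulative output level then follows by adding the returned value to the previous cumulative level.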
5.2.4 Decision variables
As can be derived from section 5.2.3, eight decision variables are required to respond to the virtual break-even-points. The threshold L^AB should be determined, as well as the output adjustment factor O^AB, in order to respond to the virtual break-even-points derived from curves A and B; the same holds for the three other pairs of curves BC, CD and DH.
5.2.5 Objective function
By choosing appropriate values for the decision variables, it is attempted to generate output levels that lead to minimal throughput time variation. The decision variables are optimised by minimising the objective function, which is the variance of the planned throughput times (T_t) in the period from month m to n. To calculate the planned throughput time, first the WIP level at the assembly department is determined as the load between release to the assembly department and completion at the assembly department at the end of month t:

    W_t = D_t^c − H_t^c,   ∀t.
Notice that if the WIP level equals the output of the next two months, the throughput times stay at the planned level of 2 months. Put differently, T_t can be calculated by:

    T_t = W_t / ( (H_{t+1} + H_{t+2}) / 2 ),   ∀t.
Finally, the variation in T_t is minimised from month m to n by the objective function:

    min  Σ_{t=m}^{n} (T_t)² − ( Σ_{t=m}^{n} T_t / (n − m) )².
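The planned throughput times and a variance-style objective can be sketched as follows (illustrative Python; the objective below equals the number of periods times the variance of T_t, which is equivalent for minimisation):

```python
def planned_throughput_times(W, H):
    """T_t = W_t / ((H_{t+1} + H_{t+2}) / 2) for every month t that has
    two months of planned output lookahead.  W: planned WIP levels,
    H: planned output levels, aligned by month."""
    return [W[t] / ((H[t + 1] + H[t + 2]) / 2.0) for t in range(len(H) - 2)]

def objective(T):
    """Sum of squared throughput times minus n times the squared mean:
    n times the variance of T, hence equivalent for minimisation."""
    n = len(T)
    mean = sum(T) / n
    return sum(x * x for x in T) - n * mean * mean

W = [100.0, 110.0, 90.0]
H = [50.0, 50.0, 50.0, 60.0, 40.0]
T = planned_throughput_times(W, H)   # [2.0, 2.0, 1.8]
```

Perfectly constant throughput times give an objective value of zero; any variation makes it positive.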
5.2.6 Modelling constraints
The following constraints were added to the original constraints in the optimisation model in order to reduce the Excel solving time:

    0 ≤ L^AB, L^BC, L^CD, L^DH ≤ 10,
    0,67 ≤ O^AB, O^BC, O^CD, O^DH ≤ 1.
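The optimisation itself was performed with the Excel solver. The same search can be sketched as a coarse grid search over one (L, O) pair within the modelling constraints (illustrative Python; the toy objective is for demonstration only and is not the model's objective function):

```python
import itertools

def grid_search(evaluate, L_grid, O_grid):
    """Return the best (score, L, O) over a grid of candidate values.
    The grids should respect 0 <= L <= 10 and 0.67 <= O <= 1.
    evaluate(L, O) should return the objective value to minimise."""
    best = None
    for L, O in itertools.product(L_grid, O_grid):
        score = evaluate(L, O)
        if best is None or score < best[0]:
            best = (score, L, O)
    return best

# Toy objective with its minimum at L=2, O=0.8, for demonstration only:
score, L, O = grid_search(lambda L, O: (L - 2) ** 2 + (O - 0.8) ** 2,
                          L_grid=[0, 2, 4, 6, 8, 10],
                          O_grid=[0.67, 0.8, 0.9, 1.0])
# -> score 0.0 at L=2, O=0.8
```

In the model there are eight such decision variables (four L's and four O's), so the actual search space is the product of eight such grids; a solver therefore replaces exhaustive search in practice.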
6. MODEL RESULTS
This section discusses the performance of the developed model (see section 5.2). First, the model's impact on the throughput time performance is discussed. Thereafter, the focus is on validation of the model by performing robustness tests. The section ends with a discussion of the model implications.
6.1 IMPACT ON THROUGHPUT TIME PERFORMANCE
The model results are generated using five years of historical workload and order status data of the case company. The impact of the model can best be explained using the 𝛽=1 parameter and decision variables determined by the objective function from (𝑚) 1-1-2011 to (𝑛) 1-12-2012. The performance of the capacity planning model is assessed by applying these decision variables in the response to the remaining three years (2010, 2013 and 2014) of workload in the data set. These three years simulate future workload through the company, which is useful to validate the model.
The model performance can be compared with the baseline measurement set in section 5.1, since the model performs under the same capacity constraints (𝛾 and 𝛼). The impact can best be explained by analysing the green dashed curve in Figure 7, which visualises the cumulatively completed order levels at the assembly department generated by the model. This curve can be compared with the black dotted curve, which represents the cumulatively completed order levels at the assembly department if the release levels had been responded to (the baseline measurement). The curve which lies horizontally closer to the dashed purple curve over time indicates lower throughput time variation, and thus performs better. The consequence of the model can be seen from 1-12-2013 (see Figure 7). From 1-12-2013 to 1-4-2014, the planned output of the model is adjusted with a positive total response value (Z_t), leading to an increased output in the coming four months relative to the output level of the baseline measurement. In other words, by increasing output from 1-12-2013, the throughput time increase caused by the increased release rate to the assembly department from one month later is tempered. Higher inflow rates compared to the outflow rates during one period back in time trigger this positive response. From 1-3-2014, the inflow rates are lower than the outflow rates, resulting in a negative total response value (Z_t) from one month later. Overall, the green dashed curve fits closer to the purple curve than the black dotted curve does, indicating lower throughput time variation when applying the developed capacity planning model.
Figure 7 Cumulative order levels of completed orders if throughput times had been kept constant, if the release levels had been responded to (including capacity constraints), and if the inflow/outflow differences had been responded to (including capacity constraints) (with 𝛽=1, 𝑚=1-1-2011 and 𝑛=1-12-2012).
Figure 8 Throughput times (in days) at the assembly department if throughput times had been kept constant, if the release levels had been responded to (including capacity constraints), and if the inflow/outflow differences had been responded to (including capacity constraints) (with 𝛽=1, 𝑚=1-1-2011 and 𝑛=1-12-2012).
The throughput times over five years at the assembly department, if the inflow and outflow differences had been responded to, are visualised by the green dashed curve in Figure 8. Compared to the throughput times at the assembly department if the release levels had been responded to, visualised by the black dotted curve in Figure 8 (i.e. the baseline measurement), the green dashed curve shows lower variation. With a standard deviation of 7,6 days, the performance of the capacity planning model using the determined decision variables is stronger than the baseline measurement, which has a standard deviation of 9,8 days.
6.2 ROBUSTNESS TEST RESULTS
A robustness test is performed to assess the validity of the developed model. The test is set up such that multiple scenarios are generated from the five years of workload data. Instead of optimising the decision variables over the years 2011 and 2012, as is done in section 6.1, the decision variables are optimised for other periods. Each time, throughput time variance is minimised over a different combination of two years. Put differently, optimum decision variables are determined first for the years 2010 and 2011 (indicated by X's in Table 1), then for 2011 and 2012, and so on, for instance for 2010 and 2014.
Throughput time performance of the capacity planning model is measured over the years 2010-2014 by applying the determined decision variables. Besides the 𝛽=1 parameter, test results are also generated for a 𝛽=3 parameter.
The model test results with the 𝛽=1 parameter show better performance in terms of the average standard deviation (stdev) of the throughput times (8,2 days) compared to the baseline measurement (9,8 days) (see Table 1). Throughput time exceeded 90 days, on average, 1,3 months in 5 years, which is lower than in the baseline measurement (2 months). Furthermore, throughput time fell below 47 days, on average, 1 month in 5 years, which is also lower than in the baseline measurement (2 months). Moreover, it never fell below the excessive level of 20 days, indicating a low probability of an empty shop floor. All this indicates an improved performance when responding to the inflow and outflow differences using the most recent historical data, compared to responding to the release levels.
Table 1 Model robustness test results
At a 𝛽=3, the model performance in terms of the standard deviation of the throughput times (9,2 days) is weaker than with the 𝛽=1 parameter (8,2 days), yet still better than the baseline measurement (9,8 days). Throughput time exceeded 90 days, on average, 1,3 months in 5 years, which indicates a better performance than the baseline measurement (2 months). Although throughput time never fell below 20 days, indicating a low probability of an empty shop floor, it fell below 47 days, on average, 2,1 months in 5 years, which is higher than in the baseline measurement (2 months). An explanation for the weaker performance compared to the 𝛽=1 parameter is the relatively slow response to the inflow and outflow differences, since the virtual break-even-points are calculated from more historical periods of data. More stability in the output levels over the months is not found.
Throughput time performance in years 2010-2014:

𝜷 | Optimised period (two years marked X) | Stdev (in days) | #months >90 days | #months <47 days | #months <20 days
- | Response to release (baseline)        | 9,8             | 2                | 2                | 0
1 | X X                                   | 7,9             | 1                | 0                | 0
1 | X X                                   | 7,6             | 1                | 1                | 0
1 | X X                                   | 9,5             | 3                | 0                | 0
1 | X X                                   | 6,9             | 0                | 2                | 0
1 | X X                                   | 7,2             | 0                | 1                | 0
1 | X X                                   | 9,6             | 3                | 0                | 0
1 | X X                                   | 6,7             | 0                | 2                | 0
1 | X X                                   | 9,6             | 3                | 0                | 0
1 | X X                                   | 8,5             | 1                | 2                | 0
1 | X X                                   | 8,5             | 1                | 2                | 0
1 | Average                               | 8,2             | 1,3              | 1                | 0
3 | X X                                   | 8,3             | 0                | 3                | 0
3 | X X                                   | 8,5             | 1                | 1                | 0
3 | X X                                   | 9,1             | 3                | 0                | 0
3 | X X                                   | 7,1             | 0                | 2                | 0
3 | X X                                   | 8,4             | 1                | 1                | 0
3 | X X                                   | 9,6             | 2                | 0                | 0
3 | X X                                   | 8,7             | 1                | 3                | 0
3 | X X                                   | 9,5             | 3                | 1                | 0
3 | X X                                   | 11,5            | 1                | 5                | 0
3 | X X                                   | 10,9            | 1                | 5                | 0
3 | Average                               | 9,2             | 1,3              | 2,1              | 0