EDIT: I did misunderstand at this point, as you pointed out in your reply.
Ok, I think I get your model, but I don’t really see why a grocery store in particular would follow it, and it seems like a generally worse way to make order decisions for one. I think it’s more plausible for earlier parts of the supply chain, where businesses may prefer to produce consistent volumes, because there are relevant thresholds (in revenue) for shutting down, downsizing, expanding and entering the market, and it’s costly to make such a decision (selling/buying capital, hiring/firing staff) only to regret it later or even flip-flop.[1] It takes work to hire someone, so hiring and firing (in either order) is costly. Capital assets lose value once you purchase or use them, so buying and selling (in either order) is costly. If changes in a business’ production levels often require such a decision, that business has reason to try to keep production more consistent or stick with their plans to avoid accumulating such costs. But not all changes to production levels require such decisions.
(I don’t mean to imply you don’t understand all of the above; this is just me thinking through it, checking my understanding and showing others interested.)
I don’t think a grocery store has to adjust its capital or staff to order more or less, or at least not for the vast majority of marginal changes in order size. Same for distributors/wholesalers.
I’m not sure about broiler farms. They’d sometimes just have to wait longer for a contract (or never get one again), or maybe they’d get a smaller contract and raise fewer broilers (the market is contract-based in the US, and the farms don’t own the broilers[2]), so it often just wouldn’t be their decision. But on something like your model, if a farm was planning to enter the market or expand, and contracts or revenues (or market reports) came in only slightly worse than expected (still above the threshold in your model, which is far more likely than coming in below it), they’d enter/expand anyway. For farms not planning to expand/enter the market, maybe they’d even take on a contract they don’t expect to cover its variable costs, just to win more favour from the companies contracting them in the future or to push out competitors. Or, just generally, the contracts would very disproportionately be above their thresholds for shutdown, as they expect them to be. Also, many individual farmers are probably subject to the sunk cost fallacy.
Then there are the integrator/processor companies like Tyson that contract the farms. A small number of companies controls a large share of this part of the supply chain, and they’ve been caught price-fixing (see here and here), which undermines the efficiency (and of course the competitiveness) of the market. When sales come in below their predictions, maybe they’d want to keep giving farms contracts to keep them from shutting down or switching to competitors, because it’ll be harder/slower to replace them if demand recovers, or just to hurt competitors. Or, if they were already planning to expand production but sales come in below expectations, they’d do it anyway for similar reasons.
Here’s an example for a grocery store:
Suppose, to avoid stockouts (like you propose they should), as a rule, they order 7 more units than (the expected value of) their predicted sales.
Suppose they would have predicted 123 sales for the next period had you not abstained. Because you abstained, they instead predict 122. So, as a result of your abstention, they order 129 instead of 130, and you make a difference, at least at this level.
Now, maybe they need to order in specific multiples of units. Say they need to order in multiples of 10, and they order the minimum multiple of 10 that’s at least 7 over what they predict.
In the above case, your abstention makes no difference, and they would order 130 either way, but that’s just one case. The threshold to order 10 fewer is when the prediction modulo 10 would have been 4 and your abstention drops it below that.[3] If you look at a randomly sampled period where they need to order, there’s not really any reason to believe that their prediction modulo 10 will be especially unlikely to be 4 compared to any of the other digits.[4]
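To make this concrete, here is a small Python sketch of the hypothetical rule above (the buffer of 7 and the multiple of 10 are my illustrative assumptions, not any store’s actual policy):

```python
import math

def order_size(p, buffer=7, multiple=10):
    # hypothetical rule: smallest multiple of 10 at least 7 above prediction p
    return multiple * math.ceil((p + buffer) / multiple)

# p = 123 vs p = 122 (your abstention): the order is 130 either way
same_either_way = (order_size(123), order_size(122))

# the order only changes when the prediction crosses the threshold,
# i.e. when p mod 10 == 4 (e.g. p = 124 gives 140 but p = 123 gives 130)
changes = [p % 10 for p in range(100, 200) if order_size(p) != order_size(p - 1)]
```

Over predictions spread across many multiples of 10, `changes` picks out exactly one residue in ten, which is the roughly 1-in-10 chance of making a (10-unit) difference discussed above.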
[1] I see papers on sunk-cost hysteresis and entry and exit decisions under uncertainty, such as Baldwin (1989), Dixit (1989), and Gschwandtner and Lambson (2002).
[2] “Broiler production contracts add another risk aside from the way in which compensation is determined. Traditionally, broiler contracts have not required strong commitments by integrators. In 2006, about half of broiler contracts were ‘flock to flock’; that is, the integrator made no specific commitment to provide birds beyond the current flock’s placement. Those contracts that specified a longer duration (usually 1 to 5 years) rarely committed the integrator to a specified number of birds or flocks in a year.”
[3] For their prediction x, if x mod 10 = 4, then they order x + 16; if x mod 10 = 3, then they order x + 7.
[4] I guess one way would be if they have sufficiently consistent purchases and choose a supplier based on the multiple to get their prediction modulo the multiple away from the threshold. I think it’s very unlikely they’d switch suppliers just to get their predictions in a better spot with respect to multiples.
Hi—thanks again for taking more time with this, but I don’t think you do understand my model. It has nothing to do with capital assets, hiring/firing workers, or switching suppliers. All that it requires is that some decisions are made in bulk, i.e. at a level of granularity larger than the impact of any one individual consumer. I agree this is less likely for retail stores (possibly some of them order in units of 1? wouldn’t it be nice if someone actually cared enough to look into this rather than us all arguing hypothetically...), but it will clearly happen somewhere back up the supply chain, which is all that my model requires.
Your mistake is when you write “Say they need to order in multiples of 10, and they order the minimum multiple of 10 that’s at least 7 over what they predict.” That’s not what my model predicts (I think it’s closer to M&H’s first interpretation of buffers?), nor does it make economic sense, and it builds in linearity. What a profit-maximizing store will do is to balance the marginal benefit and marginal cost. Thus if they would ideally order 7 extra, but they have to order in multiples of 10 and x=4 mod10, they’ll order x+6 not x+16 (small chance of one extra stock-out vs large chance of 10 wasted items). They may not always pick the multiple-of-10 closest to 7 extra, but they will balance the expected gains and losses rather than using a minimum. From there everything that I’m suggesting (namely the exponential decline in probability, which is the key point where this differs from all the others) follows.
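The cost-balancing rule could be sketched as a toy newsvendor-style calculation. Everything concrete here is my assumption: Poisson demand around the prediction, and a stockout cost of 3 per unit vs a holding cost of 1, chosen so the ideal buffer comes out near 7:

```python
import math

def poisson_pmf(lam, n_max):
    # P(demand = d) for d = 0..n_max, built iteratively to avoid factorials
    pmf = [math.exp(-lam)]
    for d in range(1, n_max + 1):
        pmf.append(pmf[-1] * lam / d)
    return pmf

def expected_cost(q, pmf, c_short=3.0, c_hold=1.0):
    # expected cost of ordering q: lost sales on stockouts + wasted leftovers
    return sum(prob * (c_short * max(d - q, 0) + c_hold * max(q - d, 0))
               for d, prob in enumerate(pmf))

def best_order(pred, k=10, c_short=3.0, c_hold=1.0):
    # balance expected gains and losses over the allowed multiples of k,
    # rather than taking the smallest multiple clearing a fixed buffer
    pmf = poisson_pmf(pred, 3 * pred)
    candidates = [k * m for m in range(max(pred // k - 1, 0), pred // k + 4)]
    return min(candidates, key=lambda q: expected_cost(q, pmf, c_short, c_hold))
```

With these made-up costs, the unconstrained optimum for a prediction of 124 (so 4 mod 10) is a little above pred + 7, and the rule orders 130 (pred + 6) rather than 140 (pred + 16): a small extra stockout chance beats a large chance of ~10 wasted units. A one-unit drop to 123 leaves the order at 130.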
And a quick reminder: I’m not claiming that my model is the right one or the best one, however it is literally the first one that I thought of and yet no one else in this literature seems to have considered it. Hence my conclusion that they’re making far stronger claims than are possibly warranted.
(I’ve edited this comment, but the main argument about grocery stores hasn’t changed, only some small additions/corrections to it, and changes to the rest of my response.)
Thanks for clarifying again. You’re right that I misunderstood. The point, as I now understand it, is that they expect the purchases (or whatever they’d ideally order, if they could order by individual units) to fall disproportionately within one order size and away from each threshold for the next lower and higher order sizes, i.e. much more towards the middle, and they’ve arranged their order sizes to ensure this.
I’ll abandon the specific procedure I suggested for the store, and make my argument more general. For large grocery stores, I think my argument at the end of my last comment is still basically right, though, and so you should expect sensitivity, as I will elaborate further here. In particular, this would rule out your model applying to large grocery stores, even if they have to order in large multiples, assuming a fixed order frequency.
Let’s consider a grocery store. Suppose they make purchase predictions p (point estimates or probability distributions), and they have to order in multiples of K, but I’ll relax this assumption later. We can represent this with a function f from predictions to order sizes so that f(p) = K·g(p), where g is an integer-valued function. f can be the solution to an optimization problem, like yours. I’m ignoring any remaining stock they could carry forward for simplicity, but they could just subtract it from p and put that stock out first. I’m also assuming a fixed order frequency, but M&H mention the possibility of “a threshold at which a delivery of meat comes a day later”. I think your model is a special case of this, ignoring what I’m ignoring and with the appropriate relaxations below.
I claim the following:
Assuming the store is not horrible at optimizing, f should be nondecreasing and scale roughly linearly with p. What I mean by “roughly linearly with p” is that for (the vast majority of possible values of) p, we can assume that f(p+K)=f(p)+K, and that values of p where f(p+1)=f(p)+K, i.e. the thresholds, are spaced roughly K apart. Even if different order sizes didn’t differ in multiples of some fixed number, something similar should hold, with spacing between thresholds roughly reflecting order size differences.
A specific store might have reason to believe their predictions are on a threshold much less than 1/K of the time across order decisions, but only for one of a few pretty specific reasons:
They were able to choose K the first time to ensure this, intentionally or not, and stick with it and f regardless of how demand shifts.
The same supplier for the store offers different values of K (or the store gets the supplier to offer another value of K), and the store switches K or uses multiple values of K simultaneously in a way that avoids the thresholds. (So f defined above isn’t general enough.)
They switch suppliers or products as necessary to choose K in a way that avoids the thresholds. Maybe they don’t stop offering a product or stop ordering from the same supplier altogether, but optimize the order(s) for it and a close substitute (or multiple substitutes) or multiple suppliers in such a way that the thresholds are avoided for each. (So f defined above isn’t general enough.)
If none of these specific reasons hold, then you shouldn’t expect to be on the threshold much less than 1/K of the time,[1] and you should believe E[f(p−1)]≈E[f(p)]−1, where the expectation is taken over your probability distribution for the store’s prediction p.
How likely are any of these reasons to hold, and what difference should they make to your expectations even if they did?
The first reason wouldn’t give you far less than 1/K if the interquartile range of their predictions across orders over time isn’t much smaller than K and they prefer (or have) to keep offering the product anyway. This is because the thresholds are spaced roughly K apart, p will have to cross thresholds often with such a large interquartile range, and if p has to cross thresholds often, it can’t very disproportionately avoid them.[2]
Most importantly, however, if K is chosen (roughly) independently of p, your probability distribution for p mod K for a given order should be (roughly) uniform over 0, ..., K−1,[3] so p should hit the threshold with probability (roughly) 1/K. It seems to me that K is generally chosen (roughly) independently of p. In deciding between suppliers, the specific value of K seems less important than the cost per unit, shipping time, reliability and having a lower value of K in general.[4] In some cases, especially for stores belonging to large chains, there isn’t a choice, e.g. Walmart stores order from Walmart-owned distributors, or chain stores will have agreements with the same supplier company across stores. Then, having chosen a supplier, a store could try to arrange for a different value of K to avoid thresholds, but I doubt they’d actually try this, and even if they did, suppliers seem likely to refuse without a significant increase in the cost per unit for the store, because suppliers have multiple stores to ship to and don’t want to adjust K store by store.
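The uniformity claim is easy to sanity-check by simulation. The prediction distribution below is entirely made up; the only thing that matters is that its spread is much wider than K:

```python
import random

random.seed(0)
K = 10

# made-up predictions across many order decisions, spread much wider than K
preds = [round(random.gauss(150, 40)) for _ in range(100_000)]

# share of decisions whose prediction lands on one particular residue mod K
# (residue 4 is arbitrary; any single threshold residue behaves the same)
hit_rate = sum(p % K == 4 for p in preds) / len(preds)
# hit_rate comes out close to 1/K = 0.1
```

With a spread this wide, each residue class gets close to 1/K of the predictions, so a single consumer is pivotal roughly 1/K of the time, for a difference of K units when they are.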
Stores similarly probably wouldn’t follow the strategies in the second and third reasons because they wouldn’t be allowed to, or even if they could, other considerations like cost per unit, shipping time, reliability and stocking the same product would be more important. Also, if the order quantities vary often enough between orders based on such strategies, you’d actually be more likely to make a difference, although smaller when you do.
So, I maintain that for large stores, you should believe E[f(p−1)]≈E[f(p)]−1.
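This conclusion doesn’t depend on which multiple-of-K rule f the store uses. For any f with f(p + K) = f(p) + K, the differences f(p) − f(p − 1) telescope over a full residue class of p, so if p mod K is (roughly) uniform, the expected difference is (roughly) 1. A minimal sketch with one such f (the buffer b is an arbitrary assumption):

```python
K, b = 10, 7

def f(p):
    # one example rule with f(p + K) == f(p) + K: the smallest multiple
    # of K at least b above p (via ceiling division)
    return K * -(-(p + b) // K)

# average over one full residue class, i.e. p mod K uniform
ps = range(1000, 1000 + K)
avg_f = sum(f(p) for p in ps) / K
avg_f_shifted = sum(f(p - 1) for p in ps) / K
# avg_f - avg_f_shifted == 1: one fewer predicted sale lowers the
# expected order by exactly one unit, despite ordering in multiples of K
```

So the averaging works out exactly here, not just approximately, because shifting p by 1 moves exactly one residue across the threshold.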
And a quick reminder: I’m not claiming that my model is the right one or the best one, however it is literally the first one that I thought of and yet no one else in this literature seems to have considered it. Hence my conclusion that they’re making far stronger claims than are possibly warranted.
Fair. I don’t think they necessarily should have considered it, though, if observations they made would have ruled it out, but it seems like they didn’t make such observations.
but it will clearly happen somewhere back up the supply chain, which is all that my model requires.
I don’t think this is obvious either way. This seems to be a stronger claim than you’ve been making elsewhere about your model. I think you’d need to show that it’s possible and worth it for those at one step of the supply chain to choose K or suppliers in one of the ways I ruled out for grocery stores, and without making order sizes too sensitive to predictions. Or you’d need a case where my model wasn’t general enough, e.g. I assumed a fixed order frequency.
[1] It could be more than 1/K, because we’ve ruled out being disproportionately away from the threshold by assumption, but still allowed the possibility of disproportionately hitting it.
[2] For realistic distributions of p across orders over time.
[3] I would in fact expect lower numbers within 0, …, K−1 to be slightly more likely, all else equal: basically Benford’s law and its generalizations to other digit positions. Since these are predictions and people like round numbers, if K is even or a multiple of 5, I wouldn’t be surprised if even numbers or multiples of 5, respectively, were more likely.
[4] Except maybe if the minimum K across suppliers is only a few times smaller than p, close to p, or even greater, and they can’t carry stock forward past the next time they would otherwise receive a new shipment.