
Competition and Quality Choice in the CPU Market ∗

Chris Nosko
Harvard University
November 2010

Abstract

This paper uses the CPU market to study how multiproduct firms generate returns from innovation. Using a new dataset, I estimate a discrete-choice model of CPU demand and then recover estimates of the sunk cost of product introductions. I combine these estimates with a model of firm product choice to examine how product line decisions change with asymmetric technological capabilities and with the competitive environment. I use the model to show how technological leaders can use product lines as strategic weapons, isolating competition to less desirable areas of the product spectrum.

I apply this insight to a large shift in technological leadership – Intel’s introduction of the Core 2 Duo – and quantify the portion of returns that came from Intel’s ability to push its principal competitor, AMD, into lower-margin product segments. I find that competition plays a key role in determining firms’ product line decisions and that these decisions are important in generating returns from innovation. Ignoring endogenous product choices leads to underestimates of the social welfare losses from monopoly.

∗ I am grateful to Ulrich Doraszelski, Lisa Kahn, Greg Lewis, Julie Mortimer, Ariel Pakes, and Alan Sorensen for invaluable advice. Also, I thank Brett Gordon for his help in putting together the pricing data, Che-Lin Su for helpful discussions about computational methods, and especially Jeremy Davies of Contextworld for generously supplying downstream data. All errors are solely mine.


1 Introduction

When firms innovate, they often don’t just introduce products at the top end of the market. Instead, they tend to reset their whole product line in an effort to extract the most possible profit from their innovation. Their incentives to reshape lower market segments depend on the industry structure, especially whether it is a monopoly or an oligopoly, and the technological capabilities of rival firms. In oligopoly, one way that firms generate profit is by strategically using product choices to change the nature of competition in the industry. Our ability to understand this phenomenon requires knowledge of how firms make product choices and how these product choices change with market structure. Despite the evident importance of competition for driving decisions about product lines, and the effect that these decisions have on consumer welfare, most antitrust analyses downplay them, and models of innovation almost completely ignore them.

I use the CPU market to study how imperfectly competitive firms make product decisions and how these decisions affect their ability to generate returns from innovation. In this market, firms offer a menu of quality-differentiated products that is often reset to integrate new technology and to respond to actions of competitors. In 2006, Intel introduced a new product line, called the Core 2 Duo. With a chip hailed as “the most impressive piece of silicon the world has ever seen,”1 the market went from relative equality between Intel and its rival, AMD, to one that was firmly dominated by Intel. Interestingly, it wasn’t just Intel’s ability to produce faster chips that led to dominance. Instead, much of Intel’s increased profit came from pushing AMD out of mid-range market segments, areas where AMD still had the technological capability to compete. I use variation from the introduction of the Core 2 Duo, and the relative frequency of product line changes in general, to ask the following questions: In this industry, how do firms choose the number and quality of their products? How would these choices be different if the industry were a monopoly? And how do firms use strategic product choices to increase returns from innovation?

1 See the CNET article, “Intel’s Core 2 Duo lives up to hype” from July 16, 2006: http://tinyurl.com/cnet-core2-duo


My first finding relates to the role that competition plays in driving product line decisions. When marginal costs rise relatively slowly with quality level – a stylized fact in the CPU industry – a monopolist has little incentive to introduce a broad spectrum of products. Instead, a monopolist can extract almost all feasible profit with a limited number of products at the high end of the market. In oligopoly, this strategy is no longer optimal because a competitor can steal market share by introducing products at lower price points. This process leads to a competitive equilibrium with more products spread throughout the product line. Thus, in markets like CPUs, quality-based product separation can be driven largely by competitive interaction rather than a desire to discriminate between consumer types.2 This contrasts with the standard literature on price discrimination, which sees the introduction of quality-differentiated products as a mechanism for extracting more revenue from high-valued consumers.3 This finding implies that, because product line decisions matter little for a monopolist’s profitability, when a monopolist innovates, resetting its product line plays much less of a role in extracting profit from that innovation than it does for firms in an oligopoly.

I next find that, in oligopoly, returns from innovation come not only from the ability to produce a better product at the top end, but also from an innovator’s ability to steal business from rivals throughout the product line. Using a simple model, I construct an example showing how a technological leader can isolate competition to lower-margin portions of the market, thereby increasing market power over a larger product space. I combine my model with data from the introduction of the Core 2 Duo to quantify the role that these business-stealing effects played in generating profit for Intel. I break apart the portion of returns that came from Intel’s introduction of new top products from the returns that came from strategic quality choices throughout the product spectrum. This comparison gives an estimate of how much we would underestimate the effect of competition on innovation incentives if we held product lines fixed. Next, I compare the profits that Intel generated in

2 In a series of theory papers, Johnson and Myatt (2003, 2006) discuss the role of competition in driving quality choices in a Cournot setting with differentiated products.

3 There is a long literature on using quality as a price discrimination mechanism going back at least to Jules Dupuit in the 19th century. The foundational modern work is Mussa and Rosen (1978), with generalizations to multi-dimensional consumer types and/or multiple firms by Rochet and Stole (2002), Armstrong (1996), Rochet and Chone (1998), and Armstrong and Vickers (2001).


the duopoly structure with profits that a counterfactual monopolist would have made with the same innovation. I show that, while these returns are substantially lower in percentage terms than those of an oligopolist, in absolute dollars the returns are very similar.

My empirical strategy relies on a combination of institutional details, a rich dataset containing some “exogenous” shifts, and a structural model. This industry has two firms, and these firms compete over products that differ in relatively straightforward ways, allowing me to write down a model that is simple enough to take to the data, but that still captures key aspects of the industry. By using this model to estimate primitives that we don’t generally observe in datasets – consumer preferences, marginal and sunk costs – I attempt to untangle some of the key drivers of product line decisions. I then use these estimates to shed light on Intel’s introduction of the Core 2 Duo, an event that I treat as the result of an exogenous innovative process.4 Using the structural model, I compute how a counterfactual monopolist would choose products and compare these and the competitive outcomes to a social planner.

My dataset contains CPU list prices (when purchased in 1,000-unit lots) combined with European country- and time-specific data on desktop sales. Because the sales data contain information on the CPU that shipped with the computer, I am able to construct a monthly dataset of CPU prices and quantities sold across 8 European countries from 2003-2008. I exploit the cross-country variation to generate estimates of a horizontal taste for Intel’s vs. AMD’s products, and the near-constant flow of new chips provides variation in quality levels. As mentioned, I use the introduction of the Core 2 Duo to consider how innovative activity interacts with quality choice to generate returns.

The basis of the structural model is an underlying utility framework that allows for consumer heterogeneity in willingness to pay for quality (vertical differentiation) and heterogeneity in brand preference across CPU companies (horizontal heterogeneity). Demand

4 In this paper I treat innovation outcomes as exogenous events, focusing on how firms generate returns from them. With respect to the Core 2 Duo in particular, this is probably not a bad assumption: The project that led to the Core 2 Duo, codenamed Banias, began at least as early as 2001 to develop a CPU for laptops (which later became the Pentium M). It was only later that this technology was thought appropriate for the desktop market. See the Seattle Times, “How Israel saved Intel”, http://tinyurl.com/core2-banias. More generally, the results in this paper can be seen as an input to a dynamic game that endogenizes innovation decisions.


estimation proceeds following the Pure Characteristics Model of Berry and Pakes (2007) with some modifications tailoring the problem to my setting. I recover sunk costs of product introductions from observing decisions on whether and when firms introduced new products into the market. Moment inequalities (Pakes, Porter, Ho, and Ishii (2006)) allow me to recover bounds on the sunk cost parameter.

I find that a counterfactual monopolist has little incentive to introduce a whole product line: With a single product he can capture 98% of the profit that he earns with a full, optimally-placed product line. Given the sunk cost estimates, a monopolist would introduce between 1 and 3 products, compared to the 8 to 10 products that exist in the competitive market. Consumer surplus under a monopolist is found to go down by 65% compared to the competitive outcome. Much of that comes from the increased monopolist prices, but a non-trivial 13% comes from the reduction of products and their inefficiently high quality levels.

I further find that the returns to innovation are higher in percentage terms in an oligopoly than in a monopoly. My estimates indicate that Intel’s profits increased by 96% with the introduction of the Core 2 Duo (from 95 to 180 million dollars monthly). 49% of that came from the introduction of new products (holding old products fixed), and the rest came from the realignment of products throughout the spectrum. Finally, a monopolist with the same innovation would have increased profits by 17% (from 488 to 573 million). Even though a monopolist has lower percentage returns, in dollar values the amount is very similar to the oligopoly outcome.

These results speak to recent antitrust enforcement in this industry. The market leader, Intel, has been widely accused of actively working to exclude AMD from the market. A number of regulatory agencies, including the European Union Competition Commission, the U.S. Federal Trade Commission, and the Fair Trade Commission of Japan, have either fined or investigated Intel’s behavior. Naturally, in analyzing the possible effects of a market dominated more strongly by Intel, we would like to know how the product landscape would change. My results indicate that the current market is quite competitive. An Intel monopoly would result in substantial lost consumer welfare, mostly because of higher monopoly prices,


but also because of a decrease in the number of products on the market.

There are a number of recent empirical IO papers that examine topics related to mine. Eizenberg (2008) estimates a game where downstream OEMs choose a discrete portfolio of CPU options to offer with their PC products in a first stage, and then set prices in a second stage.5 In contrast, I focus on competition between the upstream firms, Intel and AMD, and on how product line decisions affect their ability to generate returns from innovation. Fan (2009), Crawford and Shum (2006), Mazzeo (2002), and Draganska, Mazzeo, and Seim (2009) consider endogenous product choice in newspapers, cable television, hotels, and ice cream, respectively. Berry and Waldfogel (1999, 2001) explore firm choice of radio station formats using merger activity brought on by the Telecommunications Act of 1996, and Sweeting (2007) models single-product firms and estimates the sunk cost of format switching. Two papers use the CPU market to study different topics. Song (2007) estimates a demand system closely related to the one I use below in order to compare consumer welfare measures to more widely used models. Goettler and Gordon (2009) model firm innovation incentives in the face of dynamic consumers. Because of the complexity of the dynamic model, they are forced to limit firm heterogeneity, modeling Intel and AMD as single-product firms, which doesn’t allow for the segmentation incentives that I focus on.

This paper is organized as follows. Section 2 describes the industry, introduces the data that will be used, and discusses changes in the CPU market that make it a suitable environment to study nonlinear pricing and competition. Section 3 details utility primitives and goes through their estimation. Section 4 lays out the quality choice model, which includes the second-stage pricing game and estimation of marginal and sunk costs. Section 5 lays out and solves counterfactuals using the estimates from earlier sections. The counterfactuals simulate what the market would look like if it were a monopoly, run by a social planner, or had different innovation outcomes.

5 Eizenberg shows how to account for issues of self-selection and partial identification in these sorts of games, an estimation problem closely related to the one in this paper.


2 The CPU Market

The market for desktop, laptop, and server CPUs is dominated by two companies: Intel and AMD, with (respectively) approximately 80% and 18% market share as of January 2009. This paper concentrates on the market for desktop CPUs. These are CPUs that go into home and business machines that are used for everyday tasks. I concentrate on this market rather than the market for laptop chips because it is more competitive (Intel dominates the market for laptop chips) and more stable (laptop growth has been explosive over the last few years). More data are also available for this market because enthusiasts tend to buy desktop chips and chart their performance extensively.

Within the desktop market, each firm typically offers between 10 and 15 chip varieties at any given time. By far the largest difference between these chips is performance. Higher-performing chips tend to have higher clockspeed (operate at a higher frequency), have more high-speed cache memory available, include multiple cores, and use more advanced process technology. Firms can and do use all of these levers to manipulate performance, but at the end of the day a consumer need only look at how the chip performs on some benchmark to determine its product quality.6

The CPU market has long been known for offering quality-differentiated product lines. The Intel 80486 was a popular example in both the economics literature (Deneckere and McAfee (1996)) and the popular press. Introduced in 1989, Intel created low-quality and high-quality versions of this chip. Strikingly, in order to create the low-quality chip, they went to some cost to destroy a perfectly good high-quality chip.7 Recent examples include

6 This is, of course, a simplification. E.g., chips that have differing numbers of cores appeal to consumers who do different sorts of tasks with their computers (making quality not perfectly collapsible to a one-dimensional metric). Nevertheless, this is a product that comes about as close as possible to differing on a single vertical dimension.

7 More specifically, Intel released a “DX” version that included a math co-processing chip, and an “SX” version that did not. The co-processing chip gave a performance boost to power users but was ignored by the mainstream software of most users. The DX version sold at a substantial premium. The story goes that in order to create an SX chip, Intel manufactured a DX version and then incurred some cost to destroy the connection between the CPU and the co-processor. This manufacturing process is somewhat apocryphal: At first, Intel used DX chips with properly functioning CPUs but with manufacturing defects in the co-processing unit. Later, they created a separate mask to exclude the co-processing unit completely, which decreased the die size and hence the cost.


Intel’s selling of a code that “unlocks” features of their G6951 processor. Consumers purchase the chip at a stock level, and should they wish to access additional performance increases, they can buy the magic code (which costs Intel nothing) that makes the chip perform better.8

While these anecdotes illustrate isolated incidents, the industry has eluded systematic study on the quality-choice dimension. I believe this is because, up until the end of 2003, Intel and AMD used a rather crude segmentation mechanism: chip manufacturers concentrated on their top chips and left their older chips to serve more price-sensitive segments of the market (at reduced prices). While segmentation was occurring, the quality choice itself was not being made every period – instead, firms were constrained by their top-performing chips from the last period, a strategy known as waterfalling. This changed toward the end of 2003, when firms began adjusting price and quality on a regular basis to hit different segments of the market. Appendix 1 documents this shift in the industry.

The competitive nature of the industry has fluctuated markedly over the 2000s. The mid-2000s were the height of AMD’s competitiveness, peaking with 30% market share at the beginning of 2006. This is up from around 10% at the beginning of 2003 and compares with 20% at the beginning of 2009. Figure 1 plots Intel and AMD market share from the end of 2003 through the beginning of 2009.

[Figure 1 about here.]

Changing technical leadership is partially responsible for the market share fluctuations. From 2002-2006, AMD consistently released products whose price/performance characteristics were similar to or beat Intel’s products. However, with the release of the Core 2 product line at the end of 2006, Intel regained technical leadership, a position which they’ve held ever since.

Figures 2 and 3 tell the story of a rapidly and significantly changing market. In June 2006, Intel and AMD both offered products throughout the quality spectrum. In many parts of the spectrum, AMD offered products that were better and cheaper than comparable Intel products (the top panel of figure 2). In July 2006, Intel released a number of Core 2 Duo

8 See http://tinyurl.com/intel-g6951. I thank Kelly Shue for pointing this out to me.


products. The bottom panel of figure 2 shows that these products completely dominated AMD’s offerings, substantially altering the price/quality landscape. AMD responded by slashing prices and removing products that were no longer competitive (figure 3). By January of 2008, the competitive nature of the market had changed so significantly that AMD was relegated to the bottom portion of the quality spectrum, offering almost no chips at the medium to high end. Interestingly, while AMD’s overall market share did not drop all that much, its share of more expensive chips dropped almost to 0 (figure 4).

[Figure 2 about here.]

[Figure 3 about here.]

[Figure 4 about here.]

I exploit this shift in technological leadership in two ways: First, it provides natural variation in product characteristics that helps identify the demand system. Second, I explore firms’ reaction to this technological change, focusing on ways that it affected quality choices and the strategic interactions between firms. The section on counterfactuals discusses this in more detail.

In addition to variation in market share across time, there is also substantial variation across countries. Figure 5 shows that AMD’s market share in France is consistently higher than in other countries, especially when contrasted with their Southern neighbor, Spain. This is a useful source of variation that will play a key role in identifying the horizontal aspect of consumer heterogeneity.

[Figure 5 about here.]

Data for this paper come from a variety of sources. Prices were gathered from websites devoted to tracking the wholesale prices of Intel and AMD chips. These are prices paid by distributors and system builders when purchased in lots of 1,000.9

Quantities come from Contextworld, a European firm that tracks computer sales. Contextworld contracts with major retailers and distributors across the region to receive point-of-sale data. They then extrapolate to include retailers that they do not have contracts with. To check the accuracy of these extrapolations, I compared aggregate Contextworld data to other consulting firms that report their versions of these numbers, such as IDC, and the numbers were quite similar. The Contextworld data contain characteristics such as hard drive size, RAM, screen, and, important for my purposes, the exact CPU that went into the computer. These data can be broken down across country, distribution channel, and computer type (laptop or desktop). One downside to using downstream sales data is that there is a lag between when a CPU is purchased from the upstream supplier and when it is sold to the end user (making it into the quantities data).
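The construction described above amounts to joining chip-month wholesale prices onto chip-country-month sales records. A minimal sketch of such a join, with a hypothetical schema and made-up values (the column names and figures are illustrative, not the paper's data):

```python
import pandas as pd

# Hypothetical schema and values -- illustrative only.
prices = pd.DataFrame({
    "cpu": ["E6600", "E6600", "X2-5000"],
    "month": ["2006-07", "2006-08", "2006-07"],
    "list_price": [316.0, 316.0, 301.0],   # per-unit price in 1,000-unit lots
})
sales = pd.DataFrame({
    "cpu": ["E6600", "E6600", "X2-5000"],
    "country": ["DE", "FR", "DE"],
    "month": ["2006-08", "2006-08", "2006-07"],
    "units": [1200, 800, 950],             # desktops sold with that CPU
})

# One row per (cpu, country, month); validate that each chip-month
# has a unique list price before merging.
panel = sales.merge(prices, on=["cpu", "month"], how="left", validate="m:1")
print(panel)
```

The `validate="m:1"` check guards against duplicate price rows silently inflating quantities, a common hazard when assembling panels from separate trackers.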

Performance data were collected from various enthusiast websites. These sites and forums take CPUs and run them through a series of benchmarks and performance tests. The end result is a number of metrics that allow chips to be compared to each other. It is important for this paper to use actual benchmark numbers instead of CPU speed (or some combination of CPU speed and cache) because I am explicitly comparing performance across CPU manufacturers with substantially different architectures. Simple clockspeed doesn’t reflect actual performance in this case because the chip architecture interacts with numerous characteristics of the chip in complex ways to actually move information through the pipeline.

9 There is plenty of evidence that the large OEMs do not pay these list prices. Instead, at a minimum, they get percentage discounts off of them depending on their size. As long as these discounts are the same across the OEMs in my data, my substantive empirical conclusions will not be affected. It would be more problematic if specific OEMs got specific discounts off of certain chips and not others. I have not heard of instances of this occurring, but given the bilateral nature of these deals, I certainly cannot rule it out.


Because I am primarily concerned with the level of competition between Intel and AMD, I break out a dummy, d_j, from the product observables, where d_j = d_Intel − d_AMD; d_Intel and d_AMD are dummies for whether the product is made by Intel or by AMD.

u_ij = βx_j + ν_i d_j − (1/α_i)p_j + ξ_j    (2)

I allow consumers to have varying tastes for purchasing an Intel or AMD product by including the random coefficient ν_i on d_j. For an Intel product, ν_i enters the utility function positively; for an AMD product, negatively. ν_i determines how substitutable products are across firms. At one extreme, if ν_i is a constant with value 0, then two products (from different firms) with the same characteristics would be identical from a consumer’s

10 The key difference between this model and the more commonly used discrete choice model of Berry, Levinsohn, and Pakes (BLP) is the absence of a product- and individual-specific error term (generally denoted ε_ij). Although appending an ε_ij would ease the computational burden in estimation (a point to which I return below), it is problematic in this context. Here, a product is defined at the chip level. Adding a BLP-style error term would oddly imply that consumers received independent utility draws for closely related chips (say, products that only slightly differed in speed). The larger problem comes from considering the supply side: Consumers with iid product-level error terms provide strong incentives for firms to introduce new products, even if they are very closely related to existing products. Firms can “generate utility” simply by introducing a product, because some consumers will get a high draw from the idiosyncratic shock even if they got a low draw from a product with identical observables. If these shocks don’t accurately represent reality, then a model which considers firms’ incentives to introduce new products will overstate profit opportunities. These reasons for using the PCM are stressed in the original paper by Berry and Pakes. They are especially important here because the supply side and counterfactuals explicitly consider product introduction choices.


perspective, which gives Bertrand pricing and implies zero markup. On the other hand, the higher the mean value of ν_i for a given firm, the more market power that firm enjoys, as consumers will prefer that firm’s products over a competitor’s even with inferior product characteristics. As the variance of ν_i increases, price competition softens because the tails of the distribution more heavily favor one firm or the other.

In the CPU market, there are a number of reasons that consumers might prefer one firm’s product over another’s (holding observables constant). First, both Intel and AMD advertise heavily. The Intel Inside campaign was widely credited with generating consumer awareness of the CPU, moving it from a commodity semiconductor part to a major part of the consumer buying decision. Second, due to differences in architectures, AMD and Intel chips perform different types of tasks at different speeds. For instance, those who play computer games tend to prefer AMD chips, while those with business-oriented tasks prefer Intel. Lastly, there are complementarities between the CPU and other components inside the computer. CPUs from a given company can often be upgraded without changing the motherboard, video card, etc., while changing to a CPU from a different company would require re-purchasing those components.

Splitting apart the random coefficients into a mean and variance term and assuming that α_i is distributed lognormal and ν_i normal gives (where n_ν and n_α are standard normals):

u_ij = βx_j + (ν̄ + σ_ν n_ν)d_j − (1/exp(ᾱ + σ_α n_α))p_j + ξ_j    (3)

Because utility is only defined up to a monotonic transformation, two normalizations are needed in order to identify the underlying components of the demand system. I first normalize the utility of the outside option to 0. Without this normalization, an additive shift to all prices would, unrealistically, not affect market shares. The second normalization – setting the mean of the underlying normal on the price coefficient, ᾱ, to 0 – pins down the multiplicative scale. Without it, multiplying both sides of the utility function by the same constant would not change the implications of the model.
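For intuition, the demand side in equation (3) can be simulated directly. A minimal sketch with made-up parameter values (not the paper's estimates), imposing the normalizations just described (outside-option utility 0, ᾱ = 0):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameter values only -- not estimates from the paper.
delta = np.array([1.0, 1.4, 1.8])        # mean qualities delta_j
price = np.array([0.8, 1.2, 2.0])        # prices p_j (scaled)
d = np.array([1.0, 1.0, -1.0])           # d_j: +1 for Intel, -1 for AMD
nu_bar, sigma_nu, sigma_alpha = 0.3, 0.5, 1.0   # alpha_bar normalized to 0

n = 100_000
nu = nu_bar + sigma_nu * rng.standard_normal(n)        # brand taste, normal
alpha = np.exp(sigma_alpha * rng.standard_normal(n))   # price coeff., lognormal

# u_ij = delta_j + nu_i d_j - p_j / alpha_i; outside option gives utility 0
u = delta[None, :] + nu[:, None] * d[None, :] - price[None, :] / alpha[:, None]
choice = np.where(u.max(axis=1) > 0, u.argmax(axis=1) + 1, 0)  # 0 = outside good
shares = np.bincount(choice, minlength=4) / n
print(shares.round(3))  # [outside, product 1, product 2, product 3]
```

Note there is no product-level idiosyncratic shock: each consumer deterministically picks the utility-maximizing product given her (ν_i, α_i) draw, which is exactly the PCM feature discussed in footnote 10.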


Define δ_j as the mean product-level quality:

δ_j = βx_j + ξ_j

As in Berry and Pakes, to generate aggregate market shares, first consider a consumer’s product choice conditional on purchasing from firm n. I construct upper and lower cutoff values for each product: a consumer will purchase product j when her price-sensitivity draw α_i falls between the cutoffs at which she is indifferent between j and its neighboring products in firm n’s line.

Here I diverge from the original Berry and Pakes estimation routine. Because of the structure of my model – one horizontal and one vertical characteristic with two firms – I am able to calculate an analytical cutoff value for each α_i type that determines which consumers purchase from each firm. Let u_j*_n be a consumer’s utility from her preferred product conditional on purchasing from firm n. Then, for every α_i, there is a cutoff value, ν̃, such that u_j*_1 = u_j*_2. Using the functional form of utility gives:

ν̃(α_i) = ½[(δ_j*_2 − δ_j*_1) + (p_j*_1 − p_j*_2)/α_i]
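The indifference condition u_j*_1 = u_j*_2 has a closed form under this utility specification; the sketch below derives it from equation (2) with d = +1 for firm 1 and d = −1 for firm 2 and checks it numerically. The parameter values are hypothetical, chosen only to exercise the formula:

```python
# Check of the closed-form cutoff: at nu = nu_tilde, the best firm-1 product
# (d = +1) and the best firm-2 product (d = -1) give identical utility.
# Values are hypothetical, chosen only to exercise the formula.

def cutoff_nu(delta1, p1, delta2, p2, alpha):
    """Solve delta1 + nu - p1/alpha = delta2 - nu - p2/alpha for nu."""
    return 0.5 * ((delta2 - delta1) + (p1 - p2) / alpha)

alpha = 2.0
nt = cutoff_nu(delta1=1.8, p1=2.0, delta2=1.4, p2=1.2, alpha=alpha)
u1 = 1.8 + nt - 2.0 / alpha   # utility from firm 1's preferred product
u2 = 1.4 - nt - 1.2 / alpha   # utility from firm 2's preferred product
print(abs(u1 - u2) < 1e-9)    # True: the consumer is exactly indifferent
```

Consumers of type α_i with ν above the cutoff buy from firm 1; those below buy from firm 2 (or the outside good if the best utility is negative).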


Assuming marginal distributions f(α) and g(ν) for α and ν, market shares for product j from firm 1 (2) are given by integrating f(α)g(ν) over the region of (α, ν) types for whom product j is the preferred choice: consumers with ν above the cutoff ν̃(α) whose α falls between product j’s cutoff values for firm 1, and symmetrically with ν below ν̃(α) for firm 2. Writing shares this way allows me to more easily formulate the problem as a constrained maximization problem and use the Mathematical Programming with Equilibrium Constraints (MPEC) techniques discussed in Su and Judd (2008).
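The double integral over f(α) and g(ν) can be approximated by quadrature over the two standard-normal shocks. An illustrative sketch with made-up parameters and one product per firm (a tensor-product Gauss-Hermite rule, not the paper's exact implementation):

```python
import numpy as np

# Illustrative parameters (made up): one product per firm.
delta = np.array([1.8, 1.4]); price = np.array([2.0, 1.2]); d = np.array([1.0, -1.0])
nu_bar, sigma_nu, sigma_alpha = 0.3, 0.5, 1.0

# Probabilists' Gauss-Hermite nodes/weights for the standard-normal shocks
nodes, wts = np.polynomial.hermite_e.hermegauss(41)
w = wts / wts.sum()   # normalize so the weights integrate a density to 1

shares = np.zeros(2); outside = 0.0
for na, wa in zip(nodes, w):
    alpha = np.exp(sigma_alpha * na)          # lognormal price sensitivity
    for nn, wn in zip(nodes, w):
        nu = nu_bar + sigma_nu * nn           # normal brand taste
        u = delta + nu * d - price / alpha
        if u.max() > 0:
            shares[u.argmax()] += wa * wn     # mass of this (alpha, nu) cell
        else:
            outside += wa * wn                # outside good (utility 0) wins
print(shares.round(3), round(outside, 3))
```

With the analytical cutoffs, the inner loop over ν could be replaced by evaluating the normal CDF at ν̃(α), which is how the closed form pays off computationally.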

It is common in the discrete choice estimation literature to assume that observed and unobserved product characteristics are uncorrelated with each other. This is useful for at least two reasons: 1) The estimation routine usually involves a regression where observed characteristics are regressors and the unobserved characteristic is the error term, resulting in biased coefficients without this assumption (and in the absence of further instruments). 2) It allows characteristics of rival firms’ products to be used as instruments for the prices of firm j’s products. This relies on other firms’ characteristics being correlated with price (which makes sense given that more crowded regions of the product space should have lower markups) and uncorrelated with unobservables. If firms choose characteristics knowing the unobservables of products produced by themselves and their competitors, then these instruments will not be valid and the estimated parameters will be biased.

Assuming exogenous product characteristics is problematic in my model because the whole point of the supply side is to investigate how quality choices are made. Using product characteristics as instruments creates a situation where, without further assumptions, the demand estimation is incompatible with the quality-choice model. My solution is to exploit


the panel structure of the data, which allows me to include product-level dummy variables in the estimation.

As in Nevo (2001), product-level dummies change the structural error in the estimation. The unobserved product characteristics are soaked up by the dummies, which subsume all characteristics that do not change from market to market. The structural error term is now the market-level deviation from the mean utility level. In my case this means time/country-specific deviations for an individual chip.

Estimation proceeds by minimizing a GMM objective function subject to simulated market shares equaling actual market shares:

min_{σ_ν, σ_α, δ̃, β}  ξ̃(β, σ)′ Z W Z′ ξ̃(β, σ)    (11)

subject to: s(δ̃, σ) = S, i.e., simulated market shares equal observed market shares.

For each candidate value of the standard deviations of the random coefficients, the routine calculates the mean utility levels that equate simulated with observed market shares and finds the value of the objective function by taking the residuals from an IV regression. The IV regression has the mean product utilities on the left-hand side, and the right-hand side includes product dummies (k_j), time dummies (k_t), and market dummies (k_m); see equation (13). ξ̃_jtm is the market/time-specific error term. The product dummy coefficients β_j give the δ_j’s from the underlying utility model and are used as inputs to the supply side of the model.11

δ̃_jtm = β_j k_j + β_t k_t + β_m k_m + ξ̃_jtm    (13)
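The outer regression step, recovering dummy coefficients from mean utilities as in (13) and forming a GMM objective of the shape in (11) from the residuals, can be sketched on synthetic data. Everything below is illustrative: the panel dimensions are toys, and the two extra instrument columns are hypothetical stand-ins for whatever excluded instruments are used:

```python
import numpy as np

rng = np.random.default_rng(1)
J, T, M = 5, 6, 4   # toy panel: products, months, countries

# Synthetic "recovered" mean utilities delta_jtm with the additive structure of (13)
bj, bt, bm = rng.normal(size=J), rng.normal(size=T), rng.normal(size=M)
j_ix, t_ix, m_ix = np.meshgrid(np.arange(J), np.arange(T), np.arange(M), indexing="ij")
j_ix, t_ix, m_ix = j_ix.ravel(), t_ix.ravel(), m_ix.ravel()
delta = bj[j_ix] + bt[t_ix] + bm[m_ix] + 0.1 * rng.normal(size=J * T * M)

# Dummy regressors; drop one time and one market dummy to avoid collinearity
X = np.column_stack([np.eye(J)[j_ix], np.eye(T)[t_ix][:, 1:], np.eye(M)[m_ix][:, 1:]])
beta_hat, *_ = np.linalg.lstsq(X, delta, rcond=None)
xi = delta - X @ beta_hat                 # market/time-specific residuals

# GMM objective xi' Z W Z' xi; Z stacks the dummies with two extra
# (hypothetical) instrument columns, and W = (Z'Z)^{-1}
Z = np.column_stack([X, rng.normal(size=(J * T * M, 2))])
W = np.linalg.inv(Z.T @ Z)
obj = xi @ Z @ W @ Z.T @ xi
print(obj >= 0)   # True: a quadratic form in a positive-definite W
```

The first J coefficients of `beta_hat` play the role of the β_j in (13): they recover the product-level mean qualities that feed the supply side.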

This estimation routine presents a computational challenge: Unlike in BLP, where the idiosyncratic error term ensures positive market shares for all products at all potential parameter values, this model often predicts that some products will have 0 market share. This

11 The downside to this estimation strategy is that recovering the underlying utility coefficients requires an extra step of regressing the dummies on product characteristics – a regression that is only valid if the unobserved product characteristics are uncorrelated with observed product characteristics. In this context this means firms don’t know ξ when they choose product characteristics, an assumption that is unlikely to be true in my case. Fortunately, I’m not actually concerned with the coefficients on product characteristics – recovering the δ’s is sufficient for the supply side estimation and counterfactuals.

Trang 16

occurs when the mean utility levels become “un-ordered,” making some products dominated

by their neighbors When this happens, the gradient of the constraints for those productsare 0 and standard computational techniques no longer work However, for parameter val-ues where the marketshares are all non-0, this is a straightforward constrained maximizationproblem The key, then, is to get good starting values To do this, I implement a routine thatsmoothes out the marketshare constraints by viewing consumer’s choices probabilistically.12

I then use these as starting values for the analytical market share routine. I solve this as a mathematical program with equilibrium constraints (Su and Judd 2008) using the optimization routine KNITRO, where market shares are computed with quadrature. Appendix 2 details this process.
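Che-Lin Su's probabilistic formulation is unpublished, so the snippet below only illustrates the generic idea behind such smoothing: replace the deterministic choice rule, under which dominated products get exactly zero share and zero gradients, with a temperature-scaled probabilistic one, then shrink the temperature. All functional forms here are my own stand-ins:

```python
import numpy as np

def smoothed_shares(delta, prices, alpha=1.0, tau=0.1):
    """Choice probabilities for one consumer type with utility
    u_j = delta_j - alpha * p_j and an outside option with u_0 = 0.
    As tau -> 0 this approaches the deterministic argmax choice,
    under which dominated products receive exactly zero share."""
    u = np.append(0.0, delta - alpha * prices)   # prepend the outside good
    w = np.exp((u - u.max()) / tau)              # temperature-scaled softmax
    return (w / w.sum())[1:]                     # drop the outside-good share

delta = np.array([1.0, 2.0, 3.0])
p = np.array([0.5, 1.8, 2.0])
print(smoothed_shares(delta, p, tau=0.1))    # every share strictly positive
print(smoothed_shares(delta, p, tau=1e-4))   # nearly degenerate on product 3
```

Because every share stays strictly positive at moderate temperatures, gradient-based solvers remain well-behaved, and the solution can seed the exact (unsmoothed) constrained problem.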

Instruments other than the X's themselves are necessary both because I estimate the standard deviations of the random coefficients (requiring at least two extra instruments) and because firms choose prices knowing the demand shocks, leading to potential correlation between prices and the unobservables.

Price correlation is less of a problem than in demand system estimation without product dummies because the unobservable here is not the mean unobservable quality level (which I estimate as part of the δ's) but rather its idiosyncratic deviations.

The instruments I use come from the downstream data and consist of products that are bundled in the computer systems that OEMs sell. Specifically, I use hard drive size, amount of RAM, screen size, presence of a discrete video card, and the version of the installed operating system. These are all characteristics with clear ordinal rankings where consumers prefer more to less. Firms tend to bundle higher-priced chips with better characteristics, creating the necessary correlation between the instruments and price. Furthermore, decisions about which products to bundle are made by the OEM at much lower frequency than the frequency with which CPU prices change, making it highly plausible that the instruments

12 This probabilistic formulation was developed by Che-Lin Su in currently unpublished work. I thank him for sharing it with me.


are uncorrelated with the unobserved demand shocks.

Table 1 presents the results of the demand estimation routine. The left-hand panel lists the estimated quality for a selection of chips that were available at the end of 2006 and in early 2007.13 Because the mean of the price coefficient is essentially 1, the coefficients can roughly be thought of as the average consumer's willingness to pay (relative to the outside option) for each chip. Using estimated mean quality levels (as opposed to performance comparisons tied to CPU clockspeed) allows for easy comparison of chips across generations and between firms. I've sorted the table within firm in a manner that corresponds with my a priori view of how these chips should be ranked.14 For the most part, the estimated coefficients line up with this informal ranking.15 The right-hand panel of Table 1 shows a selection of month dummies, the country dummies, and the estimates of the standard deviations of the random coefficients. This industry is highly seasonal, resulting in month dummies that are higher in December than in the rest of the year.

[Table 1 about here.]

Supply side competition is assumed to proceed in two stages. In a first stage, firms choose product qualities, paying a sunk cost for moving products from their previous spot (products are moved by introducing a new product and taking out an old one that was at the same price level). In a second stage, firms compete in prices taking quality choices as given.

13 I list these as illustrative examples that come out of the estimation. A table listing the quality levels of all 91 chips in my data would not add much and would significantly detract from the readability of the table.

14 That is, chips are sorted by generation, with better generations first, and by product number within generation, because product numbers often give a rough idea of which products the CPU companies themselves consider better.

15 It is somewhat interesting to see that the high-end products from lower chip lines are sometimes valued higher than the low-end chips from higher chip lines (compare, for example, the Celeron 360 and the Pentium 4 531).


In reality, product quality choices are dynamic. Because products live for more than one period, their placement today affects sunk costs that would need to be paid tomorrow should they be moved. I follow most of the literature in assuming a two-period model for a number of reasons. First, the main point of this paper concerns ways in which competition interacts with product placement decisions, and dynamics are not necessary to understand these incentives. Second, in order to explore changes in market primitives it is necessary to solve counterfactuals under different circumstances. Running these counterfactuals as full dynamic games is computationally infeasible because the state space increases with the number of products. My approach is to solve the full game as a two-period model and explore how the game shifts with dynamics using a more limited number of products. I note in footnotes throughout where I have also computed dynamic versions of the problem. This section starts by laying out the second-stage pricing game and discusses marginal costs (which fall out of the pricing game). Then, the first-stage quality choice game is formalized and used as the basis of the sunk cost estimation routine.

Firms are assumed to make pricing decisions simultaneously with full knowledge of the structural error terms. Because of the horizontal heterogeneity embedded in the demand system, a product from AMD and a product from Intel with exactly the same observable and unobservable characteristics will not be perfect substitutes.

Taking qualities and the number of products as given, the profit maximization problem for firm 1 is:16

$$\max_{p_{11},\ldots,p_{J_1 1}} \; \sum_m \sum_{j=1}^{J_1} \left(p_{j1} - C_{j1}\right) S_{mj1}(p,\delta) \qquad (14)$$

16 The p's and C's have product but not country subscripts, and the problem for the firm maximizes over the prices of all of its products jointly.

This yields a first order condition of:

$$\left(p_{j1} - C_{j1}\right) \frac{\partial S_{mj1}(p,\delta)}{\partial p_{j1}} + S_{mj1}(p,\delta) = 0 \qquad (15)$$

The equilibrium is a fixed point in p for the two firms.
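For intuition about this fixed point, here is a small stand-in computation: two multiproduct firms facing multinomial-logit demand (not the paper's vertical-plus-horizontal system), where each firm's first order conditions imply a uniform markup over its own products, iterated to convergence. The qualities, costs, and ownership below are invented for illustration:

```python
import numpy as np

# Invented example: two multiproduct firms under logit demand.
delta = np.array([1.0, 2.0, 1.5, 2.5])   # mean qualities
cost  = np.array([0.3, 0.8, 0.5, 1.0])   # marginal costs
owner = np.array([0, 0, 1, 1])           # firm 0 owns chips 0-1, firm 1 owns 2-3
alpha = 1.0                              # price coefficient

def shares(p):
    w = np.exp(delta - alpha * p)        # outside good has utility 0
    return w / (1.0 + w.sum())

def foc_prices(p):
    """Multiproduct-logit FOCs: each firm sets a uniform markup
    1 / (alpha * (1 - S_f)) over cost, where S_f is the firm's
    combined share of its own products."""
    s = shares(p)
    p_new = np.empty_like(p)
    for f in (0, 1):
        m = owner == f
        p_new[m] = cost[m] + 1.0 / (alpha * (1.0 - s[m].sum()))
    return p_new

p = cost + 1.0
for _ in range(2000):                    # damped fixed-point iteration
    p = 0.5 * p + 0.5 * foc_prices(p)
print(np.round(p, 3))                    # equilibrium prices for both firms
```

The damping (averaging the old and updated prices) is a simple way to tame the oscillation that undamped best-response iteration can produce.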

The own price derivative for product j is given by:

This is a very natural equation. The first two terms quantify the consumers that are lost to the products above and below in the product space. However, not all consumers are lost: only those that were already purchasing products from firm 1. If all consumers purchased from firm 1, so that G(ν̃(p, δ, Δj1)|Δj1) = 0, then the model collapses back to the vertical model. The third term quantifies consumers that are lost to firm 2 through a change in ν̃. Notice that, while only consumers at the boundary of indifference between products are lost to neighboring products, consumers throughout the spectrum are lost to firm 2.

Because neighboring products are also owned by firm 1, losing consumers above and below is internalized by the firm. It is straightforward to compute the cross-price derivatives in the same way.


4.1.1 Marginal Costs

Marginal costs are assumed to follow a Markov process. A chip's cost in any given period is a function of last period's cost and an idiosyncratic error term:

$$C_{jt} = \rho\, C_{j,t-1} + \omega_{jt} \qquad (18)$$

Given the demand parameters, the first order conditions (equation 15) define a set of non-linear equations that can be used to back out the marginal cost of a chip in any period.17 Estimation begins by solving these systems of equations period by period, giving cost estimates for each chip in each period.18 These estimated costs are then regressed on last period's costs to get an estimate of ρ.

Figure 6 graphs estimated marginal costs for AMD and Intel for two months: May 2006 and May 2007. At the low end of the product spectrum, costs for AMD and Intel are relatively similar, but AMD's costs rise faster with quality than Intel's, resulting in significantly higher costs for producing high quality chips. This asymmetry in costs is a large part of Intel's competitive advantage. Between May 2006 and May 2007, costs for both companies fell. These reductions came from the introduction of new technology that allowed higher quality chips to be produced at lower cost.

[Figure 6 about here.]

Using estimates of marginal costs across all months, I can run equation 18 to get an estimate of ρ. Doing this yields a reasonable value of .9102 with a standard error of .04. This indicates that marginal costs are falling over time, consistent with the earlier graph.
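The two steps — inverting the first order conditions for costs, then regressing recovered costs on their lags — can be sketched as follows. The logit inversion and the simulated cost path are my own illustrative stand-ins, not the paper's estimates:

```python
import numpy as np

# Step 1: invert a pricing FOC for marginal cost.  With logit demand the
# own-price derivative is dS/dp = -alpha * s * (1 - s), so the FOC
# (p - c) * dS/dp + s = 0 rearranges to c = p - 1 / (alpha * (1 - s)).
def implied_cost(p, s, alpha=1.0):
    return p - 1.0 / (alpha * (1.0 - s))

# Step 2: given per-period cost estimates, estimate rho by regressing
# C_t on C_{t-1} with no intercept, matching equation 18's form.
rng = np.random.default_rng(1)
rho_true = 0.9
c = [5.0]
for _ in range(400):
    c.append(rho_true * c[-1] + 0.05 * rng.standard_normal())
c = np.array(c)
rho_hat = (c[:-1] @ c[1:]) / (c[:-1] @ c[:-1])
print(round(rho_hat, 3))   # close to the true 0.9
```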

17 While Caplin and Nalebuff (1991) show a unique pricing equilibrium for this class of games with single-product firms, as far as I know there is no equivalent proof for multi-product firms. The potential for multiple equilibria does not affect the estimation because, if there were multiple equilibria, the routine would simply pick out the one that is played in the data. However, this could be more problematic in the sunk cost estimation (where I compute counterfactual pricing equilibria). I fall back on the standby of computing equilibria for a wide variety of starting values and see that they always converge to the same point, leading me to hypothesize that, at least at the estimated demand parameters, the equilibrium to this pricing game is unique.

18 In this industry, lower quality chips are sometimes the byproduct of higher quality chips. Because of variation in the production process, chips designed to be high-performance sometimes fail quality checks but can be salvaged and used at lower performance levels. To the extent that firms take this process into account when pricing, my marginal cost estimates will accommodate this behavior. Indeed, it is one of the advantages of estimating marginal costs.


4.2 First Stage: Quality Choices

I extend the game back to a first stage where firms choose quality levels and the number of products to be offered. When deciding whether to introduce a product of a given quality, firms have at least three things to consider: 1) every product introduction incurs a sunk cost; 2) if they introduce a product with characteristics similar to the competition's, then markups will be lower due to closer substitution patterns; and 3) firms would like to use quality to second-degree price discriminate. Factors 1 and 2 push the firm toward having fewer products, located in spaces uninhabited by the competition. Factor 3 pushes toward having a broad, closely-spaced product line. The counter-balancing of these forces determines the equilibrium.

The formal problem for firm 1 is laid out in equations 19 and 20:

$$\max_{\delta_{1t}} \; E\left[\pi_1(\delta_t, c_t, \tilde{\xi}_t)\right] - SC(\delta_{(t-1,t)}) \qquad (19)$$
$$\max(\delta_{1t}) \leq \bar{\delta}_{1t} \qquad (20)$$

The frontier quality available to firm 1 in every period is given by δ̄1t. I assume it evolves exogenously and that firms are aware of its evolution. Since firms make quality choices without knowledge of the structural error terms, the expectation operator is over the realization of demand and cost shocks.19 Firms are forced to pay a sunk cost (SC) to change chip qualities; this cost depends on their previous set of products in the market.

4.2.1 Sunk Cost Estimation

I use information on products that were introduced and potential products that were not introduced to estimate a sunk cost of product introduction. Consider a firm's decision to introduce a new product. If the firm introduces it, then it must have been the case that introducing it was more profitable than not introducing it and not paying the sunk cost. Similarly, if the firm decides not to introduce a product, then it must have been the case that the firm would have been worse off introducing the product and paying the sunk cost. These two optimality conditions allow me to implement an inequality estimator in the style of Pakes, Porter, Ho, and Ishii (2006).

19 See Eizenberg (2008) for a full discussion of the sample selection problems involved in estimating these kinds of models. I follow him in assuming that firms do not observe the per-period shocks when they make their product decisions.

Firms are assumed to pay a constant sunk cost for introducing a new product, irrespective of where in the quality space that product is located. Letting δjp denote the location of products in the previous period, sunk costs are given by:

$$SC(\delta_{(t-1,t)}) = \theta \cdot \#\{j : \delta_{jt} \neq \delta_{jp}\} \qquad (21)$$

One side of the bound comes from looking at products that a firm could have moved (by adding a new product to the line and removing the old product) but decided not to. Had they moved the product, they would in expectation have increased profits but also incurred the sunk cost of product introduction, θ. Letting π(δ′j) represent profits from a product movement to any other position, inequality 22 must hold.

$$\theta \geq E\left[\pi(\delta'_j) - \pi(\delta_j)\right] \qquad (22)$$

The other side of the bound comes from looking at products that firms did indeed decide to move. In this case, they could have left the product in its old position and foregone the sunk cost θ. That they decided to move the product indicates that the firm expected to make more profit from the movement than from keeping the product in place. Inequality 23 formalizes this concept.

$$\theta \leq E\left[\pi(\delta_j) - \pi(\delta_{jp})\right] \qquad (23)$$

Firms make quality choices in expectation, without knowing the realization of demand and cost shocks. It is also assumed that firms make quality choices without knowing the quality choices that their competition will make. Let νsc denote the difference between profit expectations and realized profit. I assume that νsc is unobserved by both the econometrician and the firm.20 Denoting r(δj) as the estimate of observed profit that comes from the demand and cost estimates, the relationship between π(δj) and r(δj) is given by:

Using this procedure, I estimate that the sunk cost of introducing a product falls in the range of $1,236,000–$3,412,000. This is rather small compared to profits, but it is consistent with the idea that once a chip generation has been introduced, adding an additional product does not require much extra work.
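The two inequalities translate directly into an interval estimate for θ: every move the firm passed on yields a lower bound, and every move it made yields an upper bound. A sketch with invented profit differences (the paper's actual differences come from re-solving the pricing game at each counterfactual configuration):

```python
import numpy as np

# Invented expected profit gains E[pi(new) - pi(old)], in dollars.
gains_not_moved = np.array([0.4e6, 0.9e6, 1.1e6])  # moves the firm passed on
gains_moved     = np.array([3.6e6, 4.1e6, 5.0e6])  # moves the firm made

# Inequality 22: theta >= gain for every move not taken -> lower bound.
# Inequality 23: theta <= gain for every move taken     -> upper bound.
theta_lo = gains_not_moved.max()
theta_hi = gains_moved.min()
print(theta_lo, theta_hi)   # identified set for theta: [1.1e6, 3.6e6]
```

The estimator delivers a set rather than a point: any θ between the tightest lower and upper bounds is consistent with the observed introduction decisions.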

Putting together the estimates described above with the structure of the model allows me to construct counterfactuals that speak to the role that competition plays in this market.

20 Pakes, Porter, Ho, and Ishii allow for a second error that firms know but that is unobserved by the econometrician. With appropriate instruments, this can be included in the model. I don't allow for this second error, implicitly assuming that sunk costs are the same across firms and across time.


Key parts of the counterfactuals examine how consumer welfare changes under different scenarios. To be specific about this, consumer welfare is given by:

[Figure 7 about here.]

Moving from an oligopoly to a monopoly has (at least) three effects on consumer welfare: 1) if consumers have a taste for AMD's products, substituting to Intel's products or the outside good will cause a welfare loss for those consumers; 2) prices will be higher; and 3) monopolists have fewer incentives to introduce products into the market, and for the products that are introduced, monopolists will choose different quality levels.22 The goal of this counterfactual is to separate out the different components of the consumer welfare change and quantify the potential social loss or gain.

21 Because νi is assumed to be normally distributed and therefore has infinite support, the very tail ends of this distribution massively change the consumer surplus calculation. To prevent this, I define ν̲ and ν̄ as the inverse of the normal distribution at p = .001 and p = .999. In other words, I chop off the extreme tails of the νi distribution.
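The tail truncation in footnote 21 can be implemented by clipping taste draws at the .001 and .999 quantiles of the normal (z ≈ ±3.0902). The utility specification below is an invented stand-in, used only to show the mechanics of the surplus simulation:

```python
import numpy as np

rng = np.random.default_rng(2)

Z = 3.0902                       # standard normal quantile at p = .999
nu = rng.standard_normal(100_000)
nu = np.clip(nu, -Z, Z)          # chop off the extreme tails (footnote 21)

# Invented utilities: u_ij = delta_j * (1 + 0.2 * nu_i) - p_j, with an
# outside option worth 0; consumer surplus is the mean of the best choice.
delta = np.array([1.0, 2.0, 3.0])
p = np.array([0.5, 1.5, 2.6])
u = (1.0 + 0.2 * nu[:, None]) * delta[None, :] - p[None, :]
cs = np.maximum(u.max(axis=1), 0.0).mean()   # per-capita consumer surplus
print(round(cs, 3))
```

Without the clipping, a handful of extreme ν draws could dominate the surplus average, which is exactly the problem the footnote guards against.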

The first set of counterfactuals, detailed in Table 2, decomposes a shift to monopoly into the profit gains and welfare losses from each of these mechanisms. Removing AMD while fixing products and prices doesn't change profit (up 2.8%) or consumer welfare (down 1.2%) very much. The relatively small change in consumer welfare indicates that consumer taste for AMD products is not all that strong. Individuals who were purchasing AMD products suffer some loss from purchasing an Intel product or substituting to the outside option, but their lost utility is not large.

[Table 2 about here.]

Next, I solve the monopolist's profit maximization problem, keeping product qualities fixed at the competitive level. Figure 8 plots markups as a function of quality (δ). Prices rise, and Intel's profit goes to $562.2 million. Meanwhile, consumer welfare drops 51% to $791.3 million. The increase in prices is telling: it indicates that despite AMD's relatively small market share, this industry is quite competitive. Removing AMD from the market would allow Intel to significantly raise prices, leading to large consumer welfare losses.

[Figure 8 about here.]

5.1.1 Optimal Product Placement

Consistent with equation 19, a monopolist solves for optimal quality choices knowing the Markov process that costs follow (and the ρ parameter of that process) but without knowing the idiosyncratic period-by-period demand and cost shocks.

22 In theory, not all consumers are necessarily made worse off by a monopolist: if quality levels change such that consumer types are served that weren't served under oligopoly, or if prices fall on some products in response to monopolist segmentation, then welfare for those consumers could go up. In practice, it is highly unlikely that these gains would swamp the consumer welfare losses for other consumers, and with the structure that I impose I have not been able to find parameter values for which this happens. Of course, a monopolist will also be more profitable than an oligopolist, and that profit may offset the consumer welfare change, leading to greater social surplus.

• Simulate from the empirical demand and cost residuals.

• Regress costs on quality (along with quadratic and cubic terms) and use the estimated parameters to construct a mapping from quality to cost for any potential quality level. This smooths out the cost function and prevents lower quality products from being more expensive to make than higher quality products.

• Compute optimal prices by setting the first order conditions to 0 and solving the resulting system of nonlinear equations.

• Compute the profit function given the optimal prices, quality levels, and costs.

• Do this 1,000 times for each candidate quality vector and take the average across simulations.

• Choose a new candidate quality vector and repeat the above steps.

• Take the maximum over the candidate vectors.
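The loop above can be sketched end to end for a single-firm (monopoly) case. Everything here — the cubic cost mapping, the logit stand-in demand, the shock sizes — is invented for illustration, and the uniform-markup shortcut stands in for solving the paper's full first order condition system:

```python
import numpy as np

rng = np.random.default_rng(3)

def cost_of_quality(delta):
    """Smoothed quality-to-cost mapping (stand-in for the regression of
    estimated costs on quality and its polynomial terms)."""
    return 0.1 + 0.3 * delta + 0.02 * delta**3

def monopoly_profit(delta, xi, omega, alpha=1.0):
    """Profit at FOC-optimal prices under logit demand, where a
    monopolist charges a uniform markup 1/(alpha*(1 - S)) over cost."""
    c = cost_of_quality(delta) + omega
    p = c + 1.0
    for _ in range(200):                 # damped markup fixed point
        w = np.exp(delta + xi - alpha * p)
        s = w / (1.0 + w.sum())
        p = 0.5 * p + 0.5 * (c + 1.0 / (alpha * (1.0 - s.sum())))
    w = np.exp(delta + xi - alpha * p)
    s = w / (1.0 + w.sum())
    return float(((p - c) * s).sum())

def expected_profit(delta, n_sims=1000):
    """Average profit over simulated demand (xi) and cost (omega) shocks."""
    draws = [monopoly_profit(delta,
                             0.1 * rng.standard_normal(delta.size),
                             0.05 * rng.standard_normal(delta.size))
             for _ in range(n_sims)]
    return float(np.mean(draws))

candidate = np.array([1.0, 2.5])         # one candidate quality vector
print(round(expected_profit(candidate), 3))
```

Evaluating `expected_profit` over a grid or via a solver mimics the outer search over candidate quality vectors.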

In practice, the uncertainty changes the problem very little: the profit function as computed above for a given candidate vector is very similar to the profit computed by ignoring the uncertainty (assuming demand and cost shocks are zero). Thus, in practice, I first solve the problem as a constrained optimization problem without uncertainty by handing it off to the solver KNITRO, and then compute how the profit function changes with uncertainty by perturbing around the solution from the constrained optimization problem.

To examine how many products a monopolist would like to introduce, I compute the optimal products and consequent profits for N = 1 through N = 9. Table 3 lists these profits. It is immediately apparent from the table that, as long as sunk costs are of even moderate size, a monopolist will introduce very few products. Profits increase with each additional product, but at a very slow rate. Furthermore, the gains from re-optimizing over the 9 products are very small relative to the competitive offerings. Using the sunk cost estimates from above indicates that a monopolist will introduce between 1 and 3 products, compared to the 9 that exist in the data.

[Table 3 about here.]
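The product-count decision reduces to a simple stopping rule: keep adding products while the incremental variable profit covers the sunk cost θ. The profit figures below are invented to mimic the sharply concave pattern described in the text (they are not the actual Table 3 values):

```python
# Invented variable profits (in $ millions) by number of products N,
# increasing but sharply concave, as described in the text.
profits = {1: 540.0, 2: 548.0, 3: 551.0, 4: 552.0, 5: 552.6,
           6: 553.0, 7: 553.3, 8: 553.5, 9: 553.6}

def optimal_n(theta_millions):
    """Add products while the incremental profit covers the sunk cost."""
    n = 1
    while n + 1 in profits and profits[n + 1] - profits[n] >= theta_millions:
        n += 1
    return n

# With sunk costs in the estimated $1.2m-$3.4m range, only a few
# products are worth introducing:
print(optimal_n(1.236), optimal_n(3.412))   # -> 3 2
```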

Considering the problem from the social planner's perspective serves as a useful benchmark against which to compare the monopoly counterfactual and the oligopoly data. A social planner chooses the number of products for Intel and AMD up to the level where the consumer surplus gain from introducing a new product no longer exceeds the sunk cost of product introduction. Social welfare is maximized when prices are equal to marginal costs, and because marginal costs are assumed to be constant with respect to quantity at any given quality level, firm profits are equal to 0.

$$\max_{\delta_t} \; \left[CS(\delta_t, c_t, \tilde{\xi}_t)\right] - SC(\delta_{(t-1,t)}) \qquad (30)$$
$$\max(\delta_t) \leq \bar{\delta}_t \qquad (31)$$

Table 3 displays the social welfare gains from each product introduction. As in the monopoly case, most social welfare is generated by the introduction of the first product. Subsequent products generate additional social welfare, but the gains drop off quickly. A social planner would introduce more products (somewhere around 5) than a monopolist, but fewer than the competitive outcome.
