Louis Abraham

January 18, 2024

18.30-19.30 Paris / 12.30-13.30 NY time

LLMs: current state and outlook

Discussed two papers:

Large Language Models as Simulated Economic Agents: What Can We Learn from Homo Silicus?

by John J. Horton

LLaMA: Open and Efficient Foundation Language Models

by Touvron et al.


Giovanni Compiani

Chicago Booth

December 7, 2023

Demand Estimation with Text and Image Data

Abstract:

We propose a demand estimation method that allows researchers to estimate substitution patterns from unstructured image and text data. We first employ a series of machine learning models to measure product similarity from products’ images and textual descriptions. We then estimate a nested logit model with product-pair specific nesting parameters that depend on the image and text similarities between products. Our framework does not require collecting product attributes for each category and can capture product similarity along dimensions that are hard to account for with observed attributes. We apply our method to a dataset describing the behavior of Amazon shoppers across several categories and show that incorporating text and images in demand estimation helps us recover a flexible cross-price elasticity matrix.
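The first step described above, measuring product similarity from image or text embeddings, can be sketched in a few lines. This is a toy illustration only; the function name, embedding values, and dimensions are made up and are not from the paper:

```python
import numpy as np

def cosine_similarity_matrix(embeddings: np.ndarray) -> np.ndarray:
    """Return the matrix of pairwise cosine similarities between rows."""
    norms = np.linalg.norm(embeddings, axis=1, keepdims=True)
    unit = embeddings / norms          # normalize each embedding to unit length
    return unit @ unit.T               # dot products of unit vectors = cosines

# Three toy products with 4-dimensional embeddings (purely illustrative).
E = np.array([[1.0, 0.0, 0.0, 0.0],
              [1.0, 0.1, 0.0, 0.0],
              [0.0, 0.0, 1.0, 0.0]])
S = cosine_similarity_matrix(E)
# Products 0 and 1 are near-duplicates, product 2 is unrelated,
# so S[0, 1] is close to 1 while S[0, 2] is 0.
```

In the paper's framework a similarity matrix of this kind would then feed into the pair-specific nesting parameters of the nested logit.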


 

November 16 2023, 5pm Paris time / 11am NY time

Jeremy Large 

(Oxford University)

Estimating very large demand systems (joint with Joshua Lanier and John Quah)

Abstract: We present a discrete choice, random utility model and a new estimation technique for analyzing consumer demand for large numbers of products. We allow the consumer to purchase multiple units of any product and to purchase multiple products at once (think of a consumer selecting a bundle of goods in a supermarket). In our model each product has an associated unobservable vector of attributes from which the consumer derives utility. Our model allows for heterogeneous utility functions across consumers, complex patterns of substitution and complementarity across products, and nonlinear price effects. The dimension of the attribute space is, by assumption, much smaller than the number of products, which effectively reduces the size of the consumption space and simplifies estimation. Nonetheless, because the number of bundles available is massive, a new estimation technique, which is based on the practice of negative sampling in machine learning, is needed to sidestep an intractable likelihood function. We prove consistency of our estimator, validate the consistency result through simulation exercises, and estimate our model using supermarket scanner data.
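The negative-sampling idea mentioned in the abstract can be illustrated with a toy contrastive loss: rather than normalizing a likelihood over every possible bundle, one contrasts the utility score of the observed bundle against the scores of a few randomly drawn alternative bundles. A minimal sketch, with all names and values hypothetical and not the authors' estimator:

```python
import numpy as np

rng = np.random.default_rng(0)

def negative_sampling_loss(score_chosen, scores_negative):
    """Logistic loss that pushes the chosen bundle's score up and the
    sampled bundles' scores down, as in noise-contrastive estimation."""
    pos = -np.log1p(np.exp(-score_chosen))            # log sigmoid(s_chosen)
    neg = -np.sum(np.log1p(np.exp(scores_negative)))  # sum of log sigmoid(-s_neg)
    return -(pos + neg)

# Toy utility scores: one observed bundle, five sampled alternative bundles.
s_chosen = 2.0
s_negatives = rng.normal(-1.0, 0.5, size=5)
loss = negative_sampling_loss(s_chosen, s_negatives)
# The loss falls as the chosen bundle's score rises relative to the
# sampled bundles', so minimizing it never requires summing over the
# full (massive) set of bundles.
```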


 

October 19 2023, 5pm Paris time / 11am NY time

Andreea Enache 

(Stockholm School of Economics)

Congestion and Market Expansion: Timing of New Movie Releases in Paris Theaters
(joint with Christophe Bellégo)

Abstract: We use a unique dataset on movie ticket sales in French movie theaters to structurally estimate underlying seasonal movie demand, while accounting for competition effects, weather shocks, and seasonal sales promotion campaigns. We use a three-level nested logit that allows us to accurately estimate underlying demand by movie genre, and we find significant competition across movie genres. Moreover, we control for congestion in movie theaters and highlight a trade-off between demand expansion for movies and the business-stealing effect across movie genres due to the limited availability of screens in a theater. We use the estimated model to predict movie revenues under various scenarios and provide recommendations for optimal release timing. We also discuss the implications for theaters' strategy of diversifying their movie portfolios, given the estimated substitution patterns between genres.


 

June 28 2023, 5pm Paris time / 11am NY time

Pierre Bodéré 

(New York University)

Dynamic Spatial Competition in Early Education: an Equilibrium Analysis of the Preschool Market in Pennsylvania

Abstract: High-quality preschool is one of the most cost-effective educational interventions, yet the United States invests little in early childhood education. Recent policy discussions call for increasing preschool enrollment and raising the quality provided, especially for disadvantaged children, but equilibrium responses of private providers, which make up most of the market, generate trade-offs between these objectives. Supply expansion may lower incentives to invest in quality, and price responses to demand subsidies can increase the costs faced by non-subsidized parents. This paper develops a dynamic model of the preschool market to evaluate the effectiveness of policies at achieving these objectives. The model nests a static equilibrium model of spatial competition and preschool choice within a dynamic model of providers’ entry, exit, and quality investments. I estimate this model using data on the universe of child-care centers in Pennsylvania. I use the model to simulate the aggregate and distributional consequences of proposed approaches to early education expansion. I find that policies focused on expanding supply raise access but decrease the quality of the programs children attend, owing to parents’ valuation of proximity. Demand subsidies generate market expansion, but on their own do not create sufficient incentives for providers to invest in quality. Among the simulated policies, the most cost-effective at expanding high-quality enrollment combine demand subsidies targeted to low-income families with financial support to providers serving disadvantaged children. These policies increase access by reducing exit of providers, and expand high-quality enrollment for low-income children through subsidies. In addition, these targeted policies generate spillovers to the educational quality of non-targeted families by creating incentives for centers to invest in quality.


 

February 16 2023, 2pm Paris time / 8am NY time

Junjie Zhou 

(Tsinghua University)

Network games made simple (joint with Yves Zenou)

Abstract: Most network games assume that a player's best response is a linear function of her neighbors' actions; clearly, this is a restrictive assumption. We develop a theory of sign-equivalent transformations (SET) that captures the mathematical structure underlying the system of equations defining a Nash equilibrium. Applying this theory, we show that many network models in the existing literature, including those with nonlinear best responses, can be transformed into games with best-response potentials after appropriately restructuring the equilibrium conditions using SET. Our theory thus yields a unified framework that provides conditions for existence and uniqueness of equilibrium in most network games, with both linear and nonlinear best-response functions. We also provide novel economic insights both for existing network models and for the new ones we develop in this study.


November 24, 5pm Paris time / 11am NY time

Timothy Christensen 

(New York University)

Counterfactual Sensitivity and Robustness (joint with Benjamin Connault)

Abstract: We propose a framework for analyzing the sensitivity of counterfactuals to parametric assumptions about the distribution of latent variables in structural models. In particular, we derive bounds on counterfactuals as the distribution of latent variables spans nonparametric neighborhoods of a given parametric specification while other “structural” features of the model are maintained. Our approach recasts the infinite-dimensional problem of optimizing the counterfactual with respect to the distribution of latent variables (subject to model constraints) as a finite-dimensional convex program. We develop an MPEC version of our method to further simplify computation in models with endogenous parameters (e.g., value functions) defined by equilibrium constraints. We propose plug-in estimators of the bounds and two methods for inference. We also show that our bounds converge to the sharp nonparametric bounds on counterfactuals as the neighborhood size becomes large. To illustrate the broad applicability of our procedure, we present empirical applications to welfare analysis in matching models with transferable utility and dynamic discrete choice models.


December 15 2022, 5pm Paris time / 11am NY time

Marcin Pęski 

(University of Toronto)

Fuzzy Conventions

Abstract: We study binary coordination games with random utility played in networks. A typical equilibrium is fuzzy - it has positive fractions of agents playing each action. The set of average behaviors that may arise in an equilibrium typically depends on the network. The largest set (in the set inclusion sense) is achieved by a network that consists of a large number of copies of a large complete graph. The smallest set (in the set inclusion sense) is achieved in a lattice-type network. It consists of a single outcome that corresponds to a novel version of risk dominance that is appropriate for games with random utility.


November 17 2022, 5pm Paris time / 11am NY time

Ilse Lindenlaub

(Yale University)

Firm Sorting and Spatial Inequality (joint work with Ryungha Oh and Michael Peters)

Abstract: We study the importance of firm sorting for spatial inequality. If productive locations are able to attract the most productive firms, then firm sorting acts as an amplifier of spatial inequality. We develop a novel model of spatial firm sorting, in which heterogeneous firms first choose a location and then hire workers in a frictional local labor market. Firms’ location choices are guided by a fundamental trade-off: Operating in productive locations increases output per worker, but sharing a labor market with other productive firms makes it hard to poach and retain workers, and hence limits firm size. We show that sorting between firms and locations is positive—i.e., more productive firms settle in more productive locations—if firm and location productivity are complements and labor market frictions are sufficiently large. We estimate our model using administrative data from Germany and find that highly productive firms indeed sort into the most productive locations. In our main application, we quantify the role of firm sorting for wage differences between East and West Germany, which reveals that firm sorting accounts for 17%-27% of the West-East wage gap.

September 29 2022, 5pm Paris time / 11am NY time

Alexander Wolitzky 

(Massachusetts Institute of Technology)

Persuasion as Matching (joint with Anton Kolotilin and Roberto Corrao)

Abstract: In persuasion problems where the receiver’s action is one-dimensional and his utility is single-peaked, optimal signals are characterized by duality, based on a first-order approach to the receiver’s problem. A signal is optimal if and only if the induced joint distribution over states and actions is supported on a compact set (the contact set) where the dual constraint binds. A signal that pools at most two states in each realization is always optimal, and such pairwise signals are the only solutions under a non-singularity condition on utilities (the twist condition). We provide conditions under which higher actions are induced at more or less extreme pairs of states. Finally, we provide conditions for the optimality of either full disclosure or negative assortative disclosure, where signal realizations can be ordered from least to most extreme. Optimal negative assortative disclosure is characterized as the solution to a pair of ordinary differential equations.


August 25 2022, 5pm Paris time / 11am NY time

Susanne M. Schennach 

(Brown University)

Optimally-Transported Generalized Method of Moments (joint with Vincent Starck)

Abstract: We propose a novel optimal transport-based version of the Generalized Method of Moments (GMM). Instead of handling overidentified models by reweighting the data until all moment conditions are satisfied (as in Generalized Empirical Likelihood methods), this method proceeds by introducing measurement error of the least mean square magnitude necessary to simultaneously satisfy all moment conditions. This approach, based on the notion of optimal transport, aims to address the problem of assigning a logical interpretation to GMM results even when overidentification tests reject the null, a relatively common situation in empirical applications. We discuss the implementation of the method as well as its asymptotic properties and illustrate its usefulness through examples.


April 7 2022, 5pm Paris time / 11am NY time

Jacob Leshno 

(University of Chicago Booth School of Business)

Price Discovery in Waiting Lists:
A Connection to Stochastic Gradient Descent (joint with Itai Ashlagi, Pengyu Qian and Amin Saberi)

Abstract: Waiting lists offer agents a choice among types of items and associated non-monetary prices given by required waiting times. These non-monetary prices are endogenously determined by a simple tâtonnement-like price discovery process: an item’s price increases when an agent queues for it, and decreases when an item arrives and a queuing agent is assigned. By drawing a connection between price adjustments in waiting lists and the stochastic gradient descent optimization algorithm, we show that the waiting list mechanism achieves the optimal allocative efficiency minus a loss due to price fluctuations that is bounded by the granularity of price changes. We further consider a price discovery process inspired by the waiting list mechanism and show that this simple price discovery process performs well if the granularity of price changes is chosen to appropriately trade off the speed of price adaptation and the loss from price fluctuations.
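The tâtonnement-like adjustment described above can be mimicked in a short simulation: an item's waiting time ("price") rises by a fixed increment when an agent queues for it and falls when an item arrives and is assigned, so the price behaves like a stochastic-gradient step toward the level where demand meets supply. A toy sketch under made-up demand and supply rates, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(1)

def demand_prob(price):
    # Hypothetical logit probability that an agent arrives and is
    # willing to queue at the current waiting time; illustrative only.
    return 1.0 / (1.0 + np.exp(price - 2.0))

supply_prob = 0.3   # chance an item arrives and is assigned each period
eta = 0.01          # granularity of price changes (the step size)

price = 0.0
for _ in range(200_000):
    if rng.random() < demand_prob(price):
        price += eta    # an agent queues: the price rises
    if rng.random() < supply_prob:
        price -= eta    # an item is assigned: the price falls

# The price hovers around the level where demand_prob(price) equals
# supply_prob (about 2.85 here), fluctuating in a band whose width is
# governed by eta, echoing the granularity trade-off in the abstract.
```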


March 31 2022, 5pm Paris time / 11am NY time

Sergio Ocampo

(Western University)

A Task-Based Theory of Occupations with Multidimensional Heterogeneity

Abstract: I develop an assignment model of occupations with multidimensional heterogeneity in production tasks and worker skills. Tasks are distributed continuously in the skill space, whereas workers have a discrete distribution with a finite number of types. Occupations arise endogenously as bundles of tasks optimally assigned to a type of worker. The model allows us to study how occupations respond to changes in the economic environment, making it useful for analyzing the implications of automation, skill-biased technical change, offshoring, and workers’ training. I characterize how wages, the marginal product of workers, the substitutability between worker types, and the labor share depend on the assignment of tasks to workers. I introduce automation as a choice of the optimal size and location of a mass of identical robots in the task space. Automation displaces workers by replacing them in the performance of tasks. This generates a cascading effect on other workers as the boundaries of occupations are redrawn.

February 17 2022, 5pm Paris time / 11am NY time

Florian Gunsilius

(University of Michigan)

 

Matching for causal effects via multimarginal optimal transport (joint with Yuliang Xu)

Abstract: Matching on covariates is a well-established framework for estimating causal effects in observational studies. The principal challenge in these settings stems from the often high-dimensional structure of the problem. Many methods have been introduced to deal with this challenge, with different advantages and drawbacks in computational and statistical performance and interpretability. Moreover, the methodological focus has been on matching two samples in binary treatment scenarios, but a dedicated method that can optimally balance samples across multiple treatments has so far been unavailable. This article introduces a natural optimal matching method based on entropy-regularized multimarginal optimal transport that possesses many useful properties to address these challenges. It provides interpretable weights of matched individuals that converge at the parametric rate to the optimal weights in the population, can be efficiently implemented via the classical iterative proportional fitting procedure, and can even match several treatment arms simultaneously. It also possesses demonstrably excellent finite sample properties.
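The entropy-regularized optimal transport underlying the method can be computed with the classical iterative proportional fitting (Sinkhorn) procedure the abstract mentions. Below is a minimal two-marginal sketch on toy data; the paper's method is multimarginal, and this is an illustration rather than the authors' implementation:

```python
import numpy as np

def sinkhorn(cost, a, b, eps=0.1, n_iter=500):
    """Entropy-regularized transport plan between marginals a and b
    via alternating row/column rescaling (iterative proportional fitting)."""
    K = np.exp(-cost / eps)       # Gibbs kernel of the cost matrix
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)         # scale columns to match marginal b
        u = a / (K @ v)           # scale rows to match marginal a
    return u[:, None] * K * v[None, :]

# Toy example: match 3 treated units to 3 control units on one covariate.
x_treated = np.array([0.0, 1.0, 2.0])
x_control = np.array([0.1, 0.9, 2.2])
C = (x_treated[:, None] - x_control[None, :]) ** 2   # squared distance cost
plan = sinkhorn(C, np.full(3, 1 / 3), np.full(3, 1 / 3))
# The rows and columns of `plan` sum to the marginals, and most mass sits
# near the diagonal because similar covariate values are cheap to match;
# the plan's entries serve as interpretable matching weights.
```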


January 6 2022, 5pm Paris time / 11am NY time

Pauline Corblet

Education Expansion, Sorting, and the Decreasing Education Wage Premium
(job market paper)

Abstract: This paper studies the interplay between worker supply and firm demand, and their effect on sorting and wages in the labor market. I build a model of one-to-many matching with multidimensional types in which several workers are employed by a single firm. Matching is dictated by worker preferences, their relative productivity in the firm, and substitution patterns with other workers. Using tools from the optimal transport literature, I solve the model and structurally estimate it on Portuguese matched employer-employee data. The Portuguese labor market is characterized by an increase in the relative supply of high school graduates, an increasingly unbalanced distribution of high school graduates versus non-graduates across industries, and a decreasing high school wage premium between 1987 and 2017. Counterfactual exercises suggest that both changes in worker preferences and the increasing relative productivity of high school graduates over non-graduates act as a mitigating force on the decreasing high school wage premium, but do not fully compensate for high school graduates’ rise in relative supply.


December 2 2021, 5pm Paris time / 11am NY time

Job Boerma

Sorting with Team Formation
(with Aleh Tsyvinski and Alexander P. Zimin)

Abstract: We fully solve an assignment problem with heterogeneous firms and multiple heterogeneous workers whose skills are imperfect substitutes, that is, when production is submodular. We show that sorting is neither positive nor negative and is characterized sufficiently by two regions. In the first region, mediocre firms sort with mediocre workers and coworkers such that output losses are equal across all these pairings (complete mixing). In the second region, high skill workers sort with a low skill coworker and a high productivity firm, while high productivity firms employ a low skill worker and a high skill coworker (pairwise countermonotonicity). The equilibrium assignment is also necessarily characterized by product countermonotonicity, meaning that sorting is negative for each dimension of heterogeneity with the product of heterogeneity in the other dimensions. The equilibrium assignment as well as wages and firm values are completely characterized in closed form. We illustrate our theory with an application to show that our model is consistent with the observed dispersion of earnings within and across U.S. firms. Our counterfactual analysis gives evidence that the change in the firm project distribution between 1981 and 2013 has a larger effect on the observed change in earnings dispersion than the change in the worker skill distribution.


September 23 2021, 1pm Paris time / 7am NY time

Nicola Rosaia

Duality and Estimation of Undiscounted Markov Decision Processes

Abstract: This paper studies estimation of undiscounted Markov decision processes (MDPs). Exploiting convex analytic methods, it argues that undiscounted MDPs can be treated as static discrete choice models over state-action frequencies, and it leverages this idea to derive a conjugate duality framework for studying this class of models. It then exploits this framework to draw implications in several directions. First, it characterizes the empirical content of undiscounted MDPs, analyzing how exclusion or parametric restrictions can yield identification of agents’ payoffs, and providing an axiomatic characterization of the undiscounted dynamic logit model. Second, it proves convergence of simple inversion algorithms based on progressive tâtonnements and investigates novel estimation strategies based on these. Finally, it shows that the dual framework extends to models with persistent fixed effects and to models where certain actions or states are unobserved.
