07 October 2019


Economic Decisions and Simon’s Notion of Bounded Rationality

 International Business Research, 11, 7, 2018, pp. 63-75.

Abstract: This paper focuses on Simon’s notion of bounded rationality, defined as the decision maker’s limited ability to behave in the way traditional rational choice theory assumes, owing to insufficient cognitive and computational capacity to process all the relevant information.

Keywords: bounded rationality, economic decisions, expected utility, global rationality, procedural rationality, satisficing behavior.

Decision making in economics has always been intertwined with the concept of rationality. However, neoclassical economic literature has been dominated by a specific notion of rationality, namely perfect rationality, characterized by the assumption of consistency and by the maximization hypothesis. Herbert Simon, over his long research career, questioned this concept of perfect or global rationality, proposing a different vision based on empirical evidence about individuals’ choices. He challenged the neoclassical theory of global rationality with his notion of bounded rationality, a satisficing (rather than optimizing) behavior, and with the relevance of procedural rationality for understanding decision makers’ processes of thought.

The concept of rationality is central to economics. This concept passed through various stages, from the strong version of rationality of the classical utilitarian economists to the weaker concept of revealed preference theory. However, economic literature has been dominated by the concept of rationality and its consistency feature, and by the maximization hypothesis. Herbert Simon is considered one of the fathers of behavioral economics and a pioneer of artificial intelligence. In his long research activity across many scientific fields, including economics, he challenged mainstream economics by postulating that “human rationality is bounded, due to external and social constraints, and internal and cognitive limitations”.

Simon developed the analysis of decision making in relation to both individuals and organizations. His theoretical contribution to the topic of economic decisions is the result of an interdisciplinary approach in which economics, psychology, cognitive science, and organizational theory interact, and his notion of bounded rationality became the central topic of this interaction. This paper focuses on Simon’s notion of bounded rationality, defined as the decision maker’s limited ability to behave in the way traditional rational choice theory assumes, owing to insufficient cognitive and computational capacity to process all the relevant information. Many other authors have adopted the label of bounded rationality to indicate some form of departure from rational choice theory. Simon, however, used the term to refer to a more simplified vision of human decision making, by which he linked psychological factors to the decision maker’s economic behavior and thus built his theoretical view on an empirical methodology. As a result, bounded rationality remains the hallmark of his theoretical contribution.

This paper analyzes in depth Simon’s behavioral model of rational choice. It shows that Simon’s theory of bounded rationality includes three important steps: search, satisficing, and procedural rationality. Simon’s bounded rationality theory explains the decisional processes that are adopted when it is not possible to choose the best alternative (i.e., the fully optimized solution), because of decision makers’ limits in terms of information, cognitive capacity, and attention, and because of the complexity of the environment in which decisions are made. In this environment, the individual searches for and tries to make decisions that are good enough (i.e., satisfactory) and that represent reasonable or acceptable outcomes. Bounded rationality is not a derivative concept, but a basic and primary notion for a positive theory of choice in behavioral terms, linking the economic and the psychological sphere. Moreover, in Simon’s studies the computational aspect is very important, since emotions, too, can be encapsulated in the computational theory. In the bounded rationality approach, Simon focuses not on the goal itself but on the process that leads to it. Hence, in this theoretical vision, the notion of procedural rationality becomes crucial. Finally, the paper offers an assessment of the notion of bounded rationality and its impact on economics and other social sciences. Despite its limited influence on mainstream economics, Simon’s bounded rationality has transformed decision making theory across literatures and has had a major impact on institutional economics and other social sciences.
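The search-and-satisficing step described above can be read as a simple stopping rule: examine alternatives sequentially and accept the first one that meets an aspiration level, rather than scanning them all for the maximum. The following sketch is purely illustrative — the alternatives, the utility function, and the aspiration level are hypothetical, not taken from Simon’s work:

```python
def satisficing_search(alternatives, utility, aspiration, max_evals=100):
    """Sequentially examine alternatives and stop at the first one whose
    utility meets the aspiration level (satisficing), instead of
    evaluating every alternative to find the best one (optimizing)."""
    examined = 0
    for alt in alternatives:
        examined += 1
        if utility(alt) >= aspiration:
            return alt, examined      # "good enough" -> stop searching
        if examined >= max_evals:
            break
    return None, examined             # no satisfactory option found

# Toy illustration: accept the first option scoring at least 0.8.
options = [0.3, 0.5, 0.9, 0.95, 0.7]
choice, n = satisficing_search(options, utility=lambda x: x, aspiration=0.8)
print(choice, n)  # -> 0.9 3 (0.95 is never examined)
```

The contrast with global rationality is that the procedure never compares 0.9 with the later, better 0.95: the quality of the outcome depends on the search order and the aspiration level, which is exactly the procedural point.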

23 May 2019


Modeling the Dependence of Conditional Correlations on Market Volatility

Journal of Business and Economic Statistics, 34, 2, 2016, pp. 254-268.

Abstract: Several models have been developed to capture the dynamics of the conditional correlations between series of financial returns, and several studies show that market volatility is a major determinant of correlations. We extend some models to include explicitly the dependence of the correlations on the volatility. The models differ in the way in which the volatility influences the correlations.

Keywords: Dynamic conditional correlations, Markov switching, Minimum variance portfolio, Model confidence set, Forecasting

It is well known that in financial markets, during turmoil periods characterized by strongly negative returns and weak macroeconomic indicators, both variances and correlations of assets increase; see, for example, Ang and Bekaert (2002), Forbes and Chinn (2004), and Cappiello et al. (2006). This presumably strong relationship between correlation and volatility can be employed to improve the forecasting ability of conditional correlation models. This approach is of particular interest for practitioners, since the possibility of improving correlation forecasts matters in portfolio choice, hedging, and option pricing, as well as in accounting for spillover effects between markets. Hence, the aim of this research is to examine whether the impact of volatility on correlations is statistically and economically significant, and whether it helps improve the forecasting performance of conditional correlation models, rather than to understand why correlations increase during some periods and less so, or not at all, during others.

We use a broad portfolio of models to capture in different ways the dependence of the conditional correlations of a set of financial time series on the market volatility or on its regime.

In particular, we extend the Dynamic Conditional Correlation (DCC) model of Engle (2002) in different ways: by including the volatility (or a variable measuring its regime) as an additive independent variable, or by including its effect through time-varying coefficients in the model. We use similar extensions of the Tse and Tsui (2002) dynamic correlation model and the Dynamic Equi-Correlation (DECO) model of Engle and Kelly (2012). The dependence relation is also modeled by extending the Regime Switching Dynamic Correlation (RSDC) model of Pelletier (2006) to include the effect of the volatility (or its regime) in the transition probabilities. The influence of volatility or its regime on the correlations is contemporaneous (instead of lagged). To implement this idea, we construct one-step-ahead forecasts of the volatility (or its regime) as the additional variable to include in the existing models, through linear or nonlinear, and direct or indirect, effects.
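To fix ideas, one possible form of the “additive” extension (a sketch of the general idea, not necessarily the exact specification estimated in the paper) augments the standard DCC quasi-correlation recursion of Engle (2002) with a one-step-ahead volatility forecast $\hat{v}_{t|t-1}$:

```latex
Q_t = (1-\alpha-\beta)\,\bar{Q}
      + \alpha\,\varepsilon_{t-1}\varepsilon_{t-1}'
      + \beta\,Q_{t-1}
      + \gamma\,\hat{v}_{t|t-1}\,\bar{Q},
\qquad
R_t = \operatorname{diag}(Q_t)^{-1/2}\, Q_t\, \operatorname{diag}(Q_t)^{-1/2},
```

where $\varepsilon_t$ are the standardized residuals, $\bar{Q}$ is their unconditional covariance, and $\gamma$ measures the marginal contemporaneous effect of market volatility on the correlations; $\gamma = 0$ recovers the standard DCC model, so the volatility effect can be tested as a parameter restriction.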

Our approach is related to the factor ARCH model of Engle et al. (1990), where the correlations between asset returns implied by that model depend not only on their betas but also on the time-varying conditional variance of the market return. Our approach can be viewed as a reduced-form one that does not involve the asset betas, since we let the conditional correlations be directly functions of the volatility or its regime.

The models are applied to two data sets. A detailed analysis is provided for a case with three assets, in order to illustrate the main characteristics of the proposed models and the results. We then extend the analysis to a data set consisting of the thirty assets composing the Dow Jones industrial index. The model comparisons are performed using statistical approaches, such as hypothesis tests, information criteria, and the model confidence set (MCS) method of Hansen et al. (2003). They are also done using an economic loss function, namely the minimum variance portfolio approach as in Engle and Colacito (2006), and through the evaluation of the economic significance of the volatility effect on correlations in the different models. Monte Carlo simulations are used to study the properties of some of the employed methods in the presence of model uncertainty. We mainly find that:

  1. The correlations are subject to changes in regime and are sensitive both to the level of volatility and to the regime of volatility (high or low), in particular in terms of gains in minimum portfolio variance;
  2. Among the considered models that incorporate a volatility effect, those that do it through the regime variable allow us to find significant marginal impacts of market volatility on correlations;
  3. If we make a distinction between long-run and short-run correlations, the volatility affects the long-run ones, rather than the short-run ones;
  4. The volatility, or its regime, does not improve the forecasts of the correlations.
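The minimum variance portfolio criterion used in the comparisons above has a simple closed form: given a forecast covariance matrix Σ, the global minimum variance weights are w = Σ⁻¹ι / (ι′Σ⁻¹ι), and better covariance (hence correlation) forecasts yield a lower realized portfolio variance. A self-contained sketch, with toy numbers rather than estimates from the paper, and plain Python so no external libraries are assumed:

```python
def solve(A, b):
    """Solve A x = b by Gauss-Jordan elimination (A small and well-conditioned,
    as a 3x3 covariance matrix is here)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))  # partial pivoting
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c and M[r][c]:
                f = M[r][c] / M[c][c]
                M[r] = [a - f * x for a, x in zip(M[r], M[c])]
    return [M[i][n] / M[i][i] for i in range(n)]

def min_variance_weights(cov):
    """Global minimum variance portfolio: w proportional to inverse(cov) * 1,
    rescaled so the weights sum to one."""
    raw = solve(cov, [1.0] * len(cov))
    s = sum(raw)
    return [w / s for w in raw]

# Illustrative 3-asset covariance matrix (made-up numbers).
cov = [[0.04, 0.01, 0.00],
       [0.01, 0.09, 0.02],
       [0.00, 0.02, 0.16]]
w = min_variance_weights(cov)
pvar = sum(w[i] * cov[i][j] * w[j] for i in range(3) for j in range(3))
print([round(x, 3) for x in w], round(pvar, 4))
```

In the model comparison, each conditional correlation model supplies its own one-step-ahead Σ forecast; the model whose weights deliver the lowest realized portfolio variance wins under this economic loss function.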


Ang, A., and Bekaert, G. (2002), “International Asset Allocation With Regime Shifts,” Review of Financial Studies, 15, 1137–1187.

Cappiello, L., Engle, R. F., and Sheppard, K. (2006), “Asymmetric Dynamics in the Correlations of Global Equity and Bond Returns,” Journal of Financial Econometrics, 4, 537–572.

Engle, R. F. (2002), “Dynamic Conditional Correlation: A Simple Class of Multivariate Generalized Autoregressive Conditional Heteroskedasticity Models,” Journal of Business and Economic Statistics, 20, 339–350.

Engle, R. F., and Colacito, R. (2006), “Testing and Valuing Dynamic Correlations for Asset Allocation,” Journal of Business and Economic Statistics, 24, 238–253.

Engle, R. F., and Kelly, B. (2012), “Dynamic Equicorrelation,” Journal of Business and Economic Statistics, 30, 212–228.

Engle, R. F., Ng, V., and Rothschild, M. (1990), “Asset Pricing With a Factor ARCH Covariance Structure: Empirical Estimates for Treasury Bills,” Journal of Econometrics, 45, 213–237.

Forbes, K. J., and Chinn, M. D. (2004), “A Decomposition of Global Linkages in Financial Markets Over Time,” The Review of Economics and Statistics, 86, 705–722.

Hansen, P. R., Lunde, A., and Nason, J. (2003), “Choosing the Best Volatility Models: The Model Confidence Set Approach,” Oxford Bulletin of Economics and Statistics, 65, 839–861.

Pelletier, D. (2006), “Regime-Switching for Dynamic Correlation,” Journal of Econometrics, 131, 445–473.

Tse, Y. K., and Tsui, A. K. C. (2002), “A Multivariate GARCH Model With Time-Varying Correlations,” Journal of Business and Economic Statistics, 20, 351–362.

10 May 2019 


Business models for developing smart cities. A fuzzy set qualitative comparative analysis of an IoT platform

Technological Forecasting and Social Change, 2019, vol. 142, pp. 183-193.

Abstract: Which configurations of Business Model (BM) exist in an IoT platform aiming at smart cities’ development? We argue that BM configurations have general characteristics beyond individual firms’ unique traits. Our empirical findings (based on a fuzzy set qualitative comparative analysis) show BM’s causal complexity and reveal the most frequent patterns of association among value propositions and BM’s building blocks.

Keywords: Smart cities, Internet of things, Technology platform, Business model, Qualitative comparative analysis 

During the last two decades, the number of projects focusing on smart cities launched worldwide has constantly increased. The common trait of such projects is that they exploit the opportunities offered by innovative Information Technology (IT) solutions (and especially Internet of Things — IoT — technology) to provide better and more sustainable living conditions to citizens. As such, most of the attention has been devoted to the technological aspects of these projects. A smart cities project usually consists of a set of IT devices that exchange information among themselves within a common technology platform. Different actors (both private enterprises and public organizations) participate in this complex ecosystem, and the integration and coordination of their activities represent a major challenge for any project.

Although the technological aspects related to the functioning of the system play a key role, the strategic actions of firms involved in the implementation of smart cities projects have to be properly investigated as well. As in the case of any emerging technology, firms struggle to find the best way to exploit the new market opportunities, seeking the best configuration of resources and capabilities to design products and services that satisfy customer needs. In turn, they need to design and adopt proper and innovative Business Models (BMs) suited to the specificities of smart cities projects.

The term “business model” gained popularity in the late 1990s, spreading from e-commerce to a variety of empirical contexts. It is conceived as a conceptual tool or model able to capture how firms generate and deliver value to customers, entice customers to pay for value, and convert those payments into profit. Since its original formulation, the literature on BM has constantly grown. However, despite the number of research papers exploring BM over the last two decades, structured research on BM associated with smart cities projects remains scarce. In particular, theory-building work and empirical research beyond single-case studies are lacking.

Starting from this gap, this study addresses the following research question: What different configurations of BM exist in an Internet of Things (IoT) platform that aims at developing smart cities projects? Indeed, while a BM shows path dependency and is the result of a firm’s own history, BM configurations have general characteristics beyond the settings of individual firms. Therefore, the analysis of the BMs that firms may adopt to exploit smart cities projects should focus on the best configurations of resources and activities.

In order to do so, we use a fuzzy set qualitative comparative analysis (fsQCA), which combines within-case analysis with formalized, systematic cross-case comparisons. In detail, fsQCA has the potential to dig deeper into configurations, such as BMs, to understand (1) what different types of cases may occur in a given setting by considering their similarities and differences, and (2) the complex causal relations underlying the emergence of the outcome of interest.
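Two standard descriptive measures sit behind such an analysis: the consistency with which a causal configuration is a fuzzy subset of the outcome, and the coverage of the outcome by that configuration (both in Ragin’s formulation). The sketch below implements them; the firms, the configuration, and the membership scores are entirely hypothetical:

```python
def consistency(cause, outcome):
    """Consistency of 'cause is a fuzzy subset of outcome':
    sum of min(x_i, y_i) divided by sum of x_i (Ragin's measure)."""
    num = sum(min(x, y) for x, y in zip(cause, outcome))
    den = sum(cause)
    return num / den if den else 0.0

def coverage(cause, outcome):
    """Share of the outcome accounted for by the causal configuration:
    sum of min(x_i, y_i) divided by sum of y_i."""
    num = sum(min(x, y) for x, y in zip(cause, outcome))
    den = sum(outcome)
    return num / den if den else 0.0

# Hypothetical membership scores of four firms in a BM configuration
# (e.g., "customized offering AND customers as key partners") and in
# the outcome (success of the smart-city solution).
cause   = [0.9, 0.8, 0.3, 0.6]
outcome = [1.0, 0.7, 0.4, 0.8]
print(round(consistency(cause, outcome), 3),
      round(coverage(cause, outcome), 3))
```

High consistency with reasonable coverage is what identifies a configuration as a recurrent, causally relevant pattern rather than an idiosyncrasy of a single case.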

We apply this methodological approach to a setting composed of 21 Small and Medium Enterprises (SMEs) that have taken part in an EU-funded accelerator (named FIWARE) focused on smart cities. Applying fsQCA to data collected on firms’ activities and strategic goals, we explore which types of BM can be successfully adopted by firms that exploit the potential of a novel IoT platform to develop smart cities solutions. In turn, we isolate the relationships among the building blocks that cause the emergence of those specific BMs.

Results of our study offer several relevant implications for both practice and theory. As for the former, on the one side, firms that intend to develop smart cities projects should offer customized products or services and treat customer capabilities as their main key resource. Additionally, our findings encourage firms and startups to pair customer capabilities with customer application development as key activities and with customers as main partners. On the other side, we show that no consistent pattern is associated with “standardized products and services”, most likely because a dominant design has not emerged yet and customization is therefore compulsory.

As for theory, the contribution that this study offers to the BM literature is twofold.

On the one side, during the last decades prior research has shown that firms may benefit from collaborations with external partners by allowing the in-flow of external technologies and technological competences. Indeed, external technologies may be integrated with the internal technological base to generate new products and services and enhance the firm’s ability to create value. In the case of IoT or smart-city technology platforms, firms’ BMs have to be adapted to achieve advantages for both technology suppliers and technology users. Specifically, we argue that multiple BMs can coexist within technology platforms: BMs adopted by Platform Developers and BMs adopted by Platform Users. For Platform Users, if upstream operators have made the platform general enough, the adaptation cost that downstream software developers incur to apply the general-purpose technology (GPT) to their specific application is expected to be lower than the cost the same developers would incur to develop the applications fully in-house, were the GPT platform not present. The story of the technology platform described in this study can be interpreted in this sense: when an industry is organized around an IoT and smart-city technology platform, downstream operators also have incentives to adopt a BM that is open to collaboration with external providers.

On the other side, we suggest that platform users do not necessarily have to adopt a similar BM. In fact, several configurations of resources and activities may coexist, each guaranteeing success to the firms adopting it. This result extends prior literature on BMs applied to smart cities and the IoT. We also show that not all configurations of building blocks allow firms to benefit from the opportunities offered by the emerging field of smart cities: a proper coherence among key resources, key activities, and key partners is crucial in this respect.

8 May 2019


A Pseudo-Market Approach to Allocation with Priorities

American Economic Journal: Microeconomics, vol. 10, n. 3, August 2018, pp. 272-314.

Abstract: We propose a pseudo-market mechanism for no-monetary-transfer allocation of indivisible objects based on priorities such as those in school choice. Agents are given token money, face priority-specific prices, and buy utility-maximizing random assignments. The mechanism is asymptotically incentive compatible, and the resulting assignments are fair and constrained Pareto efficient. Hylland and Zeckhauser’s (1979) position-allocation problem is a special case of our framework, and our results on incentives and fairness are also new in their classical setting. (JEL D63, D82, H75, I21, I28).

Keywords: Allocation Problem, Cardinal Preferences, Pseudo-market, Priorities.

The aim of this paper is to study the allocation of indivisible objects where monetary transfers are precluded and agents demand at most one object. Examples include student placement in public schools (where an object corresponds to a school seat and each object has multiple copies) and allocation of work or living space (where each object has exactly one copy). A common feature of these settings is that agents are prioritized. For instance, students who live in a school’s neighborhood or have siblings in the school may enjoy admission priority at this school over those who do not, and the current resident may have priority over others in the allocation of the dormitory room he or she lives in. Due to the lack of monetary transfers, objects in these environments are very often allocated by a centralized mechanism that maps agents’ reported preferences to an allocation outcome. The outcome, known as an assignment, can be either deterministic or random. The former dictates who gets what object, and the latter prescribes the probability shares of objects that each agent obtains and is thus a lottery over a set of deterministic assignments. The standard allocation mechanisms used in practice and studied in the literature are ordinal: students are asked to rank schools or rooms, and the profile of submitted rankings determines the assignment. However, Miralles (2008) and Abdulkadiroğlu, Che, and Yasuda (2011) pointed out that we may implement Pareto-dominant assignments by eliciting agents’ cardinal utilities, which are their relative intensities of preferences over objects and their rates of substitution between probability shares in objects. Furthermore, Liu and Pycia (2012) and Pycia (2014) showed that sensible ordinal mechanisms are asymptotically equivalent in large markets, while mechanisms eliciting cardinal utilities maintain their efficiency advantage.
Naturally, with more inputs, we expect a mechanism to deliver a better outcome, as cardinal preferences are more informative than ordinal ones. However, what has not been answered in the literature is how to use cardinal information efficiently. This paper aims to fill this gap by providing a novel cardinal mechanism that improves upon the ordinal mechanisms. The mechanism is asymptotically incentive compatible, fair, and constrained efficient among ex ante stable and fair mechanisms. A mechanism is ex ante stable if, in any of its resulting assignments, no probability share of an object is given to an agent with lower priority at this object whenever a higher priority agent obtains some probability share in any of his or her less preferred objects (Kesten and Ünver 2015). Furthermore, every deterministic assignment that is compatible with an ex ante stable random assignment eliminates all justified envy and thus satisfies stability (Abdulkadiroğlu and Sönmez 2003).

We use the strong fairness concept, equal claim, proposed by He, Li, and Yan (2015); a mechanism satisfies equal claim if agents with the same priority at an object are given the same opportunity to obtain it. We refer to our construction as the pseudo-market (PM) mechanism, which elicits cardinal preferences from agents and delivers an assignment. If it is a random assignment, one can then conduct a lottery to implement one of the compatible deterministic assignments. To map reported preferences into assignments, PM internally solves a Walrasian equilibrium, where prices are priority-specific and the mechanism chooses probability shares to maximize each agent’s expected utility given his/her reported preferences and an exogenous budget in token money. Budgets need not be equal across agents.
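Each agent’s problem inside this internal Walrasian equilibrium is a linear program of the following form (a sketch of the demand side only, using our own notation: $b_i$ is agent $i$’s token budget, $p_{s,i}$ the priority-specific price agent $i$ faces for object $s$, and $x_{i,s}$ the probability share of $s$ bought):

```latex
\max_{x_i \ge 0}\; \sum_{s} u_{i,s}\, x_{i,s}
\quad \text{s.t.} \quad
\sum_{s} p_{s,i}\, x_{i,s} \le b_i,
\qquad
\sum_{s} x_{i,s} \le 1,
```

so the resulting random assignment is a point in the simplex of probability shares, and the equilibrium prices clear the market for each object’s capacity.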

This Walrasian equilibrium used in the internal computation of the PM mechanism has a unique feature in its priority-specific prices: for each object, there exists a cutoff priority group such that agents in priority groups strictly below the cutoff face an infinite price for the object (hence, they can never be matched with the object), while agents in priority groups strictly higher than the cutoff face a zero price for the object. By incorporating priorities in this manner, the PM mechanism extends the canonical Hylland and Zeckhauser (1979) mechanism, which requires every agent to face the same prices and thus does not allow priorities. It is also a generalization of the Gale-Shapley Deferred Acceptance (DA) mechanism, the most celebrated ordinal mechanism. Essentially, when both agents and objects have strict rankings over those on the other side, the DA mechanism eliminates all justified envy; whenever there are multiple agents in one priority group of an object, the tie has to be broken, usually in an exogenous way. The PM mechanism, instead, has ties broken endogenously and efficiently by using information on cardinal preferences. Agents with relatively higher cardinal preferences for an object obtain shares of that object before others who are in the same priority group. We show that the PM mechanism is well-defined in the sense that it can always internally find a Walrasian equilibrium and deliver an assignment given any reported preference profile. Moreover, the mechanism is shown to be asymptotically incentive compatible in regular economies, where regularity guarantees that Walrasian prices are well defined as in the classical analysis of Walrasian equilibria (see, e.g., Dierker 1974, Hildenbrand 1974, and Jackson 1992).
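The cutoff structure of the priority-specific prices can be written down in a few lines. The sketch below is purely illustrative (the function name and the convention that a lower index means higher priority are ours, not the paper’s): above the cutoff an object is free, at the cutoff the posted Walrasian price applies, and below the cutoff the object is unaffordable.

```python
import math

def faced_price(agent_group, cutoff_group, posted_price):
    """Priority-specific price of one object for one agent.
    Priority groups are indexed so that a LOWER index means HIGHER priority."""
    if agent_group < cutoff_group:
        return 0.0            # strictly above the cutoff: the object is free
    if agent_group == cutoff_group:
        return posted_price   # the marginal group pays the Walrasian price
    return math.inf           # strictly below: priced out entirely

# Hypothetical object with cutoff at priority group 2 and posted price 0.7.
for g in (1, 2, 3):
    print(g, faced_price(g, cutoff_group=2, posted_price=0.7))
```

This is exactly why ties break endogenously: everyone in the cutoff group faces the same finite price, so within that group the shares go to the agents whose cardinal utility for the object justifies spending their budget on it.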

The PM mechanism allows one to achieve higher social welfare than mechanisms eliciting only ordinal preferences, such as the DA and Probabilistic Serial mechanisms; it is ex ante stable because of our design of the priority-specific prices. Given an object s and its cutoff priority group, whenever a lower priority agent obtains a positive share of s, a higher priority agent must face a zero price for s and is therefore never assigned an object he or she prefers less than s. We study the fairness of the PM mechanism in the sense of equal claim, which requires that, for any given object, agents with the same priority be given the same opportunity to obtain it.

Since prices for agents in the same priority group are by construction the same in the PM mechanism, we can conclude that equal claim is satisfied when agents are given equal budgets.

06 May 2019


L’Economia dei beni confiscati  

FrancoAngeli, Milano, 2014, pp. 138, ISBN: 9788820475031

Abstract: The goal of this book is to show that the confiscation of assets from organized crime is a way to create “social capital”. It shows that in many municipalities of Southern Italy characterized by the presence of confiscated and socially reused property, there is greater consensus for political programs dealing with “legality” issues. The probit analysis considers a sample of 542 Italian municipalities.

Keywords: Economic development; social capital; economics of the Mezzogiorno

The starting hypothesis of this research, confirmed by the econometric results, is that for some years now, in territories where confiscated assets reused for social purposes are present, “civil society” has felt a sense of redemption with respect to organized crime. It is no coincidence that in many municipalities of Southern Italy characterized by experiences of social reuse of real estate confiscated from crime, the results of mayoral elections have rewarded parties and/or civic movements alternative to the traditional ones (both center-right and center-left). This result was most evident in municipalities falling under the electoral system with a runoff round between mayoral candidates. Indeed, for those municipalities the empirical analysis found a significant influence of the variable “Confiscated and managed properties” (a proxy for “social capital”) on “Mayoral election results”, which gives this research substantially original elements within the literature on the topic.
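The probit specification behind this analysis models the probability of the binary electoral outcome as Φ(x′β), the standard normal CDF of a linear index. The sketch below evaluates the probit log-likelihood with only the standard library (Φ via math.erf); the data rows and the single regressor are toy placeholders, not the book’s 542-municipality sample.

```python
import math

def probit_loglik(beta, X, y):
    """Log-likelihood of a probit model: P(y=1 | x) = Phi(x'beta),
    with Phi the standard normal CDF computed via math.erf."""
    ll = 0.0
    for xi, yi in zip(X, y):
        z = sum(b * v for b, v in zip(beta, xi))
        p = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
        p = min(max(p, 1e-12), 1.0 - 1e-12)  # guard against log(0)
        ll += math.log(p) if yi == 1 else math.log(1.0 - p)
    return ll

# Toy rows: (constant, hypothetical indicator for reused confiscated property);
# y = 1 if an "alternative" mayoral list won.
X = [(1, 0), (1, 1), (1, 1), (1, 0)]
y = [0, 1, 1, 0]
print(probit_loglik((-1.0, 2.0), X, y))
```

In an actual estimation, β would be chosen to maximize this log-likelihood; here, a β aligned with the toy pattern yields a higher likelihood than the null β = 0, which is the qualitative content of the book’s significance result.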