Computer Science Meets Economics
Disclaimer: This article is part of the Industry 4.0 Open Educational Resources (OER) Publication Initiatives jointly supported by the Duke Learning Innovation Center and the DKU Center for Teaching and Learning under the Carrying the Innovation Forward program. This article belongs to the OER Series No. 3, Computational Economics Spring 2022 collection. The Spring 2022 collection is partly supported by the Social Science Divisional Chair’s Discretionary Fund to encourage faculty engagement in undergraduate research and enhance student-faculty scholarly interactions outside of the classroom. The divisional chair is Prof. Keping Wu, Associate Professor of Anthropology at Duke Kunshan University. The co-author Zeyu Shen was the Teaching and Research Assistant for Prof. Luyao Zhang in the course COMPSCI/ECON 206 Computational Microeconomics at Duke Kunshan University in Spring 2022, when he completed this joint article. The co-authors are forever indebted to Prof. Vincent Conitzer, who presented “Computer Science Meets Economics” as a distinguished guest lecture for this course on Apr. 19, 2022.
In this article, we take a quick tour of the research accomplishments of Prof. Vincent Conitzer in a variety of areas, with a concentration on the intersection of Economics and Computer Science. We first offer a brief overview of the background knowledge, and then present a few representative works for each of his prominent areas of research.
Prof. Vincent Conitzer writes on the course website of computational microeconomics (Conitzer 2022):
“In recent years, there has been a surge of interaction between computer scientists and economists. This interaction is driven both by necessity and opportunity. On the one hand, as computer systems become more interconnected, multiple parties must interact in the same environment and compete for scarce resources, which necessarily introduces economic phenomena. On the other hand, in the past, economic mechanisms (such as auctions and exchanges) have been designed to require very limited computing and communication resources, as they would otherwise be impractical. These days, computation and communication pose much less of a constraint, which presents an opportunity to create highly efficient, computationally intensive mechanisms.”
Throughout his career, Prof. Vincent Conitzer has actively explored both the necessity and the opportunity, and has made significant progress at the intersection of Economics and Computer Science.
Professor Conitzer’s research at the intersection of Economics and Computer Science includes mechanism design [7], game theory [9], machine learning [2], and Artificial Intelligence (AI) [3]. Mechanism design is a branch of economics that explores how to achieve desirable outcomes given individuals’ self-interest. The 2007 Nobel Prize in Economics was awarded to Leonid Hurwicz, Eric Maskin, and Roger Myerson for their contributions to mechanism design theory. Game theory studies mathematical models of strategic interactions among agents; John Nash shared the 1994 Nobel Prize for his landmark work in game theory. Machine learning and AI mainly investigate automated analytical model building and how machines and systems can mimic human intelligence to behave and learn. Numerous researchers in these two areas have been granted the prestigious Turing Award: Marvin Minsky (1969), John McCarthy (1971), Allen Newell and Herbert (“Herb”) Simon (1975), Edward (“Ed”) Feigenbaum and Dabbala Rajagopal (“Raj”) Reddy (1994), and Judea Pearl (2011) were recognized for contributions to artificial intelligence, and Leslie Valiant (2010) for contributions including the theory of machine learning.
Among these areas, mechanism design seeks solutions to practical problems: how can we design strategic choices to achieve desirable social or economic outcomes, given individuals’ self-interest and information asymmetry? Examples include combinatorial auctions, a type of market in which participants can place bids on combinations of items, and voting theory, which studies voting rules that aggregate votes and determine winners. Computer Science most often serves Economics through the lens of computational complexity, i.e., the time and computational resources needed to implement and execute a given algorithm or mechanism. In general, we end up with one of two kinds of results (see the sketch after the list below):
We identify a computationally efficient algorithm that solves the problem.
We document an impossibility result, which shows that the problem cannot be solved efficiently at all (e.g., the problem is NP-hard).
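To make the flavor of these results concrete, below is a minimal Python sketch of the winner-determination problem in a combinatorial auction; the bids are made up for illustration. Checking that an allocation is feasible is fast, but the brute-force search enumerates every subset of bids, and winner determination in general is NP-hard.

```python
from itertools import combinations

# A brute-force sketch of winner determination in a combinatorial auction:
# choose the non-overlapping set of bids with the highest total revenue.
# The search enumerates all 2^n subsets of the n bids, illustrating why
# the general problem is NP-hard. All bids below are made up.

bids = [
    ({"A", "B"}, 10),  # (bundle of items, offered price)
    ({"B", "C"}, 8),
    ({"C"}, 5),
    ({"A"}, 4),
]

def best_allocation(bids):
    best_value, best_subset = 0, ()
    for r in range(1, len(bids) + 1):
        for subset in combinations(bids, r):
            items = [item for bundle, _ in subset for item in bundle]
            if len(items) == len(set(items)):  # bundles must not overlap
                value = sum(price for _, price in subset)
                if value > best_value:
                    best_value, best_subset = value, subset
    return best_value, best_subset

value, winners = best_allocation(bids)
print(value)  # 15: the bids for {A, B} and {C} win
```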
In this section, we provide an overview of the theory of computational complexity and offer a quick introduction to NP-hardness and NP-completeness, so that readers can better appreciate the significance of Prof. Vincent Conitzer’s research.
In short, the theory of complexity studies which problems can be solved efficiently and which cannot. The two important concepts involved are NP-hardness and NP-completeness. To understand them, we first briefly introduce two fundamental notions of complexity: Polynomial-Time (P) [1] and Nondeterministic Polynomial-Time (NP) [4]. For any mechanism design problem, economists often care most about whether a solution exists. In contrast, computer scientists ask, given that a solution exists:
Can we easily check whether a given solution is correct, i.e., are there proofs verifiable in polynomial time by a deterministic Turing machine?
Can we find a solution to the problem efficiently, i.e., in polynomial time [8] by a deterministic Turing machine?
The concepts of P and NP arise from these two questions. In particular, if the answer to the first question is yes, we say the problem is in NP. The name NP becomes more intuitive with an equivalent definition: NP is the set of decision problems solvable in polynomial time by a nondeterministic Turing machine.
If the answer to the second question is yes, we say the problem is in P. Any problem in P is clearly in NP, since we can verify a solution by simply re-solving the problem. However, whether P = NP remains a famous open question.
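To see the asymmetry between the two questions concretely, consider SUBSET-SUM, a classic NP-complete problem. The minimal Python sketch below, using arbitrary illustrative numbers, verifies a proposed certificate in polynomial time but falls back on exponential brute force to find one.

```python
from itertools import combinations

# SUBSET-SUM: is there a subset of the numbers summing to the target?
# verify() checks a proposed certificate in polynomial time; solve()
# resorts to brute force over all 2^n subsets in the worst case.

numbers = [3, 34, 4, 12, 5, 2]
target = 9

def verify(indices, numbers, target):
    """Polynomial-time check of a certificate (a list of distinct indices)."""
    return (len(set(indices)) == len(indices)
            and sum(numbers[i] for i in indices) == target)

def solve(numbers, target):
    """Worst-case exponential-time search for a solution."""
    for r in range(1, len(numbers) + 1):
        for subset in combinations(range(len(numbers)), r):
            if sum(numbers[i] for i in subset) == target:
                return subset
    return None

print(solve(numbers, target))           # (2, 4): numbers 4 and 5 sum to 9
print(verify([2, 4], numbers, target))  # True, checked in linear time
```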
NP-hard [5] problems are those for which finding a polynomial-time algorithm would imply P = NP. Showing that a problem is NP-hard is therefore close to an impossibility result: since P = NP remains open and is widely believed to be false, the problem is unlikely to admit an efficient algorithm. There is an established set of problems known to be NP-hard. To show that a new computational problem is NP-hard, we need only give a polynomial-time reduction from one of those problems to it. Finally, NP-complete [6] problems are the problems that lie in NP and are NP-hard.
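As a concrete sketch of such a reduction, the snippet below maps INDEPENDENT-SET (known NP-hard) to VERTEX-COVER using the textbook fact that a set of vertices is independent exactly when its complement is a vertex cover; the small path graph is purely illustrative.

```python
# A polynomial-time reduction from INDEPENDENT-SET to VERTEX-COVER:
# a graph on n vertices has an independent set of size k iff it has a
# vertex cover of size n - k. Hence an efficient algorithm for
# VERTEX-COVER would yield one for INDEPENDENT-SET.

def reduce_independent_set_to_vertex_cover(n_vertices, edges, k):
    """Map an INDEPENDENT-SET instance (G, k) to a VERTEX-COVER
    instance (G, n - k) in polynomial time."""
    return n_vertices, edges, n_vertices - k

def is_vertex_cover(cover, edges):
    """Polynomial-time check that every edge touches the cover."""
    return all(u in cover or v in cover for u, v in edges)

edges = [(0, 1), (1, 2), (2, 3)]          # a path on 4 vertices
n, edges_vc, k_vc = reduce_independent_set_to_vertex_cover(4, edges, 2)
print(k_vc)                               # 2: {1, 3} covers every edge,
print(is_vertex_cover({1, 3}, edges))     # so {0, 2} is an independent set
```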
In his research, Prof. Vincent Conitzer shows that many existing solution concepts in mechanism design face computational barriers. For example, Conitzer and Sandholm (2003) show that determining whether a Nash equilibrium with certain natural properties exists is NP-hard. His research thus inspires economists to explore alternative formulations of existing problems and to identify solution concepts with low computational costs. His contribution to the interplay of Economics and Computer Science is thus fundamental and profound.
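To locate the difficulty precisely: checking that a given strategy profile is a Nash equilibrium is computationally easy; what is hard is deciding whether an equilibrium with additional properties exists. A minimal sketch of the easy direction, using matching pennies as a made-up example:

```python
import numpy as np

# Verifying a Nash equilibrium in a two-player game is easy: no pure
# deviation may improve either player's expected payoff. The payoff
# matrices below are matching pennies, chosen purely for illustration.

A = np.array([[1, -1], [-1, 1]])   # row player's payoffs
B = -A                             # column player's payoffs (zero-sum)

def is_nash(x, y, A, B, tol=1e-9):
    """Check that mixed strategies x (row) and y (column) are
    mutual best responses."""
    row_payoffs = A @ y            # payoff of each pure row action vs. y
    col_payoffs = x @ B            # payoff of each pure column action vs. x
    return (x @ A @ y >= row_payoffs.max() - tol and
            x @ B @ y >= col_payoffs.max() - tol)

x = y = np.array([0.5, 0.5])       # uniform mixing
print(is_nash(x, y, A, B))         # True: the unique equilibrium
```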
Prof. Vincent Conitzer’s research in mechanism design spans four facets: Complexity of Mechanism Design, Automated Mechanism Design, Voting Theory, and Combinatorial Auctions.
In the masterwork Complexity of Mechanism Design, Conitzer and Sandholm (2002) document the conditions under which deterministic mechanism design problems are NP-hard.
In the paper, Self-Interested Automated Mechanism Design and Implications for Optimal Combinatorial Auctions, Conitzer and Sandholm (2004) study the application of automated mechanism design in auction theory.
In the field of voting theory, Prof. Vincent Conitzer has presented a wide collection of new results for traditional voting rules, with works such as Common Voting Rules as Maximum Likelihood Estimators (Conitzer and Sandholm, 2005), and Improved Bounds for Computing Kemeny Rankings (Conitzer, Davenport, and Kalagnanam, 2006).
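For intuition about why such bounds matter, recall that the Kemeny rule selects the ranking minimizing total pairwise disagreement with the votes, and computing it is NP-hard in general. A minimal brute-force sketch over illustrative votes:

```python
from itertools import permutations, combinations

# The Kemeny rule: find the ranking minimizing total pairwise
# disagreement with the votes. Naive enumeration over all m! rankings
# works only for tiny m, which is why improved bounds are valuable.
# The votes below are made up for illustration.

votes = [("a", "b", "c"), ("a", "c", "b"), ("b", "a", "c")]

def disagreements(ranking, vote):
    """Count candidate pairs ordered differently by ranking and vote."""
    pos_r = {c: i for i, c in enumerate(ranking)}
    pos_v = {c: i for i, c in enumerate(vote)}
    return sum((pos_r[x] < pos_r[y]) != (pos_v[x] < pos_v[y])
               for x, y in combinations(ranking, 2))

def kemeny(votes):
    candidates = votes[0]
    return min(permutations(candidates),
               key=lambda r: sum(disagreements(r, v) for v in votes))

print(kemeny(votes))   # ('a', 'b', 'c') minimizes total disagreement
```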
Professor Conitzer also has a wide body of research on combinatorial auctions, e.g., Combinatorial Auctions with K-wise Dependent Valuation (Conitzer, Sandholm, and Santi, 2005). These works provide new computational guarantees and hardness results for classical problems in voting theory and auction theory, and help readers gain a deeper understanding of these standard mechanisms.
Prof. Vincent Conitzer also has a wide range of work in game theory. He has not only presented a great number of effective and efficient mechanisms but also proved that many natural problems are NP-hard. For instance, in Complexity Results about Nash Equilibria (Conitzer and Sandholm, 2003), he showed that determining whether Nash equilibria with certain properties exist is NP-hard, which clarified the computational complexity of a broad family of equilibrium-related questions. In A Double Oracle Algorithm for Zero-Sum Security Games on Graphs (Jain, Korzhyk, Vanek, Conitzer, Pechoucek, and Tambe, 2011), he and his coauthors modeled urban network security as a two-player zero-sum game, contributing to the security of urban city networks, transportation networks, computer networks, and many other critical infrastructures.
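As a rough sketch of the computational core of such security games (not the double oracle algorithm itself, which repeatedly solves restricted games while growing the strategy sets), the snippet below computes a defender’s maximin mixed strategy in a small zero-sum matrix game by linear programming; the 3x3 payoff matrix is made up for illustration.

```python
import numpy as np
from scipy.optimize import linprog

# Maximin strategy for the row player (the "defender") in a zero-sum
# matrix game, via linear programming. Double-oracle methods call a
# solver like this on restricted strategy sets. Illustrative payoffs.

A = np.array([[ 0, -1,  1],       # defender's payoff for each
              [ 1,  0, -1],       # (defender action, attacker action)
              [-1,  1,  0]])
m, n = A.shape

# Variables: x (defender's mixed strategy) and v (game value).
# Maximize v subject to: for every attacker action j, x @ A[:, j] >= v.
c = np.zeros(m + 1)
c[-1] = -1                                    # linprog minimizes, so use -v
A_ub = np.hstack([-A.T, np.ones((n, 1))])     # v - x @ A[:, j] <= 0
b_ub = np.zeros(n)
A_eq = np.ones((1, m + 1))
A_eq[0, -1] = 0                               # probabilities sum to 1
b_eq = np.array([1.0])
bounds = [(0, None)] * m + [(None, None)]     # x >= 0, v unrestricted

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
print(res.x[:m], -res.fun)   # ~[1/3, 1/3, 1/3] with game value ~0
```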
With the ever-increasing development of AI and machine learning, it is natural to incorporate machine learning into game theory: for instance, machine learning can characterize the process by which players gradually learn optimal strategies through interactions with other players. Prof. Vincent Conitzer has a number of works in this area, many of which focus on learning in games, with representative works such as Communication Complexity as a Lower Bound for Learning in Games (Conitzer and Sandholm, 2004) and AWESOME: A General Multiagent Learning Algorithm that Converges in Self-play and Learns a Best Response Against Stationary Opponents (Conitzer and Sandholm, 2003). Traditional game theory only defines an equilibrium but does not provide a process for reaching one. By incorporating machine learning and AI, Prof. Vincent Conitzer studies the process by which players gradually learn and improve their strategies so as to converge to a good, if not optimal, equilibrium.
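As a minimal illustration of this idea, the sketch below runs fictitious play, a classical learning dynamic (not the AWESOME algorithm itself), in matching pennies: each player best-responds to the empirical frequency of the opponent’s past actions, and the empirical mixed strategies approach the equilibrium of mixing half-and-half.

```python
import numpy as np

# Fictitious play in matching pennies: each player best-responds to the
# empirical frequency of the opponent's past actions. In zero-sum games
# the empirical frequencies converge to equilibrium (Robinson, 1951).

A = np.array([[1, -1], [-1, 1]])   # row player's payoffs; column gets -A

counts_row, counts_col = np.ones(2), np.ones(2)   # initial action counts
for _ in range(10000):
    a_row = np.argmax(A @ (counts_col / counts_col.sum()))
    a_col = np.argmax(-(counts_row / counts_row.sum()) @ A)
    counts_row[a_row] += 1
    counts_col[a_col] += 1

print(counts_row / counts_row.sum())   # -> approximately [0.5, 0.5]
print(counts_col / counts_col.sum())   # empirical play nears equilibrium
```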
Vincent Conitzer and Tuomas Sandholm. 2002. “Complexity of Mechanism Design.” Proceedings of the Eighteenth Conference on Uncertainty in Artificial Intelligence. https://arxiv.org/pdf/1408.1486.pdf
Vincent Conitzer and Tuomas Sandholm. 2004. “Self-Interested Automated Mechanism Design and Implications for Optimal Combinatorial Auctions.” Proceedings of the fifth ACM Conference on Electronic Commerce. https://dl.acm.org/doi/pdf/10.1145/988772.988793
Vincent Conitzer and Tuomas Sandholm. 2005. “Common Voting Rules as Maximum Likelihood Estimators.” Proceedings of the Twenty-First Conference on Uncertainty in Artificial Intelligence. https://arxiv.org/pdf/1207.1368.pdf
Vincent Conitzer, Andrew Davenport, and Jayant Kalagnanam. 2006. “Improved Bounds for Computing Kemeny Rankings.” Proceedings of the Twenty-First National Conference on Artificial Intelligence. https://www.aaai.org/Papers/AAAI/2006/AAAI06-099.pdf
Vincent Conitzer, Tuomas Sandholm, and Paolo Santi. 2005. “Combinatorial Auctions with K-wise Dependent Valuation.” Proceedings of the Twentieth National Conference on Artificial Intelligence. https://users.cs.duke.edu/~conitzer/kwiseAAAI05.pdf
Vincent Conitzer and Tuomas Sandholm. 2003. “Complexity Results about Nash Equilibria.” Proceedings of the Eighteenth International Joint Conference on Artificial Intelligence. https://arxiv.org/pdf/cs/0205074.pdf
Manish Jain, Dmytro Korzhyk, Ondrej Vanek, Vincent Conitzer, Michal Pechoucek, and Milind Tambe. 2011. “A Double Oracle Algorithm for Zero-Sum Security Games on Graphs.” Proceedings of the Tenth International Conference on Autonomous Agents and Multiagent Systems (AAMAS). https://users.cs.duke.edu/~conitzer/graph_securityAAMAS11.pdf
Vincent Conitzer and Tuomas Sandholm. 2004. “Communication Complexity as a Lower Bound for Learning in Games.” Proceedings of the Twenty-First International Conference on Machine Learning. https://www.cs.cmu.edu/~sandholm/communication.icml04.pdf
Vincent Conitzer and Tuomas Sandholm. 2003. “AWESOME: A General Multiagent Learning Algorithm that Converges in Self-play and Learns a Best Response Against Stationary Opponents.” Proceedings of the Twentieth International Conference on Machine Learning. https://arxiv.org/pdf/cs/0307002.pdf
Juris Hartmanis. 1989. “Gödel, Von Neumann, and the P = NP Problem.” Cornell University Technical Report. https://ecommons.cornell.edu/bitstream/handle/1813/6910/89-994.pdf?sequence=1
Vincent Conitzer. 2022. “Computational Microeconomics: Spring 2022.” https://courses.cs.duke.edu/spring22/compsci323d/
| Number | Technical Term | Definition |
| --- | --- | --- |
| [1] | P | Problems with at least one polynomial-time algorithm. |
| [2] | Machine Learning (ML) | A method of data analysis that automates analytical model building. |
| [3] | Artificial Intelligence (AI) | Systems or machines that mimic human intelligence to perform tasks and can iteratively improve themselves based on the information they collect. |
| [4] | NP | Decision problems for which a “yes” answer can be verified in polynomial time. |
| [5] | NP-hard | Problems such that, if any of them admits a polynomial-time algorithm, then P = NP. |
| [6] | NP-complete | Problems that are both NP-hard and in NP. |
| [7] | Mechanism Design | A branch of economics that explores how to achieve desirable outcomes given individuals’ self-interest. |
| [8] | Polynomial time | An algorithm runs in polynomial time if its running time is upper bounded by a polynomial expression in the size of its input. |
| [9] | Game Theory | The study of mathematical models of strategic interactions among rational agents. |