Saul Gorn Lecture Series

Archive of past speakers in the series:

2017

Shafi Goldwasser

“Pseudo Deterministic Algorithms and Proofs”

Abstract: Probabilistic algorithms for both decision and search problems can offer significant complexity improvements over deterministic algorithms. One major difference, however, is that they may output different solutions for different choices of randomness. This makes correctness amplification impossible for search algorithms and is undesirable in settings where uniqueness of output is important, such as the generation of system-wide cryptographic parameters or distributed settings where different sources of randomness are used. Pseudo-deterministic algorithms are a class of randomized search algorithms which output a unique answer with high probability. Intuitively, they are indistinguishable from deterministic algorithms by a polynomial-time observer of their input/output behavior. In this talk I will describe what is known about pseudo-deterministic algorithms in the sequential, sublinear, and parallel settings. We will also briefly describe an extension of pseudo-deterministic algorithms to interactive proofs for search problems, where the verifier is guaranteed, with high probability, to produce the same output on different executions, regardless of the prover strategies. Based on joint work with Gat, Goldreich, Ron, Grossman, and Holden.
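For concreteness, a common way to formalize the notion (a sketch roughly following the literature; exact constants and error models vary by paper) is that a probabilistic polynomial-time algorithm A solves a search problem, given as a relation R, pseudo-deterministically if every input x has a single canonical solution s(x) that A outputs with high probability over its randomness:

```latex
% Sketch of the standard definition; the 2/3 threshold is conventional
% and can be amplified by independent repetition.
\[
  \forall x:\quad \Pr_{r}\bigl[\,A(x; r) = s(x)\,\bigr] \;\ge\; \tfrac{2}{3},
  \qquad \text{where } (x, s(x)) \in R .
\]
```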

Bio: Shafi Goldwasser is the RSA Professor of Electrical Engineering and Computer Science at MIT. She is also a professor of computer science and applied mathematics at the Weizmann Institute of Science in Israel. Goldwasser's pioneering contributions include the introduction of interactive proofs, zero-knowledge protocols, hardness-of-approximation proofs for combinatorial problems, and multi-party secure protocols. She was the recipient of the ACM Turing Award for 2012, the Gödel Prize in 1993 and again in 2001, the ACM Grace Murray Hopper Award, the RSA Award in Mathematics, the ACM Athena Award for women in computer science, the Benjamin Franklin Medal, and the IEEE Emanuel R. Piore Award. She is a member of the AAAS, NAS, and NAE.

Goldwasser received a BS degree in applied mathematics from Carnegie Mellon University in 1979, and MS and PhD degrees in computer science from the University of California, Berkeley, in 1984.


2016

Cary Phillips

“Creating the Impossible: Hollywood Visual Effects at Industrial Light & Magic”

Abstract: When you watch a movie in a darkened theater, you imagine that the scenes on the screen actually unfolded in real life with a camera there to capture them. This is utterly false; it’s all fake. At Industrial Light & Magic, small armies of artists and engineers blend digital environments and CG characters with actors on live-action sets to bring the impossible to life and blur the lines between fantasy and reality. Cary Phillips will share a light-hearted view behind the scenes of the visual effects of such movies as The Avengers, Pirates of the Caribbean, Transformers, and Star Wars.

Bio: Cary co-leads the R&D group at ILM, where he has worked for over 20 years. He’s a member of the Academy of Motion Picture Arts and Sciences and the recipient of three Academy Technical Achievement Awards. He earned his PhD in computer graphics from Penn in 1991.

2014

Tuesday, April 22, 2014

Eric Horvitz
Distinguished Scientist and Managing Director
Microsoft Research

“Data, Predictions, and Decisions in Support of People and Society”

Abstract: I will present work on harnessing data to build predictive models that can guide decision making. First, I will discuss opportunities to harness data in prediction and decision making in healthcare. I will focus on efforts to reduce readmissions and hospital-associated infections to highlight broader possibilities for using machine learning and decision support to enhance the quality and reduce the cost of healthcare. Then, I will describe the use of anonymized behavioral data drawn from web services as a large-scale sensor network for public health. I will describe several efforts, including the use of data from logs of web search to identify adverse effects of medications. Finally, I will discuss directions in building systems that are designed to leverage the complementary aspects of machine and human intelligence to assist people in daily life and to solve problems in science and society.

Bio: Eric Horvitz is a distinguished scientist and managing director at Microsoft Research. His interests span theoretical and practical challenges with developing systems that perceive, learn, and reason, with a focus on inference and decision making under uncertainty. He has been elected a fellow of AAAI, AAAS, the American Academy of Arts and Sciences, and the National Academy of Engineering, and has been inducted into the CHI Academy. He received PhD and MD degrees at Stanford University. Information on publications, collaborations, and activities can be found at http://research.microsoft.com/~horvitz.

2013

Thursday, April 4, 2013

John C. Mitchell
Mary and Gordon Crary Family Professor
Professor of Computer Science and Electrical Engineering
Stanford University

“JavaScript and Web Security: Isolation and Communication in the Programmable Web”

Abstract: The World Wide Web is our most important distributed computer system. In the modern web, pages seen by viewers contain executable programs from many sources, loaded into the browser in increasingly complex ways. This complexity is an important part of the advertising infrastructure, for example, allowing advertising companies to pass information between them and effectively auction off a portion of the user’s screen to the advertiser most interested in reaching that individual. Maps, games, and other apps are also served through sites that do not know what they do or how they work.

In this environment, it is important to allow web designers and site developers to write sophisticated programs that run in end-user browsers. At the same time, some programs may maliciously try to steal information, or worse. Therefore, separate programs from separate sources must be allowed to run and communicate with each other, but not to carry out malicious attacks. Over the last several years, we have studied this problem and developed methods for isolating communicating JavaScript programs. Our methods prevent attacks we found on widely used commercial sites and are provably secure, using methods from the mathematical theory of programming languages. In addition, these approaches are relevant to mobile security, because approximately three-quarters of current Android apps use an embedded “browser” to interact with web sites.
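The talk's specific isolation techniques are not spelled out in this abstract, but the general pattern it refers to, running third-party code in a separate browsing context and restricting it to explicit message passing, can be illustrated with standard browser APIs. The sketch below is a minimal illustration with a hypothetical widget URL and message format, not the authors' provably secure method: it embeds untrusted code in a cross-origin iframe and exchanges data with it via postMessage, checking origins on both ends.

```typescript
// Host page sketch: embed an untrusted third-party widget in a cross-origin
// iframe and communicate with it only via postMessage. The widget URL and the
// message format are hypothetical, for illustration only.
const WIDGET_ORIGIN = "https://widgets.example.com"; // assumed third-party origin

const frame = document.createElement("iframe");
frame.src = `${WIDGET_ORIGIN}/ad-widget.html`;
document.body.appendChild(frame);
// Because the frame is cross-origin, the browser's same-origin policy already
// keeps its scripts out of this page's DOM, cookies, and storage; the only
// remaining channel is explicit message passing.

// Accept messages only from the expected origin, and treat the payload as
// data to validate, never as code to evaluate.
window.addEventListener("message", (event: MessageEvent) => {
  if (event.origin !== WIDGET_ORIGIN) return; // ignore all other senders
  const msg = event.data as { type?: string; bid?: number };
  if (msg.type === "bid" && typeof msg.bid === "number") {
    console.log(`widget bid ${msg.bid} for the ad slot`);
  }
});

// Send the widget only the data it needs, addressed to its exact origin so
// the message cannot be delivered anywhere else.
frame.addEventListener("load", () => {
  frame.contentWindow?.postMessage(
    { type: "slot", width: 300, height: 250 },
    WIDGET_ORIGIN,
  );
});
```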

Bio: John Mitchell is the Mary and Gordon Crary Family Professor in the Stanford School of Engineering, Professor of Computer Science, and Vice Provost for Online Learning. His research in computer security focuses on cloud security, mobile and web security, privacy, and network security. He has also worked on programming language analysis and design, formal methods, and applications of mathematical logic to computer science. Prof. Mitchell currently leads research projects funded by the US Air Force, the Office of Naval Research, private companies and foundations; he is the Stanford Principal Investigator of the multidisciplinary TRUST NSF Science and Technology Center and Chief Computer Scientist of the DHHS-funded SHARPS project on healthcare security and privacy. He is a consultant and advisor to companies and the author of over 150 research articles and two books.

2012

Tuesday, April 24, 2012

Leslie G. Valiant
Computer Science and Applied Mathematics
School of Engineering and Applied Sciences, Harvard University

“A Computational Theory of Cortex and Hippocampus”

2011

Thursday, April 21, 2011

Joseph Y. Halpern
Chair, Computer Science Department
Cornell University

“Beyond Nash Equilibrium: Solution Concepts for the 21st Century”

Abstract: Nash equilibrium is the most commonly used notion of equilibrium in game theory. However, it suffers from numerous problems. Some are well known in the game theory community; for example, the Nash equilibrium of repeated prisoner’s dilemma is neither normatively nor descriptively reasonable. However, new problems arise when considering Nash equilibrium from a computer science perspective: for example, Nash equilibrium is not robust (it does not tolerate “faulty” or “unexpected” behavior), it does not deal with coalitions, it does not take computation cost into account, and it does not deal with cases where players are not aware of all aspects of the game. In this talk, I discuss solution concepts that try to address these shortcomings of Nash equilibrium.
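As a concrete instance of the first complaint (with standard illustrative payoffs, not taken from the talk): in the one-shot prisoner’s dilemma below, defecting is each player’s best response no matter what the other does, so mutual defection is the unique Nash equilibrium even though both players would be strictly better off cooperating; backward induction extends the same prediction to every round of a finitely repeated game, which matches neither how people feel they should play nor how they actually play.

```latex
% Row player's payoff listed first; C = cooperate, D = defect.
\[
\begin{array}{c|cc}
      & C     & D     \\ \hline
  C   & (3,3) & (0,5) \\
  D   & (5,0) & (1,1)
\end{array}
\qquad\Rightarrow\qquad
\text{unique Nash equilibrium } (D,D) \text{ with payoffs } (1,1).
\]
```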

This talk represents joint work with various collaborators, including Ittai Abraham, Danny Dolev, Rica Gonen, Rafael Pass, and Leandro Rego. No background in game theory will be presumed.

2010

Tuesday, April 20, 2010

Ed Lazowska
Bill & Melinda Gates Chair in Computer Science & Engineering
University of Washington

“Computer Science: Past, Present, and Future”

Abstract: The National Science Foundation created the Computing Community Consortium to stimulate the computing research community to envision and pursue longer-range, more audacious research challenges. The next ten years of advances in computer science should be far more significant, and far more interesting, than the past ten. This talk will review the progress that the field has made, and present a number of “grand challenge” problems that we should be prepared to tackle in the coming decade.

Bio: Ed Lazowska holds the Bill & Melinda Gates Chair in Computer Science & Engineering at the University of Washington. Lazowska received his A.B. from Brown University and his Ph.D. from the University of Toronto. His research and teaching concern the design, implementation, and analysis of high performance computing and communication systems. He chaired the Computing Research Association Board of Directors from 1997 to 2001, the NSF CISE Advisory Committee from 1998 to 1999, the President’s Information Technology Advisory Committee from 2003 to 2005, and the DARPA Information Science and Technology Study Group from 2004 to 2006. He is a member of the Microsoft Research Technical Advisory Board, and serves as a board member or technical advisor to a number of high-tech companies and venture firms. He recently became the inaugural chair of the Computing Community Consortium, an NSF-sponsored effort to engage the computing research community in envisioning more audacious research challenges. Lazowska is a Member of the National Academy of Engineering, a Fellow of the American Academy of Arts & Sciences, and a Fellow of ACM, IEEE, and AAAS. Twenty-one Ph.D. students and twenty-three Master’s students have completed their degrees working with him.

2009

Thursday, April 16, 2009

Peter Lee
Chair, Computer Science Department
Carnegie Mellon University

“Programming a Million Robots”

Abstract: We have developed a concept for a new type of microscale robot, called Claytronics. Less than a cubic millimeter in size, these robots are designed to be cheap to build in large quantities and move via an electrostatic mechanism that allows one robot to stick and roll around one or more other robots. One can think of the design as similar to a very large-scale modular robot system.

While Claytronics is fanciful and, well, maybe even feasible to build (some interesting partial prototypes have been demonstrated), the question that has captivated me is the following: If you had a million of these robots sitting on your desk, how would you program them? In this lecture, I will describe the thought process as I’ve gradually gotten sucked into this problem, finally leading to a logic-programming language called Meld. I will present the design of Meld and show several programs and videos of large-scale robot ensembles running in a high-fidelity, physics-based simulator. I will conclude with an assessment of the language, including its possible usefulness for distributed systems such as sensor networks.

Bio: Peter Lee is the head of the Computer Science Department at Carnegie Mellon University. In this capacity, he oversees a computing organization whose research and education programs are consistently ranked among the top in the nation. Prior to assuming his current position, Dr. Lee was the Vice Provost for Research, providing administrative oversight and strategic guidance for Carnegie Mellon’s research activities, an enterprise that exceeds $400M annually in sponsored research. Peter Lee is an active researcher, educator, administrator, and servant to the academic community. For his research, he has received several awards, including the ACM SIGOPS Hall of Fame Award, and election as an ACM Fellow. He is a member of the Board of Directors of the Computing Research Association (where he chairs the Government Affairs Committee), the Computing Community Consortium Council, the Computer Science and Telecommunications Board of the National Research Council, and the DARPA Information Science and Technology Board (where he is the vice-chair).

2008

Thursday, May 1, 2008

Ronald L. Rivest
Viterbi Professor of Computer Science, Department of Electrical Engineering and Computer Science
MIT

“Security of Voting Systems”

Abstract: While running an election sounds simple, it is in fact extremely challenging. Not only are there millions of voters to be authenticated and millions of votes to be carefully collected, counted, and stored, there are now millions of “voting machines” containing millions of lines of code to be evaluated for security vulnerabilities. Moreover, voting systems have a unique requirement: the voter must not be given a “receipt” that would allow them to prove how they voted to someone else; otherwise the voter could be coerced or bribed into voting a certain way. This lack of receipts makes the design of secure voting systems much more challenging than, say, the security of banking systems (where receipts are the norm).

We discuss some of the recent trends and innovations in voting systems, as well as some of the new requirements being placed upon voting systems in the U.S., and describe some promising directions for resolving the conflicts inherent in voting system requirements, including some approaches based on cryptography.

Bio: Professor Ronald L. Rivest is the Viterbi Professor of Computer Science in MIT’s Department of Electrical Engineering and Computer Science. He is a member of MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), a member of the lab’s Theory of Computation Group, and a leader of its Cryptography and Information Security Group. He is also a founder of RSA Data Security (RSA was bought by Security Dynamics; the combined company has been renamed RSA Security), and a co-founder of Verisign and of Peppercoin. He has also worked extensively in the areas of computer algorithms, machine learning, and VLSI design.

2007

April 9, 2007

Hector Garcia-Molina
Professor, Departments of Computer Science and Electrical Engineering
Stanford University

“Generic Entity Resolution”

Abstract: Entity resolution (ER) is a problem that arises in many information integration scenarios: We have two or more sources containing records on the same set of real-world entities (e.g., customers). However, there are no unique identifiers that tell us what records from one source correspond to those in the other sources. Furthermore, the records representing the same entity may have differing information, e.g., one record may have the address misspelled, another record may be missing some fields. An ER algorithm attempts to identify the matching records from multiple sources (i.e., those corresponding to the same real-world entity), and merges the matching records as best it can.

In this talk I will describe a “generic” ER approach where the functions for comparing and merging records are black boxes, invoked on pairs of records. I will describe a set of important properties of the black boxes that enable efficient ER. I will also introduce three algorithms for ER: one for the general case, one for the case where the properties hold, and one for when the computations can be distributed across multiple processors. If time permits, I will show some experimental comparisons of the algorithms, based on comparison shopping data provided by Yahoo.
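The abstract does not spell out the algorithms, but the basic generic-ER loop it describes can be sketched as follows: treat match and merge as opaque functions on pairs of records, and repeatedly merge any pair that matches until no matches remain. This is a naive fixpoint version over assumed toy record fields (names, emails); the talk's three algorithms, and the black-box properties that make them efficient, are not reproduced here.

```typescript
// A naive generic entity-resolution sketch. The record shape and the sample
// match/merge functions are hypothetical; only the overall loop (merge
// matching pairs until a fixpoint) reflects the generic setup in the abstract.
type Rec = { names: Set<string>; emails: Set<string> };

// Black box 1: do two records describe the same real-world entity?
const match = (a: Rec, b: Rec): boolean =>
  [...a.emails].some((e) => b.emails.has(e)) ||
  [...a.names].some((n) => b.names.has(n));

// Black box 2: combine two matching records into one.
const merge = (a: Rec, b: Rec): Rec => ({
  names: new Set([...a.names, ...b.names]),
  emails: new Set([...a.emails, ...b.emails]),
});

// Generic ER: repeatedly replace any matching pair with its merge.
function resolve(records: Rec[]): Rec[] {
  const out = [...records];
  let changed = true;
  while (changed) {
    changed = false;
    outer: for (let i = 0; i < out.length; i++) {
      for (let j = i + 1; j < out.length; j++) {
        if (match(out[i], out[j])) {
          const merged = merge(out[i], out[j]);
          out.splice(j, 1);          // remove the later record first
          out.splice(i, 1, merged);  // replace the earlier one with the merge
          changed = true;
          break outer;               // rescan after every merge
        }
      }
    }
  }
  return out;
}

// Example: two sources describing the same customer, one with a misspelled name.
const resolved = resolve([
  { names: new Set(["Ann Smith"]), emails: new Set(["ann@example.com"]) },
  { names: new Set(["An Smith"]), emails: new Set(["ann@example.com"]) },
  { names: new Set(["Bob Jones"]), emails: new Set(["bob@example.com"]) },
]);
console.log(resolved.length); // 2 resolved entities
```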

Bio: Hector Garcia-Molina is the Leonard Bosack and Sandra Lerner Professor in the Departments of Computer Science and Electrical Engineering at Stanford. He served as the Chairman of the Computer Science Department from 2001-2004, and was the Director of the Computer Systems Laboratory from 1994 to 1997. He was a faculty member of the Computer Science Department at Princeton University from 1979 to 1991, and a member of the President’s Information Technology Advisory Committee (PITAC) from 1997 to 2001. Garcia-Molina’s research interests include distributed computing systems, digital libraries and database systems. He received his BS in electrical engineering from the Instituto Tecnologico de Monterrey, Mexico in 1974, and earned an MS in electrical engineering in 1975 and a PhD in computer science in 1979 from Stanford University.

Garcia-Molina is involved in numerous professional activities. He is a member of the National Academy of Engineering, and a Fellow of the Association for Computing Machinery and the American Academy of Arts and Sciences. He is on the Technical Advisory Board of DoCoMo Labs USA and Yahoo Search & Marketplace. A Venture Advisor for Diamondhead Ventures, Garcia-Molina is a member of the Board of Directors of Oracle and Kintera. He received the ACM SIGMOD Innovations Award in 1999.

2006

April 10, 2006

Yale N. Patt

Professor and Ernest Cockrell, Jr. Centennial Chair in Engineering
The University of Texas at Austin

“Computer Architecture Research: Is it Dead? Does it need revitalization? Where do we go from here?”

Abstract: Computer Architecture is at the interface between the software that specifies what the computer is to do, and the hardware that actually carries out the work. Process technology continues to give us more and more resources. The first microprocessor had 2300 transistors, and ran at about 100 KHz. This year we are talking about 1.72 billion transistors on a chip, and clock frequencies in excess of 4 GHz. In a few years we will have 10 billion transistors on a chip, with clock frequencies greater than 10 GHz. However, too many people who should know better are speaking doom. Either they say there is nothing left to do in computer architecture, or they are trumpeting that computer architecture research needs revitalization. In this talk, I will examine both premises. I also hope to suggest some approaches that I expect will keep us moving forward, continuing to improve the performance of microprocessors over the next ten years, … IF we do not lose sight of some fundamental principles.

Bio: Yale Patt is the Ernest Cockrell, Jr. Centennial Chair in Engineering and Professor of ECE at Texas. He enjoys equally teaching both the large (400 students) freshman introductory course in computing and the advanced graduate courses in microarchitecture, directing the research of very bright PhD students (right now, 12 of them), and consulting extensively in the microprocessor industry. He earned the appropriate set of degrees from reputable universities and has received a substantial number of awards for both his research and teaching. Many of his research results have found their way into high performance microprocessor products. More detail can be found on his home page.

Speakers prior to 2006

2005
Barbara Liskov
Massachusetts Institute of Technology

2004
David Clark
Massachusetts Institute of Technology

2003
Moshe Y. Vardi
Rice University

2002
Roger Needham
Cambridge University

2001
Robin Milner
Cambridge University

2000
Julia Hirschberg
AT&T Labs-Research

1999
Butler Lampson
Microsoft Corporation

1998
Joseph O’Rourke
Smith College

1997
Amir Pnueli
Weizmann Institute of Science

1996
Andries van Dam
Brown University

1995
Anil Nerode
Cornell University

1994
Robert McNaughton
Rensselaer Polytechnic Institute

About the Saul Gorn Lecture:

The Saul Gorn Memorial Lecture Series was established in honor of the late Professor Saul Gorn, who played a key role in the establishment of the Computer Science Graduate Group in the Moore School, which later became the Department of Computer and Information Science. Dr. Gorn was a pioneer in computer and information science who was on the Penn faculty for more than 30 years.