Richard E. Bellman

August 26, 1920 – March 19, 1984

Brief Biography

Richard Ernest Bellman was a major figure in modern optimization, systems analysis, and control theory who developed dynamic programming (DP) in the early 1950s. Born in Brooklyn and raised in the Bronx, Bellman had a comfortable childhood that was interrupted by the Great Depression. Early on, he excelled at mathematics and was a star student in school. Bellman enrolled at the tuition-free City College of New York but was put off by the expense of the hour-long commute (which left him no money for lunch) and transferred to Brooklyn College. After receiving a bachelor’s degree in 1941, he chose to pursue graduate study at Johns Hopkins University.

When the United States entered World War II, Bellman redirected his education toward wartime service, thereby avoiding conscription. He moved to Wisconsin in 1942 to teach an Army radio and electronics class and worked towards an MA from the University of Wisconsin. At Madison, Bellman met renowned Polish mathematician Stanislaw Ulam. Ulam convinced Solomon Lefschetz to offer Bellman a position at Princeton University as an Army Specialized Training Program instructor. After three semesters of coursework and teaching, Bellman reunited with Ulam at Los Alamos’s Theoretical Physics Division as part of the Army’s top-secret Manhattan Project. He returned to Princeton after the war and received a PhD in mathematics in 1947.

Given Princeton’s reluctance to hire its own recent graduates, Bellman accepted a summer job at the RAND Corporation in 1948. The RAND community of the late 1940s and 1950s was an exciting place for a burgeoning operations researcher. The full-time and associated staff included David Blackwell, George Dantzig, and Lloyd Shapley. Bellman quickly made a name for himself when a paper on bluffing he co-wrote with Blackwell became the subject of a New York Times article. When offered a full-time position at RAND in 1952, Bellman decided to suspend his teaching career (he had been teaching concurrently at Stanford up to that point) and focus on the development of dynamic programming with RAND personnel.

In 1965, Bellman became Professor of Mathematics, Electrical Engineering, and Medicine at the University of Southern California. He started a program of applied mathematics that included a two-year sequence of dynamic programming, control theory, invariant imbedding, and mathematical biosciences courses. Bellman’s research at USC became increasingly focused on the application of mathematics to medicine and the biological sciences. Many of his students, including Christine Shoemaker and Augustine Esogbue, have gone on to make significant contributions to OR applications. (According to Shoemaker, Bellman was ahead of his time with respect to affirmative action, and applied for and got a grant for teaching computer science to high school students in disadvantaged areas). He accepted a series of lecture engagements around the world and published many articles, books, and monographs. He additionally served on a variety of editorial boards.

Throughout his career, Bellman made significant contributions to a number of areas. He published a series of articles on dynamic programming that came together in his 1957 book, Dynamic Programming. In the early 1960s, Bellman became interested in the idea of embedding a particular problem within a larger class of problems as a functional approach to dynamic programming. He saw this as “DP without optimization”. This work fed into his seminal contributions to control theory and its application to real-world problems.
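The core of dynamic programming is Bellman's principle of optimality: the optimal value of a state equals the best immediate cost plus the optimal value of the state it leads to. A minimal sketch of this recurrence, on an invented shortest-path example (the graph, node names, and costs below are illustrative, not from the source):

```python
# Bellman's principle of optimality on a tiny deterministic
# shortest-path problem: V(s) = min over edges (s -> t, cost) of
# cost + V(t), with V(goal) = 0.
from functools import lru_cache

# Hypothetical example graph: node -> list of (successor, edge cost).
edges = {
    "A": [("B", 1), ("C", 4)],
    "B": [("C", 2), ("D", 6)],
    "C": [("D", 3)],
    "D": [],  # goal node
}

@lru_cache(maxsize=None)  # memoization: each state is solved once
def V(s):
    """Minimum total cost from state s to the goal 'D'."""
    if s == "D":
        return 0
    return min(cost + V(t) for t, cost in edges[s])

print(V("A"))  # A -> B -> C -> D with cost 1 + 2 + 3 = 6
```

The memoized recursion is exactly what makes DP tractable: overlapping subproblems (here, the cost-to-go from each node) are computed once and reused.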

Bellman grew concerned about the computational effectiveness of dynamic programming. He was an avid proponent of using computers and pursued the topic of artificial intelligence from a broad perspective. As his familiarity with computer science grew, so did his research on simulation. Bellman worked on bringing simulation to bear on rational decision making and human systems.

In his lifetime, Bellman received many honors for his contributions to dynamic programming and operations research. He was awarded the John von Neumann Theory Prize by the Operations Research Society of America and The Institute of Management Sciences and was elected to the National Academy of Engineering. The American Automatic Control Council established the Richard E. Bellman Control Heritage Award in his honor for distinguished contributions to control theory.

Other Biographies

Profiles in Operations Research: Richard E. Bellman

Wikipedia Entry for Richard E. Bellman

Dreyfus S. (2003) IFORS' Operational Research Hall of Fame Richard Ernst Bellman. International Transactions in Operational Research, 10(5): 543-545. (link)

Engineering and Technology History Wiki. Richard Bellman. Accessed April 9, 2015. (link)

Leondes C. T. (1980) Foreword: An Appreciation of Professor Richard Bellman. Journal of Optimization Theory and Applications, 32(4): 399-406. (link)

Bellman G. L., dir. (2013) The Bellman Equation. Motion Picture. Shami Media, Inc. (link)

University of St. Andrews School of Mathematical and Computer Sciences. Bellman Biography. Accessed April 9, 2014. (link)


Education

Brooklyn College, BA 1941

University of Wisconsin, MA 1943

Princeton University, PhD 1947 (Mathematics Genealogy)


Memoirs and Autobiographies


Bellman R. E. (1984) Eye of the Hurricane: An Autobiography. World Scientific: London. 

Dreyfus S. (2002) Richard Bellman on the Birth of Dynamic Programming (Excerpts from Eye of the Hurricane). Operations Research, 50(1): 48-51. (link)


Obituaries

New York Times (1984) Richard Bellman Dies; Coast Mathematician. March 24. (link)

Awards and Honors

AMS/SIAM Norbert Wiener Prize 1970

John von Neumann Theory Prize 1976

National Academy of Engineering Member 1977

IEEE Medal of Honor 1979

Selected Publications

Bellman R. E. & Blackwell D. (1949) Some two-person games involving bluffing. Proceedings of the National Academy of Sciences, 35: 600-605.

Bellman R. E. (1957) Dynamic Programming. Princeton University Press: Princeton, NJ.

Bellman R. E., Clark C., Craft C., Malcolm D., and Ricciardi F. (1957) On Top Management Simulation. American Management Association: New York.

Bellman R. E. (1961) Adaptive Control Processes: A Guided Tour. Princeton University Press: Princeton, NJ.

Bellman R. E. & Dreyfus S. (1962) Applied Dynamic Programming. Princeton University Press: Princeton, NJ.

Bellman R. E. & Kalaba R. (1963) Mathematical Trends in Control Theory. Dover Publications: New York.

Bellman R. E. (1965) Magic, Math and Mystery... RANDom News, September 1964: 12-17.

Bellman R. E. (1969) Methods of Nonlinear Analysis. Academic Press: New York.

Angel E. & Bellman R. E. (1972) Dynamic Programming and Partial Differential Equations. Academic Press: New York.

Bellman R. E. & Wing G. M. (1975) An Introduction to Invariant Imbedding. Wiley & Sons: New York.

Bellman R. E. (1978) Can Computers Think? An Introduction to Artificial Intelligence. Boyd & Fraser: San Francisco, CA.

Bellman R. E., Esogbue A. O., & Nabeshima I. (1982) Mathematical Aspects of Scheduling and Applications. A. Wheaton & Co: Exeter, UK. 

Additional Resources

American Automatic Control Council. Award Programs: Richard E. Bellman Control Heritage Award. Accessed April 9, 2015. (link)

Roth R. S., ed. (1986) The Bellman Continuum: A Collection of the Works of Richard E. Bellman. World Scientific Publishing Company: Philadelphia, PA.