### Monte Carlo Method

At the Los Alamos Scientific Laboratory, John von Neumann, Stan Ulam and Nick Metropolis created the Metropolis algorithm, also known as the Monte Carlo method. This algorithm obtains approximate solutions to numerical problems that have an unmanageable number of degrees of freedom, and to combinatorial problems of factorial size, by mimicking a random process.

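As a minimal illustration of the idea (a toy sketch, not the original Metropolis algorithm), random sampling can approximate a quantity that is awkward to compute directly – here, π, from the fraction of random points that land inside a quarter circle:

```python
import random

def estimate_pi(n_samples=200_000, seed=42):
    """Estimate pi by sampling points uniformly in the unit square.

    The fraction of points landing inside the quarter circle
    x^2 + y^2 <= 1 approaches pi/4 as the number of samples grows.
    """
    rng = random.Random(seed)
    inside = sum(
        1 for _ in range(n_samples)
        if rng.random() ** 2 + rng.random() ** 2 <= 1.0
    )
    return 4 * inside / n_samples
```

The estimate improves only slowly (the error shrinks like 1/√n), but the method is indifferent to how many dimensions the underlying problem has – which is exactly why it shines where deterministic methods become unmanageable.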

### Simplex Method

In 1947, George Dantzig created the simplex method for linear programming. Linear programming came to dominate the world of industry, where “*economic survival depends on the ability to optimise within budgetary and other constraints*“. Its widespread application makes Dantzig’s algorithm one of the most successful of all time.

The simplex method is an elegant way of arriving at optimal answers, and in practice it is highly efficient.
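To make the idea concrete, here is a toy dense-tableau sketch of the simplex method (an illustrative implementation, not Dantzig’s original formulation) for problems of the form: maximise c·x subject to Ax ≤ b, with b ≥ 0 and x ≥ 0:

```python
def simplex(c, A, b):
    """Maximise c.x subject to A x <= b, x >= 0, assuming b >= 0."""
    m, n = len(A), len(c)
    # Build the tableau: constraint rows get slack variables, last column is the RHS.
    T = [row[:] + [0.0] * m + [float(bi)] for row, bi in zip(A, b)]
    for i in range(m):
        T[i][n + i] = 1.0
    T.append([-float(ci) for ci in c] + [0.0] * (m + 1))  # objective row
    basis = list(range(n, n + m))  # slacks start in the basis
    while True:
        # Entering variable: most negative objective-row coefficient.
        j = min(range(n + m), key=lambda k: T[-1][k])
        if T[-1][j] >= -1e-9:
            break  # no improving direction: current vertex is optimal
        # Ratio test picks the leaving row (first constraint to bind).
        ratios = [(T[i][-1] / T[i][j], i) for i in range(m) if T[i][j] > 1e-9]
        if not ratios:
            raise ValueError("problem is unbounded")
        _, r = min(ratios)
        piv = T[r][j]
        T[r] = [v / piv for v in T[r]]
        for i in range(m + 1):
            if i != r:
                f = T[i][j]
                T[i] = [vi - f * vr for vi, vr in zip(T[i], T[r])]
        basis[r] = j
    x = [0.0] * n
    for i, bi in enumerate(basis):
        if bi < n:
            x[bi] = T[i][-1]
    return x, T[-1][-1]
```

For example, maximising 3x + 5y subject to x ≤ 4, 2y ≤ 12 and 3x + 2y ≤ 18 gives the optimum x = 2, y = 6 with value 36. Geometrically, each pivot walks from one vertex of the feasible polytope to an adjacent, better one.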

### Krylov Subspace Iteration Methods

The development of Krylov subspace iteration methods was initiated in 1950 by Magnus Hestenes, Eduard Stiefel and Cornelius Lanczos of the Institute for Numerical Analysis at the National Bureau of Standards. They address the seemingly simple task of solving the equation *Ax* = *b*. When *A* is a massive *n* × *n* matrix, however, the algebraic answer *x* = *A*⁻¹*b* is **not** easy to compute!

So iterative methods were introduced, which repeatedly solve equations of the form *Kx*<sub>*i*+1</sub> = *Kx*<sub>*i*</sub> + *b* – *Ax*<sub>*i*</sub> for a simpler matrix *K*. This led to the study of Krylov subspaces, named after the Russian mathematician Nikolai Krylov. These subspaces are spanned by powers of a matrix applied to an initial residual vector *r*<sub>0</sub> = *b* – *Ax*<sub>0</sub>.
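As a small worked example (assuming NumPy; the matrix and right-hand side here are made up for illustration), choosing the classical Jacobi splitting *K* = diag(*A*) turns the iteration above into an explicit update:

```python
import numpy as np

def jacobi(A, b, iters=100):
    """Jacobi iteration: one instance of K x_{i+1} = K x_i + b - A x_i.

    With K = diag(A), each step is x_{i+1} = x_i + K^{-1} (b - A x_i),
    repeatedly pushing the residual b - A x toward zero.
    """
    K_inv = np.diag(1.0 / np.diag(A))
    x = np.zeros_like(b, dtype=float)
    for _ in range(iters):
        x = x + K_inv @ (b - A @ x)
    return x
```

Each step needs only a matrix–vector product with *A*, never an inverse of *A* itself – though convergence is only guaranteed for well-behaved matrices (e.g. diagonally dominant ones).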

Lanczos found a way to generate an orthogonal basis for such a subspace when the matrix is symmetric, and then Hestenes and Stiefel proposed an even better method – the conjugate gradient method – for systems that are both symmetric and positive definite.
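A compact sketch of the conjugate gradient method (assuming NumPy; a textbook-style rendering, not the authors’ original presentation) shows how little machinery it needs:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10):
    """Solve A x = b for symmetric positive definite A.

    Each step minimises the error over a growing Krylov subspace,
    so in exact arithmetic it terminates in at most n steps.
    """
    x = np.zeros_like(b, dtype=float)
    r = b - A @ x          # initial residual r0
    p = r.copy()           # first search direction
    rs_old = r @ r
    for _ in range(len(b)):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)      # optimal step length along p
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p  # keep directions A-conjugate
        rs_old = rs_new
    return x
```

Like all Krylov methods, it touches *A* only through matrix–vector products, which is what makes it practical for the enormous (often sparse) systems the direct formula *x* = *A*⁻¹*b* cannot handle.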

### Fortran Optimising Compiler

Developed in 1957 by a team at IBM led by John Backus, the Fortran optimising compiler is said to be one of the most important innovations in the history of computer programming. After its development, scientists could tell a computer what they wanted it to do without having to “*descend into the netherworld of machine code*“.

Fortran I consisted of 23,500 assembly-language instructions, and although this is not a lot compared to modern compilers, it was capable of a great number of sophisticated computations.

The compiler, Backus recalled, “*produced code of such efficiency that its output would startle the programmers who studied it.*”

Part 2 coming soon! M x

