Knapsack Problem

The knapsack problem is a problem in combinatorial optimisation.

Imagine you are going to visit your friends, for whom you have bought lots of presents. So many, in fact, that you can’t fit them all in your luggage, meaning you must leave some behind. If we decide to pack the combination of presents with the highest value that doesn’t exceed the weight limit imposed by the airline, how can we find such a combination? This is the Knapsack Problem.



We could solve this by trying all the possible combinations, calculating their value and weight, and then picking the one that is below the weight limit but maximises the value. Whilst this is okay when we are dealing with a small number of items, it is simply infeasible when the number of items is large, as the number of combinations grows exponentially. So, is there an algorithm which works for any number of items and doesn’t take too long?
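The brute-force approach can be sketched in a few lines of Python. (The present values and weights below are made-up examples, not anything from a real airline.)

```python
from itertools import combinations

def best_packing(items, limit):
    """Brute force: try every subset of presents (all 2^n of them).

    items: list of (value, weight) pairs; limit: the weight limit.
    Returns (best_value, best_subset).
    """
    best_value, best_subset = 0, ()
    for r in range(len(items) + 1):
        for subset in combinations(items, r):
            weight = sum(w for _, w in subset)
            value = sum(v for v, _ in subset)
            if weight <= limit and value > best_value:
                best_value, best_subset = value, subset
    return best_value, best_subset

# Three presents as (value, weight), weight limit 50:
# best_packing([(60, 10), (100, 20), (120, 30)], 50)
```

With n items there are 2^n subsets to try, which is exactly the exponential blow-up described above.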

Computer scientists have a way of measuring the complexity of a problem by how fast the time taken grows as the size of the input increases. Polynomial growth, i.e. if the size of the input is n then the time taken grows like n^k for some constant k, describes an ‘easy’ problem; the growth is nowhere near as “explosive” as exponential growth, e.g. 2^n.

Whether or not a polynomial time algorithm exists to solve the knapsack problem is unknown; it belongs to a class of problems called NP-Hard Problems.
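That said, the knapsack problem does admit a well-known dynamic programming solution running in O(n × capacity) time. This is called pseudo-polynomial: the running time depends on the numeric value of the capacity, not just on the number of items, so it doesn’t contradict NP-hardness. A minimal sketch (the present list is a made-up example):

```python
def knapsack(items, capacity):
    """0/1 knapsack via dynamic programming.

    items: list of (value, weight) pairs; capacity: max total weight.
    best[w] holds the maximum value achievable within weight w.
    """
    best = [0] * (capacity + 1)
    for value, weight in items:
        # Iterate weights downwards so each item is used at most once.
        for w in range(capacity, weight - 1, -1):
            best[w] = max(best[w], best[w - weight] + value)
    return best[capacity]

# presents = [(60, 10), (100, 20), (120, 30)]   # (value, weight)
# knapsack(presents, 50)
```

For the three presents above and a limit of 50, the best packing takes the second and third presents for a total value of 220.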

Complexity Classes

  • Class P: problems that can be solved in polynomial time
  • Class NP: problems whose potential solutions can be checked in polynomial time
    • NP-Complete: the hardest problems within the NP class
    • NP-Hard: at least as hard as the hardest problems in NP.

It is possible that the NP class = P class, though this is still unknown (and is one of the Millennium Prize Problems).

M x

P.S. My posts may be more sporadic in the next month because I’m in exam term!


The Josephus Problem

The Josephus Problem is a problem in computer science based on a counting-out game. The problem is named after Flavius Josephus, a Jewish-Roman historian from the 1st century, who tells the story like this:

“A company of 40 soldiers, along with Josephus himself, were trapped in a cave by Roman soldiers during the Siege of Yodfat in 67 A.D. The Jewish soldiers chose to die rather than surrender, so they devised a system to kill off each other until only one person remained. (That last person would be the only one required to die by their own hand.)

All 41 people stood in a circle. The first soldier killed the man to his left, the next surviving soldier killed the man to his left, and so on. Josephus was among the last two men standing, “whether we must say it happened so by chance, or whether by the providence of God,” and he convinced the other survivor to surrender rather than die.”

This story gives rise to an interesting maths problem: If you’re in a similar situation to Josephus, how do you know where to stand so you will be the last man standing?
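There is a classic recurrence for this: if J(m) is the survivor’s position (0-indexed) in a circle of m people when every k-th person is eliminated, then J(1) = 0 and J(m) = (J(m − 1) + k) mod m. A minimal Python sketch:

```python
def josephus_survivor(n, k=2):
    """1-indexed position of the last survivor when every k-th person
    in a circle of n people is eliminated."""
    pos = 0                        # survivor's index in a circle of 1 person
    for size in range(2, n + 1):   # grow the circle one person at a time
        pos = (pos + k) % size     # recurrence J(m) = (J(m-1) + k) mod m
    return pos + 1
```

For the situation in the story, where every second man is killed (k = 2) in a circle of 41, this gives position 19, so Josephus should have stood in the 19th position.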

M x

The Halting Problem

To understand the process of computation, regardless of the technology available, we need to reduce it to its most basic elements. One way to model computation is to consider an input, operated on by a program which produces some output.

In 1936, Alan Turing constructed such a model and considered the question of determining whether a program will finish running (i.e. halt) or will continue to run forever. This is known as the Halting Problem.

Let us suppose that we write a program, let’s call it HALT, that, given another program and its associated input, can determine whether running that program with the input will halt or not. This can be represented in the following way:

\[ \textbf{HALT}(\text{program}, \text{input}) = \begin{cases} \text{Output YES,} & \text{if the program does halt on the input,} \\ \text{Output NO,} & \text{otherwise.} \end{cases} \]

Note that we do not have to consider how this program works; it is sufficient to assume that such a program exists. Now, consider a new program, call it OPPOSITE, which does the opposite of HALT. In other words, it analyses programs running with themselves as the input, acting in the following way:

\[ \textbf{OPPOSITE}(\text{program}) = \begin{cases} \text{Halt,} & \text{if } \textbf{HALT}(\text{program}, \text{program}) \text{ returns NO,} \\ \text{Loop forever,} & \text{otherwise.} \end{cases} \]

Now, what happens when we run OPPOSITE on itself?

If OPPOSITE(OPPOSITE) halts, then it must loop forever; but if it loops forever, then it must halt. This is a contradiction, showing that the program HALT cannot exist.
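The argument can be written out as a Python sketch. The halt function below is hypothetical: it stands in for the decider we assumed exists, and Turing’s point is precisely that its body cannot be filled in.

```python
def halt(program, data):
    """Hypothetical decider: returns True iff program(data) eventually stops.
    No implementation can exist; that is exactly Turing's result."""
    raise NotImplementedError("no such program can exist")

def opposite(program):
    """Do the reverse of whatever HALT predicts for program run on itself."""
    if halt(program, program):
        while True:      # HALT said it stops, so loop forever instead
            pass
    else:
        return           # HALT said it loops forever, so halt immediately

# Running opposite(opposite) forces halt into a contradiction:
# if halt answers True, opposite loops forever; if it answers False,
# opposite halts. Either way, halt's answer about opposite is wrong.
```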

This was an incredible result. Firstly, in this proof a computer and a program were mathematically defined, paving the way to the Turing Machine. Furthermore, Turing had found a program whose existence is logically impossible, and so showed that knowing whether a program halts on some input is undecidable.

“It is one of the first examples of a decision problem proved to be unsolvable.”

I have included a video by Computerphile which I think describes the problem and proof very well:

M x

NEWS: World’s Largest Proof

Recently, a trio of mathematicians – Marijn Heule from the University of Texas, Victor Marek from the University of Kentucky, and Oliver Kullmann from Swansea University – solved a problem in mathematics whose solution takes up 200 terabytes of plain text (just consider the fact that 1 terabyte can hold 337,920 copies of War and Peace)! This breaks the previous record of a 13-gigabyte proof, which was published in 2014.

The mathematics problem is named the ‘Boolean Pythagorean Triples problem’, and was posed by Ronald Graham in the 1980s, who offered a $100 prize for the solution.

The problem is part of Ramsey theory and asks:

“Is it possible to colour all the integers either red or blue so that no Pythagorean triple of integers a, b, c, satisfying a^2+b^2=c^2, is all the same colour? For example, if you coloured a and b red and c blue, this particular triple would not be monochromatic, but every triple would have to be tested.”

Andrew Moseman from Popular Mechanics details how:

“What makes it so hard is that one integer can be part of multiple Pythagorean triples. Take 5. So 3, 4, and 5 are a Pythagorean triple. But so are 5, 12, and 13. If 5 is blue in the first example, then it has to be blue in the second, meaning either 12 or 13 has to be red.”

The proof found that it is only possible to colour the numbers in such a way up to the number 7,824, and that more than 10^2,300 such colourings exist. Hence, there is no solution to the problem. The proof took a supercomputer two days to produce, and generated 200TB of data!
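To make the problem concrete, here is a small (and entirely unoptimised) Python sketch that checks whether a given red/blue colouring avoids monochromatic Pythagorean triples. The colourings tested are made-up examples; the real proof of course required a SAT solver and a supercomputer, not anything like this.

```python
def pythagorean_triples(n):
    """All triples (a, b, c) with a < b < c <= n and a^2 + b^2 = c^2."""
    return [(a, b, c)
            for a in range(1, n + 1)
            for b in range(a + 1, n + 1)
            for c in range(b + 1, n + 1)
            if a * a + b * b == c * c]

def is_valid_colouring(colour, n):
    """colour maps each integer 1..n to 'red' or 'blue'.
    Valid iff no Pythagorean triple up to n is all one colour."""
    return all(len({colour[a], colour[b], colour[c]}) > 1
               for a, b, c in pythagorean_triples(n))

# Up to 5 the only triple is (3, 4, 5), so colouring 5 blue
# and everything else red works; colouring everything red does not.
```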

The paper describing the proof was published on arXiv on the 3rd of May 2016.

Although the computer has proved that the colouring is impossible, it has not provided the underlying reason why, or explored why the number 7,824 is important. This highlights a common objection to computer-assisted proofs: yes, they may be correct, but to what extent are they mathematics?

Sources: 1 | 2

Let me know what you think of computer assisted proofs below! M x

Boolean Algebra

Today I thought I would give you a short introduction on Boolean Algebra.


George Boole

Boolean Algebra was named after the English mathematician, George Boole (1815 – 1864), who established a system of logic which is used in computers today. In Boolean algebra there are only two possible outcomes: either 1 or 0.

It must be noted that Boolean numbers are not the same as binary numbers: Boolean numbers represent a different system of mathematics from real numbers, whereas binary numbers are simply an alternative notation for real numbers.


Boolean logic statements can only ever be true or false, and the words ‘AND’, ‘OR’ and ‘NOT’ are used to string these statements together.

OR can be rewritten as a kind of addition:

0 + 0 = 0 (since “false OR false” is false)
1 + 0 = 0 + 1 = 1 (since “true OR false” and “false OR true” are both true)
1 + 1 = 1 (since “true OR true” is true)

OR is denoted by the symbol ∨, so ‘A OR B’ is written A ∨ B.

AND can be rewritten as a kind of multiplication:

0 x 1 = 1 x 0 = 0 (since “false AND true” and “true AND false” are both false)
0 x 0 = 0 (since “false AND false” is false)
1 x 1 = 1 (since “true AND true” is true)

AND is denoted by the symbol ∧, so ‘A AND B’ is written A ∧ B.

NOT can be defined as the complement:

If A = 1, then NOT A = 0
If A = 0, then NOT A = 1
A + NOT A = 1 (since “true OR false” is true)
A x NOT A = 0 (since “true AND false” is false)

NOT is denoted by the symbol ¬ (or by a bar over the variable), so ‘NOT A’ is written ¬A.

Venn diagrams of these operations | Source: Wikipedia

Expressions in Boolean algebra can be easily simplified. For example, the variable B in A + A x B is irrelevant as, no matter what value B has, if A is true, then A OR (A AND B) is true. Hence:

A + A x B = A

Furthermore, in Boolean algebra there is a sort of reverse duality between addition and multiplication, depicted by de Morgan’s Laws:

(A + B)’ = A’ x B’ and (A x B)’ = A’ + B’
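These identities are easy to verify mechanically: since every variable only takes the values 0 and 1, we can simply check every assignment. A short Python sketch, using | for OR, & for AND, and 1 - x for NOT:

```python
from itertools import product

def holds(identity, n_vars=2):
    """True iff the identity holds for every 0/1 assignment of its variables."""
    return all(identity(*bits) for bits in product([0, 1], repeat=n_vars))

def NOT(x):
    return 1 - x

# Absorption: A + A x B = A
assert holds(lambda a, b: (a | (a & b)) == a)

# De Morgan's Laws: (A + B)' = A' x B'  and  (A x B)' = A' + B'
assert holds(lambda a, b: NOT(a | b) == (NOT(a) & NOT(b)))
assert holds(lambda a, b: NOT(a & b) == (NOT(a) | NOT(b)))
```

Exhaustive checking like this only works because the truth table is finite, which is exactly what makes Boolean algebra so well suited to machine verification.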



In 1938, Shannon proved that Boolean algebra (using the two values 1 and 0) can describe the operation of a two-valued electrical switching circuit. Thus, in modern times, Boolean algebra is indispensable in the design of computer chips and integrated circuits.

Sources: 1 | 2 | 3

Turing Machine

The Turing Machine was first described by the English mathematician Alan Turing in his 1936 paper ‘On Computable Numbers, with an Application to the Entscheidungsproblem’; he called it an ‘a-machine’ and presented it as a thought experiment.

What is a Turing Machine?

A Turing Machine is a theoretical computing machine with an infinitely long tape upon which it manipulates symbols according to the mathematical model that defines it.

This machine is made up of:

  • A line of cells (“tape”) that can move back and forth
  • An active element (“head”) that has a property (“state”) and that can change the property (“colour”) of the active cell underneath it
  • A set of instructions for how the head should modify the active cell and move the tape.

The machine follows a finite table of instructions (finite because the table represents an algorithm). Depending on the state of the head and the symbol in the active cell, the machine:

  1. Writes a symbol (eg. a number) in the cell
  2. Either moves the tape one cell left or right
  3. Either continues or stops the computation.
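These three steps are enough to build a tiny simulator. A minimal Python sketch follows; the rule-table format and the bit-inverting example machine are my own illustrative choices, not anything from Turing’s paper:

```python
def run_turing_machine(rules, tape, state="start", blank="_", max_steps=10_000):
    """Simulate a one-tape Turing machine.

    rules maps (state, symbol) -> (new_symbol, move, new_state), where
    move is -1 (left), +1 (right) or 0 (stay). The machine stops on
    reaching the state "halt" (or after max_steps, since in general we
    cannot know whether it will ever halt!).
    """
    cells = dict(enumerate(tape))   # sparse tape: index -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        new_symbol, move, state = rules[(state, symbol)]
        cells[head] = new_symbol
        head += move
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# Example machine: invert every bit, halting at the first blank cell.
invert = {
    ("start", "0"): ("1", +1, "start"),
    ("start", "1"): ("0", +1, "start"),
    ("start", "_"): ("_", 0, "halt"),
}
```

Running the invert machine on the tape "0110" leaves "1001" behind, reading and rewriting one cell per step exactly as described above.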


With this hypothetical machine, Turing was able to prove properties of computation, and in particular, the uncomputability of Hilbert’s Entscheidungsproblem, also known as the decision problem.

Universal Turing Machine

Alan Turing was able to show that:

  • if there is a table U such that the head of a Turing machine is programmed in accordance with U;
  • and if any table P is translated and written out on the machine’s tape;
  • then the machine will behave as if its head had been programmed in accordance with P.

A Universal Turing Machine is a Turing machine whose head has been programmed in accordance with U.

There are many distinct Universal Machines, all with equivalent computational power, due to the fact that there are many tables that will act like U.

Turing Machines to Computers

The model of the Universal Turing Machine is considered to be the origin of the stored program computer (stores program instructions in electronic memory), which was used by John von Neumann in 1946 for the von Neumann architecture.

However, Turing’s greatest contributions to the development of the digital computer were his establishment of the idea of controlling the function of a computing machine by storing a program of instructions in the machine’s memory, and his proof that a single, universal machine can carry out every computation.

Sources: 1 | 2 | 3


Ada Lovelace

Ada Lovelace, born December 10th 1815 in England, was a mathematician who is considered to have written the first computer program.

Her mother, who had a passion for mathematics, raised Lovelace with an excellent education in all subjects, particularly maths. At that time, it was extremely rare for a mother to provide such an extensive education, particularly a scientific one, to a daughter, as women weren’t allowed to go to university or join learned societies.

Lovelace excelled at mathematics and was encouraged by her tutor Mary Somerville, who was another notable female mathematician. At age 17, Lovelace met Charles Babbage, known as ‘the father of computers’, and they quickly became lifelong friends. Babbage described her as

that Enchantress who has thrown her magical spell around the most abstract of Sciences and has grasped it with a force which few masculine intellects could have exerted over it

The Analytical Engine

The Analytical Engine was a machine that would be able to store data and perform sequences of instructions defined on punch cards and fed into the machine. Although it was never built, the Analytical engine was a direct forerunner of the computers we use today.


Drawings of a part of the Analytical Engine

Lovelace became deeply intrigued with this machine and when, in 1842, she was asked to translate an article describing the Analytical Engine by the Italian mathematician Luigi Menabrea, Babbage asked her to expand the article “as she understood the machine so well”. The translated article became three times longer and Lovelace added a description of a method for the engine to repeat a series of instructions – a process known as looping, which is used in computer programs today. Due to this, she is often referred to as the ‘first computer programmer’.

Ada Lovelace sadly died of cancer aged 36, a few years after her groundbreaking publication of “Sketch of the Analytical Engine, with Notes from the Translator”. Although her contributions to computer science were not discovered until the 1950s, long after their publication, her passion and vision for technology has made her a powerful symbol for women in science.

Sources: 1 | 2 | 3 | 4

Hope you enjoyed reading about a very important woman in our mathematical history! Would you be interested in more posts about the field of computer science? M x