The Chair of Naval Cyber Defence


A walk through the history of cryptology

Cryptology, from the Greek κρυπτός (hidden, secret) and λογία (study), is, in broad terms, the study of the means to protect information. It is the union of cryptography (the creative, constructive part) and cryptanalysis (the assessment and attack counterpart). The first recorded example of cryptography is the Caesar cipher, used by Julius Caesar during his conquests more than two thousand years ago. The cryptanalytic technique that breaks this cipher is called frequency analysis and dates back to the 9th century, with Al-Kindi's work. Since then, cryptologic work has been a cat-and-mouse game in which cryptographers build ever more clever ways to protect information and cryptanalysts aim at breaking them.
The overall protection objective is usually partitioned into three sub-objectives whose acronym, C.I.A., is easy to remember. C stands for confidentiality: an eavesdropper should not learn the content of a protected exchange; I for integrity: the communication should not be altered by errors or attacks; and A for availability: legitimate entities should be able to access the information. In cryptology, we call primitives the elementary building blocks that together form a complete protection solution. We have so far referred loosely to entities or an eavesdropper; in this work we use the term user(s) for the legitimate participants and attacker for the malicious, adverse party whose aim is to break the protection to some extent. From an application viewpoint, we wish to keep the extra cost induced by the cryptographic protection to a minimum. As we shall see in this work, this is not always a trivial objective.

From paper-and-pencil to quantum… and post-quantum!

Let us now quickly review the last two thousand years of outstanding advances in cryptology.
The paper-and-pencil era spans from the Caesar cipher to the German ADFGVX cipher of the First World War. Depending on the technique, we speak of a code or a cipher. The former replaces meaningful words by others that only the legitimate recipient can understand, while the latter modifies the message content into something scrambled using substitution or mixing. A famous example of a code is Verlaine's verse Les sanglots longs des violons de l'automne, broadcast by the BBC to the French Resistance during the Second World War. When the next verse, Blessent mon cœur d'une langueur monotone, was added, the message meant: « This is D-Day! ». The Caesar cipher is an example of substitution: each letter is replaced by the one three positions later in the alphabet, so A becomes D, B becomes E, and so on. Thus « This is D-Day » would be enciphered as WKLV LV G-GDB. Another example of a cipher, using mixing only, say swapping every two consecutive letters of the message, would change our message into « HTSI SI D-DYA ».
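As a toy illustration (not part of the original text), a few lines of Python reproduce the two transformations just described; the shift of three and the pair swapping are the only parameters, and the function names are ours.

```python
def caesar_encrypt(message: str, shift: int = 3) -> str:
    """Mono-alphabetical substitution: shift each letter `shift` positions forward."""
    out = []
    for c in message.upper():
        if c.isalpha():
            out.append(chr((ord(c) - ord("A") + shift) % 26 + ord("A")))
        else:
            out.append(c)  # spaces and punctuation are left untouched
    return "".join(out)


def swap_pairs(message: str) -> str:
    """Mixing (transposition): swap every two consecutive letters, ignoring non-letters."""
    letters = [c for c in message.upper() if c.isalpha()]
    for i in range(0, len(letters) - 1, 2):
        letters[i], letters[i + 1] = letters[i + 1], letters[i]
    it = iter(letters)
    # put the shuffled letters back, keeping spaces and punctuation in place
    return "".join(next(it) if c.isalpha() else c for c in message.upper())


print(caesar_encrypt("This is D-Day"))  # WKLV LV G-GDB
print(swap_pairs("This is D-Day"))      # HTSI SI D-DYA
```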
Of course, none of these toy techniques is secure by today's standards, yet each was state-of-the-art in its time. The Caesar cipher is called a mono-alphabetical substitution because encryption takes letter replacements from one unique alphabet (the regular alphabet shifted by three in our example above). Another famous cipher of that kind is the Polybius square, which places all the letters into a square and replaces each by the pair (row number, column number). These techniques were only broken by Al-Kindi's work on frequency analysis in the 9th century, when he realised that the most frequent letter in the original text remains the most frequent letter (or number) in the ciphertext. Thus, if you know the language of the original text, you can easily decrypt the message.
After mono-alphabetical substitution, Vigenère introduced a poly-alphabetical cipher: the Vigenère cipher. This time, the letters of the message are not all shifted by the same number of positions, but by a changing number of positions. For example, the first letter is shifted by eight positions, the second by twelve, the third by five, the next by eight again, and so on in a loop over the whole message. This way, two identical letters in the ciphertext do not necessarily come from the same letter in the cleartext, and analysing frequencies becomes more difficult: the attacker must now compute statistics on every n-th letter of the ciphertext, since in our example every third letter is shifted by the same number of positions (eight, twelve or five). The first challenge is therefore to find the number of shifts (i.e. the length of the key) before doing frequency analysis. The extreme version of this poly-alphabetical cipher, in which each letter is shifted by an unpredictable number of positions (no loop, true randomness), is called the Vernam cipher and is the only perfectly secure cipher ever built, as demonstrated by Shannon in his seminal work on information theory. The drawback of this cipher is that the key (the shifts to apply) is as long as the message, doubling the cost or time of the communication.
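To make the attack concrete, here is a small sketch (illustrative only, using the made-up key of shifts 8, 12 and 5 from the text): once the key length n is guessed, each "column" of the ciphertext is a mono-alphabetical substitution and can be attacked by ordinary frequency analysis.

```python
from collections import Counter


def vigenere_encrypt(message: str, shifts: list) -> str:
    """Poly-alphabetical substitution: the i-th letter is shifted by shifts[i % len(shifts)]."""
    out, i = [], 0
    for c in message.upper():
        if c.isalpha():
            out.append(chr((ord(c) - ord("A") + shifts[i % len(shifts)]) % 26 + ord("A")))
            i += 1
        else:
            out.append(c)
    return "".join(out)


def column_frequencies(ciphertext: str, key_length: int) -> list:
    """Guess the key length, then count letters in every key_length-th position:
    each column behaves like a Caesar cipher and leaks the language's statistics."""
    letters = [c for c in ciphertext if c.isalpha()]
    return [Counter(letters[i::key_length]) for i in range(key_length)]


cipher = vigenere_encrypt("This is D-Day", [8, 12, 5])
print(cipher)                         # BTNA UX L-PFG
print(column_frequencies(cipher, 3))  # per-column letter counts, ready for frequency analysis
```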


Lattice hard problems

The study of lattices by mathematicians goes far back in time, with sphere-packing analysis, but it is only recently that cryptologists became interested in them: first with Coppersmith [Cop96], who used lattices to break RSA under some assumptions, then with Ajtai [Ajt96], who opened the way to lattice-based cryptography by showing the existence of hard problems in lattices. We now review the landscape of these hard problems, which today underpin all the primitives of lattice-based cryptography, homomorphic encryption among them.

Shortest Vector Problem (SVP) and consorts

Shortest vector problems

The first (and easiest to state) of these problems is the Shortest Vector Problem (SVP). Given a basis B of a lattice L, it asks to find a shortest non-zero vector v in L, that is, a vector satisfying ‖v‖ = λ1(L). Note that, due to the group structure of lattices, the shortest vector is never unique: in every lattice, if v realises the first minimum, then −v does too. There are also lattices where several linearly independent vectors all have the shortest norm (e.g. the hexagonal lattice). The result of Ajtai [Ajt98] is that the exact version of SVP is NP-hard (under randomized reductions).
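As a minimal sketch (not an algorithm from this thesis, and only feasible in very small dimension), the non-uniqueness of the shortest vector can be checked by brute force: enumerate small integer combinations of a given basis and keep the non-zero vectors of minimal Euclidean norm. On the hexagonal lattice this returns six shortest vectors of norm 1, i.e. three ± pairs including two linearly independent directions.

```python
import itertools
import math


def shortest_vectors(basis, bound=3):
    """Brute-force search for the shortest non-zero lattice vectors, scanning integer
    coefficients in [-bound, bound]. Real attacks use lattice reduction instead."""
    dim = len(basis)
    best_norm, best = math.inf, []
    for coeffs in itertools.product(range(-bound, bound + 1), repeat=dim):
        if all(c == 0 for c in coeffs):
            continue  # the zero vector is excluded in the definition of lambda_1
        v = [sum(c * b[j] for c, b in zip(coeffs, basis)) for j in range(len(basis[0]))]
        n = math.sqrt(sum(x * x for x in v))
        if n < best_norm - 1e-12:
            best_norm, best = n, [v]
        elif abs(n - best_norm) < 1e-12:
            best.append(v)
    return best_norm, best


# Hexagonal lattice: basis (1, 0) and (1/2, sqrt(3)/2); lambda_1 = 1 is reached
# by six vectors (three +/- pairs), not just by a single +/- pair.
norm, vecs = shortest_vectors([[1.0, 0.0], [0.5, math.sqrt(3) / 2]])
print(norm, len(vecs))  # prints (approximately) 1.0 6
```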

Table of contents:

Résumé en français
1 Introduction 
1.1 The Chair of Naval Cyber Defence
1.2 A walk through the history of cryptology
1.2.1 What can crypto do for you?
1.2.2 From paper-and-pencil to quantum… and post-quantum!
1.3 This thesis
1.3.1 Outline
1.3.2 Roadmap
2 Preliminaries 
2.1 Notation
2.2 Some lattice theory
2.2.1 General definitions
2.2.2 Gram-Schmidt Orthogonalisation (GSO)
2.2.3 Fundamental results
2.3 Lattice hard problems
2.3.1 Shortest Vector Problem (SVP) and consorts
2.3.2 Learning with errors (LWE)
2.3.3 Learning with errors over rings (Ring-LWE)
2.3.4 Miscellaneous
2.4 Presentation of FHE
2.4.1 Homomorphic Encryption history (2008-2017)
2.4.2 Dimensioning constraints
2.4.3 One example: the Fan-Vercauteren scheme (FV)
3 Recipe for a new attack 
3.1 Review of the existing attacks
3.1.1 Against LWE
3.1.2 Against Ring-LWE
3.2 Mounting our own dedicated attack
3.2.1 Outline
3.2.2 Expanding on reduction
3.2.3 The actual decoding step
3.2.4 Launch the attack, for real
4 Contributions to lattice reduction 
4.1 Review of reduction algorithms
4.1.1 Size reduction
4.1.2 Lenstra-Lenstra-Lovász (LLL) reduction
4.1.3 Blockwise algorithms
4.1.4 Slide reduction
4.1.5 Performances and output qualities
4.2 Improving reduction algorithms
4.2.1 The fplll days
4.2.2 Proven CVP
4.2.3 BKZ 3
4.2.4 2-phases LLL
5 Comparing and using schemes correctly 
5.1 Review of the methods
5.2 Comparing FV and SHIELD
5.2.1 Scope
5.2.2 Unified presentation
5.2.3 Noise growth equations
5.2.4 Security
5.3 Parameters in perspective
5.3.1 Multiplicative depth for an arbitrary binary circuit
5.3.2 Multiplicative depth for an optimised circuit
5.3.3 The case of the Negative Wrapped Convolution
5.3.4 Parameters for batching
5.3.5 Keys and ciphertexts sizes
5.4 Smallest error is not always the best
5.5 Implementation performance comparison
6 Construction of a new scheme 
6.1 Introduction to the fourth generation
6.2 Additional preliminaries
6.2.1 Rings
6.2.2 Circulant LWE and reduction to Ring-LWE
6.2.3 LWE encryption
6.2.4 Simpler error distribution in CLWE for practice
6.3 Building the gate
6.3.1 Known building blocks
6.3.2 New building blocks
6.3.3 Joining the building blocks
6.3.4 Heuristic error propagation
6.4 Implementation
6.4.1 Data representation
6.4.2 Tweaking the parameters
6.4.3 Performances
Conclusion
7 Closing thoughts 
Appendices
Disseminations
List of Figures
List of Tables
List of Algorithms
Bibliography

