🥳 JavaScript Database (JSDB)¹ version 7.0.0 released
Breaking change: the JSTable.PERSIST event now uses a parameter object with properties for type, keypath, value, change, and table. This should make listening for events on your databases much nicer to author. E.g., here's a snippet from Catalyst² I'm working on:
```javascript
const settingsTable = db.settings['__table__']
const JSTable = settingsTable.constructor

settingsTable.addListener(JSTable.PERSIST, ({ keypath, value }) => {
  switch (keypath) {
    case 'servers.serverPoolSize':
      console.info('New server pool size requested', value)
      this.updateServerPool()
      break
    // etc.
  }
})
```
This new version of JSDB is not in the latest Kitten³ yet as it is a breaking change and I want to make sure I update my sites/apps first if needed. I should have it integrated tomorrow.
To see the simple use case for JSDB in Kitten (the default untyped database that’s easy to get started with and perfect for quick experiments, little sites, etc.), see: https://kitten.small-web.org/tutorials/persistence/
A quiet, cinematic street café frozen in monochrome. Two men sit high on bar stools at a small round table center stage; one leans forward, the other turns slightly toward him — their conversation looks intimate and absorbed. The nearer figure wears sunglasses that catch a bright highlight, a light collar against a darker jacket. On the tabletop are the small rituals of café life: a cup, a bottle, ashtray or small bowl, the casual clutter that signals a pause in the day.
The photographer frames them with strong side light: faces and hands are picked out in sharp relief while the background slips into shadow. Behind and around them, other patrons are dimmer shapes—an older woman seated lower, a chalkboard menu leaning against the wall — which grounds the scene in a public but private moment. Textures are tactile: the grain of the pavement, the smooth shine of glass, the soft folds of coats come through more insistently because the image is black and white.
Symbolically it reads as a study of everyday intimacy and urban ritual — a small island of connection amid the anonymity of street life. Artistically, the picture uses high contrast and selective lighting to create depth and focus, turning a common café encounter into a quiet tableau about conversation, observation, and the small performances of social life.
Imagine, just for a moment, that a mathematical breakthrough had occurred on the eve of the Second World War. Perhaps Turing or Rejewski or Driscoll realised that prime number theory held the key to unbreakable encryption. This blog post attempts to answer the question "could public-key cryptography have been used in 1939?"
Let's briefly step back into history.
The Enigma machine represented the most powerful form of convenient cryptography available in the early 20th century. There were only two practical ways to crack its encryption.
Capture a codebook with the encryption keys printed in it[0](#fn:u571 "As seen in the historically 'accurate' film U571.").
Literally invent the computer[1](#fn:game "As seen in the historically 'accurate' film The Imitation Game.").
The basis of Enigma is conceptually simple. You want to write a character. You follow a complex sequence of instructions and get the resulting encrypted character. You want to write the next character, so you follow the same instructions - but typing the first character has changed the sequence of instructions for the next one. And so on. Every character you type changes the algorithm for the next character. Fiendish!
You could encrypt an Enigma message by hand. But it would be tiresome, error-prone, and take ages. So a machine was invented to do the hard work. A series of cogs and wheels and wires and lights.
One of the weaknesses of Enigma is that it used symmetric encryption. The password used to scramble the message was the same as the one used to descramble it. Each day the codes changed, so they were printed in a handy codebook which was distributed to each operator. If someone captured the codebook, they could decrypt all sent and received messages.
Decades after the war, asymmetric cryptography was invented[2](#fn:history "By either the Brits or the Americans, depending on whose history you think is accurate."). The magic of asymmetric encryption is that it allows you to have one password to scramble the message and a totally different one to unscramble it. This completely removes the danger posed by your codebooks being discovered; you can have a "public key" for encryption. Anyone with that key can encrypt a message, but not decrypt it. You have a private key for decryption which you guard with your life.
Asymmetric encryption powers the modern world. It is made possible by high-speed computer chips which can precisely perform mind-boggling calculations in microseconds.
Let us slip into an alternate timeline. The mathematics behind asymmetric encryption are conceptually simple - even if they are exceedingly difficult to execute without a computer. If the mathematicians of the day had made the necessary intellectual breakthroughs, could public-key encryption have worked in WW2?
I'm going to work through the following questions to prove that it was just about possible[3].
Could a public / private keypair have been calculated in the 1930s?
Is it possible to use paper-and-pencil to encrypt a message using a very short public key?
Would it have been possible to build a machine to encrypt using longer public keys?
What key length would have prevented the private key being cracked by brute-force?
The original machines used to crack Enigma used brute-force; trying every possible combination until they discovered the right one. That's not strictly true - a large part of cryptanalysis was understanding the statistics behind the encryption algorithm, the likely content of messages, and common phrases that they contained.
Modern encryption algorithms are resistant to most of those statistical attacks. So the only feasible method of cracking a private key is by trying each combination sequentially.
Let's suppose there's a very short private key - for example just 4 bits long. There are 2⁴ possible combinations; 16 in total. It seems reasonable to suppose that, if the message can be easily decrypted by the intended recipient, it could easily be cracked by someone able to try 16 different key combinations.
For every bit of length added to the key, the number of combinations doubles. 2⁴ = 16. 2⁵ = 32. 2⁶ = 64. Once you get to 2³², you're at over 4 billion combinations[4]. Trying one combination per second would take over 120 years to complete.
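To put rough numbers on that, here's a quick back-of-the-envelope sketch in modern JavaScript (the variable names are mine, purely for illustration):

```javascript
// Key space and worst-case brute-force time for a 32-bit key,
// assuming one guess per second.
const bits = 32;
const combinations = 2 ** bits;                          // 4,294,967,296
const secondsPerYear = 60 * 60 * 24 * 365;
console.log(combinations);                               // 4294967296
console.log(Math.round(combinations / secondsPerYear));  // ≈ 136 years
```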
Of course, manually using a 32 bit key might be too complex for the technology of the day. So a shorter key might be easier to use while still retaining sufficient strength. How difficult is it to manually encipher and decipher messages with short keys?
Briefly and incorrectly put, keys are based on prime factors.
Multiply these two prime numbers: 29 and 113.
You can easily do that on paper or on a pocket calculator. It is trivial. But suppose I asked you to reverse that sort of calculation: find out which two prime numbers multiply together to give 40,133. That's much harder. For larger numbers, it is even harder than you think.
Let's generate a keypair with two small prime numbers, 17 and 61. They are sufficiently small for the calculations to be done by hand[7].
𝒏 = 17 × 61 = 1037
ϕ(𝒏) = 16 × 60 = 960
𝒆 = 77 (chosen randomly; it must be co-prime with ϕ(𝒏))
𝒅 = 773 (Calculated using the Extended Euclidean Algorithm, which was first described in 1740)
Here's how calculating 𝒅 works. This isn't intended to be a complete explanation of how the algorithm works, but it is sufficient to show that generating keypairs would have been well within the grasp of mathematicians of the 1930s. Feel free to skip to the next section.
| Step | rᵢ₋₂ | rᵢ₋₁ | qᵢ = ⌊rᵢ₋₂ ÷ rᵢ₋₁⌋ | rᵢ = rᵢ₋₂ − qᵢ × rᵢ₋₁ | xᵢ = xᵢ₋₂ − qᵢ × xᵢ₋₁ | yᵢ = yᵢ₋₂ − qᵢ × yᵢ₋₁ |
|---|---|---|---|---|---|---|
| 0 | - | - | - | 960 | 1 | 0 |
| 1 | - | - | - | 77 | 0 | 1 |
| 2 | 960 | 77 | 960 ÷ 77 = 12 | 960 − 12 × 77 = 36 | 1 − 12 × 0 = 1 | 0 − 12 × 1 = −12 |
| 3 | 77 | 36 | 77 ÷ 36 = 2 | 77 − 2 × 36 = 5 | 0 − 2 × 1 = −2 | 1 − 2 × (−12) = 25 |
| 4 | 36 | 5 | 36 ÷ 5 = 7 | 36 − 7 × 5 = 1 | 1 − 7 × (−2) = 15 | −12 − 7 × 25 = −187 |
| 5 | 5 | 1 | 5 ÷ 1 = 5 | 5 − 5 × 1 = 0 | −2 − 5 × 15 = −77 | 25 − 5 × (−187) = 960 |
From Step 4, we have: 1 = 15 × 960 + (−187) × 77
We are interested in the coefficient of 77, which is −187. This means −187 × 77 ≡ 1 (mod 960)
Because we need a positive value for 𝒅, we add ϕ(𝒏) to −187: 𝒅 = 960 − 187 = 773.
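If you'd rather check that working mechanically, here's a minimal sketch of the same calculation in modern JavaScript (the function name modInverse is mine, purely illustrative):

```javascript
// Extended Euclidean algorithm: find d such that (e × d) % phi === 1.
function modInverse(e, phi) {
  let [oldR, r] = [phi, e];
  let [oldY, y] = [0, 1];   // running coefficients of e (the y column above)
  while (r !== 0) {
    const q = Math.floor(oldR / r);
    [oldR, r] = [r, oldR - q * r];
    [oldY, y] = [y, oldY - q * y];
  }
  // oldY is the coefficient of e; add phi if it is negative.
  return ((oldY % phi) + phi) % phi;
}

console.log(modInverse(77, 960)); // 773
```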
If we have a plaintext message (𝒎), we calculate the encrypted version (𝒄) with the formula 𝒄 = 𝒎ᵉ % 𝒏
Let's suppose our message starts "HELLO". We'll give every letter a number. Our message starts with H - the eighth letter of the alphabet.
8⁷⁷ % 1037 = 638
You can do that today on any pocket calculator. But could a competent mathematician calculate that by hand?
Exponentials get very large very quickly. There are shortcuts, like modular exponentiation by repeated squaring, but it is still a fairly manual process. Doable, but not pleasant.
With enough time, you could manually encrypt a message like HELLO from 8 5 12 12 15 to 638 768 388 388 835.
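Here's a sketch of that letter-by-letter encryption in modern JavaScript (BigInt arithmetic - obviously nothing a 1939 operator would have had, but handy for checking the numbers above):

```javascript
// Toy RSA values from the worked example: n = 17 × 61, public exponent e = 77.
const n = 1037n;
const e = 77n;

// Square-and-multiply modular exponentiation: base^exp % mod.
function modPow(base, exp, mod) {
  let result = 1n;
  base %= mod;
  while (exp > 0n) {
    if (exp & 1n) result = (result * base) % mod; // multiply step
    base = (base * base) % mod;                   // square step
    exp >>= 1n;
  }
  return result;
}

// "HELLO", with A = 1 … Z = 26.
const letters = [8n, 5n, 12n, 12n, 15n];
console.log(letters.map(m => modPow(m, e, n)).join(' ')); // 638 768 388 388 835
```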
In the toy example above, we turned 8 5 12 12 15 into 638 768 388 388 835. That's a very bad way of encrypting text. Working on individual letters allows for fairly trivial attacks in the form of frequency analysis. You might not know what prime numbers were used, but you know what the most common letter in English is and which letters often come in pairs.
Let's pretend that a binary code like ASCII had been invented by the 1930s, giving each letter its own number[8]. So the text translates to 1001000 1000101 1001100 1001100 1001111. Seven bits is sufficient to store 128 characters, which is good enough for text, numbers, and punctuation.
Smooshing those bits together gives 10010001000101100110010011001001111, which is 19,473,311,311 in decimal.
But here we hit a snag! And rather an important one. Our message 𝒎 must be less than 𝒏, otherwise the maths doesn't work. So each "chunk" that is encrypted must be less than, in this case, 1037. In binary, 1037 is an 11-bit number - 10000001101 - so let's chop the long binary string into groups of ten-bit numbers[9].
1001000100 0101100110 0100110010 01111 which, in decimal is 580 358 306 15.
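Here's a quick JavaScript sketch of that encode-and-chunk step (assuming, as above, a 7-bit character code and ten-bit chunks):

```javascript
// Convert "HELLO" to 7-bit codes, join the bits, then split into
// ten-bit chunks so that every chunk is smaller than n = 1037.
const bits = [...'HELLO']
  .map(ch => ch.charCodeAt(0).toString(2).padStart(7, '0'))
  .join('');

const chunks = bits.match(/.{1,10}/g).map(group => parseInt(group, 2));

console.log(bits);   // 10010001000101100110010011001001111
console.log(chunks); // [ 580, 358, 306, 15 ]
```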
There is no sensible way for a human to calculate those encryptions - 580⁷⁷ % 1037 and so on - without mechanical or algorithmic assistance.
So our 1939 cryptographers are ready to pack up and go home, right?
Wrong! Remember, the plaintext message 𝒎 = 𝒄ᵈ % 𝒏
For this operation, that's 287⁷⁷³ % 1037 (287 being the encrypted form of our first ten-bit chunk: 580⁷⁷ % 1037 = 287).
Let's turn 𝒅 into binary - decimal 773 is 1100000101 - and use the "square-and-multiply" algorithm to calculate the plaintext, working through the bits of 𝒅 from least-significant to most-significant.
Again, you don't need to read this table and can skip to the next section - this is just to show that the calculations are possible (if somewhat convoluted).
| Bit of 𝒅 (least-significant first) | If bit is 1: result = (result × base) % 1037 | Square step: base = base² % 1037 | New base | New result |
|---|---|---|---|---|
| 1 | (1 × 287) % 1037 = 287 | 287² = 82369; 82369 ÷ 1037 = 79, remainder 446 | 446 | 287 |
| 0 | (no change to result) | 446² = 198916; 198916 ÷ 1037 = 191, remainder 849 | 849 | 287 |
| 1 | 287 × 849 = 243663; 243663 ÷ 1037 = 234, remainder 1005 | 849² = 720801; 720801 ÷ 1037 = 695, remainder 86 | 86 | 1005 |
| 0 | (no change to result) | 86² = 7396; 7396 ÷ 1037 = 7, remainder 137 | 137 | 1005 |
| 0 | (no change to result) | 137² = 18769; 18769 ÷ 1037 = 18, remainder 103 | 103 | 1005 |
| 0 | (no change to result) | 103² = 10609; 10609 ÷ 1037 = 10, remainder 239 | 239 | 1005 |
| 0 | (no change to result) | 239² = 57121; 57121 ÷ 1037 = 55, remainder 86 | 86 | 1005 |
| 0 | (no change to result) | 86² = 7396; remainder 137 (as before) | 137 | 1005 |
| 1 | 1005 × 137 = 137685; 137685 ÷ 1037 = 132, remainder 801 | 137² = 18769; remainder 103 (as before) | 103 | 801 |
| 1 | 801 × 103 = 82503; 82503 ÷ 1037 = 79, remainder 580 | (final bit - no further squaring needed) | - | 580 |
Yeeeesh! Tedious, but absolutely doable by hand. As long as you don't make mistakes and don't fall asleep.
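For comparison, here's the same square-and-multiply procedure as a small JavaScript sketch (modern BigInt arithmetic, purely to check the table above; the function name is mine):

```javascript
// Right-to-left square-and-multiply, printing each step so it can be
// checked against the table above. (The final squaring is redundant
// but harmless.)
function modPowTrace(base, exp, mod) {
  let result = 1n;
  base %= mod;
  while (exp > 0n) {
    const bit = exp & 1n;                            // current least-significant bit
    if (bit === 1n) result = (result * base) % mod;  // multiply step
    base = (base * base) % mod;                      // square step for the next bit
    console.log(`bit=${bit} new base=${base} new result=${result}`);
    exp >>= 1n;
  }
  return result;
}

console.log(modPowTrace(287n, 773n, 1037n)); // 580 - our first ten-bit chunk, recovered
```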
Could this be made easier? Perhaps - but let's consider whether this effort would be worth it.
As discussed earlier, Enigma used the same password to encrypt and to decrypt. That means that if the password leaks, you have lost the secrecy of both your outgoing and incoming messages.
The key advantage of symmetric encryption is that it is much easier to use. You set today's secret, then you can send and receive with ease. You do not need to manage two separate codes.
Let's imagine that there is a theoretical machine which can mechanically or electronically code and decode messages. You have the public key for sending messages back to base - but what if you want to receive a message?
You will need a private key. A key which has to be protected in exactly the same way as Enigma's codebooks. If your private key is captured, all the messages previously sent to you can be decrypted.
OK, perhaps the solution is to give every machine its own unique keypair? Well, to quote the sages, now you have two problems.
First is the complexity of managing all the public keys. You have to remember which one to use when sending information.
Secondly, it means that you cannot broadcast a general message to all recipients. If HQ wants to send a message, they need to encrypt it separately for each receiver and also broadcast it separately. That also means the receivers have to know which message is intended for them[10]. Similarly, if a single user wants to send an encrypted message to all nearby units, they need to know who they are and separately encrypt messages to them[11].
Simplicity is the main factor in making security usable, as I've written about before. Regardless of whether a machine could have done the calculations, key management is a tough problem.
Diffie-Hellman key exchange is a cryptographic technique which allows two or more parties to use an insecure channel to agree on a shared secret key without ever transmitting that secret. As with all the other maths talked about here, it is conceptually simple - but rather difficult to do by hand.
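To give a flavour of the idea, here's a toy sketch in JavaScript with deliberately tiny, made-up numbers (real parameters would be vastly larger):

```javascript
// Toy Diffie-Hellman exchange with deliberately tiny numbers.
const p = 23n; // publicly agreed prime modulus
const g = 5n;  // publicly agreed generator

const modPow = (base, exp, mod) => {
  let result = 1n;
  base %= mod;
  for (; exp > 0n; exp >>= 1n) {
    if (exp & 1n) result = (result * base) % mod;
    base = (base * base) % mod;
  }
  return result;
};

// Each party keeps a private number and shares only g^private % p.
const alicePrivate = 6n,  alicePublic = modPow(g, alicePrivate, p); // 8
const bobPrivate   = 15n, bobPublic   = modPow(g, bobPrivate, p);   // 19

// Both sides arrive at the same shared secret without ever sending it.
console.log(modPow(bobPublic, alicePrivate, p)); // 2
console.log(modPow(alicePublic, bobPrivate, p)); // 2
```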
Given the limitations of the speed of 1930s technology, it might be easier just to broadcast in plaintext "Hello! I'm station 123 and my public key is ..." That would be a simple way of distributing your keys but has two disadvantages:
The enemy can flood you with encrypted messages. You have no way to verify that they come from a legitimate source.
There's no way to verify who the public key is from.
I'm not going to get into cryptographic signature verification because this blog post is already too long!
This, alas, is approaching the limits of my ignorance. I know it is possible to build a Difference Engine out of Lego. Similarly, in the late 1930s it was possible to build a mechanical calculator which was small, lightweight, and accurate.
There were massive analogue computers on battleships, able to solve "20-plus variable calculus problems in real-time". At around 1,400 kg these weren't as portable as the typewriter-sized Enigma - but they do go some way to showing it might have been possible to design a mechanical computer for these equations.
The mathematics behind public key cryptography are simple and, in my estimation, could easily have been understood by people nearly a century ago.
The various algorithms for simplifying the necessary calculations were neither obscure nor difficult to implement.
With smaller keys, it is possible to hand-calculate encryption and decryption.
But, could it work in practice? If you had suitably trained battlefield mathematicians, it would be just about feasible to encrypt a message for transmission and decrypt something you've received. You wouldn't want to do it while under fire or for any long messages or while using large prime numbers. But, technically it is possible to hand-calculate the encryption and decryption of public key cryptography!
Let's look through the steps again.
Generate a public / private keypair.
Yes! Tedious for larger primes, but well within the abilities of skilled mathematicians.
Converting a plaintext message to binary.
Yes! Baudot codes were well known, as were things like Morse code.
Splitting a binary message into smaller chunks.
Yes! A trivial exercise on paper, but might be difficult mechanically.
Encrypting a chunk.
Possible but difficult - especially with larger keys.
Decrypting a chunk.
Possible but difficult - especially with larger keys.
Creating a machine to do the difficult work.
A very cautious maybe. Large mechanical battlefield computers existed and were precise. Given the effort that went on in Bletchley Park, I don't doubt something could have been created.
However, given the complexity of the calculations, I don't think a portable machine would have been possible.
Key management.
A nightmare, as always.
Aside from the conceptual leaps required and the lack of computational power, the major problem with successfully deploying public key cryptography in 1939 is… usability!
The usability of security systems is often hidden from us. Managing a complex key infrastructure is a problem which still plagues the security industry. Despite decades of advances, we still regularly read stories about "secure" microchips getting hacked for their keys - I imagine it would be trivial to extract them from a mechanical computer.
Could public-key cryptography have been used in 1939? Possibly, but the complexity of mechanical computation would have made it impractical.
If you happen across a time machine with access to the mid-20th century, please pop back and let me know if I am right.
As seen in the historically "accurate" film "U571". ↩︎
With huge thanks to my gang of unpaid editors, including Colin and Liz. Any mistakes, errors, and typos are my responsibility. ↩︎
By contrast, the Enigma had about 67 bits of complexity, resulting in approximately 158 quintillion combinations. Hence the need for cryptanalysis rather than just brute force! ↩︎
Confusingly, a co-prime doesn't have to be a prime number. ↩︎
By the end of the 19th century, all prime numbers up to 1 billion had been discovered - that's around 50 million different primes. Far fewer combinations than the Enigma, but still a formidable challenge to try and randomly guess which two had been used. ↩︎
This isn't too much of a stretch. The Baudot code was invented in the late 1800s. It is the binary code from which ASCII descends. ↩︎
In this example, we might still be subject to some frequency analysis issues. There are common ways to start a message which could be pre-computed. ↩︎
Yes, there are ways round this. You could start each message with a plaintext callsign, or broadcast on different frequencies, or some other differentiator. The point is that it adds complexity. ↩︎
Again, there are ways round this. But they mostly involve generating even more shared keys. At which point, you're almost back to symmetric encryption! ↩︎
HomeLab & Selfhosting Books
I've read 'The Home Lab Handbook: Building and Managing Your Own IT Lab from Scratch', which I would recommend to anyone just starting out in self-hosting and homelabbing. Related to that, I found a 'course' online (https://linuxupskillchallenge.org/#table-of-contents) that would also be useful for new arrivals. ...
How To Argue With An AI Booster ( www.wheresyoured.at )
cross-posted from: ...
How is Linux on the M1 MacBook Pro?
I'm looking to pick up an old M1 MacBook from my work if I can slap Linux on it and turn it into a usable laptop. It's the A2338 ...