DEC 4, 2023
Should you care what happens in 179 years?
by Vlad, Access Protocol
Writing on-chain programs is tricky. On one hand you really want these programs to be upgradable, on the other you should strive to make them fully decentralized.
Vlad is a Fullstack Software Engineer at Access Protocol, working on Smart Contract and Backend development, as well as user-facing applications. Vlad has moved to the exciting world of crypto after building big systems in traditional FinTech companies. You can reach out to Vlad on X (@mmatdev) or via e-mail at firstname.lastname@example.org.
I believe that the ultimate goal of every team deploying programs on-chain should be to renounce the keys once the program is done and just enjoy the ride as an independent observer or a part of a DAO.
This poses a few technical challenges and I would like to focus on one of these today.
You might have heard about integer overflow in a computer program, but let me quickly summarise the basics here. I will focus only on integer overflow, as there is no built-in way to represent floating-point numbers in Solidity (Ethereum), and they are generally avoided in Rust programs on Solana.
In a computer, each number (or any other piece of data) is represented as a sequence of 0s and 1s (bits). Moreover, if you look at the integer types commonly used in programming languages, the number of bits used to represent a number is fixed, mainly for performance reasons. Usually you use 32 or 64 bits, but if you need very large numbers, or if you are limited by the memory of your computer, you might decide to use more or less space respectively.
Let me demonstrate this with an example: I want to use only 2 bits to store a non-negative integer. There are only four possibilities for how these two bits can be set: 00, 01, 10 and 11.
Therefore, I cannot represent more than 4 numbers using two bits. Let's decide that these patterns represent the numbers 0, 1, 2 and 3 respectively; the binary representation is simply how we store them in the computer's memory.
Ok, but what happens if you try to add 2 + 3? The result should be 5, but there is no way to represent that number in two bits. Therefore, many programs won't even let you do that and crash instead. Those that don't fail end up producing completely wrong results.
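To make both failure modes concrete, here is a small sketch in Rust (since Rust has no 2-bit integer type, it uses u8, the smallest built-in unsigned type, where the same wrap-around happens at 256 instead of 4):

```rust
fn main() {
    let a: u8 = 200;
    let b: u8 = 100;

    // In a debug build, plain `a + b` would panic with
    // "attempt to add with overflow" -- the "crash" case.
    // checked_add makes this explicit: it returns None on overflow.
    assert_eq!(a.checked_add(b), None);

    // wrapping_add silently wraps around instead -- the "completely
    // wrong result" case: 300 mod 256 = 44.
    assert_eq!(a.wrapping_add(b), 44);

    println!("ok");
}
```

Note that in release builds Rust wraps silently by default, which is exactly why explicit `checked_*` arithmetic is the usual recommendation for on-chain code.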
This is called an integer overflow, and it has caused multiple severe malfunctions in the past. The most well known was probably the Therac-25, a computer-controlled radiation therapy machine whose failures cost the lives of multiple people.
The 179-year problem
But what does this have to do with 179 years, you may ask? Let me clarify: I was working on the V2 Solana on-chain program for Access Protocol when I noticed that we are using the u16 integer format to represent the current day index in a pool.
The day we deployed our program to Solana is represented as 0, and every day we add 1. However, the largest number we can represent in 16 bits is 65,535. Divide this by 365 and you find that it corresponds to roughly 179 years. After that, it won't be possible to add 1 without the program breaking.
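A minimal sketch of the problem (the `current_day_idx` name is hypothetical, not Access Protocol's actual field layout): a u16 day counter is fine for 65,536 days, and then the daily increment has nowhere to go.

```rust
fn main() {
    // Hypothetical pool state: the day index stored as a u16,
    // starting at 0 on deployment day and incremented daily.
    let current_day_idx: u16 = u16::MAX; // day 65,535 -- the last valid value

    // 65,535 days is roughly 179 years (integer division).
    assert_eq!(current_day_idx / 365, 179);

    // The next daily increment overflows: checked_add returns None,
    // and an immutable on-chain program could only abort here, forever.
    assert_eq!(current_day_idx.checked_add(1), None);

    println!("ok");
}
```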
Should we care?
In an ordinary off-chain program that can be upgraded at any time, a simple comment in the code letting the next generations know about the issue would be fine. But if you are writing an on-chain program and want to throw your keys away and make the program immutable, no one will ever be able to fix this.
You can decide for yourself if this is acceptable or not.
Anyway, if you're worried about this, rest assured that we will see this kind of issue much earlier. You can be fairly sure that some of your beloved applications will run into a similar problem as soon as the year 2038: https://en.wikipedia.org/wiki/Year_2038_problem.
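The 2038 deadline falls out of the same arithmetic we did for the u16 day index, just with a signed 32-bit Unix timestamp instead:

```rust
fn main() {
    // Classic 32-bit Unix time: seconds since 1970-01-01, stored in an i32.
    let max_unix_seconds = i32::MAX as i64; // 2,147,483,647

    // Roughly how many years after 1970 that buys us
    // (365-day years, ignoring leap days and leap seconds).
    let years_after_1970 = max_unix_seconds / (365 * 24 * 60 * 60);
    assert_eq!(1970 + years_after_1970, 2038);

    println!("ok");
}
```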