Correct me if I’m wrong, but I view a blockchain as a series circuit. Since all information on the ledger is permanent and immutable, the blockchain is constantly growing, along with the storage needed to hold it. What is done to compensate for the possibility that the amount of data in the ledger becomes too large for nodes in the network to handle? Thanks.
Good question, I too am interested in the opinion of an expert on this topic.
It’s my understanding that this has already become a problem, and I can think of two advancements in the major protocols that help address it:
- BTC SegWit packs more transactions into a block by moving the digital signatures (witness data) out of the base transaction data, reducing the space each transaction takes up in a block and slowing the growth of the blockchain overall - but the blockchain will still keep growing.
- Sharding in Ethereum 2.0 is a database architecture in which the chain is split into shards, so each node no longer needs to store a full copy of the blockchain.
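The SegWit point above can be made concrete with the weight formula from BIP 141: witness bytes count as 1 weight unit while other bytes count as 4, and virtual size is weight divided by 4. A minimal sketch, using illustrative byte counts (not a real transaction):

```python
import math

def vsize(base_size: int, witness_size: int) -> int:
    """BIP 141 virtual size: non-witness bytes weigh 4 units,
    witness bytes weigh 1; vsize = ceil(weight / 4)."""
    total_size = base_size + witness_size
    weight = 3 * base_size + total_size
    return math.ceil(weight / 4)

# Hypothetical ~250-byte legacy transaction whose ~107 bytes of
# signatures move into the witness under SegWit.
legacy = vsize(250, 0)          # all bytes are non-witness
segwit = vsize(250 - 107, 107)  # signatures discounted as witness data
```

Because the same 250 raw bytes count for fewer "virtual" bytes once the signatures are witness data, more such transactions fit in a block.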
As hardware technology advances, pushed in part by innovation in the blockchain space, digital storage capacity and network speeds will also continue to improve.
We also have SPV (Simplified Payment Verification) clients, which are not full nodes: they don’t store a full copy of the blockchain, only the block headers, and they query full nodes that do hold a copy to find out whether a transaction has been confirmed.
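The way an SPV client checks a confirmation is by verifying a Merkle proof against the root in a block header. A toy sketch of that verification (the four "transactions" here are made-up placeholders, not real txids):

```python
import hashlib

def dsha(b: bytes) -> bytes:
    """Bitcoin-style double SHA-256."""
    return hashlib.sha256(hashlib.sha256(b).digest()).digest()

def verify_proof(txid: bytes, proof: list, expected_root: bytes) -> bool:
    """Fold the txid up the tree using (sibling_hash, side) steps,
    then compare the result to the root stored in a block header."""
    node = txid
    for sibling, side in proof:
        node = dsha(sibling + node) if side == "left" else dsha(node + sibling)
    return node == expected_root

# Toy 4-transaction block.
txs = [dsha(bytes([i])) for i in range(4)]
level1 = [dsha(txs[0] + txs[1]), dsha(txs[2] + txs[3])]
root = dsha(level1[0] + level1[1])

# Proof that txs[2] is in the block: its sibling, then the other branch.
proof = [(txs[3], "right"), (level1[0], "left")]
ok = verify_proof(txs[2], proof, root)
```

The point for scaling: the client only needs the headers plus a proof of logarithmic size, not the whole chain.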
However, I think the question still remains: a blockchain’s size could keep growing indefinitely…
Thanks for your comments coin_digger.
So with continuous improvement in storage capacity and network speed, we can keep kicking the can down the road. I’m very new to all of this, and the thought popped up almost immediately that there will most likely be a need to manage and organize vast transaction histories.
Cool, sounds like a step in that direction.
I don’t think we should worry too much about the size just yet. If we are talking about Bitcoin, we are only at around 280 GB after 10 years. You can easily buy a 2 TB HDD for around $100. I do, however, think that in the future we will have forks that remove the earliest data of the blockchain. Just an opinion, of course.
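Back-of-the-envelope, using the rough figures from this post (an average growth rate and today’s HDD price, both assumptions rather than measured values):

```python
chain_gb = 280                   # approx. Bitcoin chain size from the post
years = 10
gb_per_year = chain_gb / years   # ~28 GB/year average growth

drive_gb, drive_usd = 2000, 100
usd_per_gb = drive_usd / drive_gb               # $0.05 per GB
annual_storage_cost = gb_per_year * usd_per_gb  # ~$1.40/year at today's prices
```

Of course the growth rate isn’t constant (it depends on how full blocks are), but this shows why storage cost alone isn’t the pressing problem yet.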
While this is true, the blockchain can’t work or be secure if we don’t have full nodes. SPV is only designed so that we can scale more easily, using phones or other low-storage devices to interact with the blockchain. While the size of the blockchain may be a problem in the future, I think we should work on other things in the meantime. We should try to scale the Bitcoin and Ethereum networks as much as possible while still remaining secure and trustless.
BCH, for example, increased its blocks to 8 MB - an 8x capacity increase compared to Bitcoin’s 1 MB. That’s why I don’t like Bitcoin Cash’s approach: it improves scalability, but it will hurt the network in the long run, since the chain also grows eight times faster. Not to mention that increasing the block size makes propagation slower, which causes more stale blocks to occur.
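A rough way to see the 8x figure, assuming an illustrative 250-byte average transaction and the ~10-minute block interval (both simplifying assumptions):

```python
def max_tps(block_bytes: int, avg_tx_bytes: int = 250,
            block_interval_s: int = 600) -> float:
    """Rough upper bound on transactions per second for a given
    block size, average transaction size, and block interval."""
    return block_bytes / avg_tx_bytes / block_interval_s

btc = max_tps(1_000_000)   # ~6.7 tx/s with 1 MB blocks
bch = max_tps(8_000_000)   # ~53 tx/s with 8 MB blocks
```

The throughput scales linearly with block size, but so does the amount of data every full node must store and relay per block, which is exactly the trade-off being debated here.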
Puts it in context, thank you!
If we do see a parabolic move in adoption and usage, as some are predicting, would we also see similar growth in the size of the blockchain?
Great, I was on the verge of asking about this because it seemed to deviate from the principles I just learned. Thanks