Blockchain transactions per second (TPS) is often cited as a measure of performance, but the figure alone does not reveal whether a network can scale in practice.
Carter Feldman, founder of Psy Protocol and a former hacker, told Cointelegraph that TPS data is often misleading because it ignores how transactions are actually verified and communicated in decentralized systems.
“Many pre-launch mainnet, testnet, or isolated benchmarks measure TPS with just one node running. At this point, you might as well call Instagram a blockchain that can reach 1 billion TPS because it has one central authority verifying every API call,” Feldman said.
Part of the problem lies in how most blockchains are designed. The faster they try to go, the greater the load on each node and the harder decentralization becomes. That burden can be reduced by separating transaction execution from verification.
TPS numbers ignore the costs of decentralization
TPS is a significant benchmark for blockchain performance: a network with a higher TPS can, in principle, support more real usage.
However, Feldman argued that most headline TPS figures reflect ideal settings that do not translate into actual throughput. The impressive numbers do not show how the system performs under decentralized conditions.
“The TPS of a virtual machine or a single node is not a measure of the actual performance of the blockchain mainnet,” Feldman said.
“However, the number of transactions per second that a blockchain can process in production is still an important way to quantify how much usage it can support, and that is what scaling should mean.”
Each full node in the blockchain must verify that transactions comply with the rules of the protocol. If one node accepts an invalid transaction, the others should reject it. This is what makes a decentralized ledger work.
Related: Firedancer will speed up Solana, but won’t reach its full potential
Blockchain performance does account for the speed at which the virtual machine executes transactions. But in the real world, bandwidth, latency and network topology matter too, so performance also depends on how transactions are propagated to and verified by the other nodes in the network.
As a result, TPS figures published in official documents often differ from mainnet performance. Benchmarks that isolate execution from propagation and verification costs measure something closer to virtual machine speed than blockchain scalability.
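A rough back-of-the-envelope model shows why the two numbers diverge. The Python sketch below caps a node's effective throughput at whichever is lower, its execution speed or its network intake; the figures used (link speed, transaction size) are illustrative assumptions, not measurements of any particular chain.

```python
def effective_tps(vm_tps: float, bandwidth_bytes_per_s: float, avg_tx_bytes: float) -> float:
    """Effective TPS a full node can sustain once propagation is included."""
    network_tps = bandwidth_bytes_per_s / avg_tx_bytes  # intake limit
    return min(vm_tps, network_tps)

# A single-node benchmark might report 100,000 TPS for the VM alone, but a
# node on a ~100 Mbit/s link (~12.5 MB/s) receiving ~500-byte transactions
# can only ingest ~25,000 TPS before doing any verification at all.
print(effective_tps(vm_tps=100_000,
                    bandwidth_bytes_per_s=12_500_000,
                    avg_tx_bytes=500))  # -> 25000.0
```

In this toy model, the network link, not the virtual machine, sets the ceiling, which is exactly the cost that single-node benchmarks leave out.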
EOS, the network where Feldman once served as a block producer, broke records for initial coin offerings in 2018. Its white paper suggested a theoretical ceiling of about 1 million TPS. That is still an impressive number, even by 2026 standards.
EOS never reached that theoretical TPS target. Earlier reports claimed it could hit around 4,000 TPS under favorable settings, but tests by the blockchain testing firm Whiteblock showed that under realistic network conditions, throughput dropped to around 50 TPS.
In 2023, Jump Crypto demonstrated that its Solana validator client, Firedancer, achieved what EOS could not, hitting 1 million TPS in testing. Since then, the client has been deployed by multiple validators as a hybrid version known as Frankendancer. Under live conditions today, Solana typically processes around 3,000-4,000 TPS, of which about 40% are non-vote transactions, which better reflect real user activity.

Breaking the linear scaling problem
In most blockchain designs, node workload scales linearly with throughput. More transactions reflect more activity, but they also mean every node must receive and verify more data.
Each additional transaction adds computational load: at roughly 4,000 TPS with transactions averaging 1 KB, for example, every full node has to ingest and check about 4 MB of data per second. At some point, bandwidth constraints, hardware limitations and synchronization delays make further growth unsustainable without abandoning decentralization.
Feldman said overcoming this limitation requires rethinking how validity is proven, which can be done with zero-knowledge (ZK) technology. A ZK proof can show that a batch of transactions was processed correctly without each node having to re-run those transactions. Because it proves validity without revealing the underlying data, ZK is often promoted as a privacy solution.
Related: Privacy tools are gaining popularity thanks to institutional adoption, says ZKsync dev
Feldman argues that recursive ZK proofs can also cut the overhead of scaling. Put simply, they are proofs that verify other proofs.
“It turns out that you can take two ZK proofs and generate a ZK proof that proves that both proofs are correct,” Feldman said. “So you can take two proofs and turn them into one proof.”
“Let’s say we start with 16 user transactions. We can take those 16 proofs and create eight proofs from them, then we can turn those eight proofs into four,” Feldman explained, sharing a graphic of a proof tree in which many proofs eventually collapse into one.
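The folding Feldman describes can be sketched in a few lines. The snippet below is a toy illustration of the tree structure only: `Proof` and `aggregate` are hypothetical stand-ins, and a real recursive ZK scheme would do expensive cryptographic work at each step.

```python
from dataclasses import dataclass

@dataclass
class Proof:
    covers: int  # number of user transactions this proof attests to

def aggregate(a: Proof, b: Proof) -> Proof:
    # Hypothetical recursive step: in a real ZK system this would produce a
    # succinct proof that both input proofs are themselves valid.
    return Proof(covers=a.covers + b.covers)

def fold(proofs: list[Proof]) -> Proof:
    # Pairwise folding: 16 proofs -> 8 -> 4 -> 2 -> 1.
    while len(proofs) > 1:
        nxt = [aggregate(proofs[i], proofs[i + 1])
               for i in range(0, len(proofs) - 1, 2)]
        if len(proofs) % 2:  # odd one out is carried to the next level
            nxt.append(proofs[-1])
        proofs = nxt
    return proofs[0]

root = fold([Proof(covers=1) for _ in range(16)])
print(root.covers)  # 16 -- a node verifies one proof instead of 16
```

The key property is that the verifier's cost stays flat: a node checks the single root proof regardless of how many transactions sit beneath it, while the heavy lifting of proof generation happens off the critical path.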

In conventional blockchain designs, increasing TPS increases the verification and bandwidth requirements of every node. Feldman argues that with a proof-based design, throughput can grow without a proportional rise in per-node verification costs.
This does not mean ZK eliminates scaling trade-offs entirely. Proof generation can be computationally intensive and may require specialized infrastructure: while verification becomes cheap for ordinary nodes, the burden shifts to provers, who take on the heavy cryptographic work. Retrofitting proof-based verification into existing blockchain architectures is also complex, which helps explain why most mainstream networks still rely on conventional execution models.
Performance beyond raw throughput
TPS is not useless, but it is conditional. According to Feldman, raw capacity figures are less meaningful than economic signals such as transaction fees, which give a clearer read on network health and demand.
“I believe TPS is a secondary benchmark for blockchain performance, but only if it is measured in a production environment, or in one where transactions are not just executed but also propagated and verified by other nodes,” he said.

The dominance of existing blockchain designs has also shaped investment. Architectures that rely on sequential execution cannot easily adopt proof-based verification without redesigning how transactions are processed.
“At the very beginning, it was almost impossible to raise money for anything other than ZK EVM [Ethereum Virtual Machine],” Feldman said, explaining Psy Protocol’s early funding struggles.
“The reason people didn’t want to fund it in the beginning was because it takes a while,” he added. “You can’t just fork the EVM or its state memory because everything is done completely differently.”
In most blockchains, higher TPS means more work for every node, and raw numbers alone do not show whether that burden is sustainable.
Magazine: Ethereum roadmap to 10,000 TPS using ZK technology: a guide for dummies
Cointelegraph Features and Cointelegraph Magazine publish long-form journalism, analysis and narrative reporting from Cointelegraph’s in-house editorial team and selected external writers with subject-matter expertise. All articles are edited and reviewed by Cointelegraph editors in accordance with our editorial standards. Contributions from outside authors are solicited for their experience, research or perspective and do not reflect the views of Cointelegraph as a company unless expressly stated. Content published in Features and Magazine does not constitute financial, legal or investment advice. Readers should conduct their own research and, if necessary, consult qualified professionals. Cointelegraph maintains full editorial independence; advertisers, partners and commercial relationships have no influence on the selection or publication of Features and Magazine content.
