Proposed ‘New Hope’ Blockchain Platforms Enable Large-Scale DNN Training on Smart Contracts


It’s believed that deep neural networks (DNNs) hold significant potential for blockchain applications such as decentralized finance (DeFi) and decentralized autonomous organization (DAO). However, training and running large-scale DNNs on smart contracts — stored computer code that automatically executes all or part of a contractual agreement — remains infeasible due to fundamental design issues with today’s blockchain platforms.

A new paper, Training Massive Deep Neural Networks in a Smart Contract: A New Hope, proposes a set of novel blockchain platform designs, collectively dubbed “A New Hope” (ANH), that aim to enable the integration of large-scale DNNs into smart contracts.


There are two major hurdles to training and running a DNN within a smart contract. The first is cost. For instance, on Ethereum, a popular smart contract and decentralized application platform, each smart contract instruction incurs a monetary cost referred to as “gas.” Training and running DNNs in such a metered environment could therefore result in prohibitively high gas costs, potentially running into millions of dollars.
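To see why metered execution is a showstopper, consider a rough back-of-the-envelope calculation. The Python sketch below is purely illustrative: the per-operation gas cost, gas price, and ETH price are assumed placeholder values, not figures from the paper.

```python
# Back-of-envelope estimate of on-chain DNN training cost.
# All numbers below are illustrative placeholders, not figures from the paper:
# real gas prices and per-operation gas costs vary widely on Ethereum.

GAS_PER_ARITHMETIC_OP = 3   # assumed gas per simple arithmetic instruction
GAS_PRICE_GWEI = 30         # assumed gas price in gwei (1 gwei = 1e-9 ETH)
ETH_PRICE_USD = 2_000       # assumed ETH/USD exchange rate

def training_cost_usd(num_ops: float) -> float:
    """Rough USD cost of metering `num_ops` arithmetic operations as gas."""
    total_gas = num_ops * GAS_PER_ARITHMETIC_OP
    total_eth = total_gas * GAS_PRICE_GWEI * 1e-9
    return total_eth * ETH_PRICE_USD

# Even ~1e12 arithmetic operations -- a tiny fraction of a real training run --
# would already cost on the order of hundreds of millions of dollars:
print(f"${training_cost_usd(1e12):,.0f}")
```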

The second issue is that DNN training often does not yield deterministic results, which conflicts with the expectation that smart contract transactions on a blockchain produce deterministic, reproducible results and effects.
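Much of this nondeterminism comes from parallel floating-point arithmetic: addition is not associative, so the order in which values such as gradients are reduced (e.g. on a GPU) can slightly change the outcome. The toy Python snippet below, which is not from the paper, demonstrates the underlying effect.

```python
# Floating-point addition is not associative, so summing the same values in a
# different order (as parallel reductions do) can produce a different result.
import random

values = [random.uniform(-1.0, 1.0) for _ in range(100_000)]

sum_forward = sum(values)
shuffled = values[:]
random.shuffle(shuffled)
sum_shuffled = sum(shuffled)

print(sum_forward == sum_shuffled)      # typically False
print(abs(sum_forward - sum_shuffled))  # tiny but nonzero difference
```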

The ANH approach is designed to address these issues. The paper summarizes the proposed platform designs as:

  1. Validators of new blocks do not execute the transactions therein.
  2. Transaction execution is on-demand, possibly through a service provider called an on-chain accountant.
  3. Smart contract transactions are allowed to have nondeterministic results, which are verified through a special validation mechanism that may involve invoking other smart contracts.

In current blockchain systems, each node must execute all transactions in all blocks and maintain the entire world state at all times, meaning a node can only finish processing a given block after executing every transaction in that block. A block consists of a sequence of transactions plus additional verification information such as signatures and hash values, so when a block contains a transaction with intensive computations (e.g. training a large DNN), the required processing time becomes prohibitively long.
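The sketch below captures this eager execution model in Python; it is purely illustrative and does not reflect the internals of any specific blockchain client.

```python
# Toy sketch of today's eager model (illustrative, not any specific client):
# every node replays every transaction in every block to keep the world state
# up to date, so one expensive transaction stalls the whole block.
from typing import Any, Callable, Dict, List

WorldState = Dict[str, Any]

def process_block(state: WorldState,
                  transactions: List[Callable[[WorldState], None]]) -> WorldState:
    """A node finishes the block only after executing all of its transactions."""
    for tx in transactions:
        tx(state)   # e.g. a token transfer -- or a huge DNN training step
    return state

# If a single transaction takes hours (say, a large DNN training run), every
# node in the network is blocked for hours on this one block.
```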


The proposed ANH takes the bold step of removing world-state information from blocks: with the single exception of the genesis block, no block contains any information about the world state. A block can therefore be formed as soon as its creator node gathers a list of transactions, and a block validator only needs to verify signatures rather than execute the transactions.
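The toy sketch below shows what signature-only block validation might look like. It is an illustration under assumptions, not the paper’s protocol; the `Transaction`, `Block`, and `verify_signature` names are hypothetical.

```python
# Toy sketch (not the paper's actual protocol): under ANH, validating a block
# only requires checking signatures; the transactions themselves are not
# executed and no world state is computed.
from dataclasses import dataclass
from typing import List

@dataclass
class Transaction:
    sender: str
    payload: bytes       # e.g. a serialized contract call (possibly DNN training)
    signature: bytes

@dataclass
class Block:
    transactions: List[Transaction]
    creator_signature: bytes

def verify_signature(data: bytes, signature: bytes, signer: str) -> bool:
    """Placeholder for a real signature check (e.g. ECDSA over secp256k1)."""
    return True  # stub for illustration only

def validate_block(block: Block) -> bool:
    # Check each transaction's signature; never run the transaction itself.
    for tx in block.transactions:
        if not verify_signature(tx.payload, tx.signature, tx.sender):
            return False
    # Check the block creator's signature over the gathered transaction list.
    block_bytes = b"".join(tx.payload for tx in block.transactions)
    return verify_signature(block_bytes, block.creator_signature, "creator")
```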

Also, because maintaining the entire world state in real-time is not cost-efficient when transactions contain expensive DNN training, ANH adopts a lazy transaction execution strategy: a transaction is executed only when its results are needed.
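A minimal sketch of such lazy, on-demand execution is shown below; the `LazyLedger` class and its methods are hypothetical illustrations, not an interface defined in the paper.

```python
# Toy sketch of lazy, on-demand transaction execution (not the paper's code):
# transactions are recorded when included in a block, but only executed, and
# their results cached, when something actually asks for them.
from typing import Any, Callable, Dict

class LazyLedger:
    def __init__(self) -> None:
        self._pending: Dict[str, Callable[[], Any]] = {}  # tx_id -> deferred work
        self._results: Dict[str, Any] = {}                # tx_id -> cached result

    def record(self, tx_id: str, computation: Callable[[], Any]) -> None:
        """Include a transaction in the ledger without executing it."""
        self._pending[tx_id] = computation

    def result(self, tx_id: str) -> Any:
        """Execute the transaction on demand (e.g. via an 'on-chain accountant')."""
        if tx_id not in self._results:
            self._results[tx_id] = self._pending.pop(tx_id)()
        return self._results[tx_id]

# Usage: the expensive DNN-training transaction only runs when queried.
ledger = LazyLedger()
ledger.record("tx-42", lambda: "trained-model-weights")  # nothing runs yet
print(ledger.result("tx-42"))                            # executed here, once
```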


To reduce validators’ costs related to transaction fees, ANH imposes two rules: 1) transaction fees must be paid with zero-cost income; and 2) the transaction sender must prepay the maximum possible gas cost specified by the transaction’s gas limit. If the transaction completes without reaching the gas limit, the unused portion of the prepaid gas is returned as a credit to the sender’s account.
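The prepay-and-refund rule amounts to simple gas accounting, sketched below with made-up numbers; none of the values or function names come from the paper.

```python
# Sketch of the prepay-then-refund fee rule described above (illustrative only):
# the sender deposits gas_limit * gas_price up front, and any unused gas is
# credited back once the transaction is eventually executed.

def prepaid_fee(gas_limit: int, gas_price: int) -> int:
    """Maximum fee the sender must pay when submitting the transaction."""
    return gas_limit * gas_price

def refund(gas_limit: int, gas_used: int, gas_price: int) -> int:
    """Credit returned to the sender once actual gas usage is known."""
    assert gas_used <= gas_limit, "execution may not exceed the gas limit"
    return (gas_limit - gas_used) * gas_price

# Example with made-up numbers:
limit, price, used = 1_000_000, 30, 640_000
print(prepaid_fee(limit, price))   # 30,000,000 paid up front
print(refund(limit, used, price))  # 10,800,000 credited back
```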

Overall, ANH maintains computational efficiency on the blockchain platform by deferring smart contract computations to payment time and reduces total smart contract computation costs through lazy, on-demand execution. The paper also explores potential implications of ANH, such as its effects on token fungibility, sharding, private transactions, and the fundamental meaning of smart contracts.

Curiously, the paper’s sole author is listed as “Yin Yang” (a possible pseudonym), and no associated institutions are identified.

The paper Training Massive Deep Neural Networks in a Smart Contract: A New Hope is on arXiv.


Author: Hecate He | Editor: Michael Sarazen, Chain Zhang


We know you don’t want to miss any news or research breakthroughs. Subscribe to our popular newsletter Synced Global AI Weekly to get weekly AI updates.


