Primitives
Block
How do I listen to new blocks?
The example below shows how to subscribe to new blocks. It displays the block number every time a new block is seen by the node you are connected to.
// Import the API
async function main() {
// Instantiate the API
...
// We only display a limited number of blocks, then unsubscribe
let count = 0;
// Subscribe to the new headers on-chain. The callback is fired when new headers
// are found, the call itself returns a promise with a subscription that can be
// used to unsubscribe from the newHead subscription
const unsubscribe = await api.rpc.chain.subscribeNewHeads((header) => {
console.log(`Chain is at block: #${header.number}`);
if (++count === 256) {
unsubscribe();
process.exit(0);
}
});
}
main().catch(console.error);
How do I retrieve the header/extrinsic hash from blocks?
A block hash refers to the hash over the header, while the extrinsic hash refers to the hash of the encoded extrinsic. Since all objects returned by the API implement the .hash => Hash getter, we can simply use this to view the actual hash.
// returns Hash
const blockHash = await api.rpc.chain.getBlockHash(blockNumber);
// returns SignedBlock
const signedBlock = await api.rpc.chain.getBlock(blockHash);
// the hash for the block, always via header (Hash -> toHex()) - will be
// the same as blockHash above (also available on any header retrieved,
// subscription or once-off)
console.log(signedBlock.block.header.hash.toHex());
// the hash for each extrinsic in the block
signedBlock.block.extrinsics.forEach((ex, index) => {
console.log(index, ex.hash.toHex());
});
How do I extract the block author?
The block author is encoded inside the consensus logs for the block. To extract it, you need to decode the log (which the API does) and then map the validator index to the list of session validators. This extraction is, however, available on the API derives for new head subscriptions, which return an extended header with the author populated (assuming the digest logs are known).
// subscribe to all new headers (with extended info)
api.derive.chain.subscribeNewHeads((header) => {
console.log(`#${header.number}: ${header.author}`);
});
For a single header only, the derives also contain a getHeader, which once again returns a header extended with the author:
// retrieve the last header (hash optional)
const header = await api.derive.chain.getHeader();
console.log(`#${header.number}: ${header.author}`);
How do I view extrinsic information?
The transactions are included in a signed block as part of the extrinsics - some of these will be unsigned and generated by the block author, and some may be submitted from external sources and be signed (some pallets use unsigned transactions, so signed/unsigned is not an indication of origin). To retrieve the block and display the transaction information, we can do the following:
// no blockHash is specified, so we retrieve the latest
const signedBlock = await api.rpc.chain.getBlock();
// the information for each of the contained extrinsics
signedBlock.block.extrinsics.forEach((ex, index) => {
// the extrinsics are decoded by the API, human-like view
console.log(index, ex.toHuman());
const {
isSigned,
meta,
method: { args, method, section },
} = ex;
// explicit display of name, args & documentation
console.log(`${section}.${method}(${args.map((a) => a.toString()).join(", ")})`);
console.log(meta.docs.map((d) => d.toString()).join("\n"));
// signer/nonce info
if (isSigned) {
console.log(`signer=${ex.signer.toString()}, nonce=${ex.nonce.toString()}`);
}
});
In the above example, .toHuman() is used to format the extrinsic into a human-readable representation. You can inspect/extract specific fields from the decoded extrinsic as required. For instance, ex.method.section would return the pallet that executed this transaction.
How do I map extrinsics to their events?
While the blocks contain the extrinsics, the system event storage will include the events and the details needed to allow for a mapping between them. For events, the phase is an enum that will be isApplyExtrinsic, carrying the index, in the cases where it refers to an extrinsic in a block. This index maps to the order of the extrinsics as found in the block.
To perform a mapping between the two, we need information from both sources, as per the example below.
// no blockHash is specified, so we retrieve the latest
const signedBlock = await api.rpc.chain.getBlock();
const apiAt = await api.at(signedBlock.block.header.hash);
const allRecords = await apiAt.query.system.events();
// map between the extrinsics and events
signedBlock.block.extrinsics.forEach(({ method: { method, section } }, index) => {
// filter the specific events based on the phase and then the
// index of our extrinsic in the block
const events = allRecords
.filter(({ phase }) => phase.isApplyExtrinsic && phase.asApplyExtrinsic.eq(index))
.map(({ event }) => `${event.section}.${event.method}`);
console.log(`${section}.${method}:: ${events.join(", ") || "no events"}`);
});
How do I determine if an extrinsic succeeded/failed?
The code below extends the above example, where extrinsics are mapped to their events. In this example, we will look for two specific extrinsic events, namely the system.ExtrinsicSuccess and system.ExtrinsicFailed events. The same logic can be applied to inspect any other type of expected event.
// no blockHash is specified, so we retrieve the latest
const signedBlock = await api.rpc.chain.getBlock();
// get the api and events at a specific block
const apiAt = await api.at(signedBlock.block.header.hash);
const allRecords = await apiAt.query.system.events();
// map between the extrinsics and events
signedBlock.block.extrinsics.forEach(({ method: { method, section } }, index) => {
allRecords
// filter the specific events based on the phase and then the
// index of our extrinsic in the block
.filter(({ phase }) => phase.isApplyExtrinsic && phase.asApplyExtrinsic.eq(index))
// test the events against the specific types we are looking for
.forEach(({ event }) => {
if (api.events.system.ExtrinsicSuccess.is(event)) {
// extract the data for this event
// (In TS, because of the guard above, these will be typed)
const [dispatchInfo] = event.data;
console.log(
`${section}.${method}:: ExtrinsicSuccess:: ${JSON.stringify(dispatchInfo.toHuman())}`
);
} else if (api.events.system.ExtrinsicFailed.is(event)) {
// extract the data for this event
const [dispatchError, dispatchInfo] = event.data;
let errorInfo;
// decode the error
if (dispatchError.isModule) {
// for module errors, we have the section indexed, lookup
// (For specific known errors, we can also do a check against the
// api.errors.<module>.<ErrorName>.is(dispatchError.asModule) guard)
const decoded = api.registry.findMetaError(dispatchError.asModule);
errorInfo = `${decoded.section}.${decoded.name}`;
} else {
// Other, CannotLookup, BadOrigin, no extra info
errorInfo = dispatchError.toString();
}
console.log(`${section}.${method}:: ExtrinsicFailed:: ${errorInfo}`);
}
});
});
Event
How do I subscribe to node events?
Query the system events and extract information from them.
async function main() {
// Instantiate the API
...
// Subscribe to system events via storage
api.query.system.events((events) => {
console.log(`\nReceived ${events.length} events:`);
// Loop through the Vec<EventRecord>
events.forEach((record) => {
// Extract the phase, event and the event types
const { event, phase } = record;
const types = event.typeDef;
// Show what we are busy with
console.log(`\t${event.section}:${event.method}:: (phase=${phase.toString()})`);
console.log(`\t\t${event.meta.docs.toString()}`);
// Loop through each of the parameters, displaying the type and data
event.data.forEach((data, index) => {
console.log(`\t\t\t${types[index].type}: ${data.toString()}`);
});
});
});
}
main().catch((error) => {
console.error(error);
process.exit(1);
});
How do I traverse events while submitting an extrinsic?
// Import the test keyring (already has dev keys for Alice, Bob, Charlie, Eve & Ferdie)
const testKeyring = require("@polkadot/keyring/testing");
// Utility function for random values
const { randomAsU8a } = require("@polkadot/util-crypto");
// Some constants we are using in this sample
const ALICE = "0xE04CC55ebEE1cBCE552f250e85c57B70B2E2625b";
const AMOUNT = 10000;
async function main() {
// Instantiate the API
...
// Create an instance of our testing keyring
// If you're using ES6 module imports instead of require, just change this line to:
// const keyring = testKeyring();
const keyring = testKeyring.default();
// Get the nonce for the admin key
const { nonce } = await api.query.system.account(ALICE);
// Find the actual keypair in the keyring
const alicePair = keyring.getPair(ALICE);
// Create a new random recipient
const recipient = keyring.addFromSeed(randomAsU8a(32)).address;
console.log(
"Sending",
AMOUNT,
"from",
alicePair.address,
"to",
recipient,
"with nonce",
nonce.toString()
);
// Do the transfer and track the actual status
api.tx.balances
.transfer(recipient, AMOUNT)
.signAndSend(alicePair, { nonce }, ({ events = [], status }) => {
console.log("Transaction status:", status.type);
if (status.isInBlock) {
console.log("Included at block hash", status.asInBlock.toHex());
console.log("Events:");
events.forEach(({ event: { data, method, section }, phase }) => {
console.log("\t", phase.toString(), `: ${section}.${method}`, data.toString());
});
} else if (status.isFinalized) {
console.log("Finalized block hash", status.asFinalized.toHex());
process.exit(0);
}
});
}
main().catch(console.error);
Storage
How do I check for storage existence?
In the metadata, a fallback is provided for each storage item. This means that when an entry does not exist, the fallback (which is the default value for the type) is provided instead - so querying for a non-existent key (unless the value is an Option) will still yield a value:
// retrieve Option<StakingLedger>
const ledger = await api.query.staking.ledger("0x25451A4de12dcCc2D166922fA938E900fCc4ED24");
// retrieve ValidatorPrefs (will yield the default value)
const prefs = await api.query.staking.validators("0x25451A4de12dcCc2D166922fA938E900fCc4ED24");
console.log(ledger.isNone, ledger.isSome); // true, false
console.log(JSON.stringify(prefs.toHuman())); // {"commission":"0"}
In the second case, the non-existent prefs returns the default/fallback value for the storage item. So in this case, we don't know if the value is set to 0 or unset. Existence can be checked by using the storage size, which would be zero if nothing is stored.
// exists
const sizeY = await api.query.staking.validators.size("0x25451A4de12dcCc2D166922fA938E900fCc4ED24");
// non existent
const sizeN = await api.query.staking.validators.size("0x5630a480727CD7799073b36472d9b1A6031F840b");
console.log(sizeY.isZero(), sizeY.toNumber()); // false 4
console.log(sizeN.isZero(), sizeN.toNumber()); // true 0
How do I use .entries()/.keys() on double maps?
As explained in the section Building with Native API, each map-type storage entry exposes the entries/keys helpers to retrieve the whole list. In the case of double maps, with the addition of a single argument, you can retrieve either all entries or a subset based on the first map key. In both these cases, entries/keys operate the same way: .entries() retrieves (StorageKey, Codec)[] and .keys() retrieves StorageKey[].
// Retrieves the entries for all slashes, in all eras (no arg)
const allEntries = await api.query.staking.nominatorSlashInEra.entries();
// nominatorSlashInEra(EraIndex, AccountId) for the types of the key args
allEntries.forEach(
([
{
args: [era, nominatorId],
},
value,
]) => {
console.log(`${era}: ${nominatorId} slashed ${value.toHuman()}`);
}
);
Alternatively, we can retrieve only the keys for a specific era by using an argument for the first part of the double map (as defined here, an EraIndex):
// Retrieves the keys for the slashed validators in era 652
const slashedKeys = await api.query.staking.nominatorSlashInEra.keys(652);
// key args still contains [EraIndex, AccountId] decoded
console.log(`slashed: ${slashedKeys.map(({ args: [era, nominatorId] }) => nominatorId)}`);
How do I read storage at a specific blockhash?
In addition to querying the latest storage, you can make storage queries at a specific blockhash. Be aware that the node applies a pruning strategy and typically only keeps the last 256 blocks, unless run in archive mode.
const ALICE = "0xE04CC55ebEE1cBCE552f250e85c57B70B2E2625b";
const BOB = "0x25451A4de12dcCc2D166922fA938E900fCc4ED24";
async function main() {
// Instantiate the API
...
// Retrieve the last block header, extracting the hash and parentHash
const { hash, parentHash } = await api.rpc.chain.getHeader();
console.log(`last header hash ${hash.toHex()}`);
// Retrieve the balance at the preceding block for Alice using an at api
const apiAt = await api.at(parentHash);
const balance = await apiAt.query.system.account(ALICE);
console.log(`Alice's balance at ${parentHash.toHex()} was ${balance.data.free}`);
// Now perform a multi query, returning multiple balances at once
const balances = await api.query.system.account.multi([ALICE, BOB]);
console.log(
`Current balances for Alice and Bob are ${balances[0].data.free} and ${balances[1].data.free}`
);
}
main()
.catch(console.error)
.finally(() => process.exit());
Transaction
How do I make a simple transfer?
The example below shows how to create a transaction to make a transfer from one account to another.
const { Keyring } = require("@polkadot/keyring");
const { hexToU8a } = require("@polkadot/util");
const BOB = "0x25451A4de12dcCc2D166922fA938E900fCc4ED24";
async function main() {
// Instantiate the API
...
// Construct the keyring after the API (crypto has an async init)
const keyring = new Keyring({ type: "ethereum" });
// Add Alice to our keyring
const alice = keyring.addFromSeed(hexToU8a(ALICE_PRIVATE_KEY));
// Create an extrinsic, transferring 12345 units to Bob
const transfer = api.tx.balances.transfer(BOB, 12345);
// Sign and send the transaction using our account
const hash = await transfer.signAndSend(alice);
console.log("Transfer sent with hash", hash.toHex());
}
main()
.catch(console.error)
.finally(() => process.exit());
How do I estimate the transaction fees?
In addition to the signAndSend helper on transactions, .paymentInfo (with the exact same parameters) is also exposed. Using the same sender, it applies a dummy signature to the transaction and then gets the fee estimation via RPC.
// estimate the fees as RuntimeDispatchInfo, using the signer (either
// address or locked/unlocked keypair) (When overrides are applied, e.g
// nonce, the format would be `paymentInfo(sender, { nonce })`)
const info = await api.tx.balances.transfer(recipient, 123).paymentInfo(sender);
// log relevant info, partialFee is Balance, estimated for current
console.log(`
class=${info.class.toString()},
weight=${info.weight.toString()},
partialFee=${info.partialFee.toHuman()}
`);
How do I get the decoded enum for an ExtrinsicFailed event?
Assuming you are sending a tx via .signAndSend, the callback yields information about the tx pool status, as well as any events, when isInBlock or isFinalized. If an extrinsic fails via a system.ExtrinsicFailed event, you can retrieve the error, provided it is defined as an enum on a module.
api.tx.balances.transfer(recipient, 123).signAndSend(sender, ({ status, events }) => {
if (status.isInBlock || status.isFinalized) {
events
// find/filter for failed events
.filter(({ event }) => api.events.system.ExtrinsicFailed.is(event))
// we know that data for system.ExtrinsicFailed is
// (DispatchError, DispatchInfo)
.forEach(
({
event: {
data: [error, info],
},
}) => {
if (error.isModule) {
// for module errors, we have the section indexed, lookup
const decoded = api.registry.findMetaError(error.asModule);
const { docs, method, section } = decoded;
console.log(`${section}.${method}: ${docs.join(" ")}`);
} else {
// Other, CannotLookup, BadOrigin, no extra info
console.log(error.toString());
}
}
);
}
});
As of @polkadot/api 2.3.1, additional result fields are exposed. Firstly, there is dispatchInfo: DispatchInfo, which occurs in both the ExtrinsicSuccess and ExtrinsicFailed events. Additionally, on failures, the dispatchError: DispatchError is exposed. With this in mind, the above example can be simplified to:
api.tx.balances
.transfer(recipient, 123)
.signAndSend(sender, ({ status, events, dispatchError }) => {
// status would still be set, but in the case of error we can shortcut
// to just check it (so an error would indicate InBlock or Finalized)
if (dispatchError) {
if (dispatchError.isModule) {
// for module errors, we have the section indexed, lookup
const decoded = api.registry.findMetaError(dispatchError.asModule);
const { docs, name, section } = decoded;
console.log(`${section}.${name}: ${docs.join(" ")}`);
} else {
// Other, CannotLookup, BadOrigin, no extra info
console.log(dispatchError.toString());
}
}
});
How do I get the Result of a Sudo event?
The section above shows you how to listen for the result of a regular extrinsic. However, Sudo extrinsics do not directly report the success or failure of the underlying call. Instead, a Sudo transaction will return Sudid(result), where result holds the information you are looking for.
To properly parse this information, we will follow the steps above, but then specifically peek into the event data to find the final result:
const unsub = await api.tx.sudo
.sudo(api.tx.balances.forceTransfer(user1, user2, amount))
.signAndSend(sudoPair, ({ status, events }) => {
if (status.isInBlock || status.isFinalized) {
events
// We know this tx should result in `Sudid` event.
.filter(({ event }) => api.events.sudo.Sudid.is(event))
// We know that `Sudid` returns just a `Result`
.forEach(
({
event: {
data: [result],
},
}) => {
// Now we look to see if the extrinsic was actually successful or not...
if (result.isError) {
let error = result.asError;
if (error.isModule) {
// for module errors, we have the section indexed, lookup
const decoded = api.registry.findMetaError(error.asModule);
const { docs, name, section } = decoded;
console.log(`${section}.${name}: ${docs.join(" ")}`);
} else {
// Other, CannotLookup, BadOrigin, no extra info
console.log(error.toString());
}
}
}
);
unsub();
}
});
How do I send an unsigned extrinsic?
For most runtime modules, transactions need to be signed, and validation for this happens on the node side. There are, however, modules that accept unsigned extrinsics. The Polkadot/Kusama token claims pallet is used as an example below.
// construct the transaction, exactly as per normal
const utx = api.tx.claims.claim(beneficiary, ethSignature);
// send it without calling sign, pass callback with status/events
utx.send(({ status }) => {
if (status.isInBlock) {
console.log(`included in ${status.asInBlock}`);
}
});
The signing is indicated by the first byte in the transaction; in this case, we have called .send on it (no .sign or .signAndSend), so it will be sent in the unsigned state, without a signature attached.
:::note
The status event is only available on providers that support subscriptions, such as WsProvider or ScProvider. HttpProvider does not have bi-directional capabilities, so there are no subscriptions and it cannot listen to the events emitted by the transaction pool. In the case of HttpProvider, the result returned will always be the (non-unique) transaction hash.
:::
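The provider note above can be sketched as follows. This is an illustrative helper (submitOverHttp is not part of the API), assuming an api instance constructed over an HttpProvider and the same beneficiary/ethSignature values as in the claims example:

```javascript
// A sketch: with an api created over an HttpProvider, there are no status
// callbacks - send() resolves directly to the (non-unique) transaction hash
async function submitOverHttp(api, beneficiary, ethSignature) {
  // construct and submit the unsigned claim, exactly as before
  const hash = await api.tx.claims.claim(beneficiary, ethSignature).send();
  // the only result available over HTTP is the hash itself
  return hash.toHex();
}
```

If you need status or event tracking, switch the same code to a WsProvider-backed api; the submission call itself does not change.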
How can I batch transactions?
Polkadot/Substrate provides a utility.batch method that can be used to send a number of transactions at once. These are then executed from a single sender (with a single nonce specified) in sequence. This is very useful in many cases: for instance, if you wish to create a payout for a validator for multiple eras, you can use this method. Likewise, you can send several transfers at once, or batch different types of transactions.
// construct a list of transactions we want to batch
const txs = [
api.tx.balances.transfer(addrBob, 12345),
api.tx.balances.transfer(addrEve, 12345),
api.tx.staking.unbond(12345),
];
// construct the batch and send the transactions
api.tx.utility.batch(txs).signAndSend(sender, ({ status }) => {
if (status.isInBlock) {
console.log(`included in ${status.asInBlock}`);
}
});
The fee for a batch transaction can be estimated similarly to the fee for a single transaction, using the exposed .paymentInfo helper method described earlier; it is usually less than the sum of the fees for each individual transaction.
How do I take the pending tx pool into account in my nonce?
The system.account query will always contain the current state, i.e. it reflects the nonce for the last known block. As such, when sending multiple transactions in quick succession (see batching above), there may be transactions in the pool with the same nonce that signAndSend would apply - this call does not do any magic, it simply reads the state for the nonce. Since we can specify options to the signAndSend operation, we can override the nonce, either by manually incrementing it or by querying it via rpc.system.accountNextIndex.
for (let i = 0; i < 10; i++) {
// retrieve sender's next index/nonce, taking txs in the pool into account
const nonce = await api.rpc.system.accountNextIndex(sender);
// send, just retrieving the hash, not waiting on status
const txhash = await api.tx.balances.transfer(recipient, 123).signAndSend(sender, { nonce });
}
As a convenience, the accountNextIndex call can be omitted by specifying a nonce of -1, allowing the API to do the lookup. In this case, the above example can be simplified even further:
for (let i = 0; i < 10; i++) {
const txhash = await api.tx.balances.transfer(recipient, 123).signAndSend(sender, { nonce: -1 });
}
The latter form is preferred since it dispatches the RPC calls for the nonce and blockHash (used for mortality) in parallel. This approach yields better throughput, especially in the bulk example above.