Building a data-driven Thala using Sentio
TL;DR
Thala uses a blockchain indexer to gather pool TVL, volume, swap history, and other data. This lets us make data-informed decisions and build informative user interfaces. When weighing whether to build or buy an indexer, we chose Sentio for its prompt responses and swift delivery of feature requests. We have used Sentio to tackle a range of data challenges and are delighted with our ongoing collaboration.
Why Use a Blockchain Indexer?
Data plays a pivotal role at Thala:
- It empowers users with a more informative application, enhancing their understanding of the protocol.
- Developers can draw insights from data to make critical technical decisions.
- Governance participants can rely on data to comprehend the protocol’s activity and health before voting on vital protocol decisions.
However, becoming data-driven is challenging because on-chain data is stored in low-level formats that are difficult to query and aggregate. To tackle this problem, we use a blockchain indexer to:
- Collect blockchain data in real time.
- Transform blockchain data into meaningful metrics and insights.
- Provide developer-friendly APIs to access the data.
Choosing the Right Solution
We need an indexing tool that:
- Is readily available on Aptos, a nascent chain with relatively sparse infrastructure
- Allows us to implement custom transformation logic in a language we are familiar with, such as TypeScript
- Is responsive, with quick answers to questions and feature requests
Using the Aptos Indexer
The Aptos Indexer is designed to meet general-purpose needs. To index application-specific data, we would either need to modify the indexer code and run a custom indexer node, or aggregate on top of the exposed GraphQL API to generate Thala-specific data. Either approach would require us to maintain our own data infrastructure, adding a significant DevOps burden to our small development team.
Choosing Sentio
At the time, no indexer vendor supported Aptos. However, after discussing our needs with Sentio, a well-funded team of strong engineers building observability tooling for crypto, we were given access to their beta Aptos support within a week. We started with time series metrics and gradually benefited from more advanced features, including event logs and a SQL interface.
The choice of Sentio was not just about the product, but also the service. Features were delivered in days rather than months, and on several occasions we worked with Sentio developers at odd hours to keep the indexer running smoothly without affecting users.
Sentio in Use
The Sentio usage process can be summarized in the following steps:
- Identify smart contracts to be consumed by Sentio and import their ABI file into your Sentio workspace. The Sentio SDK then generates a full suite of indexer hooks.
- Implement the indexer logic using the generated hooks (onTransaction, onEvent, etc.). When an event is received, update a metric (Gauge or Counter) or emit a log using the available data: the decoded transaction and event payloads, plus Aptos and CoinGecko clients for on-chain resources and coin prices.
- Optionally, add debug prints to verify the correctness of the indexer logic implementation.
- Deploy the indexer.
- Verify the indexed data in the Sentio UI.
- If everything looks satisfactory, set the contracts to active.
- Query the indexed data using the Sentio REST API.
To better understand how we use Sentio, let's examine four examples adapted from the Thala indexer code we run in production. You can also explore our GitHub repository.
Pool Volume
Pool volume is indicative of ThalaSwap activity. To calculate daily pool volume, we process the SwapEvent emitted by weighted_pool.move inside the onEventSwapEvent handler.
import { weighted_pool } from "../types/aptos/amm.js";
import { Gauge } from "@sentio/sdk";
import { getCoinInfo, getPrice } from "@sentio/sdk/aptos/ext";

// Record sparse data points and aggregate them into hourly intervals.
const volOptions = {
  sparse: true,
  aggregationConfig: {
    intervalInMinutes: [60],
  },
};

const volumeGauge = Gauge.register("pool_volume_usd", volOptions);

weighted_pool.bind(...).onEventSwapEvent(async (event, ctx) => {
  const poolTag = getPoolTag(...);
  const timestamp = Number(ctx.transaction.timestamp);

  // Price the input coin at the time of the swap and convert the raw
  // on-chain amount into a human-readable value.
  const coinIn = getCoinInfo(event.type_arguments[0]);
  const priceIn = await getPrice(coinIn.token_type.type, timestamp);
  const amountIn = event.data_decoded.amount_in.scaleDown(coinIn.decimals);
  const volumeUsd = amountIn.multipliedBy(priceIn);

  volumeGauge.record(ctx, volumeUsd, { poolTag });
});
After uploading the indexer code, we can view the metrics in the Sentio UI. Even better, we can chart them with the built-in dashboard editor. Last but not least, "Export as Curl" gives us a ready-made REST query, and with a few tweaks to that query we can display the pool volume data in our app.
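On the app side, consuming that exported query might look roughly like the sketch below. This is hypothetical: SENTIO_METRIC_URL stands in for the URL copied from "Export as Curl", and the header name and response fields (results, value, timestamp) are assumptions about the response shape, not the exact Sentio API.

// Hypothetical sketch only: the URL is pasted from "Export as Curl", and the
// header name and response field names below are assumptions, not the exact API.
const SENTIO_METRIC_URL = "<url copied from Export as Curl>";

interface VolumePoint {
  timestamp: number;
  volumeUsd: number;
}

async function fetchPoolVolume(apiKey: string): Promise<VolumePoint[]> {
  const res = await fetch(SENTIO_METRIC_URL, {
    headers: { "api-key": apiKey }, // header name is an assumption
  });
  const json = await res.json();
  // Map whatever sample format the endpoint returns into chart-friendly points.
  return (json.results ?? []).map((sample: any) => ({
    timestamp: sample.timestamp,
    volumeUsd: Number(sample.value),
  }));
}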
Pool TVL
TVL is a significant metric for ThalaSwap. Its calculation is more involved, since coin prices move independently of smart contract events, but with Sentio's AptosResourcesProcessor, which takes periodic snapshots of Aptos module resources, we can compute each pool's TVL at any point in time.
import { weighted_pool } from "../types/aptos/amm.js";
import { Gauge } from "@sentio/sdk";
import {
  AptosResourcesProcessor,
  AptosContext,
  defaultMoveCoder,
} from "@sentio/sdk/aptos";

const tvlByPoolGauge = Gauge.register("tvl_by_pool", { sparse: true });

AptosResourcesProcessor.bind(...).onTimeInterval(
  async (resources, ctx) => {
    // Decode every WeightedPool resource found in this snapshot.
    const pools = await defaultMoveCoder().filterAndDecodeResources<
      weighted_pool.WeightedPool<any, any, any, any, any, any, any, any>
    >(weighted_pool.WeightedPool.TYPE_QNAME, resources);
    console.log("number of weighted pools:", pools.length);

    for (const pool of pools) {
      const tvl = ... // similar logic as in the pool volume example
      tvlByPoolGauge.record(ctx, tvl, { poolType: pool.type });
    }
  },
  5, // process every 5 minutes going forward
  60 // backfill historical data at 60-minute intervals
);
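For completeness, here is a hedged sketch of the elided TVL computation, following the same pricing pattern as the volume example. The layout of the decoded pool (type_arguments, data_decoded.asset_0.value, and so on) and the two-asset assumption are illustrative, not the exact production code.

import { BigDecimal } from "@sentio/sdk";
import { getCoinInfo, getPrice } from "@sentio/sdk/aptos/ext";

// Hedged sketch: the decoded pool's field names below are assumptions.
async function computePoolTvl(pool: any, timestamp: number): Promise<BigDecimal> {
  // Assume a two-asset pool for brevity; production code iterates every asset.
  const balances = [pool.data_decoded.asset_0.value, pool.data_decoded.asset_1.value];
  let tvl = new BigDecimal(0);
  for (let i = 0; i < balances.length; i++) {
    const coinInfo = getCoinInfo(pool.type_arguments[i]); // coin type of asset i (assumed layout)
    const price = await getPrice(coinInfo.token_type.type, timestamp);
    const amount = balances[i].scaleDown(coinInfo.decimals); // raw u64 -> human-readable
    tvl = tvl.plus(amount.multipliedBy(price));
  }
  return tvl;
}

In practice the timestamp comes from the processing context, and the per-asset loop generalizes to pools with more assets.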
Swap History
Swap history provides insights into traders' behavior. We track it by simply emitting the log payload via the eventLogger.
weighted_pool.bind(...).onEventSwapEvent(async (event, ctx) => {
  // ... see the above-mentioned volume tracking logic
  ctx.eventLogger.emit("swap", {
    distinctId: ctx.transaction.sender,
    message: `Swap ${amountIn} ${coinIn} for ${swapAmountOut} ${coinOut}`,
    // a JSON struct that contains data about the swap
    ...swapAttributes,
  });
});
Under the hood, the structured logs are stored in a ClickHouse database, which can process large queries quickly. For example, we can easily identify large traders who bought more than 10,000 MOD in a single transaction.
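A query along these lines does the job. The column names are illustrative: they depend on the fields included in swapAttributes, the filter assumes amounts are stored in human-readable units, and the MOD coin type is abbreviated.

-- Illustrative sketch: column names assume swapAttributes includes
-- coin_out and amount_out (already scaled to human-readable units).
SELECT distinct_id AS trader,
       amount_out,
       timestamp
FROM swap
WHERE coin_out LIKE '%::mod_coin::MOD'
  AND amount_out > 10000
ORDER BY amount_out DESC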
Aside from using the Sentio app, we can also access the data by sending SQL queries to a REST endpoint. For example, we send SELECT * FROM swap WHERE pair = ... ORDER BY block_number DESC LIMIT ... OFFSET ... to render a paginated swap table in our app.
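A minimal sketch of that call is shown below. The endpoint path, auth header, and request/response shapes are assumptions (placeholders to be replaced with the real values from the Sentio dashboard), not the exact API.

// Hypothetical sketch: endpoint path, header name, and request/response
// shapes are assumptions; the real URL and API key come from the Sentio dashboard.
const SENTIO_SQL_URL =
  "https://app.sentio.xyz/api/v1/analytics/<owner>/<project>/sql/execute"; // placeholder
const API_KEY = process.env.SENTIO_API_KEY!;

async function executeSql(sql: string): Promise<any[]> {
  const res = await fetch(SENTIO_SQL_URL, {
    method: "POST",
    headers: { "api-key": API_KEY, "Content-Type": "application/json" },
    body: JSON.stringify({ sqlQuery: { sql } }), // request shape is an assumption
  });
  const json = await res.json();
  return json.result?.rows ?? []; // response shape is an assumption
}

// Render one page of the swap table (inputs assumed sanitized upstream).
async function fetchSwapPage(pair: string, page: number, pageSize = 25) {
  return executeSql(`
    SELECT * FROM swap
    WHERE pair = '${pair}'
    ORDER BY block_number DESC
    LIMIT ${pageSize} OFFSET ${page * pageSize}`);
}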
Vault Hints
A more interesting application of event logs is computing vault hints. In the Move Dollar contracts, we allow callers to pass a hint: Option<address> to the vault operations:
public fun deposit_collateral<CoinType>(account: &signer, collateral: Coin<CoinType>, hint: Option<address>)
public fun withdraw_collateral<CoinType>(account: &signer, amount: u64, hint: Option<address>): Coin<CoinType>
public fun borrow<CoinType>(account: &signer, amount: u64, hint: Option<address>): Coin<MOD>
public fun repay<CoinType>(account: &signer, debt: Coin<MOD>, hint: Option<address>)
public fun liquidate<CoinType>(vault_addr: address, hint: Option<address>): Coin<CoinType>
The reason for this is that, for each collateral type, we want to maintain a linked list of user vaults ordered by collateral ratio. This makes redemption (if enabled) easier, since we can walk the vaults from the lowest collateral ratio to the highest. However, maintaining an ordered list on-chain can be computationally expensive: every time a vault is updated, it may need to move to a new position, which in the worst case is an O(n) linear traversal of the list.
To work around this issue, we maintain the ordered list off-chain instead. Before updating a vault, we compute its new position off-chain (the "vault hint") and pass it on-chain, reducing the vault update operation to O(1).
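To make the idea concrete, here is a small TypeScript sketch of an ordered list keyed by collateral ratio. It is a simplified model, not the on-chain Move code: without a hint, insertion scans from the head; with an accurate hint, it only touches the hinted neighbor.

// Simplified TypeScript model of the ordered vault list (the real list lives
// on-chain in Move); it only illustrates why a good hint makes updates O(1).
interface VaultNode {
  addr: string;
  ratio: number;          // collateral ratio
  next: VaultNode | null; // next vault, in ascending ratio order
}

// Without a hint: walk from the head until the insertion point is found. O(n).
function insertWithoutHint(head: VaultNode | null, node: VaultNode): VaultNode {
  if (!head || node.ratio <= head.ratio) {
    node.next = head;
    return node;
  }
  let cur = head;
  while (cur.next && cur.next.ratio < node.ratio) cur = cur.next;
  node.next = cur.next;
  cur.next = node;
  return head;
}

// With a hint: the caller supplies the predecessor computed off-chain, so the
// code only has to validate the neighbors and link the node. O(1).
function insertWithHint(hint: VaultNode, node: VaultNode): void {
  const valid =
    hint.ratio <= node.ratio && (!hint.next || node.ratio <= hint.next.ratio);
  if (!valid) throw new Error("stale hint: fall back to a full traversal");
  node.next = hint.next;
  hint.next = node;
}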
Implementing an off-chain mirror of the ordered vault list in Sentio is straightforward. To begin with, we keep track of vault updates using event logs.
import { vault } from "./types/aptos/mod.js";

// Every vault-mutating event emits the same "update_vault" log.
vault.bind(...)
  .onEventBorrowEvent((event, ctx) => {
    ctx.eventLogger.emit("update_vault", getVaultUpdatedAttr(event));
  })
  .onEventRepayEvent((event, ctx) => {
    ctx.eventLogger.emit("update_vault", getVaultUpdatedAttr(event));
  })
  .onEventDepositEvent((event, ctx) => {
    ctx.eventLogger.emit("update_vault", getVaultUpdatedAttr(event));
  })
  .onEventWithdrawEvent((event, ctx) => {
    ctx.eventLogger.emit("update_vault", getVaultUpdatedAttr(event));
  })
  .onEventLiquidationEvent((event, ctx) => {
    ctx.eventLogger.emit("update_vault", getVaultUpdatedAttr(event));
  })
  .onEventRedemptionEvent((event, ctx) => {
    ctx.eventLogger.emit("update_vault", getVaultUpdatedAttr(event));
  });

function getVaultUpdatedAttr(event) {
  return {
    distinctId: ...,
    coinType: ...,
    account: ...,
    collateral: ...,
    liability: ...
  };
}
This creates a ClickHouse table called "update_vault" with columns "distinct_id", "coinType", "account", "collateral", and "liability". The Thala app then uses the following SQL query to determine the vault's new position after an update and includes the returned account as the vault hint in write transactions.
SELECT account
FROM (
  -- latest state of every other vault for this collateral type
  SELECT *,
         ROW_NUMBER() OVER (PARTITION BY distinct_id ORDER BY timestamp DESC, log_index DESC) AS row_num
  FROM update_vault
  WHERE coinType = '${coinType}'
    AND account <> '${account}'
) ranked
WHERE row_num = 1
-- pick the vault whose collateral ratio is closest to the updated vault's new ratio
ORDER BY ABS(collateral / NULLIF(liability, 0) - ${collateral} / ${liability})
LIMIT 1;
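On the app side, this boils down to interpolating the vault's coin type, account, and new collateral/liability into the query, running it through a SQL-over-REST call like the hedged executeSql helper sketched in the swap-history section, and passing the returned account on-chain. A rough, illustrative sketch (helper and row shape are assumptions):

// Illustrative only: executeSql is the hedged SQL-over-REST helper sketched
// in the swap-history section, not a Sentio SDK function.
async function getVaultHint(
  coinType: string,
  account: string,
  collateral: number,
  liability: number
): Promise<string | undefined> {
  // Interpolate the query shown above (inputs are assumed to be sanitized).
  const sql = `
    SELECT account
    FROM (
      SELECT *,
             ROW_NUMBER() OVER (PARTITION BY distinct_id ORDER BY timestamp DESC, log_index DESC) AS row_num
      FROM update_vault
      WHERE coinType = '${coinType}' AND account <> '${account}'
    ) ranked
    WHERE row_num = 1
    ORDER BY ABS(collateral / NULLIF(liability, 0) - ${collateral} / ${liability})
    LIMIT 1`;
  const rows = await executeSql(sql);
  // The returned address (if any) is submitted as the hint: Option<address>
  // argument of the subsequent deposit/withdraw/borrow/repay transaction.
  return rows[0]?.account;
}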
Conclusion
We are thrilled with Sentio's real-time Aptos indexing and insight capabilities. Sentio has become a critical part of Thala's infrastructure, and we're excited to continue the partnership as we build out the rest of our product stack.