Create an indexer with DipDup

This guide will walk you through getting your first DipDup indexer up and running in a few minutes. DipDup is a Python framework that lets you build a custom indexer for your project.

What you'll learn:

  • How to use DipDup
  • How to create an indexer


Tags: DipDup, Python
Last Updated: May 09, 2024

Let's create an indexer to track balances of a particular token. We will need to set up the indexing environment, configure the indexer, and store the results in a database.


Prerequisites

Here are a few things you need to get started with DipDup:

  • Skills: Basic Python 3 knowledge to implement data handlers.
  • Operating System: You can use any Linux/macOS distribution on amd64/arm64 platforms with Python installed.
  • Python Version: Python 3.11 is required for DipDup. You can check your Python version by running python3 --version in your terminal.
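If you prefer, the same version check can be done from Python itself; a quick sketch:

```python
import sys

# DipDup requires Python 3.11 or newer
ok = sys.version_info >= (3, 11)
print(f'Python {sys.version_info.major}.{sys.version_info.minor}:',
      'OK for DipDup' if ok else 'too old for DipDup')
```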

Understanding DipDup

DipDup is a software framework that helps web3 developers create selective indexers for decentralized applications. It uses blockchain data provided by various off-chain data sources. Some of the key features of DipDup include:

  • Ready For Multichain: DipDup supports dozens of blockchains, and we are constantly adding new ones. You can easily reuse your business logic for different networks or even index multiple chains in a single project.
  • Declarative Configuration: A whole indexer is defined by a single configuration file and a bunch of Python data handlers. Code is completely separated from the configuration and environment variables, making it easy to maintain and deploy your project.
  • Lots of Integrations: You can use SQLite, PostgreSQL, or TimescaleDB databases to store blockchain data, deploy to Compose or Swarm with a single command, monitor your indexer with Prometheus or Sentry, and more.
  • Magic GraphQL API: DipDup automatically generates a GraphQL API for your data using Hasura, so you can easily query it from your frontend or other services. You can easily extend the API with custom queries and metadata requests.
  • Powerful CLI: DipDup CLI has everything you need to manage your project, from creating a new one to running and deploying. And it's very convenient. There are lots of templates for various blockchains and use cases, so you can start quickly.

Install DipDup

The easiest way to install DipDup as a CLI application is pipx, with the pipx install dipdup command. If you don't want to deal with tooling, we have a convenient installer script. Run the following command in your terminal:

curl -Lsf | python3

See the DipDup Installation page for other options.

Create a project

DipDup CLI has a built-in project generator. Run the following command in your terminal:

dipdup new

In this guide, we will use one of our demos, demo_evm_events, as a template.

DipDup will generate a complete project structure, including the USDT balance-tracking logic implemented in the demo. We'll look at it more closely in the following steps.
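The exact layout depends on the DipDup version, but a freshly generated project typically looks something like this (names are illustrative):

```
zksync_demo/
├── dipdup.yaml      # main indexer config
├── configs/         # environment-specific config overrides
├── deploy/          # Dockerfile, compose.yaml, .env files
├── abi/             # contract ABIs
├── types/           # typeclasses generated from ABIs
├── handlers/        # your data handlers (on_transfer, etc.)
└── models/          # database models
```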

Configuration file

In the project root, you'll find a file named dipdup.yaml. It's the main configuration file of your indexer. We will discuss it in detail in the Config section; for now, just replace the contract address with your target token. We will use ZKsync USDT for this example:

spec_version: 2.0
package: zksync_demo

datasources:
  subsquid:
    kind: evm.subsquid
    url: ${SUBSQUID_URL:-}
    node: evm_node
  etherscan:
    kind: abi.etherscan
    url: ${ETHERSCAN_URL:-}
    api_key: ${ETHERSCAN_API_KEY:-''}
  evm_node:
    kind: evm.node
    url: ${NODE_URL:-}/${NODE_API_KEY:-''}
    ws_url: ${NODE_WS_URL:-wss://}/${NODE_API_KEY:-''}

contracts:
  eth_usdt:
    kind: evm
    address: 0x493257fD37EDB34451f62EDf8D2a0C418852bA4C
    typename: eth_usdt

indexes:
  eth_usdt_events:
    kind: evm.subsquid.events
    datasource: subsquid
    handlers:
      - callback: on_transfer
        contract: eth_usdt
        name: Transfer
Implement handlers

Now, let's examine handlers/on_transfer.py. As defined in dipdup.yaml, this handler is invoked for each Transfer event emitted by the token contract. In this guide, we focus on tracking USDT balances on ZKsync Era:
from decimal import Decimal

from tortoise.exceptions import DoesNotExist

from dipdup.context import HandlerContext
from dipdup.models.evm_subsquid import SubsquidEvent

from zksync_demo import models as models
from zksync_demo.types.eth_usdt.evm_events.transfer import Transfer


async def on_transfer(
    ctx: HandlerContext,
    event: SubsquidEvent[Transfer],
) -> None:
    # USDT has 6 decimals
    amount = Decimal(event.payload.value) / (10**6)
    if not amount:
        return

    await on_balance_update(
        address=event.payload.from_,
        balance_update=-amount,
        level=event.data.level,
    )
    await on_balance_update(
        address=event.payload.to,
        balance_update=amount,
        level=event.data.level,
    )


async def on_balance_update(
    address: str,
    balance_update: Decimal,
    level: int,
) -> None:
    try:
        holder = await models.Holder.cached_get(pk=address)
    except DoesNotExist:
        holder = models.Holder(
            address=address,
            balance=0,
            turnover=0,
            tx_count=0,
            last_seen=None,
        )
        holder.cache()
    holder.balance += balance_update
    holder.turnover += abs(balance_update)
    holder.tx_count += 1
    holder.last_seen = level
    await holder.save()

Notice that we utilize the Holder model predefined in the package's models module. DipDup is compatible with several databases, including SQLite, PostgreSQL, and TimescaleDB, thanks to a custom ORM layer built on top of Tortoise ORM. For more information on DipDup's data models and how to utilize them in your projects, visit the models page in our docs.


Run the indexer

In dipdup.yaml we have three datasources; in this tutorial we will use two of them: Subsquid Network for historical data and a JSON-RPC API (EVM node) for real-time and historical data. Before running the indexer, set the URLs for these datasources. They can be set in dipdup.yaml, but here we suggest the cleaner .env file approach:

  1. Copy the deploy/.env.default file to deploy/.env.
  2. Set your EVM node URLs; for this example we will use a public RPC endpoint: NODE_URL=, NODE_WS_URL=wss://
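After these steps, deploy/.env might look something like the following (angle-bracket values are placeholders, not real endpoints):

```
SUBSQUID_URL=<subsquid-network-gateway-url>
ETHERSCAN_URL=<etherscan-compatible-api-url>
ETHERSCAN_API_KEY=<your-etherscan-api-key>
NODE_URL=<https-rpc-endpoint>
NODE_WS_URL=<wss-rpc-endpoint>
NODE_API_KEY=<your-node-api-key>
```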

Start the indexer:

dipdup -e deploy/.env -c dipdup.yaml -c configs/dipdup.sqlite.yaml run

The data will be stored in an SQLite database. Run this query to check it:

sqlite3 /tmp/zksync_demo.sqlite 'SELECT * FROM holder LIMIT 10'

The next part of the guide explains how to run a production-ready DipDup application in a Docker environment with a GraphQL API for your data.

Query API

The most powerful and most common DipDup configuration uses PostgreSQL to store data and Hasura to serve a production-ready API. You can deploy this stack in a Docker environment in a few simple steps:

  1. Generate and set HASURA_SECRET and POSTGRES_PASSWORD in the deploy/.env file. The Hasura secret will be used later to access the Hasura console.
  2. Build and start Docker containers: docker compose --env-file deploy/.env -f deploy/compose.yaml up -d.
  3. Run docker ps to check that all containers are running, and locate the port of the Hasura console in the PORTS column. In some environments the container may only be reachable at localhost:PORT.
  4. As an example, let's query the first 10 addresses with a positive balance in the Hasura console.
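Assuming the Holder model from earlier, the Hasura query for step 4 could look roughly like this (field names follow the model; adjust to your schema):

```graphql
query TopHolders {
  holder(where: {balance: {_gt: 0}}, order_by: {balance: desc}, limit: 10) {
    address
    balance
  }
}
```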

Explore DipDup

To learn more about DipDup features, visit the official DipDup documentation. It offers an in-depth explanation of the framework's concepts and plenty of examples, from basic to advanced, enabling rapid and efficient development of blockchain data indexers of any complexity.

Made with ❤️ by the ZKsync Community