Merge branch 'main' into arb-fix-4

sragss, 2021-10-22 13:55:29 -07:00 (committed by Sam Ragsdale)
Commit e29d8bb310
86 changed files with 3144 additions and 548 deletions


@ -51,7 +51,7 @@ jobs:
- name: Run precommit
run: |
poetry run pre-commit
poetry run pre-commit run --all-files
- name: Test with pytest
shell: bash

.gitignore (3 changed lines)

@ -19,3 +19,6 @@ cache
# k8s
.helm
# env
.envrc

CONTRIBUTING.md (new file, 36 lines)

@ -0,0 +1,36 @@
# Contributing guide
Welcome to the Flashbots collective! We just ask you to be nice when you play with us.
## Pre-commit
We use pre-commit to maintain a consistent style, prevent errors, and ensure test coverage.
To set up, install dependencies through `poetry`:
```
poetry install
```
Then install pre-commit hooks with:
```
poetry run pre-commit install
```
## Tests
Run tests with:
```
kubectl exec deploy/mev-inspect-deployment -- poetry run pytest --cov=mev_inspect tests
```
## Send a pull request
- Your proposed changes should be first described and discussed in an issue.
- Open the branch in a personal fork, not in the team repository.
- Every pull request should be small and represent a single change. If the problem is complicated, split it into multiple issues and pull requests.
- Every pull request should be covered by unit tests.
We appreciate you, friend <3.

README.md (171 changed lines)

@ -1,143 +1,214 @@
# mev-inspect-py
> illuminating the dark forest 🌲🔦
**mev-inspect-py** is an MEV inspector for Ethereum
[![standard-readme compliant](https://img.shields.io/badge/readme%20style-standard-brightgreen.svg?style=flat-square)](https://github.com/RichardLitt/standard-readme)
[![Discord](https://img.shields.io/discord/755466764501909692)](https://discord.gg/7hvTycdNcK)
[Maximal extractable value](https://ethereum.org/en/developers/docs/mev/) inspector for Ethereum, to illuminate the [dark forest](https://www.paradigm.xyz/2020/08/ethereum-is-a-dark-forest/) 🌲💡
Given a block, mev-inspect finds:
- miner payments (gas + coinbase)
- token transfers and profit
- swaps and [arbitrages](https://twitter.com/bertcmiller/status/142763202826305946://twitter.com/bertcmiller/status/1427632028263059462)
- swaps and [arbitrages](https://twitter.com/bertcmiller/status/1427632028263059462)
- ...and more
Data is stored in Postgres for analysis
Data is stored in Postgres for analysis.
## Running locally
mev-inspect-py is built to run on kubernetes locally and in production
## Install
### Install dependencies
mev-inspect-py is built to run on kubernetes locally and in production.
1. Setup a local kubernetes deployment (we use [kind](https://kind.sigs.k8s.io/docs/user/quick-start))
### Dependencies
2. Setup [Tilt](https://docs.tilt.dev/install.html) which manages the local deployment
- [docker](https://www.docker.com/products/docker-desktop)
- [kind](https://kind.sigs.k8s.io/docs/user/quick-start), or a similar tool for running local Kubernetes clusters
- [kubectl](https://kubernetes.io/docs/tasks/tools/)
- [helm](https://helm.sh/docs/intro/install/)
- [tilt](https://docs.tilt.dev/install.html)
### Start up
### Set up
Create a new cluster with:
```
kind create cluster
```
Set an environment variable `RPC_URL` to an RPC for fetching blocks.
Set an environment variable `RPC_URL` to an RPC for fetching blocks
Example:
```
export RPC_URL="http://111.111.111.111:8546"
```
**Note: mev-inspect-py currently requires an RPC with support for parity traces**
**Note**: mev-inspect-py currently requires an RPC of a full archive node with support for Erigon traces and receipts (not geth 😔).
Next, start all services with:
```
tilt up
```
Press "space" to see a browser of the services starting up
Press "space" to see a browser of the services starting up.
On first startup, you'll need to apply database migrations with:
On first startup, you'll need to apply database migrations. Apply with:
```
kubectl exec deploy/mev-inspect-deployment -- alembic upgrade head
kubectl exec deploy/mev-inspect -- alembic upgrade head
```
## Inspecting
## Usage
### Inspect a single block
Inspecting block [12914944](https://twitter.com/mevalphaleak/status/1420416437575901185)
Inspecting block [12914944](https://twitter.com/mevalphaleak/status/1420416437575901185):
```
kubectl exec deploy/mev-inspect-deployment -- poetry run inspect-block 12914944
kubectl exec deploy/mev-inspect -- poetry run inspect-block 12914944
```
### Inspect many blocks
Inspecting blocks 12914944 to 12914954
Inspecting blocks 12914944 to 12914954:
```
kubectl exec deploy/mev-inspect-deployment -- poetry run inspect-many-blocks 12914944 12914954
kubectl exec deploy/mev-inspect -- poetry run inspect-many-blocks 12914944 12914954
```
### Inspect all incoming blocks
Start a block listener with
Start a block listener with:
```
kubectl exec deploy/mev-inspect-deployment -- /app/listener start
kubectl exec deploy/mev-inspect -- /app/listener start
```
By default, it will pick up wherever you left off.
If running for the first time, listener starts at the latest block
If running for the first time, listener starts at the latest block.
See logs for the listener with:
See logs for the listener with
```
kubectl exec deploy/mev-inspect-deployment -- tail -f listener.log
kubectl exec deploy/mev-inspect -- tail -f listener.log
```
And stop the listener with
And stop the listener with:
```
kubectl exec deploy/mev-inspect-deployment -- /app/listener stop
kubectl exec deploy/mev-inspect -- /app/listener stop
```
## Contributing
### Exploring
### Guide
All inspect output data is stored in Postgres.
✨ Coming soon
To connect to the local Postgres database for querying, launch a client container with:
### Pre-commit
We use pre-commit to maintain a consistent style, prevent errors, and ensure test coverage.
To set up, install dependencies through poetry
```
poetry install
kubectl run -i --rm --tty postgres-client --env="PGPASSWORD=password" --image=jbergknoff/postgresql-client -- mev_inspect --host=postgresql --user=postgres
```
Then install pre-commit hooks with
When you see the prompt:
```
poetry run pre-commit install
mev_inspect=#
```
### Tests
You're ready to query!
Try finding the total number of swaps decoded with UniswapV3Pool:
Run tests with
```
kubectl exec deploy/mev-inspect-deployment -- poetry run pytest --cov=mev_inspect tests
SELECT COUNT(*) FROM swaps WHERE abi_name='UniswapV3Pool';
```
or top 10 arbs by gross profit that took profit in WETH:
```
SELECT *
FROM arbitrages
WHERE profit_token_address = '0xc02aaa39b223fe8d0a0e5c4f27ead9083c756cc2'
ORDER BY profit_amount DESC
LIMIT 10;
```
Postgres tip: Enter `\x` to toggle "Expanded display" mode, which looks nicer for results with many columns.
## FAQ
### How do I delete / reset my local postgres data?
Stop the system if running
Stop the system if running:
```
tilt down
```
Delete it with
Delete it with:
```
kubectl delete pvc data-postgresql-postgresql-0
```
Start back up again
Start back up again:
```
tilt up
```
And rerun migrations to create the tables again
And rerun migrations to create the tables again:
```
kubectl exec deploy/mev-inspect-deployment -- alembic upgrade head
kubectl exec deploy/mev-inspect -- alembic upgrade head
```
### I was using the docker-compose setup and want to switch to kube, now what?
Re-add the old `docker-compose.yml` file to your mev-inspect-py directory
Re-add the old `docker-compose.yml` file to your mev-inspect-py directory.
A copy can be found [here](https://github.com/flashbots/mev-inspect-py/blob/ef60c097719629a7d2dc56c6e6c9a100fb706f76/docker-compose.yml)
Tear down docker-compose resources
Tear down docker-compose resources:
```
docker compose down
```
Then go through the steps in the current README for kube setup
Then go through the steps in the current README for kube setup.
### Error from server (AlreadyExists): pods "postgres-client" already exists
This means the postgres client container didn't shut down correctly.
Delete this one with:
```
kubectl delete pod/postgres-client
```
Then start it back up again.
## Maintainers
- [@lukevs](https://github.com/lukevs)
- [@gheise](https://github.com/gheise)
- [@bertmiller](https://github.com/bertmiller)
## Contributing
[Flashbots](https://flashbots.net) is a research and development collective working on mitigating the negative externalities of decentralized economies. We collaborate with the larger free software community to illuminate the dark forest.
You are welcome here <3.
- If you want to join us, come and say hi in our [Discord chat](https://discord.gg/7hvTycdNcK).
- If you have a question, feedback or a bug report for this project, please [open a new Issue](https://github.com/flashbots/mev-inspect-py/issues).
- If you would like to contribute with code, check the [CONTRIBUTING file](CONTRIBUTING.md).
- We just ask you to be nice.
## Security
If you find a security vulnerability in this project or any other initiative related to Flashbots, please let us know by sending an email to security@flashbots.net.
---
Made with ☀️ by the ⚡🤖 collective.


@ -16,8 +16,16 @@ k8s_yaml(configmap_from_dict("mev-inspect-rpc", inputs = {
k8s_yaml(secret_from_dict("mev-inspect-db-credentials", inputs = {
"username" : "postgres",
"password": "password",
"host": "postgresql",
}))
# if using https://github.com/taarushv/trace-db
# k8s_yaml(secret_from_dict("trace-db-credentials", inputs = {
# "username" : "username",
# "password": "password",
# "host": "trace-db-postgresql",
# }))
docker_build_with_restart("mev-inspect-py", ".",
entrypoint="/app/entrypoint.sh",
live_update=[
@ -25,7 +33,6 @@ docker_build_with_restart("mev-inspect-py", ".",
run("cd /app && poetry install",
trigger="./pyproject.toml"),
],
platform="linux/arm64",
)
k8s_yaml("k8s/app.yaml")
k8s_resource(workload="mev-inspect-deployment", resource_deps=["postgresql-postgresql"])
k8s_yaml(helm('./k8s/mev-inspect', name='mev-inspect'))
k8s_resource(workload="mev-inspect", resource_deps=["postgresql-postgresql"])


@ -5,12 +5,12 @@ from sqlalchemy import pool
from alembic import context
from mev_inspect.db import get_sqlalchemy_database_uri
from mev_inspect.db import get_inspect_database_uri
# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config
config.set_main_option("sqlalchemy.url", get_sqlalchemy_database_uri())
config.set_main_option("sqlalchemy.url", get_inspect_database_uri())
# Interpret the config file for Python logging.
# This line sets up loggers basically.


@ -0,0 +1,27 @@
"""Add received_collateral_address to liquidations
Revision ID: 205ce02374b3
Revises: c8363617aa07
Create Date: 2021-10-04 19:52:40.017084
"""
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision = "205ce02374b3"
down_revision = "c8363617aa07"
branch_labels = None
depends_on = None
def upgrade():
op.add_column(
"liquidations",
sa.Column("received_token_address", sa.String(256), nullable=True),
)
def downgrade():
op.drop_column("liquidations", "received_token_address")


@ -0,0 +1,38 @@
"""Create liquidations table
Revision ID: c8363617aa07
Revises: cd96af55108e
Create Date: 2021-09-29 14:00:06.857103
"""
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision = "c8363617aa07"
down_revision = "cd96af55108e"
branch_labels = None
depends_on = None
def upgrade():
op.create_table(
"liquidations",
sa.Column("created_at", sa.TIMESTAMP, server_default=sa.func.now()),
sa.Column("liquidated_user", sa.String(256), nullable=False),
sa.Column("liquidator_user", sa.String(256), nullable=False),
sa.Column("collateral_token_address", sa.String(256), nullable=False),
sa.Column("debt_token_address", sa.String(256), nullable=False),
sa.Column("debt_purchase_amount", sa.Numeric, nullable=False),
sa.Column("received_amount", sa.Numeric, nullable=False),
sa.Column("protocol", sa.String(256), nullable=True),
sa.Column("transaction_hash", sa.String(66), nullable=False),
sa.Column("trace_address", sa.String(256), nullable=False),
sa.Column("block_number", sa.Numeric, nullable=False),
sa.PrimaryKeyConstraint("transaction_hash", "trace_address"),
)
def downgrade():
op.drop_table("liquidations")

backfill.py (new file, 57 lines)

@ -0,0 +1,57 @@
import subprocess
import sys
from typing import Iterator, Tuple
def get_block_after_before_chunks(
after_block: int,
before_block: int,
n_workers: int,
) -> Iterator[Tuple[int, int]]:
n_blocks = before_block - after_block
remainder = n_blocks % n_workers
floor_chunk_size = n_blocks // n_workers
last_before_block = None
for worker_index in range(n_workers):
chunk_size = floor_chunk_size
if worker_index < remainder:
chunk_size += 1
batch_after_block = (
last_before_block if last_before_block is not None else after_block
)
batch_before_block = batch_after_block + chunk_size
yield batch_after_block, batch_before_block
last_before_block = batch_before_block
def backfill(after_block: int, before_block: int, n_workers: int):
if n_workers <= 0:
raise ValueError("Need at least one worker")
for batch_after_block, batch_before_block in get_block_after_before_chunks(
after_block,
before_block,
n_workers,
):
print(f"Backfilling {batch_after_block} to {batch_before_block}")
backfill_command = f"sh backfill.sh {batch_after_block} {batch_before_block}"
process = subprocess.Popen(backfill_command.split(), stdout=subprocess.PIPE)
output, _ = process.communicate()
print(output)
def main():
after_block = int(sys.argv[1])
before_block = int(sys.argv[2])
n_workers = int(sys.argv[3])
backfill(after_block, before_block, n_workers)
if __name__ == "__main__":
main()
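To make the chunking above concrete: the remainder blocks are handed to the earliest workers, one extra block each, so every worker gets either `floor_chunk_size` or `floor_chunk_size + 1` blocks. A quick illustrative check (run from the repository root so `backfill.py` is importable; the numbers are arbitrary):
```
from backfill import get_block_after_before_chunks

# Splitting blocks 100..110 across 3 workers: 10 blocks, remainder 1,
# so worker 0 gets 4 blocks and workers 1 and 2 get 3 each.
print(list(get_block_after_before_chunks(100, 110, 3)))
# [(100, 104), (104, 107), (107, 110)]
```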

backfill.sh (new file, 6 lines)

@ -0,0 +1,6 @@
current_image=$(kubectl get deployment mev-inspect -o=jsonpath='{$.spec.template.spec.containers[:1].image}')
helm template mev-inspect-backfill ./k8s/mev-inspect-backfill \
--set image.repository=$current_image \
--set command.startBlockNumber=$1 \
--set command.endBlockNumber=$2 | kubectl apply -f -

cli.py (43 changed lines)

@ -1,15 +1,21 @@
import os
import logging
import sys
import click
from web3 import Web3
from mev_inspect.db import get_session
from mev_inspect.classifiers.trace import TraceClassifier
from mev_inspect.db import get_inspect_session, get_trace_session
from mev_inspect.inspect_block import inspect_block
from mev_inspect.provider import get_base_provider
RPC_URL_ENV = "RPC_URL"
logging.basicConfig(stream=sys.stdout, level=logging.INFO)
logger = logging.getLogger(__name__)
@click.group()
def cli():
@ -21,14 +27,24 @@ def cli():
@click.option("--rpc", default=lambda: os.environ.get(RPC_URL_ENV, ""))
@click.option("--cache/--no-cache", default=True)
def inspect_block_command(block_number: int, rpc: str, cache: bool):
db_session = get_session()
inspect_db_session = get_inspect_session()
trace_db_session = get_trace_session()
base_provider = get_base_provider(rpc)
w3 = Web3(base_provider)
trace_classifier = TraceClassifier()
if not cache:
click.echo("Skipping cache")
logger.info("Skipping cache")
inspect_block(db_session, base_provider, w3, block_number, should_cache=cache)
inspect_block(
inspect_db_session,
base_provider,
w3,
trace_classifier,
block_number,
trace_db_session=trace_db_session,
)
@cli.command()
@ -40,29 +56,32 @@ def inspect_many_blocks_command(
after_block: int, before_block: int, rpc: str, cache: bool
):
db_session = get_session()
inspect_db_session = get_inspect_session()
trace_db_session = get_trace_session()
base_provider = get_base_provider(rpc)
w3 = Web3(base_provider)
trace_classifier = TraceClassifier()
if not cache:
click.echo("Skipping cache")
logger.info("Skipping cache")
for i, block_number in enumerate(range(after_block, before_block)):
block_message = (
f"Running for {block_number} ({i+1}/{before_block - after_block})"
)
dashes = "-" * len(block_message)
click.echo(dashes)
click.echo(block_message)
click.echo(dashes)
logger.info(dashes)
logger.info(block_message)
logger.info(dashes)
inspect_block(
db_session,
inspect_db_session,
base_provider,
w3,
trace_classifier,
block_number,
should_write_classified_traces=False,
should_cache=cache,
trace_db_session=trace_db_session,
)


@ -1,45 +0,0 @@
apiVersion: apps/v1
kind: Deployment
metadata:
name: mev-inspect-deployment
labels:
app: mev-inspect
spec:
replicas: 1
selector:
matchLabels:
app: mev-inspect
template:
metadata:
labels:
app: mev-inspect
spec:
containers:
- name: mev-inspect
image: mev-inspect-py
command: [ "/app/entrypoint.sh" ]
env:
- name: POSTGRES_USER
valueFrom:
secretKeyRef:
name: mev-inspect-db-credentials
key: username
- name: POSTGRES_PASSWORD
valueFrom:
secretKeyRef:
name: mev-inspect-db-credentials
key: password
- name: POSTGRES_HOST
value: postgresql
- name: RPC_URL
valueFrom:
configMapKeyRef:
name: mev-inspect-rpc
key: url
livenessProbe:
exec:
command:
- ls
- /
initialDelaySeconds: 20
periodSeconds: 5


@ -0,0 +1,23 @@
# Patterns to ignore when building packages.
# This supports shell glob matching, relative path matching, and
# negation (prefixed with !). Only one pattern per line.
.DS_Store
# Common VCS dirs
.git/
.gitignore
.bzr/
.bzrignore
.hg/
.hgignore
.svn/
# Common backup files
*.swp
*.bak
*.tmp
*.orig
*~
# Various IDEs
.project
.idea/
*.tmproj
.vscode/


@ -0,0 +1,24 @@
apiVersion: v2
name: mev-inspect-backfill
description: A Helm chart for Kubernetes
# A chart can be either an 'application' or a 'library' chart.
#
# Application charts are a collection of templates that can be packaged into versioned archives
# to be deployed.
#
# Library charts provide useful utilities or functions for the chart developer. They're included as
# a dependency of application charts to inject those utilities and functions into the rendering
# pipeline. Library charts do not define any templates and therefore cannot be deployed.
type: application
# This is the chart version. This version number should be incremented each time you make changes
# to the chart and its templates, including the app version.
# Versions are expected to follow Semantic Versioning (https://semver.org/)
version: 0.1.0
# This is the version number of the application being deployed. This version number should be
# incremented each time you make changes to the application. Versions are not expected to
# follow Semantic Versioning. They should reflect the version the application is using.
# It is recommended to use it with quotes.
appVersion: "1.16.0"


@ -0,0 +1,62 @@
{{/*
Expand the name of the chart.
*/}}
{{- define "mev-inspect-backfill.name" -}}
{{- default .Chart.Name .Values.nameOverride | trunc 63 | trimSuffix "-" }}
{{- end }}
{{/*
Create a default fully qualified app name.
We truncate at 63 chars because some Kubernetes name fields are limited to this (by the DNS naming spec).
If release name contains chart name it will be used as a full name.
*/}}
{{- define "mev-inspect-backfill.fullname" -}}
{{- if .Values.fullnameOverride }}
{{- .Values.fullnameOverride | trunc 63 | trimSuffix "-" }}
{{- else }}
{{- $name := default .Chart.Name .Values.nameOverride }}
{{- if contains $name .Release.Name }}
{{- .Release.Name | trunc 63 | trimSuffix "-" }}
{{- else }}
{{- printf "%s-%s" .Release.Name $name | trunc 63 | trimSuffix "-" }}
{{- end }}
{{- end }}
{{- end }}
{{/*
Create chart name and version as used by the chart label.
*/}}
{{- define "mev-inspect-backfill.chart" -}}
{{- printf "%s-%s" .Chart.Name .Chart.Version | replace "+" "_" | trunc 63 | trimSuffix "-" }}
{{- end }}
{{/*
Common labels
*/}}
{{- define "mev-inspect-backfill.labels" -}}
helm.sh/chart: {{ include "mev-inspect-backfill.chart" . }}
{{ include "mev-inspect-backfill.selectorLabels" . }}
{{- if .Chart.AppVersion }}
app.kubernetes.io/version: {{ .Chart.AppVersion | quote }}
{{- end }}
app.kubernetes.io/managed-by: {{ .Release.Service }}
{{- end }}
{{/*
Selector labels
*/}}
{{- define "mev-inspect-backfill.selectorLabels" -}}
app.kubernetes.io/name: {{ include "mev-inspect-backfill.name" . }}
app.kubernetes.io/instance: {{ .Release.Name }}
{{- end }}
{{/*
Create the name of the service account to use
*/}}
{{- define "mev-inspect-backfill.serviceAccountName" -}}
{{- if .Values.serviceAccount.create }}
{{- default (include "mev-inspect-backfill.fullname" .) .Values.serviceAccount.name }}
{{- else }}
{{- default "default" .Values.serviceAccount.name }}
{{- end }}
{{- end }}


@ -0,0 +1,51 @@
apiVersion: batch/v1
kind: Job
metadata:
name: {{ include "mev-inspect-backfill.fullname" . }}-{{ randAlphaNum 5 | lower }}
labels:
{{- include "mev-inspect-backfill.labels" . | nindent 4 }}
spec:
completions: 1
parallelism: 1
ttlSecondsAfterFinished: 5
template:
metadata:
{{- with .Values.podAnnotations }}
annotations:
{{- toYaml . | nindent 8 }}
{{- end }}
spec:
containers:
- name: {{ .Chart.Name }}
securityContext:
{{- toYaml .Values.securityContext | nindent 12 }}
image: "{{ .Values.image.repository }}"
imagePullPolicy: {{ .Values.image.pullPolicy }}
command:
- poetry
- run
- inspect-many-blocks
- {{ .Values.command.startBlockNumber | quote }}
- {{ .Values.command.endBlockNumber | quote }}
env:
- name: POSTGRES_HOST
valueFrom:
secretKeyRef:
name: mev-inspect-db-credentials
key: host
- name: POSTGRES_USER
valueFrom:
secretKeyRef:
name: mev-inspect-db-credentials
key: username
- name: POSTGRES_PASSWORD
valueFrom:
secretKeyRef:
name: mev-inspect-db-credentials
key: password
- name: RPC_URL
valueFrom:
configMapKeyRef:
name: mev-inspect-rpc
key: url
restartPolicy: OnFailure


@ -0,0 +1,42 @@
# Default values for mev-inspect.
# This is a YAML-formatted file.
# Declare variables to be passed into your templates.
image:
repository: mev-inspect-py
pullPolicy: IfNotPresent
imagePullSecrets: []
nameOverride: ""
fullnameOverride: ""
podAnnotations: {}
podSecurityContext: {}
# fsGroup: 2000
securityContext: {}
# capabilities:
# drop:
# - ALL
# readOnlyRootFilesystem: true
# runAsNonRoot: true
# runAsUser: 1000
resources: {}
# We usually recommend not to specify default resources and to leave this as a conscious
# choice for the user. This also increases chances charts run on environments with little
# resources, such as Minikube. If you do want to specify resources, uncomment the following
# lines, adjust them as necessary, and remove the curly braces after 'resources:'.
# limits:
# cpu: 100m
# memory: 128Mi
# requests:
# cpu: 100m
# memory: 128Mi
nodeSelector: {}
tolerations: []
affinity: {}


@ -0,0 +1,23 @@
# Patterns to ignore when building packages.
# This supports shell glob matching, relative path matching, and
# negation (prefixed with !). Only one pattern per line.
.DS_Store
# Common VCS dirs
.git/
.gitignore
.bzr/
.bzrignore
.hg/
.hgignore
.svn/
# Common backup files
*.swp
*.bak
*.tmp
*.orig
*~
# Various IDEs
.project
.idea/
*.tmproj
.vscode/


@ -0,0 +1,24 @@
apiVersion: v2
name: mev-inspect
description: A Helm chart for Kubernetes
# A chart can be either an 'application' or a 'library' chart.
#
# Application charts are a collection of templates that can be packaged into versioned archives
# to be deployed.
#
# Library charts provide useful utilities or functions for the chart developer. They're included as
# a dependency of application charts to inject those utilities and functions into the rendering
# pipeline. Library charts do not define any templates and therefore cannot be deployed.
type: application
# This is the chart version. This version number should be incremented each time you make changes
# to the chart and its templates, including the app version.
# Versions are expected to follow Semantic Versioning (https://semver.org/)
version: 0.1.0
# This is the version number of the application being deployed. This version number should be
# incremented each time you make changes to the application. Versions are not expected to
# follow Semantic Versioning. They should reflect the version the application is using.
# It is recommended to use it with quotes.
appVersion: "1.16.0"


@ -0,0 +1,62 @@
{{/*
Expand the name of the chart.
*/}}
{{- define "mev-inspect.name" -}}
{{- default .Chart.Name .Values.nameOverride | trunc 63 | trimSuffix "-" }}
{{- end }}
{{/*
Create a default fully qualified app name.
We truncate at 63 chars because some Kubernetes name fields are limited to this (by the DNS naming spec).
If release name contains chart name it will be used as a full name.
*/}}
{{- define "mev-inspect.fullname" -}}
{{- if .Values.fullnameOverride }}
{{- .Values.fullnameOverride | trunc 63 | trimSuffix "-" }}
{{- else }}
{{- $name := default .Chart.Name .Values.nameOverride }}
{{- if contains $name .Release.Name }}
{{- .Release.Name | trunc 63 | trimSuffix "-" }}
{{- else }}
{{- printf "%s-%s" .Release.Name $name | trunc 63 | trimSuffix "-" }}
{{- end }}
{{- end }}
{{- end }}
{{/*
Create chart name and version as used by the chart label.
*/}}
{{- define "mev-inspect.chart" -}}
{{- printf "%s-%s" .Chart.Name .Chart.Version | replace "+" "_" | trunc 63 | trimSuffix "-" }}
{{- end }}
{{/*
Common labels
*/}}
{{- define "mev-inspect.labels" -}}
helm.sh/chart: {{ include "mev-inspect.chart" . }}
{{ include "mev-inspect.selectorLabels" . }}
{{- if .Chart.AppVersion }}
app.kubernetes.io/version: {{ .Chart.AppVersion | quote }}
{{- end }}
app.kubernetes.io/managed-by: {{ .Release.Service }}
{{- end }}
{{/*
Selector labels
*/}}
{{- define "mev-inspect.selectorLabels" -}}
app.kubernetes.io/name: {{ include "mev-inspect.name" . }}
app.kubernetes.io/instance: {{ .Release.Name }}
{{- end }}
{{/*
Create the name of the service account to use
*/}}
{{- define "mev-inspect.serviceAccountName" -}}
{{- if .Values.serviceAccount.create }}
{{- default (include "mev-inspect.fullname" .) .Values.serviceAccount.name }}
{{- else }}
{{- default "default" .Values.serviceAccount.name }}
{{- end }}
{{- end }}


@ -0,0 +1,92 @@
apiVersion: apps/v1
kind: Deployment
metadata:
name: {{ include "mev-inspect.fullname" . }}
labels:
{{- include "mev-inspect.labels" . | nindent 4 }}
spec:
replicas: {{ .Values.replicaCount }}
selector:
matchLabels:
{{- include "mev-inspect.selectorLabels" . | nindent 6 }}
template:
metadata:
{{- with .Values.podAnnotations }}
annotations:
{{- toYaml . | nindent 8 }}
{{- end }}
labels:
{{- include "mev-inspect.selectorLabels" . | nindent 8 }}
spec:
{{- with .Values.imagePullSecrets }}
imagePullSecrets:
{{- toYaml . | nindent 8 }}
{{- end }}
securityContext:
{{- toYaml .Values.podSecurityContext | nindent 8 }}
containers:
- name: {{ .Chart.Name }}
securityContext:
{{- toYaml .Values.securityContext | nindent 12 }}
image: "{{ .Values.image.repository }}"
imagePullPolicy: {{ .Values.image.pullPolicy }}
livenessProbe:
exec:
command:
- ls
- /
initialDelaySeconds: 20
periodSeconds: 5
resources:
{{- toYaml .Values.resources | nindent 12 }}
env:
- name: POSTGRES_HOST
valueFrom:
secretKeyRef:
name: mev-inspect-db-credentials
key: host
- name: POSTGRES_USER
valueFrom:
secretKeyRef:
name: mev-inspect-db-credentials
key: username
- name: POSTGRES_PASSWORD
valueFrom:
secretKeyRef:
name: mev-inspect-db-credentials
key: password
- name: TRACE_DB_HOST
valueFrom:
secretKeyRef:
name: trace-db-credentials
key: host
optional: true
- name: TRACE_DB_USER
valueFrom:
secretKeyRef:
name: trace-db-credentials
key: username
optional: true
- name: TRACE_DB_PASSWORD
valueFrom:
secretKeyRef:
name: trace-db-credentials
key: password
optional: true
- name: RPC_URL
valueFrom:
configMapKeyRef:
name: mev-inspect-rpc
key: url
{{- with .Values.nodeSelector }}
nodeSelector:
{{- toYaml . | nindent 8 }}
{{- end }}
{{- with .Values.affinity }}
affinity:
{{- toYaml . | nindent 8 }}
{{- end }}
{{- with .Values.tolerations }}
tolerations:
{{- toYaml . | nindent 8 }}
{{- end }}


@ -0,0 +1,44 @@
# Default values for mev-inspect.
# This is a YAML-formatted file.
# Declare variables to be passed into your templates.
replicaCount: 1
image:
repository: mev-inspect-py:latest
pullPolicy: IfNotPresent
imagePullSecrets: []
nameOverride: ""
fullnameOverride: ""
podAnnotations: {}
podSecurityContext: {}
# fsGroup: 2000
securityContext: {}
# capabilities:
# drop:
# - ALL
# readOnlyRootFilesystem: true
# runAsNonRoot: true
# runAsUser: 1000
resources: {}
# We usually recommend not to specify default resources and to leave this as a conscious
# choice for the user. This also increases chances charts run on environments with little
# resources, such as Minikube. If you do want to specify resources, uncomment the following
# lines, adjust them as necessary, and remove the curly braces after 'resources:'.
# limits:
# cpu: 100m
# memory: 128Mi
# requests:
# cpu: 100m
# memory: 128Mi
nodeSelector: {}
tolerations: []
affinity: {}


@ -9,7 +9,8 @@ from mev_inspect.crud.latest_block_update import (
find_latest_block_update,
update_latest_block,
)
from mev_inspect.db import get_session
from mev_inspect.classifiers.trace import TraceClassifier
from mev_inspect.db import get_inspect_session, get_trace_session
from mev_inspect.inspect_block import inspect_block
from mev_inspect.provider import get_base_provider
from mev_inspect.signal_handler import GracefulKiller
@ -18,6 +19,9 @@ from mev_inspect.signal_handler import GracefulKiller
logging.basicConfig(filename="listener.log", level=logging.INFO)
logger = logging.getLogger(__name__)
# lag to make sure the blocks we see are settled
BLOCK_NUMBER_LAG = 5
def run():
rpc = os.getenv("RPC_URL")
@ -28,18 +32,23 @@ def run():
killer = GracefulKiller()
db_session = get_session()
inspect_db_session = get_inspect_session()
trace_db_session = get_trace_session()
trace_classifier = TraceClassifier()
base_provider = get_base_provider(rpc)
w3 = Web3(base_provider)
latest_block_number = get_latest_block_number(w3)
while not killer.kill_now:
last_written_block = find_latest_block_update(db_session)
last_written_block = find_latest_block_update(inspect_db_session)
logger.info(f"Latest block: {latest_block_number}")
logger.info(f"Last written block: {last_written_block}")
if last_written_block is None or last_written_block < latest_block_number:
if (last_written_block is None) or (
last_written_block < (latest_block_number - BLOCK_NUMBER_LAG)
):
block_number = (
latest_block_number
if last_written_block is None
@ -49,14 +58,14 @@ def run():
logger.info(f"Writing block: {block_number}")
inspect_block(
db_session,
inspect_db_session,
base_provider,
w3,
trace_classifier,
block_number,
should_write_classified_traces=False,
should_cache=False,
trace_db_session=trace_db_session,
)
update_latest_block(db_session, block_number)
update_latest_block(inspect_db_session, block_number)
else:
time.sleep(5)
latest_block_number = get_latest_block_number(w3)
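To make the new `BLOCK_NUMBER_LAG` check concrete: the listener only writes a block once it sits at least five blocks behind the chain head, so the blocks it inspects are settled. A small sketch of the same condition (the numbers are made up for illustration):
```
BLOCK_NUMBER_LAG = 5
latest_block_number = 13_000_000
last_written_block = 12_999_994

should_write = (last_written_block is None) or (
    last_written_block < (latest_block_number - BLOCK_NUMBER_LAG)
)
# True: 12_999_994 < 12_999_995, so the next block is written.
# Once last_written_block reaches 12_999_995, the listener waits
# for the head to advance instead.
```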
@ -65,4 +74,7 @@ def run():
if __name__ == "__main__":
run()
try:
run()
except Exception as e:
logger.error(e)

mev (new executable file, 49 lines)

@ -0,0 +1,49 @@
#!/bin/sh
set -e
DB_NAME=mev_inspect
function get_kube_db_secret(){
kubectl get secrets mev-inspect-db-credentials -o jsonpath="{.data.$1}" | base64 --decode
}
function db(){
host=$(get_kube_db_secret "host")
username=$(get_kube_db_secret "username")
password=$(get_kube_db_secret "password")
kubectl run -i --rm --tty postgres-client \
--env="PGPASSWORD=$password" \
--image=jbergknoff/postgresql-client \
-- $DB_NAME --host=$host --user=$username
}
case "$1" in
db)
echo "Connecting to $DB_NAME"
db
;;
backfill)
start_block_number=$2
end_block_number=$3
n_workers=$4
echo "Backfilling from $start_block_number to $end_block_number with $n_workers workers"
python backfill.py $start_block_number $end_block_number $n_workers
;;
inspect)
block_number=$2
echo "Inspecting block $block_number"
kubectl exec -ti deploy/mev-inspect -- poetry run inspect-block $block_number
;;
test)
echo "Running tests"
kubectl exec -ti deploy/mev-inspect -- poetry run pytest tests
;;
*)
echo "Usage: "$1" {inspect|test}"
exit 1
esac
exit 0


@ -0,0 +1,104 @@
from typing import List, Tuple, Optional
from mev_inspect.traces import (
get_child_traces,
is_child_of_any_address,
)
from mev_inspect.schemas.classified_traces import (
ClassifiedTrace,
DecodedCallTrace,
Classification,
Protocol,
)
from mev_inspect.transfers import get_transfer
from mev_inspect.schemas.transfers import Transfer
from mev_inspect.schemas.liquidations import Liquidation
AAVE_CONTRACT_ADDRESSES: List[str] = [
# AAVE Proxy
"0x398ec7346dcd622edc5ae82352f02be94c62d119",
# AAVE V2
"0x7d2768de32b0b80b7a3454c06bdac94a69ddc7a9",
# AAVE V1
"0x3dfd23a6c5e8bbcfc9581d2e864a68feb6a076d3",
# AAVE V2 WETH
"0x030ba81f1c18d280636f32af80b9aad02cf0854e",
# AAVE AMM Market DAI
"0x79be75ffc64dd58e66787e4eae470c8a1fd08ba4",
# AAVE i
"0x030ba81f1c18d280636f32af80b9aad02cf0854e",
"0xbcca60bb61934080951369a648fb03df4f96263c",
]
def get_aave_liquidations(
traces: List[ClassifiedTrace],
) -> List[Liquidation]:
"""Inspect list of classified traces and identify liquidation"""
liquidations: List[Liquidation] = []
parent_liquidations: List[List[int]] = []
for trace in traces:
if (
trace.classification == Classification.liquidate
and isinstance(trace, DecodedCallTrace)
and not is_child_of_any_address(trace, parent_liquidations)
and trace.protocol == Protocol.aave
):
parent_liquidations.append(trace.trace_address)
liquidator = trace.from_address
child_traces = get_child_traces(
trace.transaction_hash, trace.trace_address, traces
)
(
received_token_address,
received_amount,
) = _get_payback_token_and_amount(trace, child_traces, liquidator)
liquidations.append(
Liquidation(
liquidated_user=trace.inputs["_user"],
collateral_token_address=trace.inputs["_collateral"],
debt_token_address=trace.inputs["_reserve"],
liquidator_user=liquidator,
debt_purchase_amount=trace.inputs["_purchaseAmount"],
protocol=Protocol.aave,
received_amount=received_amount,
received_token_address=received_token_address,
transaction_hash=trace.transaction_hash,
trace_address=trace.trace_address,
block_number=trace.block_number,
)
)
return liquidations
def _get_payback_token_and_amount(
liquidation: DecodedCallTrace, child_traces: List[ClassifiedTrace], liquidator: str
) -> Tuple[str, int]:
"""Look for and return liquidator payback from liquidation"""
for child in child_traces:
if child.classification == Classification.transfer and isinstance(
child, DecodedCallTrace
):
child_transfer: Optional[Transfer] = get_transfer(child)
if (
child_transfer is not None
and child_transfer.to_address == liquidator
and child.from_address in AAVE_CONTRACT_ADDRESSES
):
return child_transfer.token_address, child_transfer.amount
return liquidation.inputs["_collateral"], 0


@ -12,15 +12,32 @@ THIS_FILE_DIRECTORY = Path(__file__).parents[0]
ABI_DIRECTORY_PATH = THIS_FILE_DIRECTORY / "abis"
def get_abi(abi_name: str, protocol: Optional[Protocol]) -> Optional[ABI]:
def get_abi_path(abi_name: str, protocol: Optional[Protocol]) -> Optional[Path]:
abi_filename = f"{abi_name}.json"
abi_path = (
ABI_DIRECTORY_PATH / abi_filename
if protocol is None
else ABI_DIRECTORY_PATH / protocol.value / abi_filename
)
if abi_path.is_file():
return abi_path
return None
# raw abi, for instantiating contract for queries (as opposed to classification, see below)
def get_raw_abi(abi_name: str, protocol: Optional[Protocol]) -> Optional[str]:
abi_path = get_abi_path(abi_name, protocol)
if abi_path is not None:
with abi_path.open() as abi_file:
return abi_file.read()
return None
def get_abi(abi_name: str, protocol: Optional[Protocol]) -> Optional[ABI]:
abi_path = get_abi_path(abi_name, protocol)
if abi_path is not None:
with abi_path.open() as abi_file:
abi_json = json.load(abi_file)
return parse_obj_as(ABI, abi_json)
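One way the new split can be used (a sketch only; the module path and the choice of ABI are assumptions, not taken from this diff): `get_raw_abi` returns the raw JSON string, convenient for instantiating a web3 contract for ad-hoc queries, while `get_abi` still returns the parsed `ABI` model used for classification.
```
import json
import os

from web3 import Web3

from mev_inspect.abi import get_raw_abi  # assumed module path
from mev_inspect.provider import get_base_provider
from mev_inspect.schemas.classified_traces import Protocol

w3 = Web3(get_base_provider(os.environ["RPC_URL"]))
raw_abi = get_raw_abi("aTokens", Protocol.aave)

if raw_abi is not None:
    # Contract factory for read-only calls against any aToken address
    atoken = w3.eth.contract(abi=json.loads(raw_abi))
```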


@ -0,0 +1,615 @@
[
{
"anonymous": false,
"inputs": [
{
"indexed": true,
"internalType": "address",
"name": "owner",
"type": "address"
},
{
"indexed": true,
"internalType": "address",
"name": "spender",
"type": "address"
},
{
"indexed": false,
"internalType": "uint256",
"name": "value",
"type": "uint256"
}
],
"name": "Approval",
"type": "event"
},
{
"anonymous": false,
"inputs": [
{
"indexed": true,
"internalType": "address",
"name": "from",
"type": "address"
},
{
"indexed": true,
"internalType": "address",
"name": "to",
"type": "address"
},
{
"indexed": false,
"internalType": "uint256",
"name": "value",
"type": "uint256"
},
{
"indexed": false,
"internalType": "uint256",
"name": "index",
"type": "uint256"
}
],
"name": "BalanceTransfer",
"type": "event"
},
{
"anonymous": false,
"inputs": [
{
"indexed": true,
"internalType": "address",
"name": "from",
"type": "address"
},
{
"indexed": true,
"internalType": "address",
"name": "target",
"type": "address"
},
{
"indexed": false,
"internalType": "uint256",
"name": "value",
"type": "uint256"
},
{
"indexed": false,
"internalType": "uint256",
"name": "index",
"type": "uint256"
}
],
"name": "Burn",
"type": "event"
},
{
"anonymous": false,
"inputs": [
{
"indexed": true,
"internalType": "address",
"name": "underlyingAsset",
"type": "address"
},
{
"indexed": true,
"internalType": "address",
"name": "pool",
"type": "address"
},
{
"indexed": false,
"internalType": "address",
"name": "treasury",
"type": "address"
},
{
"indexed": false,
"internalType": "address",
"name": "incentivesController",
"type": "address"
},
{
"indexed": false,
"internalType": "uint8",
"name": "aTokenDecimals",
"type": "uint8"
},
{
"indexed": false,
"internalType": "string",
"name": "aTokenName",
"type": "string"
},
{
"indexed": false,
"internalType": "string",
"name": "aTokenSymbol",
"type": "string"
},
{
"indexed": false,
"internalType": "bytes",
"name": "params",
"type": "bytes"
}
],
"name": "Initialized",
"type": "event"
},
{
"anonymous": false,
"inputs": [
{
"indexed": true,
"internalType": "address",
"name": "from",
"type": "address"
},
{
"indexed": false,
"internalType": "uint256",
"name": "value",
"type": "uint256"
},
{
"indexed": false,
"internalType": "uint256",
"name": "index",
"type": "uint256"
}
],
"name": "Mint",
"type": "event"
},
{
"anonymous": false,
"inputs": [
{
"indexed": true,
"internalType": "address",
"name": "from",
"type": "address"
},
{
"indexed": true,
"internalType": "address",
"name": "to",
"type": "address"
},
{
"indexed": false,
"internalType": "uint256",
"name": "value",
"type": "uint256"
}
],
"name": "Transfer",
"type": "event"
},
{
"inputs": [
],
"name": "UNDERLYING_ASSET_ADDRESS",
"outputs": [
{
"internalType": "address",
"name": "",
"type": "address"
}
],
"stateMutability": "view",
"type": "function"
},
{
"inputs": [
{
"internalType": "address",
"name": "owner",
"type": "address"
},
{
"internalType": "address",
"name": "spender",
"type": "address"
}
],
"name": "allowance",
"outputs": [
{
"internalType": "uint256",
"name": "",
"type": "uint256"
}
],
"stateMutability": "view",
"type": "function"
},
{
"inputs": [
{
"internalType": "address",
"name": "spender",
"type": "address"
},
{
"internalType": "uint256",
"name": "amount",
"type": "uint256"
}
],
"name": "approve",
"outputs": [
{
"internalType": "bool",
"name": "",
"type": "bool"
}
],
"stateMutability": "nonpayable",
"type": "function"
},
{
"inputs": [
{
"internalType": "address",
"name": "account",
"type": "address"
}
],
"name": "balanceOf",
"outputs": [
{
"internalType": "uint256",
"name": "",
"type": "uint256"
}
],
"stateMutability": "view",
"type": "function"
},
{
"inputs": [
{
"internalType": "address",
"name": "user",
"type": "address"
},
{
"internalType": "address",
"name": "receiverOfUnderlying",
"type": "address"
},
{
"internalType": "uint256",
"name": "amount",
"type": "uint256"
},
{
"internalType": "uint256",
"name": "index",
"type": "uint256"
}
],
"name": "burn",
"outputs": [
],
"stateMutability": "nonpayable",
"type": "function"
},
{
"inputs": [
],
"name": "getIncentivesController",
"outputs": [
{
"internalType": "contract IAaveIncentivesController",
"name": "",
"type": "address"
}
],
"stateMutability": "view",
"type": "function"
},
{
"inputs": [
{
"internalType": "address",
"name": "user",
"type": "address"
}
],
"name": "getScaledUserBalanceAndSupply",
"outputs": [
{
"internalType": "uint256",
"name": "",
"type": "uint256"
},
{
"internalType": "uint256",
"name": "",
"type": "uint256"
}
],
"stateMutability": "view",
"type": "function"
},
{
"inputs": [
{
"internalType": "address",
"name": "user",
"type": "address"
},
{
"internalType": "uint256",
"name": "amount",
"type": "uint256"
}
],
"name": "handleRepayment",
"outputs": [
],
"stateMutability": "nonpayable",
"type": "function"
},
{
"inputs": [
{
"internalType": "contract ILendingPool",
"name": "pool",
"type": "address"
},
{
"internalType": "address",
"name": "treasury",
"type": "address"
},
{
"internalType": "address",
"name": "underlyingAsset",
"type": "address"
},
{
"internalType": "contract IAaveIncentivesController",
"name": "incentivesController",
"type": "address"
},
{
"internalType": "uint8",
"name": "aTokenDecimals",
"type": "uint8"
},
{
"internalType": "string",
"name": "aTokenName",
"type": "string"
},
{
"internalType": "string",
"name": "aTokenSymbol",
"type": "string"
},
{
"internalType": "bytes",
"name": "params",
"type": "bytes"
}
],
"name": "initialize",
"outputs": [
],
"stateMutability": "nonpayable",
"type": "function"
},
{
"inputs": [
{
"internalType": "address",
"name": "user",
"type": "address"
},
{
"internalType": "uint256",
"name": "amount",
"type": "uint256"
},
{
"internalType": "uint256",
"name": "index",
"type": "uint256"
}
],
"name": "mint",
"outputs": [
{
"internalType": "bool",
"name": "",
"type": "bool"
}
],
"stateMutability": "nonpayable",
"type": "function"
},
{
"inputs": [
{
"internalType": "uint256",
"name": "amount",
"type": "uint256"
},
{
"internalType": "uint256",
"name": "index",
"type": "uint256"
}
],
"name": "mintToTreasury",
"outputs": [
],
"stateMutability": "nonpayable",
"type": "function"
},
{
"inputs": [
{
"internalType": "address",
"name": "user",
"type": "address"
}
],
"name": "scaledBalanceOf",
"outputs": [
{
"internalType": "uint256",
"name": "",
"type": "uint256"
}
],
"stateMutability": "view",
"type": "function"
},
{
"inputs": [
],
"name": "scaledTotalSupply",
"outputs": [
{
"internalType": "uint256",
"name": "",
"type": "uint256"
}
],
"stateMutability": "view",
"type": "function"
},
{
"inputs": [
],
"name": "totalSupply",
"outputs": [
{
"internalType": "uint256",
"name": "",
"type": "uint256"
}
],
"stateMutability": "view",
"type": "function"
},
{
"inputs": [
{
"internalType": "address",
"name": "recipient",
"type": "address"
},
{
"internalType": "uint256",
"name": "amount",
"type": "uint256"
}
],
"name": "transfer",
"outputs": [
{
"internalType": "bool",
"name": "",
"type": "bool"
}
],
"stateMutability": "nonpayable",
"type": "function"
},
{
"inputs": [
{
"internalType": "address",
"name": "sender",
"type": "address"
},
{
"internalType": "address",
"name": "recipient",
"type": "address"
},
{
"internalType": "uint256",
"name": "amount",
"type": "uint256"
}
],
"name": "transferFrom",
"outputs": [
{
"internalType": "bool",
"name": "",
"type": "bool"
}
],
"stateMutability": "nonpayable",
"type": "function"
},
{
"inputs": [
{
"internalType": "address",
"name": "from",
"type": "address"
},
{
"internalType": "address",
"name": "to",
"type": "address"
},
{
"internalType": "uint256",
"name": "value",
"type": "uint256"
}
],
"name": "transferOnLiquidation",
"outputs": [
],
"stateMutability": "nonpayable",
"type": "function"
},
{
"inputs": [
{
"internalType": "address",
"name": "user",
"type": "address"
},
{
"internalType": "uint256",
"name": "amount",
"type": "uint256"
}
],
"name": "transferUnderlyingTo",
"outputs": [
{
"internalType": "uint256",
"name": "",
"type": "uint256"
}
],
"stateMutability": "nonpayable",
"type": "function"
}
]

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long


@ -1,6 +1,7 @@
from pathlib import Path
from typing import List
from typing import List, Optional
from sqlalchemy import orm
from web3 import Web3
from mev_inspect.fees import fetch_base_fee_per_gas
@ -16,28 +17,27 @@ def get_latest_block_number(w3: Web3) -> int:
def create_from_block_number(
base_provider, w3: Web3, block_number: int, should_cache: bool
base_provider,
w3: Web3,
block_number: int,
trace_db_session: Optional[orm.Session],
) -> Block:
if not should_cache:
return fetch_block(w3, base_provider, block_number)
block: Optional[Block] = None
cache_path = _get_cache_path(block_number)
if trace_db_session is not None:
block = _find_block(trace_db_session, block_number)
if cache_path.is_file():
print(f"Cache for block {block_number} exists, " "loading data from cache")
return Block.parse_file(cache_path)
if block is None:
return _fetch_block(w3, base_provider, block_number)
else:
print(f"Cache for block {block_number} did not exist, getting data")
block = fetch_block(w3, base_provider, block_number)
cache_block(cache_path, block)
return block
def fetch_block(w3, base_provider, block_number: int) -> Block:
def _fetch_block(
w3,
base_provider,
block_number: int,
) -> Block:
block_json = w3.eth.get_block(block_number)
receipts_json = base_provider.make_request("eth_getBlockReceipts", [block_number])
traces_json = w3.parity.trace_block(block_number)
@ -57,6 +57,87 @@ def fetch_block(w3, base_provider, block_number: int) -> Block:
)
def _find_block(
trace_db_session: orm.Session,
block_number: int,
) -> Optional[Block]:
traces = _find_traces(trace_db_session, block_number)
receipts = _find_receipts(trace_db_session, block_number)
base_fee_per_gas = _find_base_fee(trace_db_session, block_number)
if traces is None or receipts is None or base_fee_per_gas is None:
return None
miner_address = _get_miner_address_from_traces(traces)
if miner_address is None:
return None
return Block(
block_number=block_number,
miner=miner_address,
base_fee_per_gas=base_fee_per_gas,
traces=traces,
receipts=receipts,
)
def _find_traces(
trace_db_session: orm.Session,
block_number: int,
) -> Optional[List[Trace]]:
result = trace_db_session.execute(
"SELECT raw_traces FROM block_traces WHERE block_number = :block_number",
params={"block_number": block_number},
).one_or_none()
if result is None:
return None
else:
(traces_json,) = result
return [Trace(**trace_json) for trace_json in traces_json]
def _find_receipts(
trace_db_session: orm.Session,
block_number: int,
) -> Optional[List[Receipt]]:
result = trace_db_session.execute(
"SELECT raw_receipts FROM block_receipts WHERE block_number = :block_number",
params={"block_number": block_number},
).one_or_none()
if result is None:
return None
else:
(receipts_json,) = result
return [Receipt(**receipt) for receipt in receipts_json]
def _find_base_fee(
trace_db_session: orm.Session,
block_number: int,
) -> Optional[int]:
result = trace_db_session.execute(
"SELECT base_fee_in_wei FROM base_fee WHERE block_number = :block_number",
params={"block_number": block_number},
).one_or_none()
if result is None:
return None
else:
(base_fee,) = result
return base_fee
def _get_miner_address_from_traces(traces: List[Trace]) -> Optional[str]:
for trace in traces:
if trace.type == TraceType.reward:
return trace.action["author"]
return None
def get_transaction_hashes(calls: List[Trace]) -> List[str]:
result = []
@ -82,4 +163,4 @@ def cache_block(cache_path: Path, block: Block):
def _get_cache_path(block_number: int) -> Path:
cache_directory_path = Path(cache_directory)
return cache_directory_path / f"{block_number}-new.json"
return cache_directory_path / f"{block_number}.json"


@ -1,11 +1,16 @@
from typing import Dict, Optional, Tuple, Type
from mev_inspect.schemas.classified_traces import DecodedCallTrace, Protocol
from mev_inspect.schemas.classifiers import ClassifierSpec, Classifier
from .aave import AAVE_CLASSIFIER_SPECS
from .curve import CURVE_CLASSIFIER_SPECS
from .erc20 import ERC20_CLASSIFIER_SPECS
from .uniswap import UNISWAP_CLASSIFIER_SPECS
from .weth import WETH_CLASSIFIER_SPECS
from .weth import WETH_CLASSIFIER_SPECS, WETH_ADDRESS
from .zero_ex import ZEROX_CLASSIFIER_SPECS
from .balancer import BALANCER_CLASSIFIER_SPECS
from .compound import COMPOUND_CLASSIFIER_SPECS
ALL_CLASSIFIER_SPECS = (
ERC20_CLASSIFIER_SPECS
@ -15,4 +20,21 @@ ALL_CLASSIFIER_SPECS = (
+ AAVE_CLASSIFIER_SPECS
+ ZEROX_CLASSIFIER_SPECS
+ BALANCER_CLASSIFIER_SPECS
+ COMPOUND_CLASSIFIER_SPECS
)
_SPECS_BY_ABI_NAME_AND_PROTOCOL: Dict[
Tuple[str, Optional[Protocol]], ClassifierSpec
] = {(spec.abi_name, spec.protocol): spec for spec in ALL_CLASSIFIER_SPECS}
def get_classifier(
trace: DecodedCallTrace,
) -> Optional[Type[Classifier]]:
abi_name_and_protocol = (trace.abi_name, trace.protocol)
spec = _SPECS_BY_ABI_NAME_AND_PROTOCOL.get(abi_name_and_protocol)
if spec is not None:
return spec.classifiers.get(trace.function_signature)
return None
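A minimal dispatch sketch (hypothetical usage, not code from this diff), assuming `trace` is a `DecodedCallTrace` produced by `TraceClassifier`:
```
from mev_inspect.schemas.classifiers import SwapClassifier

classifier = get_classifier(trace)
if classifier is not None and issubclass(classifier, SwapClassifier):
    # Swap classifiers expose the recipient of the decoded swap call
    recipient = classifier.get_swap_recipient(trace)
```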


@ -1,15 +1,45 @@
from mev_inspect.schemas.classified_traces import (
Classification,
ClassifierSpec,
Protocol,
)
from mev_inspect.schemas.classifiers import (
ClassifierSpec,
DecodedCallTrace,
TransferClassifier,
LiquidationClassifier,
)
from mev_inspect.schemas.transfers import Transfer
class AaveTransferClassifier(TransferClassifier):
@staticmethod
def get_transfer(trace: DecodedCallTrace) -> Transfer:
return Transfer(
block_number=trace.block_number,
transaction_hash=trace.transaction_hash,
trace_address=trace.trace_address,
amount=trace.inputs["value"],
to_address=trace.inputs["to"],
from_address=trace.inputs["from"],
token_address=trace.to_address,
)
AAVE_SPEC = ClassifierSpec(
abi_name="AaveLendingPool",
protocol=Protocol.aave,
classifications={
"liquidationCall(address,address,address,uint256,bool)": Classification.liquidate,
classifiers={
"liquidationCall(address,address,address,uint256,bool)": LiquidationClassifier,
},
)
AAVE_CLASSIFIER_SPECS = [AAVE_SPEC]
ATOKENS_SPEC = ClassifierSpec(
abi_name="aTokens",
protocol=Protocol.aave,
classifiers={
"transferOnLiquidation(address,address,uint256)": AaveTransferClassifier,
"transferFrom(address,address,uint256)": AaveTransferClassifier,
},
)
AAVE_CLASSIFIER_SPECS = [AAVE_SPEC, ATOKENS_SPEC]


@ -1,16 +1,29 @@
from mev_inspect.schemas.classified_traces import (
Classification,
ClassifierSpec,
DecodedCallTrace,
Protocol,
)
from mev_inspect.schemas.classifiers import (
ClassifierSpec,
SwapClassifier,
)
BALANCER_V1_POOL_ABI_NAME = "BPool"
class BalancerSwapClassifier(SwapClassifier):
@staticmethod
def get_swap_recipient(trace: DecodedCallTrace) -> str:
return trace.from_address
BALANCER_V1_SPECS = [
ClassifierSpec(
abi_name="BPool",
abi_name=BALANCER_V1_POOL_ABI_NAME,
protocol=Protocol.balancer_v1,
classifications={
"swapExactAmountIn(address,uint256,address,uint256,uint256)": Classification.swap,
"swapExactAmountOut(address,uint256,address,uint256,uint256)": Classification.swap,
classifiers={
"swapExactAmountIn(address,uint256,address,uint256,uint256)": BalancerSwapClassifier,
"swapExactAmountOut(address,uint256,address,uint256,uint256)": BalancerSwapClassifier,
},
),
]


@ -0,0 +1,165 @@
from mev_inspect.schemas.classified_traces import (
Protocol,
)
from mev_inspect.schemas.classifiers import (
ClassifierSpec,
LiquidationClassifier,
SeizeClassifier,
)
COMPOUND_V2_CETH_SPEC = ClassifierSpec(
abi_name="CEther",
protocol=Protocol.compound_v2,
valid_contract_addresses=["0x4ddc2d193948926d02f9b1fe9e1daa0718270ed5"],
classifiers={
"liquidateBorrow(address,address)": LiquidationClassifier,
"seize(address,address,uint256)": SeizeClassifier,
},
)
CREAM_CETH_SPEC = ClassifierSpec(
abi_name="CEther",
protocol=Protocol.cream,
valid_contract_addresses=["0xD06527D5e56A3495252A528C4987003b712860eE"],
classifiers={
"liquidateBorrow(address,address)": LiquidationClassifier,
"seize(address,address,uint256)": SeizeClassifier,
},
)
COMPOUND_V2_CTOKEN_SPEC = ClassifierSpec(
abi_name="CToken",
protocol=Protocol.compound_v2,
valid_contract_addresses=[
"0x6c8c6b02e7b2be14d4fa6022dfd6d75921d90e4e",
"0x5d3a536e4d6dbd6114cc1ead35777bab948e3643",
"0x158079ee67fce2f58472a96584a73c7ab9ac95c1",
"0x39aa39c021dfbae8fac545936693ac917d5e7563",
"0xf650c3d88d12db855b8bf7d11be6c55a4e07dcc9",
"0xc11b1268c1a384e55c48c2391d8d480264a3a7f4",
"0xb3319f5d18bc0d84dd1b4825dcde5d5f7266d407",
"0xf5dce57282a584d2746faf1593d3121fcac444dc",
"0x35a18000230da775cac24873d00ff85bccded550",
"0x70e36f6bf80a52b3b46b3af8e106cc0ed743e8e4",
"0xccf4429db6322d5c611ee964527d42e5d685dd6a",
"0x12392f67bdf24fae0af363c24ac620a2f67dad86",
"0xface851a4921ce59e912d19329929ce6da6eb0c7",
"0x95b4ef2869ebd94beb4eee400a99824bf5dc325b",
"0x4b0181102a0112a2ef11abee5563bb4a3176c9d7",
"0xe65cdb6479bac1e22340e4e755fae7e509ecd06c",
"0x80a2ae356fc9ef4305676f7a3e2ed04e12c33946",
],
classifiers={
"liquidateBorrow(address,uint256,address)": LiquidationClassifier,
"seize(address,address,uint256)": SeizeClassifier,
},
)
CREAM_CTOKEN_SPEC = ClassifierSpec(
abi_name="CToken",
protocol=Protocol.cream,
valid_contract_addresses=[
"0xd06527d5e56a3495252a528c4987003b712860ee",
"0x51f48b638f82e8765f7a26373a2cb4ccb10c07af",
"0x44fbebd2f576670a6c33f6fc0b00aa8c5753b322",
"0xcbae0a83f4f9926997c8339545fb8ee32edc6b76",
"0xce4fe9b4b8ff61949dcfeb7e03bc9faca59d2eb3",
"0x19d1666f543d42ef17f66e376944a22aea1a8e46",
"0x9baf8a5236d44ac410c0186fe39178d5aad0bb87",
"0x797aab1ce7c01eb727ab980762ba88e7133d2157",
"0x892b14321a4fcba80669ae30bd0cd99a7ecf6ac0",
"0x697256caa3ccafd62bb6d3aa1c7c5671786a5fd9",
"0x8b86e0598616a8d4f1fdae8b59e55fb5bc33d0d6",
"0xc7fd8dcee4697ceef5a2fd4608a7bd6a94c77480",
"0x17107f40d70f4470d20cb3f138a052cae8ebd4be",
"0x1ff8cdb51219a8838b52e9cac09b71e591bc998e",
"0x3623387773010d9214b10c551d6e7fc375d31f58",
"0x4ee15f44c6f0d8d1136c83efd2e8e4ac768954c6",
"0x338286c0bc081891a4bda39c7667ae150bf5d206",
"0x10fdbd1e48ee2fd9336a482d746138ae19e649db",
"0x01da76dea59703578040012357b81ffe62015c2d",
"0xef58b2d5a1b8d3cde67b8ab054dc5c831e9bc025",
"0xe89a6d0509faf730bd707bf868d9a2a744a363c7",
"0xeff039c3c1d668f408d09dd7b63008622a77532c",
"0x22b243b96495c547598d9042b6f94b01c22b2e9e",
"0x8b3ff1ed4f36c2c2be675afb13cc3aa5d73685a5",
"0x2a537fa9ffaea8c1a41d3c2b68a9cb791529366d",
"0x7ea9c63e216d5565c3940a2b3d150e59c2907db3",
"0x3225e3c669b39c7c8b3e204a8614bb218c5e31bc",
"0xf55bbe0255f7f4e70f63837ff72a577fbddbe924",
"0x903560b1cce601794c584f58898da8a8b789fc5d",
"0x054b7ed3f45714d3091e82aad64a1588dc4096ed",
"0xd5103afcd0b3fa865997ef2984c66742c51b2a8b",
"0xfd609a03b393f1a1cfcacedabf068cad09a924e2",
"0xd692ac3245bb82319a31068d6b8412796ee85d2c",
"0x92b767185fb3b04f881e3ac8e5b0662a027a1d9f",
"0x10a3da2bb0fae4d591476fd97d6636fd172923a8",
"0x3c6c553a95910f9fc81c98784736bd628636d296",
"0x21011bc93d9e515b9511a817a1ed1d6d468f49fc",
"0x85759961b116f1d36fd697855c57a6ae40793d9b",
"0x7c3297cfb4c4bbd5f44b450c0872e0ada5203112",
"0x7aaa323d7e398be4128c7042d197a2545f0f1fea",
"0x011a014d5e8eb4771e575bb1000318d509230afa",
"0xe6c3120f38f56deb38b69b65cc7dcaf916373963",
"0x4fe11bc316b6d7a345493127fbe298b95adaad85",
"0xcd22c4110c12ac41acefa0091c432ef44efaafa0",
"0x228619cca194fbe3ebeb2f835ec1ea5080dafbb2",
"0x73f6cba38922960b7092175c0add22ab8d0e81fc",
"0x38f27c03d6609a86ff7716ad03038881320be4ad",
"0x5ecad8a75216cea7dff978525b2d523a251eea92",
"0x5c291bc83d15f71fb37805878161718ea4b6aee9",
"0x6ba0c66c48641e220cf78177c144323b3838d375",
"0xd532944df6dfd5dd629e8772f03d4fc861873abf",
"0x197070723ce0d3810a0e47f06e935c30a480d4fc",
"0xc25eae724f189ba9030b2556a1533e7c8a732e14",
"0x25555933a8246ab67cbf907ce3d1949884e82b55",
"0xc68251421edda00a10815e273fa4b1191fac651b",
"0x65883978ada0e707c3b2be2a6825b1c4bdf76a90",
"0x8b950f43fcac4931d408f1fcda55c6cb6cbf3096",
"0x59089279987dd76fc65bf94cb40e186b96e03cb3",
"0x2db6c82ce72c8d7d770ba1b5f5ed0b6e075066d6",
"0xb092b4601850e23903a42eacbc9d8a0eec26a4d5",
"0x081fe64df6dc6fc70043aedf3713a3ce6f190a21",
"0x1d0986fb43985c88ffa9ad959cc24e6a087c7e35",
"0xc36080892c64821fa8e396bc1bd8678fa3b82b17",
"0x8379baa817c5c5ab929b03ee8e3c48e45018ae41",
"0x299e254a8a165bbeb76d9d69305013329eea3a3b",
"0xf8445c529d363ce114148662387eba5e62016e20",
"0x28526bb33d7230e65e735db64296413731c5402e",
"0x45406ba53bb84cd32a58e7098a2d4d1b11b107f6",
"0x6d1b9e01af17dd08d6dec08e210dfd5984ff1c20",
"0x1f9b4756b008106c806c7e64322d7ed3b72cb284",
"0xab10586c918612ba440482db77549d26b7abf8f7",
"0xdfff11dfe6436e42a17b86e7f419ac8292990393",
"0xdbb5e3081def4b6cdd8864ac2aeda4cbf778fecf",
"0x71cefcd324b732d4e058afacba040d908c441847",
"0x1a122348b73b58ea39f822a89e6ec67950c2bbd0",
"0x523effc8bfefc2948211a05a905f761cba5e8e9e",
"0x4202d97e00b9189936edf37f8d01cff88bdd81d4",
"0x4baa77013ccd6705ab0522853cb0e9d453579dd4",
"0x98e329eb5aae2125af273102f3440de19094b77c",
"0x8c3b7a4320ba70f8239f83770c4015b5bc4e6f91",
"0xe585c76573d7593abf21537b607091f76c996e73",
"0x81e346729723c4d15d0fb1c5679b9f2926ff13c6",
"0x766175eac1a99c969ddd1ebdbe7e270d508d8fff",
"0xd7394428536f63d5659cc869ef69d10f9e66314b",
"0x1241b10e7ea55b22f5b2d007e8fecdf73dcff999",
"0x2a867fd776b83e1bd4e13c6611afd2f6af07ea6d",
"0x250fb308199fe8c5220509c1bf83d21d60b7f74a",
"0x4112a717edd051f77d834a6703a1ef5e3d73387f",
"0xf04ce2e71d32d789a259428ddcd02d3c9f97fb4e",
"0x89e42987c39f72e2ead95a8a5bc92114323d5828",
"0x58da9c9fc3eb30abbcbbab5ddabb1e6e2ef3d2ef",
],
classifiers={
"liquidateBorrow(address,uint256,address)": LiquidationClassifier,
"seize(address,address,uint256)": SeizeClassifier,
},
)
COMPOUND_CLASSIFIER_SPECS = [
COMPOUND_V2_CETH_SPEC,
COMPOUND_V2_CTOKEN_SPEC,
CREAM_CETH_SPEC,
CREAM_CTOKEN_SPEC,
]


@ -1,29 +1,20 @@
from mev_inspect.schemas.classified_traces import (
ClassifierSpec,
Protocol,
)
"""
Deployment addresses found here
https://curve.readthedocs.io/ref-addresses.html
from mev_inspect.schemas.classifiers import (
ClassifierSpec,
DecodedCallTrace,
SwapClassifier,
)
class CurveSwapClassifier(SwapClassifier):
@staticmethod
def get_swap_recipient(trace: DecodedCallTrace) -> str:
return trace.from_address
organized into the following groups
1. Base Pools: 2 or more tokens implementing stable swap
- StableSwap<pool>
- Deposit<pool>
- CurveContract<version>
- CurveTokenV1/V2
2. Meta Pools: 1 token trading with an LP from above
- StableSwap<pool>
- Deposit<pool>
- CurveTokenV1/V2
3. Liquidity Gauges: stake LP tokens to earn the Curve governance token
- LiquidityGauge
- LiquidityGaugeV1/V2
- LiquidityGaugeReward
4. DAO contracts
5. Everything else; not yet decided whether it matters for classification
"""
CURVE_BASE_POOLS = [
ClassifierSpec(
abi_name="CurveTokenV1",
@ -72,101 +63,171 @@ CURVE_BASE_POOLS = [
abi_name="StableSwap3Pool",
protocol=Protocol.curve,
valid_contract_addresses=["0xbEbc44782C7dB0a1A60Cb6fe97d0b483032FF1C7"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwapAAVE",
protocol=Protocol.curve,
valid_contract_addresses=["0xDeBF20617708857ebe4F679508E7b7863a8A8EeE"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
"exchange_underlying(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwapAETH",
protocol=Protocol.curve,
valid_contract_addresses=["0xA96A65c051bF88B4095Ee1f2451C2A9d43F53Ae2"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwapBUSD",
protocol=Protocol.curve,
valid_contract_addresses=["0x79a8C46DeA5aDa233ABaFFD40F3A0A2B1e5A4F27"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
"exchange_underlying(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwapCompound",
protocol=Protocol.curve,
valid_contract_addresses=["0xA2B47E3D5c44877cca798226B7B8118F9BFb7A56"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
"exchange_underlying(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwapEURS",
protocol=Protocol.curve,
valid_contract_addresses=["0x0Ce6a5fF5217e38315f87032CF90686C96627CAA"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwaphBTC",
protocol=Protocol.curve,
valid_contract_addresses=["0x4CA9b3063Ec5866A4B82E437059D2C43d1be596F"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwapIronBank",
protocol=Protocol.curve,
valid_contract_addresses=["0x2dded6Da1BF5DBdF597C45fcFaa3194e53EcfeAF"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
"exchange_underlying(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwapLink",
protocol=Protocol.curve,
valid_contract_addresses=["0xf178c0b5bb7e7abf4e12a4838c7b7c5ba2c623c0"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwapPAX",
protocol=Protocol.curve,
valid_contract_addresses=["0x06364f10B501e868329afBc005b3492902d6C763"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
"exchange_underlying(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwaprenBTC",
protocol=Protocol.curve,
valid_contract_addresses=["0x93054188d876f558f4a66B2EF1d97d16eDf0895B"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwaprETH",
protocol=Protocol.curve,
valid_contract_addresses=["0xF9440930043eb3997fc70e1339dBb11F341de7A8"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwapsAAVE",
protocol=Protocol.curve,
valid_contract_addresses=["0xEB16Ae0052ed37f479f7fe63849198Df1765a733"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
"exchange_underlying(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwapsBTC",
protocol=Protocol.curve,
valid_contract_addresses=["0x7fC77b5c7614E1533320Ea6DDc2Eb61fa00A9714"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwapsETH",
protocol=Protocol.curve,
valid_contract_addresses=["0xc5424B857f758E906013F3555Dad202e4bdB4567"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwapstETH",
protocol=Protocol.curve,
valid_contract_addresses=["0xDC24316b9AE028F1497c275EB9192a3Ea0f67022"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwapsUSD",
protocol=Protocol.curve,
valid_contract_addresses=["0xA5407eAE9Ba41422680e2e00537571bcC53efBfD"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
"exchange_underlying(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwapUSDT",
protocol=Protocol.curve,
valid_contract_addresses=["0x52EA46506B9CC5Ef470C5bf89f17Dc28bB35D85C"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
"exchange_underlying(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwapY",
protocol=Protocol.curve,
valid_contract_addresses=["0x45F783CCE6B7FF23B2ab2D70e416cdb7D6055f51"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
"exchange_underlying(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwapYv2",
protocol=Protocol.curve,
valid_contract_addresses=["0x8925D9d9B4569D737a48499DeF3f67BaA5a144b9"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
"exchange_underlying(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="DepositBUSD",
@ -300,51 +361,91 @@ CURVE_META_POOLS = [
abi_name="StableSwapbBTC",
protocol=Protocol.curve,
valid_contract_addresses=["0x071c661B4DeefB59E2a3DdB20Db036821eeE8F4b"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
"exchange_underlying(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwapDUSD",
protocol=Protocol.curve,
valid_contract_addresses=["0x8038C01A0390a8c547446a0b2c18fc9aEFEcc10c"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
"exchange_underlying(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwapGUSD",
protocol=Protocol.curve,
valid_contract_addresses=["0x4f062658EaAF2C1ccf8C8e36D6824CDf41167956"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
"exchange_underlying(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwapHUSD",
protocol=Protocol.curve,
valid_contract_addresses=["0x3eF6A01A0f81D6046290f3e2A8c5b843e738E604"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
"exchange_underlying(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwapLinkUSD",
protocol=Protocol.curve,
valid_contract_addresses=["0xE7a24EF0C5e95Ffb0f6684b813A78F2a3AD7D171"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
"exchange_underlying(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwapMUSD",
protocol=Protocol.curve,
valid_contract_addresses=["0x8474DdbE98F5aA3179B3B3F5942D724aFcdec9f6"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
"exchange_underlying(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwapoBTC",
protocol=Protocol.curve,
valid_contract_addresses=["0xd81dA8D904b52208541Bade1bD6595D8a251F8dd"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
"exchange_underlying(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwappBTC",
protocol=Protocol.curve,
valid_contract_addresses=["0x7F55DDe206dbAD629C080068923b36fe9D6bDBeF"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
"exchange_underlying(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwapRSV",
protocol=Protocol.curve,
valid_contract_addresses=["0xC18cC39da8b11dA8c3541C598eE022258F9744da"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
"exchange_underlying(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwaptBTC",
protocol=Protocol.curve,
valid_contract_addresses=["0xC25099792E9349C7DD09759744ea681C7de2cb66"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
"exchange_underlying(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwapUSD",
@ -353,82 +454,29 @@ CURVE_META_POOLS = [
"0x3E01dD8a5E1fb3481F0F589056b428Fc308AF0Fb",
"0x0f9cb53Ebe405d49A0bbdBD291A65Ff571bC83e1",
],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
"exchange_underlying(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwapUSDP",
protocol=Protocol.curve,
valid_contract_addresses=["0x42d7025938bEc20B69cBae5A77421082407f053A"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
"exchange_underlying(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwapUST",
protocol=Protocol.curve,
valid_contract_addresses=["0x890f4e345B1dAED0367A877a1612f86A1f86985f"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
"exchange_underlying(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
]
"""
CURVE_LIQUIDITY_GAUGES = [
ClassifierSpec(
abi_name="LiquidityGauge",
protocol=Protocol.curve,
valid_contract_addresses=[
"0xbFcF63294aD7105dEa65aA58F8AE5BE2D9d0952A", # 3Pool
"0x69Fb7c45726cfE2baDeE8317005d3F94bE838840", # BUSD
"0x7ca5b0a2910B33e9759DC7dDB0413949071D7575", # Compound
"0xC5cfaDA84E902aD92DD40194f0883ad49639b023", # GUSD
"0x4c18E409Dc8619bFb6a1cB56D114C3f592E0aE79", # hBTC
"0x2db0E83599a91b508Ac268a6197b8B14F5e72840", # HUSD
"0x64E3C23bfc40722d3B649844055F1D51c1ac041d", # PAX
"0xB1F2cdeC61db658F091671F5f199635aEF202CAC", # renBTC
"0xC2b1DF84112619D190193E48148000e3990Bf627", # USDK
"0xF98450B5602fa59CC66e1379DFfB6FDDc724CfC4", # USDN
"0xBC89cd85491d81C6AD2954E6d0362Ee29fCa8F53", # USDT
"0xFA712EE4788C042e2B7BB55E6cb8ec569C4530c1", # Y
],
),
ClassifierSpec(
abi_name="LiquidityGaugeV2",
protocol=Protocol.curve,
valid_contract_addresses=[
"0xd662908ADA2Ea1916B3318327A97eB18aD588b5d", # AAVE
"0x6d10ed2cF043E6fcf51A0e7b4C2Af3Fa06695707", # ankrETH
"0xdFc7AdFa664b08767b735dE28f9E84cd30492aeE", # bBTC
"0x90Bb609649E0451E5aD952683D64BD2d1f245840", # EURS
"0x72e158d38dbd50a483501c24f792bdaaa3e7d55c", # FRAX
"0x11137B10C210b579405c21A07489e28F3c040AB1", # oBTC
"0xF5194c3325202F456c95c1Cf0cA36f8475C1949F", # IronBank
"0xFD4D8a17df4C27c1dD245d153ccf4499e806C87D", # Link
"0xd7d147c6Bb90A718c3De8C0568F9B560C79fa416", # pBTC
"0x462253b8F74B72304c145DB0e4Eebd326B22ca39", # sAAVE
"0x3C0FFFF15EA30C35d7A85B85c0782D6c94e1d238", # sETH
"0x182B723a58739a9c974cFDB385ceaDb237453c28", # stETH
"0x055be5DDB7A925BfEF3417FC157f53CA77cA7222", # USDP
"0x3B7020743Bc2A4ca9EaF9D0722d42E20d6935855", # UST
"0x8101E6760130be2C8Ace79643AB73500571b7162", # Yv2
],
),
ClassifierSpec(
abi_name="LiquidityGaugeV3",
protocol=Protocol.curve,
valid_contract_addresses=[
"0x9582C4ADACB3BCE56Fea3e590F05c3ca2fb9C477", # alUSD
"0x824F13f1a2F29cFEEa81154b46C0fc820677A637", # rETH
"0x6955a55416a06839309018A8B0cB72c4DDC11f15", # TriCrypto
],
),
ClassifierSpec(
abi_name="LiquidityGaugeReward",
protocol=Protocol.curve,
valid_contract_addresses=[
"0xAEA6c312f4b3E04D752946d329693F7293bC2e6D", # DUSD
"0x5f626c30EC1215f4EdCc9982265E8b1F411D1352", # MUSD
"0x4dC4A289a8E33600D8bD4cf5F6313E43a37adec7", # RSV
"0x705350c4BcD35c9441419DdD5d2f097d7a55410F", # sBTC
"0xA90996896660DEcC6E997655E065b23788857849", # sUSDv2
"0x6828bcF74279eE32f2723eC536c22c51Eed383C6", # tBTC
],
),
]
"""
CURVE_CLASSIFIER_SPECS = [*CURVE_BASE_POOLS, *CURVE_META_POOLS]
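
The specs above only ever map `exchange`/`exchange_underlying` signatures to `CurveSwapClassifier`, whose recipient rule is simply "the caller gets the output". A standalone sketch of that lookup, using plain dicts in place of the real `ClassifierSpec`/`DecodedCallTrace` types (the trader address is made up; the pool address is 3Pool from the spec above):

```
# Standalone illustration of the (address, signature) -> classifier lookup;
# plain dicts stand in for the real spec and trace types.
THREE_POOL = "0xbEbc44782C7dB0a1A60Cb6fe97d0b483032FF1C7".lower()

CURVE_SWAP_SIGNATURES = {
    (THREE_POOL, "exchange(int128,int128,uint256,uint256)"),
}


def curve_swap_recipient(to_address: str, signature: str, from_address: str) -> str:
    # Curve pools send the output tokens straight back to the caller,
    # so the swap recipient is always the trace's from_address.
    if (to_address.lower(), signature) in CURVE_SWAP_SIGNATURES:
        return from_address
    raise ValueError("not a classified Curve swap")


trader = "0x" + "ab" * 20  # made-up trader address
assert curve_swap_recipient(THREE_POOL, "exchange(int128,int128,uint256,uint256)", trader) == trader
```
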


@ -1,15 +1,30 @@
from mev_inspect.schemas.classified_traces import (
Classification,
from mev_inspect.schemas.classified_traces import DecodedCallTrace
from mev_inspect.schemas.classifiers import (
ClassifierSpec,
TransferClassifier,
)
from mev_inspect.schemas.transfers import Transfer
class ERC20TransferClassifier(TransferClassifier):
@staticmethod
def get_transfer(trace: DecodedCallTrace) -> Transfer:
return Transfer(
block_number=trace.block_number,
transaction_hash=trace.transaction_hash,
trace_address=trace.trace_address,
amount=trace.inputs["amount"],
to_address=trace.inputs["recipient"],
from_address=trace.inputs.get("sender", trace.from_address),
token_address=trace.to_address,
)
ERC20_SPEC = ClassifierSpec(
abi_name="ERC20",
classifications={
"transferFrom(address,address,uint256)": Classification.transfer,
"transfer(address,uint256)": Classification.transfer,
"burn(address)": Classification.burn,
classifiers={
"transferFrom(address,address,uint256)": ERC20TransferClassifier,
"transfer(address,uint256)": ERC20TransferClassifier,
},
)


@ -1,8 +1,33 @@
from mev_inspect.schemas.classified_traces import (
Classification,
ClassifierSpec,
DecodedCallTrace,
Protocol,
)
from mev_inspect.schemas.classifiers import (
ClassifierSpec,
SwapClassifier,
)
UNISWAP_V2_PAIR_ABI_NAME = "UniswapV2Pair"
UNISWAP_V3_POOL_ABI_NAME = "UniswapV3Pool"
class UniswapV3SwapClassifier(SwapClassifier):
@staticmethod
def get_swap_recipient(trace: DecodedCallTrace) -> str:
if trace.inputs is not None and "recipient" in trace.inputs:
return trace.inputs["recipient"]
else:
return trace.from_address
class UniswapV2SwapClassifier(SwapClassifier):
@staticmethod
def get_swap_recipient(trace: DecodedCallTrace) -> str:
if trace.inputs is not None and "to" in trace.inputs:
return trace.inputs["to"]
else:
return trace.from_address
UNISWAP_V3_CONTRACT_SPECS = [
@ -65,9 +90,9 @@ UNISWAP_V3_CONTRACT_SPECS = [
UNISWAP_V3_GENERAL_SPECS = [
ClassifierSpec(
abi_name="UniswapV3Pool",
classifications={
"swap(address,bool,int256,uint160,bytes)": Classification.swap,
abi_name=UNISWAP_V3_POOL_ABI_NAME,
classifiers={
"swap(address,bool,int256,uint160,bytes)": UniswapV3SwapClassifier,
},
),
ClassifierSpec(
@ -96,9 +121,9 @@ UNISWAPPY_V2_CONTRACT_SPECS = [
]
UNISWAPPY_V2_PAIR_SPEC = ClassifierSpec(
abi_name="UniswapV2Pair",
classifications={
"swap(uint256,uint256,address,bytes)": Classification.swap,
abi_name=UNISWAP_V2_PAIR_ABI_NAME,
classifiers={
"swap(uint256,uint256,address,bytes)": UniswapV2SwapClassifier,
},
)


@ -1,16 +1,37 @@
from mev_inspect.schemas.classified_traces import (
Classification,
ClassifierSpec,
Protocol,
)
from mev_inspect.schemas.classifiers import (
ClassifierSpec,
DecodedCallTrace,
TransferClassifier,
)
from mev_inspect.schemas.transfers import Transfer
class WethTransferClassifier(TransferClassifier):
@staticmethod
def get_transfer(trace: DecodedCallTrace) -> Transfer:
return Transfer(
block_number=trace.block_number,
transaction_hash=trace.transaction_hash,
trace_address=trace.trace_address,
amount=trace.inputs["wad"],
to_address=trace.inputs["dst"],
from_address=trace.from_address,
token_address=trace.to_address,
)
WETH_ADDRESS = "0xC02aaA39b223FE8D0A0e5C4F27eAD9083C756Cc2"
WETH_SPEC = ClassifierSpec(
abi_name="WETH9",
protocol=Protocol.weth,
valid_contract_addresses=["0xC02aaA39b223FE8D0A0e5C4F27eAD9083C756Cc2"],
classifications={
"transferFrom(address,address,uint256)": Classification.transfer,
"transfer(address,uint256)": Classification.transfer,
valid_contract_addresses=[WETH_ADDRESS],
classifiers={
"transferFrom(address,address,uint256)": WethTransferClassifier,
"transfer(address,uint256)": WethTransferClassifier,
},
)


@ -1,7 +1,9 @@
from mev_inspect.schemas.classified_traces import (
ClassifierSpec,
Protocol,
)
from mev_inspect.schemas.classifiers import (
ClassifierSpec,
)
ZEROX_CONTRACT_SPECS = [


@ -67,8 +67,11 @@ class TraceClassifier:
if call_data is not None:
signature = call_data.function_signature
classification = spec.classifications.get(
signature, Classification.unknown
classifier = spec.classifiers.get(signature)
classification = (
Classification.unknown
if classifier is None
else classifier.get_classification()
)
return DecodedCallTrace(


@ -0,0 +1,125 @@
from typing import Dict, List, Optional
from web3 import Web3
from mev_inspect.traces import get_child_traces
from mev_inspect.schemas.classified_traces import (
ClassifiedTrace,
Classification,
Protocol,
)
from mev_inspect.schemas.liquidations import Liquidation
from mev_inspect.abi import get_raw_abi
from mev_inspect.transfers import ETH_TOKEN_ADDRESS
V2_COMPTROLLER_ADDRESS = "0x3d9819210A31b4961b30EF54bE2aeD79B9c9Cd3B"
V2_C_ETHER = "0x4Ddc2D193948926D02f9B1fE9e1daa0718270ED5"
CREAM_COMPTROLLER_ADDRESS = "0x3d5BC3c8d13dcB8bF317092d84783c2697AE9258"
CREAM_CR_ETHER = "0xD06527D5e56A3495252A528C4987003b712860eE"
# helper, queried only once at the start of inspect_block
def fetch_all_underlying_markets(w3: Web3, protocol: Protocol) -> Dict[str, str]:
if protocol == Protocol.compound_v2:
c_ether = V2_C_ETHER
address = V2_COMPTROLLER_ADDRESS
elif protocol == Protocol.cream:
c_ether = CREAM_CR_ETHER
address = CREAM_COMPTROLLER_ADDRESS
else:
raise ValueError(f"No Comptroller found for {protocol}")
token_mapping = {}
comptroller_abi = get_raw_abi("Comptroller", Protocol.compound_v2)
comptroller_instance = w3.eth.contract(address=address, abi=comptroller_abi)
markets = comptroller_instance.functions.getAllMarkets().call()
token_abi = get_raw_abi("CToken", Protocol.compound_v2)
for token in markets:
# make an exception for cETH (as it has no .underlying())
if token != c_ether:
token_instance = w3.eth.contract(address=token, abi=token_abi)
underlying_token = token_instance.functions.underlying().call()
token_mapping[
token.lower()
] = underlying_token.lower()  # lowercase keys and values for consistency
return token_mapping
def get_compound_liquidations(
traces: List[ClassifiedTrace],
collateral_by_c_token_address: Dict[str, str],
collateral_by_cr_token_address: Dict[str, str],
) -> List[Liquidation]:
"""Inspect list of classified traces and identify liquidation"""
liquidations: List[Liquidation] = []
for trace in traces:
if (
trace.classification == Classification.liquidate
and (
trace.protocol == Protocol.compound_v2
or trace.protocol == Protocol.cream
)
and trace.inputs is not None
and trace.to_address is not None
):
# First, we look for cEther liquidations (position paid back via tx.value)
child_traces = get_child_traces(
trace.transaction_hash, trace.trace_address, traces
)
seize_trace = _get_seize_call(child_traces)
underlying_markets = {}
if trace.protocol == Protocol.compound_v2:
underlying_markets = collateral_by_c_token_address
elif trace.protocol == Protocol.cream:
underlying_markets = collateral_by_cr_token_address
if (
seize_trace is not None
and seize_trace.inputs is not None
and len(underlying_markets) != 0
):
c_token_collateral = trace.inputs["cTokenCollateral"]
if trace.abi_name == "CEther":
liquidations.append(
Liquidation(
liquidated_user=trace.inputs["borrower"],
collateral_token_address=ETH_TOKEN_ADDRESS,  # ETH pseudo-address, since cEther liquidations are repaid in Ether
debt_token_address=c_token_collateral,
liquidator_user=seize_trace.inputs["liquidator"],
debt_purchase_amount=trace.value,
protocol=trace.protocol,
received_amount=seize_trace.inputs["seizeTokens"],
transaction_hash=trace.transaction_hash,
trace_address=trace.trace_address,
block_number=trace.block_number,
)
)
elif (
trace.abi_name == "CToken"
): # cToken liquidations where liquidator pays back via token transfer
c_token_address = trace.to_address
liquidations.append(
Liquidation(
liquidated_user=trace.inputs["borrower"],
collateral_token_address=underlying_markets[
c_token_address
],
debt_token_address=c_token_collateral,
liquidator_user=seize_trace.inputs["liquidator"],
debt_purchase_amount=trace.inputs["repayAmount"],
protocol=trace.protocol,
received_amount=seize_trace.inputs["seizeTokens"],
transaction_hash=trace.transaction_hash,
trace_address=trace.trace_address,
block_number=trace.block_number,
)
)
return liquidations
def _get_seize_call(traces: List[ClassifiedTrace]) -> Optional[ClassifiedTrace]:
"""Find the call to `seize` in the child traces (successful liquidation)"""
for trace in traces:
if trace.classification == Classification.seize:
return trace
return None
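
A hedged sketch of how these helpers might be wired together; `RPC_URL` and `classified_traces` are assumed to exist, while `fetch_all_underlying_markets`, `get_compound_liquidations`, and `Protocol` come from the code above:

```
# Hypothetical wiring, not part of the commit: query the Comptrollers once,
# then scan the block's classified traces for Compound/CREAM liquidations.
import os

from web3 import Web3

w3 = Web3(Web3.HTTPProvider(os.environ["RPC_URL"]))

compound_markets = fetch_all_underlying_markets(w3, Protocol.compound_v2)
cream_markets = fetch_all_underlying_markets(w3, Protocol.cream)

liquidations = get_compound_liquidations(
    classified_traces,
    collateral_by_c_token_address=compound_markets,
    collateral_by_cr_token_address=cream_markets,
)
```
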


@ -0,0 +1,31 @@
import json
from typing import List
from mev_inspect.models.liquidations import LiquidationModel
from mev_inspect.schemas.liquidations import Liquidation
def delete_liquidations_for_block(
db_session,
block_number: int,
) -> None:
(
db_session.query(LiquidationModel)
.filter(LiquidationModel.block_number == block_number)
.delete()
)
db_session.commit()
def write_liquidations(
db_session,
liquidations: List[Liquidation],
) -> None:
models = [
LiquidationModel(**json.loads(liquidation.json()))
for liquidation in liquidations
]
db_session.bulk_save_objects(models)
db_session.commit()


@ -2,7 +2,7 @@ import json
from typing import List
from mev_inspect.models.transfers import TransferModel
from mev_inspect.schemas.transfers import ERC20Transfer
from mev_inspect.schemas.transfers import Transfer
def delete_transfers_for_block(
@ -20,7 +20,7 @@ def delete_transfers_for_block(
def write_transfers(
db_session,
transfers: List[ERC20Transfer],
transfers: List[Transfer],
) -> None:
models = [TransferModel(**json.loads(transfer.json())) for transfer in transfers]


@ -1,10 +1,23 @@
import os
from typing import Optional
from sqlalchemy import create_engine
from sqlalchemy import create_engine, orm
from sqlalchemy.orm import sessionmaker
def get_sqlalchemy_database_uri():
def get_trace_database_uri() -> Optional[str]:
username = os.getenv("TRACE_DB_USER")
password = os.getenv("TRACE_DB_PASSWORD")
host = os.getenv("TRACE_DB_HOST")
db_name = "trace_db"
if all(field is not None for field in [username, password, host]):
return f"postgresql://{username}:{password}@{host}/{db_name}"
return None
def get_inspect_database_uri():
username = os.getenv("POSTGRES_USER")
password = os.getenv("POSTGRES_PASSWORD")
host = os.getenv("POSTGRES_HOST")
@ -12,10 +25,24 @@ def get_sqlalchemy_database_uri():
return f"postgresql://{username}:{password}@{host}/{db_name}"
def get_engine():
return create_engine(get_sqlalchemy_database_uri())
def _get_engine(uri: str):
return create_engine(uri)
def get_session():
Session = sessionmaker(bind=get_engine())
def _get_session(uri: str):
Session = sessionmaker(bind=_get_engine(uri))
return Session()
def get_inspect_session() -> orm.Session:
uri = get_inspect_database_uri()
return _get_session(uri)
def get_trace_session() -> Optional[orm.Session]:
uri = get_trace_database_uri()
if uri is not None:
return _get_session(uri)
return None
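
So the inspect database is required while the trace cache is optional: with no `TRACE_DB_*` variables set, `get_trace_database_uri()` (and therefore `get_trace_session()`) returns `None` and callers simply skip the cache. A small illustrative sketch, assuming the helpers above are importable; credentials are placeholders and the inspect database name comes from the unchanged context above:

```
# Illustrative only: which environment variables each helper reads.
import os

os.environ["POSTGRES_USER"] = "postgres"        # placeholder credentials
os.environ["POSTGRES_PASSWORD"] = "password"
os.environ["POSTGRES_HOST"] = "localhost"
# TRACE_DB_USER / TRACE_DB_PASSWORD / TRACE_DB_HOST deliberately left unset

print(get_inspect_database_uri())  # postgresql://postgres:password@localhost/<db_name>
print(get_trace_database_uri())    # None -> get_trace_session() also returns None
```
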


@ -1,13 +1,19 @@
from typing import Dict, Optional
from hexbytes import HexBytes
import eth_utils.abi
from eth_abi import decode_abi
from eth_abi.exceptions import InsufficientDataBytes, NonEmptyPaddingBytes
from hexbytes._utils import hexstr_to_bytes
from mev_inspect.schemas.abi import ABI, ABIFunctionDescription
from mev_inspect.schemas.call_data import CallData
# 0x + 8 characters
SELECTOR_LENGTH = 10
class ABIDecoder:
def __init__(self, abi: ABI):
self._functions_by_selector: Dict[str, ABIFunctionDescription] = {
@ -17,8 +23,7 @@ class ABIDecoder:
}
def decode(self, data: str) -> Optional[CallData]:
hex_data = HexBytes(data)
selector, params = hex_data[:4], hex_data[4:]
selector, params = data[:SELECTOR_LENGTH], data[SELECTOR_LENGTH:]
func = self._functions_by_selector.get(selector)
@ -26,10 +31,15 @@ class ABIDecoder:
return None
names = [input.name for input in func.inputs]
types = [input.type for input in func.inputs]
types = [
input.type
if input.type != "tuple"
else eth_utils.abi.collapse_if_tuple(input.dict())
for input in func.inputs
]
try:
decoded = decode_abi(types, params)
decoded = decode_abi(types, hexstr_to_bytes(params))
except (InsufficientDataBytes, NonEmptyPaddingBytes):
return None
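
The decoder now slices the selector off the hex string itself (`0x` plus 8 hex characters) and only converts the remaining parameters to bytes for `decode_abi`. A hedged standalone check of that flow, with made-up `transfer` calldata:

```
# Hedged sketch mirroring ABIDecoder.decode: string selector, bytes for eth_abi.
from eth_abi import decode_abi
from hexbytes._utils import hexstr_to_bytes
from web3 import Web3

SELECTOR_LENGTH = 10  # "0x" + 8 hex characters

recipient = "ab" * 20                                  # made-up 20-byte address
amount = 1
params_hex = recipient.rjust(64, "0") + format(amount, "064x")
calldata = "0xa9059cbb" + params_hex                   # transfer(address,uint256) calldata

selector, params = calldata[:SELECTOR_LENGTH], calldata[SELECTOR_LENGTH:]
assert selector == Web3.sha3(text="transfer(address,uint256)")[0:4].hex()

decoded_recipient, decoded_amount = decode_abi(
    ["address", "uint256"], hexstr_to_bytes(params)
)
assert decoded_recipient == "0x" + recipient
assert decoded_amount == amount
```
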


@ -1,5 +1,7 @@
import logging
from typing import Optional
from sqlalchemy import orm
from web3 import Web3
from mev_inspect.arbitrages import get_arbitrages
@ -17,29 +19,37 @@ from mev_inspect.crud.miner_payments import (
delete_miner_payments_for_block,
write_miner_payments,
)
from mev_inspect.crud.swaps import delete_swaps_for_block, write_swaps
from mev_inspect.crud.transfers import delete_transfers_for_block, write_transfers
from mev_inspect.crud.liquidations import (
delete_liquidations_for_block,
write_liquidations,
)
from mev_inspect.miner_payments import get_miner_payments
from mev_inspect.swaps import get_swaps
from mev_inspect.transfers import get_transfers
from mev_inspect.liquidations import get_liquidations
logger = logging.getLogger(__name__)
def inspect_block(
db_session,
inspect_db_session: orm.Session,
base_provider,
w3: Web3,
trace_clasifier: TraceClassifier,
block_number: int,
should_cache: bool,
trace_db_session: Optional[orm.Session],
should_write_classified_traces: bool = True,
should_write_swaps: bool = True,
should_write_transfers: bool = True,
should_write_arbitrages: bool = True,
should_write_miner_payments: bool = True,
):
block = create_from_block_number(base_provider, w3, block_number, should_cache)
block = create_from_block_number(
base_provider,
w3,
block_number,
trace_db_session,
)
logger.info(f"Total traces: {len(block.traces)}")
@ -48,37 +58,40 @@ def inspect_block(
)
logger.info(f"Total transactions: {total_transactions}")
trace_classifier = TraceClassifier()
classified_traces = trace_classifier.classify(block.traces)
logger.info(f"Returned {len(classified_traces)} classified traces")
if should_write_classified_traces:
delete_classified_traces_for_block(db_session, block_number)
write_classified_traces(db_session, classified_traces)
delete_classified_traces_for_block(inspect_db_session, block_number)
write_classified_traces(inspect_db_session, classified_traces)
transfers = get_transfers(classified_traces)
if should_write_transfers:
delete_transfers_for_block(db_session, block_number)
write_transfers(db_session, transfers)
logger.info(f"Found {len(transfers)} transfers")
delete_transfers_for_block(inspect_db_session, block_number)
write_transfers(inspect_db_session, transfers)
swaps = get_swaps(classified_traces)
logger.info(f"Found {len(swaps)} swaps")
if should_write_swaps:
delete_swaps_for_block(db_session, block_number)
write_swaps(db_session, swaps)
delete_swaps_for_block(inspect_db_session, block_number)
write_swaps(inspect_db_session, swaps)
arbitrages = get_arbitrages(swaps)
logger.info(f"Found {len(arbitrages)} arbitrages")
if should_write_arbitrages:
delete_arbitrages_for_block(db_session, block_number)
write_arbitrages(db_session, arbitrages)
delete_arbitrages_for_block(inspect_db_session, block_number)
write_arbitrages(inspect_db_session, arbitrages)
liquidations = get_liquidations(classified_traces)
logger.info(f"Found {len(liquidations)} liquidations")
delete_liquidations_for_block(inspect_db_session, block_number)
write_liquidations(inspect_db_session, liquidations)
miner_payments = get_miner_payments(
block.miner, block.base_fee_per_gas, classified_traces, block.receipts
)
if should_write_miner_payments:
delete_miner_payments_for_block(db_session, block_number)
write_miner_payments(db_session, miner_payments)
delete_miner_payments_for_block(inspect_db_session, block_number)
write_miner_payments(inspect_db_session, miner_payments)
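
A hedged sketch of driving the new entrypoint end to end. The module paths and the `RPC_URL` variable are assumptions, not confirmed by this diff, and keyword arguments are used because only part of the new signature is visible in this hunk:

```
# Hypothetical driver script; module paths are assumed.
import os

from web3 import Web3

from mev_inspect.db import get_inspect_session, get_trace_session
from mev_inspect.inspect_block import inspect_block

base_provider = Web3.HTTPProvider(os.environ["RPC_URL"])
w3 = Web3(base_provider)

inspect_block(
    inspect_db_session=get_inspect_session(),
    base_provider=base_provider,
    w3=w3,
    block_number=12_914_944,               # any block number
    trace_db_session=get_trace_session(),  # None unless the TRACE_DB_* variables are set
)
```
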


@ -0,0 +1,23 @@
from typing import List
from mev_inspect.aave_liquidations import get_aave_liquidations
from mev_inspect.schemas.classified_traces import (
ClassifiedTrace,
Classification,
)
from mev_inspect.schemas.liquidations import Liquidation
def has_liquidations(classified_traces: List[ClassifiedTrace]) -> bool:
liquidations_exist = False
for classified_trace in classified_traces:
if classified_trace.classification == Classification.liquidate:
liquidations_exist = True
return liquidations_exist
def get_liquidations(
classified_traces: List[ClassifiedTrace],
) -> List[Liquidation]:
aave_liquidations = get_aave_liquidations(classified_traces)
return aave_liquidations


@ -0,0 +1,19 @@
from sqlalchemy import Column, Numeric, String, ARRAY, Integer
from .base import Base
class LiquidationModel(Base):
__tablename__ = "liquidations"
liquidated_user = Column(String, nullable=False)
liquidator_user = Column(String, nullable=False)
collateral_token_address = Column(String, nullable=False)
debt_token_address = Column(String, nullable=False)
debt_purchase_amount = Column(Numeric, nullable=False)
received_amount = Column(Numeric, nullable=False)
received_token_address = Column(String, nullable=False)
protocol = Column(String, nullable=True)
transaction_hash = Column(String, primary_key=True)
trace_address = Column(ARRAY(Integer), primary_key=True)
block_number = Column(Numeric, nullable=False)


@ -1,8 +1,8 @@
from enum import Enum
from typing import List, Union
from typing import List, Optional, Union
from typing_extensions import Literal
from hexbytes import HexBytes
import eth_utils.abi
from pydantic import BaseModel
from web3 import Web3
@ -26,6 +26,10 @@ NON_FUNCTION_DESCRIPTION_TYPES = Union[
class ABIDescriptionInput(BaseModel):
name: str
type: str
components: Optional[List["ABIDescriptionInput"]]
ABIDescriptionInput.update_forward_refs()
class ABIGenericDescription(BaseModel):
@ -37,12 +41,17 @@ class ABIFunctionDescription(BaseModel):
name: str
inputs: List[ABIDescriptionInput]
def get_selector(self) -> HexBytes:
def get_selector(self) -> str:
signature = self.get_signature()
return Web3.sha3(text=signature)[0:4]
return Web3.sha3(text=signature)[0:4].hex()
def get_signature(self) -> str:
joined_input_types = ",".join(input.type for input in self.inputs)
joined_input_types = ",".join(
input.type
if input.type != "tuple"
else eth_utils.abi.collapse_if_tuple(input.dict())
for input in self.inputs
)
return f"{self.name}({joined_input_types})"
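
Collapsing tuple inputs means struct arguments hash in their canonical `(type1,type2,...)` form, so the selector matches what Solidity computes. A standalone check with an invented `settle` function:

```
# Standalone sketch of get_signature/get_selector with a (made-up) struct argument.
import eth_utils.abi
from web3 import Web3

inputs = [
    {
        "name": "order",            # hypothetical struct parameter
        "type": "tuple",
        "components": [
            {"name": "maker", "type": "address"},
            {"name": "amount", "type": "uint256"},
        ],
    },
    {"name": "strict", "type": "bool"},
]

joined_input_types = ",".join(
    item["type"] if item["type"] != "tuple" else eth_utils.abi.collapse_if_tuple(item)
    for item in inputs
)
signature = f"settle({joined_input_types})"
assert signature == "settle((address,uint256),bool)"

selector = Web3.sha3(text=signature)[0:4].hex()  # 0x-prefixed 4-byte selector
print(selector)
```
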


@ -1,17 +1,15 @@
from enum import Enum
from typing import Any, Dict, List, Optional
from pydantic import BaseModel
from .blocks import Trace
class Classification(Enum):
unknown = "unknown"
swap = "swap"
burn = "burn"
transfer = "transfer"
liquidate = "liquidate"
seize = "seize"
class Protocol(Enum):
@ -23,6 +21,8 @@ class Protocol(Enum):
curve = "curve"
zero_ex = "0x"
balancer_v1 = "balancer_v1"
compound_v2 = "compound_v2"
cream = "cream"
class ClassifiedTrace(Trace):
@ -62,12 +62,5 @@ class DecodedCallTrace(CallTrace):
protocol: Optional[Protocol]
gas: Optional[int]
gas_used: Optional[int]
function_name: Optional[str]
function_signature: Optional[str]
class ClassifierSpec(BaseModel):
abi_name: str
protocol: Optional[Protocol] = None
valid_contract_addresses: Optional[List[str]] = None
classifications: Dict[str, Classification] = {}
function_name: str
function_signature: str


@ -0,0 +1,55 @@
from abc import ABC, abstractmethod
from typing import Dict, List, Optional, Type
from pydantic import BaseModel
from .classified_traces import Classification, DecodedCallTrace, Protocol
from .transfers import Transfer
class Classifier(ABC):
@staticmethod
@abstractmethod
def get_classification() -> Classification:
raise NotImplementedError()
class TransferClassifier(Classifier):
@staticmethod
def get_classification() -> Classification:
return Classification.transfer
@staticmethod
@abstractmethod
def get_transfer(trace: DecodedCallTrace) -> Transfer:
raise NotImplementedError()
class SwapClassifier(Classifier):
@staticmethod
def get_classification() -> Classification:
return Classification.swap
@staticmethod
@abstractmethod
def get_swap_recipient(trace: DecodedCallTrace) -> str:
raise NotImplementedError()
class LiquidationClassifier(Classifier):
@staticmethod
def get_classification() -> Classification:
return Classification.liquidate
class SeizeClassifier(Classifier):
@staticmethod
def get_classification() -> Classification:
return Classification.seize
class ClassifierSpec(BaseModel):
abi_name: str
protocol: Optional[Protocol] = None
valid_contract_addresses: Optional[List[str]] = None
classifiers: Dict[str, Type[Classifier]] = {}
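
To see how the pieces fit, a hedged sketch of registering a new protocol on top of these base classes; the token name, address, and input field names are invented, and the imports assume the module layout used elsewhere in this commit:

```
# Hedged sketch of plugging a new transfer classifier into a ClassifierSpec.
from mev_inspect.schemas.classified_traces import DecodedCallTrace
from mev_inspect.schemas.classifiers import ClassifierSpec, TransferClassifier
from mev_inspect.schemas.transfers import Transfer


class MyTokenTransferClassifier(TransferClassifier):
    @staticmethod
    def get_transfer(trace: DecodedCallTrace) -> Transfer:
        return Transfer(
            block_number=trace.block_number,
            transaction_hash=trace.transaction_hash,
            trace_address=trace.trace_address,
            amount=trace.inputs["value"],       # field names depend on the ABI
            to_address=trace.inputs["to"],
            from_address=trace.from_address,
            token_address=trace.to_address,
        )


MYTOKEN_SPEC = ClassifierSpec(
    abi_name="MyToken",                         # an ABI file with this name would be required
    valid_contract_addresses=["0x0000000000000000000000000000000000000001"],
    classifiers={
        "transfer(address,uint256)": MyTokenTransferClassifier,
    },
)
```
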


@ -0,0 +1,17 @@
from typing import List, Optional
from pydantic import BaseModel
from mev_inspect.schemas.classified_traces import Protocol
class Liquidation(BaseModel):
liquidated_user: str
liquidator_user: str
collateral_token_address: str
debt_token_address: str
debt_purchase_amount: int
received_amount: int
received_token_address: Optional[str]
protocol: Protocol
transaction_hash: str
trace_address: List[int]
block_number: str


@ -1,8 +1,9 @@
from typing import List, TypeVar
from typing import List
from pydantic import BaseModel
from .classified_traces import Classification, ClassifiedTrace, Protocol
ETH_TOKEN_ADDRESS = "0xEeeeeEeeeEeEeeEeEeEeeEEEeeeeEeeeeeeeEEeE"
class Transfer(BaseModel):
@ -12,50 +13,4 @@ class Transfer(BaseModel):
from_address: str
to_address: str
amount: int
# To preserve the specific Transfer type
TransferGeneric = TypeVar("TransferGeneric", bound="Transfer")
class EthTransfer(Transfer):
@classmethod
def from_trace(cls, trace: ClassifiedTrace) -> "EthTransfer":
return cls(
block_number=trace.block_number,
transaction_hash=trace.transaction_hash,
trace_address=trace.trace_address,
amount=trace.value,
to_address=trace.to_address,
from_address=trace.from_address,
)
class ERC20Transfer(Transfer):
token_address: str
@classmethod
def from_trace(cls, trace: ClassifiedTrace) -> "ERC20Transfer":
if trace.classification != Classification.transfer or trace.inputs is None:
raise ValueError("Invalid transfer")
if trace.protocol == Protocol.weth:
return cls(
block_number=trace.block_number,
transaction_hash=trace.transaction_hash,
trace_address=trace.trace_address,
amount=trace.inputs["wad"],
to_address=trace.inputs["dst"],
from_address=trace.from_address,
token_address=trace.to_address,
)
else:
return cls(
block_number=trace.block_number,
transaction_hash=trace.transaction_hash,
trace_address=trace.trace_address,
amount=trace.inputs["amount"],
to_address=trace.inputs["recipient"],
from_address=trace.inputs.get("sender", trace.from_address),
token_address=trace.to_address,
)
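
The net effect: one `Transfer` model covers both cases, with `ETH_TOKEN_ADDRESS` as the `token_address` for plain Ether movements. A hedged example (hash and addresses are placeholders):

```
# Hedged illustration of the unified Transfer shape; values are placeholders.
from mev_inspect.schemas.transfers import ETH_TOKEN_ADDRESS, Transfer

eth_transfer = Transfer(
    block_number=13_000_000,
    transaction_hash="0x" + "11" * 32,
    trace_address=[0],
    from_address="0x" + "aa" * 20,
    to_address="0x" + "bb" * 20,
    amount=10**18,                     # 1 ETH in wei
    token_address=ETH_TOKEN_ADDRESS,   # sentinel instead of a separate EthTransfer class
)

erc20_transfer = eth_transfer.copy(
    update={"token_address": "0x" + "cc" * 20}  # a token contract address instead
)
```
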


@ -1,8 +1,8 @@
import json
from hexbytes import HexBytes
from pydantic import BaseModel
from web3.datastructures import AttributeDict
from pydantic import BaseModel
def to_camel(string: str) -> str:


@ -1,24 +1,24 @@
from typing import List, Optional
from mev_inspect.classifiers.specs import get_classifier
from mev_inspect.schemas.classified_traces import (
ClassifiedTrace,
Classification,
DecodedCallTrace,
)
from mev_inspect.schemas.classifiers import SwapClassifier
from mev_inspect.schemas.swaps import Swap
from mev_inspect.schemas.transfers import ERC20Transfer
from mev_inspect.schemas.transfers import Transfer
from mev_inspect.traces import get_traces_by_transaction_hash
from mev_inspect.transfers import (
build_eth_transfer,
get_child_transfers,
get_transfer,
filter_transfers,
remove_child_transfers_of_transfers,
)
UNISWAP_V2_PAIR_ABI_NAME = "UniswapV2Pair"
UNISWAP_V3_POOL_ABI_NAME = "UniswapV3Pool"
BALANCER_V1_POOL_ABI_NAME = "BPool"
def get_swaps(traces: List[ClassifiedTrace]) -> List[Swap]:
swaps = []
@ -32,11 +32,16 @@ def _get_swaps_for_transaction(traces: List[ClassifiedTrace]) -> List[Swap]:
ordered_traces = list(sorted(traces, key=lambda t: t.trace_address))
swaps: List[Swap] = []
prior_transfers: List[ERC20Transfer] = []
prior_transfers: List[Transfer] = []
for trace in ordered_traces:
if trace.classification == Classification.transfer:
prior_transfers.append(ERC20Transfer.from_trace(trace))
if not isinstance(trace, DecodedCallTrace):
continue
elif trace.classification == Classification.transfer:
transfer = get_transfer(trace)
if transfer is not None:
prior_transfers.append(transfer)
elif trace.classification == Classification.swap:
child_transfers = get_child_transfers(
@ -58,9 +63,9 @@ def _get_swaps_for_transaction(traces: List[ClassifiedTrace]) -> List[Swap]:
def _parse_swap(
trace: ClassifiedTrace,
prior_transfers: List[ERC20Transfer],
child_transfers: List[ERC20Transfer],
trace: DecodedCallTrace,
prior_transfers: List[Transfer],
child_transfers: List[Transfer],
) -> Optional[Swap]:
pool_address = trace.to_address
recipient_address = _get_recipient_address(trace)
@ -68,7 +73,13 @@ def _parse_swap(
if recipient_address is None:
return None
transfers_to_pool = filter_transfers(prior_transfers, to_address=pool_address)
transfers_to_pool = []
if trace.value is not None and trace.value > 0:
transfers_to_pool = [build_eth_transfer(trace)]
if len(transfers_to_pool) == 0:
transfers_to_pool = filter_transfers(prior_transfers, to_address=pool_address)
if len(transfers_to_pool) == 0:
transfers_to_pool = filter_transfers(child_transfers, to_address=pool_address)
@ -92,6 +103,7 @@ def _parse_swap(
block_number=trace.block_number,
trace_address=trace.trace_address,
pool_address=pool_address,
protocol=trace.protocol,
from_address=transfer_in.from_address,
to_address=transfer_out.to_address,
token_in_address=transfer_in.token_address,
@ -102,20 +114,9 @@ def _parse_swap(
)
def _get_recipient_address(trace: ClassifiedTrace) -> Optional[str]:
if trace.abi_name == UNISWAP_V3_POOL_ABI_NAME:
return (
trace.inputs["recipient"]
if trace.inputs is not None and "recipient" in trace.inputs
else trace.from_address
)
elif trace.abi_name == UNISWAP_V2_PAIR_ABI_NAME:
return (
trace.inputs["to"]
if trace.inputs is not None and "to" in trace.inputs
else trace.from_address
)
elif trace.abi_name == BALANCER_V1_POOL_ABI_NAME:
return trace.from_address
else:
return None
def _get_recipient_address(trace: DecodedCallTrace) -> Optional[str]:
classifier = get_classifier(trace)
if classifier is not None and issubclass(classifier, SwapClassifier):
return classifier.get_swap_recipient(trace)
return None
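
The transfer-in lookup in `_parse_swap` now follows a fixed precedence: ETH sent with the swap call itself, then prior transfers into the pool, then transfers made by the swap's child calls. A standalone sketch of just that ordering:

```
# Standalone sketch of the transfers_to_pool fallback order; strings stand in
# for Transfer objects since only the precedence matters here.
from typing import List


def pick_transfers_to_pool(
    eth_value: int,
    prior_to_pool: List[str],
    child_to_pool: List[str],
) -> List[str]:
    if eth_value > 0:
        return ["eth-sent-with-swap-call"]
    if prior_to_pool:
        return prior_to_pool
    return child_to_pool


assert pick_transfers_to_pool(10**18, ["prior"], ["child"]) == ["eth-sent-with-swap-call"]
assert pick_transfers_to_pool(0, ["prior"], ["child"]) == ["prior"]
assert pick_transfers_to_pool(0, [], ["child"]) == ["child"]
```
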


@ -34,6 +34,18 @@ def get_child_traces(
return child_traces
def is_child_of_any_address(
trace: ClassifiedTrace, parent_trace_addresses: List[List[int]]
) -> bool:
return any(
[
is_child_trace_address(trace.trace_address, parent)
for parent in parent_trace_addresses
]
)
def get_traces_by_transaction_hash(
traces: List[ClassifiedTrace],
) -> Dict[str, List[ClassifiedTrace]]:


@ -1,49 +1,95 @@
from typing import Dict, List, Optional, Sequence
from mev_inspect.schemas.classified_traces import Classification, ClassifiedTrace
from mev_inspect.schemas.transfers import ERC20Transfer, EthTransfer, TransferGeneric
from mev_inspect.classifiers.specs import get_classifier
from mev_inspect.schemas.classifiers import TransferClassifier
from mev_inspect.schemas.classified_traces import (
ClassifiedTrace,
DecodedCallTrace,
)
from mev_inspect.schemas.transfers import ETH_TOKEN_ADDRESS, Transfer
from mev_inspect.traces import is_child_trace_address, get_child_traces
def get_eth_transfers(traces: List[ClassifiedTrace]) -> List[EthTransfer]:
def get_transfers(traces: List[ClassifiedTrace]) -> List[Transfer]:
transfers = []
for trace in traces:
if trace.value is not None and trace.value > 0:
transfers.append(EthTransfer.from_trace(trace))
transfer = get_transfer(trace)
if transfer is not None:
transfers.append(transfer)
return transfers
def get_transfers(traces: List[ClassifiedTrace]) -> List[ERC20Transfer]:
transfers = []
def get_eth_transfers(traces: List[ClassifiedTrace]) -> List[Transfer]:
transfers = get_transfers(traces)
for trace in traces:
if trace.classification == Classification.transfer:
transfers.append(ERC20Transfer.from_trace(trace))
return [
transfer
for transfer in transfers
if transfer.token_address == ETH_TOKEN_ADDRESS
]
return transfers
def get_transfer(trace: ClassifiedTrace) -> Optional[Transfer]:
if _is_simple_eth_transfer(trace):
return build_eth_transfer(trace)
if isinstance(trace, DecodedCallTrace):
return _build_erc20_transfer(trace)
return None
def _is_simple_eth_transfer(trace: ClassifiedTrace) -> bool:
return (
trace.value is not None
and trace.value > 0
and "input" in trace.action
and trace.action["input"] == "0x"
)
def build_eth_transfer(trace: ClassifiedTrace) -> Transfer:
return Transfer(
block_number=trace.block_number,
transaction_hash=trace.transaction_hash,
trace_address=trace.trace_address,
amount=trace.value,
to_address=trace.to_address,
from_address=trace.from_address,
token_address=ETH_TOKEN_ADDRESS,
)
def _build_erc20_transfer(trace: DecodedCallTrace) -> Optional[Transfer]:
classifier = get_classifier(trace)
if classifier is not None and issubclass(classifier, TransferClassifier):
return classifier.get_transfer(trace)
return None
def get_child_transfers(
transaction_hash: str,
parent_trace_address: List[int],
traces: List[ClassifiedTrace],
) -> List[ERC20Transfer]:
) -> List[Transfer]:
child_transfers = []
for child_trace in get_child_traces(transaction_hash, parent_trace_address, traces):
if child_trace.classification == Classification.transfer:
child_transfers.append(ERC20Transfer.from_trace(child_trace))
transfer = get_transfer(child_trace)
if transfer is not None:
child_transfers.append(transfer)
return child_transfers
def filter_transfers(
transfers: Sequence[TransferGeneric],
transfers: Sequence[Transfer],
to_address: Optional[str] = None,
from_address: Optional[str] = None,
) -> List[TransferGeneric]:
) -> List[Transfer]:
filtered_transfers = []
for transfer in transfers:
@ -59,8 +105,8 @@ def filter_transfers(
def remove_child_transfers_of_transfers(
transfers: List[ERC20Transfer],
) -> List[ERC20Transfer]:
transfers: List[Transfer],
) -> List[Transfer]:
updated_transfers = []
transfer_addresses_by_transaction: Dict[str, List[List[int]]] = {}


@ -1,5 +1,5 @@
from hexbytes.main import HexBytes
from hexbytes._utils import hexstr_to_bytes
def hex_to_int(value: str) -> int:
return int.from_bytes(HexBytes(value), byteorder="big")
return int.from_bytes(hexstr_to_bytes(value), byteorder="big")
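
Behaviour is unchanged: the quantity string is read as big-endian bytes. For example (the function is restated here so the snippet runs on its own):

```
# hex_to_int interprets a 0x-prefixed quantity as big-endian bytes.
from hexbytes._utils import hexstr_to_bytes


def hex_to_int(value: str) -> int:
    return int.from_bytes(hexstr_to_bytes(value), byteorder="big")


assert hex_to_int("0x00") == 0
assert hex_to_int("0x1a") == 26
assert hex_to_int("0x0de0b6b3a7640000") == 10**18  # 1 ETH in wei
```
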

poetry.lock (generated)

@ -31,14 +31,6 @@ python-dateutil = "*"
python-editor = ">=0.3"
SQLAlchemy = ">=1.3.0"
[[package]]
name = "appdirs"
version = "1.4.4"
description = "A small Python module for determining appropriate platform-specific dirs, e.g. a \"user data dir\"."
category = "dev"
optional = false
python-versions = "*"
[[package]]
name = "astroid"
version = "2.7.2"
@ -113,26 +105,12 @@ optional = false
python-versions = "*"
[[package]]
name = "black"
version = "21.7b0"
description = "The uncompromising code formatter."
name = "bottle"
version = "0.12.19"
description = "Fast and simple WSGI-framework for small web-applications."
category = "dev"
optional = false
python-versions = ">=3.6.2"
[package.dependencies]
appdirs = "*"
click = ">=7.1.2"
mypy-extensions = ">=0.4.3"
pathspec = ">=0.8.1,<1"
regex = ">=2020.1.8"
tomli = ">=0.2.6,<2.0.0"
[package.extras]
colorama = ["colorama (>=0.4.3)"]
d = ["aiohttp (>=3.6.0)", "aiohttp-cors (>=0.4.0)"]
python2 = ["typed-ast (>=1.4.2)"]
uvloop = ["uvloop (>=0.15.2)"]
python-versions = "*"
[[package]]
name = "certifi"
@ -199,6 +177,17 @@ python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, <4"
[package.extras]
toml = ["toml"]
[[package]]
name = "cprofilev"
version = "1.0.7"
description = "An easier way to use cProfile"
category = "dev"
optional = false
python-versions = "*"
[package.dependencies]
bottle = "*"
[[package]]
name = "cytoolz"
version = "0.11.0"
@ -606,14 +595,6 @@ python-versions = "*"
[package.dependencies]
six = ">=1.9.0"
[[package]]
name = "pathspec"
version = "0.9.0"
description = "Utility library for gitignore style pattern matching of file paths."
category = "dev"
optional = false
python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,>=2.7"
[[package]]
name = "platformdirs"
version = "2.2.0"
@ -822,7 +803,7 @@ python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, !=3.5.*"
[[package]]
name = "regex"
version = "2021.8.27"
version = "2021.10.8"
description = "Alternative regular expression module, to replace re."
category = "dev"
optional = false
@ -919,14 +900,6 @@ category = "dev"
optional = false
python-versions = ">=2.6, !=3.0.*, !=3.1.*, !=3.2.*"
[[package]]
name = "tomli"
version = "1.2.1"
description = "A lil' TOML parser"
category = "dev"
optional = false
python-versions = ">=3.6"
[[package]]
name = "toolz"
version = "0.11.1"
@ -1044,7 +1017,7 @@ multidict = ">=4.0"
[metadata]
lock-version = "1.1"
python-versions = "^3.9"
content-hash = "206acce73eccf4be7eec1ed7b1a0703438601143a107c4285f67730934eed86a"
content-hash = "baade6f62f3adaff192b2c85b4f602f4990b9b99d6fcce904aeb5087b6fa1921"
[metadata.files]
aiohttp = [
@ -1090,10 +1063,6 @@ alembic = [
{file = "alembic-1.6.5-py2.py3-none-any.whl", hash = "sha256:e78be5b919f5bb184e3e0e2dd1ca986f2362e29a2bc933c446fe89f39dbe4e9c"},
{file = "alembic-1.6.5.tar.gz", hash = "sha256:a21fedebb3fb8f6bbbba51a11114f08c78709377051384c9c5ead5705ee93a51"},
]
appdirs = [
{file = "appdirs-1.4.4-py2.py3-none-any.whl", hash = "sha256:a841dacd6b99318a741b166adb07e19ee71a274450e68237b4650ca1055ab128"},
{file = "appdirs-1.4.4.tar.gz", hash = "sha256:7d5d0167b2b1ba821647616af46a749d1c653740dd0d2415100fe26e27afdf41"},
]
astroid = [
{file = "astroid-2.7.2-py3-none-any.whl", hash = "sha256:ecc50f9b3803ebf8ea19aa2c6df5622d8a5c31456a53c741d3be044d96ff0948"},
{file = "astroid-2.7.2.tar.gz", hash = "sha256:b6c2d75cd7c2982d09e7d41d70213e863b3ba34d3bd4014e08f167cee966e99e"},
@ -1121,9 +1090,9 @@ base58 = [
bitarray = [
{file = "bitarray-1.2.2.tar.gz", hash = "sha256:27a69ffcee3b868abab3ce8b17c69e02b63e722d4d64ffd91d659f81e9984954"},
]
black = [
{file = "black-21.7b0-py3-none-any.whl", hash = "sha256:1c7aa6ada8ee864db745b22790a32f94b2795c253a75d6d9b5e439ff10d23116"},
{file = "black-21.7b0.tar.gz", hash = "sha256:c8373c6491de9362e39271630b65b964607bc5c79c83783547d76c839b3aa219"},
bottle = [
{file = "bottle-0.12.19-py3-none-any.whl", hash = "sha256:f6b8a34fe9aa406f9813c02990db72ca69ce6a158b5b156d2c41f345016a723d"},
{file = "bottle-0.12.19.tar.gz", hash = "sha256:a9d73ffcbc6a1345ca2d7949638db46349f5b2b77dac65d6494d45c23628da2c"},
]
certifi = [
{file = "certifi-2021.5.30-py2.py3-none-any.whl", hash = "sha256:50b1e4f8446b06f41be7dd6338db18e0990601dce795c2b1686458aa7e8fa7d8"},
@ -1203,6 +1172,9 @@ coverage = [
{file = "coverage-5.5-pp37-none-any.whl", hash = "sha256:2a3859cb82dcbda1cfd3e6f71c27081d18aa251d20a17d87d26d4cd216fb0af4"},
{file = "coverage-5.5.tar.gz", hash = "sha256:ebe78fe9a0e874362175b02371bdfbee64d8edc42a044253ddf4ee7d3c15212c"},
]
cprofilev = [
{file = "CProfileV-1.0.7.tar.gz", hash = "sha256:8791748b1f3d3468c2c927c3fd5f905080b84d8f2d217ca764b7d9d7a1fb9a77"},
]
cytoolz = [
{file = "cytoolz-0.11.0-cp35-cp35m-macosx_10_6_x86_64.whl", hash = "sha256:c50051c02b23823209d6b0e8f7b2b37371312da50ca78165871dc6fed7bd37df"},
{file = "cytoolz-0.11.0-cp35-cp35m-manylinux1_x86_64.whl", hash = "sha256:140eaadcd86216d4a185db3a37396ee80dd2edc6e490ba37a3d7c1b17a124078"},
@ -1524,10 +1496,6 @@ packaging = [
parsimonious = [
{file = "parsimonious-0.8.1.tar.gz", hash = "sha256:3add338892d580e0cb3b1a39e4a1b427ff9f687858fdd61097053742391a9f6b"},
]
pathspec = [
{file = "pathspec-0.9.0-py2.py3-none-any.whl", hash = "sha256:7d15c4ddb0b5c802d161efc417ec1a2558ea2653c2e8ad9c19098201dc1c993a"},
{file = "pathspec-0.9.0.tar.gz", hash = "sha256:e564499435a2673d586f6b2130bb5b95f04a3ba06f81b8f895b651a3c76aabb1"},
]
platformdirs = [
{file = "platformdirs-2.2.0-py3-none-any.whl", hash = "sha256:4666d822218db6a262bdfdc9c39d21f23b4cfdb08af331a81e92751daf6c866c"},
{file = "platformdirs-2.2.0.tar.gz", hash = "sha256:632daad3ab546bd8e6af0537d09805cec458dce201bccfe23012df73332e181e"},
@ -1737,47 +1705,53 @@ pyyaml = [
{file = "PyYAML-5.4.1.tar.gz", hash = "sha256:607774cbba28732bfa802b54baa7484215f530991055bb562efbed5b2f20a45e"},
]
regex = [
{file = "regex-2021.8.27-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:507861cf3d97a86fbe26ea6cc04660ae028b9e4080b8290e28b99547b4e15d89"},
{file = "regex-2021.8.27-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:105122fa63da98d8456d5026bc6ac5a1399fd82fa6bad22c6ea641b1572c9142"},
{file = "regex-2021.8.27-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:83946ca9278b304728b637bc8d8200ab1663a79de85e47724594917aeed0e892"},
{file = "regex-2021.8.27-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:ee318974a1fdacba1701bc9e552e9015788d6345416364af6fa987424ff8df53"},
{file = "regex-2021.8.27-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:dde0ac721c7c5bfa5f9fc285e811274dec3c392f2c1225f7d07ca98a8187ca84"},
{file = "regex-2021.8.27-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:862b6164e9a38b5c495be2c2854e75fd8af12c5be4c61dc9b42d255980d7e907"},
{file = "regex-2021.8.27-cp310-cp310-win32.whl", hash = "sha256:7684016b73938ca12d160d2907d141f06b7597bd17d854e32bb7588be01afa1d"},
{file = "regex-2021.8.27-cp310-cp310-win_amd64.whl", hash = "sha256:a5f3bc727fea58f21d99c22e6d4fca652dc11dbc2a1e7cfc4838cd53b2e3691f"},
{file = "regex-2021.8.27-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:db888d4fb33a2fd54b57ac55d5015e51fa849f0d8592bd799b4e47f83bd04e00"},
{file = "regex-2021.8.27-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:92eb03f47427fea452ff6956d11f5d5a3f22a048c90a0f34fa223e6badab6c85"},
{file = "regex-2021.8.27-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7406dd2e44c7cfb4680c0a45a03264381802c67890cf506c147288f04c67177d"},
{file = "regex-2021.8.27-cp36-cp36m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:7db58ad61f3f6ea393aaf124d774ee0c58806320bc85c06dc9480f5c7219c250"},
{file = "regex-2021.8.27-cp36-cp36m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:cd45b4542134de63e7b9dd653e0a2d7d47ffed9615e3637c27ca5f6b78ea68bb"},
{file = "regex-2021.8.27-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:e02dad60e3e8442eefd28095e99b2ac98f2b8667167493ac6a2f3aadb5d84a17"},
{file = "regex-2021.8.27-cp36-cp36m-win32.whl", hash = "sha256:de0d06ccbc06af5bf93bddec10f4f80275c5d74ea6d28b456931f3955f58bc8c"},
{file = "regex-2021.8.27-cp36-cp36m-win_amd64.whl", hash = "sha256:2a0a5e323cf86760784ce2b91d8ab5ea09d0865d6ef4da0151e03d15d097b24e"},
{file = "regex-2021.8.27-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:6530b7b9505123cdea40a2301225183ca65f389bc6129f0c225b9b41680268d8"},
{file = "regex-2021.8.27-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4f3e36086d6631ceaf468503f96a3be0d247caef0660c9452fb1b0c055783851"},
{file = "regex-2021.8.27-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8ddb4f9ce6bb388ecc97b4b3eb37e786f05d7d5815e8822e0d87a3dbd7100649"},
{file = "regex-2021.8.27-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:2de1429e4eeab799c168a4f6e6eecdf30fcaa389bba4039cc8a065d6b7aad647"},
{file = "regex-2021.8.27-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4f03fc0a25122cdcbf39136510d4ea7627f732206892db522adf510bc03b8c67"},
{file = "regex-2021.8.27-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:503c1ba0920a46a1844363725215ef44d59fcac2bd2c03ae3c59aa9d08d29bd6"},
{file = "regex-2021.8.27-cp37-cp37m-win32.whl", hash = "sha256:24d68499a27b2d93831fde4a9b84ea5b19e0ab141425fbc9ab1e5b4dad179df7"},
{file = "regex-2021.8.27-cp37-cp37m-win_amd64.whl", hash = "sha256:6729914dd73483cd1c8aaace3ac082436fc98b0072743ac136eaea0b3811d42f"},
{file = "regex-2021.8.27-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:2d9cbe0c755ab8b6f583169c0783f7278fc6b195e423b09c5a8da6f858025e96"},
{file = "regex-2021.8.27-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d2404336fd16788ea757d4218a2580de60adb052d9888031e765320be8884309"},
{file = "regex-2021.8.27-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:208851a2f8dd31e468f0b5aa6c94433975bd67a107a4e7da3bdda947c9f85e25"},
{file = "regex-2021.8.27-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:3ee8ad16a35c45a5bab098e39020ecb6fec3b0e700a9d88983d35cbabcee79c8"},
{file = "regex-2021.8.27-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:56ae6e3cf0506ec0c40b466e31f41ee7a7149a2b505ae0ee50edd9043b423d27"},
{file = "regex-2021.8.27-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:2778c6cb379d804e429cc8e627392909e60db5152b42c695c37ae5757aae50ae"},
{file = "regex-2021.8.27-cp38-cp38-win32.whl", hash = "sha256:e960fe211496333b2f7e36badf4c22a919d740386681f79139ee346b403d1ca1"},
{file = "regex-2021.8.27-cp38-cp38-win_amd64.whl", hash = "sha256:116c277774f84266044e889501fe79cfd293a8b4336b7a5e89b9f20f1e5a9f21"},
{file = "regex-2021.8.27-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:32753eda8d413ce4f208cfe01dd61171a78068a6f5d5f38ccd751e00585cdf1d"},
{file = "regex-2021.8.27-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:84057cfae5676f456b03970eb78b7e182fddc80c2daafd83465a3d6ca9ff8dbf"},
{file = "regex-2021.8.27-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a6180dbf5945b27e9420e1b58c3cacfc79ad5278bdad3ea35109f5680fbe16d1"},
{file = "regex-2021.8.27-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:b158f673ae6a6523f13704f70aa7e4ce875f91e379bece4362c89db18db189d5"},
{file = "regex-2021.8.27-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:19acdb8831a4e3b03b23369db43178d8fee1f17b99c83af6cd907886f76bd9d4"},
{file = "regex-2021.8.27-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:12eaf0bbe568bd62e6cade7937e0bf01a2a4cef49a82f4fd204401e78409e158"},
{file = "regex-2021.8.27-cp39-cp39-win32.whl", hash = "sha256:1401cfa4320691cbd91191ec678735c727dee674d0997b0902a5a38ad482faf5"},
{file = "regex-2021.8.27-cp39-cp39-win_amd64.whl", hash = "sha256:0696eb934dee723e3292056a2c046ddb1e4dd3887685783a9f4af638e85dee76"},
{file = "regex-2021.8.27.tar.gz", hash = "sha256:e9700c52749cb3e90c98efd72b730c97b7e4962992fca5fbcaf1363be8e3b849"},
{file = "regex-2021.10.8-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:094a905e87a4171508c2a0e10217795f83c636ccc05ddf86e7272c26e14056ae"},
{file = "regex-2021.10.8-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:981c786293a3115bc14c103086ae54e5ee50ca57f4c02ce7cf1b60318d1e8072"},
{file = "regex-2021.10.8-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:b0f2f874c6a157c91708ac352470cb3bef8e8814f5325e3c5c7a0533064c6a24"},
{file = "regex-2021.10.8-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:51feefd58ac38eb91a21921b047da8644155e5678e9066af7bcb30ee0dca7361"},
{file = "regex-2021.10.8-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ea8de658d7db5987b11097445f2b1f134400e2232cb40e614e5f7b6f5428710e"},
{file = "regex-2021.10.8-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:1ce02f420a7ec3b2480fe6746d756530f69769292eca363218c2291d0b116a01"},
{file = "regex-2021.10.8-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:39079ebf54156be6e6902f5c70c078f453350616cfe7bfd2dd15bdb3eac20ccc"},
{file = "regex-2021.10.8-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:ff24897f6b2001c38a805d53b6ae72267025878d35ea225aa24675fbff2dba7f"},
{file = "regex-2021.10.8-cp310-cp310-win32.whl", hash = "sha256:c6569ba7b948c3d61d27f04e2b08ebee24fec9ff8e9ea154d8d1e975b175bfa7"},
{file = "regex-2021.10.8-cp310-cp310-win_amd64.whl", hash = "sha256:45cb0f7ff782ef51bc79e227a87e4e8f24bc68192f8de4f18aae60b1d60bc152"},
{file = "regex-2021.10.8-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:fab3ab8aedfb443abb36729410403f0fe7f60ad860c19a979d47fb3eb98ef820"},
{file = "regex-2021.10.8-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:74e55f8d66f1b41d44bc44c891bcf2c7fad252f8f323ee86fba99d71fd1ad5e3"},
{file = "regex-2021.10.8-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3d52c5e089edbdb6083391faffbe70329b804652a53c2fdca3533e99ab0580d9"},
{file = "regex-2021.10.8-cp36-cp36m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:1abbd95cbe9e2467cac65c77b6abd9223df717c7ae91a628502de67c73bf6838"},
{file = "regex-2021.10.8-cp36-cp36m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b9b5c215f3870aa9b011c00daeb7be7e1ae4ecd628e9beb6d7e6107e07d81287"},
{file = "regex-2021.10.8-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:f540f153c4f5617bc4ba6433534f8916d96366a08797cbbe4132c37b70403e92"},
{file = "regex-2021.10.8-cp36-cp36m-win32.whl", hash = "sha256:1f51926db492440e66c89cd2be042f2396cf91e5b05383acd7372b8cb7da373f"},
{file = "regex-2021.10.8-cp36-cp36m-win_amd64.whl", hash = "sha256:5f55c4804797ef7381518e683249310f7f9646da271b71cb6b3552416c7894ee"},
{file = "regex-2021.10.8-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:fb2baff66b7d2267e07ef71e17d01283b55b3cc51a81b54cc385e721ae172ba4"},
{file = "regex-2021.10.8-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9e527ab1c4c7cf2643d93406c04e1d289a9d12966529381ce8163c4d2abe4faf"},
{file = "regex-2021.10.8-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:36c98b013273e9da5790ff6002ab326e3f81072b4616fd95f06c8fa733d2745f"},
{file = "regex-2021.10.8-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:55ef044899706c10bc0aa052f2fc2e58551e2510694d6aae13f37c50f3f6ff61"},
{file = "regex-2021.10.8-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:aa0ab3530a279a3b7f50f852f1bab41bc304f098350b03e30a3876b7dd89840e"},
{file = "regex-2021.10.8-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:a37305eb3199d8f0d8125ec2fb143ba94ff6d6d92554c4b8d4a8435795a6eccd"},
{file = "regex-2021.10.8-cp37-cp37m-win32.whl", hash = "sha256:2efd47704bbb016136fe34dfb74c805b1ef5c7313aef3ce6dcb5ff844299f432"},
{file = "regex-2021.10.8-cp37-cp37m-win_amd64.whl", hash = "sha256:924079d5590979c0e961681507eb1773a142553564ccae18d36f1de7324e71ca"},
{file = "regex-2021.10.8-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:19b8f6d23b2dc93e8e1e7e288d3010e58fafed323474cf7f27ab9451635136d9"},
{file = "regex-2021.10.8-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:b09d3904bf312d11308d9a2867427479d277365b1617e48ad09696fa7dfcdf59"},
{file = "regex-2021.10.8-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:951be934dc25d8779d92b530e922de44dda3c82a509cdb5d619f3a0b1491fafa"},
{file = "regex-2021.10.8-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7f125fce0a0ae4fd5c3388d369d7a7d78f185f904c90dd235f7ecf8fe13fa741"},
{file = "regex-2021.10.8-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5f199419a81c1016e0560c39773c12f0bd924c37715bffc64b97140d2c314354"},
{file = "regex-2021.10.8-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:09e1031e2059abd91177c302da392a7b6859ceda038be9e015b522a182c89e4f"},
{file = "regex-2021.10.8-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:9c070d5895ac6aeb665bd3cd79f673775caf8d33a0b569e98ac434617ecea57d"},
{file = "regex-2021.10.8-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:176796cb7f82a7098b0c436d6daac82f57b9101bb17b8e8119c36eecf06a60a3"},
{file = "regex-2021.10.8-cp38-cp38-win32.whl", hash = "sha256:5e5796d2f36d3c48875514c5cd9e4325a1ca172fc6c78b469faa8ddd3d770593"},
{file = "regex-2021.10.8-cp38-cp38-win_amd64.whl", hash = "sha256:e4204708fa116dd03436a337e8e84261bc8051d058221ec63535c9403a1582a1"},
{file = "regex-2021.10.8-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:6dcf53d35850ce938b4f044a43b33015ebde292840cef3af2c8eb4c860730fff"},
{file = "regex-2021.10.8-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:b8b6ee6555b6fbae578f1468b3f685cdfe7940a65675611365a7ea1f8d724991"},
{file = "regex-2021.10.8-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:e2ec1c106d3f754444abf63b31e5c4f9b5d272272a491fa4320475aba9e8157c"},
{file = "regex-2021.10.8-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:973499dac63625a5ef9dfa4c791aa33a502ddb7615d992bdc89cf2cc2285daa3"},
{file = "regex-2021.10.8-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:88dc3c1acd3f0ecfde5f95c32fcb9beda709dbdf5012acdcf66acbc4794468eb"},
{file = "regex-2021.10.8-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:4786dae85c1f0624ac77cb3813ed99267c9adb72e59fdc7297e1cf4d6036d493"},
{file = "regex-2021.10.8-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:fe6ce4f3d3c48f9f402da1ceb571548133d3322003ce01b20d960a82251695d2"},
{file = "regex-2021.10.8-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:9e3e2cea8f1993f476a6833ef157f5d9e8c75a59a8d8b0395a9a6887a097243b"},
{file = "regex-2021.10.8-cp39-cp39-win32.whl", hash = "sha256:82cfb97a36b1a53de32b642482c6c46b6ce80803854445e19bc49993655ebf3b"},
{file = "regex-2021.10.8-cp39-cp39-win_amd64.whl", hash = "sha256:b04e512eb628ea82ed86eb31c0f7fc6842b46bf2601b66b1356a7008327f7700"},
{file = "regex-2021.10.8.tar.gz", hash = "sha256:26895d7c9bbda5c52b3635ce5991caa90fbb1ddfac9c9ff1c7ce505e2282fb2a"},
]
requests = [
{file = "requests-2.26.0-py2.py3-none-any.whl", hash = "sha256:6c1246513ecd5ecd4528a0906f910e8f0f9c6b8ec72030dc9fd154dc1a6efd24"},
@ -1830,10 +1804,6 @@ toml = [
{file = "toml-0.10.2-py2.py3-none-any.whl", hash = "sha256:806143ae5bfb6a3c6e736a764057db0e6a0e05e338b5630894a5f779cabb4f9b"},
{file = "toml-0.10.2.tar.gz", hash = "sha256:b3bda1d108d5dd99f4a20d24d9c348e91c4db7ab1b749200bded2f839ccbe68f"},
]
tomli = [
{file = "tomli-1.2.1-py3-none-any.whl", hash = "sha256:8dd0e9524d6f386271a36b41dbf6c57d8e32fd96fd22b6584679dc569d20899f"},
{file = "tomli-1.2.1.tar.gz", hash = "sha256:a5b75cb6f3968abb47af1b40c1819dc519ea82bcc065776a866e8d74c5ca9442"},
]
toolz = [
{file = "toolz-0.11.1-py3-none-any.whl", hash = "sha256:1bc473acbf1a1db4e72a1ce587be347450e8f08324908b8a266b486f408f04d5"},
{file = "toolz-0.11.1.tar.gz", hash = "sha256:c7a47921f07822fe534fb1c01c9931ab335a4390c782bd28c6bcc7c2f71f3fbf"},


@ -22,7 +22,8 @@ pytest-sugar = "^0.9.4"
pytest-cov = "^2.12.1"
coverage = "^5.5"
alembic = "^1.6.5"
black = "^21.7b0"
CProfileV = "^1.0.7"
regex = "^2021.10.8"
[build-system]
requires = ["poetry-core>=1.0.0"]

File diff suppressed because one or more lines are too long (12 files)

1 tests/comp_markets.json Normal file

@ -0,0 +1 @@
{"0x6c8c6b02e7b2be14d4fa6022dfd6d75921d90e4e": "0x0d8775f648430679a709e98d2b0cb6250d2887ef", "0x5d3a536e4d6dbd6114cc1ead35777bab948e3643": "0x6b175474e89094c44da98b954eedeac495271d0f", "0x158079ee67fce2f58472a96584a73c7ab9ac95c1": "0x1985365e9f78359a9b6ad760e32412f4a445e862", "0x39aa39c021dfbae8fac545936693ac917d5e7563": "0xa0b86991c6218b36c1d19d4a2e9eb0ce3606eb48", "0xf650c3d88d12db855b8bf7d11be6c55a4e07dcc9": "0xdac17f958d2ee523a2206206994597c13d831ec7", "0xc11b1268c1a384e55c48c2391d8d480264a3a7f4": "0x2260fac5e5542a773aa44fbcfedf7c193bc2c599", "0xb3319f5d18bc0d84dd1b4825dcde5d5f7266d407": "0xe41d2489571d322189246dafa5ebde1f4699f498", "0xf5dce57282a584d2746faf1593d3121fcac444dc": "0x89d24a6b4ccb1b6faa2625fe562bdd9a23260359", "0x35a18000230da775cac24873d00ff85bccded550": "0x1f9840a85d5af5bf1d1762f925bdaddc4201f984", "0x70e36f6bf80a52b3b46b3af8e106cc0ed743e8e4": "0xc00e94cb662c3520282e6f5717214004a7f26888", "0xccf4429db6322d5c611ee964527d42e5d685dd6a": "0x2260fac5e5542a773aa44fbcfedf7c193bc2c599", "0x12392f67bdf24fae0af363c24ac620a2f67dad86": "0x0000000000085d4780b73119b644ae5ecd22b376", "0xface851a4921ce59e912d19329929ce6da6eb0c7": "0x514910771af9ca656af840dff83e8264ecf986ca", "0x95b4ef2869ebd94beb4eee400a99824bf5dc325b": "0x9f8f72aa9304c8b593d555f12ef6589cc3a579a2", "0x4b0181102a0112a2ef11abee5563bb4a3176c9d7": "0x6b3595068778dd592e39a122f4f5a5cf09c90fe2", "0xe65cdb6479bac1e22340e4e755fae7e509ecd06c": "0x7fc66500c84a76ad7e9c93437bfc5ac33e2ddae9", "0x80a2ae356fc9ef4305676f7a3e2ed04e12c33946": "0x0bc529c00c6401aef6d220be8c6ea1667f6ad93e"}

1 tests/cream_markets.json Normal file

File diff suppressed because one or more lines are too long


@ -1,11 +1,11 @@
from typing import List
from typing import List, Optional
from mev_inspect.schemas.blocks import TraceType
from mev_inspect.schemas.classified_traces import (
Classification,
ClassifiedTrace,
CallTrace,
DecodedCallTrace,
Protocol,
)
@ -18,7 +18,7 @@ def make_transfer_trace(
token_address: str,
amount: int,
):
return CallTrace(
return DecodedCallTrace(
transaction_hash=transaction_hash,
block_number=block_number,
type=TraceType.call,
@ -26,6 +26,9 @@ def make_transfer_trace(
classification=Classification.transfer,
from_address=from_address,
to_address=token_address,
abi_name="ERC20",
function_name="transfer",
function_signature="transfer(address,uint256)",
inputs={
"recipient": to_address,
"amount": amount,
@ -43,6 +46,8 @@ def make_swap_trace(
from_address: str,
pool_address: str,
abi_name: str,
function_signature: str,
protocol: Optional[Protocol],
recipient_address: str,
recipient_input_key: str,
):
@ -56,8 +61,11 @@ def make_swap_trace(
classification=Classification.swap,
from_address=from_address,
to_address=pool_address,
function_name="swap",
function_signature=function_signature,
inputs={recipient_input_key: recipient_address},
abi_name=abi_name,
protocol=protocol,
block_hash=str(block_number),
)


@ -1,23 +1,165 @@
import unittest
from typing import List
# Fails precommit because these inspectors don't exist yet
# from mev_inspect import inspector_compound
# from mev_inspect import inspector_aave
#
#
# class TestLiquidations(unittest.TestCase):
# def test_compound_liquidation(self):
# tx_hash = "0x0ec6d5044a47feb3ceb647bf7ea4ffc87d09244d629eeced82ba17ec66605012"
# block_no = 11338848
# res = inspector_compound.get_profit(tx_hash, block_no)
# # self.assertEqual(res['profit'], 0)
#
# def test_aave_liquidation(self):
# tx_hash = "0xc8d2501d28800b1557eb64c5d0e08fd6070c15b6c04c39ca05631f641d19ffb2"
# block_no = 10803840
# res = inspector_aave.get_profit(tx_hash, block_no)
# # self.assertEqual(res['profit'], 0)
from mev_inspect.aave_liquidations import get_aave_liquidations
from mev_inspect.schemas.liquidations import Liquidation
from mev_inspect.schemas.classified_traces import Protocol
from mev_inspect.classifiers.trace import TraceClassifier
from tests.utils import load_test_block
if __name__ == "__main__":
unittest.main()
def test_single_weth_liquidation():
transaction_hash = (
"0xb7575eedc9d8cfe82c4a11cd1a851221f2eafb93d738301995ac7103ffe877f7"
)
block_number = 13244807
liquidations = [
Liquidation(
liquidated_user="0xd16404ca0a74a15e66d8ad7c925592fb02422ffe",
liquidator_user="0x19256c009781bc2d1545db745af6dfd30c7e9cfa",
collateral_token_address="0xc02aaa39b223fe8d0a0e5c4f27ead9083c756cc2",
debt_token_address="0xdac17f958d2ee523a2206206994597c13d831ec7",
debt_purchase_amount=26503300291,
received_amount=8182733924513576561,
received_token_address="0xc02aaa39b223fe8d0a0e5c4f27ead9083c756cc2",
protocol=Protocol.aave,
transaction_hash=transaction_hash,
trace_address=[1, 1, 6],
block_number=block_number,
)
]
block = load_test_block(block_number)
trace_classifier = TraceClassifier()
classified_traces = trace_classifier.classify(block.traces)
result = get_aave_liquidations(classified_traces)
_assert_equal_list_of_liquidations(result, liquidations)
def test_single_liquidation():
transaction_hash = (
"0xe6c0e3ef0436cb032e1ef292141f4fc4dcd47a75a2559602133114952190e76b"
)
block_number = 10921991
liquidations = [
Liquidation(
liquidated_user="0x8d8d912fe4db5917da92d14fea05225b803c359c",
liquidator_user="0xf2d9e54f0e317b8ac94825b2543908e7552fe9c7",
collateral_token_address="0x80fb784b7ed66730e8b1dbd9820afd29931aab03",
debt_token_address="0xdac17f958d2ee523a2206206994597c13d831ec7",
debt_purchase_amount=1069206535,
received_amount=2657946947610159065393,
received_token_address="0x80fb784b7ed66730e8b1dbd9820afd29931aab03",
protocol=Protocol.aave,
transaction_hash=transaction_hash,
trace_address=[0, 7, 1, 0, 6],
block_number=block_number,
)
]
block = load_test_block(block_number)
trace_classifier = TraceClassifier()
classified_traces = trace_classifier.classify(block.traces)
result = get_aave_liquidations(classified_traces)
_assert_equal_list_of_liquidations(result, liquidations)
def test_single_liquidation_with_atoken_payback():
transaction_hash = (
"0xde551a73e813f1a1e5c843ac2c6a0e40d71618f4040bb7d0cd7cf7b2b6cf4633"
)
block_number = 13376024
liquidations = [
Liquidation(
liquidated_user="0x3d2b6eacd1bca51af57ed8b3ff9ef0bd8ee8c56d",
liquidator_user="0x887668f2dc9612280243f2a6ef834cecf456654e",
collateral_token_address="0x514910771af9ca656af840dff83e8264ecf986ca",
debt_token_address="0xc02aaa39b223fe8d0a0e5c4f27ead9083c756cc2",
debt_purchase_amount=767615458043667978,
received_amount=113993647930952952550,
received_token_address="0xa06bc25b5805d5f8d82847d191cb4af5a3e873e0",
protocol=Protocol.aave,
transaction_hash=transaction_hash,
trace_address=[2],
block_number=block_number,
)
]
block = load_test_block(block_number)
trace_classifier = TraceClassifier()
classified_traces = trace_classifier.classify(block.traces)
result = get_aave_liquidations(classified_traces)
_assert_equal_list_of_liquidations(result, liquidations)
def test_multiple_liquidations_in_block():
transaction1 = "0xedd062c3a728db4b114f2e83cac281d19a9f753e36afa8a35cdbdf1e1dd5d017"
transaction2 = "0x18492f250cf4735bd67a21c6cc26b7d9c59cf2fb077356dc924f36bc68a810e5"
transaction3 = "0x191b05b28ebaf460e38e90ac6a801681b500f169041ae83a45b32803ef2ec98c"
block_number = 12498502
liquidation1 = Liquidation(
liquidated_user="0x6c6541ae8a7c6a6f968124a5ff2feac8f0c7875b",
liquidator_user="0x7185e240d8e9e2d692cbc68d30eecf965e9a7feb",
collateral_token_address="0x514910771af9ca656af840dff83e8264ecf986ca",
debt_token_address="0x4fabb145d64652a948d72533023f6e7a623c7c53",
debt_purchase_amount=457700000000000000000,
received_amount=10111753901939162887,
received_token_address="0x514910771af9ca656af840dff83e8264ecf986ca",
protocol=Protocol.aave,
transaction_hash=transaction1,
trace_address=[],
block_number=block_number,
)
liquidation2 = Liquidation(
liquidated_user="0x6c6541ae8a7c6a6f968124a5ff2feac8f0c7875b",
liquidator_user="0x7185e240d8e9e2d692cbc68d30eecf965e9a7feb",
collateral_token_address="0x514910771af9ca656af840dff83e8264ecf986ca",
debt_token_address="0x0000000000085d4780b73119b644ae5ecd22b376",
debt_purchase_amount=497030000000000000000,
received_amount=21996356316098208090,
received_token_address="0x514910771af9ca656af840dff83e8264ecf986ca",
protocol=Protocol.aave,
transaction_hash=transaction2,
trace_address=[],
block_number=block_number,
)
liquidation3 = Liquidation(
liquidated_user="0xda874f844389df33c0fad140df4970fe1b366726",
liquidator_user="0x7185e240d8e9e2d692cbc68d30eecf965e9a7feb",
collateral_token_address="0x9f8f72aa9304c8b593d555f12ef6589cc3a579a2",
debt_token_address="0x57ab1ec28d129707052df4df418d58a2d46d5f51",
debt_purchase_amount=447810000000000000000,
received_amount=121531358145247546,
received_token_address="0x9f8f72aa9304c8b593d555f12ef6589cc3a579a2",
protocol=Protocol.aave,
transaction_hash=transaction3,
trace_address=[],
block_number=block_number,
)
block = load_test_block(block_number)
trace_classifier = TraceClassifier()
classified_traces = trace_classifier.classify(block.traces)
result = get_aave_liquidations(classified_traces)
liquidations = [liquidation1, liquidation2, liquidation3]
_assert_equal_list_of_liquidations(result, liquidations)
def _assert_equal_list_of_liquidations(
actual_liquidations: List[Liquidation], expected_liquidations: List[Liquidation]
):
for i in range(len(actual_liquidations)):
assert actual_liquidations[i] == expected_liquidations[i]
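Each test above runs the same three-step pipeline: load a cached block fixture, classify its traces, then extract AAVE liquidations. Distilled into a standalone sketch using only the names imported above; the block number and printed fields are illustrative:

```python
from mev_inspect.aave_liquidations import get_aave_liquidations
from mev_inspect.classifiers.trace import TraceClassifier
from tests.utils import load_test_block

block = load_test_block(13244807)  # cached block fixture from the tests directory
classified_traces = TraceClassifier().classify(block.traces)

for liquidation in get_aave_liquidations(classified_traces):
    print(liquidation.transaction_hash, liquidation.debt_purchase_amount)
```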


@ -2,7 +2,7 @@ from typing import List
from mev_inspect.arbitrages import get_arbitrages, _get_all_routes
from mev_inspect.schemas.swaps import Swap
from mev_inspect.swaps import (
from mev_inspect.classifiers.specs.uniswap import (
UNISWAP_V2_PAIR_ABI_NAME,
UNISWAP_V3_POOL_ABI_NAME,
)

140 tests/test_compound.py Normal file

@ -0,0 +1,140 @@
from mev_inspect.compound_liquidations import get_compound_liquidations
from mev_inspect.schemas.liquidations import Liquidation
from mev_inspect.schemas.classified_traces import Protocol
from mev_inspect.classifiers.trace import TraceClassifier
from tests.utils import load_test_block, load_comp_markets, load_cream_markets
comp_markets = load_comp_markets()
cream_markets = load_cream_markets()
def test_c_ether_liquidations():
block_number = 13234998
transaction_hash = (
"0x78f7e67391c2bacde45e5057241f8b9e21a59330bce4332eecfff8fac279d090"
)
liquidations = [
Liquidation(
liquidated_user="0xb5535a3681cf8d5431b8acfd779e2f79677ecce9",
liquidator_user="0xe0090ec6895c087a393f0e45f1f85098a6c33bef",
collateral_token_address="0xEeeeeEeeeEeEeeEeEeEeeEEEeeeeEeeeeeeeEEeE",
debt_token_address="0x39aa39c021dfbae8fac545936693ac917d5e7563",
debt_purchase_amount=268066492249420078,
received_amount=4747650169097,
protocol=Protocol.compound_v2,
transaction_hash=transaction_hash,
trace_address=[1],
block_number=block_number,
)
]
block = load_test_block(block_number)
trace_classifier = TraceClassifier()
classified_traces = trace_classifier.classify(block.traces)
result = get_compound_liquidations(classified_traces, comp_markets, cream_markets)
assert result == liquidations
block_number = 13207907
transaction_hash = (
"0x42a575e3f41d24f3bb00ae96f220a8bd1e24e6a6282c2e0059bb7820c61e91b1"
)
liquidations = [
Liquidation(
liquidated_user="0x45df6f00166c3fb77dc16b9e47ff57bc6694e898",
liquidator_user="0xe0090ec6895c087a393f0e45f1f85098a6c33bef",
collateral_token_address="0xEeeeeEeeeEeEeeEeEeEeeEEEeeeeEeeeeeeeEEeE",
debt_token_address="0x35a18000230da775cac24873d00ff85bccded550",
debt_purchase_amount=414547860568297082,
received_amount=321973320649,
protocol=Protocol.compound_v2,
transaction_hash=transaction_hash,
trace_address=[1],
block_number=block_number,
)
]
block = load_test_block(block_number)
trace_classifier = TraceClassifier()
classified_traces = trace_classifier.classify(block.traces)
result = get_compound_liquidations(classified_traces, comp_markets, cream_markets)
assert result == liquidations
block_number = 13298725
transaction_hash = (
"0x22a98b27a1d2c4f3cba9d65257d18ee961d6c98f21c7eade37da0543847eb654"
)
liquidations = [
Liquidation(
liquidated_user="0xacbcf5d2970eef25f02a27e9d9cd31027b058b9b",
liquidator_user="0xe0090ec6895c087a393f0e45f1f85098a6c33bef",
collateral_token_address="0xEeeeeEeeeEeEeeEeEeEeeEEEeeeeEeeeeeeeEEeE",
debt_token_address="0x35a18000230da775cac24873d00ff85bccded550",
debt_purchase_amount=1106497772527562662,
received_amount=910895850496,
protocol=Protocol.compound_v2,
transaction_hash=transaction_hash,
trace_address=[1],
block_number=block_number,
)
]
block = load_test_block(block_number)
trace_classifier = TraceClassifier()
classified_traces = trace_classifier.classify(block.traces)
result = get_compound_liquidations(classified_traces, comp_markets, cream_markets)
assert result == liquidations
def test_c_token_liquidation():
block_number = 13326607
transaction_hash = (
"0x012215bedd00147c58e1f59807664914b2abbfc13c260190dc9cfc490be3e343"
)
liquidations = [
Liquidation(
liquidated_user="0xacdd5528c1c92b57045041b5278efa06cdade4d8",
liquidator_user="0xe0090ec6895c087a393f0e45f1f85098a6c33bef",
collateral_token_address="0xa0b86991c6218b36c1d19d4a2e9eb0ce3606eb48",
debt_token_address="0x70e36f6bf80a52b3b46b3af8e106cc0ed743e8e4",
debt_purchase_amount=1207055531,
received_amount=21459623305,
protocol=Protocol.compound_v2,
transaction_hash=transaction_hash,
trace_address=[1],
block_number=block_number,
)
]
block = load_test_block(block_number)
trace_classifier = TraceClassifier()
classified_traces = trace_classifier.classify(block.traces)
result = get_compound_liquidations(classified_traces, comp_markets, cream_markets)
assert result == liquidations
def test_cream_token_liquidation():
block_number = 12674514
transaction_hash = (
"0x0809bdbbddcf566e5392682a9bd9d0006a92a4dc441163c791b1136f982994b1"
)
liquidations = [
Liquidation(
liquidated_user="0x46bf9479dc569bc796b7050344845f6564d45fba",
liquidator_user="0xa2863cad9c318669660eb4eca8b3154b90fb4357",
collateral_token_address="0x514910771af9ca656af840dff83e8264ecf986ca",
debt_token_address="0x44fbebd2f576670a6c33f6fc0b00aa8c5753b322",
debt_purchase_amount=14857434973806369550,
received_amount=1547215810826,
protocol=Protocol.cream,
transaction_hash=transaction_hash,
trace_address=[],
block_number=block_number,
)
]
block = load_test_block(block_number)
trace_classifier = TraceClassifier()
classified_traces = trace_classifier.classify(block.traces)
result = get_compound_liquidations(classified_traces, comp_markets, cream_markets)
assert result == liquidations

69 tests/test_decode.py Normal file

@ -0,0 +1,69 @@
import pydantic
from mev_inspect import decode
from mev_inspect.schemas import abi
def test_decode_function_with_simple_argument():
test_function_name = "testFunction"
test_parameter_name = "testParameter"
test_abi = pydantic.parse_obj_as(
abi.ABI,
[
{
"name": test_function_name,
"type": "function",
"inputs": [{"name": test_parameter_name, "type": "uint256"}],
}
],
)
# 4byte signature of the test function.
# https://www.4byte.directory/signatures/?bytes4_signature=0x350c530b
test_function_selector = "350c530b"
test_function_argument = (
"0000000000000000000000000000000000000000000000000000000000000001"
)
abi_decoder = decode.ABIDecoder(test_abi)
call_data = abi_decoder.decode(
"0x" + test_function_selector + test_function_argument
)
assert call_data.function_name == test_function_name
assert call_data.function_signature == "testFunction(uint256)"
assert call_data.inputs == {test_parameter_name: 1}
def test_decode_function_with_tuple_argument():
test_function_name = "testFunction"
test_tuple_name = "testTuple"
test_parameter_name = "testParameter"
test_abi = pydantic.parse_obj_as(
abi.ABI,
[
{
"name": test_function_name,
"type": "function",
"inputs": [
{
"name": test_tuple_name,
"type": "tuple",
"components": [
{"name": test_parameter_name, "type": "uint256"}
],
}
],
}
],
)
# 4byte signature of the test function.
# https://www.4byte.directory/signatures/?bytes4_signature=0x98568079
test_function_selector = "98568079"
test_function_argument = (
"0000000000000000000000000000000000000000000000000000000000000001"
)
abi_decoder = decode.ABIDecoder(test_abi)
call_data = abi_decoder.decode(
"0x" + test_function_selector + test_function_argument
)
assert call_data.function_name == test_function_name
assert call_data.function_signature == "testFunction((uint256))"
assert call_data.inputs == {test_tuple_name: (1,)}
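The hard-coded selectors above ("350c530b", "98568079") are the first four bytes of the keccak-256 hash of the canonical function signature, as the linked 4byte.directory entries show. A minimal sketch of that derivation, assuming `eth_utils` (a transitive dependency of web3) is importable in the test environment; the well-known ERC-20 `transfer` selector is used because its value is easy to verify:

```python
from eth_utils import keccak


def selector(canonical_signature: str) -> str:
    """First 4 bytes of keccak-256 of the signature, hex-encoded."""
    return keccak(text=canonical_signature)[:4].hex()


assert selector("transfer(address,uint256)") == "a9059cbb"
```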


@ -1,9 +1,10 @@
from mev_inspect.swaps import (
get_swaps,
from mev_inspect.swaps import get_swaps
from mev_inspect.classifiers.specs.balancer import BALANCER_V1_POOL_ABI_NAME
from mev_inspect.classifiers.specs.uniswap import (
UNISWAP_V2_PAIR_ABI_NAME,
UNISWAP_V3_POOL_ABI_NAME,
BALANCER_V1_POOL_ABI_NAME,
)
from mev_inspect.schemas.classified_traces import Protocol
from .helpers import (
make_unknown_trace,
@ -64,6 +65,8 @@ def test_swaps(
from_address=alice_address,
pool_address=first_pool_address,
abi_name=UNISWAP_V2_PAIR_ABI_NAME,
protocol=None,
function_signature="swap(uint256,uint256,address,bytes)",
recipient_address=bob_address,
recipient_input_key="to",
),
@ -83,6 +86,8 @@ def test_swaps(
from_address=bob_address,
pool_address=second_pool_address,
abi_name=UNISWAP_V3_POOL_ABI_NAME,
protocol=None,
function_signature="swap(address,bool,int256,uint160,bytes)",
recipient_address=carl_address,
recipient_input_key="recipient",
),
@ -129,6 +134,8 @@ def test_swaps(
from_address=bob_address,
pool_address=third_pool_address,
abi_name=BALANCER_V1_POOL_ABI_NAME,
protocol=Protocol.balancer_v1,
function_signature="swapExactAmountIn(address,uint256,address,uint256,uint256)",
recipient_address=bob_address,
recipient_input_key="recipient",
),
@ -178,7 +185,7 @@ def test_swaps(
assert bal_v1_swap.transaction_hash == third_transaction_hash
assert bal_v1_swap.block_number == block_number
assert bal_v1_swap.trace_address == [6]
assert bal_v1_swap.protocol is None
assert bal_v1_swap.protocol == Protocol.balancer_v1
assert bal_v1_swap.pool_address == third_pool_address
assert bal_v1_swap.from_address == bob_address
assert bal_v1_swap.to_address == bob_address


@ -1,4 +1,4 @@
from mev_inspect.schemas.transfers import ERC20Transfer
from mev_inspect.schemas.transfers import Transfer
from mev_inspect.transfers import remove_child_transfers_of_transfers
@ -13,7 +13,7 @@ def test_remove_child_transfers_of_transfers(get_transaction_hashes, get_address
third_token_address,
] = get_addresses(5)
outer_transfer = ERC20Transfer(
outer_transfer = Transfer(
block_number=123,
transaction_hash=transaction_hash,
trace_address=[0],
@ -23,7 +23,7 @@ def test_remove_child_transfers_of_transfers(get_transaction_hashes, get_address
token_address=first_token_address,
)
inner_transfer = ERC20Transfer(
inner_transfer = Transfer(
**{
**outer_transfer.dict(),
**dict(
@ -33,7 +33,7 @@ def test_remove_child_transfers_of_transfers(get_transaction_hashes, get_address
}
)
other_transfer = ERC20Transfer(
other_transfer = Transfer(
block_number=123,
transaction_hash=transaction_hash,
trace_address=[1],
@ -43,7 +43,7 @@ def test_remove_child_transfers_of_transfers(get_transaction_hashes, get_address
token_address=third_token_address,
)
separate_transaction_transfer = ERC20Transfer(
separate_transaction_transfer = Transfer(
**{
**inner_transfer.dict(),
**dict(transaction_hash=other_transaction_hash),


@ -1,5 +1,6 @@
import json
import os
from typing import Dict
from mev_inspect.schemas.blocks import Block
@ -14,3 +15,17 @@ def load_test_block(block_number: int) -> Block:
with open(block_path, "r") as block_file:
block_json = json.load(block_file)
return Block(**block_json)
def load_comp_markets() -> Dict[str, str]:
comp_markets_path = f"{THIS_FILE_DIRECTORY}/comp_markets.json"
with open(comp_markets_path, "r") as markets_file:
markets = json.load(markets_file)
return markets
def load_cream_markets() -> Dict[str, str]:
cream_markets_path = f"{THIS_FILE_DIRECTORY}/cream_markets.json"
with open(cream_markets_path, "r") as markets_file:
markets = json.load(markets_file)
return markets