Compare commits

800 Commits

Author SHA1 Message Date
sukoneck
ce8179f07e
Update README.md
graduate repo
2024-11-27 14:42:51 -07:00
m-r-g-t
b3438d7353
feat: remove need for pg within mev-inspect env (#344)
* instructions for running without kubernetes ('monolithic mode')

* added docker instructions

* chore: remove pgsql as hard dependency

* chore: update deps

* docs: updated docs to remove local pg engine for docker install

* docs: reword docs

* ci: update poetry source

* fix: refactor tests for mypy

* fix: search miner for eth2

* feat: improve eth2 miner fn

* refactor: unnecessary comma

* test: add miner generation tests

---------

Co-authored-by: pintail <you@example.com>
2023-10-11 16:16:44 -04:00
Taarush Vemulapalli
26aa190b03
Create LICENSE 2023-05-22 13:42:43 -04:00
pintail-xyz
51c7345d26
exclude reverted punk bid acceptances (#330)
Co-authored-by: pintail <you@example.com>
2023-02-15 08:16:08 -08:00
pintail-xyz
b6777a25dc
check that swaps involve both pool assets (#328)
Co-authored-by: pintail <you@example.com>
2023-02-08 09:01:36 -08:00
pintail-xyz
297afec364
fix arbitrage by filtering out sequence where the start/finish swap contract is the same (#325)
Co-authored-by: pintail <you@example.com>
2023-01-17 08:34:33 -08:00
pintail-xyz
0e42ab6ba4
ignore reverted transactions from liquidation classifier (#317) 2022-12-05 07:59:52 -08:00
Gui Heise
d917ae72de
Merge pull request #306 from maxholloway/mh/fix-broken-readme-link
fix broken README link
2022-08-04 11:09:19 -04:00
Max Holloway
b720bb2c03 fix broken README link 2022-07-25 17:54:03 -05:00
Gui Heise
3c9fe3dca9
Merge pull request #269 from pintail-xyz/separate-cream-compound
separate out the cream and compound liquidation classifiers
2022-04-13 12:01:26 -04:00
pintail
60357cb449 update pre-commit packages 2022-04-09 00:30:43 +01:00
pintail
04d1c86dbd separate cream and compound classifiers 2022-04-07 19:27:47 +01:00
Gui Heise
22135c42dc
Merge pull request #290 from flashbots/export-deletions
Support export deletions
2022-03-22 17:05:32 -04:00
Gui Heise
8cb91afddc Add created_at to blocks table 2022-03-22 16:38:42 -04:00
Gui Heise
a9f57bcd2e Object deletions 2022-03-22 13:27:05 -04:00
Gui Heise
baf370c9fb Remove date from filename 2022-03-22 10:52:07 -04:00
Gui Heise
852ac93ba7
Merge pull request #287 from flashbots/blocks-export
add blocks to export
2022-03-04 14:09:34 -05:00
Gui Heise
c1b26f8888 add blocks to export 2022-03-04 14:03:34 -05:00
Gui Heise
7a0b21b006
Merge pull request #286 from flashbots/export-filename
Add timestamp to filename
2022-03-02 15:15:45 -05:00
Gui Heise
440c5f0754 Add timestamp to filename 2022-03-02 11:51:18 -05:00
Gui Heise
be280ab113
Merge pull request #284 from flashbots/fix-export-backfill
Fix worker actor priority
2022-02-28 13:33:45 -05:00
Gui Heise
77957e07cf Fix priority 2022-02-28 13:13:16 -05:00
Luke Van Seters
38cd60cf88
Merge pull request #282 from Dire-0x/fix/sandwicher-address-not-any-uniswap-router
fix check sandwicher against correct uniswap routers
2022-02-25 10:05:47 -05:00
Dire
75efdb4afe fix check sandwicher against correct uniswap routers 2022-02-24 11:01:44 -06:00
Gui Heise
6aca1d292d
Merge pull request #274 from flashbots/backfill-export
Backfill export
2022-02-22 13:30:39 -05:00
Gui Heise
1fbecbec58 Worker low priority 2022-02-21 13:19:25 -05:00
Gui Heise
180a987a61 Add low priority to cli tasks 2022-02-21 12:45:36 -05:00
Luke Van Seters
af7ae2c3b0
Merge pull request #272 from flashbots/healthcheck-retry
Retry on healthcheck
2022-02-21 12:43:34 -05:00
Gui Heise
5eef1b7a8f Add worker and listener task 2022-02-21 11:16:22 -05:00
Gui Heise
c6e6d694ec Add task to the listener 2022-02-21 11:02:30 -05:00
Gui Heise
da04bc4351 Add tasks to CLI 2022-02-21 10:59:14 -05:00
Gui Heise
cbad9e79b6 Separate tasks 2022-02-21 10:55:26 -05:00
Gui Heise
b486d53012 Remove priorities 2022-02-18 14:47:36 -05:00
Gui Heise
fe9253ca5e Comment Tiltfile 2022-02-18 14:47:36 -05:00
Gui Heise
db6b55ad38 Task priority and queue 2022-02-18 14:47:36 -05:00
Gui Heise
c7e94b55d4 Fix poetry config 2022-02-18 14:47:36 -05:00
Gui Heise
54cc4f1dc6 Add bash script 2022-02-18 14:47:36 -05:00
Gui Heise
c58d75118d Fix task priorities 2022-02-18 14:47:36 -05:00
Gui Heise
b86ecbca87 Add commands 2022-02-18 14:47:36 -05:00
Gui Heise
ed01c155b3
Merge pull request #278 from flashbots/export-tables
Add logic for more tables
2022-02-18 13:53:38 -05:00
Gui Heise
1edd39c382 Spacing 2022-02-18 11:52:45 -05:00
Gui Heise
ca6978a693 Add logic for more tables 2022-02-18 11:10:24 -05:00
Luke Van Seters
8767f27fe6
Merge pull request #275 from flashbots/block-list-2
Enqueue a list of blocks
2022-02-16 12:08:27 -05:00
Luke Van Seters
19eb48aec0 Use the actor 2022-02-16 11:56:50 -05:00
Luke Van Seters
cb6f20ba63 No -t for stdin push 2022-02-16 11:56:33 -05:00
Ryan Radomski
1b42920dd1 Changed backfilling a list of blocks in Readme to using a text file example 2022-02-16 11:36:23 -05:00
Ryan Radomski
fa14caec17 Added block-list command to enqueue a list of blocks from stdin 2022-02-16 11:35:38 -05:00
Luke Van Seters
eda0485fa5 Retry on healthcheck 2022-02-16 10:52:49 -05:00
Luke Van Seters
4b93f95d50
Merge pull request #270 from flashbots/only-write-if-newly-empty
Only export empty blocks if there's an existing non-empty one
2022-02-16 09:47:55 -05:00
Luke Van Seters
25f77a54fc
Merge pull request #267 from flashbots/lukevs-pr-template
Create pull_request_template.md
2022-02-16 09:27:35 -05:00
Luke Van Seters
48cf3612bd
Update pull_request_template.md 2022-02-16 09:19:36 -05:00
Luke Van Seters
4ef2145409 Skip write if no data and no key or current upload has no data 2022-02-16 09:06:51 -05:00
Luke Van Seters
b30d6be0c5 Add peek that preserves the iterable 2022-02-16 08:55:48 -05:00
Luke Van Seters
cd5f82733b
Merge pull request #266 from flashbots/priority-export
Create low and high priority queues. Put export on high priority, backfill on low
2022-02-15 17:20:02 -05:00
Luke Van Seters
6af61dac74
Update pull_request_template.md 2022-02-15 13:23:39 -05:00
Luke Van Seters
d0304474e6
Merge pull request #268 from flashbots/lukevs-contributing-fix
Update CONTRIBUTING.md to include new test command
2022-02-15 12:35:33 -05:00
Luke Van Seters
8544eb46ef
Update CONTRIBUTING.md 2022-02-15 12:26:16 -05:00
Luke Van Seters
a8856521d7
Create pull_request_template.md 2022-02-15 12:24:29 -05:00
Luke Van Seters
2e40c8bd5e Also add priority to listener 2022-02-15 12:01:02 -05:00
Luke Van Seters
7a3f3874b6 Add separate queue names to consume both then internally prioritize 2022-02-15 11:59:04 -05:00
Luke Van Seters
94f4ec7d40 Fix priorities. Lower comes first 2022-02-15 10:28:06 -05:00
Luke Van Seters
a58863b992 Add priorities to queue tasks 2022-02-15 10:25:08 -05:00
Gui Heise
0f4cd2f31d
Merge pull request #263 from flashbots/export-v3
Add Enqueue/Direct exports commands
2022-02-14 17:48:37 -05:00
Gui Heise
26ce3229ef Fix mev 2022-02-14 17:12:44 -05:00
Gui Heise
6e51443ab3 Add enqueue/direct commands 2022-02-14 16:48:38 -05:00
Gui Heise
328215bacb Fix mev 2022-02-14 15:45:38 -05:00
Gui Heise
5f5bafa7e1
Merge pull request #262 from flashbots/export-v2
Add S3 export task
2022-02-14 13:34:43 -05:00
Gui Heise
8c7baecf2a Syntax 2022-02-14 13:30:20 -05:00
Gui Heise
c6f7fd509e Export command and function edits 2022-02-14 12:37:52 -05:00
Gui Heise
95444eae24 Add actor 2022-02-11 18:43:39 -05:00
Gui Heise
bb06c8a958 Add export task 2022-02-11 16:47:24 -05:00
Gui Heise
9dbe68b284 Single block export function 2022-02-11 16:39:50 -05:00
Luke Van Seters
debcb8731a
Merge pull request #258 from flashbots/aws-s3-local
Export a range of blocks in mev_summary to S3
2022-02-11 11:22:48 -05:00
Luke Van Seters
88b5e0ce2a Move ENV names to variables. Make region and keys optional 2022-02-11 11:19:59 -05:00
Luke Van Seters
4dbe6ed2d7 Pass through AWS creds as well. Turn into a secret. Make all optional for folks not using the export 2022-02-11 11:16:06 -05:00
Luke Van Seters
c079ac9aa6 Add region for the export bucket 2022-02-11 11:16:06 -05:00
Luke Van Seters
001b6e2b85 Add a flashbots prefix 2022-02-11 11:16:06 -05:00
Luke Van Seters
aa5c90ae96 only one mev inpect helml 2022-02-11 11:16:06 -05:00
Gui Heise
751059c534 Remove some comments 2022-02-11 11:16:06 -05:00
Gui Heise
dbebb57b9c Tiltfile comments and services constraint 2022-02-11 11:16:06 -05:00
Luke Van Seters
462bff387a Break env piece into a function 2022-02-11 11:16:06 -05:00
Luke Van Seters
040be01e9d Set aws creds through environment variables locally 2022-02-11 11:16:06 -05:00
Gui Heise
00dba743d9 ConfigMap 2022-02-11 11:16:06 -05:00
Luke Van Seters
b1d4cb852b Add some logging. Remove unused list function 2022-02-11 11:16:05 -05:00
Luke Van Seters
d9439dfe27 Run query. Export to S3 2022-02-11 11:15:54 -05:00
Luke Van Seters
06c39d1495 Add boto3. Remove boto. Add a test connection to localstack 2022-02-11 11:15:36 -05:00
Luke Van Seters
e0fc9e7776 Add a shell of a command to do the export 2022-02-11 11:15:05 -05:00
Luke Van Seters
17dec2b203 Expose localhost port 2022-02-11 11:15:05 -05:00
Luke Van Seters
bb875cc45a Add boto 2022-02-11 11:15:05 -05:00
Luke Van Seters
f696bb72f4 Add localstack 2022-02-11 11:15:05 -05:00
Luke Van Seters
1a5aa6308c
Merge pull request #260 from pintail-xyz/reverse-backfill
implement reverse backfill
2022-02-10 15:57:14 -05:00
Luke Van Seters
6b6d80b3da
Merge pull request #261 from flashbots/revert-259-patch-1
Revert "Add new stablecoins and router contracts"
2022-02-10 10:57:59 -05:00
Luke Van Seters
b332bb703f
Revert "Add new stablecoins and router contracts" 2022-02-10 10:57:37 -05:00
Luke Van Seters
31bc65d617
Merge pull request #259 from ivigamberdiev/patch-1
Add new stablecoins and router contracts
2022-02-10 10:56:22 -05:00
pintail
c77869abd5 implement reverse backfill 2022-02-09 22:16:27 +00:00
Luke Van Seters
3965c5f7ba
Merge pull request #255 from flashbots/split-out-workers-from-task
Separate importing tasks from importing the worker
2022-02-08 13:06:37 -05:00
Igor Igamberdiev
0293ea3ed4
Add new stablecoins and router contracts 2022-02-07 21:11:26 +03:00
Luke Van Seters
f836b50ef5
Merge pull request #256 from tmikulin/fix-new-bitnami-postgres-update
adjust the new name for postgres bitnami
2022-02-04 10:02:48 -05:00
Tomislav Mikulin
7b236b7a71 change info in Tiltfile for postgres 2022-02-04 14:25:41 +01:00
Tomislav Mikulin
1b13e975a6 adjust the new name for postgres bitnami 2022-02-04 13:23:25 +01:00
Luke Van Seters
4db05526b3 Remove unused __main__ 2022-02-03 14:50:19 -05:00
Luke Van Seters
ecb3a563c1 Separate tasks from the worker 2022-02-02 13:16:36 -05:00
Luke Van Seters
78257df3ef
Merge pull request #250 from flashbots/load-prices-readme
Ask to load prices as part of setup
2022-01-28 16:39:50 -05:00
Luke Van Seters
d69c1ea533
Merge pull request #251 from flashbots/readme-clear-queue
Add instructions on clearing the queue to the README
2022-01-28 16:39:39 -05:00
Gui Heise
ad472d9d23
Merge pull request #252 from flashbots/compound-eth
Fix Compound cETH liquidations
2022-01-27 22:04:23 -05:00
Gui Heise
8e4416002a Fix tuple 2022-01-27 16:49:32 -05:00
Gui Heise
3ceaf7f6cf Fix eth liquidations 2022-01-26 12:53:13 -05:00
Luke Van Seters
b52d8514ce Add instructions on clearing the queue to the README 2022-01-25 11:14:05 -05:00
Luke Van Seters
747dfbd2bf Ask to load prices as part of setup 2022-01-21 13:16:08 -05:00
Luke Van Seters
99d92aaf7c
Merge pull request #249 from flashbots/nft-trades-block-primary-key
Change nft_trades primary key to include block_number
2022-01-21 10:40:42 -05:00
Luke Van Seters
a31dd7c09b Change nft_trades primary key to include block_number 2022-01-21 10:30:53 -05:00
Luke Van Seters
4076128419
Merge pull request #248 from flashbots/write-protocols-to-mev-summary
Write protocols to mev_summary
2022-01-20 19:29:43 -05:00
Luke Van Seters
de8e2a059b Write protocols to mev_summary 2022-01-20 19:28:17 -05:00
Luke Van Seters
903bf0f5d7
Merge pull request #247 from flashbots/add-protocols-to-mev-summary
Add protocols to mev summary
2022-01-20 19:27:39 -05:00
Luke Van Seters
8fd382e4b1
Merge pull request #246 from flashbots/write-to-arbs-protocol-column
Write to arbitrage protocols column
2022-01-20 19:15:23 -05:00
Luke Van Seters
ac47974daf
Merge pull request #245 from flashbots/add-protocols-column-to-arbs
Add protocols column to arbitrages
2022-01-20 19:13:53 -05:00
Luke Van Seters
866b337be7 Add protocols column to mev_summary 2022-01-20 19:13:02 -05:00
Luke Van Seters
f37de76824 Fix swap and sandwich tests 2022-01-20 19:09:35 -05:00
Luke Van Seters
3afb854d13 Require protocol to build a swap 2022-01-20 19:03:29 -05:00
Luke Van Seters
a056919507 Fix test swaps 2022-01-20 19:03:00 -05:00
Luke Van Seters
2e1600b002 Add protocol on test swaps 2022-01-20 19:01:27 -05:00
Luke Van Seters
9ce82a36de Write protocol for uniswap v2 and v3 swaps. Require protocol for all future swaps. Write protocols to arbitrages 2022-01-20 18:57:14 -05:00
Luke Van Seters
3f2daee6a9 Add protocols column to arbitrages 2022-01-20 18:33:08 -05:00
Luke Van Seters
9bef022d37
Merge pull request #244 from flashbots/update-revision-for-tokens
Update to latest down revision for tokens migration
2022-01-20 12:17:05 -05:00
Luke Van Seters
e3b4e35c23 Update to latest down revision 2022-01-20 11:22:09 -05:00
Luke Van Seters
62d8125bcf
Merge pull request #242 from flashbots/sandwich-more-accurate
Allow sandwiches with large differences. Explicitly filter uniswap routers
2022-01-20 11:03:35 -05:00
Luke Van Seters
53f6be4700
Merge pull request #243 from flashbots/add-liquidations
Write liquidations to mev_summary on inspect
2022-01-20 11:00:34 -05:00
Gui Heise
a21027614d
Merge pull request #241 from flashbots/token-migration
Add tokens revision
2022-01-20 10:45:51 -05:00
Gui Heise
bfd1783045 Revert to one query 2022-01-20 10:36:32 -05:00
Luke Van Seters
0266582889 Actually insert the data. Fix the ordering 2022-01-20 10:10:50 -05:00
Luke Van Seters
177d8599c1 Allow sandwiches with large differences. Explicitly filter uniswap routers 2022-01-19 21:35:25 -05:00
Gui Heise
7bdc8b68ef
Merge pull request #239 from flashbots/prices-range
Add prices fetch-range
2022-01-19 18:22:56 -05:00
Luke Van Seters
cab1fe4f4c
Merge pull request #240 from flashbots/nullable-gross-profit
Make gross profit nullable
2022-01-19 18:17:52 -05:00
Gui Heise
e6d52fa7df Add tokens revision 2022-01-19 18:14:49 -05:00
Luke Van Seters
654c749c02 Make gross profit nullable 2022-01-19 18:11:41 -05:00
Luke Van Seters
906b158851
Merge pull request #237 from flashbots/add-listener-mev-summary
Populate mev_summary on inspect
2022-01-19 17:41:23 -05:00
Luke Van Seters
97e11521fd
Merge pull request #236 from flashbots/mev-summary-table
Create table for mev_summary
2022-01-19 17:30:34 -05:00
Luke Van Seters
d67ee0657e
Merge pull request #230 from flashbots/keep-parts-from-trace-db
Allow partial results from the db
2022-01-19 17:28:23 -05:00
Luke Van Seters
c26910e74b Add liquidations to the summary 2022-01-19 17:07:24 -05:00
Gui Heise
df8525d582 Correct instance name 2022-01-19 17:03:06 -05:00
Gui Heise
cdb5ecc9a0 Fix datetime support 2022-01-19 16:59:25 -05:00
Gui Heise
f0064e01b2 Shared function, start/end -> after/before 2022-01-19 16:48:20 -05:00
Luke Van Seters
94825d3547 Update mev_summary for all inspections 2022-01-19 16:48:12 -05:00
Gui Heise
c4f82bdbd6 Add prices-range 2022-01-19 15:43:59 -05:00
Gui Heise
e105ee4d29
Merge pull request #235 from flashbots/liquidation-classifiers
Add LiquidationClassifiers
2022-01-19 11:32:42 -05:00
Gui Heise
8ad5ea525d Fix token address circular import 2022-01-19 11:18:04 -05:00
Gui Heise
5eaf4748c7 Fix helper names 2022-01-19 10:58:09 -05:00
Gui Heise
e1d054b82e Add cETH transfer in cETH liquidations 2022-01-19 10:48:40 -05:00
Luke Van Seters
b113b6c82e Add mev_summary population to the listener 2022-01-18 18:04:52 -05:00
Luke Van Seters
5fc38de2c1 Create table for mev_summary 2022-01-18 18:03:40 -05:00
Gui Heise
425f882637 Fix black issues 2022-01-18 16:46:49 -05:00
Gui Heise
f1937be0e9 Fix cETH address circular import 2022-01-18 16:42:56 -05:00
Gui Heise
c6379977ba Simplify debt amount 2022-01-18 16:39:38 -05:00
Gui Heise
72a9d6744a Share transfer helpers 2022-01-18 16:39:38 -05:00
Gui Heise
de4a682061 Adjust AAVE and include prices fromm #231 2022-01-18 16:39:08 -05:00
Gui Heise
b599dff24d replace value with repay input and init received to None 2022-01-18 16:37:44 -05:00
Gui Heise
772a08d8d6 Remove terminal saved output 2022-01-18 16:37:44 -05:00
Gui Heise
f31e2525da Remove old inspectors and add none return for 0 received 2022-01-18 16:37:44 -05:00
Gui Heise
172ed46b0b Remoeve debugger 2022-01-18 16:37:44 -05:00
Gui Heise
af22d4765d Fix debt token in tests to underlyinh 2022-01-18 16:37:44 -05:00
Gui Heise
98a38eee47 Change crUSDC to LINK 2022-01-18 16:37:44 -05:00
Gui Heise
1fd9780e9d Change cUSDC to USDC token address 2022-01-18 16:37:44 -05:00
Gui Heise
d1b4ebd02c Address black 2022-01-18 16:37:44 -05:00
Gui Heise
406d22a26f Fixes 2022-01-18 16:37:44 -05:00
Gui Heise
072c4df36c Compound logic 2022-01-18 16:37:44 -05:00
Gui Heise
9883680dfa Update classifiers and tests 2022-01-18 16:37:44 -05:00
Gui Heise
d0ab255a5c Add LiquidationClassifiers 2022-01-18 16:37:44 -05:00
Gui Heise
a9b8f149aa
Merge pull request #231 from flashbots/coingecko-api
Add coingecko api
2022-01-18 16:37:18 -05:00
Gui Heise
d8f896bda3 Add Eth and Weth prices 2022-01-18 16:33:32 -05:00
Gui Heise
c7de7cf808 Fix Black pre-commit 2022-01-18 16:04:31 -05:00
Gui Heise
a3d83e625c Add cETH and cWBTC 2022-01-18 16:01:33 -05:00
Luke Van Seters
6e537053e8
Merge pull request #233 from CalebEverett/sandwich_profit
add sandwich profit
2022-01-18 11:31:22 -05:00
Luke Van Seters
85d90e3c6b No need for typevar 2022-01-18 11:30:32 -05:00
Luke Van Seters
091ddbd9c1 Clean up async defaults 2022-01-18 11:29:45 -05:00
Caleb
dcab5fe3fb update sandwiches fixture 2022-01-18 07:39:40 -08:00
Luke Van Seters
89d2a718b2 No limits! 2022-01-18 09:24:03 -05:00
Caleb
1e5e48804c run precommit 2022-01-17 19:59:49 -08:00
Caleb
42dd184779 add sandwich profit 2022-01-17 19:56:40 -08:00
Gui Heise
fed3497afc Remove @coro from cli 2022-01-17 22:09:27 -05:00
Gui Heise
3a0e91f870
Merge pull request #234 from flashbots/compound-v2
Flip token addresses
2022-01-17 22:05:23 -05:00
Caleb
5467459004 add profit_token_address to sandwiches 2022-01-17 17:15:10 -08:00
Gui Heise
9604e01aba Correct tests 2022-01-17 16:42:43 -05:00
Gui Heise
e4528de7bd Same for CETher 2022-01-17 15:10:30 -05:00
Gui Heise
640bcaa867 Flip token addresses 2022-01-17 14:43:29 -05:00
Luke Van Seters
32049d4a86
Merge pull request #232 from tmikulin/update_gitignore
add pycharm files to gitignore
2022-01-17 10:08:11 -05:00
Caleb
189b7d1220 add sandwich profit 2022-01-16 14:42:00 -08:00
Tomislav Mikulin
1935a898db add pycharm files to gitignore 2022-01-16 17:05:21 +01:00
Luke Van Seters
93c7998e22 Allow partial results from the db 2022-01-15 11:12:59 -05:00
Luke Van Seters
72433ece39
Merge pull request #225 from flashbots/constrain-sandwiches
Constrain sandwiches to require similar swap amounts
2022-01-15 11:05:41 -05:00
Gui Heise
3072e4a826 Specify coingecko id's and remove async keyword from cli 2022-01-14 13:17:37 -05:00
Gui Heise
7af515d1ac Change price to float 2022-01-13 11:17:48 -05:00
Gui Heise
2a1da33752 Remove leftover coinbase file 2022-01-13 10:54:00 -05:00
Gui Heise
2e22103713 Add coingecko api 2022-01-13 01:26:53 -05:00
Luke Van Seters
a93161eabc
Merge pull request #229 from flashbots/chainlink-fix
Fix chainlink price
2022-01-11 15:52:39 -05:00
Luke Van Seters
d9bca45a50 Fix chainlink price 2022-01-11 15:50:19 -05:00
Luke Van Seters
de03b953a0
Merge pull request #223 from flashbots/aave-zero-bug
Support aave self-liquidations
2022-01-11 09:49:50 -05:00
Luke Van Seters
403e84fa29 should be zero if we dont know 2022-01-11 09:48:41 -05:00
Luke Van Seters
a40e250464
Merge pull request #226 from tmikulin/improve-k8-security
Enforce security in k8 files
2022-01-11 08:20:01 -05:00
Tomislav Mikulin
2703b008de Enforce security in k8 files 2022-01-10 20:52:45 +01:00
Gui Heise
c28f7c6174 Remove unused Optional 2022-01-10 14:21:28 -05:00
Gui Heise
2bb760874d Remove exceptions 2022-01-10 14:18:37 -05:00
Luke Van Seters
e388271b55 Constrain sandwiches to require the backrun swap to be similar in amount and from the same address as the frontrun 2022-01-10 13:53:10 -05:00
Luke Van Seters
a29b12bf0a
Merge pull request #224 from flashbots/support-all-tokens-coinbase-knows
Support all tokens we have supported for coinbase
2022-01-10 12:54:14 -05:00
Luke Van Seters
5b1efd5e6d Support all tokens we have supported for coinbase 2022-01-10 12:27:08 -05:00
Luke Van Seters
89fcf388e4
Merge pull request #222 from flashbots/include-sandwiches-close-in-arb
Include sandwiches that close in arbs
2022-01-10 11:24:34 -05:00
Gui Heise
63087fc0e8 Support aave self-liquidations 2022-01-10 10:14:58 -05:00
Luke Van Seters
a6e76bfd10
Merge pull request #218 from flashbots/remove-backfill
Remove old backfill code
2022-01-10 10:05:14 -05:00
Luke Van Seters
50ff7dadcd The sandwicher should be where the swap value accumulates 2022-01-08 16:15:39 -05:00
Luke Van Seters
4930065045 Include sandwiches that close in arbs 2022-01-08 13:44:58 -05:00
Luke Van Seters
4a4992a0f9
Merge pull request #221 from flashbots/fix-listener-small-image
Fix listener to work with more secure image
2022-01-08 07:04:44 -05:00
Luke Van Seters
81be06ad7d Fix listener to work with more secure image 2022-01-07 16:18:51 -05:00
Luke Van Seters
e13f895593
Merge pull request #219 from flashbots/require-small-difference-arbs
Require token amounts in arbitrage swaps to be close to each other
2022-01-07 14:09:52 -05:00
Luke Van Seters
660dfe7b2f Update tests to use a true reverting arb 2022-01-07 13:18:51 -05:00
Luke Van Seters
11aebe078a Require price difference to be less than 1% between swaps 2022-01-07 13:06:41 -05:00
Luke Van Seters
69cad7537e Break swap outs / ins check into a function 2022-01-07 12:22:45 -05:00
Gui Heise
9894450e0c
Merge pull request #217 from flashbots/aave-liquidations-v3
Restructure AAVE classifier debt logic
2022-01-07 11:30:53 -05:00
Gui Heise
977a72839e Remove instance checks 2022-01-07 11:25:33 -05:00
Luke Van Seters
dcdb4e421d
Merge pull request #210 from tmikulin/improve_dockerfile
Improve dockerfile
2022-01-07 11:05:39 -05:00
Tomislav Mikulin
02fb01dfb8 Merge branch 'main' into improve_dockerfile 2022-01-07 09:30:35 +01:00
Tomislav Mikulin
9ab1e6e5b1 add the missing emojis 2022-01-07 09:25:30 +01:00
Luke Van Seters
b33eb49dd2 Remove old backfill code 2022-01-06 17:10:52 -05:00
Gui Heise
327695c56c Remove AAVE address list 2022-01-06 16:38:48 -05:00
Gui Heise
818a9b0b65 Raise exceptions 2022-01-06 16:35:51 -05:00
Gui Heise
75748abb43 Actually fix eth transfers test 2022-01-06 16:17:10 -05:00
Gui Heise
92904d7298 Fix eth transfer liquidations 2022-01-06 16:14:35 -05:00
Gui Heise
73a29a667b Fix text 2022-01-06 15:08:44 -05:00
Luke Van Seters
8bb92aa87e
Merge pull request #215 from flashbots/flip-token-in-out-amounts
Switch token amounts for taker and maker on 0x
2022-01-05 20:30:40 -05:00
Luke Van Seters
722ee8c6ec Fix tests 2022-01-05 18:02:49 -05:00
Luke Van Seters
bee620fd98 Switch token amounts for taker and maker on 0x 2022-01-05 17:55:49 -05:00
Luke Van Seters
2d8db7f506
Merge pull request #213 from flashbots/static-redis-password
Set the password in Redis statically locally
2022-01-05 15:21:31 -05:00
Luke Van Seters
09e1d48ae8 Set the password in redis statically locally 2022-01-04 19:00:10 -05:00
Luke Van Seters
379bd82f0e
Merge pull request #211 from flashbots/faster-writes
Use COPY to speed up database writes for blocks and traces
2022-01-04 13:17:24 -05:00
Luke Van Seters
8ba0f86569
Merge pull request #206 from flashbots/fix-pricing
Only import the worker where needed
2022-01-04 12:21:29 -05:00
Luke Van Seters
807e6e482a
Merge pull request #212 from flashbots/only-search-shortest
Cut out early from arbitrages if we've already found a shorter path
2022-01-04 11:38:31 -05:00
Luke Van Seters
17823b5aae comment => variable 2022-01-04 11:25:27 -05:00
Luke Van Seters
eff77dd482 goodbye 2022-01-04 11:24:33 -05:00
Luke Van Seters
2af2f86069
Merge pull request #207 from flashbots/gimme-a-break
Be more lenient on liveness timeouts for deployments
2022-01-04 11:05:31 -05:00
Luke Van Seters
28b37c723c Put it back 2022-01-04 10:19:39 -05:00
Luke Van Seters
02a0adc8e2 Break it to prove tests work 2022-01-04 10:16:50 -05:00
Luke Van Seters
f84b9d45d3 Add placeholder file to detect which code is running 2022-01-04 10:05:53 -05:00
Luke Van Seters
24a6ba670e Bring back the array for diff checks 2022-01-04 09:50:44 -05:00
Luke Van Seters
bb94eba02a Change to max_route_length to make the logic clearer 2022-01-03 16:09:34 -05:00
Luke Van Seters
4e9ff10988 Cut out early from arbitrages if we've already found a shorter path 2022-01-03 15:59:56 -05:00
Luke Van Seters
0ed4f5456e Move list util to db shared 2022-01-03 15:20:00 -05:00
Luke Van Seters
9b8cac5c5d Credit 2022-01-03 15:14:28 -05:00
Luke Van Seters
ada540c1d4 Write using an iterator 2022-01-03 14:50:27 -05:00
Luke Van Seters
6b1c469a10 Move classified_traces to csv write 2022-01-03 14:27:36 -05:00
Luke Van Seters
bab2043575 Abstract out csv writing 2022-01-03 13:38:34 -05:00
Luke Van Seters
93bdb7c129 Write blocks as proof of concept 2022-01-03 13:15:30 -05:00
Luke Van Seters
99d291da8e Be more lenient on liveness timeouts 2022-01-03 12:43:54 -05:00
Luke Van Seters
7bb3275c04 Only import worker where needed 2022-01-03 12:16:33 -05:00
Tomislav Mikulin
1557673eda Merge branch 'main' into improve_dockerfile 2022-01-03 17:56:13 +01:00
Luke Van Seters
5a26bde3de Get RPC only where its needed 2022-01-03 11:50:38 -05:00
Luke Van Seters
e462a16b8f
Merge pull request #202 from flashbots/redis-queue
Queue backfills with Redis
2022-01-03 11:42:07 -05:00
Tomislav Mikulin
6f624ecb7b optimize the dockerfile with security and shrinking the resulting docker image 2022-01-02 16:32:52 +01:00
Luke Van Seters
0860f4f7f5 More detail in the README 2021-12-31 18:08:04 -05:00
Luke Van Seters
5cad2fef43 Break redis into a function. Add reference to README for now 2021-12-31 18:00:32 -05:00
Luke Van Seters
139e45333b Clean up redis pods 2021-12-31 16:44:22 -05:00
Luke Van Seters
f296de5a20 Update README to reflect new backfill 2021-12-31 16:37:27 -05:00
Luke Van Seters
0516fffa9c Add some logging 2021-12-31 16:18:17 -05:00
Luke Van Seters
01bb566478 Drop worker count to 1 locally 2021-12-31 16:18:05 -05:00
Luke Van Seters
cbec5b7613 Only build inspector once 2021-12-31 16:12:36 -05:00
Luke Van Seters
cff148e21f Log when writing 2021-12-31 16:11:18 -05:00
Luke Van Seters
815af26f28 Enqueue messages to redis with backfill command 2021-12-31 15:55:33 -05:00
Luke Van Seters
b862bddfe9 Add worker deployment 2021-12-31 15:55:33 -05:00
Luke Van Seters
476db25003 Add redis 2021-12-31 15:55:33 -05:00
Luke Van Seters
4662a1ecbc Pass DB sessions into inspector 2021-12-31 15:50:07 -05:00
Luke Van Seters
1ff9e9aa1c
Merge pull request #199 from flashbots/fix-cycle-sandwiches
Support sandwiches including multiple pools
2021-12-31 15:22:39 -05:00
Luke Van Seters
bec0d03cae
Merge pull request #201 from flashbots/fix-typo
Fix typo in gathering blocks
2021-12-31 14:49:33 -05:00
Luke Van Seters
602e32de36
Merge pull request #200 from flashbots/mev-use-poetry
Use poetry for backfill script
2021-12-31 08:19:33 -05:00
Luke Van Seters
943715c812 Fix typo in gathering blocks 2021-12-30 22:05:23 -05:00
Luke Van Seters
60b0b933b4 Use poetry for backfill script 2021-12-30 10:46:29 -05:00
Luke Van Seters
9235020999
Merge pull request #195 from flashbots/consistent-middleware
Use middleware for all RPC calls
2021-12-30 10:11:33 -05:00
Luke Van Seters
a683cc66e0 Fix sandwiches including multiple pools 2021-12-29 17:59:21 -05:00
Luke Van Seters
b487ab08a0
Merge pull request #197 from flashbots/break-out-early-find
Break out of finding block on first missing attribute
2021-12-29 11:26:56 -05:00
Luke Van Seters
880e588f5f
Merge pull request #196 from flashbots/zero-ex-two-transfers
ZeroX requires at least 2 child transfers
2021-12-29 11:26:39 -05:00
Luke Van Seters
f9ccd8dca2
Merge pull request #194 from flashbots/bug-all
Inspect block should write all
2021-12-29 11:26:08 -05:00
Luke Van Seters
846f7376d4 Break out of finding block on first missing attribute 2021-12-29 09:50:40 -05:00
Luke Van Seters
52be448fb8 ZeroX requires at least 2 child transfers 2021-12-29 09:14:15 -05:00
Luke Van Seters
b70f55c9cc Keep asyncio sleep 2021-12-25 17:29:40 -05:00
Luke Van Seters
7707b818f0 Include new methods in retry-able methods 2021-12-25 17:23:21 -05:00
Luke Van Seters
6b8d66b976
Merge pull request #173 from sketsdever/opensea
Opensea NFT Trade classifier
2021-12-25 16:56:29 -05:00
Luke Van Seters
b611be4e68 Inspect block should write all 2021-12-25 16:54:47 -05:00
Shea Ketsdever
5990838603 Last nits 2021-12-25 15:53:13 -06:00
Luke Van Seters
fcc453391f Use middleware for trace and receipt methods 2021-12-23 22:21:18 -05:00
Shea Ketsdever
edc40a3106 Merge 2021-12-23 19:56:24 -06:00
Shea Ketsdever
ce7585e0b3 Fix getting addr 2021-12-23 19:41:26 -06:00
Shea Ketsdever
1f84f95fff Require exchange_wallet_address and rename payment_token -> payment_token_address 2021-12-23 18:57:11 -06:00
Luke Van Seters
2982ff700f
Merge pull request #192 from flashbots/liquidations-error-crud
Pass error through to liquidation
2021-12-23 14:41:37 -05:00
Luke Van Seters
21826dd308 Pass error through from trace to liquidation 2021-12-23 10:09:32 -05:00
Luke Van Seters
115167096e Add error column to liquidations 2021-12-23 09:56:15 -05:00
Luke Van Seters
7b44046926
Merge pull request #183 from flashbots/fix-infinite-arbs
Only use each swap in a single arbitrage
2021-12-22 22:53:55 -05:00
Luke Van Seters
2768428eac
Merge pull request #189 from flashbots/overflow-error
Ignore overflow errors on trace decode
2021-12-22 22:49:40 -05:00
Luke Van Seters
b588e115ce Fix reverting arbitrage tests 2021-12-22 22:42:26 -05:00
Luke Van Seters
bd99188f6e Rename rest 2021-12-22 22:41:10 -05:00
Luke Van Seters
fa5be12e81 Fix docstring 2021-12-22 22:41:10 -05:00
Luke Van Seters
ca921f896d route => shortest_route in tests 2021-12-22 22:41:10 -05:00
Luke Van Seters
22769c9529 Remove TODO - not needed for now 2021-12-22 22:41:10 -05:00
Luke Van Seters
17c9b835ac Simplify smallest logic. Fix tests 2021-12-22 22:41:10 -05:00
Luke Van Seters
46b768c147 Break out shortest logic into a function 2021-12-22 22:41:10 -05:00
Luke Van Seters
46f7786c4f Only keep the shortest route instead 2021-12-22 22:41:10 -05:00
Luke Van Seters
154d356621 Only keep the longest arb 2021-12-22 22:41:10 -05:00
Luke Van Seters
f4fb7717dd Ignore overflow errors on trace decode 2021-12-22 22:39:06 -05:00
Gui Heise
45c74a19ec
Merge pull request #188 from flashbots/compound-tokens
Add compound tokens
2021-12-22 15:27:33 -05:00
Gui Heise
1916c81293 Fix USDC const 2021-12-22 14:59:34 -05:00
Gui Heise
e237f8d17f Add token addresses 2021-12-22 14:45:12 -05:00
Taarush Vemulapalli
4cb3383d1a
New error column for arbitrages (#180) 2021-12-22 08:00:54 -08:00
Luke Van Seters
ea40a3905f
Merge pull request #179 from flashbots/copy-data
Inspect many writing 10 blocks at a time - 40s => 30s locally
2021-12-21 17:57:01 -05:00
Luke Van Seters
bb0420fd78
Merge pull request #175 from flashbots/random-postgres-client
Append a random number to postgres client
2021-12-21 15:46:21 -05:00
Luke Van Seters
3c958cdc76
Merge pull request #178 from flashbots/copy-data
Bulk delete and write data
2021-12-21 15:37:26 -05:00
Luke Van Seters
cec6341bdf Inspect many writing 10 blocks at a time - 40s => 30s locally 2021-12-21 15:05:12 -05:00
Luke Van Seters
fcfb40c864 Add inspect many blocks - use for single inspect too 2021-12-21 14:58:39 -05:00
Gui Heise
a463ff7ebf
Merge pull request #177 from flashbots/token-decimals
Create tokens table
2021-12-21 14:52:29 -05:00
Gui Heise
c68e7216d9 Remove pass 2021-12-21 14:44:58 -05:00
Gui Heise
ba45200d66 Create tokens table 2021-12-21 14:18:46 -05:00
Luke Van Seters
35074c098e Append a random number to postgres client 2021-12-21 10:28:13 -05:00
Shea Ketsdever
66e1e64675 Actually fix lint issues 2021-12-20 11:05:05 -08:00
Luke Van Seters
82c167d842
Merge pull request #174 from flashbots/listener-lag-fix
Fix listener first startup
2021-12-20 12:54:32 -05:00
Luke Van Seters
a2f8b5c08e Remove PIDFILE after stop 2021-12-20 12:43:27 -05:00
Luke Van Seters
6e8d898cb0 Start listener from block lag 2021-12-20 12:37:20 -05:00
Shea Ketsdever
bf85025b84 Fix lint issue 2021-12-20 09:05:21 -08:00
Shea Ketsdever
97e6c156ab Add nft_trades table to db 2021-12-19 15:13:01 -08:00
Shea Ketsdever
b75ee98018 Create nft trade from transfers 2021-12-19 14:31:49 -08:00
Shea Ketsdever
f92737b00c Classify opensea nft trades 2021-12-19 12:16:49 -08:00
Luke Van Seters
cfa3443f88
Merge pull request #170 from flashbots/no-sandwiches
If no sandwiched swaps, not a sandwich
2021-12-17 12:15:05 -05:00
Luke Van Seters
088c32f52f If no sandwiched swaps, not a sandwich 2021-12-17 11:02:03 -05:00
Luke Van Seters
1943d73021
Merge pull request #169 from flashbots/lower-prices
Make token addresses for prices lowercase
2021-12-16 18:38:17 -05:00
Luke Van Seters
633007be64 Make token addresses for prices lowercase 2021-12-16 17:28:20 -05:00
Taarush Vemulapalli
d7bb160d85
Add received_token_address for Compound/CREAM (#168) 2021-12-16 14:33:10 -05:00
Luke Van Seters
8a8090e20f
Merge pull request #163 from flashbots/add-sandwiches-crud
Add sandwiches
2021-12-16 14:32:03 -05:00
Gui Heise
408ff02de3
Merge pull request #164 from flashbots/0x-bug 2021-12-16 13:41:10 -05:00
Gui Heise
c93e216647 Fix length check for child transfers 2021-12-15 14:35:29 -05:00
Gui Heise
af01b4e8b5 Value to Runtime error 2021-12-15 14:03:51 -05:00
Gui Heise
42b82be386 Add exception to transfers not found 2021-12-15 13:54:51 -05:00
Luke Van Seters
566dada5d4 Add back crud for sandwiches 2021-12-15 13:47:29 -05:00
Luke Van Seters
f0c29e2b2f Add logic and writing for sandwiches. Add tests too 2021-12-15 13:45:55 -05:00
Gui Heise
c090624f4c move none check 2021-12-15 11:06:22 -05:00
Luke Van Seters
5fa7c6b567
Merge pull request #167 from flashbots/isort-again
Fix whitespace for isort
2021-12-14 13:31:50 -05:00
Luke Van Seters
b9544eb18b Fix whitespace for isort 2021-12-14 13:14:13 -05:00
Luke Van Seters
c23b9a1651
Merge pull request #158 from flashbots/add-isort
Add isort pack to pre-commit
2021-12-14 13:11:39 -05:00
Luke Van Seters
94a05d8845 Run isort for alembic 2021-12-14 13:09:28 -05:00
Luke Van Seters
8b6bf7d76d Make alembic a known third part for isort 2021-12-14 13:09:02 -05:00
Luke Van Seters
2c251fb72e Make alembic a known third party 2021-12-14 13:08:26 -05:00
Luke Van Seters
bda96b04ce Try local rev 2021-12-14 13:03:24 -05:00
Luke Van Seters
bd73820123 Rename isort back 2021-12-14 12:59:14 -05:00
Luke Van Seters
7bc820fb33
Merge pull request #162 from flashbots/add-sandwiches-db
Add sandwiches tables
2021-12-14 12:48:03 -05:00
Luke Van Seters
4b909ad88e Add tables for sandwiches 2021-12-14 12:47:49 -05:00
Luke Van Seters
2ec2bf44ba
Merge pull request #160 from flashbots/add-transaction-position-crud
Write transaction position for swaps and traces
2021-12-14 12:47:16 -05:00
Luke Van Seters
ccd409e9cf
Merge pull request #161 from flashbots/add-transaction-position
Add nullable transaction position field
2021-12-14 12:47:07 -05:00
Luke Van Seters
138b1a0eef No comments 2021-12-14 12:46:00 -05:00
Luke Van Seters
d62d547da1
Merge pull request #159 from flashbots/faster-tests
Speed up the tests by sharing trace_classifier
2021-12-14 12:37:02 -05:00
Gui Heise
23635892a6 Add check for reverted orders 2021-12-13 21:07:24 -05:00
Luke Van Seters
9ffe9fe131 Add back 2021-12-13 20:24:29 -05:00
Luke Van Seters
c6eba733a0 Fix env.py 2021-12-13 20:20:12 -05:00
Luke Van Seters
c853cee43e Write transaction position for swaps and traces 2021-12-13 20:05:07 -05:00
Luke Van Seters
e84d946ebb Add nullable transaction position field 2021-12-13 20:03:17 -05:00
Luke Van Seters
2046cd2e51 Add support for profiling 2021-12-13 19:46:58 -05:00
Luke Van Seters
bb23fce13c Share trace classifier in tests 2021-12-13 19:34:47 -05:00
Luke Van Seters
5d7d84aa02 Add back isort in precommit 2021-12-13 18:50:43 -05:00
Luke Van Seters
767cf2df8f Specify python version 2021-12-13 18:48:50 -05:00
Luke Van Seters
d5f73b5e3a Run isort on all files 2021-12-13 18:46:39 -05:00
Luke Van Seters
bc46c2929b Fix isort settings so mev_inspect is considered this project 2021-12-13 18:45:21 -05:00
Luke Van Seters
f07c497b33
Merge pull request #157 from flashbots/fix-head-punks
Change migrations head for punks
2021-12-13 14:34:23 -05:00
Luke Van Seters
1534fb6165 Change migrations head for punks 2021-12-13 13:19:11 -05:00
Gui Heise
88adfd8625
Merge pull request #154 from flashbots/add-liquidation-addresses
Add liquidation addresses
2021-12-13 11:00:26 -05:00
Gui Heise
d736b38845 Add coinbase names for addresses 2021-12-08 15:09:06 -05:00
Gui Heise
00c73b228d Add supported token addresses 2021-12-07 15:53:45 -05:00
Gui Heise
5341c904ec Add top received liquidation addresses to prices 2021-12-07 15:13:08 -05:00
Robert Miller
9ffa9d2df9
Merge pull request #149 from flashbots/punk_accept_bids_database
feat: punk accept bids database
2021-12-06 16:41:47 -05:00
Robert Miller
4e91e52a92 style: formatting 2021-12-06 16:36:05 -05:00
Robert Miller
0ad3906989 style: formatting 2021-12-06 16:33:35 -05:00
Robert Miller
27f43ea29c
Merge branch 'main' into punk_accept_bids_database 2021-12-06 16:31:24 -05:00
Robert Miller
8d48cea315
Merge pull request #147 from flashbots/punk-database-work
feat: punk_snipe database entry
2021-12-06 16:16:31 -05:00
Robert Miller
01c4024017 style: formatting 2021-12-06 16:13:47 -05:00
Robert Miller
044a233141
Merge branch 'main' into punk-database-work 2021-12-06 16:07:13 -05:00
Robert Miller
34dc54ee6f
Merge pull request #148 from flashbots/punk_bid_database
feat: add punk bid database
2021-12-06 16:05:47 -05:00
Gui Heise
d938182833
Merge pull request #153 from flashbots/double-arb-bug
Fix arbitrage swap double entry bug
2021-12-06 15:21:43 -05:00
Gui Heise
d2a1814774 skip start swap 2021-12-06 15:18:32 -05:00
Gui Heise
be19c42275 add start and end route check 2021-12-06 11:52:41 -05:00
Robert Miller
8ee803d229 Merge branch 'punk-database-work' of https://github.com/flashbots/mev-inspect-py into punk-database-work 2021-12-04 20:33:13 -05:00
Robert Miller
478f9bafa5 style: formatting 2021-12-04 20:32:54 -05:00
Robert Miller
9f08275698 style: formatting 2021-12-04 20:32:29 -05:00
Robert Miller
622cf9319e style: formatting 2021-12-04 20:31:46 -05:00
Luke Van Seters
11744deaa9
Merge pull request #151 from sketsdever/bancor
Bancor classifier
2021-12-03 11:05:44 -05:00
Shea Ketsdever
37e6900f46 Rename create_swap functions 2021-12-02 21:08:45 -05:00
Taarush Vemulapalli
1fb65bacc1
Compound backfilling/removed network calls (#125)
* Removes `collateral_token_address` from both aave/comp for consistency
2021-12-02 11:19:32 -08:00
Shea Ketsdever
4fdd628ce3 Merge 2021-12-01 18:10:05 -05:00
Luke Van Seters
912239fc2e
Merge pull request #150 from flashbots/fix-timestamp-writing
Fix timestamp writing in blocks
2021-11-30 12:57:20 -05:00
Luke Van Seters
ed94e71715 Fix timestamp writing in blocks 2021-11-30 12:54:07 -05:00
Luke Van Seters
f7e4bdaed2
Merge pull request #142 from flashbots/prices-kube
Add cron job to fetch prices
2021-11-29 12:09:29 -05:00
Shea Ketsdever
7d7f78bfb1 Fix int<>timestamp bug 2021-11-28 16:02:41 -08:00
Shea Ketsdever
cd01298ba6 Bancor classifier 2021-11-28 14:51:24 -08:00
Robert Miller
c1ba63ef81 style: formatting 2021-11-26 21:34:16 -05:00
Robert Miller
e1e678bbc2 style: formatting 2021-11-26 21:33:47 -05:00
Robert Miller
c619c20878
bug: add a missing parentheses 2021-11-26 21:29:53 -05:00
Robert Miller
3088055606
bug: add a missing parentheses 2021-11-26 21:29:20 -05:00
Luke Van Seters
018fb8c73b Run hourly 2021-11-26 21:07:06 -05:00
Luke Van Seters
9a076a6b4c Don't run prices by default 2021-11-26 21:07:06 -05:00
Luke Van Seters
391314b9d6 Limit successful history instead of ttl 2021-11-26 21:07:06 -05:00
Luke Van Seters
c83577b04c Remove restart 2021-11-26 21:07:06 -05:00
Luke Van Seters
34aca861cc Use poetry directly instead of entrypoint script 2021-11-26 21:07:06 -05:00
Luke Van Seters
a8c1728e35 Save progress 2021-11-26 21:07:06 -05:00
Luke Van Seters
26caaa04e1
Merge pull request #134 from flashbots/prices
Add support for fetching prices from coinbase and storing
2021-11-26 21:06:48 -05:00
Luke Van Seters
4f34316afb COINBASE_TOKEN_NAMES => COINBASE_TOKEN_NAME_BY_ADDRESS 2021-11-26 21:03:57 -05:00
Robert Miller
868094696a
Merge branch 'main' into punk_accept_bids_database 2021-11-26 19:07:42 -05:00
Robert Miller
90f822a15f
Merge branch 'main' into punk_bid_database 2021-11-26 19:07:16 -05:00
Robert Miller
56f0bbb855
Merge branch 'main' into punk-database-work 2021-11-26 19:02:10 -05:00
Gui Heise
4304776af6
Merge pull request #143 from flashbots/0x-v2
Add support for 0x orderbook
2021-11-26 18:14:41 -05:00
Robert Miller
07aa6e3089 feat: add punk_bid_acceptances database 2021-11-26 15:42:36 -05:00
Robert Miller
71c549b6f3 feat: add punk_bids database 2021-11-26 15:33:07 -05:00
Robert Miller
7bfe77a18f bug: fix punk_snipe alembic file 2021-11-26 15:22:45 -05:00
Robert Miller
947e5921c7 feat: add alembic for punk snipes 2021-11-26 15:10:37 -05:00
Robert Miller
8144d406b3
Merge pull request #138 from flashbots/cryptopunks-classifer 2021-11-26 12:00:35 -05:00
Luke Van Seters
2dc2c89b0b
Merge pull request #146 from flashbots/block-timestamp-timestamp
Convert block_timestmap from numeric to timestamp
2021-11-26 11:18:30 -05:00
Luke Van Seters
051ef74eb7 Convert block_timestmap from numeric to timestamp 2021-11-26 11:02:02 -05:00
Robert Miller
2cc7ac4a20 feat: initial files for punk database 2021-11-25 21:05:42 -05:00
Robert Miller
b4097baa68 feat: remove unused punk_snipe import 2021-11-25 19:35:22 -05:00
Robert Miller
7638c97e88 =feat: change punk snipe to only check against the highest bid per punk 2021-11-25 19:32:30 -05:00
Robert Miller
bb3ace07a1 =move punk classifiers out of classifer.py 2021-11-25 16:48:48 -05:00
Robert Miller
976ac9ea77 style: change punk_bid.amount to price 2021-11-25 16:04:52 -05:00
Robert Miller
3314056c88 revert change to mev 2021-11-25 12:23:46 -05:00
Gui Heise
44e357344e Remove test assertion 2021-11-24 13:54:39 -05:00
Gui Heise
9f860c118e Remove validation step 2021-11-24 12:23:32 -05:00
Gui Heise
8a555ea442 Move helpers into 0x file 2021-11-24 12:14:40 -05:00
Gui Heise
7656c0d76c Remove children swaps 2021-11-23 14:34:26 -05:00
Gui Heise
c334441e95 Add assertion and move constants up 2021-11-23 11:28:15 -05:00
Gui Heise
d7872db45c Restructure classifier 2021-11-23 11:15:03 -05:00
Gui Heise
d75e9b76ab Add constants and exceptions 2021-11-23 10:38:02 -05:00
Gui Heise
4c643a2d9f Add tests for 0x swaps 2021-11-23 09:32:18 -05:00
Gui Heise
2d62ca25d6 Add function signatures 2021-11-22 19:06:58 -05:00
Gui Heise
e29c4fad72 Add support for any taker 2021-11-22 15:09:20 -05:00
Gui Heise
2f1a9bc751 Add helper for token_in_amount 2021-11-22 12:35:23 -05:00
Gui Heise
f650d3e87f Make protocol zero_ex 2021-11-22 12:23:14 -05:00
Gui Heise
32aa3246bf Remove debugger 2021-11-22 12:23:14 -05:00
Gui Heise
dbe40249b5 Add Rfq/Limit distinction 2021-11-22 12:23:14 -05:00
Gui Heise
cf71272c10 Add 0x swap classifier 2021-11-22 12:23:14 -05:00
Gui Heise
8428dd9908
Merge pull request #141 from flashbots/classifier-helpers
Add classifier helpers
2021-11-22 12:22:38 -05:00
Gui Heise
89c2ed3a84 Remove func 2021-11-22 12:16:39 -05:00
Gui Heise
784922fa07 Rename to helpers, add func 2021-11-22 12:07:30 -05:00
Gui Heise
9bf7a2675c
Merge pull request #140 from flashbots/swapmodel
Add contract_address to SwapModel
2021-11-22 11:28:26 -05:00
Gui Heise
dc02564862 Add contract_address 2021-11-22 10:55:00 -05:00
Gui Heise
4f2c65e535
Merge pull request #137 from flashbots/swap-contract-address
Swap contract address
2021-11-21 22:14:17 -05:00
Gui Heise
94269cad33
Merge pull request #139 from flashbots/mev-bash
Change shell directory
2021-11-20 10:24:23 -05:00
Gui Heise
d2e1c588c4 Change shell directory 2021-11-19 19:21:46 -05:00
Robert Miller
377137d9c8 feat: add support for punk snipes 2021-11-19 17:18:29 -06:00
Robert Miller
f31430da30 bug: update uint to uin256 2021-11-19 17:17:34 -06:00
Gui Heise
12a82e918b Add contract_address in arbs 2021-11-19 11:03:06 -05:00
Gui Heise
45c9980a79 Add contract_address to tests 2021-11-19 11:00:14 -05:00
Gui Heise
8c699ed7cc Alter schema 2021-11-19 10:59:08 -05:00
Gui Heise
a9859a0b12 Add database migration 2021-11-19 10:58:35 -05:00
Gui Heise
07e1680301
Merge pull request #130 from flashbots/swaps-classifiers
Implement swap classifiers
2021-11-19 09:58:39 -05:00
Luke Van Seters
bf4570c8a3
Merge pull request #136 from flashbots/transfers-trace-address-array
Change transfers trace_address to ARRAY
2021-11-19 08:43:59 -05:00
Luke Van Seters
5f9bd3a274 Change transfers trace address to ARRAY 2021-11-19 08:42:08 -05:00
Luke Van Seters
ec860c7357
Merge pull request #135 from flashbots/prices-table-2
Add prices table
2021-11-18 17:33:27 -05:00
Luke Van Seters
f5233a17fd Rename to prices table 2021-11-18 13:56:07 -05:00
Luke Van Seters
7d50d3d674 Rename to prices table 2021-11-18 13:55:38 -05:00
Luke Van Seters
023205c25b Print => logger 2021-11-18 13:47:59 -05:00
Luke Van Seters
d499983f32 Remove fetch-latest for now 2021-11-18 13:45:25 -05:00
Luke Van Seters
5b59427d4f Write prices. Ignore duplicates 2021-11-18 13:43:21 -05:00
Gui Heise
386eccaeb7 Remove abstract method 2021-11-18 12:58:45 -05:00
Gui Heise
ca0014533a Add getter method for Uni recipient address 2021-11-18 12:52:48 -05:00
Gui Heise
c5621e0676 space 2021-11-18 12:23:09 -05:00
Gui Heise
1e1241cbf5 Remove Uni none checks and bash change 2021-11-18 12:22:13 -05:00
Luke Van Seters
bed8520bc8 Write prices on fetch-all 2021-11-18 11:55:42 -05:00
Luke Van Seters
5a3dbca425 Create usd_prices table 2021-11-18 11:55:03 -05:00
Luke Van Seters
2dc14218bf Add support for fetching all supported prices 2021-11-18 11:43:59 -05:00
Luke Van Seters
053c29cf20 Add placeholder for price commands 2021-11-18 11:43:59 -05:00
Gui Heise
6e25031623 Rename utils.py to swaps.py 2021-11-18 11:38:09 -05:00
Luke Van Seters
5756cb15a5
Merge pull request #128 from elopio/typo/clasifier
Fix typo: clasifier
2021-11-18 11:34:25 -05:00
Luke Van Seters
36101c36db
Merge pull request #132 from flashbots/timestamp-support
Add support for writing timestamps in mev-inspect
2021-11-18 10:39:15 -05:00
Luke Van Seters
d7238c0e83
Merge pull request #131 from flashbots/add-block-timestamps-table
Add block timestamps table
2021-11-18 10:39:10 -05:00
Luke Van Seters
0d4cbc76b6
Merge pull request #129 from flashbots/fix-logging-base
Only set base logging from entry points
2021-11-18 10:39:05 -05:00
Robert Miller
1de1570939 feat: change to "punk bid acceptance" and get punk bid acceptances 2021-11-17 21:51:56 -05:00
Luke Van Seters
d2437055d9 Fix tests 2021-11-17 15:19:48 -05:00
Luke Van Seters
5aa8776b0d Don't attempt to create block if timestamp is null 2021-11-17 15:14:24 -05:00
Luke Van Seters
a2dc8908df Save block during inspection 2021-11-17 15:11:26 -05:00
Luke Van Seters
ad45abbe9c Add crud for blocks 2021-11-17 15:07:04 -05:00
Luke Van Seters
460f449127 Add block timestamps table 2021-11-17 14:37:57 -05:00
Luke Van Seters
caf645e923 Fetch timestamp when creating blocks 2021-11-17 13:28:48 -05:00
Gui Heise
ff9337eb4b Fix UniV3 Classifier 2021-11-17 10:19:10 -05:00
Gui Heise
94c5691f01 Move swap logic into classifiers 2021-11-17 07:37:25 -05:00
Robert Miller
96d2171daa style: improve schema naming bcuz imagine complained 2021-11-16 19:57:58 -05:00
Robert Miller
0d6215f82e wip feat: getting punk bids / accepts 2021-11-15 21:08:28 -05:00
Robert Miller
5766abb9fe feat: add punk classifiers 2021-11-15 21:08:07 -05:00
Robert Miller
c5ab2be4e3 add punk classifications 2021-11-15 21:07:38 -05:00
Luke Van Seters
f705a85b5c Only set base logging from entrypoints 2021-11-15 16:00:18 -05:00
Gui Heise
f43df8ffa4 Fix circular imports 2021-11-15 13:28:34 -05:00
Gui Heise
29cd82cd0b Parse swap logic inside uniswap classifier 2021-11-15 11:00:39 -05:00
Luke Van Seters
dec628b7a9
Merge pull request #124 from flashbots/listener-healthcheck
Ping healthcheck URL on each inspect in listener
2021-11-12 19:02:42 -05:00
Luke Van Seters
ec49c03484
Merge pull request #123 from flashbots/listener-async
Support asyncio in listener
2021-11-12 19:02:33 -05:00
Luke Van Seters
d34356bffb
Merge pull request #118 from flashbots/classified-traces-block-index
Change classified_traces and miner_payments primary keys to begin with block number
2021-11-12 19:02:24 -05:00
Luke Van Seters
e144e377fd
Merge pull request #117 from flashbots/swap-block-index
Reindex swaps by block number
2021-11-12 14:58:43 -05:00
Leo Arias
cfeaaae046 Fix typo: clasifier 2021-11-11 17:55:12 +00:00
Gui Heise
5d03c1fbfa Add classifier specs to init 2021-11-11 10:39:24 -05:00
Robert Miller
af2aab4940 add cryptopunks trace classifier 2021-11-10 20:14:42 -05:00
Luke Van Seters
63e81b22e6 Ping healthcheck url on each successful block inspect 2021-11-09 18:21:51 -05:00
Luke Van Seters
7b60488f76 Support async for listener 2021-11-09 11:51:43 -05:00
Luke Van Seters
e0d6919039 Pass DB session into the inspector 2021-11-09 10:49:08 -05:00
Luke Van Seters
c94b2523c1
Merge pull request #121 from flashbots/listener-fix
Fix mev listener
2021-11-09 09:25:05 -05:00
Luke Van Seters
91ff886ecf Fix listener command 2021-11-08 12:09:19 -05:00
Luke Van Seters
fd1deae50d
Merge pull request #119 from flashbots/add-pokt-readme
Add Pokt recommendation to the README
2021-11-04 12:44:45 -04:00
Luke Van Seters
6c4409be75 free => hosted 2021-11-03 16:21:39 -04:00
Luke Van Seters
66a4089790 Add pokt recommendation to the readme 2021-11-03 16:20:38 -04:00
Luke Van Seters
45a536cd15 Change miner payments and transfers tables to begin with block number 2021-11-03 12:47:56 -04:00
Luke Van Seters
674565f789 Change classified traces primary key to include block number 2021-11-02 18:40:43 -04:00
Luke Van Seters
c38d77504e Reindex swaps by block number 2021-11-02 17:29:45 -04:00
Luke Van Seters
b5a9bed2d4
Merge pull request #116 from flashbots/cache-field-remove
Remove unused cache field
2021-11-02 15:59:13 -04:00
Luke Van Seters
d4a0541391 Add mev exec to execute a command on the inspect pod 2021-11-02 14:31:47 -04:00
Luke Van Seters
e9d71f62bf Remove unused cache field 2021-11-02 14:27:07 -04:00
Luke Van Seters
c436c6480e
Merge pull request #109 from carlomazzaferro/asyncio-backfilling
asyncio-based backfilling
2021-11-01 12:04:43 -04:00
carlomazzaferro
0cb62e4c55
Merge remote-tracking branch 'upstream/main' into asyncio-backfilling 2021-10-30 00:16:20 +01:00
carlomazzaferro
a6bf834e76
address PR comments 2021-10-30 00:15:23 +01:00
Gui Heise
a1b001b2cf
Merge pull request #114 from flashbots/ETHTransferLiquidations
Add support for ETH-Transfer liquidations
2021-10-28 15:51:01 +01:00
Gui Heise
c6c0cb5511 Remove optional 2021-10-28 15:48:20 +01:00
carlomazzaferro
36111abf69
Use inspector class -- remove global Semaphore and improve error handling 2021-10-28 11:33:33 +01:00
carlomazzaferro
f6719cdfc8
merge commit 2021-10-28 11:15:51 +01:00
carlomazzaferro
c3475bbd8f
Use inspector class -- remove global Semaphore and improve error handling 2021-10-28 11:04:24 +01:00
Gui Heise
e25448a9f4 Eth constant 2021-10-28 00:31:39 +01:00
Gui Heise
1ee62bc96b Remove unused elif 2021-10-28 00:20:28 +01:00
Gui Heise
6e9d9b943a Fix transfer conditional 2021-10-28 00:16:39 +01:00
Gui Heise
cf0926fef0 Add ETH_ADDRESS and check against it 2021-10-28 00:15:57 +01:00
Gui Heise
a93f4abf95 Fix tests 2021-10-27 23:55:07 +01:00
Gui Heise
c4dac40bad Delete 2021-10-27 23:53:16 +01:00
Gui Heise
afc4eb4289 Delete 2021-10-27 23:52:38 +01:00
Gui Heise
c0f4da04d8 Test 2021-10-27 23:47:08 +01:00
Luke Van Seters
3521567884
Merge pull request #111 from flashbots/backfill-template-env-2
Add trace DB environment variables to the backfill helm chart
2021-10-27 15:36:33 -04:00
Luke Van Seters
afce3ce9ba
Merge pull request #108 from flashbots/backfill-cleanup
Move traces to `traces.py` files
2021-10-27 15:36:16 -04:00
Gui Heise
06615bec95
Merge pull request #113 from flashbots/fetch-block
Add fetch-block command
2021-10-27 18:01:27 +01:00
Gui Heise
a8fbacb7f0 Add block json 2021-10-27 17:27:32 +01:00
Luke Van Seters
30df035d12
Merge pull request #110 from flashbots/readme-mev-2
Update README to use `./mev` commands
2021-10-27 11:21:37 -04:00
Gui Heise
6834dba8fa Add mev command 2021-10-27 15:53:44 +01:00
Gui Heise
f57d8e5be5 Add fetch-block command 2021-10-27 15:47:59 +01:00
Luke Van Seters
132b79ee91
Merge pull request #78 from emlazzarin/arb-fix-4
Adjust arbitrage calculation logic
2021-10-26 12:19:50 -04:00
Gui Heise
7bb65a336e Fix ETH transfer liquidations 2021-10-26 14:09:35 +01:00
Gui Heise
8822ebcf55 ETH transfer WIP 2021-10-25 15:34:46 +01:00
sragss
e29d8bb310 Merge branch 'main' into arb-fix-4 2021-10-22 14:01:08 -07:00
carlomazzaferro
e15eef49c1
async based middleware, better logging and async requests 2021-10-22 13:58:00 +01:00
Gui Heise
ceebea30e3 Add ETH transfer logic 2021-10-22 12:37:06 +01:00
Luke Van Seters
58ab655d89 Specify trace DB credentials in the backfill helm chart 2021-10-21 16:24:43 -04:00
Luke Van Seters
576fe04eb0
Update README.md 2021-10-21 15:54:45 -04:00
Luke Van Seters
18c42a872f Add support for tailing listener logs and inspecting many from the mev command 2021-10-21 15:54:11 -04:00
Luke Van Seters
5897781db8
Update README to add a backfill section 2021-10-21 15:47:24 -04:00
Luke Van Seters
619ed51e49
Update README to use ./mev commands 2021-10-21 15:36:01 -04:00
Luke Van Seters
f523935a79
Merge pull request #100 from elopio/readme
Prettify the README
2021-10-21 15:32:05 -04:00
carlomazzaferro
4f20c540e6
asyncio-based concurrent backfilling 2021-10-20 17:12:21 +01:00
Luke Van Seters
4894f57f13 Add back transaction hash in classified traces 2021-10-19 18:11:33 -04:00
Luke Van Seters
8c6f984b0a transaction hash is optional 2021-10-19 18:01:31 -04:00
Luke Van Seters
d38e027bfa Remove duplicate fields on classified trace 2021-10-19 13:21:39 -04:00
Luke Van Seters
01a27f84c0 Rename classified_traces file to traces. Move Trace to traces 2021-10-19 13:20:01 -04:00
Leo Arias
60f1a651bb Apply review feedback 2021-10-19 16:47:53 +00:00
Leo Arias
ad4acfa043 Add maintainers 2021-10-19 16:37:42 +00:00
Leo Arias
a4c21b765d Fix the issues link 2021-10-19 16:34:14 +00:00
Leo Arias
c36e2445af Link the badge to discord 2021-10-19 16:34:14 +00:00
Leo Arias
53a1afd5f7 Improve format 2021-10-19 16:34:14 +00:00
Leo Arias
f3687c9102 Fix typo 2021-10-19 16:32:39 +00:00
Leo Arias
8e42bede10 Prettify the README. 2021-10-19 16:32:39 +00:00
Luke Van Seters
a5e4a2d1d4
Merge pull request #106 from flashbots/traces-db-access
Use the trace DB for cached blocks
2021-10-18 13:06:54 -04:00
Luke Van Seters
4ae59b8e28 Back to original 2021-10-18 12:58:57 -04:00
Sam Ragsdale
d952287b2d Adjust arbitrage path creation to not depend on pool_address, adjust tests accordingly 2021-10-18 09:29:58 -07:00
Luke Van Seters
063b8764a8 Write classified traces 2021-10-18 12:01:26 -04:00
Luke Van Seters
68232f4161 Fetch all of the block from the DB if possible 2021-10-18 11:33:14 -04:00
Luke Van Seters
a786b74f4a Update listener 2021-10-18 10:47:41 -04:00
Luke Van Seters
dbc7c5d4ae Add credentials to tiltfile. Find traces using traces db 2021-10-18 10:47:41 -04:00
Luke Van Seters
ee5a4905e6 Weave trace db through to fetch 2021-10-18 10:47:41 -04:00
Luke Van Seters
1993f0a14d Add trace DB session 2021-10-18 10:47:41 -04:00
Luke Van Seters
2935df284d Add optional environment variables for TRACE_DB_* 2021-10-18 10:47:41 -04:00
Luke Van Seters
db1b31c0dc
Merge pull request #103 from flashbots/backfill-main
Add support for parallelized backfills
2021-10-18 10:46:57 -04:00
Luke Van Seters
03e42ee007
Merge pull request #105 from flashbots/faster-inspect
Small adjustments to speed up inspection
2021-10-18 10:46:35 -04:00
Luke Van Seters
f3b85dc1df Run black 2021-10-16 21:44:07 -04:00
Luke Van Seters
e22d947c1f Bring back black 2021-10-16 19:51:11 -04:00
Luke Van Seters
7f017777d6 Add some small optimizations. Skip compound liquidations for now 2021-10-16 19:50:54 -04:00
Luke Van Seters
8d9f860346 Add some small optimizations. Skip compound liquidations for now 2021-10-16 19:49:38 -04:00
Luke Van Seters
3934004ed4 Turn off black. not working 2021-10-16 19:48:31 -04:00
Taarush Vemulapalli
90afc1b905
Support for Cream markets + tests (#104)
* Support for Cream markets + test

* fixes for WETH/underlying_markets

* has_liquidations helper
2021-10-16 10:13:39 -07:00
Gui Heise
c6c45b4ab0
Merge pull request #89 from flashbots/aTokens
Add received_token_address to liquidations
2021-10-15 11:49:14 -04:00
Gui Heise
1354de8d4a Remove print statement 2021-10-15 10:11:45 -04:00
Gui Heise
54b0e93b10 Edit payback function type hint 2021-10-15 10:09:44 -04:00
Luke Van Seters
a9263d6008
Merge pull request #102 from flashbots/helm-2
Migrate our kube to helm
2021-10-14 17:42:24 -04:00
Luke Van Seters
af75fbc35a Add backfill scripts 2021-10-14 17:41:57 -04:00
Gui Heise
b06b3bc733 Fix classifier 2021-10-14 17:23:55 -04:00
Luke Van Seters
1818aafbd7 Make job names random 2021-10-14 17:04:32 -04:00
Gui Heise
e11f5b6741 Resolve merge conflicts 2021-10-14 16:47:09 -04:00
Luke Van Seters
2de620ea4e Add a chart for backfill jobs 2021-10-14 16:16:35 -04:00
Luke Van Seters
9e41f316cb Add replica count back. Remove tag from image 2021-10-13 18:13:46 -04:00
Luke Van Seters
1d3543c982 Update README for new helm name 2021-10-13 17:25:11 -04:00
Luke Van Seters
24951891ca
Merge pull request #101 from flashbots/mev-backfill
Add `mev` command for easy inspect use
2021-10-13 17:24:07 -04:00
Luke Van Seters
369e956db6 Remove old app.yaml 2021-10-13 17:23:38 -04:00
Luke Van Seters
8f8dd11af3 Remove chart 2021-10-13 17:20:10 -04:00
Luke Van Seters
3c40faf310 Move to helm charts 2021-10-13 17:08:54 -04:00
Luke Van Seters
2e921f2685 Add db command. Update Tiltfile / app to store DB host in secrets 2021-10-13 14:55:53 -04:00
Luke Van Seters
0ddb0104af Add some echos. Add backfill 2021-10-13 14:21:30 -04:00
Luke Van Seters
7d66bce9ee Add mev script 2021-10-13 14:14:10 -04:00
Luke Van Seters
561f8c3450 new script who dis 2021-10-13 14:07:50 -04:00
Gui Heise
588b41333d Add Aave transfer classifiers 2021-10-13 12:57:14 -04:00
Gui Heise
7d9d9af120 aToken test 2021-10-13 12:01:39 -04:00
Gui Heise
cf7836896b Aave transfers 2021-10-13 12:01:39 -04:00
Gui Heise
fc5ccc9b9b Aave transfers 2021-10-13 11:58:37 -04:00
Gui Heise
f5b4e87c4c Fixed ABI 2021-10-13 11:55:28 -04:00
Gui Heise
1c786357a4 Minor fixes 2021-10-13 11:55:28 -04:00
Gui Heise
a75bc69366 aTokens classifier spec 2021-10-13 11:55:28 -04:00
Gui Heise
d422b88bba aTokens ABI 2021-10-13 11:55:28 -04:00
Gui Heise
a1bdb3b9b8 Remove test 2021-10-13 11:55:28 -04:00
Gui Heise
003106194f Removed broken test 2021-10-13 11:55:28 -04:00
Gui Heise
a704ab2fe3 aToken blocks 2021-10-13 11:55:28 -04:00
Gui Heise
4c889f813c Payback function output order 2021-10-13 11:53:47 -04:00
Gui Heise
4c203da24e Adjust payback function name and migration drop to column 2021-10-13 11:53:47 -04:00
Gui Heise
ccd17c5585 Adjust liquidator payback logic 2021-10-13 11:53:47 -04:00
Gui Heise
b997d0fbd1 Remove nullable 2021-10-13 11:53:47 -04:00
Gui Heise
aa5a72b189 Add received_token_address to liquidation object 2021-10-13 11:53:47 -04:00
Gui Heise
f84d192053 Adjust migration to add column 2021-10-13 11:53:47 -04:00
Gui Heise
45b1790f75 Add migration 2021-10-13 11:53:47 -04:00
Gui Heise
a38b9d2ce2 Change return type 2021-10-13 11:53:47 -04:00
Gui Heise
dbcb26d2ca Add aToken lookup logic 2021-10-13 11:53:47 -04:00
Luke Van Seters
e785dd0b25
Merge pull request #96 from flashbots/eth-transfers-eeee
Remove ETH / ERC20 transfer distinction
2021-10-13 10:24:36 -04:00
Taarush Vemulapalli
ed83b49091
Compound
* compound v2 + tests
2021-10-13 07:19:52 -07:00
Gui Heise
9758005a80 aToken test 2021-10-13 01:07:31 -04:00
Gui Heise
b2de07407c Aave transfers 2021-10-13 00:53:14 -04:00
Gui Heise
5e111dd5b2 Aave transfers 2021-10-13 00:38:05 -04:00
Luke Van Seters
2a852746fe Only get eth transfer if only transfering eth 2021-10-12 20:17:50 -04:00
Luke Van Seters
3950a9c809 Handle ETH transfers in swaps 2021-10-12 20:13:59 -04:00
Luke Van Seters
6de8f494c4 get => build 2021-10-12 20:13:59 -04:00
Luke Van Seters
9df6dfdf5b Build => get 2021-10-12 20:13:59 -04:00
Luke Van Seters
378f5b248e Remove ETH / ERC20 transfer distinction 2021-10-12 20:13:59 -04:00
Gui Heise
e5506c1bf6 Fixed ABI 2021-10-12 17:11:05 -04:00
Gui Heise
c68f2c87e3 Minor fixes 2021-10-12 16:42:35 -04:00
Gui Heise
d2a91775de aTokens classifier spec 2021-10-12 14:19:07 -04:00
Gui Heise
afd65aaac0 aTokens ABI 2021-10-12 14:06:11 -04:00
Gui Heise
e77fa51db0 Remove test 2021-10-12 13:37:49 -04:00
Gui Heise
fd5cbce43e Removed broken test 2021-10-12 13:35:37 -04:00
Gui Heise
025d5b9d2b aToken blocks 2021-10-12 13:34:08 -04:00
Robert Miller
f7fbd97a50
Merge pull request #99 from flashbots/curve-swaps-2
Add support for Curve swaps
2021-10-12 13:26:19 -04:00
Luke Van Seters
e3b360ec39 Fix swap tests 2021-10-12 12:23:47 -04:00
Luke Van Seters
547b51df92 Add swap support for curve 2021-10-12 12:19:24 -04:00
Luke Van Seters
0c4f605229 Write protocol for swaps 2021-10-12 12:19:09 -04:00
Luke Van Seters
1c1b80721c
Merge pull request #94 from elopio/issue/tuple
decode: collapse tuples
2021-10-12 08:19:34 -07:00
Luke Van Seters
ed463ad979
Merge pull request #98 from flashbots/lag-block-listener
Lag the block listener 5 blocks
2021-10-12 08:18:59 -07:00
Luke Van Seters
d76bb52016 Lag the block listener 5 blocks 2021-10-11 16:00:58 -07:00
Luke Van Seters
b5f625112e
Merge pull request #97 from flashbots/pre-commit-pr-fix
Add --all-files to pre-commit GH action
2021-10-11 15:53:10 -07:00
Luke Van Seters
b8ff6f0e8b run --all-files 2021-10-11 15:48:36 -07:00
Luke Van Seters
2377222750 Add --all-files to pre-commit GH action 2021-10-11 15:45:55 -04:00
Leo Arias
ba73f58396 Run precommit 2021-10-11 17:51:38 +00:00
Leo Arias
a67769cea3 Run precommit 2021-10-11 17:31:23 +00:00
Leo Arias
4e5ad64929 decode: collapse tuples 2021-10-11 01:49:37 +00:00
Luke Van Seters
b6fc27b3f6 rename get_transfers => get_erc20_transfers 2021-10-08 12:24:43 -04:00
Luke Van Seters
afcff7c845
Merge pull request #92 from flashbots/swaps-classifier
Use SwapClassifier for decoding Swap objects
2021-10-08 11:47:27 -04:00
Luke Van Seters
a1fd035de8 Update tests 2021-10-08 11:37:12 -04:00
Luke Van Seters
3039f3eed2 Use SwapClassifier for Swap objects 2021-10-08 11:37:12 -04:00
Luke Van Seters
8c6d7ab889
Merge pull request #90 from flashbots/specs-v2
Group classifying a trace as a `transfer` with the logic to decode a `Transfer` object
2021-10-08 11:36:39 -04:00
Luke Van Seters
e3eb858ed9 Fail at runtime if not implemented 2021-10-06 16:43:04 -04:00
Luke Van Seters
058cbeed94 Fix tests for decoded call trace 2021-10-06 16:00:17 -04:00
Luke Van Seters
f1379cc0a0 Switch to class instead of instance 2021-10-06 15:56:28 -04:00
Luke Van Seters
02c9c1cddc Add transfer parsing to transfer classifiers 2021-10-06 15:28:50 -04:00
Luke Van Seters
86ee26dd1a Make Classifier a union 2021-10-06 15:14:24 -04:00
Luke Van Seters
d57a2d021d Add specific classifiers for each type 2021-10-06 15:12:44 -04:00
Luke Van Seters
621a2798c8 No burn 2021-10-06 14:55:00 -04:00
Luke Van Seters
d2c397f212 Change classifications => classifiers 2021-10-06 14:53:38 -04:00
Gui Heise
f8f8c488d7 Payback function output order 2021-10-06 13:33:43 -04:00
Gui Heise
67c31883c3 Adjust payback function name and migration drop to column 2021-10-06 02:19:13 -04:00
Gui Heise
d7be215bb9 Adjust liquidator payback logic 2021-10-05 16:27:50 -04:00
Luke Van Seters
8a94eeaf39 Add .envrc to gitignore 2021-10-05 12:43:43 -04:00
Gui Heise
5274619081 Remove nullable 2021-10-05 12:29:23 -04:00
Gui Heise
ad19ce913f Add received_token_address to liquidation object 2021-10-05 12:23:06 -04:00
Luke Van Seters
3c761d85f8
Merge pull request #88 from elopio/typos/readme
Fix typos in README
2021-10-05 10:22:49 -04:00
Leo Arias
e75a2919cd Fix typos in README 2021-10-05 04:38:48 +00:00
Gui Heise
66e36a6407 Adjust migration to add column 2021-10-04 18:59:58 -04:00
Gui Heise
fa20c2e650 Add migration 2021-10-04 16:21:06 -04:00
Gui Heise
4ac4b2c601 Change return type 2021-10-04 15:40:40 -04:00
Gui Heise
6bd1e1905b Add aToken lookup logic 2021-10-04 09:25:25 -04:00
Gui Heise
7dbbd9f545
Merge pull request #86 from flashbots/liquidation-models
Add liquidation models
2021-10-01 18:06:13 -04:00
Gui Heise
77b17cab94 Make trace address a primary key 2021-10-01 17:59:11 -04:00
Luke Van Seters
f9c3431854
Merge pull request #84 from flashbots/aave-db
Add database migration for liquidations
2021-10-01 13:01:09 -04:00
Robert Miller
4834d068f6
Merge pull request #74 from flashbots/aave-v0
Add AAVE liquidations to inspect_block
2021-09-30 13:58:37 -04:00
Gui Heise
eb720dee16 Remove print 2021-09-30 11:48:05 -04:00
Gui Heise
4dbcb59b4d Add trace address to liquidations 2021-09-30 11:48:05 -04:00
Gui Heise
1560ee9a99 Add liquidations model/crud 2021-09-30 11:48:05 -04:00
Gui Heise
cac1b13ac7 Database migration for liquidations 2021-09-30 11:46:55 -04:00
Gui Heise
cc41cbe1ef
Update liquidation_test.py 2021-09-30 11:09:27 -04:00
Gui Heise
d54ab01046
Update inspect_block.py 2021-09-30 11:05:32 -04:00
Gui Heise
a86fa44717
Update aave_liquidations.py 2021-09-30 11:04:10 -04:00
Luke Van Seters
e6f5ece46f
Update README.md 2021-09-29 12:40:49 -04:00
Gui Heise
7dbf4a9e0e Database migration for liquidations 2021-09-29 10:24:46 -04:00
Gui Heise
eb9edc914e Names 2021-09-29 09:43:06 -04:00
Gui Heise
f48d373cf3 Function naming 2021-09-29 09:43:06 -04:00
Gui Heise
d348490ce5 index 2021-09-29 09:43:06 -04:00
Gui Heise
35f12ed4a8 Naming 2021-09-29 09:43:06 -04:00
Gui Heise
3047d207cc Add assertion function 2021-09-29 09:43:06 -04:00
Gui Heise
db6feab697 Removed dir 2021-09-29 09:43:06 -04:00
Gui Heise
54fb7713a0 Change tests from unittest to pytest 2021-09-29 09:43:06 -04:00
Gui Heise
e135830b5d Multiple liquidations tests 2021-09-29 09:43:06 -04:00
Gui Heise
07763e0e3c Load blocks from cache in tests 2021-09-29 09:43:06 -04:00
Luke Van Seters
a3bcc7e3bb Add tests 2021-09-29 09:43:06 -04:00
Gui Heise
356735dc5f Export order and function updates 2021-09-29 09:43:06 -04:00
Gui Heise
536c01c7f9 Remove comments and prints 2021-09-29 09:43:06 -04:00
Gui Heise
0382618724 Add transfers and simplify children 2021-09-29 09:43:06 -04:00
Gui Heise
0288c339d1 Remove liquidation data 2021-09-29 09:43:06 -04:00
Gui Heise
887d8c0a6a Function for child liquidation check 2021-09-29 09:43:06 -04:00
Gui Heise
052e1f6c8d Parent liquidations type 2021-09-29 09:43:06 -04:00
Gui Heise
882af3e42f Remove .remove() and add unique parent trace check 2021-09-29 09:43:06 -04:00
Gui Heise
bdcaaa9bf7 Turned received amount logic to function 2021-09-29 09:43:06 -04:00
Gui Heise
36e90f295f Updated find_liquidator_payback to bool 2021-09-29 09:43:06 -04:00
Gui Heise
e57f754bfe Cleanup 2021-09-29 09:43:06 -04:00
Gui Heise
5b8072b271 Simplify logic for liquidator payback 2021-09-29 09:43:06 -04:00
Gui Heise
b215a1d9b2 Remove try, update transfer_to keys, add child trace parsing and removal 2021-09-29 09:43:06 -04:00
Gui Heise
8b5d1327a8 Remove unused and try 2021-09-29 09:43:06 -04:00
Gui Heise
aedd6696b4 Cleanup 2021-09-29 09:43:06 -04:00
Gui Heise
8385bb676b Add received amount calculations and update functions 2021-09-29 09:43:06 -04:00
Gui Heise
faa8d09312 Transfer function cleanup 2021-09-29 09:43:05 -04:00
Gui Heise
02959e68da Add collateral amount initial logic using transfer functions 2021-09-29 09:43:05 -04:00
Gui Heise
e7b3bb4ac7 Redefine transfer functions and add liquidator 2021-09-29 09:43:05 -04:00
Gui Heise
0a770511a4 Add event inputs 2021-09-29 09:43:05 -04:00
Gui Heise
2f9dbeae08 Improve legibility 2021-09-29 09:41:59 -04:00
Gui Heise
0cc259220d Simplify liquidation data logic 2021-09-29 09:41:59 -04:00
Gui Heise
5149840a76 Removed junk data / narrow to inspect_block 2021-09-29 09:41:59 -04:00
Gui Heise
ddce8bfb8a Resolve merge conflicts 2021-09-29 09:41:59 -04:00
Gui Heise
d52ad4b74c Add return type and resolving merge conflict 2021-09-29 09:41:59 -04:00
Gui Heise
8f79843f3f Add types to lists 2021-09-29 09:41:59 -04:00
Gui Heise
bf3ca0f529 Tighten PR 2021-09-29 09:41:59 -04:00
Gui Heise
356e8f6c86 Remove unused logic/ minor fixes 2021-09-29 09:41:59 -04:00
Gui Heise
173d16c2bc Add docstrings to new functions 2021-09-29 09:41:59 -04:00
Gui Heise
2ce4badf65 Assignment of transfer lists 2021-09-29 09:41:59 -04:00
Gui Heise
4bba2f793a Add logic to functions and introduce transfer logic 2021-09-29 09:41:59 -04:00
Gui Heise
a1d06ce114 Remove unused imports, improve variable names 2021-09-29 09:41:59 -04:00
Gui Heise
563935d5b4 Added Aave liquidation to inspect_block 2021-09-29 09:41:59 -04:00
Gui Heise
8f51f4e87c Resolve merge conflicts 2021-09-29 09:41:59 -04:00
Gui Heise
f272f11c81 Add return type and resolving merge conflict 2021-09-29 09:41:59 -04:00
Gui Heise
8d1242f760 Add types to lists 2021-09-29 09:41:59 -04:00
Gui Heise
e93a78b8ce Tighten PR 2021-09-29 09:41:59 -04:00
Gui Heise
7f93466b35 Remove unused logic/ minor fixes 2021-09-29 09:41:59 -04:00
Gui Heise
fededa9cad Add docstrings to new functions 2021-09-29 09:41:59 -04:00
Gui Heise
7dea90d5c7 Assignment of transfer lists 2021-09-29 09:41:59 -04:00
Gui Heise
c1328e312f Add logic to functions and introduce transfer logic 2021-09-29 09:41:59 -04:00
Gui Heise
82a6c72f6a Remove unused imports, improve variable names 2021-09-29 09:41:59 -04:00
Gui Heise
91428d491c Added Aave liquidation to inspect_block 2021-09-29 09:41:59 -04:00
Luke Van Seters
8f0b295956
Merge pull request #80 from flashbots/log-to-stdout
Log to stdout for the CLI
2021-09-29 09:00:26 -04:00
Luke Van Seters
9f1e6c12fa
Merge pull request #82 from wardbradt/patch-1
fix broken link
2021-09-28 22:54:25 -04:00
Ward Bradt
f0526c1012
fix broken link 2021-09-28 22:30:17 -04:00
Luke Van Seters
ebc161aa51
Merge pull request #81 from flashbots/readme-updates
Update README to include helm, setup for kind, and postgres info
2021-09-23 13:42:45 -04:00
Luke Van Seters
f2ce697175 Add instructions on connecting to Postgres 2021-09-23 12:04:27 -04:00
Luke Van Seters
58a7409568 Add Docker to README 2021-09-23 11:47:21 -04:00
Luke Van Seters
e56458c908 Add kubectl to README 2021-09-23 11:46:33 -04:00
Luke Van Seters
3bba682c58 Remove platform from Tiltfile 2021-09-23 11:42:34 -04:00
Luke Van Seters
54cd815514 Update README with install helm and create kind cluster 2021-09-23 11:42:21 -04:00
Luke Van Seters
9c170a3f00 Log to stdout for the CLI 2021-09-22 10:40:06 -04:00
Robert Miller
0f23046733
Merge pull request #76 from flashbots/readme-kube-4
Update README to use Kubernetes setup - Remove docker compose
2021-09-20 17:21:00 -05:00
Robert Miller
e5e4f6ef1b
Merge pull request #75 from flashbots/dont-start-run-by-default
Don't start block listener on container start
2021-09-20 17:19:36 -05:00
Luke Van Seters
18e45ee437 Lower weth address 2021-09-20 13:28:16 -04:00
Luke Van Seters
747dc5dfe1 Remove some old config files 2021-09-20 13:25:21 -04:00
Luke Van Seters
576f7dc507 Don't ignore tests. Run tests in the container 2021-09-20 13:22:04 -04:00
Luke Van Seters
86fdeddfaa Not used 2021-09-20 13:21:24 -04:00
Luke Van Seters
00c97ffe72 Better instructions for switching from docker compose 2021-09-20 13:15:18 -04:00
Luke Van Seters
7b036cc620 Add stars 2021-09-20 13:12:35 -04:00
Luke Van Seters
0afc1494f1 Add tests to README 2021-09-20 13:11:13 -04:00
Luke Van Seters
52679cd3cc Update actions to call tests directly 2021-09-20 13:09:14 -04:00
Luke Van Seters
37cf615c75 Update README for more interesting blocks 2021-09-20 13:06:33 -04:00
Luke Van Seters
4d5c8977c1 Remove docker compose and old poetry scripts 2021-09-20 13:04:37 -04:00
Luke Van Seters
036228036d Bold parity 2021-09-20 12:57:28 -04:00
Luke Van Seters
663a97e84f Fix spacing 2021-09-20 12:56:17 -04:00
Luke Van Seters
be9ae86d5c Update README for Kube 2021-09-20 12:54:08 -04:00
Luke Van Seters
5682c2ce4e Break out block listener to a start / stop daemon instead of running on startup 2021-09-20 12:31:27 -04:00
Robert Miller
5756a7c405
Merge pull request #72 from flashbots/write-transfers-2
Write transfers to transfers table on inspect
2021-09-20 10:34:07 -05:00
Robert Miller
1027a3ecbc
Merge pull request #73 from emlazzarin/main
Balancer v1 classifier / abis
2021-09-20 10:23:09 -05:00
Luke Van Seters
f5ce06b008 Fix tests 2021-09-20 11:22:38 -04:00
Luke Van Seters
8686166276 Add writing transfers to inspect block 2021-09-20 11:22:37 -04:00
Luke Van Seters
2b7c8532f2 Add crud and models to write transfers to the DB 2021-09-20 11:20:56 -04:00
Luke Van Seters
d37bf8f6e2 Add block number to transfers schema 2021-09-20 11:20:56 -04:00
Robert Miller
f395e9758f
Merge pull request #69 from flashbots/tilt-deps-2
Make mev-inspect require that the DB is already up for local dev
2021-09-20 10:19:48 -05:00
Luke Van Seters
516664e6ab Rebase 2021-09-20 11:11:44 -04:00
Robert Miller
3ff4af2970
Merge pull request #71 from flashbots/write-transfers
Add DB migration to create transfers table
2021-09-20 10:08:49 -05:00
Robert Miller
f7ffbfadb1
Merge pull request #66 from flashbots/inspect-listener
Poll for incoming blocks and auto-inspect them
2021-09-20 10:07:31 -05:00
Sam Ragsdale
ed63b6bb38 Balancer v1 classifier / abis 2021-09-17 15:13:21 -07:00
Luke Van Seters
266a66be03 Add DB migration to create transfers table 2021-09-17 09:29:16 -04:00
Luke Van Seters
b8280f8464 Sleep first to get newest block after sleep 2021-09-17 09:20:17 -04:00
Luke Van Seters
a9cbe106ad Use a dedicated table for the last block written. Write new blocks as they come 2021-09-17 09:20:17 -04:00
Luke Van Seters
50d04a0b42 Use last written miner payment block as max written block 2021-09-17 09:20:17 -04:00
Luke Van Seters
e6793ee053 Add configmap for RPC. Print latest block on loop 2021-09-17 09:20:17 -04:00
Luke Van Seters
0db24349fd print => logging 2021-09-17 09:20:17 -04:00
Luke Van Seters
7a53816d74 Add auto-restart of process. Add GracefulKiller 2021-09-17 09:20:17 -04:00
Luke Van Seters
e92c36d30a Move DB session out 2021-09-17 09:20:17 -04:00
Luke Van Seters
66c22682e8 Get empty list default for miner payments 2021-09-17 09:20:17 -04:00
Luke Van Seters
768de19b60 Move inspect block logic into mev_inspect module from script 2021-09-17 09:20:17 -04:00
Luke Van Seters
e365a2c0c0 Move inspect block logic into mev_inspect module from script 2021-09-17 09:20:17 -04:00
Luke Van Seters
4993bbc8e0 Create cache dir if not exists 2021-09-17 09:20:17 -04:00
Luke Van Seters
bff71b01c3
Merge pull request #70 from flashbots/call-trace-db
Add stricter types to trace classifier
2021-09-16 14:55:28 -06:00
Robert Miller
aed8310cb1
Merge pull request #68 from flashbots/add-block-number-index-again
Add an index on block number for miner payments
2021-09-16 12:43:50 -05:00
Robert Miller
c51d907655
Merge pull request #65 from flashbots/backoff-retry
Add middleware to retry with backoff
2021-09-16 12:42:52 -05:00
Gui Heise
1b0e05ec2f syntax 2021-09-16 11:58:36 -04:00
Gui Heise
fbb0ebaffe Syntax fix 2021-09-16 11:57:16 -04:00
Gui Heise
7e7bd5bc07 Removed action and subtraces arguments from helpers 2021-09-16 11:36:35 -04:00
Gui Heise
230a07f47d Simplified serialization 2021-09-16 11:28:18 -04:00
Gui Heise
cc9f3e993d Fixed helpers default attrs 2021-09-16 11:20:28 -04:00
Gui Heise
034b72c463 Updated comments 2021-09-16 11:11:05 -04:00
Gui Heise
71b7c99c17 Added pydantic serialization, naming changes 2021-09-16 11:08:16 -04:00
Gui Heise
170ab07e2f
Delete get_helm.sh 2021-09-16 10:18:40 -04:00
Gui Heise
f204620fea Fixed inputs serialization 2021-09-14 14:32:48 -04:00
Luke Van Seters
bf79c7e0be Add an index on block number for miner payments 2021-09-14 12:27:17 -04:00
Gui Heise
3795336fd8 WIP: Fixed db writing (no inputs) 2021-09-13 16:58:50 -04:00
Gui Heise
8281d123ab WIP: Fix DB writing 2021-09-13 15:28:51 -04:00
Luke Van Seters
e7d918f514 Add backoff to http retry middleware 2021-09-11 14:35:09 -06:00
Luke Van Seters
b2d2c7dbeb Switch CMD to python loop. Make host an environment variable 2021-09-11 09:50:33 -06:00
Gui Heise
fe6cd4dcdb Added type changes to tests 2021-09-09 14:18:27 -04:00
245 changed files with 38322 additions and 2315 deletions

1
.dockerignore Normal file
View File

@ -0,0 +1 @@
cache

8
.env
View File

@ -1,8 +0,0 @@
# Postgres
POSTGRES_SERVER=db
POSTGRES_USER=postgres
POSTGRES_PASSWORD=password
POSTGRES_DB=mev_inspect
# SQLAlchemy
SQLALCHEMY_DATABASE_URI=postgresql://$POSTGRES_USER:$POSTGRES_PASSWORD@$POSTGRES_SERVER/$POSTGRES_DB

View File

@ -21,7 +21,7 @@ jobs:
- name: Bootstrap poetry
shell: bash
run: |
curl -sL https://raw.githubusercontent.com/python-poetry/poetry/master/install-poetry.py \
curl -sSL https://install.python-poetry.org \
| python - -y
- name: Update PATH
@ -51,8 +51,8 @@ jobs:
- name: Run precommit
run: |
poetry run pre-commit
poetry run pre-commit run --all-files
- name: Test with pytest
shell: bash
run: poetry run test
run: poetry run pytest --cov=mev_inspect tests

9
.gitignore vendored
View File

@ -19,3 +19,12 @@ cache
# k8s
.helm
# env
.envrc
# pycharm
.idea
.env
.python-version

View File

@ -1,9 +1,16 @@
repos:
- repo: https://github.com/ambv/black
rev: 20.8b1
rev: 22.3.0
hooks:
- id: black
language_version: python3.9
- id: black
language_version: python3.9
- repo: local
hooks:
- id: isort
name: isort
entry: poetry run isort .
language: system
types: [python]
- repo: local
hooks:
- id: pylint
@ -13,8 +20,9 @@ repos:
language: system
types: [python]
- repo: https://github.com/pre-commit/mirrors-mypy
rev: v0.910
rev: v0.942
hooks:
- id: 'mypy'
additional_dependencies:
- 'pydantic'
- 'types-requests'

View File

@ -433,7 +433,7 @@ int-import-graph=
known-standard-library=
# Force import order to recognize a module as part of a third party library.
known-third-party=enchant
known-third-party=alembic
# Couples of modules and preferred modules, separated by a comma.
preferred-modules=

36
CONTRIBUTING.md Normal file
View File

@ -0,0 +1,36 @@
# Contributing guide
Welcome to the Flashbots collective! We just ask you to be nice when you play with us.
## Pre-commit
We use pre-commit to maintain a consistent style, prevent errors, and ensure test coverage.
To set up, install dependencies through `poetry`:
```
poetry install
```
Then install pre-commit hooks with:
```
poetry run pre-commit install
```
## Tests
Run tests with:
```
./mev test
```
## Send a pull request
- Your proposed changes should be first described and discussed in an issue.
- Open the branch in a personal fork, not in the team repository.
- Every pull request should be small and represent a single change. If the problem is complicated, split it into multiple issues and pull requests.
- Every pull request should be covered by unit tests.
We appreciate you, friend <3.

View File

@ -1,21 +1,29 @@
FROM python:3.9
FROM python:3.9-slim-buster
RUN pip install -U pip \
ENV POETRY_VERSION=1.1.12
RUN useradd --create-home flashbot \
&& apt-get update \
&& curl -sSL https://raw.githubusercontent.com/python-poetry/poetry/master/get-poetry.py | python -
&& apt-get install -y --no-install-recommends build-essential libffi-dev libpq-dev gcc procps \
&& pip install poetry==$POETRY_VERSION \
&& apt-get clean \
&& rm -rf /var/lib/apt/lists/*
ENV PATH="${PATH}:/root/.poetry/bin"
ENV PATH="${PATH}:/home/flashbot/.local/bin"
COPY ./pyproject.toml /app/pyproject.toml
COPY ./poetry.lock /app/poetry.lock
COPY --chown=flashbot ./pyproject.toml /app/pyproject.toml
COPY --chown=flashbot ./poetry.lock /app/poetry.lock
WORKDIR /app/
RUN poetry config virtualenvs.create false && \
poetry install
USER flashbot
COPY . /app
RUN poetry config virtualenvs.create false \
&& poetry install
COPY --chown=flashbot . /app
# easter eggs 😝
RUN echo "PS1='🕵️:\[\033[1;36m\]\h \[\033[1;34m\]\W\[\033[0;35m\]\[\033[1;36m\]$ \[\033[0m\]'" >> ~/.bashrc
CMD ["/bin/bash"]
ENTRYPOINT [ "poetry" ]
CMD [ "run", "python", "loop.py" ]

21
LICENSE Normal file
View File

@ -0,0 +1,21 @@
MIT License
Copyright (c) 2023 Flashbots
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

126
MONOLITHIC.md Normal file
View File

@ -0,0 +1,126 @@
# Running mev-inspect-py without kubernetes ('monolithic mode')
Running mev-inspect-py outside of kubernetes can be useful for debugging. In this case, the installation steps are:
1. Install dependencies (pyenv, poetry, postgres)
1. Set up python virtual environment using matching python version (3.9.x) and install required python modules using poetry
1. Create postgres database
1. Run database migrations
The database credentials and archive node address used by mev-inspect-py need to be loaded into environment variables (both for database migrations and to run mev-inspect-py).
## Ubuntu install instructions
Starting from a clean Ubuntu 22.04 installation, the prerequisites for pyenv and psycopg2 (python3-dev, libpq-dev) can be installed with
`sudo apt install -y make build-essential git libssl-dev zlib1g-dev libbz2-dev libreadline-dev libsqlite3-dev wget curl llvm libncurses5-dev libncursesw5-dev xz-utils tk-dev liblzma-dev python3-dev libpq-dev`
### pyenv
Install pyenv using the web installer
`curl https://pyenv.run | bash`
and add the following to `~/.bashrc` (if running locally) or `~/.profile` (if running over ssh).
```
export PYENV_ROOT="$HOME/.pyenv"
command -v pyenv >/dev/null || export PATH="$PYENV_ROOT/bin:$PATH"
eval "$(pyenv init -)"
```
Then update the current shell by running `source ~/.bashrc` or `source ~/.profile` as appropriate.
### Poetry
Install Poetry using the web installer
`curl -sSL https://install.python-poetry.org | python3 -`
add the following to `~/.bashrc` (if running locally) or `~/.profile` (if running over ssh)
`export PATH="/home/user/.local/bin:$PATH"`
If running over ssh you should also add the following to `~/.profile` to prevent [Poetry errors](https://github.com/python-poetry/poetry/issues/1917) from a lack of active keyring:
`export PYTHON_KEYRING_BACKEND=keyring.backends.null.Keyring`
Again update current shell by running `source ~/.bashrc` or `source ~/.profile` as appropriate.
### postgres
We have tested two alternatives for postgres - installing locally or as a container.
#### Option 1: Installing locally
To install locally from a clean Ubuntu 22.04 installation, run:
`sudo apt install postgresql postgresql-contrib`
Note: You may need to reconfigure your pg_hba.conf to allow local access.
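A minimal sketch of that change, assuming the default Ubuntu 22.04 layout for PostgreSQL 14 (adjust the version directory to match your install):
```
# confirm the file location first if unsure:
sudo -u postgres psql -c 'SHOW hba_file;'
# edit the "local   all   postgres   peer" line to use md5 (password auth) instead of peer
sudo nano /etc/postgresql/14/main/pg_hba.conf
# then restart so the change takes effect
sudo systemctl restart postgresql
```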
#### Option 2: Installing docker
To avoid interfering with your local postgres instance, you may prefer to run postgres within a docker container.
For docker installation instructions, please refer to https://docs.docker.com/engine/install/ubuntu/
### mev-inspect-py
With all dependencies now installed, clone the mev-inspect-py repo
```
git clone https://github.com/flashbots/mev-inspect-py.git
cd mev-inspect-py
```
We now install the required python version and use Poetry to install the required python modules into a virtual environment.
```
pyenv install 3.9.16
pyenv local 3.9.16
poetry env use 3.9.16
poetry install
```
### Create database
mev-inspect-py outputs to a postgres database, so we need to set this up. There are various ways of doing this; two options are presented here.
#### Option 1 — Run postgres locally
```
sudo -u postgres psql
\password
postgres
create database mev_inspect;
\q
```
#### Option 2 — Use postgres docker image
To avoid interfering with your local postgres instance, you may prefer to run postgres within a docker container. First make sure that postgres is not currently running so that port `5432` is available:
`sudo systemctl stop postgresql`
and then start a containerised postgres instance:
`sudo docker run -d -p 5432:5432 -e POSTGRES_USER=postgres -e POSTGRES_PASSWORD=postgres -e POSTGRES_DB=mev_inspect postgres`
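To check that the containerised instance is accepting connections, one option (assuming the `psql` client is installed on the host) is:
```
# the password is "postgres", matching the container environment above
psql -h localhost -p 5432 -U postgres -d mev_inspect -c '\conninfo'
```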
### Environment variables
We will need to set a few environment variables to use mev-inspect-py. **These will be required every time mev-inspect-py runs**, so again you may wish to add these to your `~/.bashrc` and/or `~/.profile` as appropriate. Note that you need to substitute the correct URL for your archive node below if you are not running Erigon locally.
```
export POSTGRES_USER=postgres
export POSTGRES_PASSWORD=postgres
export POSTGRES_HOST=localhost
export RPC_URL="http://127.0.0.1:8545"
```
### Database migrations
Finally run the database migrations and fetch price information:
```
poetry run alembic upgrade head
poetry run fetch-all-prices
```
## Usage instructions
The same functionality available through kubernetes can be run in 'monolithic mode', but the relevant functions now need to be invoked by Poetry directly. For example, to inspect a single block, run:
`poetry run inspect-block 16379706`
Or to inspect a range of blocks:
`poetry run inspect-many-blocks 16379606 16379706`
Or to run the test suite:
`poetry run pytest tests`
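If you find yourself re-exporting the same variables for every run, a small wrapper script is one option. This is only a sketch built from the commands above; the script name and the variable values are placeholders for your own setup:
```
#!/usr/bin/env bash
# inspect.sh (hypothetical helper): export the required environment and inspect one block
export POSTGRES_USER=postgres
export POSTGRES_PASSWORD=postgres
export POSTGRES_HOST=localhost
export RPC_URL="http://127.0.0.1:8545"   # substitute your archive node URL
poetry run inspect-block "${1:?usage: ./inspect.sh <block-number>}"
```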

358
README.md
View File

@ -1,104 +1,306 @@
# mev-inspect
A [WIP] Ethereum MEV Inspector in Python managed by Poetry
⚠️ This tool has been deprecated. You can visit [Flashbots Data](https://datasets.flashbots.net/) for historical mev-inspect data on Ethereum and join us on the [Flashbots forum](https://collective.flashbots.net). ⚠️
## Containers
mev-inspect's local setup is built on [Docker Compose](https://docs.docker.com/compose/)
# mev-inspect-py
By default it starts up:
- `mev-inspect` - a container with the code in this repo used for running scripts
- `db` - a postgres database instance
- `pgadmin` - a postgres DB UI for querying and more (available at localhost:5050)
[![standard-readme compliant](https://img.shields.io/badge/readme%20style-standard-brightgreen.svg?style=flat-square)](https://github.com/RichardLitt/standard-readme)
[![Discord](https://img.shields.io/discord/755466764501909692)](https://discord.gg/7hvTycdNcK)
## Running locally
Setup [Docker](https://www.docker.com/products/docker-desktop)
Setup [Poetry](https://python-poetry.org/docs/#osx--linux--bashonwindows-install-instructions)
[Maximal extractable value](https://ethereum.org/en/developers/docs/mev/) inspector for Ethereum, to illuminate the [dark forest](https://www.paradigm.xyz/2020/08/ethereum-is-a-dark-forest/) 🌲💡
Given a block, mev-inspect finds:
- miner payments (gas + coinbase)
- token transfers and profit
- swaps and [arbitrages](https://twitter.com/bertcmiller/status/1427632028263059462)
- ...and more
Data is stored in Postgres for analysis.
## Install
mev-inspect-py is built to run on kubernetes locally and in production.
### Dependencies
- [docker](https://www.docker.com/products/docker-desktop)
- [kind](https://kind.sigs.k8s.io/docs/user/quick-start), or a similar tool for running local Kubernetes clusters
- [kubectl](https://kubernetes.io/docs/tasks/tools/)
- [helm](https://helm.sh/docs/intro/install/)
- [tilt](https://docs.tilt.dev/install.html)
### Set up
Create a new cluster with:
Install dependencies through poetry
```
poetry install
kind create cluster
```
Start the services (optionally as daemon)
```
poetry run start [-d]
```
Set an environment variable `RPC_URL` to an RPC for fetching blocks.
Apply the latest migrations against the local DB:
```
poetry run exec alembic upgrade head
```
mev-inspect-py currently requires a node with support for Erigon traces and receipts (not geth yet 😔).
Run inspect on a block
```
poetry run inspect -b/--block-number 11931270 -r/--rpc 'http://111.11.11.111:8545/'
```
[pokt.network](https://www.pokt.network/)'s "Ethereum Mainnet Archival with trace calls" is a good hosted option.
To stop the services (if running in the background, otherwise just ctrl+c)
```
poetry run stop
```
Example:
MEV container can be attached via
```
poetry run attach
```
Running additional compose commands is possible through standard `docker
compose ...` calls. Check `docker compose help` for more tools available
## Executing scripts
Any script can be run from the mev-inspect container like
```
poetry run exec <your command here>
```
For example
```
poetry run exec python examples/uniswap_inspect.py -block_number=123 -rpc='111.111.111'
```
### Poetry Scripts
```bash
# code check
poetry run lint # linting via Pylint
poetry run test # testing and code coverage with Pytest
poetry run isort # fixing imports
poetry run mypy # type checking
poetry run black # style guide
poetry run pre-commit run --all-files # runs Black, PyLint and MyPy
# docker management
poetry run start [-d] # starts all services, optionally as a daemon
poetry run stop # shuts down all services or just ctrl + c if foreground
poetry run build # rebuilds containers
poetry run attach # enters the mev-inspect container in interactive mode
# launches inspection script
poetry run inspect -b/--block-number 11931270 -r/--rpc 'http://111.11.11.111:8545/'
export RPC_URL="http://111.111.111.111:8546"
```
## Rebuilding containers
After changes to the app's Dockerfile, rebuild with
Next, start all services with:
```
poetry run build
tilt up
```
## Using PGAdmin
Press "space" to see a browser of the services starting up.
1. Go to [localhost:5050](localhost:5050)
On first startup, you'll need to apply database migrations with:
2. Login with the PGAdmin username and password in `.env`
```
./mev exec alembic upgrade head
```
3. Add a new engine for mev_inspect with
- host: db
- user / password: see `.env`
And load prices data
```
./mev prices fetch-all
```
## Monolithic (non-kubernetes) install instructions
For an alternative means of running mev-inspect-py for smaller set-ups or debug purposes see the [monolithic install instructions](MONOLITHIC.md).
## Usage
### Inspect a single block
Inspecting block [12914944](https://twitter.com/mevalphaleak/status/1420416437575901185):
```
./mev inspect 12914944
```
### Inspect many blocks
Inspecting blocks 12914944 to 12914954:
```
./mev inspect-many 12914944 12914954
```
### Inspect all incoming blocks
Start a block listener with:
```
./mev listener start
```
By default, it will pick up wherever you left off.
If running for the first time, the listener starts at the latest block.
Tail logs for the listener with:
```
./mev listener tail
```
And stop the listener with:
```
./mev listener stop
```
### Backfilling
For larger backfills, you can inspect many blocks in parallel
To inspect blocks 12914944 to 12915044, run
```
./mev backfill 12914944 12915044
```
This queues the blocks in Redis to be pulled off by the mev-inspect-worker service
To increase or decrease parallelism, update the replicaCount value for the mev-inspect-workers helm chart
Locally, this can be done by editing Tiltfile and changing "replicaCount=1" to your desired parallelism:
```
k8s_yaml(helm(
'./k8s/mev-inspect-workers',
name='mev-inspect-workers',
set=["replicaCount=1"],
))
```
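On a cluster managed with helm directly rather than tilt, the same knob can be set with something along these lines (the chart path and release name here are assumed from the Tiltfile snippet above):
```
helm upgrade --install mev-inspect-workers ./k8s/mev-inspect-workers --set replicaCount=8
```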
You can see worker pods spin up then complete by watching the status of all pods
```
watch kubectl get pods
```
To see progress and failed batches, connect to Redis with
```
./mev redis
```
For total messages, query:
```
HLEN dramatiq:default.msgs
```
For messages failed and waiting to retry in the delay queue (DQ), query:
```
HGETALL dramatiq:default.DQ.msgs
```
For messages permanently failed in the dead letter queue (XQ), query:
```
HGETALL dramatiq:default.XQ.msgs
```
To clear the queue, delete keys for the main queue and delay queue
```
DEL dramatiq:default.msgs
DEL dramatiq:default.DQ.msgs
```
For more information on queues, see the [spec shared by dramatiq](https://github.com/Bogdanp/dramatiq/blob/24cbc0dc551797783f41b08ea461e1b5d23a4058/dramatiq/brokers/redis/dispatch.lua#L24-L43)
**Backfilling a list of blocks**
Create a file with one block number per row, for example blocks.txt containing:
```
12500000
12500001
12500002
```
Then queue the blocks with
```
cat blocks.txt | ./mev block-list
```
To watch the logs for a given worker pod, take its pod name using the above, then run:
```
kubectl logs -f pod/mev-inspect-worker-abcdefg
```
(where `mev-inspect-worker-abcdefg` is your actual pod name)
### Exploring
All inspect output data is stored in Postgres.
To connect to the local Postgres database for querying, launch a client container with:
```
./mev db
```
When you see the prompt:
```
mev_inspect=#
```
You're ready to query!
Try finding the total number of swaps decoded with UniswapV3Pool:
```
SELECT COUNT(*) FROM swaps WHERE abi_name='UniswapV3Pool';
```
or top 10 arbs by gross profit that took profit in WETH:
```
SELECT *
FROM arbitrages
WHERE profit_token_address = '0xc02aaa39b223fe8d0a0e5c4f27ead9083c756cc2'
ORDER BY profit_amount DESC
LIMIT 10;
```
Postgres tip: Enter `\x` to toggle "Expanded display" mode, which looks nicer for results with many columns.
## FAQ
### How do I delete / reset my local postgres data?
Stop the system if running:
```
tilt down
```
Delete it with:
```
kubectl delete pvc data-postgresql-postgresql-0
```
Start back up again:
```
tilt up
```
And rerun migrations to create the tables again:
```
./mev exec alembic upgrade head
```
### I was using the docker-compose setup and want to switch to kube, now what?
Re-add the old `docker-compose.yml` file to your mev-inspect-py directory.
A copy can be found [here](https://github.com/flashbots/mev-inspect-py/blob/ef60c097719629a7d2dc56c6e6c9a100fb706f76/docker-compose.yml)
Tear down docker-compose resources:
```
docker compose down
```
Then go through the steps in the current README for kube setup.
### Error from server (AlreadyExists): pods "postgres-client" already exists
This means the postgres client container didn't shut down correctly.
Delete this one with:
```
kubectl delete pod/postgres-client
```
Then start it back up again.
## Maintainers
- [@lukevs](https://github.com/lukevs)
- [@gheise](https://github.com/gheise)
- [@bertmiller](https://github.com/bertmiller)
## Contributing
Pre-commit is used to maintain a consistent style, prevent errors and ensure test coverage.
[Flashbots](https://flashbots.net) is a research and development collective working on mitigating the negative externalities of decentralized economies. We contribute with the larger free software community to illuminate the dark forest.
Install pre-commit with:
```
poetry run pre-commit install
```
You are welcome here <3.
Update README if needed
- If you want to join us, come and say hi in our [Discord chat](https://discord.gg/7hvTycdNcK).
- If you have a question, feedback or a bug report for this project, please [open a new Issue](https://github.com/flashbots/mev-inspect-py/issues).
- If you would like to contribute with code, check the [CONTRIBUTING file](CONTRIBUTING.md).
- We just ask you to be nice.
## Security
If you find a security vulnerability on this project or any other initiative related to Flashbots, please let us know sending an email to security@flashbots.net.
---
Made with ☀️ by the ⚡🤖 collective.

117
Tiltfile
View File

@ -1,22 +1,119 @@
load('ext://helm_remote', 'helm_remote')
load("ext://helm_remote", "helm_remote")
load("ext://secret", "secret_from_dict")
load("ext://configmap", "configmap_from_dict")
helm_remote("postgresql",
repo_name='bitnami',
repo_url='https://charts.bitnami.com/bitnami',
set=["postgresqlPassword=password", "postgresqlDatabase=mev_inspect"],
repo_name="bitnami",
repo_url="https://charts.bitnami.com/bitnami",
set=["auth.postgresPassword=password", "auth.database=mev_inspect"],
)
load('ext://secret', 'secret_from_dict')
helm_remote("redis",
repo_name="bitnami",
repo_url="https://charts.bitnami.com/bitnami",
set=["global.redis.password=password"],
)
k8s_yaml(configmap_from_dict("mev-inspect-rpc", inputs = {
"url" : os.environ["RPC_URL"],
}))
k8s_yaml(configmap_from_dict("mev-inspect-listener-healthcheck", inputs = {
"url" : os.getenv("LISTENER_HEALTHCHECK_URL", default=""),
}))
k8s_yaml(secret_from_dict("mev-inspect-db-credentials", inputs = {
"username" : "postgres",
"password": "password",
"host": "postgresql",
}))
docker_build('mev-inspect', '.',
# if using https://github.com/taarushv/trace-db
# k8s_yaml(secret_from_dict("trace-db-credentials", inputs = {
# "username" : "username",
# "password": "password",
# "host": "trace-db-postgresql",
# }))
docker_build("mev-inspect-py", ".",
live_update=[
sync('.', '/app'),
run('cd /app && poetry install',
trigger='./pyproject.toml'),
sync(".", "/app"),
run("cd /app && poetry install",
trigger="./pyproject.toml"),
],
)
k8s_yaml("k8s/app.yaml")
k8s_yaml(helm(
'./k8s/mev-inspect',
name='mev-inspect',
set=[
"extraEnv[0].name=AWS_ACCESS_KEY_ID",
"extraEnv[0].value=foobar",
"extraEnv[1].name=AWS_SECRET_ACCESS_KEY",
"extraEnv[1].value=foobar",
"extraEnv[2].name=AWS_REGION",
"extraEnv[2].value=us-east-1",
"extraEnv[3].name=AWS_ENDPOINT_URL",
"extraEnv[3].value=http://localstack:4566",
],
))
k8s_yaml(helm(
'./k8s/mev-inspect-workers',
name='mev-inspect-workers',
set=[
"extraEnv[0].name=AWS_ACCESS_KEY_ID",
"extraEnv[0].value=foobar",
"extraEnv[1].name=AWS_SECRET_ACCESS_KEY",
"extraEnv[1].value=foobar",
"extraEnv[2].name=AWS_REGION",
"extraEnv[2].value=us-east-1",
"extraEnv[3].name=AWS_ENDPOINT_URL",
"extraEnv[3].value=http://localstack:4566",
"replicaCount=1",
],
))
k8s_resource(
workload="mev-inspect",
resource_deps=["postgresql", "redis-master"],
)
k8s_resource(
workload="mev-inspect-workers",
resource_deps=["postgresql", "redis-master"],
)
# uncomment to enable price monitor
# k8s_yaml(helm('./k8s/mev-inspect-prices', name='mev-inspect-prices'))
# k8s_resource(workload="mev-inspect-prices", resource_deps=["postgresql"])
local_resource(
'pg-port-forward',
serve_cmd='kubectl port-forward --namespace default svc/postgresql 5432:5432',
resource_deps=["postgresql"]
)
# if using local S3 exports
#k8s_yaml(secret_from_dict("mev-inspect-export", inputs = {
# "export-bucket-name" : "local-export",
# "export-bucket-region": "us-east-1",
# "export-aws-access-key-id": "foobar",
# "export-aws-secret-access-key": "foobar",
#}))
#helm_remote(
# "localstack",
# repo_name="localstack-charts",
# repo_url="https://localstack.github.io/helm-charts",
#)
#
#local_resource(
# 'localstack-port-forward',
# serve_cmd='kubectl port-forward --namespace default svc/localstack 4566:4566',
# resource_deps=["localstack"]
#)
#
#k8s_yaml(configmap_from_dict("mev-inspect-export", inputs = {
# "services": "s3",
#}))

View File

@ -1,16 +1,14 @@
from logging.config import fileConfig
from sqlalchemy import engine_from_config
from sqlalchemy import pool
from alembic import context
from sqlalchemy import engine_from_config, pool
from mev_inspect.db import get_sqlalchemy_database_uri
from mev_inspect.db import get_inspect_database_uri
# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config
config.set_main_option("sqlalchemy.url", get_sqlalchemy_database_uri())
config.set_main_option("sqlalchemy.url", get_inspect_database_uri())
# Interpret the config file for Python logging.
# This line sets up loggers basically.

View File

@ -0,0 +1,54 @@
"""Change miner payments and transfers primary keys to include block number
Revision ID: 04a3bb3740c3
Revises: a10d68643476
Create Date: 2021-11-02 22:42:01.702538
"""
from alembic import op
# revision identifiers, used by Alembic.
revision = "04a3bb3740c3"
down_revision = "a10d68643476"
branch_labels = None
depends_on = None
def upgrade():
# transfers
op.execute("ALTER TABLE transfers DROP CONSTRAINT transfers_pkey")
op.create_primary_key(
"transfers_pkey",
"transfers",
["block_number", "transaction_hash", "trace_address"],
)
op.drop_index("ix_transfers_block_number")
# miner_payments
op.execute("ALTER TABLE miner_payments DROP CONSTRAINT miner_payments_pkey")
op.create_primary_key(
"miner_payments_pkey",
"miner_payments",
["block_number", "transaction_hash"],
)
op.drop_index("ix_block_number")
def downgrade():
# transfers
op.execute("ALTER TABLE transfers DROP CONSTRAINT transfers_pkey")
op.create_index("ix_transfers_block_number", "transfers", ["block_number"])
op.create_primary_key(
"transfers_pkey",
"transfers",
["transaction_hash", "trace_address"],
)
# miner_payments
op.execute("ALTER TABLE miner_payments DROP CONSTRAINT miner_payments_pkey")
op.create_index("ix_block_number", "miner_payments", ["block_number"])
op.create_primary_key(
"miner_payments_pkey",
"miner_payments",
["transaction_hash"],
)

View File

@ -0,0 +1,35 @@
"""Change blocks.timestamp to timestamp
Revision ID: 04b76ab1d2af
Revises: 2c90b2b8a80b
Create Date: 2021-11-26 15:31:21.111693
"""
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision = "04b76ab1d2af"
down_revision = "0cef835f7b36"
branch_labels = None
depends_on = None
def upgrade():
op.alter_column(
"blocks",
"block_timestamp",
type_=sa.TIMESTAMP,
nullable=False,
postgresql_using="TO_TIMESTAMP(block_timestamp)",
)
def downgrade():
op.alter_column(
"blocks",
"block_timestamp",
type_=sa.Numeric,
nullable=False,
postgresql_using="extract(epoch FROM block_timestamp)",
)

View File

@ -0,0 +1,34 @@
"""empty message
Revision ID: 070819d86587
Revises: d498bdb0a641
Create Date: 2021-11-26 18:25:13.402822
"""
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision = "d498bdb0a641"
down_revision = "b9fa1ecc9929"
branch_labels = None
depends_on = None
def upgrade():
op.create_table(
"punk_snipes",
sa.Column("created_at", sa.TIMESTAMP, server_default=sa.func.now()),
sa.Column("block_number", sa.Numeric, nullable=False),
sa.Column("transaction_hash", sa.String(66), nullable=False),
sa.Column("trace_address", sa.String(256), nullable=False),
sa.Column("from_address", sa.String(256), nullable=False),
sa.Column("punk_index", sa.Numeric, nullable=False),
sa.Column("min_acceptance_price", sa.Numeric, nullable=False),
sa.Column("acceptance_price", sa.Numeric, nullable=False),
sa.PrimaryKeyConstraint("block_number", "transaction_hash", "trace_address"),
)
def downgrade():
op.drop_table("punk_snipes")

View File

@ -8,7 +8,6 @@ Create Date: 2021-08-30 17:42:25.548130
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision = "083978d6e455"
down_revision = "92f28a2b4f52"

View File

@ -0,0 +1,26 @@
"""Rename pool_address to contract_address
Revision ID: 0cef835f7b36
Revises: 5427d62a2cc0
Create Date: 2021-11-19 15:36:15.152622
"""
from alembic import op
# revision identifiers, used by Alembic.
revision = "0cef835f7b36"
down_revision = "5427d62a2cc0"
branch_labels = None
depends_on = None
def upgrade():
op.alter_column(
"swaps", "pool_address", nullable=False, new_column_name="contract_address"
)
def downgrade():
op.alter_column(
"swaps", "contract_address", nullable=False, new_column_name="pool_address"
)

View File

@ -0,0 +1,28 @@
"""Add nullable transaction_position field to swaps and traces
Revision ID: 15ba9c27ee8a
Revises: 04b76ab1d2af
Create Date: 2021-12-02 18:24:18.218880
"""
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision = "15ba9c27ee8a"
down_revision = "ead7eb8283b9"
branch_labels = None
depends_on = None
def upgrade():
op.add_column(
"classified_traces",
sa.Column("transaction_position", sa.Numeric, nullable=True),
)
op.add_column("swaps", sa.Column("transaction_position", sa.Numeric, nullable=True))
def downgrade():
op.drop_column("classified_traces", "transaction_position")
op.drop_column("swaps", "transaction_position")

View File

@ -0,0 +1,26 @@
"""Add received_collateral_address to liquidations
Revision ID: 205ce02374b3
Revises: c8363617aa07
Create Date: 2021-10-04 19:52:40.017084
"""
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision = "205ce02374b3"
down_revision = "c8363617aa07"
branch_labels = None
depends_on = None
def upgrade():
op.add_column(
"liquidations",
sa.Column("received_token_address", sa.String(256), nullable=True),
)
def downgrade():
op.drop_column("liquidations", "received_token_address")

View File

@ -0,0 +1,28 @@
"""Add blocks table
Revision ID: 2c90b2b8a80b
Revises: 04a3bb3740c3
Create Date: 2021-11-17 18:29:13.065944
"""
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision = "2c90b2b8a80b"
down_revision = "04a3bb3740c3"
branch_labels = None
depends_on = None
def upgrade():
op.create_table(
"blocks",
sa.Column("block_number", sa.Numeric, nullable=False),
sa.Column("block_timestamp", sa.Numeric, nullable=False),
sa.PrimaryKeyConstraint("block_number"),
)
def downgrade():
op.drop_table("blocks")

View File

@ -0,0 +1,22 @@
"""Add index on block_number for miner_payments
Revision ID: 320e56b0a99f
Revises: a02f3f2c469f
Create Date: 2021-09-14 11:11:41.559137
"""
from alembic import op
# revision identifiers, used by Alembic.
revision = "320e56b0a99f"
down_revision = "a02f3f2c469f"
branch_labels = None
depends_on = None
def upgrade():
op.create_index("ix_block_number", "miner_payments", ["block_number"])
def downgrade():
op.drop_index("ix_block_number", "miner_payments")

View File

@ -0,0 +1,45 @@
"""Change swap primary key to include block number
Revision ID: 3417f49d97b3
Revises: 205ce02374b3
Create Date: 2021-11-02 20:50:32.854996
"""
from alembic import op
# revision identifiers, used by Alembic.
revision = "3417f49d97b3"
down_revision = "205ce02374b3"
branch_labels = None
depends_on = None
def upgrade():
op.execute("ALTER TABLE swaps DROP CONSTRAINT swaps_pkey CASCADE")
op.create_primary_key(
"swaps_pkey",
"swaps",
["block_number", "transaction_hash", "trace_address"],
)
op.create_index(
"arbitrage_swaps_swaps_idx",
"arbitrage_swaps",
["swap_transaction_hash", "swap_trace_address"],
)
def downgrade():
op.drop_index("arbitrage_swaps_swaps_idx")
op.execute("ALTER TABLE swaps DROP CONSTRAINT swaps_pkey CASCADE")
op.create_primary_key(
"swaps_pkey",
"swaps",
["transaction_hash", "trace_address"],
)
op.create_foreign_key(
"arbitrage_swaps_swaps_fkey",
"arbitrage_swaps",
"swaps",
["swap_transaction_hash", "swap_trace_address"],
["transaction_hash", "trace_address"],
)

View File

@ -0,0 +1,40 @@
"""Create NFT Trades table
Revision ID: 3c54832385e3
Revises: 4b9d289f2d74
Create Date: 2021-12-19 22:50:28.936516
"""
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision = "3c54832385e3"
down_revision = "4b9d289f2d74"
branch_labels = None
depends_on = None
def upgrade():
op.create_table(
"nft_trades",
sa.Column("created_at", sa.TIMESTAMP, server_default=sa.func.now()),
sa.Column("abi_name", sa.String(1024), nullable=False),
sa.Column("transaction_hash", sa.String(66), nullable=False),
sa.Column("transaction_position", sa.Numeric, nullable=False),
sa.Column("block_number", sa.Numeric, nullable=False),
sa.Column("trace_address", sa.String(256), nullable=False),
sa.Column("protocol", sa.String(256), nullable=False),
sa.Column("error", sa.String(256), nullable=True),
sa.Column("seller_address", sa.String(256), nullable=False),
sa.Column("buyer_address", sa.String(256), nullable=False),
sa.Column("payment_token_address", sa.String(256), nullable=False),
sa.Column("payment_amount", sa.Numeric, nullable=False),
sa.Column("collection_address", sa.String(256), nullable=False),
sa.Column("token_id", sa.Numeric, nullable=False),
sa.PrimaryKeyConstraint("transaction_hash", "trace_address"),
)
def downgrade():
op.drop_table("nft_trades")

View File

@ -0,0 +1,23 @@
"""Add error column to liquidations
Revision ID: 4b9d289f2d74
Revises: 99d376cb93cc
Create Date: 2021-12-23 14:54:28.406159
"""
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision = "4b9d289f2d74"
down_revision = "99d376cb93cc"
branch_labels = None
depends_on = None
def upgrade():
op.add_column("liquidations", sa.Column("error", sa.String(256), nullable=True))
def downgrade():
op.drop_column("liquidations", "error")

View File

@ -0,0 +1,33 @@
"""empty message
Revision ID: 52d75a7e0533
Revises: 7cf0eeb41da0
Create Date: 2021-11-26 20:35:58.954138
"""
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision = "52d75a7e0533"
down_revision = "7cf0eeb41da0"
branch_labels = None
depends_on = None
def upgrade():
op.create_table(
"punk_bid_acceptances",
sa.Column("created_at", sa.TIMESTAMP, server_default=sa.func.now()),
sa.Column("block_number", sa.Numeric, nullable=False),
sa.Column("transaction_hash", sa.String(66), nullable=False),
sa.Column("trace_address", sa.String(256), nullable=False),
sa.Column("from_address", sa.String(256), nullable=False),
sa.Column("punk_index", sa.Numeric, nullable=False),
sa.Column("min_price", sa.Numeric, nullable=False),
sa.PrimaryKeyConstraint("block_number", "transaction_hash", "trace_address"),
)
def downgrade():
op.drop_table("punk_bid_acceptances")

View File

@ -0,0 +1,46 @@
"""Change transfers trace address to ARRAY
Revision ID: 5427d62a2cc0
Revises: d540242ae368
Create Date: 2021-11-19 13:25:11.252774
"""
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision = "5427d62a2cc0"
down_revision = "d540242ae368"
branch_labels = None
depends_on = None
def upgrade():
op.drop_constraint("transfers_pkey", "transfers")
op.alter_column(
"transfers",
"trace_address",
type_=sa.ARRAY(sa.Integer),
nullable=False,
postgresql_using="trace_address::int[]",
)
op.create_primary_key(
"transfers_pkey",
"transfers",
["block_number", "transaction_hash", "trace_address"],
)
def downgrade():
op.drop_constraint("transfers_pkey", "transfers")
op.alter_column(
"transfers",
"trace_address",
type_=sa.String(256),
nullable=False,
)
op.create_primary_key(
"transfers_pkey",
"transfers",
["block_number", "transaction_hash", "trace_address"],
)

View File

@ -0,0 +1,32 @@
"""Add block_number to nft_trades primary key
Revision ID: 5c5375de15fd
Revises: e616420acd18
Create Date: 2022-01-21 15:27:57.790340
"""
from alembic import op
# revision identifiers, used by Alembic.
revision = "5c5375de15fd"
down_revision = "e616420acd18"
branch_labels = None
depends_on = None
def upgrade():
op.execute("ALTER TABLE nft_trades DROP CONSTRAINT nft_trades_pkey")
op.create_primary_key(
"nft_trades_pkey",
"nft_trades",
["block_number", "transaction_hash", "trace_address"],
)
def downgrade():
op.execute("ALTER TABLE nft_trades DROP CONSTRAINT nft_trades_pkey")
op.create_primary_key(
"nft_trades_pkey",
"nft_trades",
["transaction_hash", "trace_address"],
)

View File

@ -0,0 +1,22 @@
"""Make gross profit nullable on summary
Revision ID: 630783c18a93
Revises: ab9a9e449ff9
Create Date: 2022-01-19 23:09:51.816948
"""
from alembic import op
# revision identifiers, used by Alembic.
revision = "630783c18a93"
down_revision = "ab9a9e449ff9"
branch_labels = None
depends_on = None
def upgrade():
op.alter_column("mev_summary", "gross_profit_usd", nullable=True)
def downgrade():
op.alter_column("mev_summary", "gross_profit_usd", nullable=False)

View File

@ -0,0 +1,33 @@
"""empty message
Revision ID: 7cf0eeb41da0
Revises: d498bdb0a641
Create Date: 2021-11-26 20:27:28.936516
"""
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision = "7cf0eeb41da0"
down_revision = "d498bdb0a641"
branch_labels = None
depends_on = None
def upgrade():
op.create_table(
"punk_bids",
sa.Column("created_at", sa.TIMESTAMP, server_default=sa.func.now()),
sa.Column("block_number", sa.Numeric, nullable=False),
sa.Column("transaction_hash", sa.String(66), nullable=False),
sa.Column("trace_address", sa.String(256), nullable=False),
sa.Column("from_address", sa.String(256), nullable=False),
sa.Column("punk_index", sa.Numeric, nullable=False),
sa.Column("price", sa.Numeric, nullable=False),
sa.PrimaryKeyConstraint("block_number", "transaction_hash", "trace_address"),
)
def downgrade():
op.drop_table("punk_bids")

View File

@ -8,7 +8,6 @@ Create Date: 2021-08-06 15:58:04.556762
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision = "7eec417a4f3e"
down_revision = "9d8c69b3dccb"

View File

@ -8,7 +8,6 @@ Create Date: 2021-08-17 03:46:21.498821
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision = "92f28a2b4f52"
down_revision = "9b8ae51c5d56"

View File

@ -0,0 +1,23 @@
"""error column
Revision ID: 99d376cb93cc
Revises: c4a7620a2d33
Create Date: 2021-12-21 21:26:12.142484
"""
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision = "99d376cb93cc"
down_revision = "c4a7620a2d33"
branch_labels = None
depends_on = None
def upgrade():
op.add_column("arbitrages", sa.Column("error", sa.String(256), nullable=True))
def downgrade():
op.drop_column("arbitrages", "error")

View File

@ -8,7 +8,6 @@ Create Date: 2021-08-06 17:06:55.364516
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision = "9b8ae51c5d56"
down_revision = "7eec417a4f3e"

View File

@ -8,7 +8,6 @@ Create Date: 2021-08-05 21:46:35.209199
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision = "9d8c69b3dccb"
down_revision = "2116e2f36a19"

View File

@ -8,7 +8,6 @@ Create Date: 2021-09-13 21:32:27.181344
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision = "a02f3f2c469f"
down_revision = "d70c08b4db6f"

View File

@ -0,0 +1,34 @@
"""Change classified traces primary key to include block number
Revision ID: a10d68643476
Revises: 3417f49d97b3
Create Date: 2021-11-02 22:03:26.312317
"""
from alembic import op
# revision identifiers, used by Alembic.
revision = "a10d68643476"
down_revision = "3417f49d97b3"
branch_labels = None
depends_on = None
def upgrade():
op.execute("ALTER TABLE classified_traces DROP CONSTRAINT classified_traces_pkey")
op.create_primary_key(
"classified_traces_pkey",
"classified_traces",
["block_number", "transaction_hash", "trace_address"],
)
op.drop_index("i_block_number")
def downgrade():
op.execute("ALTER TABLE classified_traces DROP CONSTRAINT classified_traces_pkey")
op.create_index("i_block_number", "classified_traces", ["block_number"])
op.create_primary_key(
"classified_traces_pkey",
"classified_traces",
["transaction_hash", "trace_address"],
)

View File

@ -0,0 +1,40 @@
"""Create mev_summary table
Revision ID: ab9a9e449ff9
Revises: b26ab0051a88
Create Date: 2022-01-18 18:36:42.865154
"""
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision = "ab9a9e449ff9"
down_revision = "b26ab0051a88"
branch_labels = None
depends_on = None
def upgrade():
op.create_table(
"mev_summary",
sa.Column("created_at", sa.TIMESTAMP, server_default=sa.func.now()),
sa.Column("block_number", sa.Numeric, nullable=False),
sa.Column("block_timestamp", sa.TIMESTAMP, nullable=False),
sa.Column("protocol", sa.String(256), nullable=True),
sa.Column("transaction_hash", sa.String(66), nullable=False),
sa.Column("type", sa.String(256), nullable=False),
sa.Column("gross_profit_usd", sa.Numeric, nullable=False),
sa.Column("miner_payment_usd", sa.Numeric, nullable=False),
sa.Column("gas_used", sa.Numeric, nullable=False),
sa.Column("gas_price", sa.Numeric, nullable=False),
sa.Column("coinbase_transfer", sa.Numeric, nullable=False),
sa.Column("gas_price_with_coinbase_transfer", sa.Numeric, nullable=False),
sa.Column("miner_address", sa.String(256), nullable=False),
sa.Column("base_fee_per_gas", sa.Numeric, nullable=False),
sa.Column("error", sa.String(256), nullable=True),
)
def downgrade():
op.drop_table("mev_summary")

View File

@ -0,0 +1,27 @@
"""add profit_amount column to sandwiches table
Revision ID: b26ab0051a88
Revises: 3c54832385e3
Create Date: 2022-01-16 13:45:10.190969
"""
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision = "b26ab0051a88"
down_revision = "3c54832385e3"
branch_labels = None
depends_on = None
def upgrade():
op.add_column(
"sandwiches", sa.Column("profit_token_address", sa.String(256), nullable=True)
)
op.add_column("sandwiches", sa.Column("profit_amount", sa.Numeric, nullable=True))
def downgrade():
op.drop_column("sandwiches", "profit_token_address")
op.drop_column("sandwiches", "profit_amount")

View File

@ -0,0 +1,26 @@
"""Remove collateral_token_address column
Revision ID: b9fa1ecc9929
Revises: 04b76ab1d2af
Create Date: 2021-12-01 23:32:40.574108
"""
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision = "b9fa1ecc9929"
down_revision = "04b76ab1d2af"
branch_labels = None
depends_on = None
def upgrade():
op.drop_column("liquidations", "collateral_token_address")
def downgrade():
op.add_column(
"liquidations",
sa.Column("collateral_token_address", sa.String(256), nullable=False),
)

View File

@ -0,0 +1,40 @@
"""Add tokens to database
Revision ID: bba80d21c5a4
Revises: 630783c18a93
Create Date: 2022-01-19 22:19:59.514998
"""
from alembic import op
# revision identifiers, used by Alembic.
revision = "bba80d21c5a4"
down_revision = "630783c18a93"
branch_labels = None
depends_on = None
def upgrade():
op.execute(
"""
INSERT INTO tokens (token_address,decimals) VALUES
('0x514910771af9ca656af840dff83e8264ecf986ca',18),
('0xc02aaa39b223fe8d0a0e5c4f27ead9083c756cc2',18),
('0xeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeee',18),
('0x0bc529c00c6401aef6d220be8c6ea1667f6ad93e',18),
('0x5d3a536e4d6dbd6114cc1ead35777bab948e3643',8),
('0x2260fac5e5542a773aa44fbcfedf7c193bc2c599',8),
('0x80fb784b7ed66730e8b1dbd9820afd29931aab03',18),
('0x4ddc2d193948926d02f9b1fe9e1daa0718270ed5',8),
('0xa0b86991c6218b36c1d19d4a2e9eb0ce3606eb48',6),
('0xdac17f958d2ee523a2206206994597c13d831ec7',6),
('0x6b175474e89094c44da98b954eedeac495271d0f',18),
('0x0000000000085d4780b73119b644ae5ecd22b376',18),
('0x39aa39c021dfbae8fac545936693ac917d5e7563',8),
('0x7fc66500c84a76ad7e9c93437bfc5ac33e2ddae9',18);
"""
)
def downgrade():
op.execute("DELETE FROM tokens")

View File

@ -0,0 +1,26 @@
"""Add protocols column to arbitrages
Revision ID: bdbb545f6c03
Revises: bba80d21c5a4
Create Date: 2022-01-20 23:17:19.316008
"""
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision = "bdbb545f6c03"
down_revision = "bba80d21c5a4"
branch_labels = None
depends_on = None
def upgrade():
op.add_column(
"arbitrages",
sa.Column("protocols", sa.ARRAY(sa.String(256)), server_default="{}"),
)
def downgrade():
op.drop_column("arbitrages", "protocols")

View File

@ -0,0 +1,28 @@
"""Create tokens table
Revision ID: c4a7620a2d33
Revises: 15ba9c27ee8a
Create Date: 2021-12-21 19:12:33.940117
"""
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision = "c4a7620a2d33"
down_revision = "15ba9c27ee8a"
branch_labels = None
depends_on = None
def upgrade():
op.create_table(
"tokens",
sa.Column("token_address", sa.String(256), nullable=False),
sa.Column("decimals", sa.Numeric, nullable=False),
sa.PrimaryKeyConstraint("token_address"),
)
def downgrade():
op.drop_table("tokens")

View File

@ -7,7 +7,6 @@ Create Date: 2021-07-30 17:37:27.335475
"""
from alembic import op
# revision identifiers, used by Alembic.
revision = "c5da44eb072c"
down_revision = "0660432b9840"

View File

@ -0,0 +1,37 @@
"""Create liquidations table
Revision ID: c8363617aa07
Revises: cd96af55108e
Create Date: 2021-09-29 14:00:06.857103
"""
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision = "c8363617aa07"
down_revision = "cd96af55108e"
branch_labels = None
depends_on = None
def upgrade():
op.create_table(
"liquidations",
sa.Column("created_at", sa.TIMESTAMP, server_default=sa.func.now()),
sa.Column("liquidated_user", sa.String(256), nullable=False),
sa.Column("liquidator_user", sa.String(256), nullable=False),
sa.Column("collateral_token_address", sa.String(256), nullable=False),
sa.Column("debt_token_address", sa.String(256), nullable=False),
sa.Column("debt_purchase_amount", sa.Numeric, nullable=False),
sa.Column("received_amount", sa.Numeric, nullable=False),
sa.Column("protocol", sa.String(256), nullable=True),
sa.Column("transaction_hash", sa.String(66), nullable=False),
sa.Column("trace_address", sa.String(256), nullable=False),
sa.Column("block_number", sa.Numeric, nullable=False),
sa.PrimaryKeyConstraint("transaction_hash", "trace_address"),
)
def downgrade():
op.drop_table("liquidations")

View File

@ -0,0 +1,38 @@
"""Add transfers table
Revision ID: cd96af55108e
Revises: 320e56b0a99f
Create Date: 2021-09-17 12:44:45.245137
"""
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision = "cd96af55108e"
down_revision = "320e56b0a99f"
branch_labels = None
depends_on = None
def upgrade():
op.create_table(
"transfers",
sa.Column("created_at", sa.TIMESTAMP, server_default=sa.func.now()),
sa.Column("block_number", sa.Numeric, nullable=False),
sa.Column("transaction_hash", sa.String(66), nullable=False),
sa.Column("trace_address", sa.String(256), nullable=False),
sa.Column("protocol", sa.String(256), nullable=True),
sa.Column("from_address", sa.String(256), nullable=False),
sa.Column("to_address", sa.String(256), nullable=False),
sa.Column("token_address", sa.String(256), nullable=False),
sa.Column("amount", sa.Numeric, nullable=False),
sa.Column("error", sa.String(256), nullable=True),
sa.PrimaryKeyConstraint("transaction_hash", "trace_address"),
)
op.create_index("ix_transfers_block_number", "transfers", ["block_number"])
def downgrade():
op.drop_index("ix_transfers_block_number", "transfers")
op.drop_table("transfers")

View File

@ -0,0 +1,29 @@
"""Create usd_prices table
Revision ID: d540242ae368
Revises: 2c90b2b8a80b
Create Date: 2021-11-18 04:30:06.802857
"""
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision = "d540242ae368"
down_revision = "2c90b2b8a80b"
branch_labels = None
depends_on = None
def upgrade():
op.create_table(
"prices",
sa.Column("timestamp", sa.TIMESTAMP),
sa.Column("usd_price", sa.Numeric, nullable=False),
sa.Column("token_address", sa.String(256), nullable=False),
sa.PrimaryKeyConstraint("token_address", "timestamp"),
)
def downgrade():
op.drop_table("prices")

View File

@ -8,7 +8,6 @@ Create Date: 2021-08-30 22:10:04.186251
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision = "d70c08b4db6f"
down_revision = "083978d6e455"

View File

@ -0,0 +1,26 @@
"""Add protocols column to mev_summary
Revision ID: e616420acd18
Revises: bdbb545f6c03
Create Date: 2022-01-21 00:11:51.516459
"""
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision = "e616420acd18"
down_revision = "bdbb545f6c03"
branch_labels = None
depends_on = None
def upgrade():
op.add_column(
"mev_summary",
sa.Column("protocols", sa.ARRAY(sa.String(256)), server_default="{}"),
)
def downgrade():
op.drop_column("mev_summary", "protocols")

View File

@ -0,0 +1,69 @@
"""Create sandwiches and sandwiched swaps tables
Revision ID: ead7eb8283b9
Revises: 52d75a7e0533
Create Date: 2021-12-03 16:37:28.077158
"""
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision = "ead7eb8283b9"
down_revision = "52d75a7e0533"
branch_labels = None
depends_on = None
def upgrade():
op.create_table(
"sandwiches",
sa.Column("id", sa.String(256), primary_key=True),
sa.Column("created_at", sa.TIMESTAMP, server_default=sa.func.now()),
sa.Column("block_number", sa.Numeric, nullable=False),
sa.Column("sandwicher_address", sa.String(256), nullable=False),
sa.Column("frontrun_swap_transaction_hash", sa.String(256), nullable=False),
sa.Column("frontrun_swap_trace_address", sa.ARRAY(sa.Integer), nullable=False),
sa.Column("backrun_swap_transaction_hash", sa.String(256), nullable=False),
sa.Column("backrun_swap_trace_address", sa.ARRAY(sa.Integer), nullable=False),
)
op.create_index(
"ik_sandwiches_frontrun",
"sandwiches",
[
"block_number",
"frontrun_swap_transaction_hash",
"frontrun_swap_trace_address",
],
)
op.create_index(
"ik_sandwiches_backrun",
"sandwiches",
["block_number", "backrun_swap_transaction_hash", "backrun_swap_trace_address"],
)
op.create_table(
"sandwiched_swaps",
sa.Column("created_at", sa.TIMESTAMP, server_default=sa.func.now()),
sa.Column("sandwich_id", sa.String(1024), primary_key=True),
sa.Column("block_number", sa.Numeric, primary_key=True),
sa.Column("transaction_hash", sa.String(66), primary_key=True),
sa.Column("trace_address", sa.ARRAY(sa.Integer), primary_key=True),
sa.ForeignKeyConstraint(["sandwich_id"], ["sandwiches.id"], ondelete="CASCADE"),
)
op.create_index(
"ik_sandwiched_swaps_secondary",
"sandwiched_swaps",
["block_number", "transaction_hash", "trace_address"],
)
def downgrade():
op.drop_index("ik_sandwiched_swaps_secondary")
op.drop_table("sandwiched_swaps")
op.drop_index("ik_sandwiches_frontrun")
op.drop_index("ik_sandwiches_backrun")
op.drop_table("sandwiches")
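
For context on how the two tables relate: every row in sandwiched_swaps points at its parent sandwich through sandwich_id, so the victim swaps for a given sandwich can be pulled with a join. An illustrative query against the columns created above (not part of the migration):

    from sqlalchemy import text

    # Victim swaps recorded for one sandwich, in trace order.
    sandwiched_swaps_for_sandwich = text(
        """
        SELECT ss.block_number, ss.transaction_hash, ss.trace_address
        FROM sandwiched_swaps ss
        JOIN sandwiches ON sandwiches.id = ss.sandwich_id
        WHERE sandwiches.id = :sandwich_id
        ORDER BY ss.trace_address
        """
    )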

219
cli.py Normal file
View File

@ -0,0 +1,219 @@
import fileinput
import logging
import os
import sys
from datetime import datetime
import click
import dramatiq
from mev_inspect.concurrency import coro
from mev_inspect.crud.prices import write_prices
from mev_inspect.db import get_inspect_session, get_trace_session
from mev_inspect.inspector import MEVInspector
from mev_inspect.prices import fetch_prices, fetch_prices_range
from mev_inspect.queue.broker import connect_broker
from mev_inspect.queue.tasks import (
LOW_PRIORITY,
LOW_PRIORITY_QUEUE,
backfill_export_task,
inspect_many_blocks_task,
)
from mev_inspect.s3_export import export_block
RPC_URL_ENV = "RPC_URL"
logging.basicConfig(stream=sys.stdout, level=logging.INFO)
logger = logging.getLogger(__name__)
@click.group()
def cli():
pass
@cli.command()
@click.argument("block_number", type=int)
@click.option("--rpc", default=lambda: os.environ.get(RPC_URL_ENV, ""))
@coro
async def inspect_block_command(block_number: int, rpc: str):
inspect_db_session = get_inspect_session()
trace_db_session = get_trace_session()
inspector = MEVInspector(rpc)
await inspector.inspect_single_block(
inspect_db_session=inspect_db_session,
trace_db_session=trace_db_session,
block=block_number,
)
@cli.command()
@click.argument("block_number", type=int)
@click.option("--rpc", default=lambda: os.environ.get(RPC_URL_ENV, ""))
@coro
async def fetch_block_command(block_number: int, rpc: str):
trace_db_session = get_trace_session()
inspector = MEVInspector(rpc)
block = await inspector.create_from_block(
block_number=block_number,
trace_db_session=trace_db_session,
)
print(block.json())
@cli.command()
@click.argument("after_block", type=int)
@click.argument("before_block", type=int)
@click.option("--rpc", default=lambda: os.environ.get(RPC_URL_ENV, ""))
@click.option(
"--max-concurrency",
type=int,
help="maximum number of concurrent connections",
default=5,
)
@click.option(
"--request-timeout", type=int, help="timeout for requests to nodes", default=500
)
@coro
async def inspect_many_blocks_command(
after_block: int,
before_block: int,
rpc: str,
max_concurrency: int,
request_timeout: int,
):
inspect_db_session = get_inspect_session()
trace_db_session = get_trace_session()
inspector = MEVInspector(
rpc,
max_concurrency=max_concurrency,
request_timeout=request_timeout,
)
await inspector.inspect_many_blocks(
inspect_db_session=inspect_db_session,
trace_db_session=trace_db_session,
after_block=after_block,
before_block=before_block,
)
@cli.command()
def enqueue_block_list_command():
broker = connect_broker()
inspect_many_blocks_actor = dramatiq.actor(
inspect_many_blocks_task,
broker=broker,
queue_name=LOW_PRIORITY_QUEUE,
priority=LOW_PRIORITY,
)
for block_string in fileinput.input():
block = int(block_string)
logger.info(f"Sending {block} to {block+1}")
inspect_many_blocks_actor.send(block, block + 1)
@cli.command()
@click.argument("start_block", type=int)
@click.argument("end_block", type=int)
@click.argument("batch_size", type=int, default=10)
def enqueue_many_blocks_command(start_block: int, end_block: int, batch_size: int):
broker = connect_broker()
inspect_many_blocks_actor = dramatiq.actor(
inspect_many_blocks_task,
broker=broker,
queue_name=LOW_PRIORITY_QUEUE,
priority=LOW_PRIORITY,
)
if start_block < end_block:
after_block = start_block
before_block = end_block
for batch_after_block in range(after_block, before_block, batch_size):
batch_before_block = min(batch_after_block + batch_size, before_block)
logger.info(f"Sending {batch_after_block} to {batch_before_block}")
inspect_many_blocks_actor.send(batch_after_block, batch_before_block)
else:
after_block = end_block
before_block = start_block
for batch_before_block in range(before_block, after_block, -1 * batch_size):
batch_after_block = max(batch_before_block - batch_size, after_block)
logger.info(f"Sending {batch_after_block} to {batch_before_block}")
inspect_many_blocks_actor.send(batch_after_block, batch_before_block)
@cli.command()
def fetch_all_prices():
inspect_db_session = get_inspect_session()
logger.info("Fetching prices")
prices = fetch_prices()
logger.info("Writing prices")
write_prices(inspect_db_session, prices)
@cli.command()
@click.argument("block_number", type=int)
def enqueue_s3_export(block_number: int):
broker = connect_broker()
export_actor = dramatiq.actor(
backfill_export_task,
broker=broker,
queue_name=LOW_PRIORITY_QUEUE,
priority=LOW_PRIORITY,
)
logger.info(f"Sending block {block_number} export to queue")
export_actor.send(block_number)
@cli.command()
@click.argument("after_block", type=int)
@click.argument("before_block", type=int)
def enqueue_many_s3_exports(after_block: int, before_block: int):
broker = connect_broker()
export_actor = dramatiq.actor(
backfill_export_task,
broker=broker,
queue_name=LOW_PRIORITY_QUEUE,
priority=LOW_PRIORITY,
)
logger.info(f"Sending blocks {after_block} to {before_block} to queue")
for block_number in range(after_block, before_block):
export_actor.send(block_number)
@cli.command()
@click.argument("block_number", type=int)
def s3_export(block_number: int):
inspect_db_session = get_inspect_session()
logger.info(f"Exporting {block_number}")
export_block(inspect_db_session, block_number)
@cli.command()
@click.argument("after", type=click.DateTime(formats=["%Y-%m-%d", "%m-%d-%Y"]))
@click.argument("before", type=click.DateTime(formats=["%Y-%m-%d", "%m-%d-%Y"]))
def fetch_range(after: datetime, before: datetime):
inspect_db_session = get_inspect_session()
logger.info("Fetching prices")
prices = fetch_prices_range(after, before)
logger.info("Writing prices")
write_prices(inspect_db_session, prices)
def get_rpc_url() -> str:
return os.environ["RPC_URL"]
if __name__ == "__main__":
cli()
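
For orientation, enqueue-many-blocks above walks the requested range in batch_size steps, going forward when start_block is below end_block and newest-first otherwise. A small worked example with made-up block numbers (illustrative only, not part of cli.py):

    # Forward: start_block=100, end_block=130, batch_size=10
    # sends (100, 110), (110, 120), (120, 130).
    for batch_after in range(100, 130, 10):
        batch_before = min(batch_after + 10, 130)
        print(f"Sending {batch_after} to {batch_before}")

    # Reverse: start_block=130, end_block=100, batch_size=10
    # sends (120, 130), (110, 120), (100, 110), so the newest blocks go first.
    for batch_before in range(130, 100, -10):
        batch_after = max(batch_before - 10, 100)
        print(f"Sending {batch_after} to {batch_before}")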

View File

@ -1,24 +0,0 @@
services:
mev-inspect:
build: .
depends_on:
- db
env_file:
- .env
volumes:
- .:/app
tty: true
db:
image: postgres:12
volumes:
- mev-inspect-db-data:/var/lib/postgresql/data/pgdata
env_file:
- .env
environment:
- PGDATA=/var/lib/postgresql/data/pgdata
ports:
- 5432:5432
volumes:
mev-inspect-db-data:

View File

@ -1,39 +0,0 @@
apiVersion: apps/v1
kind: Deployment
metadata:
name: mev-inspect-deployment
labels:
app: mev-inspect
spec:
replicas: 1
selector:
matchLabels:
app: mev-inspect
template:
metadata:
labels:
app: mev-inspect
spec:
containers:
- name: mev-inspect
image: mev-inspect:latest
command: [ "/bin/bash", "-c", "--" ]
args: [ "while true; do sleep 30; done;" ]
env:
- name: POSTGRES_USER
valueFrom:
secretKeyRef:
name: mev-inspect-db-credentials
key: username
- name: POSTGRES_PASSWORD
valueFrom:
secretKeyRef:
name: mev-inspect-db-credentials
key: password
livenessProbe:
exec:
command:
- ls
- /
initialDelaySeconds: 20
periodSeconds: 5

View File

@ -0,0 +1,23 @@
# Patterns to ignore when building packages.
# This supports shell glob matching, relative path matching, and
# negation (prefixed with !). Only one pattern per line.
.DS_Store
# Common VCS dirs
.git/
.gitignore
.bzr/
.bzrignore
.hg/
.hgignore
.svn/
# Common backup files
*.swp
*.bak
*.tmp
*.orig
*~
# Various IDEs
.project
.idea/
*.tmproj
.vscode/

View File

@ -0,0 +1,24 @@
apiVersion: v2
name: mev-inspect-prices
description: A Helm chart for Kubernetes
# A chart can be either an 'application' or a 'library' chart.
#
# Application charts are a collection of templates that can be packaged into versioned archives
# to be deployed.
#
# Library charts provide useful utilities or functions for the chart developer. They're included as
# a dependency of application charts to inject those utilities and functions into the rendering
# pipeline. Library charts do not define any templates and therefore cannot be deployed.
type: application
# This is the chart version. This version number should be incremented each time you make changes
# to the chart and its templates, including the app version.
# Versions are expected to follow Semantic Versioning (https://semver.org/)
version: 0.1.0
# This is the version number of the application being deployed. This version number should be
# incremented each time you make changes to the application. Versions are not expected to
# follow Semantic Versioning. They should reflect the version the application is using.
# It is recommended to use it with quotes.
appVersion: "1.16.0"

View File

@ -0,0 +1,62 @@
{{/*
Expand the name of the chart.
*/}}
{{- define "mev-inspect-prices.name" -}}
{{- default .Chart.Name .Values.nameOverride | trunc 63 | trimSuffix "-" }}
{{- end }}
{{/*
Create a default fully qualified app name.
We truncate at 63 chars because some Kubernetes name fields are limited to this (by the DNS naming spec).
If release name contains chart name it will be used as a full name.
*/}}
{{- define "mev-inspect-prices.fullname" -}}
{{- if .Values.fullnameOverride }}
{{- .Values.fullnameOverride | trunc 63 | trimSuffix "-" }}
{{- else }}
{{- $name := default .Chart.Name .Values.nameOverride }}
{{- if contains $name .Release.Name }}
{{- .Release.Name | trunc 63 | trimSuffix "-" }}
{{- else }}
{{- printf "%s-%s" .Release.Name $name | trunc 63 | trimSuffix "-" }}
{{- end }}
{{- end }}
{{- end }}
{{/*
Create chart name and version as used by the chart label.
*/}}
{{- define "mev-inspect-prices.chart" -}}
{{- printf "%s-%s" .Chart.Name .Chart.Version | replace "+" "_" | trunc 63 | trimSuffix "-" }}
{{- end }}
{{/*
Common labels
*/}}
{{- define "mev-inspect-prices.labels" -}}
helm.sh/chart: {{ include "mev-inspect-prices.chart" . }}
{{ include "mev-inspect-prices.selectorLabels" . }}
{{- if .Chart.AppVersion }}
app.kubernetes.io/version: {{ .Chart.AppVersion | quote }}
{{- end }}
app.kubernetes.io/managed-by: {{ .Release.Service }}
{{- end }}
{{/*
Selector labels
*/}}
{{- define "mev-inspect-prices.selectorLabels" -}}
app.kubernetes.io/name: {{ include "mev-inspect-prices.name" . }}
app.kubernetes.io/instance: {{ .Release.Name }}
{{- end }}
{{/*
Create the name of the service account to use
*/}}
{{- define "mev-inspect-prices.serviceAccountName" -}}
{{- if .Values.serviceAccount.create }}
{{- default (include "mev-inspect-prices.fullname" .) .Values.serviceAccount.name }}
{{- else }}
{{- default "default" .Values.serviceAccount.name }}
{{- end }}
{{- end }}

View File

@ -0,0 +1,35 @@
apiVersion: batch/v1
kind: CronJob
metadata:
name: {{ include "mev-inspect-prices.fullname" . }}
spec:
schedule: "0 */1 * * *"
successfulJobsHistoryLimit: 0
jobTemplate:
spec:
template:
spec:
containers:
- name: {{ .Chart.Name }}
image: "{{ .Values.image.repository }}"
imagePullPolicy: {{ .Values.image.pullPolicy }}
args:
- run
- fetch-all-prices
env:
- name: POSTGRES_HOST
valueFrom:
secretKeyRef:
name: mev-inspect-db-credentials
key: host
- name: POSTGRES_USER
valueFrom:
secretKeyRef:
name: mev-inspect-db-credentials
key: username
- name: POSTGRES_PASSWORD
valueFrom:
secretKeyRef:
name: mev-inspect-db-credentials
key: password
restartPolicy: Never

View File

@ -0,0 +1,7 @@
image:
repository: mev-inspect-py
pullPolicy: IfNotPresent
imagePullSecrets: []
nameOverride: ""
fullnameOverride: ""

View File

@ -0,0 +1,23 @@
# Patterns to ignore when building packages.
# This supports shell glob matching, relative path matching, and
# negation (prefixed with !). Only one pattern per line.
.DS_Store
# Common VCS dirs
.git/
.gitignore
.bzr/
.bzrignore
.hg/
.hgignore
.svn/
# Common backup files
*.swp
*.bak
*.tmp
*.orig
*~
# Various IDEs
.project
.idea/
*.tmproj
.vscode/

View File

@ -0,0 +1,24 @@
apiVersion: v2
name: mev-inspect-workers
description: A Helm chart for Kubernetes
# A chart can be either an 'application' or a 'library' chart.
#
# Application charts are a collection of templates that can be packaged into versioned archives
# to be deployed.
#
# Library charts provide useful utilities or functions for the chart developer. They're included as
# a dependency of application charts to inject those utilities and functions into the rendering
# pipeline. Library charts do not define any templates and therefore cannot be deployed.
type: application
# This is the chart version. This version number should be incremented each time you make changes
# to the chart and its templates, including the app version.
# Versions are expected to follow Semantic Versioning (https://semver.org/)
version: 0.1.0
# This is the version number of the application being deployed. This version number should be
# incremented each time you make changes to the application. Versions are not expected to
# follow Semantic Versioning. They should reflect the version the application is using.
# It is recommended to use it with quotes.
appVersion: "1.16.0"

View File

@ -0,0 +1,62 @@
{{/*
Expand the name of the chart.
*/}}
{{- define "mev-inspect-worker.name" -}}
{{- default .Chart.Name .Values.nameOverride | trunc 63 | trimSuffix "-" }}
{{- end }}
{{/*
Create a default fully qualified app name.
We truncate at 63 chars because some Kubernetes name fields are limited to this (by the DNS naming spec).
If release name contains chart name it will be used as a full name.
*/}}
{{- define "mev-inspect-worker.fullname" -}}
{{- if .Values.fullnameOverride }}
{{- .Values.fullnameOverride | trunc 63 | trimSuffix "-" }}
{{- else }}
{{- $name := default .Chart.Name .Values.nameOverride }}
{{- if contains $name .Release.Name }}
{{- .Release.Name | trunc 63 | trimSuffix "-" }}
{{- else }}
{{- printf "%s-%s" .Release.Name $name | trunc 63 | trimSuffix "-" }}
{{- end }}
{{- end }}
{{- end }}
{{/*
Create chart name and version as used by the chart label.
*/}}
{{- define "mev-inspect-worker.chart" -}}
{{- printf "%s-%s" .Chart.Name .Chart.Version | replace "+" "_" | trunc 63 | trimSuffix "-" }}
{{- end }}
{{/*
Common labels
*/}}
{{- define "mev-inspect-worker.labels" -}}
helm.sh/chart: {{ include "mev-inspect-worker.chart" . }}
{{ include "mev-inspect-worker.selectorLabels" . }}
{{- if .Chart.AppVersion }}
app.kubernetes.io/version: {{ .Chart.AppVersion | quote }}
{{- end }}
app.kubernetes.io/managed-by: {{ .Release.Service }}
{{- end }}
{{/*
Selector labels
*/}}
{{- define "mev-inspect-worker.selectorLabels" -}}
app.kubernetes.io/name: {{ include "mev-inspect-worker.name" . }}
app.kubernetes.io/instance: {{ .Release.Name }}
{{- end }}
{{/*
Create the name of the service account to use
*/}}
{{- define "mev-inspect-worker.serviceAccountName" -}}
{{- if .Values.serviceAccount.create }}
{{- default (include "mev-inspect-worker.fullname" .) .Values.serviceAccount.name }}
{{- else }}
{{- default "default" .Values.serviceAccount.name }}
{{- end }}
{{- end }}

View File

@ -0,0 +1,133 @@
apiVersion: apps/v1
kind: Deployment
metadata:
name: {{ include "mev-inspect-worker.fullname" . }}
labels:
{{- include "mev-inspect-worker.labels" . | nindent 4 }}
spec:
replicas: {{ .Values.replicaCount }}
selector:
matchLabels:
{{- include "mev-inspect-worker.selectorLabels" . | nindent 6 }}
template:
metadata:
{{- with .Values.podAnnotations }}
annotations:
{{- toYaml . | nindent 8 }}
{{- end }}
labels:
{{- include "mev-inspect-worker.selectorLabels" . | nindent 8 }}
spec:
{{- with .Values.imagePullSecrets }}
imagePullSecrets:
{{- toYaml . | nindent 8 }}
{{- end }}
securityContext:
{{- toYaml .Values.podSecurityContext | nindent 8 }}
containers:
- name: {{ .Chart.Name }}
securityContext:
{{- toYaml .Values.securityContext | nindent 12 }}
image: "{{ .Values.image.repository }}"
imagePullPolicy: {{ .Values.image.pullPolicy }}
args: ["run", "dramatiq", "worker", "--threads=1", "--processes=1"]
livenessProbe:
exec:
command:
- ls
- /
initialDelaySeconds: 20
periodSeconds: 10
timeoutSeconds: 5
resources:
{{- toYaml .Values.resources | nindent 12 }}
env:
- name: POSTGRES_HOST
valueFrom:
secretKeyRef:
name: mev-inspect-db-credentials
key: host
- name: POSTGRES_USER
valueFrom:
secretKeyRef:
name: mev-inspect-db-credentials
key: username
- name: POSTGRES_PASSWORD
valueFrom:
secretKeyRef:
name: mev-inspect-db-credentials
key: password
- name: REDIS_PASSWORD
valueFrom:
secretKeyRef:
name: redis
key: redis-password
- name: TRACE_DB_HOST
valueFrom:
secretKeyRef:
name: trace-db-credentials
key: host
optional: true
- name: TRACE_DB_USER
valueFrom:
secretKeyRef:
name: trace-db-credentials
key: username
optional: true
- name: TRACE_DB_PASSWORD
valueFrom:
secretKeyRef:
name: trace-db-credentials
key: password
optional: true
- name: RPC_URL
valueFrom:
configMapKeyRef:
name: mev-inspect-rpc
key: url
- name: LISTENER_HEALTHCHECK_URL
valueFrom:
configMapKeyRef:
name: mev-inspect-listener-healthcheck
key: url
optional: true
- name: EXPORT_BUCKET_NAME
valueFrom:
secretKeyRef:
name: mev-inspect-export
key: export-bucket-name
optional: true
- name: EXPORT_BUCKET_REGION
valueFrom:
secretKeyRef:
name: mev-inspect-export
key: export-bucket-region
optional: true
- name: EXPORT_AWS_ACCESS_KEY_ID
valueFrom:
secretKeyRef:
name: mev-inspect-export
key: export-aws-access-key-id
optional: true
- name: EXPORT_AWS_SECRET_ACCESS_KEY
valueFrom:
secretKeyRef:
name: mev-inspect-export
key: export-aws-secret-access-key
optional: true
{{- range .Values.extraEnv }}
- name: {{ .name }}
value: {{ .value }}
{{- end }}
{{- with .Values.nodeSelector }}
nodeSelector:
{{- toYaml . | nindent 8 }}
{{- end }}
{{- with .Values.affinity }}
affinity:
{{- toYaml . | nindent 8 }}
{{- end }}
{{- with .Values.tolerations }}
tolerations:
{{- toYaml . | nindent 8 }}
{{- end }}

View File

@ -0,0 +1,45 @@
# Default values for mev-inspect-workers
# This is a YAML-formatted file.
# Declare variables to be passed into your templates.
replicaCount: 1
image:
repository: mev-inspect-py:latest
pullPolicy: IfNotPresent
imagePullSecrets: []
nameOverride: ""
fullnameOverride: ""
podAnnotations: {}
podSecurityContext: {}
# fsGroup: 2000
securityContext:
allowPrivilegeEscalation: false
capabilities:
drop:
- ALL
# readOnlyRootFilesystem: true
runAsNonRoot: true
runAsUser: 1000
resources: {}
# We usually recommend not to specify default resources and to leave this as a conscious
# choice for the user. This also increases chances charts run on environments with little
# resources, such as Minikube. If you do want to specify resources, uncomment the following
# lines, adjust them as necessary, and remove the curly braces after 'resources:'.
# limits:
# cpu: 100m
# memory: 128Mi
# requests:
# cpu: 100m
# memory: 128Mi
nodeSelector: {}
tolerations: []
affinity: {}

View File

@ -0,0 +1,23 @@
# Patterns to ignore when building packages.
# This supports shell glob matching, relative path matching, and
# negation (prefixed with !). Only one pattern per line.
.DS_Store
# Common VCS dirs
.git/
.gitignore
.bzr/
.bzrignore
.hg/
.hgignore
.svn/
# Common backup files
*.swp
*.bak
*.tmp
*.orig
*~
# Various IDEs
.project
.idea/
*.tmproj
.vscode/

View File

@ -0,0 +1,24 @@
apiVersion: v2
name: mev-inspect
description: A Helm chart for Kubernetes
# A chart can be either an 'application' or a 'library' chart.
#
# Application charts are a collection of templates that can be packaged into versioned archives
# to be deployed.
#
# Library charts provide useful utilities or functions for the chart developer. They're included as
# a dependency of application charts to inject those utilities and functions into the rendering
# pipeline. Library charts do not define any templates and therefore cannot be deployed.
type: application
# This is the chart version. This version number should be incremented each time you make changes
# to the chart and its templates, including the app version.
# Versions are expected to follow Semantic Versioning (https://semver.org/)
version: 0.1.0
# This is the version number of the application being deployed. This version number should be
# incremented each time you make changes to the application. Versions are not expected to
# follow Semantic Versioning. They should reflect the version the application is using.
# It is recommended to use it with quotes.
appVersion: "1.16.0"

View File

@ -0,0 +1,62 @@
{{/*
Expand the name of the chart.
*/}}
{{- define "mev-inspect.name" -}}
{{- default .Chart.Name .Values.nameOverride | trunc 63 | trimSuffix "-" }}
{{- end }}
{{/*
Create a default fully qualified app name.
We truncate at 63 chars because some Kubernetes name fields are limited to this (by the DNS naming spec).
If release name contains chart name it will be used as a full name.
*/}}
{{- define "mev-inspect.fullname" -}}
{{- if .Values.fullnameOverride }}
{{- .Values.fullnameOverride | trunc 63 | trimSuffix "-" }}
{{- else }}
{{- $name := default .Chart.Name .Values.nameOverride }}
{{- if contains $name .Release.Name }}
{{- .Release.Name | trunc 63 | trimSuffix "-" }}
{{- else }}
{{- printf "%s-%s" .Release.Name $name | trunc 63 | trimSuffix "-" }}
{{- end }}
{{- end }}
{{- end }}
{{/*
Create chart name and version as used by the chart label.
*/}}
{{- define "mev-inspect.chart" -}}
{{- printf "%s-%s" .Chart.Name .Chart.Version | replace "+" "_" | trunc 63 | trimSuffix "-" }}
{{- end }}
{{/*
Common labels
*/}}
{{- define "mev-inspect.labels" -}}
helm.sh/chart: {{ include "mev-inspect.chart" . }}
{{ include "mev-inspect.selectorLabels" . }}
{{- if .Chart.AppVersion }}
app.kubernetes.io/version: {{ .Chart.AppVersion | quote }}
{{- end }}
app.kubernetes.io/managed-by: {{ .Release.Service }}
{{- end }}
{{/*
Selector labels
*/}}
{{- define "mev-inspect.selectorLabels" -}}
app.kubernetes.io/name: {{ include "mev-inspect.name" . }}
app.kubernetes.io/instance: {{ .Release.Name }}
{{- end }}
{{/*
Create the name of the service account to use
*/}}
{{- define "mev-inspect.serviceAccountName" -}}
{{- if .Values.serviceAccount.create }}
{{- default (include "mev-inspect.fullname" .) .Values.serviceAccount.name }}
{{- else }}
{{- default "default" .Values.serviceAccount.name }}
{{- end }}
{{- end }}

View File

@ -0,0 +1,133 @@
apiVersion: apps/v1
kind: Deployment
metadata:
name: {{ include "mev-inspect.fullname" . }}
labels:
{{- include "mev-inspect.labels" . | nindent 4 }}
spec:
replicas: {{ .Values.replicaCount }}
selector:
matchLabels:
{{- include "mev-inspect.selectorLabels" . | nindent 6 }}
template:
metadata:
{{- with .Values.podAnnotations }}
annotations:
{{- toYaml . | nindent 8 }}
{{- end }}
labels:
{{- include "mev-inspect.selectorLabels" . | nindent 8 }}
spec:
{{- with .Values.imagePullSecrets }}
imagePullSecrets:
{{- toYaml . | nindent 8 }}
{{- end }}
securityContext:
{{- toYaml .Values.podSecurityContext | nindent 8 }}
containers:
- name: {{ .Chart.Name }}
securityContext:
{{- toYaml .Values.securityContext | nindent 12 }}
image: "{{ .Values.image.repository }}"
imagePullPolicy: {{ .Values.image.pullPolicy }}
args: ["run", "python", "loop.py"]
livenessProbe:
exec:
command:
- ls
- /
initialDelaySeconds: 20
periodSeconds: 10
timeoutSeconds: 5
resources:
{{- toYaml .Values.resources | nindent 12 }}
env:
- name: POSTGRES_HOST
valueFrom:
secretKeyRef:
name: mev-inspect-db-credentials
key: host
- name: POSTGRES_USER
valueFrom:
secretKeyRef:
name: mev-inspect-db-credentials
key: username
- name: POSTGRES_PASSWORD
valueFrom:
secretKeyRef:
name: mev-inspect-db-credentials
key: password
- name: REDIS_PASSWORD
valueFrom:
secretKeyRef:
name: redis
key: redis-password
- name: TRACE_DB_HOST
valueFrom:
secretKeyRef:
name: trace-db-credentials
key: host
optional: true
- name: TRACE_DB_USER
valueFrom:
secretKeyRef:
name: trace-db-credentials
key: username
optional: true
- name: TRACE_DB_PASSWORD
valueFrom:
secretKeyRef:
name: trace-db-credentials
key: password
optional: true
- name: RPC_URL
valueFrom:
configMapKeyRef:
name: mev-inspect-rpc
key: url
- name: LISTENER_HEALTHCHECK_URL
valueFrom:
configMapKeyRef:
name: mev-inspect-listener-healthcheck
key: url
optional: true
- name: EXPORT_BUCKET_NAME
valueFrom:
secretKeyRef:
name: mev-inspect-export
key: export-bucket-name
optional: true
- name: EXPORT_BUCKET_REGION
valueFrom:
secretKeyRef:
name: mev-inspect-export
key: export-bucket-region
optional: true
- name: EXPORT_AWS_ACCESS_KEY_ID
valueFrom:
secretKeyRef:
name: mev-inspect-export
key: export-aws-access-key-id
optional: true
- name: EXPORT_AWS_SECRET_ACCESS_KEY
valueFrom:
secretKeyRef:
name: mev-inspect-export
key: export-aws-secret-access-key
optional: true
{{- range .Values.extraEnv }}
- name: {{ .name }}
value: {{ .value }}
{{- end }}
{{- with .Values.nodeSelector }}
nodeSelector:
{{- toYaml . | nindent 8 }}
{{- end }}
{{- with .Values.affinity }}
affinity:
{{- toYaml . | nindent 8 }}
{{- end }}
{{- with .Values.tolerations }}
tolerations:
{{- toYaml . | nindent 8 }}
{{- end }}

View File

@ -0,0 +1,46 @@
# Default values for mev-inspect.
# This is a YAML-formatted file.
# Declare variables to be passed into your templates.
replicaCount: 1
image:
repository: mev-inspect-py:latest
pullPolicy: IfNotPresent
imagePullSecrets: []
nameOverride: ""
fullnameOverride: ""
podAnnotations: {}
podSecurityContext: {}
# fsGroup: 2000
securityContext:
allowPrivilegeEscalation: false
capabilities:
drop:
- ALL
#readOnlyRootFilesystem: true
runAsNonRoot: true
runAsUser: 1000
resources: {}
# We usually recommend not to specify default resources and to leave this as a conscious
# choice for the user. This also increases chances charts run on environments with little
# resources, such as Minikube. If you do want to specify resources, uncomment the following
# lines, adjust them as necessary, and remove the curly braces after 'resources:'.
# limits:
# cpu: 100m
# memory: 128Mi
# requests:
# cpu: 100m
# memory: 128Mi
nodeSelector: {}
tolerations: []
affinity: {}

54
listener Executable file
View File

@ -0,0 +1,54 @@
#!/bin/sh
set -e
NAME=listener
PIDFILE=/home/flashbot/$NAME.pid
DAEMON=/bin/bash
DAEMON_OPTS='-c "poetry run python listener.py"'
case "$1" in
start)
echo -n "Starting daemon: "$NAME
start-stop-daemon \
--background \
--chdir /app \
--chuid flashbot \
--start \
--quiet \
--pidfile $PIDFILE \
--make-pidfile \
--startas /bin/bash -- -c "poetry run python listener.py"
echo "."
;;
stop)
echo -n "Stopping daemon: "$NAME
start-stop-daemon --stop --quiet --oknodo --pidfile $PIDFILE
rm $PIDFILE
echo "."
;;
tail)
tail -f listener.log
;;
restart)
echo -n "Restarting daemon: "$NAME
start-stop-daemon --stop --quiet --oknodo --retry 30 --pidfile $PIDFILE
rm $PIDFILE
start-stop-daemon \
--background \
--chdir /app \
--chuid flashbot \
--start \
--quiet \
--pidfile $PIDFILE \
--make-pidfile \
--startas /bin/bash -- -c "poetry run python listener.py"
echo "."
;;
*)
echo "Usage: "$1" {start|stop|restart|tail}"
exit 1
esac
exit 0

126
listener.py Normal file
View File

@ -0,0 +1,126 @@
import asyncio
import logging
import os
import dramatiq
from aiohttp_retry import ExponentialRetry, RetryClient
from mev_inspect.block import get_latest_block_number
from mev_inspect.concurrency import coro
from mev_inspect.crud.latest_block_update import (
find_latest_block_update,
update_latest_block,
)
from mev_inspect.db import get_inspect_session, get_trace_session
from mev_inspect.inspector import MEVInspector
from mev_inspect.provider import get_base_provider
from mev_inspect.queue.broker import connect_broker
from mev_inspect.queue.tasks import (
HIGH_PRIORITY,
HIGH_PRIORITY_QUEUE,
realtime_export_task,
)
from mev_inspect.signal_handler import GracefulKiller
logging.basicConfig(filename="listener.log", filemode="a", level=logging.INFO)
logger = logging.getLogger(__name__)
# lag to make sure the blocks we see are settled
BLOCK_NUMBER_LAG = 5
@coro
async def run():
rpc = os.getenv("RPC_URL")
if rpc is None:
raise RuntimeError("Missing environment variable RPC_URL")
healthcheck_url = os.getenv("LISTENER_HEALTHCHECK_URL")
logger.info("Starting...")
killer = GracefulKiller()
inspect_db_session = get_inspect_session()
trace_db_session = get_trace_session()
broker = connect_broker()
export_actor = dramatiq.actor(
realtime_export_task,
broker=broker,
queue_name=HIGH_PRIORITY_QUEUE,
priority=HIGH_PRIORITY,
)
inspector = MEVInspector(rpc)
base_provider = get_base_provider(rpc)
while not killer.kill_now:
await inspect_next_block(
inspector,
inspect_db_session,
trace_db_session,
base_provider,
healthcheck_url,
export_actor,
)
logger.info("Stopping...")
async def inspect_next_block(
inspector: MEVInspector,
inspect_db_session,
trace_db_session,
base_provider,
healthcheck_url,
export_actor,
):
latest_block_number = await get_latest_block_number(base_provider)
last_written_block = find_latest_block_update(inspect_db_session)
logger.info(f"Latest block: {latest_block_number}")
logger.info(f"Last written block: {last_written_block}")
if last_written_block is None:
# maintain lag if no blocks written yet
last_written_block = latest_block_number - BLOCK_NUMBER_LAG - 1
if last_written_block < (latest_block_number - BLOCK_NUMBER_LAG):
block_number = last_written_block + 1
logger.info(f"Writing block: {block_number}")
await inspector.inspect_single_block(
inspect_db_session=inspect_db_session,
trace_db_session=trace_db_session,
block=block_number,
)
update_latest_block(inspect_db_session, block_number)
logger.info(f"Sending block {block_number} for export")
export_actor.send(block_number)
if healthcheck_url:
await ping_healthcheck_url(healthcheck_url)
else:
await asyncio.sleep(5)
async def ping_healthcheck_url(url):
retry_options = ExponentialRetry(attempts=3)
async with RetryClient(
raise_for_status=False, retry_options=retry_options
) as client:
async with client.get(url) as _response:
pass
if __name__ == "__main__":
try:
run()
except Exception as e:
logger.error(e)
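
To make the lag handling in inspect_next_block concrete, a small sketch with made-up block numbers (illustrative only, not part of listener.py):

    BLOCK_NUMBER_LAG = 5
    latest_block_number = 1_000_000
    last_written_block = None  # nothing written yet

    if last_written_block is None:
        # start just behind the lag window
        last_written_block = latest_block_number - BLOCK_NUMBER_LAG - 1  # 999_994

    if last_written_block < (latest_block_number - BLOCK_NUMBER_LAG):
        block_number = last_written_block + 1  # 999_995, i.e. head minus the 5-block lag
        print(f"Writing block: {block_number}")
    else:
        print("Within the lag window, wait for more blocks")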

21
loop.py Normal file
View File

@ -0,0 +1,21 @@
import logging
import time
from mev_inspect.signal_handler import GracefulKiller
logging.basicConfig(filename="loop.log", level=logging.INFO)
logger = logging.getLogger(__name__)
def run():
logger.info("Starting...")
killer = GracefulKiller()
while not killer.kill_now:
time.sleep(1)
logger.info("Stopping...")
if __name__ == "__main__":
run()

129
mev Executable file
View File

@ -0,0 +1,129 @@
#!/usr/bin/env bash
set -e
DB_NAME=mev_inspect
function get_kube_secret(){
kubectl get secrets $1 -o jsonpath="{.data.$2}" | base64 --decode
}
function get_kube_db_secret(){
kubectl get secrets mev-inspect-db-credentials -o jsonpath="{.data.$1}" | base64 --decode
}
function db(){
host=$(get_kube_secret "mev-inspect-db-credentials" "host")
username=$(get_kube_secret "mev-inspect-db-credentials" "username")
password=$(get_kube_secret "mev-inspect-db-credentials" "password")
kubectl run -i --rm --tty postgres-client-$RANDOM \
--env="PGPASSWORD=$password" \
--image=jbergknoff/postgresql-client \
-- $DB_NAME --host=$host --user=$username
}
function redis(){
echo "To continue, enter 'shift + r'"
redis_password=$(get_kube_secret "redis" "redis-password")
kubectl run -i --rm --tty \
--namespace default redis-client-$RANDOM \
--env REDIS_PASSWORD=$redis_password \
--image docker.io/bitnami/redis:6.2.6-debian-10-r0 \
--command -- redis-cli -h redis-master -a $redis_password
}
case "$1" in
db)
echo "Connecting to $DB_NAME"
db
;;
redis)
echo "Connecting to redis"
redis
;;
listener)
kubectl exec -ti deploy/mev-inspect -- ./listener $2
;;
block-list)
echo "Backfilling blocks from stdin"
kubectl exec -i deploy/mev-inspect -- poetry run enqueue-block-list
;;
backfill)
after_block_number=$2
before_block_number=$3
echo "Backfilling from $after_block_number to $before_block_number"
kubectl exec -ti deploy/mev-inspect -- poetry run enqueue-many-blocks $after_block_number $before_block_number
;;
inspect)
block_number=$2
echo "Inspecting block $block_number"
kubectl exec -ti deploy/mev-inspect -- poetry run inspect-block $block_number
;;
inspect-many)
after_block_number=$2
before_block_number=$3
echo "Inspecting from block $after_block_number to $before_block_number"
kubectl exec -ti deploy/mev-inspect -- \
poetry run inspect-many-blocks $after_block_number $before_block_number
;;
test)
shift
echo "Running tests"
kubectl exec -ti deploy/mev-inspect -- poetry run pytest tests $@
;;
fetch)
block_number=$2
echo "Fetching block $block_number"
kubectl exec -ti deploy/mev-inspect -- poetry run fetch-block $block_number
;;
prices)
shift
case "$1" in
fetch-all)
echo "Running price fetch-all"
kubectl exec -ti deploy/mev-inspect -- \
poetry run fetch-all-prices
;;
fetch-range)
after=$2
before=$3
echo "Running price fetch-range"
kubectl exec -ti deploy/mev-inspect -- \
poetry run fetch-range $after $before
;;
*)
echo "prices usage: "$1" {fetch-all}"
exit 1
esac
;;
backfill-export)
after_block=$2
before_block=$3
echo "Sending $after_block to $before_block export to queue"
kubectl exec -ti deploy/mev-inspect -- poetry run enqueue-many-s3-exports $after_block $before_block
;;
enqueue-s3-export)
block_number=$2
echo "Sending $block_number export to queue"
kubectl exec -ti deploy/mev-inspect -- poetry run enqueue-s3-export $block_number
;;
s3-export)
block_number=$2
echo "Exporting $block_number"
kubectl exec -ti deploy/mev-inspect -- poetry run s3-export $block_number
;;
exec)
shift
kubectl exec -ti deploy/mev-inspect -- $@
;;
*)
echo "Usage: "$1" {db|backfill|inspect|test}"
exit 1
esac
exit 0

View File

@ -4,23 +4,39 @@ from typing import Optional
from pydantic import parse_obj_as
from mev_inspect.schemas import ABI
from mev_inspect.schemas.classified_traces import Protocol
from mev_inspect.schemas.abi import ABI
from mev_inspect.schemas.traces import Protocol
THIS_FILE_DIRECTORY = Path(__file__).parents[0]
ABI_DIRECTORY_PATH = THIS_FILE_DIRECTORY / "abis"
def get_abi(abi_name: str, protocol: Optional[Protocol]) -> Optional[ABI]:
def get_abi_path(abi_name: str, protocol: Optional[Protocol]) -> Optional[Path]:
abi_filename = f"{abi_name}.json"
abi_path = (
ABI_DIRECTORY_PATH / abi_filename
if protocol is None
else ABI_DIRECTORY_PATH / protocol.value / abi_filename
)
if abi_path.is_file():
return abi_path
return None
# raw abi, for instantiating contract for queries (as opposed to classification, see below)
def get_raw_abi(abi_name: str, protocol: Optional[Protocol]) -> Optional[str]:
abi_path = get_abi_path(abi_name, protocol)
if abi_path is not None:
with abi_path.open() as abi_file:
return abi_file.read()
return None
def get_abi(abi_name: str, protocol: Optional[Protocol]) -> Optional[ABI]:
abi_path = get_abi_path(abi_name, protocol)
if abi_path is not None:
with abi_path.open() as abi_file:
abi_json = json.load(abi_file)
return parse_obj_as(ABI, abi_json)

View File

@ -0,0 +1,615 @@
[
{
"anonymous": false,
"inputs": [
{
"indexed": true,
"internalType": "address",
"name": "owner",
"type": "address"
},
{
"indexed": true,
"internalType": "address",
"name": "spender",
"type": "address"
},
{
"indexed": false,
"internalType": "uint256",
"name": "value",
"type": "uint256"
}
],
"name": "Approval",
"type": "event"
},
{
"anonymous": false,
"inputs": [
{
"indexed": true,
"internalType": "address",
"name": "from",
"type": "address"
},
{
"indexed": true,
"internalType": "address",
"name": "to",
"type": "address"
},
{
"indexed": false,
"internalType": "uint256",
"name": "value",
"type": "uint256"
},
{
"indexed": false,
"internalType": "uint256",
"name": "index",
"type": "uint256"
}
],
"name": "BalanceTransfer",
"type": "event"
},
{
"anonymous": false,
"inputs": [
{
"indexed": true,
"internalType": "address",
"name": "from",
"type": "address"
},
{
"indexed": true,
"internalType": "address",
"name": "target",
"type": "address"
},
{
"indexed": false,
"internalType": "uint256",
"name": "value",
"type": "uint256"
},
{
"indexed": false,
"internalType": "uint256",
"name": "index",
"type": "uint256"
}
],
"name": "Burn",
"type": "event"
},
{
"anonymous": false,
"inputs": [
{
"indexed": true,
"internalType": "address",
"name": "underlyingAsset",
"type": "address"
},
{
"indexed": true,
"internalType": "address",
"name": "pool",
"type": "address"
},
{
"indexed": false,
"internalType": "address",
"name": "treasury",
"type": "address"
},
{
"indexed": false,
"internalType": "address",
"name": "incentivesController",
"type": "address"
},
{
"indexed": false,
"internalType": "uint8",
"name": "aTokenDecimals",
"type": "uint8"
},
{
"indexed": false,
"internalType": "string",
"name": "aTokenName",
"type": "string"
},
{
"indexed": false,
"internalType": "string",
"name": "aTokenSymbol",
"type": "string"
},
{
"indexed": false,
"internalType": "bytes",
"name": "params",
"type": "bytes"
}
],
"name": "Initialized",
"type": "event"
},
{
"anonymous": false,
"inputs": [
{
"indexed": true,
"internalType": "address",
"name": "from",
"type": "address"
},
{
"indexed": false,
"internalType": "uint256",
"name": "value",
"type": "uint256"
},
{
"indexed": false,
"internalType": "uint256",
"name": "index",
"type": "uint256"
}
],
"name": "Mint",
"type": "event"
},
{
"anonymous": false,
"inputs": [
{
"indexed": true,
"internalType": "address",
"name": "from",
"type": "address"
},
{
"indexed": true,
"internalType": "address",
"name": "to",
"type": "address"
},
{
"indexed": false,
"internalType": "uint256",
"name": "value",
"type": "uint256"
}
],
"name": "Transfer",
"type": "event"
},
{
"inputs": [
],
"name": "UNDERLYING_ASSET_ADDRESS",
"outputs": [
{
"internalType": "address",
"name": "",
"type": "address"
}
],
"stateMutability": "view",
"type": "function"
},
{
"inputs": [
{
"internalType": "address",
"name": "owner",
"type": "address"
},
{
"internalType": "address",
"name": "spender",
"type": "address"
}
],
"name": "allowance",
"outputs": [
{
"internalType": "uint256",
"name": "",
"type": "uint256"
}
],
"stateMutability": "view",
"type": "function"
},
{
"inputs": [
{
"internalType": "address",
"name": "spender",
"type": "address"
},
{
"internalType": "uint256",
"name": "amount",
"type": "uint256"
}
],
"name": "approve",
"outputs": [
{
"internalType": "bool",
"name": "",
"type": "bool"
}
],
"stateMutability": "nonpayable",
"type": "function"
},
{
"inputs": [
{
"internalType": "address",
"name": "account",
"type": "address"
}
],
"name": "balanceOf",
"outputs": [
{
"internalType": "uint256",
"name": "",
"type": "uint256"
}
],
"stateMutability": "view",
"type": "function"
},
{
"inputs": [
{
"internalType": "address",
"name": "user",
"type": "address"
},
{
"internalType": "address",
"name": "receiverOfUnderlying",
"type": "address"
},
{
"internalType": "uint256",
"name": "amount",
"type": "uint256"
},
{
"internalType": "uint256",
"name": "index",
"type": "uint256"
}
],
"name": "burn",
"outputs": [
],
"stateMutability": "nonpayable",
"type": "function"
},
{
"inputs": [
],
"name": "getIncentivesController",
"outputs": [
{
"internalType": "contract IAaveIncentivesController",
"name": "",
"type": "address"
}
],
"stateMutability": "view",
"type": "function"
},
{
"inputs": [
{
"internalType": "address",
"name": "user",
"type": "address"
}
],
"name": "getScaledUserBalanceAndSupply",
"outputs": [
{
"internalType": "uint256",
"name": "",
"type": "uint256"
},
{
"internalType": "uint256",
"name": "",
"type": "uint256"
}
],
"stateMutability": "view",
"type": "function"
},
{
"inputs": [
{
"internalType": "address",
"name": "user",
"type": "address"
},
{
"internalType": "uint256",
"name": "amount",
"type": "uint256"
}
],
"name": "handleRepayment",
"outputs": [
],
"stateMutability": "nonpayable",
"type": "function"
},
{
"inputs": [
{
"internalType": "contract ILendingPool",
"name": "pool",
"type": "address"
},
{
"internalType": "address",
"name": "treasury",
"type": "address"
},
{
"internalType": "address",
"name": "underlyingAsset",
"type": "address"
},
{
"internalType": "contract IAaveIncentivesController",
"name": "incentivesController",
"type": "address"
},
{
"internalType": "uint8",
"name": "aTokenDecimals",
"type": "uint8"
},
{
"internalType": "string",
"name": "aTokenName",
"type": "string"
},
{
"internalType": "string",
"name": "aTokenSymbol",
"type": "string"
},
{
"internalType": "bytes",
"name": "params",
"type": "bytes"
}
],
"name": "initialize",
"outputs": [
],
"stateMutability": "nonpayable",
"type": "function"
},
{
"inputs": [
{
"internalType": "address",
"name": "user",
"type": "address"
},
{
"internalType": "uint256",
"name": "amount",
"type": "uint256"
},
{
"internalType": "uint256",
"name": "index",
"type": "uint256"
}
],
"name": "mint",
"outputs": [
{
"internalType": "bool",
"name": "",
"type": "bool"
}
],
"stateMutability": "nonpayable",
"type": "function"
},
{
"inputs": [
{
"internalType": "uint256",
"name": "amount",
"type": "uint256"
},
{
"internalType": "uint256",
"name": "index",
"type": "uint256"
}
],
"name": "mintToTreasury",
"outputs": [
],
"stateMutability": "nonpayable",
"type": "function"
},
{
"inputs": [
{
"internalType": "address",
"name": "user",
"type": "address"
}
],
"name": "scaledBalanceOf",
"outputs": [
{
"internalType": "uint256",
"name": "",
"type": "uint256"
}
],
"stateMutability": "view",
"type": "function"
},
{
"inputs": [
],
"name": "scaledTotalSupply",
"outputs": [
{
"internalType": "uint256",
"name": "",
"type": "uint256"
}
],
"stateMutability": "view",
"type": "function"
},
{
"inputs": [
],
"name": "totalSupply",
"outputs": [
{
"internalType": "uint256",
"name": "",
"type": "uint256"
}
],
"stateMutability": "view",
"type": "function"
},
{
"inputs": [
{
"internalType": "address",
"name": "recipient",
"type": "address"
},
{
"internalType": "uint256",
"name": "amount",
"type": "uint256"
}
],
"name": "transfer",
"outputs": [
{
"internalType": "bool",
"name": "",
"type": "bool"
}
],
"stateMutability": "nonpayable",
"type": "function"
},
{
"inputs": [
{
"internalType": "address",
"name": "sender",
"type": "address"
},
{
"internalType": "address",
"name": "recipient",
"type": "address"
},
{
"internalType": "uint256",
"name": "amount",
"type": "uint256"
}
],
"name": "transferFrom",
"outputs": [
{
"internalType": "bool",
"name": "",
"type": "bool"
}
],
"stateMutability": "nonpayable",
"type": "function"
},
{
"inputs": [
{
"internalType": "address",
"name": "from",
"type": "address"
},
{
"internalType": "address",
"name": "to",
"type": "address"
},
{
"internalType": "uint256",
"name": "value",
"type": "uint256"
}
],
"name": "transferOnLiquidation",
"outputs": [
],
"stateMutability": "nonpayable",
"type": "function"
},
{
"inputs": [
{
"internalType": "address",
"name": "user",
"type": "address"
},
{
"internalType": "uint256",
"name": "amount",
"type": "uint256"
}
],
"name": "transferUnderlyingTo",
"outputs": [
{
"internalType": "uint256",
"name": "",
"type": "uint256"
}
],
"stateMutability": "nonpayable",
"type": "function"
}
]

File diff suppressed because one or more lines are too long (11 files)

View File

@ -1,8 +1,11 @@
from itertools import groupby
from typing import List, Optional
from typing import List, Optional, Tuple
from mev_inspect.schemas.arbitrages import Arbitrage
from mev_inspect.schemas.swaps import Swap
from mev_inspect.utils import equal_within_percent
MAX_TOKEN_AMOUNT_PERCENT_DIFFERENCE = 0.01
def get_arbitrages(swaps: List[Swap]) -> List[Arbitrage]:
@ -23,70 +26,168 @@ def get_arbitrages(swaps: List[Swap]) -> List[Arbitrage]:
def _get_arbitrages_from_swaps(swaps: List[Swap]) -> List[Arbitrage]:
pool_addresses = {swap.pool_address for swap in swaps}
"""
An arbitrage is defined as multiple swaps in a series that result in the initial token being returned
to the initial sender address.
There are 2 types of swaps that are most common (99%+).
Case I (fully routed):
BOT -> A/B -> B/C -> C/A -> BOT
Case II (always return to bot):
BOT -> A/B -> BOT -> B/C -> BOT -> A/C -> BOT
There is only one correct way to route Case I, but for Case II the following valid routes could all be found:
A->B->C->A / B->C->A->B / C->A->B->C. Thus, when multiple valid routes are found, we filter to the ones that
happen in valid trace order.
"""
all_arbitrages = []
for index, first_swap in enumerate(swaps):
other_swaps = swaps[:index] + swaps[index + 1 :]
start_ends = _get_all_start_end_swaps(swaps)
if len(start_ends) == 0:
return []
if first_swap.from_address not in pool_addresses:
arbitrage = _get_arbitrage_starting_with_swap(first_swap, other_swaps)
used_swaps: List[Swap] = []
if arbitrage is not None:
all_arbitrages.append(arbitrage)
for (start, ends) in start_ends:
if start in used_swaps:
continue
return all_arbitrages
unused_ends = [end for end in ends if end not in used_swaps]
route = _get_shortest_route(start, unused_ends, swaps)
def _get_arbitrage_starting_with_swap(
start_swap: Swap,
other_swaps: List[Swap],
) -> Optional[Arbitrage]:
swap_path = [start_swap]
current_swap: Swap = start_swap
while True:
next_swap = _get_swap_from_address(
current_swap.to_address,
current_swap.token_out_address,
other_swaps,
)
if next_swap is None:
return None
swap_path.append(next_swap)
current_swap = next_swap
if (
current_swap.to_address == start_swap.from_address
and current_swap.token_out_address == start_swap.token_in_address
):
start_amount = start_swap.token_in_amount
end_amount = current_swap.token_out_amount
if route is not None:
start_amount = route[0].token_in_amount
end_amount = route[-1].token_out_amount
profit_amount = end_amount - start_amount
error = None
for swap in route:
if swap.error is not None:
error = swap.error
return Arbitrage(
swaps=swap_path,
block_number=start_swap.block_number,
transaction_hash=start_swap.transaction_hash,
account_address=start_swap.from_address,
profit_token_address=start_swap.token_in_address,
arb = Arbitrage(
swaps=route,
block_number=route[0].block_number,
transaction_hash=route[0].transaction_hash,
account_address=route[0].from_address,
profit_token_address=route[0].token_in_address,
start_amount=start_amount,
end_amount=end_amount,
profit_amount=profit_amount,
error=error,
)
return None
all_arbitrages.append(arb)
used_swaps.extend(route)
if len(all_arbitrages) == 1:
return all_arbitrages
else:
return [
arb
for arb in all_arbitrages
if (arb.swaps[0].trace_address < arb.swaps[-1].trace_address)
]
def _get_swap_from_address(
address: str, token_address: str, swaps: List[Swap]
) -> Optional[Swap]:
for swap in swaps:
if swap.pool_address == address and swap.token_in_address == token_address:
return swap
def _get_shortest_route(
start_swap: Swap,
end_swaps: List[Swap],
all_swaps: List[Swap],
max_route_length: Optional[int] = None,
) -> Optional[List[Swap]]:
if len(end_swaps) == 0:
return None
return None
if max_route_length is not None and max_route_length < 2:
return None
for end_swap in end_swaps:
if _swap_outs_match_swap_ins(start_swap, end_swap):
return [start_swap, end_swap]
if max_route_length is not None and max_route_length == 2:
return None
other_swaps = [
swap for swap in all_swaps if (swap is not start_swap and swap not in end_swaps)
]
if len(other_swaps) == 0:
return None
shortest_remaining_route = None
max_remaining_route_length = (
None if max_route_length is None else max_route_length - 1
)
for next_swap in other_swaps:
if _swap_outs_match_swap_ins(start_swap, next_swap):
shortest_from_next = _get_shortest_route(
next_swap,
end_swaps,
other_swaps,
max_route_length=max_remaining_route_length,
)
if shortest_from_next is not None and (
shortest_remaining_route is None
or len(shortest_from_next) < len(shortest_remaining_route)
):
shortest_remaining_route = shortest_from_next
max_remaining_route_length = len(shortest_from_next) - 1
if shortest_remaining_route is None:
return None
else:
return [start_swap] + shortest_remaining_route
def _get_all_start_end_swaps(swaps: List[Swap]) -> List[Tuple[Swap, List[Swap]]]:
"""
Gets the set of all possible openings and corresponding closing swaps for an arbitrage via
- swap[start].token_in == swap[end].token_out
- swap[start].from_address == swap[end].to_address
- not swap[start].from_address in all_pool_addresses
- not swap[end].to_address in all_pool_addresses
"""
pool_addrs = [swap.contract_address for swap in swaps]
valid_start_ends: List[Tuple[Swap, List[Swap]]] = []
for index, potential_start_swap in enumerate(swaps):
ends_for_start: List[Swap] = []
remaining_swaps = swaps[:index] + swaps[index + 1 :]
for potential_end_swap in remaining_swaps:
if (
potential_start_swap.token_in_address
== potential_end_swap.token_out_address
and potential_start_swap.contract_address
!= potential_end_swap.contract_address
and potential_start_swap.from_address == potential_end_swap.to_address
and not potential_start_swap.from_address in pool_addrs
):
ends_for_start.append(potential_end_swap)
if len(ends_for_start) > 0:
valid_start_ends.append((potential_start_swap, ends_for_start))
return valid_start_ends
def _swap_outs_match_swap_ins(swap_out, swap_in) -> bool:
return (
swap_out.token_out_address == swap_in.token_in_address
and (
swap_out.contract_address == swap_in.from_address
or swap_out.to_address == swap_in.contract_address
or swap_out.to_address == swap_in.from_address
)
and equal_within_percent(
swap_out.token_out_amount,
swap_in.token_in_amount,
MAX_TOKEN_AMOUNT_PERCENT_DIFFERENCE,
)
)
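
The route matching above links one swap's output to the next swap's input via equal_within_percent and the 1% MAX_TOKEN_AMOUNT_PERCENT_DIFFERENCE threshold. A self-contained sketch of that check, assuming a simple relative-difference formula rather than quoting mev_inspect.utils verbatim:

# Sketch of the relative-difference check assumed to back equal_within_percent;
# not copied from the repository.
def equal_within_percent(first_value: int, second_value: int, threshold_percent: float) -> bool:
    difference = abs(first_value - second_value) / max(first_value, second_value)
    return difference < threshold_percent

# Linking one swap's output to the next swap's input tolerates up to 1% slippage/fees:
assert equal_within_percent(1_000_000, 999_100, 0.01)       # 0.09% apart -> same leg
assert not equal_within_percent(1_000_000, 900_000, 0.01)   # 10% apart -> different legs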

View File

@@ -1,58 +1,208 @@
from pathlib import Path
from typing import List
import asyncio
import logging
from typing import List, Optional
from sqlalchemy import orm
from web3 import Web3
from mev_inspect.fees import fetch_base_fee_per_gas
from mev_inspect.schemas import Block, Trace, TraceType
from mev_inspect.schemas.blocks import Block
from mev_inspect.schemas.receipts import Receipt
from mev_inspect.schemas.traces import Trace, TraceType
from mev_inspect.utils import hex_to_int
logger = logging.getLogger(__name__)
cache_directory = "./cache"
async def get_latest_block_number(base_provider) -> int:
latest_block = await base_provider.make_request(
"eth_getBlockByNumber",
["latest", False],
)
return hex_to_int(latest_block["result"]["number"])
def create_from_block_number(
base_provider, w3: Web3, block_number: int, should_cache: bool
async def create_from_block_number(
w3: Web3,
block_number: int,
trace_db_session: Optional[orm.Session],
) -> Block:
if not should_cache:
return fetch_block(w3, base_provider, block_number)
cache_path = _get_cache_path(block_number)
if cache_path.is_file():
print(f"Cache for block {block_number} exists, " "loading data from cache")
return Block.parse_file(cache_path)
else:
print(f"Cache for block {block_number} did not exist, getting data")
block = fetch_block(w3, base_provider, block_number)
cache_block(cache_path, block)
return block
def fetch_block(w3, base_provider, block_number: int) -> Block:
block_json = w3.eth.get_block(block_number)
receipts_json = base_provider.make_request("eth_getBlockReceipts", [block_number])
traces_json = w3.parity.trace_block(block_number)
receipts: List[Receipt] = [
Receipt(**receipt) for receipt in receipts_json["result"]
]
traces = [Trace(**trace_json) for trace_json in traces_json]
base_fee_per_gas = fetch_base_fee_per_gas(w3, block_number)
block_timestamp, receipts, traces, base_fee_per_gas = await asyncio.gather(
_find_or_fetch_block_timestamp(w3, block_number, trace_db_session),
_find_or_fetch_block_receipts(w3, block_number, trace_db_session),
_find_or_fetch_block_traces(w3, block_number, trace_db_session),
_find_or_fetch_base_fee_per_gas(w3, block_number, trace_db_session),
)
miner_address = await _find_or_fetch_miner_address(w3, block_number, traces)
return Block(
block_number=block_number,
miner=block_json["miner"],
block_timestamp=block_timestamp,
miner=miner_address,
base_fee_per_gas=base_fee_per_gas,
traces=traces,
receipts=receipts,
)
async def _find_or_fetch_block_timestamp(
w3,
block_number: int,
trace_db_session: Optional[orm.Session],
) -> int:
if trace_db_session is not None:
existing_block_timestamp = _find_block_timestamp(trace_db_session, block_number)
if existing_block_timestamp is not None:
return existing_block_timestamp
return await _fetch_block_timestamp(w3, block_number)
async def _find_or_fetch_block_receipts(
w3,
block_number: int,
trace_db_session: Optional[orm.Session],
) -> List[Receipt]:
if trace_db_session is not None:
existing_block_receipts = _find_block_receipts(trace_db_session, block_number)
if existing_block_receipts is not None:
return existing_block_receipts
return await _fetch_block_receipts(w3, block_number)
async def _find_or_fetch_block_traces(
w3,
block_number: int,
trace_db_session: Optional[orm.Session],
) -> List[Trace]:
if trace_db_session is not None:
existing_block_traces = _find_block_traces(trace_db_session, block_number)
if existing_block_traces is not None:
return existing_block_traces
return await _fetch_block_traces(w3, block_number)
async def _find_or_fetch_base_fee_per_gas(
w3,
block_number: int,
trace_db_session: Optional[orm.Session],
) -> int:
if trace_db_session is not None:
existing_base_fee_per_gas = _find_base_fee_per_gas(
trace_db_session, block_number
)
if existing_base_fee_per_gas is not None:
return existing_base_fee_per_gas
return await fetch_base_fee_per_gas(w3, block_number)
async def _fetch_block_timestamp(w3, block_number: int) -> int:
block_json = await w3.eth.get_block(block_number)
return block_json["timestamp"]
async def _fetch_block_receipts(w3, block_number: int) -> List[Receipt]:
receipts_json = await w3.eth.get_block_receipts(block_number)
return [Receipt(**receipt) for receipt in receipts_json]
async def _fetch_block_traces(w3, block_number: int) -> List[Trace]:
traces_json = await w3.eth.trace_block(block_number)
return [Trace(**trace_json) for trace_json in traces_json]
def _find_block_timestamp(
trace_db_session: orm.Session,
block_number: int,
) -> Optional[int]:
result = trace_db_session.execute(
"SELECT block_timestamp FROM block_timestamps WHERE block_number = :block_number",
params={"block_number": block_number},
).one_or_none()
if result is None:
return None
else:
(block_timestamp,) = result
return block_timestamp
def _find_block_traces(
trace_db_session: orm.Session,
block_number: int,
) -> Optional[List[Trace]]:
result = trace_db_session.execute(
"SELECT raw_traces FROM block_traces WHERE block_number = :block_number",
params={"block_number": block_number},
).one_or_none()
if result is None:
return None
else:
(traces_json,) = result
return [Trace(**trace_json) for trace_json in traces_json]
def _find_block_receipts(
trace_db_session: orm.Session,
block_number: int,
) -> Optional[List[Receipt]]:
result = trace_db_session.execute(
"SELECT raw_receipts FROM block_receipts WHERE block_number = :block_number",
params={"block_number": block_number},
).one_or_none()
if result is None:
return None
else:
(receipts_json,) = result
return [Receipt(**receipt) for receipt in receipts_json]
def _find_base_fee_per_gas(
trace_db_session: orm.Session,
block_number: int,
) -> Optional[int]:
result = trace_db_session.execute(
"SELECT base_fee_in_wei FROM base_fee WHERE block_number = :block_number",
params={"block_number": block_number},
).one_or_none()
if result is None:
return None
else:
(base_fee,) = result
return base_fee
async def _find_or_fetch_miner_address(
w3,
block_number: int,
traces: List[Trace],
) -> Optional[str]:
# eth1 blocks
miner_address = _get_miner_address_from_traces(traces)
if miner_address is not None:
return miner_address
return await _fetch_miner_eth2(w3, block_number)
async def _fetch_miner_eth2(w3, block_number: int) -> Optional[str]:
block_json = await w3.eth.get_block(block_number)
return block_json["miner"]
def _get_miner_address_from_traces(traces: List[Trace]) -> Optional[str]:
for trace in traces:
if trace.type == TraceType.reward:
return trace.action["author"]
return None
def get_transaction_hashes(calls: List[Trace]) -> List[str]:
result = []
@@ -65,15 +215,3 @@ def get_transaction_hashes(calls: List[Trace]) -> List[str]:
result.append(call.transaction_hash)
return result
def cache_block(cache_path: Path, block: Block):
write_mode = "w" if cache_path.is_file() else "x"
with open(cache_path, mode=write_mode) as cache_file:
cache_file.write(block.json())
def _get_cache_path(block_number: int) -> Path:
cache_directory_path = Path(cache_directory)
return cache_directory_path / f"{block_number}-new.json"
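
Each _find_or_fetch_* helper above follows the same shape: consult the optional trace DB session first, and hit the node only on a miss. A generic, purely illustrative sketch of that pattern (not code from this diff):

# Generic find-or-fetch sketch: return the cached value when the lookup hits,
# otherwise await the network fetch.
import asyncio
from typing import Awaitable, Callable, Optional, TypeVar

T = TypeVar("T")

async def find_or_fetch(
    find: Callable[[], Optional[T]],
    fetch: Callable[[], Awaitable[T]],
) -> T:
    existing = find()
    if existing is not None:
        return existing
    return await fetch()

async def _demo() -> None:
    async def fetch_timestamp() -> int:
        return 1_640_000_000  # pretend RPC result

    # Cache miss -> fetch; cache hit -> no network call needed.
    assert await find_or_fetch(lambda: None, fetch_timestamp) == 1_640_000_000
    assert await find_or_fetch(lambda: 123, fetch_timestamp) == 123

asyncio.run(_demo())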

View File

@@ -0,0 +1,208 @@
from typing import List, Optional, Sequence
from mev_inspect.schemas.nft_trades import NftTrade
from mev_inspect.schemas.prices import ETH_TOKEN_ADDRESS
from mev_inspect.schemas.swaps import Swap
from mev_inspect.schemas.traces import ClassifiedTrace, DecodedCallTrace
from mev_inspect.schemas.transfers import Transfer
def create_nft_trade_from_transfers(
trace: DecodedCallTrace,
child_transfers: List[Transfer],
collection_address: str,
seller_address: str,
buyer_address: str,
exchange_wallet_address: str,
) -> Optional[NftTrade]:
transfers_to_buyer = _filter_transfers(child_transfers, to_address=buyer_address)
transfers_to_seller = _filter_transfers(child_transfers, to_address=seller_address)
if len(transfers_to_buyer) != 1 or len(transfers_to_seller) != 1:
return None
if transfers_to_buyer[0].token_address != collection_address:
return None
payment_token_address = transfers_to_seller[0].token_address
payment_amount = transfers_to_seller[0].amount
token_id = transfers_to_buyer[0].amount
transfers_from_seller_to_exchange = _filter_transfers(
child_transfers,
from_address=seller_address,
to_address=exchange_wallet_address,
)
transfers_from_buyer_to_exchange = _filter_transfers(
child_transfers,
from_address=buyer_address,
to_address=exchange_wallet_address,
)
for fee in [
*transfers_from_seller_to_exchange,
*transfers_from_buyer_to_exchange,
]:
# Assumes that exchange fees are paid with the same token as the sale
payment_amount -= fee.amount
return NftTrade(
abi_name=trace.abi_name,
transaction_hash=trace.transaction_hash,
transaction_position=trace.transaction_position,
block_number=trace.block_number,
trace_address=trace.trace_address,
protocol=trace.protocol,
error=trace.error,
seller_address=seller_address,
buyer_address=buyer_address,
payment_token_address=payment_token_address,
payment_amount=payment_amount,
collection_address=collection_address,
token_id=token_id,
)
def create_swap_from_pool_transfers(
trace: DecodedCallTrace,
recipient_address: str,
prior_transfers: List[Transfer],
child_transfers: List[Transfer],
) -> Optional[Swap]:
pool_address = trace.to_address
transfers_to_pool = []
if trace.value is not None and trace.value > 0:
transfers_to_pool = [_build_eth_transfer(trace)]
if len(transfers_to_pool) == 0:
transfers_to_pool = _filter_transfers(prior_transfers, to_address=pool_address)
if len(transfers_to_pool) == 0:
transfers_to_pool = _filter_transfers(child_transfers, to_address=pool_address)
if len(transfers_to_pool) == 0:
return None
transfers_from_pool_to_recipient = _filter_transfers(
child_transfers, to_address=recipient_address, from_address=pool_address
)
if len(transfers_from_pool_to_recipient) != 1:
return None
transfer_in = transfers_to_pool[-1]
transfer_out = transfers_from_pool_to_recipient[0]
if transfer_in.token_address == transfer_out.token_address:
return None
return Swap(
abi_name=trace.abi_name,
transaction_hash=trace.transaction_hash,
transaction_position=trace.transaction_position,
block_number=trace.block_number,
trace_address=trace.trace_address,
contract_address=pool_address,
protocol=trace.protocol,
from_address=transfer_in.from_address,
to_address=transfer_out.to_address,
token_in_address=transfer_in.token_address,
token_in_amount=transfer_in.amount,
token_out_address=transfer_out.token_address,
token_out_amount=transfer_out.amount,
error=trace.error,
)
def create_swap_from_recipient_transfers(
trace: DecodedCallTrace,
pool_address: str,
recipient_address: str,
prior_transfers: List[Transfer],
child_transfers: List[Transfer],
) -> Optional[Swap]:
transfers_from_recipient = _filter_transfers(
[*prior_transfers, *child_transfers], from_address=recipient_address
)
transfers_to_recipient = _filter_transfers(
child_transfers, to_address=recipient_address
)
if len(transfers_from_recipient) != 1 or len(transfers_to_recipient) != 1:
return None
transfer_in = transfers_from_recipient[0]
transfer_out = transfers_to_recipient[0]
return Swap(
abi_name=trace.abi_name,
transaction_hash=trace.transaction_hash,
transaction_position=trace.transaction_position,
block_number=trace.block_number,
trace_address=trace.trace_address,
contract_address=pool_address,
protocol=trace.protocol,
from_address=transfer_in.from_address,
to_address=transfer_out.to_address,
token_in_address=transfer_in.token_address,
token_in_amount=transfer_in.amount,
token_out_address=transfer_out.token_address,
token_out_amount=transfer_out.amount,
error=trace.error,
)
def _build_eth_transfer(trace: ClassifiedTrace) -> Transfer:
return Transfer(
block_number=trace.block_number,
transaction_hash=trace.transaction_hash,
trace_address=trace.trace_address,
amount=trace.value,
to_address=trace.to_address,
from_address=trace.from_address,
token_address=ETH_TOKEN_ADDRESS,
)
def _filter_transfers(
transfers: Sequence[Transfer],
to_address: Optional[str] = None,
from_address: Optional[str] = None,
) -> List[Transfer]:
filtered_transfers = []
for transfer in transfers:
if to_address is not None and transfer.to_address != to_address:
continue
if from_address is not None and transfer.from_address != from_address:
continue
filtered_transfers.append(transfer)
return filtered_transfers
def get_received_transfer(
liquidator: str, child_transfers: List[Transfer]
) -> Optional[Transfer]:
"""Get transfer from AAVE to liquidator"""
for transfer in child_transfers:
if transfer.to_address == liquidator:
return transfer
return None
def get_debt_transfer(
liquidator: str, child_transfers: List[Transfer]
) -> Optional[Transfer]:
"""Get transfer from liquidator to AAVE"""
for transfer in child_transfers:
if transfer.from_address == liquidator:
return transfer
return None
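
get_debt_transfer and get_received_transfer split the liquidator's two legs out of the child transfers: the debt repayment leaves the liquidator and the seized collateral arrives at it. A self-contained toy of that split, using a simplified stand-in for the Transfer schema and hypothetical addresses:

# Toy illustration only: _MiniTransfer is a simplified stand-in for
# mev_inspect.schemas.transfers.Transfer, and the addresses are hypothetical.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class _MiniTransfer:
    from_address: str
    to_address: str
    amount: int

def debt_leg(liquidator: str, transfers: List[_MiniTransfer]) -> Optional[_MiniTransfer]:
    return next((t for t in transfers if t.from_address == liquidator), None)

def received_leg(liquidator: str, transfers: List[_MiniTransfer]) -> Optional[_MiniTransfer]:
    return next((t for t in transfers if t.to_address == liquidator), None)

liquidator = "0xliquidator"
child_transfers = [
    _MiniTransfer(liquidator, "0xlending_pool", 1_000),   # debt repaid by the liquidator
    _MiniTransfer("0xlending_pool", liquidator, 1_050),   # collateral received, incl. bonus
]
assert debt_leg(liquidator, child_transfers).amount == 1_000
assert received_leg(liquidator, child_transfers).amount == 1_050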

View File

@@ -1,11 +1,21 @@
from typing import Dict, Optional, Tuple, Type
from mev_inspect.schemas.classifiers import Classifier, ClassifierSpec
from mev_inspect.schemas.traces import DecodedCallTrace, Protocol
from .aave import AAVE_CLASSIFIER_SPECS
from .balancer import BALANCER_CLASSIFIER_SPECS
from .bancor import BANCOR_CLASSIFIER_SPECS
from .compound import COMPOUND_CLASSIFIER_SPECS
from .cream import CREAM_CLASSIFIER_SPECS
from .cryptopunks import CRYPTOPUNKS_CLASSIFIER_SPECS
from .curve import CURVE_CLASSIFIER_SPECS
from .erc20 import ERC20_CLASSIFIER_SPECS
from .opensea import OPENSEA_CLASSIFIER_SPECS
from .uniswap import UNISWAP_CLASSIFIER_SPECS
from .weth import WETH_CLASSIFIER_SPECS
from .zero_ex import ZEROX_CLASSIFIER_SPECS
ALL_CLASSIFIER_SPECS = (
ERC20_CLASSIFIER_SPECS
+ WETH_CLASSIFIER_SPECS
@@ -13,4 +23,26 @@ ALL_CLASSIFIER_SPECS = (
+ UNISWAP_CLASSIFIER_SPECS
+ AAVE_CLASSIFIER_SPECS
+ ZEROX_CLASSIFIER_SPECS
+ BALANCER_CLASSIFIER_SPECS
+ COMPOUND_CLASSIFIER_SPECS
+ CREAM_CLASSIFIER_SPECS
+ CRYPTOPUNKS_CLASSIFIER_SPECS
+ OPENSEA_CLASSIFIER_SPECS
+ BANCOR_CLASSIFIER_SPECS
)
_SPECS_BY_ABI_NAME_AND_PROTOCOL: Dict[
Tuple[str, Optional[Protocol]], ClassifierSpec
] = {(spec.abi_name, spec.protocol): spec for spec in ALL_CLASSIFIER_SPECS}
def get_classifier(
trace: DecodedCallTrace,
) -> Optional[Type[Classifier]]:
abi_name_and_protocol = (trace.abi_name, trace.protocol)
spec = _SPECS_BY_ABI_NAME_AND_PROTOCOL.get(abi_name_and_protocol)
if spec is not None:
return spec.classifiers.get(trace.function_signature)
return None
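
get_classifier above resolves in two dictionary lookups: (abi_name, protocol) selects a ClassifierSpec, then the decoded function signature selects the classifier inside it. A simplified, self-contained sketch of that resolution (illustrative entries only, not the full registry):

# Simplified sketch of the spec lookup above; the table entries are illustrative.
from typing import Dict, Optional, Tuple

SpecKey = Tuple[str, Optional[str]]  # (abi_name, protocol)

_classifiers_by_spec: Dict[SpecKey, Dict[str, str]] = {
    ("ERC20", None): {"transfer(address,uint256)": "ERC20TransferClassifier"},
    ("AaveLendingPool", "aave"): {
        "liquidationCall(address,address,address,uint256,bool)": "AaveLiquidationClassifier"
    },
}

def resolve(abi_name: str, protocol: Optional[str], signature: str) -> Optional[str]:
    spec = _classifiers_by_spec.get((abi_name, protocol))
    if spec is None:
        return None
    return spec.get(signature)

assert resolve("ERC20", None, "transfer(address,uint256)") == "ERC20TransferClassifier"
assert resolve("ERC20", None, "approve(address,uint256)") is None
assert resolve("UnknownABI", None, "transfer(address,uint256)") is None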

View File

@@ -1,15 +1,93 @@
from mev_inspect.schemas.classified_traces import (
Classification,
from typing import List, Optional
from mev_inspect.classifiers.helpers import get_debt_transfer, get_received_transfer
from mev_inspect.schemas.classifiers import (
ClassifiedTrace,
ClassifierSpec,
Protocol,
DecodedCallTrace,
LiquidationClassifier,
TransferClassifier,
)
from mev_inspect.schemas.liquidations import Liquidation
from mev_inspect.schemas.traces import Protocol
from mev_inspect.schemas.transfers import Transfer
class AaveLiquidationClassifier(LiquidationClassifier):
@staticmethod
def parse_liquidation(
liquidation_trace: DecodedCallTrace,
child_transfers: List[Transfer],
child_traces: List[ClassifiedTrace],
) -> Optional[Liquidation]:
liquidator = liquidation_trace.from_address
liquidated = liquidation_trace.inputs["_user"]
debt_token_address = liquidation_trace.inputs["_reserve"]
received_token_address = liquidation_trace.inputs["_collateral"]
debt_purchase_amount = None
received_amount = None
debt_transfer = get_debt_transfer(liquidator, child_transfers)
received_transfer = get_received_transfer(liquidator, child_transfers)
if debt_transfer is not None and received_transfer is not None:
debt_token_address = debt_transfer.token_address
debt_purchase_amount = debt_transfer.amount
received_token_address = received_transfer.token_address
received_amount = received_transfer.amount
return Liquidation(
liquidated_user=liquidated,
debt_token_address=debt_token_address,
liquidator_user=liquidator,
debt_purchase_amount=debt_purchase_amount,
protocol=Protocol.aave,
received_amount=received_amount,
received_token_address=received_token_address,
transaction_hash=liquidation_trace.transaction_hash,
trace_address=liquidation_trace.trace_address,
block_number=liquidation_trace.block_number,
error=liquidation_trace.error,
)
else:
return None
class AaveTransferClassifier(TransferClassifier):
@staticmethod
def get_transfer(trace: DecodedCallTrace) -> Transfer:
return Transfer(
block_number=trace.block_number,
transaction_hash=trace.transaction_hash,
trace_address=trace.trace_address,
amount=trace.inputs["value"],
to_address=trace.inputs["to"],
from_address=trace.inputs["from"],
token_address=trace.to_address,
)
AAVE_SPEC = ClassifierSpec(
abi_name="AaveLendingPool",
protocol=Protocol.aave,
classifications={
"liquidationCall(address,address,address,uint256,bool)": Classification.liquidate,
classifiers={
"liquidationCall(address,address,address,uint256,bool)": AaveLiquidationClassifier,
},
)
AAVE_CLASSIFIER_SPECS = [AAVE_SPEC]
ATOKENS_SPEC = ClassifierSpec(
abi_name="aTokens",
protocol=Protocol.aave,
classifiers={
"transferOnLiquidation(address,address,uint256)": AaveTransferClassifier,
},
)
AAVE_CLASSIFIER_SPECS: List[ClassifierSpec] = [AAVE_SPEC, ATOKENS_SPEC]

View File

@@ -0,0 +1,41 @@
from typing import List, Optional
from mev_inspect.classifiers.helpers import create_swap_from_pool_transfers
from mev_inspect.schemas.classifiers import ClassifierSpec, SwapClassifier
from mev_inspect.schemas.swaps import Swap
from mev_inspect.schemas.traces import DecodedCallTrace, Protocol
from mev_inspect.schemas.transfers import Transfer
BALANCER_V1_POOL_ABI_NAME = "BPool"
class BalancerSwapClassifier(SwapClassifier):
@staticmethod
def parse_swap(
trace: DecodedCallTrace,
prior_transfers: List[Transfer],
child_transfers: List[Transfer],
) -> Optional[Swap]:
recipient_address = trace.from_address
swap = create_swap_from_pool_transfers(
trace, recipient_address, prior_transfers, child_transfers
)
return swap
BALANCER_V1_SPECS = [
ClassifierSpec(
abi_name=BALANCER_V1_POOL_ABI_NAME,
protocol=Protocol.balancer_v1,
classifiers={
"swapExactAmountIn(address,uint256,address,uint256,uint256)": BalancerSwapClassifier,
"swapExactAmountOut(address,uint256,address,uint256,uint256)": BalancerSwapClassifier,
},
),
]
BALANCER_CLASSIFIER_SPECS = [
*BALANCER_V1_SPECS,
]

View File

@@ -0,0 +1,41 @@
from typing import List, Optional
from mev_inspect.classifiers.helpers import create_swap_from_recipient_transfers
from mev_inspect.schemas.classifiers import ClassifierSpec, SwapClassifier
from mev_inspect.schemas.swaps import Swap
from mev_inspect.schemas.traces import DecodedCallTrace, Protocol
from mev_inspect.schemas.transfers import Transfer
BANCOR_NETWORK_ABI_NAME = "BancorNetwork"
BANCOR_NETWORK_CONTRACT_ADDRESS = "0x2F9EC37d6CcFFf1caB21733BdaDEdE11c823cCB0"
class BancorSwapClassifier(SwapClassifier):
@staticmethod
def parse_swap(
trace: DecodedCallTrace,
prior_transfers: List[Transfer],
child_transfers: List[Transfer],
) -> Optional[Swap]:
recipient_address = trace.from_address
swap = create_swap_from_recipient_transfers(
trace,
BANCOR_NETWORK_CONTRACT_ADDRESS,
recipient_address,
prior_transfers,
child_transfers,
)
return swap
BANCOR_NETWORK_SPEC = ClassifierSpec(
abi_name=BANCOR_NETWORK_ABI_NAME,
protocol=Protocol.bancor,
classifiers={
"convertByPath(address[],uint256,uint256,address,address,uint256)": BancorSwapClassifier,
},
valid_contract_addresses=[BANCOR_NETWORK_CONTRACT_ADDRESS],
)
BANCOR_CLASSIFIER_SPECS = [BANCOR_NETWORK_SPEC]

View File

@@ -0,0 +1,127 @@
from typing import List, Optional
from mev_inspect.classifiers.helpers import get_debt_transfer, get_received_transfer
from mev_inspect.schemas.classifiers import (
Classification,
ClassifiedTrace,
ClassifierSpec,
DecodedCallTrace,
LiquidationClassifier,
SeizeClassifier,
)
from mev_inspect.schemas.liquidations import Liquidation
from mev_inspect.schemas.prices import CETH_TOKEN_ADDRESS, ETH_TOKEN_ADDRESS
from mev_inspect.schemas.traces import Protocol
from mev_inspect.schemas.transfers import Transfer
class CompoundLiquidationClassifier(LiquidationClassifier):
@staticmethod
def parse_liquidation(
liquidation_trace: DecodedCallTrace,
child_transfers: List[Transfer],
child_traces: List[ClassifiedTrace],
) -> Optional[Liquidation]:
liquidator = liquidation_trace.from_address
liquidated = liquidation_trace.inputs["borrower"]
debt_token_address = liquidation_trace.to_address
received_token_address = liquidation_trace.inputs["cTokenCollateral"]
debt_purchase_amount = None
received_amount = None
debt_purchase_amount, debt_token_address = (
(liquidation_trace.value, ETH_TOKEN_ADDRESS)
if debt_token_address == CETH_TOKEN_ADDRESS and liquidation_trace.value != 0
else (liquidation_trace.inputs["repayAmount"], CETH_TOKEN_ADDRESS)
)
debt_transfer = get_debt_transfer(liquidator, child_transfers)
received_transfer = get_received_transfer(liquidator, child_transfers)
seize_trace = _get_seize_call(child_traces)
if debt_transfer is not None:
debt_token_address = debt_transfer.token_address
debt_purchase_amount = debt_transfer.amount
if received_transfer is not None:
received_token_address = received_transfer.token_address
received_amount = received_transfer.amount
elif seize_trace is not None and seize_trace.inputs is not None:
received_amount = seize_trace.inputs["seizeTokens"]
if received_amount is None:
return None
return Liquidation(
liquidated_user=liquidated,
debt_token_address=debt_token_address,
liquidator_user=liquidator,
debt_purchase_amount=debt_purchase_amount,
protocol=liquidation_trace.protocol,
received_amount=received_amount,
received_token_address=received_token_address,
transaction_hash=liquidation_trace.transaction_hash,
trace_address=liquidation_trace.trace_address,
block_number=liquidation_trace.block_number,
error=liquidation_trace.error,
)
return None
COMPOUND_V2_CETH_SPEC = ClassifierSpec(
abi_name="CEther",
protocol=Protocol.compound_v2,
valid_contract_addresses=["0x4ddc2d193948926d02f9b1fe9e1daa0718270ed5"],
classifiers={
"liquidateBorrow(address,address)": CompoundLiquidationClassifier,
"seize(address,address,uint256)": SeizeClassifier,
},
)
COMPOUND_V2_CTOKEN_SPEC = ClassifierSpec(
abi_name="CToken",
protocol=Protocol.compound_v2,
valid_contract_addresses=[
"0x6c8c6b02e7b2be14d4fa6022dfd6d75921d90e4e",
"0x5d3a536e4d6dbd6114cc1ead35777bab948e3643",
"0x158079ee67fce2f58472a96584a73c7ab9ac95c1",
"0x39aa39c021dfbae8fac545936693ac917d5e7563",
"0xf650c3d88d12db855b8bf7d11be6c55a4e07dcc9",
"0xc11b1268c1a384e55c48c2391d8d480264a3a7f4",
"0xb3319f5d18bc0d84dd1b4825dcde5d5f7266d407",
"0xf5dce57282a584d2746faf1593d3121fcac444dc",
"0x35a18000230da775cac24873d00ff85bccded550",
"0x70e36f6bf80a52b3b46b3af8e106cc0ed743e8e4",
"0xccf4429db6322d5c611ee964527d42e5d685dd6a",
"0x12392f67bdf24fae0af363c24ac620a2f67dad86",
"0xface851a4921ce59e912d19329929ce6da6eb0c7",
"0x95b4ef2869ebd94beb4eee400a99824bf5dc325b",
"0x4b0181102a0112a2ef11abee5563bb4a3176c9d7",
"0xe65cdb6479bac1e22340e4e755fae7e509ecd06c",
"0x80a2ae356fc9ef4305676f7a3e2ed04e12c33946",
],
classifiers={
"liquidateBorrow(address,uint256,address)": CompoundLiquidationClassifier,
"seize(address,address,uint256)": SeizeClassifier,
},
)
COMPOUND_CLASSIFIER_SPECS: List[ClassifierSpec] = [
COMPOUND_V2_CETH_SPEC,
COMPOUND_V2_CTOKEN_SPEC,
]
def _get_seize_call(traces: List[ClassifiedTrace]) -> Optional[ClassifiedTrace]:
"""Find the call to `seize` in the child traces (successful liquidation)"""
for trace in traces:
if trace.classification == Classification.seize:
return trace
return None

View File

@@ -0,0 +1,204 @@
from typing import List, Optional
from mev_inspect.classifiers.helpers import get_debt_transfer, get_received_transfer
from mev_inspect.schemas.classifiers import (
Classification,
ClassifiedTrace,
ClassifierSpec,
DecodedCallTrace,
LiquidationClassifier,
SeizeClassifier,
)
from mev_inspect.schemas.liquidations import Liquidation
from mev_inspect.schemas.prices import ETH_TOKEN_ADDRESS
from mev_inspect.schemas.traces import Protocol
from mev_inspect.schemas.transfers import Transfer
CRETH_TOKEN_ADDRESS = "0xd06527d5e56a3495252a528c4987003b712860ee"
class CreamLiquidationClassifier(LiquidationClassifier):
@staticmethod
def parse_liquidation(
liquidation_trace: DecodedCallTrace,
child_transfers: List[Transfer],
child_traces: List[ClassifiedTrace],
) -> Optional[Liquidation]:
liquidator = liquidation_trace.from_address
liquidated = liquidation_trace.inputs["borrower"]
debt_token_address = liquidation_trace.to_address
received_token_address = liquidation_trace.inputs["cTokenCollateral"]
debt_purchase_amount = None
received_amount = None
debt_purchase_amount, debt_token_address = (
(liquidation_trace.value, ETH_TOKEN_ADDRESS)
if debt_token_address == CRETH_TOKEN_ADDRESS
and liquidation_trace.value != 0
else (liquidation_trace.inputs["repayAmount"], CRETH_TOKEN_ADDRESS)
)
debt_transfer = get_debt_transfer(liquidator, child_transfers)
received_transfer = get_received_transfer(liquidator, child_transfers)
seize_trace = _get_seize_call(child_traces)
if debt_transfer is not None:
debt_token_address = debt_transfer.token_address
debt_purchase_amount = debt_transfer.amount
if received_transfer is not None:
received_token_address = received_transfer.token_address
received_amount = received_transfer.amount
elif seize_trace is not None and seize_trace.inputs is not None:
received_amount = seize_trace.inputs["seizeTokens"]
if received_amount is None:
return None
return Liquidation(
liquidated_user=liquidated,
debt_token_address=debt_token_address,
liquidator_user=liquidator,
debt_purchase_amount=debt_purchase_amount,
protocol=liquidation_trace.protocol,
received_amount=received_amount,
received_token_address=received_token_address,
transaction_hash=liquidation_trace.transaction_hash,
trace_address=liquidation_trace.trace_address,
block_number=liquidation_trace.block_number,
error=liquidation_trace.error,
)
return None
CREAM_CRETH_SPEC = ClassifierSpec(
abi_name="CEther",
protocol=Protocol.cream,
valid_contract_addresses=["0xD06527D5e56A3495252A528C4987003b712860eE"],
classifiers={
"liquidateBorrow(address,address)": CreamLiquidationClassifier,
"seize(address,address,uint256)": SeizeClassifier,
},
)
CREAM_CTOKEN_SPEC = ClassifierSpec(
abi_name="CToken",
protocol=Protocol.cream,
valid_contract_addresses=[
"0xd06527d5e56a3495252a528c4987003b712860ee",
"0x51f48b638f82e8765f7a26373a2cb4ccb10c07af",
"0x44fbebd2f576670a6c33f6fc0b00aa8c5753b322",
"0xcbae0a83f4f9926997c8339545fb8ee32edc6b76",
"0xce4fe9b4b8ff61949dcfeb7e03bc9faca59d2eb3",
"0x19d1666f543d42ef17f66e376944a22aea1a8e46",
"0x9baf8a5236d44ac410c0186fe39178d5aad0bb87",
"0x797aab1ce7c01eb727ab980762ba88e7133d2157",
"0x892b14321a4fcba80669ae30bd0cd99a7ecf6ac0",
"0x697256caa3ccafd62bb6d3aa1c7c5671786a5fd9",
"0x8b86e0598616a8d4f1fdae8b59e55fb5bc33d0d6",
"0xc7fd8dcee4697ceef5a2fd4608a7bd6a94c77480",
"0x17107f40d70f4470d20cb3f138a052cae8ebd4be",
"0x1ff8cdb51219a8838b52e9cac09b71e591bc998e",
"0x3623387773010d9214b10c551d6e7fc375d31f58",
"0x4ee15f44c6f0d8d1136c83efd2e8e4ac768954c6",
"0x338286c0bc081891a4bda39c7667ae150bf5d206",
"0x10fdbd1e48ee2fd9336a482d746138ae19e649db",
"0x01da76dea59703578040012357b81ffe62015c2d",
"0xef58b2d5a1b8d3cde67b8ab054dc5c831e9bc025",
"0xe89a6d0509faf730bd707bf868d9a2a744a363c7",
"0xeff039c3c1d668f408d09dd7b63008622a77532c",
"0x22b243b96495c547598d9042b6f94b01c22b2e9e",
"0x8b3ff1ed4f36c2c2be675afb13cc3aa5d73685a5",
"0x2a537fa9ffaea8c1a41d3c2b68a9cb791529366d",
"0x7ea9c63e216d5565c3940a2b3d150e59c2907db3",
"0x3225e3c669b39c7c8b3e204a8614bb218c5e31bc",
"0xf55bbe0255f7f4e70f63837ff72a577fbddbe924",
"0x903560b1cce601794c584f58898da8a8b789fc5d",
"0x054b7ed3f45714d3091e82aad64a1588dc4096ed",
"0xd5103afcd0b3fa865997ef2984c66742c51b2a8b",
"0xfd609a03b393f1a1cfcacedabf068cad09a924e2",
"0xd692ac3245bb82319a31068d6b8412796ee85d2c",
"0x92b767185fb3b04f881e3ac8e5b0662a027a1d9f",
"0x10a3da2bb0fae4d591476fd97d6636fd172923a8",
"0x3c6c553a95910f9fc81c98784736bd628636d296",
"0x21011bc93d9e515b9511a817a1ed1d6d468f49fc",
"0x85759961b116f1d36fd697855c57a6ae40793d9b",
"0x7c3297cfb4c4bbd5f44b450c0872e0ada5203112",
"0x7aaa323d7e398be4128c7042d197a2545f0f1fea",
"0x011a014d5e8eb4771e575bb1000318d509230afa",
"0xe6c3120f38f56deb38b69b65cc7dcaf916373963",
"0x4fe11bc316b6d7a345493127fbe298b95adaad85",
"0xcd22c4110c12ac41acefa0091c432ef44efaafa0",
"0x228619cca194fbe3ebeb2f835ec1ea5080dafbb2",
"0x73f6cba38922960b7092175c0add22ab8d0e81fc",
"0x38f27c03d6609a86ff7716ad03038881320be4ad",
"0x5ecad8a75216cea7dff978525b2d523a251eea92",
"0x5c291bc83d15f71fb37805878161718ea4b6aee9",
"0x6ba0c66c48641e220cf78177c144323b3838d375",
"0xd532944df6dfd5dd629e8772f03d4fc861873abf",
"0x197070723ce0d3810a0e47f06e935c30a480d4fc",
"0xc25eae724f189ba9030b2556a1533e7c8a732e14",
"0x25555933a8246ab67cbf907ce3d1949884e82b55",
"0xc68251421edda00a10815e273fa4b1191fac651b",
"0x65883978ada0e707c3b2be2a6825b1c4bdf76a90",
"0x8b950f43fcac4931d408f1fcda55c6cb6cbf3096",
"0x59089279987dd76fc65bf94cb40e186b96e03cb3",
"0x2db6c82ce72c8d7d770ba1b5f5ed0b6e075066d6",
"0xb092b4601850e23903a42eacbc9d8a0eec26a4d5",
"0x081fe64df6dc6fc70043aedf3713a3ce6f190a21",
"0x1d0986fb43985c88ffa9ad959cc24e6a087c7e35",
"0xc36080892c64821fa8e396bc1bd8678fa3b82b17",
"0x8379baa817c5c5ab929b03ee8e3c48e45018ae41",
"0x299e254a8a165bbeb76d9d69305013329eea3a3b",
"0xf8445c529d363ce114148662387eba5e62016e20",
"0x28526bb33d7230e65e735db64296413731c5402e",
"0x45406ba53bb84cd32a58e7098a2d4d1b11b107f6",
"0x6d1b9e01af17dd08d6dec08e210dfd5984ff1c20",
"0x1f9b4756b008106c806c7e64322d7ed3b72cb284",
"0xab10586c918612ba440482db77549d26b7abf8f7",
"0xdfff11dfe6436e42a17b86e7f419ac8292990393",
"0xdbb5e3081def4b6cdd8864ac2aeda4cbf778fecf",
"0x71cefcd324b732d4e058afacba040d908c441847",
"0x1a122348b73b58ea39f822a89e6ec67950c2bbd0",
"0x523effc8bfefc2948211a05a905f761cba5e8e9e",
"0x4202d97e00b9189936edf37f8d01cff88bdd81d4",
"0x4baa77013ccd6705ab0522853cb0e9d453579dd4",
"0x98e329eb5aae2125af273102f3440de19094b77c",
"0x8c3b7a4320ba70f8239f83770c4015b5bc4e6f91",
"0xe585c76573d7593abf21537b607091f76c996e73",
"0x81e346729723c4d15d0fb1c5679b9f2926ff13c6",
"0x766175eac1a99c969ddd1ebdbe7e270d508d8fff",
"0xd7394428536f63d5659cc869ef69d10f9e66314b",
"0x1241b10e7ea55b22f5b2d007e8fecdf73dcff999",
"0x2a867fd776b83e1bd4e13c6611afd2f6af07ea6d",
"0x250fb308199fe8c5220509c1bf83d21d60b7f74a",
"0x4112a717edd051f77d834a6703a1ef5e3d73387f",
"0xf04ce2e71d32d789a259428ddcd02d3c9f97fb4e",
"0x89e42987c39f72e2ead95a8a5bc92114323d5828",
"0x58da9c9fc3eb30abbcbbab5ddabb1e6e2ef3d2ef",
],
classifiers={
"liquidateBorrow(address,uint256,address)": CreamLiquidationClassifier,
"seize(address,address,uint256)": SeizeClassifier,
},
)
CREAM_CLASSIFIER_SPECS: List[ClassifierSpec] = [
CREAM_CRETH_SPEC,
CREAM_CTOKEN_SPEC,
]
def _get_seize_call(traces: List[ClassifiedTrace]) -> Optional[ClassifiedTrace]:
"""Find the call to `seize` in the child traces (successful liquidation)"""
for trace in traces:
if trace.classification == Classification.seize:
return trace
return None

View File

@@ -0,0 +1,27 @@
from mev_inspect.schemas.classifiers import Classifier, ClassifierSpec
from mev_inspect.schemas.traces import Classification, Protocol
class PunkBidAcceptanceClassifier(Classifier):
@staticmethod
def get_classification() -> Classification:
return Classification.punk_accept_bid
class PunkBidClassifier(Classifier):
@staticmethod
def get_classification() -> Classification:
return Classification.punk_bid
CRYPTO_PUNKS_SPEC = ClassifierSpec(
abi_name="cryptopunks",
protocol=Protocol.cryptopunks,
valid_contract_addresses=["0xb47e3cd837dDF8e4c57F05d70Ab865de6e193BBB"],
classifiers={
"enterBidForPunk(uint256)": PunkBidClassifier,
"acceptBidForPunk(uint256,uint256)": PunkBidAcceptanceClassifier,
},
)
CRYPTOPUNKS_CLASSIFIER_SPECS = [CRYPTO_PUNKS_SPEC]

View File

@@ -1,29 +1,28 @@
from mev_inspect.schemas.classified_traces import (
ClassifierSpec,
Protocol,
)
from typing import List, Optional
from mev_inspect.classifiers.helpers import create_swap_from_pool_transfers
from mev_inspect.schemas.classifiers import ClassifierSpec, SwapClassifier
from mev_inspect.schemas.swaps import Swap
from mev_inspect.schemas.traces import DecodedCallTrace, Protocol
from mev_inspect.schemas.transfers import Transfer
class CurveSwapClassifier(SwapClassifier):
@staticmethod
def parse_swap(
trace: DecodedCallTrace,
prior_transfers: List[Transfer],
child_transfers: List[Transfer],
) -> Optional[Swap]:
recipient_address = trace.from_address
swap = create_swap_from_pool_transfers(
trace, recipient_address, prior_transfers, child_transfers
)
return swap
"""
Deployment addresses found here
https://curve.readthedocs.io/ref-addresses.html
organized into the following groups
1. Base Pools: 2 or more tokens implementing stable swap
- StableSwap<pool>
- Deposit<pool>
- CurveContract<version>
- CurveTokenV1/V2
2. Meta Pools: 1 token trading with an LP from above
- StableSwap<pool>
- Deposit<pool>
- CurveTokenV1/V2
3. Liquidity Gauges: stake LP tokens to earn the Curve governance token (CRV)
- LiquidityGauge
- LiquidityGaugeV1/V2
- LiquidityGaugeReward
4. DAO contracts
5. Other contracts, not yet assessed for relevance here
"""
CURVE_BASE_POOLS = [
ClassifierSpec(
abi_name="CurveTokenV1",
@@ -72,101 +71,171 @@ CURVE_BASE_POOLS = [
abi_name="StableSwap3Pool",
protocol=Protocol.curve,
valid_contract_addresses=["0xbEbc44782C7dB0a1A60Cb6fe97d0b483032FF1C7"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwapAAVE",
protocol=Protocol.curve,
valid_contract_addresses=["0xDeBF20617708857ebe4F679508E7b7863a8A8EeE"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
"exchange_underlying(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwapAETH",
protocol=Protocol.curve,
valid_contract_addresses=["0xA96A65c051bF88B4095Ee1f2451C2A9d43F53Ae2"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwapBUSD",
protocol=Protocol.curve,
valid_contract_addresses=["0x79a8C46DeA5aDa233ABaFFD40F3A0A2B1e5A4F27"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
"exchange_underlying(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwapCompound",
protocol=Protocol.curve,
valid_contract_addresses=["0xA2B47E3D5c44877cca798226B7B8118F9BFb7A56"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
"exchange_underlying(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwapEURS",
protocol=Protocol.curve,
valid_contract_addresses=["0x0Ce6a5fF5217e38315f87032CF90686C96627CAA"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwaphBTC",
protocol=Protocol.curve,
valid_contract_addresses=["0x4CA9b3063Ec5866A4B82E437059D2C43d1be596F"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwapIronBank",
protocol=Protocol.curve,
valid_contract_addresses=["0x2dded6Da1BF5DBdF597C45fcFaa3194e53EcfeAF"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
"exchange_underlying(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwapLink",
protocol=Protocol.curve,
valid_contract_addresses=["0xf178c0b5bb7e7abf4e12a4838c7b7c5ba2c623c0"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwapPAX",
protocol=Protocol.curve,
valid_contract_addresses=["0x06364f10B501e868329afBc005b3492902d6C763"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
"exchange_underlying(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwaprenBTC",
protocol=Protocol.curve,
valid_contract_addresses=["0x93054188d876f558f4a66B2EF1d97d16eDf0895B"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwaprETH",
protocol=Protocol.curve,
valid_contract_addresses=["0xF9440930043eb3997fc70e1339dBb11F341de7A8"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwapsAAVE",
protocol=Protocol.curve,
valid_contract_addresses=["0xEB16Ae0052ed37f479f7fe63849198Df1765a733"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
"exchange_underlying(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwapsBTC",
protocol=Protocol.curve,
valid_contract_addresses=["0x7fC77b5c7614E1533320Ea6DDc2Eb61fa00A9714"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwapsETH",
protocol=Protocol.curve,
valid_contract_addresses=["0xc5424B857f758E906013F3555Dad202e4bdB4567"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwapstETH",
protocol=Protocol.curve,
valid_contract_addresses=["0xDC24316b9AE028F1497c275EB9192a3Ea0f67022"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwapsUSD",
protocol=Protocol.curve,
valid_contract_addresses=["0xA5407eAE9Ba41422680e2e00537571bcC53efBfD"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
"exchange_underlying(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwapUSDT",
protocol=Protocol.curve,
valid_contract_addresses=["0x52EA46506B9CC5Ef470C5bf89f17Dc28bB35D85C"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
"exchange_underlying(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwapY",
protocol=Protocol.curve,
valid_contract_addresses=["0x45F783CCE6B7FF23B2ab2D70e416cdb7D6055f51"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
"exchange_underlying(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwapYv2",
protocol=Protocol.curve,
valid_contract_addresses=["0x8925D9d9B4569D737a48499DeF3f67BaA5a144b9"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
"exchange_underlying(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="DepositBUSD",
@@ -300,51 +369,91 @@ CURVE_META_POOLS = [
abi_name="StableSwapbBTC",
protocol=Protocol.curve,
valid_contract_addresses=["0x071c661B4DeefB59E2a3DdB20Db036821eeE8F4b"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
"exchange_underlying(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwapDUSD",
protocol=Protocol.curve,
valid_contract_addresses=["0x8038C01A0390a8c547446a0b2c18fc9aEFEcc10c"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
"exchange_underlying(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwapGUSD",
protocol=Protocol.curve,
valid_contract_addresses=["0x4f062658EaAF2C1ccf8C8e36D6824CDf41167956"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
"exchange_underlying(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwapHUSD",
protocol=Protocol.curve,
valid_contract_addresses=["0x3eF6A01A0f81D6046290f3e2A8c5b843e738E604"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
"exchange_underlying(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwapLinkUSD",
protocol=Protocol.curve,
valid_contract_addresses=["0xE7a24EF0C5e95Ffb0f6684b813A78F2a3AD7D171"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
"exchange_underlying(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwapMUSD",
protocol=Protocol.curve,
valid_contract_addresses=["0x8474DdbE98F5aA3179B3B3F5942D724aFcdec9f6"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
"exchange_underlying(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwapoBTC",
protocol=Protocol.curve,
valid_contract_addresses=["0xd81dA8D904b52208541Bade1bD6595D8a251F8dd"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
"exchange_underlying(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwappBTC",
protocol=Protocol.curve,
valid_contract_addresses=["0x7F55DDe206dbAD629C080068923b36fe9D6bDBeF"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
"exchange_underlying(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwapRSV",
protocol=Protocol.curve,
valid_contract_addresses=["0xC18cC39da8b11dA8c3541C598eE022258F9744da"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
"exchange_underlying(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwaptBTC",
protocol=Protocol.curve,
valid_contract_addresses=["0xC25099792E9349C7DD09759744ea681C7de2cb66"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
"exchange_underlying(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwapUSD",
@@ -353,82 +462,29 @@ CURVE_META_POOLS = [
"0x3E01dD8a5E1fb3481F0F589056b428Fc308AF0Fb",
"0x0f9cb53Ebe405d49A0bbdBD291A65Ff571bC83e1",
],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
"exchange_underlying(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwapUSDP",
protocol=Protocol.curve,
valid_contract_addresses=["0x42d7025938bEc20B69cBae5A77421082407f053A"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
"exchange_underlying(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
ClassifierSpec(
abi_name="StableSwapUST",
protocol=Protocol.curve,
valid_contract_addresses=["0x890f4e345B1dAED0367A877a1612f86A1f86985f"],
classifiers={
"exchange(int128,int128,uint256,uint256)": CurveSwapClassifier,
"exchange_underlying(int128,int128,uint256,uint256)": CurveSwapClassifier,
},
),
]
"""
CURVE_LIQUIDITY_GAUGES = [
ClassifierSpec(
abi_name="LiquidityGauge",
protocol=Protocol.curve,
valid_contract_addresses=[
"0xbFcF63294aD7105dEa65aA58F8AE5BE2D9d0952A", # 3Pool
"0x69Fb7c45726cfE2baDeE8317005d3F94bE838840", # BUSD
"0x7ca5b0a2910B33e9759DC7dDB0413949071D7575", # Compound
"0xC5cfaDA84E902aD92DD40194f0883ad49639b023", # GUSD
"0x4c18E409Dc8619bFb6a1cB56D114C3f592E0aE79", # hBTC
"0x2db0E83599a91b508Ac268a6197b8B14F5e72840", # HUSD
"0x64E3C23bfc40722d3B649844055F1D51c1ac041d", # PAX
"0xB1F2cdeC61db658F091671F5f199635aEF202CAC", # renBTC
"0xC2b1DF84112619D190193E48148000e3990Bf627", # USDK
"0xF98450B5602fa59CC66e1379DFfB6FDDc724CfC4", # USDN
"0xBC89cd85491d81C6AD2954E6d0362Ee29fCa8F53", # USDT
"0xFA712EE4788C042e2B7BB55E6cb8ec569C4530c1", # Y
],
),
ClassifierSpec(
abi_name="LiquidityGaugeV2",
protocol=Protocol.curve,
valid_contract_addresses=[
"0xd662908ADA2Ea1916B3318327A97eB18aD588b5d", # AAVE
"0x6d10ed2cF043E6fcf51A0e7b4C2Af3Fa06695707", # ankrETH
"0xdFc7AdFa664b08767b735dE28f9E84cd30492aeE", # bBTC
"0x90Bb609649E0451E5aD952683D64BD2d1f245840", # EURS
"0x72e158d38dbd50a483501c24f792bdaaa3e7d55c", # FRAX
"0x11137B10C210b579405c21A07489e28F3c040AB1", # oBTC
"0xF5194c3325202F456c95c1Cf0cA36f8475C1949F", # IronBank
"0xFD4D8a17df4C27c1dD245d153ccf4499e806C87D", # Link
"0xd7d147c6Bb90A718c3De8C0568F9B560C79fa416", # pBTC
"0x462253b8F74B72304c145DB0e4Eebd326B22ca39", # sAAVE
"0x3C0FFFF15EA30C35d7A85B85c0782D6c94e1d238", # sETH
"0x182B723a58739a9c974cFDB385ceaDb237453c28", # stETH
"0x055be5DDB7A925BfEF3417FC157f53CA77cA7222", # USDP
"0x3B7020743Bc2A4ca9EaF9D0722d42E20d6935855", # UST
"0x8101E6760130be2C8Ace79643AB73500571b7162", # Yv2
],
),
ClassifierSpec(
abi_name="LiquidityGaugeV3",
protocol=Protocol.curve,
valid_contract_addresses=[
"0x9582C4ADACB3BCE56Fea3e590F05c3ca2fb9C477", # alUSD
"0x824F13f1a2F29cFEEa81154b46C0fc820677A637", # rETH
"0x6955a55416a06839309018A8B0cB72c4DDC11f15", # TriCrypto
],
),
ClassifierSpec(
abi_name="LiquidityGaugeReward",
protocol=Protocol.curve,
valid_contract_addresses=[
"0xAEA6c312f4b3E04D752946d329693F7293bC2e6D", # DUSD
"0x5f626c30EC1215f4EdCc9982265E8b1F411D1352", # MUSD
"0x4dC4A289a8E33600D8bD4cf5F6313E43a37adec7", # RSV
"0x705350c4BcD35c9441419DdD5d2f097d7a55410F", # sBTC
"0xA90996896660DEcC6E997655E065b23788857849", # sUSDv2
"0x6828bcF74279eE32f2723eC536c22c51Eed383C6", # tBTC
],
),
]
"""
CURVE_CLASSIFIER_SPECS = [*CURVE_BASE_POOLS, *CURVE_META_POOLS]

View File

@@ -1,15 +1,27 @@
from mev_inspect.schemas.classified_traces import (
Classification,
ClassifierSpec,
)
from mev_inspect.schemas.classifiers import ClassifierSpec, TransferClassifier
from mev_inspect.schemas.traces import DecodedCallTrace
from mev_inspect.schemas.transfers import Transfer
class ERC20TransferClassifier(TransferClassifier):
@staticmethod
def get_transfer(trace: DecodedCallTrace) -> Transfer:
return Transfer(
block_number=trace.block_number,
transaction_hash=trace.transaction_hash,
trace_address=trace.trace_address,
amount=trace.inputs["amount"],
to_address=trace.inputs["recipient"],
from_address=trace.inputs.get("sender", trace.from_address),
token_address=trace.to_address,
)
ERC20_SPEC = ClassifierSpec(
abi_name="ERC20",
classifications={
"transferFrom(address,address,uint256)": Classification.transfer,
"transfer(address,uint256)": Classification.transfer,
"burn(address)": Classification.burn,
classifiers={
"transferFrom(address,address,uint256)": ERC20TransferClassifier,
"transfer(address,uint256)": ERC20TransferClassifier,
},
)
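
The inputs.get("sender", trace.from_address) fallback above exists because transfer(recipient, amount) carries no explicit sender, while transferFrom(sender, recipient, amount) does. A tiny illustration with hypothetical decoded inputs:

# Hypothetical decoded inputs for the two ERC20 call shapes handled above.
caller = "0xcaller"
transfer_inputs = {"recipient": "0xrecipient", "amount": 1}
transfer_from_inputs = {"sender": "0xsender", "recipient": "0xrecipient", "amount": 1}

# transfer(): no "sender" input, so the token sender is the calling address.
assert transfer_inputs.get("sender", caller) == "0xcaller"
# transferFrom(): the explicit "sender" input wins.
assert transfer_from_inputs.get("sender", caller) == "0xsender"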

View File

@@ -0,0 +1,42 @@
from typing import List, Optional
from mev_inspect.classifiers.helpers import create_nft_trade_from_transfers
from mev_inspect.schemas.classifiers import ClassifierSpec, NftTradeClassifier
from mev_inspect.schemas.nft_trades import NftTrade
from mev_inspect.schemas.traces import DecodedCallTrace, Protocol
from mev_inspect.schemas.transfers import Transfer
OPENSEA_WALLET_ADDRESS = "0x5b3256965e7c3cf26e11fcaf296dfc8807c01073"
class OpenseaClassifier(NftTradeClassifier):
@staticmethod
def parse_trade(
trace: DecodedCallTrace,
child_transfers: List[Transfer],
) -> Optional[NftTrade]:
addresses = trace.inputs["addrs"]
buy_maker = addresses[1]
sell_maker = addresses[8]
target = addresses[4]
return create_nft_trade_from_transfers(
trace,
child_transfers,
collection_address=target,
seller_address=sell_maker,
buyer_address=buy_maker,
exchange_wallet_address=OPENSEA_WALLET_ADDRESS,
)
OPENSEA_SPEC = ClassifierSpec(
abi_name="WyvernExchange",
protocol=Protocol.opensea,
valid_contract_addresses=["0x7be8076f4ea4a4ad08075c2508e481d6c946d12b"],
classifiers={
"atomicMatch_(address[14],uint256[18],uint8[8],bytes,bytes,bytes,bytes,bytes,bytes,uint8[2],bytes32[5])": OpenseaClassifier,
},
)
OPENSEA_CLASSIFIER_SPECS = [OPENSEA_SPEC]
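
The indices read above assume the Wyvern atomicMatch_ addrs layout: the buy order's seven address fields followed by the sell order's seven (exchange, maker, taker, feeRecipient, target, staticTarget, paymentToken). A placeholder illustration of that assumption:

# Assumed Wyvern order layout; all values below are placeholders, and the indices
# mirror the ones used by OpenseaClassifier.parse_trade above.
addrs = [
    "0x_exchange", "0x_buy_maker", "0x_buy_taker", "0x_buy_fee_recipient",
    "0x_nft_collection", "0x_static_target", "0x_payment_token",
    "0x_exchange", "0x_sell_maker", "0x_sell_taker", "0x_sell_fee_recipient",
    "0x_nft_collection", "0x_static_target", "0x_payment_token",
]
buy_maker, target, sell_maker = addrs[1], addrs[4], addrs[8]
assert (buy_maker, sell_maker, target) == ("0x_buy_maker", "0x_sell_maker", "0x_nft_collection")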

Some files were not shown because too many files have changed in this diff.