Diffstat (limited to 'packages/pipeline')
135 files changed, 0 insertions, 9309 deletions
diff --git a/packages/pipeline/.npmignore b/packages/pipeline/.npmignore deleted file mode 100644 index 89302c908..000000000 --- a/packages/pipeline/.npmignore +++ /dev/null @@ -1,7 +0,0 @@ -.* -yarn-error.log -/scripts/ -/generated_docs/ -/src/ -tsconfig.json -/lib/monorepo_scripts/ diff --git a/packages/pipeline/README.md b/packages/pipeline/README.md deleted file mode 100644 index 23113fd9b..000000000 --- a/packages/pipeline/README.md +++ /dev/null @@ -1,186 +0,0 @@ -## @0xproject/pipeline - -This repository contains scripts used for scraping data from the Ethereum blockchain into SQL tables for analysis by the 0x team. - -## Contributing - -We strongly recommend that the community help us make improvements and determine the future direction of the protocol. To report bugs within this package, please create an issue in this repository. - -Please read our [contribution guidelines](../../CONTRIBUTING.md) before getting started. - -### Install dependencies: - -```bash -yarn install -``` - -### Build - -```bash -yarn build -``` - -### Clean - -```bash -yarn clean -``` - -### Lint - -```bash -yarn lint -``` - -### Migrations - -Create a new migration: `yarn migrate:create --name MigrationNameInCamelCase`. - -Run migrations: `yarn migrate:run` - -Revert the most recent migration (CAUTION: may result in data loss!): `yarn migrate:revert` - -## Testing - -There are several test scripts in **package.json**. You can run all the tests with `yarn test:all` or run certain tests separately by following the instructions below. Some tests may not work out of the box on certain platforms or operating systems (see the "Database tests" section below). - -### Unit tests - -The unit tests can be run with `yarn test`. These tests don't depend on any services or databases and will run in any environment that can run Node. - -### Database tests - -Database integration tests can be run with `yarn test:db`. These tests will attempt to automatically spin up a Postgres database via Docker. If this doesn't work, you have two other options: - -1. Set the `DOCKER_SOCKET` environment variable to a valid socket path to use for communicating with Docker. -2. Start Postgres manually and set the `ZEROEX_DATA_PIPELINE_TEST_DB_URL` environment variable. If this is set, the tests will use your existing Postgres database instead of trying to create one with Docker. - -## Running locally - -`pipeline` requires access to a PostgreSQL database. The easiest way to start Postgres is via Docker. Depending on your platform, you may need to prepend `sudo` to the following command: - -``` -docker run --rm -d -p 5432:5432 --name pipeline_postgres postgres:11-alpine -``` - -This will start a Postgres server with the default username and database name (`postgres` and `postgres`). You should set the environment variable as follows: - -``` -export ZEROEX_DATA_PIPELINE_DB_URL=postgresql://postgres@localhost/postgres -``` - -The first thing you will need to do is run the migrations: - -``` -yarn migrate:run -``` - -Now you can run scripts locally: - -``` -node packages/pipeline/lib/src/scripts/pull_radar_relay_orders.js -``` - -To stop the Postgres server (you may need to add `sudo`): - -``` -docker stop pipeline_postgres -``` - -This will remove all data from the database. - -If you prefer, you can also install Postgres with, e.g., [Homebrew](https://wiki.postgresql.org/wiki/Homebrew) or [Postgres.app](https://postgresapp.com/). Keep in mind that you will need to set the `ZEROEX_DATA_PIPELINE_DB_URL` environment variable to a valid [PostgreSQL connection url](https://stackoverflow.com/questions/3582552/postgresql-connection-url).
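A PostgreSQL connection URL generally has the form `postgresql://user:password@host:port/database`. For example (the credentials, host, and database name below are hypothetical placeholders, not values used by this package):

```bash
# Hypothetical values; substitute your own user, password, host, port, and database.
export ZEROEX_DATA_PIPELINE_DB_URL=postgresql://pipeline_user:s3cret@db.example.com:5432/pipeline
```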
-## Directory structure - -``` -. -├── lib: Code generated by the TypeScript compiler. Don't edit this directly. -├── migrations: Code for creating and updating database schemas. -├── node_modules: Installed dependencies (not checked in). -├── src: All TypeScript source code. -│  ├── data_sources: Code responsible for getting raw data, typically from a third-party source. -│  ├── entities: TypeORM entities which closely mirror our database schemas. Some other ORMs call these "models". -│  ├── parsers: Code for converting raw data into entities. -│  ├── scripts: Executable scripts which put all the pieces together. -│  └── utils: Various utils used across packages/files. -├── test: All tests go here and are organized in the same way as the folder/file that they test. -``` - -## Adding new data to the pipeline - -1. Create an entity in the _entities_ directory. Entities directly mirror our database schemas. We follow the practice of having "dumb" entities, so entity classes should typically not have any methods. -2. Create a migration using the `yarn migrate:create` command. Create/update tables as needed. Remember to fill in both the `up` and `down` methods (a minimal sketch of steps 1 and 2 follows this list). Try to avoid data loss as much as possible in your migrations. -3. Add basic tests for your entity and migrations to the **test/entities/** directory. -4. Create a class or function in the **data_sources/** directory for getting raw data. This code should abstract away pagination and rate-limiting as much as possible. -5. Create a class or function in the **parsers/** directory for converting the raw data into an entity. Also add tests in the **test/** directory to test the parser. -6. Create an executable script in the **scripts/** directory for putting everything together. Your script can accept environment variables for things like API keys. It should pull the data, parse it, and save it to the database. Scripts should be idempotent and atomic (when possible). What this means is that your script may be responsible for determining _which_ data needs to be updated. For example, you may need to query the database to find the most recent block number that we have already pulled, then pull new data starting from that block number. -7. Run the migrations and then run your new script locally and verify it works as expected. -8. After all tests pass and you can run the script locally, open a new PR to the monorepo. Don't merge this yet! -9. If you added any new scripts or dependencies between scripts, you will need to make changes to https://github.com/0xProject/0x-pipeline-orchestration and make a separate PR there. Don't merge this yet! -10. After your PR passes code review, ask @feuGeneA or @xianny to deploy your changes to the QA environment. Check the [QA Airflow dashboard](http://airflow-qa.0x.org:8080) to make sure everything works correctly in the QA environment. -11. Merge your PR to 0x-monorepo (and https://github.com/0xProject/0x-pipeline-orchestration if needed). Then ask @feuGeneA or @xianny to deploy to production. -12. Monitor the [production Airflow dashboard](http://airflow.0x.org:8080) to make sure everything still works. -13. Celebrate! :tada:
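To make steps 1 and 2 concrete, here is a minimal sketch of a migration in the style of the ones in this package. All names (`raw.foo_events`, `CreateFooEvents`) are hypothetical; `yarn migrate:create --name CreateFooEvents` generates the timestamped file for you.

```typescript
import { MigrationInterface, QueryRunner, Table } from 'typeorm';

// Hypothetical table, namespaced in the `raw` schema per the guidelines below.
const fooEvents = new Table({
    name: 'raw.foo_events',
    columns: [
        { name: 'contract_address', type: 'char(42)', isPrimary: true },
        { name: 'log_index', type: 'integer', isPrimary: true },
        { name: 'block_number', type: 'bigint', isPrimary: true },
        { name: 'amount', type: 'numeric' },
    ],
});

export class CreateFooEvents1549000000000 implements MigrationInterface {
    public async up(queryRunner: QueryRunner): Promise<any> {
        await queryRunner.createTable(fooEvents);
    }

    // `down` should undo `up` exactly so the migration can be reverted
    // without leaving stray objects behind.
    public async down(queryRunner: QueryRunner): Promise<any> {
        await queryRunner.dropTable(fooEvents.name);
    }
}
```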
-#### Additional guidelines and tips: - -- Table names should be plural and separated by underscores (e.g., `exchange_fill_events`). -- Any table that contains data coming directly from a third-party source should be namespaced in the `raw` PostgreSQL schema. -- Column names in the database should be separated by underscores (e.g., `maker_asset_type`). -- Field names in entity classes (like any other fields in TypeScript) should be camel-cased (e.g., `makerAssetType`); the entity sketch below shows both conventions together. -- All timestamps should be stored as milliseconds since the Unix Epoch. -- Use the `BigNumber` type for TypeScript code which deals with 256-bit numbers from smart contracts or for any case where we are dealing with large floating point numbers. -- [TypeORM documentation](http://typeorm.io/#/) is pretty robust and can be a helpful resource. -- Scripts/parsers should perform minimal data transformation/normalization. The idea here is to have a raw data feed that will be cleaned up and synthesized in a separate step.
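Tying these guidelines together, here is a minimal sketch of a "dumb" entity for the hypothetical `raw.foo_events` table from the migration sketch above. The inline transformer is illustrative only; in practice a shared transformer in `src/utils` would be the natural home for it.

```typescript
import { BigNumber } from '@0x/utils';
import { Column, Entity, PrimaryColumn, ValueTransformer } from 'typeorm';

// Converts between Postgres `numeric` values (which the driver returns as
// strings) and BigNumber fields on the entity.
const bigNumberTransformer: ValueTransformer = {
    from: (value: string | null): BigNumber | null => (value === null ? null : new BigNumber(value)),
    to: (value: BigNumber | null): string | null => (value === null ? null : value.toString()),
};

// No methods, snake_case column names in the database, camelCase field names
// in TypeScript, and the table namespaced in the `raw` schema.
@Entity({ name: 'foo_events', schema: 'raw' })
export class FooEvent {
    @PrimaryColumn({ name: 'contract_address' })
    public contractAddress!: string;

    @PrimaryColumn({ name: 'log_index' })
    public logIndex!: number;

    // Note: Postgres bigint columns come back as strings from the driver
    // unless you also attach a transformer for them.
    @PrimaryColumn({ name: 'block_number' })
    public blockNumber!: number;

    @Column({ name: 'amount', type: 'numeric', transformer: bigNumberTransformer })
    public amount!: BigNumber;
}
```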
diff --git a/packages/pipeline/coverage/.gitkeep b/packages/pipeline/coverage/.gitkeep deleted file mode 100644 index e69de29bb..000000000 --- a/packages/pipeline/coverage/.gitkeep +++ /dev/null diff --git a/packages/pipeline/migrations/1542070840010-InitialSchema.ts b/packages/pipeline/migrations/1542070840010-InitialSchema.ts deleted file mode 100644 index 895f9e6c9..000000000 --- a/packages/pipeline/migrations/1542070840010-InitialSchema.ts +++ /dev/null @@ -1,187 +0,0 @@ -import { MigrationInterface, QueryRunner, Table } from 'typeorm'; - -const blocks = new Table({ - name: 'raw.blocks', - columns: [ - { name: 'number', type: 'bigint', isPrimary: true }, - { name: 'hash', type: 'varchar', isPrimary: true }, - { name: 'timestamp', type: 'bigint' }, - ], -}); - -const exchange_cancel_events = new Table({ - name: 'raw.exchange_cancel_events', - columns: [ - { name: 'contract_address', type: 'char(42)', isPrimary: true }, - { name: 'log_index', type: 'integer', isPrimary: true }, - { name: 'block_number', type: 'bigint', isPrimary: true }, - - { name: 'raw_data', type: 'varchar' }, - - { name: 'transaction_hash', type: 'varchar' }, - { name: 'maker_address', type: 'char(42)' }, - { name: 'taker_address', type: 'char(42)' }, - { name: 'fee_recipient_address', type: 'char(42)' }, - { name: 'sender_address', type: 'char(42)' }, - { name: 'order_hash', type: 'varchar' }, - - { name: 'raw_maker_asset_data', type: 'varchar' }, - { name: 'maker_asset_type', type: 'varchar' }, - { name: 'maker_asset_proxy_id', type: 'varchar' }, - { name: 'maker_token_address', type: 'char(42)' }, - { name: 'maker_token_id', type: 'varchar', isNullable: true }, - { name: 'raw_taker_asset_data', type: 'varchar' }, - { name: 'taker_asset_type', type: 'varchar' }, - { name: 'taker_asset_proxy_id', type: 'varchar' }, - { name: 'taker_token_address', type: 'char(42)' }, - { name: 'taker_token_id', type: 'varchar', isNullable: true }, - ], -}); - -const exchange_cancel_up_to_events = new Table({ - name: 'raw.exchange_cancel_up_to_events', - columns: [ - { name: 'contract_address', type: 'char(42)', isPrimary: true }, - { name: 'log_index', type: 'integer', isPrimary: true }, - { name: 'block_number', type: 'bigint', isPrimary: true }, - - { name: 'raw_data', type: 'varchar' }, - - { name: 'transaction_hash', type: 'varchar' }, - { name: 'maker_address', type: 'char(42)' }, - { name: 'sender_address', type: 'char(42)' }, - { name: 'order_epoch', type: 'varchar' }, - ], -}); - -const exchange_fill_events = new Table({ - name: 'raw.exchange_fill_events', - columns: [ - { name: 'contract_address', type: 'char(42)', isPrimary: true }, - { name: 'log_index', type: 'integer', isPrimary: true }, - { name: 'block_number', type: 'bigint', isPrimary: true }, - - { name: 'raw_data', type: 'varchar' }, - - { name: 'transaction_hash', type: 'varchar' }, - { name: 'maker_address', type: 'char(42)' }, - { name: 'taker_address', type: 'char(42)' }, - { name: 'fee_recipient_address', type: 'char(42)' }, - { name: 'sender_address', type: 'char(42)' }, - { name: 'maker_asset_filled_amount', type: 'varchar' }, - { name: 'taker_asset_filled_amount', type: 'varchar' }, - { name: 'maker_fee_paid', type: 'varchar' }, - { name: 'taker_fee_paid', type: 'varchar' }, - { name: 'order_hash', type: 'varchar' }, - - { name: 'raw_maker_asset_data', type: 'varchar' }, - { name: 'maker_asset_type', type: 'varchar' }, - { name: 'maker_asset_proxy_id', type: 'varchar' }, - { name: 'maker_token_address', type: 'char(42)' }, - { name: 'maker_token_id', type: 'varchar', isNullable: true }, - { name: 'raw_taker_asset_data', type: 'varchar' }, - { name: 'taker_asset_type', type: 'varchar' }, - { name: 'taker_asset_proxy_id', type: 'varchar' }, - { name: 'taker_token_address', type: 'char(42)' }, - { name: 'taker_token_id', type: 'varchar', isNullable: true }, - ], -}); - -const relayers = new Table({ - name: 'raw.relayers', - columns: [ - { name: 'uuid', type: 'varchar', isPrimary: true }, - { name: 'name', type: 'varchar' }, - { name: 'sra_http_endpoint', type: 'varchar', isNullable: true }, - { name: 'sra_ws_endpoint', type: 'varchar', isNullable: true }, - { name: 'app_url', type: 'varchar', isNullable: true }, - { name: 'fee_recipient_addresses', type: 'char(42)', isArray: true }, - { name: 'taker_addresses', type: 'char(42)', isArray: true }, - ], -}); - -const sra_orders = new Table({ - name: 'raw.sra_orders', - columns: [ - { name: 'exchange_address', type: 'char(42)', isPrimary: true }, - { name: 'order_hash_hex', type: 'varchar', isPrimary: true }, - - { name: 'source_url', type: 'varchar' }, - { name: 'last_updated_timestamp', type: 'bigint' }, - { name: 'first_seen_timestamp', type: 'bigint' }, - - { name: 'maker_address', type: 'char(42)' }, - { name: 'taker_address', type: 'char(42)' }, - { name: 'fee_recipient_address', type: 'char(42)' }, - { name: 'sender_address', type: 'char(42)' }, - { name: 'maker_asset_filled_amount', type: 'varchar' }, - { name: 'taker_asset_filled_amount', type: 'varchar' }, - { name: 'maker_fee', type: 'varchar' }, - { name: 'taker_fee', type: 'varchar' }, - { name: 'expiration_time_seconds', type: 'int' }, - { name: 'salt', type: 'varchar' }, - { name: 'signature', type: 'varchar' }, - - { name: 'raw_maker_asset_data', type: 'varchar' }, - { name: 'maker_asset_type', type: 'varchar' }, - { name: 'maker_asset_proxy_id', type: 'varchar' }, - { name: 'maker_token_address', type: 'char(42)' }, - { name: 'maker_token_id', type: 'varchar', isNullable: true }, - { name: 'raw_taker_asset_data', type: 'varchar' }, - { name: 'taker_asset_type', type: 'varchar' }, - { name: 'taker_asset_proxy_id', type: 'varchar' }, - { name: 'taker_token_address', type: 'char(42)' }, - { name: 'taker_token_id', type: 'varchar', isNullable: true }, - - { name: 'metadata_json', type: 'varchar' }, - ], -}); - -const token_on_chain_metadata = new Table({ - name: 'raw.token_on_chain_metadata', - columns: [ - { name: 'address', type: 'char(42)', isPrimary: true }, - { name: 'decimals', type: 'integer' }, - { name: 'symbol', type: 'varchar' }, - { name: 'name', type: 'varchar' }, - ], -}); - -const transactions = new Table({ - name: 'raw.transactions', -
columns: [ - { name: 'block_number', type: 'bigint', isPrimary: true }, - { name: 'block_hash', type: 'varchar', isPrimary: true }, - { name: 'transaction_hash', type: 'varchar', isPrimary: true }, - { name: 'gas_used', type: 'bigint' }, - { name: 'gas_price', type: 'bigint' }, - ], -}); - -export class InitialSchema1542070840010 implements MigrationInterface { - public async up(queryRunner: QueryRunner): Promise<any> { - await queryRunner.createSchema('raw'); - - await queryRunner.createTable(blocks); - await queryRunner.createTable(exchange_cancel_events); - await queryRunner.createTable(exchange_cancel_up_to_events); - await queryRunner.createTable(exchange_fill_events); - await queryRunner.createTable(relayers); - await queryRunner.createTable(sra_orders); - await queryRunner.createTable(token_on_chain_metadata); - await queryRunner.createTable(transactions); - } - - public async down(queryRunner: QueryRunner): Promise<any> { - await queryRunner.dropTable(blocks.name); - await queryRunner.dropTable(exchange_cancel_events.name); - await queryRunner.dropTable(exchange_cancel_up_to_events.name); - await queryRunner.dropTable(exchange_fill_events.name); - await queryRunner.dropTable(relayers.name); - await queryRunner.dropTable(sra_orders.name); - await queryRunner.dropTable(token_on_chain_metadata.name); - await queryRunner.dropTable(transactions.name); - - await queryRunner.dropSchema('raw'); - } -} diff --git a/packages/pipeline/migrations/1542147915364-NewSraOrderTimestampFormat.ts b/packages/pipeline/migrations/1542147915364-NewSraOrderTimestampFormat.ts deleted file mode 100644 index 5a8f3fec8..000000000 --- a/packages/pipeline/migrations/1542147915364-NewSraOrderTimestampFormat.ts +++ /dev/null @@ -1,48 +0,0 @@ -import { MigrationInterface, QueryRunner, Table } from 'typeorm'; - -export class NewSraOrderTimestampFormat1542147915364 implements MigrationInterface { - public async up(queryRunner: QueryRunner): Promise<any> { - await queryRunner.query( - `ALTER TABLE raw.sra_orders - DROP CONSTRAINT "PK_09bfb9980715329563bd53d667e", - ADD PRIMARY KEY (order_hash_hex, exchange_address, source_url); - `, - ); - - await queryRunner.query( - `CREATE TABLE raw.sra_orders_observed_timestamps ( - order_hash_hex varchar NOT NULL, - exchange_address varchar NOT NULL, - source_url varchar NOT NULL, - observed_timestamp bigint NOT NULL, - FOREIGN KEY - (order_hash_hex, exchange_address, source_url) - REFERENCES raw.sra_orders (order_hash_hex, exchange_address, source_url), - PRIMARY KEY (order_hash_hex, exchange_address, source_url, observed_timestamp) - );`, - ); - - await queryRunner.query( - `ALTER TABLE raw.sra_orders - DROP COLUMN last_updated_timestamp, - DROP COLUMN first_seen_timestamp;`, - ); - } - - public async down(queryRunner: QueryRunner): Promise<any> { - await queryRunner.dropTable('raw.sra_orders_observed_timestamps'); - - await queryRunner.query( - `ALTER TABLE raw.sra_orders - ADD COLUMN last_updated_timestamp bigint NOT NULL DEFAULT 0, - ADD COLUMN first_seen_timestamp bigint NOT NULL DEFAULT 0;`, - ); - - await queryRunner.query( - `ALTER TABLE raw.sra_orders - DROP CONSTRAINT sra_orders_pkey, - ADD CONSTRAINT "PK_09bfb9980715329563bd53d667e" PRIMARY KEY ("exchange_address", "order_hash_hex"); - `, - ); - } -} diff --git a/packages/pipeline/migrations/1542152278484-RenameSraOrdersFilledAmounts.ts b/packages/pipeline/migrations/1542152278484-RenameSraOrdersFilledAmounts.ts deleted file mode 100644 index a13e3efa5..000000000 --- 
a/packages/pipeline/migrations/1542152278484-RenameSraOrdersFilledAmounts.ts +++ /dev/null @@ -1,13 +0,0 @@ -import { MigrationInterface, QueryRunner } from 'typeorm'; - -export class RenameSraOrdersFilledAmounts1542152278484 implements MigrationInterface { - public async up(queryRunner: QueryRunner): Promise<any> { - await queryRunner.renameColumn('raw.sra_orders', 'maker_asset_filled_amount', 'maker_asset_amount'); - await queryRunner.renameColumn('raw.sra_orders', 'taker_asset_filled_amount', 'taker_asset_amount'); - } - - public async down(queryRunner: QueryRunner): Promise<any> { - await queryRunner.renameColumn('raw.sra_orders', 'maker_asset_amount', 'maker_asset_filled_amount'); - await queryRunner.renameColumn('raw.sra_orders', 'taker_asset_amount', 'taker_asset_filled_amount'); - } -} diff --git a/packages/pipeline/migrations/1542234704666-ConvertBigNumberToNumeric.ts b/packages/pipeline/migrations/1542234704666-ConvertBigNumberToNumeric.ts deleted file mode 100644 index 5200ef7cc..000000000 --- a/packages/pipeline/migrations/1542234704666-ConvertBigNumberToNumeric.ts +++ /dev/null @@ -1,53 +0,0 @@ -import { MigrationInterface, QueryRunner } from 'typeorm'; - -export class ConvertBigNumberToNumeric1542234704666 implements MigrationInterface { - public async up(queryRunner: QueryRunner): Promise<any> { - await queryRunner.query( - `ALTER TABLE raw.exchange_fill_events - ALTER COLUMN maker_asset_filled_amount TYPE numeric USING maker_asset_filled_amount::numeric, - ALTER COLUMN taker_asset_filled_amount TYPE numeric USING taker_asset_filled_amount::numeric, - ALTER COLUMN maker_fee_paid TYPE numeric USING maker_fee_paid::numeric, - ALTER COLUMN taker_fee_paid TYPE numeric USING taker_fee_paid::numeric;`, - ); - - await queryRunner.query( - `ALTER TABLE raw.exchange_cancel_up_to_events - ALTER COLUMN order_epoch TYPE numeric USING order_epoch::numeric;`, - ); - - await queryRunner.query( - `ALTER TABLE raw.sra_orders - ALTER COLUMN maker_asset_amount TYPE numeric USING maker_asset_amount::numeric, - ALTER COLUMN taker_asset_amount TYPE numeric USING taker_asset_amount::numeric, - ALTER COLUMN maker_fee TYPE numeric USING maker_fee::numeric, - ALTER COLUMN taker_fee TYPE numeric USING taker_fee::numeric, - ALTER COLUMN expiration_time_seconds TYPE numeric USING expiration_time_seconds::numeric, - ALTER COLUMN salt TYPE numeric USING salt::numeric;`, - ); - } - - public async down(queryRunner: QueryRunner): Promise<any> { - await queryRunner.query( - `ALTER TABLE raw.sra_orders - ALTER COLUMN maker_asset_amount TYPE varchar USING maker_asset_amount::varchar, - ALTER COLUMN taker_asset_amount TYPE varchar USING taker_asset_amount::varchar, - ALTER COLUMN maker_fee TYPE varchar USING maker_fee::varchar, - ALTER COLUMN taker_fee TYPE varchar USING taker_fee::varchar, - ALTER COLUMN expiration_time_seconds TYPE varchar USING expiration_time_seconds::varchar, - ALTER COLUMN salt TYPE varchar USING salt::varchar;`, - ); - - await queryRunner.query( - `ALTER TABLE raw.exchange_cancel_up_to_events - ALTER COLUMN order_epoch TYPE varchar USING order_epoch::varchar;`, - ); - - await queryRunner.query( - `ALTER TABLE raw.exchange_fill_events - ALTER COLUMN maker_asset_filled_amount TYPE varchar USING maker_asset_filled_amount::varchar, - ALTER COLUMN taker_asset_filled_amount TYPE varchar USING taker_asset_filled_amount::varchar, - ALTER COLUMN maker_fee_paid TYPE varchar USING maker_fee_paid::varchar, - ALTER COLUMN taker_fee_paid TYPE varchar USING taker_fee_paid::varchar;`, - ); - } -} diff 
--git a/packages/pipeline/migrations/1542249766882-AddHomepageUrlToRelayers.ts b/packages/pipeline/migrations/1542249766882-AddHomepageUrlToRelayers.ts deleted file mode 100644 index 9a4811ad5..000000000 --- a/packages/pipeline/migrations/1542249766882-AddHomepageUrlToRelayers.ts +++ /dev/null @@ -1,14 +0,0 @@ -import { MigrationInterface, QueryRunner, TableColumn } from 'typeorm'; - -export class AddHomepageUrlToRelayers1542249766882 implements MigrationInterface { - public async up(queryRunner: QueryRunner): Promise<any> { - await queryRunner.addColumn( - 'raw.relayers', - new TableColumn({ name: 'homepage_url', type: 'varchar', default: `'unknown'` }), - ); - } - - public async down(queryRunner: QueryRunner): Promise<any> { - await queryRunner.dropColumn('raw.relayers', 'homepage_url'); - } -} diff --git a/packages/pipeline/migrations/1542401122477-MakeTakerAddressNullable.ts b/packages/pipeline/migrations/1542401122477-MakeTakerAddressNullable.ts deleted file mode 100644 index 957c85a36..000000000 --- a/packages/pipeline/migrations/1542401122477-MakeTakerAddressNullable.ts +++ /dev/null @@ -1,17 +0,0 @@ -import { MigrationInterface, QueryRunner } from 'typeorm'; - -export class MakeTakerAddressNullable1542401122477 implements MigrationInterface { - public async up(queryRunner: QueryRunner): Promise<any> { - await queryRunner.query( - `ALTER TABLE raw.exchange_cancel_events - ALTER COLUMN taker_address DROP NOT NULL;`, - ); - } - - public async down(queryRunner: QueryRunner): Promise<any> { - await queryRunner.query( - `ALTER TABLE raw.exchange_cancel_events - ALTER COLUMN taker_address SET NOT NULL;`, - ); - } -} diff --git a/packages/pipeline/migrations/1542655823221-NewMetadataAndOHLCVTables.ts b/packages/pipeline/migrations/1542655823221-NewMetadataAndOHLCVTables.ts deleted file mode 100644 index 838f5ba9c..000000000 --- a/packages/pipeline/migrations/1542655823221-NewMetadataAndOHLCVTables.ts +++ /dev/null @@ -1,60 +0,0 @@ -import { MigrationInterface, QueryRunner } from 'typeorm'; - -export class NewMetadataAndOHLCVTables1542655823221 implements MigrationInterface { - // tslint:disable-next-line - public async up(queryRunner: QueryRunner): Promise<any> { - await queryRunner.query(` - CREATE TABLE raw.token_metadata ( - address VARCHAR NOT NULL, - authority VARCHAR NOT NULL, - decimals INT NULL, - symbol VARCHAR NULL, - name VARCHAR NULL, - - PRIMARY KEY (address, authority) - ); - `); - - await queryRunner.dropTable('raw.token_on_chain_metadata'); - - await queryRunner.query(` - CREATE TABLE raw.ohlcv_external ( - exchange VARCHAR NOT NULL, - from_symbol VARCHAR NOT NULL, - to_symbol VARCHAR NOT NULL, - start_time BIGINT NOT NULL, - end_time BIGINT NOT NULL, - - open DOUBLE PRECISION NOT NULL, - close DOUBLE PRECISION NOT NULL, - low DOUBLE PRECISION NOT NULL, - high DOUBLE PRECISION NOT NULL, - volume_from DOUBLE PRECISION NOT NULL, - volume_to DOUBLE PRECISION NOT NULL, - - source VARCHAR NOT NULL, - observed_timestamp BIGINT NOT NULL, - - PRIMARY KEY (exchange, from_symbol, to_symbol, start_time, end_time, source, observed_timestamp) - ); - `); - } - - // tslint:disable-next-line - public async down(queryRunner: QueryRunner): Promise<any> { - await queryRunner.query(` - CREATE TABLE raw.token_on_chain_metadata ( - address VARCHAR NOT NULL, - decimals INT NULL, - symbol VARCHAR NULL, - name VARCHAR NULL, - - PRIMARY KEY (address) - ); - `); - - await queryRunner.dropTable('raw.token_metadata'); - - await queryRunner.dropTable('raw.ohlcv_external'); - } -} diff --git 
a/packages/pipeline/migrations/1543434472116-TokenOrderbookSnapshots.ts b/packages/pipeline/migrations/1543434472116-TokenOrderbookSnapshots.ts deleted file mode 100644 index a7117c753..000000000 --- a/packages/pipeline/migrations/1543434472116-TokenOrderbookSnapshots.ts +++ /dev/null @@ -1,30 +0,0 @@ -import { MigrationInterface, QueryRunner, Table } from 'typeorm'; - -const tokenOrderbookSnapshots = new Table({ - name: 'raw.token_orderbook_snapshots', - columns: [ - { name: 'observed_timestamp', type: 'bigint', isPrimary: true }, - { name: 'source', type: 'varchar', isPrimary: true }, - { name: 'order_type', type: 'order_t' }, - { name: 'price', type: 'numeric', isPrimary: true }, - - { name: 'base_asset_symbol', type: 'varchar', isPrimary: true }, - { name: 'base_asset_address', type: 'char(42)' }, - { name: 'base_volume', type: 'numeric' }, - - { name: 'quote_asset_symbol', type: 'varchar', isPrimary: true }, - { name: 'quote_asset_address', type: 'char(42)' }, - { name: 'quote_volume', type: 'numeric' }, - ], -}); - -export class TokenOrderbookSnapshots1543434472116 implements MigrationInterface { - public async up(queryRunner: QueryRunner): Promise<any> { - await queryRunner.query(`CREATE TYPE order_t AS enum('bid', 'ask');`); - await queryRunner.createTable(tokenOrderbookSnapshots); - } - - public async down(queryRunner: QueryRunner): Promise<any> { - await queryRunner.dropTable(tokenOrderbookSnapshots.name); - // Also drop the enum type created in `up`, so the migration can be re-run after a revert. - await queryRunner.query(`DROP TYPE order_t;`); - } -} diff --git a/packages/pipeline/migrations/1543446690436-CreateDexTrades.ts b/packages/pipeline/migrations/1543446690436-CreateDexTrades.ts deleted file mode 100644 index 267cf144b..000000000 --- a/packages/pipeline/migrations/1543446690436-CreateDexTrades.ts +++ /dev/null @@ -1,41 +0,0 @@ -import { MigrationInterface, QueryRunner, Table } from 'typeorm'; - -const dexTrades = new Table({ - name: 'raw.dex_trades', - columns: [ - { name: 'source_url', type: 'varchar', isPrimary: true }, - { name: 'tx_hash', type: 'varchar', isPrimary: true }, - - { name: 'tx_timestamp', type: 'bigint' }, - { name: 'tx_date', type: 'varchar' }, - { name: 'tx_sender', type: 'varchar(42)' }, - { name: 'smart_contract_id', type: 'bigint' }, - { name: 'smart_contract_address', type: 'varchar(42)' }, - { name: 'contract_type', type: 'varchar' }, - { name: 'maker', type: 'varchar(42)' }, - { name: 'taker', type: 'varchar(42)' }, - { name: 'amount_buy', type: 'numeric' }, - { name: 'maker_fee_amount', type: 'numeric' }, - { name: 'buy_currency_id', type: 'bigint' }, - { name: 'buy_symbol', type: 'varchar' }, - { name: 'amount_sell', type: 'numeric' }, - { name: 'taker_fee_amount', type: 'numeric' }, - { name: 'sell_currency_id', type: 'bigint' }, - { name: 'sell_symbol', type: 'varchar' }, - { name: 'maker_annotation', type: 'varchar' }, - { name: 'taker_annotation', type: 'varchar' }, - { name: 'protocol', type: 'varchar' }, - { name: 'buy_address', type: 'varchar(42)', isNullable: true }, - { name: 'sell_address', type: 'varchar(42)', isNullable: true }, - ], -}); - -export class CreateDexTrades1543446690436 implements MigrationInterface { - public async up(queryRunner: QueryRunner): Promise<any> { - await queryRunner.createTable(dexTrades); - } - - public async down(queryRunner: QueryRunner): Promise<any> { - await queryRunner.dropTable(dexTrades); - } -} diff --git a/packages/pipeline/migrations/1543980079179-ConvertTokenMetadataDecimalsToBigNumber.ts b/packages/pipeline/migrations/1543980079179-ConvertTokenMetadataDecimalsToBigNumber.ts deleted file mode 100644 index 351bc7eb8..000000000 ---
a/packages/pipeline/migrations/1543980079179-ConvertTokenMetadataDecimalsToBigNumber.ts +++ /dev/null @@ -1,17 +0,0 @@ -import { MigrationInterface, QueryRunner } from 'typeorm'; - -export class ConvertTokenMetadataDecimalsToBigNumber1543980079179 implements MigrationInterface { - public async up(queryRunner: QueryRunner): Promise<any> { - await queryRunner.query( - `ALTER TABLE raw.token_metadata - ALTER COLUMN decimals TYPE numeric USING decimals::numeric;`, - ); - } - - public async down(queryRunner: QueryRunner): Promise<any> { - // Restore the original integer type (the column was created as INT). - await queryRunner.query( - `ALTER TABLE raw.token_metadata - ALTER COLUMN decimals TYPE integer USING decimals::integer;`, - ); - } -} diff --git a/packages/pipeline/migrations/1543983324954-ConvertTransactionGasPriceToBigNumber.ts b/packages/pipeline/migrations/1543983324954-ConvertTransactionGasPriceToBigNumber.ts deleted file mode 100644 index dcb0fd727..000000000 --- a/packages/pipeline/migrations/1543983324954-ConvertTransactionGasPriceToBigNumber.ts +++ /dev/null @@ -1,19 +0,0 @@ -import { MigrationInterface, QueryRunner } from 'typeorm'; - -export class ConvertTransactionGasPriceToBigNumber1543983324954 implements MigrationInterface { - public async up(queryRunner: QueryRunner): Promise<any> { - await queryRunner.query( - `ALTER TABLE raw.transactions - ALTER COLUMN gas_price TYPE numeric USING gas_price::numeric, - ALTER COLUMN gas_used TYPE numeric USING gas_used::numeric;`, - ); - } - - public async down(queryRunner: QueryRunner): Promise<any> { - // Restore the original bigint types from the initial schema. - await queryRunner.query( - `ALTER TABLE raw.transactions - ALTER COLUMN gas_price TYPE bigint USING gas_price::bigint, - ALTER COLUMN gas_used TYPE bigint USING gas_used::bigint;`, - ); - } -} diff --git a/packages/pipeline/migrations/1544131464368-CreateERC20ApprovalEvents.ts b/packages/pipeline/migrations/1544131464368-CreateERC20ApprovalEvents.ts deleted file mode 100644 index 2e84e0ec8..000000000 --- a/packages/pipeline/migrations/1544131464368-CreateERC20ApprovalEvents.ts +++ /dev/null @@ -1,26 +0,0 @@ -import { MigrationInterface, QueryRunner, Table } from 'typeorm'; - -const erc20ApprovalEvents = new Table({ - name: 'raw.erc20_approval_events', - columns: [ - { name: 'token_address', type: 'varchar(42)', isPrimary: true }, - { name: 'log_index', type: 'integer', isPrimary: true }, - { name: 'block_number', type: 'bigint', isPrimary: true }, - - { name: 'raw_data', type: 'varchar' }, - { name: 'transaction_hash', type: 'varchar' }, - { name: 'owner_address', type: 'varchar(42)' }, - { name: 'spender_address', type: 'varchar(42)' }, - { name: 'amount', type: 'numeric' }, - ], -}); - -export class CreateERC20TokenApprovalEvents1544131464368 implements MigrationInterface { - public async up(queryRunner: QueryRunner): Promise<any> { - await queryRunner.createTable(erc20ApprovalEvents); - } - - public async down(queryRunner: QueryRunner): Promise<any> { - await queryRunner.dropTable(erc20ApprovalEvents); - } -} diff --git a/packages/pipeline/migrations/1544131658904-TokenOrderbookSnapshotAddOrderType.ts b/packages/pipeline/migrations/1544131658904-TokenOrderbookSnapshotAddOrderType.ts deleted file mode 100644 index a501ec6d8..000000000 --- a/packages/pipeline/migrations/1544131658904-TokenOrderbookSnapshotAddOrderType.ts +++ /dev/null @@ -1,33 +0,0 @@ -import { MigrationInterface, QueryRunner } from 'typeorm'; - -export class TokenOrderbookSnapshotAddOrderType1544131658904 implements MigrationInterface { - public async up(queryRunner: QueryRunner): Promise<any> { - await queryRunner.query( - `ALTER TABLE
raw.token_orderbook_snapshots - DROP CONSTRAINT "PK_8a16487e7cb6862ec5a84ed3495", - ADD PRIMARY KEY (observed_timestamp, source, order_type, price, base_asset_symbol, quote_asset_symbol); - `, - ); - await queryRunner.query( - `ALTER TABLE raw.token_orderbook_snapshots - ALTER COLUMN quote_asset_address DROP NOT NULL, - ALTER COLUMN base_asset_address DROP NOT NULL; - `, - ); - } - - public async down(queryRunner: QueryRunner): Promise<any> { - await queryRunner.query( - `ALTER TABLE raw.token_orderbook_snapshots - ALTER COLUMN quote_asset_address SET NOT NULL, - ALTER COLUMN base_asset_address SET NOT NULL; - `, - ); - await queryRunner.query( - `ALTER TABLE raw.token_orderbook_snapshots - DROP CONSTRAINT token_orderbook_snapshots_pkey, - ADD CONSTRAINT "PK_8a16487e7cb6862ec5a84ed3495" PRIMARY KEY (observed_timestamp, source, price, base_asset_symbol, quote_asset_symbol); - `, - ); - } -} diff --git a/packages/pipeline/migrations/1545440485644-CreateCopperTables.ts b/packages/pipeline/migrations/1545440485644-CreateCopperTables.ts deleted file mode 100644 index 64bf70af4..000000000 --- a/packages/pipeline/migrations/1545440485644-CreateCopperTables.ts +++ /dev/null @@ -1,103 +0,0 @@ -import { MigrationInterface, QueryRunner, Table } from 'typeorm'; - -const leads = new Table({ - name: 'raw.copper_leads', - columns: [ - { name: 'id', type: 'bigint', isPrimary: true }, - { name: 'name', type: 'varchar', isNullable: true }, - { name: 'first_name', type: 'varchar', isNullable: true }, - { name: 'last_name', type: 'varchar', isNullable: true }, - { name: 'middle_name', type: 'varchar', isNullable: true }, - { name: 'assignee_id', type: 'bigint', isNullable: true }, - { name: 'company_name', type: 'varchar', isNullable: true }, - { name: 'customer_source_id', type: 'bigint', isNullable: true }, - { name: 'monetary_value', type: 'integer', isNullable: true }, - { name: 'status', type: 'varchar' }, - { name: 'status_id', type: 'bigint' }, - { name: 'title', type: 'varchar', isNullable: true }, - { name: 'date_created', type: 'bigint' }, - { name: 'date_modified', type: 'bigint', isPrimary: true }, - ], -}); -const activities = new Table({ - name: 'raw.copper_activities', - columns: [ - { name: 'id', type: 'bigint', isPrimary: true }, - { name: 'parent_id', type: 'bigint' }, - { name: 'parent_type', type: 'varchar' }, - { name: 'type_id', type: 'bigint' }, - { name: 'type_category', type: 'varchar' }, - { name: 'type_name', type: 'varchar', isNullable: true }, - { name: 'user_id', type: 'bigint' }, - { name: 'old_value_id', type: 'bigint', isNullable: true }, - { name: 'old_value_name', type: 'varchar', isNullable: true }, - { name: 'new_value_id', type: 'bigint', isNullable: true }, - { name: 'new_value_name', type: 'varchar', isNullable: true }, - { name: 'date_created', type: 'bigint' }, - { name: 'date_modified', type: 'bigint', isPrimary: true }, - ], -}); - -const opportunities = new Table({ - name: 'raw.copper_opportunities', - columns: [ - { name: 'id', type: 'bigint', isPrimary: true }, - { name: 'name', type: 'varchar' }, - { name: 'assignee_id', isNullable: true, type: 'bigint' }, - { name: 'close_date', isNullable: true, type: 'varchar' }, - { name: 'company_id', isNullable: true, type: 'bigint' }, - { name: 'company_name', isNullable: true, type: 'varchar' }, - { name: 'customer_source_id', isNullable: true, type: 'bigint' }, - { name: 'loss_reason_id', isNullable: true, type: 'bigint' }, - { name: 'pipeline_id', type: 'bigint' }, - { name: 'pipeline_stage_id', type: 'bigint' }, - { 
name: 'primary_contact_id', isNullable: true, type: 'bigint' }, - { name: 'priority', isNullable: true, type: 'varchar' }, - { name: 'status', type: 'varchar' }, - { name: 'interaction_count', type: 'bigint' }, - { name: 'monetary_value', isNullable: true, type: 'integer' }, - { name: 'win_probability', isNullable: true, type: 'integer' }, - { name: 'date_created', type: 'bigint' }, - { name: 'date_modified', type: 'bigint', isPrimary: true }, - { name: 'custom_fields', type: 'jsonb' }, - ], -}); - -const activityTypes = new Table({ - name: 'raw.copper_activity_types', - columns: [ - { name: 'id', type: 'bigint', isPrimary: true }, - { name: 'category', type: 'varchar' }, - { name: 'name', type: 'varchar' }, - { name: 'is_disabled', type: 'boolean', isNullable: true }, - { name: 'count_as_interaction', type: 'boolean', isNullable: true }, - ], -}); - -const customFields = new Table({ - name: 'raw.copper_custom_fields', - columns: [ - { name: 'id', type: 'bigint', isPrimary: true }, - { name: 'name', type: 'varchar' }, - { name: 'data_type', type: 'varchar' }, - { name: 'field_type', type: 'varchar', isNullable: true }, - ], -}); - -export class CreateCopperTables1545440485644 implements MigrationInterface { - public async up(queryRunner: QueryRunner): Promise<any> { - await queryRunner.createTable(leads); - await queryRunner.createTable(activities); - await queryRunner.createTable(opportunities); - await queryRunner.createTable(activityTypes); - await queryRunner.createTable(customFields); - } - - public async down(queryRunner: QueryRunner): Promise<any> { - await queryRunner.dropTable(leads.name); - await queryRunner.dropTable(activities.name); - await queryRunner.dropTable(opportunities.name); - await queryRunner.dropTable(activityTypes.name); - await queryRunner.dropTable(customFields.name); - } -} diff --git a/packages/pipeline/migrations/1547153875669-UpdateDDexAPIToV3.ts b/packages/pipeline/migrations/1547153875669-UpdateDDexAPIToV3.ts deleted file mode 100644 index 957af4941..000000000 --- a/packages/pipeline/migrations/1547153875669-UpdateDDexAPIToV3.ts +++ /dev/null @@ -1,21 +0,0 @@ -import { MigrationInterface, QueryRunner } from 'typeorm'; - -export class UpdateDDexAPIToV31547153875669 implements MigrationInterface { - public async up(queryRunner: QueryRunner): Promise<any> { - await queryRunner.query(` - UPDATE raw.token_orderbook_snapshots - SET quote_asset_symbol='WETH' - WHERE quote_asset_symbol='ETH' AND - source='ddex'; - `); - } - - public async down(queryRunner: QueryRunner): Promise<any> { - await queryRunner.query(` - UPDATE raw.token_orderbook_snapshots - SET quote_asset_symbol='ETH' - WHERE quote_asset_symbol='WETH' AND - source='ddex'; - `); - } -} diff --git a/packages/pipeline/migrations/1548809952793-AllowDuplicateTxHashesInDexTrades.ts b/packages/pipeline/migrations/1548809952793-AllowDuplicateTxHashesInDexTrades.ts deleted file mode 100644 index 21b08f0ef..000000000 --- a/packages/pipeline/migrations/1548809952793-AllowDuplicateTxHashesInDexTrades.ts +++ /dev/null @@ -1,29 +0,0 @@ -import { MigrationInterface, QueryRunner, TableColumn } from 'typeorm'; - -const DEX_TRADES_TABLE_NAME = 'raw.dex_trades'; - -export class AllowDuplicateTxHashesInDexTrades1548809952793 implements MigrationInterface { - public async up(queryRunner: QueryRunner): Promise<any> { - const dexTradesTable = await queryRunner.getTable(DEX_TRADES_TABLE_NAME); - if (dexTradesTable) { - // Need new primary key column to be non-null. No default value makes sense, so delete all existing rows.
- await queryRunner.query(`DELETE from ${DEX_TRADES_TABLE_NAME}`); - // Composite key goes from (source_url, tx_hash) to (source_url, tx_hash, trade_index) - await queryRunner.addColumn( - DEX_TRADES_TABLE_NAME, - new TableColumn({ - name: 'trade_index', - type: 'varchar', - isPrimary: true, - }), - ); - } - } - - public async down(queryRunner: QueryRunner): Promise<any> { - const dexTradesTable = await queryRunner.getTable(DEX_TRADES_TABLE_NAME); - if (dexTradesTable) { - await queryRunner.dropColumn(dexTradesTable, 'trade_index'); - } - } -} diff --git a/packages/pipeline/migrations/1549479172800-AddTxHashToExchangeEventPrimaryKey.ts b/packages/pipeline/migrations/1549479172800-AddTxHashToExchangeEventPrimaryKey.ts deleted file mode 100644 index d6ea6c47b..000000000 --- a/packages/pipeline/migrations/1549479172800-AddTxHashToExchangeEventPrimaryKey.ts +++ /dev/null @@ -1,35 +0,0 @@ -import { MigrationInterface, QueryRunner } from 'typeorm'; - -const tableNames = ['exchange_cancel_events', 'exchange_cancel_up_to_events', 'exchange_fill_events']; - -const oldPrimaryColumns = ['contract_address', 'log_index', 'block_number']; - -const newPrimaryColumns = ['transaction_hash']; - -async function updatePrimaryKeysAsync(queryRunner: QueryRunner, columnNames: string[]): Promise<void> { - for (const tableName of tableNames) { - const table = await queryRunner.getTable(`raw.${tableName}`); - if (table === undefined) { - throw new Error(`Couldn't get table 'raw.${tableName}'`); - } - const columns = []; - for (const columnName of columnNames) { - const column = table.findColumnByName(columnName); - if (column === undefined) { - throw new Error(`Couldn't get column '${columnName}' from table 'raw.${tableName}'`); - } - columns.push(column); - } - await queryRunner.updatePrimaryKeys(table, columns); - } -} - -export class AddTxHashToExchangeEventPrimaryKey1549479172800 implements MigrationInterface { - public async up(queryRunner: QueryRunner): Promise<any> { - await updatePrimaryKeysAsync(queryRunner, oldPrimaryColumns.concat(newPrimaryColumns)); - } - - public async down(queryRunner: QueryRunner): Promise<any> { - await updatePrimaryKeysAsync(queryRunner, oldPrimaryColumns); - } -} diff --git a/packages/pipeline/migrations/1549499426238-AddTxHashToERC20ApprovalEventPrimaryKey.ts b/packages/pipeline/migrations/1549499426238-AddTxHashToERC20ApprovalEventPrimaryKey.ts deleted file mode 100644 index 874713e67..000000000 --- a/packages/pipeline/migrations/1549499426238-AddTxHashToERC20ApprovalEventPrimaryKey.ts +++ /dev/null @@ -1,31 +0,0 @@ -import { MigrationInterface, QueryRunner } from 'typeorm'; - -const oldPrimaryColumns = ['token_address', 'log_index', 'block_number']; - -const newPrimaryColumns = ['transaction_hash']; - -async function updatePrimaryKeysAsync(queryRunner: QueryRunner, columnNames: string[]): Promise<void> { - const table = await queryRunner.getTable(`raw.erc20_approval_events`); - if (table === undefined) { - throw new Error(`Couldn't get table 'raw.erc20_approval_events'`); - } - const columns = []; - for (const columnName of columnNames) { - const column = table.findColumnByName(columnName); - if (column === undefined) { - throw new Error(`Couldn't get column '${columnName}' from table 'raw.erc20_approval_events'`); - } - columns.push(column); - } - await queryRunner.updatePrimaryKeys(table, columns); -} - -export class AddTxHashToERC20ApprovalEventPrimaryKey1549499426238 implements MigrationInterface { - public async up(queryRunner: QueryRunner): Promise<any> { - await 
updatePrimaryKeysAsync(queryRunner, oldPrimaryColumns.concat(newPrimaryColumns)); - } - - public async down(queryRunner: QueryRunner): Promise<any> { - await updatePrimaryKeysAsync(queryRunner, oldPrimaryColumns); - } -} diff --git a/packages/pipeline/package.json b/packages/pipeline/package.json deleted file mode 100644 index b2ad39a5f..000000000 --- a/packages/pipeline/package.json +++ /dev/null @@ -1,66 +0,0 @@ -{ - "name": "@0x/pipeline", - "version": "1.0.9", - "private": true, - "description": "Data pipeline for offline analysis", - "scripts": { - "build": "yarn tsc -b", - "build:ci": "yarn build", - "test": "yarn run_mocha", - "rebuild_and_test": "run-s build test:all", - "test:db": "yarn run_mocha:db", - "test:all": "run-s test test:db", - "test:circleci": "yarn test:coverage", - "run_mocha": "mocha --require source-map-support/register --require make-promises-safe 'lib/test/!(entities)/**/*_test.js' --bail --exit", - "run_mocha:db": "mocha --require source-map-support/register --require make-promises-safe lib/test/db_global_hooks.js 'lib/test/entities/*_test.js' --bail --exit --timeout 60000", - "test:coverage": "nyc npm run test:all --all && yarn coverage:report:lcov", - "coverage:report:lcov": "nyc report --reporter=text-lcov > coverage/lcov.info", - "clean": "shx rm -rf lib", - "lint": "tslint --project . --format stylish --exclude ./migrations/**/* --exclude ./test/fixtures/**/**/*.json", - "migrate:run": "yarn typeorm migration:run --config ./lib/src/ormconfig", - "migrate:revert": "yarn typeorm migration:revert --config ./lib/src/ormconfig", - "migrate:create": "yarn typeorm migration:create --config ./lib/src/ormconfig --dir migrations" - }, - "repository": { - "type": "git", - "url": "https://github.com/0xProject/0x-monorepo" - }, - "license": "Apache-2.0", - "devDependencies": { - "@0x/tslint-config": "^3.0.0", - "@types/axios": "^0.14.0", - "@types/ramda": "^0.25.38", - "chai": "^4.0.1", - "chai-as-promised": "^7.1.0", - "chai-bignumber": "^3.0.0", - "dirty-chai": "^2.0.1", - "mocha": "^4.1.0", - "tslint": "5.11.0", - "typescript": "3.0.1" - }, - "dependencies": { - "@0x/connect": "^4.0.3", - "@0x/contract-addresses": "^2.2.1", - "@0x/contract-artifacts": "^1.3.0", - "@0x/contract-wrappers": "^7.0.2", - "@0x/dev-utils": "^2.0.2", - "@0x/order-utils": "^6.0.1", - "@0x/subproviders": "^3.0.2", - "@0x/types": "^2.0.2", - "@0x/utils": "^4.0.3", - "@0x/web3-wrapper": "^4.0.2", - "@types/dockerode": "^2.5.9", - "@types/p-limit": "^2.0.0", - "async-parallel": "^1.2.3", - "axios": "^0.18.0", - "bottleneck": "^2.13.2", - "dockerode": "^2.5.7", - "ethereum-types": "^2.0.0", - "pg": "^7.5.0", - "prettier": "^1.16.3", - "ramda": "^0.25.0", - "reflect-metadata": "^0.1.12", - "sqlite3": "^4.0.2", - "typeorm": "^0.2.7" - } -} diff --git a/packages/pipeline/src/data_sources/bloxy/index.ts b/packages/pipeline/src/data_sources/bloxy/index.ts deleted file mode 100644 index 22ab195b3..000000000 --- a/packages/pipeline/src/data_sources/bloxy/index.ts +++ /dev/null @@ -1,143 +0,0 @@ -import axios from 'axios'; -import * as R from 'ramda'; - -import { logUtils } from '@0x/utils'; - -// URL to use for getting dex trades from Bloxy. -export const BLOXY_DEX_TRADES_URL = 'https://bloxy.info/api/dex/trades'; -// Number of trades to get at once. Must be less than or equal to MAX_OFFSET. -const TRADES_PER_QUERY = 10000; -// Maximum offset supported by the Bloxy API. -const MAX_OFFSET = 100000; -// Buffer to subtract from offset. 
This means we will request some trades twice, -// but we are less likely to miss any data. -const OFFSET_BUFFER = 1000; -// Maximum number of days supported by the Bloxy API. -const MAX_DAYS = 30; -// Buffer used for comparing the last seen timestamp to the last returned -// timestamp. Increasing this reduces chances of data loss but also creates more -// redundancy and can impact performance. -// tslint:disable-next-line:custom-no-magic-numbers -const LAST_SEEN_TIMESTAMP_BUFFER_MS = 1000 * 60 * 30; // 30 minutes - -// tslint:disable-next-line:custom-no-magic-numbers -const millisecondsPerDay = 1000 * 60 * 60 * 24; // ms/d = ms/s * s/m * m/h * h/d - -export interface BloxyTrade { - tx_hash: string; - tx_time: string; - tx_date: string; - tx_sender: string; - tradeIndex: string; - smart_contract_id: number; - smart_contract_address: string; - contract_type: string; - maker: string; - taker: string; - amountBuy: number; - makerFee: number; - buyCurrencyId: number; - buySymbol: string; - amountSell: number; - takerFee: number; - sellCurrencyId: number; - sellSymbol: string; - maker_annotation: string; - taker_annotation: string; - protocol: string; - buyAddress: string | null; - sellAddress: string | null; -} - -interface BloxyError { - error: string; -} - -type BloxyResponse<T> = T | BloxyError; -type BloxyTradeResponse = BloxyResponse<BloxyTrade[]>; - -function isError<T>(response: BloxyResponse<T>): response is BloxyError { - return (response as BloxyError).error !== undefined; -} - -export class BloxySource { - private readonly _apiKey: string; - - constructor(apiKey: string) { - this._apiKey = apiKey; - } - - /** - * Gets all the latest trades between the lastSeenTimestamp (minus some buffer) - * and the current time. Note that because the Bloxy API has some hard - * limits it might not always be possible to get *all* the trades in the - * desired time range. - * @param lastSeenTimestamp The latest timestamp for trades that have - * already been seen. - */ - public async getDexTradesAsync(lastSeenTimestamp: number): Promise<BloxyTrade[]> { - const allTrades = await this._scrapeAllDexTradesAsync(lastSeenTimestamp); - logUtils.log(`Removing duplicates from ${allTrades.length} entries`); - const uniqueTrades = R.uniqBy((trade: BloxyTrade) => `${trade.tradeIndex}-${trade.tx_hash}`, allTrades); - logUtils.log(`Removed ${allTrades.length - uniqueTrades.length} duplicate entries`); - return uniqueTrades; - } - - // Potentially returns duplicate trades. - private async _scrapeAllDexTradesAsync(lastSeenTimestamp: number): Promise<BloxyTrade[]> { - let allTrades: BloxyTrade[] = []; - - // Clamp numberOfDays so that it is always between 1 and MAX_DAYS (inclusive) - const numberOfDays = R.clamp(1, MAX_DAYS, getDaysSinceTimestamp(lastSeenTimestamp)); - - // Keep getting trades until we hit one of the following conditions: - // - // 1. Offset hits MAX_OFFSET (we can't go back any further). - // 2. There are no more trades in the response. - // 3. We see a tx_time equal to or earlier than lastSeenTimestamp (plus - // some buffer). - // - for (let offset = 0; offset <= MAX_OFFSET; offset += TRADES_PER_QUERY - OFFSET_BUFFER) { - const trades = await this._getTradesWithOffsetAsync(numberOfDays, offset); - if (trades.length === 0) { - // There are no more trades left for the days we are querying. - // This means we are done.
- return allTrades; - } - const sortedTrades = R.reverse(R.sortBy(trade => trade.tx_time, trades)); - allTrades = allTrades.concat(sortedTrades); - - // Check if lastReturnedTimestamp < lastSeenTimestamp - const lastReturnedTimestamp = new Date(sortedTrades[0].tx_time).getTime(); - if (lastReturnedTimestamp < lastSeenTimestamp - LAST_SEEN_TIMESTAMP_BUFFER_MS) { - // We are at the point where we have already seen trades for the - // timestamp range that is being returned. We're done. - return allTrades; - } - } - return allTrades; - } - - private async _getTradesWithOffsetAsync(numberOfDays: number, offset: number): Promise<BloxyTrade[]> { - const resp = await axios.get<BloxyTradeResponse>(BLOXY_DEX_TRADES_URL, { - params: { - key: this._apiKey, - days: numberOfDays, - limit: TRADES_PER_QUERY, - offset, - }, - }); - if (isError(resp.data)) { - throw new Error(`Error in Bloxy API response: ${resp.data.error}`); - } - return resp.data; - } -} - -// Computes the number of days between the given timestamp and the current -// timestamp (rounded up). -function getDaysSinceTimestamp(timestamp: number): number { - const msSinceTimestamp = Date.now() - timestamp; - const daysSinceTimestamp = msSinceTimestamp / millisecondsPerDay; - return Math.ceil(daysSinceTimestamp); -} diff --git a/packages/pipeline/src/data_sources/contract-wrappers/erc20_events.ts b/packages/pipeline/src/data_sources/contract-wrappers/erc20_events.ts deleted file mode 100644 index e0098122f..000000000 --- a/packages/pipeline/src/data_sources/contract-wrappers/erc20_events.ts +++ /dev/null @@ -1,45 +0,0 @@ -import { - ContractWrappers, - ERC20TokenApprovalEventArgs, - ERC20TokenEvents, - ERC20TokenWrapper, -} from '@0x/contract-wrappers'; -import { Web3ProviderEngine } from '@0x/subproviders'; -import { LogWithDecodedArgs } from 'ethereum-types'; - -import { GetEventsFunc, getEventsWithPaginationAsync } from './utils'; - -export class ERC20EventsSource { - private readonly _erc20Wrapper: ERC20TokenWrapper; - private readonly _tokenAddress: string; - constructor(provider: Web3ProviderEngine, networkId: number, tokenAddress: string) { - const contractWrappers = new ContractWrappers(provider, { networkId }); - this._erc20Wrapper = contractWrappers.erc20Token; - this._tokenAddress = tokenAddress; - } - - public async getApprovalEventsAsync( - startBlock: number, - endBlock: number, - ): Promise<Array<LogWithDecodedArgs<ERC20TokenApprovalEventArgs>>> { - return getEventsWithPaginationAsync( - this._getApprovalEventsForRangeAsync.bind(this) as GetEventsFunc<ERC20TokenApprovalEventArgs>, - startBlock, - endBlock, - ); - } - - // Gets all approval events for a specific sub-range. This getter - // function will be called during each step of pagination.
- private async _getApprovalEventsForRangeAsync( - fromBlock: number, - toBlock: number, - ): Promise<Array<LogWithDecodedArgs<ERC20TokenApprovalEventArgs>>> { - return this._erc20Wrapper.getLogsAsync<ERC20TokenApprovalEventArgs>( - this._tokenAddress, - ERC20TokenEvents.Approval, - { fromBlock, toBlock }, - {}, - ); - } -} diff --git a/packages/pipeline/src/data_sources/contract-wrappers/exchange_events.ts b/packages/pipeline/src/data_sources/contract-wrappers/exchange_events.ts deleted file mode 100644 index 58691e2ab..000000000 --- a/packages/pipeline/src/data_sources/contract-wrappers/exchange_events.ts +++ /dev/null @@ -1,59 +0,0 @@ -import { - ContractWrappers, - ExchangeCancelEventArgs, - ExchangeCancelUpToEventArgs, - ExchangeEventArgs, - ExchangeEvents, - ExchangeFillEventArgs, - ExchangeWrapper, -} from '@0x/contract-wrappers'; -import { Web3ProviderEngine } from '@0x/subproviders'; -import { LogWithDecodedArgs } from 'ethereum-types'; - -import { GetEventsFunc, getEventsWithPaginationAsync } from './utils'; - -export class ExchangeEventsSource { - private readonly _exchangeWrapper: ExchangeWrapper; - constructor(provider: Web3ProviderEngine, networkId: number) { - const contractWrappers = new ContractWrappers(provider, { networkId }); - this._exchangeWrapper = contractWrappers.exchange; - } - - public async getFillEventsAsync( - startBlock: number, - endBlock: number, - ): Promise<Array<LogWithDecodedArgs<ExchangeFillEventArgs>>> { - const getFillEventsForRangeAsync = this._makeGetterFuncForEventType<ExchangeFillEventArgs>(ExchangeEvents.Fill); - return getEventsWithPaginationAsync(getFillEventsForRangeAsync, startBlock, endBlock); - } - - public async getCancelEventsAsync( - startBlock: number, - endBlock: number, - ): Promise<Array<LogWithDecodedArgs<ExchangeCancelEventArgs>>> { - const getCancelEventsForRangeAsync = this._makeGetterFuncForEventType<ExchangeCancelEventArgs>( - ExchangeEvents.Cancel, - ); - return getEventsWithPaginationAsync(getCancelEventsForRangeAsync, startBlock, endBlock); - } - - public async getCancelUpToEventsAsync( - startBlock: number, - endBlock: number, - ): Promise<Array<LogWithDecodedArgs<ExchangeCancelUpToEventArgs>>> { - const getCancelUpToEventsForRangeAsync = this._makeGetterFuncForEventType<ExchangeCancelUpToEventArgs>( - ExchangeEvents.CancelUpTo, - ); - return getEventsWithPaginationAsync(getCancelUpToEventsForRangeAsync, startBlock, endBlock); - } - - // Returns a getter function which gets all events of a specific type for a - // specific sub-range. This getter function will be called during each step - // of pagination. - private _makeGetterFuncForEventType<ArgsType extends ExchangeEventArgs>( - eventType: ExchangeEvents, - ): GetEventsFunc<ArgsType> { - return async (fromBlock: number, toBlock: number) => - this._exchangeWrapper.getLogsAsync<ArgsType>(eventType, { fromBlock, toBlock }, {}); - } -} diff --git a/packages/pipeline/src/data_sources/contract-wrappers/utils.ts b/packages/pipeline/src/data_sources/contract-wrappers/utils.ts deleted file mode 100644 index 67660a37e..000000000 --- a/packages/pipeline/src/data_sources/contract-wrappers/utils.ts +++ /dev/null @@ -1,67 +0,0 @@ -import { DecodedLogArgs, LogWithDecodedArgs } from 'ethereum-types'; - -const NUM_BLOCKS_PER_QUERY = 10000; // Number of blocks to query for events at a time. -const NUM_RETRIES = 3; // Number of retries if a request fails or times out. 
- -export type GetEventsFunc<ArgsType extends DecodedLogArgs> = ( - fromBlock: number, - toBlock: number, -) => Promise<Array<LogWithDecodedArgs<ArgsType>>>; - -/** - * Gets all events between the given startBlock and endBlock by querying for - * NUM_BLOCKS_PER_QUERY at a time. Accepts a getter function in order to - * maximize code re-use and allow for getting different types of events for - * different contracts. If the getter function throws with a retryable error, - * it will automatically be retried up to NUM_RETRIES times. - * @param getEventsAsync A getter function which will be called for each step during pagination. - * @param startBlock The start of the entire block range to get events for. - * @param endBlock The end of the entire block range to get events for. - */ -export async function getEventsWithPaginationAsync<ArgsType extends DecodedLogArgs>( - getEventsAsync: GetEventsFunc<ArgsType>, - startBlock: number, - endBlock: number, -): Promise<Array<LogWithDecodedArgs<ArgsType>>> { - let events: Array<LogWithDecodedArgs<ArgsType>> = []; - for (let fromBlock = startBlock; fromBlock <= endBlock; fromBlock += NUM_BLOCKS_PER_QUERY) { - const toBlock = Math.min(fromBlock + NUM_BLOCKS_PER_QUERY - 1, endBlock); - const eventsInRange = await _getEventsWithRetriesAsync(getEventsAsync, NUM_RETRIES, fromBlock, toBlock); - events = events.concat(eventsInRange); - } - return events; -} - -/** - * Calls the getEventsAsync function and retries up to numRetries times if it - * throws with an error that is considered retryable. - * @param getEventsAsync a function that will be called on each iteration. - * @param numRetries the maximum number times to retry getEventsAsync if it fails with a retryable error. - * @param fromBlock the start of the sub-range of blocks we are getting events for. - * @param toBlock the end of the sub-range of blocks we are getting events for. 
- */ -export async function _getEventsWithRetriesAsync<ArgsType extends DecodedLogArgs>( - getEventsAsync: GetEventsFunc<ArgsType>, - numRetries: number, - fromBlock: number, - toBlock: number, -): Promise<Array<LogWithDecodedArgs<ArgsType>>> { - let eventsInRange: Array<LogWithDecodedArgs<ArgsType>> = []; - for (let i = 0; i <= numRetries; i++) { - try { - eventsInRange = await getEventsAsync(fromBlock, toBlock); - } catch (err) { - if (isErrorRetryable(err) && i < numRetries) { - continue; - } else { - throw err; - } - } - break; - } - return eventsInRange; -} - -function isErrorRetryable(err: Error): boolean { - return err.message.includes('network timeout'); -} diff --git a/packages/pipeline/src/data_sources/copper/index.ts b/packages/pipeline/src/data_sources/copper/index.ts deleted file mode 100644 index 15df2fd7d..000000000 --- a/packages/pipeline/src/data_sources/copper/index.ts +++ /dev/null @@ -1,126 +0,0 @@ -import { fetchAsync } from '@0x/utils'; -import Bottleneck from 'bottleneck'; - -import { - CopperActivityTypeCategory, - CopperActivityTypeResponse, - CopperCustomFieldResponse, - CopperSearchResponse, -} from '../../parsers/copper'; - -const HTTP_OK_STATUS = 200; -const COPPER_URI = 'https://api.prosperworks.com/developer_api/v1'; - -const DEFAULT_PAGINATION_PARAMS = { - page_size: 200, - sort_by: 'date_modified', - sort_direction: 'desc', -}; - -export type CopperSearchParams = CopperLeadSearchParams | CopperActivitySearchParams | CopperOpportunitySearchParams; -export interface CopperLeadSearchParams { - page_number?: number; -} - -export interface CopperActivitySearchParams { - minimum_activity_date: number; - page_number?: number; -} - -export interface CopperOpportunitySearchParams { - sort_by: string; // must override the default 'date_modified' for this endpoint - page_number?: number; -} -export enum CopperEndpoint { - Leads = '/leads/search', - Opportunities = '/opportunities/search', - Activities = '/activities/search', -} -const ONE_SECOND = 1000; - -function httpErrorCheck(response: Response): void { - if (response.status !== HTTP_OK_STATUS) { - throw new Error(`HTTP error while scraping Copper: [${JSON.stringify(response)}]`); - } -} -export class CopperSource { - private readonly _accessToken: string; - private readonly _userEmail: string; - private readonly _defaultHeaders: any; - private readonly _limiter: Bottleneck; - - constructor(maxConcurrentRequests: number, accessToken: string, userEmail: string) { - this._accessToken = accessToken; - this._userEmail = userEmail; - this._defaultHeaders = { - 'Content-Type': 'application/json', - 'X-PW-AccessToken': this._accessToken, - 'X-PW-Application': 'developer_api', - 'X-PW-UserEmail': this._userEmail, - }; - this._limiter = new Bottleneck({ - minTime: ONE_SECOND / maxConcurrentRequests, - reservoir: 30, - reservoirRefreshAmount: 30, - reservoirRefreshInterval: ONE_SECOND, // interval is in milliseconds: refresh the 30-request reservoir every second - }); - } - - public async fetchNumberOfPagesAsync(endpoint: CopperEndpoint, searchParams?: CopperSearchParams): Promise<number> { - const resp = await this._limiter.schedule(() => - fetchAsync(COPPER_URI + endpoint, { - method: 'POST', - body: JSON.stringify({ ...DEFAULT_PAGINATION_PARAMS, ...searchParams }), - headers: this._defaultHeaders, - }), - ); - - httpErrorCheck(resp); - - // total number of records that match the request parameters - if (resp.headers.has('X-Pw-Total')) { - const totalRecords: number = parseInt(resp.headers.get('X-Pw-Total') as string, 10); // tslint:disable-line:custom-no-magic-numbers - return
Math.ceil(totalRecords / DEFAULT_PAGINATION_PARAMS.page_size); - } else { - return 1; - } - } - public async fetchSearchResultsAsync<T extends CopperSearchResponse>( - endpoint: CopperEndpoint, - searchParams?: CopperSearchParams, - ): Promise<T[]> { - const request = { ...DEFAULT_PAGINATION_PARAMS, ...searchParams }; - const response = await this._limiter.schedule(() => - fetchAsync(COPPER_URI + endpoint, { - method: 'POST', - body: JSON.stringify(request), - headers: this._defaultHeaders, - }), - ); - httpErrorCheck(response); - const json: T[] = await response.json(); - return json; - } - - public async fetchActivityTypesAsync(): Promise<Map<CopperActivityTypeCategory, CopperActivityTypeResponse[]>> { - const response = await this._limiter.schedule(() => - fetchAsync(`${COPPER_URI}/activity_types`, { - method: 'GET', - headers: this._defaultHeaders, - }), - ); - httpErrorCheck(response); - return response.json(); - } - - public async fetchCustomFieldsAsync(): Promise<CopperCustomFieldResponse[]> { - const response = await this._limiter.schedule(() => - fetchAsync(`${COPPER_URI}/custom_field_definitions`, { - method: 'GET', - headers: this._defaultHeaders, - }), - ); - httpErrorCheck(response); - return response.json(); - } -} diff --git a/packages/pipeline/src/data_sources/ddex/index.ts b/packages/pipeline/src/data_sources/ddex/index.ts deleted file mode 100644 index 7ef92b90f..000000000 --- a/packages/pipeline/src/data_sources/ddex/index.ts +++ /dev/null @@ -1,77 +0,0 @@ -import { fetchAsync, logUtils } from '@0x/utils'; - -const DDEX_BASE_URL = 'https://api.ddex.io/v3'; -const ACTIVE_MARKETS_URL = `${DDEX_BASE_URL}/markets`; -const NO_AGGREGATION_LEVEL = 3; // See https://docs.ddex.io/#get-orderbook -const ORDERBOOK_ENDPOINT = `/orderbook?level=${NO_AGGREGATION_LEVEL}`; -export const DDEX_SOURCE = 'ddex'; - -export interface DdexActiveMarketsResponse { - status: number; - desc: string; - data: { - markets: DdexMarket[]; - }; -} - -export interface DdexMarket { - id: string; - quoteToken: string; - quoteTokenDecimals: number; - quoteTokenAddress: string; - baseToken: string; - baseTokenDecimals: number; - baseTokenAddress: string; - minOrderSize: string; - pricePrecision: number; - priceDecimals: number; - amountDecimals: number; -} - -export interface DdexOrderbookResponse { - status: number; - desc: string; - data: { - orderBook: DdexOrderbook; - }; -} - -export interface DdexOrderbook { - marketId: string; - bids: DdexOrder[]; - asks: DdexOrder[]; -} - -export interface DdexOrder { - price: string; - amount: string; - orderId: string; -} - -// tslint:disable:prefer-function-over-method -// ^ Keep consistency with other sources and help logical organization -export class DdexSource { - /** - * Call Ddex API to find out which markets they are maintaining orderbooks for. - */ - public async getActiveMarketsAsync(): Promise<DdexMarket[]> { - logUtils.log('Getting all active DDEX markets'); - const resp = await fetchAsync(ACTIVE_MARKETS_URL); - const respJson: DdexActiveMarketsResponse = await resp.json(); - const markets = respJson.data.markets; - logUtils.log(`Got ${markets.length} markets.`); - return markets; - } - - /** - * Retrieve orderbook from Ddex API for a given market. - * @param marketId String identifying the market we want data for. Eg. 
'REP/AUG' - */ - public async getMarketOrderbookAsync(marketId: string): Promise<DdexOrderbook> { - logUtils.log(`${marketId}: Retrieving orderbook.`); - const marketOrderbookUrl = `${ACTIVE_MARKETS_URL}/${marketId}${ORDERBOOK_ENDPOINT}`; - const resp = await fetchAsync(marketOrderbookUrl); - const respJson: DdexOrderbookResponse = await resp.json(); - return respJson.data.orderBook; - } -} diff --git a/packages/pipeline/src/data_sources/idex/index.ts b/packages/pipeline/src/data_sources/idex/index.ts deleted file mode 100644 index c1e53c08d..000000000 --- a/packages/pipeline/src/data_sources/idex/index.ts +++ /dev/null @@ -1,82 +0,0 @@ -import { fetchAsync } from '@0x/utils'; - -const IDEX_BASE_URL = 'https://api.idex.market'; -const MARKETS_URL = `${IDEX_BASE_URL}/returnTicker`; -const ORDERBOOK_URL = `${IDEX_BASE_URL}/returnOrderBook`; -const MAX_ORDER_COUNT = 100; // Maximum based on https://github.com/AuroraDAO/idex-api-docs#returnorderbook -export const IDEX_SOURCE = 'idex'; - -export interface IdexMarketsResponse { - [marketName: string]: IdexMarket; -} - -export interface IdexMarket { - last: string; - high: string; - low: string; - lowestAsk: string; - highestBid: string; - percentChange: string; - baseVolume: string; - quoteVolume: string; -} - -export interface IdexOrderbook { - asks: IdexOrder[]; - bids: IdexOrder[]; -} - -export interface IdexOrder { - price: string; - amount: string; - total: string; - orderHash: string; - params: IdexOrderParam; -} - -export interface IdexOrderParam { - tokenBuy: string; - buySymbol: string; - buyPrecision: number; - amountBuy: string; - tokenSell: string; - sellSymbol: string; - sellPrecision: number; - amountSell: string; - expires: number; - nonce: number; - user: string; -} - -// tslint:disable:prefer-function-over-method -// ^ Keep consistency with other sources and help logical organization -export class IdexSource { - /** - * Call Idex API to find out which markets they are maintaining orderbooks for. - */ - public async getMarketsAsync(): Promise<string[]> { - const params = { method: 'POST' }; - const resp = await fetchAsync(MARKETS_URL, params); - const respJson: IdexMarketsResponse = await resp.json(); - const markets: string[] = Object.keys(respJson); - return markets; - } - - /** - * Retrieve orderbook from Idex API for a given market. - * @param marketId String identifying the market we want data for. Eg. 'REP_AUG' - */ - public async getMarketOrderbookAsync(marketId: string): Promise<IdexOrderbook> { - const params = { - method: 'POST', - headers: { 'Content-Type': 'application/json' }, - body: JSON.stringify({ - market: marketId, - count: MAX_ORDER_COUNT, - }), - }; - const resp = await fetchAsync(ORDERBOOK_URL, params); - const respJson: IdexOrderbook = await resp.json(); - return respJson; - } -} diff --git a/packages/pipeline/src/data_sources/oasis/index.ts b/packages/pipeline/src/data_sources/oasis/index.ts deleted file mode 100644 index 3b30e9dfd..000000000 --- a/packages/pipeline/src/data_sources/oasis/index.ts +++ /dev/null @@ -1,103 +0,0 @@ -import { fetchAsync } from '@0x/utils'; - -const OASIS_BASE_URL = 'https://data.makerdao.com/v1'; -const OASIS_MARKET_QUERY = `query { - oasisMarkets(period: "1 week") { - nodes { - id - base - quote - buyVol - sellVol - price - high - low - } - } -}`; -const OASIS_ORDERBOOK_QUERY = `query ($market: String!) 
{ - allOasisOrders(condition: { market: $market }) { - totalCount - nodes { - market - offerId - price - amount - act - } - } -}`; -export const OASIS_SOURCE = 'oasis'; - -export interface OasisMarket { - id: string; // market symbol e.g. MKRDAI - base: string; // base symbol e.g. MKR - quote: string; // quote symbol e.g. DAI - buyVol: number; // total buy volume (base) - sellVol: number; // total sell volume (base) - price: number; // volume weighted price (quote) - high: number; // max sell price - low: number; // min buy price -} - -export interface OasisMarketResponse { - data: { - oasisMarkets: { - nodes: OasisMarket[]; - }; - }; -} - -export interface OasisOrder { - offerId: number; // Offer Id - market: string; // Market symbol (base/quote) - price: string; // Offer price (quote) - amount: string; // Offer amount (base) - act: string; // Action (ask|bid) -} - -export interface OasisOrderbookResponse { - data: { - allOasisOrders: { - totalCount: number; - nodes: OasisOrder[]; - }; - }; -} - -// tslint:disable:prefer-function-over-method -// ^ Keep consistency with other sources and help logical organization -export class OasisSource { - /** - * Call the Oasis API to find out which markets it is maintaining orderbooks for. - */ - public async getActiveMarketsAsync(): Promise<OasisMarket[]> { - const params = { - method: 'POST', - headers: { 'Content-Type': 'application/json' }, - body: JSON.stringify({ query: OASIS_MARKET_QUERY }), - }; - const resp = await fetchAsync(OASIS_BASE_URL, params); - const respJson: OasisMarketResponse = await resp.json(); - const markets = respJson.data.oasisMarkets.nodes; - return markets; - } - - /** - * Retrieve orderbook from Oasis API for a given market. - * @param marketId String identifying the market we want data for. Eg. 'REPAUG'.
- */ - public async getMarketOrderbookAsync(marketId: string): Promise<OasisOrder[]> { - const input = { - market: marketId, - }; - const params = { - method: 'POST', - headers: { 'Content-Type': 'application/json' }, - body: JSON.stringify({ query: OASIS_ORDERBOOK_QUERY, variables: input }), - }; - const resp = await fetchAsync(OASIS_BASE_URL, params); - const respJson: OasisOrderbookResponse = await resp.json(); - return respJson.data.allOasisOrders.nodes; - } -} diff --git a/packages/pipeline/src/data_sources/ohlcv_external/crypto_compare.ts b/packages/pipeline/src/data_sources/ohlcv_external/crypto_compare.ts deleted file mode 100644 index 85042501b..000000000 --- a/packages/pipeline/src/data_sources/ohlcv_external/crypto_compare.ts +++ /dev/null @@ -1,110 +0,0 @@ -// tslint:disable:no-duplicate-imports -import { fetchAsync } from '@0x/utils'; -import Bottleneck from 'bottleneck'; -import { stringify } from 'querystring'; -import * as R from 'ramda'; - -import { TradingPair } from '../../utils/get_ohlcv_trading_pairs'; - -export interface CryptoCompareOHLCVResponse { - Data: CryptoCompareOHLCVRecord[]; - Response: string; - Message: string; - Type: number; -} - -export interface CryptoCompareOHLCVRecord { - time: number; // in seconds, not milliseconds - close: number; - high: number; - low: number; - open: number; - volumefrom: number; - volumeto: number; -} - -export interface CryptoCompareOHLCVParams { - fsym: string; - tsym: string; - e?: string; - aggregate?: string; - aggregatePredictableTimePeriods?: boolean; - limit?: number; - toTs?: number; -} - -const ONE_HOUR = 60 * 60 * 1000; // tslint:disable-line:custom-no-magic-numbers -const ONE_SECOND = 1000; -const ONE_HOUR_AGO = new Date().getTime() - ONE_HOUR; -const HTTP_OK_STATUS = 200; -const CRYPTO_COMPARE_VALID_EMPTY_RESPONSE_TYPE = 96; -const MAX_PAGE_SIZE = 2000; - -export class CryptoCompareOHLCVSource { - public readonly intervalBetweenRecords = ONE_HOUR; - public readonly defaultExchange = 'CCCAGG'; - public readonly interval = this.intervalBetweenRecords * MAX_PAGE_SIZE; // the hourly API returns data for one interval at a time - private readonly _url: string = 'https://min-api.cryptocompare.com/data/histohour?'; - - // rate-limit for all API calls through this class instance - private readonly _limiter: Bottleneck; - constructor(maxReqsPerSecond: number) { - this._limiter = new Bottleneck({ - minTime: ONE_SECOND / maxReqsPerSecond, - reservoir: 30, - reservoirRefreshAmount: 30, - reservoirRefreshInterval: ONE_SECOND, - }); - } - - // gets OHLCV records starting from pair.latest - public async getHourlyOHLCVAsync(pair: TradingPair): Promise<CryptoCompareOHLCVRecord[]> { - const params = { - e: this.defaultExchange, - fsym: pair.fromSymbol, - tsym: pair.toSymbol, - limit: MAX_PAGE_SIZE, - toTs: Math.floor((pair.latestSavedTime + this.interval) / ONE_SECOND), // CryptoCompare uses timestamp in seconds. 
not ms - }; - const url = this._url + stringify(params); - const response = await this._limiter.schedule(() => fetchAsync(url)); - if (response.status !== HTTP_OK_STATUS) { - throw new Error(`HTTP error while scraping Crypto Compare: [${response}]`); - } - const json: CryptoCompareOHLCVResponse = await response.json(); - if ( - (json.Response === 'Error' || json.Data.length === 0) && - json.Type !== CRYPTO_COMPARE_VALID_EMPTY_RESPONSE_TYPE - ) { - throw new Error(JSON.stringify(json)); - } - return json.Data.filter(rec => { - return ( - // Crypto Compare takes ~30 mins to finalise records - rec.time * ONE_SECOND < ONE_HOUR_AGO && rec.time * ONE_SECOND > pair.latestSavedTime && hasData(rec) - ); - }); - } - public generateBackfillIntervals(pair: TradingPair): TradingPair[] { - const now = new Date().getTime(); - const f = (p: TradingPair): false | [TradingPair, TradingPair] => { - if (p.latestSavedTime > now) { - return false; - } else { - return [p, R.merge(p, { latestSavedTime: p.latestSavedTime + this.interval })]; - } - }; - return R.unfold(f, pair); - } -} - -function hasData(record: CryptoCompareOHLCVRecord): boolean { - return ( - record.close !== 0 || - record.open !== 0 || - record.high !== 0 || - record.low !== 0 || - record.volumefrom !== 0 || - record.volumeto !== 0 - ); -} diff --git a/packages/pipeline/src/data_sources/paradex/index.ts b/packages/pipeline/src/data_sources/paradex/index.ts deleted file mode 100644 index 46d448f4b..000000000 --- a/packages/pipeline/src/data_sources/paradex/index.ts +++ /dev/null @@ -1,92 +0,0 @@ -import { fetchAsync, logUtils } from '@0x/utils'; - -const PARADEX_BASE_URL = 'https://api.paradex.io/consumer/v0'; -const ACTIVE_MARKETS_URL = `${PARADEX_BASE_URL}/markets`; -const ORDERBOOK_ENDPOINT = `${PARADEX_BASE_URL}/orderbook`; -const TOKEN_INFO_ENDPOINT = `${PARADEX_BASE_URL}/tokens`; -export const PARADEX_SOURCE = 'paradex'; - -export type ParadexActiveMarketsResponse = ParadexMarket[]; - -export interface ParadexMarket { - id: string; - symbol: string; - baseToken: string; - quoteToken: string; - minOrderSize: string; - maxOrderSize: string; - priceMaxDecimals: number; - amountMaxDecimals: number; - // These are not native to the Paradex API response. We tag them on later - // by calling the token endpoint and joining on symbol. - baseTokenAddress?: string; - quoteTokenAddress?: string; -} - -export interface ParadexOrderbookResponse { - marketId: number; - marketSymbol: string; - bids: ParadexOrder[]; - asks: ParadexOrder[]; -} - -export interface ParadexOrder { - amount: string; - price: string; -} - -export type ParadexTokenInfoResponse = ParadexTokenInfo[]; - -export interface ParadexTokenInfo { - name: string; - symbol: string; - address: string; -} - -export class ParadexSource { - private readonly _apiKey: string; - - constructor(apiKey: string) { - this._apiKey = apiKey; - } - - /** - * Call Paradex API to find out which markets they are maintaining orderbooks for. - */ - public async getActiveMarketsAsync(): Promise<ParadexActiveMarketsResponse> { - logUtils.log('Getting all active Paradex markets.'); - const resp = await fetchAsync(ACTIVE_MARKETS_URL, { - headers: { 'API-KEY': this._apiKey }, - }); - const markets: ParadexActiveMarketsResponse = await resp.json(); - logUtils.log(`Got ${markets.length} markets.`); - return markets; - } - - /** - * Call Paradex API to find out their token information. 
- */ - public async getTokenInfoAsync(): Promise<ParadexTokenInfoResponse> { - logUtils.log('Getting token information from Paradex.'); - const resp = await fetchAsync(TOKEN_INFO_ENDPOINT, { - headers: { 'API-KEY': this._apiKey }, - }); - const tokens: ParadexTokenInfoResponse = await resp.json(); - logUtils.log(`Got information for ${tokens.length} tokens.`); - return tokens; - } - - /** - * Retrieve orderbook from Paradex API for a given market. - * @param marketSymbol String representing the market we want data for. - */ - public async getMarketOrderbookAsync(marketSymbol: string): Promise<ParadexOrderbookResponse> { - logUtils.log(`${marketSymbol}: Retrieving orderbook.`); - const marketOrderbookUrl = `${ORDERBOOK_ENDPOINT}?market=${marketSymbol}`; - const resp = await fetchAsync(marketOrderbookUrl, { - headers: { 'API-KEY': this._apiKey }, - }); - const orderbookResponse: ParadexOrderbookResponse = await resp.json(); - return orderbookResponse; - } -} diff --git a/packages/pipeline/src/data_sources/relayer-registry/index.ts b/packages/pipeline/src/data_sources/relayer-registry/index.ts deleted file mode 100644 index 8133f5eae..000000000 --- a/packages/pipeline/src/data_sources/relayer-registry/index.ts +++ /dev/null @@ -1,33 +0,0 @@ -import axios from 'axios'; - -export interface RelayerResponse { - name: string; - homepage_url: string; - app_url: string; - header_img: string; - logo_img: string; - networks: RelayerResponseNetwork[]; -} - -export interface RelayerResponseNetwork { - networkId: number; - sra_http_endpoint?: string; - sra_ws_endpoint?: string; - static_order_fields?: { - fee_recipient_addresses?: string[]; - taker_addresses?: string[]; - }; -} - -export class RelayerRegistrySource { - private readonly _url: string; - - constructor(url: string) { - this._url = url; - } - - public async getRelayerInfoAsync(): Promise<Map<string, RelayerResponse>> { - const resp = await axios.get<Map<string, RelayerResponse>>(this._url); - return resp.data; - } -} diff --git a/packages/pipeline/src/data_sources/trusted_tokens/index.ts b/packages/pipeline/src/data_sources/trusted_tokens/index.ts deleted file mode 100644 index 552739fb9..000000000 --- a/packages/pipeline/src/data_sources/trusted_tokens/index.ts +++ /dev/null @@ -1,29 +0,0 @@ -import axios from 'axios'; - -export interface ZeroExTrustedTokenMeta { - address: string; - name: string; - symbol: string; - decimals: number; -} - -export interface MetamaskTrustedTokenMeta { - address: string; - name: string; - erc20: boolean; - symbol: string; - decimals: number; -} - -export class TrustedTokenSource<T> { - private readonly _url: string; - - constructor(url: string) { - this._url = url; - } - - public async getTrustedTokenMetaAsync(): Promise<T> { - const resp = await axios.get<T>(this._url); - return resp.data; - } -} diff --git a/packages/pipeline/src/data_sources/web3/index.ts b/packages/pipeline/src/data_sources/web3/index.ts deleted file mode 100644 index 45a9ea161..000000000 --- a/packages/pipeline/src/data_sources/web3/index.ts +++ /dev/null @@ -1,22 +0,0 @@ -import { Web3ProviderEngine } from '@0x/subproviders'; -import { Web3Wrapper } from '@0x/web3-wrapper'; -import { BlockWithoutTransactionData, Transaction } from 'ethereum-types'; - -export class Web3Source { - private readonly _web3Wrapper: Web3Wrapper; - constructor(provider: Web3ProviderEngine) { - this._web3Wrapper = new Web3Wrapper(provider); - } - - public async getBlockInfoAsync(blockNumber: number): Promise<BlockWithoutTransactionData> { - const block = await 
this._web3Wrapper.getBlockIfExistsAsync(blockNumber); - if (block == null) { - return Promise.reject(new Error(`Could not find block for given block number: ${blockNumber}`)); - } - return block; - } - - public async getTransactionInfoAsync(txHash: string): Promise<Transaction> { - return this._web3Wrapper.getTransactionByHashAsync(txHash); - } -} diff --git a/packages/pipeline/src/entities/block.ts b/packages/pipeline/src/entities/block.ts deleted file mode 100644 index 398946622..000000000 --- a/packages/pipeline/src/entities/block.ts +++ /dev/null @@ -1,13 +0,0 @@ -import { Column, Entity, PrimaryColumn } from 'typeorm'; - -import { numberToBigIntTransformer } from '../utils'; - -@Entity({ name: 'blocks', schema: 'raw' }) -export class Block { - @PrimaryColumn() public hash!: string; - @PrimaryColumn({ transformer: numberToBigIntTransformer }) - public number!: number; - - @Column({ name: 'timestamp', transformer: numberToBigIntTransformer }) - public timestamp!: number; -} diff --git a/packages/pipeline/src/entities/copper_activity.ts b/packages/pipeline/src/entities/copper_activity.ts deleted file mode 100644 index cbc034285..000000000 --- a/packages/pipeline/src/entities/copper_activity.ts +++ /dev/null @@ -1,41 +0,0 @@ -import { Column, Entity, Index, PrimaryColumn } from 'typeorm'; - -import { numberToBigIntTransformer } from '../utils'; - -@Entity({ name: 'copper_activities', schema: 'raw' }) -export class CopperActivity { - @PrimaryColumn({ type: 'bigint', transformer: numberToBigIntTransformer }) - public id!: number; - - @Index() - @Column({ name: 'parent_id', type: 'bigint', transformer: numberToBigIntTransformer }) - public parentId!: number; - @Column({ name: 'parent_type', type: 'varchar' }) - public parentType!: string; - - // join with CopperActivityType - @Index() - @Column({ name: 'type_id', type: 'bigint', transformer: numberToBigIntTransformer }) - public typeId!: number; - @Column({ name: 'type_category', type: 'varchar' }) - public typeCategory!: string; - @Column({ name: 'type_name', type: 'varchar', nullable: true }) - public typeName?: string; - - @Column({ name: 'user_id', type: 'bigint', transformer: numberToBigIntTransformer }) - public userId!: number; - @Column({ name: 'old_value_id', type: 'bigint', nullable: true, transformer: numberToBigIntTransformer }) - public oldValueId?: number; - @Column({ name: 'old_value_name', type: 'varchar', nullable: true }) - public oldValueName?: string; - @Column({ name: 'new_value_id', type: 'bigint', nullable: true, transformer: numberToBigIntTransformer }) - public newValueId?: number; - @Column({ name: 'new_value_name', type: 'varchar', nullable: true }) - public newValueName?: string; - - @Index() - @Column({ name: 'date_created', type: 'bigint', transformer: numberToBigIntTransformer }) - public dateCreated!: number; - @PrimaryColumn({ name: 'date_modified', type: 'bigint', transformer: numberToBigIntTransformer }) - public dateModified!: number; -} diff --git a/packages/pipeline/src/entities/copper_activity_type.ts b/packages/pipeline/src/entities/copper_activity_type.ts deleted file mode 100644 index 8fb2dcf70..000000000 --- a/packages/pipeline/src/entities/copper_activity_type.ts +++ /dev/null @@ -1,17 +0,0 @@ -import { Column, Entity, PrimaryColumn } from 'typeorm'; - -import { numberToBigIntTransformer } from '../utils'; - -@Entity({ name: 'copper_activity_types', schema: 'raw' }) -export class CopperActivityType { - @PrimaryColumn({ type: 'bigint', transformer: numberToBigIntTransformer }) - public id!: number; 
- @Column({ name: 'category', type: 'varchar' }) - public category!: string; - @Column({ name: 'name', type: 'varchar' }) - public name!: string; - @Column({ name: 'is_disabled', type: 'boolean', nullable: true }) - public isDisabled?: boolean; - @Column({ name: 'count_as_interaction', type: 'boolean', nullable: true }) - public countAsInteraction?: boolean; -} diff --git a/packages/pipeline/src/entities/copper_custom_field.ts b/packages/pipeline/src/entities/copper_custom_field.ts deleted file mode 100644 index f23f6ab22..000000000 --- a/packages/pipeline/src/entities/copper_custom_field.ts +++ /dev/null @@ -1,15 +0,0 @@ -import { Column, Entity, PrimaryColumn } from 'typeorm'; - -import { numberToBigIntTransformer } from '../utils'; - -@Entity({ name: 'copper_custom_fields', schema: 'raw' }) -export class CopperCustomField { - @PrimaryColumn({ type: 'bigint', transformer: numberToBigIntTransformer }) - public id!: number; - @Column({ name: 'data_type', type: 'varchar' }) - public dataType!: string; - @Column({ name: 'field_type', type: 'varchar', nullable: true }) - public fieldType?: string; - @Column({ name: 'name', type: 'varchar' }) - public name!: string; -} diff --git a/packages/pipeline/src/entities/copper_lead.ts b/packages/pipeline/src/entities/copper_lead.ts deleted file mode 100644 index c51ccd761..000000000 --- a/packages/pipeline/src/entities/copper_lead.ts +++ /dev/null @@ -1,38 +0,0 @@ -import { Column, Entity, Index, PrimaryColumn } from 'typeorm'; - -import { numberToBigIntTransformer } from '../utils'; - -@Entity({ name: 'copper_leads', schema: 'raw' }) -export class CopperLead { - @PrimaryColumn({ type: 'bigint', transformer: numberToBigIntTransformer }) - public id!: number; - - @Column({ name: 'name', type: 'varchar', nullable: true }) - public name?: string; - @Column({ name: 'first_name', type: 'varchar', nullable: true }) - public firstName?: string; - @Column({ name: 'last_name', type: 'varchar', nullable: true }) - public lastName?: string; - @Column({ name: 'middle_name', type: 'varchar', nullable: true }) - public middleName?: string; - @Column({ name: 'assignee_id', type: 'bigint', transformer: numberToBigIntTransformer, nullable: true }) - public assigneeId?: number; - @Column({ name: 'company_name', type: 'varchar', nullable: true }) - public companyName?: string; - @Column({ name: 'customer_source_id', type: 'bigint', transformer: numberToBigIntTransformer, nullable: true }) - public customerSourceId?: number; - @Column({ name: 'monetary_value', type: 'integer', nullable: true }) - public monetaryValue?: number; - @Column({ name: 'status', type: 'varchar' }) - public status!: string; - @Column({ name: 'status_id', type: 'bigint', transformer: numberToBigIntTransformer }) - public statusId!: number; - @Column({ name: 'title', type: 'varchar', nullable: true }) - public title?: string; - - @Index() - @Column({ name: 'date_created', type: 'bigint', transformer: numberToBigIntTransformer }) - public dateCreated!: number; - @PrimaryColumn({ name: 'date_modified', type: 'bigint', transformer: numberToBigIntTransformer }) - public dateModified!: number; -} diff --git a/packages/pipeline/src/entities/copper_opportunity.ts b/packages/pipeline/src/entities/copper_opportunity.ts deleted file mode 100644 index e12bd69ce..000000000 --- a/packages/pipeline/src/entities/copper_opportunity.ts +++ /dev/null @@ -1,45 +0,0 @@ -import { Column, Entity, PrimaryColumn } from 'typeorm'; - -import { numberToBigIntTransformer } from '../utils'; - -@Entity({ name: 
'copper_opportunities', schema: 'raw' }) -export class CopperOpportunity { - @PrimaryColumn({ name: 'id', type: 'bigint', transformer: numberToBigIntTransformer }) - public id!: number; - @Column({ name: 'name', type: 'varchar' }) - public name!: string; - @Column({ name: 'assignee_id', nullable: true, type: 'bigint', transformer: numberToBigIntTransformer }) - public assigneeId?: number; - @Column({ name: 'close_date', nullable: true, type: 'varchar' }) - public closeDate?: string; - @Column({ name: 'company_id', nullable: true, type: 'bigint', transformer: numberToBigIntTransformer }) - public companyId?: number; - @Column({ name: 'company_name', nullable: true, type: 'varchar' }) - public companyName?: string; - @Column({ name: 'customer_source_id', nullable: true, type: 'bigint', transformer: numberToBigIntTransformer }) - public customerSourceId?: number; - @Column({ name: 'loss_reason_id', nullable: true, type: 'bigint', transformer: numberToBigIntTransformer }) - public lossReasonId?: number; - @Column({ name: 'pipeline_id', type: 'bigint', transformer: numberToBigIntTransformer }) - public pipelineId!: number; - @Column({ name: 'pipeline_stage_id', type: 'bigint', transformer: numberToBigIntTransformer }) - public pipelineStageId!: number; - @Column({ name: 'primary_contact_id', nullable: true, type: 'bigint', transformer: numberToBigIntTransformer }) - public primaryContactId?: number; - @Column({ name: 'priority', nullable: true, type: 'varchar' }) - public priority?: string; - @Column({ name: 'status', type: 'varchar' }) - public status!: string; - @Column({ name: 'interaction_count', type: 'bigint', transformer: numberToBigIntTransformer }) - public interactionCount!: number; - @Column({ name: 'monetary_value', nullable: true, type: 'integer' }) - public monetaryValue?: number; - @Column({ name: 'win_probability', nullable: true, type: 'integer' }) - public winProbability?: number; - @Column({ name: 'date_created', type: 'bigint', transformer: numberToBigIntTransformer }) - public dateCreated!: number; - @PrimaryColumn({ name: 'date_modified', type: 'bigint', transformer: numberToBigIntTransformer }) - public dateModified!: number; - @Column({ name: 'custom_fields', type: 'jsonb' }) - public customFields!: { [key: number]: number }; -} diff --git a/packages/pipeline/src/entities/dex_trade.ts b/packages/pipeline/src/entities/dex_trade.ts deleted file mode 100644 index 93dcaf238..000000000 --- a/packages/pipeline/src/entities/dex_trade.ts +++ /dev/null @@ -1,56 +0,0 @@ -import { BigNumber } from '@0x/utils'; -import { Column, Entity, PrimaryColumn } from 'typeorm'; - -import { bigNumberTransformer, numberToBigIntTransformer } from '../utils'; - -@Entity({ name: 'dex_trades', schema: 'raw' }) -export class DexTrade { - @PrimaryColumn({ name: 'source_url' }) - public sourceUrl!: string; - @PrimaryColumn({ name: 'tx_hash' }) - public txHash!: string; - @PrimaryColumn({ name: 'trade_index' }) - public tradeIndex!: string; - - @Column({ name: 'tx_timestamp', type: 'bigint', transformer: numberToBigIntTransformer }) - public txTimestamp!: number; - @Column({ name: 'tx_date' }) - public txDate!: string; - @Column({ name: 'tx_sender' }) - public txSender!: string; - @Column({ name: 'smart_contract_id', type: 'bigint', transformer: numberToBigIntTransformer }) - public smartContractId!: number; - @Column({ name: 'smart_contract_address' }) - public smartContractAddress!: string; - @Column({ name: 'contract_type' }) - public contractType!: string; - @Column({ type: 'varchar' }) - public 
maker!: string; - @Column({ type: 'varchar' }) - public taker!: string; - @Column({ name: 'amount_buy', type: 'numeric', transformer: bigNumberTransformer }) - public amountBuy!: BigNumber; - @Column({ name: 'maker_fee_amount', type: 'numeric', transformer: bigNumberTransformer }) - public makerFeeAmount!: BigNumber; - @Column({ name: 'buy_currency_id', type: 'bigint', transformer: numberToBigIntTransformer }) - public buyCurrencyId!: number; - @Column({ name: 'buy_symbol' }) - public buySymbol!: string; - @Column({ name: 'amount_sell', type: 'numeric', transformer: bigNumberTransformer }) - public amountSell!: BigNumber; - @Column({ name: 'taker_fee_amount', type: 'numeric', transformer: bigNumberTransformer }) - public takerFeeAmount!: BigNumber; - @Column({ name: 'sell_currency_id', type: 'bigint', transformer: numberToBigIntTransformer }) - public sellCurrencyId!: number; - @Column({ name: 'sell_symbol' }) - public sellSymbol!: string; - @Column({ name: 'maker_annotation' }) - public makerAnnotation!: string; - @Column({ name: 'taker_annotation' }) - public takerAnnotation!: string; - @Column() public protocol!: string; - @Column({ name: 'buy_address', type: 'varchar', nullable: true }) - public buyAddress!: string | null; - @Column({ name: 'sell_address', type: 'varchar', nullable: true }) - public sellAddress!: string | null; -} diff --git a/packages/pipeline/src/entities/erc20_approval_event.ts b/packages/pipeline/src/entities/erc20_approval_event.ts deleted file mode 100644 index ee5e621d2..000000000 --- a/packages/pipeline/src/entities/erc20_approval_event.ts +++ /dev/null @@ -1,26 +0,0 @@ -import { BigNumber } from '@0x/utils'; -import { Column, Entity, PrimaryColumn } from 'typeorm'; - -import { bigNumberTransformer, numberToBigIntTransformer } from '../utils'; - -@Entity({ name: 'erc20_approval_events', schema: 'raw' }) -export class ERC20ApprovalEvent { - @PrimaryColumn({ name: 'token_address' }) - public tokenAddress!: string; - @PrimaryColumn({ name: 'log_index' }) - public logIndex!: number; - @PrimaryColumn({ name: 'block_number', transformer: numberToBigIntTransformer }) - public blockNumber!: number; - - @Column({ name: 'raw_data' }) - public rawData!: string; - - @PrimaryColumn({ name: 'transaction_hash' }) - public transactionHash!: string; - @Column({ name: 'owner_address' }) - public ownerAddress!: string; - @Column({ name: 'spender_address' }) - public spenderAddress!: string; - @Column({ name: 'amount', type: 'numeric', transformer: bigNumberTransformer }) - public amount!: BigNumber; -} diff --git a/packages/pipeline/src/entities/exchange_cancel_event.ts b/packages/pipeline/src/entities/exchange_cancel_event.ts deleted file mode 100644 index a86194920..000000000 --- a/packages/pipeline/src/entities/exchange_cancel_event.ts +++ /dev/null @@ -1,51 +0,0 @@ -import { Column, Entity, PrimaryColumn } from 'typeorm'; - -import { AssetType } from '../types'; -import { numberToBigIntTransformer } from '../utils'; - -@Entity({ name: 'exchange_cancel_events', schema: 'raw' }) -export class ExchangeCancelEvent { - @PrimaryColumn({ name: 'contract_address' }) - public contractAddress!: string; - @PrimaryColumn({ name: 'log_index' }) - public logIndex!: number; - @PrimaryColumn({ name: 'block_number', transformer: numberToBigIntTransformer }) - public blockNumber!: number; - - @Column({ name: 'raw_data' }) - public rawData!: string; - - @PrimaryColumn({ name: 'transaction_hash' }) - public transactionHash!: string; - @Column({ name: 'maker_address' }) - public makerAddress!: 
string; - @Column({ nullable: true, type: String, name: 'taker_address' }) - public takerAddress!: string; - @Column({ name: 'fee_recipient_address' }) - public feeRecipientAddress!: string; - @Column({ name: 'sender_address' }) - public senderAddress!: string; - @Column({ name: 'order_hash' }) - public orderHash!: string; - - @Column({ name: 'raw_maker_asset_data' }) - public rawMakerAssetData!: string; - @Column({ name: 'maker_asset_type' }) - public makerAssetType!: AssetType; - @Column({ name: 'maker_asset_proxy_id' }) - public makerAssetProxyId!: string; - @Column({ name: 'maker_token_address' }) - public makerTokenAddress!: string; - @Column({ nullable: true, type: String, name: 'maker_token_id' }) - public makerTokenId!: string | null; - @Column({ name: 'raw_taker_asset_data' }) - public rawTakerAssetData!: string; - @Column({ name: 'taker_asset_type' }) - public takerAssetType!: AssetType; - @Column({ name: 'taker_asset_proxy_id' }) - public takerAssetProxyId!: string; - @Column({ name: 'taker_token_address' }) - public takerTokenAddress!: string; - @Column({ nullable: true, type: String, name: 'taker_token_id' }) - public takerTokenId!: string | null; -} diff --git a/packages/pipeline/src/entities/exchange_cancel_up_to_event.ts b/packages/pipeline/src/entities/exchange_cancel_up_to_event.ts deleted file mode 100644 index f24aea23a..000000000 --- a/packages/pipeline/src/entities/exchange_cancel_up_to_event.ts +++ /dev/null @@ -1,26 +0,0 @@ -import { BigNumber } from '@0x/utils'; -import { Column, Entity, PrimaryColumn } from 'typeorm'; - -import { bigNumberTransformer, numberToBigIntTransformer } from '../utils'; - -@Entity({ name: 'exchange_cancel_up_to_events', schema: 'raw' }) -export class ExchangeCancelUpToEvent { - @PrimaryColumn({ name: 'contract_address' }) - public contractAddress!: string; - @PrimaryColumn({ name: 'log_index' }) - public logIndex!: number; - @PrimaryColumn({ name: 'block_number', transformer: numberToBigIntTransformer }) - public blockNumber!: number; - - @Column({ name: 'raw_data' }) - public rawData!: string; - - @PrimaryColumn({ name: 'transaction_hash' }) - public transactionHash!: string; - @Column({ name: 'maker_address' }) - public makerAddress!: string; - @Column({ name: 'sender_address' }) - public senderAddress!: string; - @Column({ name: 'order_epoch', type: 'numeric', transformer: bigNumberTransformer }) - public orderEpoch!: BigNumber; -} diff --git a/packages/pipeline/src/entities/exchange_fill_event.ts b/packages/pipeline/src/entities/exchange_fill_event.ts deleted file mode 100644 index 52111711e..000000000 --- a/packages/pipeline/src/entities/exchange_fill_event.ts +++ /dev/null @@ -1,60 +0,0 @@ -import { BigNumber } from '@0x/utils'; -import { Column, Entity, PrimaryColumn } from 'typeorm'; - -import { AssetType } from '../types'; -import { bigNumberTransformer, numberToBigIntTransformer } from '../utils'; - -@Entity({ name: 'exchange_fill_events', schema: 'raw' }) -export class ExchangeFillEvent { - @PrimaryColumn({ name: 'contract_address' }) - public contractAddress!: string; - @PrimaryColumn({ name: 'log_index' }) - public logIndex!: number; - @PrimaryColumn({ name: 'block_number', transformer: numberToBigIntTransformer }) - public blockNumber!: number; - - @Column({ name: 'raw_data' }) - public rawData!: string; - - @PrimaryColumn({ name: 'transaction_hash' }) - public transactionHash!: string; - @Column({ name: 'maker_address' }) - public makerAddress!: string; - @Column({ name: 'taker_address' }) - public takerAddress!: string; - 
@Column({ name: 'fee_recipient_address' }) - public feeRecipientAddress!: string; - @Column({ name: 'sender_address' }) - public senderAddress!: string; - @Column({ name: 'maker_asset_filled_amount', type: 'numeric', transformer: bigNumberTransformer }) - public makerAssetFilledAmount!: BigNumber; - @Column({ name: 'taker_asset_filled_amount', type: 'numeric', transformer: bigNumberTransformer }) - public takerAssetFilledAmount!: BigNumber; - @Column({ name: 'maker_fee_paid', type: 'numeric', transformer: bigNumberTransformer }) - public makerFeePaid!: BigNumber; - @Column({ name: 'taker_fee_paid', type: 'numeric', transformer: bigNumberTransformer }) - public takerFeePaid!: BigNumber; - @Column({ name: 'order_hash' }) - public orderHash!: string; - - @Column({ name: 'raw_maker_asset_data' }) - public rawMakerAssetData!: string; - @Column({ name: 'maker_asset_type' }) - public makerAssetType!: AssetType; - @Column({ name: 'maker_asset_proxy_id' }) - public makerAssetProxyId!: string; - @Column({ name: 'maker_token_address' }) - public makerTokenAddress!: string; - @Column({ nullable: true, type: String, name: 'maker_token_id' }) - public makerTokenId!: string | null; - @Column({ name: 'raw_taker_asset_data' }) - public rawTakerAssetData!: string; - @Column({ name: 'taker_asset_type' }) - public takerAssetType!: AssetType; - @Column({ name: 'taker_asset_proxy_id' }) - public takerAssetProxyId!: string; - @Column({ name: 'taker_token_address' }) - public takerTokenAddress!: string; - @Column({ nullable: true, type: String, name: 'taker_token_id' }) - public takerTokenId!: string | null; -} diff --git a/packages/pipeline/src/entities/index.ts b/packages/pipeline/src/entities/index.ts deleted file mode 100644 index 27c153c07..000000000 --- a/packages/pipeline/src/entities/index.ts +++ /dev/null @@ -1,25 +0,0 @@ -import { ExchangeCancelEvent } from './exchange_cancel_event'; -import { ExchangeCancelUpToEvent } from './exchange_cancel_up_to_event'; -import { ExchangeFillEvent } from './exchange_fill_event'; - -export { Block } from './block'; -export { DexTrade } from './dex_trade'; -export { ExchangeCancelEvent } from './exchange_cancel_event'; -export { ExchangeCancelUpToEvent } from './exchange_cancel_up_to_event'; -export { ExchangeFillEvent } from './exchange_fill_event'; -export { OHLCVExternal } from './ohlcv_external'; -export { Relayer } from './relayer'; -export { SraOrder } from './sra_order'; -export { SraOrdersObservedTimeStamp, createObservedTimestampForOrder } from './sra_order_observed_timestamp'; -export { TokenMetadata } from './token_metadata'; -export { TokenOrderbookSnapshot } from './token_order'; -export { Transaction } from './transaction'; -export { ERC20ApprovalEvent } from './erc20_approval_event'; - -export { CopperLead } from './copper_lead'; -export { CopperActivity } from './copper_activity'; -export { CopperOpportunity } from './copper_opportunity'; -export { CopperActivityType } from './copper_activity_type'; -export { CopperCustomField } from './copper_custom_field'; - -export type ExchangeEvent = ExchangeFillEvent | ExchangeCancelEvent | ExchangeCancelUpToEvent; diff --git a/packages/pipeline/src/entities/ohlcv_external.ts b/packages/pipeline/src/entities/ohlcv_external.ts deleted file mode 100644 index 4f55dd930..000000000 --- a/packages/pipeline/src/entities/ohlcv_external.ts +++ /dev/null @@ -1,30 +0,0 @@ -import { Column, Entity, PrimaryColumn } from 'typeorm'; - -import { numberToBigIntTransformer } from '../utils'; - -@Entity({ name: 'ohlcv_external', 
schema: 'raw' }) -export class OHLCVExternal { - @PrimaryColumn() public exchange!: string; - - @PrimaryColumn({ name: 'from_symbol', type: 'varchar' }) - public fromSymbol!: string; - @PrimaryColumn({ name: 'to_symbol', type: 'varchar' }) - public toSymbol!: string; - @PrimaryColumn({ name: 'start_time', transformer: numberToBigIntTransformer }) - public startTime!: number; - @PrimaryColumn({ name: 'end_time', transformer: numberToBigIntTransformer }) - public endTime!: number; - - @Column() public open!: number; - @Column() public close!: number; - @Column() public low!: number; - @Column() public high!: number; - @Column({ name: 'volume_from' }) - public volumeFrom!: number; - @Column({ name: 'volume_to' }) - public volumeTo!: number; - - @PrimaryColumn() public source!: string; - @PrimaryColumn({ name: 'observed_timestamp', transformer: numberToBigIntTransformer }) - public observedTimestamp!: number; -} diff --git a/packages/pipeline/src/entities/relayer.ts b/packages/pipeline/src/entities/relayer.ts deleted file mode 100644 index 5af8578b4..000000000 --- a/packages/pipeline/src/entities/relayer.ts +++ /dev/null @@ -1,21 +0,0 @@ -import { Column, Entity, PrimaryColumn } from 'typeorm'; - -@Entity({ name: 'relayers', schema: 'raw' }) -export class Relayer { - @PrimaryColumn() public uuid!: string; - - @Column() public name!: string; - @Column({ name: 'homepage_url', type: 'varchar' }) - public homepageUrl!: string; - @Column({ name: 'sra_http_endpoint', type: 'varchar', nullable: true }) - public sraHttpEndpoint!: string | null; - @Column({ name: 'sra_ws_endpoint', type: 'varchar', nullable: true }) - public sraWsEndpoint!: string | null; - @Column({ name: 'app_url', type: 'varchar', nullable: true }) - public appUrl!: string | null; - - @Column({ name: 'fee_recipient_addresses', type: 'varchar', array: true }) - public feeRecipientAddresses!: string[]; - @Column({ name: 'taker_addresses', type: 'varchar', array: true }) - public takerAddresses!: string[]; -} diff --git a/packages/pipeline/src/entities/sra_order.ts b/packages/pipeline/src/entities/sra_order.ts deleted file mode 100644 index 9c730a0bb..000000000 --- a/packages/pipeline/src/entities/sra_order.ts +++ /dev/null @@ -1,63 +0,0 @@ -import { BigNumber } from '@0x/utils'; -import { Column, Entity, PrimaryColumn } from 'typeorm'; - -import { AssetType } from '../types'; -import { bigNumberTransformer } from '../utils'; - -@Entity({ name: 'sra_orders', schema: 'raw' }) -export class SraOrder { - @PrimaryColumn({ name: 'exchange_address' }) - public exchangeAddress!: string; - @PrimaryColumn({ name: 'order_hash_hex' }) - public orderHashHex!: string; - @PrimaryColumn({ name: 'source_url' }) - public sourceUrl!: string; - - @Column({ name: 'maker_address' }) - public makerAddress!: string; - @Column({ name: 'taker_address' }) - public takerAddress!: string; - @Column({ name: 'fee_recipient_address' }) - public feeRecipientAddress!: string; - @Column({ name: 'sender_address' }) - public senderAddress!: string; - @Column({ name: 'maker_asset_amount', type: 'numeric', transformer: bigNumberTransformer }) - public makerAssetAmount!: BigNumber; - @Column({ name: 'taker_asset_amount', type: 'numeric', transformer: bigNumberTransformer }) - public takerAssetAmount!: BigNumber; - @Column({ name: 'maker_fee', type: 'numeric', transformer: bigNumberTransformer }) - public makerFee!: BigNumber; - @Column({ name: 'taker_fee', type: 'numeric', transformer: bigNumberTransformer }) - public takerFee!: BigNumber; - @Column({ name: 
'expiration_time_seconds', type: 'numeric', transformer: bigNumberTransformer }) - public expirationTimeSeconds!: BigNumber; - @Column({ name: 'salt', type: 'numeric', transformer: bigNumberTransformer }) - public salt!: BigNumber; - @Column({ name: 'signature' }) - public signature!: string; - - @Column({ name: 'raw_maker_asset_data' }) - public rawMakerAssetData!: string; - @Column({ name: 'maker_asset_type' }) - public makerAssetType!: AssetType; - @Column({ name: 'maker_asset_proxy_id' }) - public makerAssetProxyId!: string; - @Column({ name: 'maker_token_address' }) - public makerTokenAddress!: string; - @Column({ nullable: true, type: String, name: 'maker_token_id' }) - public makerTokenId!: string | null; - @Column({ name: 'raw_taker_asset_data' }) - public rawTakerAssetData!: string; - @Column({ name: 'taker_asset_type' }) - public takerAssetType!: AssetType; - @Column({ name: 'taker_asset_proxy_id' }) - public takerAssetProxyId!: string; - @Column({ name: 'taker_token_address' }) - public takerTokenAddress!: string; - @Column({ nullable: true, type: String, name: 'taker_token_id' }) - public takerTokenId!: string | null; - - // TODO(albrow): Make this optional? - @Column({ name: 'metadata_json' }) - public metadataJson!: string; -} diff --git a/packages/pipeline/src/entities/sra_order_observed_timestamp.ts b/packages/pipeline/src/entities/sra_order_observed_timestamp.ts deleted file mode 100644 index cbec1c6d0..000000000 --- a/packages/pipeline/src/entities/sra_order_observed_timestamp.ts +++ /dev/null @@ -1,35 +0,0 @@ -import { Entity, PrimaryColumn } from 'typeorm'; - -import { numberToBigIntTransformer } from '../utils'; - -import { SraOrder } from './sra_order'; - -@Entity({ name: 'sra_orders_observed_timestamps', schema: 'raw' }) -export class SraOrdersObservedTimeStamp { - @PrimaryColumn({ name: 'exchange_address' }) - public exchangeAddress!: string; - @PrimaryColumn({ name: 'order_hash_hex' }) - public orderHashHex!: string; - @PrimaryColumn({ name: 'source_url' }) - public sourceUrl!: string; - - @PrimaryColumn({ name: 'observed_timestamp', transformer: numberToBigIntTransformer }) - public observedTimestamp!: number; -} - -/** - * Returns a new SraOrdersObservedTimeStamp for the given order and observed - * timestamp. - * @param order The order the timestamp entry refers to. - * @param observedTimestamp The time (Unix ms) at which the order was observed.
- */ -export function createObservedTimestampForOrder( - order: SraOrder, - observedTimestamp: number, -): SraOrdersObservedTimeStamp { - const observed = new SraOrdersObservedTimeStamp(); - observed.exchangeAddress = order.exchangeAddress; - observed.orderHashHex = order.orderHashHex; - observed.sourceUrl = order.sourceUrl; - observed.observedTimestamp = observedTimestamp; - return observed; -} diff --git a/packages/pipeline/src/entities/token_metadata.ts b/packages/pipeline/src/entities/token_metadata.ts deleted file mode 100644 index 911b53972..000000000 --- a/packages/pipeline/src/entities/token_metadata.ts +++ /dev/null @@ -1,22 +0,0 @@ -import { BigNumber } from '@0x/utils'; -import { Column, Entity, PrimaryColumn } from 'typeorm'; - -import { bigNumberTransformer } from '../utils/transformers'; - -@Entity({ name: 'token_metadata', schema: 'raw' }) -export class TokenMetadata { - @PrimaryColumn({ type: 'varchar', nullable: false }) - public address!: string; - - @PrimaryColumn({ type: 'varchar', nullable: false }) - public authority!: string; - - @Column({ type: 'numeric', transformer: bigNumberTransformer, nullable: true }) - public decimals!: BigNumber | null; - - @Column({ type: 'varchar', nullable: true }) - public symbol!: string | null; - - @Column({ type: 'varchar', nullable: true }) - public name!: string | null; -} diff --git a/packages/pipeline/src/entities/token_order.ts b/packages/pipeline/src/entities/token_order.ts deleted file mode 100644 index 2709747cb..000000000 --- a/packages/pipeline/src/entities/token_order.ts +++ /dev/null @@ -1,28 +0,0 @@ -import { BigNumber } from '@0x/utils'; -import { Column, Entity, PrimaryColumn } from 'typeorm'; - -import { bigNumberTransformer, numberToBigIntTransformer } from '../utils'; - -@Entity({ name: 'token_orderbook_snapshots', schema: 'raw' }) -export class TokenOrderbookSnapshot { - @PrimaryColumn({ name: 'observed_timestamp', type: 'bigint', transformer: numberToBigIntTransformer }) - public observedTimestamp!: number; - @PrimaryColumn({ name: 'source' }) - public source!: string; - @PrimaryColumn({ name: 'order_type' }) - public orderType!: string; - @PrimaryColumn({ name: 'price', type: 'numeric', transformer: bigNumberTransformer }) - public price!: BigNumber; - @PrimaryColumn({ name: 'base_asset_symbol' }) - public baseAssetSymbol!: string; - @Column({ nullable: true, type: String, name: 'base_asset_address' }) - public baseAssetAddress!: string | null; - @Column({ name: 'base_volume', type: 'numeric', transformer: bigNumberTransformer }) - public baseVolume!: BigNumber; - @PrimaryColumn({ name: 'quote_asset_symbol' }) - public quoteAssetSymbol!: string; - @Column({ nullable: true, type: String, name: 'quote_asset_address' }) - public quoteAssetAddress!: string | null; - @Column({ name: 'quote_volume', type: 'numeric', transformer: bigNumberTransformer }) - public quoteVolume!: BigNumber; -} diff --git a/packages/pipeline/src/entities/transaction.ts b/packages/pipeline/src/entities/transaction.ts deleted file mode 100644 index 742050177..000000000 --- a/packages/pipeline/src/entities/transaction.ts +++ /dev/null @@ -1,19 +0,0 @@ -import { BigNumber } from '@0x/utils'; -import { Column, Entity, PrimaryColumn } from 'typeorm'; - -import { bigNumberTransformer, numberToBigIntTransformer } from '../utils'; - -@Entity({ name: 'transactions', schema: 'raw' }) -export class Transaction { - @PrimaryColumn({ name: 'transaction_hash' }) - public transactionHash!: string; - @PrimaryColumn({ name: 'block_hash' }) - public blockHash!: 
string; - @PrimaryColumn({ name: 'block_number', transformer: numberToBigIntTransformer }) - public blockNumber!: number; - - @Column({ type: 'numeric', name: 'gas_used', transformer: bigNumberTransformer }) - public gasUsed!: BigNumber; - @Column({ type: 'numeric', name: 'gas_price', transformer: bigNumberTransformer }) - public gasPrice!: BigNumber; -} diff --git a/packages/pipeline/src/ormconfig.ts b/packages/pipeline/src/ormconfig.ts deleted file mode 100644 index 2700714cd..000000000 --- a/packages/pipeline/src/ormconfig.ts +++ /dev/null @@ -1,54 +0,0 @@ -import { ConnectionOptions } from 'typeorm'; - -import { - Block, - CopperActivity, - CopperActivityType, - CopperCustomField, - CopperLead, - CopperOpportunity, - DexTrade, - ERC20ApprovalEvent, - ExchangeCancelEvent, - ExchangeCancelUpToEvent, - ExchangeFillEvent, - OHLCVExternal, - Relayer, - SraOrder, - SraOrdersObservedTimeStamp, - TokenMetadata, - TokenOrderbookSnapshot, - Transaction, -} from './entities'; - -const entities = [ - Block, - CopperOpportunity, - CopperActivity, - CopperActivityType, - CopperCustomField, - CopperLead, - DexTrade, - ExchangeCancelEvent, - ExchangeCancelUpToEvent, - ExchangeFillEvent, - ERC20ApprovalEvent, - OHLCVExternal, - Relayer, - SraOrder, - SraOrdersObservedTimeStamp, - TokenMetadata, - TokenOrderbookSnapshot, - Transaction, -]; - -const config: ConnectionOptions = { - type: 'postgres', - url: process.env.ZEROEX_DATA_PIPELINE_DB_URL, - synchronize: false, - logging: ['error'], - entities, - migrations: ['./lib/migrations/**/*.js'], -}; - -module.exports = config; diff --git a/packages/pipeline/src/parsers/bloxy/index.ts b/packages/pipeline/src/parsers/bloxy/index.ts deleted file mode 100644 index 3d797aff0..000000000 --- a/packages/pipeline/src/parsers/bloxy/index.ts +++ /dev/null @@ -1,54 +0,0 @@ -import { BigNumber } from '@0x/utils'; -import * as R from 'ramda'; - -import { BLOXY_DEX_TRADES_URL, BloxyTrade } from '../../data_sources/bloxy'; -import { DexTrade } from '../../entities'; - -/** - * Parses a raw trades response from the Bloxy Dex API and returns an array of - * DexTrade entities. - * @param rawTrades A raw trades response from the Bloxy Dex API. - */ -export function parseBloxyTrades(rawTrades: BloxyTrade[]): DexTrade[] { - return R.map(_parseBloxyTrade, rawTrades); -} - -/** - * Converts a single Bloxy trade into a DexTrade entity. - * @param rawTrade A single trade from the response from the Bloxy API. - */ -export function _parseBloxyTrade(rawTrade: BloxyTrade): DexTrade { - const dexTrade = new DexTrade(); - dexTrade.sourceUrl = BLOXY_DEX_TRADES_URL; - dexTrade.txHash = rawTrade.tx_hash; - dexTrade.tradeIndex = rawTrade.tradeIndex; - dexTrade.txTimestamp = new Date(rawTrade.tx_time).getTime(); - dexTrade.txDate = rawTrade.tx_date; - dexTrade.txSender = rawTrade.tx_sender; - dexTrade.smartContractId = rawTrade.smart_contract_id; - dexTrade.smartContractAddress = rawTrade.smart_contract_address; - dexTrade.contractType = rawTrade.contract_type; - dexTrade.maker = rawTrade.maker; - dexTrade.taker = rawTrade.taker; - // TODO(albrow): The Bloxy API returns amounts and fees as a `number` type - // but some of their values have too many significant digits to be - // represented that way. Ideally they will switch to using strings and then - // we can update this code.
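The TODO above is why each amount below is round-tripped through `.toString()` before being wrapped in a BigNumber. A standalone illustration of the underlying limitation (example values are hypothetical, not from the Bloxy API):

```typescript
// IEEE-754 doubles carry 53 bits of mantissa, so integers above
// Number.MAX_SAFE_INTEGER (9007199254740991, about 9.0e15) silently
// lose digits. Amounts in 18-decimal base units routinely exceed 1e18.
const amount = 1e21 + 1; // 1000000000000000000001 is not representable
console.log(amount === 1e21); // true: the trailing 1 was rounded away
console.log(amount.toString()); // '1e+21', so the lost digit is unrecoverable
```

The `.toString()` round-trip therefore only preserves whatever precision survived JSON parsing; it cannot restore digits the API's `number` encoding already dropped, which is exactly what the TODO is waiting on.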
- dexTrade.amountBuy = new BigNumber(rawTrade.amountBuy.toString()); - dexTrade.makerFeeAmount = new BigNumber(rawTrade.makerFee.toString()); - dexTrade.buyCurrencyId = rawTrade.buyCurrencyId; - dexTrade.buySymbol = filterNullCharacters(rawTrade.buySymbol); - dexTrade.amountSell = new BigNumber(rawTrade.amountSell.toString()); - dexTrade.takerFeeAmount = new BigNumber(rawTrade.takerFee.toString()); - dexTrade.sellCurrencyId = rawTrade.sellCurrencyId; - dexTrade.sellSymbol = filterNullCharacters(rawTrade.sellSymbol); - dexTrade.makerAnnotation = rawTrade.maker_annotation; - dexTrade.takerAnnotation = rawTrade.taker_annotation; - dexTrade.protocol = rawTrade.protocol; - dexTrade.buyAddress = rawTrade.buyAddress; - dexTrade.sellAddress = rawTrade.sellAddress; - return dexTrade; -} - -// Works with any form of escaped null character (e.g., '\0' and '\u0000'). -const filterNullCharacters = R.replace(/\0/g, ''); diff --git a/packages/pipeline/src/parsers/copper/index.ts b/packages/pipeline/src/parsers/copper/index.ts deleted file mode 100644 index 07da66d10..000000000 --- a/packages/pipeline/src/parsers/copper/index.ts +++ /dev/null @@ -1,259 +0,0 @@ -import * as R from 'ramda'; - -import { CopperActivity, CopperActivityType, CopperCustomField, CopperLead, CopperOpportunity } from '../../entities'; - -const ONE_SECOND = 1000; -export type CopperSearchResponse = CopperLeadResponse | CopperActivityResponse | CopperOpportunityResponse; -export interface CopperLeadResponse { - id: number; - name?: string; - first_name?: string; - last_name?: string; - middle_name?: string; - assignee_id?: number; - company_name?: string; - customer_source_id?: number; - monetary_value?: number; - status: string; - status_id: number; - title?: string; - date_created: number; // in seconds - date_modified: number; // in seconds -} - -export interface CopperActivityResponse { - id: number; - parent: CopperActivityParentResponse; - type: CopperActivityTypeResponse; - user_id: number; - activity_date: number; - old_value: CopperActivityValueResponse; - new_value: CopperActivityValueResponse; - date_created: number; // in seconds - date_modified: number; // in seconds -} - -export interface CopperActivityValueResponse { - id: number; - name: string; -} -export interface CopperActivityParentResponse { - id: number; - type: string; -} - -// custom activity types -export enum CopperActivityTypeCategory { - User = 'user', - System = 'system', -} -export interface CopperActivityTypeResponse { - id: number; - category: CopperActivityTypeCategory; - name: string; - is_disabled?: boolean; - count_as_interaction?: boolean; -} - -export interface CopperOpportunityResponse { - id: number; - name: string; - assignee_id?: number; - close_date?: string; - company_id?: number; - company_name?: string; - customer_source_id?: number; - loss_reason_id?: number; - pipeline_id: number; - pipeline_stage_id: number; - primary_contact_id?: number; - priority?: string; - status: string; - tags: string[]; - interaction_count: number; - monetary_value?: number; - win_probability?: number; - date_created: number; // in seconds - date_modified: number; // in seconds - custom_fields: CopperNestedCustomFieldResponse[]; -} -interface CopperNestedCustomFieldResponse { - custom_field_definition_id: number; - value: number | number[] | null; -} -// custom fields -export enum CopperCustomFieldType { - String = 'String', - Text = 'Text', - Dropdown = 'Dropdown', - MultiSelect = 'MultiSelect', // not in API documentation but shows up in results - Date = 
'Date', - Checkbox = 'Checkbox', - Float = 'Float', - URL = 'URL', // tslint:disable-line:enum-naming - Percentage = 'Percentage', - Currency = 'Currency', - Connect = 'Connect', -} -export interface CopperCustomFieldOptionResponse { - id: number; - name: string; -} -export interface CopperCustomFieldResponse { - id: number; - name: string; - data_type: CopperCustomFieldType; - options?: CopperCustomFieldOptionResponse[]; -} -/** - * Parse response from Copper API /search/leads/ - * - * @param leads - The array of leads returned from the API - * @returns Returns an array of Copper Lead entities - */ -export function parseLeads(leads: CopperLeadResponse[]): CopperLead[] { - return leads.map(lead => { - const entity = new CopperLead(); - entity.id = lead.id; - entity.name = lead.name || undefined; - entity.firstName = lead.first_name || undefined; - entity.lastName = lead.last_name || undefined; - entity.middleName = lead.middle_name || undefined; - entity.assigneeId = lead.assignee_id || undefined; - entity.companyName = lead.company_name || undefined; - entity.customerSourceId = lead.customer_source_id || undefined; - entity.monetaryValue = lead.monetary_value || undefined; - entity.status = lead.status; - entity.statusId = lead.status_id; - entity.title = lead.title || undefined; - entity.dateCreated = lead.date_created * ONE_SECOND; - entity.dateModified = lead.date_modified * ONE_SECOND; - return entity; - }); -} - -/** - * Parse response from Copper API /search/activities/ - * - * @param activities - The array of activities returned from the API - * @returns Returns an array of Copper Activity entities - */ -export function parseActivities(activities: CopperActivityResponse[]): CopperActivity[] { - return activities.map(activity => { - const entity = new CopperActivity(); - entity.id = activity.id; - - entity.parentId = activity.parent.id; - entity.parentType = activity.parent.type; - - entity.typeId = activity.type.id; - entity.typeCategory = activity.type.category.toString(); - entity.typeName = activity.type.name; - - entity.userId = activity.user_id; - entity.dateCreated = activity.date_created * ONE_SECOND; - entity.dateModified = activity.date_modified * ONE_SECOND; - - // nested nullable fields - entity.oldValueId = R.path(['old_value', 'id'], activity); - entity.oldValueName = R.path(['old_value', 'name'], activity); - entity.newValueId = R.path(['new_value', 'id'], activity); - entity.newValueName = R.path(['new_value', 'name'], activity); - - return entity; - }); -} - -/** - * Parse response from Copper API /search/opportunities/ - * - * @param opportunities - The array of opportunities returned from the API - * @returns Returns an array of Copper Opportunity entities - */ -export function parseOpportunities(opportunities: CopperOpportunityResponse[]): CopperOpportunity[] { - return opportunities.map(opp => { - const customFields: { [key: number]: number } = opp.custom_fields - .filter(f => f.value !== null) - .map(f => ({ - ...f, - value: ([] as number[]).concat(f.value || []), // normalise all values to number[] - })) - .map(f => f.value.map(val => [f.custom_field_definition_id, val] as [number, number])) // pair each value with the custom_field_definition_id - .reduce((acc, pair) => acc.concat(pair)) // flatten - .reduce<{ [key: number]: number }>((obj, [key, value]) => { - // transform into object literal - obj[key] = value; - return obj; - }, {}); - - const entity = new CopperOpportunity(); - entity.id = opp.id; - entity.name = opp.name; - entity.assigneeId = 
opp.assignee_id || undefined;
-        entity.closeDate = opp.close_date || undefined;
-        entity.companyId = opp.company_id || undefined;
-        entity.companyName = opp.company_name || undefined;
-        entity.customerSourceId = opp.customer_source_id || undefined;
-        entity.lossReasonId = opp.loss_reason_id || undefined;
-        entity.pipelineId = opp.pipeline_id;
-        entity.pipelineStageId = opp.pipeline_stage_id;
-        entity.primaryContactId = opp.primary_contact_id || undefined;
-        entity.priority = opp.priority || undefined;
-        entity.status = opp.status;
-        entity.interactionCount = opp.interaction_count;
-        entity.monetaryValue = opp.monetary_value || undefined;
-        entity.winProbability = opp.win_probability === null ? undefined : opp.win_probability;
-        entity.dateCreated = opp.date_created * ONE_SECOND;
-        entity.dateModified = opp.date_modified * ONE_SECOND;
-        entity.customFields = customFields;
-        return entity;
-    });
-}
-
-/**
- * Parse response from Copper API /activity_types/
- *
- * @param activityTypeResponse - Activity Types response from the API, keyed by "user" or "system"
- * @returns Returns an array of Copper Activity Type entities
- */
-export function parseActivityTypes(
-    activityTypeResponse: Map<CopperActivityTypeCategory, CopperActivityTypeResponse[]>,
-): CopperActivityType[] {
-    const values: CopperActivityTypeResponse[] = R.flatten(Object.values(activityTypeResponse));
-    return values.map(activityType => ({
-        id: activityType.id,
-        name: activityType.name,
-        category: activityType.category.toString(),
-        isDisabled: activityType.is_disabled,
-        countAsInteraction: activityType.count_as_interaction,
-    }));
-}
-
-/**
- * Parse response from Copper API /custom_field_definitions/
- *
- * @param customFieldResponse - array of custom field definitions returned from the API, consisting of top-level fields and nested fields
- * @returns Returns an array of Copper Custom Field entities
- */
-export function parseCustomFields(customFieldResponse: CopperCustomFieldResponse[]): CopperCustomField[] {
-    function parseTopLevelField(field: CopperCustomFieldResponse): CopperCustomField[] {
-        const topLevelField: CopperCustomField = {
-            id: field.id,
-            name: field.name,
-            dataType: field.data_type.toString(),
-        };
-
-        if (field.options !== undefined) {
-            const nestedFields: CopperCustomField[] = field.options.map(option => ({
-                id: option.id,
-                name: option.name,
-                dataType: field.name,
-                fieldType: 'option',
-            }));
-            return nestedFields.concat(topLevelField);
-        } else {
-            return [topLevelField];
-        }
-    }
-    return R.chain(parseTopLevelField, customFieldResponse);
-}
diff --git a/packages/pipeline/src/parsers/ddex_orders/index.ts b/packages/pipeline/src/parsers/ddex_orders/index.ts
deleted file mode 100644
index 562f894ab..000000000
--- a/packages/pipeline/src/parsers/ddex_orders/index.ts
+++ /dev/null
@@ -1,69 +0,0 @@
-import { BigNumber } from '@0x/utils';
-
-import { aggregateOrders } from '../utils';
-
-import { DdexMarket, DdexOrderbook } from '../../data_sources/ddex';
-import { TokenOrderbookSnapshot as TokenOrder } from '../../entities';
-import { OrderType } from '../../types';
-
-/**
- * Marque function of this file.
- * 1) Takes in orders from an orderbook,
- * 2) Aggregates them according to price point,
- * 3) Builds TokenOrder entity with
- * other information attached.
- * @param ddexOrderbook A raw orderbook that we pull from the Ddex API.
- * @param ddexMarket An object containing market data also directly from the API.
- * @param observedTimestamp Time at which the orders for the market were pulled.
- * @param source The exchange where these orders are placed. In this case 'ddex'.
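A rough usage sketch of `parseDdexOrders`, defined just below. The literal shapes here are inferred from how the parsers consume their arguments; the `as any` casts stand in for the real `DdexOrderbook`/`DdexMarket` types and the addresses are placeholders:

```typescript
const orderbook = {
    bids: [{ price: '0.1', amount: '10' }, { price: '0.1', amount: '5' }],
    asks: [{ price: '0.2', amount: '3' }],
};
const market = {
    baseToken: 'ZRX',
    baseTokenAddress: '0x0000000000000000000000000000000000000001', // placeholder
    quoteToken: 'WETH',
    quoteTokenAddress: '0x0000000000000000000000000000000000000002', // placeholder
};
// The two 0.1 bids aggregate into a single level of 15 before parsing, so
// this yields two TokenOrder rows: one bid and one ask.
const rows = parseDdexOrders(orderbook as any, market as any, Date.now(), 'ddex');
```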
- */ -export function parseDdexOrders( - ddexOrderbook: DdexOrderbook, - ddexMarket: DdexMarket, - observedTimestamp: number, - source: string, -): TokenOrder[] { - const aggregatedBids = aggregateOrders(ddexOrderbook.bids); - const aggregatedAsks = aggregateOrders(ddexOrderbook.asks); - const parsedBids = aggregatedBids.map(order => - parseDdexOrder(ddexMarket, observedTimestamp, OrderType.Bid, source, order), - ); - const parsedAsks = aggregatedAsks.map(order => - parseDdexOrder(ddexMarket, observedTimestamp, OrderType.Ask, source, order), - ); - return parsedBids.concat(parsedAsks); -} - -/** - * Parse a single aggregated Ddex order in order to form a tokenOrder entity - * which can be saved into the database. - * @param ddexMarket An object containing information about the market where these - * trades have been placed. - * @param observedTimestamp The time when the API response returned back to us. - * @param orderType 'bid' or 'ask' enum. - * @param source Exchange where these orders were placed. - * @param ddexOrder A <price, amount> tuple which we will convert to volume-basis. - */ -export function parseDdexOrder( - ddexMarket: DdexMarket, - observedTimestamp: number, - orderType: OrderType, - source: string, - ddexOrder: [string, BigNumber], -): TokenOrder { - const tokenOrder = new TokenOrder(); - const price = new BigNumber(ddexOrder[0]); - const amount = ddexOrder[1]; - - tokenOrder.source = source; - tokenOrder.observedTimestamp = observedTimestamp; - tokenOrder.orderType = orderType; - tokenOrder.price = price; - - tokenOrder.baseAssetSymbol = ddexMarket.baseToken; - tokenOrder.baseAssetAddress = ddexMarket.baseTokenAddress; - tokenOrder.baseVolume = amount; - - tokenOrder.quoteAssetSymbol = ddexMarket.quoteToken; - tokenOrder.quoteAssetAddress = ddexMarket.quoteTokenAddress; - tokenOrder.quoteVolume = price.times(amount); - return tokenOrder; -} diff --git a/packages/pipeline/src/parsers/events/erc20_events.ts b/packages/pipeline/src/parsers/events/erc20_events.ts deleted file mode 100644 index caf9984d0..000000000 --- a/packages/pipeline/src/parsers/events/erc20_events.ts +++ /dev/null @@ -1,34 +0,0 @@ -import { ERC20TokenApprovalEventArgs } from '@0x/contract-wrappers'; -import { LogWithDecodedArgs } from 'ethereum-types'; -import * as R from 'ramda'; - -import { ERC20ApprovalEvent } from '../../entities'; - -/** - * Parses raw event logs for an ERC20 approval event and returns an array of - * ERC20ApprovalEvent entities. - * @param eventLogs Raw event logs (e.g. returned from contract-wrappers). - */ -export const parseERC20ApprovalEvents: ( - eventLogs: Array<LogWithDecodedArgs<ERC20TokenApprovalEventArgs>>, -) => ERC20ApprovalEvent[] = R.map(_convertToERC20ApprovalEvent); - -/** - * Converts a raw event log for an ERC20 approval event into an - * ERC20ApprovalEvent entity. - * @param eventLog Raw event log (e.g. returned from contract-wrappers). 
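These parsers all follow the same Ramda idiom: write the single-item converter as a named, separately testable function, then lift it over arrays with the curried `R.map`. A generic sketch of that pattern:

```typescript
import * as R from 'ramda';

// Given only a function, R.map returns a new function over lists.
const double = (x: number): number => x * 2;
const doubleAll: (xs: number[]) => number[] = R.map(double);
doubleAll([1, 2, 3]); // [2, 4, 6]
```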
- */ -export function _convertToERC20ApprovalEvent( - eventLog: LogWithDecodedArgs<ERC20TokenApprovalEventArgs>, -): ERC20ApprovalEvent { - const erc20ApprovalEvent = new ERC20ApprovalEvent(); - erc20ApprovalEvent.tokenAddress = eventLog.address as string; - erc20ApprovalEvent.blockNumber = eventLog.blockNumber as number; - erc20ApprovalEvent.logIndex = eventLog.logIndex as number; - erc20ApprovalEvent.rawData = eventLog.data as string; - erc20ApprovalEvent.transactionHash = eventLog.transactionHash; - erc20ApprovalEvent.ownerAddress = eventLog.args._owner; - erc20ApprovalEvent.spenderAddress = eventLog.args._spender; - erc20ApprovalEvent.amount = eventLog.args._value; - return erc20ApprovalEvent; -} diff --git a/packages/pipeline/src/parsers/events/exchange_events.ts b/packages/pipeline/src/parsers/events/exchange_events.ts deleted file mode 100644 index 9c4a5f89a..000000000 --- a/packages/pipeline/src/parsers/events/exchange_events.ts +++ /dev/null @@ -1,145 +0,0 @@ -import { ExchangeCancelEventArgs, ExchangeCancelUpToEventArgs, ExchangeFillEventArgs } from '@0x/contract-wrappers'; -import { assetDataUtils } from '@0x/order-utils'; -import { AssetProxyId, ERC721AssetData } from '@0x/types'; -import { LogWithDecodedArgs } from 'ethereum-types'; -import * as R from 'ramda'; - -import { ExchangeCancelEvent, ExchangeCancelUpToEvent, ExchangeFillEvent } from '../../entities'; -import { bigNumbertoStringOrNull, convertAssetProxyIdToType } from '../../utils'; - -/** - * Parses raw event logs for a fill event and returns an array of - * ExchangeFillEvent entities. - * @param eventLogs Raw event logs (e.g. returned from contract-wrappers). - */ -export const parseExchangeFillEvents: ( - eventLogs: Array<LogWithDecodedArgs<ExchangeFillEventArgs>>, -) => ExchangeFillEvent[] = R.map(_convertToExchangeFillEvent); - -/** - * Parses raw event logs for a cancel event and returns an array of - * ExchangeCancelEvent entities. - * @param eventLogs Raw event logs (e.g. returned from contract-wrappers). - */ -export const parseExchangeCancelEvents: ( - eventLogs: Array<LogWithDecodedArgs<ExchangeCancelEventArgs>>, -) => ExchangeCancelEvent[] = R.map(_convertToExchangeCancelEvent); - -/** - * Parses raw event logs for a CancelUpTo event and returns an array of - * ExchangeCancelUpToEvent entities. - * @param eventLogs Raw event logs (e.g. returned from contract-wrappers). - */ -export const parseExchangeCancelUpToEvents: ( - eventLogs: Array<LogWithDecodedArgs<ExchangeCancelUpToEventArgs>>, -) => ExchangeCancelUpToEvent[] = R.map(_convertToExchangeCancelUpToEvent); - -/** - * Converts a raw event log for a fill event into an ExchangeFillEvent entity. - * @param eventLog Raw event log (e.g. returned from contract-wrappers). 
- */ -export function _convertToExchangeFillEvent(eventLog: LogWithDecodedArgs<ExchangeFillEventArgs>): ExchangeFillEvent { - const makerAssetData = assetDataUtils.decodeAssetDataOrThrow(eventLog.args.makerAssetData); - const takerAssetData = assetDataUtils.decodeAssetDataOrThrow(eventLog.args.takerAssetData); - const exchangeFillEvent = new ExchangeFillEvent(); - exchangeFillEvent.contractAddress = eventLog.address as string; - exchangeFillEvent.blockNumber = eventLog.blockNumber as number; - exchangeFillEvent.logIndex = eventLog.logIndex as number; - exchangeFillEvent.rawData = eventLog.data as string; - exchangeFillEvent.transactionHash = eventLog.transactionHash; - exchangeFillEvent.makerAddress = eventLog.args.makerAddress; - exchangeFillEvent.takerAddress = eventLog.args.takerAddress; - exchangeFillEvent.feeRecipientAddress = eventLog.args.feeRecipientAddress; - exchangeFillEvent.senderAddress = eventLog.args.senderAddress; - exchangeFillEvent.makerAssetFilledAmount = eventLog.args.makerAssetFilledAmount; - exchangeFillEvent.takerAssetFilledAmount = eventLog.args.takerAssetFilledAmount; - exchangeFillEvent.makerFeePaid = eventLog.args.makerFeePaid; - exchangeFillEvent.takerFeePaid = eventLog.args.takerFeePaid; - exchangeFillEvent.orderHash = eventLog.args.orderHash; - exchangeFillEvent.rawMakerAssetData = eventLog.args.makerAssetData; - // tslint:disable-next-line:no-unnecessary-type-assertion - exchangeFillEvent.makerAssetType = convertAssetProxyIdToType(makerAssetData.assetProxyId as AssetProxyId); - exchangeFillEvent.makerAssetProxyId = makerAssetData.assetProxyId; - // HACK(abandeali1): this event schema currently does not support multiple maker/taker assets, so we store the first token address from the MultiAssetProxy assetData - exchangeFillEvent.makerTokenAddress = assetDataUtils.isMultiAssetData(makerAssetData) - ? assetDataUtils.decodeMultiAssetDataRecursively(eventLog.args.makerAssetData).nestedAssetData[0].tokenAddress - : makerAssetData.tokenAddress; - // tslint has a false positive here. Type assertion is required. - // tslint:disable-next-line:no-unnecessary-type-assertion - exchangeFillEvent.makerTokenId = bigNumbertoStringOrNull((makerAssetData as ERC721AssetData).tokenId); - exchangeFillEvent.rawTakerAssetData = eventLog.args.takerAssetData; - // tslint:disable-next-line:no-unnecessary-type-assertion - exchangeFillEvent.takerAssetType = convertAssetProxyIdToType(takerAssetData.assetProxyId as AssetProxyId); - exchangeFillEvent.takerAssetProxyId = takerAssetData.assetProxyId; - // HACK(abandeali1): this event schema currently does not support multiple maker/taker assets, so we store the first token address from the MultiAssetProxy assetData - exchangeFillEvent.takerTokenAddress = assetDataUtils.isMultiAssetData(takerAssetData) - ? assetDataUtils.decodeMultiAssetDataRecursively(eventLog.args.takerAssetData).nestedAssetData[0].tokenAddress - : takerAssetData.tokenAddress; - // tslint:disable-next-line:no-unnecessary-type-assertion - exchangeFillEvent.takerTokenId = bigNumbertoStringOrNull((takerAssetData as ERC721AssetData).tokenId); - return exchangeFillEvent; -} - -/** - * Converts a raw event log for a cancel event into an ExchangeCancelEvent - * entity. - * @param eventLog Raw event log (e.g. returned from contract-wrappers). 
- */ -export function _convertToExchangeCancelEvent( - eventLog: LogWithDecodedArgs<ExchangeCancelEventArgs>, -): ExchangeCancelEvent { - const makerAssetData = assetDataUtils.decodeAssetDataOrThrow(eventLog.args.makerAssetData); - const takerAssetData = assetDataUtils.decodeAssetDataOrThrow(eventLog.args.takerAssetData); - const exchangeCancelEvent = new ExchangeCancelEvent(); - exchangeCancelEvent.contractAddress = eventLog.address as string; - exchangeCancelEvent.blockNumber = eventLog.blockNumber as number; - exchangeCancelEvent.logIndex = eventLog.logIndex as number; - exchangeCancelEvent.rawData = eventLog.data as string; - exchangeCancelEvent.transactionHash = eventLog.transactionHash; - exchangeCancelEvent.makerAddress = eventLog.args.makerAddress; - exchangeCancelEvent.takerAddress = eventLog.args.takerAddress; - exchangeCancelEvent.feeRecipientAddress = eventLog.args.feeRecipientAddress; - exchangeCancelEvent.senderAddress = eventLog.args.senderAddress; - exchangeCancelEvent.orderHash = eventLog.args.orderHash; - exchangeCancelEvent.rawMakerAssetData = eventLog.args.makerAssetData; - // tslint:disable-next-line:no-unnecessary-type-assertion - exchangeCancelEvent.makerAssetType = convertAssetProxyIdToType(makerAssetData.assetProxyId as AssetProxyId); - exchangeCancelEvent.makerAssetProxyId = makerAssetData.assetProxyId; - // HACK(abandeali1): this event schema currently does not support multiple maker/taker assets, so we store the first token address from the MultiAssetProxy assetData - exchangeCancelEvent.makerTokenAddress = assetDataUtils.isMultiAssetData(makerAssetData) - ? assetDataUtils.decodeMultiAssetDataRecursively(eventLog.args.makerAssetData).nestedAssetData[0].tokenAddress - : makerAssetData.tokenAddress; - // tslint:disable-next-line:no-unnecessary-type-assertion - exchangeCancelEvent.makerTokenId = bigNumbertoStringOrNull((makerAssetData as ERC721AssetData).tokenId); - exchangeCancelEvent.rawTakerAssetData = eventLog.args.takerAssetData; - // tslint:disable-next-line:no-unnecessary-type-assertion - exchangeCancelEvent.takerAssetType = convertAssetProxyIdToType(takerAssetData.assetProxyId as AssetProxyId); - exchangeCancelEvent.takerAssetProxyId = takerAssetData.assetProxyId; - // HACK(abandeali1): this event schema currently does not support multiple maker/taker assets, so we store the first token address from the MultiAssetProxy assetData - exchangeCancelEvent.takerTokenAddress = assetDataUtils.isMultiAssetData(takerAssetData) - ? assetDataUtils.decodeMultiAssetDataRecursively(eventLog.args.takerAssetData).nestedAssetData[0].tokenAddress - : takerAssetData.tokenAddress; - // tslint:disable-next-line:no-unnecessary-type-assertion - exchangeCancelEvent.takerTokenId = bigNumbertoStringOrNull((takerAssetData as ERC721AssetData).tokenId); - return exchangeCancelEvent; -} - -/** - * Converts a raw event log for a cancelUpTo event into an - * ExchangeCancelUpToEvent entity. - * @param eventLog Raw event log (e.g. returned from contract-wrappers). 
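The converter below reads only a handful of log fields. A hypothetical decoded log, trimmed to just those fields (a real `LogWithDecodedArgs` carries more, such as `topics` and `blockHash`, hence the cast):

```typescript
import { BigNumber } from '@0x/utils';

const fakeCancelUpToLog = {
    address: '0x0000000000000000000000000000000000000001', // placeholder addresses
    blockNumber: 6000000,
    logIndex: 42,
    data: '0x',
    transactionHash: '0x' + '00'.repeat(32),
    args: {
        makerAddress: '0x0000000000000000000000000000000000000002',
        senderAddress: '0x0000000000000000000000000000000000000002',
        orderEpoch: new BigNumber(1),
    },
} as any;
const entity = _convertToExchangeCancelUpToEvent(fakeCancelUpToLog);
// entity.orderEpoch is the args.orderEpoch BigNumber; the rest map 1:1.
```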
- */ -export function _convertToExchangeCancelUpToEvent( - eventLog: LogWithDecodedArgs<ExchangeCancelUpToEventArgs>, -): ExchangeCancelUpToEvent { - const exchangeCancelUpToEvent = new ExchangeCancelUpToEvent(); - exchangeCancelUpToEvent.contractAddress = eventLog.address as string; - exchangeCancelUpToEvent.blockNumber = eventLog.blockNumber as number; - exchangeCancelUpToEvent.logIndex = eventLog.logIndex as number; - exchangeCancelUpToEvent.rawData = eventLog.data as string; - exchangeCancelUpToEvent.transactionHash = eventLog.transactionHash; - exchangeCancelUpToEvent.makerAddress = eventLog.args.makerAddress; - exchangeCancelUpToEvent.senderAddress = eventLog.args.senderAddress; - exchangeCancelUpToEvent.orderEpoch = eventLog.args.orderEpoch; - return exchangeCancelUpToEvent; -} diff --git a/packages/pipeline/src/parsers/events/index.ts b/packages/pipeline/src/parsers/events/index.ts deleted file mode 100644 index 3f9915e8b..000000000 --- a/packages/pipeline/src/parsers/events/index.ts +++ /dev/null @@ -1,2 +0,0 @@ -export { parseExchangeCancelEvents, parseExchangeCancelUpToEvents, parseExchangeFillEvents } from './exchange_events'; -export { parseERC20ApprovalEvents } from './erc20_events'; diff --git a/packages/pipeline/src/parsers/idex_orders/index.ts b/packages/pipeline/src/parsers/idex_orders/index.ts deleted file mode 100644 index 14b871195..000000000 --- a/packages/pipeline/src/parsers/idex_orders/index.ts +++ /dev/null @@ -1,81 +0,0 @@ -import { BigNumber } from '@0x/utils'; - -import { aggregateOrders } from '../utils'; - -import { IdexOrderbook, IdexOrderParam } from '../../data_sources/idex'; -import { TokenOrderbookSnapshot as TokenOrder } from '../../entities'; -import { OrderType } from '../../types'; - -/** - * Marque function of this file. - * 1) Takes in orders from an orderbook, - * 2) Aggregates them by price point, - * 3) Parses them into entities which are then saved into the database. - * @param idexOrderbook raw orderbook that we pull from the Idex API. - * @param observedTimestamp Time at which the orders for the market were pulled. - * @param source The exchange where these orders are placed. In this case 'idex'. - */ -export function parseIdexOrders(idexOrderbook: IdexOrderbook, observedTimestamp: number, source: string): TokenOrder[] { - const aggregatedBids = aggregateOrders(idexOrderbook.bids); - // Any of the bid orders' params will work - const idexBidOrder = idexOrderbook.bids[0]; - const parsedBids = - aggregatedBids.length > 0 - ? aggregatedBids.map(order => - parseIdexOrder(idexBidOrder.params, observedTimestamp, OrderType.Bid, source, order), - ) - : []; - - const aggregatedAsks = aggregateOrders(idexOrderbook.asks); - // Any of the ask orders' params will work - const idexAskOrder = idexOrderbook.asks[0]; - const parsedAsks = - aggregatedAsks.length > 0 - ? aggregatedAsks.map(order => - parseIdexOrder(idexAskOrder.params, observedTimestamp, OrderType.Ask, source, order), - ) - : []; - return parsedBids.concat(parsedAsks); -} - -/** - * Parse a single aggregated Idex order in order to form a tokenOrder entity - * which can be saved into the database. - * @param idexOrderParam An object containing information about the market where these - * trades have been placed. - * @param observedTimestamp The time when the API response returned back to us. - * @param orderType 'bid' or 'ask' enum. - * @param source Exchange where these orders were placed. - * @param idexOrder A <price, amount> tuple which we will convert to volume-basis. 
- */ -export function parseIdexOrder( - idexOrderParam: IdexOrderParam, - observedTimestamp: number, - orderType: OrderType, - source: string, - idexOrder: [string, BigNumber], -): TokenOrder { - const tokenOrder = new TokenOrder(); - const price = new BigNumber(idexOrder[0]); - const amount = idexOrder[1]; - - tokenOrder.source = source; - tokenOrder.observedTimestamp = observedTimestamp; - tokenOrder.orderType = orderType; - tokenOrder.price = price; - tokenOrder.baseVolume = amount; - tokenOrder.quoteVolume = price.times(amount); - - if (orderType === OrderType.Bid) { - tokenOrder.baseAssetSymbol = idexOrderParam.buySymbol; - tokenOrder.baseAssetAddress = idexOrderParam.tokenBuy; - tokenOrder.quoteAssetSymbol = idexOrderParam.sellSymbol; - tokenOrder.quoteAssetAddress = idexOrderParam.tokenSell; - } else { - tokenOrder.baseAssetSymbol = idexOrderParam.sellSymbol; - tokenOrder.baseAssetAddress = idexOrderParam.tokenSell; - tokenOrder.quoteAssetSymbol = idexOrderParam.buySymbol; - tokenOrder.quoteAssetAddress = idexOrderParam.tokenBuy; - } - return tokenOrder; -} diff --git a/packages/pipeline/src/parsers/oasis_orders/index.ts b/packages/pipeline/src/parsers/oasis_orders/index.ts deleted file mode 100644 index b71fb65b9..000000000 --- a/packages/pipeline/src/parsers/oasis_orders/index.ts +++ /dev/null @@ -1,71 +0,0 @@ -import { BigNumber } from '@0x/utils'; -import * as R from 'ramda'; - -import { aggregateOrders } from '../utils'; - -import { OasisMarket, OasisOrder } from '../../data_sources/oasis'; -import { TokenOrderbookSnapshot as TokenOrder } from '../../entities'; -import { OrderType } from '../../types'; - -/** - * Marque function of this file. - * 1) Takes in orders from an orderbook, - * 2) Aggregates them according to price point, - * 3) Builds TokenOrder entity with other information attached. - * @param oasisOrderbook A raw orderbook that we pull from the Oasis API. - * @param oasisMarket An object containing market data also directly from the API. - * @param observedTimestamp Time at which the orders for the market were pulled. - * @param source The exchange where these orders are placed. In this case 'oasis'. - */ -export function parseOasisOrders( - oasisOrderbook: OasisOrder[], - oasisMarket: OasisMarket, - observedTimestamp: number, - source: string, -): TokenOrder[] { - const aggregatedBids = aggregateOrders(R.filter(R.propEq('act', OrderType.Bid), oasisOrderbook)); - const aggregatedAsks = aggregateOrders(R.filter(R.propEq('act', OrderType.Ask), oasisOrderbook)); - const parsedBids = aggregatedBids.map(order => - parseOasisOrder(oasisMarket, observedTimestamp, OrderType.Bid, source, order), - ); - const parsedAsks = aggregatedAsks.map(order => - parseOasisOrder(oasisMarket, observedTimestamp, OrderType.Ask, source, order), - ); - return parsedBids.concat(parsedAsks); -} - -/** - * Parse a single aggregated Oasis order to form a tokenOrder entity - * which can be saved into the database. - * @param oasisMarket An object containing information about the market where these - * trades have been placed. - * @param observedTimestamp The time when the API response returned back to us. - * @param orderType 'bid' or 'ask' enum. - * @param source Exchange where these orders were placed. - * @param oasisOrder A <price, amount> tuple which we will convert to volume-basis. 
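All of these orderbook parsers store quote-side volume as `price.times(amount)`; a quick worked example of the arithmetic:

```typescript
import { BigNumber } from '@0x/utils';

const price = new BigNumber('0.0125'); // quote units per base unit
const amount = new BigNumber('150'); // base units at this price level
price.times(amount).toString(); // '1.875' -- baseVolume 150, quoteVolume 1.875
```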
- */ -export function parseOasisOrder( - oasisMarket: OasisMarket, - observedTimestamp: number, - orderType: OrderType, - source: string, - oasisOrder: [string, BigNumber], -): TokenOrder { - const tokenOrder = new TokenOrder(); - const price = new BigNumber(oasisOrder[0]); - const amount = oasisOrder[1]; - - tokenOrder.source = source; - tokenOrder.observedTimestamp = observedTimestamp; - tokenOrder.orderType = orderType; - tokenOrder.price = price; - - tokenOrder.baseAssetSymbol = oasisMarket.base; - tokenOrder.baseAssetAddress = null; // Oasis doesn't provide address information - tokenOrder.baseVolume = amount; - - tokenOrder.quoteAssetSymbol = oasisMarket.quote; - tokenOrder.quoteAssetAddress = null; // Oasis doesn't provide address information - tokenOrder.quoteVolume = price.times(amount); - return tokenOrder; -} diff --git a/packages/pipeline/src/parsers/ohlcv_external/crypto_compare.ts b/packages/pipeline/src/parsers/ohlcv_external/crypto_compare.ts deleted file mode 100644 index 3efb90384..000000000 --- a/packages/pipeline/src/parsers/ohlcv_external/crypto_compare.ts +++ /dev/null @@ -1,38 +0,0 @@ -import { CryptoCompareOHLCVRecord } from '../../data_sources/ohlcv_external/crypto_compare'; -import { OHLCVExternal } from '../../entities'; - -const ONE_SECOND = 1000; // Crypto Compare uses timestamps in seconds instead of milliseconds - -export interface OHLCVMetadata { - exchange: string; - fromSymbol: string; - toSymbol: string; - source: string; - observedTimestamp: number; - interval: number; -} -/** - * Parses OHLCV records from Crypto Compare into an array of OHLCVExternal entities - * @param rawRecords an array of OHLCV records from Crypto Compare (not the full response) - */ -export function parseRecords(rawRecords: CryptoCompareOHLCVRecord[], metadata: OHLCVMetadata): OHLCVExternal[] { - return rawRecords.map(rec => { - const ohlcvRecord = new OHLCVExternal(); - ohlcvRecord.exchange = metadata.exchange; - ohlcvRecord.fromSymbol = metadata.fromSymbol; - ohlcvRecord.toSymbol = metadata.toSymbol; - ohlcvRecord.startTime = rec.time * ONE_SECOND - metadata.interval; - ohlcvRecord.endTime = rec.time * ONE_SECOND; - - ohlcvRecord.open = rec.open; - ohlcvRecord.close = rec.close; - ohlcvRecord.low = rec.low; - ohlcvRecord.high = rec.high; - ohlcvRecord.volumeFrom = rec.volumefrom; - ohlcvRecord.volumeTo = rec.volumeto; - - ohlcvRecord.source = metadata.source; - ohlcvRecord.observedTimestamp = metadata.observedTimestamp; - return ohlcvRecord; - }); -} diff --git a/packages/pipeline/src/parsers/paradex_orders/index.ts b/packages/pipeline/src/parsers/paradex_orders/index.ts deleted file mode 100644 index 85990dae4..000000000 --- a/packages/pipeline/src/parsers/paradex_orders/index.ts +++ /dev/null @@ -1,66 +0,0 @@ -import { BigNumber } from '@0x/utils'; - -import { ParadexMarket, ParadexOrder, ParadexOrderbookResponse } from '../../data_sources/paradex'; -import { TokenOrderbookSnapshot as TokenOrder } from '../../entities'; -import { OrderType } from '../../types'; - -/** - * Marque function of this file. - * 1) Takes in orders from an orderbook (orders are already aggregated by price point), - * 2) For each aggregated order, forms a TokenOrder entity with market data and - * other information attached. - * @param paradexOrderbookResponse An orderbook response from the Paradex API. - * @param paradexMarket An object containing market data also directly from the API. - * @param observedTimestamp Time at which the orders for the market were pulled. 
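The Crypto Compare parser above stores each candle as a `[startTime, endTime]` window: the API reports close times in seconds while the pipeline uses milliseconds, so the start is recovered by subtracting the interval. A worked example for an hourly candle (values hypothetical):

```typescript
const ONE_SECOND = 1000;
const recTime = 1546300800; // candle close from the API, in seconds
const intervalMs = 3600000; // one hour, from OHLCVMetadata.interval
const endTime = recTime * ONE_SECOND; // 1546300800000
const startTime = endTime - intervalMs; // 1546297200000
```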
- * @param source The exchange where these orders are placed. In this case 'paradex'.
- */
-export function parseParadexOrders(
-    paradexOrderbookResponse: ParadexOrderbookResponse,
-    paradexMarket: ParadexMarket,
-    observedTimestamp: number,
-    source: string,
-): TokenOrder[] {
-    const parsedBids = paradexOrderbookResponse.bids.map(order =>
-        parseParadexOrder(paradexMarket, observedTimestamp, OrderType.Bid, source, order),
-    );
-    const parsedAsks = paradexOrderbookResponse.asks.map(order =>
-        parseParadexOrder(paradexMarket, observedTimestamp, OrderType.Ask, source, order),
-    );
-    return parsedBids.concat(parsedAsks);
-}
-
-/**
- * Parse a single aggregated Paradex order in order to form a tokenOrder entity
- * which can be saved into the database.
- * @param paradexMarket An object containing information about the market where these
- * orders have been placed.
- * @param observedTimestamp The time when the API response returned back to us.
- * @param orderType 'bid' or 'ask' enum.
- * @param source Exchange where these orders were placed.
- * @param paradexOrder A ParadexOrder object; essentially a <price, amount> tuple.
- */
-export function parseParadexOrder(
-    paradexMarket: ParadexMarket,
-    observedTimestamp: number,
-    orderType: OrderType,
-    source: string,
-    paradexOrder: ParadexOrder,
-): TokenOrder {
-    const tokenOrder = new TokenOrder();
-    const price = new BigNumber(paradexOrder.price);
-    const amount = new BigNumber(paradexOrder.amount);
-
-    tokenOrder.source = source;
-    tokenOrder.observedTimestamp = observedTimestamp;
-    tokenOrder.orderType = orderType;
-    tokenOrder.price = price;
-
-    tokenOrder.baseAssetSymbol = paradexMarket.baseToken;
-    tokenOrder.baseAssetAddress = paradexMarket.baseTokenAddress as string;
-    tokenOrder.baseVolume = amount;
-
-    tokenOrder.quoteAssetSymbol = paradexMarket.quoteToken;
-    tokenOrder.quoteAssetAddress = paradexMarket.quoteTokenAddress as string;
-    tokenOrder.quoteVolume = price.times(amount);
-    return tokenOrder;
-}
diff --git a/packages/pipeline/src/parsers/relayer_registry/index.ts b/packages/pipeline/src/parsers/relayer_registry/index.ts
deleted file mode 100644
index 9723880a4..000000000
--- a/packages/pipeline/src/parsers/relayer_registry/index.ts
+++ /dev/null
@@ -1,37 +0,0 @@
-import * as R from 'ramda';
-
-import { RelayerResponse, RelayerResponseNetwork } from '../../data_sources/relayer-registry';
-import { Relayer } from '../../entities';
-
-/**
- * Parses a raw relayer registry response into an array of Relayer entities.
- * @param rawResp raw response from the relayer-registry json file.
- */ -export function parseRelayers(rawResp: Map<string, RelayerResponse>): Relayer[] { - const parsedAsObject = R.mapObjIndexed(parseRelayer, rawResp); - return R.values(parsedAsObject); -} - -function parseRelayer(relayerResp: RelayerResponse, uuid: string): Relayer { - const relayer = new Relayer(); - relayer.uuid = uuid; - relayer.name = relayerResp.name; - relayer.homepageUrl = relayerResp.homepage_url; - relayer.appUrl = relayerResp.app_url; - const mainNetworkRelayerInfo = getMainNetwork(relayerResp); - if (mainNetworkRelayerInfo !== undefined) { - relayer.sraHttpEndpoint = mainNetworkRelayerInfo.sra_http_endpoint || null; - relayer.sraWsEndpoint = mainNetworkRelayerInfo.sra_ws_endpoint || null; - relayer.feeRecipientAddresses = - R.path(['static_order_fields', 'fee_recipient_addresses'], mainNetworkRelayerInfo) || []; - relayer.takerAddresses = R.path(['static_order_fields', 'taker_addresses'], mainNetworkRelayerInfo) || []; - } else { - relayer.feeRecipientAddresses = []; - relayer.takerAddresses = []; - } - return relayer; -} - -function getMainNetwork(relayerResp: RelayerResponse): RelayerResponseNetwork | undefined { - return R.find(network => network.networkId === 1, relayerResp.networks); -} diff --git a/packages/pipeline/src/parsers/sra_orders/index.ts b/packages/pipeline/src/parsers/sra_orders/index.ts deleted file mode 100644 index 13fe632a4..000000000 --- a/packages/pipeline/src/parsers/sra_orders/index.ts +++ /dev/null @@ -1,68 +0,0 @@ -import { APIOrder, OrdersResponse } from '@0x/connect'; -import { assetDataUtils, orderHashUtils } from '@0x/order-utils'; -import { AssetProxyId, ERC721AssetData } from '@0x/types'; -import * as R from 'ramda'; - -import { SraOrder } from '../../entities'; -import { bigNumbertoStringOrNull, convertAssetProxyIdToType } from '../../utils'; - -/** - * Parses a raw order response from an SRA endpoint and returns an array of - * SraOrder entities. - * @param rawOrdersResponse A raw order response from an SRA endpoint. - */ -export function parseSraOrders(rawOrdersResponse: OrdersResponse): SraOrder[] { - return R.map(_convertToEntity, rawOrdersResponse.records); -} - -/** - * Converts a single APIOrder into an SraOrder entity. - * @param apiOrder A single order from the response from an SRA endpoint. - */ -export function _convertToEntity(apiOrder: APIOrder): SraOrder { - // TODO(albrow): refactor out common asset data decoding code. 
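Worth noting for the registry parser above: despite the `Map<string, RelayerResponse>` annotation, the value that arrives is parsed JSON, i.e. a plain object keyed by relayer UUID, which is exactly what `R.mapObjIndexed` iterates (it calls the mapper as `(value, key)`). A generic sketch:

```typescript
import * as R from 'ramda';

const registry = { 'uuid-1': { name: 'A' }, 'uuid-2': { name: 'B' } };
const labels = R.values(R.mapObjIndexed((resp, uuid) => `${uuid}:${resp.name}`, registry));
// ['uuid-1:A', 'uuid-2:B']
```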
- const makerAssetData = assetDataUtils.decodeAssetDataOrThrow(apiOrder.order.makerAssetData); - const takerAssetData = assetDataUtils.decodeAssetDataOrThrow(apiOrder.order.takerAssetData); - - const sraOrder = new SraOrder(); - sraOrder.exchangeAddress = apiOrder.order.exchangeAddress; - sraOrder.orderHashHex = orderHashUtils.getOrderHashHex(apiOrder.order); - - sraOrder.makerAddress = apiOrder.order.makerAddress; - sraOrder.takerAddress = apiOrder.order.takerAddress; - sraOrder.feeRecipientAddress = apiOrder.order.feeRecipientAddress; - sraOrder.senderAddress = apiOrder.order.senderAddress; - sraOrder.makerAssetAmount = apiOrder.order.makerAssetAmount; - sraOrder.takerAssetAmount = apiOrder.order.takerAssetAmount; - sraOrder.makerFee = apiOrder.order.makerFee; - sraOrder.takerFee = apiOrder.order.takerFee; - sraOrder.expirationTimeSeconds = apiOrder.order.expirationTimeSeconds; - sraOrder.salt = apiOrder.order.salt; - sraOrder.signature = apiOrder.order.signature; - - sraOrder.rawMakerAssetData = apiOrder.order.makerAssetData; - // tslint:disable-next-line:no-unnecessary-type-assertion - sraOrder.makerAssetType = convertAssetProxyIdToType(makerAssetData.assetProxyId as AssetProxyId); - sraOrder.makerAssetProxyId = makerAssetData.assetProxyId; - // HACK(abandeali1): this event schema currently does not support multiple maker/taker assets, so we store the first token address from the MultiAssetProxy assetData - sraOrder.makerTokenAddress = assetDataUtils.isMultiAssetData(makerAssetData) - ? assetDataUtils.decodeMultiAssetDataRecursively(apiOrder.order.makerAssetData).nestedAssetData[0].tokenAddress - : makerAssetData.tokenAddress; - // tslint has a false positive here. Type assertion is required. - // tslint:disable-next-line:no-unnecessary-type-assertion - sraOrder.makerTokenId = bigNumbertoStringOrNull((makerAssetData as ERC721AssetData).tokenId); - sraOrder.rawTakerAssetData = apiOrder.order.takerAssetData; - // tslint:disable-next-line:no-unnecessary-type-assertion - sraOrder.takerAssetType = convertAssetProxyIdToType(takerAssetData.assetProxyId as AssetProxyId); - sraOrder.takerAssetProxyId = takerAssetData.assetProxyId; - // HACK(abandeali1): this event schema currently does not support multiple maker/taker assets, so we store the first token address from the MultiAssetProxy assetData - sraOrder.takerTokenAddress = assetDataUtils.isMultiAssetData(takerAssetData) - ? assetDataUtils.decodeMultiAssetDataRecursively(apiOrder.order.takerAssetData).nestedAssetData[0].tokenAddress - : takerAssetData.tokenAddress; - // tslint:disable-next-line:no-unnecessary-type-assertion - sraOrder.takerTokenId = bigNumbertoStringOrNull((takerAssetData as ERC721AssetData).tokenId); - - sraOrder.metadataJson = JSON.stringify(apiOrder.metaData); - - return sraOrder; -} diff --git a/packages/pipeline/src/parsers/token_metadata/index.ts b/packages/pipeline/src/parsers/token_metadata/index.ts deleted file mode 100644 index 65e0aaa6e..000000000 --- a/packages/pipeline/src/parsers/token_metadata/index.ts +++ /dev/null @@ -1,46 +0,0 @@ -import * as R from 'ramda'; - -import { MetamaskTrustedTokenMeta, ZeroExTrustedTokenMeta } from '../../data_sources/trusted_tokens'; -import { TokenMetadata } from '../../entities'; -import { toBigNumberOrNull } from '../../utils'; - -/** - * Parses Metamask's trusted tokens list. - * @param rawResp raw response from the metamask json file. 
- */ -export function parseMetamaskTrustedTokens(rawResp: Map<string, MetamaskTrustedTokenMeta>): TokenMetadata[] { - const parsedAsObject = R.mapObjIndexed(parseMetamaskTrustedToken, rawResp); - return R.values(parsedAsObject); -} - -/** - * Parses 0x's trusted tokens list. - * @param rawResp raw response from the 0x trusted tokens file. - */ -export function parseZeroExTrustedTokens(rawResp: ZeroExTrustedTokenMeta[]): TokenMetadata[] { - return R.map(parseZeroExTrustedToken, rawResp); -} - -function parseMetamaskTrustedToken(resp: MetamaskTrustedTokenMeta, address: string): TokenMetadata { - const trustedToken = new TokenMetadata(); - - trustedToken.address = address; - trustedToken.decimals = toBigNumberOrNull(resp.decimals); - trustedToken.symbol = resp.symbol; - trustedToken.name = resp.name; - trustedToken.authority = 'metamask'; - - return trustedToken; -} - -function parseZeroExTrustedToken(resp: ZeroExTrustedTokenMeta): TokenMetadata { - const trustedToken = new TokenMetadata(); - - trustedToken.address = resp.address; - trustedToken.decimals = toBigNumberOrNull(resp.decimals); - trustedToken.symbol = resp.symbol; - trustedToken.name = resp.name; - trustedToken.authority = '0x'; - - return trustedToken; -} diff --git a/packages/pipeline/src/parsers/utils.ts b/packages/pipeline/src/parsers/utils.ts deleted file mode 100644 index 860729e9f..000000000 --- a/packages/pipeline/src/parsers/utils.ts +++ /dev/null @@ -1,28 +0,0 @@ -import { BigNumber } from '@0x/utils'; - -export interface GenericRawOrder { - price: string; - amount: string; -} - -/** - * Aggregates individual orders by price point. Filters zero amount orders. - * @param rawOrders An array of objects that have price and amount information. - */ -export function aggregateOrders(rawOrders: GenericRawOrder[]): Array<[string, BigNumber]> { - const aggregatedOrders = new Map<string, BigNumber>(); - rawOrders.forEach(order => { - const amount = new BigNumber(order.amount); - if (amount.isZero()) { - return; - } - // Use string instead of BigNum to aggregate by value instead of variable. - // Convert to BigNumber first to consolidate different string - // representations of the same number. Eg. '0.0' and '0.00'. - const price = new BigNumber(order.price).toString(); - - const existingAmount = aggregatedOrders.get(price) || new BigNumber(0); - aggregatedOrders.set(price, amount.plus(existingAmount)); - }); - return Array.from(aggregatedOrders.entries()); -} diff --git a/packages/pipeline/src/parsers/web3/index.ts b/packages/pipeline/src/parsers/web3/index.ts deleted file mode 100644 index f986efc59..000000000 --- a/packages/pipeline/src/parsers/web3/index.ts +++ /dev/null @@ -1,49 +0,0 @@ -import { BigNumber } from '@0x/utils'; -import { BlockWithoutTransactionData, Transaction as EthTransaction } from 'ethereum-types'; - -import { Block, Transaction } from '../../entities'; - -const MILLISECONDS_PER_SECOND = 1000; - -/** - * Parses a raw block and returns a Block entity. - * @param rawBlock a raw block (e.g. returned from web3-wrapper). - */ -export function parseBlock(rawBlock: BlockWithoutTransactionData): Block { - if (rawBlock.hash == null) { - throw new Error('Tried to parse raw block but hash was null'); - } - if (rawBlock.number == null) { - throw new Error('Tried to parse raw block but number was null'); - } - - const block = new Block(); - block.hash = rawBlock.hash; - block.number = rawBlock.number; - // Block timestamps are in seconds, but we use milliseconds everywhere else. 
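A worked example of `aggregateOrders` from parsers/utils.ts above: equivalent price strings collapse into one level and zero-amount orders are dropped:

```typescript
import { BigNumber } from '@0x/utils';
// Assumes aggregateOrders is imported from the utils module shown above.

const levels = aggregateOrders([
    { price: '0.1', amount: '2' },
    { price: '0.10', amount: '3' }, // same value as '0.1' after the BigNumber round-trip
    { price: '0.2', amount: '0' }, // zero amount: filtered out
]);
// levels deep-equals [['0.1', new BigNumber(5)]]
```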
-    block.timestamp = rawBlock.timestamp * MILLISECONDS_PER_SECOND;
-    return block;
-}
-
-/**
- * Parses a raw transaction and returns a Transaction entity.
- * @param rawTransaction a raw transaction (e.g. returned from web3-wrapper).
- */
-export function parseTransaction(rawTransaction: EthTransaction): Transaction {
-    if (rawTransaction.blockHash == null) {
-        throw new Error('Tried to parse raw transaction but blockHash was null');
-    }
-    if (rawTransaction.blockNumber == null) {
-        throw new Error('Tried to parse raw transaction but blockNumber was null');
-    }
-
-    const tx = new Transaction();
-    tx.transactionHash = rawTransaction.hash;
-    tx.blockHash = rawTransaction.blockHash;
-    tx.blockNumber = rawTransaction.blockNumber;
-
-    tx.gasUsed = new BigNumber(rawTransaction.gas);
-    tx.gasPrice = rawTransaction.gasPrice;
-
-    return tx;
-}
diff --git a/packages/pipeline/src/scripts/pull_competing_dex_trades.ts b/packages/pipeline/src/scripts/pull_competing_dex_trades.ts
deleted file mode 100644
index 14644bb2e..000000000
--- a/packages/pipeline/src/scripts/pull_competing_dex_trades.ts
+++ /dev/null
@@ -1,52 +0,0 @@
-import 'reflect-metadata';
-import { Connection, ConnectionOptions, createConnection, Repository } from 'typeorm';
-
-import { logUtils } from '@0x/utils';
-
-import { BloxySource } from '../data_sources/bloxy';
-import { DexTrade } from '../entities';
-import * as ormConfig from '../ormconfig';
-import { parseBloxyTrades } from '../parsers/bloxy';
-import { handleError } from '../utils';
-
-// Number of trades to save at once.
-const BATCH_SAVE_SIZE = 1000;
-
-let connection: Connection;
-
-(async () => {
-    connection = await createConnection(ormConfig as ConnectionOptions);
-    await getAndSaveTradesAsync();
-    process.exit(0);
-})().catch(handleError);
-
-async function getAndSaveTradesAsync(): Promise<void> {
-    const apiKey = process.env.BLOXY_API_KEY;
-    if (apiKey === undefined) {
-        throw new Error('Missing required env var: BLOXY_API_KEY');
-    }
-    const bloxySource = new BloxySource(apiKey);
-    const tradesRepository = connection.getRepository(DexTrade);
-    const lastSeenTimestamp = await getLastSeenTimestampAsync(tradesRepository);
-    logUtils.log(`Last seen timestamp: ${lastSeenTimestamp === 0 ? 'none' : lastSeenTimestamp}`);
-    logUtils.log('Getting latest dex trades...');
-    const rawTrades = await bloxySource.getDexTradesAsync(lastSeenTimestamp);
-    logUtils.log(`Parsing ${rawTrades.length} trades...`);
-    const trades = parseBloxyTrades(rawTrades);
-    logUtils.log(`Saving ${trades.length} trades...`);
-    await tradesRepository.save(trades, { chunk: Math.ceil(trades.length / BATCH_SAVE_SIZE) });
-    logUtils.log('Done saving trades.');
-}
-
-async function getLastSeenTimestampAsync(tradesRepository: Repository<DexTrade>): Promise<number> {
-    if ((await tradesRepository.count()) === 0) {
-        return 0;
-    }
-    const response = (await connection.query(
-        'SELECT tx_timestamp FROM raw.dex_trades ORDER BY tx_timestamp DESC LIMIT 1',
-    )) as Array<{ tx_timestamp: number }>;
-    if (response.length === 0) {
-        return 0;
-    }
-    return response[0].tx_timestamp;
-}
diff --git a/packages/pipeline/src/scripts/pull_copper.ts b/packages/pipeline/src/scripts/pull_copper.ts
deleted file mode 100644
index 5e4a6a643..000000000
--- a/packages/pipeline/src/scripts/pull_copper.ts
+++ /dev/null
@@ -1,130 +0,0 @@
-import * as R from 'ramda';
-import { Connection, ConnectionOptions, createConnection, Repository } from 'typeorm';
-
-import { logUtils } from '@0x/utils';
-
-import { CopperEndpoint, CopperSearchParams, CopperSource } from '../data_sources/copper';
-import { CopperActivity, CopperActivityType, CopperCustomField, CopperLead, CopperOpportunity } from '../entities';
-import * as ormConfig from '../ormconfig';
-import {
-    CopperSearchResponse,
-    parseActivities,
-    parseActivityTypes,
-    parseCustomFields,
-    parseLeads,
-    parseOpportunities,
-} from '../parsers/copper';
-import { handleError } from '../utils';
-const ONE_SECOND = 1000;
-const COPPER_RATE_LIMIT = 10;
-let connection: Connection;
-
-(async () => {
-    connection = await createConnection(ormConfig as ConnectionOptions);
-
-    const accessToken = process.env.COPPER_ACCESS_TOKEN;
-    const userEmail = process.env.COPPER_USER_EMAIL;
-    if (accessToken === undefined || userEmail === undefined) {
-        throw new Error('Missing required env var: COPPER_ACCESS_TOKEN and/or COPPER_USER_EMAIL');
-    }
-    const source = new CopperSource(COPPER_RATE_LIMIT, accessToken, userEmail);
-
-    const fetchPromises = [
-        fetchAndSaveLeadsAsync(source),
-        fetchAndSaveOpportunitiesAsync(source),
-        fetchAndSaveActivitiesAsync(source),
-        fetchAndSaveCustomFieldsAsync(source),
-        fetchAndSaveActivityTypesAsync(source),
-    ];
-    // Wait for all fetches to settle so rejections reach the catch handler
-    // below instead of becoming unhandled rejections.
-    await Promise.all(fetchPromises);
-})().catch(handleError);
-
-async function fetchAndSaveLeadsAsync(source: CopperSource): Promise<void> {
-    const repository = connection.getRepository(CopperLead);
-    const startTime = await getMaxAsync(connection, 'date_modified', 'raw.copper_leads');
-    logUtils.log(`Fetching Copper leads starting from ${startTime}...`);
-    await fetchAndSaveAsync(CopperEndpoint.Leads, source, startTime, {}, parseLeads, repository);
-}
-
-async function fetchAndSaveOpportunitiesAsync(source: CopperSource): Promise<void> {
-    const repository = connection.getRepository(CopperOpportunity);
-    const startTime = await getMaxAsync(connection, 'date_modified', 'raw.copper_opportunities');
-    logUtils.log(`Fetching Copper opportunities starting from ${startTime}...`);
-    await fetchAndSaveAsync(
-        CopperEndpoint.Opportunities,
-        source,
-        startTime,
-        { sort_by: 'name' },
-        parseOpportunities,
-        repository,
-    );
-}
-
-async function fetchAndSaveActivitiesAsync(source: CopperSource): Promise<void> {
-    const repository =
connection.getRepository(CopperActivity); - const startTime = await getMaxAsync(connection, 'date_modified', 'raw.copper_activities'); - const searchParams = { - minimum_activity_date: Math.floor(startTime / ONE_SECOND), - }; - logUtils.log(`Fetching Copper activities starting from ${startTime}...`); - await fetchAndSaveAsync(CopperEndpoint.Activities, source, startTime, searchParams, parseActivities, repository); -} - -async function getMaxAsync(conn: Connection, sortColumn: string, tableName: string): Promise<number> { - const queryResult = await conn.query(`SELECT MAX(${sortColumn}) as _max from ${tableName};`); - if (R.isEmpty(queryResult)) { - return 0; - } else { - return queryResult[0]._max; - } -} - -// (Xianny): Copper API doesn't allow queries to filter by date. To ensure that we are filling in ascending chronological -// order and not missing any records, we are scraping all available pages. If Copper data gets larger, -// it would make sense to search for and start filling from the first page that contains a new record. -// This search would increase our network calls and is not efficient to implement with our current small volume -// of Copper records. -async function fetchAndSaveAsync<T extends CopperSearchResponse, E>( - endpoint: CopperEndpoint, - source: CopperSource, - startTime: number, - searchParams: CopperSearchParams, - parseFn: (recs: T[]) => E[], - repository: Repository<E>, -): Promise<void> { - let saved = 0; - const numPages = await source.fetchNumberOfPagesAsync(endpoint); - try { - for (let i = numPages; i > 0; i--) { - logUtils.log(`Fetching page ${i}/${numPages} of ${endpoint}...`); - const raw = await source.fetchSearchResultsAsync<T>(endpoint, { - ...searchParams, - page_number: i, - }); - const newRecords = raw.filter(rec => rec.date_modified * ONE_SECOND > startTime); - const parsed = parseFn(newRecords); - await repository.save<any>(parsed); - saved += newRecords.length; - } - } catch (err) { - logUtils.log(`Error fetching ${endpoint}, stopping: ${err.stack}`); - } finally { - logUtils.log(`Saved ${saved} items from ${endpoint}, done.`); - } -} - -async function fetchAndSaveActivityTypesAsync(source: CopperSource): Promise<void> { - logUtils.log(`Fetching Copper activity types...`); - const activityTypes = await source.fetchActivityTypesAsync(); - const repository = connection.getRepository(CopperActivityType); - await repository.save(parseActivityTypes(activityTypes)); -} - -async function fetchAndSaveCustomFieldsAsync(source: CopperSource): Promise<void> { - logUtils.log(`Fetching Copper custom fields...`); - const customFields = await source.fetchCustomFieldsAsync(); - const repository = connection.getRepository(CopperCustomField); - await repository.save(parseCustomFields(customFields)); -} diff --git a/packages/pipeline/src/scripts/pull_ddex_orderbook_snapshots.ts b/packages/pipeline/src/scripts/pull_ddex_orderbook_snapshots.ts deleted file mode 100644 index 4e00f258f..000000000 --- a/packages/pipeline/src/scripts/pull_ddex_orderbook_snapshots.ts +++ /dev/null @@ -1,55 +0,0 @@ -import { logUtils } from '@0x/utils'; -import * as R from 'ramda'; -import { Connection, ConnectionOptions, createConnection } from 'typeorm'; - -import { DDEX_SOURCE, DdexMarket, DdexSource } from '../data_sources/ddex'; -import { TokenOrderbookSnapshot as TokenOrder } from '../entities'; -import * as ormConfig from '../ormconfig'; -import { parseDdexOrders } from '../parsers/ddex_orders'; -import { handleError } from '../utils'; - -// Number of orders to save at once. 
-const BATCH_SAVE_SIZE = 1000; - -// Number of markets to retrieve orderbooks for at once. -const MARKET_ORDERBOOK_REQUEST_BATCH_SIZE = 50; - -// Delay between market orderbook requests. -const MILLISEC_MARKET_ORDERBOOK_REQUEST_DELAY = 5000; - -let connection: Connection; - -(async () => { - connection = await createConnection(ormConfig as ConnectionOptions); - const ddexSource = new DdexSource(); - const markets = await ddexSource.getActiveMarketsAsync(); - for (const marketsChunk of R.splitEvery(MARKET_ORDERBOOK_REQUEST_BATCH_SIZE, markets)) { - await Promise.all( - marketsChunk.map(async (market: DdexMarket) => getAndSaveMarketOrderbookAsync(ddexSource, market)), - ); - await new Promise<void>(resolve => setTimeout(resolve, MILLISEC_MARKET_ORDERBOOK_REQUEST_DELAY)); - } - process.exit(0); -})().catch(handleError); - -/** - * Retrieve orderbook from Ddex API for a given market. Parse orders and insert - * them into our database. - * @param ddexSource Data source which can query Ddex API. - * @param market Object from Ddex API containing market data. - */ -async function getAndSaveMarketOrderbookAsync(ddexSource: DdexSource, market: DdexMarket): Promise<void> { - const orderBook = await ddexSource.getMarketOrderbookAsync(market.id); - const observedTimestamp = Date.now(); - - logUtils.log(`${market.id}: Parsing orders.`); - const orders = parseDdexOrders(orderBook, market, observedTimestamp, DDEX_SOURCE); - - if (orders.length > 0) { - logUtils.log(`${market.id}: Saving ${orders.length} orders.`); - const TokenOrderRepository = connection.getRepository(TokenOrder); - await TokenOrderRepository.save(orders, { chunk: Math.ceil(orders.length / BATCH_SAVE_SIZE) }); - } else { - logUtils.log(`${market.id}: 0 orders to save.`); - } -} diff --git a/packages/pipeline/src/scripts/pull_erc20_events.ts b/packages/pipeline/src/scripts/pull_erc20_events.ts deleted file mode 100644 index bd520c610..000000000 --- a/packages/pipeline/src/scripts/pull_erc20_events.ts +++ /dev/null @@ -1,96 +0,0 @@ -import { getContractAddressesForNetworkOrThrow } from '@0x/contract-addresses'; -import { web3Factory } from '@0x/dev-utils'; -import { Web3ProviderEngine } from '@0x/subproviders'; -import { logUtils } from '@0x/utils'; -import { Web3Wrapper } from '@0x/web3-wrapper'; -import 'reflect-metadata'; -import { Connection, ConnectionOptions, createConnection } from 'typeorm'; - -import { ERC20EventsSource } from '../data_sources/contract-wrappers/erc20_events'; -import { ERC20ApprovalEvent } from '../entities'; -import * as ormConfig from '../ormconfig'; -import { parseERC20ApprovalEvents } from '../parsers/events'; -import { handleError, INFURA_ROOT_URL } from '../utils'; - -const NETWORK_ID = 1; -const START_BLOCK_OFFSET = 100; // Number of blocks before the last known block to consider when updating fill events. -const BATCH_SAVE_SIZE = 1000; // Number of events to save at once. -const BLOCK_FINALITY_THRESHOLD = 10; // When to consider blocks as final. Used to compute default endBlock. - -let connection: Connection; - -interface Token { - // name is used for logging only. - name: string; - address: string; - defaultStartBlock: number; -} - -const tokensToGetApprovalEvents: Token[] = [ - { - name: 'WETH', - address: getContractAddressesForNetworkOrThrow(NETWORK_ID).etherToken, - defaultStartBlock: 4719568, // Block when the WETH contract was deployed. 
- }, - { - name: 'ZRX', - address: getContractAddressesForNetworkOrThrow(NETWORK_ID).zrxToken, - defaultStartBlock: 4145415, // Block when the ZRX contract was deployed. - }, - { - name: 'DAI', - address: '0x89d24a6b4ccb1b6faa2625fe562bdd9a23260359', - defaultStartBlock: 4752008, // Block when the DAI contract was deployed. - }, -]; - -(async () => { - connection = await createConnection(ormConfig as ConnectionOptions); - const provider = web3Factory.getRpcProvider({ - rpcUrl: INFURA_ROOT_URL, - }); - const endBlock = await calculateEndBlockAsync(provider); - for (const token of tokensToGetApprovalEvents) { - await getAndSaveApprovalEventsAsync(provider, token, endBlock); - } - process.exit(0); -})().catch(handleError); - -async function getAndSaveApprovalEventsAsync( - provider: Web3ProviderEngine, - token: Token, - endBlock: number, -): Promise<void> { - logUtils.log(`Getting approval events for ${token.name}...`); - logUtils.log('Checking existing approval events...'); - const repository = connection.getRepository(ERC20ApprovalEvent); - const startBlock = (await getStartBlockAsync(token)) || token.defaultStartBlock; - - logUtils.log(`Getting approval events starting at ${startBlock}...`); - const eventsSource = new ERC20EventsSource(provider, NETWORK_ID, token.address); - const eventLogs = await eventsSource.getApprovalEventsAsync(startBlock, endBlock); - - logUtils.log(`Parsing ${eventLogs.length} approval events...`); - const events = parseERC20ApprovalEvents(eventLogs); - logUtils.log(`Retrieved and parsed ${events.length} total approval events.`); - await repository.save(events, { chunk: Math.ceil(events.length / BATCH_SAVE_SIZE) }); -} - -async function calculateEndBlockAsync(provider: Web3ProviderEngine): Promise<number> { - const web3Wrapper = new Web3Wrapper(provider); - const currentBlock = await web3Wrapper.getBlockNumberAsync(); - return currentBlock - BLOCK_FINALITY_THRESHOLD; -} - -async function getStartBlockAsync(token: Token): Promise<number | null> { - const queryResult = await connection.query( - `SELECT block_number FROM raw.erc20_approval_events WHERE token_address = $1 ORDER BY block_number DESC LIMIT 1`, - [token.address], - ); - if (queryResult.length === 0) { - logUtils.log(`No existing approval events found for ${token.name}.`); - return null; - } - const lastKnownBlock = queryResult[0].block_number; - return lastKnownBlock - START_BLOCK_OFFSET; -} diff --git a/packages/pipeline/src/scripts/pull_exchange_events.ts b/packages/pipeline/src/scripts/pull_exchange_events.ts deleted file mode 100644 index c2c56da6b..000000000 --- a/packages/pipeline/src/scripts/pull_exchange_events.ts +++ /dev/null @@ -1,152 +0,0 @@ -import { web3Factory } from '@0x/dev-utils'; -import { Web3ProviderEngine } from '@0x/subproviders'; -import { logUtils } from '@0x/utils'; -import { Web3Wrapper } from '@0x/web3-wrapper'; -import R = require('ramda'); -import 'reflect-metadata'; -import { Connection, ConnectionOptions, createConnection, Repository } from 'typeorm'; - -import { ExchangeEventsSource } from '../data_sources/contract-wrappers/exchange_events'; -import { ExchangeCancelEvent, ExchangeCancelUpToEvent, ExchangeEvent, ExchangeFillEvent } from '../entities'; -import * as ormConfig from '../ormconfig'; -import { parseExchangeCancelEvents, parseExchangeCancelUpToEvents, parseExchangeFillEvents } from '../parsers/events'; -import { EXCHANGE_START_BLOCK, handleError, INFURA_ROOT_URL } from '../utils'; - -const START_BLOCK_OFFSET = 100; // Number of blocks before the last known block 
to consider when updating fill events. -const BATCH_SAVE_SIZE = 1000; // Number of events to save at once. -const BLOCK_FINALITY_THRESHOLD = 10; // When to consider blocks as final. Used to compute default endBlock. - -let connection: Connection; - -(async () => { - connection = await createConnection(ormConfig as ConnectionOptions); - const provider = web3Factory.getRpcProvider({ - rpcUrl: INFURA_ROOT_URL, - }); - const endBlock = await calculateEndBlockAsync(provider); - const eventsSource = new ExchangeEventsSource(provider, 1); - await getFillEventsAsync(eventsSource, endBlock); - await getCancelEventsAsync(eventsSource, endBlock); - await getCancelUpToEventsAsync(eventsSource, endBlock); - process.exit(0); -})().catch(handleError); - -async function getFillEventsAsync(eventsSource: ExchangeEventsSource, endBlock: number): Promise<void> { - logUtils.log('Checking existing fill events...'); - const repository = connection.getRepository(ExchangeFillEvent); - const startBlock = await getStartBlockAsync(repository); - logUtils.log(`Getting fill events starting at ${startBlock}...`); - const eventLogs = await eventsSource.getFillEventsAsync(startBlock, endBlock); - logUtils.log('Parsing fill events...'); - const events = parseExchangeFillEvents(eventLogs); - logUtils.log(`Retrieved and parsed ${events.length} total fill events.`); - await saveEventsAsync(startBlock === EXCHANGE_START_BLOCK, repository, events); -} - -async function getCancelEventsAsync(eventsSource: ExchangeEventsSource, endBlock: number): Promise<void> { - logUtils.log('Checking existing cancel events...'); - const repository = connection.getRepository(ExchangeCancelEvent); - const startBlock = await getStartBlockAsync(repository); - logUtils.log(`Getting cancel events starting at ${startBlock}...`); - const eventLogs = await eventsSource.getCancelEventsAsync(startBlock, endBlock); - logUtils.log('Parsing cancel events...'); - const events = parseExchangeCancelEvents(eventLogs); - logUtils.log(`Retrieved and parsed ${events.length} total cancel events.`); - await saveEventsAsync(startBlock === EXCHANGE_START_BLOCK, repository, events); -} - -async function getCancelUpToEventsAsync(eventsSource: ExchangeEventsSource, endBlock: number): Promise<void> { - logUtils.log('Checking existing CancelUpTo events...'); - const repository = connection.getRepository(ExchangeCancelUpToEvent); - const startBlock = await getStartBlockAsync(repository); - logUtils.log(`Getting CancelUpTo events starting at ${startBlock}...`); - const eventLogs = await eventsSource.getCancelUpToEventsAsync(startBlock, endBlock); - logUtils.log('Parsing CancelUpTo events...'); - const events = parseExchangeCancelUpToEvents(eventLogs); - logUtils.log(`Retrieved and parsed ${events.length} total CancelUpTo events.`); - await saveEventsAsync(startBlock === EXCHANGE_START_BLOCK, repository, events); -} - -const tableNameRegex = /^[a-zA-Z_]*$/; - -async function getStartBlockAsync<T extends ExchangeEvent>(repository: Repository<T>): Promise<number> { - const fillEventCount = await repository.count(); - if (fillEventCount === 0) { - logUtils.log(`No existing ${repository.metadata.name}s found.`); - return EXCHANGE_START_BLOCK; - } - const tableName = repository.metadata.tableName; - if (!tableNameRegex.test(tableName)) { - throw new Error(`Unexpected special character in table name: ${tableName}`); - } - const queryResult = await connection.query( - `SELECT block_number FROM raw.${tableName} ORDER BY block_number DESC LIMIT 1`, - ); - const lastKnownBlock = 
queryResult[0].block_number;
-    return lastKnownBlock - START_BLOCK_OFFSET;
-}
-
-async function saveEventsAsync<T extends ExchangeEvent>(
-    isInitialPull: boolean,
-    repository: Repository<T>,
-    events: T[],
-): Promise<void> {
-    logUtils.log(`Saving ${repository.metadata.name}s...`);
-    if (isInitialPull) {
-        // Split data into numChunks pieces of maximum size BATCH_SAVE_SIZE
-        // each.
-        for (const eventsBatch of R.splitEvery(BATCH_SAVE_SIZE, events)) {
-            await repository.insert(eventsBatch);
-        }
-    } else {
-        // If we possibly have some overlap where we need to update some
-        // existing events, we need to use our workaround/fallback.
-        await saveIndividuallyWithFallbackAsync(repository, events);
-    }
-    const totalEvents = await repository.count();
-    logUtils.log(`Done saving events. There are now ${totalEvents} total ${repository.metadata.name}s.`);
-}
-
-async function saveIndividuallyWithFallbackAsync<T extends ExchangeEvent>(
-    repository: Repository<T>,
-    events: T[],
-): Promise<void> {
-    // Note(albrow): This is a temporary hack because `save` is not working as
-    // documented and is causing a primary key constraint violation. Hopefully
-    // can remove later because this "poor man's upsert" implementation operates
-    // on one event at a time and is therefore much slower.
-    for (const event of events) {
-        try {
-            // First try an insert.
-            await repository.insert(event);
-        } catch (err) {
-            if (err.message.includes('duplicate key value violates unique constraint')) {
-                logUtils.log("Ignore the preceding INSERT failure; it's not unexpected");
-            } else {
-                throw err;
-            }
-            // If it fails, assume it was a primary key constraint error and try
-            // doing an update instead.
-            // Note(albrow): Unfortunately the `as any` hack here seems
-            // required. I can't figure out how to convince the type-checker
-            // that the criteria and the entity itself are the correct type for
-            // the given repository. If we can remove the `save` hack then this
-            // will probably no longer be necessary.
-            await repository.update(
-                {
-                    contractAddress: event.contractAddress,
-                    blockNumber: event.blockNumber,
-                    logIndex: event.logIndex,
-                    transactionHash: event.transactionHash,
-                } as any,
-                event as any,
-            );
-        }
-    }
-}
-
-async function calculateEndBlockAsync(provider: Web3ProviderEngine): Promise<number> {
-    const web3Wrapper = new Web3Wrapper(provider);
-    const currentBlock = await web3Wrapper.getBlockNumberAsync();
-    return currentBlock - BLOCK_FINALITY_THRESHOLD;
-}
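
Editor's note: the "poor man's upsert" in `saveIndividuallyWithFallbackAsync` above could, on Postgres, be expressed as a native upsert. A minimal sketch, assuming snake_case column names in the raw tables (the actual schema is not shown in this diff); the helper name is hypothetical. Like the script's own `getStartBlockAsync`, the interpolated table name would have to be validated (e.g. with `tableNameRegex`) because it cannot be parameterized:

```ts
import { Connection } from 'typeorm';

// Hypothetical alternative to saveIndividuallyWithFallbackAsync: let Postgres
// resolve the conflict on the composite primary key instead of catching the
// duplicate-key error in application code. Column names are assumptions.
async function upsertEventAsync(
    connection: Connection,
    tableName: string, // Must be pre-validated; table names cannot be bound as $ parameters.
    event: { contractAddress: string; blockNumber: number; logIndex: number; transactionHash: string; rawData: string },
): Promise<void> {
    await connection.query(
        `INSERT INTO ${tableName}
            (contract_address, block_number, log_index, transaction_hash, raw_data)
         VALUES ($1, $2, $3, $4, $5)
         ON CONFLICT (contract_address, block_number, log_index, transaction_hash)
         DO UPDATE SET raw_data = EXCLUDED.raw_data`,
        [event.contractAddress, event.blockNumber, event.logIndex, event.transactionHash, event.rawData],
    );
}
```

This is still one statement per event, but it removes the insert/catch/update round trip and the `as any` casts.
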
diff --git a/packages/pipeline/src/scripts/pull_idex_orderbook_snapshots.ts b/packages/pipeline/src/scripts/pull_idex_orderbook_snapshots.ts
deleted file mode 100644
index 490b17766..000000000
--- a/packages/pipeline/src/scripts/pull_idex_orderbook_snapshots.ts
+++ /dev/null
@@ -1,63 +0,0 @@
-import { logUtils } from '@0x/utils';
-import * as R from 'ramda';
-import { Connection, ConnectionOptions, createConnection } from 'typeorm';
-
-import { IDEX_SOURCE, IdexSource } from '../data_sources/idex';
-import { TokenOrderbookSnapshot as TokenOrder } from '../entities';
-import * as ormConfig from '../ormconfig';
-import { parseIdexOrders } from '../parsers/idex_orders';
-import { handleError } from '../utils';
-
-// Number of orders to save at once.
-const BATCH_SAVE_SIZE = 1000;
-
-// Number of markets to retrieve orderbooks for at once.
-const MARKET_ORDERBOOK_REQUEST_BATCH_SIZE = 100;
-
-// Delay between market orderbook requests.
-const MILLISEC_MARKET_ORDERBOOK_REQUEST_DELAY = 2000;
-
-let connection: Connection;
-
-(async () => {
-    connection = await createConnection(ormConfig as ConnectionOptions);
-    const idexSource = new IdexSource();
-    logUtils.log('Getting all IDEX markets');
-    const markets = await idexSource.getMarketsAsync();
-    logUtils.log(`Got ${markets.length} markets.`);
-    for (const marketsChunk of R.splitEvery(MARKET_ORDERBOOK_REQUEST_BATCH_SIZE, markets)) {
-        await Promise.all(
-            marketsChunk.map(async (marketId: string) => getAndSaveMarketOrderbookAsync(idexSource, marketId)),
-        );
-        await new Promise<void>(resolve => setTimeout(resolve, MILLISEC_MARKET_ORDERBOOK_REQUEST_DELAY));
-    }
-    process.exit(0);
-})().catch(handleError);
-
-/**
- * Retrieve orderbook from Idex API for a given market. Parse orders and insert
- * them into our database.
- * @param idexSource Data source which can query Idex API.
- * @param marketId String representing market of interest, eg. 'ETH_TIC'.
- */
-async function getAndSaveMarketOrderbookAsync(idexSource: IdexSource, marketId: string): Promise<void> {
-    logUtils.log(`${marketId}: Retrieving orderbook.`);
-    const orderBook = await idexSource.getMarketOrderbookAsync(marketId);
-    const observedTimestamp = Date.now();
-
-    if (!R.has('bids', orderBook) || !R.has('asks', orderBook)) {
-        logUtils.warn(`${marketId}: Orderbook faulty.`);
-        return;
-    }
-
-    logUtils.log(`${marketId}: Parsing orders.`);
-    const orders = parseIdexOrders(orderBook, observedTimestamp, IDEX_SOURCE);
-
-    if (orders.length > 0) {
-        logUtils.log(`${marketId}: Saving ${orders.length} orders.`);
-        const TokenOrderRepository = connection.getRepository(TokenOrder);
-        await TokenOrderRepository.save(orders, { chunk: Math.ceil(orders.length / BATCH_SAVE_SIZE) });
-    } else {
-        logUtils.log(`${marketId}: 0 orders to save.`);
-    }
-}
diff --git a/packages/pipeline/src/scripts/pull_missing_blocks.ts b/packages/pipeline/src/scripts/pull_missing_blocks.ts
deleted file mode 100644
index 345ea38fe..000000000
--- a/packages/pipeline/src/scripts/pull_missing_blocks.ts
+++ /dev/null
@@ -1,91 +0,0 @@
-import { web3Factory } from '@0x/dev-utils';
-import { logUtils } from '@0x/utils';
-
-import * as Parallel from 'async-parallel';
-import R = require('ramda');
-import 'reflect-metadata';
-import { Connection, ConnectionOptions, createConnection, Repository } from 'typeorm';
-
-import { Web3Source } from '../data_sources/web3';
-import { Block } from '../entities';
-import * as ormConfig from '../ormconfig';
-import { parseBlock } from '../parsers/web3';
-import { handleError, INFURA_ROOT_URL } from '../utils';
-
-// Number of blocks to save at once.
-const BATCH_SAVE_SIZE = 1000;
-// Maximum number of requests to send at once.
-const MAX_CONCURRENCY = 20;
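
Editor's note: this script bounds request concurrency with the `async-parallel` dependency (`Parallel.setConcurrency(MAX_CONCURRENCY)` further down). The same effect can be approximated with the `R.splitEvery` batching pattern the orderbook scripts above already use. A sketch under that assumption; `mapWithConcurrencyAsync` is a hypothetical name, and chunked batching is coarser than a true worker pool (each chunk waits for its slowest request):

```ts
import * as R from 'ramda';

// Run fetchOne over items with at most maxConcurrency requests in flight:
// chunks run sequentially, items within a chunk run in parallel.
async function mapWithConcurrencyAsync<T, U>(
    items: T[],
    maxConcurrency: number,
    fetchOne: (item: T) => Promise<U>,
): Promise<U[]> {
    const results: U[] = [];
    for (const chunk of R.splitEvery(maxConcurrency, items)) {
        results.push(...(await Promise.all(chunk.map(fetchOne))));
    }
    return results;
}
```

-// Maximum number of blocks to query for at once. This is also the maximum
-// number of blocks we will hold in memory prior to being saved to the database.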
-const MAX_BLOCKS_PER_QUERY = 1000; - -let connection: Connection; - -const tablesWithMissingBlocks = [ - 'raw.exchange_fill_events', - 'raw.exchange_cancel_events', - 'raw.exchange_cancel_up_to_events', - 'raw.erc20_approval_events', -]; - -(async () => { - connection = await createConnection(ormConfig as ConnectionOptions); - const provider = web3Factory.getRpcProvider({ - rpcUrl: INFURA_ROOT_URL, - }); - const web3Source = new Web3Source(provider); - for (const tableName of tablesWithMissingBlocks) { - await getAllMissingBlocksAsync(web3Source, tableName); - } - process.exit(0); -})().catch(handleError); - -interface MissingBlocksResponse { - block_number: string; -} - -async function getAllMissingBlocksAsync(web3Source: Web3Source, tableName: string): Promise<void> { - const blocksRepository = connection.getRepository(Block); - while (true) { - logUtils.log(`Checking for missing blocks in ${tableName}...`); - const blockNumbers = await getMissingBlockNumbersAsync(tableName); - if (blockNumbers.length === 0) { - // There are no more missing blocks. We're done. - break; - } - await getAndSaveBlocksAsync(web3Source, blocksRepository, blockNumbers); - } - const totalBlocks = await blocksRepository.count(); - logUtils.log(`Done saving blocks for ${tableName}. There are now ${totalBlocks} total blocks.`); -} - -async function getMissingBlockNumbersAsync(tableName: string): Promise<number[]> { - // This query returns up to `MAX_BLOCKS_PER_QUERY` distinct block numbers - // which are present in `tableName` but not in `raw.blocks`. - const response = (await connection.query( - `SELECT DISTINCT(block_number) FROM ${tableName} LEFT JOIN raw.blocks ON ${tableName}.block_number = raw.blocks.number WHERE number IS NULL LIMIT $1;`, - [MAX_BLOCKS_PER_QUERY], - )) as MissingBlocksResponse[]; - const blockNumberStrings = R.pluck('block_number', response); - const blockNumbers = R.map(parseInt, blockNumberStrings); - logUtils.log(`Found ${blockNumbers.length} missing blocks.`); - return blockNumbers; -} - -async function getAndSaveBlocksAsync( - web3Source: Web3Source, - blocksRepository: Repository<Block>, - blockNumbers: number[], -): Promise<void> { - logUtils.log(`Getting block data for ${blockNumbers.length} blocks...`); - Parallel.setConcurrency(MAX_CONCURRENCY); - const rawBlocks = await Parallel.map(blockNumbers, async (blockNumber: number) => - web3Source.getBlockInfoAsync(blockNumber), - ); - logUtils.log(`Parsing ${rawBlocks.length} blocks...`); - const blocks = R.map(parseBlock, rawBlocks); - logUtils.log(`Saving ${blocks.length} blocks...`); - await blocksRepository.save(blocks, { chunk: Math.ceil(blocks.length / BATCH_SAVE_SIZE) }); - logUtils.log('Done saving this batch of blocks'); -} diff --git a/packages/pipeline/src/scripts/pull_oasis_orderbook_snapshots.ts b/packages/pipeline/src/scripts/pull_oasis_orderbook_snapshots.ts deleted file mode 100644 index c4dcf6c83..000000000 --- a/packages/pipeline/src/scripts/pull_oasis_orderbook_snapshots.ts +++ /dev/null @@ -1,58 +0,0 @@ -import { logUtils } from '@0x/utils'; -import * as R from 'ramda'; -import { Connection, ConnectionOptions, createConnection } from 'typeorm'; - -import { OASIS_SOURCE, OasisMarket, OasisSource } from '../data_sources/oasis'; -import { TokenOrderbookSnapshot as TokenOrder } from '../entities'; -import * as ormConfig from '../ormconfig'; -import { parseOasisOrders } from '../parsers/oasis_orders'; -import { handleError } from '../utils'; - -// Number of orders to save at once. 
-const BATCH_SAVE_SIZE = 1000;
-
-// Number of markets to retrieve orderbooks for at once.
-const MARKET_ORDERBOOK_REQUEST_BATCH_SIZE = 50;
-
-// Delay between market orderbook requests.
-const MILLISEC_MARKET_ORDERBOOK_REQUEST_DELAY = 1000;
-
-let connection: Connection;
-
-(async () => {
-    connection = await createConnection(ormConfig as ConnectionOptions);
-    const oasisSource = new OasisSource();
-    logUtils.log('Getting all active Oasis markets');
-    const markets = await oasisSource.getActiveMarketsAsync();
-    logUtils.log(`Got ${markets.length} markets.`);
-    for (const marketsChunk of R.splitEvery(MARKET_ORDERBOOK_REQUEST_BATCH_SIZE, markets)) {
-        await Promise.all(
-            marketsChunk.map(async (market: OasisMarket) => getAndSaveMarketOrderbookAsync(oasisSource, market)),
-        );
-        await new Promise<void>(resolve => setTimeout(resolve, MILLISEC_MARKET_ORDERBOOK_REQUEST_DELAY));
-    }
-    process.exit(0);
-})().catch(handleError);
-
-/**
- * Retrieve orderbook from Oasis API for a given market. Parse orders and insert
- * them into our database.
- * @param oasisSource Data source which can query Oasis API.
- * @param market Object from the Oasis API identifying the market we want data for, e.g. the market with id 'REPAUG'.
- */
-async function getAndSaveMarketOrderbookAsync(oasisSource: OasisSource, market: OasisMarket): Promise<void> {
-    logUtils.log(`${market.id}: Retrieving orderbook.`);
-    const orderBook = await oasisSource.getMarketOrderbookAsync(market.id);
-    const observedTimestamp = Date.now();
-
-    logUtils.log(`${market.id}: Parsing orders.`);
-    const orders = parseOasisOrders(orderBook, market, observedTimestamp, OASIS_SOURCE);
-
-    if (orders.length > 0) {
-        logUtils.log(`${market.id}: Saving ${orders.length} orders.`);
-        const TokenOrderRepository = connection.getRepository(TokenOrder);
-        await TokenOrderRepository.save(orders, { chunk: Math.ceil(orders.length / BATCH_SAVE_SIZE) });
-    } else {
-        logUtils.log(`${market.id}: 0 orders to save.`);
-    }
-}
diff --git a/packages/pipeline/src/scripts/pull_ohlcv_cryptocompare.ts b/packages/pipeline/src/scripts/pull_ohlcv_cryptocompare.ts
deleted file mode 100644
index caac7b9d4..000000000
--- a/packages/pipeline/src/scripts/pull_ohlcv_cryptocompare.ts
+++ /dev/null
@@ -1,96 +0,0 @@
-import { Connection, ConnectionOptions, createConnection, Repository } from 'typeorm';
-
-import { logUtils } from '@0x/utils';
-
-import { CryptoCompareOHLCVSource } from '../data_sources/ohlcv_external/crypto_compare';
-import { OHLCVExternal } from '../entities';
-import * as ormConfig from '../ormconfig';
-import { OHLCVMetadata, parseRecords } from '../parsers/ohlcv_external/crypto_compare';
-import { handleError } from '../utils';
-import { fetchOHLCVTradingPairsAsync, TradingPair } from '../utils/get_ohlcv_trading_pairs';
-
-const SOURCE_NAME = 'CryptoCompare';
-const TWO_HOURS_AGO = new Date().getTime() - 2 * 60 * 60 * 1000; // tslint:disable-line:custom-no-magic-numbers
-
-const MAX_REQS_PER_SECOND = parseInt(process.env.CRYPTOCOMPARE_MAX_REQS_PER_SECOND || '15', 10); // tslint:disable-line:custom-no-magic-numbers
-const EARLIEST_BACKFILL_DATE = process.env.OHLCV_EARLIEST_BACKFILL_DATE || '2014-06-01';
-const EARLIEST_BACKFILL_TIME = new Date(EARLIEST_BACKFILL_DATE).getTime();
-
-let connection: Connection;
-
-(async () => {
-    connection = await createConnection(ormConfig as ConnectionOptions);
-    const repository = connection.getRepository(OHLCVExternal);
-    const source = new CryptoCompareOHLCVSource(MAX_REQS_PER_SECOND);
-
-    const jobTime = new Date().getTime();
-    const tradingPairs = await fetchOHLCVTradingPairsAsync(connection, SOURCE_NAME, EARLIEST_BACKFILL_TIME);
-    logUtils.log(`Starting ${tradingPairs.length} job(s) to scrape Crypto Compare for OHLCV records...`);
-
-    const fetchAndSavePromises = tradingPairs.map(async pair => {
-        const pairs = source.generateBackfillIntervals(pair);
-        return fetchAndSaveAsync(source, repository, jobTime, pairs);
-    });
-    await Promise.all(fetchAndSavePromises);
-    logUtils.log(`Finished scraping OHLCV records from Crypto Compare, exiting...`);
-    process.exit(0);
-})().catch(handleError);
-
-async function fetchAndSaveAsync(
-    source: CryptoCompareOHLCVSource,
-    repository: Repository<OHLCVExternal>,
-    jobTime: number,
-    pairs: TradingPair[],
-): Promise<void> {
-    const sortAscTimestamp = (a: TradingPair, b: TradingPair): number => {
-        if (a.latestSavedTime < b.latestSavedTime) {
-            return -1;
-        } else if (a.latestSavedTime > b.latestSavedTime) {
-            return 1;
-        } else {
-            return 0;
-        }
-    };
-    pairs.sort(sortAscTimestamp);
-
-    let i = 0;
-    while (i < pairs.length) {
-        const pair = pairs[i];
-        if (pair.latestSavedTime > TWO_HOURS_AGO) {
-            break;
-        }
-        try {
-            const records = await source.getHourlyOHLCVAsync(pair);
-            logUtils.log(`Retrieved ${records.length} records for ${JSON.stringify(pair)}`);
-            if (records.length > 0) {
-                const metadata: OHLCVMetadata = {
-                    exchange: source.defaultExchange,
-                    fromSymbol: pair.fromSymbol,
-                    toSymbol: pair.toSymbol,
-                    source: SOURCE_NAME,
-                    observedTimestamp: jobTime,
-                    interval: source.intervalBetweenRecords,
-                };
-                const parsedRecords = parseRecords(records, metadata);
-                await saveRecordsAsync(repository, parsedRecords);
-            }
-            i++;
-        } catch (err) {
-            logUtils.log(`Error scraping OHLCVRecords, stopping task for ${JSON.stringify(pair)} [${err}]`);
-            break;
-        }
-    }
-    return Promise.resolve();
-}
-
-async function saveRecordsAsync(repository: Repository<OHLCVExternal>, records: OHLCVExternal[]): Promise<void> {
-    const metadata = [
-        records[0].fromSymbol,
-        records[0].toSymbol,
-        new Date(records[0].startTime),
-        new Date(records[records.length - 1].endTime),
-    ];
-
-    logUtils.log(`Saving ${records.length} records to ${repository.metadata.name}... ${JSON.stringify(metadata)}`);
-    await repository.save(records);
-}
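
Editor's note: `generateBackfillIntervals` is not part of this diff, but its unit tests further down pin its behavior: one `TradingPair` per `source.interval` step between the pair's `latestSavedTime` and now, and at least the original pair when no backfill is needed. A sketch consistent with those tests; the implementation itself is an assumption:

```ts
// Expand a pair into one entry per backfill interval, matching the
// crypto_compare_test.ts expectations below. Field names come from
// TradingPair above; the loop itself is reconstructed, not the real code.
function generateBackfillIntervals(pair: TradingPair, interval: number): TradingPair[] {
    const now = new Date().getTime();
    const pairs: TradingPair[] = [];
    for (let t = pair.latestSavedTime; t <= now; t += interval) {
        pairs.push({ ...pair, latestSavedTime: t });
    }
    // Always return at least the original pair, even if it is already current.
    return pairs.length > 0 ? pairs : [pair];
}
```
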
diff --git a/packages/pipeline/src/scripts/pull_paradex_orderbook_snapshots.ts b/packages/pipeline/src/scripts/pull_paradex_orderbook_snapshots.ts
deleted file mode 100644
index 34345f355..000000000
--- a/packages/pipeline/src/scripts/pull_paradex_orderbook_snapshots.ts
+++ /dev/null
@@ -1,87 +0,0 @@
-import { logUtils } from '@0x/utils';
-import { Connection, ConnectionOptions, createConnection } from 'typeorm';
-
-import {
-    PARADEX_SOURCE,
-    ParadexActiveMarketsResponse,
-    ParadexMarket,
-    ParadexSource,
-    ParadexTokenInfoResponse,
-} from '../data_sources/paradex';
-import { TokenOrderbookSnapshot as TokenOrder } from '../entities';
-import * as ormConfig from '../ormconfig';
-import { parseParadexOrders } from '../parsers/paradex_orders';
-import { handleError } from '../utils';
-
-// Number of orders to save at once.
-const BATCH_SAVE_SIZE = 1000;
-
-let connection: Connection;
-
-(async () => {
-    connection = await createConnection(ormConfig as ConnectionOptions);
-    const apiKey = process.env.PARADEX_DATA_PIPELINE_API_KEY;
-    if (apiKey === undefined) {
-        throw new Error('Missing required env var: PARADEX_DATA_PIPELINE_API_KEY');
-    }
-    const paradexSource = new ParadexSource(apiKey);
-    const markets = await paradexSource.getActiveMarketsAsync();
-    const tokenInfoResponse = await paradexSource.getTokenInfoAsync();
-    const extendedMarkets = addTokenAddresses(markets, tokenInfoResponse);
-    await Promise.all(
-        extendedMarkets.map(async (market: ParadexMarket) => getAndSaveMarketOrderbookAsync(paradexSource, market)),
-    );
-    process.exit(0);
-})().catch(handleError);
-
-/**
- * Extend the default ParadexMarket objects with token addresses.
- * @param markets An array of ParadexMarket objects.
- * @param tokenInfoResponse An array of ParadexTokenInfo containing the addresses.
- */
-function addTokenAddresses(
-    markets: ParadexActiveMarketsResponse,
-    tokenInfoResponse: ParadexTokenInfoResponse,
-): ParadexMarket[] {
-    const symbolAddressMapping = new Map<string, string>();
-    tokenInfoResponse.forEach(tokenInfo => symbolAddressMapping.set(tokenInfo.symbol, tokenInfo.address));
-
-    markets.forEach((market: ParadexMarket) => {
-        if (symbolAddressMapping.has(market.baseToken)) {
-            market.baseTokenAddress = symbolAddressMapping.get(market.baseToken);
-        } else {
-            market.baseTokenAddress = '';
-            logUtils.warn(`${market.baseToken}: No address found.`);
-        }
-
-        if (symbolAddressMapping.has(market.quoteToken)) {
-            market.quoteTokenAddress = symbolAddressMapping.get(market.quoteToken);
-        } else {
-            market.quoteTokenAddress = '';
-            logUtils.warn(`${market.quoteToken}: No address found.`);
-        }
-    });
-    return markets;
-}
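
Editor's note: `addTokenAddresses` mutates the markets it receives even though it also returns them. If a non-mutating version were preferred, a sketch (the function name is hypothetical; field names come from the code above):

```ts
// Copy-based variant of addTokenAddresses: returns new market objects instead
// of writing into the input array. Missing symbols fall back to '' as above.
function withTokenAddresses(
    markets: ParadexMarket[],
    symbolAddressMapping: Map<string, string>,
): ParadexMarket[] {
    return markets.map(market => ({
        ...market,
        baseTokenAddress: symbolAddressMapping.get(market.baseToken) || '',
        quoteTokenAddress: symbolAddressMapping.get(market.quoteToken) || '',
    }));
}
```

-/**
- * Retrieve orderbook from Paradex API for a given market. Parse orders and insert
- * them into our database.
- * @param paradexSource Data source which can query the Paradex API.
- * @param market Object from the Paradex API with information about the market in question.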
- */ -async function getAndSaveMarketOrderbookAsync(paradexSource: ParadexSource, market: ParadexMarket): Promise<void> { - const paradexOrderbookResponse = await paradexSource.getMarketOrderbookAsync(market.symbol); - const observedTimestamp = Date.now(); - - logUtils.log(`${market.symbol}: Parsing orders.`); - const orders = parseParadexOrders(paradexOrderbookResponse, market, observedTimestamp, PARADEX_SOURCE); - - if (orders.length > 0) { - logUtils.log(`${market.symbol}: Saving ${orders.length} orders.`); - const tokenOrderRepository = connection.getRepository(TokenOrder); - await tokenOrderRepository.save(orders, { chunk: Math.ceil(orders.length / BATCH_SAVE_SIZE) }); - } else { - logUtils.log(`${market.symbol}: 0 orders to save.`); - } -} diff --git a/packages/pipeline/src/scripts/pull_radar_relay_orders.ts b/packages/pipeline/src/scripts/pull_radar_relay_orders.ts deleted file mode 100644 index 8e8720803..000000000 --- a/packages/pipeline/src/scripts/pull_radar_relay_orders.ts +++ /dev/null @@ -1,62 +0,0 @@ -import { HttpClient } from '@0x/connect'; -import { logUtils } from '@0x/utils'; - -import * as R from 'ramda'; -import 'reflect-metadata'; -import { Connection, ConnectionOptions, createConnection, EntityManager } from 'typeorm'; - -import { createObservedTimestampForOrder, SraOrder } from '../entities'; -import * as ormConfig from '../ormconfig'; -import { parseSraOrders } from '../parsers/sra_orders'; -import { handleError } from '../utils'; - -const RADAR_RELAY_URL = 'https://api.radarrelay.com/0x/v2'; -const ORDERS_PER_PAGE = 10000; // Number of orders to get per request. - -let connection: Connection; - -(async () => { - connection = await createConnection(ormConfig as ConnectionOptions); - await getOrderbookAsync(); - process.exit(0); -})().catch(handleError); - -async function getOrderbookAsync(): Promise<void> { - logUtils.log('Getting all orders...'); - const connectClient = new HttpClient(RADAR_RELAY_URL); - const rawOrders = await connectClient.getOrdersAsync({ - perPage: ORDERS_PER_PAGE, - }); - logUtils.log(`Got ${rawOrders.records.length} orders.`); - logUtils.log('Parsing orders...'); - // Parse the sra orders, then add source url to each. - const orders = R.pipe( - parseSraOrders, - R.map(setSourceUrl(RADAR_RELAY_URL)), - )(rawOrders); - // Save all the orders and update the observed time stamps in a single - // transaction. - logUtils.log('Saving orders and updating timestamps...'); - const observedTimestamp = Date.now(); - await connection.transaction( - async (manager: EntityManager): Promise<void> => { - for (const order of orders) { - await manager.save(SraOrder, order); - const orderObservation = createObservedTimestampForOrder(order, observedTimestamp); - await manager.save(orderObservation); - } - }, - ); -} - -const sourceUrlProp = R.lensProp('sourceUrl'); - -/** - * Sets the source url for a single order. Returns a new order instead of - * mutating the given one. 
- */
-const setSourceUrl = R.curry(
    (sourceURL: string, order: SraOrder): SraOrder => {
        return R.set(sourceUrlProp, sourceURL, order);
    },
);
diff --git a/packages/pipeline/src/scripts/pull_trusted_tokens.ts b/packages/pipeline/src/scripts/pull_trusted_tokens.ts
deleted file mode 100644
index 8afb3e052..000000000
--- a/packages/pipeline/src/scripts/pull_trusted_tokens.ts
+++ /dev/null
@@ -1,48 +0,0 @@
-import 'reflect-metadata';
-import { Connection, ConnectionOptions, createConnection } from 'typeorm';
-
-import { logUtils } from '@0x/utils';
-
-import { MetamaskTrustedTokenMeta, TrustedTokenSource, ZeroExTrustedTokenMeta } from '../data_sources/trusted_tokens';
-import { TokenMetadata } from '../entities';
-import * as ormConfig from '../ormconfig';
-import { parseMetamaskTrustedTokens, parseZeroExTrustedTokens } from '../parsers/token_metadata';
-import { handleError } from '../utils';
-
-const METAMASK_TRUSTED_TOKENS_URL =
-    'https://raw.githubusercontent.com/MetaMask/eth-contract-metadata/d45916c533116510cc8e9e048a8b5fc3732a6b6d/contract-map.json';
-
-const ZEROEX_TRUSTED_TOKENS_URL = 'https://website-api.0xproject.com/tokens';
-
-let connection: Connection;
-
-(async () => {
-    connection = await createConnection(ormConfig as ConnectionOptions);
-    await getMetamaskTrustedTokensAsync();
-    await getZeroExTrustedTokensAsync();
-    process.exit(0);
-})().catch(handleError);
-
-async function getMetamaskTrustedTokensAsync(): Promise<void> {
-    logUtils.log('Getting latest MetaMask trusted tokens list...');
-    const trustedTokensRepository = connection.getRepository(TokenMetadata);
-    const trustedTokensSource = new TrustedTokenSource<Map<string, MetamaskTrustedTokenMeta>>(
-        METAMASK_TRUSTED_TOKENS_URL,
-    );
-    const resp = await trustedTokensSource.getTrustedTokenMetaAsync();
-    const trustedTokens = parseMetamaskTrustedTokens(resp);
-    logUtils.log('Saving MetaMask trusted tokens list');
-    await trustedTokensRepository.save(trustedTokens);
-    logUtils.log('Done saving MetaMask trusted tokens.');
-}
-
-async function getZeroExTrustedTokensAsync(): Promise<void> {
-    logUtils.log('Getting latest 0x trusted tokens list...');
-    const trustedTokensRepository = connection.getRepository(TokenMetadata);
-    const trustedTokensSource = new TrustedTokenSource<ZeroExTrustedTokenMeta[]>(ZEROEX_TRUSTED_TOKENS_URL);
-    const resp = await trustedTokensSource.getTrustedTokenMetaAsync();
-    const trustedTokens = parseZeroExTrustedTokens(resp);
-    logUtils.log('Saving 0x trusted tokens list');
-    await trustedTokensRepository.save(trustedTokens);
-    logUtils.log('Done saving 0x trusted tokens.');
-}
diff --git a/packages/pipeline/src/scripts/update_relayer_info.ts b/packages/pipeline/src/scripts/update_relayer_info.ts
deleted file mode 100644
index 910a0157c..000000000
--- a/packages/pipeline/src/scripts/update_relayer_info.ts
+++ /dev/null
@@ -1,34 +0,0 @@
-import 'reflect-metadata';
-import { Connection, ConnectionOptions, createConnection } from 'typeorm';
-
-import { logUtils } from '@0x/utils';
-
-import { RelayerRegistrySource } from '../data_sources/relayer-registry';
-import { Relayer } from '../entities';
-import * as ormConfig from '../ormconfig';
-import { parseRelayers } from '../parsers/relayer_registry';
-import { handleError } from '../utils';
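
Editor's note: the two trusted-token functions in `pull_trusted_tokens.ts` above differ only in URL, response type, and parser, so they could share one generic helper. A sketch; this helper and its `label` parameter are hypothetical, while the other names come from the code above:

```ts
// Generic fetch -> parse -> save step shared by the MetaMask and 0x lists.
async function getTrustedTokensAsync<T>(
    label: string,
    url: string,
    parse: (resp: T) => TokenMetadata[],
): Promise<void> {
    logUtils.log(`Getting latest ${label} trusted tokens list...`);
    const repository = connection.getRepository(TokenMetadata);
    const source = new TrustedTokenSource<T>(url);
    const resp = await source.getTrustedTokenMetaAsync();
    const trustedTokens = parse(resp);
    logUtils.log(`Saving ${label} trusted tokens list`);
    await repository.save(trustedTokens);
    logUtils.log(`Done saving ${label} trusted tokens.`);
}
```

-// NOTE(albrow): We need to manually update this URL for now. Fix this when we
-// have the relayer-registry behind semantic versioning.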
-const RELAYER_REGISTRY_URL = - 'https://raw.githubusercontent.com/0xProject/0x-relayer-registry/4701c85677d161ea729a466aebbc1826c6aa2c0b/relayers.json'; - -let connection: Connection; - -(async () => { - connection = await createConnection(ormConfig as ConnectionOptions); - await getRelayersAsync(); - process.exit(0); -})().catch(handleError); - -async function getRelayersAsync(): Promise<void> { - logUtils.log('Getting latest relayer info...'); - const relayerRepository = connection.getRepository(Relayer); - const relayerSource = new RelayerRegistrySource(RELAYER_REGISTRY_URL); - const relayersResp = await relayerSource.getRelayerInfoAsync(); - const relayers = parseRelayers(relayersResp); - logUtils.log('Saving relayer info...'); - await relayerRepository.save(relayers); - logUtils.log('Done saving relayer info.'); -} diff --git a/packages/pipeline/src/types.ts b/packages/pipeline/src/types.ts deleted file mode 100644 index 5f2121807..000000000 --- a/packages/pipeline/src/types.ts +++ /dev/null @@ -1,9 +0,0 @@ -export enum AssetType { - ERC20 = 'erc20', - ERC721 = 'erc721', - MultiAsset = 'multiAsset', -} -export enum OrderType { - Bid = 'bid', - Ask = 'ask', -} diff --git a/packages/pipeline/src/utils/constants.ts b/packages/pipeline/src/utils/constants.ts deleted file mode 100644 index 56f3e82d8..000000000 --- a/packages/pipeline/src/utils/constants.ts +++ /dev/null @@ -1,3 +0,0 @@ -// Block number when the Exchange contract was deployed to mainnet. -export const EXCHANGE_START_BLOCK = 6271590; -export const INFURA_ROOT_URL = 'https://mainnet.infura.io'; diff --git a/packages/pipeline/src/utils/get_ohlcv_trading_pairs.ts b/packages/pipeline/src/utils/get_ohlcv_trading_pairs.ts deleted file mode 100644 index 19f81344e..000000000 --- a/packages/pipeline/src/utils/get_ohlcv_trading_pairs.ts +++ /dev/null @@ -1,116 +0,0 @@ -import { fetchAsync } from '@0x/utils'; -import * as R from 'ramda'; -import { Connection } from 'typeorm'; - -export interface TradingPair { - fromSymbol: string; - toSymbol: string; - latestSavedTime: number; -} - -const COINLIST_API = 'https://min-api.cryptocompare.com/data/all/coinlist?BuiltOn=7605'; - -interface CryptoCompareCoinListResp { - Data: Map<string, CryptoCompareCoin>; -} - -interface CryptoCompareCoin { - Symbol: string; - BuiltOn: string; - SmartContractAddress: string; -} - -const TO_CURRENCIES = ['USD', 'EUR', 'ETH', 'USDT']; -const ETHEREUM_IDENTIFIER = '7605'; -const HTTP_OK_STATUS = 200; - -interface StaticPair { - fromSymbol: string; - toSymbol: string; -} -const SPECIAL_CASES: StaticPair[] = [ - { - fromSymbol: 'ETH', - toSymbol: 'USD', - }, -]; - -/** - * Get trading pairs with latest scraped time for OHLCV records - * @param conn a typeorm Connection to postgres - */ -export async function fetchOHLCVTradingPairsAsync( - conn: Connection, - source: string, - earliestBackfillTime: number, -): Promise<TradingPair[]> { - // fetch existing ohlcv records - const latestTradingPairs: Array<{ - from_symbol: string; - to_symbol: string; - latest: string; - }> = await conn.query(`SELECT - MAX(end_time) as latest, - from_symbol, - to_symbol - FROM raw.ohlcv_external - GROUP BY from_symbol, to_symbol;`); - - // build addressable index: { fromsym: { tosym: time }} - const latestTradingPairsIndex: { [fromSym: string]: { [toSym: string]: number } } = {}; - latestTradingPairs.forEach(pair => { - const latestIndex: { [toSym: string]: number } = latestTradingPairsIndex[pair.from_symbol] || {}; - latestIndex[pair.to_symbol] = parseInt(pair.latest, 10); // 
tslint:disable-line:custom-no-magic-numbers - latestTradingPairsIndex[pair.from_symbol] = latestIndex; - }); - - // match time to special cases - const specialCases: TradingPair[] = SPECIAL_CASES.map(pair => { - const latestSavedTime = - R.path<number>([pair.fromSymbol, pair.toSymbol], latestTradingPairsIndex) || earliestBackfillTime; - return R.assoc('latestSavedTime', latestSavedTime, pair); - }); - - // get token symbols used by Crypto Compare - const allCoinsResp = await fetchAsync(COINLIST_API); - if (allCoinsResp.status !== HTTP_OK_STATUS) { - return []; - } - const allCoins: CryptoCompareCoinListResp = await allCoinsResp.json(); - const erc20CoinsIndex: Map<string, string> = new Map(); - Object.entries(allCoins.Data).forEach(pair => { - const [symbol, coinData] = pair; - if (coinData.BuiltOn === ETHEREUM_IDENTIFIER && coinData.SmartContractAddress !== 'N/A') { - erc20CoinsIndex.set(coinData.SmartContractAddress.toLowerCase(), symbol); - } - }); - - // fetch all tokens that are traded on 0x - const rawEventTokenAddresses: Array<{ tokenaddress: string }> = await conn.query( - `SELECT DISTINCT(maker_token_address) as tokenaddress FROM raw.exchange_fill_events UNION - SELECT DISTINCT(taker_token_address) as tokenaddress FROM raw.exchange_fill_events`, - ); - - // tslint:disable-next-line:no-unbound-method - const eventTokenAddresses = R.pluck('tokenaddress', rawEventTokenAddresses).map(R.toLower); - - // join token addresses with CC symbols - const eventTokenSymbols: string[] = eventTokenAddresses - .filter(tokenAddress => erc20CoinsIndex.has(tokenAddress)) - .map(tokenAddress => erc20CoinsIndex.get(tokenAddress) as string); - - // join traded tokens with fiat and latest backfill time - const eventTradingPairs: TradingPair[] = R.chain(sym => { - return TO_CURRENCIES.map(fiat => { - const pair = { - fromSymbol: sym, - toSymbol: fiat, - latestSavedTime: R.path<number>([sym, fiat], latestTradingPairsIndex) || earliestBackfillTime, - }; - return pair; - }); - }, eventTokenSymbols); - - // join with special cases - return R.concat(eventTradingPairs, specialCases); -} diff --git a/packages/pipeline/src/utils/index.ts b/packages/pipeline/src/utils/index.ts deleted file mode 100644 index 094c0178e..000000000 --- a/packages/pipeline/src/utils/index.ts +++ /dev/null @@ -1,53 +0,0 @@ -import { BigNumber } from '@0x/utils'; -export * from './transformers'; -export * from './constants'; - -/** - * If the given BigNumber is not null, returns the string representation of that - * number. Otherwise, returns null. - * @param n The number to convert. - */ -export function bigNumbertoStringOrNull(n: BigNumber): string | null { - if (n == null) { - return null; - } - return n.toString(); -} - -/** - * If value is null or undefined, returns null. Otherwise converts value to a - * BigNumber. - * @param value A string or number to be converted to a BigNumber - */ -export function toBigNumberOrNull(value: string | number | null): BigNumber | null { - switch (value) { - case null: - case undefined: - return null; - default: - return new BigNumber(value); - } -} - -/** - * Logs an error by intelligently checking for `message` and `stack` properties. - * Intended for use with top-level immediately invoked asynchronous functions. - * @param e the error to log. 
- */ -export function handleError(e: any): void { - if (e.message != null) { - // tslint:disable-next-line:no-console - console.error(e.message); - } else { - // tslint:disable-next-line:no-console - console.error('Unknown error'); - } - if (e.stack != null) { - // tslint:disable-next-line:no-console - console.error(e.stack); - } else { - // tslint:disable-next-line:no-console - console.error('(No stack trace)'); - } - process.exit(1); -} diff --git a/packages/pipeline/src/utils/transformers/asset_proxy_id_types.ts b/packages/pipeline/src/utils/transformers/asset_proxy_id_types.ts deleted file mode 100644 index 2cd05a616..000000000 --- a/packages/pipeline/src/utils/transformers/asset_proxy_id_types.ts +++ /dev/null @@ -1,20 +0,0 @@ -import { AssetProxyId } from '@0x/types'; - -import { AssetType } from '../../types'; - -/** - * Converts an assetProxyId to its string equivalent - * @param assetProxyId Id of AssetProxy - */ -export function convertAssetProxyIdToType(assetProxyId: AssetProxyId): AssetType { - switch (assetProxyId) { - case AssetProxyId.ERC20: - return AssetType.ERC20; - case AssetProxyId.ERC721: - return AssetType.ERC721; - case AssetProxyId.MultiAsset: - return AssetType.MultiAsset; - default: - throw new Error(`${assetProxyId} not a supported assetProxyId`); - } -} diff --git a/packages/pipeline/src/utils/transformers/big_number.ts b/packages/pipeline/src/utils/transformers/big_number.ts deleted file mode 100644 index 5f2e4d565..000000000 --- a/packages/pipeline/src/utils/transformers/big_number.ts +++ /dev/null @@ -1,16 +0,0 @@ -import { BigNumber } from '@0x/utils'; -import { ValueTransformer } from 'typeorm/decorator/options/ValueTransformer'; - -export class BigNumberTransformer implements ValueTransformer { - // tslint:disable-next-line:prefer-function-over-method - public to(value: BigNumber | null): string | null { - return value === null ? null : value.toString(); - } - - // tslint:disable-next-line:prefer-function-over-method - public from(value: string | null): BigNumber | null { - return value === null ? null : new BigNumber(value); - } -} - -export const bigNumberTransformer = new BigNumberTransformer(); diff --git a/packages/pipeline/src/utils/transformers/index.ts b/packages/pipeline/src/utils/transformers/index.ts deleted file mode 100644 index 31a4c9223..000000000 --- a/packages/pipeline/src/utils/transformers/index.ts +++ /dev/null @@ -1,3 +0,0 @@ -export * from './big_number'; -export * from './number_to_bigint'; -export * from './asset_proxy_id_types'; diff --git a/packages/pipeline/src/utils/transformers/number_to_bigint.ts b/packages/pipeline/src/utils/transformers/number_to_bigint.ts deleted file mode 100644 index 8fbd52093..000000000 --- a/packages/pipeline/src/utils/transformers/number_to_bigint.ts +++ /dev/null @@ -1,31 +0,0 @@ -import { BigNumber } from '@0x/utils'; -import { ValueTransformer } from 'typeorm/decorator/options/ValueTransformer'; - -const decimalRadix = 10; - -// Can be used to convert a JavaScript number type to a Postgres bigint type and -// vice versa. By default TypeORM will silently convert number types to string -// if the corresponding Postgres type is bigint. See -// https://github.com/typeorm/typeorm/issues/2400 for more information. 
-export class NumberToBigIntTransformer implements ValueTransformer { - // tslint:disable-next-line:prefer-function-over-method - public to(value: number): string | null { - if (value === null || value === undefined) { - return null; - } else { - return value.toString(); - } - } - - // tslint:disable-next-line:prefer-function-over-method - public from(value: string): number { - if (new BigNumber(value).isGreaterThan(Number.MAX_SAFE_INTEGER)) { - throw new Error( - `Attempted to convert PostgreSQL bigint value (${value}) to JavaScript number type but it is too big to safely convert`, - ); - } - return Number.parseInt(value, decimalRadix); - } -} - -export const numberToBigIntTransformer = new NumberToBigIntTransformer(); diff --git a/packages/pipeline/test/data_sources/contract-wrappers/utils_test.ts b/packages/pipeline/test/data_sources/contract-wrappers/utils_test.ts deleted file mode 100644 index 06f1a5e86..000000000 --- a/packages/pipeline/test/data_sources/contract-wrappers/utils_test.ts +++ /dev/null @@ -1,109 +0,0 @@ -// tslint:disable:custom-no-magic-numbers -import * as chai from 'chai'; -import { LogWithDecodedArgs } from 'ethereum-types'; -import 'mocha'; - -import { _getEventsWithRetriesAsync } from '../../../src/data_sources/contract-wrappers/utils'; -import { chaiSetup } from '../../utils/chai_setup'; - -chaiSetup.configure(); -const expect = chai.expect; - -const retryableMessage = 'network timeout: (simulated network timeout error)'; -const retryableError = new Error(retryableMessage); - -describe('data_sources/contract-wrappers/utils', () => { - describe('_getEventsWithRetriesAsync', () => { - it('sends a single request if it was successful', async () => { - // Pre-declare values for the fromBlock and toBlock arguments. - const expectedFromBlock = 100; - const expectedToBlock = 200; - const expectedLogs: Array<LogWithDecodedArgs<any>> = [ - { - logIndex: 123, - transactionIndex: 456, - transactionHash: '0x6dd106d002873746072fc5e496dd0fb2541b68c77bcf9184ae19a42fd33657fe', - blockHash: '0x6dd106d002873746072fc5e496dd0fb2541b68c77bcf9184ae19a42fd33657ff', - blockNumber: 789, - address: '0x6dd106d002873746072fc5e496dd0fb2541b68c77bcf9184ae19a42fd3365800', - data: 'fake raw data', - topics: [], - event: 'TEST_EVENT', - args: [1, 2, 3], - }, - ]; - - // mockGetEventsAsync checks its arguments, increments `callCount` - // and returns `expectedLogs`. - let callCount = 0; - const mockGetEventsAsync = async ( - fromBlock: number, - toBlock: number, - ): Promise<Array<LogWithDecodedArgs<any>>> => { - expect(fromBlock).equals(expectedFromBlock); - expect(toBlock).equals(expectedToBlock); - callCount += 1; - return expectedLogs; - }; - - // Make sure that we get what we expected and that the mock function - // was called exactly once. - const gotLogs = await _getEventsWithRetriesAsync(mockGetEventsAsync, 3, expectedFromBlock, expectedToBlock); - expect(gotLogs).deep.equals(expectedLogs); - expect(callCount).equals( - 1, - 'getEventsAsync function was called more than once even though it was successful', - ); - }); - it('retries and eventually succeeds', async () => { - const numRetries = 5; - let callCount = 0; - // mockGetEventsAsync throws unless callCount == numRetries + 1. 
-            const mockGetEventsAsync = async (
-                _fromBlock: number,
-                _toBlock: number,
-            ): Promise<Array<LogWithDecodedArgs<any>>> => {
-                callCount += 1;
-                if (callCount === numRetries + 1) {
-                    return [];
-                }
-                throw retryableError;
-            };
-            await _getEventsWithRetriesAsync(mockGetEventsAsync, numRetries, 100, 300);
-            expect(callCount).equals(numRetries + 1, 'getEventsAsync function was called the wrong number of times');
-        });
-        it('throws for non-retryable errors', async () => {
-            const numRetries = 5;
-            const expectedMessage = 'Non-retryable error';
-            // mockGetEventsAsync always throws a non-retryable error.
-            const mockGetEventsAsync = async (
-                _fromBlock: number,
-                _toBlock: number,
-            ): Promise<Array<LogWithDecodedArgs<any>>> => {
-                throw new Error(expectedMessage);
-            };
-            // Note(albrow): This does actually return a promise (or at least a
-            // "promise-like object") and is a false positive in TSLint.
-            // tslint:disable-next-line:await-promise
-            await expect(_getEventsWithRetriesAsync(mockGetEventsAsync, numRetries, 100, 300)).to.be.rejectedWith(
-                expectedMessage,
-            );
-        });
-        it('throws after too many retries', async () => {
-            const numRetries = 5;
-            // mockGetEventsAsync always throws a retryable error.
-            const mockGetEventsAsync = async (
-                _fromBlock: number,
-                _toBlock: number,
-            ): Promise<Array<LogWithDecodedArgs<any>>> => {
-                throw retryableError;
-            };
-            // Note(albrow): This does actually return a promise (or at least a
-            // "promise-like object") and is a false positive in TSLint.
-            // tslint:disable-next-line:await-promise
-            await expect(_getEventsWithRetriesAsync(mockGetEventsAsync, numRetries, 100, 300)).to.be.rejectedWith(
-                retryableMessage,
-            );
-        });
-    });
-});
diff --git a/packages/pipeline/test/data_sources/ohlcv_external/crypto_compare_test.ts b/packages/pipeline/test/data_sources/ohlcv_external/crypto_compare_test.ts
deleted file mode 100644
index 2efe3f5ec..000000000
--- a/packages/pipeline/test/data_sources/ohlcv_external/crypto_compare_test.ts
+++ /dev/null
@@ -1,47 +0,0 @@
-import * as chai from 'chai';
-import 'mocha';
-import * as R from 'ramda';
-
-import { CryptoCompareOHLCVSource } from '../../../src/data_sources/ohlcv_external/crypto_compare';
-import { TradingPair } from '../../../src/utils/get_ohlcv_trading_pairs';
-import { chaiSetup } from '../../utils/chai_setup';
-
-chaiSetup.configure();
-const expect = chai.expect;
-
-// tslint:disable:custom-no-magic-numbers
-describe('ohlcv_external data source (Crypto Compare)', () => {
-    describe('generateBackfillIntervals', () => {
-        it('generates pairs with intervals to query', () => {
-            const source = new CryptoCompareOHLCVSource(20);
-            const pair: TradingPair = {
-                fromSymbol: 'ETH',
-                toSymbol: 'ZRX',
-                latestSavedTime: new Date().getTime() - source.interval * 2,
-            };
-
-            const expected = [
-                pair,
-                R.merge(pair, { latestSavedTime: pair.latestSavedTime + source.interval }),
-                R.merge(pair, { latestSavedTime: pair.latestSavedTime + source.interval * 2 }),
-            ];
-
-            const actual = source.generateBackfillIntervals(pair);
-            expect(actual).deep.equal(expected);
-        });
-
-        it('returns single pair if no backfill is needed', () => {
-            const source = new CryptoCompareOHLCVSource(20);
-            const pair: TradingPair = {
-                fromSymbol: 'ETH',
-                toSymbol: 'ZRX',
-                latestSavedTime: new Date().getTime() - source.interval + 5000,
-            };
-
-            const expected = [pair];
-
-            const actual = source.generateBackfillIntervals(pair);
-            expect(actual).deep.equal(expected);
-        });
-    });
-});
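
Editor's note: `_getEventsWithRetriesAsync` itself is not part of this diff, but the tests above pin down its contract: return on first success, retry errors that look like network timeouts at most `numRetries` extra times, and rethrow anything else immediately. A sketch consistent with those tests; the timeout-detection check is an assumption:

```ts
import { LogWithDecodedArgs } from 'ethereum-types';

// Reconstructed retry wrapper matching the tests above: up to numRetries
// additional attempts for retryable (network timeout) errors only.
export async function getEventsWithRetriesAsync(
    getEventsAsync: (fromBlock: number, toBlock: number) => Promise<Array<LogWithDecodedArgs<any>>>,
    numRetries: number,
    fromBlock: number,
    toBlock: number,
): Promise<Array<LogWithDecodedArgs<any>>> {
    let lastError = new Error('numRetries must be non-negative');
    for (let attempt = 0; attempt <= numRetries; attempt++) {
        try {
            return await getEventsAsync(fromBlock, toBlock);
        } catch (err) {
            if (!err.message.includes('network timeout')) {
                throw err; // Non-retryable errors propagate immediately.
            }
            lastError = err;
        }
    }
    throw lastError; // Out of retries; rethrow the last retryable error.
}
```

diff --git a/packages/pipeline/test/db_global_hooks.ts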
b/packages/pipeline/test/db_global_hooks.ts deleted file mode 100644 index dfee02c45..000000000 --- a/packages/pipeline/test/db_global_hooks.ts +++ /dev/null @@ -1,9 +0,0 @@ -import { setUpDbAsync, tearDownDbAsync } from './db_setup'; - -before('set up database', async () => { - await setUpDbAsync(); -}); - -after('tear down database', async () => { - await tearDownDbAsync(); -}); diff --git a/packages/pipeline/test/db_setup.ts b/packages/pipeline/test/db_setup.ts deleted file mode 100644 index bf31d15b6..000000000 --- a/packages/pipeline/test/db_setup.ts +++ /dev/null @@ -1,174 +0,0 @@ -import * as Docker from 'dockerode'; -import * as fs from 'fs'; -import * as R from 'ramda'; -import { Connection, ConnectionOptions, createConnection } from 'typeorm'; - -import * as ormConfig from '../src/ormconfig'; - -// The name of the image to pull and use for the container. This also affects -// which version of Postgres we use. -const DOCKER_IMAGE_NAME = 'postgres:11-alpine'; -// The name to use for the Docker container which will run Postgres. -const DOCKER_CONTAINER_NAME = '0x_pipeline_postgres_test'; -// The port which will be exposed on the Docker container. -const POSTGRES_HOST_PORT = '15432'; -// Number of milliseconds to wait for postgres to finish initializing after -// starting the docker container. -const POSTGRES_SETUP_DELAY_MS = 5000; - -/** - * Sets up the database for testing purposes. If the - * ZEROEX_DATA_PIPELINE_TEST_DB_URL env var is specified, it will create a - * connection using that url. Otherwise it will spin up a new Docker container - * with a Postgres database and then create a connection to that database. - */ -export async function setUpDbAsync(): Promise<void> { - const connection = await createDbConnectionOnceAsync(); - await connection.runMigrations({ transaction: true }); -} - -/** - * Tears down the database used for testing. This completely destroys any data. - * If a docker container was created, it destroys that container too. - */ -export async function tearDownDbAsync(): Promise<void> { - const connection = await createDbConnectionOnceAsync(); - for (const _ of connection.migrations) { - await connection.undoLastMigration({ transaction: true }); - } - if (needsDocker()) { - const docker = initDockerOnce(); - const postgresContainer = docker.getContainer(DOCKER_CONTAINER_NAME); - await postgresContainer.kill(); - await postgresContainer.remove(); - } -} - -let savedConnection: Connection; - -/** - * The first time this is run, it creates and returns a new TypeORM connection. - * Each subsequent time, it returns the existing connection. This is helpful - * because only one TypeORM connection can be active at a time. 
- */ -export async function createDbConnectionOnceAsync(): Promise<Connection> { - if (savedConnection !== undefined) { - return savedConnection; - } - - if (needsDocker()) { - await initContainerAsync(); - } - const testDbUrl = - process.env.ZEROEX_DATA_PIPELINE_TEST_DB_URL || - `postgresql://postgres@localhost:${POSTGRES_HOST_PORT}/postgres`; - const testOrmConfig = R.merge(ormConfig, { url: testDbUrl }) as ConnectionOptions; - - savedConnection = await createConnection(testOrmConfig); - return savedConnection; -} - -async function sleepAsync(ms: number): Promise<{}> { - return new Promise<{}>(resolve => setTimeout(resolve, ms)); -} - -let savedDocker: Docker; - -function initDockerOnce(): Docker { - if (savedDocker !== undefined) { - return savedDocker; - } - - // Note(albrow): Code for determining the right socket path is partially - // based on https://github.com/apocas/dockerode/blob/8f3aa85311fab64d58eca08fef49aa1da5b5f60b/test/spec_helper.js - const isWin = require('os').type() === 'Windows_NT'; - const socketPath = process.env.DOCKER_SOCKET || (isWin ? '//./pipe/docker_engine' : '/var/run/docker.sock'); - const isSocket = fs.existsSync(socketPath) ? fs.statSync(socketPath).isSocket() : false; - if (!isSocket) { - throw new Error(`Failed to connect to Docker using socket path: "${socketPath}". - -The database integration tests need to be able to connect to a Postgres database. Make sure that Docker is running and accessible at the expected socket path. If Docker isn't working you have two options: - - 1) Set the DOCKER_SOCKET environment variable to a socket path that can be used to connect to Docker or - 2) Set the ZEROEX_DATA_PIPELINE_TEST_DB_URL environment variable to connect directly to an existing Postgres database instead of trying to start Postgres via Docker -`); - } - savedDocker = new Docker({ - socketPath, - }); - return savedDocker; -} - -// Creates the container, waits for it to initialize, and returns it. -async function initContainerAsync(): Promise<Docker.Container> { - const docker = initDockerOnce(); - - // Tear down any existing containers with the same name. - await tearDownExistingContainerIfAnyAsync(); - - // Pull the image we need. - await pullImageAsync(docker, DOCKER_IMAGE_NAME); - - // Create the container. - const postgresContainer = await docker.createContainer({ - name: DOCKER_CONTAINER_NAME, - Image: DOCKER_IMAGE_NAME, - ExposedPorts: { - '5432': {}, - }, - HostConfig: { - PortBindings: { - '5432': [ - { - HostPort: POSTGRES_HOST_PORT, - }, - ], - }, - }, - }); - await postgresContainer.start(); - await sleepAsync(POSTGRES_SETUP_DELAY_MS); - return postgresContainer; -} - -async function tearDownExistingContainerIfAnyAsync(): Promise<void> { - const docker = initDockerOnce(); - - // Check if a container with the desired name already exists. If so, this - // probably means we didn't clean up properly on the last test run. - const existingContainer = docker.getContainer(DOCKER_CONTAINER_NAME); - if (existingContainer != null) { - try { - await existingContainer.kill(); - } catch { - // If this fails, it's fine. The container was probably already - // killed. - } - try { - await existingContainer.remove(); - } catch { - // If this fails, it's fine. The container was probably already - // removed. 
- } - } -} - -function needsDocker(): boolean { - return process.env.ZEROEX_DATA_PIPELINE_TEST_DB_URL === undefined; -} - -// Note(albrow): This is partially based on -// https://stackoverflow.com/questions/38258263/how-do-i-wait-for-a-pull -async function pullImageAsync(docker: Docker, imageName: string): Promise<void> { - return new Promise<void>((resolve, reject) => { - docker.pull(imageName, {}, (err, stream) => { - if (err != null) { - reject(err); - return; - } - docker.modem.followProgress(stream, () => { - resolve(); - }); - }); - }); -} diff --git a/packages/pipeline/test/entities/block_test.ts b/packages/pipeline/test/entities/block_test.ts deleted file mode 100644 index 503f284f0..000000000 --- a/packages/pipeline/test/entities/block_test.ts +++ /dev/null @@ -1,23 +0,0 @@ -import 'mocha'; -import 'reflect-metadata'; - -import { Block } from '../../src/entities'; -import { createDbConnectionOnceAsync } from '../db_setup'; -import { chaiSetup } from '../utils/chai_setup'; - -import { testSaveAndFindEntityAsync } from './util'; - -chaiSetup.configure(); - -// tslint:disable:custom-no-magic-numbers -describe('Block entity', () => { - it('save/find', async () => { - const connection = await createDbConnectionOnceAsync(); - const block = new Block(); - block.hash = '0x12345'; - block.number = 1234567; - block.timestamp = 5432154321; - const blocksRepository = connection.getRepository(Block); - await testSaveAndFindEntityAsync(blocksRepository, block); - }); -}); diff --git a/packages/pipeline/test/entities/copper_test.ts b/packages/pipeline/test/entities/copper_test.ts deleted file mode 100644 index 2543364e6..000000000 --- a/packages/pipeline/test/entities/copper_test.ts +++ /dev/null @@ -1,54 +0,0 @@ -import 'mocha'; -import 'reflect-metadata'; - -import { - CopperActivity, - CopperActivityType, - CopperCustomField, - CopperLead, - CopperOpportunity, -} from '../../src/entities'; -import { createDbConnectionOnceAsync } from '../db_setup'; -import { - ParsedActivities, - ParsedActivityTypes, - ParsedCustomFields, - ParsedLeads, - ParsedOpportunities, -} from '../fixtures/copper/parsed_entities'; -import { chaiSetup } from '../utils/chai_setup'; - -import { testSaveAndFindEntityAsync } from './util'; - -chaiSetup.configure(); - -describe('Copper entities', () => { - describe('save and find', async () => { - it('Copper lead', async () => { - const connection = await createDbConnectionOnceAsync(); - const repository = connection.getRepository(CopperLead); - ParsedLeads.forEach(async entity => testSaveAndFindEntityAsync(repository, entity)); - }); - it('Copper activity', async () => { - const connection = await createDbConnectionOnceAsync(); - const repository = connection.getRepository(CopperActivity); - ParsedActivities.forEach(async entity => testSaveAndFindEntityAsync(repository, entity)); - }); - // searching on jsonb fields is broken in typeorm - it.skip('Copper opportunity', async () => { - const connection = await createDbConnectionOnceAsync(); - const repository = connection.getRepository(CopperOpportunity); - ParsedOpportunities.forEach(async entity => testSaveAndFindEntityAsync(repository, entity)); - }); - it('Copper activity type', async () => { - const connection = await createDbConnectionOnceAsync(); - const repository = connection.getRepository(CopperActivityType); - ParsedActivityTypes.forEach(async entity => testSaveAndFindEntityAsync(repository, entity)); - }); - it('Copper custom field', async () => { - const connection = await createDbConnectionOnceAsync(); - const 
repository = connection.getRepository(CopperCustomField);
-            ParsedCustomFields.forEach(async entity => testSaveAndFindEntityAsync(repository, entity));
-        });
-    });
-});
diff --git a/packages/pipeline/test/entities/dex_trades_test.ts b/packages/pipeline/test/entities/dex_trades_test.ts
deleted file mode 100644
index 7c4829988..000000000
--- a/packages/pipeline/test/entities/dex_trades_test.ts
+++ /dev/null
@@ -1,61 +0,0 @@
-import { BigNumber } from '@0x/utils';
-import 'mocha';
-import * as R from 'ramda';
-import 'reflect-metadata';
-
-import { DexTrade } from '../../src/entities';
-import { createDbConnectionOnceAsync } from '../db_setup';
-import { chaiSetup } from '../utils/chai_setup';
-
-import { testSaveAndFindEntityAsync } from './util';
-
-chaiSetup.configure();
-
-const baseTrade = {
-    sourceUrl: 'https://bloxy.info/api/dex/trades',
-    txTimestamp: 1543447585938,
-    txDate: '2018-11-21',
-    txSender: '0x00923b9a074762b93650716333b3e1473a15048e',
-    smartContractId: 7091917,
-    smartContractAddress: '0x818e6fecd516ecc3849daf6845e3ec868087b755',
-    contractType: 'DEX/Kyber Network Proxy',
-    maker: '0xbf2179859fc6d5bee9bf9158632dc51678a4100c',
-    taker: '0xbf2179859fc6d5bee9bf9158632dc51678a4100d',
-    amountBuy: new BigNumber('1.011943163078103'),
-    makerFeeAmount: new BigNumber(0),
-    buyCurrencyId: 1,
-    buySymbol: 'ETH',
-    amountSell: new BigNumber('941.4997928436911'),
-    takerFeeAmount: new BigNumber(0),
-    sellCurrencyId: 16610,
-    sellSymbol: 'ELF',
-    makerAnnotation: '',
-    takerAnnotation: '',
-    protocol: 'Kyber Network Proxy',
-    sellAddress: '0xbf2179859fc6d5bee9bf9158632dc51678a4100e',
-    tradeIndex: '3',
-};
-
-const tradeWithNonNullAddresses: DexTrade = R.merge(baseTrade, {
-    txHash: '0xb93a7faf92efbbb5405c9a73cd4efd99702fe27c03ff22baee1f1b1e37b3a0bf',
-    buyAddress: '0xbf2179859fc6d5bee9bf9158632dc51678a4100e',
-    sellAddress: '0xbf2179859fc6d5bee9bf9158632dc51678a4100f',
-});
-
-const tradeWithNullAddresses: DexTrade = R.merge(baseTrade, {
-    txHash: '0xb93a7faf92efbbb5405c9a73cd4efd99702fe27c03ff22baee1f1b1e37b3a0be',
-    buyAddress: null,
-    sellAddress: null,
-});
-
-// tslint:disable:custom-no-magic-numbers
-describe('DexTrade entity', () => {
-    it('save/find', async () => {
-        const connection = await createDbConnectionOnceAsync();
-        const trades = [tradeWithNullAddresses, tradeWithNonNullAddresses];
-        const tradesRepository = connection.getRepository(DexTrade);
-        for (const trade of trades) {
-            await testSaveAndFindEntityAsync(tradesRepository, trade);
-        }
-    });
-});
diff --git a/packages/pipeline/test/entities/erc20_approval_events_test.ts b/packages/pipeline/test/entities/erc20_approval_events_test.ts
deleted file mode 100644
index 1ecf41ee5..000000000
--- a/packages/pipeline/test/entities/erc20_approval_events_test.ts
+++ /dev/null
@@ -1,29 +0,0 @@
-import { BigNumber } from '@0x/utils';
-import 'mocha';
-import 'reflect-metadata';
-
-import { ERC20ApprovalEvent } from '../../src/entities';
-import { createDbConnectionOnceAsync } from '../db_setup';
-import { chaiSetup } from '../utils/chai_setup';
-
-import { testSaveAndFindEntityAsync } from './util';
-
-chaiSetup.configure();
-
-// tslint:disable:custom-no-magic-numbers
-describe('ERC20ApprovalEvent entity', () => {
-    it('save/find', async () => {
-        const connection = await createDbConnectionOnceAsync();
-        const event = new ERC20ApprovalEvent();
-        event.tokenAddress = '0xc02aaa39b223fe8d0a0e5c4f27ead9083c756cc2';
-        event.blockNumber = 6281577;
-        event.rawData =
'0x000000000000000000000000000000000000000000000002b9cba5ee21ad3df9'; - event.logIndex = 43; - event.transactionHash = '0xcb46b19c786376a0a0140d51e3e606a4c4f926d8ca5434e96d2f69d04d8d9c7f'; - event.ownerAddress = '0x0b65c5f6f3a05d6be5588a72b603360773b3fe04'; - event.spenderAddress = '0x448a5065aebb8e423f0896e6c5d525c040f59af3'; - event.amount = new BigNumber('50281464906893835769'); - const blocksRepository = connection.getRepository(ERC20ApprovalEvent); - await testSaveAndFindEntityAsync(blocksRepository, event); - }); -}); diff --git a/packages/pipeline/test/entities/exchange_cancel_event_test.ts b/packages/pipeline/test/entities/exchange_cancel_event_test.ts deleted file mode 100644 index f3b306d69..000000000 --- a/packages/pipeline/test/entities/exchange_cancel_event_test.ts +++ /dev/null @@ -1,57 +0,0 @@ -import 'mocha'; -import * as R from 'ramda'; -import 'reflect-metadata'; - -import { ExchangeCancelEvent } from '../../src/entities'; -import { AssetType } from '../../src/types'; -import { createDbConnectionOnceAsync } from '../db_setup'; -import { chaiSetup } from '../utils/chai_setup'; - -import { testSaveAndFindEntityAsync } from './util'; - -chaiSetup.configure(); - -const baseCancelEvent = { - contractAddress: '0x4f833a24e1f95d70f028921e27040ca56e09ab0b', - logIndex: 1234, - blockNumber: 6276262, - rawData: '0x000000000000000000000000f6da68519f78b0d0bc93c701e86affcb75c92428', - transactionHash: '0x6dd106d002873746072fc5e496dd0fb2541b68c77bcf9184ae19a42fd33657fe', - makerAddress: '0xf6da68519f78b0d0bc93c701e86affcb75c92428', - takerAddress: '0xf6da68519f78b0d0bc93c701e86affcb75c92428', - feeRecipientAddress: '0xc370d2a5920344aa6b7d8d11250e3e861434cbdd', - senderAddress: '0xf6da68519f78b0d0bc93c701e86affcb75c92428', - orderHash: '0xab12ed2cbaa5615ab690b9da75a46e53ddfcf3f1a68655b5fe0d94c75a1aac4a', - rawMakerAssetData: '0xf47261b0000000000000000000000000c02aaa39b223fe8d0a0e5c4f27ead9083c756cc2', - makerAssetProxyId: '0xf47261b0', - makerTokenAddress: '0xc02aaa39b223fe8d0a0e5c4f27ead9083c756cc2', - rawTakerAssetData: '0xf47261b0000000000000000000000000e41d2489571d322189246dafa5ebde1f4699f498', - takerAssetProxyId: '0xf47261b0', - takerTokenAddress: '0xe41d2489571d322189246dafa5ebde1f4699f498', -}; - -const erc20CancelEvent = R.merge(baseCancelEvent, { - makerAssetType: 'erc20' as AssetType, - makerTokenId: null, - takerAssetType: 'erc20' as AssetType, - takerTokenId: null, -}); - -const erc721CancelEvent = R.merge(baseCancelEvent, { - makerAssetType: 'erc721' as AssetType, - makerTokenId: '19378573', - takerAssetType: 'erc721' as AssetType, - takerTokenId: '63885673888', -}); - -// tslint:disable:custom-no-magic-numbers -describe('ExchangeCancelEvent entity', () => { - it('save/find', async () => { - const connection = await createDbConnectionOnceAsync(); - const events = [erc20CancelEvent, erc721CancelEvent]; - const cancelEventRepository = connection.getRepository(ExchangeCancelEvent); - for (const event of events) { - await testSaveAndFindEntityAsync(cancelEventRepository, event); - } - }); -}); diff --git a/packages/pipeline/test/entities/exchange_cancel_up_to_event_test.ts b/packages/pipeline/test/entities/exchange_cancel_up_to_event_test.ts deleted file mode 100644 index aa34f8c1c..000000000 --- a/packages/pipeline/test/entities/exchange_cancel_up_to_event_test.ts +++ /dev/null @@ -1,29 +0,0 @@ -import { BigNumber } from '@0x/utils'; -import 'mocha'; -import 'reflect-metadata'; - -import { ExchangeCancelUpToEvent } from '../../src/entities'; -import { 
createDbConnectionOnceAsync } from '../db_setup'; -import { chaiSetup } from '../utils/chai_setup'; - -import { testSaveAndFindEntityAsync } from './util'; - -chaiSetup.configure(); - -// tslint:disable:custom-no-magic-numbers -describe('ExchangeCancelUpToEvent entity', () => { - it('save/find', async () => { - const connection = await createDbConnectionOnceAsync(); - const cancelUpToEventRepository = connection.getRepository(ExchangeCancelUpToEvent); - const cancelUpToEvent = new ExchangeCancelUpToEvent(); - cancelUpToEvent.blockNumber = 6276262; - cancelUpToEvent.contractAddress = '0x4f833a24e1f95d70f028921e27040ca56e09ab0b'; - cancelUpToEvent.logIndex = 42; - cancelUpToEvent.makerAddress = '0xf6da68519f78b0d0bc93c701e86affcb75c92428'; - cancelUpToEvent.orderEpoch = new BigNumber('123456789123456789'); - cancelUpToEvent.rawData = '0x000000000000000000000000f6da68519f78b0d0bc93c701e86affcb75c92428'; - cancelUpToEvent.senderAddress = '0xf6da68519f78b0d0bc93c701e86affcb75c92428'; - cancelUpToEvent.transactionHash = '0x6dd106d002873746072fc5e496dd0fb2541b68c77bcf9184ae19a42fd33657fe'; - await testSaveAndFindEntityAsync(cancelUpToEventRepository, cancelUpToEvent); - }); -}); diff --git a/packages/pipeline/test/entities/exchange_fill_event_test.ts b/packages/pipeline/test/entities/exchange_fill_event_test.ts deleted file mode 100644 index b2cb8c5e0..000000000 --- a/packages/pipeline/test/entities/exchange_fill_event_test.ts +++ /dev/null @@ -1,62 +0,0 @@ -import { BigNumber } from '@0x/utils'; -import 'mocha'; -import * as R from 'ramda'; -import 'reflect-metadata'; - -import { ExchangeFillEvent } from '../../src/entities'; -import { AssetType } from '../../src/types'; -import { createDbConnectionOnceAsync } from '../db_setup'; -import { chaiSetup } from '../utils/chai_setup'; - -import { testSaveAndFindEntityAsync } from './util'; - -chaiSetup.configure(); - -const baseFillEvent = { - contractAddress: '0x4f833a24e1f95d70f028921e27040ca56e09ab0b', - blockNumber: 6276262, - logIndex: 102, - rawData: '0x000000000000000000000000f6da68519f78b0d0bc93c701e86affcb75c92428', - transactionHash: '0x6dd106d002873746072fc5e496dd0fb2541b68c77bcf9184ae19a42fd33657fe', - makerAddress: '0xf6da68519f78b0d0bc93c701e86affcb75c92428', - takerAddress: '0xf6da68519f78b0d0bc93c701e86affcb75c92428', - feeRecipientAddress: '0xc370d2a5920344aa6b7d8d11250e3e861434cbdd', - senderAddress: '0xf6da68519f78b0d0bc93c701e86affcb75c92428', - makerAssetFilledAmount: new BigNumber('10000000000000000'), - takerAssetFilledAmount: new BigNumber('100000000000000000'), - makerFeePaid: new BigNumber('0'), - takerFeePaid: new BigNumber('12345'), - orderHash: '0xab12ed2cbaa5615ab690b9da75a46e53ddfcf3f1a68655b5fe0d94c75a1aac4a', - rawMakerAssetData: '0xf47261b0000000000000000000000000c02aaa39b223fe8d0a0e5c4f27ead9083c756cc2', - makerAssetProxyId: '0xf47261b0', - makerTokenAddress: '0xc02aaa39b223fe8d0a0e5c4f27ead9083c756cc2', - rawTakerAssetData: '0xf47261b0000000000000000000000000e41d2489571d322189246dafa5ebde1f4699f498', - takerAssetProxyId: '0xf47261b0', - takerTokenAddress: '0xe41d2489571d322189246dafa5ebde1f4699f498', -}; - -const erc20FillEvent = R.merge(baseFillEvent, { - makerAssetType: 'erc20' as AssetType, - makerTokenId: null, - takerAssetType: 'erc20' as AssetType, - takerTokenId: null, -}); - -const erc721FillEvent = R.merge(baseFillEvent, { - makerAssetType: 'erc721' as AssetType, - makerTokenId: '19378573', - takerAssetType: 'erc721' as AssetType, - takerTokenId: '63885673888', -}); - -// 
tslint:disable:custom-no-magic-numbers -describe('ExchangeFillEvent entity', () => { - it('save/find', async () => { - const connection = await createDbConnectionOnceAsync(); - const events = [erc20FillEvent, erc721FillEvent]; - const fillEventsRepository = connection.getRepository(ExchangeFillEvent); - for (const event of events) { - await testSaveAndFindEntityAsync(fillEventsRepository, event); - } - }); -}); diff --git a/packages/pipeline/test/entities/ohlcv_external_test.ts b/packages/pipeline/test/entities/ohlcv_external_test.ts deleted file mode 100644 index 8b995db50..000000000 --- a/packages/pipeline/test/entities/ohlcv_external_test.ts +++ /dev/null @@ -1,35 +0,0 @@ -import 'mocha'; -import 'reflect-metadata'; - -import { OHLCVExternal } from '../../src/entities'; -import { createDbConnectionOnceAsync } from '../db_setup'; -import { chaiSetup } from '../utils/chai_setup'; - -import { testSaveAndFindEntityAsync } from './util'; - -chaiSetup.configure(); - -const ohlcvExternal: OHLCVExternal = { - exchange: 'CCCAGG', - fromSymbol: 'ETH', - toSymbol: 'ZRX', - startTime: 1543352400000, - endTime: 1543356000000, - open: 307.41, - close: 310.08, - low: 304.6, - high: 310.27, - volumeFrom: 904.6, - volumeTo: 278238.5, - source: 'Crypto Compare', - observedTimestamp: 1543442338074, -}; - -// tslint:disable:custom-no-magic-numbers -describe('OHLCVExternal entity', () => { - it('save/find', async () => { - const connection = await createDbConnectionOnceAsync(); - const repository = connection.getRepository(OHLCVExternal); - await testSaveAndFindEntityAsync(repository, ohlcvExternal); - }); -}); diff --git a/packages/pipeline/test/entities/relayer_test.ts b/packages/pipeline/test/entities/relayer_test.ts deleted file mode 100644 index 760ffb6f9..000000000 --- a/packages/pipeline/test/entities/relayer_test.ts +++ /dev/null @@ -1,55 +0,0 @@ -import 'mocha'; -import * as R from 'ramda'; -import 'reflect-metadata'; - -import { Relayer } from '../../src/entities'; -import { createDbConnectionOnceAsync } from '../db_setup'; -import { chaiSetup } from '../utils/chai_setup'; - -import { testSaveAndFindEntityAsync } from './util'; - -chaiSetup.configure(); - -const baseRelayer = { - uuid: 'e8d27d8d-ddf6-48b1-9663-60b0a3ddc716', - name: 'Radar Relay', - homepageUrl: 'https://radarrelay.com', - appUrl: null, - sraHttpEndpoint: null, - sraWsEndpoint: null, - feeRecipientAddresses: [], - takerAddresses: [], -}; - -const relayerWithUrls = R.merge(baseRelayer, { - uuid: 'e8d27d8d-ddf6-48b1-9663-60b0a3ddc717', - appUrl: 'https://app.radarrelay.com', - sraHttpEndpoint: 'https://api.radarrelay.com/0x/v2/', - sraWsEndpoint: 'wss://ws.radarrelay.com/0x/v2', -}); - -const relayerWithAddresses = R.merge(baseRelayer, { - uuid: 'e8d27d8d-ddf6-48b1-9663-60b0a3ddc718', - feeRecipientAddresses: [ - '0xa258b39954cef5cb142fd567a46cddb31a670124', - '0xa258b39954cef5cb142fd567a46cddb31a670125', - '0xa258b39954cef5cb142fd567a46cddb31a670126', - ], - takerAddresses: [ - '0xa258b39954cef5cb142fd567a46cddb31a670127', - '0xa258b39954cef5cb142fd567a46cddb31a670128', - '0xa258b39954cef5cb142fd567a46cddb31a670129', - ], -}); - -// tslint:disable:custom-no-magic-numbers -describe('Relayer entity', () => { - it('save/find', async () => { - const connection = await createDbConnectionOnceAsync(); - const relayers = [baseRelayer, relayerWithUrls, relayerWithAddresses]; - const relayerRepository = connection.getRepository(Relayer); - for (const relayer of relayers) { - await testSaveAndFindEntityAsync(relayerRepository, relayer); - } - 
}); -}); diff --git a/packages/pipeline/test/entities/sra_order_test.ts b/packages/pipeline/test/entities/sra_order_test.ts deleted file mode 100644 index c43de8ce8..000000000 --- a/packages/pipeline/test/entities/sra_order_test.ts +++ /dev/null @@ -1,84 +0,0 @@ -import { BigNumber } from '@0x/utils'; -import 'mocha'; -import * as R from 'ramda'; -import 'reflect-metadata'; -import { Repository } from 'typeorm'; - -import { SraOrder, SraOrdersObservedTimeStamp } from '../../src/entities'; -import { AssetType } from '../../src/types'; -import { createDbConnectionOnceAsync } from '../db_setup'; -import { chaiSetup } from '../utils/chai_setup'; - -import { testSaveAndFindEntityAsync } from './util'; - -chaiSetup.configure(); - -const baseOrder = { - sourceUrl: 'https://api.radarrelay.com/0x/v2', - exchangeAddress: '0x4f833a24e1f95d70f028921e27040ca56e09ab0b', - makerAddress: '0xb45df06e38540a675fdb5b598abf2c0dbe9d6b81', - takerAddress: '0x0000000000000000000000000000000000000000', - feeRecipientAddress: '0xa258b39954cef5cb142fd567a46cddb31a670124', - senderAddress: '0x0000000000000000000000000000000000000000', - makerAssetAmount: new BigNumber('1619310371000000000'), - takerAssetAmount: new BigNumber('8178335207070707070707'), - makerFee: new BigNumber('100'), - takerFee: new BigNumber('200'), - expirationTimeSeconds: new BigNumber('1538529488'), - salt: new BigNumber('1537924688891'), - signature: '0x1b5a5d672b0d647b5797387ccbb89d8', - rawMakerAssetData: '0xf47261b0000000000000000000000000c02aaa39b223fe8d0a0e5c4f27ead9083c756cc2', - makerAssetProxyId: '0xf47261b0', - makerTokenAddress: '0xc02aaa39b223fe8d0a0e5c4f27ead9083c756cc2', - rawTakerAssetData: '0xf47261b000000000000000000000000042d6622dece394b54999fbd73d108123806f6a18', - takerAssetProxyId: '0xf47261b0', - takerTokenAddress: '0x42d6622dece394b54999fbd73d108123806f6a18', - metadataJson: '{"isThisArbitraryData":true,"powerLevel":9001}', -}; - -const erc20Order = R.merge(baseOrder, { - orderHashHex: '0x1bdbeb0d088a33da28b9ee6d94e8771452f90f4a69107da2fa75195d61b9a1c9', - makerAssetType: 'erc20' as AssetType, - makerTokenId: null, - takerAssetType: 'erc20' as AssetType, - takerTokenId: null, -}); - -const erc721Order = R.merge(baseOrder, { - orderHashHex: '0x1bdbeb0d088a33da28b9ee6d94e8771452f90f4a69107da2fa75195d61b9a1d0', - makerAssetType: 'erc721' as AssetType, - makerTokenId: '19378573', - takerAssetType: 'erc721' as AssetType, - takerTokenId: '63885673888', -}); - -// tslint:disable:custom-no-magic-numbers -describe('SraOrder and SraOrdersObservedTimeStamp entities', () => { - // Note(albrow): SraOrder and SraOrdersObservedTimeStamp are tightly coupled - // and timestamps have a foreign key constraint such that they have to point - // to an existing SraOrder. For these reasons, we are testing them together - // in the same test. 
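-    // (The helper testOrderWithTimestampAsync below does exactly this: it saves
-    // the order first, then a timestamp that references the same
-    // exchangeAddress, orderHashHex, and sourceUrl.)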
- it('save/find', async () => { - const connection = await createDbConnectionOnceAsync(); - const orderRepository = connection.getRepository(SraOrder); - const timestampRepository = connection.getRepository(SraOrdersObservedTimeStamp); - const orders = [erc20Order, erc721Order]; - for (const order of orders) { - await testOrderWithTimestampAsync(orderRepository, timestampRepository, order); - } - }); -}); - -async function testOrderWithTimestampAsync( - orderRepository: Repository<SraOrder>, - timestampRepository: Repository<SraOrdersObservedTimeStamp>, - order: SraOrder, -): Promise<void> { - await testSaveAndFindEntityAsync(orderRepository, order); - const timestamp = new SraOrdersObservedTimeStamp(); - timestamp.exchangeAddress = order.exchangeAddress; - timestamp.orderHashHex = order.orderHashHex; - timestamp.sourceUrl = order.sourceUrl; - timestamp.observedTimestamp = 1543377376153; - await testSaveAndFindEntityAsync(timestampRepository, timestamp); -} diff --git a/packages/pipeline/test/entities/token_metadata_test.ts b/packages/pipeline/test/entities/token_metadata_test.ts deleted file mode 100644 index 48e656644..000000000 --- a/packages/pipeline/test/entities/token_metadata_test.ts +++ /dev/null @@ -1,39 +0,0 @@ -import { BigNumber } from '@0x/utils'; -import 'mocha'; -import 'reflect-metadata'; - -import { TokenMetadata } from '../../src/entities'; -import { createDbConnectionOnceAsync } from '../db_setup'; -import { chaiSetup } from '../utils/chai_setup'; - -import { testSaveAndFindEntityAsync } from './util'; - -chaiSetup.configure(); - -const metadataWithoutNullFields: TokenMetadata = { - address: '0xe41d2489571d322189246dafa5ebde1f4699f498', - authority: 'https://website-api.0xproject.com/tokens', - decimals: new BigNumber(18), - symbol: 'ZRX', - name: '0x', -}; - -const metadataWithNullFields: TokenMetadata = { - address: '0xe41d2489571d322189246dafa5ebde1f4699f499', - authority: 'https://website-api.0xproject.com/tokens', - decimals: null, - symbol: null, - name: null, -}; - -// tslint:disable:custom-no-magic-numbers -describe('TokenMetadata entity', () => { - it('save/find', async () => { - const connection = await createDbConnectionOnceAsync(); - const tokenMetadata = [metadataWithoutNullFields, metadataWithNullFields]; - const tokenMetadataRepository = connection.getRepository(TokenMetadata); - for (const tokenMetadatum of tokenMetadata) { - await testSaveAndFindEntityAsync(tokenMetadataRepository, tokenMetadatum); - } - }); -}); diff --git a/packages/pipeline/test/entities/token_order_test.ts b/packages/pipeline/test/entities/token_order_test.ts deleted file mode 100644 index c6057f5aa..000000000 --- a/packages/pipeline/test/entities/token_order_test.ts +++ /dev/null @@ -1,31 +0,0 @@ -import { BigNumber } from '@0x/utils'; -import 'mocha'; - -import { TokenOrderbookSnapshot } from '../../src/entities'; -import { createDbConnectionOnceAsync } from '../db_setup'; -import { chaiSetup } from '../utils/chai_setup'; - -import { testSaveAndFindEntityAsync } from './util'; - -chaiSetup.configure(); - -const tokenOrderbookSnapshot: TokenOrderbookSnapshot = { - source: 'ddextest', - observedTimestamp: Date.now(), - orderType: 'bid', - price: new BigNumber(10.1), - baseAssetSymbol: 'ETH', - baseAssetAddress: '0x818e6fecd516ecc3849daf6845e3ec868087b755', - baseVolume: new BigNumber(143), - quoteAssetSymbol: 'ABC', - quoteAssetAddress: '0x00923b9a074762b93650716333b3e1473a15048e', - quoteVolume: new BigNumber(12.3234234), -}; - -describe('TokenOrderbookSnapshot entity', () => { - 
it('save/find', async () => { - const connection = await createDbConnectionOnceAsync(); - const tokenOrderbookSnapshotRepository = connection.getRepository(TokenOrderbookSnapshot); - await testSaveAndFindEntityAsync(tokenOrderbookSnapshotRepository, tokenOrderbookSnapshot); - }); -}); diff --git a/packages/pipeline/test/entities/transaction_test.ts b/packages/pipeline/test/entities/transaction_test.ts deleted file mode 100644 index 634844544..000000000 --- a/packages/pipeline/test/entities/transaction_test.ts +++ /dev/null @@ -1,26 +0,0 @@ -import { BigNumber } from '@0x/utils'; -import 'mocha'; -import 'reflect-metadata'; - -import { Transaction } from '../../src/entities'; -import { createDbConnectionOnceAsync } from '../db_setup'; -import { chaiSetup } from '../utils/chai_setup'; - -import { testSaveAndFindEntityAsync } from './util'; - -chaiSetup.configure(); - -// tslint:disable:custom-no-magic-numbers -describe('Transaction entity', () => { - it('save/find', async () => { - const connection = await createDbConnectionOnceAsync(); - const transactionRepository = connection.getRepository(Transaction); - const transaction = new Transaction(); - transaction.blockHash = '0x6ff106d00b6c3746072fc06bae140fb2549036ba7bcf9184ae19a42fd33657fd'; - transaction.blockNumber = 6276262; - transaction.gasPrice = new BigNumber(3000000); - transaction.gasUsed = new BigNumber(125000); - transaction.transactionHash = '0x6dd106d002873746072fc5e496dd0fb2541b68c77bcf9184ae19a42fd33657fe'; - await testSaveAndFindEntityAsync(transactionRepository, transaction); - }); -}); diff --git a/packages/pipeline/test/entities/util.ts b/packages/pipeline/test/entities/util.ts deleted file mode 100644 index 42df23a4a..000000000 --- a/packages/pipeline/test/entities/util.ts +++ /dev/null @@ -1,25 +0,0 @@ -import * as chai from 'chai'; -import 'mocha'; - -import { Repository } from 'typeorm'; - -const expect = chai.expect; - -/** - * First saves the given entity to the database, then finds it and makes sure - * that the found entity is exactly equal to the original one. This is a bare - * minimum basic test to make sure that the entity type definition and our - * database schema are aligned and that it is possible to save and find the - * entity. - * @param repository A TypeORM repository corresponding with the type of the entity. - * @param entity An instance of a TypeORM entity which will be saved/retrieved from the database. - */ -export async function testSaveAndFindEntityAsync<T>(repository: Repository<T>, entity: T): Promise<void> { - // Note(albrow): We are forced to use an 'any' hack here because - // TypeScript complains about stack depth when checking the types. 
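-    // (The `any` type argument has no runtime effect: this is an ordinary save,
-    // and the findOneOrFail below matches on every column of the entity.)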
- await repository.save<any>(entity); - const gotEntity = await repository.findOneOrFail({ - where: entity, - }); - expect(gotEntity).deep.equal(entity); -} diff --git a/packages/pipeline/test/fixtures/copper/api_v1_activity_types.json b/packages/pipeline/test/fixtures/copper/api_v1_activity_types.json deleted file mode 100644 index dbd39c31b..000000000 --- a/packages/pipeline/test/fixtures/copper/api_v1_activity_types.json +++ /dev/null @@ -1,24 +0,0 @@ -{ - "user": [ - { "id": 0, "category": "user", "name": "Note", "is_disabled": false, "count_as_interaction": false }, - { "id": 660496, "category": "user", "name": "To Do", "is_disabled": false, "count_as_interaction": false }, - { "id": 660495, "category": "user", "name": "Meeting", "is_disabled": false, "count_as_interaction": true }, - { "id": 660494, "category": "user", "name": "Phone Call", "is_disabled": false, "count_as_interaction": true } - ], - "system": [ - { - "id": 1, - "category": "system", - "name": "Property Changed", - "is_disabled": false, - "count_as_interaction": false - }, - { - "id": 3, - "category": "system", - "name": "Pipeline Stage Changed", - "is_disabled": false, - "count_as_interaction": false - } - ] -} diff --git a/packages/pipeline/test/fixtures/copper/api_v1_activity_types.ts b/packages/pipeline/test/fixtures/copper/api_v1_activity_types.ts deleted file mode 100644 index fd2d62a6c..000000000 --- a/packages/pipeline/test/fixtures/copper/api_v1_activity_types.ts +++ /dev/null @@ -1,16 +0,0 @@ -import { CopperActivityType } from '../../../src/entities'; -const ParsedActivityTypes: CopperActivityType[] = [ - { id: 0, name: 'Note', category: 'user', isDisabled: false, countAsInteraction: false }, - { id: 660496, name: 'To Do', category: 'user', isDisabled: false, countAsInteraction: false }, - { id: 660495, name: 'Meeting', category: 'user', isDisabled: false, countAsInteraction: true }, - { id: 660494, name: 'Phone Call', category: 'user', isDisabled: false, countAsInteraction: true }, - { id: 1, name: 'Property Changed', category: 'system', isDisabled: false, countAsInteraction: false }, - { - id: 3, - name: 'Pipeline Stage Changed', - category: 'system', - isDisabled: false, - countAsInteraction: false, - }, -]; -export { ParsedActivityTypes }; diff --git a/packages/pipeline/test/fixtures/copper/api_v1_custom_field_definitions.json b/packages/pipeline/test/fixtures/copper/api_v1_custom_field_definitions.json deleted file mode 100644 index c6665cb0f..000000000 --- a/packages/pipeline/test/fixtures/copper/api_v1_custom_field_definitions.json +++ /dev/null @@ -1,38 +0,0 @@ -[ - { - "id": 261066, - "name": "Integration Type", - "canonical_name": null, - "data_type": "MultiSelect", - "available_on": ["opportunity", "company", "person"], - "options": [ - { "id": 394020, "name": "Strategic Relationship", "rank": 7 }, - { "id": 394013, "name": "ERC-20 Exchange", "rank": 0 }, - { "id": 394014, "name": "ERC-721 Marketplace", "rank": 1 }, - { "id": 394015, "name": "Trade Widget", "rank": 2 }, - { "id": 394016, "name": "Prediction Market Exchange", "rank": 3 }, - { "id": 394017, "name": "Security Token Exchange", "rank": 4 }, - { "id": 394018, "name": "Complementary Company", "rank": 5 }, - { "id": 394019, "name": "Service Provider", "rank": 6 } - ] - }, - { - "id": 261067, - "name": "Company Type", - "canonical_name": null, - "data_type": "Dropdown", - "available_on": ["company", "opportunity", "person"], - "options": [ - { "id": 394129, "name": "Market Maker", "rank": 6 }, - { "id": 394130, "name": "Events", "rank": 2 
}, - { "id": 394023, "name": "Exchange", "rank": 3 }, - { "id": 394024, "name": "Investor", "rank": 5 }, - { "id": 394026, "name": "Service Provider", "rank": 8 }, - { "id": 394027, "name": "Wallet", "rank": 9 }, - { "id": 394134, "name": "Game", "rank": 4 }, - { "id": 394025, "name": "OTC", "rank": 7 }, - { "id": 394021, "name": "Blockchain/Protocol", "rank": 0 }, - { "id": 394022, "name": "dApp", "rank": 1 } - ] - } -] diff --git a/packages/pipeline/test/fixtures/copper/api_v1_custom_field_definitions.ts b/packages/pipeline/test/fixtures/copper/api_v1_custom_field_definitions.ts deleted file mode 100644 index a44bbd2c3..000000000 --- a/packages/pipeline/test/fixtures/copper/api_v1_custom_field_definitions.ts +++ /dev/null @@ -1,39 +0,0 @@ -import { CopperCustomField } from '../../../src/entities'; -const ParsedCustomFields: CopperCustomField[] = [ - { - id: 394020, - name: 'Strategic Relationship', - dataType: 'Integration Type', - fieldType: 'option', - }, - { id: 394013, name: 'ERC-20 Exchange', dataType: 'Integration Type', fieldType: 'option' }, - { id: 394014, name: 'ERC-721 Marketplace', dataType: 'Integration Type', fieldType: 'option' }, - { id: 394015, name: 'Trade Widget', dataType: 'Integration Type', fieldType: 'option' }, - { - id: 394016, - name: 'Prediction Market Exchange', - dataType: 'Integration Type', - fieldType: 'option', - }, - { - id: 394017, - name: 'Security Token Exchange', - dataType: 'Integration Type', - fieldType: 'option', - }, - { id: 394018, name: 'Complementary Company', dataType: 'Integration Type', fieldType: 'option' }, - { id: 394019, name: 'Service Provider', dataType: 'Integration Type', fieldType: 'option' }, - { id: 261066, name: 'Integration Type', dataType: 'MultiSelect' }, - { id: 394129, name: 'Market Maker', dataType: 'Company Type', fieldType: 'option' }, - { id: 394130, name: 'Events', dataType: 'Company Type', fieldType: 'option' }, - { id: 394023, name: 'Exchange', dataType: 'Company Type', fieldType: 'option' }, - { id: 394024, name: 'Investor', dataType: 'Company Type', fieldType: 'option' }, - { id: 394026, name: 'Service Provider', dataType: 'Company Type', fieldType: 'option' }, - { id: 394027, name: 'Wallet', dataType: 'Company Type', fieldType: 'option' }, - { id: 394134, name: 'Game', dataType: 'Company Type', fieldType: 'option' }, - { id: 394025, name: 'OTC', dataType: 'Company Type', fieldType: 'option' }, - { id: 394021, name: 'Blockchain/Protocol', dataType: 'Company Type', fieldType: 'option' }, - { id: 394022, name: 'dApp', dataType: 'Company Type', fieldType: 'option' }, - { id: 261067, name: 'Company Type', dataType: 'Dropdown' }, -]; -export { ParsedCustomFields }; diff --git a/packages/pipeline/test/fixtures/copper/api_v1_list_activities.json b/packages/pipeline/test/fixtures/copper/api_v1_list_activities.json deleted file mode 100644 index a726111ac..000000000 --- a/packages/pipeline/test/fixtures/copper/api_v1_list_activities.json +++ /dev/null @@ -1,242 +0,0 @@ -[ - { - "id": 5015299552, - "parent": { "id": 14667512, "type": "opportunity" }, - "type": { "id": 3, "category": "system", "name": "Stage Change" }, - "user_id": 680302, - "details": "blah blah", - "activity_date": 1545329595, - "old_value": { "id": 2392929, "name": "Evaluation" }, - "new_value": { "id": 2392931, "name": "Integration Started" }, - "date_created": 1545329595, - "date_modified": 1545329595 - }, - { - "id": 5010214065, - "parent": { "id": 14978865, "type": "opportunity" }, - "type": { "id": 3, "category": "system", "name": "Stage Change" }, - 
"user_id": 680302, - "details": "blah blah", - "activity_date": 1545245706, - "old_value": { "id": 2392928, "name": "Intro" }, - "new_value": { "id": 2392929, "name": "Evaluation" }, - "date_created": 1545245706, - "date_modified": 1545245706 - }, - { - "id": 5006149111, - "parent": { "id": 70430977, "type": "person" }, - "type": { "id": 660495, "category": "user" }, - "user_id": 680302, - "details": "blah blah", - "activity_date": 1545166908, - "old_value": null, - "new_value": null, - "date_created": 1545168280, - "date_modified": 1545166908 - }, - { - "id": 5005314622, - "parent": { "id": 27778968, "type": "company" }, - "type": { "id": 660495, "category": "user" }, - "user_id": 680302, - "details": "blah blah", - "activity_date": 1545080504, - "old_value": null, - "new_value": null, - "date_created": 1545160479, - "date_modified": 1545080504 - }, - { - "id": 5000006802, - "parent": { "id": 14956518, "type": "opportunity" }, - "type": { "id": 660495, "category": "user" }, - "user_id": 680302, - "details": "blah blah", - "activity_date": 1545071374, - "old_value": null, - "new_value": null, - "date_created": 1545071500, - "date_modified": 1545071374 - }, - { - "id": 4985504199, - "parent": { "id": 14912790, "type": "opportunity" }, - "type": { "id": 660495, "category": "user" }, - "user_id": 680302, - "details": "blah blah", - "activity_date": 1544644058, - "old_value": null, - "new_value": null, - "date_created": 1544644661, - "date_modified": 1544644058 - }, - { - "id": 4985456147, - "parent": { "id": 14912790, "type": "opportunity" }, - "type": { "id": 660495, "category": "user" }, - "user_id": 680302, - "details": "blah blah", - "activity_date": 1544644048, - "old_value": null, - "new_value": null, - "date_created": 1544644053, - "date_modified": 1544644048 - }, - { - "id": 4980975996, - "parent": { "id": 14902828, "type": "opportunity" }, - "type": { "id": 660495, "category": "user" }, - "user_id": 680302, - "details": "blah blah", - "activity_date": 1544563171, - "old_value": null, - "new_value": null, - "date_created": 1544563224, - "date_modified": 1544563171 - }, - { - "id": 4980910331, - "parent": { "id": 14902828, "type": "opportunity" }, - "type": { "id": 3, "category": "system", "name": "Stage Change" }, - "user_id": 680302, - "details": "blah blah", - "activity_date": 1544562495, - "old_value": { "id": 2392928, "name": "Intro" }, - "new_value": { "id": 2392931, "name": "Integration Started" }, - "date_created": 1544562495, - "date_modified": 1544562495 - }, - { - "id": 4980872220, - "parent": { "id": 14888910, "type": "opportunity" }, - "type": { "id": 660495, "category": "user" }, - "user_id": 680302, - "details": "blah blah", - "activity_date": 1544559279, - "old_value": null, - "new_value": null, - "date_created": 1544562118, - "date_modified": 1544559279 - }, - { - "id": 4980508097, - "parent": { "id": 14050167, "type": "opportunity" }, - "type": { "id": 1, "category": "system", "name": "Status Change" }, - "user_id": 680302, - "details": "blah blah", - "activity_date": 1544558077, - "old_value": "Open", - "new_value": "Won", - "date_created": 1544558077, - "date_modified": 1544558077 - }, - { - "id": 4980508095, - "parent": { "id": 66538237, "type": "person" }, - "type": { "id": 1, "category": "system" }, - "user_id": 680302, - "details": "blah blah", - "activity_date": 1544558077, - "old_value": null, - "new_value": null, - "date_created": 1544558077, - "date_modified": 1544558077 - }, - { - "id": 4980508092, - "parent": { "id": 27779020, "type": "company" }, - 
"type": { "id": 1, "category": "system" }, - "user_id": 680302, - "details": "blah blah", - "activity_date": 1544558077, - "old_value": null, - "new_value": null, - "date_created": 1544558077, - "date_modified": 1544558077 - }, - { - "id": 4980507507, - "parent": { "id": 14050167, "type": "opportunity" }, - "type": { "id": 3, "category": "system", "name": "Stage Change" }, - "user_id": 680302, - "details": "blah blah", - "activity_date": 1544558071, - "old_value": { "id": 2392931, "name": "Integration Started" }, - "new_value": { "id": 2405442, "name": "Integration Complete" }, - "date_created": 1544558071, - "date_modified": 1544558071 - }, - { - "id": 4980479684, - "parent": { "id": 14901232, "type": "opportunity" }, - "type": { "id": 3, "category": "system", "name": "Stage Change" }, - "user_id": 680302, - "details": "blah blah", - "activity_date": 1544557777, - "old_value": { "id": 2392928, "name": "Intro" }, - "new_value": { "id": 2392929, "name": "Evaluation" }, - "date_created": 1544557777, - "date_modified": 1544557777 - }, - { - "id": 4980327164, - "parent": { "id": 14901232, "type": "opportunity" }, - "type": { "id": 660495, "category": "user" }, - "user_id": 680302, - "details": "blah blah", - "activity_date": 1544554864, - "old_value": null, - "new_value": null, - "date_created": 1544556132, - "date_modified": 1544554864 - }, - { - "id": 4975270470, - "parent": { "id": 14888744, "type": "opportunity" }, - "type": { "id": 3, "category": "system", "name": "Stage Change" }, - "user_id": 680302, - "details": "blah blah", - "activity_date": 1544469501, - "old_value": { "id": 2392928, "name": "Intro" }, - "new_value": { "id": 2392931, "name": "Integration Started" }, - "date_created": 1544469501, - "date_modified": 1544469501 - }, - { - "id": 4975255523, - "parent": { "id": 64713448, "type": "person" }, - "type": { "id": 1, "category": "system" }, - "user_id": 680302, - "details": "blah blah", - "activity_date": 1544469389, - "old_value": null, - "new_value": null, - "date_created": 1544469389, - "date_modified": 1544469389 - }, - { - "id": 4975255519, - "parent": { "id": 13735617, "type": "opportunity" }, - "type": { "id": 1, "category": "system", "name": "Status Change" }, - "user_id": 680302, - "details": "blah blah", - "activity_date": 1544469388, - "old_value": "Open", - "new_value": "Won", - "date_created": 1544469388, - "date_modified": 1544469388 - }, - { - "id": 4975255514, - "parent": { "id": 27778968, "type": "company" }, - "type": { "id": 1, "category": "system" }, - "user_id": 680302, - "details": "blah blah", - "activity_date": 1544469388, - "old_value": null, - "new_value": null, - "date_created": 1544469388, - "date_modified": 1544469388 - } -] diff --git a/packages/pipeline/test/fixtures/copper/api_v1_list_activities.ts b/packages/pipeline/test/fixtures/copper/api_v1_list_activities.ts deleted file mode 100644 index 51ee9ced3..000000000 --- a/packages/pipeline/test/fixtures/copper/api_v1_list_activities.ts +++ /dev/null @@ -1,305 +0,0 @@ -import { CopperActivity } from '../../../src/entities'; - -const ParsedActivities: CopperActivity[] = [ - { - id: 5015299552, - parentId: 14667512, - parentType: 'opportunity', - typeId: 3, - typeCategory: 'system', - typeName: 'Stage Change', - userId: 680302, - dateCreated: 1545329595000, - dateModified: 1545329595000, - oldValueId: 2392929, - oldValueName: 'Evaluation', - newValueId: 2392931, - newValueName: 'Integration Started', - }, - { - id: 5010214065, - parentId: 14978865, - parentType: 'opportunity', - typeId: 3, - 
typeCategory: 'system', - typeName: 'Stage Change', - userId: 680302, - dateCreated: 1545245706000, - dateModified: 1545245706000, - oldValueId: 2392928, - oldValueName: 'Intro', - newValueId: 2392929, - newValueName: 'Evaluation', - }, - { - id: 5006149111, - parentId: 70430977, - parentType: 'person', - typeId: 660495, - typeCategory: 'user', - typeName: undefined, - userId: 680302, - dateCreated: 1545168280000, - dateModified: 1545166908000, - oldValueId: undefined, - oldValueName: undefined, - newValueId: undefined, - newValueName: undefined, - }, - { - id: 5005314622, - parentId: 27778968, - parentType: 'company', - typeId: 660495, - typeCategory: 'user', - typeName: undefined, - userId: 680302, - dateCreated: 1545160479000, - dateModified: 1545080504000, - oldValueId: undefined, - oldValueName: undefined, - newValueId: undefined, - newValueName: undefined, - }, - { - id: 5000006802, - parentId: 14956518, - parentType: 'opportunity', - typeId: 660495, - typeCategory: 'user', - typeName: undefined, - userId: 680302, - dateCreated: 1545071500000, - dateModified: 1545071374000, - oldValueId: undefined, - oldValueName: undefined, - newValueId: undefined, - newValueName: undefined, - }, - { - id: 4985504199, - parentId: 14912790, - parentType: 'opportunity', - typeId: 660495, - typeCategory: 'user', - typeName: undefined, - userId: 680302, - dateCreated: 1544644661000, - dateModified: 1544644058000, - oldValueId: undefined, - oldValueName: undefined, - newValueId: undefined, - newValueName: undefined, - }, - { - id: 4985456147, - parentId: 14912790, - parentType: 'opportunity', - typeId: 660495, - typeCategory: 'user', - typeName: undefined, - userId: 680302, - dateCreated: 1544644053000, - dateModified: 1544644048000, - oldValueId: undefined, - oldValueName: undefined, - newValueId: undefined, - newValueName: undefined, - }, - { - id: 4980975996, - parentId: 14902828, - parentType: 'opportunity', - typeId: 660495, - typeCategory: 'user', - typeName: undefined, - userId: 680302, - dateCreated: 1544563224000, - dateModified: 1544563171000, - oldValueId: undefined, - oldValueName: undefined, - newValueId: undefined, - newValueName: undefined, - }, - { - id: 4980910331, - parentId: 14902828, - parentType: 'opportunity', - typeId: 3, - typeCategory: 'system', - typeName: 'Stage Change', - userId: 680302, - dateCreated: 1544562495000, - dateModified: 1544562495000, - oldValueId: 2392928, - oldValueName: 'Intro', - newValueId: 2392931, - newValueName: 'Integration Started', - }, - { - id: 4980872220, - parentId: 14888910, - parentType: 'opportunity', - typeId: 660495, - typeCategory: 'user', - typeName: undefined, - userId: 680302, - dateCreated: 1544562118000, - dateModified: 1544559279000, - oldValueId: undefined, - oldValueName: undefined, - newValueId: undefined, - newValueName: undefined, - }, - { - id: 4980508097, - parentId: 14050167, - parentType: 'opportunity', - typeId: 1, - typeCategory: 'system', - typeName: 'Status Change', - userId: 680302, - dateCreated: 1544558077000, - dateModified: 1544558077000, - oldValueId: undefined, - oldValueName: undefined, - newValueId: undefined, - newValueName: undefined, - }, - { - id: 4980508095, - parentId: 66538237, - parentType: 'person', - typeId: 1, - typeCategory: 'system', - typeName: undefined, - userId: 680302, - dateCreated: 1544558077000, - dateModified: 1544558077000, - oldValueId: undefined, - oldValueName: undefined, - newValueId: undefined, - newValueName: undefined, - }, - { - id: 4980508092, - parentId: 27779020, - parentType: 
'company', - typeId: 1, - typeCategory: 'system', - typeName: undefined, - userId: 680302, - dateCreated: 1544558077000, - dateModified: 1544558077000, - oldValueId: undefined, - oldValueName: undefined, - newValueId: undefined, - newValueName: undefined, - }, - { - id: 4980507507, - parentId: 14050167, - parentType: 'opportunity', - typeId: 3, - typeCategory: 'system', - typeName: 'Stage Change', - userId: 680302, - dateCreated: 1544558071000, - dateModified: 1544558071000, - oldValueId: 2392931, - oldValueName: 'Integration Started', - newValueId: 2405442, - newValueName: 'Integration Complete', - }, - { - id: 4980479684, - parentId: 14901232, - parentType: 'opportunity', - typeId: 3, - typeCategory: 'system', - typeName: 'Stage Change', - userId: 680302, - dateCreated: 1544557777000, - dateModified: 1544557777000, - oldValueId: 2392928, - oldValueName: 'Intro', - newValueId: 2392929, - newValueName: 'Evaluation', - }, - { - id: 4980327164, - parentId: 14901232, - parentType: 'opportunity', - typeId: 660495, - typeCategory: 'user', - typeName: undefined, - userId: 680302, - dateCreated: 1544556132000, - dateModified: 1544554864000, - oldValueId: undefined, - oldValueName: undefined, - newValueId: undefined, - newValueName: undefined, - }, - { - id: 4975270470, - parentId: 14888744, - parentType: 'opportunity', - typeId: 3, - typeCategory: 'system', - typeName: 'Stage Change', - userId: 680302, - dateCreated: 1544469501000, - dateModified: 1544469501000, - oldValueId: 2392928, - oldValueName: 'Intro', - newValueId: 2392931, - newValueName: 'Integration Started', - }, - { - id: 4975255523, - parentId: 64713448, - parentType: 'person', - typeId: 1, - typeCategory: 'system', - typeName: undefined, - userId: 680302, - dateCreated: 1544469389000, - dateModified: 1544469389000, - oldValueId: undefined, - oldValueName: undefined, - newValueId: undefined, - newValueName: undefined, - }, - { - id: 4975255519, - parentId: 13735617, - parentType: 'opportunity', - typeId: 1, - typeCategory: 'system', - typeName: 'Status Change', - userId: 680302, - dateCreated: 1544469388000, - dateModified: 1544469388000, - oldValueId: undefined, - oldValueName: undefined, - newValueId: undefined, - newValueName: undefined, - }, - { - id: 4975255514, - parentId: 27778968, - parentType: 'company', - typeId: 1, - typeCategory: 'system', - typeName: undefined, - userId: 680302, - dateCreated: 1544469388000, - dateModified: 1544469388000, - oldValueId: undefined, - oldValueName: undefined, - newValueId: undefined, - newValueName: undefined, - }, -]; -export { ParsedActivities }; diff --git a/packages/pipeline/test/fixtures/copper/api_v1_list_leads.json b/packages/pipeline/test/fixtures/copper/api_v1_list_leads.json deleted file mode 100644 index e7161085d..000000000 --- a/packages/pipeline/test/fixtures/copper/api_v1_list_leads.json +++ /dev/null @@ -1,577 +0,0 @@ -[ - { - "id": 9150547, - "name": "My Contact", - "prefix": null, - "first_name": "My", - "last_name": "Contact", - "middle_name": null, - "suffix": null, - "address": null, - "assignee_id": null, - "company_name": null, - "customer_source_id": null, - "details": null, - "email": { - "email": "mycontact@noemail.com", - "category": "work" - }, - "interaction_count": 0, - "monetary_value": null, - "socials": [], - "status": "New", - "status_id": 208231, - "tags": [], - "title": null, - "websites": [], - "phone_numbers": [], - "custom_fields": [ - { - "custom_field_definition_id": 100764, - "value": null - }, - { - "custom_field_definition_id": 103481, - "value": 
null - } - ], - "date_created": 1490045162, - "date_modified": 1490045162 - }, - { - "id": 9150552, - "name": "My Contact", - "prefix": null, - "first_name": "My", - "last_name": "Contact", - "middle_name": null, - "suffix": null, - "address": null, - "assignee_id": null, - "company_name": null, - "customer_source_id": null, - "details": null, - "email": null, - "interaction_count": 0, - "monetary_value": null, - "socials": [], - "status": "New", - "status_id": 208231, - "tags": [], - "title": null, - "websites": [], - "phone_numbers": [ - { - "number": "415-123-45678", - "category": "mobile" - } - ], - "custom_fields": [ - { - "custom_field_definition_id": 100764, - "value": null - }, - { - "custom_field_definition_id": 103481, - "value": null - } - ], - "date_created": 1490045237, - "date_modified": 1490045237 - }, - { - "id": 9150578, - "name": "My Contact", - "prefix": null, - "first_name": "My", - "last_name": "Contact", - "middle_name": null, - "suffix": null, - "address": null, - "assignee_id": null, - "company_name": null, - "customer_source_id": null, - "details": null, - "email": null, - "interaction_count": 0, - "monetary_value": null, - "socials": [], - "status": "New", - "status_id": 208231, - "tags": [], - "title": null, - "websites": [], - "phone_numbers": [ - { - "number": "415-123-45678", - "category": "mobile" - } - ], - "custom_fields": [ - { - "custom_field_definition_id": 100764, - "value": null - }, - { - "custom_field_definition_id": 103481, - "value": null - } - ], - "date_created": 1490045279, - "date_modified": 1490045279 - }, - { - "id": 8982554, - "name": "My Lead", - "prefix": null, - "first_name": "My", - "last_name": "Lead", - "middle_name": null, - "suffix": null, - "address": null, - "assignee_id": null, - "company_name": null, - "customer_source_id": null, - "details": null, - "email": { - "email": "mylead@noemail.com", - "category": "work" - }, - "interaction_count": 0, - "monetary_value": null, - "socials": [], - "status": "New", - "status_id": 208231, - "tags": [], - "title": null, - "websites": [], - "phone_numbers": [], - "custom_fields": [ - { - "custom_field_definition_id": 100764, - "value": null - }, - { - "custom_field_definition_id": 103481, - "value": null - } - ], - "date_created": 1489528899, - "date_modified": 1489528899 - }, - { - "id": 8982702, - "name": "My Lead", - "prefix": null, - "first_name": "My", - "last_name": "Lead", - "middle_name": null, - "suffix": null, - "address": null, - "assignee_id": null, - "company_name": null, - "customer_source_id": null, - "details": null, - "email": { - "email": "mylead@gmail.test", - "category": "work" - }, - "interaction_count": 0, - "monetary_value": null, - "socials": [], - "status": "New", - "status_id": 208231, - "tags": [], - "title": null, - "websites": [], - "phone_numbers": [], - "custom_fields": [ - { - "custom_field_definition_id": 100764, - "value": null - }, - { - "custom_field_definition_id": 103481, - "value": null - } - ], - "date_created": 1489531171, - "date_modified": 1489531171 - }, - { - "id": 9094361, - "name": "My Lead", - "prefix": null, - "first_name": "My", - "last_name": "Lead", - "middle_name": null, - "suffix": null, - "address": null, - "assignee_id": null, - "company_name": null, - "customer_source_id": null, - "details": null, - "email": { - "email": "mylead@noemail.com", - "category": "work" - }, - "interaction_count": 0, - "monetary_value": null, - "socials": [], - "status": "New", - "status_id": 208231, - "tags": [], - "title": null, - "websites": [], - 
"phone_numbers": [], - "custom_fields": [ - { - "custom_field_definition_id": 100764, - "value": null - }, - { - "custom_field_definition_id": 103481, - "value": null - } - ], - "date_created": 1489791225, - "date_modified": 1489791225 - }, - { - "id": 9094364, - "name": "My Lead", - "prefix": null, - "first_name": "My", - "last_name": "Lead", - "middle_name": null, - "suffix": null, - "address": null, - "assignee_id": null, - "company_name": null, - "customer_source_id": null, - "details": null, - "email": { - "email": "mylead@noemail.com", - "category": "work" - }, - "interaction_count": 0, - "monetary_value": null, - "socials": [], - "status": "New", - "status_id": 208231, - "tags": [], - "title": null, - "websites": [], - "phone_numbers": [], - "custom_fields": [ - { - "custom_field_definition_id": 100764, - "value": "123456789012345678901234567890" - }, - { - "custom_field_definition_id": 103481, - "value": "123456789012345678901234567890" - } - ], - "date_created": 1489791283, - "date_modified": 1489791283 - }, - { - "id": 9094371, - "name": "My Lead", - "prefix": null, - "first_name": "My", - "last_name": "Lead", - "middle_name": null, - "suffix": null, - "address": null, - "assignee_id": null, - "company_name": null, - "customer_source_id": null, - "details": null, - "email": { - "email": "mylead@noemail.com", - "category": "work" - }, - "interaction_count": 0, - "monetary_value": null, - "socials": [], - "status": "New", - "status_id": 208231, - "tags": [], - "title": null, - "websites": [], - "phone_numbers": [], - "custom_fields": [ - { - "custom_field_definition_id": 100764, - "value": "|--------1---------2---------3---------4---------5---------6---------7---------8---------9---------|--------1---------2---------3---------4---------5---------6---------7---------8---------9---------" - }, - { - "custom_field_definition_id": 103481, - "value": "123456789012345678901234567890" - } - ], - "date_created": 1489791417, - "date_modified": 1489791417 - }, - { - "id": 9094372, - "name": "My Lead", - "prefix": null, - "first_name": "My", - "last_name": "Lead", - "middle_name": null, - "suffix": null, - "address": null, - "assignee_id": null, - "company_name": null, - "customer_source_id": null, - "details": null, - "email": { - "email": "mylead@noemail.com", - "category": "work" - }, - "interaction_count": 0, - "monetary_value": null, - "socials": [], - "status": "New", - "status_id": 208231, - "tags": [], - "title": null, - "websites": [], - "phone_numbers": [], - "custom_fields": [ - { - "custom_field_definition_id": 100764, - "value": "|--------1---------2---------3---------4---------5---------6---------7---------8---------9---------|--------1---------2---------3---------4---------5---------6---------7---------8---------9---------|--------1---------2---------3---------4---------5-----" - }, - { - "custom_field_definition_id": 103481, - "value": "123456789012345678901234567890" - } - ], - "date_created": 1489791453, - "date_modified": 1489791453 - }, - { - "id": 9094373, - "name": "My Lead", - "prefix": null, - "first_name": "My", - "last_name": "Lead", - "middle_name": null, - "suffix": null, - "address": null, - "assignee_id": null, - "company_name": null, - "customer_source_id": null, - "details": null, - "email": { - "email": "mylead@noemail.com", - "category": "work" - }, - "interaction_count": 0, - "monetary_value": null, - "socials": [], - "status": "New", - "status_id": 208231, - "tags": [], - "title": null, - "websites": [], - "phone_numbers": [], - "custom_fields": [ - { - 
"custom_field_definition_id": 100764, - "value": "|--------1---------2---------3---------4---------5---------6---------7---------8---------9---------|--------1---------2---------3---------4---------5---------6---------7---------8---------9---------|--------1---------2---------3---------4---------5-----" - }, - { - "custom_field_definition_id": 103481, - "value": "|--------1---------2---------3---------4---------5---------6---------7---------8---------9---------|--------1---------2---------3---------4---------5---------6---------7---------8---------9---------|--------1---------2---------3---------4---------5---------6---------7---------8---------9---------" - } - ], - "date_created": 1489791470, - "date_modified": 1489791470 - }, - { - "id": 9094383, - "name": "My Lead", - "prefix": null, - "first_name": "My", - "last_name": "Lead", - "middle_name": null, - "suffix": null, - "address": null, - "assignee_id": null, - "company_name": null, - "customer_source_id": null, - "details": null, - "email": { - "email": "mylead@noemail.com", - "category": "work" - }, - "interaction_count": 0, - "monetary_value": null, - "socials": [], - "status": "New", - "status_id": 208231, - "tags": [], - "title": null, - "websites": [], - "phone_numbers": [], - "custom_fields": [ - { - "custom_field_definition_id": 100764, - "value": "|--------1---------2---------3---------4---------5---------6---------7---------8---------9---------|--------1---------2---------3---------4---------5---------6---------7---------8---------9---------|--------1---------2---------3---------4---------5-----" - }, - { - "custom_field_definition_id": 103481, - "value": "|--------1---------2---------3---------4---------5---------6---------7---------8---------9---------|--------1---------2---------3---------4---------5---------6---------7---------8---------9---------|--------1---------2---------3---------4---------5---------6---------7---------8---------9---------|--------1---------2---------3---------4---------5---------6---------7---------8---------9---------|--------1---------2---------3---------4---------5---------6---------7---------8---------9---------" - } - ], - "date_created": 1489791672, - "date_modified": 1489791672 - }, - { - "id": 9174441, - "name": "My Lead", - "prefix": null, - "first_name": "My", - "last_name": "Lead", - "middle_name": null, - "suffix": null, - "address": null, - "assignee_id": null, - "company_name": null, - "customer_source_id": null, - "details": null, - "email": { - "email": "mylead@noemail.com", - "category": "work" - }, - "interaction_count": 0, - "monetary_value": null, - "socials": [], - "status": "New", - "status_id": 208231, - "tags": [], - "title": null, - "websites": [], - "phone_numbers": [], - "custom_fields": [ - { - "custom_field_definition_id": 100764, - "value": "Text fields are 255 chars or less!" 
- }, - { - "custom_field_definition_id": 103481, - "value": "text \n text" - } - ], - "date_created": 1490112942, - "date_modified": 1490112942 - }, - { - "id": 9174443, - "name": "My Lead", - "prefix": null, - "first_name": "My", - "last_name": "Lead", - "middle_name": null, - "suffix": null, - "address": null, - "assignee_id": null, - "company_name": null, - "customer_source_id": null, - "details": null, - "email": { - "email": "mylead@noemail.com", - "category": "work" - }, - "interaction_count": 0, - "monetary_value": null, - "socials": [], - "status": "New", - "status_id": 208231, - "tags": [], - "title": null, - "websites": [], - "phone_numbers": [], - "custom_fields": [ - { - "custom_field_definition_id": 100764, - "value": "Text fields are 255 chars or less!" - }, - { - "custom_field_definition_id": 103481, - "value": "text /n text" - } - ], - "date_created": 1490112953, - "date_modified": 1490112953 - }, - { - "id": 8894157, - "name": "Test Lead", - "prefix": null, - "first_name": "Test", - "last_name": "Lead", - "middle_name": null, - "suffix": null, - "address": { - "street": "301 Howard St Ste 600", - "city": "San Francisco", - "state": "CA", - "postal_code": "94105", - "country": "US" - }, - "assignee_id": 137658, - "company_name": "Lead's Company", - "customer_source_id": 331241, - "details": "This is an update", - "email": { - "email": "address@workemail.com", - "category": "work" - }, - "interaction_count": 0, - "monetary_value": 100, - "socials": [ - { - "url": "facebook.com/test_lead", - "category": "facebook" - } - ], - "status": "New", - "status_id": 208231, - "tags": ["tag 1", "tag 2"], - "title": "Title", - "websites": [ - { - "url": "www.workwebsite.com", - "category": "work" - } - ], - "phone_numbers": [ - { - "number": "415-999-4321", - "category": "mobile" - }, - { - "number": "415-555-1234", - "category": "work" - } - ], - "custom_fields": [ - { - "custom_field_definition_id": 100764, - "value": null - }, - { - "custom_field_definition_id": 103481, - "value": null - } - ], - "date_created": 1489018784, - "date_modified": 1496692911 - } -] diff --git a/packages/pipeline/test/fixtures/copper/api_v1_list_leads.ts b/packages/pipeline/test/fixtures/copper/api_v1_list_leads.ts deleted file mode 100644 index b1f00cba7..000000000 --- a/packages/pipeline/test/fixtures/copper/api_v1_list_leads.ts +++ /dev/null @@ -1,229 +0,0 @@ -import { CopperLead } from '../../../src/entities'; -const ParsedLeads: CopperLead[] = [ - { - id: 9150547, - name: 'My Contact', - firstName: 'My', - lastName: 'Contact', - middleName: undefined, - assigneeId: undefined, - companyName: undefined, - customerSourceId: undefined, - monetaryValue: undefined, - status: 'New', - statusId: 208231, - title: undefined, - dateCreated: 1490045162000, - dateModified: 1490045162000, - }, - { - id: 9150552, - name: 'My Contact', - firstName: 'My', - lastName: 'Contact', - middleName: undefined, - assigneeId: undefined, - companyName: undefined, - customerSourceId: undefined, - monetaryValue: undefined, - status: 'New', - statusId: 208231, - title: undefined, - dateCreated: 1490045237000, - dateModified: 1490045237000, - }, - { - id: 9150578, - name: 'My Contact', - firstName: 'My', - lastName: 'Contact', - middleName: undefined, - assigneeId: undefined, - companyName: undefined, - customerSourceId: undefined, - monetaryValue: undefined, - status: 'New', - statusId: 208231, - title: undefined, - dateCreated: 1490045279000, - dateModified: 1490045279000, - }, - { - id: 8982554, - name: 'My Lead', - firstName: 
'My', - lastName: 'Lead', - middleName: undefined, - assigneeId: undefined, - companyName: undefined, - customerSourceId: undefined, - monetaryValue: undefined, - status: 'New', - statusId: 208231, - title: undefined, - dateCreated: 1489528899000, - dateModified: 1489528899000, - }, - { - id: 8982702, - name: 'My Lead', - firstName: 'My', - lastName: 'Lead', - middleName: undefined, - assigneeId: undefined, - companyName: undefined, - customerSourceId: undefined, - monetaryValue: undefined, - status: 'New', - statusId: 208231, - title: undefined, - dateCreated: 1489531171000, - dateModified: 1489531171000, - }, - { - id: 9094361, - name: 'My Lead', - firstName: 'My', - lastName: 'Lead', - middleName: undefined, - assigneeId: undefined, - companyName: undefined, - customerSourceId: undefined, - monetaryValue: undefined, - status: 'New', - statusId: 208231, - title: undefined, - dateCreated: 1489791225000, - dateModified: 1489791225000, - }, - { - id: 9094364, - name: 'My Lead', - firstName: 'My', - lastName: 'Lead', - middleName: undefined, - assigneeId: undefined, - companyName: undefined, - customerSourceId: undefined, - monetaryValue: undefined, - status: 'New', - statusId: 208231, - title: undefined, - dateCreated: 1489791283000, - dateModified: 1489791283000, - }, - { - id: 9094371, - name: 'My Lead', - firstName: 'My', - lastName: 'Lead', - middleName: undefined, - assigneeId: undefined, - companyName: undefined, - customerSourceId: undefined, - monetaryValue: undefined, - status: 'New', - statusId: 208231, - title: undefined, - dateCreated: 1489791417000, - dateModified: 1489791417000, - }, - { - id: 9094372, - name: 'My Lead', - firstName: 'My', - lastName: 'Lead', - middleName: undefined, - assigneeId: undefined, - companyName: undefined, - customerSourceId: undefined, - monetaryValue: undefined, - status: 'New', - statusId: 208231, - title: undefined, - dateCreated: 1489791453000, - dateModified: 1489791453000, - }, - { - id: 9094373, - name: 'My Lead', - firstName: 'My', - lastName: 'Lead', - middleName: undefined, - assigneeId: undefined, - companyName: undefined, - customerSourceId: undefined, - monetaryValue: undefined, - status: 'New', - statusId: 208231, - title: undefined, - dateCreated: 1489791470000, - dateModified: 1489791470000, - }, - { - id: 9094383, - name: 'My Lead', - firstName: 'My', - lastName: 'Lead', - middleName: undefined, - assigneeId: undefined, - companyName: undefined, - customerSourceId: undefined, - monetaryValue: undefined, - status: 'New', - statusId: 208231, - title: undefined, - dateCreated: 1489791672000, - dateModified: 1489791672000, - }, - { - id: 9174441, - name: 'My Lead', - firstName: 'My', - lastName: 'Lead', - middleName: undefined, - assigneeId: undefined, - companyName: undefined, - customerSourceId: undefined, - monetaryValue: undefined, - status: 'New', - statusId: 208231, - title: undefined, - dateCreated: 1490112942000, - dateModified: 1490112942000, - }, - { - id: 9174443, - name: 'My Lead', - firstName: 'My', - lastName: 'Lead', - middleName: undefined, - assigneeId: undefined, - companyName: undefined, - customerSourceId: undefined, - monetaryValue: undefined, - status: 'New', - statusId: 208231, - title: undefined, - dateCreated: 1490112953000, - dateModified: 1490112953000, - }, - { - id: 8894157, - name: 'Test Lead', - firstName: 'Test', - lastName: 'Lead', - middleName: undefined, - assigneeId: 137658, - companyName: "Lead's Company", - customerSourceId: 331241, - monetaryValue: 100, - status: 'New', - statusId: 208231, - 
title: 'Title', - dateCreated: 1489018784000, - dateModified: 1496692911000, - }, -]; - -export { ParsedLeads }; diff --git a/packages/pipeline/test/fixtures/copper/api_v1_list_opportunities.json b/packages/pipeline/test/fixtures/copper/api_v1_list_opportunities.json deleted file mode 100644 index 34ac58c30..000000000 --- a/packages/pipeline/test/fixtures/copper/api_v1_list_opportunities.json +++ /dev/null @@ -1,662 +0,0 @@ -[ - { - "id": 14050269, - "name": "8Base RaaS", - "assignee_id": 680302, - "close_date": "11/19/2018", - "company_id": 27778962, - "company_name": "8base", - "customer_source_id": null, - "details": "blah blah", - "loss_reason_id": null, - "pipeline_id": 512676, - "pipeline_stage_id": 2405442, - "primary_contact_id": 66088850, - "priority": "None", - "status": "Won", - "tags": [], - "interaction_count": 81, - "monetary_unit": null, - "monetary_value": null, - "converted_unit": null, - "converted_value": null, - "win_probability": 0, - "date_stage_changed": 1542653860, - "date_last_contacted": 1544757550, - "leads_converted_from": [], - "date_lead_created": null, - "date_created": 1538414159, - "date_modified": 1544769562, - "custom_fields": [ - { "custom_field_definition_id": 261066, "value": [394013, 394018] }, - { "custom_field_definition_id": 261067, "value": 394026 } - ] - }, - { - "id": 14631430, - "name": "Alice.si TW + ERC 20 Marketplace", - "assignee_id": 680302, - "close_date": "12/15/2018", - "company_id": 30238847, - "company_name": "Alice SI", - "customer_source_id": null, - "details": "blah blah", - "loss_reason_id": null, - "pipeline_id": 512676, - "pipeline_stage_id": 2392929, - "primary_contact_id": 69354024, - "priority": "None", - "status": "Open", - "tags": [], - "interaction_count": 4, - "monetary_unit": null, - "monetary_value": null, - "converted_unit": null, - "converted_value": null, - "win_probability": 0, - "date_stage_changed": 1542304481, - "date_last_contacted": 1542304800, - "leads_converted_from": [], - "date_lead_created": null, - "date_created": 1542304481, - "date_modified": 1542304943, - "custom_fields": [ - { "custom_field_definition_id": 261066, "value": [394013, 394015] }, - { "custom_field_definition_id": 261067, "value": 394023 } - ] - }, - { - "id": 14632057, - "name": "Altcoin.io Relayer", - "assignee_id": 680302, - "close_date": "12/15/2018", - "company_id": 29936486, - "company_name": "Altcoin.io", - "customer_source_id": null, - "details": "blah blah", - "loss_reason_id": null, - "pipeline_id": 512676, - "pipeline_stage_id": 2392929, - "primary_contact_id": 68724646, - "priority": "None", - "status": "Open", - "tags": [], - "interaction_count": 22, - "monetary_unit": null, - "monetary_value": null, - "converted_unit": null, - "converted_value": null, - "win_probability": 0, - "date_stage_changed": 1542310909, - "date_last_contacted": 1543864597, - "leads_converted_from": [], - "date_lead_created": null, - "date_created": 1542306827, - "date_modified": 1543864667, - "custom_fields": [ - { "custom_field_definition_id": 261066, "value": [394013, 394017] }, - { "custom_field_definition_id": 261067, "value": 394023 } - ] - }, - { - "id": 14667523, - "name": "Altcoin.io Relayer", - "assignee_id": 680302, - "close_date": "12/19/2018", - "company_id": 29936486, - "company_name": "Altcoin.io", - "customer_source_id": null, - "details": "blah blah", - "loss_reason_id": null, - "pipeline_id": 512676, - "pipeline_stage_id": 2392929, - "primary_contact_id": 68724646, - "priority": "None", - "status": "Open", - "tags": [], - 
"interaction_count": 21, - "monetary_unit": null, - "monetary_value": null, - "converted_unit": null, - "converted_value": null, - "win_probability": 0, - "date_stage_changed": 1542657437, - "date_last_contacted": 1543864597, - "leads_converted_from": [], - "date_lead_created": null, - "date_created": 1542657437, - "date_modified": 1543864667, - "custom_fields": [ - { "custom_field_definition_id": 261066, "value": [394013, 394017] }, - { "custom_field_definition_id": 261067, "value": 394023 } - ] - }, - { - "id": 14666706, - "name": "Amadeus Relayer", - "assignee_id": 680302, - "close_date": "11/19/2018", - "company_id": 29243209, - "company_name": "Amadeus", - "customer_source_id": null, - "details": "blah blah", - "loss_reason_id": null, - "pipeline_id": 512676, - "pipeline_stage_id": 2405442, - "primary_contact_id": 66912020, - "priority": "None", - "status": "Won", - "tags": [], - "interaction_count": 11, - "monetary_unit": null, - "monetary_value": null, - "converted_unit": null, - "converted_value": null, - "win_probability": 0, - "date_stage_changed": 1542654284, - "date_last_contacted": 1543264254, - "leads_converted_from": [], - "date_lead_created": null, - "date_created": 1542654284, - "date_modified": 1543277520, - "custom_fields": [ - { "custom_field_definition_id": 261066, "value": [394013] }, - { "custom_field_definition_id": 261067, "value": 394023 } - ] - }, - { - "id": 14666718, - "name": "Ambo Relayer", - "assignee_id": 680302, - "close_date": "11/19/2018", - "company_id": 29249190, - "company_name": "Ambo", - "customer_source_id": null, - "details": "blah blah", - "loss_reason_id": null, - "pipeline_id": 512676, - "pipeline_stage_id": 2405442, - "primary_contact_id": 66927869, - "priority": "None", - "status": "Won", - "tags": [], - "interaction_count": 126, - "monetary_unit": null, - "monetary_value": null, - "converted_unit": null, - "converted_value": null, - "win_probability": 0, - "date_stage_changed": 1542654352, - "date_last_contacted": 1545252349, - "leads_converted_from": [], - "date_lead_created": null, - "date_created": 1542654352, - "date_modified": 1545253761, - "custom_fields": [ - { "custom_field_definition_id": 261066, "value": [394013] }, - { "custom_field_definition_id": 261067, "value": 394023 } - ] - }, - { - "id": 14164318, - "name": "Augur TW", - "assignee_id": 680302, - "close_date": "12/10/2018", - "company_id": 27778967, - "company_name": "Augur", - "customer_source_id": null, - "details": "blah blah", - "loss_reason_id": null, - "pipeline_id": 512676, - "pipeline_stage_id": 2405442, - "primary_contact_id": 67248692, - "priority": "None", - "status": "Won", - "tags": [], - "interaction_count": 22, - "monetary_unit": null, - "monetary_value": null, - "converted_unit": null, - "converted_value": null, - "win_probability": 0, - "date_stage_changed": 1544469362, - "date_last_contacted": 1544491567, - "leads_converted_from": [], - "date_lead_created": null, - "date_created": 1539204858, - "date_modified": 1544653867, - "custom_fields": [ - { "custom_field_definition_id": 261066, "value": [394015] }, - { "custom_field_definition_id": 261067, "value": 394021 } - ] - }, - { - "id": 14666626, - "name": "Autonio", - "assignee_id": 680302, - "close_date": "12/19/2018", - "company_id": 27920701, - "company_name": "Auton", - "customer_source_id": null, - "details": "blah blah", - "loss_reason_id": null, - "pipeline_id": 512676, - "pipeline_stage_id": 2392931, - "primary_contact_id": 64742640, - "priority": "None", - "status": "Open", - "tags": [], - 
"interaction_count": 54, - "monetary_unit": null, - "monetary_value": null, - "converted_unit": null, - "converted_value": null, - "win_probability": 0, - "date_stage_changed": 1542653834, - "date_last_contacted": 1542658568, - "leads_converted_from": [], - "date_lead_created": null, - "date_created": 1542653834, - "date_modified": 1542658808, - "custom_fields": [ - { "custom_field_definition_id": 261066, "value": [394013, 394019] }, - { "custom_field_definition_id": 261067, "value": 394023 } - ] - }, - { - "id": 14050921, - "name": "Axie Infinity 721 Marketplace", - "assignee_id": 680302, - "close_date": "11/1/2018", - "company_id": 27779033, - "company_name": "Axie Infinity", - "customer_source_id": null, - "details": "blah blah", - "loss_reason_id": null, - "pipeline_id": 512676, - "pipeline_stage_id": 2392931, - "primary_contact_id": 66499254, - "priority": "None", - "status": "Open", - "tags": [], - "interaction_count": 4, - "monetary_unit": null, - "monetary_value": null, - "converted_unit": null, - "converted_value": null, - "win_probability": 0, - "date_stage_changed": 1543861025, - "date_last_contacted": 1539024738, - "leads_converted_from": [], - "date_lead_created": null, - "date_created": 1538416687, - "date_modified": 1543861025, - "custom_fields": [ - { "custom_field_definition_id": 261066, "value": [394014] }, - { "custom_field_definition_id": 261067, "value": 394134 } - ] - }, - { - "id": 13735617, - "name": "Balance TW", - "assignee_id": 680302, - "close_date": "12/10/2018", - "company_id": 27778968, - "company_name": "Balance", - "customer_source_id": null, - "details": "blah blah", - "loss_reason_id": null, - "pipeline_id": 512676, - "pipeline_stage_id": 2405442, - "primary_contact_id": 64713448, - "priority": "None", - "status": "Won", - "tags": [], - "interaction_count": 34, - "monetary_unit": null, - "monetary_value": null, - "converted_unit": null, - "converted_value": null, - "win_probability": 0, - "date_stage_changed": 1544469382, - "date_last_contacted": 1545082200, - "leads_converted_from": [], - "date_lead_created": null, - "date_created": 1535668009, - "date_modified": 1545082454, - "custom_fields": [ - { "custom_field_definition_id": 261066, "value": [394015] }, - { "custom_field_definition_id": 261067, "value": 394027 } - ] - }, - { - "id": 14667112, - "name": "Bamboo Relayer", - "assignee_id": 680302, - "close_date": "11/19/2018", - "company_id": 29243795, - "company_name": "Bamboo Relay", - "customer_source_id": null, - "details": "blah blah", - "loss_reason_id": null, - "pipeline_id": 512676, - "pipeline_stage_id": 2405442, - "primary_contact_id": 66914687, - "priority": "None", - "status": "Won", - "tags": [], - "interaction_count": 46, - "monetary_unit": null, - "monetary_value": null, - "converted_unit": null, - "converted_value": null, - "win_probability": 0, - "date_stage_changed": 1542655143, - "date_last_contacted": 1545252349, - "leads_converted_from": [], - "date_lead_created": null, - "date_created": 1542655143, - "date_modified": 1545253761, - "custom_fields": [ - { "custom_field_definition_id": 261066, "value": [394013] }, - { "custom_field_definition_id": 261067, "value": 394023 } - ] - }, - { - "id": 13627309, - "name": "Ben TW", - "assignee_id": 680302, - "close_date": "1/1/2019", - "company_id": 27702348, - "company_name": "Ben", - "customer_source_id": null, - "details": "blah blah", - "loss_reason_id": null, - "pipeline_id": 512676, - "pipeline_stage_id": 2392929, - "primary_contact_id": 64262622, - "priority": "None", - "status": 
"Open", - "tags": [], - "interaction_count": 64, - "monetary_unit": null, - "monetary_value": null, - "converted_unit": null, - "converted_value": null, - "win_probability": 0, - "date_stage_changed": 1541527279, - "date_last_contacted": 1541639882, - "leads_converted_from": [], - "date_lead_created": null, - "date_created": 1534887789, - "date_modified": 1541651395, - "custom_fields": [ - { "custom_field_definition_id": 261066, "value": [394015] }, - { "custom_field_definition_id": 261067, "value": 394027 } - ] - }, - { - "id": 14808512, - "name": "Bit2Me Relayer", - "assignee_id": 680302, - "close_date": "12/3/2018", - "company_id": 30793050, - "company_name": "Bit2Me", - "customer_source_id": null, - "details": "blah blah", - "loss_reason_id": null, - "pipeline_id": 512676, - "pipeline_stage_id": 2405442, - "primary_contact_id": 70267217, - "priority": "None", - "status": "Won", - "tags": [], - "interaction_count": 0, - "monetary_unit": null, - "monetary_value": null, - "converted_unit": null, - "converted_value": null, - "win_probability": 0, - "date_stage_changed": 1543861167, - "date_last_contacted": null, - "leads_converted_from": [], - "date_lead_created": null, - "date_created": 1543861167, - "date_modified": 1543861189, - "custom_fields": [ - { "custom_field_definition_id": 261066, "value": [394013] }, - { "custom_field_definition_id": 261067, "value": 394023 } - ] - }, - { - "id": 14050312, - "name": "Bitcoin.tax Reporting Integration", - "assignee_id": 680302, - "close_date": "11/1/2018", - "company_id": 27957614, - "company_name": "Bitcoin", - "customer_source_id": null, - "details": "blah blah", - "loss_reason_id": null, - "pipeline_id": 512676, - "pipeline_stage_id": 2392928, - "primary_contact_id": 66539479, - "priority": "None", - "status": "Open", - "tags": [], - "interaction_count": 5, - "monetary_unit": null, - "monetary_value": null, - "converted_unit": null, - "converted_value": null, - "win_probability": 0, - "date_stage_changed": 1538414308, - "date_last_contacted": 1536766098, - "leads_converted_from": [], - "date_lead_created": null, - "date_created": 1538414308, - "date_modified": 1538414314, - "custom_fields": [ - { "custom_field_definition_id": 261066, "value": [394019] }, - { "custom_field_definition_id": 261067, "value": 394026 } - ] - }, - { - "id": 14331463, - "name": "Bitpie TW", - "assignee_id": 680302, - "close_date": "11/19/2018", - "company_id": 27779026, - "company_name": "Bitpie", - "customer_source_id": null, - "details": "blah blah", - "loss_reason_id": null, - "pipeline_id": 512676, - "pipeline_stage_id": 2392929, - "primary_contact_id": 67700943, - "priority": "None", - "status": "Open", - "tags": [], - "interaction_count": 9, - "monetary_unit": null, - "monetary_value": null, - "converted_unit": null, - "converted_value": null, - "win_probability": 0, - "date_stage_changed": 1539984566, - "date_last_contacted": 1541529947, - "leads_converted_from": [], - "date_lead_created": null, - "date_created": 1539984566, - "date_modified": 1541530233, - "custom_fields": [ - { "custom_field_definition_id": 261066, "value": [394015] }, - { "custom_field_definition_id": 261067, "value": 394027 } - ] - }, - { - "id": 14331481, - "name": "Bitski Wallet SDK TW", - "assignee_id": 680302, - "close_date": "11/19/2018", - "company_id": 29489300, - "company_name": "Bitski", - "customer_source_id": null, - "details": "blah blah", - "loss_reason_id": null, - "pipeline_id": 512676, - "pipeline_stage_id": 2392929, - "primary_contact_id": 67697528, - "priority": "None", - 
"status": "Open", - "tags": [], - "interaction_count": 23, - "monetary_unit": null, - "monetary_value": null, - "converted_unit": null, - "converted_value": null, - "win_probability": 0, - "date_stage_changed": 1539984735, - "date_last_contacted": 1544811399, - "leads_converted_from": [], - "date_lead_created": null, - "date_created": 1539984735, - "date_modified": 1544818605, - "custom_fields": [ - { "custom_field_definition_id": 261066, "value": [394015] }, - { "custom_field_definition_id": 261067, "value": 394026 } - ] - }, - { - "id": 14531554, - "name": "BitUniverse TW", - "assignee_id": 680302, - "close_date": "12/6/2018", - "company_id": 29901805, - "company_name": "BitUniverse Co., Ltd (Cryptocurrency Portfolio)", - "customer_source_id": null, - "details": "blah blah", - "loss_reason_id": null, - "pipeline_id": 512676, - "pipeline_stage_id": 2392929, - "primary_contact_id": 68692107, - "priority": "None", - "status": "Open", - "tags": [], - "interaction_count": 15, - "monetary_unit": null, - "monetary_value": null, - "converted_unit": null, - "converted_value": null, - "win_probability": 0, - "date_stage_changed": 1543861104, - "date_last_contacted": 1544803276, - "leads_converted_from": [], - "date_lead_created": null, - "date_created": 1541527110, - "date_modified": 1544812979, - "custom_fields": [ - { "custom_field_definition_id": 261066, "value": [394015] }, - { "custom_field_definition_id": 261067, "value": 394026 } - ] - }, - { - "id": 14050895, - "name": "BlitzPredict PMR", - "assignee_id": 680302, - "close_date": "11/1/2018", - "company_id": 28758258, - "company_name": "BlitzPredict", - "customer_source_id": null, - "details": "blah blah", - "loss_reason_id": null, - "pipeline_id": 512676, - "pipeline_stage_id": 2392929, - "primary_contact_id": 66378659, - "priority": "None", - "status": "Open", - "tags": [], - "interaction_count": 32, - "monetary_unit": null, - "monetary_value": null, - "converted_unit": null, - "converted_value": null, - "win_probability": 0, - "date_stage_changed": 1539985501, - "date_last_contacted": 1544830560, - "leads_converted_from": [], - "date_lead_created": null, - "date_created": 1538416597, - "date_modified": 1544830709, - "custom_fields": [ - { "custom_field_definition_id": 261066, "value": [394016] }, - { "custom_field_definition_id": 261067, "value": 394023 } - ] - }, - { - "id": 14209841, - "name": "Blockfolio TW", - "assignee_id": 680302, - "close_date": "11/15/2018", - "company_id": 29332516, - "company_name": "Blockfolio", - "customer_source_id": null, - "details": "blah blah", - "loss_reason_id": null, - "pipeline_id": 512676, - "pipeline_stage_id": 2405443, - "primary_contact_id": 67247027, - "priority": "None", - "status": "Open", - "tags": [], - "interaction_count": 20, - "monetary_unit": null, - "monetary_value": null, - "converted_unit": null, - "converted_value": null, - "win_probability": 0, - "date_stage_changed": 1539984098, - "date_last_contacted": 1539977661, - "leads_converted_from": [], - "date_lead_created": null, - "date_created": 1539624801, - "date_modified": 1539984098, - "custom_fields": [ - { "custom_field_definition_id": 261066, "value": [394015] }, - { "custom_field_definition_id": 261067, "value": 394026 } - ] - }, - { - "id": 14633220, - "name": "BlockSwap 721 / 1155 Conversational Marketplace", - "assignee_id": 680302, - "close_date": "12/15/2018", - "company_id": 30210921, - "company_name": "BlockSwap", - "customer_source_id": null, - "details": "blah blah", - "loss_reason_id": null, - "pipeline_id": 512676, - 
"pipeline_stage_id": 2392929, - "primary_contact_id": 69296220, - "priority": "None", - "status": "Open", - "tags": [], - "interaction_count": 82, - "monetary_unit": null, - "monetary_value": null, - "converted_unit": null, - "converted_value": null, - "win_probability": 0, - "date_stage_changed": 1542311056, - "date_last_contacted": 1543536442, - "leads_converted_from": [], - "date_lead_created": null, - "date_created": 1542311056, - "date_modified": 1543557877, - "custom_fields": [ - { "custom_field_definition_id": 261066, "value": [394014] }, - { "custom_field_definition_id": 261067, "value": 394023 } - ] - } -] diff --git a/packages/pipeline/test/fixtures/copper/api_v1_list_opportunities.ts b/packages/pipeline/test/fixtures/copper/api_v1_list_opportunities.ts deleted file mode 100644 index 3c2d4ae5e..000000000 --- a/packages/pipeline/test/fixtures/copper/api_v1_list_opportunities.ts +++ /dev/null @@ -1,425 +0,0 @@ -// tslint:disable:custom-no-magic-numbers -import { CopperOpportunity } from '../../../src/entities'; -const ParsedOpportunities: CopperOpportunity[] = [ - { - id: 14050269, - name: '8Base RaaS', - assigneeId: 680302, - closeDate: '11/19/2018', - companyId: 27778962, - companyName: '8base', - customerSourceId: undefined, - lossReasonId: undefined, - pipelineId: 512676, - pipelineStageId: 2405442, - primaryContactId: 66088850, - priority: 'None', - status: 'Won', - interactionCount: 81, - monetaryValue: undefined, - winProbability: 0, - dateCreated: 1538414159000, - dateModified: 1544769562000, - customFields: { '261066': 394018, '261067': 394026 }, - }, - { - id: 14631430, - name: 'Alice.si TW + ERC 20 Marketplace', - assigneeId: 680302, - closeDate: '12/15/2018', - companyId: 30238847, - companyName: 'Alice SI', - customerSourceId: undefined, - lossReasonId: undefined, - pipelineId: 512676, - pipelineStageId: 2392929, - primaryContactId: 69354024, - priority: 'None', - status: 'Open', - interactionCount: 4, - monetaryValue: undefined, - winProbability: 0, - dateCreated: 1542304481000, - dateModified: 1542304943000, - customFields: { '261066': 394015, '261067': 394023 }, - }, - { - id: 14632057, - name: 'Altcoin.io Relayer', - assigneeId: 680302, - closeDate: '12/15/2018', - companyId: 29936486, - companyName: 'Altcoin.io', - customerSourceId: undefined, - lossReasonId: undefined, - pipelineId: 512676, - pipelineStageId: 2392929, - primaryContactId: 68724646, - priority: 'None', - status: 'Open', - interactionCount: 22, - monetaryValue: undefined, - winProbability: 0, - dateCreated: 1542306827000, - dateModified: 1543864667000, - customFields: { '261066': 394017, '261067': 394023 }, - }, - { - id: 14667523, - name: 'Altcoin.io Relayer', - assigneeId: 680302, - closeDate: '12/19/2018', - companyId: 29936486, - companyName: 'Altcoin.io', - customerSourceId: undefined, - lossReasonId: undefined, - pipelineId: 512676, - pipelineStageId: 2392929, - primaryContactId: 68724646, - priority: 'None', - status: 'Open', - interactionCount: 21, - monetaryValue: undefined, - winProbability: 0, - dateCreated: 1542657437000, - dateModified: 1543864667000, - customFields: { '261066': 394017, '261067': 394023 }, - }, - { - id: 14666706, - name: 'Amadeus Relayer', - assigneeId: 680302, - closeDate: '11/19/2018', - companyId: 29243209, - companyName: 'Amadeus', - customerSourceId: undefined, - lossReasonId: undefined, - pipelineId: 512676, - pipelineStageId: 2405442, - primaryContactId: 66912020, - priority: 'None', - status: 'Won', - interactionCount: 11, - monetaryValue: undefined, - 
winProbability: 0, - dateCreated: 1542654284000, - dateModified: 1543277520000, - customFields: { '261066': 394013, '261067': 394023 }, - }, - { - id: 14666718, - name: 'Ambo Relayer', - assigneeId: 680302, - closeDate: '11/19/2018', - companyId: 29249190, - companyName: 'Ambo', - customerSourceId: undefined, - lossReasonId: undefined, - pipelineId: 512676, - pipelineStageId: 2405442, - primaryContactId: 66927869, - priority: 'None', - status: 'Won', - interactionCount: 126, - monetaryValue: undefined, - winProbability: 0, - dateCreated: 1542654352000, - dateModified: 1545253761000, - customFields: { '261066': 394013, '261067': 394023 }, - }, - { - id: 14164318, - name: 'Augur TW', - assigneeId: 680302, - closeDate: '12/10/2018', - companyId: 27778967, - companyName: 'Augur', - customerSourceId: undefined, - lossReasonId: undefined, - pipelineId: 512676, - pipelineStageId: 2405442, - primaryContactId: 67248692, - priority: 'None', - status: 'Won', - interactionCount: 22, - monetaryValue: undefined, - winProbability: 0, - dateCreated: 1539204858000, - dateModified: 1544653867000, - customFields: { '261066': 394015, '261067': 394021 }, - }, - { - id: 14666626, - name: 'Autonio', - assigneeId: 680302, - closeDate: '12/19/2018', - companyId: 27920701, - companyName: 'Auton', - customerSourceId: undefined, - lossReasonId: undefined, - pipelineId: 512676, - pipelineStageId: 2392931, - primaryContactId: 64742640, - priority: 'None', - status: 'Open', - interactionCount: 54, - monetaryValue: undefined, - winProbability: 0, - dateCreated: 1542653834000, - dateModified: 1542658808000, - customFields: { '261066': 394019, '261067': 394023 }, - }, - { - id: 14050921, - name: 'Axie Infinity 721 Marketplace', - assigneeId: 680302, - closeDate: '11/1/2018', - companyId: 27779033, - companyName: 'Axie Infinity', - customerSourceId: undefined, - lossReasonId: undefined, - pipelineId: 512676, - pipelineStageId: 2392931, - primaryContactId: 66499254, - priority: 'None', - status: 'Open', - interactionCount: 4, - monetaryValue: undefined, - winProbability: 0, - dateCreated: 1538416687000, - dateModified: 1543861025000, - customFields: { '261066': 394014, '261067': 394134 }, - }, - { - id: 13735617, - name: 'Balance TW', - assigneeId: 680302, - closeDate: '12/10/2018', - companyId: 27778968, - companyName: 'Balance', - customerSourceId: undefined, - lossReasonId: undefined, - pipelineId: 512676, - pipelineStageId: 2405442, - primaryContactId: 64713448, - priority: 'None', - status: 'Won', - interactionCount: 34, - monetaryValue: undefined, - winProbability: 0, - dateCreated: 1535668009000, - dateModified: 1545082454000, - customFields: { '261066': 394015, '261067': 394027 }, - }, - { - id: 14667112, - name: 'Bamboo Relayer', - assigneeId: 680302, - closeDate: '11/19/2018', - companyId: 29243795, - companyName: 'Bamboo Relay', - customerSourceId: undefined, - lossReasonId: undefined, - pipelineId: 512676, - pipelineStageId: 2405442, - primaryContactId: 66914687, - priority: 'None', - status: 'Won', - interactionCount: 46, - monetaryValue: undefined, - winProbability: 0, - dateCreated: 1542655143000, - dateModified: 1545253761000, - customFields: { '261066': 394013, '261067': 394023 }, - }, - { - id: 13627309, - name: 'Ben TW', - assigneeId: 680302, - closeDate: '1/1/2019', - companyId: 27702348, - companyName: 'Ben', - customerSourceId: undefined, - lossReasonId: undefined, - pipelineId: 512676, - pipelineStageId: 2392929, - primaryContactId: 64262622, - priority: 'None', - status: 'Open', - interactionCount: 
64, - monetaryValue: undefined, - winProbability: 0, - dateCreated: 1534887789000, - dateModified: 1541651395000, - customFields: { '261066': 394015, '261067': 394027 }, - }, - { - id: 14808512, - name: 'Bit2Me Relayer', - assigneeId: 680302, - closeDate: '12/3/2018', - companyId: 30793050, - companyName: 'Bit2Me', - customerSourceId: undefined, - lossReasonId: undefined, - pipelineId: 512676, - pipelineStageId: 2405442, - primaryContactId: 70267217, - priority: 'None', - status: 'Won', - interactionCount: 0, - monetaryValue: undefined, - winProbability: 0, - dateCreated: 1543861167000, - dateModified: 1543861189000, - customFields: { '261066': 394013, '261067': 394023 }, - }, - { - id: 14050312, - name: 'Bitcoin.tax Reporting Integration', - assigneeId: 680302, - closeDate: '11/1/2018', - companyId: 27957614, - companyName: 'Bitcoin', - customerSourceId: undefined, - lossReasonId: undefined, - pipelineId: 512676, - pipelineStageId: 2392928, - primaryContactId: 66539479, - priority: 'None', - status: 'Open', - interactionCount: 5, - monetaryValue: undefined, - winProbability: 0, - dateCreated: 1538414308000, - dateModified: 1538414314000, - customFields: { '261066': 394019, '261067': 394026 }, - }, - { - id: 14331463, - name: 'Bitpie TW', - assigneeId: 680302, - closeDate: '11/19/2018', - companyId: 27779026, - companyName: 'Bitpie', - customerSourceId: undefined, - lossReasonId: undefined, - pipelineId: 512676, - pipelineStageId: 2392929, - primaryContactId: 67700943, - priority: 'None', - status: 'Open', - interactionCount: 9, - monetaryValue: undefined, - winProbability: 0, - dateCreated: 1539984566000, - dateModified: 1541530233000, - customFields: { '261066': 394015, '261067': 394027 }, - }, - { - id: 14331481, - name: 'Bitski Wallet SDK TW', - assigneeId: 680302, - closeDate: '11/19/2018', - companyId: 29489300, - companyName: 'Bitski', - customerSourceId: undefined, - lossReasonId: undefined, - pipelineId: 512676, - pipelineStageId: 2392929, - primaryContactId: 67697528, - priority: 'None', - status: 'Open', - interactionCount: 23, - monetaryValue: undefined, - winProbability: 0, - dateCreated: 1539984735000, - dateModified: 1544818605000, - customFields: { '261066': 394015, '261067': 394026 }, - }, - { - id: 14531554, - name: 'BitUniverse TW', - assigneeId: 680302, - closeDate: '12/6/2018', - companyId: 29901805, - companyName: 'BitUniverse Co., Ltd (Cryptocurrency Portfolio)', - customerSourceId: undefined, - lossReasonId: undefined, - pipelineId: 512676, - pipelineStageId: 2392929, - primaryContactId: 68692107, - priority: 'None', - status: 'Open', - interactionCount: 15, - monetaryValue: undefined, - winProbability: 0, - dateCreated: 1541527110000, - dateModified: 1544812979000, - customFields: { '261066': 394015, '261067': 394026 }, - }, - { - id: 14050895, - name: 'BlitzPredict PMR', - assigneeId: 680302, - closeDate: '11/1/2018', - companyId: 28758258, - companyName: 'BlitzPredict', - customerSourceId: undefined, - lossReasonId: undefined, - pipelineId: 512676, - pipelineStageId: 2392929, - primaryContactId: 66378659, - priority: 'None', - status: 'Open', - interactionCount: 32, - monetaryValue: undefined, - winProbability: 0, - dateCreated: 1538416597000, - dateModified: 1544830709000, - customFields: { '261066': 394016, '261067': 394023 }, - }, - { - id: 14209841, - name: 'Blockfolio TW', - assigneeId: 680302, - closeDate: '11/15/2018', - companyId: 29332516, - companyName: 'Blockfolio', - customerSourceId: undefined, - lossReasonId: undefined, - pipelineId: 512676, - 
pipelineStageId: 2405443, - primaryContactId: 67247027, - priority: 'None', - status: 'Open', - interactionCount: 20, - monetaryValue: undefined, - winProbability: 0, - dateCreated: 1539624801000, - dateModified: 1539984098000, - customFields: { '261066': 394015, '261067': 394026 }, - }, - { - id: 14633220, - name: 'BlockSwap 721 / 1155 Conversational Marketplace', - assigneeId: 680302, - closeDate: '12/15/2018', - companyId: 30210921, - companyName: 'BlockSwap', - customerSourceId: undefined, - lossReasonId: undefined, - pipelineId: 512676, - pipelineStageId: 2392929, - primaryContactId: 69296220, - priority: 'None', - status: 'Open', - interactionCount: 82, - monetaryValue: undefined, - winProbability: 0, - dateCreated: 1542311056000, - dateModified: 1543557877000, - customFields: { '261066': 394014, '261067': 394023 }, - }, -]; -export { ParsedOpportunities }; diff --git a/packages/pipeline/test/fixtures/copper/parsed_entities.ts b/packages/pipeline/test/fixtures/copper/parsed_entities.ts deleted file mode 100644 index 1f49d38ed..000000000 --- a/packages/pipeline/test/fixtures/copper/parsed_entities.ts +++ /dev/null @@ -1,5 +0,0 @@ -export { ParsedActivityTypes } from './api_v1_activity_types'; -export { ParsedCustomFields } from './api_v1_custom_field_definitions'; -export { ParsedActivities } from './api_v1_list_activities'; -export { ParsedLeads } from './api_v1_list_leads'; -export { ParsedOpportunities } from './api_v1_list_opportunities'; diff --git a/packages/pipeline/test/parsers/bloxy/index_test.ts b/packages/pipeline/test/parsers/bloxy/index_test.ts deleted file mode 100644 index d270bd2a7..000000000 --- a/packages/pipeline/test/parsers/bloxy/index_test.ts +++ /dev/null @@ -1,100 +0,0 @@ -// tslint:disable:custom-no-magic-numbers -import { BigNumber } from '@0x/utils'; -import * as chai from 'chai'; -import 'mocha'; -import * as R from 'ramda'; - -import { BLOXY_DEX_TRADES_URL, BloxyTrade } from '../../../src/data_sources/bloxy'; -import { DexTrade } from '../../../src/entities'; -import { _parseBloxyTrade } from '../../../src/parsers/bloxy'; -import { chaiSetup } from '../../utils/chai_setup'; - -chaiSetup.configure(); -const expect = chai.expect; - -const baseInput: BloxyTrade = { - tx_hash: '0xb93a7faf92efbbb5405c9a73cd4efd99702fe27c03ff22baee1f1b1e37b3a0bf', - tx_time: '2018-11-21T09:06:28.000+00:00', - tx_date: '2018-11-21', - tx_sender: '0x00923b9a074762b93650716333b3e1473a15048e', - tradeIndex: '1', - smart_contract_id: 7091917, - smart_contract_address: '0x818e6fecd516ecc3849daf6845e3ec868087b755', - contract_type: 'DEX/Kyber Network Proxy', - maker: '0x0000000000000000000000000000000000000001', - taker: '0x0000000000000000000000000000000000000002', - amountBuy: 1.011943163078103, - makerFee: 38.912083, - buyCurrencyId: 1, - buySymbol: 'ETH', - amountSell: 941.4997928436911, - takerFee: 100.39, - sellCurrencyId: 16610, - sellSymbol: 'ELF', - maker_annotation: 'random annotation', - taker_annotation: 'random other annotation', - protocol: 'Kyber Network Proxy', - buyAddress: '0xbf2179859fc6d5bee9bf9158632dc51678a4100d', - sellAddress: '0xbf2179859fc6d5bee9bf9158632dc51678a4100e', -}; - -const baseExpected: DexTrade = { - sourceUrl: BLOXY_DEX_TRADES_URL, - txHash: '0xb93a7faf92efbbb5405c9a73cd4efd99702fe27c03ff22baee1f1b1e37b3a0bf', - tradeIndex: '1', - txTimestamp: 1542791188000, - txDate: '2018-11-21', - txSender: '0x00923b9a074762b93650716333b3e1473a15048e', - smartContractId: 7091917, - smartContractAddress: '0x818e6fecd516ecc3849daf6845e3ec868087b755', - 
contractType: 'DEX/Kyber Network Proxy', - maker: '0x0000000000000000000000000000000000000001', - taker: '0x0000000000000000000000000000000000000002', - amountBuy: new BigNumber('1.011943163078103'), - makerFeeAmount: new BigNumber('38.912083'), - buyCurrencyId: 1, - buySymbol: 'ETH', - amountSell: new BigNumber('941.4997928436911'), - takerFeeAmount: new BigNumber('100.39'), - sellCurrencyId: 16610, - sellSymbol: 'ELF', - makerAnnotation: 'random annotation', - takerAnnotation: 'random other annotation', - protocol: 'Kyber Network Proxy', - buyAddress: '0xbf2179859fc6d5bee9bf9158632dc51678a4100d', - sellAddress: '0xbf2179859fc6d5bee9bf9158632dc51678a4100e', -}; - -interface TestCase { - input: BloxyTrade; - expected: DexTrade; -} - -const testCases: TestCase[] = [ - { - input: baseInput, - expected: baseExpected, - }, - { - input: R.merge(baseInput, { buyAddress: null, sellAddress: null }), - expected: R.merge(baseExpected, { buyAddress: null, sellAddress: null }), - }, - { - input: R.merge(baseInput, { - buySymbol: - 'RING\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000', - }), - expected: R.merge(baseExpected, { buySymbol: 'RING' }), - }, -]; - -describe('bloxy', () => { - describe('_parseBloxyTrade', () => { - for (const [i, testCase] of testCases.entries()) { - it(`converts BloxyTrade to DexTrade entity (${i + 1}/${testCases.length})`, () => { - const actual = _parseBloxyTrade(testCase.input); - expect(actual).deep.equal(testCase.expected); - }); - } - }); -}); diff --git a/packages/pipeline/test/parsers/copper/index_test.ts b/packages/pipeline/test/parsers/copper/index_test.ts deleted file mode 100644 index bb8e70da1..000000000 --- a/packages/pipeline/test/parsers/copper/index_test.ts +++ /dev/null @@ -1,87 +0,0 @@ -import * as chai from 'chai'; -import 'mocha'; - -import { - CopperActivity, - CopperActivityType, - CopperCustomField, - CopperLead, - CopperOpportunity, -} from '../../../src/entities'; -import { - CopperActivityResponse, - CopperActivityTypeCategory, - CopperActivityTypeResponse, - CopperCustomFieldResponse, - CopperSearchResponse, - parseActivities, - parseActivityTypes, - parseCustomFields, - parseLeads, - parseOpportunities, -} from '../../../src/parsers/copper'; -import { chaiSetup } from '../../utils/chai_setup'; - -chaiSetup.configure(); -const expect = chai.expect; - -type CopperResponse = CopperSearchResponse | CopperCustomFieldResponse; -type CopperEntity = CopperLead | CopperActivity | CopperOpportunity | CopperActivityType | CopperCustomField; - -import * as activityTypesApiResponse from '../../fixtures/copper/api_v1_activity_types.json'; -import * as customFieldsApiResponse from '../../fixtures/copper/api_v1_custom_field_definitions.json'; -import * as listActivitiesApiResponse from '../../fixtures/copper/api_v1_list_activities.json'; -import * as listLeadsApiResponse from '../../fixtures/copper/api_v1_list_leads.json'; -import * as listOpportunitiesApiResponse from '../../fixtures/copper/api_v1_list_opportunities.json'; -import { - ParsedActivities, - ParsedActivityTypes, - ParsedCustomFields, - ParsedLeads, - ParsedOpportunities, -} from '../../fixtures/copper/parsed_entities'; - -interface TestCase { - input: CopperResponse[]; - expected: CopperEntity[]; - parseFn(input: CopperResponse[]): CopperEntity[]; -} -const testCases: TestCase[] = [ - { - input: listLeadsApiResponse, - expected: ParsedLeads, - parseFn: parseLeads, - }, - { - 
input: (listActivitiesApiResponse as unknown) as CopperActivityResponse[], - expected: ParsedActivities, - parseFn: parseActivities, - }, - { - input: listOpportunitiesApiResponse, - expected: ParsedOpportunities, - parseFn: parseOpportunities, - }, - { - input: customFieldsApiResponse, - expected: ParsedCustomFields, - parseFn: parseCustomFields, - }, -]; -describe('Copper parser', () => { - it('parses API responses', () => { - testCases.forEach(testCase => { - const actual: CopperEntity[] = testCase.parseFn(testCase.input); - expect(actual).deep.equal(testCase.expected); - }); - }); - - // special case because the API response is not an array - it('parses activity types API response', () => { - const actual: CopperActivityType[] = parseActivityTypes((activityTypesApiResponse as unknown) as Map< - CopperActivityTypeCategory, - CopperActivityTypeResponse[] - >); - expect(actual).deep.equal(ParsedActivityTypes); - }); -}); diff --git a/packages/pipeline/test/parsers/ddex_orders/index_test.ts b/packages/pipeline/test/parsers/ddex_orders/index_test.ts deleted file mode 100644 index d6f69e090..000000000 --- a/packages/pipeline/test/parsers/ddex_orders/index_test.ts +++ /dev/null @@ -1,52 +0,0 @@ -import { BigNumber } from '@0x/utils'; -import * as chai from 'chai'; -import 'mocha'; - -import { DdexMarket } from '../../../src/data_sources/ddex'; -import { TokenOrderbookSnapshot as TokenOrder } from '../../../src/entities'; -import { parseDdexOrder } from '../../../src/parsers/ddex_orders'; -import { OrderType } from '../../../src/types'; -import { chaiSetup } from '../../utils/chai_setup'; - -chaiSetup.configure(); -const expect = chai.expect; - -// tslint:disable:custom-no-magic-numbers -describe('ddex_orders', () => { - describe('parseDdexOrder', () => { - it('converts ddexOrder to TokenOrder entity', () => { - const ddexOrder: [string, BigNumber] = ['0.5', new BigNumber(10)]; - const ddexMarket: DdexMarket = { - id: 'ABC-DEF', - quoteToken: 'ABC', - quoteTokenDecimals: 5, - quoteTokenAddress: '0x0000000000000000000000000000000000000000', - baseToken: 'DEF', - baseTokenDecimals: 2, - baseTokenAddress: '0xb45df06e38540a675fdb5b598abf2c0dbe9d6b81', - minOrderSize: '0.1', - pricePrecision: 1, - priceDecimals: 1, - amountDecimals: 0, - }; - const observedTimestamp: number = Date.now(); - const orderType: OrderType = OrderType.Bid; - const source: string = 'ddex'; - - const expected = new TokenOrder(); - expected.source = 'ddex'; - expected.observedTimestamp = observedTimestamp; - expected.orderType = OrderType.Bid; - expected.price = new BigNumber(0.5); - expected.quoteAssetSymbol = 'ABC'; - expected.quoteAssetAddress = '0x0000000000000000000000000000000000000000'; - expected.quoteVolume = new BigNumber(5); - expected.baseAssetSymbol = 'DEF'; - expected.baseAssetAddress = '0xb45df06e38540a675fdb5b598abf2c0dbe9d6b81'; - expected.baseVolume = new BigNumber(10); - - const actual = parseDdexOrder(ddexMarket, observedTimestamp, orderType, source, ddexOrder); - expect(actual).deep.equal(expected); - }); - }); -}); diff --git a/packages/pipeline/test/parsers/events/erc20_events_test.ts b/packages/pipeline/test/parsers/events/erc20_events_test.ts deleted file mode 100644 index 962c50f98..000000000 --- a/packages/pipeline/test/parsers/events/erc20_events_test.ts +++ /dev/null @@ -1,54 +0,0 @@ -import { ERC20TokenApprovalEventArgs } from '@0x/contract-wrappers'; -import { BigNumber } from '@0x/utils'; -import * as chai from 'chai'; -import { LogWithDecodedArgs } from 'ethereum-types'; -import 'mocha'; - 
-import { ERC20ApprovalEvent } from '../../../src/entities'; -import { _convertToERC20ApprovalEvent } from '../../../src/parsers/events/erc20_events'; -import { _convertToExchangeFillEvent } from '../../../src/parsers/events/exchange_events'; -import { chaiSetup } from '../../utils/chai_setup'; - -chaiSetup.configure(); -const expect = chai.expect; - -// tslint:disable:custom-no-magic-numbers -describe('erc20_events', () => { - describe('_convertToERC20ApprovalEvent', () => { - it('converts LogWithDecodedArgs to ERC20ApprovalEvent entity', () => { - const input: LogWithDecodedArgs<ERC20TokenApprovalEventArgs> = { - address: '0xc02aaa39b223fe8d0a0e5c4f27ead9083c756cc2', - blockHash: '0xd2d7aafaa7102aec0bca8ef026d5a85133e87892334c46ee1e92e42912991c9b', - blockNumber: 6281577, - data: '0x000000000000000000000000000000000000000000000002b9cba5ee21ad3df9', - logIndex: 43, - topics: [ - '0x8c5be1e5ebec7d5bd14f71427d1e84f3dd0314c0f7b2291e5b200ac8c7c3b925', - '0x0000000000000000000000000b65c5f6f3a05d6be5588a72b603360773b3fe04', - '0x000000000000000000000000448a5065aebb8e423f0896e6c5d525c040f59af3', - ], - transactionHash: '0xcb46b19c786376a0a0140d51e3e606a4c4f926d8ca5434e96d2f69d04d8d9c7f', - transactionIndex: 103, - event: 'Approval', - args: { - _owner: '0x0b65c5f6f3a05d6be5588a72b603360773b3fe04', - _spender: '0x448a5065aebb8e423f0896e6c5d525c040f59af3', - _value: new BigNumber('50281464906893835769'), - }, - }; - - const expected = new ERC20ApprovalEvent(); - expected.tokenAddress = '0xc02aaa39b223fe8d0a0e5c4f27ead9083c756cc2'; - expected.blockNumber = 6281577; - expected.rawData = '0x000000000000000000000000000000000000000000000002b9cba5ee21ad3df9'; - expected.logIndex = 43; - expected.transactionHash = '0xcb46b19c786376a0a0140d51e3e606a4c4f926d8ca5434e96d2f69d04d8d9c7f'; - expected.ownerAddress = '0x0b65c5f6f3a05d6be5588a72b603360773b3fe04'; - expected.spenderAddress = '0x448a5065aebb8e423f0896e6c5d525c040f59af3'; - expected.amount = new BigNumber('50281464906893835769'); - - const actual = _convertToERC20ApprovalEvent(input); - expect(actual).deep.equal(expected); - }); - }); -}); diff --git a/packages/pipeline/test/parsers/events/exchange_events_test.ts b/packages/pipeline/test/parsers/events/exchange_events_test.ts deleted file mode 100644 index 956ad9ef8..000000000 --- a/packages/pipeline/test/parsers/events/exchange_events_test.ts +++ /dev/null @@ -1,79 +0,0 @@ -import { ExchangeFillEventArgs } from '@0x/contract-wrappers'; -import { BigNumber } from '@0x/utils'; -import * as chai from 'chai'; -import { LogWithDecodedArgs } from 'ethereum-types'; -import 'mocha'; - -import { ExchangeFillEvent } from '../../../src/entities'; -import { _convertToExchangeFillEvent } from '../../../src/parsers/events/exchange_events'; -import { AssetType } from '../../../src/types'; -import { chaiSetup } from '../../utils/chai_setup'; - -chaiSetup.configure(); -const expect = chai.expect; - -// tslint:disable:custom-no-magic-numbers -describe('exchange_events', () => { - describe('_convertToExchangeFillEvent', () => { - it('converts LogWithDecodedArgs to ExchangeFillEvent entity', () => { - const input: LogWithDecodedArgs<ExchangeFillEventArgs> = { - logIndex: 102, - transactionIndex: 38, - transactionHash: '0x6dd106d002873746072fc5e496dd0fb2541b68c77bcf9184ae19a42fd33657fe', - blockHash: '', - blockNumber: 6276262, - address: '0x4f833a24e1f95d70f028921e27040ca56e09ab0b', - data: - 
'0x000000000000000000000000f6da68519f78b0d0bc93c701e86affcb75c92428000000000000000000000000f6da68519f78b0d0bc93c701e86affcb75c92428000000000000000000000000000000000000000000000000002386f26fc10000000000000000000000000000000000000000000000000000016345785d8a000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000010000000000000000000000000000000000000000000000000000000000000001600000000000000000000000000000000000000000000000000000000000000024f47261b0000000000000000000000000c02aaa39b223fe8d0a0e5c4f27ead9083c756cc2000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000024f47261b0000000000000000000000000e41d2489571d322189246dafa5ebde1f4699f49800000000000000000000000000000000000000000000000000000000', - topics: [ - '0x0bcc4c97732e47d9946f229edb95f5b6323f601300e4690de719993f3c371129', - '0x000000000000000000000000f6da68519f78b0d0bc93c701e86affcb75c92428', - '0x000000000000000000000000c370d2a5920344aa6b7d8d11250e3e861434cbdd', - '0xab12ed2cbaa5615ab690b9da75a46e53ddfcf3f1a68655b5fe0d94c75a1aac4a', - ], - event: 'Fill', - args: { - makerAddress: '0xf6da68519f78b0d0bc93c701e86affcb75c92428', - feeRecipientAddress: '0xc370d2a5920344aa6b7d8d11250e3e861434cbdd', - takerAddress: '0xf6da68519f78b0d0bc93c701e86affcb75c92428', - senderAddress: '0xf6da68519f78b0d0bc93c701e86affcb75c92428', - makerAssetFilledAmount: new BigNumber('10000000000000000'), - takerAssetFilledAmount: new BigNumber('100000000000000000'), - makerFeePaid: new BigNumber('0'), - takerFeePaid: new BigNumber('12345'), - orderHash: '0xab12ed2cbaa5615ab690b9da75a46e53ddfcf3f1a68655b5fe0d94c75a1aac4a', - makerAssetData: '0xf47261b0000000000000000000000000c02aaa39b223fe8d0a0e5c4f27ead9083c756cc2', - takerAssetData: '0xf47261b0000000000000000000000000e41d2489571d322189246dafa5ebde1f4699f498', - }, - }; - const expected = new ExchangeFillEvent(); - expected.contractAddress = '0x4f833a24e1f95d70f028921e27040ca56e09ab0b'; - expected.blockNumber = 6276262; - expected.logIndex = 102; - expected.rawData = - '0x000000000000000000000000f6da68519f78b0d0bc93c701e86affcb75c92428000000000000000000000000f6da68519f78b0d0bc93c701e86affcb75c92428000000000000000000000000000000000000000000000000002386f26fc10000000000000000000000000000000000000000000000000000016345785d8a000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000010000000000000000000000000000000000000000000000000000000000000001600000000000000000000000000000000000000000000000000000000000000024f47261b0000000000000000000000000c02aaa39b223fe8d0a0e5c4f27ead9083c756cc2000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000024f47261b0000000000000000000000000e41d2489571d322189246dafa5ebde1f4699f49800000000000000000000000000000000000000000000000000000000'; - expected.transactionHash = '0x6dd106d002873746072fc5e496dd0fb2541b68c77bcf9184ae19a42fd33657fe'; - expected.makerAddress = '0xf6da68519f78b0d0bc93c701e86affcb75c92428'; - expected.takerAddress = '0xf6da68519f78b0d0bc93c701e86affcb75c92428'; - expected.feeRecipientAddress = '0xc370d2a5920344aa6b7d8d11250e3e861434cbdd'; - expected.senderAddress = '0xf6da68519f78b0d0bc93c701e86affcb75c92428'; - expected.makerAssetFilledAmount = new BigNumber('10000000000000000'); - 
expected.takerAssetFilledAmount = new BigNumber('100000000000000000'); - expected.makerFeePaid = new BigNumber('0'); - expected.takerFeePaid = new BigNumber('12345'); - expected.orderHash = '0xab12ed2cbaa5615ab690b9da75a46e53ddfcf3f1a68655b5fe0d94c75a1aac4a'; - expected.rawMakerAssetData = '0xf47261b0000000000000000000000000c02aaa39b223fe8d0a0e5c4f27ead9083c756cc2'; - expected.makerAssetType = AssetType.ERC20; - expected.makerAssetProxyId = '0xf47261b0'; - expected.makerTokenAddress = '0xc02aaa39b223fe8d0a0e5c4f27ead9083c756cc2'; - expected.makerTokenId = null; - expected.rawTakerAssetData = '0xf47261b0000000000000000000000000e41d2489571d322189246dafa5ebde1f4699f498'; - expected.takerAssetType = AssetType.ERC20; - expected.takerAssetProxyId = '0xf47261b0'; - expected.takerTokenAddress = '0xe41d2489571d322189246dafa5ebde1f4699f498'; - expected.takerTokenId = null; - const actual = _convertToExchangeFillEvent(input); - expect(actual).deep.equal(expected); - }); - }); -}); diff --git a/packages/pipeline/test/parsers/idex_orders/index_test.ts b/packages/pipeline/test/parsers/idex_orders/index_test.ts deleted file mode 100644 index 48b019732..000000000 --- a/packages/pipeline/test/parsers/idex_orders/index_test.ts +++ /dev/null @@ -1,87 +0,0 @@ -import { BigNumber } from '@0x/utils'; -import * as chai from 'chai'; -import 'mocha'; - -import { IdexOrderParam } from '../../../src/data_sources/idex'; -import { TokenOrderbookSnapshot as TokenOrder } from '../../../src/entities'; -import { parseIdexOrder } from '../../../src/parsers/idex_orders'; -import { OrderType } from '../../../src/types'; -import { chaiSetup } from '../../utils/chai_setup'; - -chaiSetup.configure(); -const expect = chai.expect; - -// tslint:disable:custom-no-magic-numbers -describe('idex_orders', () => { - describe('parseIdexOrder', () => { - // for market listed as 'DEF_ABC'. 
- it('correctly converts bid type idexOrder to TokenOrder entity', () => { - const idexOrder: [string, BigNumber] = ['0.5', new BigNumber(10)]; - const idexOrderParam: IdexOrderParam = { - tokenBuy: '0x0000000000000000000000000000000000000000', - buySymbol: 'ABC', - buyPrecision: 2, - amountBuy: '10', - tokenSell: '0xb45df06e38540a675fdb5b598abf2c0dbe9d6b81', - sellSymbol: 'DEF', - sellPrecision: 2, - amountSell: '5', - expires: Date.now() + 100000, - nonce: 1, - user: '0x212345667543456435324564345643453453333', - }; - const observedTimestamp: number = Date.now(); - const orderType: OrderType = OrderType.Bid; - const source: string = 'idex'; - - const expected = new TokenOrder(); - expected.source = 'idex'; - expected.observedTimestamp = observedTimestamp; - expected.orderType = OrderType.Bid; - expected.price = new BigNumber(0.5); - expected.baseAssetSymbol = 'ABC'; - expected.baseAssetAddress = '0x0000000000000000000000000000000000000000'; - expected.baseVolume = new BigNumber(10); - expected.quoteAssetSymbol = 'DEF'; - expected.quoteAssetAddress = '0xb45df06e38540a675fdb5b598abf2c0dbe9d6b81'; - expected.quoteVolume = new BigNumber(5); - - const actual = parseIdexOrder(idexOrderParam, observedTimestamp, orderType, source, idexOrder); - expect(actual).deep.equal(expected); - }); - it('correctly converts ask type idexOrder to TokenOrder entity', () => { - const idexOrder: [string, BigNumber] = ['0.5', new BigNumber(10)]; - const idexOrderParam: IdexOrderParam = { - tokenBuy: '0xb45df06e38540a675fdb5b598abf2c0dbe9d6b81', - buySymbol: 'DEF', - buyPrecision: 2, - amountBuy: '5', - tokenSell: '0x0000000000000000000000000000000000000000', - sellSymbol: 'ABC', - sellPrecision: 2, - amountSell: '10', - expires: Date.now() + 100000, - nonce: 1, - user: '0x212345667543456435324564345643453453333', - }; - const observedTimestamp: number = Date.now(); - const orderType: OrderType = OrderType.Ask; - const source: string = 'idex'; - - const expected = new TokenOrder(); - expected.source = 'idex'; - expected.observedTimestamp = observedTimestamp; - expected.orderType = OrderType.Ask; - expected.price = new BigNumber(0.5); - expected.baseAssetSymbol = 'ABC'; - expected.baseAssetAddress = '0x0000000000000000000000000000000000000000'; - expected.baseVolume = new BigNumber(10); - expected.quoteAssetSymbol = 'DEF'; - expected.quoteAssetAddress = '0xb45df06e38540a675fdb5b598abf2c0dbe9d6b81'; - expected.quoteVolume = new BigNumber(5); - - const actual = parseIdexOrder(idexOrderParam, observedTimestamp, orderType, source, idexOrder); - expect(actual).deep.equal(expected); - }); - }); -}); diff --git a/packages/pipeline/test/parsers/oasis_orders/index_test.ts b/packages/pipeline/test/parsers/oasis_orders/index_test.ts deleted file mode 100644 index 401fedff8..000000000 --- a/packages/pipeline/test/parsers/oasis_orders/index_test.ts +++ /dev/null @@ -1,49 +0,0 @@ -import { BigNumber } from '@0x/utils'; -import * as chai from 'chai'; -import 'mocha'; - -import { OasisMarket } from '../../../src/data_sources/oasis'; -import { TokenOrderbookSnapshot as TokenOrder } from '../../../src/entities'; -import { parseOasisOrder } from '../../../src/parsers/oasis_orders'; -import { OrderType } from '../../../src/types'; -import { chaiSetup } from '../../utils/chai_setup'; - -chaiSetup.configure(); -const expect = chai.expect; - -// tslint:disable:custom-no-magic-numbers -describe('oasis_orders', () => { - describe('parseOasisOrder', () => { - it('converts oasisOrder to TokenOrder entity', () => { - const oasisOrder: 
[string, BigNumber] = ['0.5', new BigNumber(10)]; - const oasisMarket: OasisMarket = { - id: 'ABCDEF', - base: 'DEF', - quote: 'ABC', - buyVol: 100, - sellVol: 200, - price: 1, - high: 1, - low: 0, - }; - const observedTimestamp: number = Date.now(); - const orderType: OrderType = OrderType.Bid; - const source: string = 'oasis'; - - const expected = new TokenOrder(); - expected.source = 'oasis'; - expected.observedTimestamp = observedTimestamp; - expected.orderType = OrderType.Bid; - expected.price = new BigNumber(0.5); - expected.baseAssetSymbol = 'DEF'; - expected.baseAssetAddress = null; - expected.baseVolume = new BigNumber(10); - expected.quoteAssetSymbol = 'ABC'; - expected.quoteAssetAddress = null; - expected.quoteVolume = new BigNumber(5); - - const actual = parseOasisOrder(oasisMarket, observedTimestamp, orderType, source, oasisOrder); - expect(actual).deep.equal(expected); - }); - }); -}); diff --git a/packages/pipeline/test/parsers/ohlcv_external/crypto_compare_test.ts b/packages/pipeline/test/parsers/ohlcv_external/crypto_compare_test.ts deleted file mode 100644 index 118cafc5e..000000000 --- a/packages/pipeline/test/parsers/ohlcv_external/crypto_compare_test.ts +++ /dev/null @@ -1,62 +0,0 @@ -import * as chai from 'chai'; -import 'mocha'; -import * as R from 'ramda'; - -import { CryptoCompareOHLCVRecord } from '../../../src/data_sources/ohlcv_external/crypto_compare'; -import { OHLCVExternal } from '../../../src/entities'; -import { OHLCVMetadata, parseRecords } from '../../../src/parsers/ohlcv_external/crypto_compare'; -import { chaiSetup } from '../../utils/chai_setup'; - -chaiSetup.configure(); -const expect = chai.expect; - -// tslint:disable:custom-no-magic-numbers -describe('ohlcv_external parser (Crypto Compare)', () => { - describe('parseRecords', () => { - const record: CryptoCompareOHLCVRecord = { - time: 200, - close: 100, - high: 101, - low: 99, - open: 98, - volumefrom: 1234, - volumeto: 4321, - }; - - const metadata: OHLCVMetadata = { - fromSymbol: 'ETH', - toSymbol: 'ZRX', - exchange: 'CCCAGG', - source: 'CryptoCompare', - observedTimestamp: new Date().getTime(), - interval: 100000, - }; - - const entity = new OHLCVExternal(); - entity.exchange = metadata.exchange; - entity.fromSymbol = metadata.fromSymbol; - entity.toSymbol = metadata.toSymbol; - entity.startTime = 100000; - entity.endTime = 200000; - entity.open = record.open; - entity.close = record.close; - entity.low = record.low; - entity.high = record.high; - entity.volumeFrom = record.volumefrom; - entity.volumeTo = record.volumeto; - entity.source = metadata.source; - entity.observedTimestamp = metadata.observedTimestamp; - - it('converts Crypto Compare OHLCV records to OHLCVExternal entity', () => { - const input = [record, R.merge(record, { time: 300 }), R.merge(record, { time: 400 })]; - const expected = [ - entity, - R.merge(entity, { startTime: 200000, endTime: 300000 }), - R.merge(entity, { startTime: 300000, endTime: 400000 }), - ]; - - const actual = parseRecords(input, metadata); - expect(actual).deep.equal(expected); - }); - }); -}); diff --git a/packages/pipeline/test/parsers/paradex_orders/index_test.ts b/packages/pipeline/test/parsers/paradex_orders/index_test.ts deleted file mode 100644 index c5dd8751b..000000000 --- a/packages/pipeline/test/parsers/paradex_orders/index_test.ts +++ /dev/null @@ -1,54 +0,0 @@ -import { BigNumber } from '@0x/utils'; -import * as chai from 'chai'; -import 'mocha'; - -import { ParadexMarket, ParadexOrder } from '../../../src/data_sources/paradex'; -import { 
TokenOrderbookSnapshot as TokenOrder } from '../../../src/entities'; -import { parseParadexOrder } from '../../../src/parsers/paradex_orders'; -import { OrderType } from '../../../src/types'; -import { chaiSetup } from '../../utils/chai_setup'; - -chaiSetup.configure(); -const expect = chai.expect; - -// tslint:disable:custom-no-magic-numbers -describe('paradex_orders', () => { - describe('parseParadexOrder', () => { - it('converts ParadexOrder to TokenOrder entity', () => { - const paradexOrder: ParadexOrder = { - amount: '412', - price: '0.1245', - }; - const paradexMarket: ParadexMarket = { - id: '2', - symbol: 'ABC/DEF', - baseToken: 'DEF', - quoteToken: 'ABC', - minOrderSize: '0.1', - maxOrderSize: '1000', - priceMaxDecimals: 5, - amountMaxDecimals: 5, - baseTokenAddress: '0xb45df06e38540a675fdb5b598abf2c0dbe9d6b81', - quoteTokenAddress: '0x0000000000000000000000000000000000000000', - }; - const observedTimestamp: number = Date.now(); - const orderType: OrderType = OrderType.Bid; - const source: string = 'paradex'; - - const expected = new TokenOrder(); - expected.source = 'paradex'; - expected.observedTimestamp = observedTimestamp; - expected.orderType = OrderType.Bid; - expected.price = new BigNumber(0.1245); - expected.baseAssetSymbol = 'DEF'; - expected.baseAssetAddress = '0xb45df06e38540a675fdb5b598abf2c0dbe9d6b81'; - expected.baseVolume = new BigNumber(412); - expected.quoteAssetSymbol = 'ABC'; - expected.quoteAssetAddress = '0x0000000000000000000000000000000000000000'; - expected.quoteVolume = new BigNumber(412 * 0.1245); - - const actual = parseParadexOrder(paradexMarket, observedTimestamp, orderType, source, paradexOrder); - expect(actual).deep.equal(expected); - }); - }); -}); diff --git a/packages/pipeline/test/parsers/sra_orders/index_test.ts b/packages/pipeline/test/parsers/sra_orders/index_test.ts deleted file mode 100644 index 838171a72..000000000 --- a/packages/pipeline/test/parsers/sra_orders/index_test.ts +++ /dev/null @@ -1,69 +0,0 @@ -import { APIOrder } from '@0x/types'; -import { BigNumber } from '@0x/utils'; -import * as chai from 'chai'; -import 'mocha'; - -import { SraOrder } from '../../../src/entities'; -import { _convertToEntity } from '../../../src/parsers/sra_orders'; -import { AssetType } from '../../../src/types'; -import { chaiSetup } from '../../utils/chai_setup'; - -chaiSetup.configure(); -const expect = chai.expect; - -// tslint:disable:custom-no-magic-numbers -describe('sra_orders', () => { - describe('_convertToEntity', () => { - it('converts ApiOrder to SraOrder entity', () => { - const input: APIOrder = { - order: { - makerAddress: '0xb45df06e38540a675fdb5b598abf2c0dbe9d6b81', - takerAddress: '0x0000000000000000000000000000000000000000', - feeRecipientAddress: '0xa258b39954cef5cb142fd567a46cddb31a670124', - senderAddress: '0x0000000000000000000000000000000000000000', - makerAssetAmount: new BigNumber('1619310371000000000'), - takerAssetAmount: new BigNumber('8178335207070707070707'), - makerFee: new BigNumber('0'), - takerFee: new BigNumber('0'), - exchangeAddress: '0x4f833a24e1f95d70f028921e27040ca56e09ab0b', - expirationTimeSeconds: new BigNumber('1538529488'), - signature: - '0x1b5a5d672b0d647b5797387ccbb89d822d5d2e873346b014f4ff816ff0783f2a7a0d2824d2d7042ec8ea375bc7f870963e1cb8248f1db03ddf125e27b5963aa11f03', - salt: new BigNumber('1537924688891'), - makerAssetData: '0xf47261b0000000000000000000000000c02aaa39b223fe8d0a0e5c4f27ead9083c756cc2', - takerAssetData: '0xf47261b000000000000000000000000042d6622dece394b54999fbd73d108123806f6a18', - }, 
-                metaData: { isThisArbitraryData: true, powerLevel: 9001 },
-            };
-            const expected = new SraOrder();
-            expected.exchangeAddress = '0x4f833a24e1f95d70f028921e27040ca56e09ab0b';
-            expected.orderHashHex = '0x1bdbeb0d088a33da28b9ee6d94e8771452f90f4a69107da2fa75195d61b9a1c9';
-            expected.makerAddress = '0xb45df06e38540a675fdb5b598abf2c0dbe9d6b81';
-            expected.takerAddress = '0x0000000000000000000000000000000000000000';
-            expected.feeRecipientAddress = '0xa258b39954cef5cb142fd567a46cddb31a670124';
-            expected.senderAddress = '0x0000000000000000000000000000000000000000';
-            expected.makerAssetAmount = new BigNumber('1619310371000000000');
-            expected.takerAssetAmount = new BigNumber('8178335207070707070707');
-            expected.makerFee = new BigNumber('0');
-            expected.takerFee = new BigNumber('0');
-            expected.expirationTimeSeconds = new BigNumber('1538529488');
-            expected.salt = new BigNumber('1537924688891');
-            expected.signature =
-                '0x1b5a5d672b0d647b5797387ccbb89d822d5d2e873346b014f4ff816ff0783f2a7a0d2824d2d7042ec8ea375bc7f870963e1cb8248f1db03ddf125e27b5963aa11f03';
-            expected.rawMakerAssetData = '0xf47261b0000000000000000000000000c02aaa39b223fe8d0a0e5c4f27ead9083c756cc2';
-            expected.makerAssetType = AssetType.ERC20;
-            expected.makerAssetProxyId = '0xf47261b0';
-            expected.makerTokenAddress = '0xc02aaa39b223fe8d0a0e5c4f27ead9083c756cc2';
-            expected.makerTokenId = null;
-            expected.rawTakerAssetData = '0xf47261b000000000000000000000000042d6622dece394b54999fbd73d108123806f6a18';
-            expected.takerAssetType = AssetType.ERC20;
-            expected.takerAssetProxyId = '0xf47261b0';
-            expected.takerTokenAddress = '0x42d6622dece394b54999fbd73d108123806f6a18';
-            expected.takerTokenId = null;
-            expected.metadataJson = '{"isThisArbitraryData":true,"powerLevel":9001}';
-
-            const actual = _convertToEntity(input);
-            expect(actual).deep.equal(expected);
-        });
-    });
-});
diff --git a/packages/pipeline/test/parsers/utils/index_test.ts b/packages/pipeline/test/parsers/utils/index_test.ts
deleted file mode 100644
index 5a0d0f182..000000000
--- a/packages/pipeline/test/parsers/utils/index_test.ts
+++ /dev/null
@@ -1,30 +0,0 @@
-import { BigNumber } from '@0x/utils';
-import * as chai from 'chai';
-import 'mocha';
-
-import { aggregateOrders, GenericRawOrder } from '../../../src/parsers/utils';
-import { chaiSetup } from '../../utils/chai_setup';
-
-chaiSetup.configure();
-const expect = chai.expect;
-
-// tslint:disable:custom-no-magic-numbers
-describe('aggregateOrders', () => {
-    it('aggregates orders by price point', () => {
-        const input = [
-            { price: '1', amount: '20', orderHash: 'testtest', total: '20' },
-            { price: '1', amount: '30', orderHash: 'testone', total: '30' },
-            { price: '2', amount: '100', orderHash: 'testtwo', total: '200' },
-        ];
-        const expected = [['1', new BigNumber(50)], ['2', new BigNumber(100)]];
-        const actual = aggregateOrders(input);
-        expect(actual).deep.equal(expected);
-    });
-
-    it('handles empty orders gracefully', () => {
-        const input: GenericRawOrder[] = [];
-        const expected: Array<[string, BigNumber]> = [];
-        const actual = aggregateOrders(input);
-        expect(actual).deep.equal(expected);
-    });
-});
diff --git a/packages/pipeline/test/utils/chai_setup.ts b/packages/pipeline/test/utils/chai_setup.ts
deleted file mode 100644
index 1a8733093..000000000
--- a/packages/pipeline/test/utils/chai_setup.ts
+++ /dev/null
@@ -1,13 +0,0 @@
-import * as chai from 'chai';
-import chaiAsPromised = require('chai-as-promised');
-import ChaiBigNumber = require('chai-bignumber');
-import * as dirtyChai from 'dirty-chai';
-
-export const chaiSetup = {
-    configure(): void {
-        chai.config.includeStack = true;
-        chai.use(ChaiBigNumber());
-        chai.use(dirtyChai);
-        chai.use(chaiAsPromised);
-    },
-};
diff --git a/packages/pipeline/tsconfig.json b/packages/pipeline/tsconfig.json
deleted file mode 100644
index 45e07374c..000000000
--- a/packages/pipeline/tsconfig.json
+++ /dev/null
@@ -1,18 +0,0 @@
-{
-    "extends": "../../tsconfig",
-    "compilerOptions": {
-        "outDir": "lib",
-        "rootDir": ".",
-        "emitDecoratorMetadata": true,
-        "experimentalDecorators": true,
-        "resolveJsonModule": true
-    },
-    "include": ["./src/**/*", "./test/**/*", "./migrations/**/*"],
-    "files": [
-        "./test/fixtures/copper/api_v1_activity_types.json",
-        "./test/fixtures/copper/api_v1_custom_field_definitions.json",
-        "./test/fixtures/copper/api_v1_list_activities.json",
-        "./test/fixtures/copper/api_v1_list_leads.json",
-        "./test/fixtures/copper/api_v1_list_opportunities.json"
-    ]
-}
diff --git a/packages/pipeline/tslint.json b/packages/pipeline/tslint.json
deleted file mode 100644
index dd9053357..000000000
--- a/packages/pipeline/tslint.json
+++ /dev/null
@@ -1,3 +0,0 @@
-{
-    "extends": ["@0x/tslint-config"]
-}
diff --git a/packages/pipeline/typedoc-tsconfig.json b/packages/pipeline/typedoc-tsconfig.json
deleted file mode 100644
index 8b0ff51c1..000000000
--- a/packages/pipeline/typedoc-tsconfig.json
+++ /dev/null
@@ -1,10 +0,0 @@
-{
-    "extends": "../../typedoc-tsconfig",
-    "compilerOptions": {
-        "outDir": "lib",
-        "rootDir": ".",
-        "emitDecoratorMetadata": true,
-        "experimentalDecorators": true
-    },
-    "include": ["./src/**/*", "./test/**/*", "./migrations/**/*"]
-}
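For anyone tracing what this deletion removed: the paradex_orders test above pins down the observable behaviour of `parseParadexOrder`. A minimal sketch consistent with those expectations follows; the `ParadexOrder`, `ParadexMarket`, and `TokenOrder` shapes here are simplified stand-ins for the deleted `src/types` and `src/entities` definitions (the real `TokenOrder` was an entity class, not a plain interface), so treat this as an illustration rather than the recovered implementation.

```typescript
import { BigNumber } from '@0x/utils';

// Simplified stand-ins for the deleted definitions (assumptions, not originals).
interface ParadexOrder {
    amount: string;
    price: string;
}
interface ParadexMarket {
    baseToken: string;
    quoteToken: string;
    baseTokenAddress?: string;
    quoteTokenAddress?: string;
}
enum OrderType {
    Bid = 'bid',
    Ask = 'ask',
}
interface TokenOrder {
    source: string;
    observedTimestamp: number;
    orderType: OrderType;
    price: BigNumber;
    baseAssetSymbol: string;
    baseAssetAddress?: string;
    baseVolume: BigNumber;
    quoteAssetSymbol: string;
    quoteAssetAddress?: string;
    quoteVolume: BigNumber;
}

// Converts one raw Paradex order plus its market metadata into a TokenOrder row:
// the base volume is the raw amount, and the quote volume is amount * price,
// matching the expectations in the deleted test above.
function parseParadexOrder(
    market: ParadexMarket,
    observedTimestamp: number,
    orderType: OrderType,
    source: string,
    order: ParadexOrder,
): TokenOrder {
    const price = new BigNumber(order.price);
    const amount = new BigNumber(order.amount);
    return {
        source,
        observedTimestamp,
        orderType,
        price,
        baseAssetSymbol: market.baseToken,
        baseAssetAddress: market.baseTokenAddress,
        baseVolume: amount,
        quoteAssetSymbol: market.quoteToken,
        quoteAssetAddress: market.quoteTokenAddress,
        quoteVolume: amount.times(price),
    };
}
```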
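Likewise, the aggregateOrders tests fix that function's contract: orders are bucketed by price point, amounts are summed per bucket, each bucket comes back as a `[price, totalAmount]` pair in first-seen order, and an empty input yields an empty array. Below is a sketch satisfying that contract, assuming a Map-based grouping; the actual deleted implementation may have differed.

```typescript
import { BigNumber } from '@0x/utils';

// Shape inferred from the test fixtures above (an assumption, not the original).
interface GenericRawOrder {
    price: string;
    amount: string;
    orderHash: string;
    total: string;
}

// Sums order amounts per price point. A Map preserves insertion order, so
// price levels come back in the order they were first seen, as the test expects.
function aggregateOrders(rawOrders: GenericRawOrder[]): Array<[string, BigNumber]> {
    const amountByPrice = new Map<string, BigNumber>();
    for (const order of rawOrders) {
        const seen = amountByPrice.get(order.price) || new BigNumber(0);
        amountByPrice.set(order.price, seen.plus(order.amount));
    }
    return Array.from(amountByPrice.entries());
}
```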