Commit 02d1f16

kevjumba authored and adchia committed
chore: Update maintainer docs with up to date instructions (feast-dev#3042)
* Update
  Signed-off-by: Kevin Zhang <[email protected]>
* Update
  Signed-off-by: Kevin Zhang <[email protected]>
* Fix
  Signed-off-by: Kevin Zhang <[email protected]>
* Fix
  Signed-off-by: Kevin Zhang <[email protected]>
* revert
  Signed-off-by: Kevin Zhang <[email protected]>
1 parent a697fb4 · commit 02d1f16

File tree: 2 files changed (+12, -5 lines)

CONTRIBUTING.md (10 additions, 5 deletions)
@@ -199,7 +199,9 @@ To test across clouds, on top of setting up Redis, you also need GCP / AWS / Sno
 1. You can get free credits [here](https://cloud.google.com/free/docs/free-cloud-features#free-trial).
 2. You will need to setup a service account, enable the BigQuery API, and create a staging location for a bucket.
 * Setup your service account and project using steps 1-5 [here](https://codelabs.developers.google.com/codelabs/cloud-bigquery-python#0).
+* Remember to save your `PROJECT_ID` and your `key.json`. These will be your secrets that you will need to configure in Github actions. Namely, `secrets.GCP_PROJECT_ID` and `secrets.GCP_SA_KEY`. The `GCP_SA_KEY` value is the contents of your `key.json` file.
 * Follow these [instructions](https://cloud.google.com/storage/docs/creating-buckets) in your project to create a bucket for running GCP tests and remember to save the bucket name.
+* Make sure to add the service account email that you created in the previous step to the users that can access your bucket. Then, make sure to give the account the correct access roles, namely `objectCreator`, `objectViewer`, `objectAdmin`, and `admin`, so that your tests can use the bucket.
 3. Install the [Cloud SDK](https://cloud.google.com/sdk/docs/install).
 4. Login to gcloud if you haven't already:
 ```
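
The added bullets above describe granting bucket roles to the service account and wiring up the `GCP_PROJECT_ID` / `GCP_SA_KEY` secrets, but do not show commands. A minimal sketch follows, assuming a hypothetical bucket `my-feast-ci-bucket`, a service account `feast-ci@my-project.iam.gserviceaccount.com`, and the GitHub CLI for setting fork secrets; none of these names come from the diff.

```sh
# Placeholders only: substitute your own bucket, project ID, and service account email.
BUCKET='gs://my-feast-ci-bucket'
SA_EMAIL='feast-ci@my-project.iam.gserviceaccount.com'

# Grant the roles named above on the test bucket to the service account.
gsutil iam ch "serviceAccount:${SA_EMAIL}:objectCreator,objectViewer,objectAdmin,admin" "${BUCKET}"

# Store the secrets the workflow expects (run inside a clone of your fork,
# with the GitHub CLI authenticated).
gh secret set GCP_PROJECT_ID --body 'my-project'
gh secret set GCP_SA_KEY < key.json
```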
@@ -222,7 +224,7 @@ To test across clouds, on top of setting up Redis, you also need GCP / AWS / Sno

 Your active configuration is: [default]
 ```
-7. Export GCP specific environment variables. Namely,
+7. Export GCP specific environment variables in your workflow. Namely,
 ```sh
 export GCS_REGION='[your gcs region e.g US]'
 export GCS_STAGING_LOCATION='[your gcs staging location]'
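
For reference, a filled-in version of the exports in the hunk above might look like the following; the region and staging bucket are illustrative values, not ones the guide prescribes.

```sh
# Example values only: use the region and staging bucket you created earlier.
export GCS_REGION='US'
export GCS_STAGING_LOCATION='gs://my-feast-ci-bucket/staging'
```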
@@ -254,7 +256,9 @@ export AWS_REGISTRY_PATH='[your aws registry path]'
 grant role accountadmin, sysadmin to user user2;
 ```
 * Also remember to save your [account name](https://docs.snowflake.com/en/user-guide/admin-account-identifier.html#:~:text=organization_name%20is%20the%20name%20of,your%20account%20within%20your%20organization), username, and role.
-3. Create a warehouse and database named `FEAST` with the schema `OFFLINE`.
+* Your account name can be found under
+3. Create Dashboard and add a Tile.
+4. Create a warehouse and database named `FEAST` with the schemas `OFFLINE` and `ONLINE`.
 ```sql
 create or replace warehouse feast_tests_wh with
 warehouse_size='MEDIUM' --set your warehouse size to whatever your budget allows--
@@ -265,9 +269,10 @@ export AWS_REGISTRY_PATH='[your aws registry path]'
 create or replace database FEAST;
 use database FEAST;
 create schema OFFLINE;
+create schema ONLINE;
 ```
-4. You will need to create a data unloading location(either on S3, GCP, or Azure). Detailed instructions [here](https://docs.snowflake.com/en/user-guide/data-unload-overview.html). You will need to save the storage export location and the storage export name.
-5. Then to run successfully, you'll need some environment variables setup:
+5. You will need to create a data unloading location(either on S3, GCP, or Azure). Detailed instructions [here](https://docs.snowflake.com/en/user-guide/data-unload-overview.html). You will need to save the storage export location and the storage export name. You will need to create a [storage integration ](https://docs.snowflake.com/en/sql-reference/sql/create-storage-integration.html) in your warehouse to make this work. Name this storage integration `FEAST_S3`.
+6. Then to run successfully, you'll need some environment variables setup:
 ```sh
 export SNOWFLAKE_CI_DEPLOYMENT='[your snowflake account name]'
 export SNOWFLAKE_CI_USER='[your snowflake username]'
@@ -277,7 +282,7 @@ export AWS_REGISTRY_PATH='[your aws registry path]'
 export BLOB_EXPORT_STORAGE_NAME='[your data unloading storage name]'
 export BLOB_EXPORT_URI='[your data unloading blob uri]`
 ```
-6. Once everything is setup, running snowflake integration tests should pass without failures.
+7. Once everything is setup, running snowflake integration tests should pass without failures.

 Note that for Snowflake / GCP / AWS, running `make test-python-integration` will create new temporary tables / datasets in your cloud storage tables.
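
The new step 5 above requires a storage integration named `FEAST_S3` but leaves the DDL to the linked Snowflake docs. A minimal sketch, assuming an S3 unload bucket and IAM role that are placeholders rather than values from this guide:

```sql
-- Placeholders only: substitute your own IAM role ARN and unload bucket/path.
create storage integration FEAST_S3
  type = external_stage
  storage_provider = 'S3'
  enabled = true
  storage_aws_role_arn = 'arn:aws:iam::123456789012:role/feast-ci-unload'
  storage_allowed_locations = ('s3://my-feast-unload-bucket/unload/');

-- Allow the role used by the tests to reference the integration.
grant usage on integration FEAST_S3 to role sysadmin;
```

The integration name and its allowed location are presumably what the `BLOB_EXPORT_STORAGE_NAME` and `BLOB_EXPORT_URI` exports in step 6 refer to, but verify against your own unloading setup.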

docs/project/maintainers.md (2 additions, 0 deletions)

@@ -55,3 +55,5 @@ Fork specific integration tests are run by the `fork_pr_integration_tests.yml_[p
 - Each test in Feast is parametrized by its offline and online store so we can filter out tests by name. The above command chooses only tests with BigQuery that do not use Dynamo or Redshift.

 5. Everytime a pull request or a change to a pull request is made, the integration tests, the local integration tests, the unit tests, and the linter should run.
+
+> Sample fork setups can be found here: [snowflake](https://github.com/kevjumba/feast/pull/30) and [bigquery](https://github.com/kevjumba/feast/pull/31).
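
The parametrization note above (filtering tests by offline / online store name) can be made concrete with a hedged example; the test path, `--integration` flag, and test-ID substrings below are assumptions about the Feast test suite rather than values taken from this diff.

```sh
# Hypothetical fork invocation: pytest's -k expression matches parametrized test
# IDs by substring, so a BigQuery-only fork skips Dynamo and Redshift variants.
pytest sdk/python/tests --integration -k "Bigquery and not dynamo and not Redshift"
```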
