CONTRIBUTING.md: 10 additions & 5 deletions

@@ -199,7 +199,9 @@ To test across clouds, on top of setting up Redis, you also need GCP / AWS / Sno
1. You can get free credits [here](https://cloud.google.com/free/docs/free-cloud-features#free-trial).
2. You will need to set up a service account, enable the BigQuery API, and create a bucket to use as a staging location.
   * Set up your service account and project using steps 1-5 [here](https://codelabs.developers.google.com/codelabs/cloud-bigquery-python#0).
   * Remember to save your `PROJECT_ID` and your `key.json`. These are the secrets you will need to configure in GitHub Actions, namely `secrets.GCP_PROJECT_ID` and `secrets.GCP_SA_KEY`. The `GCP_SA_KEY` value is the contents of your `key.json` file.
   * Follow these [instructions](https://cloud.google.com/storage/docs/creating-buckets) in your project to create a bucket for running GCP tests, and remember to save the bucket name.
   * Make sure to add the service account email that you created in the previous step to the users that can access your bucket, and give the account the correct access roles, namely `objectCreator`, `objectViewer`, `objectAdmin`, and `admin`, so that your tests can use the bucket (see the sketch after these bullets).
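   As a rough sketch of the two previous bullets (placeholder names, not the project's official commands), the bucket can be created and the roles granted with `gsutil`:

   ```
   # Placeholders: replace the bucket name and service-account email with your own.
   gsutil mb gs://my-feast-gcp-test-bucket
   for role in objectCreator objectViewer objectAdmin admin; do
     gsutil iam ch "serviceAccount:feast-ci@my-project.iam.gserviceaccount.com:${role}" gs://my-feast-gcp-test-bucket
   done
   ```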
3. Install the [Cloud SDK](https://cloud.google.com/sdk/docs/install).
4. Log in to gcloud if you haven't already:
```
Your active configuration is: [default]
```
7. Export GCP-specific environment variables in your workflow. Namely,
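   The variable list itself is cut off in this diff. As one hedged illustration of how a CI shell step might consume the secrets saved earlier (the variable names and file paths here are assumptions, not Feast's actual workflow):

   ```
   # Assumes the workflow exposes the GitHub secrets as environment variables
   # named GCP_SA_KEY and GCP_PROJECT_ID (illustrative names only).
   echo "${GCP_SA_KEY}" > key.json
   gcloud auth activate-service-account --key-file=key.json
   gcloud config set project "${GCP_PROJECT_ID}"
   export GOOGLE_APPLICATION_CREDENTIALS="$PWD/key.json"
   ```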
…

   * Also remember to save your [account name](https://docs.snowflake.com/en/user-guide/admin-account-identifier.html#:~:text=organization_name%20is%20the%20name%20of,your%20account%20within%20your%20organization), username, and role.
   * Your account name can be found under
3. Create a Dashboard and add a Tile.
4. Create a warehouse and database named `FEAST` with the schemas `OFFLINE` and `ONLINE`.
   ```sql
   create or replace warehouse feast_tests_wh with
   warehouse_size='MEDIUM' -- set your warehouse size to whatever your budget allows
   ...
   ```
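   The rest of that block is cut off in this diff. As a rough sketch only (not the repository's exact commands, and assuming the `snowsql` CLI), the `FEAST` database and its `OFFLINE` and `ONLINE` schemas could be created like so, with placeholder account and user values:

   ```
   # Placeholders: supply your own account identifier and username.
   snowsql -a <account_name> -u <username> -q "
     CREATE DATABASE IF NOT EXISTS FEAST;
     CREATE SCHEMA IF NOT EXISTS FEAST.OFFLINE;
     CREATE SCHEMA IF NOT EXISTS FEAST.ONLINE;"
   ```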
5. You will need to create a data unloading location (either on S3, GCP, or Azure). Detailed instructions are [here](https://docs.snowflake.com/en/user-guide/data-unload-overview.html). You will need to save the storage export location and the storage export name. You will also need to create a [storage integration](https://docs.snowflake.com/en/sql-reference/sql/create-storage-integration.html) in your warehouse to make this work; name this storage integration `FEAST_S3` (see the sketch below).
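   A hedged sketch of the storage-integration step for the S3 case; the role ARN and bucket are placeholders, the repository's exact setup may differ, and `CREATE STORAGE INTEGRATION` generally requires the `ACCOUNTADMIN` role:

   ```
   # Placeholders: replace the account, user, IAM role ARN, and bucket with your own.
   snowsql -a <account_name> -u <username> -q "
     CREATE STORAGE INTEGRATION FEAST_S3
       TYPE = EXTERNAL_STAGE
       STORAGE_PROVIDER = 'S3'
       ENABLED = TRUE
       STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/feast-unload-role'
       STORAGE_ALLOWED_LOCATIONS = ('s3://my-feast-unload-bucket/');"
   ```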
6. Then, to run successfully, you'll need to set some environment variables:

docs/project/maintainers.md: 2 additions & 0 deletions

@@ -55,3 +55,5 @@ Fork specific integration tests are run by the `fork_pr_integration_tests.yml_[p
- Each test in Feast is parametrized by its offline and online store so we can filter out tests by name. The above command chooses only tests with BigQuery that do not use Dynamo or Redshift.
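  For illustration only (the actual command lives in the fork's workflow file and may use different paths and flags), a name-based filter of that shape with `pytest` could look like:

  ```
  # Hypothetical invocation: keep BigQuery-parametrized tests, skip any that use Dynamo or Redshift.
  pytest sdk/python/tests -k "BigQuery and not dynamo and not Redshift"
  ```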
5. Every time a pull request is opened or updated, the integration tests, the local integration tests, the unit tests, and the linter should run.
> Sample fork setups can be found here: [snowflake](https://github.com/kevjumba/feast/pull/30) and [bigquery](https://github.com/kevjumba/feast/pull/31).