Implementation of an ELT pipeline using dbt and Snowflake. Features the following:
- Type 2 slowly changing dimensions (SCDs)
- Data transformations
- Common table expressions (CTEs)
- dbt models
- dbt materializations
- dbt tests
- dbt documentation
- dbt sources, seeds, and snapshots
- dbt hooks and operations
- Jinja and macros (see the macro sketch below)
- dbt packages
- Analyses and exposures
- Data visualization using Preset BI
- Debugging tests in dbt
Built with:
- Windows 10
- Snowflake
- Python
- dbt
- Preset (business intelligence)
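dbt's Jinja macros generate SQL programmatically. As a minimal sketch (the macro name and body are illustrative, not taken from this repo), a macro that builds a NULL check across every column of a relation could look like this:

```sql
-- macros/no_nulls_in_columns.sql (hypothetical macro)
{% macro no_nulls_in_columns(model) %}
    SELECT * FROM {{ model }} WHERE
    {% for col in adapter.get_columns_in_relation(model) -%}
        {{ col.column }} IS NULL OR
    {% endfor %}
    FALSE
{% endmacro %}
```

A singular test could then call `{{ no_nulls_in_columns(ref('dim_listings_with_hosts')) }}`; the test fails if the generated query returns any rows.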
```bash
# runs the SQL SELECT statements in the models using a materialization strategy
dbt run
```
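Each model is a single SELECT statement that dbt wraps in DDL according to its configured materialization. The following is a hedged sketch rather than this project's actual code: the `src_listings` model, the column names, and the `REPORTER` role in the post-hook are all assumptions.

```sql
-- models/dim/dim_listings_cleansed.sql (sketch; names and columns are assumed)
{{ config(
    materialized='view',
    post_hook=["GRANT SELECT ON {{ this }} TO ROLE REPORTER"]  -- dbt hook; role name is hypothetical
) }}

WITH src_listings AS (
    -- CTE over an upstream staging model
    SELECT * FROM {{ ref('src_listings') }}
)
SELECT
    listing_id,
    listing_name,
    room_type,
    -- example transformation: treat a raw value of 0 as a one-night minimum
    CASE WHEN minimum_nights = 0 THEN 1 ELSE minimum_nights END AS minimum_nights
FROM src_listings
```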
```bash
# checks the database connection and the validity of the project file, profile, and dependencies
dbt debug
```
```bash
# runs tests defined on models, sources, snapshots, and seeds
dbt test

# run tests on a single model (dbt test --select <model_name>)
dbt test --select dim_listings_with_hosts

# run tests on a source
dbt test --select source:airbnb.listings
```
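Generic tests (`unique`, `not_null`, etc.) are declared in YAML, while singular tests are plain SQL files under `tests/` that fail when they return rows. A hedged sketch (the model and column names are assumptions):

```sql
-- tests/dim_listings_minimum_nights.sql (hypothetical singular test)
-- The test fails if this query returns any rows.
SELECT *
FROM {{ ref('dim_listings_cleansed') }}
WHERE minimum_nights < 1
```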
```bash
# rebuild incremental models
dbt run --full-refresh

# rebuild a single incremental model (fct_reviews)
dbt run --full-refresh --select fct_reviews
```
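`--full-refresh` only matters for models materialized as `incremental`, which normally process just the rows that arrived since the last run. A sketch of what `fct_reviews` might look like (the columns and the filter are assumptions):

```sql
-- models/fct/fct_reviews.sql (sketch; column names are assumed)
{{ config(
    materialized='incremental',
    on_schema_change='fail'
) }}

WITH src_reviews AS (
    SELECT * FROM {{ ref('src_reviews') }}
)
SELECT *
FROM src_reviews
WHERE review_text IS NOT NULL
{% if is_incremental() %}
  -- on incremental runs, only pick up reviews newer than what is already loaded
  AND review_date > (SELECT MAX(review_date) FROM {{ this }})
{% endif %}
```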
```bash
# load CSV files located in the seeds directory into Snowflake
dbt seed
```
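Seeds land as regular tables, so downstream models reference them with `ref()` exactly like models. A sketch, assuming a hypothetical `seed_full_moon_dates` seed with a `full_moon_date` column:

```sql
-- sketch: joining a seed to a model (seed and column names are assumptions)
SELECT
    r.*,
    CASE WHEN fm.full_moon_date IS NULL THEN 'not full moon' ELSE 'full moon' END AS is_full_moon
FROM {{ ref('fct_reviews') }} r
LEFT JOIN {{ ref('seed_full_moon_dates') }} fm
    ON TO_DATE(r.review_date) = fm.full_moon_date
```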
```bash
# generates executable SQL from source, model, test, and analysis files and stores it in the target/ directory
dbt compile
```
```bash
# queries source tables and checks their freshness
dbt source freshness
```
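Freshness thresholds (`warn_after`/`error_after` against a `loaded_at_field`) are declared on sources in YAML; models then read from those sources through the `source()` function. A sketch of the SQL side, using the `airbnb.listings` source referenced above (column names are assumptions):

```sql
-- models/src/src_listings.sql (sketch; columns are assumed)
WITH raw_listings AS (
    SELECT * FROM {{ source('airbnb', 'listings') }}
)
SELECT
    id AS listing_id,
    name AS listing_name
FROM raw_listings
```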
```bash
# executes the snapshots defined in the snapshots directory
dbt snapshot
```
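Snapshots are how this stack implements Type 2 SCDs: dbt adds `dbt_valid_from`/`dbt_valid_to` columns and writes a new row version whenever a tracked record changes. A hedged sketch (the target schema, unique key, and timestamp column are assumptions):

```sql
-- snapshots/scd_raw_listings.sql (sketch; names are assumed)
{% snapshot scd_raw_listings %}

{{
    config(
        target_schema='dev',
        unique_key='id',
        strategy='timestamp',
        updated_at='updated_at',
        invalidate_hard_deletes=True
    )
}}

SELECT * FROM {{ source('airbnb', 'listings') }}

{% endsnapshot %}
```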
```bash
# install the packages listed in packages.yml (deps = dependencies)
dbt deps
```
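Once a package such as dbt_utils is declared in `packages.yml` and installed, its macros are callable from any model. A sketch assuming dbt_utils >= 1.0 (the key columns are assumptions):

```sql
-- sketch: building a surrogate key with a dbt_utils macro
SELECT
    {{ dbt_utils.generate_surrogate_key(['listing_id', 'review_date', 'reviewer_name']) }} AS review_id,
    *
FROM {{ ref('src_reviews') }}
```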
```bash
# generate documentation
dbt docs generate

# serves the generated documentation locally on port 8080
dbt docs serve
```
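Short descriptions usually live in YAML, but longer ones can be written as Jinja docs blocks in a Markdown file (e.g., `models/docs.md`) and referenced from YAML via `doc()`. A sketch (the block name and wording are assumptions):

```jinja
{% docs dim_listings_cleansed__minimum_nights %}
Minimum number of nights required to rent out a listing.
A raw value of 0 is normalized to 1 during transformation.
{% enddocs %}
```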
Screenshots