
Commit 91932da: "update sql assessment"
1 parent: f9be1ef

File tree

3 files changed: +88 −52 lines

sql-assesment/SQL_Assessment.md renamed to sql-assesment/README.md

Lines changed: 37 additions & 52 deletions
The update replaces the old `/* ... */` SQL comment wrappers with Markdown bullets and blockquotes and drops the trailing "End Assignment" marker. The updated file reads:

The database contains two tables, store_revenue and marketing_data. Refer to the two CSV files, store_revenue and marketing_data, to understand how these tables have been created.

store_revenue contains revenue by date, brand ID, and location:

> create table store_revenue (
> id int not null primary key auto_increment,
> date datetime,
> brand_id int,
> store_location varchar(250),
> revenue float
> );

marketing_data contains ad impression and click data by date and location:

> create table marketing_data (
> id int not null primary key auto_increment,
> date datetime,
> geo varchar(2),
> impressions float,
> clicks float
> );

### Please provide a SQL statement under each question.

* Question #0 (Already done for you as an example)
Select the first 2 rows from the marketing data

> select *
> from marketing_data
> limit 2;

* Question #1
Generate a query to get the sum of the clicks of the marketing data
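
One possible answer, as an untested sketch (MySQL syntax is assumed throughout, matching the auto_increment in the DDL):

> select sum(clicks) as total_clicks
> from marketing_data;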

* Question #2
Generate a query to gather the sum of revenue by geo from the store_revenue table
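
Note that store_revenue has no geo column; the state has to be pulled out of store_location values such as "United States-CA". A possible sketch, assuming MySQL's substring_index:

> select substring_index(store_location, '-', -1) as geo,
> sum(revenue) as total_revenue
> from store_revenue
> group by geo;

MySQL permits grouping by a select alias; on stricter engines, repeat the substring_index expression in the group by clause.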

* Question #3
Merge these two datasets so we can see impressions, clicks, and revenue together by date
and geo.
Please ensure all records from each table are accounted for.
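
Keeping all records from both tables calls for a full outer join, which MySQL does not support directly; a common workaround is to union a left join with a right join. A possible sketch (untested; assumes MySQL 8+ so a CTE can avoid repeating the aggregation, and sums revenue across brands per date and geo):

> with s as (
> select date,
> substring_index(store_location, '-', -1) as geo,
> sum(revenue) as revenue
> from store_revenue
> group by date, geo
> )
> select coalesce(s.date, m.date) as date,
> coalesce(s.geo, m.geo) as geo,
> m.impressions,
> m.clicks,
> s.revenue
> from s
> left join marketing_data m on s.date = m.date and s.geo = m.geo
> union
> select coalesce(s.date, m.date),
> coalesce(s.geo, m.geo),
> m.impressions,
> m.clicks,
> s.revenue
> from s
> right join marketing_data m on s.date = m.date and s.geo = m.geo;

The union removes the rows that appear in both branches (the matched pairs), leaving one row per date/geo combination from either side.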

* Question #4
In your opinion, what is the most efficient store and why?
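
"Efficient" is left open to interpretation; one defensible reading is revenue per day for each store (brand plus location). A supporting query might look like this untested sketch:

> select brand_id,
> store_location,
> sum(revenue) as total_revenue,
> avg(revenue) as avg_daily_revenue
> from store_revenue
> group by brand_id, store_location
> order by avg_daily_revenue desc;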

* Question #5 (Challenge)
Generate a query to rank in order the top 10 revenue producing states
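
A possible sketch (untested; rank() requires MySQL 8+, and the state again comes from the tail of store_location):

> select substring_index(store_location, '-', -1) as state,
> sum(revenue) as total_revenue,
> rank() over (order by sum(revenue) desc) as revenue_rank
> from store_revenue
> group by state
> order by revenue_rank
> limit 10;

The window function is evaluated after grouping, so ranking on sum(revenue) per state is valid.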

sql-assesment/marketing_data.csv

Lines changed: 20 additions & 0 deletions
New file; columns match the DDL order (minus the auto_increment id): date, geo, impressions, clicks.

"2016-01-01","TX","2532","45"
"2016-01-01","CA","3425","63"
"2016-01-01","NY","3532","25"
"2016-01-01","MN","1342","784"
"2016-01-02","TX","3643","23"
"2016-01-02","CA","1354","53"
"2016-01-02","NY","4643","85"
"2016-01-02","MN","2366","85"
"2016-01-03","TX","2353","57"
"2016-01-03","CA","5258","36"
"2016-01-03","NY","4735","63"
"2016-01-03","MN","5783","87"
"2016-01-04","TX","5783","47"
"2016-01-04","CA","7854","85"
"2016-01-04","NY","4754","36"
"2016-01-04","MN","9345","24"
"2016-01-05","TX","2535","63"
"2016-01-05","CA","4678","73"
"2016-01-05","NY","2364","33"
"2016-01-05","MN","3452","25"

sql-assesment/store_revenue.csv

Lines changed: 31 additions & 0 deletions
New file; columns match the DDL order (minus the auto_increment id): date, brand_id, store_location, revenue.

"2016-01-01","1","United States-CA","100"
"2016-01-01","1","United States-TX","420"
"2016-01-01","1","United States-NY","142"
"2016-01-02","1","United States-CA","231"
"2016-01-02","1","United States-TX","2342"
"2016-01-02","1","United States-NY","232"
"2016-01-03","1","United States-CA","100"
"2016-01-03","1","United States-TX","420"
"2016-01-03","1","United States-NY","3245"
"2016-01-04","1","United States-CA","34"
"2016-01-04","1","United States-TX","3"
"2016-01-04","1","United States-NY","54"
"2016-01-05","1","United States-CA","45"
"2016-01-05","1","United States-TX","423"
"2016-01-05","1","United States-NY","234"
"2016-01-01","2","United States-CA","234"
"2016-01-01","2","United States-TX","234"
"2016-01-01","2","United States-NY","142"
"2016-01-02","2","United States-CA","234"
"2016-01-02","2","United States-TX","3423"
"2016-01-02","2","United States-NY","2342"
"2016-01-03","2","United States-CA","234234"
"2016-01-06","3","United States-TX","3"
"2016-01-03","2","United States-TX","3"
"2016-01-03","2","United States-NY","234"
"2016-01-04","2","United States-CA","2"
"2016-01-04","2","United States-TX","2354"
"2016-01-04","2","United States-NY","45235"
"2016-01-05","2","United States-CA","23"
"2016-01-05","2","United States-TX","4"
"2016-01-05","2","United States-NY","124"
