
Commit 28fff10

MrCloudSec and toniblyx authored
feat(S3_in_w_x_flags): Support S3 URIs for custom checks paths and whitelist files. (prowler-cloud#1090)
* feat(S3_in_w_x_flags): Support S3 URIs for custom checks paths and whitelist files.
* feat(S3_in_w_x_flags): README document was updated.
* Update README.md
* Update README.md
* Update README.md
* Update README.md

Co-authored-by: Toni de la Fuente <[email protected]>
Co-authored-by: Sergio Garcia Garcia
1 parent 07b2b0d commit 28fff10

4 files changed: +118 additions, -18 deletions


README.md

Lines changed: 10 additions & 3 deletions
@@ -292,11 +292,12 @@ Prowler has two parameters related to regions: `-r` that is used query AWS servi
 
 >Note about output formats to use with `-M`: "text" is the default one with colors, "mono" is like default one but monochrome, "csv" is comma separated values, "json" plain basic json (without comma between lines) and "json-asff" is also json with Amazon Security Finding Format that you can ship to Security Hub using `-S`.
 
-or save your report in an S3 bucket (this only works for text or mono. For csv, json or json-asff it has to be copied afterwards):
+To save your report in an S3 bucket, use `-B` to define a custom output bucket along with `-M` to define the output format that is going to be uploaded to S3:
 
 ```sh
-./prowler -M mono | aws s3 cp - s3://bucket-name/prowler-report.txt
+./prowler -M csv -B my-bucket/folder/
 ```
+>In the case you do not want to use the assumed role credentials but the initial credentials to put the reports into the S3 bucket, use `-D` instead of `-B`. Make sure that the used credentials have s3:PutObject permissions in the S3 path where the reports are going to be uploaded.
 
 When generating multiple formats and running using Docker, to retrieve the reports, bind a local directory to the container, e.g.:
 
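To make the `-B`/`-D` behaviour documented above concrete, here is a small usage sketch; the bucket name and prefix are placeholders, and the flag semantics are taken from the README text in this hunk:

```sh
# Upload the CSV report to S3 with the assumed-role credentials (-B, as documented above)
./prowler -M csv -B my-prowler-reports/reports/

# Same idea but using the initial credentials instead of the assumed-role ones (-D);
# whichever credentials are used need s3:PutObject on the destination path
./prowler -M json -D my-prowler-reports/reports/
```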
@@ -399,7 +400,10 @@ Prowler runs in GovCloud regions as well. To make sure it points to the right AP
 
 ### Custom folder for custom checks
 
-Flag `-x /my/own/checks` will include any check in that particular directory. To see how to write checks see [Add Custom Checks](#add-custom-checks) section.
+Flag `-x /my/own/checks` will include any check in that particular directory (files must start by check). To see how to write checks see [Add Custom Checks](#add-custom-checks) section.
+
+S3 URIs are also supported as custom folders for custom checks, e.g. `s3://bucket/prefix/checks`. Prowler will download the folder locally and run the checks as they are called with default execution,`-c` or `-g`.
+>Make sure that the used credentials have s3:GetObject permissions in the S3 path where the custom checks are located.
 
 ### Show or log only FAILs
 
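A brief usage sketch for the S3 form of `-x` described above (bucket, prefix, and check ID are placeholders): once downloaded, the checks behave like local ones, so they can run with the default execution, `-c`, or `-g`.

```sh
# Include custom checks from an S3 prefix and run the default execution
./prowler -x s3://my-bucket/prowler/custom-checks/

# Or run a single downloaded check by its (hypothetical) ID
./prowler -x s3://my-bucket/prowler/custom-checks/ -c check_mycustom
```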
@@ -488,6 +492,9 @@ Sometimes you may find resources that are intentionally configured in a certain
 ./prowler -w whitelist_sample.txt
 ```
 
+S3 URIs are also supported as allowlist file, e.g. `s3://bucket/prefix/allowlist_sample.txt`
+>Make sure that the used credentials have s3:GetObject permissions in the S3 path where the whitelist file is located.
+
 Whitelist option works along with other options and adds a `WARNING` instead of `INFO`, `PASS` or `FAIL` to any output format except for `json-asff`.
 
 ## How to fix every FAIL
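And the corresponding sketch for an S3-hosted allowlist via `-w` (bucket and key are placeholders); as the note above says, the credentials in use need s3:GetObject on the object:

```sh
# Download and pre-process the allowlist before the scan starts
./prowler -w s3://my-bucket/prowler/allowlist_sample.txt
```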
include/allowlist

Lines changed: 43 additions & 0 deletions
@@ -0,0 +1,43 @@
+#!/usr/bin/env bash
+
+# Prowler - the handy cloud security tool (copyright 2018) by Toni de la Fuente
+#
+# Licensed under the Apache License, Version 2.0 (the "License"); you may not
+# use this file except in compliance with the License. You may obtain a copy
+# of the License at http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software distributed
+# under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR
+# CONDITIONS OF ANY KIND, either express or implied. See the License for the
+# specific language governing permissions and limitations under the License.
+
+allowlist(){
+  # check if the file is an S3 URI
+  if grep -q -E "^s3://([^/]+)/(.*?([^/]+))$" <<< "$ALLOWLIST_FILE"; then
+    # download s3 object
+    local S3_ALLOWLIST_FILE=allowlist_s3_file.txt
+    echo -e "$NOTICE Downloading allowlist from S3 URI $ALLOWLIST_FILE ..."
+    if ! $AWSCLI s3 cp $ALLOWLIST_FILE $S3_ALLOWLIST_FILE $PROFILE_OPT > /dev/null 2>&1; then
+      echo "$BAD FAIL! Access Denied trying to download allowlist from the S3 URI, please make sure it is correct and/or you have permissions to get the S3 object.$NORMAL"
+      EXITCODE=1
+      exit $EXITCODE
+    fi
+    echo -e "$OK Success! Allowlist was downloaded, starting Prowler...$NORMAL"
+    # ignore lines starting with # (comments)
+    # ignore inline comments: check1:foo # inline comment
+    ALLOWLIST=$(awk '!/^[[:space:]]*#/{print }' <(cat "$S3_ALLOWLIST_FILE") | sed 's/[[:space:]]*#.*$//g')
+    # remove temporary file
+    rm -f "$S3_ALLOWLIST_FILE"
+  else
+    # Check if input allowlist file exists
+    if [[ -f "$ALLOWLIST_FILE" ]]; then
+      # ignore lines starting with # (comments)
+      # ignore inline comments: check1:foo # inline comment
+      ALLOWLIST=$(awk '!/^[[:space:]]*#/{print }' <(cat "$ALLOWLIST_FILE") | sed 's/[[:space:]]*#.*$//g')
+    else
+      echo "$BAD FAIL! $ALLOWLIST_FILE does not exist, please input a valid allowlist file.$NORMAL"
+      EXITCODE=1
+      exit $EXITCODE
+    fi
+  fi
+}
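The comment handling in `allowlist()` can be tried on its own; the following standalone sketch uses made-up allowlist entries and the same awk/sed filter as the function above to show that full-line comments are dropped and inline comments are trimmed.

```sh
#!/usr/bin/env bash
# Made-up allowlist content in the check:resource style mentioned in the comments above
cat > /tmp/allowlist_demo.txt <<'EOF'
# a full-line comment, removed by the awk step
check1:foo
check2:bar  # an inline comment, removed by the sed step
EOF

# Same filter as allowlist(): drop comment-only lines, then strip trailing inline comments
ALLOWLIST=$(awk '!/^[[:space:]]*#/{print }' <(cat /tmp/allowlist_demo.txt) | sed 's/[[:space:]]*#.*$//g')
echo "$ALLOWLIST"
# Expected output:
# check1:foo
# check2:bar
```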

include/custom_checks

Lines changed: 52 additions & 0 deletions
@@ -0,0 +1,52 @@
+#!/usr/bin/env bash
+
+# Prowler - the handy cloud security tool (copyright 2018) by Toni de la Fuente
+#
+# Licensed under the Apache License, Version 2.0 (the "License"); you may not
+# use this file except in compliance with the License. You may obtain a copy
+# of the License at http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software distributed
+# under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR
+# CONDITIONS OF ANY KIND, either express or implied. See the License for the
+# specific language governing permissions and limitations under the License.
+
+custom_checks(){
+  # check if the path is an S3 URI
+  if grep -q -E "^s3://([^/]+)/?(.*?([^/]+)/?)?$" <<< "$EXTERNAL_CHECKS_PATH"; then
+    if grep -q "check*" <<< "$("${AWSCLI}" s3 ls "${EXTERNAL_CHECKS_PATH}" $PROFILE_OPT)"; then
+      # download s3 object
+      echo -e "$NOTICE Downloading custom checks from S3 URI $EXTERNAL_CHECKS_PATH...$NORMAL"
+      S3_CHECKS_TEMP_FOLDER="$PROWLER_DIR/s3-custom-checks"
+      mkdir "${S3_CHECKS_TEMP_FOLDER}"
+      $AWSCLI s3 sync "$EXTERNAL_CHECKS_PATH" "${S3_CHECKS_TEMP_FOLDER}" $PROFILE_OPT > /dev/null
+      # verify if there are checks
+      for checks in "${S3_CHECKS_TEMP_FOLDER}"/check*; do
+        . "$checks"
+        echo -e "$OK Check $(basename "$checks") was included!$NORMAL"
+      done
+      echo -e "$OK Success! Custom checks were downloaded and included, starting Prowler...$NORMAL"
+      # remove temporary dir
+      rm -rf "${S3_CHECKS_TEMP_FOLDER}"
+    else
+      echo "$BAD FAIL! Access Denied trying to download custom checks or $EXTERNAL_CHECKS_PATH does not contain any checks, please make sure it is correct and/or you have permissions to get the S3 objects.$NORMAL"
+      EXITCODE=1
+      # remove temporary dir
+      rm -rf "${S3_CHECKS_TEMP_FOLDER}"
+      exit $EXITCODE
+    fi
+  else
+    # verify if input directory exists with checks
+    if ls "${EXTERNAL_CHECKS_PATH}"/check* > /dev/null 2>&1; then
+      for checks in "${EXTERNAL_CHECKS_PATH}"/check*; do
+        . "$checks"
+        echo -e "$OK Check $(basename "$checks") was included!$NORMAL"
+      done
+      echo -e "$OK Success! Custom checks were included, starting Prowler...$NORMAL"
+    else
+      echo "$BAD FAIL! $EXTERNAL_CHECKS_PATH does not exist or not contain checks, please input a valid custom checks path.$NORMAL"
+      EXITCODE=1
+      exit $EXITCODE
+    fi
+  fi
+}
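For context on how `custom_checks()` above gets exercised end to end, a hedged workflow sketch follows; the bucket, prefix, and check file name are placeholders, and the check itself is assumed to be written as described in the README's Add Custom Checks section.

```sh
# 1. Publish a local folder of custom checks to S3; file names must start with "check"
#    so they match the check* pattern the function looks for
aws s3 sync /path/to/my-custom-checks s3://my-bucket/prowler/custom-checks/

# 2. Point Prowler at the S3 prefix: the function syncs it into $PROWLER_DIR/s3-custom-checks,
#    sources every check* file, and removes the temporary folder afterwards
./prowler -x s3://my-bucket/prowler/custom-checks/ -c check_mycustom
```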

prowler

Lines changed: 13 additions & 15 deletions
@@ -43,7 +43,7 @@ FAILED_CHECK_FAILED_SCAN=1
 PROWLER_START_TIME=$( date -u +"%Y-%m-%dT%H:%M:%S%z" )
 TITLE_ID=""
 TITLE_TEXT="CALLER ERROR - UNSET TITLE"
-WHITELIST_FILE=""
+ALLOWLIST_FILE=""
 TOTAL_CHECKS=()
 
 # Ensures command output will always be set to JSON.
@@ -88,8 +88,8 @@ USAGE:
    -s Show scoring report (it is included by default in the html report).
    -S Send check output to AWS Security Hub. Only valid when the output mode is json-asff
       (i.e. "-M json-asff -S").
-   -x Specify external directory with custom checks
-      (i.e. /my/own/checks, files must start by "check").
+   -x Specify external directory with custom checks. S3 URI is supported.
+      (i.e. /my/own/checks or s3://bucket/prefix/checks, files must start by "check").
    -q Get only FAIL findings, will show WARNINGS when a resource is excluded.
    -A Account id for the account where to assume a role, requires -R.
       (i.e.: 123456789012)
@@ -98,8 +98,8 @@ USAGE:
    -T Session duration given to that role credentials in seconds, default 1h (3600) recommended 12h, optional with -R and -A.
       (i.e.: 43200)
    -I External ID to be used when assuming roles (not mandatory), requires -A and -R.
-   -w Whitelist file. See whitelist_sample.txt for reference and format.
-      (i.e.: whitelist_sample.txt)
+   -w Allowlist file. See allowlist_sample.txt for reference and format. S3 URI is supported.
+      (i.e.: allowlist_sample.txt or s3://bucket/prefix/allowlist_sample.txt)
    -N <shodan_api_key> Shodan API key used by check extra7102.
    -o Custom output directory, if not specified will use default prowler/output, requires -M <mode>.
       (i.e.: -M csv -o /tmp/reports/)
@@ -201,7 +201,7 @@ while getopts ":hlLkqp:r:c:C:g:f:m:M:E:x:enbVsSI:A:R:T:w:N:o:B:D:F:zZ:O:" OPTION
     SESSION_DURATION_TO_ASSUME=$OPTARG
     ;;
   w )
-    WHITELIST_FILE=$OPTARG
+    ALLOWLIST_FILE=$OPTARG
     ;;
   N )
     SHODAN_API_KEY=$OPTARG
@@ -294,6 +294,8 @@ unset AWS_DEFAULT_OUTPUT
 . $PROWLER_DIR/include/securityhub_integration
 . $PROWLER_DIR/include/junit_integration
 . $PROWLER_DIR/include/organizations_metadata
+. $PROWLER_DIR/include/custom_checks
+. $PROWLER_DIR/include/allowlist
 
 # Parses the check file into CHECK_ID's.
 if [[ -n "$CHECK_FILE" ]]; then
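For readers unfamiliar with the include pattern used here: the two new `.` lines load `include/custom_checks` and `include/allowlist` into the running shell, which is what makes the `custom_checks` and `allowlist` functions callable later in the script. A tiny standalone illustration (the file name is made up):

```sh
#!/usr/bin/env bash
# A throwaway include that only defines a function
cat > /tmp/demo_include <<'EOF'
say_hello(){ echo "hello from a sourced function"; }
EOF

# Sourcing executes the file in the current shell, so the definition becomes available,
# just like ". $PROWLER_DIR/include/allowlist" does in the prowler script
. /tmp/demo_include
say_hello
```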
@@ -308,11 +310,9 @@ if [[ -n "$CHECK_FILE" ]]; then
   fi
 fi
 
-# Pre-process whitelist file if supplied
-if [[ -n "$WHITELIST_FILE" ]]; then
-  # ignore lines starting with # (comments)
-  # ignore inline comments: check1:foo # inline comment
-  WHITELIST="$(awk '!/^[[:space:]]*#/{print }' <(cat "$WHITELIST_FILE") | sed 's/[[:space:]]*#.*$//g')"
+# Pre-process allowlist file if supplied
+if [[ -n "$ALLOWLIST_FILE" ]]; then
+  allowlist
 fi
 
 # Load all of the groups of checks inside groups folder named as "groupNumber*"
@@ -328,9 +328,7 @@ done
 
 # include checks if external folder is specified
 if [[ $EXTERNAL_CHECKS_PATH ]]; then
-  for checks in $(ls $EXTERNAL_CHECKS_PATH/check*); do
-    . "$checks"
-  done
+  custom_checks
 fi
 
 # Get a list of total checks available by ID
@@ -462,7 +460,7 @@ execute_check() {
   # Generate the credential report, only if it is group1 related which checks we
   # run so that the checks can safely assume it's available
   # set the custom ignores list for this check
-  ignores="$(awk "/${1}/{print}" <(echo "${WHITELIST}"))"
+  ignores="$(awk "/${1}/{print}" <(echo "${ALLOWLIST}"))"
 
   if [ ${alternate_name} ];then
     if [[ ${alternate_name} == check1* || ${alternate_name} == extra71 || ${alternate_name} == extra774 || ${alternate_name} == extra7123 ]];then
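The renamed `ignores` assignment above narrows the pre-processed allowlist to the entries that mention the current check ID (passed in as `$1`). A standalone sketch of that awk call, with hypothetical allowlist entries:

```sh
#!/usr/bin/env bash
# Hypothetical pre-processed allowlist, one entry per line, as allowlist() would produce
ALLOWLIST='check11:bucket-one
check12:some-trail
check11:bucket-two'

CHECK_ID="check11"
# Same pattern as the prowler script: keep only the lines matching the current check ID
ignores="$(awk "/${CHECK_ID}/{print}" <(echo "${ALLOWLIST}"))"
echo "$ignores"
# Expected output:
# check11:bucket-one
# check11:bucket-two
```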
