
Commit f15cdd2

ReadMe rewrite (WIP)
1 parent 7326bd8 commit f15cdd2

File tree

1 file changed: +108 −70 lines

README.md

# Lambda Runtime Emulator for local debugging

This runtime emulator allows debugging AWS Lambda functions written in Rust locally while receiving the payload from AWS and sending the responses back, as if you were doing remote debugging inside the AWS environment.
## How it works

This project has two crates:
__Production configuration__

Consider this typical Lambda use case:

![function needing debugging](./img/lambda-debugger-usecase.png)
__Debugging configuration__

- the _proxy-lambda_ crate forwards Lambda requests and responses between AWS and your development machine in real time
- the _runtime-emulator_ crate provides the necessary APIs and exchanges payloads with _proxy-lambda_ to enable your Lambda function to run locally

![function debugged locally](./img/lambda-debugger-components.png)
## Getting started

__Initial setup:__

- clone this repository locally
- build for release or debug
- create _request_ and _response_ queues in SQS with IAM permissions

__Per Lambda function:__

- deploy _proxy-lambda_ in place of the function you need to debug
- run the emulator in a terminal on your local machine, either as a binary or with `cargo run`
- run your Lambda function locally with `cargo run`

Detailed instructions and code samples for the above steps are provided later in this document.

### Limitations
This emulator provides the API endpoints necessary for the Lambda function to run. It does not:

* constrain the environment, e.g. memory or execution time
* report errors back to AWS
* handle concurrent requests
* copy the entire set of env vars from AWS (see [runtime-emulator/env-template.sh](runtime-emulator/env-template.sh))
* support the X-Ray or Extensions APIs

If the local Lambda function sends back its response before other services time out (e.g. API Gateway times out after 30s), _proxy-lambda_ returns the response to the caller.
## Deployment in detail

### SQS configuration

Create `PROXY_LAMBDA_REQ` and `PROXY_LAMBDA_RESP` SQS queues.
You may use different names, but it is easier to stick with the defaults used throughout this document.

Recommended settings:

- **Queue type**: Standard
- **Maximum message size**: 256 KB
- **Default visibility timeout**: 10 seconds
- **Message retention period**: 1 hour
- **Receive message wait time**: 20 seconds
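The settings above can also be applied when creating the queues from the command line. A sketch using the AWS CLI, assuming the default queue names from this document (retention is expressed in seconds, the size limit in bytes):

```shell
# Sketch: create both queues with the recommended settings via the AWS CLI.
# Keys are the SQS attribute names; the aws call is skipped if the CLI is absent.
attrs='{"MaximumMessageSize":"262144","VisibilityTimeout":"10","MessageRetentionPeriod":"3600","ReceiveMessageWaitTimeSeconds":"20"}'

for q in PROXY_LAMBDA_REQ PROXY_LAMBDA_RESP; do
  if command -v aws >/dev/null 2>&1; then
    aws sqs create-queue --queue-name "$q" --attributes "$attrs"
  else
    echo "aws CLI not found; would create queue $q"
  fi
done
```

Creating the queues in the console with the values from the list above works just as well.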

This IAM policy grants _proxy-lambda_ access to the queues.
It assumes that you have sufficient privileges to access Lambda and SQS from your local machine.

Replace the _Principal_ and _Resource_ IDs with your values before adding this policy to the queue config.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::512295225992:role/lambda_basic"
      },
      "Action": [
        "sqs:DeleteMessage",
        "sqs:GetQueueAttributes",
        "sqs:ReceiveMessage",
        "sqs:SendMessage"
      ],
      "Resource": "arn:aws:sqs:us-east-1:512295225992:PROXY_LAMBDA_REQ"
    }
  ]
}
```

### Building and deploying _proxy-lambda_

The _proxy-lambda_ function should be deployed to AWS Lambda in place of the function you want to debug.

Replace the following values in the snippet with your own before running it from the project root:

- _target_ - the architecture of the Lambda function on AWS, e.g. `x86_64-unknown-linux-gnu`
- _region_ - the region of the Lambda function, e.g. `us-east-1`
- _name_ - the name of the Lambda function you want to replace with the proxy, e.g. `my-lambda`

```bash
target=x86_64-unknown-linux-gnu
region=us-east-1
name=my-lambda

cargo build --release --target $target
cp ./target/$target/release/proxy-lambda ./bootstrap && zip proxy.zip bootstrap && rm bootstrap
aws lambda update-function-code --region $region --function-name $name --zip-file fileb://proxy.zip
```
A deployed _proxy-lambda_ should return a _Config error_ if you run it with a test event from the console.

### Lambda environment variables

- `PROXY_LAMBDA_TRACING_LEVEL` - optional, default=INFO; use DEBUG for full Lambda logging or TRACE to go deeper into the dependencies
- `AWS_DEFAULT_REGION` or `AWS_REGION` - required, but normally pre-set by AWS
- `LAMBDA_PROXY_REQ_QUEUE_URL` - the queue URL for Lambda proxy requests, required
- `LAMBDA_PROXY_RESP_QUEUE_URL` - the queue URL for Lambda handler responses, optional

The proxy runs asynchronously if no `LAMBDA_PROXY_RESP_QUEUE_URL` is specified: it sends the request to `LAMBDA_PROXY_REQ_QUEUE_URL` and returns `OK` regardless of what happens at the remote handler's end.
This is useful for debugging asynchronous functions such as S3 event handlers.
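The sync/async switch can be pictured as follows; this is an illustrative sketch, not the actual _proxy-lambda_ source:

```rust
use std::env;

// Illustrative sketch of the mode selection described above
// (the function name and return values are hypothetical).
fn run_mode() -> &'static str {
    match env::var("LAMBDA_PROXY_RESP_QUEUE_URL") {
        // response queue configured: wait for a reply from the dev machine
        Ok(url) if !url.is_empty() => "sync",
        // no response queue: forward the request and return OK immediately
        _ => "async",
    }
}

fn main() {
    println!("proxy mode: {}", run_mode());
}
```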

## Debugging

Pre-requisites:

- _proxy-lambda_ was deployed and configured
- SQS queues were created with the appropriate access policies
- [runtime-emulator/env-minimal.sh](runtime-emulator/env-minimal.sh) was modified with your IDs
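The repository files are authoritative; as a rough idea, a hypothetical `env-minimal.sh` along these lines would export the region and queue URLs (the region and account ID here are placeholders taken from the examples in this document):

```shell
# Hypothetical sketch of env-minimal.sh - see runtime-emulator/env-template.sh
# in the repository for the authoritative list of variables.
export AWS_REGION=us-east-1
export LAMBDA_PROXY_REQ_QUEUE_URL=https://sqs.us-east-1.amazonaws.com/512295225992/PROXY_LAMBDA_REQ
export LAMBDA_PROXY_RESP_QUEUE_URL=https://sqs.us-east-1.amazonaws.com/512295225992/PROXY_LAMBDA_RESP
```

Source the script (e.g. `. env-minimal.sh`) rather than executing it so the variables persist in your shell session.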
__Launching the emulator:__

- source the modified [runtime-emulator/env-minimal.sh](runtime-emulator/env-minimal.sh) in a terminal window on your local machine
- start the _runtime-emulator_ in the same terminal window, either as a binary or with `cargo run`
- the emulator should inform you that it is listening on a certain port

__Launching the local lambda:__

- source the modified [runtime-emulator/env-minimal.sh](runtime-emulator/env-minimal.sh) in another terminal window
- start your lambda in that terminal window with `cargo run`
- the emulator will inform you that it is waiting for an incoming message from SQS

__Debugging:__

- trigger the event on AWS as part of your normal data flow, e.g. by a user action on a webpage
- the emulator should display the Lambda payload and forward it to your local lambda for processing
- debug
- successful responses are sent back to the caller if the response queue is configured

If the local lambda fails, terminates or panics, you can make changes to the code and run it again to reuse the same payload.
## Technical details

### SQS

If you are not familiar with [AWS SQS](https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/welcome.html), you may not know that messages [have to be explicitly deleted](https://docs.aws.amazon.com/AWSSimpleQueueService/latest/APIReference/API_DeleteMessage.html) from the queue. The request messages are deleted by the runtime emulator when the local handler returns a response. This allows re-running the handler if it fails before sending a response, which is a handy debugging feature. The response messages are deleted by _proxy-lambda_ as soon as they arrive.

It is possible for the response to arrive too late because either the Lambda runtime or the caller timed out. For example, the API Gateway wait is limited to 30s, while the Lambda function can be configured to wait for up to 15 minutes. Remember to check that all stale messages were deleted, and purge the queues via the console or AWS CLI if needed.

### Large payloads and data compression

The _proxy-lambda_ function running on AWS and the emulator running on the dev machine send JSON payloads to SQS. The payload size is [limited to 262,144 bytes by SQS](https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/quotas-messages.html). To get around this limitation, the sender may compress the JSON payload using the [flate2 crate](https://crates.io/crates/flate2) and send it as a Base58-encoded string. The encoding and decoding happen automatically at both ends. Only large messages are compressed, because compression can take up to a few seconds in debug mode.
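The decision of when to compress can be sketched in a few std-only lines (the constant comes from the SQS quota above; the real pipeline would apply flate2 compression and Base58 encoding whenever this check fires):

```rust
// Std-only sketch of the size gate described above; function and constant
// names are illustrative, not taken from the project source.
const SQS_LIMIT_BYTES: usize = 262_144;

fn needs_compression(json_payload: &[u8]) -> bool {
    json_payload.len() > SQS_LIMIT_BYTES
}

fn main() {
    let small = br#"{"ok":true}"#;
    let large = vec![b'x'; SQS_LIMIT_BYTES + 1];
    // Only the oversized payload would be compressed and Base58-encoded.
    println!("small: {} large: {}", needs_compression(small), needs_compression(&large));
}
```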
