
Part 3 Chapter 15 request_batching_server.py #85

Closed
Vedant-R opened this issue Dec 30, 2021 · 2 comments

Vedant-R commented Dec 30, 2021

Hi,

This is an exceptional solution for async deployment and model serving.

I wanted to enquire whether there are examples of unit tests for it. I have been struggling to unit-test the Sanic async POST method, which keeps failing with the following error:

async with self.queue_lock:
AttributeError: __aenter__

The error is raised at these lines:

58 async with self.queue_lock:
59        if len(self.queue) >= MAX_QUEUE_SIZE:

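A likely cause (my reading of the symptom, not something confirmed in this thread): `self.queue_lock` is still `None` when the handler runs, because the startup code that normally creates the `asyncio.Lock` never executed under the test client. The `Runner` class below is a hypothetical minimal reproduction of that pattern, not the book's actual code:

```python
import asyncio

class Runner:
    def __init__(self):
        # Mirrors the suspected pattern: the lock starts as None and is
        # only created once the server's startup/background task runs.
        self.queue_lock = None

    async def handle(self):
        # If that setup never ran, this is `async with None`.
        async with self.queue_lock:
            pass

async def main():
    runner = Runner()
    try:
        await runner.handle()
    except (AttributeError, TypeError) as exc:
        # Python 3.10 and earlier raise AttributeError: __aenter__;
        # newer versions raise TypeError with a clearer message.
        return type(exc).__name__
    return None

err_name = asyncio.run(main())
print(err_name)
```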
The test I am trying:

import json

import pytest

from ..app import app

# INPUT and HEADERS are the request payload and headers
# (defined elsewhere in the test module).

@pytest.mark.asyncio
async def test_basic_post():
    # sanic-testing's asgi_client returns a (request, response) pair
    _, response = await app.asgi_client.post(
        "/predict", data=json.dumps(INPUT), headers=HEADERS
    )
    print(response)

Please do let me know if anyone has come across this and has a solution.
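One way around this (an assumption based on the symptom, not a fix confirmed by the thread; `Runner` and `handle` are hypothetical names) is to create the lock lazily on first use, so it exists even when the startup task that normally creates it has not run under the test client:

```python
import asyncio

class Runner:
    def __init__(self):
        self.queue = []
        self.queue_lock = None  # created later, inside the event loop

    async def handle(self, item):
        # Create the lock lazily, from inside the running event loop,
        # so the handler works even when the server-startup task that
        # normally creates it has not run (as can happen under a test
        # client that skips the lifespan phase).
        if self.queue_lock is None:
            self.queue_lock = asyncio.Lock()
        async with self.queue_lock:
            self.queue.append(item)
            return len(self.queue)

async def main():
    runner = Runner()
    return await runner.handle("sample")

queue_len = asyncio.run(main())
print(queue_len)  # → 1
```

Alternatively, if the lock is created in a `before_server_start` listener, check whether your test client actually runs the app's startup listeners; if it does, the lock will exist by the time the handler fires.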

@Vedant-R
Author

Got this done, thank you. Closing it

@AlexBlack2202

> Got this done, thank you. Closing it

Hello, I have the same issue as you. Can you show me how to fix it?
