
h2/h3 trailing header support. Fixes #147 (#162)

Closed · wants to merge 1 commit

Conversation

jeffsawatzky

This is an initial implementation of trailing header support for HTTP/2 and HTTP/3. In theory, this fixes #147.

Here's the thing, though: I have no clue what I am doing. I would like some tips on how to do the following:

  1. Ensure that I am following the asgi spec properly
  2. Ensure that I am following the HTTP/2 and HTTP/3 specs properly

I assume the underlying h2 and h3 libraries take care of (2) for me, but I'm not sure.
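For reference, the ASGI spec handles this via the `http.response.trailers` extension: the server advertises it in `scope["extensions"]`, the app sets `"trailers": True` on `http.response.start`, and sends a final `http.response.trailers` message after the body. A minimal sketch of that message flow (toy in-memory `send`/`receive`, not hypercorn itself):

```python
import asyncio

async def app(scope, receive, send):
    # Only send trailers if the server advertised the ASGI trailers extension.
    supports_trailers = "http.response.trailers" in scope.get("extensions", {})
    await send({
        "type": "http.response.start",
        "status": 200,
        "headers": [(b"content-type", b"text/plain")],
        "trailers": supports_trailers,  # declare that trailers will follow
    })
    await send({"type": "http.response.body", "body": b"hello", "more_body": False})
    if supports_trailers:
        await send({
            "type": "http.response.trailers",
            "headers": [(b"example-checksum", b"abc123")],  # illustrative trailer
            "more_trailers": False,
        })

async def run():
    sent = []
    scope = {"type": "http", "extensions": {"http.response.trailers": {}}}

    async def receive():
        return {"type": "http.request", "body": b"", "more_body": False}

    async def send(message):
        sent.append(message)

    await app(scope, receive, send)
    return [m["type"] for m in sent]

print(asyncio.run(run()))
# → ['http.response.start', 'http.response.body', 'http.response.trailers']
```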

jeffsawatzky (Author):

@pgjones can you provide feedback/comments on my work so far? I don't want to get too far into this if I'm going in the wrong direction. Thanks.

Review comments on src/hypercorn/protocol/h2.py and src/hypercorn/protocol/h3.py (outdated, resolved)
pgjones (Owner) commented May 27, 2024:

Thanks for this, I've adapted it a bit and merged in d8de5f2

@pgjones pgjones closed this May 27, 2024
Excerpt of the guard condition under review:

    and self.scope["http_version"] in TRAILERS_VERSIONS
    and self.state
    in {
        ASGIHTTPState.REQUEST,
pgjones (Owner):

@jeffsawatzky why allow trailers to be sent before the response?

jeffsawatzky (Author):

@pgjones because there may not be a response. gRPC uses trailing headers for error responses, so if an error occurs server-side before any response has been sent, the server returns only trailing headers and no response body.

jeffsawatzky (Author):

@pgjones if you look at this:
https://github.com/grpc/grpc/blob/master/doc/PROTOCOL-HTTP2.md
The response could be Trailers-Only.
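Per the linked PROTOCOL-HTTP2 document, a Trailers-Only response is a single HEADERS frame carrying both the `:status` pseudo-header and the gRPC trailers, with END_STREAM set and no DATA frames at all. A sketch of the two shapes (the frame tuples here are illustrative, not a real frame API):

```python
# Normal response: response HEADERS, DATA, then trailer HEADERS ending the stream.
NORMAL = [
    ("HEADERS", {":status": "200", "content-type": "application/grpc+proto"},
     {"end_stream": False}),
    ("DATA", b"<length-prefixed message>", {"end_stream": False}),
    ("HEADERS", {"grpc-status": "0"}, {"end_stream": True}),  # trailers
]

# Trailers-Only: one HEADERS frame with status and trailers together.
TRAILERS_ONLY = [
    ("HEADERS", {":status": "200", "grpc-status": "14", "grpc-message": "unavailable"},
     {"end_stream": True}),
]

def is_trailers_only(frames):
    """A response is Trailers-Only when it is one HEADERS frame ending the stream."""
    return (len(frames) == 1
            and frames[0][0] == "HEADERS"
            and frames[0][2]["end_stream"])

print(is_trailers_only(NORMAL), is_trailers_only(TRAILERS_ONLY))  # → False True
```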

pgjones (Owner):

I think the status would have to be 200 (I don't think there is a self.response to use), does this make sense to you?

jeffsawatzky (Author):

For my use case with gRPC, yes, that makes sense. All gRPC responses are supposed to be 200.
https://github.com/grpc/grpc/blob/master/doc/PROTOCOL-HTTP2.md#responses

As for whether it makes sense generally: I think so.

synodriver (Contributor):

I did a quick test of this implementation with a Python gRPC client. The ASGI app works like a gRPC server, but the client is not happy with the server and loses the connection halfway through. It seems the server sends trailers after EndBody has been sent, which closes the connection; maybe we should avoid this. I'll dig deeper tomorrow.
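The failure mode described above is an ordering bug: if the final DATA frame sets END_STREAM, the trailer HEADERS that follow are illegal and the peer tears down the stream. A toy sketch of the rule (not hypercorn's actual implementation, and not the real h2 API): when trailers are pending, the last body write must leave the stream open, and only the trailer HEADERS may end it.

```python
class StreamEnded(Exception):
    """Raised when a frame is sent on a stream already ended by END_STREAM."""

class TrailerAwareStream:
    def __init__(self, expecting_trailers):
        self.expecting_trailers = expecting_trailers
        self.closed = False
        self.frames = []  # (frame_type, end_stream) pairs, for inspection

    def send_data(self, data, last=False):
        if self.closed:
            raise StreamEnded("DATA after END_STREAM")
        # When trailers are pending, the final DATA must NOT end the stream.
        end_stream = last and not self.expecting_trailers
        self.frames.append(("DATA", end_stream))
        self.closed = end_stream

    def send_trailers(self, headers):
        if self.closed:
            raise StreamEnded("trailers after END_STREAM")
        self.frames.append(("HEADERS", True))  # trailers always end the stream
        self.closed = True

s = TrailerAwareStream(expecting_trailers=True)
s.send_data(b"body", last=True)
s.send_trailers([(b"grpc-status", b"0")])
print(s.frames)  # → [('DATA', False), ('HEADERS', True)]
```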

@jeffsawatzky jeffsawatzky deleted the trailer_support branch May 27, 2024 18:15
pgjones (Owner) commented May 27, 2024:

Thanks both, hopefully fixed with d16b503

synodriver (Contributor):

Emmmm... that didn't work either. Here is my ASGI app, which pretends to be a gRPC server (a simple gRPC echo server):

async def grpcapp(scope, receive, send):
    if scope["type"] == "http":
        body = (await receive())["body"]
        await send(
            {
                "type": "http.response.start",
                "status": 200,
                # HTTP/2 requires lowercase header field names
                "headers": [
                    (b"content-type", b"application/grpc+proto"),
                    (b"cache-control", b"no-cache"),
                    (b"trailer", b"grpc-status"),
                ],
                "trailers": True,  # declare that trailers will follow the body
            }
        )

        await send({"type": "http.response.body", "body": body, "more_body": False})
        await send({"type": "http.response.trailers", "headers": [(b"grpc-status", b"0")]})

A real gRPC client would expect the same response type and value when talking to the server, but with grpclib in Python I just got:

Traceback (most recent call last):
  File "D:\conda\envs\py310\lib\site-packages\grpclib\client.py", line 468, in recv_trailing_metadata
    trailers = await self._stream.recv_trailers()
  File "D:\conda\envs\py310\lib\site-packages\grpclib\protocol.py", line 351, in recv_trailers
    await self.trailers_received.wait()
  File "D:\conda\envs\py310\lib\asyncio\locks.py", line 214, in wait
    await fut
asyncio.exceptions.CancelledError

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "E:\pyproject\nonecorn\src\pclient.py", line 28, in <module>
    asyncio.run(main())
  File "D:\conda\envs\py310\lib\asyncio\runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "D:\conda\envs\py310\lib\asyncio\base_events.py", line 649, in run_until_complete
    return future.result()
  File "E:\pyproject\nonecorn\src\pclient.py", line 23, in main
    reply = await greeter.SayHello(Test(id=2, data="xsndjasndsa"))
  File "D:\conda\envs\py310\lib\site-packages\grpclib\client.py", line 902, in __call__
    async with self.open(timeout=timeout, metadata=metadata) as stream:
  File "D:\conda\envs\py310\lib\site-packages\grpclib\client.py", line 563, in __aexit__
    raise exc_val
  File "D:\conda\envs\py310\lib\site-packages\grpclib\client.py", line 553, in __aexit__
    await self._maybe_finish()
  File "D:\conda\envs\py310\lib\site-packages\grpclib\client.py", line 523, in _maybe_finish
    await self.recv_trailing_metadata()
  File "D:\conda\envs\py310\lib\site-packages\grpclib\client.py", line 467, in recv_trailing_metadata
    with self._wrapper:
  File "D:\conda\envs\py310\lib\site-packages\grpclib\utils.py", line 70, in __exit__
    raise self._error
grpclib.exceptions.StreamTerminatedError: Connection lost

And here is the client side:

import asyncio
import os
import ssl
from datetime import datetime

os.environ["PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION"] = "python"
from grpclib.client import Channel

from test_grpc import TestServerStub

# generated by protoc; it doesn't matter since you can generate your own
from test_pb2 import Test


async def main():
    async with Channel("127.0.0.1", 9001, ssl=False) as channel:
        greeter = TestServerStub(channel)

        reply = await greeter.SayHello(Test(id=2, data="xsndjasndsa"))
        print(reply)


if __name__ == "__main__":
    asyncio.run(main())

Besides, grpcurl also failed with: ERROR: Code: Internal Message: server closed the stream without sending trailers

Successfully merging this pull request may close these issues.

[feature request] trailing headers