
Add configurable HTTP/2 connection window size#663

Closed
arturobernalg wants to merge 1 commit into apache:master from arturobernalg:connection-window-size

Conversation

@arturobernalg
Member

This change adds a local HTTP/2 connection window size setting to H2Config.
The new setting controls connection-level flow control only. It does not affect stream INITIAL_WINDOW_SIZE and preserves the current behavior by default.
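Assuming the new option surfaces on the existing H2Config builder, usage might look like the sketch below. The setter name setConnectionWindowSize is a guess from the PR title, not a confirmed API; setInitialWindowSize is the existing per-stream setting and is left untouched here.

```java
import org.apache.hc.core5.http2.config.H2Config;

// Hypothetical sketch: setConnectionWindowSize is the new connection-level
// setting this PR proposes (exact name is an assumption).
// setInitialWindowSize remains the existing per-stream INITIAL_WINDOW_SIZE.
H2Config h2Config = H2Config.custom()
        .setInitialWindowSize(65_535)               // per-stream window, unchanged
        .setConnectionWindowSize(4 * 1024 * 1024)   // hypothetical: 4 MiB connection window
        .build();
```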

@arturobernalg arturobernalg requested a review from ok2c April 3, 2026 10:04
@ok2c
Member

ok2c commented Apr 5, 2026

@arturobernalg What exactly is the intent of this change-set? Why do you want this to be configurable? What is the exact use case for this?

@arturobernalg
Member Author

@arturobernalg What exactly is the intent of this change-set? Why do you want this to be configurable? What is the exact use case for this?

@ok2c
The intent is to make the connection-level receive window configurable independently of the stream initial window.
The use case is multiplexed H2 connections receiving large bodies, where per-stream windows alone do not bound total in-flight DATA for the connection.
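The aggregate-bound argument can be made concrete with a small self-contained sketch (an illustrative toy model, not HttpCore code): with N concurrent streams, per-stream windows alone permit up to N times the stream window of DATA in flight, while the connection-level window caps the total regardless of stream count.

```java
// Toy model of HTTP/2 receive-side flow control (illustration only).
// Per-stream windows bound each stream individually; the connection
// window bounds the aggregate across all streams on the connection.
public class H2WindowSketch {

    static long maxInFlightBytes(int streams, long streamWindow, long connectionWindow) {
        long perStreamBound = (long) streams * streamWindow;
        return Math.min(perStreamBound, connectionWindow);
    }

    public static void main(String[] args) {
        long streamWindow = 65_535;   // RFC 9113 default INITIAL_WINDOW_SIZE
        int streams = 100;            // concurrent multiplexed responses

        // Per-stream windows alone: up to ~6.5 MB may be in flight.
        System.out.println(maxInFlightBytes(streams, streamWindow, Long.MAX_VALUE)); // 6553500

        // A 1 MiB connection window caps the aggregate, whatever the stream count.
        System.out.println(maxInFlightBytes(streams, streamWindow, 1_048_576));      // 1048576
    }
}
```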

@ok2c
Member

ok2c commented Apr 9, 2026

@arturobernalg Please provide an integration test or a demo app showing that tinkering with this parameter produces any tangible results.

@arturobernalg arturobernalg force-pushed the connection-window-size branch from 545c210 to b82a2ed Compare April 18, 2026 11:49
@arturobernalg
Member Author

@arturobernalg Please provide an integration test or a demo app showing that tinkering with this parameter produces any tangible results.

@ok2c Added a demo app that keeps the per-stream initial window fixed and varies only the connection-level receive window while transferring many large responses over a single H2 connection.

@ok2c
Member

ok2c commented Apr 18, 2026

@arturobernalg what kind of improvement are you seeing?

@ok2c
Member

ok2c commented Apr 18, 2026

@arturobernalg And more importantly, why would anyone in their sane mind use any value other than MAX?

@arturobernalg
Member Author

@ok2c
The point of the option is independent control of connection-level inbound flow. MAX remains a sensible throughput-oriented default. Smaller values are for applications that want stricter aggregate backpressure across all active streams on a multiplexed connection.

A concrete use case is an H2 client in a gateway, proxy, or sidecar multiplexing many large upstream responses over a single connection. Such clients may still want a reasonably large per-stream window, while limiting aggregate inbound data in flight at the connection level.

@arturobernalg
Member Author

To my mind, the main use case is a gateway, proxy, sidecar, API aggregator, or backend-to-backend H2 client multiplexing many concurrent large upstream responses over a single connection.
Such clients may want to keep the per-stream window reasonably large, while limiting aggregate inbound data in flight at the connection level for tighter backpressure and more predictable resource usage.

@ok2c
Member

ok2c commented Apr 18, 2026

@arturobernalg My daughter was absolutely convinced she really needed a pony when she was 7.

Is there a single user for this parameter? Has anyone asked for it? Do you need it for your personal project? For any more or less serious use case this single parameter would likely not be enough. We would likely need a proper strategy interface here if anyone actually asked for a feature like this.

@arturobernalg
Member Author

@arturobernalg My daughter was absolutely convinced she really needed a pony when she was 7.

Is there a single user for this parameter? Has anyone asked for it? Do you need it for your personal project? For any more or less serious use case this single parameter would likely not be enough. We would likely need a proper strategy interface here if anyone actually asked for a feature like this.

For the record: no one asked for AI either, and yet we got OpenAI.

@ok2c
Member

ok2c commented Apr 18, 2026

and yet we got OpenAI.

@arturobernalg Which is a bloody shame.

@arturobernalg
Member Author

and yet we got OpenAI.

@arturobernalg Which is a bloody shame.

@ok2c - Agree 100%
