Chunked Encoding with FastCGI does not work as expected
13 June 2009, 21:37
I'm trying to stream chunked data from a FastCGI (Python) application so the client is informed about each step the FastCGI app performs.
The response data (except the header) is not delivered in chunks; instead, all output from the app arrives at once when the whole job has finished. Is this data buffered by Hiawatha?
When I run the app directly, it behaves as expected and sends the chunks one at a time.
Hiawatha version: 6.14
Operating System: FreeBSD 7.2-RELEASE
14 June 2009, 00:03
Hiawatha does indeed buffer data from (Fast)CGI applications. This is done to improve speed. Delivering data to the client in the same chunks as received from the (Fast)CGI process is not part of the CGI or FastCGI specifications, so you should not depend on it.
14 June 2009, 09:42
Is there any possibility to change this? Or is there any other/better approach for a situation where the client has to wait for results from the server but should be informed about the individual steps being performed on the server?
So that, for example, http://webpy.org/cookbook/streaming_large_files works as expected.
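The cookbook pattern linked above boils down to a generator-based app like this (a minimal WSGI-style sketch; the function name and messages are made up for illustration). Each yield is intended to reach the client as its own chunk:

```python
def fcgi_app(environ, start_response):
    """Hypothetical streaming app: yields each progress message separately."""
    start_response("200 OK", [("Content-Type", "text/plain")])
    for step in range(1, 4):
        # Each yielded value is one chunk; with a buffering server,
        # these only arrive at the client once the generator is exhausted.
        yield ("step %d done\n" % step).encode("ascii")
```

Run directly (as noted above), the chunks go out one at a time; behind a buffering webserver they are coalesced.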
14 June 2009, 10:29
Solved: Looking through the code made it easy: by padding each chunk with blanks so that it exceeds 2048 bytes, every chunk is sent to the client immediately. Buffering only happens when a chunk is too small.
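For reference, the workaround described above can be sketched as a small helper (the 2048-byte threshold comes from the code inspection mentioned; the helper name is mine, and trailing spaces must be harmless in your content type, e.g. text/plain):

```python
BUFFER_SIZE = 2048  # buffer size found in the server source; assumption for other versions

def pad_chunk(data, size=BUFFER_SIZE):
    """Pad a chunk with spaces so its length exceeds the server's
    buffer size, forcing it to be flushed to the client immediately."""
    if len(data) <= size:
        # Pad to size + 1 bytes so the chunk is strictly larger than the buffer.
        data = data + b" " * (size + 1 - len(data))
    return data
```

Each progress message would then be passed through `pad_chunk()` before being yielded, at the cost of roughly 2 KiB of whitespace per message.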
14 June 2009, 12:51
That's a more robust solution than yours: it's webserver-independent.
This topic has been closed.