Uploading big files: MaxRequestSize limit

12 November 2010, 18:01
Hello Hugo,

I need to upload some big files through the web interface. As I understand it, the current Hiawatha v7.4 request size limit is 16 megabytes maximum.
Is that correct? Can I increase it? To at least ~200 MB?

Use case 1:
- hiawatha.conf:
   RequestLimitMask = allow all
   MaxRequestSize = 8192

- trying to upload 110 MB file through web interface (Joomla)
- I'm being kicked
- system.log:|Fri 12 Nov 2010 15:57:12 +0100|Maximum request size reached

That works OK, but now I really want to upload the file, so:

Use case 2:
- hiawatha.conf:
RequestLimitMask = deny

- trying to upload 110 MB file through web interface (Joomla)
- I see the "500 - Internal Server Error" message in the Browser
- access.log:|Fri 12 Nov 2010 16:15:13 +0100|500|585||POST /intranet/index.php?option=com_docman&task=doc_update&gid=189&step=3&Itemid=120 HTTP/1.1|Host: server1|User-Agent: Mozilla/5.0 (X11; U; Linux i686; en-US; rv: Gecko/20101027 Ubuntu/10.04 (lucid) Firefox/3.6.12|Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8|Accept-Language: en-us,en;q=0.5|Accept-Encoding: gzip,deflate|Accept-Charset: UTF-8,*|Keep-Alive: 115|Connection: keep-alive|Referer: http://server1/intranet/index.php?option=com_docman&task=doc_update&gid=189&step=2&Itemid=120|Content-Type: multipart/form-data; boundary=---------------------------693237019264601422669564|Content-Length: 112099551

Quickly tried a change in hiawatha.c, but that didn't help. I got the same result as in use case 2 (error 500).

Any Ideas?

Hiawatha version: 7.4
Operating System: Ubuntu 8.04.3 LTS
Hugo Leisink
13 November 2010, 00:03
A webserver was never meant for uploading large files. You should use FTP for that. Hiawatha does support uploading large files, but I don't recommend it, because it will use a lot of memory. If you still want to do so, set the MaxRequestSize option (in kilobytes) to whatever you want, and don't forget to set the TimeForRequest option (in seconds), because it will probably take some time to upload 100 MB.
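As a sketch, a hypothetical hiawatha.conf fragment for uploads of roughly 200 MB could look as follows; the exact values are illustrative, not a recommendation:

```
# MaxRequestSize is in kilobytes: 200 MB = 200 * 1024 = 204800 kB.
MaxRequestSize = 204800
# Give the client plenty of time to complete the upload (seconds).
TimeForRequest = 30,300
```

The two-value TimeForRequest syntax (short timeout for the first request, longer one for subsequent requests on the connection) matches the `TimeForRequest = 30,100` setting mentioned later in this thread.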

Again, I strongly discourage abusing a webserver for uploading such large files. Use FTP or SCP for that!
15 November 2010, 14:36
All webservers are being used to transfer large files these days. This trend is driven by fast internet connections, web 2.0 applications, and the many firewalls that block everything but port 80. FTP is being replaced by HTTP services offering better user management, upload/download privileges, counters, automatic expiration, and so on. They ALL allow some sort of file upload through HTTP, even Facebook. One file-hosting service even states: "If you'd like your file to be downloadable by everyone, make sure not to exceed the limit of 1024 MB. Larger files can only be downloaded by premium users."
This is what the internet community looks like: crazy people uploading HUGE files over HTTP.
Typical marketing points are:
- upload and transfer files of unlimited size
- no need to hassle with FTP sites

You are totally right that HTTP originally was not meant for uploading files. But dismissing large file uploads as mere "webserver abusement" could now be a reason for web service providers not to use Hiawatha.
15 November 2010, 14:42
One of my servers hosts Joomla with the DOCman plugin, which allows managed uploading/downloading of files.
On a GBit LAN, transferring a 110 MB file takes less than 10 seconds. Hiawatha is set to TimeForRequest = 30,100.
Hiawatha's 'Maximum request size reached' message is understandable, but uploading big files causes a '500 - Internal server error'.
Increasing MaxRequestSize didn't help. Setting RequestLimitMask didn't help either.

I'm not sure if that's caused by the DOCman code or by the Hiawatha setup. It previously worked using Apache; I haven't tried other webservers yet.
Maybe I'll do that later.
Hugo Leisink
15 November 2010, 15:33
First, it's not that Hiawatha does not allow uploading large files; I only strongly discourage it. Setting the MaxRequestSize and TimeForRequest options properly should allow you to send huge requests to Hiawatha. Apparently not, so I'll see if I can reproduce your error. Remember that PHP must also be willing to accept huge requests. Is PHP's 'memory_limit' set properly? Maybe some other PHP settings are not set correctly?
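The PHP settings that typically matter for large POST requests are the following standard php.ini directives; the values below are illustrative for a ~200 MB upload, not a definitive configuration:

```
; Hypothetical php.ini fragment (e.g. /etc/php5/cgi/php.ini):
post_max_size = 200M        ; maximum size of an entire POST body
upload_max_filesize = 200M  ; maximum size of a single uploaded file
memory_limit = 256M         ; overall script memory limit
max_execution_time = 60     ; seconds before the script is aborted
```

Note that post_max_size limits the whole request body, so it must be at least as large as upload_max_filesize plus any other form fields.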

Second, the fact that many incompetent firewall administrators only allow traffic through port 80, and then allow EVERYTHING to pass through that port, does not make uploading large files via HTTP a sensible thing.

I'm more than willing to make changes to Hiawatha to better support web applications, but if 'bad design' of the web application is the cause, then I'm afraid I have to decline. Security and proper design are not negotiable for me.
15 November 2010, 16:51
Haaa, you were right as usual. I just forgot to increase post_max_size too. Silly me. Settings are now (/etc/php5/cgi/php.ini):

memory_limit = 256M
max_execution_time = 60
post_max_size = 200M

and it works fine.

Anyway, it's not intuitive to find out, because the failure isn't logged anywhere.
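One way to make such failures visible is to enable PHP's error logging; PHP emits a warning when a POST body exceeds post_max_size. The directives below are standard php.ini settings, but the log path is just an example:

```
; Hypothetical php.ini fragment: log errors so an exceeded
; post_max_size warning shows up instead of failing silently.
log_errors = On
error_log = /var/log/php_errors.log
```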

Thanks again for the quick response.
This topic has been closed.