On occasion I receive a "413 Request Entity Too Large" error while updating an SVN repository. Once this error appears, it recurs every time I attempt to update the local working copy. A fresh checkout solves the problem, but is very inconvenient: the project is over 30 GB, and the SVN repository is hosted externally.
This has occurred in the past on several different computers, including Windows development machines and our Linux build server.
Most of what I have found regarding this issue relates to large individual files (over 2 GB). That is not the case here, as the largest files are approximately 50-60 MB.
Has anyone else run into this before, and/or does anyone know the cause or solution?
Try adding the following directives to your Apache configuration file:
LimitXMLRequestBody 8000000
LimitRequestBody 0
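For context: LimitXMLRequestBody caps the size of XML request bodies that mod_dav will parse (its default is 1000000 bytes, which large SVN update reports can exceed), and LimitRequestBody 0 removes Apache's general request-body limit. A sketch of where these directives might go, assuming a typical mod_dav_svn setup (the /svn location and repository path are placeholders, not taken from the question):

<Location /svn>
    DAV svn
    SVNParentPath /var/svn/repos
    # Raise the XML request body limit used by mod_dav
    # (default is 1000000 bytes).
    LimitXMLRequestBody 8000000
    # 0 = no limit on the overall request body size.
    LimitRequestBody 0
</Location>

Restart or reload Apache after changing these for the new limits to take effect.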
I don't have access to my repo server (it's IT-managed, and it's over the weekend). What I found is that I could work around this issue by running svn update on subdirectories until one failed, then descending into that directory and repeating until I stopped getting the 413 error. After that I could run updates at higher levels again. This might not work for everyone, but it could help get through in an emergency.
Made a short bash script to loop through the subdirectories, per mdh's answer:

# Update each entry in the working copy one at a time,
# so no single request exceeds the server's body limit.
for dir in *; do
    [[ -e "$dir" ]] || continue
    echo "Updating $dir"
    svn up "$dir"
done
# Finally, update the top level itself.
svn up
I had this issue recently with any file over 10 MB. It turned out I had forgotten that I'm proxying the svn/Apache server with nginx. Changing client_max_body_size in nginx.conf fixed the issue. I left LimitXMLRequestBody and LimitRequestBody on the Apache server at their defaults.
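A minimal sketch of the relevant nginx setting, assuming the SVN server sits behind a reverse proxy (the server name, location path, upstream address, and 100m value are illustrative placeholders):

server {
    listen 80;
    server_name svn.example.com;

    location / {
        # Allow large request bodies through the proxy.
        # The nginx default is 1m, which makes nginx itself
        # return "413 Request Entity Too Large".
        client_max_body_size 100m;
        proxy_pass http://127.0.0.1:8080;
    }
}

Note that client_max_body_size can be set at http, server, or location level; setting it to 0 disables the check entirely.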
Also, if you run mod_security, consider checking your SecRequestBodyLimit setting. Mine was set too low and was causing the problem.
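If ModSecurity is in play, a sketch of the relevant directives (the values below are examples, not recommendations):

# In the ModSecurity configuration (e.g. modsecurity.conf):
# Maximum request body size ModSecurity will accept, in bytes.
SecRequestBodyLimit 134217728
# Reject over-limit requests outright instead of processing them partially.
SecRequestBodyLimitAction Reject

When SecRequestBodyLimitAction is Reject, requests over the limit are answered with a 413, which matches the symptom described in the question.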
If you don't have access to the server, you can also select all the folders with Ctrl+A and then right-click to update them individually using TortoiseSVN. Essentially the same as @lucrussell's solution.