# 413 Content Too Large — File Upload Rejected

## Symptoms
- Browser shows `413 Request Entity Too Large` or `413 Content Too Large` immediately on upload
- `curl: (55) Send failure: Connection reset by peer` when sending a large `-d @file` body
- Nginx error log contains: `client intended to send too large body: N bytes`
- Small files upload successfully; the error only appears above a specific file size threshold
- Django raises `RequestDataTooBig` exception in application logs

## Root Causes
- Nginx `client_max_body_size` directive defaults to 1 MB, rejecting larger uploads
- Django's `DATA_UPLOAD_MAX_MEMORY_SIZE` (default 2.5 MB) exceeded by non-file request data, raising `RequestDataTooBig`
- Django's `FILE_UPLOAD_MAX_MEMORY_SIZE` is often misread as a cap; it only sets the threshold above which uploaded files spool to disk and does not reject requests
- API gateway or load balancer (AWS ALB, Cloudflare) enforcing its own body size limit
- Missing or incorrect `Content-Type: multipart/form-data` header, so Django counts the entire body against `DATA_UPLOAD_MAX_MEMORY_SIZE` instead of streaming file parts through the upload handlers
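Which layer rejects a given body under these defaults can be worked out directly. An illustrative sketch (the limit values are the documented defaults for a raw, non-multipart body; adjust to your deployment):

```python
# Illustrative sketch: which default limits would a raw (non-multipart)
# body of a given size trip? Values are the documented defaults.
NGINX_DEFAULT = 1 * 1024 * 1024       # client_max_body_size 1m
DJANGO_DATA_DEFAULT = 2621440         # DATA_UPLOAD_MAX_MEMORY_SIZE, 2.5 MB

def rejected_by(body_bytes: int) -> list:
    """Return the layers whose default limit a body of body_bytes exceeds."""
    layers = []
    if body_bytes > NGINX_DEFAULT:
        layers.append("nginx")
    if body_bytes > DJANGO_DATA_DEFAULT:
        layers.append("django")
    return layers

print(rejected_by(2 * 1024 * 1024))   # ['nginx']: past Nginx, under Django
```

A 2 MB body is rejected by Nginx alone, which matches the symptom pattern of Nginx answering the 413 before Django ever sees the request.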

## Diagnosis
1. **Confirm the error origin** — check if Nginx or Django is responding:
```bash
# Bypass Nginx and hit Django directly
curl -X POST http://localhost:8000/upload/ \
  -F 'file=@large_file.zip'
# If this succeeds, Nginx is the limiter
```
2. **Check Nginx body size limit** in your site config:
```bash
grep -r 'client_max_body_size' /etc/nginx/
# Default (when absent) = 1m
```
3. **Check Django settings** for upload size limits:
```python
from django.conf import settings
print(settings.DATA_UPLOAD_MAX_MEMORY_SIZE) # default 2621440 (2.5 MB)
print(settings.FILE_UPLOAD_MAX_MEMORY_SIZE) # default 2621440 (2.5 MB)
```
4. **Identify the actual upload size** from the request:
```bash
ls -lh large_file.zip
stat -f '%z' large_file.zip   # macOS: size in bytes
stat -c '%s' large_file.zip   # Linux (GNU stat): size in bytes
```
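To compare the `grep` output from step 2 against the file size from step 4 directly, nginx's size suffixes can be converted to bytes. A small helper sketch:

```python
def nginx_size_to_bytes(value: str) -> int:
    """Convert an nginx size value ('1m', '50M', '1024') to bytes."""
    units = {"k": 1024, "m": 1024 ** 2, "g": 1024 ** 3}
    value = value.strip().lower()
    if value and value[-1] in units:
        return int(value[:-1]) * units[value[-1]]
    return int(value)  # a bare number means bytes

print(nginx_size_to_bytes("1m"))   # 1048576, the implicit default limit
```

If the byte count from `stat` exceeds this number, Nginx is (at least one of) the limiters.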

## Resolution
**Fix 1: Increase Nginx `client_max_body_size`:**
```nginx
# /etc/nginx/sites-available/myapp
server {
    client_max_body_size 50M;  # Set to your max expected upload size

    location /api/upload/ {
        client_max_body_size 100M;  # Override per-location if needed
        proxy_pass http://127.0.0.1:8000;
    }
}
```
```bash
sudo nginx -t && sudo systemctl reload nginx
```
**Fix 2: Increase Django upload limits in settings:**
```python
# config/settings/base.py
DATA_UPLOAD_MAX_MEMORY_SIZE = 50 * 1024 * 1024  # 50 MB of non-file request data
FILE_UPLOAD_MAX_MEMORY_SIZE = 50 * 1024 * 1024  # 50 MB in memory before spooling to disk
```
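The Nginx and Django limits drift apart easily when edited separately. One way to keep them in sync is to derive both from a single value; a sketch, assuming a hypothetical `MAX_UPLOAD_MB` environment variable that you also render into the nginx config:

```python
import os

# Hypothetical env var: render the same value into the nginx template
# (client_max_body_size) and into Django settings so the two limits
# cannot disagree.
MAX_UPLOAD_MB = int(os.environ.get("MAX_UPLOAD_MB", "50"))

DATA_UPLOAD_MAX_MEMORY_SIZE = MAX_UPLOAD_MB * 1024 * 1024
FILE_UPLOAD_MAX_MEMORY_SIZE = MAX_UPLOAD_MB * 1024 * 1024
```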
**Fix 3: If an AWS ALB or Cloudflare sits in front**, account for its limit too. ALB's request body size is not configurable: requests to EC2/IP targets are streamed through, but Lambda targets are hard-capped at 1 MB per request body. Cloudflare's proxy caps uploads at 100 MB on the Free and Pro plans. When uploads can exceed an edge limit you do not control, bypass the proxy with S3 presigned URLs (Fix 4).
**Fix 4: Use S3 presigned URLs** to bypass the proxy entirely for large uploads:
```python
import boto3

s3 = boto3.client('s3')
post = s3.generate_presigned_post(
    'my-bucket', 'uploads/file.zip',
    Conditions=[['content-length-range', 0, 104857600]],  # allow up to 100 MB
    ExpiresIn=300,
)
# post is a dict: {'url': ..., 'fields': {...}}; return it to the frontend
# and the browser uploads directly to S3
```
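On the client side, every presigned field must be echoed back verbatim and the file content appended as the last form part. A minimal sketch of assembling that form (pure Python, no HTTP library assumed; the sample dict mimics the shape `generate_presigned_post` returns):

```python
def presigned_form(post: dict, file_bytes: bytes) -> tuple:
    """Build (url, form) for the client POST from the presigned dict.
    All presigned fields are echoed verbatim; 'file' must be the last part."""
    form = dict(post["fields"])   # key, policy, signature, etc.
    form["file"] = file_bytes     # dict insertion order puts it last
    return post["url"], form

url, form = presigned_form(
    {"url": "https://my-bucket.s3.amazonaws.com",
     "fields": {"key": "uploads/file.zip"}},
    b"example bytes",
)
print(url)  # the browser POSTs the form here, not to your app
```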

## Prevention
- **Set `client_max_body_size` explicitly** in Nginx rather than relying on the 1 MB default — document the chosen limit in a comment with the rationale
- **Use S3 presigned URLs or multipart upload** for files over 10 MB — this removes the web server from the upload path entirely
- **Return a helpful error message** from your API when a 413 occurs, including the actual limit and the received size so clients can self-diagnose
- **Validate file size on the frontend** before submission to give users instant feedback without a round trip
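The helpful-error-message bullet can be as small as a structured payload. A framework-agnostic sketch (names are illustrative, not a Django API):

```python
def too_large_payload(received_bytes: int, limit_bytes: int) -> dict:
    """Body for a 413 response that tells the client both numbers."""
    return {
        "error": "request_entity_too_large",
        "received_bytes": received_bytes,
        "limit_bytes": limit_bytes,
        "hint": f"maximum upload size is {limit_bytes // (1024 * 1024)} MB; "
                f"use the presigned-URL endpoint for larger files",
    }

print(too_large_payload(75 * 1024 * 1024, 50 * 1024 * 1024)["hint"])
```

Returning both the limit and the received size turns a dead-end 413 into something a client developer can act on without filing a ticket.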