python requests upload large file with additional data

Thien Nguyen · Mar 3, 2016 · Viewed 14.3k times

I've been looking around for ways to upload a large file with additional form data, but there doesn't seem to be any solution. To upload a file, I've been using this code, and it works fine with small files:

with open("my_file.csv", "rb") as f:
    files = {"documents": ("my_file.csv", f, "application/octet-stream")}
    data = {"composite": "NONE"}
    headers = {"Prefer": "respond-async"}
    resp = session.post("my/url", headers=headers, data=data, files=files)

The problem is that this code reads the whole file into memory before sending, so I run into a MemoryError with large files. I've looked around, and the way to stream the upload is to pass the file object directly:

resp = session.post("my/url", headers=headers, data=f)

but then I also need to include {"composite": "NONE"} in the data. Without it, the server doesn't recognize the file.

Answer

Ian Stapleton Cordasco · Mar 3, 2016

You can use the requests-toolbelt to do this:

import requests
from requests_toolbelt.multipart import encoder

session = requests.Session()
with open('my_file.csv', 'rb') as f:
    # MultipartEncoder reads the file from disk as the request body is sent,
    # instead of building the whole multipart body in memory
    form = encoder.MultipartEncoder({
        "documents": ("my_file.csv", f, "application/octet-stream"),
        "composite": "NONE",
    })
    headers = {"Prefer": "respond-async", "Content-Type": form.content_type}
    resp = session.post("my/url", headers=headers, data=form)
session.close()

This will cause requests to stream the multipart/form-data upload for you.
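
If you also want to watch upload progress while streaming (handy for very large files), requests-toolbelt provides a MultipartEncoderMonitor that wraps the encoder and calls a callback as bytes are read. A minimal sketch along those lines, reusing the same field names and headers as above (the progress-printing callback is just an illustration):

import requests
from requests_toolbelt.multipart.encoder import MultipartEncoder, MultipartEncoderMonitor

def print_progress(monitor):
    # called repeatedly as requests reads from the wrapped encoder
    print("uploaded %d of %d bytes" % (monitor.bytes_read, monitor.len))

session = requests.Session()
with open("my_file.csv", "rb") as f:
    form = MultipartEncoder({
        "documents": ("my_file.csv", f, "application/octet-stream"),
        "composite": "NONE",
    })
    monitor = MultipartEncoderMonitor(form, print_progress)
    headers = {"Prefer": "respond-async", "Content-Type": monitor.content_type}
    resp = session.post("my/url", headers=headers, data=monitor)
session.close()

The upload is still streamed in chunks; the monitor only observes how many bytes have been read so far.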