FTP to Google Storage

CCC · Apr 19, 2017 · Viewed 25.7k times

Some files get uploaded on a daily basis to an FTP server, and I need those files in Google Cloud Storage. I don't want to bug the users who upload the files to install any additional software; they should just keep using their FTP client. Is there a way to use GCS as an FTP server? If not, how can I create a job that periodically picks up the files from an FTP location and puts them in GCS? In other words: what's the best and simplest way to do it?

Answer

crazystick · Apr 19, 2017

You could write yourself an FTP server which uploads to GCS, for example one based on pyftpdlib.

Define a custom handler that stores the file to GCS when it is received:

import os
from pyftpdlib.handlers import FTPHandler
from pyftpdlib.servers import FTPServer
from pyftpdlib.authorizers import DummyAuthorizer
from google.cloud import storage

class MyHandler(FTPHandler):
    def on_file_received(self, file):
        # upload the completed file to GCS, keyed by its path
        # relative to the FTP home directory
        storage_client = storage.Client()
        bucket = storage_client.get_bucket('your_gcs_bucket')
        blob = bucket.blob(file[5:])  # strip leading /tmp/
        blob.upload_from_filename(file)
        os.remove(file)
    # override other events (e.g. on_incomplete_file_received) as needed

def main():
    authorizer = DummyAuthorizer()
    authorizer.add_user('user', 'password', homedir='/tmp', perm='elradfmw')

    handler = MyHandler
    handler.authorizer = authorizer
    handler.masquerade_address = 'your.public.ip'  # the server's public IP, for passive mode
    handler.passive_ports = range(60000, 61000)    # passive data ports 60000-60999

    server = FTPServer(('0.0.0.0', 21), handler)  # bind all interfaces to accept remote clients
    server.serve_forever()

if __name__ == "__main__":
    main()

I've successfully run this on Google Container Engine (it takes some effort to get passive FTP working properly), but it should be pretty simple to do on Compute Engine. With the above configuration, you'll need to open port 21 and ports 60000-60999 on the firewall.
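On Compute Engine, the firewall rule could look something like the sketch below. The rule name and the ftp-server target tag are just assumptions here; it also assumes the default network and that your instance carries that tag:

gcloud compute firewall-rules create allow-ftp \
    --allow tcp:21,tcp:60000-60999 \
    --target-tags ftp-server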

To run it: python my_ftp_server.py. Note that listening on port 21 requires root privileges.
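To sanity-check the setup end to end, you can upload a file with Python's built-in ftplib. This is a quick sketch; the host, credentials, and file name are placeholders you'd swap for your own:

from ftplib import FTP

ftp = FTP()
ftp.connect('your.public.ip', 21)   # placeholder host: the server's public IP
ftp.login('user', 'password')       # credentials from DummyAuthorizer.add_user
ftp.set_pasv(True)                  # passive mode, matching the server's passive_ports
with open('report.csv', 'rb') as f: # placeholder file name
    ftp.storbinary('STOR report.csv', f)
ftp.quit()

If the transfer succeeds and on_file_received fires, the object should show up in your_gcs_bucket shortly afterwards.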