Automate File Handling With Python & AWS S3 | Five Minute Python Scripts

Hey everyone! A quick example of how we can automate uploading files to Amazon S3.

Sorry for the green screen issues – hope they’re not too bothersome!

A few notes on creating IAM users:
– Create a user
– Enable programmatic access
– Add the user to a group with the desired permissions
(In this video we used S3, but you can use this client with many more AWS services. You would assign those permissions to a group the same way we do the S3 permissions here.)
– Copy the access key and secret access key
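
A side note on those keys: hardcoding them works for a demo, but boto3 can also read them from standard environment variables, which keeps them out of your source files entirely. A minimal sketch (the variable names below are the ones boto3 reads by default):

```python
import os

# Set these in your shell instead of hardcoding keys in a script:
#   export AWS_ACCESS_KEY_ID=...         (your access key)
#   export AWS_SECRET_ACCESS_KEY=...     (your secret access key)
access_key = os.environ.get("AWS_ACCESS_KEY_ID")
secret_access_key = os.environ.get("AWS_SECRET_ACCESS_KEY")

# With the variables exported, boto3.client('s3') called with no
# explicit keys will find them on its own via the standard
# credential chain, so nothing secret has to live in source control.
```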

Legit crazy that I’m writing this with 11,000+ subscribers — How awesome! I write these notes in the descriptions of most of my videos, and I still remember writing it with 250 subscribers and being so thankful. I’m honored I get to create these videos and hopefully they’re of use to some of you. Thank you for all the support and kind words throughout the last year!

Support the Channel on Patreon —
Join The Socials —
Reddit –
FB –
Insta –
Twitter –
LinkedIn –
GitHub –
Full code from the video:

from secrets import access_key, secret_access_key  # local secrets.py holding the IAM keys (note: this filename can clash with Python's built-in secrets module, depending on your import path)

import boto3
import os

client = boto3.client('s3',
                      aws_access_key_id=access_key,
                      aws_secret_access_key=secret_access_key)

for file in os.listdir():
    if file.endswith('.py'):  # endswith avoids matching names like 'my.python.txt'
        upload_file_bucket = 'youtube-dummy-bucket'
        upload_file_key = 'python/' + str(file)
        client.upload_file(file, upload_file_bucket, upload_file_key)
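
One thing worth knowing about the 'python/' + str(file) key above: S3 has no real folders, so the "path" is just part of the key string, and uploading into a deeper "folder" only means a longer prefix. A small sketch with a hypothetical helper (build_key is my name for it, not something from the video):

```python
def build_key(prefix, filename):
    # S3 renders '/' in a key as folder nesting in the console,
    # so 'python/scripts/upload.py' appears as nested folders.
    return prefix.rstrip('/') + '/' + filename

print(build_key('python/scripts', 'upload.py'))  # python/scripts/upload.py
```

So client.upload_file(file, upload_file_bucket, build_key('python/scripts', file)) would land the file under python/scripts/ in the same bucket.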

Packages (& Versions) used in this video:
boto3 1.11.13

Code from this tutorial and all my others can be found on my GitHub:

Check out my website:

If you liked the video – please hit the like button. It means more than you know. Thanks for watching and thank you for all your support!!

— Channel FAQ —

What text editor do you use?
Atom –

What Equipment do you use to film videos?

What editing software do you use?
Adobe CC –
Premiere Pro for video editing
Photoshop for images
After Effects for animations

Do I have any courses available?
Yes & always working on more!

Where do I get my music?
I get all my music from the copyright-free YouTube Audio Library

Let me know if there’s anything else you want answered!


Always looking for suggestions on what video to make next — leave me a comment with your project! Happy Coding!


44 thoughts on “Automate File Handling With Python & AWS S3 | Five Minute Python Scripts”
  1. Bro how do I download the uploaded file and load it on my web page? Do you have a tutorial on that?

  2. Hi there! If you could suggest some python approach to write a file into multiple target buckets from source bucket, that would be helpful. Thanks

  3. Looking for a way to transfer files from our SharePoint location to an AWS S3 bucket using Python. Any good suggestions on how to go about doing it?

  4. How do I securely store the access key and secret key in Python or C#?
    Because with your method, anyone can decompile my program and get my access key and secret key.

  5. just reviewed another video from someone that literally walked through this same scenario. however, their video was 57 mins! no. i did not watch the 57 min video. i watched about 3 mins and saw your video sitting in the sidebar screaming '7 minute video over here!' haha.

    to the point and just the facts. no showboating knowledge and skills.

    if im ever drowning, i hope you are on the shore to save me and not the person who makes a 57 min video on 'how to jump in and save a drowning person'. haha

  6. Amazing, it was very useful.
    I have a question: how can I send the file to a specific path in my bucket?

  7. Awesome Derrick, I have one requirement where I need to upload the file from S3 to a SharePoint site using Python in AWS EC2. Any help please???

  8. Nice video. I have one question. I don't want to download the content from S3; I just want to know the content details, like the number of .txt and .py files within that bucket.
    I am using: aws s3 ls bucket_name/ --recursive --human-readable --summarize
    It shows the total size and number of objects, but I can't find the file counts with respect to their extensions.

  9. Hi Derrick!! I want to know whether it's possible to do this with an AWS Lambda for automation purposes? (Sorry about the English, I'm from Brazil)

  10. Good sample. Could you please make an example of how to send email with Lambda too?

  11. 3:36 — file is a python keyword. Don't use Python keywords for variable names. Use something like a_file or my_file.

  12. Hi Derrick and all
    I tried this method and I am getting the error below
    ImportError: cannot import name 'access_key' from 'secrets' (C:Program

    Did anyone have the same error? And how did you get around it?

  13. I have my bucket like "parent_Bucket/Chield_Bucket", Now I want to upload my file to Chield_Bucket. Could you please suggest what I need to do for this.

  14. You are awesome Derrick! Could you Please upload a video of automating data transfer from AWS S3 to AWS Redshift (Using AWS Glue) or any other appropriate method .

  15. @Derrick Sherrill every time I try to run the code, it picks up files from the windows32 directory. I have the .py file inside a folder with a few test .txt files, but it keeps picking up txt files from the win32 dir. What's happening? Please help
