Automate File Handling With Python & AWS S3 | Five Minute Python Scripts




    Hey everyone! Here's a quick example of how to automate uploading files to Amazon S3 with Python.

    Sorry for the green screen issues – hope they’re not too bothersome!

    A few notes on creating IAM users:
    – Create a user
    – Enable programmatic access
    – Add the user to a group with the desired permissions
    (In this video we use S3, but you can use this client with many more AWS services; you would assign those permissions to a group the same way we assign the S3 permissions here.)
    – Copy the access key and secret access key
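
    Once you have the keys, one way to keep them out of source control is to read them from environment variables instead of hard-coding them in a file. This is just a sketch; the variable names follow the standard AWS convention, and you'd export them in your shell before running the script:

    ```python
    import os

    # Read AWS credentials from environment variables (standard AWS names),
    # so the keys never appear in source control. Defaults to "" if unset.
    access_key = os.environ.get("AWS_ACCESS_KEY_ID", "")
    secret_access_key = os.environ.get("AWS_SECRET_ACCESS_KEY", "")
    ```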

    Legit crazy that I’m writing this with 11,000+ subscribers — How awesome! I write these notes in the descriptions of most of my videos, and I still remember writing it with 250 subscribers and being so thankful. I’m honored I get to create these videos and hopefully they’re of use to some of you. Thank you for all the support and kind words throughout the last year!

    Support the Channel on Patreon —
    https://www.patreon.com/join/derricksherrill
    Join The Socials —
    Reddit – https://www.reddit.com/r/CodeWithDerrick/
    FB – https://www.facebook.com/CodeWithDerrick/
    Insta – https://www.instagram.com/codewithderrick/
    Twitter – https://twitter.com/codewithderrick
    LinkedIn – https://www.linkedin.com/in/derricksherrill/
    GitHub – https://github.com/Derrick-Sherrill
    *****************************************************************
    Full code from the video:

    # Note: "secrets" here is a local secrets.py holding your keys,
    # not the standard-library secrets module.
    from secrets import access_key, secret_access_key

    import boto3
    import os

    client = boto3.client('s3',
                          aws_access_key_id=access_key,
                          aws_secret_access_key=secret_access_key)

    # Upload every .py file in the current directory to the bucket,
    # under a 'python/' prefix.
    for file in os.listdir():
        if file.endswith('.py'):
            upload_file_bucket = 'youtube-dummy-bucket'
            upload_file_key = 'python/' + str(file)
            client.upload_file(file, upload_file_bucket, upload_file_key)

    https://github.com/Derrick-Sherrill/DerrickSherrill.com/blob/master/automatic_s3_uploader.py
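
    As a small refinement, pulling the file selection into a helper keeps the upload loop tidy and avoids matching names that merely contain ".py" (like backup.pyc). `files_with_extension` is a hypothetical helper name, not something from the video:

    ```python
    import os

    def files_with_extension(ext, dirname="."):
        """Return regular files in dirname whose names end with ext."""
        return sorted(
            name for name in os.listdir(dirname)
            if name.endswith(ext) and os.path.isfile(os.path.join(dirname, name))
        )
    ```

    You'd then loop over `files_with_extension('.py')` instead of filtering inside the loop.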

    Packages (& versions) used in this video:
    boto3 1.11.13
    os (standard library)

    *****************************************************************
    Code from this tutorial and all my others can be found on my GitHub:
    https://github.com/Derrick-Sherrill/DerrickSherrill.com

    Check out my website:
    https://www.derricksherrill.com/

    If you liked the video – please hit the like button. It means more than you know. Thanks for watching and thank you for all your support!!

    — Channel FAQ —

    What text editor do you use?
    Atom – https://atom.io/

    What Equipment do you use to film videos?
    https://www.amazon.com/shop/derricksherrill

    What editing software do you use?
    Adobe CC – https://www.adobe.com/creativecloud.html
    Premiere Pro for video editing
    Photoshop for images
    After Effects for animations

    Do I have any courses available?
    Yes & always working on more!
    https://www.udemy.com/user/derrick-sherrill-2/

    Where do I get my music?
    I get all my music from the copyright-free YouTube Audio Library
    https://www.youtube.com/audiolibrary/music?nv=1

    Let me know if there’s anything else you want answered!

    ————————-

    Always looking for suggestions on what video to make next — leave me a comment with your project! Happy Coding!



    44 COMMENTS

    1. just reviewed another video from someone that literally walked through this same scenario. however, their video was 57 mins! no. i did not watch the 57 min video. i watched about 3 mins and saw your video sitting in the sidebar screaming '7 minute video over here!' haha.

      to the point and just the facts. no showboating knowledge and skills.

      if im ever drowning, i hope you are on the shore to save me and not the person who makes a 57 min video on 'how to jump in and save a drowning person'. haha

    2. Nice video. I have one question. I don't want to download the content from S3; I want to know the content details, like the number of .txt and .py files within that bucket.
      I am using aws s3 ls bucket_name/ --recursive --human-readable --summarize
      It shows the total size and number of objects, but I can't find the file counts with respect to their extensions.
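
    One way to get those per-extension counts is to list the bucket's keys (for example with boto3's list_objects_v2 paginator) and tally the extensions in Python; `count_by_extension` below is a hypothetical helper operating on an already-fetched list of keys:

    ```python
    from collections import Counter
    import os.path

    def count_by_extension(keys):
        """Tally S3 object keys by their file extension (e.g. '.py', '.txt')."""
        return Counter(os.path.splitext(key)[1] for key in keys)
    ```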