Hey everyone! Quick example of how we can automate uploading different kinds of files to Amazon S3.
Sorry for the green screen issues – hope they’re not too bothersome!
Few notes on Creating IAM users:
– Create a user
– Enable programmatic access
– Add user to a group with the desired permissions
(In this video we use S3, but you can use this client with many more AWS services. You would assign those permissions to a group the same way we assign the S3 permissions here.)
– Copy the access key and secret access key (one way to store them locally is sketched just below)
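The script further down imports those two values from what is presumably a local secrets.py sitting next to it. A minimal sketch of what that file could look like (the variable names match the video's import; the values are placeholders):

# secrets.py - lives next to the uploader script; never commit it to source control
access_key = "YOUR_ACCESS_KEY_ID"            # paste the access key copied from IAM
secret_access_key = "YOUR_SECRET_ACCESS_KEY"  # paste the secret access key

Note that a local file named secrets.py shadows Python's standard-library secrets module, so the script needs to be run from the folder that contains it.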
Legit crazy that I’m writing this with 11,000+ subscribers — how awesome! I write these notes in the descriptions of most of my videos, and I still remember writing them with 250 subscribers and being so thankful. I’m honored I get to create these videos, and hopefully they’re of use to some of you. Thank you for all the support and kind words throughout the last year!
Support the Channel on Patreon —
https://www.patreon.com/join/derricksherrill
Join The Socials —
Reddit – https://www.reddit.com/r/CodeWithDerrick/
FB – https://www.facebook.com/CodeWithDerrick/
Insta – https://www.instagram.com/codewithderrick/
Twitter – https://twitter.com/codewithderrick
LinkedIn – https://www.linkedin.com/in/derricksherrill/
GitHub – https://github.com/Derrick-Sherrill
*****************************************************************
Full code from the video:
from secrets import access_key, secret_access_key  # local secrets.py holding the IAM keys
import boto3
import os

# Create an S3 client with the programmatic-access keys from the IAM user
client = boto3.client('s3',
                      aws_access_key_id=access_key,
                      aws_secret_access_key=secret_access_key)

# Upload every .py file in the current working directory under the 'python/' prefix
for file in os.listdir():
    if '.py' in file:
        upload_file_bucket = 'youtube-dummy-bucket'
        upload_file_key = 'python/' + str(file)
        client.upload_file(file, upload_file_bucket, upload_file_key)
https://github.com/Derrick-Sherrill/DerrickSherrill.com/blob/master/automatic_s3_uploader.py
Packages (& Versions) used in this video:
boto3 1.11.13
os (Python standard library)
*****************************************************************
Code from this tutorial and all my others can be found on my GitHub:
https://github.com/Derrick-Sherrill/DerrickSherrill.com
Check out my website:
https://www.derricksherrill.com/
If you liked the video – please hit the like button. It means more than you know. Thanks for watching and thank you for all your support!!
— Channel FAQ —
What text editor do you use?
Atom – https://atom.io/
What Equipment do you use to film videos?
https://www.amazon.com/shop/derricksherrill
What editing software do you use?
Adobe CC – https://www.adobe.com/creativecloud.html
Premiere Pro for video editing
Photoshop for images
After Effects for animations
Do I have any courses available?
Yes & always working on more!
https://www.udemy.com/user/derrick-sherrill-2/
Where do I get my music?
I get all my music from the copyright-free YouTube Audio Library
https://www.youtube.com/audiolibrary/music?nv=1
Let me know if there’s anything else you want answered!
————————-
Always looking for suggestions on what video to make next — leave me a comment with your project! Happy Coding!
Bro how do I download the uploaded file and load it on my web page? Do you have a tutorial on that?
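To pull a file back down, boto3's S3 client has download_file, and generate_presigned_url can produce a temporary link a web page could use. A rough sketch with the same client as the video (bucket and key names are placeholders):

# Download an object back to a local file
client.download_file('youtube-dummy-bucket', 'python/some_script.py', 'some_script.py')

# Or generate a temporary URL (valid for one hour here) that a web page can link to
url = client.generate_presigned_url(
    'get_object',
    Params={'Bucket': 'youtube-dummy-bucket', 'Key': 'python/some_script.py'},
    ExpiresIn=3600)
print(url)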
Hi there! If you could suggest some python approach to write a file into multiple target buckets from source bucket, that would be helpful. Thanks
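One possible approach: copy_object copies server-side from the source bucket into each target bucket, with no local download needed. A sketch (bucket names and key are placeholders):

source_bucket = 'my-source-bucket'
target_buckets = ['target-bucket-1', 'target-bucket-2']
key = 'python/some_script.py'

for target in target_buckets:
    # Server-side copy of the object into each target bucket
    client.copy_object(CopySource={'Bucket': source_bucket, 'Key': key},
                       Bucket=target,
                       Key=key)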
Thank you! Greetings from the Dominican Republic.
This is natural teaching.
Are you able to download, as well?
Looking for a way to transfer files from our SharePoint location to an AWS S3 bucket using Python. Any good suggestions on how to go about doing it?
Excellent, Derrick!!!
Brilliant tutorial well explained
How do we securely put the access key and secret key into the Python or C# code?
Because with your method, anyone can decompile my program and get my access key and secret key?
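One common answer is to keep the keys out of the code entirely: boto3 will pick up AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY from environment variables, or use an IAM role when the code runs on AWS, so nothing sensitive ships with the program. A minimal sketch:

import boto3

# No keys in the source: boto3 reads AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY
# from the environment, or uses role credentials when running on AWS.
client = boto3.client('s3')
client.upload_file('report.py', 'youtube-dummy-bucket', 'python/report.py')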
Stupendous tutorial….Need more such tutorials
Thank you Derrick. Nice tutorial. Thanks for your time! It helped me a lot.
If you have a video for SFTP, please share the link.
Just reviewed another video from someone that literally walked through this same scenario. However, their video was 57 mins! No, I did not watch the 57 min video. I watched about 3 mins and saw your video sitting in the sidebar screaming '7 minute video over here!' haha.
To the point and just the facts. No showboating knowledge and skills.
If I'm ever drowning, I hope you are on the shore to save me and not the person who makes a 57 min video on 'how to jump in and save a drowning person'. haha
amazing, it was very useful
I have a question: how can I send the file to a specific path in my bucket?
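The key argument to upload_file is the path inside the bucket, so a prefix acts as the folder. For example, to drop a file under a hypothetical reports/2020/ prefix:

# The third argument (the key) decides where the object lands inside the bucket
client.upload_file('data.csv', 'youtube-dummy-bucket', 'reports/2020/data.csv')

S3 has no real directories; the prefix in the key is all there is to the "path".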
Wish I was this beautiful hahah. Great vid dude! Helped me a lot.
Thanks Derrick!
All about S3 Bucket;
Subscribe for more such videos.
https://www.youtube.com/watch?v=ljtrRW69z10
Awesome Derrick, I have one requirement where I need to upload a file from S3 to a SharePoint site using Python on AWS EC2. Any help please???
Thank you, it's an amazing video!!!
Nice video. I have one question. I don't want to download the content from S3, but I do want to know the content details, like the number of .txt and .py files within that bucket.
I am using: aws s3 ls bucket_name/ --recursive --human-readable --summarize
It shows the total size and the number of objects, but I can't get the file counts broken down by extension.
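One way to get per-extension counts without downloading anything is to page through list_objects_v2 and tally the key suffixes. A sketch (the bucket name is a placeholder):

import os
from collections import Counter

counts = Counter()
paginator = client.get_paginator('list_objects_v2')
for page in paginator.paginate(Bucket='youtube-dummy-bucket'):
    for obj in page.get('Contents', []):
        # os.path.splitext gives '.txt', '.py', etc. (empty string if no extension)
        counts[os.path.splitext(obj['Key'])[1]] += 1
print(counts)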
Hi Derrick!! I want to know if it's possible to do this with an AWS Lambda for automation purposes?? (sorry about the English, I'm from Brazil)
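Yes, the same boto3 calls work inside AWS Lambda; the usual difference is that the function's execution role carries the S3 permissions, so no keys appear in the code. A minimal sketch of a handler that reacts to an S3 upload event and copies the new object into a second bucket (the backup bucket name is a placeholder):

import boto3

s3 = boto3.client('s3')  # credentials come from the Lambda execution role

def lambda_handler(event, context):
    # S3 ObjectCreated events carry the bucket and key of each new object
    for record in event['Records']:
        bucket = record['s3']['bucket']['name']
        key = record['s3']['object']['key']
        s3.copy_object(CopySource={'Bucket': bucket, 'Key': key},
                       Bucket='my-backup-bucket',
                       Key=key)
    return {'status': 'done'}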
Great video like always! Can you also make a video on cron job?
Good sample. Could you please make an example of how to send email with Lambda too?
I am fairly new to data but you are awesome. Thank you!!
Thanks Derrick!!
Can I do this from a Jupyter Notebook?
3:36 — file isn't actually a keyword, but it was a built-in name in Python 2, and shadowing built-in names is a bad habit. Use something like a_file or my_file instead.
Hi Derrick and all
I tried this method and I am getting the error below
ImportError: cannot import name 'access_key' from 'secrets' (C:\Program Files\Python37\lib\secrets.py)
Did anyone have the same error, and how do you get around it?
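The traceback shows Python importing the standard-library secrets module (under Python37\lib) rather than a local file, which happens when there is no secrets.py next to the script or the script is run from a different working directory. One way to sidestep the name clash entirely is a differently named file, e.g. a hypothetical aws_keys.py:

# aws_keys.py - named so it cannot clash with Python's built-in secrets module
access_key = "YOUR_ACCESS_KEY_ID"
secret_access_key = "YOUR_SECRET_ACCESS_KEY"

# in the uploader script:
from aws_keys import access_key, secret_access_key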
Hi, how can I read/browse the files in S3?
I have my bucket set up like "parent_Bucket/Chield_Bucket", and now I want to upload my file to Chield_Bucket. Could you please suggest what I need to do for this?
What should I do when S3 has multiple buckets and we need to choose a particular bucket??
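The bucket is simply the second argument to upload_file, so you choose it by name; if you first need to see which buckets the credentials can reach, list_buckets returns them. A quick sketch:

# List the buckets visible to these credentials, then upload to the chosen one
response = client.list_buckets()
for bucket in response['Buckets']:
    print(bucket['Name'])

client.upload_file('script.py', 'the-bucket-you-picked', 'python/script.py')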
Thanks man ! Great video!
You are awesome Derrick! Could you please upload a video on automating data transfer from AWS S3 to AWS Redshift (using AWS Glue) or any other appropriate method?
This is exactly what I needed for work, thanks so much!
Awesome
Well Explained.
This is very helpful, thank you.
How do I copy a file into another directory?
You're the man. Need more content!
If we need to read multiple files from an AWS S3 bucket, what should the Python code be?
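One option is to list the keys and stream each object with get_object instead of saving to disk. A sketch that reads every .txt object under a prefix (bucket and prefix are placeholders):

paginator = client.get_paginator('list_objects_v2')
for page in paginator.paginate(Bucket='youtube-dummy-bucket', Prefix='python/'):
    for obj in page.get('Contents', []):
        if obj['Key'].endswith('.txt'):
            body = client.get_object(Bucket='youtube-dummy-bucket',
                                     Key=obj['Key'])['Body'].read()
            print(obj['Key'], len(body), 'bytes')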
Thanks man. Much appreciated!
Can you do/suggest the best automation tools to work with AWS?
Dude, is there any reason why you aren't a cloud developer for a company?
@Derrick Sherrill every time I try to run the code it picks up files from the windows32 directory. I have the .py file inside a folder with a few test .txt files, but it keeps picking up .txt files from the win32 dir. What's happening? Please help.
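That usually happens because os.listdir() with no argument lists the current working directory, which can be a system folder when the script is launched from an IDE or scheduler rather than from its own folder. One fix is to point it at the script's own directory explicitly, roughly like this:

import os

# Folder containing this script, regardless of where it was launched from
script_dir = os.path.dirname(os.path.abspath(__file__))

for file in os.listdir(script_dir):
    if file.endswith('.txt'):
        client.upload_file(os.path.join(script_dir, file),
                           'youtube-dummy-bucket',
                           'python/' + file)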