Upload a file to AWS S3 with FastAPI #3883
Unanswered · saeedesmaili asked this question in Q&A
Replies: 1 comment
-
Have you tried:

```python
@app.put(
    "/{system_code}/{file_name}",
    tags=["API V1"],
    summary="Upload file to S3 storage.",
    description="Upload file to S3 storage.",
)
async def upload_file(system_code: str, file: UploadFile = File(...)):
    if isSystemDefined(system_code):
        # Look up the system entry matching the given code.
        system = next((system for system in access if system["system_code"] == system_code), None)
        s3 = S3Browser(system)
        # Spool the upload to a temporary file, then hand the path to S3.
        temp_file = os.path.join("/tmp", file.filename)
        with open(temp_file, "wb") as out_file:
            out_file.write(await file.read())
        try:
            s3.fput_object(file.filename, temp_file)
            ...
```
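One caveat with the snippet above: `out_file.write(await file.read())` loads the entire upload into memory. A hedged alternative (standard-library only; with FastAPI you would pass the `UploadFile`'s underlying `.file` object, which is an assumption not shown in this thread) is to stream the upload to disk in chunks:

```python
import shutil
import tempfile


def save_stream_to_temp(fileobj, chunk_size=1024 * 1024):
    """Copy a file-like object to a named temporary file in chunks.

    Avoids holding the whole upload in memory at once. In a FastAPI
    endpoint you would call this as save_stream_to_temp(file.file)
    (hypothetical usage; the thread's code reads the file in one go).
    """
    tmp = tempfile.NamedTemporaryFile(delete=False)
    with tmp:
        shutil.copyfileobj(fileobj, tmp, length=chunk_size)
    return tmp.name
```

The returned path can then be handed to `s3.fput_object(...)` exactly as in the snippet above.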
-
I'm working on a FastAPI endpoint that uploads a user-provided file to an AWS S3 bucket. To make sure my AWS credentials and region are valid, I first tried the following code outside FastAPI (in a simple Python script), and it successfully uploads the file to the S3 bucket:
But I've tried many approaches (found by searching StackOverflow and the FastAPI GitHub repo issues), and with all of them I get the following error:
Here are two different endpoints I've tried without success:
I have also tried the following to make sure my AWS credentials are valid:
I can't understand why I can upload the file outside FastAPI but not from my endpoint. My FastAPI app is running in Docker (in case that matters here).
Update: Running FastAPI (with `python3 -m uvicorn app.main:app --reload --workers 1 --host 0.0.0.0 --port 8000 --proxy-headers`) outside Docker, it can upload files to S3, but the exact same code with the same hard-coded S3 credentials can't upload from inside Docker (although it can, for example, read the existing S3 buckets).

Update 2: The code inside Docker can read the list of buckets (with `s3.buckets.all()`) but can't read the list of objects inside a bucket (with `s3.Bucket("bucket-name").objects.all()`), and fails with `botocore.exceptions.EndpointConnectionError`.

Update 3: I added `boto3.set_stream_logger("")` to see the debugging logs, and here is where it fails when running inside Docker:
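For what it's worth, an `EndpointConnectionError` from inside a container usually points at DNS or network reachability rather than credentials. A quick standard-library check that can be run inside the container is sketched below; the S3 endpoint host name is an assumption based on the usual regional pattern:

```python
import socket


def can_resolve(host):
    """Return True if `host` resolves to an address from this environment.

    Run this inside the container against the S3 endpoint host, e.g.
    can_resolve("s3.us-east-1.amazonaws.com") -- the region is assumed.
    """
    try:
        socket.getaddrinfo(host, 443)
        return True
    except socket.gaierror:
        return False
```

If this returns `False` inside the container but `True` on the host, the container's network or DNS configuration is the likely culprit rather than the FastAPI code.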