Dear Synapse team,
When I tried to change the data storage to AWS S3, it failed with the following error:
```
Could not read S3 object at key my-folder/owner.txt from bucket my-bucket: Failed to determine the Amazon region for bucket 'my-bucket'. Please ensure that the bucket exists, is shared with Synapse, in particular granting ListObject permission.. For security purposes, Synapse needs to establish that the user has permission to write to the bucket. Please create an object in bucket 'my-bucket' with key 'my-folder/owner.txt' that contains a line separated list of identifiers for the user. Valid identifiers are the id of the user or id of a team the user is part of. Also see http://docs.synapse.org/articles/custom_storage_location.html for more information on how to create a new external upload destination.
```
The owner.txt has been uploaded (roughly along the lines sketched below), and the bucket is set up as described at https://docs.synapse.org/articles/custom_storage_location.html. It was working fine with this setup until the other day; has something changed?
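For reference, a minimal sketch of how such an owner.txt can be created and uploaded with the AWS CLI; the bucket and key below are the placeholders from the error message, and 1234567 stands in for the actual Synapse user or team ID:
```
# Write the Synapse user ID (or a team ID) into owner.txt.
# 1234567 is a placeholder; substitute your own Synapse user or team ID.
echo "1234567" > owner.txt

# Upload the file to the key named in the error message
# (my-bucket/my-folder are the placeholders used there).
aws s3 cp owner.txt s3://my-bucket/my-folder/owner.txt
```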
Please let me know how to solve this problem.
Created by Nobuyuki Izumihara (izumihara)

@brucehoff Thank you very much for your support!! I was able to change the storage to S3. The data has been uploaded without any problems.

@izumihara I found a problem on our side, which is now fixed. We are now able to run:
```
aws --region us-east-1 s3api head-bucket --bucket synapse-srpbs
```
without getting a 403 response. I am hopeful that this fixes the problem. Would you please try again and let us know whether it is fixed?

There is one possible problem on our side that I need to check. I will report back as soon as possible.

From my account the following command works fine; no error is returned:
```
aws --region us-east-1 s3api head-bucket --bucket synapse-srpbs
```
I've set it up according to the [Custom Storage Locations](https://docs.synapse.org/articles/custom_storage_location.html) page, and I understand that Synapse can't access S3 for some reason. I would like to know why. Do I need to create the bucket in a US region?

When you see the error:
```
Could not read S3 object at ... Failed to determine the Amazon region for bucket...
```
It means that AWS returned a Forbidden response to the head-bucket request. The same request can be issued with this AWS CLI command:
```
aws --region us-east-1 s3api head-bucket --bucket synapse-srpbs
```
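On the earlier question about a US region: the region a bucket actually lives in can be checked by its owner; a sketch, assuming the bucket name synapse-srpbs and credentials belonging to the bucket owner:
```
# Ask S3 which region the bucket lives in (run with the bucket owner's credentials).
# For a bucket in Tokyo this returns a LocationConstraint of "ap-northeast-1".
aws s3api get-bucket-location --bucket synapse-srpbs
```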
I ran the head-bucket command above (using the role arn:aws:iam::325565585839:root) and got this response:
```
aws --region us-east-1 s3api head-bucket --bucket synapse-srpbs
An error occurred (403) when calling the HeadBucket operation: Forbidden
```
For some reason your bucket is not accessible. If you can make the aforementioned CLI command work (e.g., using a role you control rather than Synapse's IAM role, as sketched below), then Synapse should work with your bucket.
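For instance, a minimal sketch of running the same check under credentials you control; the profile name my-admin is a placeholder for any AWS CLI profile with access to the bucket:
```
# Same head-bucket check, but issued with a profile you control
# rather than Synapse's role; "my-admin" is a placeholder profile name.
aws --profile my-admin --region us-east-1 s3api head-bucket --bucket synapse-srpbs
```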
That's right, I'm sorry, and thank you again.

How about this policy?
```
{
  "Version": "2008-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::325565585839:root"
      },
      "Action": [
        "s3:List*",
        "s3:GetBucketLocation"
      ],
      "Resource": "arn:aws:s3:::synapse-srpbs"
    },
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::325565585839:root"
      },
      "Action": [
        "s3:*Object*",
        "s3:*MultipartUpload*"
      ],
      "Resource": "arn:aws:s3:::synapse-srpbs/*"
    }
  ]
}
```
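For what it's worth, a policy like the one above can be applied and read back from the command line as well; a sketch, assuming the JSON is saved locally as policy.json and the bucket is synapse-srpbs:
```
# Apply the bucket policy from a local file (assumed to be saved as policy.json).
aws s3api put-bucket-policy --bucket synapse-srpbs --policy file://policy.json

# Read the policy back to confirm exactly what is attached to the bucket.
aws s3api get-bucket-policy --bucket synapse-srpbs
```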
Could you please format your policy using a code block? I'm not sure if you used
```
"s3:*Object*"
```
(which is correct) or
```
"s3:Object"
```
(which is incorrect). When you omit the code block formatting, the asterisks are lost.

Thank you for your prompt reply.
Below is the exact policy I have set for S3.
I tried it now, but I get the same error.
```
{
  "Version": "2008-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::325565585839:root"
      },
      "Action": [
        "s3:List*",
        "s3:GetBucketLocation"
      ],
      "Resource": "arn:aws:s3:::synapse-srpbs"
    },
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::325565585839:root"
      },
      "Action": [
        "s3:*Object*",
        "s3:*MultipartUpload*"
      ],
      "Resource": "arn:aws:s3:::synapse-srpbs/*"
    }
  ]
}
```

The policy you shared refers to the resource my-bucket-name literally (in two places). The resource in your policy should be the actual name of your bucket, synapse-srpbs. Can you let us know if correcting the policy fixes the problem?

Thank you so much!
Sorry for the late reply.
The bucket and folder names were placeholder names used for asking the question in this forum; in the production environment they are set as follows.
Bucket name: synapse-srpbs
Folder name: fc (an 'owner.txt' containing my Synapse ID has also been uploaded to this folder)
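As a side note, the exact content Synapse will read from that owner.txt can be double-checked by streaming the object to stdout; a sketch using the bucket and folder names above:
```
# Print the uploaded owner.txt so the Synapse user/team ID it contains can be verified;
# "-" tells the CLI to stream the object to stdout.
aws s3 cp s3://synapse-srpbs/fc/owner.txt -
```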
By the way, when I tried it on Google Cloud Platform, it worked fine.
I don't understand why it doesn't work on AWS.

The error message in your original post refers to 'my-bucket' and 'my-folder'. Are these literally the names you used for your bucket and folder, respectively?

The policy you shared refers to the resource **my-bucket-name** literally (in two places). The resource in your policy should be the actual name of your bucket.

Thank you very much for your support!
The S3 configuration I did is as follows.
**Region:**
ap-northeast-1
**Policy:**
```
{
  "Version": "2008-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::325565585839:root"
      },
      "Action": [
        "s3:ListBucket*",
        "s3:GetBucketLocation"
      ],
      "Resource": "arn:aws:s3:::my-bucket-name"
    },
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::325565585839:root"
      },
      "Action": [
        "s3:*Object*",
        "s3:*MultipartUpload*"
      ],
      "Resource": "arn:aws:s3:::my-bucket-name/*"
    }
  ]
}
```
**CORS:**
```
[
  {
    "AllowedHeaders": [
      "*"
    ],
    "AllowedMethods": [
      "GET",
      "POST",
      "PUT",
      "HEAD"
    ],
    "AllowedOrigins": [
      "*"
    ],
    "ExposeHeaders": [],
    "MaxAgeSeconds": 3000
  }
]
```
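As an aside, the same CORS rules can also be applied from the command line; a sketch assuming the bucket synapse-srpbs, and noting that the s3api CLI expects the rules wrapped in a CORSRules object (unlike the console, which takes the bare array shown above):
```
# Save the CORS rules in the shape the s3api CLI expects (wrapped in "CORSRules").
cat > cors.json <<'EOF'
{
  "CORSRules": [
    {
      "AllowedHeaders": ["*"],
      "AllowedMethods": ["GET", "POST", "PUT", "HEAD"],
      "AllowedOrigins": ["*"],
      "ExposeHeaders": [],
      "MaxAgeSeconds": 3000
    }
  ]
}
EOF

# Apply the CORS configuration to the bucket, then read it back to verify.
aws s3api put-bucket-cors --bucket synapse-srpbs --cors-configuration file://cors.json
aws s3api get-bucket-cors --bucket synapse-srpbs
```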
Last August, I changed the storage to S3 and uploaded the data. This bucket was used only for a single project.
When I tried to update the data this time, I was unable to upload it, so I am trying to reconfigure it.

Can you share the policy you set on the bucket (paste it below)?

When you say "It was working fine until the other day," are you saying that you have configured multiple Synapse projects to use this bucket? When was the last date on which you successfully configured a project to use the bucket?