Enabled S3, but images upload to server instead of buckets

Hi there,

I’ve installed Sharetribe on Ubuntu and used it without S3 enabled for a week. Now I’ve added the two S3 buckets and the two AWS access keys to the config file (according to the instructions) and restarted the server.
In the Chrome DevTools JS console I see no attempts to upload to S3, and the images end up on the server (in public/system/images) as if S3 weren’t enabled.
Has anyone seen this behaviour before? Any help would be greatly appreciated.

Kind regards, Bianca

OK, I solved this by removing the quotes around both access keys in the config file. It’s posting to AWS now, but I get a 400 Bad Request.
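For anyone hitting the same thing, this is roughly what the relevant part of my config looks like now — a sketch only, since the exact key names may differ between Sharetribe versions; note the values are unquoted:

```yaml
# config/config.yml (key names assumed — check your own Sharetribe version)
aws_access_key_id: AKIAEXAMPLEKEYID         # no quotes around the value
aws_secret_access_key: examplesecretaccesskey  # no quotes here either
s3_bucket_name: your-sharetribe-images
s3_upload_bucket_name: your-sharetribe-images-tmp
```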

Hi, do you use SSL (an HTTPS connection)?

Hello,

Did you solve the issue?

I am having the same problem (400 bad request).

I can upload correctly to the permanent S3 bucket (the one called “your-sharetribe-images” in the GitHub instructions).

But I get a 400 Bad Request error when I try to upload listing images.

I am wondering if the problem is due to a bad sharetribe.conf and/or wrong settings on the temporary S3 bucket (the one called “your-sharetribe-images-tmp” in the GitHub instructions).
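For anyone comparing settings: as I understand it, the tmp bucket receives direct uploads from the browser, so it needs a CORS policy. The GitHub instructions describe a rule along these lines — a sketch only, with the origin as a placeholder for your own marketplace domain:

```xml
<CORSConfiguration>
  <CORSRule>
    <!-- Replace with your marketplace's actual origin -->
    <AllowedOrigin>https://www.example-marketplace.com</AllowedOrigin>
    <AllowedMethod>POST</AllowedMethod>
    <AllowedMethod>PUT</AllowedMethod>
    <AllowedHeader>*</AllowedHeader>
  </CORSRule>
</CORSConfiguration>
```

A mismatch here (wrong origin, or a missing method) is one possible source of rejected uploads, though a 400 usually points at the request signature rather than CORS.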

Thank you in advance.

Hi,

I am pretty convinced it ultimately comes down to this: https://www.sharetribe.com/community/t/wrong-aws-signature-version-used-when-uploading-to-s3/1800
I’m no Ruby developer and currently can’t update the source code to use the correct signature version, so I decided to skip AWS uploads entirely. I’ve mounted the images directory on a separate (non-AWS) cloud storage, because I really don’t want the images on my production server.
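For anyone who does want to attempt a fix: from the linked thread, my understanding is that forcing Signature Version 4 would be the place to start. A sketch of what that might look like, assuming Sharetribe still uses the aws-sdk v1 gem (file name hypothetical, untested by me):

```ruby
# config/initializers/aws_signature.rb (hypothetical file name)
# Newer AWS regions (e.g. eu-central-1) only accept Signature Version 4;
# aws-sdk v1 defaults to the older signature, which S3 rejects with 400.
AWS.config(s3_signature_version: :v4)
```

If your bucket lives in one of the newer regions, this mismatch would explain a 400 Bad Request even with correct credentials.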