interfacelab / ilab-media-tools
Home Page: https://mediacloud.press/
License: GNU General Public License v3.0
I imported about 200 images to the wrong cloud directory, then deleted the files on the cloud, so all 200 of my media items now point to empty cloud locations. I tried reinstalling, but the data isn't purging. I can't delete the 200 images and start over, because they are linked from many different posts.
Maybe each image should have an "unlink" option to unlink it from the cloud? In the meantime, how do I clear this manually?
When an IAM role is assigned to an EC2 instance, the credentials needed to access S3 are not just ILAB_AWS_S3_ACCESS_KEY and ILAB_AWS_S3_ACCESS_SECRET but also a session token. All three values are automatically available inside an EC2 instance attached to an instance profile with an IAM role:
# Query the instance metadata service for the role name, then its credentials
ENDPOINT=http://169.254.169.254/latest/meta-data/iam/security-credentials/
ROLE=$(curl -s "$ENDPOINT")
KEYS=$(curl -s "$ENDPOINT$ROLE")
# Fill in ILAB env vars (jq -r strips the surrounding quotes)
ILAB_AWS_S3_ACCESS_KEY=$(echo "$KEYS" | jq -r ".AccessKeyId")
ILAB_AWS_S3_ACCESS_SECRET=$(echo "$KEYS" | jq -r ".SecretAccessKey")
# ⚠️ The one below is missing - we request support for this one:
ILAB_AWS_S3_SESSION_TOKEN=$(echo "$KEYS" | jq -r ".Token")
The Yoast SEO plugin has quite involved logic for finding the correct image to share to Facebook, and it uses the get_intermediate_size() function to check these sizes. That function unfortunately assumes it can build the URL with $data['url'] = path_join( dirname($file_url), $data['file'] );
which breaks Imgix, because no GET parameters are passed, and in the case of secure URLs it blocks them entirely. Since this is core WordPress functionality that is expected to work, ImgixTool probably needs to filter this explicitly.
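One possible workaround - a sketch only, not the plugin's actual code - is to hook WordPress's `image_get_intermediate_size` filter (the filter applied to the very `$data` array built above) and rebuild the URL from the attachment's filtered source, so Imgix query parameters survive:

```php
<?php
// Sketch: re-derive the intermediate-size URL through the normal attachment
// URL filters, so Imgix query strings (and signed URLs) are preserved.
// Assumption: wp_get_attachment_image_src() returns the Imgix-rewritten URL.
add_filter('image_get_intermediate_size', function ($data, $post_id, $size) {
    static $busy = false; // guard against recursion, since the call below
                          // can re-enter image_get_intermediate_size()
    if ($busy || empty($data['file'])) {
        return $data;
    }
    $busy = true;
    $src = wp_get_attachment_image_src($post_id, $size);
    $busy = false;
    if (is_array($src) && !empty($src[0])) {
        $data['url'] = $src[0];
    }
    return $data;
}, 10, 3);
```

This keeps Yoast's size-checking logic intact while letting the storage/Imgix URL filters decide the final URL.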
Hi @jawngee,
Please see pull request #52
I need this change and I've made it in my installation of this plugin; it's working fine.
Wasabi have said to me:
We do not encourage our users to use ACLs, as they are deprecated. It is always better to use policies.
Therefore we need a blank option for ACL - could this be added to the next release, please?
Thank you 😁
When issuing the wp mediacloud import command, all files are imported into the base directory of our object storage bucket using the default WordPress prefix.
Note that newly-uploaded images do honor this setting. Only when running the script from the command line is there a problem.
I have S3 and Imgix enabled. "Render PDF files" is not checked, but in the media library I am seeing the attachment URL go to Imgix. Imgix will try to render it, of course, but I think this is probably not desired functionality.
Hi, just include wp-admin/includes/image.php in ilab-media-tools.php:
include_once(ABSPATH . 'wp-admin/includes/image.php');
Hey,
I just installed Media Cloud, but after uploading my first test the scheme is missing from the public URL, so the browser tries to load images from the local webserver. I use wp_get_attachment_image_src() to get the image source, but it ends up with a source like "cdn.domain.com/year/month/picture.jpg" when it should be "http://cdn.domain.com/year/month/picture.jpg".
Am I doing something wrong, or is it the plugin doing strange things?
Actually I am using MC 2.1 @ WP 4.8.1
BR
ray87
Hey guys,
Trying to debug an issue where, after entering my S3 storage credentials and clicking import media, Media Cloud loops through each image and appears to upload, but no images ever show up in S3.
I assume it is a credential issue and have enabled debugging from the plugin's list of options on the home landing screen, but I am unsure of where to access these debug logs. Hope you guys can help!
I just upgraded our multisite with the new version of the plugin. Thanks for the changes! When I go to click a media library item now I get this error in my console:
...
Uncaught TypeError: Cannot read property 'CacheControl' of undefined
...
I don't have any of the new AWS fields filled out, including CacheControl. Shouldn't those be optional/backwards compatible?
I'll see if I can push a fix up if the issue isn't fixed yet tomorrow or the next day - I've got a deadline today.
The plugin is working properly for its intended purpose on S3 images from the media library.
However, when installed, it breaks WordPress's native ability to upload and install a plugin through the normal plugin uploader (not from the repository, only via manual upload of a plugin .zip file).
Perhaps it's a permissions issue, or WP is still looking to the native path on the site for the .zip file instead of to S3.
I'll dig into the code later unless you have an immediate "a ha" how to solve.
Here is screenshot.
Cheers!
spence
Looks like I'm getting this on every crop size, as well as for the full-res image. So on default WP I get this error 4x. Pretty sure I've entered everything correctly. Screenshot of S3 settings:
Here's the error:
Notice: Undefined index: s3 in /srv/www/cellar.example.com/current/web/app/plugins/ilab-media-tools/classes/tools/s3/ilab-media-s3-tool.php on line 185
For fun, I tried removing the code in the if statement and just used the code on line 187 and it didn't throw the errors, but it doesn't look like it's uploading to S3 as my bucket is empty.
I tried turning Imgix off and the issue's still there.
Running a Trellis box with an up-to-date Bedrock install on multisite. Your plugin is installed as a normal plugin with Composer and is network activated.
Any thoughts? Thanks so much! Super excited to use this.
Been scratching my head for a few hours with this one, but it seems that the latest updates have broken WooCommerce's single-image displays. The thumbnails work (if there are multiple images); it's just the main product images.
I was able to get everything working by switching the order of the data going into the sizeToFitSize function here:
While I'm not 100% sure how this works yet, I think the images I was getting back are returning a height of 0, and so all the ratio math happening in the sizeToFitSize function is just returning 0 - but again, I'm not positive.
For now, I am using my fix deployed over here: Craftpeak@1725209, since we needed it to work ASAP, but I don't want to submit a PR yet because I obviously don't fully understand what's happening and the implications it could have for other uses.
Edit: shoot. Looks like the change I made is causing images not to preserve their aspect ratios for some images that were uploaded non-square. I'll spend some more time on it when I can get a minute.
After the update, all websites are throwing error 500 with the following logs:
2017/09/26 09:30:17 [error] 18#0: *47 FastCGI sent in stderr: "PHP message: PHP Parse error: syntax error, unexpected 'function' (T_FUNCTION), expecting identifier (T_STRING) or \\ (T_NS_SEPARATOR) in /usr/share/nginx/www/wp-content/plugins/ilab-media-tools/classes/Tools/Crop/CropTool.php on line 17" while reading response header from upstream, client: *******, server: localhost, request: "GET / HTTP/1.1", upstream: "fastcgi://unix:/var/run/php5-fpm.sock:"
Hello,
I've got buckets in the EU (Paris) region, which has the key eu-west-3.
Yet this plugin does not offer the ability to choose this region.
Can you add it, please?
If you upload an image with the same name as an existing one, it might overwrite an image that is already uploaded to the cloud, because WordPress can't detect that the file already exists and does not rename the file to a unique name.
Steps to reproduce:
Please at least put an "insecure" notification below this option in the settings.
I was trying to enable Direct Uploads, but it seems it is not turning on. I already have my storage account and Imgix account set up and am able to upload to S3, but not using the Direct Uploads method. https://screencast.com/t/I8DpH4SOhAz6. I can't find any additional info on how to properly set it up or troubleshoot that part.
Attempting to set up Imgix and S3. The Imgix subdomain is deployed using an IAM user belonging to a group made with the Sample Policies for WordPress.
Can't get past this error in Troubleshooter:
Validate Storage Settings
There was an error or errors trying to connect to the storage provider. Error insuring that stretcheveryday exists. Message: Error executing "ListBuckets" on "https://s3.ca-central-1.amazonaws.com/"; AWS HTTP error: cURL error 60: SSL certificate problem: unable to get local issuer certificate (see http://curl.haxx.se/libcurl/c/libcurl-errors.html)
This is the libcurl error:
CURLE_PEER_FAILED_VERIFICATION (60)
The remote server's SSL certificate or SSH md5 fingerprint was deemed not OK. This error code has been unified with CURLE_SSL_CACERT since 7.62.0. Its previous value was 51.
This is my policy:
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Action": [
"s3:DeleteObjectTagging",
"s3:ListBucketMultipartUploads",
"s3:DeleteObjectVersion",
"s3:ListBucket",
"s3:DeleteObjectVersionTagging",
"s3:GetBucketAcl",
"s3:ListMultipartUploadParts",
"s3:PutObject",
"s3:GetObjectAcl",
"s3:GetObject",
"s3:AbortMultipartUpload",
"s3:DeleteObject",
"s3:GetBucketLocation",
"s3:PutObjectAcl"
],
"Resource": [
"arn:aws:s3:::stretcheveryday/*",
"arn:aws:s3:::stretcheveryday"
]
},
{
"Effect": "Allow",
"Action": "s3:HeadBucket",
"Resource": "*"
}
]
}
I'm trying to connect from a local installation; does that matter?
Do I need to do something with the Public Access Settings or the Access Control List on the bucket? I added the CORS snippet:
<?xml version="1.0" encoding="UTF-8"?>
<CORSConfiguration xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
<CORSRule>
<AllowedOrigin>*</AllowedOrigin>
<AllowedMethod>GET</AllowedMethod>
<AllowedMethod>PUT</AllowedMethod>
<AllowedMethod>POST</AllowedMethod>
<AllowedMethod>HEAD</AllowedMethod>
<MaxAgeSeconds>3000</MaxAgeSeconds>
<AllowedHeader>*</AllowedHeader>
</CORSRule>
</CORSConfiguration>
Any suggestions greatly appreciated. Debug Log
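For what it's worth, cURL error 60 on a local install usually means PHP's cURL has no usable CA bundle rather than anything bucket-related. A common remedy (an assumption about this environment; the paths below are examples to adjust) is to download Mozilla's CA bundle and point php.ini at it:

```ini
; php.ini - point cURL (and OpenSSL) at an up-to-date CA bundle.
; Download https://curl.haxx.se/ca/cacert.pem first; adjust the path to suit.
curl.cainfo = "/usr/local/etc/php/cacert.pem"
openssl.cafile = "/usr/local/etc/php/cacert.pem"
```

After editing php.ini, restart the web server or PHP-FPM so the setting takes effect.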
Hi, just curious whether additional configuration needs to be done for multisite installations?
I'm testing a multisite setup using subdirectories - not subdomains - as the site is parked at a subdomain.
Out of the box Media Cloud is not working; I'm using DO Spaces. Love the plugin and not being stuck with AWS. So hopefully this is something simple? Thanks!
In v2.1.5 I am still getting an HTTP error when trying to upload PDFs. The error points to cms/wp-admin/async-upload.php.
The PDF is only 1MB and I have the "Upload Non-Image Files" option turned on in the Storage Settings.
Fatal error: Uncaught Error: Call to undefined function mime_content_type() in /home/domain/public_html/wp-content/plugins/ilab-media-tools/classes/tools/s3/ilab-media-s3-tool.php:785 Stack trace: #0 /home/domain/public_html/wp-content/plugins/ilab-media-tools/classes/tools/s3/ilab-media-s3-tool.php(548): ILabMediaS3Tool->processFile(Object(ILAB_Aws\S3\S3MultiRegionClient), '/home/domain', '2017/09/female-...', Array, 3105) #1 /home/domain/public_html/wp-includes/class-wp-hook.php(298): ILabMediaS3Tool->updateAttachmentMetadata(Array, 3105) #2 /home/domain/public_html/wp-includes/plugin.php(203): WP_Hook->apply_filters(Array, Array) #3 /home/domain/public_html/wp-includes/post.php(5013): apply_filters('wp_update_attac...', Array, 3105) #4 /home/domain/public_html/wp-admin/includes/media.php(383): wp_update_attachment_metadata(3105, Array) #5 /home/domain/public_html/wp-admin/async-upload.php(91): media_handle_upload('async-upload', 0) #6 {main} in /home/domain/public_html/wp-content/plugins/ilab-media-tools/classes/tools/s3/ilab-media-s3-tool.php on line 785
There is a forced redirect in classes/Utilities/Helpers.php if the file is simply included - I guess it's the standard WordPress guard against direct access through a URL. This is a bit problematic in our project, because that constant is not set until later in some of our classes (which then load WordPress), and since this file is listed as explicitly autoloaded we can't even initialize the autoloader and load our files without it.
Is this guard really needed, and if so, could it be moved outside of the helpers file so as to delay it until the library is actually requested and needed? In our case it's not so simple to just define ABSPATH, as we would like to do some other things before that, and this prevents us from even loading classes before we've defined it.
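For reference, the guard being discussed is presumably the standard WordPress direct-access check (an assumption; the exact code in Helpers.php may differ):

```php
<?php
// Standard WordPress direct-access guard: bail out if this file is loaded
// outside a WordPress bootstrap (wp-load.php defines ABSPATH before plugins run).
// Placed at file scope in an autoloaded file, it runs as soon as the
// autoloader touches the file - which is exactly the problem described above.
defined('ABSPATH') || exit;
```

Moving that line out of the autoloaded helpers file and into the plugin's entry point would avoid the early exit while keeping the protection.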
WordPress's new block layout with responsive images adds a bad source path for images uploaded to the AWS cloud, resulting in images not showing. If I disable the plugin, all images show. The srcset for responsive images all points to the correct AWS bucket, but the original source file path does not. In our WP install, the source image points to the image in the WP database with the addition of a "/h" at the end of the source URL. Thank you for any assistance on this issue.
Hi,
Just thought I would let you know that the 2.0.4 update somehow refactored all my images to different image sizes, and I am not sure why.
Thanks for such an awesome plugin and for continuing the new releases on a regular basis.
You make my life 100x easier by not having to use the S3 offload plugin.
Thanks!
This is more of a feature request/idea, to see if there is interest in making it possible to use the awesome cropping tool for images that aren't automatically cropped by WordPress. I think it would be great to allow clients/users the ability to crop all images as they see fit - somewhat like this plugin, which I've used many times before.
Anyone else interested in this? I'm going to try to take a look today to see how hard it would be to pull off with Imgix. Not really sure how I would pull this off with just S3, but that doesn't fit my use case, so I was thinking Imgix would be a requirement.
Hello again,
I discovered an issue: the plugin is not compatible with EDD Free Downloads. When you click to download a file, it does not rewrite the URL for the file that is now on S3.
Can you provide any assistance?
Blessings!
Hi,
I got this error:
PHP Warning: Illegal offset type in isset or empty in /var/www/html/blog/wp-content/plugins/ilab-media-tools/classes/tools/s3/ilab-media-s3-tool.php on line 1012
How could I fix it?
Thanks
Adding support for PHPLeague's Glide library, which is sort of a local version of Imgix that runs directly in WordPress.
Crash crash crash - help?
Fatal error: Uncaught ChrisWhite\B2\Exceptions\B2Exception: Received error from B2: in /home/hearjjdp/public_html/wp-content/plugins/ilab-media-tools/vendor/cwhite92/b2-sdk-php/src/ErrorHandler.php:36 Stack trace: #0 /home/hearjjdp/public_html/wp-content/plugins/ilab-media-tools/vendor/cwhite92/b2-sdk-php/src/Http/Client.php(29): ChrisWhite\B2\ErrorHandler::handleErrorResponse(Object(GuzzleHttp\Psr7\Response)) #1 /home/hearjjdp/public_html/wp-content/plugins/ilab-media-tools/vendor/cwhite92/b2-sdk-php/src/Client.php(411): ChrisWhite\B2\Http\Client->request('GET', 'https://api.bac...', Array) #2 /home/hearjjdp/public_html/wp-content/plugins/ilab-media-tools/vendor/cwhite92/b2-sdk-php/src/Client.php(38): ChrisWhite\B2\Client->authorizeAccount() #3 /home/hearjjdp/public_html/wp-content/plugins/ilab-media-tools/classes/Cloud/Storage/Driver/Backblaze/BackblazeStorage.php(189): ChrisWhite\B2\Client->__construct('e9c98bdc7da6', 'K0011omOOyRjHER...') #4 /home/hearjjdp/public_html/wp-content/plugins/ilab-media-tools/classes/Cloud in /home/hearjjdp/public_html/wp-content/plugins/ilab-media-tools/vendor/cwhite92/b2-sdk-php/src/ErrorHandler.php on line 36
We are running Imgix and are uploading our images to S3. This has been working fine up until version 2.15, it seems (the site has been in development for well over a year, with much of the same functionality).
I was troubleshooting an issue where a few select images always returned a 403, so I disabled secure URLs in the Imgix dashboard and inside the settings of the Media Cloud plugin. I also tweaked the cache expiration and set it to public,max-age=86000.
When I saved these settings the site went into a 500/white screen, and this is what I have in my log:
[30-Oct-2018 10:21:50 UTC] PHP Fatal error: Uncaught Error: Class 'ILAB\MediaCloud\Cloud\Storage\StorageManager' not found in /htdocs/wp-content/plugins/ilab-media-tools/ilab-media-tools.php:90
Stack trace:
#0 /wordpress-4.9.8/wp-settings.php(305): include_once()
#1 /htdocs/wp-config.php(54): require_once('/wordpress-4.9....')
#2 /htdocs/wp-load.php(37): require_once('/htdocs/wp-co...')
#3 /wordpress-4.9.8/wp-admin/admin.php(31): require_once('/htdocs/wp-lo...')
#4 /wordpress-4.9.8/wp-admin/index.php(10): require_once('/wordpress-4.9....')
#5 {main} thrown in /htdocs/wp-content/plugins/ilab-media-tools/ilab-media-tools.php on line 90
When I put ILAB_AWS_S3_BUCKET_PREFIX into my .env file it doesn't make any difference.
I think this line is probably where the ENV variable should be put in (there's a spot that just has ''), but I didn't want to make a PR not knowing as much about how v2 is put together.
Let me know if that's in fact the spot and if I should open up a PR.
Images are not uploading to S3. All credentials are fine; since adding the CloudFront CDN, images are no longer being added to S3 and thus not being served through the site setup.
Everything seems to have been set up correctly, and the site is fine with existing images being served correctly; however, new images are not being uploaded?
With a custom prefix defined, and storage activated, I get this error on storage activation, breaking the entire site (WSOD).
Based on this thread, I've tracked it down to those lines.
According to this post, wp_get_current_user isn't available until the plugins_loaded or init hooks.
A current fix for the issue is to change these lines like this:
-$user = wp_get_current_user();
+$user = 0;
+if(function_exists('wp_get_current_user')) {
+ $user = wp_get_current_user();
+}
$userName = '';
if($user->ID != 0) {
$userName = sanitize_title($user->display_name);
}
But that doesn't actually solve the issue. I think whatever calls the prefixer (possibly the entire plugin) needs to be wrapped in init or plugins_loaded.
Related: https://discourse.interfacelab.io/t/enabling-storage-results-in-white-screen-of-death/118
Loving this plugin. I'm trying to further optimize image delivery via Imgix and came across this article: https://blog.imgix.com/2016/11/30/passing-pagespeed.html, which suggests using auto=compress. Would you be open to a PR which adds this checkbox to the backend? The q= parameter would be respected as well 👍. If so, I'll go ahead and submit a PR. Thanks!
My customised gallery displays a bank of thumbnail-size images next to a JS-driven enlargement.
Thumbnails are 300 x 300, hard crop. This used to result in all thumbnails being 300 x 300, but lately I have seen some 300 x ??? (soft crop) thumbnails sneaking in.
It appears that the srcset contains both the desired thumbnail size (hard crop) and an unrelated medium size (soft crop).
I can't show you a live example, sorry, as the site can't go live until it is working, and I don't currently have a staging server for PHP 7.
I tried using the Regenerate Thumbnails function, which didn't help, but I'm successfully working around this issue by disabling the Media Tools filter:
add_filter( 'ilab_s3_can_calculate_srcset', '__return_false' );
Edit: but note that doing this results in thumbnails being output with bad rotations.
I wondered if this issue might be due to the process at my end:
Annoying bug where the cloud popup shows and then sticks when mousing through the grid quickly.
I was using W3TC for its S3 upload feature (yes...) and all my files prior to this moment are already on my S3. Is there any way I can force it to use the CDN URL?
Is it possible to make Media Tools work with some kind of on-the-fly image size generation plugin/library (like aq_resize, Fly Image Resizer, or Image Processing Queue)?
Btw, awesome plugin, thanks!
In the documentation there is a reference to this filter: ilab_imgix_filter_parameters. While attempting to use it, I realized it wasn't working... at all. I took a look at the code and noticed there is a typo in how the filter is referenced.
In the documentation the filter is referenced with underscores (_), and in the code it is referenced with hyphens (-).
Code:apply_filters('ilab-imgix-filter-parameters', $params, $size, $id, $meta);
Documentation: apply_filters('ilab_imgix_filter_parameters', $params, $size, $id, $meta);
Can be found here:
Hi there, thanks for this plugin of yours. It's incredibly useful and well written. I wrote this extension https://github.com/setcooki/wp-minio-sync for Minio S3, to keep WordPress installs that share the same media cloud in sync. Maybe someone finds this helpful as well.
I wanted to move from S3 Offload Media to this plugin. Just wanted to know whether this plugin is compatible with WPML, as S3 Offload Media is.
Since yesterday I can no longer authenticate on my DigitalOcean Spaces bucket.
I tried generating a new API key but I still get the same message:
Your AWS S3 settings are incorrect or the bucket does not exist. Please verify your settings and update them.
I guess DigitalOcean updated their API and something broke. It worked perfectly fine for the past 3 months.
Because of this I can no longer upload any new images.
DigitalOcean has not acknowledged any issue at the moment.
I'm still waiting for an official answer, since I opened a ticket with them.
We already do this with Amazon Rekognition, but supporting other services could be useful. Additionally, it would be good to be able to decouple the backend storage from the service. Right now to use Rekognition, you also need to be using S3.
Nanonets:
https://nanonets.com/content-moderation-api/
Clarifai:
https://clarifai.com/models/nsfw-image-recognition-model-e9576d86d2004ed1a38ba0cf39ecb4b1
I have been testing the new update and it works as expected: now I can set a prefix according to my path in my S3 bucket. But there is a problem. The wp-offload-s3 plugin has an option to upload your images with version control, meaning that for every file you upload, a new folder is created with a timestamp representing the version. This is done because it is normal to have your S3 bucket configured behind AWS CloudFront; the versioning option helps with CloudFront invalidations and ensures that the file you are looking for is the latest version.
On the other hand, the latest development on the prefix has broken the delete action. When I set a new prefix, upload a new file, and afterwards delete it from WordPress (with the "delete in S3 too" option activated), the file is not found and consequently is not erased.
I have developed my own solution for those issues, and I would like to get feedback on my code and, if you like it, have it incorporated into the master branch for future updates. Sorry, I do not know the correct flow for proposing a new commit on GitHub - I am new to Git. Because of that I'd prefer to send you the file I have modified. It is only one file, and only a few lines of code. You can check where the changes are, but to sum up, it is only on line 390 and lines 292 and 293.
All the changes are located in the file ilab-media-tools/classes/tools/s3/ilab-media-s3-tool.php and only take a few lines of code.
For the delete-action bug, I have changed line 390 from $this->delete_file($s3,$data['file']); to $this->delete_file($s3,$data['s3']['key']);, and now, whatever prefix I have set, the delete action is able to find the file and erase it from the bucket too.
For the versioning feature, I have added the get_object_version_string() function from the wp-offload-s3 plugin, adapting the format to my requirements, and I have also added two new options to the prefix on lines 292 and 293:
$prefix = str_replace("@{versioning}", $this->get_object_version_string(), $prefix);
$prefix = str_replace("@{site-id}", sanitize_title(strtolower(get_current_blog_id())), $prefix);
These two options, @{versioning} and @{site-id}, allow me to implement the same path that I had in the bucket, using the "multisite" folder and the versioning feature. The prefix set in the AWS Bucket Prefix setting will be "wp-content/uploads/sites/@{site-id}/@{date:Y/m}/@{versioning}".
[ilab-media-s3-tool.php.zip](https://github.com/Interfacelab/ilab-media-tools/files/776438/ilab-media-s3-tool.php.zip)
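The token substitution proposed above can be sketched as a small, self-contained function. Note this is an illustration only: the function name and the timestamp-folder format are assumptions (the real get_object_version_string() in wp-offload-s3 may format differently), and the WordPress-specific calls from the patch are left out.

```php
<?php
// Sketch of the proposed prefix token expansion: @{versioning} becomes a
// timestamp-based folder (UTC), @{site-id} becomes the multisite blog id.
function ilab_expand_prefix_tokens(string $prefix, int $siteId, int $timestamp): string
{
    // Assumed version-string format: one folder per upload moment.
    $version = gmdate('Y/m/d/His', $timestamp);
    $prefix = str_replace('@{versioning}', $version, $prefix);
    $prefix = str_replace('@{site-id}', (string) $siteId, $prefix);
    return $prefix;
}
```

For example, ilab_expand_prefix_tokens('sites/@{site-id}/@{versioning}', 3, 0) yields 'sites/3/1970/01/01/000000'.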