Upload images to S3 via WordPress Plugin

A few years ago I blogged about creating an Adobe AIR app that would allow me to drag and drop images on it and automatically upload them to my blog server. The backend code would store the image and create a thumbnail. The WordPress plugin would display the images on the sidebar.

Since visiting AWS re:Invent, I started looking for a project to play around with some AWS services and decided it was time to create a new version of the “uploader”. The stack I used:

  • WordPress plugin
    • PHP
    • JS
  • S3 to store the images and thumbnails
  • SQS to get notified when the thumbnails are created
  • Lambda to generate thumbnails after an image is uploaded to S3


A WordPress plugin is configured to run in the sidebar. For a guest user, only the functionality to list images is loaded. This makes an API call to load all objects from the S3 bucket. In JS, we parse through the objects and display the newest x thumbnail images on the screen. Clicking an image loads the original-sized version.
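The listing step can be sketched as a small helper: take the objects returned by the SDK's listObjects call, sort by LastModified, and keep the newest few keys. This is an illustrative sketch, not the plugin's actual code; the function name and sample data are mine.

```javascript
// Given S3 listing entries (Key, LastModified as in the AWS JS SDK response),
// return the keys of the newest `count` objects.
function newestThumbnails(objects, count) {
  return objects
    .slice() // don't mutate the caller's array
    .sort((a, b) => new Date(b.LastModified) - new Date(a.LastModified))
    .slice(0, count)
    .map((obj) => obj.Key);
}

// made-up sample data for illustration
const listing = [
  { Key: 'a.jpg', LastModified: '2019-12-01T10:00:00Z' },
  { Key: 'b.jpg', LastModified: '2019-12-03T10:00:00Z' },
  { Key: 'c.jpg', LastModified: '2019-12-02T10:00:00Z' },
];

console.log(newestThumbnails(listing, 2)); // → ['b.jpg', 'c.jpg']
```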

The experience for the admin user is a bit different. That user sees a drag and drop area above the thumbnails. Upon dropping images on that spot, two progress bars are displayed: one shows the progress of images being uploaded to S3, the other the progress of thumbnails being generated.

Guest View:

Admin View:


The Process

High level, this is what is happening:

  • Images are dropped on the WordPress plugin’s drag and drop area.
  • A call is made to upload those images to an S3 bucket using the AWS JS SDK.
  • The S3 bucket receives the image and when it’s completed, a Lambda function is triggered.
  • The Lambda function creates a thumbnail image and puts it in the thumbnail bucket.
  • When the thumbnail is fully received, that S3 bucket sends SQS a message with information about that image.
  • JS keeps checking back with SQS to check on completed images and when all images are done, resets the display and shows the new thumbnails.
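The last step, polling SQS until every thumbnail is accounted for, boils down to a set comparison between the keys we uploaded and the keys reported back. A minimal sketch, with names of my own choosing:

```javascript
// Compare the keys we uploaded against the keys reported back via SQS
// messages. When every uploaded key has a thumbnail, the UI can reset the
// display and show the new images.
function allThumbnailsDone(uploadedKeys, completedKeys) {
  const done = new Set(completedKeys);
  return uploadedKeys.every((key) => done.has(key));
}

console.log(allThumbnailsDone(['a.jpg', 'b.jpg'], ['a.jpg'])); // → false
console.log(allThumbnailsDone(['a.jpg', 'b.jpg'], ['b.jpg', 'a.jpg'])); // → true
```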



The first part of the process is to create two S3 buckets: one for the original-sized images and the other for the thumbnails. The trickiest part here is getting the permissions correct in AWS; we'll need to attach the right policies. A lot of how I learned to configure this and got it to work comes from a hands-on Qwiklabs training class, Introduction to AWS Lambda.
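For reference, a thumbnail bucket that serves images directly to visitors typically needs a public-read bucket policy along these lines. The bucket name below is a placeholder and your setup may differ:

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "PublicReadThumbnails",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::my-thumbnail-bucket/*"
        }
    ]
}
```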


The purpose of the Lambda is to resize an uploaded image and store it in the thumbnail bucket.
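The heavy lifting in the Lambda is done by an image library, but the sizing math itself is simple: scale the image to fit inside a maximum box while keeping the aspect ratio. A sketch of just that math (the function name and box size are my own, not from the Lambda):

```javascript
// Scale (width, height) to fit inside a maxSize x maxSize box, preserving
// aspect ratio and never upscaling.
function thumbnailSize(width, height, maxSize) {
  const scale = Math.min(maxSize / width, maxSize / height, 1);
  return { width: Math.round(width * scale), height: Math.round(height * scale) };
}

console.log(thumbnailSize(1600, 1200, 200)); // → { width: 200, height: 150 }
```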


We use SQS to track what thumbnails have been generated. When the Lambda is done, it sends a message to the SQS queue to let it know it’s finished.
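Each SQS message carries an S3 event notification, a JSON body with a Records array describing the objects that triggered the event. A small helper (mine, for illustration, trimmed to the fields we need) to pull the completed object keys out of a message body:

```javascript
// S3 event notifications have a Records array; each record holds the bucket
// and key of the object that triggered the event.
function keysFromSqsBody(body) {
  const event = JSON.parse(body);
  return (event.Records || []).map((record) => record.s3.object.key);
}

const body = JSON.stringify({
  Records: [{ s3: { object: { key: 'thumbs/photo-1.jpg' } } }],
});
console.log(keysFromSqsBody(body)); // → ['thumbs/photo-1.jpg']
```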

WordPress Plugin

Nothing fancy here, just a WordPress plugin that shows recent images based on an API call to the S3 thumbnail bucket. It has a widget setup page to store the AWS connection strings.

Here is the repo: https://github.com/joeyrivera/wp-s3-images

Steps To Make This Happen

  • Setup S3
    • create two buckets
    • setup permissions
  • Create Lambda
    • configure the resize script
    • set it as a triggered event on the S3 bucket
    • setup permissions
  • Setup SQS
    • add event to receive from S3 resize bucket
    • setup permissions
  • Create WordPress plugin
    • create widget form
    • create settings page
    • drag and drop


If you want more details, leave a comment and I’ll do my best to answer your question or update this post with more.


WordPress Plugins

  • https://developer.wordpress.org/themes/basics/including-css-javascript/
  • https://wordpress.org/support/article/debugging-in-wordpress/

Drag and Drop JS

  • https://developer.mozilla.org/en-US/docs/Web/API/HTML_Drag_and_Drop_API/File_drag_and_drop
  • https://www.smashingmagazine.com/2018/01/drag-drop-file-uploader-vanilla-js/


AWS

  • https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3.html
  • https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/SQS.html
  • https://docs.aws.amazon.com/AmazonS3/latest/user-guide/enable-event-notifications.html
  • https://www.tothenew.com/blog/configuring-sns-notifications-for-s3-put-object-event-operation/
  • https://stackoverflow.com/questions/19176926/how-to-make-all-objects-in-aws-s3-bucket-public-by-default
  • https://docs.aws.amazon.com/sdk-for-javascript/v2/developer-guide/s3-example-photo-album.html
  • https://run.qwiklabs.com/focuses/8613?parent=catalog
  • https://docs.aws.amazon.com/AmazonS3/latest/user-guide/enable-event-notifications.html#enable-event-notifications-how-to
  • https://aws.amazon.com/blogs/aws/new-for-aws-lambda-environment-variables-and-serverless-application-model/

Phing and PHPUnit

I started reading up on Jenkins as I want to learn more about it and noticed it has an extension to work with Phing scripts. It's been a while since I used Phing, so I decided to spend some time learning about its current state. The goal was to create a script that runs various PHP tasks, starting with unit tests using PHPUnit. After spending some time on it, I got it working and decided to document my findings in this post.

All the following code can be found in this github repo:

Getting Our Dependencies

First we want to create a new folder and install composer. Now that you have composer, create a composer.json file with the contents of the file below:

    {
        "autoload": {
            "classmap": [
                "src/"
            ]
        },
        "require-dev": {
            "phing/phing": "2.*",
            "phpunit/phpunit": "^8"
        }
    }

and run:

php composer.phar install

This will install all you need to work with Phing and PHPUnit for this exercise. Now we want to add some code and tests to verify PHPUnit works. You can write some sample code or copy/paste the classes I created in this github project under src/ and tests/.


Before we run phpunit, we do want to add a phpunit.xml file and some configuration information so phpunit knows where the tests are and where the autoloader is as well:

    <?xml version="1.0" encoding="UTF-8"?>
    <phpunit bootstrap="vendor/autoload.php">
        <testsuites>
            <testsuite name="Tests">
                <directory>tests</directory>
            </testsuite>
        </testsuites>
    </phpunit>

Now we can run vendor/bin/phpunit and see all tests pass!

Getting Phing To Work

This part was a bit tricky. I saw the PHPUnitTask in the documentation, but I was unable to get it to run any tests. After some research (I didn't verify this), it seems the PHPUnit task is not compatible with later versions of PHPUnit that use namespaces.

Instead of using the PHPUnit Task, I decided to try the ExecTask. That worked well! I just needed to find the right configuration to get Phing to detect the output and if it was a pass or failure. To do this, I used the following settings in the build.xml file:

<?xml version="1.0" encoding="UTF-8"?>
<project name="PhingTests" default="tests">
    <target name="tests">
        <echo msg="Running unit tests" />
        <exec executable="./vendor/bin/phpunit" passthru="true" checkreturn="true" />
        <!-- doesn't seem to work with namespaced versions of phpunit -->
        <!-- <phpunit codecoverage="false" pharlocation="./vendor/bin/phpunit">
            <formatter type="plain" usefile="false"/>
            <batchtest>
                <fileset dir="./tests/">
                    <include name="**/*Test*.php"/>
                </fileset>
            </batchtest>
        </phpunit> -->
    </target>
</project>

Setting passthru and checkreturn lets us capture the output of the exec command and check its return code to see whether the run succeeded or failed. Now, running vendor/bin/phing (which by default calls the tests target) returns the following:

And there you have it, Phing with PHPUnit working! Just for fun, I updated the code to simulate a test failure to verify the process is working, and here is what you get in Phing when PHPUnit fails:

Moved my blog to AWS Lightsail

Hey all! It took 6 years but finally a new post.

I went to AWS re:Invent earlier this month and it inspired me to play around with AWS and the first step was to move my blog. I moved my DNS to Route 53 and spun up a WordPress instance using Lightsail. In theory it’s only going to cost $3/month which is much better than what I was paying before. We’ll see after the first free month if that’s the case.

I hope to start blogging more again. I’m playing around with creating an image uploader using S3 buckets and Lambda functions. Once that’s working, I’ll post details. And re:Invent was great and I hope to go again. This was my first time going and I learned lots of tricks to make scheduling my next visit more efficient.

Doctrine2 and Postgresql timestamp with millisecond issue

I’ve been doing a lot of work with Doctrine 2 and finding some issues when dealing with PostgreSQL and Doctrine which I’ll be blogging about. I’ve ignored this particular issue for some time and finally decided to address it.

The issue is declaring a Doctrine entity property as datetime if the column in the db is storing the timestamp with milliseconds such as ‘2012-01-01 10:12:35.542’. When you try to load that into your entity, Doctrine will give you an error. An example of how this can happen is declaring a column timestamp without time zone and using now() as the default value.

Here is an example of a Doctrine entity property declared as a datetime

/**
 * @Column(type="datetime")
 */
protected $discontinued;

And what the column sql declaration looks like

discontinued timestamp without time zone NOT NULL

There are three solutions that work: configure Doctrine to use a custom DateTime class instead of the default, declare your datetime properties as strings in your entities and convert them to DateTime yourself when needed, or update your db to not store milliseconds. I considered all three and decided it would be better and more efficient to update the db than to write custom DateTime classes or treat dates as strings. If you would like to tackle this issue by creating your own custom DateTime classes, follow this link for more information.

I used the following query to identity all the timestamp columns in my db table:

select table_name, column_name 
from information_schema.columns
where table_catalog = 'gatweb2' and table_schema = 'public' and udt_name like 'timestamp%' 
order by table_name asc

Notice I used like 'timestamp%' instead of = 'timestamp'. This is because you could have timestamptz (timestamp with time zone) columns as well. Once I had the list, I saved the results, did some search/replace magic to come up with alter statements for each table, and ran the queries. I spent a little time looking at cursors to see if I could write one to loop through the list and alter each table, but it was taking too long to research and search/replace was quick. This is what the alter query looks like for each table:

alter table contact alter column "date" set data type timestamp(0) without time zone

By altering my timestamp columns to timestamp(0), it lets Postgres know to remove milliseconds from all values as well as to not store milliseconds in the future.

An issue I ran into when running all my alter statements was that some views were referencing some of the columns I was about to alter and Postgres didn’t like that. I did some research and it seems my only choice was to drop those particular views, alter the tables, then recreate the views. I created a big script that did all that, ran it, and now all my issues are gone and my unit test passed.

Slides from Automation with Phing presentation for Codeworks 2012 Atlanta

I had the pleasant opportunity to present at Codeworks this year. I’ve presented at the Atlanta PHP User Group a few times but this was my first time presenting at a conference. Overall I think it went well and I learned a lot from it. I picked Phing as a topic as we have been doing a lot with Phing in the last year at work. The presentation covers some of the improvements we’ve made to one of our applications by automating a few processes that we used to spend hours on. Here are the slides:

Automate SVN Export to Site w/ Bash Script

So at work we are finalizing the setup of a new server environment. The site is in PHP and the code is all in SVN. We were trying to decide what process to use to export the SVN contents to the site, and that's where I decided to learn how to write a bash script. This is my first one, and with some help from Jess we created the following script. The script does the following:

  • Does an info on the remote repo to get the revision number
  • Checks against local revision number which is stored in a file
  • If the revision numbers don’t match, it does a diff on both revisions and creates a list of the files that were changed
  • It then loops through each file and exports it to the site
  • Finally it stores the new revision number in the file
Feel free to use this and tweak it for your needs. This is our first draft, at this point we’ll start cleaning it up and adding more functionality but it works. Make sure to add a cron job to run it every so often and enjoy.
#!/bin/bash
# need to figure out what to do on files that need to be deleted

# the values below were trimmed from the original post; adjust for your setup
REPO="svn://svn.example.com/project/trunk"
TARGET_DIR="/var/www/site/"
VERSION_FILE="/var/www/revision.txt"

CURRENT_VERSION=$(cat $VERSION_FILE)

echo "Getting info from remote repo"
REMOTE_VERSION=$(svn info $REPO | grep Revision)
REMOTE_VERSION=${REMOTE_VERSION: -4} # need to update to not hardcode 4 spaces back

echo "Current Revision: $CURRENT_VERSION"
echo "Remote Revision: $REMOTE_VERSION"

if [ "$CURRENT_VERSION" == "$REMOTE_VERSION" ]; then
        echo "No export needed"
        exit 0
fi

echo "Getting diffs between revisions"
difflines=`svn diff --summarize -r $CURRENT_VERSION:$REMOTE_VERSION $REPO 2>&1 | awk '{print $2}'`

for i in `echo $difflines`; do
   FILENAME=${i#$REPO} # path relative to the repo root
   echo "svn export ${i} ${TARGET_DIR}${FILENAME}"
   svn export ${i} ${TARGET_DIR}${FILENAME}
done

echo "Saving revision number"
echo $REMOTE_VERSION > $VERSION_FILE
Zend_Json_Server and how to call it via JSON-RPC 2.0

So I started playing with Zend_Json_Server and was having a hard time trying to figure out how to call the server from a client. Finally I checked the JSON-RPC 2.0 spec and read a very important detail that I had not realized:

The Request is expressed as a single JSON Object

This was the key to my problem. There are 4 parameters that can be sent with each JSON-RPC 2.0 request but I was sending each as a post var which is not what Zend_Json_Server expects. It simply wants one json encoded object with all the parameters inside. The 4 available parameters are:

  1. jsonrpc – the version you are using; I’m using 2.0.
  2. method – the name of the method you want to call in the server.
  3. params – object of parameters your method needs. If you don’t need any, don’t send this param.
  4. id – an identifier (anything you want) that will be sent to and from the server for this request.

I’m using Zend_Http_Client() to make the request and here is an example:

$params = array(
	'jsonrpc' => '2.0',
	'method' => 'find',
	'params' => array('326691'),
	'id' => 'test'
);

// the endpoint URL below is a placeholder for wherever your server lives
$http = new Zend_Http_Client();
$http->setUri('http://example.com/json-rpc.php');
$http->setRawData(json_encode($params));

echo $http->request('POST')->getBody();


Facebook Graph API App Easy w/ PHP SDK

NOTE: This post was written using Facebook’s PHP SDK version 2.1.2. Since this post was made, the PHP SDK has changed and some of the processes explained below may have changed as well. At some point I’ll have to revisit and update this post, but for now keep the above in mind.

As promised, here is a post (similar to my Twitter API post) on using the Facebook API. There are many reasons why one would want to access the Facebook API – maybe to create a mobile app that lets you post photos to your Facebook albums, or maybe you just want to show your last few Facebook status updates on your blog; whatever the reason may be, Facebook’s Graph API mixed with their PHP SDK makes it really easy to accomplish this.


  • Setup our environment
  • Register an app on Facebook
  • Understand the authentication process and extended parameters
  • Understand Graph API
  • Retrieve our latest status updates
  • Add a new status update
  • Retrieve our photos from our albums
  • Add a new photo

Twitter API, OAuth Authentication, and Zend_Oauth Tutorial

* 06/2014 UPDATE
Thanks to Ivan for pointing out that the siteUrl is now ‘https://api.twitter.com/oauth’. Make sure to use this new value anywhere where the siteUrl is mentioned below.

I recently had to work on a project that required me to interact with the Twitter API. I had done this before so I wasn’t expecting anything different until I remembered that Twitter had changed their API to use OAuth for authentication. If you are not familiar with OAuth, it’s a secure way of authenticating without requiring a user to submit their username and password to third parties – you can read more about it at OAuth. There are lots of resources online that talk about this in detail, but I wasn’t able to find one that explained the entire process in a way that made sense. Hopefully this post will give you everything you need to get started with the Twitter API. I’m going to go through the steps required to make this work without using the entire Zend Framework.


This tutorial will go step-by-step in explaining how to create a small PHP application that can interact with the Twitter API. Our goal is to:

  • Authenticate
  • Display our latest tweets
  • Post new tweets from PHP
  • Display the last few times our account was mentioned

The only assumptions at this point (other than knowing PHP) is that you have a twitter account and the zend framework library downloaded. We won’t be using the entire framework, just some of the files as standalone modules.

Registering An App

The first step in being able to communicate with the Twitter API is to register an app in their system so you can receive all the necessary keys to authenticate with OAuth. Go to dev.twitter.com and log in with your Twitter account. Now click on ‘register an app’ (if that link is not visible then click on ‘your apps’ on the top right and then click on ‘register an app’ in the next page). These are the values I put in the form on the next page for my app. Feel free to follow along. I’ll explain the important inputs.

Application: Joey’s Blog Example
Description: Twitter PHP App
Application Website: http://www.joeyrivera.com
Organization: None
Application Type: Browser
Callback URL: http://www.joeyrivera.com/twitter/callback.php
Default Access Type: Read & Write

My car is for sale! 2005 Subaru WRX STi, 51,300 miles for $21,500 – SOLD!

Here is the link to autotrader with all the info: My car for sale. It’s a great car and I had a blast with it but I have bought a replacement, a Z06. The suby has a lot of mods, you can read about them all in my blog under ‘my-car‘ – pushed 450whp with methanol. Contact me here or at autotrader for more questions.