WP Auto Updates, Custom Plugins and You

I have wanted to talk about the WordPress auto updates feature for a while, but I don’t want to rehash the general argument over whether it is good or bad. I have every faith in the WordPress Foundation’s ability to use the feature responsibly; however, I don’t believe anyone other than the WordPress Foundation should have access to it.

It was only recently that the excellent WordPress SEO plugin was force-updated to fix a security vulnerability. Again, this was a responsible use of the feature, managed by the WordPress Plugins Repository team, people I trust far more than I trust myself.

What I really want to talk about is how custom and/or premium plugins can reproduce exactly the same behaviour. To be clear: by default, premium plugin shops have the ability to force auto-update their plugins. While I support the auto updates feature when it is controlled by the WordPress Foundation, I strongly oppose its control by any third party.

While I may trust a company to write a functional, reliable and secure plugin, I trust very few people with the power to inject code into my websites at will, or to secure their systems well enough that no-one else could use them to do the same.

Luckily, WordPress has provided the solution: developers can easily block plugin updates altogether, or whitelist specific plugins they trust to use the feature (like the snippet below).

[wpgist id="a25d87b38ac6780a0bc6" file="autoupdate_plugins.php"]
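For anyone who can’t load the embedded gist, here is a minimal sketch of that whitelisting approach using the `auto_update_plugin` filter WordPress provides. The function name and the plugin slugs in the whitelist are placeholders of my own, not part of the gist:

```php
<?php

// Allow auto updates only for plugins we explicitly trust.
// The slugs below are examples - replace them with your own.
function pb_whitelist_auto_updates( $update, $item ) {

    $whitelist = array( 'wordpress-seo', 'akismet' );

    if ( in_array( $item->slug, $whitelist, true ) ) {
        return $update; // Leave WordPress' own decision intact.
    }

    return false; // Block auto updates for everything else.
}
add_filter( 'auto_update_plugin', 'pb_whitelist_auto_updates', 10, 2 );

// Or, to block plugin auto updates entirely:
// add_filter( 'auto_update_plugin', '__return_false' );
```

Returning `$update` rather than `true` for whitelisted plugins means you only permit updates WordPress was already going to offer, instead of forcing them.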

Even though I can block it manually, having the feature open for use by anyone still makes me nervous.

My main problem with it is similar to the arguments against security services forcing companies to add back doors to their products and services. The general consequence is that you actually reduce security, because you create a method by which a third party could also gain access.

We have already seen companies lose customer credit card details; is it so far-fetched to imagine plugin update services being targeted? After all, some of them would give access to hundreds of thousands of websites, and that is a gold mine for hackers.

Even if nothing changes, I think it would be healthy for users, and especially plugin shops, to be more aware of the responsibility they now have. You have a back door key to your users’ websites; don’t hide it under the flower pot.

I am interested in how everyone else feels about this particular aspect of the auto updates feature: does this unsettle you too, or do you think I am overreacting?

Silent Caching in WordPress

This article is out of date and incorrect: most modern web servers will not close the connection until after the shutdown hook runs. I will be publishing a follow-up article soon with a better method.

While reviewing the code and performance of my latest plugin, the Kebo Twitter Feed, I realised that the Widget was causing significant delays in page load time. Thanks to caching, the majority of page loads only saw a 0.002-second increase from the Widget, but when the cache had expired it caused a delay of 0.6 seconds.

The delay when refreshing the cache is caused by the need to request the user’s latest Tweets from the v1.1 Twitter API. Realising that there was no way I could make this request significantly faster, I needed to improve the process of refreshing the cache instead.

I was using a Transient, WordPress’s built-in caching system, in the default way, which allows you to save data like this:

[code lang="php"]
<?php

// Takes 'name', 'data' and 'expiry time'.
set_transient( 'transient_name', $data, 5 * MINUTE_IN_SECONDS );

?>
[/code]

Once stored, you can then request it at any point using this:

[code lang="php"]
<?php

if ( false === ( $data = get_transient( 'transient_name' ) ) ) {

    // Cache has expired, need to refresh data.

}

// We have the data, continue.

?>
[/code]

The problem with this is that if the cache has expired, the function returns false and you must refresh the cache immediately. That forced, immediate refresh is exactly my problem: in the middle of rendering the Widget, I must make a request to the Twitter API for the user’s latest Tweets.

So what could I do? What I needed was two levels to my caching: a soft expiry which would still return the data, and a hard expiry like the current Transient. I opted for a long-term expiry (24 hours) on the Transient itself, and then added my own soft expiry time to the data inside it. This lets me fetch the Transient and check my own soft expiry time; if that has passed, I can still use the current data and refresh the cache after the page has loaded. This involved changes to the way I set the Transient:

[code lang="php"]
<?php

// Add soft expire time to the data stored.
$data['expiry'] = time() + ( 5 * MINUTE_IN_SECONDS );

// Set transient using name, data and the long hard expire time.
set_transient( 'transient_name', $data, 24 * HOUR_IN_SECONDS );

?>
[/code]

As you can see above, we add a custom ‘soft’ expire time to the data we are going to store, which we will use later on when we call the Transient to test if it has soft expired. Time to see how that is done:

[code lang="php"]
<?php

// Check for hard expire.
if ( false === ( $data = get_transient( 'transient_name' ) ) ) {

    // Cache has hard expired, need to refresh data.

}

// Check for soft expire.
elseif ( $data['expiry'] < time() ) {

    // Cache has soft expired, need to refresh data.

}

// We have the data, continue.

?>
[/code]

We have dealt with storing the data as needed, but we are still left with a big problem: how do we refresh the cache after the page has loaded? The search for a solution led me to the shutdown hook, which runs just before PHP ends execution. This lets us schedule code to run after the page has been rendered; all that is left is to put everything together while making sure we only try to update the cache in the background once. The final code to call the Transient looks like this:

[code lang="php"]
<?php

// Check for hard expire.
if ( false === ( $data = get_transient( 'transient_name' ) ) ) {

    // Cache has hard expired, need to refresh data.

}

// Check for soft expire.
elseif ( $data['expiry'] < time() ) {

    // Set silent cache to refresh after page load.
    add_action( 'shutdown', 'pb_silent_cache_refresh' );

    // Add 10 seconds to the soft expire, to stop
    // other threads trying to update it at the same time.
    $data['expiry'] = ( $data['expiry'] + 10 );

    // Update soft expire time.
    set_transient( 'transient_name', $data, 24 * HOUR_IN_SECONDS );

}

// We have the data, continue.

?>
[/code]

If we detect that our data has soft expired, we tell WordPress to run our function which will refresh the cache data after the page has rendered. We then need to make sure that, if many simultaneous page requests are coming through, we don’t try to refresh our cache many times at once.

To accomplish this we add 10 seconds to our soft expiry time, which will prevent most other page requests from detecting the cache needs to be refreshed. The 10 second extension also means that if our background update fails for any reason, like a Twitter API call timing out, we can attempt to refresh it again soon. The last task is to create our function which will update the cache after the page has rendered:

[code lang="php"]
<?php

function pb_silent_cache_refresh() {

    // Refresh the data to be cached; replace this call with
    // your own data fetching, e.g. a Twitter API request.
    $data = pb_fetch_data();

    // Add soft expire time to the data stored.
    $data['expiry'] = time() + ( 5 * MINUTE_IN_SECONDS );

    // Set transient using name, data and the long hard expire time.
    set_transient( 'transient_name', $data, 24 * HOUR_IN_SECONDS );

}

?>
[/code]

This gives you the ability to refresh cached items that take a significant amount of time to process without impacting page load speed, and the technique is live in my Kebo Twitter Feed plugin. If you have any ideas or suggestions for making it even better, I would love to hear from you. Happy caching!
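For reference, the whole pattern can be wrapped into a reusable pair of helpers. This is a sketch under my own naming, not code from the plugin; the `pb_fetch_data()` callback stands in for whatever slow data source you are caching:

```php
<?php

// Store data with a short soft expiry embedded in it,
// while the transient itself gets a long 24-hour hard expiry.
function pb_cache_set( $name, $data, $soft_expiry ) {

    $data['expiry'] = time() + $soft_expiry;
    set_transient( $name, $data, 24 * HOUR_IN_SECONDS );

}

// Fetch data, refreshing synchronously on a hard expire and
// silently (after page render) on a soft expire.
function pb_cache_get( $name ) {

    $data = get_transient( $name );

    if ( false === $data ) {

        // Hard expired - we must refresh now, on this request.
        $data = pb_fetch_data(); // your slow data source
        pb_cache_set( $name, $data, 5 * MINUTE_IN_SECONDS );

    } elseif ( $data['expiry'] < time() ) {

        // Soft expired - serve the stale data, refresh after render.
        add_action( 'shutdown', 'pb_silent_cache_refresh' );

        // Push the soft expiry back 10 seconds so simultaneous
        // requests don't queue duplicate refreshes.
        $data['expiry'] += 10;
        set_transient( $name, $data, 24 * HOUR_IN_SECONDS );

    }

    return $data;
}
```

Note that on multi-server setups without a shared object cache, each server keeps its own transients, so the stampede protection here is best-effort rather than a hard lock.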