Conservatives think the Public are Idiots

As we run into the last weeks of election campaigning one thing is very obvious: the Conservatives think that the UK public are idiots. They are running a sustained campaign to fill the media with scare stories and offensive attacks to prevent any serious debate. They do not believe we, the public, will see past it.

The Conservative campaign against Labour has relied on the same strategy for a long time: move the focus to nasty personal attacks and scare stories to crowd out any discussion and analysis of policies in the media. This is because they understand that proper discussion and analysis would result in Labour winning.

Over the past two weeks we have seen a huge escalation in aggressive attacks, including scare stories on the economy (here), on a Labour-SNP coalition (here), and personal attacks on Ed Miliband (here) and Nicola Sturgeon (here). A casual glance at any of the right-wing press will show you plenty of similar stories. Even this, it appears, is not enough, with Rupert Murdoch berating his Sun journalists for not attacking Ed Miliband hard enough to stop him winning the election (here).

Personal attacks on Ed Miliband’s love life, and Nicola Sturgeon’s masculinity, have nothing to do with this election. Neither do the sensationalist claims that a Labour-SNP coalition would tear our country apart. This strategy only works if it successfully fills the media and distracts the public.

Conservative attacks have focused on personal insults, overspending and the threat of dividing the UK. These are all areas in which, on investigation, they fail miserably, which is why their strategy of distraction is so important to them.

The current Conservative coalition has added more to our national debt in 5 years than every Labour government since 1900 (IMF). This is during an economic recovery and after enacting harsh austerity cuts to the NHS, local government and social care.

[Graph: UK national debt, 2015]

Another line of attack on the economy is the claim that a Labour-SNP coalition would borrow over £160 billion more than Conservative plans. Not only is there no evidence for this (the figure was calculated entirely by the Conservatives themselves), but they managed to borrow over £200 billion more than planned during this parliament.

Conservative claims that a Labour-SNP coalition would break up our nation are a shocking case of scaremongering. Scottish support for the SNP exists because of how badly successive governments have performed in Scotland; we should be addressing this issue, not pushing Scotland further away.

The Conservatives know Scotland is not going to vote for them, so they have given up on it entirely and are attempting to use the SNP as a political weapon. This behaviour only pushes Scotland further away from the rest of the UK. The Conservatives don’t care, though, because if Scotland leaves it won’t be their problem, and it would reduce Labour’s support.

This forms part of a common pattern whereby the Conservative strategy is to divide the people of our nation. They have run a hate campaign against benefit claimants, despite official statistics painting a completely different picture (here). Their lacklustre campaign before the Scottish referendum almost backfired, resulting in David Cameron pleading with voters not to vote Yes just to give the ‘effing Tories’ a kick (here). They have also frequently been found withholding pro-EU reports to further their anti-EU scaremongering (here) (here).

It is time to prove that we are not the idiots the Conservatives think we are, and to give proper discussion and analysis their place in politics again. We need to show that baseless media attacks and distraction tactics are not acceptable in UK politics. We all want a better, stronger and united country, one that will give our families and children the best opportunities possible. The first stage of that is ensuring that political campaigning is based on discussion and analysis, not petty attacks and scaremongering.

WP Auto Updates, Custom Plugins and You

I have wanted to talk about the WordPress auto updates feature for a while, but I don’t want to rehash the general good-or-bad argument. I have every faith in the WordPress Foundation’s ability to use the feature responsibly; however, I don’t believe anyone other than the WordPress Foundation should have access to it.

It was only recently that the excellent WordPress SEO plugin was forcibly auto updated to fix a security vulnerability. Again, this was a responsible use of the feature, managed by the WordPress Plugins Repository team, people I trust far more than I trust myself.

What I really want to talk about is how custom and/or premium plugins can reproduce exactly the same behaviour. To be clear: by default, premium plugin shops have the ability to force auto updates of their plugins. While I support the auto updates feature when it is controlled by the WordPress Foundation, I strongly oppose its control by any third party.

While I may trust a company to write a functional, reliable and secure plugin, I trust very few people with the power to inject code into my websites at will, or to secure their systems well enough that no-one else could use them to do the same.

Luckily WordPress has provided the solution: developers can easily block plugin auto updates altogether, or whitelist specific plugins they trust to use the feature (like the snippet below).
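
Here is a minimal sketch of the whitelist approach, using the auto_update_plugin filter that WordPress applies before auto updating any plugin. The pb_whitelist_auto_updates function name and the slug in the whitelist are illustrative assumptions, not recommendations:

[code lang="php"]
<?php

// Allow auto updates only for plugins we explicitly trust.
function pb_whitelist_auto_updates( $update, $item ) {

    // Plugin slugs trusted to use the auto update feature.
    // 'wordpress-seo' is an illustrative example.
    $whitelist = array( 'wordpress-seo' );

    if ( isset( $item->slug ) && in_array( $item->slug, $whitelist ) ) {
        return true;
    }

    // Block auto updates for everything else.
    return false;
}
add_filter( 'auto_update_plugin', 'pb_whitelist_auto_updates', 10, 2 );

?>
[/code]

To block plugin auto updates altogether, returning false unconditionally achieves the same effect, for example via add_filter( 'auto_update_plugin', '__return_false' );.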

Despite being able to block it manually, having the feature open for use by anyone still makes me nervous.

My main problem with it is similar to the arguments against security services forcing companies to add back doors to their products and services. The general consequence is that doing this actually reduces security, by creating a method through which another party could gain access too.

We have already seen companies losing customer credit card details; is it so far-fetched to imagine plugin update services being targeted? After all, some of them would give access to hundreds of thousands of websites. That is a gold mine for hackers.

Even if nothing changes, I think it would be healthy for users, and especially plugin shops, to be more aware of the responsibility they now have. You have a back door key to your users’ websites; don’t hide it under the flower pot.

I am interested in how everyone else feels about this particular aspect of the auto updates feature. Does this unsettle you too, or do you think I am overreacting?

API Handlers for WordPress

This has been a busy, fun and productive weekend. I have created three API handlers to make interacting with the Github, Bitbucket and New Relic APIs easier from inside WordPress.

I created them to help me fetch information from each of these services and integrate it into a website, displaying more interesting and dynamic information to viewers. I hope they will help others do the same.

You can find them in my Github repositories; each is a single file you could drop into the /mu-plugins/ folder or include in a theme or plugin.

Each is fully working, and the README.md files contain instructions, list the available endpoints and give example code. None yet covers all available endpoints, just the most commonly useful ones. I will expand the available endpoints, documentation and examples over time.
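
As a minimal sketch, including one of the handler files from a theme looks something like this; the file name and path are illustrative assumptions, so check each repository for the actual file to include:

[code lang="php"]
<?php

// Load a handler file from within a theme.
// 'github-api-handler.php' is an illustrative file name.
require_once get_template_directory() . '/inc/github-api-handler.php';

?>
[/code]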

On the whole I have stuck to the WordPress way of doing things and maintained compatibility with the WordPress requirements (e.g. PHP 5.2). If you have any problems or suggestions, I would love to hear them here or via Github issues.

Silent Caching in WordPress

This article is out of date and incorrect, most modern web servers
will not close the connection until after the shutdown hook runs.
I will be publishing a follow up article soon with a better method.

While reviewing the code and performance of my latest plugin, the Kebo Twitter Feed, I realised that the Widget was causing significant delays in page load time. While the majority of page loads only involved a 0.002 second increase from the Widget due to caching, when the cache had expired it was causing a delay of 0.6 seconds.

The delay when refreshing the cache is caused by the need to request the user’s latest Tweets using the v1.1 Twitter API. Realising that there was no way I could make this significantly faster, I needed to find a way to improve the process of refreshing the cache.

I was using a Transient, the inbuilt caching system for WordPress, in the default way, which allows you to save data like this:

[code lang="php"]
<?php

// Takes 'name', 'data' and 'expiry time'.
set_transient( 'transient_name', $data, 5 * MINUTE_IN_SECONDS );

?>
[/code]

Once stored, you can then request it at any point using this:

[code lang="php"]
<?php

if ( false === ( $data = get_transient( 'transient_name' ) ) ) {

    // Cache has expired, need to refresh the data.

}

// We have the data, continue.

?>
[/code]

The problem with this is that if the cache has expired, the function returns false and you must refresh the cache immediately. Being forced to refresh the cache immediately is exactly my problem: in the middle of rendering the Widget, I must make a request to the Twitter API for the user’s latest Tweets.

So what could I do? What I needed was two levels to my caching: a soft expire, which would still return the data, and a hard expire like the current Transient. I opted to use a long-term expiry (24 hours) for the Transient and then add my own soft expire time to the data stored in it. This allows me to fetch the Transient and check my own soft expire time; if that has passed, I can use the current data and then refresh the cache after the page has loaded. This involved changes to the way I set the Transient:

[code lang="php"]
<?php

// Add a soft expire time to the data being stored.
$data['expiry'] = time() + ( 5 * MINUTE_IN_SECONDS );

// Set the transient with a long (24 hour) hard expiry.
set_transient( 'transient_name', $data, 24 * HOUR_IN_SECONDS );

?>
[/code]

As you can see above, we add a custom ‘soft’ expire time to the data we are going to store, which we use later when we fetch the Transient to test whether it has soft expired. Time to see how that is done:

[code lang="php"]
<?php

// Check for hard expire.
if ( false === ( $data = get_transient( 'transient_name' ) ) ) {

    // Cache has hard expired, need to refresh the data.

}
// Check for soft expire.
elseif ( $data['expiry'] < time() ) {

    // Cache has soft expired, need to refresh the data.

}

// We have the data, continue.

?>
[/code]

We have dealt with storing the data as needed, but we are still left with a big problem: how do we refresh the cache after the page has loaded? The search for a solution led me to the shutdown hook, which runs just before PHP ends execution. This lets us run code after the page has been rendered; all that is left is to put it together while making sure we only try to update the cache in the background once. The final code to fetch the Transient looks like this:

[code lang="php"]
<?php

// Check for hard expire.
if ( false === ( $data = get_transient( 'transient_name' ) ) ) {

    // Cache has hard expired, need to refresh the data.

}
// Check for soft expire.
elseif ( $data['expiry'] < time() ) {

    // Refresh the cache silently after the page has loaded.
    add_action( 'shutdown', 'pb_silent_cache_refresh' );

    // Add 10 seconds to the soft expire time, to stop
    // other threads trying to update it at the same time.
    $data['expiry'] = ( $data['expiry'] + 10 );

    // Store the updated soft expire time.
    set_transient( 'transient_name', $data, 24 * HOUR_IN_SECONDS );

}

// We have the data, continue.

?>
[/code]

If we detect that our data has soft expired, we tell WordPress to run our function, which will refresh the cached data after the page has rendered. We then need to make sure that, if many simultaneous page requests come through, we don’t try to refresh our cache many times at once.

To accomplish this we add 10 seconds to our soft expiry time, which prevents most other page requests from detecting that the cache needs refreshing. The 10 second extension also means that if our background update fails for any reason, like a Twitter API call timing out, we can attempt the refresh again soon after. The last task is to create the function which will update the cache after the page has rendered:

[code lang="php"]
<?php

function pb_silent_cache_refresh() {

    // Refresh the data to be cached, e.g. request the latest
    // Tweets. pb_fetch_latest_tweets() is a placeholder for
    // your own refresh logic.
    $data = pb_fetch_latest_tweets();

    // Add a new soft expire time to the data being stored.
    $data['expiry'] = time() + ( 5 * MINUTE_IN_SECONDS );

    // Set the transient with a long (24 hour) hard expiry.
    set_transient( 'transient_name', $data, 24 * HOUR_IN_SECONDS );

}

?>
[/code]

This gives you the ability to refresh cached items which take a significant amount of time to process, without impacting page load speed, and this technique is being used live in my Kebo Twitter Feed plugin. If you have any ideas or suggestions for making this even better, I would love to hear from you. Happy caching!

New Site

I am really pleased to finally put the new design of my site live. I have scrapped the content too, as I will be using the website in a different way. I believe this site is a lot more fitting for the work I currently do and a much better reflection of myself.

Mostly I will be using the site to house the electrical impulses which race around my brain, otherwise known as thoughts, occasionally combined with information about the fun side projects I am working on.

Keep an eye out and you will see much more content soon!