Terraform Dynamic Environment Variables through Workspaces and Local Values

We’ve fallen in love with Terraform and have moved all our infrastructure over to using it on AWS. This has been a huge win for our team and allows us to confidently deploy infrastructure changes across multiple accounts.

Continue reading →

Create AWS CloudWatch Alarm on the Number of Running EC2 Instances

One of the first things an attacker would likely do if they ever got access to your AWS secret keys is spin up as many servers as possible - probably large ones as well! So to help prevent this kind of attack from getting out of hand, we will be creating a CloudWatch alarm that triggers when the number of EC2 instances rises above a certain threshold.

The basic items we need to create are as follows:

  1. Lambda function to get the total number of EC2 instances running in our account
  2. Custom CloudWatch metric with the current instance count
  3. CloudWatch alarm to trigger on the EC2 instance count metric
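The post itself uses a Lambda function, but as a quick sanity check from a terminal, steps 1-3 can be sketched with the AWS CLI. The namespace and metric/alarm names ("Custom", "EC2InstanceCount", "ec2-instance-count-high") and the threshold of 10 are my own example assumptions, not from the post:

```shell
#!/bin/bash
# Sketch: count running EC2 instances and publish the total as a custom metric.

# List the IDs of all running instances as whitespace-separated text (step 1)
ids=$(aws ec2 describe-instances \
  --filters "Name=instance-state-name,Values=running" \
  --query "Reservations[].Instances[].InstanceId" \
  --output text)

# wc -w counts whitespace-separated words, i.e. instance IDs
count=$(echo "$ids" | wc -w)

# Publish the count as a custom CloudWatch metric (step 2)
aws cloudwatch put-metric-data \
  --namespace "Custom" \
  --metric-name "EC2InstanceCount" \
  --value "$count"

# Alarm when more than 10 instances are running (step 3; threshold is an example)
aws cloudwatch put-metric-alarm \
  --alarm-name "ec2-instance-count-high" \
  --namespace "Custom" \
  --metric-name "EC2InstanceCount" \
  --statistic Maximum \
  --period 300 \
  --threshold 10 \
  --comparison-operator GreaterThanThreshold \
  --evaluation-periods 1
```

The Lambda approach from the post has the advantage of running on a schedule without a management host, but the CLI version is handy for verifying the metric math before wiring everything up.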

Continue reading →

Laravel 4.2 updated_at Table Field Always Updating

Here's the fix if you find the "updated_at" field in your database table always being updated even though the values haven't actually changed.

The "issue" lies in the getDirty() function within Eloquent\Model. It's caused by the way it compares the current value with the value being inserted/updated. The comparison is type-sensitive, which can cause problems for certain values. There's a GitHub issue with Taylor Otwell explaining the reasoning behind this.

The solution we decided on was to ensure the value types being inserted and updated match what's already stored by forcing integers or doubles. This also requires using 1 or 0 instead of true or false for boolean values.

$int = 235;
// force the value to an integer so it matches the column type
$int = (int) 235;

$bool = true;
// use 1 instead of true for boolean columns
$bool = 1;
Continue reading →

AWS CLI Ansible Playbook on Ubuntu 14.04

Here's a simple Ansible playbook that installs the AWS CLI and all the required dependencies on Ubuntu 14.04 - should work on other versions of Ubuntu too.

Setup

  • Clone or download the jveldboom/ansible-aws-cli repo to your Ansible machine
  • Add your host information within inventories/hosts
  • Add your AWS credentials within vars/vars.yml
Continue reading →

GoAccess Automated Reports - Last 30+ Days via Cron

First, if you're not familiar with GoAccess, here's a quick description from their website http://goaccess.io/:

GoAccess is an open source real-time web log analyzer and interactive viewer that runs in a terminal in *nix systems. It provides fast and valuable HTTP statistics for system administrators that require a visual server report on the fly.

Typically you run goaccess on a single log file like this: goaccess -f access.log. But in our case we wanted to run it on multiple log files for a month-end report. This presented some issues, since we needed to be able to pass multiple files and only report within a date range.

Below is the script we came up with. It's not perfect, but it's simple and does a pretty good job of what we needed. It finds all the gzipped access.log files modified within the past 35 days and pipes them into goaccess. We chose 35 days since some files may contain multiple days, so 35 days' worth of files should always include at least 30 days of data. Finally, we save the report as an HTML file named by year and month, e.g. monthly-2015.05.html.

#!/bin/bash
# report filename suffix, e.g. 2015.05
DATE=$(date +'%Y.%m')

# concatenate all access logs modified in the last 35 days and pipe them into goaccess
zcat $(find /var/log/apache2/ -name "access.log.*.gz" -mtime -35) | goaccess > /dir/monthly-$DATE.html
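Before pointing the pipeline at /var/log/apache2/, you can dry-run the find + zcat stage against a throwaway directory to see exactly which files get selected and concatenated (the file names and contents below are made up for illustration):

```shell
#!/bin/bash
# Dry run of the find + zcat stage on a scratch directory
logs=$(mktemp -d)
echo "GET /a" | gzip > "$logs/access.log.1.gz"
echo "GET /b" | gzip > "$logs/access.log.2.gz"

# Both files were just created, so -mtime -35 matches them;
# zcat decompresses and concatenates their contents onto stdout
zcat $(find "$logs" -name "access.log.*.gz" -mtime -35)

rm -rf "$logs"
```

Swapping the scratch directory for /var/log/apache2/ and appending `| goaccess > report.html` gives you the full pipeline from the script above.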

Then just save the file and add the cron job to run at midnight on the first of each month.

# as a shell script
00 00 01 * * /bin/bash /dir/goaccess-monthly.sh

# or as a single cron job line (note: % must be escaped as \% inside a crontab)
00 00 01 * *  zcat $(find /var/log/apache2/ -name "access.log.*.gz" -mtime -35) | goaccess > /dir/monthly-$(date +'\%Y.\%m').html

Also check out the man page for more information on various options and settings.

Continue reading →

EC2 High CPU Wait and the EBS Provisioned IOPS Difference

On our small Graphite monitoring server (c4.large), we kept having very high CPU wait times. It would hover around 80% almost continuously. I could scrub back in the graph timeline to see where it started, and there were no major changes made that day (like adding a fleet of new servers or extra data points), so I was pretty confident it was not an application issue.

Continue reading →

Heka JSON Decoder using a SandboxDecoder and Lua

First let me say, if you're looking for help on Heka, check out their IRC channel. It's full of great people who are extremely helpful! [IRC: #heka on irc.mozilla.org]

This Heka JSON decoder converts any simple key/value JSON payload into Heka fields.

Continue reading →

Drop MySQL Primary Key with Foreign Key using Laravel 4 (error 1025 & 105)

This issue isn't really specific to Laravel, but the code below shows how to handle it within Laravel 4.

The issue comes from trying to delete a primary key that's also a foreign key. MySQL would spit out the following "useful" error:

SQLSTATE[HY000]: General error: 1025 Error on rename of './database/#sql-10e9_9c' to './database/table' (errno: 150) (SQL: alter table `table` drop primary key)

To solve this you just need to delete the foreign key first and then the primary key.

Schema::table('products_fulltext', function(Blueprint $table) {
    // drop the foreign key first, then the primary key
    $table->dropForeign('table_field_foreign');
    $table->dropPrimary('PRIMARY');
});
Continue reading →

Laravel 4 Unable to Read Package Configuration File

I was trying to add a configuration file to an existing Laravel 4 package (revisionable) to help improve the functionality. But no matter what I tried I could not get the package to read from the src/config/config.php file.

Continue reading →

How to create a temporary storage directory that automatically deletes contents after X days (Mac)

Here's a quick way to have a directory that allows you to store files and other subdirectories for temporary usage. For example, I use this to save all my downloads and other files that I only need for the next day or two.

We first need to create a new cron job. If you're not familiar with cron, it's basically a set of instructions to the cron daemon of the general form: "run this command at this time on this date". We're going to use cron to run a command that moves all the contents of a directory to the trash.

// from the terminal, create a new crontab or open the existing one
crontab -e

// next press i to enter "INSERT" mode, which will allow you to enter text
00  */2  *  *  *  find /path/to/temp/ -mtime +1 -exec mv {} ~/.Trash \; >/dev/null 2>&1

// then save the crontab by pressing Esc, then :, w, q (write & quit)

Now let's explain what's going on with the cron job.

00 on minute 00 (e.g. 2:00, 18:00)
*/2 every 2 hours
* * * every day of the month, every month, every day of the week
find /path/to/temp/ finds all contents of the directory
-mtime +1 where the file's modification time is more than 1 day before the current date
-exec mv {} takes everything found by the find command and executes the mv (move) command on it
~/.Trash \; the user's Trash (could be any directory); the \; terminates the -exec command
>/dev/null 2>&1 suppresses any output (e.g. from filling up your user's mailbox)
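If you want to verify the -mtime +1 behavior before trusting it with real files, here's a quick check in a scratch directory (GNU touch -d syntax shown; on macOS use touch -t with an explicit timestamp instead):

```shell
#!/bin/bash
# Demonstrate that -mtime +1 only matches files modified more than a day ago
demo=$(mktemp -d)
touch "$demo/new.txt"                   # modified just now
touch -d "2 days ago" "$demo/old.txt"   # GNU touch; on macOS: touch -t YYYYMMDDhhmm
find "$demo" -type f -mtime +1          # prints only old.txt
rm -rf "$demo"
```

Running this shows only old.txt matching, so the cron job above would move just the stale files, not anything you downloaded today.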

Also, if you're looking for a quick way to add extra storage to your MacBook, check out Nifty Drives. This is what I use for my temporary storage.

Continue reading →