Establishing ActiveRecord Database Connections in Ruby (But Without Rails)

Anyone who has even the smallest amount of experience developing with Ruby on Rails knows that Rails has some pretty sweet configuration conventions which make switching between environments very easy. Switching from development to production to testing is as easy as changing the RAILS_ENV variable. No doubt Rails does some dark magic behind the scenes to trivialise this. But what if you’re writing an app in Ruby, without the Rails to guide you?

ActiveRecord 101

Establishing a database connection in ActiveRecord without Rails is pretty basic:

require 'active_record'

ActiveRecord::Base.establish_connection({
  adapter:  'sqlite3',
  database: 'db/test.sqlite3'
})

It establishes a connection to the specified database using the supplied connection configuration. If you were using MySQL or PostgreSQL you would provide the relevant settings, such as username, password and host. With the connection established, we can now consume it however we like. For example, we may want to use funky Rails-style models (so we can easily create and populate tables and reference them):

ActiveRecord::Schema.define do
  create_table :things do |t|   # create_table adds an integer `id` primary key automatically
    t.string :name
  end
end

class Thing < ActiveRecord::Base
end

Thing.create(name: 'Broomstick')

Care about the Environment

The other bit of Rails magic is environment selection. The main benefit is that we can keep isolated connection configurations and have our Ruby app automatically select the correct settings for us, based on an environment variable read from ENV.

require 'active_record'

conf = case ENV['DB']
when 'conf1'
  {
    adapter:  'sqlite3',
    database: 'db/mydb1.sqlite3'
  }
when 'conf2'
  {
    adapter:  'sqlite3',
    database: 'db/mydb2.sqlite3'
  }
else
  raise 'export DB=conf[n]'
end

ActiveRecord::Base.establish_connection conf

Mind you, before this will work you first need to set the environment variable:

export DB=conf1 # or conf2

Now, depending on ENV['DB'], the code will open a connection to the corresponding database.

Best Practices

It’s never a good idea to store sensitive information (like database connection details, usernames and passwords of any kind) in your code base. Rails uses an external database configuration file called database.yml to solve this problem, so let’s implement one ourselves.

database.yml:

conf1:
  adapter: sqlite3
  database: db/mydb1.sqlite3

conf2:
  adapter: sqlite3
  database: db/mydb2.sqlite3

Loading this YAML file is very simple:

require 'active_record'
require 'yaml'

conf = YAML.load_file('database.yml')
ActiveRecord::Base.establish_connection conf[ENV['DB']]

Now we can set our environment variable the same as before, but our code will automatically look up the proper database configuration in the YAML file, without the connection details being stored directly in our code.
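To go one step further and keep the credentials themselves out of the repository, you can run the YAML file through ERB and reference environment variables from it, much as Rails does with its own database.yml. Here is a minimal sketch; the DB_USER and DB_PASS variable names and the mysql2 settings are assumptions for illustration:

require 'erb'
require 'yaml'
require 'active_record'

# Assumes a database.yml entry along these lines (values pulled from the environment):
#   production:
#     adapter:  mysql2
#     database: myapp
#     username: <%= ENV['DB_USER'] %>
#     password: <%= ENV['DB_PASS'] %>
conf = YAML.load(ERB.new(File.read('database.yml')).result)
ActiveRecord::Base.establish_connection conf[ENV['DB']]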

Association Cardinality in Rails

From time to time I’ve noticed people who struggle with cardinality and associations in Ruby on Rails. So, I thought I would attempt to create a cheat sheet here to help developers understand relationship cardinality and how it maps to associations.

ActiveRecord can be used to describe relations with one-to-one, one-to-many and many-to-many cardinality, where each model defines its relation to the other. Let’s cover each of the three types of associations.

One-to-one

Use `has_one` in the base and `belongs_to` in the association:

class Family < ActiveRecord::Base
  has_one :home
end
class Home < ActiveRecord::Base
  belongs_to :family
end

A common question about a one-to-one association is: how do you know which direction the has_one and the belongs_to go? The rule is that whichever model holds the foreign key gets the `belongs_to`. In this case, Home has the foreign key `family_id`.
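If it helps to see that in a schema, here is a minimal migration sketch for the example above (the `address` column is a hypothetical attribute added only for illustration):

class CreateHomes < ActiveRecord::Migration
  def change
    create_table :homes do |t|
      t.integer :family_id, null: false  # the belongs_to side holds the foreign key
      t.string  :address                 # hypothetical attribute, for illustration only
    end
  end
end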

One-to-one relationships are a bit odd, and as a general rule, if you find yourself using a lot of them, there is probably a better solution.

One-to-many

Use `has_many` in the base and `belongs_to` in the association:

class Family < ActiveRecord::Base
  has_many :parents
end
class Parent < ActiveRecord::Base
  belongs_to :family
end

This will be your most common relationship. As with one-to-one, the table with the foreign key gets the `belongs_to` (although this is a lot more obvious with a one-to-many). In this case the foreign key is `family_id`.
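As a quick usage sketch (assuming a `family_id` integer column exists on the parents table):

family = Family.create
family.parents << Parent.new     # saved with family_id pointing at family
family.parents.count             # => 1
Parent.first.family == family    # => true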

Many-to-many

These can be a lot more complicated, and there are actually a couple of different ways to model them.

The first way involves an explicit joining model, which results in two stages of has_many associations. It is referred to as `has_many :through` and is primarily used if you need to fully control the joining model/table:

class Family < ActiveRecord::Base
  belongs_to :parent
  belongs_to :kid
end
class Parent < ActiveRecord::Base
  has_many :families
  has_many :kids, through: :families
end
class Kid < ActiveRecord::Base
  has_many :families
  has_many :parents, through: :families
end
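A short usage sketch of the join-model approach (assuming the families table carries `parent_id` and `kid_id` columns):

parent = Parent.create
kid    = Kid.create
Family.create(parent: parent, kid: kid)  # the join row is a first-class record
parent.kids.include?(kid)     # => true
kid.parents.include?(parent)  # => true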

The second (and my preferred way) is to use the `has_and_belongs_to_many` method:

class Parent < ActiveRecord::Base
  has_and_belongs_to_many :kids 
end
class Kid < ActiveRecord::Base
  has_and_belongs_to_many :parents
end

The main difference (or disadvantage) with the `has_and_belongs_to_many` method is that the intermediary join table and foreign keys need to be named exactly as Rails expects. Which many-to-many method you use ultimately depends on whether you need to work with the relationship model as its own entity directly.
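For reference, the join table `has_and_belongs_to_many` expects here is named from the two class names in lexical order (kids_parents) and carries only the two foreign keys. A minimal migration sketch:

class CreateKidsParents < ActiveRecord::Migration
  def change
    create_table :kids_parents, id: false do |t|  # no primary key on the join table
      t.integer :kid_id,    null: false
      t.integer :parent_id, null: false
    end
  end
end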

Downloading Blackboard Unit Course Content for Offline Viewing

Blackboard is a great tool for completing college courses online, but sometimes you can get stuck without the internet, making it difficult to study. There is, however, a way to use the command-line tool wget to download the site content for local, offline viewing.

It should be noted however, that this method is unlikely to be endorsed by your college – so use at your own risk!

To download the content for offline viewing you need two things (in Firefox):
1. Install the Firefox Cookie Exporter
2. Install wget

Use Cookie Exporter to export your Blackboard cookies to cookies.txt.

The next command will then download all the course material, loading your session from cookies.txt:

wget -mk --no-check-certificate --load-cookies cookies.txt https://link.to/the/course/CODEE_LEE7/Content/

If you use this method to download from courses with lots of content, you may want to consider adding “-w 20” to the wget command, which tells wget to wait 20 seconds between downloads (this gives the server some rest; otherwise you may get booted).

MySQL utf8mb4 Encoding Breaks ActiveRecord’s Schema Setup

I recently wrote about the virtues of true UTF-8 (utf8mb4) character sets in MySQL and how to change your database to use it. Today we will discuss a problem you may encounter when you do this while programming with Ruby on Rails. The error looks something like this:

$ rake db:setup

Mysql::Error: Specified key was too long; max key length is 767 bytes:
CREATE UNIQUE INDEX unique_schema_migrations ON schema_migrations (version)

The problem exists because the utf8mb4 character set uses the full 4 bytes per character rather than the 1–3 bytes of MySQL’s utf8 (the character set most people mistakenly use thinking they’ll get full Unicode compliance). Because of this extra width, a 255-character indexed string column can require up to 1020 bytes, so the index on schema_migrations.version no longer fits within MySQL’s 767-byte key limit.

This small patch sets the default MySQL string column length to 191 instead of 255, since 191 × 4 = 764 bytes is the longest that still fits under the 767-byte index limit with utf8mb4 (aka real UTF-8).

# config/initializers/mysqlpls.rb
require 'active_record/connection_adapters/abstract_mysql_adapter'

module ActiveRecord
  module ConnectionAdapters
    class AbstractMysqlAdapter
      NATIVE_DATABASE_TYPES[:string] = { :name => "varchar", :limit => 191 }
    end
  end
end

 

MySQL’s UTF8 isn’t *really* UTF8 (and how to properly support Unicode)

I was forced to pull out my hair recently after several SQL queries containing UTF-8-encoded Unicode text, fired from my project of the day, failed to insert into my UTF8 MySQL database. Working on a project with a lot of Middle Eastern and Asian content was simply crashing my SQL INSERT queries, and I was losing a tremendous quantity of data.

Turns out MySQL’s UTF8 character set only partially implements proper UTF-8 encoding. It can only store UTF-8-encoded symbols that consist of one to three bytes; encoded symbols that take up four bytes aren’t supported. Luckily, MySQL 5.5.3 introduced a new encoding called utf8mb4 which maps to proper UTF-8 and thus fully supports Unicode, including astral symbols.

Switching from MySQL’s utf8 to utf8mb4

First, we need to change the character set and collation properties of the database, tables, and columns, to use utf8mb4, instead of utf8.

# For each database:
ALTER DATABASE database_name CHARACTER SET = utf8mb4 COLLATE utf8mb4_unicode_ci;
# For each table:
ALTER TABLE table_name CONVERT TO CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci;
# For each column:
ALTER TABLE table_name CHANGE column_name column_name VARCHAR(191) CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci;

Note: Don’t simply copy-paste this! You will need to tailor the queries to meet your specific requirements; the above lines are for illustrative purposes only.

Repair and optimize all tables:

Do this with some simple SQL Queries:

# For each table
REPAIR TABLE table_name;
OPTIMIZE TABLE table_name;

Or using the MySQL command-line:

$ mysqlcheck -u root -p --auto-repair --optimize --all-databases

Note: Don’t forget to change the character encoding your PHP/Ruby code uses when connecting – i.e. utf8mb4 instead of utf8.
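On the Ruby side, that usually just means asking for utf8mb4 when the connection is established. A sketch using ActiveRecord with the mysql2 adapter (credentials and database name are placeholders):

ActiveRecord::Base.establish_connection(
  adapter:   'mysql2',
  host:      'localhost',
  username:  'user',        # placeholder
  password:  'secret',      # placeholder
  database:  'my_database', # placeholder
  encoding:  'utf8mb4',
  collation: 'utf8mb4_unicode_ci'
)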

Test the changes:

You can test that the changes have taken effect by running this command to see if the collation/charsets have been switched to utf8mb4:

SHOW VARIABLES WHERE Variable_name LIKE 'character\_set\_%' OR Variable_name LIKE 'collation%';

Alternatively, change the default character set in MySQL.

In the /etc/my.cnf config file, set the following:

[client]
default-character-set = utf8mb4

[mysql]
default-character-set = utf8mb4

[mysqld]
collation-server = utf8mb4_unicode_ci
init-connect='SET NAMES utf8mb4'
character-set-server = utf8mb4

That’s it. Obviously you will need to restart MySQL first, and the new defaults will only apply to databases created after the change, but I highly recommend making it. Even if you think you don’t need the full range of UTF-8 characters, it only takes one to mess things up!

Conclusion

Never use utf8 in MySQL – always use utf8mb4 instead. Arbitrarily limiting the set of symbols you can use is just silly and is bound to cause a problem when your user is a young Japanese girl putting lots of cat emoji in your strings. Why would you risk losing data every time an Arabic-speaking user writes a message in their native script? There’s no logical reason for it, and a few minutes of pain now might save you a lot of heartache later. Do the right thing, and use utf8mb4 instead.

Better Way to Copy a Large Quantity of Data Over a Network Without Using ‘scp’

If you ever need to copy a large amount of data over a network (especially if it’s a huge number of small files), you can pipe a tar command through an ssh connection; because tar streams whole blocks at a time, it will be far, far faster than using scp.

To execute it, simply:

$ tar czf - <files> | ssh user@host "cd /wherever; tar xvzf -"

 

How to Speed Up Importing or Recovering a Large MySQL Database

I recently had to import a 30GB MySQL database from a backup of a client’s production database. My development workstation really struggled with the hundreds of thousands of INSERT queries, and the import either took an unacceptably long time or failed outright.

Fortunately, there is a sure-fire way to speed up the import (in my case, a 100-fold increase in speed).

Simply open Terminal and type:

$ mysql -uXXX -pXXX

…and replace the X’s with an appropriate username and password and then paste this at the MySQL prompt:

CREATE DATABASE my_database;
USE my_database;
set global net_buffer_length=1000000;
set global max_allowed_packet=1000000000;
SET autocommit=0;
SET unique_checks=0;
SET foreign_key_checks=0;
SOURCE /some/path/database_file.sql;
COMMIT;
SET autocommit=1;
SET unique_checks=1;
SET foreign_key_checks=1;

A few notes: the ‘set global net_buffer_length’ and ‘set global max_allowed_packet’ settings only apply if your source file is being read over a network. Also, simply omit the ‘CREATE DATABASE my_database;’ line if you have already imported the blank schema or the database already exists.

Using Ruby’s Metaprogramming to Initialize an Object From a Hash

Consider the code:

class A
  attr_accessor :b, :c, :d, :e, :h, :i, :x
end

Now imagine that you want to initialize each instance variable from the value with the same key in a hash. Imagine all the repetitive and crappy code that would generate.

But this is Ruby, and with Ruby there is *nearly* always a better way. Instead, meta-program it, and mix it in.

module ConstructedFromHash
  def initialize(h)
    h.each { |k, v| send("#{k}=", v) }
  end
end

class A
  include ConstructedFromHash
  attr_accessor :b, :c, :d, :e, :h, :i, :x
end

Nice, elegant and clean. Just the way Ruby code is supposed to be. And this code will scale: as more accessors are added to the object over time, the constructor won’t need reprogramming. If you don’t need to do this often, you can pull just the constructor out of the module and put it directly into the class, but the mix-in provides the most flexibility.
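For completeness, a quick usage sketch with the mix-in above:

a = A.new(b: 1, c: 'two', x: [3])
a.b  # => 1
a.x  # => [3]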


Monitoring the Progress of a Very Large MySQL Database Import

Previously I have mentioned an awesome little command-line tool called ‘pv’. Recently, I was trying to restore a client’s legacy database on my development machine – a staggering 30GB .sql file. I was having quite a few problems trying to wrestle this beast, the worst of which was that I had no idea how much longer the import would take, or whether it had locked up. Luckily, pv comes to the rescue.

Normally, when importing a MySQL dump file, you can just type:

mysql -uxxx -pxxx dbname < /sqlfile.sql

…to import directly from the file. However, you can instead run the file through pv and pipe it into the mysql executable, like so:

pv sqlfile.sql | mysql -uxxx -pxxxx dbname

And you will get an awesome progress bar showing how far through the task you are. You can also use it in the reverse direction, for example to watch a dump being written out. You’d be surprised how useful it can be.

Tv-Renamer – Trust Me, It Just Works!

If, like me, you consume a great deal of television and have always wanted a tool to help reorganise your media files under a single file/directory naming convention – here it is.

TVRenamer is a Java GUI utility to rename TV episodes from TV listings. Basically, it will take an ugly filename like Lost.S06E05.DD51.720p.WEB-DL.AVC-FUSiON.mkv and rename it to Lost [6×05] Lighthouse.mkv

It has a whole lot of features which I won’t bother to go into (as if rapid, reliable auto-renaming/sorting wasn’t enough!) but I really encourage everyone to check this out. It really is amazing.