Downloading Blackboard Unit Course Content for Offline Viewing

Blackboard is a great tool for completing college courses online, but sometimes you can get stuck without internet access, making it difficult to study. There is, however, a way to use the command-line tool wget to download the site content for local, offline viewing.

It should be noted, however, that this method is unlikely to be endorsed by your college – so use it at your own risk!

To download the content for offline viewing you need two things (using Firefox):
1. Install the Firefox Cookie Exporter
2. Install wget

Use Cookie Exporter to export your session cookies to cookies.txt.

The following command will then download all the course material after loading cookies.txt (-m mirrors the site and -k converts the links for local viewing):

wget -mk --no-check-certificate --load-cookies cookies.txt https://link.to/the/course/CODEE_LEE7/Content/

If you use this method to download a course with a lot of content, you may want to consider inserting “-w 20” into the wget command, which tells wget to wait 20 seconds between downloads (giving the server some rest, otherwise you may get booted), as shown below.
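
For example, the same command with the wait option added (same placeholder URL as above):

wget -mk -w 20 --no-check-certificate --load-cookies cookies.txt https://link.to/the/course/CODEE_LEE7/Content/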

MySQL utf8mb4 Encoding Breaks ActiveRecord’s Schema Setup

I recently wrote about the virtues of true UTF-8 (utf8mb4) character sets in MySQL and how to change your database to use it. Today we will discuss a problem you may encounter when you do this while programming in Ruby on Rails. The error looks something like this:

$ rake db:setup

Mysql::Error: Specified key was too long; max key length is 767 bytes:
CREATE UNIQUE INDEX unique_schema_migrations ON schema_migrations (version)

The problem exists because the utf8mb4 character set uses the full 4 bytes per character rather than the 1–3 bytes of MySQL's utf8 (the character set most people mistakenly use, thinking they'll get full Unicode compliance). Because of this extra size, the index on the schema_migrations version column no longer fits within MySQL's 767-byte index key limit.

This small patch sets the default MySQL string column length to 191 characters instead of 255, which is the effective index limit on utf8mb4 (aka real UTF-8): 191 × 4 = 764 bytes fits under the 767-byte cap, whereas 255 × 4 = 1,020 bytes does not.

# config/initializers/mysqlpls.rb
require 'active_record/connection_adapters/abstract_mysql_adapter'

module ActiveRecord
  module ConnectionAdapters
    class AbstractMysqlAdapter
      # Default new string columns to 191 characters so that indexed
      # VARCHARs fit within utf8mb4's 767-byte index key limit (191 * 4 = 764)
      NATIVE_DATABASE_TYPES[:string] = { :name => "varchar", :limit => 191 }
    end
  end
end
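
Alternatively, if you would rather not patch ActiveRecord globally, you can cap the length of individual indexed string columns in your migrations instead. A minimal sketch, assuming a hypothetical users table and email column:

# Hypothetical migration: cap the indexed column at 191 characters so the
# unique index fits within utf8mb4's 767-byte key limit.
class AddEmailToUsers < ActiveRecord::Migration
  def change
    add_column :users, :email, :string, :limit => 191
    add_index  :users, :email, :unique => true
  end
end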


MySQL’s UTF8 isn’t *really* UTF8 (and how to properly support Unicode)

I was forced to pull my hair out recently after several SQL queries containing UTF-8 Unicode text, fired from my project of the day, failed to insert into my UTF-8 MySQL database. Working on a project with a lot of Middle Eastern and Asian content was simply crashing my SQL INSERT queries, and I was losing a tremendous quantity of data.

Turns out MySQL's utf8 character set only partially implements proper UTF-8 encoding. It can only store UTF-8-encoded symbols that consist of one to three bytes; encoded symbols that take up four bytes aren't supported. Luckily, MySQL 5.5.3 introduced a new encoding called utf8mb4 which maps to proper UTF-8 and thus fully supports Unicode, including astral symbols.

Switching from MySQL’s utf8 to utf8mb4

First, we need to change the character set and collation properties of the database, tables, and columns to use utf8mb4 instead of utf8.

# For each database:
ALTER DATABASE database_name CHARACTER SET = utf8mb4 COLLATE utf8mb4_unicode_ci;
# For each table:
ALTER TABLE table_name CONVERT TO CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci;
# For each column:
ALTER TABLE table_name CHANGE column_name column_name VARCHAR(191) CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci;

Note: Don’t simply copy-paste this! You will need to tailor the queries to your specific schema; the above lines are for illustrative purposes only.
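
If you have a lot of tables, you don't have to write each ALTER by hand. One common trick is to generate the statements from information_schema, then review and run the output. A sketch, assuming your schema is named database_name:

SELECT CONCAT('ALTER TABLE `', TABLE_SCHEMA, '`.`', TABLE_NAME,
              '` CONVERT TO CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci;')
FROM information_schema.TABLES
WHERE TABLE_SCHEMA = 'database_name';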

Repair and optimize all tables:

Do this with some simple SQL queries:

# For each table
REPAIR TABLE table_name;
OPTIMIZE TABLE table_name;

Or using the MySQL command-line:

$ mysqlcheck -u root -p --auto-repair --optimize --all-databases

Note: Don’t forget to change the character encoding in your PHP/Ruby code to match – i.e. utf8mb4 instead of utf8.

Test the changes:

You can test that the changes have taken effect by running this command and checking that the collation/charset variables have been switched to utf8mb4:

SHOW VARIABLES WHERE Variable_name LIKE 'character\_set\_%' OR Variable_name LIKE 'collation%';

Alternatively, change the default character set in MySQL.

In the /etc/my.cnf config file, add the following:

[client]
default-character-set = utf8mb4

[mysql]
default-character-set = utf8mb4

[mysqld]
collation-server = utf8mb4_unicode_ci
init-connect='SET NAMES utf8mb4'
character-set-server = utf8mb4

That’s it. Obviously you will need to restart MySQL first (see the commands below), and the new defaults will only apply to databases created after the change, but I highly recommend it. Even if you think you don’t need the full range of UTF-8 characters, it only takes one to mess things up!
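
How you restart MySQL depends on your platform; it will be something like one of these (adjust the service name to your setup):

$ sudo systemctl restart mysql     # modern Linux distros (sometimes mysqld)
$ sudo service mysql restart       # older init systems
$ brew services restart mysql      # macOS with Homebrew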

Conclusion

Never use utf8 in MySQL; always use utf8mb4 instead. Arbitrarily limiting the set of symbols your users can store is just silly, and is bound to cause a problem the moment a young Japanese girl starts putting lots of emoji cats in your strings. Why would you risk losing data every time an Arabic-speaking user writes a message in their native locale? There is no logical reason for it, and a few minutes of pain now might save you a lot of heartache later. Do the right thing, and use utf8mb4 instead.

Better Way to Copy a Large Quantity of Data Over a Network Without Using ‘scp’

If you ever need to copy a large amount of data over a network (especially a huge number of small files), you can pipe a tar command through an ssh connection. Because tar streams everything as a single archive rather than negotiating each file separately, it will be far, far faster than using scp.

To execute it, simply:

$ tar czf - <files> | ssh user@host "cd /wherever; tar xvzf -"
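
The same trick also works in the opposite direction, pulling files from the remote host down to your local machine. The paths here are placeholders:

$ ssh user@host "tar czf - -C /remote/dir ." | tar xzf -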


How to Speed Up Importing or Recovering a Large MySQL Database

I recently had to import a 30GB MySQL database from a backup of a client’s production database. My development workstation really struggled with the hundreds of thousands of INSERT queries, and the import either took an unacceptably long time or failed outright.

Fortunately, there is a sure-fire way to speed up the import (in my case, a 100-fold increase in speed).

Simply open Terminal and type:

$ mysql -uXXX -pXXX

…and replace the X’s with an appropriate username and password, then paste this at the MySQL prompt:

CREATE DATABASE my_database;
USE my_database;
set global net_buffer_length=1000000;
set global max_allowed_packet=1000000000;
SET autocommit=0;
SET unique_checks=0;
SET foreign_key_checks=0;
SOURCE /some/path/database_file.sql;
COMMIT;
SET autocommit=1;
SET unique_checks=1;
SET foreign_key_checks=1;

A few notes: ‘set global net_buffer_length’ and ‘set global max_allowed_packet’ matter mainly when the source file is being read over a network connection. Also, simply omit the ‘CREATE DATABASE my_database;’ line if you have already imported the blank schema or the database already exists.

Using Ruby’s Metaprogramming to Initialize an Object From a Hash

Consider the code:

class A
  attr_accessor :b, :c, :d, :e, :h, :i, :x
end

Now imagine that you want to initialize each instance variable from the hash entry with the same name. Imagine all the repetitive, crappy code that would generate.

But this is Ruby, and with Ruby there is *nearly* always a better way. Instead, metaprogram it and mix it in.

module ConstructedFromHash
  def initialize(h)
    # For each key/value pair, call the matching attr_writer (e.g. b= for :b)
    h.each { |k, v| send("#{k}=", v) }
  end
end

class A
  include ConstructedFromHash
  attr_accessor :b, :c, :d, :e, :h, :i, :x
end
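
Usage then looks like this:

a = A.new(:b => 1, :c => 2, :x => 42)
a.b # => 1
a.x # => 42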

Nice, elegant, and clean. Just the way Ruby code is supposed to be. And this code will now scale: as more accessors are added to the object over time, the constructor won’t need reprogramming. If you don’t need to do this often, you can pull just the constructor out of the module and put it directly into the class, but this way provides the most flexibility.

Header image taken from Examining Dwemthy’s Array composite pattern. An interesting read in its own right. Check it out.

Monitoring the Progress of a Very Large MySQL Database Import

Previously I have mentioned an awesome little command-line tool called ‘pv’. Recently, I was trying to restore a client’s legacy database on my development machine: a staggering 30GB .sql file. I was having quite a few problems wrestling this beast, the worst being that I had no idea how much longer the import would take, or whether it had locked up. Luckily, pv comes to the rescue.

Normally, when importing a MySQL dump file, you can just type:

mysql -uxxx -pxxx dbname < /sqlfile.sql

…to import directly from the file. However, you can instead have pv read the file and pipe its contents into the mysql executable, like:

pv sqlfile.sql | mysql -uxxx -pxxxx dbname

And you will get an awesome progress bar showing how complete the task is. You can also use it in the reverse direction. You’d be surprised how useful it can be.
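
For example, to watch throughput while dumping a database (in the middle of a pipe pv doesn’t know the total size, so it shows bytes transferred and speed rather than a percentage; the credentials and names are placeholders):

mysqldump -uxxx -pxxx dbname | pv | gzip > dbname.sql.gz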

TVRenamer – Trust Me, It Just Works!

If, like me, you consume a great deal of television and have always wanted a tool to help re-organise and apply a single file/directory naming convention to all your media files – here it is.

TVRenamer is a Java GUI utility to rename TV episodes from TV listings. Basically, it will take an ugly filename like Lost.S06E05.DD51.720p.WEB-DL.AVC-FUSiON.mkv and rename it to Lost [6×05] Lighthouse.mkv

It has a whole lot of features which I won’t bother to go into (as if rapid, reliable auto-renaming/sorting weren’t enough!), but I really encourage everyone to check it out. It really is amazing.

Hands-down, the BEST way to Install MySQL on Mac OSX Mavericks

I recently had some database woes. I needed to restore a MySQL database, dozens of gigabytes in size, from a client’s existing website. I had a great deal of trouble importing that data, but that’s a story for another time. This post is about getting MySQL installed in the first place (for a development environment).

At first, I tried using Homebrew – because it’s awesome and I like it. But sadly, for one reason or another, the default configuration just wasn’t working for me. It was simple, but too much was hidden away and I didn’t feel there was enough control (like setting or getting the default root password, for example). If you’re going to automate something, then automate it – don’t ask me to run extra steps to secure my install when you’re supposed to be automating it for me. Bah humbug!

However, the wonderful folks at macminivault.com solved all my problems. It was simply perfection. Exactly the right balance of automation and control.

Just open up Terminal and paste this into it:

bash <(curl -Ls http://git.io/eUx7rg)

It will tell you or prompt you for the rest. And don’t forget to save the text file containing the generated root password before you mistakenly delete it.

Simplicity itself.

(comic property of http://www.brainstuck.com/)