Compression


Backing up to S3: The Containerized Way

I recently decided to jump into the object storage revolution (yeah, I'm a little late). The drive comes from my very old archives that I'd like to store offsite, but also from wanting to streamline how I deploy applications that have things like data directories and databases needing backup. Lately, through my work at Arbor and my own personal dabbling, I've come to love the idea that a service may depend on one or more containers to function.

Continue reading ↦
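To give a flavor of the approach, here is a rough sketch of one way a containerized backup can look, assuming the official amazon/aws-cli image; the volume and bucket names here are hypothetical, not the exact setup from the post:

# sync a (read-only mounted) data volume up to S3
docker run --rm \
  -v app-data:/data:ro \
  -e AWS_ACCESS_KEY_ID -e AWS_SECRET_ACCESS_KEY \
  amazon/aws-cli s3 sync /data s3://my-backup-bucket/app-data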

Unzip multiple archives in a single directory

And yes, this handles file names with spaces and other weird characters…ohh the joys of double quotes.

ls *.zip | while IFS= read -r i; do echo "Starting on $i"; unzip -d "$i-extracted" "$i"; echo -en "Finished $i..\n"; done
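If you'd rather sidestep parsing ls output entirely, a plain for loop over the glob does the same job and survives any filename bash can represent:

# loop directly over the glob; no pipe, no word-splitting worries
for i in *.zip; do
  echo "Starting on $i"
  unzip -d "$i-extracted" "$i"
  echo "Finished $i.."
done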



Compress a MySQL database table

MySQL databases using the InnoDB engine support compressing table data with the zlib compression algorithm. Per the official documentation, it is quite easy to create or alter a table to support compression! It of course helps quite a bit with columns that hold a lot of text (the TEXT, MEDIUMTEXT, and LONGTEXT column types). Here is how I altered my table using phpMyAdmin (since I didn't see an obvious place in the GUI to do it, I just ran the following SQL statements on the DB):

Continue reading ↦
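The exact statements are in the full post, but the standard InnoDB syntax runs along these lines (the articles table is just a placeholder; note that older MySQL versions also require innodb_file_per_table=ON and the Barracuda file format):

-- compress the table's data pages with zlib, using 8KB pages
ALTER TABLE articles ROW_FORMAT=COMPRESSED KEY_BLOCK_SIZE=8;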

Extract only a single file/dir from an archive

Using 7z:

7z l file.7z
7z x file.7z directory/neededfile.txt

Using tar:

tar ztvf file.tgz
tar xzvf file.tgz directory/neededfile.txt

Note the "t" is the argument telling tar to list files, so you could do "jtvf" for a bz2 archive, etc…
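The same list-then-extract pattern works for zip archives too, assuming unzip is installed:

unzip -l file.zip                          # list contents first
unzip file.zip directory/neededfile.txt    # extract only that one file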