It’s back. Me. This website. My drive to write. The continued existence of my technological journey through a wide spectrum of challenging endeavours. It’s back and better than ever.
Moving forward, a few things will be different. First off, this is no longer my personal outlet for internal thoughts and happenings: instead, you’ll be able to find those elsewhere (more later). Second, future posts will touch on everything from Python semantics and CI/CD to microservices, hacky one-liners, and every real approach to challenges many will encounter through their career in computing (as I have).
I recently obtained a disk dock and cloning unit (StarTech.com) for working with some of my internal drives (I have too many). This unit does a bit-by-bit clone of one disk to another, which is really useful! The problem with this is that each disk now looks identical to your operating system, right down to the filesystem UUIDs and labels, meaning there is no way to mount them both at the same time!
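The fix is to give the clone a fresh filesystem identity. Here is a minimal sketch using file-backed ext4 images instead of real disks (so no root is needed); on actual hardware you would run `tune2fs` against the cloned partition instead of an image file:

```shell
# Simulate a bit-by-bit clone with two file-backed ext4 images
truncate -s 16M disk-a.img
mkfs.ext4 -q -F disk-a.img
cp disk-a.img disk-b.img                    # the "clone"

# Both report the same UUID (-p probes directly, skipping the cache)
blkid -p -s UUID -o value disk-a.img
blkid -p -s UUID -o value disk-b.img

# Give the clone a new random UUID so both filesystems can coexist
tune2fs -U random disk-b.img
blkid -p -s UUID -o value disk-b.img        # now different
```

If the clone uses a partition table, note that partition UUIDs are duplicated too; the above only fixes the filesystem-level identity.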
Furthermore, I decided to create a LUKS encrypted drive protecting an ext4 partition.
I just wanted to give a quick update on what my company, Arroyo Networks, is up to!
Arroyo is moving forward with a brand new product in a brand new market with a brand new purpose. Over the next few months, we plan to offer a private beta of our prototype with an open beta shortly after that. Be sure to look for more announcements, a brand new website, and much more in the coming weeks!
This is a continuation of my original article on our new Dell XPS 15s. This article will cover my “moving in” experience getting used to Gnome 3, and tuning Arch just the way I like it ;)
That cool dude over at geeketeer.net has also written about his experiences and mods!
Bootloader fun! Most people don’t know, but GRUB has a multitude of theme customization options available. We use a set made around Arch Linux, located here.
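For reference, wiring a theme in takes two steps: point `GRUB_THEME` at the theme’s `theme.txt` and regenerate the config. A sketch (the install path is an assumption; use wherever you extracted the theme):

```shell
# /etc/default/grub
GRUB_THEME="/boot/grub/themes/Arch/theme.txt"
```

Afterwards, regenerate the config as root with `grub-mkconfig -o /boot/grub/grub.cfg` so the theme takes effect on the next boot.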
Introduction

Both my co-founder, @seglberg, and I decided early in 2016 that it was time to rethink our workstations. We both had ThinkPads which, while they’ve treated us well, lacked the performance needed for our present workload. So we decided to look around and see what’s fresh in the laptop market, especially with the new Intel Skylake architecture available!
With the new things we’re working on, it’s essential that we can quickly run compression, encryption, Docker builds, virtual machines, and more.
Going Public…
Just over a week ago, my company rolled out our public presence: a fresh website, LinkedIn profile, and even Twitter.
I want to also thank all the wonderful people who have sent luck our way and those who have supported us thus far…you are awesome!
Distro Change

Moving on, I wanted to mention my recent decision to move to Arch Linux, an amazingly light, responsive, and elegant Linux distribution.
Late last week, I resigned from my position at Arbor Networks in order to join a stealth startup. Unfortunately, I didn’t get to say goodbye to any coworkers, since I had to stay discreet about the details of the new company.
Either way, I’m hitting the ground running at my new gig and having a blast! Don’t worry, we’ll be going public pretty soon so keep an eye out!
I want to thank all the people who have already shown their support and reached out wishing me luck.
I recently ran into a situation where I needed to copy a large amount of data from one box to another in a LAN environment. In a situation like this, the following things are usually true, at least for this project they were:
- The source contains many files and directories (in the millions); enough that it’s not even practical to use file-based methods to move the data over.
- The disk the data resides on is not exactly “fast” and may be exceptionally “old”.
- We need to maximize transfer speed, and we don’t care about “syncing”; we just want a raw dump of the data from one place to another.

Given all that, just creating a tar archive on the source box and transferring it over isn’t gonna fly.
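Under those constraints, a raw block copy streamed over the network avoids per-file overhead entirely. A minimal sketch with `dd` and `nc` (the hostname, port, and device names are placeholders, not from the post; triple-check the target device, since `dd` will happily overwrite it):

```shell
# On the receiving box: listen on a port and write straight to the target disk
nc -l 7000 | dd of=/dev/sdb bs=64K

# On the sending box: raw-read the source disk and stream it over the LAN
dd if=/dev/sda bs=64K | nc receiver.lan 7000
```

Since this is a raw dump, both disks should be unmounted while it runs, and the target must be at least as large as the source.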
I recently decided to jump into the object storage revolution (yeah, I’m a little late). This push comes partly from my very old archives, which I’d like to store offsite, and partly from wanting to streamline how I deploy applications that have things like data directories and databases that need to be backed up.
The Customary

Lately, through my work at Arbor and my own personal dabbling, I’ve come to love the idea that a service may depend on one or more containers to function.
Sometimes, you need an application to run at a scheduled time. Ideally, it would be a really cool feature if you could merely tell the Docker daemon to do this via some sort of schedule: * 1 * * * in your docker-compose.yml. Sadly, this isn’t really possible, so you have two options:
1. Base your image on one which already has cron installed.
2. Simply install cron yourself.

Either way, there are a few things you need to watch out for.
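As a sketch of the second option (the base image, job file name, and schedule are all made up for illustration), a Debian-based Dockerfile might look like:

```dockerfile
# Illustrative only: Debian base with cron installed by hand
FROM debian:bookworm-slim

RUN apt-get update \
    && apt-get install -y --no-install-recommends cron \
    && rm -rf /var/lib/apt/lists/*

# Install the job; files in /etc/cron.d need a user field
# in each line and must end with a trailing newline
COPY myjob.cron /etc/cron.d/myjob
RUN chmod 0644 /etc/cron.d/myjob

# Gotchas: cron does not inherit the container's environment variables,
# and the daemon must run in the foreground to keep the container alive
CMD ["cron", "-f"]
```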