Recently, I started working on a new side project in Elixir, and I think I’ve finally found something I’m going to stick with! In the past I would either build something like a simple TODO app and not get far enough into the language, or I would pick a gigantic idea and get nowhere because of how daunting it was. However, one of my co-workers recently implemented a feature using the GitHub Webhooks API: when we add a label to one of our pull requests, a Slack channel is notified that the PR is ready to be reviewed. I decided I wanted to rebuild it in Elixir and, in doing so, write about what I learn along the way; this is the first in what I hope will be many posts about my journey. With that said, if you’re unfamiliar with the webhooks API or how to set it up on your repository, please read the link above, because we’re jumping right in!
I was recently attending DockerCon, and Michael Ellison, CEO of Codepath, was giving a talk about the future of computer science education. He told a story about how he was recently in a room with a large number of CTOs and asked how many of them had started programming before the age of 14; around 90% of them raised their hands. I let out an audible chuckle, turned to the person next to me, and said, “Well, time to change careers, looks like I’ll never be a CTO.” While being a CTO isn’t the end-all, be-all of a software engineer’s career path, I do believe it is a very desirable outcome. Later, back at my hotel, I was thinking about when I had started learning to program, and it dawned on me that it had been about ten years since the spring quarter of my freshman year at Oregon State University. I started reminiscing about all I had learned over those ten years and decided I wanted to write a brief summary of my journey to becoming a software engineer and what I hope to accomplish moving forward.
Recently, I decided that one of my goals for 2019 was to familiarize myself more with Docker. I’ve been exposed to Docker for the past couple of years, but I don’t use it on a day-to-day basis. Every once in a while I would need to update a Dockerfile or a script, and I would realize I needed to brush up on nearly everything because it had been so long since I last looked at anything Docker related. I decided to just dive in and read a book to cover any concepts I had glossed over before, so I started reading Learn Docker – Fundamentals of Docker 18.x. It was during a tutorial where some seeded data was needed in a PostgreSQL database that I had a bit of an aha moment. ‘I can build images that already have data in them?!’ I thought to myself; this could really help with local development if I had a copy of a production database.
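That aha moment can be sketched with the official Postgres image, which executes any SQL files placed in /docker-entrypoint-initdb.d/ the first time a container initializes its data directory (the file name, database name, and image tag below are assumptions, not the book’s exact tutorial):

```dockerfile
# Hypothetical sketch: a Postgres image that ships with seed data.
# The official postgres image runs *.sql files found in
# /docker-entrypoint-initdb.d/ on first initialization, so the
# seeded rows appear automatically when a container starts.
FROM postgres:11-alpine

ENV POSTGRES_DB=app_development

# seed.sql is an assumed file holding schema and data, e.g. produced
# by pg_dump against a scrubbed copy of a production database.
COPY seed.sql /docker-entrypoint-initdb.d/
```

Strictly speaking this seeds at first startup rather than baking rows into the image layers, but for local development the effect is the same: run the container and the data is there.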
An update to my post on adding a testing environment to a gem. After some recent updates to our Docker images at work, I realized that we always use Ruby Alpine images, not the base Ruby image. I can’t remember why I built the gem’s Dockerfile on the base Ruby image; perhaps I had just overlooked the fact that we used Ruby Alpine. Either way, I wanted to standardize the Dockerfiles I had written at work and here for the blog, so I decided to look into what it would take to do so.
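For reference, the switch is mostly a base-image change plus build dependencies: Alpine images ship without the compiler toolchain that gems with native extensions need at install time. The version tag and layout here are assumptions, not the gem’s actual Dockerfile:

```dockerfile
# Hypothetical sketch: moving a gem's Dockerfile to Ruby Alpine.
# ruby:2.6-alpine is far smaller than ruby:2.6, but it lacks gcc/make,
# so native-extension gems need a toolchain installed via apk first.
FROM ruby:2.6-alpine

# build-base is Alpine's meta-package for gcc, make, libc headers, etc.
RUN apk add --no-cache build-base

WORKDIR /gem
COPY . .
RUN bundle install
```

The trade-off is a noticeably smaller image in exchange for remembering which apk packages each native dependency needs.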
Continuing to work on our gem with active_record rake tasks, we still need to set up a testing environment that can be run locally and in a repeatable fashion for continuous integration; we’ll accomplish the latter using a simple Dockerfile. But first, let’s make it easier for someone to start using the gem by enhancing the scripts in
This is a follow-up to my last post, Adding ActiveRecord Rake Tasks to a Gem, that I promised to write. In that post I had to figure out how to make the command rails g migration accessible inside of the gem, which ended up taking me all afternoon but surprised me with how little code was actually needed to achieve the result. I wanted to write about the process I went through to figure out what was needed; I believe it is a good exercise in following code and understanding what it takes to re-implement functionality.
In my previous post I walked through using a gem to connect to another Rails application’s database, but another use case for connecting a gem to a database is the development of the gem itself. Instead of having to create a Rails application and install the gem just to connect to a database and test your models, we can create a local database for only the gem by adding ActiveRecord’s Rake tasks.
I was recently thinking about system design, specifically the monolithic vs. microservices approaches and how applications can talk to each other. If I needed to connect two applications, I would start by exposing APIs and using Faraday to write a simple HTTP client to consume them. However, APIs come with their own set of issues (a discussion for another day), and an idea popped into my head: let applications connect directly to another application’s database through a gem that exposes the model classes. I would only consider this approach for internal applications, and even then, you could cripple your system if someone starts writing queries without knowing what they are doing. But I was curious and wanted to try this approach out, so let’s get started with creating our gem!
A new year, a new attempt at blogging! A few years ago I spun up www.jeremykreutzbender.com and dropped a Ghost blog on it in my first attempt to blog. I wanted to mix my personal thoughts with programming-oriented posts, and to be frank, I didn’t get too far. I made a few posts but found it hard to find things to talk about or to put together coherent and meaningful thoughts. I also found that the ever-changing nature of Ghost, since it’s actively being worked on, combined with self-hosting meant that every six months or so I had to remember how to log into my DigitalOcean account, apply the updates, and make sure they didn’t break my theme; overall it just became a hassle that I didn’t feel was worth the time.