BIRKEY CONSULTING



15 Apr 2025

Nix: Better way for fun and profit

Nix started in 2003 as a research project aimed at solving the problem of reliable software deployment. The PhD thesis titled The Purely Functional Software Deployment Model proposed a novel way of building software in which the final artifact depends purely on the inputs to the build system, making the build a pure function in the mathematical sense. Regardless of where you are in your Nix journey, I can't recommend this thesis enough. It is very approachable and worth a read, so you can learn from first principles the what, why, and how of Nix.

Nix is a software build and management system that can replace traditional package managers, build environments, and configuration tools. Due to the inherent complexity of the problem domain Nix is designed to solve, and its long history, it has a pretty steep learning curve, but it is not insurmountable. One of the common points of confusion is how the term `Nix` is used in documentation, tutorials, and the blogosphere, so let me clarify a few terms that often get overloaded: `Nix` can refer to the Nix language (the DSL used to describe builds), the Nix package manager (the CLI tool), or, loosely, NixOS, the Linux distribution built on top of both.

After a few false starts and restarts, below is what I believe to be a better way to get started using Nix, for fun and profit.

Installation

I use the following bash script to install a specific version, so I have control over which version to install and which features to enable or disable.

#!/usr/bin/env bash
set -Eeuo pipefail

VERSION='2.28.1' # replace it with the latest version
URL="https://releases.nixos.org/nix/nix-${VERSION}/install"
MY_CONF="$HOME/.dotfiles/nix/nix.conf"
sh <(curl --location "${URL}") \
     --daemon \
     --no-channel-add \
     --nix-extra-conf-file "${MY_CONF}"
# the nix.conf file referenced above has this content:
# experimental-features = nix-command flakes

The `--no-channel-add` flag and the extra conf file need some explanation. Nix calls a remote URL a channel; the installer adds one automatically, and Nix uses it to retrieve package definitions (written in the Nix DSL) to manage packages. A channel introduces state, namely the currently installed channel URL, that lives outside the Nix DSL, thus defeating the purpose of reproducibility. Channels are considered a legacy feature and are not needed by flakes, an experimental feature already widely adopted by the community. So I highly recommend enabling flakes and the new `nix-command` CLI to interact with them.
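
To sanity-check the install, you can confirm the version and that flakes are enabled. A minimal sketch, assuming a recent Nix release (older releases spell the second command `nix show-config`):

nix --version
# should print both experimental features from the conf file above
nix config show | grep experimental-features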

Using Nix for fun and sanity

Every project depends on existing software that is beyond your control. The Nix DSL enables you to declaratively specify your project's dependencies, whether a repo or a tarball, down to the digest of their contents, which is what gives Nix its superpower of being a deterministic and reproducible package manager. This means that if your inputs stay the same, Nix guarantees that it produces the exact same output, regardless of when and where it runs. Below is a flake that pulls a pinned version of Clojure into your project.

{
  # optional attribute
  description = "My awesome Clojure/ClojureScript project";

  # inputs: the remote sources this flake depends on
  inputs = {
    # nix dsl fns useful for writing flakes
    flake-utils.url = "github:numtide/flake-utils/v1.0.0";
    # Pins state of the packages to a specific commit sha
    pinnedPkgs.url = "github:NixOS/nixpkgs/c46290747b2aaf090f48a478270feb858837bf11";
  };

  # required attribute
  outputs = { self, flake-utils, pinnedPkgs }@inputs :
  flake-utils.lib.eachDefaultSystem (system:
  let pinnedSysPkgs = inputs.pinnedPkgs.legacyPackages.${system};
  in
  {
    devShells.default = pinnedSysPkgs.mkShell {
      packages = [
        pinnedSysPkgs.clojure
      ];

      # commands to run in the development interactive shell
      shellHook = ''
        echo To get Clojure REPL, Run:
        echo clojure
        echo To get ClojureScript REPL, Run:
        echo clj -Sdeps \'{:deps {org.clojure/clojurescript {:mvn/version "1.11.132"}}}\' -M -m cljs.main --repl
      '';
    };
    packages = {
      docker = pinnedSysPkgs.dockerTools.buildLayeredImage {
        name = "My awesome Clj docker image built by nix";
        tag = "latest";
        contents = [pinnedSysPkgs.clojure];
      };
    };
  });
}

Do not worry too much about not understanding the Nix DSL code above. The most important thing to know is that it is Nix DSL, referred to as a flake, that declaratively specifies its inputs and outputs. Save the code above as `flake.nix` (the conventional name), then run `nix develop` to get an interactive shell with Clojure on your path (see the short session sketch after the list below). Nix can do way more than this; however, I recommend you start by just solving the project-dependencies problem. The flake above gives you the following benefits:

  • Ability to pin the exact versions of your project dependencies.
  • Cross-platform development environment that works on both macOS and various flavors of Linux.
  • Deterministic and reproducible development environment that eliminates "it works on my machine" tooling issues.
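
A first session might look like the sketch below (output abbreviated; `clojure --version` is just one way to confirm the tool is on your PATH):

# in the directory containing flake.nix
nix develop          # builds the dev shell; also writes flake.lock on the first run
clojure --version    # confirm Clojure comes from the pinned nixpkgs commit
exit                 # leave the dev shell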

One important thing to notice here is the way I chose to reference the URL inputs of the flake. I deliberately used tags or commit SHAs to prevent the state of the URLs (and thus the state of the Nix DSL) from changing under me, which would defeat the purpose of having a deterministic and reproducible way to get a development environment. I have the following bash function that prints the available tags and their corresponding commit SHAs:

git_tag_sha () {
  local repo="$1"
  echo "********************************************************"
  echo "Available release and commit sha for pinning are:"
  echo "********************************************************"
  printf "\033[1m%-12s %s\033[0m\n" "release" "commit sha"
  curl -s "https://github.com/${repo}/tags" \
    | grep -oP 'href="\K[^"]*(releases/tag|nixpkgs/commit)[^"]*' \
    | awk -F '/' 'NR%2{tag=$NF; next} {printf "%-12s %s\n", tag, $NF}'
  echo
  echo "****************************************************************************"
  echo "Please replace the commit sha in the following line to pin pkgs to a commit: "
  echo "pinnedPkgs.url = github:$repo/<commit>"
  echo "****************************************************************************"
  echo
}
# You can run it like this:
git_tag_sha "NixOS/nixpkgs"
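
Besides pinning the input URLs themselves, flakes also record the exact resolved revision of every input in a `flake.lock` file. A few standard `nix flake` subcommands I find useful for managing that lock file:

nix flake lock        # create or refresh flake.lock without building anything
nix flake metadata    # show which revision each input resolved to
nix flake update      # deliberately bump all inputs to their latest revisions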

Profiting in CI/CD and production

This is probably one of the most frictionless and rewarding outcomes of using Nix. Nix is designed to solve the problem of software deployment, after all, but wholesale adoption in production might prove to be too much effort for the final gain. To spare yourself countless hours of frustration, I highly recommend you start by using it to build Docker images, if you happen to use Docker and Kubernetes. Nix has superb built-in support for making Docker images far smaller than is otherwise practical. The flake above already includes a `docker` image as one of its `packages` outputs. Here is how you build and load the docker image:

nix build .#docker # the image will be in ./result
docker load < ./result # to get it ready to be deployed
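
The same two commands are all a CI job needs before pushing; here is a minimal sketch, where `registry.example.com` is a placeholder for your own registry and the image name matches the one in the flake above:

#!/usr/bin/env bash
set -Eeuo pipefail

nix build .#docker       # reproducible layered image tarball, symlinked at ./result
docker load < ./result   # import it into the local docker daemon
# tag and push; adjust the registry to your own (placeholder below)
docker tag my-awesome-clj-image:latest registry.example.com/my-awesome-clj-image:latest
docker push registry.example.com/my-awesome-clj-image:latest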

It is a declarative way (using the power of the Nix DSL, rather than a series of commands in a YAML file) to deterministically reproduce a layered Docker image, which saves time and money in your DevOps journey. Have fun and enjoy!

Tags: nix
18 Jun 2023

Google Bard and Emacs

After reading a Google blog post on Bard's increasing ability to reason about source code, I thought I would give it a try. The issue is that, unlike OpenAI, Bard currently does not have an HTTP API that I can use via curl. I googled around and came across the `bard-rs` project here: https://github.com/Alfex4936/Bard-rs. So I followed its excellent instructions to get set up using Bard from the command line, and it is pretty solid. I used the following Elisp to drive `bard-rs` from an Emacs compilation buffer:

(defun kcompilation-start (cmd name &optional mode)
  "Run CMD via `compilation-start' in a buffer named *NAME*.
When MODE is `read-only', do not use comint mode for the buffer."
  (let* ((compile-command nil)
         (compilation-save-buffers-predicate 'ignore)
         (compilation-buffer
          (compilation-start cmd
                             (if (equal mode 'read-only) nil t)
                             (lambda (m)
                               (or (when (boundp 'name)
                                     (format "*%s*" name))
                                   (buffer-name))))))
    (when current-prefix-arg
      (with-current-buffer compilation-buffer
        (switch-to-prev-buffer (get-buffer-window (current-buffer)))))
    (message "Running %s in %s ..." cmd name)))

(defun kprompt-bard (&optional p)
  "Prompts for input to send it to `bard` using `bard-rs` in
*bard-prompt* buffer. If mark-active, uses the text in the region
 as the prompt"
  (interactive "P")
  (let* ((bs "bard-prompt")
         (bname (format "*%s*" bs))
         (bname (if (get-buffer bname)
                    bname
                  (progn (kcompilation-start "bard-rs -e ~/.env" bs)
                         bname)))
         (prompt (if mark-active
                     (replace-regexp-in-string
                      "\n"
                      ""
                      (buffer-substring-no-properties (region-beginning) (region-end)))
                   (read-string "AI Chat Prompt: "))))
    (with-current-buffer (pop-to-buffer bname)
      (when p
        (goto-char (point-max))
        (insert "!reset")
        (comint-send-input)
        (goto-char (point-max))
        (insert prompt)
        (comint-send-input))
      (when (not p)
        (goto-char (point-max))
        (insert prompt)
        (comint-send-input)))))

You can bind `kprompt-bard` to any key of your choice and start interacting with Google Bard from the comfort of an Emacs buffer.
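
For completeness, the same interactive session can be started directly from a terminal; `kprompt-bard` merely drives this command inside a comint buffer (with `~/.env` holding the Bard session credentials, set up per the bard-rs instructions):

bard-rs -e ~/.env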

Tags: AI emacs
14 May 2023

AI or not to AI

The 1913 Webster dictionary gives the following definitions for the two words in "Artificial Intelligence":

Artificial - 1. Made or contrived by art; produced or modified by human skill and labor, in opposition to natural. 2. Feigned; fictitious; assumed; affected; not genuine. 3. Artful; cunning; crafty. 4. Cultivated; not indigenous; not of spontaneous growth.

Intelligence - 1. The act or state of knowing; the exercise of the understanding. 2. The capacity to know or understand; readiness of comprehension; the intellect, as a gift or an endowment. 3. Knowledge imparted or acquired, whether by study, research, or experience; general information. Specifically (Mil.): Information about an enemy or potential enemy, his capacities, and intentions.

Let us read and re-read the definitions above and give them a few minutes to sink in. With our skilled labor, we have managed to produce very powerful, fictitious software that can understand, produce, and reason about human-generated artifacts such as language, images, and videos. Making it more scalable, more accurate, and more controllable is just a matter of time, considering that what we have is already powerful enough. My following interaction with ChatGPT 3.5 is already impressive:

[ME]: Explain, then add tests and simplify if possible:
(defun kprompt (&optional p)
  (interactive "P")
  (let* ((bname (if p "*org-ai-on-region*" "*scratch*"))
         (directive "Explain, then add tests and simplify if possible:\n")
         (prompt (if p
                     (buffer-substring (region-beginning) (region-end))
                   (read-string "AI Chat Prompt: "))))
    (with-current-buffer (pop-to-buffer bname)
      (when p
        (erase-buffer)
        (org-mode)
        (setq truncate-lines nil)
        (setq word-wrap t)
        (org-ai--insert-created-timestamp)
        (insert "\n")
        (insert "#+begin_ai\n" "[ME]: " directive prompt "\n" "#+end_ai\n")
        (forward-line -1)
        (org-ai-complete-block))
      (when (not p)
        (beginning-of-buffer)
        (open-line 2)
        (org-ai-prompt prompt)))))

[AI]: This function takes an optional argument, p, which determines whether the user
will be prompted to enter a text or whether the function will use the text highlighted
in the buffer. 

If p is provided, it will create a new buffer with the name "*org-ai-on-region*" and
insert the highlighted text from the current buffer along with a directive. This new
buffer will be formatted using org-mode to allow for better organization of
comprehension text. Finally, it will call org-ai-complete-block to add some
AI-generated text to the prompt.

If p is not provided, it will prompt the user to enter text and then call
org-ai-prompt to add some AI-generated text to the prompt.

Tests:

Input:
(kprompt)
Expected Output:
AI Chat Prompt: 

Input:
(kprompt t)
Expected Output:
*org-ai-on-region* buffer created with prompt

Input:
(Highlight "This is a test")
(kprompt t)
Expected output:
*org-ai-on-region* buffer created with "This is a test" text and AI-generated text.

Simplified Code:
The code appears to be relatively straightforward and efficient.
No simplification is needed.

It understood and reasoned about a piece of code (simple for now, but that does not matter, as it can get much more powerful in no time) despite the fact that it had never seen the code above in its entirety (I am pretty sure about that, since it is private code and this is the first time I am making it public). This degree of intelligence, codified into software that a few powerful corporations currently dominate, should be something that keeps every software engineer awake at night. Not because of its inherent danger or its tremendous productivity-boosting ability, as folks on opposite sides of the current AI debate claim, but because of the very fact that every aspect of human life will be affected by software as powerful as ChatGPT whether we like it or not, and we need to do whatever we can to ensure it is used for the good of humanity in general. It is created by humans and should serve humans. Make no mistake about it.

Powerful software systems like this are already being used by big corporations and rogue states to cajole people into a state of self-censorship, if not into heedlessness of their future implications. Social media, powerful tracking, and image-recognition systems are already pervasive in the lives of millions of people who are being controlled by dictators all around the world (and such systems are being exported very actively in the name of economic progress) to socially engineer behaviors that benefit their agenda, in the name of social and economic progress, at the expense of destroying anyone or anything deemed an obstacle.

As a software engineer who has seen the worst of what bad actors can do with such powerful systems, I am calling on all of my fellow engineers to start thinking about what kind of world we would like our kids to inherit from us, regardless of where you are, who you are, and what your geopolitical affiliation is. The wave is already here, and it takes all of us to make sure we are not socially engineered out of our humanity. I believe in the power of our humanity to make AI work for us, not the other way around. I registered the domain www.codeforhumanrights.org a few years ago, and this might be a good time to start putting it to good use. If you are reading this and feel the need to start doing something, reach out to me via ktuman at acm dot org.

Tags: AI emacs
04 Mar 2023

Atomic commits made easy

Code complexity is something we all deal with in our daily work. There are many tools to help us manage it. One of the most important is to make incremental changes where each change is about one, and only one, context, which is a great definition of an atomic commit. I do not think I need to convince you of its benefits any further than what I have already alluded to above, which is worth repeating here: it helps us contain complexity within our code base. In pursuit of making atomic commits easy for myself, I settled on the following workflow:

Armed with the above convention, I incorporated the following tools to help me make atomic commits easy:

I am not going to repeat here what the excellent blog post says about the tools above, but it is worth checking out, and I highly recommend it. If you happen to use Emacs, here is how you add `commit-patch` to your config:

;; clone the commit-patch repo into ~/repos and evaluate the following code
(load-file "~/repos/commit-patch/commit-patch-buffer.el")
(eval-after-load 'diff-mode
  '(require 'commit-patch-buffer nil 'noerror))

With the above configuration, you can M-x vc-diff a file (vc-root-diff for the whole project), then kill, split, or edit the resulting hunks using diff-mode's built-in commands, and then hit C-c C-c to commit the patch. Later, if you realize that your commit is not atomic, you can make further changes and amend the previous commit with C-c C-C (note the upper-case C).
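
If you are not in Emacs, a rough plain-git equivalent of this hunk-level workflow is interactive staging; a sketch:

git add --patch                     # interactively pick the hunks that belong to one change
git commit -m "one focused change"
# realized the commit was not atomic? stage the follow-up hunks and amend:
git add --patch
git commit --amend --no-edit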

Tags: vc emacs
08 Oct 2022

Notes on StrangeLoop 2022

Strange Loop this year felt almost as normal as the pre-pandemic editions, especially in terms of the number of participants and sponsors. I actually believe it had the most sponsorship of any edition I have attended over the years. One noticeable company was AWS, which a friend of mine joined earlier this year, so it was pretty exciting to meet up with him there. As always, the breadth of folks and topics from all walks of software engineering was excellent this time as well. I took a few notes on the talks I was able to attend and summarize them in a few categories below.

Dev Experience : I consider observability to be an important part of the development experience, yet it is always pushed back as an afterthought at almost all the startups I have worked with or know of. The talk "Building Observability for 99% Developers" by Jean Yang really resonated with me, and I am glad that her company is building tools to make the developer experience better using eBPF, which has matured in the last few years. It was quite an entertaining talk, and I highly recommend you check it out. Another talk that I really enjoyed was "Workflows, a new abstraction for distributed systems" by Dominik Tornow. If you are dealing with the chaos of distributed services, the abstraction he presented makes you feel like you are working with a monolith again (oh, the happy times :) with the advantage of cloud scalability.

Programming Languages : I personally prefer dynamic languages (Clojure, to be exact) as my tools of choice for my day-to-day practice of solving problems for people, but I do see the benefit of a type system as a guide rail when designing an API or, better yet, when generating code. You should check out the talk "Codegen with Types, for Humans, by Humans" by Matthew Griffith, where he talks about how to use types to generate code for human consumption. If you are working with (or, say, fighting with) a type system that is complecting your business logic and code execution, you should check out the talk "Monad I Love You Now Get Out Of My Type System" by Gjeta Gjyshinca, where she talks about a platform that helps Scala developers get the type system out of their business logic with just five extra characters, @node, and a compiler plugin.

Data Centric Problem Solving : Strange Loop has always been attractive to data-centric practices and paradigms, and this year was no different. There were many talks related to how to generate, architect, and analyze data. The talk titled "Data-driven investigation in defense of human rights" by Christo Buschek presented very clear and methodical approaches to solving real problems we face around the world. I really feel we need more of that type of work, where we put technology to the benefit of our fellow human beings. I think he should do another presentation next year titled "Data-driven investigation in defense of peace and opposition to war", which would surely make me sign up for Strange Loop one last time (next year will be the last time you can experience Strange Loop, and I highly recommend you attend to see what you have been missing).

Tags: conference strange-loop