Chris' Blog.

My occasional thoughts on software development, careers, and updates on life in general.

FM Synthesis

FM Synthesis is an old-school way of generating musical instrument sounds, initially popularised by the AdLib and Sound Blaster PC sound cards in the late ’80s (and, of course, in piano keyboards). Here’s an example of what FM synth music sounded like in games. Ahh, the nostalgia.

A friend who is a school music teacher found that his students all use the same samples for the instruments in their creations. So I created YouSynth, a web app that allows you to create any instrument you like using a basic form of FM synthesis, and download that instrument as a WAV file you can use anywhere, as well as play around with it using an attached MIDI keyboard. Please check it out!

So as to not leave out the maths teachers, I thought I’d write an article about how the maths for FM synthesis works! I think it’s fascinating; hopefully you will too. My dream is that maybe a maths teacher somewhere would use this as an interesting demonstration of applied maths to pique their students’ interest :)

Formula

To start with, here’s the gist of it - for each sample, the value is:

sin(
    carrierFrequency * time * 2 * pi
    +
    sin(modulatorFrequency * time * 2 * pi) * modulatorEnvelope
) * carrierEnvelope

Now let’s break that down.

Carrier frequency

The carrier frequency is the fundamental frequency of the note. Eg for A4, it’s 440 Hz. For Middle C, aka C4, it’s ~261.6 Hz.

For each note you go up (including sharps), the frequency is multiplied by 2^(1/12). The 1/12 is because there are 12 notes in each octave when including the sharps. The 2^ is because frequencies double with each octave: e.g. A4 is 440 Hz, and A5 is 880 Hz.

When working with MIDI, each note gets a number: C4=60, C#4=61, D4=62, etc. To convert from a MIDI note to a frequency, the formula is: 440 * 2 ^ ((midiNote - 69) / 12).
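
In code, that conversion might look something like this (a minimal sketch in Rust; the function name is mine):

// Convert a MIDI note number to its frequency in Hz.
// E.g. 69 (A4) -> 440.0, 60 (middle C) -> ~261.63.
fn midi_note_to_frequency(midi_note: i32) -> f64 {
    440.0 * 2.0_f64.powf((midi_note as f64 - 69.0) / 12.0)
}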

Time

The time in the above formula is in seconds since the note started playing. Since you’d typically be generating samples at a rate of 44100 or 48000 Hz, to convert from the sample number to the time, this formula applies: time = sample / sampleRate.

Pi

The 2 * pi is necessary because sin repeats its output every multiple of 2 * pi on its input. An interesting aside: some credible mathematicians argue that tau (2 * pi) should be taught to students instead of pi, because we so often need to double pi before using it, so why not make the doubled value the famous constant? See the Tau Manifesto.

Modulator frequency

The modulator is the waveform that ‘modulates’ the fundamental frequency. Think of it as the whammy bar on a guitar being wiggled up and down quickly.

Typically the modulator frequency is a whole-number multiple or fraction of the fundamental frequency. Eg for a fundamental of 440 Hz, the following modulator frequencies all sound ‘nice’: 110 (440/4), 146.7 (440/3), 220 (440/2), 440, 880, 1320, etc.
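
Putting the pieces together so far, here’s a minimal sketch of the formula in Rust (the function and parameter names are my own):

use std::f64::consts::PI;

// One sample of basic FM synthesis, following the formula at the top of the article.
// `time` is seconds since the note started; the envelopes are amplitudes from 0 to 1.
fn fm_sample(
    carrier_frequency: f64,
    modulator_frequency: f64,
    time: f64,
    carrier_envelope: f64,
    modulator_envelope: f64,
) -> f64 {
    let modulation = (modulator_frequency * time * 2.0 * PI).sin() * modulator_envelope;
    (carrier_frequency * time * 2.0 * PI + modulation).sin() * carrier_envelope
}

To fill a buffer you’d call this once per sample, with time = sampleNumber / sampleRate as described in the Time section above.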

Envelopes

The envelopes control the amplitude/volume of the carrier and modulator over time. The amplitude starts at zero, rises quickly to 100%, then falls to a sustained level of perhaps 50%, where it stays while the piano key is held; when the key is released, it gradually returns to 0.

A common strategy is the ADSR (attack, decay, sustain, release) envelope.

During the attack stage: amplitude = time / attackDuration.

During the decay stage: amplitude = 1 - (time - attackDuration) / decayDuration * (1 - sustainAmplitude).

During the sustain stage: amplitude = sustainAmplitude.

During the release stage: amplitude = sustainAmplitude * (1 - releasingTime / releaseDuration), where releasingTime is the time since the key was released (so the amplitude reaches 0 at the end of the release).
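
As a minimal sketch, here’s how that envelope might look in code (in Rust; the function and parameter names are mine, and it assumes you separately track how long ago the key was released):

fn adsr_amplitude(
    time: f64,                    // Seconds since the note started.
    releasing_time: Option<f64>,  // Seconds since the key was released; None while held.
    attack_duration: f64,
    decay_duration: f64,
    sustain_amplitude: f64,
    release_duration: f64,
) -> f64 {
    if let Some(releasing_time) = releasing_time {
        // Release: fade from the sustain level down to zero.
        (sustain_amplitude * (1.0 - releasing_time / release_duration)).max(0.0)
    } else if time < attack_duration {
        // Attack: ramp from 0 up to 1.
        time / attack_duration
    } else if time < attack_duration + decay_duration {
        // Decay: fall from 1 down to the sustain level.
        1.0 - (time - attack_duration) / decay_duration * (1.0 - sustain_amplitude)
    } else {
        // Sustain: hold steady while the key is held.
        sustain_amplitude
    }
}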

Other waves

To make more interesting sounds, other waveforms besides sine waves can be used. Some common ones are square, triangle, and sawtooth. Here are their formulae, which repeat every multiple of 1 on the input (a code sketch follows the list):

  • Sine = sin(x * 2 * pi)
  • Square = 4 * floor(x) - 2 * floor(2 * x) + 1
  • Triangle = 2 * abs(2 * (x + 0.25 - floor(x + 0.75))) - 1
  • Sawtooth = 2 * (x - floor(x + 0.5))
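
As a minimal sketch, here’s how those might look in code (in Rust; the function names are mine, and x is measured in cycles rather than radians):

// Each of these repeats every multiple of 1 on its input, and outputs -1 to +1.
fn sine(x: f64) -> f64 {
    (x * 2.0 * std::f64::consts::PI).sin()
}

fn square(x: f64) -> f64 {
    4.0 * x.floor() - 2.0 * (2.0 * x).floor() + 1.0
}

fn triangle(x: f64) -> f64 {
    2.0 * (2.0 * (x + 0.25 - (x + 0.75).floor())).abs() - 1.0
}

fn sawtooth(x: f64) -> f64 {
    2.0 * (x - (x + 0.5).floor())
}

To use one of these in the synth, you’d pass it frequency * time wherever the formula above uses sin(frequency * time * 2 * pi).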

So there you have it, the maths behind basic FM Synthesis. Thanks for reading, hope you found this fascinating, at least a tiny bit, God bless!

Photo by Vackground on Unsplash


Training layers of neurons

Hi all, here’s the fourth in my series on neural networks / machine learning / AI from scratch. In the previous articles (please read them first!), I explained how a single neuron works, then how to calculate the gradients of its weights and bias, and how you can use those gradients to train the neuron. In this article, I’ll explain how to determine the gradients when you have many layers of many neurons, and use those gradients to train the neural net.

In my previous articles in this series, I used spreadsheets to make the maths easier to follow. Unfortunately I don’t think I can demonstrate this topic in a spreadsheet (it’d get out of hand), so I’ll keep it in code. I hope you can still follow along!

Data model

Pardon my pseudocode:

class Net {
    layers: [Layer]
}

class Layer {
    neurons: [Neuron]
}

class Neuron {
    value: float
    bias: float
    weights: [float]
    activation_gradient: float
}

Explanation:

  • Layers: The neural net is made up of multiple layers. The first one in the array is the input layer, the last one is the output layer.
  • Neurons: The neurons that make up a layer. Each layer can have a different number of neurons.
  • Value: The output of each neuron.
  • Bias: The bias of each neuron.
  • Weights: Input weights for each neuron. This array’s size will be the number of inputs to this layer. For the first layer, this will be the number of inputs (aka features) to the neural net. For subsequent layers, this will be the count of neurons in the previous layer.
  • Activation Gradient: The gradient of each neuron’s activation, chained to the later layers via the magic of calculus. It’s also equal to the gradient of the bias. Maybe reading my second article in this series will help you understand what this gradient means :)

High(ish) level explanation

What we’re trying to achieve here is to use calculus to determine the ‘gradient’ of every bias and every weight in this neural net. In order to do this, we have to ‘back propagate’ these gradients from the back to the front of the ‘layers’ array.

Concretely - if, say, we had 3 layers: we’d figure out the gradients of the activation functions of layers[2], then use those values to calculate the gradients of layers[1], and then layers[0].

Once we have the gradients of the activation functions for each neuron in each layer, it’s easy to figure out the gradient of the weights and bias for each neuron.

And, as demonstrated in my previous article, once we have the gradients, we can ‘nudge’ the weights and biases in the direction their gradients indicate, and thus train the neural net.

Steps

Training and determining the gradients go hand-in-hand, as you need the inputs to calculate the values of each neuron in the net, and you need the targets (aka desired outputs) to determine the gradients. Thus it’s a three-step process:

  • Forward pass (calculate each Neuron.value)
  • Backpropagation (calculate each Neuron.activation_gradient)
  • Train the weights and biases (adjust each Neuron.bias and Neuron.weights)

Forward pass

This pass fills in the ‘value’ fields.

  • The first layer’s neurons must have the same number of weights as the number of inputs.
  • Each neuron’s value is calculated as tanh(bias + sum(weights * inputs)).
  • Since tanh is used as the activation function, this neural net can only work with inputs, outputs, and targets that are in the range -1 to +1.

Forward pass pseudocode:

for layer in layers, first to last {
    if this is the first layer {
        for neuron in layer.neurons {
            total = neuron.bias
            for weight in neuron.weights {
                total += weight * inputs[weight_index]
            }
            neuron.value = tanh(total)
        }
    } else {
        previous_layer = layers[layer_index - 1]
        for neuron in layer.neurons {
            total = neuron.bias
            for weight in neuron.weights {
                total += weight * previous_layer.neurons[weight_index].value
            }
            neuron.value = tanh(total)
        }
    }
}

Backward pass (aka backpropagation)

This fills in the ‘activation_gradient’ fields.

  • Note that when iterating the layers here, you must go last to first.
  • The ‘targets’ are the array of output value(s) from the training data.
  • The last layer must have the same number of neurons as the number of targets.
  • The (1 - value^2) * ... terms are the calculus for determining gradients: the derivative of tanh(x) is 1 - tanh(x)^2, so multiplying by (1 - value^2) applies the chain rule through the activation function.

Backward pass pseudocode:

for layer in reversed layers, last to first {
    if this is the last layer {
        for neuron in layer.neurons {
            neuron.activation_gradient =
                (1 - neuron.value^2) *
                (neuron.value - targets[neuron_index])
        }
    } else {
        next_layer = layers[layer_index + 1]
        for this_layer_neuron in layer.neurons {
            next_layer_gradient_sum = 0
            for next_layer_neuron in next_layer.neurons {
                next_layer_gradient_sum +=
                    next_layer_neuron.activation_gradient * 
                    next_layer_neuron.weights[this_layer_neuron_index]
            }
            this_layer_neuron.activation_gradient =
                (1 - this_layer_neuron.value^2) *
                next_layer_gradient_sum
        }
    }
}

Training pass

Now that you have the gradients, you can adjust the biases/weights to train it to do better.

I’ll skim over this as it’s covered in my earlier articles in this series. The gist of it is that, for each neuron, the gradient is calculated for the bias and every weight, and the bias/weights are adjusted a little to ‘descend the gradient’. Perhaps my pseudocode might make more sense:

learning_rate = 0.01 // Aka 1%
for layer in layers {
    if this is the first layer {
        for neuron in layer.neurons {
            neuron.bias -= neuron.activation_gradient * learning_rate
            for weight in neuron.weights {
                gradient_for_this_weight = inputs[weight_index] *
                    neuron.activation_gradient
                weight -= gradient_for_this_weight * learning_rate
            }
        }
    } else {
        previous_layer = layers[layer_index - 1]
        for neuron in layer.neurons {
            neuron.bias -= neuron.activation_gradient * learning_rate
            for weight in neuron.weights {
                gradient_for_this_weight =
                    previous_layer.neurons[weight_index].value *
                    neuron.activation_gradient
                weight -= gradient_for_this_weight * learning_rate
            }
        }
    }
}

Rust demo

Because I’m a Rust tragic, here’s a demo. It’s kinda long, sorry, not sorry. It was fun to write :)

This trains a neural network to calculate the area and circumference of a rectangle, given the width and height as inputs.

  • Width and height are scaled to the range 0.1 to 1, because that’s the range that the tanh activation function supports.
  • Target values are also scaled to be in the range that tanh supports.
  • Initial biases and weights are randomly assigned.

🦀🦀🦀

use rand::Rng;

struct Net {
    layers: Vec<Layer>,
}

struct Layer {
    neurons: Vec<Neuron>,
}

struct Neuron {
    value: f64,
    bias: f64,
    weights: Vec<f64>,
    activation_gradient: f64
}

const LEARNING_RATE: f64 = 0.001;

fn main() {
    let mut rng = rand::thread_rng();

    // Make a 3,3,2 neural net that inputs the width and height of a rectangle,
    // and outputs the area and circumference.
    let mut net = Net {
        layers: vec![
            Layer { // First layer has 2 weights to suit the 2 inputs.
                neurons: vec![
                    Neuron {
                        value: 0.,
                        bias: rng.gen_range(-1. .. 1.),
                        weights: vec![
                            rng.gen_range(-1. .. 1.),
                            rng.gen_range(-1. .. 1.),
                        ],
                        activation_gradient: 0.,
                    },
                    Neuron {
                        value: 0.,
                        bias: rng.gen_range(-1. .. 1.),
                        weights: vec![
                            rng.gen_range(-1. .. 1.),
                            rng.gen_range(-1. .. 1.),
                        ],
                        activation_gradient: 0.,
                    },
                    Neuron {
                        value: 0.,
                        bias: rng.gen_range(-1. .. 1.),
                        weights: vec![
                            rng.gen_range(-1. .. 1.),
                            rng.gen_range(-1. .. 1.),
                        ],
                        activation_gradient: 0.,
                    },
                ],
            },
            Layer { // Second layer neurons have the same number of weights as the previous layer has neurons.
                neurons: vec![
                    Neuron {
                        value: 0.,
                        bias: rng.gen_range(-1. .. 1.),
                        weights: vec![
                            rng.gen_range(-1. .. 1.),
                            rng.gen_range(-1. .. 1.),
                            rng.gen_range(-1. .. 1.),
                        ],
                        activation_gradient: 0.,
                    },
                    Neuron {
                        value: 0.,
                        bias: rng.gen_range(-1. .. 1.),
                        weights: vec![
                            rng.gen_range(-1. .. 1.),
                            rng.gen_range(-1. .. 1.),
                            rng.gen_range(-1. .. 1.),
                        ],
                        activation_gradient: 0.,
                    },
                    Neuron {
                        value: 0.,
                        bias: rng.gen_range(-1. .. 1.),
                        weights: vec![
                            rng.gen_range(-1. .. 1.),
                            rng.gen_range(-1. .. 1.),
                            rng.gen_range(-1. .. 1.),
                        ],
                        activation_gradient: 0.,
                    },
                ],
            },
            Layer { // Last layer has 2 neurons to suit 2 outputs.
                neurons: vec![
                    Neuron {
                        value: 0.,
                        bias: rng.gen_range(-1. .. 1.),
                        weights: vec![
                            rng.gen_range(-1. .. 1.),
                            rng.gen_range(-1. .. 1.),
                            rng.gen_range(-1. .. 1.),
                        ],
                        activation_gradient: 0.,
                    },
                    Neuron {
                        value: 0.,
                        bias: rng.gen_range(-1. .. 1.),
                        weights: vec![
                            rng.gen_range(-1. .. 1.),
                            rng.gen_range(-1. .. 1.),
                            rng.gen_range(-1. .. 1.),
                        ],
                        activation_gradient: 0.,
                    },
                ],
            },
        ],
    };

    // Train.
    let mut cumulative_error_counter: i64 = 0; // These vars are for averaging the errors.
    let mut area_error_percent_sum: f64 = 0.;
    let mut circumference_error_percent_sum: f64 = 0.;
    for training_iteration in 0..100_000_000 {
        // Inputs:
        let width: f64 = rng.gen_range(0.1 .. 1.);
        let height: f64 = rng.gen_range(0.1 .. 1.);
        let inputs: Vec<f64> = vec![width, height];

        // Targets (eg desired outputs):
        let area = width * height;
        let circumference_scaled = (height * 2. + width * 2.) * 0.25; // Scaled by 0.25 so it'll always be in range 0..1.
        let targets: Vec<f64> = vec![area, circumference_scaled];

        // Forward pass!
        for layer_index in 0..net.layers.len() {
            if layer_index == 0 {
                let layer = &mut net.layers[layer_index];
                for neuron in &mut layer.neurons {
                    let mut total = neuron.bias;
                    for (weight_index, weight) in neuron.weights.iter().enumerate() {
                        total += weight * inputs[weight_index];
                    }
                    neuron.value = total.tanh();
                }
            } else {
                // Workaround for Rust not allowing you to borrow two different vec elements simultaneously.
                let previous_layer: &Layer;
                unsafe { previous_layer = & *net.layers.as_ptr().add(layer_index - 1) }
                let layer = &mut net.layers[layer_index];
                for neuron in &mut layer.neurons {
                    let mut total = neuron.bias;
                    for (weight_index, weight) in neuron.weights.iter().enumerate() {
                        total += weight * previous_layer.neurons[weight_index].value;
                    }
                    neuron.value = total.tanh();
                }
            }
        }

        // Let's check the results!
        let outputs: Vec<f64> = net.layers.last().unwrap().neurons
            .iter().map(|n| n.value).collect();
        let area_error_percent = (targets[0] - outputs[0]).abs() / targets[0] * 100.;
        let circumference_error_percent = (targets[1] - outputs[1]).abs() / targets[1] * 100.;
        area_error_percent_sum += area_error_percent;
        circumference_error_percent_sum += circumference_error_percent;
        cumulative_error_counter += 1;
        if training_iteration % 10_000_000 == 0 {
            println!("Iteration {} errors: area {:.3}%, circumference: {:.3}% (smaller = better)",
                training_iteration,
                area_error_percent_sum / cumulative_error_counter as f64,
                circumference_error_percent_sum / cumulative_error_counter as f64);
            area_error_percent_sum = 0.;
            circumference_error_percent_sum = 0.;
            cumulative_error_counter = 0;
        }

        // Backward pass! (aka backpropagation)
        let layers_len = net.layers.len();
        for layer_index in (0..layers_len).rev() { // Reverse the order.
            if layer_index == layers_len - 1 { // Last layer.
                let layer = &mut net.layers[layer_index];
                for (neuron_index, neuron) in layer.neurons.iter_mut().enumerate() {
                    neuron.activation_gradient =
                        (1. - neuron.value * neuron.value) *
                        (neuron.value - targets[neuron_index]);
                }
            } else {
                // Workaround for Rust not allowing you to borrow two different vec elements simultaneously.
                let next_layer: &Layer;
                unsafe { next_layer = & *net.layers.as_ptr().add(layer_index + 1) }
                let layer = &mut net.layers[layer_index];
                for (this_layer_neuron_index, this_layer_neuron) in layer.neurons.iter_mut().enumerate() {
                    let mut next_layer_gradient_sum: f64 = 0.;
                    for next_layer_neuron in &next_layer.neurons {
                        next_layer_gradient_sum +=
                            next_layer_neuron.activation_gradient * 
                            next_layer_neuron.weights[this_layer_neuron_index];
                    }
                    this_layer_neuron.activation_gradient =
                        (1. - this_layer_neuron.value * this_layer_neuron.value) *
                        next_layer_gradient_sum;
                }
            }
        }

        // Training pass!
        for layer_index in 0..net.layers.len() {
            if layer_index == 0 {
                let layer = &mut net.layers[layer_index];
                for neuron in &mut layer.neurons {
                    neuron.bias -= neuron.activation_gradient * LEARNING_RATE;
                    for (weight_index, weight) in neuron.weights.iter_mut().enumerate() {
                        let gradient_for_this_weight =
                            inputs[weight_index] *
                            neuron.activation_gradient;
                        *weight -= gradient_for_this_weight * LEARNING_RATE;
                    }
                }
            } else {
                // Workaround for Rust not allowing you to borrow two different vec elements simultaneously.
                let previous_layer: &Layer;
                unsafe { previous_layer = & *net.layers.as_ptr().add(layer_index - 1) }
                let layer = &mut net.layers[layer_index];
                for neuron in &mut layer.neurons {
                    neuron.bias -= neuron.activation_gradient * LEARNING_RATE;
                    for (weight_index, weight) in neuron.weights.iter_mut().enumerate() {
                        let gradient_for_this_weight =
                            previous_layer.neurons[weight_index].value *
                            neuron.activation_gradient;
                        *weight -= gradient_for_this_weight * LEARNING_RATE;
                    }
                }
            }
        }
    }
}

Which outputs:

Iteration 0 errors: area 223.106%, circumference: 13.175% (smaller = better)
Iteration 10000000 errors: area 17.861%, circumference: 1.123% (smaller = better)
Iteration 20000000 errors: area 14.656%, circumference: 0.790% (smaller = better)
Iteration 30000000 errors: area 14.516%, circumference: 0.698% (smaller = better)
Iteration 40000000 errors: area 6.359%, circumference: 0.882% (smaller = better)
Iteration 50000000 errors: area 2.966%, circumference: 0.875% (smaller = better)
Iteration 60000000 errors: area 2.769%, circumference: 0.807% (smaller = better)
Iteration 70000000 errors: area 2.600%, circumference: 0.698% (smaller = better)
Iteration 80000000 errors: area 2.401%, circumference: 0.573% (smaller = better)
Iteration 90000000 errors: area 2.166%, circumference: 0.468% (smaller = better)

You can see the error percentages drop as it ‘learns’ to calculate the area and circumference of a rectangle. Magic!

Thanks for reading, hope you found this helpful, at least a tiny bit, God bless!

Photo by Jonas Hensel on Unsplash


Previewable SwiftUI ViewModels

Hi all, I’d like to talk about a way to set up your ViewModels in SwiftUI to make previews easy:

  • A) Decouple your ViewModels from your Views.
  • B) Replace your ViewModel when previewing.
  • C) Easily inject any ViewState content when previewing.
  • D) Test your ViewModels without needing a View, instead testing their ViewState.

I’ve used a variant of this (I simplified it a little) with a big team before so I know it’s battle-proven. But of course this may be more helpful as a starting point for you, too.

The general idea is this: have a ‘ViewModel’ protocol, give your Views a generic constraint so they accept any ViewModel that uses that view’s specific state/events, and use a preview viewmodel that adheres to the protocol.

One-time boilerplate

So here’s the generic ViewModel that every screen will re-use. ViewEvent is typically an enum, and is used by the View to send things like button presses to the ViewModel. ViewState is the struct used to push the loaded/loading/error/whatever state to the View.

protocol ViewModel<ViewEvent, ViewState>: ObservableObject {
    associatedtype ViewEvent
    associatedtype ViewState

    // For communication in the VM -> View direction:
    var viewState: ViewState { get set }

    // For communication in the View -> VM direction:
    func handle(event: ViewEvent)
}

Somewhere you’ll have a ‘preview’ viewmodel. This is declared once and used by all screens you want to preview. I’m a fan of putting your preview code in a conditional compilation statement. Note that this allows you to inject any viewstate you like. Is ‘preview view’ a tautology? Should this be called PreviewModel or PreViewModel? Flip a coin to decide…

#if targetEnvironment(simulator)
class PreviewViewModel<ViewEvent, ViewState>: ViewModel {
    @Published var viewState: ViewState

    init(viewState: ViewState) {
        self.viewState = viewState
    }

    func handle(event: ViewEvent) {
        print("Event: \(event)")
    }
}
#endif

View

Before I show the view, I’ll introduce the event and states. Firstly, the event enum: this is the single ‘pipe’ via which the View calls through to the ViewModel (aspirationally… 2-way bindings sidestep this). You will likely have associated values on some of these, e.g. the id of which row was pressed, that kind of thing:

enum FooViewEvent {
    case hello
    case goodbye
    case present
}

Next is the ViewState. This controls what is displayed. Typically you might have a loading/loaded/error enum in here, among other things. Notice there’s an ‘xIsPresented’ var here that is used in a 2-way binding later for modal presentation:

struct FooViewState: Equatable {
    var text: String
    var sheetIsPresented: Bool = false
}

Ok, now that the state and event are out of the way, here’s how a view might look. Note the gnarly generic clause up the top; this is honestly the trickiest part of this whole technique. Basically it’s saying ‘I can accept any ViewModel that uses this particular screen’s event/state’. Also note the 2-way binding for the modal sheet: even though this somewhat sidesteps the idea of piping all input/output through the event/state concept, it’s very SwiftUI-idiomatic to use these bindings, so I don’t want to be overly rigid and make life difficult; we want to avoid ‘cutting against the grain’ when working with SwiftUI. So, yeah, this isn’t architecturally pure, but it is productive!

struct FooView<VM: ViewModel>: View
where VM.ViewEvent == FooViewEvent,
      VM.ViewState == FooViewState
{
    @StateObject var viewModel: VM

    var body: some View {
        VStack {
            Text(viewModel.viewState.text)
            Button("Hello") {
                viewModel.handle(event: .hello)
            }
            Button("Goodbye") {
                viewModel.handle(event: .goodbye)
            }
            Button("Present modal sheet") {
                viewModel.handle(event: .present)
            }
        }
        .sheet(isPresented: $viewModel.viewState.sheetIsPresented) {
            Text("This is a modal sheet!")
                .presentationDetents([.medium])
                .presentationDragIndicator(.visible)
        }
    }
}

ViewModel

Last but not least is the ViewModel for this screen. Note that because viewState is @Published, and the ViewModel is held as a @StateObject, any updates to viewState are magically applied to the View. It’s really simple, no Combine required! Also note that the xIsPresented var is trivial to set to true to present something, far simpler than using some form of router, which I fear can be convoluted.

class FooViewModel: ViewModel {
    @Published var viewState: FooViewState

    init() {
        viewState = FooViewState(
            text: "Nothing has happened yet."
        )
    }

    func handle(event: FooViewEvent) {
        switch event {
        case .hello:
            viewState.text = "👋"
        case .goodbye:
            viewState.text = "😢"
        case .present:
            viewState.sheetIsPresented = true
        }
    }
}

Previews

At the bottom of the view file you’ll want your previews. By using the PreviewViewModel you can inject whatever ViewState you like:

#if targetEnvironment(simulator)
#Preview {
    FooView(
        viewModel: PreviewViewModel(
            viewState: FooViewState(
                text: "This is a preview!"
            )
        )
    )
}    
#endif

Conclusion

I hope this helps you use SwiftUI in a preview-friendly way! SwiftUI without previews is the pits…

The source for this is in this GitHub gist here.

Thanks for reading, hope you found this helpful, at least a tiny bit, God bless!

Photo by Yahya Gopalani on Unsplash. Font by Khurasan on Dafont.


You can see older posts in the right panel, under 'archive'.

Archive

The Maths of FM Synthesis 9 Oct 2024

Neural Networks from scratch #4: Training layers of neurons, backpropagation with pseudocode and a Rust demo 10 Jul 2024

Previewable SwiftUI ViewModels 16 May 2024

Neural Networks explained with spreadsheets, 3: Training a single neuron 22 Apr 2024

Neural Networks explained with spreadsheets, 2: Gradients for a single neuron 20 Mar 2024

Neural Networks explained with spreadsheets, 1: A single neuron 10 Mar 2024

How to implement a position-and-velocity Kalman Filter 15 Dec 2023

How to implement a position-only Kalman Filter 14 Dec 2023

Rust Crypto Ticker using Interactive Brokers' TWS API directly 28 Aug 2023

Rust PNG writer from scratch 12 Jul 2022

UIScrollView content and frame layout guides: scroll your UIStackView content purely in storyboards (iOS) 1 May 2022

Swift Security framework wrapper for RSA and Elliptic Curve encryption / decryption 21 Sep 2021

Simple, practical async await Swift examples 3 Jul 2021

Xcode pbxproj project generator in Swift 17 May 2021

UITableViewDiffableDataSource for adding and removing rows automatically to a table view in Swift 10 May 2021

Super simple iOS Combine example 23 Feb 2021

Introducing Chalkinator: Native desktop blogging app 7 Jun 2020

Flare: Open source 2-way folder sync to Backblaze B2 in Swift 28 May 2020

Making a baby monitor out of a couple of ESP32s, an I2S microphone, and a small speaker 16 Apr 2020

Chris' 2020 guide to hosting a HTTPS static site on AWS S3 + Cloudfront 15 Mar 2020

Simple Javascript debounce, no libraries needed 20 Feb 2020

Asynchronous NSOperations in Swift 5 3 Jan 2020

Deploying Golang Revel sites to AWS Elastic Beanstalk 9 Dec 2019

Golang and pure Swift Compression and Decompression 28 Jul 2019

Pure Swift simple Keychain wrapper 23 Jun 2019

Pure Swift 5 CommonCrypto AES Encryption 9 Jun 2019

Bluetooth example code for Swift/iOS 6 Jun 2019

Talking to a Bluetooth LE peripheral with Swift/iOS 18 May 2019

Obfuscating Keys using Swift 5 May 2019

State Machines in Swift using enums 10 Apr 2019

iOS timers without circular references with Pendulum 28 Mar 2019

Pragmatic Reactive Programming 11 Oct 2017

React Native first impressions 7 Apr 2017

Gondola 26 Feb 2017

Scalable Swift 22 Nov 2016

Swift 3 Migration 6 Nov 2016

Enum-Driven View Controllers 3 Jan 2016

Status bar colours: Everything there is to know 30 Dec 2015

Android server 20 Dec 2015

Generating heightmap terrain with Swift 8 Nov 2015

Swift Education Screencasts 27 Oct 2015

Swift Image Cache 24 Sep 2015

Don't be slack 13 Sep 2015

Swift KVO alternative 23 Jul 2015

Swift Keychain wrapper 21 Jun 2015

Swift NSURLSession wrapper 12 Jun 2015

iOS8 View Controller transitioning bug 17 Apr 2015

IB Designable 18 Mar 2015

iOS App Architecture 2 Mar 2015

Video Course Launch 14 Feb 2015

Video Course Pre-launch 8 Feb 2015

Blogging Platforms 13 Jan 2015

Mobile in 2014 - Year in Review 11 Jan 2015

Secret Keys talk 16 Nov 2014

Dimmi 11 Nov 2014

Project setup in Xcode6 22 Oct 2014

Uploading to an S3 bucket from iOS 15 Oct 2014

iOS8 App Testing Roundup 28 Sep 2014

Storing obfuscated secret keys in your iOS app 16 Sep 2014

Getting Core Location / CLLocationManager to work on iOS8 14 Sep 2014

Accessing the response body in failure blocks with AFNetworking 2 10 Sep 2014

How to allow your UITextFields to scroll out of the way of the keyboard 8 Sep 2014

How to subclass UIButton in iOS7 and make a UIButtonTypeSystem 4 Sep 2014

New season 1 Aug 2014

House finished 17 Jun 2014

WebP decoding on iOS 9 Feb 2014

Moving on again 22 Jan 2014

Lossy images for retina iPads - JPEG vs WebP 30 Nov 2013

Career options I wish I knew about when I was younger 20 Oct 2013

Positivity and your friends 7 Oct 2013

Tactility 26 Jul 2013

WWDC-induced narcolepsy 15 Jul 2013

Back on rails 31 May 2013

Full circle 6 May 2013

Programmatic UI on iOS 3 May 2013

Screencasts and positivity 8 Apr 2013

Year of positivity 14 Mar 2013

iOS Dev State of the Union 6 Feb 2013

Adventures with IAPs 3 Feb 2013

No longer a Googler 23 Dec 2012

Localising iPhone apps with Microsoft Translator 8 Dec 2012

Fight back (app biz update 13) 12 Nov 2012

Sent to the backburner (app biz update 12) 25 Oct 2012

Lisi Schappi 7 Oct 2012

Today's happy plateau (app biz update 11) 26 Aug 2012

First week's sales of Today (app biz update 10) 19 Aug 2012

Today launch! And a difficult decision made... (app biz update 9) 15 Aug 2012

Approved! (app biz update 8) 5 Aug 2012

Creating a graph in Objective-C on the iPhone 3 Aug 2012

Hurry up and wait (app biz update 7) 30 Jul 2012

Today app marketing site 27 Jul 2012

Today app submitted 25 Jul 2012

UIAlertView input wrapper 24 Jul 2012

Mentoring 23 Jul 2012

This is too hard! (app biz update 6) 20 Jul 2012

Perspectives (app biz update 5) 9 Jul 2012

4th starting-my-own-biz update 1 Jul 2012

ScrumFox landing page 28 Jun 2012

Server Scope landing page 27 Jun 2012

Telstra Calls and Data Usage 26 Jun 2012

Service History + Dropbox 26 Jun 2012

Impromptu Presenter 26 Jun 2012

Fertility Tracker 26 Jun 2012

Baby Allergy Tracker 26 Jun 2012

Starting my own business, update 3 22 Jun 2012

Starting my own business, update 2 17 Jun 2012

Starting my own business - First update 10 Jun 2012

I must be crazy 6 Jun 2012

Finding your location on an iPhone 7 May 2012

A generous career 4 May 2012

Skeleton Key Cocoaheads presentation 3 May 2012

CHBgDropboxSync - Dropbox auto-sync for your iOS apps 1 May 2012

That book about that Steve Jobs guy 30 Apr 2012

Another app marketing idea 23 Apr 2012

Sweet grouped tables on the iPhone 17 Apr 2012

Skeleton Key App 11 Apr 2012

Another app marketing idea... 5 Apr 2012

Quickly check for any missing retina graphics in your project 3 Apr 2012

Skeleton Key Password Manager with Dropbox 2 Apr 2012

RC Boat motor finally mounted 2 Apr 2012

Promoting apps presentation slides 1 Apr 2012

How i just wasted a month on my latest app, and how you don't need to 26 Mar 2012

The Finishing Line 20 Mar 2012

Using Launchd to run a script every 5 mins on a Mac 20 Feb 2012

Generating AES256 keys from a password/passphrase in ObjC 20 Feb 2012

Indie iPhone app marketing, part 2 19 Feb 2012

My App Manifesto: Syncing + Dropbox + YAML = Awesome 15 Feb 2012

Indie iPhone App Marketing part 1 7 Feb 2012

Perspectives 2 Feb 2012

Accountability and Free Will 1 Feb 2012

Badassery 31 Jan 2012

Sacrifice 30 Jan 2012

Lead Yourself First 29 Jan 2012

How to ping a server in Objective-C / iPhone 26 Jan 2012

iOS Automated Builds with Xcode4 16 Jan 2012

Xcode 4 - Command line builds of iPhone apps 15 Jan 2012

Guest post by Jason McDougall 13 Jan 2012

Scouts, Games and Motivation 10 Jan 2012

2011 Re-cap 8 Jan 2012

Ruby script to increment a build number 4 Jan 2012

Turning 30? All ideas, no execution? 18 Dec 2011

CHDropboxSync - simply sync your iOS app's documents to Dropbox 14 Dec 2011

Deep-enumerating a directory on the iphone, getting file attributes as you go 10 Dec 2011

Getting a date without the time component in objective-c 6 Dec 2011

Memory management in Objective-C 4 Dec 2011

Starting small 29 Nov 2011

Dictionary Types Helper 29 Nov 2011

Observer Pattern in Objective-C 16 Nov 2011

Why you should give presentations 13 Nov 2011

How to get a programming or design job in Sydney 9 Nov 2011

Custom nav bar / toolbar backgrounds in iOS5 8 Nov 2011

Stuck 27 Oct 2011

Dead easy singletons in Obj-C 19 Oct 2011

JSON vs OCON (Objective-C Object Notation) 18 Oct 2011

In defence of Objective-C 16 Oct 2011

Update the MessagePack objective-c library to support packing 12 Oct 2011

Icons 11 Oct 2011

How to host a site on Amazon AWS S3, step-by-step 7 Oct 2011

Drawing a textured pattern over the default UINavigationBar 6 Oct 2011

Markdown Presentations 1 Oct 2011

More MegaComet testing: Ruling out keepalives 15 Sep 2011

MegaComet test #4 - This time with more kernel 14 Sep 2011

Building People 10 Sep 2011

Half way there: Getting MegaComet to 523,000 concurrent HTTP connections 5 Sep 2011

Making a progress bar in your iPhone UINavigationBar 22 Aug 2011

Hacker News Reader 20 Aug 2011

How to programmatically resize elements for landscape vs portrait in your iphone interface 16 Aug 2011

MegaComet testing part 2 3 Aug 2011

Australian Baby Colours 28 Jul 2011

Boat prop shaft 25 Jul 2011

Megacomet with 1 million queued messages 24 Jul 2011

Installed the strut and rudder 18 Jul 2011

Painted the inside of the boat 17 Jul 2011

Fuzzy iphone graphics when using an UIImageView set to UIViewContentModeCenter 13 Jul 2011

My 3 Data and Calls Usage 11 Jul 2011

Reading a line from the console in node.js 10 Jul 2011

Trim whitespaces on all text fields in a view controller 9 Jul 2011

Final finish 9 Jul 2011

MessagePack parser for Objective-C / iPhone 30 Jun 2011

Lacquering the starboard side 25 Jun 2011

What do do with EXC_ARM_DA_ALIGN on an iPhone app 23 Jun 2011

Lacquering the hull 23 Jun 2011

Staining the boat 22 Jun 2011

NSMutableSet with weak references in objective-c 20 Jun 2011

Iphone gesture recogniser that works for baby games 20 Jun 2011

Image manipulation pixel by pixel in objective C for the iphone 19 Jun 2011

Baby Allergy Tracker 12 Jun 2011

Power sanding the deck 10 Jun 2011

Planing the edge of the deck 2 Jun 2011

Figured out the deck 2 Jun 2011

Boat bulkheads 2 Jun 2011

Simulating iOS memory warnings 31 May 2011

Putting a UIButton in a UIToolbar 29 May 2011

How to allow closing a UIActionSheet by tapping outside it 29 May 2011

Finding the currently visible view in a UITabBarController 24 May 2011

Random Chef 17 May 2011

Centered UIButton in a navigation bar on the iphone 16 May 2011

Little Orchard 13 May 2011

Boat update 13 May 2011

How to get the current time in all time zones for the iphone / obj-c 12 May 2011

Design portfolio 10 May 2011

Tricks with grand central dispatch, such as objective-c's equivalent to setTimeout 9 May 2011

How to make an iphone view controller detect left or right swipes 5 May 2011

Centered section headers on a UITableView 5 May 2011

Christmas in may 4 May 2011

Finished trimming the boat (its floatable now!) and got some parts 29 Apr 2011

How to make a multiline label with dynamic text on the iphone and get the correct height 27 Apr 2011

Forcing an image size on the image in a table view cell on an iphone 20 Apr 2011

Git on the Mac 19 Apr 2011

Build a url query string in obj-c from a dictionary of params like jquery does 12 Apr 2011

Rendering a radial gradient on the iphone / objective-c 11 Apr 2011

Skinning the port side of the boat 8 Apr 2011

Skinning the side of the boat 5 Apr 2011

Sending a UDP broadcast packet in C / Objective-C 5 Apr 2011

How to talk to a unix socket / named pipe with python 4 Apr 2011

Skinning the bottom of the boat 31 Mar 2011

Service discovery using node.js and ssdp / universal plug n play 30 Mar 2011

Extremely simple python threading 29 Mar 2011

New rescue boat 26 Mar 2011

HttpContext vs HttpContextBase vs HttpContextWrapper 5 Nov 2010

Simple C# Wiki engine 30 Sep 2010

Simple way to throttle parts of your Asp.Net web app 29 Sep 2010

How to implement DES and Triple DES from scratch 4 Aug 2010

How to use sessions with Struts 2 30 Jul 2010

How to use Cookies in Struts 2 with ServletRequest and ServletResponse 30 Jul 2010

Using Quartz Scheduler in a Java web app (servlet) 27 Jul 2010

Javascript date picker that Doesn't Suck!(tm) 27 Jul 2010

Using Oracle XE with Hibernate 20 Jul 2010

A simple implementation of AES in Ruby from scratch 29 Jun 2010

Asp.Net Forms authentication to your own database 28 May 2010

AS2805 (like ISO8583) financial message parser in C# 7 May 2010

Ruby hex dumper 4 May 2010

Using Spring to manage Hibernate sessions in Struts2 (and other web frameworks) 13 Jan 2010

Emails in C#: Delivery and Read receipts / Attachments 12 Jan 2010

Using Java libraries in a C# app with IKVM 16 Dec 2009

Learning Java tutorial 27 Nov 2009

Using generic database providers with C# 17 Nov 2009

Scheduled task executable batch babysitter 29 Oct 2009

Working with query strings in Javascript using Prototype 30 Sep 2009

Still fighting with String.Format? 9 Sep 2009

How I'd build the next Google 24 Aug 2009

Getting IIS and Tomcat to play nicely with isapi_redirect 24 Aug 2009

Using the new ODP.Net to access Oracle from C# with simple deployment 11 Aug 2009

C# Cryptography - Encrypting a bunch of bytes 14 Jul 2009

Sorting enormous files using a C# external merge sort 10 Jul 2009

Reconciling/comparing huge data sets with C# 9 Jul 2009

Some keyboard-friendly DHTML tricks 10 Jun 2009

How to figure out what/who is connected to your SQL server 18 Mar 2009

Adding a column to a massive Sql server table 16 Mar 2009

Multithreading using Delegates in C# 10 Mar 2009

Using C# locks and threads to rip through a to-do list 6 Feb 2009

Using threads and lock in C# 3 Feb 2009

Compressing using the 7Zip LZMA algorithm in C# beats GZipStream 14 Jan 2009

MS Sql Server 2005 locking 17 Dec 2008

Simple Comet demo for Ruby on Rails 19 Nov 2008

Geocoding part 2 - Plotting postcodes onto a map of Australia with C# 24 Oct 2008

Using evolutionary algorithms to make a walkthrough for the light-bot game with C# 20 Oct 2008

How to tell when memory leaks are about to kill your Asp.Net application 16 Oct 2008

C# version of isxdigit - is a character a hex digit? 15 Sep 2008

Geocoding part 1 - Getting the longitude and latitude of all australian postcodes from google maps 26 Aug 2008

Converting HSV to RGB colour using C# 14 Aug 2008

Opening a TCP connection in C# with a custom timeout 11 Aug 2008

Oracle Explorer - a very simple C# open source Toad alternative 31 Jul 2008

Linking DigitalMars' D with a C library (Mongrel's HTTP parser) 23 Jun 2008

Connecting to Oracle from C# / Winforms / Asp.net without tnsnames.ora 16 Jun 2008

A simple server: DigitalMars' D + Libev 6 Jun 2008

Travelling from Rails 1 to Rails 2 9 Apr 2008

Online Rostering System 9 Apr 2008

DanceInforma 9 Apr 2008

Using RSS or Atom to keep an eye on your company's heartbeat 10 Nov 2007

Easy Integrated Active Directory Security in ASP.Net 24 Oct 2007