Randomness on nRF52 using Embedded Swift
Introduction
Randomness comes in handy in many different places. It’s useful in the logic of many games to make enemies behave in an unpredictable fashion, it can spread out the timing of events in a retry mechanism, and random data is sometimes integral to AI algorithms. In this post, we’ll see how to generate random numbers in Embedded Swift on an nRF52840 DK using the nRF Connect SDK.
Reading documentation
The External dependencies section of the Embedded Swift — User Manual lists external dependencies that need to be present at link time, as the Swift standard library or Embedded Swift runtime may need to call them.
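One of the dependencies in that list covers random number generation. As a minimal sketch of the idea (not the post’s actual code), and assuming the symbol the standard library expects is arc4random_buf and that Zephyr’s sys_rand_get is visible to Swift through the bridging header, the shim could look like this:

// Minimal sketch under the assumptions above: export arc4random_buf for the
// Swift standard library and back it with Zephyr's random subsystem.
@_cdecl("arc4random_buf")
func embeddedRandomBuf(_ buffer: UnsafeMutableRawPointer, _ length: Int) {
    // Fill the buffer handed to us by the standard library with random bytes.
    sys_rand_get(buffer, length)
}

With such a symbol in place, the familiar APIs like Int.random(in:) and Bool.random() become usable from Embedded Swift.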
Read more…
Creating a Swift type for button input on nRF52 - Part 2
Introduction
In the last post, Creating a Swift type for button input on nRF52 - Part 1, we worked on code to abstract interacting with a button in Swift code. We had a working prototype, but things fell apart as soon as we cleaned up the code and encapsulated it in a struct.
Investigating the issue
Let’s see if there’s valuable information in the documentation.

/**
 * @brief GPIO callback structure
 *
 * Used to register a callback in the driver instance callback list.
Read more…
Creating a Swift type for button input on nRF52 - Part 1
Introduction
In my previous post, Controlling an LED using Embedded Swift on nRF52, we created a Swift struct to encapsulate the C code required to control an LED. In this post, we’ll do the same for a button. If you want to follow along, the code is available on GitHub, with each step having its own commit.
Starting from working code
As a starting point, we will write the whole code in C.
Read more…
Controlling an LED using Embedded Swift on nRF52
Introduction
In my previous post, nrfx-blink Step by Step, I took you through the steps required to configure your development environment to build and flash the “Blinky” Embedded Swift example from Apple on an nRF52840 DK. In this post, we’ll explore that example more deeply, grow from there, and adapt it to be more natural for Swift developers. If you want to follow along, the code is available on GitHub, with each step having its own commit.
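To give a flavour of the direction, here is a rough sketch (not the post’s code) of the kind of wrapper a Swift developer might want. It assumes led0 is a gpio_dt_spec defined and configured on the C side, and that Zephyr’s GPIO helpers are imported through the bridging header; the Led name is illustrative.

// Rough sketch under the assumptions above: a small value type that hides
// the C call needed to toggle the LED.
struct Led {
    var spec: gpio_dt_spec

    func toggle() {
        // Copy to a local so we can pass a pointer to the C helper;
        // toggling only reads the port/pin stored in the spec.
        var spec = self.spec
        gpio_pin_toggle_dt(&spec)
    }
}

// Hypothetical usage: blink forever, sleeping between toggles.
// let led = Led(spec: led0)
// while true {
//     led.toggle()
//     k_msleep(500)
// }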
Read more…
nrfx-blink Step by Step
Introduction
When getting started with Embedded Swift, the examples published by Apple are an incredible resource. However, they’re just that: examples, straight to the point and sometimes with minimal information. If you’re new to embedded development, you’ll need to do some additional digging (which isn’t too bad; that’s how you learn). Embedded Swift development is very specific to the chipset you’re targeting. As you can see, there are several examples in the repository, targeted at different boards and using different options (bare metal vs SDK/RTOS).
Read more…
Using SWD pins on Seeed Studio XIAO nRF52840
Introduction
When I discovered the Omi (previously Friend) project and wanted to build myself a device, I followed the instructions on their getting started page. For the firmware, this meant: open the project in VSCode, build, double-click reset on the nRF board, and drag-and-drop the zephyr.uf2 file onto the flash drive that appeared on the computer. As I wanted to make some changes to the firmware code, I also wanted to improve on this process and, most importantly, get feedback on what’s happening on the board through console logs.
Read more…
Opus Decoding in Swift
Introduction
Friend is an open-source wearable AI device that captures your conversations, transcribes and summarises them, and offers a ton of features based on that information. The wearable device captures the audio signal and sends it to your phone over BLE, which offers limited bandwidth. Initially, Friend was sending raw PCM data (16-bit, 8 kHz), meaning the audio quality was quite low and so was the transcription accuracy. To improve that, last weekend I re-enabled the Opus codec in their firmware, allowing higher audio quality while using less bandwidth.
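For a rough sense of scale: 16-bit samples at 8 kHz amount to 16 × 8000 = 128 kbit/s of raw audio before any protocol overhead, which is a lot to push over a BLE link, whereas Opus can encode speech of comparable or better quality at a small fraction of that bitrate.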
Read more…
The PAL Project
Several weeks ago, I created the nelcea/PAL: Wearable AI exploration repository on GitHub. I want to take a minute to explain in slightly more detail what this open-source project is about. PAL is the vehicle for my research in the world of Wearable AI, with an initial focus on the fields of “Personal Knowledge Management” and “Building a Second Brain” (BASB). The initial use case is to capture conversations I’m having, store their text somewhere, and make it possible for me to consult, process, or interact with them.
Read more…
WWDC 2024
And just like that, WWDC week has come and gone… As expected, there were plenty of announcements across all OSes, both user-facing and technical, including the big one on Apple Intelligence. I don’t think there were many surprises this year. I won’t go into details on the announcements as many people are already doing that, but I do want to point you to a few interesting resources: What’s new in Swift 6.
Read more…
On the proper usage of commonFormat for AVAudioFile
Introduction
I started looking into AVAudioFile while working on PALApp, my exploration into the world of portable AI. The MVP I was implementing receives audio over BLE from a Friend wearable and stores it in an audio file on the device.
Try 1
The Friend audio samples are 16-bit little-endian integers. To ease testing, I can simulate this by populating a buffer with a simple sine wave:

let samples: [Int16] = {
    var s = [Int16]()
    for i in 0.
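To give a flavour of the API involved, here is an illustrative sketch (not the post’s code). The key distinction is that the settings dictionary describes the on-disk format, while commonFormat describes the in-memory buffers passed to write(from:); the file name, settings, and placeholder samples below are assumptions.

import AVFoundation

// Illustrative sketch under the assumptions above: write 16-bit, 8 kHz mono
// samples to a linear PCM file.
let samples: [Int16] = Array(repeating: 0, count: 8000) // stand-in for the sine wave
let url = URL(fileURLWithPath: "test.wav")
let settings: [String: Any] = [
    AVFormatIDKey: kAudioFormatLinearPCM,
    AVSampleRateKey: 8000,
    AVNumberOfChannelsKey: 1,
    AVLinearPCMBitDepthKey: 16,
    AVLinearPCMIsFloatKey: false,
]

do {
    // commonFormat tells AVAudioFile what the buffers we write will look like.
    let file = try AVAudioFile(forWriting: url,
                               settings: settings,
                               commonFormat: .pcmFormatInt16,
                               interleaved: true)

    let format = AVAudioFormat(commonFormat: .pcmFormatInt16,
                               sampleRate: 8000,
                               channels: 1,
                               interleaved: true)!
    let buffer = AVAudioPCMBuffer(pcmFormat: format,
                                  frameCapacity: AVAudioFrameCount(samples.count))!
    buffer.frameLength = AVAudioFrameCount(samples.count)
    samples.withUnsafeBufferPointer { src in
        buffer.int16ChannelData![0].update(from: src.baseAddress!, count: samples.count)
    }
    try file.write(from: buffer)
} catch {
    print("Failed to write audio file: \(error)")
}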
Read more…