Final Project Tracker
This is where I document my progress on my final project.
Week 1
I did some brainstorming and sketched out concepts for the two ideas I have for my final project. You can read more about the ideas and look at early visuals on the page documenting my work for Week 1.
I’m still deciding between the two options and am giving myself some time to do so. In the meantime, I can work on a few things that both ideas have in common before committing to one. Namely:
- Both are portable devices and need to be battery-powered. I need to get more familiar with LiPo batteries, and I need a charging solution (USB-C).
- Both ideas involve e-ink screens. I need to understand which form factor I can go for; specifically, whether I need to use breakout boards or can manage to drive the display directly. I also need to learn how to actually show images and text on the displays.
Week 2
This week was all about laser cutting cardboard. I wasn’t able to make much progress on the final project, but I’ve ordered and received e-ink screens from Waveshare. I ordered both a display on a driver board and a raw display. I’m hoping that next week will give me some headspace to start playing with the displays.
Week 3
I know my own limitations in terms of my electronics knowledge, but I also understand what I will have to learn to make my project(s) happen. So, I spent some time today (Sept 18th) figuring out resources I can use to learn. I had a great meeting with Ceci who’s in my cohort and knows MUCH more about electronics than I do. She answered a lot of my questions and gave me a sense of what’s feasible. Thank you so much, Ceci! The questions I had were predominantly around power management: during a refresh, the e-ink screen needs higher voltages than the microcontroller or USB will provide by default.
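To get a feel for the power-management problem, here's a minimal sketch of the ideal boost-converter relationship. The specific voltages are assumptions for illustration (a 3.3 V logic rail boosted to roughly 15 V, a common ballpark for e-ink gate drivers), not values from my panel's datasheet:

```python
# Ideal (lossless) boost converter: Vout = Vin / (1 - D)  =>  D = 1 - Vin / Vout
# The voltages used below are illustrative assumptions, not datasheet values.

def boost_duty_cycle(v_in: float, v_out: float) -> float:
    """Duty cycle an ideal boost converter needs to step v_in up to v_out."""
    if v_out <= v_in:
        raise ValueError("a boost converter can only step voltage up")
    return 1 - v_in / v_out

# e.g. boosting a 3.3 V rail to ~15 V means the switch is on ~78% of the time:
print(round(boost_duty_cycle(3.3, 15.0), 2))  # 0.78
```

Real converters lose some headroom to the diode drop and switching losses, so the actual duty cycle runs a bit higher than this ideal figure.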
Driver Code
For my Final Project I need to send images (like QR codes) to an e-ink screen. I want to use MicroPython for the logic in my final project, so I was looking for a MicroPython driver that I could use for the board. I ran into some problems: the only MicroPython driver I could find targeted V1 of the Waveshare board and didn’t work on the V2 screen that I purchased. I couldn’t find a V2 driver written in MicroPython, but I ended up playing with a combination of C++ libraries (GxEPD2 and the Adafruit GFX Library) that allowed me to verify that the screen itself actually works fine. This video was very helpful when narrowing down which libraries to try.
I still didn’t have a MicroPython driver, obviously, and the C++ libraries were abstracted enough (lots of inheritance of templated base classes to support a wide range of screen models) that deriving the byte sequence needed to initialize and operate the board was a little outside of what I could realistically achieve this week.
The capabilities of modern LLMs astound me on a weekly basis. To solve my “problem”, I opened the GxEPD2 library in Visual Studio Code and asked GPT-5-mini to help me figure out how the communication with the screen is handled for my particular screen model.
Prompt
#codebase I’d love for you to analyze the exact control sequence needed to init a waveshare 1inch54 board. I know that the code here works on my display so let’s figure out how exactly that works. Afterwards I’d like to translate it to micropython, but let’s figure out how the communication with the display is done and what’s required to show pixels on the display please.

It came back giving me a few options for a few different display types. I clarified which screen I want to use with the following prompt.
Prompt
I know that the one we want to use is the D67 variant. I ran that on the display (via ESP32 dev board) and it worked well. Let’s use what we know about that variant and produce a micropython equivalent to how the c++ code works with that display, please.

It then produced MicroPython driver code that worked flawlessly for my test:
Driver
"""
MicroPython driver for GDEH0154D67 (1.54" 200x200 SSD1681) based on GxEPD2 Arduino sequences.
Target: ESP32 (adjust SPI init pins for other boards)
Wiring (example):
- CS -> any GPIO (active low)
- DC -> any GPIO
- RST -> any GPIO
- BUSY-> any GPIO (input)
- SCK -> HSPI SCK
- MOSI-> HSPI MOSI
- MISO-> optional (not used)
Notes:
- Panel requires 3.3V for VCC and IO. Do NOT use 5V data lines.
- Busy polarity for this variant: HIGH when busy.
Usage:
import machine, time
from micropython_gdeh0154d67 import EPD
epd = EPD(spi_id=1, sck=14, mosi=13, cs=15, dc=27, rst=26, busy=25)
epd.init()
buf = bytearray([0xFF]*(200*200//8)) # white
epd.write_image(0,0,200,200,buf)
epd.update_full()
epd.sleep()
"""
import time
from machine import Pin, SPI
# Display parameters
WIDTH = 200
HEIGHT = 200
BUSY_ACTIVE_LEVEL = 1 # busy HIGH for SSD1681
class EPD:
    def __init__(self, spi_id=1, sck=14, mosi=13, miso=-1, baudrate=4000000, cs=15, dc=27, rst=26, busy=25):
        # pins
        self.cs = Pin(cs, Pin.OUT, value=1)
        self.dc = Pin(dc, Pin.OUT, value=1)
        self.rst = Pin(rst, Pin.OUT, value=1)
        self.busy = Pin(busy, Pin.IN)
        # SPI
        if miso >= 0:
            self.spi = SPI(spi_id, baudrate=baudrate, polarity=0, phase=0, sck=Pin(sck), mosi=Pin(mosi), miso=Pin(miso))
        else:
            self.spi = SPI(spi_id, baudrate=baudrate, polarity=0, phase=0, sck=Pin(sck), mosi=Pin(mosi))
        # internal
        self._inited = False

    # low-level helpers
    def _cs_low(self):
        self.cs.value(0)

    def _cs_high(self):
        self.cs.value(1)

    def _dc_command(self):
        self.dc.value(0)

    def _dc_data(self):
        self.dc.value(1)

    def send_command(self, cmd):
        self._dc_command()
        self._cs_low()
        self.spi.write(bytes([cmd]))
        self._cs_high()
        self._dc_data()

    def send_data(self, data):
        # data can be int or bytes/bytearray
        self._dc_data()
        self._cs_low()
        if isinstance(data, int):
            self.spi.write(bytes([data]))
        else:
            # assume bytes-like
            self.spi.write(data)
        self._cs_high()

    def _start_data(self):
        self._dc_data()
        self._cs_low()

    def _end_data(self):
        self._cs_high()

    def reset(self):
        # follow Waveshare style: drive RST high, then low, then high
        self.rst.value(1)
        time.sleep_ms(10)
        self.rst.value(0)
        time.sleep_ms(10)
        self.rst.value(1)
        time.sleep_ms(10)

    def wait_while_busy(self, timeout_ms=10000):
        start = time.ticks_ms()
        while self.busy.value() == BUSY_ACTIVE_LEVEL:
            time.sleep_ms(1)
            if time.ticks_diff(time.ticks_ms(), start) > timeout_ms:
                raise OSError('EPD busy timeout')

    # partial ram area like in Arduino driver
    def set_partial_ram_area(self, x, y, w, h):
        # _writeCommand(0x11); _writeData(0x03);
        self.send_command(0x11)
        self.send_data(0x03)
        # 0x44 x start/end in bytes
        self.send_command(0x44)
        self.send_data(x // 8)
        self.send_data((x + w - 1) // 8)
        # 0x45 y start/end
        self.send_command(0x45)
        self.send_data(y & 0xFF)
        self.send_data((y >> 8) & 0xFF)
        yend = y + h - 1
        self.send_data(yend & 0xFF)
        self.send_data((yend >> 8) & 0xFF)
        # 0x4E ram x address
        self.send_command(0x4E)
        self.send_data(x // 8)
        # 0x4F ram y address
        self.send_command(0x4F)
        self.send_data(y & 0xFF)
        self.send_data((y >> 8) & 0xFF)

    # init sequence mirrored from _InitDisplay() in GxEPD2_154_D67.cpp
    def init(self):
        if self._inited:
            return
        self.reset()
        time.sleep_ms(10)
        # soft reset
        self.send_command(0x12)
        time.sleep_ms(10)
        # Driver output control
        self.send_command(0x01)
        self.send_data(0xC7)
        self.send_data(0x00)
        self.send_data(0x00)
        # Border Waveform
        self.send_command(0x3C)
        self.send_data(0x05)
        # Read built-in temp sensor
        self.send_command(0x18)
        self.send_data(0x80)
        # set full ram area
        self.set_partial_ram_area(0, 0, WIDTH, HEIGHT)
        self._inited = True

    # power on sequence (PowerOn in Arduino code)
    def power_on(self):
        # _writeCommand(0x22); _writeData(0xe0); _writeCommand(0x20); _waitWhileBusy
        self.send_command(0x22)
        self.send_data(0xE0)
        self.send_command(0x20)
        self.wait_while_busy(5000)

    def power_off(self):
        # _writeCommand(0x22); _writeData(0x83); _writeCommand(0x20); _waitWhileBusy
        self.send_command(0x22)
        self.send_data(0x83)
        self.send_command(0x20)
        self.wait_while_busy(2000)

    # write whole buffer (current) to RAM at specified rectangle and optionally do not refresh
    def write_image(self, x, y, w, h, buf):
        if not self._inited:
            self.init()
        # initial write handling in Arduino ensures previous/full buffers are managed; here we just write current
        self.set_partial_ram_area(x, y, w, h)
        self.send_command(0x24)  # write RAM (current)
        # stream data
        self._start_data()
        # buf should be bytes/bytearray length w*h/8
        self.spi.write(buf)
        self._end_data()

    # helper similar to Arduino's _Update_Full/_Update_Part
    def update_full(self):
        # full update: 0x22 0xF7, 0x20 then wait
        self.send_command(0x22)
        self.send_data(0xF7)
        self.send_command(0x20)
        # full refresh time in Arduino was relatively long; wait until busy releases
        self.wait_while_busy(20000)
        # after full update the Arduino sets power_is_on false; we keep state open for simplicity

    def update_partial(self):
        # partial update: 0x22 0xFC, 0x20 then wait
        self.send_command(0x22)
        self.send_data(0xFC)
        self.send_command(0x20)
        self.wait_while_busy(5000)

    def sleep(self):
        # deep sleep: 0x10 0x01 per Arduino
        self.power_off()
        self.send_command(0x10)
        self.send_data(0x01)


# Simple test function to draw a checker pattern (for quick visible test)
def example_test():
    epd = EPD(spi_id=1, sck=14, mosi=13, cs=15, dc=27, rst=26, busy=25)
    epd.init()
    epd.power_on()
    # white buffer
    buf = bytearray([0xFF] * (WIDTH * HEIGHT // 8))
    # paint a small black rectangle in top-left 40x40 (black = 0x00)
    for row in range(40):
        for col_byte in range(40 // 8):
            idx = (row * (WIDTH // 8)) + col_byte
            buf[idx] = 0x00
    epd.write_image(0, 0, WIDTH, HEIGHT, buf)
    epd.update_full()
    epd.sleep()


if __name__ == '__main__':
    example_test()
I then extended the Python code to test images. Again, I used ChatGPT to give me a quick tool that converts PNG files (I downloaded a QR code that I wanted to use) into a .bin file whose contents I can send directly to the Waveshare screen.
Prompt
If I want to get a bytearray in python of an image, what’s the best way for me to store that image so it’s easiest to read it as a bytearray? Can you give me the code to store a png as .bin?

Python
# png2bin.py
# pip install pillow
from PIL import Image
import argparse, sys
def pack_1bpp(im, invert=False, msb_first=True):
    im = im.convert("1")  # 1-bit BW
    w, h = im.size
    px = im.getdata()
    # Pillow "1" gives 0 or 255. Map to 0/1 bits.
    bits = [(0 if p else 1) if invert else (1 if p else 0) for p in px]
    out = bytearray()
    for i in range(0, len(bits), 8):
        chunk = bits[i:i+8]
        if len(chunk) < 8:
            chunk += [0] * (8 - len(chunk))
        b = 0
        if msb_first:
            for bit in chunk:
                b = (b << 1) | bit
        else:
            for k, bit in enumerate(chunk):
                b |= (bit & 1) << k
        out.append(b)
    return out, w, h


def main():
    ap = argparse.ArgumentParser()
    ap.add_argument("input_png")
    ap.add_argument("output_bin")
    ap.add_argument("--size", type=str, help="WxH (e.g. 200x200)")
    ap.add_argument("--invert", action="store_true", help="flip black/white bits")
    ap.add_argument("--lsbfirst", action="store_true", help="pack LSB-first in each byte")
    args = ap.parse_args()
    im = Image.open(args.input_png)
    if args.size:
        w, h = map(int, args.size.lower().split("x"))
        im = im.resize((w, h))
    buf, w, h = pack_1bpp(im, invert=args.invert, msb_first=not args.lsbfirst)
    # sanity: bytes should be w*h/8
    expected = (w * h + 7) // 8
    if len(buf) != expected:
        print(f"Warning: got {len(buf)} bytes, expected {expected}", file=sys.stderr)
    with open(args.output_bin, "wb") as f:
        f.write(buf)
    print(f"Wrote {args.output_bin} ({len(buf)} bytes) for {w}x{h}")


if __name__ == "__main__":
    main()
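To sanity-check the packing convention (white pixel = bit 1, MSB first, which matches the 0xFF-means-white buffer the driver expects), here's a tiny self-contained example of the same bit-packing on a hand-made 8×2 pixel pattern. It doesn't need Pillow; the helper name is mine, not part of the tool above:

```python
# Same MSB-first packing as pack_1bpp, but on raw 0/1 bits (white = 1, black = 0).
def pack_bits_msb_first(bits):
    out = bytearray()
    for i in range(0, len(bits), 8):
        b = 0
        for bit in bits[i:i + 8]:
            b = (b << 1) | bit  # shift previous bits left, append the new one
        out.append(b)
    return bytes(out)

# 8x2 test pattern: first row all white, second row alternating white/black.
row_white = [1] * 8                  # packs to 0xFF (all-white byte)
row_alt = [1, 0, 1, 0, 1, 0, 1, 0]  # packs to 0xAA (10101010)
packed = pack_bits_msb_first(row_white + row_alt)
print(packed.hex())  # ffaa
```

This is the same byte layout the .bin file ends up with, so a quick `hexdump` of the output file should show 0xFF runs wherever the QR code is white.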
In one of the coming weeks, we’ll learn how to use a logic analyzer. I will then use it to look at the communication between the microcontroller and the screen to verify a) that I understand what’s going on and b) that the communication is equivalent to the established C++ library’s. Once I’ve gone through those steps, I’m planning on open-sourcing the above driver code on GitHub so others can easily drive their V2 screens as well.
Week 4
I focused on 3D printing and scanning this week. I had a chance to chat with Dan and Quentin about my final project, though. They had some good advice for me and I’m very excited for the next two weeks which are all about electronics design and production.
I have decided to build the smart wallet (not the fridge magnets) as my final project. My goal is to hook it up to XR glasses and use it to board a flight.
Week 5
In Week 5 I verified that I understand the communication with the e-ink display. I used a logic analyzer to look at the individual bytes that are sent over the SPI channels.
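As a sketch of what that verification looks like in practice: the DC line tells the panel whether a byte is a command (DC low) or data (DC high), so a captured trace can be grouped into command/parameter records and compared against the GxEPD2 sequence. This helper and the sample trace are hypothetical, just illustrating the idea:

```python
# Hypothetical decoder for a captured SPI trace: each sample is (dc, byte),
# with DC low (0) for commands and DC high (1) for data, as in the driver.
def group_commands(samples):
    records = []
    for dc, byte in samples:
        if dc == 0:                      # a command byte starts a new record
            records.append((byte, []))
        elif records:                    # a data byte belongs to the last command
            records[-1][1].append(byte)
    return records

# e.g. the "border waveform" step (0x3C with parameter 0x05), then a bare 0x20
trace = [(0, 0x3C), (1, 0x05), (0, 0x20)]
print(group_commands(trace))
```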
During this week’s assignment I tried to design and simulate a PCB that can replace the driver board that one of the e-ink screens came with. I re-made and adjusted the schematics published by Waveshare. Unfortunately, I couldn’t yet confirm that the design is correct and that the board will work.
Week 6
In Week 6 I tried to fabricate the board with the boost converter on it. It was a little bit of a mess overall because there are so many traces to keep track of. I wasn’t able to test if the boost converter actually works.
Week 7 - Input Devices
I re-designed the board and on October 26th sent it off to JLCPCB to get a cleaner 2-sided board that I can experiment with. I also ordered additional parts that the HTM inventory currently doesn’t have.
TODO: add a list here with prices and when I ordered them.
For this week’s assignment, I made a device that reads an NFC tag and then communicates through one of my servers with the Spotify API. I will most likely need a similar setup for the final project. The XR glasses will want to communicate to some server what image should be shown on the physical device (it needs a name!!). The server should probably provide some GET endpoint that I can then use to pull the image onto the e-ink screen. To trigger the ESP to make that GET request I could use an MQTT queue, but I’m not sure yet whether that’s actually the best approach. I will have to do some experimenting; I’m planning on doing that during the networking assignment in Week 12. Another option is BLE, but most XR glasses don’t really give you access to their Bluetooth hardware.
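As a rough sketch of the server side of this idea: a GET endpoint that hands out the latest packed image bytes. Everything here is an assumption for illustration (the `/latest.bin` route, the in-memory image, and `urllib` standing in for whatever HTTP client the ESP32 would use), built on Python's stdlib `http.server`:

```python
# Hypothetical "latest image" endpoint; route name and payload are placeholders.
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

LATEST_IMAGE = bytes([0xFF] * 8)  # placeholder: 8 bytes of "white" pixels


class ImageHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/latest.bin":
            self.send_response(200)
            self.send_header("Content-Type", "application/octet-stream")
            self.send_header("Content-Length", str(len(LATEST_IMAGE)))
            self.end_headers()
            self.wfile.write(LATEST_IMAGE)
        else:
            self.send_error(404)

    def log_message(self, *args):
        pass  # keep the demo quiet


server = HTTPServer(("127.0.0.1", 0), ImageHandler)  # port 0 = pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# What the ESP32 would do after an MQTT nudge (on-device it would be a
# MicroPython HTTP client instead of urllib):
url = f"http://127.0.0.1:{server.server_port}/latest.bin"
data = urllib.request.urlopen(url).read()
print(len(data))  # 8
server.shutdown()
```

On the device, the fetched bytes could go straight into `write_image()` from the driver above, since the .bin format already matches the panel's RAM layout.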
Week 9 - Output Devices
After the board arrived in the mail, I soldered all the components onto it. I have one design flaw: one of the MOSFETs has the wrong footprint on the board. I was designing around one of the MOSFETs that we have in the HTM inventory, but then decided to order and use a different, much smaller one.

TODO: add pictures of mosfet and footprint
It took me a long time and two or three different boards to get the 24-pin connector soldered. The pins are very small and I struggled quite a bit. The microscope in the electronics lab in the basement was a great help. Quentin saw me struggling and showed me how to use solder wick effectively to clean up shorts between the pins. Thank you, Quentin!
TODO: Add close-up of connector.
Week 10 - Molding & Casting
This was molding and casting week and I couldn’t get anything done for my final project.
Week 11 - Machine Week
Ball week! Too busy! Can’t do anything else!!!
Week 12 - Networking Week
Week 13 - Programming Week
Remainder
- Make the housing for the screen (Networking Week)
- Final test for the custom PCB with boost converter but maybe just a simple custom PCB (Networking Week)
- Write software for XR glasses (Interfaces/Programming Week)
- Add cloud setup (Networking Week, similar to Input Devices Week)

