I'm driving a MAX7219 shift-register-based display (8x 7-segment digits) from a Meadow F7 using pins D00–D02 (data, chip select, and clock respectively).
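For context, the transfer I'm doing per digit looks roughly like this. This is a platform-agnostic Python sketch of the MSB-first 16-bit frame (register address + data) the MAX7219 expects, not my actual Meadow C# code; `set_pin` is a hypothetical stand-in for the digital-output write call:

```python
def send_frame(register, data, set_pin):
    """Bit-bang one 16-bit MAX7219 frame: 8-bit register address,
    then 8-bit data, MSB first. set_pin(name, level) stands in for
    the platform's digital-output write."""
    set_pin("CS", 0)                      # select the chip
    word = (register << 8) | data
    for i in range(15, -1, -1):           # shift out MSB first
        set_pin("DIN", (word >> i) & 1)   # present the data bit
        set_pin("CLK", 1)                 # MAX7219 samples DIN on the rising edge
        set_pin("CLK", 0)
    set_pin("CS", 1)                      # rising edge on CS/LOAD latches the frame

# Example: write value 0x05 to the digit-0 register (0x01),
# recording every pin write instead of touching hardware
writes = []
send_frame(0x01, 0x05, lambda pin, level: writes.append((pin, level)))
```

Counting the recorded writes shows why this is so sensitive to per-pin latency: one frame costs 50 pin writes (2 for chip select plus 3 per bit).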
My current implementation bit-bangs the digital I/O pins and is very slow: by my timings it takes ~15–16 ms to toggle the state of a pin, which works out to a toggle rate of only ~62–67 Hz.
I'm aware that bit-banging is considered orders of magnitude slower than hardware SPI, but this seems excessively slow: it takes over 800 ms to set a single digit on the MAX7219 (i.e. to bit-bang out one 16-bit register/data frame).
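For reference, the ~800 ms figure is consistent with the per-toggle timing above. A frame is 16 bits and each bit needs roughly three pin writes (set data, clock high, clock low), so a quick sanity check (assuming the ~16 ms worst-case pin-write latency I measured):

```python
# Rough sanity check: time to bit-bang one 16-bit MAX7219 frame
# when each GPIO write costs ~16 ms (my measured worst case).
BITS_PER_FRAME = 16   # 8-bit register address + 8-bit data
WRITES_PER_BIT = 3    # set DIN, raise CLK, lower CLK
MS_PER_WRITE = 16     # measured per-pin-write latency

frame_ms = BITS_PER_FRAME * WRITES_PER_BIT * MS_PER_WRITE
print(frame_ms)  # 768 ms, in the same ballpark as the observed ~800 ms
```

So the ~800 ms per digit isn't some extra overhead on top of the pin timings; it's exactly what those pin timings predict.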
I found an article where someone drove the I/O pins of an STM32-series chip at over 1 MHz without any great effort. I appreciate that bare-metal C doesn't have the same overheads as .NET running on an RTOS, but that's a difference of more than four orders of magnitude, which seems incredible.
Is this known / expected? Can we expect the I/O toggle speeds to improve as the firmware matures? Am I simply doing something very wrong?!