Handling SIGUSR1 in GDB

I was debugging a project that used SIGUSR1 heavily. GDB stops on SIGUSR1 by default, and that made debugging a pain because I didn’t care when the signal was being generated. So here is how to set how GDB interprets signals.

Entering the following at the GDB prompt tells it not to stop or print when the signal happens, and to pass the signal on to the program:

handle SIGUSR1 nostop noprint pass
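
To confirm the setting took effect, GDB's info signals command prints the current Stop/Print/Pass columns for a signal:

info signals SIGUSR1

If you want this behaviour in every session, the same handle line can go into a .gdbinit file so it is applied automatically at startup.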


Finding a File Descriptor Leak

My current project is a control system that runs on a small embedded PC running Linux. I had a problem during development where I had to alternately open and close two serial ports because they shared an interrupt. Doing this suddenly caused the software to crash after running for a few minutes. The problem turned out to be that I was leaking file descriptors: each open/close cycle left a descriptor behind, so the process eventually hit its file descriptor limit and further opens failed.

I wrote a simple shell script that prints the total number of file descriptors open on the serial ports. This helped me verify that my bug fix worked correctly.

#!/bin/sh
# Print a running count of open file descriptors on each serial port.

TIME=1

while true; do
    clear
    # lsof lists every open file on the system; awk tallies the serial devices.
    lsof | awk '
BEGIN { ttyS0 = ttyS1 = ttyS3 = ttyS4 = 0; }
/\/dev\/ttyS4/ { ttyS4++; }
/\/dev\/ttyS3/ { ttyS3++; }
/\/dev\/ttyS0/ { ttyS0++; }
/\/dev\/ttyS1/ { ttyS1++; }
END {
    print " Descriptor    Counter ";
    print " ttyS4       " ttyS4;
    print " ttyS3       " ttyS3;
    print " ttyS1       " ttyS1;
    print " ttyS0       " ttyS0;
}
'
    sleep $TIME
done
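
The script watches the whole system through lsof. If you only care about your own process, a quicker check (a rough sketch, assuming a Linux /proc filesystem) is to count the entries in /proc/self/fd from inside the program itself:

import os

def open_fd_count():
    # /proc/self/fd holds one symlink per descriptor this process has open.
    return len(os.listdir('/proc/self/fd'))

print("open descriptors:", open_fd_count())

Calling this before and after each open/close cycle makes a leak obvious: the count keeps climbing instead of returning to its starting value.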

In the end I manually control the serial port using the sys module instead of pySerial. I still have no idea why pySerial started leaking resources.
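
For reference, here is a minimal sketch of what driving a port without pySerial can look like, assuming a Linux device at /dev/ttyS0 and a fixed 9600 baud rate (both are illustrative choices, not the actual project settings). The point is simply that the descriptor returned by os.open is closed explicitly in a finally block, so it cannot leak:

import os
import termios

fd = os.open('/dev/ttyS0', os.O_RDWR | os.O_NOCTTY)
try:
    attrs = termios.tcgetattr(fd)        # [iflag, oflag, cflag, lflag, ispeed, ospeed, cc]
    attrs[4] = attrs[5] = termios.B9600  # assumed baud rate for this sketch
    termios.tcsetattr(fd, termios.TCSANOW, attrs)
    data = os.read(fd, 64)               # read up to 64 bytes
finally:
    os.close(fd)                         # always release the descriptor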