February 8th - Session 2

Figure 1. Left: magnification of the signal; right: Fast Fourier Transform
In the second session we attempted to analyse the first set of data. Because the large number of samples (over 500,000) caused our software to lag considerably, we decided to reduce the amount of data by plotting only every n-th sample. Unfortunately, the effect was not what we expected. Figure 1 shows how the decimation affected the signal output: the bottom trace on the left uses all samples, and the traces above it use every 10th, 100th, 1000th, 2000th and 5000th sample. This gave us a general idea of what the signal looks like; with the full set of samples it resembled random noise, but after magnification we could make out at least two frequencies in the data.
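For reference, below is a minimal Python sketch of the kind of decimation we used, assuming the recording has been exported to a hypothetical text file session2_data.txt (the actual file name and format are not recorded in this log). Note that keeping every n-th sample is plain decimation with no anti-alias (low-pass) filter, which is a likely reason the reduced traces looked distorted.

    import numpy as np
    import matplotlib.pyplot as plt

    # Hypothetical file name; the real recording had over 500,000 samples.
    signal = np.loadtxt("session2_data.txt")

    steps = [1, 10, 100, 1000, 2000, 5000]
    fig, axes = plt.subplots(len(steps), 1, figsize=(8, 10))
    for ax, step in zip(axes, steps):
        # Plain decimation: keep every `step`-th sample. Without a low-pass
        # filter, content above the new Nyquist frequency folds back into
        # the plot and distorts the apparent waveform.
        ax.plot(signal[::step], linewidth=0.5)
        ax.set_ylabel(f"every {step}")
    plt.tight_layout()
    plt.show()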
We then performed a Fast Fourier Transform on the reduced set of samples. We observed one fundamental frequency in the low region between 0 and 100 Hz and another at approximately 4 kHz. Because the data had been reduced, we could not extract all the information from the FFT, and it was necessary to repeat the analysis with all the samples.
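A sketch of the FFT step, again under assumed names: the sampling rate FS is a placeholder since the log does not record the real value, and session2_data.txt is the same hypothetical file as above.

    import numpy as np
    import matplotlib.pyplot as plt

    FS = 44_100  # assumed sampling rate in Hz; the session log does not record it
    signal = np.loadtxt("session2_data.txt")  # hypothetical file name, as above

    # Real-input FFT: rfft returns only the non-negative frequency bins.
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / FS)

    plt.semilogy(freqs, spectrum)
    plt.xlabel("Frequency [Hz]")
    plt.ylabel("Magnitude")
    plt.show()

    # The strongest bin approximates the fundamental; with the full data set
    # the second peak near 4 kHz should also be resolvable.
    print(f"Dominant frequency: {freqs[np.argmax(spectrum)]:.1f} Hz")

Running this on the full (undecimated) data set, rather than the reduced one, is what the next session calls for, since decimation both lowers the usable frequency range and aliases higher components into the spectrum.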