14EC10068 Rounak Choudhury
When a piecewise continuous function is represented by its Fourier series, the partial sum (a finite number of terms) overshoots the function near each jump discontinuity. This behaviour is called the Gibbs phenomenon. There is therefore an inherent error: near the jump, the partial sum exceeds the maximum value of the function itself.
In the attached MATLAB code we approximate a square wave of 1 V peak-to-peak amplitude and 1 s time period. As we increase the number of terms n, the overshoot at the discontinuity does not die out: its peak settles at roughly 9% of the jump height, while the ripples become narrower and crowd closer to the discontinuity.
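Since the MATLAB code itself is attached separately, the following is a minimal Python sketch of the same experiment, assuming the usual odd-harmonic Fourier series of a square wave: for a wave of amplitude A (so 2A peak-to-peak, here A = 0.5 for 1 Vpp) and period T, the n-term partial sum is (4A/π) Σ sin(2πkt/T)/k over odd k ≤ n. The function and parameter names are illustrative, not taken from the attached code.

```python
import numpy as np

def square_wave_partial_sum(t, n, period=1.0, amplitude=0.5):
    """Fourier partial sum (odd harmonics up to n) of a square wave.

    amplitude = 0.5 gives a 1 V peak-to-peak wave, as in the report.
    """
    s = np.zeros_like(t, dtype=float)
    for k in range(1, n + 1, 2):  # square wave has only odd harmonics
        s += (4.0 * amplitude / np.pi) * np.sin(2.0 * np.pi * k * t / period) / k
    return s

# Sample the first half-period densely enough to resolve the ripple
# of the highest harmonic near the jump at t = 0.
t = np.linspace(1e-4, 0.5, 20000)
for n in (9, 49, 199):
    peak = square_wave_partial_sum(t, n).max()
    print(f"n = {n:3d}: peak value = {peak:.4f}")
```

The printed peak stays close to 0.5 × (2/π)·Si(π) ≈ 0.589 for every n, i.e. about 9% above the 0.5 V level of the wave: the overshoot persists no matter how many terms are summed, which is exactly the Gibbs phenomenon the plot illustrates.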