Industrial processes require the timely detection of faults to maintain safe and optimal operation, and alarm systems are installed for this purpose. Their function is to inform the operator of abnormal conditions in the process. The most common industrial practice is to compare the value of a process variable with a predetermined trip point; when the variable crosses this threshold, the alarm is activated and the operator is notified of a fault. Ideally, the operator should receive only one alarm per fault; if the number of alarms grows too large, the operator can no longer identify the alarms that matter. Timely fault detection, alarm management, and the response to alarms are therefore the key concerns in alarm system design, and the false alarm rate, missed alarm rate, and detection delay are the most important criteria for evaluating alarm system performance. Accordingly, many techniques have been developed to improve alarm systems, and these methods continue to expand; delay timers, filtering, and dead bands are among them. Unlike the conventional approach, in which the decision to activate or deactivate an alarm is based on a single sample, a delay timer makes this decision by examining a sequence of consecutive samples. In this thesis, the delay timer and its generalization are introduced in detail, and the generalized delay timer is applied to detecting changes in the probability density function of a process variable. Finally, an algorithm for computing the generalized delay timer parameters for this detection problem is presented.

Keywords: Alarm systems, False alarm rate, Missed alarm rate, Detection delay, Generalized delay timer
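
The consecutive-sample logic of a basic delay timer can be sketched as follows. This is a minimal illustration, not the thesis's algorithm: the class name and the parameters `threshold`, `n_on`, and `n_off` (the on-delay and off-delay sample counts) are assumed names introduced here for clarity.

```python
class DelayTimer:
    """Alarm with n-sample on/off delay timers (illustrative sketch).

    The alarm is raised only after `n_on` consecutive samples exceed the
    trip point, and cleared only after `n_off` consecutive samples fall
    back below it, instead of deciding from a single sample.
    """

    def __init__(self, threshold, n_on=3, n_off=3):
        self.threshold = threshold
        self.n_on = n_on
        self.n_off = n_off
        self.alarm = False
        self._count = 0  # consecutive samples supporting a state change

    def update(self, sample):
        exceeds = sample > self.threshold
        if self.alarm:
            # Count consecutive samples back in the normal region.
            self._count = self._count + 1 if not exceeds else 0
            if self._count >= self.n_off:
                self.alarm = False
                self._count = 0
        else:
            # Count consecutive samples in the abnormal region.
            self._count = self._count + 1 if exceeds else 0
            if self._count >= self.n_on:
                self.alarm = True
                self._count = 0
        return self.alarm


# Usage: two isolated excursions above the threshold do not trip the
# alarm; only a sustained run of three samples does.
dt = DelayTimer(threshold=1.0, n_on=3, n_off=2)
states = [dt.update(s) for s in [0.5, 1.2, 1.3, 0.4, 1.1, 1.2, 1.3, 0.2, 0.1]]
```

This illustrates the trade-off noted above: requiring several consecutive samples suppresses false alarms caused by noise spikes, at the cost of a longer detection delay.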