Active galactic nuclei (AGN) exhibit small-amplitude, short-timescale variability in their optical luminosities, typically a few tenths of a magnitude over periods of hours to years. Extreme variability of AGN, in which large luminosity changes depart significantly from this baseline behaviour, is known as AGN flaring. These events are rare, their timescales are poorly constrained, and most of the literature focuses on individual events. With surveys such as the Legacy Survey of Space and Time (LSST) promising millions of transient detections per night in the coming decade, there is a need for fast and efficient classification of AGN flares. The central difficulty in detecting AGN flares systematically is distinguishing them from the variable baseline: defining a signal as a significant departure from the ever-present stochastic variability is a statistical challenge. Recently, Gaussian Processes (GPs) have revolutionised the analysis of time-series data in many areas of astronomical research, but they have seen limited uptake within AGN astronomy. Here we investigate the efficacy of GPs for detecting AGN flares in both simulated and real optical light curves. We show that a GP can successfully detect AGN flares with a false-positive rate of less than one per cent, and we present examples of AGN that show extreme variability.
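The idea of flagging flares against a stochastic baseline can be sketched as follows. This is an illustrative toy example only, not the pipeline described in the abstract: it models the baseline optical variability with a GP using a damped-random-walk (Ornstein-Uhlenbeck) kernel, a common choice for AGN light curves, and flags epochs whose leave-one-out residuals exceed a threshold. All function names, hyperparameter values, and the injected flare are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)

def ou_kernel(t1, t2, amp, tau):
    # Damped-random-walk (Ornstein-Uhlenbeck) covariance: amp^2 * exp(-|dt| / tau).
    # A widely used stochastic model for AGN optical variability (hyperparameters assumed).
    return amp**2 * np.exp(-np.abs(t1[:, None] - t2[None, :]) / tau)

def gp_flag_outliers(t, y, amp, tau, noise, thresh=3.0):
    """Flag epochs deviating from the GP baseline model.

    Uses the closed-form leave-one-out (LOO) predictive residuals of GP
    regression: with Kinv = (K + noise^2 I)^{-1} and alpha = Kinv @ y,
    the LOO z-score of point i is |alpha_i| / sqrt(Kinv_ii).
    """
    Ky = ou_kernel(t, t, amp, tau) + noise**2 * np.eye(len(t))
    Kinv = np.linalg.inv(Ky)
    alpha = Kinv @ y
    z = np.abs(alpha) / np.sqrt(np.diag(Kinv))
    return z > thresh  # True where the point is a >thresh-sigma departure

# Simulate a DRW baseline light curve (magnitudes about the mean) and
# inject a hypothetical single-epoch, 1-mag flare at index 100.
t = np.arange(200.0)
cov = ou_kernel(t, t, amp=0.1, tau=20.0) + 0.05**2 * np.eye(200)
y = rng.multivariate_normal(np.zeros(200), cov)
y[100] += 1.0

flags = gp_flag_outliers(t, y, amp=0.1, tau=20.0, noise=0.05)
```

The leave-one-out form matters here: a GP posterior evaluated at its own training points largely interpolates the data, so a flare would be absorbed into the fit; holding each epoch out forces the prediction to come from the surrounding baseline. In practice the kernel hyperparameters would be learned from the data rather than assumed, and neighbouring epochs of a flare may also be flagged since they pull on each other's held-out predictions.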