A Review on Depression Detection Among Adolescents Using Facial Features

Abstract—Depression has become one of the most common mental illnesses in the past decade, affecting millions of patients and their families. However, the methods of diagnosing depression rely almost exclusively on questionnaire-based interviews and clinical judgments of symptom severity, which depend heavily on the doctor's experience and make diagnosis labor-intensive. Our study aims to develop an objective and convenient method to assist depression detection in adolescents using facial features. Most adolescents are entirely unaware that they may have depression, and those who are aware often conceal it from everyone. An automated system is therefore required to identify adolescents who are dealing with depression. In this paper, different research works focused on detecting depression are discussed.


I. INTRODUCTION
Depression is a common mental disorder that already affects more than 350 million people worldwide [1]. It adversely affects not only patients but also their families. The World Health Organization projected that depression would become the second leading cause of illness by the year 2020 [2]. However, the assessment methods for diagnosing depression rely almost exclusively on patient reports and clinical judgments of symptom severity [3]. Current diagnostic techniques for depression have obvious disadvantages, being associated with patient denial, poor sensitivity, subjective bias and inaccuracy [4]. Finding an objective, accurate and practical method for depression detection therefore remains a challenge.
Nowadays, depression is most common among adolescents and among young people in the early days of their professional lives. The issue is important because depression directly affects a person's productivity. For adolescents, it affects their performance in examinations and in other extracurricular activities.
For working employees, depression affects their performance as well as their ability to handle the pressure of work, which leads to a deterioration of their overall performance and could ultimately cost them their jobs. Depression can create several other problems in a person's life. Other people may consider them cowardly or may pity them, failing to understand that they are suffering from a phase of mental illness in which the person begins to consider himself or herself a coward and feels insecure around others. Over the years, many methods have been developed to overcome depression.
Persons et al. [2] proposed psychotherapy as a remedy for depression. Clarke et al. [3] designed a program, Overcoming Depression on the Internet (ODIN), that shows good results. Gilson et al. [4] introduced a cognitive therapy that proved very helpful in overcoming depression. As mentioned above, many good methods exist to overcome depression; however, these methods and techniques are of no use if we do not know to whom they need to be applied.
The big issue is finding out who is depressed. Most depressed persons find it hard to admit that they are depressed. It is not that they are hiding it (although there are always exceptions); rather, they themselves are often not aware of their state of mind. Many people try to hide their emotions while talking to others, which makes it hard to judge their state by meeting them. The true state of mind of a person is revealed when he or she is alone. Therefore, we need a simple yet effective method to detect depression in a person.

II. RELATED WORK
Depression is fundamentally about emotions. Plutchik [5] listed the eight basic types of emotions as follows:
1. Fear is the feeling of being afraid or insecure. It is usually induced by a perceived threat or danger, can cause a change in behavior, and is an unpleasant feeling.
2. Anger is an intense negative emotion that leads to a hostile response to perceived provocation, threat or hurt [6]. It is usually accompanied by an urge to do something violent or to take revenge.
3. Sadness is the feeling of despair, grief, disappointment, and sorrow. It changes the person's behavior to a state where he or she feels very low and sometimes feels like crying.
4. Joy is a positive feeling of happiness and great pleasure. It enables the person to enjoy the moment and to find good things in everything. It promotes good behavior in a person.
5. Disgust is the feeling of regret and disapproval usually caused by something unpleasant or offensive. It is usually followed by anger.
6. Surprise is a feeling of astonishment when something unexpected happens. It leaves the person unable to believe what he or she perceives.
7. Trust is the feeling of reliability, truth or confidence in someone.
8. Anticipation is the feeling of looking forward positively to something that is about to happen.
There are other emotions, such as acceptance, rage and ecstasy; however, these are combinations of two or more of the emotions mentioned above. Affective computing, a broad field, has been an active area of research in machine learning.
Significant work has been done in the literature to detect states of mental disorder based on an individual's emotions.
Cohn et al. [8] used an approach based on facial actions and vocal prosody to detect depression in an individual, achieving an accuracy of 88%; facial expressions are a strong reflector of a person's mood. An attempt to detect facial expressions was made by Sharma et al. [9].
The lexicon-based emotion analyzer proposed in [10] is a good attempt at analyzing an individual's emotions.
Lu-Shih et al. [11] proposed a content-based depression detection model, but its accuracy was poor, at 59.55%.
A further improved model [12] proposed by Lu-Shih et al. increased the overall accuracy to 80.5%.
Wang et al. [13] proposed a depression detection model based on sentiment analysis in microblog social networks that obtained an overall accuracy of 80%.
An improved model [14] proposed by Wang et al. achieved a very high accuracy of nearly 95%.
Shen et al. [15] proposed a model based on social media harvesting that achieved an accuracy of 77%.
OpenFace, developed by Baltrušaitis et al. [16], is a tool that extracts facial landmarks. Similar landmarks were employed by Vonikakis et al. [17] for group happiness assessment, deriving distances between specific landmarks whose presence or absence is commonly related to happiness. The features used in [17] agree with the basis of the Facial Action Coding System (FACS). Furthermore, the inner distance between the eyebrows is also used as a feature for detection. Such features were employed in the approach reported by Alghowinem et al. [18], involving landmark distances in the eyes and eyebrows.
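As a rough illustration of such landmark-distance features, the following sketch assumes a standard 68-point landmark array (as produced by tools like OpenFace, here with 0-based indexing) and computes the eyebrow inner distance and eyebrow-to-eye distances, normalised by the inter-ocular distance so they are scale-invariant. The function name and the exact choice of points are our own, not taken from the cited works:

```python
import numpy as np

def landmark_distances(landmarks):
    """Simple geometric features from 68-point facial landmarks.

    `landmarks` is a (68, 2) array of (x, y) positions, indexed 0-67
    in the common 68-point convention (eyebrows 17-26, eyes 36-47).
    """
    # Inter-ocular distance between the inner eye corners (39 and 42),
    # used to normalise all distances so they are scale-invariant.
    inter_ocular = np.linalg.norm(landmarks[39] - landmarks[42])
    # Inner distance between the eyebrows (innermost brow points 21, 22).
    brow_gap = np.linalg.norm(landmarks[21] - landmarks[22]) / inter_ocular
    # Vertical-ish distance between each eyebrow and the eye below it.
    left_brow_eye = np.linalg.norm(landmarks[19] - landmarks[37]) / inter_ocular
    right_brow_eye = np.linalg.norm(landmarks[24] - landmarks[44]) / inter_ocular
    return {"brow_gap": brow_gap,
            "left_brow_eye": left_brow_eye,
            "right_brow_eye": right_brow_eye}
```

Normalising by inter-ocular distance is a common trick that makes the features comparable across subjects sitting at different distances from the camera.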
The authors, well motivated by the works discussed above, decided to introduce a depression detection model that can recognize a person's mental state correctly and in minimal time. The given system tries to determine a person's state of mind, i.e. whether the person is depressed or normal.

III. DATASET
The dataset employed for testing in the different approaches was introduced in the 3rd and 4th AVEC challenges [19]; it is the only freely available dataset annotated for depression that includes video recordings of participants. DAIC has also been made partly available, but without video recordings.
The AVEC dataset includes volunteer participants recorded by a webcam while performing several tasks [11]. In the present work a subset of the dataset was used, in the sense that only two tasks were considered, namely:
i. the FreeForm task, where participants answered questions;
ii. the NorthWind task, where participants read a passage aloud.
Depression annotation, however, is provided for only 200 of the recordings, as the test-set labels were withheld for the needs of the challenge.
IV. PROPOSED MODEL
The proposed system will be designed with the potential of serving as a decision support system that determines the level of depression from facial features in video frames.

Figure 1: Depression Detection from Face
The following features will be extracted and analysed to determine the level of depression:

A. Face Features
Two-dimensional facial landmarks are considered for face feature extraction. The features extracted from the face take the form of head features, distances and blink rate.

Figure 2: Facial Landmarks
Head Features: The head pose features provided by the organizers were not used, as they did not convey temporal information, i.e. how the features change with respect to time. Head motion was therefore judged by the horizontal and vertical motion of certain facial points, namely points 2, 4, 14, and 16 (Fig. 2) [20].
The points selected were those minimally involved in facial movements and expressions such as blinking and smiling, so that their motion predominantly represents head motion. For each of these facial points, the change in position was calculated between every pair of consecutive frames, measured in the horizontal and vertical directions as well as net magnitude. Statistical features were then calculated as the mean, median and mode of the displacement (horizontal, vertical and magnitude) and of the velocity (horizontal, vertical and magnitude).
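A minimal sketch of these head-motion statistics, assuming the trajectories of the stable points are available as a NumPy array; the function names, the mode binning (rounding to one decimal before taking the mode, since a mode of raw continuous values is ill-defined), and the frame-rate handling are our own choices, not taken from the paper:

```python
import numpy as np

def _mode(values, ndigits=1):
    # Mode of continuous values, taken after coarse rounding into bins.
    vals, counts = np.unique(np.round(values, ndigits), return_counts=True)
    return float(vals[np.argmax(counts)])

def head_motion_features(tracks, fps=30.0):
    """Statistical head-motion features from stable landmark trajectories.

    `tracks` is a (frames, points, 2) array with the (x, y) positions
    over time of facial points minimally involved in expressions
    (e.g. points 2, 4, 14 and 16 in the text's 1-based numbering).
    """
    # Per-frame displacement of each point between consecutive frames.
    disp = np.diff(tracks, axis=0)              # (frames-1, points, 2)
    dx = disp[..., 0].ravel()                   # horizontal component
    dy = disp[..., 1].ravel()                   # vertical component
    mag = np.hypot(dx, dy)                      # net magnitude
    feats = {}
    for name, d in (("dx", dx), ("dy", dy), ("mag", mag)):
        feats[f"{name}_mean"] = float(np.mean(d))
        feats[f"{name}_median"] = float(np.median(d))
        feats[f"{name}_mode"] = _mode(d)
        # Velocity is displacement per unit time (frame interval = 1/fps).
        feats[f"{name}_vel_mean"] = float(np.mean(d)) * fps
    return feats
```

The same pattern extends directly to median and mode of velocity; they are scaled copies of the displacement statistics when the frame rate is constant.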
Distance: Head motion and facial expressions combined can give substantial information about a person's behavior. Their temporal information may convey the affective state of the person, and hence one may find correlations among them. This was implemented as distance features extracted on the facial regions, namely the eyes, mouth, eyebrows and head.
Blink Rate: The blink rate can be calculated using the 2D facial landmarks. First, the region of points enclosing the eye (points 37-42 for the left eye) is taken. For each frame, the area of the polygon formed by those points is calculated, yielding eye area versus frame data. The closed-eye area is taken as the minimum area over a random sample of 1000 frames. A blink is counted whenever the area covered by the eye points drops below 90 percent of the open-eye area. The number of blinks is thus counted over the entire set of frames, and the blink frequency is calculated by dividing the number of blinks by the duration of the session.
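The blink-counting procedure can be sketched as follows. This is a simplified illustration, not the paper's exact method: the maximum observed area stands in for the open-eye area (rather than deriving it from a sample of 1000 random frames), and a blink is taken as each open-to-closed transition; all names are hypothetical:

```python
import numpy as np

def polygon_area(pts):
    # Shoelace formula for the area of a simple polygon given as (n, 2).
    x, y = pts[:, 0], pts[:, 1]
    return 0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))

def blink_rate(eye_landmarks, fps):
    """Blinks per second from per-frame eye landmarks.

    `eye_landmarks` is a (frames, 6, 2) array: the six points
    enclosing one eye (points 37-42, 1-based) in each frame.
    """
    areas = np.array([polygon_area(frame) for frame in eye_landmarks])
    open_area = np.max(areas)            # proxy for the fully open eye
    # A frame counts as 'closed' below 90% of the open-eye area.
    closed = areas < 0.9 * open_area
    # Count blinks as transitions from open to closed.
    blinks = int(np.sum(closed[1:] & ~closed[:-1]))
    return blinks / (len(areas) / fps)   # blinks per second
```

Counting transitions rather than closed frames avoids counting one long blink (several consecutive closed frames) as several blinks.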
Depression detection from images alone depends mainly on a clear and proper definition of a depressed face. The facial expression of depression is slightly different from that of sadness: a depressed expression has the same characteristics as a sad one, such as upward-slanted eyebrows, but the main difference is that no major frown is involved. A sad face may also have lowered eyes looking downward, showing a helpless, dejected mood.

V. CONCLUSION
Depression is a serious mental illness, and the current diagnosis process still needs to be conducted by a specially trained psychiatrist or psychologist, usually using a rating scale and careful observation during communication, which depends on the doctor's experience; it is hard for non-psychiatrists to diagnose and treat depression. Capturing videos of the eyes and face, then extracting and recognizing features from the captured videos, may help patients themselves or community doctors to detect potential depressive patients early, or may improve the diagnostic rate of depression. This is what we expect to achieve in future research.