Topics

  • Classification
  • Generalized linear models
  • Logistic regression
  • Linear discriminant analysis (LDA)
  • ROC curve

References

  • Wasserman (2004), Chapters 13.7 and 22
  • Bishop (2006), Chapter 4

Classification Problem

Paired data \((X_1,Y_1),...,(X_n,Y_n)\).

  • input \(X\in R^d\)
  • discrete output \(Y\in\{0,1\}\) or \(Y\in\{1,...,K\}\)

Three major approaches:

  • Learn discriminant function \(Y=h(X)\)
  • Learn conditional probability \(P(Y|X)\)
  • Learn generative model \(P(X|Y)\) and use Bayes’ theorem:
    • \(P(Y|X) = \frac{P(X|Y)P(Y)}{P(X)}\)

Generalized linear models

\[y(x) = f(b^Tx + b_0)\]

\(f\): activation function
\(f^{-1}\): link function

Logistic Regression

Binary discriminative model: \[p_i = P(Y_i=1|X=x_i) = s(b^Tx_i)\]

  • Logistic sigmoid function \(s(u) = \frac{1}{1+e^{-u}}\)
    • note \(s'(u)=s(u)(1-s(u))\)
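
As a quick sanity check, here is a minimal R snippet (not in the original notes) that compares the analytic derivative \(s'(u)=s(u)(1-s(u))\) with a central-difference approximation:

s = function(u) 1/(1 + exp(-u))   # logistic sigmoid
u = 0.7; h = 1e-6
# analytic derivative vs. numerical derivative at u
c(analytic = s(u)*(1 - s(u)), numeric = (s(u + h) - s(u - h))/(2*h))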

  • Likelihood for \((X_1,Y_1),...,(X_n,Y_n)\)

\[\mathcal{L}(b) = P(Y|b) = \prod_i p_i^{Y_i} (1-p_i)^{1-Y_i}\]

  • Log likelihood (the negative of the cross-entropy error):

\[\ell(b) = \sum_i\{ {Y_i}\log p_i + (1-Y_i)\log(1-p_i)\}\] \[\nabla \ell(b) = \sum_i \left\{\frac{Y_i}{p_i} - \frac{1-Y_i}{1-p_i}\right\} p_i(1-p_i)x_i = \sum_i (Y_i-p_i)x_i\]
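
These formulas transcribe directly into R; this is a sketch (the helper names loglik and grad are made up here, and X is assumed to already carry an intercept column):

loglik = function(b, X, Y) {            # cross-entropy log likelihood
  p = 1/(1 + exp(-X %*% b))
  sum(Y*log(p) + (1 - Y)*log(1 - p))
}
grad = function(b, X, Y) {              # gradient: sum_i (Y_i - p_i) x_i
  p = 1/(1 + exp(-X %*% b))
  as.vector(t(X) %*% (Y - p))
}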

Iterative reweighted least squares (IRLS)

  • Input data: \(X = (X_1,...,X_n)^T\)
  • Predicted output: \(P = (p_1,...,p_n)^T\)
  • Target output: \(Y = (Y_1,...,Y_n)^T\)

  • diagonal weighting matrix, emphasizing points near the border: \(R_{ii}=p_i(1-p_i)\)

\[b_{new} = b_{old} - (X^TRX)^{-1}X^T(P-Y) = (X^TRX)^{-1}X^TRz\]

  • effective target: \(z = Xb_{old} + R^{-1}(Y-P)\)
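
Putting the pieces together, a minimal IRLS sketch in R (the function name irls is hypothetical, and the intercept is handled by prepending a column of ones to X):

irls = function(X, Y, iter=20) {
  X = cbind(1, X)                         # fold the intercept into the design matrix
  b = rep(0, ncol(X))                     # start from b = 0
  for (it in 1:iter) {
    p = as.vector(1/(1 + exp(-X %*% b)))  # current predictions p_i
    R = diag(p*(1 - p))                   # weighting matrix R_ii = p_i(1-p_i)
    z = X %*% b + (Y - p)/(p*(1 - p))     # effective target z = Xb + R^{-1}(Y-P)
    b = solve(t(X) %*% R %*% X, t(X) %*% R %*% z)  # weighted least squares update
  }
  as.vector(b)
}

On the exercise data below, the result should agree with coefficients(lrm) up to convergence tolerance.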

Linear Discriminant Analysis (LDA)

Assume that the data distributions \(f_0(x)=f(x|Y=0)\) and \(f_1(x)=f(x|Y=1)\) are Gaussian.

x = seq(-5, 5, 0.1)
f0 = dnorm(x, -1, 1)    # class 0 density: N(-1, 1)
f1 = dnorm(x, 2, 2)     # class 1 density: N(2, 2^2)
plot(x, f0, type="l")
lines(x, f1, type="l")

By assuming that the covariance matrices of the two distributions are the same, the discriminant function is given by \[h(x) = \arg\max_k \delta_k(x)\] \[\delta_k(x) = x^T \Sigma^{-1}\mu_k - \frac{1}{2}\mu_k^T\Sigma^{-1}\mu_k + \log \pi_k\]

It can be approximated by estimates from the data as: \[\delta_k(x) = x^T S^{-1}\hat{\mu}_k - \frac{1}{2}\hat{\mu}_k^TS^{-1}\hat{\mu}_k + \log \hat{\pi}_k\] \[S = \frac{n_0S_0 + n_1S_1}{n_0+n_1}\]

\[S_k = \frac{1}{n_k}\sum_{Y_i=k}(X_i-\hat{\mu}_k)(X_i-\hat{\mu}_k)^T\]

  • \(\hat{\mu}_k = \frac{1}{n_k}\sum_{Y_i=k}X_i\)
  • \(\hat{\pi}_k = \frac{n_k}{n}\)
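
As a sketch of how these estimates combine (the helper name delta.hat is invented here; X0 and X1 are assumed to be matrices whose rows are the samples of class 0 and class 1):

delta.hat = function(x, X0, X1) {
  n0 = nrow(X0); n1 = nrow(X1)
  mu0 = colMeans(X0); mu1 = colMeans(X1)
  S0 = cov(X0)*(n0-1)/n0                  # ML covariance of class 0
  S1 = cov(X1)*(n1-1)/n1                  # ML covariance of class 1
  Sinv = solve((n0*S0 + n1*S1)/(n0+n1))   # pooled covariance S, inverted
  d = function(mu, pk)                    # delta_k(x) for one class
    as.numeric(x %*% Sinv %*% mu - 0.5 * t(mu) %*% Sinv %*% mu + log(pk))
  c(delta0 = d(mu0, n0/(n0+n1)), delta1 = d(mu1, n1/(n0+n1)))
}

\(h(x)\) is then the class with the larger of the two returned values.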

ROC Curve

Two types of errors

                     Positive                         Negative
Predicted Positive   True Positive                    False Positive (Type I error)
Predicted Negative   False Negative (Type II error)   True Negative
  • Accuracy = (TP+TN)/all
  • Sensitivity = TP/(TP+FN)
  • Specificity = TN/(TN+FP)

  • Receiver operating characteristic (ROC) curve
    • True positive rate (Sensitivity)
    • False positive rate (1-Specificity)
  • AUC: area under the ROC curve

x = seq(-5, 5, 0.1)          # decision thresholds
F0 = pnorm(x, -1, 1)         # CDF of the negative class
F1 = pnorm(x, 2, 2)          # CDF of the positive class
par(pty="s")                 # square plot region
plot(1-F0, 1-F1, type="l")   # x: false positive rate, y: true positive rate
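
The AUC for this example can be approximated from the same curves by the trapezoidal rule; this snippet is a sketch added here, reusing F0 and F1 from the chunk above:

fpr = rev(1 - F0)                                    # false positive rate, ascending
tpr = rev(1 - F1)                                    # true positive rate, ascending
sum(diff(fpr) * (head(tpr, -1) + tail(tpr, -1))/2)   # trapezoidal estimate of AUC

For these two Gaussians the value comes out at roughly \(P(X_1 > X_0) \approx 0.91\).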

Exercise

1. Logistic Regression

Generate two sets of data in 2D, e.g., Gaussians, with or without overlap.

n = 100
X0 = cbind( rnorm(n, 3, 2), rnorm(n, 0, 1))   # class 0 samples
XY0 = cbind(X0, 0)                            # append label 0
X1 = cbind( rnorm(n, 0, 1), rnorm(n, 3, 2))   # class 1 samples
XY1 = cbind(X1, 1)                            # append label 1
XY = rbind(XY0, XY1)
plot(XY[,1:2], col=XY[,3]+1)                  # color by class label

  1. Apply logistic regression with the glm() function and check the classification accuracy.
df = data.frame(XY)
df
              X1           X2 X3
1    0.203647897 -0.654207459  0
2    0.879388769  1.376554161  0
3   -0.227866781  0.833396173  0
4    3.051979768  0.938763091  0
5   -0.260443932  1.751847878  0
... (200 rows in total; rows 1-100 are class 0, rows 101-200 are class 1)
lrm = glm(X3~X1+X2, family=binomial(link=logit), df)
summary(lrm)

Call:
glm(formula = X3 ~ X1 + X2, family = binomial(link = logit), 
    data = df)

Deviance Residuals: 
     Min        1Q    Median        3Q       Max  
-2.48877  -0.25906   0.00084   0.15984   2.17840  

Coefficients:
            Estimate Std. Error z value Pr(>|z|)    
(Intercept)  -0.2614     0.3520  -0.743    0.458    
X1           -1.1470     0.2138  -5.366 8.07e-08 ***
X2            1.3228     0.2558   5.172 2.32e-07 ***
---
Signif. codes:  0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1

(Dispersion parameter for binomial family taken to be 1)

    Null deviance: 277.259  on 199  degrees of freedom
Residual deviance:  87.741  on 197  degrees of freedom
AIC: 93.741

Number of Fisher Scoring iterations: 7
coefficients(lrm)
(Intercept)          X1          X2 
 -0.2613782  -1.1469598   1.3228061 
y = predict(lrm, data.frame(XY[,1:2]))   # linear predictor (log-odds) by default
plot(y)
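
One way to check the accuracy the exercise asks for (a sketch; note that predict() needs type="response" to return probabilities rather than log-odds):

p = predict(lrm, type="response")   # predicted probabilities P(Y=1|x)
mean((p > 0.5) == df$X3)            # accuracy at threshold 0.5
table(p > 0.5, df$X3)               # confusion table against the true labels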

  2. Visualize the separation line.

b0 + b1*X1 + b2*X2 = 0
X2 = -(b0 + b1*X1)/b2

b = coefficients(lrm)
x1 = seq(-5, 10, 1)
x2 = -(b[1] + b[2]*x1)/b[3]    # boundary where the predicted probability is 0.5
plot(XY[,1:2], col=XY[,3]+1)
lines(x1, x2)

  3. Optional: implement the Iterative Reweighted Least Squares (IRLS) algorithm yourself and compare with the result from glm().

2. Linear Discriminant Analysis

  1. For the same data set as above, apply LDA, either by implementing it yourself or by using the lda() function in the MASS library, and compare the results.
library(MASS)
ldm = lda(X3~X1+X2, df)
summary(ldm)    # lists the components of the lda object; print(ldm) shows the fit itself
        Length Class  Mode     
prior   2      -none- numeric  
counts  2      -none- numeric  
means   4      -none- numeric  
scaling 2      -none- numeric  
lev     2      -none- character
svd     1      -none- numeric  
N       1      -none- numeric  
call    3      -none- call     
terms   3      terms  call     
xlevels 0      -none- list     
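
To compare with the logistic-regression result, the fitted object can be used for prediction; this is a sketch (predict() on an lda fit returns class labels, posterior probabilities, and discriminant scores):

pred = predict(ldm)          # components: class, posterior, x (discriminant scores)
mean(pred$class == df$X3)    # accuracy, directly comparable to the glm() result
table(pred$class, df$X3)     # confusion table against the true labels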

3. More Data

Take a data set of your interest and apply logistic regression and LDA.
