\documentstyle[amstex,amssymb,12pt]{article}

\pagestyle{plain}
\renewcommand{\baselinestretch}{1.5}
\footnotesep.25in
\topmargin-0.5in \textheight9.1in \oddsidemargin0.1in \evensidemargin0.1in
\textwidth6.2in

\newtheorem{proposition}{Proposition}
\newtheorem{corollary}{Corollary}
\newtheorem{lemma}{Lemma}
\newtheorem{claim}{Claim}
\newtheorem{theorem}{Theorem}
\newtheorem{remark}{Remark}
\newtheorem{subclaim}{Subclaim}

\newcommand{\up}[1]{\overline{#1}}
\newcommand{\dn}[1]{\underline{#1}}
\newcommand{\F}{\mbox{\boldmath $F$}}
\newcommand{\bm}[1]{\mbox{\boldmath $#1$}}
\newcommand{\pder}[2]{\mbox{$\partial #1 \over \partial #2$}}
\newcommand{\pdermath}[2]{{\partial #1 \over \partial #2}}
\newcommand{\ep}{\epsilon}

\input{tcilatex}

\begin{document}

\title{{\huge Backward Induction with Players who Doubt Others' Faultlessness\thanks{We thank the associate editor and two anonymous referees for very helpful comments. We also benefited from discussions with Eddie Dekel, Herakles Polemarchakis, and Asher Wolinsky.}}}
\author{Aviad Heifetz\thanks{The Economics and Management Department, The Open University of Israel, aviadhe@openu.ac.il} \ and Ady Pauzner\thanks{Eitan Berglass School of Economics, Tel Aviv University, pauzner@post.tau.ac.il}}
\date{This version: May 2005}
\maketitle

\begin{abstract}
We investigate the robustness of the backward-induction outcome, in binary-action extensive-form games, to the introduction of small mistakes in reasoning.
Specifically, when a player contemplates the best action at a future decision node, she assigns some small probability to the event that other players may reach a different conclusion when they carry out the same analysis. We show that, in a long centipede game, the prediction that players do not cooperate fails under this perturbation. Importantly, this result does not depend on forward induction or reputation reasoning. In particular, it applies to finite-horizon overlapping-generations models with fiat money.

J.E.L. Classification No. C73, Field: Game Theory\bigskip
\end{abstract}

\pagebreak \allowbreak

\section{Introduction}

Backward induction is the predominant solution concept used to predict behavior in extensive-form games of complete information. Nonetheless, there are numerous examples in which the backward-induction outcome seems to contradict our \textquotedblleft common sense\textquotedblright\ intuition as to how human agents would act. The centipede game (Rosenthal 1981) is a particularly striking example. In this game, continual cooperation between two parties can yield substantial gains to both, while the backward-induction outcome precludes cooperation altogether. Experimental evidence (see, for example, McKelvey and Palfrey (1992)) suggests that human subjects do not adhere to backward-induction reasoning in this case: a significant proportion of subjects continue for several rounds.

A closely related game is the finite overlapping-generations model of state-issued fiat money. If a meteorite were bound to hit and destroy Earth at the end of the next millennium, dollars would be worthless one day before the destruction. Reasoning backwards, they would be worthless one day earlier, and so forth. Yet, it is hard to believe that dollars would not be used in trade today.
This paper claims that the backward-induction solution concept sometimes hinges on the assumption that players are {\it absolutely} certain of their conclusions regarding others' reasoning. We consider a slight deviation from this assumption and show that, in some cases, the predictions of the model change considerably. Specifically, we develop a model in which a player who contemplates the best action at some future decision node attributes some small but positive probability to the possibility that other players may reach a different conclusion when they carry out the same analysis. Moreover, each player believes that the other players have the same doubts, that other players believe that others maintain such doubts, and so on. In other words, common knowledge of rationality is replaced by {\it common certainty that, with some small probability, the other players might conclude differently than oneself when they contemplate the best action at each decision node.}

Our model applies to binary-action games in agent form. Each agent is characterized by a set of \textquotedblleft types\textquotedblright . A type represents the way the agent thinks when she analyzes the decision problems of the agents at all the nodes in the subgame that starts with her. For each of these decision problems, the type specifies whether she is \textquotedblleft correct\textquotedblright\ -- and believes that the action with the higher payoff is the best one -- or \textquotedblleft confused\textquotedblright\ -- and believes that the action with the lower payoff is the best one.
In other words, a confused agent knows very well how to add and multiply numbers, but whenever she has to compare two payoffs $x$ and $y$ where $x>y$, she concludes by mistake that $y$ is preferred to $x$. A type is represented by a collection of binary digits ($1$ representing \textquotedblleft correct\textquotedblright\ and $0$ representing \textquotedblleft confused\textquotedblright ), one digit for each node in the player's subgame. A given type, however, does not understand the meaning of the $1$'s and $0$'s in her name. She believes that her way of analyzing the game is the objectively correct way, and believes that other agents tend to analyze the game in a way similar to hers. This is modeled by the way she assigns a probability distribution over the types of another agent: the more similar the type, the higher the probability. More precisely, one agent (of a given type) believes that the probability that the type of another agent differs from hers in a given digit is $\varepsilon $ (where $\varepsilon <\frac{1}{2}$), independently across digits. Thus, a confused type attributes a probability of $1-\varepsilon $ to the event that any other player who considers the same node would think like her, and a probability of $\varepsilon $ to the event that the other player is \textquotedblleft confused\textquotedblright\ (in her view) and compares payoffs in the objectively correct way.

When applied to the centipede game, the prediction of this model differs considerably from that of the \textquotedblleft fully rational\textquotedblright\ model. Cooperation among the players ensues for a long while, and only a few steps before the end of the game does it break down. To see why small doubts can induce cooperation, consider the following deliberation of player 1 at the beginning of a long centipede game: \textquotedblleft True, if I were in the shoes of player 2 who plays right after me, and if I assumed that it is commonly known that everyone reasons like me, I would not cooperate.
Thus, if player 2 reasons in this way, I should exit right away. However, in order to decide what's best for 2, I had to put myself in the shoes of many players at consecutive decision nodes, and 2 will have to follow the same procedure. It is therefore not that unlikely that at least at some of these decision nodes, player 2 would reach a conclusion opposite to mine regarding the best action -- by mistake, because her way of thinking is different, because her computational abilities are limited, or for whatever other reason. It is enough that this happens once in order for player 2 to continue: when analyzing the game backwards from the node where she reasoned differently, player 2 will then think that everyone should cooperate. (Since every mistake leads to a reversal in her perceived best action, any odd number of mistakes will also induce player 2 to cooperate.) Since there are so many decision nodes where this might happen, albeit with a small probability at each node, the overall probability that 2 will cooperate at the next stage may not be that small. In such a case, it may be best for me to cooperate as well.\textquotedblright

\textquotedblleft In fact, if player 2 maintains similar considerations regarding the way consecutive agents reason, she might also conclude that her best action is to cooperate, {\it even in the case that she makes no mistakes but, exactly like me, does not rule out the possibility of mistakes.} Thus, I should in fact ascribe a rather high probability to the event that 2 will cooperate.\textquotedblright \footnote{This type of reasoning continues to be valid as long as the players are not too close to the end of the game, so that there are still enough decision nodes down the game tree at which they may doubt each other's conclusions. Consequently, cooperation lasts with a high probability until a certain number of stages before the end.
The smaller the probability $\varepsilon $ ascribed to the possible mismatch of conclusions, the earlier cooperation ends. In the limit of full rationality, when $\varepsilon =0$, we obtain the no-cooperation backward induction outcome.}

A number of papers have studied the sensitivity of the backward induction outcome to slight deviations from the common knowledge of rationality assumption. Kreps, Milgrom, Roberts, and Wilson (1982) and Kreps (1990, p. 536) showed that cooperation in the centipede game or in the finitely repeated prisoners' dilemma can emerge if there is a small chance that one of the players is an irrational cooperator type. The driving force behind these results is \textquotedblleft forward induction\textquotedblright\ or \textquotedblleft reputation\textquotedblright\ reasoning, in which the rational type of a player mimics her cooperator type in order to convince her opponent that she is likely to cooperate in the future. Ben-Porath (1997) showed how common {\it certainty} of rationality (where the event that a player is irrational has probability $0$ but nonetheless is not empty) is compatible with cooperation to some extent. This relaxation of common knowledge permits an assignment of beliefs after a zero-probability event occurs, and thus can accommodate forward induction reasoning.

Our model, by contrast, does not rely on forward induction reasoning. The prediction that players will cooperate in the centipede game relies purely on backward induction. That is, the action that an agent chooses depends only on the analysis of the subgame that starts with her, and is in no way affected by the other parts of the game. Therefore, she can never infer, from the actions of another player at an early stage of the game, what (another agent of) that player is likely to do in the subgame. Nevertheless, our model predicts cooperation in long agent-form centipede games.
Thus, our results also apply to finite overlapping-generations models of fiat money. In such models, an agent accepts money only if she believes that the next agent will accept it; she does not expect her action to affect the next agent's beliefs regarding the likelihood that money will continue to be accepted in the future. While the reputation models do not apply in this case, our model shows that the conclusion that money cannot have value when the horizon is finite depends crucially on the assumption that players are sure that they know exactly how others think.

Another possible deviation from full rationality was studied by Selten (1975). Here, at each node there is a slight chance that a player might \textquotedblleft tremble\textquotedblright\ and mistakenly play a suboptimal action. It is important to differentiate between our {\it mistakes in reasoning} and Selten's {\it mistakes in acting}. In a long centipede game, mistakes in reasoning accumulate, since a mistake in figuring out the best action at any of the following nodes is enough to change the player's (perceived) best action. Hence, when there are many nodes ahead, there may be a relatively high probability of a mismatch of conclusions between two consecutive players regarding the best action. In contrast, mistakes in acting do not accumulate, as can easily be shown by induction from the end of the game: if the correct action at some stage is to exit, there is only a small, fixed chance that the player would \textquotedblleft tremble\textquotedblright\ and continue, and hence the correct action one stage earlier is, again, to exit. This means that the probability of cooperation remains bounded by the probability of a tremble, regardless of the length of the game.\footnote{Aumann (1992) constructs an example in which there is an irrational type that mistakenly continues at one of the decision nodes. The ex-ante probability of the mistaken type is very small.
However, conditional on reaching that node, the probability is large enough (relative to the payoff differentials) that the best action for the player at the preceding node is to continue. This induces all players from that node backwards to continue. Note that Selten's counterpart to Aumann's assumption is that the tremble (for the player at that node) is large. The purpose of our paper is to show that small \textquotedblleft trembles\textquotedblright\ (in reasoning) can accumulate in long games until the probability of continuation reaches that assumed by Aumann. (Note that Aumann's information structure is not of \textquotedblleft agent form\textquotedblright , i.e., a player's type determines her action at more than one node. This, however, is not the driving force behind his cooperative result.)}

The remainder of this paper is organized as follows. In Section 2 we present the formal model for binary-action games. In Section 3 we apply the model to the overlapping-generations version of the centipede game; a self-contained example is presented in Subsection 3.1. Section 4 concludes with a discussion of the interpretation of our type space and of two other games -- the prisoners' dilemma and the two-player centipede game. Proofs are relegated to the appendix.

\section{The Model}

\label{sec:themodel}

Consider a binary-action, finite, extensive-form game $G$ with perfect information and a set of players $I$. The game is described by a tree $(Z,N,A)$, where $Z$ is the set of leaves (terminal nodes), $N$ is the set of (nonterminal) nodes, and $A$ is the set of arcs. Elements of $A$ are ordered pairs $(n,m)$, and $n_{\ast }=\{m:(n,m)\in A\}$ is the (binary) set of immediate successors of $n$. Player $i$'s payoff is a function $f_{i}:Z\rightarrow \mathbb{R}$. We assume that the game is played in \textquotedblleft agent form\textquotedblright . Thus, without loss of generality, we can identify the set of agents with the set of (nonterminal) nodes $N$.
Agent $n$'s set of (pure) strategies is simply the set of immediate successors $n_{\ast }$. Let $S_{n}$ be the set of nodes in the subgame starting at node $n$. If the agents in this subgame choose actions $(a_{n^{\prime }})_{n^{\prime }\in S_{n}}$, agent $n$ obtains the payoff $u_{n}\left( (a_{n^{\prime }})_{n^{\prime }\in S_{n}}\right) $, computed in the usual way. $T_{n}=\{0,1\}^{S_{n}}$ is the set of types of agent $n$, with a typical element denoted $t_{n}=(t_{n}^{n^{\prime }})_{n^{\prime }\in S_{n}}$.

The belief $b_{t_{n}}^{n^{\prime }}$ of type $t_{n}$ about an agent $n^{\prime }\in S_{n}\setminus \{n\}$ who plays after her is a probability measure over that agent's set of types $T_{n^{\prime }}$. Denoting $d(t_{n},t_{n^{\prime }})=\#\left\{ n^{\prime \prime }\in S_{n^{\prime }}:t_{n^{\prime }}^{n^{\prime \prime }}\neq t_{n}^{n^{\prime \prime }}\right\} $ and $e(t_{n},t_{n^{\prime }})=\#\left\{ n^{\prime \prime }\in S_{n^{\prime }}:t_{n^{\prime }}^{n^{\prime \prime }}=t_{n}^{n^{\prime \prime }}\right\} $, $b_{t_{n}}^{n^{\prime }}$ is defined by:
\[
b_{t_{n}}^{n^{\prime }}\left( t_{n^{\prime }}\right) =\varepsilon ^{d(t_{n},t_{n^{\prime }})}\left( 1-\varepsilon \right) ^{e(t_{n},t_{n^{\prime }})}
\]
where $\varepsilon \in \left( 0,\frac{1}{2}\right) $ is the (common) probability of confusion. Thus the probability that $t_{n}$ assigns to $t_{n^{\prime }}$ increases with the number of nodes over which their types \textquotedblleft agree.\textquotedblright\ We assume that the beliefs of $t_{n}$ over the types $t_{n^{\prime }}$ of different agents $n^{\prime }\in S_{n}$ are independent. The interpretation of this is that types interpret their names as {\em neutral tags}, and do not conceive of the $0$'s in their names as denoting nodes where they are \textquotedblleft mistaken\textquotedblright .
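As a quick illustration of this belief formula (ours, not part of the paper's formal apparatus), the weight $b_{t_{n}}^{n^{\prime }}(t_{n^{\prime }})$ can be computed directly from the two types' digits. The encoding of types as tuples of $0/1$ digits, the positional alignment of the digits, and the value $\varepsilon =0.1$ are assumptions made for the sketch:

```python
from itertools import product

def belief(t_n, t_np, eps=0.1):
    """Weight that type t_n assigns to a later agent n' being of type t_np.
    Types are tuples of 0/1 digits, one per node of the corresponding
    subgame; we assume here that the digits of t_np line up with the
    first len(t_np) digits of t_n.  Disagreements are independent
    across digits, each occurring with probability eps < 1/2."""
    d = sum(1 for x, y in zip(t_n, t_np) if x != y)  # digits that disagree
    e = len(t_np) - d                                # digits that agree
    return eps ** d * (1 - eps) ** e

# Sanity check: the weights over all types of n' sum to one, so each
# belief is a genuine probability distribution.
assert abs(sum(belief((1, 0, 1), t) for t in product((0, 1), repeat=3)) - 1) < 1e-12
```

For instance, a never-confused type $(1,1,1)$ assigns weight $0.9^{2}=0.81$ to the successor type $(1,1)$ and weight $0.1\times 0.9=0.09$ to $(0,1)$.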
Type $t_{n}$ takes its own name as the point of reference, as if it were always right, and considers types $t_{n^{\prime }}$ that are very different from itself as peculiar and rare.

Computation of the backward induction outcome in our model is done recursively. Let $U_{n}(t_{n},a)$ denote the expected payoff of agent $n$, with beliefs determined by her type $t_{n}$, if she takes action $a\in n_{\ast }$, and let $a_{n}(t_{n})$ be her preferred action. Then:
\begin{equation}
U_{n}(t_{n},a)=\left\{
\begin{array}{ccc}
f_{n}(a) & \text{if} & a\text{ is a leaf} \\
\dsum\limits_{(t_{n^{\prime }})_{n^{\prime }\in S_{a}}\in \underset{n^{\prime }\in S_{a}}{\times }T_{n^{\prime }}}\left( \underset{n^{\prime }\in S_{a}}{\Pi }b_{t_{n}}^{n^{\prime }}\left( t_{n^{\prime }}\right) \right) \cdot u_{n}\left( a,(a_{n^{\prime }}\left( t_{n^{\prime }}\right) )_{n^{\prime }\in S_{a}}\right) & \text{if} & a\text{ is a node}
\end{array}
\right.  \tag{I}
\end{equation}
where
\begin{equation}
a_{n}(t_{n})=\left\{
\begin{array}{ccc}
\arg \max_{a\in n_{\ast }}U_{n}(t_{n},a) & \text{if} & t_{n}^{n}=1 \\
\arg \min_{a\in n_{\ast }}U_{n}(t_{n},a) & \text{if} & t_{n}^{n}=0
\end{array}
\right.  \tag{II}
\end{equation}
We restrict attention to games and $\varepsilon $ for which the $\arg \max $ and $\arg \min $ in this definition are always singletons (i.e., $U_{n}(t_{n},a)$ is not constant in $a\in n_{\ast }$). Clearly, this holds for generic games.

The recursive definition is applied as follows. For agents $n$ at nodes whose successors are only leaves, the payoff $U_{n}(t_{n},a)$ from playing action $a$ is simply $f_{n}(a)$ (Equation I). The agent has to choose between the two payoffs $\{f_{n}(a):a\in n_{\ast }\}$. Type $1$ chooses the \textquotedblleft right\textquotedblright\ action, i.e., the one with the higher payoff. Type $0$ chooses the wrong action, i.e., the one with the lower payoff (Equation II).
Next, we move to nodes whose successors are either leaves or nodes whose successors are only leaves. For the former, payoffs are computed as before, according to the first line in Equation I. For the latter, we have already computed the chosen actions (as a function of types), and thus can apply the second line in Equation I. Now we can again employ Equation II to compute the actions. Continuing in this way, we eventually cover the whole tree.\footnote{If we were to extend the definition of the model to extensive-form games with more than two actions per node, the digit of the type for each node would assume as many values as there are actions at that node. It would then be natural to assume that a type who is not confused at that node assigns an overall small probability to the several possible kinds of confusion at that node. It is less straightforward, however, to decide what probabilities each {\em confused} type would assign -- both to the truly non-confused type, and to the other kinds of confusion at that node.}

\section{Application: cooperation in the centipede game}

In this section we study an application of our model to a class of centipede games. We will see that our solution concept yields predictions that differ from those of the standard backward-induction solution concept. We work with an overlapping-generations version, in which the payoff structure is simple and allows for an analytic solution.

Consider a centipede game with $N$ players, each assigned to one of $N$ consecutive decision nodes. We enumerate the nodes {\em from the end} of the game; the name of each node also denotes the player who plays there. Hence, player $1$ is the last one to play, $2$ is the one-before-last, and so on. Each player can either continue or quit at her turn. The player can receive one of three possible payoffs: if she quits, she receives a payoff of $1$.
If she continues, her payoff depends on the next player's action: if the next player quits, she receives $0$; if the next player continues, her payoff is $d>2$. The last player gets $1$ by quitting and $0$ by continuing. The probability of confusion, $\varepsilon $, is assumed to be positive and less than $\frac{1}{d}$.

The overlapping-generations centipede game fits a number of economic scenarios. Consider, for example, the use of fiat money in a world which is commonly known to end at some given future date. Agents, who live in overlapping generations, can choose between consuming their own endowment (utility of $1$) or selling it in exchange for money. Each of them would enjoy a higher utility ($d$) if the following agent accepted the money in exchange for her own endowment. But if the following agent declines to accept money, the utility is $0$ (as paper money has no intrinsic value). It has been shown that in an infinite-horizon world, there is an equilibrium in which fiat money has value and can be used for trade (see Samuelson 1958). However, if the world is known to end at a particular date, no agent would accept money at the last date. Hence, the agent playing at the one-before-last date would not give her endowment in exchange for money, since this money will be useless. Continuing this backward induction reasoning, one can show that no agent would ever accept fiat money. The analysis below shows how small mutual doubts among the agents regarding each other's maximizing behavior will induce them to use money for trade for a long while.

How is our model applied to this game? Consider player $n$. Her set of types $T_{n}=\{0,1\}^{S_{n}}$ is the set of all $n$-digit binary numbers, where $0$ in the $k$-th digit corresponds to confusion at node $k$. (Recall that \textquotedblleft confusion\textquotedblright\ means a reversal of the usual ordering on numbers, i.e., concluding that payoff $y$ is preferred to payoff $x$ when $x>y$.)
Consider now player $m>n$. How does she reason about player $n$? This depends on her own type $t_{m}$. The belief of type $t_{m}$ is a probability distribution over $T_{n}$. The probability it assigns to type $t_{n}\in T_{n}$ is computed by comparing the $n$-digit number $t_{n}$ to the first $n$ digits in $t_{m}$; this probability is $\varepsilon ^{\ell }(1-\varepsilon )^{n-\ell }$, where $\ell $ is the number of digits in which the two numbers differ.

How do the types decide what to do? They choose the action which maximizes their expected payoff given their beliefs, unless they are confused at their own decision node, in which case they choose the opposite action. In our multi-player centipede game, the best action for player $m=n+1$ depends on the probability that player $n$ continues. In the case that $n+1$ is not confused at her decision node, she will choose to continue if she believes that the probability that $n$ will also continue exceeds $\frac{1}{d}$. This will yield her an average payoff larger than $1$, the payoff that she can guarantee by quitting immediately (the only exception is at the last node of the game tree, where the above calculation is not relevant; at that node, player $1$ simply compares the payoff of quitting, $1$, to that of continuing, $0$). In the case that $n+1$ is confused, she will of course choose the opposite action to the one implied by the above rule.

\subsection{An Example}

\label{sec:example}

There are five players: $e$, $d$, $c$, $b$, $a$. At her turn, a player can secure a payoff of $1$ by exiting. For all players but the last, the payoff from continuing depends on the action of the next player: it is $0$ if the next player exits and $5$ if the next player chooses to continue. The last player, $a$, receives $0$ by continuing. Clearly, the usual backward induction argument implies that all the players choose to exit if their node is ever reached.
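Before introducing confusion, the claim that classical backward induction makes everyone exit can be checked with a short sketch (our illustration, not the authors' code; the payoff encoding -- quitting pays $1$, continuing pays $0$ or $5$ depending on the next player's action -- follows the example above):

```python
def solve(n_players, d=5):
    """Classical backward induction (no confusion): quitting pays 1;
    continuing pays 0 if the next player quits and d if she continues
    (the last player gets 0 by continuing).  Returns the optimal
    actions, last mover first."""
    actions = []
    next_continues = False  # irrelevant placeholder for the last player
    for k in range(n_players):
        # payoff from continuing at this node
        cont_payoff = 0 if k == 0 else (d if next_continues else 0)
        # continue only if it beats the sure payoff of 1 from quitting
        next_continues = cont_payoff > 1
        actions.append("continue" if next_continues else "quit")
    return actions
```

Since the last player quits, continuing never pays more than $0$ at any earlier node, so `solve` returns `"quit"` for every player, however long the game is.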
\FRAME{dtbpFU}{290.5pt}{109.0625pt}{0pt}{\Qcb{Figure 1: A 5-player centipede game}}{}{Figure }{\special{language "Scientific Word";type "GRAPHIC";maintain-aspect-ratio TRUE;display "USEDEF";valid_file "T";width 290.5pt;height 109.0625pt;depth 0pt;original-width 364.5pt;original-height 135.625pt;cropleft "0";croptop "1";cropright "1";cropbottom "0";tempfilename 'IGBA0M00.wmf';tempfile-properties "XPR";}}

Suppose now that the probability of confusion is $\varepsilon =0.1$. How do the players play the game? Let us analyze the game backwards.

What does player $b$ (who is not confused) think that player $a$ will do? She believes that with probability $\varepsilon =0.1$ player $a$ will get confused, will assess $0$ as better than $1$, and will continue. If player $a$ is not confused (probability $1-\varepsilon =0.9$), she will exit. Thus, if $b$ continues, she expects an average payoff of $0.9\times 0+0.1\times 5=0.5$. This is less than the payoff of $1$ that she gets by quitting, and therefore she quits.

What does a (non-confused) player $c$ think that player $b$ will do? She thinks that $b$ will quit, unless either:

\noindent 1) $b$ understands correctly that $a$ will quit, but gets confused in computing her own best response and decides to continue (probability $0.1\times 0.9$), or

\noindent 2) $b$ gets confused when she puts herself in the shoes of player $a$, concludes that $a$ will quit and, given that mistake, she \textquotedblleft correctly\textquotedblright\ decides to continue (probability $0.9\times 0.1$).\smallskip

\noindent Thus, if $c$ continues, her expected payoff is $0.82\times 0+2\times 0.1\times 0.9\times 5=0.9$. This is still less than the payoff of $1$ that she can secure by quitting, and therefore she quits.

What does the non-confused type of player $d$ think that $c$ will do?
She believes that $c$ continues if and only if $c$ was confused an odd number of times when she put herself in the shoes of $a$, $b$ or herself: either exactly once (probability $3\times 0.1\times 0.9^{2}=0.243$) or in all three cases (probability $0.1^{3}=0.001$), for a total probability of $0.244$. In the complementary event that $c$ either never got confused or got confused two mutually-compensating times, $c$ quits. Thus, if $d$ continues, her expected payoff is $0.756\times 0+0.244\times 5=1.22$, which is better than the payoff of $1$ she gets by quitting. Therefore, she continues!

Finally, the non-confused type of player $e$ is almost certain that $d$ will continue. Why? Consider a type of $d$ who is not confused in her own shoes. As explained above, if $d$ never got confused in the shoes of $a$, $b$ and $c$, she would continue. This holds also if she got confused twice when she reasoned about $a$, $b$ and $c$. But what if $d$ got confused once or thrice in the shoes of $a$, $b$ and $c$? Also in this case $d$ would continue -- \textquotedblleft by mistake\textquotedblright : as may easily be calculated, such a type of $d$ would ascribe probability $1-0.244=0.756$ to the event that $c$ continues. Thus, any type of $d$ who is not confused in her own shoes must continue; and clearly, every type of $d$ who {\it is} confused in her own shoes quits. Thus, (the non-confused type of) $e$ assigns a probability of $1-\varepsilon =0.9$ to the event that $d$ continues. As a result, $e$ will continue.

Figure 2 describes the probabilities of continuation of each player, as viewed by the non-confused type of the preceding player.
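These probabilities (those of Figure 2) can be reproduced mechanically by enumerating types and applying the recursive definition of Section 2. The sketch below is our own illustration, not the authors' code; it encodes a type as a tuple of $0/1$ digits with the player's own node last, and uses the example's parameters $d=5$, $\varepsilon =0.1$ (note that the exact continuation probability of $c$ is $3\times 0.1\times 0.9^{2}+0.1^{3}=0.244$):

```python
from functools import lru_cache
from itertools import product

EPS = 0.1  # probability of confusion at each node (example's value)
D = 5.0    # payoff when the next player also continues (example's value)

def weight(believer, t):
    """Belief weight: a player whose relevant digits are `believer`
    assigns probability EPS^m (1-EPS)^(len(t)-m) to a successor type t,
    where m counts the digits on which the two disagree."""
    m = sum(1 for a, b in zip(believer, t) if a != b)
    return EPS ** m * (1 - EPS) ** (len(t) - m)

@lru_cache(maxsize=None)
def continues(t):
    """Action of the type with digits t (nodes counted from the end of
    the game, the player's own node last): True = continue."""
    k = len(t)
    if k == 1:
        best = False  # last player: quitting (payoff 1) beats continuing (0)
    else:
        # probability that the next player continues, in this type's eyes
        p_next = sum(weight(t[:-1], s)
                     for s in product((0, 1), repeat=k - 1) if continues(s))
        best = D * p_next > 1  # continue iff next continues w.p. > 1/d
    # a type confused at her own node (last digit 0) takes the opposite action
    return best if t[-1] == 1 else not best

def p_cont(k):
    """Continuation probability of player k in the eyes of the
    non-confused type of player k+1."""
    ones = (1,) * k
    return sum(weight(ones, t)
               for t in product((0, 1), repeat=k) if continues(t))
```

With these parameters, `p_cont(k)` for $k=1,\dots ,5$ yields $0.1$, $0.18$, $0.244$, $0.9$ and $0.82$: the continuation probabilities of players $a$ through $e$ in the eyes of the non-confused type of the next player.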
\FRAME{dtbpFU}{305.875pt}{74.125pt}{0pt}{\Qcb{Figure 2: Probabilities of continuation.}}{}{Figure}{\special{language "Scientific Word";type "GRAPHIC";maintain-aspect-ratio TRUE;display "USEDEF";valid_file "T";width 305.875pt;height 74.125pt;depth 0pt;original-width 326.125pt;original-height 77.5625pt;cropleft "0";croptop "1";cropright "1";cropbottom "0";tempfilename 'IGBA0M02.wmf';tempfile-properties "XPR";}}\vspace{-1cm}

\subsection{The general result}

The following theorem states that a long centipede game has a concluding segment, before which players who are not confused (i.e., of type $111\dots 1$) continue, and furthermore assign a high probability to continuation everywhere before that concluding segment.\medskip

\noindent {\bf Theorem 3.1. }{\it There is an integer }$n=n(d,\varepsilon )${\it , such that the types }$111\dots 1${\it \ of all players from }$n+1$ {\it on (backwards towards the beginning of the game) continue, and the types }$111\dots 1${\it \ of the players }$1,\dots ,n$ {\it quit. Moreover, types }$111\dots 1${\it \ of all players }$i>n$ {\it believe that player }$i-1$ {\it continues with probability at least }$1-\frac{1}{d}-\varepsilon \left( 1-\frac{2}{d}\right) $. {\it Finally, }$n(d,\varepsilon )$ {\it decreases in both} $d$ {\it and} $\varepsilon $.\bigskip

To see the intuition for the theorem, we now analyze the example in Subsection \ref{sec:example} and extend it by adding players $f$, $g$, \dots\ For each player (except player $a$) and each of her types, we can divide the sequence of digits that composes the type into two parts: one digit describes whether she is confused in her own shoes, and the other digits describe her belief regarding the types of the next player. The belief leads to an assessment of the probability that the next player quits or continues, which determines what the best action is. The digit of the player itself determines whether she indeed takes the best action, or mistakenly takes the wrong action.
As for the last player $a$, since there are no further players, the best action for her is simply determined by the two terminal payoffs -- and it is to quit. Like the other players, whether she will indeed take the best action or mistakenly continue depends on her own digit.

As we go backwards in the game and analyze the behavior of players $b$ and $c$ (in the eyes of the non-confused type of the previous player), we see that the probability of continuation, which is the probability of an odd number of mistakes, increases. The probability of an odd number of successes in $n$ independent draws with success probability $\varepsilon $ equals $\frac{1}{2}\left( 1-(1-2\varepsilon )^{n}\right) $, which tends to $\frac{1}{2}$ as the number of draws grows to infinity; hence, at some point the probability exceeds $\frac{1}{d}<\frac{1}{2}$. With our $\varepsilon =0.1$, this first happens when we reach player $c$, for whom the probability is $0.244>0.2$. Now, all the types of player $d$ assign a probability greater than $0.2$ to the event that player $c$ continues: for some the probability is $0.244$, and for the others it is $1-0.244$. This means that the best action for $d$ is to continue, rather than to quit, and all the types of $d$ agree on that. Whether they will indeed continue or mistakenly quit depends on their own digit.

Now, the conclusion that the best action for $d$ is to continue is common to all the types of all the previous players ($e$, $f$, ...). Therefore, to analyze the behavior of players $d$, $e$, $f$ and so on, we can look at a simplified game. In this game, the last player is $d$, and her payoff from continuing is higher than that of quitting. The payoffs of the other players are unchanged. The analysis of the simplified game goes as follows. Player $d$'s best action is to continue, and thus she continues unless confused in her own shoes. Also players $e$ and $f$ continue unless they are confused an odd number of times. Thus, the probabilities of continuation are $1-0.1=0.9$ for $d$, $1-0.18=0.82$ for $e$, and $1-0.244=0.756$ for $f$. What about player $g$?
All her types assign probability either $1-0.243$ or $1-(1-0.243)$ to the event that $f$ continues. Since both numbers are above $0.2$, they all agree that the best action for $g$ is to continue. Thus, whether $g$ continues depends only on whether she is confused in her own shoes -- probability $\varepsilon =0.1$. Thus, we can again simplify the game, making $g$ the last player, with the payoff to continuing higher than that to quitting (as we did with player $d$).

We therefore see that the probability of continuation follows cycles of length $n(d,\varepsilon )=3$. The first cycle is different from the others, as the best action for the last player $a$ is to quit, while for the simplified games corresponding to the next cycles the best end-action is to continue. Thus, the probabilities of continuation in the first cycle are $0.1$, $0.18$ and $0.243$; for the other cycles we have the complementary probabilities $0.9$, $0.82$ and $0.757$.

The following graph depicts the probability of continuation in the eyes of a type $t_{n+1}=111\dots 1$ as a function of the number of generations from the end, for the parameters of the example in Section 2 ($d=5$, $\varepsilon =0.1$).

\FRAME{dtbpFU}{375pt}{153.75pt}{0pt}{\Qcb{Figure 3: Continuation probabilities in the Centipede Game\protect\linebreak }}{}{Figure}{\special{language "Scientific Word";type "GRAPHIC";display "USEDEF";valid_file "T";width 375pt;height 153.75pt;depth 0pt;original-width 80.125pt;original-height 80.125pt;cropleft "0";croptop "1.0079";cropright "0.9953";cropbottom "0";tempfilename 'IGBA0M01.wmf';tempfile-properties "XPR";}}

\section{Concluding Remarks}

We introduced a small deviation from full common knowledge of rationality: players ascribe some small probability to the possibility that other players reach a different conclusion when they examine the same decision problem.
We analyzed the centipede game and saw that the usual backward-induction, non-cooperative solution might not be robust to such a perturbation of the model.

\subsection{Interpretation of the type space}

In our type space there is common certainty that each of the players (1) believes that other players are sometimes confused (relative to her own point of view), and (2) is sure that she is right in all her calculations. This type space admits two different interpretations. According to the interpretation that we emphasize in this paper, the type space is a hypothetical construct in the minds of the players, while the players themselves do not get confused. In other words, the actual belief of each player coincides with that of the type $111\dots 1$ who is never confused, and it is the players' mutual suspicions (which do not materialize in practice) that drive them to cooperation. An alternative interpretation is that players actually do make mistakes: ex-ante each of the types may be realized, with the probability assigned by the non-confused type $111\dots 1$. However, even the confused types mistakenly believe that they are always correct.

Neither of these interpretations is compatible with a common-prior, Bayesian-game framework. Can our model be adapted to this more standard assumption? How would our analysis change if we assumed that each type also doubts her own judgement? The critical difference is that if a player doubts her own reasoning, she had better try to gain information regarding her best action from past decisions of other players, as these decisions were influenced by their analysis of her current decision problem. Put differently, since she understands that her own thinking only gives her a noisy signal regarding the correct computation, she should consider the preceding players' actions as additional signals that improve the accuracy of her own. But now the analysis becomes very different.
Since the analysis of a subgame depends on the history that precedes it, forward-induction arguments come into play (even though the game is in agent-form): if a player knows that her decision to continue will affect the (sequential equilibrium) beliefs of the next agent, she must take this effect into account. In particular, if by continuing she can convince the next player to continue, then she should continue even if she believes that the best action for the next player is to exit.\footnote{We thank Asher Wolinsky for this observation.} While studying the forward-induction consequences of a Bayesian framework could be interesting, we preferred to adhere, in this paper, to backward-induction reasoning (by which cooperation in the centipede game is perhaps more surprising), and leave this issue for future research.

\subsection{Two-player centipede games and forward-induction}

In our agent-form analysis, types were ascribed to agents. In a two-player alternating-moves centipede game, in contrast, a type is ascribed to a player. It is natural to assume that the type of the player does not change as the game proceeds. In other words, the way a player predicts the continuation of the game from some future node onwards is the same at any decision node of hers that precedes that future node. One may conjecture that this assumption implies a positive correlation between the {\em actions} taken by a given type at different nodes. That is, if a type continues at some node, it is likely that it also continues at its previous decision node. (This conjecture is based on the structure of the centipede game, in which a belief that the probability of continuation in the future is higher implies a higher incentive to continue in the present.) The condition that there is a positive correlation between the actions of a type at different nodes is the driving force behind forward-induction arguments. For example, in Kreps [1990] there is a ``crazy'' type who always continues.
Thus, when a player is called to play, she knows that the opponent continued at the previous node, and this increases her assessment of the probability that the opponent will continue at the next node. This type of reasoning leads to cooperation at the beginning of long centipede games. Under the above conjecture, that our belief structure leads to a positive correlation in actions, one could view our framework as providing a possible foundation for the positive-correlation assumption. That is, the positive correlation of actions becomes a result rather than an assumption.

\subsection{Other games: the prisoners' dilemma}

One may wonder about other canonical games. An important example is the finitely repeated prisoners' dilemma. One ``lesson'' from the forward-induction literature (such as Kreps et al. [1982]) was that there is a similarity between the centipede game and the finitely repeated prisoners' dilemma, as in both games the same slight perturbation leads to an equilibrium with cooperation for a long while. Are the two games also similar under our perturbation? Or is the non-cooperative, backward-induction outcome of the finitely repeated prisoners' dilemma robust to the introduction of small mistakes in reasoning? It turns out that the analysis of that game is quite complicated, as the number of possible histories expands exponentially with the number of iterations. However, we can analyze a simplified version which is perhaps closer to the way most people think about the repeated prisoners' dilemma. Consider a finitely repeated prisoners' dilemma game in which each player has a different agent who plays at each stage. This game differs from the canonical agent-form since the same agent plays at all the nodes corresponding to a specific stage; nonetheless, with full common knowledge of rationality, backward induction yields the usual noncooperative solution.
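Underlying the analysis of this simplified game is the familiar stage-game logic: defection strictly dominates in the one-shot prisoners' dilemma, so an agent who cannot influence future play defects whatever she believes about others' reasoning. A quick check, with payoff values of our own choosing satisfying the standard ordering $T>R>P>S$:

```python
# Assumed stage payoffs (temptation, reward, punishment, sucker): T > R > P > S.
T, R, P, S = 5, 3, 1, 0

def stage_payoff(my_action, opp_action):
    """One-shot prisoners' dilemma payoff; 'C' = cooperate, 'D' = defect."""
    table = {('C', 'C'): R, ('C', 'D'): S, ('D', 'C'): T, ('D', 'D'): P}
    return table[(my_action, opp_action)]

# Defection is a best response to either opponent action, so an agent whose
# action cannot affect later stages defects regardless of her beliefs.
```

This is why, in the analysis that follows, doubts about other agents' conclusions turn out to be irrelevant to each agent's stage-maximizing choice.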
Now assume that each agent ascribes a small probability $\varepsilon $ to the event that another agent {\em always} reaches the opposite conclusion when she analyzes the same decision problem. That is, an agent suspects that any other agent might be \textquotedblleft mistakenly wired\textquotedblright\ and always reach the \textquotedblleft wrong\textquotedblright\ conclusion (from the first agent's perspective). Moreover, assume that these doubts are common knowledge.

This simplified model is very easy to analyze. Clearly, at the last stage both players defect (if they don't make a mistake). In the one-before-last stage, a player who carried out a correct analysis will certainly defect. Moreover, a player who mistakenly expects cooperation in the last stage will also choose to defect. Although she errs in her analysis of the last stage, she does not expect her action to affect her opponent's action in the next stage. Thus, she chooses the action that maximizes her stage payoff. The same reasoning applies to all the stages of the game. Players who do not err in their analysis of the current stage choose to defect, whatever doubts they may have regarding the opponent's reasoning. Thus the probability of cooperating is simply the probability $\varepsilon $ that a player errs in her own shoes when analyzing the current stage.

In other words, in contrast to the centipede game, in the prisoners' dilemma mistakes do not accumulate: although a player may be mistaken in many places when she analyzes the continuation of the game, the probability that she will take the wrong action remains small. Therefore, if the mutually ascribed probability of mistakes $\varepsilon $ is small enough, then no matter how many times the game is repeated, a player who is never mistaken will always defect.
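The contrast between the two games can be sketched numerically: in the centipede game the probability of a wrong action is governed by the parity of accumulated confusions and drifts to $1/2$, while in the simplified repeated prisoners' dilemma it stays at $\varepsilon $ however long the game is. The two functions below are our own stylized rendering of this argument (within one cycle of the centipede analysis), not the paper's formal model:

```python
def centipede_wrong_action_prob(n, eps):
    """Probability that the action n nodes from the end (within a cycle)
    differs from the correct one: an odd number of the n relevant
    confusions, each occurring independently with probability eps.
    Tends to 1/2 as n grows."""
    return (1 - (1 - 2 * eps) ** n) / 2

def pd_cooperate_prob(n, eps):
    """In the simplified repeated prisoners' dilemma, an agent defects unless
    she errs 'in her own shoes' at the current stage, so the probability of
    cooperating is eps at every stage, independent of the horizon n."""
    return eps
```

For $\varepsilon =0.1$ the centipede probability climbs towards $1/2$ with the distance from the end, whereas the prisoners'-dilemma cooperation probability is flat at $0.1$.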
Thus, if we think of the model of slight mistakes in reasoning as a {\em refinement} of subgame perfect equilibrium, the finitely repeated prisoners' dilemma differs from the centipede game. The subgame perfect no-cooperation equilibrium of the finitely repeated prisoners' dilemma is robust to the introduction of mutual doubts regarding the faultlessness of the players. This is not the case with the (long) centipede game, in which a slight deviation from common knowledge of rationality yields a qualitatively different backward-induction outcome.

One possible objection to the above modeling of mistakes in the prisoners' dilemma is that we only allow players to mistakenly choose fixed actions, rather than also allow for mistakenly choosing a history-dependent strategy. Clearly, a player will choose to cooperate only if she believes that the opponent who plays after her has a history-dependent strategy, such as tit-for-tat; if she believes her opponent's action is fixed -- whether it is cooperation or defection -- she will not cooperate. Could we have restored cooperation in the prisoners' dilemma had we allowed mistakes in beliefs about history-dependent strategies? It is easy to see that the answer is negative. Assume for example that a type, at the one-before-last stage, mistakenly believes that tit-for-tat is the best action at the last stage. If not confused in her own shoes, she will decide to cooperate -- not to play tit-for-tat. Going back to an agent at the two-before-last stage, if she mistakenly believes that the best action at the {\em last} stage is tit-for-tat, but given this mistake correctly computes the action at the one-before-last stage -- which is ``cooperate'' -- then her best action is to defect! The only case in which she would not defect is if she made a mistake at the one-before-last stage (or in her own shoes).
In other words, in the prisoners' dilemma game, the possibility of mistakenly assigning to players history-dependent strategies with $1$-recall (like tit-for-tat) can only affect the optimal action for an agent one stage earlier, but not further back in the game. That is, the only cases in which a player at stage $k$ from the end might not defect are (i) if she made a mistake in her own shoes, or (ii) if she made a mistake in the shoes of the players at the next stage $k-1$. Further back than that, a single mistake does not affect her best action. As a result, mistakes with probability of order $\varepsilon $ do not accumulate to more than $2\varepsilon $ ($\varepsilon $ times the bound on the recall plus $1$). The conclusion is that if we allow for mistakes towards history-dependent strategies with recall bounded by $1$, then, as the probability of a mistake approaches $0$, play in the repeated prisoners' dilemma converges to all-defect, regardless of the length of the game. The same line of reasoning applies if we allow mistakes towards history-dependent strategies with longer recall, as long as the bound on the recall is commonly known. Thus, even when we allow for history-dependent mistakes with bounded recall, we may still say that, unlike in the centipede game, the backward-induction outcome in the finitely repeated prisoners' dilemma is robust to the inclusion of mistakes in reasoning of the kind we analyzed.

\appendix\pagebreak

\section{Proofs}

To prove Theorem 3.1 we first need some definitions. Consider the following two complementary subsets of the set of types $T_{j}$ of player $j$:
\[
E_{j}^{\ell ,m}=\{\text{the types of }j\text{ which are confused an even number of times in nodes }\ell ,\dots ,m\}
\]
\[
O_{j}^{\ell ,m}=\{\text{the types of }j\text{ which are confused an odd number of times in nodes }\ell ,\dots ,m\}
\]
The following two lemmata will be useful in the sequel.
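Since, under the belief of the type $111\dots 1$ of player $m+1$, each digit of the next player's type equals $1$ independently with probability $1-\varepsilon $ (as the proof of Theorem 3.1 below uses), the probability of $E_{m}^{\ell ,m}$ has the closed form $\frac{1}{2}\left( 1+(1-2\varepsilon )^{m-\ell +1}\right) $, which decreases towards $\frac{1}{2}$ as the block of nodes grows. A brute-force check by enumeration (the function names are ours):

```python
from itertools import product

def prob_even_confusions(k, eps):
    """Probability, under type 111...1's belief (digits i.i.d., each equal to
    0 with probability eps), that a block of k digits contains an even
    number of 0's -- i.e. the probability of the set E over those k nodes."""
    total = 0.0
    for digits in product((0, 1), repeat=k):
        p = 1.0
        for dgt in digits:
            p *= eps if dgt == 0 else 1 - eps
        if digits.count(0) % 2 == 0:
            total += p
    return total

def closed_form(k, eps):
    # Parity identity: P(even number of 0's) = (1 + (1 - 2*eps)**k) / 2.
    return (1 + (1 - 2 * eps) ** k) / 2
```

With $\varepsilon =0.1$ the complementary "odd" probabilities for $k=1,2$ are $0.1$ and $0.18$, matching the continuation probabilities of the example in Section 2.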
\bigskip

\noindent {\bf Lemma A.1.} \ {\it All the types in }$E_{m+1}^{\ell ,m}${\it \ assign the same probability to }$E_{m}^{\ell ,m},${\it \ all the types in }$O_{m+1}^{\ell ,m}$ {\it assign the same probability to }$O_{m}^{\ell ,m},$ {\it and these two probabilities are the same.}

\bigskip

\noindent {\bf Proof. \ }Denote by ${\bf 1}_{m+1}$ the type $111\dots 1$ of player $m+1$. For every type $t_{m+1}\in T_{m+1},$ denote by $P_{t_{m+1}}:T_{m}\rightarrow T_{m}$ the permutation of $T_{m}$ that for each $t_{m}\in T_{m}$ inverts (i.e., changes $0$'s to $1$'s and $1$'s to $0$'s) all the entries of $t_{m}$ corresponding to entries where $t_{m+1}$ is confused (has entry $0$). (That is, for $t_{m}=\left( t_{m}^{n}\right) _{n=1}^{m}$, each $t_{m}^{n}$ is unchanged if $t_{m+1}^{n}=1$ and inverted if $t_{m+1}^{n}=0$.) Note that, by definition, for every $A\subseteq T_{m}$
\[
b_{t_{m+1}}^{m}\left( A\right) =b_{{\bf 1}_{m+1}}^{m}\left( P_{t_{m+1}}\left( A\right) \right) .
\]
Note further that for any $t_{m+1}\in E_{m+1}^{\ell ,m}$ (i.e., the number of $0$ entries between $\ell $ and $m$ is even), $P_{t_{m+1}}$ maps both $E_{m}^{\ell ,m}$ and $O_{m}^{\ell ,m}$ to themselves. Similarly, for $t_{m+1}^{\prime }\in O_{m+1}^{\ell ,m}$, $P_{t_{m+1}^{\prime }}$ maps $E_{m}^{\ell ,m}$ onto $O_{m}^{\ell ,m}$ and vice versa.
Then,\bigskip

\noindent $\hspace{-1.5cm}%
\begin{array}{cc}
b_{t_{m+1}}^{m}(E_{m}^{\ell ,m})=b_{{\bf 1}_{m+1}}^{m}\left( P_{t_{m+1}}(E_{m}^{\ell ,m})\right) =b_{{\bf 1}_{m+1}}^{m}(E_{m}^{\ell ,m})\text{ \ ,} & b_{t_{m+1}}^{m}(O_{m}^{\ell ,m})=b_{{\bf 1}_{m+1}}^{m}\left( P_{t_{m+1}}(O_{m}^{\ell ,m})\right) =b_{{\bf 1}_{m+1}}^{m}(O_{m}^{\ell ,m}) \\
b_{t_{m+1}^{\prime }}^{m}(O_{m}^{\ell ,m})=b_{{\bf 1}_{m+1}}^{m}\left( P_{t_{m+1}^{\prime }}(O_{m}^{\ell ,m})\right) =b_{{\bf 1}_{m+1}}^{m}(E_{m}^{\ell ,m})\text{ \ ,} & b_{t_{m+1}^{\prime }}^{m}(E_{m}^{\ell ,m})=b_{{\bf 1}_{m+1}}^{m}\left( P_{t_{m+1}^{\prime }}(E_{m}^{\ell ,m})\right) =b_{{\bf 1}_{m+1}}^{m}(O_{m}^{\ell ,m})%
\end{array}%
$

\bigskip

\noindent which imply that all $b_{t_{m+1}}^{m}(E_{m}^{\ell ,m})$'s and $b_{t_{m+1}^{\prime }}^{m}(O_{m}^{\ell ,m})$'s are the same, and all $b_{t_{m+1}}^{m}(O_{m}^{\ell ,m})$'s and $b_{t_{m+1}^{\prime }}^{m}(E_{m}^{\ell ,m})$'s are the same. $\blacksquare $\bigskip

\noindent {\bf Lemma A.2. \ }{\it Suppose that all the types of }$m${\it \ in }$E_{m}^{\ell ,m}${\it \ take the same action, and all the types in }$O_{m}^{\ell ,m}${\it \ take the opposite action. Let }
\[
p=b_{{\bf 1}_{m+1}}^{m}(E_{m}^{\ell ,m})
\]
\begin{enumerate}
\item[1)] {\it If }$p<\frac{1}{d}${\it \ or} $p>1-\frac{1}{d}${\it , then all the types of }$m+1$ {\it in }$E_{m+1}^{\ell ,m+1}$ {\it take the same action and all the types of }$m+1$ {\it in }$O_{m+1}^{\ell ,m+1}$ {\it take the opposite action.}

\item[2)] {\it If }$\frac{1}{d}<p<1-\frac{1}{d}${\it , then all the types of }$m+1$ {\it who are not confused in node }$m+1$ {\it continue there, and all the types who are confused in node }$m+1$ {\it quit.}
\end{enumerate}

\bigskip

\noindent {\bf Proof. \ }We start with case 2. If the types in $E_{m}^{\ell ,m}$ continue, then by Lemma A.1 all the types of $m+1$ in $E_{m+1}^{\ell ,m}$ assign to $E_{m}^{\ell ,m}$ probability $p>\frac{1}{d}$, and the other types of $m+1$, those in $O_{m+1}^{\ell ,m}$, assign to $E_{m}^{\ell ,m}$ probability $1-p>\frac{1}{d}$. Similarly, if the types in $O_{m}^{\ell ,m}$ continue, all the types in $E_{m+1}^{\ell ,m}$ assign to $O_{m}^{\ell ,m}$ probability $1-p>\frac{1}{d},$ and the types in $O_{m+1}^{\ell ,m}$ assign to $O_{m}^{\ell ,m}$ probability $p>\frac{1}{d}$.
Thus, all the types of $m+1$ who are not confused in node $m+1$ continue, and all those who are confused there quit.

In case 1, denote by $C_{m}^{\ell ,m}$ the set of types of $m$ who continue -- either $E_{m}^{\ell ,m}$ or $O_{m}^{\ell ,m}$. Now, $b_{{\bf 1}_{m+1}}^{m}(C_{m}^{\ell ,m})$ is either $p$ or $1-p$, and thus is either (a) less than $\frac{1}{d}$ or (b) more than $1-\frac{1}{d}$. By Lemma A.1, in case (a) all the types of $m+1$ in $E_{m+1}^{\ell ,m}$ assign to $C_{m}^{\ell ,m}$ a probability smaller than $\frac{1}{d}$ and the types in $O_{m+1}^{\ell ,m}$ assign to $C_{m}^{\ell ,m}$ a probability larger than $1-\frac{1}{d}$; in case (b) all the types of $m+1$ in $O_{m+1}^{\ell ,m}$ assign to $C_{m}^{\ell ,m}$ a probability smaller than $\frac{1}{d}$ and the types in $E_{m+1}^{\ell ,m}$ assign to $C_{m}^{\ell ,m}$ a probability larger than $1-\frac{1}{d}$.

Denote by $Q_{m+1}^{\ell ,m}$ the set of types of $m+1$ who assign to $C_{m}^{\ell ,m}$ probability smaller than $\frac{1}{d}$, and by $C_{m+1}^{\ell ,m}$ the set of types of $m+1$ who assign to $C_{m}^{\ell ,m}$ probability larger than $1-\frac{1}{d}.$ By the argument above, either $Q_{m+1}^{\ell ,m}=E_{m+1}^{\ell ,m}$ and $C_{m+1}^{\ell ,m}=O_{m+1}^{\ell ,m}$ (case a), or $Q_{m+1}^{\ell ,m}=O_{m+1}^{\ell ,m}$ and $C_{m+1}^{\ell ,m}=E_{m+1}^{\ell ,m}$ (case b). To maximize expected utility, the types of $C_{m+1}^{\ell ,m}$ who are not confused in node $m+1$ continue there and the types of $C_{m+1}^{\ell ,m}$ who are confused in node $m+1$ quit. In contrast, the types of $Q_{m+1}^{\ell ,m}$ who are not confused in node $m+1$ quit there, and the types of $Q_{m+1}^{\ell ,m}$ who are confused in node $m+1$ continue. So altogether, the types of $m+1$ who continue constitute either the set $E_{m+1}^{\ell ,m+1}$ or the set $O_{m+1}^{\ell ,m+1}$, and the types who quit constitute the other. $\blacksquare $

\bigskip

\noindent {\bf Proof of Theorem 3.1.
}\ In node $1$ (the last), type $1$ (which constitutes the set $E_{1}^{1,1}$) quits and type $0$ (which constitutes the set $O_{1}^{1,1}$) continues. With $\ell =m=1$ we are thus in case 1 of Lemma A.2, because $p=1-\varepsilon >1-\frac{1}{d}$. The conclusion of the lemma in this case is that the premise of the lemma obtains also with $\ell =1$ and $m=2$. Inductively, we can apply case 1 of Lemma A.2 iteratively, with $\ell =1$ and increasing $m,$ until $p$ first falls below $1-\frac{1}{d}$ (this must happen since the probability $p$ of getting confused an even number of times decreases and tends to $\frac{1}{2}$). Let $n$ be the minimal $m$ for which $p<1-\frac{1}{d}.$ Evidently, the smaller $\varepsilon $ and $d$ are, the larger is $n$. Then, we can apply case 2 of Lemma A.2 (since $\frac{1}{2}<p<1-\frac{1}{d}$): all the types of player $n+1$ who are not confused in node $n+1$ continue, and all those who are confused there quit. In particular, the type $111\dots 1$ of player $n+1$ continues, whereas the types $111\dots 1$ of the players $1,\dots ,n$ quit. From node $n+1$ onwards the premise of Lemma A.2 obtains again with $\ell =n+1$, and the argument repeats itself in cycles, so the types $111\dots 1$ of all players from $n+1$ on continue.

To obtain the lower bound on the probability that player $i$ continues, for $i>n$ (as perceived by type $111...1$ of player $i+1$), note that at each iteration, if this probability is less than $1-\frac{1}{d}$ we go to case 2 of Lemma A.2, and the probability jumps back up to $1-\varepsilon $. The lower bound is thus obtained if we take the lowest probability for which case 1 of Lemma A.2 still applies, which is $1-\frac{1}{d}$, and compute the probability after applying case 1 once more. Now,
\begin{eqnarray*}
b_{{\bf 1}_{m+2}}^{m+1}(E_{m+1}^{\ell ,m+1}) &=&(1-\varepsilon )\cdot b_{{\bf 1}_{m+1}}^{m}(E_{m}^{\ell ,m})+\varepsilon \cdot b_{{\bf 1}_{m+1}}^{m}(O_{m}^{\ell ,m}) \\
&=&(1-\varepsilon )\cdot b_{{\bf 1}_{m+1}}^{m}(E_{m}^{\ell ,m})+\varepsilon \cdot \left( 1-b_{{\bf 1}_{m+1}}^{m}(E_{m}^{\ell ,m})\right)
\end{eqnarray*}%
Why? With probability $(1-\varepsilon )$ the new digit ($m+1$) is $1$, and in this case every type in $E_{m}^{\ell ,m}$ becomes a type in $E_{m+1}^{\ell ,m+1}$, and with probability $\varepsilon $ the new digit is $0$, and in this case every type in $O_{m}^{\ell ,m}$ becomes a type in $E_{m+1}^{\ell ,m+1}$.
Substituting the lower bound $1-\frac{1}{d}$ for $b_{{\bf 1}_{m+1}}^{m}(E_{m}^{\ell ,m})$, we obtain:
\[
b_{{\bf 1}_{m+2}}^{m+1}(E_{m+1}^{\ell ,m+1})\geq (1-\varepsilon )\left( 1-\frac{1}{d}\right) +\varepsilon \frac{1}{d}=1-\frac{1}{d}-\varepsilon \left( 1-\frac{2}{d}\right)
\]
$\blacksquare $\pagebreak

\begin{thebibliography}{9}

\bibitem{Aumann92} Aumann, R.J. 1992. \textquotedblleft Irrationality in Game Theory,\textquotedblright\ in Dasgupta et al. (eds.), {\it Economic Analysis of Markets and Games: Essays in Honor of Frank Hahn}, MIT Press, Cambridge.

\bibitem{BenPorath} Ben-Porath, E. 1997. \textquotedblleft Rationality, Nash Equilibrium and Backward Induction in Perfect-Information Games,\textquotedblright\ {\sl Review of Economic Studies} 64:23-46.

\bibitem{McKelveyPalfrey} McKelvey, R.D. and T.R. Palfrey 1992. \textquotedblleft An Experimental Study of the Centipede Game,\textquotedblright\ {\sl Econometrica} 60:803-836.

\bibitem{Kreps} Kreps, D. 1990. {\it A Course in Microeconomic Theory}, Princeton University Press.

\bibitem{KrepsMilgromRobertsWilson} Kreps, D., P. Milgrom, J. Roberts and R. Wilson 1982. \textquotedblleft Rational Cooperation in the Finitely Repeated Prisoner's Dilemma,\textquotedblright\ {\sl Journal of Economic Theory} 27:245-252.

\bibitem{Rosenthal} Rosenthal, R.W. 1981. \textquotedblleft Games of Perfect Information, Predatory Pricing and the Chain-Store Paradox,\textquotedblright\ {\sl Journal of Economic Theory} 25:92-100.

\bibitem{Selten} Selten, R. 1975. \textquotedblleft Reexamination of the Perfectness Concept for Equilibrium Points in Extensive Form Games,\textquotedblright\ {\sl International Journal of Game Theory} 4:25-55.
\end{thebibliography}

\end{document}
DBTa@D@@@@xR@A@@C@@@@rh@DBTa@A@@A@@@@@@P[JPh@`@@A@@@@nD@@@T@@@@@EB`|bNKJA@@ @@nD@@@P@@@@PKAl@@D@@@@tR@K@PA@@@@THP}Atc@D@@@@xR@A@PC@@@@rhP}Atc@A@@A@@@@@ @P[JPh@p@`L@P@@@@`KA@@@E@@@@Pa@HohsbR@@@@`@AD@@D@@@@\R@OA@@@@mDpA@P@@@@PK Ad@@G@@@@lAAKB`a@@@@OA@@@@mD`A@P@@@@PKA`@@D@@@@tR@G@@A@@@@mDPB@\@@@@pFDxH @UAp@@@@@D@@@@tR@F@@A@@@@mD@B@L@@@@`G@\@@@@`EDpH@UAp@@@@@D@@@@HP@A@@A@@@@mD `B@P@@@@PKAh@@E@@@@d`@@@@@@P@@@@`KA@P@E@@@@Pa@a@@@@P@@@@`KAD@@L@@@@HcBa@@@@ D@@D@@@@@@@mi@aB@B@D@@@@xR@@@PA@@@@TH@rKzlhD@@@@xR@@@@A@@@@mDpB@P@@@@PKAl@@ E@@@@Pa@F@@J@P@@@@`KAD@@M@@@@HcBF@@J@D@@D@@@@@@@mi@aBTF@l@@A@@@@nD@@@T@@@@@ EB`|bNKJA@@@@BDP@@P@@@@pIA|D@@@@tR@G@@A@@@@mDPB@\@@@@pFD|H@\J`A@Ta@D@@@@t R@F@@A@@@@mD@B@P@@@@PKA\@@D@@@@tR@I@pA@@@@[P`c@tg@C@`HBP@@@@PKAX@@D@@@@tR@H @p@@@@@^@pA@@@@VP@c@tg@C@`HBP@@@@`@AD@@D@@@@tR@J@@A@@@@mD`B@T@@@@PBB@@@@@@A @@@@nD@@AT@@@@@EBDB@bH@A@@@@nDP@@p@@@@`LJDB@bHP@@P@@@@@@@tfBDJ@H@P@@@@`KA@@ @E@@@@Pa@HohsbR@@@@`KA@@@D@@@@tR@K@@A@@@@mDpB@T@@@@@EBX@@JI@A@@@@nDP@@t@@@@ `LJX@@JIP@@P@@@@@@@tfBDJ@Y@HC@D@@@@xR@@@PA@@@@TH@rKzlhD@@@@HP@A@@A@@@@gDp S@@@@PKA\@@D@@@@tR@I@pA@@@@[PPe@xKAJ@pMDP@@@@PKAX@@D@@@@tR@H@@A@@@@mDpA@P@@ @@PKAd@@G@@@@lAANBPfDL@@DQ@A@@@@mD`A@P@@@@PKA`@@C@@@@xA@G@@@@XAALBPfDL@@DQ@ A@@@@BDP@@P@@@@PKAh@@D@@@@tR@J@PA@@@@IH@@@@@@D@@@@xR@@DPA@@@@THPH@PDAD@@@@x R@A@@C@@@@rhPH@PDAA@@A@@@@@@P[JPh@`@@A@@@@nD@@@T@@@@@EB`|bNKJA@@@@nD@@@P@@@ @PKAl@@D@@@@tR@K@PA@@@@TH`A@pFAD@@@@xR@A@PC@@@@rh`A@pFAA@@A@@@@@@P[JPh@cA@K @P@@@@`KA@@@E@@@@Pa@HohsbR@@@@`@AD@@D@@@@\R@OA@@@@mDpA@P@@@@PKAd@@G@@@@lA AOB@uFX@@MY@A@@@@mD`A@P@@@@PKA`@@D@@@@tR@G@@A@@@@mDPB@\@@@@pFDDI@vZ`A@teAD@ @@@tR@F@@A@@@@mD@B@L@@@@`G@\@@@@`ED|H@vZ`A@teAD@@@@HP@A@@A@@@@mD`B@P@@@@PKA h@@E@@@@d`@@@@@@P@@@@`KA@P@E@@@@Pa@d@pVFP@@@@`KAD@@L@@@@HcBd@pVFD@@D@@@@@@@ mi@aB@B@D@@@@xR@@@PA@@@@TH@rKzlhD@@@@xR@@@@A@@@@mDpB@P@@@@PKAl@@E@@@@Pa@I@p `FP@@@@`KAD@@M@@@@HcBI@p`FD@@D@@@@@@@mi@aBHF@r@@A@@@@nD@@@T@@@@@EB`|bNKJA@@ @@BDP@@P@@@@pIA|D@@@@tR@G@@A@@@@mDPB@\@@@@pFD|H@vc`A@|FBD@@@@tR@F@@A@@@@m 
D@B@P@@@@PKA\@@D@@@@tR@I@pA@@@@[P`c@XLBC@P\HP@@@@PKAX@@D@@@@tR@H@p@@@@@^@pA @@@@VP@c@XLBC@P\HP@@@@`@AD@@D@@@@tR@J@@A@@@@mD`B@T@@@@PBB@@@@@@A@@@@nD@@AT@ @@@@EBDB@qa@A@@@@nDP@@p@@@@`LJDB@qaP@@P@@@@@@@tfBDJ@H@P@@@@`KA@@@E@@@@Pa@Ho hsbR@@@@`KA@@@D@@@@tR@K@@A@@@@mDpB@T@@@@@EBX@@Yb@A@@@@nDP@@t@@@@`LJX@@YbP@@ P@@@@@@@tfBDJPX@pB@D@@@@xR@@@PA@@@@TH@rKzlhD@@@@HP@A@@A@@@@gDpS@@@@PKA\@@ D@@@@tR@I@pA@@@@[PPd@@_BL@PGIP@@@@PKAX@@D@@@@tR@H@@A@@@@mDpA@P@@@@PKAd@@G@@ @@lAAcB@`I`A@ed@A@@@@mD`A@P@@@@PKA`@@C@@@@xA@G@@@@XAAaB@`I`A@ed@A@@@@BDP@@P @@@@PKAh@@D@@@@tR@J@PA@@@@IH@@@@@@D@@@@xR@@DPA@@@@TH`M@TRBD@@@@xR@A@@C@@@@r h`M@TRBA@@A@@@@@@P[JPh@`@@A@@@@nD@@@T@@@@@EB`|bNKJA@@@@nD@@@TA@@@p~Bpy@@@@ @@@@PFP@@@@@D@D@RPUZmUv\`xTYwAbTouVXnA@@@P@@@@PKAp@@D@@@@tR@L@PA@@@@THpF@tT BD@@@@xR@A@PC@@@@rhpF@tTBA@@A@@@@@@P[JPh@p@`L@P@@@@`KA@@@E@@@@Pa@HohsbR@@@@ `@AD@@D@@@@\R@OA@@@@mDpA@P@@@@PKAd@@G@@@@lAAcB`fI`A@We@A@@@@mD`A@P@@@@PKA `@@C@@@@xA@G@@@@XAAaB`fI`A@We@A@@@@BDP@@P@@@@PKAh@@D@@@@tR@J@PA@@@@IH@@@@@@ D@@@@xR@@DPA@@@@TH`M@`UBD@@@@xR@A@@C@@@@rh`M@`UBA@@A@@@@@@P[JPh@`@@A@@@@nD@ @@T@@@@@EB`|bNKJA@@@@nD@@@P@@@@PKAp@@D@@@@tR@L@PA@@@@THpF@@XBD@@@@xR@A@PC@@ @@rhpF@@XBA@@A@@@@@@P[JPh@n@PF@P@@@@`KA@@@E@@@@Pa@HohsbR@@@@`@AD@@D@@@@\R@ OA@@@@mDpA@P@@@@PKAd@@G@@@@lAAcBPsI`A@re@A@@@@mD`A@P@@@@PKA`@@C@@@@xA@G@@@ @XAAaB@sI`A@re@A@@@@BDP@@P@@@@PKAh@@D@@@@tR@J@PA@@@@IH@@@@@@D@@@@xR@@DPA@@@ @TH`M@HWBD@@@@xR@A@@C@@@@rh`M@HWBA@@A@@@@@@P[JPh@`@@A@@@@nD@@@T@@@@@EB`|bNK JA@@@@nD@@@P@@@@PKAp@@D@@@@tR@L@PA@@@@THpF@hYBD@@@@xR@A@PC@@@@rhpF@hYBA@@A@ @@@@@P[JPh@q@`L@P@@@@`KA@@@E@@@@Pa@HohsbR@@@@`@AD@@D@@@@\R@OA@@@@mDpA@P@@ @@PKAd@@G@@@@lAAUF@cI@Q@yb@A@@@@mD`A@P@@@@PKA`@@D@@@@tR@G@@A@@@@mDPB@\@@@@p FD\Z@\d@GADLBD@@@@tR@F@@A@@@@mD@B@L@@@@`G@\@@@@`EDTZ@\d@GADLBD@@@@HP@A@@A@@ @@mD`B@P@@@@PKAh@@E@@@@d`@@@@@@P@@@@`KA@P@E@@@@Pa@zDPpHP@@@@`KAD@@L@@@@HcBz DPpHD@@D@@@@@@@mi@aB@B@D@@@@xR@@@PA@@@@TH@rKzlhD@@@@xR@@@@A@@@@mD@C@P@@@@PK Ap@@E@@@@Pa@_DPzHP@@@@`KAD@@M@@@@HcB_DPzHD@@D@@@@@@@mi@aB@C@r@@A@@@@nD@@@T@ 
@@@@EB`|bNKJA@@@@BDP@@P@@@@pIA|D@@@@tR@G@@A@@@@mDPB@\@@@@pFD\Z@vd@GALOBD@ @@@tR@F@@A@@@@mD@B@L@@@@`G@\@@@@`EDTZ@vd@GALOBD@@@@HP@A@@A@@@@mD`B@P@@@@PKA h@@E@@@@d`@@@@@@P@@@@`KA@P@E@@@@Pa@zD@}HP@@@@`KAD@@L@@@@HcBzD@}HD@@D@@@@@@@ mi@aB@B@D@@@@xR@@@PA@@@@TH@rKzlhD@@@@xR@@@@A@@@@mD@C@P@@@@PKAp@@E@@@@Pa@_D@ GIP@@@@`KAD@@M@@@@HcB_D@GID@@D@@@@@@@mi@aBxB@Y@@A@@@@nD@@@T@@@@@EB`|bNKJA@@ @@BDP@@P@@@@pIA|D@@@@tR@G@@A@@@@mDPB@\@@@@pFD\Z@ie@GAxPBD@@@@tR@F@@A@@@@m D@B@L@@@@`G@\@@@@`EDTZ@he@GAxPBD@@@@HP@A@@A@@@@mD`B@P@@@@PKAh@@E@@@@d`@@@@@ @P@@@@`KA@P@E@@@@Pa@zD`CIP@@@@`KAD@@L@@@@HcBzD`CID@@D@@@@@@@mi@aB@B@D@@@@xR @@@PA@@@@TH@rKzlhD@@@@xR@@@@A@@@@mD@C@P@@@@PKAp@@E@@@@Pa@_D`MIP@@@@`KAD@@M@ @@@HcB_D`MID@@D@@@@@@@mi@aBdC@r@@A@@@@nD@@@T@@@@@EB`|bNKJA@@@@BDP@@P@@@@pIA |D@@@@tR@G@@A@@@@mDPB@\@@@@pFDtH@r_@B@|qAD@@@@tR@F@@A@@@@mD@B@P@@@@PKA\@@ D@@@@tR@I@pA@@@@[Ppg@HxAT@pIGP@@@@PKAX@@D@@@@tR@H@p@@@@@^@pA@@@@VPPg@HxAS@p IGP@@@@`@AD@@D@@@@tR@J@@A@@@@mD`B@T@@@@PBB@@@@@@A@@@@nD@@AT@@@@@EBDC@g\@A@@ @@nDP@@p@@@@`LJDC@g\P@@P@@@@@@@tfBDJ@H@P@@@@`KA@@@E@@@@Pa@HohsbR@@@@`KA@@@D @@@@tR@L@@A@@@@mD@C@T@@@@@EBXA@O]@A@@@@nDP@@t@@@@`LJXA@O]P@@P@@@@@@@tfBDJ@L @HC@D@@@@xR@@@PA@@@@TH@rKzlhD@@@@HP@A@@A@@@@gDpS@@@@PKA\@@D@@@@tR@I@pA@@@ @[Ppg@pyAT@PVGP@@@@PKAX@@D@@@@tR@H@p@@@@@^@pA@@@@VPPg@pyAS@PVGP@@@@`@AD@@D@ @@@tR@J@@A@@@@mD`B@T@@@@PBB@@@@@@A@@@@nD@@AT@@@@@EBDC@Z]@A@@@@nDP@@p@@@@`LJ DC@Z]P@@P@@@@@@@tfBDJ@H@P@@@@`KA@@@E@@@@Pa@HohsbR@@@@`KA@@@D@@@@tR@L@@A@@@@ mD@C@T@@@@@EBXA@B^@A@@@@nDP@@t@@@@`LJXA@B^P@@P@@@@@@@tfBDJ`K@dA@D@@@@xR@@@P A@@@@TH@rKzlhD@@@@HP@A@@A@@@@gDpS@@@@PKA\@@D@@@@tR@I@pA@@@@[Ppg@@@BT@p\GP @@@@PKAX@@D@@@@tR@H@p@@@@@^@pA@@@@VPPg@@@BS@p\GP@@@@`@AD@@D@@@@tR@J@@A@@@@m D`B@T@@@@PBB@@@@@@A@@@@nD@@AT@@@@@EBDC@s]@A@@@@nDP@@p@@@@`LJDC@s]P@@P@@@@@@ @tfBDJ@H@P@@@@`KA@@@E@@@@Pa@HohsbR@@@@`KA@@@D@@@@tR@L@@A@@@@mD@C@T@@@@@EBXA @[^@A@@@@nDP@@x@@@@`LJXA@[^`@@P@@@@@@@tfBDJPLxHC@r@@A@@@@nD@@@T@@@@@EB`|bNK JA@@@@BDP@@P@@@@pIA|D@@@@tR@G@@A@@@@mDPB@\@@@@pFD|Y@~]`FAtjAD@@@@tR@F@@A@ 
@@@mD@B@P@@@@PKA\@@D@@@@tR@I@pA@@@@[PPlA@qAfDPmFP@@@@PKAX@@D@@@@tR@H@p@@@@@ ^@pA@@@@VPpkA@qAfDPmFP@@@@`@AD@@D@@@@tR@J@@A@@@@mD`B@T@@@@PBB@@@@@@A@@@@nD@ @AT@@@@@EBPT@uZ@A@@@@nDP@@p@@@@`LJPT@uZP@@P@@@@@@@tfBDJ@H@P@@@@`KA@@@E@@@@P a@HohsbR@@@@`KA@@@D@@@@tR@L@@A@@@@mD@C@T@@@@@EBdR@][@A@@@@nDP@@t@@@@`LJdR@] [P@@P@@@@@@@tfBDJ@L@HC@D@@@@xR@@@PA@@@@TH@rKzlhD@@@@HP@A@@A@@@@gDpS@@@@PK A\@@D@@@@tR@I@pA@@@@[PPlAhrAfDpyFP@@@@PKAX@@D@@@@tR@H@p@@@@@^@pA@@@@VPpkAhr AfDpyFP@@@@`@AD@@D@@@@tR@J@@A@@@@mD`B@T@@@@PBB@@@@@@A@@@@nD@@AT@@@@@EBPT@h[ @A@@@@nDP@@p@@@@`LJPT@h[P@@P@@@@@@@tfBDJ@H@P@@@@`KA@@@E@@@@Pa@HohsbR@@@@`KA @@@D@@@@tR@L@@A@@@@mD@C@T@@@@@EBdR@P\@A@@@@nDP@@t@@@@`LJdR@P\P@@P@@@@@@@tfB DJ`K@dA@D@@@@xR@@@PA@@@@TH@rKzlhD@@@@HP@A@@A@@@@gDpS@@@@PKA\@@D@@@@tR@I@p A@@@@[PPlAxxAfDP@GP@@@@PKAX@@D@@@@tR@H@p@@@@@^@pA@@@@VPpkAxxAfDP@GP@@@@`@AD @@D@@@@tR@J@@A@@@@mD`B@T@@@@PBB@@@@@@A@@@@nD@@AT@@@@@EBPT@A\@A@@@@nDP@@p@@@ @`LJPT@A\P@@P@@@@@@@tfBDJ@H@P@@@@`KA@@@E@@@@Pa@HohsbR@@@@`KA@@@D@@@@tR@L@@A @@@@mD@C@T@@@@@EBdR@i\@A@@@@nDP@@x@@@@`LJdR@i\`@@P@@@@@@@tfBDJ@NrHC@r@@A@@@ @nD@@@T@@@@@EB`|bNKJA@@@@BDP@@P@@@@pIA|D@@@@tR@G@@A@@@@mDPB@\@@@@pFD\H@BX `@@|OAD@@@@tR@F@@A@@@@mD@B@P@@@@PKA\@@D@@@@tR@I@pA@@@@[PPf@HVAN@pAEP@@@@PKA X@@D@@@@tR@H@p@@@@@^@pA@@@@VPpe@HVAN@pAEP@@@@`@AD@@D@@@@tR@J@@A@@@@mD`B@T@@ @@PBB@@@@@@A@@@@nD@@AT@@@@@EBpB@GT@A@@@@nDP@@p@@@@`LJpB@GTP@@P@@@@@@@tfBDJ@ H@P@@@@`KA@@@E@@@@Pa@HohsbR@@@@`KA@@@D@@@@tR@L@@A@@@@mD@C@T@@@@@EBDA@oT@A@@ @@nDP@@t@@@@`LJDA@oTP@@P@@@@@@@tfBDJ@L@HC@D@@@@xR@@@PA@@@@TH@rKzlhD@@@@HP@A @@A@@@@gDpS@@@@PKA\@@D@@@@tR@I@pA@@@@[PPf@pWAN@PNEP@@@@PKAX@@D@@@@tR@H@p@ @@@@^@pA@@@@VPpe@pWAN@PNEP@@@@`@AD@@D@@@@tR@J@@A@@@@mD`B@T@@@@PBB@@@@@@A@@@ @nD@@AT@@@@@EBpB@zT@A@@@@nDP@@p@@@@`LJpB@zTP@@P@@@@@@@tfBDJ@H@P@@@@`KA@@@E@ @@@Pa@HohsbR@@@@`KA@@@D@@@@tR@L@@A@@@@mD@C@T@@@@@EBDA@bU@A@@@@nDP@@t@@@@`LJ DA@bUP@@P@@@@@@@tfBDJ`K@dA@D@@@@xR@@@PA@@@@TH@rKzlhD@@@@HP@A@@A@@@@gDpS@@ @@PKA\@@D@@@@tR@I@pA@@@@[PPf@HaAN@pTEP@@@@PKAX@@D@@@@tR@H@p@@@@@^@pA@@@@VPp 
e@HaAN@pTEP@@@@`@AD@@D@@@@tR@J@@A@@@@mD`B@T@@@@PBB@@@@@@A@@@@nD@@AT@@@@@EBp B@SU@A@@@@nDP@@p@@@@`LJpB@SUP@@P@@@@@@@tfBDJ@H@P@@@@`KA@@@E@@@@Pa@HohsbR@@@ @`KA@@@D@@@@tR@L@@A@@@@mD@C@T@@@@@EBDA@{U@A@@@@nDP@@@A@@@`LJDA@{Up@@P@@@@@@ @tfBDJ`LtPC@r@`L@HC@D@@@@xR@@@PA@@@@TH@rKzlhD@@@@HP@A@@A@@@@gDpS@@@@PKA\@ @D@@@@tR@I@pA@@@@[PPeA@{@PDp_BP@@@@PKAX@@D@@@@tR@H@@A@@@@mDpA@P@@@@PKAd@@G@ @@@lAAgF`xBpQ@GJ@A@@@@mD`A@P@@@@PKA`@@C@@@@xA@G@@@@XAAeF`xBpQ@GJ@A@@@@BDP@@ P@@@@PKAh@@D@@@@tR@J@PA@@@@IH@@@@@@D@@@@xR@@DPA@@@@TH`NA\h@D@@@@xR@A@@C@@@@ rh`NA\h@A@@A@@@@@@P[JPh@`@@A@@@@nD@@@T@@@@@EB`|bNKJA@@@@nD@@@P@@@@PKAp@@D@@ @@tR@L@PA@@@@THpGA|j@D@@@@xR@A@PC@@@@rhpGA|j@A@@A@@@@@@P[JPh@p@`L@P@@@@`KA@ @@E@@@@Pa@HohsbR@@@@`@AD@@D@@@@\R@OA@@@@mDpA@P@@@@PKAd@@G@@@@lAAgF@BpQ@y J@A@@@@mD`A@P@@@@PKA`@@C@@@@xA@G@@@@XAAeF@BpQ@yJ@A@@@@BDP@@P@@@@PKAh@@D@@@ @tR@J@PA@@@@IH@@@@@@D@@@@xR@@DPA@@@@TH`NAhk@D@@@@xR@A@@C@@@@rh`NAhk@A@@A@@@ @@@P[JPh@`@@A@@@@nD@@@T@@@@@EB`|bNKJA@@@@nD@@@P@@@@PKAp@@D@@@@tR@L@PA@@@@TH pGAHn@D@@@@xR@A@PC@@@@rhpGAHn@A@@A@@@@@@P[JPh@n@PF@P@@@@`KA@@@E@@@@Pa@Hohsb R@@@@`@AD@@D@@@@\R@OA@@@@mDpA@P@@@@PKAd@@G@@@@lAAgF`KCpQ@SK@A@@@@mD`A@P@@ @@PKA`@@C@@@@xA@G@@@@XAAeF`KCpQ@SK@A@@@@BDP@@P@@@@PKAh@@D@@@@tR@J@PA@@@@IH@ @@@@@D@@@@xR@@DPA@@@@TH`NALm@D@@@@xR@A@@C@@@@rh`NALm@A@@A@@@@@@P[JPh@`@@A@@ @@nD@@@T@@@@@EB`|bNKJA@@@@nD@@@P@@@@PKAp@@D@@@@tR@L@PA@@@@THpGAlo@D@@@@xR@A @PC@@@@rhpGAlo@A@@A@@@@@@P[JPh@q@`L@P@@@@`KA@@@E@@@@Pa@HohsbR@@@@`@AD@@D@@@ @\R@OA@@@@mDpA@P@@@@PKAd@@G@@@@lAA[F`kEXQ@[R@A@@@@mD`A@P@@@@PKA`@@D@@@@tR @G@@A@@@@mDPB@\@@@@pFDtZ@~S`HALJAD@@@@tR@F@@A@@@@mD@B@L@@@@`G@\@@@@`EDlZ@~S PHALJAD@@@@HP@A@@A@@@@mD`B@P@@@@PKAh@@E@@@@d`@@@@@@P@@@@`KA@P@E@@@@Pa@DphD P@@@@`KAD@@L@@@@HcBDphDD@@D@@@@@@@mi@aB@B@D@@@@xR@@@PA@@@@TH@rKzlhD@@@@xR@ @@@A@@@@mD@C@P@@@@PKAp@@E@@@@Pa@dDprDP@@@@`KAD@@M@@@@HcBdDprDD@@D@@@@@@@mi@ aB@C@r@@A@@@@nD@@@T@@@@@EB`|bNKJA@@@@BDP@@P@@@@pIA|D@@@@tR@G@@A@@@@mDPB@\ @@@@pFDtZ@XT`HATMAD@@@@tR@F@@A@@@@mD@B@L@@@@`G@\@@@@`EDlZ@XTPHATMAD@@@@HP@A 
@@A@@@@mD`B@P@@@@PKAh@@E@@@@d`@@@@@@P@@@@`KA@P@E@@@@Pa@D`uDP@@@@`KAD@@L@@@ @HcBD`uDD@@D@@@@@@@mi@aB@B@D@@@@xR@@@PA@@@@TH@rKzlhD@@@@xR@@@@A@@@@mD@C@P@ @@@PKAp@@E@@@@Pa@dD`DP@@@@`KAD@@M@@@@HcBdD`DD@@D@@@@@@@mi@aBxB@Y@@A@@@@nD @@@T@@@@@EB`|bNKJA@@@@BDP@@P@@@@pIA|D@@@@tR@G@@A@@@@mDPB@\@@@@pFDtZ@nV`HA |NAD@@@@tR@F@@A@@@@mD@B@L@@@@`G@\@@@@`EDlZ@nVPHA|NAD@@@@HP@A@@A@@@@mD`B@P@@ @@PKAh@@E@@@@d`@@@@@@P@@@@`KA@P@E@@@@Pa@Dp{DP@@@@`KAD@@L@@@@HcBDp{DD@@D@@ @@@@@mi@aB@B@D@@@@xR@@@PA@@@@TH@rKzlhD@@@@xR@@@@A@@@@mD@C@P@@@@PKAp@@E@@@@P a@dDpEEP@@@@`KAD@@P@@@@HcBdDpEEL@@D@@@@@@@mi@aB\SMv@`L@HC@r@@A@@@@nD@@@T@@@ @@EB`|bNKJA@@@@BDP@@P@@@@pIA|D@@@@tR@G@@A@@@@mDPB@\@@@@pFD\H@NP`@@Do@D@@@ @tR@F@@A@@@@mD@B@P@@@@PKA\@@D@@@@tR@I@pA@@@@[PPf@Pu@N@P~BP@@@@PKAX@@D@@@@tR @H@p@@@@@^@pA@@@@VPpe@Pu@N@P~BP@@@@`@AD@@D@@@@tR@J@@A@@@@mD`B@T@@@@PBB@@@@@ @A@@@@nD@@AT@@@@@EBpB@yK@A@@@@nDP@@p@@@@`LJpB@yKP@@P@@@@@@@tfBDJ@H@P@@@@`KA @@@E@@@@Pa@HohsbR@@@@`KA@@@D@@@@tR@L@@A@@@@mD@C@T@@@@@EBDA@aL@A@@@@nDP@@t@@ @@`LJDA@aLP@@P@@@@@@@tfBDJ@L@HC@D@@@@xR@@@PA@@@@TH@rKzlhD@@@@HP@A@@A@@@@gDp S@@@@PKA\@@D@@@@tR@I@pA@@@@[PPf@xv@N@pJCP@@@@PKAX@@D@@@@tR@H@p@@@@@^@pA@@ @@VPpe@xv@N@pJCP@@@@`@AD@@D@@@@tR@J@@A@@@@mD`B@T@@@@PBB@@@@@@A@@@@nD@@AT@@@ @@EBpB@lL@A@@@@nDP@@p@@@@`LJpB@lLP@@P@@@@@@@tfBDJ@H@P@@@@`KA@@@E@@@@Pa@Hohs bR@@@@`KA@@@D@@@@tR@L@@A@@@@mD@C@T@@@@@EBDA@TM@A@@@@nDP@@t@@@@`LJDA@TMP@@P@ @@@@@@tfBDJ`K@dA@D@@@@xR@@@PA@@@@TH@rKzlhD@@@@HP@A@@A@@@@gDpS@@@@PKA\@@D@ @@@tR@I@pA@@@@[PPf@@z@N@PQCP@@@@PKAX@@D@@@@tR@H@p@@@@@^@pA@@@@VPpe@@z@N@PQC P@@@@`@AD@@D@@@@tR@J@@A@@@@mD`B@T@@@@PBB@@@@@@A@@@@nD@@AT@@@@@EBpB@EM@A@@@@ nDP@@p@@@@`LJpB@EMP@@P@@@@@@@tfBDJ@H@P@@@@`KA@@@E@@@@Pa@HohsbR@@@@`KA@@@D@@ @@tR@L@@A@@@@mD@C@T@@@@@EBDA@mM@A@@@@nDP@@t@@@@`LJDA@mMP@@P@@@@@@@tfBDJPN@H C@D@@@@xR@@@PA@@@@TH@rKzlhD@@@@HP@A@@A@@@@gDpS@@@@PKAP@@D@@@@@_@E@@A@@@@p GpA@P@@@@@|Ad@@D@@@@@_@J@@A@@@@pGpB@P@@@@@|Ap@@D@@@@tR@F@PB@@@@zK@@@@@@@@@@ @@@@b@@A@@@@mDPA@P@@@@pIA|H@@@@XbAO@`A@|_@@P@@@@PKA@@@C@@@@@@@ %%%%%%%%%%%%%%%%%%%%%% End /document/IGBA0M02.wmf 
%%%%%%%%%%%%%%%%%%%%% %%%%%%%%%%%%%%%%%%%%% Start /document/IGBA0M01.wmf %%%%%%%%%%%%%%%%%%%% WwlqZB@@@@@@@@@@@@@{I@@@@@@@@H@@I@@@C`BH@@PB@XAD@@@@@XAD@@`IF|@@b@rUMYtPA@@ @@@@@A@pUBA@@@@`@@@@@@@B@@P`B@@@Aj@@@A@@@@`E@@@`B@@@@G@@@@D`@@@`v@@@@@@@@@@ @@@@`{j@@@_JA@@@RQMYD@@D@@DhB@@H_@@@`A@@@@@@@@@@@@@@@@@@@@@P@@@@p@@@@v@@@@b B@@@DB@@@@B@@@@b@@@@p@@@@pa@@@@`@@@@`H@@@@L@@@@|oB@@@@P@@@@@@@@@@@ @@@@a@@@@`@@@@@L@@@@L@@@@|@@@@XI@@@@L@@@@\@@@@XI@@@@L@@@@@@@@@hH@@@@L@@@@| oB@@@@P@@@@@@@@@@@@@@@a@@@@`@@@@PI@@@@L@@@@\@@@@XI@@@@L@@@@@@@@@hH@@@@L@ @@@|oB@@@@P@@@@tO@@@@@a@@@@`@@@@PI@@@@L@@@@\@@@@XI@@@@L@@@@@@@@@hH@ @@@L@@@@|_H@@@@H@@@@xA@@@@F@@@@@@@@@@@@@@`AB@@@aC@@@TB@@@@C@@@@G@@@@VB@ @@@C@@@@@@@@@VB@@@@C@@@@M@@@@JB@@@@C@@@@GB@@@@B@@@@^@@@@`A@@@@@@@@@@@@ @@X`@@@Px@@@@X@@@@p@@@@pO@Y@@@@p@@@@@@@@@@f@@@@pA@@@P@@@@@@@@@@@@@@@@@@@ @@@@@@@TB@@@@C@@@@A@@@@TB@@@@C@@@@@@@@@nB@@@@F@@@@G@@@@\@@@@`A@@@ZC@@@TB@@ @@C@@@@G@@@@VB@@@@C@@@@@@@@@VB@@@@C@@@@M@@@@JB@@@@C@@@@GB@@@@B@@@@Y@@@ @p@@@@@@@@@@X@@@@p@@@@pO@^@@@@`A@@@@@@@@@@@@@@X`@@@Px@@@@f@@@@pA@@@`@@@@ @E@@@@@@@@@@@@@@@@@@@@TB@@@@C@@@@B@@@@`B@@@@C@@@@A@@@@TB@@@@C@@@@B@@@@\B@@@ @F@@@@A@@@@D@@@@@@@@@@@BB@@TB@@@@C@@@@A@@@@XE@@@@L@@@@U@@@@xA@@@@iA@@@OB@@@ T@@@@`D@xA@aF`G@DZ@OB`D@|H@R@`G@XB@@@@G@@@@C@@@@@@@@@P@@@@@@@@@@@H`@BPI@@@@ L@@@@L@@@@@J@@@@L@@@@H@@@@PF@@@@L@@@@|CpF@@@@P@@@@HA@@@`G@@@@e@@@@p@@@@p @@@@@v@@@@@A@@@PhA@@@^@@@@XC@@@@D@@@@aF@@@|H@@@`M@@@@P@@@@HA@@@pc@@@@v@@@@@ A@@@`D@@@@^@@@@XB@@@@G@@@@B@@@@@@@@@@@@@@@@@@@@@@@@@PI@@@@L@@@@H@@@@@J@@@@L @@@@L@@@@pF@@@@P@@@@DZ@@@`G@@@@e@@@@p@@@@`@@@@@v@@@@@A@@@PhA@@@OB@@@lA@@@@D @@@@aF@@@|H@@@`M@@@@P@@@@XZ@@@pc@@@@[@@@@@A@@@PhA@@@xA@@@XC@@@@D@@@@fF@@@`G @@@pF@@@@P@@@@DZ@@@`X@@@@v@@@@@A@@@`iA@@@bA@@@lA@@@@D@@@@aF@@@lD@@@`M@@@@P@ @@@XZ@@@pR@@@@[@@@@@A@@@PhA@@@u@@@@XC@@@@D@@@@fF@@@TC@@@pF@@@@P@@@@DZ@@@`G@ @@@v@@@@@A@@@`iA@@@^@@@@lA@@@@D@@@@R@@@@|H@@@`M@@@@P@@@@DZ@@@pc@@@@e@@@@p@@ @@pA@@@`e@@@@p@@@@@@@@@`e@@@@p@@@@PC@@@`b@@@@p@@@@pa@@@@`@@@@@F@@@@L@@ 
@@|C`G@@@@X@@@@\@@@@pA@@@@G@@@hM@@@PI@@@@L@@@@\@@@@XI@@@@L@@@@@@@@@XI@@ @@L@@@@t@@@@hH@@@@L@@@@|_H@@@@H@@@@`A@@@@C@@@@@xA@@@@F@@@@N@@@@hA@@ @`iA@@@VB@@@TB@@@@C@@@@G@@@@VB@@@@C@@@@@@@@@VB@@@@C@@@@M@@@@JB@@@@C@@@@ GB@@@@B@@@@X@@@@p@@@@pO@^@@@@`A@@@`C@@@@Z@@@@XZ@@@pd@@@@g@@@@`A@@@p@@@@ @A@@@@@@@@@@``@@@e@@@@p@@@@p@@@@@h@@@@p@@@@P@@@@@f@@@@pA@@@P@@@@@@@@@@D@@@@ @@@@@@@@@`@TB@@@@C@@@@A@@@@`B@@@@C@@@@B@@@@lA@@@@D@@@@UF@@@PH@@@PI@@@@L@@@@ D@@@@PI@@@@L@@@@L@@@@`M@@@@P@@@@pW@@@p^@@@@[@@@@@A@@@@_A@@@{A@@@XC@@@@D@@@@ cE@@@LG@@@pF@@@@P@@@@LV@@@p\@@@@v@@@@@A@@@`RA@@@i@@@@lA@@@@D@@@@JE@@@dB@@@` M@@@@P@@@@DS@@@`L@@@@[@@@@@A@@@PLA@@@r@@@@XC@@@@D@@@@XD@@@hC@@@pF@@@@P@@@@` Q@@@`N@@@@v@@@@@A@@@p@@@@i@@@@lA@@@@D@@@@C@@@dB@@@`M@@@@P@@@@XN@@@`L@@@@[ @@@@@A@@@`y@@@@r@@@@XC@@@@D@@@@MC@@@hC@@@pF@@@@P@@@@tL@@@`N@@@@v@@@@@A@@@@m @@@@i@@@@lA@@@@D@@@@tB@@@dB@@@`M@@@@P@@@@lI@@@`L@@@@[@@@@@A@@@pf@@@@r@@@@XC @@@@D@@@@BB@@@hC@@@pF@@@@P@@@@HH@@@`N@@@@v@@@@@A@@@PZ@@@@i@@@@lA@@@@D@@@@iA @@@dB@@@`M@@@@P@@@@@E@@@`L@@@@[@@@@@A@@@@T@@@@r@@@@XC@@@@D@@@@w@@@@hC@@@pF@ @@@P@@@@\C@@@`N@@@@v@@@@@A@@@`G@@@@i@@@@\B@@@@F@@@@B@@@@@@@@@@@@@H@@@@@@TB@ @@@C@@@@B@@@@`B@@@@C@@@@C@@@@`A@@@@C@@@@@@@`@dA@@@@C@@@@@@@@@TB@@@@C@@@@B@@ @@XE@@@@L@@@@TF@@@@H@@@@gA@@@HB@@@T@@@@PeA@H@YF@a@TY@HBPdAPH@UF@`@lA@@@@D@@ @@UF@@@PH@@@PI@@@@L@@@@\@@@@XI@@@@L@@@@@@@@@XI@@@@L@@@@t@@@@hH@@@@L@@@@| _H@@@@H@@@@dA@@@@C@@@@@@@@@`A@@@@C@@@@@@@`@xA@@@@F@@@@N@@@@hA@@@`iA@@@SB@@ @TB@@@@C@@@@A@@@@TB@@@@C@@@@B@@@@XE@@@@L@@@@{E@@@\G@@@p`A@@@A@@@T@@@@@_A\G @@Fp^@pW@A@^AlG@|Ep]@lA@@@@D@@@@|E@@@lG@@@PI@@@@L@@@@\@@@@XI@@@@L@@@@@@@@@ XI@@@@L@@@@t@@@@hH@@@@L@@@@|_H@@@@H@@@@dA@@@@C@@@@@@@@@`A@@@@C@@@@@@@`@ xA@@@@F@@@@N@@@@hA@@@`iA@@@SB@@@TB@@@@C@@@@A@@@@TB@@@@C@@@@B@@@@XE@@@@L@@@@ bE@@@|F@@@`ZA@@@wA@@@T@@@@pXA|F@gEp\@LV@wApWALG@cEp[@lA@@@@D@@@@cE@@@LG@@@P I@@@@L@@@@\@@@@XI@@@@L@@@@@@@@@XI@@@@L@@@@t@@@@hH@@@@L@@@@|_H@@@@H@@@@d A@@@@C@@@@@@@@@`A@@@@C@@@@@@@`@xA@@@@F@@@@N@@@@hA@@@`iA@@@SB@@@TB@@@@C@@@@A 
@@@@TB@@@@C@@@@B@@@@XE@@@@L@@@@IE@@@TB@@@PTA@@@m@@@@T@@@@`RATB@NEPJ@hT@m@`Q AdB@JEPI@lA@@@@D@@@@JE@@@dB@@@PI@@@@L@@@@\@@@@XI@@@@L@@@@@@@@@XI@@@@L@@@@t@ @@@hH@@@@L@@@@|_H@@@@H@@@@dA@@@@C@@@@@@@@@`A@@@@C@@@@@@@`@xA@@@@F@@@@N@ @@@hA@@@`iA@@@SB@@@TB@@@@C@@@@A@@@@TB@@@@C@@@@B@@@@XE@@@@L@@@@pD@@@xB@@@@NA @@@v@@@@T@@@@PLAxB@uD`L@DS@v@PKAHC@qD`K@lA@@@@D@@@@qD@@@HC@@@PI@@@@L@@@@\@@ @@XI@@@@L@@@@@@@@@XI@@@@L@@@@t@@@@hH@@@@L@@@@|_H@@@@H@@@@dA@@@@C@@@@@@@ @@`A@@@@C@@@@@@@`@xA@@@@F@@@@N@@@@hA@@@`iA@@@SB@@@TB@@@@C@@@@A@@@@TB@@@@C@@ @@B@@@@XE@@@@L@@@@WD@@@XC@@@pGA@@@~@@@@T@@@@@FAXC@\D`N@`Q@~@@EAhC@XD`M@lA@@ @@D@@@@XD@@@hC@@@PI@@@@L@@@@\@@@@XI@@@@L@@@@@@@@@XI@@@@L@@@@t@@@@hH@@@@L@@@ @|_H@@@@H@@@@dA@@@@C@@@@@@@@@`A@@@@C@@@@@@@`@xA@@@@F@@@@N@@@@hA@@@`iA@@ @SB@@@TB@@@@C@@@@A@@@@TB@@@@C@@@@B@@@@XE@@@@L@@@@~C@@@TB@@@`AA@@@m@@@@T@@@@ p@TB@CDPJ@|O@m@p~@dB@CPI@lA@@@@D@@@@C@@@dB@@@PI@@@@L@@@@\@@@@XI@@@@L@@@@ @@@@@XI@@@@L@@@@t@@@@hH@@@@L@@@@|_H@@@@H@@@@dA@@@@C@@@@@@@@@`A@@@@C@@@@ @@@`@xA@@@@F@@@@N@@@@hA@@@`iA@@@SB@@@TB@@@@C@@@@A@@@@TB@@@@C@@@@B@@@@XE@@@@ L@@@@eC@@@xB@@@P{@@@@v@@@@T@@@@`y@xB@jC`L@XN@v@`x@HC@fC`K@lA@@@@D@@@@fC@@@H C@@@PI@@@@L@@@@\@@@@XI@@@@L@@@@@@@@@XI@@@@L@@@@t@@@@hH@@@@L@@@@|_H@@@@H @@@@dA@@@@C@@@@@@@@@`A@@@@C@@@@@@@`@xA@@@@F@@@@N@@@@hA@@@`iA@@@SB@@@TB@@@@C @@@@A@@@@TB@@@@C@@@@B@@@@XE@@@@L@@@@LC@@@XC@@@@u@@@@~@@@@T@@@@Ps@XC@QC`N@tL @~@Pr@hC@MC`M@lA@@@@D@@@@MC@@@hC@@@PI@@@@L@@@@\@@@@XI@@@@L@@@@@@@@@XI@@@@L@ @@@t@@@@hH@@@@L@@@@|_H@@@@H@@@@dA@@@@C@@@@@@@@@`A@@@@C@@@@@@@`@xA@@@@F@ @@@N@@@@hA@@@`iA@@@SB@@@TB@@@@C@@@@A@@@@TB@@@@C@@@@B@@@@XE@@@@L@@@@sB@@@TB@ @@pn@@@@m@@@@T@@@@@m@TB@xBPJ@PK@m@@l@dB@tBPI@lA@@@@D@@@@tB@@@dB@@@PI@@@@L@@ @@\@@@@XI@@@@L@@@@@@@@@XI@@@@L@@@@t@@@@hH@@@@L@@@@|_H@@@@H@@@@dA@@@@C@@ @@@@@@@`A@@@@C@@@@@@@`@xA@@@@F@@@@N@@@@hA@@@`iA@@@SB@@@TB@@@@C@@@@A@@@@TB@@ @@C@@@@B@@@@XE@@@@L@@@@ZB@@@xB@@@`h@@@@v@@@@T@@@@pf@xB@_B`L@lI@v@pe@HC@[B`K @lA@@@@D@@@@[B@@@HC@@@PI@@@@L@@@@\@@@@XI@@@@L@@@@@@@@@XI@@@@L@@@@t@@@@hH@@@ 
@L@@@@|_H@@@@H@@@@dA@@@@C@@@@@@@@@`A@@@@C@@@@@@@`@xA@@@@F@@@@N@@@@hA@@@ `iA@@@SB@@@TB@@@@C@@@@A@@@@TB@@@@C@@@@B@@@@XE@@@@L@@@@AB@@@XC@@@Pb@@@@~@@@@ T@@@@``@XC@FB`N@HH@~@`_@hC@BB`M@lA@@@@D@@@@BB@@@hC@@@PI@@@@L@@@@\@@@@XI@@@@ L@@@@@@@@@XI@@@@L@@@@t@@@@hH@@@@L@@@@|_H@@@@H@@@@dA@@@@C@@@@@@@@@`A@@@@ C@@@@@@@`@xA@@@@F@@@@N@@@@hA@@@`iA@@@SB@@@TB@@@@C@@@@A@@@@TB@@@@C@@@@B@@@@X E@@@@L@@@@hA@@@TB@@@@\@@@@m@@@@T@@@@PZ@TB@mAPJ@dF@m@PY@dB@iAPI@lA@@@@D@@@@i A@@@dB@@@PI@@@@L@@@@\@@@@XI@@@@L@@@@@@@@@XI@@@@L@@@@t@@@@hH@@@@L@@@@|_H @@@@H@@@@dA@@@@C@@@@@@@@@`A@@@@C@@@@@@@`@xA@@@@F@@@@N@@@@hA@@@`iA@@@SB@@@TB @@@@C@@@@A@@@@TB@@@@C@@@@B@@@@XE@@@@L@@@@OA@@@xB@@@pU@@@@v@@@@T@@@@@T@xB@TA `L@@E@v@@S@HC@PA`K@lA@@@@D@@@@PA@@@HC@@@PI@@@@L@@@@\@@@@XI@@@@L@@@@@@@@@XI@ @@@L@@@@t@@@@hH@@@@L@@@@|_H@@@@H@@@@dA@@@@C@@@@@@@@@`A@@@@C@@@@@@@`@xA@ @@@F@@@@N@@@@hA@@@`iA@@@SB@@@TB@@@@C@@@@A@@@@TB@@@@C@@@@B@@@@XE@@@@L@@@@v@@ @@XC@@@`O@@@@~@@@@T@@@@pM@XC@{@`N@\C@~@pL@hC@w@`M@lA@@@@D@@@@w@@@@hC@@@PI@@ @@L@@@@\@@@@XI@@@@L@@@@@@@@@XI@@@@L@@@@t@@@@hH@@@@L@@@@|_H@@@@H@@@@dA@@ @@C@@@@@@@@@`A@@@@C@@@@@@@`@xA@@@@F@@@@N@@@@hA@@@`iA@@@SB@@@TB@@@@C@@@@A@@@ @TB@@@@C@@@@B@@@@XE@@@@L@@@@]@@@@TB@@@PI@@@@m@@@@T@@@@`G@TB@b@PJ@xA@m@`F@dB @^@PI@lA@@@@D@@@@^@@@@dB@@@PI@@@@L@@@@\@@@@XI@@@@L@@@@@@@@@XI@@@@L@@@@t@@@@ hH@@@@L@@@@|_H@@@@H@@@@dA@@@@C@@@@@@@@@`A@@@@C@@@@@@@`@xA@@@@F@@@@N@@@@ hA@@@`iA@@@SB@@@TB@@@@C@@@@G@@@@VB@@@@C@@@@@@@@@VB@@@@C@@@@M@@@@JB@@@@C@@@@ GB@@@@B@@@@Y@@@@p@@@@@@@@@@X@@@@p@@@@@@@@H@^@@@@`A@@@`C@@@@Z@@@@XZ@@@` e@@@@e@@@@p@@@@pA@@@`e@@@@p@@@@@@@@@`e@@@@p@@@@PC@@@`b@@@@p@@@@pa@@@@` @@@@PF@@@@L@@@@@@@@@@F@@@@L@@@@@@@@B`G@@@@X@@@@@@@@@@@@@@@FH@@@DN@@@PI@@@@L @@@@\@@@@XI@@@@L@@@@@@@@@XI@@@@L@@@@t@@@@hH@@@@L@@@@|_H@@@@H@@@@dA@@@@C @@@@@@@@@`A@@@@C@@@@@@@`@xA@@@@F@@@@@@@@@@@@@@`AB@@@aC@@@TB@@@@C@@@@G@@@@VB @@@@C@@@@@@@@@VB@@@@C@@@@M@@@@JB@@@@C@@@@GB@@@@B@@@@Y@@@@p@@@@@@@@@@X@ @@@p@@@@@@@@H@^@@@@`A@@@@@@@@@@@@@@X`@@@Px@@@@X@@@@p@@@@@@@@@@Y@@@@p@@@@p 
O@R@@@@p@@@@P@@@@@e@@@@p@@@@`@@@@@RA@@@pT@@@p@@@@@qC@@@@@@@@@@@@@@@@Y@ @@@@@@Pl@@@@@DD@rAPZ@DF@lA@@@@@@@D@@@p`SFhI@JxLF@@@@D@@@@@@@@@@@@@@@k@@@@@@ @@@@@@@@@@PWL~F_eBXV@@@@@@@@@@D@@AX@jBhL^q@CHhPe\uUFUyAWYiLdN\MuUPMCL\QuPIQ UYXqeYoyF]sqE]rUWYteG\eqUYuUF^q@cKtQgYA@@@@@@@@@PD@dB@eUgYbECL``BUrUWYTeG\e erPzpuTWAuLppEUCeDUeaEWf}f[tMGWtIW]eQW^pUFWeA@@bECLnPG]@K@@@@P@@h`s@C@@@HB@ @@`@``BUrUWYTEV[meGHBUf\mEf[`pcXeIW[ayFPiqv\nxv]uyRYdUG@@@@@@@@@@@@@@@@@@@@ @@@@@@@@@@@P@ADP@ADP@ADP@gHWI@@@@L@@@@L@@@@pT@@@@TA@@@@[@@@pa@@@@wF@@@\I@@@ P@@@@@@@LjAA@phFTkA@@@GB@@@D@@@@@S@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@T@@@@p`@@C `@@@@pT@@@@\A@@@@[@@@@\@@@@CG@@@@H@@@P@@@@@@@LjAA@phFTkA@@@pA@@@L@@@@@S@@@@ @@@@@@@@@@@@@@@@@@@@@@@@@@@T@@@@pxbL@`@@@@@A@@@@H@@@@LE@@@@W@@@@pF@@@hE@@@p pA@@@jA@@@D@@@@@@@cZP@@LjAuZ@@@`V@@@@C@@@@pD@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ E@@@@LnPC@H@@@@P@@@@@B@@@@SA@@@pE@@@@lA@@@CA@@@L\@@@pT@@@@A@@@@@@phFD@@cZPm F@@@LD@@@p@@@@@LA@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@PA@@@@cKv@@B@@@@D@@@@`@@@@pT @@@@\A@@@@[@@@PK@@@@CG@@@tC@@@P@@@@@@@LjAA@phFTkA@@@m@@@@L@@@@@S@@@@@@@@@@@ @@@@@@@@@@@@@@@@@@@@T@@@@pxBNna@@@@@A@@@@H@@@@LE@@@@U@@@@pF@@@XA@@@pmA@@@f@ @@@D@@@@@@@cZP@@LjAuZ@@@`E@@@@A@@@@pD@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@E@@@PLc A@@H@@@@TB@@@@C@@@@G@@@@VB@@@@C@@@@@@@@@VB@@@@C@@@@M@@@@JB@@@@C@@@@GB@ @@@B@@@@^@@@@`A@@@@@@@@@@@@@@X`@@@Px@@@@e@@@@p@@@@pA@@@`e@@@@p@@@@@@@@@`e@@ @@p@@@@PC@@@`b@@@@p@@@@pa@@@@`@@@@`G@@@@X@@@@@@@@@@@@@@@FH@@@DN@@@`D@@ @@L@@@@D@@@@PI@@@@L@@@@H@@@@PI@@@@L@@@@L@@@@pT@@@@TA@@@PY@@@Pg@@@@[F@@@tJ@@ @P@@@@@@@LjAA@phFTdA@@@]B@@@D@@@@@S@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@T@@@@qHp[ na@@@@pT@@@@TA@@@lW@@@Pg@@@@BF@@@tJ@@@P@@@@@@@LjAA@phFD^A@@@]B@@@D@@@@@S@@@ @@@@@@@@@@@@@@@@@@@@@@@@@@@@T@@@@rDSMp`@@@@pT@@@@TA@@@HV@@@Pg@@@@iE@@@tJ@@@ P@@@@@@@LjAA@phFtWA@@@]B@@@D@@@@@S@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@T@@@@sTa@a c@@@@pT@@@@TA@@@dT@@@Pg@@@@PE@@@tJ@@@P@@@@@@@LjAA@phFdQA@@@]B@@@D@@@@@S@@@@ @@@@@@@@@@@@@@@@@@@@@@@@@@@T@@@@tXG@r`@@@@pT@@@@TA@@@@S@@@Pg@@@@wD@@@tJ@@@P 
@@@@@@@LjAA@phFTKA@@@]B@@@D@@@@@S@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@T@@@@ul@MXc @@@@pT@@@@TA@@@\Q@@@Pg@@@@^D@@@tJ@@@P@@@@@@@LjAA@phFDEA@@@]B@@@D@@@@@S@@@@@ @@@@@@@@@@@@@@@@@@@@@@@@@@T@@@@vPRhSb@@@@pT@@@@TA@@@xO@@@Pg@@@@ED@@@tJ@@@P@ @@@@@@LjAA@phFt~@@@@]B@@@D@@@@@S@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@T@@@@wlPqP`@ @@@pT@@@@TA@@@TN@@@Pg@@@@lC@@@tJ@@@P@@@@@@@LjAA@phFdx@@@@]B@@@D@@@@@S@@@@@@ @@@@@@@@@@@@@@@@@@@@@@@@@T@@@@x@`L@`@@@@pT@@@@TA@@@pL@@@Pg@@@@SC@@@tJ@@@P@@ @@@@@LjAA@phFTr@@@@]B@@@D@@@@@S@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@T@@@@yD`UA`@@ @@pT@@@@XA@@@|J@@@Pg@@@@~B@@@tJ@@@P@@@@@@@LjAA@phFDk@@@@]B@@@H@@@@@S@@@@@@@ @@@@@@@@@@@@@@@@@@@@@@@@T@@@@q@CNp`@@@@@B@@@@SA@@@`E@@@`e@@@@]B@@@TJ@@@Pk@@ @@A@@@@@@phFD@@cZPSB@@@tI@@@`@@@@@LA@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@PA@@@DSL@ l@B@@@@H@@@@LE@@@@V@@@@}A@@@tI@@@@c@@@@mB@@@D@@@@@@@cZP@@LjAiG@@@Pg@@@@B@@@ @pD@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@E@@@PLrlFHH@@@@`@@@@pT@@@@XA@@@PF@@@Pg@@@ @sA@@@tJ@@@P@@@@@@@LjAA@phFTX@@@@]B@@@H@@@@@S@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ @T@@@@qLSL@a@@@@@B@@@@SA@@@`E@@@pR@@@@]B@@@hE@@@Pk@@@@A@@@@@@phFD@@cZPHA@@@ tI@@@`@@@@@LA@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@PA@@@DCMDTEB@@@@H@@@@LE@@@@V@@@@ r@@@@tI@@@PP@@@@mB@@@D@@@@@@@cZP@@LjA}B@@@Pg@@@@B@@@@pD@@@@@@@@@@@@@@@@@@@@ @@@@@@@@@@@E@@@PLux^YH@@@@`@@@@pT@@@@XA@@@dA@@@Pg@@@@h@@@@tJ@@@P@@@@@@@LjAA @phFdE@@@@]B@@@H@@@@@S@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@T@@@@qXCFTc@@@@@B@@@@e @@@@p@@@@@FE@@@fXpC@XbBWudQCE@@@@@@@D@@@@@@@@@@B@@@@P`B@@@@@@@@DhB@@\@@@@XI @@@@L@@@@@@@@@XI@@@@L@@@@t@@@@hH@@@@L@@@@|_H@@@@H@@@@xA@@@@F@@@@@@@@@@@ @@@`AB@@@aC@@@TB@@@@C@@@@G@@@@VB@@@@C@@@@@@@@@VB@@@@C@@@@M@@@@JB@@@@C@@@@ GB@@@@B@@@@^@@@@`A@@@`_@@@@rB@@@TS@@@`s@@@@R@@@@p@@@@P@@@@@e@@@@p@@@@`@@ @@@RA@@@pT@@@@A@@@@qC@@@@@@@@@@@@@@@pk@@@@@@@Pl@@@@@DD@rAPZ@DF@lA@@@xB] tYV@@@@@@@@@@XA@K@pQuQG]mEf[`lTXvABJTIW]eQU^pUVJgmVXvuVYdyB]tYV@@@@@@@@@@lA @K@pQuQG]mEf[`lTXvubPoqFY``BUrUWYTeG\eerYkEf]bqFYnPG]fE@@@@@@@@@@\Pl@@@@@A@ `BpbK@@@`M@@@@A`F]``BUrUtMrPSLBaSMvXSMwPdLqDSNBMcPp@SPAACLAQDLtHSQvXSPyHSL@ 
yF]aMgKnMvXn@@@ndF[@DFY@K@@@@P@@@@@Ap`Ps{mCBKBC@@@@@@@@@@`AQzyo@\sAp^sfRtA@ @LpI@@@@CD@A@@p@gH@[@h@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@P@ADP@ADP@ADP@^XPI@@@ @L@@@@P@@@@pT@@@@DC@@@LH@@@pm@@@@uD@@@`L@@@P@@@@@@@LjAA@phFD`@@@@wB@@@`A@@@ @S@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@Y@@@@gUf[eIWXtev[nMGHfIw[mAB]hUFHeyFYI@@@@ d@@@@PB@@@@I@@@@X@@@@@B@@@@E@@@@P@@@@PB@@@@I@@@@`@@@@@A@@@@E@@@@X@@@@PB@@@@ L@@@@P@@@@PA@@@@I@@@@d@@@@@A@@@@I@@@@d@@@@PB@@@@e@@@@p@@@@pA@@@`e@@@@p@@@@@ @@@@`e@@@@p@@@@PC@@@`b@@@@p@@@@pa@@@@`@@@@`G@@@@X@@@@@@@@@@@@@@@FH@@@D N@@@PI@@@@L@@@@\@@@@XI@@@@L@@@@@@@@@XI@@@@L@@@@t@@@@hH@@@@L@@@@|_H@@@@H @@@@xA@@@@F@@@@CG@@@PB@@@p|A@@@HB@@@HA@@@@C@@@@A@@@@TB@@@@C@@@@B@@@@HE@@@@S A@@@E@@@@DO@@@@@DN@@@Px@@@@oB@@@@@@@qB@D@@PP@HG@iAPX@pF@@@@dA@@@@D@@NXP daZnAQRJtFxi_QX`gAZaA^rXBFxydYpgrrnAgZzXFT{BDZPmDQjASC@@@@pqVYpu{VrA]lLLGX af\`}mB|AwW@dG\ODkwfXoqFHr@BJTIW]eQU^pUVJbMw\yufLnPG]fE@@@@@@@@@@]@`B@Ht[o mv\hUF[fArTyufXoqFH@K@@@@P@@hplhB@@@XC@@@P@nPG]fE@@E]cLtDcPxTcMvTsMDISLqdcP sHD@@DTPp@SPDACMrTdMvDTNrHDPnQWXsyb[cMfKc}fKiqF@@@@qB@@@@D@@@@P@LHt|^{@tRs@ @@@@@@@@@@XPd@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ADP@ADP@ADP@A@@@e@@@@p@@@@PA@@ @@SA@@@`I@@@`rA@@@FB@@@T_@@@pa@@@@A@@@@@@phFD@@cZPGG@@@XH@@@pC@@@@LA@@@@@@@ @@@@@@@@@@@@@@@@@@@@@@@\A@@@@g\oIVXbeF[iQW^`|fY`TSB@@@@F@@@@d@@@@PB@@@@H@@@ @d@@@@@A@@@@D@@@@P@@@@PA@@@@G@@@@P@@@@PB@@@@E@@@@P@@@@PI@@@@L@@@@P@@@@@J@@@ @L@@@@T@@@@`T@@@@LE@@@T@@@@P|@@@@@Px@@@@aC@@@|J@@@@@@@DK@P@@@AA`\@dF@aA @[@@@@@K@@@@P@@L{JAH@YB{IBmfTK@@@@@@@@@@`AQvlOFDYs~\Sf@PJ@@pB}@@@@@@P@@@@@@ @@@@hPmhA@@@p@@@@@@@@@@@@@@@TN@@@@@@@@@@@@@@@@UyymSn@`_JLAA~D@W@PEUFE@tB@@@ @D@@AXPdM{SZnUf\``TXnQFHIQuP``BUrUWYTeG\eebUIyTQReDUCyBUTYT@@@@@@@@@@pA@K@` Po}vZ`Dd[teV\uEFHB}F[@@@JTIW]eQU^pUVJayF]qUWXbyB]tYV@@@@@@@@@@LB@L@`Po}vZ`D d[teV\uEFHB}F[dARRtEF[iMFHDK@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@DP@ADP@ADP@A D@@JTB@@@@C@@@@E@@@@LE@@@@b@@@@^G@@@HH@@@P}A@@@GB@@@D@@@@@@@cZP@@LjAm]@@@`` @@@@L@@@@pD@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@`E@@@pXoyF]iyV]aQWZoyFB@@@@I@@@@d@ 
% [Text-encoded binary payload of the pasted picture /document/IGBA0M01.wmf
%  omitted. Per the wrapper instructions above, this block must be converted
%  back from text to its original 8-bit binary WMF format and placed in the
%  graphics directory before compiling.]
%%%%%%%%%%%%%%%%%%%%%% End /document/IGBA0M01.wmf %%%%%%%%%%%%%%%%%%%%%