## Tutoring Case Study: MATH 4581

• August 11, 2020

Summer 2020 · Name: ________ · Total: ____ / 100

**MATH 4581: Statistics and Stochastic Processes · Take-Home Exam**

### Wright-Fisher model

Consider $N$ haploid individuals, each carrying 1 copy of a specific genetic locus (a location of interest in the genome). Each individual (= gene) can be of two types (= alleles), A or a, which correspond to two different pieces of genetic information at the same locus. Suppose that at each time unit each individual randomly and uniformly chooses another individual (possibly itself) from the population and adopts its type ("parallel updating"). This is called resampling, and is a form of random reproduction. Suppose that all individuals update independently of each other and independently of how they updated at previous times. We are interested in the evolution of the quantity

$$X_n = \text{number of A's at time } n.$$

**Example 1.** Let $N = 6$, and denote an individual of type A by a filled dot and an individual of type a by an open dot. We would like to compute the probability that there will be 3 individuals of type A at time $n$ given that there are 4 individuals of type A at time $n - 1$, i.e. $P(X_n = 3 \mid X_{n-1} = 4)$. First, there are $\binom{6}{3}$ possible choices of type A individuals in the $n$th time frame. Next, the probability of choosing an individual of type A for the update is $\frac{4}{6} = \frac{2}{3}$, and $\frac{1}{3}$ for type a; hence,

$$P(X_n = 3 \mid X_{n-1} = 4) = \binom{6}{3} \left(\tfrac{2}{3}\right)^3 \left(\tfrac{1}{3}\right)^3.$$

*(Figure: the six individuals at times $n-1$ and $n$; omitted.)*

#### Problem 1

We consider the case $N = 3$.

(a) [5 pts] Let $A_i$ be the state with $i$ individuals of type A (here $i = 0, 1, 2, 3$). Show that the process $\{X_0, X_1, X_2, \ldots\}$ is a Markov chain¹ and list the absorbing states.

(b) [5 pts] Compute the transition probabilities.

| from/to | $A_0$ | $A_1$ | $A_2$ | $A_3$ |
|---------|-------|-------|-------|-------|
| $A_0$   |       |       |       |       |
| $A_1$   |       |       |       |       |
| $A_2$   |       |       |       |       |
| $A_3$   |       |       |       |       |

**Definition 2.** A discrete-time stochastic process $\{Y_0, Y_1, Y_2, \ldots\}$ is called a **martingale** if $E(Y_n \mid Y_{n-1}, \ldots, Y_0) = Y_{n-1}$ for any time $n \geq 1$. In other words, the conditional expected value of the next observation, given all the past observations, is equal to the most recent observation.
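The computation in Example 1 can be checked numerically. The sketch below (not part of the exam; the function name `wf_transition` is my own) uses the fact that, given $X_{n-1} = i$, each of the $N$ individuals independently adopts type A with probability $i/N$, so $X_n \sim \mathrm{Binomial}(N, i/N)$; a small Monte Carlo run of one parallel update serves as a sanity check.

```python
import math
import random

def wf_transition(N, i, j):
    """P(X_n = j | X_{n-1} = i) for the Wright-Fisher model:
    given i type-A individuals, each of the N individuals
    independently becomes type A with probability i/N, so
    X_n ~ Binomial(N, i/N)."""
    p = i / N
    return math.comb(N, j) * p**j * (1 - p) ** (N - j)

# Example 1: N = 6, from 4 type-A individuals to 3.
print(wf_transition(6, 4, 3))  # C(6,3)*(2/3)^3*(1/3)^3 = 160/729 ≈ 0.2195

# Monte Carlo sanity check: simulate one parallel update many times.
rng = random.Random(0)
trials = 100_000
hits = sum(
    sum(rng.random() < 4 / 6 for _ in range(6)) == 3 for _ in range(trials)
)
print(hits / trials)  # should be close to the exact value above
```

The same function gives the whole transition matrix for Problem 1 by evaluating it at $N = 3$ for all $i, j \in \{0, 1, 2, 3\}$.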
¹ Hint: check the Markov property.

(c)* [10 pts] Check that $\{X_0, X_1, X_2, \ldots\}$ is a martingale.²

(d) [5 pts] Assume we start in state $A_j$. What is the probability that population A takes over?

| $j$ | $p_j$ |
|-----|-------|
| 0   |       |
| 1   |       |
| 2   |       |
| 3   |       |

(e) [5 pts] What is the average number of updates until the population is stabilized?

| $j$ | # of updates |
|-----|--------------|
| 0   |              |
| 1   |              |
| 2   |              |
| 3   |              |

² Hint: the Markov property implies that $E(X_n \mid X_{n-1}, \ldots, X_0) = E(X_n \mid X_{n-1})$. Set $X_{n-1} = k$ and verify that $E(X_n \mid X_{n-1}) = k$ as well.

### Optional stopping theorem

**Definition 3.** Given a stochastic process $\{X_0, X_1, \ldots\}$, a non-negative integer-valued random variable $\tau$ is called a **stopping time** if for every integer $k \geq 0$, the event $\{\tau \leq k\}$ depends only on the events $X_0, X_1, \ldots, X_k$.

**Example 4.** Consider a gambler playing roulette with a typical house edge, starting with \$100 and betting \$1 on red in each game:

1. Playing exactly five games corresponds to $\tau = 5$, and is a stopping time.
2. Playing until he either runs out of money or has played 500 games is a stopping time.
3. Playing until he is the maximum amount ahead he will ever be is not a stopping time, as it requires information about the future as well as the present and past.
4. Playing until he doubles his money (borrowing if necessary) is not a stopping time, as there is a positive probability that he will never double his money.

The optional stopping theorem (or Doob's optional sampling theorem) says that, under certain conditions, the expected value of a martingale at a stopping time is equal to its initial expected value. Since martingales can be used to model the wealth of a gambler participating in a fair game, the optional stopping theorem says that, on average, nothing can be gained by stopping play based on the information obtainable so far (i.e., without looking into the future).

**Theorem 5** (optional stopping theorem). Suppose that the stochastic process $\{X_0, X_1, \ldots\}$ is a martingale and $\tau$ is a stopping time such that there exists a constant $c > 0$ with $P(\tau \leq c) = 1$.
Then $E[X_\tau] = E[X_0]$.

#### Problem 2

Suppose that Ann plays the following game. At each turn the dealer throws an unbiased coin; if the outcome is heads Ann wins \$10, while if it is tails she loses \$10. Assume that each coin toss is independent and Ann starts with \$100. What is the probability that she wins \$50 before running out of money?

(a)* [15 pts] Let $\tau$ be the turn when Ann either has \$150 or runs out of money. Show that $\tau$ is a stopping time and satisfies the assumption of the optional stopping theorem.

(b) [10 pts] Apply the optional stopping theorem to answer the question.

#### Problem 3

[10 pts] Consider the Wright-Fisher model and let $\tau$ be the time when either population A or a takes over. Show that $\tau$ is a stopping time, but does not satisfy the assumption of the optional stopping theorem.

### Queues

**Theorem 6** (Burke). Consider an $M/M/1$, $M/M/c$ or $M/M/\infty$ queue with arrival rate $\lambda$. In the steady state, the departures from the system form a Poisson process with rate $\lambda$, independently of $\mu$ (so long as $\mu > \lambda$). Furthermore, at time $t$ the number of customers in the queue is independent of the departure process prior to time $t$.

#### Problem 4

Consider a two-stage tandem network composed of two nodes with service rates $\mu_1$ and $\mu_2$, respectively. This means that a customer joins the end of the wait line at the first counter upon arrival and moves to the end of the wait line for the second counter after being served at the first. The external arrival rate is $\lambda$ ($\lambda < \mu_i$, $i = 1, 2$) and the arrival process is Poisson. Assume that the service times at each node are exponentially distributed, mutually independent, and independent of the arrival processes, and that the total number of visitors is not bounded.

*(Figure: state-transition diagram on the states $(n, m)$, with $n$ customers at the first counter and $m$ at the second, arrival rate $\lambda$, and service rates $\mu_1$, $\mu_2$; omitted.)*

(a) [5 pts] Give an example of a real-life occurrence of such a queueing system.
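For Problem 2, the prediction of the optional stopping theorem can be checked by simulation before doing the exercise analytically. Ann's wealth is a fair $\pm\$10$ random walk, and if $p$ denotes the probability that she reaches \$150 before \$0, then $E[X_\tau] = E[X_0]$ forces $150\,p + 0\,(1 - p) = 100$, i.e. $p = 2/3$. The sketch below (the function name `ann_game` is my own) estimates $p$ by Monte Carlo.

```python
import random

def ann_game(rng, start=100, target=150, step=10):
    """One play of Ann's game: a fair +/- `step` walk starting at
    `start`, stopped when the wealth hits `target` or 0.
    Returns True if Ann reaches `target` (i.e. wins $50)."""
    wealth = start
    while 0 < wealth < target:
        wealth += step if rng.random() < 0.5 else -step
    return wealth == target

rng = random.Random(1)
trials = 50_000
wins = sum(ann_game(rng) for _ in range(trials))
print(wins / trials)  # Monte Carlo estimate; optional stopping predicts 2/3
```

Note that a single play always terminates with probability 1, but $\tau$ is not bounded by any constant, which is exactly the subtlety Problems 2(a) and 3 ask about.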
(b) [15 pts] Show that the steady-state probabilities are given by

$$p_{nm} = (1 - \rho_1)\rho_1^n \,(1 - \rho_2)\rho_2^m,$$

where $\rho_1 = \lambda/\mu_1$ and $\rho_2 = \lambda/\mu_2$.³

(c)* [15 pts] Give a generalization of (b) to the case of $n$ counters.

³ Hint: using the fact that the queue at the first counter is an $M/M/1$ system with rates $\lambda$ and $\mu_1$, compute the steady-state probability $\hat{p}_n$. Use Burke's theorem to find the departure rate for this system. Notice that the arrival rate for the second system (which is also $M/M/1$) is equal to the departure rate for the first. Now find $\tilde{p}_m$, the steady-state probability for the second queue. Show that Burke's theorem allows one to conclude $p_{nm} = \hat{p}_n \tilde{p}_m$ and get the desired result.

### Black-Scholes equation

#### Problem 5

Recall that the solution of the Black-Scholes equation corresponding to the European Call option is

$$C_E(S, t) = S\,\Phi(d_1) - E e^{-r(T-t)}\,\Phi(d_2),$$

where

$$d_1 = \frac{\ln(S/E) + (r + \sigma^2/2)(T-t)}{\sigma\sqrt{T-t}}, \qquad d_2 = d_1 - \sigma\sqrt{T-t} = \frac{\ln(S/E) + (r - \sigma^2/2)(T-t)}{\sigma\sqrt{T-t}},$$

and

$$\Phi(d) = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{d} e^{-x^2/2}\,dx.$$

(a) [5 pts] Find the solution of the Black-Scholes equation corresponding to the European Put option.⁴

⁴ Hint: use the put-call parity equation $P_E(S, t) + S = C_E(S, t) + E e^{-r(T-t)}$.

(b) [10 pts] Analyze the solution.

(1) What is the value of the put if the market is wild; it is modeled by $\sigma \to \infty$?

(2) What is the value of the put if the market is stable; it is modeled by $\sigma \to 0$?

(3) What is the value of the put if $S \to \infty$?

(4) What is the value of the put if $S \to 0$?

(5) What is the value of the put if $t \to T^-$?

### Feynman's checkers

**Definition 7.** A *checker path* is a finite sequence of integer points in the plane such that the vector from each point (except the last one) to the next one equals either $(1, 1)$ or $(-1, 1)$. A *turn* is a point of the path (not the first and not the last one) such that the vectors from the point to the next and to the previous ones are orthogonal. To a path $s$ terminating at $(x, t)$ we associate a vector $\vec{a}(s)$ with length $|\vec{a}(s)| := 2^{(1-t)/2}$ and angle with the $x$-axis equal to $\frac{\pi}{2}(1 - \mathrm{turns}(s))$.
Next we define the vector $\vec{a}(x, t)$ as

$$\vec{a}(x, t) = \sum_{s} \vec{a}(s),$$

where the sum is over all checker paths $s$ from $(0, 0)$ to $(x, t)$ with the first step to $(1, 1)$. Denote

$$P(x, t) := |\vec{a}(x, t)|^2, \qquad a_1(x, t) := x\text{-coordinate of } \vec{a}(x, t), \qquad a_2(x, t) := t\text{-coordinate of } \vec{a}(x, t).$$

**Example 8.** Consider the point $(x, t) = (1, 3)$. Then we have two paths $s_1, s_2$ emerging at $(0, 0)$, going to $(1, 1)$ on the first step and terminating at $(1, 3)$. We find $\mathrm{turns}(s_1) = 1$ and $\mathrm{turns}(s_2) = 2$, and $t = 3$ (so $2^{(1-t)/2} = \frac{1}{2}$) implies that the angle of $\vec{a}(s_1)$ with the $x$-axis is $0$ and the angle of $\vec{a}(s_2)$ with the $x$-axis is $-\frac{\pi}{2}$, while $|\vec{a}(s_1)| = |\vec{a}(s_2)| = \frac{1}{2}$. In other words, $\vec{a}(s_1) = (\frac{1}{2}, 0)$ and $\vec{a}(s_2) = (0, -\frac{1}{2})$. Hence $\vec{a}(x, t) = \vec{a}(s_1) + \vec{a}(s_2) = (\frac{1}{2}, -\frac{1}{2})$, $a_1(x, t) = \frac{1}{2}$, $a_2(x, t) = -\frac{1}{2}$, and $P(x, t) = \frac{1}{4} + \frac{1}{4} = \frac{1}{2}$.

*(Figure: the two paths $s_1$, $s_2$ from $(0, 0)$ to $(1, 3)$ in the $(x, t)$-plane; omitted.)*

**Remark 9.** Here the $x$- and $t$-coordinates are interpreted as the electron position and time, respectively. We work in the natural system of units, where the speed of light and the Planck and Boltzmann constants are equal to 1. Thus the lines $x = \pm t$ represent motion at the speed of light. Any checker path lies above both lines, i.e. in the light cone, which means agreement with relativity: the speed of the electron cannot exceed the speed of light. The square of the length of the vector $\vec{a}(x, t)$ is called the probability to find an electron (with spin $\frac{1}{2}$) in the square $(x, t)$, if it was emitted from $(0, 0)$.

#### Problem 5

(a) [2 pts] Find $P(1, 1)$ and $P(0, 2)$.

(b) [3 pts] Find $P(n, n)$ and $P(-n, n + 2)$ for $n \geq 1$.

(c) [5 pts] Find $P(2, 4)$.

#### Problem 6

[10 pts] Let $m, n$ be integers such that $n \geq m$ and $m + n \geq 2$. How many different checker paths are there from $(0, 0)$ to $(m, n)$?

#### Problem 7

(a)* [10 pts] Show that $a_1(x, t) = \frac{1}{\sqrt{2}}\left(a_1(x + 1, t - 1) + a_2(x + 1, t - 1)\right)$ and $a_2(x, t) = \frac{1}{\sqrt{2}}\left(-a_1(x - 1, t - 1) + a_2(x - 1, t - 1)\right)$.

(b) [5 pts] Let $t \geq 0$. Using the result in (a), show that $\sum_{x \in \mathbb{Z}} P(x, t) = 1$.
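The quantities in Problems 5 through 7 can be explored by brute force: a checker path from $(0, 0)$ to $(x, t)$ is a sequence of $t$ moves in $\{(1, 1), (-1, 1)\}$ with the first move fixed, so for small $t$ all paths can be enumerated directly. The sketch below (not part of the exam; the names are my own, and the enumeration is exponential in $t$) computes $\vec{a}(x, t)$ and $P(x, t)$ straight from Definition 7, recovering $\vec{a}(1, 3) = (\frac{1}{2}, -\frac{1}{2})$ from Example 8 and illustrating the identity of Problem 7(b).

```python
import itertools
import math

def a_vector(x, t):
    """Brute-force a(x, t): sum of a(s) over all checker paths s from
    (0, 0) to (x, t) whose first step is to (1, 1).  Each entry of
    `steps` is the x-increment of a move (1, 1) or (-1, 1)."""
    a1 = a2 = 0.0
    for rest in itertools.product((1, -1), repeat=t - 1):
        steps = (1,) + rest          # first step fixed to (1, 1)
        if sum(steps) != x:          # path must terminate at (x, t)
            continue
        turns = sum(steps[i] != steps[i + 1] for i in range(t - 1))
        length = 2 ** ((1 - t) / 2)          # |a(s)| = 2^{(1-t)/2}
        angle = (math.pi / 2) * (1 - turns)  # angle with the x-axis
        a1 += length * math.cos(angle)
        a2 += length * math.sin(angle)
    return a1, a2

def P(x, t):
    """Probability to find the electron in the square (x, t)."""
    a1, a2 = a_vector(x, t)
    return a1 * a1 + a2 * a2

print(a_vector(1, 3))  # (1/2, -1/2), as in Example 8
print(P(1, 3))         # |(1/2, -1/2)|^2 = 1/2
print(sum(P(x, 6) for x in range(-6, 7)))  # total probability 1 (Problem 7(b))
```

This is only a check for small $t$; Problem 7(a)'s recurrence gives the efficient way to compute $\vec{a}(x, t)$ column by column.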
