Meetings

Click on a theme item for the meeting plan for that theme. Click on a meeting item for the references, exercises, and additional reading related to it.

**Theme 1 : Probabilistic Techniques** - 7 meetings

- Meeting 01 : Fri, Feb 08, 03:00 pm-05:00 pm -
- Meeting 02 : Wed, Feb 13, 02:00 pm-04:00 pm -
- Meeting 03 : Fri, Feb 15, 03:00 pm-05:00 pm -
- Meeting 04 : Wed, Feb 20, 02:00 pm-04:00 pm -
- Meeting 05 : Fri, Feb 22, 03:00 pm-05:00 pm -
- Meeting 06 : Wed, Feb 27, 02:00 pm-04:00 pm -
- Meeting 07 : Fri, Mar 01, 03:00 pm-05:00 pm -

**Meeting 01** : Linearity of expectation: Max Cut, Quick Sort. Markov's inequality: existence of an independent set of a certain size in a given graph. See Lecture-7 (Parts A and B) in the link below.
References: None

**Meeting 02** : Chebyshev's inequality, applications to Randomized Selection. Introduction to the Chernoff-Hoeffding bound.
References: Sections 3.2, 3.3, and 4.1 of Rajeev Motwani and Prabhakar Raghavan, *Randomized Algorithms*

**Meeting 03** : Applications of Chernoff-Hoeffding: load balancing, oblivious routing in a parallel computer.
References: Sections 4.1 and 4.2 of Rajeev Motwani and Prabhakar Raghavan, *Randomized Algorithms*

**Meeting 04** : Oblivious routing in the hypercube, continued. Extensions of Chernoff: sums of random variables with dependencies. Introduction to martingales.
References: Section 4.2 of Rajeev Motwani and Prabhakar Raghavan, *Randomized Algorithms*

**Meeting 05** : Martingales, the Azuma-Hoeffding inequality, application to distributed edge coloring of graphs. Introduction to isoperimetry.
References: Chapter 5 and Section 6.4 of Dubhashi and Panconesi, *Concentration of Measure for the Analysis of Randomized Algorithms*

**Meeting 06** : Application of martingales to stochastic TSP. Talagrand's isoperimetric inequality, statement without proof. Application to the stochastic Traveling Salesperson problem.
References: Sections 7.3, 10.1, 10.3, 11.1, and 11.2 of Dubhashi and Panconesi, *Concentration of Measure for the Analysis of Randomized Algorithms*
Reading: If you are interested in the probabilistic analysis of Euclidean optimization problems such as Euclidean TSP, Euclidean minimum-weight matching, etc., see the following books: 1) Joseph Yukich, *Probability Theory of Classical Euclidean Optimization Problems*, Lecture Notes in Mathematics, vol. 1675, 1998. 2) Michael Steele, *Probability Theory and Combinatorial Optimization*, SIAM, 1997. Steele's book has a proof of Talagrand's isoperimetric inequality and also its application to stochastic TSP; see Sections 6.2 and 6.4.
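The Azuma-Hoeffding bound from Meeting 05 can be sanity-checked numerically. The sketch below (my own illustration, not taken from the course references; the function names are made up) compares the exact upper tail of a sum of n independent ±1 steps, the simplest martingale with unit difference bounds, against the bound exp(-t^2/(2n)):

```python
from math import comb, exp

def exact_tail(n, t):
    # P(S_n >= t) for S_n a sum of n independent uniform +-1 steps:
    # S_n = 2H - n with H ~ Binomial(n, 1/2), so S_n >= t iff H >= ceil((n+t)/2)
    k_min = -(-(n + t) // 2)  # ceiling of (n+t)/2
    return sum(comb(n, k) for k in range(k_min, n + 1)) / 2 ** n

def azuma_bound(n, t):
    # Azuma-Hoeffding with difference bounds c_i = 1: P(S_n >= t) <= exp(-t^2 / (2n))
    return exp(-t ** 2 / (2 * n))

n = 100
for t in (10, 20, 30):
    # the exact tail probability never exceeds the Azuma-Hoeffding estimate
    assert exact_tail(n, t) <= azuma_bound(n, t)
```

The bound is loose for small deviations but, like the exact tail, decays as exp(-Theta(t^2/n)), which is what the applications in the lectures use.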
**Meeting 07** : Notion of a dependency graph among events. The Lovász Local Lemma and its proof. The symmetric version of the Lovász Local Lemma, and application to satisfying assignments of CNF-SAT instances in which each variable occurs only a bounded number of times. Application to showing the existence of directed cycles of length 0 (mod k) in a directed graph. Probabilistic recurrence relations, Karp's first theorem. A simple application to Hoare's selection algorithm.
References: Section 5.1 of Noga Alon and Joel Spencer, *The Probabilistic Method*, and the Probabilistic Lens section following Chapter 5. Probabilistic recurrences are from Chapter 4 of Dubhashi and Panconesi. k-CNF is from the lecture [notes](http://www.cse.iitm.ac.in/~jayalal/teaching/CS6845/2012/lecture-A06.pdf) by Narayanaswamy

**Theme 2 : Explicit constructions** - 5 meetings

- Meeting 08 : Sat, Mar 23, 02:00 pm-04:00 pm -
- Meeting 09 : Tue, Mar 26, 03:00 pm-05:00 pm -
- Meeting 10 : Wed, Apr 03, 02:00 pm-04:00 pm -
- Meeting 11 : Fri, Apr 05, 03:00 pm-05:00 pm -
- Meeting 12 : Fri, Apr 26, 03:00 pm-04:30 pm -

**Meeting 08** : Notion of explicit constructions and applications in derandomization. Construction of k-wise independent sample spaces. Introduction to extractors.
References: The construction of pairwise independent bits is from Salil Vadhan's [survey](http://people.seas.harvard.edu/~salil/pseudorandomness/basic.pdf); see Section 3.5 there. k-wise independent spaces are from the [earlier version](http://www.cse.iitm.ac.in/~jayalal/teaching/CS6845/2012/lecture-A07.pdf) of the course. For extractors, see an excellent [exposition](http://people.seas.harvard.edu/~salil/pseudorandomness/extractors.pdf) by Vadhan.

**Meeting 09** : Extracting randomness: extractors, motivation, and construction.
References: Sections 6.1, 6.2.1, and 6.3 in the [monograph](http://people.seas.harvard.edu/~salil/pseudorandomness/extractors.pdf) by Vadhan.

**Meeting 10** : Constructions of extractors: the Leftover Hash Lemma. Introduction to expander graphs: vertex and spectral expansion.
References: Section 6.2.1 in the [monograph](http://people.seas.harvard.edu/~salil/pseudorandomness/extractors.pdf) by Vadhan. For expander graphs, see Section 4.1 in the [monograph](http://people.seas.harvard.edu/~salil/pseudorandomness/expanders.pdf) by Vadhan.
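The pairwise-independence construction mentioned for Meeting 08 (one output bit per nonempty subset of a short truly random seed, each equal to the XOR of the seed bits in that subset, as in Section 3.5 of Vadhan's survey) can be verified exhaustively for a small seed. This is an illustrative sketch of mine; the function name is hypothetical:

```python
from itertools import product

def pairwise_independent_bits(seed):
    # from k truly random seed bits, derive 2^k - 1 output bits:
    # one per nonempty subset S of [k], equal to the XOR of the seed bits in S
    k = len(seed)
    out = []
    for mask in range(1, 2 ** k):
        bit = 0
        for i in range(k):
            if mask >> i & 1:
                bit ^= seed[i]
        out.append(bit)
    return out

# exhaustive check of pairwise independence for k = 3 (7 derived bits):
k, m = 3, 2 ** 3 - 1
counts = {}
for seed in product([0, 1], repeat=k):
    bits = pairwise_independent_bits(list(seed))
    for i in range(m):
        for j in range(i + 1, m):
            key = (i, j, bits[i], bits[j])
            counts[key] = counts.get(key, 0) + 1
# every pair (i, j) takes each of the four value combinations equally often,
# i.e. any two derived bits are uniform and independent
assert all(c == 2 ** k // 4 for c in counts.values())
```

The point of the construction is the exponential stretch: k random bits drive 2^k - 1 pairwise independent bits, which is what derandomization by exhaustive search over seeds exploits.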
**Meeting 11** : Edge expansion and the Expander Mixing Lemma. Random walks in expander graphs and amplifying the success probability of randomized algorithms. Some examples of expander graphs.
References: See Sections 4.2 and 4.3 in the [monograph](http://people.seas.harvard.edu/~salil/pseudorandomness/expanders.pdf) by Vadhan.

**Meeting 12** : A unified theory of pseudorandomness. We will overview a framework under which pseudorandom structures such as expanders, extractors, samplers, and list-decodable codes can be characterized.
References: Based on the survey [paper](http://people.seas.harvard.edu/~salil/research/unified-icm.pdf) by Salil Vadhan.

**Theme 3 : Fourier transforms and applications** - 6 meetings

- Meeting 13 : Wed, Apr 10, 02:00 pm-04:00 pm -
- Meeting 14 : Fri, Apr 12, 02:00 pm-04:00 pm -
- Meeting 15 : Wed, Apr 17, 03:00 pm-05:00 pm -
- Meeting 16 : Fri, Apr 19, 03:00 pm-05:00 pm -
- Meeting 17 : Sun, Apr 21, 08:45 am-10:00 am -
- Meeting 18 : Sun, Apr 21, 10:05 am-11:00 am -

**Meeting 13** : Introduction to discrete Fourier transforms. The example of polynomial multiplication. Coefficient vs. pointwise-evaluation representations. Transforming one into the other. Choosing the points of evaluation. FFT formulation. Inverse transform. Multiplying polynomials using O(n log n) operations. Introduction to Fourier analysis on the Boolean cube.
References: Discrete Fourier transforms are taken from Manindra Agrawal's [Lecture Notes](http://www.cse.iitk.ac.in/users/manindra/CS681/2005/Lecture5.pdf). Fourier representations of Boolean functions are from Ryan O'Donnell's [Lecture Notes](http://www.cs.cmu.edu/~odonnell/boolean-analysis/lecture2.pdf)

**Meeting 14** : Parseval's equality. Fourier transforms of example Boolean functions. Linearity testing: the BLR test. Learning Boolean functions.
References: For linearity testing see Ryan O'Donnell's [Lecture Notes-3](http://www.cs.cmu.edu/~odonnell/boolean-analysis/lecture3.pdf), and for learning Boolean functions [Lecture Notes-8](http://www.cs.cmu.edu/~odonnell/boolean-analysis/lecture8.pdf)
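Meeting 13's polynomial-multiplication pipeline (evaluate both polynomials at roots of unity via the FFT, multiply pointwise, then interpolate with the inverse transform) can be sketched in a few lines. This is a minimal illustration of the standard radix-2 algorithm, not code from the lecture notes:

```python
import cmath

def fft(a, invert=False):
    # recursive radix-2 FFT; len(a) must be a power of two
    n = len(a)
    if n == 1:
        return a[:]
    even = fft(a[0::2], invert)
    odd = fft(a[1::2], invert)
    sign = 1 if invert else -1
    out = [0] * n
    for k in range(n // 2):
        w = cmath.exp(sign * 2j * cmath.pi * k / n)  # n-th root of unity
        out[k] = even[k] + w * odd[k]
        out[k + n // 2] = even[k] - w * odd[k]
    return out

def poly_multiply(p, q):
    # coefficient -> pointwise evaluation -> pointwise product -> coefficient,
    # for a total of O(n log n) arithmetic operations
    n = 1
    while n < len(p) + len(q) - 1:
        n *= 2
    fp = fft(p + [0] * (n - len(p)))
    fq = fft(q + [0] * (n - len(q)))
    prod = fft([x * y for x, y in zip(fp, fq)], invert=True)
    # the inverse transform needs a final division by n; round off float noise
    return [round((v / n).real) for v in prod[:len(p) + len(q) - 1]]

# (1 + 2x)(3 + 4x) = 3 + 10x + 8x^2
assert poly_multiply([1, 2], [3, 4]) == [3, 10, 8]
```

Evaluating at the complex roots of unity is exactly the "choosing the points of evaluation" step: those points make the even/odd split recurse cleanly.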
**Meeting 15** : Learning Boolean functions. From Fourier expansion to learning. Estimating the Fourier coefficients. Low-degree learning. The Fourier spectrum of decision trees.
References: For learning epsilon-concentrated functions, see Ryan O'Donnell's [Lecture Notes-8](http://www.cs.cmu.edu/~odonnell/boolean-analysis/lecture8.pdf), and for decision trees [Lecture Notes-9](http://www.cs.cmu.edu/~odonnell/boolean-analysis/lecture9.pdf). Low-degree learning is from Mansour's [Survey](http://www.tau.ac.il/~mansour/papers/fourier-survey.ps.gz).

**Meeting 16** : Learning decision trees, continued. Sparse functions: the Kushilevitz-Mansour theorem.
References: Learning sparse functions is from Mansour's [Survey](http://www.tau.ac.il/~mansour/papers/fourier-survey.ps.gz) (see Section 4.3). For learning decision trees see Ryan O'Donnell's [Lecture Notes-9](http://www.cs.cmu.edu/~odonnell/boolean-analysis/lecture9.pdf)
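The Fourier coefficients used throughout Meetings 14-16 can be computed by brute force for small n, which also lets one check Parseval's equality directly. A small sketch of mine (the convention chi_S(x) = (-1)^{sum of x_i over i in S} for x in {0,1}^n follows O'Donnell's notes; function names are made up):

```python
from itertools import product

def fourier_coeff(f, S, n):
    # hat f(S) = E_x[ f(x) * chi_S(x) ] over uniform x in {0,1}^n,
    # with the character chi_S(x) = (-1)^{sum_{i in S} x_i}
    total = 0
    for x in product([0, 1], repeat=n):
        chi = (-1) ** sum(x[i] for i in S)
        total += f(x) * chi
    return total / 2 ** n

def maj3(x):
    # majority of 3 bits, as a +-1-valued Boolean function
    return 1 if sum(x) >= 2 else -1

n = 3
subsets = [[i for i in range(n) if mask >> i & 1] for mask in range(2 ** n)]
coeffs = [fourier_coeff(maj3, S, n) for S in subsets]
# Parseval: since maj3 is +-1-valued, the squared coefficients sum to E[f^2] = 1
assert abs(sum(c * c for c in coeffs) - 1) < 1e-9
```

Learning algorithms replace the exact expectation above with an empirical average over random samples, which is exactly the "estimating the Fourier coefficients" step of Meeting 15.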
**Meeting 17** : DNF formulas. Learning algorithms for DNFs. Learning AC0: the Linial-Mansour-Nisan theorem. Overview of further applications of Fourier analysis of Boolean functions.
References: For learning DNFs see Ryan O'Donnell's [Lecture Notes-9](http://www.cs.cmu.edu/~odonnell/boolean-analysis/lecture9.pdf). For AC^0 circuits see the original [paper by Linial, Mansour, and Nisan](http://www.cs.columbia.edu/~rocco/Teaching/S12/Readings/LMN.pdf), in particular Lemmas 3, 4, 5, 6, and 7.

**Meeting 18** : Learning AC0, continued. Some concluding remarks.
References: None

**Theme 4 : Essentials of Coding Theory** - 11 meetings

- Meeting 19 : Wed, Jan 16, 02:00 pm-04:00 pm -
- Meeting 20 : Fri, Jan 18, 03:00 pm-05:00 pm -
- Meeting 21 : Wed, Jan 23, 02:00 pm-04:00 pm -
- Meeting 22 : Fri, Jan 25, 02:00 pm-04:00 pm -
- Meeting 23 : Wed, Jan 30, 02:00 pm-04:00 pm -
- Meeting 24 : Fri, Feb 01, 03:00 pm-05:00 pm -
- Meeting 25 : Wed, Feb 06, 02:00 pm-04:00 pm -
- Meeting 26 : Wed, Mar 06, 02:00 pm-04:00 pm -
- Meeting 27 : Fri, Mar 08, 03:00 pm-05:00 pm -
- Meeting 28 : Wed, Mar 13, 02:00 pm-04:00 pm -
- Meeting 29 : Fri, Mar 22, 03:00 pm-05:00 pm -

**Meeting 19** : Administrative announcements. Introduction to coding theory. Shannon: the noiseless coding theorem and the noisy encoding theorem. Basic notions: distance, rate, etc.
References: *Introduction to Coding Theory* by J. H. van Lint; *Essential Coding Theory*, lecture notes by Atri Rudra

**Meeting 20** : Application to hashing. Hamming codes, the Hamming bound, linear codes.
References: *Introduction to Coding Theory* by J. H. van Lint

**Meeting 21** : Hadamard codes, the Singleton bound, and Reed-Solomon codes.
References: None

**Meeting 22** : Institute holiday: Id-e-Milad
References: None

**Meeting 23** : Reed-Solomon codes. Reducing the alphabet size: Reed-Muller codes and concatenated codes. Decoding Reed-Solomon codes: the Berlekamp-Welch decoding algorithm.
References: None
Reading: [Notes by Venkat Guruswami](http://www.cs.cmu.edu/~venkatg/teaching/codingtheory/notes/notes6.pdf)

**Meeting 24** : Concatenated codes. Decoding Reed-Solomon codes: the Berlekamp-Welch decoding algorithm. Introduction to list-decodable codes.
References: [Notes by Atri Rudra](http://www.cse.buffalo.edu/~atri/courses/coding-theory/lectures/lect27.pdf)
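To make Meeting 20's Hamming codes concrete, here is a textbook Hamming(7,4) encoder and single-error corrector (a standard sketch of mine, not taken from van Lint; the function names are made up). The three parity bits sit at positions 1, 2, and 4, so the recomputed parities spell out the 1-based position of any single flipped bit:

```python
def hamming74_encode(d):
    # d: 4 data bits; returns the 7-bit codeword [p1, p2, d1, p3, d2, d3, d4]
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4  # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4  # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4  # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_correct(c):
    # recompute the parities; the syndrome is the 1-based error position (0 = clean)
    c = c[:]
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s1 + 2 * s2 + 4 * s3
    if pos:
        c[pos - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]  # extract the data bits

data = [1, 0, 1, 1]
word = hamming74_encode(data)
word[3] ^= 1  # flip any single bit
assert hamming74_correct(word) == data
```

With distance 3, the code corrects any single error, matching the Hamming bound with equality (it is a perfect code).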
**Meeting 25** : List-decodable codes, continued. List decoding Reed-Solomon codes: a toy case of list decoding, and Sudan's algorithm for list decoding Reed-Solomon codes.
References: Lecture notes by Venkat Guruswami: [see this](http://www.cs.cmu.edu/~venkatg/teaching/codingtheory/notes/notes9.pdf) for the toy problem and [see this](http://www.cs.cmu.edu/~venkatg/teaching/codingtheory/notes/notes10.pdf) for Sudan's algorithm.

**Meeting 26** : Introduction to average-case hardness. Average-case hardness of the permanent: 1) If the permanent can be computed on at least a 1/2 + epsilon fraction of the inputs, then there is a randomized algorithm that works on all inputs. 2) Improving the above bound to a 1/poly(n) fraction of inputs: Cai-Pavan-Sivakumar.
References: [Survey](http://www.cs.cmu.edu/~venkatg/pubs/papers/ld-avg-PR.pdf) by Venkat Guruswami. [Paper](pages.cs.wisc.edu/~jyc/papers/permanent.ps) by Cai, Pavan, and Sivakumar

**Meeting 27** : Continuation of worst-case vs. average-case hardness of the permanent. Proof of Cai-Pavan-Sivakumar. Applications of codes: average-case hardness, worst-case to average-case reductions for E.
References: None
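Before list decoding, the Berlekamp-Welch unique decoder from Meetings 23-24 is worth seeing end to end. The following self-contained sketch (my own illustration over a small prime field; all helper names are hypothetical) finds an error-locator polynomial E (monic, degree e) and a polynomial Q (degree < k + e) with Q(x_i) = y_i E(x_i) at every evaluation point, then recovers the message polynomial as Q / E:

```python
def solve_mod(A, b, p):
    # Gaussian elimination over GF(p); returns one solution of A x = b
    n, m = len(A), len(A[0])
    M = [row[:] + [rhs] for row, rhs in zip(A, b)]
    piv_cols, r = [], 0
    for c in range(m):
        piv = next((i for i in range(r, n) if M[i][c] % p), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        inv = pow(M[r][c], p - 2, p)          # field inverse (p prime)
        M[r] = [v * inv % p for v in M[r]]
        for i in range(n):
            if i != r and M[i][c] % p:
                f = M[i][c]
                M[i] = [(v - f * w) % p for v, w in zip(M[i], M[r])]
        piv_cols.append(c)
        r += 1
    x = [0] * m                                # free variables set to 0
    for row, c in zip(range(r), piv_cols):
        x[c] = M[row][m]
    return x

def berlekamp_welch(points, k, e, p):
    # decode with at most e errors, given n >= k + 2e points (x_i, y_i):
    # linear system Q(x_i) - y_i * E(x_i) = 0 with E monic of degree e
    A, b = [], []
    for x, y in points:
        row = [pow(x, j, p) for j in range(k + e)]            # Q coefficients
        row += [(-y * pow(x, j, p)) % p for j in range(e)]    # E coefficients
        A.append(row)
        b.append(y * pow(x, e, p) % p)                        # monic term of E
    sol = solve_mod(A, b, p)
    Q, E = sol[:k + e], sol[k + e:] + [1]
    # exact long division Q / E over GF(p) (E is monic)
    quot = [0] * (len(Q) - len(E) + 1)
    for i in reversed(range(len(quot))):
        quot[i] = Q[i + len(E) - 1] % p
        for j, ec in enumerate(E):
            Q[i + j] = (Q[i + j] - quot[i] * ec) % p
    return quot

p, k, e = 13, 2, 1                      # n = k + 2e = 4 evaluation points
f = [5, 3]                              # message polynomial 5 + 3x over GF(13)
xs = [1, 2, 3, 4]
ys = [(f[0] + f[1] * x) % p for x in xs]
ys[2] = (ys[2] + 7) % p                 # corrupt one symbol
assert berlekamp_welch(list(zip(xs, ys)), k, e, p) == f
```

The correctness argument is the one from the lectures: Q - f*E has degree below k + e yet vanishes at all n - e agreement points, so it is identically zero and the division is exact.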
**Meeting 28** : Worst-case to average-case reductions, continued. Amplification of hardness within E. Locally decodable codes. A local decoder for Hadamard codes. *Repetition lecture:* Sudan's list decoding algorithm.
References: Section 3 of the [survey](http://www.cs.berkeley.edu/~luca/pubs/codingsurvey.pdf) by Luca Trevisan. See Section 3.1 for a definition of locally decodable codes and Section 3.3.1 for a description of the local decoder for Hadamard codes.

**Meeting 29** : Local list decoding. The Goldreich-Levin theorem: statement. Application of the GL decoder to construct hardcore predicates from one-way permutations. Proof of the Goldreich-Levin theorem.
References: Section 4 of the [survey](http://theory.stanford.edu/~trevisan/pubs/codingsurvey.pdf) by Luca Trevisan. The proof of the Goldreich-Levin result is in Section 4.4; hardcore predicates are in Sections 4.5.1 and 4.5.2.

**Theme 5 : Evaluation Meetings** - 20 meetings

- Meeting 30 : Fri, Mar 15, 03:00 pm-05:30 pm -
- Meeting 31 : Sat, Apr 20, 02:00 pm-02:45 pm -
- Meeting 32 : Sat, Apr 20, 02:45 pm-03:30 pm -
- Meeting 33 : Sat, Apr 20, 03:30 pm-04:15 pm -
- Meeting 34 : Sat, Apr 20, 04:15 pm-05:00 pm -
- Meeting 35 : Wed, Apr 24, 02:00 pm-02:45 pm -
- Meeting 36 : Wed, Apr 24, 02:45 pm-03:30 pm -
- Meeting 37 : Wed, Apr 24, 03:30 pm-04:15 pm -
- Meeting 38 : Wed, Apr 24, 04:15 pm-05:00 pm -
- Meeting 39 : Sat, Apr 27, 02:00 pm-02:45 pm -
- Meeting 40 : Sat, Apr 27, 02:45 pm-03:30 pm -
- Meeting 41 : Sat, Apr 27, 03:30 pm-04:15 pm -
- Meeting 42 : Sat, Apr 27, 04:15 pm-05:00 pm -
- Meeting 43 : Sun, Apr 28, 09:15 am-10:00 am -
- Meeting 44 : Sun, Apr 28, 10:00 am-10:45 am -
- Meeting 45 : Sun, Apr 28, 10:45 am-11:30 am -
- Meeting 46 : Sun, Apr 28, 11:30 am-12:15 pm -
- Meeting 47 : Tue, Apr 30, 02:00 pm-02:45 pm -
- Meeting 48 : Tue, Apr 30, 02:45 pm-03:30 pm -
- Meeting 49 : Sat, May 04, 02:00 pm-04:00 pm -

**Mid-Semester Exam**

Student presentations:

- **Vamsi Krishna** : Chinese Remaindering with errors
- **Suraj Sangavkar** : Private Information Retrieval
- **Akshay Degwekar** : Pseudorandomness for read-once formulas
- **William Kumar Moses Jr** : Rumor spreading in social networks
- **Ramnath J** : Janson's inequality, applications to subgraph counts and randomized rounding
- **Shiv Poojan Singh** : Method of bounded variances, application to dealing with unlikely circumstances
- **Gaurav Singh** : The sum of d small-bias generators fools polynomials of degree d
- **Anant Dhayal** : BP-Space(S) ⊆ DSPACE(S^{3/2})
- **Sumathi** : A constructive proof of the general Lovász Local Lemma
- **Anand Aiyer** : A constructive proof of the general Lovász Local Lemma
- **Vishal Pandya** : A tail bound for read-k families of functions
- **Tejas Kulkarni** : The randomized coloring procedure with symmetry-breaking
- **Saurabh Kumar Verma** : Min Cut
- **Sharath Chandar** : Active property testing
- **Ashwini Kumar Sahoo** : Randomized skip lists: an application of the Chernoff-Hoeffding bound
- **Ankit Chauhan** : Learning and lower bounds for AC0 with threshold gates
- **Anant Nag** : Randomized algorithms for DNF counting
- **Choppa Aditya** : (Probabilistic) recurrence relations revisited

**End-Semester Exam**