Meetings

Click on a theme for that theme's meeting plan. Click on a meeting for its references, exercises, and additional reading.

**Theme 1 : Probabilistic Method & Pseudo-randomness** - 11 meetings

- Meeting 01 : Wed, Jan 15, 02:00 pm-04:00 pm - Narayanaswamy
- Meeting 02 : Fri, Jan 17, 02:00 pm-04:00 pm - Narayanaswamy
- Meeting 03 : Wed, Jan 22, 02:00 pm-04:00 pm - Narayanaswamy
- Meeting 04 : Fri, Jan 24, 02:00 pm-04:00 pm - Narayanaswamy
- Meeting 05 : Wed, Jan 29, 02:00 pm-04:00 pm - Narayanaswamy
- Meeting 06 : Fri, Jan 31, 02:00 pm-04:00 pm - Narayanaswamy
- Meeting 07 : Wed, Feb 05, 02:00 pm-04:00 pm - Narayanaswamy
- Meeting 08 : Fri, Feb 07, 02:00 pm-04:00 pm - Narayanaswamy
- Meeting 09 : Wed, Feb 12, 02:00 pm-03:00 pm - Narayanaswamy
- Meeting 10 : Wed, Feb 19, 02:00 pm-04:00 pm - Narayanaswamy
- Meeting 11 : Fri, Feb 21, 03:00 pm-05:00 pm - Narayanaswamy

Meeting 01 : The evaluation pattern and policy; the course webpage and its contents. Material discussed: the basic probabilistic method - experiments, outcomes, probabilities of outcomes; experiments as samples from probability distributions. Overview of the theme: random variables, expected value, linearity of expectation, Markov's inequality, Chebyshev's inequality, Chernoff-Hoeffding bounds, the Lovász Local Lemma, limited-independence sample spaces, randomness extractors, random walks and their applications, expander graphs. An introduction to the probabilistic method via MAX-CUT: color the vertices 0/1 uniformly at random; the expected number of non-monochromatic edges is half the number of edges, so there exists a cut of value at least half the number of edges. Converting this existential argument into a deterministic algorithm that finds such a cut.
References: The Probabilistic Method by Alon and Spencer; Randomized Algorithms by Motwani and Raghavan.
Reading: Pairwise Independence and Derandomization by Luby and Wigderson.
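A minimal sketch (not from the lecture notes; names are illustrative) of the conditional-expectation derandomization for MAX-CUT: process the vertices in any order and place each one on the side that cuts at least half of its edges to already-placed neighbors, so the conditional expectation of the cut value never drops below |E|/2.

```python
def greedy_maxcut(n, edges):
    """Deterministic cut of value >= |edges|/2 via conditional expectations.

    Each vertex goes to the side cutting at least half of its edges to
    previously placed vertices, so the final cut has value >= |E|/2.
    """
    adj = {v: [] for v in range(n)}
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    side = {}
    for v in range(n):
        placed = [u for u in adj[v] if u in side]
        cut_if_one = sum(1 for u in placed if side[u] == 0)
        cut_if_zero = len(placed) - cut_if_one
        side[v] = 1 if cut_if_one >= cut_if_zero else 0
    return side

edges = [(0, 1), (0, 2), (1, 2), (2, 3), (3, 4), (4, 0)]
side = greedy_maxcut(5, edges)
assert 2 * sum(1 for u, v in edges if side[u] != side[v]) >= len(edges)
```

Each step "exposes" the edges to already-placed neighbors exactly once, and at least half of the newly exposed edges are cut, which is the whole derandomization argument.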
Meeting 02 : A different distribution for MAX-CUT: assign unit vectors to the vertices so that, for as many edges as possible, the vectors at the endpoints of an edge are orthogonal. Now pick a random hyperplane through the origin and partition the vertices according to the side of the hyperplane on which their vectors fall; this yields a result stronger than half the number of edges. Constructing pairwise independent sample spaces using GF(2^t). Back to the basic probabilistic method: lower bounds on the Ramsey number R(k,k), the existence of tournaments with property S_k, and the existence of dominating sets of a certain size in a graph G.
References: For the MAX-CUT results, see The Design of Approximation Algorithms by Williamson and Shmoys. For the GF(2^t)-based construction of pairwise independent sample spaces, see Luby and Wigderson. The rest is from Alon and Spencer.
Reading: The Design of Approximation Algorithms by Williamson and Shmoys; Pairwise Independence and Derandomization by Luby and Wigderson.
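A small sketch of a pairwise independent sample space, in the spirit of the GF(2^t) construction above (this GF(2)-linear variant and its names are illustrative): from t uniformly random seed bits, output 2^t - 1 bits, one per nonempty subset of seed positions, each being the XOR of the seed bits in its subset. Pairwise independence can then be verified by exhaustively enumerating all 2^t seeds.

```python
from itertools import product

def sample_space(t):
    """For each seed in {0,1}^t, output 2^t - 1 bits: bit S is the XOR of
    the seed bits indexed by the nonempty subset S (a bitmask)."""
    subsets = list(range(1, 2 ** t))
    space = []
    for seed in product([0, 1], repeat=t):
        bits = [sum(seed[i] for i in range(t) if s >> i & 1) % 2
                for s in subsets]
        space.append(bits)
    return space

space = sample_space(4)          # 15 pairwise independent bits from 4 random bits
n = len(space[0])
# exhaustive check: every pair (b_i, b_j) is exactly uniform over {0,1}^2
for i in range(n):
    for j in range(i + 1, n):
        counts = {}
        for bits in space:
            counts[(bits[i], bits[j])] = counts.get((bits[i], bits[j]), 0) + 1
        assert all(counts.get(p, 0) == len(space) // 4
                   for p in product([0, 1], repeat=2))
```

The check succeeds because any two distinct nonzero subset-masks are linearly independent over GF(2), so the pair of bits is a surjective linear image of the uniform seed.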
Meeting 03 : Existence of a tournament on n players in which every k-element subset of players is defeated by some player (property S_k). Upper bounds on the size of a dominating set obtained by sampling each vertex independently, and the connection to the greedy algorithm for finding a minimum dominating set.
References: Alon and Spencer, Chapter 1.
Reading: Probabilistic Lens: the Erdos-Ko-Rado theorem.

Meeting 04 : Two-colorability of n-uniform hypergraphs. Definition of m(n), the minimum number of hyperedges in an n-uniform hypergraph that is not 2-colorable; in other words, every n-uniform hypergraph with at most m(n)-1 hyperedges is 2-colorable. A simple probabilistic argument shows that a hypergraph with fewer than 2^{n-1} edges is 2-colorable; presentation of a better lower bound as well. Using the probabilistic method to place an upper bound on m(n): fix a 2-coloring of the vertices, pick n-uniform hyperedges uniformly at random, and ask for the probability that the coloring is not valid. Choose parameters so that no 2-coloring is valid for the randomly chosen instance; the number of sets needed for this probability to fall below 1 is an upper bound on m(n).
References: Alon and Spencer.
Exercises: If the number of edges in an n-uniform hypergraph is smaller than the lower bound on m(n), can you find a 2-coloring?

Meeting 05 : Balancing vectors: for any set of n unit vectors, re-orienting each by a random rotation from {0, pi} (i.e., a random sign) yields a sum vector whose expected squared length is n, so some choice of signs achieves length at most the square root of n; Theorems 4.1 and 4.2. The method of alterations: sample from a distribution, then alter the sample. A different experiment yielding a better bound on R(k,k). Then, the probabilistic method applied to lower-bound the maximum independent set size in sparse graphs (nd/2 edges, for some d).
References: Alon and Spencer.
Exercises: Identify an algorithm for the multipliers in Theorems 4.1 and 4.2.
Reading: Turán's Theorem, after Chapter 6 in Alon and Spencer.
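An illustrative sketch (names mine) of the first two-colorability bound above: with m < 2^{n-1} edges, a uniformly random 2-coloring is proper with probability at least 1 - m·2^{1-n} > 0, so simply resampling until success finds a proper coloring quickly in expectation.

```python
import random

def color_hypergraph(n_vertices, edges, seed=0):
    """Proper 2-coloring of an n-uniform hypergraph by resampling.

    For m < 2^(n-1) edges, a uniform random coloring is proper with
    probability > 1 - m * 2^(1-n) > 0, so this terminates quickly.
    """
    rng = random.Random(seed)
    while True:
        color = [rng.randrange(2) for _ in range(n_vertices)]
        # proper = no hyperedge is monochromatic
        if all(len({color[v] for v in e}) == 2 for e in edges):
            return color

# 3-uniform hypergraph with m = 3 < 2^(3-1) = 4 edges
edges = [(0, 1, 2), (1, 2, 3), (2, 3, 4)]
coloring = color_hypergraph(5, edges)
assert all(len({coloring[v] for v in e}) == 2 for e in edges)
```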
Meeting 06 : Focus on Chebyshev's inequality. Universal hash families guaranteeing pairwise independence. For A, B subsets of the set of t-bit strings, approximating the measure of A x B, namely |A||B|/2^{2t}. Use of Chebyshev's inequality in estimating this measure, together with Markov's inequality. Application to computing higher matrix powers, and the application of higher matrix powers to derandomization via pseudorandom generators.
References: Tompa's notes on Randomized Algorithms and Pseudorandom Generators.
Exercises: The Probabilistic Lens: Bregman's theorem.
Reading: The chapter on the second moment in Alon and Spencer.

Meeting 07 : Derandomizing space-S(n) bounded randomized algorithms. The model RSPACE(S(n)) with one-sided error, stopping in time 2^{S(n)}. Modeling the computation of a randomized algorithm on an input as querying an entry of the 2^{S(n)}-th power of an implicitly specified matrix. Using the hash lemma proved earlier to argue that a randomly chosen h is good for approximating the square of this matrix. The distance between the approximate square and the actual square is formalized as the statistical distance between the two distributions of a 2-step random walk started at a vertex; this is done for each start vertex, and the worst distance is taken as the distance between the two matrices. Setting up the recursion: assuming the hash functions are good at every recursive step, the accumulated error is bounded by the error of one squaring multiplied by an exponential in the number of recursion levels. The bad event is that some hash function is bad at some recursion level; the union bound is applied together with the Chebyshev-inequality result. This analysis also yields the structure of the pseudorandom generator.
References: Tompa's notes; Nisan's 1991 paper "Pseudorandom Generators for Space-Bounded Computation".
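An illustrative sketch of a pairwise independent (universal) hash family of the kind used above: the textbook family h_{a,b}(x) = ax + b mod p. For any fixed x1 != x2, the pair (h(x1), h(x2)) is exactly uniform over a uniformly random seed (a, b), which can be checked by exhaustive enumeration. (The lecture's family is over t-bit strings; this mod-p version is the standard analogue, and the names are mine.)

```python
P = 7  # a small prime modulus

def h(a, b, x):
    """A member of the hash family h_{a,b}(x) = a*x + b (mod P)."""
    return (a * x + b) % P

# For fixed x1 != x2, enumerate all P^2 seeds (a, b) and check that
# (h(x1), h(x2)) takes every value in Z_P x Z_P exactly once.
x1, x2 = 2, 5
pairs = {}
for a in range(P):
    for b in range(P):
        key = (h(a, b, x1), h(a, b, x2))
        pairs[key] = pairs.get(key, 0) + 1
assert len(pairs) == P * P and all(c == 1 for c in pairs.values())
```

The count is exactly one per value because (a, b) -> (a·x1 + b, a·x2 + b) is an invertible linear map over Z_P whenever x1 != x2.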
Meeting 08 : Statement of the Lovász Local Lemma and its proof. A simple application to Property B.
References: Alon and Spencer.

Meeting 09 : The symmetric version of the Lovász Local Lemma. Application to 2-coloring hypergraphs, and the connection to bounded-occurrence SAT. SAT in which each variable occurs at most twice is easy: it reduces to matching in polynomial time and is also solvable in log space. On coloring the real numbers so that, for any set S, every translate of S is multicolored.
References: Alon and Spencer.
Reading: Satisfiability Problems Complete for Deterministic Logarithmic Space, STACS 2004: 317-325; Karpinski's results on bounded-occurrence SAT.

Meeting 10 : Random walks on undirected graphs. Hitting time, cover time, commute time. Independence of a commute from the edges traversed. A randomized log-space algorithm for undirected s-t connectivity.
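The randomized s-t connectivity algorithm is just a simple random walk; here is a sketch (graph, step budget, and names chosen for illustration). Since the expected cover time of a connected undirected graph is O(|V||E|), polynomially many steps suffice with high probability, and only the current vertex needs to be stored.

```python
import random

def st_connected(adj, s, t, steps, seed=0):
    """Random-walk s-t connectivity with one-sided error: 'True' is always
    correct; 'False' may be wrong, but a walk of poly-many steps covers the
    component of s with high probability."""
    rng = random.Random(seed)
    v = s
    for _ in range(steps):
        if v == t:
            return True
        v = rng.choice(adj[v])
    return v == t

# a path 0-1-2-3: its cover time is tiny compared to the step budget
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
assert st_connected(adj, 0, 3, steps=100000)
```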
References: The paper by Aleliunas, Karp, Lipton, Lovász, and Rackoff (AKLLR), and Tompa's notes.
Reading: http://dbwilson.com/mix/ is a very nice paper.

Meeting 11 : A geometric application of the Lovász Local Lemma, due to Mani-Levitska and Pach, concerning coverings of R^3 by unit balls. A k-covering is one in which every point is covered at least k times. Pach's result: if each point is covered by at most t balls, then there is a decomposition of the balls into 2 sets such that each one is a covering.
References: Alon and Spencer.

**Theme 2 : Algorithmic Coding Theory & Applications** - 11 meetings

- Meeting 12 : Wed, Feb 26, 02:00 pm-04:00 pm - Jayalal Sarma
- Meeting 13 : Wed, Mar 05, 02:00 pm-04:00 pm - Jayalal Sarma
- Meeting 14 : Fri, Mar 07, 02:00 pm-04:00 pm - Jayalal Sarma
- Meeting 15 : Wed, Mar 12, 02:00 pm-04:00 pm - Jayalal Sarma
- Meeting 16 : Fri, Mar 14, 02:00 pm-04:00 pm - Jayalal Sarma
- Meeting 17 : Sat, Mar 15, 10:00 am-12:00 pm - Jayalal Sarma
- Meeting 18 : Wed, Mar 19, 02:00 pm-04:00 pm - Jayalal Sarma
- Meeting 19 : Fri, Mar 21, 02:00 pm-03:00 pm - Jayalal Sarma
- Meeting 20 : Sat, Mar 22, 03:00 pm-05:00 pm - Jayalal Sarma
- Meeting 21 : Wed, Mar 26, 02:00 pm-04:00 pm -
- Meeting 22 : Fri, Mar 28, 02:00 pm-04:00 pm -

Meeting 12 : Hamming vs Shannon: an overview. Modeling the channel. The noiseless case: compression and decompression; the notion of entropy; Shannon's Source Coding Theorem. Noisy channels: the binary symmetric channel. The Noisy Coding Theorem; discussion and channel capacity. Proof of existence (overview).
References: Lecture notes will be sent out once the proof is completed. Till then, [this is a good reference](http://people.csail.mit.edu/madhu/FT01/scribe/lect1.ps).
Reading: [Venkat Guruswami's Notes on Entropy](www.cs.cmu.edu/~venkatg/teaching/ITCS-spr2013/notes/15359-2009-lecture25.pdf).

Meeting 13 : Shannon's Noisy Coding Theorem: proof of the upper bound; outline of an argument for the lower bound. A taxonomy of codes. The targets of coding theory.
References: None.
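As a small numeric sketch of the quantities above (function names mine): the binary entropy function H(p) and the capacity 1 - H(p) of the binary symmetric channel with crossover probability p.

```python
from math import log2

def H(p):
    """Binary entropy H(p) = -p*log2(p) - (1-p)*log2(1-p)."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p):
    """Shannon capacity of the binary symmetric channel with crossover
    probability p: rates below 1 - H(p) are achievable."""
    return 1 - H(p)

assert H(0.5) == 1.0                           # a fair coin carries one bit
assert bsc_capacity(0.0) == 1.0                # noiseless channel
assert abs(bsc_capacity(0.11) - 0.5) < 0.01    # ~half a bit per use at p = 0.11
```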

Meeting 14 : A combinatorial and constructive view: minimum distance and rate. A simple application of codes: constructing hash families from codes. Basic code constructions. The Hamming code. The generator matrix and the parity check matrix.
References: None.

Meeting 15 : Can a better rate be achieved for d = 3? The Hamming bound. A characterization of minimum distance. Generalized Hamming codes and their distance. The dual of a code. Parity check matrix vs generator matrix. The Hadamard code and its distance.
References: None.

Meeting 16 : The Singleton bound. Reed-Solomon codes. Reed-Muller codes. Concatenated codes: bounds and parameters.
References: None.

Meeting 17 : Proof of the Schwartz-Zippel Lemma. The unique decoding problem; trivial algorithms. The decoding problem for Reed-Solomon codes. Error-locating polynomials. The Berlekamp-Welch decoding algorithm for Reed-Solomon codes.
References: None.
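A compact sketch of the basic construction above, the [7,4] Hamming code with distance 3 (an illustrative implementation, not the lecture's notation): placing the parity bits at positions 1, 2, 4 makes the syndrome spell out the error position in binary, which is the parity-check-matrix view in disguise.

```python
from itertools import product

def hamming74_encode(d):
    """Encode 4 data bits as [p1, p2, d1, p3, d2, d3, d4] (positions 1..7)."""
    d1, d2, d3, d4 = d
    return [d1 ^ d2 ^ d4,      # p1 checks positions 1,3,5,7
            d1 ^ d3 ^ d4,      # p2 checks positions 2,3,6,7
            d1,
            d2 ^ d3 ^ d4,      # p3 checks positions 4,5,6,7
            d2, d3, d4]

def hamming74_decode(c):
    """Correct up to one flipped bit; the syndrome is the error position."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s1 + 2 * s2 + 4 * s3     # 0 means "no error detected"
    if pos:
        c[pos - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]

# distance 3: every single-bit error on every codeword is corrected
for data in product([0, 1], repeat=4):
    code = hamming74_encode(data)
    for i in range(7):
        corrupted = code[:]
        corrupted[i] ^= 1
        assert hamming74_decode(corrupted) == list(data)
```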

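The Berlekamp-Welch unique decoder can be sketched concretely: find a monic degree-e error-locator polynomial E and a polynomial Q of degree below e + k with Q(x_i) = y_i·E(x_i) at every evaluation point (a linear system over GF(p)); the message polynomial is then exactly Q/E whenever at most e = (n-k)/2 values are wrong. This is an illustrative toy implementation over a small prime field; all names are mine and nothing is optimized.

```python
P = 13  # a small prime field GF(13)

def inv(a):
    return pow(a, P - 2, P)

def solve_mod(rows, rhs):
    """Gaussian elimination over GF(P); returns one solution of a
    consistent system (free variables set to 0)."""
    m = [row[:] + [b] for row, b in zip(rows, rhs)]
    ncols = len(rows[0])
    pivots, r = [], 0
    for c in range(ncols):
        piv = next((i for i in range(r, len(m)) if m[i][c]), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        s = inv(m[r][c])
        m[r] = [x * s % P for x in m[r]]
        for i in range(len(m)):
            if i != r and m[i][c]:
                f = m[i][c]
                m[i] = [(x - f * y) % P for x, y in zip(m[i], m[r])]
        pivots.append(c)
        r += 1
    sol = [0] * ncols
    for i, c in enumerate(pivots):
        sol[c] = m[i][-1]
    return sol

def bw_decode(xs, ys, k):
    """Recover the degree-<k message polynomial from n evaluations, up to
    e = (n-k)//2 of which may be wrong: solve Q(x_i) = y_i * E(x_i) for
    Q (deg < e+k) and monic E (deg e), then divide Q by E."""
    n = len(xs)
    e = (n - k) // 2
    rows, rhs = [], []
    for x, y in zip(xs, ys):
        row = [pow(x, j, P) for j in range(e + k)]          # Q coefficients
        row += [-y * pow(x, j, P) % P for j in range(e)]    # E coefficients
        rows.append(row)
        rhs.append(y * pow(x, e, P) % P)                    # from E's monic term
    sol = solve_mod(rows, rhs)
    Q, E = sol[:e + k], sol[e + k:] + [1]
    # exact polynomial division Q / E (coefficients listed low degree first)
    out = [0] * k
    for i in range(k - 1, -1, -1):
        c = Q[i + e] % P          # E is monic, so no inverse is needed
        out[i] = c
        for j in range(e + 1):
            Q[i + j] = (Q[i + j] - c * E[j]) % P
    return out

# message polynomial 2 + 3x + 5x^2 over GF(13), evaluated at 7 points
msg = [2, 3, 5]
xs = list(range(7))
ys = [(msg[0] + msg[1] * x + msg[2] * x * x) % P for x in xs]
ys[1], ys[4] = (ys[1] + 6) % P, (ys[4] + 9) % P   # inject e = 2 errors
assert bw_decode(xs, ys, k=3) == msg
```

The correctness hinges on the lemma from the lecture: Q - f·E has degree below e + k yet vanishes on the at least n - e error-free points, hence is identically zero, so any solution of the system yields the same quotient.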

Meeting 18 : The list decoding problem. The Johnson bound. List decoding of Reed-Solomon codes. A bivariate view of the Berlekamp-Welch decoding algorithm. Sudan's list decoding algorithm.
References: None.

Meeting 19 : Shorter lecture. Amplification of the number of roots by the method of multiplicities. The Guruswami-Sudan list decoding algorithm for Reed-Solomon codes, which achieves the Johnson bound.
References: None.

Meeting 20 : Application: an efficient algorithm computing the permanent on a 1/2 + epsilon fraction of inputs implies an efficient randomized algorithm for the permanent that works on all inputs.

Cai-Pavan-Sivakumar: efficiently computing the permanent on at least a 1/n^k fraction of inputs implies an efficient randomized algorithm for the permanent that works on all inputs. Tool: Sudan's list-decoding algorithm.
References: None.

Meeting 21 : Sublinear-time decoding: locally decodable codes. Applications to private information retrieval. A local smooth decoder that suits both purposes; a local smooth decoder for the Hadamard code. Locally decodable codes for hardness amplification within E: converting a worst-case hard Boolean function into an average-case hard Boolean function using a local decoder.
References: None.

Meeting 22 : Improving the average-case hardness using local list decoders. Back to local decoding: local decoding vs local correction. A t-query local corrector for Reed-Muller codes (where t is the degree bound of the polynomials). Making the code systematic. The trade-off between the query complexity and the rate of an LDC. Variants achieving constant query complexity at the expense of rate.
References: None.

**Theme 3 : Fourier Analysis & Applications** - 6 meetings

- Meeting 23 : Wed, Apr 02, 02:00 pm-04:00 pm - Raghavendra Rao
- Meeting 24 : Fri, Apr 04, 02:00 pm-04:00 pm - Raghavendra Rao
- Meeting 25 : Wed, Apr 09, 02:00 pm-04:00 pm - Raghavendra Rao
- Meeting 26 : Fri, Apr 11, 02:00 pm-04:00 pm - Raghavendra Rao
- Meeting 27 : Wed, Apr 23, 02:00 pm-04:00 pm - Raghavendra Rao
- Meeting 28 : Fri, Apr 25, 02:00 pm-04:00 pm - Raghavendra Rao

Meeting 23 : Introduction to discrete Fourier transforms. The example of polynomial multiplication. The FFT formulation; the inverse transform. Multiplying polynomials in O(n log n) operations.
References: [Notes](http://www.cse.iitk.ac.in/users/manindra/CS681/2005/Lecture5.pdf) by Manindra Agrawal.

Meeting 24 : Parseval's equality. Fourier transforms of example Boolean functions. Linearity testing: the BLR test. Learning Boolean functions.
References: Section 1.6 of O'Donnell's book.

Meeting 25 : Learning Boolean functions: from the Fourier expansion to learning. Estimating the Fourier coefficients. Low-degree learning.
References: O'Donnell, Chapter 3, and Mansour's survey.

Meeting 26 : The Fourier spectrum of decision trees. Sparse functions: the Kushilevitz-Mansour theorem.
References: [Notes](http://www.cs.cmu.edu/~odonnell/boolean-analysis/lecture9.pdf) by Ryan O'Donnell.

Meeting 27 : Learning sparse functions. An algorithm for enumerating the sets with large Fourier coefficients.
References: Ryan O'Donnell's book, Chapter 3 (see the Goldreich-Levin theorem); also see Mansour's survey.
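An illustrative brute-force sketch of the Fourier expansion of Boolean functions, in the ±1 convention of O'Donnell's book (names mine): hat(f)(S) = E_x[f(x)·chi_S(x)] with chi_S(x) = (-1)^(sum of x_i over i in S). Parseval's equality says the squared coefficients of a ±1-valued f sum to exactly 1, and a linear (parity) function has a single nonzero coefficient.

```python
from itertools import product

def fourier_coefficients(f, n):
    """All coefficients hat(f)(S) = E_x[f(x) * chi_S(x)] of
    f: {0,1}^n -> {-1,+1}; subsets S are 0/1 indicator tuples."""
    points = list(product([0, 1], repeat=n))
    coeffs = {}
    for S in points:
        total = sum(f(x) * (-1) ** sum(a & b for a, b in zip(S, x))
                    for x in points)
        coeffs[S] = total / 2 ** n
    return coeffs

maj3 = lambda x: 1 if sum(x) >= 2 else -1
c = fourier_coefficients(maj3, 3)
assert abs(sum(v * v for v in c.values()) - 1.0) < 1e-9   # Parseval

parity01 = lambda x: (-1) ** (x[0] ^ x[1])                # a linear function
c2 = fourier_coefficients(parity01, 3)
assert c2[(1, 1, 0)] == 1.0                               # one coefficient, weight 1
```

This exhaustive computation takes 4^n steps; the point of the lectures above is that sampling estimates each coefficient to accuracy epsilon with poly(1/epsilon) queries, which is what makes low-degree learning feasible.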
Meeting 28 : Learning DNFs. Håstad's switching lemma (without proof). AC0 circuits: the Linial-Mansour-Nisan theorem.
References: [Notes](http://www.cs.cmu.edu/~odonnell/boolean-analysis/lecture9.pdf) by Ryan O'Donnell.

**Theme 4 : Student Presentations** - 18 meetings

- Meeting 29 : Tue, Apr 08, 04:30 pm-05:15 pm - Sivaramakrishnan
- Meeting 30 : Tue, Apr 08, 05:15 pm-06:00 pm - Dhannya B.
- Meeting 31 : Wed, Apr 09, 04:30 pm-05:15 pm - Manoj Kalaskar
- Meeting 32 : Wed, Apr 09, 05:15 pm-06:00 pm - Parishkrati
- Meeting 33 : Thu, Apr 10, 04:30 pm-05:15 pm - Karthik Abhinav
- Meeting 34 : Thu, Apr 10, 05:15 pm-06:00 pm - Ashwin Pananjady
- Meeting 35 : Mon, Apr 21, 05:00 pm-05:45 pm - Akshay Ram
- Meeting 36 : Mon, Apr 21, 05:15 pm-06:00 pm - Subin K.
- Meeting 37 : Wed, Apr 23, 05:15 pm-06:00 pm - Vivek Bagaria
- Meeting 38 : Thu, Apr 24, 02:00 pm-02:45 pm - Ramya C.
- Meeting 39 : Thu, Apr 24, 02:45 pm-03:00 pm - Devakar Kumar Verma
- Meeting 40 : Sat, Apr 26, 02:00 pm-02:45 pm - Swapnil Gupta
- Meeting 41 : Sat, Apr 26, 02:45 pm-03:30 pm - Pradip Hatwar
- Meeting 42 : Tue, Apr 29, 11:00 am-12:00 pm - Saikrishna B.
- Meeting 43 : Tue, May 06, 02:00 pm-02:45 pm - Saurabh Sawlani
- Meeting 44 : Tue, May 06, 02:45 pm-03:30 pm - Bibekananda
- Meeting 45 : Tue, May 06, 03:30 pm-04:15 pm - Assim Ambadi
- Meeting 46 : Tue, May 06, 04:15 pm-05:00 pm - Sachin Balchim

- Meeting 29 : [On Allocations that Maximize Fairness](http://www.wisdom.weizmann.ac.il/~feige/mypapers/santaclaus.pdf)
- Meeting 30 : [Improved bounds and algorithms for hypergraph two-coloring](http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.133.147&rep=rep1&type=pdf)
- Meeting 31 : [Testing Hypergraph Coloring](http://www.cs.princeton.edu/courses/archive/spr04/cos598B/bib/CzumajSohlerTestHypergraphColor.pdf)
- Meeting 32 : [A Constructive Proof of the General Lovász Local Lemma](http://arxiv.org/abs/0903.0544)
- Meeting 33 : [New Constructive Aspects of the Lovász Local Lemma](http://www.cs.umd.edu/~barna/focs10.pdf) - Part I
- Meeting 34 : [New Constructive Aspects of the Lovász Local Lemma](http://www.cs.umd.edu/~barna/focs10.pdf) - Part II
- Meeting 35 : [Complexity of Computational Problems in Coding Theory](http://people.csail.mit.edu/madhu/FT01/scribe/lect20.ps) - see also [here](http://users.cms.caltech.edu/~umans/522/lec11.ps)
- Meeting 36 : Expander Codes and their applications - see [here](http://www.cs.cmu.edu/~venkatg/teaching/codingtheory/notes/notes8.pdf) and [here](http://www.cs.yale.edu/homes/spielman/561/lect12-09.pdf)
- Meeting 37 : [Folded Reed-Solomon Codes](http://www.cse.buffalo.edu/~atri/courses/coding-theory/book/chapters/chap13.pdf)
- Meeting 38 : [Identity Testing of Depth-3 Circuits and LDCs](http://www.cs.technion.ac.il/~shpilka/publications/DvirShpilka_LDC_PIT.pdf)
- Meeting 39 : [The sum of d small-bias generators fools polynomials of degree d](http://www.ccs.neu.edu/home/viola/papers/d.pdf)
- Meeting 40 : [Learning and Lower Bounds for AC^0 with Threshold Gates](http://research.microsoft.com/pubs/132616/random10.pdf)
- Meeting 41 : [Learning random monotone DNF](http://www.cs.columbia.edu/~atw12/papers/j4.pdf)
- Meeting 42 : [Secret Sharing Schemes and Error Correcting Codes](http://www.cse.ust.hk/faculty/cding/JOURNALS/jucs97.pdf)
- Meeting 43 : [Matrix Product and Codes - Part I](http://www.cs.technion.ac.il/~bshouty/MyPapers/A%20Lower%20Bound%20for%20Matrix%20Multiplication/A%20Lower%20Bound%20for%20Matrix%20Multiplication.ps)
- Meeting 44 : [Matrix Product and Codes - Part II](http://arxiv.org/pdf/cs/0201001)
- Meeting 45 : Goldreich-Levin Local List Decoding Algorithm for Hadamard Code - sections 4.1-4.4
- Meeting 46 : [Coding and Decoding of Concatenated Codes](http://people.csail.mit.edu/dmoshkov/courses/codes/lec5-venkat.pdf)