Graduate Seminar

Chris Cox
CMU
Title: Shannon Capacity

Abstract: Information theory studies the maximum amount of information that can be transmitted across a noisy channel under various constraints. In this talk, we look at transmitting information across a channel described by a graph $G$ which dictates which symbols can be confused. More precisely, the symbols in our messages are the vertices of $G$, and two symbols are confusable if they are connected by an edge of $G$. If we want to be able to transmit messages perfectly, then the naive approach is to restrict the symbols we use to an independent set in $G$; this way, any two messages sent will be distinguishable. However, by adding redundancy to our message, for example by sending it in chunks of size two, we can sometimes transmit more information. The theoretical limit on the amount of information that can be sent under this model is called the Shannon capacity of $G$, denoted by $\Theta(G)$. The Shannon capacity is notoriously difficult to calculate; despite being defined in the 1950s, we still don't know the value of $\Theta(C_7)$. In fact, it's not even known whether $\Theta(G)$ is computable! No previous knowledge of information theory is necessary, as Shannon capacity is really just a graph theory problem. We'll formally define $\Theta(G)$ and discuss two general upper bounds: Lovász's theta function and Haemers' minimum rank.
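As a brief sketch of the standard formulation (the talk itself will give the formal definition), the "chunks of size $k$" idea leads to independent sets in strong powers of $G$:

$$\Theta(G) \;=\; \sup_{k \ge 1} \sqrt[k]{\alpha\!\left(G^{\boxtimes k}\right)},$$

where $\alpha$ denotes the independence number and $G^{\boxtimes k}$ is the $k$-fold strong product of $G$ with itself: its vertices are length-$k$ strings of vertices of $G$, and two distinct strings are confusable exactly when, in every coordinate, the entries are equal or adjacent in $G$. For example, $\alpha(C_5) = 2$ but $\alpha(C_5^{\boxtimes 2}) = 5$, so sending symbols in pairs already gives $\Theta(C_5) \ge \sqrt{5}$; Lovász's theta function shows this bound is tight.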

Date: Tuesday, March 6, 2018
Time: 5:30 pm
Location: Wean Hall 8220
Submitted by: Son Van