Awards for favorite talks.
Eric Siggia. Evolving regulatory networks to fit target morphological patterns. Positive (which means greedy, I believe) evolution of network topology, parameters, and outputs reproduces known morphology. It's important that topology and parameters coevolve: given only the final topology, it is hard to tune the parameters afterward. Duplication seems very important here.
Carsten Witt. Time complexity of EAs as randomized algorithms. I normally think of EAs as pretty blah, but this talk was crystal clear. The idea is that it's hard to analyze EAs directly, so the main workhorse is drift analysis, which is basically a local analysis of how much progress is made in one step on average. Additive drift. Multiplicative drift. Chernoff-bound analysis of deviation from expectation. Recurrence analysis.
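The textbook setting for multiplicative drift is the (1+1)-EA on OneMax: the expected one-step decrease of the distance n - fitness is proportional to its current value, which yields the classic O(n log n) expected runtime. A minimal sketch (my own toy simulation, not code from the talk):

```python
import math
import random

def one_plus_one_ea(n, seed=0):
    """(1+1)-EA on OneMax: flip each bit independently with prob 1/n,
    accept the offspring if it is not worse. Returns steps until the
    all-ones string is reached."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    fitness = sum(x)
    steps = 0
    while fitness < n:
        y = [b ^ (rng.random() < 1.0 / n) for b in x]  # standard bit mutation
        fy = sum(y)
        if fy >= fitness:  # greedy (elitist) acceptance
            x, fitness = y, fy
        steps += 1
    return steps

# Multiplicative drift theorem predicts expected runtime Theta(n log n);
# the ratio below should stay roughly constant as n grows.
n = 100
runs = [one_plus_one_ea(n, seed=s) for s in range(20)]
avg = sum(runs) / len(runs)
print(avg / (n * math.log(n)))
```

The point of the drift argument is that one never analyzes the full stochastic process, only the expected single-step progress, and a general theorem converts that into a runtime bound.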
David Haussler. Evolution of the human brain (probably the overall winner). Several strands of evidence suggest that Notch2NL significantly contributes to the increase in human brain size. It's a fairly recent duplication, present in human, chimp, and gorilla. In tissue culture, it seems to prolong the stem-cell stage, leading to larger brains. Ectopic overexpression in mouse fetuses changes morphology. Deletion of that region in humans leads to a small-brain disorder (microcephaly). Also interesting that the duplication became a pseudogene and was reactivated via gene conversion before having these functional effects.
Other interesting ideas/tidbits.
- Steve Frank. The Gaussian distribution is an attractor. What are other attractor distributions/functions common in nature? Perhaps the Hill function is an attractor, in the sense that a broad class of interactions all result in Hill-function-like shapes.
- Guy Sella. Fisher’s geometric model of genetic architecture.
- Edo Kussell. Molecular memory via the lac operon.
- Carl Bergstrom. Defensive complexity. The immune system is so complex precisely so that it's harder to break/hijack.
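Frank's "Gaussian as attractor" point is the central limit theorem in distributional clothing: sums of many small, independent effects converge to a Gaussian regardless of the summands' shape. A minimal sketch of my own (exponential summands chosen just because they are strongly skewed):

```python
import math
import random
import statistics

def standardized_sum(k, rng):
    """Sum of k iid Exponential(1) draws (mean k, variance k),
    standardized to mean 0 and variance 1."""
    s = sum(rng.expovariate(1.0) for _ in range(k))
    return (s - k) / math.sqrt(k)

rng = random.Random(0)
samples = [standardized_sum(50, rng) for _ in range(5000)]
# The standardized sums are close to standard normal even though each
# summand is heavily skewed.
print(statistics.mean(samples), statistics.stdev(samples))
```

The open question in the talk note is whether an analogous "broad basin of attraction" exists for the Hill function, i.e. whether many different microscopic binding schemes all compose into the same sigmoidal macroscopic response.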