Cybernetics: information theory at the dawn of the information age

Nature published a special issue last week for Alan Turing’s centenary; were he alive, he would be 100. Turing is widely regarded as the father of the computer. His contributions to code-breaking were key to cracking the Enigma code, and helped win World War II for the Allies. His untimely death at the age of 41 was a sad reminder of the power of prejudice over gratitude.

Turing’s work was fundamental to the field of cybernetics, which was as short-lived and as influential as he was. Although Turing himself never published on the subject, it drew heavily on code analysis of the kind he did during the war. Cybernetics existed because early computing existed: all at once, the passage of messages became at least as interesting as their content. This is especially germane if you think about computer science in terms of its code-breaking background; Neal Stephenson’s Cryptonomicon tells this story, among others, beautifully.

Cybernetics made sense of information transfer by simplifying down to a system that could transduce input to output. It had a lot in common with thermodynamics, the study of heat transfer, which had solidified over the previous century or so and which also takes an input/system/output view of the world.

This ground rule was elaborated as different systems were studied in cybernetic terms. One of my own favorite elaborations (it’s delightfully silly!) was the study of second-order cybernetics, or understanding how information is transferred when we try to understand information transfer. (In the picture above, the lower diagram shows second-order cybernetics; Wiener, Bateson and Mead are three of the founders of the field).

The field was at its strongest during the Macy conferences, a series of academic conferences held between 1946 and 1953 and dedicated to both developing and spreading the ideas of cybernetics. You could make a cybernetic study of the conferences themselves; we have the schedules, and writings from academics who reflected on the ideas shared there, but no proceedings were ever published, so we’ll never know exactly what was said. (So much has changed in science communication; today conferences are liveblogged up one side and down the other, like the SciOnline unconference series.)

Cybernetics was holistic at a very reductionist time; it played a major role in coming to understand biological systems in terms of their emergent properties. Its findings took two directions: toward the development of computers as more-and-more complicated information processing machines, and toward understanding the brain, which remains the most complicated information processor we know of.

Conventional wisdom says that the movement lost momentum after the conferences ended. Its progenitors were from different fields, and they never made a real interdisciplinary mark despite their ambition to found a unified study of everything.  However, it threw a long shadow across academia; cybernetics often turns up in astrofuturist poetry of the sixties, and modern scholars of systems and complexity point to cybernetics as their progenitor field.

I think of the 1940s and 1950s with a weird nostalgia, as a golden age in American academic science, and cybernetics plays a big part in why I think that way.  After all, American science was in no way socially or ethically ideal in that period; scientists developed the bomb, stole Henrietta Lacks’s cells, carried out the Tuskegee study, all the while excluding from their ranks many people who would have been great scientists.  But strictly in the realm of ideas, it was a time of very great discovery: the structure of DNA, the most basic signaling that controls a cell, the rules behind genetics and virology; not to mention the great strides in physics and chemistry that were made at the same time. Because discoveries made back then are fundamental to science education now, and experiments were both elegant and easy to explain, there’s an illusion that information was more manageable. But then I’m reminded it wasn’t; scholars back then founded a whole new field strictly for understanding information flow.

Linkaround

“Dr. Botstein took his visitor into the lab and announced, ‘I have here a novelist.’ ” — a New York Times article on how Jeffrey Eugenides created a surprisingly accurate portrait of the working life of a yeast geneticist in the 1980s, after just one afternoon in a lab (and, I imagine, a lot of reading).  For the record, this makes me more interested in The Marriage Plot than any other buzz I’ve read yet.  Watch this space.

 

RIP Dr. Renato Dulbecco: Nobelist, Italian resistance fighter, virologist, and (as a friend put it) “the D in your DMEM.”

 

“Worst of all, my brain felt boring.” — Scicurious on crafting a work/life balance in grad school.

 

 

Book Review: Feynman’s Rainbow

Richard Feynman is one of those scientists so successful that he’s crossed over into the public imagination, joining the canonical image of The Scientist along with other luminaries like Marie Curie and Albert Einstein. (Oddly, most of the famous scientific characters of the last hundred and fifty years or so are physicists.) His memoir, Surely You’re Joking, Mr. Feynman, is a humorous bestseller and highly recommended around these parts. So when I saw that he was the star of someone else’s memoir as well, I snapped it up. In Feynman’s Rainbow, Leonard Mlodinow, a science writer and professor, recollects his year as a post-doctoral fellow at Caltech and the influence Feynman had on him.

This isn’t a story about physics. This is a story about life as a young physicist. You might imagine the two would be inseparable, but at the time he chronicles, Mlodinow was between projects, more auditioning new research topics than thinking deeply about them. You get the impression that he did no theoretical physics at all in that period, because the thinking (in contrast to the fretting-about-what-to-think-about) is given such short shrift.

The central question for our narrator is, what is most worth doing in this life?  This is also, of course, the fundamental question of reaching adulthood. He has it narrowed down to research in physics, which he defends convincingly as both fundamental to the universe and fun to do.  But now that he must decide, without the help of a professor in loco parentis, what in the vast universe of unsolved physics questions is the most salient, he is at sea.

He is not a conventional post-doc. While his peers are balancing teaching loads with rising-star research careers (names you’ve heard before, like Stephen Wolfram’s, drop casually in and out of the narrative), Mlodinow is still fumbling for a research topic that truly matters and, with growing desperation, smoking dope and getting drunk with his buddy Ray. This is not the kind of career story that you expect will end well.

But at his university, among the gunners and the big names, is the biggest name of all to Mlodinow: Richard Feynman, now some decades past the Manhattan Project, but before the time of the Challenger explosion and that greatest of scientific anecdotes, the O-ring in the ice water. Though he’s slowly losing a battle with cancer, this Feynman retains a lot of his famous pep. That comes in handy when our hero tries to draft the older professor as his mentor and career guide, a role the older man is reluctant to step into. Mlodinow wants Feynman’s advice on choosing a problem, building a career, and life in physics; if the memoir’s Feynman escapes the template of personal self-help guru for the young scientist, it is in large part because of cranky things the real man said about not existing to help whippersnappers with their personal crises.

The book quotes extensively from Feynman’s actual words, recorded (fortuitously) in a move of great chutzpah that I could never have pulled off, but one that also reflects how Mlodinow felt about the older physicist even then. More than Mlodinow’s own text, these transcripts reveal the wise teacher as another fallible human being. One quote that particularly made me wince, in the context of a book about young scientists:

 Women have had a great influence on me and have made me into the better person that I am today.  They represent the emotional side of life.  And I realize that that too is very important.

Thanks, bro.  I submit that respecting the intellectual capacity of your fellow human beings, too, is very important.

The memoir’s style is often white-bread and matter-of-fact. Don’t ask a physicist for poetry, and you won’t be disappointed. He tries to keep the text accessible to non-specialist readers, which is very much in the spirit of Feynman’s own writing style; the problem is that he has less of a gift for conveying the excitement of a problem (and his own interest in it) without giving specifics. I look forward to reading his more science-focused books, The Drunkard’s Walk and several collaborations with Stephen Hawking, to see whether his style is different when he’s talking about physical rather than emotional matters.

But there are also gems, moments when you realize why Mlodinow found Feynman so inspiring, and why he would write a book dedicated to imparting a sense of Feynman’s worldview:

 ‘Do you know who first explained the true origin of the rainbow?’ I asked.

‘It was Descartes,’ he said.  After a moment he looked me in the eye.

‘And what do you think was the salient feature of the rainbow that inspired Descartes’ mathematical analysis?’ he asked.

‘Well, the rainbow is actually a section of a cone that appears as an arc of the colors of the spectrum when drops of water are illuminated by sunlight behind the observer.’

‘And?’

‘I suppose his inspiration was the realization that the problem could be analyzed by considering a single drop, and the geometry of the situation.’

‘You’re overlooking a key feature of the phenomenon,’ he said.

‘Okay, I give up. What would you say inspired his theory?’

‘I would say his inspiration was that he thought rainbows were beautiful.’

What Mlodinow, channeling Feynman, has to offer is not career advice in the usual sense. Instead, he gives something more durable: a parable about how he learned to feel worthy of academia and also realized it was not everything, a story of how to learn about and trust oneself. I’d recommend the book in particular to people who are starting their science careers–like I am. It’s Hallmark inspirational, coupled with a reminder that most of us feel like impostors at one point or another, even physicists who go on to list Stephen Hawking as a co-author. Mlodinow made it; we can do this.

Linkaround

No proper new post this weekend– life happened. Instead, here are some nifty things I’ve read and a teaser for next Saturday: my review of Feynman’s Rainbow, a story about a young physicist trying to learn to do science and be a grown-up, who gets help from–you guessed it– the most famous American physicist of his generation.

Regarding “Coffee Spoons” and everyone’s continued fascination with how our favorite drug works, here’s Scicurious with a story on the brain region caffeine affects (I love how she first presents the article in an accessible way, even if you don’t have a neuroscience background, and then adds a scientific critique about what she would’ve needed to see to be sold on the argument).

And somebody else seems to be thinking the same way as me about databases:  “From Index Cards to Information Overload” is an interesting story from ScienceLine (which, by the way, is a really neat way for a journalism program to get its work out there) with some more practical concerns about massive databases, such as properly citing the people who did the work in the first place.

Finally– this quote from Ira Glass, one of my very favorite storytellers, showed up in my Facebook feed, and rang especially true to me this week!  Maybe you’ll appreciate it too.

Nobody tells this to people who are beginners, I wish someone told me. All of us who do creative work, we get into it because we have good taste. But there is this gap. For the first couple years you make stuff, it’s just not that good. It’s trying to be good, it has potential, but it’s not. But your taste, the thing that got you into the game, is still killer. And your taste is why your work disappoints you. A lot of people never get past this phase, they quit. Most people I know who do interesting, creative work went through years of this. We know our work doesn’t have this special thing that we want it to have. We all go through this. And if you are just starting out or you are still in this phase, you gotta know its normal and the most important thing you can do is do a lot of work. Put yourself on a deadline so that every week you will finish one story. It is only by going through a volume of work that you will close that gap, and your work will be as good as your ambitions. And I took longer to figure out how to do this than anyone I’ve ever met. It’s gonna take awhile. It’s normal to take awhile. You’ve just gotta fight your way through.

We return to our regularly scheduled blogging next Saturday!  In the meantime, open thread: what cool science articles have you read lately?

Some like it hot: temperature-dependent sex determination and sea turtle conservation

Sea turtles are among the most charismatic of megafauna.  They have fans everywhere, and more than their share of outreach programs designed to make more fans. Even locals who don’t much see the appeal can tell you that turtle ecotourism is big business.

And it’s not hard to see why.  Sea turtles are one of conservationists’ big success stories. Almost extinct before anyone even noticed, their numbers plummeted in the face of conservation efforts, recovering at what seemed the last possible minute thanks to heroic efforts and one important discovery about their physiology that came not a moment too soon.  The story of the Kemp’s ridley sea turtle, at one time the closest to the brink, is a story of science at work in real time.

At the beginning of the 20th century, when Kemp first put his name on the species, female Kemp’s ridleys flocked to beaches by the tens of thousands, laying their eggs all at once in a phenomenon known as arribada, or arrival. But humans caught on to this, and thanks to a brisk trade in turtle eggs, not to mention the help of other predators, the arribadas had disappeared by the time the Endangered Species Act was passed.

In 1977, Mexican and American wildlife agencies began a collaboration to protect nesting habitat at Rancho Nuevo, a park near the border that was the last known nesting site for Kemp’s ridleys. The conservationists knew that if they left nests where they were laid, they’d lose eggs to raccoons, foxes, and opportunistic humans. So when beach patrols found eggs, they dug them up and moved them either to fenced nest areas near a base camp or to Styrofoam boxes filled with sand. Around hatching time, in a move that must have looked very strange to anybody not in on the reasoning, researchers allowed the hatchlings to crawl down the beach into the ocean, then scooped them up with nets to raise in captivity and release after about a year. This was called head-starting, and the thinking behind the nets was that females were more likely to return and lay eggs on beaches they had successfully imprinted on.

Researchers didn’t know, until head-started females returned to lay their eggs, how many females and how many males they were raising. It’s hard to tell the sex of hatchling sea turtles; they keep the relevant organs inside a shell, and at such a young age their blood hormone levels are not very high. After they disappear into the ocean, it’s some years until females reach maturity and return to lay eggs, and many hatchlings don’t make it that long. Sea turtles are difficult to research in general: there aren’t many of them, they live most of their life cycle well out of the reach of land-bound humans, and if you want to study their hatching, you had better be prepared for a swashbuckling adventure from a pirate novel, racing to find buried treasure deposited on the beach at night before a poacher snatches it up. For the most part, researchers incubating eggs for release into the wild had to rely on the findings of studies of captive populations (like the Cayman Turtle Farm), because counting the sex ratios of their own hatchlings would have meant sacrificing precious live turtles.

Researchers knew, from a paper published in 1979, that sex determination in turtles (as with other reptiles) depends on temperature. Instead of being genetically determined at conception, as sex is for humans, birds, and other chromosome-dependent species, sex among turtles is determined by the temperature at which the egg is kept.

To say that a trait as fundamental as an organism’s sex can come from outside factors like the temperature sounds crazy. It cuts to the heart of the genes-versus-environment question in development (the question being, “which is more important?” and the answer being, “it appears to depend on what trait you measure”). Fortunately for turtle researchers, the concept of temperature-dependent sex determination had been floating around evolutionary biology circles for some two decades, in such strange and wonderful organisms as the nematode and the orchid, and evolutionary theories on the benefits of temperature-dependent sex determination (or TSD) abounded. Unfortunately, the pivotal temperature, at which an even ratio of males and females develops, differed from species to species, and for sea turtles like the Kemp’s ridley, no one knew the key temperature.

It was only after about six years of head-starting that researchers could even begin to determine the pivotal temperature; they needed to wait for adult females to return, then determine using some careful statistics just how many males and females they had released some years before. In 1988, Donna Shaver, a ranger at Padre Island National Seashore, published the results of about ten years’ worth of study on head-started hatchlings’ sex ratios, identifying the pivotal temperatures. Over 30.8 C (roughly 87 degrees F), all of the hatchlings turned out female; under 29.0 C (roughly 84 F), they all turned out male. This was a terrifyingly tiny temperature range; it was lucky that artificial hatching and head-starting efforts had produced any males at all. This publication roughly coincided with the lowest Kemp’s population (in 1985, just 518 wild females nested), and knowing how to produce a natural sex ratio was key to conservation and repopulation efforts.
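To make the narrowness of that window concrete, here’s a minimal sketch of the temperature-to-sex relationship, using the two thresholds from Shaver’s study as described above. The function name and the linear transition between the thresholds are my own illustrative assumptions, not part of the published model:

```python
def predicted_female_fraction(incubation_temp_c: float) -> float:
    """Rough expected fraction of female hatchlings at a given
    incubation temperature (degrees Celsius), for Kemp's ridleys.

    Thresholds are the ones reported in the head-starting study:
    essentially all male below 29.0 C, all female above 30.8 C.
    The linear ramp in between is a hypothetical simplification.
    """
    all_male_below = 29.0
    all_female_above = 30.8
    if incubation_temp_c <= all_male_below:
        return 0.0
    if incubation_temp_c >= all_female_above:
        return 1.0
    # Between the thresholds, assume a linear transition; the pivotal
    # temperature would sit where this crosses 0.5.
    return (incubation_temp_c - all_male_below) / (all_female_above - all_male_below)

# Less than two degrees Celsius separates an all-male clutch
# from an all-female one.
print(predicted_female_fraction(28.5))  # → 0.0 (all male)
print(predicted_female_fraction(31.5))  # → 1.0 (all female)
```

Plug in a beach that runs a degree warmer than usual and the predicted ratio swings dramatically, which is why relocated nests are handled so carefully.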

Nearly thirty years later, we still don’t know exactly what gene or network of genes in turtles determines sex in response to temperature. There may even be different pivotal temperatures for different subpopulations within the species that nest on different beaches. Although the head-starting program was shut down amidst debates about its cost-effectiveness just ten years after it began, the practice of moving nests to more protected areas until they hatch is still widespread. And nowadays, when nests are relocated, volunteers take care to move the eggs quickly, to avoid placing them on hot sand or in direct sun, keeping conditions as natural as possible. While the Kemp’s ridley is still on the critically endangered list, it survives; we have the work of conservationists to thank for that.

For one more thought on temperature dependent sex determination: maybe it killed the dinosaurs.  I ask you.

Bioinformatics and the importance of curation

I recently finished a class on bioinformatics, the study of how best to use all the information scientists have accumulated and are accumulating in databases scattered around the world and web. And there’s a lot of information out there. Sometimes you hear the word “inventory” used to describe massive studies that characterize a lot of cellular components at once, as in, at some point in the future we will have a complete inventory of all proteins in all cancer cells, or stages of embryonic development, or what have you. I have to confess, I always picture an attic full of boxes, with structures and sequences and expression patterns, all minimally labeled, gathering dust.

Huge aggregations of information are scary.  Even just in one database for just one field of study—say, the NCBI website—even that tiny slice of All The Knowledge in the World makes us face that no one person will ever be able to assimilate it all. My feelings about bioinformatics are a lot like my feelings as a child, when I realized I would never be able to read all the books in the world: a crestfallen sense of having to miss out on something really, really interesting.

And at this point, maybe everyone is missing out. Even the authors of big-data studies, say, microarrays that assay expression levels of every gene in the genome in disease and non-disease tissues, can’t follow every lead to its source; often a few differently-regulated genes get followed up, but the rest just get put out there for others to work with. Sometimes I suspect that we (that collective, Internet-era “we”) have all the information to answer any cell-level biological question we could ask, but no idea of what the questions will be or how best to frame them.

One of the solutions to this problem is data mining—enlisting a computer’s help to sift through the masses of information with a program that looks for connections a human might see, if a human had the time and brainpower. Reading up on the art of data mining, I stumbled across a company called Narrative Science with a novel idea on how to present the results of data mining. Instead of making graphs from data, it makes sentences or even short passages, computer-generated but composed so that they read like something a human wrote. The company calls it a novel kind of visualization.  I think it’s absolutely brilliant, both as a way forward for handling a huge amount of information, and for the amount of cleverness that must have gone into programming a computer to take scores from a game and generate something like this:

WISCONSIN appears to be in the driver’s seat en route to a win, as it leads 51-10 after the third quarter…
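For fun, here’s a toy, template-based sketch of turning game data into a sentence in the spirit of the example above. Narrative Science’s actual system is far more sophisticated (and proprietary); every name and template here is invented for illustration:

```python
def recap_sentence(home: str, away: str,
                   home_score: int, away_score: int, quarter: int) -> str:
    """Generate a one-sentence game recap from raw scores.

    A purely hypothetical template picker: the thresholds and phrasings
    are invented, but the idea -- data in, readable sentence out -- is
    the one described in the text.
    """
    leader, trailer = (home, away) if home_score >= away_score else (away, home)
    lead = abs(home_score - away_score)
    if lead >= 21:  # a blowout gets the confident template
        template = ("{leader} appears to be in the driver's seat en route "
                    "to a win, as it leads {hi}-{lo} after the {q} quarter.")
    elif lead > 0:  # a close game gets a cautious one
        template = ("{leader} holds a narrow {hi}-{lo} edge over {trailer} "
                    "after the {q} quarter.")
    else:
        template = ("{leader} and {trailer} are deadlocked at {hi} "
                    "after the {q} quarter.")
    ordinal = {1: "first", 2: "second", 3: "third", 4: "fourth"}[quarter]
    return template.format(leader=leader, trailer=trailer,
                           hi=max(home_score, away_score),
                           lo=min(home_score, away_score), q=ordinal)

print(recap_sentence("WISCONSIN", "INDIANA", 51, 10, 3))
# → WISCONSIN appears to be in the driver's seat en route to a win,
#   as it leads 51-10 after the third quarter.
```

The cleverness in the real thing presumably lies in choosing among many such templates, and in deciding which facts in the data are worth a sentence at all.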

As I’ve mentioned, I like my science in story form; decontextualized data, however useful they may be, are automatically less compelling to me. So I’m glad that, in this age when the gestalt seems to be moving from the longform newspapers of the past to the Twitters of the future, there’s still some consensus that narrative is a good vehicle for understanding the world. According to some neuroscientists, perhaps it is the best vehicle.

When I think it over, I realize that nature has, or is, exactly as staggering a dataset as anything they’ve got at NCBI, with considerably more information much better encrypted. People have been trying to extract laws and trends from it for generations. All we’ve accomplished with our microarrays and our high-throughput proteomic studies is removing one step between the framing of a question and finding out the answer. Plenty of work of interpretation and meaning-making remains to be done.

Book review: Spook


Spook: Science Tackles the Afterlife, Mary Roach’s second published book, is a whimsical tour of modern and historical investigations into whether human consciousness survives death. It follows up on her previous book, Stiff: The Curious Lives of Human Cadavers, with further investigation of what happens when people die. This time around, the question is not about the fate of our earthly remains but instead whether that’s all of us that does remain. The two books are unified by the marginality of the research they describe, and the curiosity and persistence Roach shares with us in getting as close to the bottom of things as circumstances allow. After all,

 By definition, death is a destination with no return ticket. Clinically dead is not dead-dead. So how do we know the near-death experience isn’t a hallmark of dying, not death? What if several minutes down the line, the bright light dims and the euphoria fades and you’re just, well, dead?  We don’t know, says Greyson. ‘It’s possible it’s like going to the Paris airport and thinking you’ve seen France.’

Lively as it is, the book ends up a trifle disorganized.  It presents a peculiar mix of Duke scientists designing random image displays for operating room ceilings to test whether an out-of-body experience can be confirmed, with tourists wandering supposedly haunted areas with tape recorders, listening for the voices of the Donner party. The field is fractured. There are a lot of individuals out there looking for ghosts, the weight of the soul or proof of the persistence of consciousness, who trust their own methodologies but scorn others’. The book reflects this lack of communication and collaboration, juxtaposing historical skeptics’ efforts to discredit mediums with a visit to a school for psychics-in-training.  One gets the sense that for all our new technologies, very little progress has been made in the last hundred and fifty years toward either showing that an afterlife is real, or showing conclusively that it is not.

To be fair, the field of trying-to-find-out-whether-ghosts-exist is uncommonly peppered with false starts and short on compelling positive results—though as a kind of compensation, the book is unusually full of cocktail party stories.

What’s more, there’s something about Roach’s work that captures the spirit of research. She isn’t always sure where she is headed, but she invites us along and lets us see the questions as they arise, the leads as they pan out (or not), and all of the side routes that make the work so interesting and maddening to do. The result is like a travelogue, a road trip with a delightfully witty companion. One could wish for more science books to follow this model.