Monday, September 30, 2013

Toddler Break

Some days, I just feel fried.  Today's a good example, on the tail end of four solid days of meetings and conferences, which consumed the entirety of my weekend.  Oh, it was good stuff, and I have no regrets about going, except that I didn't get my time off and won't get any more until Friday, when my folks fly in to visit us.

I like my time off and need my time off, and the best of it comes in toddler time.  I'm just a sucker for the enthusiasm and vigor with which Harriet engages and plays with me.  Some time in the past few weeks, I have naturally stopped calling her "baby"---with the onset of walking, we are clearly in toddler territory, emphasized all the more by her enthusiastic adoption of "Yeah!" and "No!"  My day now ends sharply at 4pm (at least when I'm in town), when I go and pick her up from day care.  When we get home, I keep meaning to go meet the neighbors who have little kids, but instead I just end up hanging out in the living room with Harriet, sitting on the floor while she jumps on me or puts things in my pocket or runs and snuggles and puts things in stacks or baskets.

Little humans are so damned endearing, once you get used to them: they are a microcosm of our grown-up world, played fast and open in kaleidoscopic intensity.  Frowning in concentration, then laughing her evil laugh, crying angry and bitter cries when she can't get something she's put in my pocket back out of it, flashing crafty smiles and sidelong glances to see if I'll let her get away with her ambitions.

That period of grace in the early evening, between when Harriet and I arrive home and when I start to make dinner or Ananya returns or whatever else can break our bubble, is my toddler break, and it's something I dearly look forward to.

Monday, September 23, 2013

O'Hare Again

I'm on another layover in O'Hare, that great hub of air travel in the backlands of Chicago, named for Butch O'Hare, the Navy's first ace of World War II.  I've learned to curse it less these days, when once upon a time I always feared and loathed my stop-overs here.  But now that I am stopping here so often, I'm coming to find it comfortable, familiar, reliable in its jam-packed halls of madness, and the corridor of international flags that leads you through the main American Airlines terminal.

This is my 14th connection through O'Hare this year, and I expect to be making 9 more of them before the year is finished (maybe more, but I hope not).  That's... that's actually a lot more than I'd realized, before a grep for "ORD" through my travel records found such surprising numbers.  Apparently, even before leaving Boston for Iowa, a surprisingly large number of my trips went through O'Hare, and once you get Iowa in the mix, well, that's two stops in O'Hare for pretty much every trip.

The one truly damning thing about O'Hare for the traveling scientist, at least in the American terminals, is its lack of accessible power and free wireless.  I think there is a principle operating here similar to one I have noticed with hotels: the pricier the room per night, the more likely that the internet access will not be free, and the more likely that the connections will be terrible.  Cheap hotels often have excellent, excellent internet.  For airports, the bigger and more business-traveller-filled, the more likely they are to have no easily accessible power plugs and no free wireless.  Little regional airports pretty much always have nice seats with good and free connections.

And so I pay.  But not directly for the wireless.  I pay for a membership in the lounge, which means I get free decent wireless, and a quiet and calm place to sit and work or read or think.  It's worth it, but I feel ashamed.  Which is senseless, really.  If I'm traveling often, it makes sense, economically, to make those hours on a layover good and productive and not draining.  And yet, somehow I feel like I'm betraying my class.  Becoming one of Them, whoever that might be.

As I look back on where I came from, I know that Jake the child would barely recognize some parts of the man that I have become.  He would find them strange and alien, distasteful, impure.  I have a really devilish pride, you know, that sometimes makes me walk away from sensible things for years at a time.  As a grad student, I resisted any thought of getting an air conditioner.  I had never lived with air conditioning growing up (in the countryside, on the coast of Maine, where the air was so much cooler anyway), and I decided, in my stupid head somehow, that air conditioning was a sign of weakness, of over-consumption, of All That Was Wrong With America.  I even turned down a free air conditioner that my parents gave me as a gift, making them take it back to the store in my pride and self-righteous choice of bodily mortification.  We lived on the top floor of a 3-story apartment building at the time, with a flat stretch of black tar roof right outside the bedroom window, so on a really hot day the room just baked like a pizza oven.  I would lie there and sweat, my head aching, trying to focus on whatever I was trying to do.  But I was winning... winning against everyone else in the world, because I could judge them weaker and less than me, since I could live without an air conditioner.

You know, dear reader, I really hope you're laughing at me right now.  I hope you're finding that pride as stupid and shameful as I do now, looking back at it.  And trying to understand: where does it come from?  Where do I get this impulse, this need to go and do things as purely as possible?  Sometimes I don't even realize I'm doing it, except in retrospect, as I analyze the shame that comes when I finally let myself succumb to pragmatism.  Like having an airline lounge membership.  Like taking a taxi to the airport, when I used to take the T.  I can scoff at my resistance to air conditioning at the same time as I feel queasy about my airline lounge membership, and really, what's the difference?

Where is the boundary between pride and pride, dear reader?  If I had bought myself a membership before I started stopping in O'Hare... an apparently startling number of times... then it would have been a pride of aspiration, of conspicuous consumption, of indulgence in status and luxury.  If I avoided buying a membership now that I'm stopping here so often, then it would be a pride of avoidance, setting myself above the people who do have one through self-denial and asceticism.  I think that my wife, ever one for pragmatism, would find my dilemma odd and disheartening.  Honestly, I don't know where it comes from myself, except that it is always with me, this improper degree of social consciousness that so frequently leaves me on the horns of a dilemma, where either path is problematic.

Perhaps, as I continue to learn and grow and mature, I'll finally just manage to let go a little.
I just wish I had a better measuring stick, so I could always compute where the boundaries are...

Wednesday, September 11, 2013


Dear readers, I am happy to announce an excellent thing. That excellent thing is WebProto, and it's a way that all of you, even those without a similar scientific background, can get a chance to play with the artifacts of my program of research.

Go on, click that link.  Or better yet, try this one that is all loaded up with a bunch of pretty demos.

Ain't it cool?

The WebProto effort was kicked off by my colleague Kyle Usbeck, who first realized that the new HTML5 and WebGL extensions in browsers had gotten powerful and sophisticated enough that it was reasonable to think about running a complex network simulator with 3D graphics and lots of virtual machines just as part of a web page.  No weird plug-ins, no special-purpose software, just a modern web browser and a decently fast machine.  Then there was a lot of blood, sweat, and tears as we worked together to figure out how to make JavaScript do our bidding, and another batch when our first version of the system proved too slow and we had to port the whole virtual machine to JavaScript.

The point of the WebProto project is to make aggregate programming techniques, like those we and a number of our colleagues have been developing, readily accessible to anyone who is interested.  There have been a lot of really exciting advances over the past few years in techniques that make it easier to program networks by letting you write programs for the whole network at once; the system then figures out how that translates into the interactions that the individual devices need to have with one another.  This is all well and good, but in order to use any of these techniques, you have always had to figure out how to install and configure a touchy piece of research software, usually with some gnarly dependencies.  Ours (MIT Proto) is unfortunately no exception, and its user interface poses serious problems for anybody who's not a fan of the Unix command line.  So the barrier to entry has just been too high for most people to bother with it.

Enter WebProto.  WebProto's three big goals are:

  1. Make cutting edge aggregate programming (via our Proto language) accessible to anybody with a browser.
  2. Bring aggregate programming, test configuration, and network simulation together in one easy interface. 
  3. Make it easy for anybody using WebProto to share their programs and simulations.

I think we've succeeded... at least enough to get a good start on things.  We're hoping this will be a good educational tool too, for people teaching classes on distributed algorithms or self-organization or complex systems.  Heck, we've already got a tutorial of our own that has embedded examples and problem sets!  And it's all free and open software, so anybody who improves things can contribute back to the community.

And you know why else I'm feeling that glow of pride right now?  Because today was Demo Day at SASO, and WebProto received an award for being the best demo of the conference.

So come on down, dear reader, and help yourself to our wonderful toys...

Tuesday, September 10, 2013


Hey folks, it's time for my favorite conference of the year!  That's SASO, or as it's more properly known: the IEEE International Conference on Self-Adaptive and Self-Organizing Systems.  SASO is probably the conference that feels most like home to me as a scientific research community, and I've attended it every year since it began in 2007.  Now in its seventh year, the conference is going strong.  It's not a big conference, but that's by design: it's very cross-disciplinary, and has only a single track in order to ensure that people aren't closing themselves off into their own little sub-communities.  That does keep it relatively small, but I find a lot of value here.

The main conference started today, but the affiliated workshops and tutorials began yesterday.  I started off my day yesterday in the workshop on socio-technical systems, where I was invited to give a talk about my recent results on very fast approximate consensus, which I had published at this year's Spatial Computing Workshop.  The link between approximate consensus and social interactions is pretty clear: whenever a group needs to make a decision, it needs to come to an approximate consensus.  My own work has been motivated more by the technological side, but there is a lot of overlap, and my talk, The Importance of Asymmetry for Rapidly Reaching Consensus, appeared to be quite interesting to the attendees.
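For readers who haven't met approximate consensus before, the classic symmetric version is easy to sketch: every device repeatedly nudges its value toward its neighbors' values, and the whole network closes in on the mean.  Here is a toy illustration of my own (not the asymmetric algorithm from the talk; the function names are mine), on a fully connected network:

```haskell
-- Toy averaging consensus (my illustration, not the published algorithm).
-- Each round, every node moves toward each other node's value by a small
-- factor eps; for n nodes, eps below 1/n gives a well-behaved slide to the mean.
step :: Double -> [Double] -> [Double]
step eps xs = [x + eps * sum [y - x | y <- xs] | x <- xs]

-- Iterate rounds until the spread of values falls below a tolerance.
consensus :: Double -> Double -> [Double] -> [Double]
consensus eps tol xs
  | maximum xs - minimum xs < tol = xs
  | otherwise                     = consensus eps tol (step eps xs)

main :: IO ()
main = print (consensus 0.1 0.001 [0, 10])
```

For example, `consensus 0.1 0.001 [0, 10]` converges to two values near 5.  The asymmetry result in the talk is, roughly, about how breaking the symmetry of these interactions can speed up convergence dramatically.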

I found quite a bit of interest to listen to as well, and something that Jeremy Pitt said in his talk has been bouncing around in my mind ever since.  Jeremy, talking about social capital, decried the commodification of social relationships, with Facebook's business model as an excellent bad example: "friends are people you can count on, not people that you can count."  Turning toward a more positive definition, he grounded social capital in "trustworthiness."

And here's the observation that has been ricocheting around in my head ever since: if social capital is based on trustworthiness, then any attempt to engineer social capital is guaranteed to undermine itself.  Look at it this way: if you do something that I find beneficial, generous, or reliable, I will trust you more.  But if I know that you only did it in order to win my trust, then I have no reason to trust you, because I know you're trying to manipulate me.

This poses a real paradox.  Building trust is vital, and so understanding how to build trust is useful, but the more precisely you understand how to build trust, the less trustworthy you may be!  It makes me think of the problem of the ethical placebo.  The placebo effect is a remarkable mind-body interaction in medicine: when you give a person a non-functional treatment, like a sugar pill or a salt-water injection, it can provide real medical benefit, like reduction of pain.  The theater of the treatment is the treatment.  If you know that a treatment isn't real, however, then the placebo effect doesn't work.  Now, it used to be acceptable to give patients a placebo without telling them, but that's not considered ethical any more, and for good reason.  Lying to patients for their own good is not too far away from clear horrors like involuntary sterilization of the disabled, stealing babies because the mother is unmarried, and faking treatment to study how bad a disease is.  So you can't lie to a patient, but telling them the truth destroys the efficacy of the treatment.

The problem of engineering with social capital seems similar: somehow the engineering needs to understand and manage social capital without turning it into the type of cold and calculating process that sucks all the human value out of the relationships.  I don't have any good answer, except that somehow, we need to keep the technical systems out of the relationships as much as possible: let them facilitate, let them help track information and help rendezvous, but don't attempt to monetize those relations... but can we avoid it?  The conversation went on, and we talked of it over dinner as well, broadening to the questions of privacy and the balance of trust between people and governments and corporations, and the way that software intermediates it all.

Friday, September 06, 2013

A Business Trip Back Home

I'm back in Boston again, for the first time since moving to Iowa.  I'm at home right now, for a certain definition of home, in our old apartment in Somerville.  It's like I never left, in some ways, back in the bedroom where I first arrived just over five years ago, where I lived by myself for the very first time in my life.  I moved into this apartment just as I was starting at BBN, just as my relationship at the time was breaking up and our old commune-style apartment, where I'd lived through most of grad school, was dissolving, with people going their separate ways.  My first night alone in this apartment was remarkably frightening to me.  When I was a kid, my brother and I tried to sleep in the lean-to a few hundred yards away from the lake camp where we always went on family vacation, and we had to come back in the middle of the night because we kept imagining bears into every snapping twig and sighing tree.  My first night alone in this apartment, I kept hearing burglars and home invaders in every creak of the old house and every autonomic twitch of an appliance.  But truly, I had never before lived alone.

It was here that I lived when I met my now-wife, here that we returned for shelter during the memorable snowstorm that accompanied our first date.  Many times, this house has been transformed, as I settled in, as we learned to live with one another, as we tried to superpose two well-equipped bachelor households into one tiny space, as we rearranged the rooms and the furniture again and again seeking the balance, utility, and mental space that we are only now finding in Iowa.  It's rearranging again now, as our landlord renovates the downstairs: right now, there is simply no kitchen, nothing at all but walls and floorboards, and the living room is filled with all the appliances and furniture that used to go there.  I pass through a seal of plastic sheeting when I go upstairs, to where the bedrooms at least remain untouched, and sleep backwards on the bed from how we usually do, in this temporary half-nest that is home and not-home and new and old all at once.

Soon, I will go to the office, where I will move my possessions from the window office that I am giving up into an interior office more suitable for an occasional visitor.  Another mark of my status changing, my presence here in Boston diluting as it concentrates in Iowa.  Can't live in two places at once, you know.  And truly, it's a simple fact of presence and location.  Working in Iowa is good, and I like the University and the faculty I'm getting to know there, but I am definitely becoming less in touch with my old co-workers in Boston.  With my closest collaborators, no problem, but in just two hours in the office yesterday, I had good and unexpected hallway discussions with three different colleagues whom I hadn't even spoken to since I left for Iowa.

Relocation is challenging and good and hard and necessary.  I like my new life, I miss my old life, and I can never go back, nor ever stand still.  Last night, I had dinner with an old, old friend, and we talked about growth and struggle and the demons we fight and are at least aware that we are trying to overcome.  It's interesting and strange and lovely that as a well-privileged adult, I have choices to make, and my life is nowhere near where I'd expected it to be five years earlier, and that has been true for at least the past 20 years: a major turn in aims and expectations at least once every five years.  My apartment is in (renovative) ruins, and it is home, and so as well is my house in Iowa, and I'm happy where I am, doing what I'm doing, and also can't wait to return home to the home in my other state.

Yours in a satisfied and joyful confusion, dear reader, and the bittersweet tang of a life not lived in stasis.

Thursday, August 29, 2013

Return to the Land of the Blogging

Good evening, dear readers, and welcome back to me.

Tonight, I return to the land of the blogging, not for an epic post, but just the simple beginning of my return to self-reflection and communication with the wider world unregarded.  My silence, dear readers, began with a simple thing: just taking a week or two off while I moved from Boston to Iowa City.  Not that there was any particular reason that even moving would need to result in a pause in blogging.  After all, this wonderful sophisticated software allows me to schedule updates however I would like them---any time you see a post set precisely at 5pm on a Monday, that would be the reason.  Nevertheless, I went away, and now I am back.

I'm back, slowly feeling my way into a new set of routines in Iowa.  I'm back, with the momentous change of living in my own house for the first time in my life, something which feels (now that we're over the initial unpacking) just simply delicious.  For the first time in my life, I have all the space I would like, and our house is beautiful.  Also, not overly large.  Lots of little stresses that came from packing two households of stuff into one tiny place in Boston have simply evaporated, and we are taking great pains to avoid filling up the new place.  It is good.

Our neighbors are lovely, and I can bike to work from my home.  Well, the Iowa version of work.  I've picked up an affiliation with the Electrical and Computer Engineering department here at the University of Iowa, secondary to my primary affiliation with BBN.  So I work remotely for BBN and am building collaborations with people here in Iowa.  For the next little while, at least, I'm in a magnificent office, squatting while the professor who normally occupies it is on sabbatical, practically next door to my wife and a bunch of other folks in her department who are enjoyable on both a personal and professional level.

Still, I'm finding my feet.  With Ananya working full time again, and more than full time given the realities of the tenure track, Harriet's in daycare full time, and I'm the one who picks her up in the afternoon.  So every day, I have a hard stop at 4pm (5pm Eastern, since I'm keeping my work in sync with my colleagues in Boston).  I bike home, grab the car, go over to Harriet's daycare, and then it's me and her for an hour or two until Ananya (who's got the morning baby shift) gets home.  Family time for a couple hours, then the rituals of baby bedtime, with a daughter who despises the idea of sleeping while parents are up and around.  It all leaves rather less "overtime" than I am used to being able to put in.

Soon, I will return to Boston for my first trip back since I arrived in Iowa.  It's going to be a fast, intense trip, followed immediately by a conference in Philadelphia.  I'm looking forward to it, and also going to miss my new life here, even as I delight in returning to my roots and my bigger city fast-flying life.  And you, dear readers, will once again be in on the discussions.  Welcome back to me, and also to you.

And now, let us end with a picture of Harriet adapting well to her new state: eating corn on the cob all by herself, just like an adult.  I am very impressed with our 13-month-old, who refuses adult assistance and can eat half an ear all by herself in a single sitting.

Monday, July 15, 2013


I've just gotten back from a rather intense set of conferences, having spent the majority of my time for the last week just plain networking.  This is something that most definitely does not come naturally to me.  I am not a politician, in the sense that I am not good at estimating what people's wants and desires are, or how my actions will be interpreted by them.  Sometimes this happens even with the friends and family I know and love best.  It's not that I don't understand emotions---I don't fall anywhere on the autism spectrum---but that I simply get too swept up in my own momentum and my own desires and feelings and stop paying close enough attention.  And I do have to work to put myself in others' shoes---again, not because I can't, but because the pull of my own perspective is so compelling to me, and I'm often quite bad about making assumptions that I should not and don't even realize that I am making.

It's a flaw.  It's especially a flaw in science, though perhaps there is a bit more margin for error there than in other entrepreneurial professions. After all, if what you can deliver is valuable enough, then there are many sins that can be forgiven.  But it makes a difference, and a big difference at that.  For those of you not scientists, this fact may be surprising.  Is not science the land of the fact, the truth, the existence of rightness against all odds?  Eppur si muove?

What you have to remember, dear reader, is that the classic image of the scientist as lone scholar or brilliant genius is rooted in a time when the practitioners of science were by and large the lordly class, set apart and at idleness to think by their wealth or their position.  From the classical Greek philosophers to Confucian scholars, medieval monks to Indian astronomers.

Over the past two to three centuries, however, the democratization of science and its fusion with technology have expanded the breadth of participation in science by many orders of magnitude.  At the same time, the advance of technology enabled by this democratization has been tying us ever more tightly together into a single large, global community.

Scientifically, there are so many things going on now and with so much complexity, that one person alone is quite limited in what they can accomplish, even if they were in the privileged position of one of those ancient lords and could take things like food, shelter, and internet access for granted.  To be really effective in the world we live in now, a scientist must collaborate: colleagues bring problems to solve and techniques to help in solving them.  Working together strengthens your ability to publish, to seek funding together, to think of new ideas, and in all other ways to go out and get your science done.  So any scientist who wishes to pursue their ideas in research, as opposed to being a technician in the lab of another person, will of necessity need to learn to be effective to at least some minimal level in the land of networking.

I hate the necessity of having goals in mind for networking.  I basically like people.  I like a lot of my colleagues, and I really enjoy debating things.  If networking just meant socializing, shooting the breeze and debating the nature of facts, I would have no problem with it.  But it's also important to get things done, to avoid offending people accidentally, and to step beyond "interesting discussion" and forward to "mutually beneficial action."  These are the things that I struggle with, and above all else I fear offending colleagues, for I know that is something I can do all too easily, especially when I think I'm debating a bit of science but they feel that I am actually attacking them.

This last few days went well, I think, with lots of exciting discussions and the opening up of new opportunities, though if there are any ways that I have screwed up, the most important mistakes are the ones that I am least likely of all to know about.

But on I go, to live, to learn, and hopefully to keep making good progress: in my science, in my relations with colleagues, and at simply being a decent human being.

Friday, July 05, 2013

Scientific Posters

I have a confession to make.  Posters are the method of scientific communication that I like the least.  I've just been preparing three right now, for presentation next week---two on synthetic biology and one on representation for electromechanical design.  The problem is really the logistics: design, production, transportation, and presentation.

Let us consider the poster's two main forms of competitor, the paper and the talk.
When I produce a paper, these four stages are:
  • Design: Assemble the document in LaTeX, a system extremely well adapted for this task, using figures built however I feel like it.
  • Production: Run LaTeX on the document, admire the beautiful PDF.
  • Transportation: Upload electronically to the publisher's site.
  • Presentation: The publisher does everything automatically, and the paper is disseminated to some fragment of the scientific community.

Talks are not quite so smooth, because I usually have to deal with PowerPoint.  I resisted this for a long time, and still use OpenOffice when I can.  The problem is that a lot of material from collaborators comes to me in PowerPoint form, or is explicitly required to be in PowerPoint by the US Government.  So I just have to put up with substandard editing software, giving me a workflow of:
  • Design: Curse the Beast That Dwelleth in Redmond as I wrestle with its Hideous Offspring.
  • Production: Flip into presentation mode and walk through to make sure nothing weird is happening with animations or videos.
  • Transportation: Get myself to the conference, bringing my computer and its precious, precious bits.  Leave a copy of the presentation accessible to myself via SSH, in case my computer self-destructs and I have to download it and try it on somebody else's machine.
  • Presentation: Stand up in front of a room full of scientists and talk to them, take a few questions, continue the interesting bit of discussion in the hallway after.

Posters are also very graphic heavy, and so they tend to end up in PowerPoint also, if only to ensure smooth transfer of images from existing slides to the poster design.  But that's only the beginning of the trouble.  My poster workflow is:
  • Design: PowerPoint was never intended to edit poster-sized images, and so it goes very slowly.  This interacts with its already iffy UI to produce lots of waiting for updates, discovering that event-processing lags have caused PowerPoint to change the wrong object, undoing, and trying again.  It also really wants to resize text pasted in from smaller documents to fit the poster, turning some things into 80+ point font and leaving others the same, in a pattern that I haven't entirely figured out yet.  Eventually, though, the beast is tamed, and I've got a PDF that PowerPoint isn't resizing down to 8.5x11 because it has decided I can't really have wanted A0 format.
  • Production: Print on a specialized piece of hardware that is generally temperamental and poorly maintained, because it's not part of the daily workflow.  Then get a pair of scissors and trim the poster down to size, because the size requirements of the poster session are always different from the width of the poster printer.  Or, like today, discover that the poster printer is down, everybody's taking the day off, and desperately search for an outside shop that can print my posters before my flight leaves.
  • Transportation: Roll the posters up, putting them in a tube if you can lay your hands on one, then carry giant delicate pieces of paper through taxis, airports, planes, and public transit while attempting to preserve them from crushing, rain, etc.  If something bad happens, you're out of luck.
  • Presentation: Give an interactive talk like you're singing a round: as people arrive and leave, you'll generally have a mixed audience who have all come in at different phases of the discussion and have missed different parts of the material.  It's essentially the following (please hum along, to the tune of "Row, row, row your boat"):

Primi:                    Secundus:             Tertio:
This is my abstract
Methods over here         This is my abstract
Here you see experiments  ...                   This is my abstract
Conclusions are so clear                        ...

Moreover, you're probably doing this in a crowded hall with dozens to hundreds of other posters all being presented simultaneously.  At the end of a good poster session, I'm always hoarse.

So posters are just much more of a hassle than the other forms, from A to Z.  The format does have its advantages, though, in that it's much more interactive than either of the others.

Now, dear readers, I'm sure that somebody is going to want to suggest that I could solve all my software problems by ditching Microsoft.  Let me head that off at the pass by saying that, much as I think Beamer and Prezi are awesome, Beamer just can't illustrate or animate worth a damn, and Prezi has some serious usability issues.  Don't talk to me about Keynote.  Network effects mean my world is going to be dominated by PowerPoint, with OpenOffice/LibreOffice the only hope of salvation.

But for now, I'll carry my precious cardboard tube and be thankful the production nightmare is over...

Thursday, June 27, 2013

Ethics & Retraction Watch

I have a confession to make: I read Retraction Watch.  It's one of my pieces of mental junk food, like bad soap opera.  And though I started for the soap-opera and schadenfreude, I've stayed also because of the serious questions about ethics that get raised.

Monday, June 17, 2013

The Programmer Litmus Test

One of my current projects has induced me to learn Haskell, a rather dogmatic programming language notorious for its insistence on purely functional programming and extremely strict data type enforcement.  It's been both interesting and frustrating: Haskell clearly shows its origin in mathematical thinking.  On the good side, there is the elegant way that data types are inferred, the fact that sophisticated and subtle manipulations of mathematical functions are elementary to the language, and the intuitive handling of sets and recursion.  On the bad side, anything that doesn't fit the elegant paradigm is horribly painful to handle; when things do go wrong, they go very wrong very quickly; and there is a mathematician-style culture of using single-character symbols and minimal comments.  I detest this last, since it renders programs extremely hard to decode.  It reminds me of a criticism of Perl that I once heard: "banging on the keyboard has a 50/50 chance of producing a valid program."
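To show what I mean by the good side, here's a small sketch of my own (not from any of my actual projects): sorting and a composed pipeline, each in a line or two, with signatures that GHC could infer on its own if I deleted them:

```haskell
-- Quicksort via list comprehensions; the signature is optional, since
-- GHC infers the most general type, Ord a => [a] -> [a].
qsort :: Ord a => [a] -> [a]
qsort []     = []
qsort (p:xs) = qsort [x | x <- xs, x < p] ++ [p] ++ qsort [x | x <- xs, x >= p]

-- Function composition as a first-class operation: the sum of the
-- squares of the even numbers in a list, written as a pipeline.
sumSqEven :: [Int] -> Int
sumSqEven = sum . map (^ 2) . filter even

main :: IO ()
main = do
  print (qsort [3, 1, 4, 1, 5, 9, 2, 6])
  print (sumSqEven [1 .. 10])
```

Of course, the same culture that makes these so compact is the one that produces the single-character names and missing comments I was just complaining about.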

Learning Haskell also reminds me of a litmus test that I like, for testing whether somebody is a Real Programmer.  The test is a single simple question: 
"What programming languages do you know?"
If the person responds with a clear and simple list, they aren't a Real Programmer.

My own answer?  Well, mostly I work with C, C++, MATLAB, Lisp, Java, and now Haskell.  But I also deal with GNU make and bash and tcsh, some Python, recently a little SQL, then HTML and Javascript of course.  I've done a little bit of elisp hacking when I needed to, a little VHDL.  I don't know if I should count representation languages like SBOL or LTML.  Do nesC and AVR macros count as the same as C/C++ or separate?  I think the other parts of the autotools suite besides Make probably count as separate languages.  Almost forgot about regexp patterns, I think they count.  And parser definition languages like flex and yacc.  680x0 assembly, back in the day, when graphics accelerators and OpenGL didn't exist, and I guess I did some x86 assembly too in classes, and there was another processor, a toy processor specifically for classes whose name I don't remember.  Oh, and BASIC of course, my first language ever.  Research languages like Proto and MGS.  LabView made me use G---I hate graphical programming languages---though I guess Logo was fun as a kid.  I can't remember whether I ever actually wrote any AppleScript or not; same with SmallTalk.  Curl was another that came from classes, along with whatever it was we used for PAL programming before we switched to CPLDs and used VHDL instead.  Oh, and LaTeX of course---anything you can write quicksort in definitely counts as a programming language.

You get my point, I think.  I could probably keep going on for a while, and still not have everything.  This is part of the nature of working with code.  If you spend enough time working with enough complex systems, you'll inevitably have to tangle with lots of different programming languages, just through the pragmatics of making things work in whatever context of duct-tape and spackle your partners and predecessors have had to set up to get stuff done. Likewise, if you tangle with enough languages and gain a deep enough understanding of the underlying concepts, then picking up new languages is relatively easy---at least to an intermediate level.  Real Programmers have done both of these things, and it will have left its scars on their memory in one form or another---and I don't think you can really be fluent without it.

Note that being a Real Programmer doesn't necessarily make you any damned good at any particular task, though.  It just means you've got the potential to get there with a sharp learning curve if you've got the time and motivation.  For example, I'm no longer competent to carry out any but the most basic system administration tasks, since I stopped tracking that field early in grad school, which is why I own a Mac.

So, dear reader: are you a Real Programmer?

Monday, May 27, 2013

Outbound Again

It feels like a much shorter time than the two and a half weeks since my last professional travel---though perhaps that's because the time since has still had several talks, a significant paper submission, and a major project deadline.  Tonight, I head out again, this time for an experience that is all new to me as a variety of professional service.

Despite the fact that I'm at a company rather than a university, I still have a number of opportunities for academic advising, and it is rather encouraged by BBN as well. I've advised students at both the Master's and Ph.D. levels, but never before has helping a student involved an international journey.  Some months ago, one of my close colleagues asked me to serve on the thesis committee of his Ph.D. student, whose work I respect and have been following closely for some time.  Since they're over in Europe, I assumed my participation in the defense would be via some sort of remote dial-in (see previous discussion of my fondness for the magic of modern telepresence).  In fact, however, they'd like to have me there in person, so I'm getting on a plane this evening, and will spend about 48 hours on the ground over there, attending the defense and connected celebrations, as well as getting some good time to catch up with my colleague and hear all about his latest ventures.

Alas, I do not get to wear my doctoral hood, which will continue to remain undisturbed in its quiet corner at the back of my closet.  Apparently, there is a special and different form of garb that the university traditionally prescribes for committee members employed by industry (remember: academia is one of the only still-extant reservoirs of medieval guild traditions in the modern world). So one of the first things I'll be doing upon arrival is getting fit for the archaic scholastic version of a rented tuxedo.  I'm unsure just what it is that I will be wearing, and I fear that it won't live up to the gaudy inventions of my imagination.

Another lovely side benefit of this trip comes from the fact that its financing means I wasn't restricted to a US-flag airline---one of the typical requirements of traveling with any aid from a US grant.  As it happened, the cheapest fares (by far) when I was booking my tickets went through Iceland, so it became not only possible but the Official Best Travel Option for me to stop off and see my dear friends in Reykjavik on the way back---I'm taking a day of vacation, and quite looking forward to meeting their new baby in person, who I've previously met only over Skype.

This trip is another piece of time away from our own increasingly intriguing and interactive baby, but at least I've had a long and quiet time together with her this weekend.  We were all going to a wedding down in Maryland, of one of my wife's oldest and dearest friends, but Harriet went down with various standard unpleasant baby ailments that I shan't embarrass future-her by describing to the Internet.  In any case, it became clear that subjecting Harriet to 16 hours of driving would be a bad idea, but with no actual danger in the offing I encouraged Ananya to go without us.  So while Ananya went by Amtrak, Harriet & I rolled around on the kitchen floor, playing with pots and pans, discovering how magnets go back on the refrigerator, and listening (me, at least) to my newest audiobook to keep the non-baby half of my mind happy too.

And since I can't fend her off from this keyboard much longer, it's time to post and see if I can answer a few of those long-neglected emails from friends before I vanish again for the airport...

Monday, May 20, 2013

Know Your Time-Zones

In this business, sometimes you've really gotta know your time zones.  When you come down to the wire on a paper, sometimes it really matters whether it's East Coast, West Coast, or (as in my unfortunate case this evening), Central Europe.  At some point, the deadline has really, truly passed, and the submission site shuts down, and the computer won't take your paper any more.

It's important to know exactly when that is, which is why computer science conferences generally choose 11:59pm or 12:01am, avoiding the date ambiguity of midnight, and always specify the time zone---usually somewhere in the US or Europe, but occasionally as far West as American Samoa (because computer people love to find extreme edge cases).  A later deadline doesn't necessarily help, of course, because it's just encouraging you to stay up later at night if you aren't yet done.

Properly, this shouldn't matter.  We should all be good little boys and girls and get things in before the last minute.  Nobody should ever have to wonder which midnight it is, but we do have to sometimes: maybe there's another deadline first, or maybe you figure out something new as you're editing a paper and you have to make lots of extra changes to fix it, or maybe it's ongoing work and you're still adding new material, or maybe you're just crap at deadlines.  I honestly don't think the last actually pertains very often to last-minute submissions: I think more often it's just the combination of perfectionism, triage, and shrinking margins of error.

When I was an undergraduate, I had a theory about deadlines that worked pretty well for me.  I held that taking a semester of classes was like surfing on a wave: you start out up at the front on top, and every time you let anything slip, you slide a little bit further back on the wave.  If you slide far enough back, you're in danger of falling off (missing assignment deadlines), and you get really stressed and can't do anything that you want to because if you miss one step, you fall off the wave.  But you're always doing the same amount of work: it's just that when you're farther from the deadline you've got more options for moving it around and doing the stuff that fits better with your mood/desires/headspace at the moment.  Oh, and having less stress.  So every semester I'd start off doing my assignments the moment that I got them, and slowly slip back towards the deadlines as the work piled up.  Then the semester would end, there would be no homework assignments over break, and I'd be reset up on the top of the wave again at the beginning of the next semester.  Only now, as a working professional, there are no semester ends for me any more.

Technically, I'm already 10 days past the deadline on this submission, but first the conference extended the deadline by seven days (which is not unusual), and then a note went around that the submission site would actually continue staying open for three days beyond.  And so I embraced work-life balance for the weekend: played with my daughter while my wife dealt with her own last-minute paper crisis, went up to Maine to visit my parents, got a couple good nights' sleep.  Today I spent most of the day on an ongoing project that's actually billable, as opposed to this paper which is more directed at future work and foundational principles.  No problem: I knew I had only about three hours of work to go to finish up the paper, and I came in right on time, even a little bit early (though not as early as my ambition, of course).

And the site was closed.

You see, I'd read "CET" and thought "CST".  CET is Central European Time, the zone of Italy and Hungary and Germany.  CST is Central Standard Time, the zone of Iowa and Texas and Chicago.  Seven hours' difference this time of year---most times of year, in fact, except for a brief transient when Daylight Saving Time switches on different dates in Europe and the US (God only knows why).  And so that's why my paper is still listed as "abstract only" at this moment.
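For the curious, the seven-hour gap is easy to check programmatically.  Here's a minimal sketch using Python's standard zoneinfo module, with an illustrative deadline date and Europe/Berlin standing in for "CET" (the tz database doesn't use bare abbreviations like CET or CST, so the specific zone names here are my own choices for illustration):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# A hypothetical 11:59pm "CET" deadline in mid-May.  Europe/Berlin observes
# CET in winter and CEST in summer, so in May this is actually UTC+2.
deadline = datetime(2013, 5, 17, 23, 59, tzinfo=ZoneInfo("Europe/Berlin"))

# The same instant expressed in US Central time (America/Chicago, UTC-5 in May):
us_central = deadline.astimezone(ZoneInfo("America/Chicago"))
print(us_central)  # 2013-05-17 16:59:00-05:00

# Misreading "CET" as "CST" means believing you have until 11:59pm Chicago
# time, when the submission site actually closes at 4:59pm Chicago time.
offset_hours = (deadline.utcoffset() - us_central.utcoffset()).total_seconds() / 3600
print(offset_hours)  # 7.0
```

Since both Europe and the US are on summer time in May, the gap is still seven hours; it only wobbles to six or eight during the brief windows in spring and fall when the two regions change their clocks on different dates.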

I'm not sorry, because it's ended up as a damned good paper, full of pretty pictures too, so if it doesn't go here, it will find another good venue.  And hopefully the program chairs will have mercy on me and allow my paper to proceed into review in any case, because I'd really prefer it to have a chance to go where I intended---that's an audience that I like a lot and think will really appreciate the ideas.  But for now, I'm tumbled off my wave and hoping to avoid the same mistake on the next one.

Monday, May 13, 2013


One of my wife's colleagues, who is also a parent, had this to say about the challenge of being both a scientist and a parent: it's not about whether you will drop balls, but about choosing which balls to drop at which times, and making sure that you don't keep dropping the same ball.

I just spent the last hour in a steam-filled bathroom, singing a terribly unhappy, snuffling and coughing baby back to sleep, steaming the congestion from her sinuses and rocking her in her chair.  "Swing Low, Sweet Harriet" was about the right speed, with lots of low, resonant, sleep-inducing notes; "Harriet the Eukaryote" is a bit too upbeat.  It's hard to watch a baby who really, really just wants to be asleep and cannot get there, but with warmth and song and rocking, she finally at least has made it.  Tonight, I have to get a full night's sleep, because of a critical presentation I'm going to be giving tomorrow---and yet here I am, singing to my baby and writing to you, dear readers.

Career and life inevitably come into conflict for a scientist.  I didn't have a weekend this past weekend, because there was a major workshop on Mammalian Synthetic Biology (fortunately right here in Cambridge), and I needed to be there to present my work, talk and make plans with my colleagues, and also to run a workshop of our own on metrology (more on that one later).  But I didn't get to see Harriet much, and so I've been snatching every bit of time I can at the moment.  Tonight, I met Harriet and Ananya in the library, and we played together there for a little bit before I had to go off to my photo critique group.  Why didn't I skip photo group?  Because it will be the last one that I can attend before we move to Iowa, and it's been my one really consistently peaceable and pleasurable hobby for the last few years.

And so my life grinds finely, sometimes.  I pick which balls to drop, and which to catch, and try to remember that one of those balls is my own ability to find time to rest and recover.  I plan to take a day off later this week, to catch up on my weekend.  I'll probably be able to write back to the friends whose letters I'm neglecting now as well, and deal with the pile of mail sitting on the kitchen counter.

But also, unstructured time is precious.  I have a form of meditation that I do for myself at times, where I simply get moving---on foot, on bike, in a car, or even the Paris Metro---and simply see which way I turn as I move forward.  It's an exercise in letting go of planning, just getting in touch with my basic preferences and impulses.  It's not even so much about understanding them as simply giving them a free rein to move me and help to separate out "should" and "need" from "want" and "like."  There are times when I go in circles, and times when I go in straight lines, go out on familiar streets or proceed down random side-roads where I've never set foot before.  I haven't done that for a little while, maybe not even since Newcastle, when I had a day two weeks ago to wander the afternoon around the edge of the North Sea.  I promise myself these times every so often, and have to preserve them even in the face of possible guilt that I could be spending the time on a paper or with my daughter or wife.

This blog too is a promise to myself, which is why I'm making sure that I don't drop it.  It's not a duty, exactly, but a promise to remember to step back and stop doing and just think about the things I'm doing every once in a while.  But that promise now is done, and the promise of sleep is beckoning.

Monday, May 06, 2013

A building tension of words...

I'm experiencing a synchronization event right now, currently sitting in scientific meeting three of four in a three week period.  With only a couple of days at home between each of these events, I'm afraid that you, dear reader, have been falling below my wife and daughter in my priorities.  But the tension of unwritten words is building, and soon they will emerge, to tell of Newcastle and Atlanta, Minneapolis and Boston, workshop and journal, synthetic biology and spatial computing, morphogenesis and metrology...

Monday, April 22, 2013

Returning to Standard Life

My psyche may not be quite settled again, but it is now officially time to return to a more standard existence.  And by that, I mean the establishment of scientific and engineering standards.  I'm on my way, right now, out of the country over to the UK for the next SBOL standards development workshop.  SBOL expands to Synthetic Biology Open Language, and it's a community-driven effort to make it easier for practitioners of that science to exchange information about the systems they are building.  This applies to person-to-person communication, but also (perhaps more importantly) to communication from one computerized tool to another.  This latter is so important because building new organisms involves a lot of different types of information and processes, and it's hard for humans to track all of it correctly except for the very simplest of designs.  It also allows engineering services, like fabrication of new DNA, to be taken on by specialists who can take advantage of economies of scale to do them efficiently and cheaply.  The group at Newcastle is hosting this meeting, and we'll spend three days showing off our newest tools to each other and delving into such weighty topics as prioritization of working groups, standards body governance, interoperation with related standards, and extensions to handle additional types of knowledge.

Not long after I get back, I'll be hosting another standards discussion, the Workshop on Metrology for Mammalian Synthetic Biology that we're organizing in conjunction with the first Workshop on Mammalian Synthetic Biology.  At that one, we'll be asking a lot of really basic questions that don't yet have good answers, like "Who wants to know how a biological part behaves in a cell?", "What do we need to measure about its behavior?", and "How do we tell whether we got the measurement right?"  I expect the discussion will be much broader than just mammalian cells, but we needed to narrow the scope enough to have a productive discussion.  Mammalian cells also have some interesting properties that make them an attractive target: they're relatively big and tough, which seems to let them get less messed up by adding our new biological computing circuits into them, and there are also starting to be a lot of good mammalian biological computing components to choose from, which means we might soon be able to try to build much more complex systems.

In one sense, this type of work is boring.  After all, most of the process of getting things done is about having long dry arguments about tiny points of detail, and there's nothing inherently sexy about building better rulers or trying to agree on a data format that doesn't leave anybody too dissatisfied (consensus and satisfaction are necessarily opposed qualities).

That first impression is misleading, however.  Every scientific endeavor, no matter how sexy the topic may seem, has a lot of tedious work underlying it that is necessary to building a solid foundation.  Debugging code is never sexy.  Building makefiles and regression tests is never sexy.  Feeding mice or culturing cells or staring blankly at columns of data in Matlab is never sexy.  But you can't get where you need to go unless you do the hard work.  The alternative is to practice "slash and burn" science, where you go just far enough to get proof-of-concept results, then publish them with grandiose claims and move on to the next thing, leaving little for those who follow you to build on and yet making it difficult for them to do the work you ignored: "Didn't Smith and Jones already do that?"

When you get the unsexy parts right, you enable great things.  And standardization is a huge, huge, part of that.  Standards, and especially standards of measurement, are civilizational infrastructure.  Consider, for example, a 2x4 in your local hardware store: that humble piece of wood represents an absolute revolution in the construction and remodeling of houses.  Likewise, the next time that you chance to use a tape measure, think about how hard it is to make tape measures the same length.  How is it that you don't get different lengths of tape measures from different stores?  Why doesn't the length of tape measures drift over time as the machines used to make them slowly wear out? They do vary, of course, but a remarkably complex system of engineering mechanisms, professional associations, and government bureaucracy combine to ensure that you always have as many significant digits of length available as you need and are willing to pay for.

I didn't get into synthetic biology out of a desire to work on standards.  I got into synthetic biology to practice sexy cool mad science with living organisms.  Getting there from where we are today, however, will require that we build that depth of infrastructure that is so much needed and so often unappreciated.

Friday, April 19, 2013

I am proud of my city tonight

I am proud of my city tonight.

It has been a long and unusual day, locked down in our home while every law enforcement agent for a hundred miles was searching through Watertown for the second Marathon bomber.  Everything felt both nervous and detached at the same time: we weren't even technically in the lock-down zone, since we live just over the border from Cambridge, but my office is in Cambridge, and it all just felt too close for comfort to go outside, and especially not with a baby.  Yet at the same time, I felt basically safe: one of the amazing things about our informational age is that I could be sitting here, in my apartment, and know the minute that something critical occurred, popping up in the feed on my computer or phone.  So I knew that nothing was likely to occur in our neighborhood.

Still, it's been with us all day.  It's why I quickly decided that today could not be a work day for me, not even from home.  There was simply too much on my mind, and I wanted to take my time here with my family and just be.  Last night, when the suspects killed Sean Collier, the MIT police officer who died, they did it right outside the building where I completed my Ph.D.  I know the spot well, having walked or biked across it many times.  The 7-11 they were reported to have robbed (though it seems to have turned out otherwise), it is right on the corner by the building where I meet my synthetic biology collaborators.  When they carjacked somebody over by Third Street, that is where I used to drive all the time, getting from my previous apartment to the lab and back.  The gas station where the carjacking victim escaped them is where I used to fuel up when I lived in Cambridgeport, right after undergrad.  The area the police were searching in Watertown is back behind where all the good Armenian bakeries are, and the place the suspect was finally caught is just a couple of blocks from where I bought my car.  The apartment where the brothers Tsarnaev lived?  Just 15 minutes walk from us, on the other side of Inman Square.  So this whole drawn out incident feels very close to home, but at the same time almost surreal.  And I can still hear the helicopters overhead, as I have been able to since late morning.

So with all of this in the ambient, what is it that's making me proud of my city?  What makes me proud is the way that the city has responded.

I remember, after the attacks of September 11th, that my first thought was how scared I was of what our response would be as a nation.  What depths we might lower ourselves to, having been stung so badly by those attacks.  And I think that I was right to have feared that.

Here, from the moment when this attack occurred, there has been a sense of measured judgement and sympathy in the response of people.  Even while the first responders were delivering first aid, Bostonians came out of their homes to feed, warm, and house the stranded runners.  While CNN speculated wildly about Saudis, the local media has been rock solid, clearly distinguishing known and unknown, and getting us real information while never reporting rumor as news.  And then today...

Today, while a million citizens were shut up in their homes, the whole city seemed to me to feel like an embrace of "keep calm, carry on."  There's something scary and deadly going on out there, but we've put our trust in tax-funded professionals who are moving cautiously and deliberately.  People from within the search zone, sought out and interviewed by the media, really portrayed the best of Boston's tough but peaceful values.  And as the picture of the suspects continued to develop, I haven't yet heard anyone drop into xenophobic or Islamophobic rants.  Over and over, the direction goes more towards sorrow and sympathy, and a strength that has nothing to do with aggression or revenge.

That's something that I think we need to celebrate.  Our strength, as a free and liberal society, to find a place for the whole breadth of the world, in open celebrations of humanity like the Boston Marathon itself, and to hold and embrace that even in the face of what horrors the world can bring.  It takes a great moral courage and strength to do that, far more than it does to throw up walls to try to keep out the world and seek safety in isolation.  We have always been strong on the strength of our diversity, and even amidst tragedy and fear, I heard that strength in every word that came from Greater Boston today, all the way to the vast multi-ethnic crowd who lined the streets of Watertown after the suspect was arrested, and cheered every one of the officers who passed, thanking them for a safe resolution that even brought in the suspect still alive.

This city is old, by American standards.  It has deep roots and a proud history and attitude, and tonight I think it certainly has that right.

Monday, April 15, 2013

A Bad Day in Boston

Today has been a bad day in Boston.  I myself was nowhere near the marathon, let alone near the finish line where the bombs went off.  I couldn't get through to my wife because the phones were down, and so even though I knew she wasn't planning to be anywhere near the marathon either, I still had all these pangs of fear and worry about her and Harriet until I actually made contact some hours after the bombs.  Poor Harriet was picking up on her parents' stress, and having no idea what was going on, so we sat in the living room and one of us read aloud to distract and calm us adults while the other lay on the floor with Harriet, playing with her and making sure she didn't butt-hop her way into any hard table edges.

Something like this really rattles me, in a way that I find difficult to rationally explain.  It doesn't exactly scare me---an attack like today's bombing of the Boston Marathon is so random and so infrequent that I just can't figure out how one could be scared of it properly and yet still live one's life.  It would be like being scared of meteor strikes.  Not to say we can't do better at preventing such horrors, but I think we're likely dealing with the work of a deranged individual, not some organized cause, and in a modern technological society, there's only so much that can be done to limit the capability of individuals to do harm.  The only way to make society overall safer from this sort of attacker is to make sure that fewer people slip through the cracks of isolation, alienation, and mental illness---a long, hard, and complex process, sure to be opposed by those who think that all somebody needs to do is "pull themselves up by their bootstraps" or to "act like an adult."  There's a lot of emotionally wounded adults out there, and today there are many more.  No good, and nothing much that we can do but carry on, live our lives as constructively and humanely as we can, and grieve when we need to grieve.

Stay safe, everybody, and live well both today and for all your hopeful tomorrows.

Tuesday, April 09, 2013

What Sort of Careers Do Tiggers Like?

Yesterday was a day at DARPA (of which I shan't speak further, since it was all about possible future projects).   On the way back, though, I had a conversation that made me really happy.  Sitting next to me on the plane was a young man, part of a high school trip that had come down to see the capital as part of their AP Government class.  As we started talking (his classmates congratulating him on starting conversations with strangers), I also learned that he was a senior, wanting to become an engineer, and thinking about UMass Amherst (a worthy school indeed), though a little boggled by all the possibilities of departure from home into the wild world of undergrad. So I shared a bit more about myself and my life as a scientist, and we had a pleasant though somewhat lopsided conversation.

What I really want to share with you, though, dear readers, is one thing that came up as we talked, a metaphor that I have found extremely helpful in those times when I am figuring out what I want to do with my life, professionally or otherwise: what do Tiggers like?

This comes from an old Winnie-the-Pooh story (by which I mean A.A. Milne, as I do not accept the validity of the Disney interpretations in my personal canon).  In the story, the newly arrived Tigger is hungry, and the other animals offer to get him food.  Only problem is, Tigger isn't sure what it is that he likes to eat!  The generous Pooh allows that he might share some of his honey, and Tigger immediately responds that "Of course, of course!  Tiggers just love to eat honey!"  But the honey is too sweet, so Eeyore offers his own favorite dish of thistles.  "Of course, of course!  Tiggers just love to eat thistles!" And thistles, of course, are too spiky.  So it goes, throughout all the animals, until finally Kanga offers Roo's nasty medicine, which proves exactly to Tigger's taste.

I find this quite instructive as a life lesson.

The moral, as I see it: when you have lots of plausible options, don't be paralyzed trying to figure out which is the best to do.  Just do things.  Do worthy things, that you do with passion and persistence.  You can always change course later if you have to, and having given it your best in the mean time, you will not suffer badly for it.

Tuesday, April 02, 2013

An Evening Celebrating Science

Last week was the annual Science Development Program Dinner at BBN.  The SDP is a complex and unusual institution at the company, one of the things that I think makes it pretty unique.  Generally speaking, it's an umbrella promoting all of the more academic-style aspects of the company, like journal publication, hosting visiting scientists, running seminar series, teaching university courses and supervising students, professional service on committees and editorial boards.  It also runs the pseudo-tenure process for promotion on our scientific career track, a set of senior ranks parallel to the management track.

I think the SDP is a very important thing for keeping the culture of BBN, riding on the edge it does between academia and more traditional industry.  And once a year, the company throws a party to celebrate its scientific side, and to reflect on where we have been and where we are going, scientifically.  Everybody who has done significant SDP activity in the past year is invited, and over the course of the program we hear about it all.

This year, we also remembered Wally Feurzeig, who just passed away after 50 years of AI research at BBN.  He was somebody who touched my life decades before I even knew his name, as one of the inventors of the LOGO programming language, which I spent many happy hours making pictures with in elementary school, without even realizing the ways it was teaching me about algorithms.  I also heard a story about one of our founders, Leo Beranek, long ago "retired" but still working on his own as a scientist at age 98: he will apparently be presenting a couple of papers at the same Acoustical Society conference as my wife in a couple of months, not as some sort of "aged and distinguished speaker" talk, but as ordinary peer-reviewed scientific papers.

I find these things inspiring.  I find the whole event inspiring, in the way that it invites me to step back from the hustle and bustle of daily deadlines and the money chase, and to renew myself at the well of scientific thought and the importance of inquiry for lasting impact and also for its own sake.  It's an important part of why I am working where I am, and what brings other people I want to work with there as well.

Tuesday, March 26, 2013

A typographical nuisance

If you don't care about grammar or typesetting, you should just stop reading this post right now.  Lord knows that I should probably stop writing it.

I'm not much of a grammar nazi.  But I am somebody who, once I've gotten my nose rubbed in a grammatical issue, can never unsee it again.  For example, during grad school, Hal Abelson taught me once and for all how to tell whether to use "which" or "that": "that" begins a clause with information that is necessary to identify the subject, while "which" begins a clause with "bonus" information, which adds to your knowledge about an already identified subject (see what I did there?).  I have never since been able to use the wrong one or see the wrong one used without it whacking me in the face.  I even mark it on papers I'm reviewing---not that I am such a quibbler that I would ever put that on a review, but I can't help but notice it and mark it.  Similarly, I can't put a comma or period after a closing quotation mark, since it should be enclosed within the quotes.

And here's where Blogger bugs me on a totally trivial matter.  I reflexively type two spaces after a sentence.  Apparently I shouldn't, really, at least according to the god Wikipedia, which declares that this standard has gone by the wayside.  Still, I do.  It got drilled into me back in the monospace era, I'm really not sure how, and that's what I always do.  Absolutely reflexively.  I also tend to end my own lines with a return rather than letting them wrap, which is simply stupid for a typeset world, though that should really be blamed on my reliance on LaTeX as my choice for professional document production---you should see the madness of line lengths and comment structure there in my document source, all the better to maintain organization and keep version control as informative as possible.  LaTeX is also why I use these "---" separators, as they would produce an appropriate-length dash in LaTeX.

But anyway, back to the "two space" thing: when I type two spaces in a row, if it's at the end of a line, Blogger will put the second space onto the next line.  Or maybe it's not Blogger but my web browser, I don't know.  I think it's Blogger, because I know of no other text layout software or WYSIWYG editor in the world that will push whitespace onto another line unless absolutely forced.  It's doubly infuriating that the spaces are handled correctly in the article editor, then render incorrectly after publication.  But whoever is wrapping my text: can you please finish parsing the whitespace before you start the next line?

And now, that's way more than ever should be said on a subject this trivial, particularly given that I can't illustrate my gripe as beautifully as Matthew Inman.

Thursday, March 21, 2013

Reviewing with help from Harriet

Paper-writing season is apparently closely followed by paper-reviewing season---not a surprise really, given that my professional service to the communities whose conferences I care about often includes serving on the program committees of those conferences.  Over the last few weeks, I have reviewed nearly a dozen papers, which can take up a startling amount of time.

I had help, fortunately.  Harriet's not very happy to have her father's attention focused on a piece of paper rather than her, but the paper itself is of great interest.  As I was writing up one batch of reviews, Harriet was sitting near me on the bed, flapping her arms and playing away by herself with great gusto.  I had my pile of papers to finish writing about on the left, and as I finished the first of these, I set it over to my right, to start a pile of completed papers.  A few moments later, I realized that my "done" pile was apparently within grabbing range of our barely-still-sessile baby:

Babies appreciate the kinesthetic properties of the scientific literature.
After that, my course was clear.  As I finished each paper, I turned it over to the local biological shredder for careful destruction.  She pounced on each with great delight, examining them, crinkling them, and in her joy giving me space enough to finish my task.

For me, though, the situation is a little bit more complicated.  I put in a lot of effort when I review, mostly from the Golden Rule perspective: I want to give the people I review the same sort of depth and fairness that I myself would want in feedback for a paper.  I try to be constructive, too, saying "This specific thing would help the paper in that way" rather than just "The paper is lacking in substance."  But sometimes a paper really tries my patience.

Good papers are a delight to review.  Really bad papers aren't very enjoyable, but at least they're fairly easy, because they are so, so terrible.  The worst paper I have ever reviewed was quite some time ago, and was a manuscript that had been produced by a clearly mentally deranged person.  In addition to its flaws from a scientific perspective, the text constantly changed color and font, and the "figures" were clip art.  But honestly, I didn't mind reviewing it that much, because the flaws were right there in front of your eyes.

No, the papers that are a true trial for me to review are those that are on the borderline in substance and are also heavily dependent on mathematical formalism.  A wonderful heuristic for mathematical papers that my advisor, Gerry Sussman, once told me, is to compare the length of the definitions section to the length of the theorems and proofs.  The higher the ratio of definitions to proofs, the more likely that you are dealing with shallow over-formality rather than any sort of significant result.  It's a failure mode that I totally understand: it just feels more "sciencey" to say, "Let B be a purely inertial spherical object whose state S^B(T) at time T is described by a tuple (x,y,x',y'), where S^B(0) = (0,2,10,0), and where y'' = -g." rather than "Consider a ball thrown at 10 meters per second, beginning 2 meters above the ground in normal Earth gravity."  And it's a lot harder to write prose that is both lucidly transparent and scientifically precise.

But I definitely hate reviewing that sort of over-formalized material, because the flaws are never obvious, but are buried in a sea of mathematical notation, from which they must be carefully extracted.  The text is tedious to read, and I'm always worried that I'll be wrong because maybe I didn't understand or remember some turn of notation relevant to what I'm complaining about.

So I do it, and I curse and I sweat, and I simply pray that my own papers are not causing the same reaction in another reviewer at the same time, somewhere on the other side of the world.

Monday, March 11, 2013

The Measurement of Babies Redux

A followup to my earlier post, and also apropos some other recent discussion regarding null hypotheses on paper distributions and the general scientific method: in my earlier post on measuring babies, I stated that we had observed Harriet being taller than a 30-inch carpentry level, and then wandered off into a discussion of height and weight distributions with respect to infant age.  At her six-month checkup appointment, not all that long after that post, her pediatrician found a much less startling height, somewhere around 28 inches (I can't remember exactly).

I have no doubt that the doctor got the right number---their measurement system is actually fairly ingeniously simple.  You simply lay the baby down on the disposable paper that gets pulled out to cover the examination table, mark a line tangent to the feet, and then mark another line at the top of the head.  With good hands and a compliant baby, getting those two marks right is easy.  Then measure between the marks: the length of any normally growing baby is long enough relative to likely sideways displacement that any distortion from angle should be quite small (my quick-and-dirty estimate is that at Harriet's height, a 1-inch sideways displacement should give less than 1% error).  It's imperfect, but pretty damned good.
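That quick-and-dirty estimate is easy to check with a few lines of arithmetic.  Here's a sketch (the 28-inch figure is the approximate checkup height from above; the function name is mine, purely for illustration):

```python
import math

def sideways_error(true_length, sideways):
    """Relative measurement error when the head mark is displaced
    sideways: the two marks then span the hypotenuse sqrt(L^2 + d^2)
    rather than the true length L."""
    measured = math.sqrt(true_length**2 + sideways**2)
    return (measured - true_length) / true_length

# At roughly 28 inches of baby, a 1-inch sideways displacement:
print(f"{sideways_error(28.0, 1.0):.2%}")  # prints "0.06%" -- well under 1%
```

For small displacements the error grows roughly as (d/L)^2 / 2, so even doubling the sideways slip only quadruples an already tiny error.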

So, what about our earlier measurement of 30 inches?  As with most surprising experimental results, it boils down to simple experimental error.  Not quite so dramatic as accidentally finding particles moving faster than the speed of light, but then we're dealing with a much smaller scale and more poorly controlled experiment.

What could have caused it?  Remember, Harriet was playing with the level, so we weren't exactly dealing with a stable instrument.  I certainly didn't look to see whether the level was actually level, so it may have been leaning somewhat.  I may also have suffered from some degree of an optical illusion since I was looking downward, with first Harriet and then the level further from me.  I may have counted some of her fluffy hair without realizing it.  She was being partially supported by me, as she worked on her great (and slightly premature) ambition of standing, so she may well have been stretching upward in some way.  At the end of the day, though, if an error doesn't persist, it's probably not worth trying to investigate its causes, since they are likely to be transient and, frankly, boring.

One of the most important lessons of science, I think, is embedded in this experience: most things that appear extremely unusual actually are not.  Instead, most compellingly unusual things are the result of some combination of happenstance and circumstance, and our cognitive bias for noticing unusual things plucks them out of the background noise and throws them into stark salience.  For example, I can remember quite clearly the circumstances of Harriet playing with the level, but can't remember just what the doctor actually measured.  It's not a mistake to pay attention to unusual-seeming things: certainly, it has been evolutionarily adaptive for our species, and still is.  But it's equally important to remember that our unusualness detectors are tuned up so high that they give us constant false positives, and that is because those few circumstances where there really is something there make it all worthwhile.  Sometimes it saves us from a stalking leopard or drunk driver. Other times, it is as in the quote attributed to Asimov: "The most exciting phrase to hear in science, the one that heralds new discoveries, is not 'Eureka!' (I found it!) but 'That's funny ...'"  It's just that finding one "That's funny..." requires going through a rather large number of places where it turns out not to be after all.

So, an interesting lesson in the banality of experimental error and the importance of proper metrology (which appears to be one of my current favorite scientific concepts, thanks in no little part to my ongoing work in synthetic biology).  With regard to the measurement of babies, however, in the end the same judgement applies: we have a long, trim baby (though she's coming more into standardized proportion, which of course doesn't mean a damned thing).

Monday, March 04, 2013

SemiSynBio

As I mentioned recently, just about immediately upon going remote for a month, I had to fly right back to Boston. That was the Thursday before last, when I narrowly escaped the Doom of O'Hare on my way to a curious workshop. This event, SemiSynBio, was an invitation-only gathering organized by the SRC, an organization that essentially acts as a research arm of the semiconductor industry, which had invited a group of people to come and discuss semiconductors, engineering of biology, and how the two of them might fit together in the future.

From my perspective, there were three main strands of discussion:

  • DNA used as a nanotechnological material for fabrication, memory or computational substrate.
  • Integrated systems of semiconductors and biological organisms, where the biology does the chemistry (sensing, actuation, power, etc.) and the semiconductors do the information processing and decision making.
  • Engineering of biological organisms (generally single-celled), where both the computation and the chemistry operate self-contained within the organism.

Could biology provide the new substrate to keep Moore's law alive for another decade or so? Could the semiconductor industry's techniques for designing incredibly massive systems be adapted to cope with the tangled complexity of evolved organisms?  I don't think there were any answers yet, but it made for a good conversation, and something interesting may come of it...

My own talk was squarely in the third area, presenting the work we've done on biological design automation.  If you look at those slides, you'll get the first public peek at some truly large circuits produced by the Proto BioCompiler.  Not that we can even plausibly build those any time in the next few years, but they're well within the possibilities of eukaryotic cells... and the structures produced by the optimizing compiler are intriguingly difficult to interpret and reminiscent of some of the tangles in naturally occurring gene regulatory networks... I look forward to digging in and seeing if there's anything there...

Saturday, March 02, 2013

Is Paper-Writing Season Real?

As I mentioned in my last post, one of the things I just struggled my way through was a fierce batch of paper deadlines.  All told, there were eight paper deadlines in less than a month, meaning that even with excellent and responsible co-authors and triaging two papers, I still had a rather intense several weeks.

I feel like this sort of "paper-writing season" happens to me on a regular basis.  Certainly, every year around January/February feels like a time of madness, and there are other similar pockets of crunch time that show up at other times, though perhaps less consistently.  But is this phenomenon real, or just an artifact of my own time management and retrospective view on the matter?  Being a sucker for an occasional graph, and certainly for the ability to procrastinate on reviewing papers a little bit more, I made a list of the past year's worth of deadlines, both for conferences and workshops (which are generally regular in when they occur) and for journals and book chapters (which are generally irregular and presumably independent).  It looks like this:

Jake's Paper Deadlines, Mar. 2012 - Feb. 2013
This includes both those deadlines where I actually submitted something and those where I persistently care about and track the conference but did not actually submit (e.g., those triaged submissions from last month).  The journal and book chapter deadlines include all of the revision deadlines as well, so those publications contribute 1-3 deadlines to the collection, depending on how many iterations happened and how many were within the sample period, given the months- to years-long time scale for journal review and revision.  I didn't include the camera-ready deadlines from conferences and workshops, since the level of revision required for those is generally much more lightweight, and a few are even abstract-based, requiring no revision at all.

The verdict?  Well, let's see... I don't usually look for statistical significance in data sets this small or this poorly controlled, so it's going to take a little bit of work to figure out.  Usually I'm dealing with excessively large numbers of data points or nice tight distributions, and if I even have to ask whether a difference is significant, then it means that the result is probably too poor a quality for me to use in any case.  But the search for low p-values is practically a rite of passage in most disciplines, so I guess it's about time that I went on a p-value fishing expedition of my own.

Matlab's built-in easy-bake significance-testing functions all seem to assume Gaussians, rather than the case we should be considering here, which is a uniform random distribution over months of the year.  So it's off to go spend a little quality time with Wikipedia, which has an excellent article giving pretty much exactly what I need.  Futz around with the numbers for a while, and I think I've managed to calculate things correctly... it's surprising just how primitive these tests are, and how easy they are to screw up.  Bottom line, though, I think I've got my numbers correct, and they are giving me the following: conference deadlines are distributed randomly throughout the year (p=0.41) and journals are significantly non-random (p=0.018).

That first result is somewhat surprising, but I believe it.  Despite the occasional hell that is January/February, with six deadlines in two months, the actual month-to-month variation just isn't that high.  The second is a good example of why you should never believe a p-value without interrogating it fiercely.  You see, it happens that last year we submitted two papers to the same special journal issue in April.  If I drop just that single duplication, knocking the journal count for April from five down to four, we end up instead with p=0.19, an order of magnitude worse on the magical significance scale.  If you wanted, you could say that the significance test was doing exactly its job, and detecting that there was a non-random correlation; I would say, however, that it's clear that just a little bit of noise (a single doubled deadline) was enough to completely mess with our ability to ask the real question (are paper deadlines randomly distributed?), and persist in my stance that any effect that requires a significance test to see is a pretty weak effect, scientifically.
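For the curious, here is roughly what such a test looks like in code.  I'm using SciPy's chi-square goodness-of-fit test against a uniform distribution over the twelve months as a stand-in for the exact test described above, and the monthly counts below are invented for illustration (they are not my actual deadline tally):

```python
# Hypothetical monthly journal-deadline counts (invented for illustration),
# twelve months' worth, with a doubled deadline in the fourth slot:
from scipy.stats import chisquare

journal_counts = [1, 0, 1, 5, 0, 1, 0, 2, 0, 1, 0, 1]
stat, p = chisquare(journal_counts)  # null hypothesis: uniform over 12 months
print(f"chi2 = {stat:.2f}, p = {p:.3f}")  # significant at the usual 0.05 level

# Drop the single doubled deadline (5 -> 4) and the "significance" evaporates:
adjusted = list(journal_counts)
adjusted[3] -= 1
stat2, p2 = chisquare(adjusted)
print(f"chi2 = {stat2:.2f}, p = {p2:.3f}")  # p climbs well above 0.05
```

Note that with expected counts this small, the chi-square approximation is itself shaky (an exact multinomial test would be more defensible), which only reinforces the point about how primitive and fragile these tests can be.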

The bottom line for this investigation, then, is that a) there's no point in speculating about interesting external effects that might cause my deadlines to bunch up, and b) just because the clumps come randomly doesn't mean they aren't likely to be seriously intense if I don't prepare for them well in advance, and that's hard to do when journal revisions are part of the mix.  There is a conference paper-writing season for me, it comes at the beginning of the year, and just a few randomly occurring journal interactions are likely to be enough to tip it over the edge from intense to excruciatingly stressful.

Do you have a paper-writing season, or similar deadline-fest in your own lives, dear readers?

Wednesday, February 27, 2013

Through the Magic Window

The past two weeks, my dear reader, have been even more complex than usual.  Besides a whole pack of paper deadlines (more on that in another post), we drove an infant 2000 kilometers, I started working remotely for a few weeks, and I even had to fly back to Boston for a couple of days of meetings.  Not that any of this is any excuse for neglecting you, which I shall now attempt to correct.

What I want to talk about right at the moment is my solution for working remotely.  I got the idea from reading another scientist's blog quite some time ago---I wish I could remember who, but I have no recollection.  That mysterious but wise blogger talked about going on sabbatical and staying in touch with their lab through the simple expedient of a computer set up in the middle of the lab, with an open Skype connection.  Anybody who wanted to talk to the professor could then just stop by the computer and they'd be right there.

I really liked this idea.  I've been a big fan of video Skype for meetings for quite a while: I find that being able to see the people I'm talking to makes a huge difference in my ability to communicate and, especially, to track well on what they're saying.  This is especially important when you're talking with multiple people at once---when I'm on a telephone conference, unless I know all the other people's voices really well, I'm likely to get confused as to who exactly is saying what.  I'm especially grateful for Skype when talking with European collaborators, whom it's difficult and expensive to even call on a conventional phone (yes, I know there are other solutions besides Skype, like Google hangouts, but I'm a creature of habit when it comes to brand loyalty and other relatively unimportant distinctions).

This is another of those "living in the future" moments I get, from time to time.  Remember when video phones were a staple of science fiction?  Now it's easier for me to use video than not, half the time.  And the fact that it's "living in the future" dates me, I'm sure---my seven-month-old daughter already finds her parents' cell phones mystifying because she can't see the person talking to her.

So anyway, when contemplating working remotely, I decided that the best thing to do was to set my (large) desktop monitor in Cambridge up as the "Magic Window" connecting me to my office.  On my end, I've got my laptop, which is where I do all my work, and a loaner machine that sits connected to the office whenever I'm working and not off in meetings or whatnot.  My Cambridge machine is logged in as a special-purpose Skype user, which boots Skype automatically and sets it up to answer my incoming calls.  And for meetings, I bought myself an iPad, christened it "Meeting Jake" and got one of those covers that can prop it up on a table.

I figured this idea was worth a shot, and so far, it's working like a charm.  With a good ethernet link to a high-bandwidth connection on either side, Skype is crisp and clean and runs for hours with nary a hitch.  Except when a police car goes zooming by my Cambridge office and the siren makes me jump. The iPad isn't quite so good, but I had lunch with folks today, and it worked well enough.

My only complaint?  It's much harder to go drop in on people.  My normal day in the office involves a lot of walking around---not necessarily going to anywhere in particular, but just stretching my legs, checking whether somebody I want to talk to is around, etc.  I hear there are telepresence robots that one can get these days, but I also hear they're really not up to the job yet.  

So: so far so good, and hopefully it will continue to work well for the next couple of weeks while I continue working remotely.  Also, so far so good on long drives with infants: as long as there was one parent in the back seat with her, Harriet was fine---in fact, probably happier overall than when her parents are distracted by the cares and requirements of an ordinary day.  And so, for your moment of baby zen, what does a seven-month-old infant look like in the middle of a long cross-country drive?

Harriet, delightedly playing on her car seat at a rest stop.