At least that is how I read it.
I "passionately love" metaphors and comparisons of this kind, they are so much meaningful. I mean, 50MB of "code" is "a bit" different when your computer is build from logic gates and a bit different when your computer is the universe, or, to be more precise, the laws of physics that determine the final form and function of the proteins that are build from the DNA. In other words, I don't think it is right to use a measure of information like it would be a measure of complexity (which such analogies imply). Also it completely ignores the very complicated process of development of the brain, where information is supplied from the outside all the time and without which the brain isn't too useful.
It's the same with the page linked:
"We don’t have any idea how to make a description of such a complex machine that is both dense and flexible."
As if the fact that we have computers built from gates, rather than having to depend directly on the laws of the universe, didn't make the task of writing "dense and flexible descriptions" quite a bit easier for us. Especially since we do it with our brains, and we don't have millions of years of trial and error at our disposal like evolution has.
This is different with DNA. There is no middle layer like Boolean algebra to abstract away the device that does the computation. The form and function of the protein that results from connecting the amino acids specified by the DNA depend directly on physical laws in a very complicated way: if you simulate protein folding (and consider how hard that is in the first place), you can see how much the outcome varies when you change, for example, the value of some physical constant by a small amount. Then all those proteins start interacting with each other in highly complicated ways, also dependent on a wide variety of physical laws and on the outside environment. If you consider DNA a program, those physical laws are part of the computational model; that is, if the concept of a computational model makes any sense when studying non-man-made artefacts at all. That's roughly why applying computer science metaphors to DNA always sounds a bit ridiculous to me.
Of course, whether we can find a computational model that would explain the workings of a healthy, fully developed human brain is a different question; I think it is worth mentioning because it is easy to confuse the two.
But that's a straw man argument. The brain is a massively parallel net of neurons connected by synapses. Saying that it takes "10 steps" to respond to a stimulus is like saying that a GPU only applies a few pixel shaders on a scene per frame. Perhaps, but that's over millions of pixels.
Also, human DNA differs from that of closely related mammals in on the order of 1-5 percent of its base pairs. At 2 bits per base pair, that works out to roughly 7.5-37.5 MB. That's actually a pretty substantial size: it could even hold a small operating system kernel (Linux can be compiled to under 10 MB).
Edit: The 1 GB figure comes from the fact that the human genome is almost 3 billion base pairs, where each base pair encodes exactly 2 bits. So (3x10^9)*2/8 = 750 MB, which rounds up to about 1 GB.
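For anyone who wants to check the arithmetic, here is a quick sketch of both figures, assuming the ~3 billion base pair genome size and the 1-5 percent divergence range quoted above:

```python
# Back-of-the-envelope sizes implied by the figures above.
# Assumptions: ~3e9 base pairs in the human genome, 2 bits per base pair
# (A/C/G/T = 4 possibilities).

base_pairs = 3_000_000_000
bits = base_pairs * 2                # 2 bits per base pair
total_mb = bits / 8 / 1_000_000      # bits -> bytes -> megabytes
print(total_mb)                      # 750.0, loosely "about 1 GB"

# Portion unique vs. close mammal relatives, at 1% and 5% divergence:
for pct in (0.01, 0.05):
    print(round(total_mb * pct, 1))  # 7.5 and 37.5 (MB)
```

This also shows why the "10-50 MB" range floating around is only a rough rounding of the same calculation.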
http://www.nature.com/nature/journal/v431/n7011/abs/nature03...