This is definitely at the crossroads of some of my main interests: math, Star Wars, and creatively wasting one’s time. (source BoingBoing)
This past week of February was National Engineers Week, and it’s always an excellent time to learn about different engineers today as well as those whose shoulders we stand on. I haven’t practiced engineering as a professional in over eight years, but I still work with engineers and structural engineering every day at Bentley Systems.
I wanted to post a bit on some of the history of software engineering and, in particular, just how much women have contributed and really created that discipline.
Ada Lovelace pictured with her table of algorithms created as an example code
Lovelace is widely recognized as having written the very first computer program: while translating and annotating a paper on Charles Babbage’s Analytical Engine, she worked out an algorithm for the machine to compute Bernoulli numbers. Stephen Wolfram did some research on Lovelace and wrote a fascinating article on her life and work.
Prior to the general adoption of digital computers, a “computer” was actually a human being who sat and did calculations all day. These were almost without exception women, many of whom had degrees in mathematics but were unable to continue in the field because of their gender. During World War II, when the US Army was building the first general-purpose digital computer, the ENIAC, a group of these women who had been calculating munitions trajectories were hired to encode those same calculations into the machine. They wrote, and debugged, the code for the first computer.
And Katherine Johnson was just about the best. So good, in fact, that when digital computers were first being used to calculate mission trajectories, John Glenn insisted that the machine’s results be checked by Johnson before his orbital flight.
Makers.com has a wonderful set of video interviews about her career.
Last year, Johnson was awarded the Presidential Medal of Freedom, one of the two highest civilian honors this country bestows, in honor of her accomplishments as well as her being a role model for women and people of color.
Rear Admiral Grace Hopper was an early computer scientist who is probably best known for having found an actual bug (a moth) in a piece of computer equipment (a relay in the Harvard Mark II). However, her truly remarkable achievement was her creation of the first compiler, which took human-readable code and converted it to machine language.
As I told my after-school coding club kids last fall, anytime you are debugging code so a computer can understand it, think about Admiral Hopper!
Margaret Hamilton standing next to listings of the Apollo Guidance Computer (AGC) source code (Courtesy Wikipedia)
While Katherine Johnson and others had calculated the trajectories for the Apollo missions, the spacecraft itself now had digital computers on board. Margaret Hamilton was the lead software engineer (a phrase coined by Anthony Oettinger and then put into wide use by Hamilton) for the Apollo craft’s operating system. Her foresight in designing priority-based scheduling saved the day during the landing: when a radar system malfunctioned and overloaded the computer, the guidance software shed its low-priority tasks and still landed the lunar module. She founded Hamilton Technologies in 1986.
I can’t help but wonder whether men simply co-opted the role of software engineer from women once it became clear that software was a worthwhile endeavor. Still, there are many great women engineers practicing today, in both software and other engineering disciplines; I have the privilege of working with many at Bentley Systems. However, we’ve done a great disservice to young women by creating a culture that fails to encourage them into science, technology, engineering, and mathematics careers. STEM programs go a long way toward righting this, but I think we also need to recognize that women created much of the modern world we know today, particularly in the field of software, and did so in spite of the uphill climb many of them faced in just finding work at all!
So in honor of Engineers Week, let’s be sure to let young women know that not only is there a future in STEM for them, but there is also an amazing past to be proud of!
“Coding is for girls” by Anne McGraw
Leonard Nimoy passed away earlier today. If you asked many people, they might tell you that they hear Morgan Freeman’s voice in their head when they imagine the voice of God. To me, it will always be Leonard Nimoy. That placid, smoking-worn growl, which in part made Spock such a wonderful character, fills me with awe.
As a child, in addition to Star Trek reruns (both the original series and the animated series), I grew up watching Nimoy host Nickelodeon’s Standby: Lights, Camera, Action!. That show was a wonderful look at how movies are made. Nimoy was a wonderful host, engaging in demonstrations of special effects and occasional gags. His love of movies was evident. In a time before the internet, Wikipedia, and movie blogs, it was a source for me to learn about movies, actors, and directors. In fact, it was there that I first learned that the original Star Wars was the middle piece of a larger trilogy, and that someday there would be prequels (before the word prequel even existed, I think) and sequels. I also learned about Star Trek III: The Search for Spock and the Klingon language from the same show. Of course, that film was directed by Nimoy, whose involvement in movies and television grew beyond acting.
They say you should never meet your heroes, as they will only disappoint you. Even so, I truly regret never having had the chance to meet Leonard Nimoy in person. He seemed like a beautiful person in most every way, and Gene Roddenberry once called him “the conscience of ‘Star Trek’”. A wonderful quote from Nimoy:
Whatever I have given, I have gained.
It’s very sad to have lost Nimoy, but I’m so glad that he was able to continue to appear in popular television and films, even up until very recently. His character of Spock is a cornerstone of pop culture, and that is due almost entirely to Nimoy’s acting. In a show that is remembered for some cheesy plots and hammy acting, as well as some rather uneven movies, Nimoy was a gem in Star Trek. Honestly, if you can watch the scene of Kirk and Spock in the radiation chamber at the end of The Wrath of Khan and not get choked up, you are possibly more Vulcan than human:
It’s hard to think of a better way to remember Nimoy than with a performance like that. Live long and prosper.
RadioShack announced today that they have filed for Chapter 11 bankruptcy. They will close about 2,400 of their stores, with many of the remaining locations being purchased by Sprint. This is more or less fitting, given that the brand has basically gone from the go-to supply store for electronics parts to a cell phone reseller. I can’t say for certain that they no longer carried any electronics parts, but I seriously doubt most of their locations stocked them at all.
It’s disappointing news for some. Wired has a story on how influential RadioShack was in building Silicon Valley. Steve Wozniak (Apple co-founder) recounts how some early telephony hacking led him and Steve Jobs to go on to build computers:
He used [a Touch Tone dialer purchased at RadioShack] for the now-infamous Blue Box, which he and Steve Jobs used to make their own free calls without interference from Ma Bell. Without RadioShack, there’s no Blue Box. And as Woz tells it, without the Blue Box there’s no Apple.
While it’s good to understand RadioShack’s importance in the hacker / maker / DIY culture that helped to spur innovators like Woz, it’s important to note that the RadioShack we all knew and loved died many years ago. They either didn’t see the rise of makers or simply ignored it in favor of chasing mobile phone buyers. Admittedly, that was chasing the money at the time; of course, it hasn’t served them well in the long run. And the company that brought IBM-compatible PCs to many homes across the country (including my friend TJ’s, when we were kids) got out of the computer manufacturing business early on.
The time my older brother & I fixed my washing machine with a kit I ordered off the internet.
Even so, I think there’s never been a better time to be a maker or a tinkerer. With a nearly endless supply of free how-to videos on YouTube, countless DIY and repair sites catering to anyone with a screwdriver and some time, and amazing online shops like Adafruit, someone today has far more resources for building whatever they can dream up. So, for that, I can be OK saying goodbye to RadioShack. Frankly, I wrote them off a long time ago.
In early January, Angela and I got matching his-and-hers FitBit Ones to start tracking our activity. Angela has actually been wearing a pedometer for years now, but the FitBit does a lot more data tracking than a simple pedometer. I’ve been wearing mine every day since then.
There are a few technologies I’ve adopted that I would consider life-changing. Maybe not the sort that changes the entire course of my life, but certainly ones that have had a dramatic impact on my day-to-day behavior: the DVR (TiVo), the smartphone (iPhone), and now a personal activity tracker (FitBit). As a professional, I’ve always been at a desk for a lot of my time. But when I practiced engineering, I was often going on site visits and moving around throughout the day. Now that I’ve been working remotely for a software company, that’s not the case. My activity level can vary dramatically from day to day. I had no idea just how much until I started wearing the FitBit.
One day I’d break 10,000 steps shortly before lunch (typically if I went running). On another day, I might be lucky to approach 2,000 steps. What’s more, my eating varied just as much, and my activity (i.e., caloric expenditure) had absolutely no correlation with my eating (i.e., caloric intake). So my body would get twice as many calories as it needed one day and not enough the next. I was essentially training my caveman-era, lizard-brained body to hold on to every scrap of calories it got, because who knew what tomorrow would bring.
Wearing the FitBit and carefully tracking the calories I eat has helped to change that behavior. I now track my calorie intake using LoseIt. Having a number of activity goals (steps, active minutes, stairs, and miles) gamifies my physical activity. Of course, I don’t meet the targets all (most?) of the time, but just having the goals points me in the right direction rather than leaving me stumbling around in the dark.
Of course, just tracking the data is one thing. It would be all too easy to just pile it all together in some useless place. FitBit’s web site and iPhone app are really exceptional. In fact, I sort of use my FitBit as just a recorder (and occasional timepiece) and rarely take it out of my pocket. I simply use the iPhone app. On an iPhone 4S or newer, the smartphone syncs directly to the FitBit via Bluetooth 4.
I also use the FitBit to track my sleep, although that’s more to make sure I’m getting enough rather than judging the quality of it. Apparently, I’m generally 98% efficient at sleeping, whatever that means. The velcro wrist strap is a pain and tends to come off my arm. I’m on my second wrist strap, as well as second silicone clip. As a result, I’m considering upgrading to a Force next year. The One has been great so far.
I was in the process of reorganizing my computer science and technical writing shelf today during lunch when I began to notice a pattern: I have quite a few books related to DITA and the underlying technologies of the DITA Open Toolkit. Well, this isn’t by coincidence. It’s a big part of my job and something I’m really interested in. But it occurred to me just how much time I’ve spent poring over these texts on structured authoring and XML-based technology, all in hopes of grokking it for my job.
So, in no particular order, here’s a list of some of my books on the subject:
And, some wider shots of my (sort of) organized bookshelves:
The <center> cannot hold it is too late. The force of regex and HTML together in the same conceptual space will destroy your mind like so much watery putty.
At first, I was a little surprised. I love using regular expressions to make bulk changes throughout an XHTML document or even across a project consisting of hundreds of files. But, after reading through the post several times and thinking about what I’ve been able to accomplish with some (relatively) simple XSLT files and an XML parser, it occurred to me that it is absolutely correct.
You see, as great as regular expressions are, they are not aware of context. They have no idea whether you’re matching a pattern within a C++ routine or an XHTML file. They can only parse characters and short strings as they are, with no understanding of their meaning.
eXtensible Stylesheet Language Transformations (XSLT), on the other hand, exist solely for the purpose of manipulating XML content. By definition, they are aware of XML elements and their attributes; their entire purpose is high-level modification. In fact, after having used them to successfully convert some XHTML to DITA XML, I have to say the power feels almost god-like.
Regexes still have their uses with XML, particularly with the badly formed SGML/HTML one might have had dumped in one’s lap. But if the need is actually manipulating XML elements or attributes within a file (or even across files), then it’s really foolish to attempt it with multiple regular expressions when a single XSL template will do (and often without the unintended consequences of a greedy regex).
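The context-blindness is easy to demonstrate. Here’s a quick sketch using Python’s standard library rather than XSLT itself, on a made-up XHTML fragment: the regex happily rewrites markup that only appears inside a comment, while the parser touches only actual elements.

```python
import re
import xml.etree.ElementTree as ET

# Hypothetical fragment: one real <b> element plus a <b> that only
# appears inside a comment.
xhtml = '<p>Real <b>bold</b> text.<!-- legacy markup: <b>ignore me</b> --></p>'

# A regex sees only characters, so it rewrites the commented-out
# markup right along with the real element.
naive = re.sub(r'<(/?)b>', r'<\1strong>', xhtml)

# A parser knows which <b> is an element and which is comment text.
parser = ET.XMLParser(target=ET.TreeBuilder(insert_comments=True))
root = ET.fromstring(xhtml, parser=parser)
for el in root.iter('b'):
    el.tag = 'strong'
parsed = ET.tostring(root, encoding='unicode')

print(naive)   # the <b> inside the comment was mangled too
print(parsed)  # only the real element became <strong>
```

That element-level awareness is exactly what an XSLT processor gives you, with declarative templates standing in for the traversal loop above.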
I have read numerous times how Gene Roddenberry, the creator of Star Trek, preferred that the eyes and mouth of an actor playing an alien not be obscured by makeup. The theory goes that this allows the actor to actually, well, act, and lets the audience better empathize with the character. This makes good sense on a series like Star Trek, where the interaction with aliens is often less shoot ’em up and more diplomacy and moral drama. However, I had never considered this point extending to dogs.
I read this post on Improvements in the Windows Explorer earlier today with quite a bit of excitement. There’s a lot to learn in here about the thought process that goes behind the Ribbon UI which was developed at Microsoft and is finally reaching the Explorer window. I, personally, welcome the changes and think it is great that they are exposing so many power features but with the ability to make the interface as minimal as needed for someone who won’t use them. As someone who’s getting into more UX design, particularly when it comes to Ribbon UI applications, this sort of stuff is invaluable.
Gruber mentioned it in an aside, pointing out that Apple and Microsoft are really diverging in terms of UI design. This is certainly true when comparing the (still in alpha) Windows 8 Explorer window with the UI changes in OS X Lion. While it is fair to argue that Microsoft’s UI is busy, I think Apple has gone a bit too far in the other direction. My largest gripe is that all the color has been removed from most icons, making it a bit harder to differentiate one gray square from another. The Ribbon can be minimized in any Ribbon UI program, resulting in what are functionally just graphical menus. The Finder has a tool (oddly, with a gray gear icon) labeled “Perform tasks with the selected item(s)” which generally accomplishes the same thing; of course, it is just a menu, limited to practical menu sizes (no different, really, from a right-click contextual menu).
The Windows 7 Explorer window is similarly simple, with a menu-ish toolbar providing some context-sensitive tools along the top. This interface looks a bit like Internet Explorer 8, but that is still different enough from most Windows programs that I think many users never got used to the controls. In IE, the main purpose is browsing; settings and the like aren’t needed most of the time, and I’d wager many users don’t even know about them. However, anyone using a file manager is often looking to do more than just browse those files.
Windows 8—assuming that many of these features don’t get stripped out or watered down by some larger committee (as has happened to Windows releases in the past; thus Vista)—seems to try to cater both to casual users, by way of the collapsible Ribbon and even the Metro UI (which will prevent many users from ever seeing the Explorer window), and to power users who think that reducing the number of clicks to show hidden items from five down to two is awesome. Trying to have it both ways may very well not work, as is too often the case.
But, right or wrong, the Finder in OS X Lion is still going to be nearly as lousy after Windows 8 as it was when OS X first launched. At least the Windows team is willing to listen to criticism and make some drastic changes.
Today is World Backup Day. Now, before you start looking over your shoulder or throwing the car in reverse, keep in mind this means backing up your data.
As in hard drives.
The fact that most people probably don’t think about data when they hear the phrase “back up” doesn’t really bode well for such an awareness campaign. However, as more of our daily lives—even for the non-geeks out there—become more digital than physical, it is important for all of us to think about this. How many photos of your vacations, videos of your kids, music and film purchases, download-only software, or important documents exist nowhere else but as a series of ones and zeros on a hard disk? I know that in our household, it’s pretty much everything of any importance from almost the past decade.
As a result of all that digital content, we have an enormous amount of storage in our house. Among our three main computers—my iMac desktop, Angela’s laptop, and my work laptop—we have nearly 2.25 terabytes of storage. That number alone is the sort of thing that would have sounded like pure science fiction a couple of decades ago. Today, it’s really not that much at all.
What’s more, while today’s computers and their hard drives are fairly robust, these things do fail. Even when that happens, it isn’t necessarily the end of the world: data can often be recovered, but recovery is far from cheap.
The golden rule is that anything digital worth keeping should exist in three copies: the original, a local backup on separate hardware, and a remote backup stored offsite.
This provides physical separation for your backups. While that was the sort of luxury only large companies could afford years ago, it is simple and (relatively) cheap today, thanks to the dramatic drop in the price of large hard drives and the rise of high-speed internet connections.
For our local backups, we use a set of hard disks that I either purchased for this purpose or assembled from old equipment, managed by a hodge-podge of software.
That covers our local backups, but it is extremely important to also keep a remote backup in case of physical disaster or theft. For that, we use Carbonite.
In terms of cost, our entire local storage system could be purchased for about $250 (the going rate is around $100 per terabyte for external storage). Carbonite is $55 per year per machine, though it’s cheaper for longer periods, and you can use coupons to get a month or two for free. So, for roughly $500, it is possible to provide extremely robust backups of your home computers (if your employer doesn’t pay to back up your work computer, they should) for nearly the entire expected life of those machines. It’s far from free, but the peace of mind and ease of use are really worth it.
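For what it’s worth, that estimate holds up as a back-of-the-envelope calculation. A quick sketch (the machine count and lifespan here are my own assumptions for illustration, not figures from above):

```python
# Back-of-the-envelope backup budget; the machine count and
# lifespan are assumptions for illustration only.
local_storage = 250           # one-time cost of external drives (~$100/TB)
carbonite_per_machine = 55    # per machine, per year
machines = 2                  # assumed: the two home computers
lifespan_years = 3            # assumed expected machine life

total = local_storage + carbonite_per_machine * machines * lifespan_years
print(total)  # 580, in the ballpark of "roughly $500"
```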
Ask anyone who has lost even a fraction of their digital photo albums or music collection and I’m sure they’ll agree.
So, snap to it and do yourself a big favor.