kdmurray.blog

Thoughts and opinions of a technology enthusiast.

Let It Go – Putting XP Out to Pasture

On April 8th, 2014, Microsoft is ending support for Windows XP. After that date, if you are still running Windows XP, you will no longer receive security patches or other updates for Windows on your computer.

What does this mean?

If you still have a computer running XP, it means that browsing the web and opening email attachments will steadily become less safe than doing the same things on Windows 7 or just about any other modern operating system. Your computer will continue to work, but in essence you’re now driving your computer without wearing your seatbelt. You might be fine, but you have an ever-increasing chance of catastrophe.

Why is Microsoft doing this to us?

One of the implicit contracts we make as users of technology is that it will need to be upgraded at some point. Windows XP is old. In software terms, it’s ancient. Microsoft has supported Windows XP for 12 years (that’s 56.4 in Internet years). As a point of comparison, Apple launched its first version of OS X about 5 months before Windows XP was released. Apple is typically quite cryptic about its security policies, but as a rule of thumb it only supports an OS up to two versions back from the current one. That means that since the release of OS X 10.9 in October 2013, the following releases of the Mac operating system are no longer supported:

  OS X 10.0 “Cheetah”
  OS X 10.1 “Puma”
  OS X 10.2 “Jaguar”
  OS X 10.3 “Panther”
  OS X 10.4 “Tiger”
  OS X 10.5 “Leopard”
  OS X 10.6 “Snow Leopard”

With the exception of “Cheetah”, all of these releases are newer than Windows XP, and none of them is supported by Apple any longer.

What can I do?

Buy a new computer.

It may sound trite, but the simple fact is that for about twice the cost of a Windows upgrade you can get a brand new computer which in all likelihood will outperform whatever you’re using today. (If this isn’t the case for you, chances are you don’t have XP.)

Buying a new machine is hands down the best way to get yourself off of the Swiss cheese that is the XP security scene and into something else. Buy a Windows 8 tablet/laptop. Buy a Mac. Buy a Linux computer. Something. ANYTHING!

No. I’m stubborn.

If you can’t or simply don’t want to upgrade there are a few things you can do to protect yourself. They won’t be as good as running a modern operating system, but they’ll help.

Stop Running as an Administrator

You can stop about 90% of hacks in their tracks by running as what Windows calls a “Limited User.” While this is a bit less convenient for most people since it will prevent you from installing software or changing core system settings, it will protect you from the majority of attacks because it prevents you from installing software or changing core system settings. See what I did there? Yes. You are your own greatest threat. Follow these instructions from the Microsoft Knowledge Base to create a limited user account for your system.

Don’t Use Internet Explorer

Don’t. Simple. Download something else: Firefox, Chrome, and Opera all still run on XP and receive regular security updates.

Use Firewalls

Windows XP has had a built-in software firewall since Service Pack 2 (released in August 2004). Make sure you have enabled the firewall. No exceptions. Also make sure you have a router on your network between you and the Internet. The vast majority of home routers (and home gateways) provide a hardware firewall which stops uninvited guests (hackers) from getting into your computer.

Do I have a router?

On Windows XP use the following steps to find out if you’re running a hardware firewall. (It’s not 100% conclusive, but this check will give you the right answer in the overwhelming majority of cases.)

  1. Point your web browser to http://icanhazip.com/
  2. Make a note of the IP address.
  3. Click on Start -> Run
  4. Type cmd and press Enter
  5. In the resulting command prompt window type ipconfig and press Enter
  6. Look at the text on the screen; if the “IP Address” line matches the address you noted in step 2, your computer is plugged directly into the Internet and you should get a router. A healthy, router-protected result looks something like this:

Connection-specific DNS Suffix . : workgroup
IP Address. . . . . . . . . . . . : 192.168.1.106
Subnet Mask . . . . . . . . . . . : 255.255.255.0
Default Gateway . . . . . . . . . : 192.168.1.254
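If you’d rather not eyeball the two addresses, the logic of step 6 boils down to one question: is your machine’s local address in one of the private (RFC 1918) ranges that home routers hand out? Here’s a small sketch of that check as a shell function (the function name and sample addresses are mine, purely for illustration):

```shell
# is_private: succeeds when the given IPv4 address falls in one of the
# RFC 1918 private ranges (10.0.0.0/8, 172.16.0.0/12, 192.168.0.0/16)
# that home routers hand out via NAT.
is_private() {
  case "$1" in
    10.*|192.168.*|172.1[6-9].*|172.2[0-9].*|172.3[0-1].*) return 0 ;;
    *) return 1 ;;
  esac
}

# The address from the sample output above is private, so a router
# almost certainly sits between this machine and the Internet.
if is_private 192.168.1.106; then
  echo "Behind a router"
else
  echo "Directly exposed - get a router"
fi
```

If the address from step 1 and the one from ipconfig are identical and public, the check fails and you know you’re exposed.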

Bottom Line

Your computer is not going to explode and melt through your desk like so much digital napalm on April 9th. You are, however, taking your digital life in your hands every time you use the Internet by continuing to run Windows XP.

It’s time. Let it go.

Image credit: lordani0512 on DeviantArt.

Better Dropbox Control with Symlinks

One of the things that’s always made me crazy about the way Dropbox behaves on the Mac is the way it restricts the location of the Dropbox folder. While recent iterations have allowed you to move the folder to a new location, it must still be called “Dropbox”, which is annoying because it prevents you from using your entire home folder as your Dropbox and keeping all those files in one place. (I know there are a ton of reasons not to do this — not the least of which is the Library folder — but humour me…)

What I had wanted was an easy way to keep portions of my main user folder in Dropbox without compromising the familiar folder layout that we all know and love. The solution came in the form of Symlinks.

Symbolic links are essentially shortcuts to other locations on your computer or network. In many cases they navigate like folders, creating a sort of wormhole to the other location. So what I did was create linked folders from places on my computer to places within my Dropbox folder. The Dropbox client picked up these changes and dutifully synchronized the contents of the linked directories up to my Dropbox account, just as if the files were actually in the Dropbox folder.

These commands will create a directory called Miscellaneous in my Documents directory and then link it to Dropbox using the name Misc-Callisto. All other Dropbox clients will see this as a directory called Misc-Callisto and any changes to documents in that cloud folder will be reflected in my ~/Documents/Miscellaneous directory.


mkdir ~/Documents/Miscellaneous
ln -s ~/Documents/Miscellaneous ~/Dropbox/Misc-Callisto
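If you’re nervous about pointing Dropbox at the wrong place, the whole manoeuvre can be rehearsed first with throwaway directories (the /tmp paths below are just stand-ins for the real ones):

```shell
# Rehearse the link with disposable paths before doing it for real.
mkdir -p /tmp/demo/Documents/Miscellaneous /tmp/demo/Dropbox
ln -s /tmp/demo/Documents/Miscellaneous /tmp/demo/Dropbox/Misc-Callisto

# Confirm the link points where we expect before trusting it with data.
readlink /tmp/demo/Dropbox/Misc-Callisto
```

readlink should print the Documents path; if it doesn’t, delete the link and try again before letting the Dropbox client anywhere near it.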

It is also possible to do this synchronization in two directions. If I have my ~/Dropbox/Misc-Callisto directory in my Dropbox, I can create a linked folder to an alternate location on another machine. This is a bit more complicated and can result in lost data if you don’t follow the steps carefully.

  1. Sync up your Dropbox folder
  2. Stop the Dropbox client
  3. Move the directory Misc-Callisto to wherever you want the files to live on your new client.
  4. Rename the directory to whatever you want to call it, in this case ~/Documents/Callisto-Backup/Miscellaneous
  5. Link the folder back to Dropbox using the same name, in this case Misc-Callisto
  6. Restart the Dropbox client

The ln command in step 5 is:


ln -s ~/Documents/Callisto-Backup/Miscellaneous ~/Dropbox/Misc-Callisto
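Steps 3 through 5 are really just a move followed by a re-link. With the Dropbox client stopped, the sequence looks like this (sketched here with throwaway /tmp paths standing in for the real ones):

```shell
# Simulate an already-synced Dropbox folder containing a file.
mkdir -p /tmp/dbx/Dropbox/Misc-Callisto /tmp/dbx/Documents/Callisto-Backup
echo "important notes" > /tmp/dbx/Dropbox/Misc-Callisto/notes.txt

# Steps 3 & 4: move the directory to its new home, renaming as we go.
mv /tmp/dbx/Dropbox/Misc-Callisto /tmp/dbx/Documents/Callisto-Backup/Miscellaneous

# Step 5: link it back into Dropbox under its original name.
ln -s /tmp/dbx/Documents/Callisto-Backup/Miscellaneous /tmp/dbx/Dropbox/Misc-Callisto

# The file is still reachable through the Dropbox path, so the client
# will see an unchanged folder when it restarts.
cat /tmp/dbx/Dropbox/Misc-Callisto/notes.txt
```

The cat at the end prints the file exactly as before the move, which is the whole point: Dropbox never notices anything changed.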

Any changes reflected in either of the local folders will appear in the local folder on the other machine thanks to the wonders of Dropbox and symbolic links.

Note: On recent versions of Windows you can create the same symbolic links from the command prompt using the mklink command. Two caveats: mklink generally needs an elevated (Administrator) command prompt, and linking a directory requires the /D switch; cmd also doesn’t expand ~, so use %USERPROFILE% or a full path instead. The following code blocks are the Windows equivalents of the two ln commands above:


mkdir %USERPROFILE%\Documents\Miscellaneous
mklink /D %USERPROFILE%\Dropbox\Misc-Callisto %USERPROFILE%\Documents\Miscellaneous


mklink /D %USERPROFILE%\Dropbox\Misc-Callisto %USERPROFILE%\Documents\Callisto-Backup\Miscellaneous

Now go forth and link-up your file system for maximum synchronization!

Guest Spot – Knightcast 0056 “The Best of KWTV Live”

I recently had the honour of being asked to be a guest on Knightwise’s podcast during his KWTV Live event in September. He took the opportunity to interview three different people about the current state of the three major operating systems: Linux, OS X and Windows. The three guests for the evening were Larry, Bart, and me.

Larry spoke on the state of Linux and what drives Linux adoption; Bart covered the highlights and lowlights of OS X Lion in some detail; and I talked about the Windows 8 developer preview and the state of Windows tablet PCs.

Give it a listen!

Three-week Ubuntu Experiment – Migrating to Open-Source

This past spring I made an attempt to move myself out of the shackles of the commercial software world and truly embrace open-source. I tried to move my primary machine off Windows 7, and onto Ubuntu Linux. I knew the transition wouldn’t be seamless but I’d heard so many good things about living in a Linux universe that I decided it was time.

The experiment did not go as well as I might have hoped, and despite my efforts to stick with it for some time, I eventually had to cut the experiment short. As I was preparing to re-image my system I started a blog post which I decided not to post at the time. I’ve included a short excerpt which shows my state of mind back in May, just after the experiment concluded.

I told myself I was going to stick it out for at least 3 months. But here I sit, not 3 weeks after making the decision to migrate my primary machine to Ubuntu, with the Windows 7 installation disk in hand. What could possibly have brought me to this point? Primarily, time.

It’s going to take me about 8 hours of work to prep all the data on my system for the transition, wipe the Linux partition, re-install Windows, re-install the applications, re-install VMware, and re-install my Linux VMs (I do still have a use for them!). The problem is, things on Linux have generally taken longer than they should. Some of that is because I’m still learning, and I’ve tried to discount those delays. The rest comes down to the fit and finish of Ubuntu.

So what went wrong?

Problem #1 – 10.10 or 11.04?

I generally resist the temptation to move to the latest OS release, but when I tried setting up a Windows VM under VirtualBox in Ubuntu 10.10 the audio was mucked up. It seemed a bit slow too, but that may have been my imagination. So I tried installing the newly minted 11.04. The VM now worked like a charm, but that was a long multi-step process.

Problem #2 – Virtualization

Trying to set up a virtual machine that would start up at boot time (like a Windows service or any number of linux daemons) proved a nearly impossible task. After several hours of searching, tweaking, testing, and ultimately failing, I decided to abandon the effort and live with manually starting my VMs.

Problem #3 – File Sharing

Setting up network shares was probably one of the better experiences I had. I was able to set up a “public” share on the linux machine and access it from anywhere on the network… as long as I didn’t want to protect it with a username and password. That was going to require more voodoo and black magic than I was prepared to endure for such a simple task. Overall, not a bad experience.

Problem #4 – Flash in Browsers

Like it or not Flash is still an integral part of the web, and Flash in the browser was just one of those things that never quite worked right. When I talk about fit and finish of a product, this is what I mean. Blocky artifacts showing up on video players was the most common issue, though there were other things like playback and audio problems as well.

Problem #5 – Lack of Air Support

The fact that I felt compelled to write a blog post calling attention to a tutorial for getting Adobe Air installed under Ubuntu 11.04 speaks to just how difficult this didn’t need to be. On any other major platform, you can go to a website and simply click the install button. The rest is automatic. Not here though.

Problem #6 – Button Clicks

I constantly had problems just clicking on buttons. Sometimes in an application (Chromium comes to mind) but sometimes just within the Ubuntu environment itself. This kind of thing makes you start to question the faith you have in your OS.

Problem #7 – Learning Curve

I suppose it’s a bit unfair to put this here as it’s undoubtedly the same issue that would come up moving between any two major operating systems. The bottom line is that I have a young family with whom I like to spend the majority of my day. That means that when I decide to sit down at the computer to do something, I don’t really have the time to spend learning how to do things all over again.

There were a few things that were also pleasant surprises during this whole thing. Mostly to do with 3rd party applications.

CrashPlan support

CrashPlan was able to seamlessly match up my Windows backup to the Linux file system. This made it very easy to move everything over. I just hope it works as well in reverse.

AcidRip

Digitizing DVDs has never been easier. It took a couple of tries to get the quality settings just where I wanted them, but the process worked out really well.

Shell

I love the *nix shell, Bash in particular. This is the one thing I will truly miss when I move back to Windows. Having commands like rsync at my disposal, and built in SSH support are also fantastic. While this is something that has to be hacked into a Windows installation, it is available by default on OS X.

In summary…

The availability of good software to do most tasks is one of the key benefits of moving to an open source experience, but the truth is that the experience really didn’t live up to my hopes or my expectations. I’m getting to the point where I want my computing time to be spent creating, not just experimenting with different ways that I could set up my tool sets. And as time moves on, the number of free or open-source applications available on the major commercial platforms like Windows and OS X is growing. Once either of those operating systems is installed I can do everything I want to do without having to pay a license for another piece of software — and in many cases the applications are as good or better than the open-source tools available for the Linux platforms. Add to that the growing number of applications which reside in the cloud and are completely browser and platform agnostic and it starts to become a simple equation for me.

Is it worth the $150 or so that it costs to get my new computer preloaded with a commercial OS? Yes.

Windows Phone 7 – First Impressions

The Redmond-based software giant’s previous offering in the mobile space (the much-maligned Windows Mobile) has taken a lot of flak in recent years over the quality and features (or lack thereof) of its mobile operating systems. One of the biggest challenges was the fact that Microsoft did not control the hardware stack: vendors could essentially build anything they wanted with “compatible” hardware, with little or no enforceable guidance from the software maker. All that has changed in 2010. Microsoft has provided a minimum specification for Windows Phone 7 devices, which seems to be providing a more consistent experience across devices and overall better performance than in years past.

In trying to describe it over the past couple of days I keep finding myself referring to it as ‘not an iPhone’. Though it shares many of the same features and capabilities of its iOS brethren, it doesn’t follow the iPhone’s lead in OS design. The overall feel of the UI is very fluid. Screen transitions, both in the OS and within many of the applications, are smooth, and scrolling through long lists of data is quick and responsive.

The main screen of the WP7 interface is the set of configurable ‘Live Tiles’. These are, in essence, large icons which can also be updated by the apps they belong to. Messaging and email applications, for example, display the number of new messages, and the Marketplace app shows the number of apps you have waiting for an update.

The second panel on the main screen is the application list. All of the applications are displayed in a single scrollable list. This alone is a break with the now-traditional layout of iOS and Android devices, which display screens and screens of icons. This difference provides an instant differentiation for the new Windows devices.

The one class of applications that is treated differently is games. Games are all listed within the Xbox Live hub, isolating them a bit from the rest of the applications.

In the last sentence I mentioned a hub. This is the second major concept that the OS introduces. The hubs are, for want of a better term, points of convergence that bring together disparate sources of similar information. The best and most cited example of this is the ‘People hub’. The People hub allows you to merge in your contacts from your (multiple) email accounts and join them to contact information in your MSN Messenger account and even your Facebook friends. The People hub uses all of that information to create a single list of contacts, each of which contains information from the various sources.

The convergence of the people hub is nice. I’ll be happier once the OS can expand beyond Facebook and Windows Live to incorporate the services I actually use on a regular basis like Twitter, GoogleTalk, Tumblr and Flickr.

So far so good for the newest mobile OS. I’ll have more posts coming in the next few weeks getting into some of these features in more detail, covering other aspects of the Windows Phone 7 ecosystem, and hopefully touching on the developer story for WP7.

Camping out with Windows 7

Windows 7

I’ve been looking for a better way to do my Windows dev work at home for a while now.  I’ve explored a few different options including VMs and Mono, none of which suited the needs that I have.

I’m not someone who has to have the latest & greatest computers to get my stuff done. The things I use my computer for don’t require a whole lot of horsepower. Truth be told, the newest computer in the whole house is my three-year-old MacBook. So when it came to deciding which of the three machines in my house was going to get the Windows 7 treatment, the choice wasn’t hard.

Apple has said that they won’t be providing official support for Windows 7 on any of their machines for another few weeks, and when they do it’ll be on a limited subset of their Intel-based machines, and only for customers who’ve shelled out the extra $30 for Snow Leopard. Admittedly I’m not an expert in computer hardware, but I’ve been around the block enough times to know that “not officially supported” doesn’t mean “it won’t work”.

The first thing I tried to do was just clear some disk space and run the Boot Camp wizard to set up a partition for Windows. Once again I ran into the problem of OS X not being able to reorganize the files on disk to create a contiguous partition. This wouldn’t normally be a problem on a computer with a proper disk-defragmenting tool, but of course OS X has only some rudimentary defrag technology built in; hence the notion that “Macs don’t need to be defragged”. I call shenanigans.

Once I resigned myself to the fact that the only way I was getting back to the nirvana of dual booting was going to be to re-image the MacBook again, I backed up the system, procured a copy of Snow Leopard and got started with the process. Reinstalling OS X was about the same as with Leopard: a couple of new options, but nothing earth-shattering. The Windows 7 installation on the Mac was also nothing special. Smooth and straightforward, as we’d expect of any modern OS, and it moved along fairly quickly.

If you happen to be reading this before you do your installs, there’s one useful piece of information in the 14-page document that Apple says you need to read before attempting the scary installation of Windows on your Mac: the drivers for Windows are located on your Snow Leopard install disk. I spent about 3 hours trying to find drivers.

Even though Apple says Windows 7 isn’t supported, the included drivers on the Snow Leopard disk (intended for use with Vista) work just fine.  Windows reports that some drivers fail to install properly, but in my case nothing is overtly wrong. Network, audio, video, keyboards & mice are all working as expected, with the exception of multitouch functionality on the trackpad. Since I’ll be using the Windows side of the machine most often when connected to a full desk setup (KVM), I’m not too worried about it.

Windows 7 RTM in July??

Tonight the tubes of the Interwebs are all atwitter with rumours that Microsoft may reach the release-to-manufacturing (RTM) milestone for Windows 7 in July. The date being bandied about is July 13th, which coincides with a Microsoft event in New Orleans.

This is stunning news, particularly when thinking back to the release of Windows Vista 18 months ago. Vista was pushed back a number of times, and the delays caused the operating system no end of grief when it failed to meet the expectations of consumers on hitting the street in 2007.

A Windows 7 RTM in July would mean that desktops and laptops enabled with Windows 7 may be ready for consumers in time for the key back-to-school buying season.  Add to this that PC vendors like Alienware are already selling Windows Vista licenses with a Windows 7 upgrade offer; and that Microsoft is taking pre-orders for the OS, and it really smells like Windows 7 is not far off.

Sources: Ars Technica, Geeksmack, @Codinghorror

Mac vs. PC :: Will my next computer be a Mac?

It’s been about two and a half years since I made the switch from being a dedicated Windows user to buying my first Mac. I have really enjoyed my MacBook and wanted to take a few moments to discuss some of the differences and similarities I’ve found with the Mac ownership experience, compared to my earlier (and ongoing) experiences with the Windows platform.

Marketing and Markets
Both Windows and Mac enthusiasts love to evangelize about their platform of choice.  It’s human nature, we all want people to know how smart we are for choosing the best of what’s available.

As is often the case with these “holy wars”, the smaller market tends to be more vocal, and more likely to point out all the flaws in its larger competitor.  This is certainly the case with the Apple community.  From the endless stream of “Get a Mac” ads and their YouTube parody counterparts to news releases and security firms touting the reduced target area of not running Windows, those who have and love Macs are always there to tell you that the solution to every problem with MS Windows is to simply get a Mac.

And it’s not like Microsoft hasn’t provided a great deal of ammo for the pundits to use in their PR muskets.  From the troubled launch of Windows Vista to the sad state of the Zune to the rather pathetic “I’m a PC” ad campaign, Apple has certainly made up ground on the Redmond-based software giant.  Since 2001, Apple has nearly tripled its market share.  That’s a very significant jump for any company.  But let’s be realistic about what that really means.  The Mac maker has raised its market share from about 3.5% to somewhere around the 10% mark.  Even with Apple’s huge growth over the past 8 years, nine out of every 10 computers sold are running a version of Microsoft Windows.

As a result, Microsoft for its part shrugs off the attacks of the maker of all things “i”, often ignoring the marketing onslaught and focusing on its target market: the enterprise.  Does anyone remember when Apple launched the 3G iPhone, the App Store, and support for enterprise features on the iPhone?  Apple certainly hasn’t made great strides into the corporate handheld market, which is something Microsoft does better, and that Research In Motion’s BlackBerry does extremely well (but that’s a topic for another post).  Microsoft and Apple both make products which can be used in the business market.  But time after time, companies continue to choose the Microsoft platform over Apple’s; a huge percentage of the 90% that Microsoft controls in the operating systems space is thanks to the purchases of large companies.  If one were to examine only consumer purchases of computers, Apple would fare much better, probably somewhere around the 20% mark in parts of the world.

The consumer market is without question Apple’s strongest.  By developing a series of technologies and services that all work well together, it’s quite possible to change over your entire home to run on Apple technology.  From beautifully designed iMacs that can sit proudly in your living room, to powerful Mac Pros that can serve content for the entire household, to the Apple TV which can sit atop your HD digital cable box and serve as an all-in-one media centre, to the AirPort Extreme and Time Capsule to manage your network, handle backups and keep everything interconnected.  Add to that Apple’s iTunes and MobileMe services and you’ve got an entire suite of hardware and software that talks to each other almost flawlessly, and really does make your day-to-day computing experience much smoother.  There’s only one catch: the Apple Tax.

The Apple Tax is what those outside the Apple community call the difference between the price of a Mac and the price of the most closely aligned (in hardware specs, at least) PC.  Oftentimes the difference between a Mac and a PC comes in at between 20% and 40%, with the Mac invariably being the more expensive machine.  PC enthusiasts will shame people for wasting their money on “pretty hardware” while the Mac community talks about security, ease of use and bundled software.  Over the past three years or so I’ve come to realize that the reason this debate won’t die is that they’re all right.

My Mac Experience

When I first picked up my MacBook, one of the things that excited me about the experience was the newness of it.  This was a computing platform I wasn’t particularly familiar with, and since I considered myself to be something of a technology aficionado I figured I should jump in and see what all the fuss was really about.

Within hours I had posted my first blog post and was happily exploring the features of OS X Tiger.  There were a few quirks of the Mac OS that drove (drive) me nuts, but overall it was a pretty good experience.  Much more polished than the other Windows alternatives (Red Hat, Ubuntu, Fedora) that I’d looked at in the past.  One of the strongest points in the Mac’s favour early on was the Unix-style, BSD-based terminal.  This is where, for me at least, some of the magic of OS X came into play.

I’ve always been a command-line geek.  There’s no question in my mind that computers function at their best when they don’t need to worry about drawing a “pretty picture” for us lazy humans.  Command-line applications (and for that matter services/daemons) run better, and more often than not, more reliably than applications with elegant user-interfaces.  Being able to explore the world of the UNIX/Linux command line on my shiny new Mac was indeed a revelation for me.  It even led to me porting the wget application to run on Mac OS X.  This wasn’t something that I’d ever consider trying to do for Windows, though it probably isn’t much more difficult.

As time moved forward I really enjoyed my MacBook. Adding new applications to the computer was as simple as downloading them from the Internet and, in most cases, dragging the application to the Applications folder.  In other cases I would need to double-click an .mpkg file to run the installer.

But I noticed after a while that all the software I’d been downloading for my Mac Lab Rat segments for the old version of the podcast had really cluttered up my system.  Thankfully OS X allows you to clean up all of that mess from the installations with just the drag of a mouse.  Yep, that’s right. To uninstall an application from OS X, you just need to drag it to the trash can.  That’s much simpler than un-installing programs on Windows, right?  Well, that’s not really the whole truth.

First off, you need to understand how a Mac stores applications.  Each application is stored in a package ending with a .app extension.  This is, in reality, just a folder that contains the majority of the files that the application uses.  Dragging “the application” to the trash is really just a way of deleting the application folder.  But with many applications this doesn’t delete the entire application footprint.

There are two folders where applications store the majority of their extra files: /Library and /Users/<username>/Library.  Apple’s own recording application GarageBand stores over 1.5GB of files in these Library folders; removing the application using the drag-and-drop method will leave those files on your computer.
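Out of curiosity, you can go hunting for this leftover debris yourself. A tiny helper like the one below (my own invention, not an Apple tool) searches a Library folder for anything matching an application’s name:

```shell
# leftovers DIR NAME: list files and folders under DIR whose names
# contain NAME. On a Mac you would run it against both Library
# locations, e.g.:  leftovers ~/Library GarageBand
#                   leftovers /Library GarageBand
leftovers() {
  find "$1" -iname "*$2*" 2>/dev/null
}
```

Anything it prints is a candidate for manual cleanup after the .app bundle itself has gone to the trash.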

Malware & Baddies
There’s no question that anyone who buys a Mac today, or has bought one in the past 10 years, has experienced but a fraction of a percent of the malware, spyware, viruses and general badness that Windows owners have to deal with on a regular basis.  Apple touts this fact when promoting its Macs, as one would expect, and as it should. The lack of these problems on a Mac is a great reason to use the system.  Mac fanboys would have you believe that the Mac operating system is fundamentally designed to be more secure. They talk about the fact that because you’re less likely to be infected by problems on a Mac, the Mac OS is orders of magnitude more secure than Windows.  But notice that nowhere does anyone say there are fewer vulnerabilities in OS X than in Windows.

The reality is that with Windows’ huge market share (remember the 90% number we talked about earlier?) they are the 10,000lb gorilla.  When your next biggest competitor makes up less than 10% of the market, it’s clear who will be the target. (For those in the business of building gorilla killin’ helicopters (malware), the real target is King Kong not Nim Chimpsky.)

If you’re writing malware of any kind, you’re typically doing it in one of two ways:

  1. Target companies
  2. Target the highest number of people possible

The majority of malware authors choose to go with option #2: cast a wide net and see how many fish you can catch.  If your net is set to catch Windows machines, the sheer math of it will get you more infected machines than if you were to target the much smaller Mac market.  That said, with success comes attention.  Mac users are starting to see pockets of activity targeting OS X.  Consider the Pwn2Own competitions that security companies have run for the past few years: invariably, OS X has been compromised at each of them, and in most cases extremely quickly. Modern operating systems are all susceptible to exploits and security holes. Even Linux systems are vulnerable to attacks; they simply have the benefit of a large number of people quickly patching holes and a user community generally less susceptible to getting itself infected.  OS X is not an invulnerable operating system.

Software – Included and Excluded

It’s often touted that the software included on Mac systems helps to justify the increased price tag of these machines. It does help, to be sure. The quality of the included software is quite high, and it allows you to manage photos, music & email, make videos, burn movies, and record audio.  What Apple doesn’t want you to know is that there are lots of applications out there for Windows too, some of which may even be bundled with your system when you buy it.  Consistency is Apple’s strongest point: they can use phrases like “iLife comes with every new Mac”.

I’ve used every application that comes with iLife at least once.  The most frequently used have been iPhoto and GarageBand; unfortunately I’ve not been overly satisfied with either, and the only reason I stuck with them is that they were, for all intents and purposes, free.  iPhoto in particular lacked a number of features, the most obvious of which is the ability to organize images into folder hierarchies.  This has been fixed in the latest version, but I don’t feel like paying $69 for something that free apps like Picasa can already do.

GarageBand has worked out quite well for the most part, but does leave a few things to be desired.  The interface is excellent, making creating podcasts and other recorded audio quick and fairly intuitive.  It becomes obvious fairly quickly, though, that this product too is targeted at a consumer audience: there are a number of audio-manipulation features missing, including fine-grained control over cutting and pasting audio, and the application crashes with my podcast files once they get over an hour in length.

While the iLife suite is touted as partial justification for the Macs’ higher cost, in many cases I’ve abandoned these applications in favour of free ones downloaded from the Internet. I’m in the midst of replacing iPhoto with Picasa and GarageBand with Audacity (which admittedly is missing a bunch of features too, so I’ll probably have to use both).

Coming from a Windows world, I was accustomed to being able to find software online that did what I needed my computer to do, and the vast majority of the time not having to pay for it (and let me be clear, I’m talking SourceForge, not PirateBay). What I found in coming to the Mac world is that commercial ISVs (independent software vendors) are far more common for home-use applications on the Mac than on Windows. Translation: if you want it, be prepared to pay for it. Third-party developers have done a great job of writing software with a Mac look and feel; Apple and Microsoft both publish guidelines on best practices for developing software for their respective platforms. The ISVs that publish software for the Mac do a great job of creating quality products, the only catch of course being that you need to buy the apps. There is open-source software available on the Mac, but as with the malware developers, the open-source community prefers to stick to platforms where they can get the most eyeballs on their product.

Getting Things Done
This is far and away the most subjective category in my review. There is no question that I’ve been extremely productive with my MacBook over the past three years. I’ve written hundreds of blog posts, contributed to my online forums, remotely managed software on my websites, handled email, instant messaging, Twitter, and virtualization, and managed my online life. The thing is, most of the time I’m not using a Mac-specific application for those tasks. All of my Internet activity is done using Firefox rather than Apple’s own Safari browser. The main reason is that I find Safari a bit clumsy to use, and above all else, I miss the ability to download tons of free plugins and extensions that make my online life better.

One task where the Mac has a leg up on Windows, conceptually at least, is that its built-in command-line interface is based on BSD. This means that all of the default tools for handling command-line operations in a Unix environment are already present, and the most important of those for me is SSH. Native command-line SSH makes administering my web servers a more seamless task, and despite being command-line in nature, it may be the most Mac-like feature of my MacBook. I can get this done on Windows without much effort as well, but on the Mac it truly was built in from the get-go.
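To give a flavour of what that looks like from the OS X Terminal, here is a minimal sketch; the hostname and username in the comments are hypothetical examples, not real machines.

```shell
# ssh and scp ship with OS X's BSD userland -- no installs needed.
# Confirm the client is present and see which version you have:
ssh -V

# A typical server-administration session looks like this
# (host and user below are made-up placeholders):
#   ssh admin@example.com                        # open a remote shell
#   scp backup.tar.gz admin@example.com:/home/   # push a file to the server
```

The same commands work unchanged on any Linux box, which is part of why the habit transfers so easily.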

Re-Staging Systems
I’m hard on my computers. I always have been. Every system I owned prior to my MacBook was re-staged or re-imaged about once per year: sometimes for OS upgrades, sometimes because it had become slow and unusable, and sometimes because I wanted to try a major configuration change to make the computer more useful to me. Something that really appealed to me, based on what Mac owners told me before I bought one, was the idea that all of this would go away. Never again would I need the dreaded “wipe and reload” I’d become used to on Windows. The reality is, I’ve re-staged my MacBook about as often as (if not more often than) I ever did on Windows.

  1. Bought a new Mac
  2. Over the course of the first 6-8 months, I downloaded every piece of Mac software I could find. Uninstalling them left a clutter of junk in the Library folder from the dozens and dozens of apps I had removed. To clear this up permanently, I re-staged the computer.
  3. About 6 months later, I wanted to try out the pre-release version of Boot Camp that came with OS X 10.4. Unfortunately, after the previous re-installation I had chosen a case-sensitive file system, which doesn’t work well with Boot Camp. I re-staged the computer.
  4. When OS X 10.5 came out, I felt somewhat duty-bound to pick up the new release on its first day of availability. To install it, I followed my policy with all OS updates (and the advice I had found online): always start clean. I re-staged the computer.
  5. I decided a few months later that I wanted to try dual-booting Windows and OS X 10.5; unfortunately I had filled up my 80 GB hard drive so much that OS X couldn’t create a decent boot partition. I re-staged the computer.
  6. Several months later I bought a new 320 GB hard drive and promptly loaded it into my Mac. Since the Boot Camp experiment wasn’t really working out anyway, I decided this would be a great time for a fresh start. I re-staged the computer.

In the 32 months I’ve owned the MacBook, I’ve re-staged the machine five times, or about once every 6 months, give or take. That’s a bit more often than my Windows machines’ annual re-load, but I figure two of those were due to my unfamiliarity with the Mac OS. So three in three years; I call that a draw.

Conclusion – Will my next computer be a Mac?
After looking at my Mac experience objectively over the couple of months I’ve written this article on and off, I’ve come to two undeniable truths about how the Mac fits into my life.

  1. The Mac is an outstanding computer that does nearly everything I’ve ever needed it to.
  2. For me, it isn’t worth the 30-40% premium over a comparable Windows-based notebook.

I really do love my MacBook, and I’m going to find a way to keep it running and in active service until it simply becomes too expensive to maintain (read: needs a new battery, or a system component out of warranty). But I also know that my next machine, which will replace the desktops in my basement, will most likely be an off-the-shelf PC. The vast majority of what I do on my computer is done on the Internet. The applications I use on my Mac every single day are Firefox, Thunderbird, MSN, TweetDeck, TextPad and the command-line SSH client, and all of them are available on every computer I’ve ever used. So when I buy the next system, the only operating-system decision will be whether I buy Windows or install the latest LTS edition of Ubuntu.

AnkhSVN and Visual Studio 2008

Source control is one of those things that developers get really polarized about. Most agree that having source control on projects is a necessity, but that’s typically where the similarities end. Some folks are of the mind that every line of code, however insignificant, should be under source control; this provides a record of what was written and a reference for things done in the past. Others believe that source control should be reserved for “real” projects: deliverables for customers, or products released to real-world environments. I really don’t want to get into that debate tonight, so I’m going to stick to the technology.

I wanted to get some source control in place for a few of my personal projects. I chose Subversion for my source-control server for a few reasons, not the least of which was that my hosting company supports auto-configuration of SVN repositories, so I was able to get that set up in just a couple of minutes. That left me some time to contemplate how I would access the repository from the client.
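For the curious, here is a rough sketch of what the hosting panel automates: creating a Subversion repository and putting a first file under version control. The paths are made up for illustration, and a hosted repository would be reached via an http:// or svn:// URL rather than file://.

```shell
# Start fresh, then create a local repository and check out a working copy.
rm -rf /tmp/demo-repo /tmp/demo-wc
svnadmin create /tmp/demo-repo
svn checkout file:///tmp/demo-repo /tmp/demo-wc

# Add a file and commit it back to the repository.
cd /tmp/demo-wc
echo "hello" > readme.txt
svn add readme.txt
svn commit -m "Initial import"
```

Everything a GUI plugin like AnkhSVN does ultimately boils down to these same checkout/add/commit operations.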

I’m running Visual Studio 2008 on my development machine, which gives me the ability to use plugins for the IDE, a feature that is sadly missing from the Express editions. There were a couple of good options available for SVN plugins: VisualSVN, the 800 lb gorilla in this space, and the open-source option, CollabNet’s AnkhSVN. Given that this was for personal exploration of the toolset, the open-source (free) option was the obvious choice.

The setup for AnkhSVN was quick and painless, and when the IDE opened up it put options for source control right in the menus where they were nice and easy to find.  I created a project, and selected the “add to Subversion” checkbox, entered the necessary credentials and created the project in my SVN repository.

When in Visual Studio, the AnkhSVN controls are located on a tab at the bottom of the IDE, alongside other solution-wide functionality like the to-do list and output window. This pane tracks all of the changes (adds, deletes and updates) that you’ve made to the solution files, which is extra handy as a review when you’re ready to commit back to the repository. By quickly scanning the list of changes you’re able to write solid commit comments that provide decent documentation for you, or for those who come after you.

I’m still relatively new to Subversion and AnkhSVN, but I’m looking forward to exploring them in more detail — maybe I’ll even do a podcast episode about it!

Hide the Undock Button in Windows XP in Five Steps

If you have a Windows XP notebook and love to use the Run menu item, chances are you’ve occasionally hit “E” instead of “R” when you bring up your Start menu. The result? Windows ejects the PC from its dock and forces you to re-dock it before you can carry on with your work. It’s only about a 90-second process, but it’s annoying as hell and will completely take you out of the zone when you’re in the middle of a project.

  1. Open the registry editor (Start -> Run -> regedit)
  2. Open one of the applicable keys:
    • Current User: [HKCU\Software\Microsoft\Windows\CurrentVersion\Policies\Explorer]
    • All Users: [HKLM\Software\Microsoft\Windows\CurrentVersion\Policies\Explorer]
  3. Right-click the Explorer folder -> New -> DWORD Value and call it NoStartMenuEjectPC (case-sensitive!)
  4. Right-click NoStartMenuEjectPC -> Modify
  5. Choose one of the following values to set the behaviour you want:
    • type 1 to hide the undock button
    • type 0 to show the undock button

Now you can safely use your keyboard shortcuts without worrying about accidentally undocking your computer.
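If you’d rather not click around in regedit, the same change can be applied by saving the steps above as a .reg file and double-clicking it to merge. This sketch covers the Current User variant; change the dword to 00000000 to bring the undock button back.

```
Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Policies\Explorer]
"NoStartMenuEjectPC"=dword:00000001
```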