I’m way behind getting this posted. It’s the AGP #53 from way back in early March. Enjoy “A Touch of Culture.”
I could swear I wrote about this at some point in the distant past, but I couldn’t find the article this week when I needed it to help troubleshoot an issue with another developer. He needed to access the executing web page’s HttpContext object from a class other than the code-behind of the executing web-forms page. Essentially he was looking for a way to map a web path to a physical folder path without needing to hard-code it or know where the application was deployed on the server in question.
If done correctly, an application can reside anywhere in the file system and be deployed to a virtual directory at any depth without causing a problem with URL resolution. In the code-behind of a web-forms page, the code is simple:
string physicalPath = Server.MapPath("~/somefolder/myfile.xml");
However, doing this from another class involves just a little bit more work:
using System.Web;

string physicalPath = HttpContext.Current.Server.MapPath("~/somefolder/myfile.xml");
It’s really quite straightforward when you see it, and I can’t believe that I forget how to do it. This method will also provide you access to lots of other useful objects which make up the “state” of the application from an HTTP perspective.
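As a sketch of how this might look in practice, here’s a small helper class (the class and method names are my own, purely for illustration) that any class in the application can call to resolve a web path during a request:

```csharp
using System;
using System.Web;

// Hypothetical helper: any class executing during a web request can call
// this, not just the code-behind of the current page.
public static class PathHelper
{
    public static string ResolvePath(string virtualPath)
    {
        // HttpContext.Current is null outside of a request (for example,
        // on a background thread), so guard against that case.
        HttpContext context = HttpContext.Current;
        if (context == null)
            throw new InvalidOperationException("No HTTP context available.");

        return context.Server.MapPath(virtualPath);
    }
}

// Usage, from anywhere in the application during a request:
// string physicalPath = PathHelper.ResolvePath("~/somefolder/myfile.xml");
```

The same HttpContext.Current property also exposes the Request, Response, Session, and Cache objects, which is what makes it so handy outside the page class.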
I recently had the honour of being asked to be a guest on Knightwise’s podcast during his KWTV Live event in September. He took the opportunity to interview three different people about the current state of the three major operating systems: Linux, OS X and Windows. The three guests for the evening were:
- Larry Bushey from the Going Linux podcast
- Bart Busschots from The International Mac podcast
- Me from here, and of course the Aussie Geek Podcast
Larry spoke on the state of Linux and what drives Linux adoption; Bart covered the highlights and lowlights of OS X Lion in some detail; and I talked about the Windows 8 developer preview and the state of Windows tablet PCs.
Give it a listen!
This past spring I made an attempt to move myself out of the shackles of the commercial software world and truly embrace open-source. I tried to move my primary machine off Windows 7, and onto Ubuntu Linux. I knew the transition wouldn’t be seamless but I’d heard so many good things about living in a Linux universe that I decided it was time.
The experiment did not go as well as I might have hoped, and despite my efforts to stick with it for some time, I eventually had to cut the experiment short. As I was preparing to re-image my system I started a blog post which I decided not to post at the time. I’ve included a short excerpt which shows my state of mind back in May, just after the experiment concluded.
I told myself I was going to stick it out for at least 3 months. But here I sit, not 3 weeks after making the decision to migrate my primary machine to Ubuntu, with the Windows 7 installation disk in hand. What could possibly have brought me to this point? Primarily, time.
It’s going to take me about 8 hours of work to prep all the data on my system for the transition, wipe the Linux partition, re-install Windows, re-install the applications, re-install VMware, and re-install my Linux VMs (I do still have a use for them!). The problem is, things on Linux have generally taken longer than they should. Some of those delays are down to the fact that I’m still learning, and I’ve tried to discount them. Others are down to the fit and finish of Ubuntu.
So what went wrong?
Problem #1 – 10.10 or 11.04?
I generally resist the temptation to move to the latest OS release, but when I tried setting up a Windows VM under VirtualBox in Ubuntu 10.10 the audio was mucked up. It seemed a bit slow too, but that may have been my imagination. So I tried installing the newly minted 11.04. The VM now worked like a charm, but that was a long multi-step process.
Problem #2 – Virtualization
Trying to set up a virtual machine that would start up at boot time (like a Windows service or any number of Linux daemons) proved a nearly impossible task. After several hours of searching, tweaking, testing, and ultimately failing, I decided to abandon the effort and live with manually starting my VMs.
Problem #3 – File Sharing
Setting up network shares was probably one of the better experiences I had. I was able to set up a “public” share on the Linux machine and access it from anywhere on the network… as long as I didn’t want to protect it with a username and password. That was going to require more voodoo and black magic than I was prepared to endure for such a simple task. Overall, though, not a bad experience.
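For reference, a guest-accessible Samba share like the one described above typically takes only a few lines in /etc/samba/smb.conf; the share name and path below are just examples, not the ones I actually used:

```
[public]
   path = /srv/share/public
   browseable = yes
   read only = no
   guest ok = yes
```

Password protection is where things get murkier, since it means turning off guest access and mapping Samba users to system accounts with smbpasswd.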
Problem #4 – Flash in Browsers
Like it or not Flash is still an integral part of the web, and Flash in the browser was just one of those things that never quite worked right. When I talk about fit and finish of a product, this is what I mean. Blocky artifacts showing up on video players was the most common issue, though there were other things like playback and audio problems as well.
Problem #5 – Lack of Air Support
The fact that I felt compelled to write a blog post calling attention to a tutorial for getting Adobe Air installed under Ubuntu 11.04 speaks to just how difficult this didn’t need to be. On any other major platform, you can go to a website and simply click the install button. The rest is automatic. Not here though.
Problem #6 – Button Clicks
I constantly had problems just clicking on buttons. Sometimes in an application (Chromium comes to mind) but sometimes just within the Ubuntu environment itself. This kind of thing makes you start to question the faith you have in your OS.
Problem #7 – Learning Curve
I suppose it’s a bit unfair to put this here as it’s undoubtedly the same issue that would come up moving between any two major operating systems. The bottom line is that I have a young family with whom I like to spend the majority of my day. That means that when I decide to sit down at the computer to do something, I don’t really have the time to spend learning how to do things all over again.
There were a few things that were also pleasant surprises during this whole thing, mostly to do with third-party applications.
CrashPlan was able to seamlessly match up my Windows backup to the Linux file system. This made it very easy to move everything over. I just hope it works as well in reverse.
Digitizing DVDs has never been easier. It took a couple of tries to get the quality settings just where I wanted them, but the process worked out really well.
I love the *nix shell, Bash in particular. This is the one thing I will truly miss when I move back to Windows. Having commands like rsync at my disposal, and built-in SSH support, are also fantastic. While this is something that has to be hacked into a Windows installation, it is available by default on OS X.
The availability of good software to do most tasks is one of the key benefits of moving to an open source experience, but the truth is that the experience really didn’t live up to my hopes or my expectations. I’m getting to the point where I want my computing time to be spent creating, not just experimenting with different ways that I could set up my tool sets. And as time moves on, the number of free or open-source applications available on the major commercial platforms like Windows and OS X is growing. Once either of those operating systems is installed I can do everything I want to do without having to pay a license for another piece of software — and in many cases the applications are as good or better than the open-source tools available for the Linux platforms. Add to that the growing number of applications which reside in the cloud and are completely browser and platform agnostic and it starts to become a simple equation for me.
Is it worth the $150 or so that it costs to get my new computer preloaded with a commercial OS? Yes.
The Redmond-based software giant’s previous offering in the mobile space (the much maligned Windows Mobile) has taken a lot of flak in recent years over its quality and features (or lack thereof). One of the biggest challenges was the fact that Microsoft did not control the hardware stack. Vendors could essentially build anything they wanted with “compatible” hardware, with little or no enforceable guidance from the software maker. All that has changed in 2010. Microsoft has provided a minimum specification for Windows Phone 7 devices, which seems to be providing a more consistent experience across devices, and overall better performance than in years past.
In trying to describe it over the past couple of days I keep finding myself referring to it as ‘not an iPhone’. Though it shares many of the same features and capabilities of its iOS brethren, it doesn’t follow the lead in OS design. The overall feel of the UI is very fluid. Screen transitions, both in the OS and within many of the applications, are smooth, and scrolling through long lists of data is quick and responsive.
The main screen of the WP7 interface is the set of configurable ‘Live Tiles’. These are in essence large icons which can also be updated by the apps they belong to. Messaging and email applications, for example, display the number of new messages, and the Marketplace app shows the number of apps you have waiting for an update.
The second panel on the main screen is the application list. All of the applications are displayed in a single scrollable list. This alone is a break with the now-traditional layout of iOS and Android devices, which display screens and screens of icons, and it provides instant differentiation for the new Windows devices.
The one class of applications that is treated differently is games. Games are all listed within the Xbox Live hub, isolating them a bit from the rest of the applications.
In the last sentence I mentioned a hub. This is the second major concept that the OS introduces. The hubs are, for want of a better term, points of convergence that bring together disparate sources of similar information. The best and most cited example of this is the ‘People hub’. The People hub allows you to merge the contacts from your (multiple) email accounts and join them to the contact information in your MSN Messenger account and even your Facebook friends. The People hub uses all of that information to create a single list of contacts, each of which contains information from the various sources.
The convergence of the People hub is nice. I’ll be happier once the OS can expand beyond Facebook and Windows Live to incorporate the services I actually use on a regular basis like Twitter, Google Talk, Tumblr and Flickr.
So far so good for the newest mobile OS. I’ll have more posts coming in the next few weeks getting into some of these features in more detail, covering other aspects of the Windows Phone 7 ecosystem, and hopefully touching on the developer story for WP7.
Update 2011-02-11: Since the post was originally published Microsoft has issued a hotfix for this issue, which is slated to be included in Windows 7 SP1.
I recently ran into a problem on my fancy new machine while trying to commit a rather large number of files into an SVN repository. The error message stated that some of the files in the .svn control directory had become corrupted and unreadable.
After Googling around a bit I came across a post on the CollabNet issue log which identified this as an issue with the NTFS stack on Windows 7. That post included another link to a Microsoft TechNet discussion about the issue.
The long and the short of it is that this is an identified issue in the NTFS implementation in all editions of Windows 7 (both 32 and 64 bit versions). The indexing service is locking files which SVN is trying to move. This only appears to be a problem with large batch transactions. Smaller ones, for me at least, have been working just fine but YMMV.
In case you don’t want to read the whole discussion thread, here’s the response from the NTFS team developer who responded to the community reports:
This is a known regression in Windows 7 in the NTFS file system. It occurs when doing a superceding rename over a file that has an atomic oplock on it (atomic oplocks are a new feature in Windows 7). The indexer uses atomic oplocks which is why it helped when you disabled the indexer. Explorer also uses atomic oplocks which is why you are still seeing the issue. When this occurs STATUS_FILE_CORRUPT is incorrectly returned and the volume is marked “dirty” which is a signal to the system that chkdsk needs to be run. No actual corruption has occured.
NTFS Development Lead
The identified workaround for this issue is to stop the indexing service. If you don’t use search very often you can disable it. If you do, you can just stop the service and allow it to restart the next time you restart Windows.
The next trick, of course, is finding the indexing service. In Windows 7 the service has been renamed “Windows Search”. It serves essentially the same functions as the old “Indexing Service”.
There have been some reports that this issue affects Windows Vista as well, but I don’t have a Vista machine to test with.
Tonight the tubes of the Interwebs are all atwitter with rumours that Microsoft may reach the release-to-manufacturing (RTM) milestone for Windows 7 in July. The date being bandied about is July 13th, which coincides with a Microsoft event in New Orleans.
This is stunning news, particularly when thinking back to the release of Windows Vista. Vista was pushed back a number of times, and the delays caused the operating system no end of grief when it failed to meet the expectations of consumers on hitting the street in 2007.
A Windows 7 RTM in July would mean that desktops and laptops running Windows 7 could be ready for consumers in time for the key back-to-school buying season. Add to this that PC vendors like Alienware are already selling Windows Vista licenses with a Windows 7 upgrade offer, and that Microsoft is taking pre-orders for the OS, and it really smells like Windows 7 is not far off.
It’s been about two and a half years since I made the switch from being a dedicated Windows user to buying my first Mac. I have really enjoyed my MacBook and wanted to take a few moments to discuss some of the differences and similarities I’ve found with the Mac ownership experience, compared to my earlier (and ongoing) experiences with the Windows platform.
Marketing and Markets
Both Windows and Mac enthusiasts love to evangelize about their platform of choice. It’s human nature: we all want people to know how smart we are for choosing the best of what’s available.
As is often the case with these “holy wars”, the smaller market tends to be more vocal, and more likely to point out all the flaws in its larger competitor. This is certainly the case with the Apple community. From the endless stream of “Get a Mac” ads and their YouTube parody counterparts, to news releases and security firms touting the reduced target area of not running Windows, those who have and love Macs are always there to tell you that the solution to every problem with MS Windows is to simply get a Mac.
And it’s not like Microsoft hasn’t provided a great deal of ammo for the pundits to use in their PR muskets. From the troubled launch of Windows Vista, to the sad state of the Zune, to the rather pathetic “I’m a PC” ad campaign, Apple has certainly made up ground on the Redmond-based software giant. Since 2001, Apple has nearly tripled its market share. That’s a very significant jump for any company. But let’s be realistic about what that really means. The Mac maker has raised its market share from about 3.5% to somewhere around the 10% mark. Even with Apple’s huge growth over the past 8 years, nine out of every ten computers sold are running a version of Microsoft Windows.
As a result, Microsoft for its part shrugs off the attacks of the maker of all things “i”, often ignoring the marketing onslaught and focusing on its target market: the enterprise. Does anyone remember when Apple launched the 3G iPhone, the App Store, and support for enterprise features on the iPhone? Apple certainly hasn’t made great strides into the corporate handheld market, which is something that Microsoft does better, but that Research In Motion’s BlackBerry does extremely well; that’s a topic for another post, though. Microsoft and Apple both make products which can be used in the business markets. But time after time, companies continue to choose the Microsoft platform over Apple’s; a huge percentage of the 90% that Microsoft controls in the operating systems space is thanks to the purchases of large companies. If one were to examine only consumer purchases of computers, Apple would fare much better, probably somewhere around the 20% mark in parts of the world.
The consumer market is without question Apple’s strongest. By developing a series of technologies and services that all work well together, Apple makes it quite possible to change over your entire home to run on its technology: from beautifully designed iMacs that can sit proudly in your living room, to powerful Mac Pros that can serve content for the entire household, to the Apple TV which can sit atop your HD digital cable box and serve as an all-in-one media centre, to the AirPort Extreme router and Time Capsule backup appliance to manage your network and keep everything interconnected. Add to that Apple’s iTunes and MobileMe services and you’ve got an entire suite of hardware and software that talks to each other almost flawlessly, and really does make your day-to-day computing experience much smoother. There’s only one catch: the Apple Tax.
The Apple Tax is what those outside the Apple community call the difference between the price of a Mac and the price of the most closely aligned (in hardware specs, at least) PC. Oftentimes the difference between a Mac and a PC comes in between 20% and 40%, with the Macs invariably being the more expensive machines. PC enthusiasts will shame people for wasting their money on “pretty hardware”, while the Mac community talks about security, ease of use and bundled software. Over the past three years or so I’ve come to realize that the reason this debate won’t die is that they’re all right.
My Mac Experience
When I first picked up my MacBook, one of the things that excited me about the experience was the newness of it. This was a computing platform that I wasn’t particularly familiar with, and since I considered myself to be something of a technology aficionado I figured I should jump in and see what all the fuss was really about.
Within hours I had posted my first blog post and was happily exploring the features of OS X Tiger. There were a few quirks of the Mac OS that drove (drive) me nuts, but overall it was a pretty good experience. Much more polished than the other Windows alternatives (Red Hat, Ubuntu, Fedora) that I’d looked at in the past. One of the strongest points in the Mac’s favour early on was the Unix-style, BSD-based terminal. This is where, for me at least, some of the magic of OS X came into play.
I’ve always been a command-line geek. There’s no question in my mind that computers function at their best when they don’t need to worry about drawing a “pretty picture” for us lazy humans. Command-line applications (and for that matter services/daemons) run better, and more often than not, more reliably than applications with elegant user-interfaces. Being able to explore the world of the UNIX/Linux command line on my shiny new Mac was indeed a revelation for me. It even led to me porting the wget application to run on Mac OS X. This wasn’t something that I’d ever consider trying to do for Windows, though it probably isn’t much more difficult.
As time moved forward I really enjoyed my MacBook. Adding new applications to the computer was as simple as downloading them from the Internet and in most cases dragging the application to the Applications folder. In other cases I would need to double-click an .mpkg file to run the installer.
But I noticed after a while that all the software I’d been downloading for my Mac Lab Rat segments for the old version of the podcast had really cluttered up my system. Thankfully OS X allows you to clean up all of that mess from the installations with just the drag of a mouse. Yep, that’s right. To uninstall an application from OS X, you just need to drag it to the trash can. That’s much simpler than un-installing programs on Windows, right? Well, that’s not really the whole truth.
First off, you need to understand how a Mac stores applications. Each application is stored in a package ending with a .app extension. This is, in reality, just a folder that contains the majority of the files that the application uses. Dragging “the application” to the trash is really just a way of deleting the application folder. But with many applications this doesn’t delete the entire application footprint.
There are two folders where applications store the majority of their extra files: the /Library and /Users/&lt;username&gt;/Library folders. Apple’s own recording application GarageBand stores over 1.5GB of files in these library folders; removing the application using the drag-and-drop method will leave those files on your computer.
Malware & Baddies
There’s no question that anyone who buys a Mac today, or has bought one in the past 10 years, has experienced but a fraction of a percent of the malware, spyware, viruses and badness that Windows owners have to deal with on a regular basis. Apple touts this fact when promoting its Macs, as one would expect, and as it should. The lack of these problems on a Mac is a great reason to use the system. Mac fanboys would have you believe that the Mac operating system is fundamentally designed to be more secure. They argue that because you’re less likely to be infected on a Mac, the Mac OS must be orders of magnitude more secure than Windows. But notice that nowhere do they say there are fewer vulnerabilities in OS X than in Windows.
The reality is that with Windows’ huge market share (remember the 90% number we talked about earlier?) they are the 10,000lb gorilla. When your next biggest competitor makes up less than 10% of the market, it’s clear who will be the target. (For those in the business of building gorilla killin’ helicopters (malware), the real target is King Kong not Nim Chimpsky.)
If you’re writing malware of any kind, you’re typically doing it in one of two ways:
- Target companies
- Target the highest number of people possible
The majority of malware authors choose to go with option #2: cast a wide net and see how many fish you can catch. If your net is set to catch Windows machines, the sheer math of it will get you more infected machines than if you were to target the much smaller Mac market. That said, with success comes attention. Mac users are starting to see pockets of activity targeting OS X. Consider the Pwn2Own competitions that security companies have run for the past few years. Invariably, OS X has been compromised at each of them, and in most cases extremely quickly. Modern operating systems are all susceptible to exploits and security holes. Even Linux systems are vulnerable to attack; they simply have the benefit of a large number of people to quickly patch holes and a user community generally less susceptible to getting themselves infected. OS X is not an invulnerable operating system.
Software – Included and Excluded
It’s often touted that the software included on Mac systems helps to justify the increased price tag of these machines. It does help, to be sure. The quality of the included software is quite high, and allows you to manage photos, music and email, make videos, burn movies, and record audio. What Apple doesn’t want you to know is that there are lots of applications out there for Windows too, some of which may even be bundled with your system when you buy it. Consistency is Apple’s strongest point: they can use phrases like “iLife comes with every new Mac”.
I’ve used every application that comes with iLife at least once. The ones I used most frequently were iPhoto and GarageBand; unfortunately I’ve not been overly satisfied with either, and the only reason I stuck with them is that they were, for all intents and purposes, free. iPhoto in particular lacked a number of features, the most obvious of which is the ability to organize images into folder hierarchies. This has been fixed in the latest version, but I don’t feel like paying $69 for something that apps like Picasa can do for free.
GarageBand has worked out quite well for the most part, but does leave a few things to be desired. The interface is excellent, making creating podcasts and other recorded audio quick and fairly intuitive. It becomes obvious fairly quickly, though, that this product too is targeted at a consumer audience: a number of audio manipulation features are missing, including fine-grained control over cutting and pasting audio, and the application crashes on my podcast files once they get over an hour in length.
While the iLife suite is touted as being partial justification of the increased cost of the Macs, in many cases I’ve abandoned these applications in favour of free applications that I was able to download from the Internet. I’m in the midst of replacing iPhoto with Picasa and GarageBand with Audacity (which admittedly is missing a bunch of features too, so I’ll probably have to use both).
Coming from a Windows world, I was accustomed to being able to find software online that did what I needed my computer to do, and the vast majority of the time not having to pay for it (and let me be clear, I’m talking SourceForge, not PirateBay). What I found in coming to the Mac world is that commercial ISVs (independent software vendors) were far more common for home-use applications on the Mac than on Windows. Translation: if you want it, be prepared to pay for it. Third-party developers have done a great job of writing software that has a Mac look and feel. Apple and Microsoft both publish guidelines on best practices for developing software for their respective platforms, and the ISVs that publish software for the Mac do a great job of creating a quality product; the only catch, of course, is that you need to buy the apps. There is open-source software available on the Mac, but as with the malware developers, the open-source community prefers to stick to platforms where they can get the most eyeballs on their product.
Getting Things Done
This is far and away the most subjective category in my review. There is no question that I’ve been extremely productive with my MacBook over the past three years. I’ve written hundreds of blog posts, contributed to my online forums, remotely managed software on my websites, and handled email, instant messaging, Twitter, virtualization and the rest of my online life. The thing is, most of the time I’m not using a Mac-specific application to do those tasks. All of my Internet activity is done using Firefox rather than Apple’s own Safari browser. The main reason is that I find Safari a bit clumsy to use, and above all else, I miss the ability to download tons of free plugins and extensions that make my online life better.
One area where the Mac has a leg up on Windows, conceptually at least, is that its built-in command-line interface is based on BSD. This means that all of the default tools for handling command-line operations in a Unix environment are already present, and the most important of those for me is SSH. Native command-line support for SSH makes administering my web servers a more seamless task, and despite being command-line in nature, it may be the most Mac-like feature of my MacBook. I can get this done on Windows without much effort as well, but on the Mac it truly was built in from the get-go.
I’m hard on my computers. I always have been. Every system I’ve ever owned prior to my MacBook has been re-staged or re-imaged about once per year. Sometimes this was for OS upgrades, sometimes because it had become slow and unusable, and sometimes because I wanted to try a major configuration change to make the computer more useful to me. Something that really appealed to me about the Mac, from those I’d spoken to prior to purchasing it, was the idea that all of this would be gone once I got a Mac. Never would I need to do the dreaded “wipe and reload” operation that I’d become used to on Windows. The reality is, I’ve re-staged my MacBook about as many times (if not more) as I ever did on Windows.
- Bought a new Mac
- Over the course of the first 6-8 months, downloaded every piece of Mac software I could find. Un-installing them left me with a clutter of junk in the “Library folder” for the dozens and dozens of apps I had removed. To clear this up permanently, I re-staged the computer.
- About 6 months later, I wanted to try out the pre-release version of Boot Camp that came with OS X 10.4. Unfortunately, after the previous re-installation I had chosen a “case-sensitive” file system, which doesn’t work well with Boot Camp. I re-staged the computer.
- When OS X 10.5 came out, I felt somewhat duty-bound to pick up the new release on its first day of availability. To put it on, I followed my policy with all OS updates (and the advice I had found online), which is to always start clean. I re-staged the computer.
- I decided a few months later that I wanted to try dual-booting my computer with Windows and OS X 10.5; unfortunately I had filled up my 80 GB hard drive so much that OS X couldn’t create a decent boot partition. I re-staged the computer.
- Several months later I bought a new 320 GB hard drive and promptly proceeded to load it into my Mac. Since the Boot Camp thing wasn’t really working out anyway, I decided this would be a great time to get a fresh start. I re-staged the computer.
Over the 32 months I’ve owned the MacBook, I’ve re-staged the machine five times. That’s about once every 6 months, give or take. That’s a bit more often than my Windows machines’ annual re-load, but I figure two of them were due to my unfamiliarity with the Mac OS. So three times in three years; I call that a draw.
Conclusion – Will my next computer be a Mac?
After looking at my Mac experience objectively for a couple of months as I’ve written this article on and off, I’ve come to two undeniable truths about how the Mac fits in to my life.
- The Mac is an outstanding computer that does nearly everything I’ve ever needed it to.
- For me, it isn’t worth the 30-40% premium over a comparable Windows-based notebook.
I really do love my MacBook, and I’m going to find a way to keep it running and in active service until it simply becomes too expensive to maintain (read: needs a new battery, or a system component fails out of warranty). But I also know that my next machine, which will be a replacement for the desktops in my basement, will most likely be an off-the-shelf PC. The vast majority of what I do on my computer is done on the Internet. The applications I use on my Mac every single day are Firefox, Thunderbird, MSN, TweetDeck, TextPad and the CLI SSH client. All of those applications are available on every single computer that I’ve ever used. So when I buy the next system, the only decision for me as far as operating systems go will be whether I buy Windows or install the latest LTS edition of Ubuntu.
A couple of weeks ago at Mix ’09 the ASP.NET MVC team announced the RTW (release-to-web) version of the MVC framework. I’ve been looking at the framework and playing with pieces of it for a few months now, but due to school & work commitments haven’t really had a chance to give it a good run through, or build anything meaningful with it.
This past week I went back to the ASP.NET website and discovered that there is now a long list of tutorials, arranged in an order intended to make the major features of the MVC framework easier to learn, particularly for those of us who haven’t had an MVC-heavy comp-sci education. The tutorials come in written or video form (there is some overlap) and provide some good step-by-step instructions for exploring the new methodology.
Expect me to get into more detail about the ins-and-outs of the MVC framework in upcoming editions of the new podcast (more details soon, I promise!!)
You can, of course, download and use the MVC framework with Visual Studio 2008 without the tutorials, but I would highly recommend giving the first few a once-over. Have a look at the tutorial site and see what you think.
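To give a sense of what the tutorials walk you through, here’s roughly what a minimal MVC controller looks like (the controller name, action, and message below are placeholders of my own, not from any particular tutorial):

```csharp
using System.Web.Mvc;

// A controller's public methods become "actions", and the routing engine
// maps incoming URLs to them: /Home/Index invokes Index() below.
public class HomeController : Controller
{
    public ActionResult Index()
    {
        // ViewData carries values from the controller to the view.
        ViewData["Message"] = "Hello from ASP.NET MVC!";

        // By convention, renders the view at ~/Views/Home/Index.aspx.
        return View();
    }
}
```

The separation is the point: the controller handles the request and gathers data, while the view is only responsible for rendering it, which is the part that takes getting used to after years of web-forms code-behind.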
Leave it to the big brains of Scott Guthrie and Scott Hanselman to come up with this simple yet super cool concept. Nerd Dinners — a way (by use of web technology) for a bunch of introverts to meet up with each other and enjoy a meal, and the kind of conversation that drives our non-geek spouses crazy.
Anything that helps to foster conversation between people who are passionate about a topic is a good thing in my book. If it involves code and geekery, so much the better.
It’s a great idea, and I hope to see one (or a few) organized in and around Vancouver before too long… otherwise I’ll have to set one up. The closest one as of this posting is in Redmond. Cool, but I don’t think I can justify a two hour drive down for dinner. Not yet, anyway.
Check out NerdDinner.com and meet up with some nerds in your community!