The Tao of Helvetica

by Matt 14. February 2008 11:35

Want to see an example of good writing? A little while ago, I got rather excited upon seeing a documentary about Helvetica. Yes, the typeface. No, that's not the example of good writing.

Jeff Atwood recently waxed lyrical on the very same subject. And of course, he came over all, you know, erudite and knowledgeable, while I just sounded like a geek. Huh.

At least I was there first? Nope, not even close. The Helvetica-the-movie-the-blog has been posting since January 2007, and it's fascinating.

And I'll tell you why I think it's fascinating. Firstly, it's a classic typeface. Its sheer ubiquity is astonishing - but you don't need me to tell you that. Watch the documentary. It's a very good film - better than you might give it credit for, given its subject.

But the typeface itself is just a special case. This is really about attention to detail. A designer has specifically chosen to use that typeface. It's going to have a huge impact on their design - on branding, readability, and the message, image and lifestyle it portrays. And most people won't notice. It's a nearly invisible detail, and yet it's been lavished with such attention. That's brilliant.

I remember when I first realised this. Not long out of university, I was working for a company that did finite element analysis. One of our customers modelled glass bottles - or more accurately, the making of them. I can still remember being amazed at the amount of effort and variation that can go into making a bottle. A plain, old, anonymous bottle.

It was the secret life of stuff. Everything gets complex the deeper you look.

And it got me hooked on the detail. There's something deeply satisfying about peeking behind the curtain and seeing this at work. Ever noticed how subtly the iPod's backlight fades in and out? How the rear windscreen wiper comes on automatically when it's raining and you switch into reverse?

But more than that, it's affected how I work. If you see how much care there is in everyday stuff, well, you just have to be as considerate in your work, don't you?

So go on, spoil yourself. Go font spotting.


Hungarian notation. Never too old.

by Matt 27. January 2008 18:16

I'm with Jeremy. Fancy not knowing about Hungarian notation (and the real McCoy - Apps Hungarian - not that dodgy Systems stuff people sneer at).

I much prefer C#'s recommended way of doing things - full, unabbreviated words AllMashedTogether. It's just so much more civilised. But the part of me that still loves C++ can slip into Hungarian with surprising ease...
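
For anyone who hasn't seen the distinction, here's a rough sketch (the names are made up, purely for illustration). Systems Hungarian encodes the declared type into the prefix - which the compiler already knows - while Apps Hungarian encodes the kind of value, which it doesn't:

class HungarianSketch
{
    // Systems Hungarian: the prefix repeats the declared type, which
    // the compiler already checks for you - hence the sneering.
    uint dwTimeout;       // dw = DWORD
    string szUserName;    // sz = zero-terminated string

    // Apps Hungarian (the real McCoy): the prefix carries meaning the
    // type system can't see.
    string usComment;     // us = unsafe (unescaped) string
    string sComment;      // s = safe (escaped) string
    int cbBuffer;         // cb = count of bytes
    int ichFirst;         // ich = index of a character

    // And the recommended C# way: just say what you mean.
    int bufferSizeInBytes;
}

The Apps prefixes actually buy you something - assigning an us-thing to an s-thing looks wrong on sight, long before any tooling gets involved.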


Typemock and the Pit of Success

by Matt 18. January 2008 04:53

Roy Osherove is defending Typemock against accusations that it will "kill design for testability".

Of course it will. It's a technology that allows you to replace any type. Why on earth wouldn't I use it instead of having to factor out all those interfaces and create overridable methods?
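
To be clear about what that factoring looks like, here's a minimal sketch (the names are made up for illustration):

// Design for testability, the manual way.
public interface IMailer
{
    void Send(string to, string body);
}

public class SmtpMailer : IMailer
{
    public void Send(string to, string body) { /* real SMTP work */ }
}

public class OrderProcessor
{
    private readonly IMailer mailer;

    // The dependency comes in through the constructor, so a test can
    // hand in a hand-rolled fake instead of the real mailer.
    public OrderProcessor(IMailer mailer)
    {
        this.mailer = mailer;
    }

    public void Process(string customerEmail)
    {
        // ... do the real work ...
        mailer.Send(customerEmail, "Your order has shipped");
    }
}

Typemock lets you skip all of that ceremony and intercept a hard-wired new SmtpMailer() directly. Which is exactly the temptation.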

Poor design is the path of least resistance with Typemock. And I know you should trust your devs to do things properly, but you've also got a responsibility to set them up for success. Don't hand them a loaded gun.

Typemock is fantastically good at a small number of things - namely testing APIs that you can't factor away. HttpRequest and HttpContext spring to mind, although I've also been able to refactor around these without too much trouble. LINQ to SQL might be another good case, but could you test the logic with LINQ to Objects instead? As soon as you reach for Typemock on your own code, something's gone wrong. And the problem with Typemock is that nothing enforces this.

Put simply, Typemock isn't the Pit of Success.


LINQ and higher order functions

by Matt 7. January 2008 07:14

While not a new concept, higher order functions are currently very much in vogue, especially with the new-found interest in dynamic languages. But your statically typed (.net) languages can enjoy higher order functions too, thanks to delegates (and, for the old-school C/C++ programmer, plain old function pointers and the STL).

Dustin Campbell has a great post describing the concept and giving plenty of examples for .net 2.0. The interesting part is where he introduces us to the old stalwarts of list manipulation - filter, map and reduce.

It's a popular post that has been linked to lots of times, but I've seen very few links to the follow-up post. Here, Dustin sets off a light bulb in my head by pointing out that the LINQ commands are really just implementations of filter, map and reduce (the keywords where and select, and the Aggregate extension method, respectively). This was an aha moment for me that changed the emphasis of LINQ. It's easy to get distracted by the SQL-like syntax and the LINQ to SQL database integration, and not notice that LINQ to Objects is not so much about querying as it is about sequence manipulation, where sequences can be more than simple lists.
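
To make the correspondence concrete, here's a trivial sketch of my own (not Dustin's code):

using System;
using System.Linq;

class FilterMapReduce
{
    static void Main()
    {
        int[] numbers = { 1, 2, 3, 4, 5, 6 };

        // filter -> Where (the "where" keyword in query syntax)
        var evens = numbers.Where(n => n % 2 == 0);

        // map -> Select (the "select" keyword)
        var squares = evens.Select(n => n * n);

        // reduce -> Aggregate (no query keyword for this one)
        int sum = squares.Aggregate(0, (acc, n) => acc + n);

        Console.WriteLine(sum);   // 4 + 16 + 36 = 56
    }
}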

This really opens LINQ up for me. I can think of plenty of places to use it now. Reducing Dare's many loops to a single loop over the results is just a great example.


LINQ, Dare, IanG and me

by Matt 4. January 2008 11:31

I've been wanting to have a play around with LINQ for quite a while now. I had a good look when VS2008 RTM'd and I was very impressed. It's a very elegant design, built up slowly but surely from the building blocks of type inference, anonymous types, anonymous delegates and lambdas, extension methods and, of course, the master stroke of deferred execution via iterators. Clearly more than the sum of its parts. Mike Taulty has a great post that details how it's all composed. (And that's just LINQ to Objects. The pluggable providers of IQueryable are a whole new ball game.)
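
Mike's point, boiled down (a trivial illustration of my own): the query syntax is pure compiler sugar over extension methods and lambdas. These two queries compile to exactly the same thing.

using System;
using System.Linq;

class QuerySugar
{
    static void Main()
    {
        var words = new[] { "linq", "to", "objects" };

        // Query syntax...
        var shouty1 = from w in words
                      where w.Length > 2
                      select w.ToUpper();

        // ...is just sugar for extension methods taking lambdas.
        var shouty2 = words.Where(w => w.Length > 2)
                           .Select(w => w.ToUpper());

        // Nothing actually runs until the sequence is iterated -
        // that's the deferred execution via iterators.
        foreach (var word in shouty1)
            Console.WriteLine(word);
    }
}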

But I hadn't got my hands dirty until today.

Too bad Ian Griffiths stole my thunder. I was looking at exactly the same problem as he's just posted about - Dare Obasanjo lamenting that anonymous types won't give him the same effect as tuples. Ian has (of course) nailed anything I wanted to say on the matter, so go read his post; anonymous types are not equivalent to tuples, and if you want to use them to achieve similar results, you need to restructure your code. And use LINQ.

In fact, just looking at the code makes you want to run for LINQ. It's a prime candidate. It's got several loops over several different collections. Each loop filters or maps a previous sequence to produce a new sequence, and the final loop is then iterated for display. That's exactly what LINQ is designed for.

And for a first attempt, I'm very pleased with how mine turned out. At the time his post dropped into my feed reader, here's what I had:

// Get a sequence of appropriate items
var items = from fileInfo in new DirectoryInfo(cache_location).GetFiles("*.xml")
            let doc = XElement.Load(fileInfo.FullName)
            let feedTitle = (string)doc.Element("Title") ?? string.Empty
            from rssItem in
                (from itemNode in doc.Descendants("Items")
                 where !bool.Parse((string)itemNode.Element("IsDeleted") ?? "False")
                     && !bool.Parse((string)itemNode.Element("IsErrorMessage") ?? "False")
                 select MakeRssItem(itemNode))
            where rssItem.OutgoingLinks.Count > 0
            where filterFunc(rssItem)
            select new
            {
                Item = rssItem,
                FeedTitle = feedTitle
            };

// Map the appropriate items to a list of all outgoing links with a chain of votes
var linksWithVotes = from item in items
                     from outgoingLink in item.Item.OutgoingLinks
                     group new
                     {
                         Item = item.Item,
                         FeedTitle = item.FeedTitle,
                         Weight = voteFunc(item.Item)
                     } by outgoingLink.Key;

// Collapse the groups down to a list of links with scores
var weightedLinks = (from linkWithVote in linksWithVotes
                    select new
                    {
                        Url = linkWithVote.Key,
                        Weight = linkWithVote.Sum(x => x.Weight),
                        Votes = linkWithVote
                    }).OrderByDescending(x => x.Weight);

foreach (var item in weightedLinks.Take(10))
	...

Now, I've got more than Ian does. That's because I was working from Dare's previous post about building a meme tracker for RSS Bandit in C# 3.0, which has the full program. So my first LINQ statement loads and transforms the RSS items (modified to pull the items from my SharpReader cache, not RSS Bandit). There are a couple of other points where I can learn from Ian's solution, mainly the use of the "into" clause.

I create "linksWithVotes" (Ian and Dare's "all_links") by finishing off with a group, meaning I have to deal with a group in my next query. Ian pushes the group "into" a variable, and then pulls the votes out of that in the final select, giving a much nicer final shape (a flat sequence).

Similarly, I called the OrderByDescending extension method directly. Ian gets it into the natural query by using another "into" and following it up with a simple select.

And finally, I wasn't taking the minimum weight of the votes per feed title. (Ian's a little confused about this one. I'm not massively sure of the reasoning either, but I think it's so that if I always link to a particular URL, only my oldest, weakest vote counts. I think it's there to defeat gaming, but I could be wrong.) And that's also solved by grouping "into" a variable, then selecting the min value and summing that sequence.
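
Here's the shape of the trick, with some hypothetical votes (my sketch of the pattern, not Ian's actual code):

using System.Linq;

class IntoSketch
{
    static void Main()
    {
        var votes = new[]
        {
            new { Url = "http://example.com/a", Weight = 1.0 },
            new { Url = "http://example.com/b", Weight = 0.5 },
            new { Url = "http://example.com/a", Weight = 0.25 },
        };

        // "into" finishes one query and feeds the result straight into
        // the next, so the grouping, summing and ordering all live in
        // one flat statement - no intermediate variables, no extension
        // methods bolted on the end.
        var weightedLinks = from vote in votes
                            group vote by vote.Url into votesForUrl
                            select new
                            {
                                Url = votesForUrl.Key,
                                Weight = votesForUrl.Sum(v => v.Weight)
                            } into link
                            orderby link.Weight descending
                            select link;
    }
}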

I'm incredibly impressed that this whole set of loops can be reduced to a single foreach statement, but like Ian, I'm a bit unsure about the readability of this solution (mine or Ian's). As Ian says, this could be down to the nature of the algorithm, but I also wonder if it's partly the newness of the LINQ syntax, and having to mentally translate it into iterators and extension methods. Testing also looks tricky with this. But from a geeky-cool-new-toy perspective, it's brilliant. It's so much more declarative - I've only got one loop, and that's over the 10 result items I'm interested in. I'm not even looping over the files to read them in.

To quote Ian's post:

"In summary, although I’m still finding my feet, I’m rather coming to like LINQ."


An unexpected Windows?

by Matt 4. December 2007 11:07

Hey, good timing. I'd wanted to follow up on this and point out a few links and stuff, and hot on the heels of looking at how Windows can run UNIX programs comes the news that the UNIX-based OSX has some vestigial code to handle Windows programs. It doesn't seem to do anything too interesting at the moment, but it does recognise the Windows PE executable file format, and it even appears to try to load Windows DLLs. Rumours abound that these are the first steps towards OSX being able to run Windows applications natively.

I love this idea - it's kinda obvious and counter-intuitive at the same time. Conventional wisdom states that you simply can't run a Windows binary on UNIX - they are different OSes, after all. But it's also just x86 code, so why shouldn't you be able to run it?

Of course, it's going to be more complicated than just running x86 code - there's a whole bunch of system services that need to be available, including COM, the registry, the file system layout, the clipboard, etc.

But it's not a new idea. Wine has been running Windows apps on UNIX-like machines for ages. Wine isn't like Interix. It's a compatibility layer, and it works in a number of different ways. They have reimplemented a load of Win32 DLLs that retarget the calls to the underlying UNIX OS, so you don't need Windows installed (they've even implemented DirectX!). Or you can optionally use native Windows DLLs. Or simply recompile against their libs to get source code compatibility. But binary compatibility is the main draw.

A similarly herculean effort exists in the other direction - Cygwin. Their home page nicely states what it is, and what it isn't. It's different to Wine, in that it's a source code compatible UNIX environment. You can't take a UNIX binary and just run it (just as with Interix). And it's different to Interix in that it's a compatibility layer built on top of Win32, rather than its own subsystem, and so it inherits the constraints of the Win32 platform - constraints that Interix can avoid. A nice example of which is fork, which creates a copy of a process, including its address space. Win32 has no such concept, but (and Google is failing me for backup here) Interix has full support for fork, because the Windows NT kernel has full support for fork. Similarly, NTFS was designed to support POSIX requirements, such as group ownership, hard links and, of course, case sensitivity. These concepts aren't available to Win32, so they aren't available to Cygwin apps either.

Cygwin has its drawbacks, but it has good things too. By running in the Win32 subsystem, it has access to GDI, and so a port of the X server is possible. Interix can't display graphics, and so has to rely on a full Win32 port of the X server, such as Xming. But most important is the simple fact that Cygwin Actually Works. This is huge in and of itself. It's very popular, it has a massive range of ports, and autoconf support means porting should be very easy. In fact, it's so popular that it has better support for ported apps than Interix, and should really be considered the de facto standard for running UNIX apps on Windows.

So why do I prefer Interix? Well, that's easy. It's cooler in a far more geeky way. Cygwin's clearly done incredibly impressive stuff, but (with the greatest of respect) it's a big hack. Interix is a much cleaner way of doing things. Take how the Windows file system is exposed, for example. In Cygwin, you get to your standard drive letters (C:, D:) via /cygdrive/c or /cygdrive/d, and network paths such as \\server\share\dir are available as //server/share/dir. Interix exposes them in the slightly more UNIX-friendly /dev/fs/C, /dev/fs/D and /net/server/share/dir. A minor thing, perhaps, but I like it.

(But then I also like Cygwin's /proc/registry.)

Right, quick link-fest. The Interop Community site is a semi-official site that has a bunch of forums, FAQs and tech notes, and provides a package management system for a whole heap of extra utilities, including recent versions of the GNU utilities (and it's managed by Rodney Roddick, one-time Interix dev). Alternatively, you can download packages from NetBSD or Debian. Another previous dev has a great techy white paper for download. And there's even a blog.

And now I'm back off to my command line.

Ctrl-D.


Download the Windows kernel source code!

by Matt 30. November 2007 06:01

While doing a little bit of research into key loggers for work (don't worry - I'm looking at CardSpace), I came across a site that offered the Windows kernel as a source code download. Interesting, but probably as dodgy as they come.

Turns out, it's legit. Microsoft have released the Windows Research Kernel as Shared Source. Check out the page - it contains loads of the core parts of the OS kernel, including thread scheduling and memory management. Pretty cool, huh? Perhaps Scott Hanselman should look at this for his next Weekly Source Code.

There's just the small issue of the licensing requiring academic affiliation, so you're not likely to get your hands on it here.


An unexpected UNIX

by Matt 30. November 2007 04:04

One of the 300+ features of the latest version of OSX caught my eye (do you really think that drivers for a rival OS are a feature of your operating system?). Apparently, OSX is now a fully certified UNIX environment.

Phew. Glad about that. What does it mean, again? Wikipedia explains both the Single UNIX Specification and the POSIX standards, but to be honest, I'm not too sure how to translate this to the real world. Both specifications should allow for a high level of command-line and source code compatibility, but then there's still the need for crazy autoconf build scripts. At least these systems are now allowed to use the name UNIX (unlike, surprisingly, BSD and Linux, which aren't certified).

Anyway. Good for Apple. It's (genuinely) a decent sales feature.

So would it be churlish of me to point out that Windows is a certified UNIX environment? And that it has been since 1998? Probably, but whoa - Windows. UNIX. What? Gosh.

Now, this is an easy thing for Apple to do - OSX has always been UNIX-based (built on Mach, with a healthy chunk of BSD on top). But Windows quite clearly isn't UNIX. Or is it?

Well, this is the cool bit. What we're talking about is the POSIX subsystem for Windows NT, and it's one of those really nice little nuggets of technology most people haven't even heard of. Here comes the science.

Windows NT has a nifty feature in its design. As with most OSes, it has kernel mode and user mode (the kernel runs the OS, and user mode runs your programs). But on top of that, Windows NT splits user mode up into environment subsystems.

I can pretty much guarantee that you'll never have noticed this, because the default subsystem is Win32 - what you and I call Windows. Explorer and Notepad are Win32 applications, as are .net apps. The majority of the DLLs on your hard disk are Win32 - hence the "Win32 API". Win32 even has the killer app - the window manager and the graphics display are part of the Win32 subsystem. (Even Win16 and DOS - and presumably Win64 - are part of this subsystem.)
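
In fact, every executable declares which subsystem it expects to run under, right in its PE header. Here's a quick sketch of my own that digs the field out (offsets as per the PE/COFF spec):

using System;
using System.IO;

class ShowSubsystem
{
    static void Main(string[] args)
    {
        using (var reader = new BinaryReader(File.OpenRead(args[0])))
        {
            // Offset 0x3C of the DOS header holds the offset of the
            // "PE\0\0" signature.
            reader.BaseStream.Seek(0x3C, SeekOrigin.Begin);
            int peOffset = reader.ReadInt32();

            // Skip the 4-byte signature and the 20-byte COFF header;
            // Subsystem sits at offset 68 of the optional header (for
            // both PE32 and PE32+).
            reader.BaseStream.Seek(peOffset + 4 + 20 + 68, SeekOrigin.Begin);
            ushort subsystem = reader.ReadUInt16();

            // 2 = Windows GUI, 3 = Windows console, 7 = POSIX console.
            Console.WriteLine("Subsystem: {0}", subsystem);
        }
    }
}

Point it at notepad.exe and you should see 2; a POSIX subsystem binary reports 7.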

Windows NT has traditionally supported more subsystems - specifically, POSIX and OS/2. The latter was even going to be the default and primary subsystem, but it languished once Win32 was chosen. It was apparently only character-based, with no graphical support.

(The Gentoo Linux subsystem turned out to be an April Fool's joke, but the article is still worth a read.)

But the POSIX subsystem is still alive and kicking, and it provides an actual UNIX operating environment, translating POSIX system calls into Windows NT kernel calls. All the familiar UNIX sights are there: a case-sensitive file system (yes, NTFS supports case sensitivity - purely for POSIX), and a root-based view of the file system where everything lives under "/" - /bin, /usr. Your environment startup scripts live in /etc, and you've got access to a bunch of devices under /dev (including /dev/null). And those lovely, familiar, cryptic commands are all present too - ls, vi, grep, awk, sed. It's like a home away from home.

Of course, it's not all a bed of roses. You're now living in a separate subsystem to Win32, and you can't call Win32 APIs from the POSIX subsystem. And since the user interface lives in Win32, you're stuck in that horrible console window (it's not cmd.exe, but it is the same window that cmd.exe uses). There are ways round this, though. And there's some pretty good voodoo to make all of this work as seamlessly as possible. You can run Win32 commands from a POSIX app, and vice versa. POSIX apps show up in Task Manager, and can be killed (handy if you accidentally happen to cat /dev/random, for example. Ahem).

It's had a bit of an interesting history, too. It was originally implemented by Microsoft, apparently providing just enough functionality to satisfy government contracts. Softway Systems managed to get a source license, renamed it OpenNT and then Interix, and developed it far enough that Microsoft bought it back and repackaged it as Services for UNIX. And now it's bundled with Vista and known as the Subsystem for UNIX-based Applications.

Rodney Roddick, one of the key developers from the OpenNT days, maintains an Interix website where you can find a potted history, and a "blog" that includes a slightly more personal remembrance (scroll down to the "10 Years of Interix" entry - there are no permalinks).

I really like this approach to running and porting UNIX programs onto Windows. It has its advantages and disadvantages, but it's one of those lovely ideas that you like even if it's not perfect.


Kim Cameron's going to love this...

by Matt 20. November 2007 18:28

What do you reckon? Is losing the names, addresses, dates of birth, national insurance numbers and bank details of up to 25 million individuals, including children under 16, enough for Britain to wake up to the fact that a national identity card is a bad idea? Or that we have to be really, really careful about how we use and store biometric data?

A stunning example of how important the Laws of Identity are becoming. Perhaps more disturbingly, it's also a stark example of how the ramifications of basic computer security and privacy are just not understood, or perhaps even considered. And this is the Facebook generation...


Not that I'm questioning the value of Wikipedia. Oh no.

by Matt 12. November 2007 04:28

My two year old daughter watches a TV programme called Wonder Pets. She's the very example of their target audience.

And yet it has a Wikipedia page.

Now I'm torn between the need to capture, document and collate the zeitgeist for future generations to study, and the thought that some people have too much time on their hands.

