Clean, Tidy, Fast. Pick Three

Ten Simple Tricks to Speed up Your Excel VBA Code is a fine post that offers some highly practical insights into Excel/VBA performance that insiders have been using to their benefit. This is the kind of post that would quickly result in one’s expulsion from an Excel Magic Circle. If there was one. Which there isn’t. Unless you know otherwise.

The first five tips all deal with asking Excel to suspend various time-consuming activities during macro execution. Of the five, #1 is both the most important and the most dangerous:

Application.Calculation = xlCalculationManual
'Place your macro code here
Application.Calculation = xlCalculationAutomatic

It has the potential to save a lot of time but, crucially, it may also affect the way your workbook works. A naive (or just plain bad) workbook construction may have macro code interacting with worksheet calculations, so the state of the sheets may change during the execution of the macro depending on when and where calculations happen. This situation often develops when macros grow out of code created with the Macro Recorder. Be warned.

Of course, we should also be wary of assuming that the calculation mode started out as automatic: the user may have switched to manual for very good reasons. Anyway, it’s a bit rude.
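The polite version stores whatever was there and puts it back afterwards – a minimal sketch:

Dim previousCalcMode As XlCalculation
previousCalcMode = Application.Calculation
Application.Calculation = xlCalculationManual
'Place your macro code here
Application.Calculation = previousCalcMode ' restore whatever the user chose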

Tips #2 to #5 add further performance boosters, managing the enabling and disabling of screen and status bar updating, page break display and event processing behaviour (which may also need consideration similar to calculation mode as described above). By now, we’ve got quite a lot of boiler-platey code to wrap around the code that’s actually supposed to do something useful. I don’t like that.
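To make the complaint concrete, the full wrap ends up looking something like this (a sketch, ignoring for now the save-and-restore question discussed above):

Application.ScreenUpdating = False
Application.DisplayStatusBar = False
Application.EnableEvents = False
ActiveSheet.DisplayPageBreaks = False
Application.Calculation = xlCalculationManual

'Place your macro code here

Application.Calculation = xlCalculationAutomatic
ActiveSheet.DisplayPageBreaks = True
Application.EnableEvents = True
Application.DisplayStatusBar = True
Application.ScreenUpdating = True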

How much cleaner would it be if we could write something like this:

With New ExcelInteractionSuspender ' please feel free to think of a better name...
    'Place your macro code here
End With

Quite a lot cleaner, I’d say.

Such a class might look something like this:

Option Explicit

Private storedCalcMode As XlCalculation
Private storedScreenUpdateMode As Boolean

Private Sub Class_Initialize()
    ' store current settings
    storedCalcMode = Application.Calculation
    storedScreenUpdateMode = Application.ScreenUpdating
    ' now switch 'em off
    Application.Calculation = xlCalculationManual
    Application.ScreenUpdating = False
End Sub

Private Sub Class_Terminate()
    Application.Calculation = storedCalcMode
    Application.ScreenUpdating = storedScreenUpdateMode
End Sub

For the sake of brevity I’ve only included the first two options (which are the ones I usually find provide the most benefit in any case). The others are left as an exercise for the student…

Optically Illusory

I came across a charming little optical brain-bender a while back, and wondered what would be needed to reproduce it in Excel/VBA. Not much, as it transpired (see below). I’ve golfed the code somewhat for brevity. Dump the following in an empty VBA module and run “DrawIllusion”…

Private pi As Single
Public Sub DrawIllusion()
  Dim shp, c
  pi = WorksheetFunction.pi()
  ActiveWindow.DisplayGridlines = False
  For Each shp In ActiveSheet.Shapes: shp.Delete: Next
  NewShape 0, 0, 420, 0, vbWhite
  For c = 1 To 4: DrawACircle 200, 200, 60 + (c - 1) * 40, 15, 5, -20 + (c Mod 2) * 40: Next
End Sub

Private Sub DrawACircle(x, y, r, side, gap, rotationOffset)
  Dim numSquares, i, angle, shp
  numSquares = NumToDraw(r, side, gap)
  For i = 1 To numSquares
    angle = (i - 1) * 2 * pi / numSquares
    Set shp = NewShape(x + r * Cos(angle), y + r * Sin(angle), side, angle * 180# / pi, IIf(i Mod 2 = 1, vbBlack, vbWhite))
    shp.Rotation = shp.Rotation + rotationOffset
  Next
End Sub

Private Function NumToDraw(r, side, gap)
  NumToDraw = Int(2 * pi * r / (side + gap))
  NumToDraw = NumToDraw - NumToDraw Mod 2
End Function

Private Function NewShape(x, y, side, rot, color)
  Set NewShape = ActiveSheet.Shapes.AddShape(msoShapeRectangle, x, y, side, side)
  NewShape.Rotation = rot
  NewShape.Line.ForeColor.RGB = color
  NewShape.Line.Weight = 1
  NewShape.Fill.ForeColor.RGB = RGB(128, 128, 128)
End Function

Getting Bigger All The Time

in which we try to get tricky with Excel and find that Excel has adequate built-in trickiness already, thank you very much indeed

Over the years I’ve had a number of roles that involved fixing, tuning, cleaning and otherwise future- and bullet-proofing a fair number of Excel workbooks. There’s a whole catalog of potential issues to entertain and amuse but very common is the Extending Input Data Set. (I capitalised that in the hope that I might have christened some kind of Excel anti-pattern).

Your average Excel user is smart enough to know that some input data may grow over time: they may be copy-pasting from some other source, grabbing a database query result or even just typing in more values as they arrive. It’s very common to see a row-by-row enrichment of such data for analysis, and the formulae need to be extended to match the rows of input data. In the worst case, users have pre-copied more rows than they expect ever to need and don’t notice when the data has started to outgrow them. More happily, they copy down as they see the data extending. If they see the data extending, that is1.

Helping a colleague to avoid such an unpleasantness recently led to a couple of interesting (to me) solutions. Firstly, we looked at extending the calculations applied to a list of data that we knew would extend every time we ran the spreadsheet model.

We assume we have a defined name, input_data, that does what it says. Further, we have output_formulae, which should correspond one-for-one with the input_data rows. When input_data changes, we want to extend output_formulae automatically to match. Pretty straightforward, as it turns out:

Private Sub Worksheet_Change(ByVal Target As Range)
  If Intersect(Target, Range("input_data")) Is Nothing Then
    Exit Sub
  End If
  ExtendOutputFormulaeAsRequired
End Sub

Sub ExtendOutputFormulaeAsRequired()

  Dim inputRowCount As Long, calcRowCount As Long
  inputRowCount = Range("input_data").Rows.Count
  calcRowCount = Range("output_formulae").Rows.Count

  If inputRowCount <= calcRowCount Then
    Exit Sub ' enough formula rows already - could reduce if needed, but not done here
  End If

  With Range("output_formulae")
    ' Assumes just formulae are needed:
    .Offset(calcRowCount, 0).Resize(inputRowCount - calcRowCount, .Columns.Count).Formula = _
        .Rows(calcRowCount).Formula
  End With
End Sub

That works pretty well. At least, it does if input_data is defined, something we can usually do fairly easily using a named formula, something like =OFFSET(B2,,,COUNT(B:B),6) (assuming we have 6 columns and there’s a non-numeric column heading in B1, which in this case there is).

Here’s the wrinkle that gives rise to the second interesting (to me) problem, the one that necessitated a question on Stack Overflow. What if I can’t be sure that all my columns will be the same length? I want the length of the longest column (there won’t be gaps vertically, which is something, at least). In the sample below there are up to 7 rows of data in columns B to G, with a heading on row 1; 7 is the longest column and so defines the number of rows in input_data.

Input data – extend me!

    B   C   D   E   F   G
    1   1   1   1   1   1
    1   1   2   1   1   1
    1   1   1   1   1   1
    1   1   1   1   1   1
            1       1   1
            1
            1

I don’t want to use .CurrentRegion because I can’t be certain that adjacent cells won’t be populated at some time in the future. We can’t do something like MAX(COUNT(B:G)) because COUNT() is built to handle two-dimensional ranges. I tried to be tricky with {=MAX(COUNT(OFFSET(B1:G1,,,1000000,1)))}, hoping that Excel would evaluate the COUNT() for a set of OFFSETs, one from each of B1 to G1. Excel wasn’t fooled. Stack Overflow time, then, and enter SUBTOTAL(), a function I confess I’d never used in over 20 years of Excellence.

I define a new name, input_data_columns as =Sheet1!$B:$G, which I can then use in the definition of input_data_rowcount:

=MAX(SUBTOTAL(2,OFFSET(input_data_columns,,COLUMN(input_data_columns)-MIN(COLUMN(input_data_columns)),,1)))

In a worksheet this needs to be Control-Shift-Entered as an array formula, but there’s no obvious way to do that when defining a Named Formula. Fortunately it works OK anyway; the likely explanation (although I haven’t verified it) is that defined names are evaluated in array context by default. That leads in turn to input_data becoming

=OFFSET(Sheet1!$B$2,0,0,input_data_rowcount,6)

That works! Is it future-proof? Probably not2 but it might very well be an improvement on what came before.

Briefly checking what happens, if we remove the MAX() and put the remainder into a worksheet as an array formula, we get {4, 4, 7, 4, 5, 5}, which is indeed the result of COUNT() on each column.
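If the named-formula gymnastics ever prove too fragile, the same longest-column count can be had from a scrap of VBA instead, at the cost of making the workbook macro-dependent. A sketch (LongestColumnCount is my name for it, nothing built in):

Public Function LongestColumnCount(rng As Range) As Long
    ' returns the largest COUNT() found in any single column of rng
    Dim col As Range
    Dim n As Long
    For Each col In rng.Columns
        n = Application.WorksheetFunction.Count(col)
        If n > LongestColumnCount Then LongestColumnCount = n
    Next
End Function

input_data_rowcount could then simply refer to =LongestColumnCount(input_data_columns).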


1 From a selfish perspective, this is a Good Thing – it requires “professional” expertise to deal with and has put bread on my table many a time.

2 See 1

Another notch in the utility belt

Having been largely an all-thumbs C# programmer for the last year or so, I’ve had little opportunity to add to the world of Excel know-how (and I don’t really feel qualified to say much about the .Net world yet, even if I did once get a Tweet from Jon Skeet). I do keep an eye out for Excel questions on Stack Overflow, though, and one caught my eye this morning.

The questioner wanted to add a zero to a chart range. I’m not sure why he didn’t just add an extra constant value at the start or end of the range; perhaps he couldn’t (maybe the data was in another, read-only workbook?). Anyway, I came up with something that extended an existing trick a little and I thought it was mildly interesting.

Using a named formula in a chart series is1 a fairly well-known idea these days. It lets us define a dynamic series (usually using OFFSET()) and have our charts update automagically as our series grow and shrink. What I hadn’t previously thought of2 was that I could use a VBA function call as the Name’s “Refers to”, with the result being useable in a chart Series definition.

Turns out I could.

Public Function PrependZero(rng As Range) As Variant
Dim val As Variant, res As Variant
Dim idx As Long
    val = Application.Transpose(rng.Columns(1).Value) ' nifty!
    ReDim res(LBound(val) To UBound(val) + 1)
    res(LBound(res)) = 0 ' this is where the function lives up to its name
    For idx = LBound(val) To UBound(val)
        res(idx + 1) = val(idx) ' copy the remainder
    Next
    PrependZero = res
End Function

(That Transpose thing is handy – I hadn’t known about it as a way of getting a 1-D range value as a vector. Two transposes needed for data along a row.)
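For completeness, the wiring-up goes something like this (the workbook and range names here are illustrative, not gospel): create a Name, say ZeroedSeries, whose “Refers to” is

=PrependZero(Sheet1!$B$2:$B$10)

and then point the chart’s series values at the name:

=SERIES(,,Book1.xlsm!ZeroedSeries,1)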

You could use this approach for lots of things. Like, well, lots of things. Honestly, you don’t need me to spell them out.


1 This makes me a little sad – it was a source of Excel-fu that impressed people to a quite disproportionate degree. Array formulae and VBA classes still have this power.
2 I’m aware that I might be the last kid in the class here. Be gentle.

Split Me, Shape Me

in which we publish something we wrote a while back but forgot to post for reasons now lost in the mists of time

While I’d love to be able to take credit for having thought of this myself, the truth is I can’t. I did clean up the code a fair bit, though, so any tidiness that particularly appeals is entirely down to me.

Reading this post a while back, I leapt to post my CellSplit() function in the comments, then realised that I typically use it in conjunction with a rather more substantial macro, one that wasn’t going to sit well in someone else’s blog comments.

Here’s my CellSplit(), which is just a wrapper for the VBA Split() function:

Public Function CellSplit(celValue As String, delim As String) As Variant
    CellSplit = Split(celValue, delim)
End Function
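For example, with red,green,blue in A1, selecting B1:D1, typing =CellSplit(A1,",") and hitting Control-Shift-Enter puts red, green and blue into the three cells:

{=CellSplit(A1,",")}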

On its own, it’s moderately useful, but unless you know how many parts your input string will split into, you’re going to be fooling about with range resizing. In these dark days there’s no time for fooling about – we need a tool to get that resizing done for us in One Click.

Enter a toolbar (sorry, RibbonX) button, assigned to the Range Resize Wizard…

Public Sub RangeResizeWizard()

Dim result As Variant
Dim fmla As String
Dim rng As Range
Dim targetRows As Long
Dim targetCols As Long

On Error GoTo Catch

Set rng = Selection
If IsEmpty(rng) Then Exit Sub

If rng.HasArray Then
  Set rng = rng.CurrentArray
  fmla = rng.FormulaArray
ElseIf rng.Rows.Count = 1 And rng.Columns.Count = 1 Then
  fmla = rng.Formula
Else
  Exit Sub
End If

result = Evaluate(fmla)

With rng
  .ClearContents
  If IsArray(result) Then
    If NumberOfDimensions(result) = 2 Then
      targetRows = UBound(result, 1) - LBound(result, 1) + 1
      targetCols = UBound(result, 2) - LBound(result, 2) + 1
    Else
      targetRows = 1
      targetCols = UBound(result, 1) - LBound(result, 1) + 1
    End If
    On Error GoTo RestoreFormula:
    .Resize(targetRows, targetCols).FormulaArray = fmla
    On Error GoTo Catch
  Else
    .Formula = fmla
  End If
End With

Finally:
  On Error GoTo 0
  Exit Sub

Catch:
  Debug.Print Err.Description
  Resume Finally

RestoreFormula:
  Debug.Print Err.Description
  rng.FormulaArray = fmla
  Resume Finally
End Sub

Function NumberOfDimensions(arr As Variant)

Dim dimensions As Long
Dim junk As Long

On Error GoTo FinalDimension

For dimensions = 1 To 60000
  junk = LBound(arr, dimensions)
Next
Exit Function

FinalDimension:
  NumberOfDimensions = dimensions - 1
End Function

Note that the array formula in question will be re-evaluated twice: once to determine the dimensions of the output array and again when the formula is pushed back into the new, perfectly-sized range. Further, the routine assumes you haven’t left anything important lying around where it can get overwritten. So it’s to be used with care, but if you spend much time with array formulae, you’ll probably find it as useful as I do (if you haven’t already written it yourself).

Note further that it’s not really a “wizard” in the programming sense, in that it doesn’t do any of that walking-you-through-steps thing. It’s just a bit magic.

Pin Me Up, Pin Me Down

(in which we return to VBA for a Small But Useful macro-writing session)

My friend Jens comes up with interesting questions. Even better, he doesn’t come up with them too often.

Today’s poser refers to Excel 2007, a technological marvel that has only reached our work PCs relatively recently. Jens had discovered the Usefulness that is the “pin” tool on the “Recent Documents” list:

So the pinned documents don’t go away, but they do move around on the list, rising up when used, drifting down as lesser workbooks are opened. What Jens wanted was to have his pinned files stay at the top. Or the bottom. Either, really, providing they hung around together.

We couldn’t find anything in Options, Advanced or otherwise, some Googling didn’t turn up much of use1, and at the time of writing (admittedly not too long ago) I hadn’t had any useful feedback on Stack Overflow. Time to apply a different set of talents.

Excel has Application.RecentFiles, which does pretty much what it suggests. Even better, the “.Add” method, when called with a file that’s already on the list, promotes that file to the top. If we could call that method for the pinned files, we’d be laughing. If we could identify them. It turns out we can.

Excel has to store the list somewhere so it can find it when restarting. Where do “modern” Windows programs store that kind of information? The registry2. Specifically, “HKEY_CURRENT_USER\Software\Microsoft\Office\12.0\Excel\File MRU”, where MRU stands, most likely, for Most Recently Used. My registry has 50 entries here, probably reflecting that Advanced Excel Options allows me to display up to 50 documents, screen real estate permitting.

So looking at the entries, we see that there’s one string for each of our MRU entries and that each string has some structure to it:
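From memory, each value is a single string along these lines (the path is invented and the middle [T…] chunk, which looks like a timestamp, may vary, so treat the exact layout as illustrative):

[F00000001][T01CB1FD9D0D1F160]*C:\Users\jens\Documents\Important.xlsx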

That first part, “[F00000001]”, is particularly interesting: the last digit is a “1” for all the files we’ve marked as pinned. So we can identify pinned files. All we need now is to be able to read the registry from VBA. We can’t use “GetSetting” because it’s limited in scope (it can only see the “VB and VBA Program Settings” branch of HKEY_CURRENT_USER). More Googling and we discover that the Windows Script Host knows about registries, so we add a reference to “Windows Script Host Object Model” in order to be able to early-bind to WshShell.

So here’s the first attempt. Probably heaps of unhandled situations, but right now it works on my machine…

Public Sub PromotePinnedFiles()

    Const KEY_ROOT = "HKEY_CURRENT_USER\Software\Microsoft\Office\12.0\Excel\File MRU\Item "

    Dim itemIndex As Long
    Dim itemData As String
    Dim itemPath As String

    On Error Resume Next ' general-purpose error-ignoring

    With New WshShell ' or use CreateObject("WScript.Shell") if the reference isn't set
        For itemIndex = 1 To 50 ' I think 50 is the most Excel will store
            itemData = vbNullString ' reset, so a failed read can't re-process the previous item
            itemData = .RegRead(KEY_ROOT & itemIndex) ' get the item (ignoring errors if fewer than 50 are present)
            If Len(itemData) > 0 Then
                If InStr(itemData, "*") > 0 Then
                    If Mid(itemData, 10, 1) = "1" Then ' the last digit of the [F0000000n] block flags a pin
                        itemPath = Mid(itemData, InStr(itemData, "*") + 1)
                        Application.RecentFiles.Add itemPath ' "promotes" the item to #1 in the list
                    End If
                End If
            End If
        Next
    End With

End Sub

The effect of which is this:

It just remains to decide how best to package it for my learned colleague.


1 Inevitably, there will be any number of comments telling me precisely where I should have looked.
2 Readers who answered “INI files” are invited to remove themselves to the back of the classroom, where the machines are still running Excel 3.0.

Tiny VBA Tooltippery Tip

(Project Euler 100/304)

(in which we discover What Went On In 1997)

This morning’s iteration of the daily blog/news trawl for useful information threw up “Five tips for debugging a routine in the Visual Basic Editor“, all of which are sensible, although unlikely to be news to anyone reading this, if we’re honest.

Tip #3, “View variables using data tips”, however, reminded me of something that I don’t believe is widely known. Since the site seems to require a full-blown account creation that I can’t see as appropriate for a simple comment, I’m going to mention it here.

Hovering the mouse pointer over a variable while in VBA’s Break mode will show the variable’s value in a tool tip:

The smart VBA programmer

That’s fine: almost all the time we get to see exactly what we want. Above about (or maybe exactly) 60 characters, however, we get the leading part and three little dots:

Still no problem if we only want the start of the string...

What if we want to see what’s at the end of the string, though? Well, back in (I think) 1997, I managed to get my then employer to send me to VBA DevCon (no easy task, given that the location was EuroDisney). There I happened to meet the Microsoft guy who actually wrote the hover/tooltip thing (it was in the VB4 editor first, I believe), and he told me that the last 60-ish characters of the string can be viewed by holding down the Control key before moving the pointer over the variable name:

Presto!

I don’t think I’ve ever seen this recorded. Of course, I haven’t exactly gone looking for it, so if you came all the way to the end only to discover that I was just repeating something that everyone knows, then I can only apologise. We’ll get over it.

Sorted for Excel and Whee!

If you happened upon the VBA Lamp and by rubbing it were able to produce the Excel VBA Genie and were granted a VBA Wish, would you ask for a built-in Sort() function?

If you build the kind of Excel apps that I do, you’ll run up against the need for a Sort all too frequently. Sorting arrays with sizes in the tens of thousands is not unusual, and I was reminded of this when reading this entry at Andrew’s Excel Tips the other day.

While it’s not crucial to the (useful) idea presented, the code has a quick-and-dirty sort that is about the worst sort algorithm1 one could intelligently come up with. It’s also a pretty intuitive solution, which just goes to show that sorting may not be as simple as we might think. I’m not attacking Andrew’s skillz here, btw: the code as presented is certainly fit for purpose; it’s not presented as a general-purpose utility (at least I hope it isn’t).

I’ve accumulated a few algorithms over the years and I’ve coded up a couple more while “researching” this piece. On the one hand, we have the oft-derided BubbleSort and its close relation, CocktailSort. In the same group I’d include InsertionSort and SelectionSort. I’m going to be harsh and categorise those, with the naive sort above, as “Slow”. Well, “mostly slow”, as we’ll see.

In the “Fast” group, we have the much-touted QuickSort; somewhere in between, we have HeapSort and my current algorithm of choice, CombSort. As I was researching this, I also coded up ShellSort, which is about as old as I am and which has been claimed to be faster than QuickSort under some conditions.

I ran some comparisons, not meant in any way to be perfectly scientific2. I ran each algorithm on arrays of 50 to 50,000 values with six different characteristics (there’s a sketch of how one of the test sets was built after the list):

  • already sorted
  • exactly reversed
  • mostly sorted (values typically within one or two places of target)
  • ordered blocks of 100 random values (the first 100 values are 0 + RAND(), then 100 + RAND() and so on)
  • completely random
  • random 10-character strings
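Here’s the sort of thing that generated the “blocks” data, by way of illustration (a sketch, not the actual test harness):

Public Function MakeBlockData(n As Long) As Variant
    ' values ascend in blocks of 100: 0 + RAND(), then 100 + RAND(), and so on
    Dim a() As Double
    Dim i As Long
    ReDim a(1 To n)
    For i = 1 To n
        a(i) = 100# * ((i - 1) \ 100) + Rnd()
    Next
    MakeBlockData = a
End Function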

First off, the 50-record results:


50 recs (ms)   sorted   near sorted   blocks   random   strings   reversed
Shell            0.06          0.06     0.11     0.10      0.24       0.09
Quick            0.13          0.13     0.16     0.13      0.29       0.14
Comb             0.09          0.10     0.17     0.17      0.33       0.13
Heap             0.34          0.33     0.32     0.32      0.52       0.28
Insertion        0.02          0.02     0.20     0.17      0.47       0.37
Selection        0.25          0.25     0.25     0.25      0.54       0.25
Cocktail         0.01          0.02     0.44     0.39      1.02       0.77
Bubble           0.01          0.02     0.50     0.45      1.12       0.78
Naive            0.22          0.23     0.50     0.46      1.06       0.77

I’d say it’s pretty clear that it doesn’t matter much what you use to sort a small array, just about anything will be fast enough (unless you’re going to perform that sort tens of thousands of times in a run). It’s also apparent that the “slow” algorithms are actually pretty good if our data is already reasonably well-ordered.

So far, so “so what?”

Let’s look at the opposite end of the spectrum: 50,000 values. Here, the Fast/Slow divide is apparent. First, the “Slows” (two tests only, for reasons that should become apparent):


50K (ms)    near sorted    random
Bubble               31    522,216
Cocktail             30    449,696
Insertion            19    179,127
Naive           219,338    510,010
Selection       220,735    220,743

Yes, those are hundreds of thousands of milliseconds: anything from three to nearly nine minutes, to you and me. The “Fasts”, meanwhile:


50K (ms)   sorted   near sorted   blocks   random   strings   reversed
Shell         162           164      219      377       929        250
Quick         296           298      327      365       790        306
Comb          390           396      477      622     1,348        452
Heap          899           903      885      874     1,548        844

(I only ran two tests on the “Slows”, for fear of dozing off completely.)

Again, for data where values are near their final sorted positions there’s clear evidence that something like an Insertion Sort is much faster than any of the “sexier” options. Provided you know your data will actually meet that criterion, of course.

All that considered, I’m switching from CombSort to ShellSort as my default algorithm. While it loses out a little to QuickSort in the “random” test (probably most representative of my normal use case) it doesn’t carry with it the fear of stack overflow through extreme recursion, something that’s bitten me with QS in the past. Anyway, us old’uns have got to stick together.

As already mentioned, if you have small quantities of data or infrequent sort calls in your application, it really doesn’t make much difference which algorithm you use, although I’d still suggest being aware of the properties of several options and having a basic implementation to hand. Once you reach a point where sorting contributes materially to your application run time then you owe it to yourself and your users to make an intelligent selection.

Here’s my ShellSort implementation in VB, transcoded fairly literally from the Wikipedia pseudo-code (optimisations welcome):


Public Sub ShellSort(inp)
' sorts supplied array in place in ascending order
Dim inc As Long, i As Long, j As Long
Dim temp ' don't know what's in the input array...
  If Not IsArray(inp) Then Exit Sub ' ...but it had better be an array
  inc = (UBound(inp) - LBound(inp) + 1) \ 2 ' use Shell's originally-proposed gap sequence
  Do While inc > 0
    For i = LBound(inp) + inc To UBound(inp) ' offset from LBound: Transpose gives 1-based arrays, Array() 0-based
      temp = inp(i)
      j = i
      Do
        If j < LBound(inp) + inc Then Exit Do ' check these conditions separately, as VBA Ors don't short-circuit...
        If inp(j - inc) <= temp Then Exit Do ' ...and this one would blow up when j - inc falls below LBound
        inp(j) = inp(j - inc)
        j = j - inc
      Loop
      inp(j) = temp
    Next
    inc = Round(CDbl(inc) / 2.2) ' shrinking by 2.2 rather than 2 is an empirical improvement
  Loop

End Sub
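Usage is as dull as it should be; note that the sort is in place and the array can be zero- or one-based:

Dim a As Variant
a = Array(5, 3, 8, 1) ' Array() gives a zero-based Variant array (absent Option Base 1)
ShellSort a
Debug.Print a(0); a(3) ' 1  8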


1 Not the absolute worst: there’s the catastrophic genius of BogoSort, to name but one, but let’s not go anywhere nearer to there.
2 Just so we understand each other, these are my figures for my code on one of my machines. YMMV. A lot.

Going Metric

(stackoverflow rep: 10,307, Project Euler 97/288 complete)

(in which a single comment substitutes for a butterfly’s wing, metaphorically speaking)

A week or so back I mentioned VBA code metrics, as part of the unfocused rant1 that stemmed from exposure to a particularly ghastly Excel travesty.

Since it struck a chord with at least one person2 and I’d been meaning for months to write something more about it, I’m going to ramble about my feeble (but still surprisingly useful) approach to VBA code metrics.

The original idea came from three contemporaneous events several years ago: a large and nasty workbook in need of heavy modification, some reading about McCabe’s cyclomatic complexity metric and coming across a little VB utility named Oh No!

Oh No!

This utility scanned a VB project and displayed a chart that attempted to show you where possible problem areas might lie by using the size and complexity of routines (Function, Sub, Property etc) on its X and Y axes. Size was the number of non-blank, non-comment lines, while complexity was (from the Help file):

The complexity is measured as the total number of structure statements in a procedure. These statements are: Do, For, If, ElseIf, Select, While and With.

It looked like this (pointed in this instance at its own source code):

The third dimension, “blob size”, was related to the number of parameters in the routine. All in all, it’s a nicely-done, well thought-out little app.

For my uses it had its drawbacks, of course. To start with, it wanted a .vbp (VB project) file for input, something I didn’t have in my Excel/VBA project. For another, I didn’t seem to have the sort of brain that could easily extract useful information out of the graphical display. The code didn’t like continued lines, something I use a lot, and I felt that size and parameter count were contributors to complexity: we didn’t really need three dimensions.

Most important of all, writing my own app looked like a fun part-time project and I’m a complete sucker for those.

I should confess now that a cyclomatic complexity calculation proved to be beyond me when I developed this utility in 2002. It’s probably still beyond me now, but back then the reference I had was the original paper and gaining the necessary understanding from that would in all likelihood have taken all the fun out of the whole thing. So I fudged it. Sue me.

Approaching the problem

The VBComponents of the target workbook are scanned and a collection of modules is constructed, each of which contains a separate object for each procedure3 it contains. As procedures are constructed from lines, they attempt to classify what they’re being presented with and tally up the incidences of things worth counting:

Public Enum eCounterType
    ecComments = 1
    ecDeclares = 2
    ecElseIfs = 3
    ecGotos = 4
    ecHighestNestLevel = 5
    ecIifs = 6
    ecIfs = 7
    ecLines = 8
    ecLongLines = 9
    ecLoops = 10
    ecMultiDimCount = 11
    ecOneCharVars = 12
    ecParameters = 13
    ecSelects = 14
    ecWiths = 15
End Enum
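The line classification itself is nothing fancy; the flavour (this is an illustrative sketch, not the shipping code) is roughly:

Private counters(ecComments To ecWiths) As Long

Private Sub TallyLine(ByVal src As String)
    src = Trim$(src)
    If src Like "'*" Then counters(ecComments) = counters(ecComments) + 1
    If src Like "If *" Then counters(ecIfs) = counters(ecIfs) + 1
    If src Like "Dim *" Then counters(ecDeclares) = counters(ecDeclares) + 1
    If Len(src) >= 60 Then counters(ecLongLines) = counters(ecLongLines) + 1
End Sub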

You may have spotted that the deepest nesting is also recorded, although it’s not exactly a “counter” – perhaps I should have used “scoreCard” or something more expressive.

Each procedure gets a complexity “score”, which ended up being calculated, after much tweaking, using a tariff something like this:

Pts      Awarded for
1.0      every “nesting” command (If, Do, For, Select, ElseIf, etc)
0.1      each line of executable code after the first, rounded down
0.1      each declared variable after the first, rounded down
1.0      each line that contains multiple Dims
0.5      each parameter after the first
2.0      each If (more than loops because of the arbitrarily complex branching)
0.2      each long line (defined as 60 chars or more, after concatenating continued lines)
1.0      each Go To
special  the square of the deepest nesting level encountered
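Mechanically, applying the tariff amounts to something like the following sketch (how Ifs interact with the general “nesting” tariff is my guess; the real code no doubt differs in detail):

Private Function ComplexityScore(c() As Long) As Double
    ' c() is a tally array indexed by eCounterType, as above
    Dim s As Double
    s = c(ecLoops) + c(ecSelects) + c(ecWiths) + c(ecElseIfs) ' "nesting" commands; Ifs scored below
    If c(ecLines) > 1 Then s = s + Int((c(ecLines) - 1) * 0.1)
    If c(ecDeclares) > 1 Then s = s + Int((c(ecDeclares) - 1) * 0.1)
    s = s + c(ecMultiDimCount) ' lines containing multiple Dims
    If c(ecParameters) > 1 Then s = s + 0.5 * (c(ecParameters) - 1)
    s = s + 2 * c(ecIfs) ' branching costs more than looping
    s = s + 0.2 * c(ecLongLines)
    s = s + c(ecGotos)
    s = s + c(ecHighestNestLevel) ^ 2 ' the square of the deepest nesting
    ComplexityScore = s
End Function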

It’s not supposed to be exact and it doesn’t need to be: the purpose is to give a quick sanity check and, particularly when encountering new code for the first time, an idea of where problem areas may lie.

There are deficiencies: I don’t think the code handles one-line If/End Ifs correctly – I believe it incorrectly increments the nesting level, which leads to a higher score. Since I really don’t like one-line Ifs, that’s just going to increase the chance that I see them, so I was happy to leave it like that.

If you look at the output, there are other metrics: information on the whole project and module-level summaries (if you see 300 global variables you might consider that worth investigating, for example).

There’s no penalty for “Static” declarations, probably because I hadn’t stumbled across any code that thought it was a good idea. I’d put a fairly hefty tariff on that: it’s a major obstacle to the unravelling process.

Another “drawback”: there’s no real parser – a better approach might have been to write one and utilise an AST (the very useful roodi library for Ruby uses one and can report a “proper” cyclomatic complexity metric as a result). Also, I didn’t use regular expressions – I probably hadn’t encountered them (or was still frightened) at that time: it would have made some of the line-parsing a good deal more terse.

I called it “LOCutus” – “LOC” for “lines of code”, the balance presumably indicating that the Borg were on my mind at the time. For better or worse, I’ve added it to the xlVBADevTools download (in a ZIP file with xlUnit). It’s an Excel XP file – I don’t think I’ve ever tested on XL2007 or higher, since I’ve yet to work at a site where it was used and prefer not to look at large crappy Excel apps without being paid for it…

If you download it, you’ll see that the report is showing the results of a scan on the app’s own code (shown as a HTML table below, so formatting may vary slightly):

VBProc (Module)      Globals: 0   Members: 7   Procs: 21

Procedure             Lines   Max nest   Comments   Declares   Ifs   Loops   Long lines   Go Tos   OCV    X
Load                      8          2          0          3     1       1            0        0     0    9
ProcessLine              20          1          0          1     5       0            0        0     0   12
CheckForOneCharVars       7          2          0          1     1       1            0        0     0    8
CountParameters          14          3          0          4     3       1            0        0     0   17

The “worst” proc scores 17, largely due to its 3-level nesting. If I needed to change it, I’d consider breaking out the inner loop (or taking the outermost If and refactoring it into a guard clause). Seventeen isn’t generally high enough to make me concerned, though.

That’s the most “formal” part of the diagnostic “toolkit” that I apply to strange new Excel spreadsheets – anyone out there have something else we could look at?

xlVBADevTools download


1 Aren’t they all? You should hear me around the office. Probably not, actually.
2 OK, exactly one.
3 These days I favour “routine” as a generic term for Function/Sub/Property or whatever. Times change.

It’s An Ill Wind

I was pontificating recently about the horror of tightly-coupled worksheet calculations and macro code in Excel and the mess that results from carelessly mashed-together programming paradigms.

By a horrible coincidence, a few smug weeks later I suddenly found myself deep in the horror again. This time, I had to contend with two versions, needing to set up and run several tens of thousands of real-world cases through “before” and “after” scenarios.

I soon started to turn up nuggets of wonderment. Here are a couple of gems from the worksheets:

=IF(CPPI=TRUE,TRUE,FALSE)
=MAX(1,DATE(YEAR(C4),MONTH(C4),DAY(C4))-DATE(YEAR(C6),MONTH(C6),DAY(C6)))
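For the record, the first is a long-winded way of writing the cell’s own value (assuming CPPI really is a straightforward TRUE/FALSE), and the second, assuming the DATE(YEAR(),MONTH(),DAY()) business is there to strip any time-of-day from the dates, collapses to something far shorter:

=CPPI
=MAX(1,INT(C4)-INT(C6))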

“Why”, I asked myself, “don’t these monkeys think?” (When someone asked me which monkeys I was referring to, I realised that my inner monologue was malfunctioning, but that’s not important right now.)

Obviously they were thinking; apart from anything else, the whole thing does seem to work, inasmuch as a set of inputs produces a set of plausible-looking outputs. I’d assert, however, that an experienced, competent developer has a continuous, higher-order review process running, asking questions like “does that look right?”, “could that be written better?”, “is that clear?”, “could someone else understand that?” It would appear that the individuals concerned in this case had no such process running.

I have a little tool that I bring out when faced with an unknown Excel VBA application. It scans the code and delivers a report that gives me an idea of the degree of caution I should exercise, calculating for each routine an “X” number, being some arcane function of code lines, nesting level, numbers of Ifs, loops and one-character variables (seldom straightforward to rename, even if they’re called “q” or “z”). A score of 10 or above indicates a possible problem area. Over 50 and we’re definitely in bandit country. Here’s the report for the top-scoring routine in the horror workbook (names changed to conceal the guilty):

422 is, by a factor of about three, the highest score I’ve ever encountered. (By contrast, and perhaps it’s an unfair comparison, the top score in xlUnit is 24.) Those 1,044 lines of code don’t include comments, blank lines, variable declarations or similar. Even better, the routine is declared Static, something so appallingly dangerous (to me, at least – I get nauseous when I see even a single Static variable) that I had to change the analyser, which had never seen one before. “Option Explicit” is honoured more in the breach than the observance and there are a mere (ha!) 156 global variables.

I was flicking through my cherished (and slightly stained) copy of The Pragmatic Programmer last night, and I’ll take as my text the section “What Makes a Pragmatic Programmer?” on pages xiii and xix of the Preface. Summarising the characteristics and comparing them with the authors of the unholy mess described, I get:

Early adopter/fast adapter: given that the abomination I’m working with is the culmination of more than two years’ work and is the fourth or fifth (and probably “best”) version, FAIL.

Inquisitive: I see no sign of any curiosity, not even “why do we suck?” FAIL

Realistic: I’m (generously) prepared to stipulate that this one may not really apply here. (Big of me.)

Jack of all trades: and masters of none? These guys didn’t make it to Jack of one trade. FAIL.

Critical Thinker: referring to the inner monologue again, there’s no evidence to suggest that anything like the questions listed above were ever applied here. And plenty of evidence to the contrary. BIG RED FAIL.

The tragedy here is that the monkeys described are unlikely to realise exactly how snivellingly far from acceptable (let’s not waste “best” here) practice they are, which is just what the Dunning-Kruger effect tells us to expect, so they’re not going to change in any useful way. Failing to see that the twin demons of high coupling and low (non-existent, if we’re honest) cohesion are rampaging through their code leaves me wondering where we’d start.

Depressingly, these guys aren’t unusual, they’re not even the worst. The type is perhaps a little more common in, say, the Excel world, where the tools to develop staggering incompetence are so widely available, but they’re everywhere else – the invisible 90% of the iceberg.

On the plus side, even moderately competent technicians who are prepared to don the elbow-length protective gloves and dive in to such messes will be able to look forward with some confidence to a lifetime in which there will be a paycheck for as long as they can stand to pick it up.