Wednesday, August 12, 2009

Developer Tools You Don't Use – And Why You Don't Use Them

August 11, 2009 —

Just about every developer uses a debugger, at least occasionally. The reasons are obvious: Code inevitably has defects, and a tool can help find them.

However, other categories of software tools are used far less often. According to several years' worth of data collected by analyst firm Evans Data, a steady 20-30% of software developers stay away from some developer tool categories entirely. For example, a quarter of North American developers never use a load and stress test tool; 22% never use data modeling tools.

[ See also: Fathers of Technology: 10 Unsung Heroes and The end of bloatware: The return of programming's golden age? ]


Sometimes this makes perfect sense, based on the application or where it runs. Intranet apps rarely need a load and stress test tool to determine whether they can handle a Slashdot-sized load. If you think your application might endure the occasional heavy load, you might host it on a cloud-based server rather than tune for a situation that may never arrive. When some developers say, "These tools aren't necessary," they have good reason for that opinion. According to the latest Evans North American Development Survey, performance tools aren't used by 29% of the developers who primarily write departmental in-house software (as compared to those who write apps the whole enterprise uses), and 44% of those departmental developers never use load and stress test tools. In contrast, 90% of independent software vendors (ISVs) use performance tools at least occasionally, and three quarters use load testing tools.

Most unused development tools (percentage of developers who never use them)
Performance tools: 18%
Load and stress test tools: 25%
Refactoring tools: 26%
Application modeling tools: 27%
Data modeling tools: 28%
Source: Evans Data

But other categories of developer tools, such as security testing and performance tuning, presumably could benefit everyone. Why don't developers adopt them? To learn the answer, I asked dozens of developers why they don't use data modeling tools, application modeling tools, load and stress test tools, security testing tools, refactoring tools, or performance tools. I also invited a few vendors to chime in, to share the most common sales objections they encounter.

The Tools Are Unnecessary or Irrelevant

A common answer to, "Why don't you…?" is "Because I don't need it."

These tools are out of scope for many projects, points out developer Quentin Neill. Recently, he built an ad-hoc tool to mine a database, merge data into a document infrastructure, and publish it to the web. That application didn't use any of those tools, he says; there was simply no need for them.

Often, tools are perceived as overkill. Developer Leonid Lastovkin explains, "I do not need a bulldozer to build a sand castle. If I am building a big sand castle – then maybe." Developers like Lastovkin must see a tangible, practical need before adopting a toolset. "Performance [tools] can be quite useful, but not until you hit a realistic bottleneck," he says. "It all depends on scalability requirements."

That sentiment is echoed by Charles Wilde, CTO at Aton International, who decides on tool relevance based on project size and type. For instance, an Agile development project with three people might skip data and application modeling tools; a short lifetime project may not justify refactoring or performance tools. "As a project manager, I must make decisions on tools based on cost/benefit. These decisions can vary widely on the specific details," says Wilde.

"I do not need a bulldozer to build a sand castle. If I am building a big sand castle – then maybe."

Leonid Lastovkin

This assumes that developers know when the out-of-scope application status changes. Brian Vosicky, a consultant at Corporate Technology Solutions, explains that most projects start small and do not take into account possible growth; customers want a quick turnaround that meets today's requirements. "But the problem is: Requirements constantly change and expand, which then exacerbates the shortsightedness in design and testing and skews the calculation of the benefit," says Vosicky.

Of about 40 teams supported by Capgemini's Paul Oldfield, a methodologist with 25 years' experience, few if any of the legacy enhancement teams use application modeling tools, because they already know the application back to front and inside out. Other teams have different reasons. "The better [teams] manage fine with whiteboards and paper; a few individuals 'just wing it' despite evidence that they are less productive overall if they go that way."

Sure. That makes sense. If you believe that any individual developer knows what he's talking about.

I Don't Need Their Functionality. Uh, Whatever It Is.

The question raised by the common "I don't need it" argument, however, is, "How do you know?"

One thing I learned from my developer conversations is that vendors don't always make it obvious how the software improves the development process. And developers have their own way of doing things or are confident that their code is adequate (whether or not the attitude is justified), so they see no driving reason to change.

The more experienced you are, the less you need these tools, in the opinion of many. For Mark Hunter, founder of FlipScript.com, the tools' cost-to-value ratio is way too high, especially when factoring in the cost of his time. "For instance, why use data modeling tools when an experienced developer like myself can knock out the entire SQL script to create the database in a couple of hours, and later modify it in seconds?"

"Half of the developers in the world are below average, and you and I are depending on their code as well as on software written by the smartest programmers."

Richard Kirkcaldy is a developer at Computer Gentle, serving small businesses and non-profits. Even with larger scale deployments, he'd rather rely on his own knowledge than theoretical measurements. "It's much more helpful to know from experience that a certain database type will handle a certain number of users," he says. "If you don't know how many users a system will handle, you can ask people who have used similar setups." In other words: I can do this job better than the software.

Developer David Stevenson certainly agrees with that premise. "I just completed an XML over HTTP application, and I built my own performance measurement/load test tools from scratch," he says. "I would have to see a tool that wouldn't take an inordinate amount of time to learn, and that could do an equal job to the custom tool that I built."
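The kind of home-grown tool Stevenson describes can be quite small. As a rough, hypothetical illustration (not his actual tool), a minimal concurrent HTTP load tester in Python might look like this:

```python
# Minimal sketch of a home-grown load tester: fire N HTTP GETs with a
# fixed level of concurrency, then report simple latency statistics.
# The function name and parameters are illustrative assumptions.
import statistics
import time
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

def load_test(url, requests=50, concurrency=10):
    """Issue `requests` GETs against `url` using `concurrency` worker
    threads; return count, mean latency, and 95th-percentile latency."""
    def one_request(_):
        start = time.perf_counter()
        with urlopen(url) as resp:
            resp.read()                      # drain the body
        return time.perf_counter() - start   # latency in seconds

    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        latencies = sorted(pool.map(one_request, range(requests)))

    return {
        "count": len(latencies),
        "mean": statistics.mean(latencies),
        "p95": latencies[max(0, int(0.95 * len(latencies)) - 1)],
    }
```

A sketch like this covers steady-state throughput measurement only; a real tool (or a commercial package) would also handle ramp-up, error rates, and think time, which is where the learning-curve trade-off Stevenson weighs comes in.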

Maybe Hunter, Kirkcaldy, and Stevenson are justified in their viewpoint (I assume they are), and maybe it's true for you (because as my reader you have demonstrated good taste and brilliance). But statistically, half of the developers in the world are below average, and you and I are depending on their code as well as on software written by the smartest programmers. Remember that guy on your team three years ago who was a legend in his own mind? Do you think his apps could stand to be scrutinized by a fine-tune finishing tool? Do you think he'd have admitted it?

As Larry Warnock, CEO of IT vendor Phurnace Software explains, "Most of the [sales] objections we hear are actually rooted in pride. And it's completely understandable. Developers have spent hundreds of hours creating their baby: home-grown deployment scripts. But ultimately it's tough to argue with the power of automation tools."

It's one thing to reject developer tools because they don't bring enough value to justify the time and money to acquire and install them. However, developers are not always aware of what the tools can do, at least according to vendors. Rob Cheyne, CEO of Safelight Security Advisors, says it depends on the team's skill set. A team with a performance engineering expert may already have a suite of stress test tools. But, he cautions, other teams may not know what's possible. Cheyne says, "Some of the really good security tools have only been available for a few years, and new tools come out all the time."

Vendor Mandeep Khera, CMO for security-tools company Cenzic, believes that too many developers don't recognize that they might have a Web security problem; after all, they haven't been hacked. "They think having SSL is enough. Usually it takes a lot of education and showing them actual hacks to convince them," Khera says.

The Tools Don't Deliver on Their Promises

The vendors, naturally, believe that their tools improve the software development process. (You expected otherwise?) But for some developers, it's not a matter of willingness to adopt a tool; the problem is that the tools don't deliver what they promise.

A decade ago, one developer (I'll call him Dennis) was part of an evaluation team asked to recommend one web stress-testing package from the three finalists management had chosen (all costing over $100,000). None really tested load, he says. They were far too underpowered, and what seemed like exact, discrete, measurable performance specs turned out to be mushy. "We held our nose and endorsed the best of the three," Dennis says. "The company bought one of the two losers instead, based on a sales-to-executive end-run."

Frank Koehl, founder and lead developer at Fwd:Vault, just doesn't believe that security testing tools, for instance, are worth the effort. "I code to eliminate the security holes, but a test is supposed to uncover ones I may have missed," Koehl says. "Because these attacks are so specific, and the tools can only be built for general purpose use, they are often useless. The ones that can get situation-specific are typically far more trouble than they're worth, because they still don't get you inside an attacker's head."

But not every developer rejects the tools. Many would be happy to use them – if only they could.

It's Not Part of the Development Process

In the eyes of Stevi Deter, a lead software engineer at Mantis Technology Group, a primary barrier to adoption is that these tools aren't part of the ordinary software development lifecycle (SDLC). Says Deter, "Unlike, say, Test Driven Development, there is no 'Security Driven Development' concept that teaches processes to include security concerns as an integral part of the development process. Instead, it seems to be viewed as a later step in the SDLC, one that can be skipped."

Safelight's Cheyne acknowledges that adding a new tool can cause concern about breaking an existing process. "It can be risky to change it until you fully understand the benefits and have the time and resources to work it into the existing process," he says.

"It is hard to quantify what benefits the tools will deliver, but the cost is up-front and highly visible."

Dave Poole

The tools also require that the practitioner know what to do with what the tool tells her. For example, explains Alice Kærast, code administrator at Qvox.org, security testing tools aren't very useful unless the developer understands and can deal with the results. "They only ever give obscure vulnerabilities which nobody will exploit and they miss real vulnerabilities that can be exploited," she says. Kærast is comfortable with her (open source) code and believes it's written securely.

Developer and database guy Dave Poole is also dubious about results. "It is hard to quantify what benefits the tools will deliver, but the cost is up-front and highly visible," he says. "I started looking at database stress test tools but was hampered by cost, poor documentation, obscure user interfaces, and ultimately time."

Or, as Denis Sinegubko, founder and developer of Unmask Parasites admits directly, "I don't know how to use them effectively and don't have time to learn."

It's not just a matter of asking software to do a better job than you're doing yourself. If you don't see a need for, say, refactoring, you surely won't buy software to assist with it. Keith Barrows, lead architect at RivWorks, uses refactoring tools personally, but he knows a lot of developers who don't know how to refactor. That makes for messy apps, he says. Similarly, few of Oldfield's teams use refactoring tools. "Many seem just to let the code ossify despite attempts to introduce them to better ways of thinking," he says.

They Might Be Useful, But They're Too Expensive

Another perception is that the tools cost too much for the developer's budget, or (going back to tool quality) they don't offer enough bang for the buck.

Stevi Deter summarized this attitude succinctly: "[Security testing tools] are expensive, so even as a developer focused on continually improving my skills, I find it hard to learn about them on my own. Contract prices don't include these tools, so it's hard to justify the outlay to my employer."

Because specialist tools cost a lot of money, says database guy Dave Poole, few small companies have the budget or time to invest in them. "I had to deliver an e-commerce site written in PHP with Notepad.exe!" he exclaims. "Once you step into the big league, decent tools start to give the ROI but they do require an outlay both in financial terms and in staff training terms."

The "high prices" may be a matter of perception more than actuality, even though the economy has reduced IT budgets. Cenzic's Khera says his company can overcome many such myths by showing developers how to use a SaaS solution at a very low price.

It's Hard to Convince Management They're Necessary

In some cases, the barrier is not developer reluctance, but finding employers who'll cough up the money to pay for software quality improvement tools. That can be a Catch-22, since a developer may not know what the software can do until she uses it herself, making it hard to ask for the budget allocation.

David Stevenson's company uses only free tools, other than the Microsoft development suite. Buying tools requires the costs to be justified, he says. "Unless you are familiar with the tool (and have a free version to train yourself on) you will not know what benefits and cost justifications the tool will provide, so that you can (attempt to) justify spending on a tool."

For some, tool adoption reflects overall corporate attitudes and, alas, company politics. According to Geoffrey Feldman, a consultant with 30 years of experience, companies exist on a continuum with "Document and analyze everything before writing any code" at one end and "Get 'er done" at the other, and the "Get 'er done" school doesn't use tools unless contractually required. "At the extreme end, if their customer pays for it, it's done," Feldman says. These companies also have "people lovers": managers who gain importance based on the number of people under their command. "Many of these tools are labor-saving devices," Feldman points out, "and thus they eat into the justification for [managing] lots of people."

So, why aren't you using these tools? The bottom line is that they are perceived (rightly or wrongly) as expensive, unnecessary, mystifying or inaccessible, and hard to learn. That's quite a challenge for those of us who care desperately about code quality, and for those who want to create tools to improve it.

All may not be lost, though. For some tool categories, adoption may just be a matter of time.

Safelight's Cheyne points to debuggers as a good example. For a period of time after modern debuggers first became available, developers continued to debug code manually, inserting print statements and tracing output to the screen. "Not everybody was immediately aware that debuggers existed, they were comfortable with the way they had been doing things, and the debugger wasn't always bundled in with the compiler and used to cost extra money," Cheyne remembers. But adoption followed quickly after developers got a taste of the new tools. "Today, no enterprise developer would dream of debugging the old-fashioned way," Cheyne adds. "It's extremely inefficient and costs far more in developer hours than the cost of the tools. The tools that truly improve ROI will always be adopted in the long run."