
Let’s turn off all features to improve UX

Here’s a thought: what if all features on a new software upgrade were turned off by default? What if, when you bought a software upgrade, all the new features were disabled and the software worked exactly as it did before you installed it? What if new software had all but the most structurally necessary features disabled? Yes, I’m serious. But why? For a maximum user experience, of course … WTF? … how can you have a maximum user experience with everything turned off? Because the biggest issue with user experience is surprises. Well duh! But now we’re talking bugs, and everybody knows bugs suck. Yeah, bugs suck, but that’s not what I’m talking about. I’m talking about features that work to spec, but that, unfortunately, the user doesn’t even know exist! Notice that I’m not talking about features that users don’t know how to use. I’m talking specifically about features, often intrusive features, which the user doesn’t know about. Like what? … oh, I don’t know … like having my keyboard cycle between US, Canadian French, and Canadian Multilingual in an apparently random pattern last week. This happened because the default hot keys in the Language Bar software were set to the same keys as are standard for highlighting text for cut & paste. This took me days to figure out, for a variety of reasons, the biggest being that the change is not obvious until I type a letter that has a different mapping, turning my single quotes (‘) into “ or è! I was completely oblivious to the fact that the utility even had keyboard shortcuts, as I’m sure the person who selected those keys was oblivious to the fact that they are standard shortcuts. This was extremely frustrating, and notice it was not a bug. The problem is that things did not work as expected. Unfortunately, when the user doesn’t know ‘about’, let alone ‘how to use’, all the funky bells & whistles we added to improve their lives, their experience more closely resembles a series of surprises. And surprises suck … oh yes they do suck. I’m not talking “Ed McMahon showing up on your doorstep” surprises, I’m talking “your meeting starts in 5 minutes, and the printer doesn’t work, and when you reboot, a large Windows Update starts installing” surprises. Surprises are frustrating and unproductive. So why the surprises? Because users are diving into the software to get stuff done, and don’t have the... read more

New Job

I’m starting at a new job this morning. I’ve been looking since April. Well, sort of; it took me a long time to accept the fact that full time might be the best choice for me at this point in my life. But once I accepted that, I went straight to Infusion Development and got into their interview process. I’ve known about Infusion since about 2006 via their ads on DotNetRocks. As I understood it, they’ve been growing super fast, and have a reputation for bright developers and a bleeding edge development strategy. For example, Infusion is a leading developer of Microsoft Surface apps … If you’re reading this, I’m sure you know about Surface already, but if you don’t, go to Google Video and do a search on it. It’s an awesome product. I won’t be working on the Surface though. I really want to work in ASP.NET MVC / jQuery, and that’s what I’ll be doing. I’ve wanted to work at Infusion for a long time, but didn’t act on it since I had a pretty good thing going as an independent consultant. But when my contract ended in April, a bit of soul searching led me to the conclusion that I needed to push my career to another level, and this might be the best strategy. As I understand it, Infusion has over 100 developers in their Toronto offices. If you’ve ever looked at my Twitter feed, you know I like talking to smart developers. So, yeah, I’ll be in heaven. 🙂 One thing I’ve noticed since I started looking for a job is that I seem to have missed an important exit in my career. Nobody looks for developers with more than 10 years’ experience. It seems that after 10 years most devs have moved to management or architecture, which makes sense, but as a ‘do it all’ independent consultant, I never made that distinct leap; I made more of an indistinct transition. Based on conversations during interviews, I got the impression they would like to see me pursue a move to architect and will help me do it. So that’s nice … assuming I interpreted that correctly. Infusion also has a wickedly awesome intensive training program, where you train in house, 8 hours a day, until you learn the new technologies. I believe they even have their own training team. Freakin’ awesome! The HR lady was happy when I expressed... read more

Visual Studio Bug – ‘if’ followed by a try / catch causes debugger stepping error

Yesterday I was debugging and stepped into a method. I wanted to get past my parameter validation checks and into the meat of the method, so I quickly F10’d my way down the method, but I noticed a line of code was stepped on which should not have been touched. The code was a simple parameter validation like:

if (enumerableObj == null) throw new ArgumentNullException("enumerableObj");

with several similar parameter validation lines above it and a try/catch block containing the meat of the method below it. The odd thing was, I thought I saw the debugger step on the throw statement even though enumerableObj should have had a value. I assumed I had somehow passed a null value in for the enumerableObj parameter and had nearly missed the problem in my haste. I had been moving quickly, so quickly in fact that I had stepped about 3 more lines into the method before I even stopped. To be honest, at this point I wasn’t even sure if I had seen it step into the ‘if’ block, so I repositioned my debug cursor back to the ‘if’ condition and stepped again. Sure enough, it stepped into the ‘if’ block. I assumed I had passed in a null parameter, but when I evaluated enumerableObj, it was set; what’s more, evaluating the entire enumerableObj == null expression resulted in false, as expected. But why the heck was I being stepped into the ‘if’ block when the ‘if’ condition was false? I tried it again, just in case enumerableObj had somehow been set as a result of a side effect somewhere, but even then, it still stepped into the ‘if’ block. So, I did the standard stuff: cleaned my solution, deleted my bin and obj directories, reopened the solution, restarted Visual Studio, and rebooted, all the while rebuilding and retesting the project with each change. Nothing seemed to work. I even cut & pasted my code into Notepad, then cut & pasted from Notepad back into Visual Studio to ensure there were no hidden characters in my files.* None of this worked, so I started commenting out code in the method, and eventually I was able to isolate it to the above code failing if, and only if, it was followed by a try / catch block. Seriously! If the try / catch block was there, it would step onto the throw statement even though it should... read more
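To give a concrete picture of the shape of the method (the names below are invented for illustration, not the actual code), it looked roughly like this: a handful of guard clauses followed by a try / catch wrapping the real work.

using System;
using System.Collections.Generic;

public class Repro
{
    // Illustrative sketch only; the parameter name matches the post, everything else is invented.
    public void ProcessItems(IEnumerable<string> enumerableObj)
    {
        // Guard clauses like this preceded the try / catch.
        if (enumerableObj == null)
            throw new ArgumentNullException("enumerableObj");

        try
        {
            // The meat of the method. With this try / catch present, F10 appeared to
            // step onto the throw above even when enumerableObj was not null.
            foreach (var item in enumerableObj)
            {
                Console.WriteLine(item);
            }
        }
        catch (Exception)
        {
            throw;
        }
    }
}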

An Abstract Data Model

This is post 3 from a 7 part series entitled Technical Achievements in my Last Project.

Overview

Normally, when I build a new system, I design the new data model based on the requirements, and build my business objects and data access primarily on that data model*. The remainder of the application is built on the components beneath it, so when you change something at the bottom, like the data model, changes ripple throughout the application. The data model serves as the foundation of my application. Now as far as this project goes, one of the important requirements was to deliver the new system incrementally, while leaving the older system to run in parallel until completely replaced.

Parallel Data Models

This presented a bit of a dilemma for me since the current database was … well … lacking, and I was planning to refactor it enough to make it a very unstable foundation for the old system. I wanted to refactor it for a number of reasons, including: missing primary keys, no foreign keys, no constraints, data fields which were required but not there, data fields which were there but not used, data fields containing 2 or more pieces of information, and tables which should have been multiple tables. Not to mention the desire to achieve a consistent naming convention without the insane column names using characters like ‘/’ and ‘?’ … seriously. However, the parallel systems requirement caused a bit of a dilemma. I mean, how do you manage parallel systems when one needs a stable foundation and the other is so temperamental that you don’t want to touch it? My options as I saw them were something like:

1. Scrap the data model refactoring. This really didn’t get much thought. Well it did, but the thought was, is this the best route for the client? And if so, should I offer to help them find my replacement or just leave? I definitely wasn’t up for replacing one unmaintainable piece of junk with another.

2. New data model and refactor the existing app. The existing application was a total nightmare built in classic Access spaghetti code fashion. Just touching that looked like going down a rabbit hole of certain doom.

3. New application on the old data model and refactor the data model later. This would have caused a real disconnect between the data model and the application. I’m not sure if the data model and application ever... read more
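To make the dilemma a little more concrete, one way to keep a new application insulated from whichever physical schema it happens to sit on is to code it against an abstraction rather than against the tables directly. The sketch below is only an illustration with invented names (Customer, ICustomerStore); it is not the actual mechanism the rest of this post describes.

// Illustrative only: invented types, not the project's actual code.
public class Customer
{
    public int CustomerId { get; set; }
    public string Name { get; set; }
}

public interface ICustomerStore
{
    Customer GetById(int customerId);
    void Save(Customer customer);
}

// One implementation could sit on the legacy (unrefactored) schema, another on a
// refactored schema; the application only ever sees ICustomerStore, so swapping
// the foundation underneath doesn't ripple up through the rest of the code.
public class LegacyCustomerStore : ICustomerStore
{
    public Customer GetById(int customerId)
    {
        // Hypothetical: read from the old, messy tables here.
        return new Customer { CustomerId = customerId, Name = "(loaded from legacy schema)" };
    }

    public void Save(Customer customer)
    {
        // Hypothetical: write back to the old tables here.
    }
}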

Hey #region haters; Why all the fuss?

I hear a lot of programmers saying #regions are a code smell, from programmers I don’t know to celebrity programmers, and the development community appears to be passionately split.* The haters definitely have a point about regions being used to hide ugly code, but when I open a class and see this, it just looks elegant to me. Now, none of these regions is hiding thousands of lines of ugly code; actually, most of these regions contain only 3 properties and/or methods, and the last curly brace is on line 299. So the whole thing, with 17 properties and methods, including comments and whitespace, is only 300 LOC … really, how much of a mess could I possibly be hiding? To me, the only question is whether I should have this functionality in the ContainerPageBase or the MasterPageBase**. You may also notice the regions I have are not of the Fields / Constructors / Events / Properties / Methods variety. It has taken some time for me to accept that all data members (aka fields) do not need to be at the top of the class, as I was classically trained to do, and that perhaps grouping them by functionality is a better idea. This philosophy only makes regions that much more valuable. … is anybody still here? …. have I converted anyone? 😉 * These posts are fairly old, but in my experience in the developer community, the consensus hasn’t changed. ** The Database Connection & Current User regions may have some scratching their heads. There are valid reasons for them, however the Data Connection region will never be included at this level again. More on that in a future post. Copyright © John MacIntyre 2010, All rights... read more
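The screenshot isn’t reproduced here, but the idea looks roughly like the sketch below: regions grouped by what the members do rather than by member kind, with invented members standing in for the real ones.

using System.Data;
using System.Web.UI;

// A rough sketch of the layout only; the members are invented, not the actual ContainerPageBase.
public abstract class ContainerPageBase : Page
{
    #region Database Connection

    private IDbConnection _connection;

    protected IDbConnection Connection
    {
        get { return _connection; }
    }

    #endregion

    #region Current User

    protected string CurrentUserName
    {
        get { return User.Identity.Name; }
    }

    #endregion

    #region Page Title

    protected void SetPageTitle(string title)
    {
        Title = title;
    }

    #endregion
}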

Large Application Estimation in 2 Weeks

This is post 2 from a 7 part series entitled Technical Achievements in my Last Project. My role in this project started out by being asked to assess the existing project and provide insight into options to move it forward, with one of those options being a rewrite*. An estimate was needed for the rewrite option, so I was given 2 weeks to do it. This post explains how I was able to pull off this massive estimation undertaking in a mere 2 weeks. Ideally, the project documentation from the existing system could be used to give an excellent estimate, but this is a blog post, not a fairy tale. Or a thorough specification could have been derived from an in-depth analysis of the existing application, which the business could have adjusted as needed and used to reach a reasonable estimate. But this is the real world, and this is a real business; and I was given a real (short) deadline. Now I should also mention this wasn’t a 20 KLOC project; it was a fairly complex piece of software with over 500 KLOC** and almost 1800 database objects, along with satellite applications. Everybody understood how this short timeframe severely limited the accuracy of anything I would be able to provide, but I was determined to do the best job possible. So my next goal was to figure out how to do a somewhat accurate estimate, given the constraints, where I wouldn’t be setting myself up for a lynching at the end of it. I explored many different ways to get a rough idea of the entire project’s scope. This is what I finally settled on:

1. Dumped all Microsoft Access objects. First I modified an Access VBA script I found for exporting objects to text files and exported everything.

2. Dumped all database DDL. I wrote a little command line utility to loop through a SQL Server database, pull the DDL for each object using the sp_helptext stored procedure, and write it out to text files (a sketch follows below).

3. Created an analysis database. This database was primarily comprised of three tables: one for all the entities the application is comprised of, a second for linking which entity called which, and a third for linking menu items to all dependent forms.

4. Collected the names of all objects into the database. I wrote another little command line utility to read each code file dumped out in steps 1 & 2, and add the object’s name and a... read more
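For anyone curious about step 2, here is a minimal sketch of that kind of dump utility. It is not the original code; the connection string and output path are placeholders, and it only covers programmable objects, since sp_helptext doesn’t return table definitions.

using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using System.IO;
using System.Text;

class DdlDumper
{
    static void Main()
    {
        const string connStr = "Server=.;Database=LegacyDb;Integrated Security=true"; // placeholder
        const string outDir = @"C:\ddl-dump"; // placeholder
        Directory.CreateDirectory(outDir);

        using (var conn = new SqlConnection(connStr))
        {
            conn.Open();

            // Gather the names of programmable objects (procs, views, functions, triggers).
            var names = new List<string>();
            using (var cmd = new SqlCommand(
                "SELECT name FROM sys.objects WHERE type IN ('P','V','FN','IF','TF','TR')", conn))
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                    names.Add(reader.GetString(0));
            }

            // Dump each object's source text to its own file via sp_helptext.
            foreach (var name in names)
            {
                using (var cmd = new SqlCommand("sp_helptext", conn))
                {
                    cmd.CommandType = CommandType.StoredProcedure;
                    cmd.Parameters.AddWithValue("@objname", name);

                    var sb = new StringBuilder();
                    using (var reader = cmd.ExecuteReader())
                    {
                        while (reader.Read())
                            sb.Append(reader.GetString(0)); // sp_helptext returns the source in line-sized chunks
                    }
                    File.WriteAllText(Path.Combine(outDir, name + ".sql"), sb.ToString());
                }
            }
        }
    }
}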

Technical Achievements in my Last Project

I’ve wanted to write this series for a long time, but hadn’t gotten around to it. Now, however, with my contract ending soon, I feel that if I don’t write some of this down, I will never find the time, and it will be lost forever, which would be disappointing since I feel there are some really interesting things I did on this project which could benefit others. This isn’t about the kludges needed to get Microsoft’s dysfunctional WebForms architecture to work the way I need it to. It’s not about how to fire a server side event from a client side created button, or how to write JavaScript for an ASCX used multiple times on the same page when you don’t know what the rendered ID will be, and it’s not about overcoming resistance created by bizarre vendor API paradigms or outright bugs. It’s about overcoming the big requirements challenges placed before me. The project basically revolved around a significantly large and complex Microsoft Access application which had many issues. It was determined the best course of action was to rebuild it as an ASP.NET web application. However, two important constraints were placed upon me: a) development could not be done in parallel with a single switch-over of everybody all at once upon completion; and b) the new web application had to run from within the existing MS Access application and interact seamlessly with it until the MS Access app was completely replaced. The first constraint wasn’t a big deal until you consider the fact that the existing database was a total mess, needing refactoring, and the Access app was a spaghetti code nightmare where changes could potentially drag on into eternity. It was wisely decided very early on that we would not do anything to upset the stability of the existing application. The series will cover some of the more interesting things done on this project and will run over 7 parts:

1. Introduction and Overview: Introduction to the series, and a brief run down of the general requirements and intention of the project.

2. Large Application Estimation in 2 Weeks: How I assessed the condition of a very large and complex MS Access application with 540 KLOC and almost 1,800 database objects, and how I was able to provide a very rough estimate for its reconstruction within a 2 week deadline. I expect to be able to piece together how I did this from memory and the rough notes I have, but if I’m unable to come up with... read more

The CRUDLAFS Technique for Software Estimation

I have always, like so many other programmers, had a problem with software estimation and costing out a software project. Most of my career was plagued by software estimates so bad that I made significantly less money on my independent projects than I did at my low paying day jobs. This was not the reason I was working through my vacations! The turning point came when I realized the CRUD acronym encompassed most of the functionality of the line of business applications I was writing and could be used as a checklist to ensure I wasn’t missing functionality. Later, this checklist was refined to CRUDLAFS. Wait! What? CRUDLAFS? … did I just coin a new acronym? …. Hmmm, looks like I did!</egosmirk> While I’m sure you are already familiar with the CRUD acronym, here is CRUDLAFS:

Create: All functionality around validating and creating an entity
Read: All functionality around reading and displaying an entity
Update: All functionality around validating and updating an entity
Delete: All functionality around deleting an entity
List: All functionality around querying and displaying a list of entities
Additional: All additional functionality related to an entity
Filter: All functionality around filtering a list of entities
Sort: All functionality around sorting a list of entities

I set up an Excel spreadsheet with CRUDLAFS as column headings, all my known or probable entities down the left, and the probable hours to accomplish each in the cells, like so: So in my example, the estimated time to create a list of orders is 5 hours. And when I say known or probable entities, I mean: on smaller projects, I usually have a pretty good idea what all the entities will be, but on big projects, I charge hourly until requirements, data model, GUI design / functional specifications, and initial risk assessment are complete. Most larger projects just have too much back and forth communication, and the customer usually doesn’t have more than a vague idea of what they want. This was a very important distinction which has kept me out of bankruptcy. I get the feeling this may not work well with agile approaches, but for Big Design Up Front projects, it has been a major leap for me. Copyright © John MacIntyre 2010, All rights... read more
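The original spreadsheet screenshot isn’t shown here, but a hypothetical version, with made-up entities and hours (only the Orders / List cell of 5 hours comes from the example above), might look something like this:

Entity      Create  Read  Update  Delete  List  Additional  Filter  Sort
Customer    6       3     6       2       4     8           2       1
Order       8       4     8       2       5     12          3       1
Product     4       2     4       1       3     2           1       1

Summing all the cells then gives a rough figure for the whole project.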
