A simple proposal to rule out obvious software patents

I just read the EFF’s new website (https://defendinnovation.org/) about how to change the patent system to prevent patent trolls from stifling innovation. I like some of the ideas, but question how realistic others are (#3? C’mon; how many implementations in how many languages would you need to write to cover all your bases?). I realize this is an extremely naive statement to make, but I’m going to make it anyway: solving the ‘non-obvious’ aspect of a software patent is simple, and it doesn’t involve judges learning to code.

Here’s my proposal: software patent applications should include a unit test suite and a single solution implementation. The test suite would be made public immediately, without the implemented solution, and the public would take a crack at making the test suite pass. If there are no successful passes, it could be deemed a difficult problem with a non-obvious solution. If the patent office is bombarded with working solutions, it unquestionably fails the ‘non-obvious’ test. It’s debatable what to do if a few brilliant developers solve it while most fail. But this would definitely eliminate laughable patents like Amazon’s 1-Click, which would have bombarded the patent office in less than an... read more

An Abstract Data Model

This is post 3 of a 7-part series entitled Technical Achievements in my Last Project.

Overview

Normally, when I build a new system, I design the new data model based on the requirements, and build my business objects and data access primarily on that data model*. The remainder of the application is built on the components beneath it, so when you change something at the bottom, like the data model, changes ripple throughout the application. The data model serves as the foundation of my application. On this project, though, one of the important requirements was to deliver the new system incrementally, while leaving the older system to run in parallel until completely replaced.

Parallel Data Models

This presented a bit of a dilemma for me, since the current database was … well … lacking, and I was planning to refactor it enough to make it a very unstable foundation for the old system. I wanted to refactor it for a number of reasons, including: missing primary keys, no foreign keys, no constraints, data fields which were required but not there, data fields which were there but not used, data fields containing 2 or more pieces of information, and tables which should have been multiple tables. Not to mention the desire to achieve a consistent naming convention without the insane column names using characters like ‘/’ and ‘?’ … seriously. But how do you manage parallel systems when one needs a stable foundation and the other is so temperamental that you don’t want to touch it? My options as I saw them were something like:

Scrap the data model refactoring. This really didn’t get much thought. Well, it did, but the thought was: is this the best route for the client? And if so, should I offer to help them find my replacement, or just leave? I definitely wasn’t up for replacing one unmaintainable piece of junk with another.

New data model, and refactor the existing app. The existing application was a total nightmare built in classic Access spaghetti-code fashion. Just touching that looked like going down a rabbit hole of certain doom.

New application on the old data model, and refactor the data model later. This would have caused a real disconnect between the data model and the application. I’m not sure if the data model and application ever... read more
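To make the flavor of that refactoring concrete, here is a minimal T-SQL sketch of the kinds of changes described above. The table and column names are hypothetical, not from the actual project:

```sql
-- Hypothetical cleanup of a legacy table with no keys, no constraints,
-- and a column named [Ship/Bill?] holding two pieces of information.

-- Give the table a real primary key.
ALTER TABLE dbo.CustomerOrder
    ADD OrderId INT IDENTITY(1,1) NOT NULL
        CONSTRAINT PK_CustomerOrder PRIMARY KEY;

-- Enforce referential integrity that previously existed only by convention.
ALTER TABLE dbo.CustomerOrder
    ADD CONSTRAINT FK_CustomerOrder_Customer
        FOREIGN KEY (CustomerId) REFERENCES dbo.Customer (CustomerId);

-- Split the double-duty column into two properly named fields.
ALTER TABLE dbo.CustomerOrder
    ADD ShipToAddress NVARCHAR(200) NULL,
        BillToAddress NVARCHAR(200) NULL;

UPDATE dbo.CustomerOrder
SET ShipToAddress = LEFT([Ship/Bill?], CHARINDEX('|', [Ship/Bill?]) - 1),
    BillToAddress = SUBSTRING([Ship/Bill?], CHARINDEX('|', [Ship/Bill?]) + 1, 200)
WHERE CHARINDEX('|', [Ship/Bill?]) > 0;

-- Finally, drop the old column along with a field that was never used.
ALTER TABLE dbo.CustomerOrder DROP COLUMN [Ship/Bill?], UnusedLegacyField;
```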

Large Application Estimation in 2 Weeks

This is post 2 of a 7-part series entitled Technical Achievements in my Last Project.

My role in this project started with being asked to assess the existing project and provide insight into options to move it forward, with one of those options being a rewrite*. An estimate was needed for the rewrite option, so I was given 2 weeks to do it. This post explains how I was able to pull off this massive estimation undertaking in a mere 2 weeks.

Ideally, the project documentation from the existing system could have been used to give an excellent estimate, but this is a blog post, not a fairy tale. Or a thorough specification could have been derived from an in-depth analysis of the existing application, which the business could have adjusted as needed and used to produce a reasonable estimate. But this is the real world, and this is a real business; and I was given a real (short) deadline. I should also mention this wasn’t a 20 KLOC project; it was a fairly complex piece of software with over 500 KLOC** and almost 1,800 database objects, along with satellite applications.

Everybody understood how this short timeframe severely limited the accuracy of anything I would be able to provide, but I was determined to do the best job possible. So my next goal was to figure out how to do a somewhat accurate estimate, given the constraints, where I wouldn’t be setting myself up for a lynching at the end of it. I explored many different ways to get a rough idea of the entire project’s scope. This is what I finally settled on:

Dumped all Microsoft Access objects. First I modified an Access VBA script I found for exporting objects to text files, and exported everything.

Dumped all database DDL. I wrote a little command line utility to loop through a SQL Server database, pull the DDL for each object using the sp_helptext stored procedure, and write it out to text files.

Created an analysis database. The analysis database was primarily comprised of three tables: one for all the entities the application is comprised of, a second linking which entity called which, and a third linking menu items to all dependent forms.

Collected the names of all objects into the database. I wrote another little command line utility to read each code file dumped out in steps 1 & 2, and add the object’s name and a... read more
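For illustration, here is a minimal T-SQL sketch of what that three-table analysis database might have looked like; the schema and names are my own guesses, not the project’s actual design:

```sql
-- One row per entity the application is comprised of
-- (forms, modules, queries, stored procedures, ...).
CREATE TABLE dbo.Entity (
    EntityId   INT IDENTITY(1,1) NOT NULL PRIMARY KEY,
    Name       NVARCHAR(128) NOT NULL,
    EntityType NVARCHAR(50)  NOT NULL  -- e.g. 'Form', 'Module', 'Proc'
);

-- Which entity calls which: one row per caller/callee pair.
CREATE TABLE dbo.EntityCall (
    CallerId INT NOT NULL REFERENCES dbo.Entity (EntityId),
    CalleeId INT NOT NULL REFERENCES dbo.Entity (EntityId),
    PRIMARY KEY (CallerId, CalleeId)
);

-- Which menu items depend on which forms.
CREATE TABLE dbo.MenuItemForm (
    MenuItem NVARCHAR(128) NOT NULL,
    FormId   INT NOT NULL REFERENCES dbo.Entity (EntityId),
    PRIMARY KEY (MenuItem, FormId)
);
```

The DDL dump in step 2 can be driven by a loop along these lines; the original was a command line utility, so this cursor version is purely illustrative, and it only covers text-based objects, which is all sp_helptext handles:

```sql
-- Emit the source text of every proc, view, function, and trigger.
DECLARE @name SYSNAME;
DECLARE obj CURSOR FOR
    SELECT name FROM sys.objects
    WHERE type IN ('P', 'V', 'FN', 'IF', 'TF', 'TR');
OPEN obj;
FETCH NEXT FROM obj INTO @name;
WHILE @@FETCH_STATUS = 0
BEGIN
    EXEC sp_helptext @name;
    FETCH NEXT FROM obj INTO @name;
END
CLOSE obj;
DEALLOCATE obj;
```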

Technical Achievements in my Last Project

I’ve wanted to write this series for a long time, but hadn’t gotten around to it. Now, with my contract ending soon, I feel that if I don’t write some of this down I will never find the time, and it will be lost forever, which would be disappointing since there are some really interesting things I did on this project which could benefit others. This isn’t about the kludges needed to make Microsoft’s dysfunctional WebForms architecture work the way I need it to. It’s not about how to fire a server-side event from a client-side created button, or how to write JavaScript for an ASCX used multiple times on the same page when you don’t know what the rendered ID will be, and it’s not about overcoming resistance created by bizarre vendor API paradigms or outright bugs. It’s about overcoming the big requirements challenges placed before me.

The project basically revolved around a significantly large and complex Microsoft Access application which had many issues. It was determined the best course of action was to rebuild it as an ASP.NET web application. However, two important constraints were placed upon me: a) development could not be done in parallel with a switch-over of everybody all at once upon completion; b) the new web application had to run from within the existing MS Access application and interact seamlessly with it until the MS Access app was completely replaced. The first constraint wasn’t a big deal until you consider the fact that the existing database was a total mess, needing refactoring, and the Access app was a spaghetti-code nightmare where changes could potentially drag into eternity. It was wisely decided very early on that we would not do anything to upset the stability of the existing application.

The series will cover some of the more interesting things done on this project, over 7 parts:

Introduction and Overview: introduction to the series and a brief rundown of the general requirements and intention of the project.

Large Application Estimation in 2 Weeks: how I assessed the condition of a very large and complex MS Access application with 540 KLOC and almost 1,800 database objects, and how I was able to provide a very rough estimate for its reconstruction within a 2-week deadline. I expect to be able to piece together how I did this from memory and rough notes I have, but if I’m unable to come up with... read more

AboutMe.xml

I have a pet peeve, and like most pet peeves it’s an irrelevant, petty little annoyance, not quite a huge, humanity-oppressing problem.

My pet peeve is filling out the same information (name, address, city, etc.) on paper forms. All that standard information at every doctor’s office, school, and activity registration form for my kids. I mean, why do I need to keep writing this stuff? And why does somebody else have to take the time to retype it into their system?

Really! In all seriousness … what a waste of time! 5 minutes I’ll never get back, every time I start a new relationship with any organization.

But wait … I have a vision! Not a big, glorious, save-humanity vision; it’s more of a save-each-person-5-minutes-of-writer’s-cramp kind of vision. Yes! That kind of glorious vision! …

read more

How To Write Dynamic SQL AND Prevent SQL Injection Attacks

One of my pet peeves is when general rules are taken as gospel and declared the only acceptable practice regardless of the circumstance. One of the big ones is Dynamic SQL. There’s a heck of a good reason for this, and it’s called an SQL Injection Attack; if you are not familiar with it, I would strongly urge you to leave this post right now and read up on it. Anyway, Dynamic SQL is not inherently evil; it’s the appending of user-entered text that is evil. Appending user-entered text is just lazy, and can be easily avoided with parameterization. The trick is to create dynamic SQL …

read more
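One standard way to do this on SQL Server is sp_executesql: the structure of the statement can vary at runtime, but user-entered values travel as parameters rather than being appended to the string. A minimal sketch, assuming a hypothetical Customer table:

```sql
-- Dynamic SQL whose *structure* varies with the input, while user-entered
-- values are always passed as parameters, never concatenated into the string.
CREATE PROCEDURE dbo.SearchCustomers
    @name NVARCHAR(100) = NULL,
    @city NVARCHAR(100) = NULL
AS
BEGIN
    DECLARE @sql NVARCHAR(MAX);
    SET @sql = N'SELECT CustomerId, Name, City FROM dbo.Customer WHERE 1 = 1';

    -- The dynamic part: which filters appear depends on what was supplied...
    IF @name IS NOT NULL SET @sql = @sql + N' AND Name LIKE @name';
    IF @city IS NOT NULL SET @sql = @sql + N' AND City = @city';

    -- ...but the values themselves are parameters, so input like
    -- '; DROP TABLE Customer; --' is treated as data, not as SQL.
    EXEC sp_executesql @sql,
        N'@name NVARCHAR(100), @city NVARCHAR(100)',
        @name = @name, @city = @city;
END
```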
