I was talking to someone who recruits for his company. Part of the screening process is an entrance test in MCQ format, and this test was conducted recently. After all the answer sheets arrived, they were fed into a scanner so a computer could grade them. The same program that processed the scans also did the job of identifying the people who had cheated. I’m told the algorithm they use has worked well in the past, but this time it wasn’t very effective. Previously they had run the papers in batches, as soon as each batch arrived. This time they put all the papers into one batch and fed them into the scanner together. Sure, it saved them the hassle of loading each batch into the scanner one by one, but it also increased the amount of computation, because the number of pairwise comparisons grew automatically. Worse, many of the pairs (of people) it flagged as having cheated were sitting in different cities while writing the exam. The program was never written to handle such volumes of data. They had always assumed the scanning would happen in batches and everything would work flawlessly. But they forgot about all that, and it didn’t work. Had they remembered, a few lines of code would have fixed it: set cheating=false, or skip the cheating test whenever paper(i).city != paper(j).city. Or some such thing.
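To make the fix concrete, here is a minimal sketch of what such a guard might look like. Everything here is hypothetical — the `Paper` class, the similarity threshold, and the answer format are my own assumptions, not the recruiter's actual program — but it shows how a single `if` on the city field prevents cross-city pairs from ever being compared.

```python
from itertools import combinations
from dataclasses import dataclass

@dataclass
class Paper:
    # Hypothetical record for one scanned answer sheet.
    student: str
    city: str
    answers: str  # e.g. "ABCDA...", one letter per question

def similarity(a: Paper, b: Paper) -> float:
    """Fraction of identical answers between two papers."""
    matches = sum(x == y for x, y in zip(a.answers, b.answers))
    return matches / len(a.answers)

def flag_cheating(papers, threshold=0.9):
    """Compare every pair of papers, skipping pairs from different cities."""
    flagged = []
    for a, b in combinations(papers, 2):
        if a.city != b.city:
            continue  # the missing guard: people in different cities can't copy
        if similarity(a, b) >= threshold:
            flagged.append((a.student, b.student))
    return flagged
```

The guard also cuts the work down: instead of comparing all n papers against each other, the program only compares papers within each city, which is exactly what the old batch-by-batch workflow was doing implicitly.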
Sometimes when you’re programming, you miss the tiniest bit of detail and put yourself in an uncomfortable position facing huge repercussions. You then have to cut corners to get a quick solution to your problem. It may not be the best solution, and it may even be the opposite of what you were looking for, but at that time and place you don’t have a choice. This is mainly because programs are only as intelligent as we write them to be. Writing a program that makes the right choice in every scenario isn’t as easy as it looks in science-fiction movies about AI. If you can think of only ninety-nine scenarios where your program must respond to certain events, the hundredth scenario you missed may create unnecessary complications. When humans deal with such events, complications are much easier to handle. The moment you or I see two papers coming from two different places, we would simply ignore any similarity between them. We are not programmed to behave this way; we just know our choices well. The reason computers sometimes make mistakes is that they don’t understand the choices humans make. They make the choices we program them to make.
If I were asked to write a program expected to produce perfect results, I would spend a couple of hours thinking about the perfect solution and then start programming. By the time the program is ready to run, the hundred assumptions I made to speed up the programming process have already rendered it imperfect. I just don’t know it yet. That’s fair, because I’m working under constraints. But when I find out about flaws, it’s my responsibility as a developer to make changes so that the end user doesn’t suffer. Why? Because the end user, in most cases (regardless of whether I release the source code or not), is not aware of the mistakes I make and will be scared to use my application the moment it doesn’t work as expected. This is the detail I look at when I review applications, whether I’m using them or searching for alternatives. If I’m comparing two text editors, and searching a page for a word containing an accented ‘e’ using a keyword with a regular ‘e’ gives me results in one editor and none in the other, I think something is wrong. It is this detail that is important to me, as a programmer and, more importantly, as a desktop user. If developers respect the fact that users know little about how software works, they’ll earn the respect they deserve from them.
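The accented-‘e’ example happens to have a well-known fix: Unicode normalization. A sketch of how an editor might implement accent-insensitive search, using only Python’s standard library (the function names here are my own, not from any particular editor):

```python
import unicodedata

def strip_accents(text: str) -> str:
    """Decompose characters (NFD) and drop combining marks,
    so 'é' compares equal to 'e'."""
    decomposed = unicodedata.normalize("NFD", text)
    return "".join(ch for ch in decomposed if not unicodedata.combining(ch))

def accent_insensitive_find(haystack: str, needle: str) -> int:
    """Return the index of needle in haystack, ignoring accents.
    Assumes precomposed input, so each original character maps to
    one base character and indices line up; returns -1 if absent."""
    return strip_accents(haystack).find(strip_accents(needle))
```

Whether the editor that failed this test skipped normalization or simply chose exact matching on purpose, the point stands: to the user it just looks like search is broken.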