The other day, while automating our load tests (you heard that right – we are even automating load tests as part of our agile efforts), I came across an interesting scenario.

LoadRunner, which we were using to automate tests against the clustered web application, spread all its output across tons of small files, each containing one or more similar lines. The 2-3 lines each file contained would look something like this:

Report.c(199): Error: App Error: refresh report. DataSet took longer than the refresh rate to load.  This may indicate a slow database (e.g. HDS) [loadTime: 37.708480, refreshRate: 26].      [MsgId: MERR-17999]
Report.c(83): Error: App Error: report took longer than refresh rate: 26; iter: 32;  vuser: ?; total time 41.796206    [MsgId: MERR-17999]

But what we wanted to know at the end of the test was a summary of the errors, as in:

  • Report.c(199): Error: App Error: refresh report. DataSet took longer than the refresh rate to load.  This may indicate a slow database (e.g. HDS) – 100 errors
  • Report.c(200) Error Null arguments – 50 errors

Approaching the solution

How would you solve this problem? Think for a moment, but no more…

Well, being the C++ junkie that I am, I immediately saw an opportunity to apply my sword and utility belt of hash tables, efficient string comparisons and the like, and whipped out Visual Studio 2010 (I hadn't used 2010 much and wanted to try it). And then it occurred to me:

Why not use Java? I had recently learned it, and it might be easier.

The Final Solution

cat *.log | sed 's/total time \([0-9]*\)\.\([0-9]*\)/total time/;s/loadTime: \([0-9]*\)\.\([0-9]*\)/loadTime: /;s/iter: \([0-9]*\)/iter: /;s/refresh rate: \([0-9]*\)/refresh rate:  /;s/refreshRate: \([0-9]*\)/refreshRate:  /' | sort | uniq -c

I probably should tidy up the sed commands so that the replacement expressions are passed to sed from a file. But I left it at that, happy that my solution was ready to use, would work every time, required no maintenance and had saved me a lot of time.
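For what it's worth, that cleanup would just mean moving the substitutions into a small sed script and passing it with sed's -f option. A minimal sketch of the idea (the file name normalize.sed is purely illustrative):

# normalize.sed – strip out the values that vary from line to line (hypothetical file name)
s/total time [0-9]*\.[0-9]*/total time/
s/loadTime: [0-9]*\.[0-9]*/loadTime: /
s/iter: [0-9]*/iter: /
s/refresh rate: [0-9]*/refresh rate: /
s/refreshRate: [0-9]*/refreshRate: /

The pipeline then becomes:

cat *.log | sed -f normalize.sed | sort | uniq -c

Appending | sort -rn would additionally list the most frequent errors first.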

The end

 

Thinking about the episode, I was reminded of the hit Japanese samurai movie Zatoichi and its happy ending:

  • The villain (problem) is dead
  • Samurai is happy with his effort
  • The subjects (files/compiler/system/software/hardware) are happy due to the efficiency created
  • Karma (company + job) is happy due to the time saved.

And to boot, it nearly follows the Art of War zen, which advises the clever warrior to win without pulling out his sword…

 

Happy coding !!!

 

 


The not-so-new VS 2010 has added what seems like a cool feature to the Solution Explorer window: it shows all your external dependencies.

But wait a minute, what is all this that you see when you expand the external dependencies?

The drop-down shows ALL the dependencies, even those that are pulled in indirectly. The feature is useful, but I wish someone had put more thought into how it is actually used… If it had shown the dependencies in layers, with the directly included .h files first and then the ones they pull in, and so on, it would have been more usable.

I wish someone from Microsoft hears this and makes amends!!

I remember, during the days of the free Slackware CD and Windows 98, how my friends and I used to customize the OS installation to make it lean and very fast on the almost always older hardware we had to run it on. But any time we had to install or test a new piece of software we had found or someone had shared, it would break the entire install and we had to reinstall everything from scratch.

Even ghosting was not an option in those days (well, we hadn't heard of it back then), and hard disk sizes were changing so often that we had to keep upgrading them every now and then.

Now, finally, there is a PERFECT solution for this: create your own OS image or appliance (if you are into selling devices) using free software and a cool web application.

SUSE Studio allows you to:

  • choose the packages you want (with automatic dependency pull-in)
  • set default customizations (locale, users, startup, installation scripts, etc.)
  • copy your own files into the final system

You can finally create an installable image of that perfect OS you had always dreamt of – or at least the one we used to dream of.

Watch the webcasts on the home page and the dedicated screencast page; they are really cool. I wish this could be done with Windows too – it would be a godsend for folks who sell computers or sell applications to dumb end customers. Wow!!

I read a lot of blogs (or at least I used to). After neglecting my enthusiastic collection of blog links (157 of THE most readable material on the web) for over a year, I decided it was time for a change.

I made 3 folders:

  • Check daily – contains at most 10 blogs I will check and read new articles from
  • Tech Stuff – blogs, tech news and lots of similar material
  • Photography stuff

The point is, the only way I could catch up on my reading was to stop trying to catch up with everything. In short, I had been coping with information overload by simply ignoring all that information.

So my solution was to arrange things so that I could catch up with the ones that mattered. The rest, well, I might flip through them on evenings or weekends IF I get around to it, and I would still be better off than before, rather than ignoring them totally.

Where else do we face overload, and how do you cope?

  • email accounts
  • Electronic stuff that you work with/want to buy
  • Health notes from forwards and emails
  • List of things to do / read etc
  • Things to keep track of, and manage, at work
  • Education (exams skipped during school because you would not score well anyway?)
  • Weight loss – (So much to do, might as well not try)
  • Social Networking updates ?
  • Cooking ?
  • Stocks and Financial info ?

I wonder if there is any other way to cope with overload, other than just filtering… outsourcing??

It had to happen – organizations are starting to warm up to stable free applications like GIMP. In fact, Cisco notice boards (a.k.a. active LCD displays) have started asking employees whether they are using the free apps. Once this trend takes hold, we might at last see the uptake the FSF hoped for in Free (free as in free beer) applications and other GNU apps…

The significance of this move is the implication that companies like Cisco are ready to provide internal support and have made resources available to support free apps like GIMP, or are paying someone else to do so, which basically means the same thing. This is really cool.

The loss of the social stigma against the uptake of free applications is the most important win here for the software development community as a whole. Wow!! I guess we shall now see lots of small startups spring up to support free software, and hopefully this will mean more volunteers adding features and fixing issues in free software.

FSF rocks !!!

Some time back I encountered a scenario in which a chain of stored procedures was being called recursively, and this was eating into our performance (90% CPU time). All of a sudden, the obscure micro-issue of whether to use table variables or temp tables became important.

After ferreting out some good sources of information comparing the two, we managed to reduce the CPU consumption:

  • Making all our stored procedures use temp tables rather than table variables made a difference for us, because our procedures were being invoked frequently and stored procedures are recompiled every time if table variables are used.
  • The impact of the extra DB writes due to temp tables was minimal, because very little data was actually involved.

Ultimately, the biggest improvement came from changes to the algorithm to make sure that the procedures were not needlessly invoked, or at least that parts of them did not run (the recursive algorithm was flattened out).

Learning

If micro optimizations start to look important, it is time to look for improvements in the algorithms that are used.

NOTES – information gleaned from the exercise
  1. Most often, table variables will be all you require.
  2. Table variables cause FEWER recompilations than temporary tables (source: MSDN).
  3. Table variables seem to be scoped at the stored procedure level, so if you declare one inside a while loop it retains the old data from previous iterations (painfully discovered via a bug).
  4. Table variables cannot have indexes created on them, except for the primary key specified when the table is declared.
  5. A CREATE INDEX command can cause query recompilations.
  6. Table variables are held in memory and involve no disk IO, therefore no rollbacks are possible on them. On the upside, no locks are acquired when using table variables.
  7. Table variables cannot have statistics created on them, so for huge data sets they COULD be inefficient?
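To make the table-variable versus temp-table trade-off concrete, here is a minimal T-SQL sketch of the two flavours of such a procedure. The names (dbo.Orders, the usp_SummarizeOrders_* procedures, @Totals/#Totals) are made up for illustration; this is not our actual code.

-- Hypothetical example: dbo.Orders and the procedure names below are illustrative only.

-- Flavour 1: table variable (no CREATE INDEX or statistics possible on it)
CREATE PROCEDURE dbo.usp_SummarizeOrders_TableVariable
AS
BEGIN
    DECLARE @Totals TABLE (CustomerId INT PRIMARY KEY, Total MONEY);

    INSERT INTO @Totals (CustomerId, Total)
    SELECT CustomerId, SUM(Amount)
    FROM dbo.Orders
    GROUP BY CustomerId;

    SELECT CustomerId, Total FROM @Totals;
END;
GO

-- Flavour 2: temp table (lives in tempdb; indexes and statistics are allowed)
CREATE PROCEDURE dbo.usp_SummarizeOrders_TempTable
AS
BEGIN
    CREATE TABLE #Totals (CustomerId INT PRIMARY KEY, Total MONEY);

    INSERT INTO #Totals (CustomerId, Total)
    SELECT CustomerId, SUM(Amount)
    FROM dbo.Orders
    GROUP BY CustomerId;

    -- Allowed on a temp table, not on a table variable (see note 4 above)
    CREATE INDEX IX_Totals_Total ON #Totals (Total);

    SELECT CustomerId, Total FROM #Totals;

    DROP TABLE #Totals;  -- dropped automatically when the procedure ends anyway
END;
GO

Which flavour wins depends on how often the procedure is invoked and how much data flows through it, which is exactly the trade-off described above.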