Wednesday, February 8, 2012

Walking on "The Arch Way"

I had started down the Arch path once before. Back then the intent was to revive some old Fujitsu tablets. As soon as the live image fired up I abandoned it; that was when I realized that a live CD doesn't necessarily mean a live CD with X. Yesterday I found myself researching Arch again in my quest for a lean Linux distribution on a rolling release. Arch was the answer, and it carried the latest versions of all the software I needed to keep current.

Having used Linux for over six years, I was still intimidated by Arch because of what I had read about it. GUI installers are against "The Arch Way". There was a whole text-file-based configuration editing step that I had read about and was not too fond of. However, the part of the non-GUI installer that scared me the most was the disk partitioning. I have 100GB of data on my home partition that I wasn't going to sort through before installing Arch, and I was not thrilled by the idea of backing all of that data up over USB either. Frankly, I felt that "The Arch Way" was getting in my way. But sticking with Kubuntu was no longer an option, and I couldn't find any other Linux distribution that looked like a viable alternative. I decided to read through the beginner's guide on the ArchWiki and fire up a virtual machine to get a feel for the installation process and to see what the final product might look like.

To my surprise, the installation was nowhere near as daunting as I had expected. It was very straightforward and methodical. Having already read the beginner's guide made it a lot easier to follow along. I made sure I familiarized myself with the partitioning utility during the installation. Besides the computer's hostname, I did not have to change any of the defaults in the configuration editing step; the defaults were quite sufficient. I was able to follow the steps from the beginner's guide to set up X and KDE, and could boot into a KDE session on the new Arch VM. To be certain, I ran through the routine one more time and made some mental notes.

Since I was going to install Arch on my laptop, I decided not to hook up the ethernet cable for Internet access and instead do the post-installation updates over WiFi. I already had my wpa_supplicant configuration file in my home directory; it would come in handy after the installation. I also made a note of the UID of my user account under Ubuntu, which would surely be needed once Arch was installed.
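For reference, the wpa_supplicant configuration file is tiny. A minimal sketch of what mine looked like - the SSID and passphrase are placeholders, and wpa_passphrase can generate the psk entry for you:

# ~/wpa.conf - minimal wpa_supplicant configuration for a WPA2 network
# (the SSID and passphrase below are placeholders)
network={
    ssid="MyHomeNetwork"
    psk="my-secret-passphrase"
}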

During the installation:
1. On the partitioning step, I made sure that I didn't reformat my home partition.
2. On the configuration editing step, I changed the HOSTNAME to what I had previously had under Ubuntu.
3. I skipped editing the network configuration as that was needed only for ethernet-based Internet updates.
4. On the package selection step, I selected sudo and all the network and wireless related packages.
After the installation and reboot, I checked to see whether my home partition was still intact, and it was. I created a new user for my login, reusing the UID I had noted while still running Ubuntu. I then fired up the WiFi with the following commands:

wpa_supplicant -Dwext -i wlan0 -cwpa.conf -B
dhcpcd

I was connected to the Internet over WiFi, which I validated by issuing a ping to www.google.com. I modified the /etc/pacman.d/mirrorlist file to uncomment the mirrors to use for my updates. After that I ran the pacman update command:

pacman -Syu

This updated the pacman database and started the initial update process. Once the update was done, I followed the instructions from the beginner's guide to install X and then KDE. I followed the KDE guide on the wiki to set up kdm as the display manager. Once KDE was installed, I rebooted. After the reboot, I was presented with a graphical login screen. Once inside the KDE session, I had to restart wpa_supplicant for WiFi. I also realized that the network daemon was slowing down the boot process, so I disabled it in the rc.conf file.
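With the initscripts Arch used at the time, daemons were started from a DAEMONS array in /etc/rc.conf, and prefixing an entry with an exclamation mark keeps it from starting. A rough sketch of the change - the rest of the array here is illustrative:

# /etc/rc.conf (excerpt) - a '!' prefix keeps a daemon from starting at boot
DAEMONS=(syslog-ng !network netfs crond)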

The wiki mentioned graphical tools for package management, and KPackageKit/Apper was one of them. Since I had used Apper with Kubuntu and was familiar with its functionality, I decided to install it. Installing Apper was probably the trickiest thing to figure out during this entire exercise, but I was able to install it from the AUR. Once Apper was in place, installing other software became a piece of cake. I installed all the plasma widgets and plasmoids, which let me use the Network Management plasma widget I was accustomed to under Kubuntu. Since it needs NetworkManager to function, I installed the NetworkManager daemon and enabled it in the rc.conf file. On the next reboot, I had networking and could get on the WiFi using KDE's network management settings.
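For anyone unfamiliar with the AUR, the manual workflow at the time was to download a package's tarball from its AUR page, build it with makepkg, and install the result with pacman. A rough sketch, assuming the AUR package is simply named apper:

# after downloading apper.tar.gz from its AUR page:
tar xzf apper.tar.gz
cd apper
makepkg -s                           # build the package, resolving dependencies with pacman
sudo pacman -U apper-*.pkg.tar.xz    # install the built package

Enabling NetworkManager afterwards was then just a matter of adding its entry (networkmanager, if I recall the daemon name correctly) to the same DAEMONS array in /etc/rc.conf.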

I did notice that while I had my home folder from the previous setup, my desktop and KDE settings had disappeared. Since I knew that Kubuntu stored user-level configuration under ~/.kde/share, I decided to take a look. I found that there was a ~/.kde4 folder under home along with ~/.kde. That was it: Arch was using the ~/.kde4 folder instead of ~/.kde, which was why my previous settings had not taken effect in the new setup. I copied the share folder from .kde to .kde4 and logged out. After logging back in, I was in a familiar workspace.
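The copy itself was a one-liner along these lines:

cp -r ~/.kde/share ~/.kde4/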

With all my software installed under ArchLinux and my original KDE settings restored, it feels like I never switched distributions.

One of the things that I had forgotten to do was to create a group with the same name as the user, the way Kubuntu does. That caused a temporary permissions issue which I was able to resolve without much difficulty.
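A sketch of the fix, assuming a username of myuser (the name is a placeholder):

groupadd myuser                        # create the per-user group
usermod -g myuser myuser               # make it the user's primary group
chown -R myuser:myuser /home/myuser    # fix ownership on the home directory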

I hope this information will be helpful to those who are considering a move to a different distribution but may be intimidated by what they have read about it.

Goodbye Kubuntu

I never thought it'd come to this. I was content with what I had. I had been using Kubuntu full time since its 7.04 release. Kubuntu was the realization of my love for the KDE desktop environment, as it offered near-latest builds of the KDE desktop and all of the software that I needed for everyday computing. I was a happy camper.

While I was already familiar with Linux, the KDE desktop was what converted me into a full-time Linux user, and Kubuntu was the only distribution I had found that implemented KDE well. So the decision to switch away from Kubuntu to another KDE-based desktop was a difficult one. There were three factors that led me to consider an alternative KDE distribution.

Until a few months ago, I used my computer only for regular everyday tasks like messaging, surfing, e-banking, online shopping, and so on. I did some programming, but that was mostly as a hobby. A few months ago I started spending more time doing application development, for more than just recreation. I mostly develop using the Mono framework on my Linux machine. Since Mono is integrated into Ubuntu, Canonical, to ensure stability, does not offer updates to Mono very often. Mono packages for Ubuntu lag behind the official Mono releases, and recently the gap has only widened. Since Mono itself, by design, lags behind the latest .NET Framework, taking advantage of the latest Mono release has meant compiling it from source on my laptop on a number of occasions. I've had to do this after every six-monthly Ubuntu upgrade, since the upgrade would overwrite some of the dependencies.
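The build itself is the usual autotools routine. A rough sketch from a release tarball - the version and install prefix here are just illustrative:

tar xjf mono-2.10.8.tar.bz2
cd mono-2.10.8
./configure --prefix=/usr/local    # install alongside, not over, the distro packages
make
sudo make install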

Canonical releases Ubuntu on a schedule, with one release in April and the other in October. This means that while the system receives regular updates, major features and enhancements only arrive with those scheduled releases. These are not only the features Canonical might include in a new Ubuntu release, but also enhancements to the desktop environment, the Linux kernel, and so on. Sometimes, waiting for a full release in order to take advantage of these enhancements doesn't seem justifiable.

I would have still continued using Kubuntu if it weren't for the news I came across a couple of days ago: Canonical will discontinue funding the development of Kubuntu. For some perspective, Canonical had one paid full-time developer responsible for the KDE implementation in Kubuntu; it will no longer fund that effort and will continue to provide only infrastructure support to Kubuntu. I am sure this does not mean the end for Kubuntu, but it might mean that KDE-related updates to Kubuntu become more infrequent over time. With no full-time development resources from Canonical, KDE-related maintenance might even have to be taken over by the community.

I did some research on different Linux distributions to replace Kubuntu on my laptop. The criteria were set: I needed a distribution that would let me easily install and upgrade to the latest versions of the software I use, especially the Mono framework and KDE, along with the latest Linux kernel updates. After reviewing my choices against those criteria, there was only one clear winner - ArchLinux. I chose Arch over other distributions because of its rolling release model and the availability of the latest versions of the software I use almost every day. I now have KDE 4.8 and Mono 2.10.8 installed under ArchLinux on my laptop. After installing Apper for package management, I've found myself on familiar ground again. The transition was as smooth as one could hope for and I am already beginning to enjoy my new distribution.

I will always have fond memories of Kubuntu though, as the distribution through which I first experienced Software Freedom. But it is time to move on.

Friday, February 3, 2012

iPads in Education - Jeff Hoogland's take

I came across this post yesterday from Jeff Hoogland on how he realized his Asus T101MT tablet does so much more than an iPad. It is titled Confused about iPads in Education (http://jeffhoogland.blogspot.com/2012/02/confused-about-ipads-in-education.html)

Sunday, January 15, 2012

Implementing ASP.NET MVC-like NameValueCollection translation to method parameters

If you've programmed with ASP.NET MVC, you know about a neat feature where form post data or the query string - plain string data - is automagically converted to strongly typed objects when it is passed to a controller action.

For instance, consider the following code:

public class HomeController : Controller
{
     public ActionResult ProductQuery(string Name, int Code)
     {
     ...
     }
}

The web request to http://<server>:<port>/<AppRoot>/Home/ProductQuery?Name=123&Code=456 will be properly handled, and Name will be treated as a string while Code will be treated as an integer value.

Also consider the following code:

// UserRegFormData.cs
public class UserRegFormData
{
     public string Name{get;set;}
     public DateTime Dob{get;set;}
}

// HomeController.cs
public ActionResult Register(UserRegFormData urfd)
{
...
}

You could now set up an HTML form with action=/Home/Register, method=post, a text field with name="urfd.Name" and another with name="urfd.Dob". When this form is submitted with data in the two text fields, ASP.NET MVC automatically creates the strongly typed UserRegFormData object with the correct values for its properties. This wasn't the case before ASP.NET MVC: you would receive all the values as strings and were responsible for doing the conversions in your ASP.NET server-side code.
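For illustration, the form described above might look roughly like this (the markup is mine, just to make the field naming concrete):

<form action="/Home/Register" method="post">
    <input type="text" name="urfd.Name" />
    <input type="text" name="urfd.Dob" />
    <input type="submit" value="Register" />
</form>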
Well I've been thinking about how this could be done without using ASP.NET MVC.

Why reinvent the wheel, you might ask. The answer is twofold. Firstly, there might be situations where ASP.NET MVC is just not a suitable solution - a non-web application, perhaps. This kind of routing and smart parameter translation would also be useful in scenarios where the input contains both the parameters and the operation to perform on them. Secondly, I have been fascinated by this very approach ever since I discovered it in ASP.NET MVC. When I recently came across the Managed Extensibility Framework (MEF) as it is packaged with .NET 4, I wondered if it could be used to create an MVC framework from scratch. It would be a good learning opportunity and I wanted to seize it.

While I made some progress with routing URLs to methods on controller objects, I was soon faced with the issue of handling the input. The input data would not translate itself into strongly typed objects. A few searches on the Internet revealed that there was nothing readily available for this, and I really didn't want to dig through the ASP.NET MVC code to see how it was implemented there.

I did an Internet search on how to convert string objects to other object types. This StackOverflow article provided the solution:

Another search on how to assign property values using reflection yielded this post on DotNetSpider:

I found how to create strongly typed arrays from this article on Byte.com:

I had most of what I needed to get started.
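Pulling those pieces together, here is a minimal sketch of the idea - the class and method names are mine for illustration, not taken from the repository: string values are converted with the target type's TypeConverter, assigned to properties via reflection, and collected into strongly typed arrays where needed.

using System;
using System.Collections.Specialized;
using System.ComponentModel;
using System.Reflection;

static class StringToArguments
{
    // Convert a single string to the requested type, e.g. "456" -> int, "2012-01-15" -> DateTime.
    static object ConvertValue(string value, Type targetType)
    {
        TypeConverter converter = TypeDescriptor.GetConverter(targetType);
        return converter.ConvertFromInvariantString(value);
    }

    // Populate a strongly typed object from entries like "urfd.Name" in a NameValueCollection.
    static T Bind<T>(NameValueCollection input, string prefix) where T : new()
    {
        T result = new T();
        foreach (PropertyInfo prop in typeof(T).GetProperties())
        {
            string raw = input[prefix + "." + prop.Name];
            if (raw != null && prop.CanWrite)
                prop.SetValue(result, ConvertValue(raw, prop.PropertyType), null);
        }
        return result;
    }

    // Build a strongly typed array (e.g. int[]) from raw string values.
    static Array BindArray(string[] raw, Type elementType)
    {
        Array result = Array.CreateInstance(elementType, raw.Length);
        for (int i = 0; i < raw.Length; i++)
            result.SetValue(ConvertValue(raw[i], elementType), i);
        return result;
    }
}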

I have uploaded the initial code to my github repository: https://github.com/jimmy00784/MEFExample/tree/master/StringToArgumentsTest. The code is still crude and has not yet undergone a refactoring exercise.

Monday, January 9, 2012

RavenDB on Linux - Update

I spent some more time with the RavenDB source code trying to figure out what might have been causing the runtime errors that had led me to comment out the "SatisfyImportsOnce" line and supply code to manually load the MEF exports.

It turns out that some of the imports were not being satisfied. The one place in particular was in the OAuth code, in the Raven.Database/Server/Security/OAuth/OAuthClientCredentialsTokenResponder.cs file. The member IAuthenticateClient AuthenticateClient was expecting an import of type IAuthenticateClient, which was not being satisfied.
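To illustrate the mechanics - this is a generic MEF sketch, not RavenDB's actual code - a member marked with [Import] has to be matched by an [Export] of the same contract somewhere in the loaded catalogs, otherwise composition fails at runtime:

using System.ComponentModel.Composition;

public interface IAuthenticateClient { }

public class TokenResponder
{
    [Import] // composition fails at runtime if no matching export is found
    public IAuthenticateClient AuthenticateClient { get; set; }
}

// Without a part like this in one of the loaded catalogs, the import above stays unsatisfied.
[Export(typeof(IAuthenticateClient))]
public class DummyAuthenticateClient : IAuthenticateClient { }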

I reverted the changes I had made in connection with disabling SatisfyImportsOnce and loading exports manually, rebuilt the project, and fired up the server application. I was presented with the same nasty stack trace.
I commented out the Import attribute on AuthenticateClient, rebuilt the project, and tried running the server a second time. It worked!

There were other similar instances in the code where imports were not being satisfied by corresponding exports; I learnt this from running the xUnit tests on the application. It wasn't making sense. RavenDB was supposed to be a complete solution.

I did a filesystem search for AuthenticateClient under the solution's root folder, and sure enough I found results in C# code files that were not part of the Raven.sln file. These files, and many more, were under the Bundles folder in their own solution. I compiled those projects - Raven.Bundles.Tests did not build due to some issues, mono- or monodevelop-specific I assume.

I copied the generated dll files into the Bundles/Plugins folder and set its path as the value of the "Raven/PluginsDirectory" key in App.config for the Raven.Server project.
I then uncommented the Import attribute, rebuilt the solution, and fired up the server a third time. It worked this time as well.

Next, I'll try to re-run some of those xUnit tests that had failed earlier to see how much ground could be covered out of those 1160 tests that came packaged with RavenDB.


This article is part of the series NoSQL - RavenDB on Linux. The series contains the following articles:
NoSQL - RavenDB on Linux
Open Source Shines - RavenDB on Linux
RavenDB on Linux - Source Code
RavenDB on Linux - Update

Tuesday, January 3, 2012

It's 2012!!!

Have a Happy, Prosperous, and an Open-Source New Year!!!
OK, that last bit doesn't make any sense, but you get it, don't you :)

Saturday, December 31, 2011

RavenDB on Linux - Source Code

I've created a new github repository to temporarily host the updated code for RavenDB to enable execution under Linux. I say temporary because a few things could happen.

The worst-case scenario - and I don't believe this will happen, though there is a slight possibility - is that I am asked to shut down the repository because I unknowingly violated some terms of use. For an open source project, this is a highly unlikely scenario.

The normal-case scenario would be that I get no notice from the creators of RavenDB, in which case this code would continue to exist under its own repository.

The best-case scenario, and I would really like this to be the case, would be that my changes are accommodated upstream in RavenDB in some shape, making RavenDB a cross-platform tool.

I did not investigate why the OutputStream.Flush() call was causing an exception. At the same time, this is really my first attempt at MEF and .NET 4.0, and I don't know why the exports were not automagically loaded; to work around it, I had to load them manually using reflection. A better fix would be to identify and resolve these issues.
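For context, this is roughly how MEF composition is normally wired up under .NET 4 - a generic sketch, not RavenDB's actual bootstrap code; the plugin interface and catalog setup are illustrative:

using System.ComponentModel.Composition;
using System.ComponentModel.Composition.Hosting;
using System.Reflection;

public interface IPlugin { }

public class Bootstrapper
{
    [ImportMany]
    public IPlugin[] Plugins { get; set; }

    public void Compose(string pluginsDirectory)
    {
        // Look for exports in the running assembly and in a plugins folder.
        var catalog = new AggregateCatalog(
            new AssemblyCatalog(Assembly.GetExecutingAssembly()),
            new DirectoryCatalog(pluginsDirectory));
        var container = new CompositionContainer(catalog);
        container.SatisfyImportsOnce(this); // throws if a required [Import] has no export
    }
}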

I am glad, however, that I was able to fulfill a personal quest of learning about RavenDB, and in the process, making it run under Linux. This opens up the possibility of making RavenDB a serious contender against MongoDB on the non-Windows platforms.

RavenDB, along with my source code changes, is available at https://github.com/jimmy00784/RavenDB-for-Linux https://github.com/jimmy00784/ravendb.

Note: Source code url updated.


This article is part of the series NoSQL - RavenDB on Linux. The series contains the following articles:
NoSQL - RavenDB on Linux
Open Source Shines - RavenDB on Linux
RavenDB on Linux - Source Code
RavenDB on Linux - Update