Tuesday, September 11, 2012

3d graphics programming - adding some physics

Now that we've got the basics of 3d game programming using Irrlicht out of the way, how about adding some physics? I'll demonstrate how physics can be added to a 3d application using the open source Tokamak physics library. To better illustrate the source code, I've uploaded it to GitHub; it is accessible via the 3dGraphicsExamples repository.
Update the Qt project file to include the Tokamak library.
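A minimal sketch of the additions, assuming Tokamak was built and installed under /usr/local (adjust the paths, and the library name if your build differs, to wherever your copy of the library lives):

unix:!macx:!symbian: LIBS += -L/usr/local/lib/ -ltokamak

INCLUDEPATH += /usr/local/include/tokamak
DEPENDPATH += /usr/local/include/tokamak
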
In order to use Tokamak in C++ code, you'll need to include its header file; tokamak.h is the library's single umbrella header:
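
#include <tokamak.h>
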
Before Tokamak can take over as the physics engine behind your application, you need to provide it with some details about the simulation as well as information about your 3d models. In Tokamak, 3d objects that are meant to move are called rigid bodies, while bodies that stay in place (they take part in collisions but are never moved by the simulation) are called animated bodies. In order to run, Tokamak needs to know how many objects it will be tracking in the simulation, as well as the gravity to apply.
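The setup looks roughly like this; the counts below are placeholders (the numbers in the repository may differ), while neSimulatorSizeInfo and neSimulator::CreateSimulator are Tokamak's standard entry points:

//Tell Tokamak how big the simulation will be.
neSimulatorSizeInfo sizeInfo;
sizeInfo.rigidBodiesCount = 100;  //Maximum number of mobile bodies (the cards).
sizeInfo.animatedBodiesCount = 1; //Maximum number of immobile bodies (the floor).
sizeInfo.geometriesCount = 101;   //One geometry per body.

//Upper bound on the number of body pairs that can be in contact at once.
s32 totalBodies = 100 + 1;
sizeInfo.overlappedPairsCount = totalBodies * (totalBodies - 1) / 2;

//Earth-like gravity, pulling along the negative Y axis.
neV3 gravity;
gravity.Set(0.0f, -10.0f, 0.0f);

//Create the simulator; the second argument is an optional custom allocator.
neSimulator * sim = neSimulator::CreateSimulator(sizeInfo, NULL, &gravity);
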
In our simulation, we'll start with poker cards initially suspended in the air at varying distances from a point, with all cards facing that point. The cards will fall onto an inanimate floor. Create the floor in Irrlicht and record its attributes in Tokamak.
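A sketch of the idea, with placeholder dimensions; the floor is an ordinary flat cube in Irrlicht, mirrored in Tokamak as an animated body so that it collides with the cards but is never moved by the simulation:

//A flat, wide cube serves as the floor in Irrlicht.
ISceneNode * floorNode = smgr->addCubeSceneNode(1.0f);
floorNode->setScale(vector3df(800, 1, 800));
floorNode->setPosition(vector3df(0, -100, 0));
floorNode->setMaterialFlag(EMF_LIGHTING, false);

//Record the same box in Tokamak as an animated body.
neAnimatedBody * floorBody = sim->CreateAnimatedBody();
neGeometry * floorGeom = floorBody->AddGeometry();

neV3 floorSize;
floorSize.Set(800.0f, 1.0f, 800.0f);
floorGeom->SetBoxSize(floorSize);
floorBody->UpdateBoundingInfo();

neV3 floorPos;
floorPos.Set(0.0f, -100.0f, 0.0f);
floorBody->SetPos(floorPos);
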
Create the cards using the routine from the previous example, but this time also set their physical attributes in Tokamak.
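For each card, something along these lines; pos is the position computed for the scene node in the previous example's routine, and the mass value is an arbitrary pick:

//The Irrlicht side: a unit cube scaled to card proportions.
ISceneNode * node = smgr->addCubeSceneNode(1.0f);
node->setScale(vector3df(75, 107, 0.1f));

//The Tokamak side: a rigid body with a matching box geometry.
neRigidBody * body = sim->CreateRigidBody();
neGeometry * geom = body->AddGeometry();

neV3 cardSize;
cardSize.Set(75.0f, 107.0f, 0.1f);
geom->SetBoxSize(cardSize);
body->UpdateBoundingInfo();

//Give the card a mass and the matching inertia tensor for a box.
f32 mass = 0.1f;
body->SetInertiaTensor(neBoxInertiaTensor(cardSize, mass));
body->SetMass(mass);

//Start the body off where the scene node was placed.
//The initial facing direction can be set similarly with body->SetRotation().
neV3 bodyPos;
bodyPos.Set(pos.X, pos.Y, pos.Z);
body->SetPos(bodyPos);
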
In order to run the simulation, Tokamak requires you to provide an advance interval. To get a consistent feel, you'll need a timer object and to keep track of the elapsed time. Luckily, Irrlicht provides you with a timer object as well:
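
//Irrlicht's timer reports milliseconds since device creation.
ITimer * timer = device->getTimer();
u32 lastTime = timer->getTime();
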
You'll also need to ensure that your simulation runs consistently regardless of the amount of processing that goes on in your render loop. The code in the repository is an adaptation of code used by Adam Dawes on his site.
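The idea is to advance the physics in fixed-size steps no matter how long each frame takes. A sketch of it (not the exact code from the repository), using a 60 Hz physics step and the timer from above:

const f32 physicsStep = 1.0f / 60.0f; //Advance physics 60 times per second.
f32 accumulator = 0.0f;

//Inside the render loop:
u32 now = timer->getTime();
accumulator += (now - lastTime) / 1000.0f; //Milliseconds to seconds.
lastTime = now;

//Consume the accumulated time in fixed steps so the simulation
//behaves the same on fast and slow machines.
while(accumulator >= physicsStep)
{
    sim->Advance(physicsStep);
    accumulator -= physicsStep;
}
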
In your render loop, you'll then have to traverse your catalog of 3d objects in Tokamak and apply each body's position and rotation to the corresponding Irrlicht scene node.
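Assuming the bodies and their scene nodes were recorded side by side as the cards were created (two parallel arrays here, purely for illustration), the traversal could look like this:

for(s32 i = 0; i < cardCount; i++)
{
    //Copy the position straight across.
    neV3 p = bodies[i]->GetPos();
    nodes[i]->setPosition(vector3df(p[0], p[1], p[2]));

    //Copy Tokamak's 3x3 rotation matrix into an Irrlicht matrix4
    //and let Irrlicht extract the Euler angles from it.
    neM3 rot = bodies[i]->GetRotationM3();
    matrix4 m;
    for(s32 r = 0; r < 3; r++)
        for(s32 c = 0; c < 3; c++)
            m(r, c) = rot[r][c]; //If rotations come out mirrored, transpose to rot[c][r].

    nodes[i]->setRotation(m.getRotationDegrees());
}
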
There you go! You should now see your 3d objects, all initially facing in different directions, falling onto the floor, colliding with each other, and bouncing. The complete source file can be found on GitHub: example2.h

Tuesday, August 28, 2012

Introduction to 3d graphics programming using IrrLicht

IrrLicht is a free and open source 3d graphics library written in C++. I came across IrrLicht when I was looking for a graphics library that was simple to understand and easy to follow. I had tried the DirectX SDK under Windows a few years ago and remember having to blindly copy-paste code samples to get something to work, without being able to understand how the code in front of me worked. I had since read about Ogre3d and a couple of other graphics engines.

When I found IrrLicht, I was surprised by how soon I was able to get myself up and running. The few tutorials on their site covered enough ground to make me feel confident about spending more time experimenting with the library. In this post, I'll do a walkthrough of some code that I've written.
You'll need to download the IrrLicht library or compile it from source. IrrLicht is available for download from their website; it is also available from the official repositories of major Linux distributions. IrrLicht is cross-platform as well, so you'll be able to write 3d graphics applications for Linux, Mac, and Windows.

I use QtCreator to do C++ development, but you could use any text editor or IDE of your choice. Just remember to include the path to the IrrLicht headers and to link against the IrrLicht library.

Here is what my Qt project file looks like:



TEMPLATE = app
CONFIG += console
CONFIG -= qt

unix:!macx:!symbian: LIBS += -L/usr/lib/ -lIrrlicht

INCLUDEPATH += /usr/include/irrlicht
DEPENDPATH += /usr/include/irrlicht

SOURCES += \
    program.cpp



Add the following lines to your cpp file:

#include <irrlicht.h>
#include <iostream>

using namespace std;
using namespace irr;
using namespace core;
using namespace scene;
using namespace video;
using namespace io;

int main(int argc, char ** argv)
{
    //Create an Irrlicht Device.
    IrrlichtDevice * device = createDevice(EDT_OPENGL,dimension2d<u32>(800,600));

    //Get the Scene Manager from the device.
    ISceneManager * smgr = device->getSceneManager();

    //Get the Video Driver from the device.
    IVideoDriver * driver = device->getVideoDriver();

    //Add a Cube to the Scene.
    ISceneNode * node = smgr->addCubeSceneNode();

    //Needed to make the object's texture visible without a light source.
    node->setMaterialFlag(EMF_LIGHTING, false);

    //Add texture to the cube.
    node->setMaterialTexture(0,driver->getTexture("/usr/share/irrlicht/media/wall.jpg"));

    //Set cube 100 units further in forward direction (Z axis).
    node->setPosition(vector3df(0,0,100));

    //Add FPS Camera to allow movement using Keyboard and Mouse.
    smgr->addCameraSceneNodeFPS();

    //Run simulation
    while(device->run())
    {
        //Begin Scene with a gray backdrop #rgb(125,125,125)
        driver->beginScene(true,true,SColor(0,125,125,125));

        //Render the scene at this instant.
        smgr->drawAll();

        //End the scene
        driver->endScene();

        //Logic to update the scene will go here.
    }
    //Drop the device now that we are done with it.
    device->drop();
    return 0;
}


Compile and Run.
You should see a textured cube in the middle of the screen. You will also notice that, using your mouse and keyboard, you can move around the rendered world.



Let's extend this example to do some fancy stuff. How about loading pictures from your computer into a 3d world? Let's do just that. We'll add a routine to which we'll pass the path to a directory as an argument.

We'll traverse this directory for picture files, and for each picture file we find, we'll throw a cube on the screen and scale it to the right proportions.

#include <irrlicht.h>
#include <iostream>
#include <cstdlib>   //random()
#include <cstring>   //strcmp()
#include <strings.h> //strcasecmp() (POSIX)

using namespace std;
using namespace irr;
using namespace core;
using namespace scene;
using namespace video;
using namespace io;

#define PICSCOUNT 800 //Maximum number of pictures to load.

void processFolder(IrrlichtDevice * device, const path &newDirectory)
{
    //Get the File System from the device.
    IFileSystem * fs = device->getFileSystem();

    //Get the Scene Manager from the device.
    ISceneManager * smgr = device->getSceneManager();

    //Get the Video Driver from the device.
    IVideoDriver * driver = device->getVideoDriver();

    //If maximum number of pictures already loaded, then don't load anymore.
    if(driver->getTextureCount() >= PICSCOUNT)
    {
        return;
    }

    //Change working directory.
    fs->changeWorkingDirectoryTo(newDirectory);

    //Get List of files and sub folders.
    IFileList * flist = fs->createFileList();

    //Sort by file names and folder names.
    flist->sort();

    //Loop through the contents of the working directory.
    for(u32 i = 0; i < flist->getFileCount(); i++)
    {
        //If current item is a directory
        if(flist->isDirectory(i))
        {
            //and it is not "Parent Directory .."
            if(strcmp(flist->getFileName(i).c_str(),"..") != 0)
            {
                //process contents of the current sub directory
                processFolder(device,flist->getFullFileName(i));
            }
        }
        else //If current item is a file
        {
            //Get file name
            path filename = flist->getFileName(i);

            //Get extension from file name
            std::string extension = filename.subString(filename.size() - 4,4).c_str();

            //If file extension is .png
            if(strcasecmp(extension.data(),".png") == 0)
            {
                //Create a new cube node with unit dimensions
                ISceneNode * node = smgr->addCubeSceneNode(1.0f);

                //Scale the cube to the dimensions of our liking - 75x107x0.1
                node->setScale(vector3df(75,107,0.1f));

                //Set random X coordinate between -500 and 500
                long x = random()% 1000 - 500;

                //Set random Y coordinate between -500 and 500
                long y = random()% 1000 - 500;

                //Set random Z coordinate between -500 and 500
                long z = random()% 1000 - 500;

                //Create a position vector
                vector3df pos(x,y,z);

                //Change coordinates such that direction is preserved and length is 800 units
                pos = pos.normalize() * 800.0f;

                //Apply new position to cube
                node->setPosition(pos);

                //Get active camera
                ICameraSceneNode * cam = smgr->getActiveCamera();

                //Set camera to "look" at cube
                cam->setTarget(node->getPosition());

                //Apply camera's new rotation (as a result of above) to the node
                node->setRotation(cam->getRotation());

                //Make cube's texture visible without light
                node->setMaterialFlag(EMF_LIGHTING, false);

                //Set the file's graphics as texture to the cube
                node->setMaterialTexture(0,driver->getTexture(flist->getFullFileName(i).c_str()));

                //If maximum number of pictures already loaded, then don't load anymore.
                if(driver->getTextureCount() >= PICSCOUNT)
                {
                    return;
                }
            }
        }
    }
}

int main(int argc, char ** argv)
{
    //Create an Irrlicht Device.
    IrrlichtDevice * device = createDevice(EDT_OPENGL,dimension2d<u32>(800,600));

    //Get the Scene Manager from the device.
    ISceneManager * smgr = device->getSceneManager();

    //Get the Video Driver from the device.
    IVideoDriver * driver = device->getVideoDriver();

    //Add FPS Camera to allow movement using Keyboard and Mouse.
    smgr->addCameraSceneNodeFPS();

    //Process contents of this folder.
    processFolder(device, "/home/karim/Images/Cards-Large/150/");

    //Run simulation
    while(device->run())
    {
        //Begin Scene with a gray backdrop #rgb(125,125,125)
        driver->beginScene(true,true,SColor(0,125,125,125));

        //Render the scene at this instant.
        smgr->drawAll();

        //End the scene
        driver->endScene();

        //Logic to update the scene will go here.
    }
    //Drop the device now that we are done with it.
    device->drop();
    return 0;
}

The images will appear as though they are spread around and stuck on the inside/outside of a transparent sphere.



Let's change it a little. Make the following change:

From:

//Change coordinates such that direction is preserved and length is 800 units
pos = pos.normalize() * 800.0f;


To:

//Set Y coordinate to 0
pos.Y = 0;




The images will now appear as though they are spread around like Stonehenge. How about a little more fun? Make the following change:


From:

//Set Y coordinate to 0
pos.Y = 0;


To:

//Set Y coordinate to 0
//pos.Y = 0;




With the position no longer constrained, the images will now be scattered randomly through the space around you. I hope this tutorial proves helpful in getting you started on your journey through the fascinating world of 3d graphics programming.

Wednesday, August 22, 2012

3d world creation and simulation using Open Source tools and libraries

Over the past few weeks I've been looking into 3d world simulation. I checked out Ogre3d; however, I found it way too complex, with a steep learning curve. I then came across IrrLicht and was greatly impressed with its simplicity of use and how quickly it let me jump from following their tutorials to creating my own code. The tutorials on their site are well documented and very easy to follow. Among other things, the tutorials also address basic collision detection. Simple simulations were very easy to make; however, I ran into a roadblock when attempting to implement slightly more complex physics. I soon realized that while it was possible, I'd have to hand-code almost all the physics, for instance gravity, friction, and momentum.
After doing some research I came across the Tokamak physics engine. I found Tokamak surprisingly easy to learn; however, I found only a few examples for it. Right around the same time I also found Bullet Physics, which is very well documented, with plenty of tutorials and examples available online. Bullet is also a more comprehensive physics engine than Tokamak, and it comes with a slightly steeper learning curve. On the other hand, Tokamak can get you up and running in no time.
I am currently exploring IrrLicht with both Bullet and Tokamak in a couple of hobby projects. If the projects gain critical mass, I'll share the experience and perhaps example code in future posts.

Saturday, July 28, 2012

PCLinuxOS: A surprise addition to our family computers

We have 5 computers in our household. My dad, my wife, and I have a laptop each, and my mom uses a desktop that sits in my parents' bedroom. There is also a laptop that sits connected to our living room television. With the exception of that computer, we run Linux on all our computers. My mom and my wife use Kubuntu on their computers, while my dad and I have ArchLinux with KDE. The day before yesterday I realized that I hadn't run updates on my dad's laptop in a while, so in haste, I issued the following at the terminal:
$sudo pacman -Syu
I had forgotten about a recent advisory on the ArchLinux wiki about including "--ignore glibc" for the upgrade and upgrading glibc only after the system upgrade is complete. I am not sure if that was the reason or if it was something else, but it rendered the laptop in a non-upgradable state: passing "--ignore glibc" now caused other dependency issues. I figured it was time to give a more beginner-friendly Linux distribution a try for my dad's computer. I am very happy with ArchLinux on my own computer; however, for my dad's computer, I wanted something that would be easier for him to manage. The first distribution that came to mind was Zorin OS. Screenshots and features made it look very promising, and Zorin OS appears to be a very polished distribution. I had even started the download, only to notice a 1.4 GB iso that would take a very long time to download because of the limited number of mirror sites published on their site. Also, I didn't have a blank DVD available, which would be needed to make the iso usable on my dad's computer.

Spending some time on DistroWatch revealed some familiar distributions, but one stood out: PCLinuxOS. I recalled that REGLUE (formerly the HeliOS Project) uses PCLinuxOS on the computers they set up for disadvantaged kids in the Austin, TX area. I had also read a few articles in the past in the e-magazine that PCLinuxOS publishes regularly, and was generally impressed by the content they generated. So I decided to give PCLinuxOS/KDE a try. I read the details about the software included on the iso on their download page. Some of the software was slightly older than what I had gotten used to on Arch. However, I chose Arch for my laptop for exactly that reason. Moreover, since my dad only needed a laptop to check his email, create and review documents and spreadsheets, and use the Internet, I was not going to let a few slightly older versions of software prevent me from giving PCLinuxOS a try.

I downloaded the iso, burnt it to a blank CD, slapped it in the laptop, and fired up the LiveCD. While I was not very impressed with what I was looking at, I was not disappointed either. I did notice that the built-in wireless adapter, which had not been functioning correctly under ArchLinux, was now working perfectly fine. That was excuse enough for me to do a full install, and I proceeded with that in mind.

During the install, I was presented with the option to use existing partitions to set up PCLinuxOS. I decided to preserve the /home partition and reformat the / and /boot partitions, a feature I had come to depend on for quite some time now. With the /home partition intact, my dad would continue to have access to all his documents, audio files, and application settings. Once the install was complete, after the first boot, I was asked to set the default timezone and create the first user account. I set up my user account as the first user, just as it was before. I was then presented with a login splash screen; it was the same login splash that I had set up for my account when the laptop still had ArchLinux on it. I then set up my dad's account so he could log in and begin using the laptop.

Among the configuration utilities available to me when I logged in were firewall setup, network setup, video setup, and so on, all of which I found very easy to use. I also noticed that the root user was listed in the user selection at the login screen. I decided to log in using that account, and was presented with a popup greeting that reminded me of something I had forgotten: PCLinuxOS is a rolling release distribution, just like ArchLinux. The popup also instructed me to run a system update using the Synaptic Package Manager. I followed the instructions, did a full system upgrade, and restarted the computer after the upgrade. Once the system was upgraded with the latest packages, I was pleasantly surprised to see a more visually pleasing desktop than what I saw prior to the restart. The system looks much more polished now, and setting up the applications my dad had gotten familiar with over the months was a piece of cake.

I can now say with much confidence that we'll be keeping PCLinuxOS on at least this computer as a permanent distribution. It is now the number one Linux distribution that I will recommend to people as a first distribution to try.

Friday, July 20, 2012

Update: ArchLinux + ATi Catalyst issue

After doing some more reading about the ATi Catalyst 12.6 driver release and upgrade information, I realized that my hardware, the Radeon HD 6250, was indeed supported. The "Unsupported Hardware" watermark was probably a bug that will hopefully be fixed in one of the future releases, whenever that might be. For the time being, I did find a fix on the ArchLinux wiki page for ATi Catalyst that removes the watermark.

It involves running the following script:

#!/bin/sh
DRIVER=/usr/lib/xorg/modules/drivers/fglrx_drv.so
for x in $(objdump -d $DRIVER|awk '/call/&&/EnableLogo/{print "\\x"$2"\\x"$3"\\x"$4"\\x"$5"\\x"$6}'); do
 sed -i "s/$x/\x90\x90\x90\x90\x90/g" $DRIVER
done

Once this script is run and the X server is restarted, the watermark disappears. All other functionality, especially OpenGL rendering, seems to be intact. Good enough for me.

However, this highlights the problem with closed source, proprietary hardware drivers. Had the drivers been open, this issue would have been fixed not long after the drivers were released. Hopefully the open source "radeon" driver will catch up in performance and features with the proprietary drivers, and we won't have to worry about these kinds of issues in the future.

Friday, July 13, 2012

ArchLinux - Use at your own risk

Update: update-archlinux-ati-catalyst-issue.html

I have an HP Pavilion g series laptop running ArchLinux. With a 17" wide screen and a quad-core AMD A6 processor, I really have nothing to complain about. I love Arch. It allows me to be on the bleeding edge. Among all the other distros I have tried in the past, Arch gave me the most control without as much hassle. Today, I use ArchLinux exclusively on my laptop. And getting front row passes to the latest and greatest features of Linux almost makes me giddy. Except, I am stuck with the AMD Radeon graphics built into the APU.

AMD/ATi are not very keen on providing regular driver updates for the AMD Radeon chip in my computer. The open source driver lags far behind the proprietary driver in features. The proprietary driver relies on xorg-server version 1.11, while the latest xorg-server is version 1.12. Staying stuck with xorg-server 1.11 means that not only is my X server slightly behind on features and security updates, so are all the components that depend on it, for instance the XFree86 drivers for my input devices. During every system-wide update, I have to be extra careful not to update any of the XFree86 drivers, as there is no hard dependency on xorg-server established between the versions that I use. An incompatible input device driver renders my keyboard and touchpad unusable as soon as the X session takes over, regardless of whether I use the laptop keyboard and touchpad or an external keyboard and mouse.

This was a little easier until a few days ago. The version of the Catalyst driver available was 12.4, which had a hard dependency on xorg-server 1.11. All I had to worry about were the XFree86 drivers.

A few weeks ago, AMD released a new Catalyst driver for newer hardware, version 12.6. Unfortunately for me, my hardware was excluded from that driver and cannot use it. When I installed it without reading the Arch wiki page about the upgrade, I noticed that the bottom right corner of my screen displayed a watermark - Unsupported Hardware. Catalyst 12.6, however, is listed as an upgrade to Catalyst 12.4, so it shows up in my list of upgrades. Changing the pacman mirror for Catalyst resolved the issue, as the new mirror only had 12.4 available for download.

A few nights ago, I was in bed at around 10:30 PM, hoping to fall asleep right away. I decided that since I hadn't run a system update in the past few days, I'd go ahead and run a quick one before sleeping. I fired up Apper, a universal KDE package manager that works with pacman as well as apt-get for Debian based systems, and saw some 30 applications awaiting updates. I scrolled down the list and unchecked the usual suspects: xorg-server 1.12, etc. Somehow, I failed to notice Catalyst 12.6 in the list of updatable software and did not exclude it. I realized my mistake one X server restart later, when the nasty watermark became visible. I figured it was not that big a deal and fired up Apper with the intention of downgrading the Catalyst driver to 12.4. Except, it wasn't there. I decided to switch to the open source driver for the time being while I investigated what might have happened. The open source driver was installed, and I was at the login menu. All looked good, except my keyboard and touchpad wouldn't work. The open source driver has a dependency on the latest version of xorg-server, and I realized later that when I installed the open source driver, which caused the xorg-server upgrade, I hadn't selected the XFree86 synaptics input driver for the upgrade. The version of the input driver that remained installed had a dependency on the older version of xorg-server, and the mismatch rendered my laptop unusable as soon as the X session started.

Fast forward a couple of very frustrating hours, and I was able to break the boot process before the X session loaded to get terminal access to a half-ready system. Then, after remounting the root partition as read/write, I uninstalled xorg-server and all of its installed dependencies. A reboot later, into a more usable and now network-aware system, I installed the open source ati driver from the terminal. Once I was able to log into the KDE session, I fired up Apper again, this time being more careful about it. I manually downloaded the correct version of the Catalyst driver from the mirror site, uninstalled the open source driver, and downgraded xorg-server and its hard dependencies as well as the XFree86 input driver. I then installed the 12.4 version of the Catalyst driver using pacman and restarted the X server. I was now able to log in, use the system, and begin my efforts at documenting my experience.

Wednesday, February 8, 2012

Walking on "The Arch Way"

I had started down the Arch path once before. Back then, the intent was to revive old Fujitsu tablets. As soon as the live image fired up, I abandoned it. That was when I realized that Live CD doesn't mean Live CD with X. Yesterday I found myself researching Arch in my quest to find a lean Linux distribution on a rolling release. Arch was the answer, and it had the latest versions of all the software that I needed current.

Having used Linux for over six years, I was still intimidated by Arch because of what I had read about it. GUI installers are against "The Arch Way". There was this whole text-file-based configuration editing step that I had read about and was not too fond of. However, the part that scared me the most about the non-GUI installer was the disk partitioning. I have 100GB of data on my home partition that I wasn't going to sort through before installing Arch, and I was not too thrilled by the idea of backing that data up over USB either. Frankly, I felt that "The Arch Way" was getting in my way. But sticking with Kubuntu was no longer an option, and I couldn't find any other Linux distribution that was a viable alternative either. I decided to read through the beginner's guide on the ArchWiki and fire up a virtual machine to get a feel for the installation process and to see what the final product might look like.

To my surprise, the installation was nothing like what I had expected. It was very straightforward and methodical, and having already read the beginner's guide made it a lot easier to follow through. I made sure I familiarized myself with the partitioning utility during the installation. Besides the computer's host name, I did not have to change any of the defaults in the configuration editing step; the defaults were quite sufficient. I was able to follow the steps from the beginner's guide to set up X and KDE, and I was able to boot into a KDE session on the new Arch VM. To be certain, I tried this routine one more time and made some mental notes.

Since I was going to install Arch on my laptop, I decided not to hook up the ethernet cable for Internet, and rather do the post-installation updates over WiFi. I already had my wpa_supplicant configuration file in my home directory; it would come in handy during post-installation. I also made note of the UID of my user account under Ubuntu, which would surely come in handy once Arch was installed.

During the installation:
1. In the partitioning section, I made sure that I didn't reformat my home partition.
2. In the configuration editing step, I changed the HOSTNAME to what I had previously had under Ubuntu.
3. I skipped editing the network configuration, as that was needed only for ethernet-based updates.
4. In the package selection step, I selected sudo and all the network and wireless related packages.
After the installation and a reboot, I checked to see if my home partition was still intact. It was. I created a new user for my login and used the UID that I had when I was still running Ubuntu. I fired up the WiFi with the following commands:

wpa_supplicant -Dwext -i wlan0 -cwpa.conf -B
dhcpcd

I was connected to the Internet through WiFi, which I validated by issuing a ping to www.google.com. I modified the /etc/pacman.d/mirrorlist file to uncomment the mirrors for my updates. After that, I ran the pacman update command:

pacman -Syu

This updated the pacman database and started the initial update process. Once the update was done, I followed the instructions from the beginner's guide to install X and then KDE. I followed the KDE guide on the wiki to set up kdm as the display manager. Once KDE was installed, I rebooted. After the reboot, I was presented with a graphical login screen. Once inside the KDE session, I had to restart wpa_supplicant for WiFi. I also realized that the network daemon was slowing the boot process, so I disabled it in the rc.conf file.

The wiki mentioned graphical tools for package management, and KPackageKit/Apper was one of them. Since I had used Apper with Kubuntu and was familiar with its functionality, I decided to install it. Installing Apper was probably the trickiest thing to figure out during this entire exercise, but I was able to install it from the AUR. Once Apper was installed, installing other software became a piece of cake. I installed all the plasma widgets and plasmoids. This enabled me to use the Network Management plasma widget that I was accustomed to under Kubuntu. Since it needed NetworkManager to function, I installed the NetworkManager daemon and enabled it in the rc.conf file. On the next reboot, I had networking, and I could get on the WiFi using KDE's network management settings.

I did realize that while I had my home folder from the previous setup, my desktop and KDE settings had disappeared. Since I knew that Kubuntu stored user-level configuration under ~/.kde/share, I decided to take a look. I found that there was a ~/.kde4 folder under home along with ~/.kde. That was it: Arch was using the ~/.kde4 folder instead of ~/.kde, which was why my previous settings had not taken effect in the new setup. I copied the share folder from .kde to .kde4 and logged out. After logging back in, I was in a familiar workspace.

With all my software installed under ArchLinux and my original KDE settings restored, it feels like I never switched distributions.

One of the things that I had forgotten to do was to create a group with the same name as the user, as Kubuntu does. That caused a temporary permissions issue, which I was able to resolve without much difficulty.

I hope this information will be helpful to those who are considering a move to a different distribution but may be intimidated by what they have read about it.

Good bye Kubuntu

I never thought it would come to this. I was content with what I had. I had been using Kubuntu full time since their 7.04 release. Kubuntu was the realization of my love for the KDE desktop environment, as it offered near-latest builds of the KDE desktop and all of the software that I needed for everyday computing. I was a happy camper.

While I was already familiar with Linux, the KDE desktop was what converted me into a full-time Linux user, and Kubuntu was the only distribution I had found that implemented KDE well. So the decision to switch away from Kubuntu to another KDE based desktop was a difficult one. There were three factors that led me to consider an alternative KDE distribution.

Until a few months ago, I used my computer only for regular everyday tasks: messaging, surfing, e-banking, online shopping, etc. I did some programming, but that was mostly as a hobby. A few months ago I started spending more time doing application development for more than just recreation. I mostly develop using the Mono framework on my Linux machine. Since Mono is integrated with Ubuntu, Canonical, to ensure stability, does not offer updates to Mono very often. Mono packages for Ubuntu lag behind the official Mono releases, and recently the gap has only widened. Since Mono itself, by design, lags behind the latest .Net Framework, I have had to compile it from source on my laptop on a number of occasions to be able to utilize the power of the latest Mono release. I've had to do this after every Ubuntu upgrade, every six months, since the upgrade would cause some of the dependencies to be overwritten.

Canonical releases Ubuntu on a schedule, with one release in April and the other in October. This means that while the system receives regular updates, major features and enhancements arrive only with those scheduled releases. These are not only the features and enhancements that Canonical might include in new releases of Ubuntu, but also enhancements to the desktop environment, the Linux kernel, and so on. Sometimes, waiting for a full release in order to get some of these enhancements doesn't seem justifiable.

I would have still continued using Kubuntu if it weren't for the news that I came across a couple of days ago: Canonical will be discontinuing funding for the development of Kubuntu. For some perspective, Canonical had one paid full-time developer responsible for the KDE implementation in Kubuntu, whom they will no longer fund for the effort; Canonical will continue to provide only infrastructure support to Kubuntu. I am sure that this does not mean the end for Kubuntu. But it might mean that KDE related updates to Kubuntu become more infrequent over time, and due to the lack of full-time development resources from Canonical, KDE related maintenance might be taken over by the community.

I did some research on different Linux distributions to replace Kubuntu on my laptop. The criteria were set: I needed a distribution that would allow me to easily install and upgrade to the latest versions of software, especially the Mono framework and KDE, along with the latest Linux kernel updates. After reviewing my choices against the criteria, there was only one clear winner - ArchLinux. I chose Arch over other Linux distributions because of its rolling release model and the availability of the latest versions of the software that I use almost every day. I now have KDE 4.8 and Mono 2.10.8 installed under ArchLinux on my laptop. After installing Apper for package management, I've found myself on familiar ground again. The transition was as smooth as one could hope for, and I am already beginning to enjoy my new distribution.

I will always have fond memories of Kubuntu though, as the distribution through which I first experienced Software Freedom. But it is time to move on.

Friday, February 3, 2012

iPads in Education - Jeff Hoogland's take

I came across this post yesterday from Jeff Hoogland on how he realized his Asus T101MT tablet does so much more than an iPad. It is titled Confused about iPads in Education (http://jeffhoogland.blogspot.com/2012/02/confused-about-ipads-in-education.html)

Sunday, January 15, 2012

Implementing ASP.NET MVC like NameValueCollection translation to method parameters

If you've programmed with ASP.NET MVC, you know about a neat feature where form post data or the query string, which is string data, is automagically converted to strongly typed objects when it is passed to a controller action.

For instance, consider the following code:

public class HomeController : Controller
{
     public ActionResult ProductQuery(string Name, int Code)
     {
     ...
     }
}

A web request to http://<server>:<port>/<AppRoot>/Home/ProductQuery?Name=123&Code=456 will be properly handled, and Name will be treated as a string while Code will be treated as an integer value.

Also consider the following code:

// UserRegFormData.cs
public class UserRegFormData
{
     public string Name{get;set;}
     public DateTime Dob{get;set;}
}

// HomeController.cs
public ActionResult Register(UserRegFormData urfd)
{
...
}

You could now set up an html form with action="/Home/Register", method="post", a text field with name="urfd.Name", and another one with name="urfd.Dob". When this form is submitted with data in the two text fields, ASP.NET MVC will automatically create a strongly typed UserRegFormData object with the correct values for its properties. This wasn't the case before ASP.NET MVC: you would receive all the values as strings, and you were responsible for doing the conversions in your ASP.NET server-side code.
Well I've been thinking about how this could be done without using ASP.NET MVC.

Why reinvent the wheel, you might ask. The answer is twofold. Firstly, there might be situations where ASP.NET MVC is just not a suitable solution, a non-web application, perhaps. This type of routing and smart parameter translation would also be useful in scenarios where the input contains both the parameters and the operation to perform on them. Secondly, I have been fascinated by this very approach ever since I discovered it in ASP.NET MVC. When I recently came across the Managed Extensibility Framework (MEF), as it is packaged with .Net 4, I wondered if it could be used to create an MVC framework from scratch. It would be a good learning opportunity and I wanted to seize it.

While I made some progress with routing URLs to methods on controller objects, I was faced with the issue of handling the input: the input data would not translate itself into strongly typed objects. A few searches on the Internet revealed that there was nothing readily available for this, and I really didn't want to dig through the ASP.NET MVC code to see how it was implemented there.

I did an Internet search on how to convert string objects to other object types, and a StackOverflow article provided the solution.

Another search, on how to assign property values using reflection, yielded a post on DotNetSpider.

I found out how to create strongly typed arrays from an article on Byte.com.

I had most of what I needed to get started.
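
To give a flavor of the approach those posts pointed me to, here is a minimal sketch (not the code from the repository) of converting strings to arbitrary target types and assigning them to properties via reflection; TypeDescriptor handles primitives as well as types like DateTime:

// StringConverter.cs
using System;
using System.ComponentModel;
using System.Reflection;

public static class StringConverter
{
     //Convert a raw string to the requested target type.
     public static object ConvertTo(string value, Type targetType)
     {
          return TypeDescriptor.GetConverter(targetType)
                               .ConvertFromInvariantString(value);
     }

     //Assign a string value to a named property, converting it first.
     public static void SetProperty(object target, string propertyName, string value)
     {
          PropertyInfo prop = target.GetType().GetProperty(propertyName);
          prop.SetValue(target, ConvertTo(value, prop.PropertyType), null);
     }
}

// Usage, reusing UserRegFormData from above:
// var urfd = new UserRegFormData();
// StringConverter.SetProperty(urfd, "Name", "John");
// StringConverter.SetProperty(urfd, "Dob", "1984-07-01");
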

I have uploaded the initial code to my GitHub repository; the code is still crude and has not yet undergone a refactoring exercise: https://github.com/jimmy00784/MEFExample/tree/master/StringToArgumentsTest

Monday, January 9, 2012

RavenDB on Linux - Update

I spent some more time with the RavenDB source code, trying to figure out what might have been causing the runtime errors that had led me to comment out the "SatisfyImportsOnce" line and supply code to manually load the MEF exports.

It turns out that some of the imports were not being satisfied. The one place in particular was in the OAuth code, in the Raven.Database/Server/Security/OAuth/OAuthClientCredentialsTokenResponder.cs file. The member IAuthenticateClient AuthenticateClient was expecting an import of type IAuthenticateClient, which was not being satisfied.

I reverted the changes I had made in connection with disabling SatisfyImportsOnce and loading exports manually, rebuilt the project, and fired up the server application. I was presented with the same nasty stack trace.
I then commented out the Import attribute on AuthenticateClient, rebuilt the project, and tried running the server a second time. It worked!

There were other similar instances in the code where imports were not being satisfied with corresponding exports; I learnt this from running the xUnit tests on the application. It wasn't making sense. RavenDB was supposed to be a complete solution.

I did a filesystem search for AuthenticateClient under the solution's root folder, and sure enough, I found matches in C# code files that were not part of the Raven.sln file. These files, and many more, were under the Bundles folder, in their own solution. I compiled these projects; Raven.Bundles.Tests did not build due to some issues, mono or monodevelop specific, I assume.

I copied the generated dll files into the Bundles/Plugins folder and set its path as the value of the "Raven/PluginsDirectory" key in the App.config of the Raven.Server project.
I then uncommented the Import attribute, rebuilt the solution, and fired up the server a third time. It worked this time as well.

Next, I'll try to re-run some of the xUnit tests that had failed earlier to see how much ground can be covered out of the 1160 tests that came packaged with RavenDB.


This article is part of the series NoSQL - RavenDB on Linux. The series contains the following articles:
NoSQL - RavenDB on Linux
Open Source Shines - RavenDB on Linux
RavenDB on Linux - Source Code
RavenDB on Linux - Update

Tuesday, January 3, 2012

It's 2012!!!

Have a Happy, Prosperous, and an Open-Source New Year!!!
OK, that last bit doesn't make any sense, but you get it, don't you :)