Wednesday 30 December 2009

nVIDIA CUDA

Started a new project in my spare time: trying to add CUDA functionality to rarcrack, a password recovery program for RAR files, after finding an old encrypted RAR file whose password I'd forgotten. RAR uses 128-bit AES as far as I know, so it should be interesting to see what this CUDA technology is all about and how easy it is to use.

Tuesday 17 November 2009

CentOS, Nagios and SELinux

I've just finished installing Nagios on our CentOS box. I kept on running into errors when trying to get Nagios to start. The errors weren't very specific, merely stating

Starting nagios:CONFIG ERROR! Start aborted. Check your Nagios configuration.

This confused the crap out of me, mostly because of the lack of information in the error messages. So after a few hours of googling, I found that the problem was with SELinux. This is the fourth or fifth time that this piece of software has gotten in the way of installs. It's no wonder, then, that the first piece of advice given on forums and mailing lists is to set 'SELINUX=disabled'.

This of course ends up voiding any security advantage that SELinux may have had, but apparently everyone seems to think that Linux is secure enough without it.
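For what it's worth, the less drastic fix is to check the audit log for the denials and build a local policy module for whatever SELinux is blocking. A rough sketch, assuming the audit2allow/semodule tools are installed ('mynagios' is just a module name I made up):

grep denied /var/log/audit/audit.log

# generate and load a local policy module covering the nagios denials
grep nagios /var/log/audit/audit.log | audit2allow -M mynagios
semodule -i mynagios.pp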

Friday 6 November 2009

Linux and Remote Desktop

Recently we installed a remote server in North America. In order to set up some Windows boxes, we figured it would be easier to download the DVD images from the servers themselves. This posed a problem, however, for several reasons:

1. XenServer requires that any Linux guests run a kernel specially compiled for Xen. This is due to the way the Xen guys wrote the code for the hypervisor. From my understanding (and without going through the lkml), the code was not very 'clean' and/or did not conform to the kernel coding guidelines, with the result that Xen support was not available in the standard kernel that comes with most distributions.

2. Figuring out how to add an ISO repository and actually have its contents come up when installing a new VM is harder than it probably should be on Xen. When trying to figure this out, I found plenty of examples of how to add a new iSCSI/CIFS repository to the XenServer, but nothing on how to set up an ISO repository on the local storage (I eventually pieced the incantation together; see the sketch after this list).

3. Apparently there's also a way of uploading a Windows XP VM, but this was more complicated to get going than adding the local ISO repository.
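For what it's worth, the local ISO repository came down to one command from the XenServer console. A sketch; the location directory is just somewhere on local storage that you've set aside for ISOs:

xe sr-create name-label="Local ISOs" type=iso content-type=iso \
   device-config:location=/var/opt/xen/iso_import \
   device-config:legacy_mode=true

Once the SR exists and the ISOs have been copied into that directory, they show up when creating a new VM.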

Due to these problems it was decided to take one of the existing distributions (Debian Lenny), install a desktop environment and enable Remote Desktop functionality through one of the technologies listed below. After pissing about for two or three days with these protocols, I can honestly say that the NX protocol, using the binaries available off the NoMachine site, is the easiest to configure and the best working solution (install sketch below). This is kind of disappointing for me, since I would have loved to see a FLOSS solution outdo a proprietary one, but as a programmer you have to give credit where credit is due.

When it comes to Remote Desktop functionality for Linux, there are a few options regarding protocols:

* XDMCP - Basically the original X idea of remote displays hooked up to a mainframe. Couldn't get it to work for the life of me. Also found a known, unfixed bug whereby enabling remote login makes gdm listen only on IPv6 addresses
* VNC - Hard to set up and slow
* RDP - See VNC
* NX (FreeNX) - Awesome. Although FreeNX is a real bastard to set up
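For completeness, getting the NoMachine server onto the Debian box was just a matter of installing their three .debs in the right order. A sketch; the exact file names depend on the version you download from the NoMachine site:

# the order matters: client, then node, then server
sudo dpkg -i nxclient_3.*.deb
sudo dpkg -i nxnode_3.*.deb
sudo dpkg -i nxserver_3.*.deb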

UPDATE: Found out that Google released their own NX Server called Neatx.

Monday 2 November 2009

Zend CE Server on CentOS

I just finished installing Zend Server Community Edition on CentOS. One of the problems encountered along the way was that starting Apache would give an error saying that it did not have the necessary permissions to access '/usr/local/zend/lib/apache2/libphp5.so'. This is due to SELinux not allowing the httpd process access to this file. After searching around for a solution, I found that most of the forum posts gave the advice to either switch SELinux to permissive mode or to disable it altogether. To me, this was a bit like throwing the baby out with the bathwater, so after reading up a bit on SELinux, I figured an easier (and more secure) solution was to just label this file with the 'httpd_modules_t' type. So, here you are:

chcon -t httpd_modules_t /usr/local/zend/lib/apache2/libphp5.so

This command labels the Zend PHP module with the 'httpd_modules_t' type, allowing Apache to load it and start up normally.
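One caveat: chcon changes can be lost if the filesystem is ever relabelled. If the policycoreutils tools are installed, the persistent way (a sketch, same path as above) is to record the context and then relabel the file:

semanage fcontext -a -t httpd_modules_t '/usr/local/zend/lib/apache2/libphp5.so'
restorecon -v /usr/local/zend/lib/apache2/libphp5.so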

Thursday 10 September 2009

MySQL Bug #11918

In the software industry, every now and then you come across some indescribably dense acts of stupidity that cost you hours of debugging time. When it's not your own code that's the problem, you get to rant about it on your blog ;)

Today I spent four hours trying to create a MySQL stored procedure to do something I thought was quite basic: take an integer as input, return that number of rows from a specific table and, at the end, delete those rows. Kind of like a stack data structure (the rows 'pop' out), but with a database. So I sat down and wrote this code:

DELIMITER $$

DROP PROCEDURE IF EXISTS `test`.`testGet`$$
CREATE PROCEDURE `test`.`testGet` (IN numRecords INT)
BEGIN
    CREATE TEMPORARY TABLE tmpTable
    SELECT * FROM test LIMIT numRecords;

    DELETE FROM test WHERE idTest IN (SELECT idTest FROM tmpTable);

    SELECT * FROM tmpTable;

    DROP TABLE tmpTable;
END$$

DELIMITER ;

So after saving this code and trying to run it I get this cryptic error:

You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'numRecords;

DELETE FROM test WHERE idTest IN (SELECT idTest FROM tmpTable);

' at line 4 (errno: 1064)

I spent the next two hours trying to figure out what MySQL was complaining about, pulling my hair out, trying to see which part of the code was screwing up... finally, after much googling, I came across this bug report on the MySQL bug tracker, the title of which simply reads "SP does not accept variables in LIMIT clause". The bug has been sitting in the tracker since the 13th of July 2005, FOUR YEARS this last July. The report is full of comments with detailed workarounds, most of which end up being variations on the 'EXECUTE stmt USING' prepared-statement approach. There's a big discussion in the comments as to whether this counts as a bug or a feature request; meanwhile Oracle and SQL Server continue to dominate the marketplace. Nice going guys. I'm stupefied that this bug has been hanging around for four years and is still not fixed. I'm tempted to download the source and have a look at how difficult it would be to patch.
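For reference, the workaround boils down to building the statement as a string and running it as a prepared statement, since PREPARE doesn't mind the variable. A sketch (the bug report has several variants):

DELIMITER $$

DROP PROCEDURE IF EXISTS `test`.`testGet`$$
CREATE PROCEDURE `test`.`testGet` (IN numRecords INT)
BEGIN
    -- build the offending statement as a string, then prepare/execute it
    SET @sql = CONCAT('CREATE TEMPORARY TABLE tmpTable SELECT * FROM test LIMIT ', numRecords);
    PREPARE stmt FROM @sql;
    EXECUTE stmt;
    DEALLOCATE PREPARE stmt;

    DELETE FROM test WHERE idTest IN (SELECT idTest FROM tmpTable);

    SELECT * FROM tmpTable;

    DROP TABLE tmpTable;
END$$

DELIMITER ;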

Sunday 6 September 2009

Eclipse Opening Tutorial woes

I've been looking around for a good all-around development environment for a while now, and today I decided to give Eclipse a go. I installed the eclipse package from the Ubuntu 9.04 repositories and everything seemed to be working well. I was a little disappointed that the Ubuntu repositories only had version 3.2 of Eclipse when the main stable release was at 3.5, but this didn't bother me too much. I opened up Eclipse and decided to have a go at the first two tutorials, hoping to get an idea of what this IDE's all about.

The first tutorial goes through the process of creating a Java project with the ambitious aim of printing 'Hello World!' to the command line. Going through it was quite straightforward, teaching you how to create a new Java project and add classes to it.

The second tutorial was the same as the first, except that instead of printing 'Hello World!' to the command line, you print it to a window. To do this you use SWT, downloading it from the Eclipse project site and making it part of your project's runtime. The tutorial went fine until it came to running the code, when I ran into the following error:

Exception in thread "main" java.lang.UnsatisfiedLinkError: no swt-gtk-3550 or swt-gtk in swt.library.path, java.library.path or the jar file
at org.eclipse.swt.internal.Library.loadLibrary(Unknown Source)
at org.eclipse.swt.internal.Library.loadLibrary(Unknown Source)
at org.eclipse.swt.internal.C.<clinit>(Unknown Source)
...

This is extremely disappointing. I mean, this is meant to be the second tutorial that the user works through; it should be bulletproof. The barrier to getting up and running on Eclipse should not be so high that you have to go out of your way to learn the intricacies of SWT. So far, not a very good start for Eclipse.

Update: Got the second tutorial working. Instead of adding the SWT project off of the official site, I added the 'swt-gtk.jar' library found under /usr/lib/java on my Ubuntu install. Still not too happy about having to go to these kinds of lengths to get the first couple of examples up and running, but at least I'm learning :)
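If you'd rather check the fix from the command line than fight the IDE, it looks roughly like this. A sketch: 'HelloWorldSWT' stands in for whatever the tutorial class is called, and /usr/lib/jni is where the native SWT libraries sit on my install:

# compile and run against the distro's SWT instead of the downloaded one
javac -classpath /usr/lib/java/swt-gtk.jar HelloWorldSWT.java
java -classpath .:/usr/lib/java/swt-gtk.jar \
     -Djava.library.path=/usr/lib/jni HelloWorldSWT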

Saturday 5 September 2009

Don't write a single line of code without a Specification

Today I was updating one of our servers and naturally had a lot of free time while I waited for everything to download and install, so I started reading the 12 Steps to Better Code (also known as the Joel Test) article on the joelonsoftware blog.

The article makes some really good points about software development and how to do it right. From a programmer's perspective it also gives you clues as to what to look for in a company you're potentially going to be working for.

After reading through the article I clicked on one of the links and found myself reading the full 4-part series on functional specifications, which goes into much more depth than the original 12 Steps to Better Code: why you need a spec, how to write one, who should write it, and some general tips.

The series was timely for me personally, because our company has hired a new contractor to take on a project that's been dragging on for years, and one of the first things he did was write a functional specification and a technical specification (although the functional spec is actually referred to as the 'functional requirements'; po-ta-to, poh-ta-to). The amazing thing is that this first, relatively simple step has impressed the hell out of management. Finally they have a document they can refer to to see the full functionality of the finished product, and they can start answering clients' questions about what new features the program will have. They also look at the document and get a more concrete idea of which stage the project is at, although that isn't really the idea behind functional specifications. The other point that really hit home for me was how putting the functional and technical specifications in their respective documents minimizes the interruptions to your own work: instead of having management, marketing, QA people etc. bothering you all day with questions about how each part of the finished product will work, you can just refer them to the document. I would love it if I could answer all of the annoying questions I get throughout the day with "Have you read the spec?".

Overall, the message of "don't write a single line of code until you have a spec" is a powerful one, now if only I had the self-discipline to stick to it :)

Friday 4 September 2009

Installing 64-bit Adobe Flash on Ubuntu

Adobe Labs released their alpha Flash Player 10 for 64-bit Linux in November 2008. To install it, go to the download site, download the .tar.gz file and extract it; the archive contains just the libflashplayer.so. The install instructions say to place this file in ~/.mozilla/plugins and restart Firefox. However, when I tried this I found it didn't work. To make the plugin load, I placed it under /usr/lib/mozilla/plugins/ instead, and this did the trick.
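In full, the steps that worked for me look something like this (the exact tarball name varies with the alpha release you grab):

# extract the plugin and drop it into the system-wide plugin directory
tar -xzf libflashplayer-10.0.*.linux-x86_64.so.tar.gz
sudo cp libflashplayer.so /usr/lib/mozilla/plugins/
# restart Firefox afterwards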

While most advice recommends installing the 32-bit version of Flash along with an emulation layer, I've found the 64-bit player works better than the 32-bit version and is generally quite stable.

Tuesday 25 August 2009

Folding@Home

I've recently joined the Folding@home project under the name srkiNZ84. Folding@home is a massively distributed computing project for simulating protein folding (anyone remember SETI@home?). I've joined TeamUbuntu and started crunching my way through those work units.

At first what got me intrigued with this project was just trying to benchmark my computer(s), but now what really keeps me interested in it is the competition. You see, with this project you get to keep statistics on how many work units you've completed, in what kind of time frame etc... and every member of the project is ranked. This gives rise to cool personal and team statistics screens like this one from extremeoverclocking or this one from xtreme cpu. Personally I think the competition is one of the main reasons that the project has so many followers.

Right now I'm ranked 455th in my team and 171,330th overall. This isn't too bad considering I've only started. However, the only real way I'm going to get into the top 10% or so of the rankings is by making use of a high-performance GPU client. These are versions of the Folding@home software that run on your graphics card and make use of its parallel processing capabilities to complete work units in a much shorter space of time. ATI were the first to produce a client for their line of graphics cards, but were soon followed by nVIDIA, whose much better CUDA performance blew them away. I'm not even sure that there is a GPU client for Linux (at least not on the official Folding@home website), but I haven't been through the whole recruitment post on the Ubuntu forums, so I'm hoping there are some clues on there.

Thursday 25 June 2009

Signature Errors on Ubuntu repositories

On all of the Ubuntu installs I've ever had, I almost always eventually run into the following error when running the 'apt-get update' command or hitting the 'Reload' button in Synaptic.

Error:
W: GPG error: http://nz.archive.ubuntu.com hardy-updates Release: The following signatures were invalid: BADSIG 40976EAF437D05B5 Ubuntu Archive Automatic Signing Key
W: You may want to run apt-get update to correct these problems

From what I understand, the problem comes from apt downloading an incomplete or corrupt signature file, meaning that the signature then doesn't match that of the packages. The fix at the moment is simply to delete the corrupt file and download it again. To do this, run the following command:

sudo rm /var/lib/apt/lists/partial/*

Afterwards, running 'apt-get update' no longer returns the error.

Saturday 20 June 2009

Setting up a VPS - Part 3 - Ruby on Rails

One of the things I wanted to do with this VPS was to have a go at getting a Ruby on Rails environment going and seeing what the hype was all about. This turned out to be more trouble than I'd originally thought: after following the RoR install guide found here, the machine kept crashing whenever I got to the part about updating RubyGems, i.e.

sudo gem update --system

This was finally traced to the fact that the VPS only had 128MB of memory and no swap space. After adding another 128MB of memory and dedicating 512MB of swap space, the update finished fine and I had a Rails environment. The whole issue did take a while to sort out though, mostly because I was busy at work and often couldn't reply to HostingDirect straight away.
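If your host can't add swap for you, a swap file is the quick fix. A sketch (the 512MB size and the path are arbitrary):

# create, format and enable a 512MB swap file
sudo dd if=/dev/zero of=/swapfile bs=1M count=512
sudo mkswap /swapfile
sudo swapon /swapfile
# to make it permanent, add to /etc/fstab:
# /swapfile none swap sw 0 0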

Thursday 11 June 2009

Setting up a VPS - Part 2 - Postfix Virtual Domain/Users

The next step in setting up the VPS was installing and configuring the mail server. For this job I've gone with the current king of MTAs - Postfix. The basic approach I've taken is to start simple and then add functionality bit by bit, basically following the guide found here. I've ended up with support for virtual domains (separate domains) and virtual users (non-UNIX users) with a flat file backend (see the sketch below). I don't have much to add to the tutorial, except to point out that where it says 'virtual_uid_maps = static:5000', this means that the process delivering the message (i.e. writing to disk and creating any files/folders necessary) will run as this user. So there's no point in setting it to 5000 unless there is a user with that id which has write access to the virtual domain folder.
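For reference, the flat-file virtual setup boils down to a handful of main.cf parameters along these lines (the domain, paths and uid are examples; after editing the vmailbox file, run 'postmap /etc/postfix/vmailbox' to build the hash):

# main.cf -- virtual domains and users with a flat file backend
virtual_mailbox_domains = example.com
virtual_mailbox_base = /var/mail/vhosts
virtual_mailbox_maps = hash:/etc/postfix/vmailbox
virtual_uid_maps = static:5000
virtual_gid_maps = static:5000

# /etc/postfix/vmailbox -- one mailbox per line, e.g.
# info@example.com    example.com/info/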

I've also had to add some directives to prevent the mail server from being flooded with spam: directives which check that the connecting server has a FQDN, as well as checking that its IP address isn't on any blacklists. i.e.

# Wait until the RCPT TO command before evaluating restrictions
smtpd_delay_reject = yes

# Basics Restrictions
smtpd_helo_required = yes
strict_rfc821_envelopes = yes

# Requirements for the connecting server
smtpd_client_restrictions =
permit_mynetworks,
permit_sasl_authenticated,
reject_rbl_client bl.spamcop.net,
reject_rbl_client dnsbl.njabl.org,
reject_rbl_client cbl.abuseat.org,
reject_rbl_client sbl-xbl.spamhaus.org,
reject_rbl_client list.dsbl.org,
permit

# Requirements for the HELO statement
smtpd_helo_restrictions =
permit_mynetworks,
permit_sasl_authenticated,
reject_non_fqdn_hostname,
reject_invalid_hostname,
permit

# Requirements for the sender address
smtpd_sender_restrictions =
permit_mynetworks,
permit_sasl_authenticated,
reject_non_fqdn_sender,
reject_unknown_sender_domain,
permit

# Requirement for the recipient address
smtpd_recipient_restrictions =
permit_mynetworks,
permit_sasl_authenticated,
reject_non_fqdn_recipient,
reject_unknown_recipient_domain,
reject_unauth_destination,
permit

These directives originally came from the email section of an article on howtoforge.com about setting up Mandriva Directory Server.

There's still a lot of work to go with setting up this email server. I haven't even got to setting up Dovecot and SASL. Then I want to set up Amavis and combine it with ClamAV and SpamAssassin (with Bayesian filtering and feedback). I also need to set up DKIM, both for signing mail coming from the server and for checking incoming DKIM messages, and of course, as always, there's a need for a decent web front end to enable you to check your mail. I've been hearing good things about Google Apps, but I don't know anyone that's set it up on their own servers. I wonder whether that's even possible or whether you have to use Google's mail servers?

So many technologies, so little time... and this is only setting up the email :)

Monday 1 June 2009

Setting up a VPS - Part 1 - Hosting, SSH Security and ntp

Got a VPS from an outfit here in NZ called HostingDirect. Opted for Ubuntu 64-bit edition with the Small VPS package (128MB RAM, 10GB disk, 1 IP address). Also got domain registration with them (cheapest in NZ), which comes with free website hosting, which is nice.

The configurable options in the VPS setup allowed you to select a LAMP setup for $150, an email server (SMTP, POP3, IMAP) for $60 and security tools for $45. I thought these prices were a bit steep, especially since the Small VPS package itself only costs $25/month after GST, but then I reminded myself what I charge for setting up such systems and it made sense. I didn't opt for any of these services, preferring to set them up myself.

So the VPS was provisioned on the afternoon of the 28th, but I didn't have time to start configuring it until that night when I came home. By the time I started having a look at it, there were already signs of brute force attacks on the ssh server. So the first thing I did was create a new non-root user and add him to the 'admin' group, which was already set up in the sudoers file (mimicking the typical Ubuntu setup). From there I disabled root ssh logins and changed the ssh port to 222. Later I changed the ssh port back to the standard 22 and installed a great piece of software I found called 'fail2ban', which temporarily bans hosts based on the number of unsuccessful login attempts (see the sketch below).
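The fail2ban defaults are close to sensible out of the box; the relevant bits of /etc/fail2ban/jail.conf look something like this (the values below are examples, not necessarily what ships):

[ssh]
enabled  = true
port     = ssh
filter   = sshd
logpath  = /var/log/auth.log
maxretry = 5
bantime  = 600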

Before sorting out the ssh server and fail2ban, I did the obligatory 'apt-get update' followed by an 'apt-get upgrade', which all ran fine. I also did a check on the version of Ubuntu and the kernel, with the following results:

$ lsb_release -a
No LSB modules are available.
Distributor ID: Ubuntu
Description: Ubuntu 8.04.2
Release: 8.04
Codename: hardy


$ uname -a
Linux example.org 2.6.24-23-xen #1 SMP Mon Jan 26 03:09:12 UTC 2009 x86_64 GNU/Linux

So I ended up with Ubuntu 8.04 LTS 64-bit, which is exactly what I wanted. Shopping around NZ VPS sellers, I found that a lot of them offered Ubuntu 7.10, which I found strange; I would have thought more people would prefer the long term support release. Maybe it's something to do with the stability of each release running on Xen.

The next thing to set up was the ntp daemon, which was quite straightforward and only involved adding the line 'server nz.pool.ntp.org' to the '/etc/ntp.conf' file and restarting the ntp daemon.
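In command form (assuming the init script is called 'ntp', as it is on my install):

echo 'server nz.pool.ntp.org' | sudo tee -a /etc/ntp.conf
sudo /etc/init.d/ntp restart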

The VPS also came with access to XenShell, which is a way to administer your VPS through Xen (kind of like VMware's server console). I've never worked with XenShell before, so I'll have to look for a good tutorial to figure out how to make use of this tool.

That's all for today. It's late now, and tomorrow I'll start setting up Postfix and all the necessary extras, a task much better attempted with a clear head.

Wednesday 20 May 2009

Linux Sysadmin Tools

I just found this really useful site, the 'Ubuntu Server Guide'. It gives a nice general overview of how to set up and configure most of the main uses of your Ubuntu server. There are a few gems in there, like a really useful little package called etckeeper, which is basically a version control system for your '/etc' directory, meaning that you always have backups of your configuration and you can tell exactly when changes to the config were made and by whom. This seems like such a simple yet brilliant idea, I can't believe the package isn't standard on every distro. It reminds me of Sun's (now Oracle's) ZFS and its ability to take snapshots and 'roll back'.
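Getting started with it is about as simple as it sounds. A sketch (etckeeper picks its VCS from /etc/etckeeper/etckeeper.conf; I'm assuming the default is fine):

sudo apt-get install etckeeper
sudo etckeeper init
sudo etckeeper commit "initial import of /etc"
# from then on, package installs get auto-committed and you can diff the history of /etc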

The other really useful articles I came upon while investigating a possible break-in on our servers were this one and this one, explaining how, with the 'chattr' command, you can make files 'immutable' (can't be modified by anyone, including root) or append-only. Append-only would be brilliant for logs, meaning that a person breaking into your machine couldn't just go and modify the logs to cover their tracks. However, this assumes the person hasn't gotten root access; otherwise they could just remove the append-only flag, change the file and then make it look like it hadn't been modified. Also, these attributes apparently only work on ext2/3, so anyone using other file systems is out of luck.
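In concrete terms (the paths here are examples; setting these flags requires root, and lsattr shows them):

sudo chattr +i /etc/important.conf    # immutable: no modifications, even by root
sudo chattr +a /var/log/auth.log      # append-only: can add lines, can't rewrite
lsattr /var/log/auth.log              # verify the flags
sudo chattr -a /var/log/auth.log      # and this is how root undoes it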

Monday 6 April 2009

Algorithmic Trading

Apparently there's a way you can get your computer to hook into the world's financial systems and automatically buy/sell shares. How much attention did you pay in AI class?

Makes me think of the day when all the stock markets will be run and controlled by robots... can't wait

Thursday 2 April 2009

Another Day, Another Crisis

Just woke up this morning and thought I'd get working on a tool I'm creating to monitor if and when we get put onto blacklists, when lo and behold, I discover our default IP is again on the CBL blacklist. A couple of hours of diagnosing and debugging later, I found that our hosting provider, as part of their recent upgrades, had gone and changed the hostname settings on our MTA, causing everything to be sent out from 'localhost.localdomain'. Wonderful. As if we didn't have enough problems, now we've got a misconfigured MTA that needs a reboot for the changes to take effect. Well, luckily the reboot went well and our MTA is again happily sending from a proper hostname.
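Incidentally, the core check the monitoring tool needs to do fits in one line: reverse the octets of your IP and look it up in the blacklist's DNS zone (the IP below is an example):

# checks whether 1.2.3.4 is listed on the CBL
dig +short 4.3.2.1.cbl.abuseat.org
# any answer (usually 127.0.0.x) means listed; no answer means clean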

Friday 27 March 2009

Lack of Planning

With this job at Jericho, it feels like I'm just constantly putting out fires. In one way that's a good thing, meaning that at least I'm busy and have work to do (which is not a bad thing in a recession). However, it does mean that I find less time to sit down and do some real planning.

For example, I've started looking into getting some industry qualifications, in order to put into practice my idea that to be successful in the IT industry you must be constantly learning. Some industry qualifications would also look very nice on my resume. The ones I've started looking at are the Red Hat and Cisco certifications, although I wouldn't be opposed to getting Microsoft Certified; that's a low priority though. You can find so many '.NET' monkeys around that it would almost be an advantage (in my humble opinion) to not have any certs from the software giant.

Also, I've been thinking about starting my own Linux consulting firm. Nothing big, mostly just a way for me to make some extra cash on the weekends and further refine my skills. I think there's a real gap in the market here in NZ. I was also inspired by learning about OpenLogic and seeing one of their presentations. This was a very nice reminder that FOSS is in fact used in large enterprise, a fact that I sometimes forget working in a .NET shop like Jericho.

Here at work, the jobs keep piling up. On the system administration side of things I have to set up a box to act as a router/firewall/logger etc. For this I've gone with SmoothWall, a dedicated 'router' distro which I've used before with success at Primesoft. The job after that will be to set up our own email server, whose main job will be to replace the contract we have with Net24/IPX. The machine's main purpose will be to act as a spam filter and a proxy, although I'll also get it to keep copies of the mail locally and set it up with IMAP and POP servers, just in case our local Exchange server dies, so we still have access to our mail. I'm probably going to go with a Ubuntu box running Postfix and SpamAssassin. I also really want to set up Bayesian filtering on the server; the tricky part there is going to be setting up a feedback mechanism so that we can get the employees to train the filter.

On the deliverability side of things, the job we're doing right now is creating a classifier for our asynchronous bounces. Previously we've been using a product called ListNanny, but it has failed us miserably; it's obviously a product made for small lists that stands no chance against enterprise-class ESPs. So far we've got a set of regular expressions which we use to determine the classification. The set is not large enough: running a test over the last 10,000 messages only yields about a 20-30% match rate. So we're not only going to have to increase the size of the regex set, but also create a system to maintain it. There's also the idea I had of using a bounce classifier I found on CPAN, but when I tested it out it just didn't seem that accurate. There's the possibility of rewriting/redefining it, seeing as it's open source, but having our own solution has its own advantages.

Personally, given the ever-changing nature of asynchronous bounce messages, I think this would be a perfect problem to tackle with an AI technique such as a Bayesian classifier, the same as is used by SpamAssassin. Thinking about it, it wouldn't require that much tweaking: instead of getting the system to give a message a spam score, you give it a score for each category (reply, hard bounce, soft bounce, complaint...). The system would also need a feedback mechanism to keep the false positives and false negatives to a minimum. That would actually be a really boring job, effectively consigning some poor soul to being a 'bounce monkey'.

The other project we've got going is trying to create a real-time 'whiz-bang' graphing application for monitoring our sends, something similar to glTail. I've already started looking at employing the same approach as glTail, i.e. a long-lived ssh connection providing the 'stream' which is then used to produce the numbers displayed on the graph. I've been considering either Perl (for its easy text manipulation) or Java (for its cross-platform-ness). The problem with Java is that it doesn't have any official ssh libraries; there are libraries out there, but I have my doubts as to their quality. With Perl it would only run on Linux, so integrating it with ssh would be a piece of cake (well, actually a pipe, but hey...). I guess I could try using Cygwin and integrating that with a Java front end somehow... Anyway, it'll probably be a while until I have enough time to start worrying about that.

Tuesday 24 March 2009

Long Overdue Update

It's been a while since I last wrote. It's been about 4 months since I started at Jericho Ltd and the job now takes up the majority of my time. Everything from producing deliverability reports for clients to making sure our network is secure is handled by myself. I often find myself looking at the huge list of things to do and getting demotivated; it would take another 4 months just to get everything off the list.

On a brighter note, I went to an 'Agile Professionals' conference (agile as in the methodology) today. The speaker (Jeff Smith from Suncorp) didn't have a bad thing to say about open source, which was good. He also talked about the attitudes and philosophies you need when building teams (people-centric) and where the innovation in a company should come from (bottom up). These were all very good points; it was a shame that only myself and Clint showed up, as it would have been nice to have a few more people from work there.

I've also been reading. I finished Fred Brooks' legendary book 'The Mythical Man-Month' and the first two books in John Scalzi's Old Man's War series. The Mythical Man-Month is a real gem and deserves its reputation as a great book; I don't think I've ever seen so many truisms about software expressed so eloquently. John Scalzi's work is good as well, although the claims on the cover about him being Heinlein's equal are exaggerations.