Tuesday, February 19, 2013

Installing ActiveMQ as a Service on Linux

Since the directions for this kind of thing seem to be disappearing, I'm going to post them here for my future reference. These instructions are a little dated, but they should apply to newer versions with little change.


Description

Apache ActiveMQ is a complete message broker and full JMS 1.1 provider featuring clustering, distributed destinations and XA support with pluggable persistence (JDBC, BDB, JDBM) and transport layers (TCP, UDP, multicast, NIO, SSL, Zeroconf, JXTA, JGroups).

Installation

This installation was done on Ubuntu x64; it may differ for other Linux distributions.
The instructions will also vary a little depending on system architecture (32- or 64-bit).
If you haven’t done so already, download ActiveMQ from here and extract the download to a directory of your choice. I placed mine in /usr/local.
The rest of the guide will assume it’s in /usr/local; the full path of my installation is /usr/local/apache-activemq-5.5.0.
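For reference, extracting the tarball into /usr/local looks something like this (the file name assumes the 5.5.0 binary distribution and that it was saved to the current directory):

$ sudo tar -xzf apache-activemq-5.5.0-bin.tar.gz -C /usr/local
$ ls /usr/local/apache-activemq-5.5.0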

Configuration

Open activemq and set ACTIVEMQ_HOME to point to your installation directory

$ sudo vi /usr/local/apache-activemq-5.5.0/bin/linux-x86-64/activemq

In activemq

ACTIVEMQ_HOME="/usr/local/apache-activemq-5.5.0"

Save activemq and open wrapper.conf, then change set.default.ACTIVEMQ_HOME and set.default.ACTIVEMQ_BASE to point to your installation directory

$ sudo vi /usr/local/apache-activemq-5.5.0/bin/linux-x86-64/wrapper.conf

In wrapper.conf

set.default.ACTIVEMQ_HOME=/usr/local/apache-activemq-5.5.0
set.default.ACTIVEMQ_BASE=/usr/local/apache-activemq-5.5.0

Save wrapper.conf and create a soft link in init.d

$ sudo ln -s /usr/local/apache-activemq-5.5.0/bin/linux-x86-64/activemq /etc/init.d/activemq

Note: When creating the soft link, make sure you use the full path to the target even if you’re currently in that directory. I didn’t, and I had issues getting the link to work.

Update rc.d

$ sudo update-rc.d activemq \
    start 66 2 3 4 5 . stop 34 0 1 6 .

And you’re done.
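To sanity-check that the broker actually came up, look for its listeners and tail the log. The ports below are the stock defaults (61616 for the broker transport, 8161 for the web console) and the log path assumes the layout above, so adjust if you have changed the configuration:

$ sudo netstat -tlnp | grep -E '61616|8161'
$ tail -f /usr/local/apache-activemq-5.5.0/data/activemq.log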

Bonus points

Start or stop the service manually

$ service activemq start
$ service activemq stop

Check if ActiveMQ is running

$ service activemq status

Uninstalling the service

$ sudo update-rc.d -f activemq remove
$ sudo rm /etc/init.d/activemq

Wednesday, August 17, 2011

Implementing a message queue with MSSQL

I’m working on a project that needs some background application servers to execute long-running tasks in order to offload the web server. Architecturally, there are a couple of ways you could do it, but I want to leverage existing infrastructure--that means using MSSQL Server. One of the things I want to avoid is polling SQL Server for new tasks. SQL Server provides a cool technology built right into the database engine called Service Broker, which offers native support for messaging and queuing applications. One of the things you can do with Service Broker is set up a query notification dependency between an application and an instance of SQL Server, so the application receives a notification when the results of a query change. .NET provides a class called SqlDependency to register for these query notifications. Here is a toy database:
USE [master]
GO
/****** Object:  Database [QueueSample]    Script Date: 08/17/2011 17:49:11 ******/
CREATE DATABASE [QueueSample] ON  PRIMARY 
( NAME = N'QueueSample', FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL10_50.MSSQLSERVER\MSSQL\DATA\QueueSample.mdf' , SIZE = 2048KB , MAXSIZE = UNLIMITED, FILEGROWTH = 1024KB )
LOG ON 
( NAME = N'QueueSample_log', FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL10_50.MSSQLSERVER\MSSQL\DATA\QueueSample_log.ldf' , SIZE = 1024KB , MAXSIZE = 2048GB , FILEGROWTH = 10%)
GO
ALTER DATABASE [QueueSample] SET COMPATIBILITY_LEVEL = 100
GO
IF (1 = FULLTEXTSERVICEPROPERTY('IsFullTextInstalled'))
begin
EXEC [QueueSample].[dbo].[sp_fulltext_database] @action = 'enable'
end
GO
ALTER DATABASE [QueueSample] SET ANSI_NULL_DEFAULT OFF
GO
ALTER DATABASE [QueueSample] SET ANSI_NULLS OFF
GO
ALTER DATABASE [QueueSample] SET ANSI_PADDING OFF
GO
ALTER DATABASE [QueueSample] SET ANSI_WARNINGS OFF
GO
ALTER DATABASE [QueueSample] SET ARITHABORT OFF
GO
ALTER DATABASE [QueueSample] SET AUTO_CLOSE OFF
GO
ALTER DATABASE [QueueSample] SET AUTO_CREATE_STATISTICS ON
GO
ALTER DATABASE [QueueSample] SET AUTO_SHRINK OFF
GO
ALTER DATABASE [QueueSample] SET AUTO_UPDATE_STATISTICS ON
GO
ALTER DATABASE [QueueSample] SET CURSOR_CLOSE_ON_COMMIT OFF
GO
ALTER DATABASE [QueueSample] SET CURSOR_DEFAULT  GLOBAL
GO
ALTER DATABASE [QueueSample] SET CONCAT_NULL_YIELDS_NULL OFF
GO
ALTER DATABASE [QueueSample] SET NUMERIC_ROUNDABORT OFF
GO
ALTER DATABASE [QueueSample] SET QUOTED_IDENTIFIER OFF
GO
ALTER DATABASE [QueueSample] SET RECURSIVE_TRIGGERS OFF
GO
ALTER DATABASE [QueueSample] SET  ENABLE_BROKER
GO
ALTER DATABASE [QueueSample] SET AUTO_UPDATE_STATISTICS_ASYNC OFF
GO
ALTER DATABASE [QueueSample] SET DATE_CORRELATION_OPTIMIZATION OFF
GO
ALTER DATABASE [QueueSample] SET TRUSTWORTHY OFF
GO
ALTER DATABASE [QueueSample] SET ALLOW_SNAPSHOT_ISOLATION OFF
GO
ALTER DATABASE [QueueSample] SET PARAMETERIZATION SIMPLE
GO
ALTER DATABASE [QueueSample] SET READ_COMMITTED_SNAPSHOT OFF
GO
ALTER DATABASE [QueueSample] SET HONOR_BROKER_PRIORITY OFF
GO
ALTER DATABASE [QueueSample] SET  READ_WRITE
GO
ALTER DATABASE [QueueSample] SET RECOVERY FULL
GO
ALTER DATABASE [QueueSample] SET  MULTI_USER
GO
ALTER DATABASE [QueueSample] SET PAGE_VERIFY CHECKSUM
GO
ALTER DATABASE [QueueSample] SET DB_CHAINING OFF
GO
EXEC sys.sp_db_vardecimal_storage_format N'QueueSample', N'ON'
GO
USE [QueueSample]
GO
/****** Object:  User [queuesampleuser]    Script Date: 08/17/2011 17:49:11 ******/
CREATE USER [queuesampleuser] FOR LOGIN [queuesampleuser] WITH DEFAULT_SCHEMA=[dbo]
GO
/****** Object:  Table [dbo].[QueueItems]    Script Date: 08/17/2011 17:49:13 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
SET ANSI_PADDING ON
GO
CREATE TABLE [dbo].[QueueItems](
[QueueItemId] [int] IDENTITY(1,1) NOT NULL,
[Action] [varchar](50) NOT NULL,
[Parameters] [varchar](8000) NOT NULL,
CONSTRAINT [PK_QueueItems] PRIMARY KEY CLUSTERED 
(
[QueueItemId] ASC
)WITH (PAD_INDEX  = OFF, STATISTICS_NORECOMPUTE  = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS  = ON, ALLOW_PAGE_LOCKS  = ON) ON [PRIMARY]
) ON [PRIMARY]
GO
SET ANSI_PADDING OFF
GO
/****** Object:  Table [dbo].[ProcessedQueueItems]    Script Date: 08/17/2011 17:49:13 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
SET ANSI_PADDING ON
GO
CREATE TABLE [dbo].[ProcessedQueueItems](
[QueueItemId] [int] NOT NULL,
[Action] [varchar](50) NOT NULL,
[Parameters] [varchar](8000) NOT NULL,
[MachineName] [varchar](50) NOT NULL,
[Start] [datetime] NOT NULL,
[End] [datetime] NULL,
CONSTRAINT [PK_ProcessedQueueItems] PRIMARY KEY CLUSTERED 
(
[QueueItemId] ASC
)WITH (PAD_INDEX  = OFF, STATISTICS_NORECOMPUTE  = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS  = ON, ALLOW_PAGE_LOCKS  = ON) ON [PRIMARY]
) ON [PRIMARY]
GO
SET ANSI_PADDING OFF
GO
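Most of that is boilerplate from the SSMS script generator; the statement that matters for query notifications is ALTER DATABASE [QueueSample] SET ENABLE_BROKER. A quick sanity check that it took effect (this just reads the standard catalog view, nothing project-specific):

SELECT name, is_broker_enabled FROM sys.databases WHERE name = N'QueueSample'
GO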

Notice how you have to enable Service Broker per database. Here is the class used to register for query notifications:
using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using System.Linq;
using System.Text;
using SqlDependencyConsole.Properties;

namespace SqlDependencyConsole
{
    public delegate void QueueChangeHandler();

    public class JobQueue
    {
        // Stands in for a machine/process name so each running instance is distinguishable
        private static readonly Guid GUID = Guid.NewGuid();

        private SqlConnection _SqlConnection;
        private SqlDependency _SqlDependency;

        // Raised whenever SQL Server reports that the QueueItems query results changed
        public event QueueChangeHandler QueueChanged;

        public JobQueue()
        {
            // Restart the notification listener for this connection string
            SqlDependency.Stop(Settings.Default.QueueSampleConnectionString);
            SqlDependency.Start(Settings.Default.QueueSampleConnectionString);
            _SqlConnection = new SqlConnection(Settings.Default.QueueSampleConnectionString);
        }

        ~JobQueue()
        {
            SqlDependency.Stop(Settings.Default.QueueSampleConnectionString);
        }

        // Registers a query notification (if one isn't pending) and tries to claim one queue item.
        // Returns the claimed QueueItemId, or null if another process claimed everything first.
        public int? CheckQueue()
        {
            SqlCommand cmd = new SqlCommand("SELECT [QueueItemId],[Action],[Parameters] FROM [dbo].[QueueItems]", _SqlConnection);
            cmd.Notification = null;

            if (_SqlDependency == null)
            {
                Console.WriteLine("Creating sql dep");
                _SqlDependency = new SqlDependency(cmd);
                _SqlDependency.OnChange += new OnChangeEventHandler(OnChange);
            }

            if (_SqlConnection.State == System.Data.ConnectionState.Closed)
                _SqlConnection.Open();

            DataTable dataTable = new DataTable();
            dataTable.Load(cmd.ExecuteReader());
            int? queueItemId = null;
            try
            {
                foreach (DataRow row in dataTable.AsEnumerable())
                {
                    Console.WriteLine("Trying {0}", row.Field<int>("QueueItemId"));
                    try
                    {
                        // Claim the item by copying it into ProcessedQueueItems; the primary key
                        // on QueueItemId means only one process can succeed for a given item.
                        cmd = new SqlCommand("INSERT INTO [QueueSample].[dbo].[ProcessedQueueItems] ([QueueItemId],[Action],[Parameters],[MachineName],[Start],[End]) VALUES(@QueueItemId,@Action,@Parameters,@MachineName,@Start,null)", _SqlConnection);
                        cmd.Parameters.AddWithValue("@QueueItemId", row.Field<int>("QueueItemId"));
                        cmd.Parameters.AddWithValue("@Action", row.Field<string>("Action"));
                        cmd.Parameters.AddWithValue("@Parameters", row.Field<string>("Parameters"));
                        cmd.Parameters.AddWithValue("@MachineName", GUID.ToString());
                        cmd.Parameters.AddWithValue("@Start", DateTime.Now);

                        int result = cmd.ExecuteNonQuery();

                        // The claim worked, so remove the item from the pending queue
                        cmd = new SqlCommand("DELETE FROM [QueueSample].[dbo].[QueueItems] WHERE QueueItemId = @QueueItemId", _SqlConnection);
                        cmd.Parameters.AddWithValue("@QueueItemId", row.Field<int>("QueueItemId"));

                        result = cmd.ExecuteNonQuery();

                        queueItemId = row.Field<int>("QueueItemId");
                        break;
                    }
                    catch (Exception ex) { Console.WriteLine(ex.Message); }
                }
            }
            catch (Exception ex) { Console.WriteLine(ex.Message); }
            finally { _SqlConnection.Close(); }

            return queueItemId;
        }

        private void OnChange(object sender, SqlNotificationEventArgs e)
        {
            Console.WriteLine("Change detected");
            SqlDependency dependency = sender as SqlDependency;

            // Query notifications are one-shot, so remove the existing handler
            // and let CheckQueue register a fresh dependency next time around
            dependency.OnChange -= OnChange;

            _SqlDependency = null;

            if (QueueChanged != null)
            {
                QueueChanged();
            }
        }
    }
}

Here is the main class. Run a couple of instances of it. The idea of this little toy program is to dispatch each task to exactly one process. A process gets the change notification and then tries to claim the top item in the queue by inserting it into another table. Because SQL Server enforces primary key uniqueness, exactly one process can insert it successfully, and that’s how the program decides whether it actually got the queue item.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading;

namespace SqlDependencyConsole
{
    class Program
    {
        private static JobQueue _JobQueue;
        private static Random _Random;

        static void Main(string[] args)
        {
            _Random = new Random();
            _JobQueue = new JobQueue();
            _JobQueue.QueueChanged += new QueueChangeHandler(OnQueueChanged);
            Console.WriteLine("Running...");
            CheckQueue();
            Console.ReadLine();
        }

        private static void OnQueueChanged()
        {
            CheckQueue();
        }

        private static void CheckQueue()
        {
            int? queueItemId = _JobQueue.CheckQueue();
            if (queueItemId.HasValue)
            {
                Console.WriteLine("Processing {0}", queueItemId.Value);
                Thread.Sleep(_Random.Next(2000));
                CheckQueue();
            }
        }
    }
}
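To watch the dispatching happen, start two or three instances of the console app and drop a few rows into the queue table. The Action and Parameters values below are made up purely for the test; only the table schema comes from the script above:

INSERT INTO [QueueSample].[dbo].[QueueItems] ([Action],[Parameters]) VALUES ('ResizePhoto', 'photo1.jpg')
INSERT INTO [QueueSample].[dbo].[QueueItems] ([Action],[Parameters]) VALUES ('SendEmail', 'client@example.com')
GO

Each notification wakes every instance, but the primary key on ProcessedQueueItems guarantees that only one of them gets to claim any given item.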

Sunday, April 10, 2011

New Project

I’m going to be starting a new project. The task--create a website for a photographer. Ordinarily, if it were just going to be static content, I would throw up a CMS of some kind and be done with it, but it’s just not that simple--it never is. There are a couple of custom things that have to happen.

One, there needs to be some kind of public photo gallery so the photographer can show off different categories of photos. This gallery needs to take any kind of photo, watermark it, and compress it. Preferably it should allow a zip file of photos to be uploaded all at once. As a side note, I’m really surprised that browsers haven’t made uploading multiple files easier by now. But I digress. The gallery also needs to make it easy to share photos on social networks.

Two, there needs to be a restricted area where clients can log in and choose which photos they would like in their packages after a photo shoot. Consequently, there needs to be an easy way for the photographer to create and upload these photo packages. As mentioned before, these photos also need to be watermarked and compressed.

Three, there is going to be static content, so there needs to be an easy, developer-hands-off way to manage it.

I think that covers the highlights. So let me talk about the ideas and technologies I’m throwing around in my head. First, a long time ago I used to be of the mindset that I would just write everything myself because it would be easier. I have since come to my senses and like integrating open source projects as much as I can into the final product. I figure, why reinvent what someone else has already done and done well (most of the time)? Second, I’m a Microsoft guy and would prefer a .NET-based platform. That rules out any PHP-based projects (I very much dislike that language anyway), which is fine.

So let me start with the static content requirement. This problem has been solved over and over by content management systems (CMS), and there are quite a few popular ones. In the .NET world, I’ve worked extensively with DotNetNuke. Almost all the websites that have to do with the BYU College of Life Sciences are running off an instance of DotNetNuke. DNN was a perfect fit because it could do everything we needed it to do right out of the box without actually having to extend it or search out extensions ourselves, with a few small exceptions. Also, the user interaction seemed to be intuitive for non-technical users. As a result, Life Sciences hosts 40-50 websites in that CMS. My problem with DNN is that skinning and extending it, although powerful, are cumbersome. It is hard to just do something simple from a developer’s perspective. From a user perspective, the process of adding modules to a page and setting them up is a cinch.

That leads me to my next choice: Umbraco. I’m not very familiar with Umbraco, but from the tutorials I’ve watched and the time I’ve spent messing with it, I’m leaning towards using it because of how easy it is to extend. There is hardly any overhead to plug into the many available parts of Umbraco. There is also a fairly extensive web services API, making things such as mobile, desktop, and cloud app integration very easy. So I think I’m going to use Umbraco as my main platform and extend it to take care of the other requirements.

As far as a public photo gallery goes, I’m debating between Flickr and Gallery Server Pro--Flickr because it is free, has good APIs, and displays a copyright notice on each image; GSP because I have complete control over how images get stored, how they get watermarked, and so on. Flickr apparently can’t watermark pictures, which is problematic, but it does have great social networking support, which GSP does not. I’ll have to think about this one some more.

The restricted area I’ve already resigned myself to writing.

So that’s what I’m up against. Not too bad.

Tuesday, March 29, 2011

Modernizr, a really sweet library

I just ran into this sweet JavaScript library. It can help detect emerging HTML5 and CSS3 features in the browser, making it easy to bring capable browsers forward in functionality while leaving legacy ones where they are. Check it out!

Tuesday, February 22, 2011

“The system cannot find the file specified” error in the WIF FAM module

I've been trying to get WIF to work in a web farm environment for a while and have run into issues when the FedAuth cookies are decrypted in another web app. It turns out that DPAPI, the default mechanism for cookie encryption, although easy to set up in IIS 7.x, is not the best option in a web farm because its keys are machine-specific. I found this article that provides some nice instructions on how to change the default cookie security behavior.
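The gist of the fix, sketched from memory against the original Microsoft.IdentityModel assembly (so treat the namespaces and member names as assumptions to verify against the article), is to replace the DPAPI-protected session token handler with one that uses RSA cookie transforms keyed off a certificate shared by every machine in the farm:

// Rough sketch only -- WIF 1.0 (Microsoft.IdentityModel); verify the details against the linked article.
// These go in Global.asax.cs, inside the Global : HttpApplication class.
using System;
using System.Collections.Generic;
using Microsoft.IdentityModel.Tokens;              // SessionSecurityTokenHandler
using Microsoft.IdentityModel.Web;                 // FederatedAuthentication, cookie transforms
using Microsoft.IdentityModel.Web.Configuration;   // ServiceConfigurationCreatedEventArgs

protected void Application_Start(object sender, EventArgs e)
{
    FederatedAuthentication.ServiceConfigurationCreated += OnServiceConfigurationCreated;
}

private void OnServiceConfigurationCreated(object sender, ServiceConfigurationCreatedEventArgs e)
{
    // Encrypt and sign the session cookie with the service certificate instead of DPAPI,
    // so any server in the farm that has the certificate installed can read it.
    var transforms = new List<CookieTransform>
    {
        new DeflateCookieTransform(),
        new RsaEncryptionCookieTransform(e.ServiceConfiguration.ServiceCertificate),
        new RsaSignatureCookieTransform(e.ServiceConfiguration.ServiceCertificate)
    };
    var handler = new SessionSecurityTokenHandler(transforms.AsReadOnly());
    e.ServiceConfiguration.SecurityTokenHandlers.AddOrReplace(handler);
}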

http://weblogs.asp.net/cibrax/archive/2010/02/17/the-system-cannot-find-the-file-specified-error-in-the-wif-fam-module.aspx