Monday, June 01, 2009

Packaging a Rails 1.1 app for JRuby

I've recently needed to "convert" a Rails 1.1 app to run under JBoss using JRuby to improve the deployment story and long-term maintenance for a new virtual server. The configuration/conversion wasn't without significant hiccups. I'll try to cover in some detail here what I did to get things working.

First, I tried upgrading to Rails 2.x. Don't try this. If you're not a professional Rails developer, you'll likely tear your hair out and ask yourself why you ever decided to use Rails in the first place. In my case, upgrading would have required a near-total rewrite of the (very simple) app.

For the impatient, here's the basic gist of what I did.

  1. update config/boot.rb, require_gem -> gem

  2. freeze to edge, RELEASE=1.1.4

  3. reset config/boot.rb(?) -> rake rails:update

  4. pluginize warbler

  5. update environment.rb

  6. create jboss-web.xml

  7. create -ds.xml file

  8. create warble.rb

  9. update warble.rb

  10. update database.yml -- copy production: block

  11. update new/thankyou.rhtml (change absolute links)

  12. add close_connections.rb

  13. edit new_rails_defaults.rb

I used JRuby 1.2.0 on a Rails 1.1.4 app, deploying to a JBoss 4.2.0.GA application server against a MySQL 4.1 database.

This isn't meant to be a list you can work off, merely a list showing what's involved, so if you count yourself amongst the faint of heart, stop here :).

update config/boot.rb, require_gem -> gem

If you don't still have the original RubyGems version from when you started your Rails 1.x project, some things have changed. Most particularly, require_gem is no longer merely deprecated, it's GONE! So, edit config/boot.rb by changing the two instances of require_gem to simply gem. That wasn't too hard :)
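For reference, the mechanical change looks like this (a sketch; the boot.rb line shown is illustrative, not a verbatim copy of the Rails 1.1 file):

```ruby
# RubyGems removed require_gem, so the old boot.rb calls become plain
# gem calls. The substitution itself is trivial:
line = "require_gem 'rails', '= 1.1.4'"       # illustrative boot.rb line
fixed = line.gsub(/\brequire_gem\b/, "gem")
puts fixed  # => gem 'rails', '= 1.1.4'
```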

freeze to edge, RELEASE=1.1.4

Now you're ready to freeze. You really want to freeze because it will simplify things down the road for your Warbler tasks and runtime within JRuby. Until I froze, I kept having oodles of issues that basically traced back to various commands operating on my Rails project whilst executing code from the latest Rails on my system (2.3.2, I think). You should be able to freeze to just about any RELEASE or TAG you want. Either of the following should work:

$ rake rails:freeze:edge TAG=rel_1-1-4
$ rake rails:freeze:edge RELEASE=1.1.4

reset config/boot.rb(?) -> rake rails:update

Guess what? Now we need to undo step 1. But this is pretty easy. You can either manually revert the changes you made to boot.rb, or just run:

$ rake rails:update

Contrary to its name, this shouldn't do anything terribly crazy other than reset your boot.rb back to what it was previously (and it does this now based on your *frozen* Rails, not whatever the latest on your system is).

pluginize warbler

Assuming you've installed Warbler (if you haven't, gem install warbler) - let's take the road less travelled and pluginize. This keeps everything in a nice local package. It also means you can use the normal rake commands instead of

jruby -S warble <cmd>

Alright, to pluginize, straight from warbler's docs:

$ jruby -S warble pluginize

update environment.rb

Not sure where I found this, but it appears to be essential. Running in the JRuby environment requires a few gems that aren't pulled in elsewhere, so update your config/environment.rb to include the following:

if RUBY_PLATFORM =~ /java/
  require 'rubygems'
  require 'active_record'
  gem 'activerecord-jdbcmysql-adapter'
  require 'active_record/connection_adapters/jdbcmysql_adapter'
end

Insert this immediately preceding the line that looks like this: Rails::Initializer.run do |config|

create jboss-web.xml

I wanted to create a configuration that takes advantage of JBoss' connection pools for connections to MySQL; unfortunately (dirty secret), I haven't gotten it to work yet. Soooo, you can consider this step optional. Once I figure out what piece is missing to get JNDI DataSource access working, however, this piece will certainly be needed, so take it or leave it :)

The jboss-web.xml maps an application-local resource-ref to a global JNDI name. Typically used to map a 'generic' JNDI ref such as jdbc/rails to a specific ref, such as jdbc/my_cool_apps/app1. Of course, you could just use jdbc/my_cool_apps/app1, but the thinking is that a level of indirection helps when you need to change things - you just change a config at the container level and don't need to muck about with the app (repackaging/redeploying/etc.). Again, YMMV, take it or leave it.

Here it is (using the example names from above):

<?xml version="1.0" encoding="UTF-8"?>
<jboss-web>
  <resource-ref>
    <res-ref-name>jdbc/rails</res-ref-name>
    <jndi-name>java:/jdbc/my_cool_apps/app1</jndi-name>
  </resource-ref>
</jboss-web>

create -ds.xml file

Now that you've mapped the local JNDI name to the global JNDI name, you should probably set up the configuration in JBoss that creates the global JNDI name. Here it is (substitute in your own parameters):

<?xml version="1.0" encoding="ISO-8859-1"?>
<datasources>
  <local-tx-datasource>
    <jndi-name>jdbc/my_cool_apps/app1</jndi-name>
    <connection-url>jdbc:mysql://localhost:3306/yourdb</connection-url>
    <driver-class>com.mysql.jdbc.Driver</driver-class>
    <user-name>youruser</user-name>
    <password>yourpass</password>
    <!-- Typemapping for JBoss 4.0 -->
    <metadata>
      <type-mapping>mySQL</type-mapping>
    </metadata>
  </local-tx-datasource>
</datasources>
This is far from the most sophisticated DataSource you can configure, but there are better references elsewhere for the available options.

This gets dropped in the JBoss application server's "deploy" directory to set up the DataSource in global JNDI.

create warble.rb

Alright, back to the app. Let's create the Warbler config file; we'll mostly run with the defaults, but we need a couple of customizations. Either of the following should generate it:

$ jruby -S warble config

$ script/generate warble

I had some issues with the latter; I can't remember if that was before I figured out I needed to freeze, but the first way worked for me. YMMV.

update warble.rb

Documentation for Warbler contains oodles of information on configuring; I found no fault with that information. Here's what I did:

config.includes = FileList["jboss-web.xml"]
config.gems += ["activerecord-jdbcmysql-adapter"]
config.gems["rails"] = "1.1.4"  # use whatever version suits you
config.webxml.jndi = 'jdbc/rails'

That's it!

update database.yml -- copy production: block

So, here's where, if the world were a happy place, I'd tell you how to configure database.yml to use JNDI. Unfortunately, following the available documentation, I haven't gotten this to work. So, instead, I'll show you how to switch to use the 'jdbcmysql' adapter, instead of the 'mysql' adapter.


In your production: block, change

adapter: mysql

to

adapter: jdbcmysql

Painful, I know. You'll need to use a host: parameter, too; JDBC won't connect through the /tmp/mysql.sock socket.

update new/thankyou.rhtml (change absolute links)

Not sure this applies to everyone, but my .rhtml files had absolute references in them to static resources. Those need to change to relative paths that resolve to within your application. Just removing the leading '/' did the trick for me.

add close_connections.rb

This may be an optional step; I think it's only needed if you do use JNDI (not in use here, yet). In any case, you'll want to add an initializers/close_connections.rb to config in your app. Contents:

if defined?($servlet_context)
  require 'action_controller/dispatcher'
  ActionController::Dispatcher.after_dispatch do
    ActiveRecord::Base.clear_active_connections!
  end
end

edit new_rails_defaults.rb

Finally (are you still reading?!), the spec for Warbler uses a Rails 2.x (I think) JSON config; this needs to be commented out. Find your new_rails_defaults.rb under vendor/plugins/warbler-0.9.13 (version may vary), and comment out the last line:

#ActiveSupport.escape_html_entities_in_json = false

That's it!! Seriously.

Use a line like this to package/repackage/deploy your .WAR and enjoy.

rake war:clean; rake war; \
if [ -e <rails_app>.war ]; then
  unzip -q -o -d $JBOSS_HOME/server/default/deploy/<rails_app>.new <rails_app>.war;
  mv $JBOSS_HOME/server/default/deploy/<rails_app>.war $JBOSS_HOME/server/default/deploy/<rails_app>.old;
  mv $JBOSS_HOME/server/default/deploy/<rails_app>.new $JBOSS_HOME/server/default/deploy/<rails_app>.war;
  rm -rf $JBOSS_HOME/server/default/deploy/<rails_app>.old;
fi

Where <rails_app> is your app name and $JBOSS_HOME is set to your JBoss install dir. BEWARE - I chased my tail AROUND AND AROUND because I didn't realize that (a) rake war doesn't clean tmp/war before setting up and repackaging the WAR, so if you're trying to fix things, your old stuff may still be around; and (b) unless you're deploying the packaged WAR file, JBoss doesn't do you any favors, and you should remove the exploded WAR before placing your fresh WAR dir in the deploy dir.

Thursday, March 06, 2008

Iditarod GPS in Google Earth

First, let me describe myself as a "rookie" fan. My wife got interested in the Last Great Race last year (2007) [edit (3/6 9:33AM): my wife has been interested for several years, but only recently has technology allowed us to keep up-to-date on the race from afar, instead of waiting for coverage months later], but for me, it all started this year. My wife and I have gotten to know a family in Alaska this year, the Holts, through my wife's blogs and overlapping interests. Well, it turns out that Rick Holt is a rookie in the 2008 Iditarod.

That piqued my tepid interest in the Iditarod to what I'd conservatively call a fury. Double that when my wife informed me that, for the first time, the Iditarod would provide GPS tracking of some of the mushers (a trial). Oh, and Rick would have one of the GPS devices. SWEET.

So, my interest is piqued and I'm excited about tracking Rick's progress against the other GPS-enabled mushers, as well as his overall standings (currently 44th after an overnight push through McGrath into Takotna, COOL!). But I'm underwhelmed with the Microsoft Virtual Earth mapping of the mushers and the course. My first thought is to see if I can re-use the data being fed to Microsoft Virtual Earth to feed into Google Maps, but as I'm working on that, my wife finds EarthSlot, which brings together the Arctic Region Supercomputing Center and the Geographic Information Network of Alaska to provide a KML feed of the mushers' current standings using the GIS data format used by Google Earth.

Well, that's cool - the KML feed has tons of information, but I thought I could do better. I figured that with the information from IonEarth, I should be able to map the GPS near-realtime location of the GPS-enabled mushers into Google Earth, with all the stats available in the Microsoft Virtual Earth mapping.

And I did. Click here to download the KML with a self-refreshing link to the latest near-realtime data provided by IonEarth, translated by a script to KML. I've also enhanced the original EarthSlot KML with a network link to National Weather Service / NOAA weather radar aggregation for Alaska.

Pulling this data into Google Earth provides a few interesting benefits.
  1. Google Earth provides better aerial / terrain data for the course than Microsoft Virtual Earth (this is subjective, I suppose)
  2. The data feed I provide gives Google Earth information so that when you double-click on a musher's sled icon, Google Earth will rotate your map view to correspond with the musher's current heading. [This was useful when Rick was heading into Nikolai; from the course / GPS, it looked like he was off course, but rotating the map view to his current heading showed he was heading straight into the checkpoint!]

    Map is aligned with Ken Anderson's current heading

  3. Google Earth has a Ruler. Open the Ruler tool and click on your favorite GPS-enabled musher and anywhere else on the map (next musher, next checkpoint, whatever) and get a very accurate estimate of the distance (in feet, miles, etc.)

  4. Google Earth imports information from Panoramio, which includes pictures folks have taken at the various checkpoints (may be from this year or previous years). Information from Wikipedia on certain locations (e.g. Takotna, Nikolai, McGrath) is also available right in Google Earth.

  5. The weather radar overlay from the National Weather Service is just SO FREAKIN' COOL!

    Weather around current position of mushers seems clear, but there's a big storm rolling through ahead.

The added indirection that my feed script provides between the Google Earth feed and the IonEarth data also allows some throttling of the upstream requests. That isn't possible when just using Ajax in a browser against the primary data source. My script refreshes the data from IonEarth once every 10 minutes, max. Within that timeframe, it serves up the KML to Google Earth with cached GPS data from IonEarth. [Edit 3/6/08: I've been asked (politely) to reduce the refresh rate to once an hour.]
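The throttling described above can be sketched as a tiny time-based cache. This is a hypothetical reconstruction, not the actual script; the class name, TTL, and fetcher are all illustrative:

```ruby
# A minimal time-based cache: call the upstream fetcher at most once
# per ttl seconds, and serve the cached result in between.
class ThrottledFeed
  def initialize(ttl, &fetcher)
    @ttl = ttl
    @fetcher = fetcher
    @cached = nil
    @fetched_at = nil
  end

  def data
    now = Time.now
    if @fetched_at.nil? || (now - @fetched_at) >= @ttl
      @cached = @fetcher.call   # hit the upstream source (e.g. IonEarth)
      @fetched_at = now
    end
    @cached                     # within the ttl window, serve the cache
  end
end

# Illustrative usage: two back-to-back requests hit upstream only once.
calls = 0
feed = ThrottledFeed.new(600) { calls += 1; "<kml>...</kml>" }
feed.data
feed.data
puts calls  # => 1
```

The web-facing script would wrap this in a request handler, so Google Earth's self-refreshing network link can poll as often as it likes without hammering the upstream source.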

I'll publish a post soon on the details of the script that takes the IonEarth data and transforms it into KML.

Wednesday, January 02, 2008

Computers: planned obsolescence, what can be done?

This morning, my wife showed me the Story of Stuff. The "Story of Stuff," with Annie Leonard, is a discussion of extraction, manufacturing, distribution, consumption, waste and more. This is an amazingly well-put-together Flash video (about 20m), and at one point it squarely points the finger at computers with "planned obsolescence" and "perceived obsolescence." Specifically, the inability to upgrade many (most?) computers that folks have these days. If the next new thing comes out (Vista, even OS X Leopard), it's time to buy a new computer and chuck everything you had before?! Is this necessary?! Sure, for some of us, we can upgrade certain things ... add memory, etc. But the industrial design of computers isn't meant to allow upgrades. And with the type of innovative, awesome designs we see from companies like Apple, you can't tell me this isn't possible.

So, go watch this, then comment below - what can be done to put computers, software, etc. into a sustainable, green chemistry cycle? What's being done already? Where should conscientious consumers be putting their $$?


Friday, November 16, 2007

At No Fluff Just Stuff Chicago

Arrived at NFJS Chicago today and promptly jumped into the afternoon sessions. I'll be trying to post information I'm finding interesting as I come across it and I've already put up two posts. Where? Well - on the new Coding blog I started in my iWeb site. The first post is on Enterprise Performance & Scalability. The second is on Domain Driven Design and how to build the ubiquitous language in your team.

Still need to blog about monitoring your applications, but that will come tomorrow, I'm afraid.


Wednesday, October 31, 2007

IP Aliasing on Leopard

I just added a new blog in my iWeb site on tips/tricks I use to be productive. The first post is about IP Aliasing on Leopard, something I am using to run multiple instances of JBoss AS. This makes a nice environment to test the Web Services Transactions (XTS) module from JBoss Transactions (JBossTS).

Saturday, October 27, 2007

Eclipse on Leopard: fine. 64-bit? Not so fine.

I know I'm a late adopter and all, but I thought I'd provide some information on Eclipse on OS X Leopard. First off: it runs fine. Many of the concerns were that 64-bit Java would break Eclipse because of the SWT/Carbon underpinnings, which are 32-bit, as I understand it.

Well, rest easy - as it does with many new technologies, Apple has left full backwards compatibility in place. It appears that Java runs in 32-bit mode by default, so basically everything that worked previously (including SWT) will continue to work.

So what about Eclipse on 64-bit Java? Yeah, it doesn't work. So - to all those that predicted this, kudos. You got it right. "-d64" is the flag that tells the JVM to run in 64-bit mode - and adding it to the "-vmargs" section of eclipse.ini causes an exception when SWT is loaded.

Want to test if you can run 64-bit Java? Here's what to do. In Terminal, run the following command:

/System/Library/Frameworks/JavaVM.framework/Versions/1.5.0/Commands/java -d64 -Xmx2560M -version

Running this should output something like so:

java version "1.5.0_13"
Java(TM) 2 Runtime Environment, Standard Edition (build 1.5.0_13-b05-237)
Java HotSpot(TM) 64-Bit Server VM (build 1.5.0_13-119, mixed mode)

If you don't get this, or instead get a message like "Cannot run Java in 64 bit mode. Continuing in 32 bit mode.", then you may not have a 64-bit chipset. Notably, the PPC chips cannot run 64-bit Java (even though G5s are 64-bit). Further, the first generation of Intel chips released by Apple, the Core Duo, are not 64-bit. Only the Core 2 Duo (and later) chips are 64-bit. Finally - if you ran the "-d64" test by just specifying "java" on the command line, instead of the full path - you may be running into a bug that was discovered too late to be fixed in GM.

I'll give a shot at running 64-bit Java from within Eclipse (say, to run JBoss AS - that should work). I'll report back with results!

Sunday, October 21, 2007

Locavore in training.

To document and track my ongoing efforts to become more of a Locavore with my family, I have started a new blog on my new iWeb site. I have a few things up there already (and am liking how easy it is to lay out photos in iWeb, sweet!). I hope to keep things up-to-date, though this Locavore family might run into harder straits with winter coming on and the harvest nearing completion.

Though that in itself should provide ample material to blog about! Enjoy.

New blog!

I have started a new blog, mostly to evaluate the iWeb '08 software that's part of the iLife '08 suite from Apple. As part of my "Site" in iWeb, this blog will focus on thoughts and insights I have in the field of requirements engineering (RE). While most of my day-to-day activities don't focus 100% on this field, in my current position, I do deal with this on an ongoing basis, helping the business better communicate their needs, formulate what features the software should provide, and if needed, expand on those features in use case specifications.

Anyway, topics related to RE will be the focus of this new iWeb blog. Enjoy!