
5/18/2017

Building IntelliJ IDEA Community Edition on Ubuntu 16.10

Building IntelliJ IDEA on Ubuntu is just a bit more complex than simply executing ant.

Acquiring dependencies

The commented lines below are required if you wish to use JDK 1.6; in practice, things seem to build without it (as of May 2017, at least).

#sudo add-apt-repository ppa:webupd8team/java
#sudo apt-get update
#sudo apt-get install oracle-java6-installer
#sudo update-alternatives --config java
sudo apt-get install zlib1g-dev ant git
mkdir $HOME/tools
cd $HOME/tools
git clone git://git.jetbrains.org/idea/community.git idea
cd idea
./getPlugins.sh 
build/conf/install_nsis3.sh $HOME/tools/idea

If the build finishes with

...
scons: done building targets.
...the dependencies were built successfully.

Starting the build

If you've installed JDK 1.6 (the commented lines above), we need to point to it here (first line below). Again, IntelliJ IDEA Community Edition builds without this, so I've left it commented out.

#export JDK_16_x64=/usr/lib/jvm/java-6-oracle
cd $HOME/tools/idea
ant
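Once ant finishes, the build results typically land under out/ in the checkout; here's a tiny sketch to eyeball the freshest files (the helper name and the out/artifacts path are my assumptions, so adjust if your checkout differs):

```shell
# List the newest files in the build output directory.
# out/artifacts is an assumption; adjust for your checkout.
newest_artifacts() {
  local dir="${1:-out/artifacts}"
  [ -d "$dir" ] || { echo "no such dir: $dir" >&2; return 1; }
  ls -t "$dir" | head -5
}
```

For example, `newest_artifacts "$HOME/tools/idea/out/artifacts"` after the build completes.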

11/23/2015

Apache Karaf: no matching cipher found: client aes128-ctr,aes192-ctr,aes256-ctr,arcfour256,arcfour128,aes128-gcm@openssh.com,aes256-gcm@openssh.com,aes128-cbc,3des-cbc,blowfish-cbc,cast128-cbc,aes192-cbc,aes256-cbc,arcfour,rijndael-cbc@lysator.liu.se server

During a recent deployment of a custom Karaf distribution via the karaf-maven-plugin, I experienced very strange behaviour when it was deployed to FreeBSD rather than my Mac OS development rig.

Connecting through SSH failed, so I increased the logging:

dfi:~ doma$ ssh karaf@optiplex1 -p 8100
no matching cipher found: client aes128-ctr,aes192-ctr,aes256-ctr,arcfour256,arcfour128,aes128-gcm@openssh.com,aes256-gcm@openssh.com,aes128-cbc,3des-cbc,blowfish-cbc,cast128-cbc,aes192-cbc,aes256-cbc,arcfour,rijndael-cbc@lysator.liu.se server 

Here's Karaf's exception logged:

16:24:03,890 INFO  8]-nio2-thread-1 125 shd.server.session.ServerSession Server session created from /192.168.0.21:64951
16:24:03,894 DEBUG 8]-nio2-thread-1 125 shd.server.session.ServerSession Client version string: SSH-2.0-OpenSSH_6.2
16:24:03,900 DEBUG 8]-nio2-thread-1 125 d.common.session.AbstractSession Send SSH_MSG_KEXINIT
16:24:03,901 DEBUG 8]-nio2-thread-1 125 d.common.session.AbstractSession Received SSH_MSG_KEXINIT
16:24:03,902 WARN  8]-nio2-thread-1 125 d.common.session.AbstractSession Exception caught
java.lang.IllegalStateException: Unable to negotiate key exchange for encryption algorithms (client to server) (client: aes128-ctr,aes192-ctr,aes256-ctr,arcfour256,arcfour128,aes128-gcm@openssh.com,aes256-gcm@openssh.com,aes128-cbc,3des-cbc,blowfish-cbc,cast128-cbc,aes192-cbc,aes256-cbc,arcfour,rijndael-cbc@lysator.liu.se / server: )
        at org.apache.sshd.common.session.AbstractSession.negotiate(AbstractSession.java:1159)[125:org.apache.sshd.core:0.14.0]
        at org.apache.sshd.common.session.AbstractSession.doHandleMessage(AbstractSession.java:388)[125:org.apache.sshd.core:0.14.0]
        at org.apache.sshd.common.session.AbstractSession.handleMessage(AbstractSession.java:326)[125:org.apache.sshd.core:0.14.0]
        at org.apache.sshd.common.session.AbstractSession.decode(AbstractSession.java:780)[125:org.apache.sshd.core:0.14.0]
        at org.apache.sshd.common.session.AbstractSession.messageReceived(AbstractSession.java:308)[125:org.apache.sshd.core:0.14.0]
        at org.apache.sshd.common.AbstractSessionIoHandler.messageReceived(AbstractSessionIoHandler.java:54)
        at org.apache.sshd.common.io.nio2.Nio2Session$1.onCompleted(Nio2Session.java:184)
        at org.apache.sshd.common.io.nio2.Nio2Session$1.onCompleted(Nio2Session.java:170)
        at org.apache.sshd.common.io.nio2.Nio2CompletionHandler$1.run(Nio2CompletionHandler.java:32)
        at java.security.AccessController.doPrivileged(Native Method)[:1.8.0_60]
        at org.apache.sshd.common.io.nio2.Nio2CompletionHandler.completed(Nio2CompletionHandler.java:30)[125:org.apache.sshd.core:0.14.0]
        at sun.nio.ch.Invoker.invokeUnchecked(Invoker.java:126)[:1.8.0_60]
        at sun.nio.ch.Invoker.invokeDirect(Invoker.java:157)[:1.8.0_60]
        at sun.nio.ch.UnixAsynchronousSocketChannelImpl.implRead(UnixAsynchronousSocketChannelImpl.java:553)[:1.8.0_60]
        at sun.nio.ch.AsynchronousSocketChannelImpl.read(AsynchronousSocketChannelImpl.java:276)[:1.8.0_60]
        at sun.nio.ch.AsynchronousSocketChannelImpl.read(AsynchronousSocketChannelImpl.java:297)[:1.8.0_60]
        at java.nio.channels.AsynchronousSocketChannel.read(AsynchronousSocketChannel.java:420)[:1.8.0_60]
        at org.apache.sshd.common.io.nio2.Nio2Session.startReading(Nio2Session.java:170)[125:org.apache.sshd.core:0.14.0]
        at org.apache.sshd.common.io.nio2.Nio2Acceptor$AcceptCompletionHandler.onCompleted(Nio2Acceptor.java:135)[125:org.apache.sshd.core:0.14.0]
        at org.apache.sshd.common.io.nio2.Nio2Acceptor$AcceptCompletionHandler.onCompleted(Nio2Acceptor.java:120)[125:org.apache.sshd.core:0.14.0]
        at org.apache.sshd.common.io.nio2.Nio2CompletionHandler$1.run(Nio2CompletionHandler.java:32)
        at java.security.AccessController.doPrivileged(Native Method)[:1.8.0_60]
        at org.apache.sshd.common.io.nio2.Nio2CompletionHandler.completed(Nio2CompletionHandler.java:30)[125:org.apache.sshd.core:0.14.0]
        at sun.nio.ch.Invoker.invokeUnchecked(Invoker.java:126)[:1.8.0_60]
        at sun.nio.ch.Invoker$2.run(Invoker.java:218)[:1.8.0_60]
        at sun.nio.ch.AsynchronousChannelGroupImpl$1.run(AsynchronousChannelGroupImpl.java:112)[:1.8.0_60]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)[:1.8.0_60]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)[:1.8.0_60]
        at java.lang.Thread.run(Thread.java:745)[:1.8.0_60]

After trying out all sorts of tips found via Google, I finally realized that setting JAVA_HOME fixes the issue. To solve it for all users, I added this line to /etc/profile:

export JAVA_HOME=/usr/local/openjdk7

You can also put the same line into $HOME/.profile to fix it for a single user only.
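If you script this step, blindly appending to the profile leaves duplicate lines after a few runs; a small idempotent sketch (append_once is my own helper, not a standard tool):

```shell
# Append a line to a file only if that exact line isn't already there,
# so re-running a setup script doesn't pile up duplicates.
append_once() {
  local line="$1" file="$2"
  grep -qxF -- "$line" "$file" 2>/dev/null || printf '%s\n' "$line" >> "$file"
}

# e.g.:
#   append_once 'export JAVA_HOME=/usr/local/openjdk7' "$HOME/.profile"
```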

Karaf is beautiful, but this one was ugly. Well, my fault: Karaf actually complains at startup when JAVA_HOME is not set (warning that results may vary), but since I had used it like that for quite a while, I didn't expect this.

7/16/2015

Building Hadoop 2.4.0 on Mac OS X Yosemite 10.10.3 with native components

Install pre-requisites

We'll need these for the actual build.

sudo port install cmake gmake gcc48 zlib gzip maven32 apache-ant

Install protobuf 2.5.0

As the latest version in MacPorts is currently 2.6.x, we need to stick to an earlier revision:

cd ~/tools
svn co http://svn.macports.org/repository/macports/trunk/dports/devel/protobuf-cpp -r 105333
cd protobuf-cpp/
sudo port install

To verify:

protoc --version
# libprotoc 2.5.0
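Since the build only rejects a wrong protoc much later with a noisier error, it can be worth failing fast; a minimal check, assuming the "libprotoc X.Y.Z" banner format shown above (protoc_version_ok is my own helper):

```shell
# Guard against the wrong protoc: Hadoop 2.4.0 wants exactly 2.5.0.
# Takes the banner as a string so it can be tested without protoc installed.
protoc_version_ok() {
  local banner="$1" want="${2:-2.5.0}"
  [ "${banner#libprotoc }" = "$want" ]
}

# e.g.:
#   protoc_version_ok "$(protoc --version)" || echo "need protoc 2.5.0" >&2
```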

Acquire sources

As I needed an exact version to reproduce an issue at work, I'll go with version 2.4.0 for now. Some of the fixes will likely apply to earlier or later versions as well; look around in the tags folder for other versions.

cd ~/dev
svn co http://svn.apache.org/repos/asf/hadoop/common/tags/release-2.4.0 hadoop-2.4.0
cd hadoop-2.4.0

Fix sources

We need to patch JniBasedUnixGroupsNetgroupMapping:

patch -p0 <<EOF
--- hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/security/JniBasedUnixGroupsNetgroupMapping.c.orig 2015-07-16 17:14:20.000000000 +0200
+++ hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/security/JniBasedUnixGroupsNetgroupMapping.c 2015-07-16 17:17:47.000000000 +0200
@@ -74,7 +74,7 @@
   // endnetgrent)
   setnetgrentCalledFlag = 1;
 #ifndef __FreeBSD__
-  if(setnetgrent(cgroup) == 1) {
+  setnetgrent(cgroup); {
 #endif
     current = NULL;
     // three pointers are for host, user, domain, we only care

EOF

As well as container-executor.c:

patch -p0 <<EOF
--- hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/native/container-executor/impl/container-executor.c.orig 2015-07-16 17:49:15.000000000 +0200
+++ hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/main/native/container-executor/impl/container-executor.c 2015-07-16 18:13:03.000000000 +0200
@@ -498,7 +498,7 @@
   char **users = whitelist;
   if (whitelist != NULL) {
     for(; *users; ++users) {
-      if (strncmp(*users, user, LOGIN_NAME_MAX) == 0) {
+      if (strncmp(*users, user, 64) == 0) {
         free_values(whitelist);
         return 1;
       }
@@ -1247,7 +1247,7 @@
               pair);
     result = -1; 
   } else {
-    if (mount("none", mount_path, "cgroup", 0, controller) == 0) {
+    if (mount("none", mount_path, "cgroup", 0) == 0) {
       char *buf = stpncpy(hier_path, mount_path, strlen(mount_path));
       *buf++ = '/';
       snprintf(buf, PATH_MAX - (buf - hier_path), "%s", hierarchy);
@@ -1274,3 +1274,21 @@
   return result;
 }
 
+int fcloseall(void)
+{
+    int succeeded; /* return value */
+    FILE *fds_to_close[3]; /* the size being hardcoded to '3' is temporary */
+    int i; /* loop counter */
+    succeeded = 0;
+    fds_to_close[0] = stdin;
+    fds_to_close[1] = stdout;
+    fds_to_close[2] = stderr;
+    /* max iterations being hardcoded to '3' is temporary: */
+    for ((i = 0); (i < 3); i++) {
+ succeeded += fclose(fds_to_close[i]);
+    }
+    if (succeeded != 0) {
+ succeeded = EOF;
+    }
+    return succeeded;
+}

EOF

Install Oracle JDK 1.7

You'll need to install "Java SE Development Kit 7 (Mac OS X x64)" from Oracle. Then let's fix some things the build expects in a different place:

export JAVA_HOME=`/usr/libexec/java_home -v 1.7`
sudo mkdir $JAVA_HOME/Classes
sudo ln -s $JAVA_HOME/lib/tools.jar $JAVA_HOME/Classes/classes.jar
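Re-running those two commands fails once the link already exists; an idempotent variant (link_tools_jar is my own helper; run the real thing with sudo for a system JDK):

```shell
# Create the Classes/classes.jar symlink the Hadoop build expects,
# skipping the step if something is already at that path.
link_tools_jar() {
  local jh="$1"
  mkdir -p "$jh/Classes"
  [ -e "$jh/Classes/classes.jar" ] || [ -L "$jh/Classes/classes.jar" ] \
    || ln -s "$jh/lib/tools.jar" "$jh/Classes/classes.jar"
}

# e.g.: link_tools_jar "$(/usr/libexec/java_home -v 1.7)"
```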

Build Hadoop 2.4.0

Sooner or later we were bound to get here, right?

mvn package -Pdist,native -DskipTests -Dtar

If all goes well:

main:
     [exec] $ tar cf hadoop-2.4.0.tar hadoop-2.4.0
     [exec] $ gzip -f hadoop-2.4.0.tar
     [exec] 
     [exec] Hadoop dist tar available at: /Users/doma/dev/hadoop-2.4.0/hadoop-dist/target/hadoop-2.4.0.tar.gz
     [exec] 
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-dist ---
[INFO] Building jar: /Users/doma/dev/hadoop-2.4.0/hadoop-dist/target/hadoop-dist-2.4.0-javadoc.jar
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop Main ................................ SUCCESS [1.177s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [1.548s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [3.394s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.277s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [1.765s]
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [3.143s]
[INFO] Apache Hadoop MiniKDC ............................. SUCCESS [2.498s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [3.265s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [2.074s]
[INFO] Apache Hadoop Common .............................. SUCCESS [1:26.460s]
[INFO] Apache Hadoop NFS ................................. SUCCESS [4.527s]
[INFO] Apache Hadoop Common Project ...................... SUCCESS [0.032s]
[INFO] Apache Hadoop HDFS ................................ SUCCESS [2:09.326s]
[INFO] Apache Hadoop HttpFS .............................. SUCCESS [14.876s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [5.814s]
[INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [2.941s]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.034s]
[INFO] hadoop-yarn ....................................... SUCCESS [0.034s]
[INFO] hadoop-yarn-api ................................... SUCCESS [57.713s]
[INFO] hadoop-yarn-common ................................ SUCCESS [20.985s]
[INFO] hadoop-yarn-server ................................ SUCCESS [0.040s]
[INFO] hadoop-yarn-server-common ......................... SUCCESS [6.935s]
[INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [12.889s]
[INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [2.362s]
[INFO] hadoop-yarn-server-applicationhistoryservice ...... SUCCESS [4.059s]
[INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [11.368s]
[INFO] hadoop-yarn-server-tests .......................... SUCCESS [0.467s]
[INFO] hadoop-yarn-client ................................ SUCCESS [4.109s]
[INFO] hadoop-yarn-applications .......................... SUCCESS [0.043s]
[INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [2.123s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [1.902s]
[INFO] hadoop-yarn-site .................................. SUCCESS [0.030s]
[INFO] hadoop-yarn-project ............................... SUCCESS [3.828s]
[INFO] hadoop-mapreduce-client ........................... SUCCESS [0.069s]
[INFO] hadoop-mapreduce-client-core ...................... SUCCESS [19.507s]
[INFO] hadoop-mapreduce-client-common .................... SUCCESS [13.039s]
[INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [2.232s]
[INFO] hadoop-mapreduce-client-app ....................... SUCCESS [7.625s]
[INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [6.198s]
[INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [5.440s]
[INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [1.534s]
[INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [4.577s]
[INFO] hadoop-mapreduce .................................. SUCCESS [2.903s]
[INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [3.509s]
[INFO] Apache Hadoop Distributed Copy .................... SUCCESS [6.723s]
[INFO] Apache Hadoop Archives ............................ SUCCESS [1.705s]
[INFO] Apache Hadoop Rumen ............................... SUCCESS [4.460s]
[INFO] Apache Hadoop Gridmix ............................. SUCCESS [3.330s]
[INFO] Apache Hadoop Data Join ........................... SUCCESS [2.585s]
[INFO] Apache Hadoop Extras .............................. SUCCESS [2.361s]
[INFO] Apache Hadoop Pipes ............................... SUCCESS [9.603s]
[INFO] Apache Hadoop OpenStack support ................... SUCCESS [3.797s]
[INFO] Apache Hadoop Client .............................. SUCCESS [6.102s]
[INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [0.091s]
[INFO] Apache Hadoop Scheduler Load Simulator ............ SUCCESS [3.251s]
[INFO] Apache Hadoop Tools Dist .......................... SUCCESS [5.068s]
[INFO] Apache Hadoop Tools ............................... SUCCESS [0.032s]
[INFO] Apache Hadoop Distribution ........................ SUCCESS [24.974s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 8:54.425s
[INFO] Finished at: Thu Jul 16 18:22:12 CEST 2015
[INFO] Final Memory: 173M/920M
[INFO] ------------------------------------------------------------------------

Using it

First we'll extract the results of our build. Then there's a little bit of configuration needed, even for a single-node setup. Don't worry, I'll copy it here for your comfort ;-)

tar -xvzf /Users/doma/dev/hadoop-2.4.0/hadoop-dist/target/hadoop-2.4.0.tar.gz -C ~/tools
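Since the whole point of this exercise was the native components, it's worth checking that libhadoop actually made it into the dist (has_native_lib is my own helper; the lib/native location matches the dist layout, but the exact library suffix may differ):

```shell
# Check that the native Hadoop library exists in the extracted dist.
has_native_lib() {
  local dist="$1"
  ls "$dist"/lib/native/libhadoop.* >/dev/null 2>&1
}

# e.g.: has_native_lib "$HOME/tools/hadoop-2.4.0" || echo "no native libs" >&2
```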

The contents of ~/tools/hadoop-2.4.0/etc/hadoop/core-site.xml:

<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://localhost:9000</value>
    </property>
</configuration>

The contents of ~/tools/hadoop-2.4.0/etc/hadoop/hdfs-site.xml:

<configuration>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
</configuration>
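Both files share the same one-property shape, so the configuration step can be scripted; a sketch (write_site_xml is my own helper):

```shell
# Generate a minimal single-property Hadoop *-site.xml, matching the
# two configuration files shown above.
write_site_xml() {
  local file="$1" name="$2" value="$3"
  cat > "$file" <<EOF
<configuration>
    <property>
        <name>$name</name>
        <value>$value</value>
    </property>
</configuration>
EOF
}

# e.g., from the extracted Hadoop root:
#   write_site_xml etc/hadoop/core-site.xml fs.defaultFS hdfs://localhost:9000
#   write_site_xml etc/hadoop/hdfs-site.xml dfs.replication 1
```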

Passwordless SSH

From the official docs:

ssh-keygen -t dsa -P '' -f ~/.ssh/id_dsa
cat ~/.ssh/id_dsa.pub >> ~/.ssh/authorized_keys
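Before relying on passwordless login, you can check that the public key really landed in authorized_keys verbatim (key_authorized is my own helper):

```shell
# True if the exact public-key line from $1 is present in $2.
key_authorized() {
  local pub="$1" auth="$2"
  grep -qxF -- "$(cat "$pub")" "$auth" 2>/dev/null
}

# e.g.: key_authorized ~/.ssh/id_dsa.pub ~/.ssh/authorized_keys && echo "key installed"
```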

Starting up

Let's see what we've done. This is a raw copy from the official docs.

  1. Format the filesystem:
    bin/hdfs namenode -format
    
  2. Start NameNode daemon and DataNode daemon:
    sbin/start-dfs.sh
    

    The hadoop daemon log output is written to the $HADOOP_LOG_DIR directory (defaults to $HADOOP_HOME/logs).

  3. Browse the web interface for the NameNode; by default it is available at http://localhost:50070/
  4. Make the HDFS directories required to execute MapReduce jobs:
    bin/hdfs dfs -mkdir /user
    bin/hdfs dfs -mkdir /user/<username>
    
  5. Copy the input files into the distributed filesystem:
    bin/hdfs dfs -put etc/hadoop input
    

    Check if they are there at http://localhost:50070/explorer.html#/

  6. Run some of the examples provided (that's actually one line...):
    bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.4.0.jar grep input output 'dfs[a-z.]+'
    
  7. Examine the output files:

    Copy the output files from the distributed filesystem to the local filesystem and examine them:

    bin/hdfs dfs -get output output
    cat output/*
    

    or

    View the output files on the distributed filesystem:

    bin/hdfs dfs -cat output/*
    
  8. When you're done, stop the daemons with:
    sbin/stop-dfs.sh
    

Possible errors without the fixes & tweaks above

This list is an excerpt from my efforts during the build. It's meant to drive you here via Google ;-) Apply the procedure above and all of these errors will be fixed for you.

Without ProtoBuf

If you don't have protobuf, you'll get the following error:

[INFO] --- hadoop-maven-plugins:2.4.0:protoc (compile-protoc) @ hadoop-common ---
[WARNING] [protoc, --version] failed: java.io.IOException: Cannot run program "protoc": error=2, No such file or directory
[ERROR] stdout: []

Wrong version of ProtoBuf

If you don't have the correct version of protobuf, you'll get

[ERROR] Failed to execute goal org.apache.hadoop:hadoop-maven-plugins:2.4.0:protoc (compile-protoc) on project hadoop-common: org.apache.maven.plugin.MojoExecutionException: protoc version is 'libprotoc 2.6.1', expected version is '2.5.0' -> [Help 1]

CMAKE missing

If you don't have cmake, you'll get

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-common: An Ant BuildException has occured: Execute failed: java.io.IOException: Cannot run program "cmake" (in directory "/Users/doma/dev/hadoop-2.4.0/hadoop-common-project/hadoop-common/target/native"): error=2, No such file or directory
[ERROR] around Ant part ...... @ 4:132 in /Users/doma/dev/hadoop-2.4.0/hadoop-common-project/hadoop-common/target/antrun/build-main.xml

JAVA_HOME missing

If you don't have JAVA_HOME correctly set, you'll get

     [exec] -- Detecting CXX compiler ABI info
     [exec] -- Detecting CXX compiler ABI info - done
     [exec] -- Detecting CXX compile features
     [exec] -- Detecting CXX compile features - done
     [exec] CMake Error at /opt/local/share/cmake-3.2/Modules/FindPackageHandleStandardArgs.cmake:138 (message):
     [exec]   Could NOT find JNI (missing: JAVA_AWT_LIBRARY JAVA_JVM_LIBRARY
     [exec]   JAVA_INCLUDE_PATH JAVA_INCLUDE_PATH2 JAVA_AWT_INCLUDE_PATH)
     [exec] Call Stack (most recent call first):
     [exec]   /opt/local/share/cmake-3.2/Modules/FindPackageHandleStandardArgs.cmake:374 (_FPHSA_FAILURE_MESSAGE)
     [exec]   /opt/local/share/cmake-3.2/Modules/FindJNI.cmake:287 (FIND_PACKAGE_HANDLE_STANDARD_ARGS)
     [exec]   JNIFlags.cmake:117 (find_package)
     [exec]   CMakeLists.txt:24 (include)
     [exec] 
     [exec] 
     [exec] -- Configuring incomplete, errors occurred!
     [exec] See also "/Users/doma/dev/hadoop-2.4.0/hadoop-common-project/hadoop-common/target/native/CMakeFiles/CMakeOutput.log".

JniBasedUnixGroupsNetgroupMapping.c patch missing

If you don't have the patch for JniBasedUnixGroupsNetgroupMapping.c above, you'll get

     [exec] [ 38%] Building C object CMakeFiles/hadoop.dir/main/native/src/org/apache/hadoop/security/JniBasedUnixGroupsNetgroupMapping.c.o
     [exec] /Library/Developer/CommandLineTools/usr/bin/cc  -Dhadoop_EXPORTS -g -Wall -O2 -D_REENTRANT -D_GNU_SOURCE -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -fPIC -I/Users/doma/dev/hadoop-2.4.0/hadoop-common-project/hadoop-common/target/native/javah -I/Users/doma/dev/hadoop-2.4.0/hadoop-common-project/hadoop-common/src/main/native/src -I/Users/doma/dev/hadoop-2.4.0/hadoop-common-project/hadoop-common/src -I/Users/doma/dev/hadoop-2.4.0/hadoop-common-project/hadoop-common/src/src -I/Users/doma/dev/hadoop-2.4.0/hadoop-common-project/hadoop-common/target/native -I/Library/Java/JavaVirtualMachines/jdk1.7.0_51.jdk/Contents/Home/include -I/Library/Java/JavaVirtualMachines/jdk1.7.0_51.jdk/Contents/Home/include/darwin -I/opt/local/include -I/Users/doma/dev/hadoop-2.4.0/hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/util    -o CMakeFiles/hadoop.dir/main/native/src/org/apache/hadoop/security/JniBasedUnixGroupsNetgroupMapping.c.o   -c /Users/doma/dev/hadoop-2.4.0/hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/security/JniBasedUnixGroupsNetgroupMapping.c
     [exec] /Users/doma/dev/hadoop-2.4.0/hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/security/JniBasedUnixGroupsNetgroupMapping.c:77:26: error: invalid operands to binary expression ('void' and 'int')
     [exec]   if(setnetgrent(cgroup) == 1) {
     [exec]      ~~~~~~~~~~~~~~~~~~~ ^  ~
     [exec] 1 error generated.
     [exec] make[2]: *** [CMakeFiles/hadoop.dir/main/native/src/org/apache/hadoop/security/JniBasedUnixGroupsNetgroupMapping.c.o] Error 1
     [exec] make[1]: *** [CMakeFiles/hadoop.dir/all] Error 2
     [exec] make: *** [all] Error 2

fcloseall patch missing

Without applying the fcloseall patch above, you might get the following error:

     [exec] Undefined symbols for architecture x86_64:
     [exec]   "_fcloseall", referenced from:
     [exec]       _launch_container_as_user in libcontainer.a(container-executor.c.o)
     [exec] ld: symbol(s) not found for architecture x86_64
     [exec] collect2: error: ld returned 1 exit status
     [exec] make[2]: *** [target/usr/local/bin/container-executor] Error 1
     [exec] make[1]: *** [CMakeFiles/container-executor.dir/all] Error 2
     [exec] make: *** [all] Error 2

Symlink missing

Without the commands above that create the classes.jar symlink (sudo ln -s $JAVA_HOME/lib/tools.jar $JAVA_HOME/Classes/classes.jar), you'll get

Exception in thread "main" java.lang.AssertionError: Missing tools.jar at: /Library/Java/JavaVirtualMachines/jdk1.7.0_79.jdk/Contents/Home/Classes/classes.jar. Expression: file.exists()
 at org.codehaus.groovy.runtime.InvokerHelper.assertFailed(InvokerHelper.java:395)
 at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.assertFailed(ScriptBytecodeAdapter.java:683)
 at org.codehaus.mojo.jspc.CompilationMojoSupport.findToolsJar(CompilationMojoSupport.groovy:371)
 at org.codehaus.mojo.jspc.CompilationMojoSupport.this$4$findToolsJar(CompilationMojoSupport.groovy)
...

References:

http://java-notes.com/index.php/hadoop-on-osx

https://issues.apache.org/jira/secure/attachment/12602452/HADOOP-9350.patch

http://www.csrdu.org/nauman/2014/01/23/geting-started-with-hadoop-2-2-0-building/

https://developer.apple.com/library/mac/documentation/Porting/Conceptual/PortingUnix/compiling/compiling.html

https://github.com/cooljeanius/libUnixToOSX/blob/master/fcloseall.c

http://hadoop.apache.org/docs/r2.4.1/hadoop-project-dist/hadoop-common/SingleCluster.html


Install Apache Ant 1.8.1 via MacPorts

If the latest version of Apache Ant in MacPorts is not what you're after, you can try to downgrade. Here's how simple it is:

cd ~/tools
svn co http://svn.macports.org/repository/macports/trunk/dports/devel/apache-ant -r 74985
cd apache-ant/
sudo port install

To verify:

# ant -version
# Apache Ant version 1.8.1 compiled on April 30 2010

If you need other revisions, browse the MacPorts Trac log for the port to find the right checkout revision.

Switch between versions

But wait, there's more! You can even switch easily between the different installed versions. This is actually what makes the whole process comfortable: no need to store different versions of tools at arbitrary locations...

sudo port activate apache-ant
--->  The following versions of apache-ant are currently installed:
--->      apache-ant @1.8.1_1 (active)
--->      apache-ant @1.8.4_0
--->      apache-ant @1.9.4_0
Error: port activate failed: Registry error: Please specify the full version as recorded in the port registry.

I have these three versions - I used 1.8.4 for the earlier Tomcat build, and 1.8.1 for the Hadoop build (the next post...)

But now that Hadoop has also been built on OS X for my work, I switch back to the latest version:

sudo port activate apache-ant@1.9.4_0
--->  Deactivating apache-ant @1.8.1_1
--->  Cleaning apache-ant
--->  Activating apache-ant @1.9.4_0
--->  Cleaning apache-ant

To verify:

ant -version
# Apache Ant(TM) version 1.9.4 compiled on April 29 2014

Neat, huh?

6/18/2015

Downgrading MacPorts: use Ant 1.8.4 to build Tomcat 6 in Yosemite 10.10.3

Building Tomcat 6 from MacPorts fails on building jakarta-taglibs-standard-11 due to Ant 1.9.4 being the version present in the MacPorts repo. The error manifests itself like this:

...
    [javac] /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_jakarta-taglibs-standard-11/jakarta-taglibs-standard-11/work/jakarta-taglibs-standard-1.1.2-src/standard/build.xml:178: warning: 'includeantruntime' was not set, defaulting to build.sysclasspath=last; set to false for repeatable builds
    [javac] Compiling 236 source files to /opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_jakarta-taglibs-standard-11/jakarta-taglibs-standard-11/work/jakarta-taglibs-standard-1.1.2-src/standard/build/standard/standard/classes
    [javac] Fatal Error: Unable to find package java.lang in classpath or bootclasspath
BUILD FAILED
/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_tarballs_ports_java_jakarta-taglibs-standard-11/jakarta-taglibs-standard-11/work/jakarta-taglibs-standard-1.1.2-src/standard/build.xml:178: Compile failed; see the compiler error output for details.
...

The build can be fixed by downgrading Ant:

cd ~
svn co -r 94758 http://svn.macports.org/repository/macports/trunk/dports/devel/apache-ant
cd apache-ant
sudo port install

Now Tomcat can be installed from ports:

sudo port install tomcat6

Starting Tomcat now shows we need some further customization:

# sudo port load tomcat6
# sudo less /opt/local/share/java/tomcat6/logs/catalina.err
2015-06-17 20:25:42.976 jsvc[587:5132] Apple AWT Java VM was loaded on first thread -- can't start AWT.
Jun 17, 2015 8:25:42 PM org.apache.catalina.startup.Bootstrap initClassLoaders
SEVERE: Class loader creation threw exception
java.lang.InternalError: Can't start the AWT because Java was started on the first thread.  Make sure StartOnFirstThread is not specified in your application's Info.plist or on the command line
        at java.lang.ClassLoader$NativeLibrary.load(Native Method)
        at java.lang.ClassLoader.loadLibrary0(ClassLoader.java:1833)
        at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1730)
        at java.lang.Runtime.loadLibrary0(Runtime.java:823)
        at java.lang.System.loadLibrary(System.java:1044)
        at sun.security.action.LoadLibraryAction.run(LoadLibraryAction.java:50)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.awt.Toolkit.loadLibraries(Toolkit.java:1605)
        at java.awt.Toolkit.(Toolkit.java:1627)
        at sun.awt.AppContext$2.run(AppContext.java:240)
        at sun.awt.AppContext$2.run(AppContext.java:226)
        at java.security.AccessController.doPrivileged(Native Method)
        at sun.awt.AppContext.initMainAppContext(AppContext.java:226)
        at sun.awt.AppContext.access$200(AppContext.java:112)
        at sun.awt.AppContext$3.run(AppContext.java:306)
        at java.security.AccessController.doPrivileged(Native Method)
        at sun.awt.AppContext.getAppContext(AppContext.java:287)
        at com.sun.jmx.trace.Trace.out(Trace.java:180)
        at com.sun.jmx.trace.Trace.isSelected(Trace.java:88)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.isTraceOn(DefaultMBeanServerInterceptor.java:1830)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.registerDynamicMBean(DefaultMBeanServerInterceptor.java:929)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.registerObject(DefaultMBeanServerInterceptor.java:916)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.registerMBean(DefaultMBeanServerInterceptor.java:312)
        at com.sun.jmx.mbeanserver.JmxMBeanServer$2.run(JmxMBeanServer.java:1195)
        at java.security.AccessController.doPrivileged(Native Method)
        at com.sun.jmx.mbeanserver.JmxMBeanServer.initialize(JmxMBeanServer.java:1193)
        at com.sun.jmx.mbeanserver.JmxMBeanServer.(JmxMBeanServer.java:225)
        at com.sun.jmx.mbeanserver.JmxMBeanServer.(JmxMBeanServer.java:170)
        at com.sun.jmx.mbeanserver.JmxMBeanServer.newMBeanServer(JmxMBeanServer.java:1401)
        at javax.management.MBeanServerBuilder.newMBeanServer(MBeanServerBuilder.java:93)
        at javax.management.MBeanServerFactory.newMBeanServer(MBeanServerFactory.java:311)
        at javax.management.MBeanServerFactory.createMBeanServer(MBeanServerFactory.java:214)
        at javax.management.MBeanServerFactory.createMBeanServer(MBeanServerFactory.java:175)
        at sun.management.ManagementFactory.createPlatformMBeanServer(ManagementFactory.java:302)
        at java.lang.management.ManagementFactory.getPlatformMBeanServer(ManagementFactory.java:504)
        at org.apache.catalina.startup.Bootstrap.createClassLoader(Bootstrap.java:183)
        at org.apache.catalina.startup.Bootstrap.initClassLoaders(Bootstrap.java:92)
        at org.apache.catalina.startup.Bootstrap.init(Bootstrap.java:207)
        at org.apache.catalina.startup.Bootstrap.init(Bootstrap.java:275)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)

That rings a bell. We need to run headless! To customize MacPorts' Tomcat, edit setenv.local:

# sudo vi /opt/local/share/java/tomcat6/conf/setenv.local

This example uses JDK 1.7 and some self-signed certificate magic [setenv.local]:

JAVA_JVM_VERSION=1.7
JAVA_OPTS="-Djava.awt.headless=true -XX:PermSize=500m -XX:MaxPermSize=800m -Xmx2g -Djavax.net.ssl.keyStore=/Users/doma/.keystore -Djavax.net.ssl.keyStorePassword=password -Djavax.net.ssl.trustStore=/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/jre/lib/security/cacerts -Djavax.net.ssl.trustStorePassword=changeit"

Restart Tomcat6:

sudo port unload tomcat6
sudo port load tomcat6
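Before bouncing Tomcat again, it doesn't hurt to check that the headless flag actually made it into setenv.local (headless_configured is my own helper):

```shell
# True if the given config file sets java.awt.headless=true somewhere.
headless_configured() {
  grep -q 'java\.awt\.headless=true' "$1" 2>/dev/null
}

# e.g.: headless_configured /opt/local/share/java/tomcat6/conf/setenv.local
```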

Did we fix it?

# sudo less /opt/local/share/java/tomcat6/logs/catalina.err
Jun 17, 2015 9:29:37 PM org.apache.catalina.core.AprLifecycleListener init
INFO: The APR based Apache Tomcat Native library which allows optimal performance in production environments was not found on the java.library.path: .:/Library/Java/Extensions:/System/Library/Java/Extensions:/usr/lib/java
Jun 17, 2015 9:29:37 PM org.apache.coyote.http11.Http11Protocol init
INFO: Initializing Coyote HTTP/1.1 on http-8080
Jun 17, 2015 9:29:37 PM org.apache.catalina.startup.Catalina load
INFO: Initialization processed in 507 ms
Jun 17, 2015 9:29:37 PM org.apache.catalina.core.StandardService start
INFO: Starting service Catalina
Jun 17, 2015 9:29:37 PM org.apache.catalina.core.StandardEngine start
INFO: Starting Servlet Engine: Apache Tomcat/6.0.35
Jun 17, 2015 9:29:37 PM org.apache.catalina.startup.HostConfig deployDescriptor
INFO: Deploying configuration descriptor host-manager.xml
Jun 17, 2015 9:29:38 PM org.apache.catalina.startup.HostConfig deployDescriptor
INFO: Deploying configuration descriptor manager.xml
Jun 17, 2015 9:29:38 PM org.apache.catalina.startup.HostConfig deployDirectory
INFO: Deploying web application directory docs
Jun 17, 2015 9:29:38 PM org.apache.catalina.startup.HostConfig deployDirectory
INFO: Deploying web application directory examples
Jun 17, 2015 9:29:38 PM org.apache.catalina.core.ApplicationContext log
INFO: ContextListener: contextInitialized()
Jun 17, 2015 9:29:38 PM org.apache.catalina.core.ApplicationContext log
INFO: SessionListener: contextInitialized()
Jun 17, 2015 9:29:38 PM org.apache.catalina.startup.HostConfig deployDirectory
INFO: Deploying web application directory ROOT
Jun 17, 2015 9:29:38 PM org.apache.coyote.http11.Http11Protocol start
INFO: Starting Coyote HTTP/1.1 on http-8080
Jun 17, 2015 9:29:38 PM org.apache.jk.common.ChannelSocket init
INFO: JK: ajp13 listening on /0.0.0.0:8009
Jun 17, 2015 9:29:38 PM org.apache.jk.server.JkMain start
INFO: Jk running ID=0 time=0/18  config=null
Jun 17, 2015 9:29:38 PM org.apache.catalina.startup.Catalina start
INFO: Server startup in 710 ms

Et voila. Tomcat started.

Background

The same process can be applied to downgrade anything in the ports tree. To find the proper release, see https://trac.macports.org/log/trunk/dports/devel/apache-ant

To see which versions are currently installed:

# sudo port installed apache-ant
The following ports are currently installed:
  apache-ant @1.8.4_0 (active)
  apache-ant @1.9.4_0

To use 1.9.4 again:

# sudo port activate apache-ant @1.9.4_0
--->  Deactivating apache-ant @1.8.4_0
--->  Cleaning apache-ant
--->  Activating apache-ant @1.9.4_0
--->  Cleaning apache-ant
# ant -version
Apache Ant(TM) version 1.9.4 compiled on April 29 2014

Reference: https://trac.macports.org/wiki/howto/InstallingOlderPort

6/01/2015

IntelliJ IDEA: pass JAVA_HOME, M2_HOME, MAVEN_OPTS to the IDE using Yosemite

Place the following content (enhance it to your taste, obviously) in /Library/LaunchDaemons/setenv.MAVEN_OPTS.plist:

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
  <plist version="1.0">
  <dict>
  <key>Label</key>
  <string>setenv.MAVEN_OPTS</string>
  <key>ProgramArguments</key>
  <array>
    <string>/bin/launchctl</string>
    <string>setenv</string>
    <string>MAVEN_OPTS</string>
    <string>-XX:PermSize=500m -XX:MaxPermSize=800m -Xmx2g -Djavax.net.ssl.keyStore=/Users/doma/.keystore -Djavax.net.ssl.keyStorePassword=password -Djavax.net.ssl.trustStore=/Library/Java/JavaVirtualMachines/jdk1.7.0_79.jdk/Contents/Home/jre/lib/security/cacerts -Djavax.net.ssl.trustStorePassword=changeit</string>
  </array>
  <key>RunAtLoad</key>
  <true/>
  <key>ServiceIPC</key>
  <false/>
</dict>
</plist>

You'll have to either restart your computer or run the following line to apply the changes:

launchctl load -w /Library/LaunchDaemons/setenv.MAVEN_OPTS.plist

The next candidate is M2_HOME; the file to create is /Library/LaunchDaemons/setenv.M2_HOME.plist:

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
  <plist version="1.0">
  <dict>
  <key>Label</key>
  <string>setenv.M2_HOME</string>
  <key>ProgramArguments</key>
  <array>
    <string>/bin/launchctl</string>
    <string>setenv</string>
    <string>M2_HOME</string>
    <string>/opt/local/share/java/apache-maven-3.1.1</string>
  </array>
  <key>RunAtLoad</key>
  <true/>
  <key>ServiceIPC</key>
  <false/>
</dict>
</plist>

Again, restart or run the following to apply:

launchctl load -w /Library/LaunchDaemons/setenv.M2_HOME.plist

...and here's the proof that you don't have to set the value of M2_HOME after each Maven project import:

Reference: http://www.dowdandassociates.com/blog/content/howto-set-an-environment-variable-in-mac-os-x-launchd-plist/


5/17/2015

Migrating SVN repositories to GIT

mkdir ~/mig
cd ~/mig
rm -rf parent
git svn clone file:///var/svn/repos/parent --stdlayout --no-metadata
cd parent
mv .git/refs/remotes/trunk .git/refs/heads
mv .git/refs/remotes/tags/* .git/refs/tags/

git remote add origin git@java-notes.com:parent.git
git push origin --all
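Note that moving the refs by hand leaves the SVN tags as lightweight tags, and `git push origin --all` pushes branches only, not tags. As a hedged alternative, here's a sketch that converts git-svn's leftover `refs/remotes/tags/*` refs into proper annotated tags; it's demonstrated against a throwaway repository (with a hand-made remote tag ref) rather than a real git-svn clone:

```shell
# Demonstration repo standing in for the git-svn clone:
repo="$(mktemp -d)"
cd "$repo"
git init -q
git -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m 'imported from SVN'
git update-ref refs/remotes/tags/release-1.0 HEAD  # what git svn clone leaves behind

# Convert each remote tag ref into an annotated tag, then drop the remote ref:
for ref in $(git for-each-ref --format='%(refname)' refs/remotes/tags); do
  name="${ref#refs/remotes/tags/}"
  git -c user.name=demo -c user.email=demo@example.com \
      tag -a "$name" -m "SVN tag $name" "$ref"
  git update-ref -d "$ref"
done

git tag  # lists: release-1.0
```

After converting, push the tags explicitly with `git push origin --tags`.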

3/09/2015

IntelliJ Idea live templates: code for singletons

Say you have this class:

public class TerminalTitlebar {
    public void set(String title) {
        System.out.println("\033]0;" + title + "\007");
    }
}

...and you want to introduce code to change this into a singleton easily.

File -> Settings -> Live Templates -> Click + to create a new entry.

Enter i for abbreviation, "singleton instance" or something similar for description, and the code in the "Template text" field:

private static $CLASS_NAME$ instance;
public static $CLASS_NAME$ getInstance() {
    return instance == null ? instance = new $CLASS_NAME$() : instance;
}

Now click "Edit Variables", and change the Expression for CLASS_NAME to "className()", press OK

To tell IntelliJ to use the generated snippet in Java code, click "Change" in the "Applicable in" line, and select Java.

Now type the letter i (this was the abbreviation we used earlier) and press TAB and the magic happens:

public class TerminalTitlebar {
    private static TerminalTitlebar instance;

    public static TerminalTitlebar getInstance() {
        return instance == null ? instance = new TerminalTitlebar() : instance;
    }

    public void set(String title) {
        System.out.println("\033]0;" + title + "\007");
    }
}

Voila, you can use it in any class. Enjoy!


9/18/2014

Building and installing Apache Maven 3.0.5 from sources


cd ~/tools
wget http://repo1.maven.org/maven2/org/apache/maven/apache-maven/3.0.5/apache-maven-3.0.5-src.tar.gz
tar xvzf apache-maven-3.0.5-src.tar.gz
cd apache-maven-3.0.5
#for the next line to succeed, you need a binary maven version in the path
mvn clean install -DskipTests
cd ..
tar xvzf apache-maven-3.0.5/apache-maven/target/apache-maven-3.0.5-bin.tar.gz

9/09/2014

Fix Java font smoothing on Ubuntu: re-building OpenJDK from sources with Infinality patches

OpenJDK7

First we need to install some prerequisites:
sudo apt-get install libasound2-dev libcups2-dev libfreetype6-dev gawk g++ libxext-dev libxrender-dev libxtst-dev libfontconfig1-dev mercurial openjdk-6-jdk

export LANG=C ALT_BOOTDIR=/usr/lib/jvm/java-6-openjdk-amd64
Getting the sources:
cd ~/tools
mkdir jdk7u60
cd jdk7u60
hg clone http://hg.openjdk.java.net/jdk7u/jdk7u60 forest
cd forest
sh ./get_source.sh
Patching fonts with Infinality:
cd ~
git clone https://gist.github.com/2893461.git
mv 2893461/fontfix.patch ~/tools/jdk7u60/forest/jdk/
cd ~/tools/jdk7u60/forest/jdk/
patch -p1 < fontfix.patch
The actual build:
cd ~/tools/jdk7u60/forest
. jdk/make/jdk_generic_profile.sh
#make ALLOW_DOWNLOADS=true fastdebug_build
make all
Verifying:
build/linux-amd64-fastdebug/j2sdk-image/bin/java -version

In case you get java.lang.RuntimeException: time is more than 10 years from present: 1104530400000

Try this:

cd ~/tools/jdk7u60/forest/jdk/
wget http://hg.openjdk.java.net/jdk7u/jdk7u/jdk/raw-rev/74a70385c21d
patch -p1 < 74a70385c21d

OpenJDK6

sudo apt-get install libasound2-dev libcups2-dev libfreetype6-dev gawk g++ libxext-dev libxrender-dev libxtst-dev libfontconfig1-dev mercurial openjdk-6-jdk libmotif-dev
export LANG=C ALT_BOOTDIR=/usr/lib/jvm/java-6-openjdk-amd64
cd ~/tools
mkdir jdk6
cd jdk6
hg clone http://hg.openjdk.java.net/jdk6/jdk6 forest
cd forest
sh ./get_source.sh
cd ~
git clone https://gist.github.com/2893461.git
mv 2893461/fontfix.patch ~/tools/jdk6/forest/jdk/
rm -rf 2893461
cd ~/tools/jdk6/forest/jdk/
patch -p1 < fontfix.patch
cd ~/tools/jdk6/forest
. jdk/make/jdk_generic_profile.sh
make all

OpenJDK8

cd ~/tools
mkdir jdk8
cd jdk8
hg clone http://hg.openjdk.java.net/jdk8/jdk8 forest
cd forest
sh ./get_source.sh
cd ~
git clone https://gist.github.com/2893461.git
mv 2893461/fontfix.patch ~/tools/jdk8/forest/jdk/
rm -rf 2893461
cd ~/tools/jdk8/forest/jdk/
patch -p1 < fontfix.patch
cd ~/tools/jdk8/forest
chmod +x configure
./configure
make all

Reference:

http://mail-index.netbsd.org/pkgsrc-users/2014/12/30/msg020846.html

9/08/2014

Fix Java font smoothing on FreeBSD: re-building the port tree's OpenJDK from sources with Infinality patches

The current OpenJDK available in FreeBSD's port tree lacks some eye-candy regarding font-rendering capabilities. Since Freetype2 already includes subpixel-rendering support, we just have to patch the JDK. So first we run "make extract patch" (i.e. no compiling; only acquiring the sources and applying the official FreeBSD patches), then we apply the Infinality patches available in the git repo below.

OpenJDK7

cd /usr/ports/java/openjdk7
sudo make extract patch
cd ~
git clone https://gist.github.com/2893461.git
sudo mv 2893461/fontfix.patch /usr/ports/java/openjdk7/work/openjdk/jdk/
cd /usr/ports/java/openjdk7/work/openjdk/jdk/
sudo patch -p1 < fontfix.patch
cd /usr/ports/java/openjdk7/
sudo make install clean

OpenJDK6

cd /usr/ports/java/openjdk6
sudo make extract patch
cd ~
rm -rf 2893461
git clone https://gist.github.com/2893461.git
sudo mv 2893461/fontfix.patch /usr/ports/java/openjdk6/work/jdk/
cd /usr/ports/java/openjdk6/work/jdk/
sudo patch -p1 < fontfix.patch
cd /usr/ports/java/openjdk6/
sudo make install clean
If you already have openjdk6 installed, you'll need to reinstall instead. In this case replace the last line with
sudo make deinstall reinstall clean

9/05/2014

Installing JDK 1.6, JDK 1.7 and JDK 1.8 from binary

You never know which one you'll need. Obviously, if you have the luxury of a package manager, that is better, especially if it supports multiple versions. Here I didn't have root permissions on the PC but still needed all versions; switching is done by symlinking the proper version to ~/tools/jdk. Don't forget to download the files first - we can't wget them since we have to accept the license agreement. Here we go:

JDK1.6

mv ~/Downloads/jdk-6u45-linux-x64.bin .
chmod +x jdk-6u45-linux-x64.bin
./jdk-6u45-linux-x64.bin

JDK1.7

mv ~/Downloads/jdk-7u67-linux-x64.tar.gz ~/tools
tar xvzf jdk-7u67-linux-x64.tar.gz

JDK1.8

mv ~/Downloads/jdk-8u11-linux-x64.tar.gz .
tar xvzf jdk-8u11-linux-x64.tar.gz
If you add ~/tools/jdk/bin to your PATH (via ~/.profile?), here's how to use JDK 1.8 as the default:
ln -s ~/tools/jdk1.8.0_11 ~/tools/jdk
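The symlink switch can be wrapped in a tiny helper function (`usejdk` is a hypothetical name, not something the JDK ships with); the sketch below demonstrates it against a throwaway directory standing in for ~/tools:

```shell
# Hypothetical helper: repoint the "current JDK" symlink at a chosen version.
usejdk() {
  local base="$1" version="$2"
  ln -sfn "jdk$version" "$base/jdk"  # -n: replace the symlink itself, don't follow it
}

# Demo against a temporary directory instead of ~/tools:
base="$(mktemp -d)"
mkdir "$base/jdk1.7.0_67" "$base/jdk1.8.0_11"

usejdk "$base" 1.8.0_11
readlink "$base/jdk"  # prints: jdk1.8.0_11

usejdk "$base" 1.7.0_67
readlink "$base/jdk"  # prints: jdk1.7.0_67
```

With ~/tools/jdk/bin on the PATH, something like `usejdk ~/tools 1.8.0_11` then switches the default JDK in one step.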

Building and installing Apache Tomcat 7.0 from sources (no sudo needed)

You will need ant though. If you have a package manager and the appropriate rights it's super easy: either "apt-get install ant" on Ubuntu or "portmaster devel/apache-ant" on FreeBSD - or download Ant and put it into ~/tools. You will also need to update your PATH so that ant can be found.
cd ~/tools
svn co http://svn.apache.org/repos/asf/tomcat/tc7.0.x/trunk ~/tools/apache-tomcat-7-sources
cd ~/tools/apache-tomcat-7-sources
ant -Dbase.path=~/tools/apache-tomcat-7-sources/base download-compile
ant -Dbase.path=~/tools/apache-tomcat-7-sources/base deploy
mkdir -p ~/tools/apache-tomcat-7
mv ~/tools/apache-tomcat-7-sources/output/build/* ~/tools/apache-tomcat-7/
ln -s ~/tools/apache-tomcat-7 ~/tools/tomcat
That concludes the installation. In case you need some JDBC drivers and the like (unfortunately Tomcat 7 does not manage our dependencies the way Karaf does), here is a sample:
cd ~/tools/tomcat/lib/
wget http://repo1.maven.org/maven2/org/hsqldb/hsqldb/2.3.2/hsqldb-2.3.2.jar
wget http://repo1.maven.org/maven2/org/postgresql/postgresql/9.3-1102-jdbc41/postgresql-9.3-1102-jdbc41.jar
wget http://repo1.maven.org/maven2/javax/activation/activation/1.1.1/activation-1.1.1.jar
wget http://repo1.maven.org/maven2/javax/mail/mail/1.5.0-b01/mail-1.5.0-b01.jar
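With the PostgreSQL driver in lib/, a container-managed DataSource can be declared in conf/context.xml. This is a hedged sketch for Tomcat 7's default (DBCP) connection pool; the resource name, database, and credentials below are made-up placeholders:

```xml
<!-- conf/context.xml - hypothetical example; adjust url/username/password -->
<Context>
  <Resource name="jdbc/mydb"
            auth="Container"
            type="javax.sql.DataSource"
            driverClassName="org.postgresql.Driver"
            url="jdbc:postgresql://localhost:5432/mydb"
            username="myuser"
            password="secret"
            maxActive="20"
            maxIdle="5"/>
</Context>
```

The application can then look the pool up via JNDI under java:comp/env/jdbc/mydb.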
You may need to add this to ~/.profile file:
export JAVA_HOME=$HOME/tools/jdk

Building and installing Apache Karaf from sources (no sudo needed)

Once upon a time I was working for a company where sudo rights weren't given to developers, so I had to spend a significant fraction of my time finding workarounds; how silly is that. The no-sudo-needed series is the result of all those efforts: I've saved each process, so the next time it's needed, it's easy to get back to.

The point here is to be able to install them just by copying and pasting the code below. Things change around us though, so whatever may build today might just stop working tomorrow. Thus I do my best to keep these code snippets up-to-date.

Like today, I need Karaf again. Building from source feels very appropriate, so even though I have sudo on my computers since leaving that company, I still keep coming back here for the "easy builds". Today I updated with Karaf 2.3.11; it is currently building in another window. If it succeeds, I'll create another snippet for it. If you need it, feel free to get it!

Building Apache Karaf 2.3.11 from sources

wget http://apache.belnet.be/karaf/2.3.11/apache-karaf-2.3.11-src.tar.gz
tar xvzf apache-karaf-2.3.11-src.tar.gz
cd apache-karaf-2.3.11/src
mvn clean install -DskipTests
mv assemblies/apache-karaf/target/apache-karaf-2.3.11.tar.gz ~/tools
cd ~/tools
tar xvzf apache-karaf-2.3.11.tar.gz
ln -s apache-karaf-2.3.11 karaf

Building Apache Karaf 2.3.6 from sources

wget http://apache.belnet.be/karaf/2.3.6/apache-karaf-2.3.6-src.tar.gz
tar xvzf apache-karaf-2.3.6-src.tar.gz
cd apache-karaf-2.3.6/src
mvn clean install -DskipTests
mv assemblies/apache-karaf/target/apache-karaf-2.3.6.tar.gz ~/tools
cd ~/tools
tar xvzf apache-karaf-2.3.6.tar.gz
ln -s apache-karaf-2.3.6 karaf

Building Apache Karaf 4.0 from sources

git clone https://git-wip-us.apache.org/repos/asf/karaf.git
cd karaf
git checkout karaf-4.0.0
# to avoid "Detected JDK Version: 1.6.0-65 is not in the allowed range [1.7,1.9)" on osx, can only build with 1.7
export JAVA_HOME=`/usr/libexec/java_home -v 1.7`
mvn clean install -DskipTests

Building Karaf 4.0.10 from sources, checking out given branch only, building with java 1.8 on OSX:

git clone -b karaf-4.0.10 --single-branch https://github.com/apache/karaf.git
cd karaf
export JAVA_HOME=`/usr/libexec/java_home -v 1.8`
mvn clean install -DskipTests -DskipITs

Building and installing Apache Karaf 2.2.5 or 2.3.11 from sources (no sudo needed)

In my ~/.m2/settings.xml I have a separate profile which uses a local Nexus repository as a proxy to all the remotes. It is not active by default; it's only activated with the -P local switch. So please don't update your settings.xml like this if you don't have a local Nexus proxy (in that case you won't need the "-P local" switch later, either). The relevant part of my settings.xml:
<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0"
          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
          xsi:schemaLocation="http://maven.apache.org/SETTINGS/1.0.0
                          http://maven.apache.org/xsd/settings-1.0.0.xsd">
    <profiles>
        <profile>
            <id>local</id>
            <!--<activation>-->
            <!--<activeByDefault>true</activeByDefault>-->
            <!--</activation>-->
            <properties>
                <repos.release.url>http://127.0.0.1:8081/content/groups/public</repos.release.url>
                <repos.snapshot.url>http://127.0.0.1:8081/content/groups/public</repos.snapshot.url>
            </properties>
            <repositories>
                <repository>
                    <id>nexus</id>
                    <name>public</name>
                    <url>http://127.0.0.1:8081/content/groups/public</url>
                    <layout>default</layout>
                    <snapshots>
                        <enabled>true</enabled>
                        <updatePolicy>always</updatePolicy>
                    </snapshots>
                    <releases>
                        <enabled>true</enabled>
                        <updatePolicy>daily</updatePolicy>
                    </releases>
                </repository>
                <repository>
                    <id>snapshots</id>
                    <name>snapshots</name>
                    <url>http://127.0.0.1:8081/content/repositories/snapshots</url>
                    <layout>default</layout>
                    <snapshots>
                        <enabled>true</enabled>
                        <updatePolicy>always</updatePolicy>
                    </snapshots>
                    <releases>
                        <enabled>false</enabled>
                        <updatePolicy>daily</updatePolicy>
                    </releases>
                </repository>
                <repository>
                    <id>releases</id>
                    <name>releases</name>
                    <url>http://127.0.0.1:8081/content/repositories/releases</url>
                    <layout>default</layout>
                    <snapshots>
                        <enabled>false</enabled>
                    </snapshots>
                    <releases>
                        <enabled>true</enabled>
                        <updatePolicy>daily</updatePolicy>
                    </releases>
                </repository>
            </repositories>

            <pluginRepositories>
                <pluginRepository>
                    <id>nexus</id>
                    <name>public</name>
                    <url>http://127.0.0.1:8081/content/groups/public</url>
                    <layout>default</layout>
                    <snapshots>
                        <enabled>true</enabled>
                        <updatePolicy>always</updatePolicy>
                    </snapshots>
                    <releases>
                        <enabled>true</enabled>
                        <updatePolicy>daily</updatePolicy>
                    </releases>
                </pluginRepository>
                <pluginRepository>
                    <id>snapshots</id>
                    <name>snapshots</name>
                    <url>http://127.0.0.1:8081/content/repositories/snapshots</url>
                    <layout>default</layout>
                    <snapshots>
                        <enabled>true</enabled>
                        <updatePolicy>always</updatePolicy>
                    </snapshots>
                    <releases>
                        <enabled>false</enabled>
                        <updatePolicy>daily</updatePolicy>
                    </releases>
                </pluginRepository>
                <pluginRepository>
                    <id>releases</id>
                    <name>releases</name>
                    <url>http://127.0.0.1:8081/content/repositories/releases</url>
                    <layout>default</layout>
                    <snapshots>
                        <enabled>false</enabled>
                    </snapshots>
                    <releases>
                        <enabled>true</enabled>
                        <updatePolicy>daily</updatePolicy>
                    </releases>
                </pluginRepository>
            </pluginRepositories>
        </profile>
    </profiles>
</settings>

...and the script for the build for version 2.2.5:
mkdir -p ~/tools
cd ~/tools
wget http://archive.apache.org/dist/karaf/2.2.5/apache-karaf-2.2.5-src.tar.gz
tar xvzf apache-karaf-2.2.5-src.tar.gz
cd apache-karaf-2.2.5/src
mvn clean install -DskipTests -P local
sudo mkdir -p /opt
cd /opt
sudo tar xvzf ~/tools/apache-karaf-2.2.5/src/assemblies/apache-karaf/target/apache-karaf-2.2.5.tar.gz
sudo ln -s apache-karaf-2.2.5 karaf

For version 2.3.11

mkdir -p ~/tools
cd ~/tools
wget http://archive.apache.org/dist/karaf/2.3.11/apache-karaf-2.3.11-src.tar.gz
tar xvzf apache-karaf-2.3.11-src.tar.gz
cd apache-karaf-2.3.11/src
mvn clean install -DskipTests -P local
sudo mkdir -p /opt
cd /opt
sudo tar xvzf ~/tools/apache-karaf-2.3.11/src/assemblies/apache-karaf/target/apache-karaf-2.3.11.tar.gz
sudo ln -s apache-karaf-2.3.11 karaf

Autostart on Ubuntu: to wrap or not to wrap

If you try to install the service wrapper to achieve autostart, everything seems fine at first, but the wrapper segfaults on subsequent reboots; at least that's the sad situation on Ubuntu 12.04.5 (it used to work before).
Still, autostart has to work somehow, so the workaround is to create an /etc/init.d/karaf script with this content:
#!/bin/bash
# log_begin_msg / log_end_msg come from the LSB init functions
. /lib/lsb/init-functions

KARAF_HOME=/home/neusoft/tools/karaf

case "$1" in
  start)
    log_begin_msg "Starting Karaf..."
        $KARAF_HOME/bin/start
    log_end_msg 0
    ;;
  stop)
    log_begin_msg "Shutting down Karaf..."
        $KARAF_HOME/bin/stop
    log_end_msg 0
    ;;
  *)
  ;;
esac
Make the script executable and register it with the system:
sudo chmod +x /etc/init.d/karaf
sudo update-rc.d karaf defaults 98 02

Autostart on FreeBSD

Using FreeBSD, autostarting the service is different:

echo 'karaf_enable="YES"' | sudo tee -a /etc/rc.conf > /dev/null

The rc.conf entry will start the script under /etc/rc.d. And the contents of the /etc/rc.d/karaf file:

#!/bin/sh
. /etc/rc.subr

name="karaf"
rcvar=karaf_enable
export KARAF_OPTS="-Dkey=value"
start_cmd="karaf_start"
stop_cmd="karaf_stop"
karaf_start() {
        export JAVA_HOME=/usr/local/openjdk7
        /opt/karaf/bin/start
}
karaf_stop() {
        export JAVA_HOME=/usr/local/openjdk7
        /opt/karaf/bin/stop
}
load_rc_config $name
run_rc_command "$1"

Then you just have to make it executable:

sudo chmod +x /etc/rc.d/karaf

Or even make a symbolic link for those who expect logs to be in standard places:

sudo ln -s /opt/karaf/data/log /var/log/karaf

Gotchas

Building on OSX Yosemite with Apple's JDK 1.6 succeeds, but logging in shows that we have a crippled container:

# features:addurl http://repo1.maven.org/maven2/org/apache/felix/org.apache.felix.ipojo.features/1.12.0/org.apache.felix.ipojo.features-1.12.0.xml
# Command not found: features:addurl

Investigating /opt/karaf/data/log/karaf.log shows us the cause of the error:

"Bundle org.apache.karaf.features.command is waiting for dependencies [(objectClass=org.apache.karaf.features.FeaturesService)]"

Installing Oracle JDK 1.7 & setting JAVA_HOME appropriately fixes the problem.

export JAVA_HOME=`/usr/libexec/java_home -v 1.7`
...
(rebuild the whole shebang)
...

Building and installing Apache Maven from sources (no sudo needed)

Maven 3.0.3

Maven 3.0.4 is known for its incompatibility with self-signed certificates, which can be fixed with a small patch. If you don't depend on that version, it's even easier to avoid the broken version entirely. Here's how to build 3.0.3 from sources:
mkdir -p ~/tools
cd ~/tools
wget http://repo1.maven.org/maven2/org/apache/maven/apache-maven/3.0.3/apache-maven-3.0.3-src.tar.gz
tar xvzf apache-maven-3.0.3-src.tar.gz
cd apache-maven-3.0.3
#for the next line to succeed, you need a binary maven version in the path
mvn clean install -DskipTests
cd ..
tar xvzf apache-maven-3.0.3/apache-maven/target/apache-maven-3.0.3-bin.tar.gz

Maven 3.2.2

mkdir -p ~/tools
cd ~/tools
wget http://repo1.maven.org/maven2/org/apache/maven/apache-maven/3.2.2/apache-maven-3.2.2-src.tar.gz
tar xvzf apache-maven-3.2.2-src.tar.gz
cd apache-maven-3.2.2
#for the next line to succeed, you need a binary maven version in the path
mvn clean install -DskipTests
cd ..
tar xvzf apache-maven-3.2.2/apache-maven/target/apache-maven-3.2.2-bin.tar.gz

8/21/2014

Using Mylyn for Rally in Eclipse Luna

Mylyn has a Rally connector that makes it possible to use the nice task editor instead of the web interface. Unfortunately, after installing the connector, it does not work. The error log tells us:
java.lang.NullPointerException
 at com.rallydev.mylyn.internal.core.RallyClient.stripQualifier(RallyClient.java:1720)
 at com.rallydev.mylyn.internal.core.RallyClient.updateHttpHeaders(RallyClient.java:1714)
 at com.rallydev.mylyn.internal.core.RallyClient.internalGet(RallyClient.java:1447)
 at com.rallydev.mylyn.internal.core.RallyClient.parseResult(RallyClient.java:1389)
 at com.rallydev.mylyn.internal.core.RallyClient.getSubscription(RallyClient.java:1327)
 at com.rallydev.mylyn.internal.core.RallyClient.updateRepositoryData(RallyClient.java:1369)
 at com.rallydev.mylyn.internal.core.RallyRepositoryConnector.updateRepositoryConfiguration(RallyRepositoryConnector.java:119)
 at com.rallydev.mylyn.internal.core.RallyRepositoryConnector.updateRepositoryConfiguration(RallyRepositoryConnector.java:136)
 at org.eclipse.mylyn.internal.tasks.core.sync.UpdateRepositoryConfigurationJob.run(UpdateRepositoryConfigurationJob.java:46)
 at org.eclipse.core.internal.jobs.Worker.run(Worker.java:54)

A bit of googling leads us to https://rallydev.force.com/answers?id=90614000000ChHdAAK, which tells us we need to hack eclipse.ini by adding -Dosgi.framework.version=5.0.0 (obvious, isn't it?). Just in case, I also specified JDK 1.6 with "-vm /usr/people/bedomanl/tools/jdk1.6.0_45/bin/java". Here's the resulting eclipse.ini file:
-startup
plugins/org.eclipse.equinox.launcher_1.3.0.v20140415-2008.jar
--launcher.library
plugins/org.eclipse.equinox.launcher.gtk.linux.x86_64_1.1.200.v20140603-1326
-product
org.eclipse.epp.package.jee.product
--launcher.defaultAction
openFile
-showsplash
org.eclipse.platform
--launcher.XXMaxPermSize
256m
--launcher.defaultAction
openFile
--launcher.appendVmargs
-vm
/usr/people/bedomanl/tools/jdk1.6.0_45/bin/java
-vmargs
-Dosgi.requiredJavaVersion=1.6
-Dosgi.framework.version=5.0.0
-XX:MaxPermSize=256m
-Xms40m
-Xmx512m

12/09/2009

Eclipse: quickly change perspectives with keyboard shortcuts

Perspectives offer a nice way to quickly adapt Eclipse to the kind of work we are doing; there is a perspective for development, debugging, etc. After using it a while, creating some keyboard-shortcuts helps a lot as well. At the end of this article, we'll have perspective-icons like these:



Let's see what the buttons do:
  • JavaEE Perspective (Alt+1)
    • In this perspective the Project Explorer (tree) is displayed on the left, on the right we have the code-editor window and the tool-windows benefiting from a wide display are on the bottom. The icon shows a tree, that's why it makes sense to use this perspective for the one displaying the tree.
    • To open this perspective, select Window->Open Perspective->Other, and select JavaEE. The defaults are almost right, we just have to either close the outline and task list view if we don't use it, or move them to the left (next to the Project Explorer) so that they won't occupy so much space.
    • To attach it to Alt+1, press Ctrl+Shift+L twice (a double-shortcut: the first press shows the available options in a tooltip, the second opens the Preferences->Keys window); in "type filter text" enter java ee; click Show Perspective (Parameter: Java EE); click Binding, press the new key combination Alt+1, then Apply and OK.
  • Java Perspective (Alt+2)
    • This one is very similar to the previous one, except that we don't have the tree displayed, which results in a much wider source-window (for developers fancying long lines...)
    • To open this perspective, select Window->Open Perspective->Other, and select Java (Default). Let's close the Outline / Task List / Spring Explorer view as well as the Package Explorer and Hierarchy view. 
    • Attach the Show Perspective (Parameter: Java) command to Alt+2
    • This results in a very clean interface; we just have to get used to pressing Alt+1 to switch back when we need the views we just closed. Just try pressing Alt+1 and Alt+2 alternately a couple of times; you'll see how easy it is to switch. In no time we get used to pressing Alt+2 when we need a "code-and-tool" like environment and Alt+1 when we want to browse around in the Package Explorer / Ant / etc. views.
  • Debug Perspective (Alt+3)
    • The debug-layout really depends on the developer, but the methodology is similar. So open this perspective and customize it the way you want.
    • Attach it to Alt+3.
  • Team Synchronizing Perspective (Alt+4)
    • Customize / attach key like earlier.
  • Database Development Perspective (Alt+5)
    • Customize / attach key like earlier.
The last ones really depend on your development workflow. A typical workflow using these shortcuts can be something like this:
  • Find and Open the needed class or resource:
    • Alt+1 for browsing in Project Explorer
    • Alt+2 and press Ctrl+Shift+R to quickly open the source-file/resource. Here the first couple of characters of the classname should be enough to type - or CaMeL case, e.g. if you want to open DeadLetterChannel, just enter DLC. Neat, huh? And highly addictive...
  • After the modifications are done, starting the debugger switches immediately to the Debug Perspective; but if the wide code-view is needed, just press Alt+2 (a real relief after the very-useful-but-so-crowded Debug perspective), and when you need something from the debugger tool windows, it's very easy to switch back to the Debug Perspective with Alt+3. Since the usual shortcuts like F5 (step into), F6 (step over) and F7 (step return) work in the Java perspective as well, I usually find myself stepping through the code there, and only when I arrive at the troublesome part do I switch to the Debug Perspective to see the watches, call stacks, etc.
  • When everything is ready, debugged, etc, Alt+4 switches to the Synchronising Perspective to submit the changes to source-control.
  • After a rebuild on the development server, the Database perspective (Alt+5) can be used to interact with the database.
The order of the buttons in the top-right corner depends on the order in which we opened the perspectives; I usually order them the same way as my keyboard shortcuts (e.g. the first one is JavaEE because Alt+1 activates it, etc.), so the mouse usage stays consistent with the keyboard. The buttons can easily be re-ordered by dragging them into place with the mouse. (Note: the "Show Text" option is turned off on the screenshot at the top of this post.)

Comments and opinions are warmly welcome!
What does your developer workflow look like?

Thanks for reading and have a nice day!

11/27/2009

AppFuse Struts2 Tutorial with Eclipse / SpringSource Tools Suite

Excerpt from AppFuse's website:

AppFuse is an open source project and application that uses open source tools built on the Java platform to help you develop Web applications quickly and efficiently. It was originally developed to eliminate the ramp-up time found when building new web applications for customers. At its core, AppFuse is a project skeleton, similar to the one that's created by your IDE when you click through a wizard to create a new web project.

AppFuse builds on Maven to accomplish its goal. AppFuse + Maven can be used from the command line, creating our application with a single command; but here we will be using Eclipse.

In this tutorial we'll go through the creation of a simple Struts2 web application, which we'll extend later. We'll use SpringSource Tools Suite 2.2.1 (STS), a free tool based on Eclipse 3.5.1 with a pre-configured set of plugins useful for web development. STS includes the Maven2 plugin, which we'll use to configure AppFuse for us.

So, grab a copy of SpringSource Tools from http://www.springsource.com/products/sts. After the installation, select File->New->Project... Type maven, and click Next:



Leave "Use default Workspace location" checked and click Next.



Enter struts in the filter, and select struts2-archetype-starter. This is a so-called "Maven Archetype": a ready-to-use project template built around the Project Object Model (POM), which will create a starter application for Apache Struts 2.0. Click Next.



Now enter the following info:
  • Group Id: org.techmissive
  • Artifact Id: testdrive-struts
  • Version: 1.0.0
  • Package: org.techmissive.testdrive.struts
And click "Finish":



We have created our template project. Now Eclipse (I mean STS, of course) will start downloading all our dependencies and build the project for us. On OSX we can see the Maven log directly; on Windows we have to open the Console view (Alt+Shift+Q then C), click the down-arrow next to the Open Console button on its toolbar (the window-like button left of the green lamp) and select Maven console to see what Maven is doing.



Included with this template is Jetty, a lightweight servlet container, which we can use to test our application locally without installing a full-blown J2EE server. Let's see how to start it from STS.

Select Run -> Run Configurations... from the menu; here we'll specify what happens when we start our application. Select Maven Build from the list, and click the New icon at the top of the window to create a new configuration.



To set the Base directory, click Browse Workspace, then click testdrive-struts and OK. This puts ${workspace_loc:/testdrive-struts} in the Base directory field, a path resolved relative to our workspace location.

To run Jetty, we have to execute a specific Maven goal, so enter jetty:run-war in the Goals field.

Replace the default New_configuration name with something more meaningful, like Run Jetty.



Click Apply to save, and Run to start the configuration we just created.
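If you prefer the terminal, this run configuration is equivalent to invoking Maven directly in the project directory (the workspace path below is an assumption - adjust it to your own setup):

```shell
cd ~/workspace/testdrive-struts   # assumed workspace location
mvn jetty:run-war                 # package the WAR and start Jetty
```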

Now Eclipse is working again, downloading all kinds of dependencies, but in the end it proudly confirms that Jetty started successfully:



Let's test if it works. Jetty listens on port 8080 by default, so let's check what's there in a browser. Start Safari and enter http://localhost:8080/testdrive-struts in the address field:
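You can also check from a terminal, assuming Jetty is already running:

```shell
curl -I http://localhost:8080/testdrive-struts   # inspect the response headers
```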



We have a running web application and we didn't have to write a single line of code. This is the main purpose of AppFuse: to provide us with a skeleton application we can start working on without spending hours hunting for libraries, configuring them, and so on. I just wish all the profiles worked the first time like this one...

10/21/2009

Mac OS X: Using IntelliJ IDEA on 64bit JDK6

UPDATE 2:
The retail version of IntelliJ Idea 9 uses the JDK6 32bit by default on Snow Leopard (in contrast with the betas which were using JDK6 64bit).
The article has been updated to reflect this.

IntelliJ Idea 9.0 - Mac OSX 10.6 Snow Leopard

The default installation of IntelliJ IDEA 9.0 uses the 32-bit JDK6 on Mac OS X 10.6 "Snow Leopard". With a little configuration we can force it to use the 64-bit JDK6, which is also part of the operating system.

After you've installed IntelliJ IDEA, Cmd+click the IDEA launcher icon to reveal it in Finder. Ctrl-click the IntelliJ IDEA icon and choose Show Package Contents. Open Contents and right-click Info.plist. Select Open With -> TextEdit. This brings up the file in the text editor.

Do a search (CMD+f) on the string JVMArchs. You'll find the following entry:

<key>JVMArchs</key>
<array>
  <string>i386</string>
  <string>x86_64</string>
  <string>ppc</string>
</array>

Move the x86_64 entry to the top (i.e. swap the i386 and x86_64 lines), save the file (CMD+s) and close TextEdit (CMD+q). If you restart IDEA, you can verify that it is using the 64-bit JDK under IntelliJ Idea->About.
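After the swap, the JVMArchs section should look like this:

```xml
<key>JVMArchs</key>
<array>
  <string>x86_64</string>
  <string>i386</string>
  <string>ppc</string>
</array>
```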

You can also check if a given process is 64bit in Applications->Utilities->Activity Monitor. Check out the last column called Kind.

IntelliJ Idea 8.0, 9.0 - Mac OSX 10.5 Leopard

In contrast, IDEA uses JDK5 on Mac OS X 10.5 "Leopard", which is 32-bit only. Again, with just a little configuration, we can force it to use the 64-bit JDK6.

After you've installed IntelliJ IDEA, CMD+click the IDEA launcher icon to bring up the entry in Finder, then right-click (or Ctrl-click) the IntelliJ IDEA icon and choose Show Package Contents. Open Contents and right-click Info.plist. Select Open With -> TextEdit. This brings up the file in the text editor.

Do a search (CMD+f) on the string JVMVersion. You'll find the following entry:
<key>JVMVersion</key>
<string>1.5*</string>
Change the 1.5* to 1.6*, save the file (CMD+s) and close TextEdit (CMD+q). Since JDK6 on Leopard is 64-bit only, we don't have to touch the JVMArchs section.

Start IDEA and check in About IntelliJ IDEA if it is indeed using the 64bit JDK6.

IntelliJ Idea 8 and 9 betas: if IntelliJ no longer starts and honors us with a [JavaAppLauncher Error] unable to find a version of Java to launch in the log file, we have to replace the launcher with a 64-bit executable. Open a terminal:

cp /System/Library/Frameworks/JavaVM.framework/Resources/MacOS/JavaApplicationStub64 /Applications/Maia-IU-90.94.app/Contents/MacOS/idea

Replace Maia-IU-90.94 with the actual name of your application bundle - I was using a developer preview of IntelliJ IDEA 9, but the same method works for version 8 as well.

32bit or 64bit?

To me, IntelliJ Idea seems more responsive on 64-bit JDKs; I wonder if you find it faster as well. If you are developing for JDK6 on Leopard, you should use a similar amount of memory now (the 64-bit JDK footprint is bigger, but there's only one copy of the JDK in memory), and - as a matter of personal taste - the display looks nicer, since JDK6 uses the native font-smoothing style of OS X (a mixture of regular antialiasing and subpixel rendering), whereas JDK5 uses plain anti-aliasing without subpixel rendering. There are some comments online about memory leaks using JDK6, but personally I haven't experienced this.

Happy developing using this great tool!