
Wednesday, December 24, 2008

Flex : Solution to Error: unable to resolve '/assets/icons/icon.png' for transcoding using Embed tag

It so happens that when we work in an IDE like Flex Builder, everything seems to work fine, but the same setup does not work outside the IDE.

Problem:

We use Maven as the build tool, and the MXML and CSS files have tags like:

[Embed(source='/assets/icons/icon.png')]

We get the error

Error: unable to resolve '/assets/icons/icon.png' for transcoding

You may have tried putting the assets folder alongside the main application file, in a common location, and in every other place you can think of, and the problem is still not resolved.
You may also have tried with and without a '/' in front of 'assets' in the Embed tag, as suggested by many people around the Internet.

Solution:

Create a 'swc' library containing all the assets that will be used in Embed tags, and put it on the mxmlc class path. Then use the path without a leading forward slash, as in assets/icons/icon.png.
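For example, with assets.swc on the library path, the Embed tag from the problem above simply becomes:

[Embed(source='assets/icons/icon.png')]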

Next Problem:

How do you create a swc file?

Solution:

The method I adopted initially, just to get my app out since I had a release the next day: if you own Flex Builder, create a Flex Library project and include all these resources in the library. Then use the generated library on the mxmlc build path to build the project.

Next Problem:

I cannot keep adding every new resource to my Flex Builder library project just to regenerate the swc, so this has to be automated. Before running mxmlc, I have to create the swc and add it to the mxmlc build path so that the compilation succeeds.

Solution:

This is done using the compc compiler. I use compc to generate a swc of the resources being embedded in the Flex project, and then run mxmlc.

Next Problem:

Running compc directly from the command line to include files takes the form:

compc -output assets.swc -include-file <file_alias_1> <real file path 1> .. -include-file <file_alias_n> <real file path  n>
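For example, for the single icon above (the alias is the path used in the Embed tag, followed by the real file location; the names here are only illustrative):

compc -output libs/assets.swc -include-file assets/icons/icon.png src/assets/icons/icon.png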

or

We can use the compc task in the maven-antrun-plugin. But this compc task can only include classes; it cannot include arbitrary resources.

Solution:

I assume that the folder structure is like below:

flex
|-- libs
|-- mainapp
     |-- src
           |-- index.mxml
           |-- assets
           |     |-- icons
           |     |-- images
           |-- css
           |-- pom.xml

The packages that the Flex app uses can be anywhere, either in mainapp/src or outside it, but they are configured in the class path of the mxmlc task in pom.xml.

Step 1: Add the following dependencies with in the maven-antrun-plugin.

<dependencies> 
<dependency>
<groupId>ant-contrib</groupId>
<artifactId>ant-contrib</artifactId>
<version>1.0b2</version>
</dependency>
<dependency>
<groupId>ant</groupId>
<artifactId>ant-jakarta-regexp</artifactId>
<version>1.6.1</version>
</dependency>
<dependency>
<groupId>ant</groupId>
<artifactId>ant-nodeps</artifactId>
<version>1.6.5</version>
</dependency>
</dependencies>


Step 2: Define ant-contrib task as below:
<taskdef 
resource="net/sf/antcontrib/antlib.xml" classpathref="maven.plugin.classpath">
</taskdef>

Step 3: Before invoking the mxmlc task, add the Ant script below. If you are not using Maven, use the same script directly in the Ant build.xml, making sure the dependent jars are present in the ant/lib directory.
    <fileset id="assets.flex" dir="src" includes="**/*.jpg,**/*.png,**/*.css,**/*.swf,**/*.TTF,**/*.jpeg,**/*.xml" /> 
<pathconvert pathsep=" " property="assets.flex.output" refid="assets.flex" dirsep="/">
<map from="${basedir}/src/" to=""/>
</pathconvert>
<echo message="...Resources being considered..."/>
<var name="filelist" value=""/>
<var name="prefixfilelist" value="-include-file"/>
<for list="${assets.flex.output}" delimiter=" " param="asset">
<sequential>
<echo>Asset: @{asset}</echo>
<propertyregex property="prop"
input="@{asset}"
regexp="(.*)mainapp/src/(.*)"
select="\2"
casesensitive="false" />
<var name="filelist_tmp" value="${filelist}"/>
<var name="filelist" unset="true"/>
<var name="filelist" value="${filelist_tmp} ${prefixfilelist} ${prop} @{asset}"/>
<var name="prop" unset="true"/>
</sequential>
</for>
<echo message="-output assets.swc ${filelist}"/>
<exec executable="${FLEX_HOME}/bin/compc.exe" failonerror="true">
<arg line="-output ../libs/assets.swc ${filelist}"/>
</exec>

It is just a simple script that generates the required arguments for the compc compiler and invokes it; any developer should be able to read and understand it, so I will leave out the detailed explanation.


Step 4: Proceed with the mxmlc task with libs in the class path, sample is given below:
<mxmlc  file="src/index.mxml" 
output="../target/App.swf"
link-report="../target/report.xml"
warnings="false"
services="${CONFIG}/services-config.xml"
context-root = "/App"
>
<load-config filename="${FLEX_HOME}/frameworks/flex-config.xml"/>
<source-path path-element="${FLEX_HOME}/frameworks"/>
<compiler.library-path dir="${FLEX_HOME}/frameworks" append="true">
<include name="libs" />
</compiler.library-path>
<compiler.library-path dir="../." append="true">
<include name="libs" />
</compiler.library-path>
<compiler.source-path path-element="../mainapp/src"/>
</mxmlc>

This makes sure all your assets are present on the mxmlc build path, and the Embed transcoding error does not occur.
For reference, here is the entire antrun plugin section of the pom file:
<build> 
<plugins>
<plugin>
<artifactId>maven-antrun-plugin</artifactId>
<dependencies>
<dependency>
<groupId>ant-contrib</groupId>
<artifactId>ant-contrib</artifactId>
<version>1.0b2</version>
</dependency>
<dependency>
<groupId>ant</groupId>
<artifactId>ant-jakarta-regexp</artifactId>
<version>1.6.1</version>
</dependency>
<dependency>
<groupId>ant</groupId>
<artifactId>ant-nodeps</artifactId>
<version>1.6.5</version>
</dependency>
</dependencies>
<executions>
<execution>
<phase>compile</phase>
<configuration>
<tasks>
<taskdef
resource="net/sf/antcontrib/antlib.xml" classpathref="maven.plugin.classpath">
</taskdef>
<taskdef resource="flexTasks.tasks" />
<fileset id="assets.flex" dir="src" includes="**/*.jpg,**/*.png,**/*.css,**/*.swf,**/*.TTF,**/*.jpeg,**/*.xml" />
<pathconvert pathsep=" " property="assets.flex.output" refid="assets.flex" dirsep="/">
<map from="${basedir}/src/" to=""/>
</pathconvert>
<echo message="...Resources being considered..."/>
<var name="filelist" value=""/>
<var name="prefixfilelist" value="-include-file"/>
<for list="${assets.flex.output}" delimiter=" " param="asset">
<sequential>
<echo>Asset: @{asset}</echo>
<propertyregex property="prop"
input="@{asset}"
regexp="(.*)mainapp/src/(.*)"
select="\2"
casesensitive="false" />
<var name="filelist_tmp" value="${filelist}"/>
<var name="filelist" unset="true"/>
<var name="filelist" value="${filelist_tmp} ${prefixfilelist} ${prop} @{asset}"/>
<var name="prop" unset="true"/>
</sequential>
</for>
<echo message="-output assets.swc ${filelist}"/>
<exec executable="${FLEX_HOME}/bin/compc.exe" failonerror="true">
<arg line="-output ../libs/assets.swc ${filelist}"/>
</exec>
<mxmlc file="src/index.mxml"
output="../target/App.swf"
link-report="../target/report.xml"
warnings="false"
services="${CONFIG}/services-config.xml"
context-root = "/App"
>
<load-config filename="${FLEX_HOME}/frameworks/flex-config.xml"/>
<source-path path-element="${FLEX_HOME}/frameworks"/>
<compiler.library-path dir="${FLEX_HOME}/frameworks" append="true">
<include name="libs" />
</compiler.library-path>
<compiler.library-path dir="../." append="true">
<include name="libs" />
</compiler.library-path>
<compiler.source-path path-element="../mainapp/src"/>
</mxmlc>
<goals>
<goal>run</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>

Sunday, December 07, 2008

Tutorial to use Eclipse with JAX-WS for contract-first Webservice

I assume that the WSDL is already present; this tutorial shows how to use Eclipse to generate the Java code.

Configuration:
Eclipse 3.3
JAX-WS RI 2.1.5
JDK 6

Step 1: Download the JAX-WS RI jar(JAXWS2.1.5-20081030.jar) from https://jax-ws.dev.java.net/2.1.5/

Step 2: Run the command java -jar JAXWS2.1.5-20081030.jar, accept the license, and allow it to be installed. I will call the installation directory $JAXWS_HOME.

Step 3: Open Eclipse, and go to Run->External Tools->Open External Tools Dialog...

Step 4: Give the name as GenerateJavaFromWSDL, the Location as $JAXWS_HOME\bin\wsimport.bat, and the Working Directory (the directory containing the WSDL) as ${project_loc}/WEB-INF/wsdl, where ${project_loc} is an Eclipse variable that already exists (no need to create it) and points to the project location. Set the arguments to "myWSDL.wsdl -keep -d ${project_loc}/WEB-INF/src" (without quotes), where myWSDL.wsdl is the WSDL to generate Java code from, -keep keeps the generated Java files, and -d is the output directory. You can also provide -p as an argument to specify the package for the generated Java files.
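For reference, the equivalent invocation from a plain command prompt would look roughly like this (the output directory placeholder and the package name are only examples):

wsimport -keep -d <output-src-dir> -p com.example.generated myWSDL.wsdl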

Step 5: Click the Run button. The java files are generated.

Wednesday, December 03, 2008

Filtering in Java

I find the piece of code below pretty useful for filtering in Java. I use the Apache Commons Collections library for this purpose. In the snippet, I am filtering on the id of the Data model objects present in the List. After running it, 'list' will contain only those Data objects that satisfy the filter criteria, in this case 'userId = Id'. The Predicate defines the filter criteria.

public void filterOnId(String Id,List<Data> list){ 
final StringBuilder string = new StringBuilder(Id);
Predicate predicate = new Predicate(){
public boolean evaluate(Object object) {
boolean returnValue = false;
if(object instanceof Data){
Data vo = (Data)object;
returnValue = vo.getUserId().compareTo(string.toString())==0?true:false;
}
return returnValue;
}
};
CollectionUtils.filter(list, predicate);
}
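A quick usage sketch (assuming Data has a constructor taking the userId, which is not shown here):

List<Data> list = new ArrayList<Data>();
list.add(new Data("user1"));
list.add(new Data("user2"));
filterOnId("user1", list);
// list now holds only the Data objects whose userId is "user1"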

Sunday, November 30, 2008

Integrating ActiveMQ 5.1 with Tomcat 6

Step 1: Add resource entry to CATALINA_HOME/conf/server.xml under GlobalNamingResources..

    <Resource 
name="jms/ConnectionFactory"
auth="Container"
type="org.apache.activemq.ActiveMQConnectionFactory"
description="JMS Connection Factory"
factory="org.apache.activemq.jndi.JNDIReferenceFactory"
brokerURL="tcp://localhost:61616"
brokerName="LocalActiveMQBroker"
useEmbeddedBroker="false"
/>
<Resource
name="jms/Data"
auth="Container"
type="org.apache.activemq.command.ActiveMQTopic"
description="Data Topic"
factory="org.apache.activemq.jndi.JNDIReferenceFactory"
physicalName="queue.data"
/>

Step 2: Add the resource link entries to CATALINA_HOME/conf/context.xml (for God knows what reason, an entry in the context.xml under META-INF of my webapp did not work)
<ResourceLink global="jms/ConnectionFactory" name="jms/ConnectionFactory" type="javax.jms.ConnectionFactory"/> 
<ResourceLink global="jms/Data" name="jms/Data" type="javax.jms.Topic"/>

Step 3: Add resource ref entry in web.xml
    <resource-ref> 
<res-ref-name>jms/ConnectionFactory</res-ref-name>
<res-type>javax.jms.ConnectionFactory</res-type>
<res-auth>Container</res-auth>
<res-sharing-scope>Shareable</res-sharing-scope>
</resource-ref>
<resource-ref>
<res-ref-name>jms/Data</res-ref-name>
<res-type>javax.jms.Topic</res-type>
<res-auth>Container</res-auth>
<res-sharing-scope>Shareable</res-sharing-scope>
</resource-ref>

Step 4: Put the following jars in CATALINA_HOME/lib from ActiveMQ_HOME/lib
activemq-core-5.1.0.jar
commons-logging-1.1.jar
geronimo-j2ee-management_1.0_spec-1.0.jar
geronimo-jms_1.1_spec-1.1.1.jar
geronimo-jta_1.0.1B_spec-1.0.1.jar



Step 5: Start the server, you should be able to use JNDI to connect to ActiveMQ.


Step 6: When you use InitialContext in your code, first get the context of "java:comp/env".
    Context initCtx = new InitialContext(); 
jndiContext = (Context) initCtx.lookup("java:comp/env");
Then you can use:
connectionFactory = (ConnectionFactory)jndiContext.lookup("jms/ConnectionFactory");
destination = (Destination)jndiContext.lookup("jms/Data");


Then, carry on with normal JMS flow.
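For completeness, a minimal sketch of that flow, publishing a message to the topic looked up above (the payload is just an example):

Connection connection = connectionFactory.createConnection();
Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
MessageProducer producer = session.createProducer(destination);
connection.start();
producer.send(session.createTextMessage("hello from Tomcat"));
session.close();
connection.close();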

JMS Tutorial : Topic Subscriber Client

This is in continuation of my tutorial on JMS. The client below uses a topic to communicate with ActiveMQ.

The jndi.properties

# START SNIPPET: jndi 

java.naming.factory.initial = org.apache.activemq.jndi.ActiveMQInitialContextFactory

# use the following property to configure the default connector
java.naming.provider.url = tcp://localhost:61616

# use the following property to specify the JNDI name the connection factory
# should appear as.
#connectionFactoryNames = connectionFactory, queueConnectionFactory, topicConnectionFactry
connectionFactoryNames = connectionFactory, queueConnectionFactory, topicConnectionFactry

# register some queues in JNDI using the form
# queue.[jndiName] = [physicalName]

# register some topics in JNDI using the form
# topic.[jndiName] = [physicalName]
#topic.MyTopic = example.MyTopic
topic.sample.data = sample.data

# END SNIPPET: jndi

/** 
* @author shreyas.purohit
*
*/
public class SampleJMSConsumer implements Runnable {

public void run() {
Context jndiContext = null;
TopicConnectionFactory connectionFactory = null;
TopicConnection connection = null;
TopicSession session = null;
TopicSubscriber consumer = null;
Topic destination = null;
String sourceName = null;
final int numMsgs;
sourceName = "sample.data";

/*
* Create a JNDI API InitialContext object
*/
try {
jndiContext = new InitialContext();
} catch (NamingException e) {
e.printStackTrace();
System.exit(1);
}

/*
* Look up connection factory and destination.
*/
try {
connectionFactory = (TopicConnectionFactory) jndiContext
.lookup("topicConnectionFactry");
destination = (Topic)jndiContext.lookup(sourceName);
} catch (NamingException e) {
e.printStackTrace();
System.exit(1);
}

try {
connection = connectionFactory.createTopicConnection();
session = connection.createTopicSession(false, TopicSession.AUTO_ACKNOWLEDGE);
consumer = session.createSubscriber(destination);
connection.start();
try {
Thread.sleep(2000);
} catch (InterruptedException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
MessageListener listener = new MyTopicMessageListener();
consumer.setMessageListener(listener);
// Let the thread run for some time so that the Consumer has
// sufficient time to consume the message
try {
Thread.sleep(50000);
} catch (InterruptedException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
} catch (JMSException e) {
e.printStackTrace();
} finally {
if (connection != null) {
try {
connection.close();
} catch (JMSException e) {
}
}
}
}

}

The message listener:
/** 
* @author shreyas.purohit
*
*/
public class MyTopicMessageListener implements MessageListener {

/* (non-Javadoc)
* @see javax.jms.MessageListener#onMessage(javax.jms.Message)
*/
public void onMessage(Message arg0) {
if(arg0 instanceof ObjectMessage){
try {
//Print it out
System.out.println("Recieved message in listener: " + ((ObjectMessage)arg0).getObject());
System.out.println("Co-Rel Id: " + ((ObjectMessage)arg0).getJMSCorrelationID());
}catch(Exception e){
e.printStackTrace();
System.exit(1);
}
}else{
System.out.println("~~~~Error in format~~~");
}
}

}

The JMS Application:
/** 
* @author shreyas.purohit
*
*/
public class JMSApp {

/**
* @param args
*/
public static void main(String[] args) {
runInNewthread(new SampleJMSConsumer());
}
public static void runInNewthread(Runnable runnable) {
Thread brokerThread = new Thread(runnable);
brokerThread.setDaemon(false);
brokerThread.start();
}
}

Tuesday, November 11, 2008

RMI Tutorial : RMI and tomcat

Using RMI in Java is very easy, but people like me can still get stuck when trying to integrate it with Tomcat, and there is very little help available on the Internet.

There are two things to remember:
1. Provide sufficient access privileges to the classes and jars using RMI. This is done using catalina.policy, present in $CATALINA_HOME/conf. Below is the addition to the policy file:

grant codeBase "file:${catalina.home}/webapps/MyAPP/WEB-INF/classes/-" { 
permission java.security.AllPermission "", "";
};

grant codeBase "file:${catalina.home}/webapps/MyAPP/WEB-INF/lib/-" {
permission java.security.AllPermission "", "";
};
grant codeBase "file:${catalina.home}/webapps/MyAPP/WEB-INF/lib/some-common-3.0.jar" {
permission java.io.FilePermission "*", "read, write";
};

2. Do not blindly copy the entire server and client code that is available on the Internet. The main() method given in a lot of places on the web should not be copied as it is. Specifically, comment out the snippet below on both client and server:

//        if (System.getSecurityManager() == null) { 
// System.setSecurityManager(new RMISecurityManager());
// }

We do not want to install a new security manager. Tomcat already provides a security manager, so let's use that one. Please do, if you want RMI to work.
Tutorial:

Server side:

Step 1: Create a contract, an interface that extends java.rmi.Remote.
public interface IRemoteService extends Remote{ 

public final String serviceName = "MyRemoteService";
public abstract void startDoing() throws RemoteException;

public abstract void stopDoing() throws RemoteException;
}

Step 2: Write an implementation for the Interface.
public class RemoteServiceImpl implements IRemoteService { 
public RemoteServiceImpl(){
super();
}
public void startDoing() throws RemoteException {
// MyTask is just a placeholder for whatever does the real work
new MyTask().start();
}
public void stopDoing() throws RemoteException {
new MyTask().stop();
}
}

Step 3: Either use a startup servlet or any class that is invoked after the server is up and before the remote service is used, and initialize the registry. In the snippet below, just creating a new instance somewhere before the service is invoked is enough. Note that there is no security manager anywhere in the code below.
public class InitRemoteService { 
public static boolean isRegistered = false;
public static IRemoteService service;
public InitRemoteService(){
if(!isRegistered){
try {
service = new RemoteServiceImpl();
IRemoteService stub =
(IRemoteService) UnicastRemoteObject.exportObject(service, 0);
Registry registry = LocateRegistry.createRegistry(9345);
registry.rebind(IRemoteService.serviceName, stub);
System.out.println("Remote service bound");
isRegistered = true;
} catch (Exception e) {
System.err.println("Remote service exception:");
e.printStackTrace();
}
}
}
}
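One simple way to trigger this at startup is a ServletContextListener registered in web.xml; a minimal sketch (the listener class name is made up):

import javax.servlet.ServletContextEvent;
import javax.servlet.ServletContextListener;

public class RemoteServiceStartupListener implements ServletContextListener {
    public void contextInitialized(ServletContextEvent sce) {
        // binds the remote service as soon as the webapp starts
        new InitRemoteService();
    }
    public void contextDestroyed(ServletContextEvent sce) {
    }
}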

Client Side:

Step 1: Write the client as in the snippet below. Note that there is no security manager related code here either. It also lists all the service names in the registry.
try { 
Registry registry = LocateRegistry.getRegistry(HOST,9345);
String[] names = registry.list();
for(String name1 : names){
System.out.println("~~~~" + name1 + "~~~~");
}
IRemoteService serv = (IRemoteService) registry.lookup(IRemoteService.serviceName);
serv.startDoing();
} catch (Exception e) {
System.err.println("Remoteservice exception:");
e.printStackTrace();
}

Enjoy using RMI for remote object invocation, especially on servers like Tomcat that do not support EJBs, or when that level of complexity is not required.

Accessing resources from Java Class

Here is a simple code snippet that can be used to load a properties file from a Java program.
Properties config = new Properties(); 
try {
config.load(new FileInputStream(new File(URLDecoder.decode(getClass().getClassLoader().getResource(CONFIG_FILE_NAME).getFile(), "UTF-8"))));
} catch (Exception e1) {
e1.printStackTrace();
throw new RuntimeException(e1);
}

This snippet has never let me down when it comes to loading a properties file.
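For what it is worth, a shorter variant that avoids the URL decoding, assuming the file is on the classpath:

Properties config = new Properties();
try {
    config.load(getClass().getClassLoader().getResourceAsStream(CONFIG_FILE_NAME));
} catch (Exception e1) {
    e1.printStackTrace();
    throw new RuntimeException(e1);
}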

Monday, November 10, 2008

ObjectOutputStream : Writing the same object over and over again!

Well, well, well.. You know, I was just writing a very tiny socket application in Java and was stuck for a long time on 'writing objects from the server to the client'. I myself can not believe that: with so many samples around on the Internet, I was still stuck. A pretty interesting problem that I would like to share. I have a server writing objects to its clients, but the client always receives the same object, the first one. The code snippet is given below:

....
....
....
os = new ObjectOutputStream(socket.getOutputStream());
....
....
....

if(toBeSentData.shouldConsume()){
    Object object = toBeSentData.get();
    if(null != object){
        os.writeObject(object);
        os.flush();
    }
}

You can see that I am flushing the output stream. I debugged through the writeObject code to find out that my object was not written at all; only a handle was written. Specifically, in the method private void writeObject0(Object obj, boolean unshared) throws IOException, the relevant snippet is given below.

// handle previously written and non-replaceable objects
int h;
if ((obj = subs.lookup(obj)) == null) {
    writeNull();
    return;
} else if (!unshared && (h = handles.lookup(obj)) != -1) {
    writeHandle(h);
    return;
} else if (obj instanceof Class) {
    writeClass((Class) obj, unshared);
    return;
} else if (obj instanceof ObjectStreamClass) {
    writeClassDesc((ObjectStreamClass) obj, unshared);
    return;
}


handles.lookup(obj) was never returning -1, so I understood the problem was some sort of caching (not exactly caching). The only person who had explained this was Qusay H. Mahmoud, in his December 2001 article about Advanced Socket Programming at http://java.sun.com/developer/technicalArticles/ALT/sockets/
I really thank him for that, else I would have been stuck with this problem for God knows how long!!

I will just copy the last part of the article here. This is what gave me the solution.

Object Serialization Pitfall

When working with object serialization it is important to keep in mind that the ObjectOutputStream maintains a hashtable mapping the objects written into the stream to a handle. When an object is written to the stream for the first time, its contents will be copied to the stream. Subsequent writes, however, result in a handle to the object being written to the stream. This may lead to a couple of problems:

    * If an object is written to the stream then modified and written a second time, the modifications will not be noticed when the stream is deserialized. Again, the reason is that subsequent writes results in the handle being written but the modified object is not copied into the stream. To solve this problem, call the ObjectOutputStream.reset method that discards the memory of having sent an object so subsequent writes copy the object into the stream.
    * An OutOfMemoryError may be thrown after writing a large number of objects into the ObjectOutputStream. The reason for this is that the hashtable maintains references to objects that might otherwise be unreachable by an application. This problem can be solved simply by calling the ObjectOutputStream.reset method to reset the object/handle table to its initial state. After this call, all previously written objects will be eligible for garbage collection.

The reset method resets the stream state to be the same as if it had just been constructed. This method may not be called while objects are being serialized. Inappropriate invocations of this method result in an IOException.

Just call os.reset() after the flush. It worked like a charm for me!!
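In the server write loop above, that amounts to:

os.writeObject(object);
os.flush();
// discard the handle table so the next write sends the object's current state
os.reset();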

Saturday, October 11, 2008

AS Model generator : XDoclet2 plugin to generate the Flex model AS file for the respective java server side value objects

My previous post gave an introduction to this plugin for XDoclet2 by way of a tutorial on XDoclet2 and Velocity that used this plugin as an example. Now I will explain what the plugin can do.

First, download the latest plugin (as3-plugin.jar) from the dist folder of the github project:

https://github.com/shreyaspurohit/AS3Generator

The supported annotations are:

1. as3.class - A class level tag.

Attributes:
1.a. name - The name of the generated AS class. Should be same as the java class name. Required.
1.b. generate - On false, does not generate the AS file for the java class. Default: true.
1.c. generate-bindable-metadata - Generates [Bindable] at class level in AS. Default value: true.
1.d. generate-remote-class-metadata - Generates [RemoteClass(alias="..")] on true. Default value: false.

2. as3.field - A Field level tag.

Attributes:

2.a. type - The fully qualified flex type to be generated in the AS file. Required.
2.b. import - Imports the flex type defined above. Default: false.
2.c. generate-bindable-field-metadata - Generate field level [Bindable] in the AS file. Default: false.
2.d. generate - Controls generation of the field in as files. Default: true.

The Java-Flex mapping supported are:

1. int - Number
2. double - Number
3. long - Number
4. java.lang.Short - Number
5. java.lang.Byte - Number
6. java.lang.Integer - Number
7. java.lang.Double - Number
8. java.lang.Long - Number
9. java.lang.Float - Number
10. java.lang.String - String
11. java.lang.Character - String
12. java.util.Collection - mx.collections.ArrayCollection
13. java.util.Map - Object
14. java.util.Dictionary - Object

For any other type, define as3.field with the Flex type and whether an import is necessary. If that does not solve the problem, define it as Object and cast it in the Flex code where necessary. If that still does not solve the problem, contact me on my blog and I will add the feature in the next release.

Usage:

Write an Ant build script and invoke XDoclet with a target like:

<target name="generate">
        <xdoclet>
        <fileset dir="${basedir}/src">
             <include name="**/*.java"/>          
         </fileset>
        <component classname="com.ssb.plugin.as3.As3Plugin"
                   destdir="${basedir}/src"/>
        </xdoclet>
</target>

Please look at my previous post for the sample model class which uses all these annotations.
Download or browse the latest sample from the github sample folder and see the build file for details.

XDoclet2 Custom Plugin ( Actionscript3 (AS3) model generator from java value objects ) and Velocity Tutorial

I work on Flex, and it was necessary for me to keep the Flex side model classes in sync with the Java side value objects. I Googled but was not able to find anything as simple as the one written by Joe Berkovitz of Allurent, and that was pretty old code and lacked configuration options for generating the AS classes. So I decided to write my own plugin for XDoclet2 that can be used to generate the Flex code. Given below is a tutorial for learning XDoclet2, and since the template engine it uses is Velocity, this post also serves as a tutorial for Velocity. Finally, I provide a complete working edition of the XDoclet2 plugin and a sample that uses it.

Please look at the end of this tutorial to find links to the source, plugin jar and sample files.

System Configuration
1. xdoclet2 v1.0.4
2. velocity 1.5
3. JDK 5 update 15
4. ant 1.7.1

XDoclet2

XDoclet2 allows you to read the annotations on Java source files and generate XML, Java, or any other kind of file. It uses the Velocity template engine for general file generation and the Jelly template engine for XML generation.

Step 1:

Download the xdoclet2 plugin distribution from http://sourceforge.net/projects/xdoclet-plugins/
Extract the zip or the tar.gz to a convenient location.

Step 2:

Write a sample Java class with the annotations that need to be supported by the plugin. In my case I wrote Model.java. A sample snippet is given below:

package com.ssb.sample;
import java.math.BigDecimal;
import java.math.BigInteger;
/**
*
* @author Shreyas
* @as3.class name="Model" generate="true" generate-bindable-metadata="true" generate-remote-class-metadata="true"
*/
public class Model {
    private int i;
    private double d;
    private long l;
    /**
     * @as3.field type="Number" import="false" generate-bindable-field-metadata="true"
     */
    private BigDecimal bdecimal;
    /**
     * @as3.field type="mx.flex.BigInt" import="true"
     */
    private BigInteger binteger;

    /**
     * @as3.field type="mx.flex.BigInt" generate="false"
     */
    private transient BigInteger doNotGeneratebinteger;
    .
    .
    .
Step 3:

Define tag validators. These are helpful for validating that the annotations provided are as anticipated. A tag validator extends DocletTag as shown below. The interface itself is annotated using qtags.

package com.ssb.plugin.as3.qtags;
import com.thoughtworks.qdox.model.DocletTag;
/**
*
* @qtags.location class
* @qtags.once
*
*/
public interface As3ClassTag extends DocletTag{
    /**
     * @qtags.required
     */
    String getName_();
    /**
     * @qtags.default true
     */
    String isGenerate();
    /**
     * @qtags.default true
     */
    String isGenerateBindableMetadata();
    /**
     * @qtags.default false
     */
    String isGenerateRemoteClassMetadata();
}

There are some points to be noted:
a. There is a one to one mapping between the name of the interface - As3ClassTag - and the annotation on the sample class - as3.class. The first letter is capitalized, the '.' is removed, the next letter is also capitalized, and 'Tag' is appended.
b. The methods in the interface use the standard Java bean convention: for each attribute in the tag, 'get' or 'is' is prefixed to the attribute name with its first letter capitalized.
c. 'name' is a special attribute; a '_' should be used at the end of its getter to avoid an internal naming collision.
d. When an attribute contains a '-', the methods in the interface neglect it and follow the first-letter capitalization rule.
e. Do not use camel casing in the sample annotation. Use only lowercase letters and hyphens as separators, for ease of work.

Definitions of tags:
a. qtags.location - Whether the tag applies at class, field, or method level; the values are class, field, and method respectively.
b. qtags.once - The tag can be used only once in the source file.
c. qtags.required - The attribute is required.
d. qtags.default - The default value of the attribute.

Step 4:

Generate the tag validator implementation and tag library. The tag library will be used by our plugin. An Ant script is used to generate these; the target is given below. Please look at the source code for the entire build script.

<target name="gen.qtags.impl">
      <property name="xdoclet.qtags.namespace" value="as3"/>

      <xdoclet>
           <fileset dir="src">
             <include name="**/*.java"/>
           </fileset>
           <component
              classname="org.xdoclet.plugin.qtags.impl.QTagImplPlugin"
              destdir="${basedir}/src"
           />   
           <component
            classname="org.xdoclet.plugin.qtags.impl.QTagLibraryPlugin"
                destdir="${basedir}/src"
                packagereplace="com.ssb.plugin.${xdoclet.qtags.namespace}.qtags"
           />
      </xdoclet>
</target>

Step 5:

Write the xdoclet2 plugin. As3Plugin extends QDoxPlugin.

package com.ssb.plugin.as3;
import java.util.*;
import org.apache.log4j.Logger;
import org.generama.*;
import com.ssb.plugin.as3.qtags.TagLibrary;
import com.thoughtworks.qdox.model.*;

public class As3Plugin extends QDoxPlugin{
    private Map<String, String> typeMap = new HashMap<String, String>();
    public As3Plugin(VelocityTemplateEngine templateEngine,
            QDoxCapableMetadataProvider metadataProvider,
            WriterMapper writerMapper) {
        //Call the superclass constructor.
        super(templateEngine, metadataProvider, writerMapper);
        ..
        ..
        ..

In the above snippet, the constructor takes a VelocityTemplateEngine as an argument. Do not worry about how it gets that; XDoclet is responsible for injecting it into the constructor. First, call the superclass constructor.

        //Replace .java with .as extensions
        setFileregex(".java");
        setFilereplace(".as");
        //Set Multiple file output to true.
        setMultioutput(true);
        //Instantiate the generated tag library.
        new TagLibrary(metadataProvider);
        //Initialize the the type map
        initTypeMap();
The comments are pretty clear for the above code snippet. This ends the constructor.

The initTypeMap method is shown below.

    /**
     * Initializes the type map.
     */
    protected void initTypeMap() {
        typeMap.put("int", "Number");
        typeMap.put("double", "Number");
        typeMap.put("long", "Number");
        typeMap.put("java.lang.Short", "Number");
        typeMap.put("java.lang.Byte", "Number");
        ..
        ..
        ..
    }

    /**
     * Over ridden method, determines whether the given java class should be converted to as3 or not.
     *
     * @param metadata A java Class.
     * @return
     */
     public boolean shouldGenerate(Object metadata) {
            JavaClass javaClass = (JavaClass) metadata;
            boolean ignore = "false".equalsIgnoreCase(javaClass.getNamedParameter("as3.class","generate"));
            if (!ignore)
              return true;
            else
              return false;
     }
The above method decides whether an AS file should be generated for the annotated Java file. It is called by XDoclet, and the Object argument is of type JavaClass. getNamedParameter takes the annotation name and the attribute whose value has to be retrieved. If it is false, the AS file is not generated.

    protected void populateContextMap(Map map) {
        super.populateContextMap(map);
        map.put("tagUtil", new TagUtil());
    }

The above code initializes the context map by adding tagUtil, so that it can be accessed by velocity template.

Now, for each annotation and attribute, write a public method to get its value. These live in TagUtil.java. Remember that they must be public; protected will not work, as they will later be called from the Velocity template. Two samples are given below:

    /**
     * Gets the As3 Class name from the annotation as3.class, attribute 'name'
     *
     * @param metadata The java class.
     * @return
     */
    public String getAs3Name(Object metadata){
        JavaClass javaClass = (JavaClass) metadata;
        return javaClass.getNamedParameter("as3.class","name");
    }
    /**
     * Gets the Field type from the annotation as3.field, attribute 'type'
     *
     * @param metadata The java Field.
     * @return
     */
    public String getTypeName(Object metadata){
        JavaField javaField = (JavaField)metadata;
        return getTagValue("as3.field", "type", javaField);
    }   
    private String getTagValue(String tagName,String tagAttribute, JavaField field){       
        return field.getNamedParameter(tagName, tagAttribute);
    }
This is it!! Really, this is all the XDoclet related code the plugin needs. Now we have to integrate it with Velocity to generate the output AS files.

Step 6:

Write the Velocity template file for Code generation.

Velocity

Step 1: Create an As3Plugin.vm file in the com.ssb.plugin.as3 package, the same package as the plugin. The file name must be identical to the plugin name.

Step 2: Write the velocity code in the vm file. It is shown below as snippets.

// ${dontedit}
#set( $class = $metadata )
#set( $truevalue = "true")

The ${} are variables that hold values in the VTL (Velocity Template Language). #set is used to assign values. When working with XDoclet, $metadata is injected by XDoclet and holds the class being processed. $truevalue is assigned the string 'true'.

package $plugin.getDestinationPackage($class);
{

#foreach($field in $class.getFields())
#set($import = $plugin.shouldImport($field))
  #if($import != "false")
    import $import;
  #end
#end
#if($tagUtil.isGenerateBindableMetadata($class) == $truevalue)
    [Bindable]
#end
..
..

Other variables available to Velocity are $plugin and $tagUtil. $plugin holds the instance of the As3Plugin class; Velocity can use it to invoke Java methods, passing arguments if required. The getDestinationPackage method is present in the base class of As3Plugin and gives the package of the Java class; the package of the AS class will be the same.
#foreach is the looping directive in Velocity, and Velocity also supports the #if/#elseif/#else/#end directives. Any method you see being invoked on $plugin is actually invoked on the As3Plugin Java object, so shouldImport calls the As3Plugin class's shouldImport method. Similarly, isGenerateBindableMetadata calls the TagUtil class's isGenerateBindableMetadata method.

Velocity also supports writing methods by defining macros and so on, but that is not needed for this template file. That's all there is to Velocity!! Only the most basic Velocity features are being used here. Please refer to the Velocity user guide for more detailed information ( http://velocity.apache.org/engine/devel/user-guide.html ).

Everything needed for our plugin is now in place. Just compile the classes, include all the resources, and build the plugin jar. The build script is provided with the source.

To use the plugin, write an Ant build script as given below:

<?xml version="1.0" encoding="ISO-8859-1"?>
<project name="Test AS3 Plugin Jar" default="main">

    <property file="build.properties"/>
    <path id="xdoclet.task.classpath">
        <!-- xdoclet2 runtime dependencies -->

          <fileset dir="${xdoclet.plugin.install.dir}/lib">
            <include name="**/*.jar"/>
          </fileset>
        <pathelement location="${basedir}/lib/as3-plugin.jar"/>

    </path>

    <!-- Define xdoclet task -->
     <taskdef
        name="xdoclet" classname="org.xdoclet.ant.XDocletTask"
         classpathref="xdoclet.task.classpath"
     />
    <target name="main" depends="generate"/>
    <target name="generate">
        <xdoclet>
        <fileset dir="${basedir}/src">
             <include name="**/*.java"/>          
         </fileset>
        <component classname="com.ssb.plugin.as3.As3Plugin"
                   destdir="${basedir}/src"/>
        </xdoclet>
     </target>
</project>

build.properties contain:
xdoclet.plugin.install.dir = E:/DevToolsInstalled/xdoclet-plugins-dist-1.0.4

This completes the tutorial for writing xdoclet2 plugin using velocity template engine. This generates non java files, .as, as output.

Resources:

Go to: https://github.com/shreyaspurohit/AS3Generator

1. Download the latest plugin jar from dist folder (as3-plugin.jar) to use the lib.

2. Download the latest source of the plugin by cloning git (or download zip). Edit the build.properties to provide the xdoclet2 installation directory.
Run ant on build.xml to generate the as3-plugin.jar in the base directory.

3. Download the latest samples from sample folder by cloning git (or download zip). Edit the build.properties to provide the xdoclet2 installation directory.
Run ant on build.xml to generate Model.as, Model2.as at com.ssb.sample.

4. Download the latest java doc from folder api on git by cloning or downloading zip.

Please read my next post to understand the capabilities of the plugin.

Wednesday, October 01, 2008

Step by step CXF Webservice Tutorial

We had an application running, and a webservice had to be exposed. With some experience in Axis2, I decided to learn something new: CXF. Well, initially I gave up on CXF (details in a moment) and thought I would use the Axis2 service instead. But the only way I knew was to deploy Axis2 as a webapp and then write and deploy the service as an aar. So a search began for how to integrate Axis2 with an existing webapp, and with some shame I can say I was not able to find a single clue about doing it anywhere. I could have experimented by including the Axis distribution libs and its web.xml content in my application, but decided to learn something newer. Hence, CXF.

I was developing a Flex app on the BlazeDS turnkey server distributed by Adobe, and the webservice had to be integrated with this existing application. Since CXF was what I wanted to learn, I tried using it (as given below in the tutorial), but always ended up with one of the exceptions below.

1. On startup of the server, this exception occurred when all the jars of CXF were put in the WEB-INF/lib folder.

SEVERE: Context initialization failed
org.springframework.beans.factory.BeanDefinitionStoreException: Unexpected exception parsing XML document from class path resource [com/ssb/service/data.xml]; nested exception is java.lang.IllegalArgumentException: Class [org.springframework.scripting.config.LangNamespaceHandler] does not implement the NamespaceHandler interface
Caused by: java.lang.IllegalArgumentException: Class [org.springframework.scripting.config.LangNamespaceHandler] does not implement the NamespaceHandler interface
at org.springframework.beans.factory.xml.DefaultNamespaceHandlerResolver.initHandlerMappings(DefaultNamespaceHan


2. When the spring-XXXX jars were removed (Google helped me find this solution!) and the webapp was started, there was no problem at all; the server was up and running. But on accessing the service as:

http://localhost:8400/cxftry/services

this was the exception I got:

java.lang.NoSuchMethodError: org.springframework.context.ConfigurableApplicationContext.addApplicationListener(Lorg/springframework/context/ApplicationListener;)V
org.apache.cxf.transport.servlet.CXFServlet.loadSpringBus(CXFServlet.java:104)
org.apache.cxf.transport.servlet.CXFServlet.loadBus(CXFServlet.java:70)
org.apache.cxf.transport.servlet.AbstractCXFServlet.init(AbstractCXFServlet.java:90)

With no information on the web about any of these in connection with CXF, I guessed the problem was the app server, the turnkey server I was using. Though it uses Tomcat 6, I am not sure why it does not work there. I downloaded a plain Tomcat 6 server, put the application on it, and it worked like a charm. :o)

System Configurations:

1.CXF 2.1.2

2.JDK 1.5

3.TOMCAT 6

Here is the tutorial for using CXF to expose a webservice with the Java-first methodology.

1. Download the CXF from http://cxf.apache.org/download.html
2. Extract to a directory(here, I will call it as CXF_HOME)
3. Copy all the jars from lib to the WEB-INF/lib directory (not recommended; go through the WHICH_JARS file in CXF_HOME/lib to decide which jars you actually need for your application).
4. Write your Service Endpoint Interface (SEI), nothing but a java interface that will be exposed as a webservice.

package com.ssb.service;

import javax.jws.WebMethod;
import javax.jws.WebParam;
import javax.jws.WebService;
import javax.xml.ws.WebFault;

import com.ssb.exception.SomeException;
import com.ssb.model.Data;

@WebService
public interface IDataService {
    @WebMethod(operationName="getData")
    public Data  getData(@WebParam(name="id")String id) throws SomeException;
}

5. Annotate your Exception with @WebFault if any.

package com.ssb.exception;

import javax.xml.ws.WebFault;

@WebFault(name="exception")
public class SomeException extends Exception {

    public String contactInfo = "Sacrosanct Blood.";
    private static final long serialVersionUID = 1L;

    private SomeException() {
    }

    public SomeException(String message, Throwable cause) {
        super(message, cause);
    }

    public SomeException(String message) {
        super(message);
    }
}

6. Annotate your Service Implementation as a webservice.

package com.ssb.service;

import javax.jws.WebService;

import com.ssb.exception.SomeException;
import com.ssb.model.Data;

/**
* @author shreyas.purohit
*
*/
@WebService(endpointInterface="com.ssb.service.IDataService", serviceName="dataService")
public class DataServiceImpl implements IDataService {

    /**
     *
     */
    public DataServiceImpl() {
    }

    public Data getData(String id) throws SomeException{
        Data data = new Data();
        return data;
    }

}

7. Configure the data.xml for the webservice(com/ssb/service/data.xml).

<beans xmlns="http://www.springframework.org/schema/beans"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
      xmlns:jaxws="http://cxf.apache.org/jaxws"
      xsi:schemaLocation="http://www.springframework.org/schema/beans
http://www.springframework.org/schema/beans/spring-beans.xsd
http://cxf.apache.org/jaxws
http://cxf.apache.org/schemas/jaxws.xsd">

  <import resource="classpath:META-INF/cxf/cxf.xml" />
  <import resource="classpath:META-INF/cxf/cxf-extension-soap.xml"/>
  <import resource="classpath:META-INF/cxf/cxf-servlet.xml" />
  <jaxws:endpoint id="data"
                  implementor="com.ssb.service.DataServiceImpl"
                  address="/dataService"/>
</beans>

8. Configure web.xml for CXF to work.

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE web-app PUBLIC "-//Sun Microsystems, Inc.//DTD Web Application 2.3//EN" "http://java.sun.com/dtd/web-app_2_3.dtd">
<web-app>

<display-name>CXF</display-name>
    <description>CXF Application</description>
    <context-param>
        <param-name>contextConfigLocation</param-name>
        <param-value>classpath:com/ssb/service/data.xml</param-value>
      </context-param>
      <listener>
        <listener-class>
          org.springframework.web.context.ContextLoaderListener
        </listener-class>
      </listener>
      <servlet>
        <servlet-name>CXFServlet</servlet-name>
        <servlet-class>
            org.apache.cxf.transport.servlet.CXFServlet
        </servlet-class>
      </servlet>
      <servlet-mapping>
        <servlet-name>CXFServlet</servlet-name>
        <url-pattern>/services/*</url-pattern>
      </servlet-mapping>
    <welcome-file-list>
            <welcome-file>index.html</welcome-file>
            <welcome-file>index.htm</welcome-file>
    </welcome-file-list>
</web-app>

9. Start the server, and access http://localhost:8080/cxftry/services
You should be able to see the services. On clicking on it, the wsdl can be seen.
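To quickly test the service from Java, a client sketch using CXF's JaxWsProxyFactoryBean could look like this (the address is the one from this setup; the id value is just an example):

import org.apache.cxf.jaxws.JaxWsProxyFactoryBean;

JaxWsProxyFactoryBean factory = new JaxWsProxyFactoryBean();
factory.setServiceClass(IDataService.class);
factory.setAddress("http://localhost:8080/cxftry/services/dataService");
IDataService client = (IDataService) factory.create();
Data data = client.getData("42");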

The annotations used in the code above are very simple and self-explanatory.

1. @WebService : Indicates its a webservice.

Attributes used:

endpointInterface=Specifies the full name of the SEI that the implementation class implements.
serviceName=Specifies the name of the published service.

2. @WebMethod    : Indicates a webservice method.

Attributes used:

operationName=Specifies the operation name, i.e the webservice method name.

3. @WebParam    : Used to give the parameters of an operation a meaningful name in the published WSDL.

Attributes used:

name= Name of the argument to be displayed on the WSDL for the operation.

4. @WebFault    : Defines an exception that the operation/web service method can throw.

Attributes used:

name= The name to be used in the WSDL.

This completes a quick tutorial for publishing a Java-First webservice.

Tuesday, September 30, 2008

Log4j Tutorial: Layout and Appender

I was working on an application related to trading. I had to write a large amount of data to disk in CSV format, and if the file size grew beyond a MB, the file had to roll over. We decided to use the existing log4j framework and its rolling file appender to get the job done. It seemed easy, but in the end I had to write code I had hoped to avoid by reusing something that already exists!!

First, the layout:

The data written to the file is in CSV format and does not need any extra information like the debug level or the time. So the data sent to the logger had to be written to the file with nothing more attached to it. This is doable using layouts in log4j: I just extended SimpleLayout and overrode the 'format' method as below.

public class NoLayout extends SimpleLayout {

    /**
     *
     */
    public NoLayout () {
        super();
    }
    public String format(LoggingEvent event)
    {
        sbuf.setLength(0);
        sbuf.append(event.getRenderedMessage());
        sbuf.append(Layout.LINE_SEP);
        return sbuf.toString();
    }
}

Once the layout was done and the RollingFileAppender was configured in log4j.xml, the next set of problems came to light. When the file rolls over, by default the rolled files are named fileName.index, where index goes up to the maximum configured in the XML. We wanted the files to be named fileName.index.csv instead.

I thought this was a simple thing and that the pattern used for the archived file name would be configurable somewhere. Looking around Google, I found the rolling policy (FixedWindow, with a configurable file name pattern) and the triggering policy (SizeBased, needed if TimeBasedRollingPolicy is not used).
Their website quotes:


To be of any use, a RollingFileAppender instance must have both a RollingPolicy and a TriggeringPolicy set up. However, if its RollingPolicy also implements the TriggeringPolicy interface, then only the former needs to be set up. For example, TimeBasedRollingPolicy acts both as a RollingPolicy and a TriggeringPolicy.


Some more samples from across the web:

<rollingPolicy class="org.apache.log4j.rolling.TimeBasedRollingPolicy">
              <param name="FileNamePattern" value="/wombat/foo.%d{yyyy-MM}.gz"/>
</rollingPolicy>

I thought this could be used and configured it in log4j.xml, but to my disappointment log4j does not recognize the elements <rollingPolicy/> and <triggeringPolicy/>, even though their DTD, log4j.dtd, defines them. Looking into the log4j code, specifically org.apache.log4j.xml.DOMConfigurator, the only elements it recognizes are:

static final String CONFIGURATION_TAG = "log4j:configuration";
  static final String OLD_CONFIGURATION_TAG = "configuration";
  static final String RENDERER_TAG      = "renderer";
  static final String APPENDER_TAG     = "appender";
  static final String APPENDER_REF_TAG     = "appender-ref"; 
  static final String PARAM_TAG        = "param";
  static final String LAYOUT_TAG    = "layout";
  static final String CATEGORY        = "category";
  static final String LOGGER        = "logger";
  static final String LOGGER_REF    = "logger-ref";
  static final String CATEGORY_FACTORY_TAG  = "categoryFactory";
  static final String LOGGER_FACTORY_TAG  = "loggerFactory";
  static final String NAME_ATTR        = "name";
  static final String CLASS_ATTR        = "class";
  static final String VALUE_ATTR    = "value";
  static final String ROOT_TAG        = "root";
  static final String ROOT_REF        = "root-ref";
  static final String LEVEL_TAG            = "level";
  static final String PRIORITY_TAG      = "priority";
  static final String FILTER_TAG    = "filter";
  static final String ERROR_HANDLER_TAG    = "errorHandler";
  static final String REF_ATTR        = "ref";
  static final String ADDITIVITY_ATTR    = "additivity"; 
  static final String THRESHOLD_ATTR       = "threshold";
  static final String CONFIG_DEBUG_ATTR  = "configDebug";
  static final String INTERNAL_DEBUG_ATTR  = "debug";
  static final String RENDERING_CLASS_ATTR = "renderingClass";
  static final String RENDERED_CLASS_ATTR = "renderedClass";


  So, there was nothing about rollingPolicy or triggeringPolicy, though their DTD defines them.
  So I thought, let me configure it through code. But the Appender returned by log.getAppender(name) does not have any way to set the rolling policy or triggering policy. Wow!!
  So a simple requirement turned into hours of research, and I finally decided to extend RollingFileAppender and make the file name extension configurable (the code is given below). To my dismay, the code of RollingFileAppender.rollOver() builds the name every time as >>> fileName + '.' + index <<< !!!!! (I was expecting a getFileName(index)!)
  So I had to copy in their code :(. They also use a private variable 'nextRollover' inside this method (at least if it were protected, I could have ignored it and used the one from the parent class), so that had to be copied in too, along with one more method that uses it, 'subAppend'. And finally I got my requirement up and running!! All I wanted was a configurable name for the archived file.
  If anyone else is stuck with the same problem, I hope the classes here are of some help to you.


  package org.apache.log4j;
  import java.io.File;
  import java.io.IOException;
  import org.apache.log4j.helpers.CountingQuietWriter;
  import org.apache.log4j.helpers.LogLog;
  import org.apache.log4j.spi.LoggingEvent;
  public class ConfigurableRollingFileAppender extends RollingFileAppender {
      public String fileExtension;
      protected long nextRollover;
      /**
       * @param fileExtension the fileExtension to set
       */
      public void setFileExtension(String fileExtension) {
          this.fileExtension = fileExtension;
      }
      /**
       * @return the fileExtension
       */
      public String getFileExtension() {
          return fileExtension;
      }
      /**
       * @param nextRollover the nextRollover to set
       */
      protected void setNextRollover(long nextRollover) {
          this.nextRollover = nextRollover;
      }
      /**
       * @return the nextRollover
       */
      protected long getNextRollover() {
          return nextRollover;
      }
      public ConfigurableRollingFileAppender() {
          super();
      }
      public ConfigurableRollingFileAppender(Layout layout, String filename)
              throws IOException {
          super(layout, filename);
      }
      public ConfigurableRollingFileAppender(Layout layout, String filename,
              boolean append) throws IOException {
          super(layout, filename, append);
      }
      /**
       * A literal copy paste from the superclass,
       * and using the required file name at places necessary.
       *
       */
      @Override
      public void rollOver() {
          File target;
          File file;
          if (qw != null) {
              long size = ((CountingQuietWriter) qw).getCount();
              LogLog.debug("rolling over count=" + size);
              //   if operation fails, do not roll again until
              //      maxFileSize more bytes are written
              nextRollover = size + maxFileSize;
          }
          LogLog.debug("maxBackupIndex="+maxBackupIndex);
          boolean renameSucceeded = true;
          // If maxBackups <= 0, then there is no file renaming to be done.
          if(maxBackupIndex > 0) {
            // Delete the oldest file, to keep Windows happy.
            file = new File(getCompleteFileName(maxBackupIndex));
            if (file.exists())
             renameSucceeded = file.delete();
            // Map {(maxBackupIndex - 1), ..., 2, 1} to {maxBackupIndex, ..., 3, 2}
            for (int i = maxBackupIndex - 1; i >= 1 && renameSucceeded; i--) {
          file = new File(getCompleteFileName(i));
          if (file.exists()) {
            target = new File(getCompleteFileName(i + 1));
            LogLog.debug("Renaming file " + file + " to " + target);
            renameSucceeded = file.renameTo(target);
          }
            }
          if(renameSucceeded) {
            // Rename fileName to fileName.1
            target = new File(getCompleteFileName(1));
            this.closeFile(); // keep windows happy.
            file = new File(fileName);
            LogLog.debug("Renaming file " + file + " to " + target);
            renameSucceeded = file.renameTo(target);
            //
            //   if file rename failed, reopen file with append = true
            //
            if (!renameSucceeded) {
                try {
                  this.setFile(fileName, true, bufferedIO, bufferSize);
                }
                catch(IOException e) {
                  LogLog.error("setFile("+fileName+", true) call failed.", e);
                }
            }
          }
          }
          //
          //   if all renames were successful, then
          //
          if (renameSucceeded) {
          try {
            // This will also close the file. This is OK since multiple
            // close operations are safe.
            this.setFile(fileName, false, bufferedIO, bufferSize);
            nextRollover = 0;
          }
          catch(IOException e) {
            LogLog.error("setFile("+fileName+", false) call failed.", e);
          }
          }
      }
      /**
      This method differentiates RollingFileAppender from its super
      class.
      Copy paste from RollingFileAppender, as the nextRollover attribute is private in RollingFileAppender, and had to be included here.
      @since 0.9.0
   */
       protected
       void subAppend(LoggingEvent event) {
         super.subAppend(event);
         if(fileName != null && qw != null) {
             long size = ((CountingQuietWriter) qw).getCount();
             if (size >= maxFileSize && size >= nextRollover) {
                 rollOver();
             }
         }
        }
      protected String removeFileExtensionFromName(String fileName) {
          return fileName.substring(0, fileName.lastIndexOf("."));
      }
      /**
       * Builds the backup file name for the given index: the index goes before the
       * configured extension (e.g. PUBLISH_FILE.1.csv), or is simply appended if none is set.
       */
      protected String getCompleteFileName(int index) {
          return (fileExtension != null && !fileExtension.trim().equalsIgnoreCase(""))
                  ? removeFileExtensionFromName(fileName) + '.' + index + '.' + fileExtension
                  : fileName + '.' + index;
      }
  }

 

The appender can then be configured in log4j.xml this way:

<?xml version="1.0" encoding="UTF-8" ?>
<!DOCTYPE log4j:configuration SYSTEM "log4j.dtd">

<log4j:configuration xmlns:log4j="http://jakarta.apache.org/log4j/">

    <appender name="console" class="org.apache.log4j.ConsoleAppender">
        <param name="Target" value="System.out" />
        <layout class="org.apache.log4j.PatternLayout">
            <param name="ConversionPattern" value="%-5p %c{1} - %m%n" />
        </layout>
    </appender>

    <appender name="FILE_APPENDER"
        class="org.apache.log4j.ConfigurableRollingFileAppender">
        <param name="File"
            value="../PUBLISH_FILE.csv" />
        <param name="Append" value="true" />
        <param name="maxFileSize" value="1024KB" />
        <param name="MaxBackupIndex" value="20" />
        <param name="FileExtension" value="csv" />
        <layout class="org.apache.log4j.NoLayout" />
    </appender>
    <root>
        <priority value="info" />
        <appender-ref ref="FILE_APPENDER" />
    </root>

</log4j:configuration>
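
With this configuration, once the file grows past maxFileSize the backups keep the .csv extension: PUBLISH_FILE.1.csv, PUBLISH_FILE.2.csv and so on, instead of PUBLISH_FILE.csv.1. Below is a minimal sketch of driving it; the logger class name and the messages are made up for illustration, and the appender class is assumed to be on the classpath under the package declared in its class attribute.

import org.apache.log4j.Logger;

public class PublishFileWriter {

    private static final Logger LOG = Logger.getLogger(PublishFileWriter.class);

    public static void main(String[] args) {
        // Write enough rows to exceed maxFileSize (1024KB) and force a rollover.
        for (int i = 0; i < 100000; i++) {
            LOG.info("row-" + i + ",some,csv,columns");
        }
        // Expected on disk: PUBLISH_FILE.csv plus PUBLISH_FILE.1.csv,
        // PUBLISH_FILE.2.csv, ... up to MaxBackupIndex backups.
    }
}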

Java Abstract Methods : Return type

Consider an abstract class declaring an abstract method whose return type is Object. Now let's say a class extends this abstract class and implements the method. What should the return type of this method be in order to override the base abstract method? ;D, it can be pretty much anything! Since Java 5 an overriding method may narrow the return type to any subtype of the original (covariant return types), and every reference type is a subtype of Object.


import java.util.ArrayList;
import java.util.List;

public abstract class AbstractClassA {
    public abstract Object getDataForDoingSomething();
}

public class SClassA extends AbstractClassA {
    @Override
    public List getDataForDoingSomething() {
        return new ArrayList();
    }
}
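
The narrowed return type also pays off at the call site: through an SClassA reference you get a List back without a cast, while a base-class reference still sees Object. A small sketch, with variable names made up for illustration:

SClassA source = new SClassA();

// The covariant return type is visible through the subclass reference: no cast needed.
List data = source.getDataForDoingSomething();

// Through the base-class reference the declared return type is still Object.
AbstractClassA base = source;
Object raw = base.getDataForDoingSomething();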

AutoBoxing in Java : A boon when used carefully!

I am sure this is a well-known mistake that should be avoided, but it still cost me an hour to figure out why the application I was working on was misbehaving. I was looping through a list to find the object that had to be removed, and the index was then used to remove it. The objects in the list had overridden the equals and hashCode methods, and I could not use 'indexOf' or 'remove' on the list to find and remove the object. My mistake was to use the wrapper Integer instead of the primitive int for the index. Everything seemed fine, but when remove was called on the list, the call resolved to remove(Object) and the list tried to remove the Integer object itself instead of the element at that index! The solution is simple, as you all know; just an interesting point to note down in my diary.

private List<Stock> stocksSubscribed = new ArrayList<Stock>();

private void removeStockWithStockName(String symbol) {
    Integer removalIndex = 0;
    for (Stock stock : stocksSubscribed) {
        if (symbol.equalsIgnoreCase(stock.getSymbol())) {
            break;
        }
        removalIndex++;
    }
    // Without the cast, remove(Object) is chosen and the list would try to remove
    // the Integer object itself; the cast forces the remove(int index) overload.
    // (This assumes the symbol is always present; otherwise the index equals the
    // list size and remove throws an IndexOutOfBoundsException.)
    stocksSubscribed.remove((int) removalIndex);
}
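
The pitfall is easiest to see in isolation. A small sketch of the two remove overloads on java.util.List (the list contents here are made up):

import java.util.ArrayList;
import java.util.List;

public class RemoveOverloads {
    public static void main(String[] args) {
        List<Integer> numbers = new ArrayList<Integer>();
        numbers.add(10);
        numbers.add(20);
        numbers.add(30);

        numbers.remove(1);                    // remove(int index): removes the element 20
        numbers.remove(Integer.valueOf(10));  // remove(Object): removes the value 10

        System.out.println(numbers);          // prints [30]
    }
}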

Thursday, July 31, 2008

Java Properties File - Some lesser-known facts

I came across a problem using a Java properties file, and I think it's a good piece to go here. I am sure the majority of developers who are new to Java (and even some oldies) may run into this issue.

The properties file contents:

a0=impl1
a1:sol1=impl2

The java program is below:

import java.io.FileNotFoundException;
import java.io.IOException;
import java.util.Properties;

public class Tester {

    public static void main(String[] args) {
        Properties properties = new Properties();
        try { // Assume, no problemo finding the properties file
            properties.load(Tester.class.getClassLoader().getResourceAsStream("test.properties"));
            System.out.println(properties.getProperty("a0"));
            System.out.println(properties.getProperty("a1:sol1"));
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}

 

Surprisingly, the output of the above program is:

impl1
null

When I did properties.list(System.out), I was a little surprised to see:

a0=impl1
a1=sol1=impl2

The ':' was treated as a key-value separator: the key became just 'a1' and everything after the ':' ('sol1=impl2') became its value.

Finally, with the help of the wiki I found that the name-value pairs in a Java properties file can be written in any of these formats:

a. name=value (this one we all definitely know)
b. name:value
c. name value

In other words, '=', ':' and whitespace are all treated as key-value separators.

The solution is to escape the special character with a backslash. That is, in the properties file:

a0=impl1
a1\:sol1=impl2
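
With this escaping in place, getProperty("a1:sol1") returns impl2 as expected. Going the other way, Properties.store() applies the same escaping for you, which is a handy way to check what a key should look like on disk. A small sketch, writing to System.out just for illustration:

import java.io.IOException;
import java.util.Properties;

public class StoreEscapingDemo {
    public static void main(String[] args) throws IOException {
        Properties properties = new Properties();
        properties.setProperty("a0", "impl1");
        properties.setProperty("a1:sol1", "impl2");

        // store() escapes ':' (as well as '=' and spaces) in keys, so the
        // output contains the line: a1\:sol1=impl2
        properties.store(System.out, "escaped by Properties.store()");
    }
}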

Hope this helps someone who is surprised to see such 'unexpected' output.

Tuesday, July 22, 2008

j2ssh : On authentication, removing user prompt

 

While making an SFTP connection with the j2ssh client in Java, it is possible that you get a prompt like:

Do you want to allow this host key? [Yes|No|Always]:

To suppress this prompt, though it is not recommended, we have to supply a custom implementation of the AbstractKnownHostsKeyVerification class. The simplest way is to extend ConsoleKnownHostsKeyVerification and override the onUnknownHost and onHostKeyMismatch methods, then pass an instance of this AlwaysAllowingConsoleKnownHostsKeyVerification while connecting, as in:

SshClient ssh = new SshClient();

ssh.connect(hostname,new AlwaysAllowingConsoleKnownHostsKeyVerification());

The class AlwaysAllowingConsoleKnownHostsKeyVerification is defined below. I understand this is something of a security hack, but sometimes you may like to know it anyway.

 

import com.sshtools.j2ssh.transport.ConsoleKnownHostsKeyVerification;
import com.sshtools.j2ssh.transport.InvalidHostFileException;
import com.sshtools.j2ssh.transport.publickey.SshPublicKey;

public class AlwaysAllowingConsoleKnownHostsKeyVerification extends
        ConsoleKnownHostsKeyVerification {

    public AlwaysAllowingConsoleKnownHostsKeyVerification()
            throws InvalidHostFileException {
        super();
        // Nothing else to do here
    }

    @Override
    public void onHostKeyMismatch(String s, SshPublicKey sshpublickey,
            SshPublicKey sshpublickey1) {
        try
        {
            System.out.println("The host key supplied by " + s + " is: " + sshpublickey1.getFingerprint());
            System.out.println("The current allowed key for " + s + " is: " + sshpublickey.getFingerprint());
            System.out.println("~~~Using Custom Key verification, allowing to pass through~~~");
            allowHost(s, sshpublickey, false);
        }
        catch(Exception exception)
        {
            exception.printStackTrace();
        }
    }

    @Override
    public void onUnknownHost(String s, SshPublicKey sshpublickey) {
        try
        {
            System.out.println("The host " + s + " is currently unknown to the system");
            System.out.println("The host key fingerprint is: " + sshpublickey.getFingerprint());
            System.out.println("~~~Using Custom Key verification, allowing to pass through~~~");
            allowHost(s, sshpublickey, false);
        }
        catch(Exception exception)
        {
            exception.printStackTrace();
        }
    }

}
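
For completeness, here is roughly how the verifier slots into a full SFTP session. This is only a sketch from memory of the j2ssh API (the authentication and SftpClient calls should be double-checked against the j2ssh version you use), and the host, credentials and paths are placeholders:

import com.sshtools.j2ssh.SftpClient;
import com.sshtools.j2ssh.SshClient;
import com.sshtools.j2ssh.authentication.AuthenticationProtocolState;
import com.sshtools.j2ssh.authentication.PasswordAuthenticationClient;

public class SftpDownload {

    public static void main(String[] args) throws Exception {
        SshClient ssh = new SshClient();
        // No host key prompt, thanks to the always-allowing verification above.
        ssh.connect("sftp.example.com", new AlwaysAllowingConsoleKnownHostsKeyVerification());

        PasswordAuthenticationClient auth = new PasswordAuthenticationClient();
        auth.setUsername("user");
        auth.setPassword("secret");

        if (ssh.authenticate(auth) == AuthenticationProtocolState.COMPLETE) {
            SftpClient sftp = ssh.openSftpClient();
            sftp.get("/remote/path/file.txt", "/local/path/file.txt");
            sftp.quit();
        }
        ssh.disconnect();
    }
}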