
Tuesday, December 20, 2011

Maven, Artifactory and Hudson for Oracle OSB Continuous Integration

In the previous blog we discussed using Ant and Hudson/Jenkins for continuous integration. What hasn't been explicitly discussed is dependency management, which Maven handles handsomely.

The benefits of using Maven instead of Ant:
1. standardization following best practices (e.g. directory structure), which leads to a shorter/simpler configuration file (pom.xml), less maintenance, and higher reusability
2. transitive dependency management: Maven will find the libraries needed and resolve conflicts between them. Perhaps you know this concept already if you've used the Ivy framework with Ant, but this concept is central in Maven, so lots of innovations have been implemented around this feature (e.g. enterprise repositories).
For example, I just committed an adjustment to StudentRegistrationService-ver2.0, which depends on LDAPService-ver2.0 and hibernate-ver3.jar. When I deploy StudentRegistrationService-ver2.0, Maven will also include LDAPService-ver2.0 and hibernate-ver3.jar from an enterprise repository that stores all the libraries used in your company. If the build & test processes succeed, the artifact of my new StudentRegistrationService-ver2.0 will be added to the repository, so other services which consume my service will be able to use this version 2.0. Better still, you can specify the dependency versions using ranges (e.g. min version, max version), so I can specify that my service depends on LDAPService up to version 2.0 (since I don't support the new interface of the newer LDAPService yet) and also depends on PaymentService version 3.1.1 or higher, since there is a payment bug in PaymentService versions lower than 3.1.1. Here is an example of defining these dependencies in the pom.xml of the StudentRegistrationService:

<dependency>
  <groupId>TUD</groupId>
  <artifactId>LDAPService</artifactId>
  <version>[0,2.0]</version>
</dependency>

<dependency>
  <groupId>TUD</groupId>
  <artifactId>PaymentService</artifactId>
  <version>[3.1.1,)</version>
</dependency>

An illustration of how it works:


1. Using Hudson/Jenkins to let the svn commit trigger the Maven build
Please see the previous blog about how to install and set up Hudson/Jenkins.

For this example, I configure Hudson/Jenkins to poll the svn server every minute (set by the schedule "* * * * *" in cron format). When there is a new commit in mysvnproject, the Maven "install" goal (along with its preceding lifecycle phases, i.e. compile and test) will be invoked.
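Under the hood this corresponds to a trigger entry in the job's config.xml, roughly like the sketch below (normally you just tick "Poll SCM" in the job configuration and fill in the schedule; the exact XML may differ per Hudson/Jenkins version):

<triggers>
  <hudson.triggers.SCMTrigger>
    <!-- cron-style schedule: poll subversion every minute -->
    <spec>* * * * *</spec>
  </hudson.triggers.SCMTrigger>
</triggers>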


2a. Checkout using the maven-scm-plugin

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-scm-plugin</artifactId>
  <version>1.1</version>
  <configuration>
    <username>username</username>
    <password>password</password>
  </configuration>
  <executions>
    <execution>
      <id>checkout</id>
      <configuration>
        <!-- for subversion the connectionUrl has the form scm:svn:http://... -->
        <connectionUrl>mysvnserver</connectionUrl>
        <checkoutDirectory>mysvndir</checkoutDirectory>
        <excludes>folder2exclude/*</excludes>
      </configuration>
      <phase>compile</phase>
      <goals>
        <goal>checkout</goal>
      </goals>
    </execution>
  </executions>
</plugin>

2b. Build the OSB project; you can use the same Ant task as in my previous blog, wrapped with the maven-antrun-plugin.
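As an illustration, a minimal sketch of such a wrapper (plugin version omitted; the antfile and target names are just examples, point them to your own build.xml and target from the previous blog):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-antrun-plugin</artifactId>
  <executions>
    <execution>
      <id>build-osb-jar</id>
      <phase>compile</phase>
      <goals>
        <goal>run</goal>
      </goals>
      <configuration>
        <target>
          <!-- call the existing Ant target, e.g. the makeosbjar target from the previous blog -->
          <ant antfile="build.xml" target="makeosbjar"/>
        </target>
      </configuration>
    </execution>
  </executions>
</plugin>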

3a. Obtain the dependencies from the repositories, using dependency:copy-dependencies or dependency:copy.
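For example, a minimal configuration of the maven-dependency-plugin to copy the dependencies into target/lib (the output directory is just an example, pick whatever your deploy step expects):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>copy-deps</id>
      <phase>package</phase>
      <goals>
        <goal>copy-dependencies</goal>
      </goals>
      <configuration>
        <!-- e.g. LDAPService-ver2.0 and hibernate-ver3.jar end up here -->
        <outputDirectory>${project.build.directory}/lib</outputDirectory>
      </configuration>
    </execution>
  </executions>
</plugin>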

3b. Deploy the OSB project and its dependencies; you can use the same Ant task as in my previous blog, wrapped with the maven-antrun-plugin (same pattern as the sketch under 2b).

4. Run the SoapUI web service test using the maven-soapui-plugin (or alternatively you can use testrunner.bat / testrunner.sh as in my other blog)

<plugin>
  <groupId>eviware</groupId>
  <artifactId>maven-soapui-plugin</artifactId>
  <version>3.0</version>
  <executions>
    <execution>
      <phase>test</phase>
      <id>soapuitest</id>
      <configuration>
        <projectFile>${mysoapuitestfile}</projectFile>
        <outputFolder>${testreportdir}</outputFolder>
        <junitReport>true</junitReport>
        <exportAll>true</exportAll>
        <printReport>true</printReport>
        <settingsFile>${soapuisettingfile}</settingsFile>
      </configuration>
      <goals>
        <goal>test</goal>
      </goals>
    </execution>
  </executions>
</plugin>

5. Archiving the artifact in an enterprise repository.
The benefits of using an enterprise repository:
• your developers don't have to search, download and install the libs manually
• it's faster & more reliable than downloading the libs from the internet; the concept is similar to a proxy server that caches internet content
• it stores the artifacts of your company projects from ant/maven builds, so they will be readily available for testing and shipping
• web administration interface, search, backup, import/export

I chose Artifactory as the enterprise repository since it has more features than other products, such as: XPath search inside XML/POM files, Hudson integration (e.g. for build promotion), LDAP connectivity, a cloud (SaaS) option, and easy installation (running in an embedded Jetty server or as a service on Windows/Linux).
You can use the Hudson Artifactory plugin to integrate Artifactory into the Hudson/Jenkins process.
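If you publish with a plain mvn deploy instead of (or next to) the Hudson Artifactory plugin, a distributionManagement section along these lines tells Maven where to put the artifact (just a sketch; the URLs are placeholders in the same style as the repository URLs below, and the exact repository path depends on your Artifactory setup):

<distributionManagement>
  <repository>
    <id>tud-repo</id>
    <url>http://myreposerver:port/artifactory/tud-repo</url>
  </repository>
  <snapshotRepository>
    <id>tud-repo</id>
    <url>http://myreposerver:port/artifactory/tud-repo</url>
  </snapshotRepository>
</distributionManagement>

The credentials for the tud-repo id then go into the servers section of your settings.xml.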

I use 3 local repositories inside Artifactory for different library categories:
• open source/ibiblio libraries (e.g. apache commons jars); Artifactory can download these automatically
• proprietary libraries (e.g. the oracle jdbc jar); you need to install these manually (e.g. via the Artifactory web interface)
• company libraries; you need to install these manually or via the Hudson build as done in this example.
For the company repository, I configure it so that it can handle both release/stable versions (e.g. PaymentService-ver3.1.1, which is already well tested and approved) as well as snapshot versions (e.g. I am not finished with my StudentRegistrationService-2.0 yet, but I want to make it available for other projects which depend on it). For example in artifactory.config.xml:

<localRepository>
  <key>tud-repo</key>
  <description>mycompany-libs</description>
  <handleReleases>true</handleReleases>
  <handleSnapshots>true</handleSnapshots>
</localRepository>

<localRepository>
  <key>ibiblio-repo</key>
  <description>stable-opensource-libs</description>
  <handleReleases>true</handleReleases>
  <handleSnapshots>false</handleSnapshots>
</localRepository>

You need to declare these repositories in your pom.xml (or, with a similar approach, in settings.xml for all of your projects):

<repositories>
  <repository>
    <id>ibiblio-repo</id>
    <url>http://myreposerver:port/artifactory/repo</url>
    <snapshots>
      <enabled>false</enabled>
    </snapshots>
  </repository>
  <repository>
    <id>tud-repo</id>
    <url>http://myreposerver:port/artifactory/repo</url>
    <snapshots>
      <enabled>true</enabled>
    </snapshots>
  </repository>
</repositories>
<pluginRepositories>
  <pluginRepository>
    <id>ibiblio-repo</id>
    <url>http://myreposerver:port/artifactory/repo</url>
    <snapshots>
      <enabled>false</enabled>
    </snapshots>
  </pluginRepository>
  <pluginRepository>
    <id>tud-repo</id>
    <url>http://myreposerver:port/artifactory/repo</url>
    <snapshots>
      <enabled>true</enabled>
    </snapshots>
  </pluginRepository>
</pluginRepositories>

For the sake of clarity, some details are omitted from this blog. The concepts in this blog also work for non-OSB projects (e.g. Java/J2EE applications).

Any comments are welcome :)



See also: http://soa-java.blogspot.nl/2011/03/soa-continuous-integration-test.html


References:
Setting Up a Maven Repository
Comparison Maven repository: Archiva, Artifactory, Nexus
Amis blog: Soapui test with maven


Continuous integration for Oracle OSB projects using Ant and Hudson / Jenkins

Continuous integration (CI) is pervasive. This doesn't come from nothing: CI has many benefits, such as avoiding last-minute integration hell and improving software quality.


The principles of Continuous integration
1. every commit to the SCM should be built so that you'll have early feedback if the build breaks
2. automate build for consistency
3. automate deployment for consistency
4. automate test of the build artifact for consistency
5. archive the build artifact so that it'll be readily available (e.g. for further test)
6. keep the reports & metrics from build & test

You can also add additional steps; for example, when a commit breaks the test you can reject the commit by rolling back the deployment (using the WLST undeploy task) and rolling back the svn commit using a reverse merge:
svn merge -r currentver:previousver ; svn commit

This blog will show how to achieve these steps using Ant and Hudson/Jenkins. We use Ant since in many organizations Ant is already well adopted compared with Maven. In another blog we discuss how to achieve the same goal using Maven and an artifact repository (e.g. Artifactory or Nexus), which are handier than Ant, particularly with respect to dependency management.

How it works:
1. Using Hudson/Jenkins to let the svn commit trigger the Ant build
I chose Hudson/Jenkins since it's easy to use, has good features, is scalable, and is recommended by many people (including folks working at Oracle). You can see Jenkins as a new version of Hudson; Jenkins was created to avoid legal problems with Oracle when Kohsuke Kawaguchi, Hudson's creator, left Sun/Oracle.
Installing Hudson/Jenkins is easy; on Windows it can be run as a Windows service. Hudson/Jenkins contains an embedded Winstone servlet engine, so you can also run it using
java -jar hudson.war --httpPort=aportnumber
To install Hudson/Jenkins on Weblogic you need to add the deployment descriptor weblogic.xml to solve classpath conflicts with certain jars (depending on which version you install); also, Hudson/Jenkins will not work on Weblogic servers with SOA/OSB extensions due to some conflicting settings.
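For illustration, a weblogic.xml along these lines is the usual starting point (just a sketch; which classes you need to prefer from the war depends on your Hudson/Jenkins and WebLogic versions):

<weblogic-web-app xmlns="http://www.bea.com/ns/weblogic/weblogic-web-app">
  <container-descriptor>
    <!-- prefer the jars bundled inside hudson.war over WebLogic's own copies -->
    <prefer-web-inf-classes>true</prefer-web-inf-classes>
  </container-descriptor>
</weblogic-web-app>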
You can add some plugins, for example locale (to set languages), svn related plugins, trac (to recognize trac tags e.g. fixed#), a cvs/svn browser viewer, html report, scp artifact repository, and promoted builds, among others.

You need to set some configuration items such as the JDK location, svn connection, Ant location, Maven location, JUnit test report location, artifacts location, and the SMTP/email server for notifications.

For this example, I configure Hudson/Jenkins to poll the svn server every minute (set by the schedule "* * * * *" in cron format). When there is a new commit in mysvnproject, the Ant build (the "main" target) will be invoked.

2. Checkout the newly committed svn project using svntask:

<target name="checkout">
<delete failonerror="false" includeemptydirs="true" quiet="true"
dir="${servicename}" />
<svn username="${subversion.user}" password="${subversion.password}">
<checkout url="${subversion.path}" destPath="${servicename}" />
</svn>
</target>

Build this code into an OSB configuration jar using ConfigExport:

<target name="makeosbjar" depends="deletemetadata">

<!-- osb jar compile -->
<java dir="${osb.eclipse.home}"
jar="${osb.eclipse.home}/plugins/org.eclipse.equinox.launcher_1.0.201.R35x_v20090715.jar"
fork="true"
failonerror="true"
maxmemory="768m">
<jvmarg line="-XX:MaxPermSize=256m" />
<arg line="-application com.bea.alsb.core.ConfigExport" />
<arg line="-data ${workdir}" />
<arg line="-configProject ${osb.config.project}" />
<arg line="-configJar ${jardir}/${packagename}" />
<arg line="-configSubProjects ${servicename}" />
<sysproperty key="weblogic.home" value="${osb.weblogic.home}" />
<sysproperty key="osb.home" value="${osb.home}" />
</java>
</target>

3. Deploy the jar to the OSB server using WLST import (a python script)

<target name="deployOSB">
<wlst fileName="${import.script}" debug="true" failOnError="false"
arguments="${wls.username} ${wls.password} ${wls.server} ${servicename} ${jardir}/${packagename} ${import.customFile}">
<script> <!-- run these before import.py -->
adminUser=sys.argv[1]
adminPassword=sys.argv[2]
adminUrl=sys.argv[3]
passphrase = "osb"
project=sys.argv[4]
importJar=sys.argv[5]
customFile=sys.argv[6]
connect(adminUser,adminPassword,adminUrl)
domainRuntime()
</script>
</wlst>
</target>

4. Run the SOAPUI web service test and generate the junit test report:
<target name="soapui-test">
<exec executable="cmd.exe" osfamily="windows" failonerror="false">
<arg line="/c ${testrunner.bat} -j -freports ${soapui.test}"/>
</exec>

<junitreport todir="${testreportdir}">
<fileset dir="reports">
<include name="TEST-*.xml"/>
</fileset>
<report format="frames" todir="${testreportdir}/html"/>
</junitreport>
</target>


Example of Hudson/Jenkins output:

Hudson also sends email notifications:

For the sake of clarity, some details are omitted from this blog (e.g. the Ant classpath for the libs needed: svnant, svnjavahl, svnclientadapter, xmltask); these details can be found in the build.xml. Please download the build.xml, build.properties and the import.py wlst script here.

The concepts in this blog also work for non-OSB projects (e.g. Java/J2EE applications).

Source: Steve's blogs http://soa-java.blogspot.com/

Any comments are welcome :)




The file download is made possible by OpenDrive


References:

Using the Oracle Service Bus Plug-ins for Workshop for WebLogic http://docs.oracle.com/cd/E13159_01/osb/docs10gr3/eclipsehelp/tasks.html
Using WLST http://docs.oracle.com/cd/E15051_01/wls/docs103/config_scripting/using_WLST.html
Biemond's blog http://biemond.blogspot.com/2010/07/osb-11g-ant-deployment-scripts.html
How-to-deploy-Hudson-to-weblogic http://jenkins.361315.n4.nabble.com/How-to-deploy-Hudson-to-weblogic-td3246817.html

Monday, April 11, 2011

Showing the svn / build version number in your (web) application

It is often useful to include version information in your (web) applications so you can track which code/build your customers use. This comes in handy especially when you need to solve bugs: always mention the code/build version where the bug occurs when you write a bug reproduction scenario in your bug tracking tool (e.g. Bugzilla). As an example, one of the most common ways to include the version information is in the "About" dialog.


Showing a version number in a JSP web application
For example you can include this in your jsp:

Version: <%= application.getInitParameter("version-svn") %>

which reads the version-svn parameter value in the web.xml:

<context-param>
<param-name>version-svn</param-name>
<param-value>170</param-value>
</context-param>


Changing the web.xml parameter value using Ant script
In the Ant build.xml you can use XmlTask to change the version number in the web.xml:

<taskdef classname="com.oopsconsultancy.xmltask.ant.XmlTask" name="xmltask"/>



<target name="changeversionwebxml" depends="getsvnver">

<xmltask dest="${webinf}/web.xml" source="${webinf}/web.xml" failwithoutmatch="true">

  <replace path="/*[local-name()='web-app']/*[local-name()='context-param'][1]/*[local-name()='param-value'][1]/text()" withtext="${revision.max}">

 </xmltask>

</target>

The xpath predicate "/*[local-name()='name of an element']" is necessary to resolve the namespaces in the web.xml. The [1] is necessary since my web.xml uses more than one "web-app/context-param/param-value" element, so I need to tell xmltask that I want to change the first "param-value" it finds. The ${revision.max} property is set by another Ant target, described below.

Getting the svn revision number using Ant script
We can use SvnAnt to get the highest svn revision number of your source tree starting from the ${basedir}.

<path id="svnant.classpath">

<!-- if you use EclipseIDE, add the svnant jars using eclipse>preference>ant>classpath -->

<fileset>

<include name=" ">

</fileset>

</path>

<typedef classpathref="svnant.classpath" resource="org/tigris/subversion/svnant/svnantlib.xml">
<target name="getsvnver">
<svn>
<wcVersion path="${basedir}"/>
</svn>
<echo message= "Subversion revision.max: ${revision.max}" />
</target>

The ${basedir} points to your working directory, which contains your source code along with the .svn directories (svn bookkeeping files). One way to define the basedir in the build.xml, for example:

<project name="project name" basedir="../" default="your default target">


Using a build number
Actually the svn revision information is more appropriate for developers. For a shipped product, a build number will be more useful. Instead of the svn number, you'd better include extra info such as the time, build machine, and build owner (the developer who ran the build task). For example, using Hudson (a continuous integration tool) you can access the build information through environment variables:
<property environment="env"/>
<property name="buildinfo" value="${env.JOB_NAME}-${env.BUILD_NUMBER}"/>

You can substitute the ${revision.max} in the xmltask above with this ${buildinfo}.

Please post your comments / suggestions. Thank you.

Monday, March 21, 2011

SOA continuous integration test

The Problem
Consider this scenario in the Acme corporation:


The EmployeePages webservice (WS) is developed by an external company. The ShowCollaboration WS, the Salary WS & the Employee Database WS are developed by internal teams.

There are potentially 2 problems in this scenario:

1. Integration problems faced by multiple development teams
After 3 months, the external team comes to deliver the EmployeePages WS, and soon they face integration hell. It turns out that during these 3 months the Employee Database WS & the ShowCollaboration WS have been changed by the Acme teams.

2. Side effects on other webservices due to new requirements
Suppose there is a new request to modify the schema of one of the services or to add a new function or webservice. The Acme chief architect is supposed to maintain an overview of all the webservices and their schemas; he will warn/advise the developers about the dependencies/potential side effects on other webservices. However, this scenario with 4 webservices is just a simplification; in real life we have more than 100 interconnected webservices. Thus it will be difficult for the chief architect to maintain an overview of the whole landscape.



The Solution
The solution to this problem is to incorporate (regression) tests against the other webservices in the Acme corporation. If everything is fine (the new change doesn't break other webservices), the change will be promoted (accepted into the baseline revision in version control).

In this scenario we have 2 version control repositories: svnbase (the main repository) & svntest (the temporary repository for testing).
The steps are as follows:

  1. Commit the changes to svntest

  2. Build the new changes (using continuous integration tools such as CruiseControl or Hudson)

  3. Perform regression tests with all other webservices

  4. If the new changes break other webservices, you need to communicate with the other team to sort out the problem, and revert/rollback the new change in svntest.
    If the new changes don't break anything, the change will be promoted (committed to svnbase).

These steps can be implemented using Ant, Maven, or even shell scripts (see the sketch below). By integrating these steps into the standard commit process, we force small-step continuous integration, which will avoid integration hell later on. Also, the regression test will warn the developers & the chief architect about the dependencies/side effects which otherwise may not be clearly foreseen.
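As a rough sketch of the promotion in step 4 using Ant (assuming the svn command-line client is on the path; the promote target and the svnbase.url property are hypothetical names, and added/deleted files would still need svn add/delete handling):

<target name="promote" depends="soapui-test">
  <!-- check out the baseline, overwrite it with the tested sources, and commit -->
  <exec executable="svn" failonerror="true">
    <arg line="checkout ${svnbase.url}/${servicename} promote-wc"/>
  </exec>
  <copy todir="promote-wc" overwrite="true">
    <!-- Ant's default excludes already skip the .svn bookkeeping directories -->
    <fileset dir="${servicename}"/>
  </copy>
  <exec executable="svn" dir="promote-wc" failonerror="true">
    <arg line="commit -m 'promote tested change for ${servicename}'"/>
  </exec>
</target>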





The test

This is an example of a simple webservice test using SoapUI: (please click the image for a better resolution)


In this test we apply several assertions. SoapUI will assert that the response is a valid SOAP message, then validate it against the WSDL and match the schema structure. I also added an extra test for the business logic/data integrity, for example to check that the name field in the response contains a certain string (e.g. "PXE").

It's advisable to add the bin directory of SoapUI to your path. This directory contains testrunner.bat (or testrunner.sh for Unix), which runs the SoapUI testsuite, and loadtestrunner.bat (or loadtestrunner.sh for Unix), which runs the SoapUI loadtest. Then we can run the SoapUI test from the command line/shell script:

testrunner -r soapui-project.xml

the option -r will generate a report to the console. The soapui-project.xml is the name of your SoapUI test project file. You may also run the test using Ant instead of the command line/shell script.
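For example (a sketch along the lines of the soapui-test target from the Ant/Hudson post above, Windows-style; adjust paths and file names to your project):

<target name="soapui-test">
  <!-- run the SoapUI testsuite; -j produces junit-style xml reports in the reports folder -->
  <exec executable="cmd.exe" osfamily="windows" failonerror="false">
    <arg line="/c testrunner.bat -j -freports soapui-project.xml"/>
  </exec>
  <!-- summarize the junit xml files into an html report -->
  <junitreport todir="reports">
    <fileset dir="reports">
      <include name="TEST-*.xml"/>
    </fileset>
    <report format="frames" todir="reports/html"/>
  </junitreport>
</target>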



In this Ant script we generate an HTML JUnit report which summarizes the SoapUI test results.




Performance test
You might also want to include a performance test in the regression test. The reason behind this is that if the performance of your webservice is severely degraded due to the new changes, it will also worsen the performance of the other webservices which depend on it. If the performance of the system becomes unacceptable, you might need to examine the business logic and/or optimize the hardware/database. This is an example of a loadtest in SoapUI (please click the image for a better resolution):


SoapUI will run 100 threads over 60 seconds (which I hope is well above the burn-in period.) I set an assertion which warns me if the response time exceeds 5 sec.



Comparison with unit test in Oracle SOA Suite

Another alternative is, for example, using the unit test facility in Oracle SOA Suite. My hesitations with this method:
• it is technology-specific (it only works with Oracle SOA)
• currently the content test is only possible with an xml/string match, so it can't deal with xml fragments and lacks the xpath query flexibility of SoapUI.




Summary
This article describes a solution for the integration problems facing SOA development. The solution is to incorporate (regression) tests against the other webservices. If everything is fine (the new change doesn't break other webservices), the change will be promoted (accepted into the baseline revision in version control).

By integrating these steps into the standard commit process, we force small-step continuous integration, which will avoid integration hell later on. Also, the regression test will warn the developers & the chief architect about the dependencies/side effects which otherwise may not be clearly foreseen.

Management will be happy too, since this solution will move the development process from chaotic integration (CMM1) to a well-defined, standard integration process (CMM3).




Please drop your comments (e.g. other ideas/tools to solve this problem, other SOA integration problems, etc.) Thanks!