Posts Tagged NetBeans

Building a Deployment Pipeline Using Git, Maven, Jenkins, and GlassFish (Part 2 of 2)

Build an automated deployment pipeline for your Java EE applications using leading open-source technologies, including NetBeans, Git, Maven, JUnit, Jenkins, and GlassFish. All source code for this post is available on GitHub.

System Diagram 3a

Introduction

In part 1, Building a Deployment Pipeline Using Git, Maven, Jenkins, and GlassFish (Part 1 of 2), we built the first part of our basic deployment pipeline using leading open-source technologies. In part 2, we will use Jenkins CI Server and Oracle GlassFish Application Server to complete our deployment pipeline.

To review, the three main goals of our deployment pipeline are continuous integration, automated testing, and continuous deployment. Our objective is to automatically compile, test, assemble, and deploy our Java EE application to multiple environments, as the project progresses through the software development life cycle (SDLC).

Setting up Git Server

As I mentioned in part 1, as a part of a development team using Git, you would place your project on a remote Git Server. You and your team members would each clone the repository from the Git Server to your local development environments. You and your team would commit your code changes locally, then pull, merge, and push your changes back to the remote Git Server. Jenkins will pull the project’s source code from the Git Server.

In part 1 of this post, we just created a local Git repository. In part 2, we will properly set up our project on a remote Git Server. First, we need to export our local repository into a new, bare repository on the Git Server. The Git term ‘bare repository’ refers to a repository that does not contain a working directory. The repository has no working copies of your source files. You only use the bare repository to clone from, pull from, and push to. The bare repository's name ends with a .git extension (e.g. ssh://user@server:/git-repos/myproject.git).

From the root of your remote Git Server repository, execute the following command, substituting the path to your local project. If your Git Server is on a separate machine from your local project repository, you will need to copy the new bare repository to the remote Git Server. This involves a few simple steps, explained in this post, and at git-scm.com.

git clone --bare {path-to-existing-local-repository}\{name-of-repository} {name-of-repository}.git
Export Local Project to New Bare Repository

Once you have created the repository on the remote Git Server, I would recommend you clone the remote repository to your local machine and discard your original local repository from part 1 of the post. You don’t have to do this step, but cloning fresh from the server will make sure Git is working correctly. The screen grabs below illustrate an example of cloning a new repository to my local NetBeans Project folder.
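
If your Git Server is a separate machine, copying the bare repository and cloning a fresh working copy can be done with just a few commands. Below is a minimal sketch; the server name, user, and paths are placeholders for illustration only:

# copy the new bare repository up to the remote Git Server (example paths)
scp -r myproject.git git@your-git-server:/git-repos/
# clone a fresh working copy from the Git Server into your local projects folder
git clone git@your-git-server:/git-repos/myproject.git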

Clone New Bare Server Repository - Screen 1

Clone New Bare Server Repository – Screen 1

Clone New Bare Server Repository - Screen 2

Clone New Bare Server Repository – Screen 2

Clone New Bare Server Repository - Screen 3

Clone New Bare Server Repository – Screen 3

Configuring Jenkins

The diagram below illustrates the deployment pipeline from Git Server to Jenkins to GlassFish in finer detail. It begins with an initial commit to the local Git project repository and ends with the deployment of the project’s WAR file to the GlassFish domain. We will walk through it step-by-step.

System Diagram 3c

Jenkins Plugins

Before we create our new Jenkins Jobs, we need to configure Jenkins properly. You will need a recent version of Jenkins installed, along with the following plugins:

  1. Build With Parameters Plugin
  2. Copy Artifact Plugin
  3. Jenkins GIT plugin (includes Jenkins GIT client plugin)
  4. Jenkins Parameterized Trigger plugin
  5. Maven Integration plugin
  6. Credentials Plugin (optional for use with Git Server if security is enabled)
  7. ThinBackup (optional to install supplied Jenkins jobs configuration files)

Global Security

Jenkins can be configured with or without Global Security. For this post, I have enabled Global Security, as is typical of most development environments. I chose to use the ‘Jenkins’s own user database’ option for authentication. In larger development environments, authentication would normally be done against LDAP.

Jenkins' Configure Global Security

Configuring Global Security

The user I have set up, ‘jenkins’, will be the user that Git authenticates with when connecting to Jenkins (explained later). Set up your own user and note their API token. Since Global Security has been enabled, we will need the token later to trigger the Jenkins build from Git. Your user’s unique API token will be different than the one in the example below.

Jenkins User API Token

Jenkins Jobs

We will set up two Jenkins ‘free-style software project’ jobs, ‘GitMavenGlassFish_Build’ and ‘GitMavenGlassFish_Deploy’. We won’t be using the obvious choice, a ‘maven2/3 project’. If you’re interested, here’s why. The first job, the build job, will be responsible for pulling the source code from the Git Server. The build job, with help from Maven, will compile, test, and assemble the application code. The second job, the deployment job, will pull the artifacts from the build job and deploy them to GlassFish. The build job will trigger the deployment job once the build job completes successfully. This is explained in detail below.

Why Two Jobs?

Following good modular design and Separation of Concerns (SoC) principles, separating the build from the deployment gains us several advantages, including:

  1. Modularity – The ability to change the deployment methodology or deployment targets without disrupting the build and test process. For example, we might move the application hosting from GlassFish to WebLogic, or decide to use Ant instead of Maven for deployment tasks. This can happen totally independently of the build and testing processes.
  2. Separation/Isolation – If for any reason we are unable to deploy the artifacts as part of the deployment job, we won’t impact the continuous integration and automated testing processes, which are part of the separate build job.
  3. Support – Support is easier when there are smaller pieces of functionality to troubleshoot and maintain.

In a larger enterprise environment, you would probably encounter further separation of concerns. Unit testing, performance testing, deployment validation, and documentation generation (javadocs) are often handled by separate jobs. Jenkins represents a smaller pipeline within our larger deployment pipeline.

I intentionally left out notifications for brevity. At a minimum, you would want to be notified when the build or deployment jobs fail. Additionally, with continuous deployment, the deployment would trigger a notification to the stakeholders of that environment, such as the Testers. This lets them know the new software is ready to be tested. Notifications often include a list of bug fixes and feature enhancements that need to be tested. This can easily be pulled from Git into Jenkins and out to the end user.

Both Jenkins job definitions are available as XML files on gist.github.com. Using Jenkins’ ThinBackup Plugin, you can save both gists locally, and then restore them to your Jenkins server. The build job gist is here and the deployment job gist is here. This may save you some configuration time.

Jenkins Build Job

Both the build and deployment jobs require an input parameter. This parameter represents the targeted environment (GlassFish domain) for deployment, such as ‘testing’. How this parameter is passed to Jenkins is discussed later in the Git Hooks section, below.

Reviewing the below screen grab of the build job’s configuration, you will observe the following steps:

  1. Build Request – A build request is received by the job (explained later). The request contains an input parameter indicating the ‘environment’. The parameter must be one of the choices listed in ‘Choices’.
  2. Maven Dependencies – Based on the pom file, Maven retrieves all the required dependencies from the remote Maven repository, if the dependencies are not already contained in the workspace’s local repository. Note the setting ‘Use private Maven repository’ is checked. This creates a local repository for project dependencies within the project’s workspace.
  3. Pull from Git – Jenkins pulls the code from the Git Server using the supplied repository configuration information. Note my Git Server does not require authentication. If it did, we would set-up and use the proper credentials.
  4. Build – Jenkins builds the project using the Maven command ‘clean install -e’. The pom file contains the necessary configuration information.
  5. Unit Test – The above Maven ‘install’ command also calls JUnit to execute the unit tests. The results of these tests are published and displayed as part of the build job’s details.
  6. Assemble WAR – The above Maven ‘install’ command also assembles the project’s WAR file.
  7. Archive Artifacts – Based on the success of the build and unit tests, Jenkins archives specific artifacts needed by the deployment job. Jenkins uses the input parameter in #1 to define which properties file and password file to archive.
  8. Trigger Deployment Job – Based on the success of the build and unit tests, Jenkins triggers the ‘downstream’ deployment job, passing it the same environment parameter.
Jenkins Build Job Configuration

Jenkins Deployment Job

Reviewing the below screen grab of the deployment job’s configuration, you will observe the following steps:

  1. Build Request – A build request is received from the upstream build job. The request contains the input parameter indicating ‘environment’.
  2. Copy Artifacts – Jenkins copies the artifacts from the build job that called the deploy job.
  3. Read Properties – Maven executes the command ‘mvn properties:read-project-properties glassfish:redeploy -e’. The first half of this command instructs Maven to read the appropriate properties file, as indicated by the environment parameter, ‘glassfish.properties.file.argument=${environment}’.
  4. POM – Maven substitutes the key ‘glassfish.properties.file.argument’ in the pom file with the environment value. This tells Maven the name of the properties file, which supplies all the remaining property values to the pom file.
  5. Maven Dependencies – If the dependencies are not already contained in the workspace’s local repository, Maven retrieves all the required dependencies from the remote Maven repositories, based on the pom. Note the setting ‘Use private Maven repository’ is checked in the screen grab below. This option instructs Jenkins to create a local repository for project dependencies within the project’s workspace.
  6. Deployment – The last half of the command in #3 deploys, or more accurately redeploys, the application’s WAR file to GlassFish. The ‘glassfish:redeploy’ goal works only if the WAR file has already been initially deployed to the GlassFish domain using the ‘glassfish:deploy’ goal. For this process, I am assuming the initial deployment was already done directly through the GlassFish Administration Console, NetBeans, or the command line. The equivalent Maven invocation is sketched after this list.
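
Putting steps 3 through 6 together, the deployment job effectively runs a single Maven command, with Jenkins supplying the environment parameter. A sketch of the equivalent command for the ‘testing’ domain, run from the project root:

mvn properties:read-project-properties glassfish:redeploy -e -Dglassfish.properties.file.argument=testing
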
Jenkins Deploy Job Configuration

Git Hooks

To achieve continuous integration, we want to automatically build and test our job after each change to our code. We have a number of choices to make this happen. The obvious choice is letting Jenkins poll the Git Server. Although polling would simplify configuration, polling is frowned upon in many environments. Even the creator of Jenkins, Kohsuke Kawaguchi, frowns upon polling in his post, ‘Polling Must Die‘.

Why is polling bad? It adds unnecessary activity and delay. Let’s say Jenkins’ polling frequency is set to every 2 minutes, but you only have an average of 5 pushes to your remote Git Server project repository per day. Based on these stats, in just one day, Jenkins will poll Git 720 times to discover only 5 pushes. That’s 144 times per push. Also, based on the polling frequency, when you do push, you could wait up to 2 minutes for Jenkins to queue the build job. The longer you wait for feedback on your changes, the greater chance your defects could be pulled down by other developers. You should expect immediate and continuous feedback.

A vastly more efficient and configurable method of continuous integration between Git and Jenkins is Git Hooks. Git Hooks allow us to execute scripts based on specific Git actions. In our case, when a developer completes a successful push to the remote Git Server project repository, we want to call Jenkins to build, test, and deploy the modified project code. Using hooks means we only call Jenkins when a successful push is completed. Furthermore, we can be assured Jenkins will immediately queue our request to build and deploy the job when a push occurs.

Post-Receive Hook

There are several types of Git Hooks. They include ‘post-commit’, ‘pre-push’, ‘update’, ‘pre-rebase’, and so forth. I recommend this post on kernel.org for a good explanation of the hook types and their purposes. Git also includes sample hook files inside the ‘hooks’ subdirectory of each new repository’s .git folder.

For our pipeline, we will employ the ‘post-receive’ hook. Whenever a successful push is received by the Git Server’s project repository, the ‘post-receive’ hook will be called, and the script commands contained in the hook file will be executed. Hooks are language agnostic; they can be written in almost any scripting language, such as Perl, Shell, Bash, or Ruby.

To create the hook, create a new file, ‘post-receive’, in the hooks sub-directory of the Git Server’s project repository. Add the below code to the file. Change the Jenkins server URL to match your environment. Also, change the API token to match your user’s token from Jenkins. Note the command requires cURL to be installed on the Git Server. If installing cURL is not an option, there are other ways to execute the HTTP POST call from the hook’s script.

#!/bin/sh
# Call Jenkins to start build and pass environment parameter
#
echo "executing post-receive hook"
echo "environment=testing"
echo "user=jenkins"
# cURL POST request using jenkins user with API token
curl -u jenkins:{your-api-token-here} \
--data "delay=0sec&environment=testing" \
"{your-jenkins-server-url:port}/job/GitMavenGlassFish_Build/buildWithParameters"
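
One detail that is easy to overlook: on a Linux-based Git Server, the hook file must be marked executable, or Git will silently skip it. A quick sketch, assuming the bare repository lives at ‘/git-repos/myproject.git’:

cd /git-repos/myproject.git/hooks
chmod +x post-receive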

NetBeans and Git Hooks

Now some slightly bad news. As with any integration, there are always trade-offs; that is the case with NetBeans and Git. Although NetBeans works well with Git, there are a few features that have not been implemented. Unfortunately, this lack of complete integration affects NetBeans’ ability to make use of Git Hooks. Only after three hours of troubleshooting and research on the Internet did I realize this limitation. The hooks fire fine if a git push command is executed from a command prompt or from within a Git application like Git Gui or Git Bash. However, pushing from NetBeans using Team -> Remote -> Push… does not cause the hooks to be called.

Example Post-Receive Hook - Works from Command Prompt

Post-Receive Hook Working from a Windows Command Prompt

Git Hooks do not work with NetBeans because NetBeans does not use a command line client for Git. NetBeans uses a pure Java implementation of the Git client, known as JGit. I understand that other IDEs share this limitation. There are several discussions on StackOverflow and on the NetBeans bug tracking site about the issue and workarounds.

So what does this mean? You can use NetBeans to perform all of your local tasks. However, when it comes time to push your code back to the remote Git Server repository, you must use a command prompt, Bash shell, or a tool built on the command-line Git client. I recommend Git Gui; Git ships with built-in GUI tools, including git-gui and gitk, and can be downloaded from git-scm.com.
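
For reference, the push that fires the hook is just the standard Git sequence, whether typed at a command prompt or performed through Git Gui. A minimal sketch, assuming an ‘origin’ remote pointing at the Git Server and the default ‘master’ branch:

git add -A
git commit -m "change output string in HelloWorldResource"
git push origin master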

Git Gui Graphical User Interface for Git

Push Files Using Git GUI Instead of NetBeans

Pushing changes to the remote Git Server using Git Gui instead of NetBeans may seem inconvenient at first. However, the more advanced your needs become with Git, the more you will find you need the additional functionality of Git Bash, Git Gui, and gitk. Tasks like resetting the branch to a previous revision, compressing the Git repository database, and visualizing repository history, can all be done with tools like Git Gui and gitk. I have Git Gui running when I am working in NetBeans or other IDEs; it becomes second nature.

Using Git Gui and gitk Used to Examine Repository

Using Git Gui and gitk to Examine and Modify the Project Repository

Deploying to GlassFish

At this point we have configured the Git Server, created the Jenkins build and deploy jobs, and configured our Git hook. We are ready to test our deployment pipeline. First, make sure your GlassFish domains are running. Also, recall we are assuming that an initial deployment of the application has occurred. This might be directly through the GlassFish Administration Console, through NetBeans, or via the command line. Recall, Jenkins will only be executing a re-deploy.

Check and Start GlassFish Domains

To test the system, make an innocuous change to the project. Commit the change to your local Git repository. Following that, push the change back to the remote Git Server repository using Git Gui. If the hook fired, you will see output in the Git Gui terminal window, echoed from the post-receive hook as it executed its script.

Push with Git Gui Triggering Jenkins Build

The post-receive hook executes the cURL command, which posts an HTTP request to Jenkins via the Jenkins Remote API. You should then observe the Jenkins build job queued and running.

Jenkins Build Job Running

When the build completes, review the Parameters menu option in the left navigation menu. It shows that the environment parameter was passed from the post-receive hook to the build job. The build results window also provides test results, Git Build Data, and the changes pushed to Git that triggered the CI build.

Jenkins Build Job Results

The console output from the build provides a detailed view of the build process. Using the ‘-e’ switch with the Maven command increases the level of output detail. You see the details of Maven copying the required dependencies from the remote repository to the local workspace repository, prior to compilation. You see the unit tests being executed. Finally, you see the WAR file assembled and the required artifacts archived.

Regarding Maven Dependencies, you will only see the dependencies copied on the first build to an empty workspace. Maven does not re-pull dependencies if they already exist in the workspace’s local repository. To see the difference, empty your workspace and build the job, then immediately rebuild the job. Compare the console outputs of both jobs. You will see a significant difference in the Maven dependency activities.

Jenkins Build Job Console Results

Once the build job has completed successfully, you should notice the Jenkins deployment job running, triggered by the build job. When complete, note the detail that lists the exact build job that called the deployment job, and its build number. For example, the upstream build job #45 triggered the downstream deployment job #33. This linkage between upstream and downstream jobs is retained in the job’s history.

As before, review the Parameters menu option in the left navigation menu. It shows that the environment parameter was passed from the post-receive hook to the build job, and then on to the deployment job.

Jenkins Deployment Job Complete

A review of the console output will confirm that the artifacts were copied from the build job and the WAR file was deployed to the ‘testing’ GlassFish domain.

Jenkins Deployment Job Console Output

GlassFish

If the hook fired, and both the Jenkins build and deployment jobs ran successfully, you should observe that the project’s WAR file, containing your recent change, was deployed to the testing GlassFish domain.

Application Installed on GlassFish Server Testing Domain

You can verify this by calling the application’s RESTful ‘resources/helloWorld’ URI from your browser. Repeat the process by changing the output string, committing the change, and pushing. Then verify that your latest change was deployed.
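
You can also check the service from a command prompt with cURL, which we already installed for the Git hook. A sketch; the host, the ‘testing’ domain’s HTTP instance port of 6061 (from the asadmin commands in part 1), and the context root are assumptions you should adjust to your environment:

curl "http://{your-glassfish-host}:6061/HelloGlassFishMaven/resources/helloWorld"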

Application Running on GlassFish Server Testing Domain

Jenkins Workflows

Using our deployment pipeline, we have two distinct workflow options:

  1. Continuous – Use Git hooks to build, test, and deploy the WAR file to the domain(s) of choice when changes are pushed. Any time a change is pushed, a build, test, and deploy should occur. This would be just for development at first. Once the project enters the testing phase of the SDLC, it would also include deployments to testing.
  2. Semi-Automated – Start the Jenkins build manually in the Jenkins browser-based Administration Console. This is more typical for a release to Production. Most teams are not comfortable extending continuous deployment into Production. Often, a deployment team will deploy the project artifacts in a controlled and staged approach. The Jenkins build and deployment jobs both allow this, along with the ability to provide the environment parameter both jobs need.

Conclusion

In part 1, we learned how to create a simple Java EE web application project in NetBeans using Maven. We learned how to integrate JUnit for unit testing, and how to use Git to manage our source code.

In part 2, we learned how to configure a remote Git Server, how to configure Jenkins CI Server to clone our project from the Git Server, build, test, and assemble it. If the build was successful, we learned how to configure Jenkins to deploy our project to a specific GlassFish domain, based on the project’s stage in the SDLC. We achieved our goals of continuous integration, automated testing, and continuous deployment.

Going Forward

To extend and enhance our deployment pipeline, you might consider adding the following features: 1) further separate the Jenkins jobs by function, 2) add build and deploy notifications, 3) add the ability to deploy to multiple environments simultaneously (i.e. development and testing), 4) add additional testing to confirm the deployment to GlassFish, 5) configure a versioning and naming scheme for the deployed artifacts, and 6) add error handling if a parameter is not received or is not one of the expected values.


Building a Deployment Pipeline Using Git, Maven, Jenkins, and GlassFish (Part 1 of 2)

Build an automated deployment pipeline for your Java EE applications using leading open-source technologies, including NetBeans, Git, Maven, JUnit, Jenkins, and GlassFish. All source code for this post is available on GitHub.

System Diagram 3a

Introduction

In my earlier post, Build a Continuous Deployment System with Maven, Hudson, WebLogic Server, and JUnit, I demonstrated a basic deployment pipeline using leading open-source technologies. In this post, we will demonstrate a similar pipeline, substituting Jenkins CI Server for Hudson, and Oracle’s GlassFish Application Server for WebLogic Server. We will use the same NetBeans Java EE ‘Hello World’ RESTful Web Service sample project.

The three main goals of our deployment pipeline will be continuous integration, automated testing, and continuous deployment. Our objective is to automatically compile, test, assemble, and deploy our Java EE application to multiple environments, as the project progresses through the software development life cycle (SDLC).

Building a reliable deployment pipeline is complex and time-consuming. To make it as easy as possible in this post, I chose NetBeans IDE for development, Git Distributed Version Control System (DVCS) for managing our source code, Jenkins Continuous Integration (CI) Server for build automation, JUnit for automated unit testing, GlassFish for application hosting, and Apache Maven to manage our project’s dependencies. Maven will also manage the build and deployment process to GlassFish, along with Jenkins. The beauty of NetBeans is its out-of-the-box, built-in integration with Git, Maven, JUnit, and GlassFish. Likewise, Jenkins has plugin-based integration with Git, Maven, JUnit, and GlassFish. Also, Maven has plugin-based integration with GlassFish.

Maven is a powerful tool for managing modern software development projects. This post will only draw upon a small part of Maven’s functionality and plug-in architecture extensibility. Specifically, we will use the Maven GlassFish Plugin. According to the Java.net website, which hosts the plug-in project, ‘the Maven GlassFish Plugin is a Maven2 plugin allowing management of GlassFish domains and component deployments from within the Maven build life cycle.’

Requirements

To follow along with this post, I will assume you have recent versions of the following software installed and configured on your Windows OS-based computer (the process is nearly identical for Linux):

  1. NetBeans IDE. Current version: 7.4
  2. JUnit. Current version: 4.11 (included with NetBeans 7.4)
  3. GlassFish Server. Current version: 4.0 (included with NetBeans 7.4)
  4. Jenkins CI Server. Current version: 1.538
  5. Apache Maven. Current version: 3.1.1
  6. cURL. Current version: 7.33.0
  7. Git with Git Gui and gitk. Current version: 1.8.4.3
  8. Necessary system environmental variables:
    M2_HOME, M2, JAVA_HOME, GLASSFISH_HOME, and PATH (a sketch of setting these on Windows follows this list)
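
Below is a sketch of setting the Maven and GlassFish variables from a Windows command prompt; the install paths are examples only and should match your own installations. Open a new command prompt afterwards so the values are picked up, and make sure the Maven and JDK ‘bin’ directories are on your PATH:

setx M2_HOME "C:\apache-maven-3.1.1"
setx M2 "C:\apache-maven-3.1.1\bin"
setx JAVA_HOME "C:\Program Files\Java\jdk1.7.0_45"
setx GLASSFISH_HOME "C:\glassfish-4.0\glassfish"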

GlassFish Domains

To simulate a simple deployment pipeline, we will create three GlassFish domains, simulating three common software environments, Development, Testing, and Production. A typical software project is promoted through these environments as it moves from development, to testing, and finally release to production. Each environment has distinct stakeholders with specific roles to play in the software development life cycle, including developers, testers, deployment teams, and end-users. Larger-scale, enterprise software development often includes other environments, such as Performance and Staging.

Create the domains from the command line using ‘asadmin’ commands such as the ones below. Note I have a ‘GLASSFISH_HOME’ system environment variable set up. The ports are your choice, but make sure they don’t conflict with existing installations of other applications, such as Jenkins, Tomcat, IIS, WebLogic, and so forth.

asadmin create-domain --domaindir "%GLASSFISH_HOME%\domains" --adminport 7070 --instanceport 7071 production
asadmin create-domain --domaindir "%GLASSFISH_HOME%\domains" --adminport 6060 --instanceport 6061 testing
asadmin create-domain --domaindir "%GLASSFISH_HOME%\domains" --adminport 5050 --instanceport 5051 development

As part of the creation process, you’re prompted for an admin account and a new password. I kept the ‘admin’ username, but added a new password for each domain created. This password is the same as the one used in the separate password files (explained below).

C:\Users\gstaffor>asadmin create-domain --domaindir "%GLASSFISH_HOME%\domains" --adminport 7070 --instanceport 7071 production
Enter admin user name [Enter to accept default "admin" / no password]>admin
Enter the admin password [Enter to accept default of no password]>
Enter the admin password again>
Using port 7070 for Admin.
Using port 7071 for HTTP Instance.
Using default port 7676 for JMS.
Using default port 3700 for IIOP.
Using default port 8181 for HTTP_SSL.
Using default port 3820 for IIOP_SSL.
Using default port 3920 for IIOP_MUTUALAUTH.
Using default port 8686 for JMX_ADMIN.
Using default port 6666 for OSGI_SHELL.
Using default port 9009 for JAVA_DEBUGGER.
Distinguished Name of the self-signed X.509 Server Certificate is:
[CN={my_computer_name},OU=GlassFish,O=Oracle Corporation,L=Santa Clara,ST=California,C=US]
Distinguished Name of the self-signed X.509 Server Certificate is:
[CN={my_computer_name}-instance,OU=GlassFish,O=Oracle Corporation,L=Santa Clara,ST=California,C=US]
Domain production created.
Domain production admin port is 7070.
Domain production admin user is "admin".
Command create-domain executed successfully.

Add the GlassFish domains to NetBeans’ Services -> Server tab, and start them.
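
The domains can also be started and checked from the command line with asadmin, which is handy on a machine without NetBeans installed. A sketch, using the same domain directory as above:

asadmin start-domain --domaindir "%GLASSFISH_HOME%\domains" development
asadmin start-domain --domaindir "%GLASSFISH_HOME%\domains" testing
asadmin start-domain --domaindir "%GLASSFISH_HOME%\domains" production
asadmin list-domains --domaindir "%GLASSFISH_HOME%\domains"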

Create New GlassFish 4.0 Production Domain - Screen 1

Create New GlassFish 4.0 Production Domain – Screen 1

Create New GlassFish 4.0 Production Domain - Screen 2

Create New GlassFish 4.0 Production Domain – Screen 2

Create New GlassFish 4.0 Production Domain - Screen 3

Create New GlassFish 4.0 Production Domain – Screen 3

Create New GlassFish 4.0 Production Domain - Screen 4

Create New GlassFish 4.0 Production Domain – Screen 4

Setting Up the Project

To set up our NetBeans project, you can clone the repository on GitHub or build your own project from scratch and copy the files into the project. I will not spend a lot of time explaining the code since we have used it in earlier posts. This post is about the deployment pipeline system, not the project’s code.

If you choose to create a new project, first, create a new Maven ‘Project from Archetype’. Select the Archetype for a ‘web application using Java EE 7’ (webapp-javaee7).

New Maven Project - Screen 1

New Maven Project – Screen 1

New Maven Project - Screen 2

New Maven Project – Screen 2

I recommend you create the project inside of your local Git repository folder.

New Maven Project - Screen 3

New Maven Project – Screen 3

Maven will execute a series of commands to create the default NetBeans project with dependencies.

Git

As a part of a development team using Git, you place your project on a remote Git Server. You and your team members each clone the repository on the Git Server to your local development environments. You and your team commit your code changes locally, then pull, merge, and push your changes back to the Git Server. Jenkins will pull the project’s source code from the remote Git Server.

In part 2, we will properly set-up our project on the Git Server, exporting our existing repository into a new, bare repository on the Git Server. However, for brevity in part 1 of this post, we will just create a local Git repository. To start, create a new Git repository for the project. In NetBeans, select Team -> Git -> Initialize Repository… Choose the new Maven project folder.

Initialize New Git Repository

The initial view of the Maven project should look like the below screen grabs. Note the icons and the green files show that the project is part of the Git repository.

Initial Projects Tab View of New Maven Project

Initial Files Tab View of New Maven Project

Perform an initial commit of the project to Git to make sure everything is working.

Initial Commit of New Maven Project to Git

Next, copy the supplied HelloWorldResource.java and NameStorageBean.java classes into the project. The package classpath will be refactored by NetBeans. Copy all the remaining files and folders, including the (3) files in the WEB-INF folder, properties folder with (3) properties files, and passwords folder with (3) password files.

JUnit

Next, right-click on the NameStorageBean.java class and select Tools -> Create Tests. Replace the contents of the new NameStorageBeanTest.java file’s NameStorageBeanTest class with the contents of the supplied NameStorageBeanTest.java file. These are two very simple unit tests that will show how JUnit provides automated testing capabilities.

Create JUnit Tests - Screen 1

Create JUnit Tests – Screen 1

Create JUnit Tests - Screen 2

Create JUnit Tests – Screen 2

Project Object Model (POM)

Copy the contents of the supplied pom file into the new pom file. There is a lot of configuration in the supplied pom. It will be easier to copy the supplied pom file’s contents into your project than to try to configure it from scratch.

Basically, beyond the normal boilerplate pom configuration, we have defined (3) properties, (3) dependencies, and (5) build plugins. The three dependencies are junit, jersey-servlet, and javaee-web-api. The five plugins are maven-compiler-plugin, maven-war-plugin, maven-dependency-plugin, properties-maven-plugin, and the maven-glassfish-plugin. Each plugin contains individual plug-in specific configuration. The name of each plugin should be sufficient to explain its primary purpose.

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.blogpost</groupId>
<artifactId>HelloGlassFishMaven</artifactId>
<version>1.0-SNAPSHOT</version>
<packaging>war</packaging>
<name>HelloGlassFishMaven</name>
<properties>
<!-- Input Parameter - GlassFish properties file -->
<glassfish.properties.file.argument></glassfish.properties.file.argument>
<endorsed.dir>${project.build.directory}/endorsed</endorsed.dir>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>
<dependencies>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.11</version>
</dependency>
<dependency>
<groupId>com.sun.jersey</groupId>
<artifactId>jersey-servlet</artifactId>
<version>1.13</version>
</dependency>
<dependency>
<groupId>javax</groupId>
<artifactId>javaee-web-api</artifactId>
<version>7.0</version>
<scope>provided</scope>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.1</version>
<configuration>
<source>1.7</source>
<target>1.7</target>
<compilerArguments>
<endorseddirs>${endorsed.dir}</endorseddirs>
</compilerArguments>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-war-plugin</artifactId>
<version>2.3</version>
<configuration>
<failOnMissingWebXml>false</failOnMissingWebXml>
<filteringDeploymentDescriptors>true</filteringDeploymentDescriptors>
<webResources>
<resource>
<directory>${basedir}/src/main/webapp/WEB-INF</directory>
<filtering>true</filtering>
<targetPath>WEB-INF</targetPath>
<includes>
<include>**/glassfish-web.xml</include>
</includes>
</resource>
</webResources>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-dependency-plugin</artifactId>
<version>2.6</version>
<executions>
<execution>
<phase>validate</phase>
<goals>
<goal>copy</goal>
</goals>
<configuration>
<outputDirectory>${endorsed.dir}</outputDirectory>
<silent>true</silent>
<artifactItems>
<artifactItem>
<groupId>javax</groupId>
<artifactId>javaee-endorsed-api</artifactId>
<version>7.0</version>
<type>jar</type>
</artifactItem>
</artifactItems>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>properties-maven-plugin</artifactId>
<version>1.0-alpha-2</version>
<configuration>
<files>
<file>${basedir}/properties/${glassfish.properties.file.argument}.properties</file>
</files>
</configuration>
</plugin>
<plugin>
<groupId>org.glassfish.maven.plugin</groupId>
<artifactId>maven-glassfish-plugin</artifactId>
<version>2.1</version>
<configuration>
<glassfishDirectory>${GLASSFISH_HOME}</glassfishDirectory>
<user>${glassfish.user}</user>
<passwordFile>${basedir}/passwords/${glassfish.pwdfile}</passwordFile>
<echo>true</echo>
<debug>true</debug>
<terse>true</terse>
<domain>
<name>${glassfish.domain}</name>
<host>${glassfish.host}</host>
<adminPort>${glassfish.adminport}</adminPort>
</domain>
<components>
<component>
<name>${project.artifactId}</name>
<artifact>${project.build.directory}/${project.build.finalName}.war</artifact>
</component>
</components>
</configuration>
</plugin>
</plugins>
</build>
</project>

When complete, right-click on the project and do a ‘Build with Dependencies…’. Make sure everything builds. The final view of the project, with all its Maven-managed dependencies should look like the two screen grabs shown below. Make sure to commit all your new code to Git.

Final Projects Tab View of Project

Final Files Tab View of Project

Maven and Properties Files

In part 2, we will be deploying our project to multiple GlassFish domains. Each domain’s configuration is different. We will use Java properties files to store each GlassFish domain’s configuration properties. The ability to use Java properties files with Maven is possible using the Mojo Project’s Properties Maven Plugin. I introduced this plugin in an earlier post, Build a Continuous Deployment System with Maven, Hudson, WebLogic Server, and JUnit.

Each environment (Development, Testing, Production), represented by a GlassFish domain, has a separate properties file in the project (see the Files Tab view above). The properties files contain configuration values the Maven GlassFish Plugin will need to deploy the project’s WAR file to each GlassFish domain. Since the build and deployment configurations are required by the project, including them in our Git repository and automating their use based on the environment are two best practices.

# contents of all three files shown here
# development domain properties file
glassfish.domain=development
glassfish.host=glassfish4-app-server
glassfish.adminport=5050
glassfish.user=admin
glassfish.pwdfile=pwdfile_development
# testing domain properties file
glassfish.domain=testing
glassfish.host=glassfish4-app-server
glassfish.adminport=6060
glassfish.user=admin
glassfish.pwdfile=pwdfile_testing
# production domain properties file
glassfish.domain=production
glassfish.host=glassfish4-app-server
glassfish.adminport=7070
glassfish.user=admin
glassfish.pwdfile=pwdfile_production

In our project’s particular workflow, Maven accepts a single argument (‘glassfish.properties.file.argument’), which represents the environment we want to deploy to, such as ‘development’. The property value tells Maven which properties file to read, such as ‘development.properties’. Maven replaces the keys in the pom file with the values from the ‘development.properties’ file.

The properties file also tells Maven the full path to the separate password file, containing the admin user password, such as ‘pwdfile_development’. In an actual production environment, we would store encrypted password files on a secured file path. For simplicity in our example, we have included them unencrypted, within the project’s main directory.
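
Each password file holds a single entry in the standard asadmin password-file format, which I am assuming the Maven GlassFish Plugin passes through to asadmin. A sketch of the ‘pwdfile_testing’ file’s contents, with a placeholder password:

AS_ADMIN_PASSWORD={your-testing-domain-admin-password}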

System Diagram 3b

There are other Maven capabilities that also would achieve our deployment goals. For example, you might consider the Maven Release Plugin, as well as look at using Maven Build Profiles.

Testing the Pipeline

Although we have not built the second half of our deployment pipeline yet, we can still test the system at this early stage. All the necessary foundational elements are in place. To test our system, right-click on the Maven Project icon in the Projects tab and select Custom -> Goals… Enter the following Maven Goals: ‘properties:read-project-properties clean install glassfish:redeploy -e’. In the Properties text box, enter the following: ‘glassfish.properties.file.argument=testing’ (see screen grab below). This will execute a number of Maven Goals and associated commands, visible in the Output tab.

With this one simple command, we are asking Maven to 1) read in our Java properties file and password file, 2) clean the project, 3) pull down all our project’s dependencies, 4) compile the project’s code, 5) execute the unit tests with JUnit, 6) assemble the WAR file, and 7) deploy it to the ‘testing’ GlassFish domain using asadmin. The terse nature of the command really demonstrates the power of Maven to manage our project and the deployment pipeline!
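
The same test can be run outside of NetBeans, from a command prompt in the project’s root directory. A sketch of the equivalent invocation, targeting the ‘testing’ domain:

mvn properties:read-project-properties clean install glassfish:redeploy -e -Dglassfish.properties.file.argument=testing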

Run Maven within NetBeans to Test Pipeline

If successful, you should see a message in the Output tab indicating as much. Reviewing the contents of the Output tab will give you complete insight into the Maven process under the NetBeans hood. We used the ‘-e’ argument with Maven and the ‘Show Debug Output’ option to provide further information about the process. The output contains all calls to Maven and subsequently to asadmin (GlassFish). You can learn a lot about using Maven and asadmin (GlassFish) by studying the Debug Output.

Conclusion

In the first part of this post, we learned how to create a simple Java EE web application project in NetBeans, using Maven. We learned how to integrate JUnit for automated testing, and how to use Git to manage our source code.

In the second half of this post, we will learn how to configure Jenkins CI Server to retrieve our project from the remote Git repository, build, test, and assemble it into a WAR file. If these steps are successful, Jenkins will deploy our project to a GlassFish domain or multiple domains, based on the project’s stage in the software development life cycle. We will demonstrate how to automate Jenkins to achieve true continuous integration and continuous deployment.


WebLogic Server: Installation to Deployment in 30 Minutes

Install WebLogic Server 12c on Windows, create a new domain, and deploy a sample application, all in 30 minutes or less! A quick overview of the entire process, installation to deployment.


Installing Oracle WebLogic Server 12c on Windows

Oracle has made the installation and setup of WebLogic Server 12c on Windows, remarkably easy for us as developers. In less than a half-hour, you can install WebLogic Server, create a new WLS domain, configure the new WebLogic Server domain in NetBeans, and deploy your first application. In this post, we will run through the most basic example of the install and configuration process.

In actual production environments, even on your development machine, you will have added considerations when deploying high-performance enterprise applications and services to WebLogic Server. Considerations such as security, persistence, web service configuration, performance, monitoring, and messaging will not be covered. Nonetheless, this post should show just how easy it is to get started with WebLogic Server.

In this brief post, we will cover:

  1. Installing WebLogic Server 12c on Windows
  2. Creating a New WebLogic Server Domain
  3. Accessing the New WebLogic Server Domain
  4. Creating a Sample Project to Deploy to Domain
  5. Deploying the Sample Project

Installing WebLogic Server 12c

Download the latest version of WebLogic Server (WLS) from Oracle’s website. Once the rather large download is complete, double-click on the executable file to start the install process. I used all the default settings during the installation, illustrated below.


Oracle recommends you create a %MW_HOME% environmental variable, whose value is the Middleware Home Directory, shown in the screen-grab, below:

C:\Oracle\Middleware

Note the WebLogic Server Product Installation Directory in the screen-grab, below. You will need it for other configuration later in the post. You might also consider creating a %WL_HOME% environmental variable to store this value. It saves time when executing WLS commands in the terminal:

%MW_HOME%\wlserver_12.1


If you exit the Quick Start utility, and wish to return to it later, you can run the following command script:

%WL_HOME%\common\quickstart\quickstart.cmd

Creating a New WebLogic Server Domain

With WLS installed, the first thing you will want to do is create a new WebLogic Server domain to host your applications. The easiest way to create a domain is with the Fusion Middleware Configuration Wizard. To start the Wizard, run the following command script (there is also an executable in the same directory):

%WL_HOME%\common\bin\config.cmd

The following screen-grabs show the creation of a basic domain using the Wizard, without any added features or functionality, such as web services, messaging, or persistence.


Make sure to note the username, password, and port you choose during the set-up. You will need them later to start the domain and to deploy to it.


Accessing the New WebLogic Server Domain

To start the new domain, run the following command script:

%MW_HOME%\user_projects\domains\dev_domain\startWebLogic.cmd


Once started, to reach the domain’s Administration Console, open a browser and enter the following URL (adjust it for the port you chose earlier):

http://localhost:7021/console/login/LoginForm.jsp

Log into WLS using the username and password you chose during the domain creation in the previous steps.


Configuring WebLogic Server in NetBeans

Open NetBeans and switch to the ‘Services’ Tab, right-click on ‘Server’, and select ‘Add Server…’. Enter the values used during the installation and domain creation steps, above.


The ‘Server Location’ will be the same value as our %WL_HOME% variable:

C:\Oracle\Middleware\wlserver_12.1


The ‘Domain’ path will be the root of our new domain, located with all the other domains, in the ‘domains’ directory:

C:\Oracle\Middleware\user_projects\domains\dev_domain


Creating a Sample Project to Deploy to Domain

To test your new WLS domain, create a quick Java EE Hello World RESTful web service NetBeans sample project. We have used this sample many times before in previous posts to demonstrate various server deployments.


Right-click on the project, and select ‘Properties’, and then select the ‘Build’ -> ‘Run’ menu item. Change the Server to ‘Oracle WebLogic Server’. Leave all the other options with their default values.


Deploying the Sample Project

Right-click on the Apache Ant ‘build.xml’ file on the ‘Files’ tab. Select ‘Run Target’ -> ‘Other Targets’ -> ‘run-deploy’. Apache Ant will execute the run-deploy target, which runs a series of dependent targets. They will compile the project, build the .war file, and deploy the .war to the newly created WLS domain. Since we did not configure any additional servers within the domain, the project will be deployed to the existing ‘AdminServer’ server.


Return to the Administration Console, and switch to the ‘Deployments’ window to view the newly deployed project.


To view the deployed project, open a second web browser window and enter the following URL:

http://localhost:7021/HelloWorldWeblogicLocal/resources/helloWorld


Conclusion

There you have it, a whirlwind WebLogic Server installation to deployment demonstration. So, how long did it take you?


Build a Continuous Deployment System with Maven, Hudson, WebLogic Server, and JUnit

Build an automated testing, continuous integration, and continuous deployment system, using Maven, Hudson, WebLogic Server, JUnit, and NetBeans. Developed with Oracle’s Pre-Built Enterprise Java Development VM. Download the complete source code from Dropbox and on GitHub.

System Diagram

Introduction

In this post, we will build a basic automated testing, continuous integration, and continuous deployment system, using Oracle’s Pre-Built Enterprise Java Development VM. The primary goal of the system is to automatically compile, test, and deploy a simple Java EE web application to a test environment. As this post will demonstrate, the key to a successful system is not a single application, but the effective integration of all the system’s applications into a well-coordinated and consistent workflow.

Building a system such as this can be complex and time-consuming. However, Oracle’s Pre-Built Enterprise Java Development VM already has all the components we need. The Oracle VM includes NetBeans IDE for development, Apache Subversion for version control, Hudson Continuous Integration (CI) Server for build automation, JUnit and Hudson for unit test automation, and WebLogic Server for application hosting.

In addition, we will use Apache Maven, also included on the Oracle VM, to help manage our project’s dependencies, as well as the build and deployment process. Overlapping with some of Apache Ant’s build task functionality, Maven is a powerful cross-cutting tool for managing modern software development projects. This post will only draw upon a small part of Maven’s functionality.

Demonstration Requirements

To save some time, we will use the same WebLogic Server (WLS) domain we built in the last post, Deploying Applications to WebLogic Server on Oracle’s Pre-Built Development VM. We will also use code from the sample Hello World Java EE web project from that post. If you haven’t already done so, work through the last post’s example, first.

Here is a quick list of requirements for this demonstration:

  • Oracle VM
    • Oracle’s Pre-Built Enterprise Java Development VM running on current version of Oracle VM VirtualBox (mine: 4.2.12)
    • Oracle VM’s has the latest system updates installed (see earlier post for directions)
    • WLS domain from last post created and running in Oracle VM
    • Credentials supplied with Oracle VM for Hudson (username and password)
  • Window’s Development Machine
    • Current version of Apache Maven installed and configured (mine: 3.0.5)
    • Current version of NetBeans IDE installed and configured (mine: 7.3)
    • Optional: Current version of WebLogic Server installed and configured
    • All environmental variables properly configured for Maven, Java, WLS, etc. (MW_HOME, M2, etc.)

The Process

The steps involved in this post’s demonstration are as follows:

  1. Install the WebLogic Maven Plugin into the Oracle VM’s Maven Repositories, as well as the Development machine
  2. Create a new Maven Web Application Project in NetBeans
  3. Copy the classes from the Hello World project in the last post to the new project
  4. Create a properties file to store Maven configuration values for the project
  5. Add the Maven Properties Plugin to the Project’s POM file
  6. Add the WebLogic Maven Plugin to project’s POM file
  7. Add JUnit tests and JUnit dependencies to project
  8. Add a WebLogic Descriptor to the project
  9. Enable Tunneling on the new WLS domain from the last post
  10. Build, test, and deploy the project locally in NetBeans
  11. Add project to Subversion
  12. Optional: Upgrade existing Hudson 2.2.0 and plugins on the Oracle VM to the latest 3.x version
  13. Create and configure new Hudson CI job for the project
  14. Build the Hudson job to compile, test, and deploy project to WLS

WebLogic Maven Plugin

First, we need to install the WebLogic Maven Plugin (‘weblogic-maven-plugin’) onto both the Development machine’s local Maven Repository and the Oracle VM’s Maven Repository. Installing the plugin will allow us to deploy our sample application from NetBeans and Hudson, using Maven. The weblogic-maven-plugin, a JAR file, is not part of the Maven repository by default. According to Oracle, ‘WebLogic Server provides support for Maven through the provisioning of plug-ins that enable you to perform various operations on WebLogic Server from within a Maven environment. As of this release, there are two separate plug-ins available.’ In this post, we will use the weblogic-maven-plugin, as opposed to the wls-maven-plugin. Again, according to Oracle, the weblogic-maven-plugin “delivered in WebLogic Server 11g Release 1, provides support for deployment operations.”

The best way to understand the plugin install process is by reading the Using the WebLogic Development Maven Plug-In section of the Oracle Fusion Middleware documentation on Developing Applications for Oracle WebLogic Server. It goes into detail on how to install and configure the plugin.

In a nutshell, below is a list of the commands I executed to install the weblogic-maven-plugin version 12.1.1.0 on both my Windows development machine and on my Oracle VM. If you do not have WebLogic Server installed on your development machine, and therefore no access to the plugin, install it into the Maven Repository on the Oracle VM first, then copy the jar file to the development machine and follow the normal install process from that point forward.

On Windows Development Machine:

Installing weblogic-maven-plugin onto Dev Maven Repository

Installing weblogic-maven-plugin on a Windows Machine

cd %MW_HOME%/wlserver/server/lib
java -jar wljarbuilder.jar -profile weblogic-maven-plugin
mkdir c:\tmp
copy weblogic-maven-plugin.jar c:\tmp
cd c:\tmp
jar xvf c:\tmp\weblogic-maven-plugin.jar META-INF/maven/com.oracle.weblogic/weblogic-maven-plugin/pom.xml
mvn install:install-file -DpomFile=META-INF/maven/com.oracle.weblogic/weblogic-maven-plugin/pom.xml -Dfile=c:\tmp\weblogic-maven-plugin.jar

On the Oracle VM:

Installing WebLogic Maven Plugin into Oracle VM Maven Repository

Installing WebLogic Maven Plugin into the Oracle VM

cd $MW_HOME/wlserver_12.1/server/lib
java -jar wljarbuilder.jar -profile weblogic-maven-plugin
mkdir /home/oracle/tmp
cp weblogic-maven-plugin.jar /home/oracle/tmp
cd /home/oracle/tmp
jar xvf weblogic-maven-plugin.jar META-INF/maven/com.oracle.weblogic/weblogic-maven-plugin/pom.xml
mvn install:install-file -DpomFile=META-INF/maven/com.oracle.weblogic/weblogic-maven-plugin/pom.xml -Dfile=weblogic-maven-plugin.jar

To test the success of your plugin installation, you can run the following maven command on Windows or Linux:

mvn help:describe -Dplugin=com.oracle.weblogic:weblogic-maven-plugin

Sample Maven Web Application

Using NetBeans on your development machine, create a new Maven Web Application. For those of you familiar with Maven, the NetBeans Maven Web Application project is based on the ‘webapp-javaee6:1.5’ Archetype. NetBeans creates the project by executing an ‘archetype:generate’ Maven goal. This is seen in the ‘Output’ tab after the project is created.

01a - Choose the Maven Web Application Project Type

1a – Choose the Maven Web Application Project Type

01b - Name and Location of New Project

1b – Name and Location of New Project

By default, you may have Tomcat and GlassFish as installed options on your system. Unfortunately, as I understand it, NetBeans currently does not have the ability to configure a remote connection to the WLS instance running on the Oracle VM. You do not need an instance of WLS installed on your development machine since we are going to use the copy on the Oracle VM. We will use Maven to deploy the project to WLS on the Oracle VM, later in the post.

01c - Default Server and Java Settings

1c – Default Server and Java Settings

1d - New Maven Project in NetBeans

1d – New Maven Project in NetBeans

Next, copy the two Java class files from the previous blog post’s Hello World project to the new project’s source package. Alternately, download a zipped copy of this post’s complete sample code from Dropbox or on GitHub.

02a - Copy Two Class Files from Previous Project

2a – Copy Two Class Files from Previous Project

Because we are copying a RESTful web service to our new project, NetBeans will prompt us for some REST resource configuration options. To keep this new example simple, choose the first option and uncheck the Jersey option.

02b - REST Resource Configuration

2b – REST Resource Configuration

02c - New Project with Files Copied from Previous Project

2c – New Project with Files Copied from Previous Project

JUnit Tests

Next, create a set of JUnit tests for each class by right-clicking on both classes and selecting ‘Tools’ -> ‘Create Tests’.

03a - Create JUnit Tests for Both Class Files

3a – Create JUnit Tests for Both Class Files

03b - Choose JUnit Version 4.x

3b – Choose JUnit Version 4.x

03c - New Project with Test Classes and JUnit Test Dependencies

3c – New Project with Test Classes and JUnit Test Dependencies

We will use the test classes and dependencies NetBeans just added to the project. However, we will not use the actual JUnit tests themselves that NetBeans created. Properly setting up the default JUnit tests to work with an embedded version of WLS is well beyond the scope of this post.

Overwrite the contents of the test class files with the code provided from Dropbox. I have replaced the default JUnit tests with simpler versions for this demonstration. Build the project to make sure all the JUnit tests pass.

03d - Project Successfully Built with New JUnit Tests

3d – Project Successfully Built with New JUnit Tests

Project Properties

Next, add a new Properties file to the project, entitled ‘maven.properties’.

04a - Add Properties File to Project

4a – Add Properties File to Project

04b - Add Properties File to Project

4b – Add Properties File to Project

Add the following key/value pairs to the properties file. These key/value pairs will be referenced in the POM.xml by the weblogic-maven-plugin, added in the next step. Placing the configuration values into a Properties file is not necessary for this post. However, if you wish to deploy to multiple environments, moving environment-specific configurations into separate properties files, using Maven Build Profiles, and/or using frameworks such as Spring, are all best practices (see the profile sketch after the properties file below).

Java Properties File (maven.properties):

# weblogic-maven-plugin configuration values for Oracle VM environment
wls.adminurl=t3://192.168.1.88:7031
wls.user=weblogic
wls.password=welcome1
wls.upload=true
wls.remote=false
wls.verbose=true
wls.middlewareHome=/labs/wls1211
wls.name=HelloWorldMaven
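
If you later need one properties file per environment, a pair of Maven build profiles can select the correct file at build time. The sketch below is illustrative only; the profile IDs and file names (‘maven_dev.properties’, ‘maven_prod.properties’) are hypothetical and are not part of this post’s sample project.

<!-- Hypothetical build profiles: choose an environment-specific properties file -->
<profiles>
    <profile>
        <id>dev</id>
        <activation>
            <activeByDefault>true</activeByDefault>
        </activation>
        <properties>
            <env.properties.file>maven_dev.properties</env.properties.file>
        </properties>
    </profile>
    <profile>
        <id>prod</id>
        <properties>
            <env.properties.file>maven_prod.properties</env.properties.file>
        </properties>
    </profile>
</profiles>

The properties-maven-plugin’s <file> element would then reference ${env.properties.file}, and you would pick an environment with, for example, ‘mvn install -P prod’.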

Maven Plugins and the POM File

Next, add the WLS Maven Plugin (‘weblogic-maven-plugin’) and the Maven Properties Plugin (‘properties-maven-plugin’) to the end of the project’s Maven POM.xml file. The Maven Properties Plugin, part of the Mojo Project, allows us to substitute configuration values in the Maven POM file from a properties file. According to codehaus.org, which hosts the Mojo Project, ‘It’s main use-case is loading properties from files instead of declaring them in pom.xml, something that comes in handy when dealing with different environments.’

Project Object Model File (pom.xml):

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.blogpost</groupId>
<artifactId>HelloWorldMaven</artifactId>
<version>1.0</version>
<packaging>war</packaging>
<name>HelloWorldMaven</name>
<properties>
<endorsed.dir>${project.build.directory}/endorsed</endorsed.dir>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>
<dependencies>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.10</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>javax</groupId>
<artifactId>javaee-web-api</artifactId>
<version>6.0</version>
<scope>provided</scope>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.1</version>
<configuration>
<source>1.6</source>
<target>1.6</target>
<compilerArguments>
<endorseddirs>${endorsed.dir}</endorseddirs>
</compilerArguments>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-war-plugin</artifactId>
<version>2.4</version>
<configuration>
<failOnMissingWebXml>false</failOnMissingWebXml>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-dependency-plugin</artifactId>
<version>2.8</version>
<executions>
<execution>
<phase>validate</phase>
<goals>
<goal>copy</goal>
</goals>
<configuration>
<outputDirectory>${endorsed.dir}</outputDirectory>
<silent>true</silent>
<artifactItems>
<artifactItem>
<groupId>javax</groupId>
<artifactId>javaee-endorsed-api</artifactId>
<version>6.0</version>
<type>jar</type>
</artifactItem>
</artifactItems>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>properties-maven-plugin</artifactId>
<version>1.0-alpha-2</version>
<executions>
<execution>
<phase>initialize</phase>
<goals>
<goal>read-project-properties</goal>
</goals>
<configuration>
<files>
<file>maven_wls_local.properties</file>
</files>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>com.oracle.weblogic</groupId>
<artifactId>weblogic-maven-plugin</artifactId>
<version>12.1.1.0</version>
<configuration>
<adminurl>${wls.adminurl}</adminurl>
<user>${wls.user}</user>
<password>${wls.password}</password>
<upload>${wls.upload}</upload>
<action>deploy</action>
<remote>${wls.remote}</remote>
<verbose>${wls.verbose}</verbose>
<source>${project.build.directory}/${project.build.finalName}.${project.packaging}</source>
<name>${wls.name}</name>
</configuration>
<executions>
<execution>
<phase>install</phase>
<goals>
<goal>deploy</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
</project>

WebLogic Deployment Descriptor

A WebLogic Deployment Descriptor file is the last item we need to add to the new Maven Web Application project. NetBeans has descriptors for multiple servers, including Tomcat (context.xml), GlassFish (application.xml), and WebLogic (weblogic.xml). They provide a convenient location to store specific server properties, used during the deployment of the project.

06a - Add New WebLogic Descriptor

6a – Add New WebLogic Descriptor

06b - Add New WebLogic Descriptor

6b – Add New WebLogic Descriptor

Add the ‘context-root’ tag. The value will be the name of our project, ‘HelloWorldMaven’, as shown below. According to Oracle, “the context-root element defines the context root of this standalone Web application.” The context-root of the application will form part of the URL we enter to display our application, later.

06c - Add Context Root Element to Descriptor

6c – Add Context Root Element to Descriptor

Make sure the WebLogic descriptor file (‘weblogic.xml’) is placed in the WEB-INF folder. If not, the descriptor’s properties will not be read. If the descriptor is not read, the context-root of the deployed application will default to the name of the project’s WAR file. Instead of ‘HelloWorldMaven’ as the context-root, you would see ‘HelloWorldMaven-1.0-SNAPSHOT’.

06d - Move WebLogic Descriptor into WEB-INF Folder

6d – Move WebLogic Descriptor into WEB-INF Folder
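
For reference, a minimal weblogic.xml containing only the context-root element might look like the sketch below. The namespace shown is an assumption based on WLS 12c; use whatever NetBeans generates for your server version.

<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal sketch of WEB-INF/weblogic.xml; namespace assumed for WLS 12c -->
<weblogic-web-app xmlns="http://xmlns.oracle.com/weblogic/weblogic-web-app">
    <context-root>HelloWorldMaven</context-root>
</weblogic-web-app>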

Enable Tunneling

Before we compile, test, and deploy our project, we need to make a small change to WLS. In order to deploy our project remotely to the Oracle VM’s WLS, using the WebLogic Maven Plugin, we must enable tunneling on our WLS domain. According to Oracle, the ‘Enable Tunneling’ option “Specifies whether tunneling for the T3, T3S, HTTP, HTTPS, IIOP, and IIOPS protocols should be enabled for this server.” To enable tunneling, from the WLS Administration Console, select the ‘AdminServer’ Server, ‘Protocols’ tab, ‘General’ sub-tab.

Enabling Tunneling on WLS for HTTP Deployments

Enabling Tunneling on WLS for HTTP Deployments
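
If you prefer to script the change rather than click through the Administration Console, a WLST session along these lines should produce the same result. This is a sketch only; it assumes the ‘AdminServer’ server name and reuses the admin URL and credentials from the maven.properties file above.

connect('weblogic', 'welcome1', 't3://192.168.1.88:7031')
edit()
startEdit()
cd('/Servers/AdminServer')
cmo.setTunnelingEnabled(true)
save()
activate()
disconnect()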

Build and Test the Project

Right-click and select ‘Build’, ‘Clean and Build’, or ‘Build with Dependencies’. NetBeans executes a ‘mvn install’ command. This command initiates a series of Maven Goals. The goals, visible in NetBeans’ Output window, include ‘dependency:copy’, ‘properties:read-project-properties’, ‘compiler:compile’, ‘surefire:test’, and so forth. They move the project’s code through the Maven Build Lifecycle. Most goals are self-explanatory from their titles.

The last Maven Goal to execute, if the other goals have succeeded, is the ‘weblogic:deploy’ goal. This goal deploys the project to the Oracle VM’s WLS domain we configured in our project. Recall that in the POM file, we bound the weblogic-maven-plugin’s ‘deploy’ goal to the ‘install’ phase (what Maven calls an execution phase). If all goals complete without error, you have just compiled, tested, and deployed your first Maven web application to a remote WLS domain. Later, we will have Hudson do it for us, automatically.
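
As an aside, you are not limited to triggering the deployment through the ‘install’ phase. Once the WAR has been packaged, you should also be able to invoke the plugin’s goal directly by its fully-qualified name, something like:

mvn package com.oracle.weblogic:weblogic-maven-plugin:12.1.1.0:deploy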

06e - Successful Build of Project

6e – Successful Build of Project

Executing Maven Goals in NetBeans

As a small aside, if you wish to run alternate Maven goals in NetBeans, right-click on the project and select ‘Custom’ -> ‘Goals…’. Alternatively, click on the light green arrows (‘Re-run with different parameters’), adjacent to the ‘Output’ tab.

For example, in the ‘Run Maven’ pop-up, replace ‘install’ with ‘surefire:test’ or simply ‘test’. This will compile the project and run the JUnit tests. There are many Maven goals that can be run this way. Use the Control key and Space Bar key combination in the Maven Goals text box to display a pop-up list of available goals.

07a - Executing the Maven test Goal

7a – Executing Other Maven Goals

07b - JUnit Test Results using Maven test Goal

7b – JUnit Test Results using Maven ‘test’ Goal

Subversion

Now that our project is complete and tested, we will commit the project to Subversion (SVN). We will commit a copy of our source code to SVN, installed on the Oracle VM, for safe-keeping. Having our source code in SVN also allows Hudson to retrieve a copy. Hudson will then compile, test, and deploy the project to WLS.

The Repository URL, User, and Password are all supplied in the Oracle VM information, along with the other URLs and credentials.

08a - Add Project to Subversion Repository

8a – Add Project to Subversion Repository

08b - Subversion Repository Folder for Project

8b – Subversion Repository Folder for Project

When you import your project for the first time, you will see more files than are displayed below. I had already imported part of the project earlier while creating this post; therefore, most of my files were already managed by Subversion.

08c - List of Files Imported into Subversion

8c – List of Files Imported into Subversion (you will see more)

08d - Project Successfully Imported into Subversion

8d – Project Successfully Imported into Subversion

Upgrading Hudson CI Server

The Oracle VM comes with Hudson pre-installed in its own WLS domain, ‘hudson-ci_dev’, running on port 5001. Start the domain from within the VM by double-clicking the ‘WLS 12c – Hudson CI 5001’ icon on the desktop, or by executing the domain’s WLS start-up script from a terminal window:

/labs/wls1211/user_projects/domains/hudson-ci_dev/startWebLogic.sh

Once started, the WLS Administration Console 12c is accessible at the following URL. Use your VM’s IP address, or ‘localhost’ if you are working within the VM.

http://[your_vm_ip_address]:5001/console/login/LoginForm.jsp

The Oracle VM comes loaded with Hudson version 2.2.0. I strongly suggest updating Hudson to the latest version (3.0.1 at the time of this post). To upgrade, download, deploy, and start a new 3.0.1 version in the same domain on the same ‘AdminServer’ Server. I was able to do this remotely, from my development machine, using the browser-based Hudson Dashboard and WLS Administration Console. There is no need to do any of the installation from within the VM itself.

When the upgrade is complete, stop the 2.2.0 deployment currently running in the WLS domain.

Hudson 3.0.1 Deployed to WLS Domain on VM

Hudson 3.0.1 Deployed to WLS Domain on VM

The new version of Hudson is accessible from the following URL (adjust the URL to match your exact version of Hudson):

http://[your_vm_ip_address]:5001/hudson-3.0.1/

It’s also important to update all the Hudson plugins. Hudson makes this easy with the Hudson Plugin Manager, accessible via the ‘Manage Hudson’ option.

View of Hudson 3.0.1 Running on WLS with All Plugins Updated

View of Hudson 3.0.1 Running on WLS with All Plugins Updated

Note that at the top of the Manage Hudson page there is a warning about the server’s container not using UTF-8 to decode URLs. You can follow this post if you want to resolve the issue by configuring Hudson differently. I did not worry about it for this post.

Building a Hudson Job

We are ready to configure Hudson to build, test, and deploy our Maven Web Application project. Return to the ‘Hudson Dashboard’, select ‘New Job’, and then ‘Build a new free-style software job’. This will open the ‘Job Configurations’ for the new job.

01 - Creating New Hudson Free-Style Software Job

1 – Creating New Hudson Free-Style Software Job

02 - Default Job Configurations

2 – Default Job Configurations

Start by configuring the ‘Source Code Management’ section. The Subversion Repository URL is the same as the one you used in NetBeans to commit the code. To avoid the access error seen below, you must provide the Subversion credentials to Hudson, just as you did in NetBeans.

03 - Subversion SCM Configuration

3 – Subversion SCM Configuration

04 - Subversion SCM Authentication

4 – Subversion SCM Authentication

05 - Subversion SCM Configuration Authenticated

5 – Subversion SCM Configuration Authenticated

Next, configure the Maven 3 Goals. I chose the ‘clean’ and ‘install’ goals. Using ‘clean’ ensures the project is fully recompiled each time by deleting the build directory’s previous output.

Optionally, you can configure Hudson to publish the JUnit test results as shown below. Be sure to save your configuration.

06 - Maven 3 and JUnit Configurations

6 – Maven 3 and JUnit Configurations

Start a build of the new Hudson Job, by clicking ‘Build Now’. If your Hudson job’s configurations are correct, and the new WLS domain is running, you should have a clean build. This means the project compiled without error, all tests passed, and the web application’s WAR file was deployed successfully to the new WLS domain within the Oracle VM.

07 - Job Built Successfully Using Configurations

7 – Job Built Successfully Using Configurations

08 - Test Results from Build

8 – Test Results from Build

WebLogic Server

To view the newly deployed Maven Web Application, log into the WebLogic Server Administration Console for the new domain. In my case, the new domain was running on port 7031, so the URL would be:

http://[your_vm_ip_address]:7031/console/login/LoginForm.jsp

You should see the deployment, in an ‘Active’ state, as shown below.

09a - HelloWorldMaven Deployed to WLS from Hudson Build

9a – Project Deployed to WLS from Hudson Build

09b - Project Context Root Set by WebLogic Descriptor File

9b – Project’s Context Root Set by WebLogic Descriptor File

09c - Projects Servlet Path Set by web.xml File

9c – Project’s Servlet Paths

To test the deployment, open a new browser tab and go to the URL of the Servlet. In this case the URL would be:

http://[your_vm_ip_address]:7031/HelloWorldMaven/resources/helloWorld

You should see the original phrase from the previous project displayed, ‘Hello WebLogic Server!’.

10 - HelloWorldMaven Web Application Running in Browser

10 – Project’s Web Application Running in Browser

To further test the system, make a simple change to the project in NetBeans. I changed the name variable’s default value from ‘WebLogic Server’ to ‘Hudson, Maven, and WLS’. Commit the change to SVN.

11 - Make a Code Change to Project and Commit to Subversion

11 – Make a Code Change to Project and Commit to Subversion

Return to Hudson and run a new build of the job.

12 - Rebuild Project with Changes in Hudson

12 – Rebuild Project with Changes in Hudson

After the build completes, refresh the sample Web Application’s browser window. You should see the new text string displayed. Your code change was just re-compiled, re-tested, and re-deployed by Hudson.

13 - HelloWorldMaven Deployed to WLS Showing Code Change

13 – Project Showing Code Change

True Continuous Deployment

Although Hudson is now doing a lot of the work for us, the system still is not fully automated. We are still manually building our Hudson Job, in order to deploy our application. If you want true continuous integration and deployment, you need to trust the system to automatically deploy the project, based on certain criteria.

SCM polling with Hudson is one way to demonstrate continuous deployment. In ‘Job Configurations’, turn on ‘Poll SCM’ and enter Unix cron-like value(s) in the ‘Schedule’ text box. In the example below, I have indicated a polling frequency of every hour (‘@hourly’). Every hour, Hudson will look for committed changes to the project in Subversion. If changes are found, Hudson retrieves the source code, compiles, and tests it. If the project compiles and passes all tests, it is deployed to WLS.

SCM Polling Interval

SCM Polling Interval
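
The ‘Schedule’ field accepts standard cron-style expressions in addition to convenience tokens like ‘@hourly’. A few example values:

0 6 * * 1-5    # 6:00 AM, Monday through Friday
@midnight      # once a day, shortly after midnight
@hourly        # once an hour
*/15 * * * *   # every 15 minutes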

There are less resource-intensive methods of reacting to changes than SCM polling. Push notifications from the repository are an alternative, preferable method.
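
For example, a Subversion post-commit hook can ping Hudson each time a revision is committed, instead of waiting for the next poll. The sketch below assumes the Hudson job is named ‘HelloWorldMaven’ and that the job allows remote build triggers; adjust the URL, port, and any required security token to match your environment.

#!/bin/sh
# hooks/post-commit - notify Hudson of new revisions (sketch only)
REPOS="$1"
REV="$2"
wget --quiet --output-document=/dev/null \
  "http://[your_vm_ip_address]:5001/hudson-3.0.1/job/HelloWorldMaven/build?delay=0sec"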

Additionally, you should configure messaging in Hudson to notify team members of new deployments and the changes they contain. You should also implement a good deployment versioning strategy, for tracking purposes. Knowing the version of deployed artifacts is critical for accurate change management and defect tracking.

Helpful Links

Maven Plug-In Goals

Maven Build Lifecycle

Configuring and Using the WebLogic Maven Plug-In for Deployment

Jenkins: Building a Software Project

Kohsuke Kawaguchi: Polling must die: triggering Jenkins builds from a git hook



Java RESTful Web Services Using MySQL Server, EclipseLink, and Jersey

Demonstrates the development of Java RESTful Web Services using MySQL Server, EclipseLink (JPA) and Jersey (JAX-RS). Built using NetBeans and hosted on GlassFish. Both the Java Library and RESTful Service NetBeans’ projects, demonstrated in this post, are now available on GitHub.
 
MySQL Diagram

Introduction

When implementing a Relational Database Management System (RDBMS), many enterprise software developers tend to favor Oracle 11g or Microsoft SQL Server relational databases, depending on their technology stack. However, there are several excellent alternative relational databases, including MySQL. In fact, MySQL is the world’s most popular open source database software, according to Oracle.

MySQL is available on over 20 platforms and operating systems including Linux, Unix, Mac and Windows, according to the MySQL website. Like Oracle and Microsoft’s flagship RDBMS, MySQL Server comes in at least four flavors, ranging from the free Community Edition, demonstrated here, to a full-featured, enterprise-level Cluster Carrier Grade Edition. Support for MySQL, like Oracle and Microsoft, extends beyond just technical support. MySQL provides JDBC, ODBC, and .NET drivers for Java and .NET development, as well as drivers for other languages. MySQL is supported by many popular IDEs, including MySQL’s own RDBMS IDE, MySQL Workbench. Lastly, like Oracle and Microsoft, MySQL provides extensive documentation, tutorials, and even sample databases, built using recommended architectural patterns.

In this post, we will use JDBC to map JPA entity classes to tables and views within a MySQL database. We will then build RESTful web services, EJB classes, which communicate with MySQL through the entities. We will separate the JPA entities into a Java Class Library. The class library will be referenced by the RESTful web services. The RESTful web services, part of a Java Web Application, will be deployed to GlassFish, where they are accessed with HTTP methods and tested.

Installation and Configuration

If you’ve worked with Microsoft SQL Server or particularly Oracle 11g, you’ll have a minimal learning curve with MySQL. Basic installation, configuration, and integration within your Java applications is similar to Oracle and Microsoft’s products. Start by downloading and installing the latest versions of MySQL Server, MySQL Workbench, the MySQL JDBC Connector/J Driver, and the MySQL Sakila sample database. If you are on Linux, you could use the command line, or a package management application like Synaptic Package Manager, to perform most of the installations. However, to get the latest software along with installation and configuration recommendations, I prefer to download and install everything myself from the MySQL web site. All links are included at the end of this post.

For reference when following this post, I have installed MySQL Server 5.5.x on 64-bit Ubuntu 12.10 LTS, running within a Windows version of Oracle VM VirtualBox. I will be using the latest Linux version of NetBeans IDE 7.3 to develop the demonstration project. I will host the project on Oracle’s GlassFish Open Source Application Server 3.1.2.2, running on Ubuntu. Lastly, I will be referring to the latest JDK 1.7, in NetBeans, for the project.

MySQL Demo User Account

Once MySQL is installed and running, I suggest adding a new MySQL demo user account to the Sakila database for this demonstration, using MySQL Workbench. For security, you should limit the user account to just those permissions necessary for this demonstration, as detailed in the following screen-grabs. You can also add the user from the command line, if you are familiar with administering MySQL in that way.
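
If you do prefer the command line, the equivalent setup might look something like the following. The user name matches the ‘demoUser’ account used later in this post; the password and the exact privilege list are assumptions, so grant only what your demonstration actually needs.

-- Hypothetical example; the password and privilege list are placeholders
CREATE USER 'demoUser'@'%' IDENTIFIED BY 'demoPassword';
GRANT SELECT, INSERT, UPDATE, DELETE ON sakila.* TO 'demoUser'@'%';
FLUSH PRIVILEGES;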

MySQL Workbench IDE

MySQL Workbench IDE

Configuring Demo User Login

Configuring Demo User Login

Configuring Demo User Administrative Roles

Configuring Demo User Administrative Roles

Configuring Demo User Account Limits

Configuring Demo User Account Limits

Configuring Demo User Schema Privileges

Configuring Demo User Schema Privileges

New MySQL Database Connection

To begin development in NetBeans, first create a new JDBC database connection to the MySQL Sakila database. In the Services tab, right-click on the Databases item and select New Connection… Use the new demo user account for the connection.

Note in the first screen-grab below that, instead of using the default NetBeans JDBC MySQL Connector/J driver version, I have downloaded and replaced it with the most current version, 5.1.24. This is not necessary, but I like to use the latest drivers to avoid problems.
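
For reference, a JDBC connection to a local MySQL instance and the Sakila database follows this general pattern; the host and port shown below are assumptions, so adjust them to your environment:

Driver Class: com.mysql.jdbc.Driver
JDBC URL:     jdbc:mysql://localhost:3306/sakila
User Name:    demoUser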

New Connection Wizard - MySQL Driver

Locating the Driver in the New Connection Wizard

Make sure to test your connection before finishing, using the ‘Test’ button. It’s frustrating to track down database connection issues once you start coding and testing.

New Connection Wizard - Customize Connection

Customize the Connection in the New Connection Wizard

New Connection Wizard - Database Schema

Sakila Database Doesn’t Contain Additional Schema

Choosing a Name for the Connection

Choosing a Name for the Connection

New Database Connection for demoUser

New Database Connection for MySQL Sakila Database

New Java Class Library

Similar to an earlier post, create a new Java Class Library project in NetBeans. Select New Project -> Java -> Java Class Library. This library will eventually contain the JPA entity classes, mapped to tables and views in the MySQL Sakila database. Following standard n-tier design principles, I prefer to separate the data access layer (DAL) from the service layer. You can then reuse the data access layer for other types of data-consumers, such as SOAP-based services.

Create New Java Class Library Project

Create New Java Class Library Project

Naming New Java Class Library

Naming New Java Class Library

Entity Classes from Database

Next, we will add entity classes to our project, mapped to several of the MySQL Sakila database’s tables and views. Right-click on the project and select New -> Entity Classes from Database… In the next window, choose the database connection we made before. NetBeans will then load all the available tables and views from the Sakila database. Next, select ‘actor_info(view)’, ‘film_actor’, and ‘film_list(view)’. Three related tables will also be added automatically by NetBeans. Note the warning at the bottom of the window about the need to specify Entity IDs. We will address this next.

Choosing Database Tables and Views

Choosing Database Tables and Views

Entity Class Options

Entity Class Options

Entity Mapping Options

Entity Mapping Options

When selecting ‘Entity Classes from Database…’, NetBeans adds the ‘EclipseLink (JPA 2.0)’ global library to the project. This library contains three jars, including EclipseLink 2.3.x, Java Persistence API (JPA) 2.0.x, and the state model API for JPQL queries. There is a newer EclipseLink 2.4.x library, with many new features, available from their web site. You can replace NetBeans’ EclipseLink (JPA 2.0) library with a new EclipseLink 2.4.x library if you want to give its new features, like JPA-RS, a try. It is not necessary for this demonstration, however.

New Java Class Project with Entities

Java Class Project with JPA Entity Classes

Adding Entity IDs to Views

To eliminate the warnings displayed when we built the entities, Entity IDs must be designated for the two database views we selected, ‘actor_info(view)’ and ‘film_list(view)’. Database views (virtual tables) do not have a primary key defined, which NetBeans requires for the entity classes. NetBeans will guide you through adding the ID if you click on the error icon shown below.

Adding Id to ActorInfo Entity

Adding an Entity ID to ActorInfo Entity

Id Added to Entity Class

Entity ID Added to Entity Class

ActorInfo.java Entity Class contents:

package com.mysql.entities;
import java.io.Serializable;
import javax.persistence.Basic;
import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.Lob;
import javax.persistence.NamedQueries;
import javax.persistence.NamedQuery;
import javax.persistence.Table;
import javax.xml.bind.annotation.XmlRootElement;
@Entity
@Table(name = "actor_info")
@XmlRootElement
@NamedQueries({
@NamedQuery(name = "ActorInfo.findAll", query = "SELECT a FROM ActorInfo a"),
@NamedQuery(name = "ActorInfo.findByActorId", query = "SELECT a FROM ActorInfo a WHERE a.actorId = :actorId"),
@NamedQuery(name = "ActorInfo.findByFirstName", query = "SELECT a FROM ActorInfo a WHERE a.firstName = :firstName"),
@NamedQuery(name = "ActorInfo.findByLastName", query = "SELECT a FROM ActorInfo a WHERE a.lastName = :lastName")})
public class ActorInfo implements Serializable {
private static final long serialVersionUID = 1L;
@Basic(optional = false)
@Column(name = "actor_id")
@Id
private short actorId;
@Basic(optional = false)
@Column(name = "first_name")
private String firstName;
@Basic(optional = false)
@Column(name = "last_name")
private String lastName;
@Lob
@Column(name = "film_info")
private String filmInfo;
public ActorInfo() {
}
public short getActorId() {
return actorId;
}
public void setActorId(short actorId) {
this.actorId = actorId;
}
public String getFirstName() {
return firstName;
}
public void setFirstName(String firstName) {
this.firstName = firstName;
}
public String getLastName() {
return lastName;
}
public void setLastName(String lastName) {
this.lastName = lastName;
}
public String getFilmInfo() {
return filmInfo;
}
public void setFilmInfo(String filmInfo) {
this.filmInfo = filmInfo;
}
}

New Java Web Application

Next, we will create the RESTful Web Services. Each service will be mapped to one of the corresponding JPA entities we just created in the Java class library project. Select New Project -> Java Web -> Web Application.

New Web Application Project

New Web Application Project

Naming New Web Application

Naming New Web Application

Configuring Server and Settings

Configuring Server and Settings

Configuring Frameworks

Configuring Frameworks

New Java Web Application Project

New Java Web Application Project

RESTful Web Services from Entity Classes

Before we build the RESTful web services, we need to add a reference to the previous Java class library project, containing the JPA entity classes. In the Java web application’s properties dialog window, under Categories -> Libraries -> Compile, add a link to the Java class library project’s .jar file.

Adding MySQL Entity Class Library

Adding MySQL Entity Class Library

Next, right-click on the project and select New -> RESTful Web Services from Entity Classes…

Adding RESTful Web Service from Entities

Adding RESTful Web Service from Entity Classes

In the preceding dialogue window, add all the ‘Available Entity Classes’ to the ‘Selected Entity Classes’ column.

Choosing Entity Classes

Choosing Entity Classes

Chosen Entity Classes

Chosen Entity Classes

After clicking next, you will be prompted to configure the Persistence Unit and the Persistence Unit’s Data Source. Please refer to my earlier post for more information on the Persistence Unit. This data source will also be used by GlassFish, once the project is deployed, to connect to the Sakila MySQL database. The Persistence Unit will use the JNDI name to reference the data source.

Creating Data Source for Persistence Unit

Creating Data Source for Persistence Unit

Creating Data Source and JNDI Name

Creating Data Source and JNDI Name

Creating Persistence Unit

Creating Persistence Unit

Persistence Unit (persistence.xml) contents:

<?xml version="1.0" encoding="UTF-8"?>
<persistence version="2.0" xmlns="http://java.sun.com/xml/ns/persistence" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://java.sun.com/xml/ns/persistence http://java.sun.com/xml/ns/persistence/persistence_2_0.xsd">
<persistence-unit name="MySQLDemoServicePU" transaction-type="JTA">
<jta-data-source>jdbc/mysql_sakila</jta-data-source>
<class>com.mysql.entities.Actor</class>
<class>com.mysql.entities.ActorInfo</class>
<class>com.mysql.entities.Film</class>
<class>com.mysql.entities.FilmActor</class>
<class>com.mysql.entities.Language</class>
<exclude-unlisted-classes>true</exclude-unlisted-classes>
<properties/>
</persistence-unit>
</persistence>
Generating Classes Using Jersey Options

Generating Classes Using Jersey Options

New Java Web Application with RESTful Web Services

Java Web Application with RESTful Web Services

As part of constructing the RESTful Web Services, notice NetBeans has added several Jersey (JAX-RS) libraries to the project. These libraries also reference Jackson (JSON Processor), Jettison (JSON StAX), MOXy (JAXB), and Grizzly (NIO) APIs.

Libraries Loaded by NetBeans to Java Web Application

Libraries Loaded by NetBeans to Java Web Application

Creating RESTful Web Services Test

Finally, we will test the RESTful Web Services, and indirectly the underlying entity classes mapped to the MySQL Sakila database. NetBeans makes this easy. To begin, right-click on the ‘RESTful Web Services’ folder in the Java web application project and select ‘Test RESTful Web Services’. NetBeans will automatically generate all the necessary files and links to test each of the RESTful web services’ operations.

As part of creating the tests, NetBeans will deploy the web application to GlassFish. When configuring the tests in the ‘Configure REST Test Client’ dialog window, make sure to use the second option, ‘Web Test Client in Project’. The first option only works with Microsoft’s Internet Explorer, an odd choice for a Java-based application running on Linux.

Configuring the REST Test Client

Configuring the REST Test Client

Highlighted below in red are the components NetBeans will install on the GlassFish application server. They include the RESTful web services application, a .war file. Each of the RESTful web services is a Stateless Session Bean, installed as part of the application. The deployment also includes a JDBC Resource and a JDBC Connection Pool, which connect the application to the MySQL Sakila database. The Resource is automatically associated with the Connection Pool.

RESTful Web Services Deployed to GlassFish Server

RESTful Web Services Deployed to GlassFish Server

After creating the necessary files and deploying the application, NetBeans will open a web browser, allowing you to test the services. Each of the RESTful web services is available to test by clicking on the links in the left-hand navigation menu. NetBeans has generated a few default operations, including ‘{id}’, ‘{from/to}’, and ‘count’, each mapped to separate methods in the service classes. Also notice you can choose to display the results of the service calls in multiple formats, including XML, JSON, and plain text.

Testing RESTful Web Services from NetBeans

Testing RESTful Web Services from NetBeans Using Chrome

We can also test the RESTful Web Services by calling the service URLs directly. Below are the results of my call to the Actor service’s URL from a separate Windows client machine.

Calling the RESTful Web Services Directly

Calling the RESTful Web Services Directly

You can also use applications like Fiddler, cURL, Firefox with Firebug, and Google Chrome’s Advanced REST Client and REST Console to test the services. Below, I used Fiddler to call the Actor service, again. Note the response contains a JSON payload, not XML. With Jersey, you can request and receive JSON from the services without additional programming.
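
If you want to script the same checks from the command line, a couple of cURL calls will do. The host, port, application context, and resource path below are placeholders; substitute the actual URLs shown on your generated test page.

# Request the resource as XML
curl -H "Accept: application/xml" "http://[glassfish_host]:8080/[application_context]/resources/[entity_path]/1"

# Request the same resource as JSON
curl -H "Accept: application/json" "http://[glassfish_host]:8080/[application_context]/resources/[entity_path]/1"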

Fiddler2 Request Example

Fiddler2 Request Example

Conclusion

Using these services, you can build any number of server-side and client-side data-driven applications. The service layer is platform agnostic, accessible from any web browser, mobile device, or native desktop application, on Windows, Linux, or Apple platforms.

Links

MySQL Server: http://www.mysql.com/downloads/mysql

MySQL Connector/J JDBC driver for MySQL: http://dev.mysql.com/downloads/connector/j

MySQL Workbench: http://www.mysql.com/downloads/workbench

MySQL Sakila Sample Database: http://dev.mysql.com/doc/sakila/en/sakila-installation.html

NetBeans IDE: http://www.netbeans.org

EclipseLink: http://projects.eclipse.org/projects/rt.eclipselink



Object Tracking on the Raspberry Pi with C++, OpenCV, and cvBlob

Use C++ with OpenCV and cvBlob to perform image processing and object tracking on the Raspberry Pi, using a webcam.

Source code and compiled samples are now available on GitHub. The post below describes the original code on the ‘Master’ branch. As of May 2014, there is a revised and improved version of the project on the ‘rev05_2014’ branch on GitHub. The README.md details the changes and also describes how to install OpenCV, cvBlob, and all dependencies!

Introduction

As part of a project with a local FIRST Robotics Competition (FRC) Team, I’ve been involved in developing a Computer Vision application for use on the Raspberry Pi. Our FRC team’s goal is to develop an object tracking and target acquisition application that could be run on the Raspberry Pi, as opposed to the robot’s primary embedded processor, a National Instruments NI cRIO-FRC II. We chose to work in C++ for its speed. We also decided to test two popular open-source Computer Vision (CV) libraries, OpenCV and cvBlob.

Due to its single ARM1176JZF-S 700 MHz ARM processor, a significant limitation of the Raspberry Pi is its ability to perform complex operations in real time, such as image processing. In an earlier post, I discussed using Motion to detect motion with a webcam on the Raspberry Pi. Although the Raspberry Pi was capable of running Motion, it required a greatly reduced capture size and frame-rate. Even then, the Raspberry Pi’s ability to process the webcam’s feed was very slow. I had doubts it would be able to meet the processor-intensive requirements of this project.

Development for the Raspberry Pi

Using C++ in NetBeans 7.2.1 on Ubuntu 12.04.1 LTS and 12.10, I wrote several small pieces of code to demonstrate the Raspberry Pi’s ability to perform basic image processing and object tracking. Parts of the following code are based on several OpenCV and cvBlob code examples found in my research. Many of those examples are linked at the end of this article. Examples of cvBlob are especially hard to find.

Project in NetBeans

Project in NetBeans

The Code

There are five files: ‘main.cpp’, ‘testfps.cpp (testfps.hpp)’, and ‘testcvblob.cpp (testcvblob.hpp)’. The main.cpp file’s main method calls the test methods in the other two files. The cvBlob library only works with the pre-OpenCV 2.0 API. Therefore, I wrote all the code using the older objects and methods. The code is not written using the latest OpenCV 2.0 conventions. For example, cvBlob uses 1.0’s ‘IplImage’ image type instead of 2.0’s newer ‘CvMat’ image type. My next project is to re-write the cvBlob code to use OpenCV 2.0 conventions and/or find a newer library. The cvBlob library offered so many advantages, I felt not using the newer OpenCV 2.0 features was still worthwhile.
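
For comparison, here is a rough sketch of the same color-range segmentation step written against OpenCV 2.x’s C++ API. It is illustrative only and is not part of the project’s source code; the color bounds mirror the blue range used in the blob-tracking test below.

#include <opencv2/opencv.hpp>

int main() {
    cv::VideoCapture capture(0);                 // default webcam
    if (!capture.isOpened()) return -1;
    cv::Mat frame, segmented;
    capture >> frame;                            // grab a single frame
    if (frame.empty()) return -1;
    // Isolate a blue-ish color range (BGR order), similar to cvInRangeS() in the 1.x code
    cv::inRange(frame, cv::Scalar(100, 69, 49), cv::Scalar(216, 163, 134), segmented);
    cv::medianBlur(segmented, segmented, 7);     // replaces cvSmooth(..., CV_MEDIAN, 7, 7)
    cv::imshow("Processed", segmented);
    cv::waitKey(0);
    return 0;
}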

Main Program Method (main.cpp)

/*
* File: main.cpp
* Author: Gary Stafford
* Description: Program entry point
* Created: February 3, 2013
*/
#include <stdio.h>
#include <sstream>
#include <stdlib.h>
#include <iostream>
#include "testfps.hpp"
#include "testcvblob.hpp"
using namespace std;
int main(int argc, char* argv[]) {
int captureMethod = 0;
int captureWidth = 0;
int captureHeight = 0;
if (argc == 4) { // user input parameters with call
captureMethod = strtol(argv[1], NULL, 0);
captureWidth = strtol(argv[2], NULL, 0);
captureHeight = strtol(argv[3], NULL, 0);
} else { // user did not input parameters with call
cout << endl << "Demonstrations/Tests: " << endl;
cout << endl << "(1) Test OpenCV - Show Webcam" << endl;
cout << endl << "(2) Test OpenCV - No Webcam" << endl;
cout << endl << "(3) Test cvBlob - Show Image" << endl;
cout << endl << "(4) Test cvBlob - No Image" << endl;
cout << endl << "(5) Test Blob Tracking - Show Webcam" << endl;
cout << endl << "(6) Test Blob Tracking - No Webcam" << endl;
cout << endl << "Input test # (1-6): ";
cin >> captureMethod;
// test 3 and 4 don't require width and height parameters
if (captureMethod != 3 && captureMethod != 4) {
cout << endl << "Input capture width (pixels): ";
cin >> captureWidth;
cout << endl << "Input capture height (pixels): ";
cin >> captureHeight;
cout << endl;
if (!(captureWidth > 0)) {
cout << endl << "Width value incorrect" << endl;
return -1;
}
if (!(captureHeight > 0)) {
cout << endl << "Height value incorrect" << endl;
return -1;
}
}
}
switch (captureMethod) {
case 1:
TestFpsShowVideo(captureWidth, captureHeight);
break;
case 2:
TestFpsNoVideo(captureWidth, captureHeight);
break;
case 3:
DetectBlobsShowStillImage();
break;
case 4:
DetectBlobsNoStillImage();
break;
case 5:
DetectBlobsShowVideo(captureWidth, captureHeight);
break;
case 6:
DetectBlobsNoVideo(captureWidth, captureHeight);
break;
default:
break;
}
return 0;
}

Tests 3-6 (testcvblob.hpp)

// -*- C++ -*-
/*
* File: testcvblob.hpp
* Author: Gary Stafford
* Created: February 3, 2013
*/
#ifndef TESTCVBLOB_HPP
#define TESTCVBLOB_HPP
int DetectBlobsNoStillImage();
int DetectBlobsShowStillImage();
int DetectBlobsNoVideo(int captureWidth, int captureHeight);
int DetectBlobsShowVideo(int captureWidth, int captureHeight);
#endif /* TESTCVBLOB_HPP */

Tests 3-6 (testcvblob.cpp)

/*
* File: testcvblob.cpp
* Author: Gary Stafford
* Description: Track blobs using OpenCV and cvBlob
* Created: February 3, 2013
*/
#include <cv.h>
#include <highgui.h>
#include <cvblob.h>
#include "testcvblob.hpp"
using namespace cvb;
using namespace std;
// Test 4: OpenCV and cvBlob (static image, no display)
int DetectBlobsNoStillImage() {
/// Variables /////////////////////////////////////////////////////////
CvSize imgSize;
IplImage *image, *segmentated, *labelImg;
CvBlobs blobs;
unsigned int result = 0;
///////////////////////////////////////////////////////////////////////
image = cvLoadImage("colored_balls.jpg");
if (image == NULL) {
return -1;
}
imgSize = cvGetSize(image);
cout << endl << "Width (pixels): " << image->width;
cout << endl << "Height (pixels): " << image->height;
cout << endl << "Channels: " << image->nChannels;
cout << endl << "Bit Depth: " << image->depth;
cout << endl << "Image Data Size (kB): "
<< image->imageSize / 1024 << endl << endl;
segmentated = cvCreateImage(imgSize, 8, 1);
cvInRangeS(image, CV_RGB(155, 0, 0), CV_RGB(255, 130, 130), segmentated);
labelImg = cvCreateImage(cvGetSize(image), IPL_DEPTH_LABEL, 1);
result = cvLabel(segmentated, labelImg, blobs);
cvFilterByArea(blobs, 500, 1000000);
cout << endl << "Blob Count: " << blobs.size();
cout << endl << "Pixels Labeled: " << result << endl << endl;
cvReleaseBlobs(blobs);
cvReleaseImage(&labelImg);
cvReleaseImage(&segmentated);
cvReleaseImage(&image);
return 0;
}
// Test 3: OpenCV and cvBlob (static image, with display)
int DetectBlobsShowStillImage() {
/// Variables /////////////////////////////////////////////////////////
CvSize imgSize;
IplImage *image, *frame, *segmentated, *labelImg;
CvBlobs blobs;
unsigned int result = 0;
bool quit = false;
///////////////////////////////////////////////////////////////////////
cvNamedWindow("Processed Image", CV_WINDOW_AUTOSIZE);
cvMoveWindow("Processed Image", 750, 100);
cvNamedWindow("Image", CV_WINDOW_AUTOSIZE);
cvMoveWindow("Image", 100, 100);
image = cvLoadImage("colored_balls.jpg");
if (image == NULL) {
return -1;
}
imgSize = cvGetSize(image);
cout << endl << "Width (pixels): " << image->width;
cout << endl << "Height (pixels): " << image->height;
cout << endl << "Channels: " << image->nChannels;
cout << endl << "Bit Depth: " << image->depth;
cout << endl << "Image Data Size (kB): "
<< image->imageSize / 1024 << endl << endl;
frame = cvCreateImage(imgSize, image->depth, image->nChannels);
cvConvertScale(image, frame, 1, 0);
segmentated = cvCreateImage(imgSize, 8, 1);
cvInRangeS(image, CV_RGB(155, 0, 0), CV_RGB(255, 130, 130), segmentated);
cvSmooth(segmentated, segmentated, CV_MEDIAN, 7, 7);
labelImg = cvCreateImage(cvGetSize(frame), IPL_DEPTH_LABEL, 1);
result = cvLabel(segmentated, labelImg, blobs);
cvFilterByArea(blobs, 500, 1000000);
cvRenderBlobs(labelImg, blobs, frame, frame,
CV_BLOB_RENDER_BOUNDING_BOX | CV_BLOB_RENDER_TO_STD, 1.);
cvShowImage("Image", frame);
cvShowImage("Processed Image", segmentated);
while (!quit) {
char k = cvWaitKey(10)&0xff;
switch (k) {
case 27:
case 'q':
case 'Q':
quit = true;
break;
}
}
cvReleaseBlobs(blobs);
cvReleaseImage(&labelImg);
cvReleaseImage(&segmentated);
cvReleaseImage(&frame);
cvReleaseImage(&image);
cvDestroyAllWindows();
return 0;
}
// Test 6: Blob Tracking (webcam feed, no display)
int DetectBlobsNoVideo(int captureWidth, int captureHeight) {
/// Variables /////////////////////////////////////////////////////////
CvCapture *capture;
CvSize imgSize;
IplImage *image, *frame, *segmentated, *labelImg;
int picWidth, picHeight;
CvTracks tracks;
CvBlobs blobs;
CvBlob* blob;
unsigned int result = 0;
bool quit = false;
///////////////////////////////////////////////////////////////////////
capture = cvCaptureFromCAM(-1);
cvSetCaptureProperty(capture, CV_CAP_PROP_FRAME_WIDTH, captureWidth);
cvSetCaptureProperty(capture, CV_CAP_PROP_FRAME_HEIGHT, captureHeight);
cvGrabFrame(capture);
image = cvRetrieveFrame(capture);
if (image == NULL) {
return -1;
}
imgSize = cvGetSize(image);
cout << endl << "Width (pixels): " << image->width;
cout << endl << "Height (pixels): " << image->height << endl << endl;
frame = cvCreateImage(imgSize, image->depth, image->nChannels);
while (!quit && cvGrabFrame(capture)) {
image = cvRetrieveFrame(capture);
cvConvertScale(image, frame, 1, 0);
segmentated = cvCreateImage(imgSize, 8, 1);
cvInRangeS(image, CV_RGB(155, 0, 0), CV_RGB(255, 130, 130), segmentated);
//Can experiment either or both
cvSmooth(segmentated, segmentated, CV_MEDIAN, 7, 7);
cvSmooth(segmentated, segmentated, CV_GAUSSIAN, 9, 9);
labelImg = cvCreateImage(cvGetSize(frame), IPL_DEPTH_LABEL, 1);
result = cvLabel(segmentated, labelImg, blobs);
cvFilterByArea(blobs, 500, 1000000);
cvRenderBlobs(labelImg, blobs, frame, frame, 0x000f, 1.);
cvUpdateTracks(blobs, tracks, 200., 5);
cvRenderTracks(tracks, frame, frame, 0x000f, NULL);
picWidth = frame->width;
picHeight = frame->height;
if (cvGreaterBlob(blobs)) {
blob = blobs[cvGreaterBlob(blobs)];
cout << "Blobs found: " << blobs.size() << endl;
cout << "Pixels labeled: " << result << endl;
cout << "center-x: " << blob->centroid.x
<< " center-y: " << blob->centroid.y
<< endl;
cout << "offset-x: " << ((picWidth / 2)-(blob->centroid.x))
<< " offset-y: " << (picHeight / 2)-(blob->centroid.y)
<< endl;
cout << "\n";
}
char k = cvWaitKey(10)&0xff;
switch (k) {
case 27:
case 'q':
case 'Q':
quit = true;
break;
}
}
cvReleaseBlobs(blobs);
cvReleaseImage(&labelImg);
cvReleaseImage(&segmentated);
cvReleaseImage(&frame);
cvReleaseImage(&image);
cvDestroyAllWindows();
cvReleaseCapture(&capture);
return 0;
}
// Test 5: Blob Tracking (webcam feed, with display)
int DetectBlobsShowVideo(int captureWidth, int captureHeight) {
/// Variables /////////////////////////////////////////////////////////
CvCapture *capture;
CvSize imgSize;
IplImage *image, *frame, *segmentated, *labelImg;
CvPoint pt1, pt2, pt3, pt4, pt5, pt6;
CvScalar red, green, blue;
int picWidth, picHeight, thickness;
CvTracks tracks;
CvBlobs blobs;
CvBlob* blob;
unsigned int result = 0;
bool quit = false;
///////////////////////////////////////////////////////////////////////
cvNamedWindow("Processed Video Frames", CV_WINDOW_AUTOSIZE);
cvMoveWindow("Processed Video Frames", 750, 400);
cvNamedWindow("Webcam Preview", CV_WINDOW_AUTOSIZE);
cvMoveWindow("Webcam Preview", 200, 100);
capture = cvCaptureFromCAM(1);
cvSetCaptureProperty(capture, CV_CAP_PROP_FRAME_WIDTH, captureWidth);
cvSetCaptureProperty(capture, CV_CAP_PROP_FRAME_HEIGHT, captureHeight);
cvGrabFrame(capture);
image = cvRetrieveFrame(capture);
if (image == NULL) {
return -1;
}
imgSize = cvGetSize(image);
cout << endl << "Width (pixels): " << image->width;
cout << endl << "Height (pixels): " << image->height << endl << endl;
frame = cvCreateImage(imgSize, image->depth, image->nChannels);
while (!quit && cvGrabFrame(capture)) {
image = cvRetrieveFrame(capture);
cvFlip(image, image, 1);
cvConvertScale(image, frame, 1, 0);
segmentated = cvCreateImage(imgSize, 8, 1);
//Blue paper
cvInRangeS(image, CV_RGB(49, 69, 100), CV_RGB(134, 163, 216), segmentated);
//Green paper
//cvInRangeS(image, CV_RGB(45, 92, 76), CV_RGB(70, 155, 124), segmentated);
//Can experiment either or both
cvSmooth(segmentated, segmentated, CV_MEDIAN, 7, 7);
cvSmooth(segmentated, segmentated, CV_GAUSSIAN, 9, 9);
labelImg = cvCreateImage(cvGetSize(frame), IPL_DEPTH_LABEL, 1);
result = cvLabel(segmentated, labelImg, blobs);
cvFilterByArea(blobs, 500, 1000000);
cvRenderBlobs(labelImg, blobs, frame, frame, CV_BLOB_RENDER_COLOR, 0.5);
cvUpdateTracks(blobs, tracks, 200., 5);
cvRenderTracks(tracks, frame, frame, CV_TRACK_RENDER_BOUNDING_BOX, NULL);
red = CV_RGB(250, 0, 0);
green = CV_RGB(0, 250, 0);
blue = CV_RGB(0, 0, 250);
thickness = 1;
picWidth = frame->width;
picHeight = frame->height;
pt1 = cvPoint(picWidth / 2, 0);
pt2 = cvPoint(picWidth / 2, picHeight);
cvLine(frame, pt1, pt2, red, thickness);
pt3 = cvPoint(0, picHeight / 2);
pt4 = cvPoint(picWidth, picHeight / 2);
cvLine(frame, pt3, pt4, red, thickness);
cvShowImage("Webcam Preview", frame);
cvShowImage("Processed Video Frames", segmentated);
if (cvGreaterBlob(blobs)) {
blob = blobs[cvGreaterBlob(blobs)];
pt5 = cvPoint(picWidth / 2, picHeight / 2);
pt6 = cvPoint(blob->centroid.x, blob->centroid.y);
cvLine(frame, pt5, pt6, green, thickness);
cvCircle(frame, pt6, 3, green, 2, CV_FILLED, 0);
cvShowImage("Webcam Preview", frame);
cvShowImage("Processed Video Frames", segmentated);
cout << "Blobs found: " << blobs.size() << endl;
cout << "Pixels labeled: " << result << endl;
cout << "center-x: " << blob->centroid.x
<< " center-y: " << blob->centroid.y
<< endl;
cout << "offset-x: " << ((picWidth / 2)-(blob->centroid.x))
<< " offset-y: " << (picHeight / 2)-(blob->centroid.y)
<< endl;
cout << "\n";
}
char k = cvWaitKey(10)&0xff;
switch (k) {
case 27:
case 'q':
case 'Q':
quit = true;
break;
}
}
cvReleaseBlobs(blobs);
cvReleaseImage(&labelImg);
cvReleaseImage(&segmentated);
cvReleaseImage(&frame);
cvReleaseImage(&image);
cvDestroyAllWindows();
cvReleaseCapture(&capture);
return 0;
}

Tests 1-2 (testfps.hpp)

// -*- C++ -*-
/*
* File: testfps.hpp
* Author: Gary Stafford
* Created: February 3, 2013
*/
#ifndef TESTFPS_HPP
#define TESTFPS_HPP
int TestFpsNoVideo(int captureWidth, int captureHeight);
int TestFpsShowVideo(int captureWidth, int captureHeight);
#endif /* TESTFPS_HPP */

Tests 1-2 (testfps.cpp)

/*
* File: testfps.cpp
* Author: Gary Stafford
* Description: Test the fps of a webcam using OpenCV
* Created: February 3, 2013
*/
#include <cv.h>
#include <highgui.h>
#include <time.h>
#include <stdio.h>
#include "testfps.hpp"
using namespace std;
// Test 2: OpenCV (webcam feed, no display)
int TestFpsNoVideo(int captureWidth, int captureHeight) {
IplImage* frame;
CvCapture* capture = cvCreateCameraCapture(-1);
cvSetCaptureProperty(capture, CV_CAP_PROP_FRAME_WIDTH, captureWidth);
cvSetCaptureProperty(capture, CV_CAP_PROP_FRAME_HEIGHT, captureHeight);
time_t start, end;
double fps, sec;
int counter = 0;
char k;
time(&start);
while (1) {
frame = cvQueryFrame(capture);
time(&end);
++counter;
sec = difftime(end, start);
fps = counter / sec;
printf("FPS = %.2f\n", fps);
if (!frame) {
printf("Error");
break;
}
k = cvWaitKey(10)&0xff;
switch (k) {
case 27:
case 'q':
case 'Q':
break;
}
}
cvReleaseCapture(&capture);
return 0;
}
// Test 1: OpenCV (webcam feed, with display)
int TestFpsShowVideo(int captureWidth, int captureHeight) {
IplImage* frame;
CvCapture* capture = cvCreateCameraCapture(-1);
cvSetCaptureProperty(capture, CV_CAP_PROP_FRAME_WIDTH, captureWidth);
cvSetCaptureProperty(capture, CV_CAP_PROP_FRAME_HEIGHT, captureHeight);
cvNamedWindow("Webcam Preview", CV_WINDOW_AUTOSIZE);
cvMoveWindow("Webcam Preview", 300, 200);
time_t start, end;
double fps, sec;
int counter = 0;
char k;
time(&start);
while (1) {
frame = cvQueryFrame(capture);
time(&end);
++counter;
sec = difftime(end, start);
fps = counter / sec;
printf("FPS = %.2f\n", fps);
if (!frame) {
printf("Error");
break;
}
cvShowImage("Webcam Preview", frame);
k = cvWaitKey(10)&0xff;
switch (k) {
case 27:
case 'q':
case 'Q':
break;
}
}
cvDestroyWindow("Webcam Preview");
cvReleaseCapture(&capture);
return 0;
}

 

Compiling Locally on the Raspberry Pi

After writing the code, the first big challenge was cross-compiling the native C++ code, written on Intel IA-32 and 64-bit x86-64 processor-based laptops, to run on the Raspberry Pi’s ARM architecture. After failing to successfully cross-compile the C++ source code using crosstools-ng, mostly due to my lack of cross-compiling experience, I resorted to using g++ to compile the C++ source code directly on the Raspberry Pi.

First, I had to properly install the various CV libraries and the compiler on the Raspberry Pi, which itself is a bit daunting.

Compiling OpenCV 2.4.3, from the source-code, on the Raspberry Pi took an astounding 8 hours. Even though compiling the C++ source code takes longer on the Raspberry Pi, I could be assured the compiled code would run locally. Below are the commands that I used to transfer and compile the C++ source code on my Raspberry Pi.

Copy and Compile Commands

scp *.jpg *.cpp *.h {your-pi-user}@{your.ip.address}:your/file/path/
ssh {your-pi-user}@{your.ip.address}
cd ~/your/file/path/
g++ `pkg-config opencv cvblob --cflags --libs` testfps.cpp testcvblob.cpp main.cpp -o FpsTest -v
./FpsTest

Compiling Program on Raspberry Pi

Compiling Program on Raspberry Pi

Special Note About cvBlob on ARM

At first I had given up on cvBlob working on the Raspberry Pi. All the cvBlob tests I ran, no matter how simple, continued to hang on the Raspberry Pi after working perfectly on my laptop. I had narrowed the problem down to the ‘cvLabel’ method, but was unable to resolve it. However, I recently discovered a documented bug on the cvBlob website. It concerned cvBlob and the very same ‘cvLabel’ method on ARM-based devices (ARM = Raspberry Pi!). After making a minor modification to cvBlob’s ‘cvlabel.cpp’ source code, as directed in the bug post, and re-compiling on the Raspberry Pi, the test worked perfectly.

Testing OpenCV and cvBlob

The code contains three pairs of tests (six total), as follows:

  1. OpenCV (w/ live webcam feed)
    Determine if OpenCV is installed and functioning properly with the complied C++ code. Capture a webcam feed using OpenCV, and display the feed and frame rate (fps).
  2. OpenCV (w/o live webcam feed)
    Same as Test #1, but only print the frame rate (fps). The computer doesn’t need to display the video feed to process the data. More importantly, the webcam’s feed might unnecessarily tax the computer’s processor and GPU.
  3. OpenCV and cvBlob (w/ live webcam feed)
    Determine if OpenCV and cvBlob are installed and functioning properly with the complied C++ code. Detect and display all objects (blobs) in a specific red color range, contained in a static jpeg image.
  4. OpenCV and cvBlob (w/o live webcam feed)
    Same as Test #3, but only print some basic information about the static image and the number of blobs detected. Again, the computer doesn’t need to display the image to process the data.
  5. Blob Tracking (w/ live webcam feed)
    Detect, track, and display all objects (blobs) in a specific blue color range, along with the largest blob’s positional data. Captured with a webcam, using OpenCV and cvBlob.
  6. Blob Tracking (w/o live webcam feed)
    Same as Test #5, but only display the largest blob’s positional data. Again, the computer doesn’t need to display the webcam feed to process the data. Displaying the feed unnecessarily taxes the computer’s processor, which is already consumed with detecting and tracking the blobs. The blob’s positional data is sent to the robot and used by its targeting system to position its shooting platform.

The Program

There are two ways to run this program. First, from the command line you can call the application and pass in three parameters. The parameters include:

  1. Test method you want to run (1-6)
  2. Width of the webcam capture window in pixels
  3. Height of the webcam capture window in pixels.

An example would be ‘./FpsTest 2 640 480’ or ‘./FpsTest 5 320 240’.

The second method is to run the program without passing in any parameters. In that case, the program will prompt you to input the test number and other parameters on-screen.

Input Options for Application

Input Options for Application

Test 1: Laptop versus Raspberry Pi

Test 1: Displaying Webcam Feed using OpenCV (laptop)

Test 1: Displaying Webcam Feed using OpenCV (laptop)

Test 1: Displaying Webcam Feed using OpenCV (Raspberry Pi)

Test 1: Displaying Webcam Feed using OpenCV (Raspberry Pi)

Test 3: Laptop versus Raspberry Pi

Test 3: Detecting Red Color Range in Static Image using OpenCV and cvBlob (laptop)

Test 3: Detecting Red Color Range in Static Image using OpenCV and cvBlob (laptop)

Test 3: Detecting Red Color Range in Static Image using OpenCV and cvBlob (Raspberry Pi)

Test 3: Detecting Red Color Range in Static Image using OpenCV and cvBlob (Raspberry Pi)

Test 5: Detecting Objects within Blue Color Range using OpenCV and cvBlob (laptop)

Test 5: Detecting Objects within Blue Color Range using OpenCV and cvBlob (laptop)

Test 5: Laptop versus Raspberry Pi

Test 5: Detecting Objects within Blue Color Range using OpenCV and cvBlob (Raspberry Pi)

Test 5: Detecting Objects within Blue Color Range using OpenCV and cvBlob (Raspberry Pi)

The Results

Each test was first run on two Linux-based laptops, with Intel 32-bit and 64-bit architectures, and with two different USB webcams. The laptops were used to develop and test the code, as well as provide a baseline for application performance. Many factors can dramatically affect the application’s ability to do image processing. They include the computer’s processor(s), RAM, HDD, GPU, USB, Operating System, and the webcam’s video capture size, compression ratio, and frame-rate. There are significant differences in all these elements when comparing an average laptop to the Raspberry Pi.

Frame-rates on the Intel processor-based Ubuntu laptops easily performed at or beyond the maximum 30 fps rate of the webcams, at 640 x 480 pixels. On a positive note, the Raspberry Pi was able to compile and execute the tests of OpenCV and cvBlob (see the bug noted above). Unfortunately, at least in my tests, the Raspberry Pi could not achieve more than 1.5 – 2 fps at most, even in the most basic tests, and at a reduced capture size of 320 x 240 pixels. This can be seen in the first and second screen-grabs of Test #1, above. Although I’m sure there are ways to improve the code and optimize the image capture, the results were much too slow to provide accurate, real-time data to the robot’s targeting system.

Links of Interest

Static Test Images Free from: http://www.rgbstock.com/

Great Website for OpenCV Samples: http://opencv-code.com/

Another Good Website for OpenCV Samples: http://opencv-srf.blogspot.com/2010/09/filtering-images.html

cvBlob Code Sample: https://code.google.com/p/cvblob/source/browse/samples/red_object_tracking.cpp

Detecting Blobs with cvBlob: http://8a52labs.wordpress.com/2011/05/24/detecting-blobs-using-cvblobs-library/

Best Post/Script to Install OpenCV on Ubuntu and Raspberry Pi: http://jayrambhia.wordpress.com/2012/05/02/install-opencv-2-3-1-and-simplecv-in-ubuntu-12-04-precise-pangolin-arch-linux/

Measuring Frame-rate with OpenCV: http://8a52labs.wordpress.com/2011/05/19/frames-per-second-in-opencv/

OpenCV and Raspberry Pi: http://mitchtech.net/raspberry-pi-opencv/


Build Automation – Calling GlassFish’s asadmin and Apache Ant Directly

Automating the deployment of applications from NetBeans to GlassFish is easy using Apache Ant and GlassFish’s asadmin utility. Being able to call these two applications directly, without specifying their complete file paths, is a real time-saver. On Ubuntu (Linux), as on Windows, this is done by adding their file paths to the $PATH environment variable.

Below is an example of adding both asadmin and Ant to the .bashrc file in your home directory. To open the .bashrc file, open a Terminal window and enter ‘gedit ~/.bashrc‘. When the .bashrc file opens, add the following lines at the end of the file. Make sure you change the file paths to match your local system if they are different.

export ANT_HOME=$HOME/netbeans-7.2/java/ant
export ASADMIN_HOME=$HOME/glassfish-3.1.2.2/glassfish
export PATH=$PATH:$ASADMIN_HOME/bin:$ANT_HOME/bin

Save and close the .bashrc file, open a new Terminal window (or run ‘source ~/.bashrc‘), and type ‘asadmin’ at the prompt. You should see the response below. Type ‘exit’ to get out of asadmin. Next, type ‘ant’. Again, you should see the response below. This means both applications are now available directly, from any directory or from within any application, like Jenkins or Hudson.

Adding GlassFish’s asadmin and Apache Ant to $Path Environmental Variable

You can also add these variables in other ways. Here are links to other posts, which go into much more detail, and show methods to add these for all users, in addition to just yourself:


Discover All Properties Available to an Apache Ant Target

Ever waste time searching for a certain property you need to build an Ant target? Here’s a quick tip to save you some time – echoproperties. According to The Apache Ant Project website, the echoproperties task “displays all the current properties (or a subset of them specified by a nested <propertyset>) in the project. The output can be sent to a file if desired. This task can be used as a somewhat contrived means of returning data from an <ant> invocation, but is really for debugging build files.”

Recently, I was working on a new Java Web Application Project in NetBeans IDE 7.2.1. I wanted to build an Ant target to automate the deployment of the project’s .war file to GlassFish. To do so, I needed to identify properties that could return 1) the project’s name, 2) the path to the project’s .war file, and 3) the path to GlassFish’s asadmin utility. Calling the echoproperties task from within the Ant target below, from within my open project, returned a list of over 90 property key/value pairs.

<target name="list-all-properties">
    <echoproperties />
</target>

Although the results were enlightening, I couldn’t find the properties I was hoping to reference in the new target. Next, I ran the Ant target again, adding the two dependency targets my GlassFish deployment target was going to need: clean and dist.

<target name="list-all-properties" depends="clean, dist">
    <echoproperties />
</target>

Running the revised target returned almost 450 properties, all available to Ant. The new properties were a result of the clean and dist targets running before the call to echoproperties. Those targets’ properties were now also available. Here is a snippet of the results:

...
ant.project.invoked-targets=list-all-properties
ant.project.name=MySqlEntityWebDemo
ant.version=Apache Ant(TM) version 1.8.3 compiled on February 26 2012
ap.cmd.line.internal=
ap.proc.none.internal=
ap.processors.internal=
ap.supported.internal=true
application.args.param=
awt.toolkit=sun.awt.X11.XToolkit
basedir=/home/gstaffor/NetBeansProjects/MySqlEntityWebDemo
build.classes.dir=build/web/WEB-INF/classes
build.classes.excludes=**/*.java,**/*.form
build.compiler.emacs=true
build.dir=build
build.dir.to.clean=build/web
build.generated.dir=build/generated
build.generated.sources.dir=build/generated-sources
build.meta.inf.dir=build/web/META-INF
build.test.classes.dir=build/test/classes
build.test.results.dir=build/test/results
build.web.dir=build/web
build.web.excludes=**/*.java,**/*.form
client.urlPart=
compile.jsps=false
conf.dir=src/conf
debug-args-line=-Xdebug
debug-transport=dt_socket
debug-transport-by-os=dt_socket
debug.classpath=build/web/WEB-INF/classes\:/home/gstaffor/JavaFiles/eclipselink_2_4_1/jlib/eclipselink.jar...
debug.test.classpath=/home/gstaffor/JavaFiles/eclipselink_2_4_1/jlib/eclipselink.jar...
default.javac.source=1.7
default.javac.target=1.7
deploy.ant.properties.file=/home/gstaffor/.netbeans/7.2/gfv3-430621021.properties
display.browser=true
dist.dir=dist
dist.ear.war=dist/MySqlEntityWebDemo.war
dist.jar.dir=/home/gstaffor/NetBeansProjects/MySqlEntityWebDemo/dist
dist.javadoc.dir=dist/javadoc
dist.war=dist/MySqlEntityWebDemo.war
...
j2ee.compile.on.save=true
j2ee.copy.static.files.on.save=true
j2ee.deploy.on.save=true
j2ee.platform=1.6-web
j2ee.platform.classpath=/home/gstaffor/glassfish-3.1.2.2/glassfish/modules/bean-validator.jar...
j2ee.platform.embeddableejb.classpath=/home/gstaffor/glassfish-3.1.2.2/glassfish/lib/embedded/glassfish-embedded-static-shell.jar
j2ee.platform.is.jsr109=true
j2ee.platform.wscompile.classpath=/home/gstaffor/glassfish-3.1.2.2/glassfish/modules/webservices-osgi.jar...
j2ee.platform.wsit.classpath=
j2ee.server.domain=/home/gstaffor/glassfish-3.1.2.2/glassfish/domains/domain1
j2ee.server.home=/home/gstaffor/glassfish-3.1.2.2/glassfish
j2ee.server.instance=[/home/gstaffor/glassfish-3.1.2.2/glassfish...
j2ee.server.middleware=/home/gstaffor/glassfish-3.1.2.2
j2ee.server.type=gfv3ee6
jar.compress=false
...
war.content.additional=
war.ear.name=MySqlEntityWebDemo.war
war.name=MySqlEntityWebDemo.war
web.docbase.dir=web
webinf.dir=web/WEB-INF

Reviewing the results, I was able to find all the properties I needed to build the target, below.

<target name="glassfish-deploy" depends="clean, dist"
        description="Build distribution (WAR) and deploy to GlassFish">               
    <exec failonerror="true" vmlauncher="false" 
          executable="${j2ee.server.home}/bin/asadmin" >
        <arg line="--host=localhost --port=4848 
            --user=admin --passwordfile=pwdfile --secure=false
            deploy --force=true --name='${ant.project.name}' 
            --contextroot='/${ant.project.name}' '${dist.war}'" />
    </exec>
</target>
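
Note the --passwordfile=pwdfile argument in the target above. Rather than typing the admin password on the command line, asadmin reads it from a small password file. Assuming GlassFish’s standard password-file format, a minimal pwdfile would contain a single entry similar to the one below; the value shown is only a placeholder, so substitute your own admin password.

# hypothetical contents of pwdfile - replace the placeholder value
AS_ADMIN_PASSWORD=changeit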

Almost any property you need to develop an Ant target is probably available, if you know where, or how, to look.


RESTful Mobile: Consuming Java EE RESTful Web Services Using jQuery Mobile

Use jQuery Mobile to build a mobile HTML website, capable of calling Jersey-specific Java EE RESTful web services and displaying JSONP in a mobile web browser.

Both NetBeans projects used in this post are available on DropBox. If you like DropBox, please use this link to sign up for a free 2 GB account. It will help me post more files to DropBox for future posts.

Background

In the previous two-part series, Returning JSONP from Java EE RESTful Web Services Using jQuery, Jersey, and GlassFish, we created a Jersey-specific RESTful web service from a database using EclipseLink (JPA 2.0 Reference Implementation), Jersey (JAX-RS Reference Implementation), JAXB, and the Jackson Java JSON-processor. The service and associated entity class were mapped to a copy of Microsoft SQL Server’s Adventure Works database. An HTML and jQuery-based client called the service, which returned a JSONP response payload. The JSON data it contained was formatted and displayed in a simple HTML table, in a web-browser.

Objectives

In this post, we will extend the previous example to the mobile platform. Using jQuery and jQuery Mobile JavaScript libraries, we will call two RESTful web services and display the resulting JSONP data using the common list/detail UX design pattern. We will display a list of Adventure Works employees. When the end-user clicks on an employee in the web-browser, a new page will display detailed demographic information about that employee.

Similar to the previous post, when the client website is accessed by the end-user in a mobile web browser, the client site’s HTML, CSS, and JavaScript files are downloaded and cached on the end-user’s machine. The JavaScript file, using jQuery and Ajax, makes a call to the RESTful web service, which returns JSON (or, in this case, JSONP). This simulates a typical cross-domain situation, where a client needs to consume RESTful web services from a remote source. Such requests are normally blocked by the same-origin policy, but that restriction is overcome by returning JSONP to the client, which wraps the JSON payload in a function call.

We will extend both the ‘JerseyRESTfulServices’ and ‘JerseyRESTfulClient’ projects we built in the last series of posts. Here are the high-level steps we will walk-through in this post:

  1. Create a second view (virtual table) in the Adventure Works database;
  2. Create a second entity class that maps to the new database view;
  3. Modify the existing entity class, adding JAXB and Jackson JSON annotations;
  4. Create a second Jersey-specific RESTful web service from the new entity using Jersey and Jackson;
  5. Modify the existing Jersey-specific RESTful web service, adding one new method;
  6. Modify the web.xml file to allow us to use natural JSON notation;
  7. Implement a JAXBContext resolver to serialize the JSON using natural JSON notation;
  8. Create a simple list/detail two-page mobile HTML5 website using jQuery Mobile;
  9. Use jQuery, Ajax, and CSS to call, parse, and display the JSONP returned by the service.

RESTful Web Services Project

When we are done, the final RESTful web services project will look like the screen-grab, below. It will contain (2) entity classes, (2) RESTful web service classes, (1) JAXBContext resolver class, and the web.xml configuration file:

JerseyRESTfulServices Project View in NetBeans

1: Create the Second Database View
Create a new database view, vEmployeeNames, in the Adventure Works database:

USE [AdventureWorks]
GO

SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO

CREATE VIEW [HumanResources].[vEmployeeNames]
AS
SELECT TOP (100) PERCENT BusinessEntityID, REPLACE(RTRIM(LastName 
     + COALESCE (' ' + Suffix + '', N'') + COALESCE (', ' + FirstName + ' ', N'') 
     + COALESCE (MiddleName + ' ', N'')), '  ', ' ') AS FullName
FROM Person.Person
WHERE (PersonType = 'EM')
ORDER BY FullName
GO

2: Create the Second Entity
Add the new VEmployeeNames.java entity class, mapped to the vEmployeeNames database view, using NetBeans’ ‘Entity Classes from Database…’ wizard. Then, modify the class to match the code below.

package entities;

import java.io.Serializable;
import javax.persistence.Basic;
import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.NamedQueries;
import javax.persistence.NamedQuery;
import javax.persistence.Table;
import javax.validation.constraints.NotNull;
import javax.validation.constraints.Size;
import javax.xml.bind.annotation.XmlRootElement;
import javax.xml.bind.annotation.XmlType;

@Entity
@Table(name = "vEmployeeNames", catalog = "AdventureWorks", schema = "HumanResources")
@XmlRootElement(name = "vEmployeeNames")
@NamedQueries({
    @NamedQuery(name = "VEmployeeNames.findAll", query = "SELECT v FROM VEmployeeNames v"),
    @NamedQuery(name = "VEmployeeNames.findByBusinessEntityID", query = "SELECT v FROM VEmployeeNames v WHERE v.businessEntityID = :businessEntityID"),
    @NamedQuery(name = "VEmployeeNames.findByFullName", query = "SELECT v FROM VEmployeeNames v WHERE v.fullName = :fullName")})
public class VEmployeeNames implements Serializable {

    private static final long serialVersionUID = 1L;
    @Basic(optional = false)
    @NotNull
    @Id
    @Column(name = "BusinessEntityID")
    private int businessEntityID;
    @Basic(optional = false)
    @NotNull
    @Size(min = 1, max = 102)
    @Column(name = "FullName")
    private String fullName;

    public VEmployeeNames() {
    }

    public int getBusinessEntityID() {
        return businessEntityID;
    }

    public void setBusinessEntityID(int businessEntityID) {
        this.businessEntityID = businessEntityID;
    }

    public String getFullName() {
        return fullName;
    }

    public void setFullName(String fullName) {
        this.fullName = fullName;
    }
}

3: Modify the Existing Entity
Modify the existing VEmployee.java entity class to use JAXB and Jackson JSON annotations, as shown below (class code abridged). Note the addition of the @XmlType(propOrder = { "businessEntityID"... }) annotation on the class, the @JsonProperty(value = ...) annotation on each member variable, and the @Id annotation on businessEntityID, which serves as the entity’s primary key. We will see the advantages of the first two annotations later in the post, when we return the JSON to the client.

package entities;

import java.io.Serializable;
import javax.persistence.Basic;
import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.NamedQueries;
import javax.persistence.NamedQuery;
import javax.persistence.Table;
import javax.validation.constraints.NotNull;
import javax.validation.constraints.Size;
import javax.xml.bind.annotation.XmlRootElement;
import javax.xml.bind.annotation.XmlType;
import org.codehaus.jackson.annotate.JsonProperty;

@Entity
@Table(name = "vEmployee", catalog = "AdventureWorks", schema = "HumanResources")
@XmlRootElement
@NamedQueries({
    @NamedQuery(name = "VEmployee.findAll", query = "SELECT v FROM VEmployee v"),
    ...})
@XmlType(propOrder = {
    "businessEntityID",
    "title",
    "firstName",
    "middleName",
    "lastName",
    "suffix",
    "jobTitle",
    "phoneNumberType",
    "phoneNumber",
    "emailAddress",
    "emailPromotion",
    "addressLine1",
    "addressLine2",
    "city",
    "stateProvinceName",
    "postalCode",
    "countryRegionName",
    "additionalContactInfo"
})
public class VEmployee implements Serializable {

    private static final long serialVersionUID = 1L;
    @Basic(optional = false)
    @NotNull
    @Id
    @JsonProperty(value = "Employee ID")
    private int businessEntityID;
    @Size(max = 8)
    @JsonProperty(value = "Title")
    private String title;
    @Basic(optional = false)
    @NotNull
    @Size(min = 1, max = 50)
    @JsonProperty(value = "First Name")
    private String firstName;
    @Size(max = 50)
    @JsonProperty(value = "Middle Name")
    private String middleName;
    @Basic(optional = false)
    @NotNull
    @Size(min = 1, max = 50)
    @JsonProperty(value = "Last Name")
    private String lastName;
    @Size(max = 10)
    @JsonProperty(value = "Suffix")
    private String suffix;
    @Basic(optional = false)
    @NotNull
    @Size(min = 1, max = 50)
    @JsonProperty(value = "Job Title")
    private String jobTitle;
    @Size(max = 25)
    @JsonProperty(value = "Phone Number")
    private String phoneNumber;
    @Size(max = 50)
    @JsonProperty(value = "Phone Number Type")
    private String phoneNumberType;
    @Size(max = 50)
    @JsonProperty(value = "Email Address")
    private String emailAddress;
    @Basic(optional = false)
    @NotNull
    @JsonProperty(value = "Email Promotion")
    private int emailPromotion;
    @Basic(optional = false)
    @NotNull
    @Size(min = 1, max = 60)
    @JsonProperty(value = "Address Line 1")
    private String addressLine1;
    @Size(max = 60)
    @JsonProperty(value = "Address Line 2")
    private String addressLine2;
    @Basic(optional = false)
    @NotNull
    @Size(min = 1, max = 30)
    @JsonProperty(value = "City")
    private String city;
    @Basic(optional = false)
    @NotNull
    @Size(min = 1, max = 50)
    @JsonProperty(value = "State or Province Name")
    private String stateProvinceName;
    @Basic(optional = false)
    @NotNull
    @Size(min = 1, max = 15)
    @JsonProperty(value = "Postal Code")
    private String postalCode;
    @Basic(optional = false)
    @NotNull
    @Size(min = 1, max = 50)
    @JsonProperty(value = "Country or Region Name")
    private String countryRegionName;
    @Size(max = 2147483647)
    @JsonProperty(value = "Additional Contact Info")
    private String additionalContactInfo;

    public VEmployee() {
    }
    ...
}

4: Create the New RESTful Web Service
Add the new VEmployeeNamesFacadeREST.java RESTful web service class using NetBeans’ ‘RESTful Web Services from Entity Classes…’ wizard. Then, modify the new class, adding the new findAllJSONP() method shown below (class code abridged). This method returns the same entities as the default findAll() method, which delegates to the parent AbstractFacade.java class, but it builds its own criteria query so the results can be ordered by the employees’ full names. More importantly, findAllJSONP() returns JSONP instead of the XML or JSON returned by findAll(). This is done by passing the query results to a new instance of Jersey’s JSONWithPadding class (com.sun.jersey.api.json.JSONWithPadding).

package service;

import com.sun.jersey.api.json.JSONWithPadding;
import entities.VEmployeeNames;
import java.util.ArrayList;
import java.util.Collection;
import java.util.List;
import javax.ejb.Stateless;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
import javax.persistence.criteria.CriteriaBuilder;
import javax.persistence.criteria.CriteriaQuery;
import javax.persistence.criteria.Root;
import javax.ws.rs.Consumes;
import javax.ws.rs.DELETE;
import javax.ws.rs.GET;
import javax.ws.rs.POST;
import javax.ws.rs.PUT;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.Produces;
import javax.ws.rs.QueryParam;
import javax.ws.rs.core.GenericEntity;

@Stateless
@Path("entities.vemployeenames")
public class VEmployeeNamesFacadeREST extends AbstractFacade<VEmployeeNames> {
    ...
    @GET
    @Path("jsonp")
    @Produces({"application/javascript"})
    public JSONWithPadding findAllJSONP(@QueryParam("callback") String callback) {
        CriteriaBuilder cb = getEntityManager().getCriteriaBuilder();
        CriteriaQuery cq = cb.createQuery();
        Root empRoot = cq.from(VEmployeeNames.class);
        cq.select(empRoot);
        cq.orderBy(cb.asc(empRoot.get("fullName")));
        javax.persistence.Query q = getEntityManager().createQuery(cq);

        List<VEmployeeNames> employees = q.getResultList();
        return new JSONWithPadding(
                new GenericEntity<Collection<VEmployeeNames>>(employees) {
                }, callback);
    }
    ...
}

5: Modify the Existing Service
Modify the existing VEmployeeFacadeREST.java RESTful web service class, adding the findJSONP() method shown below (class code abridged). This method calls the same super.find({id}) method in the AbstractFacade.java parent class as the default find({id}) method does, but returns JSONP instead of XML or JSON. As with the previous service class above, this is done by passing the results to a new instance of Jersey’s JSONWithPadding class (com.sun.jersey.api.json.JSONWithPadding). There are no changes required to the default AbstractFacade.java class.

package service;

import com.sun.jersey.api.json.JSONWithPadding;
import entities.VEmployee;
import java.util.ArrayList;
import java.util.Collection;
import java.util.List;
import javax.ejb.Stateless;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
import javax.persistence.criteria.CriteriaBuilder;
import javax.persistence.criteria.CriteriaQuery;
import javax.persistence.criteria.Root;
import javax.ws.rs.Consumes;
import javax.ws.rs.DELETE;
import javax.ws.rs.GET;
import javax.ws.rs.POST;
import javax.ws.rs.PUT;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.Produces;
import javax.ws.rs.QueryParam;
import javax.ws.rs.core.GenericEntity;

@Stateless
@Path("entities.vemployee")
public class VEmployeeFacadeREST extends AbstractFacade<VEmployee> {
    ...
    @GET
    @Path("{id}/jsonp")
    @Produces({"application/javascript"})
    public JSONWithPadding findJSONP(@PathParam("id") Integer id,
            @QueryParam("callback") String callback) {
        List<VEmployee> employees = new ArrayList<VEmployee>();
        employees.add(super.find(id));
        return new JSONWithPadding(
                new GenericEntity<Collection<VEmployee>>(employees) {
                }, callback);
    }
    ...
}
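
For reference, both facade classes above extend the AbstractFacade.java class that NetBeans generated along with the services, and which remains unmodified. Below is a minimal sketch of the portion of that parent class relevant to the new methods, assuming the default output of the NetBeans wizard (abridged here for brevity).

package service;

import java.util.List;
import javax.persistence.EntityManager;

// Sketch of the NetBeans-generated parent facade (abridged); the actual
// generated class also includes create, edit, remove, findRange, and count methods.
public abstract class AbstractFacade<T> {

    private Class<T> entityClass;

    public AbstractFacade(Class<T> entityClass) {
        this.entityClass = entityClass;
    }

    protected abstract EntityManager getEntityManager();

    // Called by the default find({id}) method and by the new findJSONP() method
    public T find(Object id) {
        return getEntityManager().find(entityClass, id);
    }

    // Called by the default findAll() method
    public List<T> findAll() {
        javax.persistence.criteria.CriteriaQuery cq =
                getEntityManager().getCriteriaBuilder().createQuery();
        cq.select(cq.from(entityClass));
        return getEntityManager().createQuery(cq).getResultList();
    }
}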

6: Allow POJO JSON Support
Add the JSONConfiguration.FEATURE_POJO_MAPPING servlet init parameter to web.xml, as shown below (xml abridged). According to the Jersey website, this will allow us to use POJO support, the easiest way to convert our Java Objects to JSON. It is based on the Jackson library.

<?xml version="1.0" encoding="UTF-8"?>
<web-app version="3.0" xmlns="http://java.sun.com/xml/ns/javaee" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://java.sun.com/xml/ns/javaee http://java.sun.com/xml/ns/javaee/web-app_3_0.xsd">
    <servlet>
        <servlet-name>ServletAdaptor</servlet-name>
        <servlet-class>com.sun.jersey.spi.container.servlet.ServletContainer</servlet-class>
        <init-param>
            <description>Multiple packages, separated by semicolon(;), can be specified in param-value</description>
            <param-name>com.sun.jersey.config.property.packages</param-name>
            <param-value>service</param-value>
        </init-param>
        <init-param>
            <param-name>com.sun.jersey.api.json.POJOMappingFeature</param-name>
            <param-value>true</param-value>
        </init-param>
        ...

7: Implement a JAXBContext Resolver
Create the JAXBContextResolver.java class, shown below. This allows us to serialize the JSON using natural JSON notation. A good explanation of the use of a JAXBContext resolver can be found on the Jersey website.

package config;

import com.sun.jersey.api.json.JSONConfiguration;
import com.sun.jersey.api.json.JSONJAXBContext;
import javax.ws.rs.ext.ContextResolver;
import javax.ws.rs.ext.Provider;
import javax.xml.bind.JAXBContext;

@Provider
public class JAXBContextResolver implements ContextResolver<JAXBContext> {

    JAXBContext jaxbContext;
    private Class[] types = {entities.VEmployee.class, entities.VEmployeeNames.class};

    public JAXBContextResolver() throws Exception {
        this.jaxbContext =
                new JSONJAXBContext(JSONConfiguration.natural().build(), types);
    }

    @Override
    public JAXBContext getContext(Class<?> objectType) {
        for (Class type : types) {
            if (type == objectType) {
                return jaxbContext;
            }
        }
        return null;
    }
}

What is Natural JSON Notation?
According to the Jersey website, “with natural notation, Jersey will automatically figure out how individual items need to be processed, so that you do not need to do any kind of manual configuration. Java arrays and lists are mapped into JSON arrays, even for single-element cases. Java numbers and booleans are correctly mapped into JSON numbers and booleans, and you do not need to bother with XML attributes, as in JSON, they keep the original names.”

What does that mean? Better yet, what does that look like? Here is an example of an employee record, first as plain old JAXB JSON in a JSONP wrapper:

callback({"vEmployee":{"businessEntityID":"211","firstName":"Hazem","middleName":"E","lastName":"Abolrous","jobTitle":"Quality Assurance Manager","phoneNumberType":"Work","phoneNumber":"869-555-0125","emailAddress":"hazem0@adventure-works.com","emailPromotion":"0","addressLine1":"5050 Mt. Wilson Way","city":"Kenmore","stateProvinceName":"Washington","postalCode":"98028","countryRegionName":"United States"}})

And second, JSON wrapped in JSONP, using Jersey’s natural notation. Note the differences in the way the parent vEmployee node, numbers, and nulls are handled in natural JSON notation.

callback([{"Employee ID":211,"Title":null,"First Name":"Hazem","Middle Name":"E","Last Name":"Abolrous","Suffix":null,"Job Title":"Quality Assurance Manager","Phone Number Type":"Work","Phone Number":"869-555-0125","Email Address":"hazem0@adventure-works.com","Email Promotion":0,"Address Line 1":"5050 Mt. Wilson Way","Address Line 2":null,"City":"Kenmore","State or Province Name":"Washington","Postal Code":"98028","Country or Region Name":"United States","Additional Contact Info":null}])

Mobile Client Project

When we are done with the mobile client, the final NetBeans project should look like the screen-grab, below. Note the inclusion of jQuery Mobile 1.2.0. You will need to download the library and its associated components, and install them in the project. I chose to keep them in a separate folder, since there were several files included with the library. This example requires a few new features introduced in jQuery Mobile 1.2.0, so make sure to get that version or later.

JerseyRESTfulClient Project View in NetBeans

8: Create a List/Detail Mobile HTML Site
The process to display the data from the Adventure Works database in the mobile web browser is identical to the process used in the last series of posts. We are still using jQuery with Ajax and calling the same services, but with a few new methods. The biggest change is the use of jQuery Mobile to display the employee data. The jQuery Mobile library, especially with the release of 1.2.0, makes displaying data quick and elegant. The library does all the hard work under the covers, with features such as the listview control. We simply need to use jQuery and Ajax to retrieve the data and pass it to the control.

We will create three new files: the HTML, CSS, and JavaScript files. We add a ‘.m’ to the file names to differentiate them from the normal web-browser files from the last post. As with the previous post, the HTML page and CSS file are minimal. The HTML page uses the jQuery Mobile multi-page template available on the jQuery Mobile website. Although it appears to the end-user as two different web pages, it is actually a single-page site.

Source code for employee.m.html:

<!DOCTYPE html>
<html>
    <head> 
        <title>Employee List</title> 
        <meta name="viewport" content="width=device-width, initial-scale=1"> 
        <meta http-equiv="Content-Type" content="text/html; charset=UTF-8">

        <link rel="stylesheet" href="jquery.mobile-1.2.0/jquery.mobile-1.2.0.min.css" />
        <link type="text/css" rel="stylesheet" href="employees.m.css" />

        <script src="jquery-1.8.2.min.js" type="text/javascript"></script>
        <script src="jquery.mobile-1.2.0/jquery.mobile-1.2.0.min.js" type="text/javascript"></script>
        <script src="employees.m.js" type="text/javascript"></script>
    </head> 
    <body> 
        <!-- Start of first page: #one -->
        <div data-role="page" id="one" data-theme="b">
            <div data-role="header" data-theme="b">
                <h1>Employee List</h1>
            </div><!-- /header -->
            <div data-role="content">	
                <div id="errorMessage"></div>
                <div class="ui-grid-solo">
                    <form>
                        <ul data-role="listview" data-filter="true" 
                            id="employeeList" data-theme="c" data-autodividers="true">
                        </ul>
                    </form>
                </div>
            </div><!-- /content -->
            <div data-role="footer" data-theme="b">
                <h4>Programmatic Ponderings, 2012</h4>
            </div><!-- /footer -->
        </div><!-- /page -->
        
        <!-- Start of second page: #two -->
        <div data-role="page" id="two" data-theme="c">
            <div data-role="header" data-theme="b">
                <a href="#one" data-icon="back">Return</a>
                <h1>Employee Detail</h1>
            </div><!-- /header -->
            <div data-role="content" data-theme="c">	
                <div id="employeeDetail"></div>
            </div><!-- /content -->
            <div data-role="footer" data-theme="b">
                <h4>Programmatic Ponderings, 2012</h4>
            </div><!-- /footer -->
        </div><!-- /page two -->
    </body>
</html>

Source code for employee.m.css:

#employeeList {
    clear:both;
}

#employeeDetail div {
    padding-top: 2px;
    white-space: nowrap;
}

.field {
    margin-bottom: 0px;
    font-size: smaller;
    color: #707070;
}

.value {
    font-weight: bolder;
    padding-bottom: 12px;
    border-bottom: 1px #d0d0d0 solid;
}

.ui-block-a{
    padding-left: 6px;
    padding-right: 6px;
}

.ui-grid-a{
    padding-bottom: 12px;
    padding-top: -6px;
}

8: Retrieve, Parse, and Display the Data
The mobile JavaScript file below is identical in many ways to the JavaScript file used in the last series of posts for a non-mobile browser. One useful change is the addition of two arguments to the function that calls jQuery.ajax(). The address of the service (URI) that the jQuery.ajax() method requests, and the function Ajax calls after successful completion, are both passed into the callService(uri, successFunction) function as arguments. This allows us to reuse the Ajax method for different purposes. In this case, we call the function once to populate the Employee List with the full names of the employees. We call it again to populate the Employee Detail page with demographic information for a single employee chosen from the Employee List. The two calls use different URIs, representing the two different RESTful web services, which in turn are associated with the two different entities, which in turn are mapped to the two different database views.

callService = function (uri, successFunction) {
        $.ajax({
            cache: true,
            url: uri,
            data: "{}",
            type: "GET",
            contentType: "application/javascript",
            dataType: "jsonp",
            error: ajaxCallFailed,
            failure: ajaxCallFailed,
            success: successFunction
        });          
    };

The rest of the functions are self-explanatory. There are two calls to the jQuery Ajax method to return data from the service, two functions to parse and format the JSONP for display in the browser, and one jQuery method that adds click events to the Employee List. We perform a bit of string manipulation to embed the employee id into the id property of each list item (li element). Later, when the end-user clicks on the employee name in the list, the employee id is extracted from the id property of the selected list item and passed back to the service to retrieve the employee detail. The HTML snippet below shows how a single employee row appears in the jQuery Mobile listview. Note the id property of the li element, id="empId_121", for employee id 121.

<li id="empId_121" class="ui-btn ui-btn-icon-right ui-li-has-arrow ui-li ui-btn-up-c" 
    data-corners="false" data-shadow="false" data-iconshadow="true" 
    data-wrapperels="div" data-icon="arrow-r" data-iconpos="right" data-theme="c">
    <div class="ui-btn-inner ui-li">
        <div class="ui-btn-text">
            <a class="ui-link-inherit" href="#">Ackerman, Pilar G</a>
        </div>
        <span class="ui-icon ui-icon-arrow-r ui-icon-shadow"> </span>
    </div>
</li>

To make this example work, you need to change the restfulWebServiceBaseUri variable to the server and port of the GlassFish domain running your RESTful web services. If you are testing the client locally on your mobile device, I suggest using the IP address of the GlassFish server rather than a domain name; your phone will be able to reach the IP address in your local wireless environment. At least on the iPhone, there is no easy way to change the hosts file to provide local domain-name resolution.

Source code for employee.m.js:

// ===========================================================================
// 
// Author: Gary A. Stafford
// Website: http://www.programmaticponderings.com
// Description: Call RESTful Web Services from mobile HTML pages
//              using jQuery mobile, Jersey, Jackson, and EclipseLink
// 
// ===========================================================================

// Immediate function
(function () {
    "use strict";
    
    var restfulWebServiceBaseUri, employeeListFindAllUri, employeeByIdUri,
    callService, ajaxCallFailed,
    getEmployeeById, displayEmployeeList, displayEmployeeDetail;
    
    // Base URI of RESTful web service
    restfulWebServiceBaseUri = "http://your_server_name_or_ip:8080/JerseyRESTfulServices/webresources/";
    
    // URI maps to service.VEmployeeNamesFacadeREST.findAllJSONP
    employeeListFindAllUri = restfulWebServiceBaseUri + "entities.vemployeenames/jsonp";
        
    // URI maps to service.VEmployeeFacadeREST.findJSONP
    employeeByIdUri = restfulWebServiceBaseUri + "entities.vemployee/{id}/jsonp";
    
    
    // Execute after the page one dom is fully loaded
    $(".one").ready(function () {        
        // Retrieve employee list
        callService(employeeListFindAllUri, displayEmployeeList);
        
        // Attach onclick event to each row of employee list on page one
        $("#employeeList").on("click", "li", function(event){
            getEmployeeById($(this).attr("id").split("empId_").pop());
        });
    });
      
    // Call a service URI and return JSONP to a function
    callService = function (Uri, successFunction) {
        $.ajax({
            cache: true,
            url: Uri,
            data: "{}",
            type: "GET",
            contentType: "application/javascript",
            dataType: "jsonp",
            error: ajaxCallFailed,
            failure: ajaxCallFailed,
            success: successFunction
        });          
    };
    
    // Called if ajax call fails
    ajaxCallFailed = function (jqXHR, textStatus) { 
        console.log("Error: " + textStatus);
        console.log(jqXHR);
        $("form").css("visibility", "hidden");
        $("#errorMessage").empty().
        append("Sorry, there was an error.").
        css("color", "red");
    };
    
    // Display employee list on page one
    displayEmployeeList = function (employee) {
        var employeeList = "";
                
        $.each(employee, function(index, employee) {
            employeeList = employeeList.concat(
                "<li id=empId_" + employee.businessEntityID.toString() + ">" + 
                "<a href='#'>" + 
                employee.fullName.toString() + "</a></li>");
        });
        
        $('#employeeList').empty();
        $('#employeeList').append(employeeList).listview("refresh", true);
    };
    
    // Display employee detail on page two
    displayEmployeeDetail = function(employee) {
        $.mobile.loading( 'show', {
            text: '',
            textVisible: false,
            theme: 'a',
            html: ""
            
        });
        window.location = "#two";
        var employeeDetail = "";
                
        $.each(employee, function(key, value) {
            $.each(value, function(key, value) {
                if(!value) {
                    value = "&nbsp;";
                }
                
                employeeDetail = employeeDetail.concat(
                    "<div class='detail'>" +
                    "<div class='field'>" + key + "</div>" +
                    "<div class='value'>" + value + "</div>" +
                    "</div>");   
            });
        });
        
        $("#employeeDetail").empty().append(employeeDetail);
    };
    
    // Retrieve employee detail based on employee id
    getEmployeeById = function (employeeID) {
        callService(employeeByIdUri.replace("{id}", employeeID), displayEmployeeDetail);
    };
} ());

The Final Result

Viewed in Google’s Chrome for Mobile web browser on iOS 6, the previous project’s Employee List looks pretty bland and un-mobile like:

Previous Project as Viewed in Google Chrome for Mobile Web Browser

However, with a little jQuery Mobile magic you get a simple yet effective and highly functional mobile web presentation. Seen below on page one, the Employee List is displayed in Safari on an iPhone 4 with iOS 6. It features some of the new capabilities of jQuery Mobile 1.2.0’s improved listview, including autodividers.

Employee List

Here again is the Employee List using the jQuery Mobile 1.2.0’s improved listview search filter bar:

Employee List – Filtered

Here is the Employee Detail on page two. Note the order and names of the fields. Remember when we annotated the VEmployee.java entity, adding the @XmlType(propOrder = {"businessEntityID", ...}) annotation to the class and the @JsonProperty(value = ...) annotations to each member variable? This is the result of those efforts; our JSON is delivered pre-sorted and titled the way we want, with no need to handle those functions on the client-side. This keeps the client loosely coupled to the data; it simply displays whichever key/value pairs are delivered in the JSONP response payload.

Employee Detail

Employee Detail – Bottom


Returning JSONP from Java EE RESTful Web Services Using jQuery, Jersey, and GlassFish – Part 2 of 2

Create a Jersey-specific Java EE RESTful web service, and an HTML-based client to call the service and display JSONP. Test and deploy the service and the client to different remote instances of GlassFish.

Background

In part 1 of this series, we created a Jersey-specific RESTful web service from a database using NetBeans. The service returns JSONP in addition to JSON and XML. The service was deployed to a GlassFish domain, running on a Windows box. On this same box is the SQL Server instance, running the Adventure Works database, from which the service obtains data, via the entity class.

Objectives

In part two of this series, we will create a simple web client to consume and display the JSONP returned by the RESTful web service. There are many options available for creating a service consumer (client), depending on your development platform and project requirements. We will keep it simple: no complex, compiled code, just HTML and JavaScript with jQuery, the well-known JavaScript library.

We will host the client on a separate GlassFish domain, running on an Ubuntu Linux VM using Oracle’s VM VirtualBox. This is a different machine than the one the service was installed on. When opened by the end-user in a web browser, the client files, including the JavaScript file that calls the service, are downloaded to the end-user’s machine. This simulates a typical cross-domain situation, where a client application needs to consume RESTful web services from a remote source. Such requests are normally blocked by the same-origin policy, but that restriction is overcome by returning JSONP to the client, which wraps the JSON payload in a function call.

Here are the high-level steps we will walk-through in part two:

  1. Create a simple HTML client using jQuery and ajax to call the RESTful web service;
  2. Add jQuery functionality to parse and display the JSONP returned by the service;
  3. Deploy the client to a separate remote instance of GlassFish using Apache Ant;
  4. Test the client’s ability to call the service across domains and display JSONP.

Creating the RESTful Web Service Client

New NetBeans Web Application Project
Create a new Java Web Application project in NetBeans. Name the project ‘JerseyRESTfulClient’. The choice of GlassFish server and domain where the project will be deployed is unimportant; we will use Apache Ant to deploy the client when we finish building the project. By default, I chose my local instance of GlassFish, for testing purposes.

Create a New Web Application Project in NetBeans

Name and Location of New Web Application Project

Server and Settings of New Web Application Project

Optional Frameworks to Include in New Web Application Project

View of New Web Application Project in NetBeans

Adding Files to Project
The final client project will contain four new files:

  1. employees.html – HTML web page that displays a list of emplo