Embedded Jetty, CDI-enabled, Jersey-based Java web service example

Java web services example project

Outdated post, there are better ways to do this today.


The development process and rollout of Java web applications usually entail dealing with setups such as development environments, production environments and various configurations external to the scope of the intended core deployment unit (the WAR). These obligatory duties make it a bit of a challenge to build an efficient product delivery process. Take the example of a large, distributed team with no locality constraints, a truly global team of contributors. In such a project, every developer across the team essentially has to manage dependencies, including a specific version of a Servlet container or JavaEE-compliant server and a set of configurations for said server/container. Furthermore, utilizing continuous delivery techniques for an (arguably) better DevOps culture becomes far too complicated: maintaining development, deployment and staging images becomes a necessity. One way to manage this introduced complexity is the use of image containers such as Docker and Vagrant; however, this introduces an extra set of tools to manage. Another solution would be using hypervisors for deployment, which means maintaining images of everything the Java web application needs to function in multiple environments. This option has the same shortcomings as the containers mentioned above, with the additional problems of platform dependence and setup complexity.

To deal with the issues above, I adopted the embedded server approach for Java-based web services. This approach gave me the advantage of being able to guarantee that my development team can clone the project from the repository and immediately start contributing value using the default development profile, without the need to worry about any form of configuration outside the cloned project. Distributing the project became a matter of making it available for cloning from the repository; instructions were limited and focused on adding functionality and coding standards, with little regard for configuration, dependencies and servers. Because of the encapsulated nature of the distributed project, where all necessary configurations are visible, verbose and programmer friendly, necessary external dependencies were easy to hook into the application, as they were controlled by configuration files with properties routing to development database servers and caching services.

When NOT to use this project setup

There are probably many project-specific cases where using this type of setup is a no-no, but I can only relate the cases I personally encountered. First, when targeting a large deployment server, one that can handle multiple deployed applications in a shared setup, it would be impractical to have each application spawn its own embedded container and hog its own otherwise shareable resources; a shared Servlet container or JavaEE server may be a better choice there.
Another case where it might be preferable to avoid the embedded server approach is when your distributed development unit includes bundled databases or non-Java dependencies. Although bundling interdependent application tiers is a questionable practice, you may have a compelling reason to do so; in that case, you may be better off using a different approach.
Finally, the embedded server approach won't really give you the benefits it promises if you have little control over the deployment environment. I once made the mistake of developing a large application using this setup, only to realize in the end that it had to run alongside other Java services deployed in a standalone instance of Apache Tomcat. The application required some NIO magic, hardware communications and JNDI resources that were hard to stitch into the new Servlet container deployment setup; it wasn't fun at all.

The moral of the story is that you need to have a broad knowledge of your project, team, development and deployment environments before choosing a framework for your implementation.

Source code

You can clone the project from Github at Service base example project.

Minimum Requirements

Unless you plan to bundle a JVM (think Atlassian's JIRA distribution) and a firewall setup with your distributed deployable application, deployment servers and development machines will have to provide these two components for your Java application to run. It would be great to eliminate all external dependencies, and it is quite possible, but the choice to keep these two required dependencies external to the distributed, configurable application looks like the best option we have for now. Packaging the JVM imposes a restriction on the deployment and development environments, a restriction that would be intrinsic to the distributed project/deployable package, because while Java applications are platform independent, the JVM itself is not. Jamming in firewall configuration is also a crippling step, as it too is platform/implementation dependent. To sum up, the two external requirements for both development and deployment machines are:

  • JVM (1.7)
  • Some form of a firewall setup, to control routing of communications (development machines can skip this, and work with unprivileged localhost ports)

Of course, you will also have to provide any external dependencies that your application uses, such as databases and caching servers. Your application will be introduced to these external dependencies using configuration files (Java standard properties files) within the distributed project.

Project structure

The project structure is very similar to the standard maven project setup.

Directory structure Description
src/main/java application Java sources
src/main/webapp application web sources
src/main/resources application resources
src/test/java/com/siphyc/mock test mockup sources
src/test/java/com/siphyc/test test sources (unit tests)
src/test/resources test resources
misc scripts, helpers, docs
LICENSE.md license statement
NOTICE.txt Notices and attributions required by libraries that the project depends on
README.md readme file
pom.xml Project Object Model file

Dummy functionality

This example application provides a registry for customers of a mobile repair shop. The system keeps track of two types of smartphones submitted for repairs, Androids and iPhones; each new entry has to contain the customer name, the status of the repair (a boolean), the date of entry, the date of update and the model of the smartphone. Although you may see some resemblance to a well-designed web service, with somewhat rational response codes and request handling, it's a pretty flawed system and should only be used for demonstration purposes.

Toolkits and frameworks

Server (Embedded Jetty)

Embedded Jetty is the servlet container of choice for Java-based web service implementations at Siphyc, the reasons being its ease of deployment and development platform independence. Developers may use any familiar IDE or text editor for development and jump right into contributing to the project's progress, without the need to configure a full-blown servlet container or JavaEE-compliant web server. The use of the Jetty embedded server also facilitates a fast deployment setup, as the deployment server machine requires only a working JVM and a properly set up firewall.

The basic required maven dependencies to include for jetty server in this project are:
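The fragment below is a sketch of what these dependencies might look like; the artifact list and version are assumptions on my part, so check the project's pom.xml for the authoritative set (jetty-webapp pulls in the servlet container, and jetty-plus provides the JNDI support used later):

```xml
<!-- illustrative artifacts and version; the project's pom.xml is authoritative -->
<dependency>
    <groupId>org.eclipse.jetty</groupId>
    <artifactId>jetty-webapp</artifactId>
    <version>9.2.14.v20151106</version>
</dependency>
<dependency>
    <groupId>org.eclipse.jetty</groupId>
    <artifactId>jetty-plus</artifactId>
    <version>9.2.14.v20151106</version>
</dependency>
```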


Because of the unconventional setup of this project, we also need to configure maven to build the project in a way that would make our deployable war package function without needing an external java container.

To achieve this, we will add the following to our POM.xml file, under plugins :
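A sketch of that maven-war-plugin configuration, reconstructed from the description that follows (the exclusion pattern is an assumption based on the jetty-*.jar wording):

```xml
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-war-plugin</artifactId>
    <configuration>
        <!-- add Main-Class: JServer to META-INF/MANIFEST.MF,
             making the WAR runnable with java -jar -->
        <archive>
            <manifest>
                <mainClass>JServer</mainClass>
            </manifest>
        </archive>
        <!-- keep server jars out of the application's WEB-INF/lib -->
        <packagingExcludes>WEB-INF/lib/jetty-*.jar</packagingExcludes>
    </configuration>
</plugin>
```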


In the above configuration of maven-war-plugin, we're instructing the archiver to include the entry (Main-Class: JServer) in the packaged WAR's META-INF/MANIFEST.MF. We're also instructing the archiver to exclude any jetty-*.jar from being packaged into the application WAR under WEB-INF/lib/, because these packages are meant to be server packages, not application libraries. This exclusion is done mainly to reduce the size of the deployable package and to prevent dependency clashes between the actual application libraries and the Jetty server's internal libraries. We're basically keeping tight control over the application's classpath. You will later see that, whenever we need some server classes to be exposed to the application, we expose them on a per-need basis using Jetty's own classpath controls.

Now that we've added JServer.class as the main class in the WAR's manifest, we need to set up the ground for it to work. We start by moving the class from its post-compilation location under ${project.build.directory}/classes/ to the root directory of the WAR. We do that by adding the following bit of XML to our pom.xml file:

<echo>Setting up embedded jetty main class ...</echo>
<move todir="${project.build.directory}/${project.artifactId}-${project.version}/">
    <fileset dir="${project.build.directory}/classes/">
        <include name="JServer.class"/>
    </fileset>
</move>

We're telling maven-antrun-plugin to execute a task that simply moves JServer.class to the root of the WAR before packaging.

Next, let's take a look at JServer.java:

import com.mysql.jdbc.jdbc2.optional.MysqlConnectionPoolDataSource;
import java.io.FileNotFoundException;
import java.io.IOException;
import org.eclipse.jetty.server.Server;
import org.eclipse.jetty.webapp.WebAppContext;
import org.slf4j.LoggerFactory;

import javax.naming.Reference;
import java.net.URL;
import java.security.ProtectionDomain;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.Properties;
import javax.naming.NamingException;
import org.eclipse.jetty.plus.jndi.Resource;
import org.eclipse.jetty.server.Connector;
import org.eclipse.jetty.server.Handler;
import org.eclipse.jetty.server.HttpConfiguration;
import org.eclipse.jetty.server.HttpConnectionFactory;
import org.eclipse.jetty.server.SecureRequestCustomizer;
import org.eclipse.jetty.server.ServerConnector;
import org.eclipse.jetty.server.SslConnectionFactory;
import org.eclipse.jetty.server.handler.ContextHandler;
import org.eclipse.jetty.server.handler.ContextHandlerCollection;
import org.eclipse.jetty.server.handler.DefaultHandler;
import org.eclipse.jetty.server.handler.HandlerList;
import org.eclipse.jetty.server.handler.SecuredRedirectHandler;
import org.eclipse.jetty.util.ssl.SslContextFactory;
import org.h2.jdbcx.JdbcConnectionPool;
public class JServer {

    final static org.slf4j.Logger logger = LoggerFactory.getLogger(JServer.class);

    static Server embed_server;
    private static Properties conf;

    // method bodies are elided here for brevity; see JServer.java in the repository
    public static void main(String[] args) throws Exception { /* ... */ }

    private static ServerConnector setupHttpsConnectors(int securePort, int actualSecurePort, String keystorePath, String keyStorePass) { /* ... */ }

    private static ServerConnector setupHttpConnectors(int port, int securePort) { /* ... */ }

    private static ContextHandlerCollection setupHttpsRedirect(WebAppContext webapp) { /* ... */ }

    private static WebAppContext setupWebapp() { /* ... */ }

    private static void setUpMySqlResource() { /* ... */ }

    private static void setUpDemoH2Resource() { /* ... */ }

    private static void initH2DemoInMemoryDatabase() throws SQLException { /* ... */ }

    private static Properties loadConfig() throws FileNotFoundException, IOException { /* ... */ }

    public static void stopServer() throws Exception { /* ... */ }
}

Ignoring all the configuration business for now, we essentially have a main function that bootstraps the embedded Jetty server, adding all the configuration necessary for it to serve our Java web application. What's important to note here is that throughout the bootstrapping process we use a lot of imports from the dependencies we added in our pom.xml file; for JServer.class to find these dependencies, we have to configure Maven to guarantee their existence on the server classpath.

To achieve this, we'll use our maven-dependency-plugin, adding the following bits to our pom.xml file:
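A sketch of what that execution might look like; the execution id "jetty-classpath" is taken from the logging section later in this post, but the goal, phase and includes list are assumptions, and the project's pom.xml likely lists more groups:

```xml
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-dependency-plugin</artifactId>
    <executions>
        <execution>
            <id>jetty-classpath</id>
            <phase>prepare-package</phase>
            <goals>
                <goal>unpack-dependencies</goal>
            </goals>
            <configuration>
                <!-- unpack the server (and logging) jars into the WAR staging
                     directory, so their classes end up at the WAR root -->
                <includeGroupIds>org.eclipse.jetty,org.slf4j,ch.qos.logback</includeGroupIds>
                <outputDirectory>${project.build.directory}/${project.artifactId}-${project.version}</outputDirectory>
            </configuration>
        </execution>
    </executions>
</plugin>
```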


In the above listing, we're copying the libraries needed to bootstrap and run the server into the root directory of the WAR file, which makes them accessible to the server. Note that this does not remove these dependencies from the application's classpath, as they still exist under WEB-INF/lib.

This concludes the build system's configuration to run our web application using embedded jetty.

JAX-RS implementation (Jersey)

Jersey is an open source RESTful web services framework and the JAX-RS reference implementation. It ships with a convenient test framework and extra goodies that facilitate developing RESTful web services, seamlessly supporting the exposure of data in a variety of representation media types while abstracting away the low-level details of client-server communication.

CDI implementation (WELD)

Contexts and Dependency Injection (CDI) is a JCP standard that is required for JavaEE compliance in Java server containers. Although we at Siphyc do not follow the JavaEE standards to the letter, we recognize the importance of CDI in all our Java web applications. For that we use WELD, the reference implementation of CDI: Contexts and Dependency Injection for the Java EE Platform.

The dependencies needed to integrate the Jersey framework and WELD into our project, and have them play nicely with each other, are:

<!--WELD dependencies-->
<!--JERSEY-weld integration-->
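A sketch of those dependency declarations; the artifact ids for the WELD servlet bundle and the Jersey servlet container, and all the versions, are assumptions to be checked against the project's pom.xml:

```xml
<!--WELD dependencies-->
<dependency>
    <groupId>org.jboss.weld.servlet</groupId>
    <artifactId>weld-servlet</artifactId>
    <version>2.2.9.Final</version>
</dependency>
<!--JERSEY dependencies-->
<dependency>
    <groupId>org.glassfish.jersey.containers</groupId>
    <artifactId>jersey-container-servlet</artifactId>
    <version>2.17</version>
</dependency>
<!--JERSEY-weld integration-->
<dependency>
    <groupId>org.glassfish.jersey.ext.cdi</groupId>
    <artifactId>jersey-cdi1x</artifactId>
    <version>2.17</version>
</dependency>
```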

In order to register our WELD-provided bean manager as a JNDI resource, we have:

        new org.eclipse.jetty.plus.jndi.Resource(webapp, "BeanManager",
                new Reference("javax.enterprise.inject.spi.BeanManager", "org.jboss.weld.resources.ManagerObjectFactory", null));

This registers a JNDI resource "BeanManager" in the webapp's scope. Some boilerplate code is needed for this to work; you can check out the source of JServer.java to understand the requirements for registering JNDI resources in embedded Jetty, or take a look at the embedded Jetty JNDI documentation. Note that this can also be done in XML.

We move on to registering WELD's listener and configuring Jersey's servlet in our override-web.xml (or web.xml) under webapp/WEB-INF/:

<web-app version="3.0" xmlns="http://java.sun.com/xml/ns/javaee" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://java.sun.com/xml/ns/javaee http://java.sun.com/xml/ns/javaee/web-app_3_0.xsd">

    <listener>
        <listener-class>org.jboss.weld.environment.servlet.Listener</listener-class>
    </listener>

    <!-- the servlet-name and url-pattern below are illustrative -->
    <servlet>
        <servlet-name>jersey</servlet-name>
        <servlet-class>org.glassfish.jersey.servlet.ServletContainer</servlet-class>
        <init-param>
            <param-name>javax.ws.rs.Application</param-name>
            <param-value>com.siphyc.binder.AppJerseyBinder</param-value>
        </init-param>
        <!--async response support ...-->
        <async-supported>true</async-supported>
    </servlet>

    <servlet-mapping>
        <servlet-name>jersey</servlet-name>
        <url-pattern>/services/*</url-pattern>
    </servlet-mapping>

</web-app>

We're passing com.siphyc.binder.AppJerseyBinder as the javax.ws.rs.Application implementation used to configure Jersey's servlet, then registering the servlet to handle requests to the defined url-pattern, which causes all detected Jersey web services to be exposed under that url pattern.

In our custom AppJerseyBinder, we define which package in our application contains Jersey's web services, and we register a class that allows our web services to handle multipart form data:

public class AppJerseyBinder extends ResourceConfig {

    public AppJerseyBinder() {
        // recursively scan the endpoints package for Jersey resources
        String[] packagesToScan = {"com.siphyc.endpoints"};
        packages(true, packagesToScan);
        // allow web services to handle multipart form data
        register(MultiPartFeature.class);
    }
}

Note that without the jersey-cdi1x dependency, we would have to manually hook Jersey to WELD to allow WELD-managed beans to be injected into Jersey services, and we would have to do a lot of manual binding between implementations and contracts in our application to achieve this.

A final requirement to get WELD to play nicely with our embedded Jetty server is exposing a set of server classes that are needed by WELD's decorator in its initialization phase. Since we've excluded all server jars from our application's WEB-INF/lib directory, we have to explicitly tell Jetty to expose the necessary server classes to the WELD decorator in our JServer.java class:
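As a hedged sketch, assuming the Jetty 9.x WebAppContext API (the exact class list lives in JServer.java; a "-" prefix in a server-class pattern removes the match from Jetty's hidden server classes, making it visible to the webapp):

```java
// illustrative only: expose selected server classes to the webapp's classloader
webapp.prependServerClass("-org.eclipse.jetty.server.handler.ContextHandler");
webapp.prependServerClass("-org.eclipse.jetty.server.handler.ContextHandlerCollection");
```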


So we're telling the Jetty server to prepend the listed server classes to the WebAppContext's classpath, effectively exposing them to the WELD decorator.

JPA implementation (EclipseLink)

The Java Persistence API is also a JavaEE standard that is fully adopted and integrated into Siphyc's Java-based projects. We utilize the EclipseLink implementation, it being the reference implementation of the JPA standard. Note that in most complex setups, EclipseLink's caching capabilities are disabled in favor of external Memcached server deployments.

There isn't much being done here; in our JServer.java class, we're registering a JNDI resource the same way we did with our CDI BeanManager:
private static void setUpMySqlResource() {
        try {
            MysqlConnectionPoolDataSource dataPool = new MysqlConnectionPoolDataSource();
            // ... datasource URL and credentials are set from conf here ...
            if (conf.getProperty("DB_useSSL").equalsIgnoreCase("true")) {
                System.setProperty("javax.net.ssl.trustStore", conf.getProperty("keystorePath"));
                System.setProperty("javax.net.ssl.trustStorePassword", conf.getProperty("keystorePass"));
                // debugging ssl connections:
                // System.setProperty("javax.net.debug", "all");
            }
            new Resource(null, "jdbc/conName", dataPool);
        } catch (NamingException ex) {
            // unlikely
            logger.info("Error setting up Mysql resource \n" + ex);
        }
    }

This is an example of registering a MySQL connection pool as a JNDI resource. Note that this can also be done in XML.

To have a fully functional persistence unit, we add persistence.xml to our META-INF directory:

<persistence version="2.0" xmlns="http://java.sun.com/xml/ns/persistence" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://java.sun.com/xml/ns/persistence http://java.sun.com/xml/ns/persistence/persistence_2_0.xsd">
  <persistence-unit name="persistenceUnit" transaction-type="RESOURCE_LOCAL">
    <!-- provider, data source and entity classes elided; see the project source -->
  </persistence-unit>
</persistence>

This defines our data source as non-JTA, so the container will not be managing transactions; we'll have to do that ourselves. You can check out the class com/siphyc/dao/JPAcommonUtils.java and the JPA controllers in the same package to see how this is done. Note that the JPA controllers are mostly auto-generated by the NetBeans IDE (assuming Tomcat as the servlet container), then modified a bit to eliminate the dependency on container-managed transactions and the container-provided JPA entity manager. You may choose to register a JTA implementation (like Atomikos) with Jetty and go with all that container-managed injection magic, which is a fine setup for transaction-dependent functionality, but I generally prefer the fine control afforded by managing the transactions myself, since transaction business is not usually a big part of carefully considered designs.

In real-life projects, I disable EclipseLink's cache in favor of Memcached when necessary, adding a Memcached layer (within the DAO layer) that is controlled by a standard Memcached Java client, which gives better-tailored control over Memcached-maintained objects and how they're accessed. This path has the advantage of detaching the lower part of my DAO layer (entity management, persistence, data source ...) from the caching implementation, with the downside of introducing more work, since I have to intercept all communications and update memory-maintained objects.

Logging framework (SLF4J)

Nothing special here; the required dependencies are:

 <!--logging framework -->
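Presumably something along these lines, with logback as the SLF4J backend (versions here are illustrative, not taken from the project):

```xml
<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-api</artifactId>
    <version>1.7.12</version>
</dependency>
<dependency>
    <groupId>ch.qos.logback</groupId>
    <artifactId>logback-classic</artifactId>
    <version>1.1.3</version>
</dependency>
```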

You can explore the file logback.xml to see how logging is handled for each application package. What's important to note here is that for us to log server activities, we need to expose the logback configuration to the server; this is done exactly the same way we moved JServer.class using the antrun plugin:

<echo>Setting up logback config ...</echo>
<move todir="${project.build.directory}/${project.artifactId}-${project.version}/">
    <fileset dir="${project.build.directory}/classes/">
        <include name="logback.xml"/>
    </fileset>
</move>

We also exposed the necessary logging classes by copying them to the root of the WAR, inside the execution "jetty-classpath" in our pom.xml file.

Project configuration

To control certain properties of our project, we use Maven profiles to select the properties that are used at run time when the application is deployed. These properties affect both the application and the embedded server; since both are assembled into a WAR at the same time, we can control everything from which ports the server will listen on, to which resources the application will have access, to whether the server will use SSL, all in simple Java properties syntax. This is where the benefits of this project setup are most apparent: your server configuration is no longer decoupled from your application's; all configuration lives in one place, following the exact same syntax. All configuration files follow the standard Java properties format.

Let's start by exploring the configuration setup in this example project:

File Description
src/main/resources/active.properties The activated configuration variables that will be used by the application and server at runtime
src/main/resources/connections/conn.properties A convenient properties file that holds references to all the external resources that could be used by the application
src/main/resources/dev/conf.properties The file that defines the configuration variables used when the 'dev' maven profile is used when assembling the WAR file
src/main/resources/dev2/conf.properties The file that defines the configuration variables used when the 'dev2' maven profile is used when assembling the WAR file

To make these configuration files work, we have to activate filtering of resources in our pom.xml file:
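As a sketch (the authoritative fragment is in the project's pom.xml), the build section would enable filtering roughly like this, with the profile-specific conf.properties acting as the filter file that supplies the placeholder values:

```xml
<build>
    <!-- values come from the active profile's conf.properties -->
    <filters>
        <filter>src/main/resources/${build.profile.id}/conf.properties</filter>
    </filters>
    <!-- filter resources so ${...} placeholders in active.properties are replaced -->
    <resources>
        <resource>
            <directory>src/main/resources</directory>
            <filtering>true</filtering>
        </resource>
    </resources>
</build>
```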


The most important thing to take note of when using filtering: a directory with filtered resources cannot contain any file with an encoding other than UTF-8. I'm saying UTF-8 because in our pom.xml properties we have:
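That property is presumably declared along these lines:

```xml
<properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>
```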


So do not include binary files used by your application in the filtered resources; they will be corrupted, and you will have a very, very hard time figuring out why things aren't working ... You've been warned.

Moving on, note that we have a variable ${build.profile.id} in the path that defines the directory of the conf.properties file to be used. This variable is filled by activating the desired profile when running Maven to build the project, so we need to define all the possible profiles and match their names with the resource directories that hold the various configuration files listed in the table above:
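A sketch of what the profile definitions might look like; making 'dev' active by default is my assumption, not something stated in the project:

```xml
<profiles>
    <profile>
        <id>dev</id>
        <activation>
            <activeByDefault>true</activeByDefault>
        </activation>
        <properties>
            <build.profile.id>dev</build.profile.id>
        </properties>
    </profile>
    <profile>
        <id>dev2</id>
        <properties>
            <build.profile.id>dev2</build.profile.id>
        </properties>
    </profile>
</profiles>
```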


The ${build.profile.id} variable is defined in each listed profile.

Let's take a look at one of the conf.properties files:



# H2 embedded database

# ssl
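The listing above is reduced to its section comments; a purely illustrative reconstruction, using the property names from the table below with placeholder values, might look like:

```properties
proj.useSsl=False
proj.http.port=8080
proj.https.port=8443
proj.https.ssl.port=443

# H2 embedded database
# (database-specific properties omitted here)

# ssl
keystore.path=path/to/keystore.jks
keystore.pass=changeit
```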

Here are some descriptions for some of these properties (skipping the database specific stuff):

Property values Description
proj.useSsl True/False whether to use ssl or not
proj.http.port port number the HTTP port to use
proj.https.port port number the unprivileged HTTPS port to use
proj.https.ssl.port port number (443) the standard HTTPS port to use
keystore.pass password the password to java keystore
keystore.path path the path to java keystore

When maven activates this profile, these properties fill the active.properties file:



# database

# ssl

Which in turn is used by the server and the application at runtime. Of course, we have to make the active.properties file accessible to the server, so we deal with it the same way we did with JServer.class and logback.xml in the antrun plugin previously; this time, we won't be moving the file, but copying it:

<echo>Copying active configuration ...</echo>
<copy file="${project.build.directory}/classes/active.properties"
      todir="${project.build.directory}/${project.artifactId}-${project.version}/"/>

Now, the server can load the configuration values and use them like this:

    private static Properties loadConfig() throws FileNotFoundException, IOException {
        Properties prop = new Properties();
        // active.properties was copied to the WAR root, next to JServer.class
        try (FileInputStream in = new FileInputStream("active.properties")) {
            prop.load(in);
        }
        return prop;
    }

conf = loadConfig();
String propertyValue = conf.getProperty("PROPERTY_NAME");

Since we copied the active.properties file, and did not move it, the application can use the same approach to pick up variables from the active.properties file for conditional implementations.

Unit tests

There are three types of testing needed to effectively test the application logic in this project. First, the testing of the Jersey-enabled web services under the com.siphyc.endpoints package, which requires a special context initialization process with Jersey and its internal CDI implementation, HK2, in addition to service mockups. The second type is the testing of the com.siphyc.service package, which represents the service layer in this application and requires a WELD environment setup plus mockups of the DAO layer. The final type is the testing of the com.siphyc.dao package, which represents the DAO layer of our application. The package com.siphyc.servlet can be tested the same way the service layer is tested, but was skipped and left as an exercise. Mockups are provided for each step of testing, and javadocs are used to explain the details of what's happening. Enjoy.


Some aspects of this project were assumed to be too simple to address in this post. One of these is the redirection to HTTPS, effectively forcing all traffic (not form posting though!) to be handled over HTTPS. Another is the setup of keystores and certificates: if you're planning to use SSL, you need a signed certificate and a properly set up keystore, and you may also add your certificate to the JVM's certs file. The readme file explains how to compile and run this project; the rest is up to you.
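For reference, a hedged sketch of what the HTTPS redirection could look like with the SecuredRedirectHandler imported in JServer.java, assuming Jetty 9.x APIs (the project's setupHttpsRedirect method may well differ):

```java
// illustrative sketch: requests arriving over plain HTTP are redirected to the
// secure port configured via HttpConfiguration.setSecurePort(...)
private static ContextHandlerCollection setupHttpsRedirect(WebAppContext webapp) {
    ContextHandlerCollection contexts = new ContextHandlerCollection();
    contexts.setHandlers(new Handler[]{new SecuredRedirectHandler(), webapp});
    return contexts;
}
```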