2017/07/09

Automatic DB migrations for Spring Boot with Liquibase

Introduction

I have lately written a short tutorial on Building REST services with Spring Boot, JPA and MySQL, with Part 1 and Part 2.

I decided to add an essential part to any serious project with an SQL store: management of Database Migrations.

In the real world, where requirements change continuously or schemas cannot be fully designed up front, you will face a real problem sooner rather than later: how do you manage changes to the database schema once the application or web service is running?

I wrote an article some time ago, which still seems to be quite valid. You can read there the details of a recommended development workflow to cope with Database migrations in all phases of development.

In this article, I will apply the same ideas from my previous article to a more up-to-date app: a Spring Boot Rest Web service with MySQL.

Adding the Liquibase plug-in for Maven

Let's add the Liquibase plug-in.
What we want to achieve in this first step is to be able to evolve our model and keep working as we did before adding Liquibase: the Hibernate Maven plugin will take care of recreating an up-to-date schema every time we run tests or launch the application.

We add the dependency:
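Something along these lines should do; with Spring Boot, the version can be omitted and is managed by the Spring Boot parent POM:

<dependency>
    <groupId>org.liquibase</groupId>
    <artifactId>liquibase-core</artifactId>
</dependency>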



We need to add the Database Changelog File for Liquibase: the file where all changes managed by Liquibase are registered. It will initially be empty.
We add the file src/main/resources/db/db.changelog.xml:
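A minimal sketch (in a real file, add the Liquibase XML namespace and schemaLocation attributes from the Liquibase docs):

<databaseChangeLog>
</databaseChangeLog>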



We add the liquibase.properties file:
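A sketch, reusing the JDBC settings we already have as Maven properties (names like jdbc.url are my own convention, filtered by Maven; adjust to yours):

driver=${jdbc.driverClassName}
url=${jdbc.url}
username=${jdbc.username}
password=${jdbc.password}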



We finally add the plug-in with relevant configuration:
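A sketch of the plug-in configuration, along the lines of my previous article (version property and paths are assumptions to adapt):

<plugin>
    <groupId>org.liquibase</groupId>
    <artifactId>liquibase-maven-plugin</artifactId>
    <version>${liquibase.version}</version>
    <configuration>
        <propertyFile>target/classes/liquibase.properties</propertyFile>
        <changeLogFile>target/classes/db/db.changelog.xml</changeLogFile>
    </configuration>
    <executions>
        <!-- drop the db before the Hibernate plugin recreates the schema -->
        <execution>
            <id>drop-db</id>
            <phase>process-resources</phase>
            <goals>
                <goal>dropAll</goal>
            </goals>
        </execution>
        <!-- mark the db up-to-date so that no migration is executed -->
        <execution>
            <id>mark-db-up-to-date</id>
            <phase>test-compile</phase>
            <goals>
                <goal>changelogSync</goal>
            </goals>
        </execution>
    </executions>
</plugin>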



With this configuration, we are telling Liquibase not to try any database migrations. This is our normal workflow of updating model with JPA annotations and running tests. Hibernate plugin takes care of dropping and re-creating the database.

This is what we see if we run mvn clean test:




Generating DB diff automatically with Liquibase: First migration

We have finished evolving our model and adding the additional logic and tests.
We are now happy and ready to commit a change.
This point could even be the very first version of your DB schema!

Let's generate the DB diff with Liquibase.
The generated diff file will be incorporated into the set of schema updates registered with Liquibase. Additionally, when running our app, Liquibase will take care of migrating the DB schema to the latest version registered in our codebase.

To make all this magic happen, let's add a profile in our pom file so we can generate the DB diff anytime.
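A sketch of such a db-diff profile, mirroring the one from my previous article (liquibase-diff.properties points the url property at a *_prev schema used as the comparison base):

<profile>
    <id>db-diff</id>
    <build>
        <plugins>
            <plugin>
                <groupId>org.liquibase</groupId>
                <artifactId>liquibase-maven-plugin</artifactId>
                <configuration>
                    <propertyFile>target/classes/liquibase-diff.properties</propertyFile>
                    <changeLogFile>target/classes/db/db.changelog.xml</changeLogFile>
                    <diffChangeLogFile>src/main/resources/db/db-${timestamp}.changelog.xml</diffChangeLogFile>
                </configuration>
                <executions>
                    <!-- recreate the previous-version schema from the registered changelogs -->
                    <execution>
                        <id>generate-db-prev</id>
                        <phase>process-resources</phase>
                        <goals>
                            <goal>update</goal>
                        </goals>
                        <configuration>
                            <dropFirst>true</dropFirst>
                        </configuration>
                    </execution>
                    <!-- diff it against the schema generated from the JPA annotations -->
                    <execution>
                        <id>generate-db-diff</id>
                        <phase>process-test-resources</phase>
                        <goals>
                            <goal>diff</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</profile>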


Let's generate our first Liquibase migration with mvn clean process-test-resources -Pdb-diff:



Liquibase has generated for us the file src/main/resources/db/db-20170709_144112.changelog.xml with these contents:
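The exact contents depend on your model; for a single JPA entity, the generated changelog looks roughly like this (table and column names here are illustrative):

<databaseChangeLog>
    <changeSet author="jgarcia (generated)" id="20170709_144112-1">
        <createTable tableName="game">
            <column autoIncrement="true" name="id" type="BIGINT">
                <constraints nullable="false" primaryKey="true"/>
            </column>
            <column name="name" type="VARCHAR(255)"/>
        </createTable>
    </changeSet>
</databaseChangeLog>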




Great! We can now add this filename to our global DB changelog file:
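The filename is the one Liquibase just generated:

<databaseChangeLog>
    <include file="db/db-20170709_144112.changelog.xml" />
</databaseChangeLog>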




Subsequent migrations with Liquibase

To check that our migration mechanism works well, let's update our model with a version field and generate a DB diff again via mvn process-test-resources -Pdb-diff.

Liquibase generates this file:
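For the new version field, it is roughly an addColumn changeset (the column type may differ for your mapping):

<databaseChangeLog>
    <changeSet author="jgarcia (generated)" id="20170709_151500-1">
        <addColumn tableName="game">
            <column name="version" type="INT"/>
        </addColumn>
    </changeSet>
</databaseChangeLog>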



This seems like magic!

Automatic DB migration embedded in the app

Adding Liquibase to our dependencies has also added a Liquibase Spring bean to our app. This bean runs at application startup, checks the registered changesets against the app's DB and brings the DB schema up to date automatically by applying any needed migrations.

It would be good to see this in action at development time, so we can test it.

Let's add another profile to our maven project for this.
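A sketch of the db-test profile (the skipTests property is what our Hibernate plugin configuration uses as its skip flag, as in my previous article):

<profile>
    <id>db-test</id>
    <build>
        <plugins>
            <plugin>
                <groupId>org.liquibase</groupId>
                <artifactId>liquibase-maven-plugin</artifactId>
                <executions>
                    <!-- drop all tables so Liquibase must run every registered migration -->
                    <execution>
                        <id>drop-db</id>
                        <phase>process-resources</phase>
                        <goals>
                            <goal>dropAll</goal>
                        </goals>
                        <configuration>
                            <propertyFile>target/classes/liquibase.properties</propertyFile>
                            <skip>false</skip>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
    <properties>
        <!-- skip schema generation by the Hibernate plugin -->
        <skipTests>true</skipTests>
    </properties>
</profile>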



This profile skips any schema generation by the Hibernate plugin and drops the database. This way, when the app starts, the Liquibase Spring bean will enter into action and be forced to run all registered migrations.

If we now run the app via mvn clean spring-boot:run -Pdb-test:



Works as expected!

Source Code and Additional Info

Source code for this tutorial here.
Additional Liquibase maven plugin info: mvn liquibase:help
More info about Liquibase here.
Some more info in my previous article about DB migration.

2017/05/29

Avoiding undesired web Scraping and fake Web Search engines in Ruby on Rails

Introduction

If you have developed a nice web app with a lot of content, you will sooner or later face undesired web scraping.

The undesired web scraping will sometimes come from an unmasked bot, with user agents such as Go-http-client, curl, Java and others. But sometimes you will have to deal with bots pretending to be almighty Googlebot or some other legitimate bot.

In this article I will propose a defense to mitigate undesired web scraping and to detect fake bots disguised under a legitimate bot name (user agent), without compromising response time.

This defense can be integrated in any rack-based web app, such as Ruby on Rails or Sinatra.

Request Throttling

If your website has a lot of content, any reasonable human visitor will not access that many pages. Let's say your visitor is a very avid reader who greatly enjoys your content. How many pages do you think they can visit:
  • per minute?
  • per hour?
  • per day?
Our defense strategy will be based on accumulating the number of requests coming from a single IP address for different slots of time.
When one IP address exceeds a pre-configured reasonably high number of requests for the given interval, our app will respond with an HTTP 429 "Too many requests" code.

To the rescue comes rack-attack: a rack middleware for blocking and throttling abusive requests.

Rack-attack stores request information in a configurable cache, with Redis and Memcached as some of the possible cache stores. If you are using Resque, you will probably want to use Redis for rack-attack too.


Here's a possible implementation of rack-attack:
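A sketch of what config/initializers/rack_attack.rb could look like; the cache store, paths, limits and the VerifyBot helpers are assumptions that the rest of this section walks through:

# config/initializers/rack_attack.rb
class Rack::Attack
  # Redis-backed ActiveSupport cache (redis-activesupport gem),
  # so counters are shared between app servers
  Rack::Attack.cache.store = ActiveSupport::Cache::RedisStore.new(ENV['REDIS_URL'])

  # Paths that are candidates for throttling
  THROTTLED_PATHS = %w(/articles /categories /search).freeze

  # Reasonably high ceilings for a human visitor
  MAX_REQUESTS_PER_MINUTE = 30
  MAX_REQUESTS_PER_HOUR   = 250
  MAX_REQUESTS_PER_DAY    = 1500

  throttle('req/ip/minute', limit: MAX_REQUESTS_PER_MINUTE, period: 1.minute) do |req|
    req.ip if THROTTLED_PATHS.any? { |path| req.path.start_with?(path) }
  end

  # ...analogous throttle blocks for 1.hour and 1.day
end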



Let's go through the code.

Any request whose path starts with one of these entries will be a candidate for throttling:
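In the sketch above, these live in a constant (the entries are placeholders for your own content paths):

THROTTLED_PATHS = %w(/articles /categories /search).freeze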


We set up a reasonable maximum number of requests for each of the intervals of time we will consider for request throttling:
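For example (the values are placeholders):

MAX_REQUESTS_PER_MINUTE = 30
MAX_REQUESTS_PER_HOUR   = 250
MAX_REQUESTS_PER_DAY    = 1500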

This is arbitrary and you can choose different intervals of time.

We would like to limit the number of requests within 60 seconds coming from the same IP:
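From the sketch above:

throttle('req/ip/minute', limit: MAX_REQUESTS_PER_MINUTE, period: 1.minute) do |req|
  req.ip if THROTTLED_PATHS.any? { |path| req.path.start_with?(path) }
end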


When this throttle block returns a non-falsey value, a counter will be incremented in the Rack::Attack.cache. If the throttle's limit is exceeded, the request will be blocked.

We will slightly modify the default rack-attack algorithm to allow legitimate web indexers in a timely manner.
Here's the new implementation of the algorithm:
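What follows is a sketch of the idea expressed directly in the throttle discriminator block, rather than by patching rack-attack internals; VerifyBot and VerifyBotJob are the helpers introduced below:

throttle('req/ip/minute', limit: MAX_REQUESTS_PER_MINUTE, period: 1.minute) do |req|
  next nil unless THROTTLED_PATHS.any? { |path| req.path.start_with?(path) }

  if VerifyBot.allowed_user_agent(req.user_agent)
    if VerifyBot.fake_bot(req.user_agent, req.ip)
      req.ip  # already proven fake: throttle it like any other client
    else
      # assume it is legitimate; verification runs offline in a background job
      VerifyBotJob.perform_later(req.user_agent, req.ip) unless VerifyBot.allowed_bot(req.user_agent, req.ip)
      nil     # do not count this request against the limit
    end
  else
    req.ip
  end
end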


Our new algorithm is basically the same as the original rack_attack one, except for the addition of these lines which check if the request comes from one of our allowed Search crawlers:
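The bot-handling branch from the sketch above:

if VerifyBot.allowed_user_agent(req.user_agent)
  if VerifyBot.fake_bot(req.user_agent, req.ip)
    req.ip
  else
    VerifyBotJob.perform_later(req.user_agent, req.ip) unless VerifyBot.allowed_bot(req.user_agent, req.ip)
    nil
  end
else
  req.ip
end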


What this block does is:
  • Check if the request comes from a Search Engine, identified by its user agent
  • If that's the case, assume it is legitimate and verify the bot's authenticity offline, so we do not delay the response. If it turns out to be fake, it will be blocked on subsequent requests

The overhead of this algorithm will typically be just a few milliseconds.

Here's the Rails ActiveJob that will verify the authenticity of the bot. This can be implemented by a Resque queue.
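A sketch (the class name, queue name and the VerifyBot API are my own, defined in the next section):

class VerifyBotJob < ApplicationJob
  queue_as :verify_bot

  def perform(user_agent, ip)
    # marks the (user agent, IP) pair as allowed or fake in Redis
    VerifyBot.verify(user_agent, ip)
  end
end

With config.active_job.queue_adapter = :resque, the job is executed by a Resque worker.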



Verify Bot


Let's see a possible implementation of VerifyBot.
Methods that VerifyBot will have:
  • verify: given a user agent and IP, verify the authenticity of the bot
  • allowed_user_agent: true for the user agents from bots we will allow
  • fake_bot: true for bots already verified as fake
  • allowed_bot: true for bots already verified as authentic

VerifyBot will use Redis to cache already-verified bots, marked either as safe or fake. These two lists will be stored as Redis sets.
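A possible sketch; the Redis key names and the BotValidator calls are assumptions tied to the validator shown in the next section:

class VerifyBot
  FAKE_BOTS_SET    = 'fake_bots'.freeze
  ALLOWED_BOTS_SET = 'allowed_bots'.freeze

  def self.verify(user_agent, ip)
    return if allowed_bot(user_agent, ip) || fake_bot(user_agent, ip)
    if BotValidator.do_validation(user_agent, ip)
      redis.sadd(ALLOWED_BOTS_SET, key(user_agent, ip))
    end
  rescue BotValidator::FakeBotError
    redis.sadd(FAKE_BOTS_SET, key(user_agent, ip))
  end

  def self.allowed_user_agent(user_agent)
    BotValidator.allowed_user_agent(user_agent)
  end

  def self.fake_bot(user_agent, ip)
    redis.sismember(FAKE_BOTS_SET, key(user_agent, ip))
  end

  def self.allowed_bot(user_agent, ip)
    redis.sismember(ALLOWED_BOTS_SET, key(user_agent, ip))
  end

  def self.key(user_agent, ip)
    "#{user_agent}|#{ip}"
  end

  def self.redis
    @redis ||= Redis.new(url: ENV['REDIS_URL'])
  end
end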





With these, only the implementation of the BotValidator is missing to complete the puzzle.

Bot Validator

The authenticity of popular search engines can be verified by a reverse-forward DNS lookup. For instance, this is what Google recommends to verify Googlebot:
  1. Run a reverse DNS lookup on the accessing IP address
     
  2. Verify that the domain name is in either googlebot.com or google.com
     
  3. Run a forward DNS lookup on the domain name retrieved in step 1. Verify that it is the same as the original accessing IP address


Our BotValidator will have two main methods:
  • allowed_user_agent: true for user agents from bots we will allow
  • do_validation: true if the user agent can be authenticated. Will raise exception in case of a fake bot

Subclasses for each bot we want to validate will implement the methods:
  • validates? : true if the class is responsible for validating the given user agent
  • is_valid? : true when the bot is validated for the given user agent and IP address
Here's the implementation:
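A sketch of the validator hierarchy (Resolv comes from the Ruby standard library; the class names are my own):

require 'resolv'

class BotValidator
  class FakeBotError < StandardError; end

  VALIDATORS = []  # subclasses register themselves below

  def self.allowed_user_agent(user_agent)
    VALIDATORS.any? { |v| v.validates?(user_agent) }
  end

  def self.do_validation(user_agent, ip)
    validator = VALIDATORS.find { |v| v.validates?(user_agent) }
    return false unless validator
    raise FakeBotError, "#{user_agent} from #{ip} failed validation" unless validator.is_valid?(user_agent, ip)
    true
  end
end

class ReverseForwardDnsValidator < BotValidator
  def self.is_valid?(_user_agent, ip)
    host = Resolv.getname(ip)                                       # 1. reverse DNS lookup
    return false unless valid_hosts.any? { |h| host.end_with?(h) }  # 2. check the domain name
    Resolv.getaddresses(host).map(&:to_s).include?(ip)              # 3. forward lookup must match the IP
  rescue Resolv::ResolvError
    false
  end
end

class GooglebotValidator < ReverseForwardDnsValidator
  BotValidator::VALIDATORS << self

  def self.validates?(user_agent)
    user_agent.to_s =~ /Googlebot/i
  end

  def self.valid_hosts
    %w(googlebot.com google.com)
  end
end

class BingbotValidator < ReverseForwardDnsValidator
  BotValidator::VALIDATORS << self

  def self.validates?(user_agent)
    user_agent.to_s =~ /bingbot/i
  end

  def self.valid_hosts
    %w(search.msn.com)
  end
end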


Subclass ReverseForwardDnsValidator implements the mentioned validation strategy that many search engines and bots follow.

To validate Googlebot or Bingbot, we will only need to subclass ReverseForwardDnsValidator and implement:
  • validates? : true if passed user_agent is the one the class validates
  • valid_hosts: array of valid reverse DNS host name terminations

Other subclasses for different validations can be added. For instance, one to validate the Facebook bot, a generic one for reverse-only DNS validation, etc.

2017/05/28

Building REST services with Spring Boot, JPA and MySql: Part 2

Introduction

In the first part of this tutorial we saw how to build a skeleton Java app from scratch based on the Spring framework and implemented persistence to a MySql database.


In this second part, we will implement a REST web service with the Spring framework.


I'll be using maven 3, version 3.0.5 and Java 8 SDK. Google around for installation of these in your environment.

Step 2: Implement a REST endpoint with Spring


In order to use the Spring framework as the basis for our REST endpoint, we need to add the necessary dependencies to our existing pom.xml:
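The web starter should be all we need (the version is managed by the Spring Boot parent):

<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-web</artifactId>
</dependency>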



We already have a model persisted to MySql and now we will add a controller with a method index that retrieves all persisted instances of our model.

We will annotate this method so that it is published as a REST endpoint when running our app within a Servlet container.
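A sketch of the controller, assuming the Game entity and GameRepository from Part 1 (the path /games suggests those names):

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class GameController {

    @Autowired
    private GameRepository repository;

    // GET /games returns all persisted games, serialized to JSON by Jackson
    @RequestMapping(value = "/games", method = RequestMethod.GET)
    public Iterable<Game> index() {
        return repository.findAll();
    }
}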



The Spring annotations added to our code are:
  • @RestController This declares our class as a controller returning domain objects instead of views. Spring will take care of the JSON serialization automatically via Jackson serializer

  •  @RequestMapping(value="/games", method = RequestMethod.GET) This maps GET requests for the path /games to our controller method.

We can now add a test for our new REST endpoint.
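A sketch using MockMvc (the annotations are the Spring Boot 1.4+ test support; class and method names are mine):

import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get;
import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.autoconfigure.web.servlet.AutoConfigureMockMvc;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.http.MediaType;
import org.springframework.test.context.junit4.SpringRunner;
import org.springframework.test.web.servlet.MockMvc;

@RunWith(SpringRunner.class)
@SpringBootTest
@AutoConfigureMockMvc
public class GameControllerTest {

    @Autowired
    private MockMvc mvc;

    @Test
    public void indexReturnsGames() throws Exception {
        mvc.perform(get("/games").accept(MediaType.APPLICATION_JSON))
           .andExpect(status().isOk());
    }
}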



In our test, instead of running our controller within an external application server, we use the Spring class MockMvc, which directs requests to our controller, making our test faster.

If we now run mvn clean test:


Running our REST endpoint

We are now ready to package our app and run it.

If we run mvn clean package:



We now have a jar and we can just run it. Yes, that's right: we can just run it directly!
Spring has generated an uber jar: a jar with all the dependencies needed to run our app, including an embedded servlet container (Tomcat by default, but you can easily swap it for Jetty or any other of your preference).

If we launch the command
 

java -jar target/spring-boot-mysql-0.0.1-SNAPSHOT.jar


We can see on the console that Tomcat has started and is listening on port 8080 for requests!

Source Code

Source code on GitHub

2017/05/21

Building REST services with Spring Boot, JPA and MySql: Part 1

Introduction

In this tutorial, we will see how to build a skeleton Java app from scratch based on the Spring framework and capable of having an evolving model persisted on MySql and a related REST web service.

As requirements change continuously, we will be handling updates to our model, which in the end translate into updates to our underlying database schema, with Liquibase: a database migration tool.

For an overview of how you can manage Database migrations in your development lifecycle, have a look at one of my previous articles: Automatic DB migration for Java web apps with Liquibase

I'll be using maven 3, version 3.0.5 and Java 8 SDK. Google around for installation of these in your environment.

Step 1: Persist a model with JPA and Hibernate

Let's start with what Spring gives us in Spring Initializr for a maven project with the JPA and MySql dependencies.

Here's the generated POM.
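The relevant dependencies section (the full POM also includes the spring-boot-starter-parent and the Spring Boot Maven plugin):

<dependencies>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-data-jpa</artifactId>
    </dependency>
    <dependency>
        <groupId>mysql</groupId>
        <artifactId>mysql-connector-java</artifactId>
        <scope>runtime</scope>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-test</artifactId>
        <scope>test</scope>
    </dependency>
</dependencies>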



In order to have a non-failing maven project, we need to add the details of the database schema to our project.

The resulting properties section in the POM:
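Something along these lines; the schema name and credentials are placeholders, and the jdbc.* property names are my own convention:

<properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <java.version>1.8</java.version>
    <jdbc.driverClassName>com.mysql.jdbc.Driver</jdbc.driverClassName>
    <jdbc.url>jdbc:mysql://localhost:3306/games?useSSL=false</jdbc.url>
    <jdbc.username>games</jdbc.username>
    <jdbc.password>secret</jdbc.password>
</properties>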


And we add these properties to application.properties:
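Assuming the POM properties above are filtered into the file (with the Spring Boot parent, Maven resource filtering uses @...@ delimiters):

spring.datasource.driver-class-name=@jdbc.driverClassName@
spring.datasource.url=@jdbc.url@
spring.datasource.username=@jdbc.username@
spring.datasource.password=@jdbc.password@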


If we launch mvn clean package, we now have a successful build.

For details on how to create and assign user permissions on MySql, Google is your friend :-)

Adding our Model and Repository

Let's add a sample Model class to our app.
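A minimal sketch of the entity (I'll call it Game, matching the /games endpoint we will expose in Part 2):

import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.GenerationType;
import javax.persistence.Id;

@Entity
public class Game {

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;

    private String name;

    // getters and setters omitted for brevity
}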



And a Repository interface to access the persisted data. Spring Data will automatically generate the implementation for us.
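Spring Data derives the implementation from the interface declaration alone; a sketch for our Game entity:

import org.springframework.data.repository.CrudRepository;

public interface GameRepository extends CrudRepository<Game, Long> {
}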



We can now add a test to load all instances from our repository and verify it is working correctly.
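A sketch of the test:

import static org.junit.Assert.assertTrue;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.junit4.SpringRunner;

@RunWith(SpringRunner.class)
@SpringBootTest
public class GameRepositoryTest {

    @Autowired
    private GameRepository repository;

    @Test
    public void findsTheSampleGames() {
        // sample-data.xml (below) inserts rows before the tests run
        assertTrue(repository.findAll().iterator().hasNext());
    }
}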



In order to populate our Database for tests, we have the option of using Spring annotations directly in our Java unit test source code.

In this case, we will be using the dbunit-maven-plugin instead.

Our updated pom.xml:
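A sketch of the dbunit-maven-plugin configuration, reusing the JDBC properties defined earlier (the version and parameter names should be checked against the plugin docs):

<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>dbunit-maven-plugin</artifactId>
    <version>1.0-beta-3</version>
    <configuration>
        <driver>${jdbc.driverClassName}</driver>
        <url>${jdbc.url}</url>
        <username>${jdbc.username}</username>
        <password>${jdbc.password}</password>
        <src>src/test/resources/sample-data.xml</src>
        <type>CLEAN_INSERT</type>
    </configuration>
    <executions>
        <execution>
            <phase>test-compile</phase>
            <goals>
                <goal>operation</goal>
            </goals>
        </execution>
    </executions>
    <dependencies>
        <!-- jdbc driver here -->
    </dependencies>
</plugin>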



And our src/test/resources/sample-data.xml for the unit tests.
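A flat XML dataset; the table and column names must match the schema generated from our entity (these are the ones for our Game example):

<?xml version="1.0" encoding="UTF-8"?>
<dataset>
    <game id="1" name="Chess"/>
    <game id="2" name="Go"/>
</dataset>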




If we now run our test with mvn clean test, we get a build failure: we have no tables in our MySql schema and dbunit cannot insert the test data.



At this point, we need to generate a DDL script for our schema.

There are a number of options. You could opt for a Spring solution.

We will apply a more generic solution from a third party which works on Spring and non-Spring frameworks: Hibernate Maven Plugin from juplo.de. This is a completely new implementation of the Hibernate Maven plugin updated to Hibernate 5.

We need to add these lines to our pom.xml:
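A sketch of the plug-in configuration (check juplo.de for the current version and goal names):

<plugin>
    <groupId>de.juplo</groupId>
    <artifactId>hibernate-maven-plugin</artifactId>
    <version>2.0.0</version>
    <configuration>
        <skip>${skipTests}</skip>
    </configuration>
    <executions>
        <execution>
            <phase>process-test-resources</phase>
            <goals>
                <goal>drop</goal>
                <goal>create</goal>
            </goals>
        </execution>
    </executions>
    <dependencies>
        <!-- jdbc driver here -->
    </dependencies>
</plugin>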



And the file src/test/resources/hibernate.properties needed by the hibernate-maven-plugin:
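Filtered by Maven, so it can reuse the same JDBC properties:

hibernate.dialect=org.hibernate.dialect.MySQL5InnoDBDialect
hibernate.connection.driver_class=${jdbc.driverClassName}
hibernate.connection.url=${jdbc.url}
hibernate.connection.username=${jdbc.username}
hibernate.connection.password=${jdbc.password}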


Notice in the updated pom.xml:

  • The hibernate-maven-plugin must appear before the dbunit-maven-plugin: the database tables will be created before the dbunit sample data is inserted.

  • Additionally, the file src/test/resources/hibernate.properties needs to be filtered by the standard maven resources plugin.

If we run mvn clean test, our test is finally passing after creating the database tables and populating them with unit test data:


We leave publishing a REST web service for our model and handling automatic database migration with Liquibase for future parts.

Source code: GitHub

Check Part 2 of this tutorial here.

2013/04/07

Adding Signup email confirmation to AppFuse

Introduction


In today's SaaS world, user signup usually requires email confirmation, because the app will continuously send email messages to its users in response to events.

AppFuse is a RAD Java web app framework, supporting a multitude of web frameworks: Spring MVC, Struts 2, JSF, Tapestry, and more to come.

Let's implement a user signup email confirmation service for AppFuse. This can be adapted to any other Java web app framework.

Initial Web App

From the AppFuse quickstart page, I copy the maven command to generate an initial Struts 2 app from AppFuse archetypes:

mvn archetype:generate -B 
 -DarchetypeGroupId=org.appfuse.archetypes 
 -DarchetypeArtifactId=appfuse-basic-struts-archetype 
 -DarchetypeVersion=2.2.1 
 -DgroupId=com.operatornew 
 -DartifactId=signup 
 -DarchetypeRepository=http://oss.sonatype.org/content/repositories/appfuse

As we will be updating the User services, we generate the full source code from the newly created signup directory:

mvn appfuse:full-source

User email verification

How can we verify a new user's email?

1. A new user signs up and fills in an email address which we want to verify as valid.

2. After the user submits their data, we'll generate a unique, difficult-to-guess token for each user that signs up. The new user won't be able to log in until they complete the email verification process.

3. We'll send an email with a URL from our app which will include this generated token. As AppFuse supports multi-language, we'll generate the email in the active locale.

4. When the new user receives the confirmation email, they can visit the included URL with its unique token to say "Hey, it's me. I've received your difficult-to-guess token at the email address I gave you". We will then mark this user as confirmed.

User Signup Confirmation: Service and Model Layers

Ok. Let's add a new Java interface for our new Signup confirmation service. The service will be responsible for starting a user's data confirmation process and confirming the user's data. We will apply it to email verification, but it could be applied to mobile phone number verification as well.

public interface SignupConfirmService {

    User doConfirm(User user);

    /**
     * Starts a user Confirmation procedure
     *
     * @param user the User whose data needs to be confirmed. The user's signupDate will be updated accordingly
     * @param context the Web App context from which the locale and URL will be obtained
     * @return the updated User
     */
    User startConfirm(User user, WebAppContext context);

}

Hold on. What is this WebAppContext type? We'll need to include our web app URL in the generated email. As we'll implement the confirmation at the service layer, we'll avoid adding an unnecessary dependency on servlet classes. After all, we're working at the service layer.


The WebAppContext type will be an abstract class:

public abstract class WebAppContext {
    private Locale locale;

    public WebAppContext(Locale locale) {
        this.locale = locale;
    }

    public Locale getLocale() {
        return locale;
    }

    abstract public String getAppUrl();
}

Updating our Model

We'll add to our User class three pieces of information:
  • a signup date
  • an email confirmation token
  • a confirmation date
We add the dates for administration purposes.

public class User extends BaseObject implements Serializable, UserDetails {

    ...

    private Date signupDate;
    private String emailConfirmToken;
    private Date confirmDate;

    ...

    @Column(name="signup_date", nullable=false)
    @Temporal(TemporalType.TIMESTAMP)
    @Field
    public Date getSignupDate() {
        return signupDate;
    }

    public void setSignupDate(Date signupDate) {
        this.signupDate = signupDate;
    }

    @Column(name="confirm_token", length=32)
    public String getEmailConfirmToken() {
        return emailConfirmToken;
    }

    public void setEmailConfirmToken(String confirm) {
        this.emailConfirmToken = confirm;
    }

    @Column(name="confirm_date")
    @Temporal(TemporalType.TIMESTAMP)
    @Field
    public Date getConfirmDate() {
        return confirmDate;
    }

    public void setConfirmDate(Date confirmDate) {
        this.confirmDate = confirmDate;
    }
}

Implementing our new Service

We can now write an implementation for our new SignupConfirmService to verify a new user's email address.
Our implementation will use AppFuse MailEngine service to send email, a ResourceBundle for mail subject i18n, and the Java SecureRandom class to generate a unique and difficult to guess token for each new user.

public class SignupConfirmEmail implements SignupConfirmService {

    // For the secure random generation of tokens
    private SecureRandom random = new SecureRandom();

    // AppFuse email service, injected by Spring
    private MailEngine mailEngine;
    // AppFuse email prototype, injected by Spring
    private SimpleMailMessage mailMessage;

    // i18n for the subject's email. Configurable in spring bean, with a default bundle name
    private String resourceBundleName = "MailResources";
    private ResourceBundle rb;


    .... // spring bean setters omitted

    @Override
    public User startConfirm(User user, WebAppContext context) {
        String code = new BigInteger(130, random).toString(32);
        user.setEmailConfirmToken(code);
        mailMessage.setTo(user.getFullName() + "<" + user.getEmail() + ">");
        mailMessage.setSubject(getFromResourceBundle("email.signup.subject", context.getLocale()));
        Map<String, Object> model = new HashMap<String, Object>();
        model.put("userFirstName", user.getFirstName());
        model.put("appURL", context.getAppUrl());
        model.put("signupConfirmURL", context.getAppUrl() + "/account/confirm?confirm="+user.getEmailConfirmToken());
        mailEngine.sendMessage(mailMessage, "signupConfirm.vm", model);
        return user;
    }

    @Override
    public User doConfirm(User user) {
        if (!user.isEnabled()) {
            // has not been confirmed yet:
            user.setEnabled(true);
            user.setConfirmDate(new Date());
        }
        return user;
    }

    private String getFromResourceBundle(String key, Locale locale) {
        if (rb == null) {
            try {
                rb = ResourceBundle.getBundle(resourceBundleName, locale);
            } catch (MissingResourceException ex) {
                rb = ResourceBundle.getBundle("com.operatornew." + resourceBundleName, locale);
            }
        }
        String str = rb.getString(key);
        return str;
    }
}

We can now define the SignupConfirmService implementation as a Spring bean in our applicationContext-service.xml file:

<bean id="signupConfirmService" class="com.operatornew.service.impl.SignupConfirmEmail">
    <property name="mailEngine" ref="mailEngine" />
    <property name="mailMessage" ref="mailMessage" />
</bean>

Next, we'll modify AppFuse's UserManager as follows:
  • add a signup() method for user signup, which will check if there is a signup confirmation service in place
  • we can make the signup confirmation service globally optional with a configuration parameter
  • add a confirmSignup() method for user signup confirmation

The implementation could be this one:

public User signup(User user, WebAppContext app) throws UserExistsException {
    // First save user to check there is no conflict with unique fields
    user = saveUser(user);

    if (signupConfirmService != null) {
        signupConfirmService.startConfirm(user, app);
        if (enableAccountAfterSignupConfirm) {
            user.setEnabled(false);
        }
    }

    // save updated fields
    return save(user);
}

public User confirmSignup(String token) throws UserSignupTokenNotFoundException {
    User user = userDao.getUserFromSignupToken(token);
    if (signupConfirmService != null) {
        user = signupConfirmService.doConfirm(user);
        return userDao.saveUser(user);
    } else {
        throw new RuntimeException("signupConfirmService null while trying to confirm signUp for token " + token);
    }
}

For confirmSignup, we need to add a method to userDao to retrieve a user by their signup token:

public User getUserFromSignupToken(String token) throws UserSignupTokenNotFoundException {
    List users = getSession().createCriteria(User.class).add(Restrictions.eq("emailConfirmToken", token)).list();
    if (users == null || users.isEmpty()) {
        throw new UserSignupTokenNotFoundException("user token '" + token + "' not found...");
    } else {
        return (User) users.get(0);
    }
}

Testing the new Service

If we run our tests now:

mvn test

We see we have broken something:

...
Results :

Failed tests:
  testSave(com.operatornew.webapp.action.SignupActionTest)

Tests in error:
  testAddAndRemoveUser(com.operatornew.service.UserManagerTest)

Tests run: 71, Failures: 1, Errors: 1, Skipped: 0

[INFO] ------------------------------------------------------------------------
[ERROR] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] There are test failures.

The surefire junit report for one of the failing tests looks like this:
...
Tests run: 3, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 0.584 sec <<< FAILURE!
testAddAndRemoveUser(com.operatornew.service.UserManagerTest)  Time elapsed: 0.082 sec  <<< ERROR!
com.operatornew.service.UserExistsException: User 'john' already exists!
 at com.operatornew.service.impl.UserManagerImpl.saveUser(UserManagerImpl.java:143)
...

However, it does not make sense that a user already exists after our changes. Upon inspection of the test log in the console, I can spot the error:

...
Caused by: com.mysql.jdbc.exceptions.jdbc4.MySQLIntegrityConstraintViolationException: Column 'signup_date' cannot be null

We fix this by setting a signupDate if it is null in the saveUser() method:

...
if (user.getSignupDate() == null) {
    user.setSignupDate(new Date());
}

We rerun our tests:

...
Results :

Tests in error:
  testSave(com.operatornew.webapp.action.SignupActionTest)

Tests run: 71, Failures: 0, Errors: 1, Skipped: 0

[INFO] ------------------------------------------------------------------------
[ERROR] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] There are test failures.

After inspecting the surefire report, we can see this:

-------------------------------------------------------------------------------
Test set: com.operatornew.webapp.action.SignupActionTest
-------------------------------------------------------------------------------
Tests run: 5, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 0.168 sec <<< FAILURE!
testSave(com.operatornew.webapp.action.SignupActionTest)  Time elapsed: 0.064 sec  <<< ERROR!
org.springframework.dao.EmptyResultDataAccessException: Incorrect result size: expected 1, actual 0
 at org.springframework.dao.support.DataAccessUtils.requiredSingleResult(DataAccessUtils.java:71)
 at org.springframework.jdbc.core.JdbcTemplate.queryForObject(JdbcTemplate.java:730)
 at org.springframework.jdbc.core.JdbcTemplate.queryForObject(JdbcTemplate.java:749)
 at com.operatornew.dao.hibernate.UserDaoHibernate.getUserPassword(UserDaoHibernate.java:109)
...
 at com.operatornew.service.impl.UserManagerImpl.saveUser(UserManagerImpl.java:123)
 at com.operatornew.service.impl.UserManagerImpl.signup(UserManagerImpl.java:88)
...

Our implementation is failing in the new UserManagerImpl signup() method, when saveUser() is called the second time to save fields updated by the SignupConfirmService implementation. The saveUser() method eventually calls UserDao's getUserPassword() to check if the password needs to be re-encrypted. But getUserPassword() is annotated not to support propagation of transactions, which makes it fail because our new user is created within a transaction that has not finished yet:

public interface UserDao extends GenericDao<User, Long> {
    ...
    @Transactional(propagation = Propagation.NOT_SUPPORTED)
    String getUserPassword(Long userId);
}

The second saveUser() call in our new signup() method clearly does not need to re-encrypt the user's password. We can fix it with a private method that just saves the user instance without checking for password encoding, plus a bit of refactoring to avoid duplicated code:

public User signup(User user, WebAppContext app) throws UserExistsException {
    // First save user to check there is no conflict with unique fields
    user = saveUser(user);

    if (signupConfirmService != null) {
        signupConfirmService.startConfirm(user, app);
        if (enableAccountAfterSignupConfirm) {
            user.setEnabled(false);
        }
    }
    // save updated fields
    return saveUserNoPwdEncoding(user);
}

public User saveUser(User user) throws UserExistsException {
    ...
    return saveUserNoPwdEncoding(user);
}

private User saveUserNoPwdEncoding(User user) throws UserExistsException {
    if (user.getSignupDate() == null) {
        user.setSignupDate(new Date());
    }

    try {
        return userDao.saveUser(user);
    } catch (Exception e) {
        e.printStackTrace();
        log.warn(e.getMessage());
        throw new UserExistsException("User '" + user.getUsername() + "' already exists!");
    }
}

All our tests now pass.

We can now write a new test for our SignupConfirmService implementation. We'll use Wiser to mock an SMTP server, and verify that startConfirm() sets a value on the user's emailConfirmToken, that an email is sent, and that the email's body contains the generated token:

public class SignupConfirmServiceTest extends BaseManagerTestCase {
    private Log log = LogFactory.getLog(SignupConfirmServiceTest.class);
    @Autowired
    MailEngine mailEngine;
    JavaMailSenderImpl mailSender = new JavaMailSenderImpl();

    @Autowired
    private SignupConfirmService signupConfirmService;
    private User user;

    @Before
    public void setUp() {
        mailSender.setHost("localhost");
        mailEngine.setMailSender(mailSender);
    }

    @After
    public void tearDown() {
        mailEngine.setMailSender(null);
    }

    @Test
    public void testStartConfirm() throws Exception {
        // mock smtp server
        Wiser wiser = new Wiser();
        // set the port to a random value so there's no conflicts between tests
        int port = 2525 + (int)(Math.random() * 100);
        mailSender.setPort(port);
        wiser.setPort(port);
        wiser.start();

        user = new User();

        // call populate method in super class to populate test data
        // from a properties file matching this class name
        user = (User) populate(user);
        assertNull(user.getEmailConfirmToken());

        user = signupConfirmService.startConfirm(user, new WebAppContext(new Locale("en")) {
            public String getAppUrl() {
                return "http://localhost:8080";
            }
        });
        wiser.stop();
        assertNotNull(user.getEmailConfirmToken());

        assertTrue(wiser.getMessages().size() == 1);
        WiserMessage wm = wiser.getMessages().get(0);
        assertThat((String)wm.getMimeMessage().getContent(), containsString(user.getEmailConfirmToken()));
    }
}

It will initially fail as we need to add resources for email subject i18n and the Velocity templates. In MailResources.properties:

email.signup.subject=Confirm your email Address

In signupConfirm.vm Velocity template:

Hello ${userFirstName},

To enable your account, please click on the following link or copy it into the address bar of your favourite browser.

${signupConfirmURL}

After that, the test passes.

User Signup Confirmation: Web Layer

After implementing the service layer, we can now implement the web layer.

When a new user clicks on the sign up link, a page with a fill-in form is shown and the user enters their info.

After completing the form, the user will press the signup button and, if the app is configured to confirm email before the account is enabled, a page will be displayed informing the user that an email has been sent to their address and that they need to confirm their account by following the email instructions.

In AppFuse with Struts2, the SignupAction class implements the Struts2 action for Signup. The save() method currently saves the new signed up user, sends them an email of signup welcome and logs the new user in.

We will change the implementation by calling the new UserManager.signup() method, eliminating the welcome email, and only logging the new user in if the app is NOT configured to confirm email before the account is enabled.

public String save() throws Exception {
    // Set the default user role on this new user
    user.addRole(roleManager.getRole(Constants.USER_ROLE));

    try {
        // call signup, passing current locale and a new anonymous inner class returning the web app URL
        user = userManager.signup(user, new WebAppContext(this.getLocale()) {
            public String getAppUrl() {
                return RequestUtil.getAppURL(getRequest());
            }
        });

    } catch (AccessDeniedException ade) {
        // thrown by UserSecurityAdvice configured in aop:advisor userManagerSecurity
        log.warn(ade.getMessage());
        getResponse().sendError(HttpServletResponse.SC_FORBIDDEN);
        return null;
    } catch (UserExistsException e) {
        log.warn(e.getMessage());
        List<Object> args = new ArrayList<Object>();
        args.add(user.getUsername());
        args.add(user.getEmail());
        addActionError(getText("errors.existing.user", args));

        // redisplay the unencrypted passwords
        user.setPassword(user.getConfirmPassword());
        return INPUT;
    }

    if (userManager.isEnableAccountAfterSignupConfirm()) {
        // user signup needs confirmation to be enabled
        getSession().setAttribute(Constants.REGISTERED, user.getEmail());
        return "confirm";
    } else {
        saveMessage(getText("user.registered"));
        getSession().setAttribute(Constants.REGISTERED, Boolean.TRUE);

        // log user in automatically
        UsernamePasswordAuthenticationToken auth = new UsernamePasswordAuthenticationToken(
                user.getUsername(), user.getConfirmPassword(), user.getAuthorities());
        auth.setDetails(user);
        SecurityContextHolder.getContext().setAuthentication(auth);

        return SUCCESS;
    }
}

We will add two new struts actions:
  • account/needconfirm will inform the user that they need to confirm their account by following email instructions
  • account/confirm will try to confirm a user account with the supplied confirmation token. The result can either be successful or unsuccessful, informing the user accordingly
The relevant Struts 2 XML configuration for the new actions:

<action name="saveSignup" class="signupAction" method="save">
    <result name="cancel" type="redirect">/home</result>
    <result name="input">/WEB-INF/pages/signup.jsp</result>
    <result name="success" type="redirectAction">home</result>
    <result name="confirm" type="redirectAction">account/needconfirm</result>
</action>

<action name="account/needconfirm">
    <result>/WEB-INF/pages/signupNeedConfirm.jsp</result>
</action>

<action name="account/confirm" class="signupAction" method="confirm">
    <result name="success">/WEB-INF/pages/signupConfirmed.jsp</result>
    <result name="error">/WEB-INF/pages/signupConfirmError.jsp</result>
</action>

The page to inform the user that they need to confirm their account by following the email instructions can be like this signupNeedConfirm.jsp:

<%@ include file="/common/taglibs.jsp" %>
<%@page import="com.operatornew.Constants" %>
<% String email = (String)session.getAttribute(Constants.REGISTERED);
%>

<head>
    <title><fmt:message key="signupNeedConfirm.title"/></title>
</head>

<body class="signup-confirm"/>

<div class="span2">
</div>

<div class="span8 dialog-info">
<div class="alert alert-block alert-info">
        <h2><fmt:message key="signupNeedConfirm.heading"/></h2>
        <fmt:message key="signupNeedConfirm.message">
            <fmt:param><%= email %></fmt:param>
        </fmt:message>
    </div>
</div>

<div class="span2"/>

It displays a message indicating the address the email was sent to, and explaining what the user needs to do in order to complete the signup process.

The other two new jsp pages are very similar to this one.

Not least important is making the new action URLs publicly accessible to any user. For this, we update Spring's security config file security.xml like this:

<http auto-config="true">
    <intercept-url pattern="/account/confirm*" access="ROLE_ANONYMOUS,ROLE_ADMIN,ROLE_USER"/>
    <intercept-url pattern="/account/needconfirm*" access="ROLE_ANONYMOUS,ROLE_ADMIN,ROLE_USER"/>
    ...
</http>


Screenshots

Here are some screenshots of the added functionality.

Page requesting the user to confirm their email:
Email received by the user:



Page showing successful confirmation and account activation:


Page showing wrong account activation:

Sources


Sources can be found here.

2012/11/30

Automatic DB migration for Java web apps with Liquibase

Introduction


In a scenario of agile development, new versions are frequently released and deployed, and changes to your database schema are frequent.

To deal with these database changes, a mechanism should be in place. In Ruby on Rails, you have one out-of-the-box and it works great. But in Java web apps, you have to find a solution and plug it into your own projects.

We will implement an automatic database update mechanism for Java web apps trying to meet the following goals:
  • It should not interfere during development
  • It should be easy to generate database updates during development
  • It should be easy to test database updates during development
  • At production, database updates should be performed automatically
Liquibase is a good tool to deal with database migrations:
  • Open-source
  • Database agnostic: can update most popular SQL databases
  • Integration: available as command line tool, maven plugin, ant task
  • Flexibility: handles SQL schema updates, custom updates via a Java class, even system-command updates
  • Automatic: can be integrated for automatic updates as spring bean or as servlet listener

Our App Before Liquibase


If we use Hibernate and the hibernate3-maven-plugin, during development our database schema is automatically kept up-to-date: hibernate3-maven-plugin extracts schema info from the JPA annotations and the Hibernate configuration.

We will use a project based on the AppFuse framework, with flavours for Spring MVC, Struts2, Tapestry, JSF, but this is applicable to any java web app based on any framework.

From the AppFuse quickstart page, I copy the maven command to generate an initial Spring MVC app from AppFuse archetypes:

mvn archetype:generate -B
 -DarchetypeGroupId=org.appfuse.archetypes
 -DarchetypeArtifactId=appfuse-basic-spring-archetype
 -DarchetypeVersion=2.2-SNAPSHOT
 -DgroupId=com.mycompany
 -DartifactId=migration
 -DarchetypeRepository=http://oss.sonatype.org/content/repositories/appfuse

By inspecting the project's pom.xml, we can see the database schema is kept up-to-date during development by generating drop and create DDL commands, extracted from the JPA annotations in our model classes.

<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>hibernate3-maven-plugin</artifactId>
    <version>2.2</version>
    <configuration>
        <components>
            <component>
                <name>hbm2ddl</name>
                <implementation>annotationconfiguration</implementation>
                <!-- Use 'jpaconfiguration' if you're using JPA. -->
                <!--<implementation>jpaconfiguration</implementation>-->
            </component>
        </components>
        <componentProperties>
            <drop>true</drop>
            <jdk5>true</jdk5>
            <propertyfile>target/classes/jdbc.properties</propertyfile>
            <skip>${skipTests}</skip>
        </componentProperties>
    </configuration>
    <executions>
        <execution>
            <phase>process-test-resources</phase>
            <goals>
                <goal>hbm2ddl</goal>
            </goals>
        </execution>
    </executions>
    <dependencies>
    ... <!-- jdbc driver here -->
    </dependencies>
</plugin>

This is fine during development, because we do not need to worry about database migrations as yet. By running mvn process-test-resources or mvn jetty:run, we can see the tables are dropped and recreated each time we run our web app with jetty:

> mvn jetty:run
...
[INFO] [hibernate3:hbm2ddl {execution: default}]
[INFO] Configuration XML file loaded: file:.../migration/src/main/resources/hibernate.cfg.xml
[INFO] Configuration XML file loaded: file:.../migration/src/main/resources/hibernate.cfg.xml
[INFO] Configuration Properties file loaded: ...\migration\target\classes\jdbc.properties
alter table user_role drop foreign key FK143BF46A9B523CC9;
alter table user_role drop foreign key FK143BF46A407D00A9;
drop table if exists app_user;
drop table if exists role;
drop table if exists user_role;
create table app_user (id bigint not null auto_increment, account_expired bit not null, account_locked bit not null, address varchar(150), city varchar(50), country varchar(100), postal_code varchar(15), province varchar(100), credentials_expired bit not null, email varchar(255) not null unique, account_enabled bit, first_name varchar(50) not null, last_name varchar(50) not null, password varchar(255) not null, password_hint varchar(255), phone_number varchar(255), signup_date date, username varchar(50) not null unique, version integer, website varchar(255), primary key (id)) ENGINE=InnoDB;
create table role (id bigint not null auto_increment, description varchar(64), name varchar(20), primary key (id)) ENGINE=InnoDB;
create table user_role (user_id bigint not null, role_id bigint not null, primary key (user_id, role_id)) ENGINE=InnoDB;
alter table user_role add index FK143BF46A9B523CC9 (role_id), add constraint FK143BF46A9B523CC9 foreign key (role_id) references role (id);
alter table user_role add index FK143BF46A407D00A9 (user_id), add constraint FK143BF46A407D00A9 foreign key (user_id) references app_user (id);
...

Managing db updates in our project


Our development workflow could be like this one:


We will have two main Liquibase usage scenarios:
  • Liquibase at build-time: will generate all db changelogs
  • Liquibase at run-time: will automatically update the server schema as needed on deployment, including generation of the first database version for an empty schema

Liquibase at build-time

We will be evolving our app and when we have something to commit and push to our project's global repo, we can then generate the database migrations, if any.

We will add to our maven project the liquibase plugin and needed executions in order to:
  • generate database diff changelogs at any time when we want to consolidate our model updates
  • generate database production data dumps as changelogs at any time to consolidate app preloaded db data
  • exercise liquibase db migrations at any time for rapid testing (jetty:run)

Liquibase at runtime

The first time we deploy the app, it will contain the initial db changelog for an empty schema. Liquibase will generate all the db tables and populate with initial database data (default users, user roles, any lookup tables...). On subsequent deployments, our app will contain an additional db changelog to bring the server database schema up-to-date.


Integrating the db update at app startup


Liquibase can perform automatic db update at runtime by looking at the registered change sets in a changelog file and checking if they are applied against a table in our schema called DATABASECHANGELOG. It will create it automatically if it does not exist.

We can implement the automatic db update either with a Spring bean or with a Servlet listener.

We add the liquibase lib as dependency:
<dependency>
    <groupId>org.liquibase</groupId>
    <artifactId>liquibase-core</artifactId>
    <version>2.0.5</version>
</dependency>

And configure the Liquibase Spring bean in spring's applicationContext-resources.xml configuration file:

<bean id="liquibase" class="liquibase.integration.spring.SpringLiquibase">
    <property name="dataSource" ref="dataSource" />
    <property name="changeLog" value="classpath:db/db.changelog.xml" />
    <property name="defaultSchema" value="${db.name}" />
</bean>


Working on our app during development: evolving our model


During development, we will possibly be making many changes to our app. We do not want to spend time on db migrations for now: just evolve our model, annotate it with JPA and, when doing rapid testing with jetty:run, generate the db from scratch.

Liquibase maintains a table with applied change sets to our database. Based on the contents of that table, on startup our app will run liquibase to apply any missing migration to our db.

In order to avoid:
  • liquibase trying to create db tables already created by hibernate3 maven plugin
  • liquibase changeset version conflicts (because of changes applied to existing changelogs, for instance)
we need Maven to perform these tasks:
  • Drop all tables from our schema, so DATABASECHANGELOG is deleted
  • Let Hibernate generate our db up-to-date based on our annotations
  • Make liquibase mark the db as up-to-date by updating DATABASECHANGELOG, without applying any db migrations
We can accomplish this by adding to our pom.xml:

<plugin>
    <groupId>org.liquibase</groupId>
    <artifactId>liquibase-maven-plugin</artifactId>
    <version>2.0.5</version>
        <configuration>
            <skip>${skipTests}</skip>
            <propertyFile>target/classes/liquibase.properties</propertyFile>
            <changeLogFile>target/classes/db/db.changelog.xml</changeLogFile>
        </configuration>
    <executions>
        <!-- drop db before generating schema with hbm2ddl to avoid any 
            inconsistencies between changelog files and DATABASECHANGELOG table -->
        <execution>
            <id>drop-db</id>
            <phase>process-resources</phase>
            <goals>
                <goal>dropAll</goal>
            </goals>
            <configuration>
                <propertyFile>target/classes/liquibase.properties</propertyFile>
                <changeLogFile>target/classes/db/db.changelog.xml</changeLogFile>
            </configuration>
        </execution>
        <!-- mark db up-to-date in the  DATABASECHANGELOG table after generating 
            schema with hbm2ddl so that no migration is executed -->
        <execution>
            <id>mark-db-up-to-date</id>
            <phase>test-compile</phase>
            <goals>
                <goal>changelogSync</goal>
            </goals>
        </execution>
    </executions>
</plugin>

Our liquibase.properties config file:

driver=${jdbc.driverClassName}
url=${jdbc.url}
username=${jdbc.username}
password=${jdbc.password}

Our initial empty liquibase changelog file (db.changelog.xml):

<databaseChangeLog>
</databaseChangeLog>


Generating db diffs to consolidate our model


And now we want to generate our db migration so we can consolidate our updated model. We will want our app ready to run a db update when deployed.

To generate our db diff, this is what we'll do:
  • Generate a mydb_prev schema, based on the current liquibase registered changelogs.
  • Generate a mydb schema based on our JPA annotations
  • Compute the db diff between these schemas and generate the db changelogs
We'll do this in a specific maven profile (db-diff), so we can activate it at any time to generate the db changelogs.

<plugin>
    <groupId>org.liquibase</groupId>
    <artifactId>liquibase-maven-plugin</artifactId>
    <version>${liquibase.version}</version>
    <configuration>
        <propertyFile>target/classes/liquibase-diff.properties</propertyFile>
        <changeLogFile>target/classes/db/db.changelog.xml</changeLogFile>
        <diffChangeLogFile>src/main/resources/db/db-${timestamp}.changelog.xml</diffChangeLogFile>
        <logging>info</logging>
    </configuration>
    <executions>
        <execution>
            <id>generate-db-prev</id>
            <phase>process-resources</phase>
            <goals>
                <goal>update</goal>
            </goals>
            <configuration>
                <dropFirst>true</dropFirst>
            </configuration>
        </execution>
        <execution>
            <id>generate-db-diff</id>
            <phase>process-test-resources</phase>
            <goals>
                <goal>diff</goal>
            </goals>
        </execution>
    </executions>
    <dependencies>
        <dependency>
            <!-- jdbc driver here -->
        </dependency>
    </dependencies>
</plugin>

I omit the buildnumber-maven-plugin config in our pom.xml at validate phase so that we can generate unique changelog filenames based on current timestamp.

Our liquibase-diff.properties config file:

driver=${jdbc.driverClassName}
url=${jdbc.url.prev}
username=${jdbc.username}
password=${jdbc.password}
referenceDriver=${jdbc.driverClassName}
referenceUrl=${jdbc.url}
referenceUsername=${jdbc.username}
referencePassword=${jdbc.password}

We can now run maven like this:
mvn process-test-resources -Pdb-diff

and voilà. Liquibase generates for us the changelog file for the initial version of our db, as we have run the diff against an empty changelog file:

<databaseChangeLog>
    <changeSet author="jgarcia (generated)" id="1354207484885-1">
        <createTable tableName="app_user">
            <column autoIncrement="true" name="id" type="BIGINT">
                <constraints nullable="false" primaryKey="true"/>
            </column>
            <column name="account_expired" type="BIT">
                <constraints nullable="false"/>
            </column>
            <column name="account_locked" type="BIT">
                <constraints nullable="false"/>
            </column>
            <column name="address" type="VARCHAR(150)"/>
            <column name="city" type="VARCHAR(50)"/>
            <column name="country" type="VARCHAR(100)"/>
            ...
        </createTable>
    </changeSet>
    <changeSet author="jgarcia (generated)" id="1354207484885-2">
        <createTable tableName="role">
    ...

We can now include this file in our initially empty db.changelog.xml file like this:

<databaseChangeLog>
    <include file="db/db-20121120_120949.changelog.xml" />
</databaseChangeLog>


Generating preloaded db data if any


Many web apps will have a set of data preloaded in the db: initial set of internal user accounts, available user roles, list of applicable taxes, ... whatever.

These can also be defined as a changesets so that Liquibase can update the database for us when running our app.

To generate data changesets, the maven liquibase plugin won't be of much help, as it does not include a goal for this. Instead, since the Liquibase main Java class is also included in the plugin, we'll call it directly, as if we were using it from the command line. We'll do it with the exec-maven-plugin in a db-data maven profile so that we can generate the preloaded db data at any time:

<plugin>
    <groupId>org.codehaus.mojo</groupId>  
    <artifactId>exec-maven-plugin</artifactId>  
    <version>1.2.1</version>
    <executions>
        <execution>
            <phase>process-resources</phase>
            <goals>
                <goal>java</goal>
            </goals>
            <configuration>
                <mainClass>liquibase.integration.commandline.Main</mainClass>
                <includePluginDependencies>true</includePluginDependencies>
                <arguments>  
                    <argument>--driver=${jdbc.driverClassName}</argument>
                    <argument>--changeLogFile=src/main/resources/db/db-data-${timestamp}.changelog.xml</argument>
                    <argument>--url=${jdbc.url}</argument>
                    <argument>--username=${jdbc.username}</argument>
                    <argument>--password=${jdbc.password}</argument>
                    <argument>--diffTypes=data</argument>
                    <argument>--logLevel=info</argument>
                    <argument>generateChangeLog</argument>
                </arguments>
            </configuration>
        </execution>
    </executions>
    <dependencies>
        ...
        <!-- jdbc driver -->
        <!-- liquibase plugin -->
        ...
    </dependencies>
</plugin>
...
<properties>
    <!-- avoid generating db schema + inserting db-unit -->
    <skipTests>true</skipTests>
</properties>

In AppFuse, the profile prod feeds the database with production data instead of test data. We use this profile to regenerate the db and populate it with production data.
After this, we can use the db-data profile to generate our changelog for the initial db data.

After running the maven commands:
mvn test-compile -Pprod
mvn process-resources -Pdb-data

we obtain this file from liquibase:

<databaseChangeLog>
    <changeSet author="jgarcia (generated)" id="1354214520109-1">
        <insert tableName="user_role">
            <column name="user_id" valueNumeric="2"/>
            <column name="role_id" valueNumeric="1"/>
        </insert>
        ...
    </changeSet>
    <changeSet author="jgarcia (generated)" id="1354214520109-2">
        <insert tableName="role">
            <column name="id" valueNumeric="1"/>
            <column name="description" value="Administrator role (can edit Users)"/>
            <column name="name" value="ROLE_ADMIN"/>
        </insert>
        ...
    </changeSet>
    <changeSet author="jgarcia (generated)" id="1354214520109-3">
        <insert tableName="app_user">
            <column name="id" valueNumeric="1"/>
            <column name="account_expired" valueBoolean="false"/>
            <column name="country" value="US"/>
            <column name="postal_code" value="80210"/>
            ...

This file needs to be updated, as liquibase:
  • dumps all data: you will have to keep only the data added since the previous db version, as Liquibase performs no diff here
  • does not order the data properly with regard to referential integrity

Once this file has been cleaned-up, we can include it as well in our main db.changelog.xml file:

<databaseChangeLog>
    <include file="db/db-20121120_120949.changelog.xml" />
    <include file="db/db-data-20121128_170043.changelog.xml" />
</databaseChangeLog>


Exercising the automatic db migration during rapid testing


Our app is now ready to perform all registered db migrations when deployed in a server. However, it would be nice too to exercise this during development when we launch a jetty:run for rapid testing of our unpackaged app.

For this purpose, we add a profile that performs these steps:
  • drops all tables from our schema
  • skips db schema generation from our JPA annotations
  • skips feeding db with unit test data
We can add these in a db-test profile:
<plugin>
    <groupId>org.liquibase</groupId>
    <artifactId>liquibase-maven-plugin</artifactId>
    <version>2.0.5</version>
    <executions>
        <execution>
            <id>drop-db</id>
            <phase>process-resources</phase>
            <goals>
                <goal>dropAll</goal>
            </goals>
            <configuration>
                <propertyFile>target/classes/liquibase.properties</propertyFile>
                <skip>false</skip>
            </configuration>
        </execution>
    </executions>
</plugin>
...
<properties>
    <skipTests>true</skipTests>
</properties>

We can now exercise our db migration to check it works fine:

> mvn jetty:run -Pdb-test
...
[INFO] [liquibase:dropAll {execution: drop-db}]
...
[INFO] [hibernate3:hbm2ddl {execution: default}]
[INFO] skipping hibernate3 execution
...
[INFO] Started Jetty Server
...
2012-11-30 13:25:19.341:INFO:/:Initializing Spring root WebApplicationContext
INFO 30/11/12 13:25:liquibase: Successfully acquired change log lock
INFO 30/11/12 13:25:liquibase: Reading from `migration`.`DATABASECHANGELOG`
INFO 30/11/12 13:25:liquibase: Reading from `migration`.`DATABASECHANGELOG`
INFO 30/11/12 13:25:liquibase: ChangeSet db/db-20121120_120949.changelog.xml::20121120_120949::jgarcia (generated) ran successfully in 921ms
INFO 30/11/12 13:25:liquibase: ChangeSet db/db-data-20121128_170043.changelog.xml::20121128_170043-data::jgarcia (generated) ran successfully in 37ms
INFO 30/11/12 13:25:liquibase: Successfully released change log lock

The logs show Liquibase has updated our db successfully.


Liquibase pitfalls


Your db schema will usually be specified in your jdbc connection parameters.
Liquibase automatically inserts schema references in the generated changelogs, for indexes, like this:
  • baseTableSchemaName="migration"
  • referencedTableSchemaName="migration"
You'd better erase these, or you will run into trouble if you set a different schema name in your jdbc configuration file.

For a full list of the maven liquibase plugin goals and params, you can run this command:

mvn liquibase:help

It is up-to-date, as opposed to the liquibase site documentation.

Liquibase validates db change sets by comparing some attributes of the change sets present in the changelog file against those registered in the DATABASECHANGELOG table as applied change sets. It compares:
  • the full path filename that contains the change set
  • the MD5 checksum of the change sets
If any of these is different, it will try to re-apply the changeset. If you want to avoid this, you can clear the corresponding fields in the db table and liquibase will refill them with the actual values.

To avoid differences in filenames because of different paths (relative vs absolute paths, path updates, etc.), you can set the logicalFilePath attribute of the databaseChangeLog element in each liquibase file. There is no parameter to omit the path of changelog files in an applied changeset.


Liquibase best practices


It is cleaner to have a single db.changelog.xml file that includes the generated db changelog files, so changelogs are grouped together:

<databaseChangeLog>
    <include file="db/db-20121120_120949.changelog.xml" />
    <include file="db/db-data-20121128_170043.changelog.xml" />
    <include file="db/db-20121129_093229.changelog.xml" />
</databaseChangeLog>

I also like to consolidate a set of related changesets from a changelog file into a single changeset. Instead of having many changesets, each one creating a table, creating an index, adding a referential integrity constraint, etc., I tend to group many of these updates into a single changeset.

Liquibase autogenerates a numeric id to identify each changeset. I prefer to assign it a timestamp, as it gives more info and they still appear ordered.

Test, test, test.


Sources


Sources can be found here.

2012/11/07

About Me

I have been working as a developer for more than 20 years. I enjoy software development.

In these years I have worked with a variety of technologies: C, C++, ObjectStore and Poet at the beginning of my professional career. Later on, Java and the incipient servlets, Oracle, MySql. Then Struts, Hibernate, Lucene ... And lately, I am working with Java, Spring, Spring Security, Hibernate Search, Struts 2, Apache CXF, Bootstrap and jQuery, to name a few.

Same goes for Engineering practices: from cascade lifecycle to spiral to agile.

And the tools: make, ant, maven, jenkins, ... CVS, visual source safe, subversion, mercurial, git ...

Lately I have been experimenting with Ruby on Rails, Grails and MongoDB. There are some great online courses around about these!

I am interested in technologies that help build better applications: applications that meet user expectations, with beautiful and maintainable code.

I am a committer on AppFuse: an open-source Java web framework. Among other things, I have contributed:
  • better i18n support
  • the upgrade to Hibernate 4
  • the re-implementation of the full-text search service with Hibernate Search + Lucene

BitBucket GitHub