
Showing posts with label patterns. Show all posts

Monday, April 11, 2005

Immutable classes

Here's my opinion on immutable classes, answering Bayu's comment which asks for an explanation on when and how to implement them.

Making a Java class immutable usually makes the overall design of a system slightly better than making every class mutable. Whenever possible, making such a class immutable will make it easier to implement and maintain.

First, don't provide any mutator methods, i.e. methods which can modify the internal state of the class. If you must provide a mutator method, make sure it's thread-safe.
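To make the "no mutators" rule concrete, here's a minimal sketch using a hypothetical Money class (the name and fields are mine, purely for illustration): instead of a setter, any change produces a fresh instance.

```java
// Hypothetical example: an immutable value class with no mutators.
public class MoneyDemo {

  static final class Money {
    private final long cents;
    Money (long cents) { this.cents = cents; }
    long getCents () { return cents; }
    Money add (Money other) {
      // no field is ever modified; a fresh instance carries the result
      return new Money (cents + other.cents);
    }
  }

  public static void main (String[] args) {
    Money a = new Money (100);
    Money b = a.add (new Money (50));
    // a is untouched; b holds the sum
    System.out.println (a.getCents () + " " + b.getCents ());
  }
}
```

Because no instance can ever change, sharing a Money object between threads is safe without any synchronization.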

Then, make all fields private. If you need to expose a constant value through a public static final field, make sure that the internals of the constant value cannot be modified. Otherwise, you'll only open a hidden vulnerability. For example, the elements of an array or List constant may be modified without changing the reference to the array or List itself.
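Here's a small sketch of that vulnerability and one way to close it, using a hypothetical Colors class (written in pre-generics style to match the rest of this post):

```java
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

// Hypothetical example: exposing a List constant without exposing
// its internals.
public class Colors {

  // Unsafe version: even though the field is final, callers could
  // still do COLORS.set (0, "mauve") and corrupt the "constant".
  // public static final List COLORS = Arrays.asList (...);

  // Safe version: the wrapper rejects every mutator call at runtime.
  public static final List COLORS =
      Collections.unmodifiableList (Arrays.asList (
          new String[] {"red", "green", "blue"}));

  public static void main (String[] args) {
    try {
      COLORS.set (0, "mauve");        // any mutation attempt...
      System.out.println ("mutated");
    } catch (UnsupportedOperationException e) {
      System.out.println ("rejected"); // ...is rejected
    }
  }
}
```

The same idea applies to arrays, except there is no unmodifiable wrapper for them, so a defensive copy in an accessor method is the usual fix.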

Sometimes, writing an immutable class is a pleasure in itself. It's very easy to test; sometimes it needs no testing at all because it simply cannot break. It's also nice to create constant values which are instances of the immutable class, just like Boolean.TRUE, Boolean.FALSE, etc.

There are many great examples of immutable class usage. The immutability itself can even be raised through multiple levels, providing even tighter access at each level, e.g. no methods can be overridden, etc. I suggest you read the Effective Java book by Joshua Bloch. The book discusses immutability at great length.

I hope this short introduction to immutability will lure many future designs to favor it whenever possible.. :D

Thursday, March 31, 2005

List Pagination (Continued)

To answer Josh's comment on why the implementation of list pagination is quite long, here's a brief background. This post may be beneficial to Java/OOP newbies, in terms of thinking in OOP and working in a TDD (Test Driven Development) way.

It all started with modeling what a Page object might look like. Well, it has to have a page number and the contents (list of records) itself. Then, we need to define what we can do with a paginating list. We need to be able to get the first page, the last page, and the previous & next page based on the current Page object. These are all described in the Paginating interface.

Then, I added two more pieces of information to the Page class, which can be provided easily by any implementation of the Paginating interface, i.e. the total number of pages and the number of records shown per page. The addition of these two properties to the Page class proves to be very beneficial to the PaginatingImpl class itself.

Now, before we dive into why PaginatingImpl looks the way it does, one thing needs to be restated here. PaginatingImpl is just one example of how the Paginating interface can be implemented. You can provide your own (possibly very different) implementation. The only requirement is that the implementation supports all of the methods declared in the Paginating interface.

When developing PaginatingImpl, one thing I kept in mind is that I wanted this class to be immutable and thread-safe. By being immutable, PaginatingImpl almost automatically becomes thread-safe and therefore sharable & cacheable. Hence I say that this solution can be used for a web application or a non-web application.

The implementation code within PaginatingImpl itself is straightforward and is mainly driven by the test code. If you haven't downloaded the test class, I suggest you download it here first. It is the key driver which led the PaginatingImpl & Page classes to evolve into what they are currently.

For example, to ease the assertion code within the PaginatingTest class, I added the equals() and hashCode() methods to the Page class. Then, after performing several test cases, I realized I shouldn't rely on the particular List implementation's equals() and hashCode(); otherwise I might get an inconsistent (and invalid) result when comparing two List objects which contain exactly the same sequence of the same elements.

Hence I added two more methods, i.e. isListEqual() and listHashCode(), which are merely my own implementations of equals() and hashCode() for a List. (Note: BTW, to all the Java newbies reading this post, if you don't know why equals() and hashCode() need to be overridden consistently at the same time, please leave me a comment and I'll try to post an explanation. Or you can Google it, read your javadoc API, or search Javaranch's SCJP Forum.)
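For the curious newbies, here's a minimal sketch of what goes wrong when equals() is overridden without hashCode(). The Point classes are hypothetical, invented just for this demo: a HashSet first locates the bucket by hash code and only then calls equals(), so two "equal" objects with different hashes are never found.

```java
import java.util.HashSet;
import java.util.Set;

// Hypothetical class: equals() overridden, hashCode() forgotten.
class BrokenPoint {
  final int x, y;
  BrokenPoint (int x, int y) { this.x = x; this.y = y; }
  public boolean equals (Object o) {
    return (o instanceof BrokenPoint)
        && ((BrokenPoint) o).x == x
        && ((BrokenPoint) o).y == y;
  }
  // hashCode() is still Object's identity hash
}

// The same class done right: equal objects share one hash code.
class Point extends BrokenPoint {
  Point (int x, int y) { super (x, y); }
  public int hashCode () { return 29 * x + y; }
}

public class HashDemo {
  public static void main (String[] args) {
    Set broken = new HashSet ();
    broken.add (new BrokenPoint (1, 2));

    Set fixed = new HashSet ();
    fixed.add (new Point (1, 2));

    // broken is almost certainly false, fixed is always true
    System.out.println (
        "broken=" + broken.contains (new BrokenPoint (1, 2))
        + " fixed=" + fixed.contains (new Point (1, 2)));
  }
}
```

This is exactly why Page overrides both methods, and why listHashCode() mirrors isListEqual().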

The toString() method was added to the Page class merely to support easy debugging, which I used exactly once before sharing this code with you all. So the method definitely has its uses.

If you read the PaginatingImpl carefully enough, you may find strange-looking arithmetic operations, e.g. minus 1, minus 2, or plus 1. Those are small bits of logic, but they're crucial to preventing bugs in this rather small implementation of list pagination.
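Those off-by-one adjustments all come from mapping 1-based page numbers onto 0-based list indices. Here's a self-contained sketch of the same arithmetic (independent of the classes below; the numbers are just an example):

```java
import java.util.ArrayList;
import java.util.List;

public class PageMath {
  public static void main (String[] args) {
    List items = new ArrayList ();
    for (int i = 0; i < 7; i++) items.add ("item" + (i + 1));

    int pagesize = 3;
    // total pages: ceiling of size/pagesize without floating point,
    // hence the minus 1 / plus 1 pair
    int totalPage = ((items.size () - 1) / pagesize) + 1;  // 3

    // page numbers are 1-based, list indices are 0-based,
    // hence the (pageNum - 1) below
    int pageNum = 3;
    int start = (pageNum - 1) * pagesize;                  // 6
    int end = Math.min (start + pagesize, items.size ());  // 7
    List page = items.subList (start, end);                // last, short page

    System.out.println (totalPage + " " + page);
  }
}
```

Drop any one of those adjustments and you either lose the final partial page or run past the end of the list.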
If you want to test yourself, just take three of my classes, Page, Paginating & PaginatingTest, then develop your own PaginatingImpl class. It's a good challenge; you may just develop a better implementation than mine, and enrich PaginatingTest with many more test cases.

I sure would love to hear from anyone who wants to take the challenge. It's a good start on all three elements of today's software development, i.e. Java, OOP & TDD. You need to download the JUnit libraries though, but that should be no big deal.

Designing just another solution may be an easy task, but designing a good, robust & extensible one may not be such an easy task.. :D

Anyway, let me know what you guys think.
I sure hope this post is even more useful than my previous one.

Further Reading:
Test Driven Development by Example
Head First Design Patterns
Design Patterns
JUnit in Action
JUnit Recipes

Wednesday, March 30, 2005

List Pagination (Value List Holder)

Someone at our JUG Indonesia has a problem with displaying a List across a number of pages (aka list pagination). This problem can be solved using the Value List Holder design pattern. I'm attaching my solution to the problem for him (or anyone who may find this useful), so that it'd be easier for him to have a look and discuss.

This solution can be used for both web application and non-web application.
Anyway, just put a comment if you have one.. :D

You can download the zipped source code (plus the test class) here.

import java.util.Iterator;
import java.util.List;

public class Page {

  private final int pageNum;
  private final int totalPage;
  private final int pagesize;
  private final List contents;

  public Page (final int pageNum,
      final int totalPage,
      final int pagesize,
      final List contents) {
    this.pageNum = pageNum;
    this.totalPage = totalPage;
    this.pagesize = pagesize;
    this.contents = contents;
  }

  public int getPageNum() {
    return pageNum;
  }

  public int getTotalPage() {
    return totalPage;
  }

  public int getPagesize() {
    return pagesize;
  }

  public List getContents() {
    return contents;
  }

  public boolean isFirstPage () {
    return pageNum == 1;
  }

  public boolean isLastPage () {
    return pageNum == totalPage;
  }

  public boolean equals(Object o) {
    if (this == o) return true;
    if (!(o instanceof Page)) return false;

    final Page page = (Page) o;

    if (pageNum != page.pageNum) return false;
    if (pagesize != page.pagesize) return false;
    if (totalPage != page.totalPage) return false;
    if (contents != null ?
      !isListEqual (contents, page.contents)
      : page.contents != null)
        return false;

    return true;
  }

  public int hashCode() {
    int result;
    result = pageNum;
    result = 29 * result + totalPage;
    result = 29 * result + pagesize;
    result = 29 * result + (contents != null ?
      listHashCode (contents) : 0);
    return result;
  }

  private boolean isListEqual (
    final List a, final List b) {
    if (a == b || a.equals (b)) return true;
    if (b == null) return false;

    final Iterator ia = a.iterator ();
    final Iterator ib = b.iterator ();
    while (ia.hasNext() && ib.hasNext()) {
      final Object oa = ia.next();
      final Object ob = ib.next();
      if (!oa.equals(ob)) {
        return false;
      }
    }
    if (ia.hasNext() || ib.hasNext()) {
      return false;
    }
    return true;
  }

  private int listHashCode (final List a) {
    int result = 0;
    for (Iterator iterator = a.iterator();
      iterator.hasNext();) {
      final Object o = iterator.next();
      result = 29 * result + o.hashCode();
    }
    return result;
  }

  public String toString () {
    final StringBuffer sb = new StringBuffer ();
    sb.append ("Page ").append (pageNum)
      .append (" of ").append (totalPage);
    sb.append ("\n");

    for (Iterator it = contents.iterator();
      it.hasNext();) {
      final Object o = it.next();
      sb.append (o).append ("\n");
    }
    return sb.toString ();
  }
}


public interface Paginating {

  Page getFirstPage ();

  Page getLastPage ();

  Page getNextPage (Page currentPage);

  Page getPrevPage (Page currentPage);
}


import java.util.List;

public class PaginatingImpl implements Paginating {

  private final List originalList;
  private final int pagesize;
  private static final String INVALID_PAGESIZE =
    "Pagesize must be a positive integer.";

  public PaginatingImpl (final List originalList,
    final int pagesize)
  throws IllegalArgumentException {
    if (pagesize <= 0)
      throw new IllegalArgumentException (INVALID_PAGESIZE);
    this.originalList = originalList;
    this.pagesize = pagesize;
  }

  public Page getFirstPage () {
    Page result = null;
    if (originalList != null && originalList.size () > 0) {
      result = new Page (1, getTotalPage(), pagesize,
        iterateFrom (0));
    }
    return result;
  }

  public Page getLastPage () {
    Page result = null;
    if (originalList != null && originalList.size() > 0) {
      final int totalPage = getTotalPage();
      final int startIndex = (totalPage - 1) * pagesize;
      result = new Page (totalPage, totalPage, pagesize,
        iterateFrom (startIndex));
    }
    return result;
  }

  public Page getNextPage (final Page currentPage) {
    if (currentPage == null) return getFirstPage ();
    if (currentPage.isLastPage()) return currentPage;

    Page result = null;
    if (originalList != null) {
      result = new Page (currentPage.getPageNum() + 1,
        currentPage.getTotalPage(),
        pagesize,
        iterateFrom (currentPage.getPageNum() * pagesize));
    }
    return result;
  }

  public Page getPrevPage (final Page currentPage) {
    if (currentPage == null) return getFirstPage ();
    if (currentPage.isFirstPage()) return currentPage;

    Page result = null;
    if (originalList != null) {
      result = new Page (currentPage.getPageNum() - 1,
        currentPage.getTotalPage(),
        pagesize,
        iterateFrom ((currentPage.getPageNum() - 2) *
          pagesize));
    }
    return result;
  }

  private List iterateFrom (final int startIndex) {
    final int totalSize = originalList.size ();

    int endIndex = startIndex + pagesize;
    if (endIndex > totalSize) endIndex = totalSize;

    return originalList.subList (startIndex, endIndex);
  }

  private int getTotalPage () {
    if (originalList == null || originalList.size() <= 0)
      return 0;
    final int totalSize = originalList.size();
    return ((totalSize - 1) / pagesize) + 1;
  }
}


Further Reading:
Head First Design Patterns
Design Patterns
JUnit in Action
JUnit Recipes

Friday, October 22, 2004

implementing Visitor pattern

During the development of any project, there are many times when we have to deal with Collection classes, especially List & Map. Sometimes we have to iterate through a List to find an object which matches our criteria. Sometimes we iterate to filter out elements of the List which do not meet our purpose. Sometimes we iterate the List to summarize its values. There is plenty of stuff we can perform while iterating a List.

Currently, I'm trying to implement the Visitor pattern while iterating through a List. There are several cases I encounter which require me to iterate through a List and filter out elements which do not meet criteria set earlier. At first, this may seem a simple thing to do: just iterate, compare & remove. But, living up to the DRY (Don't Repeat Yourself) principle, I'm trying to *think* a level of abstraction above the problem.

Here's what a simple filter code would look like:

for (ListIterator it = aList.listIterator ();
     it.hasNext ();) {
  Customer c = (Customer) it.next ();
  String occupation = c.getOccupation ();
  if (occupation == null ||
      occupation.equals ("java developer")) {
    it.remove ();
  }
}

Now, when I need similar filter logic (iterate, compare & remove) to be reused for different sets of data and comparison rules, I'd have to recode the whole iteration again. And this type of thing tends to grow in number before a project ends.

So, here's my current approach. I'm declaring an interface which every filter class needs to implement.

public interface ListFilter {
  public boolean passes (Object o);
}

This filter will be used within the generic iteration. Here's an example where I use the generic (iterate, compare & remove) logic to filter the List based on more than one filter rule.

public static List filterList (
    List original, ListFilter[] filters) {
  if (original == null ||
      filters == null || filters.length <= 0) {
    return null;
  }

  // assume you have a method to clone the List
  List cloned = cloneList (original);

  for (ListIterator it = cloned.listIterator ();
       it.hasNext ();) {
    Object o = it.next ();
    for (int i=0; i<filters.length; i++) {
      ListFilter filter = filters[i];
      if (filter != null && !filter.passes(o)) {
        it.remove();
        break;
      }
    }
  }

  return cloned;
}

For the above sample case, where we'd like to filter out all unemployed Customers and all java developers from the sales options the company is trying to promote, we could implement the ListFilter as follows:

public class OccupationFilter implements ListFilter {

  private List forbiddenOccupations;

  public OccupationFilter (List forbiddenOccupations) {
    this.forbiddenOccupations = forbiddenOccupations;
  }

  public boolean passes (Object o) {
    // instanceof also handles the null check
    if (!(o instanceof Customer)) {
      return false;
    }
    if (forbiddenOccupations == null ||
        forbiddenOccupations.isEmpty ()) {
      return true;
    }
    Customer c = (Customer) o;
    return !forbiddenOccupations.contains (c.getOccupation ());
  }
}

And, since our generic filter logic is capable of applying multiple filters during a single iteration, it lets us easily add more filters as we see fit. These filters can even act as singletons if their rules have no dynamic part.
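Here's a runnable sketch of that single-pass, multi-filter idea. To keep it self-contained I use plain Strings instead of Customers, inline the filterList() logic from above, and invent two throwaway filters; all the names here are illustrative only.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.ListIterator;

interface ListFilter {
  boolean passes (Object o);
}

public class FilterDemo {

  // same generic iterate-compare-remove logic as in the post,
  // inlined here so the sketch runs on its own
  static List filterList (List original, ListFilter[] filters) {
    if (original == null || filters == null || filters.length == 0)
      return null;
    List cloned = new ArrayList (original);
    for (ListIterator it = cloned.listIterator (); it.hasNext ();) {
      Object o = it.next ();
      for (int i = 0; i < filters.length; i++) {
        if (filters[i] != null && !filters[i].passes (o)) {
          it.remove ();
          break;   // one failed filter is enough to remove
        }
      }
    }
    return cloned;
  }

  public static void main (String[] args) {
    List names = new ArrayList (Arrays.asList (
        new String[] {"ant", "bee", "cat", "albatross"}));

    // two independent rules, applied in a single pass
    ListFilter startsWithA = new ListFilter () {
      public boolean passes (Object o) {
        return ((String) o).startsWith ("a");
      }
    };
    ListFilter shortName = new ListFilter () {
      public boolean passes (Object o) {
        return ((String) o).length () <= 3;
      }
    };

    System.out.println (filterList (names,
        new ListFilter[] {startsWithA, shortName}));
  }
}
```

Only the elements passing every filter survive the single iteration; adding a third rule is just one more element in the array.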

I'm currently trying to abstract out a summarizing logic (iterate, compare, summarize if necessary) from the same List iteration. I hope I can find a neat way to do it.. :D

Further Reading:
Refactoring: Improving the Design of Existing Code
Design Patterns
Head First Design Patterns


Tuesday, September 28, 2004

singleton anti-pattern

At first, Singleton was introduced in the legendary GoF book as one of the creational design patterns. The Singleton design pattern helps ensure that there will only be one instance of the Singleton class per JVM. While the Singleton pattern may be useful for certain types of classes, e.g. constant classes, class loaders, etc., it may not be appropriate for other types of classes.

But now it's suggested that the usage of this pattern should be limited to those certain types of classes, because the Singleton pattern makes it almost impossible for the class (and possibly all other classes which depend on it) to be testable. It's very hard to subclass, or to create a mock object for, a Singleton class.
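To see why, here's the classic GoF shape with a hypothetical ConfigRegistry class (the name is mine): the private constructor that guarantees "one instance per JVM" is the very thing that blocks subclassing and mock substitution in tests.

```java
public class SingletonDemo {

  // classic GoF singleton, hypothetical example class
  static class ConfigRegistry {
    private static final ConfigRegistry INSTANCE = new ConfigRegistry ();
    private ConfigRegistry () { }  // blocks subclassing & mock objects
    static ConfigRegistry getInstance () { return INSTANCE; }
    String get (String key) { return "value-for-" + key; }
  }

  public static void main (String[] args) {
    // every caller is hard-wired to the very same instance, so a
    // test cannot slip in a stub ConfigRegistry behind its back
    System.out.println (
        ConfigRegistry.getInstance () == ConfigRegistry.getInstance ());
  }
}
```

A test of any class that calls ConfigRegistry.getInstance() internally is stuck with the real registry, which is exactly the testability problem described above.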

It's interesting to keep adapting in order to implement Test-Driven Development.. :D

Further Reading:
Refactoring: Improving the Design of Existing Code
Design Patterns
Head First Design Patterns

Thursday, September 23, 2004

Comparator and Comparable

Again, the topic of Collection usage seems overlooked by most Java developers. Sorting is very common in any programming world, so we shouldn't bother reinventing the wheel for such a purpose. Yet as common as it is, not many of us know how to do it correctly and efficiently.

In the real business world, there are many cases where we're required to sort a Collection of user-defined objects. If the elements of the Collection are of String type, or any other class which implements Comparable, then the sorting can be performed in the natural order. But sometimes we want a different sorting criterion, or we want to sort a user-defined type.

Here's where Comparator comes into play. Clean and extensible.

Here's an example of a simplified user-defined type:

public class Customer {
  private String firstName;
  private String middleName;
  private String lastName;

  // setter & getter methods..
}

In the above class, the most common natural ordering would be by lastName. This can easily be implemented using the Comparable interface, i.e.

public class Customer implements Comparable {
  // as shown previously..

  public int compareTo (Object o) {
    if (o == null || !(o instanceof Customer)) {
      throw new IllegalArgumentException ("...");
    }
    Customer c = (Customer) o;
    String cLastName = c.getLastName();

    if (lastName == null && cLastName == null) return 0;
    // assuming you want null values shown last
    if (lastName != null && cLastName == null) return -1;
    if (lastName == null && cLastName != null) return 1;
    return lastName.compareTo (cLastName);
  }
}

Sorting a List of Customer objects would be as simple as:

  Collections.sort (customerList);

But if we want a different ordering, e.g. by first name, then we cannot use the natural ordering defined within the Customer class. Instead, we have to define an alternative ordering in the form of a Comparator class.

public class CustomerFirstNameComparator
implements Comparator {
  // use singleton whenever possible..

  public int compare (Object o1, Object o2) {
    if (o1 == null && o2 == null) return 0;
    // assuming you want null values shown last
    if (o1 != null && o2 == null) return -1;
    if (o1 == null && o2 != null) return 1;
    if (!(o1 instanceof Customer) ||
        !(o2 instanceof Customer)) {
      throw new IllegalArgumentException ("...");
    }

    Customer c1 = (Customer) o1;
    Customer c2 = (Customer) o2;
    String firstName1 = c1.getFirstName();
    String firstName2 = c2.getFirstName();

    if (firstName1 == null && firstName2 == null) return 0;
    // assuming you want null values shown last
    if (firstName1 != null && firstName2 == null) return -1;
    if (firstName1 == null && firstName2 != null) return 1;
    return firstName1.compareTo (firstName2);
  }
}

Sorting a List of Customer objects by their first name would be:

  // assuming you implement singleton..
  Comparator comparator =
    CustomerFirstNameComparator.getInstance();

  Collections.sort (customerList, comparator);

Simple, clean & extensible. You can define more and more Comparator classes to suit your needs. And since a Comparator is just a Java class, you can also perform complex comparisons on the objects.
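To tie the pieces together, here's a self-contained version of the whole idea. I've simplified things for the sketch: the Customer is a cut-down stand-in for the class above, and the Comparator is an anonymous class rather than the singleton used earlier (null handling omitted for brevity).

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.Comparator;
import java.util.List;

public class SortDemo {

  // simplified stand-in for the Customer class above
  static class Customer {
    private final String firstName;
    private final String lastName;
    Customer (String firstName, String lastName) {
      this.firstName = firstName;
      this.lastName = lastName;
    }
    String getFirstName () { return firstName; }
    String getLastName () { return lastName; }
  }

  public static void main (String[] args) {
    List customers = new ArrayList ();
    customers.add (new Customer ("Carol", "Young"));
    customers.add (new Customer ("Alice", "Zimmer"));
    customers.add (new Customer ("Bob", "Xu"));

    // alternative ordering: by first name instead of last name
    Comparator byFirstName = new Comparator () {
      public int compare (Object o1, Object o2) {
        String f1 = ((Customer) o1).getFirstName ();
        String f2 = ((Customer) o2).getFirstName ();
        return f1.compareTo (f2);
      }
    };
    Collections.sort (customers, byFirstName);

    StringBuffer sb = new StringBuffer ();
    for (int i = 0; i < customers.size (); i++) {
      if (i > 0) sb.append (", ");
      sb.append (((Customer) customers.get (i)).getFirstName ());
    }
    System.out.println (sb);
  }
}
```

Note that sorting by last name (the natural order) would have produced the opposite sequence, which is exactly why the alternative Comparator earns its keep.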

Friday, September 17, 2004

decoupling in session facade

Some J2EE architectures involve the Session Facade design pattern, as a way to hide the implementation from the client and to minimize network roundtrips (if the session has a remote interface). Underneath the facade, most of these architectures have either an Entity Bean, a POJO, or another Session Bean, depending on the needs & requirements.

Many developers still get the facade concept wrong. Sometimes there's hardly any difference between the responsibilities of the session facade and the responsibilities of the underlying classes. Sometimes both layers implement the business rules and access the DB at the same time. This may lead to a bad situation where less and less code can be reused.

Here's my opinion on the facade concept. I try to decouple the DB access layer from the business rules layer as cleanly as possible. For example, I've always tried to put the business rules in the facade, and leave the DB access to the underlying layer, be it an Entity Bean, a POJO or another Session Bean. This helps me a lot in terms of reusability, and eases portability a bit.

I get to reuse the same DB access method for multiple business rules. I get the chance to refactor these common methods to be as generic as possible while still keeping performance in check. I also get a better shot at switching the DB access layer, e.g. from Session Bean (DAO) to Entity Bean, or from TopLink to Hibernate, etc. Only my DB access layer needs to be modified; my business rules layer should be far less impacted by such a switch.

A sample facade would be like the following:

public void deleteUser (String stUserID,
    String stLoginUserID)
throws Exception {
  // verify that all arguments are valid
  // ...

  if (stUserID.equals(stLoginUserID)) {
    // cannot delete him/herself
    throw new Exception ("...");
  }

  // delegate the process to underlying layer
  List roles = listUserRoles (stUserID);

  for (Iterator it = roles.iterator();
    it.hasNext(); ) {
    RoleVO role = (RoleVO) it.next();
    String roleName = role.getName();
    if (roleName.equals("SUPERVISOR")) {

      // delegate the process to underlying layer
      reassignMembers (stUserID);

    } else if (roleName.equals("MEMBER")) {

      // delegate the process to underlying layer
      reassignTasks (stUserID);

    } else if (roleName.equals("ADMIN")) {

      // admin users cannot be deleted,
      // they must be demoted first
      throw new Exception ("...");
    }
  }

  // delegates the process to underlying layer
  invalidateUser (stUserID);
}

As the example above shows, DB access to different tables is delegated to the underlying layer, i.e. listUserRoles (String), reassignMembers (String), reassignTasks (String), and invalidateUser (String). On the other hand, the business rules, i.e. the validation against self-removal, the validation of the admin role, and the additional steps to be performed when the user is a supervisor/member before actually invalidating the user, stay in the facade.

This decoupling improves readability and reusability, and at the same time minimizes the ripple effects should the underlying DB schema change, or the business rules be slightly modified.

I know sometimes it's hard to get the discipline..
But in the end, I think it's worth the efforts.. :D

Wednesday, September 15, 2004

optimizing collection usage

We've been playing with Collection classes for years now. We used to play with Vector & Hashtable, before we were given ArrayList, HashMap, and the new Linked* classes. But the question still remains: have we used them properly and efficiently?

I believe that not many developers have realized this: improper usage of Collections may cause unnecessary memory allocation & performance degradation, especially when loops or recursive calls are involved.

For example, if we're required to create an instance of the Map interface which contains a single entry (key-value pair), and we know that this Map instance will never be altered, then the best way to go is the singletonMap() method in the Collections utility class.

Map param = Collections.singletonMap ("USER_ID", stUserID);

instead of creating a default HashMap, which will automatically allocate an internal table of 16 entries.

Map param = new HashMap ();
param.put ("USER_ID", stUserID);

By using the default HashMap constructor, we're allocating more memory than we need, and performing work we may never need. A similar method, singletonList(), exists for List as well.
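For completeness, here's the List counterpart in action (the "USER_ID" element is just a placeholder value):

```java
import java.util.Collections;
import java.util.List;

public class SingletonListDemo {
  public static void main (String[] args) {
    // one fixed, immutable element; no 10-slot backing array
    // like the default ArrayList would allocate
    List param = Collections.singletonList ("USER_ID");
    System.out.println (param.size () + " " + param.get (0));
  }
}
```

As with singletonMap(), the returned List is immutable, so it's only suitable when the single element never changes.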

In the same "optimize while initializing" spirit, both HashMap and ArrayList should be instantiated efficiently, to avoid the overhead of their auto-resizing implementations. For example, an ArrayList will automatically expand itself, copying the elements of the old internal array into a newly created one. This causes overhead in the number of instructions executed, as well as extra memory allocation. Similar conditions apply to HashMap as well.

If we know for sure the number of elements to be placed in the Collection, sizing it up front saves a good number of instructions as well as an unnecessary amount of memory.

For example, when converting the elements of a List from one class to another (e.g., from Transfer Object to ActionForm), we know the size beforehand.

public static List convert (List transferObjects) {
  List result = new ArrayList (transferObjects.size ());
  for (Iterator it = transferObjects.iterator ();
       it.hasNext (); ) {
    TransferObject obj = (TransferObject) it.next ();
    ActionForm bean = ActionFormFactory.getInstance (
        "xForm");

    // assuming that there's another private method
    // dealing with the conversion for each field
    convert (obj, bean);
    result.add (bean);
  }
  return result;
}

The same goes for HashMap, whose default load factor is 0.75. Here's my trick:

// the number of entries known to be inserted
final int KNOWN_CAPACITY = 3;
final int INITIAL_CAPACITY =
    1 + (KNOWN_CAPACITY + 1) * 4 / 3;

Map params = new HashMap (INITIAL_CAPACITY);
params.put ("USER_ID", stUserId);
params.put ("LOGIN_ROLE", stLoginRole);
params.put ("KEYWORD", stKeyword);

Hopefully, using the above approaches, I can minimize (or prevent) unnecessary auto-resizing & excessive memory allocation.
I think it's very easy to do, and it has a good impact on the performance & quality of the code.
What do you think?