Category Archives: testing

How to unit test Java servlets

The question of how to unit test Servlets comes up a lot. How can it be done? Should it be done? What are the options?

A unit test, in the realm of xUnit semantics, is an isolated test of the smallest testable subset of a program. Usually this translates to a test of a single method in an Object. When this object itself is part of a framework or container, such tests border on becoming Integration Tests. How could these types of objects still be ‘unit tested’?

Written by: Josef Betancourt, Date: 2015-09-17, Subject: Servlet testing

Options

Here are a few options.

POJO

When you write a servlet, the servlet object is ultimately instantiated by the server container. The container does a lot behind the scenes, which may prevent invoking methods on a servlet that is not attached to an actual container.

A servlet, or any other server-based object such as an EJB, provides access to problem-domain services or functionality. The easiest way to test these objects is to refactor that functionality into plain old Java objects (POJOs).

Jakob Jenkov writes: “… push the main business logic in the servlet into a separate class which has no dependencies on the Servlet API’s, if possible”.

If you're working with a framework, that is likely the design approach anyway.
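
For example, the business logic can live in a POJO and the servlet becomes a thin adapter over it. A minimal sketch; the class and method names here are made up for illustration:

import java.io.IOException;

import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// The business logic lives in a POJO, testable without any container.
class GreetingService {
    String greetingFor(String user) {
        return (user == null || user.isEmpty()) ? "Hello, stranger" : "Hello, " + user;
    }
}

// The servlet becomes a thin adapter that delegates to the POJO.
public class GreetingServlet extends HttpServlet {

    private final GreetingService service = new GreetingService();

    @Override
    protected void doGet(HttpServletRequest request, HttpServletResponse response) throws IOException {
        response.setContentType("text/plain");
        response.getWriter().write(service.greetingFor(request.getParameter("user")));
    }
}

A test of GreetingService is then an ordinary JUnit test, with no request, response, or container involved.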

Servlet stub library

A library that allows creation of “server” objects can make creating stubs for testing very easy. Again, a framework should provide such a feature.
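
For instance, the Spring Test module provides stub implementations such as MockHttpServletRequest and MockHttpServletResponse that can be handed directly to a servlet. A minimal sketch, assuming the SutServlet used in the listings below and spring-test on the test classpath:

import javax.servlet.http.HttpServletResponse;

import org.junit.Assert;
import org.junit.Test;
import org.springframework.mock.web.MockHttpServletRequest;
import org.springframework.mock.web.MockHttpServletResponse;

public class SutServletStubTest {

    @Test
    public void should_Set_ResourceNotFound_If_Id_Is_Null() throws Exception {
        // No "id" parameter is set, so getParameter("id") returns null.
        MockHttpServletRequest request = new MockHttpServletRequest("GET", "/sut");
        MockHttpServletResponse response = new MockHttpServletResponse();

        new SutServlet().doGet(request, response);

        Assert.assertEquals(HttpServletResponse.SC_NOT_FOUND, response.getStatus());
    }
}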

Mocking

Mocking using modern libraries like Mockito, PowerMock, and JMockit provides a very powerful approach. There is also what appears to be a more focused Mockrunner project, which has a mockrunner-servlet module.

In Listing 1 below, a test is created for the doGet method of a target SutServlet class. This method responds with a 404 (SC_NOT_FOUND) if the "id" request parameter is null.
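
The servlet source is not shown in the original post; a minimal sketch of what such a SutServlet might look like, consistent with the behavior described above (doGet is widened to public so the tests can call it directly):

import java.io.IOException;

import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class SutServlet extends HttpServlet {

    @Override
    public void doGet(HttpServletRequest request, HttpServletResponse response) throws IOException {
        // Respond with 404 when the required "id" parameter is missing.
        if (request.getParameter("id") == null) {
            response.sendError(HttpServletResponse.SC_NOT_FOUND);
            return;
        }
        // ... normal request processing would go here ...
    }
}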

Using JMockit, mock proxies of HttpServletRequest and HttpServletResponse are created. The request's getParameter and the response's sendError methods are mocked. The actual unit test assertion is done in the mocked sendError method.

Listing 1, JMockit use

import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import mockit.Mock;
import mockit.MockUp;
import mockit.integration.junit4.JMockit;

import org.hamcrest.core.IsEqual;
import org.junit.Assert;
import org.junit.Test;
import org.junit.runner.RunWith;

@RunWith(JMockit.class)
public class SutServletTest_JMockit {
    
    @Test
    public void should_Set_ResourceNotFound_If_Id_Is_Null() throws Exception {
        new SutServlet().doGet(
        new MockUp<HttpServletRequest>() {
            @Mock
            public String getParameter(String id){
                return id.compareToIgnoreCase("id") == 0 ? null : "don't care";
            }
        }.getMockInstance(),
        new MockUp<HttpServletResponse>() {
            @Mock
            public void sendError(int num){
                Assert.assertThat(num, IsEqual.equalTo(HttpServletResponse.SC_NOT_FOUND));              
            }
        }.getMockInstance());
    }
     
}

JDK Dynamic Proxies

The Mock approach can also be duplicated using dynamic proxies; the JDK's dynamic proxy support is usable here. JDK proxies have one limitation: they can only proxy interfaces. (Still true in Java 9?) Since HttpServletRequest and HttpServletResponse are interfaces, we can use the proxy support built into the JDK.

Listing 2, using JDK proxies

import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Method;
import java.lang.reflect.Proxy;

import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import org.hamcrest.core.IsEqual;
import org.junit.Assert;
import org.junit.Test;

public class SutServletTest_using_jdk_proxy {
    
    private static final String DON_T_CARE = "don't care";
    private static final String SEND_ERROR = "sendError";
    private static final String GET_PARAMETER = "getParameter";

    /**  @throws Exception  */
    @Test
    public void should_Set_ResourceNotFound_If_Id_Is_Null() throws Exception {
        
        // request object that returns null for getParameter("id") method.
        HttpServletRequest request  = (HttpServletRequest)Proxy.newProxyInstance(this.getClass().getClassLoader(),
            new Class[]{HttpServletRequest.class},
                new InvocationHandler() {
                    public Object invoke(Object proxy, Method method, Object[] args) throws Throwable {
                        if(method.getName().compareToIgnoreCase(GET_PARAMETER) ==0){
                            return ((String)args[0]).compareToIgnoreCase("id") == 0 ? null : "oops";
                        }
                        return DON_T_CARE;
                    }
                }
        );
        
        // Response object that asserts that sendError arg is resource not found: 404.
        HttpServletResponse response  = (HttpServletResponse)Proxy.newProxyInstance(this.getClass().getClassLoader(),
            new Class[]{HttpServletResponse.class}, 
                new InvocationHandler() {
                    public Object invoke(Object proxy, Method method, Object[] args) throws Throwable {
                        if(method.getName().compareTo(SEND_ERROR) == 0){
                            Assert.assertThat((Integer) args[0], IsEqual.equalTo(HttpServletResponse.SC_NOT_FOUND));
                        }
                        return DON_T_CARE;
                    }
                }
        );
         
        new SutServlet().doGet(request,response);
    }
}

Javassist Proxies

Just for completeness, in listing 3, we use the Javassist library.

Listing 3, using Javassist proxy

import java.lang.reflect.Method;

import javassist.util.proxy.MethodHandler;
import javassist.util.proxy.ProxyFactory;

import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import org.hamcrest.core.IsEqual;
import org.junit.Assert;
import org.junit.Test;

public class SutServletTest_using_assist {
    
    private static final String DON_T_CARE = "don't care";
    private static final String SEND_ERROR = "sendError";
    private static final String GET_PARAMETER = "getParameter";

    /**  @throws Exception  */
    @Test
    public void should_Set_ResourceNotFound_If_Id_Is_Null() throws Exception {
        
        // request object that returns null for getParameter("id") method.
        HttpServletRequest request  = (HttpServletRequest)createObject(new Class[]{HttpServletRequest.class},
            new MethodHandler() {
                public Object invoke(Object self, Method thisMethod, Method proceed, Object[] args) throws Throwable {
                    if(thisMethod.getName().compareToIgnoreCase(GET_PARAMETER) == 0){
                        return ((String)args[0]).compareToIgnoreCase("id") == 0 ? null : "oops";
                    }
                    return DON_T_CARE;
                }
            }
        ); 
        
        // Response object that asserts that sendError arg is resource not found: 404.
        HttpServletResponse response  = (HttpServletResponse)createObject(new Class[]{HttpServletResponse.class},
            new MethodHandler() {
                public Object invoke(Object self, Method thisMethod, Method proceed, Object[] args) throws Throwable {
                    if(thisMethod.getName().compareTo(SEND_ERROR) == 0){
                        Assert.assertThat((Integer) args[0], IsEqual.equalTo(HttpServletResponse.SC_NOT_FOUND));
                    }
                    return DON_T_CARE;
                }
            }
        );
         
        new SutServlet().doGet(request,response);
    }
    
    /**
     * Create Object based on interface.
     * <p>
     * Just to remove duplicate code in the should_Set_ResourceNotFound_If_Id_Is_Null test.
     * @param interfaces array of interfaces the proxy should implement
     * @param mh MethodHandler
     * @return Object
     * @throws Exception
     */
    private Object createObject(Class<?>[] interfaces, MethodHandler mh) throws Exception {
        ProxyFactory factory = new ProxyFactory();
        factory.setInterfaces(interfaces);
        return factory.create(new Class[0], new Object[0], mh);
    }
}

Embedded server

It's also possible to start an embedded server, deploy the servlets, and then run the tests. Various Java servlet containers (like Tomcat and Jetty) support this and are well documented. The complexity comes when only partial integration is required. For example, we may want a real app server running the tests, but do we really need a database server too? Thus, we also have to deploy stubs or mocks to this embedded server. There are many resources on the web for this approach, for example, “Integration Testing a Spring Boot Application“.
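
As a rough illustration (not from the original post), a test might start an embedded Jetty server roughly like this, assuming the SutServlet from the earlier listings and the jetty-server and jetty-servlet artifacts on the test classpath:

import java.net.HttpURLConnection;
import java.net.URL;

import org.eclipse.jetty.server.Server;
import org.eclipse.jetty.server.ServerConnector;
import org.eclipse.jetty.servlet.ServletContextHandler;
import org.junit.Assert;
import org.junit.Test;

public class SutServletEmbeddedJettyTest {

    @Test
    public void should_Return_404_When_Id_Missing() throws Exception {
        // Port 0 lets Jetty pick a free ephemeral port.
        Server server = new Server(0);
        ServletContextHandler context = new ServletContextHandler();
        context.setContextPath("/");
        context.addServlet(SutServlet.class, "/sut");
        server.setHandler(context);
        server.start();
        try {
            int port = ((ServerConnector) server.getConnectors()[0]).getLocalPort();
            HttpURLConnection conn = (HttpURLConnection)
                new URL("http://localhost:" + port + "/sut").openConnection();
            // No "id" parameter, so the servlet should answer 404.
            Assert.assertEquals(404, conn.getResponseCode());
        } finally {
            server.stop();
        }
    }
}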

Another approach is the concept of the Hermetic Servers.

AOP

AOP can be used with an embedded server, which would allow “easy” mocking of integration endpoints. Such an approach was shown in “Unit test Struts applications with mock objects and AOP“.


Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License.

Continuous Testing while developing, CDT?

I previously wrote about Continuous Testing here. Strange, at the time a web search turned up very little about the concept. What was found was the use of the term in the sphere of Continuous Integration systems and processes. Today there are more relevant hits.

Terminology
On Wikipedia the term CT, “Continuous testing”, is redirected to Test Automation. I don't know the arcana of Wikipedia, but “Continuous testing” is hidden somewhere, as you can see if you visit the redirection link. The edits on the redirection page show that some editors were getting into the details of ‘real’ CT, etc.

One editor mentioned the problem of dependency detection: if you edit source Foo and source Fee depends on it, shouldn't the tests for Fee be rerun too?

A web search also turns up a Wikipedia article on Continuous test-driven development, CTDD. That seems relevant. However, it seems to imply that the original Test-Driven Development (TDD) practices are being used. From what I read, TDD is not that popular; the use of unit testing is more popular. So, if a tool automatically runs unit or functional tests on local code changes, that has nothing to do with how those tests were written, TDD or not. The tests could have been written years later for some legacy system that is now being maintained with appropriate tests.

CT is also possible in non-IDE dev environments of course. An example is Wallaby.js, which is a continuous test runner for JavaScript.

Continuous Testing
We don’t edit code and then invoke a compile step anymore; our IDEs do that automatically. Then why do we have to invoke our unit tests manually? This “Continuous Testing” (CT) approach enables a smoother Test-Driven Development (TDD), maintenance, or refactoring workflow.

This type of CT, call it Continuous Developer Tests (CDT), is in contrast to tests run on a Continuous Integration server. Dev tests (unit, functional, integration, …) are run on the developer workstation in response to source changes. Nothing new of course, IDEs have always rebuilt on such events, but the tests have not been run at a fine-grained level.

Is there any evidence of this? Some papers on CT are found here.

Great videos on CT as implemented in Mighty Moose, a continuous testing product for Microsoft Visual Studio, are found at continuoustests (Mighty Moose Demo).

Cons?
Mentioning this to any developer will give you immediate “buts”: But my tests take too long; it would be distracting; I change code constantly;…… I sometimes think developers are driven by a little motor in them, but … but … but … buuuut.

Implementations?
Why isn’t automatic running of tests supported in IDEs like Eclipse? Build systems, like Maven, have always supported test goals. Now Gradle will support Continuous Builds.

Is there a direct way to invoke the JUnit plugin by adding a new custom “builder” to Eclipse? A builder in Eclipse is triggered by resource changes. So, on a source code change this builder would have to run an associated ‘JUnit run configuration’ that in turn could run the GUI test runner or the build system which invokes the tests.
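
A very rough, untested sketch of what such a builder might look like follows; the launch configuration name is made up, and the builder would still need to be registered via the org.eclipse.core.resources.builders extension point:

import java.util.Map;

import org.eclipse.core.resources.IProject;
import org.eclipse.core.resources.IncrementalProjectBuilder;
import org.eclipse.core.runtime.CoreException;
import org.eclipse.core.runtime.IProgressMonitor;
import org.eclipse.debug.core.DebugPlugin;
import org.eclipse.debug.core.ILaunchConfiguration;
import org.eclipse.debug.core.ILaunchManager;

public class ContinuousTestBuilder extends IncrementalProjectBuilder {

    // Hypothetical name of an existing JUnit launch configuration in the workspace.
    private static final String TEST_LAUNCH_CONFIG = "MyProjectUnitTests";

    @Override
    protected IProject[] build(int kind, Map<String, String> args, IProgressMonitor monitor)
            throws CoreException {
        ILaunchManager manager = DebugPlugin.getDefault().getLaunchManager();
        for (ILaunchConfiguration config : manager.getLaunchConfigurations()) {
            if (TEST_LAUNCH_CONFIG.equals(config.getName())) {
                // Re-run the existing JUnit launch configuration on every triggered build.
                config.launch(ILaunchManager.RUN_MODE, monitor);
            }
        }
        return null;
    }
}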


Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License.

In dev, a missing test always passes

What if you lose an automated test? What if it was testing a critical functional area? This article discusses this and, for unit tests, implements a Java @RequiresTest annotation.

How to lose a test

Can’t lose a test? Sure you can. Test reports just give you counts and changes? Who looks at that? No one. Over a long time span, test maintenance can develop warts, and tests are silently deleted because they are failing and it's too hard to fix them or no time is available. The original developers of a component may have gone on to other things, and that tender loving care is nowhere to be found.

Coverage reports don’t help much with this. Unless there are drastic changes in test results or coverage levels, no one looks at them, they become just another management spreadsheet number, or developer hipster feel-good schtick.

Who loses tests

Sure, for tools, utilities and highly focused systems, especially FOSS, this is not likely. The rapid change and larger development teams ensure full use of test tools. In these projects there is more likely to be a level of “test-infected” developers.

For other kinds of systems, like IT projects, testing will be forgotten when the scat hits the fan, or when the test evangelist moves on or gives up. Testing will just rely on manually repeated testing of a local facsimile of the target system and waterfail test department testing.

Quoting Fowler: “Imperfect tests, run frequently, are much better than perfect tests that are never written at all.” I would add: or those that are never run.

Does it matter

For real-world large applications that quickly become legacy, any missing tests can prove disastrous. A missing test would make any potential defect show up much later. Later is too late and costs more to fix.

Ironically, the best example of the loss of tests is a legacy system that has no automated tests, a de-testable system. In such a system, defects are found in late-stage waterfall phases, or worse, in production.

What should be tested

Ideally everything would have a valid unit/functional/integration test. In reality this is not cost effective and some would argue that some things should not be tested. For example, it is claimed that getter/setters do not need tests. (Clearly in a language with true properties, this is true. Java, not.)

So if some things should not be tested, what should be? And if those things that should be tested are not tested?

Options

If missing tests are a concern, what can be done? As in many system decisions, it depends: What kind of tests, when are the tests run, who manages the tests, what kind of test monitoring, and so forth.
The following are just a few options that could be considered.

Monitoring of missing tests

The Continuous Integration system or the build tools it invokes present and track missing tests. Missing tests are considered a failure and must be acted on: confirming or removing from consideration.
There is no need to create this missing test list. The build system adds to this list as each build is performed.

Required tests database

For critical systems or subsystems a required test database could be used. This would be more of a management and tool issue since an ongoing project may change many tests during its duration.
Required tests specification is not a new concept. Hardware systems have always had such a concept and even go further by having Built-In Self Tests.
Note that one argument against recent standards and by extension a ‘required test database’ is that this is not congruent with modern agile processes.

Requires test annotation

For “devtests” using xUnit frameworks, it is much easier to indicate what should be tested. This can easily be made part of the build configuration and run as part of the test phase. To make this more resilient to bit rot, the source code itself should store this information. In the Java ecosystem this can be done for unit tests using annotations.

Example

In listing 1, a developer has decided that two methods should have tests. So the methods are annotated with @RequiresTest.
Listing 1, an application class with annotations

package com.anywhere.app;

import com.octodecillion.test.RequiresTest;

/**
  */
public class Foo1 {	
	@RequiresTest
	public void fum1(){
        // stuff		
	}

	@RequiresTest
	public void fum12(){		
        // stuff
	}
}

Below in Listing 2, a Java implementation of a RequiresTest annotation is shown. It uses the same approach I used in “Search Java classpath for JUnit tests using Spring”, except now the filter checks for a different annotation. The two could be combined into one implementation that searches for tests or for the requires-test annotation.

Funny, I did not annotate the RequiresTestAnnotationScanner with @RequiresTest.

Listing 2. A requires test annotation
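
The original listing is not reproduced here; a minimal sketch of what such an annotation might look like, matching the com.octodecillion.test.RequiresTest import used in Listing 1 (the optional value element is an assumption, not from the original):

package com.octodecillion.test;

import java.lang.annotation.Documented;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

/**
 * Marks a method (or type) as requiring an automated test.
 * Retained at runtime so a classpath scanner can find it.
 */
@Documented
@Retention(RetentionPolicy.RUNTIME)
@Target({ ElementType.METHOD, ElementType.TYPE })
public @interface RequiresTest {
    /** Optional note, e.g. a link to the required test or a reason. */
    String value() default "";
}

A build-time check can then scan the production classes, collect the methods annotated with @RequiresTest, and flag any for which no corresponding test is found.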


Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License.

Unit testing Java exception handling using JMockIt

How do you test that a method caught an exception? What if that catch has no side effect, just logs output, or simply swallows the exception?

Context
We have a method in a class that was written with a try catch and we need to unit test it. There are many ways this task can occur, such as the need to test legacy code or the need to write a test before a refactoring of such code.

We won’t go into the anti-pattern aspects or what constitutes proper handling of an exception here.

Error hiding is an anti-pattern in computer programming. Due to the pervasive use of checked exceptions in Java, we must always address what to do with exceptions. Error hiding is when a catch clause does not properly handle an exception.
 

In a catch block there are three common ways of handling an exception:

try {
    // ... stuff ...
} catch (SomeException e) {
    // 1. do something here
    // 2. maybe rethrow e, or throw something else
    // 3. skip the above and do nothing (swallow it)
}

How are these tested?

Thrown exception
When the method reacts by throwing an exception, we can test using the standard JUnit @Test(expected=SomeException.class); or, for fine-grained verification, the test itself has a try/catch block where we can assert on the exception details.
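
For example, a minimal sketch of both styles (the Validator class here is a made-up placeholder, not from the original post):

import org.junit.Assert;
import org.junit.Test;

public class ThrownExceptionExampleTest {

    // Hypothetical code under test, just for illustration.
    static class Validator {
        void check(int id) {
            if (id < 0) {
                throw new IllegalArgumentException("id must not be negative");
            }
        }
    }

    // Coarse-grained: the test passes if the expected exception type escapes the method.
    @Test(expected = IllegalArgumentException.class)
    public void should_Throw_When_Id_Is_Negative() {
        new Validator().check(-1);
    }

    // Fine-grained: catch it ourselves and assert on the details.
    @Test
    public void should_Throw_With_Helpful_Message() {
        try {
            new Validator().check(-1);
            Assert.fail("expected IllegalArgumentException");
        } catch (IllegalArgumentException e) {
            Assert.assertTrue(e.getMessage().contains("id"));
        }
    }
}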

Swallowed exception
If a method does nothing in the catch block, which is also called “swallowing” the exception, should it even be a test issue? Yes. The method is just tested normally; one of the tests must force the exception, of course. We do this since in the future the method may be changed to handle the exception differently. One of the test assertions that can be made is that the forced exception is not thrown from the method.
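
As a sketch, using the Shop and ShoppingSvc classes introduced below and the same MockUp as in Listing 2, such a test might assert that no exception escapes and that the default value is returned:

@Test
public void should_Swallow_Exception_And_Return_Default_Name() {
    // Force the collaborator to fail, exactly as in Listing 2 below.
    new MockUp<ShoppingSvc>() {
        @Mock
        public String getProductName(int id) throws IOException {
            throw new IOException("Forced exception for testing");
        }
    };

    // If getProduct ever rethrows, this test fails with that exception;
    // as written, the swallowed exception leaves the default name in place.
    Assert.assertEquals("", new Shop().getProduct(123));
}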

Exception handler pointcut
Testing that an exception in the method under test was actually caught is possible, but only with Aspect-Oriented Programming (AOP). One example is AspectJ, which supports the pointcut:

handler(TypePattern)
Picks out each exception handler join point whose signature matches TypePattern.
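
For illustration (not from the original post), an annotation-style AspectJ aspect using the handler pointcut might look like the following sketch. Handler join points support only before advice; the static field here is just a hypothetical way for a test to observe that a catch block ran.

import java.io.IOException;

import org.aspectj.lang.annotation.Aspect;
import org.aspectj.lang.annotation.Before;

@Aspect
public class CaughtExceptionRecorder {

    // Visible to a test that wants to assert the exception was actually caught.
    public static volatile Throwable lastCaught;

    // Runs just before any catch block whose parameter is an IOException (or subtype).
    @Before("handler(java.io.IOException) && args(e)")
    public void recordCaught(IOException e) {
        lastCaught = e;
    }
}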
 

Behavior in catch
It gets more interesting when the catch block has ‘behavior’. This behavior has side effects. If these side effects are only local to the method, such as setting a flag false, then normal testing is adequate. If this behavior has side effects at the class or with collaborating objects, then this requires more complex testing.

It can get murky with this kind of testing. What is important is that one does not test the implementation (but sometimes that is crucial), only the interactions and requirements of the target “Unit” under test. What constitutes a “unit” is very important.

“Typically, a unit of behavior is embodied in a single class, but it’s also fine to consider a whole set of strongly-related classes as a single unit for the purposes of unit testing (as is usually the case when we have a central public class with one or more helper classes, possibly package-private); in general, individual methods should not be regarded as separate units on their own.” — Rogerio in JMockit Tutorial


Example
The method being tested invokes a method on a collaborating object and that object throws an exception. In the catch block, the exception is logged using the logging utility collaborator. Though not part of an explicit API, that logging may be critical to the use of a system. For example, an enterprise log monitoring system may expect this logging for support or security concerns. A simple class Shop is shown in Listing 1.

Listing 1, the class to test

public class Shop {
    private ShoppingSvc svc;
    
    /**
     * Get product name.
     * @param id the unique product id
     * @return the product name
     */
    public String getProduct(int id){
        String name = "";
        try {
            name = svc.getProductName(id);
        } catch (Exception e) {
            Logger.getAnonymousLogger().log(Level.SEVERE,
                "{result:\"failure\",id:\"" + id + "\"}");
        }
        
        return name;
    }
    
}

JMockit Use
JMockit supports two types of testing: behavior-based and state-based (or “faking”).
Using the state-based approach we create a mock for the getProductName(int) method of the collaborating (or dependent) class, ShoppingSvc. With JMockit this is easily done as an inline MockUp object with the target method mocked to throw an exception.

Listing 2, mocking

new MockUp<ShoppingSvc>() {
    @Mock
    public String getProductName(int id) throws IOException{
		throw new IOException("Forced exception for testing");
    }
};

JMockit’s behavior-based support is then used to test the catch clause handling. As in other mocking frameworks, record-replay-verify phases are used. Since the side effect of the exception handler here is the use of the logging dependency, and we are not testing the logger, we ‘behaviorally’ mock the Logger class.

We can do this in the test method signature, @Mocked final Logger mockLogger. This mocks every method in the Logger class. Then we set an expectation on the log method used in the exception handler, and finally verify that the method was actually invoked.

The full test class is shown in Listing 3 below and the sample code is in a repo on GitHub: https://github.com/josefbetancourt/examples-jmockit-exceptions.

An alternative to using both state and behavior mocking is to just specify the exception throwing with the expectations. The article “Mocking exception using JMockit” shows how to do this. Of course, the JMockit Tutorial has all the details.

Listing 3, the full test class

@RunWith(JMockit.class)
public class ShopTest{
    /**
     * 
     * @param mockLogger Logger object that will be behaviorally mocked.
     */
    @Test
    public void shouldLogAtLevelSevere(@Mocked final Logger mockLogger)
    {
        /**
         * state-based mock of collaborator ShoppingSvc
         */
        new MockUp<ShoppingSvc>() {
            @Mock
            public String getProductName(int id) throws IOException{
                throw new IOException("Forced exception for testing");
            }
            
        };
        
        // the SUT  
        final Shop shop = new Shop();

        // what we expect to be invoked
        new Expectations() {{
            mockLogger.log(Level.SEVERE,anyString); 
        }};
        
        shop.getProduct(123); // actual invocation
        
        // verify that we did invoke the expected method of collaborator
        new Verifications(){{
            mockLogger.log(Level.SEVERE, anyString);  // we logged at level SEVERE
        }};
    }
}

Alternatives?
Just write better code so that you don’t need unit tests? This is mentioned in “Functional Tests over Unit Tests”.

Test using a scripting language like Groovy? See “Short on Time? Switch to Groovy for Unit Testing”.

Software
– JUnit: 4.12
– JMockit: 1.18
– JDK: 1.8
– Eclipse: Mars
– Maven: 3


Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License.

Continuous Integration (CI) misconception

In some online resources the term Continuous Integration (CI) is always used in the broadest sense to mean that on some schedule or event the outputs of every ongoing project or separate teams are obtained, put together somehow, and then a test system is updated so that various tests can be invoked. No wonder some test and management professionals are wary of the concept.

The problem here is the “other” usage. More correctly CI can even be applied to one team on one project. One distinguishing feature of CI is that there are multiple developers*. Thus, as these developers complete various tasks and commit or push to a shared repository, a build and deploy process is run to create testable systems.

The term “integration” in CI is applicable to more inclusive senses, or a fuzzy continuum, from one project and one team to combinations of these. Thus, some processes are CI to a certain degree, or, worse, a CI anti-pattern to a certain degree.

In modern CI best practice, CI is done via various build and deployment servers that automate some or all of the pipeline. In the past, at some companies, the designated build person was doing manual Continuous Integration.

Sure, in CI there will be episodes of actual integration with other current projects, teams, or externally generated artifacts. If this is automated, then we have full CI.

* Even a single developer who uses various branching strategies on one code base may use CI practices.


Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License.

Mock Java time and date using JMockIt

Example scenario: A class under test (CUT) gets the current date and uses collaborating classes that invoke month dependent rules. How can it be unit tested?

Sure, you can get the current date in the test and set the month. But the CUT and all of its collaborators may also get the current date. Without changing the environment's system time, how will you force the month to be, for example, December?

That this is a problem may indicate a code smell in the design, but in a non-test-infected group, tests are added, if at all, after the code is shipped. Thus, changes are scheduled for the next agilefall.

One easy approach is to just change what Calendar.getInstance() gives you. With JMockIt this is very easy: just put something like the anonymous mock (see Listing 1) in the test method.

Source code

/** 
  * Test December rules.
  * @author J. Betancourt
  */
@Test
public void should_invoke_December_rules(){
    new MockUp<Calendar>() {
    	@Mock
    	public Calendar getInstance(Invocation inv) {
    		Calendar cal = inv.proceed();
    		cal.set(Calendar.MONTH, 11);
    		return cal;
    	}
    };

    doTestStuffHere();

}

Listing 1.

Note that within the mocked method I still have access to the original behavior because I also included an Invocation as the first argument; I call proceed() on it to get the calendar as usual, and then I change the month. Kind of like an AOP ‘around’ advice.

JMockIt performs instrumentation via java.lang.instrument and the ASM library, so even collaborating objects will use this mocked instance for the life of the test. This is a big feature offered by JMockIt compared to other mocking frameworks, afaik.

This should be applicable to mocking other temporal methods like System.currentTimeMillis(), which in itself affects other methods that get the date and time.

Caution
Mocking JDK classes may have side effects on running tests. In one of my tests, java.io.PrintWriter was mocked. This caused Eclipse to disconnect from the running JUnit test. The solution was to do a mock tearDown() right after executing the method that used the writer. This was on JMockIt version 1.2; it probably changed in the latest version.

Shouldn’t time be accessed from a testable configurable component?
Getting time-related data via Calendar.getInstance(), System.currentTimeMillis(), or other JVM-provided facilities is bad practice in some cases. It is similar to using “new”, creating hidden object graphs, and making testing difficult. A better approach is to centralize time-related access through a service or utility class. One benefit of this is that to change the environment date for a ‘system’ test, you don't have to change the vacuum tubes in the mainframe system, just change the calendar access via configuration.
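
On Java 8 and later, java.time.Clock is a ready-made form of this idea. A minimal sketch of month-dependent code that takes its clock as a dependency (the class and method names are illustrative):

import java.time.Clock;
import java.time.Instant;
import java.time.LocalDate;
import java.time.Month;
import java.time.ZoneId;

public class MonthlyRules {

    private final Clock clock;

    // The rules code asks an injected Clock for "now" instead of calling Calendar.getInstance().
    public MonthlyRules(Clock clock) {
        this.clock = clock;
    }

    public boolean isDecember() {
        return LocalDate.now(clock).getMonth() == Month.DECEMBER;
    }

    public static void main(String[] args) {
        // In a test: pin the date without any mocking.
        Clock fixed = Clock.fixed(Instant.parse("2015-12-25T00:00:00Z"), ZoneId.of("UTC"));
        System.out.println(new MonthlyRules(fixed).isDecember()); // true
        // In production: use the real clock.
        System.out.println(new MonthlyRules(Clock.systemDefaultZone()).isDecember());
    }
}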

Notes
1. This worked on JDK 1.6 using JMockIt version 1.2. Is it a good technique? Let me know.
2. After I wrote this I searched for this topic and found many good blog posts discussing this subject. Why didn’t I find these when I initially had the problem?


Something I’m listening to while I code …

“Become, Seem, Appear” performed by Oregon on their CD “Oregon In Concert”. On YouTube
Price for a new CD is $5,545.60. I better find my CD and store it in a safe place!

“The Silence Of A Candle” performed by Oregon on their CD “Oregon In Concert”. On YouTube.

Audio CD (November 6, 2001), Original Release Date: 1975, ASIN: B00005RDJS

Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License.

Groovy implementation of INIX file format, part 2

Continued experimentation with the INIX “ini” file format. I take the code from the original post and add an alias feature.

Background
In three prior posts I presented a very simple metadata file storage approach, pretty much using the INI file format with the sections as “heredocs.” In the original INI format, afaik, the data within the sections were properties only, x=y.

Updates

  • Dec 25, 2015: Just saw a mention of Tom’s Obvious, Minimal Language (TOML). This is not directly related to what I’m after with Inix, but it is interesting as an example of a simple markup language.
  • Alias
    Now I am working on adding the ability for a section of data to load data from another section. The sections to load are indicated by the ‘@’ symbol and then the section path. Multiple sections can be indicated. Though I’m using the term ‘alias’ for this, perhaps a better term is ‘importing’. So far, I can parse the alias from the tag string.

    I have not implemented the actual import. One complexity left to solve is recursion: if this section imports another section, what if that section imports other sections?

    Alias use case
    Since currently the Inix file format is being used for test data, aliasing allows reuse of data without duplication, i.e., DRY. This is problematic with hierarchical data like JSON or XML, but much easier with lists or maps. Further features like overriding and interpolation would be useful for Java Properties data. The goal would be to eventually support use of the Cascading Configuration Pattern.

    Example 1

    [>First]
    Data in first    
    [<]
    
    [>Second@First]
    Data in second
    [<]
    

    Now when the data in section “Second” is loaded, the data from the aliased section is prepended to the current section data:

    Data in first    
    Data in second
    

    Tag format
    The section tag format is now: [>path#fragment@aliases?querystring]. Note that unlike a URI, the fragment does not appear at the end of the string.

    The section ID is really the path#fragment. Thus, the end tag could be [<] or can use the section ID: [<path#fragment]. Example 2

    [>demo1/deploy#two@account897@policy253?enabled=true&owner=false]
    stuff here
    [<demo1/deploy#two]
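
    As a rough sketch of how such a start tag could be pulled apart (shown in Java rather than the Groovy of the gist; the regular expression is an assumption based on the grammar below, not the actual implementation):

    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    public class InixTagSketch {

        // [>path#fragment@alias1@alias2?querystring]
        private static final Pattern START_TAG = Pattern.compile(
            "\\[>([\\w/]+)(?:#(\\w+))?((?:@\\w+)*)(?:\\?([^\\]]*))?\\]");

        public static void main(String[] args) {
            Matcher m = START_TAG.matcher(
                "[>demo1/deploy#two@account897@policy253?enabled=true&owner=false]");
            if (m.matches()) {
                System.out.println("path     = " + m.group(1)); // demo1/deploy
                System.out.println("fragment = " + m.group(2)); // two
                System.out.println("aliases  = " + m.group(3).substring(1).replace("@", ",")); // account897,policy253
                System.out.println("args     = " + m.group(4)); // enabled=true&owner=false
            }
        }
    }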
    

    Grammar
    The start of a grammar follows, but has not been ‘checked’ by attempted use of a parser generator like Antlr.

    grammar Inix;
    section: start CRLF data end;
    start: '[>' path (fragment)?(alias)*('?' args)? ']';
    end: '[<' path? ']';
    path: NAME ('/' NAME)*;
    fragment: '#' NAME;
    alias: '@' NAME;
    args: (NAME=NAME ('&' NAME=NAME)*)?;
    data: (ANYTHING CRLF)*;
    NAME: ('a'..'z' | 'A'..'Z')('a'..'z' | 'A'..'Z' | '0'..'9' | '_')*;
    
    TODO:

    1. Do the actual import of aliased section data.
    2. Allow multiple params per param key: ?measure=21,measure=34,measure=90. Or better yet, just allow array in arg string: measure=[21,34,90],color=red

     

    Implementation
    Source code available at Github: https://gist.github.com/josefbetancourt/7701645

    Listing 2, Implementation

    Test class
    Note that there are not enough tests and the implementation code has not been reviewed.

    Listing 3, Test class

    The test data is:

    Listing 4, data file

    Environment
    Groovy Version: 2.2.2 JVM: 1.7.0_25 Vendor: Oracle Corporation OS: Windows 7

    Further Reading

    1. The Evolution of Config Files from INI to TOML
    2. Groovy Object Notation using ConfigSlurper
    3. Configuration Files Are Just Another Form of Message Passing (or Maybe Vice Versa)
    4. INI file
    5. Data File Metaformats
    6. Here document
    7. JSON configuration file format
    8. Creating External DSLs using ANTLR and Java
    9. Groovy Object Notation (GrON) for Data
      Interchange
    10. Cloanto Implementation of INI File Format
    11. http://groovy.codehaus.org/Tutorial+5+-+Capturing+regex+groups
    12. URI
    13. Designing a simple file format
    14. The Universal Design Pattern
    Creative Commons License
    This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License.

    Testing Groovy with Nested Test Classes

    The Groovy language allows testing with inline code; it provides built-in JUnit, mock, stub, and other support. Presented here are two simple ways of embedding a JUnit test class inside a Groovy class.

    In a prior post, “List sorting using topological ordering of a digraph“, I used this embedded test approach in the Java coding. But this embedding did not work when I tried it with a Groovy script I was working on, so I revisited this technique to see how to do it with Groovy.

    Note that I am interested in directly running a class or script and having the embedded JUnit test run. One could of course just invoke the embedded test class using a JUnit runner. If it is possible to run the nested class, why have the class or script runnable as a test? Good question.

    Running nested test directly:

    GroovyEmbeddedTests\src>java -Dtest.script=true -cp .;..\bin;..\lib\junit-3.8.1.jar;\java\groovy\embeddable\groovy-all-2.1.2.jar;\java\groovy\lib\asm-4.0.jar  junit.textui.TestRunner com.octodecillion.testing.Example3
    
     
    I was reading a JavaScript programming book the other day. The author talked about statistics that show most JavaScript is untested, i.e., has no tests. I bet that extends to uses of other scripting languages: the only test is to use them and see if they still do what they should. (BTW, for testing JavaScript, Jasmine is a nice framework.)

    Nested Tests?
    Should JUnit type test classes be embedded within the class or script system under test (SUT)? About the only clear advantage is that instead of two files you have one.

    In terms of deployment, in Java one has the option of shipping only the non-test compiled classes. If one is shipping the ‘script’ source itself, that is much harder to do.

    There are some ‘human’ advantages to embedded tests. For one they are in one’s face, not another file that the average developer won’t even look at. See Ben Christensen’s blog post for further discussion.

    Approach
    Groovy classes
    To run nested test classes we use the approach that Groovy uses in its GroovyShell class. This class is pretty neat; the runScriptOrMainOrTestOrRunnable method can run a Class, Runnable, Script, JUnit3, JUnit4, and even wash dishes. The private method that runs a JUnit test is what I needed, so I copied it into a new EmbeddedTestUtil class, shown in listing 4.

    Groovy Scripts
    While running nested tests is easy in Groovy classes, Scripts are very different. The Groovy compiler compiles the Groovy script into a Java class where all the ‘script’ code is put into the run() method, and a main entry point is added to invoke that run method. There are other issues of course, such as bindings, field declarations, and so forth.

    One way of running nested tests in a script is to signal this desire with a system property. If that property is set, the test class is run via the appropriate JUnit runner. These tests can themselves run the target script by invoking the run method.

    Examples
    In listing 1, the Example1 class contains a nested ExampleTest1 JUnit 3 test class. To run this test class, we add a ‘main’ entry point to the top-level class. This entry point will instantiate the test class and then use JUnit’s test runner to invoke its implicit ‘run’ method.

    Listing 1, Groovy class with nested test
    package com.octodecillion.testing
    
    import junit.framework.TestCase;
    import junit.textui.TestRunner
    import org.codehaus.groovy.runtime.InvokerHelper;
    
    /**  */
    class Example1 {
    	String name;
    	
    	/** Nested JUnit3 test class */
    	static class ExampleTest1 extends GroovyTestCase{
    		private Example1 ex
    		
    		protected void setUp(){
    			ex = new Example1()
    			ex.name = "hello"
    		}
    		
    		public void test1(){
    			println "in test1 ...."
    			assert(ex.name.equals("hello"))
    		}
    		
    	}
    	
    	/**
    	 * Run the {@link Example1.ExampleTest1} tests.
    	 * 
    	 */	
    	static main(args) {		
    		EmbeddedTestUtil.runJUnit3Test(Example1.ExampleTest1)
    	}
    
    }
    

     

    As an alternative to a nested test class, we can simply put the test class in the same source file, as in listing 2.

    Listing 2, Groovy class source file with inline test
    package com.octodecillion.testing
    
    import junit.framework.TestCase;
    import junit.textui.TestRunner
    import org.codehaus.groovy.runtime.InvokerHelper;
    
    /**
     */
    class Example2 {
    	String name;
    	
    	/** Run {@link ExampleTest2} tests. */
    	static main(args) {
    		EmbeddedTestUtil.runJUnit3Test(ExampleTest2)
    	}
    
    }
    
    /** Inline JUnit3 test class */
    class ExampleTest2 extends GroovyTestCase{
    	private Example2 ex
    	
    	protected void setUp(){
    		ex = new Example2()
    		ex.name = "hello"
    	}
    	
    	public void test1(){
    		println "in test1 ...."
    		assert(ex.name.equals("hello"))
    	}
    	
    }
    

     

    In listing 3 we attempt to test a Groovy script.

    Listing 3, Groovy script with nested test
    /**  */
    package com.octodecillion.testing
    
    import junit.framework.TestCase;
    import junit.textui.TestRunner
    import org.codehaus.groovy.runtime.InvokerHelper;
    import java.util.concurrent.*
    
    if(System.getProperty("test.script")){	
    	Executors.newSingleThreadExecutor().execute(new Runnable(){
    		void run() {
    			EmbeddedTestUtil.runJUnit3Test(ExampleTest3)
    		}	
    	})
    
    	return;
    }
    
    String name;
    
    println "Hello world!"
    
    /** Nested JUnit3 test class */
    class ExampleTest3 extends GroovyTestCase{
    	def Example3 ex
    	
    	protected void setUp(){
    		System.clearProperty("test.script")
    		ex = new Example3()
    		ex.name = "hello"
    	}
    	
    	public void test1(){
    		println "in test1 ...."
    		assert(ex.name.equals("hello"))
    	}
    	
    	/** Run the script */
    	public void test2(){
    		println "in test2 ...."
    		
    		ByteArrayOutputStream baos = new ByteArrayOutputStream()
    		def myOut = System.out
    		def pout = new PrintStream(baos)
    		System.setOut(pout)
    		ex.run()
    		System.setOut(myOut)
    		pout.flush();
    		def actual = baos.toString()
    		
    		assert(actual.contains("Hello world!"))
    	}
    	
    }
    

     

    We copied some of the JUnit support built into Groovy into a new utility class shown in listing 4.

    Listing 4, EmbeddedTestUtil class
    package com.octodecillion.testing
    
    import junit.framework.TestCase;
    import junit.textui.TestRunner
    import org.codehaus.groovy.runtime.InvokerHelper;
    
    /** 
     * Invoke embedded test classes.
     * 
     * Based on code from groovy.lang.GroovyShell.
     * See http://groovy.codehaus.org/gapi/groovy/lang/GroovyShell.html
     * @author j. betancourt
     */
    class EmbeddedTestUtil {
    
    	/**
    	 * Run the JUnit 3 test.
    	 * 
    	 * Copied the approach used in 
    	 * groovy.lang.GroovyShell.runJUnit3Test(Class scriptClass)
    	 * @see https://github.com/groovy/groovy-core/blob/master/src/main/groovy/lang/GroovyShell.java
    	 */	
    	static runJUnit3Test(Class clazz) {		
    		try {	
    			
                Object testSuite = InvokerHelper.
    				invokeConstructorOf("junit.framework.TestSuite", clazz);			
                InvokerHelper.invokeStaticMethod(
    				"junit.textui.TestRunner", "run", testSuite);
            } catch (ClassNotFoundException e) {
                throw new GroovyRuntimeException(
    				"Failed to run the unit test. JUnit is not on the Classpath.", e);
            }
    	}
    	
    	/**
    	 * Run the JUnit 4 test.
    	 * 
    	 * Copied the approach used in 
    	 * groovy.lang.GroovyShell.runJUnit4Test(Class scriptClass)
    	 * @see https://github.com/groovy/groovy-core/blob/master/src/main/groovy/lang/GroovyShell.java
    	 */	
    	static runJUnit4Test(Class clazz, def loader) {	
    		def junit4Utils = "org.codehaus.groovy.vmplugin.v5.JUnit4Utils"	
    		try {			
                return InvokerHelper.invokeStaticMethod(junit4Utils,"realRunJUnit4Test", [clazz, loader]);
            } catch (ClassNotFoundException e) {
                throw new GroovyRuntimeException(
    				"Failed to run the unit test. JUnit is not on the Classpath.", e);
            }
    	}
    	
    	private EmbeddedTestUtil(){
    		// 
    	}	
    	
    }
    

     

    In listing 5 below, I run the examples in a Windows shell.

    Listing 5, run of tests in Windows shell
    src>groovy -cp . com\octodecillion\testing\Example1
    .in test1 ....
    
    Time: 0.023
    
    OK (1 test)
    
    
    src>groovy -cp . com\octodecillion\testing\Example2
    .in test1 ....
    
    Time: 0.018
    
    OK (1 test)
    
    src>groovy -cp . -Dtest.script=true com\octodecillion\testing\Example3
    .in test1 ....
    .in test2 ....
    
    Time: 0.047
    
    OK (2 tests)
    

     

    Conclusion
    Presented was a possible alternative to Groovy test source code structuring using nested test classes. Classes and scripts were given as examples. Perhaps a more powerful approach is to use more high-level test frameworks, such as Behavior Driven Development (with for example, JBehave). This would allow the target class or script to also contain its own executable specification.

    Environment
    Eclipse Kepler (4.3)
    Groovy 2.1.2
    Java JDK 1.7.0_25
    Windows 7 64bit


    Creative Commons License
    This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License.

    Use AOP Aspects as Mocks in JUnit tests?

    This post illustrates a subtle relationship between Mock Objects and Aspect-Oriented Programming. Some code examples are shown, and the post ends with questions for further research. This is a follow-on to a prior post, “Using AspectJ For Testing Legacy Code“. Author: Josef Betancourt

    Written by: Josef Betancourt, Date: 2013-05-29, Subject: AOP vs Mocks for testing

    Categories and Subject Descriptors
    D.2.2 [Software Engineering]: Design Tools and Techniques, Object-Oriented programming

    Keywords
    interceptors, AOP, Mock Objects, JMockit, JUnit, Aspectj, unit test

    Introduction
    In Unit testing during code development and maintenance, the smallest unit of code is tested in isolation from collaborating units or subsystems. In well designed systems, testing is much simpler: outgoing interfaces are easier to manage. In legacy systems and/or badly written code, that may not be the case. Further, when testing legacy systems, changes to increase testability are not always possible.

    Thus, various patterns, frameworks, and tools are used to support this isolation. Mock Object frameworks are ideal for this. Aspect Oriented Programming (AOP) is also capable of providing this isolation but is rarely mentioned. What are the differences? Is one better than another for testing?

    Aspect Oriented Programming
    See the wikipedia entry for more information.
    In the Java world, AspectJ is the most well-known AOP implementation.

    Mock Objects
    Mock Objects are simulated objects that mimic the behavior of real objects in controlled ways.

    JMockit is a modern Java mocking toolkit. One distinguishing feature of JMockit is that it is powerful enough to mock legacy, otherwise unmockable code.

    Example Project
    A Client class uses a Service object’s query method to get a user’s name: Client.getUserName(int) –> Service.query(int).

    For the test we want to substitute the name for a specific argument. When invoked with argument integer 1, the result is the string “second”, but in the test we want to return “Hello world!”. This example is just to show the technology; a hello world app, not practical in itself.

    Advice.
    To intercept a method call on an object in AOP an Aspect is created.

    “Aspects wrap up pointcuts, advice, and inter-type declarations in a modular unit of crosscutting implementation.”
     

    The Advice below is written in the annotation based AspectJ syntax introduced in AspectJ 5.

    @Aspect
    static class TestAspect {
    		
    	@Pointcut("call( * AspectJUnit.Service.query(int)) && args(i)")		
    	void intercept(int i){}
    		
    	@Around("intercept(i)")
    	public String query(ProceedingJoinPoint tjp, int i){
    	    return (i == 1) ? "Hello world!"
    		: (String) tjp.proceed(new Object[]{i});
    	}		
    	
    }
    
     

    This defines a method that will ‘replace’ the original target. The terminology used by the AspectJ programming manual:

    • A Join Point is a well-defined point in the program flow.
    • A Pointcut picks out certain join points and values at those points.
    • An Advice is code that is executed at a join point.
     

    AspectJ defines many Join Points. There is even a Handler execution join point: “When an exception handler executes. Handler execution join points are considered to have one argument, the exception being handled”.

    The pointcut language is very rich and the signature patterns can be generic or very specific.

    We can also use an anonymous pointcut instead:

    @Aspect
    static class TestAspect {
    		
    	@Around("call( * AspectJUnit.Service.query(int)) && args(i)")
    	public String doQuery(ProceedingJoinPoint jp, int i){
    	    return (i == 1) ? "Hello world!"
    		: (String) jp.proceed(new Object[]{i});
    	}		
    	
    }
    
     

    Note that the advice, here doQuery(..), does not have to match the original target method’s name.

    Mocking
    In JMockit‘s State Based API an object can be mocked by using a MockUp class. Within that class any method that needs to be mocked is re-implemented and annotated with @Mock.

    Since JMockit allows even private, final, or static methods to be overridden, this is not normal Java method overriding.

    class MockService extends MockUp<Service> {
    	@Mock
    	public String query(Invocation invocation, int n) {
    		return (n == 1) ? "Hello world!"
    			: ((Service) invocation.
    			getInvokedInstance()).query(n);
    	}			
    }
    
     

    We can also define the mock using an anonymous inline class within the test method:

    new MockUp<Service>() {				
            @Mock
    	public String query(Invocation invocation, int n) {
    	     return (n == 1) ? "Hello world!" 
                  : ((Service) invocation. 
                    getInvokedInstance()).query(n);
    	}
    };
    
     


    Comparison
    Structure and Syntax
    Compare this to the prior Aspect. They are both a class definition. The new behavior is also a method. The difference is the pointcut. In JMockit the pointcut and the advice specification are combined. Note that the method signature must match the target method’s. In the AspectJ Aspect, these are separate.

    @Aspect
    static class TestAspect {
    		
    	@Around("call( * AspectJUnit.Service.query(int)) && args(i)")
    	public String doQuery(ProceedingJoinPoint jp, int i){
    	    return (i == 1) ? "Hello world!"
    		: (String) jp.proceed(new Object[]{i});
    	}		
    	
    }
    
    class MockService extends MockUp<Service> {
    
    	@Mock
    	public String query(Invocation invocation, int n) {
    		return (n == 1) ? "Hello world!"
    			: ((Service) invocation.
    			getInvokedInstance()).query(n);
    	}			
    }
    
     

    The combined syntax in the Mock approach is simpler, and since a test is usually targeted at one specific method, the extra AOP pointcut expressiveness may have limited use. The Mock class can even be created inline in the test. A paper referenced below even makes the suggestion that test-based pointcut definitions are an improvement to AOP since they are less fragile.

    The aspect must be created as a static class and the associated advice is instrumented at compile time. The mock is integrated into the JUnit run time, so its ‘advice’ is active only for the current test; thus, you have more opportunities for various tests of the same target. The full source code of the test using Aspects does not have as many assertions as the one written using Mocks because of this early instrumentation. See the section below for listings.

    There are more differences. These are of course due to the intended application of these two technologies. AOP is a more general purpose systems tool. Crosscutting concerns are an architectural artifact whereas isolation is a testing concern.

    Mocks are a subset of Aspects
    AOP is optimized for cross-cutting concerns. But collaborator isolation is just an example of a single-use, test-based cross-cutting concern. So Mocks are just a special use of general AOP capabilities. Mocks supply a single join point and, in many implementations, allow the use of an Around advice. This has been sufficient, and with other features such as Expectations, the Mock Objects framework fits well into current testing approaches.

    In AOP there is a richer language for selecting different kinds of joinpoints. And the pointcuts are used by different kinds of advice, such as cflow, before, after, around, and so forth. The modularization (inheritance and other features) into Aspects presents an opportunity for more high-level syntax and use, especially with the annotation based language.

    Can Mock Objects use AOP techniques?
    This raises the question of whether Mock Objects can or should use generic AOP techniques. If so, how?

    One example of the integration of AspectJ is the Spring framework, which now can use the AspectJ pointcut language in its own AOP implementation. Spring’s implementation is not specifically targeted as a test solution.

    Another technology is the use of a language that can create “advice” rules and be invoked via instrumentation. This is available in Byteman. Byteman uses Event Condition Action (ECA) rules.

    More recent versions of JMockit now support a ‘wildcard’ mock method: ‘”Object $advice(Invocation)”, which if defined will match every method in the mocked class hierarchy.’ — mockit Annotation Type Mock

    If AOP pointcuts do have a use case in testing mocks, could this be an area where a pointcut DSL could be introduced in the future?

     

    Aspects and Mocks shown in full listings

    Example Mocks in a JUnit test
    package com.octodecillion.jmockit.example;
    
    import static org.hamcrest.core.Is.is;
    import static org.junit.Assert.assertThat;
    
    import java.util.Arrays;
    import java.util.List;
    
    import mockit.Invocation;
    import mockit.Mock;
    import mockit.MockUp;
    
    import org.junit.Before;
    import org.junit.Test;
    import org.junit.runner.RunWith;
    import org.junit.runners.JUnit4;
    
    /**
     * mockit.Invocation class use example
     */
    public class JMockItExample2 {
    
    	/**  the client we are testing */
    	public class Client {
    		private Service service;
    		
    		public Client(Service service) {
    			this.service = service;
    		}
    		
    		/** invoke the 'expensive' service. */
    		public String getUserName(int n){
    			return service.query(n);			
    		}
    		
    	}
    	
    	/** the service that does integration stuff */
    	public class Service {
    			/**
    			 * @param key 
    			 * @return 
    			 */
    			public String query(int n) {				
    				return values.get(n);
    			}
    			
    			private final List<String> values = 
    				Arrays.asList("first","second","third");
    	}
    	
    	/**
    	 * Nested test class.
    	 *
    	 */
    	@RunWith(JUnit4.class)
    	public static class UnitTest{
    		private JMockItExample2 example;
    		
    		@Before
    		public void before(){
    			example = new JMockItExample2();
    		}
    		
    		/**   */
    		@Test
    		public void should_do_around_advice() {
    			String expected = "second";
    			String actual = example.new Service().query(1);
    			assertThat(actual, is(expected));
    			
    			new MockUp<Service>() {				
    				/**
    				 * Use an Invocation to allow use of mocked object.
    				 */
    				@SuppressWarnings("unused")
    				@Mock
    				public String query(Invocation invocation, int n) {
    					return (n == 1) ? "Hello world!"
    						: ((Service) invocation.
    						getInvokedInstance()).query(n);
    				}
    			};
    			
    			expected = "Hello world!";
    			actual = example.new Service().query(1);
    			assertThat(actual, is(expected));
    			
    			// do the around advice
    			actual = example.new Client(example
    					.new Service()).getUserName(1);
    			
    			assertThat(actual, is(expected));			
    
    		}		
    
    	}
    
    }
    

     
    Example Aspect in a JUnit test
    /**  */
    package com.octodecillion.jmockit.example;
    
    import static org.hamcrest.core.Is.is;
    import static org.junit.Assert.assertThat;
    
    import java.util.Arrays;
    import java.util.List;
    
    import org.aspectj.lang.ProceedingJoinPoint;
    import org.aspectj.lang.annotation.Around;
    import org.aspectj.lang.annotation.Aspect;
    import org.aspectj.lang.annotation.Pointcut;
    import org.junit.Test;
    import org.junit.runner.RunWith;
    import org.junit.runners.JUnit4;
    
    /**
     * Using AspectJ in a JUnit test.
     * <p>
     * For compile issue in Eclipse: {@link http://stackoverflow.com/questions/8958267/java-lang-verifyerror-expecting-a-stackmap-frame}
     * @see http://stackoverflow.com/questions/8958267/java-lang-verifyerror-expecting-a-stackmap-frame
     * 
     */
    public class AspectJUnit {
    
    	/**  the client we are testing */
    	public class Client {
    		private Service service;
    		
    		public Client(Service service) {
    			this.service = service;
    		}
    		
    		/** invoke the 'expensive' service. */
    		public String getUserName(int n){
    			return service.query(n);			
    		}
    		
    	}
    	
    	/** the service that does integration stuff */
    	public class Service {
    			/**
    			 * @param key 
    			 * @return 
    			 */
    			public String query(int n) {				
    				return values.get(n);
    			}
    			
    			private final List<String> values = 
    				Arrays.asList("first","second","third");
    	}
    	
    	@Aspect
    	static class TestAspect {
    		
    		@Around("call( * AspectJUnit.Service.query(..)) && args(i)")
    		public String doQuery(ProceedingJoinPoint tjp, int i){
    			return (i == 1) ? "Hello world!"
    				: (String) tjp.proceed(new Object[]{i});
    		}		
    		
    	}
    	
    	/**
    	 * Nested test class.
    	 *
    	 */
    	@RunWith(JUnit4.class)
    	public static class UnitTest{
    		private AspectJUnit example;
    		
    		/**   */
    		@Test
    		public void should_do_around_advice() {
    			String expected = "Hello world!";
    			String actual = example.new Service().query(1);
    			assertThat(actual, is(expected));
    			
    			actual = example.new Client(example
    					.new Service()).getUserName(1);
    			
    			assertThat(actual, is(expected));			
    
    		}
    		
    		@org.junit.Before
    		public void setup(){
    			example = new AspectJUnit();
    		}
    
    	}	
    
    }
    

     


    Creative Commons License
    This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License.

    JMockIt method not found in type when using Invocation arg

    Strange. I was writing a very simple JUnit test using JMockIt, but it has a compile error.

    I wanted to also invoke the original mocked target object in the test. In the state-based MockUp API one does this by adding a mockit.Invocation as the first argument to the method signature.

    	@Mock
    	public String getV(Invocation invocation, String key) { ....... }
    

    Then, the original mocked object is available via: invocation.getInvokedInstance().

    It does not work in Eclipse 4.2. I copied the same code and opened the source in Eclipse 3.7. It works. Thus, the issue is Eclipse or a setup difference.

    Update
    May 19, 2013: I see that the classpath ordering is different as shown in the .classpath config file. Also, the JRE_CONTAINER is different, the one in the Eclipse 4.2 project is using JavaSE-1.7. That should have been jdk1.7.0_13. Will have to retest.
    That wasn’t the issue.

    Environment

    • Windows 7 64bit, AMD PC
    • JDK 1.7
    • Eclipse 4.2
    • JUnit 4
    • JMockIt 1.2
    Creative Commons License
    This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License.