August 31, 2009

Web Service Composition Benchmark

During the WS Challenge we had a discussion about Web Service Composition benchmarks. I think this is also a result of a recently published journal article on WSBen. WSBen is a web service composition benchmark that was developed as a general composition benchmark for the AI community. The history of WSBen goes back to a Web Services Challenge at the EEE/ICEBE conference in 2005. I don't know further details or how it is related to the current Web Services Challenge, but it seems that at least one participant from Penn State University is related to this project or to people involved in it. The test sets of WSBen are based on simplified WSDL documents and only support syntactic service discovery. The focus of WSBen is applying a network model to generate clusters of Web Services, and it can also visualize them with a GUI. So much for WSBen.

Let me make this clear from the beginning: the current Web Services Challenge software (Generator + Client + Checker) is a Semantic Web Service Composition benchmark. If we also consider that the source code is available and that we apply modern design patterns to keep it open and extensible, then it can also be called a framework. It is as easy as that. The test sets are generated purely at random, without any structure. Their difficulty ranges from test sets that can be used for testing your system up to test sets that even the winning system cannot solve within minutes or hours. Because the test sets lack structure, an algorithm suitable for one test set may be inadequate for another. The difference between a dedicated benchmark and our toolset is that we have not yet published a set of test sets that can be used to evaluate your system. We have not done this so far because we did not see any use in it. Now, with several fast composition systems at hand and two years of experience with these tools, we could develop such a benchmark.

The Web Services Challenge has over five years of experience in measuring the performance of composition systems. We provide tools, test sets, rules, performance results and now also composition systems. I have about 70 generated test sets from the last challenge, and we will upload them shortly. Anybody who is still interested in a composition benchmark should download these test sets and document their results: test system, initialization time, response time, the result file and the checking file. The evaluation should be done on a more or less ordinary desktop computer. The results will be collected and used to generate new test sets. If we find test sets where the algorithms produce different results, we will be able to analyze them. The resulting package of test sets, collected results and measurements will be the best benchmark available. Benchmarking is not the work of an individual but a community effort.
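The measurements listed above could be collected with a small script along the following lines. This is only a sketch: the `initialize` and `respond` callbacks are hypothetical placeholders for whatever entry points your composition system offers, and validating the result file would still be done with the challenge's checker software.

```python
# Minimal sketch for documenting benchmark runs. initialize() and
# respond() are hypothetical callbacks wrapping a composition system;
# the recorded fields follow the list above.

import csv
import time

def benchmark(test_set, initialize, respond):
    t0 = time.perf_counter()
    system = initialize(test_set)            # e.g. load WSDL/OWL documents
    init_ms = (time.perf_counter() - t0) * 1000.0

    t1 = time.perf_counter()
    result_file = respond(system)            # produce the composition
    response_ms = (time.perf_counter() - t1) * 1000.0
    return init_ms, response_ms, result_file

def record(path, rows):
    # One row per test set, so results from different systems are
    # easy to collect and compare.
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["test set", "init time (ms)",
                         "response time (ms)", "result file"])
        writer.writerows(rows)
```

Together with the test system description, such a table would cover everything we need to compare submitted results.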

August 30, 2009

Web Services Challenge 2010

The Web Services Challenge 2010 is still in planning, but I want to describe some innovations. The innovations of the Web Services Challenge depend on resources, meaning people and money. The financial support is yet to be confirmed, so I can only give a rough list of ideas and changes. The project itself and a timetable cannot be planned at the moment. Please bear in mind that any change in the WS Challenge means a lot of work for us; sometimes I even think more for us than for the participants. Each change in the challenge means a new test set generator, new document formats, documentation, a client application and of course solution-checking software. In contrast to the participants' composition systems, we have to ensure 100% bug-free software. Otherwise the whole WS Challenge would be pointless and its results would be worthless. I am glad to say that we have successfully organized and improved the WS Challenge so far and of course want to continue our good work.

The WS Challenge 2010 will introduce a dynamic environment in which both services and their Quality of Service (QoS) change. The initial set of services provided for the initialization of the composition system will be updated frequently: services will be added to and removed from the initial set. Furthermore, the QoS parameters for response time and throughput will change. Service changes and QoS values will be sent through a client application, and the composition system has to answer with updated compositions. We will evaluate the time the composition system needs to respond with updated compositions, and also the QoS of the overall composition, which is comparable to last year. The procedure will be as follows:
  1. Composition System Initialization with WSDL, WSLA and OWL.
  2. Sending the WSDL Request.
  3. The Composition System responds with a valid composition.
  4. Sending a list of removed services.
  5. Sending a list of new services.
  6. Sending a list of updated QoS parameters.
  7. The Composition System responds with a valid composition.
  8. Steps 4 to 7 can be repeated several times.
This is just a sketch of what we want to achieve in 2010 and it is still unconfirmed. I am open to any suggestions and hope for a lively discussion.
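The steps above can be illustrated with a minimal composition system that supports the planned update messages. Everything here is a hypothetical sketch: the class, the purely syntactic forward-chaining matching and the data layout are my assumptions for illustration, not the actual challenge interface, which would use WSDL, WSLA and OWL documents and semantic matching.

```python
# Hypothetical sketch of the planned WSC 2010 interaction loop.
# The class name, the message methods and the syntactic matching
# are assumptions for illustration only.

class CompositionSystem:
    def __init__(self, services, qos):
        # Step 1: initialize with the service repository and QoS values.
        self.services = dict(services)  # name -> (input set, output set)
        self.qos = dict(qos)            # name -> (response time, throughput)

    def compose(self, provided, wanted):
        # Steps 2-3 and 7: answer a request with a valid composition.
        # Simple forward chaining: add any service whose inputs are
        # already satisfied until all wanted outputs are produced.
        known, plan, changed = set(provided), [], True
        while changed and not wanted <= known:
            changed = False
            for name, (ins, outs) in self.services.items():
                if name not in plan and ins <= known:
                    plan.append(name)
                    known |= outs
                    changed = True
        return plan if wanted <= known else None

    def remove_services(self, names):        # step 4
        for name in names:
            self.services.pop(name, None)
            self.qos.pop(name, None)

    def add_services(self, services):        # step 5
        self.services.update(services)

    def update_qos(self, qos):               # step 6
        self.qos.update(qos)
```

A client would then alternate the update messages (steps 4 to 6) with new compose requests (step 7), and the evaluation would time each compose call after an update.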

Web Services Challenge 2009

The Web Services Challenge 2009 took place at the CEC/EEE conference in Vienna from 20 to 22 July. We had 9 accepted papers and 7 participants. Sadly, two participants quit the competition without giving any reason. We had participants from the following countries:
  • USA
  • China
  • South Korea
  • Slovakia
  • Netherlands
The competition progressed smoothly and we had a clear result at the end of the competition (Link). This year we had a high percentage of reliable and fast systems. Four systems were able to solve all test sets in acceptable time. It is a pity that one system produced valid solutions for all test sets but still did not win any points. All these systems handled every test set in milliseconds and set a high entry barrier for composition systems still to come.

I would like to thank all participants. It was nice to meet all of you after a long time of e-mail-only communication. Hopefully you were as satisfied with the proceedings of the challenge as I am. I really enjoyed our conversations both at the welcome reception and at the conference dinner. I got a good impression of the participants' requests and will use it to improve the WS Challenge in future years.

Finally, I want to list some WSC-related links:

Online Tools

Recently I have had several problems using multiple computers. This is not unusual, as you easily accumulate several devices:
  • Desktop Computer (Gaming and Performance)
  • Laptop (Work, Mobility)
  • Netbook (Video Conferencing and Communication)
There are several issues when using multiple computers, such as synchronisation of data, backup of data and access to data. The browser is also relevant, as I had problems with bookmarks and site passwords. Currently I am still evaluating the following online services:
  • Mozy - Online Backup (2GB free account)
  • SugarSync - Online Synchronization (2GB free account)
  • Xmarks - Online Synchronization of Bookmarks and Passwords (free account)
So far I have had no problems, but I suggest testing online services before paying for them. The services offer strong encryption and user-defined PIN codes. I hope there is no security risk, but I still have to monitor some online sites for problems with these services. You never know; you can just hope. It has to be noted that the client software for these services is only available for Windows and sometimes for Mac computers. SugarSync also offers synchronization software for mobile devices. All services can easily be upgraded to flat-rate accounts and are also quite cheap.