Web Services



Selected Abstracts


Empowering Automated Trading in Multi-Agent Environments

COMPUTATIONAL INTELLIGENCE, Issue 4 2004
David W. Ash
Trading in the financial markets often requires that information be available in real time if it is to be effectively processed. Furthermore, complete information is not always available about the reliability or timeliness of data; nevertheless, a decision must still be made about whether or not to trade. We propose a mechanism whereby different data sources are monitored, using Semantic Web facilities, by different agents, which communicate with each other to determine the presence of good trading opportunities. When a trading opportunity presents itself, the human traders are notified to determine whether or not to execute the trade. The Semantic Web, Web Services, and URML technologies are used to enable this mechanism. The human traders are notified of the trade at the optimal time, so as neither to waste their resources nor to lose a good trading opportunity. We have also designed a rudimentary prototype system for simulating the interaction between the intelligent agents and the human traders, and we show results from experiments on this simulation for trading of Chicago Board Options Exchange (CBOE) options. [source]
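The notification idea described above, monitoring agents that pool their observations and alert a human only when an opportunity looks credible, can be sketched in a few lines. This is a minimal illustration, not the paper's mechanism: the class names, weighting scheme, and threshold are all invented for the example.

```python
# Hypothetical sketch: each agent watches one data source and reports a
# (signal, reliability) pair; the human trader is notified only when the
# reliability-weighted signal clears a threshold, so attention is neither
# wasted on weak opportunities nor withheld from strong ones.

from dataclasses import dataclass

@dataclass
class AgentReport:
    source: str         # which data feed the agent monitors
    signal: float       # strength of the trading opportunity, 0..1
    reliability: float  # agent's confidence in its data source, 0..1

def should_notify_trader(reports, threshold=0.5):
    """Combine agent reports into one confidence score and compare it
    with the notification threshold."""
    if not reports:
        return False
    weighted = sum(r.signal * r.reliability for r in reports)
    total_weight = sum(r.reliability for r in reports)
    return (weighted / total_weight) >= threshold

reports = [
    AgentReport("options-feed", signal=0.9, reliability=0.8),
    AgentReport("news-feed", signal=0.4, reliability=0.3),
]
print(should_notify_trader(reports))  # True: the strong, reliable signal dominates
```

A real system would replace the static list with live Semantic Web queries, but the decision rule, aggregate then threshold, is the same shape.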


A large-scale monitoring and measurement campaign for web services-based applications

CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 10 2010
Riadh Ben Halima
Abstract Web Services (WS) can be considered the most influential enabling technology for the next generation of web applications. WS-based application providers will face challenging requirements related to nonfunctional properties in general, and to performance and QoS in particular. Moreover, WS-based developers have to extend such applications with self-healing (SH) mechanisms, as required for autonomic computing, to cope with the complexity of interactions and to improve availability. Such solutions should be applicable whether the components implementing SH mechanisms are deployed on the WS provider side, the requester side, or both, depending on the deployment constraints. Associating application-specific performance requirements with monitoring-specific constraints leads to complex configurations where fine tuning is needed to provide SH solutions. To help enhance the design and assessment of such solutions for WS technology, we designed and implemented a monitoring and measurement framework, which is part of a larger Self-Healing Architecture (SHA) developed during the European WS-DIAMOND project. We implemented the Conference Management System (CMS), a real, complex WS-based application. We carried out a large-scale experimentation campaign by deploying CMS on top of SHA on the French grid Grid5000, approaching the problem as would a service provider who has to tune reconfiguration strategies. Our results are available on the web in a structured database for external use by the WS community. Copyright © 2010 John Wiley & Sons, Ltd. [source]
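The core of such a framework is a probe that records per-request measurements on the requester or provider side and flags calls whose latency suggests reconfiguration is needed. The sketch below is illustrative only; the threshold, class, and operation names are assumptions, not the WS-DIAMOND API.

```python
# Hedged sketch of a requester-side latency monitor: wrap each service call,
# time it, and log whether it exceeded a degradation threshold. The log is
# the raw material a self-healing layer would use to tune reconfiguration.
import time

class LatencyMonitor:
    def __init__(self, threshold_s=1.0):
        self.threshold_s = threshold_s
        self.log = []  # tuples of (operation, elapsed_seconds, degraded)

    def call(self, operation, func, *args):
        start = time.perf_counter()
        result = func(*args)          # the actual Web service invocation
        elapsed = time.perf_counter() - start
        degraded = elapsed > self.threshold_s
        self.log.append((operation, elapsed, degraded))
        return result

monitor = LatencyMonitor(threshold_s=1.0)
# Stand-in for a real WS call; a fast lambda will not trip the threshold.
value = monitor.call("getPaper", lambda pid: {"id": pid}, 42)
print(value, monitor.log[0][2])  # {'id': 42} False
```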


Plug-and-play remote portlet publishing

CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 12 2007
X. D. Wang
Abstract Web Services for Remote Portlets (WSRP) is gaining attention among portal developers and vendors as a way to enable easy development, increased richness in functionality, pluggability, and flexibility of deployment. Whilst currently not supporting all WSRP functionalities, open-source portal frameworks could in future use WSRP Consumers to access remote portlets found through a WSRP Producer registry service. This implies the need for a central registry of remote portlets and a more expressive WSRP Consumer interface to implement the remote portlet functions. This paper reports on an investigation into a new system architecture, which includes a Web Services repository, registry, and client interface. The Web Services repository holds portlets as remote resource producers. A new data structure for describing remote portlets is defined and published by populating a Universal Description, Discovery and Integration (UDDI) registry. A remote portlet publish and search engine for UDDI has also been developed. Finally, a remote portlet client interface was developed as a Web application. The client interface supports remote portlet features, as well as window status and mode functions. Copyright © 2007 John Wiley & Sons, Ltd. [source]
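The publish-and-search flow can be illustrated with a minimal in-memory registry. Real UDDI exposes SOAP operations (e.g. save_service, find_service) rather than Python methods, and every name and URL below is a placeholder for illustration.

```python
# Minimal sketch of the registry idea: producers publish portlet
# descriptions, and a consumer searches by keyword, analogous to the
# UDDI publish/search engine described in the abstract.

class PortletRegistry:
    def __init__(self):
        self._entries = []  # published portlet descriptions

    def publish(self, name, producer_url, keywords):
        self._entries.append(
            {"name": name, "producer": producer_url, "keywords": set(keywords)}
        )

    def search(self, keyword):
        return [e for e in self._entries if keyword in e["keywords"]]

registry = PortletRegistry()
registry.publish("WeatherPortlet", "http://producer.example/wsrp",
                 ["weather", "forecast"])
registry.publish("NewsPortlet", "http://producer.example/wsrp", ["news"])

hits = registry.search("weather")
print([h["name"] for h in hits])  # ['WeatherPortlet']
```

A WSRP Consumer would then bind to the producer URL of each hit and render the remote portlet's markup in the portal page.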


UNEP-GEMS/Water Programme: water quality data, GEMStat and open web services, and Japanese cooperation

HYDROLOGICAL PROCESSES, Issue 9 2007
Sabrina Barker
Abstract The purpose of this paper is threefold. First, it demonstrates how monitoring stations that collect water quality data can be situated globally via satellite data from Google Earth. Important technical issues such as interoperability and Open Web Services are discussed in this context. Second, it illustrates how researchers at local levels can benefit from this global technology. The discussion draws from the online water quality database, GEMStat, which contains water quality data and sediment load calculations from around the world. These types of data, collected locally, can be shown to bear global implications through Internet technology. GEMStat has been expanded to include Open Web Services to enable interoperability with other online databases. Third, it illustrates an international framework of cooperation through GEMS/Water Japan, introducing on-site monitoring activities as well as the management of international river basins (Mekong/La Plata). Considerations for a future application framework are presented in conclusion. Copyright © 2007 John Wiley & Sons, Ltd. [source]


A pedagogical Web service-based interactive learning environment for a digital filter design course: An evolutionary approach

COMPUTER APPLICATIONS IN ENGINEERING EDUCATION, Issue 3 2010
Wen-Hsiung Wu
Abstract The course of digital filter design in electronic/electrical engineering involves complicated mathematical equations and dynamic waveform variations. It is a consensus among educators that using simulation tools assists in improving students' learning experiences. Previous studies on system simulation seemed to lack an appropriate approach to designing such a course, and few emphasized the design of an interactive learning environment using an evolutionary approach. This study integrated the design concept of an evolutionary approach and Web service-based technology into a simulation system entitled the Pedagogical Web Service-Based Interactive Learning Environment (PEWSILE). The PEWSILE system contained two interactive learning environments, a simple system and an advanced system, and offered a total of six pedagogical Web services. The simple interactive learning environment included text/color-based and text/color/diagram-based services. The advanced interactive learning environment included batch-based, interval change-based, comparison-based, and scroll bar-based services. The study also assessed students' performance with the six pedagogical Web services, covering interaction and overall use, usefulness, and intention to use, through a questionnaire survey and subsequent interviews. Three significant findings were reported. For example, in the advanced interactive learning environment, the designs of the interval change-based and comparison-based services make it easier to observe differences in the outcome of parameter changes, while the batch-based services lack the element of waveform comparison. In sum, the findings of this study provide helpful implications for designing engineering educational software. © 2010 Wiley Periodicals, Inc. Comput Appl Eng Educ 18: 423-433, 2010; View this article online at wileyonlinelibrary.com; DOI 10.1002/cae.20163 [source]


Using Web 2.0 for scientific applications and scientific communities

CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 5 2009
Marlon E. Pierce
Abstract Web 2.0 approaches are revolutionizing the Internet, blurring lines between developers and users and enabling collaboration and social networks that scale into the millions of users. As discussed in our previous work, the core technologies of Web 2.0 effectively define a comprehensive distributed computing environment that parallels many of the more complicated service-oriented systems such as Web service and Grid service architectures. In this paper we build upon this previous work to discuss the applications of Web 2.0 approaches to four different scenarios: client-side JavaScript libraries for building and composing Grid services; integrating server-side portlets with 'rich client' AJAX tools and Web services for analyzing Global Positioning System data; building and analyzing folksonomies of scientific user communities through social bookmarking; and applying microformats and GeoRSS to problems in scientific metadata description and delivery. Copyright © 2009 John Wiley & Sons, Ltd. [source]
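GeoRSS is the most self-contained of the four scenarios to illustrate: GeoRSS-Simple attaches a location to a feed entry with a single `georss:point` element containing latitude and longitude. The sketch below builds such an entry with the standard library; the station name and coordinates are invented examples, not data from the paper.

```python
# Sketch: tag a scientific metadata record with a GeoRSS-Simple point so it
# can be delivered in an Atom feed. Namespace URIs are the real Atom and
# GeoRSS namespaces; the entry content is a made-up example.
import xml.etree.ElementTree as ET

ATOM = "http://www.w3.org/2005/Atom"
GEORSS = "http://www.georss.org/georss"

def make_entry(title, lat, lon):
    ET.register_namespace("georss", GEORSS)
    entry = ET.Element(f"{{{ATOM}}}entry")
    ET.SubElement(entry, f"{{{ATOM}}}title").text = title
    # GeoRSS-Simple: "lat lon" separated by a single space
    ET.SubElement(entry, f"{{{GEORSS}}}point").text = f"{lat} {lon}"
    return ET.tostring(entry, encoding="unicode")

xml = make_entry("GPS station BLYT daily solution", 33.61, -114.71)
print(xml)
```

Feed readers that understand GeoRSS can then place each entry on a map with no further metadata plumbing, which is the delivery advantage the paper exploits.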


The development of a geospatial data Grid by integrating OGC Web services with Globus-based Grid technology

CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 14 2008
Liping Di
Abstract Geospatial science is the science and art of acquiring, archiving, manipulating, analyzing, communicating, modeling with, and utilizing spatially explicit data for understanding physical, chemical, biological, and social systems on or near the Earth's surface. In order to share distributed geospatial resources and facilitate interoperability, the Open Geospatial Consortium (OGC), an industry-government-academia consortium, has developed a set of widely accepted Web-based interoperability standards and protocols. Grid is the technology enabling resource sharing and coordinated problem solving in dynamic, multi-institutional virtual organizations. A geospatial Grid is an extension and application of Grid technology in the geospatial discipline. This paper discusses the problems associated with directly using Globus-based Grid technology in the geospatial disciplines, the need for geospatial Grids, and the features of geospatial Grids. The paper then presents a research project that develops and deploys a geospatial Grid by integrating the Web-based geospatial interoperability standards and technology developed by OGC with Globus-based Grid technology. The geospatial Grid technology developed by this project makes interoperable, personalized, on-demand data access and services a reality at large geospatial data archives. Such technology can significantly reduce the problems associated with archiving, manipulating, analyzing, and utilizing large volumes of geospatial data at distributed locations. Copyright © 2008 John Wiley & Sons, Ltd. [source]
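The OGC standards mentioned above are concretely just standardized HTTP interfaces. As one example, a Web Map Service (WMS) GetMap request is a URL whose query parameters (SERVICE, REQUEST, LAYERS, BBOX, and so on) are fixed by the WMS specification. The endpoint and layer name below are placeholders.

```python
# Sketch: construct a WMS 1.1.1 GetMap request URL. The parameter names come
# from the OGC WMS specification; the server URL and layer are hypothetical.
from urllib.parse import urlencode

def wms_getmap_url(endpoint, layer, bbox, width=512, height=512):
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "SRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),  # minx,miny,maxx,maxy
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return f"{endpoint}?{urlencode(params)}"

url = wms_getmap_url("http://example.org/wms", "landcover", (-180, -90, 180, 90))
print(url)
```

Because every conforming server answers the same request the same way, a Grid layer can schedule such calls against any archive in the virtual organization, which is the interoperability the project builds on.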


A workflow portal supporting multi-language interoperation and optimization

CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 12 2007
Lican Huang
Abstract In this paper we present a workflow portal for Grid applications, which supports different workflow languages and workflow optimization. We present an XSLT converter that translates one workflow language into another, enabling interoperation between different workflow languages. We discuss strategies for choosing the optimal service from several semantically equivalent Web services in a Grid application. The dynamic selection of Web services involves discovering a set of semantically equivalent services by filtering the available services based on metadata, and then selecting an optimal service based on real-time data and/or historical data recorded during prior executions. Finally, we describe the framework and implementation of the workflow portal, which aggregates the different components of the project using Java portlets. Copyright © 2007 John Wiley & Sons, Ltd. [source]
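The converter's job is structural: map the constructs of one workflow dialect onto those of another. The paper does this with XSLT; the stdlib-only sketch below shows the same idea as a tag-renaming transform between two hypothetical dialects, so the mapping table and element names are invented for illustration.

```python
# Sketch of language-to-language workflow conversion as a tag mapping.
# A real converter (the paper's XSLT) would also restructure nesting and
# attributes; this shows only the element-renaming core of the idea.
import xml.etree.ElementTree as ET

# Hypothetical mapping from "dialect A" elements to "dialect B" elements.
TAG_MAP = {"workflow": "process", "task": "invoke", "link": "sequence"}

def convert(xml_text):
    root = ET.fromstring(xml_text)
    for elem in root.iter():
        elem.tag = TAG_MAP.get(elem.tag, elem.tag)  # rename known tags only
    return ET.tostring(root, encoding="unicode")

src = "<workflow><task name='align'/><link/><task name='render'/></workflow>"
converted = convert(src)
print(converted)
```

In the portal, such a conversion step lets a workflow authored in one language be submitted to an engine that executes another.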


The LEAD Portal: a TeraGrid gateway and application service architecture

CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 6 2007
Marcus Christie
Abstract The Linked Environments for Atmospheric Discovery (LEAD) Portal is a science application portal designed to enable effective use of Grid resources in exploring mesoscale meteorological phenomena. The aim of the LEAD Portal is to provide a more productive interface for experimental work by the meteorological research community, as well as to bring weather research to a wider class of users, namely pre-college students in grades 6-12 and undergraduate college students. In this paper, we give an overview of the LEAD project and the role the LEAD Portal plays in reaching its goals. We then describe the various technologies we are using to bring powerful and complex scientific tools to educational and research users. These technologies include a fine-grained, capability-based authorization framework; an application service factory toolkit; and a Web services-based workflow execution engine with supporting tools. Together they enable our team to deploy these once inaccessible, stovepipe scientific codes onto a Grid where they can be collectively utilized. Copyright © 2006 John Wiley & Sons, Ltd. [source]


The Open Grid Computing Environments collaboration: portlets and services for science gateways

CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 6 2007
Jay Alameda
Abstract We review the efforts of the Open Grid Computing Environments collaboration. By adopting a general three-tiered architecture based on common standards for portlets and Grid Web services, we can deliver numerous capabilities to science gateways from our diverse constituent efforts. In this paper, we discuss our support for standards-based Grid portlets using the Velocity development environment. Our Grid portlets are based on abstraction layers provided by the Java CoG kit, which hide the differences between Grid toolkits. Sophisticated services are decoupled from the portal container using Web service strategies. We describe advance information, semantic data, collaboration, and science application services developed by our consortium. Copyright © 2006 John Wiley & Sons, Ltd. [source]


Programming scientific and distributed workflow with Triana services

CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 10 2006
David Churches
Abstract In this paper, we discuss a real-world application scenario that uses three distinct types of workflow within the Triana problem-solving environment: serial scientific workflow for the data processing of gravitational wave signals; job submission workflows that execute Triana services on a testbed; and monitoring workflows that examine and modify the behaviour of the executing application. We briefly describe the Triana distribution mechanisms and the underlying architectures that we can support. Our middleware-independent abstraction layer, called the Grid Application Prototype (GAP), enables us to advertise, discover and communicate with Web and peer-to-peer (P2P) services. We show how gravitational wave search algorithms have been implemented to distribute both the search computation and data across the European GridLab testbed, using a combination of Web services, Globus interaction and P2P infrastructures. Copyright © 2005 John Wiley & Sons, Ltd. [source]


Processing methods for partially encrypted data in multihop Web services

ELECTRONICS & COMMUNICATIONS IN JAPAN, Issue 5 2008
Kojiro Nakayama
Abstract Message layer security is necessary to ensure the end-to-end security of Web services. To provide confidentiality against the intermediaries along the message path, XML encryption is used to partially encrypt the message. Because the data structure is changed by the partial encryption, the encrypted message is no longer valid with respect to the original schema definition. Thus, problems occur in the schema validation and data binding performed by the intermediary. In this paper, we discuss two possible methods to solve these problems. The first method is to transform the original schema definition; the second is to transform the received message. We examined these methods by applying them to a demonstration experiment with Web services. © 2008 Wiley Periodicals, Inc. Electron Comm Jpn, 91(5): 26-32, 2008; Published online in Wiley InterScience (www.interscience.wiley.com). DOI 10.1002/ecj.10112 [source]
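The second method, transforming the received message, can be sketched concretely: before validating against the original schema, the intermediary substitutes each `xenc:EncryptedData` element with a placeholder of the element type the schema expects. The `xenc` namespace is the real XML Encryption namespace; the message content and the `CreditCard` placeholder are invented for the example, and a real implementation would need a mapping from each encrypted position to its declared element.

```python
# Sketch of the "transform the message" approach: swap each EncryptedData
# element for a schema-valid stub so validation and data binding can proceed
# at the intermediary, which cannot decrypt the content.
import xml.etree.ElementTree as ET

XENC = "{http://www.w3.org/2001/04/xmlenc#}EncryptedData"

def replace_encrypted(parent, placeholder):
    """Recursively swap EncryptedData children for placeholder elements."""
    for i, child in enumerate(list(parent)):
        if child.tag == XENC:
            parent[i] = ET.Element(placeholder)  # content unknown here
        else:
            replace_encrypted(child, placeholder)

msg = ET.fromstring(
    '<Order><Item>book</Item>'
    '<EncryptedData xmlns="http://www.w3.org/2001/04/xmlenc#"/></Order>'
)
replace_encrypted(msg, "CreditCard")
cleaned = ET.tostring(msg, encoding="unicode")
print(cleaned)
```

The first method inverts this: instead of editing messages, the schema itself is rewritten to allow `EncryptedData` wherever an encryptable element may appear.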


A cyberenvironment for crystallography and materials science and an integrated user interface to the Crystallography Open Database and Predicted Crystallography Open Database

JOURNAL OF APPLIED CRYSTALLOGRAPHY, Issue 2 2008
Jacob R. Fennick
With the advent and subsequent evolution of the Internet, the ways in which computational crystallographic research is conducted have changed dramatically. Consequently, secure, robust and efficient means of accessing remote data and computational resources have become a necessity. At present, scientists in computational crystallography access remote data and resources via two separate technologies, namely SSH and Web services. Computational Science and Engineering Online (CSE-Online) combines these two methods into a single seamless environment while simultaneously addressing issues such as stability in the face of Internet interruption. CSE-Online already contains several applications useful to crystallographers; however, continued development of new tools is necessary. Toward this end, a Java application capable of running in CSE-Online, namely the Crystallography Open Database User Interface (CODUI), has been developed. It allows users to search for crystal structures stored in the Crystallography Open Database and the Predicted Crystallography Open Database, to export structural data for visualization, or to pass structural data to other CSE-Online applications. [source]


The design and use of WSDL-Test: a tool for testing Web services

JOURNAL OF SOFTWARE MAINTENANCE AND EVOLUTION: RESEARCH AND PRACTICE, Issue 5 2007
Harry M. Sneed
Abstract Web services are becoming increasingly important to many businesses, especially as an enabling technology for systems that adopt a service-oriented architecture approach to their development. However, testing Web services poses significant challenges. This paper describes the design and use of WSDL-Test, a tool designed specifically for this purpose. A key feature of WSDL-Test is its ability to simulate the actual usage of Web services in a controlled environment. This enables WSDL-Test to generate requests and validate responses in a rapid and reliable manner. To illustrate the use of WSDL-Test, the paper also discusses our experience using the tool on a real-world online eGovernment application. Copyright © 2007 John Wiley & Sons, Ltd. [source]
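The generate-request/validate-response loop at the heart of such a tool can be sketched independently of SOAP. Everything below is a simplified stand-in: a real tool derives the parameter types from the WSDL itself, and the operation fields shown are hypothetical.

```python
# Sketch of WSDL-style testing: generate a request with representative values
# for each declared parameter type, then check a response against the
# declared result types. Only "int" and "string" are modeled here.
import random

def generate_request(param_types, seed=0):
    """Produce a test request with a sample value per declared type."""
    rng = random.Random(seed)  # seeded so test runs are reproducible
    samples = {"int": lambda: rng.randint(0, 100), "string": lambda: "test"}
    return {name: samples[t]() for name, t in param_types.items()}

def validate_response(response, expected_types):
    """Check every declared field is present with the right Python type."""
    py = {"int": int, "string": str}
    return all(
        name in response and isinstance(response[name], py[t])
        for name, t in expected_types.items()
    )

req = generate_request({"citizenId": "int", "formName": "string"})
resp = {"status": "ok", "caseNumber": 42}  # simulated service response
print(req)
print(validate_response(resp, {"status": "string", "caseNumber": "int"}))  # True
```

Running many generated requests against a simulated service is what lets the tool exercise an operation "in a controlled environment" before the real back end exists.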


On the business value and technical challenges of adopting Web services

JOURNAL OF SOFTWARE MAINTENANCE AND EVOLUTION: RESEARCH AND PRACTICE, Issue 1-2 2004
S. Tilley
Abstract This paper provides a balanced perspective of the business value and technical challenges of adopting Web services. Technology adoption is a continual challenge for both tool developers and enterprise users. Web services are a prime example of an emerging technology that is fraught with adoption issues. Part of the problem is separating marketing hype from business reality. Web services are network-accessible interfaces to application functionality. They are built using Internet technologies such as XML and standard protocols such as SOAP. The adoption issues related to Web services are complex and multifaceted. For example, determining whether this technology is a fundamental advance, rather than something old under a new name, requires technical depth, business acumen, and considerable historical knowledge of past developments. A sample problem from the health care industry is used to illustrate some of the adoption issues. Copyright © 2004 John Wiley & Sons, Ltd. [source]


Web services for controlled vocabularies

BULLETIN OF THE AMERICAN SOCIETY FOR INFORMATION SCIENCE & TECHNOLOGY (ELECTRONIC), Issue 5 2006
Diane Vizine-Goetz
First page of article [source]


Multiversion concurrency control for the generalized search tree

CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 12 2009
Walter Binder
Abstract Many read-intensive systems where fast access to data is more important than the rate at which data can change make use of multidimensional index structures, like the generalized search tree (GiST). Although in these systems the indexed data are rarely updated and read access is highly concurrent, the existing concurrency control mechanisms for multidimensional index structures are based on locking techniques, which cause significant overhead. In this article we present the multiversion-GiST (MVGiST), an in-memory mechanism that extends the GiST with multiversion concurrency control. The MVGiST enables lock-free read access and ensures a consistent view of the index structure throughout a reader's series of queries, by creating lightweight, read-only versions of the GiST that share unchanging nodes among themselves. An example of a system with high read to write ratio, where providing wait-free queries is of utmost importance, is a large-scale directory that indexes web services according to their input and output parameters. A performance evaluation shows that for low update rates, the MVGiST significantly improves scalability w.r.t. the number of concurrent read accesses when compared with a traditional, locking-based concurrency control mechanism. We propose a technique to control memory consumption and confirm through our evaluation that the MVGiST efficiently manages memory. Copyright © 2009 John Wiley & Sons, Ltd. [source]
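The mechanism behind the MVGiST, copy only the nodes on the path to a change and share every unchanged subtree between versions, is the classic path-copying technique. The sketch below uses a plain sorted binary tree as a stand-in for the GiST; the structure and names are illustrative, not the MVGiST implementation.

```python
# Path-copying sketch: insert() returns a NEW root without mutating the old
# one, so readers holding the previous root keep a consistent snapshot while
# untouched subtrees are shared between versions (no per-node locking).

class Node:
    __slots__ = ("key", "left", "right")
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def insert(root, key):
    """Return a new root; only nodes on the insertion path are copied."""
    if root is None:
        return Node(key)
    if key < root.key:
        return Node(root.key, insert(root.left, key), root.right)
    return Node(root.key, root.left, insert(root.right, key))

def contains(root, key):
    while root is not None:
        if key == root.key:
            return True
        root = root.left if key < root.key else root.right
    return False

v1 = insert(insert(insert(None, 5), 2), 8)  # snapshot held by readers
v2 = insert(v1, 3)                          # writer publishes a new version
print(contains(v1, 3), contains(v2, 3))     # False True: v1 is unchanged
print(v2.right is v1.right)                 # True: untouched subtree is shared
```

Readers traverse whichever root they obtained with no locks at all; memory control then amounts to discarding old roots once no reader references them, which is the consumption-control problem the article addresses.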

